Abstract
This experiment was designed to explore articulatory perception at the level of the phonetic gesture. It was hypothesized that, with relatively brief experience, individuals could learn to identify words from video displays of linguapalatal contact patterns, that such learning would reveal general rules of visual articulatory perception, and that both vowels and consonants would be differentiated by their contact patterns. Movements to and from articulatory positions, the number of syllables in a word, the place and extent of linguapalatal contact, and specific articulatory gestures such as sibilant grooving and stop occlusion patterns were found to be used in word differentiation and recognition.
Key Words: