What is Cued Speech?

Cued Speech is a communication system (not a language system) consisting of

  1. Manual parameters (handshape and movement/placement)
  2. Non-manual parameters (speech reading)

The system is a series of handshapes formed near the mouth to represent spoken or written English. There are 8 handshapes, 4 placements, and a set of movements that make up the English Cueing System.

Manual Parameters:

  • The handshape represents the English consonant that is spoken or written.
  • The placement of the handshape represents the vowel that is spoken or written.
  • The movement of the hand represents the diphthong vowels of English.


“The system is designed as follows: consonants that look alike on the mouth are assigned different handshapes. Vowel phonemes that look similar on the mouth have different hand locations and movements. This allows the cuer to portray the meaningful sounds of the spoken language visually and without any ambiguity to the ‘cue reader’” (Franklin & Montgomery, 2013, p. 2).

Below you will find a chart of all the manual parameters found in Cued Speech for American English.



Click here for a video link to learn more about Cued Speech.


Who was the Creator of Cued Speech?

In 1966-1967, Dr. Richard Orin Cornett created the system called Cued Speech.


He first worked at the United States Office of Education and later became a professor of mathematics, electronics, and physics at Gallaudet University, as well as vice president of the university late in his life. Other systems, such as Visual Phonics, attempted to compete with Cued Speech’s success; however, Cornett’s system remains the most successful visual system supporting phonemic awareness for deaf and hard of hearing children.

Who Uses Cued Speech?

The following is a list of Cued Speech users (please note that this is not an exhaustive list):

  1. Deaf and Hard of Hearing Individuals
  2. Cochlear Implant Users
  3. Hearing Assistive Device Users
  4. Families of the Deaf and Hard of Hearing
  5. Individuals with Language Learning Difficulties
  6. CODAs (Children of Deaf Adults)
  7. NCSA Members

Cued Speech is very easy to learn. In fact, adults and students can learn the basics of Cued Speech in just a weekend. Today, Cued Speech has been adapted to over 60 spoken languages, so many families of non-English speakers may opt to use Cued Speech in the home. Since 9 out of 10 deaf children are born to hearing parents, ASL is most likely foreign to the family and will take years of commitment before family members begin to sign fluently; therefore, Cued Speech is often chosen as a mode of communication between deaf children and their hearing parents or siblings.

Thanks to Cued Speech’s visual representation of English phonemes, the system is effective for Deaf and Hard of Hearing users of all sorts. Supplementing speech with Cued Speech increased the speech perception scores of profoundly deaf children who had used Cued Speech for at least three years; scores increased from 30% to 80%. Studies also showed that deaf and hard of hearing students who used Cued Speech read at the same grade level as their hearing peers, in contrast to the national average of deaf students graduating high school with a fourth-grade reading level (LaSasso, Crain, & Leybaert, 2010, p. 98).

Research has shown Cued Speech to be an effective communication tool for children with cochlear implants: Spencer and Marschark (2005) found that implanted children who used Cued Speech demonstrated higher speech intelligibility and more frequent correct syntax than peers who used oral or manual modes of communication. Due to technological limitations, implanted children may rely more on speech reading; auditory discrimination supported through Cued Speech can therefore help children with hearing loss avoid language delay. Cochlear implants struggle to discriminate musical pitch and speech in loud background noise, and “it is likely that the way the cortex integrates auditory and visual signal is different in children with a cochlear implant than it is in normally hearing children” (LaSasso, Crain, & Leybaert, 2010, p. 109). Cued Speech users also displayed a higher number of content and function words in their vocabulary. Additional research found that even children with cochlear implants benefited from the system through enhanced speech perception and auditory discrimination (LaSasso, Crain, & Leybaert, 2010, pp. 107-114). See chapter six of “Cued Speech and Cued Language for Deaf and Hard of Hearing Children” for additional information on Cued Speech with implanted children.

(LaSasso, C. J., Crain, K. L., & Leybaert, J. (2010). Cued Speech and Cued Language for Deaf and Hard of Hearing Children. San Diego, CA: Plural Publishing.)

The National Cued Speech Association has its own “culture” of cuers. Members meet annually or biannually at cue camps hosted throughout the United States and the UK. Click here for a calendar of upcoming cue camps. Additionally, below is a link to the National Cued Speech Association homepage, where you can connect with Cued Speech users.

National Cued Speech Association


Is Cued Speech part of American Sign Language (ASL)?

American Sign Language is a language with its own phonology, pragmatics, semantics, syntax, and morphology, and it is very much independent from English or any other language. Although some signs have iconicity, the system is not pantomimic, and the majority of signs have proven to be, in fact, non-iconic. William Stokoe, a linguist, recognized ASL as a legitimate language in the 1960s. Today, the United States, as well as many other countries throughout the world, nationally recognizes its sign language as a legitimate language.

As we have already discussed, Cued Speech is a communication system based on written and spoken language; therefore, Cued Speech is not part of American Sign Language. It is a tool to support English learners. It supplements information produced through auditory means by visually representing language at the phonemic level needed for literacy, speech development, auditory discrimination, and speech reading development.

Just to Reiterate:

Cued Speech is not American Sign Language (ASL)

Cued Speech is not Manually Coded English (MCE)

Cued Speech is not Signing Exact English (SEE)

Cued Speech is not Conceptually Accurate Signed English (CASE)

Cued Speech is not Pidgin Signed English (PSE)

Cued Speech is not Finger Spelling

Cued Speech is not a language

Cued Speech is a tool for literacy, speech, and language development

Cued Speech is a Communication System

Cued Speech is a Visual Representation of written and spoken English

What are the Mechanics of Cueing?

Physical Requirements:

  • Appropriate arm posture and side placement: the arm should be raised comfortably for placement and movement. Fingertips should be at the same level as the chin, and the forearm should be 45-80 degrees from the floor. Make sure handshapes made at the side location are performed approximately 4 inches from the center of the chin.

The Mouth Placement:

  • Only the tips of the fingers should touch the corner of the mouth. Since Cued Speech is 50% speech reading, you must be careful NOT to cover the mouth.

The Chin Placement:

  • Always use the tip of the pointer finger for contact and always aim for the center of the chin when cueing.

The Throat Placement:

  • Always use the tip of the pointer finger for contact with the larynx, or the area 2-3 inches below the center of the chin.

The Importance of Consistent Touching:

  • In order to transmit correct information through cueing, you must always touch the body part when required. In doing so, the cuer gains tactile feedback that supports the proper communication of meaning. Inconsistent touching can lead to parallax (an error that occurs when a cue is viewed at an angle or from an improper location).
  • Consistent touching helps maintain consistent synchronization.

Acquiring and Maintaining Consistent Synchronization:

  • Synchronization is acquired with fluency in Cued Speech. It is important to synchronize handshapes and placements in order to portray the appropriate spoken and written English properties.

The Timing Movements:

  • Use successive touching when a cue is performed at the same place as the previous cue.
  • Use movement from the side location to indicate the timing of an initial vowel in a word such as apple or angry.
  • Use the “flick” rule by flicking your hand approximately 1/4 inch forward and back to indicate differences in sounds when the same handshape is performed from the side location. The rule can also be applied to consonants in isolation.

Other Requirements:

  • Cue intonation whenever possible. Intonation can be shown by changing the angle of the cueing hand.
  • Cue with both hands to indicate different speakers.
  • Show additional suprasegmentals of spoken or written English through facial expressions and body movements.

(Franklin, H., & Montgomery, J. (2013). Introduction to Cued Speech: The system and its applications to traditionally spoken languages. New York, NY: Teachers College, Columbia University, pp. 8-14.)


What Does Research Tell Us About Cued Speech?

Dodd (1976) and Dodd et al. (1995) found that 60% of oral and spelling errors made by deaf and hard of hearing individuals stemmed from lipreading ambiguity. These ambiguities include “stopping (tip for ship), suppression of final segments (ma for mat), and suppression of weak syllables (nana for banana)” (LaSasso, Crain, & Leybaert, 2010, p. 97).

Erber (1967), Green and Miller (1985), and Summerfield (1987) “suggest that lipreading gives information about place of articulation but no information about voicing or nasality” (LaSasso, Crain, & Leybaert, 2010, p. 100).

Alegria and Lechat (2005) found “when manual cues were congruent with mouthshapes, cueing substantially improved performance compared with pure lipreading conditions… [In fact, there is] strong evidence in favor of the notion that manual cues are processed even if they are incongruent with the lipread information” (LaSasso, Crain, & Leybaert, 2010, pp. 100-101).

Nicholls and Ling (1982) found that “the speech reception scores of profoundly deaf children taught at school with cued Speech for at least three years increase from about 30% for both syllables and words in the lipreading alone condition to more than 80% when cues were added” (LaSasso, Crain, & Leybaert, 2010, p. 98).


Sumby and Pollack (1954) found that “visual speech information dramatically enhances the identification of speech when the auditory information is degraded by noise. Auditory and visual modalities are complementary in the transmission of phonetic features” (LaSasso, Crain, & Leybaert, 2010, p. 108).

Perier, Charlier, Hage, and Alegria (1988) found “an increase from 39% correct responses in the lipreading condition to 72% in the lipreading + cues condition for a group of children exposed to Cued Speech, and from 37 to 53% for those who were exposed to Cued Speech later and only at school, suggesting a variability related to experience in perceiving and discriminating the phonetic structure of Cued Speech” (LaSasso, Crain, & Leybaert, 2010, pp. 107-108).

(LaSasso, C. J., Crain, K. L., & Leybaert, J. (2010). Cued Speech and Cued Language for Deaf and Hard of Hearing Children. San Diego, CA: Plural Publishing.)

What Online Resources Can I Use to Supplement my Student’s/Child’s Learning?

Click on the links below to be directed to the website’s homepage.

  1. Success for Kids with Hearing Loss
  2. Hands and Voices 
  3. StarFall
  4. Sheppards Software
  5. Math Fact Cafe
  6. Enchanted Learning
  7. Ed Helper
  8. Super Teacher Worksheets
  9. Brain Pop Jr.  (Username and Password Required)
  10. Tumble Books
  11. Book Flix (Username and Password Required)
  12. Discovery
  13. United Streaming
  14. RAZ Kids (Username and Password Required)
  15. PBS Kids