jillio
shel90 - The statement: "Please cue to me. When you sign, I can't hear what you say" - I can see how you would view this statement as misleading, but in all reality it is true. You have not experienced how Cued Speech teaches the visual representation of sound to deaf people, and how the brain processes that "visual sound". A deaf person who cues develops an "inner voice", similar to that of a hearing person who is, for example, reading silently to themselves. It is not ASL, it is Cued Speech.
That is absurd. Cueing does not allow one to hear sound. A visual representation of sound is not hearing. The brain processes visual information as visual information, period. A deaf person who cannot hear the sound, and has only a visual representation of it, develops an inner voice based on visual information, not sound. The phonological loop in ASL users adapts to the phonemes and morphemes of ASL, and their inner voice depends on that. And the phonological loop is tied to memory, not inner voice.
The brain of a deaf ASL user adapts to process visual information as language, and therefore visual language is processed as visual linguistic input. The brain recognizes the phonemes and morphemes of ASL as linguistic input, just as a hearing person recognizes the phonemes and morphemes of their native language as linguistic input. The brain of the ASL user sees handshapes (phonemes) and the combinations of phonemes, placement, and movement as linguistic components, and they are processed as language; i.e., the brain recognizes these handshapes, movements, and placements as linguistic components and automatically processes them as language, whereas gestures not specifically related to ASL are not processed as linguistic input.

The same is true of a hearing individual. The phonemes and morphemes specific to the language spoken in the environment are processed in the brain as linguistic input, while sounds unrelated to that language are not. The brain easily discerns nonsense syllables and environmental noises as nonlinguistic input. This depends upon a fully functioning auditory system, and it begins at birth. Infants as young as a few weeks old will attend to auditory input that contains the phonemes of their mother tongue, and will not respond to phonemes of another language. The same is true of ASL, only in a visual mode.