Language, music, and the brain
Q: Can you give me an example of how brain function is integrated across regions?
People generally believe that language is a left-brain function, separate from music, which is thought to reside in the right hemisphere. The persistence of this belief is not surprising. Research from the 1960s into the 1990s found that patients with damage to certain areas of the left hemisphere suffered from "aphasia," the inability to speak, while those with damage to certain areas of the right hemisphere developed "amusia," the inability to process musical pitch. Although it might initially make sense to conclude that language must be a left-brain function, a closer look reveals that language recruits areas and abilities from both hemispheres, as does music. All major categories of human behavior, especially the skills that schools strive to develop in students, use multiple parts of the brain, not merely the left or right side.
Broca's area and Wernicke's area, two parts of the left hemisphere (in most people), were long thought to be exclusively responsible for the production and perception of language.
[Figure caption: It turns out that the two areas of the left hemisphere shown above are activated for both language and music, suggesting that simple modular theories of brain function do not capture its...]
However, using functional magnetic resonance imaging (fMRI), scientists discovered activation in two overlapping areas when music was being processed: The inferior frontal gyrus and nearby premotor cortex (overlapping Broca's area) were recruited for sight-reading, while reading and listening to a score activated the supramarginal gyrus (overlapping Wernicke's area).
It seems that language areas are not "language," and music areas are not "music." Instead, the left and right hemispheres may have broader functions that are recruited across domains like language and music to support abilities in each. Both hemispheres contribute to the production and understanding of language and music. The left hemisphere seems to specialize in tasks involving hierarchical sequencing (like grammar, syntax, and meaning), and the right hemisphere seems to deal more with contour-based patterns (like melodic contour and large repeating patterns, especially those with emotional significance). The contributions of the left hemisphere are less emotional—such as grammar and definitions of words. The contributions of the right are more emotional—such as the melody and contour that produce the affective music of language (intonations that express our intention, like sarcasm or sincerity). Broadly speaking, the left hemisphere works with the denotation of our language; the right plays with the connotation. In most people, both are essential for fully expressive communication and understanding.
Take a simple example: Two sentences have the same words, syntax, and grammar but communicate very different meanings, depending on how they are said:
"You love me."
"You love me?"
Beyond the rising inflection that distinguishes the question from the statement, speak these words with different melodies, stresses, and pitches, and each will yield a different meaning. The emotional prosody of language—its rhythms and intonations—is neurologically related to music processing.
But there is another layer to prosody: Some languages, like Mandarin, rely on prosody for grammatical and lexical meaning; the syllable "ma," for example, has four different meanings depending on the tone with which it is pronounced (only one of which is "mother"). Because grammar and definitions tend to be left-hemisphere functions, it may be that the purpose to which prosody is put determines which hemisphere will be more heavily recruited. In most Mandarin speakers, the right hemisphere is recruited for tones communicating emotional meaning, and the left is recruited for tones communicating "denotative" meaning, such as the specific meaning of "ma."