Shared Evolutionary Ancestor May Be Linchpin in Evolution of Language
How rhesus macaque monkeys process sounds to discern their meaning may lead scientists to an understanding of the point in the evolutionary timeline when the building blocks of language first appeared in human development.
Schmalfeldt: What do these sounds mean to you?
SFX: (Coos, barks and screams of a rhesus monkey.)
Schmalfeldt: Nothing, right? Well, that's because you're not a rhesus monkey. If you were a rhesus monkey, those sounds would mean a great deal to you. How rhesus macaque monkeys process those sounds to discern their meaning may lead scientists to an understanding of the point in the evolutionary timeline when the building blocks of language first appeared in human development. According to research conducted by scientists at the National Institute on Deafness and Other Communication Disorders and the National Institute of Mental Health, the parts of the brain the rhesus monkey uses to make sense of the sounds you just heard correspond to the two principal language centers of the human brain. Scientists say this advances the theory that a shared ancestor to humans and present-day non-human primates may have possessed the key neural mechanisms on which language was built. Now, monkeys are not capable of language in any real sense of the word, according to Dr. Allen Braun, chief of NIDCD's Language Section.
Braun: But they are capable of processing "meaning" that's encoded in an acoustic signal, which is the case with their calls. In the areas of the brain that are involved with processing that meaning are areas that we think eventually became bootstrapped to process these combinatorial features of language in humans.
Schmalfeldt: To measure the brain activity of monkeys as they tried to make sense of various sounds, researchers scanned the animals' brains while they listened to three types of sounds — the recorded coos of other rhesus monkeys, their recorded screams, and assorted non-biological sounds — according to Dr. Ricardo Gil da Costa of the Gulbenkian Science Institute in Portugal, who conducted the study during a three-year joint appointment at the NIDCD and NIMH.
Gil da Costa: What we did was building up a set of non-biological sounds that included a lot of different things, from modern human environment like telephones ringing, doors closing, et cetera, to natural sounds like rain, water flowing, that kind of thing, to computer-synthesized noise. Now what we tried was to make this set of non-biological sounds as broad as possible in a way, and in the other way make sure we'd include most of the bio-acoustic features that are present in both coos and screams separately.
Schmalfeldt: Based on these findings, scientists suggest that the communication centers in the brain of the last common ancestor of humans and rhesus monkeys — particularly those parts of the brain used for interpreting species-specific vocalizations — may have been recruited during the evolution of language in humans. In light of an earlier study from the same group, which showed that these kinds of monkey calls activated brain regions that process high-order visual and emotional information, researchers suggest the language areas of the brain may have evolved from a much larger system used to extract meaning from socially relevant situations — a system in which humans and non-human primates might share similar neural pathways. Further studies will look into which regions of the non-human primate brain are activated when animals listen to other meaningful sounds, such as predator calls, sounds made by humans or other relevant environmental noises. In addition, scientists will study the pattern of brain activation caused by non-auditory stimuli — such as visual images of monkeys producing vocalizations. From the National Institutes of Health, I'm Bill Schmalfeldt in Bethesda, Maryland.