One of the most peculiar languages in the world, whistled Turkish, is challenging the long-standing idea that the left hemisphere of the brain is solely responsible for processing language and extracting meaning. Spoken, written, and signed languages are all processed predominantly in the left hemisphere, but whistled languages are processed equally by both sides of the brain. It’s a striking discovery suggesting that people who have lost left-hemisphere language processing, following a stroke for instance, might still communicate using their right hemisphere. Just whistle.
Whistled Turkish is still Turkish; only the way it’s conveyed is different. Just as spoken and written English differ as a medium yet carry the same meaning, whistling the language produces the same effect. Today, some 10,000 Turks living in the mountains of north-east Turkey still use this form. It arose as an inventive way to communicate over long distances, since it’s difficult to articulate words when you have to shout to a neighbor who lives hundreds of meters away. Whistled Turkish can reportedly be understood from as far as 5 kilometers away.
Here’s what “One kilogram tomatoes, please” (Bir kilo domates lütfen) sounds like in whistled Turkish.
The discovery was made by Onur Güntürkün of Ruhr-University Bochum in Germany, who was inspired by previous findings showing that music is processed in the right hemisphere. Familiar with the melodic quality of whistled Turkish, Güntürkün thought it was worth investigating.
Nerves from the right ear relay language directly to the left hemisphere, while those from the left ear send the signal first to the right hemisphere, which then transfers it to the left for actual decoding. For instance, if you wear a pair of headphones that play a different syllable in each ear at exactly the same time (“ma” in the left and “am” in the right), you’ll typically report only the sound played in the right ear (“am”). That’s because the right ear’s signal reaches the left hemisphere’s language areas first, while the left ear’s signal takes a detour through the right hemisphere and loses the race.
However, when Güntürkün performed the same test with native speakers of whistled Turkish, syllables played in either ear were identified with equal frequency – sometimes people heard the right-ear sound, sometimes the left-ear one. The results, published in Current Biology, strongly suggest that language processing isn’t solely up to the left hemisphere and that both are involved, perhaps even in conventional languages; subtleties owing to sound frequency or timbre have yet to be uncovered. Previously, a 2005 study used fMRI scans to show that shepherds in the Canary Islands use both hemispheres to interpret a whistled form of Spanish.
“We could show that whistled Turkish creates a balanced contribution of the hemispheres,” Güntürkün says. “The left hemisphere is involved since whistled Turkish is a language, but the right hemisphere is equally involved since for this strange language all auditory specializations of this hemisphere are needed.”
While whistled Turkish is still basically Turkish, understanding it can be difficult even for a native speaker.
“As a native Turkish-speaking person, I was struck that I did not understand a single word when these guys started whistling,” he says. “Not one word! After about a week, I started recognizing a few words, but only if I knew the context.”
Sadly, young people living in the remote Turkish mountains have now turned to mobile phones, and whistled Turkish risks dying out.
“You can gossip with a mobile phone, but you can’t do that with whistling because the whole valley hears,” Güntürkün says humorously.
Source: Science Mag