How the brain decodes languages made for long distances
Can you speak Turkish? Can you whistle Turkish? Residents of Kuşköy, a small town along Turkey's Black Sea coast, have been whistling in Turkish for generations. Like other whistled languages, kuşdili, or "bird language," first arose as a way to communicate across distances too great and too noisy for shouting. A whistle carries farther with less effort, making conversation across a valley much more practical. This much was clear, but researchers recently went to Kuşköy to find out how speakers' brains handle such communication. Is it processed like any other language, or does speaking like a bird change the way our language centers function?
A new way to “speak” the same words
While whistling changes the way words are produced and heard, the underlying structure of bird language is still Turkish. Syllables are converted into whistled tones, with a bias toward piercing frequencies that maximize the audible range of each sound. Grammar would therefore likely rely on the brain's normal language centers, located mostly in the left hemisphere, but producing and understanding the whistled sounds could still demand interesting shifts. The human brain leans more heavily on the right hemisphere to process pitch, melody, and rhythm, and that was exactly the difference the researchers needed a way to observe.
Testing how well the brain listens
To test which parts of the brain were most activated by bird language, researchers would ideally have used fMRI to monitor brain activity. Since those machines aren't what you'd call "portable," they opted for a technique called dichotic listening. In this test, participants wore headphones that played a different syllable into each ear simultaneously, then reported which syllable they heard most clearly. Because each ear feeds primarily into the opposite hemisphere, the "winning" ear presumably reveals which hemisphere was most adept at decoding that sound. Spoken Turkish was expected to be recognized more often through the right ear (indicating that the left hemisphere's language centers were active), whereas whistled Turkish might be more easily recognized through the left ear, thanks to the right hemisphere's more musical decoding abilities. Instead, bird language activated both sides of the brain, indicating a much more dynamic process at work. The hemispheres are known to communicate with each other routinely, but this result suggests a more sophisticated degree of interaction.
Further testing of bird language may be limited, though, since it appears to be a dying tradition. The spread of cell phones, and texting in particular, has proven easier and more convenient at both long and short distances. One drawback of a language that projects so well is that it's hard to maintain any kind of privacy. Groups of young men still learn bird language as a point of pride, but if they don't want to share their thoughts with the whole valley, emoji may push their brains from listening to music to deciphering visual icons.
Source: The Whistled Language of Northern Turkey by Michelle Nijhuis, The New Yorker