An Auditory Illusion
Does how we speak determine how we hear?
Researcher Elizabeth Petitti sits in a quiet lab, opens her laptop, and plays two musical notes. Some would hear the notes rise in pitch, while others would hear them fall. Why would two people hear the same notes differently? The answer may improve our understanding of how our auditory system develops, and may help speech-language pathologists who work with people who have hearing impairment.
Petitti says the answer comes down to the way our brains perceive two components that make up sound: fundamental frequency and harmonics.
A note’s fundamental frequency is the primary element of sound from which our brains derive pitch—the highness or lowness of a note. Harmonics give a note its timbre, the quality that makes instruments sound distinct from one another.
Many sounds in the world are made up of these tones, whether you strike a key on a keyboard, play a note on a clarinet, or say a letter, says Petitti (’15), who graduated from Boston University’s Sargent College of Health & Rehabilitation Sciences with a master’s in speech-language pathology. Our brains expect the fundamental and the harmonics to be present in any given note. But when some of this information drops out, “the way you perceive the note can change in surprising ways,” says Petitti’s mentor, Tyler Perrachione, a Peter Paul Career Development Professor at Sargent and director of the Communication Neuroscience Research Laboratory.
Petitti explains that when she removes the fundamental from a tone (using signal processing software) and then plays that note, the listener’s brain automatically supplies the pitch. People’s brains deliver this information in different ways: they either fill in the missing fundamental frequency—similar to the way the brain compensates for the eye’s blind spot—or they derive the pitch from the harmonics alone.
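The idea of a tone with a “missing fundamental” can be sketched in a few lines of code. The snippet below is not Petitti’s actual stimulus-generation software—the study’s parameters aren’t given in the article—but a minimal illustration in Python with NumPy, using assumed values (200 Hz fundamental, eight harmonics, 44.1 kHz sample rate): it builds a harmonic complex tone, omits the fundamental, and confirms via the spectrum that the energy at the fundamental frequency is gone while the harmonics, still spaced at multiples of it, remain.

```python
import numpy as np

SR = 44100  # sample rate in Hz (an assumed value, not from the study)

def complex_tone(f0, n_harmonics=8, dur=0.5, sr=SR, skip_fundamental=False):
    """A harmonic complex: sinusoids at f0, 2*f0, 3*f0, ...

    With skip_fundamental=True the f0 component is omitted, but the
    remaining harmonics still imply it -- their spacing is still f0,
    which is the cue the brain can use to 'fill in' the missing pitch.
    """
    t = np.arange(int(dur * sr)) / sr
    start = 2 if skip_fundamental else 1
    return sum(np.sin(2 * np.pi * k * f0 * t) for k in range(start, n_harmonics + 1))

# A 200 Hz tone with its fundamental removed: spectral energy at 200 Hz
# vanishes, while the harmonics at 400, 600, ... Hz are untouched.
tone = complex_tone(200.0, skip_fundamental=True)
spectrum = np.abs(np.fft.rfft(tone))
freqs = np.fft.rfftfreq(len(tone), d=1 / SR)

def band_peak(f, half_width=5.0):
    """Largest spectral magnitude within +/- half_width Hz of f."""
    return spectrum[np.abs(freqs - f) <= half_width].max()
```

After stripping, `band_peak(200.0)` is essentially zero while `band_peak(400.0)` is large—yet listeners played such a tone still report hearing a pitch, supplied either from the implied fundamental or from the harmonics themselves.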
Here’s where it gets interesting: when two different tones that have been stripped of their fundamentals are played in succession, some listeners hear their pitch rising, and some hear it falling. Who’s right?
“There’s no right answer,” Perrachione says. “Pitch only exists in our minds. It’s a perceptual quality.” So, how exactly do we determine pitch? It turns out the language we speak plays a role.
Petitti and Perrachione theorized that individuals who grew up speaking a tone language like Mandarin would perceive pitch differently than those who grew up speaking a non-tone language like English. In Mandarin, for example, a word often has several meanings, depending on how the speaker employs pitch; mā (with a level tone) means “mother,” while mǎ (which drops, then rises in tone) means “horse.”
To test this theory, Petitti invited 40 native-English speakers and 40 native tone language speakers to participate in a study, which she and Perrachione presented at the International Congress of Phonetic Sciences in August 2015. Each participant listened to 72 pairs of tones stripped of their fundamental frequencies, and then indicated if the tones were moving up or down.
Petitti and Perrachione found that language does change the way we hear. Individuals who grow up speaking English are more attuned to a note’s harmonics, while tone-language speakers are more attuned to its fundamental. So, when a note is stripped of that component, tone-language speakers are more likely to derive pitch by supplying the missing fundamental than by listening to the harmonics still present in the note.
“We are interested in how brains change with experience and how our experiences predispose us to certain auditory skills.” — Tyler Perrachione
These results led Petitti and Perrachione to wonder if the difference in pitch is grounded in our earliest language acquisition, or if other experiences can also affect how our brains process sound. For instance, would musicians—who also rely on pitch—perceive sound the same way as tone-language speakers?
When they put the question to the test, Petitti and Perrachione found that neither the age at which a musician began studying nor the number of years they’d practiced affected their perception of pitch. To Petitti, this suggests the way we listen is determined by our earliest brain development. While you may begin learning an instrument as early as age three, “you start language learning from birth,” she says. “So your auditory system is influenced by the language you are exposed to from day one.”
It’s not just theoretical. “Big picture: we are interested in how brains change with experience and how our experiences predispose us to certain auditory skills,” Perrachione says. This understanding could “help us better understand the opposite, when things don’t work quite right,” such as when a person has a disorder like amusia (tone deafness).
Petitti underscores the study’s potential clinical impact; in her career as a speech-language pathologist, she intends to work with clients who have hearing impairments, which will involve teaching them to perceive and use pitch. This ability is “crucial when you’re teaching how to ask a question, and how to use pitch to signal the difference between words,” she says—all skills we typically begin to develop early and unconsciously. For a seemingly simple ability that most of us take for granted, there is much at play.