Mind Reader
To treat language disorders, Tyler Perrachione investigates what makes dyslexic brains different

Celeste Hamre and her sister Britta, 23, are fraternal twins. They have the same blue eyes and amber-blonde hair, the same love of Brie and running. But it was clear early on that there was something different, too. When Britta began learning to read and write, Celeste lagged behind. When Celeste tried to speak new words, a mixed-up jumble spilled out. Sounding out words in front of her class was so embarrassing that Celeste would try to memorize the stories Britta read aloud so she could parrot them back to her teachers and classmates. She coveted the thick New York Times readers her classmates got, but her teacher passed her a skinny abridged version instead.

The girls’ parents signed Celeste up for specialized testing, which revealed that she has a reading disorder called dyslexia. They enrolled her in intensive one-on-one tutoring, and it worked: by the time she was eleven, she recalls, she was snagging books from her siblings and sneaking them into bed. Less than a decade later, Celeste—who graduated high school as valedictorian and joined the Boston University class of 2016 with a full merit scholarship—entered the laboratory of Tyler Perrachione, an assistant professor at BU’s College of Health & Rehabilitation Sciences: Sargent College (SAR). Perrachione studies how language and reading skills develop—and how they sometimes go awry—and he was looking for volunteers with dyslexia, just like Celeste.
Researchers estimate that between 5 and 17 percent of schoolchildren have dyslexia, which is broadly defined as difficulty reading single words. Contrary to common belief, people with dyslexia don’t read words backward, says Perrachione, and the disorder doesn’t have anything to do with overall intelligence.
Intensive training like Celeste’s can help kids with dyslexia become fluent readers, especially when it starts in kindergarten or first grade. But this practice-practice-practice approach is demanding and time-consuming for kids and teachers, and people who start treatment after first grade may still lag behind their peers, says Perrachione. And because most current training regimens emphasize “decoding”—that is, recognizing words by sounding them out—they also fall short of making reading truly automatic, says Karole Howland, a SAR clinical assistant professor who tests new treatments for dyslexia and other learning differences.
“Reading remains a labored process for people with dyslexia,” says Howland. But a better understanding of what exactly goes wrong in the brain when a person with dyslexia reads could help researchers develop better therapies and diagnose dyslexia earlier. “Some of the most successful new interventions are directly based on the findings people have been coming up with through neuroscience and MRI studies,” she says.
Most of what we know about dyslexia and the brain comes from studying how volunteers’ brains “light up,” or become active, as they read. But Perrachione’s approach skips reading and focuses instead on a skill called phonological working memory, which he describes as a person’s ability to “hold speech sounds in mind.” Phonological working memory is important for reading, but also for a host of other daily tasks, he says. “Any time you are listening to speech, you are using phonological working memory to keep track of all the words you’re hearing. When someone gives directions or introduces themselves, when they tell you about what they’re doing later, it supports your ability to keep all this in mind as you hear it.”
It might seem strange to study a reading disorder without actually observing the brain as it reads. But, Perrachione points out, reading isn’t like other brain functions. “It’s not like learning to speak, where kids go ba-ba-ba-ba-ba and next thing you know they’re asking for sandwiches,” he says. “Reading takes a long time and a lot of explicit instruction, and a lot of people still really struggle with it.” That may be because, in the scope of human evolution, reading is a very new invention. “Reading is a technology,” says Perrachione. “It’s a tool we’ve developed, in the same way we’ve developed hammers and tennis rackets and cars. It’s not something the brain has evolved to do.” So while the brain does have a “reading center,” he says, it’s a sort of neurological MacGyver device that has been cobbled together from parts that evolved for other purposes.

That makes dyslexia especially difficult to explain because, unlike disorders with trail-of-breadcrumbs symptoms that lead straight to faults in particular brain structures, it is a mystery with just one clue. Except for reading, the differences between people with dyslexia and people without it are so small that they can only be spotted and studied in the laboratory. So, to gather more clues to the disorder’s origin, researchers have to develop tests that can reveal extremely subtle variations that don’t show up in everyday tasks but might point the way to specific brain anomalies.
One of those subtle differences is in phonological working memory. To examine how well an individual’s phonological working memory is operating, language researchers like Perrachione use a test they call “nonword repetition,” in which the experimenter says a made-up word and asks the subject to repeat it. The words start short—“tector,” “sufting,” “mubler”—and get progressively longer—“dorichiter,” “fandosity,” “perplisteronk.” (Because the words aren’t real English words, the thinking goes, subjects remember them as sounds alone rather than sticking them to any particular meaning or experience.) As the words get longer, everyone has a harder time remembering and saying them back accurately, but the task is markedly more difficult for individuals with dyslexia. Their performance “just falls off precipitously” as the nonwords stretch to four syllables and beyond, says Perrachione.
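For a more concrete sense of how such a task might be scored, here is a minimal sketch in Python. It is not the lab’s actual protocol: the exact-match scoring rule and the grouping by syllable count are illustrative assumptions, and only the example nonwords come from the article.

```python
# Illustrative sketch (not the lab's protocol): scoring a nonword repetition
# task by syllable length, using the example items from the article and a
# hypothetical exact-match scoring rule.

# Each item is (nonword, syllable_count); longer items are harder to hold in mind.
ITEMS = [
    ("tector", 2), ("sufting", 2), ("mubler", 2),
    ("dorichiter", 4), ("fandosity", 4), ("perplisteronk", 4),
]

def score_responses(responses):
    """Return percent-correct per syllable length.

    `responses` maps each nonword to the listener's attempted repetition.
    Real scoring would be phoneme-by-phoneme; exact string match is enough
    to illustrate the idea.
    """
    by_length = {}
    for word, syllables in ITEMS:
        correct = responses.get(word, "").strip().lower() == word
        hits, total = by_length.get(syllables, (0, 0))
        by_length[syllables] = (hits + int(correct), total + 1)
    return {n: 100 * hits / total for n, (hits, total) in by_length.items()}

# Hypothetical subject: accurate on short nonwords, errors on long ones.
example = {"tector": "tector", "sufting": "sufting", "mubler": "mubler",
           "dorichiter": "dorchiter", "fandosity": "fandosity",
           "perplisteronk": "perplisterk"}
print(score_responses(example))  # e.g. {2: 100.0, 4: 33.3...}
```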
The problem isn’t unique to dyslexia. People with autism, Down syndrome, and other language disorders, like stuttering, struggle to remember and repeat nonwords. “There’s something about the ability to hold these sounds in mind that is impaired” across this diverse group of disorders, says Perrachione. But what?
There are two broad possibilities. Psychologists typically think of the brain as a modular system with different functions linked together in sequence. When we hear a word, a network of brain areas known as the language module decodes the message and passes it off to a second network, the memory storage module, for safekeeping. So when an individual’s phonological working memory is subpar, the problem could be in either the language module or the memory module, says Perrachione. If it is purely a memory fault, it should affect memory for other kinds of information too, like strings of numbers or locations of dots on a checkerboard. On the other hand, a bug in the brain’s ability to process the incoming speech sounds or to manage the handoff between the language module and the memory module should leave other kinds of memory intact.
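The logic of that distinction can be made concrete with a toy sketch. The cutoff, the span measures, and the function itself are hypothetical; they only illustrate how a cross-domain profile would point toward one module or the other, not how clinicians actually interpret scores.

```python
# Toy sketch of the reasoning described above, under hypothetical thresholds:
# if verbal *and* non-verbal spans are low, suspect the memory module; if only
# the verbal span is low, suspect the language module or the handoff.

def interpret_profile(nonword_span, digit_span, spatial_span, cutoff=4):
    """Crude illustration only; real interpretation uses normed scores,
    not a single made-up cutoff."""
    verbal_low = nonword_span < cutoff
    other_low = digit_span < cutoff or spatial_span < cutoff
    if verbal_low and other_low:
        return "consistent with a general memory-module account"
    if verbal_low and not other_low:
        return "consistent with a language-module (or handoff) account"
    return "no working-memory deficit apparent on these measures"

print(interpret_profile(nonword_span=3, digit_span=6, spatial_span=5))
```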

Now, with support from a grant from the National Institutes of Health, and help from Terri Scott (GRS’12, MED’19), a graduate student in neuroscience, Perrachione is working to identify the specific parts of the brain that are involved in phonological working memory so that he can figure out which module is malfunctioning. Although psychologists have known since the 1980s that individuals with dyslexia and other language disorders also struggle with phonological working memory, Perrachione’s study is the first to use brain imaging to spotlight the skill and its links to the brain’s language and memory systems. “Anything that we know about working memory versus phonological memory will help us hone our interventions in that area,” says Howland.
The study began in summer 2015. By the time it ends, in June 2018, Perrachione will have scanned the brains of some 60 volunteers, including about 35 adults and kids with dyslexia and 25 adults with typical language skills, using a noninvasive technique called functional magnetic resonance imaging, or fMRI, which shows how hard different parts of the brain are working at a particular task. First, he maps each subject’s brain to find out exactly where his or her brain processes language. (Human brains are different enough that it’s necessary to create an individualized language map for each person, says Perrachione.) Then, subjects do a series of tests while in the MRI machine: a nonword repetition task; a number memory quiz, in which they try to remember and repeat a list of numbers; and a location recall test, in which subjects try to remember the arrangement of polka dots on a grid.
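A rough sketch of the kind of comparison such a design enables: look at how strongly each task drives a subject’s individually mapped language region. The array shapes, task names, and synthetic data below are assumptions for illustration, not the study’s actual analysis pipeline.

```python
import numpy as np

# Illustrative sketch only: comparing task activation within a subject's
# individually mapped "language" region, using synthetic data in place of
# real fMRI images.

rng = np.random.default_rng(0)
shape = (16, 16, 16)                       # toy brain volume

# A per-subject language localizer would define this mask; here it is random.
language_mask = rng.random(shape) > 0.9

# Synthetic activation maps for the three in-scanner tasks.
tasks = {
    "nonword_repetition": rng.normal(1.0, 0.5, shape),
    "digit_recall":       rng.normal(0.3, 0.5, shape),
    "dot_location":       rng.normal(0.2, 0.5, shape),
}

for name, activation in tasks.items():
    # Mean signal inside the individually defined language region.
    print(f"{name:20s} mean activation in language ROI: "
          f"{activation[language_mask].mean():.2f}")
```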
So far, Perrachione has found that subjects who read normally recruit the brain’s language module, not the memory module, to handle nonword repetition. Next, he will begin running the same tests on subjects with dyslexia. He suspects that the language areas will be less active as dyslexic subjects work on nonword repetition, but it’s also possible that the language module will actually work harder, then “max out” prematurely. Or perhaps unexpected parts of the brain will come online, suggesting that the language module is getting a helping hand from brain structures that usually work on other tasks, or conversely, that those areas are “butting in” and derailing the language module. He will also scrutinize linkages between different brain areas using a special MRI scan called diffusion-weighted imaging, which shows how information passes from one part of the brain to another.
“We’d really love to help give a better understanding of what those nonword repetition tasks are telling you about the impairments that kids with language disorders face,” says Perrachione, so that kids can focus their energy where it will have the most impact.
Meanwhile, in a separate study recently published in Neuron, Perrachione and a group of colleagues at the Massachusetts Institute of Technology and Massachusetts General Hospital have uncovered another key difference in how dyslexic brains process incoming information. When people without dyslexia are exposed to new sights and sounds—new voices, faces, or pictures, for instance—their brains take a few seconds to “tune in,” then process them more efficiently after that. But fMRI scans of more than 150 subjects revealed that people with dyslexia don’t adapt in the same way. Their brains treat the signals as brand-new every time, even when they’ve seen or heard them before. This could make it harder for people with dyslexia to hold speech sounds in mind. “Like reading back a poorly written note, it may be that the way the brain is remembering speech sounds in the short term is not as robust as in people who are better at rapid learning,” Perrachione hypothesizes.
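One common way researchers summarize this kind of neural adaptation is as the fractional drop in response when a stimulus repeats. The sketch below uses invented numbers to illustrate the contrast the study describes; it is not data from the paper.

```python
# Sketch of the "neural adaptation" idea, with made-up numbers: typical
# readers respond less to a repeated stimulus than to a novel one, so their
# adaptation index is positive; treating every repetition as brand-new
# yields an index near zero.

def adaptation_index(response_novel, response_repeated):
    """(novel - repeated) / novel: fraction by which the response drops when
    a stimulus repeats. The specific values below are invented for
    illustration only."""
    return (response_novel - response_repeated) / response_novel

typical  = adaptation_index(response_novel=1.0, response_repeated=0.6)
dyslexic = adaptation_index(response_novel=1.0, response_repeated=0.95)
print(f"typical reader adaptation:  {typical:.2f}")   # ~0.40
print(f"dyslexic reader adaptation: {dyslexic:.2f}")  # ~0.05
```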

Celeste knows that she was lucky to get an early diagnosis and first-rate tutoring. Her successes may have surprised those around her, but, she says, they demonstrated what she had always believed—that the hard work of learning to manage her dyslexia made her a stronger student. “People with dyslexia, with the right resources at the right time, can learn how to read and be academically successful,” she says. “Unfortunately, many of these essential resources are not universally available and not all students with dyslexia will be given the correct tools to help them thrive in the classroom.”
“I think that we’re on the cusp of understanding the relationship between the brain and behavior in new ways,” says Perrachione, “and by using the insights we gain from advances in brain imaging, we will be able to create new opportunities to help individuals with communication disorders like dyslexia succeed.”