“We live in a multisensory world. We’re embedded within this incredibly rich tapestry of sensory information,” says Dr. Mark Wallace, Director of the Vanderbilt Brain Institute and Multisensory Laboratory.
Part of the human brain’s job is to make sense of that tapestry: What input is most important at the moment? What can I safely ignore? One of the core differences between individuals with autism and neurotypicals is in how sensory information is processed. Specifically, evidence is mounting that sensory “binding,” or integrating multiple sensory inputs into one coherent perceived event, occurs differently in the brains of people on and off the autism spectrum. These differences may affect language development and social interaction in autism.
A 2014 study from Wallace’s lab is the first to link sensory binding differences with speech comprehension challenges in autism, a step toward unifying the condition’s many facets.
Sensory Binding is Flexible
First, the team determined the “temporal binding window” (TBW) of the participants: 64 individuals ages 6 to 18, half with autism and half without. The TBW is the window of time within which multiple stimuli may be interpreted as a single event. To determine participants’ TBWs, the researchers used a “simultaneity judgment task,” which is “one of the workhorse tests in our lab,” according to Wallace.
The task asked participants to judge whether two stimuli (one auditory, one visual) were presented simultaneously or not. Three stimulus types were used: simple flashes and beeps, recordings of tools (e.g., a hammer hitting a nail), and faces speaking a single syllable. A range of time gaps between the two stimuli (0 ms to 300 ms) let the researchers measure how far apart in time two stimuli had to be before a participant could tell they were not simultaneous.
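To make the measurement concrete, here is a minimal sketch of how a binding window could be read off simultaneity-judgment data: find the stimulus gap at which “simultaneous” responses drop below some criterion. The data, the 75% criterion, and the interpolation method are all illustrative assumptions, not the study’s actual fitting procedure.

```python
# Illustrative sketch: estimating a temporal binding window (TBW) from
# simultaneity-judgment data. The response rates below are hypothetical.

def estimate_tbw(soas_ms, p_simultaneous, criterion=0.75):
    """Return the gap (in ms) at which the proportion of 'simultaneous'
    responses falls below the criterion, linearly interpolated."""
    pairs = sorted(zip(soas_ms, p_simultaneous))
    for (s0, p0), (s1, p1) in zip(pairs, pairs[1:]):
        if p0 >= criterion and p1 < criterion:
            # linear interpolation between the two bracketing gaps
            return s0 + (p0 - criterion) * (s1 - s0) / (p0 - p1)
    return pairs[-1][0]  # responses never fell below criterion in range

# Hypothetical data: responses stay near ceiling at short gaps,
# then fall off as the flash and beep drift apart in time.
soas = [0, 50, 100, 150, 200, 250, 300]
p_sim = [0.98, 0.95, 0.85, 0.60, 0.35, 0.15, 0.05]

tbw = estimate_tbw(soas, p_sim)
print(round(tbw, 1))  # → 120.0 ms for this hypothetical child
```

A wider window on this measure means larger gaps are still experienced as one event, which is the quantity the comparisons below turn on.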
The team found that the TBWs of both groups (autism and neurotypical) widened as stimulus complexity increased, binding information occurring over a longer time frame. When the stimuli were simplest, participants could detect non-simultaneity at shorter gaps. But as for a difference between the children with autism and those without? For the flash/beep and tool trials, “There was absolutely no difference. Which was really a violation of our prediction,” said Wallace. Only in the speech trials did a difference between the two groups emerge: children with autism still perceived stimuli that were farther apart in time as simultaneous.
Since the 2014 study, Wallace’s lab has added more data to this line of research. “We now have about another 50 children in this cohort,” he said at a seminar at the University of Maryland, Baltimore on February 11, 2016. “The differences in the flash/beep have emerged, the differences in tools have emerged, and the differences in speech are even greater.”
Seeing is Believing
Wallace’s team also tested the “McGurk Effect”: If someone watches a person’s lips move to produce one sound (“ga”) but hears another (“ba”), they may actually perceive that a third syllable was spoken (“da,” the “percept”). Participants on the autism spectrum were less likely to report perceiving the combined “da,” and instead were more likely to prioritize the auditory stimulus. The two groups performed similarly on visual-only (lip-reading) trials, so the preference for the auditory stimulus among individuals with autism doesn’t seem to reflect a challenge with reading lips and faces.
Wallace went even further. He wondered, “What’s the relationship between the temporal binding window in the individual child, and the nature of the McGurk percepts?” It turns out that for individuals with autism, the wider one’s TBW, the less likely one is to perceive the combined “da” as a result of the McGurk Effect. There was no such correlation for neurotypicals.
Then the researchers presented participants with either two auditory or two visual stimuli, and participants attempted to determine which came first, with gaps varying from 10 ms to 250 ms. There was no significant difference between neurotypical individuals and those with autism in this experiment: both groups required the same separation between stimuli to accurately determine which stimulus was presented first.
This research indicates that individuals with autism have wider TBWs specifically during speech processing tasks, and that a wider TBW is associated with reduced susceptibility to the McGurk effect. Both of these findings suggest that difficulty with speech processing, part of the communication challenges inherent to autism, may be linked to differences in sensory binding.
“I believe strongly that sensory and multisensory representations are basically building blocks for perceptual and cognitive representations,” said Wallace. In other words, sensory processing may be a prerequisite for higher-order cognitive functions.
These findings encourage “perceptual training” as a possible intervention strategy. Wallace has conducted preliminary trials with eight children, all of whom narrowed their TBWs after training. But the improvements were not limited to the TBW. “The really cool piece of information,” Wallace said, “is that when we have these kids come back and we look at measures of speech, there’s a social interaction improvement as well.”
Wallace’s work establishes an important link between sensory differences and speech processing challenges, and he suspects additional connections exist between sensory processing and higher-order cognitive functions.
“We think we may be onto something,” he said. “Training low-level sensory processes has cascading effects, and may be useful as a really important therapeutic tool in autism. It’s early, but we’re really excited about it.”
Stevenson RA, Siemann JK, Schneider BC, Eberly HE, Woynaroski TG, Camarata SM, Wallace MT. (2014). Multisensory temporal integration in autism spectrum disorders. The Journal of Neuroscience, 34(3):691-697.