SAN DIEGO, Calif. The Event-Related Potentials Lab (ERPL) at the University of California, San Diego, directed by Steven Hillyard, may have found the "pulse" of human attention. The ERPL recently reported specific progress toward modeling how selective attention cued by sound can speed up object recognition.
The research shows that different sense channels (for example, sight and sound) interact to help the brain assemble all of the elements of a specific perception.
The ability to accurately model human perception, especially recognition and memory, would let virtual reality (VR) engineers coordinate the presentation of sounds with visual sensations, with the aim of eliciting realistic perceptions in the VR participant. Identifying the pulse of human attention would enable VR engineers to "inject" perceptions into participants, rather than merely hoping the participant is paying attention at the right time.
"The mission here at the Event-Related Potentials Lab is to characterize the patterns in the brain corresponding to attention, perception, recognition and memory. This study shows how paying attention to sound influences our ability to see," said ERPL researcher John McDonald.
Preps human attention
The study reports that paying attention to a sound's location is independent of looking in that direction, and furthermore, that preceding a visual presentation with sound preps the participant for faster and more accurate visual recognition. "The brain relates sound, sight, touch and all the senses together, and our ability to pay attention to one affects what we perceive with another," said McDonald.
For example, when a pedestrian hears a motorcycle coming from a direction in which he or she is not looking, that person's visual system is prepped to see a motorcycle before the individual's eyes actually acquire an image. The study by Hillyard, McDonald and project scientist Wolfgang Teder-Salejarvi supports the hypothesis that even if gaze is not directed toward the source of the motorcycle sound, the sound cue pre-attentively tunes the visual neurons to have "Harley-vs.-Honda" comparisons online for a quicker identification when the visual image finally arrives.
"We made our subjects focus in the center of their visual field, then made a sound on one side followed by a faint green light and a brighter red one to mask any green afterimages. Then we asked them if they saw a green light and on which side," said McDonald.
With that simple procedure, the study was able to statistically validate that the brain's attention can be caught by one sense modality (sound) that then focuses a second sense modality (sight) to perceive more efficiently. Attention itself, the study maintains, is independent of any particular sense modality but cuts across the brain, relating event potentials together. The Event-Related Potentials Lab derives its name from this hypothesis.
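One standard way to validate such a cueing effect statistically is a two-proportion z-test on detection accuracy for cue-valid versus cue-invalid trials. The counts below are hypothetical, for illustration only; the article does not report the study's raw numbers:

```python
from math import sqrt

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Two-proportion z-test: is accuracy in condition A reliably
    higher than in condition B? Returns the z statistic; |z| > 1.96
    corresponds to p < 0.05 (two-tailed)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)  # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts, not the study's data: 170/200 correct when the
# sound cued the target's side, 140/200 when it cued the opposite side.
z = two_proportion_z(170, 200, 140, 200)
```

With these made-up counts, z comes out well above 1.96, which is the kind of result that would let the researchers reject chance as an explanation for the cueing advantage.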
It's not just sight and sound that are involved. The ERPL's overall goal is to characterize the actual brain events that result from each sensory input, and to discover how selective attention can bind together sight, sound, smell, touch and taste into more accurate perceptions of objects. Actual brain recordings were not done for this study, but the lab's primary goal is to use such recordings to validate its models.
"Next we will record dorsal and ventral processing streams from discrete zones of extrastriate visual cortex after the presentation of the sound stimulus to confirm that attention to sound produces enhanced neural activity in the visual cortex," said McDonald.
Next: brain recordings
In other words, the ERPL will study real brain recordings during the selective attention task to identify and characterize how the visual cortex changes in response to attention drawn by sound. Its working hypothesis is that spatial attention is multimodal, facilitating stimuli in all modalities at the location on which the subject has focused, even if that modality is not relevant at the moment. Attention, Hillyard's hypothesis maintains, integrates the different aspects of an event into a unified perception of an object.
"We're now compiling brain recordings that give us a precise measurement of the moment-to-moment changes in the visual cortex as a result of paying attention to sound," McDonald said.
In preliminary work, the lab has stimulated participants with 5- to 6-Hz sound events and recorded participants' responses, which consisted of a continuous oscillatory electrical response in the visual cortex. This signal is so strong and reliably elicited that Hillyard has dubbed it the steady-state visual evoked potential (SSVEP). Because the SSVEP rides beneath the higher-frequency EEG "noise," like a low-frequency carrier signal, it can be accurately tracked from moment to moment. The lab's most remarkable preliminary finding is that the SSVEP seems to track even fine-grained temporal reports of shifts in attention. Whether the researchers have found the "pulse" of human attention, however, must wait for quantitative follow-up studies.
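Tracking a steady-state response at a known stimulus frequency is commonly done by quadrature demodulation: multiply the recording by sine and cosine references at the stimulus frequency and average over short windows. The sketch below illustrates that general technique on a synthetic signal; it is not the ERPL's actual analysis pipeline, and the parameter values are assumptions:

```python
import math

def ssvep_amplitude(signal, fs, f_stim, window_s=1.0):
    """Estimate the amplitude of the component at f_stim (Hz) in each
    successive window of a sampled signal, by correlating with sine and
    cosine references (a simple lock-in / single-bin Fourier estimate).

    signal   : list of samples
    fs       : sampling rate in Hz
    f_stim   : stimulus frequency in Hz (e.g., 6 Hz for a 6-Hz train)
    window_s : window length in seconds over which to average
    """
    win = int(window_s * fs)
    amps = []
    for start in range(0, len(signal) - win + 1, win):
        seg = signal[start:start + win]
        s = sum(x * math.sin(2 * math.pi * f_stim * (start + i) / fs)
                for i, x in enumerate(seg))
        c = sum(x * math.cos(2 * math.pi * f_stim * (start + i) / fs)
                for i, x in enumerate(seg))
        # amplitude of the f_stim component within this window
        amps.append(2 * math.hypot(s, c) / win)
    return amps
```

Because each window averages over many cycles, broadband EEG activity at other frequencies largely cancels out, which is what makes a moment-to-moment readout of the steady-state response feasible.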