Project Detail

Neural basis of cross-modal influences on perception  

Many of the objects and events that we encounter in everyday life, such as a barking dog or a honking car, are both seen and heard. A basic task our brains must carry out is to bring together the sensory information received concurrently by our eyes and ears, so that we perceive a world of unitary objects having both auditory and visual properties. With funding from the National Science Foundation, Dr. Steven A. Hillyard and colleagues at the University of California, San Diego, are investigating when and where in the brain visual and auditory signals are combined and integrated to produce coherent, multi-dimensional perceptions of objects in the environment. The sensory inputs from the eyes and ears are projected initially to separate regions of the brain specialized for perceiving the visual and auditory modalities, respectively. The timing and anatomical localization of neural interactions between auditory and visual inputs are being analyzed by means of scalp recordings of brain potentials triggered by these sensory events, together with magnetic resonance imaging of stimulus-induced brain activity patterns.

A major aim is to analyze the brain interactions that cause a stimulus in one modality (auditory or visual) to alter the perception of a stimulus in the other modality. Three types of such auditory-visual interactions are being studied: (1) the brightness enhancement of a visual event when it is accompanied by a sound, (2) the ventriloquist illusion, a shift in perceived sound location toward the position of a concurrent visual event, and (3) the double-flash illusion induced when a single flash is interposed between two pulsed sounds. In each case, the precisely timed sequence of neural interactions that underlies the perceptual experience will be identified, and the influence of selective attention on these interactions will be determined. The overall aim is to understand the neural mechanisms by which stimuli in different sensory modalities are integrated in the brain to achieve unified perceptions of multi-modal events in the world.

Because much of our everyday experience involves recognizing and reacting to the sights and sounds of our surroundings, understanding the principles by which auditory and visual inputs are synthesized in the brain is important. The ability to combine auditory and visual signals effectively is particularly important in teaching and learning situations, where spoken words must be put together with a variety of pictorial, graphic, and written information in order to understand the material. This research program can contribute to the development of more efficient learning environments and teaching techniques, and can lead to improved designs for communication media such as audio-visual teaching tools, information displays, and critical warning signals. These studies also explore the role of selective attention in synthesizing auditory and visual signals, research that may lead to improved teaching techniques emphasizing the training of attention. By studying the brain systems that enable auditory and visual inputs to be combined into perceptual wholes, this research can also help to explain what goes wrong in the brains of patients who suffer from abnormalities of perception, including those with learning disabilities, attention deficit disorder, autism, and schizophrenia.

Project Status NEW

Fiscal Year 2010
Funder National Science Foundation
Fiscal Year Funding $156,423.50
Current Award Period 2010-2014
Project Number 1029084
Principal Investigator Hillyard, Steven
Received ARRA Funding? No
Strategic Plan Question Question 2: How Can I Understand What Is Happening? (Biology)
Subcategory Sensory and Motor Function
Strategic Plan Objective 2O. Not specific to Question 2 objectives
Federal or Private? Federal
Institution University of California, San Diego
State/Country California
Web Link 1 Neural basis of cross-modal influences on perception (External web link)
Web Link 2 Neural basis of cross-modal influences on perception (External web link)