Interagency Autism Coordinating Committee (IACC)
Autism Research Database
Office of Autism Research Coordination (OARC)
 

Project Title

Neural basis of cross-modal influences on perception

Principal Investigator

Hillyard, Steven

Description

Many of the objects and events that we encounter in everyday life, such as a barking dog or a honking car, are both seen and heard. A basic task our brains must carry out is to bring together the sensory information that is received concurrently by our eyes and ears, so that we perceive a world of unitary objects having both auditory and visual properties. With funding from the National Science Foundation, Dr. Steven A. Hillyard and colleagues at the University of California, San Diego, are investigating when and where in the brain visual and auditory signals are combined and integrated to produce coherent, multi-dimensional perceptions of objects in the environment. The sensory inputs from the eyes and ears are projected initially to separate regions of the brain specialized for perceiving the visual and auditory modalities, respectively. The timing and anatomical localization of neural interactions between auditory and visual inputs are being analyzed by means of scalp recordings of brain potentials triggered by these sensory events, together with magnetic resonance imaging of stimulus-induced brain activity patterns. A major aim is to analyze the brain interactions that cause a stimulus in one modality (auditory or visual) to alter the perception of a stimulus in the other modality. Three types of such auditory-visual interactions are being studied: (1) the brightness enhancement of a visual event when it is accompanied by a sound, (2) the ventriloquist illusion, a shift in the perceived location of a sound towards the position of a concurrent visual event, and (3) the double-flash illusion that is induced when a single flash is interposed between two pulsed sounds. In each case, the precisely timed sequence of neural interactions that underlies the perceptual experience will be identified, and the influence of selective attention on these interactions will be determined.

The overall aim is to understand the neural mechanisms by which stimuli in different sensory modalities are integrated in the brain to achieve unified perceptions of multi-modal events in the world. Because much of our everyday experience involves recognizing and reacting to the sights and sounds of our surroundings, understanding the principles by which these auditory and visual inputs are synthesized in the brain is important. The ability to combine auditory and visual signals effectively is particularly important in teaching and learning situations, where spoken words must be put together with a variety of pictorial, graphic, and written information in order to understand the material.

This research program can contribute to the development of more efficient learning environments and teaching techniques, and can lead to improved designs for communication media such as audio-visual teaching tools, information displays, and critical warning signals. These studies also explore the role of selective attention in synthesizing auditory and visual signals, work that can inform teaching techniques that emphasize the training of attention. By studying the brain systems that enable auditory and visual inputs to be combined into perceptual wholes, this research can also help to explain what goes wrong in the brains of patients who suffer from abnormalities of perception, including those with learning disabilities, attention deficit disorder, autism, and schizophrenia.

Funder

National Science Foundation

Fiscal Year Funding

0

Current Award Period

2010-2015

Strategic Plan Question

Question 2: How Can I Understand What Is Happening?

Strategic Plan Objective

2O. Not specific to Question 2 objectives

Project Link

Neural basis of cross-modal influences on perception

Institution

University of California, San Diego

State/Country

California

Project Number

1029084

Federal or Private?

Federal

Received ARRA Funding?

No

History/Related Projects

Project Title | Fiscal Year Funding | Fiscal Year | Project Number
Neural basis of cross-modal influences on perception | 156424 | 2010 | 1029084
Neural basis of cross-modal influences on perception | 154104 | 2011 | 1029084
Neural basis of cross-modal influences on perception | 158282 | 2012 | 1029084
Neural basis of cross-modal influences on perception | 163755 | 2013 | 1029084
Neural basis of cross-modal influences on perception | 0 | 2015 | 1029084

 