Many primate species, including humans, appear to be specialized for complex social behavior. This behavior is mediated by a combination of visual and auditory communication signals that operate via dedicated neural circuitry. A hallmark of autism, however, is difficulty behaving in a socially appropriate manner, stemming from an inability to process the relevant sensory cues correctly. The long-term goal of this project is to understand how social information from different sensory inputs is combined into a unified concept by the primate brain. Among human and nonhuman primates, faces and voices are the primary means of transmitting social signals. The major aim of this research is to understand how dynamic facial expressions are integrated with vocal expressions in the auditory and multisensory regions of the cerebral cortex. By examining the relationships between auditory and visual signals, the role of facial movement, and the ability of primates to use these signals to make sophisticated judgments about the physical characteristics of social group members, this research may uncover principles of visual-auditory neuronal interactions underlying social cognition in the temporal lobe of the brain.