Description
This Expedition will develop novel computational methods for measuring and analyzing the behavior of children and adults during face-to-face social interactions. Social behavior plays a key role in the acquisition of social and communicative skills during childhood. Children with developmental disorders, such as autism, face great challenges in acquiring these skills, resulting in substantial lifetime risks. Current best practices for evaluating behavior and assessing risk are based on direct observation by highly trained specialists and cannot easily be scaled to the large number of individuals who need evaluation and treatment. For example, autism affects 1 in 110 children in the U.S., with a lifetime cost of care of $3.2 million per person. By developing methods to automatically collect fine-grained behavioral data, this project will enable large-scale objective screening and more effective delivery and assessment of therapy. Beyond the treatment of disorders, this technology will make it possible to automatically measure behavior over long periods of time for large numbers of individuals in a wide range of settings. Many disciplines, such as education, advertising, and customer relations, could benefit from a quantitative, data-driven approach to behavioral analysis.

Human behavior is inherently multi-modal: individuals use eye gaze, hand gestures, facial expressions, body posture, and tone of voice along with speech to convey engagement and regulate social interactions. This project will develop multiple sensing technologies, including vision, speech, and wearable sensors, to obtain a comprehensive, integrated portrait of expressed behavior. Cameras and microphones provide an inexpensive, noninvasive means of measuring eye, face, and body movements along with speech and nonspeech utterances. Wearable sensors can measure physiological variables such as heart rate and skin conductivity, which carry important cues about levels of internal stress and arousal that are linked to expressed behavior.

The project is developing unique capabilities for synchronizing multiple sensor streams (a minimal sketch of such stream alignment follows this description), correlating these streams to measure behavioral variables such as affect and attention, and modeling extended interactions between two or more individuals. In addition, novel behavior visualization methods are being developed to enable real-time decision support for interventions and the effective use of repositories of behavioral data. Methods are also under development for reflecting the capture and analysis process back to users of the technology.

The long-term goal of this project is the creation of a new scientific discipline of computational behavioral science, which draws equally from computer science and psychology in order to transform the study of human behavior. A comprehensive education plan supports this goal through the creation of an interdisciplinary summer school for young researchers and the development of new courses in computational behavior. Outreach activities include significant and ongoing collaborations with major autism research centers in Atlanta, Boston, Pittsburgh, Urbana-Champaign, and Los Angeles.
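The following is a minimal, illustrative sketch (in Python) of the kind of multi-modal stream alignment mentioned above; it is not the project's actual pipeline. It resamples several sensor streams, each recorded at its own rate with its own timestamps, onto a shared clock by nearest-timestamp lookup, so that, for example, a gaze estimate, an audio feature, and a wearable heart-rate sample can be compared at the same instant. All stream names and sampling rates below are hypothetical.

from bisect import bisect_left
from typing import Dict, List, Tuple

Stream = List[Tuple[float, float]]  # (timestamp in seconds, measurement)

def nearest_sample(stream: Stream, t: float) -> float:
    """Return the measurement whose timestamp is closest to time t."""
    times = [ts for ts, _ in stream]
    i = bisect_left(times, t)
    if i == 0:
        return stream[0][1]
    if i == len(stream):
        return stream[-1][1]
    before, after = stream[i - 1], stream[i]
    return after[1] if (after[0] - t) < (t - before[0]) else before[1]

def synchronize(streams: Dict[str, Stream], rate_hz: float) -> List[Dict[str, float]]:
    """Resample every stream onto a shared clock running at rate_hz,
    restricted to the time window covered by all streams."""
    start = max(s[0][0] for s in streams.values())   # latest stream start
    end = min(s[-1][0] for s in streams.values())    # earliest stream end
    step = 1.0 / rate_hz
    rows, t = [], start
    while t <= end:
        row = {"t": round(t, 6)}
        row.update({name: nearest_sample(s, t) for name, s in streams.items()})
        rows.append(row)
        t += step
    return rows

if __name__ == "__main__":
    # Hypothetical streams at different native rates:
    # gaze x-coordinate at 30 Hz, audio energy at 100 Hz, heart rate at 1 Hz.
    streams = {
        "gaze_x": [(i / 30.0, 0.1 * i) for i in range(300)],
        "audio_energy": [(i / 100.0, 0.01 * i) for i in range(1000)],
        "heart_rate": [(float(i), 70.0 + i) for i in range(10)],
    }
    aligned = synchronize(streams, rate_hz=10.0)
    print(len(aligned), "aligned samples")
    print(aligned[0])

A real deployment would also have to handle clock drift across devices, dropped samples, and streams of very different fidelity, which this sketch deliberately ignores.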
History/Related Projects