This award supports a rising interdisciplinary scholar investigating visuomotor integration in typical development and Autism Spectrum Disorder (ASD). Effective navigation and action require accurate visuomotor integration: the use of visual information to guide movement. Visuomotor integration also depends on attentional filtering to ensure that relevant information is processed while irrelevant information is disregarded or suppressed. Few studies have examined visuomotor integration in naturalistic settings that allow measurement of both full-body motion and eye movement. The proposed project combines new technologies to do so in both typical development and ASD, a clinical population with known visual and motor differences, both in the brain regions that support these systems and in functional performance. At present, most treatment approaches address social communication in a broad, qualitative manner without specific attention to the role of visual and motor functioning. This project quantifies differences in eye and body movements during imitative gesturing with a robot partner, as well as during other movements such as walking. By studying how visuomotor integration shapes movement and interaction with real and virtual objects, researchers will gain a better understanding of the impact this skill has on higher-order features of ASD, which in turn will aid the development and delivery of more targeted, evidence-based treatments. Quantitative measures of visuomotor integration may also serve as an effective biomarker for early diagnosis of ASD, since visual and motor skills can be measured precisely earlier in development than social communication skills.
Recent innovations in eye tracking, robotics, and virtual reality have yielded technologies that can be integrated to study the interaction among visual, motor, and attentional systems in real time. This project investigates visual, motor, and attentional processes in ASD and typical development to determine their relative contributions to accurate perception and action, using virtual environments and human-robot interaction tasks that test visual and motor responses to object motion and imitative gesturing. The project aims include refining software that analyzes motion and eye data together by calculating the gaze vector in a manner that accounts for head and body movement. This enables researchers to examine the strategies individuals with and without ASD use when locating and tracking moving objects (e.g., a preference for moving the head versus shifting gaze) or gestures (e.g., a waving hand). In addition to head-eye integration, the project aims include measuring full-body motor responses to object motion. To investigate imitative gesturing comprehensively in the human-robot interaction tasks, the researchers use Dynamic Time Warping (DTW), a technique that allows examination of both spatial and temporal synchrony between a gesture modeled by the robot and the imitative movement generated by a participant. This analytic approach may reveal biomarkers of ASD related to visuomotor integration that would not be evident in studies examining only the spatial properties of imitative gesturing. The proposed project also advances methodology through new tools for data collection and analysis that are specifically suited to investigating perception and action in naturalistic environments.
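The head-and-body-corrected gaze vector described above can be illustrated with a minimal sketch: an eye-in-head gaze direction is rotated by the head's orientation to obtain gaze in world coordinates, which lets head rotation and eye rotation be separated when classifying tracking strategies. The function name, axis conventions, and yaw/pitch parameterization here are illustrative assumptions, not the project's actual software; a real pipeline would use full 3-D orientations (e.g., quaternions) from motion capture.

```python
import numpy as np

def world_gaze_vector(eye_dir_head, head_yaw, head_pitch):
    """Rotate an eye-in-head gaze direction into world coordinates.

    eye_dir_head: unit 3-vector in head coordinates
                  (x = right, y = up, z = straight ahead).
    head_yaw, head_pitch: head orientation in radians
                  (yaw about the vertical axis, pitch about the lateral axis).
    Illustrative only: real systems use full rotation matrices or
    quaternions streamed from a motion-capture rig.
    """
    cy, sy = np.cos(head_yaw), np.sin(head_yaw)
    cp, sp = np.cos(head_pitch), np.sin(head_pitch)
    # Yaw: rotation about the vertical (y) axis.
    R_yaw = np.array([[cy, 0.0, sy],
                      [0.0, 1.0, 0.0],
                      [-sy, 0.0, cy]])
    # Pitch: rotation about the lateral (x) axis.
    R_pitch = np.array([[1.0, 0.0, 0.0],
                        [0.0, cp, -sp],
                        [0.0, sp, cp]])
    return R_yaw @ R_pitch @ np.asarray(eye_dir_head, dtype=float)

# With the eyes centered (looking straight ahead in the head frame),
# the world gaze direction is determined entirely by head orientation,
# so the head's contribution to a tracking movement can be isolated.
straight_ahead = np.array([0.0, 0.0, 1.0])
gaze_world = world_gaze_vector(straight_ahead, head_yaw=np.pi / 2, head_pitch=0.0)
```

Decomposing gaze this way is what makes the head-versus-eye strategy comparison possible: the same world-frame gaze shift can be achieved by a large head turn with centered eyes or a stationary head with a large eye rotation.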
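A minimal sketch of how DTW captures both spatial and temporal synchrony follows, assuming a one-dimensional gesture trajectory (e.g., wrist height over time); the function and variable names are illustrative, not the project's actual analysis code. The alignment path absorbs timing differences between model and imitation, so the residual cost reflects spatial mismatch, while a slowed-down but spatially faithful imitation still aligns well.

```python
import numpy as np

def dtw_distance(ref, imit):
    """Dynamic Time Warping distance between a reference gesture
    trajectory (e.g., the robot model's wrist position per frame)
    and a participant's imitation, via the standard O(n*m)
    dynamic-programming recurrence."""
    n, m = len(ref), len(imit)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(ref[i - 1] - imit[j - 1])  # local spatial mismatch
            cost[i, j] = d + min(cost[i - 1, j],      # imitation lags
                                 cost[i, j - 1],      # imitation leads
                                 cost[i - 1, j - 1])  # frames matched
    return cost[n, m]

# A temporally stretched copy of a gesture (same shape, slower tempo)
# aligns far better than a spatially mirrored one, even though a
# frame-by-frame spatial comparison would penalize both heavily.
gesture = np.sin(np.linspace(0.0, 2.0 * np.pi, 50))   # modeled gesture
slow_copy = np.sin(np.linspace(0.0, 2.0 * np.pi, 80)) # faithful but slower
mirrored = -slow_copy                                  # spatially inverted
```

This is why DTW can surface timing-related differences that purely spatial measures of imitation accuracy miss.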