Individuals with autism spectrum disorder (ASD) have difficulty engaging in social interactions, including making eye contact with other people and sharing attention with others about objects in their environment (joint attention). Many interventions aim to help individuals use gaze more effectively during social interactions. Currently, there is a shortage of objective, quick, and reliable measures that can detect changes in social communication behaviors. Without good measures to capture change, it is almost impossible to determine whether a treatment is effective or is helping to ameliorate ASD symptoms. In addition, the measures currently used to assess change are often time-intensive, making them impractical for many clinical settings.
James Rehg and Agata Rozga at the Georgia Institute of Technology, in collaboration with Rebecca Jones and Catherine Lord at Weill Cornell Medical College, plan to develop objective, automated measurements of eye contact and joint attention and to test whether these automated measures are as effective as human video coding. The Georgia Tech team contributes considerable expertise in computational analysis, while the Weill Cornell group contributes expertise in the ASD phenotype, a pairing that positions this research to succeed.
Individuals in the study will participate in a series of semi-standardized play interactions designed to elicit eye contact and joint attention. Automated coding will be compared with manual coding as well as with clinical ratings, and the measures will be validated first in a group of typically developing children and subsequently in children with ASD. A key objective of this research is to develop an automated measurement system that is reliable and less time-intensive than manual coding, which is additionally subject to human error. If successful, the novel automated measure could be scaled across clinical settings and eventually natural environments (e.g., home, school) and be used to effectively demonstrate change in social communication behaviors in ASD.
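Validation of this kind typically reduces to measuring agreement between automated codes and human codes of the same behavior. As a minimal illustrative sketch (not the study's actual protocol, and using hypothetical labels), the snippet below computes Cohen's kappa, a standard chance-corrected agreement statistic, over frame-level eye-contact codes from an automated detector and a human coder:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two label sequences.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance given
    each coder's marginal label frequencies.
    """
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of frames with identical labels.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical frame-level codes for one clip (1 = eye contact, 0 = none).
automated = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1]
manual    = [0, 1, 1, 0, 0, 0, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(automated, manual):.2f}")  # kappa = 0.60
```

Because kappa discounts agreement that would occur by chance alone, it is a more conservative benchmark than raw percent agreement when deciding whether an automated measure can stand in for human video coding.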