This research proposes ARIA (Adaptive Robot-mediated Intervention Architecture), a novel and potentially transformative robotic intervention technology intended to accelerate social communication skill development for young children with autism spectrum disorder (ASD). ARIA will fluidly integrate a humanoid robot, a spatially distributed network of cameras, an array of display monitors, and an efficient computational methodology for face, gaze, and gesture detection to create a highly flexible and adaptive intelligent environment with the potential to advance early joint attention and imitation skills in young children with ASD. Application of this system will be examined in two user studies with well-defined samples of young children with ASD, providing specific answers and direction on important questions of generalization and the potential impact of robotic intervention.
Intellectual Merit: The proposed research advances the design and development of intelligent adaptive robotic platforms to offer a potentially transformative intervention application for young children with ASD. The technological innovation proposed here can contribute substantially to new non-invasive, closed-loop human-robot interaction learning paradigms, with broad potential extension to individuals with a wide array of neurodevelopmental conditions and limiting sensory vulnerabilities across the lifespan. From the perspective of the science and technology of robotics, the project will contribute to the design and development of smart environments for learning, intelligent system architectures for adaptive robotics, and affective computing and control of dynamic human-robot interaction. In particular, it has the potential to yield novel, efficient computational methods for affective computing, particularly affective computing mediated by non-invasive gaze and attention processing. It will also contribute to closed-loop gesture-based human-robot interaction by developing new methodologies for gesture recognition and adaptive robot response. The project will develop a framework and tools for designing adaptive environments for enhanced robotic and embodied social interaction that intelligently and fluidly integrate real-time behavioral indices of attention and gesture into flexible, controllable response systems. In short, the proposed system has the potential to fundamentally advance engineering knowledge of intelligent human-robot interaction. This paradigm may also have a potent impact on our understanding of the science of ASD intervention itself: the embedded user studies will test the potential efficacy of robotic intervention on the earliest core symptoms of ASD.
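The closed-loop integration of real-time behavioral indices into an adaptive response system can be illustrated with a minimal sketch. The class names, prompt hierarchy, and escalation rule below are illustrative assumptions for exposition only, not the proposed ARIA architecture: the sketch simply shows a controller that fades prompts after successful joint-attention responses and escalates support after repeated misses.

```python
from dataclasses import dataclass


@dataclass
class BehavioralIndices:
    """One cycle of observations; in a full system these would be derived
    from the camera network's gaze and gesture detection (assumed here)."""
    gaze_on_target: bool     # child's gaze reached the cued monitor/robot
    gesture_matched: bool    # child's gesture matched the robot's prompt


class ClosedLoopController:
    """Toy closed-loop prompt adapter (illustrative, not the ARIA design).

    Prompt levels escalate from least to most supportive after two
    consecutive misses, and fade one level after each success."""

    LEVELS = ["verbal_cue", "verbal_plus_gesture", "physical_orienting"]

    def __init__(self) -> None:
        self.level = 0    # index into LEVELS
        self.misses = 0   # consecutive unsuccessful cycles

    def step(self, obs: BehavioralIndices) -> str:
        if obs.gaze_on_target and obs.gesture_matched:
            self.misses = 0
            self.level = max(0, self.level - 1)   # fade support on success
        else:
            self.misses += 1
            if self.misses >= 2:                  # escalate after two misses
                self.level = min(len(self.LEVELS) - 1, self.level + 1)
                self.misses = 0
        return self.LEVELS[self.level]
```

For example, two missed cycles move the controller from "verbal_cue" to "verbal_plus_gesture", and a subsequent successful cycle fades it back. The real system would replace the boolean observations with continuous gaze, attention, and gesture estimates.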