We propose to develop a robotic architecture that uses music-based interaction to deliver multi-modal stimuli and foster emotional engagement. Through this project, we aim to enhance the engagement of children with autism spectrum disorder (ASD) in daily activities using musical stimuli, and to design dual-layer engagement metrics that span the emotional and spatial domains of emotional and social interaction. With this system, children will be stimulated, guided, and ultimately challenged to perform creative interactions through musically driven activities. This process will build semantic knowledge of the behavioral patterns of children with ASD in response to musical stimuli and physical activities, leading to new therapeutic solutions and providing a strong foundation for prolonged treatment and positive engagement. The specific aims we will pursue are:
Aim 1: To develop and evaluate a framework for musical engagement using auditory signals and musical components, together with robotic demonstration of activities such as games and dances.
Aim 2: To design and evaluate a mutual learning process between the robotic agent and the child for engaging in musical activities, and to define a structure of individually parameterized mappings between musical stimuli and the child's spatio-physical responses in order to maximize interaction.
Aim 3: To define and develop key metrics for evaluating emotional and social engagement with the robotic agent during physio-musical sessions.
The proposed research will produce state-of-the-art techniques for autonomous robotic interaction that effectively stimulate children's emotional and social interactivity. The innovative aspect of our proposed research is the use of musical interaction and activities to provide a new therapeutic domain for the effective development of children's emotional and social interactions.
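As a purely illustrative sketch (not the proposal's specified design), the individually parameterized stimulus-response mapping of Aim 2 and the dual-layer engagement metric of Aim 3 could be prototyped roughly as follows; all class, parameter, and function names are hypothetical and chosen only to make the idea concrete.

```python
from dataclasses import dataclass

@dataclass
class ChildProfile:
    """Hypothetical per-child parameters mapping musical stimuli to expected responses."""
    tempo_gain: float = 1.0         # scales expected movement energy with tempo (bpm)
    pitch_sensitivity: float = 0.5  # weight of pitch height on expected gesture amplitude
    emotion_weight: float = 0.6     # relative weight of the emotional vs. spatial layer

def expected_motion_energy(profile: ChildProfile, tempo_bpm: float, mean_pitch: float) -> float:
    """Illustrative parameterized mapping: musical stimulus -> expected spatio-physical response."""
    return profile.tempo_gain * (tempo_bpm / 120.0) + profile.pitch_sensitivity * (mean_pitch / 60.0)

def engagement_score(profile: ChildProfile,
                     observed_energy: float,
                     expected_energy: float,
                     affect_valence: float) -> float:
    """Dual-layer engagement: an emotional layer (valence in [0, 1]) combined with a
    spatial layer measuring how closely observed motion tracks the stimulus-driven expectation."""
    spatial = max(0.0, 1.0 - abs(observed_energy - expected_energy) / max(expected_energy, 1e-6))
    return profile.emotion_weight * affect_valence + (1.0 - profile.emotion_weight) * spatial

# Example: one session snapshot for a hypothetical child profile
profile = ChildProfile(tempo_gain=1.2, pitch_sensitivity=0.4)
expected = expected_motion_energy(profile, tempo_bpm=100, mean_pitch=64)
print(engagement_score(profile, observed_energy=0.9, expected_energy=expected, affect_valence=0.7))
```

In such a scheme, the mutual learning process of Aim 2 would correspond to adapting the per-child parameters over sessions, while Aim 3 would refine and validate the two layers of the engagement score.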