Humans are masters at predicting others' behavior. By watching our child's facial expression, we know exactly which toy she will go for. When we see someone frown at an open window, we are not surprised when she gets up and closes it. Conversely, a breakdown of these predictions may be one reason why social interactions are so confusing to people with autism. This project tests, using behavioral and psychophysical measures, whether there is a sophisticated mechanism in our brains that ‘knows’ which cues signal the intentions of others and uses this knowledge to predict those people's actions (e.g., looking at something signals interest, a smile signals a tendency to approach). The first aim is to demonstrate that predictions of others' behavior are indeed generated when watching others: we will test whether the perception of different social cues is automatically converted into predictions of their future actions. The second aim is to find out how these predictions come about, and specifically whether they rely on our own action knowledge. The third and final aim is to establish whether such predictions are crucial for social interactions, and whether their breakdown is related to the social difficulties of individuals with autism spectrum disorder.