Understanding infant development through baby humanoid robots.

Matej Hoffmann

Czech Technical University in Prague

Human newborns are at first virtually helpless, but within a year or two they learn to effectively control their bodies, to reach, and to locomote. What brain algorithms underlie this learning? Current knowledge of sensorimotor development is patchy. A common but unverified idea is that learning involves intrinsically motivated exploration of the body and the world, giving rise to coherent internal representations that subsequently allow effective motor control. In this talk, I will question these assumptions.

We longitudinally follow infants in three different scenarios – spontaneous behavior, reaching for tactile stimuli on the body, and reaching for objects presented visually – and use behavioral data from these three contexts to: (i) identify signatures of active exploration and goal-directedness in spontaneous behavior, and (ii) compare reaching to somatosensory and visual targets over development to understand the interplay of target localization and motor control.

To shed light on the mechanisms underlying the development of the “sensorimotor self”, we instantiate these scenarios in baby humanoid robots. I will also present AI tools that we are developing to extract infant 3D pose from videos, retarget the motion data to robots, and replay the motor, proprioceptive, visual, and tactile experience of infants.
