SELFCEPTION: Robotic self/other distinction for interaction under uncertainty

SELFCEPTION is an interdisciplinary project that combines robotics and cognitive psychology. I investigate brain-inspired computational models for (i) building robots that learn to recognize their own body in order to improve interaction, and (ii) studying self-construction in humans by means of synthetic models.

Recent evidence suggests that self/other distinction is a key step toward improving interaction and may be the link between low-level sensorimotor abilities and voluntary action, or even abstract thinking. The project follows the hypothesis that learning a “sensorimotor self” will allow humanoid robots to distinguish themselves from other agents during interaction. To that end, SELFCEPTION combines advanced sensorimotor learning with new multimodal sensing devices, such as artificial skin, so that the robot can acquire its own perceptual representation.
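The intuition behind the hypothesis can be illustrated with a toy sketch (not the project's actual model): a robot learns a forward model that predicts the sensory consequences of its own motor commands, and sensations the model cannot predict are attributed to another agent. The linear model, learning rule, and threshold below are illustrative assumptions.

```python
# Toy illustration of self/other distinction via forward-model
# prediction error (an assumption for this sketch, not the
# project's published method).

class ForwardModel:
    """Learns sensation = w * command with an online LMS rule."""
    def __init__(self, lr=0.1):
        self.w = 0.0
        self.lr = lr

    def predict(self, command):
        return self.w * command

    def update(self, command, sensation):
        # LMS: nudge the weight toward reducing prediction error.
        error = sensation - self.predict(command)
        self.w += self.lr * error * command
        return error

def attribute(model, command, sensation, threshold=0.5):
    """Label a sensation 'self' if the forward model predicted it."""
    error = abs(sensation - model.predict(command))
    return "self" if error < threshold else "other"

# Training phase: the robot's own actions produce sensation = 2 * command.
model = ForwardModel()
for _ in range(200):
    for cmd in (-1.0, 0.5, 1.0):
        model.update(cmd, 2.0 * cmd)

print(attribute(model, 1.0, 2.0))  # predictable -> "self"
print(attribute(model, 1.0, 5.0))  # unexpected  -> "other"
```

A well-predicted sensation is credited to the robot's own body; a large prediction error signals an external cause, which is the sense in which sensorimotor learning can ground self/other distinction.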

The project pursues the next generation of perceptive robots: multisensory machines able to build their perceptual body schema and to distinguish their own actions from those of other entities. We already have robots that navigate; now it is time to develop robots that interact.

PI: Dr. Pablo Lanillos