Learning Body Models: From Humans to Humanoids
Type of document: habilitation thesis
Institution assigning rank: České vysoké učení technické v Praze, Fakulta elektrotechnická
Humans and animals excel at combining information from multiple sensory modalities, controlling their complex bodies, and adapting to growth, failures, or the use of tools. These capabilities are also highly desirable in robots, and machines display them to some extent, yet artificial creatures still lag behind. The key foundation is an internal representation of the body that the agent (human, animal, or robot) has developed. The mechanisms by which body models operate in the brain are largely unknown, and even less is known about how they are constructed from experience after birth. In collaboration with developmental psychologists, we conducted targeted experiments to understand how infants acquire their first "sensorimotor body knowledge". These experiments inform our work constructing embodied computational models on humanoid robots that address the mechanisms behind the learning, adaptation, and operation of multimodal body representations. At the same time, we assess which features of the "body in the brain" should be transferred to robots to give rise to more adaptive, resilient, self-calibrating machines. We extend traditional robot kinematic calibration, focusing on self-contained approaches that require no external metrology: self-contact and self-observation. We present a problem formulation that allows several ways of closing the kinematic chain to be combined simultaneously, along with a calibration toolbox and experimental validation on several robot platforms. Finally, alongside models of the body itself, we study peripersonal space, the space immediately surrounding the body. Again, embodied computational models are developed, and we subsequently study the possibility of applying these biologically inspired representations to safe human-robot collaboration.