Competence Area "Situation"
Besides basic reactive behaviour, situational controllers need to sense and integrate body and sensor information with situation information so that the robot can build a representation of the task and control its interaction with humans. In particular, the integration of multi-modal sensory input leads to greater situated awareness, allowing the robot to predict, detect and avoid threats before they arise. Situated controllers also need to exploit additional cues from the human interaction: for instance, spoken utterances or sounds can modify movement plans and interrupt actions spontaneously and instantaneously. Here, we pursue an approach in which the situational control system co-adapts with the sensorimotor body system and the environmental constraints.
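The idea of a spoken cue interrupting an ongoing movement plan can be illustrated with a minimal sketch. Everything here (the `Controller` class, the `"stop"` cue, the action names) is hypothetical and not part of the SECURE architecture; the sketch only shows the control-loop pattern of checking interaction cues before committing to the next motor action.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Controller:
    """Hypothetical situational controller: a queued motion plan
    plus a buffer of interaction cues (e.g. heard utterances)."""
    plan: List[str]
    cues: List[str] = field(default_factory=list)
    executed: List[str] = field(default_factory=list)

    def hear(self, utterance: str) -> None:
        # A spoken utterance or sound arrives asynchronously.
        self.cues.append(utterance)

    def step(self) -> Optional[str]:
        # Check interaction cues before the next motor action.
        if "stop" in self.cues:
            self.plan.clear()   # interrupt: drop the remaining plan
            self.cues.clear()
            return None
        if not self.plan:
            return None
        action = self.plan.pop(0)
        self.executed.append(action)
        return action

# Example: a reach-grasp-lift plan interrupted mid-execution.
ctrl = Controller(plan=["reach", "grasp", "lift"])
ctrl.step()        # executes "reach"
ctrl.hear("stop")  # a spoken cue arrives
ctrl.step()        # plan is interrupted; "grasp" and "lift" are dropped
```

In a real system the cue buffer would be fed by a speech- or sound-recognition front end, and interruption could trigger replanning rather than a bare stop; the per-step cue check is what makes the reaction effectively instantaneous at the controller's time scale.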
Within SECURE, the fellows will work on a variety of neuro-inspired control systems ranging from low-level sensorimotor control to higher-level control tasks related to language-based interaction between humans and robots. Neurocognitive evidence on action learning and representation, such as the evidence on the mirror neuron system for action production and recognition, supports an integrated view of sensorimotor control in which touch, proprioception, and vision are intertwined with motor information in a multisensory representation of the space around the body. Moreover, from a developmental perspective, these sensorimotor representations are also integrated with linguistic knowledge. Thus, some of the fellows’ projects will develop statistical and neuro-inspired robot control systems to support safe interaction with humans.