Francois Foerster

Neural Basis of Object-Based Actions in Humans

Principal Supervisor: 
Dr. Jeremy Goslin
University of Plymouth

Collaboration partners:

  • Fraunhofer-Institut für Produktionstechnik und Automatisierung IPA
  • Telerobot Labs

Competence Area: Situation

Contact: francois.foerster@plymouth.ac.uk

Biography

Francois FOERSTER is a Marie Curie PhD fellow at the University of Plymouth (UK), where he has developed an original research paradigm combining EEG recordings and Virtual Reality to simulate our everyday interaction with objects.
He captures in real time the neuronal activity responsible for our ability to grasp and use objects.

His research focuses on the Cognitive Neuroscience of Perception and Action: how assemblies of neurons synchronise during object recognition and motor control.

He holds a BSc in Psychology from the University of Strasbourg (France) and an MSc in Cognitive Science from the Grenoble Institute of Technology (France).
His previous research experience focused on spatial memory in humans and rats (LPNC, Grenoble / LNCA, Strasbourg).

He also worked on human-robot interaction (HRI) for non-verbal communication using the iCub (GIPSA-lab, Grenoble).
Paper (Humanoids 2015) available: https://hal.archives-ouvertes.fr/hal-01228887/document

Objectives

I conduct human neurocognitive examinations (EEG) of the processes involved in effector-object interaction, with the aim of enhancing cognitive architectures for object manipulation in robots.

In embodied models of cognition, our representations of objects are formed around the motor programs used to manipulate them. This means that not only do we automatically prepare relevant actions when viewing objects, but we also automatically recollect object knowledge derived from our everyday experience.

Our next generation of robots should also benefit from such cognitive capacities, to solve common issues concerning object manipulation and demonstrate human-like sensorimotor skills.

More generally, my work offers new perspectives on how, and through which neuronal mechanisms, we achieve our fast and efficient ability to manipulate objects.

Current work

To study how humans process objects and prepare motor interactions, I propose an original paradigm combining electroencephalography (EEG) and Virtual Reality (VR). On one hand, EEG allows me to compare the brain activities reflecting how objects are processed (e.g. different shapes, sizes, locations in space) and how different actions are performed (e.g. grasping and moving an object, using a tool). On the other hand, VR allows me to create the objects to be manipulated (as Blender 3D models) in order to control their novelty. Finally, VR also provides an easy way to reproduce the experiments and to record the kinematics of the controller in space during these object-based actions.
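As a rough illustration of the kinematics logging mentioned above, the sketch below time-stamps the VR controller's pose and marks trial onset with an event code so that the samples can later be aligned with the EEG trace. All names here (get_controller_pose, send_eeg_trigger) are hypothetical placeholders, not the actual experimental code.

```python
import time

# Hypothetical interfaces to the VR runtime and the EEG trigger port; in the
# real setup these would come from the VR engine and the acquisition software.
def get_controller_pose():
    """Return the controller pose as (x, y, z, qx, qy, qz, qw) -- placeholder."""
    return (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0)

def send_eeg_trigger(code):
    """Write an event code into the EEG recording -- placeholder."""
    pass

def record_trial(trial_code, duration_s=3.0, rate_hz=90.0):
    """Log controller kinematics for one object-based action, time-stamped from
    the trial-onset trigger so the samples can be aligned with the EEG epoch."""
    send_eeg_trigger(trial_code)                 # mark trial onset in the EEG file
    samples, t0 = [], time.perf_counter()
    while time.perf_counter() - t0 < duration_s:
        samples.append((time.perf_counter() - t0, *get_controller_pose()))
        time.sleep(1.0 / rate_hz)                # sample near the headset frame rate
    return samples

kinematics = record_trial(trial_code=11, duration_s=0.1)  # short demo run
```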

First study: Object knowledge is extracted during visual processing, not motor planning.

How does the human brain implement different object-based actions, such as reaching, grasping and moving, or using a tool?

Our EEG results suggest that semantic properties of objects (e.g. the function of a tool) are accessed during object viewing rather than motor preparation.
This also means that the brain recollects object knowledge by default, independently of the motor task.

https://www.youtube.com/watch?v=Y20SEX14Az4

Study presented in:

BACN 2017: National Conference of the British Association of Cognitive Neuroscience, 7-8 September 2017, Plymouth University, UK.

CuttingEEG 2017: 3rd Symposium on cutting-edge methods for EEG research, June 19-22, Glasgow, UK.

Second study: The role of the mu and beta neural rhythms in the extraction of motor and semantic knowledge about objects

Our EEG analyses suggest that the mu and beta frequency bands play different roles in the extraction of motor information and semantic information about objects for action.

Whereas the mu rhythm (8-13 Hz) appears to process motor information ("How" to manipulate an object), the beta rhythm (14-30 Hz) seems to process only the semantic properties of an object ("What" is the common purpose of the object).
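As an illustration of how such band-specific effects can be quantified, here is a minimal time-frequency sketch using MNE-Python Morlet wavelets on pre-cut epochs; the epochs file name, baseline window and band-averaging helper are illustrative assumptions, not the actual analysis pipeline.

```python
import numpy as np
import mne
from mne.time_frequency import tfr_morlet

# Illustrative input: epochs time-locked to object onset (file name is a placeholder).
epochs = mne.read_epochs("object_viewing-epo.fif")

freqs = np.arange(4.0, 31.0, 1.0)                       # 4-30 Hz covers mu and beta
power = tfr_morlet(epochs, freqs=freqs, n_cycles=freqs / 2.0,
                   return_itc=False, average=True)
power.apply_baseline(baseline=(-0.5, 0.0), mode="logratio")

def band_power(tfr, fmin, fmax):
    """Average time-frequency power within a frequency band (channels x times)."""
    mask = (tfr.freqs >= fmin) & (tfr.freqs <= fmax)
    return tfr.data[:, mask, :].mean(axis=1)

mu_power = band_power(power, 8.0, 13.0)                  # "How" to manipulate the object
beta_power = band_power(power, 14.0, 30.0)               # "What" the object is for
```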

In the video, we depict our experiment in which participants learnt "How" to use a novel tool (performing a key-like manipulation) and "What" the function of this novel tool is (opening a box).

https://www.youtube.com/watch?v=5e4BmL8MSG4

 

Study presented in:

CAOs 2018: International Workshop on Concepts, Actions and Objects, May 3-5, Rovereto, Italy.

CuttingEEG 2018: 4th Symposium on cutting-edge methods for EEG research, July 2-5, Paris, France.

Movement & Cognition 2018:  International conference Movement & Cognition (Brain, Body, Cognition), 27-29 July 2018, Harvard Medical School, Boston, USA

Third study: The role of neuronal rhythms in object perception and the selection of object-based actions

How do our neuronal activities reflect our everyday ability to light a cigarette with a lighter or open a door with a key?

Our current experiment evaluates the neuronal mechanisms responsible for selecting between two tool uses (lighting a candle and opening a chest) associated with novel tools.

Our time-frequency analyses of the EEG recordings suggest that motor and semantic content during object perception and action selection rely on distinct frequency bands.

Whereas learnt motor and semantic information about objects is extracted through the fast mu (8-13 Hz) and beta (14-30 Hz) frequency bands, selecting which of the actions associated with these objects to perform relies on the slow delta (1-4 Hz) and theta (4-8 Hz) frequency bands.
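To make the band distinction concrete, the sketch below estimates average spectral power in each of the four bands from single-trial signals using a standard Welch estimate; the sampling rate, trial arrays and window names are illustrative assumptions, not the actual data.

```python
import numpy as np
from scipy.signal import welch

FS = 500.0  # assumed EEG sampling rate (Hz)
BANDS = {"delta": (1, 4), "theta": (4, 8), "mu": (8, 13), "beta": (14, 30)}

def band_powers(trials, fs=FS):
    """Average power per frequency band across trials.

    trials: array of shape (n_trials, n_samples) for one channel.
    Returns a dict mapping band name to mean power.
    """
    f, psd = welch(trials, fs=fs, nperseg=int(fs))   # psd: (n_trials, n_freqs)
    return {name: psd[:, (f >= lo) & (f <= hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# Example contrast between an object-viewing window and an action-selection
# window (random data stands in for the real epochs).
rng = np.random.default_rng(0)
viewing = band_powers(rng.standard_normal((40, 1000)))
selection = band_powers(rng.standard_normal((40, 1000)))
```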

 

Study presented in:

Hand, Brain and Technology 2018: International conference on transdisciplinary neuroscience, September 7-12, Monte Verità, Ascona, Switzerland.

 

 

Seminar at Rome CNR-ICTS

Title: How and when does the brain access learned properties of objects? Evidence from EEG recordings on neuronal rhythms

Abstract: We spend our entire lives manipulating the objects around us. Two main types of information processing occur during perception of and action with objects: processing the potential effector-object motor interactions (e.g., "How" to use a tool) and the semantic knowledge learned about objects (e.g., the weight, name and function of a tool). But how and when does the brain process this information to select how to manipulate objects? I will present a series of three experiments combining EEG recordings and virtual reality, in which I evaluated the neurophysiological correlates of motor and semantic object knowledge during passive observation of novel tools, but also during the performance of different manipulations (moving or using them). I will discuss the distinct roles of neuronal rhythms in sensorimotor brain areas in the perception of learned affordances and the selection of object-based actions.

Fourth study: The role of linguistic labels in the preparation of object manipulation

Expected Results

To provide a new understanding of interactive object manipulation by examining how action affordances are modulated when objects are shared and manipulated between multiple agents, both human and robot. Electrophysiological experiments will examine the neural activity associated with action affordances during interactive object use, which will inform software models for robot object manipulation and human-robot collaboration.