NESTCOM

Recent theories and experiments in neuroscience indicate that a biologically grounded, neuroscience-oriented approach to multimodal processing will lead to new life-like perception-action systems. In particular, work on mirror neurons suggests that one's own actions, observed actions, and language are closely interrelated: experiments with monkeys and humans show that the same mirror neurons fire both when an action is performed and when it is observed. Mirror neuron areas correspond to cortical areas related to human language centres (e.g. the Broca region) and could provide a cortical substrate for the integration of vision, language, and action. The MirrorBot project will develop and study emerging embodied representations based on mirror neurons. We will develop new techniques, including cell assemblies, associative neural networks, and Hebbian-type learning, in order to associate vision, language, and motor concepts. Using biomimetic multimodal learning and language instruction in a robot, we will investigate the task of searching for objects.
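
To give a flavour of the kind of Hebbian-type associative learning mentioned above, the sketch below is a minimal illustration of our own (not code from the MirrorBot project): concatenated visual, language, and motor feature patterns are stored in a single auto-associative network via Hebbian outer-product learning, so that a partial cue from one modality (here, a language-only cue) can recall the full multimodal pattern. All layer sizes, pattern sparsities, and concept names are illustrative assumptions.

```python
# Minimal sketch of Hebbian associative multimodal binding (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def random_pattern(n, k):
    """Sparse binary pattern: n units, k of them active."""
    p = np.zeros(n)
    p[rng.choice(n, size=k, replace=False)] = 1.0
    return p

# Hypothetical modality sizes (assumptions, not project parameters).
N_VISION, N_WORD, N_MOTOR = 40, 30, 30
N = N_VISION + N_WORD + N_MOTOR

# Three hypothetical multimodal concepts, each a concatenation of
# visual, language, and motor feature patterns.
concepts = [np.concatenate([random_pattern(N_VISION, 5),
                            random_pattern(N_WORD, 5),
                            random_pattern(N_MOTOR, 5)])
            for _ in range(3)]

# Hebbian outer-product learning with clipped (Willshaw-style) weights.
W = np.zeros((N, N))
for p in concepts:
    W = np.clip(W + np.outer(p, p), 0.0, 1.0)
np.fill_diagonal(W, 0.0)

def recall(cue, active_units=15, steps=5):
    """Iteratively complete a partial cue by keeping the most strongly driven units."""
    state = cue.copy()
    for _ in range(steps):
        drive = W @ state
        threshold = np.sort(drive)[-active_units]
        state = (drive >= threshold).astype(float)
    return state

# Cue the network with only the "language" part of concept 0.
cue = concepts[0].copy()
cue[:N_VISION] = 0.0            # remove visual features
cue[N_VISION + N_WORD:] = 0.0   # remove motor features

completed = recall(cue)
overlap = (completed * concepts[0]).sum() / concepts[0].sum()
print(f"fraction of the stored multimodal pattern recovered: {overlap:.2f}")
```

Running the sketch prints an overlap close to 1.0, i.e. the visual and motor components of the stored concept are recovered from the language cue alone; this pattern-completion behaviour is the basic mechanism by which cell assemblies are taken to bind vision, language, and action.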