Integrating multi-sensory input in the body model, an RNN approach to connect proprioception, visual features and motor control

Title: Integrating multi-sensory input in the body model, an RNN approach to connect proprioception, visual features and motor control
Publication Type: Conference Paper
Year of Publication: 2011
Authors: Schilling, M.
Other Numbers: 3267
Abstract

An internal model of one's own body can be assumed to be a central and early representation, as such a model is already required in simple behavioural tasks. Growing evidence shows that such grounded internal models are also applied in higher-level tasks: internal models appear to be recruited in the service of cognitive function. Understanding what another person is doing seems to rely on the ability to step into the shoes of the other person and map the observed action onto one's own action system. This rules out dedicated, highly specialized models and instead presupposes a flexible internal model that can be applied in different contexts and fulfil different functions. Here, we present a recurrent neural network approach to an internal body model. The model can be used in the context of movement control, e.g. in reaching tasks, but can also be employed as a predictor, e.g. for planning ahead. The introduced extension allows visual features to be integrated into the kinematic model. Simulation results show how the model can in this way be utilised in perception.
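The recurrent kinematic body model described in the abstract can be illustrated with a minimal sketch in the spirit of a mean-of-multiple-computations relaxation: the redundant kinematic equations of a two-segment arm are iterated until the chain settles into a posture that reaches a target. All names (`mmc_reach`, `normalize`) and the two-segment setup are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

def normalize(v, length):
    # Re-enforce the fixed segment length after each averaging step
    # (illustrative helper, not from the paper).
    n = np.linalg.norm(v)
    return v * (length / n) if n > 1e-9 else np.array([length, 0.0])

def mmc_reach(target, lengths=(1.0, 1.0), steps=100):
    """Relax a two-segment planar arm toward a target end-effector position.

    Each segment vector is recomputed from the redundant kinematic
    equations (l1 = r - l2, l2 = r - l1) and averaged with its previous
    value; iterating drives the chain into a consistent posture whose
    end effector l1 + l2 reaches the target r.
    """
    # Asymmetric start posture, so the two segments do not stay collinear.
    l1 = np.array([lengths[0], 0.0])
    l2 = np.array([0.0, lengths[1]])
    r = np.asarray(target, dtype=float)
    for _ in range(steps):
        # Average each variable's old value with its redundant recomputation.
        l1_new = 0.5 * (l1 + (r - l2))
        l2_new = 0.5 * (l2 + (r - l1))
        l1 = normalize(l1_new, lengths[0])
        l2 = normalize(l2_new, lengths[1])
    return l1, l2

l1, l2 = mmc_reach([1.2, 0.8])
print(l1 + l2)  # end effector close to the target [1.2, 0.8]
```

Because the relaxation only uses local averaging of redundant equations, the same network can run "forward" as a predictor or "backward" as an inverse-kinematics controller, which is the flexibility the abstract emphasises.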

Acknowledgment

This work was partially funded by the Deutscher Akademischer Austauschdienst (DAAD) through a postdoctoral fellowship.

URL: http://www.icsi.berkeley.edu/pubs/ai/rnnapproach11.pdf
Bibliographic Notes

Proceedings of the International Joint Conference on Neural Networks (IJCNN 2011), San Jose, California

Abbreviated Authors

M. Schilling

ICSI Research Group

AI

ICSI Publication Type

Article in conference proceedings