Title: Neural models for robot motor skill learning.
Abstract:
The challenges of understanding human motor control, of brain-machine
interfaces, and of anthropomorphic robotics are currently converging. Modern
anthropomorphic robots, with their compliant actuators and diverse sensors
(e.g., depth and vision cameras, tactile fingertips, full-body skin,
proprioception), have reached the perceptuomotor complexity faced in human
motor control and learning. While outstanding robotic and prosthetic devices
exist, current brain-machine interfaces (BMIs) and robot learning methods have
not yet reached the autonomy and performance needed to enter daily life.
For truly autonomous robotic and prosthetic devices, four major challenges have
to be addressed. These challenges can be grouped under the broader field of
Neurorobotics: (1) the decomposition of complex motor skills into basic
primitives organized in complex architectures, (2) the ability to learn from
partially observable, noisy observations of inhomogeneous high-dimensional
sensor data, (3) the learning of abstract features, generalizable models, and
transferable policies from human demonstrations, sparse rewards, and through
active learning, and (4) accurate predictions of self-motions, object dynamics,
and human movements for autonomous systems that assist and cooperate with humans.
My contributions are probabilistic computational models that can be trained from
high-dimensional input streams of neural and artificial data (e.g., action
potentials, movement kinematics, joint forces, EMG signals, tactile readings).
The learned models are evaluated in human motor adaptation experiments and in
robot reaching and balancing tasks. These probabilistic models can be
co-activated and sequenced in time as movement primitives and can be modulated
by a small set of control parameters to generalize to new tasks. In neural
network implementations, forward and inverse kinematic models are learned
simultaneously and used to generate movement plans in a compliant humanoid
robot. The neural models capture the correlations of the input and can forecast
self-motions or co-workers' intentions, as demonstrated in a recent human
adaptation experiment which showed that postural control precedes and predicts
volitional motor control.