Future neuroprosthetics will be tightly coupled with the user in such a way that the resulting system can replace and restore impaired upper limb functions, because they will be controlled by the same neural signals as their natural counterparts. However, robust and natural interaction of subjects with sophisticated prostheses over long periods of time remains a major challenge. To tackle this challenge we can draw inspiration from natural motor control, where goal-directed behavior is dynamically modulated by perceptual feedback resulting from executed actions.
Current brain-machine interfaces (BMIs) partly emulate human motor control, as they decode cortical correlates of movement parameters, from movement onset to direction to instantaneous velocity, in order to generate the sequence of movements for the neuroprosthesis. A closer look, though, shows that motor control results from the combined activity of the cerebral cortex, subcortical areas and the spinal cord. This hierarchical organization supports the hypothesis that complex behaviours can be controlled using the low-dimensional output of a BMI in conjunction with intelligent devices in charge of executing the low-level commands.
A further component that will facilitate intuitive and natural control of motor neuroprostheses is the incorporation of rich multimodal feedback, together with the neural correlates of the perceptual processes this feedback elicits. As in natural motor control, these sources of information can dynamically modulate the interaction.
I will illustrate these principles and this approach with a variety of brain-controlled robots and devices that have been extensively tested by users, many of them with severe motor disabilities.