Learning with a sensori-motor, cortical brain-machine interface

20 NOVEMBER 2017 - 11 AM

Luc Estebanez, Unité de Neuroscience, Information et Complexité (UNIC, FRE CNRS 3693), Gif-sur-Yvette, France.

Summary: The development of brain-machine interfaces (BMIs) brings a new perspective to patients with a loss of autonomy. By combining online recordings of brain activity with a decoding algorithm, patients can learn to control a robotic arm to perform simple actions. However, in contrast to the vast amount of somatosensory information channeled by limbs to the brain, current BMIs are devoid of touch and force sensors, and patients in trial experiments must therefore fall back on vision and audition, which are ill-suited to the control of a prosthesis. It is challenging to estimate, by vision alone, the forces exerted by a prosthesis on a glass filled with juice, or how smooth its trajectory is.

To address this shortcoming, we developed, in the mouse model, a BMI that includes rich, artificial somatosensory-like cortical feedback. This setup records online the activity of multiple neurons in the whisker primary motor cortex (vM1), while feedback is simultaneously delivered via low-latency, high-refresh-rate, spatially structured photo-stimulation of the whisker primary somatosensory cortex (vS1), based on a mapping obtained by intrinsic imaging. We will use this setup both as a research tool and as a preclinical model for bringing sensory inputs to prostheses.

Location: Room B501.