A better understanding of neural population function would be an important advance in systems neuroscience.
Neurons encode many parameters simultaneously, but the fidelity of encoding at the level of individual neurons is weak. However, because encoding is redundant and consistent across the population, extraction methods based on multiple neurons are capable of generating a faithful representation of intended movement. The realization that useful information is embedded in the population has spawned the current success of brain-controlled interfaces. Because multiple movement parameters are encoded simultaneously in the same population of neurons, we have been gradually increasing the degrees of freedom (DOF) that a subject can control through the interface. Our early work showed that three dimensions could be controlled in a virtual reality task. We then demonstrated control of an anthropomorphic physical device with 4 DOF in a self-feeding task. Currently, monkeys in our laboratory are using this interface to control a very realistic prosthetic arm with a wrist and hand to grasp objects in different locations and orientations. Our recent data show that we can extract 10 DOF, adding hand shape and dexterity to our control set.
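The idea that weak, redundant single-neuron signals combine into a faithful population estimate can be illustrated with a population-vector-style decoder in the spirit of classic work on motor cortex. The sketch below is purely illustrative and is not the laboratory's actual extraction algorithm; the cosine tuning model, neuron count, and noise level are all assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: each neuron has a preferred movement direction
# and cosine tuning (its rate peaks when movement matches that direction).
n_neurons = 200
angles = rng.uniform(0, 2 * np.pi, n_neurons)
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # unit vectors

BASELINE = 10.0  # assumed baseline firing rate (Hz)

def firing_rates(direction, depth=8.0, noise=3.0):
    """Noisy cosine-tuned rates for a 2-D unit movement direction.

    Any single neuron is unreliable: its modulation depth is small
    relative to the trial-to-trial noise.
    """
    clean = BASELINE + depth * preferred @ direction
    return clean + rng.normal(0.0, noise, n_neurons)

def population_vector(rates):
    """Weight each preferred direction by the rate above baseline and sum.

    The redundant, consistent tuning across neurons averages out the
    noise, yielding a faithful estimate of the intended direction.
    """
    vec = (rates - BASELINE) @ preferred
    return vec / np.linalg.norm(vec)

true_dir = np.array([np.cos(0.7), np.sin(0.7)])
decoded = population_vector(firing_rates(true_dir))
print(float(decoded @ true_dir))  # cosine similarity, close to 1
```

Even though each simulated neuron's signal is buried in noise, summing a few hundred of them recovers the intended direction almost exactly, which is the core intuition behind population-based extraction.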
This technology has now been extended to a paralyzed patient who cannot move any part of her body below her neck. Based on our laboratory work, and using a high-performance “modular prosthetic limb,” she has been able to control 10 degrees of freedom simultaneously. The control of this artificial limb is intuitive, and the movements are coordinated and graceful, closely resembling natural arm and hand movement. This subject has been able to perform tasks of daily living, reaching to, grasping, and manipulating objects, as well as performing spontaneous acts such as self-feeding. Current work is progressing toward making this technology more robust and extending the control with tactile feedback to sensory cortex.