Editorial

The body–machine interface: a pathway for rehabilitation and assistance in people with movement disorders

Pages 145-147 | Published online: 09 Jan 2014

At the beginning of the millennium, a new frontier of neuroscience and engineering appeared in the landscape of scientific and technological research: breaking the barriers of paralysis, even in its most severe forms, by developing human–machine interfaces based on what we know about how the brain plans and controls movements. If we can read the neural activity of the motor cortex and correctly infer how this activity guides the movements of the arm, we can use this information to bypass broken neural circuits and move a prosthetic limb to interact with the environment. This vision has created a whole new research field that we know today as neural engineering Citation[1].

While neural engineering is a young discipline, the connection between neuroscience and engineering is much older. One of the founders of computer science, and one of the inventors of the digital computer, John von Neumann, was driven by the goal of creating an artificial brain Citation[2]. It is somewhat ironic that the so-called ‘von Neumann’ architecture of computers was later taken as a model of a ‘nonbiological’ or even ‘antibiological’ architecture, because of its separation between processing and memory. Another giant of that era, working around the time of World War II, was Norbert Wiener, who laid the foundations of control theory, or cybernetics, guided by curiosity about biological control mechanisms Citation[3].

Another branch of science and engineering that both influenced and was influenced by the interest in the brain is information theory. This fruitful interaction has followed different pathways, from the neuron model of Hodgkin and Huxley Citation[4] to the development of artificial intelligence between the 1960s and the 1980s Citation[5]. In the most recent developments, the interaction between engineering and biology has moved from understanding and imitating the brain to the concept of interfacing with the brain. Moreover, early research on movement control and decision-making also borrowed heavily from information theory. Principles such as Fitts’ law and Hick’s law describe how the brain processes information and predict the timing of actions and decisions. These principles now form the basis for bioengineering design in fields that involve the integration of humans and machines, such as human–computer interaction.
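The predictive character of these laws can be made concrete with a small numerical sketch. The intercept and slope constants below are illustrative placeholders chosen for this example, not values from the literature:

```python
import math

def fitts_movement_time(distance, width, a=0.1, b=0.15):
    """Fitts' law: movement time MT = a + b * log2(2D/W), where D is
    the distance to the target and W its width. The constants a and b
    are empirically fitted; the defaults here are illustrative only."""
    return a + b * math.log2(2 * distance / width)

def hick_reaction_time(n_choices, a=0.2, b=0.15):
    """Hick's law: reaction time RT = a + b * log2(n + 1) for n
    equally likely choices (constants again illustrative)."""
    return a + b * math.log2(n_choices + 1)

# Farther or smaller targets take longer to reach; more alternatives
# take longer to decide among. Both costs grow only logarithmically.
```

Both laws express a movement or a decision as the transmission of a fixed number of bits through a channel of limited capacity, which is precisely the framing that carries over into interface design.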

In this editorial, we focus on recent developments in the field of body–machine interfaces (BMI). The acronym BMI has traditionally been used to indicate ‘brain–machine interfaces’. Here, we use it in what we see as a broader scope, with ‘B’ standing for ‘body’ Citation[6]. The first element of a BMI system is the human body, from which signals of different types can be extracted for operating external devices. These signals may be extracted directly from body motions, using goniometers, magnetic or infrared sensors, accelerometers, cameras, force sensors and pressure switches. Alternatively, they may measure some underlying neurophysiological activity, such as muscle activity (using EMG), electroencephalographic signals or neuronal firing, as in brain–machine interfaces. On the other side of the BMI, the machine is the device or instrument to be controlled. This may be something in common use, such as an automobile or a musical instrument, or a special tool for people with movement disorders, such as a bionic limb or a powered wheelchair. Placed between the body and the machine, the interface transforms the body signals into commands to the device. In principle, the link may also operate in the reverse direction, by encoding the state of the device (or the environment) into stimuli to be delivered to the user (as in cochlear implants) Citation[7,8]. However, the inclusion of sensory interfaces in BMIs appears to be a greater challenge than the decoding of body signals into commands to the machine.

Many of the interfaces that humans interact with on a regular basis (such as computer keyboards, joysticks, video game controllers or musical instruments) could also be termed BMIs. However, one of the key features of what we would today call a BMI system is its ability to be adaptive and effectively ‘learn its user’. In this sense, many rehabilitation robotics devices may be considered adaptive BMIs. Commercial devices for robot-assisted therapy, such as the InMotion Arm (Interactive Motion Technologies, Inc., MA, USA), the HapticMaster (FCS Control Systems, Fokkerweg, The Netherlands) and others, measure body-derived signals (such as hand movements and forces) to produce assistive or resistive forces based on these signals Citation[9]. An advanced trend in the field is the development of ‘learning interfaces’. In this case, the human and machine adapt interactively to each other, although human learning and machine learning typically operate at different timescales Citation[10].
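As a minimal sketch of the machine side of such co-adaptation, the fragment below updates a linear decoder by a least-mean-squares rule whenever the user's intended command is known, for instance during calibration trials. The dimensions, learning rate and function name are our own assumptions for illustration, not taken from any cited system:

```python
import numpy as np

def lms_update(W, body_signal, intended_command, lr=0.05):
    """One least-mean-squares step: nudge the linear decoder W so that
    W @ body_signal moves toward the user's intended command. Applied
    repeatedly over calibration trials, the decoder 'learns its user'."""
    error = intended_command - W @ body_signal
    return W + lr * np.outer(error, body_signal)
```

Because the human adapts on a much slower timescale than such a rule, a practical interface would limit how fast the decoder drifts; this sketch leaves that consideration out.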

Early BMIs were a means for investigating the basic mechanisms underlying the neural control of movement. Litvintsev used EMG signals from rats as arguments to a nonlinear function, whose value was then fed back to the rat as a painful stimulus Citation[11]. He found that the rats could learn to control the activities of two muscles by employing search strategies that minimized the pain. Shortly afterwards, Fetz and Finocchio published the first experiment in which monkeys were trained by operant conditioning to control the activity of individual cortical neurons Citation[12]. Similar studies were performed in humans using goniometers and visual feedback, showing how BMIs could provide a useful tool to study the problem of redundancy, that is, how the nervous system controls a system with multiple degrees of freedom.

Most recent work on BMIs involves learning to control a cursor on a computer display or a robotic arm. Once one is able to move a cursor on a screen, the same skill can be used to control virtually any device, in the same way as we control machines by typing commands on a computer keyboard and by operating a joystick. Several approaches to cursor control have been developed that are based on various body signals. These include EMG control Citation[13], eye movements Citation[14], head motions Citation[15] and tongue-based control Citation[16]. Beyond cursor control, there have also been approaches that attempt to directly interface with assistive devices. These include multiple-degree-of-freedom robots, ankle/knee orthoses and electrically powered wheelchairs.

As mentioned earlier, a particular type of BMI that has received much attention is one in which signals from the brain – obtained either invasively through electrode recordings Citation[17] or noninvasively by electroencephalography Citation[18] – are directly interfaced to the machine, thereby completely bypassing movements of the body. Such interfaces are especially relevant to movement disorders such as severe paralysis, late-stage amyotrophic lateral sclerosis or locked-in syndrome, where there is little or no residual movement control. Aside from such extreme cases, it might be preferable to develop and use BMIs that rely on movements (rather than neural activity) for three main reasons. First, noninvasive interfaces do not carry the risk of surgical complications. Second, while the rate of information transmission of electroencephalographic signal-based brain–computer interfaces ranges from 0.05 to 0.5 bits/s, a recent study estimated that body motions might operate at about 5 bits/s Citation[19]. Third, and perhaps most importantly, disabled people, even those with severe forms of paralysis, still have a body that can move and that benefits from remaining active in many important ways.
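To see what this order-of-magnitude difference means in practice, consider the time needed to spell a short message by selecting among equally likely symbols. This is a deliberately simplified, hypothetical scenario (real spellers use unequal symbol probabilities and error correction), but the arithmetic conveys the point:

```python
import math

def selection_time(n_symbols, n_options, bits_per_second):
    """Seconds needed to communicate n_symbols selections, each drawn
    from n_options equally likely alternatives, over a channel with the
    given information transmission rate."""
    bits_per_symbol = math.log2(n_options)
    return n_symbols * bits_per_symbol / bits_per_second

# Spelling 20 letters from a 26-letter alphabet:
eeg_time = selection_time(20, 26, 0.5)   # upper end of the EEG range
body_time = selection_time(20, 26, 5.0)  # estimated body-motion rate
# The body-motion channel is ten times faster at any message length.
```

At 0.5 bits/s the message takes minutes; at 5 bits/s it takes under half a minute, a difference that matters greatly for everyday communication.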

Survivors of spinal cord injury and stroke, and people suffering from a variety of other disorders, face several related complications, some an immediate consequence of their neurological condition, others side effects of immobility or reduced mobility. The limited possibility for functional use of the upper body contributes to weakness and poor posture and, with time, causes pain and bone loss and attenuates voluntary control of residual movements. By controlling the interface with their residual body motions, people with paralysis not only become able to operate assistive devices (such as wheelchairs and computers), but also engage in sustained physical exercise while performing functional and entertaining activities. In preliminary observations, we noticed that these activities appear to have relevant collateral benefits in terms of motor control, strength, engagement and mood Citation[20]. By mapping all of the residual movement capacity onto specific operational functions, paralyzed users of assistive devices may find a natural balance between ease of device operation and exercise of under-utilized muscles. Moreover, motion-based BMIs are suited to exercising all of the available degrees of freedom in the upper body through targeted practice of control actions in virtual reality environments. There, it is possible to establish a transformation from body motions to a command space that emphasizes degrees of freedom that are more difficult to control. This would stimulate disabled users to expand their effective range of motion.
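One simple way to construct such a body-to-command transformation, sketched here under the assumption of a linear map (the function names and dimensions are our own, not those of any cited system), is to project high-dimensional sensor recordings onto their leading principal components:

```python
import numpy as np

def fit_body_to_cursor_map(calibration_data):
    """Fit a linear map from high-dimensional body signals to a 2D
    command space. Rows of calibration_data are time samples, columns
    are sensors; the two leading principal components of the recording
    define the projection."""
    mean = calibration_data.mean(axis=0)
    centered = calibration_data - mean
    # Right singular vectors of the centered data are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:2]  # (sensor mean, 2 x n_sensors projection)

def body_to_cursor(signal, mean, projection):
    """Map one body-signal sample to a 2D cursor command."""
    return projection @ (signal - mean)
```

Choosing the projection differently, for instance to weight under-used degrees of freedom more heavily, is how an interface could deliberately steer practice toward movements that are harder to control.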

The length of an editorial is certainly not sufficient to provide an exhaustive perspective on the wide and growing field of human–machine interfacing. The ambitious task of establishing new communication pathways between the brain and a variety of devices calls for the combination of many domains of knowledge, from cellular and molecular neurobiology to cognitive sciences and from tissue engineering to computer science. Most importantly, however, when disease or accident impairs our ability to interact with the world, finding alternative pathways for expressing our intentions and for receiving information is a formidable challenge that science and technology have finally begun to tackle.

Disclaimer

The contents of this article are solely the responsibility of the authors and do not necessarily represent the official view of the NIH.

Financial & competing interests disclosure

This publication was made possible by grants 1R01HD072080 and 1R01NS05358 from the NIH. The authors have no other relevant affiliations or financial involvement with any organization or entity with a financial interest in or financial conflict with the subject matter or materials discussed in the manuscript apart from those disclosed.

No writing assistance was utilized in the production of this manuscript.

References

1. Eliasmith C, Anderson CH. Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems. MIT Press, MA, USA (2004).
2. von Neumann J. The Computer and the Brain. Yale University Press, New Haven, CT, USA (1958).
3. Wiener N. Cybernetics; or Control and Communication in the Animal and the Machine. MIT Press, MA, USA (1948).
4. Hodgkin AL, Huxley AF. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. (Lond.) 117(4), 500–544 (1952).
5. Minsky M. Steps toward artificial intelligence. Proc. IRE 49(1), 8–30 (1961).
6. Casadio M, Ranganathan R, Mussa-Ivaldi FA. The body–machine interface: a new perspective on an old theme. J. Mot. Behav. 44(6), 419–433 (2012).
7. Loeb GE. Cochlear prosthetics. Annu. Rev. Neurosci. 13, 357–371 (1990).
8. Romo R, Hernández A, Zainos A, Brody CD, Lemus L. Sensing without touching: psychophysical performance based on cortical microstimulation. Neuron 26(1), 273–278 (2000).
9. Volpe BT, Krebs HI, Hogan N, Edelstein L, Diels CM, Aisen ML. Robot training enhanced motor outcome in patients with stroke maintained over 3 years. Neurology 53(8), 1874–1876 (1999).
10. Danziger Z, Fishbach A, Mussa-Ivaldi FA. Learning algorithms for human–machine interfaces. IEEE Trans. Biomed. Eng. 56(5), 1502–1511 (2009).
11. Litvintsev A. Search activity of muscles in the presence of an artificial feedback loop enclosing several muscles simultaneously. Automation Remote Control 29, 464–472 (1968).
12. Fetz EE, Finocchio DV. Operant conditioning of isolated activity in specific muscles and precentral cells. Brain Res. 40(1), 19–23 (1972).
13. Barreto AB, Scargle SD, Adjouadi M. A practical EMG-based human–computer interface for users with motor disabilities. J. Rehabil. Res. Dev. 37(1), 53–63 (2000).
14. Jacob RJK. The use of eye movements in human–computer interaction techniques: what you look at is what you get. ACM Trans. Inf. Syst. 9(2), 152–169 (1991).
15. Mandel C, Rofer T, Frese U. Applying a 3DOF orientation tracker as a human–robot interface for autonomous wheelchairs. Presented at: IEEE 10th International Conference on Rehabilitation Robotics. Noordwijk aan Zee, The Netherlands, 13–15 June (2007).
16. Salem C, Zhai S. An isometric tongue pointing device. In: ACM CHI 97 Conference on Human Factors in Computing Systems, volume 1 of Technical Notes: Input & Output in the Future. ACM Press, NY, USA, 538–539 (1997).
17. Hochberg LR, Serruya MD, Friehs GM et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature 442(7099), 164–171 (2006).
18. Wolpaw JR, McFarland DJ. Control of a two-dimensional movement signal by a noninvasive brain–computer interface in humans. Proc. Natl Acad. Sci. USA 101(51), 17849–17854 (2004).
19. Felton EA, Radwin RG, Wilson JA, Williams JC. Evaluation of a modified Fitts law brain–computer interface target acquisition task in able and motor disabled individuals. J. Neural Eng. 6(5), 056002 (2009).
20. Casadio M, Pressman A, Acosta S et al. Body machine interface: remapping motor skills after spinal cord injury. Presented at: IEEE International Conference on Rehabilitation Robotics (ICORR). Zurich, Switzerland, 29 June–1 July (2011).
