
Wheelchair control for disabled patients using EMG/EOG based human machine interface: a review

Pages 61-74 | Received 14 Aug 2020, Accepted 15 Nov 2020, Published online: 11 Dec 2020

References

  • World Health Organization. World health statistics 2016: monitoring health for the SDGs. Geneva: World Health Organization; 2016.
  • Moon I, Lee M, Mun M. A novel EMG-based human-computer interface for persons with disability. Proceedings of the IEEE International Conference on Mechatronics 2004, ICM’04; 2004 Jun 5; Istanbul, Turkey. p. 519–524.
  • Morgan PL, Caleb-Solly P, Voinescu A, et al. Literature review: human-machine interface. 2016.
  • Auti A, Amolic R, Bharne S, et al. Sign-talk: hand gesture recognition system. IJCA. 2017;160:13–16.
  • Arsić D, Hörnler B, Schuller B, et al. A hierarchical approach for visual suspicious behaviour detection in aircrafts. Proceedings of the 16th International Conference on Digital Signal Processing; 2009 Jul 5–7; Santorini-Hellas, Greece.
  • Kane SK, Avrahami D, Wobbrock JO, et al. Bonfire: a nomadic system for hybrid laptop-tabletop interaction. Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST 2009); 2009. p. 129–138.
  • Kiefer C, Collins N, Fitzpatrick G. Phalanger: controlling music software with hand movement using a computer vision and machine learning approach. I Can; 2009. p. 246–249.
  • Peer A, Schauss T, Unterhinninghofen U, et al. A mobile haptic interface for bimanual manipulations in extended remote/virtual environments. Robot Res Trends. 2008:267–287.
  • Suhr JK, Jung HG, Bae K, et al. Automatic free parking space detection by using motion stereo-based 3D reconstruction. Mach Vis Appl. 2010;21:163–176.
  • Desale RD. A study on wearable gestural interface – a SixthSense technology. IOSR-JCE. 2013;10:10–16.
  • Michel D, Argyros AA, Grammenos D, et al. Building a multi-touch display based on computer vision techniques. Proceedings of the IAPR Conference on Machine Vision Applications; 2009 May 20–22; Yokohama, Japan. p. 74–79.
  • Cannan J, Hu H. Human-machine interaction (HMI): a survey. Tech Rep CES-508. 2011.
  • Yuen H, Pineau J, Archambault P. Automatically characterizing driving activities onboard smart wheelchairs from accelerometer data. Proceedings of the International Conference on Intelligent Robots and Systems (IROS); 2015 Sep 28–Oct 2; Hamburg, Germany.
  • Ghorbel M, Pineau J, Gourdeau R, et al. A decision-theoretic approach for the collaborative control of a smart wheelchair. Int J of Soc Robotics. 2018;10:131–145.
  • Lopes J, Simão M, Mendes N, et al. Hand/arm gesture segmentation by motion using IMU and EMG sensing. Procedia Manuf. 2017;11:107–113.
  • Soh H, Demiris Y. Learning assistance by demonstration: smart mobility with shared control and paired haptic controllers. J Human-Robot Interact. 2015;4:76.
  • Huang CK, Wang ZW, Chen GW, et al. Development of a smart wheelchair with dual functions: real-time control and automated guide. Proceedings of the 2nd International Conference on Control and Robotics Engineering ICCRE 2017; 2017 Apr 1–3; Bangkok, Thailand. p. 73–76.
  • Rossiter J, Mukai T. A novel tactile sensor using a matrix of LEDs operating in both photoemitter and photodetector modes. Proceedings of the IEEE Sensors 2005; 2005 Oct 30–Nov 3; Irvine, CA. p. 994–997.
  • Champaty B, Jose J, Pal K, et al. Development of EOG based human machine interface control system for motorized wheelchair. Proceedings of the 2014 Annual International Conference on Emerging Research Areas: Magnetics, Machines and Drives (AICERA/iCMMD); 2014 Jul 24–26; Kottayam, India.
  • Matsumoto Y, Ino T, Ogasawara T. Development of intelligent wheelchair system with face and gaze based interface. Proceedings of the 10th IEEE International Workshop on Robot and Human Interactive Communication; 2001 Sep 18–21; Paris, France. p. 262–267.
  • Mazo M. An integral system for assisted mobility. IEEE Robot Autom Mag. 2001;8:46–56.
  • Muhammad Sidik MH, Che Ghani SA, Ishak M, et al. A review on electric wheelchair innovation to ease mobility and as a rehabilitation tool for spinal cord impairment patient. IJET. 2018;10:803–815.
  • Martin TB, Ambrose RO, Diftler MA, et al. Tactile gloves for autonomous grasping with the NASA/DARPA Robonaut. Proceedings of the IEEE International Conference on Robotics and Automation, 2004; 2004 Apr 26–May 1; New Orleans, LA. p. 1713–1718.
  • Farina D, Jiang N, Rehbaum H, et al. The extraction of neural information from the surface EMG for the control of upper-limb prostheses: Emerging avenues and challenges. IEEE Trans Neural Syst Rehabil Eng. 2014;22:797–809.
  • Merletti R, Farina D. Surface electromyography: physiology, engineering, and applications. Hoboken (NJ): Wiley; 2016.
  • Farina D, Negro F. Accessing the neural drive to muscle and translation to neurorehabilitation technologies. IEEE Rev Biomed Eng. 2012;5:3–14.
  • Rayo A, Hernandez Gomez L, Sánchez A, et al. Design and manufacturing of a dry electrode for EMG signals recording with microneedles. In: Improved performance of materials. 2017. p. 259–267.
  • Alkan A, Günay M. Identification of EMG signals using discriminant analysis and SVM classifier. Expert Syst Appl. 2012;39:44–47.
  • Andrews S, Mora J, Lang J, et al. Hapticast: a physically-based 3D game with haptic feedback. Proceedings of Future Play; 2006; London. p. 1–8. http://profs.etsmtl.ca/sandrews/publication/hapticast/
  • Qi L, Ferguson-Pell M, Lu Y. The effect of manual wheelchair propulsion speed on users’ shoulder muscle coordination patterns in time-frequency and principal component analysis. IEEE Trans Neural Syst Rehabil Eng. 2019;27:60–65.
  • Lu Z, Zhou P. Hands-free human-computer interface based on facial myoelectric pattern recognition. Front Neurol. 2019;10:1–10.
  • Dai C, Cao Y, Hu X. Prediction of individual finger forces based on decoded motoneuron activities. Ann Biomed Eng. 2019;47:1357–1368.
  • Simao M, Mendes N, Gibaru O, et al. A review on electromyography decoding and pattern recognition for human-machine interaction. IEEE Access. 2019;7:39564–39582.
  • Liu J, Zhou P. A novel myoelectric pattern recognition strategy for hand function restoration after incomplete cervical spinal cord injury. IEEE Trans Neural Syst Rehabil Eng. 2013;21:96–103.
  • Yulianto E, Indrato TB. The design of electrical wheelchairs with electromyography signal controller for people with paralysis. Electr Electron Eng. 2018;8:1–9.
  • Xia W, Zhou Y, Yang X, et al. Toward portable hybrid surface electromyography/a-mode ultrasound sensing for human-machine interface. IEEE Sensors J. 2019;19:5219–5228.
  • Moon I, Lee M, Chu J, et al. Wearable EMG-based HCI for electric-powered wheelchair users with motor disabilities. Proceedings of the 2005 IEEE International Conference on Robotics and Automation; 2005 Apr 18–22; Barcelona, Spain. p. 2649–2654.
  • Kaur A, Agarwal R, Kumar A. Comparison of muscles activity of abled bodied and amputee subjects for around shoulder movement. Biomed Mater Eng. 2016;27:29–37.
  • Ramkumar S, Sathesh Kumar K, Dhiliphan Rajkumar T, et al. A review-classification of electrooculogram based human computer interfaces. Biomed Res. 2018;29:1078–1084.
  • Wu JF, Ang AMS, Tsui KM, et al. Efficient implementation and design of a new single-channel electrooculography-based human-machine interface system. IEEE Trans. Circuits Syst. II. 2015;62:179–183.
  • Liu J, Zhang D, Sheng X, et al. Enhanced robustness of myoelectric pattern recognition to across-day variation through invariant feature extraction. Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); 2015 Aug 25–29; Milan, Italy. p. 7262–7265.
  • Reza T, Ferdous SM, Hasan N, et al. A low cost surface electromyogram (sEMG) signal guided automated wheel chair for the disabled. Int J Sci Eng Res. 2012;3:1–6.
  • Mahendran R. EMG signal based control of an intelligent wheelchair. Proceedings of the 2014 International Conference on Communication and Signal Processing; 2014 Apr 3–5; Melmaruvathur, India. p. 1267–1272.
  • Xu X, Zhang Y, Luo Y, et al. Robust bio-signal based control of an intelligent wheelchair. Robotics. 2013;2:187–197.
  • Puchinger M, Babu N, Kurup R, et al. A preliminary muscle activity analysis: handle based and push-rim wheelchair propulsion. J Biomech. 2019;89:119–122.
  • Trigili E, Grazi L, Crea S, et al. Detection of movement onset using EMG signals for upper-limb exoskeletons in reaching tasks. J Neuroeng Rehabil. 2019;16:45.
  • Huang G, Ceccarelli M, Huang Q, et al. Design and feasibility study of a leg-exoskeleton assistive wheelchair robot with tests on gluteus. Sensors. 2019;19:548.
  • Odle B, Reinbolt J, Forrest G, et al. Construction and evaluation of a model for wheelchair propulsion in an individual with tetraplegia. Med Biol Eng Comput. 2019;57:519–532.
  • Human-computer interfaces using surface electromyography sensors. 2018:1–31.
  • Kucukyildiz G, Ocak H, Karakaya S, et al. Design and implementation of a multi sensor based brain computer interface for a robotic wheelchair. J Intell Robot Syst. 2017;87:247–263.
  • Kaiser MS, Chowdhury ZI, Mamun SA, et al. A neuro-fuzzy control system based on feature extraction of surface electromyogram signal for solar-powered wheelchair. Cogn Comput. 2016;8:946–954.
  • Heide W, Koenig E, Trillenberg P, et al. Electrooculography: technical standards and applications. The International Federation of Clinical Neurophysiology. Electroencephalogr Clin Neurophysiol Suppl. 1999;52:223–240.
  • Bulling A, Roggen D, Tröster G. Wearable EOG goggles: seamless sensing and context-awareness in everyday environments. J Ambient Intell Smart Environ. 2009;1:157–171.
  • Lamti HA, Ben Khelifa M, Alimi AM, et al. A Brain Eyes WHEELchair Interface for severely disabled people assistance. Assistive Technology Research Series. 2011;29:686–694.
  • Rajesh A, Mantur M. Eyeball gesture controlled automatic wheelchair using deep learning. Proceedings of the 5th IEEE Region 10 Humanitarian Technology Conference (R10-HTC); 2017 Dec 21–23; Dhaka, Bangladesh. p. 387–391.
  • Manabe H, Fukumoto M, Yagi T. Direct gaze estimation based on nonlinearity of EOG. IEEE Trans Biomed Eng. 2015;62:1553–1562.
  • Richard MG, Tello ALCB. Development of a human machine interface for control of robotic wheelchair and smart environment. 11th IFAC Symp Robot Control. 2015.
  • Singh H, Singh J. Human eye tracking and related issues: a review. Int J Sci Res Publ. 2012;2:2250–3153.
  • Chang WD, Cha HS, Im CH. Removing the interdependency between horizontal and vertical eye-movement components in electrooculograms. Sensors (Basel). 2016;16:227.
  • Chang WD. Electrooculograms for human-computer interaction: a review. Sensors (Basel). 2019;19:2690.
  • Young LR, Sheena D. Eye-movement measurement techniques. Am Psychol. 1975;30:315–330.
  • Deng LY, Hsu CL, Lin TC, et al. EOG-based human-computer interface system development. Expert Syst Appl. 2010;37:3337–3343.
  • Paul GM, Cao F, Torah R, et al. A smart textile based facial EMG and EOG computer interface. IEEE Sens J. 2014;14:393–400.
  • Kanoh S, Ichi-Nohe S, Shioya S, et al. Development of an eyewear to measure eye and body movements. Proceedings of the 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS 2015; 2015 Aug 25–29; Milan, Italy. p. 2267–2270.
  • Khandelwal A, Kumar Singh G, Rahul K, et al. Eye movement based electric wheel chair. IJCESR. 2016;3:61–67.
  • Bulling A. Eye movement analysis for context inference and cognitive-awareness: wearable sensing and activity recognition using electrooculography. ETH Zurich; 2010.
  • Barea R, Boquete L, Mazo M, et al. Wheelchair guidance strategies using EOG. J Intell Robot Syst Theory Appl. 2002;34:279–299.
  • Lin M, Li B. A wireless EOG-based human computer interface. Proceedings of the 3rd International Conference on Biomedical Engineering and Informatics, BMEI 2010; 2010 Oct 16–18; Yantai, China. p. 1794–1796.
  • Banerjee A, Datta S, Pal M, et al. Classifying electrooculogram to detect directional eye movements. Procedia Technol. 2013;10:67–75.
  • Venkataramanan S, Prabhat P, Choudhury SR, et al. Biomedical instrumentation based on electrooculogram (EOG) signal processing and application to a hospital alarm system. Proceedings of the International Conference on Intelligent Sensing and Information Processing (ICISIP'05); 2005. p. 535–540.
  • Usakli AB, Gurkan S. Design of a novel efficient human-computer interface: an electrooculogram based virtual keyboard. IEEE Trans Instrum Meas. 2010;59:2099–2108.
  • Kirbiš M, Kramberger I. Mobile device for electronic eye gesture recognition. IEEE Trans Consumer Electron. 2009;55:2127–2133.
  • Hansen DW, Ji Q. In the eye of the beholder: a survey of models for eyes and gaze. IEEE Trans Pattern Anal Mach Intell. 2010;32:478–500.
  • Chang WD, Cha HS, Kim DY, et al. Development of an electrooculogram-based eye-computer interface for communication of individuals with amyotrophic lateral sclerosis. J Neuroeng Rehabil. 2017;14:7–9.
  • Huang Q, He S, Wang Q, et al. An EOG-based human-machine interface for wheelchair control. IEEE Trans Biomed Eng. 2018;65:2023–2032.
  • Fang F, Shinozaki T. Electrooculography-based continuous eye-writing recognition system for efficient assistive communication systems. PLoS One. 2018;13:e0192684.
  • Barea R, Boquete L, Mazo M, et al. System for assisted mobility using eye movements based on electrooculography. IEEE Trans Neural Syst Rehabil Eng. 2002;10:209–218.
  • A feasibility study of an eye writing system based on electro-oculography. 2015.
  • Aungsakul S, Phinyomark A, Phukpattaranont P, et al. Evaluating feature extraction methods of electrooculography (EOG) signal for human-computer interface. Procedia Eng. 2012;32:246–252.
  • Champaty B, Jose J, Pal K, et al. Development of EOG based human machine interface control system for motorized wheelchair. Proceedings of the Annual International Conference on Emerging Research Areas: Magnetics, Machines and Drives (AICERA/iCMMD); 2014 Jul 24–26; Kottayam, India.
  • Huang Q, Chen Y, Zhang Z, et al. An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries. J Neural Eng. 2019;16:026021.
  • Ma J, Zhang Y, Cichocki A, et al. A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: application to robot control. IEEE Trans Biomed Eng. 2015;62:876–889.
  • Aziz FB, Arof H, Mokhtar N, et al. HMM based automated wheelchair navigation using EOG traces in EEG. J Neural Eng. 2014;11(5):056018.
  • Ramli R, Arof H, Ibrahim F, et al. Using finite state machine and a hybrid of EEG signal and EOG artifacts for an asynchronous wheelchair navigation. Expert Syst Appl. 2015;42:2451–2463.
  • Soltani S, Mahnam A. A practical efficient human computer interface based on saccadic eye movements for people with disabilities. Comput Biol Med. 2016;70:163–173.
  • Choudhari AM, Porwal P, Jonnalagedda V, et al. An electrooculography based human machine interface for wheelchair control. Biocybern Biomed Eng. 2019;39:673–685.
  • Latha K. Efficient Classification of EOG using CBFS Feature Selection Algorithm. Proceedings of the International Conference on Emerging Research in Computing, Information, Communication and Applications, ERCICA; 2013.
  • Aungsakun S, Phinyomark A, Phukpattaranont P, et al. Robust eye movement recognition using EOG signal for human-computer interface. Proceedings of the 2nd International Conference on Software Engineering and Computer Systems (ICSECS 2011). 2011. p. 714–723.
  • Bulling A, Ward JA, Gellersen H, et al. Eye movement analysis for activity recognition using electrooculography. IEEE Trans Pattern Anal Mach Intell. 2011;33:741–753.
  • Perdiz J, Pires G, Nunes UJ. Emotional state detection based on EMG and EOG biosignals: a short survey. Proceedings of the 5th Portuguese Meeting on Bioengineering (ENBENG); 2017 Feb 16–18; Coimbra, Portugal.
