Research Article

Trigger motion and interface optimization of an eye-controlled human-computer interaction system based on voluntary eye blinks

Received 16 Aug 2022, Accepted 18 Mar 2023, Published online: 24 Apr 2023

