Multimodal Intelligent Eye-Gaze Tracking System
Original Articles · 1,137 views · 42 CrossRef citations to date · 0 Altmetric

Read on this site (7)

Xin Liu, Yao Zhang, Xianta Jiang & Bin Zheng. (2023) Human Eyes Move to the Target Earlier When Performing an Aiming Task with Increasing Difficulties. International Journal of Human–Computer Interaction 39:6, pages 1341-1346.
Kelsey Mandak, Janice Light & Savanna Brittlebank-Douglas. (2022) Exploration of multimodal alternative access for individuals with severe motor impairments: Proof of concept. Assistive Technology 34:6, pages 674-683.
Andreia Rodrigues, Marcelo Bender Machado, Ana Margarida Pisco Almeida, J. F. D. Abreu & Tatiana Aires Tavares. (2022) Interaction Devices as Assistive Technology: Current Practices about Evaluation Methodologies. International Journal of Human–Computer Interaction 38:3, pages 201-212.
Yi Liu, Bu-Sung Lee & Martin J. McKeown. (2016) Robust Eye-Based Dwell-Free Typing. International Journal of Human–Computer Interaction 32:9, pages 682-694.
Kiavash Bahreini, Rob Nadolski & Wim Westera. (2016) Data Fusion for Real-time Multimodal Emotion Recognition through Webcams and Microphones in E-Learning. International Journal of Human–Computer Interaction 32:5, pages 415-430.
Pradipta Biswas, Varun Dutt & Pat Langdon. (2016) Comparing Ocular Parameters for Cognitive Load Measurement in Eye-Gaze-Controlled Interfaces for Automotive and Desktop Computing Environments. International Journal of Human–Computer Interaction 32:1, pages 23-38.

Articles from other publishers (35)

Thomas Kosch, Jakob Karolus, Johannes Zagermann, Harald Reiterer, Albrecht Schmidt & Paweł W. Woźniak. (2023) A Survey on Measuring Cognitive Workload in Human-Computer Interaction. ACM Computing Surveys 55:13s, pages 1-39.
Suprakas Saren, Abhishek Mukhopadhyay, Debasish Ghose & Pradipta Biswas. (2023) Comparing alternative modalities in the context of multimodal human–robot interaction. Journal on Multimodal User Interfaces.
Oleg Spakov, Hanna Venesvirta, Jani Lylykangas, Ahmed Farooq, Roope Raisamo & Veikko Surakka. (2023) Design, User Experience, and Usability, pages 333-352.
Parampalli Archana Hebbar, Abhay Anant Pashilkar & Pradipta Biswas. (2022) Using Eye Tracking System for Aircraft Design – A Flight Simulator Study. Aviation 26:1, pages 11-21.
Gowdham Prabhakar, Priyam Rajkhowa, Dharmesh Harsha & Pradipta Biswas. (2021) A wearable virtual touch system for IVIS in cars. Journal on Multimodal User Interfaces 16:1, pages 87-106.
Md Samiul Haque Sunny, Md Ishrak Islam Zarif, Ivan Rulik, Javier Sanjuan, Mohammad Habibur Rahman, Sheikh Iqbal Ahamed, Inga Wang, Katie Schultz & Brahim Brahmi. (2021) Eye-gaze control of a wheelchair mounted 6DOF assistive robot for activities of daily living. Journal of NeuroEngineering and Rehabilitation 18:1.
Mohsen Parisay, Charalambos Poullis & Marta Kersten-Oertel. (2021) EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. International Journal of Human-Computer Studies 154, pages 102676.
P. Archana Hebbar, Kausik Bhattacharya, Gowdham Prabhakar, Abhay A. Pashilkar & Pradipta Biswas. (2021) Correlation Between Physiological and Performance-Based Metrics to Estimate Pilots' Cognitive Workload. Frontiers in Psychology 12.
P. Ramakrishnan, B. Balasingam & F. Biondi. (2021) Learning Control, pages 35-58.
Rúbia E. O. Schultz Ascari, Roberto Pereira & Luciano Silva. (2020) Computer Vision-based Methodology to Improve Interaction for People with Motor and Speech Impairment. ACM Transactions on Accessible Computing 13:4, pages 1-33.
Mohammed Abdul Hameed Jassim Al-kufi, Ola N. Kadhim & Eman Saleem Razaq. (2020) Simulate a first-order Bézier curve in image encoding. Journal of Physics: Conference Series 1530:1, pages 012080.
Gowdham Prabhakar, Aparna Ramakrishnan, Modiksha Madan, L. R. D. Murthy, Vinay Krishna Sharma, Sachin Deshmukh & Pradipta Biswas. (2019) Interactive gaze and finger controlled HUD for cars. Journal on Multimodal User Interfaces 14:1, pages 101-121.
A. Vijayalakshmi & P. Mohanaiah. (2020) Advances in Communication, Signal Processing, VLSI, and Embedded Systems, pages 345-360.
Paulo André da Rocha Perris & Fernando da Fonseca de Souza. (2020) Universal Access in Human-Computer Interaction. Design Approaches and Supporting Technologies, pages 33-47.
D.V. Jeevithashree, Kamalpreet Singh Saluja & Pradipta Biswas. (2019) A case study of developing gaze controlled interface for users with severe speech and motor impairment. Technology and Disability 31:1-2, pages 63-76.
Xiaoyi Shen, Markku Similä, Wolfgang Dierking, Xi Zhang, Changqing Ke, Meijie Liu & Manman Wang. (2019) A New Retracking Algorithm for Retrieving Sea Ice Freeboard from CryoSat-2 Radar Altimeter Data during Winter–Spring Transition. Remote Sensing 11:10, pages 1194.
Ayush Agarwal, DV JeevithaShree, Kamalpreet Singh Saluja, Atul Sahay, Pullikonda Mounika, Anshuman Sahu, Rahul Bhaumik, Vinodh Kumar Rajendran & Pradipta Biswas. (2019) Research into Design for a Connected World, pages 641-652.
Bin Li, Hong Fu, Desheng Wen & WaiLun LO. (2018) Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm. Sensors 18:5, pages 1626.
Pradipta Biswas & Jeevithashree DV. (2018) Eye Gaze Controlled MFD for Military Aviation.
Enrique Cáceres, Miguel Carrasco & Sebastián Ríos. (2018) Evaluation of an eye-pointer interaction device for human-computer interaction. Heliyon 4:3, pages e00574.
Pradipta Biswas & Pat Langdon. (2018) The Wiley Handbook of Human Computer Interaction, pages 421-443.
Gowdham Prabhakar & Pradipta Biswas. (2018) Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technologies and Interaction 2:1, pages 1.
Bashar I. Ahmad, Patrick M. Langdon, Lee Skrypchuk & Simon J. Godsill. (2018) Advances in Human Aspects of Transportation, pages 423-434.
Jiao Xu & Qijie Zhao. (2017) Adaptive calibration method based on state space model for eye gaze HCI system.
Gowdham Prabhakar & Pradipta Biswas. (2017) Evaluation of laser pointer as a pointing device in automotive.
P Archana Hebbar & Abhay A Pashilkar. (2017) Pilot performance evaluation of simulated flight approach and landing manoeuvres using quantitative assessment tools. Sādhanā 42:3, pages 405-415.
Gowdham Prabhakar, J Rajesh & Pradipta Biswas. (2016) Comparison of three hand movement tracking sensors as cursor controllers.
Pradipta Biswas. (2016) Multimodal Gaze Controlled Dashboard.
Pradipta Biswas. (2016) Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 89-91.
Pradipta Biswas. (2016) Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 59-76.
Pradipta Biswas. (2016) Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 41-57.
Pradipta Biswas. (2016) Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 5-28.
Andréia Sias Rodrigues, Vinicius da Costa, Márcio Bender Machado, Angélica Lacerda Rocha, Joana Marini de Oliveira, Marcelo Bender Machado, Rafael Cunha Cardoso, Cleber Quadros & Tatiana Aires Tavares. (2016) Universal Access in Human-Computer Interaction. Interaction Techniques and Environments, pages 81-91.
Jianzhong Wang, Guangyue Zhang & Jiadong Shi. (2015) Pupil and Glint Detection Using Wearable Camera Sensor and Near-Infrared LED Array. Sensors 15:12, pages 30126-30141.
Seungyup Lee, Juwan Yoo & Gunhee Han. (2015) Gaze-Assisted User Intention Prediction for Initial Delay Reduction in Web Video Access. Sensors 15:6, pages 14679-14700.
