Thomas Kosch, Jakob Karolus, Johannes Zagermann, Harald Reiterer, Albrecht Schmidt & Paweł W. Woźniak. (2023) A Survey on Measuring Cognitive Workload in Human-Computer Interaction. ACM Computing Surveys 55:13s, pages 1-39.
Suprakas Saren, Abhishek Mukhopadhyay, Debasish Ghose & Pradipta Biswas. (2023) Comparing alternative modalities in the context of multimodal human–robot interaction. Journal on Multimodal User Interfaces.
Oleg Spakov, Hanna Venesvirta, Jani Lylykangas, Ahmed Farooq, Roope Raisamo & Veikko Surakka. 2023. Design, User Experience, and Usability, pages 333-352.
Parampalli Archana Hebbar, Abhay Anant Pashilkar & Pradipta Biswas. (2022) Using eye tracking system for aircraft design – a flight simulator study. Aviation 26:1, pages 11-21.
Gowdham Prabhakar, Priyam Rajkhowa, Dharmesh Harsha & Pradipta Biswas. (2021) A wearable virtual touch system for IVIS in cars. Journal on Multimodal User Interfaces 16:1, pages 87-106.
Md Samiul Haque Sunny, Md Ishrak Islam Zarif, Ivan Rulik, Javier Sanjuan, Mohammad Habibur Rahman, Sheikh Iqbal Ahamed, Inga Wang, Katie Schultz & Brahim Brahmi. (2021) Eye-gaze control of a wheelchair mounted 6DOF assistive robot for activities of daily living. Journal of NeuroEngineering and Rehabilitation 18:1.
Mohsen Parisay, Charalambos Poullis & Marta Kersten-Oertel. (2021) EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques. International Journal of Human-Computer Studies 154, pages 102676.
P. Archana Hebbar, Kausik Bhattacharya, Gowdham Prabhakar, Abhay A. Pashilkar & Pradipta Biswas. (2021) Correlation Between Physiological and Performance-Based Metrics to Estimate Pilots' Cognitive Workload. Frontiers in Psychology 12.
P. Ramakrishnan, B. Balasingam & F. Biondi. 2021. Learning Control, pages 35-58.
Rúbia E. O. Schultz Ascari, Roberto Pereira & Luciano Silva. (2020) Computer Vision-based Methodology to Improve Interaction for People with Motor and Speech Impairment. ACM Transactions on Accessible Computing 13:4, pages 1-33.
Mohammed Abdul Hameed Jassim Al-kufi, Ola N. Kadhim & Eman Saleem Razaq. (2020) Simulate a first-order Bézier curve in image encoding. Journal of Physics: Conference Series 1530:1, pages 012080.
Gowdham Prabhakar, Aparna Ramakrishnan, Modiksha Madan, L. R. D. Murthy, Vinay Krishna Sharma, Sachin Deshmukh & Pradipta Biswas. (2019) Interactive gaze and finger controlled HUD for cars. Journal on Multimodal User Interfaces 14:1, pages 101-121.
A. Vijayalakshmi & P. Mohanaiah. 2020. Advances in Communication, Signal Processing, VLSI, and Embedded Systems, pages 345-360.
Paulo André da Rocha Perris & Fernando da Fonseca de Souza. 2020. Universal Access in Human-Computer Interaction. Design Approaches and Supporting Technologies, pages 33-47.
D.V. Jeevithashree, Kamalpreet Singh Saluja & Pradipta Biswas. (2019) A case study of developing gaze controlled interface for users with severe speech and motor impairment. Technology and Disability 31:1-2, pages 63-76.
Xiaoyi Shen, Markku Similä, Wolfgang Dierking, Xi Zhang, Changqing Ke, Meijie Liu & Manman Wang. (2019) A New Retracking Algorithm for Retrieving Sea Ice Freeboard from CryoSat-2 Radar Altimeter Data during Winter–Spring Transition. Remote Sensing 11:10, pages 1194.
Ayush Agarwal, DV JeevithaShree, Kamalpreet Singh Saluja, Atul Sahay, Pullikonda Mounika, Anshuman Sahu, Rahul Bhaumik, Vinodh Kumar Rajendran & Pradipta Biswas. 2019. Research into Design for a Connected World, pages 641-652.
Bin Li, Hong Fu, Desheng Wen & WaiLun LO. (2018) Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm. Sensors 18:5, pages 1626.
Pradipta Biswas & Jeevithashree DV. (2018) Eye Gaze Controlled MFD for Military Aviation.
Enrique Cáceres, Miguel Carrasco & Sebastián Ríos. (2018) Evaluation of an eye-pointer interaction device for human-computer interaction. Heliyon 4:3, pages e00574.
Pradipta Biswas & Pat Langdon. 2018. The Wiley Handbook of Human Computer Interaction, pages 421-443.
Gowdham Prabhakar & Pradipta Biswas. (2018) Eye Gaze Controlled Projected Display in Automotive and Military Aviation Environments. Multimodal Technologies and Interaction 2:1, pages 1.
Bashar I. Ahmad, Patrick M. Langdon, Lee Skrypchuk & Simon J. Godsill. 2018. Advances in Human Aspects of Transportation, pages 423-434.
Jiao Xu & Qijie Zhao. (2017) Adaptive calibration method based on state space model for eye gaze HCI system.
Gowdham Prabhakar & Pradipta Biswas. (2017) Evaluation of laser pointer as a pointing device in automotive.
P Archana Hebbar & Abhay A Pashilkar. (2017) Pilot performance evaluation of simulated flight approach and landing manoeuvres using quantitative assessment tools. Sādhanā 42:3, pages 405-415.
Gowdham Prabhakar, J Rajesh & Pradipta Biswas. (2016) Comparison of three hand movement tracking sensors as cursor controllers.
Pradipta Biswas. (2016) Multimodal Gaze Controlled Dashboard.
Pradipta Biswas. 2016. Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 89-91.
Pradipta Biswas. 2016. Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 59-76.
Pradipta Biswas. 2016. Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 41-57.
Pradipta Biswas. 2016. Exploring the Use of Eye Gaze Controlled Interfaces in Automotive Environments, pages 5-28.
Andréia Sias Rodrigues, Vinicius da Costa, Márcio Bender Machado, Angélica Lacerda Rocha, Joana Marini de Oliveira, Marcelo Bender Machado, Rafael Cunha Cardoso, Cleber Quadros & Tatiana Aires Tavares. 2016. Universal Access in Human-Computer Interaction. Interaction Techniques and Environments, pages 81-91.
Jianzhong Wang, Guangyue Zhang & Jiadong Shi. (2015) Pupil and Glint Detection Using Wearable Camera Sensor and Near-Infrared LED Array. Sensors 15:12, pages 30126-30141.
Seungyup Lee, Juwan Yoo & Gunhee Han. (2015) Gaze-Assisted User Intention Prediction for Initial Delay Reduction in Web Video Access. Sensors 15:6, pages 14679-14700.