References
- Breuer R, Kimmel R. 2017. A deep learning perspective on the origin of facial expressions [thesis]. Haifa (Israel): Department of Computer Science, Technion – Israel Institute of Technology. doi:10.48550/arXiv.1705.01842.
- Chan M, Singhal A. 2013. The emotional side of cognitive distraction: implications for road safety. Accid Anal Prev. 50:147–154. doi:10.1016/j.aap.2012.04.004.
- Chanel G, Kierkels JJM, Soleymani M, Pun T. 2009. Short-term emotion assessment in a recall paradigm. Int J Hum Comput Stud. 67(8):607–627. doi:10.1016/j.ijhcs.
- D'Mello SK, Kory J. 2015. A review and meta-analysis of multimodal affect detection systems. ACM Comput Surv. 47(3):1–36. doi:10.1145/2682899.
- Dempster AP. 1967. Upper and lower probabilities induced by a multivalued mapping. Ann Math Statist. 38(2):325–339. doi:10.1214/aoms/1177698950.
- Ferdinando H, Alasaarela E. 2018a. Emotion recognition using cvxEDA-based features. J Telecommun Electron Comput Eng. 10(2–3):19–23.
- Ferdinando H, Alasaarela E. 2018b. Enhancement of emotion recognition using feature fusion and the neighborhood components analysis. Proceedings of the 7th International Conference on Pattern Recognition Applications and Methods (ICPRAM 2018), Funchal, Madeira, Portugal. 1:463–469. doi:10.5220/0006642904630469.
- Hieida C, Yamamoto T, Kubo T, Yoshimoto J, Ikeda K. 2023. Negative emotion recognition using multimodal physiological signals for advanced driver assistance systems. Artif Life Robotics. 28(2):388–393. doi:10.1007/s10015-023-00858-y.
- Huang Y, Yang J, Liu S, Pan J. 2019. Combining facial expressions and electroencephalography to enhance emotion recognition. Future Internet. 11(5):105. doi:10.3390/fi11050105.
- Jeon M, Walker BN, Yim J-B. 2014. Effects of specific emotions on subjective judgment, driving performance, and perceived workload. Transp Res Part F Traffic Psychol Behav. 24:197–209. doi:10.1016/j.trf.2014.04.003.
- Katsigiannis S, Ramzan N. 2018. DREAMER: a database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. IEEE J Biomed Health Inform. 22(1):98–107. doi:10.1109/jbhi.2017.2688239.
- Mohammadi Z, Frounchi J, Amiri M. 2017. Wavelet-based emotion recognition system using EEG signal. Neural Comput & Applic. 28(8):1985–1990. doi:10.1007/s00521-015-2149-8.
- Monkaresi H, Bosch N, Calvo RA, D'Mello SK. 2017. Automated detection of engagement using video-based estimation of facial expressions and heart rate. IEEE Trans Affect Comput. 8(1):15–28. doi:10.1109/TAFFC.2016.2515084.
- Mou L, Zhao Y, Zhou C, Nakisa B, Rastgoo MN, Ma L, Huang T, Yin B, Jain R, Gao W. 2023. Driver emotion recognition with a hybrid attentional multimodal fusion framework. IEEE Trans Affect Comput. 14(4):2970–2981. doi:10.1109/TAFFC.2023.3250460.
- Osuna E, Rodríguez L-F, Gutierrez-Garcia JO, Castro LA. 2020. Development of computational models of emotions: a software engineering perspective. Cognit Syst Res. 60(C):1–19. doi:10.1016/j.cogsys.2019.11.001.
- Poria S, Cambria E, Bajpai R, Hussain A. 2017. A review of affective computing: from unimodal analysis to multimodal fusion. Information Fusion. 37(C):98–125. doi:10.1016/j.inffus.2017.02.003.
- Sarkar P, Ross K, Ruberto AJ, Rodenbura D, Hungler P, Etemad A. 2019. Classification of cognitive load and expertise for adaptive simulation using deep multitask learning. Proceedings of the International Conference on Affective Computing and Intelligent Interaction (ACII 2019); p. 1–7. doi:10.1109/ACII.2019.8925507.
- Soleymani M, Pantic M, Pun T. 2012. Multimodal emotion recognition in response to videos. IEEE Trans Affect Comput. 3(2):211–223. doi:10.1109/T-AFFC.2011.37.
- Verma B, Choudhary A. 2018. A framework for driver emotion recognition using deep learning and grassmann manifolds. Proceedings of the IEEE International Conference on Intelligent Transportation Systems (ITSC 2018); p. 1421–1426. doi:10.1109/ITSC.2018.8569461.
- Verma GK, Tiwary US. 2014. Multimodal fusion framework: a multiresolution approach for emotion classification and recognition from physiological signals. Neuroimage. 102(Pt 1):162–172. doi:10.1016/j.neuroimage.
- Wang X, Liu Y, Wang F, Wang J, Liu L, Wang J. 2019. Feature extraction and dynamic identification of drivers' emotions. Transp Res Part F Traffic Psychol Behav. 62:175–191. doi:10.1016/j.trf.2019.01.002.
- Yin Z, Liu L, Chen J, Zhao B, Wang Y. 2020. Locally robust EEG feature selection for individual-independent emotion recognition. Expert Syst Appl. 162:113768. doi:10.1016/j.eswa.2020.113768.
- Yu D, Sun S. 2020. A systematic exploration of deep neural networks for EDA-based emotion recognition. Information. 11(4):212. doi:10.3390/info11040212.
- Zhang J, Yin Z, Chen P, Nichele S. 2020. Emotion recognition using multi-modal data and machine learning techniques: a tutorial and review. Information Fusion. 59:103–126. doi:10.1016/j.inffus.2020.01.011.
- Zhu J, Ji L, Liu C. 2019. Heart rate variability monitoring for emotion and disorders of emotion. Physiol Meas. 40(6):064004. doi:10.1088/1361-6579/ab1887.