Research Article

Decoding Group Emotional Dynamics in a Web-Based Collaborative Environment: A Novel Framework Utilizing Multi-Person Facial Expression Recognition

Received 21 Nov 2023, Accepted 29 Mar 2024, Published online: 17 Apr 2024

References

  • Andiani, F. M., & Soewito, B. (2021). Face recognition for work attendance using multitask convolutional neural network (MTCNN) and pre-trained facenet. ICIC Express Letters, 15(1), 57–65. https://doi.org/10.24507/icicel.15.01.57
  • Andrean, M. N., Shidik, G. F., Naufal, M., Al Zami, F., Winarno, S., Al Azies, H., & Putra, P. L. W. E. (2024). Comparing Haar cascade and YOLOFACE for region of interest classification in drowsiness detection. Jurnal Media Informatika Budidarma, 8(1), 272–281.
  • Bakariya, B., Singh, A., Singh, H., Raju, P., Rajpoot, R., & Mohbey, K. K. (2023). Facial emotion recognition and music recommendation system using CNN-based deep learning techniques. Evolving Systems, 15(2), 641–658. https://doi.org/10.1007/s12530-023-09506-z
  • Barrett, L. F., Adolphs, R., Marsella, S., Martinez, A. M., & Pollak, S. D. (2019). Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychological Science in the Public Interest, 20(1), 1–68. https://doi.org/10.1177/1529100619832930
  • Barsoum, E., Zhang, C., Ferrer, C. C., & Zhang, Z. (2016, October). Training deep networks for facial expression recognition with crowd-sourced label distribution. In Proceedings of the 18th ACM International Conference on Multimodal Interaction (pp. 279–283). https://doi.org/10.1145/2993148.2993165
  • Bartkiene, E., Steibliene, V., Adomaitiene, V., Juodeikiene, G., Cernauskas, D., Lele, V., Klupsaite, D., Zadeike, D., Jarutiene, L., & Guiné, R. P. F. (2019). Factors affecting consumer food preferences: Food taste and depression-based evoked emotional expressions with the use of face reading technology. BioMed Research International, 2019, 2097415. https://doi.org/10.1155/2019/2097415
  • Belli, S. (2018). Managing negative emotions in online collaborative learning: A multimodal approach to solving technical difficulties. Digithum, (22), 35–46. https://doi.org/10.7238/d.v0i22.3140
  • Chamikara, M. A. P., Bertok, P., Khalil, I., Liu, D., & Camtepe, S. (2020). Privacy preserving face recognition utilizing differential privacy. Computers & Security, 97, 101951. https://doi.org/10.1016/j.cose.2020.101951
  • Chang, W. J., Schmelzer, M., Kopp, F., Hsu, C. H., Su, J. P., Chen, L. B., & Chen, M. C. (2019, February). A deep learning facial expression recognition based scoring system for restaurants. In 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC) (pp. 251–254). IEEE. https://doi.org/10.1109/ICAIIC.2019.8668998
  • Chang, W., Jianhua, H., & Xiaoning, S. (2021). Research on the influencing factors of users’ emotion in collaborative information seeking: Based on self-efficacy and task complexity. Documentation, Information & Knowledge, (1), 76–84. https://doi.org/10.13366/j.dik.2021.01.076
  • Chowdary, M. K., Nguyen, T. N., & Hemanth, D. J. (2023). Deep learning-based facial emotion recognition for human–computer interaction applications. Neural Computing and Applications, 35(32), 23311–23328. https://doi.org/10.1007/s00521-021-06012-8
  • Deng, J., Guo, J., Ververas, E., Kotsia, I., & Zafeiriou, S. (2020). Retinaface: Single-shot multi-level face localisation in the wild. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 5203–5212). https://doi.org/10.1109/cvpr42600.2020.00525
  • Dhavalikar, A. S., & Kulkarni, R. K. (2014, February). Face detection and facial expression recognition system. In 2014 International Conference on Electronics and Communication Systems (ICECS) (pp. 1–7). IEEE. https://doi.org/10.1109/ECS.2014.6892834
  • Dipeolu, A., Hargrave, S., Leierer, S. J., Tineo, Y. A. C., Longoria, A., & Escalante, M. (2022). Dysfunctional career thoughts and the sophomore slump among students with learning disabilities. Journal of Career Development, 49(4), 862–874. https://doi.org/10.1177/08948453211000130
  • Ertay, E., Huang, H., Sarsenbayeva, Z., & Dingler, T. (2021, September). Challenges of emotion detection using facial expressions and emotion visualisation in remote communication. In Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers (pp. 230–236). https://doi.org/10.1145/3460418.3479341
  • Ezzameli, K., & Mahersia, H. (2023). Emotion recognition from unimodal to multimodal analysis: A review. Information Fusion, 99, 101847. https://doi.org/10.1016/j.inffus.2023.101847
  • Fei, Z., Yang, E., Li, D. D. U., Butler, S., Ijomah, W., Li, X., & Zhou, H. (2020). Deep convolution network based emotion analysis towards mental health care. Neurocomputing, 388, 212–227. https://doi.org/10.1016/j.neucom.2020.01.034
  • Ge, D. H., Li, H. S., Zhang, L., Liu, R., Shen, P., & Miao, Q. G. (2020). Survey of lightweight neural network. Journal of Software, 31(9), 2627–2653. https://doi.org/10.13328/j.cnki.jos.005942
  • Ge, H., Zhu, Z., Dai, Y., Wang, B., & Wu, X. (2022). Facial expression recognition based on deep learning. Computer Methods and Programs in Biomedicine, 215, 106621. https://doi.org/10.1016/j.cmpb.2022.106621
  • Gera, D., Balasubramanian, S., & Jami, A. (2022). CERN: Compact facial expression recognition net. Pattern Recognition Letters, 155, 9–18. https://doi.org/10.1016/j.patrec.2022.01.013
  • González-Rodríguez, M. R., Díaz-Fernández, M. C., & Gómez, C. P. (2020). Facial-expression recognition: An emergent approach to the measurement of tourist satisfaction through emotions. Telematics and Informatics, 51, 101404. https://doi.org/10.1016/j.tele.2020.101404
  • Goodfellow, I. J., Erhan, D., Carrier, P. L., Courville, A., Mirza, M., Hamner, B., Cukierski, W., Tang, Y. C., Thaler, D., Lee, D. H., Zhou, Y. B., Ramaiah, C., Feng, F. X., Li, R. F., Wang, X. J., Athanasakis, D., Shawe-Taylor, J., Milakov, M., Park, J., … Bengio, Y. (2015). Challenges in representation learning: A report on three machine learning contests. Neural Networks, 64, 59–63. https://doi.org/10.1016/j.neunet.2014.09.005
  • Grandey, A. A., & Melloy, R. C. (2017). The state of the heart: Emotional labor as emotion regulation reviewed and revised. Journal of Occupational Health Psychology, 22(3), 407–422. https://doi.org/10.1037/ocp0000067
  • Gu, W., Xiang, C., Venkatesh, Y. V., Huang, D., & Lin, H. (2012). Facial expression recognition using radial encoding of local Gabor features and classifier synthesis. Pattern Recognition, 45(1), 80–91. https://doi.org/10.1016/j.patcog.2011.05.006
  • Guo, X., Zhou, J., & Xu, T. (2018). Evaluation of teaching effectiveness based on classroom micro-expression recognition. International Journal of Performability Engineering, 14(11), 2877. https://doi.org/10.23940/ijpe.18.11.p33.28772885
  • He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 770–778). https://doi.org/10.1109/cvpr.2016.90
  • He, Q., Zhang, H., Mei, Z., & Xu, X. (2023). High accuracy intelligent real-time framework for detecting infant drowning based on deep learning. Expert Systems with Applications, 228, 120204. https://doi.org/10.1016/j.eswa.2023.120204
  • Hussain, S. A., & Al Balushi, A. S. A. (2020). A real time face emotion classification and recognition using deep learning model. Journal of Physics: Conference Series, 1432(1), 012087. IOP Publishing. https://doi.org/10.1088/1742-6596/1432/1/012087
  • Karnati, M., Seal, A., Bhattacharjee, D., Yazidi, A., & Krejcar, O. (2023). Understanding deep learning techniques for recognition of human emotions using facial expressions: A comprehensive survey. IEEE Transactions on Instrumentation and Measurement, 72, 1–31. https://doi.org/10.1109/TIM.2023.3243661
  • Koval, P., Kalokerinos, E. K., Greenaway, K. H., Medland, H., Kuppens, P., Nezlek, J. B., Hinton, J. D. X., & Gross, J. J. (2023). Emotion regulation in everyday life: Mapping global self-reports to daily processes. Emotion, 23(2), 357–374. https://doi.org/10.1037/emo0001097
  • Krause, F. C., Linardatos, E., Fresco, D. M., & Moore, M. T. (2021). Facial emotion recognition in major depressive disorder: A meta-analytic review. Journal of Affective Disorders, 293, 320–328. https://doi.org/10.1016/j.jad.2021.06.053
  • Kuruvayil, S., & Palaniswamy, S. (2022). Emotion recognition from facial images with simultaneous occlusion, pose and illumination variations using meta-learning. Journal of King Saud University-Computer and Information Sciences, 34(9), 7271–7282. https://doi.org/10.1016/j.jksuci.2021.06.012
  • Landowska, A., Brodny, G., & Wrobel, M. R. (2017, April). Limitations of emotion recognition from facial expressions in e-learning context. In International Conference on Computer Supported Education (Vol. 2, pp. 383–389). SciTePress. https://doi.org/10.5220/0006357903830389
  • Lee, J., Kim, J., Lee, E., & Lee, H. (2023). Deep learning model structure for Korean facial expression detection. The Journal of Korean Institute of Information Technology, 21(2), 9–17. https://doi.org/10.14801/jkiit.2023.21.2.9
  • Li, D., Mei, H., Shen, Y., Su, S., Zhang, W., Wang, J., Zu, M., & Chen, W. (2018). ECharts: A declarative framework for rapid construction of web-based visualization. Visual Informatics, 2(2), 136–146. https://doi.org/10.1016/j.visinf.2018.04.011
  • Li, J., Shi, D., Tumnark, P., & Xu, H. (2020). A system for real-time intervention in negative emotional contagion in a smart classroom deployed under edge computing service infrastructure. Peer-to-Peer Networking and Applications, 13(5), 1706–1719. https://doi.org/10.1007/s12083-019-00863-8
  • Li, S., Deng, W., & Du, J. (2017). Reliable crowdsourcing and deep locality-preserving learning for expression recognition in the wild. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 2852–2861). https://doi.org/10.1109/cvpr.2017.277
  • Ling, X., Liang, J., Wang, D., & Yang, J. (2021, April). A facial expression recognition system for smart learning based on YOLO and vision transformer. In 2021 7th International Conference on Computing and Artificial Intelligence (pp. 178–182). https://doi.org/10.1145/3467707.3467733
  • Liu, J. L., Liu, B., & Zhang, W. L. (2019). The influences of academic emotions on collaborative problem solving of online learners. China Educational Technology, 390(07), 82–90. https://doi.org/10.3969/j.issn.1006-9860.2019.07.012
  • Liying, Z., Shijie, S., ZiWen, Q., Huizhen, Y., Guoxiang, Y., & Weiguo, Y. (2020). Intelligent monitoring system based on pain expression recognition. Foreign Electronic Measurement Technology, 39(3), 148–151. https://doi.org/10.19652/j.cnki.femt.201901907
  • Lu, G., Xie, K., & Liu, Q. (2022). What influences student situational engagement in smart classrooms: Perception of the learning environment and students’ motivation. British Journal of Educational Technology, 53(6), 1665–1687. https://doi.org/10.1111/bjet.13204
  • Madrid, H. P., Barros, E., & Vasquez, C. A. (2020). The emotion regulation roots of job satisfaction. Frontiers in Psychology, 11, 609933. https://doi.org/10.3389/fpsyg.2020.609933
  • Meena, G., Mohbey, K. K., & Kumar, S. (2023). Sentiment analysis on images using convolutional neural networks based Inception-V3 transfer learning approach. International Journal of Information Management Data Insights, 3(1), 100174. https://doi.org/10.1016/j.jjimei.2023.100174
  • Meena, G., Mohbey, K. K., Indian, A., Khan, M. Z., & Kumar, S. (2023). Identifying emotions from facial expressions using a deep convolutional neural network-based approach. Multimedia Tools and Applications, 83(6), 15711–15732. https://doi.org/10.1007/s11042-023-16174-3
  • Meena, G., Mohbey, K. K., Kumar, S., Chawda, R. K., & Gaikwad, S. V. (2023). Image-based sentiment analysis using InceptionV3 transfer learning approach. SN Computer Science, 4(3), 242. https://doi.org/10.1007/s42979-023-01695-3
  • Mehendale, N. (2020). Facial emotion recognition using convolutional neural networks (FERC). SN Applied Sciences, 2(3), 446. https://doi.org/10.1007/s42452-020-2234-1
  • Minaee, S., Luo, P., Lin, Z., & Bowyer, K. (2021). Going deeper into face detection: A survey. arXiv Preprint arXiv:2103.14983.
  • Moolchandani, M., Dwivedi, S., Nigam, S., & Gupta, K. (2021, April). A survey on: Facial emotion recognition and classification. In 2021 5th International Conference on Computing Methodologies and Communication (ICCMC) (pp. 1677–1686). IEEE. https://doi.org/10.1109/ICCMC51019.2021.9418349
  • Mukhopadhyay, M., Pal, S., Nayyar, A., Pramanik, P. K. D., Dasgupta, N., & Choudhury, P. (2020, February). Facial emotion detection to assess learner’s state of mind in an online learning system. In Proceedings of the 2020 5th International Conference on Intelligent Information Technology (pp. 107–115). https://doi.org/10.1145/3385209.3385231
  • Olderbak, S., Wilhelm, O., Hildebrandt, A., & Quoidbach, J. (2019). Sex differences in facial emotion perception ability across the lifespan. Cognition & Emotion, 33(3), 579–588. https://doi.org/10.1080/02699931.2018.1454403
  • Patel, K., Mehta, D., Mistry, C., Gupta, R., Tanwar, S., Kumar, N., & Alazab, M. (2020). Facial sentiment analysis using AI techniques: State-of-the-art, taxonomies, and challenges. IEEE Access, 8, 90495–90519. https://doi.org/10.1109/ACCESS.2020.2993803
  • Putro, M. D., Nguyen, D. L., & Jo, K. H. (2022). A fast CPU real-time facial expression detector using sequential attention network for human–robot interaction. IEEE Transactions on Industrial Informatics, 18(11), 7665–7674. https://doi.org/10.1109/TII.2022.3145862
  • Qi, D., Tan, W., Yao, Q., & Liu, J. (2022, October). YOLO5Face: Why reinventing a face detector. In European Conference on Computer Vision (pp. 228–244). Springer Nature Switzerland. https://doi.org/10.1007/978-3-031-25072-9_15
  • Qi, Y., Zhou, C., & Chen, Y. (2023). NA-Resnet: Neighbor block and optimized attention module for global-local feature extraction in facial expression recognition. Multimedia Tools and Applications, 82(11), 16375–16393. https://doi.org/10.1007/s11042-022-14191-2
  • Salazar Kämpf, M., Adam, L., Rohr, M. K., Exner, C., & Wieck, C. (2023). A meta-analysis of the relationship between emotion regulation and social affect and cognition. Clinical Psychological Science, 11(6), 1159–1189. https://doi.org/10.1177/21677026221149953
  • Sanchez-Moreno, A. S., Olivares-Mercado, J., Hernandez-Suarez, A., Toscano-Medina, K., Sanchez-Perez, G., & Benitez-Garcia, G. (2021). Efficient face recognition system for operating in unconstrained environments. Journal of Imaging, 7(9), 161. https://doi.org/10.3390/jimaging7090161
  • Saxena, S., Tripathi, S., & Sudarshan, T. S. B. (2019, November). Deep dive into faces: Pose & illumination invariant multi-face emotion recognition system. In 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (pp. 1088–1093). IEEE. https://doi.org/10.1109/IROS40897.2019.8967874
  • Schroff, F., Kalenichenko, D., & Philbin, J. (2015). Facenet: A unified embedding for face recognition and clustering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 815–823). https://doi.org/10.1109/cvpr.2015.7298682
  • Sohail, M., Ali, G., Rashid, J., Ahmad, I., Almotiri, S. H., AlGhamdi, M. A., Nagra, A. A., & Masood, K. (2021). Racial identity-aware facial expression recognition using deep convolutional neural networks. Applied Sciences, 12(1), 88. https://doi.org/10.3390/app12010088
  • Starr, K., & Braun, S. (2020). Audio description 2.0: Re-versioning audiovisual accessibility to assist emotion recognition. In Innovation in audio description research (pp. 97–120). Routledge.
  • Suk, M., & Prabhakaran, B. (2014). Real-time mobile facial expression recognition system – A case study. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops (pp. 132–137). https://doi.org/10.1109/cvprw.2014.25
  • Sun, W., Li, Y., Tian, F., Fan, X., & Wang, H. (2019). How presenters perceive and react to audience flow prediction in-situ: An explorative study of live online lectures. Proceedings of the ACM on Human–Computer Interaction, 3(CSCW), 1–19. https://doi.org/10.1145/3359264
  • Ulandari, N., & Lubis, R. K. (2020). The effect of giving discounts and service quality on sales at PT Midi Utama Indonesia Tbk Alfamidi Branch of SM Raja 4 Medan. Journal of Economics and Business, 2(1), 91–96. https://doi.org/10.58471/jecombi.v2i1.12
  • Wang, K., Peng, X., Yang, J., Lu, S., & Qiao, Y. (2020). Suppressing uncertainties for large-scale facial expression recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 6897–6906). https://doi.org/10.1109/cvpr42600.2020.00693
  • Wang, W., Xu, K., Niu, H., & Miao, X. (2020). Emotion recognition of students based on facial expressions in online education based on the perspective of computer simulation. Complexity, 2020, 1–9. https://doi.org/10.1155/2020/4065207
  • Xia, X., Yang, L., Wei, X., Sahli, H., & Jiang, D. (2022). A multi-scale multi-attention network for dynamic facial expression recognition. Multimedia Systems, 28(2), 479–493. https://doi.org/10.1007/s00530-021-00849-8
  • Yang, S., Luo, P., Loy, C. C., & Tang, X. (2016). Wider face: A face detection benchmark. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 5525–5533). https://doi.org/10.1109/cvpr.2016.596
  • Yu, Z., Huang, H., Chen, W., Su, Y., Liu, Y., & Wang, X. (2022). YOLO-FaceV2: A scale and occlusion aware face detector. arXiv Preprint arXiv:2208.02019.
  • Zapf, D., Kern, M., Tschan, F., Holman, D., & Semmer, N. K. (2021). Emotion work: A work psychology perspective. Annual Review of Organizational Psychology and Organizational Behavior, 8(1), 139–172. https://doi.org/10.1146/annurev-orgpsych-012420-062451
  • Zhang, J. (2023). Intercultural communication dilemma and countermeasures in international trade. Frontiers in Business, Economics and Management, 10(3), 114–118. https://doi.org/10.54097/fbem.v10i3.11461
