
Understanding Flow Experience in Video Learning by Multimodal Data

Pages 3144-3158 | Received 23 Nov 2022, Accepted 14 Feb 2023, Published online: 23 Feb 2023

