Research Articles

Using Mobile Dual Eye-Tracking to Capture Cycles of Collaboration and Cooperation in Co-located Dyads

Pages 26-55 | Published online: 23 Dec 2022
 

Abstract

The goal of this paper is to bring new insights to the study of social learning processes by designing measures of collaboration using high-frequency sensor data. More specifically, we are interested in understanding the interplay between moments of collaboration and cooperation, which is an understudied area of research. We collected a multimodal dataset during a collaborative learning activity typical of makerspaces: learning how to program a robot. Pairs of participants were introduced to computational thinking concepts using a block-based programming environment. Mobile eye-trackers, physiological wristbands, and motion sensors captured their behavior and social interactions. In this paper, we analyze the eye-tracking data to capture participants’ tendency to synchronize their visual attention. The paper provides three contributions: (1) we use an emerging methodology (mobile dual eye-tracking) to capture joint visual attention in a co-located setting and replicate findings showing that levels of joint visual attention are positively correlated with collaboration quality; (2) we qualitatively analyze the co-occurrence of verbal activity and joint visual attention in low- and high-performing groups to better understand moments of collaboration and cooperation; (3) inspired by these qualitative observations and by theories of collaborative learning, we design a new quantitative measure that captures cycles of collaborative and cooperative work. Compared to simple measures of joint visual attention, this measure yields higher correlation coefficients with learning and collaboration scores. We discuss these results and describe how advances in analyzing sensor data can contribute to theories of collaboration. We conclude with implications for capturing students’ interactions in co-located spaces using Multimodal Learning Analytics (MMLA).
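The abstract does not specify how joint visual attention is computed, so the following is only an illustrative sketch of a convention common in dual eye-tracking research: two synchronized gaze streams are labeled with the object each participant is looking at, and joint visual attention is the proportion of time windows in which both partners fixate the same object within a small temporal offset. All names, parameters, and the example data below are hypothetical, not taken from the paper.

```python
# Hedged sketch: joint visual attention (JVA) from two labeled gaze streams.
# Assumes per-window gaze targets (None = no valid fixation) sampled on a
# common clock; max_offset tolerates one partner lagging the other slightly.

from typing import List, Optional

def joint_visual_attention(
    gaze_a: List[Optional[str]],   # per-window gaze target of participant A
    gaze_b: List[Optional[str]],   # per-window gaze target of participant B
    max_offset: int = 2,           # tolerance (in windows) for lagging gaze
) -> float:
    """Fraction of windows in which both partners attend to the same target."""
    n = min(len(gaze_a), len(gaze_b))
    joint = 0
    for t in range(n):
        if gaze_a[t] is None:
            continue
        lo, hi = max(0, t - max_offset), min(n, t + max_offset + 1)
        if any(gaze_a[t] == gaze_b[s] for s in range(lo, hi)):
            joint += 1
    return joint / n if n else 0.0

if __name__ == "__main__":
    # Toy example: targets are objects in the makerspace task (e.g., robot, blocks).
    a = ["robot", "robot", "blocks", "blocks", None, "robot"]
    b = ["robot", "blocks", "blocks", "robot", "robot", "robot"]
    print(f"JVA proportion: {joint_visual_attention(a, b):.2f}")
```

The paper's own measure additionally looks at how such moments of joint attention alternate with periods of individual (cooperative) work; the sketch above covers only the simpler joint-attention proportion it is compared against.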

