What Does Shared Understanding in Students’ Face-to-Face Collaborative Learning Gaze Behaviours “Look Like”?
In: Rodrigo, M and Matsuda, N and Cristea, A and Dimitrova, V, (eds.)
Artificial Intelligence in Education. AIED 2022. Proceedings.
(pp. 588-593).
Springer: Cham, Switzerland.
Over the past decade, several studies have shown a positive relationship between measures of gaze behaviour and the quality of student group collaboration. However, gaze behaviours are frequently i) employed to investigate students’ online interactions and ii) calculated as cumulative measures of collaboration, rarely providing insight into the actual process of collaborative learning in real-world settings. To address these two limitations, we explored sequences of students’ gaze behaviours as a process and their relationship to collaborative learning in a face-to-face environment. Twenty-five collaborative learning session videos from five groups in a 10-week post-graduate module were included. Four types of gaze behaviour (i.e., gazing at peers, their laptops, tutors, and undefined objects) were used to label student gaze, and the resulting sequences were analysed using the Optimal Matching (OM) algorithm and Ward’s clustering. Two distinct gaze patterns with different levels of shared understanding and collaboration satisfaction were identified: i) peer-interaction focused (PIF), which prioritises the social-interaction dimensions of collaboration, and ii) resource-interaction focused (RIF), which prioritises resource management and task execution. The implications of the findings for the automated detection of students’ gaze behaviours with computer vision and for adaptive support are discussed.
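The abstract's analysis pipeline (label each time slice with one of four gaze states, compute pairwise Optimal Matching distances between the resulting sequences, then cluster with Ward's method) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy sequences, the substitution/indel costs, and the use of SciPy's hierarchical clustering are all assumptions.

```python
from itertools import combinations
from scipy.cluster.hierarchy import linkage, fcluster

# Four gaze-behaviour states, following the paper's labelling scheme:
# P = peer, L = laptop, T = tutor, U = undefined object

def om_distance(a, b, sub_cost=2, indel_cost=1):
    """Optimal Matching distance: an edit distance over state
    sequences with substitution and insertion/deletion costs.
    The cost values here are illustrative defaults."""
    n, m = len(a), len(b)
    d = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel_cost
    for j in range(1, m + 1):
        d[0][j] = j * indel_cost
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if a[i - 1] == b[j - 1] else sub_cost
            d[i][j] = min(d[i - 1][j] + indel_cost,      # deletion
                          d[i][j - 1] + indel_cost,      # insertion
                          d[i - 1][j - 1] + cost)        # match/substitute
    return d[n][m]

# Hypothetical per-student gaze sequences, one symbol per time slice
sequences = ["PPLLTP", "PPLLPP", "LLLLUL", "LLLUUL"]

# Condensed pairwise distance vector in the order SciPy expects
dists = [om_distance(a, b) for a, b in combinations(sequences, 2)]

# Ward's clustering on the OM distances, cut into two clusters
Z = linkage(dists, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

With these toy inputs the two peer-heavy sequences fall into one cluster and the two laptop-heavy sequences into the other, mirroring the paper's PIF/RIF split. Note that SciPy's Ward linkage formally assumes Euclidean distances, so applying it to OM distances (as is common in sequence analysis) is a pragmatic approximation.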