FOCUSED LITERATURE REVIEW 8: Exploring the potential of LMS log data as a proxy measure of student engagement
SOURCE: Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344-362. https://doi.org/10.1007/s12528-017-9161-1
Engagement, defined in this study as focused, committed, and energetic involvement in learning, has been shown to correlate directly with academic success and is particularly important in technology-mediated distance learning. This study examined whether activity data logged by a learning management system (LMS) can serve as a proxy measure of engagement. Specifically, the researchers looked for a correlation between self-reported engagement survey scores and log data tracking student activity in the Canvas LMS.
While many kinds of engagement are identified in the literature, the researchers focused on cognitive and emotional engagement because these types have a strong, empirically and theoretically supported connection to learning. They found that LMS log data, which would seemingly be a strong indicator of learning activity in a course, did not have a statistically significant relationship to students' self-reported measures of cognitive and emotional engagement in online courses.
The researchers looked at data from 153 students in eight sections of three undergraduate courses at a single university in the western US. All courses were offered through Canvas in a blended format. Log data was chosen as a measure of engagement because it is tracked automatically behind the scenes and is therefore minimally disruptive to each learner's process. The measures included URLs visited, page views, time spent per page, timestamps of page visits, number of discussion replies, punctuality of assignment submission, and grades. Each course included discussion boards, quizzes, online videos, and projects, all of which took place within the LMS.
These log data were compared against a self-report survey consisting of seven Likert-style questions. Self-report was chosen to measure student engagement because cognitive and emotional states are hard to observe directly, and conclusions drawn from observation are highly inferential.
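To make the comparison concrete, the following is a minimal sketch of the kind of log-versus-survey correlation the study describes. It is not the authors' actual analysis pipeline, and every column name and value is hypothetical; it only illustrates pairing per-student log aggregates with survey scores and computing a rank correlation in Python.

    # Minimal sketch (not the authors' analysis): correlate per-student
    # LMS log aggregates with self-reported engagement scores.
    # All column names and values below are hypothetical.
    import pandas as pd
    from scipy.stats import spearmanr

    # Hypothetical per-student aggregates derived from Canvas log data
    logs = pd.DataFrame({
        "student_id": [1, 2, 3, 4],
        "page_views": [120, 45, 200, 80],
        "minutes_on_pages": [340.5, 90.0, 610.2, 150.8],
        "discussion_replies": [6, 1, 9, 3],
    })

    # Hypothetical mean scores from the seven-item Likert engagement survey
    survey = pd.DataFrame({
        "student_id": [1, 2, 3, 4],
        "self_reported_engagement": [4.1, 3.6, 4.4, 2.9],
    })

    merged = logs.merge(survey, on="student_id")

    # Rank correlation between each log metric and the survey score
    for metric in ["page_views", "minutes_on_pages", "discussion_replies"]:
        rho, p = spearmanr(merged[metric], merged["self_reported_engagement"])
        print(f"{metric}: rho={rho:.2f}, p={p:.3f}")

A rank correlation is used here only because raw log counts tend to be skewed; the study's own statistical procedure is not reproduced.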
Their unexpected results, they concluded, point to the complexity of learning and to the difference between observed measures of engagement and students' self-reported intellectual and emotional states. The researchers noted a long list of limitations that may have contributed to these unexpected results, and they concluded that the behavior captured in log data may be far more complicated than they had anticipated. To yield more reliable results, they suggest accounting for other factors that shape what it means to be engaged in online learning, such as prior knowledge, motivation to learn, and level of confusion or frustration.
They suggest further work with other methods for measuring student engagement, such as mouse tracking, physiological instruments, and human observers. A longer, more nuanced survey may also help interpret time spent on pages and assignments; for example, highly engaged and motivated students may actually need less time, while frustrated or confused students may spend more time on pages. Time spent on pages is also difficult to quantify in the first place: without direct observation, it is hard to determine whether a student was actually focused on their work while a page was open on screen.
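As an illustration of that ambiguity, the sketch below shows one common, assumption-laden heuristic for estimating "active" time from page-event timestamps: gaps between consecutive events are summed, and long gaps are capped at an arbitrary idle cutoff. Nothing here reflects how Canvas or the study computed time on page; it is included only to show how much inference the metric requires.

    # Illustrative only: estimate "active" minutes from page-event timestamps
    # by capping long gaps at an idle cutoff. The cutoff is an arbitrary
    # assumption; a student might have been reading (or absent) the whole time.
    from datetime import datetime, timedelta

    def estimate_active_minutes(timestamps, idle_cutoff=timedelta(minutes=10)):
        """Sum gaps between consecutive events, capping each gap at the cutoff."""
        events = sorted(timestamps)
        total = timedelta()
        for earlier, later in zip(events, events[1:]):
            total += min(later - earlier, idle_cutoff)
        return total.total_seconds() / 60

    # Hypothetical session: the 40-minute gap counts as only 10 "active" minutes,
    # even though the student may in fact have been working the entire time.
    session = [
        datetime(2017, 3, 1, 9, 0),
        datetime(2017, 3, 1, 9, 5),
        datetime(2017, 3, 1, 9, 45),
    ]
    print(estimate_active_minutes(session))  # 15.0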