Conducting a Formative Evaluation on a Course-Level Learning Analytics Implementation Through the Lens of Self-Regulated Learning and Higher-Order Thinking

Keywords: Learning Analytics, Self-Regulated Learning, Higher-Order Thinking
Self-regulated learning (SRL) and higher-order thinking skills (HOTS) are associated with academic achievement, but fostering these skills is not easy. Scholars have suggested an alternative way to scaffold these important skills through learning analytics (LA). This paper presents a formative evaluation of a course-level LA implementation through the lens of SRL and HOTS. We explored the changes in students’ SRL, HOTS, and perceptions at the end of the course term. Results indicate an increase in some elements of SRL and HOTS, and positive student perceptions. A discussion of implications and opportunities for informing future teaching strategies and course design iteration is included.


Research literature documents the crucial role of self-regulation on students’ academic achievement (Broadbent, 2017; Credé & Kuncel, 2008; Nevgi, 2002; Pintrich & de Groot, 1990; Richardson et al., 2012). Students with higher-order thinking skills (HOTS) also tend to be academically successful (Tanujaya et al., 2017) and have strong metacognition and performance calibration essential to self-regulated learning (SRL) (Isaacson & Fujita, 2006; Maki, 1995). There is a direct effect of fundamental SRL strategies on students’ HOTS (Lee & Choi, 2017). Put simply, HOTS and SRL are interrelated and play a fundamental role in determining one’s academic success.

Fostering students’ HOTS and SRL is not simple (Koh et al., 2012; Nouri et al., 2019; Yen & Halili, 2015). Therefore, some scholars adopt learning analytics (LA) to assess to what extent students deploy specific strategies during the learning process (Tabuenca et al., 2015; van Horne et al., 2017; Yamada et al., 2016, 2017; You, 2016). Students’ digital traces can be analyzed for learning behavior patterns to inform interventions that foster exemplary behaviors (Roll & Winne, 2015). Research has also shown that implementing LA helps foster SRL (Tabuenca et al., 2015; van Horne et al., 2017; Yamada et al., 2016, 2017; You, 2016). Despite the benefits of LA, translating data from LA into actionable interventions at the course level is complex and still rare (van Leeuwen, 2019).

This paper presents a formative evaluation conducted at the course level. We utilized Learning Management System (LMS) usage data and an LA framework synthesized from existing literature. The LMS data allowed the instructor to decide when to employ interventions that promoted HOTS and SRL. This formative evaluation included an investigation of student SRL and HOTS using pre- and post-surveys, both of which included closed-ended and open-ended items. Essentially, we traced any changes in student SRL and HOTS after the instructor performed data-informed interventions by following a synthesized LA framework based on the works of Ifenthaler and Widanapathirana (2014), Muljana and colleagues (2021), and Muljana and Placencia (2018). Our findings will be incorporated into future instructional strategies and course design iteration to encourage SRL and HOTS through an LA implementation.

Literature Review

Applications of LA should align with learning contexts; therefore, it is essential to implement LA in conjunction with an existing learning theory or construct (Gašević et al., 2015). In this evaluation, we utilized LA in parallel with promoting students’ SRL and HOTS.

Description of Learning Analytics

LA refers to “the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environment in which it occurs” (Siemens & Long, 2011, p. 32). This definition yields two key points (Muljana & Luo, 2021; Muljana et al., 2021; Muljana & Placencia, 2018). First, data collection, analysis, reports, and similar measurements should consider the learners’ learning context. This can include study time, length of study time, access to materials, discussion participation, student reflection, and grades (Dietz et al., 2018). Second, the goal of employing LA is to optimize learning. Tracking students’ digital traces makes it possible to analyze and diagnose learning progress, struggles, and successes to inform decisions regarding any interventions necessary to promote learning outcomes (Casey & Azcona, 2017; Dietz-Uhler & Hurn, 2013; Macfadyen & Dawson, 2010). In other words, information about student learning behaviors from LA can be used by instructors to corroborate their instincts, detect student struggles, and advise immediate interventions (Dietz-Uhler & Hurn, 2013; Muljana & Luo, 2021; Muljana et al., 2021; Muljana & Placencia, 2018).

Self-Regulated Learning

SRL refers to proactive learning activities or processes involving learners’ thoughts, behaviors, and affects that systematically and strategically assist them in achieving their goals of improved learning (Zimmerman, 2002, as cited in Dabas et al., 2021). Students who possess self-regulation skills assess the situation, set goals, conduct and monitor their strategies, self-evaluate the outcome, and self-adapt to any improved strategies. In other words, the use of SRL involves students’ cognitions, behaviors (Zimmerman, 1989), and affects (e.g., self-satisfaction) (Zimmerman, 2002), and requires continual iteration. When students self-adapt, they also set new goals for the next learning activity. Among all SRL elements, there is a clear interrelationship between metacognition and regulation (Binbarasan-Tüysüzoğlu & Greene, 2015; Karabenick & Zusho, 2015); students tend to perform better if they continuously regulate their efforts according to their metacognitive awareness about their learning process and progress. In our formative evaluation, we focused on two SRL elements: metacognition and effort regulation.

While an individual’s proactive action is essential to SRL, external factors like study environment, available time to study, access to learning resources, instructional guidance, and instructional conditions play a role in SRL development (Gašević et al., 2016; Winne, 2011, 2017). Yamada et al. (2017) recommend course elements that intentionally promote students’ self-efficacy and cognitive learning strategies because these variables support SRL skills. Broadbent (2017) further recommends scaffolding methods, such as providing learning opportunities and assessments that promote goal setting, planning, and reflection, be integrated into course design to encourage students to adopt SRL strategies.

Several studies connect LA with SRL. For example, student responses to Pintrich et al.’s (1991) Motivated Strategies for Learning Questionnaire (MSLQ) can be analyzed and correlated with the timeliness of assignment submissions (Yamada et al., 2016, 2017). MSLQ items can also be correlated with student SRL and their access frequency to real-time feedback provided by LA dashboards (van Horne et al., 2017). Still, while many LA-related studies focus on measurement purposes, the emphasis on teaching practices using LA to support students deserves further attention (Viberg et al., 2020).

Higher-Order Thinking Skills

We adopted the following overarching description of HOTS: “higher-order thinking occurs when a person takes new information and information stored in memory and interrelates and/or rearranges and extends this information to achieve a purpose or find possible answers in perplexing situations” (Lewis & Smith, 1993, p. 136). Students with HOTS perform beyond the literal interpretation of information to expound and use reason to build representations from it (Newman, 1990; Resnick, 1987). The professional world demands HOTS (Rotherham & Willingham, 2010; Silva, 2009) that instructional strategies and a well-designed learning environment can scaffold (Heong et al., 2011; Yen & Halili, 2015). HOTS allow students to become independent thinkers, problem solvers, and decision makers, and to facilitate the transfer of these abilities into real-life situations in the professional world (Rotherham & Willingham, 2010; Silva, 2009).

Students acquire HOTS during the learning process by identifying tasks and problems, clarifying components required by problems, judging related information, and evaluating information acquired and procedures for problem-solving (Quellmalz, 1985). These activities promote students’ self-awareness about their thinking, self-monitoring, and problem-solving strategies (Quellmalz, 1985). HOTS involve the execution of critical, logical, reflective, metacognitive, creative thinking, and self-regulation skills (Mainali, 2012; Resnick, 1987; Zohar & Dori, 2003). Metacognitive thinking, self-regulation, and critical and reflective thinking all overlap with SRL elements. During the learning process, critical thinking helps students select, test, evaluate, adopt, and adapt suitable learning strategies in various learning contexts (Brown et al., 1993; Hadwin & Oshige, 2011). As students evaluate the impact of their learning strategies on learning outcomes, they use reflective thinking to improve their learning process (Isaacson & Fujita, 2006). We focused on investigating critical and reflective thinking during our evaluation.

A small number of recent studies have explored the intersection of HOTS and LA. For example, visual LA tools have been used to investigate students’ activities annotating reading materials and commenting on other annotations, which positively impact critical reading achievement (Koh et al., 2019). Learning assisted by visual LA tools also influences students’ higher-order thinking (Zhang & Chan, 2020). However, these studies do not provide practical guidelines to translate LA data into immediate actions.

Using Learning Analytics at the Course Level

Ifenthaler and Widanapathirana (2014) developed an LA matrix outlining the benefits of using LA from predictive, formative, and summative perspectives. Predictive LA helps foresee outcomes and determine future strategies when conducted early on. Formative LA uses real-time data to help instructors decide whether to intervene. Summative LA can give insights after learning events. In our two previous works, we built upon Ifenthaler and Widanapathirana’s (2014) matrix to develop two similar LA frameworks (Muljana et al., 2021; Muljana & Placencia, 2018) that used three analytic perspectives. For the present evaluation, we synthesized those frameworks and adapted them into three phases: (1) early diagnosis, conducted at the start of the semester; (2) formative diagnosis, conducted throughout the semester; and (3) summative diagnosis, conducted partly during and partly at the end of the semester. Table 1 details each phase.

Table 1

Three Phases of LA

Starting point: Performing sound-pedagogy course design as a foundation

Phase 1: Early diagnosis
Consider assigning:
  • Entrance survey
  • Pre-test
  • Ice-breaker discussion
Data analyzed:
  • Survey results
  • Test item analysis
  • Discussion posts
  • Course usage data
Take the following immediate actions:
  • Give clear expectations
  • Provide SRL and HOTS strategy tips (e.g., motivating messages through announcements; tips related to goal setting, time management, and selecting learning strategies)

Phase 2: Formative diagnosis
Detect:
  • Any difficult topics
  • At-risk students
  • Less-engaged students
Data analyzed:
  • Test item difficulty
  • Assignment submission timestamps
  • Course usage data
  • Discussion posts
Take the following immediate actions:
  • Add remedial materials or provide a review
  • Provide SRL and HOTS strategy tips
  • Reflect on the current course design and adjust it
  • Intervene in online discussions to encourage more dialogue that promotes critical thinking
  • Reflect on the current instructions or prompts and adjust their clarity in the next module

Phase 3: Summative diagnosis
Analyze and/or identify:
  • Overall student outcomes
  • Online discussion
  • Exit survey results
  • Students who excel or fall behind
Data analyzed:
  • Final grades
  • Summary of course usage data
  • The number of discussion participations
  • Module(s) with most or least access
  • Exit survey results
Take the following immediate actions:
  • Reflect on the instructor’s strategies performed during the semester
  • Consider applying the successful strategies to the next cohort
  • Consider student feedback to inform the next course design iteration
  • Improve the course design in the next iteration

Note. The LA approach includes three phases, synthesized from existing frameworks (e.g., Ifenthaler & Widanapathirana, 2014; Muljana et al., 2021; Muljana & Placencia, 2018).

Instructors can conduct early diagnosis using data from entrance surveys, pre-tests, ice-breaker discussions, and course usage logs. They can review the data to learn students’ goals for taking the course and students’ prior knowledge and experience. Instructors can then use that information to provide clear expectations and instructions, and offer SRL and HOTS strategy tips through weekly briefings.

Instructors can implement formative diagnosis to detect challenging topic(s), potentially at-risk students, and less-engaged students. They can analyze data like submission timestamps, test difficulty reports, course usage records, and discussion posts. Instructors can then intervene and suggest remedial materials, share SRL and HOTS strategy tips, adjust course design, encourage discussion dialogues, and adjust instruction clarity in subsequent modules.

Instructors can review summative data, such as overall student outcomes, to identify students who excel or lag behind and to observe their engagement levels. They can also analyze course usage summaries, discussion participation counts, and module access to inform decisions about adjusting instructional strategies and course design for future cohorts.

Formative Evaluation Questions

From the lens of SRL and HOTS, we conducted a formative evaluation on an LA implementation performed at the course level. We used an LA framework synthesized from existing literature (Ifenthaler & Widanapathirana, 2014; Muljana et al., 2021; Muljana & Placencia, 2018) to observe students’ learning progress and inform the instructor’s interventions to adjust teaching strategies and improve student learning (see Table 1). As “the function of formative evaluation is to improve” (Nieveen & Folmer, 2013, p. 158), our findings will be used to inform instructional strategies and course redesign for subsequent cohorts. Three questions guided this evaluation: (1) Did the LA implementation increase student SRL by the end of the semester? (2) Did the LA implementation increase student HOTS by the end of the semester? (3) How did the students perceive changes in their SRL and HOTS by the end of the semester?

Evaluation Methods

We adopted a case study approach for this formative evaluation. We used pre- and post-surveys (i.e., closed- and open-ended items) to understand student SRL and HOTS, including their perceived understanding of their own SRL and HOTS. We selected the case study approach because it allowed us to comprehend a contemporary, complex phenomenon (Yin, 2008). In our context, applying LA at the course level is an emerging practice. The practice of translating data from LA into actionable interventions at the course level is still a rarity (van Leeuwen, 2019), and may require instructors to use complex processes (Molenaar & Knoop-van Campen, 2018; Wise & Jung, 2019). The case study approach guided us to explore how an LA implementation supported student SRL and HOTS, allowing us to highlight the practical significance of the results (Newman & Hitchcock, 2011).

In this formative evaluation, we analyzed the data from one instructor who taught two course sections on the same topic: one with an LA implementation, the other without. Analyzing these two cases allowed us to examine each situation (Yin, 2008) and whether the LA implementation contributed to any changes in SRL and HOTS. Given the small number of participants in this formative evaluation, we used only descriptive statistics to answer the evaluation questions regarding the pre- and post-comparison. These results gave insight into how to improve our strategies and course design, as well as informing readers about a potential LA implementation that can enhance SRL and HOTS.

Participants and Context

After receiving approval from the Institutional Review Board, we recruited students from two sections of the same upper-level general education course for an engineering program. This course applied economic theory to solve managerial problems and make decisions related to capital allocation for private, public, and governmental sectors. Twelve students participated in this study: four students from course section 1 and eight from course section 2 (see Table 2). We assured their anonymity, and they signed an informed consent form.

Table 2

Demographic and Contextual Information of Evaluation Participants

Demographic information | Students from course section 1 (Case 1) (N1 = 4) | Students from course section 2 (Case 2) (N2 = 8)
Do not wish to mention | 1 | 0
Class standing | |
Enrollment status | |

Both classes met twice a week using a traditional, face-to-face format. The instructor used a Blackboard LMS to host course materials and facilitate both learning tasks and assessments. The courses utilized a Quality Matters (QM) template built by the university to follow quality course design standards. Course content was segmented into 12 modules and sequenced strategically to present the fundamental topics initially before the more complex ones. The LMS includes built-in data-analytics features, such as Course Reports, Performance Dashboard, and Early Warning System, that record overall course usage, students’ submission activities, and submission timestamps. The Item Analysis feature within the Grade Center in the LMS allowed the instructor to analyze quiz difficulty and overall students’ performance by question.

Instruments
We used a questionnaire consisting of demographic-related items and selected sub-scales from the MSLQ (Pintrich et al., 1991) to assess students’ prior SRL and HOTS, as well as improvements. We specifically chose the following MSLQ sub-scales: (a) 12 items of Metacognitive Self-Regulation for assessing SRL; (b) four items of Effort Regulation for assessing SRL; and (c) five items of Critical Thinking for assessing HOTS. We also adopted four items from the Reflection sub-scales of the Reflective Thinking Questionnaire (RTQ) by Kember et al. (2000) to assess HOTS. The MSLQ is one of the most frequently used instruments for assessing SRL strategies (Panadero, 2017; Tong et al., 2020). As cited in Muljana et al. (2021), previous research utilizing the MSLQ reported good reliability and validity, with Cronbach alpha values between 0.62 and 0.93 (Cho & Shen, 2013; Hederich-Martínez et al., 2016; Kim & Jang, 2015; Li et al., 2020; Stegers-Jager et al., 2012). The RTQ also showed good reliability and validity in several studies, with Cronbach alpha values ranging between 0.62 and 0.91 (Asakereh & Yousofi, 2018; Ghanizadeh, 2017; Ghanizadeh & Jahedizadeh, 2017; Safari et al., 2020; Tsingos-Lucas et al., 2016). In total, we used 25 items from both the MSLQ and RTQ in pre- and post-surveys, but excluded some demographic items from the post-survey and included three open-ended questions. The open-ended questions in the post-survey asked students whether they perceived any changes in their SRL and HOTS.
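For readers unfamiliar with the reliability coefficient cited above, the following minimal sketch computes Cronbach’s alpha for one sub-scale. The responses are hypothetical and for illustration only; they are not data from this evaluation.

```python
# Illustrative computation of Cronbach's alpha for a Likert sub-scale.
# The item responses below are hypothetical, not the study's data.
import statistics

def cronbach_alpha(item_scores):
    """item_scores: one inner list per item, aligned across respondents.
    Alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = len(item_scores)
    sum_item_variances = sum(statistics.variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum_item_variances / statistics.variance(totals))

# Four hypothetical 5-point Likert items answered by six respondents
items = [
    [4, 3, 5, 4, 2, 4],
    [4, 3, 4, 5, 2, 3],
    [5, 2, 4, 4, 3, 4],
    [4, 3, 5, 5, 2, 4],
]
alpha = cronbach_alpha(items)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is how the 0.62–0.93 range reported for the MSLQ and RTQ is typically interpreted.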

Procedures and Data Collection

The formative evaluation took place in two course sections: section 1 (Case 1) and section 2 (Case 2). The same instructor taught both sections on the same topics using the same instructional resources. The instructor conducted the LA phases (as listed in Table 1) in Case 2 but purposefully not in Case 1. In week 1 and week 2, the students from both cases completed the pre-survey, and the instructor covered the content of Module 1 and Module 2, which served as the foundation for the more advanced topics in the subsequent modules or weeks. Within Module 3 to Module 12, the instructor included one weekly quiz at the end of each module in Case 1. However, the modules for Case 2 included two weekly quizzes (mid-module and at the end of a module). The post-survey was made available in week 14, and students had two weeks to complete it. Table 3 lists the overall procedures in both cases.

Table 3

The Formative Evaluation Procedures

When | Case 1 (Course section 1) | Case 2 (Course section 2)
Week 1 and week 2 | Assigning pre-survey | Assigning pre-survey
Week 3 or 4 through week 14 (at this point, the instructor covered the content of Module 3 through Module 12) | Assigning a weekly quiz at the end of each module | Assigning two weekly quizzes (one in the middle and one at the end of each module); performing the three LA phases (as listed in Table 1) throughout the semester
Week 14 and week 15 | Assigning post-survey | Assigning post-survey

Note. Data were collected from pre- and post-surveys.

Data Analysis

We exported the pre- and post-survey results into a Microsoft Excel spreadsheet. We then analyzed these data for descriptive statistics using the Statistical Package for the Social Sciences (SPSS). Due to the small sample size, we conducted no inferential statistics.
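The descriptive comparison amounts to computing each sub-scale’s mean and standard deviation before and after, plus the mean difference. The sketch below illustrates this with hypothetical scores (the study itself used SPSS; these numbers are not the study’s data):

```python
# Illustrative pre/post descriptive comparison for one Likert sub-scale.
# Scores are hypothetical, not data from this evaluation.
import statistics

# Hypothetical 5-point Likert sub-scale averages for eight students
pre_scores = [3.0, 3.5, 2.75, 3.25, 3.5, 3.0, 3.25, 3.0]
post_scores = [4.5, 4.25, 4.75, 4.5, 4.25, 4.75, 4.5, 4.5]

def describe(scores):
    """Return (mean, sample standard deviation), rounded to two decimals."""
    return round(statistics.mean(scores), 2), round(statistics.stdev(scores), 2)

pre_mean, pre_sd = describe(pre_scores)
post_mean, post_sd = describe(post_scores)
mean_difference = round(post_mean - pre_mean, 2)
```

With only a handful of respondents per case, these means are descriptive summaries only; as noted above, no inferential tests are warranted.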

We analyzed the open-ended responses using the structural coding technique. This coding technique utilizes content-based phrases “representing a topic of inquiry to a segment of data that relates to a specific research question” (MacQueen et al., 2008, p. 124, as cited in Saldaña, 2013, p. 84), and is suitable for analyzing open-ended survey responses (Saldaña, 2013). Using structural coding, the first author simultaneously coded and categorized students’ open-ended responses by identifying segments of responses displaying commonalities (Saldaña, 2013), guided by the third question itself and related topics. As stated by Saldaña (2013), structural coding is “framed and driven by a specific research question and topic” (p. 87).

We used three a priori topics related to the third question. For example, students must select and monitor suitable learning strategies when conducting an SRL phase (Zimmerman, 2002). Thus, the first a priori category represented students’ perceptions of any changes in their learning strategies. Second, instructors’ guidance plays an imperative role in SRL development (Gašević et al., 2016; Winne, 2011, 2017). So, the second a priori category guided us to analyze students’ perceptions about instructional guidance. Third, our literature review indicated students with HOTS could become independent thinkers, problem solvers, and decision makers, and facilitate the transfer of these analytical thinking abilities into real-life situations in the professional world (Rotherham & Willingham, 2010; Silva, 2009). Hence, the third a priori category guided us to analyze perceptions regarding transferable, analytical thinking skills that students gained. Next, the first author presented all analyzed categories to the second author for feedback. They discussed the analyzed categories and resolved any disagreements (see Tables 6 and 7 for the highlighted categories resulting from the structural coding technique).

Results

Changes in SRL

Comparing pre- and post-survey results, Case 2 showed better average score increases for each variable (see Table 4). For example, while the average score of metacognitive self-regulation did not increase in Case 2, it slightly decreased from 3.25 to 3.08 in Case 1. Students’ average effort regulation scores also increased from 3.28 to 4.50 in Case 2, more than one point higher than for Case 1.

Table 4

Results of Pre- and Post-Surveys Assessing Self-Regulated Learning

SRL variables | Case 1 (N1 = 4): pre M (SD), post M (SD), M difference | Case 2 (N2 = 8): pre M (SD), post M (SD), M difference
Metacognitive self-regulation | 3.25 (.38), 3.08 (.62), -.17 | 3.45 (.40), 3.45 (.40), .00
Effort regulation | 3.06 (.88), 3.75 (1.06), .69 | 3.28 (.41), 4.50 (.52), 1.22

Note. 1=Strongly Disagree, 5=Strongly Agree

Changes in HOTS

Students’ critical thinking scores decreased slightly in Case 1 (M = 3.75 to M = 3.65), while those of students in Case 2 increased (M = 3.45 to M = 3.78). Students in both cases rated their reflection strategies lower by the end of the semester; however, the decrease was larger among students in Case 1 (Mdifference = -.62). Table 5 lists pre- and post-survey results for HOTS.

Table 5

Results of Pre- and Post-Surveys Assessing Higher Order Thinking Skills

HOTS variables | Case 1 (N1 = 4): pre M (SD), post M (SD), M difference | Case 2 (N2 = 8): pre M (SD), post M (SD), M difference
Critical thinking | 3.75 (.55), 3.65 (.38), -.10 | 3.45 (.76), 3.78 (.88), .33

Note. 1=Strongly Disagree, 5=Strongly Agree

Perceived Changes in SRL and HOTS

Case 1

To apply SRL, students must select and monitor suitable learning strategies (Zimmerman, 2002). Thus, we asked students whether they changed such strategies. Three out of four students in Case 1 reported they used the same learning strategies since the beginning of the semester. As one student noted, “My strategies are the same as they were. Pay as much attention in class as possible and supplement with textbook or internet knowledge as needed.”

Students had positive comments about HOTS despite the absence of the LA implementation. They noted gaining or boosting their skills in the application and analytical-thinking HOTS domains. For example, one student noticed that merely plugging in numbers would not work; this student had “[. . .] to analyze and understand real work applications and that would vary from the formulas.” Another student noted that the course was already intuitive and that they did not learn much new information, but claimed, “[. . .] the course taught me many new applications of these topics that I’m glad I learned.” Table 6 depicts the categories that emerged from open-ended responses from students in Case 1.

Table 6

Categories of Student Insights from Case 1

Category | Definition | Number of students (N1 = 4) | Example comment
No change in learning strategy | Students did not change their learning strategy; they still employed the same strategy that they had been using. | 3 | “My strategies are the same as they were. Pay as much attention in class as possible and supplement with textbook or internet knowledge as needed.”
Helpful instructional guidance | Students thought that the instructional guidance provided by the instructor was helpful. Guidance manifested through materials and projects was clear. | 3 | “The content in this course is fairly intuitive to me, but the instructor offers good explanations for less-intuitive concepts so his guidance is helpful.”
Encourages analytical thinking | Students felt they gained analytical and problem-solving skills through the problems posed in the course. There was no single simple way to solve the problems. | 2 | “[…,] there’s not a preset list of equations to use nor the ability to just plug in values and get an answer. We have to analyze and understand real work applications and that would vary from the formulas.”
New applications | One student noted that while they did not feel they learned much new information, the course encouraged multiple applications of the topics. | 1 | “Most of the content in this course is fairly intuitive to me and I was already familiar with a few of the topics, so it doesn’t feel like I’ve learned much new information. However, the course taught me many new applications of these topics that I’m glad I learned.”

Case 2

Three out of eight students in Case 2 noticed changes in learning strategies, employing different tactics when approaching a problem (e.g., creating visualizations in Excel to analyze information and solve a problem). One student said, “I find the visual relationship better to understand and have modified that to fit my calculus class as well [another quantitative class].”

Six students recognized that the instructor selected and applied suitable instructional strategies, displaying awareness of the instructor’s teaching and scaffolding strategies. They noted that the instructor promoted student-to-content engagement and sequenced content strategically. One student expressed, “[. . .] his [the instructor’s] methodology of teaching builds new material on top of the previous material,” an insight that was not detected in Case 1.

Five students also noticed the transferable skills they gained, displaying a change in their HOTS. These students believed they could transfer what they learned into real-life situations, both personally and professionally. As one student noted, “The class is great for project management positions. [. . .] Even just knowing the basics is a great baseline for understanding economics that would come up in future workloads.” Table 7 depicts common themes that emerged from open-ended responses by students in Case 2.

Table 7

Categories of Student Insights from Case 2

Category | Definition | Number of students (N2 = 8) | Example comment
Change in learning strategy | Students realized they had changed their learning strategy, such as by employing different approaches. | 3 | “I have found myself depending on excel to better understand this class. While a lot is based off just calculations, if you export the information to a table and populate it, I find the visual relationship better to understand and have modified that to fit my calculus class as well [which is another quantitative class].” “[. . . I] take different approaches when looking at a single problem.”
Helpful instructional guidance | Students also attested to the helpful guidance provided during instruction. Materials and prompts were clear and helped increase engagement. | 6 | “He [the instructor] was extremely helpful and has adapted to student feedback which I feel has made this class easier to learn and more engaging.” “[. . .] his methodology of teaching builds new material on top of the previous material.”
Transferable skills | Students noted the transferable skills they had gained. These skills are usable in personal life and/or a future career. | 5 | “It is good information [. . .] that I can adapt to my own financials to a greater extent than my professional career.” “The class is great for project management positions. Knowing how to research and create a proposal for a project seemed to be the focus of the class. Even just knowing the basics is a great baseline for understanding economics that would come up in future workloads.” “I think the learning skills from this course allows me to stay prepared in the real world and not fall behind when presented with new information.”

Discussion

We have conducted a formative evaluation of an LA implementation performed at the course level through the lens of SRL and HOTS. We used LMS usage data and an LA framework synthesized from existing literature. Results suggest an increase in effort regulation and critical thinking, but not in metacognition or reflection, in Case 2. Although metacognition and reflection did not increase in Case 2, we detected a decrease in these variables in Case 1. In terms of student perception, we discovered positive insights in both cases. Students in Case 2 expressed their insights more analytically (e.g., why the instructor’s guidance was helpful). They also noticed instructions were adjusted to match their learning experience (e.g., simpler topics were presented before more complex ones), indicating awareness of their own learning process and progress. We expand on these results in several discussion points below.

Robust Course Design as a Foundation

Students from Case 1 expressed positive perceptions about changes in their SRL and HOTS at the end of the semester, despite the absence of LA implementation. This may be due to course design and organization. For example, they noted that the course design incorporated instructional guidance. The students also thought learning tasks encouraged analytical thinking beyond simply inputting numbers into formulas. Both cases used similar course structures based on a QM course template developed by the university. Course content was segmented into modules that aligned learning outcomes to individual learning tasks and ordered by complexity. This suggests that well-designed instructional strategies and learning environments can be a foundation for enhancing student SRL and HOTS (Heong et al., 2011; Yen & Halili, 2015). Robust course design can therefore bootstrap effective LA implementation at the course level, allowing instructors to focus more on optimizing learning outcomes (Muljana et al., 2021; Muljana & Placencia, 2018). We intend to continue to practice robust course design in future course cohorts.

Ensuring the Mastery of Prerequisite Topics

Results suggest an increase in HOTS in the critical thinking domain among students in Case 2. This may have been influenced by the initial content sequencing and by instructional adjustments throughout the semester. Because students must apply prior knowledge to engage in critical thinking (Pintrich et al., 1991), instructors may consider ensuring that students master prerequisite topics. In this context, the instructor used a mid-module quiz in each module to assess whether students understood fundamental concepts. After reviewing quiz results, the instructor analyzed quiz-item difficulty to determine which topics to review or support with remedial materials. According to the existing literature, LA enables instructors to analyze and diagnose students’ current learning progress, struggles, and successes, thereby determining the interventions needed to help them achieve learning outcomes (Casey & Azcona, 2017; Dietz-Uhler & Hurn, 2013; Macfadyen & Dawson, 2010). We will continue to use mid-module quizzes in subsequent course cohorts to help the instructor ensure mastery of prerequisite topics.
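As a minimal illustration (not the actual analysis performed in this study), the kind of quiz-item difficulty review described above is often computed as the proportion of students answering each item correctly; items falling below a chosen cutoff point to topics that may warrant review or remedial material. The function names, data layout, and the 0.5 threshold here are all hypothetical:

```python
def item_difficulty(responses):
    """Proportion of students answering each quiz item correctly.

    responses: dict mapping item id -> list of 0/1 scores, one per student.
    Returns a dict mapping item id -> difficulty index in [0, 1]
    (lower means fewer students answered correctly).
    """
    return {item: sum(scores) / len(scores) for item, scores in responses.items()}


def topics_to_review(responses, threshold=0.5):
    """Flag items whose difficulty index falls below the threshold,
    suggesting the underlying topic may need review or remedial material."""
    return sorted(item for item, p in item_difficulty(responses).items() if p < threshold)
```

For example, if only one of four students answers an item correctly (index 0.25), that item's topic would be flagged, whereas an item answered correctly by three of four students (index 0.75) would not.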

The Role of Early and Formative Diagnoses

Implementing LA may have increased effort regulation in Case 2. From early in the semester, the instructor checked how long students accessed course materials and whether they clicked through course pages without reviewing material thoroughly. These data alerted the instructor to students who might have needed learning strategy tips to regulate their efforts in reviewing course materials in the LMS. Furthermore, students who accessed the course less regularly received an email reminding them to access and review materials in the LMS on a regular basis. According to Kim et al. (2016) and You (2016), analyzing LMS usage data early in the semester can help instructors forecast students’ course access habits.
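A rough sketch of this kind of early-semester screening follows. It assumes the LMS can export per-student session durations; the function name, data format, and both thresholds are illustrative placeholders, not the criteria the instructor actually used:

```python
def flag_for_reminder(access_log, min_sessions=3, min_minutes=30):
    """Identify students whose early-semester LMS activity suggests
    they may benefit from a reminder email.

    access_log: dict mapping student id -> list of session durations
    in minutes. A student is flagged if they logged fewer than
    `min_sessions` sessions or spent less than `min_minutes` total
    on course materials.
    """
    flagged = []
    for student, sessions in access_log.items():
        if len(sessions) < min_sessions or sum(sessions) < min_minutes:
            flagged.append(student)
    return sorted(flagged)
```

In practice, such a rule would run against a few weeks of LMS log data, and flagged students would receive the kind of reminder email described above rather than any automated intervention.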

Formative diagnosis in Case 2 may have additionally influenced students’ SRL. Formative analysis of LMS data (e.g., the frequency and duration of students’ access to course materials, the timing, early or late, of assignment submissions, and topic difficulty derived from test report data) can support an instructor’s decisions about when to adjust learning conditions based on students’ progress (Casey & Azcona, 2017; Dietz-Uhler & Hurn, 2013; Macfadyen & Dawson, 2010). In this context, the instructor continuously scaffolded students by adjusting instruction and providing remedial materials and reviews as needed. Consulting LA data throughout the learning process can help instructors find appropriate strategies to proactively help students perform better (Yen et al., 2015). As expected, students in Case 2 expressed their perceptions more analytically and in more depth, and were aware when the instructor provided deliberate instructional guidance. This resonates with suggestions by Winne (2011, 2017) and Gašević et al. (2016) highlighting the important role of instructional guidance. Based on this finding, we will continue to implement analytical diagnoses and provide appropriate instructional guidance and conditions in future course cohorts.

Facilitating Metacognitive and Reflective Learning Activities

Metacognitive self-regulation did not change in Case 2, nor did the LMS data appear to capture metacognitive learning activities. In a future course redesign, we plan to ask students to describe their learning habits and strategies during an ice-breaker discussion and to explain how they overcome challenges while learning. This will provide the instructor with an overview of students’ awareness, knowledge, and control of their cognition before determining suitable instructional conditions to scaffold metacognition (Gašević et al., 2016; Winne, 2017).

Results suggested the reflective-thinking domain of HOTS did not improve, which may be due to the absence of a reflective assignment. While instructional strategies and a well-designed learning environment can promote HOTS (Heong et al., 2011; Yen & Halili, 2015), students need specific instructions to reflect upon their learning (Kember et al., 2000). We, therefore, plan to add a reflective assignment for the next course design iteration.

Future Work

Our findings highlight how LA may have influenced students’ SRL and HOTS in two different course sessions. We recognize that the small number of participants makes it difficult to draw robust comparisons or to test for statistically significant differences in SRL or HOTS attributable to our LA implementation. A future study may therefore include a larger sample and use an experimental design to investigate such impacts. We would also consider adding alternative data points, such as interviews or focus groups, to enrich our data sources and potentially reveal additional considerations when using LA to support SRL and HOTS.

The findings also give insight into iterative course design. We plan to redesign the course, adjust some assignments, and implement all three phases of LA: (1) early diagnosis at the start of the semester; (2) formative diagnosis throughout the semester; and (3) summative diagnosis conducted partly during and at the end of the semester (as shown in Table 1).

Because the practice of translating LA data into actionable interventions at the course level is still emerging (van Leeuwen, 2019), these findings suggest the potential of using LA to foster SRL and HOTS. We, therefore, encourage scholars, instructors, and instructional designers to test the LA framework synthesized from the existing literature for research and teaching purposes to expand the current body of literature at the intersection of LA, SRL, and HOTS.


Asakereh, A., & Yousofi, N. (2018). Reflective thinking, self-efficacy, self-esteem and academic achievement of Iranian EFL students. International Journal of Educational Psychology, 7(1), 68-89.

Binbarasan-Tüysüzoğlu, B., & Greene, J. (2015). An investigation of the role of contingent metacognitive behavior in self-regulated learning. Metacognition and Learning, 10(1), 77–98.

Broadbent, J. (2017). Comparing online and blended learner’s self-regulated learning strategies and academic performance. Internet and Higher Education, 33, 24–32.

Brown, A. L., Ash, D., Rutherford, M., Nakagawa, K., Gordon, A., & Campione, J. C. (1993). Distributed expertise in the classroom. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 188–228). Cambridge University Press.

Casey, K., & Azcona, D. (2017). Utilizing student activity patterns to predict performance. International Journal of Educational Technology in Higher Education, 14(1), Article 4.

Cho, M. H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34(3), 290–301.

Credé, M., & Kuncel, N. R. (2008). Study habits, skills, and attitude: The third pillar supporting collegiate academic performance. Perspectives on Psychological Science, 3(6), 425-453.

Dabas, C. S., Muljana, P. S., & Luo, T. (2021). Female students in quantitative courses: An exploration of their motivational sources, learning strategies, learning behaviors, and course achievement. Technology Knowledge and Learning. Advance online publication.

Dietz, B., Hurn, J. E., Mays, T. A., & Woods, D. (2018). An introduction to learning analytics. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (pp. 104–111). Pearson.

Dietz-Uhler, B., & Hurn, J. (2013). Using learning analytics to predict (and improve) student success: A faculty perspective. Journal of Interactive Online Learning, 12(1), 17–26.

Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic success. Internet and Higher Education, 28, 68–84.

Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71.

Ghanizadeh, A. (2017). The interplay between reflective thinking, critical thinking, self-monitoring, and academic achievement in higher education. Higher Education, 74(1), 101–114.

Ghanizadeh, A., & Jahedizadeh, S. (2017). Validating the Persian version of Reflective Thinking Questionnaire and probing Iranian university students’ reflective thinking and academic achievement. International Journal of Instruction, 10(3), 209–226.

Hadwin, A., & Oshige, M. (2011). Self-regulation, coregulation, and socially shared regulation: Exploring perspectives of social in self-regulated learning theory. Teachers College Record, 113(2), 240–264.

Hederich-Martínez, C., López-Vargas, O., & Camargo-Uribe, A. (2016). Effects of the use of a flexible metacognitive scaffolding on self-regulated learning during virtual education. International Journal of Technology Enhanced Learning, 8(3–4), 199–216.

Heong, M. H., Othman, W. B., Jailani, B. M. Y., Kiong, T. T., Hassan, R. B., & Mohamad, M. M. B. (2011). The level of higher order thinking skills among technical education students. International Journal of Social Science and Humanity, 1(2), 121–125.

Ifenthaler, D., & Widanapathirana, C. (2014). Development and validation of a learning analytics framework: Two case studies using support vector machines. Technology, Knowledge and Learning, 19(1–2), 221–240.

Isaacson, R. M., & Fujita, F. (2006). Metacognitive knowledge monitoring and self-regulated learning: Academic success and reflections on learning. Journal of the Scholarship of Teaching and Learning, 6(1), 39-55.

Karabenick, S. A., & Zusho, A. (2015). Examining approaches to research on self-regulated learning: conceptual and methodological considerations. Metacognition and Learning, 10(1), 151-163.

Kember, D., Leung, D. Y. P., Jones, A., Loke, A. Y., Mckay, J., Sinclair, K., Tse, H., Webb, C., Wong, F. K. Y., Wong, M., & Yeung, E. (2000). Development of a questionnaire to measure the level of reflective thinking. Assessment and Evaluation in Higher Education, 25(4), 381-395.

Kim, D., Park, Y., Yoon, M., & Jo, I. H. (2016). Toward evidence-based learning analytics: Using proxy variables to improve asynchronous online discussion environments. Internet and Higher Education, 30, 30–43.

Kim, K. J., & Jang, H. W. (2015). Changes in medical students’ motivation and self-regulated learning: A preliminary study. International Journal of Medical Education, 6, 213–215.

Koh, K. H., Tan, C., & Ng, P. T. (2012). Creating thinking schools through authentic assessment: The case in Singapore. Educational Assessment, Evaluation and Accountability, 24(2), 135-149.

Koh, W., Jonathan, C., & Tan, J. P. -L. (2019). Exploring conditions for enhancing critical thinking in networked learning: Findings from a secondary school learning analytics environment. Education Sciences, 9, 1-16.

Lee, J., & Choi, H. (2017). What affects learner’s higher-order thinking in technology enhanced learning environments? The effects of learner factors. Computers & Education, 115, 143–152.

Lewis, A., & Smith, D. (1993). Defining higher-order thinking. Theory into Practice, 32(3), 131-137.

Li, S., Du, H., Xing, W., Zheng, J., Chen, G., & Xie, C. (2020). Examining temporal dynamics of self-regulated learning behaviors in STEM learning: A network approach. Computers & Education, 158, Article 103987.

Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588–599.

MacQueen, K. M., McLellan-Lemal, E., Bartholow, K., & Milstein, B. (2008). Team-based codebook development: Structure, process, and agreement. In G. Guest & K. M. MacQueen (Eds.), Handbook for team-based qualitative research (pp. 119-135). AltaMira Press.

Mainali, B. P. (2012). Higher order thinking in education. Academic Voices: A Multidisciplinary Journal, 2(1), 5-10.

Maki, R. H. (1995). Accuracy of metacomprehension judgments for questions of varying importance levels. American Journal of Psychology, 108(3), 327-344.

Molenaar, I., & Knoop-van Campen, C. A. (2018). How teachers make dashboard information actionable. IEEE Transactions on Learning Technologies, 12(3), 347-355.

Muljana, P. S., & Luo, T. (2021). Utilizing learning analytics in course design: Voices from instructional designers in higher education. Journal of Computing in Higher Education, 33(1), 206-234.

Muljana, P. S., & Placencia, G. (2018). Learning analytics: Translating data into “just-in-time” interventions. Scholarship of Teaching and Learning, Innovative Pedagogy, 1(1), 50–69.

Muljana, P. S., Placencia, G., & Luo, T. (2021). Applying a learning-analytics approach to improve course achievement: Using data stored in Learning Management Systems. In P. Maki & P. G. Shea (Eds.), Transforming digital learning and assessment: A guide to available and emerging practices to building institutional consensus (pp. 143–179). Stylus Publishing, LLC.

Nevgi, A. (2002). Measurement of learning strategies: Creating a self-rating tool for students of virtual university. In H. Niemi & P. Ruohotie (Eds.), Theoretical Understanding for Learning in Virtual University (pp. 203–228). University of Tampere: Research Centre for Vocational Education.

Newman, F. M. (1990). Higher order thinking in teaching social studies: A rationale for the assessment of classroom thoughtfulness. Journal of Curriculum Studies, 22, 41-56.

Newman, I., & Hitchcock, J. H. (2011). Underlying agreements between quantitative and qualitative research: the short and tall of it all. Human Resource Development Review, 10(4), 381-398.

Nieveen, N., & Folmer, E. (2013). Formative evaluation in educational design research. In T. Plomp & N. Nieveen (Eds.), Educational design research Part A: An introduction (pp. 152–169). Netherlands Institute for Curriculum Development.

Nouri, J., Ebner, M., Ifenthaler, D., Saqr, M., Malmberg, J., Khalil, M., … Berthelsen, U. D. (2019). Efforts in Europe for data-driven improvement of education – A review of learning analytics research in seven countries. International Journal of Learning Analytics and Artificial Intelligence for Education, 1(1), 8-27.

Panadero, E. (2017). A review of self-regulated learning: Six models and four directions for research. Frontiers in Psychology, 8, Article 422.

Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33–40.

Pintrich, P. R., Smith, D. A. F., García, T., & McKeachie, W. J. (1991). A manual for the use of the motivated strategies for learning questionnaire. University of Michigan, National Center for Research to Improve Postsecondary Teaching and Learning.

Quellmalz, E. S. (1985). Needed: Better methods for testing higher-order thinking skills. Educational Leadership, 43(2), 29-35.

Resnick, L. (1987). Education and learning to think. National Academy Press.

Richardson, M., Abraham, C., & Bond, R. (2012). Psychological correlates of university students’ academic performance: A systematic review and meta-analysis. Psychological Bulletin, 138(2), 353–387.

Roll, I., & Winne, P. H. (2015). Understanding, evaluating, and supporting self-regulated learning using learning analytics. Journal of Learning Analytics, 2(1), 7–12.

Rotherham, A. J., & Willingham, D. T. (2010). “21st-century” skills: Not new, but a worthy challenge. American Educator, 17, 17–20.

Safari I., Davaribina M., & Khoshnevis I. (2020). The influence of EFL teachers’ self-efficacy, job satisfaction and reflective thinking on their professional development: A structural equation modeling. Journal on Efficiency and Responsibility in Education and Science, 13(1), 27-40.

Saldaña, J. (2013). The coding manual for qualitative researchers. Sage.

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 31–40.

Silva, E. (2009). Measuring skills for 21st-century learning. Phi Delta Kappan, 90(9), 630–634.

Stegers-Jager, K. M., Cohen-Schotanus, J., & Themmen, A. P. N. (2012). Motivation, learning strategies, participation and medical school performance. Medical Education, 46(7), 678-688.

Stetler, C. B., Legro, M. W., Wallace, C. M., Bowman, C., Guihan, M., Hagedorn, H., . . . & Smith, J. L. (2006). The role of formative evaluation in implementation research and the QUERI experience. Journal of General Internal Medicine, 21(2), S1–S8.

Tabuenca, B., Kalz, M., Drachsler, H., & Specht, M. (2015). Time will tell: The role of mobile learning analytics in self-regulated learning. Computers & Education, 89, 53–74.

Tanujaya, B., Mumu, J., & Margono, G. (2017). The relationship between higher order thinking skills and academic performance of student in mathematics instruction. International Education Studies, 10(11), 78–85.

Tong, F., Guo, H., Wang, Z., Min, Y., Guo, W., & Yoon, M. (2020). Examining cross-cultural transferability of self-regulated learning model: An adaptation of the Motivated Strategies for Learning Questionnaire for Chinese adult learners. Educational Studies, 46(4), 422–439.

Tsingos-Lucas, C., Bosnic-Anticevich, S., Schneider, C. R., & Smith, L. (2016). The effect of reflective activities on reflective thinking ability in an undergraduate pharmacy curriculum. American Journal of Pharmaceutical Education, 80(4).

van Horne, S., Curran, M., Smith, A., VanBuren, J., Zahrieh, D., Larsen, R., & Miller, R. (2017). Facilitating student success in introductory chemistry with feedback in an online platform. Technology, Knowledge and Learning, 23, 21–40.

van Leeuwen, A. (2019). Teachers’ perceptions of the usability of learning analytics reports in a flipped university course: When and how does information become actionable knowledge? Educational Technology Research and Development, 67(5), 1043–1064.

Viberg, O., Khalil, M., & Baars, M. (2020). Self-regulated learning and learning analytics in online learning environments: A review of empirical research. In Proceedings of the 10th International Conference on Learning Analytics & Knowledge (LAK’20) (pp. 524–533).

Winne, P. H. (2011). A cognitive and metacognitive analysis of self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Handbook of Self-regulation of Learning and Performance (pp. 15–32). Routledge.

Winne, P. H. (2017). Learning Analytics for Self-Regulated Learning. In C. Lang, G. Siemens, A. W., & D. Gašević (Eds.), Handbook of Learning Analytics (1st ed., pp. 241–249). Society for Learning Analytics Research.

Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53-69.

Yamada, M., Goda, Y., Matsuda, T., Saito, Y., Kato, H., & Miyagawa, H. (2016). How does self-regulated learning relate to active procrastination and other learning behaviors? Journal of Computing in Higher Education, 28(3), 326–343.

Yamada, M., Shimada, A., Okubo, F., Oi, M., Kojima, K., & Ogata, H. (2017). Learning analytics of the relationships among self-regulated learning, learning behaviors, and learning performance. Research and Practice in Technology Enhanced Learning, 12(1), Article 13.

Yen, C. H., Chen, I., Lai, S. C., & Chuang, Y. R. (2015). An analytics-based approach to managing cognitive load by using log data of Learning Management Systems and footprints of social media. Journal of Educational Technology & Society, 18(4), 141–158.

Yen, T. S., & Halili, S. H. (2015). Effective teaching of higher thinking (HOT) in education. The Online Journal of Distance and e-Learning, 3(2), 41–47.

Yin, R. K. (2008). Case study research: Design and methods. Sage.

You, J. W. (2016). Identifying significant indicators using LMS data to predict course achievement in online learning. Internet and Higher Education, 29, 23–30.

Zhang, Y., & Chan, K. K. (2020). Infusing visual analytics technology with business education: An exploratory investigation in fostering higher-order thinking in China. Innovations in Education and Teaching International, 58(5), 586–595.

Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81(3), 329–339.

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41(2), 64–70.

Zohar, A., & Dori, Y. J. (2003). Higher order thinking skills and low-achieving students: Are they mutually exclusive? The Journal of the Learning Sciences, 12(2), 145–181.

Pauline S. Muljana

Old Dominion University

Pauline Salim Muljana has 12 years of instructional design experience and is currently a Ph.D. candidate in Instructional Design and Technology at Old Dominion University. Her research interests center on investigations of how data-informed analytics informs instructional design to foster learning behaviors and strategies associated with successful learning.
Tian Luo

Old Dominion University

Tian Luo is an Associate Professor of Instructional Design and Technology at Old Dominion University. She earned her Ph.D. in Instructional Technology from Ohio University. Her research interests center on designing and integrating social media and various forms of emerging technologies to support teaching and learning.
Greg Placencia

California State Polytechnic University

Dr. Greg Placencia is an Assistant Professor of Industrial and Manufacturing Engineering at the California State Polytechnic University, Pomona. He specializes in human–system interaction/integration. He has researched human trafficking, healthcare, and education. Greg received a B.S. in Computer Science and a Ph.D. in Industrial and Systems Engineering from the University of Southern California.

This content is provided to you freely by The Journal of Applied Instructional Design.
