Challenge 6: Literature Review

by Michelle Starcher

Self-Regulation within Game-Based Learning Environments

In today’s digital world, many people spend their leisure time playing video games. The popularity of video games among children and adolescents has prompted many to consider how digital games can be used to promote learning in educational settings (Erhel & Jamet, 2013). Researchers have noted several benefits of using game-based learning (GBL) in the classroom, including increased motivation, engagement, and learning achievement (Erhel & Jamet, 2013). However, the same characteristics that encourage interest and motivation may also create distractions that draw attention away from learning (Sabourin et al., 2013). Therefore, it is important to consider how digital games support students and their self-regulated learning.

Self-regulated learning (SRL) is defined as “thoughts, feelings, actions, and behaviors generated by learners to systematically and intentionally attain their goals” (Huh & Reigeluth, 2017). SRL models include strategic, metacognitive, and motivational components that impact student learning, academic performance, and the ability to regulate, monitor, and control strategy use (Nietfeld et al., 2014). Researchers have noted that SRL is important in the development of lifelong learning skills (Zimmerman, 2002) and a major competency for 21st-century learners (Wolters, 2010). While most students utilize SRL strategies to some extent, they often struggle to apply these strategies in open-ended learning environments where goals and expectations are not clearly outlined (Sabourin et al., 2013).

Research targeting SRL within game-based learning environments remains limited. The goal of this literature review is to gain a better understanding of how game-based learning supports the development of SRL strategies in K-12 education. Specifically, this paper investigates how digital game-based learning environments impact student learning and SRL-related behaviors such as motivation, engagement, self-efficacy, and flow.

Literature Review

Self-Efficacy

Multiple research studies have used Crystal Island, a game-based learning environment, to investigate how gameplay impacts student learning, SRL behaviors, and self-efficacy (Meluso et al., 2012; Nietfeld et al., 2014; Sabourin et al., 2013). In all three studies, students were given pre- and post-tests that included items designed to assess science self-efficacy and science content knowledge. In the study by Meluso et al. (2012), 100 fifth graders (45 boys and 55 girls) played Crystal Island after being exposed to a landform curriculum. Participants demonstrated significant increases in both science self-efficacy and science content knowledge after playing Crystal Island; however, the playing condition did not influence the change in either measure.

Sabourin et al. (2013) conducted a study involving 296 middle school students. While playing Crystal Island, students were prompted to report on their mood and status. Although they were not explicitly asked about their goals or progress, many students included this information in their status updates. Information gleaned from the status updates was used to classify students into low, medium, and high self-regulated learning behavior groups. Results indicated that students’ SRL scores were significantly predictive of their post-test scores. An ANOVA revealed a significant difference in learning gains between the SRL groups: High and Medium SRL students showed significantly greater learning gains than Low SRL students. However, a chi-squared analysis indicated that the percentage of students who solved the game’s mystery did not differ significantly by SRL group (Sabourin et al., 2013). There were also differences in the in-game resources students used. High SRL students made more use of the curricular resources and took more notes than the lower SRL students, while Medium and Low SRL students ran significantly more tests. Heavy use of the testing device suggests that students were gaming the system or had not formed a good hypothesis in advance (Sabourin et al., 2013). Recognizing and scaffolding Low SRL students is therefore important for improving science learning.

Nietfeld et al. (2014) sought to understand the influence of SRL within Crystal Island. Comparisons of content knowledge pre- and post-assessments revealed a significant increase in scores. In addition, students who solved the mystery were compared with those who did not to determine whether any SRL-related variables were associated with content knowledge. Students who solved the mystery scored significantly better on the content knowledge post-assessment and reported significantly higher levels of interest and self-efficacy. Three major facets of SRL were found to be significant predictors of performance: strategy use, motivation (self-efficacy), and metacognitive monitoring. The study’s findings suggest that SRL skills were essential components of student success in Crystal Island.

Flow State and Competition

When a player’s skills match the challenge level within the gaming environment, flow states emerge and players become more engaged (Chen & Sun, 2016). To achieve this flow state, game activities must have precise goals and provide explicit feedback (Chen & Sun, 2016). Chen and Sun (2016) aimed to better understand the influence of SRL during flow states and how SRL can be determined from game data. Participants (266 eighth-grade students) played Music Flow and responded to in-game questions regarding their flow state at the end of each level of play. Most participants were found to have strong self-regulating abilities. As players chose more difficult levels, their flow distances increased, indicating greater anxiety. Results indicated that participants’ ability to self-regulate influenced their flow state as they progressed to higher levels. Furthermore, participants with high self-regulation scores had significantly higher flow distances.

Chen et al. (2018) investigated the impact of competition on students’ flow experiences and learning behaviors compared with non-competitive game environments. Results indicated a significant difference in learning achievement between the competitive and non-competitive conditions, with the non-competitive group performing significantly better than the competitive group. However, there was no significant difference between the two groups in flow experience. Students in the competitive group relied on surface learning and seldom sought help during gameplay, focusing instead on necessary movements. Students in the non-competitive group spent most of their time exploring the problem space and seeking solutions, and they used the support tools more often than the competitive group. The competitive group tended to focus more on their ranking than on their learning, whereas students in the non-competitive group were able to focus on learning without time constraints or peer pressure.

In 2019, Chen conducted a study to explore how different contexts of gameplay affected learning outcomes, intrinsic motivation, and patterns of learning behavior. Three contexts of gameplay were explored: individual, collaborative, and competitive. Results indicated that both collaborative and competitive modes of GBL improved conceptual learning of science concepts among seventh-grade students. Furthermore, collaborative gameplay increased intrinsic motivation and lowered tension when compared to individual play.

Feedback and Question Prompts

Erhel and Jamet (2013) conducted two experiments to investigate the impact of instructions and feedback on motivation and learning effectiveness. Two types of instruction were used in the experiments: learning and entertainment. The second experiment explored whether knowledge of correct response (KCR) feedback in digital game-based learning (DGBL) quizzes influenced the type of learning strategies induced by the instructions (entertainment or learning). Participants were provided feedback on the accuracy of their responses: in response to each answer, a window opened with a message of either “right answer” or “wrong answer” plus the correct response. As with many of the studies discussed in this literature review, a pre- and post-assessment was used to measure learning gains. The type of instruction had no significant impact on performance goals. However, results indicated that KCR feedback promotes deep cognitive processing during learning when coupled with entertainment instruction, and that adding feedback to DGBL environments enhances memorization. According to the study results, entertainment instruction made participants less afraid of failure, leading to the adoption of more effective learning strategies.

Law and Chen (2016) examined the effects of question prompt and feedback types on learning outcomes within a game-based learning environment. Two types of question prompts (knowledge and application) and two types of feedback (knowledge of correct response and elaborated response) served as independent variables; the dependent variable was learning performance. Knowledge prompts guide learners to reflect on facts, concepts, and processes, while application prompts require students to use and apply knowledge in a new situation. Unlike KCR feedback, elaborated response (ER) feedback provides an explanation of why the correct answer is the best choice and why the other answers are not correct. Results suggested that the two types of question prompts may support student learning in different ways: students who received knowledge prompts performed better than students who were given application prompts. However, the study did not find significant differences between the types of feedback, despite previous research showing that ER feedback was more effective (Law & Chen, 2016). One reason for this discrepancy could be interaction effects between feedback types and question prompts. Law and Chen (2016) found that the combination of knowledge prompts and elaborated feedback supported learning better than the other prompt-feedback combinations.

Limitations and Future Research

As noted earlier, research targeting SRL within game-based learning environments remains limited. In addition, many of the studies discussed in this literature review were authored or co-authored by Chen, so additional sources should be consulted for future reviews. Most of the studies cited here support GBL as a viable method for improving student achievement. However, more research is needed to understand how game-based learning supports the development of SRL strategies and how SRL strategies can be used to improve GBL environments. Future research could also examine question prompts beyond the knowledge and application types and how question prompts and feedback impact game performance and learning.

In addition, more research is needed to better understand how collaborative gameplay impacts student learning and the development of SRL. The two studies that investigated collaborative versus individual gameplay had opposing results, with one finding no significant difference between playing conditions (Meluso et al., 2012) and the other finding more positive results for collaborative gameplay (Chen, 2019). Anonymous competition could also better promote learning, induce positive motivational outcomes, and facilitate meaningful cognitive engagement. Future research needs to examine the effects of competition on interest, self-efficacy, and frustration in GBL environments.

References

Chen, C.-H. (2019). The impacts of peer competition-based science gameplay on conceptual knowledge, intrinsic motivation and learning behavioral patterns. Educational Technology Research and Development, 67, 179-198.

Chen, C.-H., Liu, J.-H., & Shou, W.-C. (2018). How competition in a game-based science learning environment influences students’ learning achievement, flow experience, and learning behavioral patterns. Educational Technology & Society, 21(2), 164-176.

Chen, L.-X., & Sun, C.-T. (2016). Self-regulation influence on game play flow state. Computers in Human Behavior, 54, 341-350.

Erhel, S., & Jamet, E. (2013). Digital game-based learning: Impact of instructions and feedback on motivation and learning effectiveness. Computers & Education, 67, 156-167.

Huh, Y., & Reigeluth, C. M. (2017). Self-regulated learning: The continuous-change conceptual framework and a vision of new paradigm, technology system, and pedagogical support. Journal of Educational Technology Systems, 46(2), 191-214.

Law, V., & Chen, C.-H. (2016). Promoting science learning in game-based learning with question prompts and feedback. Computers & Education, 103, 134-143.

Meluso, A., Zheng, M., Spires, H. A., & Lester, J. (2012). Enhancing 5th graders’ science content knowledge and self-efficacy through game-based learning. Computers & Education, 59, 497-504.

Nietfeld, J. L., Shores, L. R., & Hoffmann, K. F. (2014). Self-regulation and gender within a game-based learning environment. Journal of Educational Psychology, 106(4), 961-973.

Sabourin, J. L., Shores, L. R., Mott, B. W., & Lester, J. C. (2013). Understanding and predicting student self-regulated learning strategies in game-based learning environments. International Journal of Artificial Intelligence in Education, 23, 94-114.

Wolters, C. A. (2010). Self-regulated learning and 21st century competencies. Retrieved from http://www7.nationalacademies.org/DBASSE/Wolters_Self_Regulated_Learning_Paper.pdf

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory Into Practice, 41, 64-70. Retrieved from http://www.jstor.org/stable/1477457