
45 Survey Questions to Understand Student Engagement in Online Learning

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online versus face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.

Download Toolkit: 9 Virtual Learning Resources to Engage Students, Families, and Staff

45 Questions to Understand Student Engagement in Online Learning

For students (grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc.) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?
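Scaled items like those above are typically administered on a Likert scale and summarized as the share of favorable responses per item or topic. A minimal sketch of that summary step (the 1-5 scale and the "favorable" cutoff of 4 are illustrative assumptions, not Panorama's actual scoring rules):

```python
# Sketch: summarizing Likert-scale engagement items.
# The 1-5 scale and the "favorable" cutoff of 4 are illustrative assumptions.

def favorable_rate(responses, favorable_min=4):
    """Share of responses at or above the favorable cutoff on a 1-5 scale."""
    if not responses:
        return 0.0
    return sum(1 for r in responses if r >= favorable_min) / len(responses)

# Hypothetical responses to item 1 ("How excited are you about going to your classes?")
item_1 = [5, 4, 3, 2, 4, 5, 1, 4]
print(favorable_rate(item_1))  # 5 of 8 responses are a 4 or 5 -> 0.625
```

Open-ended items (e.g. "What do you like about school right now?") are instead read qualitatively or coded into themes.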

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.
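Mechanically, the disaggregation described here is a group-by over response records. A minimal sketch with pandas (the column names and data are hypothetical, not Panorama's export format):

```python
import pandas as pd

# Hypothetical survey export: one row per student response on an engagement item.
df = pd.DataFrame({
    "school": ["North", "North", "South", "South", "South"],
    "grade":  [6, 7, 6, 7, 8],
    "score":  [4, 3, 5, 2, 4],  # 1-5 Likert responses
})

# Disaggregate the mean score by school and grade to surface priority areas.
summary = df.groupby(["school", "grade"])["score"].mean()
print(summary)
```

The same pattern extends to any demographic column present in the data (e.g. grouping by a demographic-group column instead of grade).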

To learn more about Panorama's survey platform, get in touch with our team.

Related Articles

Engaging Your School Community in Survey Results (Q&A Ep. 4)

Learn how to engage principals, staff, families, and students in the survey results when running a stakeholder feedback program around school climate.

La Cañada Shares Survey Results

La Cañada Unified School District, Panorama's first client, shares results from its surveys, used to collect feedback from students, families, and staff.

44 Questions to Ask Students, Families, and Staff During the Pandemic

Identify ways to support students, families, and staff in your school district during the pandemic with these 44 questions.


  • Review article
  • Open access
  • Published: 22 January 2020

Mapping research in student engagement and educational technology in higher education: a systematic evidence map

  • Melissa Bond (ORCID: orcid.org/0000-0002-8267-031X),
  • Katja Buntins,
  • Svenja Bedenlier,
  • Olaf Zawacki-Richter &
  • Michael Kerres

International Journal of Educational Technology in Higher Education, volume 17, Article number: 2 (2020)


Digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience. It has also been linked to an increase in behavioural, affective and cognitive student engagement, the facilitation of which is a central concern of educators. In order to delineate the complex nexus of technology and student engagement, this article systematically maps research from 243 studies published between 2007 and 2016. Research within the corpus was predominantly undertaken within the United States and the United Kingdom, with only limited research undertaken in the Global South, and largely focused on the fields of Arts & Humanities, Education, and Natural Sciences, Mathematics & Statistics. Studies most often used quantitative methods, followed by mixed methods, with qualitative methods rarely employed. Few studies provided a definition of student engagement, and fewer than half were guided by a theoretical framework. The courses investigated used blended learning and text-based tools (e.g. discussion forums) most often, with undergraduate students as the primary target group. Stemming from the use of educational technology, behavioural engagement was by far the most often identified dimension, followed by affective and cognitive engagement. This mapping article provides the grounds for further exploration into discipline-specific use of technology to foster student engagement.

Introduction

Over the past decade, the conceptualisation and measurement of ‘student engagement’ has received increasing attention from researchers, practitioners, and policy makers alike. Seminal works such as Astin’s (1999) theory of involvement, Fredricks, Blumenfeld, and Paris’s (2004) conceptualisation of the three dimensions of student engagement (behavioural, emotional, cognitive), and sociocultural theories of engagement such as Kahu (2013) and Kahu and Nelson (2018), have done much to shape and refine our understanding of this complex phenomenon. However, criticism about the strength and depth of student engagement theorising remains (e.g. Boekaerts, 2016; Kahn, 2014; Zepke, 2018), the quality of which has had a direct impact on the rigour of subsequent research (Lawson & Lawson, 2013; Trowler, 2010), prompting calls for further synthesis (Azevedo, 2015; Eccles, 2016).

In parallel to this increased attention on student engagement, digital technology has become a central aspect of higher education, inherently affecting all aspects of the student experience (Barak, 2018; Henderson, Selwyn, & Aston, 2017; Selwyn, 2016). International recognition of the importance of ICT skills and digital literacy has been growing, alongside mounting recognition of its importance for active citizenship (Choi, Glassman, & Cristol, 2017; OECD, 2015a; Redecker, 2017) and the development of interdisciplinary and collaborative skills (Barak & Levenberg, 2016; Oliver & de St Jorre, 2018). Using technology has the potential to make teaching and learning processes more intensive (Kerres, 2013), improve student self-regulation and self-efficacy (Alioon & Delialioğlu, 2017; Bouta, Retalis, & Paraskeva, 2012), increase participation and involvement in courses as well as the wider university community (Junco, 2012; Salaber, 2014), and predict increased student engagement (Chen, Lambert, & Guidry, 2010; Rashid & Asghar, 2016). There is, however, no guarantee of active student engagement as a result of using technology (Kirkwood, 2009), with Tamim, Bernard, Borokhovski, Abrami, and Schmid’s (2011) second-order meta-analysis finding only a small to moderate impact on student achievement across 40 years. Rather, careful planning, sound pedagogy and appropriate tools are vital (Englund, Olofsson, & Price, 2017; Koehler & Mishra, 2005; Popenici, 2013), as “technology can amplify great teaching, but great technology cannot replace poor teaching” (OECD, 2015b, p. 4).

Due to its complexity, educational technology research has struggled to find a common definition and terminology with which to talk about student engagement, which has resulted in inconsistency across the field. For example, whilst 77% of articles reviewed by Henrie, Halverson, and Graham (2015) operationalised engagement from a behavioural perspective, most did not clearly define engagement, which is no longer considered acceptable in student engagement research (Appleton, Christenson, & Furlong, 2008; Christenson, Reschly, & Wylie, 2012). Educational technology research has likewise lacked theoretical guidance (Al-Sakkaf, Omar, & Ahmad, 2019; Hew, Lan, Tang, Jia, & Lo, 2019; Lundin, Bergviken Rensfeldt, Hillman, Lantz-Andersson, & Peterson, 2018). A review of 44 random articles published in 2014 in the journals Educational Technology Research & Development and Computers & Education, for example, revealed that more than half had no guiding conceptual or theoretical framework (Antonenko, 2015), and only 13 out of 62 studies in a systematic review of flipped learning in engineering education reported theoretical grounding (Karabulut-Ilgu, Jaramillo Cherrez, & Jahren, 2018). Therefore, calls have been made for a greater understanding of the role that educational technology plays in affecting student engagement, in order to strengthen teaching practice and lead to improved outcomes for students (Castañeda & Selwyn, 2018; Krause & Coates, 2008; Nelson Laird & Kuh, 2005).

A reflection upon prior research in the field is a necessary first step towards meaningful discussion of how to foster student engagement in the digital age. In support of this aim, this article provides a synthesis of student engagement theory and research, and systematically maps empirical higher education research on student engagement and educational technology published between 2007 and 2016. Synthesising the vast body of literature on student engagement (for previous literature and systematic reviews, see Additional file 1), this article develops “a tentative theory” in the hopes of “plot[ting] the conceptual landscape…[and chart] possible routes to explore it” (Antonenko, 2015, pp. 57–67) for researchers, practitioners, learning designers, administrators and policy makers. It then discusses student engagement against the background of educational technology research, exploring prior literature and systematic reviews. The systematic review search method is then outlined, followed by the presentation and discussion of findings.

Literature review

What is student engagement?

Student engagement has been linked to improved achievement, persistence and retention (Finn, 2006; Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008), with disengagement having a profound effect on student learning outcomes and cognitive development (Ma, Han, Yang, & Cheng, 2015), and being a predictor of student dropout in both secondary school and higher education (Finn & Zimmer, 2012). Student engagement is a multifaceted and complex construct (Appleton et al., 2008; Ben-Eliyahu, Moore, Dorph, & Schunn, 2018), which some have called a ‘meta-construct’ (e.g. Fredricks et al., 2004; Kahu, 2013), and likened to blind men describing an elephant (Baron & Corbin, 2012; Eccles, 2016). There is ongoing disagreement about whether there are three components (e.g. Eccles, 2016)—affective/emotional, cognitive and behavioural—or four, with the recently suggested additions of agentic engagement (Reeve, 2012; Reeve & Tseng, 2011) and social engagement (Fredricks, Filsecker, & Lawson, 2016). There has also been confusion as to whether the terms ‘engagement’ and ‘motivation’ can and should be used interchangeably (Reschly & Christenson, 2012), especially when used by policy makers and institutions (Eccles & Wang, 2012). However, the prevalent understanding across the literature is that motivation is an antecedent to engagement; it is the intent and unobservable force that energises behaviour (Lim, 2004; Reeve, 2012; Reschly & Christenson, 2012), whereas student engagement is energy and effort in action: an observable manifestation (Appleton et al., 2008; Eccles & Wang, 2012; Kuh, 2009; Skinner & Pitzer, 2012), evidenced through a range of indicators.

Whilst it is widely accepted that no one definition exists that will satisfy all stakeholders (Solomonides, 2013), and no one project can possibly examine every sub-construct of student engagement (Kahu, 2013), it is important for each research project to begin with a clear definition of its own understanding (Boekaerts, 2016). Therefore, in this project, student engagement is defined as follows:

Student engagement is the energy and effort that students employ within their learning community, observable via any number of behavioural, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment. The more students are engaged and empowered within their learning community, the more likely they are to channel that energy back into their learning, leading to a range of short and long term outcomes, that can likewise further fuel engagement.

Dimensions and indicators of student engagement

There are three widely accepted dimensions of student engagement: affective, cognitive and behavioural. Within each dimension there are several indicators of engagement (see Additional file 2), as well as of disengagement (see Additional file 2), which is now seen as a separate and distinct construct from engagement. It should be noted, however, that whilst these have been drawn from a range of literature, this is not a finite list, and it is recognised that students might experience these indicators on a continuum at varying times (Coates, 2007; Payne, 2017), depending on their valence (positive or negative) and activation (high or low) (Pekrun & Linnenbrink-Garcia, 2012). There has also been disagreement over which dimension particular indicators align with. For example, Järvelä, Järvenoja, Malmberg, Isohätälä, and Sobocinski (2016) argue that ‘interaction’ extends beyond behavioural engagement, covering cognitive and/or emotional dimensions, as it involves collaboration between students, and Lawson and Lawson (2013) believe that ‘effort’ and ‘persistence’ are cognitive rather than behavioural constructs, as they “represent cognitive dispositions toward activity rather than an activity unto itself” (p. 465), which is represented in the table through the indicator ‘stay on task/focus’ (see Additional file 2). Further consideration of these disagreements represents an area for future research, however, as they are beyond the scope of this paper.

Student engagement within educational technology research

The potential of educational technology to improve student engagement has long been recognised (Norris & Coutas, 2014); however, it is not merely a case of technology plus students equals engagement. Without careful planning and sound pedagogy, technology can promote disengagement and impede rather than help learning (Howard, Ma, & Yang, 2016; Popenici, 2013). Whilst still a young area, most of the research undertaken to gain insight into this has focused on undergraduate students (e.g. Henrie et al., 2015; Webb, Clough, O’Reilly, Wilmott, & Witham, 2017), with Chen et al. (2010) finding a positive relationship between the use of technology and student engagement, particularly earlier in university study. Research has also been predominantly STEM and medicine focused (e.g. Li, van der Spek, Feijs, Wang, & Hu, 2017; Nikou & Economides, 2018), with at least five literature or systematic reviews published in the last 5 years focused on medicine, and nursing in particular (see Additional file 3). This indicates that further synthesis is needed of research in other disciplines, such as Arts & Humanities and Education, as well as further investigation into whether research continues to focus on undergraduate students.

The five most researched technologies in Henrie et al.’s (2015) review were online discussion boards, general websites, learning management systems (LMS), general campus software and videos, in contrast to Schindler, Burkholder, Morad, and Marsh’s (2017) literature review, which concentrated on social networking sites (Facebook and Twitter), digital games, wikis, web-conferencing software and blogs. Schindler et al. found that most of these technologies had a positive impact on multiple indicators of student engagement across the three dimensions of engagement, with digital games, web-conferencing software and Facebook the most effective. However, it must be noted that they considered only seven indicators of student engagement, and their findings could be extended by considering further indicators. Other reviews that have found at least a small positive impact on student engagement include those focused on audience response systems (Hunsu, Adesope, & Bayly, 2016; Kay & LeSage, 2009), mobile learning (Kaliisa & Picard, 2017), and social media (Cheston, Flickinger, & Chisolm, 2013). Specific indicators of engagement that increased as a result of technology include interest and enjoyment (Li et al., 2017), improved confidence (Smith & Lambert, 2014) and attitudes (Nikou & Economides, 2018), as well as enhanced relationships with peers and teachers (e.g. Alrasheedi, Capretz, & Raza, 2015; Atmacasoy & Aksu, 2018).

Literature and systematic reviews focused on student engagement and technology do not always report where studies were conducted. Of 27 identified reviews (see Additional file 3), only 14 report the countries included, and two of these were explicitly focused on a specific region or country, namely Africa and Turkey. Most of the research has been conducted in the USA, followed by the UK, Taiwan, Australia and China. Table 1 depicts the three countries from which most studies originated in the respective reviews, and highlights a clear lack of research conducted within mainland Europe, South America and Africa. Whilst this could be due to the choice of databases searched, it nevertheless highlights a substantial gap in the literature; it will therefore be interesting to see whether this review substantiates or contradicts these trends.

Research into student engagement and educational technology has predominantly used quantitative methodology (see Additional file 3), with 11 literature and systematic reviews reporting that surveys, particularly self-report Likert-scale instruments, are the most used source of measurement (e.g. Henrie et al., 2015). Reviews that have included research using a range of methodologies have found a limited number of studies employing qualitative methods (e.g. Connolly, Boyle, MacArthur, Hainey, & Boyle, 2012; Kay & LeSage, 2009; Lundin et al., 2018). This has led to calls for further qualitative research exploring student engagement and technology, as well as for more rigorous research designs (e.g. Li et al., 2017; Nikou & Economides, 2018), including sampling strategies and data collection, in experimental studies in particular (Cheston et al., 2013; Connolly et al., 2012). However, not all reviews included information on methodologies used. Crook (2019), in his recent editorial in the British Journal of Educational Technology, stated that research methodology is a “neglected topic” (p. 487) within educational technology research, and stressed its importance for conducting studies that delve deeper into phenomena (e.g. longitudinal studies).

Therefore, this article presents an initial “evidence map” (Miake-Lye, Hempel, Shanman, & Shekelle, 2016, p. 19) of systematically identified literature on student engagement and educational technology within higher education, undertaken through a systematic review, in order to address the issues raised by prior research and to identify research gaps. These issues include the disparity between fields of study and study levels researched, the geographical distribution of studies, the methodologies used, and the theoretical fuzziness surrounding student engagement. This article is intended to provide an initial overview of the systematic review method employed, as well as of the overall corpus. Further synthesis of possible correlations between student engagement and disengagement indicators and the co-occurrence of technology tools will be undertaken in field-of-study-specific articles (e.g. Bedenlier, 2020a, 2020b), allowing more meaningful guidance on applying the findings in practice.

The following research questions guide this enquiry:

How do the studies in the sample ground student engagement and align with theory?

Which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement?

What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Overview of the study

With the intent to systematically map empirical research on student engagement and educational technology in higher education, we conducted a systematic review. A systematic review is an explicitly and systematically conducted literature review that answers a specific question through a replicable search strategy, with studies then included or excluded based on explicit criteria (Gough, Oliver, & Thomas, 2012). Included studies are then coded and synthesised into findings that shine light on gaps, contradictions or inconsistencies in the literature, as well as providing guidance on applying findings in practice. This contribution maps the research corpus of 243 studies that were identified through a systematic search and ensuing random parameter-based sampling.

Search strategy and selection procedure

The initial inclusion criteria for the systematic review were peer-reviewed articles in the English language, empirically reporting on students and student engagement in higher education, and making use of educational technology. The search was limited to records between 1995 and 2016, chosen due to the implementation of the first Virtual Learning Environments and Learning Management Systems within higher education (see Bond, 2018). Articles were limited to those published in peer-reviewed journals, due to the rigorous process under which they are published and their trustworthiness in academia (Nicholas et al., 2015), although concerns within the scientific community about the peer-review process are acknowledged (e.g. Smith, 2006).

Discussion arose on how to approach the “hard-to-detect” (O’Mara-Eves et al., 2014, p. 51) concept of student engagement with regard to sensitivity versus precision (Brunton, Stansfield, & Thomas, 2012), particularly in light of engagement being Henrie et al.’s (2015) most important search term. The decision was made that the concept ‘student engagement’ would be identified from titles and abstracts at a later stage, during the screening process. In this way, articles would be included that are indeed concerned with student engagement but that use different terms to describe the concept. Given the nature of student engagement as a meta-construct (e.g. Appleton et al., 2008; Christenson et al., 2012; Kahu, 2013), limiting the search to articles including the term engagement might miss important research on other elements of student engagement. Hence, we opted for recall over precision. According to Gough et al. (2012, p. 13), “electronic searching is imprecise and captures many studies that employ the same terms without sharing the same focus”; conversely, a narrower search would disregard studies that analyse the construct but use different terms to describe it.

With this in mind, the search strategy to identify relevant studies was developed iteratively with support from the University Research Librarian. As outlined in O’Mara-Eves et al. (2014) as a standard approach, we used reviewer knowledge—in this case strongly supported by certified expertise—and previous literature (e.g. Henrie et al., 2015; Kahu, 2013) to elicit concepts of potential importance under the topics of student engagement, higher education and educational technology. The final search string (see Fig. 1) encompasses clusters of different educational technologies that were searched for separately, in order to avoid an overly long search string. It was decided not to include any brand names (e.g. Facebook, Twitter, Moodle), because it was reasoned that scientific publications would use the broader term (e.g. social media). The final search string was slightly adapted, e.g. in the format required for truncations or wildcards, according to the settings of each database used.
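The cluster-by-cluster search described above can be sketched programmatically. The concept terms below are illustrative placeholders of my own (the authors' actual terms appear in their Fig. 1); only the structure, one AND of quoted OR-blocks per technology cluster, follows the text:

```python
# Sketch: assembling Boolean search strings, one per technology cluster.
# All concept terms below are illustrative placeholders, not the authors'
# actual search terms (those are shown in the article's Fig. 1).

engagement_terms = ["student engagement", "involvement", "participation"]
higher_ed_terms = ["higher education", "universit*", "undergraduate*"]
tech_clusters = {
    "social media": ["social media", "social network*"],
    "LMS": ["learning management system*", "virtual learning environment*"],
}

def or_block(terms):
    """Join terms into a quoted, OR-connected block: ("a" OR "b")."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

def cluster_query(cluster_terms):
    """AND together the engagement, higher-ed, and technology blocks."""
    return " AND ".join(
        [or_block(engagement_terms), or_block(higher_ed_terms), or_block(cluster_terms)]
    )

for name, terms in tech_clusters.items():
    print(f"{name}: {cluster_query(terms)}")
```

Keeping the clusters separate, as the authors did, avoids one unwieldy string and lets each database query stay within length limits.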

[Figure 1: Final search terms used in the systematic review]

Four databases (ERIC, Web of Science, Scopus and PsycINFO) were searched in July 2017, and three researchers and a student assistant screened titles and abstracts of the retrieved references between August and November 2017, using EPPI-Reviewer 4.0. An initial 77,508 references were retrieved; with the elimination of duplicate records, 53,768 references remained (see Fig. 2). A first cursory screening of records revealed that older research was more concerned with technologies now considered outdated (e.g. overhead projectors, floppy disks). Therefore, we adjusted the period to include research published between 2007 and 2016, a phase of research and practice labelled ‘online learning in the digital age’ (Bond, 2018). Whilst we initially opted for recall over precision, the decision was then made to search for specific facets of the student engagement construct (e.g. deep learning, interest and persistence) within EPPI-Reviewer, in order to further refine the corpus. These adaptations left 18,068 records.

[Figure 2: Systematic review PRISMA flow chart (slightly modified after Brunton et al., 2012, p. 86; Moher, Liberati, Tetzlaff, & Altman, 2009, p. 8)]

Four researchers screened the first 150 titles and abstracts, in order to iteratively establish a joint understanding of the inclusion criteria. The remaining references were distributed equally amongst the screening team, which resulted in the inclusion of 4152 potentially relevant articles. Given the large number of articles to screen on full text, and the time constraints of project-based, funded work, it was decided that a sample of articles would be drawn from this corpus for further analysis. With the intention of drawing a sample that estimates the population parameters within a predetermined error range, we used methods of sample size estimation from the social sciences (Kupper & Hafner, 1989), implemented in the R package MBESS (Kelley, Lai, Lai, & Suggests, 2018). Accepting a 5% error range, a proportion of 0.5 and an alpha of 5%, 349 articles were sampled, with the sample then stratified by publishing year, as student engagement has become much more prevalent (Zepke, 2018) and educational technology more differentiated within the last decade (Bond, 2018). Two researchers screened the first 100 articles on full text, reaching 88% agreement on inclusion/exclusion. The researchers then discussed and resolved the remaining 12% of discrepancies, and it was decided that further comparison screening was needed to increase reliability. After screening the sample on full text, 232 articles remained for data extraction, which contained 243 studies.
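The sampling arithmetic can be checked roughly. The authors used the R package MBESS for the actual calculation; the sketch below uses the textbook formula for estimating a proportion with a finite-population correction, which is an assumption about the underlying method and lands near, but not exactly on, the reported 349:

```python
from math import ceil

# Rough check of the reported sample size (population of 4152 articles,
# 5% error, proportion 0.5, alpha 5%). The exact MBESS method may differ.

def sample_size(population, error=0.05, p=0.5, z=1.96):
    """Sample size to estimate a proportion within +/- error at ~95% confidence."""
    n0 = z**2 * p * (1 - p) / error**2             # infinite-population size (~384)
    return ceil(n0 / (1 + (n0 - 1) / population))  # finite-population correction

print(sample_size(4152))  # ~352 here; the authors' MBESS computation gave 349
```

The small gap from 349 likely reflects a different rounding or parameterization inside MBESS.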

Data extraction process

In order to extract the article data, an extensive coding system was developed, including codes to extract information on the set-up and execution of the study (e.g. methodology, study sample) as well as information on the learning scenario, the mode of delivery and the educational technology used. Learning scenarios included broader pedagogies, such as social collaborative learning and self-determined learning, but also specific pedagogies such as flipped learning, given the increasing number of studies and interest in these approaches (e.g., Lundin et al., 2018). Specific examples of student engagement and/or disengagement were coded under cognitive, affective or behavioural (dis)engagement. The facets of student (dis)engagement were identified based on the literature review undertaken (see Additional file 2), and applied in this detailed manner to capture not only the overarching dimensions of the concept but also their diverse sub-meanings. New indicators also emerged during the coding process that had not initially been identified from the literature review, including 'confidence' and 'assuming responsibility'. The 243 studies were coded with this extensive code set and any disagreements between the coders were reconciled.

As a plethora of over 50 individual educational technology applications and tools was identified in the 243 studies, in line with results found in other large-scale systematic reviews (e.g., Lai & Bower, 2019), concerns were raised over how the research team could meaningfully analyse and report the results. The decision was therefore made to employ Bower's (2016) typology of learning technologies (see Additional file 4), in order to channel the tools into groups that share the same characteristics or "structure of information" (Bower, 2016, p. 773). Whilst it is acknowledged that some of the technology could be classified into more than one type within the typology (e.g. wikis can be used for individual composition, for collaborative tasks, or for knowledge organisation and sharing), "the type of learning that results from the use of the tool is dependent on the task and the way people engage with it rather than the technology itself"; therefore "the typology is presented as descriptions of what each type of tool enables and example use cases rather than prescriptions of any particular pedagogical value system" (Bower, 2016, p. 774). For further elaboration on each category, please see Bower (2015).

Study characteristics

Geographical characteristics

The systematic mapping reveals that the 243 studies were set in 33 different countries, whilst seven studies investigated settings in an international context and three studies did not indicate their country setting. In 2% of the studies, the country was allocated based on the authors' country of origin, where both authors came from the same country. The top five countries account for 158 studies (see Fig. 3): 35.4% (n = 86) of studies were conducted in the United States (US), 10.7% (n = 26) in the United Kingdom (UK), 7.8% (n = 19) in Australia, 7.4% (n = 18) in Taiwan, and 3.7% (n = 9) in China. Across the corpus, studies from countries with English as an official (or one of the official) languages account for 59.7% of the entire sample, followed by East Asian countries, which in total account for 18.8%. With the exception of the UK, European countries are largely absent from the sample: only 7.3% of the articles originate from this region, with countries such as France, Belgium, Italy and Portugal contributing no studies, and Germany and the Netherlands one each. With eight articles, Spain is the most prolific European country outside of the UK. The geographical distribution of study settings also clearly shows an almost complete absence of studies undertaken within African contexts, with five studies from South Africa and one from Tunisia. Studies from South-East Asia, the Middle East, and South America are likewise low in number in this review. Whilst the global picture evokes an imbalance, this might be partially due to our search and sampling strategy, which focused on English-language journals indexed in four primarily Western-focused databases.

figure 3

Percentage deviation from the average relative frequencies of the different data collection formats per country (≥ 3 articles). Note. NS = not stated; AUS = Australia; CAN = Canada; CHN = China; HKG = Hong Kong; inter = international; IRI = Iran; JAP = Japan; MYS = Malaysia; SGP = Singapore; ZAF = South Africa; KOR = South Korea; ESP = Spain; SWE = Sweden; TWN = Taiwan; TUR = Turkey; GBR = United Kingdom; USA = United States of America

Methodological characteristics

Within this literature corpus, 103 studies (42%) employed quantitative methods, 84 (35%) mixed methods, and 56 (23%) qualitative methods. Relating these numbers back to the contributing countries, different preferences for, and frequencies of, the methods used become apparent (see Fig. 3). As a general tendency, mixed methods and qualitative research occur more often in Western countries, whereas quantitative research is the preferred approach in East Asian countries. For example, studies originating from Australia employ mixed methods research 28% more often than the average, whereas Singapore is far below average in mixed methods research, at 34.5% less than the other countries in the sample. Taiwan, on the other hand, conducts mixed methods studies 23.5% below average and qualitative research 6.4% less often than average, but quantitative research 29.8% above average.
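The per-country comparisons above reduce to percentage-point deviations: a country's share of a given method minus the corpus-wide share. A minimal sketch, using the corpus-wide counts reported above but a hypothetical split for Australia's 19 studies (the review's per-country raw counts are not reproduced in the text):

```python
def deviation_from_average(country_counts, overall_counts):
    """Percentage-point deviation of a country's method mix from the
    corpus-wide distribution (positive = used more often than average)."""
    total = sum(overall_counts.values())
    n = sum(country_counts.values())
    return {method: round(100 * (country_counts.get(method, 0) / n
                                 - overall / total), 1)
            for method, overall in overall_counts.items()}

# Corpus-wide counts from the review; the Australian split is hypothetical.
overall = {"quantitative": 103, "mixed": 84, "qualitative": 56}
australia = {"quantitative": 4, "mixed": 12, "qualitative": 3}
print(deviation_from_average(australia, overall))
```

With this illustrative split, mixed methods deviates by roughly +28.6 points, in line with the 28% figure reported for Australia above.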

Amongst the qualitative studies, qualitative content analysis (n = 30) was the most frequently used analysis approach, followed by thematic analysis (n = 21) and grounded theory (n = 12). In many cases, however, the exact analysis approach was not reported (n = 37), could not be allocated to a specific classification (n = 22), or no method of analysis was identifiable at all (n = 11). Within studies using quantitative methods, mean comparison was used in 100 studies, frequency data was collected and analysed in 83 studies, and regression models were used in 40 studies. Furthermore, looking at the correlations between the different analysis approaches, only one significant correlation can be identified, between mean comparison and frequency data (−.246). Beyond that, correlations are small; for example, only 14% of the studies employ both mean comparisons and regression models.

Study population characteristics

Research in the corpus focused on universities as the prime institution type (n = 191, 79%), followed by 24 (10%) non-specified institution types and colleges (n = 21, 8.2%) (see Fig. 4). Five studies (2%) included institutions classified as 'other', and two studies (0.8%) included both college and university students. The most frequently studied population was undergraduate students (60%, n = 146), as opposed to 33 studies (14%) focused on postgraduate students (see Fig. 6). A combination of undergraduate and postgraduate students was the subject of interest in 23 studies (9%), with 41 studies (17%) not specifying the level of study of research participants.

figure 4

Relative frequencies of study fields by country (countries with ≥3 articles). Note. Country abbreviations are as per Fig. 3. A&H = Arts & Humanities; BA&L = Business, Administration and Law; EDU = Education; EM&C = Engineering, Manufacturing & Construction; H&W = Health & Welfare; ICT = Information & Communication Technologies; ID = interdisciplinary; NS,M&S = Natural Science, Mathematics & Statistics; NS = Not specified; SoS = Social Sciences, Journalism & Information

Based on the UNESCO (2015) ISCED classification, eight broad study fields are covered in the sample, with Arts & Humanities (42 studies), Education (42 studies), and Natural Sciences, Mathematics & Statistics (37 studies) being the top three study fields, followed by Health & Welfare (30 studies), Social Sciences, Journalism & Information (22 studies), Business, Administration & Law (19 studies), Information & Communication Technologies (13 studies), Engineering, Manufacturing & Construction (11 studies), and another 26 studies of an interdisciplinary character. One study did not specify a field of study.

An expected value was calculated for how the studies per discipline should be distributed across countries. The actual deviation from this value then showed that several Asian countries are home to more articles in the field of Arts & Humanities than expected: Japan with 3.3 more articles, China with 5.4 and Taiwan with 5.9. Internationally located research also shows 2.3 more interdisciplinary studies than expected, whereas studies in the Social Sciences occur more often than expected in the UK (5.7 more articles) and Australia (3.3 more articles), but less often than expected across all other countries. Interestingly, the USA has 9.9 fewer studies in Arts & Humanities than expected, but 5.6 more articles than expected in the Natural Sciences.
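The expected value used above is the standard independence expectation from a contingency table: row total times column total over the grand total, with the reported figures being observed minus expected counts. A sketch with a hypothetical two-country, two-field table (the review's full country-by-discipline table is not reproduced here):

```python
def expected_counts(table):
    """Expected cell counts under independence for a nested-dict
    contingency table: E[r][c] = row_total * col_total / grand_total."""
    row_totals = {r: sum(cols.values()) for r, cols in table.items()}
    col_totals = {}
    for cols in table.values():
        for c, v in cols.items():
            col_totals[c] = col_totals.get(c, 0) + v
    grand = sum(row_totals.values())
    return {r: {c: row_totals[r] * col_totals[c] / grand for c in cols}
            for r, cols in table.items()}

# Hypothetical observed counts of studies per country and field.
observed = {"USA": {"A&H": 10, "NS,M&S": 20}, "UK": {"A&H": 8, "NS,M&S": 4}}
expected = expected_counts(observed)
# Deviation, read as "x studies more/less than expected":
deviation = {r: {c: round(observed[r][c] - expected[r][c], 1)
                 for c in observed[r]} for r in observed}
print(deviation)
```

In this toy table the USA would have 2.9 fewer Arts & Humanities studies than expected, the same reading as the country-level deviations reported in the text.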

Question 1: How do the studies in the sample ground student engagement and align with theory?

Defining student engagement

It is striking that almost all of the studies (n = 225, 93%) in this corpus lack a definition of student engagement, with only 18 (7%) articles attempting to define the concept. However, this is not too surprising, as the search strategy was set up with the assumption that researchers investigating student engagement (dimensions and indicators) would not necessarily label them as student engagement. When developing their definitions, authors in these 18 studies referenced 22 different sources, with the work of Kuh and colleagues (e.g., Hu & Kuh, 2002; Kuh, 2001; Kuh et al., 2006), as well as Astin (1984), the only authors referred to more than once. The most popular definition of student engagement within these studies was active participation and involvement in learning and university life (e.g., Bolden & Nahachewsky, 2015; Fukuzawa & Boyd, 2016), which was also found by Joksimović et al. (2018) in their review of MOOC research. Interaction, especially between peers and with faculty, was the next most prevalent definition (e.g., Andrew, Ewens, & Maslin-Prothero, 2015; Bigatel & Williams, 2015). Time and effort was given as a definition in four studies (Gleason, 2012; Hatzipanagos & Code, 2016; Price, Richardson, & Jelfs, 2007; Sun & Rueda, 2012), with expending physical and psychological energy (Ivala & Gachago, 2012) another. This variance in definitions and sources reflects the ongoing complexity of the construct (Zepke, 2018), and serves to reinforce the need for a clearer understanding across the field (Schindler et al., 2017).

Theoretical underpinnings

Reflecting findings from other systematic and literature reviews on the topic (Abdool, Nirula, Bonato, Rajji, & Silver, 2017; Hunsu et al., 2016; Kaliisa & Picard, 2017; Lundin et al., 2018), 59% (n = 143) of studies did not employ a theoretical model in their research. Of the 41% (n = 100) that did, 18 studies drew on social constructivism, followed by the Community of Inquiry model (n = 8), Sociocultural Learning Theory (n = 5), and Community of Practice models (n = 4). These findings also reflect the state of the field in general (Al-Sakkaf et al., 2019; Bond, 2019b; Hennessy, Girvan, Mavrikis, Price, & Winters, 2018).

Another interesting finding of this research is that whilst 144 studies (59%) provided research questions, 99 studies (41%) did not. Although it is recognised that not all studies have research questions (Bryman, 2007), or only develop them throughout the research process, as with grounded theory (Glaser & Strauss, 1967), a surprising number of quantitative studies (36%, n = 37) did not have research questions. This reflects the lack of theoretical guidance, as 30 of these 37 studies also did not draw on a theoretical or conceptual framework.

Question 2: Which indicators of cognitive, behavioural and affective engagement were identified in studies where educational technology was used? Which indicators of student disengagement?

Student engagement indicators

Within the corpus, the behavioural engagement dimension was documented in some form in 209 studies (86%), whereas the dimension of affective engagement was reported in 163 studies (67%) and the cognitive dimension in only 136 studies (56%). However, the ten most often identified student engagement indicators across the studies overall (see Table 2) were evenly distributed over all three dimensions (see Table 3). The indicators participation/interaction/involvement, achievement, and positive interactions with peers and teachers each appear in at least 100 studies, almost double the count of the next most frequent student engagement indicator.

Across the 243 studies in the corpus, 117 (48%) showed all three dimensions of affective, cognitive and behavioural student engagement (e.g., Szabo & Schwartz, 2011), including six studies that used established student engagement questionnaires, such as the NSSE (e.g., Delialioglu, 2012), or self-developed questionnaires addressing these three dimensions. Another 54 studies (22%) displayed at least two student engagement dimensions (e.g., Hatzipanagos & Code, 2016), including six questionnaire studies. Only one student engagement dimension was exhibited in 71 studies (29%) (e.g., Vural, 2013).

Student disengagement indicators

Indicators of student disengagement (see Table 4) were identified considerably less often across the corpus. This could be explained by the purpose of the studies being primarily to address or measure positive engagement, but it could also be due to a form of self-selection or publication bias, with studies yielding negative results less frequently written up and/or published. The three disengagement indicators most often identified were frustration (n = 33, 14%) (e.g., Ikpeze, 2007), opposition/rejection (n = 20, 8%) (e.g., Smidt, Bunk, McGrory, Li, & Gatenby, 2014), and disappointment (e.g., Granberg, 2010) as well as other affective disengagement (n = 18, 7% each).

Technology tool typology and engagement/disengagement indicators

Across the 243 studies, a plethora of over 50 individual educational technology tools was employed. The top five most frequently researched tools were LMS (n = 89), discussion forums (n = 80), videos (n = 44), recorded lectures (n = 25), and chat (n = 24). Following a slightly modified version of Bower's (2016) educational tools typology, 17 broad categories of tools were identified (see Additional file 4 for the classification, and section 3.2 for further information). The frequency with which tools from the respective groups were employed in studies varied considerably (see Additional file 4), with the top five categories being text-based tools (n = 138), followed by knowledge organisation & sharing tools (n = 104), multimodal production tools (n = 89), assessment tools (n = 65) and website creation tools (n = 29).

Figure 5 shows what percentage of each engagement dimension (e.g., affective engagement or cognitive disengagement) was fostered through each specific technology type. Given the results in section 4.2.1 on student engagement, it was somewhat unsurprising to see text-based tools, knowledge organisation & sharing tools, and multimodal production tools having the highest proportion of affective, behavioural and cognitive engagement. For example, affective engagement was identified in 163 studies, with 63% of these studies using text-based tools (e.g., Bulu & Yildirim, 2008), and cognitive engagement was identified in 136 studies, with 47% of those using knowledge organisation & sharing tools (e.g., Shonfeld & Ronen, 2015). However, further analysis of studies employing discussion forums (a text-based tool) revealed that, whilst the top affective and behavioural engagement indicators were found in almost two-thirds of studies (see Additional file 5), there was a substantial gap between these and the next most prevalent engagement indicator, with the exact same pattern (and indicators) emerging for wikis. This represents an area for future research.

figure 5

Engagement and disengagement by tool typology. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning; A&H = Arts & Humanities; BA&L = Business, Administration and Law; EDU = Education; EM&C = Engineering, Manufacturing & Construction; H&W = Health & Welfare; ICT = Information & Communication Technologies; ID = interdisciplinary; NS,M&S = Natural Science, Mathematics & Statistics; NS = Not specified; SoS = Social Sciences, Journalism & Information

Interestingly, studies using website creation tools reported more disengagement than engagement indicators across all three domains (see Fig. 5), with studies using assessment tools and social networking tools also reporting increased instances of disengagement across two domains (affective and cognitive, and behavioural and cognitive, respectively). Of the studies using website creation tools, 23 (79%) used blogs, with students showing, for example, disinterest in the topics chosen (e.g., Sullivan & Longnecker, 2014), anxiety over their lack of blogging knowledge and skills (e.g., Mansouri & Piki, 2016), and, in some cases, continued avoidance of using blogs despite introductory training (e.g., Keiller & Inglis-Jassiem, 2015). In studies where assessment tools were used, students found timed assessment stressful, particularly when trying to complete complex mathematical solutions (e.g., Gupta, 2009), as well as quizzes given at the end of lectures, with some students preferring time to take up the content first (e.g., DePaolo & Wilkinson, 2014). Disengagement in studies where social networking tools were used indicated that some students found it difficult to express themselves in short posts (e.g., Cook & Bissonnette, 2016), that conversations lacked authenticity (e.g., Arnold & Paulus, 2010), and that some did not want to mix personal and academic spaces (e.g., Ivala & Gachago, 2012).

Question 3: What are the learning scenarios, modes of delivery and educational technology tools employed in the studies?

Learning scenarios

Social-collaborative learning (SCL) was the scenario most often employed, in 58.4% of the sample (n = 142), followed by self-directed learning (SDL) in 43.2% of studies (n = 105) and game-based learning (GBL) in 5.8% (n = 14) (see Fig. 6). Studies coded as SCL included those exploring social learning (Bandura, 1971) and social constructivist approaches (Vygotsky, 1978). Personal learning environments (PLE) were found in 2.9% of studies, 1.2% of studies used other scenarios (n = 3), and another 13.2% did not specify their learning scenario (n = 32). It is noteworthy that in 45% of possible cases for employing SDL scenarios, SCL was also used. Other learning scenarios were likewise mostly used in combination with SCL and SDL. Given the rising number of higher education studies exploring flipped learning (Lundin et al., 2018), studies exploring the approach were also specifically coded (3%, n = 7).

figure 6

Co-occurrence of learning scenarios across the sample ( n  = 243). Note. SDL = self-directed learning; SCL = social collaborative learning; GBL = game-based learning; PLE = personal learning environments; other = other learning scenario

Modes of delivery

In 84% of studies (n = 204), a single mode of delivery was used, with blended learning the most researched (109 studies), followed by distance education (72 studies) and face-to-face instruction (55 studies). Of the remaining 39 studies, 12 did not indicate their mode of delivery, whilst the other 27 combined or compared modes of delivery, e.g. comparing face-to-face courses to blended learning, such as the study on using iPads in undergraduate nursing education by Davies (2014).

Educational technology tools investigated

Most studies in this corpus (55%) used technology asynchronously, with 12% of studies researching synchronous tools and 18% using both asynchronous and synchronous technology. Given this heavy reliance on asynchronous technology, the overall pattern of tool use is not surprising. However, within studies in face-to-face contexts, the share of synchronous tools (31%) is almost as high as that of asynchronous tools (41%), whilst it is surprisingly low within studies in distance education (7%).

Tool categories were often used in combination, with text-based tools most often combined with other technology types (see Fig. 7): for example, in 60% of all possible multimodal production tool cases, 69% of all possible synchronous collaboration tool cases, 72% of all possible knowledge organisation & sharing tool cases, a striking 89% of all possible learning software cases, and 100% of all possible MOOC cases. By contrast, text-based tools were never used in combination with games or data analysis tools. Games, however, were also used in 67% of possible assessment tool cases. Assessment tools in turn constitute somewhat of a special case where studies using website creation tools are concerned, with only 7% of possible cases having employed assessment tools.

figure 7

Co-occurrence of tools across the sample ( n  = 243). Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning
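The co-occurrence percentages above can be computed per tool pair. A minimal sketch under one plausible reading of "percentage of possible cases" (the share of studies using the second tool that also used the first), with hypothetical per-study tool sets:

```python
def co_occurrence_pct(studies, tool_a, tool_b):
    """Of all studies using tool_b (the 'possible cases'), the
    percentage that also used tool_a. One plausible reading of the
    review's 'percentage of possible cases'."""
    possible = [s for s in studies if tool_b in s]
    if not possible:
        return 0.0
    both = sum(1 for s in possible if tool_a in s)
    return round(100 * both / len(possible), 1)

# Hypothetical tool sets per study (abbreviations as in Fig. 7).
studies = [{"TBT", "MPT"}, {"TBT", "MPT", "AT"}, {"MPT"},
           {"TBT"}, {"MPT", "AT"}]
print(co_occurrence_pct(studies, "TBT", "MPT"))  # 50.0
```

In this toy sample, text-based tools appear in 50% of the possible multimodal production tool cases; the review reports 60% for that pairing across the full corpus.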

In order to gain further understanding of how educational technology was used, we examined how often a combination of two variables should occur in the sample and how often it actually occurred, with deviations described as either 'more than' or 'less than' the expected value. This provides further insight into potential gaps in the literature, which can inform future research. For example, an analysis of educational technology tool usage amongst study populations (see Fig. 8) reveals that 5.0 more studies than expected looked at knowledge organisation & sharing tools for graduate students, but 5.0 fewer studies than expected investigated assessment tools for this group. By contrast, 5 more studies than expected researched assessment tools for unspecified study levels, and 4.3 fewer studies than expected employed knowledge organisation & sharing tools for undergraduate students.

figure 8

Relative frequency of educational technology tools used according to study level. Note. Abbreviations are explained in Fig. 7

Educational technology tools were also used differently from the expected pattern within various fields of study (see Fig. 9), most obviously for the top five tools, but also for virtual worlds, found in 5.8 more studies in Health & Welfare than expected, and learning software, used in 6.4 more studies in Arts & Humanities than expected. In all other disciplines, learning software was used less often than assumed. Text-based tools were used more often than expected in fields of study that are already text-intensive, including Arts & Humanities, Education, Business, Administration & Law, and the Social Sciences, but less often than expected in fields such as Engineering, Health & Welfare, and Natural Sciences, Mathematics & Statistics. Multimodal production tools were used more often than expected only in Health & Welfare, ICT and the Natural Sciences, and less often than assumed across all other disciplines. Assessment tools deviated most clearly, with 11.9 more studies in Natural Sciences, Mathematics & Statistics than assumed, but 5.2 fewer studies in both Education and Arts & Humanities.

figure 9

Relative frequency of educational technology tools used according to field of study. Note. TBT = text-based tools; MPT = multimodal production tools; WCT = website creation tools; KO&S = knowledge organisation and sharing tools; DAT = data analysis tools; DST = digital storytelling tools; AT = assessment tools; SNT = social networking tools; SCT = synchronous collaboration tools; ML = mobile learning; VW = virtual worlds; LS = learning software; OL = online learning

With regard to mode of delivery, it is interesting to see that, of the five top tool categories, all except assessment tools were used in face-to-face instruction less often than expected (see Fig. 10), ranging from 1.6 fewer studies for website creation tools to 14.5 fewer studies for knowledge organisation & sharing tools. Assessment tools, however, were used in 3.3 more studies than expected in face-to-face instruction, but (moderately) less often than assumed in blended learning and distance education formats. Text-based tools, multimodal production tools and knowledge organisation & sharing tools were employed more often than expected in blended and distance learning, most obviously with 13.1 more studies on text-based tools and 8.2 more studies on knowledge organisation & sharing tools in distance education. Contrary to what one would perhaps expect, social networking tools were used in 4.2 fewer studies than expected for this mode of delivery.

figure 10

Relative frequency of educational technology tools used according to mode of delivery. Note. Tool abbreviations are as per Fig. 7. BL = Blended learning; DE = Distance education; F2F = Face-to-face; NS = Not stated

Discussion

The findings of this study confirm those of previous research, with the most prolific countries being the US, UK, Australia, Taiwan and China. This is rather representative of the field, with an analysis of instructional design and technology research from 2007 to 2017 listing the most productive countries as the US, Taiwan, UK, Australia and Turkey (Bodily, Leary, & West, 2019). Likewise, an analysis of 40 years of research in Computers & Education (CAE) found that the US, UK and Taiwan accounted for 49.9% of all publications (Bond, 2018). By contrast, a lack of African research was apparent in this review, which is also evident in educational technology research in top-tier peer-reviewed journals, with only 4% of articles published in the British Journal of Educational Technology (BJET) in the past decade (Bond, 2019b) and 2% of articles in the Australasian Journal of Educational Technology (AJET) (Bond, 2018) hailing from Africa. Similar results were also found in previous literature and systematic reviews (see Table 1), which again raises questions of literature search and inclusion strategies, discussed further in the limitations section.

Whilst other reviews of educational technology and student engagement have found studies to be largely STEM focused (Boyle et al., 2016; Li et al., 2017; Lundin et al., 2018; Nikou & Economides, 2018), this corpus features a more balanced scope of research, with the fields of Arts & Humanities (42 studies, 17.3%) and Education (42 studies, 17.3%) constituting roughly one third of all studies in the corpus, and Natural Sciences, Mathematics & Statistics nevertheless ranking third with 38 studies (15.6%). Beyond these three fields, further research is needed within underrepresented fields of study, in order to gain more comprehensive insights into the usage of educational technology tools (Kay & LeSage, 2009; Nikou & Economides, 2018).

Results of the systematic map further confirm the focus that prior educational technology research has placed on undergraduate students as the target group and participants in technology-enhanced learning settings (e.g., Cheston et al., 2013; Henrie et al., 2015). With an overwhelming 146 studies researching undergraduate students, compared to 33 studies on graduate students and 23 studies investigating both study levels, this also indicates that further investigation into the graduate student experience is needed. Furthermore, the fact that 41 studies do not report the study level of their participants is interesting, albeit problematic, as implications cannot easily be drawn for one's own specific teaching context if the target group under investigation is not clearly identified. More precise reporting of participant details, as well as specification of the study context (country, institution, and study level, to name a few), is needed to transfer and apply study results to practice, and thus to account for why some interventions succeed and others do not.

In line with other studies (e.g., Henrie et al., 2015), this review has also demonstrated that student engagement remains an under-theorised concept that is often considered only fragmentarily in research. Whilst studies in this review have often focused on isolated aspects of student engagement, their results are nevertheless interesting and valuable. However, it is important to relate these individual facets to the larger framework of student engagement, by considering how they are connected and linked to each other. This is especially helpful for integrating research findings into practice, given that student engagement and disengagement are rarely one-dimensional; it is not enough to focus on one aspect of engagement alone without also looking at the aspects adjacent to it (Pekrun & Linnenbrink-Garcia, 2012). It is also vital, therefore, that researchers develop and refine an understanding of student engagement, and make this explicit in their research (Appleton et al., 2008; Christenson et al., 2012).

Reflective of current conversations in the field of educational technology (Bond, 2019b; Castañeda & Selwyn, 2018; Hew et al., 2019), as well as other reviews (Abdool et al., 2017; Hunsu et al., 2016; Kaliisa & Picard, 2017; Lundin et al., 2018), a substantial number of studies in this corpus did not have any theoretical underpinnings. Kaliisa and Picard (2017) argue that, without theory, research can result in disorganised accounts and issues with interpreting data, with research effectively "sit[ting] in a void if it's not theoretically connected" (Kara, 2017, p. 56). Therefore, framing educational technology research on a stronger theoretical basis can assist with locating the "field's disciplinary alignment" (Crook, 2019, p. 486) and further drive conversations forward.

The application of methods in this corpus was interesting in two ways. First, it is noticeable that quantitative studies are prevalent across the 243 articles in the sample, with comparatively few studies employing qualitative research methods (56 studies, as opposed to 84 mixed methods and 103 quantitative studies). This is also reflected in the educational technology field at large: a review of articles published in BJET and Educational Technology Research & Development (ETR&D) from 2002 to 2014 revealed that 40% of articles used quantitative methods, 26% qualitative and 13% mixed (Baydas, Kucuk, Yilmaz, Aydemir, & Goktas, 2015), and likewise a review of educational technology research from Turkey between 1990 and 2011 revealed that 53% of articles used quantitative methods, 22% qualitative and 10% mixed methods (Kucuk, Aydemir, Yildirim, Arpacik, & Goktas, 2013). Quantitative studies primarily show whether or not an intervention has worked when applied to, for example, a group of students in a certain setting, as in the study of mobile apps and student performance in engineering education by Jou, Lin, and Tsai (2016); however, not all student engagement indicators can actually be measured in this way. The lower numbers of affective and cognitive engagement found in the studies in the corpus reflect a wider call to the field to increase research on these two domains (Henrie et al., 2015; Joksimović et al., 2018; O'Flaherty & Phillips, 2015; Schindler et al., 2017). Whilst these two are arguably more difficult to measure than behavioural engagement, the use of more rigorous and accurate surveys could be one possibility, as they can "capture unobservable aspects" (Henrie et al., 2015, p. 45) such as student feelings and information about the cognitive strategies students employ (Finn & Zimmer, 2012). However, surveys are often lengthy and onerous, or subject to the limitations of self-selection.

Whereas low numbers of qualitative studies researching student engagement and educational technology were previously identified in other student engagement and technology reviews (Connolly et al., 2012; Kay & LeSage, 2009; Lundin et al., 2018), it is studies like that by Lopera Medina (2014) in this sample which reveal how people perceive the educational experience and how the process actually unfolds. More qualitative and ethnographic measures should therefore also be employed, such as student observations with thick descriptions, which can help shed light on the complexity of teaching and learning environments (Fredricks et al., 2004; Heflin, Shewmaker, & Nguyen, 2017). Conducting observations can be costly in both time and money, however, so this is suggested in combination with computerised learning analytics data, which can provide measurable, objective and timely insight into how certain manifestations of engagement change over time (Henrie et al., 2015; Ma et al., 2015).

Whereas other results of this review have confirmed previous findings in the field, the technology tools used in these studies, and examined in relation to student engagement, deviate from earlier reviews. Whilst Henrie et al. (2015) found that the most frequently researched tools were discussion forums, general websites, LMS, general campus software and videos, the studies here focused predominantly on LMS, discussion forums, videos, recorded lectures and chat. Furthermore, whilst Schindler et al. (2017) found that digital games, web-conferencing software and Facebook were the most effective tools for enhancing student engagement, this review found that it was rather text-based tools, knowledge organisation & sharing tools, and multimodal production tools.

Limitations

During the execution of this systematic review, we tried to adhere to the method as rigorously as possible. However, several challenges were encountered - some of which are addressed and discussed in another publication (Bedenlier et al., 2020b) - resulting in limitations to this study. Four large, general educational research databases, international in scope, were searched. However, by applying the criterion of articles published in English, research published on this topic in other languages was not included in this review. The same applies to research documented in, for example, grey literature, book chapters or monographs, or articles from journals not indexed in the four databases searched. Another limitation is that only research published between 2007 and 2016 was investigated. Whilst we are cognisant of this restriction, we also think that the technological advances and the implications drawn from this time-frame relate more meaningfully to the current situation than would have been the case for technologies used in the 1990s (see Bond, 2019b). The sampling strategy also most likely accounts for the low number of studies from certain countries, e.g. in South America and Africa.

Studies included in this review represent various academic fields, and they also vary in the rigour with which they were conducted. Harden and Gough (2012) stress that appraising the quality and relevance of studies “ensure[s] that only the most appropriate, trustworthy and relevant studies are used to develop the conclusions of the review” (p. 154). We therefore included peer review as a formal inclusion criterion from the beginning, reasoning that such studies met a baseline of quality applicable to published research in a specific field - otherwise they would not have been accepted for publication by the respective community. Finally, whilst the studies were diligently read and coded, and disagreements discussed and reconciled, the human flaw of having overlooked or misinterpreted information in the individual articles cannot be fully excluded.

Finally, the results presented here provide an initial window into the overall body of research identified during the search, and further research is being undertaken to provide deeper insight into discipline-specific use of technology and resulting student engagement, using subsets of this sample (Bedenlier et al., 2020a; Bond, M., Bedenlier, S., Buntins, K., Kerres, M., & Zawacki-Richter, O.: Facilitating student engagement through educational technology: A systematic review in the field of education, forthcoming).

Recommendations for future work and implications for practice

Whilst the evidence map presented in this article has confirmed previous research on the nexus of educational technology and student engagement, it has also elucidated a number of areas that further research is invited to address. Although these findings are similar to those of previous reviews, to understand student engagement more fully and comprehensively as a multi-faceted construct, it is not enough to focus only on indicators of engagement that can easily be measured; the more complex endeavour of uncovering and investigating those indicators that reside below the surface is also required. This includes the careful alignment of theory and methodological design, in order both to adequately analyse the phenomenon under investigation and to contribute to the soundly executed body of research within the field of educational technology. Further research is invited in particular into how educational technology affects cognitive and affective engagement, whilst considering how this fits within the broader sociocultural framework of engagement (Bond, 2019a). Further research is also invited into how educational technology affects student engagement within fields of study beyond Arts & Humanities, Education and Natural Sciences, Mathematics & Statistics, as well as within graduate-level courses. The use of more qualitative research methods is particularly encouraged.

The findings of this review suggest that research gaps exist with particular combinations of tools, study levels and modes of delivery. With respect to study level, the use of assessment tools with graduate students, as well as knowledge organisation & sharing tools with undergraduate students, are topics researched far less than expected. The use of text-based tools in Engineering, Health & Welfare and Natural Sciences, Mathematics & Statistics, as well as the use of multimodal production tools outside of these disciplines, are also areas for future research, as is the use of assessment tools in the fields of Education and Arts & Humanities in particular.

With 109 studies in this systematic review using a blended learning design, the argument that online distance education and traditional face-to-face education are becoming increasingly integrated with one another is confirmed. Whilst this indicates that many educators have made the move from face-to-face teaching to technology-enhanced learning, it also makes a case for further professional development, so that educators can apply these tools effectively within their own teaching contexts; this review indicates that further research is needed in particular into the use of social networking tools in online/distance education. We must also ask not only in which countries and regions the number of published studies is low, but why that is the case. This entails questioning the conditions under which research is conducted, potentially criticising the publication policies of major Western-based journals, but also ultimately reflecting on one’s own search strategy and research assumptions as a Western educator-researcher.

Based on the findings of this review, educators within higher education institutions are encouraged to use text-based tools, knowledge organisation & sharing tools, and multimodal production tools in particular and, whilst any technology can lead to disengagement if not employed effectively, to be mindful that website creation tools (blogs and ePortfolios), social networking tools and assessment tools were found to be more disengaging than engaging in this review. Educators are therefore encouraged to ensure that students receive sufficient and ongoing training for any new technology used, including those that might appear straightforward, e.g. blogs, and to recognise that students may require extra writing support. Discussion and blog topics should be interesting, allow student agency, and be authentic to students, including through the use of social media. Social networking tools that augment student professional learning networks are particularly useful. Educators should also be aware, however, that some students do not want to mix their academic and personal lives, and so the decision to use certain social platforms could be made together with students.

Availability of data and materials

All data will be made publicly available, as part of the funding requirements, via https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The detailed search strategy, including the modified search strings according to the individual databases, can be retrieved from https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

The full code set can be retrieved from the review protocol at https://www.researchgate.net/project/Facilitating-student-engagement-with-digital-media-in-higher-education-ActiveLeaRn .

Abdool, P. S., Nirula, L., Bonato, S., Rajji, T. K., & Silver, I. L. (2017). Simulation in undergraduate psychiatry: Exploring the depth of learner engagement. Academic Psychiatry : the Journal of the American Association of Directors of Psychiatric Residency Training and the Association for Academic Psychiatry , 41 (2), 251–261. https://doi.org/10.1007/s40596-016-0633-9 .

Alioon, Y., & Delialioğlu, Ö. (2017). The effect of authentic m-learning activities on student engagement and motivation. British Journal of Educational Technology , 32 , 121. https://doi.org/10.1111/bjet.12559 .

Alrasheedi, M., Capretz, L. F., & Raza, A. (2015). A systematic review of the critical factors for success of mobile learning in higher education (university students’ perspective). Journal of Educational Computing Research , 52 (2), 257–276. https://doi.org/10.1177/0735633115571928 .

Al-Sakkaf, A., Omar, M., & Ahmad, M. (2019). A systematic literature review of student engagement in software visualization: A theoretical perspective. Computer Science Education , 29 (2–3), 283–309. https://doi.org/10.1080/08993408.2018.1564611 .

Andrew, L., Ewens, B., & Maslin-Prothero, S. (2015). Enhancing the online learning experience using virtual interactive classrooms. Australian Journal of Advanced Nursing , 32 (4), 22–31.

Antonenko, P. D. (2015). The instrumental value of conceptual frameworks in educational technology research. Educational Technology Research and Development , 63 (1), 53–71. https://doi.org/10.1007/s11423-014-9363-4 .

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools , 45 (5), 369–386. https://doi.org/10.1002/pits.20303 .

Arnold, N., & Paulus, T. (2010). Using a social networking site for experiential learning: Appropriating, lurking, modeling and community building. Internet and Higher Education , 13 (4), 188–196. https://doi.org/10.1016/j.iheduc.2010.04.002 .

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Development , 25 (4), 297–308.

Astin, A. W. (1999). Student involvement: A developmental theory for higher education. Journal of College Student Development , 40 (5), 518–529. https://www.researchgate.net/publication/220017441 (Original work published July 1984).

Atmacasoy, A., & Aksu, M. (2018). Blended learning at pre-service teacher education in Turkey: A systematic review. Education and Information Technologies , 23 (6), 2399–2422. https://doi.org/10.1007/s10639-018-9723-5 .

Azevedo, R. (2015). Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educational Psychologist , 50 (1), 84–94. https://doi.org/10.1080/00461520.2015.1004069 .

Bandura, A. (1971). Social learning theory . New York: General Learning Press.

Barak, M. (2018). Are digital natives open to change? Examining flexible thinking and resistance to change. Computers & Education , 121 , 115–123. https://doi.org/10.1016/j.compedu.2018.01.016 .

Barak, M., & Levenberg, A. (2016). Flexible thinking in learning: An individual differences measure for learning in technology-enhanced environments. Computers & Education , 99 , 39–52. https://doi.org/10.1016/j.compedu.2016.04.003 .

Baron, P., & Corbin, L. (2012). Student engagement: Rhetoric and reality. Higher Education Research and Development , 31 (6), 759–772. https://doi.org/10.1080/07294360.2012.655711 .

Baydas, O., Kucuk, S., Yilmaz, R. M., Aydemir, M., & Goktas, Y. (2015). Educational technology research trends from 2002 to 2014. Scientometrics , 105 (1), 709–725. https://doi.org/10.1007/s11192-015-1693-4 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020a). Facilitating student engagement through educational technology in higher education: A systematic review in the field of arts & humanities. Australasian Journal of Educational Technology , 36 (4), 27–47. https://doi.org/10.14742/ajet.5477 .

Bedenlier, S., Bond, M., Buntins, K., Zawacki-Richter, O., & Kerres, M. (2020b). Learning by Doing? Reflections on Conducting a Systematic Review in the Field of Educational Technology. In O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, & K. Buntins (Eds.), Systematic Reviews in Educational Research (Vol. 45 , pp. 111–127). Wiesbaden: Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-27602-7_7 .

Ben-Eliyahu, A., Moore, D., Dorph, R., & Schunn, C. D. (2018). Investigating the multidimensionality of engagement: Affective, behavioral, and cognitive engagement across science activities and contexts. Contemporary Educational Psychology , 53 , 87–105. https://doi.org/10.1016/j.cedpsych.2018.01.002 .

Betihavas, V., Bridgman, H., Kornhaber, R., & Cross, M. (2016). The evidence for ‘flipping out’: A systematic review of the flipped classroom in nursing education. Nurse Education Today , 38 , 15–21. https://doi.org/10.1016/j.nedt.2015.12.010 .

Bigatel, P., & Williams, V. (2015). Measuring student engagement in an online program. Online Journal of Distance Learning Administration , 18 (2), 9.

Bodily, R., Leary, H., & West, R. E. (2019). Research trends in instructional design and technology journals. British Journal of Educational Technology , 50 (1), 64–79. https://doi.org/10.1111/bjet.12712 .

Boekaerts, M. (2016). Engagement as an inherent aspect of the learning process. Learning and Instruction , 43 , 76–83. https://doi.org/10.1016/j.learninstruc.2016.02.001 .

Bolden, B., & Nahachewsky, J. (2015). Podcast creation as transformative music engagement. Music Education Research , 17 (1), 17–33. https://doi.org/10.1080/14613808.2014.969219 .

Bond, M. (2018). Helping doctoral students crack the publication code: An evaluation and content analysis of the Australasian Journal of Educational Technology. Australasian Journal of Educational Technology , 34 (5), 168–183. https://doi.org/10.14742/ajet.4363 .

Bond, M., & Bedenlier, S. (2019a). Facilitating Student Engagement Through Educational Technology: Towards a Conceptual Framework. Journal of Interactive Media in Education , 2019 (1), 1-14. https://doi.org/10.5334/jime.528 .

Bond, M., Zawacki-Richter, O., & Nichols, M. (2019b). Revisiting five decades of educational technology research: A content and authorship analysis of the British Journal of Educational Technology. British Journal of Educational Technology , 50 (1), 12–63. https://doi.org/10.1111/bjet.12730 .

Bouta, H., Retalis, S., & Paraskeva, F. (2012). Utilising a collaborative macro-script to enhance student engagement: A mixed method study in a 3D virtual environment. Computers & Education , 58 (1), 501–517. https://doi.org/10.1016/j.compedu.2011.08.031 .

Bower, M. (2015). A typology of web 2.0 learning technologies . EDUCAUSE Digital Library Retrieved 20 June 2019, from http://www.educause.edu/library/resources/typology-web-20-learning-technologies .

Bower, M. (2016). Deriving a typology of web 2.0 learning technologies. British Journal of Educational Technology , 47 (4), 763–777. https://doi.org/10.1111/bjet.12344 .

Boyle, E. A., Connolly, T. M., Hainey, T., & Boyle, J. M. (2012). Engagement in digital entertainment games: A systematic review. Computers in Human Behavior , 28 (3), 771–780. https://doi.org/10.1016/j.chb.2011.11.020 .

Boyle, E. A., Hainey, T., Connolly, T. M., Gray, G., Earp, J., Ott, M., … Pereira, J. (2016). An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games. Computers & Education , 94 , 178–192. https://doi.org/10.1016/j.compedu.2015.11.003 .

Broadbent, J., & Poon, W. L. (2015). Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education , 27 , 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007 .

Brunton, G., Stansfield, C., & Thomas, J. (2012). Finding relevant studies. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 107–134). Los Angeles: Sage.

Bryman, A. (2007). The research question in social research: What is its role? International Journal of Social Research Methodology , 10 (1), 5–20. https://doi.org/10.1080/13645570600655282 .

Bulu, S. T., & Yildirim, Z. (2008). Communication behaviors and trust in collaborative online teams. Educational Technology & Society , 11 (1), 132–147.

Bundick, M., Quaglia, R., Corso, M., & Haywood, D. (2014). Promoting student engagement in the classroom. Teachers College Record , 116 (4) Retrieved from http://www.tcrecord.org/content.asp?contentid=17402 .

Castañeda, L., & Selwyn, N. (2018). More than tools? Making sense of the ongoing digitizations of higher education. International Journal of Educational Technology in Higher Education , 15 (1), 211. https://doi.org/10.1186/s41239-018-0109-y .

Chen, P.-S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education , 54 (4), 1222–1232. https://doi.org/10.1016/j.compedu.2009.11.008 .

Cheston, C. C., Flickinger, T. E., & Chisolm, M. S. (2013). Social media use in medical education: A systematic review. Academic Medicine : Journal of the Association of American Medical Colleges , 88 (6), 893–901. https://doi.org/10.1097/ACM.0b013e31828ffc23 .

Choi, M., Glassman, M., & Cristol, D. (2017). What it means to be a citizen in the internet age: Development of a reliable and valid digital citizenship scale. Computers & Education , 107 , 100–112. https://doi.org/10.1016/j.compedu.2017.01.002 .

Christenson, S. L., Reschly, A. L., & Wylie, C. (Eds.) (2012). Handbook of research on student engagement . Boston: Springer US.

Coates, H. (2007). A model of online and general campus-based student engagement. Assessment & Evaluation in Higher Education , 32 (2), 121–141. https://doi.org/10.1080/02602930600801878 .

Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education , 59 (2), 661–686. https://doi.org/10.1016/j.compedu.2012.03.004 .

Cook, M. P., & Bissonnette, J. D. (2016). Developing preservice teachers’ positionalities in 140 characters or less: Examining microblogging as dialogic space. Contemporary Issues in Technology and Teacher Education (CITE Journal) , 16 (2), 82–109.

Crompton, H., Burke, D., Gregory, K. H., & Gräbe, C. (2016). The use of mobile learning in science: A systematic review. Journal of Science Education and Technology , 25 (2), 149–160. https://doi.org/10.1007/s10956-015-9597-x .

Crook, C. (2019). The “British” voice of educational technology research: 50th birthday reflection. British Journal of Educational Technology , 50 (2), 485–489. https://doi.org/10.1111/bjet.12757 .

Davies, M. (2014). Using the apple iPad to facilitate student-led group work and seminar presentation. Nurse Education in Practice , 14 (4), 363–367. https://doi.org/10.1016/j.nepr.2014.01.006 .

Delialioglu, O. (2012). Student engagement in blended learning environments with lecture-based and problem-based instructional approaches. Educational Technology & Society , 15 (3), 310–322.

DePaolo, C. A., & Wilkinson, K. (2014). Recurrent online quizzes: Ubiquitous tools for promoting student presence, participation and performance. Interdisciplinary Journal of E-Learning and Learning Objects , 10 , 75–91 Retrieved from http://www.ijello.org/Volume10/IJELLOv10p075-091DePaolo0900.pdf .

Doherty, K., & Doherty, G. (2018). Engagement in HCI. ACM Computing Surveys , 51 (5), 1–39. https://doi.org/10.1145/3234149 .

Eccles, J. (2016). Engagement: Where to next? Learning and Instruction , 43 , 71–75. https://doi.org/10.1016/j.learninstruc.2016.02.003 .

Eccles, J., & Wang, M.-T. (2012). Part I commentary: So what is student engagement anyway? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 133–145). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_6 .

Englund, C., Olofsson, A. D., & Price, L. (2017). Teaching with technology in higher education: Understanding conceptual change and development in practice. Higher Education Research and Development , 36 (1), 73–87. https://doi.org/10.1080/07294360.2016.1171300 .

Fabian, K., Topping, K. J., & Barron, I. G. (2016). Mobile technology and mathematics: Effects on students’ attitudes, engagement, and achievement. Journal of Computers in Education , 3 (1), 77–104. https://doi.org/10.1007/s40692-015-0048-8 .

Filsecker, M., & Kerres, M. (2014). Engagement as a volitional construct. Simulation & Gaming , 45 (4–5), 450–470. https://doi.org/10.1177/1046878114553569 .

Finn, J. (2006). The adult lives of at-risk students: The roles of attainment and engagement in high school (NCES 2006-328) . Washington, DC: U.S. Department of Education, National Center for Education Statistics Retrieved from website: https://nces.ed.gov/pubs2006/2006328.pdf .

Finn, J., & Zimmer, K. (2012). Student engagement: What is it? Why does it matter? In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 97–131). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_5 .

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74 (1), 59–109. https://doi.org/10.3102/00346543074001059 .

Fredricks, J. A., Filsecker, M., & Lawson, M. A. (2016). Student engagement, context, and adjustment: Addressing definitional, measurement, and methodological issues. Learning and Instruction , 43 , 1–4. https://doi.org/10.1016/j.learninstruc.2016.02.002 .

Fredricks, J. A., Wang, M.-T., Schall Linn, J., Hofkens, T. L., Sung, H., Parr, A., & Allerton, J. (2016). Using qualitative methods to develop a survey measure of math and science engagement. Learning and Instruction , 43 , 5–15. https://doi.org/10.1016/j.learninstruc.2016.01.009 .

Fukuzawa, S., & Boyd, C. (2016). Student engagement in a large classroom: Using technology to generate a hybridized problem-based learning experience in a large first year undergraduate class. Canadian Journal for the Scholarship of Teaching and Learning , 7 (1). https://doi.org/10.5206/cjsotl-rcacea.2016.1.7 .

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research . Chicago: Aldine.

Gleason, J. (2012). Using technology-assisted instruction and assessment to reduce the effect of class size on student outcomes in undergraduate mathematics courses. College Teaching , 60 (3), 87–94.

Gough, D., Oliver, S., & Thomas, J. (2012). An introduction to systematic reviews . Los Angeles: Sage.

Granberg, C. (2010). Social software for reflective dialogue: Questions about reflection and dialogue in student Teachers’ blogs. Technology, Pedagogy and Education , 19 (3), 345–360. https://doi.org/10.1080/1475939X.2010.513766 .

Greenwood, L., & Kelly, C. (2019). A systematic literature review to explore how staff in schools describe how a sense of belonging is created for their pupils. Emotional and Behavioural Difficulties , 24 (1), 3–19. https://doi.org/10.1080/13632752.2018.1511113 .

Gupta, M. L. (2009). Using emerging technologies to promote student engagement and learning in agricultural mathematics. International Journal of Learning , 16 (10), 497–508. https://doi.org/10.18848/1447-9494/CGP/v16i10/46658 .

Harden, A., & Gough, D. (2012). Quality and relevance appraisal. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews , (pp. 153–178). London: Sage.

Hatzipanagos, S., & Code, J. (2016). Open badges in online learning environments: Peer feedback and formative assessment as an engagement intervention for promoting agency. Journal of Educational Multimedia and Hypermedia , 25 (2), 127–142.

Heflin, H., Shewmaker, J., & Nguyen, J. (2017). Impact of mobile technology on student attitudes, engagement, and learning. Computers & Education , 107 , 91–99. https://doi.org/10.1016/j.compedu.2017.01.006 .

Henderson, M., Selwyn, N., & Aston, R. (2017). What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Studies in Higher Education , 42 (8), 1567–1579. https://doi.org/10.1080/03075079.2015.1007946 .

Hennessy, S., Girvan, C., Mavrikis, M., Price, S., & Winters, N. (2018). Editorial. British Journal of Educational Technology , 49 (1), 3–5. https://doi.org/10.1111/bjet.12598 .

Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-mediated learning: A review. Computers & Education , 90 , 36–53. https://doi.org/10.1016/j.compedu.2015.09.005 .

Hew, K. F., & Cheung, W. S. (2013). Use of web 2.0 technologies in K-12 and higher education: The search for evidence-based practice. Educational Research Review , 9 , 47–64. https://doi.org/10.1016/j.edurev.2012.08.001 .

Hew, K. F., Lan, M., Tang, Y., Jia, C., & Lo, C. K. (2019). Where is the “theory” within the field of educational technology research? British Journal of Educational Technology , 50 (3), 956–971. https://doi.org/10.1111/bjet.12770 .

Howard, S. K., Ma, J., & Yang, J. (2016). Student rules: Exploring patterns of students’ computer-efficacy and engagement with digital technologies in learning. Computers & Education , 101 , 29–42. https://doi.org/10.1016/j.compedu.2016.05.008 .

Hu, S., & Kuh, G. D. (2002). Being (dis)engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education , 43 (5), 555–575. https://doi.org/10.1023/A:1020114231387 .

Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education , 94 , 102–119. https://doi.org/10.1016/j.compedu.2015.11.013 .

Ikpeze, C. (2007). Small group collaboration in peer-led electronic discourse: An analysis of group dynamics and interactions involving Preservice and Inservice teachers. Journal of Technology and Teacher Education , 15 (3), 383–407.

Ivala, E., & Gachago, D. (2012). Social media for enhancing student engagement: The use of Facebook and blogs at a university of technology. South African Journal of Higher Education , 26 (1), 152–167.

Järvelä, S., Järvenoja, H., Malmberg, J., Isohätälä, J., & Sobocinski, M. (2016). How do types of interaction and phases of self-regulated learning set a stage for collaborative engagement? Learning and Instruction , 43 , 39–51. https://doi.org/10.1016/j.learninstruc.2016.01.005 .

Joksimović, S., Poquet, O., Kovanović, V., Dowell, N., Mills, C., Gašević, D., … Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research , 88 (1), 43–86. https://doi.org/10.3102/0034654317740335 .

Jou, M., Lin, Y.-T., & Tsai, H.-C. (2016). Mobile APP for motivation to learning: An engineering case. Interactive Learning Environments , 24 (8), 2048–2057. https://doi.org/10.1080/10494820.2015.1075136 .

Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook activities, and student engagement. Computers & Education , 58 (1), 162–171. https://doi.org/10.1016/j.compedu.2011.08.004 .

Kahn, P. (2014). Theorising student engagement in higher education. British Educational Research Journal , 40 (6), 1005–1018. https://doi.org/10.1002/berj.3121 .

Kahu, E. R. (2013). Framing student engagement in higher education. Studies in Higher Education , 38 (5), 758–773. https://doi.org/10.1080/03075079.2011.598505 .

Kahu, E. R., & Nelson, K. (2018). Student engagement in the educational interface: Understanding the mechanisms of student success. Higher Education Research and Development , 37 (1), 58–71. https://doi.org/10.1080/07294360.2017.1344197 .

Kaliisa, R., & Picard, M. (2017). A systematic review on mobile learning in higher education: The African perspective. The Turkish Online Journal of Educational Technology , 16 (1) Retrieved from https://files.eric.ed.gov/fulltext/EJ1124918.pdf .

Kara, H. (2017). Research and evaluation for busy students and practitioners: A time-saving guide , (2nd ed., ). Bristol: Policy Press.

Karabulut-Ilgu, A., Jaramillo Cherrez, N., & Jahren, C. T. (2018). A systematic review of research on the flipped learning method in engineering education: Flipped learning in engineering education. British Journal of Educational Technology , 49 (3), 398–411. https://doi.org/10.1111/bjet.12548 .

Kay, R. H., & LeSage, A. (2009). Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education , 53 (3), 819–827. https://doi.org/10.1016/j.compedu.2009.05.001 .

Keiller, L., & Inglis-Jassiem, G. (2015). A lesson in listening: Is the student voice heard in the rush to incorporate technology into health professions education? African Journal of Health Professions Education , 7 (1), 47–50. https://doi.org/10.7196/ajhpe.371 .

Kelley, K., Lai, K., & Lai, M. K. (2018). Package ‘MBESS’. Retrieved from https://cran.r-project.org/web/packages/MBESS/MBESS.pdf

Kerres, M. (2013). Mediendidaktik. Konzeption und Entwicklung mediengestützter Lernangebote . München: Oldenbourg.

Kirkwood, A. (2009). E-learning: You don’t always get what you hope for. Technology, Pedagogy and Education , 18 (2), 107–121. https://doi.org/10.1080/14759390902992576 .

Koehler, M., & Mishra, P. (2005). What happens when teachers design educational technology? The development of technological pedagogical content knowledge. Journal of Educational Computing Research , 32 (2), 131–152.

Krause, K.-L., & Coates, H. (2008). Students’ engagement in first-year university. Assessment & Evaluation in Higher Education , 33 (5), 493–505. https://doi.org/10.1080/02602930701698892 .

Kucuk, S., Aydemir, M., Yildirim, G., Arpacik, O., & Goktas, Y. (2013). Educational technology research trends in Turkey from 1990 to 2011. Computers & Education , 68 , 42–50. https://doi.org/10.1016/j.compedu.2013.04.016 .

Kuh, G. D. (2001). The National Survey of student engagement: Conceptual framework and overview of psychometric properties . Bloomington: Indiana University Center for Postsecondary Research Retrieved from http://nsse.indiana.edu/2004_annual_report/pdf/2004_conceptual_framework.pdf .

Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development , 50 (6), 683–706. https://doi.org/10.1353/csd.0.0099 .

Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. The Journal of Higher Education , 79 (5), 540–563 Retrieved from http://www.jstor.org.ezproxy.umuc.edu/stable/25144692 .

Kuh, G. D., Kinzie, J., Buckley, J. A., Bridges, B. K., & Hayek, J. C. (2006). What matters to student success: A review of the literature . Washington, DC: National Postsecondary Education Cooperative.

Kupper, L. L., & Hafner, K. B. (1989). How appropriate are popular sample size formulas? The American Statistician , 43 (2), 101–105.

Lai, J. W. M., & Bower, M. (2019). How is the use of technology in education evaluated? A systematic review. Computers & Education , 133 , 27–42. https://doi.org/10.1016/j.compedu.2019.01.010 .

Lawson, M. A., & Lawson, H. A. (2013). New conceptual frameworks for student engagement research, policy, and practice. Review of Educational Research , 83 (3), 432–479. https://doi.org/10.3102/0034654313480891 .

Leach, L., & Zepke, N. (2011). Engaging students in learning: A review of a conceptual organiser. Higher Education Research and Development , 30 (2), 193–204. https://doi.org/10.1080/07294360.2010.509761 .

Li, J., van der Spek, E. D., Feijs, L., Wang, F., & Hu, J. (2017). Augmented reality games for learning: A literature review. In N. Streitz, & P. Markopoulos (Eds.), Lecture Notes in Computer Science. Distributed, Ambient and Pervasive Interactions , (vol. 10291, pp. 612–626). Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-58697-7_46 .

Lim, C. (2004). Engaging learners in online learning environments. TechTrends , 48 (4), 16–23 Retrieved from https://link.springer.com/content/pdf/10.1007%2FBF02763440.pdf .

Lopera Medina, S. (2014). Motivation conditions in a foreign language reading comprehension course offering both a web-based modality and a face-to-face modality (Las condiciones de motivación en un curso de comprensión de lectura en lengua extranjera (LE) ofrecido tanto en la modalidad presencial como en la modalidad a distancia en la web). PROFILE: Issues in Teachers’ Professional Development , 16 (1), 89–104 Retrieved from https://search.proquest.com/docview/1697487398?accountid=12968 .

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: A systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1), 1. https://doi.org/10.1186/s41239-018-0101-6 .

Ma, J., Han, X., Yang, J., & Cheng, J. (2015). Examining the necessary condition for engagement in an online learning environment based on learning analytics approach: The role of the instructor. The Internet and Higher Education , 24 , 26–34. https://doi.org/10.1016/j.iheduc.2014.09.005 .

Mahatmya, D., Lohman, B. J., Matjasko, J. L., & Farb, A. F. (2012). Engagement across developmental periods. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 45–63). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_3 .

Mansouri, A. S., & Piki, A. (2016). An exploration into the impact of blogs on students’ learning: Case studies in postgraduate business education. Innovations in Education and Teaching International , 53 (3), 260–273. https://doi.org/10.1080/14703297.2014.997777 .

Martin, A. J. (2012). Motivation and engagement: Conceptual, operational, and empirical clarity. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 303–311). Boston: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_14 .

McCutcheon, K., Lohan, M., Traynor, M., & Martin, D. (2015). A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education. Journal of Advanced Nursing , 71 (2), 255–270. https://doi.org/10.1111/jan.12509 .

Miake-Lye, I. M., Hempel, S., Shanman, R., & Shekelle, P. G. (2016). What is an evidence map? A systematic review of published evidence maps and their definitions, methods, and products. Systematic Reviews , 5 , 28. https://doi.org/10.1186/s13643-016-0204-x .

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. BMJ (Clinical Research Ed.) , 339 , b2535. https://doi.org/10.1136/bmj.b2535 .

Nelson Laird, T. F., & Kuh, G. D. (2005). Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education , 46 (2), 211–233. https://doi.org/10.1007/s11162-004-1600-y .

Nguyen, L., Barton, S. M., & Nguyen, L. T. (2015). iPads in higher education-hype and hope. British Journal of Educational Technology , 46 (1), 190–203. https://doi.org/10.1111/bjet.12137 .

Nicholas, D., Watkinson, A., Jamali, H. R., Herman, E., Tenopir, C., Volentine, R., … Levine, K. (2015). Peer review: Still king in the digital age. Learned Publishing , 28 (1), 15–21. https://doi.org/10.1087/20150104 .

Nikou, S. A., & Economides, A. A. (2018). Mobile-based assessment: A literature review of publications in major referred journals from 2009 to 2018. Computers & Education , 125 , 101–119. https://doi.org/10.1016/j.compedu.2018.06.006 .

Norris, L., & Coutas, P. (2014). Cinderella’s coach or just another pumpkin? Information communication technologies and the continuing marginalisation of languages in Australian schools. Australian Review of Applied Linguistics , 37 (1), 43–61 Retrieved from http://www.jbe-platform.com/content/journals/10.1075/aral.37.1.03nor .

OECD (2015a). Schooling redesigned. Educational Research and Innovation . OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/schooling-redesigned_9789264245914-en .

OECD (2015b). Students, computers and learning . PISA: OECD Publishing Retrieved from http://www.oecd-ilibrary.org/education/students-computers-and-learning_9789264239555-en .

O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002 .

O’Gorman, E., Salmon, N., & Murphy, C.-A. (2016). Schools as sanctuaries: A systematic review of contextual factors which contribute to student retention in alternative education. International Journal of Inclusive Education , 20 (5), 536–551. https://doi.org/10.1080/13603116.2015.1095251 .

Oliver, B., & Jorre de St Jorre, T. (2018). Graduate attributes for 2020 and beyond: Recommendations for Australian higher education providers. Higher Education Research and Development , 1–16. https://doi.org/10.1080/07294360.2018.1446415 .

O’Mara-Eves, A., Brunton, G., McDaid, D., Kavanagh, J., Oliver, S., & Thomas, J. (2014). Techniques for identifying cross-disciplinary and ‘hard-to-detect’ evidence for systematic review. Research Synthesis Methods , 5 (1), 50–59. https://doi.org/10.1002/jrsm.1094 .

Payne, L. (2017). Student engagement: Three models for its investigation. Journal of Further and Higher Education , 3 (2), 1–17. https://doi.org/10.1080/0309877X.2017.1391186 .

Pekrun, R., & Linnenbrink-Garcia, L. (2012). Academic emotions and student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 259–282). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_12 .

Popenici, S. (2013). Towards a new vision for university governance, pedagogies and student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 23–42). Bingley: Emerald.

Price, L., Richardson, J. T., & Jelfs, A. (2007). Face-to-face versus online tutoring support in distance education. Studies in Higher Education , 32 (1), 1–20.

Quin, D. (2017). Longitudinal and contextual associations between teacher–student relationships and student engagement. Review of Educational Research , 87 (2), 345–387. https://doi.org/10.3102/0034654316669434 .

Rashid, T., & Asghar, H. M. (2016). Technology use, self-directed learning, student engagement and academic performance: Examining the interrelations. Computers in Human Behavior , 63 , 604–612. https://doi.org/10.1016/j.chb.2016.05.084 .

Redecker, C. (2017). European framework for the digital competence of educators . Luxembourg: Office of the European Union.

Redmond, P., Heffernan, A., Abawi, L., Brown, A., & Henderson, R. (2018). An online engagement framework for higher education. Online Learning , 22 (1). https://doi.org/10.24059/olj.v22i1.1175 .

Reeve, J. (2012). A self-determination theory perspective on student engagement. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 149–172). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_7 .

Reeve, J., & Tseng, C.-M. (2011). Agency as a fourth aspect of students’ engagement during learning activities. Contemporary Educational Psychology , 36 (4), 257–267. https://doi.org/10.1016/j.cedpsych.2011.05.002 .

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 3–19). Boston: Springer US Retrieved from http://link.springer.com/10.1007/978-1-4614-2018-7_1 .

Salaber, J. (2014). Facilitating student engagement and collaboration in a large postgraduate course using wiki-based activities. The International Journal of Management Education , 12 (2), 115–126. https://doi.org/10.1016/j.ijme.2014.03.006 .

Schindler, L. A., Burkholder, G. J., Morad, O. A., & Marsh, C. (2017). Computer-based technology and student engagement: A critical review of the literature. International Journal of Educational Technology in Higher Education , 14 (1), 253. https://doi.org/10.1186/s41239-017-0063-0 .

Selwyn, N. (2016). Digital downsides: Exploring university students’ negative engagements with digital technology. Teaching in Higher Education , 21 (8), 1006–1021. https://doi.org/10.1080/13562517.2016.1213229 .

Shonfeld, M., & Ronen, I. (2015). Online learning for students from diverse backgrounds: Learning disability students, excellent students and average students. IAFOR Journal of Education , 3 (2), 13–29.

Skinner, E., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In S. L. Christenson, A. L. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement , (pp. 21–44). Boston: Springer US.

Smidt, E., Bunk, J., McGrory, B., Li, R., & Gatenby, T. (2014). Student attitudes about distance education: Focusing on context and effective practices. IAFOR Journal of Education , 2 (1), 40–64.

Smith, R. (2006). Peer review: A flawed process at the heart of science and journals. Journal of the Royal Society of Medicine , 99 , 178–182.

Smith, T., & Lambert, R. (2014). A systematic review investigating the use of twitter and Facebook in university-based healthcare education. Health Education , 114 (5), 347–366. https://doi.org/10.1108/HE-07-2013-0030 .

Solomonides, I. (2013). A relational and multidimensional model of student engagement. In E. Dunne, & D. Owen (Eds.), The student engagement handbook: Practice in higher education , (1st ed., pp. 43–58). Bingley: Emerald.

Sosa Neira, E. A., Salinas, J., & de Benito, B. (2017). Emerging technologies (ETs) in education: A systematic review of the literature published between 2006 and 2016. International Journal of Emerging Technologies in Learning (IJET) , 12 (05), 128. https://doi.org/10.3991/ijet.v12i05.6939 .

Sullivan, M., & Longnecker, N. (2014). Class blogs as a teaching tool to promote writing and student interaction. Australasian Journal of Educational Technology , 30 (4), 390–401. https://doi.org/10.14742/ajet.322 .

Sun, J. C.-Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology , 43 (2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x .

Szabo, Z., & Schwartz, J. (2011). Learning methods for teacher education: The use of online discussions to improve critical thinking. Technology, Pedagogy and Education , 20 (1), 79–94. https://doi.org/10.1080/1475939x.2010.534866 .

Tamim, R. M., Bernard, R. M., Borokhovski, E., Abrami, P. C., & Schmid, R. F. (2011). What forty years of research says about the impact of technology on learning: A second-order meta-analysis and validation study. Review of Educational Research , 81 (1), 4–28. https://doi.org/10.3102/0034654310393361 .

Trowler, V. (2010). Student engagement literature review . York: The Higher Education Academy Retrieved from website: https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_1.pdf .

Van Rooij, E., Brouwer, J., Fokkens-Bruinsma, M., Jansen, E., Donche, V., & Noyens, D. (2017). A systematic review of factors related to first-year students’ success in Dutch and Flemish higher education. Pedagogische Studien , 94 (5), 360–405 Retrieved from https://repository.uantwerpen.be/docman/irua/cebc4c/149722.pdf .

Vural, O. F. (2013). The impact of a question-embedded video-based learning tool on E-learning. Educational Sciences: Theory and Practice , 13 (2), 1315–1323.

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes . Cambridge: Harvard University Press.

Webb, L., Clough, J., O’Reilly, D., Wilmott, D., & Witham, G. (2017). The utility and impact of information communication technology (ICT) for pre-registration nurse education: A narrative synthesis systematic review. Nurse Education Today , 48 , 160–171. https://doi.org/10.1016/j.nedt.2016.10.007 .

Wekullo, C. S. (2019). International undergraduate student engagement: Implications for higher education administrators. Journal of International Students , 9 (1), 320–337. https://doi.org/10.32674/jis.v9i1.257 .

Wimpenny, K., & Savin-Baden, M. (2013). Alienation, agency and authenticity: A synthesis of the literature on student engagement. Teaching in Higher Education , 18 (3), 311–326. https://doi.org/10.1080/13562517.2012.725223 .

Winstone, N. E., Nash, R. A., Parker, M., & Rowntree, J. (2017). Supporting learners’ agentic engagement with feedback: A systematic review and a taxonomy of recipience processes. Educational Psychologist , 52 (1), 17–37. https://doi.org/10.1080/00461520.2016.1207538 .

Zepke, N. (2014). Student engagement research in higher education: Questioning an academic orthodoxy. Teaching in Higher Education , 19 (6), 697–708. https://doi.org/10.1080/13562517.2014.901956 .

Zepke, N. (2018). Student engagement in neo-liberal times: What is missing? Higher Education Research and Development , 37 (2), 433–446. https://doi.org/10.1080/07294360.2017.1370440 .

Zepke, N., & Leach, L. (2010). Improving student engagement: Ten proposals for action. Active Learning in Higher Education , 11 (3), 167–177. https://doi.org/10.1177/1469787410379680 .

Zhang, A., & Aasheim, C. (2011). Academic success factors: An IT student perspective. Journal of Information Technology Education: Research , 10 , 309–331. https://doi.org/10.28945/1518 .

Acknowledgements

The authors thank the two student assistants who helped during the article retrieval and screening stage.

This research resulted from the ActiveLearn project, funded by the Bundesministerium für Bildung und Forschung (BMBF-German Ministry of Education and Research) [grant number 16DHL1007].

Author information

Authors and affiliations

Faculty of Education and Social Sciences (COER), Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany

Melissa Bond, Svenja Bedenlier & Olaf Zawacki-Richter

Learning Lab, Universität Duisburg-Essen, Essen, Germany

Katja Buntins & Michael Kerres

Contributions

All authors contributed to the design and conceptualisation of the systematic review. MB, KB and SB conducted the systematic review search and data extraction. MB undertook the literature review on student engagement and educational technology and co-wrote the method, results, discussion and conclusion sections. KB designed and executed the sampling strategy, produced all of the graphs and tables, and assisted with the formulation of the article. SB co-wrote the method, results, discussion and conclusion sections and proofread the introduction and literature review sections. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Melissa Bond .

Ethics declarations

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Literature reviews (LR) and systematic reviews (SR) on student engagement

Additional file 2.

Indicators of engagement and disengagement

Additional file 3.

Literature reviews (LR) and systematic reviews (SR) on student engagement and technology in higher education (HE)

Additional file 4.

Educational technology tool typology based on Bower (2016) and educational technology tools used

Additional file 5.

Text-based tool examples by engagement domain

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Bond, M., Buntins, K., Bedenlier, S. et al. Mapping research in student engagement and educational technology in higher education: a systematic evidence map. Int J Educ Technol High Educ 17 , 2 (2020). https://doi.org/10.1186/s41239-019-0176-8

Received : 01 May 2019

Accepted : 17 December 2019

Published : 22 January 2020

DOI : https://doi.org/10.1186/s41239-019-0176-8


  • Educational technology
  • Higher education
  • Systematic review
  • Evidence map
  • Student engagement

Essential questions to enhance student engagement

Stacy Bewley

Imagine the perfect day in your classroom. What comes to mind? Flawlessly executed lesson plans? Well-behaved students? High levels of achievement? Student engagement? Is one of these more important than the others?

While lesson planning, classroom management and mastery of content are significant at every level, I think most teachers would agree that student engagement is the key ingredient to the perfect school day.

But how do we define student engagement? In reality, true student engagement is a multi-layered combination of the student, the teacher, the learning environment and the content. Though consistently achieving high levels of student engagement may seem as elusive as a perfect day, with intentionality, it is within the reach of classroom teachers.

John Hattie of the Visible Learning website and authors Robert Marzano, Debra Pickering and Tammy Heflebower of “The Highly Engaged Classroom” suggest that teachers should examine each lesson through the students’ lens. The authors recommend that teachers consider four questions from the students’ perspective:

  • How do I feel?
  • Am I interested?
  • Is this important?
  • Can I do this?

Planning with these questions in mind increases the likelihood of student engagement in the classroom.

How do I feel? In Visible Learning , Hattie claims that student achievement from previous years has a significant impact on student learning in our classrooms. That makes sense, right? But Hattie continues, “Our job as teachers is to mess this up, by planning ways in which to accelerate the growth of those who start behind.” We can do this, in part, through responsive teaching and adjusting the pacing of instruction.

Our students’ emotional states, motivation levels, interest and other outside stressors all play a part in learning. While we can’t control much of what students bring to the classroom in terms of emotions, we can control the climate of the instructional setting, which plays an important role in how students feel when they enter our classrooms. If we are successful in considering the following, our students are safe to learn, to make mistakes and to grow.

  • Pacing and Transitions – Establish clear, practiced routines and include frequent opportunities to discuss the learning. Conduct frequent formative assessments in order to determine and respond to student needs.
  • Physical Movement – Incorporate physical movement into the day to infuse energy, to create physical representations of the content (e.g., vote with your feet, human graph), and to respond to questions. Movement helps students reset their brains and refocus their attention.
  • Intensity and Enthusiasm – Teachers set the tone in the classroom and in each lesson. Our excitement, energy and positive attitudes (or lack thereof) are contagious.
  • Humor – Funny news clips, headlines, quotes and personal anecdotes are potential material to introduce learning or help students make connections.
  • Building Positive Teacher-Student Relationships and Student-Student Relationships – Establish a safe, respectful classroom climate where students can expect to be accepted and treated fairly.

Am I interested? Students encounter a constant stream of information competing to enter into their sensory, working or permanent memory. Marzano, Pickering and Heflebower write that each memory stage varies in purpose, with information being stored in the sensory stage temporarily, in the working stage a bit longer to give students time to process, and in permanent memory as stored experiences.

Where information is ultimately stored is directly connected to the level of interest. Here are a few tactics to build interest in a lesson.

  • Using games and inconsequential competition – Utilize flexible grouping so that students are not always working with the same peer(s), and ensure that not everything is graded. Sometimes we just want students to engage in the learning without the pressure of a grade.
  • Initiating friendly controversy – Consider using a class vote, debate or town hall to encourage students to take a side and defend a claim. Asking students to assume a different position than their own also can deepen their understanding of information.
  • Presenting unusual information – Use current events, fun facts and guest speakers to pique student interest.
  • Questioning to increase response rates – Call on students randomly so they are poised to participate, increase wait time after questions and utilize simultaneous individual responses (response cards, whiteboards, technology, etc.).
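
The random-questioning tactic above can be expressed as a small routine. This is a minimal sketch, not a prescribed tool, and the roster names are hypothetical: it calls on students in random order without repeats, so every student is called exactly once before anyone is called twice.

```python
import random

def make_cold_caller(roster, seed=None):
    """Return a function that picks a student at random, never
    repeating anyone until the whole roster has had a turn."""
    rng = random.Random(seed)
    pool = []  # students not yet called in the current round

    def next_student():
        nonlocal pool
        if not pool:            # round over: refill and reshuffle
            pool = list(roster)
            rng.shuffle(pool)
        return pool.pop()       # take one student from the shuffled pool

    return next_student

# Usage: over one full round, every student is called exactly once.
caller = make_cold_caller(["Ada", "Ben", "Cara", "Dev"], seed=1)
first_round = {caller() for _ in range(4)}
```

Drawing without replacement keeps the calling both unpredictable (students stay "poised to participate") and equitable, since no one can be skipped for a whole round.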

Is this important? Our students want to know that what we are teaching them is important in their lives. Whether it be a connection to a real-world problem or directly tied to their college and career goals, establishing the importance of the content is a critical piece of the learning experience for students. Some of the ways we can do this are discussed below.

Connecting to Students’ Lives

  • Comparison Tasks – Encourage students to consider physical characteristics, processes, timelines, cause-and-effect relationships, psychological characteristics, fame or notoriety to help them develop deeper understanding of the content. We can incorporate comparison tasks into a variety of content, including language arts (e.g., character and plot analyses), humanities (e.g., musical and artistic styles, period timelines), social studies and science.
  • Connecting to Students’ Life Ambitions – Give students choice in research projects, reading content and assessment to increase ownership in their learning. Perhaps learners could research a career of interest to them, places they might like to visit or live, or the cost of living independently.

Encouraging Application of Knowledge

  • Cognitive Challenges – Present students with opportunities to engage in decision making, problem solving, experimental inquiry and investigation. Directly involving students in activities designed to develop critical thinking results in deeper understanding of the content.
  • Choice – Provide options for tasks, products, goals and behavior. We can often be proactive in maintaining student engagement by allowing students to choose the order in which to complete tasks, providing a menu of options for a completed project (e.g., create a PowerPoint, write a commercial, design a book jacket), letting them contract for a grade based on work completion and/or difficulty of task, and involving them in setting expectations for classroom behavior.
  • Present Real-World Applications – Ask students to generate ideas for real-world problems and provide a structure for the process of creating possible solutions. Engaging students in this way builds their understanding of community resources and protocols and develops critical thinking and problem-solving skills.

Can I do this? This is perhaps the most important of Marzano, Pickering and Heflebower’s questions, for if the answer is “No,” everything else is of little value. It is critical for us as classroom teachers to build the self-efficacy of our students, to teach them to set goals, monitor their own progress, and to identify their strengths and areas of growth.

Hattie defines self-efficacy as the confidence students have in their own abilities to make learning happen. Many of our students will need our help in building this confidence, particularly when developing persistence and overcoming academic or personal obstacles. Teachers can foster self-efficacy skills by incorporating practices which encourage students to reflect on their learning.

  • Tracking student progress – Allowing students to track their data over time in data notebooks and including them in academic and behavioral goal-setting increases their ownership and efficacy. A visual graph of their reading fluency improvement or math fact mastery may be a more powerful tool than verbal feedback or a grade on a paper.
  • Using Effective Verbal Feedback – Provide timely, specific feedback to students, focusing on strengths and areas of growth. Teacher feedback should serve to encourage, guide and challenge students to improve their performance on a given task. If we are to positively impact student achievement through feedback, we have to deliver it promptly, while ideas are fresh and changes can be made.
  • Providing Examples of Self-Efficacy – Sometimes students need to be reminded of their own previous successes in order to persevere through a difficult task or a plateau in their progress. Another way to encourage students is through the personal stories of others’ trials and dedication to attain a goal.
  • Teaching Self-Efficacy – We have the chance to tear down fixed mindsets and reframe negative thoughts by modeling reflection and analysis of our own strengths and opportunities for growth, and providing students with the framework and space to do the same.
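
As a minimal illustration of the data-notebook idea in the first bullet, the sketch below turns a student's weekly reading-fluency scores into the two numbers a student might chart: week-to-week change and overall gain. The scores and field names are hypothetical examples, not a prescribed format.

```python
def fluency_trend(scores):
    """Given weekly words-correct-per-minute scores, return the
    week-to-week change and the overall gain across the period."""
    deltas = [later - earlier for earlier, later in zip(scores, scores[1:])]
    return {"weekly_change": deltas,
            "overall_gain": scores[-1] - scores[0]}

# Usage: four weeks of (hypothetical) fluency checks.
trend = fluency_trend([82, 88, 91, 97])
```

Even this simple summary gives a student concrete evidence of growth to graph, which is exactly the kind of visual progress record the bullet describes as more powerful than a grade on a paper.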

In “Research Proof Points – Better Student Engagement Improves Student Learning,” veteran teacher Kathy Dyer writes that researchers have studied the relationship between engagement and achievement for decades, establishing student engagement as a predictor of achievement, desirable classroom behaviors and graduation rates. The stakes are high. Lack of engagement comes with a high cost – one we simply cannot afford to pay.

Stacy Crawford Bewley is an education recovery specialist with the Kentucky Department of Education. She is a former high school special education teacher with 13 years of classroom experience. Bewley taught English and reading in both the resource and collaborative settings. She is a fellow of the Louisville Writing Project and is completing doctoral coursework at the University of Louisville.

Increasing Student Engagement

The challenges of teaching diverse learners in varying learning contexts put into perspective the importance of student engagement to the learning experience. Consider using the strategies below to help students increase their engagement with learning activities, build confidence in their community of learning, and increase their comprehension of the course material.

Classroom activities should address student fears about learning

Compared to other aspects of college life, the classroom environment is inherently “a riskier one based on intellectual commitment and engagement” (Bauer, 2007), which can be intimidating for many students. A key step to promoting student engagement is recognizing and addressing the fear of failure and judgment by both instructors and peers.

Ask open-ended questions

Questions that ask students to justify an opinion or interpret a reading are more likely to elicit responses even from those who do not know exactly how to define a term or derive a formula because there is no risk of “failing” the question. Because open-ended questions can have multiple correct answers or valid perspectives, they can also generate more interesting discussions. Engagement-based questions can require students to be more diligent in their readings and homework as these questions require a deeper understanding than simply knowing a correct answer. 

You can combine multiple types of questions to both generate discussion and check for student comprehension. For example, consider starting off with a more open-ended question to invite engagement. Then, ask more “fact-finding” follow-up questions to help refine, contextualize, and nuance those responses to ensure students understand the material.

Ask students what they know about a topic before instruction

Background-knowledge probes are useful because they can help instructors decide what to cover in limited time, ensuring that subsequent meetings of the course will better engage students, and can even generate discussion in the moment.

Use more ungraded or credit-upon-completion assignments

Short reflections on class material or participation in classroom discussions can easily be turned into credit-upon-completion components of a course. These types of informal assignments hold students accountable for doing work and can prepare students to think critically in advance of more important graded assessments without presenting a significant intellectual risk for them or a grading burden for instructors. 

Encourage students to take more active roles in collaborative learning and teaching

Many studies underscore the effectiveness of learning techniques that utilize student experts or require students to practice teaching what they learn. These philosophies can be integrated into course activities through a variety of methods.

Incorporate student discussion time into activities

Instead of having students solve an example problem on their own, consider asking students to form small groups or try activities such as think-pair-share to work through it. In addition to boosting engagement, group discussions give students the opportunity to explain to others their reasoning and problem-solving processes, which helps promote metacognition. Small groups work equally well for discussing open-ended questions and problems with explicit solutions. 

Have students model or explain to other students

When students begin to grasp a concept in a difficult lecture for the first time, they may feel like a light bulb has just turned on, bringing clarity to their understanding of a topic. This is a great opportunity to ask these students to explain the concept to the rest of the class and field their peers' questions, with you stepping in only to correct or clarify information.

Build peer review into open-ended assignments

While peer review can increase engagement, students are most receptive when instructors explain the importance and potential benefits of participating in such activities. Take time to establish peer review norms and expectations so that students can trust they will be treated with respect and be more open to feedback. Ask students to account for how and why they incorporated the feedback, and when they did not. Consider how and when you give your own feedback on student work so that it does not unintentionally undercut the peer review process. If your feedback comes after a draft that incorporates peer feedback, you can reinforce the value of that feedback by pointing to places where students integrated it successfully, or places where they should have.

Use activities that provide students with a diverse range of engagement opportunities 

Universal Design for Learning (UDL) is a framework that strives to capture the diversity of student learning preferences and is applicable to any field or subject. Consider the following strategies when designing learning activities to best reach students with a variety of engagement styles.

Offer multiple versions of activities or assignments

Information is only accessible to students when it engages their cognition, so it is essential to give students both autonomy in choosing how to engage with the material and a variety of methods for learning and assessing their skills. Consider drawing on multiple types of sources or modalities when giving lectures, or allowing students the freedom to choose among different types of projects for a final assessment.

Encourage students to reflect upon the learning process

Metacognition is useful for student learning and mastery, as well as for building and sustaining motivation to learn. Consider providing students with feedback on key assignments, as well as creating activities in which students can self-assess using a variety of techniques. The Canvas Commons has an Exit Ticket module with a number of examples you can build into your course.

Emphasize the importance of course objectives in assignments

While all students appreciate understanding the significance or utility of their course material, some especially benefit from continued reinforcement of course objectives to boost engagement. Assignments should allow learners to understand or restate the goal of the activity, and should offer relevant examples of how the information gained can be applied in ways that connect to students' backgrounds and interests.

Research, scholarship of teaching and learning, and online research consulted

  • Improved Student Engagement in Higher Education’s Next Normal   
  • Implementation of a Learning Assistant Program Improves Student Performance on Higher-Order Assessments 
  • The Power of Active Learning During Remote Instruction 
  • Student Perceptions on the Importance of Engagement Strategies in the Online Learning Environment
  • Stepanyan, K., Mather, R., Jones, H., & Lusuardi, C. (2009). Student engagement with peer assessment: A review of pedagogical design and technologies. Lecture Notes in Computer Science, 367–375. doi:10.1007/978-3-642-03426-8_44
  • Bauer, D. M. (2007). Another F word: Failure in the classroom. Pedagogy, 7(2), 157–170.
  • Provide Multiple Means of Engagement

Resources related to student engagement


Staying Engaged: Knowledge and Research Needs in Student Engagement

Ming-Te Wang

School of Education, Learning Research and Development Center, Department of Psychology, University of Pittsburgh

Jessica Degol

School of Education, University of Pittsburgh

In this article, we review knowledge about student engagement and look ahead to the future of study in this area. We begin by describing how researchers in the field define and study student engagement. In particular, we describe the levels, contexts, and dimensions that constitute the measurement of engagement, summarize the contexts that shape engagement and the outcomes that result from it, and articulate person-centered approaches for analyzing engagement. We conclude by addressing limitations to the research and providing recommendations for study. Specifically, we point to the importance of incorporating more work on how learning-related emotions, personality characteristics, prior learning experiences, shared values across contexts, and engagement in nonacademic activities influence individual differences in student engagement. We also stress the need to improve our understanding of the nuances involved in developing engagement over time by incorporating more extensive longitudinal analyses, intervention trials, research on affective neuroscience, and interactions among levels and dimensions of engagement.

Over the past 25 years, student engagement has become prominent in psychology and education because of its potential for addressing problems of student boredom, low achievement, and high dropout rates. When students are engaged with learning, they can focus attention and energy on mastering the task, persist when difficulties arise, build supportive relationships with adults and peers, and connect to their school ( Wang & Eccles, 2012a , 2012b ). Therefore, student engagement is critical for successful learning ( Appleton, Christenson, & Furlong, 2008 ). In this article, we review research on student engagement in school and articulate the key features of student engagement. In addition, we provide recommendations for research on student engagement to address limits to our understanding, apply what we have learned to practice, and focus on aspects that warrant further investigation.

KEY FEATURES OF STUDENT ENGAGEMENT

Engagement is distinct from motivation.

Engagement is a broadly defined construct encompassing a variety of goal-directed behaviors, thoughts, or affective states ( Fredricks, Blumenfeld, & Paris, 2004 ). Although definitions of engagement vary across studies ( Reschly & Christenson, 2012 ), engagement is distinguished from motivation. A common conceptualization, though not universally established, is that engagement is the effort directed toward completing a task, or the action or energy component of motivation ( Appleton et al., 2008 ). For example, motivation has been defined as the psychological processes that underlie the energy, purpose, and durability of activities, while engagement is defined as the outward manifestation of motivation ( Skinner, Kindermann, Connell, & Wellborn, 2009 ). Engagement can take the form of observable behavior (e.g., participation in the learning activity, on-task behavior), or manifest as internal affective (e.g., interest, positive feelings about the task) and cognitive (e.g., metacognition, self-regulated learning) states ( Christenson et al., 2008 ). Therefore, when motivation to pursue a goal or succeed at an academic task is put into action deliberately, the energized result is engagement.

Engagement Is Multilevel

Engagement is a multilevel construct, embedded within several different levels of increasing hierarchy ( Eccles & Wang, 2012 ). Researchers have focused on at least three levels in relation to student engagement ( Skinner & Pitzer, 2012 ). The first level represents student involvement within the school community (e.g., involvement in school activities). The second level narrows the focus to the classroom or subject domain (e.g., how students interact with math teachers and curriculum). The third level examines student engagement in specific learning activities within the classroom, emphasizing the moment-to-moment or situation-to-situation variations in activity and experience.

Engagement Is Multidimensional

Although most researchers agree that student engagement is multidimensional, consensus is lacking over the dimensions that should be distinguished (Fredricks et al., 2004). Most models contain both a behavioral (e.g., active participation within the school) and an emotional (e.g., affective responses to school experiences) component (Finn, 1989). Other researchers have identified cognitive engagement as a third factor that incorporates mental efforts that strengthen learning and performance, such as self-regulated planning and preference for challenge (Connell & Wellborn, 1991; Wang, Willett, & Eccles, 2011). Although not as widely recognized, a fourth dimension, agentic engagement, reflects a student's direct and intentional attempts to enrich the learning process by actively influencing teacher instruction, whereas behavioral, emotional, and cognitive engagement typically represent student reactions to classroom experiences (Reeve & Tseng, 2011). Given the variety of definitions of engagement throughout the field, researchers must specify their dimensions and ensure that their measures align properly with these descriptions of engagement.

Engagement Is Malleable

Student engagement is shaped by context, so it holds potential as a locus for interventions (Wang & Holcombe, 2010). When students have positive learning experiences, supportive relationships with adults and peers, and reaffirmations of their developmental needs in learning contexts, they are more likely to remain actively engaged in school (Wang & Eccles, 2013). Structural features of schools (e.g., class size, school location) have also been credited with creating an educational atmosphere that influences student engagement and achievement. However, structural characteristics may not alter student engagement directly; instead, they may alter classroom processes, which in turn affect engagement (Benner, Graham, & Mistry, 2008).

Several aspects of classroom processes are central to student engagement. For example, engagement is greater in classrooms where tasks are hands-on, challenging, and authentic ( Marks, 2000 ). Teachers who provide clear expectations and instructions, strong guidance during lessons, and constructive feedback have students who are more behaviorally and cognitively engaged ( Jang, Reeve, & Deci, 2010 ). Researchers have also linked high parental expectations to persistence and interest in school ( Spera, 2005 ), and linked high parental involvement to academic success and mental health both directly and indirectly through behavioral and emotional engagement ( Wang & Sheikh-Khalil, 2014 ). Conceptualizing student engagement as a malleable construct enables researchers to identify features of the environment that can be altered to increase student engagement and learning.

Engagement Predicts Student Outcomes

Student engagement is a strong predictor of educational outcomes. Students with higher behavioral and cognitive engagement have higher grades and aspire to higher education ( Wang & Eccles, 2012a ). Emotional engagement is also correlated positively with academic performance ( Stewart, 2008 ). Student engagement also operates as a mediator between supportive school contexts and academic achievement and school completion ( Wang & Holcombe, 2010 ). Therefore, increasing student engagement is a critical aspect of many intervention efforts aimed at reducing school dropout rates ( Archambault, Janosz, Morizot, & Pagani, 2009 ; Christenson & Reschly, 2010 ; Wang & Fredricks, 2014 ). Moreover, engagement is linked to other facets of child development. Youth with more positive trajectories of behavioral and emotional engagement are less depressed and less likely to be involved in delinquency and substance abuse ( Li & Lerner, 2011 ). School disengagement has been linked to negative indicators of youth development, including higher rates of substance use, problem behaviors, and delinquency ( Henry, Knight, & Thornberry, 2012 ). Some of these associations may actually be reciprocal, so that high engagement may lead to greater academic success, and greater academic success may then lead to even greater academic engagement ( Hughes, Luo, Kwok, & Loyd, 2008 ).

Engagement Comes in Qualitatively Different Patterns

Using person-centered approaches to study engagement advances our understanding of student variation in multivariate engagement profiles and the differential impact of these profiles on child development. One study ( Wang & Peck, 2013 ) used latent profile analysis to classify students into five groups of varying patterns of behavioral, emotional, and cognitive engagement, which were associated differentially with educational and psychological functioning. For example, a group of emotionally disengaged youth was identified (high behavioral and cognitive engagement, but low emotional engagement) with grade point averages and dropout rates comparable to those of the highly engaged group of youth (high on all three dimensions). However, despite their academic success, the emotionally disengaged students had a greater risk of poor mental health, reporting higher rates of symptoms of depression than any other group. Furthermore, growth mixture modeling analysis with a combined measure of behavioral, cognitive, and emotional engagement showed that unlike most individuals who experienced high to moderately stable trajectories of engagement throughout adolescence, many students experienced linear or nonlinear growth or declines ( Janosz, Archambault, Morizot, & Pagani, 2008 ). Students with unstable patterns of engagement were more likely to drop out. These developmental patterns and profiles cannot be detected by variable-centered approaches that focus on population means and overlook heterogeneity across groups. As person-centered research becomes more common, targeted intervention programs should be more effective at serving unique subgroups of students with specific developmental needs.
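
As a toy illustration of the person-centered logic above: the paper's analyses use latent profile analysis and growth mixture modeling, but the sketch below substitutes a simple two-cluster k-means run on synthetic (behavioral, emotional, cognitive) scores. The groups, score scale, and all numbers are invented for the example, not taken from the studies cited; the point is only that a multivariate, person-centered method can recover an "emotionally disengaged" profile that a single averaged engagement score would hide.

```python
import random

random.seed(0)

# Toy data: each student is a (behavioral, emotional, cognitive) score on a
# 1-5 scale. One synthetic group is engaged on all three dimensions; the
# other matches the "emotionally disengaged" profile (high behavioral and
# cognitive engagement but low emotional engagement) described above.
def sample_group(center, n):
    return [tuple(c + random.gauss(0, 0.3) for c in center) for _ in range(n)]

students = sample_group((4.5, 4.5, 4.5), 50) + sample_group((4.5, 1.5, 4.5), 50)

def two_profiles(points, iters=10):
    """Two-cluster k-means: a crude stand-in for latent profile analysis.

    Seeded deterministically with the extreme points along the emotional
    dimension, so centers[0] tracks the low-emotional cluster.
    """
    centers = [min(points, key=lambda p: p[1]), max(points, key=lambda p: p[1])]
    for _ in range(iters):
        clusters = ([], [])
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            clusters[d.index(min(d))].append(p)
        # Move each center to its cluster mean (keep old center if empty).
        centers = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

low, high = two_profiles(students)
print("emotionally disengaged profile ~", tuple(round(x, 1) for x in low))
print("fully engaged profile          ~", tuple(round(x, 1) for x in high))
```

With well-separated synthetic groups, the recovered cluster means approximate the two planted profiles. A real latent profile analysis additionally estimates membership probabilities and fit statistics for choosing the number of profiles, which is what lets studies like Wang and Peck (2013) justify five groups rather than two.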

Disengagement Is More Than the Lack of Engagement

One of the inconsistencies in the research is whether we should distinguish engagement from disengagement and measure these constructs on the same continuum or as separate continua. Most studies consider engagement as the opposite of disengagement, with lower levels of engagement indicating more disengagement. However, some researchers have begun to view disengagement not simply as the absence of engagement but as a separate and distinct psychological process that makes unique contributions to academic outcomes (Jimerson, Campos, & Greif, 2003). For example, behavioral and emotional indicators of engagement (e.g., effort, interest, persistence) and disaffection (e.g., withdrawal, boredom, frustration) can be treated as separate constructs, indicating that although similar, engagement and disaffection do not overlap completely (Skinner, Furrer, Marchand, & Kindermann, 2008). Researchers should incorporate separate measures of engagement and disengagement into their work to determine the unique contributions of each construct to academic, behavioral, and psychological outcomes.

LOOKING AHEAD

Although we know much from research on student engagement, a number of areas require clarification and expansion.

Affective Arousal and Engagement

Emotions in educational contexts can enhance or impede learning by shaping the motivational and cognitive strategies that individuals use when faced with a new challenge. Negative emotions such as anxiety may interfere with performing a task by reducing the working memory, energy, and attention directed at completing the task, whereas positive emotions such as enjoyment, hope, and pride may increase performance by focusing attention on the task and promoting adaptive coping strategies ( Pekrun, Goetz, Titz, & Perry, 2002 ; Reschly, Huebner, Appleton, & Antaramian, 2008 ). However, much of the work on emotions and engagement focuses on general dispositions toward the learning environment, such as measuring interest in or valuing of school ( Stewart, 2008 ). Far less is known about how students’ actual emotions or affective states during specific learning activities influence their academic engagement and achievement ( Linnenbrink-Garcia & Pekrun, 2011 ). Researchers rarely measure how emotions relate to subsequent engagement, relying predominantly on retrospective student self-reports to measure affective states. Useful supplements to students’ reports would be psychophysiological indicators of emotional distress (e.g., facial expression, heart rate) and experience sampling methods to assess situational emotional states during classroom activities.

With the advancement of brain imaging technology, neuroimaging studies show that affective states during learning are important in determining how efficiently the brain processes new information ( Schwabe & Wolf, 2012 ). Although neuroimaging cannot be used to measure classroom engagement in real time, neuroscience techniques are valuable tools that may advance our understanding of how emotional experiences shape neural processing of information and affect engagement during a task. For example, do prolonged states of boredom in the classroom actually alter the shape and functionality of the brain over time, and can we intervene in these processes to reverse the negative effects of boredom or apathy? We also need a more thorough understanding of how genetic predispositions and environmental conditions interact to alter brain chemistry. Studies should identify precursors to or triggers for negative affective experiences, and identify environmental supports that can eliminate these negative emotions, foster adaptive coping strategies, and increase learning engagement and performance.

Interactions Among Levels

Engagement is represented at many hierarchical levels in the educational environment (e.g., school, classroom, momentary level). However, researchers rarely frame their conceptualizations and assessments of engagement in terms of a hierarchical system or process, so we lack understanding about how student engagement at these various levels interacts to influence performance. Learning is a continuous developmental process, not an instantaneous event, and engagement is the energy that directs mental, behavioral, and psychological faculties to the learning process. By focusing on only one level of engagement, we understand little about the process through which engagement is formed and leads ultimately to academic achievement.

Are there reciprocal interrelations between more immediate states of engagement and broader representations, such that moment-to-moment engagement within the classroom informs feelings and behaviors toward the school as a whole, which then trickle down to influence momentary classroom engagement through a continuous feedback loop? Are these levels additive or multiplicative, such that higher engagement across the board is associated with better academic outcomes than high engagement at only one or two levels? Or does engagement at one level compensate for lower engagement at another level, demonstrating that high engagement across all levels is not necessary for optimal functioning? Broadening the focus of research to incorporate engagement at many micro and macro levels of the educational context would advance our understanding of how different levels develop and interact to shape student engagement, and the differential pathways that lead to academic success.

Development of Many Dimensions

Despite the consensus over the multidimensionality of student engagement, the role that each dimension plays in shaping academic outcomes remains unclear (Skinner et al., 2008). Three avenues warrant exploration: (a) independent relations among the dimensions, (b) emotional engagement as a driver of behavioral and cognitive engagement, and (c) reciprocal relations among the dimensions.

Independent relations suggest that each dimension of engagement makes unique contributions to student functioning. In other words, high behavioral engagement cannot compensate for the effects of low emotional engagement, given that both shape student outcomes independently.

The second avenue posits that emotional engagement could be a prerequisite for behavioral and cognitive engagement. According to this viewpoint, students who enjoy learning should participate in classroom activities more often and take more ownership over their learning. Emotional engagement sets the stage for developing cognitive and behavioral processes of student engagement.

The third possibility suggests bidirectional relations among the organizational constructs of engagement, with each dimension influencing the others cyclically. For example, enjoyment of learning or high emotional engagement may lead to greater use of self-regulated learning strategies or cognitive engagement and greater behavioral engagement within the classroom. This increased behavioral participation and use of cognitive strategies to improve performance may elicit positive feedback from classmates and teachers, further increasing enjoyment of learning, and so on. With reciprocal relations, each process reinforces and feeds into the others. For researchers to understand the developmental progression of engagement over time, they should tease apart the unique versus compounded effects of each dimension of engagement.

Longitudinal Research Across Developmental Periods

Some research on how student engagement unfolds and changes over time has shown average declines in various indicators of engagement throughout adolescence and in the transition to secondary school ( Wang & Eccles, 2012a , 2012b ), but other studies have shown heterogeneity in engagement patterns across subgroups of individuals ( Archambault et al., 2009 ; Janosz et al., 2008 ; Li & Lerner, 2011 ). However, we know little about developmental trajectories of engagement spanning early childhood to late adolescence. Many studies track engagement only in early adolescence across a span of 3 or 4 years. Because the ability to become a self-regulated learner, set goals, and monitor progress advances as children mature and become active agents in their own learning, student engagement may take different forms in elementary school than it does in subsequent years ( Fredricks et al., 2004 ). Researchers should investigate how younger versus older students think of engagement, how engagement changes across developmental periods, and whether sociocultural and psychological factors differentially shape engagement at the elementary and secondary levels.

Students’ Prior Learning Experiences

Researchers should also explore the role of students’ previous learning experiences in shaping engagement. When students are confronted with new academic challenges, the emotions and cognitions attached to previous experiences should influence how they adjust or cope with these challenges. In particular, engagement and academic achievement decline during school transitions (e.g., elementary to middle school, middle school to high school), which can be stressful experiences for many students (Eccles et al., 1993; Pekrun, 2006). Students with prior experiences of failure in school may be especially vulnerable to the alienating effects of school transitions. How do we interrupt students’ negative feelings about schoolwork and reengage them in their education? How do we maintain positive and engaging experiences for students through every grade level and every transition? Using students’ prior learning experiences to break the cycle of disengagement and strengthen the cycle of continuous interest and engagement could inform interventions, particularly during crucial transitory periods when students are most vulnerable to feelings of isolation, boredom, or alienation.

Intervention

Despite the malleability of student engagement and the connection between developmental contexts and engagement, very few theory- and evidence-based preventative programs have been developed, implemented, and tested on a large scale. A few interventions have increased student engagement. For example, Check & Connect, an evidence-based intervention program, has reduced rates of dropout and truancy, particularly for students at high risk of school failure ( Reschly & Christenson, 2012 ). Randomized control trials of schoolwide positive behavioral support programs have also improved student engagement and achievement, reducing discipline referrals and suspensions ( Horner et al., 2009 ; Ward & Gersten, 2013 ). However, many programs are small, intensive interventions that have not been implemented on a larger scale, raising concerns about implementation fidelity and reduced effectiveness. Many interventions also rely on one dose of services and track developmental changes over a short period, making it difficult to infer long-term benefits.

We need to develop comprehensive programs that adapt to the unique needs of individuals receiving services. Preventative programs often rely on one-size-fits-all models, so subgroups of students may not be served properly. Although universal interventions are beneficial for students in general, targeted programs might be more effective for students at greater risk of academic or psychological problems. Therefore, interventions should be implemented at many levels, incorporating a universal program for students in general and more selected services for at-risk students.

Engagement Across Contexts

We should also explore the relative alignment of educational messages, values, and goals across contexts and how this compatibility influences student engagement. Teachers, parents, and peers are not always in tune with each other over educational values, and these conflicting messages may impair how students engage fully with school. For example, parents might endorse educational excellence as a priority, whereas peers may endorse academic apathy. In these situations, students may have to set aside their personal values and pursue or coordinate the values of others, or try to integrate their personal values with the values of the other group. Students’ ability to coordinate the messages, goals, and values from different agents in their social circles will also determine how they see themselves as learners.

We lack studies on how students reconcile inconsistencies in these messages across groups and how those inconsistencies affect their engagement. If peer groups promote antiachievement goals that directly conflict with the educational ideals transmitted by parents, will students conform to peer norms or seek out friends whose achievement values are more aligned with those endorsed by their families? Is misalignment of educational goals across social contexts a risk factor for school dropout, particularly among students from disadvantaged backgrounds? Researchers need to address this area to help students cope with the inconsistent messages about education in their social circles and to consolidate a stronger academic identity.

Student Character and Engagement

Although researchers have examined how contextual, sociocultural, and motivational factors influence student engagement, the influence of student character or personality factors is less well understood. Research on the Big Five personality traits has found conscientiousness, an indicator of perseverance, to be the most consistent predictor of academic achievement ( Poropat, 2009 ).

Persistence has been examined through grit, a characteristic that entails working passionately and laboriously to achieve a long-term goal and persisting despite challenges, setbacks, or failures (Duckworth, Peterson, Matthews, & Kelly, 2007). Individuals with grit are more likely to exert effort to prepare and practice to achieve their goals, leading them to be more successful than individuals who use less effortful strategies (Duckworth, Kirby, Tsukayama, Berstein, & Ericsson, 2011).

Nevertheless, we know little about how personality traits might interact with environmental contexts to shape student engagement. Additionally, researchers have yet to examine how profiles of personality traits might interact with each other to influence student engagement. More nuanced research in these areas will aid in the development of learning strategies and educational contexts that may yield the most successful outcomes for various personality types.

Beyond Academic Engagement

Research on student engagement has focused on academic engagement or academic-related activities. Although academic experiences are critical determinants of educational success, school is also a place where students socialize with their friends and engage in nonacademic activities. Focusing exclusively on academic engagement neglects the school’s role as a developmental context in which students engage in a wide range of academic, social, and extracurricular activities that shape their identities as academically capable, socially integrated individuals who are committed to learning. For example, students who struggle with academic learning but are athletic may experience more engagement on the football field than in the classroom. Through participating in these types of nonacademic social activities, students build skills and learn life lessons such as collaborating as a team and becoming a leader. Thus, students’ schooling experiences should involve many forms of engagement, including academic, social, and extracurricular engagement. More research is needed to integrate these forms of engagement in school and examine how they interact to influence students’ academic and socioemotional well-being collectively.

Since its conception more than two decades ago, research on student engagement has permeated the fields of psychology and education. Over this period, we have learned much about engagement. We know that engagement can be measured as a multidimensional construct, including both observable and unobservable phenomena. We have come to appreciate the importance of engagement in preventing dropout and promoting academic success. We also understand that engagement is responsive to variations in classroom and family characteristics.

But in spite of the accrued knowledge on engagement, we have barely scratched the surface in understanding how engagement and disengagement can affect academic development, and how engagement unfolds over time by tracking interactions across contexts, dimensions, and levels. We also cannot dismiss the personal traits and affective states that students bring to the classroom, which may influence engagement regardless of the supportive nature of the environment. We lack knowledge about the extent to which large-scale interventions can produce long-term improvements in engagement across diverse groups. As we move forward with engagement research, we must apply what we have learned and focus on aspects that warrant further exploration. The insight this research provides will allow educators to create supportive learning environments in which diverse groups of students not only stay engaged but also experience the academic learning and success that is a byproduct of continuous engagement.

Acknowledgments

This project was supported by Grant DRL1315943 from the National Science Foundation and Grant DA034151-02 from the National Institute on Drug Abuse at the National Institutes of Health to Ming-Te Wang.

Contributor Information

Ming-Te Wang, School of Education, Learning Research and Development Center, Department of Psychology, University of Pittsburgh.

Jessica Degol, School of Education, University of Pittsburgh.


Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Nine Strategies for Promoting Student Engagement


  • Share article

(This is the third post in a four-part series. You can see Part One here and Part Two here.)

The question-of-the-week is:

Some research suggests that as students get older, their engagement with school tends to decrease. How can schools combat this trend?

Part One’s contributors were Janice Wyatt-Ross, Dr. PJ Caposey, Michelle Shory, Irina McGrath, and Matt Renwick. Janice was also a guest on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

In Part Two , Scott Bayer, Amanda Lescas, Ryan Huels, and Joy Hamm shared their thoughts.

Today, Tonia Gibson, Katie Shenk, Libby Woodfin, Jayson W. Richardson, and Luiza Mureseanu offer their observations.

‘At the core are student/teacher relationships’

Tonia Gibson, a managing consultant at McREL International, is a former Australian teacher and school leader. At McREL, she focuses on helping schools use an inside-out, curiosity-based approach to develop sustainable and continuous improvement:

To truly answer this question, we need to understand what student engagement is. For me, student engagement is closely linked to motivation; both intrinsic and extrinsic factors play a role in how interested students are at school.

We also know that student motivation tends to decrease as students move through and up grade levels. Most students in 2nd grade appear happy to be at school and in most classrooms look to be highly engaged in learning, whereas the scene in some 10th grade classrooms might look very different, with students staring at their phones or listening to music through hidden ear pods. A Gallup survey of some 500,000 students found that roughly 8 in 10 elementary students felt “engaged” in school—that is, attentive, curious, and optimistic about their learning—yet by high school, that number had plummeted to just 4 in 10.

Elementary teachers are able to use a myriad of strategies that are almost guaranteed to engage students in learning. Short demonstrations in science or even a brainteaser can spark a child’s curiosity about a topic. Most K–5 students willingly and openly ask questions of their teachers and each other when they need further clarification or help.

Middle school teachers will probably paint a very different picture. Student engagement appears to decrease due to a number of observable factors, including the shift in teacher/student relationships, increased academic demands, and, not to be dismissed, the biological and social changes that students are navigating.

Most high school teachers will tell you that student engagement at this level can (mostly) be linked to a student’s personal motivations about their future life away from or after high school. Students might be motivated to get good grades so that they can enter higher education, or they may be aiming for a technical college or career and choose their pathway accordingly. But then there are also students who seem disengaged at school for reasons that are harder to address, like homelessness, generational poverty, or lack of self-belief or confidence that they can succeed.

In light of all of this, what teachers really need to master is knowing and understanding what motivates their students—in and out of school. Knowing your students and their interests can help teachers become more intentional when planning for learning. Pairing that knowledge with a knowledge of which research-supported, high-impact strategies will be most effective with your students can help teachers create classroom environments where students display increased interest, motivation, and engagement in learning.

At the core are student/teacher relationships. Engaging students is not about being entertaining, providing loads of experiments that blow up, or having a colorful slide show to appeal to students who love screen time. Great teachers know that before their students can learn, they need to build positive relationships and shared understandings about what the purpose of school (or their class) is, and work with students to set realistic personal goals that help them develop and internalize their purpose for being at school.

Once these shared expectations and understandings have been established, teachers should be savvy in ensuring they provide purposeful, transparent, and engaging learning opportunities for students. Some excellent practices teachers should consider are:

Ensure clarity and purpose for student learning. For students to be engaged and motivated to learn, they need to know what they are learning, why it is important, and how to be successful. Teachers should be mindful to ensure that their narrative around the “why” is crafted to connect with the students in front of them. If the purpose for the learning is merely grades, you might lose some students. But if it’s about getting a good grade and understanding and applying the learning to a real-world problem, then you’re engaging the whole class.

Create challenging learning tasks. Challenging learning tasks aren’t just lists of “hard” problems or the teacher setting high expectations. Learning tasks and activities that scaffold student learning, connect to the real world and/or the students’ experiences, and challenge students to use what they know to solve a problem or apply their knowledge to a new task will always be more engaging than assigning pages and pages of practice problems to prepare for a test.

Set up classroom structures that allow students to learn with and from each other. Learning is primarily a social activity. Babies learn to talk by listening to and mimicking the people around them, while young children learn how to hit a baseball or skate by watching and getting feedback from others. Students should have the same opportunities at school. Students at your school may come from backgrounds where storytelling is a powerful way to learn. Or using a digital platform to create virtual learning communities where students can chat and share ideas online may be a way to encourage collaboration outside of our regular turn-and-talk classroom routines.


‘The Four Ts’

Katie Shenk is a lead curriculum designer for EL Education. Libby Woodfin is the director of publications for EL Education and an author of Learning That Lasts: Challenging, Engaging and Empowering Students with Deeper Instruction :

Learning is naturally engaging. When students begin kindergarten, when they learn to read and write, when numbers fall into place for them for the first time, it is exciting. Learning is fun!

But what happens after the primary grades? Students hang onto their joy for a while, but for many, it starts to slip as 4th or 5th grade rolls around. And middle school is a notoriously joyless time for far too many students. It doesn’t mean that all kids end up hating school (though some will), but the source of students’ engagement with school often changes.

Sometimes “achievement” in a traditional sense—grades, accolades from teachers or family members—is what engages students. Sometimes it’s social interactions. But some students will struggle in school and, as a result, are likely to disengage on some level. This may happen for students because learning is hard and their teachers have not found the right way to meet their needs, or because they feel that they don’t belong, that school is not a place where they fit in or where they can succeed.

The problem for educators to solve as students get older is to ensure that school remains a place of learning, not just of schooling. Students may go through the paces of school—doing their homework, answering questions in class—but that’s not necessarily learning, and students may become more and more disengaged as they experience more schooling and less authentic learning. Students (even middle schoolers!) can still engage deeply with school, but they need authentic opportunities to learn deeply.

Designing Curriculum with the Four Ts

We use a simple framework called the Four Ts to consider how to combine topics, tasks, targets, and texts in a way that will truly engage students with their learning.

  • The TOPIC teaches standards through real-world issues, original research, primary-source documents, and the opportunity to engage with the community, and it lends itself to the creation of authentic tasks/products.
  • The TASK gives students the opportunity to address authentic needs and an authentic audience related to the topic.
  • Standards-aligned learning TARGETS are contextualized to the topic; they prepare students for and guide the task and ensure proper, deep analysis of the text.
  • A worthy TEXT is chosen judiciously to ensure that it will help students build world knowledge, master specific standards, and learn about the topic.

Let’s look at a few examples of the Four Ts in action. Note that one of the key themes in these examples is that students are engaged in purposeful work, and, because of that, they are motivated to dig into complex and rigorous learning. They are engaged.

(We urge you to watch the accompanying videos, which tell each story in detail.)

Living History: 4th and 5th Grades

This video tells the story of 4th and 5th grade students at Silverton School in Silverton, Colo., who engaged in a semester-long study of local Chinese American history at the turn of the 20th century. Students created a new exhibit at their local history museum and rebuilt a local Chinese garden. This is a powerful example of students learning history “beyond the textbook” and putting their learning to use to make their community better. Students were engaged deeply because their learning had purpose and, as one student shared, “we had to learn with our heads and our hearts.”

Their teacher expertly wove together the Four Ts to design a compelling project from start to finish. Her framing of the topic as a “history mystery” piqued students’ curiosity out of the gate, and her innovative use of texts included epitaphs from the local cemetery, museum gallery texts, and historical newspapers. Key literacy standards related to reading and synthesizing informational texts, writing informational texts, and speaking with and listening to experts and museum guests anchored the learning targets. The culminating task—the museum exhibit and garden—honored the history of the local community.

Community Faces: 6th Grade

This video features 6th graders from Interdistrict School for Arts and Communication in New London, Conn., working to break stereotypes associated with the label “immigrant” by telling the human story of immigration. Earning widespread local and national media coverage, these students produced a beautiful book filled with original photography and stories from people in their community who immigrated to the United States. Students spoke at the state Capitol and toured the Northeast with an exhibit of their learning. These students were motivated to learn deeply and do their best because they were working on behalf of immigrants in their community and presenting their work to multiple audiences.

This project is a beautiful example of the Four Ts in action. The topic engaged students in a real-world, contemporary issue that impacted their community and allowed them to do authentic, primary research with community members. Engagement in the task was high as students interviewed local immigrants and learned photography and interviewing skills from experts. Learning targets were standards-aligned and interdisciplinary, and the project incorporated work from all subject areas, including math. Students were eager to dig into complex texts, including primary texts, in order to prepare for their interviews and produce high-quality work for an authentic audience.

The Successes, Challenges, and Possibilities of Policing in the United States: 12th Grade

This video features seniors at Codman Academy in Boston preparing to write a research paper analyzing a critical component of policing in America. Their preparation involves reading a series of case studies and primary-source texts about policing practices in a variety of communities around the U.S., as well as structured academic discussions. Their primary texts are The New Jim Crow by Michelle Alexander and the U.S. Department of Justice’s “Investigation of the Ferguson Police Department.” The beauty of the Four Ts in this video is the way that they weave together to give students access to a challenging text; those access points serve to engage students deeply in their learning.

The topic of policing in America is one that impacts the lives of the students in this class every day. This compelling topic, combined with a powerful primary-source text (the Ferguson report), inspires students to read and think critically. The task is a research paper, which is scaffolded by a series of academic discussions in which students grapple with their readings and analysis. Because the text is incredibly dense, the teacher has designed lessons to ensure that students will be able to access it, make meaning from it, and learn from it. She has also designed tools for the students, such as note-catchers, that require them to capture evidence from the text, which they can use in their analysis. This is a key college- and career-ready skill aligned to standards and learning targets.


Measuring the wrong things

Jayson W. Richardson is a professor at the University of Denver and the chair of the Department of Educational Leadership and Policy Studies in the Morgridge College of Education. He has written over 100 scholarly articles, book chapters, and books focusing on technology, leadership, and change, including a new book on Bringing Innovative Practices to Your School:

School leaders often worry about issues such as discipline, dropout rates, achievement, and academic progression. These are often viewed as the traditional measures of a student’s (and school’s) success. But these measures do not get at what might be the most problematic issue. These forward-facing measures are simply indicative of a deeper issue: student engagement. As such, efforts to address student engagement in P-12 schools are often blurred by competing accountability measures.

Gallup (2016) conducted a survey of nearly 1 million students across the United States. They found that in 5th grade, 74 percent of students reported being engaged in school. By 12th grade, only 34 percent of students reported being engaged in school. These are alarming numbers: by the time students leave our schools, nearly 7 out of 10 are generally disengaged from the learning process.

Looking deeper at the Gallup (2016) data, it is clear that the lack of engagement is indeed dire. Sixty percent of 5th graders reported learning something interesting in school in the past seven days; that level drops to 33 percent by 12th grade. When students were asked if they have fun at school, 47 percent of 5th graders and just 20 percent of 12th graders said yes. From these data, it is clear that more than half of our 5th graders are not having fun at school and 4 out of 10 are not even learning something interesting in school! By the time a student is ready to tackle the world, she reports not learning anything interesting 67 percent of the time and not having any fun 80 percent of the time.

It is the old adage of “the squeaky wheel gets the grease.” Engagement is often harder to measure, so we measure academics, we measure dropouts, we measure attendance, we measure discipline, and so on. But deficiencies in these measures are likely a result of the majority of students being disengaged from their education. Nevertheless, recent research makes it clear that student engagement is significantly and positively linked to achievement, discipline, and behavior. It is time we refocus our efforts on engaging students in learning.

COVID-19 has brought with it an array of schooling challenges around organizational change. We cannot overlook that. Some challenges are structural (e.g., devices, learning platforms, and internet access), some involve human resources (e.g., teacher training), some are political (e.g., state policies and local technology-use policies), and others are symbolic (e.g., messaging from leaders about emergency remote learning). As leaders of schools in uncertain times where there is no normal, we must focus on the bigger picture. We must resist the urge to fix problems without addressing the core issue: increasing student engagement. By putting students first, we will likely find innovative ways to “educate” students that we have never thought of before. In this time of pandemic, let’s invent more engaging student experiences that might propel wholesale rethinking of what schooling can be.


Supporting young adults

Luiza Mureseanu is an instructional resource teacher, K-12, for ESL/ELD programs, in Peel DSB, Ontario, with over 17 years of teaching middle and high school students in Canada and Romania. She believes that all English-learners will be successful in schools that cultivate culturally and linguistically responsive practices:

Schools need to take a different approach to providing instructional support for older (18+) students, using experiential learning and an age-relevant curriculum. Older students naturally have a different level of interest in attending school, and their level of performance changes because their life priorities change.

Often, they are breadwinners for the household; holding multiple jobs or taking care of younger siblings forces them to slow down in their schooling. In fact, family circumstances or a complicated history of immigration often means they need to spend more time in school to finish their credits. As a secondary school teacher, I know when my students are late to class or tired because of jobs or family commitments.

It is true that research indicates a correlation between getting older and decreasing school engagement, but there is no similar correlation between getting older and losing interest in finishing school. In fact, when asked, older students indicate their desire to get their diploma. Therefore, schools need flexible programs to accommodate these learners, including modified timetables and relevant courses.

For example, schools with larger clusters of older learners could offer an alternative start time in the morning. The model is not entirely new: some high schools with large hospitality and tech programs already have that option, and it works best for students who have a co-op course, a dual credit, or a workplace component on their timetable.

Another important step for schools is to provide courses that are of interest to older students—particularly elective courses. For example, financial-literacy, family-economics, or career courses need to be available for this group. These students should also have the opportunity to include some form of workplace component in their education and to get involved in projects with community or local businesses.

Schools definitely need to provide more specific supports for this group. Students can attend high school in Ontario, Canada, until they are 21, yet in reality only a few students remain in a regular day school after the age of 18. They either go to a continuing education program, sometimes later in life, or drop out of school. Some groups are particularly affected: ELLs with a refugee background, students living below the poverty line, and students with exceptionalities. School districts have data showing this trend, and they must prioritize the needs of this group.


Thanks to Tonia, Katie, Libby, Jayson, and Luiza for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected] . When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo .

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching .

Just a reminder: you can subscribe and receive updates from this blog via email. (The RSS feed for this blog, and for all Ed Week articles, has been changed by the new redesign—new ones won’t be available until late January.) And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

This Year’s Most Popular Q&A Posts

Race & Racism in Schools

School Closures & the Coronavirus Crisis

Classroom-Management Advice

Best Ways to Begin the School Year

Best Ways to End the School Year

Student Motivation & Social-Emotional Learning

Implementing the Common Core

Facing Gender Challenges in Education

Teaching Social Studies

Cooperative & Collaborative Learning

Using Tech in the Classroom

Student Voices

Parent Engagement in Schools

Teaching English-Language Learners

Reading Instruction

Writing Instruction

Education Policy Issues

Differentiating Instruction

Math Instruction

Science Instruction

Advice for New Teachers

Author Interviews

Entering the Teaching Profession

The Inclusive Classroom

Learning & the Brain

Administrator Leadership

Teacher Leadership

Relationships in Schools

Professional Development

Instructional Strategies

Best of Classroom Q&A

Professional Collaboration

Classroom Organization

Mistakes in Education

Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


Student engagement: What is it and why does it matter?


Teaching Insights shares staff and student voices, asking and answering questions, and showcasing examples of effective teaching and learning practices. How do our conversations and authors’ contributions align with the scholarly debate on improving student engagement?

On a warm afternoon, all working from home and still with little prospect of returning to the campus for our work, educational developers from Oxford Brookes University came together on Zoom to ‘think aloud’ about the ways they, and the teaching staff they work with, could improve student engagement. Everyone had read the contributions that form the October 2021 issue of Teaching Insights. There was a buzz as we joined the meeting and we started by discussing just how important conversation is in supporting the development of teaching practice. We noted how meaningful teaching conversations between peers are (Roxå & Mårtensson, 2009) as an informal yet vital component of the professional development of higher education teachers. We were in agreement too with Chris Headleand (2021) whose recent blog post for The Campus discussed the need for colleagues to share their interpretations of student engagement with each other, considering how much variability there is between individual conceptions of engagement. Dismore, Turner and Huang (2018) also noted that academic staff new to teaching perceive student engagement differently from more experienced colleagues and explored new lecturers’ practices to enable engagement—more on that later.

Student engagement as contested and multi-dimensional 

Chris Headleand wasn’t the first author to point to the confusion around the meaning and use of the term student engagement. Buckley (2018) describes other writers’ and educational researchers’ concerns and concludes that student engagement is ‘enigmatic’ (p. 721). Commonly, across the higher education and compulsory education sectors, student engagement is described as multidimensional or as a meta-construct. Authors that discuss student engagement in this way write about student engagement in terms of a student’s active participation in learning activities. This, in turn, is linked to a number of things: teaching practices, well-designed curricula and learning environments, and the wider ambitions of either students or their universities for personally transformative or socially-impactful education experiences. These are the meanings that this issue’s various conversationalists and authors held of student engagement. The group recalled the Alumni Reunion conversation, in which Professor Helen Walkington held a firm conception of student engagement as comprising three elements: the cognitive, emotional and professional development of a student. More commonly in the literature, the element of behavioural engagement is referenced in place of professional development engagement. Another dimension to student engagement in higher education that merits noting, even though it is not the focus of this Teaching Insights issue, relates to student participation in university decision-making, including governance, representation and co-production. Ashwin and McVitty’s (2015) article highlights how important it is to define the object in focus for any discussion of student engagement and that certainly has been the starting point—and sometimes the sticking point—across the conversations in this issue.

Student engagement is a ‘good thing’

The Recipes for Success, which are examples of practical activities that have proven effective in specific learning and teaching contexts, have all been focused not simply on defining student engagement but on improving it. Engagement is assumed to be a ‘good thing’—something to cultivate and support. Zepke, Leach and Butler (2014) noted that lecturers in their study tended to see engagement as the responsibility primarily of the student. What is clear from the wider literature and experiences from our Teaching Insights contributors is that there are many ways teaching staff influence engagement. If, as we posit, student engagement is a ‘good thing’, then teaching staff and educational developers should rightfully play an active role to enable student engagement. In the context of student engagement as the formation of understanding and the development and transformation of knowledge [to broadly follow Ashwin and McVitty’s (2015) distinctive interpretation], our conversation turned to the ways teaching staff can improve student engagement.

From knowing students to watching for student engagement

The group saw improving student engagement as ideally starting from a point of knowing our students individually. From here, educators can then align their teaching approaches, learning activities and curriculum to best support students. Our conversationalists clearly saw the personalised support that educators give students as among the most valuable ways to improve student engagement. However, in a mass higher education system with staff-student ratios closely managed, knowing all our students individually is tricky. We talked about the increasing potential of course learning analytics for informing and strengthening individual staff-student relationships, for instance as behavioural engagement might be knowable from patterns that emerge across monitored student actions such as downloading readings and attendance. We also recognised that promoting active learning approaches during course contact hours was critical. Educators can watch students’ cognitive and emotional engagement, and use that as real-time feedback on student learning.

A salient outcome of teaching during the pandemic has been the consideration of the ways we can measure (or watch) more dimensions of student engagement in online spaces with the intention of taking proactive actions to improve it [see, for example, Brown et al.’s (2020) conceptual framework to enhance online learning and engagement]. ‘Teaching into the void’—that is, teaching students who have their cameras off during online classes—has been repeatedly described as problematic for teaching staff because it severs an important feedback loop where proxies or approximations of some dimensions of student engagement are knowable (or watchable). We note the suggestions on how to encourage students to turn their cameras on shared by the staff and student panellists in this issue’s Peer Review .

Once student engagement in learning can be known and seen, teaching staff can react and align their teaching practice and learning resources to best support engagement and play their part to enable student engagement. However, some teaching staff, particularly those new in role, might struggle to enable student engagement. Dismore, Turner and Huang (2018) noted that incorporating (inter)active learning approaches into classes to promote student engagement required risk-taking and confidence on the part of the educator—even more so with perceived barriers such as group size and content.

Structuring engaging learning

It is clear that to encourage student engagement, there is an important role for teaching staff to create learning environments that are purposeful, active and interactive. Aspects of behavioural, emotional and cognitive engagement can be observed, and actions taken to improve these facets of engagement. It would be useful to consider student engagement in terms of the relationships at the core of higher education learning. These include the relationships between:

  • staff and students;
  • students and their peers;
  • students and the discipline or subject; and
  • students and their own ontological development.

Teaching staff and the ways they structure learning (whether online or in-person) play a critical role in orchestrating these relationships, for instance in defining how peers collaborate and support one another to learn, in setting how the curriculum delivery reveals disciplinary and professional skills and insights, and in allowing time and space for students to reflect on how their personal and professional development has grown during their studies.

So far, our conversation has explored the ways in which educators can enable student engagement. We talked about the importance of getting to know students individually and being able to observe student engagement behaviours. In an era of mass (and online) higher education, course learning analytics and active learning activities during class contact hours can be used strategically to obtain feedback on how students are engaging and learning. When structuring engaging learning experiences, educators can usefully consider the key relationships in higher education learning as outlined above (e.g. between students and their peers, students and their own ontological development, etc.). Of course, our conversation has been predicated on the assumption that student engagement is a ‘good thing’ and that educators have a part to play in enabling engaging learning. This would be particularly so if we assume, as seems often to be the case, that student engagement is linked to students fulfilling their potential and achieving their goals. But is there evidence of that? Is there published evidence where quantified measures of student engagement have been related to student outcomes and achievements? 

The relationship between student engagement, student outcomes and well-being

Despite the amount of research and practice-led speculation on the relationship between student engagement and teaching practices, we were not aware of much data that could demonstrate causality between student engagement and individual student outcomes in higher education. The book Engaging University Students by Coates and McCormick (2014) showcases global endeavour in different countries to measure student engagement, largely through large-scale surveys. Beyond higher education, there is evidence of links between aspects of behavioural and emotional student engagement, and academic achievement [see for example, Finn and Zimmer (2012)]. In higher education settings, a few papers exist that do demonstrate the impacts of (mostly) small-scale, micro-changes to teaching practices on closely related achievement measures. There is also plenty of discussion about what outcomes or success measures might be used. One large-scale study by Pascarella, Seifert and Blaich (2010) across 13 higher education colleges in the USA concluded that there was a correlation between students’ effective learning practices (e.g. preparing for classes, reading and writing, working with peers, interacting with teaching staff) and a number of important educational outcomes linked to liberal educational intentions, including moral character and well-being. We think the link between engagement and well-being is particularly interesting. More recent work by Boulton et al. (2019) has also indicated that student engagement and well-being may be related and they have suggested a ‘possible feedback loop where increasing engagement increases academic performance, which in turn increases well-being’ (p. 14). Macfarlane and Tomlinson (2017) have criticised the overt focus on student engagement and performance, and the search for causal relationships. 

We conclude by noting the virtue of teaching staff, educational developers and university managers looking for ways to develop student engagement. We also suggest that there is much merit in clearly defining and sharing their definitions of student engagement, for example, as students’ active commitment and effort spent on learning activities. With clear definitions in use, we can take action (and monitor the impacts of these actions) to build learning experiences that improve student engagement and support student well-being. Such approaches would seek to link student engagement to a compassionate, principled pedagogy that can support higher education students to flourish, irrespective of their motivations, aspirations, desired outcomes or definitions of success.

Ashwin, P., & McVitty, D. (2015). The meanings of student engagement: Implications for policies and practices. In A. Curaj, L. Matei, R. Pricopie, J. Salmi, & P. Scott (Eds.), The European higher education area: Between critical reflections and future policies . Springer. https://doi.org/10.1007/978-3-319-20877-0_23

Boulton, C. A., Hughes, E., Kent, C., Smith, J. R., & Williams, H. T. P. (2019). Student engagement and wellbeing over time at a higher education institution. PLoS ONE, 14 (11), e0225770. https://doi.org/10.1371/journal.pone.0225770  

Brown, A., Lawrence, J., Basson M., & Redmond, P. (2020). A conceptual framework to enhance student online learning and engagement in higher education. Higher Education Research & Development . https://doi.org/10.1080/07294360.2020.1860912

Buckley, A. (2018). The ideology of student engagement research. Teaching in Higher Education, 23 (6), 718–732. https://doi.org/10.1080/13562517.2017.1414789  

Coates, H., & McCormick, A.C. (2014). Engaging university students: International insights from system-wide studies . Springer. https://doi.org/10.1007/978-981-4585-63-7  

Dismore, H., Turner, R., & Huang, R. (2018). Let me edutain you! Practices of student engagement employed by new lecturers. Higher Education Research & Development, 38 (2), 235–249. https://doi.org/10.1080/07294360.2018.1532984    

Finn, J. D., & Zimmer, K. S. (2012). Student engagement: What is it? Why does it matter?. In S. Christenson, A. Reschly, & C. Wylie (Eds.), Handbook of research on student engagement (pp. 97–131). Springer. https://doi.org/10.1007/978-1-4614-2018-7_5

Headleand, C. (2021). What does student engagement mean to you? And you? And you? The Campus. Retrieved 3 September 2021, from https://www.timeshighereducation.com/campus/what-does-student-engagement-mean-you-and-you-and-you

Macfarlane, B., & Tomlinson, M. (2017). Critiques of student engagement. Higher Education Policy, 30 , 5–21. https://doi.org/10.1057/s41307-016-0027-3

Pascarella, E. T., Seifert, T. A., & Blaich, C. (2010). How effective are the NSSE benchmarks in predicting important educational outcomes? Change: The Magazine of Higher Learning, 42 (1), 16–22. https://doi.org/10.1080/00091380903449060

Roxå, T., & Mårtensson, K. (2009). Significant conversations and significant networks – exploring the backstage of the teaching arena. Studies in Higher Education, 34 (5), 547–559. https://doi.org/10.1080/03075070802597200

Zepke, N., Leach, L., & Butler, P. (2014). Student engagement: Students’ and teachers’ perceptions. Higher Education Research & Development, 33 (2), 386–398. https://doi.org/10.1080/07294360.2013.832160  

Acknowledgements

Written by Jackie Potter, Mary Kitchener, Kat Kwok, Elizabeth Lovegrove and Cathy Malone.


To Increase Student Engagement, Focus on Motivation

Teachers can motivate middle and high school students by providing structure while also allowing them some control over their learning.


A 2018 Gallup study found that as students get older, they become less engaged, or “involved, enthusiastic, and committed.” The study contained some alarming findings: In fifth grade, most students (74 percent) report high levels of engagement with school. However, by middle school, only half of students are engaged, and by high school the number of engaged students shrinks to about one-third.

Student engagement continued to be a pressing concern for parents and educators during and after the pandemic. Approximately half of parents (45 percent), 77 percent of administrators, and 81 percent of teachers said that keeping students engaged was difficult during remote learning. In addition, 94 percent of educators considered student engagement to be the most important metric to look at when determining student success. Gallup found that students who are engaged with school not only report achieving higher grades but also feel more hopeful about their future. 

Student motivation

Engagement and motivation are related but distinct concepts that are often confused. Motivation is the driving force that causes a student to take action; engagement is the observable behavior or evidence of that motivation. Motivation is necessary for engagement, but successful engagement can also help students feel motivated in the future.

In my book The Independent Learner, I discuss how self-regulated learning strategies help students to increase their motivation and willingness to engage in learning because they create feelings of autonomy, competence, and relatedness. According to research by Ryan and Deci, these are the three components that facilitate motivation:

  • Autonomy is a “sense of initiative and ownership in one’s actions.”
  • Competence is a “feeling of mastery” and a sense that with effort a student can “succeed and grow.”
  • Relatedness is when the school setting “conveys respect and caring” that results in the student feeling “a sense of belonging and connection.”

Motivating students in the classroom

It is important not to confuse engagement with entertainment. In an EdWeek survey, researchers found that the entertaining activities that teachers expect to engage students are not necessarily working. While the majority of teachers who had increased their use of digital games assumed that games would engage students, only 27 percent of students reported feeling more engaged when digital games were involved. In addition, 30 percent of students said learning was actually less engaging.

So what creates engagement? Gallup found that students who strongly agreed with the following two statements were 30 times more likely to report high levels of engagement with school:

  • My school is “committed to building the strengths of each student.”
  • I have “at least one teacher who makes me excited about the future.”

In other words, engaged students recognize that they have the support of caring adults who are willing to partner with them in their learning. 

Not all classrooms create these conditions. Controlling classrooms lower autonomy and motivation and increase student frustration. In controlling classrooms, students avoid challenges because they are afraid of failure. They work toward external rewards or to avoid possible anxiety or shame caused by mistakes. The teacher controls the answers and learning materials and uses language like “should” or “have to,” and students feel pressured to behave and achieve.

In contrast, creating classroom environments where students feel autonomy, competence, and relatedness helps students to maintain motivation and increase their engagement in school activities. Classrooms that foster motivation and increase engagement are high in structure but low in top-down control. These classrooms have the following qualities.

Supportive: Teachers support autonomy by listening and attempting to understand and respond to students’ perspectives. They look at what a student can currently do and where they need to go to reach the standard or objective, and they help the student by building scaffolds or supports to bridge the gap. This makes achievement toward grade-level content possible, even for learners who are not quite there yet.

Personal and individualized: Students feel like they are able to customize their assignments in order to explore their own interests. Students can also be taught to make their own connections to what they are learning through creating their own hooks for a lesson. Recognizing students’ unique qualities and special talents, getting to know what interests students and incorporating these interests into lessons or assignments, and reaching out to parents with a note or email when a student does something well are all strategies that I use to make learning personal and increase relatedness in the classroom.

Structured and goal-oriented: When teachers give students strategies, provide frequent feedback, and show them how to use those strategies effectively, students are motivated by observing their own progress. Teachers can provide a rationale or standard and guide students in setting short-term mastery goals for each required task. They can also help students to align their daily actions and effort with the results they are hoping to achieve by making a process plan. I have found that when students graph their own progress or use their process plan as a checklist, this makes growth visual and allows students to see the steps they are accomplishing each day toward their goal. In addition, clear expectations, consistency in classroom structure, clear rules, and set routines are all important.

Collaborative: Teachers provide students with choices and opportunities to partner with the teacher in their learning experiences and show ownership in the tasks that are assigned to them. When teachers encourage students to begin to make choices and take responsibility for their own learning, students see a purpose in school activities. One way to do this is through using self-assessment to prompt reflection on strategy use. I have students analyze their graded assignments to decide what strategies to keep and what to do differently next time. When students see errors as a signal that they need to reflect on the process and learning strategies they used, they realize there are no real mistakes, just opportunities for learning.

Although the pandemic has been difficult, the majority of students (69 percent) report feeling hopeful about the future. Students who are hopeful and engaged are less likely to be suspended or expelled, be chronically absent, skip school, or drop out of school. When educators put effort into the goal of creating a school environment guided by student engagement, motivation, and autonomy, students can see their own growth. This creates an excitement for learning that helps students to maintain hope for the future even through difficult circumstances.

The Importance of Student Engagement: Why Active Participation is Essential for Learning

True student engagement is vital to successful learning: it helps students absorb information and excel. Understanding why student engagement is important is essential.

Student engagement is a critical component of effective education. When students are engaged, they are more likely to learn, retain information, and achieve academic success. Student engagement is about more than just attending classes in school.

It is about being fully present and genuinely interested in the activity taking place in the class. There are several reasons why student engagement is important. Let’s look at some of the most critical aspects of student engagement.

The Importance of Student Engagement

1. Enhances Learning

When students actively engage in their learning, they are more likely to pay attention and participate in class discussions. Active participation allows students to clarify their understanding of the subject matter, ask questions, and seek clarification, which enhances their understanding and retention of the information, leading to better learning outcomes.

2. Boosts Academic Performance

Students are more likely to take responsibility for their education when they are engaged in their learning. They will complete assignments on time, participate in discussions, and study regularly. As a result, they perform better academically and achieve higher grades.

3. Fosters Critical Thinking

Engaged students are more likely to develop critical thinking skills. Critical thinking skills involve analyzing information, evaluating arguments, and solving problems. When students are encouraged to think critically, they become more independent learners who can analyze data and draw conclusions.

4. Encourages Creativity

Engaged students are more likely to be creative. When students are encouraged to explore, experiment, and take risks in their learning, they are more likely to develop their creativity and problem-solving skills. Creativity is essential in developing a student’s ability to approach problems from different perspectives and develop new solutions.

5. Promotes a Positive Classroom Environment

Engaged students create a positive classroom environment. When students actively participate, they create a sense of community and foster peer collaboration. This, in turn, leads to a more positive and supportive classroom environment, which enhances learning outcomes.

9 Efficient Methods for Boosting Student Participation

1. Use Active Learning Strategies

Active learning strategies get students actively involved in the learning process. They include group work, discussion, role-playing, and other hands-on activities. Active learning strategies promote critical thinking, problem-solving, and student collaboration, leading to increased engagement and better learning outcomes.

2. Incorporate Technology

Incorporating technology into the learning process can be an effective way to increase student engagement. It includes using interactive learning tools such as digital simulations, videos, and online quizzes. Technology can also enhance student communication, collaboration, and group work, leading to increased engagement and improved academic outcomes.

3. Provide Choices for Self-Directed Learning

Providing students with choices and opportunities for self-directed learning can increase their motivation and engagement. The different options can include allowing students to choose their research topics, assignments, or learning activities. It empowers students to take ownership of their learning and can lead to improved academic outcomes.

4. Connect Learning to Real-World Applications

Connecting learning to real-world applications can help students understand the relevance of their learning and increase their engagement. This includes using examples from current events or real-world scenarios to illustrate concepts or skills. It helps students understand the practical application of their learning, leading to improved academic outcomes.

5. Establish a Positive Classroom Environment

Creating a positive classroom environment can encourage engagement and participation among students. Creating a safe and inclusive learning environment fosters a sense of community and promotes positive student interactions. Students are more likely to be engaged in learning when they feel supported and valued by their peers and instructors.

6. Provide Frequent Feedback

Providing frequent feedback to students can help them understand their progress and areas for improvement. This includes giving regular assessments, providing feedback on assignments and class participation, and offering support and guidance as needed. Frequent feedback can motivate students to stay engaged in their learning and help them achieve their academic goals.

7. Use Creative Teaching Methods

Creative teaching methods can help increase student engagement by making learning more interesting and enjoyable. This includes storytelling, humor, and visual aids to help students understand and remember the content. It can also involve incorporating games or other interactive activities into the learning process.

8. Make Learning Relevant to Students’ Interests

Making learning relevant to students’ interests can help increase their engagement and motivation. This includes incorporating topics relevant to students’ lives or using examples that students can relate to. It allows students to see the practical application of their learning, leading to increased engagement and improved academic outcomes.

9. Encourage Student Collaboration

Encouraging student collaboration can increase engagement by creating a sense of community and promoting teamwork. It includes group projects, peer teaching, and collaborative problem-solving. It can help students build relationships with their peers, learn from one another, and develop valuable teamwork skills.

Above is a brief explanation of why student engagement is important. Student engagement is critical for effective education. Engaged students are active learners who perform better academically, develop critical thinking skills, and help create a positive classroom environment. By using active learning strategies, providing timely feedback, fostering a safe and supportive classroom environment, using technology, and personalizing learning, educators can increase student engagement and create a more effective learning environment.

1. What is the right way to measure student engagement?

There are several ways to measure student engagement, such as conducting surveys, tracking student records, and classroom observation.

2. What is the main role of cultural competence in promoting engagement?

It helps teachers recognize and value their students’ diversity, leading to a better learning environment.

3. As parents, how can we support student engagement for our kids?

Parents can support student engagement by reinforcing the value of education, encouraging their child’s curiosity and interests, and providing opportunities for learning outside the classroom.

4. What is intrinsic motivation?

Intrinsic motivation comes from within oneself, such as personal interest, enjoyment, and satisfaction with learning.

5. What is the relationship between student engagement and academic success?

There is a positive correlation between student engagement and academic success: engaged students tend to earn higher grades and retain more of what they learn.

Lydia Thompson

Elevating thoughts with insightful observations and creative learning approaches.

20 Technology Tools To Engage Students In The Classroom

What are the best learning technology tools to engage students?

contributed by Sara McGuire, venngage.com

Technology distracts students, right? Keeps them from focusing?

One solution is to ban phones and computers from the classroom. Another is to harness students’ tech-savvy and engage them with online tools that help them complete assignments. Whether they’re working on a research essay, a presentation, a science project, or a math report, ample tools are available to make the process more engaging for students.

If students grow up in a world that requires them to be tech-savvy, shouldn’t technology play a big role in their classroom experience? Here are 20 picks for tools to engage students in the classroom.

First, what type of tools can make learning more interesting for students?

Many learning technology tools can be used to improve student engagement.

Technology Tools And Apps For Student Engagement

1. Gamification tools

Gamification tools like Kahoot!, Quizlet, and Gimkit can make learning more engaging and interactive by adding game elements like points, rewards, and competition. These tools can be used to create quizzes, flashcards, and other activities that students can complete individually or in groups.

2. Video Content

Video-based tools, such as Edpuzzle and PlayPosit, can make learning more engaging by incorporating videos into lessons. These tools allow teachers to add interactive elements, such as quizzes or discussion questions, to videos and track student progress.

YouTube is an obvious choice here as well.

3. Tools For Student Collaboration

Collaborative tools, such as Google Docs and Padlet, can make learning more engaging by allowing students to collaborate and share ideas. These tools can be used for group projects, brainstorming sessions, and peer review activities.

4. Adaptive learning apps and platforms

Adaptive learning tools, such as DreamBox and Aleks, can make learning more engaging by providing personalized learning experiences for each student. These tools use algorithms to adjust the difficulty of the content based on each student’s strengths and weaknesses.
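The “adjust difficulty based on strengths and weaknesses” idea can be sketched in a few lines. This is only a toy illustration, not how DreamBox or ALEKS actually work; the function name and thresholds below are invented for this sketch:

```python
# Toy sketch of an adaptive-difficulty rule. Real platforms use far more
# sophisticated models; the thresholds here are invented for illustration.

def adjust_difficulty(level: int, recent_results: list[bool],
                      min_level: int = 1, max_level: int = 10) -> int:
    """Raise difficulty after consistent success, lower it after struggle."""
    if len(recent_results) < 3:
        return level                      # not enough evidence yet
    rate = sum(recent_results) / len(recent_results)
    if rate >= 0.8:
        return min(level + 1, max_level)  # student is ready for harder items
    if rate <= 0.4:
        return max(level - 1, min_level)  # ease off to rebuild confidence
    return level                          # hold steady in the middle band

# A student who answers 4 of 5 recent items correctly moves up a level:
print(adjust_difficulty(5, [True, True, True, True, False]))  # prints 6
```

The point of the sketch is simply that each student’s recent performance, not a fixed pacing guide, drives what they see next.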

5. Virtual and augmented reality tools

Virtual and augmented reality tools like Nearpod VR and Merge Cube can make learning more engaging by providing immersive experiences. These tools can be used to explore historical sites, conduct science experiments, and visualize complex concepts.

You can see some examples of these kinds of tools below.

20 Learning Technology Tools For Student Engagement

1. Augmented Reality Apps

Here are some augmented reality apps to get started.

2. Flip

Video is a wonderful engagement tool. Add a social dynamic in a school-friendly architecture and you’ve got Flip. See some ideas for using Flip in the classroom.

3. Video Games

I know this is general–merely saying ‘video games’ isn’t a ‘student engagement tool.’ However, video games don’t work without player input–and thus student engagement. The right game could change your classroom. Here are some examples of video games you can teach with.

See also Why People Play Video Games

4. Google Forms

Google Forms is likely the simplest app on the list (well–aside from the background noise strategy).

One of the best ways to engage all students in your classroom is to give students an easy (and even anonymous) way to ask questions, receive feedback, or reach out to the teacher. While there are many ways to do this, one of the most universally accessible (and free) methods is Google Forms.

Whether you provide specific questions and prompts for students to respond to as an exit slip (e.g., “Was there any point during today’s lesson where you were confused?”) or you leave it open as a way for students to post questions anonymously (which can be useful for struggling students who might otherwise be hesitant to reach out), a simple messaging system or basic form can help improve student engagement.

5. Socrative

Like a few others on this list, you’ve likely heard of Socrative, a tool to “assess student understanding with prepared activities or on-the-fly questions, then adjust your teaching based on the results.”

6. Kahoot!

Kahoot! is a handy tool for creating in-class questionnaires and quizzes. This is useful for obtaining data for graphing assignments, data for research essays, and feedback from classmates. Kahoot! is compatible with multiple devices and has a game-like feel that will help keep students interested.

7. Class Dojo

This is a fun tool to gamify the classroom. Students make their own avatars and gain and lose points based on classroom behavior, discussion approaches, and other soft skills agreed upon by the teacher and the class. Teachers can also use Class Dojo to take attendance and create graphs that break down the information for them. Not only will this tool encourage students to uphold class values, but it will also provide key metrics to help teachers adjust their teaching tactics accordingly.

8. Clickers

Classroom clickers may not be the high-water mark for innovation in education, but as a simple, useful tool you can use almost every day, they’re a no-brainer for many classrooms.

Clickers are a tool for teachers to help assess students’ understanding of concepts and their engagement with the material. With some tools, teachers can project questions onto a screen while students answer them in real time. Students’ answers appear on the teacher’s device, and teachers can see which students answered correctly and which didn’t. This gives teachers an accurate picture of how well students are following the material so they can adjust their lessons accordingly.

Note, this is more of a general recommendation than an endorsement of a specific clicker tool or app. The problem with this otherwise ‘no-doubter’ recommendation is that many clicker tools are expensive–at least the ones we know of. Plickers, iClicker, Top Hat, and other tools are not only not free but are often priced as monthly subscriptions. If your school has the budget and you put it to good use, it’s likely worth the investment.

Immediate responses from every single student instantly? That’s a great strategy for engaging students.
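To make the clicker workflow concrete, here is a minimal sketch of the grading step. The function name and student data are invented for illustration; real clicker systems (iClicker, Plickers, Top Hat) handle response collection over students’ devices:

```python
# Toy sketch of the clicker idea: collect one answer per student and show the
# teacher who answered correctly. Names and data are invented for illustration.

def grade_responses(responses: dict[str, str], correct_answer: str):
    """Split students into correct/incorrect groups for a quick class snapshot."""
    correct = sorted(s for s, a in responses.items() if a == correct_answer)
    incorrect = sorted(s for s, a in responses.items() if a != correct_answer)
    return correct, incorrect

responses = {"Ana": "B", "Ben": "C", "Chloe": "B"}
correct, incorrect = grade_responses(responses, "B")
print(f"{len(correct)}/{len(responses)} correct")  # prints 2/3 correct
```

The teacher-facing payoff is the split itself: the incorrect group tells you exactly who needs a follow-up before the class moves on.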

9. Edvoice

Edvoice is a feature-rich communication tool that offers everything from lesson planning and rubrics to messaging, announcements, notifications, and even tools to help prevent (or respond to) bullying in the classroom.

10. Background Noise

Depending on what you want the students engaged with–you, one another, content, an assignment, etc.–they need to be able to focus, and classrooms aren’t always the easiest places to do that. Background noise can drown out distracting sounds; more helpfully, as students concentrate, there is less noise because they’re concentrating. Neat trick, huh?

11. Venngage

Create interactive lessons, assess students on the fly, and see data and student responses in real time. Students who can ask questions and receive feedback at any time are more likely to be engaged.

With so much focus on data analytics, data literacy is a useful skill for students to learn. Visualizing data in an infographic is a highly useful skill, whether your students have collected the data themselves or gathered it from other sources. Infographics appeal to both visual and textual learners. Venngage offers a selection of infographic templates that students can customize.

12. Prezi

Presentations are a core part of the curriculum, but let’s face it, PowerPoint isn’t terribly engaging. Prezi allows students to create presentations that are more creative and exciting than what PP has to offer. Not only will this make the presentation creation process more interesting for students, but it will also make watching presentations more interesting. Prezi presentations are published publicly on students’ accounts, so their classmates can access them later to check their notes.

13. Trello

Because so many students are in the habit of multitasking, a good skill to teach them is how to organize and streamline their assignments. Trello is a free and super easy-to-use tool students can use to create workflow charts. Multiple students can be added to the same board, which is great for project collaboration. (See also this list of ideas for project-based learning.)

14. Virtual Reality

For most classrooms, virtual reality still isn’t viable. Even if you do have a headset, you likely don’t have 30. But in terms of ‘student engagement,’ it’s difficult to improve on virtual reality. We use a Pico virtual reality headset for learning, for example.

15. Cold Turkey

Students probably won’t love this one, but it’s a useful tool for mitigating the multitasking students can do on their computers. Cold Turkey is a tool that allows you to block certain websites or the internet in general so that students can focus on their tasks. Even having students turn it on for half a period for some focused in-class writing time will make a difference in their productivity.

16. Pear Deck

Pear Deck is an interactive presentation platform that allows teachers to create engaging slide decks with interactive questions, polls, and activities. Students can respond to prompts in real time, providing instant feedback to the teacher and promoting active participation.

17. Genially

Genially is a multimedia creation tool that allows teachers to design interactive presentations, infographics, timelines, and more. It offers a wide range of templates and customization options, making it easy to create visually engaging content to enhance classroom instruction.

18. Mentimeter

Mentimeter is an interactive polling tool that allows teachers to create engaging presentations with live polls, quizzes, word clouds, and more. Students can respond to prompts using their smartphones or computers, and results are displayed in real time, fostering active participation and engagement.

19. Classcraft

Classcraft is a gamification platform that allows teachers to turn their classroom into a collaborative game environment. Students create characters, earn points, and work together to complete quests and challenges, promoting teamwork, motivation, and positive behavior.

20. Book Creator

Book Creator is a digital publishing platform that allows students to create and publish multimedia eBooks. With features like text, images, audio, video, and drawings, students can express their creativity while developing literacy and digital storytelling skills.

About The Author

TeachThought Staff

ChatGPT in Teaching and Learning: A Systematic Review

1. Introduction

2. Methodology

2.1. Research Questions

  • What are the benefits of ChatGPT use in academia, particularly in teaching and learning contexts?

2.2. Search Strategy

2.3. Inclusion and Exclusion Criteria

2.4. Screening the Articles

2.5. Research Profile

3. Results and Discussion

3.1. Advantages of ChatGPT in Teaching and Learning

  • Learning Engagement and Accessibility (45 occurrences): This category includes articles that discuss the role of ChatGPT in engaging students and making learning more accessible, which is the most frequently mentioned advantage in the included articles. ChatGPT can make learning more engaging and accessible, especially for students with disabilities. In their study, Hsu and Ching, 2023 categorized the applications of ChatGPT for learning, stating that teachers can use ChatGPT for support in the following ways: (1) assistance with teaching; (2) help with student assessment; (3) support for student learning; (4) suggestions for improving teaching; and (5) assistance with teacher–student and teacher–parent communication [ 27 ]. For students, the chatbot can provide support in the following areas: (1) personalized learning; (2) creative thinking; (3) assessment; and (4) reading and writing comprehension [ 27 ]. Rodrigues and Rodrigues, 2023 emphasized the potential of ChatGPT to facilitate more personalized and adaptive learning due to its interactivity [ 11 ]. This enables the execution of effective learning mechanisms, with feedback being a core feature of learner support that is highly effective in supporting learning. In a special case, Houston and Corrado, 2023 showed that libraries can benefit from this solution by improving their reference practices, developing their collections, and transforming and creating metadata [ 18 ].
  • Natural Language Processing (NLP) (31 occurrences): ChatGPT’s ability to understand and generate human-like text is a significant advantage. Students can practice conversational skills in different languages with ChatGPT, enhancing their language proficiency. Farrokhnia et al. 2023 reported that ChatGPT’s natural language model is sophisticated and generates plausible, personalized, and real-time answers while self-improving [ 22 ]. In another study, Clark, 2023 found that ChatGPT’s responses demonstrate strong language processing abilities, performing better on questions that require generalizable information rather than specific skills, particularly the skills taught in lectures [ 60 ]. Masters, 2023 noted that ChatGPT can be helpful in the grading process, especially for written assignments [ 66 ].
  • Text Generation (29 occurrences): The capability of ChatGPT to generate text is highlighted in this category. ChatGPT can help students overcome writer’s block by generating ideas or outlines for essays and research papers. NLP covers text generation; however, we created a separate category for text generation because it is an essential aspect of education and has attracted significant attention from authors in this field. It can also assist teachers in creating educational content, such as lesson plans or example texts for class discussions. For example, generative-AI chatbots excel at quickly generating plausible answers for any question [ 104 ]. Marquez et al., 2023 conducted a study focusing on biobased materials education, finding that it was possible to use AI-powered text generation to brainstorm biobased materials and products and develop academic written text by applying a scientific method approach [ 106 ]. As an AI-powered assistant, ChatGPT can answer questions, generate text, write code, summarize papers, evaluate responses, and more. It offers a range of relevant topics and ideas that can be included in a course’s curriculum [ 80 ].
  • Performance Evaluation (19 occurrences): Teachers can use ChatGPT to provide instant feedback on students’ assignments or essays. For example, Ruiz et al., 2023 found that this virtual assistant tool allows teachers to provide real-time personalized support by answering student queries and offering additional information [ 100 ]. Additionally, Karabacak et al., 2023 highlighted opportunities to improve medical education through the use of personalized feedback and evaluation methods [ 43 ]. Clark, 2023 emphasized the potential of chatbots to support student learning in an interactive way by providing real-time feedback, thereby increasing student engagement [ 60 ].
  • Versatility (20 occurrences): ChatGPT can be applied across various academic fields. In science, it can be used to explain complex concepts, while in history, it can provide historical context. When used in mathematics, ChatGPT can solve problems and explain the steps in the process. Indeed, it can be used for a wide variety of tasks, for example, to evaluate task performance, provide feedback, generate human-like writing, offer expert solutions to complex tasks, and assist in solving mathematical problems [ 23 ]. Therefore, ChatGPT is likely to have a major impact on work and education, as it provides quick and easily understandable answers to a variety of questions [ 39 ].
  • Enhanced Communication (12 occurrences): This category refers to the potential for improved communication and interaction using ChatGPT. For example, Lozano and Fontao, 2023 noted that ChatGPT has great potential for improving communication between teachers and students. It can be used to generate innovative methodologies to improve the teaching–learning process, thereby increasing student performance [ 1 ]. AI assistants like ChatGPT can help by explaining complex concepts in simple language. This modern approach can help medical students learn more efficiently and provide better patient care [ 76 ]. Students can practice conversational skills in different languages with ChatGPT, enhancing their language proficiency. Moreover, ChatGPT can serve as an intermediary for students who may be shy or reluctant to ask questions in class or for students with disabilities [ 13 ]. Integrating ChatGPT with speech-to-text technology can support inclusive education for students with visual impairments or dyslexia.
  • Other (12 occurrences): This category includes responses that do not fit neatly into the predefined themes. For example, Schön et al., 2023 focused on ChatGPT voice response automation, which could overcome issues with response behavior or response quality [ 30 ]. Other examples include thinking about the current instrumentalization in education [ 47 ], debugging code [ 61 ], and clinical decision-making [ 76 ].
  • Not Applicable (10 occurrences): This category includes papers that were fully reviewed but did not discuss any advantages related to the use of ChatGPT in teaching and learning.

3.2. Disadvantages of ChatGPT in Teaching and Learning

  • Quality of Responses and Bias in AI (51 occurrences): Concerns about the accuracy of ChatGPT’s responses and potential biases in AI models are the most frequently cited disadvantages. The accuracy of ChatGPT’s responses may not always be reliable, and there can be biases in the AI model. This requires teachers to double-check information and discuss these biases with students. Rawas, 2023 mentioned that the potential for bias in AI implementation must be approached with caution and a clear understanding of the opportunities and challenges involved [ 13 ]. Moreover, Iskender, 2023 argued that ChatGPT could exacerbate existing biases in education, such as socio-economic and racial disparities [ 7 ]. Tsang, 2023 expressed concerns about the reliability of ChatGPT due to hallucinations and its training sources, which limit its use as a clinical support resource and evidence-based research tool [ 74 ]. Naidu and Sevnarayan, 2023 emphasized that the quality of the responses provided by ChatGPT is contingent on the quality of the input received, and it can generate better answers if the questions and prompts are clear [ 31 ].
  • Plagiarism and Authenticity (39 occurrences): It is challenging to ensure the originality and authenticity of content generated by ChatGPT. Moreover, there is a risk that students might submit ChatGPT-generated text as their own work. Educators need to emphasize the importance of academic integrity and may need to use plagiarism detection tools. Indeed, Dalalah and Dalalah, 2023 warned that plagiarism could become commonplace and endanger scientific research, leading to the loss of uniqueness and creativity in writing and art [ 63 ]. Furthermore, Wilby and Esson, 2023 highlighted ethical concerns regarding academic misconduct, model bias, robustness, and toxic output [ 96 ]. Thomas, 2023 reported that educators are concerned about cheating and may resort to oral exams [ 73 ]. Many are warning students that the use of ChatGPT will result in a failing grade.
  • Error Recognition (17 occurrences): ChatGPT may not always recognize its own errors. Teachers and students should be aware of this limitation and cross-verify information with credible sources. For instance, Houston and Corrado, 2023 showed that ChatGPT’s proficiency in generating text is best in the language it has been extensively trained on, namely, English [ 18 ]. Its ability to produce quality responses in other languages may not be as good as its responses in English, and there might be inconsistencies or errors in its language generation. Further, the importance of continually updating AI models with the latest medical knowledge has been emphasized to ensure that they remain reliable and accurate in the rapidly evolving field of medicine [ 111 ]. Failure to do so could cause ChatGPT to provide inaccurate responses. In the context of programming, Borger et al., 2023 noted that relying on ChatGPT-generated code requires users to have a fundamental understanding of programming concepts to avoid erroneous outputs [ 84 ].
  • Dependency (15 occurrences): There is a risk of overreliance on AI tools in learning environments, which can hinder students’ ability to think critically and solve problems independently. Educators need to balance the use of AI with traditional teaching methods. For example, Hosseini et al., 2023 warned that clinicians may become overly reliant on ChatGPT-like systems, putting their clinical reasoning skills at risk [ 87 ]. Similarly, Marzuki et al., 2023 reported that some educators are concerned that excessive use of these AI tools for language refinement and idea generation may limit students’ creative thinking and originality [ 82 ]. Ratten and Jones [ 107 ] expressed concerns about students relying too heavily on ChatGPT in completing their assignments, impeding their development of intuitive skills and potentially altering assessment practices.
  • Privacy and Data Security (11 occurrences): The use of ChatGPT in education raises concerns about data privacy and security. Schools and educational institutions must ensure that they are using AI tools in compliance with data protection laws and regulations. For instance, Marquez et al., 2023 noted that using ChatGPT in education raises ethical concerns regarding privacy, data ownership, and algorithmic bias [ 106 ]. Michel-Villarreal et al., 2023 argued that it is crucial for universities to address data privacy, algorithmic bias, and the responsible use of AI-generated content to avoid skepticism around the implementation of ChatGPT [ 67 ].
  • Other (6 occurrences): This includes miscellaneous concerns that do not fit into the predefined categories. For example, Dergaa et al., 2023 pointed out that this technology has the potential to generate harmful outputs, such as spam and ransomware, which is a cause for concern in modern societies [ 28 ]. Other examples include environmental concerns, an inability to be used in highly specialized contexts, and a lack of contextual and nuanced understanding [ 6 , 46 , 90 ].
  • Not Applicable (14 occurrences): This category includes papers that were fully reviewed but did not discuss any disadvantages related to the use of ChatGPT in teaching and learning.

4. Recommendations

5. Conclusions

Author Contributions

Data Availability Statement

Conflicts of Interest

Abbreviations

AI: Artificial Intelligence
NLP: Natural Language Processing
  • Lozano, A.; Fontao, C. Is the Education System Prepared for the Irruption of Artificial Intelligence? A Study on the Perceptions of Students of Primary Education Degree from a Dual Perspective: Current Pupils and Future Teachers. Educ. Sci. 2023, 13, 733.
  • Waltzer, T.; Cox, R.; Heyman, G. Testing the Ability of Teachers and Students to Differentiate between Essays Generated by ChatGPT and High School Students. Hum. Behav. Emerg. Technol. 2023, 2023, 1923981.
  • Van Slyke, C.; Johnson, R.; Sarabadani, J. Generative Artificial Intelligence in Information Systems Education: Challenges, Consequences, and Responses. Commun. Assoc. Inf. Syst. 2023, 53, 14.
  • Totlis, T.; Natsis, K.; Filos, D.; Ediaroglou, V.; Mantzou, N.; Duparc, F.; Piagkou, M. The Potential Role of ChatGPT and Artificial Intelligence in Anatomy Education: A Conversation with ChatGPT. Surg. Radiol. Anat. 2023, 45, 1321–1329.
  • Bissessar, C. To Use or Not to Use ChatGPT and Assistive Artificial Intelligence Tools in Higher Education Institutions? The Modern-Day Conundrum—Students’ and Faculty’s Perspectives. Equity Educ. Soc. 2023, 27526461231215083.
  • Cooper, G. Examining Science Education in ChatGPT: An Exploratory Study of Generative Artificial Intelligence. J. Sci. Educ. Technol. 2023, 32, 444–452.
  • Iskender, A. Holy or Unholy? Interview with Open AI’s ChatGPT. Eur. J. Tour. Res. 2023, 34, 3414.
  • Bringula, R. ChatGPT in a Programming Course: Benefits and Limitations. Front. Educ. 2024, 9, 1248705.
  • da Silva, C.A.G.; Ramos, F.N.; de Moraes, R.V.; Santos, E.L. ChatGPT: Challenges and Benefits in Software Programming for Higher Education. Sustainability 2024, 16, 1245.
  • Yan, L.; Sha, L.; Zhao, L.; Li, Y.; Martinez-Maldonado, R.; Chen, G.; Li, X.; Jin, Y.; Gasevic, D. Practical and Ethical Challenges of Large Language Models in Education: A Systematic Scoping Review. Br. J. Educ. Technol. 2023, 55, 90–112.
  • Rodrigues, O.S.; Rodrigues, K.S. Artificial Intelligence in Education: The Challenges of ChatGPT. Texto Livre 2023, 16, e45997.
  • Baidoo-Anu, D.; Owusu Ansah, L. Education in the Era of Generative Artificial Intelligence (AI): Understanding the Potential Benefits of ChatGPT in Promoting Teaching and Learning. J. AI 2023, 7, 52–62.
  • Rawas, S. ChatGPT: Empowering Lifelong Learning in the Digital Age of Higher Education. Educ. Inf. Technol. 2024, 29, 6895–6908.
  • Klayklung, P.; Chocksathaporn, P.; Limna, P.; Kraiwanit, T.; Jangjarat, K. Revolutionizing Education with ChatGPT: Enhancing Learning through Conversational AI. Univers. J. Educ. Res. 2023, 2, 217–225.
  • Choudhary, O.P.; Saini, J.; Challana, A. ChatGPT for Veterinary Anatomy Education: An Overview of the Prospects and Drawbacks. Int. J. Morphol. 2023, 41, 1198–1202.
  • Sok, S.; Heng, K. ChatGPT for Education and Research: A Review of Benefits and Risks. SSRN Electron. J. 2023, preprint.
  • Cotton, D.R.E.; Cotton, P.A.; Shipway, J.R. Chatting and Cheating: Ensuring Academic Integrity in the Era of ChatGPT. Innov. Educ. Teach. Int. 2023, 61, 228–239.
  • Houston, A.; Corrado, E. Embracing ChatGPT: Implications of Emergent Language Models for Academia and Libraries. Tech. Serv. Q. 2023, 40, 76–91.
  • Bauer, E.; Greisel, M.; Kuznetsov, I.; Berndt, M.; Kollar, I.; Dresel, M.; Fischer, M.R.; Fischer, F. Using Natural Language Processing to Support Peer-Feedback in the Age of Artificial Intelligence: A Cross-Disciplinary Framework and a Research Agenda. Br. J. Educ. Technol. 2023, 54, 1222–1245.
  • Hoch, C.C.; Wollenberg, B.; Lüers, J.-C.; Knoedler, S.; Knoedler, L.; Frank, K.; Cotofana, S.; Alfertshofer, M. ChatGPT’s Quiz Skills in Different Otolaryngology Subspecialties: An Analysis of 2576 Single-Choice and Multiple-Choice Board Certification Preparation Questions. Eur. Arch. Otorhinolaryngol. 2023, 280, 4271–4278.
  • Jiang, H.; Cheong, K. Developing Teaching Strategies for Rural School Pupils’ Concentration in the Distance Music Classroom. Educ. Inf. Technol. 2024, 29, 5903–5920.
  • Farrokhnia, M.; Banihashem, S.K.; Noroozi, O.; Wals, A. A SWOT Analysis of ChatGPT: Implications for Educational Practice and Research. Innov. Educ. Teach. Int. 2024, 61, 460–474.
  • Zhu, I.C.; Sun, M.; Luo, J.; Li, T.; Wang, M. How to Harness the Potential of ChatGPT in Education? Knowl. Manag. E-Learn. 2023, 15, 133–152.
  • do Amaral, I. Reflection on the Use of Generative Language Models as a Tool for Teaching Design. In Proceedings of the 2024 IEEE World Engineering Education Conference (EDUNINE), Kos, Greece, 10–13 March 2024; pp. 1–4.
  • Dahlkemper, M.; Lahme, S.; Klein, P. How Do Physics Students Evaluate Artificial Intelligence Responses on Comprehension Questions? A Study on the Perceived Scientific Accuracy and Linguistic Quality of ChatGPT. Phys. Rev. Phys. Educ. Res. 2023, 19, 010142.
  • Duong, D. How Effort Expectancy and Performance Expectancy Interact to Trigger Higher Education Students’ Uses of ChatGPT for Learning. Interact. Technol. Smart Educ. 2023, ahead of print.
  • Hsu, Y.-C.; Ching, Y.-H. Generative Artificial Intelligence in Education, Part One: The Dynamic Frontier. TechTrends 2023, 67, 603–607.
  • Dergaa, I.; Chamari, K.; Zmijewski, P.; Ben Saad, H. From Human Writing to Artificial Intelligence Generated Text: Examining the Prospects and Potential Threats of ChatGPT in Academic Writing. Biol. Sport 2023, 40, 615–622.
  • Strzelecki, A. To Use or Not to Use ChatGPT in Higher Education? A Study of Students’ Acceptance and Use of Technology. Interact. Learn. Environ. 2023, 1–14.
  • Schön, E.-M.; Neumann, M.; Hofmann-Stölting, C.; Baeza-Yates, R.; Rauschenberger, M. How Are AI Assistants Changing Higher Education? Front. Comput. Sci. 2023, 5, 1208550.
  • Naidu, K.; Sevnarayan, K. ChatGPT: An Ever-Increasing Encroachment of Artificial Intelligence in Online Assessment in Distance Education. Online J. Commun. Media Technol. 2023, 13, e202336.
  • Farazouli, A.; Cerratto Pargman, T.; Laksov, K.; McGrath, C. Hello GPT! Goodbye Home Examination? An Exploratory Study of AI Chatbots’ Impact on University Teachers’ Assessment Practices. Assess. Eval. High. Educ. 2024, 49, 363–375.
  • Lancaster, T. Artificial Intelligence, Text Generation Tools and ChatGPT—Does Digital Watermarking Offer a Solution? Int. J. Educ. Integr. 2023, 19, 10.
  • Tam, W.; Huynh, T.; Tang, A.; Luong, S.; Khatri, Y.; Zhou, W. Nursing Education in the Age of Artificial Intelligence Powered Chatbots (AI-Chatbots): Are We Ready Yet? Nurse Educ. Today 2023, 129, 105917.
  • Kumah-Crystal, Y.; Mankowitz, S.; Embi, P.; Lehmann, C. ChatGPT and the Clinical Informatics Board Examination: The End of Unproctored Maintenance of Certification? J. Am. Med. Inform. Assoc. JAMIA 2023 , 30 , 1558–1560. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Abd-alrazaq, A.; Alsaad, R.; Alhuwail, D.; Ahmed, A.; Healy, M.; Latifi, S.; Aziz, S.; Damseh, R.; Alrazak, S.; Sheikh, J. Large Language Models in Medical Education: Opportunities, Challenges, and Future Directions. JMIR Med. Educ. 2023 , 9 , e48291. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Liu, H.; Azam, M.; Bin Naeem, S.; Faiola, A. An Overview of the Capabilities of ChatGPT for Medical Writing and Its Implications for Academic Integrity. Health Inf. Libr. J. 2023 , 40 , 440–446. [ Google Scholar ] [ CrossRef ]
  • Al-Ghonmein, A.M.; Al-Moghrabi, K.G. The Potential of ChatGPT Technology in Education: Advantages, Obstacles and Future Growth. IAES Int. J. Artif. Intell. 2024 , 13 , 1206–1213. [ Google Scholar ] [ CrossRef ]
  • Ajevski, M.; Barker, K.; Gilbert, A.; Hardie, L.; Ryan, F. ChatGPT and the Future of Legal Education and Practice. Law Teach. 2023 , 57 , 352–364. [ Google Scholar ] [ CrossRef ]
  • Deebel, N.; Terlecki, R. ChatGPT Performance on the American Urological Association (AUA) Self-Assessment Study Program and the Potential Influence of Artificial Intelligence (AI) in Urologic Training. Urology 2023 , 177 , 29–33. [ Google Scholar ] [ CrossRef ]
  • Megahed, F.; Chen, Y.-J.; Ferris, J.; Knoth, S.; Jones-Farmer, L.A. How Generative AI Models Such as ChatGPT Can Be (Mis)Used in SPC Practice, Education, and Research? An Exploratory Study. Qual. Eng. 2024 , 36 , 287–315. [ Google Scholar ] [ CrossRef ]
  • Zhou, J.; Ke, P.; Qiu, X.; Huang, M.; Zhang, J. ChatGPT: Potential, Prospects, and Limitations. Front. Inf. Technol. Electron. Eng. 2023 , 25 , 6–11. [ Google Scholar ] [ CrossRef ]
  • Karabacak, M.; Ozkara, B.B.; Margetis, K.; Wintermark, M.; Bisdas, S. The Advent of Generative Language Models in Medical Education. JMIR Med. Educ. 2023 , 9 , e48163. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Cross, J.; Robinson, R.; Devaraju, S.; Vaughans, A.; Hood, R.; Kayalackakom, T.; Honnavar, P.; Naik, S.; Sebastian, R. Transforming Medical Education: Assessing the Integration of ChatGPT Into Faculty Workflows at a Caribbean Medical School. Cureus 2023 , 15 , e41399. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Nikolic, S.; Daniel, S.; Haque, R.; Belkina, M.; Hassan, G.M.; Grundy, S.; Lyden, S.; Neal, P.; Sandison, C. ChatGPT Versus Engineering Education Assessment: A Multidisciplinary and Multi-Institutional Benchmarking and Analysis of This Generative Artificial Intelligence Tool to Investigate Assessment Integrity. Eur. J. Eng. Educ. 2023 , 48 , 559–614. [ Google Scholar ] [ CrossRef ]
  • Nune, A.; Iyengar, K.; Manzo, C.; Barman, B.; Botchu, R. Chat Generative Pre-Trained Transformer (ChatGPT): Potential Implications for Rheumatology Practice. Rheumatol. Int. 2023 , 43 , 3. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Heimans, S.; Biesta, G.; Takayama, K.; Kettle, M. ChatGPT, Subjectification, and the Purposes and Politics of Teacher Education and Its Scholarship. Asia-Pac. J. Teach. Educ. 2023 , 51 , 105–112. [ Google Scholar ] [ CrossRef ]
  • Lim, W.M.; Gunasekara, A.; Pallant, J.L.; Pallant, J.I.; Pechenkina, E. Generative AI and the Future of Education: Ragnarök or Reformation? A Paradoxical Perspective from Management Educators. Int. J. Manag. Educ. 2023 , 21 , 100790. [ Google Scholar ] [ CrossRef ]
  • Baker, B.; Mills, K.A.; McDonald, P.; Wang, L. AI, Concepts of Intelligence, and Chatbots: The “Figure of Man”, the Rise of Emotion, and Future Visions of Education. Teach. Coll. Rec. 2023 , 125 , 60–84. [ Google Scholar ] [ CrossRef ]
  • Beerepoot, M. Formative and Summative Automated Assessment with Multiple-Choice Question Banks. J. Chem. Educ. 2023 , 100 , 2947–2955. [ Google Scholar ] [ CrossRef ]
  • Greiner, C.; Peisl, T.; Höpfl, F.; Beese, O. Acceptance of AI in Semi-Structured Decision-Making Situations Applying the Four-Sides Model of Communication—An Empirical Analysis Focused on Higher Education. Educ. Sci. 2023 , 13 , 865. [ Google Scholar ] [ CrossRef ]
  • Khan, R.A.; Jawaid, M.; Khan, A.R.; Sajjad, M. ChatGPT—Reshaping Medical Education and Clinical Management. Pak. J. Med. Sci. 2023 , 39 , 605–607. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Kasneci, E.; Sessler, K.; Küchemann, S.; Bannert, M.; Dementieva, D.; Fischer, F.; Gasser, U.; Groh, G.; Günnemann, S.; Hüllermeier, E.; et al. ChatGPT for Good? On Opportunities and Challenges of Large Language Models for Education. Learn. Individ. Differ. 2023 , 103 , 102274. [ Google Scholar ] [ CrossRef ]
  • Lacey, M.; Smith, D. Teaching and Assessment of the Future Today: Higher Education and AI. Microbiol. Aust. 2023 , 44 , 124–126. [ Google Scholar ] [ CrossRef ]
  • Milton, C.L. ChatGPT and Forms of Deception. Nurs. Sci. Q. 2023 , 36 , 232–233. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Zhang, B. ChatGPT, an Opportunity to Understand More About Language Models. Med. Ref. Serv. Q. 2023 , 42 , 194–201. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Castro, C.A. de A Discussion about the Impact of ChatGPT in Education: Benefits and Concerns. J. Bus. Theory Pract. 2023 , 11 , 28. [ Google Scholar ] [ CrossRef ]
  • Alasadi, E.A.; Baiz, C.R. Generative AI in Education and Research: Opportunities, Concerns, and Solutions. J. Chem. Educ. 2023 , 100 , 2965–2971. [ Google Scholar ] [ CrossRef ]
  • Barrot, J.S. Using ChatGPT for Second Language Writing: Pitfalls and Potentials. Assess. Writ. 2023 , 57 , 100745. [ Google Scholar ] [ CrossRef ]
  • Clark, T. Investigating the Use of an Artificial Intelligence Chatbot with General Chemistry Exam Questions. J. Chem. Educ. 2023 , 100 , 1905–1916. [ Google Scholar ] [ CrossRef ]
  • Cloesmeijer, M.; Janssen, A.; Koopman, S.; Cnossen, M.; Mathôt, R. ChatGPT in Pharmacometrics? Potential Opportunities and Limitations. Br. J. Clin. Pharmacol. 2023 , 90 , 360–365. [ Google Scholar ] [ CrossRef ]
  • Currie, G.; Singh, C.; Nelson, T.; Nabasenja, C.; Al-Hayek, Y.; Spuur, K. ChatGPT in Medical Imaging Higher Education. Radiography 2023 , 29 , 792–799. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Dalalah, D.; Dalalah, O. The False Positives and False Negatives of Generative AI Detection Tools in Education and Academic Research: The Case of ChatGPT. Int. J. Manag. Educ. 2023 , 21 , 100822. [ Google Scholar ] [ CrossRef ]
  • Ibrahim, H.; Asim, R.; Zaffar, F.; Rahwan, T.; Zaki, Y. Rethinking Homework in the Age of Artificial Intelligence. IEEE Intell. Syst. 2023 , 38 , 24–27. [ Google Scholar ] [ CrossRef ]
  • Kooli, C. Chatbots in Education and Research: A Critical Examination of Ethical Implications and Solutions. Sustainability 2023 , 15 , 5614. [ Google Scholar ] [ CrossRef ]
  • Masters, K. Ethical use of Artificial Intelligence in Health Professions Education: AMEE Guide No. 158. Med. Teach. 2023 , 45 , 574–584. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Michel-Villarreal, R.; Vilalta-Perdomo, E.; Salinas-Navarro, D.E.; Thierry-Aguilera, R.; Gerardou, F.S. Challenges and Opportunities of Generative AI for Higher Education as Explained by ChatGPT. Educ. Sci. 2023 , 13 , 856. [ Google Scholar ] [ CrossRef ]
  • Oh, N.; Choi, G.-S.; Lee, W.Y. ChatGPT Goes to the Operating Room: Evaluating GPT-4 Performance and its Potential in Surgical Education and Training in the Era of Large Language Models. Ann. Surg. Treat. Res. 2023 , 104 , 269. [ Google Scholar ] [ CrossRef ]
  • Pretorius, L. Fostering AI Literacy: A Teaching Practice Reflection. J. Acad. Lang. Learn. 2023 , 17 , T1–T8. [ Google Scholar ]
  • Rahman, M.; Watanobe, Y. ChatGPT for Education and Research: Opportunities, Threats, and Strategies. Appl. Sci. 2023 , 13 , 5783. [ Google Scholar ] [ CrossRef ]
  • Sanchez Ruiz, L.M.; Moll-López, S.; Nuñez-Pérez, A.; Moraño, J.; Vega, E. ChatGPT Challenges Blended Learning Methodologies in Engineering Education: A Case Study in Mathematics. Appl. Sci. 2023 , 13 , 6039. [ Google Scholar ] [ CrossRef ]
  • Shaikh, S.; Yildirim Yayilgan, S.; Klimova, B.; Pikhart, M. Assessing the Usability of ChatGPT for Formal English Language Learning. Eur. J. Investig. Health Psychol. Educ. 2023 , 13 , 1937–1960. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Thomas, S.P. Grappling with the Implications of ChatGPT for Researchers, Clinicians, and Educators. Issues Ment. Health Nurs. 2023 , 44 , 141–142. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Tsang, R. Practical Applications of ChatGPT in Undergraduate Medical Education. J. Med. Educ. Curric. Dev. 2023 , 10 , 23821205231178449. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Weng, T.-L.; Wang, Y.-M.; Chang, S.; Chen, T.-J.; Hwang, S.-J. ChatGPT Failed Taiwan’s Family Medicine Board Exam. J. Chin. Med. Assoc. 2023 , 86 , 762–766. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Bhattacharya, K.; Bhattacharya, N.; Bhattacharya, A.; Yagnik, V.; Garg, P. ChatGPT in Surgical Practice-a New Kid on the Block. Indian J. Surg. 2023 , 85 , 1346–1349. [ Google Scholar ] [ CrossRef ]
  • Eager, B.; Brunton, R. Prompting Higher Education Towards AI-Augmented Teaching and Learning Practice. J. Univ. Teach. Learn. Pract. 2023 , 20 , 2. [ Google Scholar ] [ CrossRef ]
  • French, F.; Levi, D.; Maczo, C.; Simonaityte, A.; Triantafyllidis, S.; Varda, G. Creative Use of OpenAI in Education: Case Studies from Game Development. Multimodal Technol. Interact. 2023 , 7 , 81. [ Google Scholar ] [ CrossRef ]
  • Fütterer, T.; Fischer, C.; Alekseeva, A.; Chen, X.; Tate, T.; Warschauer, M.; Gerjets, P. ChatGPT in Education: Global Reactions to AI Innovations. Sci. Rep. 2023 , 13 , 15310. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Glaser, N. Exploring the Potential of ChatGPT as an Educational Technology: An Emerging Technology Report. Technol. Knowl. Learn. 2023 , 28 , 1945–1952. [ Google Scholar ] [ CrossRef ]
  • Livberber, T.; Ayvaz, S. The Impact of Artificial Intelligence in Academia: Views of Turkish Academics on ChatGPT. Heliyon 2023 , 9 , e19688. [ Google Scholar ] [ CrossRef ]
  • Marzuki; Widiati, U.; Rusdin, D.; Darwin; Indrawati, I. The Impact of AI Writing Tools on the Content and Organization of Students’ Writing: EFL Teachers’ Perspective. Cogent Educ. 2023 , 10 , 2236469. [ Google Scholar ] [ CrossRef ]
  • Abu hammour, K.; Alhamad, H.; Al-Ashwal, F.; Halboup, A.; Abu Farha, R. ChatGPT in Pharmacy Practice: A Cross-Sectional Exploration of Jordanian Pharmacists’ Perception, Practice, and Concerns. J. Pharm. Policy Pract. 2023 , 16 , 115. [ Google Scholar ] [ CrossRef ]
  • Borger, J.; Ng, A.; Anderton, H.; Ashdown, G.; Auld, M.; Blewitt, M.; Brown, D.; Call, M.; Collins, P.; Freytag, S.; et al. Artificial Intelligence Takes Center Stage: Exploring the Capabilities and Implications of ChatGPT and Other AI-Assisted Technologies in Scientific Research and Education. Immunol. Cell Biol. 2023 , 101 , 923–935. [ Google Scholar ] [ CrossRef ]
  • Chang, D.; Lin, M.; Hajian, S.; Wang, Q. Educational Design Principles of Using AI Chatbot That Supports Self-Regulated Learning in Education: Goal Setting, Feedback, and Personalization. Sustainability 2023 , 15 , 12921. [ Google Scholar ] [ CrossRef ]
  • Ellis, A.; Slade, E. A New Era of Learning: Considerations for ChatGPT as a Tool to Enhance Statistics and Data Science Education. J. Stat. Data Sci. Educ. 2023 , 31 , 128–133. [ Google Scholar ] [ CrossRef ]
  • Hosseini, M.; Gao, C.; Liebovitz, D.; Carvalho, A.; Ahmad, F.; Luo, Y.; MacDonald, N.; Holmes, K.; Kho, A. An Exploratory Survey about Using ChatGPT in Education, Healthcare, and Research. PLoS ONE 2023 , 18 , e0292216. [ Google Scholar ] [ CrossRef ]
  • Ibrahim, H.; Liu, F.; Asim, R.; Battu, B.; Benabderrahmane, S.; Alhafni, B.; Adnan, W.; Alhanai, T.; AlShebli, B.; Baghdadi, R.; et al. Perception, Performance, and Detectability of Conversational Artificial Intelligence across 32 University Courses. Sci. Rep. 2023 , 13 , 12187. [ Google Scholar ] [ CrossRef ]
  • Kohnke, L.; Moorhouse, B.; Zou, D. ChatGPT for Language Teaching and Learning. Relc J. 2023 , 54 , 537–550. [ Google Scholar ] [ CrossRef ]
  • Lower, K.; Seth, I.; Lim, B.; Seth, N. ChatGPT-4: Transforming Medical Education and Addressing Clinical Exposure Challenges in the Post-pandemic Era. Indian J. Orthop. 2023 , 57 , 1527–1544. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Mohapatra, D.; MT, F.; Tripathy, S.; Rajan, S.; Vathulya, M.; Lakshmi, P.; Singh, V.; Haq, A. Leveraging Large Language Models (LLM) for the Plastic Surgery Resident Training: Do They Have a Role? Indian J. Plast. Surg. 2023 , 56 , 413–420. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Moshirfar, M.; Altaf, A.W.; Stoakes, I.M.; Tuttle, J.J.; Hoopes, P.C. Artificial Intelligence in Ophthalmology: A Comparative Analysis of GPT-3.5, GPT-4, and Human Expertise in Answering StatPearls Questions. Cureus 2023 , 15 , e40822. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Passmore, J.; Woodward, W. Coaching Education: Wake up to the New Digital and AI Coaching Revolution! Int. Coach. Psychol. Rev. 2023 , 18 , 58–72. [ Google Scholar ] [ CrossRef ]
  • Su, Y.; Lin, Y.; Lai, C. Collaborating with ChatGPT in Argumentative Writing Classrooms. Assess. Writ. 2023 , 57 , 100752. [ Google Scholar ] [ CrossRef ]
  • Walters, W.H.; Wilder, E.I. Fabrication and Errors in the Bibliographic Citations Generated by ChatGPT. Sci. Rep. 2023 , 13 , 14045. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Wilby, R.; Esson, J. AI Literacy in Geographic Education and Research: Capabilities, Caveats, and Criticality. Geogr. J. 2023 , 190 , e12548. [ Google Scholar ] [ CrossRef ]
  • Bender, S.M. Coexistence and Creativity: Screen Media Education in the Age of Artificial Intelligence Content Generators. Media Pract. Educ. 2023 , 24 , 351–366. [ Google Scholar ] [ CrossRef ]
  • Chiu, T.K.F. The Impact of Generative AI (GenAI) on Practices, Policies and Research Direction in Education: A Case of ChatGPT and Midjourney. Interact. Learn. Environ. 2023 . [ Google Scholar ] [ CrossRef ]
  • Clark, T.; Anderson, E.; Dickson-Karn, N.; Soltanirad, C.; Tafini, N. Comparing the Performance of College Chemistry Students with ChatGPT for Calculations Involving Acids and Bases. J. Chem. Educ. 2023 , 100 , 3934–3944. [ Google Scholar ] [ CrossRef ]
  • Ruiz, L.; Acosta-Vargas, P.; De-Moreta-Llovet, J.; Gonzalez, M. Empowering Education with Generative Artificial Intelligence Tools: Approach with an Instructional Design Matrix. Sustainability 2023 , 15 , 11524. [ Google Scholar ] [ CrossRef ]
  • Sison, A.J.; Daza, M.; Gozalo-Brizuela, R.; Garrido-Merchán, E. ChatGPT: More Than a Weapon of Mass Deception, Ethical Challenges and Responses from the Human-Centered Artificial Intelligence (HCAI) perspective. Int. J. Hum. Comput. Interact. 2023 . [ Google Scholar ] [ CrossRef ]
  • Zhang, B.; Qian, M. ChatGPT Related Technology and Its Applications in the Medical Field. Adv. Ultrasound Diagn. Ther. 2023 , 7 , 158. [ Google Scholar ] [ CrossRef ]
  • Dobbs, T. ChatGPT: Do We Need to Write Anything Ever Again? Bulletin 2023 , 105 , 82–83. [ Google Scholar ] [ CrossRef ]
  • Johnson, W. How to Harness Generative AI to Accelerate Human Learning. Int. J. Artif. Intell. Educ. 2023 , 1–5. [ Google Scholar ] [ CrossRef ]
  • Kikerpill, K.; Siibak, A. App-Hazard Disruption: An Empirical Investigation of Media Discourses on ChatGPT in Educational Contexts. Comput. Sch. 2023 , 40 , 334–355. [ Google Scholar ] [ CrossRef ]
  • Marquez, R.; Barrios, N.; Vera, R.E.; Mendez, M.E.; Tolosa, L.; Zambrano, F.; Li, Y. A Perspective on the Synergistic Potential of Artificial Intelligence and Product-Based Learning Strategies in Biobased Materials Education. Educ. Chem. Eng. 2023 , 44 , 164–180. [ Google Scholar ] [ CrossRef ]
  • Ratten, V.; Jones, P. Generative Artificial Intelligence (ChatGPT): Implications for Management Educators. Int. J. Manag. Educ. 2023 , 21 , 100857. [ Google Scholar ] [ CrossRef ]
  • Kaneda, Y.; Takahashi, R.; Kaneda, U.; Akashima, S.; Okita, H.; Misaki, S.; Yamashiro, A.; Ozaki, A.; Tanimoto, T. Assessing the Performance of GPT-3.5 and GPT-4 on the 2023 Japanese Nursing Examination. Cureus 2023 , 15 , e42924. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Luo, W.; He, H.; Liu, J.; Berson, I.; Berson, M.; Zhou, Y.; Li, H. Aladdin’s Genie or Pandora’s Box for Early Childhood Education? Experts Chat on the Roles, Challenges, and Developments of ChatGPT. Early Educ. Dev. 2023 , 2023 , 2214181. [ Google Scholar ] [ CrossRef ]
  • Orrù, G.; Piarulli, A.; Conversano, C.; Gemignani, A. Human-like Problem-Solving Abilities in Large Language Models Using ChatGPT. Front. Artif. Intell. 2023 , 6 , 1199350. [ Google Scholar ] [ CrossRef ]
  • Giannos, P. Evaluating the Limits of AI in Medical Specialisation: ChatGPT’s Performance on the UK Neurology Specialty Certificate Examination. BMJ Neurol. Open 2023 , 5 , e000451. [ Google Scholar ] [ CrossRef ]
  • Romero-Rodríguez, J.-M.; Ramírez-Montoya, M.-S.; Buenestado-Fernández, M.; Lara Lara, F. Use of ChatGPT at University as a Tool for Complex Thinking: Students’ Perceived Usefulness. J. New Approaches Educ. Res. 2023 , 12 , 323–339. [ Google Scholar ] [ CrossRef ]
  • Yan, D. Impact of ChatGPT on Learners in a L2 Writing Practicum: An Exploratory Investigation. Educ. Inf. Technol. 2023 , 28 , 13943–13967. [ Google Scholar ] [ CrossRef ]
  • Elkhatat, A.; Elsaid, K.; Almeer, S. Evaluating the Efficacy of AI Content Detection Tools in Differentiating between Human and AI-Generated Text. Int. J. Educ. Integr. 2023 , 19 , 17. [ Google Scholar ] [ CrossRef ]
  • Kortemeyer, G. Could an Artificial-Intelligence Agent Pass an Introductory Physics Course? Phys. Rev. Phys. Educ. Res. 2023 , 19 , 010132. [ Google Scholar ] [ CrossRef ]
  • Perkins, M.; Roe, J. Decoding Academic Integrity Policies: A Corpus Linguistics Investigation of AI and Other Technological Threats. High Educ Policy 2023 , 1–21. [ Google Scholar ] [ CrossRef ]
  • Chan, C. A Comprehensive AI Policy Education Framework for University Teaching and Learning. Int. J. Educ. Technol. High. Educ. 2023 , 20 , 38. [ Google Scholar ] [ CrossRef ]
  • Esplugas, M. The Use of Artificial Intelligence (AI) to Enhance Academic Communication, Education and Research: A Balanced Approach. J. Hand Surg. Eur. Vol. 2023 , 48 , 819–822. [ Google Scholar ] [ CrossRef ]
  • Shoufan, A. Exploring Students’ Perceptions of ChatGPT: Thematic Analysis and Follow-Up Survey. IEEE Access 2023 , 11 , 38805–38818. [ Google Scholar ] [ CrossRef ]
  • Alshater, M.M. Exploring the Role of Artificial Intelligence in Enhancing Academic Performance: A Case Study of ChatGPT. SSRN 2022 . pre-print . [ Google Scholar ] [ CrossRef ]
  • Deshpande, S.; Szefer, J. Analyzing ChatGPT’s Aptitude in an Introductory Computer Engineering Course. arXiv 2023 , arXiv:2304.06122. [ Google Scholar ] [ CrossRef ]
  • Chan, C.K.Y.; Hu, W. Students’ Voices on Generative AI: Perceptions, Benefits, and Challenges in Higher Education. arXiv 2023 , arXiv:2305.00290. [ Google Scholar ] [ CrossRef ]


Search Strings

  • ((ChatGPT OR Chat gpt) AND (challenge OR pros OR cons OR benefit OR advantage OR problem OR disadvantage OR harm OR support))
  • ((ChatGPT OR Chat gpt) AND (future OR application OR possibility))
  • ((ChatGPT OR Chat gpt) AND (opinion OR feeling OR attitude OR user OR professional OR evaluation OR evaluate OR experience OR perception OR misconduct OR ethics OR integrity))
  • ((ChatGPT OR Chat gpt) AND (teaching OR assignment OR homework OR education OR school OR student OR teacher OR practice OR project))
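The Boolean strings above can be applied programmatically when screening candidate records. The sketch below is illustrative only: the review does not describe an automated pipeline, and `matches_search_string` is a hypothetical helper that assumes simple case-insensitive substring matching rather than a real database's query syntax.

```python
# Hypothetical sketch: screen titles against the four Boolean search strings,
# each of the form (ChatGPT OR "Chat gpt") AND (<topic keyword group>).

def matches_search_string(text: str) -> bool:
    """Return True if the text matches any of the four search strings."""
    text = text.lower()
    has_chatgpt = "chatgpt" in text or "chat gpt" in text
    keyword_groups = [
        ["challenge", "pros", "cons", "benefit", "advantage",
         "problem", "disadvantage", "harm", "support"],
        ["future", "application", "possibility"],
        ["opinion", "feeling", "attitude", "user", "professional",
         "evaluation", "evaluate", "experience", "perception",
         "misconduct", "ethics", "integrity"],
        ["teaching", "assignment", "homework", "education", "school",
         "student", "teacher", "practice", "project"],
    ]
    # A record matches if it mentions ChatGPT AND hits any keyword group.
    return has_chatgpt and any(
        any(kw in text for kw in group) for group in keyword_groups
    )

titles = [
    "ChatGPT in Teaching and Learning: A Systematic Review",
    "Transformer Architectures for Vision",
]
print([matches_search_string(t) for t in titles])  # [True, False]
```

Real databases (Scopus, Web of Science) parse Boolean operators natively, so in practice these strings would be pasted into the database's advanced search rather than evaluated locally.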
Inclusion Criteria

  • Peer-reviewed journal articles.
  • Availability of full texts.
  • Published in the English language.
  • Published any time after 30 November 2022 (the launch day of ChatGPT).

Exclusion Criteria

  • Conference articles, non-peer-reviewed publications, review articles, book chapters, magazine articles, theses, and notes to editors.
  • Articles not directly related to the application of ChatGPT in educational contexts.
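For illustration only, the inclusion/exclusion rules could be encoded as a screening predicate over candidate records. The `include` function and its field names are assumptions for this sketch, not part of the review's actual tooling.

```python
# Hypothetical sketch: encode the review's inclusion/exclusion criteria
# as a single screening predicate. The record format is assumed.
from datetime import date

CHATGPT_LAUNCH = date(2022, 11, 30)  # cutoff: published after this date
EXCLUDED_TYPES = {"conference", "review", "book chapter",
                  "magazine", "thesis", "editor note"}

def include(record: dict) -> bool:
    """Return True if a record passes every inclusion criterion."""
    return (
        record["peer_reviewed"]
        and record["full_text_available"]
        and record["language"] == "English"
        and record["published"] > CHATGPT_LAUNCH
        and record["type"] not in EXCLUDED_TYPES
        and record["about_chatgpt_in_education"]
    )

record = {
    "peer_reviewed": True, "full_text_available": True,
    "language": "English", "published": date(2023, 5, 1),
    "type": "journal article", "about_chatgpt_in_education": True,
}
print(include(record))  # True
```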
Advantages

  • Natural Language Processing
  • Enhanced Communication
  • Learning Engagement and Accessibility
  • Performance Evaluation
  • Text Generation
  • Versatility (ability to adapt to many different functions)
  • Other

Disadvantages

  • Error Recognition
  • Plagiarism and Authenticity
  • Quality of Responses and Bias in AI
  • Other
  • Dependency
  • Privacy and Data Security

Share and Cite

Ali, D.; Fatemi, Y.; Boskabadi, E.; Nikfar, M.; Ugwuoke, J.; Ali, H. ChatGPT in Teaching and Learning: A Systematic Review. Educ. Sci. 2024 , 14 , 643. https://doi.org/10.3390/educsci14060643



Content Marketing Institute

B2B Content Marketing Benchmarks, Budgets, and Trends: Outlook for 2024 [Research]

B2B Content Marketing Trends for 2024

By Stephanie Stahl | Published: October 18, 2023 | Trends and Research

Creating standards, guidelines, processes, and workflows for content marketing is not the sexiest job.

But setting standards is the only way to know if you can improve anything (with AI or anything else).

Here’s the good news: All that non-sexy work frees time and resources (human and tech) you can apply to bring your brand’s strategies and plans to life.  

But in many organizations, content still isn’t treated as a coordinated business function. That’s one of the big takeaways from our latest research, B2B Content Marketing Benchmarks, Budgets, and Trends: Outlook for 2024, conducted with MarketingProfs and sponsored by Brightspot.

A few symptoms of that reality showed up in the research:

  • Marketers cite a lack of resources as a top situational challenge, the same as they did the previous year.
  • Nearly three-quarters (72%) say they use generative AI, but 61% say their organization lacks guidelines for its use.
  • The most frequently cited challenges include creating the right content, creating content consistently, and differentiating content.

I’ll walk you through the findings and share some advice from CMI Chief Strategy Advisor Robert Rose and other industry voices to shed light on what it all means for B2B marketers. There’s a lot to work through, so feel free to use the table of contents to navigate to the sections that most interest you.

Note: These numbers come from a July 2023 survey of marketers around the globe. We received 1,080 responses. This article focuses on answers from the 894 B2B respondents.

Table of contents

  • Team structure
  • Content marketing challenges
  • Content types, distribution channels, and paid channels
  • Social media
  • Content management and operations
  • Measurement and goals
  • Overall success
  • Budgets and spending
  • Top content-related priorities for 2024
  • Content marketing trends for 2024
  • Action steps
  • Methodology

AI: 3 out of 4 B2B marketers use generative tools

Of course, we asked respondents how they use generative AI in content and marketing. As it turns out, most experiment with it: 72% of respondents say they use generative AI tools.

But a lack of standards can get in the way.

“Generative AI is the new, disruptive capability entering the realm of content marketing in 2024,” Robert says. “It’s just another way to make our content process more efficient and effective. But it can’t do either until you establish a standard to define its value. Until then, it’s just yet another technology that may or may not make you better at what you do.”

So, how do content marketers use the tools today? About half (51%) use generative AI to brainstorm new topics. Many use the tools to research headlines and keywords (45%) and write drafts (45%). Fewer say they use AI to outline assignments (23%), proofread (20%), generate graphics (11%), and create audio (5%) and video (5%).

Content Marketing Trends for 2024: B2B marketers use generative AI for various content tasks.

Some marketers say they use AI to do things like generate email headlines and email copy, extract social media posts from long-form content, condense long-form copy into short form, etc.

Only 28% say they don’t use generative AI tools.

Most don’t pay for generative AI tools (yet)

Among those who use generative AI tools, 91% use free tools (e.g., ChatGPT). Thirty-eight percent use tools embedded in their content creation/management systems, and 27% pay for tools such as Writer and Jasper.

AI in content remains mostly ungoverned

Asked if their organizations have guidelines for using generative AI tools, 31% say yes, 61% say no, and 8% are unsure.

Content Marketing Trends for 2024: Many B2B organizations lack guidelines for generative AI tools.

We asked Ann Handley, chief content officer of MarketingProfs, for her perspective. “It feels crazy … 61% have no guidelines? But is it actually shocking and crazy? No. It is not. Most of us are just getting going with generative AI. That means there is a clear and rich opportunity to lead from where you sit,” she says.

“Ignite the conversation internally. Press upon your colleagues and your leadership that this isn’t a technology opportunity. It’s also a people and operational challenge in need of thoughtful and intelligent response. You can be the AI leader your organization needs,” Ann says.

Why some marketers don’t use generative AI tools

While a lack of guidelines may deter some B2B marketers from using generative AI tools, other reasons include accuracy concerns (36%), lack of training (27%), and lack of understanding (27%). Twenty-two percent cite copyright concerns, and 19% have corporate mandates not to use them.

Content Marketing Trends for 2024: Reasons why B2B marketers don't use generative AI tools.

How AI is changing SEO

We also wondered how AI’s integration in search engines shifts content marketers’ SEO strategy. Here’s what we found:

  • 31% are sharpening their focus on user intent/answering questions.
  • 27% are creating more thought leadership content.
  • 22% are creating more conversational content.

Over one-fourth (28%) say they’re not doing any of those things, while 26% say they’re unsure.

AI may heighten the need to rethink your SEO strategy. But it’s not the only reason to do so, as Orbit Media Studios co-founder and chief marketing officer Andy Crestodina points out: “Featured snippets and people-also-ask boxes have chipped away at click-through rates for years,” he says. “AI will make that even worse … but only for information intent queries. Searchers who want quick answers really don’t want to visit websites.

“Focus your SEO efforts on those big questions with big answers – and on the commercial intent queries,” Andy continues. “Those phrases still have ‘visit website intent’ … and will for years to come.”

Will the AI obsession ever end?

Many B2B marketers surveyed predict AI will dominate the discussions of content marketing trends in 2024. As one respondent says: “AI will continue to be the shiny thing through 2024 until marketers realize the dedication required to develop prompts, go through the iterative process, and fact-check output. AI can help you sharpen your skills, but it isn’t a replacement solution for B2B marketing.”

Back to table of contents

Team structure: How does the work get done?

Generative AI isn’t the only issue affecting content marketing these days. We also asked marketers about how they organize their teams .

Among larger companies (100-plus employees), half say content requests go through a centralized content team. Others say each department/brand produces its own content (23%) or that departments, brands, and products share responsibility for content (21%).

Content Marketing Trends for 2024: In large organizations, requests for B2B content often go through a central team.

Content strategies integrate with marketing, comms, and sales

Seventy percent say their organizations integrate content strategy into the overall marketing/sales/communications strategy, and 2% say it’s integrated into another strategy. Eleven percent say content is a stand-alone strategy for content used for marketing, and 6% say it’s a stand-alone strategy for all content produced by the company. Only 9% say they don’t have a content strategy. The remaining 2% say other or are unsure.

Employee churn means new teammates; content teams experience enlightened leadership

Twenty-eight percent of B2B marketers say team members resigned in the last year, 20% say team members were laid off, and about half (49%) say they had new team members acclimating to their ways of working.

While team members come and go, leaders’ understanding of content work appears to hold steady. Over half (54%) strongly agree, and 30% somewhat agree, that the leader to whom their content team reports understands the work they do. Only 11% disagree. The remaining 5% neither agree nor disagree.

And remote work seems well-tolerated: Only 20% say collaboration was challenging due to remote or hybrid work.

Content marketing challenges: Focus shifts to creating the right content

We asked B2B marketers about both content creation and non-creation challenges.

Content creation

Most marketers (57%) cite creating the right content for their audience as a challenge. This is a change from many years when “creating enough content” was the most frequently cited challenge.

One respondent points out why understanding what audiences want is more important than ever: “As the internet gets noisier and AI makes it incredibly easy to create listicles and content that copy each other, there will be a need for companies to stand out. At the same time, as … millennials and Gen Z [grow in the workforce], we’ll begin to see B2B become more entertaining and less boring. We were never only competing with other B2B content. We’ve always been competing for attention.”

Other content creation challenges include creating it consistently (54%) and differentiating it (54%). Close to half (45%) cite optimizing for search and creating quality content (44%). About a third (34%) cite creating enough content to keep up with internal demand, 30% say creating enough content to keep up with external demand, and 30% say creating content that requires technical skills.

Content Marketing Trends for 2024: B2B marketers' content creation challenges.

Other hurdles

The most frequently cited non-creation challenge, by far, is a lack of resources (58%), followed by aligning content with the buyer’s journey (48%) and aligning content efforts across sales and marketing (45%). Forty-one percent say they have issues with workflow/content approval, and 39% say they have difficulty accessing subject matter experts. Thirty-four percent say it is difficult to keep up with new technologies/tools (e.g., AI). Only 25% cite a lack of strategy as a challenge, 19% say keeping up with privacy rules, and 15% point to tech integration issues.

Content Marketing Trends for 2024: Situational challenges B2B content creation teams face.

We asked content marketers about the types of content they produce, their distribution channels, and paid content promotion. We also asked which formats and channels produce the best results.

Popular content types and formats

As in the previous year, the three most popular content types/formats are short articles/posts (94%, up from 89% last year), videos (84%, up from 75% last year), and case studies/customer stories (78%, up from 67% last year). Almost three-quarters (71%) use long articles, 60% produce visual content, and 59% craft thought leadership e-books or white papers. Less than half of marketers use brochures (49%), product or technical data sheets (45%), research reports (36%), interactive content (33%), audio (29%), and livestreaming (25%).

Content Marketing Trends for 2024: Types of content B2B marketers used in the last 12 months.

Effective content types and formats

Which formats are most effective? Fifty-three percent say case studies/customer stories and videos deliver some of their best results. Almost as many (51%) name thought leadership e-books or white papers, 47% say short articles, and 43% say research reports.

Content Marketing Trends for 2024: Types of content that produce the best results for B2B marketers.

Popular content distribution channels

Regarding the channels used to distribute content, 90% use social media platforms (organic), followed by blogs (79%), email newsletters (73%), email (66%), in-person events (56%), and webinars (56%).

Channels used by a minority of those surveyed include:

  • Digital events (44%)
  • Podcasts (30%)
  • Microsites (29%)
  • Digital magazines (21%)
  • Branded online communities (19%)
  • Hybrid events (18%)
  • Print magazines (16%)
  • Online learning platforms (15%)
  • Mobile apps (8%)
  • Separate content brands (5%)

Content Marketing Trends for 2024: Distribution channels B2B marketers used in the last 12 months.

Effective content distribution channels

Which channels perform the best? Most marketers in the survey point to in-person events (56%) and webinars (51%) as producing better results. Email (44%), organic social media platforms (44%), blogs (40%), and email newsletters (39%) round out the list.

Content Marketing Trends for 2024: Distributions channels that produce the best results for B2B marketers.

Popular paid content channels

When marketers pay to promote content, which channels do they invest in? Eighty-six percent use paid content distribution channels.

Of those, 78% use social media advertising/promoted posts, 65% use sponsorships, 64% use search engine marketing (SEM)/pay-per-click, and 59% use digital display advertising. Far fewer invest in native advertising (35%), partner emails (29%), and print display ads (21%).

Effective paid content channels

SEM/pay-per-click produces good results, according to 62% of those surveyed. Half of those who use paid channels say social media advertising/promoted posts produce good results, followed by sponsorships (49%), partner emails (36%), and digital display advertising (34%).

Content Marketing Trends for 2024: Paid channels that produce the best results for B2B marketers.

Social media use: One platform rises way above

When asked which organic social media platforms deliver the best value for their organization, B2B marketers picked LinkedIn by far (84%). Only 29% cite Facebook as a top performer, 22% say YouTube, and 21% say Instagram. Twitter and TikTok see 8% and 3%, respectively.

Content Marketing Trends for 2024: LinkedIn delivers the best value for B2B marketers.

So it makes sense that 72% say they increased their use of LinkedIn over the last 12 months, while only 32% boosted their YouTube presence, 31% increased Instagram use, 22% grew their Facebook presence, and 10% increased X and TikTok use.

Which platforms are marketers giving up? Did you guess X? You’re right – 32% of marketers say they decreased their X use last year. Twenty percent decreased their use of Facebook, with 10% decreasing on Instagram, 9% pulling back on YouTube, and only 2% decreasing their use of LinkedIn.

Content Marketing Trends for 2024: B2B marketers' use of organic social media platforms in the last 12 months.

Interestingly, we saw a significant rise in B2B marketers who use TikTok: 19% say they use the platform – more than double from last year.

To explore how teams manage content, we asked marketers about their technology use and investments and the challenges they face when scaling their content.

Content management technology

When asked which technologies they use to manage content, marketers point to:

  • Analytics tools (81%)
  • Social media publishing/analytics (72%)
  • Email marketing software (69%)
  • Content creation/calendaring/collaboration/workflow (64%)
  • Content management system (50%)
  • Customer relationship management system (48%)

But having technology doesn’t mean it’s the right technology (or that its capabilities are used). So, we asked if they felt their organization had the right technology to manage content across the organization.

Only 31% say yes. Thirty percent say they have the technology but aren’t using its potential, and 29% say they haven’t acquired the right technology. Ten percent are unsure.

Content Marketing Trends for 2024: Many B2B marketers lack the right content management technology.

Content tech spending will likely rise

Even so, investment in content management technology seems likely in 2024: 45% say their organization is likely to invest in new technology, whereas 32% say their organization is unlikely to do so. Twenty-three percent say their organization is neither likely nor unlikely to invest.

Content Marketing Trends for 2024: Nearly half of B2B marketers expect investment in additional content management technology in 2024.

Scaling content production

We introduced a new question this year to understand what challenges B2B marketers face while scaling content production.

Almost half (48%) say it’s “not enough content repurposing.” Lack of communication across organizational silos is a problem for 40%. Thirty-one percent say they have no structured content production process, and 29% say they lack an editorial calendar with clear deadlines. Ten percent say scaling is not a current focus.

Other hurdles include difficulty locating digital content assets (16%), technology issues (15%), translation/localization issues (12%), and the lack of a style guide (11%).

Content Marketing Trends for 2024: Challenges B2B marketers face while scaling content production.

For those struggling with content repurposing, content standardization is critical. “Content reuse is the only way to deliver content at scale. There’s just no other way,” says Regina Lynn Preciado, senior director of content strategy solutions at Content Rules Inc.

“Even if you’re not trying to provide the most personalized experience ever or dominate the metaverse with your omnichannel presence, you absolutely must reuse content if you are going to deliver content effectively,” she says.

“How to achieve content reuse? You’ve probably heard that you need to move to modular, structured content. However, just chunking your content into smaller components doesn’t go far enough. For content to flow together seamlessly wherever you reuse it, you’ve got to standardize your content. That’s the personalization paradox right there. To personalize, you must standardize.

“Once you have your content standards in place and everyone is creating content in alignment with those standards, there is no limit to what you can do with the content,” Regina explains.
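Regina’s point about standardized, modular content can be sketched in code: once every module follows the same schema, assembling channel-specific outputs from the same source is mechanical. A minimal illustration in Python (the module fields, channel names, and copy are all invented examples, not anything Content Rules prescribes):

```python
# Minimal sketch of modular content reuse: every module follows one
# schema (headline + body), so any channel can reassemble the same
# source modules. All names and copy here are invented examples.

modules = {
    "intro": {"headline": "Why reuse matters", "body": "Reuse delivers content at scale."},
    "proof": {"headline": "Customer results", "body": "Acme cut production time 40%."},
}

def render(channel, order):
    """Assemble standardized modules into channel-specific output."""
    if channel == "email":
        return "\n\n".join(f"{modules[k]['headline']}\n{modules[k]['body']}" for k in order)
    if channel == "social":
        # A social post reuses only the first module's headline.
        return modules[order[0]]["headline"]
    raise ValueError(f"unknown channel: {channel}")

email_body = render("email", ["intro", "proof"])
social_post = render("social", ["proof"])
```

Because every module is standardized, supporting a new channel means adding one renderer, not rewriting the content.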

Why do content marketers – who are skilled communicators – struggle with cross-silo communication? Standards and alignment come into play.

“I think in the rush to all the things, we run out of time to address scalable processes that will fix those painful silos, including taking time to align on goals, roles and responsibilities, workflows, and measurement,” says Ali Orlando Wert, senior director of content strategy at Appfire. “It takes time, but the payoffs are worth it. You have to learn how to crawl before you can walk – and walk before you can run.”

Measurement and goals: Generating sales and revenue rises

Almost half (46%) of B2B marketers agree their organization measures content performance effectively. Thirty-six percent disagree, and 15% neither agree nor disagree. Only 3% say they don’t measure content performance.

The five most frequently used metrics to assess content performance are conversions (73%), email engagement (71%), website traffic (71%), website engagement (69%), and social media analytics (65%).

About half (52%) mention the quality of leads, 45% say they rely on search rankings, 41% use quantity of leads, 32% track email subscribers, and 29% track the cost to acquire a lead, subscriber, or customer.
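A couple of the metrics named above reduce to simple arithmetic. A minimal sketch, with invented input numbers:

```python
# Two survey metrics as arithmetic. All input numbers are invented.

def conversion_rate(conversions, visits):
    """Share of visits that convert."""
    return conversions / visits if visits else 0.0

def cost_per_lead(spend, leads):
    """Spend divided by leads acquired."""
    return spend / leads if leads else float("inf")

rate = conversion_rate(conversions=45, visits=1500)  # 3% of visits convert
cpl = cost_per_lead(spend=5000.0, leads=125)         # $40 per lead
```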

Content Marketing Trends for 2024: Metrics B2B marketers rely on most to evaluate content performance.

The most common challenge B2B marketers have while measuring content performance is integrating/correlating data across multiple platforms (84%), followed by extracting insights from data (77%), tying performance data to goals (76%), organizational goal setting (70%), and lack of training (66%).
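The integration challenge is concrete: each platform exports its own rows, and correlating them means joining on a shared key such as URL. A minimal sketch of that join (the platform names, field names, and numbers are invented):

```python
# Sketch of the cross-platform correlation problem: two exports,
# one combined record per URL after the join. Platform names,
# fields, and numbers are invented; real exports rarely share
# keys this cleanly.

web_analytics = [
    {"url": "/guide", "sessions": 1200, "conversions": 30},
    {"url": "/case-study", "sessions": 800, "conversions": 40},
]
email_clicks = [
    {"url": "/guide", "clicks": 150},
    {"url": "/case-study", "clicks": 90},
]

def merge_by_url(*exports):
    """Fold rows from each export into one combined record per URL."""
    combined = {}
    for export in exports:
        for row in export:
            combined.setdefault(row["url"], {}).update(row)
    return combined

report = merge_by_url(web_analytics, email_clicks)
```

In practice the hard part is that platforms disagree on keys and definitions (URL vs. campaign ID, what counts as a “click”), which is why 84% call this their top measurement challenge.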

Content Marketing Trends for 2024: B2B marketers' challenges with measuring content performance.

Regarding goals, 84% of B2B marketers say content marketing helped create brand awareness in the last 12 months. Seventy-six percent say it helped generate demand/leads; 63% say it helped nurture subscribers/audiences/leads, and 58% say it helped generate sales/revenue (up from 42% the previous year).

Content Marketing Trends for 2024: Goals B2B marketers achieved by using content marketing in the last 12 months.

Success factors: Know your audience

To separate top performers from the pack, we asked the B2B marketers to assess the success of their content marketing approach.

Twenty-eight percent rate the success of their organization’s content marketing approach as extremely or very successful. Another 57% report moderate success, and 15% feel minimally or not at all successful.

The most popular factor for successful marketers is knowing their audience (79%).

This makes sense, considering that “creating the right content for our audience” is the top challenge. The logic? Top-performing content marketers prioritize knowing their audiences to create the right content for those audiences.

Top performers also set goals that align with their organization’s objectives (68%), effectively measure and demonstrate content performance (61%), and show thought leadership (60%). Collaboration with other teams (55%) and a documented strategy (53%) also help top performers reach high levels of content marketing success.

Content Marketing Trends for 2024: Top performers often attribute their B2B content marketing success to knowing their audience.

We looked at several other dimensions to identify how top performers differ from their peers. Of note, top performers:

  • Are backed by leaders who understand the work they do.
  • Are more likely to have the right content management technologies.
  • Have better communication across organizational silos.
  • Do a better job of measuring content effectiveness.
  • Are more likely to use content marketing successfully to generate demand/leads, nurture subscribers/audiences/leads, generate sales/revenue, and grow a subscribed audience.

Little difference exists between top performers and their less successful peers when it comes to the adoption of generative AI tools and related guidelines. It will be interesting to see if and how that changes next year.

Content Marketing Trends for 2024: Key areas where B2B top-performing content marketers differ from their peers.

Budgets and spending: Holding steady

To explore budget plans for 2024, we asked respondents if they have knowledge of their organization’s budget/budgeting process for content marketing. Then, we asked follow-up questions to the 55% who say they do have budget knowledge.

Content marketing as a percentage of total marketing spend

Here’s what they say about the total marketing budget (excluding salaries):

  • About a quarter (24%) say content marketing takes up one-fourth or more of the total marketing budget.
  • Nearly one in three (29%) indicate that 10% to 24% of the marketing budget goes to content marketing.
  • Just under half (48%) say less than 10% of the marketing budget goes to content marketing.

Content marketing budget outlook for 2024

Next, we asked about their 2024 content marketing budget. Forty-five percent think their content marketing budget will increase compared with 2023, whereas 42% think it will stay the same. Only 6% think it will decrease.

Content Marketing Trends for 2024: How B2B content marketing budgets will change in 2024.

Where will the budget go?

We also asked where respondents plan to increase their spending.

Sixty-nine percent of B2B marketers say they would increase their investment in video, followed by thought leadership content (53%), in-person events (47%), paid advertising (43%), online community building (33%), webinars (33%), audio content (25%), digital events (21%), and hybrid events (11%).

Content Marketing Trends for 2024: Percentage of B2B marketers who think their organization will increase in the following areas in 2024.

The increased investment in video isn’t surprising. The focus on thought leadership content might surprise, but it shouldn’t, says Stephanie Losee, director of executive and ABM content at Autodesk.

“As measurement becomes more sophisticated, companies are finding they’re better able to quantify the return from upper-funnel activities like thought leadership content,” she says. “At the same time, companies recognize the impact of shifting their status from vendor to true partner with their customers’ businesses.

“Autodesk recently launched its first global, longitudinal State of Design & Make report (registration required), and we’re finding that its insights are of such value to our customers that it’s enabling conversations we’ve never been able to have before. These conversations are worth gold to both sides, and I would imagine other B2B companies are finding the same thing,” Stephanie says.

Top content-related priorities for 2024: Leading with thought leadership

We asked an open-ended question about marketers’ top three content-related priorities for 2024. The responses indicate marketers place an emphasis on thought leadership and becoming a trusted resource.

Other frequently mentioned priorities include:

  • Better understanding of the audience
  • Discovering the best ways to use AI
  • Increasing brand awareness
  • Lead generation
  • Using more video
  • Better use of analytics
  • Conversions
  • Repurposing existing content

Content marketing predictions for 2024: AI is top of mind

In another open-ended question, we asked B2B marketers, “What content marketing trends do you predict for 2024?” You probably guessed the most popular trend: AI.

Here are some of the marketers’ comments about how AI will affect content marketing next year:

  • “We’ll see generative AI everywhere, all the time.”
  • “There will be struggles to determine the best use of generative AI in content marketing.”
  • “AI will likely result in a flood of poor-quality, machine-written content. Winners will use AI for automating the processes that support content creation while continuing to create high-quality human-generated content.”
  • “AI has made creating content so easy that there are and will be too many long articles on similar subjects; most will never be read or viewed. A sea of too many words. I predict short-form content will have to be the driver for eyeballs.”

Other trends include:

  • Greater demand for high-quality content as consumers grow weary of AI-generated content
  • Importance of video content
  • Increasing use of short video and audio content
  • Impact of AI on SEO

Among the related comments:

  • “Event marketing (webinars and video thought leadership) will become more necessary as teams rely on AI-generated written content.”
  • “AI will be an industry sea change and strongly impact the meaning of SEO. Marketers need to be ready to ride the wave or get left behind.”
  • “Excitement around AI-generated content will rise before flattening out when people realize it’s hard to differentiate, validate, verify, attribute, and authenticate. New tools, processes, and roles will emerge to tackle this challenge.”
  • “Long-form reports could start to see a decline. If that is the case, we will need a replacement. Logically, that could be a webinar or video series that digs deeper into the takeaways.”

What does this year’s research suggest B2B content marketers do to move forward?

I asked CMI’s Robert Rose for some insights. He says the steps are clear: Develop standards, guidelines, and playbooks for how to operate – just like every other function in business does.

“Imagine if everyone in your organization had a different idea of how to define ‘revenue’ or ‘profit margin,’” Robert says. “Imagine if each salesperson had their own version of your company’s customer agreements and tried to figure out how to write them for every new deal. The legal team would be apoplectic. You’d start to hear from sales how they were frustrated that they couldn’t figure out how to make the ‘right agreement,’ or how to create agreements ‘consistently,’ or that there was a complete ‘lack of resources’ for creating agreements.”

Just remember: Standards can change along with your team, audiences, and business priorities. “Setting standards doesn’t mean casting policies and templates in stone,” Robert says. “Standards only exist so that we can always question the standard and make sure that there’s improvement available to use in setting new standards.”

He offers these five steps to take to solidify your content marketing strategy and execution:

  • Direct. Create an initiative that will define the scope of the most important standards for your content marketing. Prioritize the areas that hurt the most. Work with leadership to decide where to start. Maybe it’s persona development. Maybe you need a new standardized content process. Maybe you need a solid taxonomy. Build the list and make it a real initiative.
  • Define. Create a common understanding of all the things associated with the standards. Don’t assume that everybody knows. They don’t. What is a white paper? What is an e-book? What is a campaign vs. an initiative? What is a blog post vs. an article? Getting to a common language is one of the most powerful things you can do to coordinate better.
  • Develop. You need both policies and playbooks. Policies are the formal documentation of your definitions and standards. Playbooks are how you communicate combinations of policies so that different people can not just understand them but are ready, willing, and able to follow them.
  • Distribute. If no one follows the standards, they’re not standards. So, you need to develop a plan for how your new playbooks fit into the larger, cross-functional approach to the content strategy. You need to deepen the integration into each department – even if that is just four other people in your company.
  • Distill. Evolve your standards. Make them living documents. Deploy technology to enforce and scale the standards. Test. If a standard isn’t working, change it. Sometimes, more organic processes are OK. Sometimes, it’s OK to acknowledge two definitions for something. The key is acknowledging a change to an existing standard so you know whether it improves things.
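Robert’s “Define” step can even be made machine-readable: a shared glossary of content types that workflow tooling validates drafts against. A hypothetical sketch (the content types, fields, and limits are invented examples, not anything CMI prescribes):

```python
# Hypothetical machine-readable glossary for the "Define" step: a
# shared vocabulary that workflow tooling can validate drafts
# against. Types, fields, and limits are invented examples only.

GLOSSARY = {
    "blog post": {"max_words": 1200, "gated": False},
    "white paper": {"max_words": 5000, "gated": True},
    "e-book": {"max_words": 8000, "gated": True},
}

def validate(asset_type, word_count):
    """Check a draft against the shared standard for its type."""
    spec = GLOSSARY.get(asset_type)
    if spec is None:
        return f"'{asset_type}' is not a defined content type"
    if word_count > spec["max_words"]:
        return f"over the {spec['max_words']}-word limit for a {asset_type}"
    return "ok"
```

Changing a standard then becomes an explicit, reviewable edit to the glossary, which is exactly the “living documents” idea in the Distill step.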

For their 14th annual content marketing survey, CMI and MarketingProfs surveyed 1,080 recipients around the globe, representing a range of industries, functional areas, and company sizes, in July 2023. The online survey was emailed to a sample of marketers using lists from CMI and MarketingProfs.

This article presents the findings from the 894 respondents, mostly from North America, who indicated their organization is primarily B2B and that they are either content marketers or work in marketing, communications, or other roles involving content.

Content Marketing Trends for 2024: B2B industry classification, and size of B2B company by employees.

Thanks to the survey participants, who made this research possible, and to everyone who helps disseminate these findings throughout the content marketing industry.

Cover image by Joseph Kalinowski/Content Marketing Institute

About Content Marketing Institute


Content Marketing Institute (CMI) exists to do one thing: advance the practice of content marketing through online education and in-person and digital events. We create and curate content experiences that teach marketers and creators from enterprise brands, small businesses, and agencies how to attract and retain customers through compelling, multichannel storytelling. Global brands turn to CMI for strategic consultation, training, and research. Organizations from around the world send teams to Content Marketing World, the largest content marketing-focused event, the Marketing Analytics & Data Science (MADS) conference, and CMI virtual events, including ContentTECH Summit. Our community of 215,000+ content marketers shares camaraderie and conversation. CMI is organized by Informa Connect. To learn more, visit www.contentmarketinginstitute.com .

About MarketingProfs

MarketingProfs is your quickest path to B2B marketing mastery.


More than 600,000 marketing professionals worldwide rely on MarketingProfs for B2B Marketing training and education backed by data science, psychology, and real-world experience. Access free B2B marketing publications, virtual conferences, podcasts, daily newsletters (and more), and check out the MarketingProfs B2B Forum–the flagship in-person event for B2B Marketing training and education at MarketingProfs.com.

About Brightspot

Brightspot, the content management system to boost your business.


Why Brightspot? Align your technology approach and content strategy with Brightspot, the leading Content Management System for delivering exceptional digital experiences. Brightspot helps global organizations meet the business needs of today and scale to capitalize on the opportunities of tomorrow. Our Enterprise CMS and world-class team solves your unique business challenges at scale. Fast, flexible, and fully customizable, Brightspot perfectly harmonizes your technology approach with your content strategy and grows with you as your business evolves. Our customer-obsessed teams walk with you every step of the way with an unwavering commitment to your long-term success. To learn more, visit www.brightspot.com .

Stephanie Stahl



Student engagement research trends of past 10 years: A machine learning-based analysis of 42,000 research articles

  • Published: 25 April 2023
  • Volume 28, pages 15067–15091 (2023)


  • Fatih Gurcan (ORCID: orcid.org/0000-0001-9915-6686),
  • Fatih Erdogdu,
  • Nergiz Ercil Cagiltay &
  • Kursat Cagiltay


Student engagement is critical for both academic achievement and learner satisfaction because it promotes successful learning outcomes. Despite its importance in various learning environments, research into the trends and themes of student engagement is scarce. In this regard, topic modeling, a machine learning technique, allows for the analysis of large amounts of content in any field. Thus, topic modeling provides a systematic methodology for identifying research themes, trends, and application areas in a comprehensive framework. In the literature, there is a lack of topic modeling-based studies that analyze the holistic landscape of student engagement research. Such research is important for identifying wide-ranging topics and trends in the field and guiding researchers and educators. Therefore, this study aimed to analyze student engagement research using a topic modeling approach and to reveal research interests and trends with their temporal development, thereby addressing a lack of research in this area. To this end, this study analyzed 42,517 peer-reviewed journal articles published from 2010 to 2019 using machine learning techniques. According to our findings, two new dimensions, “Community Engagement” and “School Engagement”, were identified in addition to the existing ones. It is also envisaged that the next period of research and applications in student engagement will focus on motivation-oriented tools and methods; dimensions of student engagement, such as social and behavioral engagement; and specific learning contexts such as English as a Foreign Language (EFL) and Science, Technology, Engineering, and Math (STEM).
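The corpus-in, trends-out pipeline the abstract describes can be illustrated in miniature. The sketch below is deliberately not LDA: it only counts frequent terms per year as a crude proxy for topic trends, and the four tiny “abstracts” are invented:

```python
# Not LDA: a drastically simplified stand-in that counts frequent
# terms per year as a crude proxy for topic trends, just to show
# the corpus-in, trends-out shape of this kind of study. The
# four "abstracts" below are invented.

from collections import Counter

corpus = {
    2010: ["student engagement in online learning", "engagement and motivation"],
    2019: ["flipped classroom engagement", "stem engagement and motivation"],
}

STOPWORDS = {"in", "and", "the", "a", "of"}

def top_terms(texts, k=3):
    """Most frequent non-stopword terms across a year's texts."""
    counts = Counter()
    for text in texts:
        counts.update(w for w in text.split() if w not in STOPWORDS)
    return [term for term, _ in counts.most_common(k)]

trends = {year: top_terms(texts) for year, texts in corpus.items()}
```

Real topic modeling such as Latent Dirichlet Allocation infers latent topics probabilistically rather than counting surface terms, but the contract is the same: documents in, ranked themes per period out.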



Data availability.

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.


Karl, A., Wisnowski, J., & Rushing, W. H. (2015). A practical guide to text mining with topic extraction. Wiley Interdisciplinary Reviews: Computational Statistics, 7 (5), 326–340.

Article   MathSciNet   Google Scholar  

Kim, Y., Glassman, M., & Williams, M. S. (2015). Connecting agents: Engagement and motivation in online collaboration. Computers in Human Behavior . https://doi.org/10.1016/j.chb.2015.03.015

Konrad, M. (2017). Text mining and topic modeling toolkit . https://pypi.org/project/tmtoolkit/ . Accessed 18 Feb 2023.

Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. Journal of Higher Education . https://doi.org/10.1353/jhe.0.0019

Lawson, M. A., & Lawson, H. A. (2013). New conceptual frameworks for student engagement research, policy, and practice. Review of Educational Research . https://doi.org/10.3102/0034654313480891

Li, C., Wang, H., Zhang, Z., Sun, A., & Ma, Z. (2016). Topic modeling for short texts with auxiliary word embeddings. SIGIR 2016 - Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval , 165–174. https://doi.org/10.1145/2911451.2911499

Mimno, D., Wallach, H. M., Talley, E., Leenders, M., & McCallum, A. (2011). Optimizing semantic coherence in topic models. EMNLP 2011 - Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference .

Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of web of science and scopus: A comparative analysis. Scientometrics, 106 (1), 213–228. https://doi.org/10.1007/s11192-015-1765-5

Naing, C., Wai, V. N., Durham, J., Whittaker, M. A., Win, N. N., Aung, K., & Mak, J. W. (2015). A systematic review and meta-analysis of medical students’ perspectives on the engagement in research. Medicine (United States) . https://doi.org/10.1097/MD.0000000000001089

Owen, K. B., Parker, P. D., Van Zanden, B., MacMillan, F., Astell-Burt, T., & Lonsdale, C. (2016). Physical activity and school engagement in youth: A systematic review and meta-analysis. In Educational Psychologist . https://doi.org/10.1080/00461520.2016.1151793

Pietarinen, J., Soini, T., & Pyhältö, K. (2014). Students’ emotional and cognitive engagement as the determinants of well-being and achievement in school. International Journal of Educational Research . https://doi.org/10.1016/j.ijer.2014.05.001

Porter, M. F. (2001). Snowball: A language for stemming algorithms. In Snowball . http://snowball.tartarus.org/texts/introduction.html

Quin, D. (2017). Longitudinal and contextual associations between teacher–student relationships and student engagement: A systematic review. Review of Educational Research . https://doi.org/10.3102/0034654316669434

Reschly, A. L., & Christenson, S. L. (2012). Jingle, jangle, and conceptual haziness: Evolution and future directions of the engagement construct. In Handbook of Research on Student Engagement . https://doi.org/10.1007/978-1-4614-2018-7_1

Saeed, S., & Zyngier, D. (2012). How motivation influences student engagement: A qualitative case study. Journal of Education and Learning . https://doi.org/10.5539/jel.v1n2p252

Schwartz, S., & Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Academe . https://doi.org/10.2307/40250027

Silva, C. C., Galster, M., & Gilson, F. (2021). Topic modeling in software engineering research. Empirical Software Engineering , 26 (6). https://doi.org/10.1007/s10664-021-10026-0

Skinner, E. A., & Pitzer, J. R. (2012). Developmental dynamics of student engagement, coping, and everyday resilience. In Handbook of Research on Student Engagement . https://doi.org/10.1007/978-1-4614-2018-7_2

Smit, K., de Brabander, C. J., Boekaerts, M., & Martens, R. L. (2017). The self-regulation of motivation: Motivational strategies as mediator between motivational beliefs and engagement for learning. International Journal of Educational Research . https://doi.org/10.1016/j.ijer.2017.01.006

Subramainan, L., & Mahmoud, M. A. (2020). A systematic review on students’ engagement in classroom: Indicators, challenges and computational techniques. International Journal of Advanced Computer Science and Applications . https://doi.org/10.14569/ijacsa.2020.0110113

Topalli, D., & Cagiltay, N. E. (2018). Improving programming skills in engineering education through problem-based game projects with scratch. Computers and Education . https://doi.org/10.1016/j.compedu.2018.01.011

Uysal, A. K., & Gunal, S. (2014). The impact of preprocessing on text classification. Information Processing and Management . https://doi.org/10.1016/j.ipm.2013.08.006

Vayansky, I., & Kumar, S. A. P. (2020). A review of topic modeling methods. Information Systems , 94 . https://doi.org/10.1016/j.is.2020.101582

Wang, Y., Bowers, A. J., & Fikis, D. J. (2017). Automated text data mining analysis of five decades of educational leadership research literature: probabilistic topic modeling of EAQ articles from 1965 to 2014. Educational Administration Quarterly . https://doi.org/10.1177/0013161X16660585

Williams, K. M., Stafford, R. E., Corliss, S. B., & Reilly, E. D. (2018). Examining student characteristics, goals, and engagement in massive Open Online Courses. Computers and Education . https://doi.org/10.1016/j.compedu.2018.08.014

Zhang, S., & Liu, Q. (2019). Investigating the relationships among teachers’ motivational beliefs, motivational regulation, and their learning engagement in online professional learning communities. Computers and Education . https://doi.org/10.1016/j.compedu.2019.02.013

Download references

Author information

Authors and Affiliations

Department of Management Information Systems, Faculty of Economics and Administrative Sciences, Karadeniz Technical University, Trabzon, Turkey

Fatih Gurcan

Department of Computer Technology, Zonguldak Bülent Ecevit University, Zonguldak, Turkey

Fatih Erdogdu

Software Engineering Department, Atilim University, Ankara, Turkey

Nergiz Ercil Cagiltay

Faculty of Engineering and Natural Sciences, Sabanci University, Istanbul, Turkey

Kursat Cagiltay


Corresponding author

Correspondence to Fatih Gurcan .

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Gurcan, F., Erdogdu, F., Cagiltay, N.E. et al. Student engagement research trends of past 10 years: A machine learning-based analysis of 42,000 research articles. Educ Inf Technol, 28, 15067–15091 (2023). https://doi.org/10.1007/s10639-023-11803-8


Received: 16 November 2022

Accepted: 11 April 2023

Published: 25 April 2023

Issue Date: November 2023


Keywords

  • Student engagement
  • Topic modeling
  • Text mining
  • Trend analysis
  • Machine learning


Summer of Research Discovery at Tel Aviv University

International students engage in diverse projects through TAU's Summer Research Program.


How did the Israeli-Palestinian conflict turn existential? What diseases did people suffer from 9,000 years ago? How can locusts help detect odors? Why do insurance companies refuse to cover double mastectomies for women who decide to go flat? These are just some of the questions that international students will be exploring during their summer research internships at Tel Aviv University (TAU).

Over the course of two months, students from all around the world will be working under the supervision of TAU faculty members in university labs and research teams, enjoying full access to TAU campus services, extensive libraries, and online resources. They will also be taking advantage of everything that Tel Aviv has to offer in the summer!

The summer research program at TAU is divided into two distinct tracks: one in social sciences and humanities, and another in sciences, which includes everything from psychology to engineering, life sciences, and exact sciences. When applying, students choose the project that best aligns with their academic interests and are accepted into the program upon approval from the lead researcher.

This year, over 60 students have already begun their work, aiming to present their results at the end of July. Initially, students sit down with the lead researcher in their chosen lab to define the project scope and goals. The work typically involves not only literature review and analysis, but also conducting experiments to test their hypotheses, and validating results. This approach ensures participants gain hands-on practical experience.

From Insurance Policies to the Gig Economy, and Fiction Books

Speaking to participants during the kick-off meeting for the Summer Research Program in Humanities and Social Sciences, Prof. Milette Shamir, the head of the program and the VP for International at TAU, emphasized that the university values the opportunity to host international students on campus.

“In a crisis, turmoil, and change, the value of social sciences research begins to surface even more than usual as access to credible information becomes exceptionally valuable, along with critical reading and thinking.” — Prof. Milette Shamir

Projects in this track are extremely diverse: one examines the rise of populist radical right parties and the news slant across different outlets, while another assesses the effects of interventions by the International Monetary Fund. There are also projects investigating legal obstacles to breast cancer treatment, the nature of the Israeli-Palestinian conflict, and the emergence of leadership structures among ride-hailing drivers in India.


Introductory Session for Summer Research Program in Humanities and Social Sciences

For Wu Sijia from the Joint Institute for Jewish Studies at Shandong University, this program is an opportunity to take her PhD studies to the next level. She focuses on religious studies and is particularly interested in the interrelationship between the secular and religious worlds in Israel. 

“Coming to Israel is important for my research.”

During her research internship, Sijia will be reading twelve fiction books and memoirs for her project on the images of Jews and their relation with non-Jews in Eastern Europe.

Diving into Lab Work

Prof. Ben Maoz, who is heading the Summer Research Program in Sciences, reassured the students that at TAU, they will receive all the support necessary to experience research (and Tel Aviv) in the best way possible. 

“It will give you a really good sense of research because you will be in the labs.” — Prof. Ben Maoz

In the sciences program, the projects span genetic engineering, astrophysics, neuroscience, physics, biomedicine, and other fields.


Introductory Session for Summer Research Program in Sciences

Have you ever wondered how toothbrushes factor into anthropological research? Students interning at the Biohistory and Evolutionary Medicine Laboratory will be using them to clean bones from skeletal remains excavated at two sites in Israel. Their project involves preserving and reconstructing the skeletons and retrieving biological data about the general health and demographic profile of the individuals.

Another student's research will involve manipulating the electric signals to which locust antennae respond, aiming to establish how different odors can be differentiated. 

Projects also include growing an adult organoid heart system, researching how a person’s gaze conveys attention, simulating asymmetry in early-universe cosmological sources, improving AI defense against attacks, investigating various types of tumors, studying gene mutations, and much more.

We wish all the students success in their research and an enjoyable summer in Tel Aviv!


