
The effects of online education on academic success: A meta-analysis study

  • Published: 06 September 2021
  • Volume 27, pages 429–450 (2022)
  • Hakan Ulum, ORCID: orcid.org/0000-0002-1398-6935

The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. In line with this purpose, a meta-analysis was carried out of the related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021. Furthermore, this study will provide a source to assist future studies in comparing the effect of online education on academic achievement before and after the pandemic. This meta-analysis consists of 27 studies in total, conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia. The studies included in the meta-analysis are experimental studies, and the total sample size is 1772. The funnel plot, Duval and Tweedie's Trim and Fill analysis, Orwin's Fail-Safe N analysis, and Egger's regression test were utilized to assess publication bias, which was found to be quite low. Besides, the Hedges' g statistic was employed to measure the effect size for the difference between means, calculated in accordance with the random effects model. The results of the study show that the effect size of online education on academic achievement is at a medium level. The heterogeneity test results display that the effect size does not differ in terms of the class level, country, online education approach, and lecture moderators.


1 Introduction

Information and communication technologies have become a powerful force in transforming educational settings around the world. The pandemic has been an important factor in moving traditional physical classroom settings toward information and communication technologies and has accelerated this transformation. The literature supports that learning environments built on information and communication technologies highly satisfy students; therefore, interest in technology-based learning environments needs to be sustained. Clearly, technology has had a huge impact on young people's online lives, and this digital revolution can synergize the educational ambitions and interests of digitally addicted students. In essence, COVID-19 has provided us with an opportunity to embrace online learning, as education systems have to keep up with the rapid emergence of new technologies.

Information and communication technologies, which affect all spheres of life, are also actively used in the field of education. With recent developments, using technology in education has become inevitable for personal and social reasons (Usta, 2011a). Online education may be given as an example of using information and communication technologies as a consequence of these technological developments. It is also clear that online learning is a popular way of obtaining instruction (Demiralay et al., 2016; Pillay et al., 2007), which is defined by Horton (2000) as a form of education delivered through a web browser or an online application without requiring extra software or additional learning resources. Furthermore, online learning is described as a way of utilizing the internet to obtain the related learning resources during the learning process, to interact with the content, the teacher, and other learners, and to get support throughout the learning process (Ally, 2004). Online learning has such benefits as learning independently at any time and place (Vrasidas & McIsaac, 2000), convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), self-regulation skills (Usta, 2011b), collaborative learning, and the opportunity to plan one's own learning process.

Even though online education practices have not always been as comprehensive as they are now, the internet and computers have long been used in education as alternative learning tools in parallel with advances in technology. The first distance education attempt in the world was initiated by the 'Steno Courses' announcement published in a Boston newspaper in 1728. Furthermore, in the nineteenth century, Sweden University started 'Correspondence Composition Courses' for women, and University Correspondence College was afterwards founded for correspondence courses in 1843 (Arat & Bakan, 2011). More recently, distance education has been carried out through computers, assisted by internet technologies, and it has since evolved into mobile education practice driven by progress in internet connection speeds and the development of mobile devices.

With the emergence of the pandemic (Covid-19), face-to-face education almost came to a halt, and online education gained significant importance. The Microsoft management team declared that 750 users were involved in its online education activities on March 10, just before the pandemic; however, on March 24, they reported that the number had increased dramatically, reaching 138,698 users (OECD, 2020). This supports the view that online education should be used widely, rather than merely as an alternative to traditional education when students do not have the opportunity to receive face-to-face instruction (Geostat, 2019). The Covid-19 pandemic emerged suddenly and sharply limited opportunities, and face-to-face education stopped for a long time during this period. The global spread of Covid-19 affected more than 850 million students all around the world and caused the suspension of face-to-face education. Different countries proposed several solutions in order to maintain the education process during the pandemic. Schools had to change their curricula, and many countries supported online education practices soon after the pandemic began; in other words, traditional education gave way to online education practices. At least 96 countries have been motivated to provide access to online libraries, TV broadcasts, instructions, resources, video lectures, and online channels (UNESCO, 2020). In such a painful period, educational institutions moved to online education practices with the help of major companies such as Microsoft, Google, Zoom, Skype, FaceTime, and Slack. Thus, online education has been discussed on the education agenda more intensively than ever before.

Although online education approaches were not used as comprehensively as they are today, they have long been utilized as an alternative learning approach in parallel with the development of technology, the internet, and computers. Online education approaches are often employed with the aim of promoting students' academic achievement. In this regard, academics in various countries have conducted many studies on the evaluation of online education approaches and published the related results (Chiao et al., 2018). However, the accumulation of scientific data on online education approaches creates difficulties in keeping track of, organizing, and synthesizing the findings. Studies in this area are being conducted at an increasing rate, making it difficult for scientists to be aware of all the research outside their own expertise. Another problem encountered in this area is that online education studies are repetitive: studies often utilize slightly different methods, measures, and/or samples to avoid duplication, which makes it difficult to distinguish significant differences in the related results. In other words, if there are significant differences in the results of the studies, it may be difficult to state which variable explains these differences. One obvious solution to these problems is to systematically review the results of various studies and uncover the sources of variation. One method of performing such systematic syntheses is meta-analysis, a methodological and statistical approach for drawing conclusions from the literature. At this point, how effective online education applications are in increasing academic success is an important question. Has online education, which is likely to be encountered frequently in the continuing pandemic period, been successful in the last ten years? If so, how large was the impact? Did different variables affect this impact? What should we consider in future online education practices? It is quite important to evaluate the results of the studies that have been published up until now, and that will be published in the future, and these questions have all motivated us to carry out this study. We have conducted a comprehensive meta-analysis that aims to provide a discussion platform on how to develop efficient online programs for educators and policy makers by reviewing the related studies on online education, presenting the effect size, and revealing the effect of diverse variables on the overall impact.

There have been many critical discussions and comprehensive studies on the differences between online and face-to-face learning; however, the focus of this paper is different in the sense that it clarifies the magnitude of the effect of online education on the teaching and learning process and identifies which factors should be controlled to help increase the effect size. Indeed, the purpose here is to support informed decisions in the implementation of the online education process.

The general impact of online education on academic achievement is examined in this study, providing an overview of an approach that has been practiced and discussed intensively during the pandemic period. Moreover, the general impact of online education on academic achievement is analyzed with respect to different variables. In other words, the current study allows an overall evaluation of the study results from the related literature and an analysis of those results across several cultures, lectures, and class levels. Considering all these points, this study seeks to answer the following research questions:

What is the effect size of online education on academic achievement?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the country?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the class level?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the lecture?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the online education approaches?

2 Method

This study aims at determining, by means of a meta-analysis, the effect size of online education, which has been used intensively since the beginning of the pandemic, on students' academic achievement in different courses. Meta-analysis is a synthesis method that enables the results of several studies to be gathered accurately and efficiently and combined into an overall result (Tsagris & Fragkos, 2018).

2.1 Selecting and coding the data (studies)

The literature required for the meta-analysis was reviewed in July 2020, and a follow-up review was conducted in September 2020. The purpose of the follow-up review was to include studies which were published during the conduct of this study and which met the inclusion criteria; however, no additional studies meeting these criteria were found.

In order to access the studies for the meta-analysis, the Web of Science, ERIC, and SCOPUS databases were searched using the keywords 'online learning' and 'online education'. Not every database has a search engine that grants access to studies simply by entering keywords, and this obstacle was considered an important problem to overcome. Therefore, a specially designed platform was utilized by the researcher: through the open access system of the Cukurova University Library, detailed searches were carried out using EBSCO Information Services (EBSCO), which allows the whole collection of research to be searched through a single search box. Since the fundamental variables of this study are online education and online learning, the literature in the related databases (Web of Science, ERIC, and SCOPUS) was systematically reviewed using these keywords. Within this scope, 225 articles were accessed, and the studies were included in the coding key list formed by the researcher. The names of the researchers, the year, the database (Web of Science, ERIC, and SCOPUS), the sample group and size, the lectures in which academic achievement was tested, the country in which the study was conducted, and the class levels were all included in this coding key.

The following criteria were identified for including the 225 research studies coded on the theoretical basis of the meta-analysis: (1) the studies should be published in refereed journals between the years 2010 and 2021, (2) the studies should be experimental studies that try to determine the effect of online education and online learning on academic achievement, (3) the values of the stated variables, or the statistics required to calculate them, should be reported in the results of the studies, and (4) the sample group of the study should be at the primary education level. These criteria were also used as exclusion criteria, in the sense that studies that did not meet them were not included in the present study.

After the inclusion criteria were determined, a systematic review process was conducted through EBSCO, applying the year criterion of the study. Within this scope, 290,365 records analyzing the effect of online education and online learning on academic achievement were initially retrieved. The databases (Web of Science, ERIC, and SCOPUS) were then used as a filter in line with the inclusion criteria, reducing the number of records to 58,616. Afterwards, the keyword 'primary education' was used as a filter, and the number of records decreased to 3152. Lastly, the literature was searched using the keyword 'academic achievement', and 225 studies were accessed. All the information from these 225 articles was included in the coding key.

It is necessary for coders to review the related studies accurately and to check the validity, reliability, and accuracy of the studies (Stewart & Kamins, 2001). Within this scope, the studies identified on the basis of the variables used in this study were first reviewed by three researchers from the field of primary education; then the accessed studies were combined and processed in the coding key by the researcher. All the studies processed in the coding key were analyzed against the inclusion criteria by all the researchers in joint meetings, and it was decided that 27 studies met the inclusion criteria (Atici & Polat, 2010; Carreon, 2018; Ceylan & Elitok Kesici, 2017; Chae & Shin, 2016; Chiang et al., 2014; Ercan, 2014; Ercan et al., 2016; Gwo-Jen et al., 2018; Hayes & Stewart, 2016; Hwang et al., 2012; Kert et al., 2017; Lai & Chen, 2010; Lai et al., 2015; Meyers et al., 2015; Ravenel et al., 2014; Sung et al., 2016; Wang & Chen, 2013; Yu, 2019; Yu & Chen, 2014; Yu & Pan, 2014; Yu et al., 2010; Zhong et al., 2017). The data from the studies meeting the inclusion criteria were independently processed in a second coding key by three researchers, and consensus meetings were arranged for further discussion. After these meetings, the researchers agreed that the data were coded accurately and precisely. Having identified the effect sizes and the heterogeneity of the study, moderator variables that might explain differences between the effect sizes were determined. The data related to the determined moderator variables were added to the coding key by the three researchers, and a new consensus meeting was arranged, after which the researchers agreed that the moderator variables were coded accurately and precisely.

2.2 Study group

Twenty-seven studies are included in the meta-analysis. The total sample size of these studies is 1772. The characteristics of the included studies are given in Table 1.

2.3 Publication bias

Publication bias is the limited capability of published studies on a research subject to represent all completed studies on the same subject (Card, 2011; Littell et al., 2008). Similarly, publication bias is the existence of a relationship between the probability that a study on a subject is published and the effect size and significance it produces. Within this scope, publication bias may occur when researchers do not publish a study because the expected results were not obtained, or because it was not accepted by scientific journals, and it is consequently not included in the study synthesis (Makowski et al., 2019). A high possibility of publication bias in a meta-analysis negatively affects the accuracy of the combined effect size (Pecoraro, 2018), causing the average effect size to be reported differently than it should be (Borenstein et al., 2009). For this reason, the possibility of publication bias in the included studies was tested before determining the effect sizes of the relationships between the stated variables. The possibility of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test.
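As an illustration of how one of these checks works, the following minimal sketch computes Egger's regression test: the standardized effect (g divided by its standard error) is regressed on precision (1 over the standard error), and a significant intercept suggests funnel-plot asymmetry. The effect sizes and standard errors below are hypothetical placeholders, not the data of the 27 studies analyzed here.

```python
# Minimal sketch of Egger's regression test for funnel-plot asymmetry.
# The inputs are hypothetical, not the study's actual data.
import numpy as np
from scipy import stats

g = np.array([0.12, 0.35, 0.48, 0.27, 0.60, 0.41, 0.22, 0.55])    # per-study Hedges' g (hypothetical)
se = np.array([0.20, 0.15, 0.25, 0.18, 0.30, 0.22, 0.17, 0.28])   # per-study standard errors (hypothetical)

# Regress the standardized effect on precision; the intercept captures asymmetry.
res = stats.linregress(1.0 / se, g / se)
t_stat = res.intercept / res.intercept_stderr
p_value = 2 * stats.t.sf(abs(t_stat), df=len(g) - 2)
print(f"Egger intercept = {res.intercept:.3f}, p = {p_value:.3f}")  # non-significant p: no evidence of bias
```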

2.4 Selecting the model

After determining the probability of publication bias, the statistical model used to calculate the effect sizes was selected. The main approaches used in effect size calculations, depending on the level of between-study variance, are the fixed effects and random effects models (Pigott, 2012). The fixed effects model assumes that the combined studies are homogeneous in their characteristics apart from sample size, while the random effects model allows for parameter diversity between the studies (Cumming, 2012). When calculating the average effect size under the random effects model (Deeks et al., 2008), which assumes that the effect estimates of the different studies are drawn from a common distribution of true effects, it is necessary to consider, beyond the sampling error of the combined studies, several aspects such as the characteristics of the participants and the duration, scope, and design of the studies (Littell et al., 2008). When choosing the model for a meta-analysis, the assumptions about the sample characteristics of the included studies and the inferences that the researcher aims to make should be taken into consideration. The fact that the sample characteristics of studies conducted in the social sciences are affected by various parameters indicates that the random effects model is more appropriate here. Besides, it has been stated that inferences made with the random effects model generalize beyond the studies included in the meta-analysis (Field, 2003; Field & Gillett, 2010); therefore, using the random effects model also contributes to the generalizability of the research data. The specified criteria for statistical model selection show that, according to the nature of the meta-analysis, the model should be selected before the analysis (Borenstein et al., 2007; Littell et al., 2008). Within this framework, it was decided to use the random effects model, considering that the students sampled in the included studies are from different countries and cultures, and that the sample characteristics, designs, and scopes of the studies vary.
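For concreteness, the following sketch shows the standard DerSimonian-Laird random-effects pooling that such a choice typically implies: study effects are weighted by the inverse of their within-study variance plus an estimated between-study variance. The numbers are hypothetical, and the estimator is the conventional one, not necessarily the exact routine of the software used in this study.

```python
# Minimal sketch of DerSimonian-Laird random-effects pooling (hypothetical data).
import numpy as np

g = np.array([0.12, 0.35, 0.48, 0.27, 0.60, 0.41, 0.22, 0.55])     # per-study Hedges' g (hypothetical)
var = np.array([0.04, 0.02, 0.06, 0.03, 0.09, 0.05, 0.03, 0.08])   # per-study variances (hypothetical)

w = 1.0 / var                                        # fixed-effect (inverse-variance) weights
g_fixed = np.sum(w * g) / np.sum(w)                  # fixed-effect pooled estimate
Q = np.sum(w * (g - g_fixed) ** 2)                   # Cochran's Q statistic
df = len(g) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                        # between-study variance (DerSimonian-Laird)

w_star = 1.0 / (var + tau2)                          # random-effects weights
g_random = np.sum(w_star * g) / np.sum(w_star)       # random-effects pooled Hedges' g
se_random = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled g = {g_random:.3f}, 95% CI [{g_random - 1.96 * se_random:.3f}, {g_random + 1.96 * se_random:.3f}]")
```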

2.5 Heterogeneity

Meta-analysis facilitates analyzing a research subject with different parameters by showing the level of diversity between the included studies. Within this frame, whether there is a heterogeneous distribution among the studies included in the present analysis was evaluated. The heterogeneity of the combined studies was determined through the Q and I² tests. The Q test evaluates the probability that the differences between the observed results arise from random variation alone (Deeks et al., 2008). A Q value exceeding the critical χ² value for the corresponding degrees of freedom and significance level indicates heterogeneity of the combined effect sizes (Card, 2011). The I² test, which complements the Q test, quantifies the amount of heterogeneity in the effect sizes (Cleophas & Zwinderman, 2017); an I² value higher than 75% is interpreted as a high level of heterogeneity.
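In standard notation, with k studies, effect sizes g_i, within-study variances v_i, and inverse-variance weights w_i = 1/v_i, these two statistics are conventionally defined as follows (the usual Cochran Q and Higgins I² formulas, assumed here to match those applied in the study):

$$
Q = \sum_{i=1}^{k} w_i \left(g_i - \bar{g}\right)^2, \qquad
\bar{g} = \frac{\sum_{i=1}^{k} w_i g_i}{\sum_{i=1}^{k} w_i}, \qquad
I^2 = \max\!\left(0,\ \frac{Q - (k-1)}{Q}\right) \times 100\%
$$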

If heterogeneity is encountered among the studies included in the meta-analysis, the reasons for it can be analyzed by referring to the study characteristics. Study characteristics that may be related to the heterogeneity between the included studies can be examined through subgroup analysis or meta-regression analysis (Deeks et al., 2008). When determining the moderator variables, the sufficiency of the number of variables, the relationships between the moderators, and their capacity to explain differences between the study results were all considered in the present study. Within this scope, it was predicted that heterogeneity in the effect of online education, which has been used intensively since the beginning of the pandemic, on students' academic achievement in different lectures could be explained by the country, class level, and lecture moderator variables. Some subgroups were evaluated and categorized together, considering that the number of effect sizes in the sub-dimensions of the specified variables (e.g. the countries where the studies were conducted) was not sufficient to perform moderator analysis.
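A subgroup (moderator) analysis of this kind can be sketched as follows: pool the effect within each level of the moderator, then test whether the subgroup means differ using a between-group Q statistic compared against a χ² distribution. The subgroup labels and numbers below are hypothetical illustrations, not the moderator data of this study.

```python
# Minimal sketch of a subgroup (moderator) analysis with a between-group Q test.
# Subgroups and values are hypothetical placeholders.
import numpy as np
from scipy import stats

def pool(g, var):
    """Inverse-variance pooled effect and its variance within one subgroup."""
    w = 1.0 / np.asarray(var)
    return np.sum(w * np.asarray(g)) / np.sum(w), 1.0 / np.sum(w)

subgroups = {                                  # moderator level -> (effect sizes, variances)
    "Turkey": ([0.35, 0.48, 0.27], [0.02, 0.06, 0.03]),
    "Taiwan": ([0.55, 0.41, 0.60], [0.08, 0.05, 0.09]),
    "Other":  ([0.22, 0.30],       [0.03, 0.04]),
}

pooled = [pool(g, v) for g, v in subgroups.values()]
g_j = np.array([m for m, _ in pooled])         # subgroup pooled effects
w_j = 1.0 / np.array([v for _, v in pooled])   # subgroup weights
g_all = np.sum(w_j * g_j) / np.sum(w_j)

Q_between = np.sum(w_j * (g_j - g_all) ** 2)   # between-group heterogeneity
p = stats.chi2.sf(Q_between, df=len(subgroups) - 1)
print(f"Q_between = {Q_between:.3f}, p = {p:.3f}")  # non-significant p: moderator does not explain differences
```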

2.6 Interpreting the effect sizes

Effect size is a value that shows how much, and in what direction, the independent variable affects the dependent variable in each study included in the meta-analysis (Dinçer, 2014). While interpreting the effect sizes obtained from the meta-analysis, the classifications of Cohen et al. (2007) were utilized. Whether the specified relationships differ according to the country, class level, and school subject variables was examined through the Q test, the degrees of freedom, and the p significance value (Figs. 1 and 2).
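The Hedges' g statistic reported throughout the study is, in its standard form, the standardized mean difference corrected for small-sample bias; assuming the conventional definition, it is computed as

$$
g = J \cdot \frac{\bar{X}_1 - \bar{X}_2}{S_{pooled}}, \qquad
S_{pooled} = \sqrt{\frac{(n_1 - 1)S_1^2 + (n_2 - 1)S_2^2}{n_1 + n_2 - 2}}, \qquad
J = 1 - \frac{3}{4(n_1 + n_2 - 2) - 1}
$$

where X̄₁, S₁, n₁ and X̄₂, S₂, n₂ are the mean, standard deviation, and sample size of the experimental and control groups, respectively.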

3 Findings and results

The purpose of this study is to determine the effect size of online education on academic achievement. Before determining the effect sizes, the probability of publication bias was analyzed using the funnel plot, Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test.

When the funnel plots are examined, it is seen that the studies included in the analysis are distributed symmetrically on both sides of the combined effect size axis and are generally gathered in the middle and lower sections. According to the plots, the probability of publication bias is low. However, since funnel plots are open to subjective interpretation, they have been supported by additional analyses (Littell et al., 2008). Therefore, in order to provide further evidence regarding the probability of publication bias, it was analyzed through Orwin's Fail-Safe N analysis, Duval and Tweedie's Trim and Fill analysis, and Egger's regression test (Table 2).

Table 2 presents the publication bias statistics computed before calculating the effect size of online education on academic achievement. According to the table, Orwin's Fail-Safe N analysis shows that it is not necessary to add new studies to the meta-analysis in order for Hedges' g to reach a value outside the range of ±0.01. The Duval and Tweedie test shows that excluding the studies that negatively affect the symmetry of the funnel plot, or adding their exact symmetrical equivalents, does not significantly change the calculated effect size. The insignificance of Egger's test results indicates that there is no publication bias in the meta-analysis. These results point to the high internal validity of the effect sizes and the adequacy of the included studies in representing research on the relevant subject.

In this study, it was aimed to determine the effect size of online education on academic achievement after testing the publication bias. In line with the first purpose of the study, the forest graph regarding the effect size of online education on academic achievement is shown in Fig.  3 , and the statistics regarding the effect size are given in Table 3 .

Fig. 1 The flow chart of the scanning and selection process of the studies

Fig. 2 Funnel plot representing the effect size of online education on academic success

Fig. 3 Forest graph related to the effect size of online education on academic success

The square symbols in the forest graph in Fig. 3 represent the effect sizes, the horizontal lines show the 95% confidence intervals of the effect sizes, and the diamond symbol shows the overall effect size. When the forest graph is analyzed, it is seen that the lower and upper limits of the combined effect sizes are generally close to each other and that the study weights are similar. This similarity in study weights indicates that the combined studies contribute similarly to the overall effect size.

Figure 3 clearly shows that the study of Liu et al. (2018) has the lowest effect size and the study of Ercan and Bilen (2014) the highest. The forest graph shows that all the combined studies and the overall effect are positive. Furthermore, it can be seen from the forest graph in Fig. 3 and the effect size statistics in Table 3 that the results of this meta-analysis of 27 studies analyzing the effect of online education on academic achievement indicate an effect of medium level (g = 0.409).

After the analysis of the effect size, whether the studies included in the analysis are distributed heterogeneously was also examined. The heterogeneity of the combined studies was determined through the Q and I² tests. As a result of the heterogeneity test, the Q statistic was calculated as 29.576. With 26 degrees of freedom at the 95% confidence level, the critical value in the chi-square table is 38.885; the calculated Q statistic (29.576) is lower than this critical value. The I² value, which complements the Q statistic, is 12.100%, indicating that the true heterogeneity, i.e. the proportion of total variability attributable to variability between the studies, is about 12%. Besides, the p value (0.285) is higher than 0.05. All these values [Q(26) = 29.576, p = 0.285; I² = 12.100%] indicate that there is a homogeneous distribution between the effect sizes and that the fixed effects model could be used to interpret them. However, some researchers argue that even if heterogeneity is low, results should still be evaluated under the random effects model (Borenstein et al., 2007); therefore, this study reports information for both models. An attempt was made to explain the heterogeneity of the combined studies through the characteristics of the included studies. In this context, the final purpose of the study is to determine the effect of the country, class level, and lecture variables on the findings. Accordingly, the statistics comparing the stated relations according to the countries where the studies were conducted are given in Table 4.
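These reported figures are mutually consistent under the usual definition of I²:

$$
I^2 = \frac{Q - df}{Q} \times 100\% = \frac{29.576 - 26}{29.576} \times 100\% \approx 12.1\%
$$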

As seen in Table 4, the effect of online education on academic achievement does not differ significantly according to the countries where the studies were conducted. The Q test results were used to examine whether the relationships between the variables differ across these countries. According to the table, the effect of online education on academic achievement was highest in the 'other countries' group and lowest in the US. The statistics comparing the stated relations according to class level are given in Table 5.

As seen in Table 5, the effect of online education on academic achievement does not differ according to class level; however, the effect is highest in the 4th class. The statistics comparing the stated relations according to school subject are given in Table 6.

As seen in Table 6, the effect of online education on academic achievement does not differ according to the school subjects covered in the studies; however, the effect is highest in the ICT subject.

The effect size obtained in the study was formed from the findings of primary studies conducted in 7 different countries. In addition, these studies address different approaches to online education (online learning environments, social networks, blended learning, etc.). In this respect, questions may be raised about the validity and generalizability of the results. However, the moderator analyses, whether for the country variable or for the approaches covered by online education, did not reveal significant differences in effect sizes. If significant differences in effect sizes had occurred, the comparisons made between countries under the umbrella of online education would have been open to doubt in terms of generalizability. Moreover, no study has been found in the literature that was conducted under the name of online education alone, without being based on a particular approach or specific technique. For instance, one of the commonly used terms is blended education, which is defined as an educational model in which online education is combined with the traditional education method (Colis & Moonen, 2001). Similarly, Rasmussen (2003) defines blended learning as "a distance education method that combines technology (high technology such as television, internet, or low technology such as voice e-mail, conferences) with traditional education and training." Further, Kerres and Witt (2003) define blended learning as "combining face-to-face learning with technology-assisted learning." As can be clearly observed, online education, which has a wider scope, includes many approaches. The statistics comparing the stated relations according to online education approaches are given in Table 7.

As seen in Table 7, the effect of online education on academic achievement does not differ according to the online education approaches used in the studies; however, the effect is highest for the web-based problem-solving approach.

4 Conclusions and discussion

Considering the developments during the pandemic, it is thought that the diversity of online education applications, as an interdisciplinary and pragmatic field, will increase, and that learning content and processes will be enriched as new technologies are integrated into online education. Another prediction is that more flexible and accessible learning opportunities will be created in online education processes, thereby strengthening lifelong learning. As a result, it is predicted that in the near future online education, or digital learning under a newer name, will become the main ground of education instead of being an alternative to, or a support for, face-to-face learning. The lessons learned from the early period of online learning, which was navigated through rapid adaptation due to the Covid-19 epidemic, will serve to develop this method all over the world; in the near future, online learning will become the main learning structure as its functionality increases with the contribution of new technologies and systems. From this point of view, there is a need to strengthen online education.

In this study, the effect of online learning on academic achievement is at a moderate level. To increase this effect, the implementation of online learning requires support from teachers to prepare learning materials, to design learning appropriately, and to utilize various digital media such as websites, software technology, and other tools that support the effectiveness of online learning (Rolisca & Achadiyah, 2014). According to research conducted by Rahayu et al. (2017), the use of various types of software increases the effectiveness and quality of online learning. The implementation of online learning can affect students' ability to adapt to technological developments, in that it leads students to use various learning resources on the internet to access different types of information and helps them get used to inquiry learning and active learning (Hart et al., 2019; Prestiadi et al., 2019). In addition, there may be many reasons why the effect found in this study is not higher. The moderator variables examined here could guide efforts to increase the practical effect; however, the effect size did not differ significantly for any of the moderator variables. Different moderator analyses could be considered in order to increase the impact of online education on academic success; if confounding variables that significantly change the effect level are detected, more precise recommendations can be made for increasing it. In addition to technical and financial problems, the level of impact will increase if a few other difficulties are addressed, such as students' lack of interaction with the instructor, response time, and the lack of traditional classroom socialization.

In addition, the social distancing associated with the COVID-19 pandemic has posed extreme difficulties for all stakeholders in getting online, as they have had to work under time and resource constraints. Adopting the online learning environment is not just a technical issue; it is a pedagogical and instructive challenge as well. Therefore, extensive preparation of teaching materials, curricula, and assessment is vital in online education. Technology is the delivery tool, and it requires close cross-collaboration between teaching, content, and technology teams (CoSN, 2020).

Online education applications have been used for many years, but they have come to the fore during the pandemic. This necessity has brought with it a discussion of using online education instead of traditional education methods in the future. However, this research shows that online education applications are moderately effective. Using online education in place of face-to-face education will only be possible if the level of success increases, which may become feasible with the experience and knowledge gained during the pandemic. Therefore, meta-analyses of experimental studies conducted in the coming years will guide us. In this context, experimental studies using online education applications should be analyzed carefully, and it would be useful to identify variables that can change the level of impact through different moderators. Moderator analyses are valuable in meta-analysis studies (consider, for example, the role of moderators in Karl Pearson's typhoid vaccine studies). In this context, each analysis sheds light on future studies. In meta-analyses of online education, it would be beneficial to go beyond the moderators examined in this study; thus, the contribution of similar studies to the field will increase.

The purpose of this study is to determine the effect of online education on academic achievement. In line with this purpose, studies analyzing the effect of online education approaches on academic achievement were included in the meta-analysis, with a total sample size of 1772. While the included studies were conducted in the US, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia, studies carried out in other European countries could not be reached. This may be attributed to a greater use of quantitative research methods from a positivist perspective in countries with an American academic tradition. As a result of the study, the effect size of online education on academic achievement (g = 0.409) was found to be moderate. In the studies included in the present research, online education approaches were found to be more effective than traditional ones. However, contrary to the present study, comparisons between online and traditional education in some studies show that face-to-face traditional learning is still considered effective compared to online learning (Ahmad et al., 2016; Hamdani & Priatna, 2020; Wei & Chou, 2020). Online education has advantages and disadvantages. The advantages of online learning over face-to-face classroom learning include the flexibility of learning time: learning is not tied to a single schedule and can be shaped according to circumstances (Lai et al., 2019). Another advantage is the ease of submitting assignments, as this can be done without having to talk to the teacher. Despite this, online education has several weaknesses, such as students having difficulty understanding the material, teachers being unable to monitor students, and students still having difficulty interacting with teachers when the internet connection is cut (Swan, 2007). According to Astuti et al. (2019), the face-to-face method is still considered better by students than e-learning because it is easier to understand the material and to interact with teachers. The results of the study illustrated that the effect size (g = 0.409) of online education on academic achievement is of medium level. Moreover, the results of the moderator analysis showed that the effect of online education on academic achievement does not differ in terms of the country, lecture, class level, and online education approach variables. A review of the literature shows that several meta-analyses on online education have been published (Bernard et al., 2004; Machtmes & Asher, 2000; Zhao et al., 2005). Typically, these meta-analyses also include studies of older-generation technologies such as audio, video, or satellite transmission. One of the most comprehensive studies on online education was conducted by Bernard et al. (2004), in which 699 independent effect sizes from 232 studies published from 1985 to 2001 were analyzed, and face-to-face education was compared to online education with respect to the achievement and attitudes of various learners, from young children to adults. In that meta-analysis, an overall effect size close to zero was found for students' achievement (g+ = 0.01).

In another meta-analysis carried out by Zhao et al. (2005), 98 effect sizes were examined, including 51 studies on online education conducted between 1996 and 2002. In contrast to the study of Bernard et al. (2004), this meta-analysis focuses on the activities carried out in online education courses. As a result of the research, an overall effect size close to zero was found for online education utilizing more than one generation of technology for students at different levels. However, a salient point of the meta-analysis of Zhao et al. is that it averages the different types of outcomes used within a study to calculate an overall effect size. This practice is problematic because the factors that improve one type of learner outcome (e.g. learner rehabilitation), particularly course characteristics and practices, may be quite different from those that improve another type of outcome (e.g. learner achievement), and may even harm the latter. By mixing studies with different types of outcomes, this implementation may obscure the relationship between practices and learning.

Some meta-analytic studies have focused on the effectiveness of new-generation distance learning courses accessed through the internet for specific student populations. For instance, Sitzmann et al. (2006) reviewed 96 studies published from 1996 to 2005 comparing web-based education on job-related knowledge or skills with face-to-face education. The researchers found that web-based education was in general slightly more effective than face-to-face education, but insufficient in terms of applicability ("knowing how to apply"). In addition, Sitzmann et al. (2006) revealed that internet-based education had a positive effect on theoretical knowledge in quasi-experimental studies, whereas face-to-face education was favored in experimental studies performed with random assignment. This moderator analysis emphasizes the need to pay attention to the designs of the studies included in a meta-analysis. The designs of the studies included in the present meta-analysis were not examined as a moderator; this can be offered as a suggestion for future studies.

Another meta-analysis was conducted by Cavanaugh et al. (2004), focusing on online education. In this study of internet-based distance education programs for students under 12 years of age, the researchers combined 116 results from 14 studies published between 1999 and 2004 to calculate an overall effect that was not statistically different from zero. The moderator analysis carried out in this study showed that there was no significant factor affecting students' success. This meta-analysis used multiple results from the same study, ignoring the fact that different results from the same students would not be independent of each other.

In conclusion, some meta-analytic studies have analyzed the consequences of online education for a wide range of students (Bernard et al., 2004; Zhao et al., 2005), and the effect sizes in these studies were generally low. Furthermore, none of the large-scale meta-analyses considered moderators, database quality standards, or class levels in the selection of studies, while some of them referred only to the country and lecture moderators. Advances in internet-based learning tools, the pandemic, and the increasing popularity of online learning in different contexts have made a precise meta-analysis of students' learning outcomes through online learning necessary. Previous meta-analyses were typically based on studies involving a narrow range of moderator variables. In the present study, common but significant moderators such as class level and lecture were examined in the context of the pandemic. For instance, problems have been experienced during the pandemic especially regarding the suitability of online education platforms for different class levels. It was found that there is a need to study and make suggestions on whether online education can meet the needs of teachers and students.

Besides, the main forms of online education in the past were watching the open lectures of famous universities and the educational videos of institutions. During the pandemic period, by contrast, online education has mainly been classroom-based teaching implemented by teachers in their own schools, an extension of the original school education. This meta-analysis will therefore serve as a source for comparing the effect size of the online education forms of the past decade with what is done today and what will be done in the future.

Lastly, the heterogeneity test results of the meta-analysis study display that the effect size does not differ in terms of class level, country, online education approaches, and lecture moderators.

References

*Studies included in the meta-analysis

Ahmad, S., Sumardi, K., & Purnawan, P. (2016). Komparasi Peningkatan Hasil Belajar Antara Pembelajaran Menggunakan Sistem Pembelajaran Online Terpadu Dengan Pembelajaran Klasikal Pada Mata Kuliah Pneumatik Dan Hidrolik. Journal of Mechanical Engineering Education, 2 (2), 286–292.


Ally, M. (2004). Foundations of educational theory for online learning. Theory and Practice of Online Learning, 2 , 15–44. Retrieved on the 11th of September, 2020 from https://eddl.tru.ca/wp-content/uploads/2018/12/01_Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf

Arat, T., & Bakan, Ö. (2011). Uzaktan eğitim ve uygulamaları. Selçuk Üniversitesi Sosyal Bilimler Meslek Yüksek Okulu Dergisi , 14 (1–2), 363–374. https://doi.org/10.29249/selcuksbmyd.540741

Astuti, C. C., Sari, H. M. K., & Azizah, N. L. (2019). Perbandingan Efektifitas Proses Pembelajaran Menggunakan Metode E-Learning dan Konvensional. Proceedings of the ICECRS, 2 (1), 35–40.

*Atici, B., & Polat, O. C. (2010). Influence of the online learning environments and tools on the student achievement and opinions. Educational Research and Reviews, 5 (8), 455–464. Retrieved on the 11th of October, 2020 from https://academicjournals.org/journal/ERR/article-full-text-pdf/4C8DD044180.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta- analysis of the empirical literature. Review of Educational Research, 3 (74), 379–439. https://doi.org/10.3102/00346543074003379

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.


Borenstein, M., Hedges, L., & Rothstein, H. (2007). Meta-analysis: Fixed effect vs. random effects . UK: Wiley.

Card, N. A. (2011). Applied meta-analysis for social science research: Methodology in the social sciences . Guilford.


*Carreon, J. R. (2018 ). Facebook as integrated blended learning tool in technology and livelihood education exploratory. Retrieved on the 1st of October, 2020 from https://files.eric.ed.gov/fulltext/EJ1197714.pdf

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Learning Point Associates/North Central Regional Educational Laboratory (NCREL) . Retrieved on the 11th of September, 2020 from https://files.eric.ed.gov/fulltext/ED489533.pdf

*Ceylan, V. K., & Elitok Kesici, A. (2017). Effect of blended learning to academic achievement. Journal of Human Sciences, 14 (1), 308. https://doi.org/10.14687/jhs.v14i1.4141

*Chae, S. E., & Shin, J. H. (2016). Tutoring styles that encourage learner satisfaction, academic engagement, and achievement in an online environment. Interactive Learning Environments, 24(6), 1371–1385. https://doi.org/10.1080/10494820.2015.1009472

*Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Educational Technology and Society, 17 (4), 352–365. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Gwo_Jen_Hwang/publication/287529242_An_Augmented_Reality-based_Mobile_Learning_System_to_Improve_Students'_Learning_Achievements_and_Motivations_in_Natural_Science_Inquiry_Activities/links/57198c4808ae30c3f9f2c4ac.pdf

Chiao, H. M., Chen, Y. L., & Huang, W. H. (2018). Examining the usability of an online virtual tour-guiding platform for cultural tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education, 23 (29–38), 1. https://doi.org/10.1016/j.jhlste.2018.05.002

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30 (3), 248–264. https://doi.org/10.2307/1183061

Cleophas, T. J., & Zwinderman, A. H. (2017). Modern meta-analysis: Review and update of methodologies . Switzerland: Springer. https://doi.org/10.1007/978-3-319-55895-0

Cohen, L., Manion, L., & Morrison, K. (2007). Observation.  Research Methods in Education, 6 , 396–412. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Nabil_Ashraf2/post/How_to_get_surface_potential_Vs_Voltage_curve_from_CV_and_GV_measurements_of_MOS_capacitor/attachment/5ac6033cb53d2f63c3c405b4/AS%3A612011817844736%401522926396219/download/Very+important_C-V+characterization+Lehigh+University+thesis.pdf

Colis, B., & Moonen, J. (2001). Flexible Learning in a Digital World: Experiences and Expectations. Open & Distance Learning Series . Stylus Publishing.

CoSN. (2020). COVID-19 Response: Preparing to Take School Online. Retrieved on the 3rd of September, 2021 from https://www.cosn.org/sites/default/files/COVID-19%20Member%20Exclusive_0.pdf

Cumming, G. (2012). Understanding new statistics: Effect sizes, confidence intervals, and meta-analysis. New York, USA: Routledge. https://doi.org/10.4324/9780203807002

Deeks, J. J., Higgins, J. P. T., & Altman, D. G. (2008). Analysing data and undertaking meta-analyses . In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (pp. 243–296). Sussex: John Wiley & Sons. https://doi.org/10.1002/9780470712184.ch9

Demiralay, R., Bayır, E. A., & Gelibolu, M. F. (2016). Öğrencilerin bireysel yenilikçilik özellikleri ile çevrimiçi öğrenmeye hazır bulunuşlukları ilişkisinin incelenmesi. Eğitim ve Öğretim Araştırmaları Dergisi, 5 (1), 161–168. https://doi.org/10.23891/efdyyu.2017.10

Dinçer, S. (2014). Eğitim bilimlerinde uygulamalı meta-analiz. Pegem Atıf İndeksi, 2014(1), 1–133. https://doi.org/10.14527/pegem.001

*Durak, G., Cankaya, S., Yunkul, E., & Ozturk, G. (2017). The effects of a social learning network on students’ performances and attitudes. European Journal of Education Studies, 3 (3), 312–333. 10.5281/zenodo.292951

*Ercan, O. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes . European Journal of Educational Research, 3 (1), 9–23. https://doi.org/10.12973/eu-jer.3.1.9

Ercan, O., & Bilen, K. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes. European Journal of Educational Research, 3 (1), 9–23.

*Ercan, O., Bilen, K., & Ural, E. (2016). “Earth, sun and moon”: Computer assisted instruction in secondary school science - Achievement and attitudes. Issues in Educational Research, 26 (2), 206–224. https://doi.org/10.12973/eu-jer.3.1.9

Field, A. P. (2003). The problems in using fixed-effects models of meta-analysis on real-world data. Understanding Statistics, 2 (2), 105–124. https://doi.org/10.1207/s15328031us0202_02

Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63 (3), 665–694. https://doi.org/10.1348/00071010x502733

Geostat. (2019). ‘Share of households with internet access’, National statistics office of Georgia . Retrieved on the 2nd September 2020 from https://www.geostat.ge/en/modules/categories/106/information-and-communication-technologies-usage-in-households

*Gwo-Jen, H., Nien-Ting, T., & Xiao-Ming, W. (2018). Creating interactive e-books through learning by design: The impacts of guided peer-feedback on students’ learning achievements and project outcomes in science courses. Journal of Educational Technology & Society., 21 (1), 25–36. Retrieved on the 2nd of October, 2020 https://ae-uploads.uoregon.edu/ISTE/ISTE2019/PROGRAM_SESSION_MODEL/HANDOUTS/112172923/CreatingInteractiveeBooksthroughLearningbyDesignArticle2018.pdf

Hamdani, A. R., & Priatna, A. (2020). Efektifitas implementasi pembelajaran daring (full online) dimasa pandemi Covid-19 pada jenjang Sekolah Dasar di Kabupaten Subang. Didaktik: Jurnal Ilmiah PGSD STKIP Subang, 6 (1), 1–9.

Hart, C. M., Berger, D., Jacob, B., Loeb, S., & Hill, M. (2019). Online learning, offline outcomes: Online course taking and high school student performance. Aera Open, 5(1).

*Hayes, J., & Stewart, I. (2016). Comparing the effects of derived relational training and computer coding on intellectual potential in school-age children. The British Journal of Educational Psychology, 86 (3), 397–411. https://doi.org/10.1111/bjep.12114

Horton, W. K. (2000). Designing web-based training: How to teach anyone anything anywhere anytime (Vol. 1). Wiley Publishing.

*Hwang, G. J., Wu, P. H., & Chen, C. C. (2012). An online game approach for improving students’ learning performance in web-based problem-solving activities. Computers and Education, 59 (4), 1246–1256. https://doi.org/10.1016/j.compedu.2012.05.009

*Kert, S. B., Köşkeroğlu Büyükimdat, M., Uzun, A., & Çayiroğlu, B. (2017). Comparing active game-playing scores and academic performances of elementary school students. Education 3–13, 45 (5), 532–542. https://doi.org/10.1080/03004279.2016.1140800

*Lai, A. F., & Chen, D. J. (2010). Web-based two-tier diagnostic test and remedial learning experiment. International Journal of Distance Education Technologies, 8 (1), 31–53. https://doi.org/10.4018/jdet.2010010103

*Lai, A. F., Lai, H. Y., Chuang W. H., & Wu, Z.H. (2015). Developing a mobile learning management system for outdoors nature science activities based on 5e learning cycle. Proceedings of the International Conference on e-Learning, ICEL. Proceedings of the International Association for Development of the Information Society (IADIS) International Conference on e-Learning (Las Palmas de Gran Canaria, Spain, July 21–24, 2015). Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/ED562095.pdf

Lai, C. H., Lin, H. W., Lin, R. M., & Tho, P. D. (2019). Effect of peer interaction among online learning community on learning engagement and achievement. International Journal of Distance Education Technologies (IJDET), 17 (1), 66–77.

Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis . Oxford University.

*Liu, K. P., Tai, S. J. D., & Liu, C. C. (2018). Enhancing language learning through creation: the effect of digital storytelling on student learning motivation and performance in a school English course. Educational Technology Research and Development, 66 (4), 913–935. https://doi.org/10.1007/s11423-018-9592-z

Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14 (1), 27–46. https://doi.org/10.1080/08923640009527043

Makowski, D., Piraux, F., & Brun, F. (2019). From experimental network to meta-analysis: Methods and applications with R for agronomic and environmental sciences. Dordrecht: Springer. https://doi.org/10.1007/978-94-024_1696-1

*Meyers, C., Molefe, A., & Brandt, C. (2015). The impact of the "Enhancing Missouri's Instructional Networked Teaching Strategies" (eMINTS) program on student achievement, 21st-century skills, and academic engagement: Second-year results. Society for Research on Educational Effectiveness. Retrieved on the 14th November, 2020 from https://files.eric.ed.gov/fulltext/ED562508.pdf

OECD. (2020). ‘A framework to guide an education response to the COVID-19 Pandemic of 2020 ’. https://doi.org/10.26524/royal.37.6

Pecoraro, V. (2018). Appraising evidence. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 99–114). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_9

Pigott, T. (2012). Advances in meta-analysis. Springer.

Pillay, H., Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing tertiary students' readiness for online learning. Higher Education Research & Development, 26 (2), 217–234. https://doi.org/10.1080/07294360701310821

Prestiadi, D., Zulkarnain, W., & Sumarsono, R. B. (2019). Visionary leadership in total quality management: efforts to improve the quality of education in the industrial revolution 4.0. In the 4th International Conference on Education and Management (COEMA 2019). Atlantis Press

Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33 (2), 162–177. https://doi.org/10.1080/08886504.2000.10782307

Rahayu, F. S., Budiyanto, D., & Palyama, D. (2017). Analisis penerimaan e-learning menggunakan technology acceptance model (Tam)(Studi Kasus: Universitas Atma Jaya Yogyakarta). Jurnal Terapan Teknologi Informasi, 1 (2), 87–98.

Rasmussen, R. C. (2003). The quantity and quality of human interaction in a synchronous blended learning environment . Brigham Young University Press.

*Ravenel, J., Lambeth, D. T., & Spires, B. (2014). Effects of computer-based programs on mathematical achievement scores for fourth-grade students. i-manager's Journal on School Educational Technology, 10 (1), 8–21. https://doi.org/10.26634/jsch.10.1.2830

Rolisca, R. U. C., & Achadiyah, B. N. (2014). Pengembangan media evaluasi pembelajaran dalam bentuk online berbasis e-learning menggunakan software wondershare quiz creator dalam mata pelajaran akuntansi SMA Brawijaya Smart School (BSS). Jurnal Pendidikan Akuntansi Indonesia, 12(2).

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of Web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59 (3), 623–664. https://doi.org/10.1111/j.1744-6570.2006.00049.x

Stewart, D. W., & Kamins, M. A. (2001). Developing a coding scheme and coding study reports. In M. W. Lipsey & D. B. Wilson (Eds.), Practical meta­analysis: Applied social research methods series (Vol. 49, pp. 73–90). Sage.

Swan, K. (2007). Research on online learning. Journal of Asynchronous Learning Networks, 11 (1), 55–59.

*Sung, H. Y., Hwang, G. J., & Chang, Y. C. (2016). Development of a mobile learning system based on a collaborative problem-posing strategy. Interactive Learning Environments, 24 (3), 456–471. https://doi.org/10.1080/10494820.2013.867889

Tsagris, M., & Fragkos, K. C. (2018). Meta-analyses of clinical trials versus diagnostic test accuracy studies. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 31–42). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_4

UNESCO. (2020, March 13). COVID-19 educational disruption and response. Retrieved on the 14th November 2020 from https://en.unesco.org/themes/education-emergencies/coronavirus-school-closures

Usta, E. (2011a). The effect of web-based learning environments on attitudes of students regarding computer and internet. Procedia - Social and Behavioral Sciences, 28, 262–269. https://doi.org/10.1016/j.sbspro.2011.11.051

Usta, E. (2011b). The examination of online self-regulated learning skills in web-based learning environments in terms of different variables. Turkish Online Journal of Educational Technology-TOJET, 10 (3), 278–286. Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/EJ944994.pdf

Vrasidas, C. & MsIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37 (2), 105–111. https://doi.org/10.1080/095239800410405

*Wang, C. H., & Chen, C. P. (2013). Effects of Facebook tutoring on learning English as a second language. Proceedings of the International Conference e-Learning 2013, 135–142. Retrieved on the 15th November 2020 from https://files.eric.ed.gov/fulltext/ED562299.pdf

Wei, H. C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41 (1), 48–69.

*Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27 (2), 226–241. https://doi.org/10.1080/10494820.2018.1458040

*Yu, F. Y., & Chen, Y. J. (2014). Effects of student-generated questions as the source of online drill-and-practice activities on learning. British Journal of Educational Technology, 45 (2), 316–329. https://doi.org/10.1111/bjet.12036

*Yu, F. Y., & Pan, K. J. (2014). The effects of student question-generation with online prompts on learning. Educational Technology and Society, 17 (3), 267–279. Retrieved on the 15th November 2020 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.565.643&rep=rep1&type=pdf

*Yu, W. F., She, H. C., & Lee, Y. M. (2010). The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International, 47 (2), 187–199. https://doi.org/10.1080/14703291003718927

Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, S. (2005). A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107 (8). https://doi.org/10.1111/j.1467-9620.2005.00544.x

*Zhong, B., Wang, Q., Chen, J., & Li, Y. (2017). Investigating the period of switching roles in pair programming in a primary school. Educational Technology and Society, 20 (3), 220–233. Retrieved on the 15th November 2020 from https://repository.nie.edu.sg/bitstream/10497/18946/1/ETS-20-3-220.pdf


Author information

Authors and Affiliations

Primary Education, Ministry of Turkish National Education, Mersin, Turkey


Corresponding author

Correspondence to Hakan Ulum.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Ulum, H. The effects of online education on academic success: A meta-analysis study. Educ Inf Technol 27 , 429–450 (2022). https://doi.org/10.1007/s10639-021-10740-8


Received : 06 December 2020

Accepted : 30 August 2021

Published : 06 September 2021

Issue Date : January 2022

DOI : https://doi.org/10.1007/s10639-021-10740-8


Keywords

  • Online education
  • Student achievement
  • Academic success
  • Meta-analysis
  • Open access
  • Published: 09 January 2024

Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership

  • Bandar N. Alarifi 1 &
  • Steve Song 2  

Humanities and Social Sciences Communications, volume 11, Article number: 86 (2024)

8258 Accesses

2 Citations

2 Altmetric


  • Science, technology and society

This study is a comparative analysis of online distance learning and traditional in-person education at King Saud University in Saudi Arabia, with a focus on understanding how different educational modalities affect student achievement. The justification for this study lies in the rapid shift towards online learning, especially highlighted by the educational changes during the COVID-19 pandemic. By analyzing the final test scores of freshman students in five core courses over the 2020 (in-person) and 2021 (online) academic years, the research provides empirical insights into the efficacy of online versus traditional education. Initial observations suggested that students in online settings scored lower in most courses. However, after adjusting for variables like gender, class size, and admission scores using multiple linear regression, a more nuanced picture emerged. Three courses showed better performance in the 2021 online cohort, one favored the 2020 in-person group, and one was unaffected by the teaching format. The study emphasizes the crucial need for a nuanced, data-driven strategy in integrating online learning within higher education systems. It brings to light the fact that the success of educational methodologies is highly contingent on specific contextual factors. This finding advocates for educational administrators and policymakers to exercise careful and informed judgment when adopting online learning modalities. It encourages them to thoroughly evaluate how different subjects and instructional approaches might interact with online formats, considering the variable effects these might have on learning outcomes. This approach ensures that decisions about implementing online education are made with a comprehensive understanding of its diverse and context-specific impacts, aiming to optimize educational effectiveness and student success.


Introduction

The year 2020 marked an extraordinary period, characterized by the global disruption caused by the COVID-19 pandemic. Governments and institutions worldwide had to adapt to unforeseen challenges across various domains, including health, economy, and education. In response, many educational institutions quickly transitioned to distance teaching (also known as e-learning, online learning, or virtual classrooms) to ensure continued access to education for their students. However, despite this rapid and widespread shift to online learning, its effects on student achievement, compared with traditional in-person instruction, remain largely unexamined.

Research examining student outcomes in the context of online learning has consistently observed that online learners often achieve less favorable results than their peers in traditional classroom settings (e.g., Fischer et al., 2020; Bettinger et al., 2017; Edvardsson and Oskarsson, 2008). However, it is important to note that a significant portion of research on online learning has primarily focused on its potential impact (Kuhfeld et al., 2020; Azevedo et al., 2020; Di Pietro et al., 2020) or explored various perspectives (Aucejo et al., 2020; Radha et al., 2020) concerning distance education. These studies have often omitted a comprehensive and nuanced examination of its concrete academic consequences, particularly in terms of test scores and grades.

Given the dearth of research on the academic impact of online learning, especially in light of COVID-19, the present study aims to address that gap by assessing the effectiveness of distance learning compared to in-person teaching in five required freshman-level courses at King Saud University, Saudi Arabia. To accomplish this objective, the current study compared the final exam results of 8297 freshman students who were enrolled in the five courses in person in 2020 to their 8425 first-year counterparts who had taken the same courses at the same institution in 2021 in an online format.

The final test results of the five courses (i.e., University Skills 101, Entrepreneurship 101, Computer Skills 101, Computer Skills 102, and Fitness and Health Culture 101) were examined, accounting for potential confounding factors such as gender, class size, and admission scores, which past research has found to be correlated with student achievement (e.g., Meinck and Brese, 2019; Jepsen, 2015). Additionally, as the preparatory year at King Saud University is divided into five tracks (health, nursing, science, business, and humanity), the study classified students based on their respective disciplines.

Motivation for the study

The rapid expansion of distance learning in higher education, particularly highlighted during the recent COVID-19 pandemic (Volk et al., 2020 ; Bettinger et al., 2017 ), underscores the need for alternative educational approaches during crises. Such disruptions can catalyze innovation and the adoption of distance learning as a contingency plan (Christensen et al., 2015 ). King Saud University, like many institutions worldwide, faced the challenge of transitioning abruptly to online learning in response to the pandemic.

E-learning has gained prominence in higher education due to technological advancements, offering institutions a competitive edge (Valverde-Berrocoso et al., 2020 ). Especially during conditions like the COVID-19 pandemic, electronic communication was utilized across the globe as a feasible means to overcome barriers and enhance interactions (Bozkurt, 2019 ).

Distance learning, characterized by flexibility, becomes crucial when traditional in-person classes are hindered by unforeseen circumstances such as those posed by COVID-19 (Arkorful and Abaidoo, 2015). Scholars argue that it allows students to learn at their own pace, often referred to as self-directed learning (Hiemstra, 1994) or self-education (Gadamer, 2001). Additional advantages include accessibility, cost-effectiveness, and flexibility (Sadeghi, 2019).

However, distance learning is not immune to its own set of challenges. Technical impediments, encompassing network issues, device limitations, and communication hiccups, represent formidable hurdles (Sadeghi, 2019 ). Furthermore, concerns about potential distractions in the online learning environment, fueled by the ubiquity of the internet and social media, have surfaced (Hall et al., 2020 ; Ravizza et al., 2017 ). The absence of traditional face-to-face interactions among students and between students and instructors is also viewed as a potential drawback (Sadeghi, 2019 ).

Given the evolving understanding of the pros and cons of distance learning, this study aims to contribute to the existing literature by assessing the effectiveness of distance learning, specifically in terms of student achievement, as compared to in-person classroom learning at King Saud University, one of Saudi Arabia’s largest higher education institutions.

Academic achievement: in-person vs online learning

The primary driving force behind the rapid integration of technology in education has been its emphasis on student performance (Lai and Bower, 2019 ). Over the past decade, numerous studies have undertaken comparisons of student academic achievement in online and in-person settings (e.g., Bettinger et al., 2017 ; Fischer et al., 2020 ; Iglesias-Pradas et al., 2021 ). This section offers a concise review of the disparities in academic achievement between college students engaged in in-person and online learning, as identified in existing research.

A number of studies point to the superiority of traditional in-person education over online learning in terms of academic outcomes. For example, Fischer et al. ( 2020 ) conducted a comprehensive study involving 72,000 university students across 433 subjects, revealing that online students tend to achieve slightly lower academic results than their in-class counterparts. Similarly, Bettinger et al. ( 2017 ) found that students at for-profit online universities generally underperformed when compared to their in-person peers. Supporting this trend, Figlio et al. ( 2013 ) indicated that in-person instruction consistently produced better results, particularly among specific subgroups like males, lower-performing students, and Hispanic learners. Additionally, Kaupp’s ( 2012 ) research in California community colleges demonstrated that online students faced lower completion and success rates compared to their traditional in-person counterparts (Fig. 1 ).

Figure 1: Comparison of student achievement on the final tests in the five courses by year (independent-samples t-tests). The results show a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101.

In contrast, other studies present evidence of online students outperforming their in-person peers. For example, Iglesias-Pradas et al. ( 2021 ) conducted a comparative analysis of 43 bachelor courses at Telecommunication Engineering College in Malaysia, revealing that online students achieved higher academic outcomes than their in-person counterparts. Similarly, during the COVID-19 pandemic, Gonzalez et al. ( 2020 ) found that students engaged in online learning performed better than those who had previously taken the same subjects in traditional in-class settings.

Expanding on this topic, several studies have reported mixed results when comparing the academic performance of online and in-person students, with various student and instructor factors emerging as influential variables. Chesser et al. ( 2020 ) noted that student traits such as conscientiousness, agreeableness, and extraversion play a substantial role in academic achievement, regardless of the learning environment—be it traditional in-person classrooms or online settings. Furthermore, Cacault et al. ( 2021 ) discovered that online students with higher academic proficiency tend to outperform those with lower academic capabilities, suggesting that differences in students’ academic abilities may impact their performance. In contrast, Bergstrand and Savage ( 2013 ) found that online classes received lower overall ratings and exhibited a less respectful learning environment when compared to in-person instruction. Nevertheless, they also observed that the teaching efficiency of both in-class and online courses varied significantly depending on the instructors’ backgrounds and approaches. These findings underscore the multifaceted nature of the online vs. in-person learning debate, highlighting the need for a nuanced understanding of the factors at play.

Theoretical framework

Constructivism is a well-established learning theory that places learners at the forefront of their educational experience, emphasizing their active role in constructing knowledge through interactions with their environment (Duffy and Jonassen, 2009 ). According to constructivist principles, learners build their understanding by assimilating new information into their existing cognitive frameworks (Vygotsky, 1978 ). This theory highlights the importance of context, active engagement, and the social nature of learning (Dewey, 1938 ). Constructivist approaches often involve hands-on activities, problem-solving tasks, and opportunities for collaborative exploration (Brooks and Brooks, 1999 ).

In the realm of education, subject-specific pedagogy emerges as a vital perspective that acknowledges the distinctive nature of different academic disciplines (Shulman, 1986 ). It suggests that teaching methods should be tailored to the specific characteristics of each subject, recognizing that subjects like mathematics, literature, or science require different approaches to facilitate effective learning (Shulman, 1987 ). Subject-specific pedagogy emphasizes that the methods of instruction should mirror the ways experts in a particular field think, reason, and engage with their subject matter (Cochran-Smith and Zeichner, 2005 ).

When applying these principles to the design of instruction for online and in-person learning environments, the significance of adapting methods becomes even more pronounced. Online learning often requires unique approaches due to its reliance on technology, asynchronous interactions, and potential for reduced social presence (Anderson, 2003 ). In-person learning, on the other hand, benefits from face-to-face interactions and immediate feedback (Allen and Seaman, 2016 ). Here, the interplay of constructivism and subject-specific pedagogy becomes evident.

Online learning. In an online environment, constructivist principles can be upheld by creating interactive online activities that promote exploration, reflection, and collaborative learning (Salmon, 2000 ). Discussion forums, virtual labs, and multimedia presentations can provide opportunities for students to actively engage with the subject matter (Harasim, 2017 ). By integrating subject-specific pedagogy, educators can design online content that mirrors the discipline’s methodologies while leveraging technology for authentic experiences (Koehler and Mishra, 2009 ). For instance, an online history course might incorporate virtual museum tours, primary source analysis, and collaborative timeline projects.

In-person learning. In a traditional brick-and-mortar classroom setting, constructivist methods can be implemented through group activities, problem-solving tasks, and in-depth discussions that encourage active participation (Jonassen et al., 2003 ). Subject-specific pedagogy complements this by shaping instructional methods to align with the inherent characteristics of the subject (Hattie, 2009). For instance, in a physics class, hands-on experiments and real-world applications can bring theoretical concepts to life (Hake, 1998 ).

In sum, the fusion of constructivism and subject-specific pedagogy offers a versatile approach to instructional design that adapts to different learning environments (Garrison, 2011 ). By incorporating the principles of both theories, educators can tailor their methods to suit the unique demands of online and in-person learning, ultimately providing students with engaging and effective learning experiences that align with the nature of the subject matter and the mode of instruction.

Course description

The Self-Development Skills Department at King Saud University (KSU) offers five mandatory freshman-level courses. These courses aim to foster advanced thinking skills and cultivate scientific research abilities in students. They do so by imparting essential skills, identifying higher-level thinking patterns, and facilitating hands-on experience in scientific research. The design of these classes is centered around aiding students’ smooth transition into university life. Brief descriptions of these courses are as follows:

University Skills 101 (CI 101) is a three-hour credit course designed to nurture essential academic, communication, and personal skills among all preparatory year students at King Saud University. The primary goal of this course is to equip students with the practical abilities they need to excel in their academic pursuits and navigate their university lives effectively. CI 101 comprises 12 sessions and is an integral part of the curriculum for all incoming freshmen, ensuring a standardized foundation for skill development.

Fitness and Health 101 (FAJB 101) is a one-hour credit course. FAJB 101 focuses on self-development skills related to health and physical fitness, covering personal health, nutrition, sports, preventive care, psychological well-being, reproductive health, and first aid. The course aims to motivate students' learning through entertainment, sports activities, and physical exercises that help them maintain their health. This course is required for all incoming freshmen students at King Saud University.

Entrepreneurship 101 (ENT 101) is a one-hour credit course. ENT 101 aims to develop students' skills related to entrepreneurship. The course provides students with knowledge and skills to generate and transform ideas and innovations into practical commercial projects in business settings. The entrepreneurship course consists of 14 sessions and is taught only to students in the business track.

Computer Skills 101 (CT 101) is a three-hour credit course. The course provides students with basic computer skills (e.g., components, operating systems, applications, and communication backup) and explores data visualization, an introductory level of modern programming with algorithms, and information security. CT 101 is taught in all tracks except the human track.

Computer Skills 102 (CT 102) is a three-hour credit course. It provides students with the IT skills to use computers efficiently, develops their research and scientific skills, and builds their capability to design basic educational software. CT 102 focuses on application software such as Microsoft Office. This course is taught only to students in the human track.

Structure and activities

These courses range from one to three credit hours. A one-hour credit means that students take an hour of the class each week during the academic semester; the same arrangement applies to two- and three-credit-hour courses. The types of activities in each course are shown in Table 1.

At King Saud University, each semester spans 15 weeks in duration. The total number of semester hours allocated to each course serves as an indicator of its significance within the broader context of the academic program, including the diverse tracks available to students. Throughout the two years under study (i.e., 2020 and 2021), course placements (fall or spring), course content, and the organizational structure remained consistent and uniform.

Participants

The study’s data comes from test scores of a cohort of 16,722 first-year college students enrolled at King Saud University in Saudi Arabia over the span of two academic years: 2020 and 2021. Among these students, 8297 were engaged in traditional, in-person learning in 2020, while 8425 had transitioned to online instruction for the same courses in 2021 due to the Covid-19 pandemic. In 2020, the student population consisted of 51.5% females and 48.5% males. However, in 2021, there was a reversal in these proportions, with female students accounting for 48.5% and male students comprising 51.5% of the total participants.

Regarding student enrollment in the five courses, Table 2 provides a detailed breakdown by average class size, admission scores, and the number of students enrolled in the courses during the two years covered by this study. While the total number of students in each course remained relatively consistent across the two years, there were noticeable fluctuations in average class sizes. Specifically, four out of the five courses experienced substantial increases in class size, with some nearly doubling in size (e.g., ENT_101 and CT_102), while one course (CT_101) showed a reduction in its average class size.

In this study, it must be noted that while some students enrolled in up to three different courses within the same academic year, none repeated the same exam in both years. Specifically, students who failed to pass their courses in 2020 were required to complete them in summer sessions and were consequently not included in this study’s dataset. To ensure clarity and precision in our analysis, the research focused exclusively on student test scores to evaluate and compare the academic effectiveness of online and traditional in-person learning methods. This approach was chosen to provide a clear, direct comparison of the educational impacts associated with each teaching format.

Descriptive analyses of the final exam scores for the two years (2020 and 2021) were conducted. Additionally, student outcomes in the 2020 in-person classes were compared with those of their 2021 online peers using independent-samples t-tests. Subsequently, to address potential disparities between the two groups arising from variables such as gender, class size, and admission scores (which serve as an indicator of students' academic aptitude and pre-enrollment knowledge), multiple regression analyses were conducted. In these multivariate analyses, outcomes of both the in-person and online cohorts were assessed within their respective tracks. By carefully considering the aforementioned variables linked to student performance, the study aimed to ensure a comprehensive and equitable evaluation.
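As an illustration of this analysis pipeline, the sketch below shows how the per-course unadjusted comparison and the adjusted comparison could be run. It is not the authors' code, and the data frame and column names (course, year, gender, class_size, admission_score, final_score) are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' code): per-course t-test plus a
# multiple linear regression adjusting for gender, class size, and admission score.
# Assumes a pandas DataFrame `df` with hypothetical columns:
# course, year (2020 or 2021), gender, class_size, admission_score, final_score.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf


def compare_course(df: pd.DataFrame, course: str):
    sub = df[df["course"] == course]

    # Step 1: unadjusted comparison of means (independent-samples t-test).
    in_person = sub.loc[sub["year"] == 2020, "final_score"]
    online = sub.loc[sub["year"] == 2021, "final_score"]
    t_stat, p_value = stats.ttest_ind(in_person, online)

    # Step 2: adjusted comparison via multiple linear regression; the coefficient
    # on the year indicator estimates the online-vs-in-person difference after
    # controlling for gender, class size, and admission score.
    model = smf.ols(
        "final_score ~ C(year) + C(gender) + class_size + admission_score",
        data=sub,
    ).fit()
    return t_stat, p_value, model


# Example usage (course labels follow the paper):
# t, p, fit = compare_course(df, "CT_101")
# print(fit.params["C(year)[T.2021]"], fit.summary())
```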

Study instrument

The study obtained students’ final exam scores for the years 2020 (in-person) and 2021 (online) from the school’s records office through their examination management system. In the preparatory year at King Saud University, final exams for all courses are developed by committees composed of faculty members from each department. To ensure valid comparisons, the final exam questions, crafted by departmental committees of professors, remained consistent and uniform for the two years under examination.

Table 3 provides a comprehensive assessment of the reliability of all five tests included in our analysis. These tests exhibit a strong degree of internal consistency, with Cronbach’s alpha coefficients spanning a range from 0.77 to 0.86. This robust and consistent internal consistency measurement underscores the dependable nature of these tests, affirming their reliability and suitability for the study’s objectives.

In terms of assessing test validity, content validity was ensured through a thorough review by university subject matter experts, resulting in test items that align well with the content domain and learning objectives. Additionally, criterion-related validity was established by correlating students’ admissions test scores with their final required freshman test scores in the five subject areas, showing a moderate and acceptable relationship (0.37 to 0.56) between the test scores and the external admissions test. Finally, construct validity was confirmed through reviews by experienced subject instructors, leading to improvements in test content. With guidance from university subject experts, construct validity was established, affirming the effectiveness of the final tests in assessing students’ subject knowledge at the end of their coursework.

Collectively, these validity and reliability measures affirm the soundness and integrity of the final subject tests, establishing their suitability as effective assessment tools for evaluating students’ knowledge in their five mandatory freshman courses at King Saud University.
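For readers who wish to reproduce such instrument checks on item-level data, the following is a minimal sketch (not the authors' code): the internal-consistency measure is the standard Cronbach's alpha formula, and the criterion check is a simple Pearson correlation between admission and final scores. All variable names are assumptions.

```python
# Minimal sketch of the reliability and criterion-validity checks described above.
# Assumes `items` is a pandas DataFrame with one column per test item (scored per
# student) and `admission`, `final` are aligned pandas Series of scores.
import pandas as pd
from scipy import stats


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)


def criterion_validity(admission: pd.Series, final: pd.Series) -> float:
    """Pearson correlation between admission scores and final test scores."""
    r, _p = stats.pearsonr(admission, final)
    return r
```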

After obtaining research approval from the Research Committee at King Saud University, the coordinators of the five courses (CI_101, ENT_101, CT_101, CT_102, and FAJB_101) supplied the researchers with the final exam scores of all first-year preparatory year students at King Saud University for the initial semester of the academic years 2020 and 2021. The sample encompassed all students who had completed these five courses during both years, resulting in a total of 16,722 students forming the final group of participants.

Limitations

Several limitations warrant acknowledgment in this study. First, the research was conducted within a well-resourced major public university. As such, the experiences with online classes at other types of institutions (e.g., community colleges, private institutions) may vary significantly. Additionally, the limited data pertaining to in-class teaching practices and the diversity of learning activities across different courses represents a gap that could have provided valuable insights for a more thorough interpretation and explanation of the study’s findings.

To compare student achievement on the final tests in the five courses by year, independent-samples t-tests were conducted. Table 4 shows a statistically significant drop in test scores from 2020 (in person) to 2021 (online) for all courses except CT_101. The biggest decline was in CT_102 (3.58 points), and the smallest was in CI_101 (0.18 points).

However, such a simple comparison of means between the two years (via t-tests) by subject does not account for the differences in gender composition, class size, and admission scores between the two academic years, all of which have been associated with student outcomes (e.g., Ho and Kelman, 2014; De Paola et al., 2013). To account for these potential confounding variables, multiple regressions were conducted to compare the two years' results while controlling for these three factors associated with student achievement.
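Written out, the adjusted comparison corresponds to a specification along the following lines; the exact variable coding is not reported in the text, so this form is an assumption for illustration:

```latex
\text{FinalScore}_{i} = \beta_0 + \beta_1\,\text{Online2021}_{i} + \beta_2\,\text{Female}_{i}
  + \beta_3\,\text{ClassSize}_{i} + \beta_4\,\text{AdmissionScore}_{i} + \varepsilon_{i}
```

Under this reading, the coefficient on the year indicator is the quantity reported in Table 5: the estimated 2021 (online) versus 2020 (in-person) difference in final exam scores once the three controls are taken into account.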

Table 5 presents the regression results, illustrating the variation in final exam scores between 2020 and 2021, while controlling for gender, class size, and admission scores. Importantly, these results diverge significantly from the outcomes obtained through independent-sample t -test analyses.

Taking into consideration the variables mentioned earlier, students in the 2021 online cohort demonstrated superior performance compared to their 2020 in-person counterparts in CI_101, FAJB_101, and CT_101, with score advantages of 0.89, 0.56, and 5.28 points, respectively. Conversely, in the case of ENT_101, online students in 2021 scored 0.69 points lower than their 2020 in-person counterparts. With CT_102, there were no statistically significant differences in final exam scores between the two cohorts of students.

The study sought to assess the effectiveness of distance learning compared to in-person learning in the higher education setting in Saudi Arabia. We analyzed the final exam scores of 16,722 first-year college students at King Saud University in five required subjects (i.e., CI_101, ENT_101, CT_101, CT_102, and FAJB_101). The study initially performed a simple comparison of mean scores by track by year (via t-tests) and then a number of multiple regression analyses that controlled for class size, gender composition, and admission scores.

Overall, the study’s more in-depth findings using multiple regression painted a wholly different picture than the results obtained using t -tests. After controlling for class size, gender composition, and admissions scores, online students in 2021 performed better than their in-person instruction peers in 2020 in University Skills (CI_101), Fitness and Health (FAJB_101), and Computer Skills (CT_101), whereas in-person students outperformed their online peers in Entrepreneurship (ENT_101). There was no meaningful difference in outcomes for students in the Computer Skills (CT_102) course for the two years.

In light of these findings, it raises the question: why do we observe minimal differences (less than a one-point gain or loss) in student outcomes in courses like University Skills, Fitness and Health, Entrepreneurship, and Advanced Computer Skills based on the mode of instruction? Is it possible that when subjects are primarily at a basic or introductory level, as is the case with these courses, the mode of instruction may have a limited impact as long as the concepts are effectively communicated in a manner familiar and accessible to students?

In today’s digital age, one could argue that students in more developed countries, such as Saudi Arabia, generally possess the skills and capabilities to effectively engage with materials presented in both in-person and online formats. However, there is a notable exception in the Basic Computer Skills course, where the online cohort outperformed their in-person counterparts by more than 5 points. Insights from interviews with the instructors of this course suggest that this result may be attributed to the course’s basic and conceptual nature, coupled with the availability of instructional videos that students could revisit at their own pace.

Given that students enter this course with varying levels of computer skills, self-paced learning may have allowed them to cover course materials at their preferred speed, concentrating on less familiar topics while swiftly progressing through concepts they already understood. The advantages of such self-paced learning have been documented by scholars like Tullis and Benjamin ( 2011 ), who found that self-paced learners often outperform those who spend the same amount of time studying identical materials. This approach allows learners to allocate their time more effectively according to their individual learning pace, providing greater ownership and control over their learning experience. As such, in courses like introductory computer skills, it can be argued that becoming familiar with fundamental and conceptual topics may not require extensive in-class collaboration. Instead, it may be more about exposure to and digestion of materials in a format and at a pace tailored to students with diverse backgrounds, knowledge levels, and skill sets.

Further investigation is needed to more fully understand why some classes benefitted from online instruction while others did not. Perhaps some content areas are more conducive to an in-person (or online) format than others. Alternatively, the different results of the two modes of learning may have been driven by students of varying academic abilities and engagement, with low-achieving students being more vulnerable to the limitations of online learning (e.g., Kofoed et al., 2021). Whatever the reasons, the results of the current study would be enriched by a more in-depth analysis of the various factors associated with these different forms of learning. Moreover, although not clear cut, the current study does provide additional evidence against dire consequences for student learning (at least in the higher education setting) as a result of a sudden increase in online learning, while showcasing possible benefits of its wider use.

Based on the findings of this study, we recommend that educational leaders adopt a measured approach to online learning—a stance that neither fully embraces nor outright denounces it. The impact on students’ experiences and engagement appears to vary depending on the subjects and methods of instruction, sometimes hindering, other times promoting effective learning, while some classes remain relatively unaffected.

Rather than taking a one-size-fits-all approach, educational leaders should be open to exploring the nuances behind these outcomes. This involves examining why certain courses thrived with online delivery, while others either experienced a decline in student achievement or remained largely unaffected. By exploring these differentiated outcomes associated with diverse instructional formats, leaders in higher education institutions and beyond can make informed decisions about resource allocation. For instance, resources could be channeled towards in-person learning for courses that benefit from it, while simultaneously expanding online access for courses that have demonstrated improved outcomes through its virtual format. This strategic approach not only optimizes resource allocation but could also open up additional revenue streams for the institution.

Considering the enduring presence of online learning, both before the pandemic and its accelerated adoption due to Covid-19, there is an increasing need for institutions of learning and scholars in higher education, as well as other fields, to prioritize the study of its effects and optimal utilization. This study, which compares student outcomes between two cohorts exposed to in-person and online instruction (before and during Covid-19) at the largest university in Saudi Arabia, represents a meaningful step in this direction.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.

Allen IE, Seaman J (2016) Online report card: Tracking online education in the United States . Babson Survey Group

Anderson T (2003) Getting the mix right again: an updated and theoretical rationale for interaction. Int Rev Res Open Distrib Learn , 4 (2). https://doi.org/10.19173/irrodl.v4i2.149

Arkorful V, Abaidoo N (2015) The role of e-learning, advantages and disadvantages of its adoption in higher education. Int J Instruct Technol Distance Learn 12(1):29–42


Aucejo EM, French J, Araya MP, Zafar B (2020) The impact of COVID-19 on student experiences and expectations: Evidence from a survey. Journal of Public Economics 191:104271. https://doi.org/10.1016/j.jpubeco.2020.104271


Azevedo JP, Hasan A, Goldemberg D, Iqbal SA, and Geven K (2020) Simulating the potential impacts of COVID-19 school closures on schooling and learning outcomes: a set of global estimates. World Bank Policy Research Working Paper

Bergstrand K, Savage SV (2013) The chalkboard versus the avatar: Comparing the effectiveness of online and in-class courses. Teach Sociol 41(3):294–306. https://doi.org/10.1177/0092055X13479949


Bettinger EP, Fox L, Loeb S, Taylor ES (2017) Virtual classrooms: How online college courses affect student success. Am Econ Rev 107(9):2855–2875. https://doi.org/10.1257/aer.20151193

Bozkurt A (2019) From distance education to open and distance learning: a holistic evaluation of history, definitions, and theories. Handbook of research on learning in the age of transhumanism , 252–273. https://doi.org/10.4018/978-1-5225-8431-5.ch016

Brooks JG, Brooks MG (1999) In search of understanding: the case for constructivist classrooms . Association for Supervision and Curriculum Development

Cacault MP, Hildebrand C, Laurent-Lucchetti J, Pellizzari M (2021) Distance learning in higher education: evidence from a randomized experiment. J Eur Econ Assoc 19(4):2322–2372. https://doi.org/10.1093/jeea/jvaa060

Chesser S, Murrah W, Forbes SA (2020) Impact of personality on choice of instructional delivery and students’ performance. Am Distance Educ 34(3):211–223. https://doi.org/10.1080/08923647.2019.1705116

Christensen CM, Raynor M, McDonald R (2015) What is disruptive innovation? Harv Bus Rev 93(12):44–53

Cochran-Smith M, Zeichner KM (2005) Studying teacher education: the report of the AERA panel on research and teacher education. Choice Rev Online 43 (4). https://doi.org/10.5860/choice.43-2338

De Paola M, Ponzo M, Scoppa V (2013) Class size effects on student achievement: heterogeneity across abilities and fields. Educ Econ 21(2):135–153. https://doi.org/10.1080/09645292.2010.511811

Dewey, J (1938) Experience and education . Simon & Schuster

Di Pietro G, Biagi F, Costa P, Karpinski Z, Mazza J (2020) The likely impact of COVID-19 on education: reflections based on the existing literature and recent international datasets. Publications Office of the European Union, Luxembourg

Duffy TM, Jonassen DH (2009) Constructivism and the technology of instruction: a conversation . Routledge, Taylor & Francis Group

Edvardsson IR, Oskarsson GK (2008) Distance education and academic achievement in business administration: the case of the University of Akureyri. Int Rev Res Open Distrib Learn, 9 (3). https://doi.org/10.19173/irrodl.v9i3.542

Figlio D, Rush M, Yin L (2013) Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ 31(4):763–784. https://doi.org/10.3386/w16089

Fischer C, Xu D, Rodriguez F, Denaro K, Warschauer M (2020) Effects of course modality in summer session: enrollment patterns and student performance in face-to-face and online classes. Internet Higher Educ 45:100710. https://doi.org/10.1016/j.iheduc.2019.100710

Gadamer HG (2001) Education is self‐education. J Philos Educ 35(4):529–538

Garrison DR (2011) E-learning in the 21st century: a framework for research and practice . Routledge. https://doi.org/10.4324/9780203838761

Gonzalez T, de la Rubia MA, Hincz KP, Comas-Lopez M, Subirats L, Fort S, & Sacha GM (2020) Influence of COVID-19 confinement on students’ performance in higher education. PLOS One 15 (10). https://doi.org/10.1371/journal.pone.0239490

Hake RR (1998) Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66(1):64–74. https://doi.org/10.1119/1.18809


Hall ACG, Lineweaver TT, Hogan EE, O’Brien SW (2020) On or off task: the negative influence of laptops on neighboring students’ learning depends on how they are used. Comput Educ 153:1–8. https://doi.org/10.1016/j.compedu.2020.103901

Harasim L (2017) Learning theory and online technologies. Routledge. https://doi.org/10.4324/9780203846933

Hiemstra R (1994) Self-directed learning. In WJ Rothwell & KJ Sensenig (Eds), The sourcebook for self-directed learning (pp 9–20). HRD Press

Ho DE, Kelman MG (2014) Does class size affect the gender gap? A natural experiment in law. J Legal Stud 43(2):291–321

Iglesias-Pradas S, Hernández-García Á, Chaparro-Peláez J, Prieto JL (2021) Emergency remote teaching and students’ academic performance in higher education during the COVID-19 pandemic: a case study. Comput Hum Behav 119:106713. https://doi.org/10.1016/j.chb.2021.106713

Jepsen C (2015) Class size: does it matter for student achievement? IZA World of Labor . https://doi.org/10.15185/izawol.190

Jonassen DH, Howland J, Moore J, & Marra RM (2003) Learning to solve problems with technology: a constructivist perspective (2nd ed). Columbus: Prentice Hall

Kaupp R (2012) Online penalty: the impact of online instruction on the Latino-White achievement gap. J Appli Res Community Coll 19(2):3–11. https://doi.org/10.46569/10211.3/99362

Koehler MJ, Mishra P (2009) What is technological pedagogical content knowledge? Contemp Issues Technol Teacher Educ 9(1):60–70

Kofoed M, Gebhart L, Gilmore D, & Moschitto R (2021) Zooming to class?: Experimental evidence on college students’ online learning during COVID-19. SSRN Electron J. https://doi.org/10.2139/ssrn.3846700

Kuhfeld M, Soland J, Tarasawa B, Johnson A, Ruzek E, Liu J (2020) Projecting the potential impact of COVID-19 school closures on academic achievement. Educ Res 49(8):549–565. https://doi.org/10.3102/0013189x20965918

Lai JW, Bower M (2019) How is the use of technology in education evaluated? A systematic review. Comput Educ 133:27–42

Meinck S, Brese F (2019) Trends in gender gaps: using 20 years of evidence from TIMSS. Large-Scale Assess Educ 7 (1). https://doi.org/10.1186/s40536-019-0076-3

Radha R, Mahalakshmi K, Kumar VS, Saravanakumar AR (2020) E-Learning during lockdown of COVID-19 pandemic: a global perspective. Int J Control Autom 13(4):1088–1099

Ravizza SM, Uitvlugt MG, Fenn KM (2017) Logged in and zoned out: How laptop Internet use relates to classroom learning. Psychol Sci 28(2):171–180. https://doi.org/10.1177/095679761667731


Sadeghi M (2019) A shift from classroom to distance learning: advantages and limitations. Int J Res Engl Educ 4(1):80–88

Salmon G (2000) E-moderating: the key to teaching and learning online . Routledge. https://doi.org/10.4324/9780203816684

Shulman LS (1986) Those who understand: knowledge growth in teaching. Edu Res 15(2):4–14

Shulman LS (1987) Knowledge and teaching: foundations of the new reform. Harv Educ Rev 57(1):1–22

Tullis JG, Benjamin AS (2011) On the effectiveness of self-paced learning. J Mem Lang 64(2):109–118. https://doi.org/10.1016/j.jml.2010.11.002

Valverde-Berrocoso J, Garrido-Arroyo MDC, Burgos-Videla C, Morales-Cevallos MB (2020) Trends in educational research about e-learning: a systematic literature review (2009–2018). Sustainability 12(12):5153

Volk F, Floyd CG, Shaler L, Ferguson L, Gavulic AM (2020) Active duty military learners and distance education: factors of persistence and attrition. Am J Distance Educ 34(3):1–15. https://doi.org/10.1080/08923647.2019.1708842

Vygotsky LS (1978) Mind in society: the development of higher psychological processes. Harvard University Press


Author information

Authors and Affiliations

Department of Sports and Recreation Management, King Saud University, Riyadh, Saudi Arabia

Bandar N. Alarifi

Division of Research and Doctoral Studies, Concordia University Chicago, 7400 Augusta Street, River Forest, IL, 60305, USA


Contributions

Dr. Bandar Alarifi collected and organized data for the five courses and wrote the manuscript. Dr. Steve Song analyzed and interpreted the data regarding student achievement and revised the manuscript. These authors jointly supervised this work and approved the final manuscript.

Corresponding author

Correspondence to Bandar N. Alarifi.

Ethics declarations

Competing interests.

The authors declare no competing interests.

Ethical approval

This study was approved by the Research Ethics Committee at King Saud University on 25 March 2021 (No. 4/4/255639). This research does not involve the collection or analysis of data that could be used to identify participants (including email addresses or other contact details). All information is anonymized and the submission does not include images that may identify the person. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

This article does not contain any studies with human participants performed by any of the authors.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article.

Alarifi, B.N., Song, S. Online vs in-person learning in higher education: effects on student achievement and recommendations for leadership. Humanit Soc Sci Commun 11 , 86 (2024). https://doi.org/10.1057/s41599-023-02590-1


Received : 07 June 2023

Accepted : 21 December 2023

Published : 09 January 2024

DOI : https://doi.org/10.1057/s41599-023-02590-1


Navigating the New Normal: Adapting Online and Distance Learning in the Post-Pandemic Era


1. Introduction

1.1. Background

1.2. Purpose of the Review

  • Highlighting the multifaceted impact of the pandemic on education, including the disruptions caused by school closures and the subsequent shift to remote learning [ 1 ].
  • Exploring innovative approaches and strategies employed by educators to ensure effective online teaching and learning experiences [ 2 , 4 ].
  • Examining the role of technological solutions and platforms in facilitating remote education and their effectiveness in supporting teaching and learning processes [ 4 ].
  • Investigating strategies for promoting student engagement and participation in virtual classrooms, considering the unique challenges and opportunities presented by online and distance learning [ 2 , 3 ].
  • Evaluating the various assessment and evaluation methods employed in online education, considering their validity, reliability, and alignment with learning outcomes [ 4 ].
  • Discussing the importance of supporting student well-being and academic success in the digital environment, addressing the social and emotional aspects of remote learning [ 3 ].
  • Examining the professional development opportunities and resources available for educators to enhance their skills in online teaching and adapt to the changing educational landscape [ 4 ].
  • Addressing equity and accessibility considerations in online and distance learning, developing strategies to ensure equitable opportunities for all learners and mitigate the digital divide [ 1 , 2 ].
  • Identifying key lessons learned and best practices from the experiences of educators and students during the pandemic, providing insights for future educational practices [ 1 , 4 ].
  • Discussing the potential for educational innovation and transformations in teaching and learning practices in the post-pandemic era, considering the lessons learned from the rapid transition to online and distance learning [ 4 ].

1.3. Significance of the Study

  • To provide a comprehensive understanding of the impact of the pandemic on education. UNESCO (2020) reported that the widespread school closures caused by the pandemic disrupted traditional education practices and posed significant challenges for students, educators, and families [ 1 ]. As such, understanding the multifaceted impact of the pandemic is crucial for effective decision making and policy development.
  • To highlight innovative approaches to online teaching and learning. Hodges et al. [ 4 ] emphasized the importance of instructional design principles and the use of educational technology tools in facilitating effective online education. By examining strategies employed by educators during the pandemic, this review paper aims to identify successful practices that can be applied in future online and blended learning environments.
  • To explore the role of technology in supporting remote education. The rapid transition to online and distance learning has required the use of various technological solutions and platforms. With reference to this subject, Hodges et al. (2020) discussed the difference between emergency remote teaching and online learning, highlighting the importance of leveraging technology to create engaging and interactive virtual classrooms [ 4 ].
  • To address equity and accessibility considerations. The pandemic has exacerbated existing inequities in access to education and technology. Along these lines, UNESCO (2020) emphasized the need to address equity issues and bridge the digital divide to ensure equitable opportunities for all learners. This review paper examines strategies and interventions aimed at promoting equitable access to online and distance learning.
  • To provide insights for future educational practices by analyzing experiences, challenges, and successes encountered during the transition to online and distance learning. This review paper aims to provide valuable insights for educators, policymakers, and researchers; lessons learned from the pandemic can inform the development of effective educational policies, teacher training programs, and support systems for students.

1.4. Methodology of Search

Sections 2–15 of the review cover: the impact of the COVID-19 pandemic on education; transitioning from traditional classrooms to online and distance learning; challenges faced by educators during the lockdown period; strategies for effective online teaching and learning; technological solutions and platforms for remote education; promoting student engagement and participation in the virtual classroom; assessments and evaluation methods in online education; supporting student well-being and academic success in the digital environment; professional development for educators in online teaching; addressing equity and accessibility in online and distance learning; lessons learned and best practices for future educational practices; innovations and transformations in education post-pandemic; policy implications and recommendations for effective online education; and ethical considerations in online and distance learning.

16. Innovations and Practical Applications in Post-Pandemic Educational Strategies

  • Impact Analysis Tools: Develop analytical tools to quantify the educational disruptions caused by the pandemic, focusing on metrics like attendance, engagement, and performance shifts due to remote learning.
  • Online Pedagogy Workshops: Create workshops for educators to share and learn innovative online teaching strategies, focusing on interactivity, student-centered learning, and curriculum adaptation for virtual environments.
  • Tech-Integration Frameworks: Develop frameworks for integrating and evaluating the effectiveness of various technological solutions in remote education, including LMS, interactive tools, and AI-based learning supports.
  • Engagement-Boosting Platforms: Create platforms or tools that specifically target student engagement in virtual classrooms, incorporating gamification, interactive content, and real-time feedback mechanisms.
  • Assessment Methodology Guides: Develop guidelines or toolkits for educators to design and implement valid and reliable online assessments aligned with learning outcomes.
  • Well-being Monitoring Systems: Implement systems to monitor and support student well-being in digital learning environments, incorporating mental health resources and social-emotional learning components.
  • Professional Development Portals: Develop online portals offering continuous professional development opportunities for educators, focusing on upskilling in digital pedagogy, content creation, and adaptive learning technologies.
  • Equity and Accessibility Strategies: Formulate and implement strategies to ensure equitable access to online and distance learning, addressing the digital divide through resource distribution, adaptive technologies, and inclusive curriculum design.
  • Best Practices Repository: Create a repository of best practices and lessons learned from the pandemic’s educational challenges, serving as a resource for future educational planning and crisis management.
  • Post-Pandemic Educational Innovation Labs: Establish innovation labs to explore and pilot new teaching and learning practices in the post-pandemic era, emphasizing the integration of traditional and digital pedagogies.

17. Conclusions: Navigating the Path Forward in Online Education


  • Reuge, N.; Jenkins, R.; Brossard, M.; Soobrayan, B.; Mizunoya, S.; Ackers, J.; Jones, L.; Taulo, W.G. Education response to COVID 19 pandemic, a special issue proposed by UNICEF: Editorial review. Int. J. Educ. Dev. 2021 , 87 , 102485. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Wang, C.; Cheng, Z.; Yue, X.-G.; McAleer, M. Risk Management of COVID-19 by Universities in China. J. Risk Financ. Manag. 2020 , 13 , 36. [ Google Scholar ] [ CrossRef ]
  • Bao, W. COVID-19 and Online Teaching in Higher Education: A Case Study of Peking University. Hum. Behav. Emerg. Technol. 2020 , 2 , 113–115. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Hodges, C.B.; Moore, S.; Lockee, B.B.; Trust, T.; Bond, M.A. The Difference between Emergency Remote Teaching and Online Learning ; Educause: Boulder, CO, USA, 2020. [ Google Scholar ]
  • Bashir, A.; Bashir, S.; Rana, K.; Lambert, P.; Vernallis, A. Post-COVID-19 Adaptations; the Shifts towards Online Learning, Hybrid Course Delivery and the Implications for Biosciences Courses in the Higher Education Setting. Front. Educ. 2021 , 6 , 310. [ Google Scholar ] [ CrossRef ]
  • Akpa, V.O.; Akinosi, J.R.; Nwankwere, I.A.; Makinde, G.O.; Ajike, E.O. Strategic Innovation, Digital Dexterity and Service Quality of Selected Quoted Deposit Money Banks in Nigeria. Eur. J. Bus. Innov. Res. 2022 , 10 , 15–35. [ Google Scholar ]
  • Loades, M.E.; Chatburn, E.; Higson-Sweeney, N.; Reynolds, S.; Shafran, R.; Brigden, A.; Linney, C.; McManus, M.N.; Borwick, C.; Crawley, E. Rapid Systematic Review: The Impact of Social Isolation and Loneliness on the Mental Health of Children and Adolescents in the Context of COVID-19. J. Am. Acad. Child Adolesc. Psychiatry 2020 , 59 , 1218–1239.e3. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Van Lancker, W.; Parolin, Z. COVID-19, School Closures, and Child Poverty: A Social Crisis in the Making. Lancet Public Health 2020 , 5 , e243–e244. [ Google Scholar ] [ CrossRef ]
  • Brooks, S.K.; Webster, R.K.; Smith, L.E.; Woodland, L.; Wessely, S.; Greenberg, N.; Rubin, G.J. The Psychological Impact of Quarantine and How to Reduce It: Rapid Review of the Evidence. Lancet 2020 , 395 , 912–920. [ Google Scholar ] [ CrossRef ]
  • Padmanabhanunni, A.; Pretorius, T.B. Teacher Burnout in the Time of COVID-19: Antecedents and Psychological Consequences. Int. J. Environ. Res. Public Health 2023 , 20 , 4204. [ Google Scholar ] [ CrossRef ]
  • Al Lily, A.E.; Ismail, A.F.; Abunasser, F.M.; Alhajhoj Alqahtani, R.H. Distance Education as a Response to Pandemics: Coronavirus and Arab Culture. Technol. Soc. 2020 , 63 , 101317. [ Google Scholar ] [ CrossRef ]
  • Means, B.; Toyama, Y.; Murphy, R.; Bakia, M.; Jones, K. Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies ; Centre for Learning Technology: Hong Kong, China, 2009. [ Google Scholar ]
  • Picciano, A.G. Theories and Frameworks for Online Education: Seeking an Integrated Model. In A Guide to Administering Distance Learning ; Brill: Leiden, The Netherlands, 2021; pp. 79–103. [ Google Scholar ]
  • Burgstahler, S.E.; Cory, R.C. Universal Design in Higher Education: From Principles to Practice ; Harvard Education Press: Cambridge, MA, USA, 2010. [ Google Scholar ]
  • Nicol, D.J.; Macfarlane-Dick, D. Formative Assessment and Self-regulated Learning: A Model and Seven Principles of Good Feedback Practice. Stud. High. Educ. 2006 , 31 , 199–218. [ Google Scholar ] [ CrossRef ]
  • Stodel, E.J.; Thompson, T.L.; MacDonald, C.J. Learners’ Perspectives on What Is Missing from Online Learning: Interpretations through the Community of Inquiry Framework. Int. Rev. Res. Open Distrib. Learn. 2006 , 7 , 1–24. [ Google Scholar ] [ CrossRef ]
  • Richardson, J.C.; Maeda, Y.; Lv, J.; Caskurlu, S. Social Presence in Relation to Students’ Satisfaction and Learning in the Online Environment: A Meta-Analysis. Comput. Hum. Behav. 2017 , 71 , 402–417. [ Google Scholar ] [ CrossRef ]
  • Mayer, R.E. Using Multimedia for E-learning. J. Comput. Assist. Learn. 2017 , 33 , 403–423. [ Google Scholar ] [ CrossRef ]
  • Swan, K. Building Learning Communities in Online Courses: The Importance of Interaction. Educ. Commun. Inf. 2002 , 2 , 23–49. [ Google Scholar ] [ CrossRef ]
  • Sato, S.N.; Condes Moreno, E.; Villanueva, A.R.; Orquera Miranda, P.; Chiarella, P.; Bermudez, G.; Aguilera, J.F.T.; Clemente-Suárez, V.J. Psychological Impacts of Teaching Models on Ibero-American Educators during COVID-19. Behav. Sci. 2023 , 13 , 957. [ Google Scholar ] [ CrossRef ]
  • Dennen, V.P.; Burner, K.J. The Cognitive Apprenticeship Model in Educational Practice. In Handbook of Research on Educational Communications and Technology ; Routledge: Oxfordshire, UK, 2008; pp. 425–439. [ Google Scholar ]
  • Alturki, U.; Aldraiweesh, A. Application of Learning Management System (LMS) during the COVID-19 Pandemic: A Sustainable Acceptance Model of the Expansion Technology Approach. Sustainability 2021 , 13 , 10991. [ Google Scholar ] [ CrossRef ]
  • Sun, A.; Chen, X. Online Education and Its Effective Practice: A Research Review. J. Inf. Technol. Educ. 2016 , 15 , 157–190. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • AIKTC; CITEL. Three Days National Conference on Innovative Teaching & Exuberant Learning (NCiTeL 2021) ; AIKTC: Navi Mumbai, India, 2021. [ Google Scholar ]
  • Coggi, C. Innovare La Didattica e La Valutazione in Università: Il Progetto IRIDI per La Formazione Dei Docenti. In Innovare la Didattica e la Valutazione in Università ; Franco Angeli Edizioni: Milano, Italy, 2019; pp. 1–361. [ Google Scholar ]
  • Hawa, D.M.; Ghoniem, E.; Saad, A.M. Integrating Problem-Based Learning Into Blended Learning To Enhance Students’ Programming Skills. J. Posit. Sch. Psychol. 2022 , 6 , 4479–4497. [ Google Scholar ]
  • Lee, E.; Hannafin, M.J. A Design Framework for Enhancing Engagement in Student-Centered Learning: Own It, Learn It, and Share It. Educ. Technol. Res. Dev. 2016 , 64 , 707–734. [ Google Scholar ] [ CrossRef ]
  • Mayer, R.E. How Multimedia Can Improve Learning and Instruction. In The Cambridge Handbook of Cognition and Education ; Cambridge University Press: Cambridge, UK, 2019. [ Google Scholar ]
  • Dillenbourg, P.; Järvelä, S.; Fischer, F. The Evolution of Research on Computer-Supported Collaborative Learning BT-Technology-Enhanced Learning: Principles and Products ; Balacheff, N., Ludvigsen, S., de Jong, T., Lazonder, A., Barnes, S., Eds.; Springer: Dordrecht, The Netherlands, 2009. [ Google Scholar ]
  • McNair, D.E.; Palloff, R.M.; Pratt, K. Lessons from the Virtual Classroom: The Realities of Online Teaching ; SAGE Publications: Los Angeles, CA, USA, 2015. [ Google Scholar ]
  • Baran, E.; Correia, A.-P. A Professional Development Framework for Online Teaching. TechTrends 2014 , 58 , 95–101. [ Google Scholar ] [ CrossRef ]
  • Gikandi, J.W.; Morrow, D.; Davis, N.E. Online Formative Assessment in Higher Education: A Review of the Literature. Comput. Educ. 2011 , 57 , 2333–2351. [ Google Scholar ] [ CrossRef ]
  • Conole, G. Designing for Learning in an Open World ; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012; Volume 4. [ Google Scholar ]
  • Black, P.; Wiliam, D. The Formative Purpose: Assessment Must First Promote Learning. Yearb. Natl. Soc. Study Educ. 2004 , 103 , 20–50. [ Google Scholar ] [ CrossRef ]
  • Ruiz-Primo, M.A.; Briggs, D.; Iverson, H.; Talbot, R.; Shepard, L.A. Impact of Undergraduate Science Course Innovations on Learning. Science 2011 , 331 , 1269–1270. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Ismail, S.M.; Rahul, D.R.; Patra, I.; Rezvani, E. Formative vs. Summative Assessment: Impacts on Academic Motivation, Attitude toward Learning, Test Anxiety, and Self-Regulation Skill. Lang. Test. Asia 2022 , 12 , 40. [ Google Scholar ] [ CrossRef ]
  • Nitko, A.J. Educational Assessment of Students ; ERIC: Washington, DC, USA, 1996. [ Google Scholar ]
  • Cherner, T.; Halpin, P. Determining the Educational Value of Virtual Reality Apps Using Content Analysis. J. Interact. Learn. Res. 2021 , 32 , 245–280. [ Google Scholar ]
  • Pirker, B.; Smolka, J. International Law and Linguistics: Pieces of an Interdisciplinary Puzzle. J. Int. Disput. Settl. 2020 , 11 , 501–521. [ Google Scholar ] [ CrossRef ]
  • Panda, S. Analyzing Effectiveness of Learning Management System in Present Scenario: Conceptual Background and Practical Implementation. Int. J. Innov. Res. Adv. Stud. 2020 , 7 , 40–50. [ Google Scholar ]
  • Coghlan, S.; Miller, T.; Paterson, J. Good Proctor or “Big Brother”? Ethics of Online Exam Supervision Technologies. Philos. Technol. 2021 , 34 , 1581–1606. [ Google Scholar ] [ CrossRef ]
  • Shute, V.J.; Rahimi, S. Review of Computer-based Assessment for Learning in Elementary and Secondary Education. J. Comput. Assist. Learn. 2017 , 33 , 1–19. [ Google Scholar ] [ CrossRef ]
  • Landers, R.N.; Callan, R.C. Casual Social Games as Serious Games: The Psychology of Gamification in Undergraduate Education and Employee Training. In Serious Games and Edutainment Applications ; Springer: London, UK, 2011; pp. 399–423. [ Google Scholar ]
  • Hattie, J.; Timperley, H. The Power of Feedback. Rev. Educ. Res. 2007 , 77 , 81–112. [ Google Scholar ] [ CrossRef ]
  • Schraw, G.; Crippen, K.J.; Hartley, K. Promoting Self-Regulation in Science Education: Metacognition as Part of a Broader Perspective on Learning. Res. Sci. Educ. 2006 , 36 , 111–139. [ Google Scholar ] [ CrossRef ]
  • Przybylski, A.K.; Murayama, K.; DeHaan, C.R.; Gladwell, V. Motivational, Emotional, and Behavioral Correlates of Fear of Missing Out. Comput. Hum. Behav. 2013 , 29 , 1841–1848. [ Google Scholar ] [ CrossRef ]
  • Bereznowski, P.; Atroszko, P.A.; Konarski, R. Work Addiction, Work Engagement, Job Burnout, and Perceived Stress: A Network Analysis. Front. Psychol. 2023 , 14 , 1130069. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Brock, A. Mitigating Burnout and Promoting Professional Well-Being in Advisors. In Academic Advising Administration ; Routledge: Oxfordshire, UK, 2023; pp. 332–344. [ Google Scholar ]
  • Rovai, A.P.; Barnum, K.T. On-Line Course Effectiveness: An Analysis of Student Interactions and Perceptions of Learning. Int. J. E-Learn. Distance Educ. Int. Du E-Learn. La Form. Distance 2003 , 18 , 57–73. [ Google Scholar ]
  • Topping, K.J. Trends in Peer Learning. Educ. Psychol. 2005 , 25 , 631–645. [ Google Scholar ] [ CrossRef ]
  • Shea, P.; Li, C.S.; Pickett, A. A Study of Teaching Presence and Student Sense of Learning Community in Fully Online and Web-Enhanced College Courses. Internet High. Educ. 2006 , 9 , 175–190. [ Google Scholar ] [ CrossRef ]
  • Wiggins, G. Educative Assessment. Designing Assessments To Inform and Improve Student Performance ; ERIC: Washington, DC, USA, 1998. [ Google Scholar ]
  • Yukselturk, E.; Bulut, S. Predictors for Student Success in an Online Course. J. Educ. Technol. Soc. 2007 , 10 , 71–83. [ Google Scholar ]
  • Deci, E.L.; Ryan, R.M. The “What” and “Why” of Goal Pursuits: Human Needs and the Self-Determination of Behavior. Psychol. Inq. 2000 , 11 , 227–268. [ Google Scholar ] [ CrossRef ]
  • Locke, E.A.; Latham, G.P. A Theory of Goal Setting & Task Performance ; Prentice-Hall, Inc.: Hoboken, NJ, USA, 1990. [ Google Scholar ]
  • Salmon, G. E-Tivities: The Key to Active Online Learning ; Routledge: Oxfordshire, UK, 2013. [ Google Scholar ]
  • Koehler, M.; Mishra, P. What Is Technological Pedagogical Content Knowledge (TPACK)? Contemp. Issues Technol. Teach. Educ. 2009 , 9 , 60–70. [ Google Scholar ] [ CrossRef ]
  • Petretto, D.R.; Carta, S.M.; Cataudella, S.; Masala, I.; Mascia, M.L.; Penna, M.P.; Piras, P.; Pistis, I.; Masala, C. The Use of Distance Learning and E-Learning in Students with Learning Disabilities: A Review on the Effects and Some Hint of Analysis on the Use during COVID-19 Outbreak. Clin. Pract. Epidemiol. Ment. Health 2021 , 17 , 92–102. [ Google Scholar ] [ CrossRef ]
  • Lodder, J.; Heeren, B.; Jeuring, J. A Comparison of Elaborated and Restricted Feedback in LogEx, a Tool for Teaching Rewriting Logical Formulae. J. Comput. Assist. Learn. 2019 , 35 , 620–632. [ Google Scholar ] [ CrossRef ]
  • Kocdar, S.; Bozkurt, A. Supporting Learners with Special Needs in Open, Distance, and Digital Education. In Handbook of Open, Distance and Digital Education ; Springer: Singapore, 2022; pp. 1–16. [ Google Scholar ]
  • Tsai, Y.-S.; Rates, D.; Moreno-Marcos, P.M.; Muñoz-Merino, P.J.; Jivet, I.; Scheffel, M.; Drachsler, H.; Kloos, C.D.; Gašević, D. Learning Analytics in European Higher Education—Trends and Barriers. Comput. Educ. 2020 , 155 , 103933. [ Google Scholar ] [ CrossRef ]
  • Ladson-Billings, G. Culturally Relevant Pedagogy 2.0: Aka the Remix. Harv. Educ. Rev. 2014 , 84 , 74–84. [ Google Scholar ] [ CrossRef ]
  • Means, B.; Bakia, M.; Murphy, R. Learning Online: What Research Tells Us about Whether, When and How ; Routledge: Oxfordshire, UK, 2014. [ Google Scholar ]
  • Gray, J.A.; DiLoreto, M. The Effects of Student Engagement, Student Satisfaction, and Perceived Learning in Online Learning Environments. Int. J. Educ. Leadersh. Prep. 2016 , 11 , n1. [ Google Scholar ]
  • Garrison, D.R.; Cleveland-Innes, M. Facilitating Cognitive Presence in Online Learning: Interaction Is Not Enough. Am. J. Distance Educ. 2005 , 19 , 133–148. [ Google Scholar ] [ CrossRef ]
  • Darling-Hammond, L.; Hyler, M.E.; Gardner, M. Effective Teacher Professional Development ; Learning Policy Institute: Palo Alto, CA, USA, 2017. [ Google Scholar ]
  • Garrison, D.R.; Vaughan, N.D. Blended Learning in Higher Education: Framework, Principles, and Guidelines ; John Wiley & Sons: Hoboken, NJ, USA, 2008. [ Google Scholar ]
  • Vygotsky, L.S.; Cole, M. Mind in Society: Development of Higher Psychological Processes ; Harvard University Press: Cambridge, MA, USA, 1978. [ Google Scholar ]
  • Burlacu, M.; Coman, C.; Bularca, M.C. Blogged into the System: A Systematic Review of the Gamification in e-Learning before and during the COVID-19 Pandemic. Sustainability 2023 , 15 , 6476. [ Google Scholar ] [ CrossRef ]
  • Yamani, H.A. A Conceptual Framework for Integrating Gamification in eLearning Systems Based on Instructional Design Model. Int. J. Emerg. Technol. Learn. 2021 , 16 , 14. [ Google Scholar ] [ CrossRef ]
  • Siemens, G.; Baker, R.S.J.d. Learning Analytics and Educational Data Mining: Towards Communication and Collaboration. In Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, Vancouver, BC, Canada, 29 April–2 May 2012; pp. 252–254. [ Google Scholar ]
  • Li, L. Reskilling and Upskilling the Future-Ready Workforce for Industry 4.0 and Beyond. Inf. Syst. Front. 2022 . [ Google Scholar ] [ CrossRef ]
  • Lythreatis, S.; Singh, S.K.; El-Kassar, A.-N. The Digital Divide: A Review and Future Research Agenda. Technol. Forecast. Soc. Change 2022 , 175 , 121359. [ Google Scholar ] [ CrossRef ]
  • Tawfik, A.A.; Shepherd, C.E.; Gatewood, J.; Gish-Lieberman, J.J. First and Second Order Barriers to Teaching in K-12 Online Learning. TechTrends 2021 , 65 , 925–938. [ Google Scholar ] [ CrossRef ]
  • Muñoz, F.; Matus, O.; Pérez, C.; Fasce, E. Blended Learning y El Desarrollo de La Comunicación Científica En Un Programa de Especialización Dental. Investig. En Educ. Médica 2017 , 6 , 180–189. [ Google Scholar ] [ CrossRef ]
  • Barbour, M.K. Introducing a Special Collection of Papers on K-12 Online Learning and Continuity of Instruction after Emergency Remote Teaching. TechTrends 2022 , 66 , 298–300. [ Google Scholar ] [ CrossRef ]
  • Khalil, M.; Slade, S.; Prinsloo, P. Learning Analytics in Support of Inclusiveness and Disabled Students: A Systematic Review. J. Comput. High. Educ. 2023 , 1–18. [ Google Scholar ] [ CrossRef ]
  • Prinsloo, P.; Slade, S. Student Privacy Self-Management: Implications for Learning Analytics. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge, Poughkeepsie, NY, USA, 16–20 March 2015; pp. 83–92. [ Google Scholar ]
  • Watson, G.R.; Sottile, J. Cheating in the Digital Age: Do Students Cheat More in Online Courses? Online J. Distance Learn. Adm. 2010 , 13 , 798–803. [ Google Scholar ]
  • Bhattacharya, S.; Murthy, V.; Bhattacharya, S. The Social and Ethical Issues of Online Learning during the Pandemic and Beyond. Asian J. Bus. Ethics 2022 , 11 , 275–293. [ Google Scholar ] [ CrossRef ]
  • Yadav, K.K.; Reddy, L.J. Psychological effects of technology on college students. J. Clin. Otorhinolaryngol. Head Neck Surg. 2023 , 27 , 1805–1816. [ Google Scholar ]
  • Martin, F.; Bolliger, D.U. Engagement Matters: Student Perceptions on the Importance of Engagement Strategies in the Online Learning Environment. Online Learn. 2018 , 22 , 205–222. [ Google Scholar ] [ CrossRef ]
  • Clemente-Suárez, V.J.; Dalamitros, A.A.; Beltran-Velasco, A.I.; Mielgo-Ayuso, J.; Tornero-Aguilera, J.F. Social and psychophysiological consequences of the COVID-19 pandemic: An extensive literature review. Front. Psychol. 2020 , 11 , 3077. [ Google Scholar ] [ CrossRef ]
  • Clemente-Suárez, V.J.; Navarro-Jiménez, E.; Jimenez, M.; Hormeño-Holgado, A.; Martinez-Gonzalez, M.B.; Benitez-Agudelo, J.C.; Perez-Palencia, N.; Laborde-Cárdenas, C.C.; Tornero-Aguilera, J.F. Impact of COVID-19 Pandemic in Public Mental Health: An Extensive Narrative Review. Sustainability 2021 , 13 , 3221. [ Google Scholar ] [ CrossRef ]
  • Clemente-Suárez, V.J.; Navarro-Jiménez, E.; Moreno-Luna, L.; Saavedra-Serrano, M.C.; Jimenez, M.; Simón, J.A.; Tornero-Aguilera, J.F. The Impact of the COVID-19 Pandemic on Social, Health, and Economy. Sustainability 2021 , 13 , 6314. [ Google Scholar ] [ CrossRef ]
  • Rodriguez-Besteiro, S.; Beltran-Velasco, A.I.; Tornero-Aguilera, J.F.; Martínez-González, M.B.; Navarro-Jiménez, E.; Yáñez-Sepúlveda, R.; Clemente-Suárez, V.J. Social Media, Anxiety and COVID-19 Lockdown Measurement Compliance. Int. J. Environ. Res. Public Health 2023 , 20 , 4416. [ Google Scholar ] [ CrossRef ]
  • Clemente-Suárez, V.J.; Navarro-Jiménez, E.; Simón-Sanjurjo, J.A.; Beltran-Velasco, A.I.; Laborde-Cárdenas, C.C.; Benitez-Agudelo, J.C.; Bustamante-Sánchez, Á.; Tornero-Aguilera, J.F. Mis–Dis Information in COVID-19 Health Crisis: A Narrative Review. Int. J. Environ. Res. Public Health 2022 , 19 , 5321. [ Google Scholar ] [ CrossRef ]
  • Clemente-Suárez, V.J.; Navarro-Jiménez, E.; Ruisoto, P.; Dalamitros, A.A.; Beltran-Velasco, A.I.; Hormeño-Holgado, A.; Laborde-Cárdenas, C.C.; Tornero-Aguilera, J.F. Performance of Fuzzy Multi-Criteria Decision Analysis of Emergency System in COVID-19 Pandemic. An Extensive Narrative Review. Int. J. Environ. Res. Public Health 2021 , 18 , 5208. [ Google Scholar ] [ CrossRef ]
  • Sato, S.N.; Condes Moreno, E.; Rico Villanueva, A.; Orquera Miranda, P.; Chiarella, P.; Tornero-Aguilera, J.F.; Clemente-Suárez, V.J. Cultural Differences between University Students in Online Learning Quality and Psychological Profile during COVID-19. J. Risk Financ. Manag. 2022 , 15 , 555. [ Google Scholar ] [ CrossRef ]
  • Nomie-Sato, S.; Condes Moreno, E.; Villanueva, A.R.; Chiarella, P.; Tornero-Aguilera, J.F.; Beltrán-Velasco, A.I.; Clemente-Suárez, V.J. Gender Differences of University Students in the Online Teaching Quality and Psychological Profile during the COVID-19 Pandemic. Int. J. Environ. Res. Public Health 2022 , 19 , 14729. [ Google Scholar ] [ CrossRef ]
  • Williamson, B.; Macgilchrist, F.; Potter, J. COVID-19 controversies and critical research in digital education. Learn. Media Technol. 2021 , 46 , 117–127. [ Google Scholar ] [ CrossRef ]
  • Brammer, S.; Clark, T. COVID-19 and management education: Reflections on challenges, opportunities, and potential futures. Br. J. Manag. 2020 , 31 , 453. [ Google Scholar ] [ CrossRef ]
  • Peytcheva-Forsyth, R.V.; Aleksieva, L.K. The effect of the teachers’ experience in online education during the pandemic on their views of strengths and weaknesses of e-learning (SU case). In Proceedings of the 22nd International Conference on Computer Systems and Technologies, Ruse, Bulgaria, 18–19 June 2021; pp. 1–11. [ Google Scholar ]
  • Nguyen, T.; Netto, C.L.M.; Wilkins, J.F.; Bröker, P.; Vargas, E.E.; Sealfon, C.D.; Puthipiroj, P.; Li, K.S.; Bowler, J.E.; Hinson, H.R.; et al. Insights into students’ experiences and perceptions of remote learning methods: From the COVID-19 pandemic to best practice for the future. Front. Educ. 2021 , 6 , 91. [ Google Scholar ] [ CrossRef ]
  • Goudeau, S.; Sanrey, C.; Stanczak, A.; Manstead, A.; Darnon, C. Why lockdown and distance learning during the COVID-19 pandemic are likely to increase the social class achievement gap. Nat. Hum. Behav. 2021 , 5 , 1273–1281. [ Google Scholar ] [ CrossRef ]
  • Zhang, J.; Ding, Y.; Yang, X.; Zhong, J.; Qiu, X.; Zou, Z.; Xu, Y.; Jin, X.; Wu, X.; Huang, J.; et al. COVID-19′s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis. PLoS ONE 2022 , 17 , e0273016. [ Google Scholar ] [ CrossRef ]
  • Munoz-Najar, A.; Gilberto, A.; Hasan, A.; Cobo, C.; Azevedo, J.P.; Akmal, M. Remote Learning during COVID-19: Lessons from Today, Principles for Tomorrow ; World Bank: Washington, DC, USA, 2021. [ Google Scholar ]

Citation: Sato, S.N.; Condes Moreno, E.; Rubio-Zarapuz, A.; Dalamitros, A.A.; Yañez-Sepulveda, R.; Tornero-Aguilera, J.F.; Clemente-Suárez, V.J. Navigating the New Normal: Adapting Online and Distance Learning in the Post-Pandemic Era. Educ. Sci. 2024, 14, 19. https://doi.org/10.3390/educsci14010019


Research Article

COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis

Junyi Zhang, Yigang Ding, Xinru Yang, Jinping Zhong, XinXin Qiu, Zhishan Zou, Yujie Xu, Xiunan Jin, Xiaomin Wu, et al.

JZ and YD contributed equally to this work as first authors.

Affiliations: School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China; Hangzhou Zhongce Vocational School Qiantang, Hangzhou, Zhejiang, China; Faculty of Education, Shenzhen University, Shenzhen, Guangdong, China.

* E-mail: [email protected] (JH); [email protected] (YZ)

  • Published: August 23, 2022
  • https://doi.org/10.1371/journal.pone.0273016

The COVID-19 outbreak brought online learning to the forefront of education. Scholars have conducted many studies on online learning during the pandemic, but only a few have performed quantitative comparative analyses of students’ online learning behavior before and after the outbreak. We collected review data from China’s massive open online course platform called icourse.163 and performed social network analysis on 15 courses to explore courses’ interaction characteristics before, during, and after the COVID-19 pandemic. Specifically, we focused on the following aspects: (1) variations in the scale of online learning amid COVID-19; (2a) the characteristics of online learning interaction during the pandemic; (2b) the characteristics of online learning interaction after the pandemic; and (3) differences in the interaction characteristics of social science courses and natural science courses. Results revealed that only a small number of courses witnessed an uptick in online interaction, suggesting that the pandemic’s role in promoting the scale of courses was not significant. During the pandemic, online learning interaction became more frequent among course network members whose interaction scale increased. After the pandemic, although the scale of interaction declined, online learning interaction became more effective. The scale and level of interaction in Electrodynamics (a natural science course) and Economics (a social science course) both rose during the pandemic. However, long after the pandemic, the Economics course sustained online interaction whereas interaction in the Electrodynamics course steadily declined. This discrepancy could be due to the unique characteristics of natural science courses and social science courses.

Citation: Zhang J, Ding Y, Yang X, Zhong J, Qiu X, Zou Z, et al. (2022) COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis. PLoS ONE 17(8): e0273016. https://doi.org/10.1371/journal.pone.0273016

Editor: Heng Luo, Central China Normal University, CHINA

Received: April 20, 2022; Accepted: July 29, 2022; Published: August 23, 2022

Copyright: © 2022 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data underlying the results presented in the study were downloaded from https://www.icourse163.org/ and are now shared fully on Github ( https://github.com/zjyzhangjunyi/dataset-from-icourse163-for-SNA ). These data have no private information and can be used for academic research free of charge.

Funding: The author(s) received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

1. Introduction

The development of the mobile internet has spurred rapid advances in online learning, offering novel prospects for teaching and learning and a learning experience completely different from traditional instruction. Online learning harnesses the advantages of network technology and multimedia technology to transcend the boundaries of conventional education [ 1 ]. Online courses have become a popular learning mode owing to their flexibility and openness. During online learning, teachers and students are in different physical locations but interact in multiple ways (e.g., via online forum discussions and asynchronous group discussions). An analysis of online learning therefore calls for attention to students’ participation. Alqurashi [ 2 ] defined interaction in online learning as the process of constructing meaningful information and thought exchanges between more than two people; such interaction typically occurs between teachers and learners, learners and learners, and the course content and learners.

Massive open online courses (MOOCs), a 21st-century teaching mode, have greatly influenced global education. Data released by China’s Ministry of Education in 2020 show that the country ranks first globally in the number and scale of higher education MOOCs. The COVID-19 outbreak has further propelled this learning mode, with universities being urged to leverage MOOCs and other online resource platforms to respond to the government’s “School’s Out, But Class’s On” policy [ 3 ]. Besides MOOCs, to reduce in-person gatherings and curb the spread of COVID-19, various online learning methods have since become ubiquitous [ 4 ]. Though Lederman asserted that the COVID-19 outbreak has positioned online learning technologies as the best way for teachers and students to obtain satisfactory learning experiences [ 5 ], it remains unclear whether the COVID-19 pandemic has encouraged interaction in online learning, as interactions between students and others play key roles in academic performance and largely determine the quality of learning experiences [ 6 ]. Similarly, it is also unclear what impact the COVID-19 pandemic has had on the scale of online learning.

Social constructivism paints learning as a social phenomenon. As such, analyzing the social structures or patterns that emerge during the learning process can shed light on learning-based interaction [ 7 ]. Social network analysis helps to explain how a social network, rooted in interactions between learners and their peers, guides individuals’ behavior, emotions, and outcomes. This analytical approach is especially useful for evaluating interactive relationships between network members [ 8 ]. Mohammed cited social network analysis (SNA) as a method that can provide timely information about students, learning communities and interactive networks. SNA has been applied in numerous fields, including education, to identify the number and characteristics of interelement relationships. For example, Lee et al. also used SNA to explore the effects of blogs on peer relationships [ 7 ]. Therefore, adopting SNA to examine interactions in online learning communities during the COVID-19 pandemic can uncover potential issues with this online learning model.

Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, focusing on learners’ interaction characteristics before, during, and after the COVID-19 outbreak. We visually assessed changes in the scale of network interaction before, during, and after the outbreak along with the characteristics of interaction in Gephi. Examining students’ interactions in different courses revealed distinct interactive network characteristics, the pandemic’s impact on online courses, and relevant suggestions. Findings are expected to promote effective interaction and deep learning among students in addition to serving as a reference for the development of other online learning communities.

2. Literature review and research questions

Interaction is deemed central to the educational experience and is a major focus of research on online learning. Moore began to study the problem of interaction in distance education as early as 1989. He defined three core types of interaction: student–teacher, student–content, and student–student [ 9 ]. Lear et al. [ 10 ] described an interactivity/community-process model of distance education: they specifically discussed the relationships between interactivity, community awareness, and engaging learners, and found interactivity and community awareness to be correlated with learner engagement. Zulfikar et al. [ 11 ] suggested that discussions initiated by students encourage more student engagement than discussions initiated by instructors. It is most important to afford learners opportunities to interact purposefully with teachers, and improving the quality of learner interaction is crucial to fostering profound learning [ 12 ]. Interaction is an important way for learners to communicate and share information, and a key factor in the quality of online learning [ 13 ].

Timely feedback is the main component of online learning interaction. Woo and Reeves discovered that students often become frustrated when they fail to receive prompt feedback [ 14 ]. Shelley et al. conducted a three-year study of graduate and undergraduate students’ satisfaction with online learning at universities and found that interaction with educators and students is the main factor affecting satisfaction [ 15 ]. Teachers therefore need to provide students with scoring justification, support, and constructive criticism during online learning. Some researchers have examined online learning during the COVID-19 pandemic. They found that most students preferred face-to-face learning over online learning due to obstacles faced online, such as a lack of motivation, limited teacher–student interaction, and a sense of isolation when learning in different times and spaces [ 16 , 17 ]. However, these obstacles can be reduced by enhancing online interaction between teachers and students [ 18 ].

Research showed that interactions contributed to maintaining students’ motivation to continue learning [ 19 ]. Baber argued that interaction played a key role in students’ academic performance and influenced the quality of the online learning experience [ 20 ]. Hodges et al. maintained that well-designed online instruction can lead to unique teaching experiences [ 21 ]. Banna et al. mentioned that using discussion boards, chat sessions, blogs, wikis, and other tools could promote student interaction and improve participation in online courses [ 22 ]. During the COVID-19 pandemic, Mahmood proposed a series of teaching strategies suitable for distance learning to improve its effectiveness [ 23 ]. Lapitan et al. devised an online strategy to ease the transition from traditional face-to-face instruction to online learning [ 24 ]. The preceding discussion suggests that online learning goes beyond simply providing learning resources; teachers should ideally design real-life activities to give learners more opportunities to participate.

As mentioned, COVID-19 has driven many scholars to explore the online learning environment. However, most have ignored the uniqueness of online learning during this time and have rarely compared pre- and post-pandemic online learning interaction. Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, centering on student interaction before and after the pandemic. Gephi was used to visually analyze changes in the scale and characteristics of network interaction. The following questions were of particular interest:

  • (1) Can the COVID-19 pandemic promote the expansion of online learning?
  • (2a) What are the characteristics of online learning interaction during the pandemic?
  • (2b) What are the characteristics of online learning interaction after the pandemic?
  • (3) How do interaction characteristics differ between social science courses and natural science courses?

3. Methodology

3.1 Research context

We selected several courses with a large number of participants and extensive online interaction among hundreds of courses on the icourse.163 MOOC platform. These courses had been offered on the platform for at least three semesters, covering three periods (i.e., before, during, and after the COVID-19 outbreak). To eliminate the effects of shifts in irrelevant variables (e.g., course teaching activities), we chose several courses with similar teaching activities and compared them on multiple dimensions. All course content was taught online. The teachers of each course posted discussion threads related to learning topics; students were expected to reply via comments. Learners could exchange ideas freely in their responses in addition to asking questions and sharing their learning experiences. Teachers could answer students’ questions as well. Conversations in the comment area could partly compensate for a relative absence of online classroom interaction. Teacher–student interaction is conducive to the formation of a social network structure and enabled us to examine teachers’ and students’ learning behavior through SNA. The comment areas in these courses were intended for learners to construct knowledge via reciprocal communication. Meanwhile, by answering students’ questions, teachers could encourage them to reflect on their learning progress. These courses’ successive terms also spanned several phases of COVID-19, allowing us to ascertain the pandemic’s impact on online learning.

3.2 Data collection and preprocessing

To avoid interference from invalid or unclear data, the following criteria were applied to select representative courses: (1) generality (i.e., public courses and professional courses were chosen from different schools across China); (2) time validity (i.e., courses were held before, during, and after the pandemic); and (3) notability (i.e., each course had at least 2,000 participants). We ultimately chose 15 courses across the social sciences and natural sciences (see Table 1 ). Course codes are used in place of course names.

[Table 1: https://doi.org/10.1371/journal.pone.0273016.t001]

To discern courses’ evolution during the pandemic, we gathered data on three terms before, during, and after the COVID-19 outbreak in addition to obtaining data from two terms completed well before the pandemic and long after. Our final dataset comprised five sets of interactive data. Finally, we collected about 120,000 comments for SNA. Because each course had a different start time, in line with fluctuations in the number of confirmed COVID-19 cases in China and the opening dates of most colleges and universities, we divided our sample into five phases: well before the pandemic (Phase I); before the pandemic (Phase II); during the pandemic (Phase III); after the pandemic (Phase IV); and long after the pandemic (Phase V). We sought to preserve consistent time spans to balance the amount of data in each period ( Fig 1 ).

[Fig 1: https://doi.org/10.1371/journal.pone.0273016.g001]
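To make the phase split concrete, here is a minimal sketch of how comments could be bucketed into the five phases, assuming the roughly 120,000 comments are available as a CSV with `course_id` and `created_at` columns; the phase boundary dates below are placeholders, since the paper aligns phase windows with each course's own term dates rather than fixed calendar dates.

```python
import pandas as pd

# Placeholder phase windows for one hypothetical course; the paper aligns
# these with each course's actual term dates, which are not reproduced here.
PHASES = [
    ("Phase I (well before)", "2018-09-01", "2019-01-31"),
    ("Phase II (before)",     "2019-09-01", "2020-01-10"),
    ("Phase III (during)",    "2020-02-01", "2020-07-31"),
    ("Phase IV (after)",      "2020-09-01", "2021-01-31"),
    ("Phase V (long after)",  "2021-02-01", "2021-07-31"),
]

def label_phase(ts: pd.Timestamp):
    """Return the phase whose window contains the comment timestamp, else None."""
    for name, start, end in PHASES:
        if pd.Timestamp(start) <= ts <= pd.Timestamp(end):
            return name
    return None

comments = pd.read_csv("icourse163_comments.csv", parse_dates=["created_at"])
comments["phase"] = comments["created_at"].map(label_phase)
print(comments.groupby(["course_id", "phase"]).size())  # comment volume per phase
```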

3.3 Instrumentation

Participants’ comments and “thumbs-up” behavior data were converted into a network structure and compared using social network analysis (SNA). Network analysis, according to M’Chirgui, is an effective tool for clarifying network relationships by employing sophisticated techniques [ 25 ]. Specifically, SNA can help explain the underlying relationships among team members and provide a better understanding of their internal processes. Yang and Tang used SNA to discuss the relationship between team structure and team performance [ 26 ]. Golbeck argued that SNA could improve the understanding of students’ learning processes and reveal learners’ and teachers’ role dynamics [ 27 ].
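As an illustration of turning comment and reply data into a network structure, the sketch below builds a directed, weighted graph with networkx; the `author` and `reply_to` column names are assumptions for illustration rather than the platform's actual export schema, and "thumbs-up" edges could be added in the same way.

```python
import networkx as nx
import pandas as pd

# Assumed export format: one row per comment, with an 'author' column and a
# 'reply_to' column naming the member being responded to.
replies = (
    pd.read_csv("icourse163_comments.csv")
      .dropna(subset=["reply_to"])   # keep only rows that respond to someone
)

G = nx.DiGraph()
for (src, dst), group in replies.groupby(["author", "reply_to"]):
    # One directed edge per interacting pair; weight = number of interactions.
    G.add_edge(src, dst, weight=len(group))

print(f"{G.number_of_nodes()} members, {G.number_of_edges()} interaction ties")
```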

To analyze Question (1), the number of nodes and the diameter of the generated network were deemed indicators of changes in network size. Social networks are typically represented as graphs with nodes and degrees, and node count indicates the sample size [ 15 ]. Wellman et al. proposed that the larger the network scale, the greater the number of network members providing emotional support, goods, services, and companionship [ 28 ]. Jan’s study measured network size by counting the nodes, which represented students, lecturers, and tutors [ 29 ]. Similarly, network nodes in the present study indicated how many learners and teachers participated in the course, with more nodes indicating more participants. Furthermore, we investigated the network diameter, a structural feature of social networks and a common metric for measuring network size in SNA [ 30 ]. The network diameter refers to the longest shortest path between any two nodes in the network. There has been evidence that a larger network diameter leads to greater spread of behavior [ 31 ]. Likewise, Gašević et al. found that larger networks were more likely to spread innovative ideas about educational technology when analyzing MOOC-related research citations [ 32 ]. Therefore, we employed node count and network diameter to measure the network’s spatial size and further explore the expansion characteristics of online courses. A brief introduction to these indicators is provided in Table 2.

[Table 2: https://doi.org/10.1371/journal.pone.0273016.t002]
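A rough networkx equivalent of these two size indicators is sketched below; computing the diameter on the largest connected component is our assumption, since the diameter is undefined for a disconnected graph and the paper does not state how such cases were handled.

```python
import networkx as nx

def network_size(G: nx.DiGraph):
    """Return (node count, diameter) for a course interaction network."""
    U = G.to_undirected()
    # The diameter is only defined on a connected graph, so use the largest component.
    giant = U.subgraph(max(nx.connected_components(U), key=len))
    return U.number_of_nodes(), nx.diameter(giant)
```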

To address Question (2), a set of interactive analysis metrics in SNA was introduced to scrutinize learners’ interaction characteristics in online learning during and after the pandemic, as shown below:

  • (1) The average degree reflects the density of the network by calculating the average number of connections for each node. As Rong and Xu suggested, the average degree of a network indicates how active its participants are [ 33 ]. According to Hu, a higher average degree implies that more students are interacting directly with each other in a learning context [ 34 ]. The present study inherited the concept of the average degree from these previous studies: the higher the average degree, the more frequent the interaction between individuals in the network.
  • (2) Essentially, a weighted average degree in a network is calculated by multiplying each degree by its respective weight, and then taking the average. Bydžovská took the strength of the relationship into account when determining the weighted average degree [ 35 ]. By calculating friendship’s weighted value, Maroulis assessed peer achievement within a small-school reform [ 36 ]. Accordingly, we considered the number of interactions as the weight of the degree, with a higher average degree indicating more active interaction among learners.
  • (3) Network density is the ratio between actual connections and potential connections in a network. The more connections group members have with each other, the higher the network density. In SNA, network density is similar to group cohesion, i.e., a network of more strong relationships is more cohesive [ 37 ]. Network density also reflects how much all members are connected together [ 38 ]. Therefore, we adopted network density to indicate the closeness among network members. Higher network density indicates more frequent interaction and closer communication among students.
  • (4) Clustering coefficient describes local network attributes and indicates that two nodes in the network could be connected through adjacent nodes. The clustering coefficient measures users’ tendency to gather (cluster) with others in the network: the higher the clustering coefficient, the more frequently users communicate with other group members. We regarded this indicator as a reflection of the cohesiveness of the group [ 39 ].
  • (5) In a network, the average path length is the average number of steps along the shortest paths between any two nodes. Oliveres has observed that when an average path length is small, the route from one node to another is shorter when graphed [ 40 ]. This is especially true in educational settings where students tend to become closer friends. So we consider that the smaller the average path length, the greater the possibility of interaction between individuals in the network.
  • (6) A network with a large number of nodes but a surprisingly small average path length is said to exhibit the small-world effect [ 41 ]. A higher clustering coefficient and a shorter average path length are important indicators of a small-world network: a shorter average path length enables the network to spread information faster and more accurately, while a higher clustering coefficient can promote frequent knowledge exchange within the group and boost the timeliness and accuracy of knowledge dissemination [ 42 ]. A brief introduction to these indicators is provided in Table 3; a computational sketch follows the table.

[Table 3: https://doi.org/10.1371/journal.pone.0273016.t003]
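The paper reports these indicators from Gephi; the sketch below shows roughly equivalent calculations in networkx, under the assumption that edge weights store interaction counts. The small-world check uses networkx's `sigma` statistic, which compares clustering and path length against random reference graphs and can be slow on large networks, so the sampling parameters are kept deliberately small.

```python
import networkx as nx

def interaction_metrics(G: nx.DiGraph) -> dict:
    """Approximate the Table 3 indicators for one course interaction network."""
    n = G.number_of_nodes()
    U = nx.Graph(G)  # undirected view for clustering / path-length metrics
    giant = U.subgraph(max(nx.connected_components(U), key=len))
    return {
        "average_degree": sum(d for _, d in G.degree()) / n,
        "weighted_average_degree": sum(d for _, d in G.degree(weight="weight")) / n,
        "network_density": nx.density(G),
        "clustering_coefficient": nx.average_clustering(U),
        "average_path_length": nx.average_shortest_path_length(giant),
        # sigma > 1 is commonly read as small-world behaviour; small niter/nrand
        # keep the random-reference comparison cheap for a sketch.
        "small_world_sigma": nx.sigma(giant, niter=2, nrand=3),
    }
```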

To analyze Question (3), we used the concept of closeness centrality, which determines how close a vertex is to the others in the network. As Opsahl et al. explained, closeness centrality reveals how closely actors are coupled with their entire social network [ 43 ]. In order to analyze social network-based engineering education, Putnik et al. examined closeness centrality and found that it was significantly correlated with grades [ 38 ]. We used closeness centrality to measure the position of an individual in the network. A brief introduction to this indicator is provided in Table 4.

[Table 4: https://doi.org/10.1371/journal.pone.0273016.t004]
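Closeness centrality can be computed per member directly in networkx (Gephi reports the same statistic); the helper below, whose name is ours, simply ranks members by it.

```python
import networkx as nx

def top_by_closeness(G: nx.DiGraph, k: int = 10):
    """Rank course members by closeness centrality; higher values mean a member
    can reach the rest of the network in fewer steps."""
    scores = nx.closeness_centrality(G)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]
```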

3.4 Ethics statement

This study was approved by the Academic Committee Office (ACO) of South China Normal University ( http://fzghb.scnu.edu.cn/ ), Guangzhou, China. Research data were collected from the open platform and analyzed anonymously. There are thus no privacy issues involved in this study.

4.1 COVID-19’s role in promoting the scale of online courses was not as important as expected

As shown in Fig 2, the number of course participants and nodes is closely correlated with the pandemic’s trajectory. Because the number of participants in each course varied widely, we normalized the number of participants and nodes to more conveniently visualize course trends. Fig 2 depicts changes in the chosen courses’ number of participants and nodes before the pandemic (Phase II), during the pandemic (Phase III), and after the pandemic (Phase IV). The number of participants in most courses during the pandemic exceeded those before and after the pandemic. However, the number of people participating in interaction did not increase in some courses.

[Fig 2: https://doi.org/10.1371/journal.pone.0273016.g002]
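The paper does not specify which normalization it applied before plotting Fig 2; per-course min-max scaling is one plausible choice and is sketched below for illustration.

```python
def min_max(values):
    """Scale a course's per-phase participant or node counts to [0, 1] so that
    courses of very different sizes share one axis (as in Fig 2)."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

# Example: hypothetical participant counts for one course across Phases II-IV.
print(min_max([2300, 5100, 2800]))  # -> [0.0, 1.0, 0.178...]
```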

In order to better analyze the trend of interaction scale in online courses before, during, and after the pandemic, the selected courses were categorized according to their change in scale. When the number of participants increased (decreased) by more than 20% (a threshold based on statistical experience) and the diameter also increased (decreased), the course scale was determined to have increased (decreased); otherwise, no significant change was identified in the course’s interaction scale. Courses were subsequently divided into three categories: increased interaction scale, decreased interaction scale, and no significant change. Results appear in Table 5.

[Table 5: https://doi.org/10.1371/journal.pone.0273016.t005]
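The categorization rule above translates directly into a small helper; the 20% threshold and the joint condition on participants and diameter follow the description in the text, while the function and variable names are ours.

```python
def classify_scale_change(participants_before: int, participants_during: int,
                          diameter_before: int, diameter_during: int,
                          threshold: float = 0.20) -> str:
    """Categorize a course's interaction-scale change between two phases."""
    relative_change = (participants_during - participants_before) / participants_before
    if relative_change > threshold and diameter_during > diameter_before:
        return "increased"
    if relative_change < -threshold and diameter_during < diameter_before:
        return "decreased"
    return "no significant change"

# Example: a course growing from 3,000 to 4,000 participants with a wider diameter.
print(classify_scale_change(3000, 4000, 6, 7))  # -> "increased"
```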

From before the pandemic to its outbreak, the interaction scale of five courses increased, accounting for 33.3% of the full sample; one course’s interaction scale declined, accounting for 6.7%; and the remaining nine courses showed no significant change, accounting for 60%. The pandemic’s role in promoting online courses was thus not as important as anticipated, and most courses’ interaction scale did not change significantly throughout.

No courses displayed a growing interaction scale after the pandemic: the interaction scale of nine courses fell, accounting for 60%, and the interaction scale of six courses did not shift significantly, accounting for 40%. Courses with an increased scale of interaction during the pandemic did not maintain an upward trend. On the contrary, as the pandemic situation improved, learners’ enthusiasm for online learning waned. We next analyzed several interaction metrics to further explore course interaction during different pandemic periods.

4.2 Characteristics of online learning interaction amid COVID-19

4.2.1 During the COVID-19 pandemic, online learning interaction in some courses became more active.

Changes in the indicators of the courses whose interaction scale grew during the pandemic (SS5, SS6, NS1, NS3, and NS8) are presented in Fig 3. The horizontal axis indicates the number of courses, with red representing a rise in the indicator value on the vertical axis and blue representing a decline.

[Fig 3: https://doi.org/10.1371/journal.pone.0273016.g003]

Specifically: (1) The average degree and weighted average degree of the five course networks demonstrated an upward trend. The emergence of the pandemic promoted students’ enthusiasm; learners were more active in the interactive network. (2) Fig 3 shows that three courses had increased network density and two courses had decreased density. The higher the network density, the more communication within the team. Even though the pandemic increased the interaction scale and frequency, the closeness between learners in some courses did not improve. (3) The clustering coefficient of social science courses rose, whereas the clustering coefficient and small-world property of natural science courses fell. The higher the clustering coefficient and the small-world property, the better the relationship between adjacent nodes and the higher the cohesion [ 39 ]. (4) Most courses’ average path length increased as the interaction scale increased. However, when the average path length grew, adverse effects could manifest: communication between learners might be limited to a small group without multi-directional interaction.

When the pandemic emerged, the only network whose scale declined belonged to a natural science course (NS2). The change in each course index is pictured in Fig 4. The abscissa indicates the size of the value, with larger values to the right. The red dot indicates the index value before the pandemic; the blue dot indicates its value during the pandemic. If the blue dot is to the right of the red dot, then the value of the index increased; otherwise, the index value declined. Only the weighted average degree of the course network increased. The average degree and network density decreased, indicating that network members were not active and that learners’ interaction degree and communication frequency lessened. Despite reduced learner interaction, the average path length was small and the connectivity between learners was adequate.

[Fig 4: https://doi.org/10.1371/journal.pone.0273016.g004]

4.2.2 After the COVID-19 pandemic, the scale decreased rapidly, but most course interaction was more effective.

Fig 5 shows the changes in the interaction indicators of the courses whose interaction scale decreased after the pandemic (SS1, SS2, SS3, SS6, SS7, NS2, NS3, NS7, and NS8).

[Fig 5: https://doi.org/10.1371/journal.pone.0273016.g005]

Specifically: (1) The average degree and weighted average degree of most course networks decreased. The scope and intensity of interaction among network members declined rapidly, as did learners’ enthusiasm for communication. (2) The network density of seven courses also fell, indicating weaker connections between learners in most courses. (3) In addition, the clustering coefficient and small-world property of most course networks decreased, suggesting little possibility of small groups forming in the network. The scope of interaction between learners was not limited to a specific space, and the interaction objects had no significant tendencies. (4) Although the scale of course interaction became smaller in this phase, the average path length of members’ social networks shortened in nine courses. The shorter average path length would expedite the spread of information within the network as well as communication and sharing among network members.

Fig 6 displays the evolution of the interaction indicators of the courses whose interaction scale did not change significantly after the pandemic (SS4, SS5, NS1, NS4, NS5, and NS6).

[Fig 6: https://doi.org/10.1371/journal.pone.0273016.g006]

Specifically: (1) Some course members’ social networks exhibited an increase in the average degree and weighted average degree. In these cases, even though the course network’s scale did not continue to increase, communication among network members rose and interaction became more frequent and deeper than before. (2) Network density and average path length both reflect how tightly knit a social network is. The greater the network density, the denser the social network; the shorter the average path length, the more concentrated the communication among network members. At this phase, however, the average path length and network density in most courses had increased. Yet the network density remained small despite having risen ( Table 6 ). Even with more frequent learner interaction, connections remained distant and the social network was comparatively sparse.

Table 6. https://doi.org/10.1371/journal.pone.0273016.t006

In summary, the scale of interaction did not change significantly overall. Nonetheless, some course members’ frequency and extent of interaction increased, and the relationships between network members became closer as well. Interestingly, the interaction scale of Economics (a social science course) and Electrodynamics (a natural science course) expanded rapidly during the pandemic, and both courses retained their interaction scale thereafter. We next assessed these two courses to determine whether their level of interaction persisted after the pandemic.

4.3 Analyses of natural science courses and social science courses

4.3.1 Analyses of the interaction characteristics of Economics and Electrodynamics.

Economics is a social science course and Electrodynamics a natural science course. Members’ interaction within these courses followed a similar pattern: the interaction scale increased significantly when COVID-19 broke out (Phase III), and no significant changes emerged after the pandemic (Phase V). We hence focused on course interaction long after the outbreak (Phase V) and compared changes across multiple indicators, as listed in Table 7 .

Table 7. https://doi.org/10.1371/journal.pone.0273016.t007

As the pandemic situation continued to improve, the number of participants and the network diameter for Economics each declined long after the outbreak (Phase V) compared with the period just after the pandemic (Phase IV). The interaction scale decreased, but the interaction between learners became much deeper. Specifically: (1) The weighted average degree, network density, clustering coefficient, and small-world property each reflected upward trends. The pandemic therefore exerted a strong impact on this course, and interaction was well maintained even after the pandemic; the smaller network scale promoted members’ interaction and communication. (2) Compared with after the pandemic (Phase IV), members’ network density increased significantly, showing that relationships between learners were closer and that cohesion was improving. (3) At the same time, as the clustering coefficient and small-world property grew, network members demonstrated strong small-group characteristics: communication between them was deepening and their enthusiasm for interaction was higher. (4) Long after the COVID-19 outbreak (Phase V), the average path length was shorter than in previous terms, knowledge flowed more quickly among network members, and the degree of interaction gradually deepened.

The average degree, weighted average degree, network density, clustering coefficient, and small-world property of Electrodynamics all decreased long after the COVID-19 outbreak (Phase V) and were lower than during the outbreak (Phase III). The level of learner interaction therefore gradually declined long after the outbreak (Phase V), and connections between learners were no longer active. Although the pandemic increased course members’ extent of interaction, this rise was merely temporary: students’ enthusiasm for learning waned rapidly and their interaction decreased after the pandemic (Phase IV). To further analyze the interaction characteristics of course members in Economics and Electrodynamics, we evaluated the closeness centrality of their social networks, as described in Section 4.3.2.

4.3.2 Analysis of the closeness centrality of Economics and Electrodynamics.

The change in the closeness centrality of social networks in Economics was small, and no sharp upward trend appeared during the pandemic outbreak, as shown in Fig 7 . The emergence of COVID-19 apparently fostered learners’ interaction in Economics, albeit without a significant impact. The change in closeness centrality in Electrodynamics differed from that in Economics: upon the COVID-19 outbreak, closeness centrality was significantly different from other semesters; communication between learners was closer and interaction was more effective. Electrodynamics course members’ closeness centrality then decreased rapidly after the pandemic, and learners’ communication lessened. In general, the Economics course showed better interaction before the outbreak and was less affected by the pandemic, whereas the Electrodynamics course was more affected by the pandemic and showed different interaction characteristics at different periods of the pandemic.

Fig 7. (Note: "****" indicates a significant difference in closeness centrality between the two periods; otherwise, there was no significant difference.) https://doi.org/10.1371/journal.pone.0273016.g007
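The paper flags significant between-period differences in closeness centrality (the "****" marker in Fig 7) without naming the underlying test. One plausible reconstruction, sketched below, computes per-learner closeness centrality for each phase's reply network and compares the two distributions with a non-parametric test; the data format and the choice of the Mann-Whitney U test are our assumptions, not details taken from the paper.

```python
# Sketch (assumptions noted above): comparing per-learner closeness centrality
# across two phases of one course's reply network.
import networkx as nx
from scipy.stats import mannwhitneyu

def closeness_values(edges):
    """edges: (learner_a, learner_b, n_replies) tuples for one phase (hypothetical format)."""
    G = nx.Graph()
    G.add_weighted_edges_from(edges)
    return list(nx.closeness_centrality(G).values())

def compare_phases(edges_phase_a, edges_phase_b):
    a = closeness_values(edges_phase_a)
    b = closeness_values(edges_phase_b)
    stat, p = mannwhitneyu(a, b, alternative="two-sided")
    return stat, p

if __name__ == "__main__":
    # Toy reply networks standing in for "before outbreak" and "during outbreak".
    before = [("s1", "s2", 1), ("s2", "s3", 2), ("s3", "s4", 1)]
    during = [("s1", "s2", 4), ("s1", "s3", 3), ("s2", "s3", 5), ("s3", "s4", 2), ("s4", "s5", 1)]
    print(compare_phases(before, during))
```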

5. Discussion

We referred to discussion forums from several courses on the icourse.163 MOOC platform to compare online learning before, during, and after the COVID-19 pandemic via SNA and to delineate the pandemic’s effects on online courses. Only 33.3% of courses in our sample increased in terms of interaction during the pandemic, and the scale of interaction did not rise in any course thereafter. Where course scale rose, the scope and frequency of interaction trended upward during the pandemic. The clustering coefficients of natural science and social science courses also differed: the coefficient for social science courses tended to rise, whereas that for natural science courses generally declined. When the pandemic broke out, the interaction scale of a single natural science course decreased along with its interaction scope and frequency. The amount of interaction in most courses shrank rapidly during the pandemic, and network members were not as active as they had been before. However, after the pandemic, some courses saw declining interaction but greater communication between members; interaction also became more frequent and deeper than before.

5.1 During the COVID-19 pandemic, the scale of interaction increased in only a few courses

The pandemic outbreak led to a rapid increase in the number of participants in most courses; however, the change in network scale was not significant. The scale of online interaction expanded swiftly in only a few courses; in others, the scale either did not change significantly or displayed a downward trend. After the pandemic, the interaction scale in most courses decreased quickly, as did communication between network members; learners’ enthusiasm for online interaction declined as the circumstances of the pandemic improved. During the pandemic, China’s Ministry of Education had declared the “School’s Out, But Class’s On” policy, and major colleges and universities were encouraged to use the Internet and informational resources to provide learning support, hence the sudden increase in the number of participants and interaction in online courses [ 46 ]. After the pandemic, students’ enthusiasm for online learning gradually weakened, presumably due to the easing of the pandemic [ 47 ]. More activities also transitioned from online to offline, which tempered learners’ online discussion. Research has shown that long-term online learning can even bore students [ 48 ].

Most courses’ interaction scale decreased significantly after the pandemic. First, teachers and students occupied separate spaces during the outbreak, had few opportunities for mutual cooperation and friendship, and lacked a sense of belonging [ 49 ]. Students’ enthusiasm for learning dissipated over time [ 50 ]. Second, some teachers were especially concerned about adapting in-person instructional materials for digital platforms; their pedagogical methods were ineffective, and they did not provide learning activities germane to student interaction [ 51 ]. Third, although teachers and students in remote areas were actively engaged in online learning, some students could not continue to participate in distance learning due to inadequate technology later in the outbreak [ 52 ].

5.2 Characteristics of online learning interaction during and after the COVID-19 pandemic

5.2.1 During the COVID-19 pandemic, online interaction in most courses did not change significantly.

The interaction scale of only a few courses increased during the pandemic. The interaction scope and frequency of these courses climbed as well. Yet even as the degree of network interaction rose, course network density did not expand in all cases. The pandemic sparked a surge in the number of online learners and a rapid increase in network scale, but students found it difficult to interact with all learners. Yau pointed out that a greater network scale did not enrich the range of interaction between individuals; rather, the number of individuals who could interact directly was limited [ 53 ]. The internet facilitates interpersonal communication. However, not everyone has the time or ability to establish close ties with others [ 54 ].

In addition, social science courses and natural science courses in our sample revealed disparate trends in this regard: the clustering coefficient of social science courses increased and that of natural science courses decreased. Social science courses usually employ learning approaches distinct from those in natural science courses [ 55 ]. Social science courses emphasize critical and innovative thinking along with personal expression [ 56 ]. Natural science courses focus on practical skills, methods, and principles [ 57 ]. Therefore, the content of social science courses can spur large-scale discussion among learners. Some course evaluations indicated that the course content design was suboptimal as well: teachers paid close attention to knowledge transmission and much less to piquing students’ interest in learning. In addition, the thread topics that teachers posted were scarcely diversified and teachers’ questions lacked openness. These attributes could not spark active discussion among learners.

5.2.2 Online learning interaction declined after the COVID-19 pandemic.

Most courses’ interaction scale and intensity decreased rapidly after the pandemic, but some did not change. Courses with a larger network scale did not continue to expand after the outbreak, and students’ enthusiasm for learning waned. The pandemic’s reduced severity also influenced the number of participants in online courses. Meanwhile, restored school order moved many learning activities from virtual to in-person spaces. Face-to-face learning gradually replaced online learning, resulting in lower enrollment and less interaction in online courses. Prolonged online courses could also have led students to feel lonely and to lack a sense of belonging [ 58 ].

The scale of interaction in some courses did not change substantially after the pandemic yet learners’ connections became tighter. We hence recommend that teachers seize pandemic-related opportunities to design suitable activities. Additionally, instructors should promote student-teacher and student-student interaction, encourage students to actively participate online, and generally intensify the impact of online learning.

5.3 What are the characteristics of interaction in social science courses and natural science courses?

The level of interaction in Economics (a social science course) was significantly higher than that in Electrodynamics (a natural science course), and the small-world property in Economics increased as well. To boost online courses’ learning-related impacts, teachers can divide learners into groups based on the clustering coefficient and the average path length. Small groups can benefit learning in several ways: students can participate actively in activities intended to expand their knowledge, and key actors can emerge within these groups. Cultivating students’ keenness to participate in class activities and their self-management can also help teachers guide learner interaction and foster deep knowledge construction.

As evidenced by comments posted in the Electrodynamics course, we observed less interaction between students. Teachers also rarely urged students to contribute to conversations. These trends may have arisen because teachers and students were in different spaces. Teachers might have struggled to discern students’ interaction status. Teachers could also have failed to intervene in time, to design online learning activities that piqued learners’ interest, and to employ sound interactive theme planning and guidance. Teachers are often active in traditional classroom settings. Their roles are comparatively weakened online, such that they possess less control over instruction [ 59 ]. Online instruction also requires a stronger hand in learning: teachers should play a leading role in regulating network members’ interactive communication [ 60 ]. Teachers can guide learners to participate, help learners establish social networks, and heighten students’ interest in learning [ 61 ]. Teachers should attend to core members in online learning while also considering edge members; by doing so, all network members can be driven to share their knowledge and become more engaged. Finally, teachers and assistant teachers should help learners develop knowledge, exchange topic-related ideas, pose relevant questions during course discussions, and craft activities that enable learners to interact online [ 62 ]. These tactics can improve the effectiveness of online learning.

As described, network members displayed distinct interaction behavior in the Economics and Electrodynamics courses. First, these courses varied in difficulty: the social science course seemed easier to understand and focused on divergent thinking. Learners were often willing to express their views in comments and to ponder others’ perspectives [ 63 ]. The natural science course seemed more demanding and was oriented around logical thinking and skills [ 64 ]. Second, the courses’ content differed. In general, social science courses favor the acquisition of declarative and creative knowledge compared with natural science courses, and they also entertain open questions [ 65 ]. Natural science courses revolve around principle knowledge, strategic knowledge, and transfer knowledge [ 66 ], and their problems are normally more complicated than those in social science courses. Third, the factors affecting students’ attitudes toward learning also differed. Guo et al. discovered that “teacher feedback” most strongly influenced students’ attitudes toward learning in social science courses but had less impact on students in natural science courses [ 67 ]. Therefore, learners in social science courses likely expect more feedback from teachers and greater interaction with others.

6. Conclusion and future work

Our findings show that the network interaction scale of some online courses expanded during the COVID-19 pandemic. The network scale of most courses did not change significantly, demonstrating that the pandemic did not notably alter the scale of course interaction. Among the courses whose interaction scale increased, online learning interaction among network members also became more frequent during the pandemic. Once the outbreak was under control, although the scale of interaction declined, the level and scope of some courses’ interactive networks continued to rise; interaction was thus particularly effective in these cases. Overall, the pandemic appeared to have a relatively positive impact on online learning interaction. We considered a pair of courses in detail and found that Economics (a social science course) fared much better than Electrodynamics (a natural science course) in classroom interaction; learners were more willing to partake in class activities, perhaps due to these courses’ unique characteristics. Brint et al. also came to similar conclusions [ 57 ].

This study was intended to be rigorous. Even so, several constraints can be addressed in future work. The first limitation involves our sample: we focused on a select set of courses hosted on China’s icourse.163 MOOC platform. Future studies should involve an expansive collection of courses to provide a more holistic understanding of how the pandemic has influenced online interaction. Second, we only explored the interactive relationship between learners and did not analyze interactive content. More in-depth content analysis should be carried out in subsequent research. All in all, the emergence of COVID-19 has provided a new path for online learning and has reshaped the distance learning landscape. To cope with associated challenges, educational practitioners will need to continue innovating in online instructional design, strengthen related pedagogy, optimize online learning conditions, and bolster teachers’ and students’ competence in online learning.

  • 30. Serrat O. Social network analysis. In: Knowledge Solutions. Springer; 2017. p. 39–43. https://doi.org/10.1007/978-981-10-0983-9_9
  • 33. Rong Y, Xu E, editors. Strategies for the Management of the Government Affairs Microblogs in China Based on the SNA of Fifty Government Affairs Microblogs in Beijing. 14th International Conference on Service Systems and Service Management; 2017.
  • 34. Hu X, Chu S, editors. A comparison on using social media in a professional experience course. International Conference on Social Media and Society; 2013.
  • 35. Bydžovská H. A Comparative Analysis of Techniques for Predicting Student Performance. In: Proceedings of the 9th International Conference on Educational Data Mining; Raleigh, NC, USA: International Educational Data Mining Society; 2016. p. 306–311.
  • 40. Olivares D, Adesope O, Hundhausen C, et al., editors. Using social network analysis to measure the effect of learning analytics in computing education. 19th IEEE International Conference on Advanced Learning Technologies; 2019.
  • 41. Travers J, Milgram S. An experimental study of the small world problem. In: Social Networks. Elsevier; 1977. p. 179–197. https://doi.org/10.1016/B978-0-12-442450-0.50018-3
  • 43. Okamoto K, Chen W, Li X-Y, editors. Ranking of closeness centrality for large-scale social networks. International Workshop on Frontiers in Algorithmics; 2008; Berlin, Heidelberg: Springer.
  • 47. Ding Y, Yang X, Zheng Y, editors. COVID-19’s Effects on the Scope, Effectiveness, and Roles of Teachers in Online Learning Based on Social Network Analysis: A Case Study. International Conference on Blended Learning; 2021: Springer.
  • 64. Boys C, Brennan J, Henkel M, Kirkland J, Kogan M, Youl P. Higher Education and Preparation for Work. Jessica Kingsley Publishers; 1988. https://doi.org/10.1080/03075079612331381467


Original Research Article

Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future


  • 1 Minerva Schools at Keck Graduate Institute, San Francisco, CA, United States
  • 2 Ronin Institute for Independent Scholarship, Montclair, NJ, United States
  • 3 Department of Physics, University of Toronto, Toronto, ON, Canada

This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students’ experiences with and perspectives on those methods of remote instruction in order to inform pedagogical decisions during the current pandemic and in future development of online courses and virtual learning experiences. Our survey gathered quantitative and qualitative data regarding students’ experiences with synchronous and asynchronous methods of remote learning and specific pedagogical techniques associated with each. A total of 4,789 undergraduate participants representing institutions across 95 countries were recruited via Instagram. We find that most students prefer synchronous online classes, and students whose primary mode of remote instruction has been synchronous report being more engaged and motivated. Our qualitative data show that students miss the social aspects of learning on campus, and it is possible that synchronous learning helps to mitigate some feelings of isolation. Students whose synchronous classes include active-learning techniques (which are inherently more social) report significantly higher levels of engagement, motivation, enjoyment, and satisfaction with instruction. Respondents’ recommendations for changes emphasize increased engagement, interaction, and student participation. We conclude that active-learning methods, which are known to increase motivation, engagement, and learning in traditional classrooms, also have a positive impact in the remote-learning environment. Integrating these elements into online courses will improve the student experience.

Introduction

The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning had elected the online format, starting with individual online courses in the mid-1990s through today’s robust online degree and certificate programs. These students prioritize convenience, flexibility, and the ability to work while studying, and they are older than traditional college-age students ( Harris and Martin, 2012 ; Levitz, 2016 ). These students also find asynchronous elements of a course more useful than synchronous elements ( Gillingham and Molinari, 2012 ). In contrast, students who choose to take courses in person prioritize face-to-face instruction and connection with others and skew considerably younger ( Harris and Martin, 2012 ). This leaves open the question of whether students who prefer to learn in person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following a switch to remote learning during the COVID-19 pandemic indicates that students prefer synchronous over asynchronous course elements and find them more effective ( Gillis and Krull, 2020 ). Now that millions of traditionally in-person courses have transitioned online, our survey expands the data on student preferences and explores whether those preferences align with pedagogical best practices.

An extensive body of research has explored which instructional methods improve student learning outcomes ( Fink, 2013 ). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in person and online ( Freeman et al., 2014 ; Chen et al., 2018 ; Davis et al., 2018 ). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor ( Freeman et al., 2014 ). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning ( Bostock, 1998 ; Zull, 2002 ). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used ( Deslauriers et al., 2019 ). We examine student perceptions of remote learning experiences in light of these previous findings.

In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English speaking undergraduate students representing institutions across 95 countries. We aim to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.

Materials and Methods

Participant Pool

Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, ethical lifestyle, and promotes a positive mindset. For this reason, the audience is presumably academically inclined, and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1 ).

Survey Design

The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.

This survey was initially created for a student assignment for the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used, and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey to avoid biasing the results and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions ( Fink, 2013 ; Chen et al., 2018 ; Davis et al., 2018 ). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.

The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data was collected from 5/21/20 to 5/23/20.

Qualitative Coding

We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice ( Garrison et al., 1999 ; Zull, 2002 ; Fink, 2013 ; Freeman et al., 2014 ). The initial codebook was revised collaboratively based on feedback from researchers after coding 20–80 qualitative comments each. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes. Discrepancies in terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook ( Supplementary Table 1 ). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of 4,776 responses to question 12 were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.

Statistical Analysis

The survey included two sets of Likert-scale questions, one consisting of a set of six statements about students’ perceptions of their experiences following the transition to remote learning ( Table 1 ). For each statement, students indicated their level of agreement with the statement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked the students to respond to the same set of statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses ( Mansinghka et al., 2016 ).


Table 1. Likert-scale questions.

Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U -test, and p -values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f , which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale data.
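To make the effect-size calculation concrete, the sketch below runs a two-sided Mann-Whitney U test on two hypothetical subgroups of Likert responses and derives the common-language effect size as f = U / (n1 · n2). The data and variable names are illustrative only; they are not drawn from the survey.

```python
# Illustrative sketch: two-sided Mann-Whitney U test between two subgroups of
# Likert responses, plus the common-language effect size f = U / (n1 * n2),
# i.e., the probability that a random response from group 1 exceeds one from
# group 2 (ties counted as half). The ratings below are made up.
from scipy.stats import mannwhitneyu

def compare_subgroups(likert_group1, likert_group2):
    u, p = mannwhitneyu(likert_group1, likert_group2, alternative="two-sided")
    f = u / (len(likert_group1) * len(likert_group2))
    return {"U": u, "p": p, "f": f}

if __name__ == "__main__":
    synchronous_primary = [4, 5, 3, 4, 4, 5, 2, 4]   # hypothetical engagement ratings
    asynchronous_primary = [3, 3, 2, 4, 3, 2, 3, 3]
    print(compare_subgroups(synchronous_primary, asynchronous_primary))
```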

Students Prefer Synchronous Class Sessions

Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and from the literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% ( n = 2,045) of students identified live classes as their primary mode of learning, 54.6% ( n = 2,613) preferred this mode ( Figure 1 ). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).


Figure 1. Actual (A) and preferred (B) primary modes of learning.

In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction and report higher levels of participation ( Table 2 and Supplementary Figure 2 ). Regardless of primary mode, over two-thirds of students reported they are often distracted during remote courses.


Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.

Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience

To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identified three categories ( National Research Council, 2000 ; Freeman et al., 2014 ). Passive methods (P) include lectures, presentations, and explanation using diagrams, whiteboards, and/or other media; these methods all rely on instructor delivery rather than student participation. The next category represents active learning through primarily one-on-one interactions (A); the methods in this group are in-class assessment, question-and-answer (Q&A), and classroom chat. Group interactions (F) include classroom discussions and small-group activities (a labelling sketch follows this paragraph). Given these categories, Mann-Whitney U pairwise comparisons between the 7 possible combinations and Likert-scale responses about student experience showed that the use of a variety of methods resulted in higher ratings of experience than the use of a single method, whether or not that single method was active or passive ( Table 3 ). Indeed, students whose classes used methods from each category (PAF) had higher ratings of enjoyment, motivation, and satisfaction with instruction than those who chose any single method ( p < 0.0001) and also reported higher participation and engagement compared with students whose only method was passive (P) or active through one-on-one interactions (A) ( p < 0.00001). Student ratings of distraction were not significantly different for any comparison. Given that sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to look at the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response ( Supplementary Figure 3 ).
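As a concrete illustration of this grouping, each respondent's reported techniques can be collapsed into one of the seven category combinations. The sketch below does so under our own, hypothetical labels for the survey's answer options; it is not the authors' analysis code.

```python
# Illustrative sketch: collapsing each respondent's reported synchronous
# techniques into one of the seven possible category combinations (P, A, F and
# their unions). The technique labels are paraphrased from the text; they are
# not the survey's exact answer options.
CATEGORY = {
    "lecture": "P", "presentation": "P", "diagram_explanation": "P",    # passive
    "in_class_assessment": "A", "q_and_a": "A", "classroom_chat": "A",  # one-on-one active
    "class_discussion": "F", "small_group_activity": "F",               # group active
}

def combination_label(reported_methods):
    """Return e.g. 'P', 'PA', or 'PAF' for one respondent's list of techniques."""
    present = {CATEGORY[m] for m in reported_methods if m in CATEGORY}
    return "".join(c for c in "PAF" if c in present)

if __name__ == "__main__":
    print(combination_label(["lecture", "classroom_chat"]))                    # -> PA
    print(combination_label(["presentation", "q_and_a", "class_discussion"]))  # -> PAF
```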


Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).

Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used vs. the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction ( Table 4 ). Even a comparison between four or fewer methods and more than four methods resulted in a 59% chance that the latter group enjoyed the courses more ( p < 0.00001) and a 60% chance that they felt more motivated to learn ( p < 0.00001). Students who selected more than four methods ( n = 417) also had probabilities of 65.1% ( p < 0.00001), 62.9% ( p < 0.00001), and 64.3% ( p < 0.00001) of being more satisfied with instruction, more engaged, and more actively participating, respectively. Therefore, there was overlap between how the number and the variety of methods influenced students’ experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons of subsets of data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and yielded no significant findings (data not shown). Therefore, from the data in our survey, there seems to be an interdependence between the number and variety of methods in shaping students’ learning experiences.


Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).

Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience

Along with synchronous pedagogical methods, students reported the asynchronous methods that were used for their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students’ perceptions ( Table 5 ). For example, compared to providing learning materials only, providing learning materials, interaction, and testing improved enjoyment ( f = 0.546, p < 0.001), motivation ( f = 0.553, p < 0.0001), satisfaction with instruction ( f = 0.596, p < 0.00001), engagement ( f = 0.572, p < 0.00001) and active participation ( f = 0.563, p < 0.00001) (row 6). Similarly, compared to just being interactive with conversations, the combination of all three methods improved five out of six indicators, except for distraction in class (row 11).


Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).

Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions ( Supplementary Table 2 ). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
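The paper does not name its regression software. One common way to fit such a model in Python is statsmodels' OrderedModel; the sketch below uses made-up column names and a tiny illustrative dataset purely to show the approach, not the authors' specification.

```python
# Illustrative sketch (not the authors' model specification): an ordinal
# logistic regression of a 1-5 Likert response on indicator variables, using
# statsmodels' OrderedModel. Column names and data are made up; a real
# analysis would use the full survey responses.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.DataFrame({
    "engagement": [1, 2, 3, 4, 5, 4, 3, 5, 2, 4, 3, 5],  # Likert rating (ordinal)
    "uses_zoom":  [1, 0, 1, 1, 0, 1, 0, 1, 0, 1, 1, 0],  # hypothetical platform flag
    "n_methods":  [1, 1, 2, 3, 2, 4, 1, 5, 2, 3, 2, 4],  # number of methods reported
})

# Treat the response as an ordered categorical variable rather than a number.
df["engagement"] = df["engagement"].astype(
    pd.CategoricalDtype(categories=[1, 2, 3, 4, 5], ordered=True)
)

model = OrderedModel(df["engagement"], df[["uses_zoom", "n_methods"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```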

Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning

As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).

In addition to identifying social reasons for their preference for in-person learning, students’ suggestions for improvements in online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, 126 calls for increased participation and calls for changes related to these topics such as, “Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don’t say anything and don’t participate in group work,” and “Make it less of the professor reading the pdf that was given to us and more interaction.”

Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.

The preference for synchronous vs. asynchronous modes of learning follows similar trends ( Table 6 ). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.


Table 6. Most prevalent themes for students based on their preferred mode of remote learning.

Student Perceptions Align With Research on Active Learning

The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning ( Freeman et al., 2014 ).

Though research shows that student learning improves in active learning classes, on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses ( Deslauriers et al., 2019 ). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside of the lecture itself. That might include the lecture hall environment, the social physical presence of peers, or normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online vs. in-person.

A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.

This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.

Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on student attention span, which suggests varying delivery after 10–15 min to retain students’ attention ( Bradbury, 2016 ). This is likely even more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and that there are few passive-learning methods, we can assume that some combination of methods that includes active learning improves the student experience. However, it is not clear whether we should predict that this benefit would come simply from increasing the number of different methods used, or if there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.

Students Value Social Presence in Remote Learning

Student responses across our open-ended survey questions show a striking difference in reasons for their preferences compared with traditional online learners, who prefer flexibility ( Harris and Martin, 2012 ; Levitz, 2016 ). Students’ reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.

Short et al. (1976) outlined Social Presence Theory, which describes the degree to which students perceive one another as real across different telecommunication media. These ideas translate directly to questions surrounding online education and pedagogy, particularly educational design in networked learning, where connection among learners and instructors improves learning outcomes, especially through “Human-Human interaction” ( Goodyear, 2002 , 2005 ; Tu, 2002 ). They also bear on asynchronous vs. synchronous learning: Tu reports that students respond positively to synchronous “real-time discussion in pleasantness, responsiveness and comfort with familiar topics,” and that real-time discussions edge out asynchronous computer-mediated communications in immediacy of replies and responsiveness. Tu’s research indicates that students perceive more interaction with synchronous media such as discussions because of immediacy, which enhances social presence and supports the use of active-learning techniques ( Gunawardena, 1995 ; Tu, 2002 ). Thus, verbal immediacy and communities with face-to-face interactions, such as those in synchronous learning classrooms, lessen the psychological distance of communicators online and can simultaneously improve instructional satisfaction and reported learning ( Gunawardena and Zittle, 1997 ; Richardson and Swan, 2019 ; Shea et al., 2019 ). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.

Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors ( Gay, 2000 ). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.

Limitations of These Data

Our undergraduate data consisted of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, who focuses on learning and wellness, they may not represent the average student. Survey samples are often shaped by their recruitment techniques; ours likely yielded more robust and thoughtful responses to free-response questions and may have influenced the preference for synchronous classes. It is unlikely to have changed students’ reporting of remote-learning pedagogical methods, since those are outside student control.

Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.

This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.

Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.

Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning

Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the often-found negative reactions of students to these practices in traditional classrooms ( Shekhar et al., 2020 ). Our analysis of open-ended questions and preferences show that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may be the default for colleges during temporary emergencies. This has already been used at the K-12 level as snow days become virtual learning days ( Aspergren, 2020 ).

In addition to informing pedagogical decisions, the results of this survey can be used to inform future research. Although we survey a global population, our recruitment method selected for students who are English speakers, likely majority female, and have an interest in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine what the optimal combinations and implementations are for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.

Importantly, this paper focuses on a subset of responses from the full data set which includes 10,563 students from secondary school, undergraduate, graduate, or professional school and additional questions about in-person learning. Our full data set is available here for anyone to download for continued exploration: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/2TGOPH .

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.647986/full#supplementary-material

Aspergren, E. (2020). Snow Days Canceled Because of COVID-19 Online School? Not in These School Districts. USA Today. Available online at: https://www.usatoday.com/story/news/education/2020/12/15/covid-school-canceled-snow-day-online-learning/3905780001/ (accessed December 15, 2020).


Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066


Bradbury, N. A. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 40, 509–513. doi: 10.1152/advan.00109.2016


Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369

Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117

Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6

Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.

Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4

Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263

Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work , eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4

Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.

Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970

Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211

Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.

Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press, doi: 10.17226/9853

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A Follow-up investigation of ‘teaching presence’ in the suny learning network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.

Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.

Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421

Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning , 1st Edn. Sterling, VA: Stylus Publishing.

Keywords : online learning, COVID-19, active learning, higher education, pedagogy, survey, international

Citation: Nguyen T, Netto CLM, Wilkins JF, Bröker P, Vargas EE, Sealfon CD, Puthipiroj P, Li KS, Bowler JE, Hinson HR, Pujar M and Stein GM (2021) Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future. Front. Educ. 6:647986. doi: 10.3389/feduc.2021.647986

Received: 30 December 2020; Accepted: 09 March 2021; Published: 09 April 2021.

Copyright © 2021 Nguyen, Netto, Wilkins, Bröker, Vargas, Sealfon, Puthipiroj, Li, Bowler, Hinson, Pujar and Stein. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Geneva M. Stein, [email protected]

This article is part of the Research Topic “Covid-19 and Beyond: From (Forced) Remote Teaching and Learning to ‘The New Normal’ in Higher Education”.

  • Open access
  • Published: 28 September 2023

Research on the development and innovation of online education based on digital knowledge sharing community

  • Xi Huang 1 ,
  • Hongwei Li 2 ,
  • Lirong Huang 3 &
  • Tao Jiang 4  

BMC Psychology volume  11 , Article number:  295 ( 2023 ) Cite this article

2193 Accesses

1 Citations

Digital knowledge sharing (DKS) communities have emerged as a promising approach to support learning and innovation in online higher education. These communities facilitate the exchange of knowledge, resources, and ideas among educators, students, and experts, creating opportunities for collaboration, innovation, and lifelong learning. However, the impact and role of DKS communities in online education are not well understood, and further research is needed to explore their potential benefits and challenges.

This multi-objective qualitative study aims to investigate the impact and role of DKS communities in online higher education, identifying the factors that promote student success and the implications for the development of online education. The study collected data from 20 informants who have experienced teaching online during and after the pandemic. Data were collected through in-depth interviews and analyzed using thematic analysis. The informants were selected through theoretical sampling.

Methodology

To explore the impact and role of DKS communities in online higher education, this study employed a multi-objective qualitative research method. Data were collected through in-depth interviews conducted with 20 informants who possessed experience in teaching online during and after the pandemic. The informants were selected through theoretical sampling to ensure diverse perspectives and insights. The collected data were subsequently analyzed using thematic analysis, allowing for the identification of key themes and patterns.

Results

The findings of this study provide valuable insights into the impact and role of DKS communities in online higher education. These insights encompass various aspects, including the benefits and challenges of DKS in online education, the factors that contribute to student success, and the implications for the ongoing development and innovation of online education.

Conclusions

In conclusion, this multi-objective qualitative study sheds light on the significance of DKS communities in online higher education. It underscores their potential to enhance collaboration, innovation, and lifelong learning. The findings also emphasize the importance of addressing challenges and fostering an inclusive and supportive online learning environment. These insights inform best practices and contribute to the continuous development and innovation of online education, particularly in the post-pandemic educational landscape.

Introduction

Education for sustainable development is a vital aspect of achieving the Sustainable Development Goals proposed by the United Nations and promoted by the United Nations Educational, Scientific and Cultural Organization (UNESCO), and adopted by institutions worldwide [ 1 ]. Education is seen as an essential means of creating awareness and promoting sustainable development by encouraging individuals to adopt environmentally friendly behaviors. The concept of Higher Education for Sustainable Development (HESD) has been widely discussed in recent years [ 2 ]. Higher education institutions, such as universities and technical training colleges, have gradually become essential platforms for promoting sustainable development in the 21st century [ 3 ].

Digitalization, on the other hand, has silently revolutionized the way humans live. Almost all fields of knowledge are benefiting from digitalization, including education, healthcare, business, and entertainment [ 4 ]. Digitalization has provided a more efficient way of knowledge sharing, allowing individuals to access and share information quickly and easily. This has enabled education to be more widespread and accessible to individuals worldwide, creating opportunities for people to improve their knowledge and skills [ 5 ]. Institutions of higher education have been at the forefront of digitalization, transforming the way instructors develop courses and disseminate research findings. Digital networks such as 5G are gradually rolling out worldwide, enabling faster and more reliable communication, which has transformed many sectors, including industry. Universities and technical training colleges have played a significant role in advancing the UN Sustainable Development Goals, with many initiatives launched worldwide to promote them further [ 1 ].

According to Elmassah et al. [ 4 ], higher education has traditionally been the primary platform for generating, developing, and promoting knowledge. In recent years, countries such as China, India, Thailand, Vietnam, Nigeria, and Kenya have successfully applied digitalization and used higher education to promote sustainable development [ 6 , 7 ]. Digitalization has been a powerful tool for sharing knowledge, as noted by Gregson et al. [ 8 ]. With the advent of the internet and advanced technologies, institutions of higher education can share knowledge generated by experts with new generations of learners [ 9 ]. As Funk [ 10 ] suggests, “sharing knowledge is power,” and digitalization provides an effective means of achieving this.

Digital technology is transforming how learners understand and interpret new knowledge, as well as impacting the motivation of academics to share their research findings. The construction industry [ 11 , 12 ] and the information technology field [ 12 ] are examples of how digitalization is changing the way people work. This is due to the increased flow of information made possible by digitalization, which enables better coordination between independent units and individuals. In higher education institutions, digitalization has had a significant impact, with several initiatives launched worldwide to promote its further development [ 13 ]. Digital platforms provide new structures for better knowledge sharing and continuous innovation, as noted by Arfi et al. [ 14 ].

Technology can be used to create Digital Knowledge Sharing (DKS) communities (DKSCs). These communities enable collaboration and networking among learners and educators, facilitate personalized learning, incorporate gamification to enhance the learning experience, leverage artificial intelligence and machine learning to provide adaptive learning experiences, and require quality control to ensure the accuracy and reliability of learning resources [ 15 , 16 , 17 , 18 , 19 , 20 , 21 ]. Digital knowledge sharing in online education has several benefits, including accessibility, flexibility, cost-effectiveness, customization, and interactive learning [ 22 , 23 ]. This approach has the potential to transform traditional education by providing students with a flexible, accessible, and engaging learning experience.
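As a purely illustrative aside, and not a description of the study’s methods or of any particular platform, the short Python sketch below shows one way a DKS community’s shared resources and a basic peer-rating quality check might be modeled; the class and parameter names (Resource, Community, min_rating) are hypothetical.

```python
# Illustrative sketch only: one possible data model for a DKS community,
# with a naive peer-rating threshold standing in for "quality control".
# Class and parameter names (Resource, Community, min_rating) are hypothetical.
from dataclasses import dataclass, field
from statistics import mean
from typing import List

@dataclass
class Resource:
    title: str
    shared_by: str
    tags: List[str]
    ratings: List[int] = field(default_factory=list)  # peer ratings on a 1-5 scale

    def average_rating(self) -> float:
        return mean(self.ratings) if self.ratings else 0.0

@dataclass
class Community:
    name: str
    resources: List[Resource] = field(default_factory=list)

    def share(self, resource: Resource) -> None:
        """An educator or student contributes a resource to the community."""
        self.resources.append(resource)

    def curated(self, min_rating: float = 4.0) -> List[Resource]:
        """Simple quality control: keep resources whose peer rating clears a threshold."""
        return [r for r in self.resources if r.average_rating() >= min_rating]

# Usage: an instructor shares a resource, peers rate it, learners browse the curated list.
community = Community("Online pedagogy DKS community")
checklist = Resource("Active-learning checklist", "Instructor A", ["engagement", "pedagogy"])
community.share(checklist)
checklist.ratings.extend([5, 4, 4])
print([r.title for r in community.curated()])  # ['Active-learning checklist']
```

A production DKS platform would, of course, combine such rating signals with moderation, versioning, and expert review rather than relying on a single threshold.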

Digitalization is portrayed as an omnipresent force that has revolutionized various sectors, including education [ 15 ]. The literature underscores the advantages of digital technology in facilitating knowledge sharing, making education more accessible, and fostering global connectivity. Nonetheless, it is crucial to acknowledge the digital divide that persists globally, wherein not all individuals have equitable access to digital resources and technologies. Therefore, the assertion that digitalization universally enhances accessibility should be tempered with an awareness of existing disparities [ 16 , 17 ].

Grounded in the belief that collaborative learning, dissemination of best practices, and technological innovation are central components of educational progress, this research seeks to explore how DKS communities act as catalysts for continual improvement in the digital education landscape. Drawing from theories of educational technology, social learning, and innovation diffusion, the study aims to elucidate the multifaceted ways in which these communities enhance the online learning experience, foster critical thinking, and promote a culture of lifelong education [ 17 ]. By investigating the interplay between technology, collaborative learning, and innovation within the context of online education, this study endeavors to contribute to the theoretical foundations underpinning the advancement of digital learning environments [ 17 ].

Furthermore, the literature review highlights the transformative potential of digital technology in higher education, particularly in terms of knowledge dissemination and collaboration among learners and educators. It suggests that digital platforms can facilitate continuous innovation and personalized learning experiences [ 12 , 13 , 14 , 15 , 16 , 17 , 18 ]. However, the review does not thoroughly address the challenges and drawbacks of this digital transformation. Issues related to digital literacy, data privacy, and the quality of online education resources require careful consideration. Additionally, the potential homogenization of education through digitalization, where diverse perspectives may be marginalized, warrants scrutiny.

The concept of DKS communities is introduced as a means to harness the power of digital technology in education. These communities are presented as innovative solutions for collaboration, gamification, and adaptive learning. While the potential benefits of DKSCs are intriguing, the review does not offer a comprehensive examination of the effectiveness of such communities in practice. Are DKS communities accessible to all students, and do they effectively enhance learning outcomes? These questions remain unanswered.

In conclusion, the literature review provides a compelling narrative about the transformative potential of digitalization in higher education for sustainable development. However, it is important to approach this paradigm shift with critical scrutiny. Equitable access, digital literacy, quality assurance, and the preservation of diverse educational experiences are paramount considerations in the era of digital education. The promising fusion of education and digitalization should be tempered with a commitment to addressing the challenges that arise in this evolving landscape.

As we stand at the intersection of technology and education, it becomes increasingly evident that our ability to harness these innovations will have far-reaching implications for the future of learning. In this context, understanding how DKS communities shape knowledge exchange, collaboration, and engagement in online higher education is not only academically valuable but also relevant to the broader discourse on how technology is reshaping the educational landscape. Specifically, there is a need to understand the challenges and opportunities of using these communities to develop and innovate online education. This study aims to contribute to the growing body of research on the development and innovation of online education based on digital knowledge sharing. This research can inform educators and policymakers on how best to leverage DKS communities to enhance the quality and effectiveness of online education in the post-pandemic era. Therefore, this research aims to investigate the following research questions:

What are the benefits and challenges of digital knowledge sharing in online education?

What is the role of DKS communities in the development and innovation of online education?

How do DKS communities impact student learning and engagement in online education?

What factors contribute to the effectiveness of DKS communities in promoting student success?

Research method

Research design

The research design for this study was a qualitative research method using a phenomenological approach. “Phenomenology is particularly useful in research studies that aim to explore subjective experiences and perceptions of participants. It allows the researcher to gain a deep understanding of the lived experiences of the participants, and to uncover the meaning and essence of those experiences” [ 24 ]. The study aimed to explore the experiences of university professors in higher education in China regarding the role and contribution of DKS communities to online education. The phenomenological approach allowed the researcher to understand the subjective experiences and perceptions of the informants regarding the use of DKS communities in online education.

The informants for this study were 20 university professors of higher education in China who were selected through theoretical sampling. The selection criteria for the informants were that they had experience in teaching online courses and had used DKS communities in their teaching. The rationale for the sample size was data saturation, which occurred when the 20th informant was interviewed. The informants were selected from Tsinghua University, Peking University, Fudan University, Nanjing University, and Wuhan University. The average age of the group was 30 years with a standard deviation of 5 years, while the average number of years of teaching experience was 12 with a standard deviation of 5 years. The informants who participated in the study did so willingly and were aware of the purpose of the research. They were provided with information about the study and gave their informed consent before being interviewed. The confidentiality and privacy of the informants and the data collected were ensured throughout the research process. The informants were assured that their personal information and responses would be kept confidential and that their identities would not be revealed in any publications or reports resulting from the study. All data collected during the study were stored securely and accessed only by the research team. These measures were taken to ensure that the informants felt comfortable sharing their experiences and perceptions and to protect their privacy and confidentiality.

Instrumentation

The primary instrument for data collection in this study was semi-structured, focused interviews. The semi-structured format allowed the researcher to explore the informants’ experiences and perceptions regarding the use of digital knowledge-sharing communities in online education in a flexible and open-ended manner, while the focused interviews allowed the researcher to probe more deeply into specific topics or issues related to the use of digital knowledge-sharing communities in online education (See Additional file 1 : Appendix A).

Data collection procedure

The informants who participated in the study were recruited through theoretical sampling, which involved identifying potential participants who met the selection criteria and inviting them to participate in the study. The selection criteria included having experience in teaching online courses and using digital knowledge-sharing communities in their teaching. The informants were selected from several universities, including Tsinghua, Peking, Fudan, Nanjing, and Wuhan. Once the informants were recruited, they were given information about the study and provided their informed consent before being interviewed. The interviews were conducted either face-to-face or online, depending on the availability and preference of the informants. For the face-to-face interviews, the researcher arranged to meet with the informants at their universities or other convenient locations. For the online interviews, the researcher used video conferencing applications such as Skype or Zoom.

The interviews were semi-structured and focused on the research questions, with the aim of exploring the experiences and perceptions of the informants regarding the role and contribution of digital knowledge-sharing communities to online education. The interviews were audio-recorded with the consent of the informants and later transcribed verbatim for analysis. In addition to the audio recordings, the researcher also took field notes during the interviews to capture non-verbal cues and contextual information.

Throughout the data collection process, the privacy and confidentiality of the informants and the data collected were ensured. The informants were assured that their personal information and responses would be kept confidential and that their identities would not be revealed in any publications or reports resulting from the study. All data collected during the study was stored securely and only accessed by the research team. These measures were taken to protect the privacy and confidentiality of the informants and to ensure that they felt comfortable sharing their experiences and perceptions.

Data analysis

The data were analyzed using MAXQDA software (version 2022), following the recommendation of Creswell [ 24 ]. The unit of analysis in this study was the sentence, and the researcher focused on manifest content rather than latent content. The qualitative data were collected, analyzed, and reported in English. An inductive approach to content analysis was employed, as no preexisting theory or framework guided the generation of codes, categories, and themes [ 25 ]. Sutton and Austin [ 26 ] proposed a five-step process for qualitative data analysis, which was followed in this study. Firstly, the data were cleaned by addressing linguistic errors, ambiguities, inaccuracies, and repetitions. Secondly, the researcher read the data multiple times and developed open codes. Thirdly, the open codes were categorized as relevant axial codes or subtopics. Fourthly, the axial codes and subtopics were grouped under higher-order selective codes and general themes. Finally, a detailed report was prepared to document the completed data analysis process and its interpretation. The frequency of generated codes, topics, and categories was reported, and the results were presented visually using the MAXMAP features of MAXQDA. To ensure the credibility of the analytical process, a random subset of the generated codes was re-coded by a second coder with sufficient knowledge and experience in qualitative research. Specifically, 80 codes were created, and 20 of them (25%) were sent to the second coder. Upon coding, the second coder disagreed with the first coder on one of the 20 codes, corresponding to an intercoder agreement of 95%. The two coders discussed and resolved the disagreement by making the necessary changes, completing the qualitative data analysis process.
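To make the intercoder check concrete, the short Python sketch below reproduces the agreement arithmetic on hypothetical code labels; the helper functions (percent_agreement, cohen_kappa) and the example labels are illustrative assumptions rather than output from MAXQDA, which offers its own intercoder agreement feature.

```python
# Minimal sketch of the intercoder agreement arithmetic described above.
# The code labels below are hypothetical stand-ins for the 20 re-coded
# segments; MAXQDA (or any QDA tool) performs an equivalent calculation.
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Share of segments to which both coders assigned the same code."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohen_kappa(coder_a, coder_b):
    """Chance-corrected agreement (Cohen's kappa) for two coders."""
    n = len(coder_a)
    p_observed = percent_agreement(coder_a, coder_b)
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Twenty hypothetical segments; the two coders disagree on exactly one of them.
coder_1 = ["engagement"] * 8 + ["access"] * 6 + ["quality"] * 6
coder_2 = coder_1.copy()
coder_2[5] = "quality"  # the single disagreement

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.0%}")  # 95%
print(f"Cohen's kappa:     {cohen_kappa(coder_1, coder_2):.2f}")        # ~0.92
```

With one disagreement across 20 re-coded segments, simple percent agreement is 95%; a chance-corrected statistic such as Cohen’s kappa is often reported alongside it because it discounts agreement expected by chance.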

Research quality

In order to uphold the integrity of the research conducted in this study, the researcher implemented various methodologies, such as member checking, peer debriefing, and reflexivity. Member checking is a method used to validate and authenticate data by consulting with the individuals who provided the information. This process aids in ensuring that the data accurately represents their experiences and perspectives. Peer debriefing is a process that entails soliciting feedback and input from fellow researchers or subject matter experts in order to enhance the credibility and reliability of the research findings. Reflexivity encompasses the critical examination of the researcher’s biases, assumptions, and values, and their potential impact on the research process and outcomes. This practice serves to enhance the study’s validity and reliability.

Results

The findings of the study are presented in the order of the research questions.

Research question 1

The first research question aimed to explore the benefits and challenges of digital knowledge sharing in online education. Interviews with the informants were analyzed, and 7 benefits and 9 main challenges were extracted. Each theme is explained below and illustrated with quotations from the informants:

Benefits of DKS communities

Detailed analysis of the interviews with the informants revealed that DKS communities have several benefits and challenges, each of which is explained and exemplified as follows.

Improved student engagement

Improved student engagement is a benefit of digital knowledge sharing in online education. DKS communities provide a platform for students to interact and collaborate with each other, which can improve their engagement and motivation to learn. According to one informant, “DKS communities help keep students engaged in the course material, which can lead to better learning outcomes” (Informant 1). Another informant stated, “When students are able to participate in online discussions and share their own ideas and perspectives, they become more invested in the learning process” (Informant 2).

Enhanced learning outcomes

Using DKS communities in online education can enhance learning outcomes by providing students with access to a wider range of resources and perspectives. According to one informant, “DKS communities can help students develop critical thinking skills by exposing them to diverse perspectives and encouraging them to challenge their own assumptions” (Informant 3). Another informant stated, “Using digital tools like online discussion forums and collaborative documents can help students engage more deeply with the material and apply what they’ve learned in new ways” (Informant 4).

Flexibility and accessibility

Flexibility and accessibility are benefits of digital knowledge sharing in online education. Online education allows for greater flexibility in terms of when and where learning takes place, which can be especially beneficial for students who have other commitments such as work or family responsibilities. According to one informant, “Online courses provide flexibility for students who might not be able to attend traditional in-person classes due to other commitments” (Informant 5). Another informant stated, “DKS communities make education more accessible to students who might not have access to traditional educational resources, which can help level the playing field for students from different backgrounds” (Informant 6).

Improved student outcomes

Digital knowledge sharing in online education can lead to improved student outcomes such as better grades, higher retention rates, and increased satisfaction with the learning experience. According to one informant, “Digital knowledge sharing can lead to better learning outcomes because students are able to access a wider range of resources and engage with different perspectives” (Informant 13). Another informant stated, “Online education can be especially beneficial for students who might struggle with traditional classroom settings, as it can provide a more personalized and flexible learning experience” (Informant 14).

Increased collaboration

DKS communities can facilitate increased collaboration among students and instructors. According to one informant, “Online discussion forums and collaborative documents can allow students to work together on projects and assignments, which can enhance their understanding of the material and improve their communication skills” (Informant 15). Another informant stated, “Digital knowledge sharing can create a sense of community among students who might not have had the opportunity to interact with each other otherwise” (Informant 16).

Personalization of learning

Digital knowledge sharing in online education can enable a more personalized learning experience for students. According to one informant, “Online courses can allow students to work at their own pace and focus on the areas where they need the most help, which can lead to better learning outcomes” (Informant 11). Another informant stated, “Digital knowledge sharing can allow instructors to provide more targeted feedback to individual students, which can help them improve their understanding of the material” (Informant 18).

Greater feedback and assessment opportunities

DKS communities can provide greater opportunities for feedback and assessment. According to one informant, “Online quizzes and assessments can provide immediate feedback to students, which can help them identify areas where they need to improve” (Informant 19). Another informant stated, “Digital knowledge sharing can allow instructors to provide more frequent and detailed feedback to students, which can help them stay on track and improve their performance” (Informant 20).

Challenges of DKS communities

Despite these benefits, the informants also noted several challenges of DKS communities, which are explained and exemplified as follows.

Technical difficulties

Technical difficulties are a challenge of digital knowledge sharing in online education. Students and instructors may encounter technical difficulties such as internet connectivity issues or software glitches, which can disrupt the learning process and create frustration for students and instructors alike. According to one informant, “Technical difficulties can be a major barrier to effective online learning, and can lead to students feeling frustrated and disengaged from the course material” (Informant 7). Another informant stated, “Instructors need to be prepared to troubleshoot technical issues and provide support to students who are experiencing difficulties with the digital tools” (Informant 8).

Maintaining academic integrity

Maintaining academic integrity is a challenge of digital knowledge sharing in online education. DKS communities can create challenges around maintaining academic integrity, as students may be tempted to plagiarize or share answers with each other. According to one informant, “Ensuring academic integrity in DKS communities requires a concerted effort from instructors and students to communicate expectations and uphold ethical standards” (Informant 9). Another informant stated, “Instructors need to be vigilant about monitoring student behavior in DKS communities to ensure that academic dishonesty is not taking place” (Informant 10).

Digital divide

The digital divide is a challenge of digital knowledge sharing in online education. Not all students have equal access to technology and internet connectivity, which can create a digital divide in online education. According to one informant, “The digital divide can exacerbate existing inequalities in education and limit opportunities for students who lack access to the necessary technology and resources” (Informant 11). Another informant stated, “Institutions need to be mindful of the digital divide and take steps to ensure that all students have access to the technology and resources necessary to participate in online learning” (Informant 12).

Social isolation and lack of interaction

DKS communities can create a sense of social isolation and limit opportunities for interaction among students and instructors. According to one informant, “Online education can be a lonely experience for students who are used to traditional classroom settings, as they may miss out on the social interactions that are a key part of the learning experience” (Informant 17). Another informant stated, “Instructors need to be intentional about creating opportunities for interaction and collaboration among students in online courses” (Informant 20).

Time management and self-discipline

Digital knowledge sharing in online education can require strong time management and self-discipline skills, which can be challenging for some students. According to one informant, “Online courses require a high degree of self-discipline and time management skills, as students are often responsible for setting their own schedules and managing their own learning” (Informant 10). Another informant stated, “Instructors can help students develop these skills by providing clear expectations and deadlines, and by encouraging them to set goals and prioritize their workload” (Informant 4).

Limited access to hands-on learning experiences

Digital knowledge sharing in online education can limit opportunities for hands-on learning experiences, which can be a challenge for students in certain fields of study. According to one informant, “Online education may not be suitable for certain fields such as science and engineering, where hands-on learning experiences are an important part of the curriculum” (Informant 3). Another informant stated, “Instructors need to be creative in finding ways to provide hands-on learning experiences in online courses, such as through virtual simulations or online labs” (Informant 14).

Quality control

A further challenge of DKS communities is ensuring the quality and accuracy of the content being shared. According to one informant, “With so much information available, it can be difficult to sift through it all and ensure that learners are accessing high-quality and reliable resources” (Informant 14). Similarly, another informant stated, “In the era of DKS communities, the abundance of information poses a formidable challenge: the assurance of content quality and reliability. Amidst the vast digital landscape, the task of curating high-quality resources becomes imperative to safeguard the learning journey” (Informant 18).

Intellectual property

The issue of intellectual property is a complex one in the context of digital knowledge-sharing communities. According to one informant, “It is important to ensure that content is properly attributed and that copyright laws are respected, but this can be difficult to enforce in online environments” (Informant 16). In addition, informant 15 stated, “In the realm of DKS communities, the intricacies of intellectual property come to the forefront. Balancing the imperative of proper attribution and the adherence to copyright laws with the challenges of enforcement in the vast online domain presents a multifaceted dilemma.”

Cultural differences

Cultural differences can present a challenge for digital knowledge-sharing communities, particularly when it comes to language barriers. It is important to ensure that these communities are inclusive of all cultures and languages and that learners have access to resources that are culturally relevant to them. This finding is in line with quotations from informant 9 who stated, “cultural diversity emerges as a compelling challenge, often manifesting through the formidable barriers of language. The imperative lies in fostering inclusive platforms that transcend cultural boundaries, granting learners access to resources imbued with cultural relevance.” Informant 5 also stated, “the harmonious coexistence of diverse cultures within digital knowledge-sharing communities highlights the importance of dismantling language barriers. Ensuring inclusivity through culturally relevant resources stands as an essential endeavor to bridge the gap in global education.”

Research question 2

The second research question addressed the role of DKS communities in the development and innovation of online education. Interviews with the informants were analyzed and 7 themes were extracted, which are explained and exemplified as follows.

Facilitation of collaboration and innovation

DKS communities can facilitate collaboration and knowledge exchange among students and instructors, which can lead to the development of innovative approaches to teaching and learning in online education. According to one informant, “DKS communities can help to create a culture of collaboration and experimentation, where ideas can be shared and refined in real-time” (Informant 1). Another informant stated, “Collaboration is a key component of online education, and DKS communities can provide a platform for students and instructors to work together on projects and assignments, which can lead to the development of innovative solutions” (Informant 2).

Encouragement of innovation

DKS communities can encourage innovation in online education by providing a space for experimentation and exploration of new teaching methods and technologies. According to one informant, “DKS communities can encourage instructors to experiment with new technologies and teaching methods, which can lead to the development of more engaging and effective online courses” (Informant 3). Another informant stated, “Innovation in online education can lead to improved learning outcomes and greater student engagement, and DKS communities can play a key role in supporting this innovation” (Informant 4).

Dissemination of best practices

DKS communities can serve as a platform for disseminating knowledge about best practices and successful approaches to online education. According to one informant, “Sharing knowledge and experiences through digital platforms can help to build a community of practice around online education, and can lead to the development of new insights and approaches” (Informant 5). Another informant stated, “DKS communities can provide a way for instructors to learn from each other and to stay up-to-date on the latest trends and developments in online education” (Informant 6).

Support for lifelong learning

DKS communities can play a role in supporting lifelong learning by providing access to a wide range of learning resources and opportunities. According to one informant, “DKS communities can provide a platform for individuals to continue learning throughout their lives and can help to bridge the gap between formal education and informal learning” (Informant 9).

Enhancement of critical thinking

DKS communities can enhance critical thinking skills by exposing students to diverse perspectives and encouraging them to engage in discussions and debates. According to one informant, “DKS communities can help to develop critical thinking skills by exposing students to a variety of viewpoints and challenging them to think deeply about complex issues” (Informant 10).

Promotion of student agency

DKS communities can promote student agency by giving them more control over their learning and encouraging them to take an active role in shaping their educational experiences. According to one informant, “DKS communities can give students a sense of ownership over their learning, and can help to foster a sense of autonomy and independence” (Informant 14).

Development of digital literacy

DKS communities can help to develop digital literacy skills by providing opportunities for students to engage with digital tools and platforms. According to one informant, “DKS communities can help to develop digital literacy skills by giving students the opportunity to interact with a variety of digital tools and platforms, and by encouraging them to experiment and explore” (Informant 18).

Research question 3

The third research question aimed to explore how DKS communities impact student learning and engagement in online education. The findings revealed that DKS communities can have a significant impact on student learning and engagement in online education. Some of the ways in which DKS communities can impact student learning and engagement are described below:

Increased access to resources

DKS communities can provide students with access to a wide range of learning resources, including articles, videos, podcasts, and other multimedia. This can help to increase the diversity of perspectives and ideas that students are exposed to, enhancing the quality and effectiveness of their learning. For instance, informant 7 stated, “Being part of the DKS community has given me access to a wide range of resources that I wouldn’t have found on my own. It has helped me to deepen my understanding of the course material and to see things from different perspectives.”

Peer support and collaboration

DKS communities can provide opportunities for peer support and collaboration, which can help to promote student engagement and motivation. Students can ask questions, share insights, and work together on projects and assignments, fostering a sense of community and connection. This finding can be supported by a quotation from informant 2 who stated, “The DKS community has been a great way to meet new people and to work together on projects. It’s helped me to feel more connected to the course and to stay motivated.”

Diverse perspectives

DKS communities can expose students to diverse perspectives and ideas, which can help to broaden their understanding and deepen their critical thinking skills. Students can engage in discussions and debates with others who have different backgrounds and experiences, leading to a more well-rounded learning experience. The finding is supported by a quotation from informant 7, who stated, “The DKS community has exposed me to a wide range of perspectives and ideas that I never would have encountered otherwise. It’s helped me to broaden my understanding of the course material and to think more critically about the issues.”

Active learning

DKS communities can promote active learning by providing opportunities for students to engage with course material in meaningful ways. Students can apply their learning to real-world scenarios, participate in simulations or case studies, and engage in hands-on activities that promote deeper understanding of course concepts. As an example, informant 5 stated, “The DKS community has been a great way to engage with the course material in a more meaningful way. It’s helped me to stay motivated and to feel like I’m making progress.”

Flexibility and personalization

DKS communities can provide flexibility and personalization in the online learning experience, giving students more control over their learning and allowing them to tailor their experience to their individual needs and preferences. This can help to increase engagement and motivation, as well as promote a sense of ownership and responsibility for learning. As an example, informant 9 stated, “The DKS community has been a great way to personalize my learning experience. I’ve been able to explore topics that interest me and to find resources that are relevant to my goals.”

Research question 4

Research question 4 aimed to explore the factors that contribute to the effectiveness of DKS communities in promoting student success. The thematic analysis of the interviews with the informants showed that five factors contribute to the effectiveness of DKS communities, each of which is explained as follows:

Active participation by students

The data analysis revealed that active participation by students is a key factor that contributes to the effectiveness of DKS communities in promoting student success. Students who actively participate in these communities tend to have a better understanding of the course material and are more engaged in their learning. For instance, informant 8 argued, “Students who participate in the community are more likely to have a better understanding of the course material and are more engaged in their learning.”

Clear expectations and guidelines

The analysis also found that establishing clear expectations and guidelines for DKS community participation can help students understand what is expected of them and how they can contribute to the community. This can lead to greater engagement and participation by students, which in turn can contribute to their success. One of the informants stated, “I give clear guidelines and expectations to my students, and I encourage them to interact with each other and share their knowledge.”

Supportive and collaborative learning environment

The findings suggest that a supportive and collaborative learning environment is another factor that contributes to the effectiveness of DKS communities in promoting student success. Students who feel supported and encouraged by their peers and instructors in these communities tend to be more motivated and engaged in their learning. For instance, informant 5 stated, “The DKS community is a place where students can learn from each other, share their experiences, and support each other.”

Relevance and usefulness of the DKS community

The analysis also revealed that the relevance and usefulness of the DKS community are important in promoting student success. Students are more likely to participate in these communities when they see the relevance and usefulness of the community in relation to their course goals and learning outcomes. The following quotation exemplifies this theme: “The DKS community is a valuable resource for students to ask questions, share knowledge, and learn from each other” (Informant 3).

Flexibility and adaptability of the DKS community

Finally, the data analysis showed that the flexibility and adaptability of the DKS community is another important factor in promoting student success. DKS communities that are flexible and adaptable to the changing needs and preferences of students tend to be more effective in promoting student success. For instance, informant 19 stated, “I try to be flexible and adapt the community to the changing needs and preferences of my students.”

Discussion

This multi-objective qualitative study aimed to explore the impact and role of DKS communities in the development and innovation of online higher education. A qualitative research method was used, and the interviews with 20 informants were analyzed. With regard to the first objective, the informants mentioned 7 benefits and 9 challenges of using DKS communities in the development and innovation of online education. The findings highlight the potential benefits and challenges of using DKS communities for developing and innovating online education, as well as the ways in which these communities can contribute to higher education sustainability. Recent research has supported these findings and provided further insights into the benefits and challenges of using DKS communities in education. For example, a study by Ansari et al. [ 27 ] found that DKS communities can enhance collaborative learning and knowledge sharing among students, leading to better learning outcomes. Similarly, a study by Cheng et al. [ 28 ] found that DKS communities can improve teacher collaboration and professional development, leading to more effective teaching practices.

The findings related to the challenges are also consistent with the findings of other researchers. For example, a study by Nugroho et al. [ 29 ] found that quality control remains a challenge for digital knowledge-sharing communities, with a need for better tools and strategies for evaluating the quality of shared resources. Additionally, a study by Zhao et al. [ 30 ] found that the digital divide remains a major barrier to accessing digital knowledge-sharing communities, particularly for learners in rural areas or with limited access to technology. To address these challenges, recent research has proposed various solutions and strategies. For example, a study by Thi Minh Ly et al. [ 31 ] proposed a model for bridging the digital divide, which involves providing learners with access to digital infrastructure and training in digital literacy skills. Similarly, a study by Shawar et al. [ 32 ] proposed a framework for quality assurance in digital knowledge-sharing communities, which includes guidelines for content evaluation and quality control.

With regard to the second objective, the analysis of the interviews with informants in this study revealed seven themes related to the role of DKS communities in the development and innovation of online education. These themes include facilitation of collaboration and innovation, encouragement of innovation, dissemination of best practices, support for lifelong learning, enhancement of critical thinking, promotion of student agency, and development of digital literacy. The findings of this study are consistent with previous studies that have examined the role of DKS communities in the development and innovation of online education. For example, Dabbagh and Kitsantas [ 33 ] found that personal learning environments, which are similar to digital knowledge-sharing communities, can facilitate collaboration and knowledge exchange among learners and can enhance critical thinking and digital literacy skills. Similarly, Barab, et al. [ 34 ] found that online communities of practice can support professional development and knowledge sharing among educators.

The finding that DKS communities can encourage innovation is also consistent with previous studies. Siemens and Tittenberger [ 35 ] argued that emerging technologies, such as digital knowledge-sharing communities, can support innovation in education by providing a platform for experimentation and exploration of new teaching methods and technologies. Moreover, the finding that DKS communities can disseminate knowledge about best practices and successful approaches to online education is also consistent with previous studies. Palloff and Pratt [ 21 ] argued that online communities of practice can serve as a platform for sharing knowledge and experiences, leading to the development of new insights and approaches. Similarly, Garrison, et al. [ 36 ] found that computer conferencing can support knowledge sharing and dissemination among learners and educators. Furthermore, the finding that DKS communities can support lifelong learning is also consistent with previous studies. Warschauer and Matuchniak [ 37 ] argued that digital technologies can support lifelong learning by providing access to a wide range of learning resources and opportunities.

In addition, the finding that DKS communities can enhance critical thinking skills is consistent with previous studies. Hrastinski [ 38 ] argued that online learning can support critical thinking skills by providing opportunities for collaborative learning and discussion among learners. Similarly, the findings are consistent with Wang et al. [ 39 ], who found that social media, which are similar to digital knowledge-sharing communities, can promote student agency by giving students more control over their learning and encouraging them to take an active role in shaping their educational experiences. Finally, the finding that DKS communities can develop digital literacy skills is also aligned with previous studies. Siemens and Tittenberger [ 35 ] and Garrison et al. [ 36 ] argued that emerging technologies, such as digital knowledge-sharing communities, can support the development of digital literacy skills by providing opportunities for learners to engage with digital tools and platforms.

The third objective was to delve into the impact of DKS on students’ learning in higher education. The findings of this study are consistent with previous studies that have examined the impact of DKS communities on student learning and engagement in online education. The first potential impact, increased access to resources, is supported by the findings of Palloff and Pratt [ 21 ] and Garrison et al. [ 36 ], who argued that digital technologies can support lifelong learning by providing access to a wide range of learning resources and opportunities. The second potential impact, peer support and collaboration, is supported by the findings of Warschauer and Matuchniak [ 37 ], who found that online communities of practice can support professional development and knowledge sharing among educators. The third potential impact, diverse perspectives, is supported by the findings of Hrastinski [ 38 ], who argued that online learning can support critical thinking skills by providing opportunities for collaborative learning and discussion among learners. The fourth potential impact, active learning, is supported by the findings of Dabbagh and Kitsantas [ 33 ], who found that personal learning environments can enhance critical thinking and digital literacy skills. The fifth potential impact, flexibility and personalization, is supported by the findings of Wang et al. [ 39 ], who found that social media can promote student agency by giving students more control over their learning and encouraging them to take an active role in shaping their educational experiences.

With regard to the findings for the last research question, it can be argued that the findings of this study are consistent with previous research on the importance of student participation in online learning communities. A study by Warschauer and Matuchniak [ 37 ] found that students who actively participated in online learning communities had higher levels of engagement and were more likely to succeed in their courses. Similarly, a study by Rovai and Jordan [ 40 ] found that students who were highly involved in online learning communities had higher levels of satisfaction and academic success.

The next finding was the importance of clear expectations and guidelines for online learning communities, which has also been identified in previous research. For example, Richardson [ 41 ] found that providing clear guidelines and expectations for online learning communities can help students feel more comfortable and engaged in the learning process. Additionally, Palloff and Pratt [ 21 ] found that clear guidelines and expectations can help students understand how to participate in online learning communities and contribute to their success.

The significance of a supportive and collaborative learning environment has also been supported by previous research. A study by Garrison et al. [ 36 ] found that students who felt supported and encouraged by their peers and instructors in online learning communities were more likely to be engaged and successful in their courses. Additionally, a study by Shea et al. [ 42 ] found that a sense of community and support was a key factor in promoting student success in online learning environments. Moreover, the relevance and usefulness of online learning communities have also been identified as important factors in promoting student success. For example, a study by Ertl [ 43 ] found that students who perceived online learning communities as valuable were more likely to engage in collaborative learning and be successful in their courses, and Swan and Shih [ 44 ] found that students who perceived online learning communities as relevant and useful were more likely to participate and be successful in their courses. Finally, the importance of flexibility and adaptability in online learning communities has also been supported by previous research. A study by Shea et al. [ 42 ] found that flexibility and adaptability were important factors in promoting student success in online learning environments. Additionally, a study by Garrison et al. [ 36 ] found that online learning communities that were flexible and adaptable to the needs and preferences of students were more effective in promoting student success.

In conclusion, this multi-objective qualitative study has shed light on the impact and role of digital knowledge-sharing (DKS) communities in the development and innovation of online higher education. The study identified several benefits and challenges of using DKS communities in online education, as well as the ways in which these communities can contribute to higher education sustainability. The findings also revealed the role of DKS communities in facilitating collaboration and innovation, encouraging innovation, disseminating best practices, supporting lifelong learning, enhancing critical thinking, promoting student agency, and developing digital literacy. Moreover, the study highlighted the potential impact of DKS communities on student learning, including increased access to resources, peer support and collaboration, diverse perspectives, active learning, flexibility, and personalization. The study also identified important factors that promote student success in online learning communities, such as clear expectations and guidelines, supportive and collaborative learning environments, relevance and usefulness, and flexibility and adaptability. These findings have important implications for the development of online education and the use of DKS communities in higher education.

Despite the valuable insights provided by this multi-objective qualitative study, there are limitations that must be considered. First, the sample size of 20 informants may not be representative of the larger population, limiting the generalizability of the findings. Further studies with larger sample sizes are needed to confirm the results and provide more comprehensive insights into the impact and role of DKS communities in online higher education. Second, the use of a qualitative research approach may introduce researcher bias and limit the generalizability of the findings. Combining qualitative and quantitative methods could provide a more comprehensive understanding of the impact of DKS communities on online education. Finally, the study was conducted in a specific context, and the findings may not be applicable to other contexts. Future studies should consider contextual factors such as cultural differences and institutional policies to provide a more comprehensive understanding of the impact of DKS communities on online education. To address these limitations, future studies could employ quantitative research designs to provide more objective and generalizable results. Additionally, longitudinal studies could investigate the long-term impact of DKS communities on online education, providing insights into the sustainability of these communities. Comparative studies could also be conducted to compare the impact of DKS communities with other models of online education, identifying the strengths and weaknesses of DKS communities and providing insights into how to optimize their impact on online education. By addressing these limitations and exploring these suggestions, future studies can further advance our understanding of the impact and role of DKS communities in online higher education, informing best practices and contributing to the ongoing development and innovation of online education.

Implications

First and foremost, it underscores the transformative potential of DKS communities in enhancing online education through the facilitation of collaboration, dissemination of best practices, and encouragement of innovation. These communities are poised to act as powerful catalysts for the continuous improvement of digital education. Secondly, the study places a spotlight on the paramount importance of addressing challenges such as quality control and bridging the digital divide to safeguard the long-term sustainability of DKS communities. Effective proactive strategies and purpose-built tools are imperative to unlock their full potential. Thirdly, DKS communities emerge as champions of lifelong learning, expanding horizons by broadening access to a wealth of diverse resources. Institutions stand to harness these communities to cater to a more extensive demographic of learners, thereby nurturing a culture of lifelong education. Fourthly, within the realm of online education, DKS communities assume a pivotal role in elevating critical thinking skills among students while fostering a sense of empowerment. The creation of environments that stimulate critical thought becomes indispensable in this context. Fifthly, the study underscores the necessity of establishing clear guidelines and expectations within DKS communities, serving as the linchpin for maximizing student engagement and success. The provision of structured and meticulously defined online learning environments emerges as the cornerstone of this endeavor.

Availability of data and materials

The data would be available upon request from the corresponding author (email:[email protected]).


Acknowledgements

The authors would like to thank all participants of the study.

Conflicts of interest

The authors declare that they have no conflicts of interest.

Funding is not applicable to this study.

Author information

Authors and Affiliations

School of Economics and Management, Xiamen University of Technology, Xiamen, 361000, China

Zhejiang Industry & Trade Vocational College, Wenzhou, 325000, China

School of Marxism, Xiamen University of Technology, Xiamen, 361000, China

Lirong Huang

School of Marxism, Huaqiao University, Xiamen, 361000, China


Contributions

Xi Huang drafted the manuscript. Tao Jiang approved the draft. Xi Huang and Hongwei Li collected data and completed the draft. Xi Huang, Hongwei Li, Lirong Huang, and Tao Jiang read the manuscript and verified the content and findings.

Corresponding author

Correspondence to Tao Jiang .

Ethics declarations

Ethics approval and consent to participate

1. Ethical approval

The study was reviewed and approved by the ethics committee of the School of Marxism, Xiamen University of Technology. The approval committee confirmed that all research was performed in accordance with relevant guidelines/regulations and in accordance with the Declaration of Helsinki.

2. Informed consent

Informed consent was obtained from all participants; they signed the consent form electronically and agreed to participate in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Appendix.

Interview checklist.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Huang, X., Li, H., Huang, L. et al. Research on the development and innovation of online education based on digital knowledge sharing community. BMC Psychol 11 , 295 (2023). https://doi.org/10.1186/s40359-023-01337-6


Received : 02 August 2023

Accepted : 18 September 2023

Published : 28 September 2023

DOI : https://doi.org/10.1186/s40359-023-01337-6


Keywords

  • Digital knowledge-sharing communities
  • Online education
  • Collaboration
  • Student success
  • Thematic analysis
  • Qualitative research

BMC Psychology

ISSN: 2050-7283


  • Research article
  • Open access
  • Published: 02 October 2020

Development of a new model on utilizing online learning platforms to improve students’ academic achievements and satisfaction

  • Hassan Abuhassna   ORCID: orcid.org/0000-0002-5774-3652 1 ,
  • Waleed Mugahed Al-Rahmi 1 ,
  • Noraffandy Yahya 1 ,
  • Megat Aman Zahiri Megat Zakaria 1 ,
  • Azlina Bt. Mohd Kosnin 1 &
  • Mohamad Darwish 2  

International Journal of Educational Technology in Higher Education volume  17 , Article number:  38 ( 2020 ) Cite this article

181k Accesses

112 Citations

7 Altmetric

Metrics details

This research aims to explore and investigate potential factors influencing students' academic achievements and satisfaction with using online learning platforms. The study was constructed based on Transactional Distance Theory (TDT) and Bloom's Taxonomy Theory (BTT) and was conducted on 243 students using online learning platforms in higher education. A quantitative research method was utilized. The research model illustrates eleven factors related to using online learning platforms to improve students' academic achievements and satisfaction. The findings showed that students' background, experience, collaboration, interaction, and autonomy positively affected their satisfaction. Moreover, the effects of students' application, remembering, understanding, analyzing, and satisfaction were positively aligned with their academic achievements. Consequently, the empirical findings present strong support for the integrative association between the TDT and BTT theories in relation to using online learning platforms to improve students' academic achievements and satisfaction, which could help decision makers in universities, colleges, and other higher education institutions to plan, evaluate, and implement online learning platforms.

Introduction

Over the previous two decades, higher education organizations have offered full courses online as an integral part of their curricula and have encouraged students to complete them. At the same time, the number of students who do not participate in any online course has continued to drop over the past few years, so it is fair to say that online learning has become an established educational platform (Allen, Seaman, Poulin, & Straut, 2016). Online courses increasingly connect social networking components with expert content, because online resources grow on a daily basis. Such courses depend on the active participation of a significant number of learners who take part independently, in accordance with their learning objectives, skills, and previous background and experience (McAuley, Stewart, Siemens, & Cormier, 2010). Nevertheless, learners differ in their previous background and experience, as well as in their learning techniques, and these differences clearly influence their results and achievement in online courses (Kauffman, 2015). Consequently, despite the evolution of online learning, it may not be appropriate for every learner (Bouhnik & Carmi, 2013). Moreover, while the use of online learning in the academic world has grown rapidly, not enough is known about learners' previous background and experience in online learning. Until recently, research concentrated on particular characteristics of learners' experiences and beliefs, for instance collaboration with their instructor, online course quality, or studying with a certain learning management system (LMS) (Alexander & Golja, 2007; Lester & King, 2009). Typically, only a limited number of courses or a single institution was investigated (Coates, James, & Baldwin, 2005; Lee, Yoon, & Lee, 2009), and few studies examined larger samples across one or more institutions (Alexander & Golja, 2007). Additionally, there is a shortage of research comparing learners' previous background and experience in face-to-face and online learning settings, e.g., (Bliuc, Goodyear, & Ellis, 2007). Developing learners' background, experience, and skills is recognized as a major administrative advantage of online learning.

Similarly, learners' satisfaction and academic achievement in online learning have attracted considerable attention from scholars, who have employed several theoretical models to evaluate them (Abuhassna, Megat, Yahaya, Azlina, & Al-rahmi, 2020; Abuhassna & Yahaya, 2018; Al-Rahmi, Othman, & Yusuf, 2015a; Al-Rahmi, Othman, & Yusuf, 2015b). The present study highlights the effects of online learning platforms on students' satisfaction in relation to their background and prior experience with such platforms, in order to identify which learners are likely to be satisfied with an online course. Furthermore, this research explores the effects of the transactional distance theory (TDT) constructs of student collaboration, student-instructor dialogue or communication, and student autonomy on satisfaction. Accordingly, the study also investigates students' academic achievements within online platforms, using Bloom's theory to measure achievement through four main components: understanding, remembering, applying, and analyzing. This study could have a significant influence on online course design and development. It may influence not only academic online courses but also other educational organizations, given that several organizations offer training courses and solutions online. Both researchers and instructors will be able to use and elaborate on the preliminary model developed in this research regarding the effects of online platforms on students' satisfaction and academic achievements. The advantages of online learning and its applications were discussed in earlier related literature (Abuhassna et al., 2020; Abuhassna & Yahaya, 2018; Al-Rahmi et al., 2018). However, despite the growing usage of online platforms, their adoption remains limited, which is a problem in itself (Abuhassna & Yahaya, 2018; Al-Rahmi et al., 2018). Consequently, the research problem is that a model needs to be created to locate significant evidence, based on data about students' background, experiences, and interactions within online learning environments, that influences their academic performance and satisfaction. The developed model is intended to guide instructors and decision makers in the online education industry in using online platforms to improve the student learning experience. Bearing these conditions in mind, our major question was: how can we enhance students' online learning experience in relation to both their academic achievements and satisfaction?

Research questions

The major research question anticipated to be answered is:

How can we enhance students' online learning experience in relation to both their academic achievements and satisfaction?

To answer this question, several sub-questions were examined, as stated below:

Q1: What is the relationship between students’ background and students’ satisfaction?

Q2: What is the relationship between students’ experience and students’ satisfaction?

Q3: What is the relationship between students’ collaboration and students’ satisfaction?

Q4: What is the relationship between students’ interaction and students’ satisfaction?

Q5: What is the relationship between students’ autonomy and students’ satisfaction?

Q6: What is the relationship between students’ satisfaction and students’ academic achievements?

Q7: What is the relationship between students’ application and students’ academic achievements?

Q8: What is the relationship between students’ remembering and students’ academic achievements?

Q9: What is the relationship between students’ understanding and students’ academic achievements?

Q10: What is the relationship between students’ analyzing and students’ academic achievements?

Research theory and hypotheses development

When designing web courses within online learning, educators face several decisions and considerations that affect how students experience instruction, how they construct and process knowledge, how satisfied they are with the experience, and how web-based courses can enhance their academic achievements. In this study, we construct our theoretical framework on Moore's transactional distance theory (TDT) to measure students' satisfaction, and on components of Bloom's theory to measure students' academic achievements. Although the origins of TDT can be traced to the work of Dewey, it is Michael Moore who is identified as the originator of the theory, which first appeared in 1972. In developing the theory, he acknowledged three main components of TDT that serve as the basis for much of the research on distance learning (DL). Bloom's Taxonomy was established in 1956 under the direction of educational psychologist Benjamin Bloom to measure students' academic achievement (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956). TDT was selected for this study because the term transactional distance indicates the separation between the student and the instructor, with learning understood to happen through the learner's interaction with the environment. The theory considers the role of three elements (student autonomy, dialogue, and course structure), and these elements help in investigating students' satisfaction. Moore's (1990) notion of 'transactional distance' captures the distance that occurs in all educational relationships; this distance is mainly determined by the amount of dialogue between the student and the teacher and by the amount of structure in the course design. This serves the main goal of the study, namely to enhance students' online learning experience in relation to their satisfaction. Bloom's theory was selected in addition to TDT to address students' achievements. In conclusion, both theories were used to develop the hypotheses of this study. See Fig. 1.

Fig. 1. Research Model and Hypotheses

Hypothesis of the study

H1: There is a significant relationship between students’ background and students’ satisfaction.

H2: There is a significant relationship between students’ experience and students’ satisfaction.

H3: There is a significant relationship between students’ collaboration and students’ satisfaction.

H4: There is a significant relationship between students’ interaction and students’ satisfaction.

H5: There is a significant relationship between students’ autonomy and students’ satisfaction.

H6: There is a significant relationship between students’ satisfaction and students’ academic achievements.

H7: There is a significant relationship between students’ application and students’ academic achievements.

H8: There is a significant relationship between students’ remembering and students’ academic achievements.

H9: There is a significant relationship between students’ understanding and students’ academic achievements.

H10: There is a significant relationship between students’ analyzing and students’ academic achievements.

Hypothesis developments and literature review

This section discusses the hypotheses of the study and relates each hypothesis to the related studies in the literature.

Students background toward online platforms

Students' background regarding online platforms in this study refers to their readiness and willingness to use and adapt to different online platforms when provided with the needed support and assistance. Students' background towards online learning is a crucial component of this process, as prior research has revealed implementation issues such as a deficiency of qualified lecturers, infrastructure, and facilities, as well as limited student readiness and resistance to accepting online learning platforms and Learning Management System (LMS) platforms as educational tools (Azhari & Ming, 2015). Even so, student demand has continued to increase and spread to global audiences because of the exceptional functionality, flexibility, and accessibility of these platforms (Azhari & Ming, 2015). There have also been persistent concerns regarding the quality of online learning compared with traditional learning settings. Paechter and Maier (2010) and Panyajamorn, Suthathip, Kohda, Chongphaisal, and Supnithi (2018) discovered that Austrian learners continue to prefer traditional learning environments for communication purposes and for the preservation of interpersonal relations. Moreover, Lau and Shaikh (2012) found that Malaysian learners' internet efficiency and computer skills, together with personal demographics such as gender, background, level of study, and financial income, lead to significant differences in their readiness for online learning platforms. Abuhassna and Yahaya (2018) claimed that current educational technologies play an essential role in providing a full online learning experience that is close to a face-to-face class despite the physical separation of students from their educator and from other students. Online learning platforms lend themselves to a less hierarchical approach to education, fulfilling the learning needs of individuals who do not approach new information in a linear or systematic manner, and they are also well suited to autonomous students (Abuhassna et al., 2020; Abuhassna & Yahaya, 2018; Paechter & Maier, 2010; Panyajamorn et al., 2018).

Students experience toward online platforms

Students' experience in the current research indicates that learners should have prior experience of using online learning platforms in their educational settings. Such experience offers several advantages to students and their instructors in strengthening the learning experience, especially for isolated learners (Jaques & Salmon, 2007; Lau & Shaikh, 2012; Salmon, 2011; Salmon, 2014). Even when students recognize the advantages of technology for supporting their learning, difficulties may occur because of the limits of their technical capabilities and their lack of prior experience with the functionality of the software itself. As learner experience and feedback from several online sessions over the years have demonstrated, this can frequently become a source of frustration for both learners and their instructors, because typically uncomplicated tasks, for instance watching a video or uploading a document, become progressively complicated for students who have no such prior experience. Furthermore, in assessed activities such as online group presentations, the relatively limited ability to communicate face to face and to rely on non-verbal signals and the audience's body language can be a discouraging factor. Nonetheless, being able to participate with colleagues in online sessions that are sometimes non-visual, for instance in a teleconference format, is an increasingly important skill in the modern workplace, which affirms the importance of concise, clear, and intensive interaction skills (Salmon, 2011; Salmon, 2014).

Student collaboration among themselves in online platforms

Students' collaboration in the current study refers to the communication and feedback among students in online platforms. To refine and measure transactional distance, Rabinovich (2009) created a survey instrument for a higher education setting and sent it to 235 students enrolled in a synchronous web-based graduate class in business, asking about transactional distance and collaboration (Rabinovich, 2009). The synchronous learning environment was described as a place where "live on-campus classes are conveyed simultaneously to both in-class students on campus and remote students on the Web who join via virtual classroom Web collaboration software" (Rabinovich, 2009). This virtual classroom software is similar to the two tools described by Falloon (2011) and Mathieson (2012) in that it allows students to interact with the educator and fellow students in real time (Rabinovich, 2009). Moreover, Kassandrinou, Angelaki, and Mavroidis (2014) reported that the instructor plays a crucial role as a facilitator of interaction and communication, being tasked with fostering, reassuring, and assisting communication and interaction among students. Face-to-face tutorials have also proven to be a valuable opportunity for many students to exchange ideas and discuss the content of the course and its related concerns (Vasala & Andreadou, 2010).

Students’ interactions with the instructor in online platforms

Purposeful interaction (or dialogue) in the current study describes learner-learner and learner-instructor communication that is designed to improve students' understanding. According to Shearer (2010), communication should also be constructive, in that it builds upon ideas and work from others and assists others in learning. Moore (1972) affirmed that learners must realize and value the importance of learning interactions as a vital part of the learning process. In a manner similar to Benson and Samarawickrema's (2009) study of teacher preparatory students, Falloon (2011) investigated the use of digital tools in a case study of a teacher education program in New Zealand. Mathieson (2012) also explored the role dialogue plays in digital learning environments; she created a digital survey that examined students' perceptions of audio-visual feedback in courses that use screen-casting tools. Moore (2007) discusses autonomous learners searching for courses that do not stress structure and dialogue in order to explain and enhance their learning progression. Several studies (Abuhassna et al., 2020; Abuhassna & Yahaya, 2018; Al-Rahmi et al., 2015b; Al-Rahmi, Othman, & Yusuf, 2015d; Furnborough, 2012) concluded that the feeling of cooperation that learners share with their fellow students affects their reactions concerning collaboration with their peers.

Student autonomy in online platforms

Student autonomy in the current study refers to students' independence and motivation towards learning. The learner is at the center of the learning process, together with their expectations and requirements, with each person treated as a unique individual who investigates their own capacities and possibilities. Great importance is therefore attributed to autonomy in DL environments, since the instructional options offered in distance education empower students towards learning autonomy (Massimo, 2014). In this respect, the connection between student autonomy and specific parts of the learning process is at the center of attention. Madjar, Nave, and Hen (2013) concluded that an autonomy-supportive environment leads learners to adopt more goal-guided learning, resulting in greater learning achievement. This is why autonomy is desired in online settings, both for individual development and for greater achievement in academic environments. The researchers also indicate that while autonomy support promotes goal-guided outcomes, some educator practices mainly lead to goals that cannot adapt; an autonomy-supportive learning process therefore needs to be designed with affective elements in mind as well. Furthermore, Stroet, Opdenakker, and Minnaert (2013) systematically reviewed 71 empirical studies on the effects of autonomy-supportive teaching on learner motivation and discovered a clear positive correlation. Similar to attribution theory, the relationship between learner control and motivation involves the possibility of learners adjusting their own motivations; for example, learners may be able to change self-determined extrinsic motivation into intrinsic motivation. However, Jacobs, Renandya, and Power (2016) further indicated that learners will not reach the same level of autonomy without reviewing their insights into autonomy, reflecting on their learning experiences, sharing these experiences and reflections with other learners, and recognizing the elements influencing these processes and the learning process as a whole.

Student satisfaction in online platforms

Student satisfaction in the current study reflects the fact that many factors play a role in determining a learner's satisfaction, such as the faculty, the institution, individual learner elements, interaction and communication elements, course elements, and the learning environment. Discussion of these elements also relates to the role of the instructor, the learner's attitude, social presence, and the usefulness and effectiveness of online platforms. Yu (2015) found that student satisfaction was positively associated with interaction, self-efficacy, and self-regulation, without significant gender variations. Choy and Quek (2016) examined the relationships between learners' perceived teaching, social, and cognitive elements; in addition, satisfaction, academic performance, and achievement can be measured using a revised form of their survey instrument. Kirmizi (2014) studied the connections between six psychosocial scales: personal relevance, educator assistance, student interaction and collaboration, student autonomy, authentic learning, and active learning. A moderate level of correlation was found between these variables; the predictors of learner satisfaction were educator support, personal relevance, and authentic learning, while authentic learning was the only predictor of academic success. Bordelon (2013) determined and described a positive correlation between achievement and satisfaction, and suggested that the reasons behind such conclusions could be cultural variations in learner satisfaction that point to learning access (Zhu, 2012). Scholars in the field of student satisfaction emphasize the delivery and operational side of the student's experience in the teaching process (Al-Rahmi, Othman, & Yusuf, 2015e).

Students’ academic achievements in online platforms

Students' achievements in this study refer to Bloom's four main components of achievement: remembering, understanding, applying, and analyzing. Findings in a study conducted by Whitmer (2013) revealed relationships between students' academic achievement and LMS usage, showing a highly systematic association (p < .0000) for every variable. These variables explained 12% and 23% of the variation in final course marks, which indicates that learners who used the LMS more often obtained higher marks than the others; correlation techniques examined these variables separately to ascertain their association with the final mark. Moreover, it is not the technology itself but the educational methods with which the technology is used that create a change in learners' achievement. The instruments used are significant in identifying the impact of technology, but it is the implementation of those instruments in specific activities and for certain purposes that indicates whether or not they are effective. In contrast, a study conducted by Barkand (2017) revealed that LMS tools were not considered to have an effect on semester final grades when categorized by school year. In his study, semester final grades were a measure of student achievement, which has subjective elements; to account for these, the study also included objective post-test scores to evaluate student learning. Additionally, in this study we refer to Bloom's Taxonomy, established in 1956 under the direction of educational psychologist Benjamin Bloom, to measure students' academic achievement (Bloom et al., 1956). We selected four domains of Bloom's Taxonomy to achieve the study objectives: application, which refers to using a concept in a new context, for instance applying what has been learned inside the classroom to different circumstances; remembering, which refers to recalling or retrieving previously learned knowledge; understanding, which refers to realizing the meaning and clarifying problem instructions; and analyzing, which refers to separating concepts or material into parts in such a way that its structure can be distinguished and understood, distinguishing between inferences and facts.

Students’ application

Applying involves "carrying out or using a procedure through executing or implementing" (Anderson & Krathwohl, 2001). Applying in this study refers to the student's ability to use online platforms, such as knowing how to log in, end a session, download materials, and access links and videos. Students can exchange information about a specific topic on online platforms such as Moodle, Google Documents, and wikis, and apply knowledge to create and participate in these platforms.

Students’ remembering

Remembering is defined as "retrieving, recognizing, and recalling relevant knowledge from long-term memory" (Anderson & Krathwohl, 2001). In this study, remembering refers to the ability to organize and remember online resources so as to easily find information on the internet. Moreover, students can easily cooperate with their colleagues and educator, contributing to the educational process and justifying their study procedure. In their review of Bloom's taxonomy, Anderson and Krathwohl (2001) recognized creating, evaluating, and analyzing as the higher learning levels, and applying, understanding, and remembering as the lower learning levels.

Students’ understanding

Understanding involves "constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining" (Anderson & Krathwohl, 2001). In this study, understanding refers to grasping a subject and then putting forward new suggestions about online settings, for instance understanding how e-learning or an LMS works. For example, students use online platforms to review concepts, courses, and prominent resources that are used inside the classroom environment.

Students’ analyzing

Analyzing includes "breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing" (Anderson & Krathwohl, 2001). Analyzing here refers to the student's ability to connect, discuss, mark up, and evaluate the information received within one workspace. Solomon and Schrum (2010) claim that educators have started employing online platforms for a range of activities as the platforms have become more familiar and learners have found ways to benefit from them. Generally, the purpose is to make visible the kinds of development, innovation, and additional activities that learners usually do independently. Such instruments have also given instructors ways to encourage and promote genuine cooperation in their learners' project development (Solomon & Schrum, 2010).

Research methodology

A quantitative approach was implemented in this study to provide inclusive insight into students' online learning experience and how to enhance both their satisfaction and academic achievements, using a questionnaire. Two experts were consulted to evaluate the questionnaire's content. Before data collection, permission for the research was obtained from Universiti Teknologi Malaysia (UTM). Regarding the sampling and population, the research was conducted among undergraduate learners who have been online learning users. Learners who received the questionnaires manually were requested to fill in their details and then record their own assessments of online learning platforms and their effects on academic achievement. For data analysis, the questionnaire data were analyzed using the Statistical Package for the Social Sciences (SPSS) and, specifically, Structural Equation Modeling (SEM) in AMOS, which was employed as the primary data analysis tool. The SEM-AMOS process involves two main phases: first, evaluating construct validity, convergent validity, and discriminant validity of the measurements; and second, analyzing the structural model. These two phases followed the recommendations of (Bagozzi, Yi, & Nassen, 1998; Hair, Sarstedt, Ringle, & Mena, 2012a, 2012b).
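
As a reading aid, the two-phase procedure described above (validate the measurement model first, then estimate the structural model) can be sketched in code. The study itself used SPSS and AMOS 23; the snippet below is only an illustration of the same idea with the open-source Python package semopy, and the construct names, item names, and file name are hypothetical placeholders rather than the study's materials.

```python
# Illustrative sketch only: the study used SPSS and AMOS 23. This re-expresses the
# same two-phase idea (CFA measurement model first, structural paths second) with
# the open-source semopy package. Construct and item names (bg1..bg3, sat1..sat3, ...)
# and the CSV file name are hypothetical placeholders, not the study's materials.
import pandas as pd
import semopy

MODEL_DESC = """
# Phase 1: measurement model (latent constructs and their questionnaire items)
BACKGROUND   =~ bg1 + bg2 + bg3
AUTONOMY     =~ au1 + au2 + au3
SATISFACTION =~ sat1 + sat2 + sat3
ACHIEVEMENT  =~ ach1 + ach2 + ach3
# Phase 2: structural model (paths corresponding to hypotheses such as H1, H5, H6)
SATISFACTION ~ BACKGROUND + AUTONOMY
ACHIEVEMENT  ~ SATISFACTION
"""

def fit_two_phase_sem(csv_path: str) -> pd.DataFrame:
    """Fit the combined measurement + structural model and return fit statistics."""
    data = pd.read_csv(csv_path)       # one column per questionnaire item
    model = semopy.Model(MODEL_DESC)
    model.fit(data)                    # maximum-likelihood estimation by default
    print(model.inspect())             # factor loadings and path coefficients
    return semopy.calc_stats(model)    # chi-square, CFI, TLI, RMSEA, and so on
```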

Sample characteristics and data collection

A total of 283 questionnaires were distributed manually; of these, 264 (93.3% of the total) were returned to the authors and evaluated in SPSS. A total of 21 questionnaires were then excluded: 14 were incomplete and 7 contained outliers. The overall number of valid questionnaires after this exclusion was therefore 243. This exclusion step is supported by Hair et al. (2012a, 2012b) and by Venkatesh, Thong, and Xu (2012), who pointed out that this procedure is essential because the existence of outliers could lead to inaccurate results. Regarding the respondents' demographic details, 91 (37.4%) were male and 152 (62.6%) were female; 149 (61.3%) were aged 18 to 20 years, 77 (31.7%) were aged 21 to 24 years, and 17 (7.0%) were aged 25 to 29 years. Regarding level of study, 63 (25.9%) were from level 1, 72 (29.6%) from level 2, 50 (20.6%) from level 3, and 58 (23.9%) from level 4.

Measurement instruments

The questionnaire in this study was developed to fit the study hypotheses and was therefore based on the two theories utilized in this study. It has two main sections: the first aims to measure student satisfaction based on the TDT variables, and the second measures students' academic achievement based on Bloom's theory, which frames achievement through four variables, namely application, remembering, understanding, and analyzing. On that basis the questionnaire was developed to measure both students' satisfaction and academic achievements, and the construct items were adapted to ensure content validity. The first part covered the respondents' demographic details, including age, gender, and educational level. The second part comprised 51 items adapted from previous research as follows: student background (five items) and student experience (five items) adapted from Akaslan and Law (2011); student collaboration and student interaction items adapted from Bolliger and Inan (2012); student autonomy (five items) adapted from Barnard et al. (2009) and Pintrich, Smith, Garcia, and McKeachie (1991); student satisfaction (six items) adapted from the blended learning impact evaluation conducted at UCF by the Research Initiative for Teaching Effectiveness (n.d.); and students' application (four items), remembering (four items), understanding (four items), analyzing (four items), and academic achievements (four items) adapted from Pekrun, Goetz, and Perry (2005). The questionnaire was distributed to the students after they had taken the online course.

Result and analysis

Cronbach's Alpha reliability coefficient was 0.917 across all research model factors. The discriminant validity (DV) assessment was carried out using three criteria: the index between variables is expected to be less than 0.80 (Bagozzi, Yi, & Nassen, 1988); each construct's AVE value must be equal to or higher than 0.50; and the square root of the AVE for every construct should be higher than the inter-construct correlations (IC) associated with that factor [49]. Furthermore, the confirmatory factor analysis (CFA) findings and factor loadings (FL) should be 0.70 or above, while the Cronbach's Alpha (CA) results should be ≥ 0.70 [50]. Researchers have also added that composite reliability (CR) should be ≥ 0.70.
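
The thresholds quoted above follow directly from standard formulas for Cronbach's Alpha, composite reliability, and average variance extracted. The sketch below, which is not the authors' code, shows those formulas in Python with hypothetical item scores and standardized loadings standing in for one construct.

```python
# Minimal sketch of the reliability/validity formulas behind the quoted thresholds.
# The arrays are hypothetical stand-ins for one construct's item scores and
# standardized factor loadings; they are not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: shape (n_respondents, k_items). alpha = k/(k-1) * (1 - sum(var_i)/var_total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum lambda)^2 / ((sum lambda)^2 + sum(1 - lambda^2)), standardized loadings."""
    s = loadings.sum()
    return s**2 / (s**2 + (1 - loadings**2).sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean of squared standardized loadings."""
    return float((loadings**2).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(243, 1))
    scores = latent + 0.6 * rng.normal(size=(243, 5))   # five correlated hypothetical items
    lam = np.array([0.74, 0.78, 0.81, 0.72, 0.76])      # hypothetical standardized loadings
    print("Cronbach's Alpha:", round(cronbach_alpha(scores), 3), "(cut-off 0.70)")
    print("Composite reliability:", round(composite_reliability(lam), 3), "(cut-off 0.70)")
    print("AVE:", round(average_variance_extracted(lam), 3), "(cut-off 0.50)")
    print("All factor loadings >= 0.70:", bool((lam >= 0.70).all()))
```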

Model analysis

The current research employed AMOS 23 to analyze the data. Both structural equation modeling (SEM) and confirmatory factor analysis (CFA) were employed as the main analysis tools. Uni-dimensionality, reliability, convergent validity, and discriminant validity were used to assess the measurement model. Bagozzi et al. (1988), Byrne (2010), and Kline (2011) highlighted goodness-of-fit guidelines such as the normed chi-square (chi-square/degrees of freedom), the normed fit index (NFI), the relative fit index (RFI), the Tucker-Lewis index (TLI), the comparative fit index (CFI), the incremental fit index (IFI), the parsimonious goodness-of-fit index (PGFI), the root mean square error of approximation (RMSEA), and the root mean-square residual (RMR). All of these can be used as assessment procedures for model estimation. See Table 1 and Fig. 2.
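
For readers unfamiliar with these indices, the following sketch checks a set of reported fit values against commonly cited rules of thumb (for example, CFI, TLI, NFI, RFI, and IFI of at least 0.90, and RMSEA and RMR of at most 0.08, with normed chi-square of at most 3). The cut-offs vary somewhat across sources, and the example values are invented, so treat this purely as an illustration rather than this study's results.

```python
# Hypothetical example: compare reported fit indices against commonly cited cut-offs.
# The cut-offs are conventional rules of thumb, not values taken from this study.
FIT_RULES = {
    "chisq/df": lambda v: v <= 3.0,   # some sources accept values up to 5.0
    "NFI":      lambda v: v >= 0.90,
    "RFI":      lambda v: v >= 0.90,
    "TLI":      lambda v: v >= 0.90,
    "CFI":      lambda v: v >= 0.90,
    "IFI":      lambda v: v >= 0.90,
    "RMSEA":    lambda v: v <= 0.08,
    "RMR":      lambda v: v <= 0.08,
}

def assess_fit(reported: dict) -> dict:
    """Return {index: 'ok'/'check'} for every reported index with a known rule."""
    return {k: ("ok" if FIT_RULES[k](v) else "check")
            for k, v in reported.items() if k in FIT_RULES}

# Usage with made-up numbers (not the study's results):
print(assess_fit({"chisq/df": 2.1, "CFI": 0.93, "TLI": 0.91, "RMSEA": 0.061}))
```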

Fig. 2. Measurement Model

Measurement model

This type of validity is commonly employed to specify the extent to which a concept and its indicators differ from other concepts (Hair et al., 2012a, 2012b). In this analysis, discriminant validity proved positive across all concepts, given that values were above 0.50 (the cut-off value) at p = 0.001 according to Fornell and Larcker (1981). In line with Hair et al. (2012a, 2012b) and Bagozzi, Yi, and Nassen (1988), the correlation between items of any two specified constructs must not exceed the square root of the average variance shared within a single construct. The values of composite reliability (CR) and Cronbach's Alpha (CA) remained at about 0.70 or above, while the average variance extracted (AVE) remained at about 0.50 or higher, indicating that all factor loadings (FL) were significant and thereby fulfilling the conventions of the current assessment (Bagozzi, Yi, & Nassen, 1988; Byrne, 2010). The following sections expand on the results of the measurement model. The findings for validity, reliability, average variance extracted (AVE), composite reliability (CR), and Cronbach's Alpha (CA) were all accepted, which also demonstrated discriminant validity. All CR values vary between 0.812 and 0.917, above the cut-off value of 0.70; the CA values vary between 0.839 and 0.897, also exceeding the cut-off value of 0.70; and the AVE is higher than 0.50, varying between 0.610 and 0.684. All these findings are positive, indicating significant FLs that comply with the conventional assessment guidelines (Bagozzi, Yi, & Nassen, 1988; Fornell & Larcker, 1981). See Table 2 and Additional file 1.
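
The discriminant-validity rule applied here, that the square root of each construct's AVE should exceed its correlations with the other constructs (Fornell & Larcker, 1981), reduces to a simple matrix comparison. The sketch below uses hypothetical AVE values and an invented inter-construct correlation matrix, not the study's figures.

```python
# Fornell-Larcker check: sqrt(AVE) of each construct should exceed its correlations
# with every other construct. The AVEs and correlations below are hypothetical.
import numpy as np

def fornell_larcker(ave: np.ndarray, corr: np.ndarray) -> bool:
    """ave: (n_constructs,), corr: (n, n) inter-construct correlation matrix."""
    sqrt_ave = np.sqrt(ave)
    off_diag = corr - np.diag(np.diag(corr))   # zero out the diagonal
    return bool((sqrt_ave > np.abs(off_diag).max(axis=1)).all())

ave = np.array([0.61, 0.65, 0.68])             # hypothetical AVEs (all >= 0.50)
corr = np.array([[1.00, 0.52, 0.47],
                 [0.52, 1.00, 0.58],
                 [0.47, 0.58, 1.00]])          # hypothetical correlations
print("Discriminant validity supported:", fornell_larcker(ave, corr))
```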

Structural model analysis

In the current study, path modeling analysis was utilized to examine the impact on students' academic achievements in higher education institutions of the following factors, which are based on online learning: students' background, experience, collaboration, interaction, autonomy, remembering, understanding, analyzing, application, and satisfaction. The findings are displayed and then compared in the hypothesis testing discussion. Subsequently, as the second stage, confirmatory factor analysis (CFA) was conducted within structural equation modeling (SEM) in order to assess the proposed hypotheses, as demonstrated in Fig. 3.

Fig. 3. Findings for the Proposed Model: Path Analysis

As shown in Figs. 3 and 4, all hypotheses were accepted. Moreover, Table 3 below shows that the fundamental statistics of the model were good, which indicates model validity, and presents the hypothesis testing results in the form of the unstandardized coefficients and standard errors of the structural model.

Fig. 4. Findings for the Proposed Model: T-Values

The first five direct hypotheses, relating students' background, experience, collaboration, interaction, and autonomy to students' satisfaction, were addressed first. In accordance with Fig. 4 and Table 3, the relationship between students' background and students' satisfaction was (β = .281, t = 5.591, p < 0.001), demonstrating that the first hypothesis (H1) indicated a positive and significant relationship. The relationship between students' experience and students' satisfaction was (β = .111, t = 1.951, p < 0.001), demonstrating that the second hypothesis (H2) indicated a positive and significant relationship. The relationship between students' collaboration and students' satisfaction was (β = .123, t = 2.584, p < 0.001), demonstrating that the third hypothesis (H3) indicated a positive and significant relationship. Additionally, the relationship between students' interaction and students' satisfaction was (β = .116, t = 2.212, p < 0.001), indicating that the fourth hypothesis (H4) indicated a positive and significant relationship. Further, the relationship between students' autonomy and students' satisfaction was (β = .470, t = 7.711, p < 0.001), demonstrating that the fifth hypothesis (H5) indicated a positive and significant relationship. In the second section, five further hypotheses were examined, relating students' satisfaction, application, remembering, understanding, and analyzing to students' academic achievements.

As shown in Fig. 4 and Table 3, the association between students' satisfaction and students' academic achievements was (β = .135, t = 3.473, p < 0.001), demonstrating that the sixth hypothesis (H6) indicated a positive and significant relationship. The relationship between students' application and students' academic achievements was (β = .215, t = 6.361, p < 0.001), indicating that the seventh hypothesis (H7) indicated a positive and significant relationship. The relationship between students' remembering and students' academic achievements was (β = .154, t = 4.228, p < 0.001), demonstrating that the eighth hypothesis (H8) indicated a positive and significant relationship. Additionally, the correlation between students' understanding and students' academic achievements was (β = .252, t = 6.513, p < 0.001), demonstrating that the ninth hypothesis (H9) indicated a positive and significant relationship. Finally, the relationship between students' analyzing and students' academic achievements was (β = .179, t = 6.215, p < 0.001), demonstrating that the tenth hypothesis (H10) indicated a positive and significant relationship. Accordingly, the current model demonstrated students' compatibility with using online learning platforms to improve their academic achievements and satisfaction, in accordance with earlier investigations (Abuhassna & Yahaya, 2018; Al-Rahmi et al., 2018; Al-rahmi, Othman, & Yusuf, 2015c; Barkand, 2017; Madjar et al., 2013; Salmon, 2014).
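
Taken together, the reported standardized coefficients can be read as two regression-style equations in the structural model. The LaTeX restatement below simply collects the values reported above, with error terms added for completeness; it is a summary of the reported paths, not an additional analysis.

```latex
\begin{aligned}
\widehat{\text{Satisfaction}} &= 0.281\,\text{Background} + 0.111\,\text{Experience} + 0.123\,\text{Collaboration} \\
                              &\quad + 0.116\,\text{Interaction} + 0.470\,\text{Autonomy} + \varepsilon_1,\\[4pt]
\widehat{\text{Achievement}}  &= 0.135\,\text{Satisfaction} + 0.215\,\text{Application} + 0.154\,\text{Remembering} \\
                              &\quad + 0.252\,\text{Understanding} + 0.179\,\text{Analyzing} + \varepsilon_2.
\end{aligned}
```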

Discussion and implications

Developing a new hybrid technology acceptance model by combining TDT and BTT was the major objective of the current research, which aimed to investigate the factors guiding the use of online learning platforms to improve students' academic achievements and satisfaction in higher education institutions. The current research takes a step forward by implementing TDT together with a BTT model. Using the proposed model, the research examined how students' background, experience, collaboration, interaction, and autonomy positively affected students' satisfaction, and how students' application, remembering, understanding, analyzing, and satisfaction positively affected students' academic achievements. This conclusion is consistent with the earlier related literature. It reveals that learners first make sure that online learning platforms are able to meet their study requirements, or that the platforms are relevant to their study process, before considering employing such technology in their studies. Learners have been noted to perceive online learning platforms as more useful only once they discover that such technology is actually better than traditional learning without online platforms (Choy & Quek, 2016; Illinois Online Network, 2003). Using the proposed model, the current research examined how to improve students' academic achievements and satisfaction; the following sections compare this study's results with previous research.

The first hypothesis of this study demonstrated a positive and significant association between students' prior background with online platforms and their satisfaction. As investigated in Osika and Sharp's (2002) study, numerous learners who lack these basic skills enroll in courses, struggle, and subsequently drop out. In addition, Bocchi, Eastman, and Swift (2004) claimed that prior knowledge of students' concerns, demands, and anticipations is crucial for constructing efficient instruction; to clarify, students should have prior knowledge and background before entering online platforms. On the other hand, there are constant concerns about the quality of online learning platforms in comparison to face-to-face learning environments when students do not have the essential skills required to use them (Illinois Online Network, 2003). Moreover, a study by Alalwan et al. (2019) found that Austrian learners would still rather choose face-to-face learning for communication purposes and the preservation of interpersonal relations, because learners do not yet have the background knowledge and skills needed to use online learning platforms. Additional research by Orton-Johnson (2009) among UK learners claimed that learners have not accepted online materials and continue to prefer traditional materials as the medium for their learning, which also indicates the importance of prior knowledge and background before adopting such technology.

The second hypothesis of this study proposed a positive and significant association between students' experience and students' satisfaction, which revealed that giving students such experience would support them in overcoming the difficulties that arise from the limits of their technical ability with online platforms. This is in line with earlier research on the reasons that lead to people's technology acceptance behavior. One reason is the notion of "conformity," that is, the degree to which an individual considers an innovation to be consistent with their existing demands, experiences, values, and practices (Chau & Hu, 2002; Moore & Benbasat, 1991; Rogers, 2003; Taylor & Todd, 1995). Moreover, Anderson and Reed (1998), Galvin (2003), and Lewis (2004) claimed that most students who had prior experience with online education tended to exhibit positive attitudes toward it, which affects their attitudes toward online learning platforms.

The third hypothesis of this study demonstrated a positive and significant association between students’ collaboration with one another on online platforms and their satisfaction, which indicates the key role of collaboration among students in making the experience more realistic and increasing their sense of involvement and activity. This is in agreement with Al-rahmi, Othman, and Yusuf (2015f), who claimed that the type, quality, and amount of feedback that each student received was correlated with the student’s sense of success or course satisfaction. Moreover, Rabinovich (2009) found that all types of dialogue were important to transactional distance, which makes it easier for students to adapt to online learning platforms. Also, online learning platforms enable learners to share and exchange information with their colleagues (Abuhassna et al., 2020; Abuhassna & Yahaya, 2018).

Students’ interaction with the instructor in online platforms

The fourth hypothesis of this study proposed a positive and significant correlation between students’ interactions and students’ satisfaction, which indicates the significance of communication between students and their instructor throughout the online platform experience. These results agree with Mathieson (2012), who stated that the ability to communicate with the instructor lowered the sense of separation between learner and educator. Moreover, in line with Kassandrinou et al. (2014), communication leads learners to experience constructive emotions, for example relief, satisfaction, and excitement, which help them achieve their educational goals. In addition, Furnborough (2012) concluded that learners’ feelings about cooperating with their fellow students affect their reactions to collaboration with their peers. Moreover, Kassandrinou et al. (2014) highlighted the instructor’s crucial role as a facilitator of interaction and communication, since instructors are expected to constantly foster, reassure, and assist communication and interaction among students.

Student’s autonomy in online platforms

The fifth hypothesis of this study proposed a positive and significant relationship between students’ autonomy and their satisfaction with online learning platforms, which indicates that students need a sense of independence on online platforms. This agrees with Madjar et al. (2013), who concluded that an autonomy-supportive environment leads learners to adopt more goals and attain greater learning achievement. Moreover, Stroet et al. (2013) found a clear positive effect of autonomy-supportive teaching on learner motivation. O’Donnell, Chang, and Miller (2013) also argue that autonomy is the ability of learners to govern themselves, especially in making decisions, setting their own course, and taking responsibility for their own actions.

Student’s satisfaction in online platforms

The sixth hypothesis of this study proposed a positive and significant correlation between students’ satisfaction with online learning platforms and their academic achievement, which indicates a level of acceptance by students in adapting to online learning platforms. This is in agreement with Zhu (2012), who reported that students’ satisfaction with online platforms is a statement of confidence in the system. Moreover, Kirmizi’s (2014) study revealed that the predictors of learners’ satisfaction were instructor support, personal relevance, and authentic learning, whereas authentic learning was the only predictor of academic success. Furthermore, Bordelon (2013) determined a positive correlation between satisfaction and achievement. In addition, Mahle (2011) clarified that student satisfaction occurs when learners realize that their accomplishment has met their expectations, and it is then considered a short-term attitude toward the learning procedure.

Hypotheses seven, eight, nine, and ten of this study proposed positive and significant relationships between students’ academic achievement and online learning platforms, which indicates the key role of online platforms in students’ academic achievement. This agrees with Whitmer’s (2013) findings, which revealed a highly systematic relationship between students’ usage of the LMS and academic achievement. In contrast, Barkand (2017) found no significant difference in students’ academic achievement when utilizing online platforms, which may be because academic achievement on online learning platforms requires a certain set of skills and knowledge, as discussed in the sections above, in order to make such technology a success.

The seventh hypothesis of this study proposed a positive and significant correlation between students’ application and students’ academic achievement, which indicates the key role of application in the learning process. This is in line with the Computer Science Teachers’ Association (CSTA) taskforce in the U.S. (CSTA, 2011), which noted that applying elements of computer skills is essential in all state curricula, pointing to their value for improving pupils’ higher-order thinking as well as general problem-solving abilities. Moreover, Gouws, Bradshaw, and Wentworth (2013) created a theoretical framework that mapped educational computational thinking onto cognitive levels derived from Bloom’s Taxonomy of Learning Objectives. Four thinking-skill levels were used to assess the ‘cognitive demands’ initiated by computational concepts such as abstraction, modelling, developing algorithms, and generating automated processes. In the iPad app LightBot, the thinking skills were Recognizing (recognize and recall knowledge relating to the problem), Understanding (interpret, compare, and explain the problem), Applying (make use of computer skills to create a solution), and Assimilating (critically decompose and analyze the problem).

The eighth hypothesis of this study proposed a positive and significant correlation between students’ remembering and students’ academic achievement, which indicates the importance of remembering, as a process of retrieving information about what needs to be done and the expected outcome attributes, during learning according to Bloom’s Taxonomy of Educational Objectives. Additionally, Falloon (2016) found that responding to data indicated the use of general thinking skills to clarify and understand the steps and stages needed to complete a task (average 29%); recalling or remembering information about a task or available tools (average 13%); and discussing and understanding success criteria (average 3%).

The ninth hypothesis of this study proposed a positive and significant correlation between students’ understanding and students’ academic achievement, which indicates its significance for academic achievement as a process of breaking the task or problem faced by students into phases or activities that help them understand how to resolve it. The current results agree with Falloon (2016), who demonstrated the necessity of building understanding through the thinking processes employed by students once they are engaged in their work. In addition, Falloon (2016) suggested that the purpose and nature of questioning was broader than this, with questioning of self and others being an important strategy in solution development. In many respects, questioning for those students was not so much a perspective as a practice, to the extent that it helped them to understand their tasks, analyze intended or developed solutions, and evaluate their outcomes.

The tenth hypothesis of this study proposed a positive and significant correlation between students’ analyzing and students’ academic achievement, which reveals the importance of analysis as a process of employing general thinking and computational knowledge to recognize challenges when using online platforms, together with predictive thinking to categorize, explore, and fix possible errors throughout the whole process. Falloon (2016) claimed that analyzing was often a collaborative procedure, with pairs giving and receiving advice from others to help solve complications. Online learning platforms, in turn, depend heavily on connecting and sharing as a basic strategy to be employed at all stages of online learning, whether between students or between students and their instructor. Moreover, Falloon’s (2016) findings showed that Analyzing (average 17%) was present in various phases of these online students’ work, depending on the phase they were at and their tasks, although most analysis was associated with students relying on themselves during the online process.

Conclusion and future work

In this investigation, both transactional distance theory (TDT) and Bloom’s Taxonomy theory (BTT) have been validated in the educational context, providing further understanding of students’ perceptions of using online learning platforms to improve academic achievement and satisfaction. The contribution that the current research may make to the field of online learning platforms has been discussed and explained, and additional insights into students’ satisfaction and academic achievement have been presented. The current research emphasizes that incorporating both TDT and BTT can positively influence the research outcome. It has also determined that stakeholders such as developers, system designers, and institutional users of online learning platforms should carefully consider student demands and needs and ensure that such systems effectively meet those requirements. Adoption among users of online learning platforms can be broadly explained by the eleven factors on which this research model is based. Thus, the current research suggests that further investigation be carried out to examine relationships between the complexity of online learning platforms and the technology acceptance model (TAM).

Recommendations for stakeholders of online platforms

Based on the study findings, the first recommendation is addressed to administrators of higher education institutions. To implement online learning, more attention must be given to course structure design, which should be based on theories and prior literature. Moreover, instructors and course developers need to be trained and skilled to achieve the goals of online learning platforms. Workshops and training sessions should be offered to both instructors and students to familiarize them with learning management systems such as Moodle so that they can take full advantage of them. The software itself is not enough to create an online learning environment that is suitable for students and instructors. If instructors are not trained in utilizing the software (e.g., Moodle) in class, the quality of education imparted to students will be jeopardized. Training and assessing class instructors and making modifications to the software could result in a good environment for the instructor and a quality education for the student. Both students’ satisfaction and academic achievement depend on their prior knowledge and experience of online learning. The current research investigated student satisfaction and academic achievement in relation to online learning platforms at one higher education institution in Malaysia. Future research could further explore blended learning settings.

Availability of data and materials

All the hardcopy questionnaires, data and statistical analysis are available.

Abuhassna, H., Megat, A., Yahaya, N., Azlina, M., & Al-rahmi, W. M. (2020). Examining Students' satisfaction and learning autonomy through web-based courses. International Journal of Advanced Trends in Computer Science and Engineering , 1 (9), 356–370. https://doi.org/10.30534/ijatcse/2020/53912020 .

Abuhassna, H., & Yahaya, N. (2018). Students’ utilization of distance learning through an interventional online module based on Moore transactional distance theory. Eurasia Journal of Mathematics, Science and Technology Education , 14 (7), 3043–3052. https://doi.org/10.29333/ejmste/91606 .

Akaslan, D., & Law, E. L.-C. (2011). Measuring student E-learning readiness: A case about the subject of Electricity in Higher Education Institutions in Turkey. In H. Leung, E. Popescu, Y. Cao, R. W. H. Lau, & W. Nejdl (Eds.), ICWL 2011. LNCS, vol. 7048 , (pp. 209–218). Heidelberg: Springer.

Alalwan, N., Al-Rahmi, W. M., Alfarraj, O., Alzahrani, A., Yahaya, N., & Al-Rahmi, A. M. (2019). Integrated three theories to develop a model of factors affecting students’ academic performance in higher education. IEEE Access , 7 , 98725–98742.

Alexander, S., & Golja, T. (2007). Using students' experiences to derive quality in an e-learning system: An institution's perspective. Educational Technology & Society , 10 (2), 17–33.

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States. Babson survey research group and the online learning consortium (OLC), Pearson, and WCET state authorization Network .

Al-Rahmi, W., Othman, M. S., & Yusuf, L. M. (2015b). The role of social media for collaborative learning to improve academic performance of students and researchers in Malaysian higher education. The International Review of Research in Open and Distributed Learning , 16 (4). http://www.irrodl.org/index.php/irrodl/article/view/2326 . https://doi.org/10.19173/irrodl.v16i4.2326 .

Al-Rahmi, W. M., Alias, N., Othman, M. S., Alzahrani, A. I., Alfarraj, O., Saged, A. A., & Rahman, N. S. A. (2018). Use of e-learning by university students in Malaysian higher educational institutions: A case in Universiti Teknologi Malaysia. IEEE Access , 6 , 14268–14276.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015a). The effectiveness of using e-learning in Malaysian higher education: A case study Universiti Teknologi Malaysia. Mediterranean Journal of Social Sciences , 6 (5), 625–625.

Al-rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015c). Using social media for research: The role of interactivity, collaborative learning, and engagement on the performance of students in Malaysian post-secondary institutes. Mediterranean Journal of Social Sciences , 6 (5), 536.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015d). Exploring the factors that affect student satisfaction through using e-learning in Malaysian higher education institutions. Mediterranean Journal of Social Sciences , 6 (4), 299.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015e). Effect of engagement and collaborative learning on satisfaction through the use of social media on Malaysian higher education. Res. J. Appl. Sci., Eng. Technol , 9 (12), 1132–1142.

Anderson, D. K., & Reed, W. M. (1998). The effects of internet instruction, prior computer experience, and learning style on teachers’ internet attitudes and knowledge. Journal of Educational Computing Research , 19 (3), 227–246. https://doi.org/10.2190/8WX1-5Q3J-P3BW-JD61 .

Anderson, L. W., & Krathwohl, D. R. (Eds.) (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives . New York: Longman.

Azhari, F. A., & Ming, L. C. (2015). Review of e-learning practice at the tertiary education level in Malaysia. Indian Journal of Pharmaceutical Education and Research , 49 (4), 248–257.

Bagozzi, R. P., Yi, Y., & Nassen, K. D. (1988). Representation of measurement error in marketing variables: Review of approaches and extension to three-facet designs. Elsevier. Journal of Econometrics , 89 (1–2), 393–421.

Barkand, J. M. (2017). Using educational data mining techniques to analyze the effect of instructors' LMS tool use frequency on student learning and achievement in online secondary courses. Available from ProQuest Dissertations & Theses Global. Retrieved from https://vpn.utm.my/docview/2007550976?accountid=41678

Barnard, L., Lan, W. Y., To, Y. M, Paton, V. O., & Lai, S. L. (2009). Measuring self-regulation in online and blended learning environments. The Internet and Higher Education , 12 (1), 1–6. https://doi.org/10.1016/j.iheduc.2008.10.005 .

Benson, R., & Samarawickrema, G. (2009). Addressing the context of e-learning: Using transactional distance theory to inform design. Distance Education Journal , 30 (1), 5–21.

Bliuc, A. M., Goodyear, P., & Ellis, R. A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. The Internet and Higher Education , 10 , 231–244.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain . New York: David McKay Co Inc.

Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business , 79 (4), 245–253.

Bolliger, D. U., & Inan, F. A. (2012). Development and validation of the online student connectedness survey (OSCS). The International Review of Research in Open and Distributed Learning , 13 (3), 41–65. https://doi.org/10.19173/irrodl.v13i3.1171 .

Bordelon, K. (2013). Perceptions of achievement and satisfaction as related to interactions in online courses (PhD dissertation) . Northcentral University.

Bouhnik, D., & Carmi, G. (2013). Thinking styles in virtual learning courses (pp. 141–145). Toronto: Proceedings of the 2013 international conference on information society (i-society). Retrieved from: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6619545 .

Byrne, B. M. (2010). Structural equation modeling with AMOS: Basic concepts, applications, and programming , (2nd ed., ). New York: Routledge.

Chau, P. Y. K., & Hu, P. J. (2002). Examining a model of information technology acceptance by individual professionals: An exploratory study. Journal of Management Information System , 18 (4), 191–229.

Choy, J. L. F., & Quek, C. L. (2016). Modelling relationships between students’ academic achievement and community of inquiry in an online learning environment for a blended course. Australasian Journal of Educational Technology , 32 (4), 106–124 https://doi.org/10.14742/ajet.2500 .

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management , 11 , 19–36.

Computer Science Teachers’ Association (CSTA). (2011) The computational thinking leadership toolkit. [Online] Available from: http://www.csta.acm.org/Curriculum/sub/CompThinking.html [Accessed 13 Jan 2020].

Falloon, G. (2011). Exploring the virtual classroom: What students need to know (and teachers should consider). Journal of online learning and teaching. , 7 (4), 439–451.

Falloon, G. W. (2016). An analysis of young students’ thinking when completing basic coding tasks using scratch Jnr. On the iPad. Journal of Computer-Assisted Learning , 32 , 576–379.

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research , 18 (1), 39–50. https://doi.org/10.2307/3151312 .

Furnborough, C. (2012). Making the most of others: Autonomous interdependence in adult beginner distance language learners. Distance Education , 33 (1), 99–116. https://doi.org/10.1080/01587919.2012.667962 .

Galvin, T. (2003). The (22nd Annual) 2003. Industry report. Training , 40 (9), 19–45.

Gouws, L., Bradshaw, K., & Wentworth, P. (2013). Computational thinking in educational activities. In J. Carter, I. Utting, & A. Clear (Eds.), The proceedings of the 18th conference on innovation and Technology in Computer Science Education , (pp. 10–15). Canterbury: ACM.

Hair, J. F., Sarstedt, M., Ringle, C. M., & Mena, J. A. (2012a). An assessment of the use of partial least squares structural equation modeling in marketing research. Journal of the Academy of Marketing Science. , 40 (3), 414–433.

Illinois Online Network. 2003. Learning styles and the online environment. Illinois Online Network and the Board of Trustees of the University of Illinois, http://illinois.online.uillinois.edu/IONresources/instructionaldesign/learningstyles.html

Jacobs, G. M., Renandya, W. A., & Power, M. (2016). Learner autonomy. In G. Jacobs, W. A. Renandya, & M. Power (Eds.), Simple, powerful strategies for student centered learning . New York: Springer International Publishing. https://doi.org/10.1007/978-3-319-25712-9_3 .

Jaques, D., & Salmon, G. (2007). Learning in groups: A handbook for face-to-face and online environments . Abingdon: Routledge.

Kassandrinou, A., Angelaki, C., & Mavroidis, I. (2014). Transactional distance among Open University students. How does it affect the learning Progress? European journal of open. Distance and e-Learning , 16 (1), 78–93.

Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology , 23 , 1–13. https://doi.org/10.3402/rlt.v23.26507 .

Kirmizi, O. (2014). A Study on the Predictors of Success and Satisfaction in an Online Higher Education Program in Turkey. International Journal of Education , 6 , 4.

Kline, R. B. (2011). Principles and practice of structural equation modeling , (3rd ed., ). New York: The Guilford Press.

Lau, C. Y., & Shaikh, J. M. (2012). The impacts of personal qualities on online learning readiness at Curtin Sarawak Malaysia (CSM). Educational Research and Reviews , 7 (20), 430–444.

Lee, B. C., Yoon, J. O., & Lee, I. (2009). Learners' acceptance of e-learning in South Korea: Theories and results. Computers & Education , 53 , 1320–1329.

Lester, P. M., & King, C. M. (2009). Analog vs. digital instruction and learning: Teaching within first and second life environments. Journal of Computer-Mediated Communication , 14 , 457–483.

Lewis, N. (2004). Military student participation in distance learning . Doctorate dissertation. Johnson & Wales University. USA.

Madjar, N., Nave, A., & Hen, S. (2013). Are teachers’ psychological control, autonomy support and autonomy suppression associated with students’ goals? Educational Studies , 39 (1), 43–55. https://doi.org/10.1080/03055698.2012.667871 .

Mahle, M. (2011). Effects of interaction on student achievement and motivation in distance education. Quarterly Review of Distance Education , 12 (3), 207–215, 222.

Massimo, P. (2014). Multidimensional analysis applied to the quality of the websites: Some empirical evidences from the Italian public sector. Economics and Sociology , 7 (4), 128–138. https://doi.org/10.14254/2071-789X.2014/7-4/9 .

Mathieson, K. (2012). Exploring student perceptions of audiovisual feedback via screen casting in online courses. American Journal of Distance Education , 26 (3), 143–156.

McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice (created through funding received by the University of Prince Edward Island through the social sciences and humanities research Council's “knowledge synthesis Grants on the digital economy”) .

Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perception of adopting an information technology innovation. Information System Research , 2 (3), 192–223.

Moore, M. (1990). Background and overview of contemporary American distance education. In M. Moore (Ed.) Contemporary issues in American distance education.

Moore, M. G. (1972). Learner autonomy: The second dimension of independent learning .

Moore, M. G. (2007). Theory of transactional distance. In M. G. Moore (Ed.), Handbook of distance education . Lawrence Erlbaum Associates.

O’Donnell, S. L., Chang, K. B., & Miller, K. S. (2013). Relations among autonomy, attribution style, and happiness in college students. College Student Journal .

Orton-Johnson, K. (2009). ‘I’ve stuck to the path I’m afraid’: Exploring student non-use of blended learning. British Journal of Educational Technology , 40 (5), 837–847.

Osika, R. E., & Sharp, D. P. (2002). Minimum technical competencies for distance learning students. Journal of Research on Technology in Education , 34 (3), 318–325.

Paechter, M., & Maier, B. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 (4), 292–297.

Panyajamorn, T., Suthathip, S., Kohda, Y., Chongphaisal, P., & Supnithi, T. (2018). Effectiveness of E learning design and affecting variables in Thai public schools. Malaysian Journal of Learning and Instruction , 15 (1), 1–34.

Pekrun, R., Goetz, T., & Perry, P. R. (2005). Academic Emotions Questionnaire (AEQ): User's manual . Munich: University of Munich, Department of Psychology; University of Manitoba. Available online at: https://de.scribd.com/doc/217451779/2005-AEQ-Manual# (Accessed 17 July 2019).

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the motivated strategies for learning questionnaire (MSLQ) . Ann Arbor: The University of Michigan.

Rabinovich, T. (2009). Transactional distance in a synchronous web-extended classroom learning environment . Unpublished doctoral dissertation. Massachusetts: Boston University.

Rogers, E. M. (2003). Diffusion of innovations , (5th ed., ). New York: Free Press.

Salmon, G. (2011). E-moderating: The key to teaching and learning online , (3rd ed., ). London: Routledge.

Salmon, G. (2014). Learning innovation: A framework for transformation. European Journal of Open, Distance and e-Learning , 17 (1), 219–235.

Shearer, R. L. (2010). Transactional distance and dialogue: An exploratory study to refine the theoretical construct of dialogue in online learning. Dissertation Abstracts International Section A , 71 , 800.

Solomon, G., & Schrum, L. (2010). Web 2.0 how-to for educators .

Stroet, K., Opdenakker, M. C., & Minnaert, A. (2013). Effects of need supportive teaching on early adolescents’ motivation and engagement: A review of the literature. Educational Research Review , 9 , 65–87.

Taylor, S., & Todd, P. A. (1995). Assessing IT usage: The role of prior experience. MIS Quarterly , 19 (2), 561–570.

The blended learning impact evaluation at UCF is conducted by Research Initiative for Teaching Effectiveness. (n.d.) https://digitallearning.ucf.edu/learning-analytics/ . Accessed 25 Feb 2020.

Vasala, P., & Andreadou, D. (2010). Student’s support from tutors and peer students in distance learning. Perceptions of Hellenic Open University “studies in education” postgraduate program graduates. Open Education – The Journal for Open and Distance Education and Educational Technology , 6 (1–2), 123–137 (in Greek with English abstract).

Venkatesh, V., Thong, J. Y., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly , 36 (1), 157–178.

Whitmer J.C. (2013). Logging on to improve achievement: Evaluating the relationship between use of the learning management system, student characteristics, and academic achievement in a hybrid large enrollment undergraduate course. Doctorate dissertation, university of California. USA.

Yu, Z. (2015). Indicators of satisfaction in clickers aided EFL class. Frontiers in Psychology , 6 , 587 https://www.frontiersin.org/articles/10.3389/fpsyg.2015.00587/full .

Zhu, C. (2012). Student satisfaction, performance, and knowledge construction in online collaborative learning. Educational Technology & Society , 15 (1), 127–136.

Acknowledgements

Not applicable.

Declarations

The study involved both undergraduate and graduate students at Universiti Teknologi Malaysia (UTM); ethical approval was obtained before collecting any data from the participants.

Author information

Authors and affiliations.

Faculty of Social Sciences & Humanities, School of Education, Universiti Teknologi Malaysia, UTM, 81310, Skudai, Johor, Malaysia

Hassan Abuhassna, Waleed Mugahed Al-Rahmi, Noraffandy Yahya, Megat Aman Zahiri Megat Zakaria & Azlina Bt. Mohd Kosnin

Faculty of Engineering, School of Civil Engineering, Universiti Teknologi Malaysia, UTM, 81310, Skudai, Johor, Malaysia

Mohamad Darwish

Contributions

The corresponding author wrote the paper and collected the data, and the second author performed all the statistical analysis. Moreover, all authors worked collaboratively to write the literature review and discussion, and read and approved the final manuscript.

Corresponding author

Correspondence to Hassan Abuhassna .

Ethics declarations

Competing interests.

This paper is an original work; its main objective is to develop a model to enhance students’ satisfaction and academic achievement in using online platforms, as Universiti Teknologi Malaysia (UTM) has been implementing fully online courses starting from the second semester of 2020.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

General objective of the study

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Abuhassna, H., Al-Rahmi, W.M., Yahya, N. et al. Development of a new model on utilizing online learning platforms to improve students’ academic achievements and satisfaction. Int J Educ Technol High Educ 17 , 38 (2020). https://doi.org/10.1186/s41239-020-00216-z

Received : 10 March 2020

Accepted : 19 May 2020

Published : 02 October 2020

DOI : https://doi.org/10.1186/s41239-020-00216-z


  • Online learning platforms
  • Students’ achievements
  • Student’s satisfaction
  • Transactional distance theory (TDT)
  • Bloom’s taxonomy theory (BTT)

  • Open access
  • Published: 06 June 2024

Perceptions and challenges of online teaching and learning amidst the COVID-19 pandemic in India: a cross-sectional study with dental students and teachers

  • Lakshmi Nidhi Rao 1 ,
  • Aditya Shetty 1 ,
  • Varun Pai 2 ,
  • Srikant Natarajan 3 ,
  • Manjeshwar Shrinath Baliga 4 ,
  • Dian Agustin Wahjuningrum 5 ,
  • Heeresh Shetty 6 ,
  • Irmaleny Irmaleny 7 &
  • Ajinkya M. Pawar 6  

BMC Medical Education volume  24 , Article number:  637 ( 2024 ) Cite this article

80 Accesses

Online education has emerged as a crucial tool for imparting knowledge and skills to students in the twenty-first century, especially in developing nations like India, which previously relied heavily on traditional teaching methods.

This study delved into the perceptions and challenges experienced by students and teachers in the context of online education during the COVID-19 pandemic. Data were collected from a sample of 491 dental students and 132 teachers utilizing a cross-sectional research design and a validated online survey questionnaire.

The study’s findings revealed significant insights. Internet accessibility emerged as a major impediment for students, with online instruction proving more effective for theoretical subjects compared to practical ones. Although most teachers expressed comfort with online teaching, they highlighted the absence of classroom interaction as a significant challenge.

This study comprehensively examines the perspectives of both students and teachers regarding online education during the pandemic. The results carry substantial implications for the academic community, underscoring the need to address internet access issues and explore ways to enhance engagement and interaction in online learning environments.

Introduction

The COVID-19 pandemic has undeniably reshaped the global educational landscape, forcing a rapid shift towards online learning methodologies. While some disciplines have transitioned relatively smoothly, dental education presents unique challenges. Unlike fields with a primarily theoretical foundation, dental education hinges on the development of practical skills and direct patient interaction [ 1 , 2 ]. This inherent need for hands-on clinical experience necessitates a critical examination of online learning’s suitability for dental education [ 1 ].

Research across diverse international contexts underscores the limitations of online learning alone in fostering essential technical skills in dentistry [ 3 , 4 ]. Recognizing this reality, the Indian dental education model prioritizes hands-on learning as a core curricular element. However, the pre-clinical phases often incorporate simulations using mannequins, hinting at the potential for blended learning approaches. In this scenario, online platforms could be strategically utilized to deliver theoretical knowledge, thereby freeing up valuable classroom time for instructors to conduct in-person skill development sessions with students [ 5 , 6 ].

Despite advancements in technology, digitalization efforts in the Indian dental sector have primarily focused on practical training tools like computer-aided design/computer-aided manufacturing (CAD/CAM) and 3D printing technologies [ 7 , 8 ]. Traditional face-to-face lectures remained the dominant method for knowledge delivery, with online learning remaining largely unexplored within the Indian dental education curriculum before the pandemic [ 9 ].

The COVID-19 pandemic has disrupted this status quo, propelling online learning to the forefront of dental education [ 10 ]. This unprecedented situation necessitates a comprehensive assessment of its impact on the perceptions and experiences of both dental students and educators across India. This leads us to our central research question: “To what extent has the COVID-19 pandemic impacted the perceptions and challenges of online learning among dental students and teachers in India?”

By delving into this question, we aim to shed light on the strengths, weaknesses, and areas for improvement in online learning within the context of Indian dental education. These findings will inform future curricular development, allowing for a well-considered and strategic integration of online and traditional approaches. Ultimately, this research seeks to enhance the overall educational experience for dental students. By ensuring a balanced curriculum that leverages the strengths of both online and offline learning, we can equip future dentists with the essential knowledge and practical skills necessary to thrive in a rapidly evolving healthcare landscape.

Study design

The study utilized a cross-sectional research design to collect data from 500 dental students and 150 teachers in India. A validated online survey questionnaire was employed to gather quantitative data. The study population consisted of undergraduate and postgraduate dental students and teachers from diverse dental colleges across India. Participants were selected through purposive sampling based on their willingness and availability during the study period. Ethical principles were strictly followed, including obtaining informed consent, ensuring confidentiality of participant data, and safeguarding participant privacy. This sampling method was chosen for practical reasons, as random sampling would have been resource-intensive. Leveraging existing networks and professional contacts facilitated access to a varied participant pool, ensuring engagement and data quality. To enhance representativeness, participants from various dental colleges, urban and rural locations, academic levels, and age groups were included in the sample.

Questionnaire

A self-administered, English-language questionnaire developed using Google Forms was utilized to evaluate perceptions and challenges of online dental education during the COVID-19 pandemic in India [ 11 ]. The questionnaire was structured around three main domains: satisfaction with online teaching, encountered problems, and comparisons between online and traditional classroom learning experiences.

In order to ensure the validity and reliability of this questionnaire within the unique context of Indian dental education, a thorough validation process was undertaken. Face validity was established through evaluation by a qualified researcher and questionnaire design specialist. Their assessment focused on the clarity, comprehensiveness, and relevance of the questions, resulting in revisions to improve clarity and minimize ambiguity in terminology, phrasing, and structure.

Content validity was ensured through the input of two subject-matter experts (SMEs) with significant experience in Indian dental education. These SMEs, who were independent of the study, assessed the questionnaire against the defined research objectives. Their feedback ensured that the questionnaire comprehensively covered the intended constructs, leading to further refinements.

Pilot testing was then conducted with a representative sample of 20 dental students and 10 teachers. This phase aimed to identify and address any remaining issues with the questionnaire’s understandability, flow, and length. Based on the feedback received from the pilot test participants, minor adjustments were made to optimize the user experience.
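The article reports face validity, content validity, and pilot testing rather than a specific reliability coefficient; purely as a hedged illustration, the sketch below shows how internal-consistency reliability (Cronbach's alpha) could be checked on pilot responses. The respondent counts, item scores, and the 0.7 threshold mentioned in the comment are assumptions for demonstration, not values from the study.

```python
# Hedged illustration (not reported in the study): computing Cronbach's alpha
# for internal-consistency reliability on hypothetical pilot-test responses.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: rows = respondents, columns = questionnaire items."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                          # number of items
    item_variances = item_scores.var(axis=0, ddof=1)  # per-item sample variance
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Invented 5-point Likert responses from a pilot sample (30 respondents, 6 items)
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(30, 1))
pilot = np.clip(base + rng.integers(-1, 2, size=(30, 6)), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")  # values of about 0.7 or higher are commonly deemed acceptable
```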

Data analysis

Following data collection, survey responses were entered into a Microsoft Excel spreadsheet and then imported into the Statistical Package for Social Sciences (SPSS) version 25 for analysis. Descriptive statistics were employed to summarize participant characteristics such as age, course of study (undergraduate, postgraduate), place of study (town, village), and self-reported familiarity with e-learning skills. These characteristics were presented as frequencies (N) and percentages (%) to provide an overview of the sample composition.

Chi-square tests were conducted to assess potential associations between categorical variables. However, the use of the Chi-square test is contingent upon meeting the assumption of expected cell frequencies being greater than 5. In instances where expected cell frequencies fell below 5, Fisher’s exact test was employed as a more appropriate alternative. Statistical significance was established at a p -value of 0.05 or less.
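To make the decision rule above concrete, the following sketch runs a chi-square test on an invented 2×2 contingency table and switches to Fisher's exact test when any expected cell count falls below 5, mirroring the procedure described in this subsection. The table values and variable labels are hypothetical and do not come from the study's data.

```python
# Hypothetical example only: choosing between the chi-square test and
# Fisher's exact test based on expected cell counts, as described above.
import numpy as np
from scipy import stats

# Invented 2x2 contingency table, e.g. satisfaction (yes/no) by e-learning skills (yes/no)
table = np.array([[30, 12],
                  [ 8,  3]])

chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

if (expected < 5).any():
    # Expected counts below 5 violate the chi-square assumption,
    # so use Fisher's exact test instead (valid for 2x2 tables).
    odds_ratio, p_value = stats.fisher_exact(table)
    print(f"Fisher's exact test: p = {p_value:.3f}")
else:
    print(f"Chi-square test: chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi2:.3f}")

# A result with p <= 0.05 would be reported as statistically significant.
```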

Participant characteristics and survey completion

A total of 500 students initiated the online survey, with a completion rate of 81.8% ( n  = 409). Similarly, among the 150 teachers who began the survey, 132 completed it (completion rate: 88%). To ensure a sufficient sample size for analysis, the survey period was extended beyond its original timeframe, potentially introducing a selection bias. This decision aligns with the purposive sampling methodology employed in this study.

Student perceptions

Satisfaction with online learning.

A significant portion (44.7%, n  = 183) of students aged 18–21 reported satisfaction with online instruction. Interestingly, age did not significantly influence satisfaction levels. Undergraduates expressed higher satisfaction compared to other course levels ( p  = 0.001). Location also played a role, with students from both urban (41.8%, n  = 79) and rural areas (45.6%, n  = 73) reporting similar contentment levels ( p -value = 0.034). Notably, students with advanced e-learning skills reported significantly higher satisfaction ( p -value = 0.001).

Evaluation of specific aspects

Students across various age groups, locations, and course levels expressed satisfaction with the topics covered ( p  = 0.032 for undergraduate students, p  = 0.002 for those knowledgeable about e-learning) and the instructors’ efforts (particularly those aged 18–21, p  = 0.001, undergraduates, p  = 0.010, and students with e-learning skills, p  = 0.001). However, no significant difference was observed in self-reported understanding of the subject matter based on demographics or e-learning skills. Overall, students aged 18–21 (42.7%, p  = 0.001) and those with e-learning knowledge ( p  = 0.006) exhibited greater appreciation for the quality of teaching.

Engagement and flexibility

Among participants familiar with e-learning (specific number not provided), a significant proportion (42.9%, p -value of 0.019) felt they could effectively engage with instructors during and after online sessions, regardless of age, location, or course level. Additionally, a notable number of undergraduate students with e-learning skills ( p -values of 0.039 and 0.001, respectively) appreciated the flexibility of attending online classes at their convenience. Furthermore, 40.2% of participants with e-learning skills ( p -value = 0.054) found online learning beneficial, particularly for theoretical subjects lacking practical components. Notably, a majority of participants across demographics agreed that online teaching could be valuable for future mass education initiatives (data presented in Table  1 ).

Challenges with online learning

Despite some advantages, participants with e-learning skills (48.2%) also reported internet connectivity and speed issues. Slow internet hindered video streaming for students across all age groups ( p  = 0.005). Only 20.6% of participants with e-learning skills disagreed with this finding.

Interaction and collaboration

Except for those residing in rural and semi-urban areas ( p  = 0.022), participants did not report significant concerns about general interaction problems. However, challenges emerged regarding sound quality and group study. Poor internet connections caused sound issues for students above 21 years old (55%, p  = 0.02) and those without e-learning skills (27.7%, p  = 0.03). Similarly, joint or group study proved difficult for participants over 21 (55%) and those residing in rural areas (48.8%, p  = 0.025).

Subject suitability

A significant portion (41%) of participants unfamiliar with e-learning skills expressed concerns about the effectiveness of online learning for subjects like mathematics, accounting, and laboratory-based courses ( p  = 0.077). This suggests that students perceive these subjects as requiring a more hands-on or interactive approach that may be challenging to replicate in an online environment.

Learning environment

Across all demographics, a consistent trend emerged: most participants reported feelings of isolation and a lack of belonging when learning online (data presented in Table  2 ). This indicates that online learning environments may not adequately foster the sense of community and social interaction typically found in traditional classrooms. Students generally favoured classroom settings for the increased engagement and interaction with teachers and classmates, qualities perceived as lacking in online environments. This preference was further supported by students with limited e-learning skills (33.7%), who agreed that classroom learning was superior and considered online teaching/learning to be less beneficial ( p  = 0.045).

Impact on learning

The majority of participants believed online classes had minimal impact on developing students’ overall personalities and communication skills. Students with limited e-learning skills (50.6%) likened online learning to watching YouTube lectures ( p  = 0.061), implying a passive learning experience. This suggests online learning may not be as effective as traditional classroom settings in fostering these crucial soft skills.

Despite concerns about suitability and learning environment, a significant portion of participants, particularly undergraduates (42.7%, p  = 0.001), expressed satisfaction with the topics covered and the instructors’ efforts in the online environment. This highlights a potential disconnect between student concerns and their actual experience with well-designed online learning.

Most undergraduates strongly agreed (49.2%, p  = 0.04) that online teaching/learning is extremely useful during disasters such as the coronavirus pandemic (Table  3 ). This emphasizes the potential of online learning as a contingency measure for educational continuity during unforeseen circumstances.

Overall, student perceptions regarding the suitability and learning outcomes of online learning were mixed. While some found it beneficial for specific situations and expressed satisfaction with well-designed online courses, concerns existed about its effectiveness in fostering a sense of community, developing soft skills, and replicating the interactive nature of traditional classroom settings.

Teacher perceptions

Advantages of online teaching.

A considerable number of teachers (40%) viewed online classes as a more adaptable alternative to traditional classroom settings. Similarly, nearly half (49.2%) expressed this view regarding student accessibility. Additionally, a significant majority (59.8%) believed online teaching offered students improved 24/7 access to learning materials.

Challenges of online teaching

Teachers reported a significant decrease (50%) in the use of standardized coursework compared to traditional classrooms. While they strongly disagreed (38%) that online teaching eliminates the need for proper lesson planning, they overwhelmingly felt it hindered creating a good interactive environment with students (87%). Furthermore, teachers believed students were less likely to ask questions in an online setting (61%) compared to a physical classroom. However, most teachers (79%) appreciated the elimination of physical travel associated with online teaching.

Suitability and effectiveness

In terms of learner level, online teaching was perceived as more suitable for advanced learners (58%) than beginners. Teachers also believed online teaching was better suited for theory-based subjects (87%) compared to laboratory-based ones. Opinions were divided regarding the optimal use of online teaching for knowledge transfer, with 39% disagreeing and 24% remaining neutral. The teachers concurred that online teaching was a valuable tool during crises like the COVID-19 pandemic, but they generally preferred face-to-face teaching under normal circumstances. Data pertaining to these findings is presented in Table  4 .

The COVID-19 pandemic necessitated a rapid shift to online learning platforms in dental education globally, including India. While this transition aimed to maintain educational continuity [ 12 ], it presented unique challenges for a country grappling with limited internet infrastructure [ 13 ]. Existing disparities in access were exacerbated by the pandemic’s suddenness, highlighting the need for innovative solutions tailored to the Indian context [ 14 ].

Our study aimed to understand the perceptions and challenges of online dental education among students and educators. Our findings resonate with existing research, highlighting both the advantages and limitations of online learning. Similar to previous studies, both students and educators in our research acknowledged the benefits of flexibility, improved online teaching skills, and efficient time management [ 15 , 16 , 17 , 18 , 19 ]. Additionally, the significant role of online resources and social media platforms in fostering learning and interaction, as emphasized by Azer et al. (2023) and Wimardhani et al. (2023), was evident in our findings [ 17 , 19 ].

This research explored factors influencing student satisfaction with online learning. Consistent with Shaheen et al. (2023), our results indicated higher satisfaction among younger students (aged 18–21) and those with stronger e-learning skills, suggesting a correlation with technological comfort [ 18 ]. However, unlike Schlenz et al. (2023) who observed a general preference for online learning, our study did not find significant variations in satisfaction based on age, location, or field of study [ 15 ]. Notably, students with advanced e-learning skills reported higher dissatisfaction with internet connectivity and speed, suggesting a potential link between heightened expectations and increased frustration with technical limitations. This aligns with observations made by Pratheebha & Jayaraman (2022), Chang et al. (2021), and Wimardhani et al. (2023) regarding student challenges in online learning environments [ 16 , 19 , 20 ]. While acknowledging the quality of online instruction, many students in our study, similar to those in Chang et al. (2021), expressed feelings of isolation and a preference for the interactive elements of traditional classroom settings [ 20 ].

The transition to online learning presented specific challenges in dental education, particularly for subjects requiring hands-on experience. Deery (2020) emphasizes the need for dental schools to adapt their curricula and policies to incorporate effective distance learning methods [ 21 ]. Our research reinforces this notion by highlighting the importance of a strong educator-student connection for successful online learning. In the face of these challenges, educators and administrators remain committed to creating a conducive learning environment that prioritizes adaptability.

Online learning platforms offer unique advantages. E-learning technologies empower learners to personalize their learning pace, sequence, and content, leading to improved engagement [ 22 ]. Additionally, recorded online lectures provide flexibility for students to access learning materials at their convenience [ 23 ]. Our research, building upon prior work by Pham (2022) and Chang et al. (2021), demonstrated a weaker association between peer-to-peer interactions and student satisfaction, consistent with findings in other online learning environments [ 20 , 24 ].

Several factors influence the success of online education, including educator willingness to share content online, student capacity for online learning, and the quality of available digital resources [ 25 ]. Political, economic, and cultural factors also significantly influence the transition from traditional to online learning [ 25 ]. While acknowledging the potential for academic collaboration and remote work, many educators recognize the opportunity to integrate blended learning models into future curriculum development [ 26 ].

“Internet self-efficacy” – an individual’s confidence in navigating online tasks – plays a crucial role in online learning success [ 27 ]. In India, internet connectivity disparities between urban and rural areas present a challenge for both students and teachers. These connectivity issues, along with software problems and audio/video functionalities, can disrupt learning and create a frustrating experience. Institutions can mitigate these challenges by offering comprehensive internet skills training to enhance students’ and educators’ internet self-efficacy before implementing online courses [ 24 ]. However, the pandemic’s swift implementation of remote learning may have limited the availability of such training protocols.

Challenges and innovations in clinical skills development

While online learning offers numerous advantages, it presents unique challenges in dental education, particularly for subjects requiring hands-on clinical experience with patients. The absence of direct patient interaction remains a significant hurdle [ 21 ]. However, several institutions are actively addressing this limitation by adopting diverse e-learning tools like flash multimedia, digitized images, virtual patient simulations, and virtual reality (VR) simulators. Research has shown the effectiveness of these tools in teaching various clinical skills, including examination, palpation, surgical procedures, and resuscitation [ 28 ]. Notably, VR simulators have been found to be equally effective as live patient interactions in achieving learning objectives, offering a promising solution for overcoming limitations in online dental education.

The rise of virtual interaction and blended learning models

The COVID-19 pandemic has significantly transformed the educational landscape in dental education by introducing virtual teaching platforms. This shift has reshaped interactions between educators and students, impacting how they learn and assess progress. The rise of web-based resources has facilitated the emergence of innovative virtual interaction methods, such as student-patient simulations and peer mentoring programs. Research suggests these methods can be effective in enhancing medical students’ knowledge and psychological well-being [ 29 ]. However, this transition to online learning has also encountered obstacles, including technical difficulties, privacy concerns, reduced student engagement, and potential exacerbation of mental health issues due to social isolation [ 27 , 29 , 30 ].

Optimizing blended learning for future dental education

The unique circumstances of the COVID-19 pandemic have highlighted the importance of exploring student preferences and technical challenges to optimize blended learning models in dental education. By addressing the diverse needs of students and effectively integrating online and offline learning components, educators can foster successful learning outcomes in an ever-evolving educational environment [ 30 ]. This research underscores the multifaceted nature of online dental education and emphasizes the necessity for collaborative efforts to leverage its advantages while mitigating limitations.

Building educational resilience and adaptability

The significance of these studies extends beyond immediate pandemic adaptations. They contribute to a broader understanding of learning adaptations, hybrid learning environments, digital literacy, pedagogical innovation, mental health and well-being, policy implications, and the continuous enhancement of educational practices [ 30 ]. Reflecting on experiences and lessons learned during the pandemic can assist educational institutions in refining their teaching and learning approaches, ensuring greater resilience and adaptability in the face of future challenges [ 29 ]. Therefore, the insights from these studies offer valuable guidance for shaping the future of dental education and broader educational practices in a post-pandemic world.

Limitations and future research directions

We acknowledge limitations in our study. Employing random sampling methods in future research would be crucial to draw more widely applicable conclusions regarding perceptions and challenges in online dental education in India. Additionally, we recognize the challenges associated with relying on self-reported data, including potential social desirability bias. While acknowledging these limitations, our study adopted a people-centred approach, employing a diverse questionnaire, contextual analysis, and insightful techniques to gain a profound understanding of participants’ experiences with digital instruction. However, these limitations underscore the need for further exploration, particularly in understanding the potential misalignment between outcomes of digital and in-person events from instructors’ perspectives. This area warrants additional research through targeted interviews, subgroup analyses, and consideration of contextual factors, aiming to enhance our understanding of effective teaching modes and benefitting student learning outcomes.

In conclusion, the COVID-19 pandemic has accelerated the adoption of online and virtual teaching platforms in dental education, offering both opportunities and challenges. By exploring student preferences and addressing technical obstacles, educators can refine blended learning models to better cater to diverse student needs. The insights gleaned from pandemic experiences provide valuable direction for bolstering the resilience and adaptability of educational practices in a post-pandemic era.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author upon reasonable request.

Desai BK. Clinical implications of the COVID-19 pandemic on dental education. J Dent Educ. 2020;84:512.


Alsoufi A, Alsuyihili A, Msherghi A, Elhadi A, Atiyah H, Ashini A, Ashwieb A, Ghula M, Ben Hasan H, Abudabuos S, Alameen H, Abokhdhir T, Anaiba M, Nagib T, Shuwayyah A, Benothman R, Arrefae G, Alkhwayildi A, Alhadi A, Zaid A, Elhadi M. Impact of the COVID-19 pandemic on medical education: medical students’ knowledge, attitudes, and practices regarding electronic learning. PLoS One. 2020;15: e0242905.

Hillenburg KL, Cederberg RA, Gray SA, Hurst CL, Johnson GK, Potter BJ. E-learning and the future of dental education: opinions of administrators and information technology specialists. Eur J Dent Educ. 2006;10:169–77.

Naik N, Hameed BMZ, Sooriyaperakasam N, Vinayahalingam S, Patil V, Smriti K, Saxena J, Shah M, Ibrahim S, Singh A, Karimi H, Naganathan K, Shetty DK, Rai BP, Chlosta P, Somani BK. Transforming healthcare through a digital revolution: a review of digital healthcare technologies and solutions. Front Digit Health. 2022;4: 919985.

Machado RA, Bonan PRF, Perez D, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: discussing current and future perspectives. Braz Oral Res. 2020;34:e083.

Talapko J, Perić I, Vulić P, Pustijanac E, Jukić M, Bekić S, Meštrović T, Škrlec I. Mental health and physical activity in health-related university students during the COVID-19 pandemic. Healthc (Basel). 2021;9:801.


Jum’ah AA, Elsalem L, Loch C, Schwass D, Brunton PA. Perception of health and educational risks amongst dental students and educators in the era of COVID-19. Eur J Dent Educ. 2021;25:506–15.

O’Doherty D, Dromey M, Lougheed J, Hannigan A, Last J, McGrath D. Barriers and solutions to online learning in medical education - an integrative review. BMC Med Educ. 2018;18:130.

Schlenz MA, Schmidt A, Wöstmann B, Kramer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20:354.

Röhle A, Horneff H, Willemer MC. Practical teaching in undergraduate human and dental medical training during the COVID-19 crisis. Report on the COVID19-related transformation of peer-based teaching in the Skills Lab using an Inverted Classroom Model. GMS J Med Educ. 2021;38:Doc2.

Wright KB. Researching internet-based populations: advantages and disadvantages of online survey research, online questionnaire authoring software packages, and web survey services. J Computer-Mediated Communication. 2006;10:1034.

Turkyilmaz I, Hariri NH, Jahangiri L. Student's perception of the impact of e-learning on dental education. J Contemp Dent Pract. 2019;20:616–21.

Warnecke E, Pearson S. Medical students’ perceptions of using e-learning to enhance the acquisition of consulting skills. Australas Med J. 2011;4:300–7.

Tull S, Dabner N, Ayebi-Arthur K. Social media and e-learning in response to seismic events: resilient practices. Journal of Open, Flexible and Distance Learning. 2020;24:63–76.

Schlenz MA, Wöstmann B, Krämer N, Schulz-Weidner N. Update of students’ and lecturers’ perspectives on online learning in dental education after a five-semester experience due to the SARS-CoV-2 (COVID-19) pandemic: insights for future curriculum reform. BMC Med Educ. 2023;23(1):556. https://doi.org/10.1186/s12909-023-04544-2 .

Pratheebha C, Jayaraman M. Learning and satisfaction levels with online teaching methods among undergraduate dental students - a survey. J Adv Pharm Technol Res. 2022;13(Suppl 1):S168–72. https://doi.org/10.4103/japtr.japtr_285_22 . (Epub 2022 Nov 30).

Azer SA, Alhudaithi D, AlBuqami F, et al. Online learning resources and social media platforms used by medical students during the COVID-19 pandemic. BMC Med Educ. 2023;23:969. https://doi.org/10.1186/s12909-023-04906-w .

Shaheen MY, Basudan AM, Almubarak AM, Alzawawi AS, Al-Ahmari FM, Aldulaijan HA, Almoharib H, Ashri NY. Dental students’ perceptions towards e-learning in comparison with traditional classroom learning. Cureus. 2023;26: e51129. https://doi.org/10.7759/cureus.51129 .

Wimardhani YS, Indrastiti RK, Ayu AP, Soegyanto AI, Wardhany II, Subarnbhesaj A, Nik Mohd Rosdy NMM, Do TT. Perceptions of online learning implementation in dental education during the COVID-19 pandemic: a cross-sectional study of dental school faculty members in Southeast Asia. Dent J (Basel). 2023;11:article 201. https://doi.org/10.3390/dj11090201 .

Yu-Fong Chang J, Wang LH, Lin TC, Cheng FC, Chiang CP. Comparison of learning effectiveness between physical classroom and online learning for dental education during the COVID-19 pandemic. J Dent Sci. 2021;16:1281–9. https://doi.org/10.1016/j.jds.2021.07.016 .

Deery C. The COVID-19 pandemic: implications for dental education. Evid Based Dent. 2020;21:46–7.

Taylor DL, Yeung M, Bashet AZ. Personalized and adaptive learning. 2021. p. 17–34.

Tull S, Dabner N, Ayebi-Arthur K. Social media and e-learning in response to seismic events: resilient practices. J Open Flex Distance Learn. 2017;21:63–76.

Pham AT. Engineering students’ interaction in online classes via google meet: a case study during the COVID-19 pandemic. Int J Eng Pedagogy (iJEP). 2022;12:158–70.

Scanlon E, McAndrew P, O’Shea T. Designing for educational technology to enhance the experience of learners in distance education: how open educational resources, learning design and Moocs are influencing learning. J Interact Media Educ. 2015;2015:1–9.

Ong SGT, Quek GCL. Enhancing teacher–student interactions and student online engagement in an online learning environment. Learn Environ Res. 2023;26:681–707.

Hsu M-H, Chiu C-M. Internet self-efficacy and electronic service acceptance. Decis Support Syst. 2004;38:369–81.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020;84:718–22.

Pokhrel S, Chhetri RA. Literature review on impact of COVID-19 pandemic on teaching and learning. Higher Education for the Future. 2021;8:133–41. https://doi.org/10.1177/2347631120983481 .

Nurunnabi M, Almusharraf N, Aldeghaither D. Mental health and well-being during the COVID-19 pandemic in higher education: evidence from G20 countries. J Public Health Res. 2021;9:2010.


Acknowledgements

The authors are grateful to all the volunteers who participated in the study.

The authors did not receive support from any organizations for the submitted work.

Author information

Authors and Affiliations

Department of Conservative Dentistry and Endodontics, AB Shetty Memorial Institute of Dental Sciences, NITTE Deemed to be University, Mangalore, Karnataka, India

Lakshmi Nidhi Rao & Aditya Shetty

Forensic Medicine Unit, Faculty of Medicine, AIMST University, Jalan Semeling, Bedong, Kedah, Malaysia

Department of Oral Pathology and Microbiology, Manipal College of Dental Sciences, Mangalore, Karnataka, India

Srikant Natarajan

Research Unit, Mangalore Institute of Oncology, Pumpwell, Mangalore, Karnataka, 575002, India

Manjeshwar Shrinath Baliga

Department of Conservative Dentistry, Faculty of Dental Medicine, Universitas Airlangga, Surabaya, Indonesia

Dian Agustin Wahjuningrum

Department of Conservative Dentistry and Endodontics, Nair Hospital Dental College, Mumbai, Maharashtra, 400008, India

Heeresh Shetty & Ajinkya M. Pawar

Department of Conservative Dentistry, Universitas Padjadjaran, Bandung, Indonesia

Irmaleny Irmaleny


Contributions

LNR: Conception and design of the study, Data acquisition, Data analysis, Discussion of the results, Drafting of the manuscript. AS: Conception and design of the study, Data acquisition, Discussion of the results, Drafting of the manuscript. VP: Conception and design of the study, Data acquisition, Data analysis, Discussion of the results, Drafting of the manuscript. SN: Conception and design of the study, Data acquisition, Data analysis, Discussion of the results, Drafting of the manuscript. MSB: Conception and design of the study, Data acquisition, Data analysis, Discussion of the results, Drafting of the manuscript. HS: Drafting of the manuscript, Proofreading and editing for final submission. AMP: Proofreading and editing for final submission. DAW: Proofreading and editing for final submission. II: Proofreading and editing for final submission. All authors read and approved the final manuscript.

Corresponding authors

Correspondence to Dian Agustin Wahjuningrum or Ajinkya M. Pawar .

Ethics declarations

Ethics approval and consent to participate.

The study was conducted after obtaining clearance from the Institutional Ethics Committee (PEAIEC/2020/01/05).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary material 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Rao, L.N., Shetty, A., Pai, V. et al. Perceptions and challenges of online teaching and learning amidst the COVID-19 pandemic in India: a cross-sectional study with dental students and teachers. BMC Med Educ 24 , 637 (2024). https://doi.org/10.1186/s12909-024-05340-2


Received : 18 October 2023

Accepted : 22 March 2024

Published : 06 June 2024

DOI : https://doi.org/10.1186/s12909-024-05340-2


  • Coronavirus pandemic
  • Face-to-face-instruction
  • Online education
  • Education and training
  • Worldwide web technology



Students’ experience of online learning during the COVID‐19 pandemic: A province‐wide survey study

Lixiang Yan

1 Centre for Learning Analytics at Monash, Faculty of Information Technology, Monash University, Clayton VIC, Australia

Alexander Whitelock‐Wainwright

2 Portfolio of the Deputy Vice‐Chancellor (Education), Monash University, Melbourne VIC, Australia

Quanlong Guan

3 Department of Computer Science, Jinan University, Guangzhou China

Gangxin Wen

4 College of Cyber Security, Jinan University, Guangzhou China

Dragan Gašević

Guanliang Chen

Associated Data

The data is not openly available as it is restricted by the Chinese government.

Online learning is currently adopted by educational institutions worldwide to provide students with ongoing education during the COVID-19 pandemic. Even though online learning research has been advancing in uncovering student experiences in various settings (i.e., tertiary, adult, and professional education), very little progress has been achieved in understanding the experience of the K-12 student population, especially when narrowed down to different school-year segments (i.e., primary and secondary school students). This study explores how students at different stages of their K-12 education reacted to the mandatory full-time online learning during the COVID-19 pandemic. For this purpose, we conducted a province-wide survey study in which the online learning experience of 1,170,769 Chinese students was collected from the Guangdong Province of China. We performed cross-tabulation and Chi-square analysis to compare students' online learning conditions, experiences, and expectations. Results from this survey study provide evidence that students' online learning experiences are significantly different across school years. Foremost, policy implications were made to advise government authorities and schools on improving the delivery of online learning, and potential directions were identified for future research into K-12 online learning.

Practitioner notes

What is already known about this topic

  • Online learning has been widely adopted during the COVID‐19 pandemic to ensure the continuation of K‐12 education.
  • Student success in K‐12 online education is substantially lower than in conventional schools.
  • Students experienced various difficulties related to the delivery of online learning.

What this paper adds

  • Provide empirical evidence for the online learning experience of students in different school years.
  • Identify the different needs of students in primary, middle, and high school.
  • Identify the challenges of delivering online learning to students of different ages.

Implications for practice and/or policy

  • Authorities and schools need to provide sufficient technical support to students in online learning.
  • The delivery of online learning needs to be customised for students in different school years.

INTRODUCTION

The ongoing COVID‐19 pandemic poses significant challenges to the global education system. By July 2020, the UN Educational, Scientific and Cultural Organization (2020) reported nationwide school closure in 111 countries, affecting over 1.07 billion students, which is around 61% of the global student population. Traditional brick‐and‐mortar schools are forced to transform into full‐time virtual schools to provide students with ongoing education (Van Lancker & Parolin,  2020 ). Consequently, students must adapt to the transition from face‐to‐face learning to fully remote online learning, where synchronous video conferences, social media, and asynchronous discussion forums become their primary venues for knowledge construction and peer communication.

For K-12 students, this sudden transition is problematic, as they often lack prior online learning experience (Barbour & Reeves, 2009). Barbour and LaBonte (2017) estimated that even in countries where online learning is growing rapidly, such as the USA and Canada, less than 10% of the K-12 student population had prior experience with this format. Maladaptation to online learning could expose inexperienced students to various vulnerabilities, including decrements in academic performance (Molnar et al., 2019), feelings of isolation (Song et al., 2004), and lack of learning motivation (Muilenburg & Berge, 2005). Unfortunately, with confirmed cases continuing to rise each day and new outbreaks occurring on a global scale, full-time online learning for most students could last longer than anticipated (World Health Organization, 2020). Even after the pandemic, the current mass adoption of online learning could have lasting impacts on the global education system and potentially accelerate and expand the rapid growth of virtual schools on a global scale (Molnar et al., 2019). Thus, understanding students' learning conditions and their experiences of online learning during the COVID-19 pandemic becomes imperative.

Emerging evidence on students' online learning experience during the COVID-19 pandemic has identified several major concerns, including issues with internet connection (Agung et al., 2020; Basuony et al., 2020), problems with IT equipment (Bączek et al., 2021; Niemi & Kousa, 2020), limited collaborative learning opportunities (Bączek et al., 2021; Yates et al., 2020), reduced learning motivation (Basuony et al., 2020; Niemi & Kousa, 2020; Yates et al., 2020), and increased learning burdens (Niemi & Kousa, 2020). Although these findings provided valuable insights about the issues students experienced during online learning, information about their learning conditions and future expectations was less often reported. Such information could assist educational authorities and institutions to better comprehend students' difficulties and potentially improve their online learning experience. Additionally, most of these recent studies were limited to higher education, except for Yates et al. (2020) and Niemi and Kousa's (2020) studies on senior high school students. Empirical research targeting the full spectrum of K-12 students remains scarce. Therefore, to address these gaps, the current paper reports the findings of a large-scale study that sought to explore K-12 students' online learning experience during the COVID-19 pandemic in a provincial sample of over one million Chinese students. The findings of this study provide policy recommendations to educational institutions and authorities regarding the delivery of K-12 online education.

LITERATURE REVIEW

Learning conditions and technologies

Having stable access to the internet is critical to students' learning experience during online learning. Berge (2005) expressed the concern that the divide in digital readiness and pedagogical approaches between countries could influence students' online learning experience. Digital readiness is the availability and adoption of information technologies and infrastructures in a country. Western countries like America (3rd) scored significantly higher in digital readiness than Asian countries like China (54th; Cisco, 2019). Students from countries with low digital readiness could experience additional technology-related problems. Supporting evidence is emerging in recent studies conducted during the COVID-19 pandemic. In Egypt's capital city, Basuony et al. (2020) found that only around 13.9% of the students experienced issues with their internet connection, whereas more than two-thirds of the students in rural Indonesia reported unstable internet, insufficient internet data, and incompatible learning devices (Agung et al., 2020).

Another influential factor for K-12 students to adequately adapt to online learning is the accessibility of appropriate technological devices, especially access to a desktop or a laptop (Barbour et al., 2018). However, most students are unlikely to satisfy this requirement. Even in higher education, around 76% of students reported having incompatible devices for online learning; only 15% of students used a laptop for online learning, whereas around 85% used a smartphone (Agung et al., 2020). It is very likely that K-12 students also suffer from this availability issue, as they depend on their parents to provide access to relevant learning devices.

Technical issues surrounding technological devices could also influence students' experience in online learning. Barbour and Reeves (2009) argue that students need a high level of digital literacy to find and use relevant information and to communicate with others through technological devices. Students lacking this ability could experience difficulties in online learning. Bączek et al. (2021) found that around 54% of medical students experienced technical problems with IT equipment, and this issue was more prevalent among students in lower years of tertiary education. Likewise, Niemi and Kousa (2020) found that students in a Finnish high school experienced increased amounts of technical problems during the examination period, which involved additional technical applications. These findings are concerning, as young children and adolescents in primary and lower secondary school could be more vulnerable to these technical problems because they are less experienced with the technologies used in online learning (Barbour & LaBonte, 2017). Therefore, it is essential to investigate the learning conditions and related difficulties experienced by students in K-12 education, as the extent of the effects on them remains underexplored.

Learning experience and interactions

Apart from the aforementioned issues, the extent of interaction and collaborative learning opportunities available in online learning could also influence students' experience. The literature on online learning has long emphasised the role of effective interaction in the success of student learning. According to Muirhead and Juwah (2004), interaction is an event that can take the shape of any type of communication between two or more subjects and objects. Specifically, the literature acknowledges three typical forms of interaction (Moore, 1989): (i) student-content, (ii) student-student, and (iii) student-teacher. Anderson (2003) posits, in the well-known interaction equivalency theorem, that learning experiences will not deteriorate if only one of the three interactions is of high quality, and the other two can be reduced or even eliminated. Quality interaction can be accomplished across two dimensions: (i) structure, the pedagogical means that guide student interaction with content or other students, and (ii) dialogue, the communication that happens between students and teachers and among students. To be able to scale online learning and prevent the growth of teaching costs, the emphasis is typically on structure (i.e., pedagogy) that can promote effective student-content and student-student interaction. The role of technology and media is typically recognised as a way to amplify the effect of pedagogy (Lou et al., 2006). Novel technological innovations, for example learning analytics-based personalised feedback at scale (Pardo et al., 2019), can also empower teachers to promote their interaction with students.

Online education can lead to a sense of isolation, which can be detrimental to student success (McInnerney & Roberts, 2004). Therefore, integrating social interaction into pedagogy for online learning is essential, especially when students do not know each other or have underdeveloped communication and collaboration skills (Garrison et al., 2010; Gašević et al., 2015). Unfortunately, existing evidence suggests that online learning delivery during the COVID-19 pandemic often lacked interactivity and collaborative experiences (Bączek et al., 2021; Yates et al., 2020). Bączek et al. (2021) found that around half of the medical students reported reduced interaction with teachers, and only 4% of students thought online classes were interactive. Likewise, Yates et al.'s (2020) study of high school students revealed that over half of the students preferred in-class collaboration over online collaboration, as they valued the immediate support and the proximity to teachers and peers in in-class interaction.

Learning expectations and age differentiation

Although these studies have provided valuable insights and stressed the need for more interactivity in online learning, K-12 students in different school years could have different expectations of the desired activities in online learning. Piaget's Cognitive Developmental Theory illustrates children's difficulties in understanding abstract and hypothetical concepts (Thomas, 2000). Primary school students will encounter many abstract concepts in their STEM education (Uttal & Cohen, 2012). In face-to-face learning, teachers provide constant guidance on students' learning progress and can help them to understand difficult concepts. Unfortunately, the level of guidance drops significantly in online learning, and, in most cases, children have to face learning obstacles by themselves (Barbour, 2013). Additionally, lower primary school students may lack the metacognitive skills to use various online learning functions, maintain engagement in synchronous online learning, develop and execute self-regulated learning plans, and engage in meaningful peer interactions during online learning (Barbour, 2013; Broadbent & Poon, 2015; Huffaker & Calvert, 2003; Wang et al., 2013). Thus, understanding these younger students' expectations is imperative, as delivering online learning to them in the same way as a virtual high school could hinder their learning experiences. For students with more mature metacognition, expectations of online learning could be substantially different from those of younger students. Niemi and Kousa's (2020) study of students in a Finnish high school found that students often reported heavy workloads and fatigue during online learning. These issues could cause anxiety and reduce students' learning motivation, which would have negative consequences on their emotional well-being and academic performance (Niemi & Kousa, 2020; Yates et al., 2020), especially for senior students who are under the pressure of examinations. Consequently, their expectations of online learning could be oriented toward having additional learning support functions and materials. Likewise, they could also prefer having more opportunities for peer interactions, as these interactions are beneficial to their emotional well-being and learning performance (Gašević et al., 2013; Montague & Rinaldi, 2001). Therefore, it is imperative to investigate the differences in online learning expectations between students of different school years to better suit their needs.

Research questions

By building upon the aforementioned relevant works, this study aimed to contribute to the online learning literature with a comprehensive understanding of the online learning experience that K‐12 students had during the COVID‐19 pandemic period in China. Additionally, this study also aimed to provide a thorough discussion of what potential actions can be undertaken to improve online learning delivery. Formally, this study was guided by three research questions (RQs):

RQ1. What learning conditions were experienced by students across 12 years of education during their online learning process in the pandemic period?

RQ2. What benefits and obstacles were perceived by students across 12 years of education when performing online learning?

RQ3. What expectations do students, across 12 years of education, have for future online learning practices?

Participants

The total number of K-12 students in the Guangdong Province of China is around 15 million. In China, students of Year 1–6, Year 7–9, and Year 10–12 are referred to as students of primary school, middle school, and high school, respectively. Typically, students in China start primary school at the age of around six. At the end of their high-school study, students have to take the National College Entrance Examination (NCEE; also known as Gaokao) to apply for tertiary education. The survey was administered across the whole Guangdong Province, that is, it was exposed to all of the 15 million K-12 students, though it was not mandatory for those students to complete the survey. A total of 1,170,769 students completed the survey, which accounts for a response rate of 7.80%. After removing responses with missing values and responses submitted from the same IP address (duplicates), we had 1,048,575 valid responses, which amounts to about 7% of the total K-12 students in the Guangdong Province. The number of students in different school years is shown in Figure 1. Overall, students were evenly distributed across different school years, except for a smaller sample of students in Year 10–12.

[Figure 1: The number of students in each school year]
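The response-cleaning step described above (dropping incomplete responses and duplicate submissions from the same IP address) can be sketched as follows. This is an illustrative reconstruction in Python, not the authors' code, and the column name used for the IP field is an assumption about the export format.

```python
# Minimal sketch of the response-cleaning step: drop responses with missing
# values, then drop duplicate submissions from the same IP address.
# The column name "ip_address" is a hypothetical label for the exported field.
import pandas as pd

def clean_responses(raw: pd.DataFrame) -> pd.DataFrame:
    """Return only complete, de-duplicated survey responses."""
    complete = raw.dropna()                                # remove responses with missing values
    return complete.drop_duplicates(subset="ip_address")   # keep one response per IP address

# Hypothetical usage:
# raw = pd.read_csv("survey_export.csv")
# valid = clean_responses(raw)
# print(f"{len(valid)} valid responses out of {len(raw)} submissions")
```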

Survey design

The survey was designed collaboratively by multiple relevant parties. Firstly, three educational researchers working in colleges and universities and three educational practitioners working in the Department of Education in Guangdong Province were recruited to co‐design the survey. Then, the initial draft of the survey was sent to 30 teachers from different primary and secondary schools, whose feedback and suggestions were considered to improve the survey. The final survey consisted of a total of 20 questions, which, broadly, can be classified into four categories: demographic, behaviours, experiences, and expectations. Details are available in Appendix.

All K‐12 students in the Guangdong Province were made to have full‐time online learning from March 1, 2020 after the outbreak of COVID‐19 in January in China. A province‐level online learning platform was provided to all schools by the government. In addition to the learning platform, these schools can also use additional third‐party platforms to facilitate the teaching activities, for example WeChat and Dingding, which provide services similar to WhatsApp and Zoom. The main change for most teachers was that they had to shift the classroom‐based lectures to online lectures with the aid of web‐conferencing tools. Similarly, these teachers also needed to perform homework marking and have consultation sessions in an online manner.

The Department of Education in the Guangdong Province of China distributed the survey to all K-12 schools in the province on March 21, 2020 and collected responses on March 26, 2020. Students could access and answer the survey anonymously by either scanning the Quick Response code provided with the survey or clicking the survey link on their mobile devices. The survey was administered on a completely voluntary basis and no incentives were given to the participants. Ethical approval was granted by the Department of Education in the Guangdong Province. Parental approval was not required since the survey was entirely anonymous and facilitated by the regulating authority, which satisfies China's ethical process.

The original survey was in Chinese and was later translated by two bilingual researchers and verified by an external translator who is certified by the Australian National Accreditation Authority of Translators and Interpreters. The original and translated survey questionnaires are available in the Supporting Information. Given the limited space we have here and the fact that not every survey item is relevant to the RQs, the following items were chosen to answer the RQs: items Q3 (learning media) and Q11 (learning approaches) for RQ1, items Q13 (perceived obstacles) and Q17 (perceived benefits) for RQ2, and item Q19 (expected learning activities) for RQ3. Cross-tabulation-based approaches were used to analyse the collected data. To scrutinise whether the differences displayed by students of different school years were statistically significant, we performed Chi-square tests and calculated Cramer's V to assess the strength of the association once the Chi-square test had determined significance.

For the analyses, students were segmented into four categories based on their school years, that is, Year 1–3, Year 4–6, Year 7–9, and Year 10–12, to provide a clear understanding of the different experiences and needs that different students had for online learning. This segmentation was based on the educational structure of Chinese schools: elementary school (Year 1–6), middle school (Year 7–9), and high school (Year 10–12). Children in elementary school can be further segmented into junior (Year 1–3) and senior (Year 4–6) students because senior elementary students in China face heavier workloads than junior students due to the provincial Middle School Entry Examination at the end of Year 6.
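A compact sketch of this analysis pipeline is given below: mapping school years to the four segments, cross-tabulating a categorical response against the segments, and computing the Chi-square statistic and Cramer's V. It is an illustration under assumed column names, not the authors' analysis code.

```python
# Illustrative analysis sketch: segment by school year, cross-tabulate a
# categorical response, run a Chi-square test, and compute Cramer's V.
# Column names ("school_year", "learning_medium") are assumptions.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def year_segment(year: int) -> str:
    """Map a school year (1-12) to the four segments used in the analysis."""
    if year <= 3:
        return "Year 1-3"
    if year <= 6:
        return "Year 4-6"
    if year <= 9:
        return "Year 7-9"
    return "Year 10-12"

def cramers_v(table: pd.DataFrame) -> float:
    """Cramer's V = sqrt(chi2 / (n * df*)), with df* = min(rows - 1, columns - 1)."""
    chi2, _, _, _ = chi2_contingency(table)
    n = table.to_numpy().sum()
    df_star = min(table.shape[0] - 1, table.shape[1] - 1)
    return float(np.sqrt(chi2 / (n * df_star)))

# Hypothetical usage with cleaned responses:
# responses["segment"] = responses["school_year"].apply(year_segment)
# table = pd.crosstab(responses["segment"], responses["learning_medium"])
# chi2, p, dof, expected = chi2_contingency(table)
# print(chi2, p, cramers_v(table))
```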

Learning conditions—RQ1

Learning media

The Chi-square test showed a significant association between school years and students' reported usage of learning media, χ²(55, N = 1,853,952) = 46,675.38, p < 0.001. Cramer's V is 0.07 (df* = 5), which indicates a small-to-medium effect according to Cohen's (1988) guidelines. Based on Figure 2, we observed that up to 87.39% of students used smartphones to perform online learning, while only 25.43% used computers, which suggests that smartphones, with their widespread availability in China (2020), have been adopted by students for online learning. As for the prevalence of the two media, we noticed that both smartphones (χ²(3, N = 1,048,575) = 9,395.05, p < 0.001, Cramer's V = 0.10, df* = 1) and computers (χ²(3, N = 1,048,575) = 11,025.58, p < 0.001, Cramer's V = 0.10, df* = 1) were adopted more by high-school-year students (Year 7–12) than by early-school-year students (Year 1–6), both with a small effect size. Besides, apparent discrepancies can be observed between the usage of TV and paper-based materials across different school years: early-school-year students reported more TV usage (χ²(3, N = 1,048,575) = 19,505.08, p < 0.001), with a small-to-medium effect size, Cramer's V = 0.14 (df* = 1), while high-school-year students (especially Year 10–12) reported more usage of paper-based materials (χ²(3, N = 1,048,575) = 23,401.64, p < 0.001), with a small-to-medium effect size, Cramer's V = 0.15 (df* = 1).

[Figure 2: Learning media used by students in online learning]
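The "small" and "small-to-medium" labels used in these results follow Cohen's (1988) benchmarks, which for Cramer's V scale with df*: the conventional effect-size thresholds of 0.10, 0.30, and 0.50 are divided by the square root of df*. The helper below (an illustrative sketch, not part of the study) makes that scaling explicit.

```python
# Cohen's (1988) benchmarks for Cramer's V scale with df* = min(rows - 1, columns - 1):
# the thresholds 0.10 (small), 0.30 (medium), and 0.50 (large) are divided by sqrt(df*).
import math

def cohen_thresholds(df_star: int) -> dict:
    """Return the small/medium/large thresholds for Cramer's V at a given df*."""
    return {label: round(w / math.sqrt(df_star), 3)
            for label, w in (("small", 0.10), ("medium", 0.30), ("large", 0.50))}

# With df* = 5 the thresholds are roughly 0.045, 0.134, and 0.224,
# so V = 0.07 falls between "small" and "medium".
print(cohen_thresholds(5))   # {'small': 0.045, 'medium': 0.134, 'large': 0.224}
```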

Learning approaches

School year is also significantly associated with the different learning approaches students used to tackle difficult concepts during online learning, χ²(55, N = 2,383,751) = 58,030.74, p < 0.001. The strength of this association is weak to moderate, as shown by Cramer's V (0.07, df* = 5; Cohen, 1988). When encountering problems with difficult concepts, students typically chose to “solve independently by searching online” or “rewatch recorded lectures” instead of consulting their teachers or peers (Figure 3). This is probably because, compared to classroom-based education, it is relatively less convenient and more challenging for students to seek help from others when performing online learning. Besides, compared to high-school-year students, early-school-year students (Year 1–6) reported much less use of “solve independently by searching online” (χ²(3, N = 1,048,575) = 48,100.15, p < 0.001), with a small-to-medium effect size, Cramer's V = 0.21 (df* = 1). Also, among the approaches to seeking help from others, significantly more high-school-year students preferred “communicating with other students” than early-school-year students (χ²(3, N = 1,048,575) = 81,723.37, p < 0.001), with a medium effect size, Cramer's V = 0.28 (df* = 1).

[Figure 3: Learning approaches used by students in online learning]

Perceived benefits and obstacles—RQ2

Perceived benefits

The association between school years and perceived benefits in online learning is statistically significant, χ 2 (66, N  = 2,716,127) = 29,534.23, p  < 0.001, and the Cramer's V (0.04, df ∗ = 6) indicates a small effect (Cohen,  1988 ). Unsurprisingly, benefits brought by the convenience of online learning are widely recognised by students across all school years (Figure  4 ), that is up to 75% of students reported that it is “more convenient to review course content” and 54% said that they “can learn anytime and anywhere” . Besides, we noticed that about 50% of early‐school‐year students appreciated the “access to courses delivered by famous teachers” and 40%–47% of high‐school‐year students indicated that online learning is “helpful to develop self‐regulation and autonomy” .

[Figure 4: Perceived benefits of online learning reported by students]

Perceived obstacles

The Chi-square test shows a significant association between school years and students' perceived obstacles in online learning, χ²(77, N = 2,699,003) = 31,987.56, p < 0.001. This association is relatively weak, as shown by Cramer's V (0.04, df* = 7; Cohen, 1988). As shown in Figure 5, the biggest obstacle, encountered by up to 73% of students, was “eyestrain caused by long staring at screens”. Disengagement caused by nearby disturbance was reported by around 40% of students, especially those of Year 1–3 and 10–12. Technology-wise, about 50% of students experienced poor Internet connection during their learning process, and around 20% of students reported “confusion in setting up the platforms” across all school years.

[Figure 5: Perceived obstacles of online learning reported by students]

Expectations for future practices of online learning – RQ3

Online learning activities

The association between school years and students’ expected online learning activities is significant, χ 2 (66, N  = 2,416,093) = 38,784.81, p < 0.001. The Cramer's V is 0.05 ( df ∗ = 6) which suggests a small effect (Cohen,  1988 ). As shown in Figure  6 , the most expected activity for future online learning is “real‐time interaction with teachers” (55%), followed by “online group discussion and collaboration” (38%). We also observed that more early‐school‐year students expect reflective activities, such as “regular online practice examinations” ( χ 2 (3, N  = 1,048,575) = 11,644.98, p < 0.001), with a small effect size, Cramer's V  = 0.11 ( df ∗ = 1). In contrast, more high‐school‐year students expect “intelligent recommendation system …” ( χ 2 (3, N  = 1,048,575) = 15,327.00, p < 0.001), with a small effect size, Cramer's V  = 0.12 ( df ∗ = 1).

[Figure 6: Students' expected online learning activities]

Regarding students' learning conditions, substantial differences were observed between students in different school years in the learning media, family dependency, and learning approaches adopted in online learning. The finding of more computer and smartphone usage among high-school-year than early-school-year students can probably be explained by the fact that high-school-year students, with their growing ability to use these media and the educational systems and tools that run on them, tend to make better use of them for online learning. The differences in the use of paper-based materials may reflect that high-school-year students in China have to complete a substantial amount of exercises, assignments, and exam papers to prepare for the National College Entrance Examination (NCEE), whose delivery was not entirely digitised due to the sudden transition to online learning. Meanwhile, high-school-year students may also have preferred using paper-based materials for exam practice, as they would eventually take the NCEE in paper format. Therefore, these substantial differences in students' usage of learning media should be addressed by customising the delivery method of online learning for different school years.

Other than these between-age differences in learning media, the prevalence of smartphones in online learning resonates with Agung et al.'s (2020) finding on the issues surrounding the availability of compatible learning devices. The prevalence of smartphones among K-12 students is potentially problematic, as the majority of online learning platforms and content are designed for computer-based learning (Berge, 2005; Molnar et al., 2019), and learning with smartphones has its own unique challenges. For example, Gikas and Grant (2013) discovered that students who learn with smartphones experience frustration with the small screen size, especially when trying to type with the tiny keypad. Another challenge relates to the distraction of various social media applications. Although similar distractions exist in computer and web-based social media, the level of popularity, especially among the young generation, is much higher for mobile-based social media (Montag et al., 2018). In particular, the message notification function in smartphones could disengage students from learning activities and lure them towards social media applications (Gikas & Grant, 2013). Given these challenges of learning with smartphones, more research effort should be devoted to analysing students' online learning behaviour in mobile learning settings to better accommodate their needs.

The differences in learning approaches, once again, illustrate that early-school-year students have different needs compared to high-school-year students. In particular, the low usage of independent learning methods among early-school-year students may reflect their inability to engage in independent learning. Besides, the differences in help-seeking behaviours demonstrate the distinctive needs for communication and interaction among different students: early-school-year students rely strongly on teachers, whereas high-school-year students, who are equipped with stronger communication abilities, are more inclined to interact with their peers. This finding implies that the design of online learning platforms should take students' different needs into account. Thus, customisation is urgently needed in the delivery of online learning to different school years.

In terms of the perceived benefits and challenges of online learning, our results resonate with several previous findings. In particular, the benefits of convenience are in line with the flexibility advantages of online learning mentioned in prior works (Appana, 2008; Bączek et al., 2021; Barbour, 2013; Basuony et al., 2020; Harvey et al., 2014). Early-school-year students' higher appreciation of “access to courses delivered by famous teachers” and lower appreciation of the independent learning skills developed through online learning are also in line with previous literature (Barbour, 2013; Harvey et al., 2014; Oliver et al., 2009). Again, these findings may indicate the strong reliance that early-school-year students place on teachers, while high-school-year students are more capable of adapting to online learning by developing independent learning skills.

Technology-wise, students' experience of poor internet connection and confusion in setting up online learning platforms is particularly concerning. The problem of poor internet connection corroborates findings reported in prior studies (Agung et al., 2020; Barbour, 2013; Basuony et al., 2020; Berge, 2005; Rice, 2006), that is, access issues surrounding the digital divide remain one of the main challenges of online learning. In the era of 4G and 5G networks, educational authorities and institutions that deliver online education could fall into the misconception that most students have a stable internet connection at home. The internet issue we observed is particularly vital to students' online learning experience, as most students prefer real-time communication (Figure 6), which relies heavily on a stable internet connection. Likewise, the finding of students' confusion with technology is also consistent with prior studies (Bączek et al., 2021; Muilenburg & Berge, 2005; Niemi & Kousa, 2020; Song et al., 2004). Students who were unsuccessful in setting up the online learning platforms could experience declines in confidence and enthusiasm for online learning, leading to a subsequent unpleasant learning experience. Therefore, both the readiness of internet infrastructure and students' technical skills remain significant challenges for the mass adoption of online learning.

On the other hand, students' experience of eyestrain from extended screen time provides empirical evidence for Spitzer's (2001) speculation about the potential ergonomic impact of online learning. This negative effect is potentially related to the prevalence of smartphone devices and their limited screen size. This finding not only demonstrates the potential ergonomic issues that could be caused by smartphone-based online learning but also resonates with the aforementioned necessity of different platform and content designs for different students.

A less-mentioned problem in previous studies of online learning experiences is the disengagement caused by nearby disturbance, especially in Year 1–3 and 10–12. It is likely that early-school-year students suffered from this problem because of their underdeveloped metacognitive skills for concentrating on online learning without teachers' guidance. As for high-school-year students, the reasons behind their disengagement require further investigation in the future. In particular, it would be worthwhile to scrutinise whether this type of disengagement is caused by the substantial amount of coursework they have to undertake and the subsequently higher pressure and lower concentration while learning.

Age-level differences are also apparent in students' expectations of online learning. Our results demonstrated students' need for social interaction with others during online learning, in line with previous findings (Bączek et al., 2021; Harvey et al., 2014; Kuo et al., 2014; Liu & Cavanaugh, 2012; Yates et al., 2020). This need manifested differently across school years, with early-school-year students preferring more teacher interaction and learning regulation support. Once again, this finding may imply that early-school-year students are inadequately equipped to engage with online learning without proper guidance from their teachers. High-school-year students, in contrast, prefer more peer interaction and recommendations of learning resources. This expectation can probably be explained by the large amount of coursework assigned to them; high-school-year students need further guidance to help them better direct their learning efforts. These differences in students' expectations for future practices could guide the customisation of online learning delivery.

Implications

As shown in our results, improving the delivery of online learning not only requires the efforts of policymakers but also depends on the actions of teachers and parents. The following sub-sections provide recommendations for the relevant stakeholders and discuss their essential roles in supporting online education.

Technical support

The majority of students experienced technical problems during online learning, including internet lag and confusion in setting up the learning platforms. These problems with technology could impair students' learning experience (Kauffman, 2015; Muilenburg & Berge, 2005). Educational authorities and schools should always provide a thorough guide and assistance for students who are experiencing technical problems with online learning platforms or other related tools. Early screening and detection could also assist schools and teachers in directing their efforts more effectively to help students with low technology skills (Wilkinson et al., 2010). A potential identification method involves distributing age-specific surveys that assess students' Information and Communication Technology (ICT) skills at the beginning of online learning. For example, empirically validated ICT surveys are available for both primary (Aesaert et al., 2014) and high school (Claro et al., 2012) students.

For students who had problems with internet lag, the delivery of online learning should provide options that require less data and bandwidth. Lecture recording is the existing option but fails to address students' need for real-time interaction (Clark et al., 2015; Malik & Fatima, 2017). A potential alternative involves providing students with the option to learn with digital or physical textbooks and audio-conferencing, instead of screen sharing and video-conferencing. This approach significantly reduces data usage and lowers the bandwidth required for students to engage in smooth online interactions (Cisco, 2018). It also requires little additional effort from teachers, as official textbooks are often available for each school year and thus teachers only need to guide students through the materials during audio-conferencing. Educational authorities can further support this approach by making digital textbooks available to teachers and students, especially those in financial hardship. However, the lack of visual and instructor presence could potentially reduce students' attention, recall of information, and satisfaction in online learning (Wang & Antonenko, 2017). Therefore, further research is required to understand whether the combination of digital or physical textbooks and audio-conferencing is appropriate for students with internet problems. Alternatively, if the local technological infrastructure is well developed, governments and schools can also collaborate with internet providers to issue data and bandwidth vouchers for students who are experiencing internet problems due to financial hardship.
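As a rough, back-of-envelope illustration of why textbook-plus-audio delivery is lighter on data than video-conferencing, the calculation below assumes typical bitrates of roughly 64 kbps for compressed audio and 1 Mbps for standard-definition video; these figures are assumptions for illustration, not values taken from the cited Cisco (2018) report.

```python
# Back-of-envelope data-usage comparison for a 40-minute online class.
# Bitrates are illustrative assumptions, not figures from the paper or Cisco (2018).
AUDIO_KBPS = 64      # assumed bitrate of compressed conference audio
VIDEO_KBPS = 1000    # assumed bitrate of standard-definition conference video
CLASS_MINUTES = 40

def megabytes(kbps: float, minutes: float) -> float:
    """Convert a bitrate in kilobits per second and a duration into megabytes."""
    return kbps * 1000 * minutes * 60 / 8 / 1_000_000

audio_mb = megabytes(AUDIO_KBPS, CLASS_MINUTES)   # ~19 MB
video_mb = megabytes(VIDEO_KBPS, CLASS_MINUTES)   # ~300 MB
print(f"audio: {audio_mb:.0f} MB, video: {video_mb:.0f} MB ({video_mb / audio_mb:.0f}x more)")
```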

For future adoption of online learning, policymakers should consider the readiness of the local internet infrastructure. This recommendation is particularly important for developing countries, like Bangladesh, where the majority of students reported a lack of internet infrastructure (Ramij & Sultana, 2020). In such environments, online education may become infeasible, and alternative delivery methods could be more appropriate; for example, the Telesecundaria program provides TV-based education for rural areas of Mexico (Calderoni, 1998).

Other than technical problems, choosing a suitable online learning platform is also vital for providing students with a better learning experience. Governments and schools should choose an online learning platform that is customised for smartphone‐based learning, as the majority of students could be using smartphones for online learning. This recommendation is highly relevant for situations where students are forced or involuntarily engaged in online learning, like during the COVID‐19 pandemic, as they might not have access to a personal computer (Molnar et al.,  2019 ).

Customisation of delivery methods

Customising the delivery of online learning for students in different school years is the theme that appeared consistently across our findings. This customisation process is vital for making online learning an opportunity for students to develop independent learning skills, which could help prepare them for tertiary education and lifelong learning. However, the pedagogical design of K‐12 online learning programs should be differentiated from adult‐orientated programs as these programs are designed for independent learners, which is rarely the case for students in K‐12 education (Barbour & Reeves,  2009 ).

For early-school-year students, especially Year 1–3 students, providing sufficient guidance from both teachers and parents should be the priority, as these students often lack the ability to monitor and reflect on their learning progress. In particular, these students would prefer more real-time interaction with teachers, tutoring from parents, and regular online practice examinations. These forms of guidance could help early-school-year students to cope with involuntary online learning and potentially enhance their experience of future online learning. It should be noted that early-school-year students demonstrated interest in intelligent monitoring and feedback systems for learning. Additional research is required to understand whether these young children are capable of understanding and using learning analytics that relay information on their learning progress. Similarly, future research should also investigate whether young children can communicate effectively through digital tools, as a potential inability could hinder student learning in online group activities. Therefore, the design of online learning for early-school-year students should focus less on independent learning and more on ensuring that students are learning effectively under the guidance of teachers and parents.

In contrast, group learning and peer interaction are essential for older children and adolescents. The delivery of online learning for these students should focus on providing them with more opportunities to communicate with each other and engage in collaborative learning. Potential methods to achieve this goal involve assigning or encouraging students to form study groups (Lee et al.,  2011 ), directing students to use social media for peer communication (Dabbagh & Kitsantas,  2012 ), and providing students with online group assignments (Bickle & Rucker,  2018 ).

Special attention should be paid to students enrolled in high schools. For high-school-year students, in particular students in Year 10–12, we also recommend providing sufficient access to paper-based learning materials, such as revision booklets and practice exam papers, so that they remain familiar with paper-based examinations. This recommendation applies to any students who engage in online learning but have to take their final examinations in paper format. It is also imperative to help high-school-year students who are facing examinations to direct their learning efforts better. Teachers can fulfil this need by sharing useful learning resources on the learning management system, if one is available, or through social media groups. Alternatively, students are interested in intelligent recommendation systems for learning resources, which are emerging in the literature (Corbi & Solans, 2014; Shishehchi et al., 2010). These systems could provide personalised recommendations based on a series of evaluations of a learner's knowledge. Although this is infeasible in situations where the transformation to online learning happens rapidly (as during the COVID-19 pandemic), policymakers can consider embedding such systems in future online education.
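To make the idea of such systems concrete, the sketch below shows a deliberately minimal, hypothetical form of knowledge-based recommendation: resources are ranked by how many of a learner's weak topics they cover. It is an illustration only and does not reproduce the approaches in the cited systems (Corbi & Solans, 2014; Shishehchi et al., 2010); the resource list, topic tags, and scoring rule are all made up for the example.

```python
# A deliberately minimal illustration of knowledge-based resource recommendation:
# rank resources by how many of the learner's weak topics they cover.
# Data structures and scoring are hypothetical, not taken from the cited systems.
from typing import Dict, List

RESOURCES: Dict[str, set] = {
    "Quadratic equations revision booklet": {"algebra", "quadratics"},
    "Trigonometry video series": {"trigonometry"},
    "Mixed practice exam paper": {"algebra", "trigonometry", "geometry"},
}

def recommend(mastery: Dict[str, float], threshold: float = 0.6, top_k: int = 2) -> List[str]:
    """Recommend resources covering topics whose mastery score falls below the threshold."""
    weak_topics = {topic for topic, score in mastery.items() if score < threshold}
    scored = [(len(topics & weak_topics), name) for name, topics in RESOURCES.items()]
    ranked = sorted(scored, reverse=True)
    return [name for overlap, name in ranked[:top_k] if overlap > 0]

# Example: a student strong in geometry but weak in algebra and trigonometry.
print(recommend({"algebra": 0.4, "trigonometry": 0.5, "geometry": 0.9}))
```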

Limitations

The current findings are limited to primary and secondary Chinese students who were involuntarily engaged in online learning during the COVID-19 pandemic. Despite the large sample size, the population may not be representative, as participants were all from a single province. Also, information about the quality of online learning platforms, teaching content, and pedagogical approaches was missing because of the large scale of our study. It is likely that the infrastructure of online learning in China, such as learning platforms, instructional designs, and teachers' knowledge about online pedagogy, was underprepared for the sudden transition. Thus, our findings may not represent the experience of students who voluntarily participated in well-prepared online learning programs, in particular, the virtual school programs in America and Canada (Barbour & LaBonte, 2017; Molnar et al., 2019). Lastly, the survey was only evaluated and validated by teachers, not students. Therefore, students with the lowest reading comprehension levels might have a different understanding of the items' meaning, especially for terminology that involves abstract constructs, such as self-regulation and autonomy in item Q17.

In conclusion, we identified across‐year differences in primary and secondary school students’ online learning experience during the COVID‐19 pandemic. Several recommendations were made for the future practice and research of online learning in the K‐12 student population. First, educational authorities and schools should provide sufficient technical support to help students overcome potential internet and technical problems, and should choose online learning platforms that have been customised for smartphones. Second, online pedagogy should be customised for students in different school years, in particular by providing sufficient guidance for young children, more online collaborative opportunities for older children and adolescents, and additional learning resources for senior students who are facing final examinations.

CONFLICT OF INTEREST

The authors declare no potential conflict of interest in this study.

ETHICS STATEMENT

The data were collected by the Department of Education of Guangdong Province, which also has the authority to approve research studies in K‐12 education in the province.

Supporting information

Supplementary Material

ACKNOWLEDGEMENTS

This work is supported by the National Natural Science Foundation of China (62077028, 61877029), the Science and Technology Planning Project of Guangdong (2020B0909030005, 2020B1212030003, 2020ZDZX3013, 2019B1515120010, 2018KTSCX016, 2019A050510024), the Science and Technology Planning Project of Guangzhou (201902010041), and the Fundamental Research Funds for the Central Universities (21617408, 21619404).

SURVEY ITEMS

| Dimensions | Question text | Question types |
| --- | --- | --- |
| Demographic | Q1. What is the location and category of your school? | Single‐response MCQ |
| Demographic | Q2. Which school year are you in? | Single‐response MCQ |
| Behaviour | Q3. What equipment and materials did you use for online learning during the COVID‐19 pandemic period? | Multiple‐response MCQ |
| Behaviour | Q4. Other than the lecture function, which features of the online education platform have you used? | Multiple‐response MCQ |
| Behaviour | Q5. What is the longest class time for your online courses? | Single‐response MCQ |
| Behaviour | Q6. How long do you study online every day? | Slider question |
| Behaviour | Q8. Did you need family companionship when studying online? | Single‐response MCQ |
| Behaviour | Q10. What content does your online course include? | Multiple‐response MCQ |
| Behaviour | Q11. What approaches did you use to tackle the unlearnt concepts you had when performing online learning? | Multiple‐response MCQ |
| Behaviour | Q12. How often do you interact with your classroom in online learning? | Single‐response MCQ |
| Behaviour | Q14. Regarding the following online learning behaviours, please select the answer that fits your situation in the form below. | Yes/No questions |
| Experience | Q7. Which of the following learning statuses is appropriate for your situation? | Multiple‐response MCQ |
| Experience | Q13. What obstacles did you encounter when studying online? | Multiple‐response MCQ |
| Experience | Q15. What skills do you think are developed from online education? | Multiple‐response MCQ |
| Experience | Q16. How satisfied are you with the following aspects of online learning? | Four‐point bipolar scale |
| Experience | Q17. Compared to classroom‐based learning, what are the advantages of online learning? | Multiple‐response MCQ |
| Experience | Q18. What do you think are the deficiencies of online learning compared to physical classrooms? | Multiple‐response MCQ |
| Expectations | Q9. What is your preferred online classroom format? | Single‐response MCQ |
| Expectations | Q19. What online activities or experiences do you expect to have that will enhance your online learning? | Multiple‐response MCQ |
| Expectations | Q20. After the COVID‐19 pandemic, which type of learning would you prefer? | Single‐response MCQ |

Yan, L., Whitelock‐Wainwright, A., Guan, Q., Wen, G., Gašević, D., & Chen, G. (2021). Students’ experience of online learning during the COVID‐19 pandemic: A province‐wide survey study. British Journal of Educational Technology, 52, 2038–2057. 10.1111/bjet.13102

DATA AVAILABILITY STATEMENT

REFERENCES

  • Aesaert, K., Van Nijlen, D., Vanderlinde, R., & van Braak, J. (2014). Direct measures of digital information processing and communication skills in primary education: Using item response theory for the development and validation of an ICT competence scale. Computers & Education, 76, 168–181. 10.1016/j.compedu.2014.03.013
  • Agung, A. S. N., Surtikanti, M. W., & Quinones, C. A. (2020). Students’ perception of online learning during COVID‐19 pandemic: A case study on the English students of STKIP Pamane Talino. SOSHUM: Jurnal Sosial Dan Humaniora, 10(2), 225–235. 10.31940/soshum.v10i2.1316
  • Anderson, T. (2003). Getting the mix right again: An updated and theoretical rationale for interaction. The International Review of Research in Open and Distributed Learning, 4(2). 10.19173/irrodl.v4i2.149
  • Appana, S. (2008). A review of benefits and limitations of online learning in the context of the student, the instructor and the tenured faculty. International Journal on E‐learning, 7(1), 5–22.
  • Bączek, M., Zagańczyk‐Bączek, M., Szpringer, M., Jaroszyński, A., & Wożakowska‐Kapłon, B. (2021). Students’ perception of online learning during the COVID‐19 pandemic: A survey study of Polish medical students. Medicine, 100(7), e24821. 10.1097/MD.0000000000024821
  • Barbour, M. K. (2013). The landscape of k‐12 online learning: Examining what is known. Handbook of Distance Education, 3, 574–593.
  • Barbour, M., Huerta, L., & Miron, G. (2018). Virtual schools in the US: Case studies of policy, performance and research evidence. In Society for information technology & teacher education international conference (pp. 672–677). Association for the Advancement of Computing in Education (AACE).
  • Barbour, M. K., & LaBonte, R. (2017). State of the nation: K‐12 e‐learning in Canada, 2017 edition. http://k12sotn.ca/wp‐content/uploads/2018/02/StateNation17.pdf
  • Barbour, M. K., & Reeves, T. C. (2009). The reality of virtual schools: A review of the literature. Computers & Education, 52(2), 402–416.
  • Basuony, M. A. K., EmadEldeen, R., Farghaly, M., El‐Bassiouny, N., & Mohamed, E. K. A. (2020). The factors affecting student satisfaction with online education during the COVID‐19 pandemic: An empirical study of an emerging Muslim country. Journal of Islamic Marketing. 10.1108/JIMA-09-2020-0301
  • Berge, Z. L. (2005). Virtual schools: Planning for success. Teachers College Press, Columbia University.
  • Bickle, M. C., & Rucker, R. (2018). Student‐to‐student interaction: Humanizing the online classroom using technology and group assignments. Quarterly Review of Distance Education, 19(1), 1–56.
  • Broadbent, J., & Poon, W. L. (2015). Self‐regulated learning strategies & academic achievement in online higher education learning environments: A systematic review. The Internet and Higher Education, 27, 1–13.
  • Calderoni, J. (1998). Telesecundaria: Using TV to bring education to rural Mexico (Tech. Rep.). The World Bank.
  • Cisco. (2018). Bandwidth requirements for meetings with Cisco Webex and collaboration meeting rooms white paper. http://dwz.date/dpbc
  • Cisco. (2019). Cisco digital readiness 2019. https://www.cisco.com/c/m/en_us/about/corporate‐social‐responsibility/research‐resources/digital‐readiness‐index.html#/
  • Clark, C., Strudler, N., & Grove, K. (2015). Comparing asynchronous and synchronous video vs. text based discussions in an online teacher education course. Online Learning, 19(3), 48–69.
  • Claro, M., Preiss, D. D., San Martín, E., Jara, I., Hinostroza, J. E., Valenzuela, S., Cortes, F., & Nussbaum, M. (2012). Assessment of 21st century ICT skills in Chile: Test design and results from high school level students. Computers & Education, 59(3), 1042–1053. 10.1016/j.compedu.2012.04.004
  • Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Routledge Academic.
  • Corbi, A., & Solans, D. B. (2014). Review of current student‐monitoring techniques used in elearning‐focused recommender systems and learning analytics: The experience API & LIME model case study. IJIMAI, 2(7), 44–52.
  • Dabbagh, N., & Kitsantas, A. (2012). Personal learning environments, social media, and self‐regulated learning: A natural formula for connecting formal and informal learning. The Internet and Higher Education, 15(1), 3–8. 10.1016/j.iheduc.2011.06.002
  • Garrison, D. R., Cleveland‐Innes, M., & Fung, T. S. (2010). Exploring causal relationships among teaching, cognitive and social presence: Student perceptions of the community of inquiry framework. The Internet and Higher Education, 13(1–2), 31–36. 10.1016/j.iheduc.2009.10.002
  • Gašević, D., Adesope, O., Joksimović, S., & Kovanović, V. (2015). Externally‐facilitated regulation scaffolding and role assignment to develop cognitive presence in asynchronous online discussions. The Internet and Higher Education, 24, 53–65. 10.1016/j.iheduc.2014.09.006
  • Gašević, D., Zouaq, A., & Janzen, R. (2013). “Choose your classmates, your GPA is at stake!” The association of cross‐class social ties and academic performance. American Behavioral Scientist, 57(10), 1460–1479.
  • Gikas, J., & Grant, M. M. (2013). Mobile computing devices in higher education: Student perspectives on learning with cellphones, smartphones & social media. The Internet and Higher Education, 19, 18–26.
  • Harvey, D., Greer, D., Basham, J., & Hu, B. (2014). From the student perspective: Experiences of middle and high school students in online learning. American Journal of Distance Education, 28(1), 14–26. 10.1080/08923647.2014.868739
  • Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology, 23. 10.3402/rlt.v23.26507
  • Kuo, Y.‐C., Walker, A. E., Belland, B. R., Schroder, K. E., & Kuo, Y.‐T. (2014). A case study of integrating interwise: Interaction, internet self‐efficacy, and satisfaction in synchronous online learning environments. International Review of Research in Open and Distributed Learning, 15(1), 161–181. 10.19173/irrodl.v15i1.1664
  • Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. The Internet and Higher Education, 14(3), 158–163. 10.1016/j.iheduc.2011.04.001
  • Liu, F., & Cavanaugh, C. (2012). Factors influencing student academic performance in online high school algebra. Open Learning: The Journal of Open, Distance and e‐Learning, 27(2), 149–167. 10.1080/02680513.2012.678613
  • Lou, Y., Bernard, R. M., & Abrami, P. C. (2006). Media and pedagogy in undergraduate distance education: A theory‐based meta‐analysis of empirical literature. Educational Technology Research and Development, 54(2), 141–176. 10.1007/s11423-006-8252-x
  • Malik, M., & Fatima, G. (2017). E‐learning: Students’ perspectives about asynchronous and synchronous resources at higher education level. Bulletin of Education and Research, 39(2), 183–195.
  • McInnerney, J. M., & Roberts, T. S. (2004). Online learning: Social interaction and the creation of a sense of community. Journal of Educational Technology & Society, 7(3), 73–81.
  • Molnar, A., Miron, G., Elgeberi, N., Barbour, M. K., Huerta, L., Shafer, S. R., & Rice, J. K. (2019). Virtual schools in the US 2019. National Education Policy Center.
  • Montague, M., & Rinaldi, C. (2001). Classroom dynamics and children at risk: A followup. Learning Disability Quarterly, 24(2), 75–83.
  • Montag, C., Becker, B., & Gan, C. (2018). The multipurpose application Wechat: A review on recent research. Frontiers in Psychology, 9, 2247. 10.3389/fpsyg.2018.02247
  • Moore, M. G. (1989). Editorial: Three types of interaction. American Journal of Distance Education, 3(2), 1–7. 10.1080/08923648909526659
  • Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic study. Distance Education, 26(1), 29–48. 10.1080/01587910500081269
  • Muirhead, B., & Juwah, C. (2004). Interactivity in computer‐mediated college and university education: A recent review of the literature. Journal of Educational Technology & Society, 7(1), 12–20.
  • Niemi, H. M., & Kousa, P. (2020). A case study of students’ and teachers’ perceptions in a Finnish high school during the COVID pandemic. International Journal of Technology in Education and Science, 4(4), 352–369. 10.46328/ijtes.v4i4.167
  • Oliver, K., Osborne, J., & Brady, K. (2009). What are secondary students’ expectations for teachers in virtual school environments? Distance Education, 30(1), 23–45. 10.1080/01587910902845923
  • Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. 10.1111/bjet.12592
  • Ramij, M., & Sultana, A. (2020). Preparedness of online classes in developing countries amid COVID‐19 outbreak: A perspective from Bangladesh (June 29, 2020).
  • Rice, K. L. (2006). A comprehensive look at distance education in the K–12 context. Journal of Research on Technology in Education, 38(4), 425–448. 10.1080/15391523.2006.10782468
  • Shishehchi, S., Banihashem, S. Y., & Zin, N. A. M. (2010). A proposed semantic recommendation system for e‐learning: A rule and ontology based e‐learning recommendation system. In 2010 international symposium on information technology (Vol. 1, pp. 1–5).
  • Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59–70. 10.1016/j.iheduc.2003.11.003
  • Spitzer, D. R. (2001). Don’t forget the high‐touch with the high‐tech in distance learning. Educational Technology, 41(2), 51–55.
  • Thomas, R. M. (2000). Comparing theories of child development. Wadsworth/Thomson Learning.
  • United Nations Educational, Scientific and Cultural Organization. (2020, March). Education: From disruption to recovery. https://en.unesco.org/covid19/educationresponse
  • Uttal, D. H., & Cohen, C. A. (2012). Spatial thinking and STEM education: When, why, and how? In Psychology of learning and motivation (Vol. 57, pp. 147–181). Elsevier.
  • Van Lancker, W., & Parolin, Z. (2020). COVID‐19, school closures, and child poverty: A social crisis in the making. The Lancet Public Health, 5(5), e243–e244. 10.1016/S2468-2667(20)30084-0
  • Wang, C.‐H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self‐regulated learning, technology self‐efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. 10.1080/01587919.2013.835779
  • Wang, J., & Antonenko, P. D. (2017). Instructor presence in instructional video: Effects on visual attention, recall, and perceived learning. Computers in Human Behavior, 71, 79–89. 10.1016/j.chb.2017.01.049
  • Wilkinson, A., Roberts, J., & While, A. E. (2010). Construction of an instrument to measure student information and communication technology skills, experience and attitudes to e‐learning. Computers in Human Behavior, 26(6), 1369–1376. 10.1016/j.chb.2010.04.010
  • World Health Organization. (2020, July). Coronavirus disease 2019 (COVID‐19): Situation report 164. https://www.who.int/docs/default‐source/coronaviruse/situation‐reports/20200702‐covid‐19‐sitrep‐164.pdf?sfvrsn=ac074f58_2
  • Yates, A., Starkey, L., Egerton, B., & Flueggen, F. (2020). High school students’ experience of online learning during Covid‐19: The influence of technology and pedagogy. Technology, Pedagogy and Education, 9, 1–15. 10.1080/1475939X.2020.1854337

The essential components of a successful L&D strategy

Over the past decade, the global workforce has been continually evolving because of a number of factors. An increasingly competitive business landscape, rising complexity, and the digital revolution are reshaping the mix of employees. Meanwhile, persistent uncertainty, a multigenerational workforce, and a shorter shelf life for knowledge have placed a premium on reskilling and upskilling. The shift to a digital, knowledge-based economy means that a vibrant workforce is more important than ever: research suggests that a very significant percentage of market capitalization in public companies is based on intangible assets—skilled employees, exceptional leaders, and knowledge. 1 Intangible Asset Market Value Study, Ocean Tomo.

Learning and development—From evolution to revolution

We began in 2014 by surveying 1,500 executives about capability building. In 2016, we added 120 L&D leaders at 91 organizations to our database, gathering information on their traditional training strategies and aspirations for future programs. We also interviewed 15 chief learning officers or L&D heads at major companies.

Historically, the L&D function has been relatively successful in helping employees build skills and perform well in their existing roles. The main focus of L&D has been on upskilling. However, the pace of change continues to accelerate; McKinsey research estimates that as many as 800 million jobs could be displaced by automation by 2030.

Employee roles are expected to continue evolving, and a large number of people will need to learn new skills to remain employable. Unsurprisingly, our research confirmed our initial hypothesis: corporate learning must undergo revolutionary changes over the next few years to keep pace with constant technological advances. In addition to updating training content, companies must increase their focus on blended-learning solutions, which combine digital learning, fieldwork, and highly immersive classroom sessions. With the growth of user-friendly digital-learning platforms, employees will take more ownership of their professional development, logging in to take courses when the need arises rather than waiting for a scheduled classroom session.

Such innovations will require companies to devote more resources to training: our survey revealed that 60 percent of respondents plan to increase L&D spending over the next few years, and 66 percent want to boost the number of employee-training hours. As they commit more time and money, companies must ensure that the transformation of the L&D function proceeds smoothly.

All of these trends have elevated the importance of the learning-and-development (L&D) function. We undertook several phases of research to understand trends and current priorities in L&D (see sidebar, “Learning and development—From evolution to revolution”). Our efforts highlighted how the L&D function is adapting to meet the changing needs of organizations, as well as the growing levels of investment in professional development.

To get the most out of investments in training programs and curriculum development, L&D leaders must embrace a broader role within the organization and formulate an ambitious vision for the function. An essential component of this effort is a comprehensive, coordinated strategy that engages the organization and encourages collaboration. The ACADEMIES© framework, which consists of nine dimensions of L&D, can help to strengthen the function and position it to serve the organization more effectively.

The strategic role of L&D

One of L&D’s primary responsibilities is to manage the development of people—and to do so in a way that supports other key business priorities. L&D’s strategic role spans five areas (Exhibit 1). 2 Nick van Dam, 25 Best Practices in Learning & Talent Development , second edition, Raleigh, NC: Lulu Publishing, 2008.

  • Attract and retain talent. Traditionally, learning focused solely on improving productivity. Today, learning also contributes to employability. Over the past several decades, employment has shifted from staying with the same company for a lifetime to a model where workers are being retained only as long as they can add value to an enterprise. Workers are now in charge of their personal and professional growth and development—one reason that people list “opportunities for learning and development” among the top criteria for joining an organization. Conversely, a lack of L&D is one of the key reasons people cite for leaving a company.
  • Develop people capabilities. Human capital requires ongoing investments in L&D to retain its value. When knowledge becomes outdated or forgotten—a more rapid occurrence today—the value of human capital declines and needs to be supplemented by new learning and relevant work experiences. 3 Gary S. Becker, “Investment in human capital: A theoretical analysis,” Journal of Political Economy , 1962, Volume 70, Number 5, Part 2, pp. 9–49, jstor.org. Companies that make investments in the next generation of leaders are seeing an impressive return. Research indicates that companies in the top quartile of leadership outperform other organizations by nearly two times on earnings before interest, taxes, depreciation, and amortization (EBITDA). Moreover, companies that invest in developing leaders during significant transformations are 2.4 times more likely to hit their performance targets . 4 “ Economic Conditions Snapshot, June 2009: McKinsey Global Survey results ,” June 2009.
  • Create a values-based culture. As the workforce in many companies becomes increasingly virtual and globally dispersed, L&D can help to build a values-based culture and a sense of community. Millennials, in particular, are interested in working for values-based, sustainable enterprises that contribute to the welfare of society.
  • Build an employer brand. An organization’s brand is one of its most important assets and conveys a great deal about the company’s success in the market, financial strength, position in the industry, and products and services. Investments in L&D can help to enhance a company’s brand and boost its reputation as an “employer of choice.” As large segments of the workforce prepare to retire, employers must work harder to compete for a shrinking talent pool. To do so, they must communicate their brand strength explicitly through an employer value proposition.
  • Motivate and engage employees. The most important way to engage employees is to provide them with opportunities to learn and develop new competencies. Research suggests that lifelong learning contributes to happiness. 5 John Coleman, “Lifelong learning is good for your health, your wallet, and your social life,” Harvard Business Review , February 7, 2017, hbr.org. When highly engaged employees are challenged and given the skills to grow and develop within their chosen career path, they are more likely to be energized by new opportunities at work and satisfied with their current organization.

The L&D function in transition

Over the years, we have identified and field-tested nine dimensions that contribute to a strong L&D function. We combined these dimensions to create the ACADEMIES framework, which covers all aspects of L&D functions, from setting aspirations to measuring impact (Exhibit 2). Although many companies regularly execute on several dimensions of this framework, our recent research found that only a few companies are fully mature in all dimensions.

1. Alignment with business strategy

One of an L&D executive’s primary tasks is to develop and shape a learning strategy based on the company’s business and talent strategies. The learning strategy seeks to support professional development and build capabilities across the company, on time, and in a cost-effective manner. In addition, the learning strategy can enhance the company culture and encourage employees to live the company’s values.

For many organizations, the L&D function supports the implementation of the business strategy. For example, if one of the business strategies is a digital transformation, L&D will focus on building the necessary people capabilities to make that possible.

Every business leader would agree that L&D must align with a company’s overall priorities. Yet research has found that many L&D functions fall short on this dimension. Only 40 percent of companies say that their learning strategy is aligned with business goals. 6 Human Capital Management Excellence Conference 2018, Brandon Hall Group. For 60 percent, then, learning has no explicit connection to the company’s strategic objectives. L&D functions may be out of sync with the business because of outdated approaches or because budgets have been based on priorities from previous years rather than today’s imperatives, such as a digital transformation.

To be effective, L&D must take a hard look at employee capabilities and determine which are most essential to support the execution of the company’s business strategy. L&D leaders should reevaluate this alignment on a yearly basis to ensure they are creating a people-capability agenda that truly reflects business priorities and strategic objectives.

2. Co-ownership between business units and HR

With new tools and technologies constantly emerging, companies must become more agile, ready to adapt their business processes and practices. L&D functions must likewise be prepared to rapidly launch capability-building programs—for example, if new business needs suddenly arise or staff members require immediate training on new technologies such as cloud-based collaboration tools.

L&D functions can enhance their partnership with business leaders by establishing a governance structure in which leadership from both groups share responsibility for defining, prioritizing, designing, and securing funds for capability-building programs. Under this governance model, a company’s chief experience officer (CXO), senior executives, and business-unit heads will develop the people-capability agenda for segments of the enterprise and ensure that it aligns with the company’s overall strategic goals. Top business executives will also help firmly embed the learning function and all L&D initiatives in the organizational culture. The involvement of senior leadership enables full commitment to the L&D function’s longer-term vision.

3. Assessment of capability gaps and estimated value

After companies identify their business priorities, they must verify that their employees can deliver on them—a task that may be more difficult than it sounds. Some companies make no effort to assess employee capabilities, while others do so only at a high level. Conversations with L&D, HR, and senior executives suggest that many companies are ineffective or indifferent at assessing capability gaps, especially when it comes to senior leaders and midlevel managers.

The most effective companies take a deliberate, systematic approach to capability assessment. At the heart of this process is a comprehensive competency or capability model based on the organization’s strategic direction. For example, a key competency for a segment of an e-commerce company’s workforce could be “deep expertise in big data and predictive analytics.”

After identifying the most essential capabilities for various functions or job descriptions, companies should then assess how employees rate in each of these areas. L&D interventions should seek to close these capability gaps.
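As a rough illustration of such a capability-gap assessment, the Python sketch below compares required proficiency levels from a competency model against current team ratings. The competencies, the 1–5 scale, and the ratings are assumptions made for the example, not a prescribed method.

```python
# Sketch of a capability-gap assessment: compare required proficiency levels
# (from a competency model) against employees' current ratings on a 1-5 scale.
# Competencies and ratings below are illustrative assumptions.

REQUIRED = {
    "big data & predictive analytics": 4,
    "customer journey design": 3,
    "agile delivery": 4,
}

TEAM_RATINGS = {
    "analyst_1": {"big data & predictive analytics": 2, "customer journey design": 3, "agile delivery": 3},
    "analyst_2": {"big data & predictive analytics": 3, "customer journey design": 2, "agile delivery": 4},
}

def capability_gaps(required, ratings):
    """Return the average gap per competency (positive = capability shortfall), largest first."""
    gaps = {}
    for competency, target in required.items():
        current = [person.get(competency, 0) for person in ratings.values()]
        gaps[competency] = round(target - sum(current) / len(current), 2)
    return dict(sorted(gaps.items(), key=lambda kv: kv[1], reverse=True))

if __name__ == "__main__":
    for competency, gap in capability_gaps(REQUIRED, TEAM_RATINGS).items():
        print(f"{competency}: gap {gap}")
```

The ranked output is one simple way to decide which gaps the next round of L&D interventions should target first.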

4. Design of learning journeys

Most corporate learning is delivered through a combination of digital-learning formats and in-person sessions. While our research indicates that immersive L&D experiences in the classroom still have immense value, leaders have told us that they are incredibly busy “from eight to late,” which does not give them a lot of time to sit in a classroom. Furthermore, many said that they prefer to develop and practice new skills and behaviors in a “safe environment,” where they don’t have to worry about public failures that might affect their career paths.

Traditional L&D programs consisted of several days of classroom learning with no follow-up sessions, even though people tend to forget what they have learned without regular reinforcement. As a result, many L&D functions are moving away from stand-alone programs by designing learning journeys—continuous learning opportunities that take place over a period of time and include L&D interventions such as fieldwork, pre- and post-classroom digital learning, social learning, on-the-job coaching and mentoring, and short workshops. The main objectives of a learning journey are to help people develop the required new competencies in the most effective and efficient way and to support the transfer of learning to the job.

5. Execution and scale-up

An established L&D agenda consists of a number of strategic initiatives that support capability building and are aligned with business goals, such as helping leaders develop high-performing teams or roll out safety training. The successful execution of L&D initiatives on time and on budget is critical to build and sustain support from business leaders.

L&D functions often face an overload of initiatives and insufficient funding. L&D leadership needs to maintain an ongoing discussion with business leaders about initiatives and priorities to ensure the requisite resources and support.

Many new L&D initiatives are initially targeted to a limited audience. A successful execution of a small pilot, such as an online orientation program for a specific audience, can lead to an even bigger impact once the program is rolled out to the entire enterprise. The program’s cost per person declines as companies benefit from economies of scale.
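The economics of that scale-up can be illustrated with a toy calculation in which a fixed development cost is spread across more learners; all figures below are hypothetical.

```python
# Toy illustration of pilot scale-up economics: a fixed program-development cost
# spread over more learners, plus a variable delivery cost per learner.
# All figures are hypothetical.

def cost_per_person(fixed_development_cost, variable_cost_per_learner, learners):
    return fixed_development_cost / learners + variable_cost_per_learner

for learners in (50, 500, 5000):
    print(learners, "learners ->", round(cost_per_person(100_000, 20, learners), 2), "per person")
# 50 -> 2020.0, 500 -> 220.0, 5000 -> 40.0
```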

6. Measurement of impact on business performance

A learning strategy’s execution and impact should be measured using key performance indicators (KPIs). The first indicator looks at business excellence: how closely aligned all L&D initiatives and investments are with business priorities. The second KPI looks at learning excellence: whether learning interventions change people’s behavior and performance. Last, an operational-excellence KPI measures how well investments and resources in the corporate academy are used.

Accurate measurement is not simple, and many organizations still rely on traditional impact metrics such as learning-program satisfaction and completion scores. But high-performing organizations focus on outcomes-based metrics such as impact on individual performance, employee engagement, team effectiveness, and business-process improvement.

We have identified several lenses for articulating and measuring learning impact:

  • Strategic alignment: How effectively does the learning strategy support the organization’s priorities?
  • Capabilities: How well does the L&D function help colleagues build the mind-sets, skills, and expertise they need most? This impact can be measured by assessing people’s capability gaps against a comprehensive competency framework.
  • Organizational health: To what extent does learning strengthen the overall health and DNA of the organization? Relevant dimensions of the McKinsey Organizational Health Index can provide a baseline.
  • Individual peak performance: Beyond raw capabilities, how well does the L&D function help colleagues achieve maximum impact in their role while maintaining a healthy work-life balance?

Access to big data provides L&D functions with more opportunities to assess and predict the business impact of their interventions.
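One simple way to picture such measurement is a scorecard that rolls outcome-based metrics up into the three KPI areas named above. The metric names and scores in this Python sketch are illustrative assumptions, not a standard instrument.

```python
# Illustrative L&D impact scorecard: roll outcome-based metrics up into the three
# KPI areas described above (business, learning, and operational excellence).
# Metric names and scores are assumptions made for this example.

SCORECARD = {
    "business_excellence": {"initiatives_aligned_to_priorities": 0.72},
    "learning_excellence": {"behaviour_change_index": 0.62, "individual_performance_uplift": 0.58},
    "operational_excellence": {"academy_resource_utilisation": 0.80},
}

def kpi_summary(scorecard):
    """Average the metrics inside each KPI area into a single 0-1 score."""
    return {
        area: round(sum(metrics.values()) / len(metrics), 2)
        for area, metrics in scorecard.items()
    }

if __name__ == "__main__":
    for area, score in kpi_summary(SCORECARD).items():
        print(f"{area}: {score}")
```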

7. Integration of L&D interventions into HR processes

Just as L&D corporate-learning activities need to be aligned with the business, they should also be an integral part of the HR agenda. L&D has an important role to play in recruitment, onboarding, performance management, promotion, workforce, and succession planning. Our research shows that at best, many L&D functions have only loose connections to annual performance reviews and lack a structured approach and follow-up to performance-management practices.

L&D leadership must understand major HR management practices and processes and collaborate closely with HR leaders. The best L&D functions use consolidated development feedback from performance reviews as input for their capability-building agenda. A growing number of companies are replacing annual performance appraisals with frequent, in-the-moment feedback. 7 HCM outlook 2018 , Brandon Hall Group. This is another area in which the L&D function can help managers build skills to provide development feedback effectively.

Another example is onboarding. Companies that have developed high-impact onboarding processes score better on employee engagement and satisfaction and lose fewer new hires. 8 HCM outlook 2018 , Brandon Hall Group. The L&D function can play a critical role in onboarding—for example, by helping people build the skills to be successful in their role, providing new hires with access to digital-learning technologies, and connecting them with other new hires and mentors.

8. Enabling of the 70:20:10 learning framework

Many L&D functions embrace a framework known as “70:20:10,” in which 70 percent of learning takes place on the job, 20 percent through interaction and collaboration, and 10 percent through formal-learning interventions such as classroom training and digital curricula. These percentages are general guidelines and vary by industry and organization. L&D functions have traditionally focused on the formal-learning component.
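For illustration, the guideline can be turned into a simple planning helper that splits a learner's annual development hours across the three buckets. The 80-hour input and the exact ratios are assumptions; as noted above, organizations tune these percentages.

```python
# Simple 70:20:10 planning helper: split a learner's annual development hours
# across on-the-job, social, and formal learning. Ratios and hours are
# illustrative and can be tuned per industry and organization.

def split_learning_hours(total_hours, ratios=(0.7, 0.2, 0.1)):
    labels = ("on_the_job", "interaction_and_collaboration", "formal_learning")
    return {label: round(total_hours * ratio, 1) for label, ratio in zip(labels, ratios)}

print(split_learning_hours(80))
# {'on_the_job': 56.0, 'interaction_and_collaboration': 16.0, 'formal_learning': 8.0}
```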

Today, L&D leaders must design and implement interventions that support informal learning, including coaching and mentoring, on-the-job instruction, apprenticeships, leadership shadowing, action-based learning, on-demand access to digital learning, and lunch-and-learn sessions. Social technologies play a growing role in connecting experts and creating and sharing knowledge.

9. Systems and learning technology applications

The most significant enablers for just-in-time learning are technology platforms and applications. Examples include next-generation learning-management systems, virtual classrooms, mobile-learning apps, embedded performance-support systems, polling software, learning-video platforms, learning-assessment and -measurement platforms, massive open online courses (MOOCs), and small private online courses (SPOCs), to name just a few.

The learning-technology industry has moved entirely to cloud-based platforms, which provide L&D functions with unlimited opportunities to plug and unplug systems and access the latest functionality without having to go through lengthy and expensive implementations of an on-premises system. L&D leaders must make sure that learning technologies fit into an overall system architecture that includes functionality to support the entire talent cycle, including recruitment, onboarding, performance management, L&D, real-time feedback tools, career management, succession planning, and rewards and recognition.

L&D leaders are increasingly aware of the challenges created by the fourth industrial revolution (technologies that are connecting the physical and digital worlds), but few have implemented large-scale transformation programs. Instead, most are slowly adapting their strategy and curricula as needed. However, with technology advancing at an ever-accelerating pace, L&D leaders can delay no longer: human capital is more important than ever and will be the primary factor in sustaining competitive advantage over the next few years.

The leaders of L&D functions need to revolutionize their approach by creating a learning strategy that aligns with business strategy and by identifying and enabling the capabilities needed to achieve success. This approach will result in robust curricula that employ every relevant and available learning method and technology. The most effective companies will invest in innovative L&D programs, remain flexible and agile, and build the human talent needed to master the digital age.

These changes entail some risk, and perhaps some trial and error, but the rewards are great.

A version of this chapter was published in TvOO Magazine in September 2016. It is also included in Elevating Learning & Development: Insights and Practical Guidance from the Field , August 2018.

Jacqueline Brassey is director of Enduring Priorities Learning in McKinsey’s Amsterdam office, where Nick van Dam is an alumnus and senior adviser to the firm as well as professor and chief of the IE University (Madrid) Center for Learning Innovation; Lisa Christensen is a senior learning expert in the San Francisco office.

Work Trend Index Special Report

Research Proves Your Brain Needs Breaks

New options help you carve out downtime between meetings.

April 20, 2021

Illustration by Ben Wiseman

For many people, back-to-back video meetings are a hallmark of the pandemic era. One conversation ends, another begins, and too often there’s no chance to stretch, pour a glass of water, or just clear your head.

In our latest study of brain wave activity, researchers confirmed what many people sense from experience: Back-to-back virtual meetings are stressful. But the research also points to a simple remedy—short breaks.

“Our research shows breaks are important, not just to make us less exhausted by the end of the day, but to actually improve our ability to focus and engage while in those meetings,” says Michael Bohan, senior director of Microsoft’s Human Factors Engineering group, who oversaw the project.

Settings in Microsoft Outlook make it easy to automatically carve out these essential breaks between back-to-backs, and because one size does not fit all, companies have two options. Individuals can set scheduling defaults that automatically shorten the meetings they schedule, and customers can now set organization-wide scheduling defaults that shorten meetings and create space for breaks for everyone at the company.

“The back-to-back meetings that have become the norm over the last 12 months just aren't sustainable,” says Jared Spataro, CVP, Microsoft 365. “Outlook and Microsoft Teams are used by millions of people around the world, and this small change can help customers develop new cultural norms and improve wellbeing for everyone.”

“ In today’s world of remote and hybrid work, it’s not sufficient to only encourage self-care. We need to innovate and leverage technology to help employees operationalize much-needed breaks into their daily routines. ”

Kathleen Hogan, Chief People Officer at Microsoft

Here at Microsoft, because we have many employees in different functions around the world, we’re encouraging individuals to turn on the setting if it works for them and their team. And as we shift into hybrid work, we’ll continue to learn and look for ways to improve the way we work together in this new, more digital world.

The case for breaks: what the research says

As the pandemic upended routines and heightened the digital intensity of workdays, hundreds of researchers across Microsoft came together to study how work is changing, amassing one of the world's largest bodies of research on the subject.

Our most recent study builds on that work. Microsoft’s Human Factors Lab sought to find a solution for meeting fatigue—a pressing concern in our new era of remote and hybrid work. Researchers from the lab, which examines how humans interact with technology, asked 14 people to take part in video meetings while wearing electroencephalogram (EEG) equipment—a cap to monitor the electrical activity in their brains.

The 14 volunteers each participated in two different sessions of meetings. On one day they attended stretches of four half-hour meetings back-to-back, with each call devoted to different tasks—designing an office layout, for example, or creating a marketing plan. On another day, the four half-hour meetings were interspersed with 10-minute breaks. Instead of hurriedly jumping from one meeting to the next, participants meditated with the Headspace app during the breaks.

To ensure clean data, all the participants taking breaks were assigned the same downtime activity—in this case meditation—so the results would be comparable. The sessions took place on two consecutive Mondays; some participants started with back-to-backs while the others had breaks between meetings, and the next week they switched. We also had additional people join meetings with the research subjects to simulate a typical day interacting with various teams.

The research showed three main takeaways.

1. Breaks between meetings allow the brain to “reset,” reducing a cumulative buildup of stress across meetings.

As we’ve seen in previous studies , in two straight hours of back-to-back meetings, the average activity of beta waves—those associated with stress—increased over time. In other words, the stress kept accumulating.

But when participants were given a chance to rest using meditation, beta activity dropped, allowing for a “reset.” This reset meant participants started their next meeting in a more relaxed state. It also meant the average level of beta waves held steady through four meetings, with no buildup of stress even as four video calls continued.
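A hedged sketch of this comparison is shown below. It is not Microsoft's actual analysis pipeline, and the beta-power values are fabricated to mirror the pattern described: stress accumulating without breaks, holding steady with them.

```python
# Hedged sketch (not Microsoft's actual pipeline): compare the trend in average
# beta-band power across four consecutive meetings for a back-to-back condition
# vs. a condition with meditation breaks. Values are made up for illustration.

BACK_TO_BACK = [1.00, 1.12, 1.25, 1.41]  # average beta power per meeting, arbitrary units
WITH_BREAKS = [1.00, 1.02, 0.99, 1.01]

def cumulative_change(series):
    """Percentage change from the first to the last meeting."""
    return round(100 * (series[-1] - series[0]) / series[0], 1)

print("back-to-back:", cumulative_change(BACK_TO_BACK), "%")  # 41.0 %
print("with breaks:", cumulative_change(WITH_BREAKS), "%")    # 1.0 %
```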

The antidote to meeting fatigue is simple: taking short breaks.

Your brain works differently when you take breaks

Taking time out between video calls prevents stress from building up.

Microsoft’s Human Factors Lab used EEG caps to measure beta wave activity—associated with stress—in the brains of meeting participants. For those given breaks, their average beta wave activity remained largely steady over time; the “coolness” of their stress levels is visualized here in blues and greens. For those deprived of breaks, their average beta wave activity rose as time passed, suggesting a buildup of stress; that increase is depicted here with colors shifting from cool to hot. The chart represents the relative difference in beta activity between break and no-break conditions at the top of each meeting (averaged across the 14 research participants).

Illustration by Brown Bird Design

2. Back-to-back meetings can decrease your ability to focus and engage.

When participants had meditation breaks, brainwave patterns showed positive levels of frontal alpha asymmetry, which correlates to higher engagement during the meeting. Without breaks, the levels were negative, suggesting the participants were withdrawn, or less engaged in the meeting. This shows that when the brain is experiencing stress, it’s harder to stay focused and engaged.

In sum, breaks are not only good for wellbeing, they also improve our ability to do our best work.

Taking breaks helps you engage better

Breathers don’t just alleviate stress, they help your performance.

An infographic shows how breaks help keep people engaged between meetings.

To gauge whether people are engaged or withdrawn, researchers study a brainwave pattern known as frontal alpha asymmetry (the difference between right and left alpha wave activity in the frontal area of the brain). In Microsoft’s study, those taking breaks showed positive asymmetry, which is associated with higher engagement. Those who didn’t take breaks showed negative asymmetry, which is associated with being more withdrawn.
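For readers who want to see the metric concretely, the sketch below computes frontal alpha asymmetry using a common formulation from the EEG literature, the log of right frontal alpha power minus the log of left frontal alpha power. The electrode names and power values are illustrative and are not taken from the study.

```python
# Hedged sketch of frontal alpha asymmetry (FAA). The article describes FAA as the
# difference between right and left frontal alpha activity; a common formulation is
# ln(right alpha power) - ln(left alpha power), with positive values read as higher
# engagement. Channel names and power values below are illustrative only.

import math

def frontal_alpha_asymmetry(right_alpha_power, left_alpha_power):
    """FAA = ln(right) - ln(left); positive -> more engaged, negative -> more withdrawn."""
    return math.log(right_alpha_power) - math.log(left_alpha_power)

# e.g. alpha-band power (uV^2) at a right (F4) vs. left (F3) frontal electrode
print(round(frontal_alpha_asymmetry(4.2, 3.6), 3))  # positive: more engaged
print(round(frontal_alpha_asymmetry(3.1, 3.8), 3))  # negative: more withdrawn
```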

3. Transitioning between meetings can be a source of high stress.

For the participants deprived of breaks, researchers also noticed that the transition period between calls caused beta activity, or stress levels, to spike.

That might be because “you’re coming to the end of the meeting, knowing you have another one coming right up, and you’re going to have to switch gears and use your brain to think hard about something else,” Bohan says.

For those participants, beta wave activity jumped again when new meetings started. When people took meditation breaks, by contrast, beta activity dropped between meetings, and the increase at the start of the next meeting was much gentler and smoother.

Jumping directly from one meeting to another can cause spikes of stress

Taking breaks between conversations eases that stress.

An infographic shows how—without breaks—beta wave activity in the brain can rise sharply at the beginning and end of meetings, suggesting heightened stress.

Without breaks, beta wave activity in the brain can rise sharply at the beginning and end of meetings, suggesting heightened stress. Taking breaks not only prevents those spikes but causes a dip in beta activity—which correlates with less stress.

Illustration by Valerio Pellegrini

The takeaway: Breaks, even short ones, are important to make the transitions between meetings feel less stressful.

“What makes this study so powerful and relatable is that we’re effectively visualizing for people what they experience phenomenologically inside,” Bohan says. “It’s not an abstraction—quite the opposite. It's a scientific expression of the stress and fatigue people feel during back-to-backs.”

How we are adapting our products—and practices

These findings helped inform settings in Outlook that allow individuals or organizations to set defaults that shave five, 10, or 15 minutes off Microsoft Teams meetings to carve out breaks between conversations.

For example, an individual or company might decide to start its meetings five minutes after the hour or half-hour, so that 30-minute meetings drop to 25 minutes and hour-long conversations shorten to 55 minutes. That means a half-hour meeting that would have started at 11 a.m. will become a 25-minute meeting beginning at 11:05 a.m.
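The arithmetic behind that default is simple, and the sketch below emulates it in plain Python. It does not call any Outlook or Microsoft Graph API, and the five-minute delay is just the example from the text.

```python
# Emulation of the scheduling default described above: delay a meeting's start
# so it begins a few minutes after the hour or half-hour while keeping its
# original end time. Not an Outlook/Graph API call; the delay is illustrative.

from datetime import datetime, timedelta

def shorten_meeting(start, duration_minutes, delay_minutes=5):
    """Delay the start while keeping the original end, so a 30-minute slot
    becomes a 25-minute meeting."""
    new_start = start + timedelta(minutes=delay_minutes)
    new_end = start + timedelta(minutes=duration_minutes)
    return new_start, new_end

start = datetime(2021, 4, 20, 11, 0)
print(shorten_meeting(start, 30))  # 11:05 -> 11:30, i.e. a 25-minute meeting
print(shorten_meeting(start, 60))  # 11:05 -> 12:00, i.e. a 55-minute meeting
```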

It’s not just the brain research that supports this change. Digital overload has become an urgent issue in the new era of remote and hybrid work. In Microsoft’s 2021 Work Trend Index published in March, 54 percent of respondents in a global external survey said they feel overworked, while 39 percent described themselves as outright exhausted.

Over the past year, we have introduced several new capabilities to foster wellbeing in this time of rapid change. Together mode in Microsoft Teams helps combat meeting fatigue; a virtual commute helps reestablish boundaries between work and home; and a Headspace integration coming with the Microsoft Viva Insights app promotes mindfulness. This new Outlook setting is a next step in this wellbeing journey, with more to come.

One final note: If you’re using the new setting in Outlook to build in break times between meetings, consider stepping away from your computer. “Try not to use that five or 10 minutes to squeeze in some other kind of work,” Bohan says. “Catch your breath and take a break away from your screen.”

Strategies for making breaks successful—and beating meeting fatigue

Because we know making space for breaks is easier said than done, we've pulled together some research-backed tips on carving out time to pause, getting the most from moments of respite, and making meetings more effective and energizing.

Shift your mindset. While it might feel more productive to power through back-to-backs, research shows the opposite is true. View breaks away from your computer as an essential part of your workday.

Find break activities that calm your mind. Meditation is one effective way to relax and recharge between meetings, but other studies show that physical activity such as walking is also beneficial. Past Microsoft studies suggest that doodling or reading something enjoyable also bring benefits. “It can be anything that takes your mind away from work-related things and focuses it on something that you feel is relaxing,” Bohan says. That will help you be refreshed and recharged when you start your next meeting.

Create even more time for breaks by considering other modes of communication. Before scheduling a video call, pause and ask yourself: Do we really need a meeting on this issue? More dynamic, creative, or emotional topics may require a meeting, while status check-ins and informational subjects may benefit from document collaboration, a Teams channel, or email. Other simple tasks may be handled via chat. Read more about creating time for breaks.

Make meetings more intentional. The best—and often shortest—meetings are more intentional. Best practices like creating and sending an agenda ahead of time, being thoughtful about who attends, starting and stopping on time, and transitioning to a recap for the final five minutes will make it easier to accomplish your goals in less time. Read more about intentional meetings.

Keep participants engaged and energized. In virtual meetings, it can be hard to chime in remotely. A moderator can help ensure remote participants are included. Features like Raise your hand, Whiteboard, and Breakout Rooms in Microsoft Teams are great ways to use technology to elicit creative and strategic conversations.

Methodology

Study conducted from March 8-18, 2021, by Microsoft Human Factors Lab with 14 people participating in video meetings while wearing electroencephalogram (EEG) equipment to monitor the electrical activity in their brains. Participants consisted of Microsoft and non-Microsoft employees who are US-based information workers and who typically work remotely. Volunteers each participated in two different session blocks of meetings. In the first session, half the participants attended a stretch of four half-hour meetings back-to-back (two continuous hours), with each call devoted to different tasks (designing an office layout, for example, or creating a marketing plan). For the others, the four half-hour meetings were interspersed with 10-minute breaks, during which participants meditated with the Headspace app. The following week, the groups switched; those who had done back-to-backs had breaks, and vice versa. Three to four additional non-EEG-measured volunteers participated in each 30-minute meeting to create variation of attendees collaborating to complete the assigned tasks. Note: Headspace was not involved in the design or execution of the study.

Prepare for the MCAT® Exam

Preparing for the MCAT® exam takes time and dedication. Balancing your preparation with an already busy schedule can be a challenge. The AAMC has resources and practice products to help you no matter where you are in the preparation process.

  • MCAT Official Prep Hub
  • Register for the MCAT Exam
  • Get Your Test Scores

Learn more about the new MCAT® Prep enhancements for the upcoming testing year. 

Discover the complete list of foundational concepts, content categories, skills, and disciplines you will need to know for test day. We also offer the outline as a  course in the MCAT Official Prep Hub , with links to free open access resources covering the same content.

Learn about the free AAMC MCAT® Official Prep resources that the AAMC offers to help you study.

Get answers to your questions about MCAT® registration, scores, and more.

The AAMC Fee Assistance Program assists those who, without financial assistance, would be unable to take the MCAT exam or apply to medical schools that use the AMCAS. The benefits include discounted fees, free MCAT Official Prep products, and more.

Get a comprehensive overview of all MCAT Official Prep products, including pricing information and key features.

Learn about available resources to help you as you advise your students.

The AAMC offers bulk order purchasing for quantities of 10 or more MCAT Official Prep products.

  • Original article
  • Open access
  • Published: 01 September 2023

Evaluating the efficacy of AI content detection tools in differentiating between human and AI-generated text

  • Ahmed M. Elkhatat   ORCID: orcid.org/0000-0003-0383-939X 1 ,
  • Khaled Elsaid 2 &
  • Saeed Almeer 3  

International Journal for Educational Integrity volume  19 , Article number:  17 ( 2023 ) Cite this article

43k Accesses

27 Citations

80 Altmetric

Metrics details

The proliferation of artificial intelligence (AI)-generated content, particularly from models like ChatGPT, presents potential challenges to academic integrity and raises concerns about plagiarism. This study investigates the capabilities of various AI content detection tools in discerning human- and AI-authored content. Fifteen paragraphs each from ChatGPT Models 3.5 and 4 on the topic of cooling towers in the engineering process, together with five human-written control responses, were generated or collected for evaluation. AI content detection tools developed by OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag were used to evaluate these paragraphs. Findings reveal that the AI detection tools were more accurate in identifying content generated by GPT 3.5 than GPT 4. However, when applied to human-written control responses, the tools exhibited inconsistencies, producing false positives and uncertain classifications. This study underscores the need for further development and refinement of AI content detection tools as AI-generated content becomes more sophisticated and harder to distinguish from human-written text.

Introduction

Instances of academic plagiarism have escalated in educational settings, and plagiarism has been identified in various forms of student work, encompassing reports, assignments, projects, and beyond. Academic plagiarism can be defined as the act of employing ideas, content, or structures without providing sufficient attribution to the source (Fishman 2009). Students' plagiarism strategies differ, with the most egregious instances involving outright replication of source materials. Other approaches include partial rephrasing through modifications of grammatical structures, substituting words with synonyms, and using online paraphrasing services to reword text (Elkhatat 2023; Meuschke & Gipp 2013; Sakamoto & Tsuda 2019). Academic plagiarism violates ethical principles and ranks among the most severe forms of misconduct, as it jeopardizes the acquisition and assessment of competencies. As a result, implementing strategies to reduce plagiarism is vital for preserving academic integrity and preventing such dishonest practices in students' future scholarly and professional endeavors (Alsallal et al. 2013; Elkhatat 2022; Foltýnek et al. 2020). Text-Matching Software Products (TMSPs) are powerful instruments that educational institutions utilize to detect specific forms of plagiarism, owing to their sophisticated text-matching algorithms and extensive databases containing web pages, journal articles, periodicals, and other publications. Certain TMSPs also enhance their efficacy in identifying plagiarism by incorporating databases that index previously submitted student papers (Elkhatat et al. 2021).

Recently, the Artificial Intelligence (AI)-driven ChatGPT has surfaced as a tool that aids students in creating tailored content based on prompts by employing natural language processing (NLP) techniques (Radford et al. 2018). The initial GPT model showcased the potential of combining unsupervised pre-training with supervised fine-tuning for a broad array of NLP tasks. Following this, OpenAI introduced ChatGPT (model 2), which enhanced the model's performance by enlarging the architecture and using a more comprehensive pre-training dataset (Radford et al. 2019). The subsequent launch of ChatGPT (models 3 and 3.5) represented a significant advancement in ChatGPT's development, exhibiting exceptional proficiency in producing human-like text and attaining top results on various NLP benchmarks. The model's capacity to generate contextually appropriate and coherent text in response to user prompts paved the way for the release of ChatGPT, an AI-driven chatbot aimed at helping users produce text and participate in natural language dialogues (Brown et al. 2020; OpenAI 2022).

The recently unveiled ChatGPT (model 4) by OpenAI on March 14, 2023, is a significant milestone in NLP technology. With enhanced cybersecurity safety measures and superior response quality, it surpasses its predecessors in tackling complex challenges. ChatGPT (model 4) boasts a wealth of general knowledge and problem-solving skills, enabling it to manage demanding tasks with heightened precision. Moreover, its inventive and cooperative features aid in generating, editing, and iterating various creative and technical writing projects, such as song composition, screenplay development, and personal writing style adaptation. However, it is crucial to acknowledge that ChatGPT (model 4)'s knowledge is confined to the cutoff date of September 2021 (OpenAI 2023 ), although the recently embedded plugins allow it to access current website content.

This development presents potential risks concerning cheating and plagiarism, which may result in severe academic and legal ramifications (Foltýnek et al. 2019). These elevated risks include, but are not limited to, ease of access to information, given the model's extensive knowledge base and its ability to generate coherent, contextually relevant responses. In addition, its adaptation to a personal writing style allows it to generate content that closely matches a student's own writing, making it even more difficult for educators to identify whether a language model has generated the work (OpenAI 2023).

Academic misconduct in undergraduate education using ChatGPT has been widely studied (Crawford et al. 2023; King & chatGpt 2023; Lee 2023; Perkins 2023; Sullivan et al. 2023). Despite the advantages of ChatGPT for supporting students in essay composition and other scholarly tasks, questions have been raised regarding the authenticity and suitability of the content generated by the chatbot for academic purposes (King & chatGpt 2023). Additionally, ChatGPT has been rightly criticized for generating incoherent or erroneous content (Gao et al. 2022; Qadir 2022), providing superficial information (Frye 2022), and having a restricted knowledge base due to its lack of internet access and dependence on data up until September 2021 (Williams 2022). Nonetheless, the repeatability (responses regenerated within the same chatbot prompt) and reproducibility (responses regenerated with a new chatbot prompt) of GPT-3.5 and GPT-4 output were examined with text-matching software, demonstrating that the generated responses remain consistently coherent and are predominantly difficult for conventional text-matching tools to detect (Elkhatat 2023).

Recently, AI classifier tools have come to be relied upon for distinguishing between human writing and AI-generated content, ensuring text authenticity across various applications. For instance, OpenAI, which developed ChatGPT, introduced an AI text classifier that assists users in determining whether an essay was authored by a human or generated by AI. This classifier categorizes documents into five levels based on the likelihood of being AI-generated: very unlikely, unlikely, unclear, possibly, and likely AI-generated. The OpenAI classifier has been trained using a diverse range of human-written texts, although the training data does not encompass every type of human-written text. Furthermore, the developers' tests reveal that the classifier accurately identifies 26% of AI-written text (true positives) as "likely AI-generated" while incorrectly labeling 9% of human-written text (false positives) as AI-generated (Kirchner et al. 2023). Hence, OpenAI advises users to treat the classifier's results as supplementary information rather than relying on them exclusively for determining AI-generated content (Kirchner et al. 2023). Other AI text classifier tools include Writer.com's AI content detector, which offers a limited application programming interface (API)-based solution for detecting AI-generated content and emphasizes its suitability for content marketing. Copyleaks, an AI content detection solution, claims a 99% accuracy rate and provides integration with many Learning Management Systems (LMS) and APIs. GPTZero, developed by Edward Tian, is an AI classifier tool targeting educational institutions to combat AI plagiarism by detecting AI-generated text in student assignments. Lastly, CrossPlag's AI content detector employs machine learning algorithms and natural language processing techniques to predict a text's origin, drawing on patterns and characteristics identified from an extensive dataset of human and AI-generated content.

The development and implementation of AI content detectors and classifier tools underscore the growing importance and need to differentiate between human-written and AI-generated content across various fields, such as education and content marketing. To date, no studies have comprehensively examined the abilities of these AI content detectors and classifiers to distinguish between human and AI-generated content. The present study aims to investigate the capabilities of several recently launched AI content detectors and classifier tools in discerning human-written and AI-generated content.

Methodology

The ChatGPT chatbot generated two sets of 15 paragraphs on "Application of Cooling Towers in the Engineering Process." The first set was generated using ChatGPT Model 3.5, while the second was created using Model 4. The initial prompt was to "write around 100 words on the application of cooling towers in the engineering process." Five human-written samples were incorporated as controls to evaluate false-positive responses by the AI detectors, as detailed in Table 1. These samples were chosen from the introduction sections of five distinct lab reports penned by undergraduate chemical engineering students. The reports were submitted and evaluated in 2018, a deliberate choice to ensure no interference from AI writing tools, which were not available at that time.
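
As a rough illustration of how such a set of paragraphs could be produced programmatically rather than through the chat interface, the Python sketch below calls the openai package's chat-completion endpoint (pre-1.0 interface). The prompt text is taken from the methodology above; the function name, the loop, and the use of the API instead of the web chatbot are assumptions for illustration, not the authors' procedure.

```python
import openai  # pre-1.0 interface of the openai Python package

openai.api_key = "YOUR_API_KEY"  # placeholder credential

# Prompt text taken from the methodology description above.
PROMPT = ("write around 100 words on the application of cooling towers "
          "in the engineering process")


def generate_paragraphs(model: str, n: int = 15) -> list[str]:
    """Request n independent ~100-word paragraphs from the given model (illustrative helper)."""
    paragraphs = []
    for _ in range(n):
        response = openai.ChatCompletion.create(
            model=model,
            messages=[{"role": "user", "content": PROMPT}],
        )
        paragraphs.append(response.choices[0].message.content)
    return paragraphs


# e.g., generate_paragraphs("gpt-3.5-turbo") and generate_paragraphs("gpt-4")
```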

Five AI text content detectors, namely OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag, were selected and evaluated for their ability to differentiate between human and AI-generated content. These AI detectors were selected based on extensive online research and valuable feedback from individual educators at the time of the study. It is important to note that this landscape is continually evolving, with new tools and websites expected to be launched shortly. Some tools, like the Turnitin AI detector, have already been introduced but are yet to be widely adopted or activated across educational institutions. In addition, Turnitin requires a submitted file to contain at least 300 words of prose in a long-form writing format (Turnitin 2023).

It is important to note that different AI content detection tools display their results in distinct representations, as summarized in Table 2 . To standardize the results across all detection tools, we normalized them according to the OpenAI theme. This normalization was based on the AI content percentage. Texts with less than 20% AI content were classified as "very unlikely AI-generated," those with 20–40% AI content were considered "unlikely AI-generated," those with 40–60% AI content were deemed "unclear if AI-generated," those with 60–80% AI content were labeled "possibly AI-generated." Those with over 80% AI content were categorized as "likely AI-generated." Statistical analysis and capabilities tests were conducted using Minitab (Minitab 2023 ).
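
For illustration, the normalization rule described above can be expressed as a small Python helper that maps a detector's reported AI-content percentage to the five-tier, OpenAI-style label. The function name and the handling of exact boundary values are assumptions; the thresholds mirror the 20% increments stated in the text.

```python
def normalize_ai_score(ai_percent: float) -> str:
    """Map a detector's AI-content percentage (0-100) to the five-tier
    OpenAI-style label used to standardize results across tools.
    The treatment of exact boundary values (20, 40, 60, 80) is an assumption."""
    if ai_percent < 20:
        return "very unlikely AI-generated"
    elif ai_percent < 40:
        return "unlikely AI-generated"
    elif ai_percent < 60:
        return "unclear if AI-generated"
    elif ai_percent <= 80:
        return "possibly AI-generated"
    else:
        return "likely AI-generated"


# Example: a detector reporting 73% AI content maps to "possibly AI-generated".
print(normalize_ai_score(73))
```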

The diagnostic accuracy of AI detector responses was classified into positive, negative, false positive, false negative, and uncertain based on the original content's nature (AI-generated or human-written). The AI detector responses were classified as positive if the original content was AI-generated and the detector output was "Likely AI-generated" or, more inclusively, "Possibly AI-generated." Negative responses arise when the original content is human-generated, and the detector output is "Very unlikely AI-generated" or, more inclusively, "Unlikely AI-generated." False positive responses occur when the original content is human-generated, and the detector output is "Likely AI-generated" or "Possibly AI-generated." In contrast, false negative responses emerge when the original content is AI-generated, and the detector output is "Very unlikely AI-generated" or "Unlikely AI-generated." Finally, uncertain responses are those where the detector output is "Unclear if it is AI-generated," regardless of whether the original content is AI-generated or human-generated. This classification scheme assumes that "Possibly AI-generated" and "Unlikely AI-generated" responses could be considered borderline cases, falling into either positive/negative or false positive/false negative categories based on the desired level of inclusivity or strictness in the classification process.
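
The classification scheme above can be sketched as a short Python function; this version follows the more inclusive reading in which "Possibly AI-generated" counts as a positive call and "Unlikely AI-generated" as a negative call. Names and label strings are illustrative only.

```python
AI_CALLS = {"likely AI-generated", "possibly AI-generated"}
HUMAN_CALLS = {"very unlikely AI-generated", "unlikely AI-generated"}


def classify_response(source: str, detector_label: str) -> str:
    """Compare one normalized detector verdict with the known origin of the text.

    source: "ai" for ChatGPT-generated paragraphs, "human" for control responses.
    detector_label: one of the five normalized tiers.
    """
    if detector_label == "unclear if AI-generated":
        return "uncertain"
    called_ai = detector_label in AI_CALLS
    if source == "ai":
        return "true positive" if called_ai else "false negative"
    return "false positive" if called_ai else "true negative"


# Example: a human-written control flagged "possibly AI-generated" is a false positive.
print(classify_response("human", "possibly AI-generated"))
```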

This study evaluated these five detectors, OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag, focusing on their Specificity, Sensitivity, Positive Predictive Value (PPV), and Negative Predictive Value (NPV). These metrics are used in biostatistics and machine learning to evaluate the performance of binary classification tests. Sensitivity (True Positive Rate) is the proportion of actual positive cases which are correctly identified. In this context, sensitivity is defined as the proportion of AI-generated content correctly identified by the detectors out of all AI-generated content. It is calculated as the ratio of true positives (AI-generated content correctly identified) to the sum of true positives and false negatives (AI-generated content incorrectly identified as human-generated) (Nelson et al. 2001 ; Nhu et al. 2020 ).

On the other hand, Specificity (True Negative Rate) is the proportion of actual negative cases which are correctly identified. In this context, it refers to the proportion of human-generated content correctly identified by the detectors out of all actual human-generated content. It is computed as the ratio of true negatives (human-generated content correctly identified) to the sum of true negatives and false positives (human-generated content incorrectly identified as AI-generated) (Nelson et al. 2001 ; Nhu et al. 2020 ).

Predictive power, a vital determinant of the detectors' efficacy, is divided into positive predictive value (PPV) and negative predictive value (NPV). PPV is the proportion of positive test results that are true positives; in this context, it is the proportion of actual AI-generated content among all content identified as AI-generated by the detectors, calculated as the ratio of true positives to the sum of true and false positives. Conversely, NPV is the proportion of negative test results that are true negatives; in this context, it is the proportion of actual human-generated content among all content identified as human-generated by the detectors, calculated as the ratio of true negatives to the sum of true negatives and false negatives (Nelson et al. 2001; Nhu et al. 2020). Together, these metrics provide a robust framework for evaluating the performance of AI text content detectors and can collectively be called "Classification Performance Metrics" or "Binary Classification Metrics."
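
Expressed as arithmetic on the confusion-matrix counts, the four metrics defined above can be computed as in the following sketch (an illustrative helper, not the authors' Minitab workflow; uncertain responses are simply left out of the counts, as the definitions imply).

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix counts.
    tp/fn refer to AI-generated content, tn/fp to human-written content."""
    return {
        "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
        "specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "npv": tn / (tn + fn) if (tn + fn) else float("nan"),
    }


# Hypothetical example: 14 of 15 AI paragraphs flagged, 4 of 5 human controls cleared.
print(classification_metrics(tp=14, fp=1, tn=4, fn=1))
```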

Table 3 outlines the outcomes of the AI content detection tools applied to 15 paragraphs generated by ChatGPT Model 3.5, 15 more from ChatGPT Model 4, and five control paragraphs penned by humans. It is important to emphasize that, as stated in the methodology section and detailed in Table 2, different AI content detection tools display their results in distinct representations. For instance, GPTZERO classifies content into two groups: AI-generated or human-generated. In contrast, the OpenAI classifier uses a five-tier classification system: Likely AI-Generated, Possibly AI-Generated, Unclear if it is AI-Generated, Unlikely AI-Generated, and Very Unlikely AI-Generated. Notably, neither GPTZERO nor the OpenAI classifier discloses the specific proportions of AI or human contribution within the content, whereas the other AI detectors provide percentages detailing the AI or human contribution in the submitted text. Therefore, to standardize the responses from all AI detectors, the percentage data were normalized to fit the five-tier classification system of the OpenAI classifier, where each category represents a 20% increment. The table also includes the exact percentage of AI contribution within each category for enhanced clarity and specificity.

Table 4 , on the other hand, demonstrates the diagnostic accuracy of these AI detection tools in differentiating between AI-generated and human-written content. The results for GPT 3.5-generated content indicate a high degree of consistency among the tools. The AI-generated content was often correctly identified as "Likely AI-Generated." However, there were a few instances where the tools provided an uncertain or false-negative classification. GPT 3.5_7 and GPT 3.5_14 received "Very unlikely AI-Generated" ratings from GPTZERO, while WRITER classified GPT 3.5_9 and GPT 3.5_14 as "Unclear if AI-Generated." Despite these discrepancies, most GPT 3.5-generated content was correctly identified as AI-generated by all tools.

The performance of the tools on GPT 4-generated content was notably less consistent. While some AI-generated content was correctly identified, there were several false negatives and uncertain classifications. For example, GPT 4_1, GPT 4_3, and GPT 4_4 received "Very unlikely AI-Generated" ratings from WRITER, CROSSPLAG, and GPTZERO. Furthermore, GPT 4_13 was classified as "Very unlikely AI-Generated" by WRITER and CROSSPLAG, while GPTZERO labeled it as "Unclear if it is AI-Generated." Overall, the tools struggled more with accurately identifying GPT 4-generated content than GPT 3.5-generated content.

When analyzing the control responses, it is evident that the tools' performance was not entirely reliable. While some human-written content was correctly classified as "Very unlikely AI-Generated" or "Unlikely AI-Generated," there were false positives and uncertain classifications. For example, WRITER ranked Human 1 and 2 as "Likely AI-Generated," while GPTZERO provided a "Likely AI-Generated" classification for Human 2. Additionally, Human 5 received an "Uncertain" classification from WRITER.

To illustrate the distribution of the discrete outcomes effectively, the Tally Individual Variables function in Minitab was employed. This method facilitated the visualization of the frequencies of the various categories, providing insight into the patterns within the dataset. To further enhance comprehension, the outcomes of the Tally analysis were depicted using bar charts, as shown in Figs. 1, 2, 3, 4, 5 and 6. Moreover, the classification performance metrics of the five AI text content detectors are presented in Fig. 7, indicating varied performance across the different metrics. Looking at the GPT 3.5 results, the OpenAI Classifier displayed the highest sensitivity, with a score of 100%, implying that it correctly identified all AI-generated content. However, its specificity and NPV were the lowest, at 0%, indicating that it failed to correctly identify human-generated content and that its negative predictions were unreliable for genuinely human-written text. GPTZero exhibited a balanced performance, with a sensitivity of 93% and a specificity of 80%, while Writer and Copyleaks struggled with sensitivity. The results for GPT 4 were generally lower, with Copyleaks having the highest sensitivity at 93% and CrossPlag maintaining 100% specificity. The OpenAI Classifier demonstrated substantial sensitivity and NPV but no specificity.
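
The tally-and-plot step can also be reproduced outside Minitab; the brief sketch below counts the normalized verdicts for one detector with Python's collections.Counter and draws a bar chart with matplotlib. The verdict counts used here are made-up placeholder data, not the study's results.

```python
from collections import Counter

import matplotlib.pyplot as plt

# Placeholder verdicts for one detector on the 15 GPT-3.5 paragraphs (illustrative only).
verdicts = (
    ["likely AI-generated"] * 12
    + ["possibly AI-generated"]
    + ["unclear if AI-generated"]
    + ["very unlikely AI-generated"]
)

tally = Counter(verdicts)  # frequency of each category, analogous to Minitab's Tally

plt.bar(list(tally.keys()), list(tally.values()))
plt.ylabel("Number of paragraphs")
plt.title("Detector verdicts for GPT-3.5 content (illustrative data)")
plt.xticks(rotation=30, ha="right")
plt.tight_layout()
plt.show()
```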

figure 1

The responses of five AI text content detectors for GPT-3.5 generated contents

figure 2

The diagnostic accuracy of the AI text content detectors' responses for GPT-3.5 generated contents

figure 3

The responses of five AI text content detectors for GPT-4 generated contents

figure 4

The diagnostic accuracy of the AI text content detectors' responses for GPT-4 generated contents

figure 5

The responses of five AI text content detectors for human-written contents

figure 6

The diagnostic accuracy of the AI text content detectors' responses for the human-written contents

figure 7

The Classification Performance Metrics of (a) OpenAI Classifier, (b) WRITER, (c) CROSSPLAG, (d) COPYLEAKS, and (e) GPTZERO

The analysis focuses on the performance of five AI text content detectors developed by OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag corporations. These tools were utilized to evaluate the generated content and determine the effectiveness of each detector in correctly identifying and categorizing the text as either AI-generated or human-written. The results indicate a variance in the performance of these tools across GPT 3.5, GPT 4, and human-generated content. While the tools were generally more successful in identifying GPT 3.5-generated content, they struggled with GPT 4-generated content and exhibited inconsistencies when analyzing human-written control responses. The varying degrees of performance across these AI text content detectors highlight the complexities and challenges associated with differentiating between human and AI-generated content.

The OpenAI Classifier's high sensitivity but low specificity in both GPT versions suggests that it is efficient at identifying AI-generated content but may struggle to identify human-generated content accurately. CrossPlag's high specificity indicates that it correctly identifies human-generated content but struggles with AI-generated content, especially from GPT 4. These findings raise questions about its effectiveness in the rapidly advancing AI landscape.

The differences between the GPT 3.5 and GPT 4 results underline the evolving challenge of AI-generated content detection, suggesting that detector performance can significantly vary depending on the AI model's sophistication. These findings have significant implications for plagiarism detection, highlighting the need for ongoing advancements in detection tools to keep pace with evolving AI text generation capabilities.

Notably, the study's findings underscore the need for a nuanced understanding of the capabilities and limitations of these technologies. While this study indicates that AI-detection tools can distinguish between human and AI-generated content to a certain extent, their performance is inconsistent and varies depending on the sophistication of the AI model used to generate the content. This inconsistency raises concerns about the reliability of these tools, especially in high-stakes contexts such as academic integrity investigations. Therefore, while AI-detection tools may serve as a helpful aid in identifying AI-generated content, they should not be used as the sole determinant in academic integrity cases. Instead, a more holistic approach that includes manual review and consideration of contextual factors should be adopted. This approach would ensure a fairer evaluation process and mitigate the ethical concerns of using AI detection tools.

It is important to emphasize that the advent of AI and other digital technologies necessitates rethinking traditional assessment methods. Rather than resorting solely to methods less vulnerable to AI cheating, educational institutions should also consider leveraging these technologies to enhance learning and assessment. For instance, AI could provide personalized feedback, facilitate peer review, or even create more complex and realistic assessment tasks that are difficult to cheat. In addition, it is essential to note that academic integrity is not just about preventing cheating but also about fostering a culture of honesty and responsibility. This involves educating students about the importance of academic integrity and the consequences of academic misconduct and providing them with the necessary skills and resources to avoid plagiarism and other forms of cheating.

The limitations of this study, such as the tools evaluated, the statistics used, and the disciplinary specificity of the content against which the tools were tested, need to be acknowledged. The tools analyzed were only those developed by OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag, selected on the basis of extensive online research and feedback from individual educators at the time of the study. This landscape is continually evolving, and new tools and websites are expected to be launched shortly; some, like the Turnitin AI detector (which requires at least 300 words of prose in a long-form writing format), had already been introduced but were not yet widely adopted or activated across educational institutions. Moreover, the content used for testing was generated by ChatGPT Models 3.5 and 4 and included only five human-written control responses. The sample size and nature of the content could affect the findings, as the performance of these tools might differ when applied to other AI models or to a larger, more diverse set of human-written content.

It is essential to mention that this study was conducted at a specific time. Therefore, the performance of the tools might have evolved, and they might perform differently on different versions of AI models that have been released after this study was conducted. Future research should explore techniques to increase both sensitivity and specificity simultaneously for more accurate content detection, considering the rapidly evolving nature of AI content generation.

The present study sought to evaluate the performance of AI text content detectors, including OpenAI, Writer, Copyleaks, GPTZero, and CrossPlag. The results of this study indicate considerable variability in the tools' ability to correctly identify and categorize text as either AI-generated or human-written, with a general trend showing a better performance when identifying GPT 3.5-generated content compared to GPT 4-generated content or human-written content. Notably, the varying performance underscores the intricacies involved in distinguishing between AI and human-generated text and the challenges that arise with advancements in AI text generation capabilities.

The study highlighted significant performance differences between the AI detectors, with OpenAI showing high sensitivity but low specificity in detecting AI-generated content. In contrast, CrossPlag showed high specificity but struggled with AI-generated content, particularly from GPT 4. This suggests that the effectiveness of these tools may be limited in the fast-paced world of AI evolution. Furthermore, the discrepancy in detecting GPT 3.5 and GPT 4 content emphasizes the growing challenge in AI-generated content detection and the implications for plagiarism detection. The findings necessitate improvements in detection tools to keep up with sophisticated AI text generation models.

Notably, while AI detection tools can provide some insights, their inconsistent performance and dependence on the sophistication of the AI models necessitate a more holistic approach for academic integrity cases, combining AI tools with manual review and contextual considerations. The findings also call for reassessing traditional educational methods in the face of AI and digital technologies, suggesting a shift towards AI-enhanced learning and assessment while fostering an environment of academic honesty and responsibility. The study acknowledges limitations related to the selected AI detectors, the nature of content used for testing, and the study's timing. Therefore, future research should consider expanding the selection of detectors, increasing the variety and size of the testing content, and regularly evaluating the detectors' performance over time to keep pace with the rapidly evolving AI landscape. Future research should also focus on improving sensitivity and specificity simultaneously for more accurate content detection.

In conclusion, as AI text generation evolves, so must the tools designed to detect it. This necessitates continuous development and regular evaluation to ensure their efficacy and reliability. Furthermore, a balanced approach involving AI tools and traditional methods best upholds academic integrity in an ever-evolving digital landscape.

Availability of data and materials

All data and materials are available.

Abbreviations

AI: Artificial Intelligence

LMS: Learning Management Systems

NLP: Natural Language Processing

NPV: Negative Predictive Value

PPV: Positive Predictive Value

TMSP: Text-Matching Software Product

References

Alsallal M, Iqbal R, Amin S, James A (2013) Intrinsic Plagiarism Detection Using Latent Semantic Indexing and Stylometry. 2013 Sixth International Conference on Developments in eSystems Engineering

Brown T, Mann B, Ryder N, Subbiah M, Kaplan JD, Dhariwal P, Neelakantan A, Shyam P, Sastry G, Askell A (2020) Language models are few-shot learners. Adv Neural Inf Process Syst 33:1877–1901

Crawford J, Cowling M, Allen KA (2023) Leadership is needed for ethical ChatGPT: Character, assessment, and learning using artificial intelligence (AI). J Univ Teach Learning Pract 20(3). https://doi.org/10.53761/1.20.3.02

Elkhatat AM (2023) Evaluating the Efficacy of AI Detectors: A Comparative Analysis of Tools for Discriminating Human-Generated and AI-Generated Texts. Int J Educ Integr. https://doi.org/10.1007/s40979-023-00137-0

Elkhatat AM, Elsaid K, Almeer S (2021) Some students plagiarism tricks, and tips for effective check. Int J Educ Integrity 17(1). https://doi.org/10.1007/s40979-021-00082-w

Elkhatat AM (2022) Practical randomly selected question exam design to address replicated and sequential questions in online examinations. Int J Educ Integrity 18(1). https://doi.org/10.1007/s40979-022-00103-2

Fishman T (2009) “We know it when we see it” is not good enough: toward a standard definition of plagiarism that transcends theft, fraud, and copyright 4th Asia Pacific Conference on Educational Integrity, University of Wollongong NSW Australia

Foltýnek T, Meuschke N, Gipp B (2019) Academic Plagiarism Detection. ACM Comput Surv 52(6):1–42. https://doi.org/10.1145/3345317

Foltýnek T, Meuschke N, Gipp B (2020) Academic Plagiarism Detection. ACM Comput Surv 52(6):1–42. https://doi.org/10.1145/3345317

Frye BL (2022) Should Using an AI Text Generator to Produce Academic Writing Be Plagiarism? Fordham Intellectual Property, Media & Entertainment Law Journal. https://ssrn.com/abstract=4292283

Gao CA, Howard FM, Markov NS, Dyer EC, Ramesh S, Luo Y, Pearson AT (2022) Comparing scientific abstracts generated by ChatGPT to original abstracts using an artificial intelligence output detector, plagiarism detector, and blinded human reviewers. https://doi.org/10.1101/2022.12.23.521610

King MR, chatGpt (2023) A Conversation on Artificial Intelligence, Chatbots, and Plagiarism in Higher Education. Cell Mol Bioeng 16(1):1–2. https://doi.org/10.1007/s12195-022-00754-8

Kirchner JH, Ahmad L, Aaronson S, Leike J (2023) New AI classifier for indicating AI-written text. OpenAI. Retrieved 16 April from https://openai.com/blog/new-ai-classifier-for-indicating-ai-written-text

Lee H (2023) The rise of ChatGPT: Exploring its potential in medical education. Anat Sci Educ. https://doi.org/10.1002/ase.2270

Meuschke N, Gipp B (2013) State-of-the-art in detecting academic plagiarism. Int J Educ Integrity 9(1). https://doi.org/10.21913/IJEI.v9i1.847

Minitab (2023). https://www.minitab.com/en-us/

Nelson EC, Hanna GL, Hudziak JJ, Botteron KN, Heath AC, Todd RD (2001) Obsessive-compulsive scale of the child behavior checklist: specificity, sensitivity, and predictive power. Pediatrics 108(1):E14. https://doi.org/10.1542/peds.108.1.e14

Nhu VH, Mohammadi A, Shahabi H, Ahmad BB, Al-Ansari N, Shirzadi A, Clague JJ, Jaafari A, Chen W, Nguyen H (2020) Landslide Susceptibility Mapping Using Machine Learning Algorithms and Remote Sensing Data in a Tropical Environment. Int J Environ Res Public Health, 17(14). https://doi.org/10.3390/ijerph17144933

OpenAI (2022) Introducing ChatGPT. Retrieved March 21 from https://openai.com/blog/chatgpt/

OpenAI (2023) GPT-4 is OpenAI's most advanced system, producing safer and more useful responses. Retrieved March 22 from https://openai.com/product/gpt-4

Perkins M (2023) Academic integrity considerations of AI Large Language Models in the post-pandemic era: ChatGPT and beyond. J Univ Teach Learning Pract 20(2). https://doi.org/10.53761/1.20.02.07

Qadir J (2022) Engineering Education in the Era of ChatGPT: Promise and Pitfalls of Generative AI for Education. TechRxiv. Preprint. https://doi.org/10.36227/techrxiv.21789434.v1

Radford A, Wu J, Child R, Luan D, Amodei D, Sutskever I (2019) Language models are unsupervised multitask learners. OpenAI Blog 1(8):9

Radford A, Narasimhan K, Salimans T, Sutskever I (2018) Improving language understanding by generative pre-training

Sakamoto D, Tsuda K (2019) A Detection Method for Plagiarism Reports of Students. Procedia Computer Science 159:1329–1338. https://doi.org/10.1016/j.procs.2019.09.303

Sullivan M, Kelly A, Mclaughlan P (2023) ChatGPT in higher education: Considerations for academic integrity and student learning. J Appl Learning Teach 6(1). https://doi.org/10.37074/jalt.2023.6.1.17

Turnitin (2023) AI Writing Detection Frequently Asked Questions. Retrieved 21 June from https://www.turnitin.com/products/features/ai-writing-detection/faq

Williams C (2022) Hype, or the future of learning and teaching? 3 Limits to AI's ability to write student essays. The University of Kent's Academic Repository, Blog post. https://kar.kent.ac.uk/99505/

Acknowledgements

The publication of this article was funded by the Qatar National Library.

Author information

Authors and Affiliations

Department of Chemical Engineering, Qatar University, P.O. 2713, Doha, Qatar

Ahmed M. Elkhatat

Chemical Engineering Program, Texas A&M University at Qatar, P.O. 23874, Doha, Qatar

Khaled Elsaid

Department of Chemistry and Earth Sciences, Qatar University, P.O. 2713, Doha, Qatar

Saeed Almeer

Contributions

Ahmed M. Elkhatat: conceptualization, conducting the experiments, discussing the results, writing the first draft. Khaled Elsaid: validating the concepts, contributing to the discussion, writing the second draft. Saeed Almeer: project administration and supervision, proofreading, improving, and writing the final version.

Corresponding author

Correspondence to Ahmed M. Elkhatat .

Ethics declarations

Competing interests.

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Elkhatat, A.M., Elsaid, K. & Almeer, S. Evaluating the efficacy of AI content detection tools in differentiating between human and AI-generated text. Int J Educ Integr 19 , 17 (2023). https://doi.org/10.1007/s40979-023-00140-5

Download citation

Received : 30 April 2023

Accepted : 30 June 2023

Published : 01 September 2023

DOI : https://doi.org/10.1007/s40979-023-00140-5

  • AI-generated content
  • Academic integrity
  • AI content detection tools

International Journal for Educational Integrity

ISSN: 1833-2595

