
Published: 25 January 2021

Online education in the post-COVID era

Barbara B. Lockee

Nature Electronics volume 4, pages 5–6 (2021)


  • Science, technology and society

The coronavirus pandemic has forced students and educators across all levels of education to rapidly adapt to online learning. The impact of this — and the developments required to make it work — could permanently change how education is delivered.

The COVID-19 pandemic has forced the world to engage in the ubiquitous use of virtual learning. And while online and distance learning have been used before to maintain continuity in education, such as in the aftermath of earthquakes 1 , the scale of the current crisis is unprecedented. Speculation has now begun about what the lasting effects of this will be and what education may look like in the post-COVID era. For some, an immediate retreat to the traditions of the physical classroom is required. But for others, the forced shift to online education is a moment of change and a time to reimagine how education could be delivered 2 .


Looking back

Online education has traditionally been viewed as an alternative pathway, one that is particularly well suited to adult learners seeking higher education opportunities. However, the emergence of the COVID-19 pandemic has required educators and students across all levels of education to adapt quickly to virtual courses. (The term ‘emergency remote teaching’ was coined in the early stages of the pandemic to describe the temporary nature of this transition 3 .) In some cases, instruction shifted online, then returned to the physical classroom, and then shifted back online due to further surges in the rate of infection. In other cases, instruction was offered using a combination of remote delivery and face-to-face teaching: that is, students could attend online or in person (referred to as the HyFlex model 4 ). In either case, instructors had to figure out how to make it work, considering the affordances and constraints of the specific learning environment to create learning experiences that were feasible and effective.

The use of varied delivery modes does, in fact, have a long history in education. Mechanical (and then later electronic) teaching machines have provided individualized learning programmes since the 1950s and the work of B. F. Skinner 5 , who proposed using technology to walk individual learners through carefully designed sequences of instruction with immediate feedback indicating the accuracy of their response. Skinner’s notions formed the first formalized representations of programmed learning, or ‘designed’ learning experiences. Then, in the 1960s, Fred Keller developed a personalized system of instruction 6 , in which students first read assigned course materials on their own, followed by one-on-one assessment sessions with a tutor, gaining permission to move ahead only after demonstrating mastery of the instructional material. Occasional class meetings were held to discuss concepts, answer questions and provide opportunities for social interaction. A personalized system of instruction was designed on the premise that initial engagement with content could be done independently, then discussed and applied in the social context of a classroom.

These predecessors to contemporary online education leveraged key principles of instructional design — the systematic process of applying psychological principles of human learning to the creation of effective instructional solutions — to consider which methods (and their corresponding learning environments) would effectively engage students to attain the targeted learning outcomes. In other words, they considered what choices about the planning and implementation of the learning experience can lead to student success. Such early educational innovations laid the groundwork for contemporary virtual learning, which itself incorporates a variety of instructional approaches and combinations of delivery modes.

Online learning and the pandemic

Fast forward to 2020, and various further educational innovations have occurred to make the universal adoption of remote learning a possibility. One key challenge is access. Here, extensive problems remain, including the lack of Internet connectivity in some locations, especially rural ones, and the competing needs among family members for the use of home technology. However, creative solutions have emerged to provide students and families with the facilities and resources needed to engage in and successfully complete coursework 7 . For example, school buses have been used to provide mobile hotspots, class packets have been sent by mail, and instructional presentations have been aired on local public broadcasting stations. The year 2020 has also seen increased availability and adoption of electronic resources and activities that can now be integrated into online learning experiences. Synchronous online conferencing systems, such as Zoom and Google Meet, have allowed experts from anywhere in the world to join online classrooms 8 and have allowed presentations to be recorded for individual learners to watch at a time most convenient for them. Furthermore, the importance of hands-on, experiential learning has led to innovations such as virtual field trips and virtual labs 9 . A capacity to serve learners of all ages has thus now been effectively established, and the next generation of online education can move from an enterprise that largely serves adult learners and higher education to one that increasingly serves younger learners in primary and secondary education (ages 5 to 18).

The COVID-19 pandemic is also likely to have a lasting effect on lesson design. The constraints of the pandemic provided an opportunity for educators to consider new strategies to teach targeted concepts. Though the rethinking of instructional approaches was forced and hurried, the experience has served as a rare chance to reconsider strategies that best facilitate learning within the affordances and constraints of the online context. In particular, greater variance in teaching and learning activities will continue to call into question the importance of ‘seat time’ as the standard on which educational credits are based 10 — lengthy Zoom sessions are seldom instructionally necessary and are not aligned with the psychological principles of how humans learn. Interaction is important for learning, but forced interactions among students for the sake of interaction are neither motivating nor beneficial.

While the blurring of the lines between traditional and distance education has been noted for several decades 11 , the pandemic has quickly advanced the erasure of these boundaries. Less single mode, more multi-mode (and thus more educator choices) is becoming the norm due to enhanced infrastructure and developed skill sets that allow people to move across different delivery systems 12 . The well-established best practices of hybrid or blended teaching and learning 13 have served as a guide for new combinations of instructional delivery that have developed in response to the shift to virtual learning. The use of multiple delivery modes is likely to remain, and will be a feature employed with learners of all ages 14 , 15 . Future iterations of online education will no longer be bound to the traditions of single teaching modes, as educators can support pedagogical approaches from a menu of instructional delivery options, a mix that has been supported by previous generations of online educators 16 .

Also significant are the changes to how learning outcomes are determined in online settings. Many educators have altered the ways in which student achievement is measured, eliminating assignments and changing assessment strategies altogether 17 . Such alterations include determining learning through strategies that leverage the online delivery mode, such as interactive discussions, student-led teaching and the use of games to increase motivation and attention. Specific changes that are likely to continue include flexible or extended deadlines for assignment completion 18 , more student choice regarding measures of learning, and more authentic experiences that involve the meaningful application of newly learned skills and knowledge 19 , for example, team-based projects that involve multiple creative and social media tools in support of collaborative problem solving.

In response to the COVID-19 pandemic, technological and administrative systems for implementing online learning, and the infrastructure that supports its access and delivery, had to adapt quickly. While access remains a significant issue for many, extensive resources have been allocated and processes developed to connect learners with course activities and materials, to facilitate communication between instructors and students, and to manage the administration of online learning. Paths for greater access and opportunities to online education have now been forged, and there is a clear route for the next generation of adopters of online education.

Before the pandemic, the primary purpose of distance and online education was providing access to instruction for those otherwise unable to participate in a traditional, place-based academic programme. As its purpose has shifted to supporting continuity of instruction, its audience, as well as the wider learning ecosystem, has changed. It will be interesting to see which aspects of emergency remote teaching remain in the next generation of education, when the threat of COVID-19 is no longer a factor. But online education will undoubtedly find new audiences. And the flexibility and learning possibilities that have emerged from necessity are likely to shift the expectations of students and educators, diminishing further the line between classroom-based instruction and virtual learning.

Mackey, J., Gilmore, F., Dabner, N., Breeze, D. & Buckley, P. J. Online Learn. Teach. 8, 35–48 (2012).


Sands, T. & Shushok, F. The COVID-19 higher education shove. Educause Review https://go.nature.com/3o2vHbX (16 October 2020).

Hodges, C., Moore, S., Lockee, B., Trust, T. & Bond, M. A. The difference between emergency remote teaching and online learning. Educause Review https://go.nature.com/38084Lh (27 March 2020).

Beatty, B. J. (ed.) Hybrid-Flexible Course Design Ch. 1.4 https://go.nature.com/3o6Sjb2 (EdTech Books, 2019).

Skinner, B. F. Science 128, 969–977 (1958).


Keller, F. S. J. Appl. Behav. Anal. 1, 79–89 (1968).

Darling-Hammond, L. et al. Restarting and Reinventing School: Learning in the Time of COVID and Beyond (Learning Policy Institute, 2020).

Fulton, C. Inf. Learn. Sci. 121, 579–585 (2020).

Pennisi, E. Science 369, 239–240 (2020).

Silva, E. & White, T. Change: The Magazine of Higher Learning 47, 68–72 (2015).

McIsaac, M. S. & Gunawardena, C. N. in Handbook of Research for Educational Communications and Technology (ed. Jonassen, D. H.) Ch. 13 (Simon & Schuster Macmillan, 1996).

Irvine, V. The landscape of merging modalities. Educause Review https://go.nature.com/2MjiBc9 (26 October 2020).

Stein, J. & Graham, C. Essentials for Blended Learning Ch. 1 (Routledge, 2020).

Maloy, R. W., Trust, T. & Edwards, S. A. Variety is the spice of remote learning. Medium https://go.nature.com/34Y1NxI (24 August 2020).

Lockee, B. J. Appl. Instructional Des. https://go.nature.com/3b0ddoC (2020).

Dunlap, J. & Lowenthal, P. Open Praxis 10, 79–89 (2018).

Johnson, N., Veletsianos, G. & Seaman, J. Online Learn. 24, 6–21 (2020).

Vaughan, N. D., Cleveland-Innes, M. & Garrison, D. R. Assessment in Teaching in Blended Learning Environments: Creating and Sustaining Communities of Inquiry (Athabasca Univ. Press, 2013).

Conrad, D. & Openo, J. Assessment Strategies for Online Learning: Engagement and Authenticity (Athabasca Univ. Press, 2018).


Author information

Authors and affiliations

School of Education, Virginia Tech, Blacksburg, VA, USA

Barbara B. Lockee


Corresponding author

Correspondence to Barbara B. Lockee.

Ethics declarations

Competing interests

The author declares no competing interests.


About this article

Cite this article

Lockee, B.B. Online education in the post-COVID era. Nat Electron 4, 5–6 (2021). https://doi.org/10.1038/s41928-020-00534-0





How Online Learning Is Reshaping Higher Education

As the pandemic eases, many institutions are realizing that properly planned online platforms will allow them to better serve all students, including nontraditional learners.


Two years ago, as COVID-19 caused campuses to close, some institutions were able to shift their students to already robust online learning programs. But many other colleges and universities scrambled to build online education curricula from scratch. Students and faculty often found themselves logging onto Zoom or other platforms for the first time, with little knowledge of how to navigate a new world of virtual learning.

“When the pandemic hit, it was a provocation, as well as a demand for innovation,” said Caroline Levander, the vice president for global and digital strategy at Rice University in Houston, during a recent webinar on the future of online learning hosted by U.S. News & World Report.

While the changes were challenging for many, faculty members at Rice and elsewhere embraced the new opportunities that online learning offered. Levander shared an example of a Rice physics professor, Jason Hafner, who capitalized on the virtual environment to find compelling new ways to teach concepts to students.

“He had been innovating with online delivery in our non-credit offerings before the pandemic,” said Levander. But once COVID-19 spread, Hafner moved beyond the walls of his classroom and took advantage of Rice’s physical campus to enhance his teaching with video-recorded experiments conducted outside of normal class times. For example, in one lesson, he climbed atop a rock edifice in Rice’s engineering quad to drop two equally sized spheres – one made of aluminum and the other of steel – to demonstrate that they would fall with the same acceleration despite their different densities.

Now, many educators are reassessing how virtual learning can further enhance the student experience by offering greater flexibility than in-class options, particularly for hybrid and all-virtual instruction models. During the early days of the pandemic, “people stood up Zoom classrooms” and “they put a lot of video lectures up online,” said Jeff Borden, the chief academic officer for D2L, a company that creates online learning software. “That’s fine. That was important to get people through.” Now, however, Borden stressed, colleges and universities have the opportunity to move beyond these makeshift models. They can work to build more durable online learning platforms that meet the needs of a range of learners who must access coursework at different times and in different formats to suit their particular goals and lifestyles.

While a four-year college education can be thought of as a default for many, there are a lot of people for whom “that’s not the right path,” said Borden. In fact, some students may be looking simply to gain credentials or to upskill, rather than get traditional degrees. “There are tens of millions of other people in our society who have needs that are other than that, who have desires that are different than that,” Borden noted. Online learning now enables older students, working adults, people from nontraditional backgrounds and those who might be neurodiverse to access content more easily than ever before, Borden added.

The multitude of options also extends to graduate and professional schools, many of which have rolled out fully or partially online programs in recent years. In fact, applicants to Rice’s fully online master’s degree program are “much more diverse in every way than students who apply to the residential counterpart,” Levander said, because access is easier and more compatible with the lives of students who may be juggling work and family obligations.

“The nice thing about online education is that it can actually escape geographical boundaries,” said Don Kilburn, the CEO of UMass Online, which has offerings across the five University of Massachusetts schools. Kilburn agreed with his fellow panelists that online learning models play a critical role in broadening access. He also emphasized the potential added benefit of lessening the financial burden on students, since online programs can often cost a fraction of in-person ones. “Part of accessibility is affordability,” he said. “I do think there are ways to actually deliver fully online programs that have a lower cost structure and may actually reduce the cost of education significantly.”

Part of serving the needs of those who choose to attend classes online means understanding why they do so and how their needs differ from those who choose traditional, in-person options, said Nancy Gonzales, the executive vice president and university provost at Arizona State University, whose online programs will reach approximately 84,000 students this year.

Many online students choose to take fewer courses at a time and may take semesters off to accommodate other aspects of their lives like taking care of children or work responsibilities – part of why the flexibility of online learning is so appealing, Gonzales said. “We’ve been trying to really try to understand what is the cadence of attendance and how do we meet the needs of students, because they are a very different population,” said Gonzales.

At the same time, for Gonzales, part of what makes an online education model successful is providing students with support and services comparable to what they might receive through in-person instruction. Such services might range from financial aid counseling to discussion boards where students can interact with their peers, so that interactions with classmates are not lost when attending class online.

But the promise of online education, the panelists agreed, is great. “I think we are just at the beginning of the digital transformation,” said Kilburn. “I can’t tell you when, but at some point you will see a revolution in education like you will in everything else.”


Systematic Review Article: A Systematic Review of the Effectiveness of Online Learning in Higher Education During the COVID-19 Pandemic Period


  • 1 Department of Basic Education, Beihai Campus, Guilin University of Electronic Technology, Beihai, Guangxi, China
  • 2 School of Sports and Arts, Harbin Sport University, Harbin, Heilongjiang, China
  • 3 School of Music, Harbin Normal University, Harbin, Heilongjiang, China
  • 4 School of General Education, Beihai Vocational College, Beihai, Guangxi, China
  • 5 School of Economics and Management, Beihai Campus, Guilin University of Electronic Technology, Guilin, Guangxi, China

Background: The effectiveness of online learning in higher education during the COVID-19 pandemic period is a debated topic but a systematic review on this topic is absent.

Methods: The present study implemented a systematic review of 25 selected articles to comprehensively evaluate online learning effectiveness during the pandemic period and identify factors that influence such effectiveness.

Results: It was concluded that past studies failed to reach a consensus on online learning effectiveness, and that research results are largely shaped by how learning effectiveness was assessed, e.g., self-reported effectiveness, longitudinal comparison, or randomized controlled trials (RCTs). Meanwhile, a set of factors that positively or negatively influence the effectiveness of online learning were identified, including infrastructure factors, instructional factors, the lack of social interaction, negative emotions, flexibility, and convenience.

Discussion: Although the effectiveness of online learning during the pandemic period is debated, it is generally believed that the pandemic brought many challenges and difficulties to higher education and that these were more prominent in developing countries. In addition, this review critically assesses limitations in past research, develops pedagogical implications, and proposes recommendations for future research.

1 Introduction

1.1 Research background

The COVID-19 pandemic, which first broke out in early 2020, has considerably shaped the higher education landscape globally. To restrain viral transmission, universities around the world locked down, and teaching and learning activities were transferred to online platforms. Although online learning is a relatively mature learning model and is increasingly integrated into higher education, the sudden and unprepared transition to wholly online learning caused by the pandemic posed formidable challenges to higher education stakeholders, e.g., policymakers, instructors, and students, especially at the early stage of the pandemic (García-Morales et al., 2021; Grafton-Clarke et al., 2022). Correspondingly, the effectiveness of online learning during the pandemic period remains questionable, as online learning during this period had some unique characteristics, e.g., the lack of preparation, the sudden and unprepared transition, the huge scale of implementation, and social distancing policies (Sharma et al., 2020; Rahman, 2021; Tsang et al., 2021; Hollister et al., 2022; Zhang and Chen, 2023). This question is more prominent in developing or undeveloped countries because of insufficient Internet access, network problems, the lack of electronic devices, and poor network infrastructure (Adnan and Anwar, 2020; Muthuprasad et al., 2021; Rahman, 2021; Chandrasiri and Weerakoon, 2022).

Learning effectiveness is a key consideration of education as it reflects the extent to which learning and teaching objectives are achieved and learners’ needs are satisfied (Joy and Garcia, 2000; Swan, 2003). Online learning was generally proven to be effective within a higher education context (Kebritchi et al., 2017) prior to the pandemic. ICTs have fundamentally shaped the process of learning, as they allow learners to learn anywhere and anytime, interact with others efficiently and conveniently, and freely acquire a large volume of learning materials online (Kebritchi et al., 2017; Choudhury and Pattnaik, 2020). Such benefits may be offset by the challenges brought about by the pandemic. A lot of empirical studies around the world have investigated the effectiveness of online learning, but a systematic review of these studies to comprehensively evaluate that effectiveness and identify its influencing factors is currently lacking.

At present, although the vast majority of countries have implemented reopening policies to deal with the pandemic and higher education institutes have resumed offline teaching and learning, assessing the effectiveness of online learning during the pandemic period via a systematic review is still essential. First, it is necessary to summarize, learn from, and reflect on the lessons and experiences of online learning practices during the pandemic period to offer implications for future practices and research. Second, a review of online learning research carried out during the pandemic period is likely to generate interesting knowledge because of the unique research context. Third, higher education institutes still need a contingency plan for emergency online learning to deal with potential crises in the future, e.g., wars, pandemics, and natural disasters. A systematic review of research on the effectiveness of online learning during the pandemic period offers valuable knowledge for designing such a contingency plan.

1.2 Related concepts

1.2.1 Online learning

Online learning should not be simply understood as learning on the Internet or the integration of ICTs with learning, because it is a systematic framework consisting of a set of pedagogies, technologies, implementations, and processes (Kebritchi et al., 2017; Choudhury and Pattnaik, 2020). Choudhury and Pattnaik (2020, p. 2) summarized prior definitions of online learning and provided a comprehensive and up-to-date definition, i.e., online learning refers to “the transfer of knowledge and skills, in a well-designed course content that has established accreditations, through an electronic media like the Internet, Web 4.0, intranets and extranets.” Online learning differs from traditional learning because of not only technological differences, but also differences in social development and pedagogies (Camargo et al., 2020). Online learning has also considerably shaped the patterns by which knowledge is stored, shared, and transferred and skills are practiced, as well as the way in which stakeholders (e.g., teachers and students) interact (Desai et al., 2008; Anderson and Hajhashemi, 2013). In addition, online learning has altered educational objectives and learning requirements. Memorizing knowledge was traditionally viewed as vital to learning, but it is now less important, since required knowledge can be conveniently searched for and acquired on the Internet, while the reflection on and application of knowledge become more important (Gamage et al., 2023). Online learning also demands learners’ self-regulated learning ability more than traditional learning does, because the online learning environment imposes less external regulation and provides more autonomy and flexibility (Barnard-Brak et al., 2010; Wong et al., 2019). These differences imply that traditional pedagogies may not directly apply to online learning.

There are a variety of online learning models according to differences in learning methods, processes, outcomes, and the application of technologies (Zeitoun, 2008). As ICTs can be used as either the foundation of learning or an auxiliary means, online learning can be classified into assistant, blended, and wholly online models. Here, assistant online learning refers to the scenario where online learning technologies are used to supplement and support traditional learning; blended online learning refers to the integration or mixture of online and offline methods; and wholly online learning refers to the exclusive use of the Internet for learning (Arkorful and Abaidoo, 2015). The present review focuses on wholly online learning because it is interested in the COVID-19 pandemic context, where learning activities were fully switched to online platforms.

1.2.2 Learning effectiveness

Learning effectiveness can be broadly defined as the extent to which learning and teaching objectives have been effectively and efficiently achieved via educational activities (Swan, 2003) or the extent to which learners’ needs are satisfied by learning activities (Joy and Garcia, 2000). It is a multi-dimensional construct because learning objectives and needs are always diversified (Joy and Garcia, 2000; Swan, 2003). Assessing learning effectiveness is a key challenge in educational research, and researchers generally use a set of subjective and objective indicators to assess it, e.g., examination scores, assignment performance, perceived effectiveness, student satisfaction, learning motivation, engagement in learning, and learning experience (Rajaram and Collins, 2013; Noesgaard and Ørngreen, 2015). Prior research related to the effectiveness of online learning was diversified in terms of learning outcomes, e.g., satisfaction, perceived effectiveness, motivation, and learning engagement, and there is no consensus over which outcomes are valid indicators of learning effectiveness. The present study adopts a broad definition of learning effectiveness and considers various learning outcomes that are closely associated with learning objectives and needs.

1.3 Previous review research

Up to now, online learning during the COVID-19 pandemic period has attracted considerable attention from academia and there is a lot of related review research. Some review research analyzed the trends and major topics in related research. Pratama et al. (2020) tracked the trend of using online meeting applications in online learning during the pandemic period based on a systematic review of 12 articles. It was reported that the use of these applications kept a rising trend and this use helps promote learning and teaching processes. However, this review was descriptive and failed to identify problems related to these applications as well as the limitations of these applications. Zhang et al. (2022) implemented a bibliometric review to provide a holistic view of research on online learning in higher education during the COVID-19 pandemic period. They concluded that the majority of research focused on identifying the use of strategies and technologies, psychological impacts brought by the pandemic, and student perceptions. Meanwhile, collaborative learning, hands-on learning, discovery learning, and inquiry-based learning were the most frequently discussed instructional approaches. In addition, chemical and medical education were found to be the most investigated disciplines. This review hence offered a relatively comprehensive landscape of related research in the field. However, since it was a bibliometric review, it merely analyzed the superficial characteristics of past articles in the field without a detailed analysis of their research contributions. Bughrara et al. (2023) categorized the major research topics in the field of online medical education during the pandemic period via a scoping review. 
A total of 174 articles were included in the review, and it was found that there were seven major topics: students’ mental health, stigma, student vaccination, use of telehealth, students’ physical health, online modifications and educational adaptations, and students’ attitudes and knowledge. Overall, the review comprehensively reveals the major topics in the field.

Some scholars argued that online learning during the pandemic period brought about many problems, with both students and teachers encountering considerable challenges. García-Morales et al. (2021) implemented a systematic review to identify the challenges encountered by higher education in an online learning scenario during the pandemic period. A total of seven studies were included, and it was found that higher education suddenly transferred to online learning and many technologies and platforms were used to support it. However, this transition was hasty and forced by the extreme situation. Thus, various stakeholders in learning and teaching (e.g., students, universities, and teachers) encountered difficulties in adapting to this sudden change. To deal with these challenges, universities need to utilize the potential of technologies, improve the learning experience, and meet students’ expectations. The major limitation of the García-Morales et al. (2021) review is its small sample size. Meanwhile, García-Morales et al. (2021) also failed to systematically categorize the various types of challenges. Stojan et al. (2022) investigated the changes to medical education brought about by the shift to online learning in the COVID-19 pandemic context, as well as the lessons and impacts of these changes, via a systematic review. A total of 56 articles were included in the analysis, and it was reported that small groups and didactics were the most prevalent instructional methods. Although learning engagement was always interactive, teachers mainly integrated technologies to amplify and replace, rather than transform, learning. Based on this, they argued that the use of asynchronous and synchronous formats promoted online learning engagement and offered self-directed and flexible learning. The major limitation of this review is that it is somewhat descriptive and lacks a critical evaluation of the problems of online learning.

Review research has also focused on the changes and impacts brought about by online learning during the pandemic period. Camargo et al. (2020) implemented a meta-analysis of seven empirical studies on online learning methods during the pandemic period to evaluate feasible online learning platforms, effective online learning models, and the optimal duration of online lectures, as well as the perceptions of teachers and students in the online learning process. Overall, it was concluded that the shift from offline to online learning is feasible, and that effective online learning needs a well-trained and integrated team to identify students’ and teachers’ needs, respond in a timely manner, and support them via digital tools. In addition, the pandemic brought varying degrees of difficulty to online learning. An obvious limitation of this review is its very small sample ( N  = 7), which offers limited information while the review tries to answer too many questions (four). Grafton-Clarke et al. (2022) investigated the innovations/adaptations implemented, their impacts, and the reasons for their selection in the shift to online learning in medical education during the pandemic period via a systematic review of 55 articles. The major adaptations implemented include the rapid shift to the virtual space, pre-recorded videos or live streaming of surgical procedures, remote adaptations for clinical visits, and multidisciplinary ward rounds and team meetings. Major challenges encountered by students and teachers include the need for technical resources, faculty time, and devices, the shortage of standardized telemedicine curricula, and the lack of personal interactions. Based on this, they criticized the quality of online medical education. Tang (2023) explored the impact of the pandemic on primary, secondary, and tertiary education via a systematic review of 41 articles.
It was reported that the majority of these impacts are negative, e.g., learning loss among learners, difficulties with assessment and experiential learning in the virtual environment, limitations in instruction, technology-related constraints, the lack of learning materials and resources, and deteriorated psychosocial well-being. These negative impacts are amplified by the unequal distribution of resources and by disparities in socioeconomic status, ethnicity, gender, physical conditions, and learning ability. Overall, this review comprehensively criticizes the problems brought about by online learning during the pandemic period.

Very little review research has evaluated students’ responses to online learning during the pandemic period. For instance, Salas-Pilco et al. (2022) evaluated engagement in online learning in Latin American higher education during the COVID-19 pandemic period via a systematic review of 23 studies. They considered three dimensions of engagement: affective, cognitive, and behavioral. They described the characteristics of learning engagement and proposed suggestions for enhancing it, including improving Internet connectivity, providing professional training, transforming higher education, ensuring quality, and offering emotional support. A key limitation of the review is that the authors focused on describing the characteristics of engagement without identifying the factors that influence it.

A synthesis of previous review research offers some implications. First, although learning effectiveness is an important consideration in educational research, review research on this topic is scarce, and hence there is a lack of comprehensive knowledge regarding the extent to which online learning was effective during the COVID-19 pandemic period. Second, according to past review research that summarized the major topics of related research, e.g., Bughrara et al. (2023) and Zhang et al. (2022) , the effectiveness of online learning is not a major topic in prior empirical research, and hence the author of this article argues that this topic has not received due attention from researchers. Third, some review research has identified many problems in online learning during the pandemic period, e.g., García-Morales et al. (2021) and Stojan et al. (2022) . Many of these problems were caused by the sudden and rapid shift to online learning as well as the unique context of the pandemic. These problems may undermine the effectiveness of online learning. However, the extent to which they influence online learning effectiveness is still under-investigated.

1.4 Purpose of the review research

The research is carried out based on a systematic review of past empirical research to answer the following two research questions:

Q1: To what extent is online learning in higher education effective during the COVID-19 pandemic period?

Q2: What factors shape the effectiveness of online learning in higher education during the COVID-19 pandemic period?

2 Research methodology

2.1 Literature review as a research methodology

Regardless of discipline, all academic research activities should be related to and based on existing knowledge. As a result, scholars must identify related research on the topic of interest, critically assess the quality and content of existing research, and synthesize available results ( Linnenluecke et al., 2020 ). However, this task is increasingly challenging because of the exponential growth of academic knowledge, which makes it difficult to stay at the forefront and keep up with state-of-the-art research ( Snyder, 2019 ). Correspondingly, the literature review, as a research methodology, is more relevant than ever ( Snyder, 2019 ; Linnenluecke et al., 2020 ). A well-implemented review provides a solid foundation for facilitating theory development and advancing knowledge ( Webster and Watson, 2002 ). Here, a literature review is broadly defined as a more or less systematic way of collecting and synthesizing past studies ( Tranfield et al., 2003 ). It allows researchers to integrate perspectives and results from a large body of past research and can address research questions unanswered by any single study ( Snyder, 2019 ).

There are generally three types of literature review: meta-analysis, bibliometric review, and systematic review ( Snyder, 2019 ). A meta-analysis is a statistical technique for integrating results from a large volume of empirical research (mostly quantitative) to compare, identify, and evaluate patterns, relationships, agreements, and disagreements generated by research on the same topic ( Davis et al., 2014 ). This study does not adopt a meta-analysis for two reasons. First, research on the effectiveness of online learning in the context of the COVID-19 pandemic has only been published since 2020, and currently there is a limited volume of empirical evidence. If the study adopted a meta-analysis, the sample size would be small, resulting in limited statistical power. Second, as mentioned above, there is a variety of indicators, e.g., motivation, satisfaction, experience, test score, and perceived effectiveness ( Rajaram and Collins, 2013 ; Noesgaard and Ørngreen, 2015 ), that reflect different aspects of online learning effectiveness. The use of diversified effectiveness indicators increases the difficulty of carrying out a meta-analysis.
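To make the statistical-power concern concrete, the following is a minimal sketch of fixed-effect (inverse-variance) pooling, the core computation behind a meta-analysis. The effect sizes and variances are invented purely for illustration: with only a handful of studies, the pooled standard error stays large and the confidence interval stays wide, which is exactly the limited-power problem noted above.

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooling of standardized effect sizes.

    Returns the pooled effect, its standard error, and a 95% confidence interval.
    """
    weights = [1.0 / v for v in variances]       # more precise studies weigh more
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))           # SE shrinks only as studies accumulate
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, se, ci

# Hypothetical standardized mean differences (online vs. offline) from 3 studies:
effects = [0.30, -0.10, 0.20]
variances = [0.04, 0.05, 0.06]
pooled, se, ci = fixed_effect_pool(effects, variances)
print(pooled, se, ci)  # a small, imprecise pooled estimate whose CI spans zero
```

With three heterogeneous inputs the interval spans zero and no firm conclusion could be drawn, which illustrates why a systematic review was preferred here.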

A bibliometric review refers to the analysis of a large volume of empirical research in terms of publication characteristics (e.g., year, journal, and citation), theories, methods, research questions, countries, and authors ( Donthu et al., 2021 ), and it is useful for tracing the trend, distribution, relationships, and general patterns of research published on a focused topic ( Wallin, 2005 ). A bibliometric review does not fit the present study for two reasons. First, at present, there is less than 4 years of history of research on online learning effectiveness during the pandemic; hence the volume of relevant research is limited and the publication trend is still unclear. Second, this study is interested in the inner content and results of published articles, rather than their external characteristics.

A systematic review is a method and process of critically identifying and appraising research in a specific field based on predefined inclusion and exclusion criteria to test a hypothesis, answer a research question, evaluate problems in past research, identify research gaps, and/or point out avenues for future research ( Liberati et al., 2009 ; Moher et al., 2009 ). This type of review is particularly suitable for the present study, as there are still many unanswered questions regarding the effectiveness of online learning in the pandemic context, a need to indicate future research directions, a lack of summary of relevant research in this field, and a scarcity of critical appraisal of problems in past research.

Adopting a systematic review methodology brings multiple benefits to the present study. First, it is helpful for distinguishing what needs to be done from what has been done, identifying major contributions made by past research, finding out gaps in past research, avoiding fruitless research, and providing insights for future research in the focused field ( Linnenluecke et al., 2020 ). Second, it is also beneficial for finding out new research directions, needs for theory development, and potential solutions for limitations in past research ( Snyder, 2019 ). Third, this methodology helps scholars to efficiently gain an overview of valuable research results and theories generated by past research, which inspires their research design, ideas, and perspectives ( Callahan, 2014 ).

Commonly, a systematic review can be either author-centric or theme-centric ( Webster and Watson, 2002 ), and the present review is theme-centric. Specifically, an author-centric review focuses on works published by a certain author or group of authors and summarizes the major contributions made by the author(s) ( Webster and Watson, 2002 ). This type of review is problematic because of its incomplete coverage of research conclusions in a specific field and its descriptive nature ( Linnenluecke et al., 2020 ). A theme-centric review is more common: a researcher guides readers through themes, concepts, and interesting phenomena according to a certain logic ( Callahan, 2014 ). A theme in such a review can be further structured into several related sub-themes, and this type of review helps researchers gain a comprehensive understanding of relevant academic knowledge ( Papaioannou et al., 2016 ).

2.2 Research procedures

This study follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guideline ( Liberati et al., 2009 ) to implement a systematic review. The guideline indicates four phases of performing a systematic review: (1) identifying possible research, (2) abstract screening, (3) assessing full texts for eligibility, and (4) qualitatively synthesizing included research. Figure 1 provides a flowchart of the process and the number of articles excluded and included in each phase.


Figure 1 . PRISMA flowchart concerning the selection of articles.

This study uses multiple academic databases to identify possible research, e.g., Academic Search Complete, IGI Global, ACM Digital Library, Elsevier (SCOPUS), Emerald, IEEE Xplore, Web of Science, Science Direct, ProQuest, Wiley Online Library, Taylor and Francis, and EBSCO. Since the COVID-19 pandemic broke out in January 2020, this study limits the literature search to articles published from January 2020 to August 2023. During this period, online learning was highly prevalent in schools globally, and a considerable volume of articles was published investigating various aspects of online learning. Keywords used for searching possible research include pandemic, COVID, SARS-CoV-2, 2019-nCoV, coronavirus, online learning, e-learning, electronic learning, higher education, tertiary education, universities, learning effectiveness, learning satisfaction, learning engagement, and learning motivation. Aside from searching the databases, this study also manually checks the reference lists of relevant articles and uses Google Scholar to find other articles that have cited them.
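As an illustration only (the exact query syntax differs across the databases listed, and the AND/OR grouping below is an assumption rather than the review's documented search string), the keywords can be combined into Boolean groups along these lines:

```python
# Keyword groups taken from the search terms listed above; the AND/OR
# structure is an illustrative assumption, not the review's verbatim query.
pandemic_terms = ["pandemic", "COVID", "SARS-CoV-2", "2019-nCoV", "coronavirus"]
mode_terms = ["online learning", "e-learning", "electronic learning"]
context_terms = ["higher education", "tertiary education", "universities"]
outcome_terms = ["learning effectiveness", "learning satisfaction",
                 "learning engagement", "learning motivation"]

def or_group(terms):
    """Join synonyms with OR, quoting multi-word phrases."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# One OR-group per concept, joined by AND, so a hit must match every concept.
query = " AND ".join(or_group(g) for g in
                     [pandemic_terms, mode_terms, context_terms, outcome_terms])
print(query)
```

Each database would still need the string adapted to its own field codes and truncation rules.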

2.3 Inclusion and exclusion criteria

Articles included in the review must meet the following criteria. First, articles have to be written in English and published in peer-reviewed journals. English was chosen because it is the dominant language of the databases searched. Second, the research must be carried out in an online learning context. Third, the research must have collected and analyzed empirical data. Fourth, the research should be implemented in a higher education context and during the pandemic period. Fifth, the outcome variable must be a factor related to learning effectiveness, and included studies must have reported quantitative results for online learning effectiveness. The outcome variable should be measured with data collected from students, rather than from other individuals (e.g., instructors). For instance, the research of Rahayu and Wirza (2020) used teacher perception as a measurement of online learning effectiveness and was hence excluded from the sample. According to the above criteria, a total of 25 articles were included in the review.
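The five criteria can be read as a sequence of Boolean filters. The sketch below is hypothetical (the record fields and labels are illustrative, not the author's actual coding sheet), but it shows how a record reporting teacher-measured outcomes, such as Rahayu and Wirza (2020), would be screened out:

```python
from dataclasses import dataclass

@dataclass
class Record:
    language: str
    peer_reviewed: bool
    context: str          # e.g., "online, higher education, pandemic period"
    empirical: bool       # collected and analyzed empirical data?
    outcome_source: str   # who the effectiveness data was collected from

def include(r: Record) -> bool:
    """Apply the five inclusion criteria; all must hold."""
    return (r.language == "English"
            and r.peer_reviewed
            and "online" in r.context
            and "higher education" in r.context
            and "pandemic" in r.context
            and r.empirical
            and r.outcome_source == "students")

kept = Record("English", True, "online, higher education, pandemic period",
              True, "students")
dropped = Record("English", True, "online, higher education, pandemic period",
                 True, "instructors")  # teacher-reported outcomes are excluded
print(include(kept), include(dropped))  # True False
```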

2.4 Data extraction and analysis

Content analysis is performed on the included articles, and an inductive approach is used to answer the two research questions. First, to understand the basic characteristics of the 25 articles/studies, the researcher summarizes their types, research designs, and samples and categorizes them into several groups. The researcher carefully reads the full text of these articles and codes valuable pieces of content; in this process, key themes are extracted and summarized inductively. Second, the researcher further categorizes these studies into different groups according to similarities and differences in research findings. In this way, the studies are broadly categorized into three groups: (1) ineffective, (2) neutral, and (3) effective. Based on this, the research answers the first research question and indicates the percentage of studies that evidenced online learning as effective in the COVID-19 pandemic context. The researcher also discusses how online learning is effective by analyzing the learning outcomes it brings. Third, the researcher analyzes and compares the characteristics of the three groups of studies and extracts key themes relevant to the conditional effectiveness of online learning. Based on this, the researcher identifies factors that influence the effectiveness of online learning in a pandemic context. In this way, the two research questions are adequately answered.
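The three-group categorization lends itself to a simple tally. The sketch below uses the per-study verdict counts reported later in the review (9 effective, 13 ineffective, 3 neutral out of 25) purely to illustrate the percentage computation; the list construction itself is not real data.

```python
from collections import Counter

# Per-study verdicts coded during full-text analysis (counts from the review;
# the list construction itself is illustrative, not the actual coding output).
verdicts = ["effective"] * 9 + ["ineffective"] * 13 + ["neutral"] * 3

counts = Counter(verdicts)
total = len(verdicts)
shares = {verdict: round(100 * n / total) for verdict, n in counts.items()}
print(total, shares)
```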

3 Research results and discussion

3.1 Study characteristics

Table 1 shows the statistics of the 25 studies while Table 2 provides a summary of them. Overall, these studies varied greatly in terms of research design, research subjects, contexts, measurements of learning effectiveness, and, ultimately, research findings. Approximately half of the studies were published in 2021, and the number of studies declined in 2022 and 2023, which may be attributed to the fact that universities gradually implemented opening-up policies after 2020. China accounted for the largest number of studies ( N  = 5), followed by India ( N  = 4) and the United States ( N  = 3). The sample sizes of the majority of studies (88.0%) ranged between 101 and 500. As this review excluded qualitative studies, all included studies adopted either a purely quantitative design (88.0%) or a mixed design (12.0%). The majority of the studies were cross-sectional (72%) and a few (8%) were experimental.


Table 1 . Statistics of studies included in the review.


Table 2 . A summary of studies reviewed.

3.2 The effectiveness of online learning

Overall, the 25 studies generated mixed results regarding the effectiveness of online learning during the pandemic period: 9 (36%) studies reported online learning as effective, 13 (52%) reported it as ineffective, and the remaining 3 (12%) produced neutral results. However, it should be noted that the results generated by these studies are not directly comparable, as they used different approaches to evaluate the effectiveness of online learning. According to the evaluation approach, these studies fall into four groups: (1) cross-sectional evaluation of online learning effectiveness without a comparison with offline learning and without a control group ( N  = 14; 56%), (2) cross-sectional comparison of the effectiveness of online learning with offline learning, without a control group ( N  = 7; 28%), (3) longitudinal comparison of the effectiveness of online learning with offline learning, without a control group ( N  = 2; 8%), and (4) randomized controlled trial (RCT), with a control group ( N  = 2; 8%).

The first group of studies asked students to report the extent to which they perceived online learning as effective, had achieved expected learning outcomes through online learning, or were satisfied with the online learning experience or outcomes, without a comparison with offline learning. Six out of 14 studies reported online learning as ineffective, including Adnan and Anwar (2020) , Hong et al. (2021) , Mok et al. (2021) , Baber (2022) , Chandrasiri and Weerakoon (2022) , and Lalduhawma et al. (2022) . Five out of 14 studies reported online learning as effective, including Almusharraf and Khahro (2020) , Sharma et al. (2020) , Mahyoob (2021) , Rahman (2021) , and Haningsih and Rohmi (2022) . In addition, 3 out of 14 studies reported neutral results, including Cranfield et al. (2021) , Tsang et al. (2021) , and Conrad et al. (2022) . It should be noted that this measurement approach is problematic in three respects. First, researchers used various survey instruments to measure learning effectiveness without reaching a consensus over a widely accepted instrument. As a result, these studies measured different aspects of learning effectiveness, and their results may be incomparable. Second, these studies relied on students’ self-reports to evaluate learning effectiveness, which are subjective and potentially inaccurate. Third, even where students perceived online learning as effective, this does not imply that online learning is more effective than offline learning, because of the absence of a point of comparison.

The second group of studies asked students to compare online learning with offline learning to evaluate learning effectiveness. Interestingly, all 7 studies, including Alawamleh et al. (2020) , Almahasees et al. (2021) , Gonzalez-Ramirez et al. (2021) , Muthuprasad et al. (2021) , Selco and Habbak (2021) , Hollister et al. (2022) , and Zhang and Chen (2023) , reported that online learning was perceived by participants as less effective than offline learning. It should be noted that these results are specific to the COVID-19 pandemic context, where strict social distancing policies were implemented. Consequently, they should be interpreted to mean that online learning during the school lockdown period was perceived by participants as less effective than offline learning during the pre-pandemic period. A key problem with the measurement of learning effectiveness in these studies is subjectivity: students’ self-reported online learning effectiveness relative to offline learning may be influenced by many factors caused by the pandemic, e.g., negative emotions (such as fear, loneliness, and anxiety).

Only two studies implemented a longitudinal comparison of the effectiveness of online learning with offline learning, i.e., Chang et al. (2021) and Fyllos et al. (2021) . Interestingly, both studies reported that participants perceived online learning as more effective than offline learning, which contradicts the second group of studies. In the two studies, the same group of students participated in offline learning and online learning successively and rated the effectiveness of the two learning approaches, respectively. The two studies came about through a coincidence of timing, i.e., the researchers unexpectedly encountered the pandemic, and subsequently school lockdown, while they were investigating learning effectiveness. This coincidence enabled them to compare the effectiveness of offline and online learning. However, this research design has three key problems. First, the content of learning in the online and offline periods was different, and hence the evaluations of learning effectiveness for the two periods are not comparable. Second, self-reported learning effectiveness is subjective. Third, students are likely to obtain better examination scores in online examinations than in offline examinations, because online examinations facilitate cheating and are less fair than offline examinations. As reported by Fyllos et al. (2021) , the examination score after online learning was significantly higher than after offline learning. Chang et al. (2021) reported that participants generally believed that offline examinations are fairer than online examinations.

Lastly, only two studies, i.e., Jiang et al. (2023) and Shirahmadi et al. (2023) , implemented an RCT design, which is more persuasive, objective, and accurate than the above-reviewed approaches. Indeed, implementing an RCT to evaluate the effectiveness of online learning was a formidable challenge during the pandemic period because of viral transmission and social distancing policies. Both studies reported that online learning was more effective than offline learning during the pandemic period. However, it is questionable to what extent such results were affected by health/safety-related issues. It is reasonable to infer that online learning was perceived by students as safer than offline learning during the pandemic period, and such perceptions may have affected learning effectiveness.

Overall, it is difficult to conclude whether online learning is effective during the pandemic period. Nevertheless, it is possible to identify factors that shape the effectiveness of online learning, which is discussed in the next section.

3.3 Factors that shape online learning effectiveness

Infrastructure factors were reported as the most salient determinants of online learning effectiveness. Research from developed countries generally generated more positive results for online learning than research from less developed countries, a view confirmed by the cross-country comparative study of Cranfield et al. (2021) . Indeed, online learning entails the support of ICT infrastructure, and hence ICT-related factors, e.g., Internet connectivity, technical issues, network speed, and the accessibility of digital devices, considerably influence the effectiveness of online learning ( García-Morales et al., 2021 ; Grafton-Clarke et al., 2022 ). Prior review research, e.g., Tang (2023) , also suggested that the unequal distribution of resources and socioeconomic disparities intensified the problems brought about by online learning during the pandemic period. Salas-Pilco et al. (2022) recommended that improving Internet connectivity would increase students’ engagement in online learning during the pandemic period.

The Adnan and Anwar (2020) study is one of the most cited works in the field. It reported that online learning was ineffective in Pakistan because of problems with Internet access due to monetary and technical issues. These problems hinder students from carrying out online learning activities, making online learning ineffective. Likewise, the research of Lalduhawma et al. (2022) from India indicated that online learning was ineffective because of poor network connectivity, slow data speeds, low data limits, and the high cost of devices. As a result, online learning during the COVID-19 pandemic may have widened the education gap between developed and developing countries because of the latter’s infrastructure disadvantages. More attention to online learning infrastructure problems in developing countries is needed.

Instructional factors, e.g., course management and design, instructor characteristics, instructor-student interaction, assignments, and assessments, were found to affect online learning effectiveness ( Sharma et al., 2020 ; Rahman, 2021 ; Tsang et al., 2021 ; Hollister et al., 2022 ; Zhang and Chen, 2023 ). Although these instructional factors are well documented as significant drivers of learning effectiveness in the traditional learning literature, they took on some unique characteristics during the pandemic period. Neither students nor teachers were well prepared for wholly online instruction and learning in 2020, and hence they encountered many problems in course management and design, learning interactions, assignments, and assessments ( Stojan et al., 2022 ; Tang, 2023 ). The García-Morales et al. (2021) review also suggested that various stakeholders in learning and teaching encountered difficulties in adapting to the sudden, hasty, and forced transition from offline to online learning. Consequently, these instructional factors became particularly salient in affecting online learning effectiveness.

The negative role of the lack of social interaction caused by social distancing was highlighted by many studies ( Almahasees et al., 2021 ; Baber, 2022 ; Conrad et al., 2022 ; Hollister et al., 2022 ). Baber (2022) argued that people give more importance to saving lives than to socializing in the online environment, and hence social interactions in learning were considerably reduced by social distancing norms. The negative impact of the lack of social interaction on online learning effectiveness is reflected in two aspects. First, according to a constructivist view, interaction is an indispensable element of learning because knowledge is actively constructed by learners in social interactions ( Woo and Reeves, 2007 ). Consequently, online learning effectiveness during the pandemic period was reduced by the lack of social interaction. Second, the lack of social interaction brings many negative emotions, e.g., feelings of isolation, loneliness, anxiety, and depression ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ). Such negative emotions undermine online learning effectiveness.

Negative emotions caused by the pandemic and school lockdown were also found to be detrimental to online learning effectiveness. In this context, it was reported that many students experienced negative emotions, e.g., feelings of isolation, exhaustion, loneliness, and distraction ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ). Such negative emotions, as mentioned above, reduce online learning effectiveness.

Several factors were also found to increase online learning effectiveness during the pandemic period, e.g., convenience and flexibility ( Hong et al., 2021 ; Muthuprasad et al., 2021 ; Selco and Habbak, 2021 ). Students with strong self-regulated learning abilities gain more benefits from convenience and flexibility in online learning ( Hong et al., 2021 ).

Overall, although the effectiveness of online learning during the pandemic period remains debated, it is generally believed that the pandemic brought many challenges and difficulties to higher education. Meanwhile, the majority of students prefer offline learning to online learning. These challenges and difficulties are more prominent in developing countries than in developed countries.

3.4 Pedagogical implications

The results generated by the systematic review offer several pedagogical implications. First, online learning entails the support of ICT infrastructure, and infrastructure defects strongly undermine learning effectiveness ( García-Morales et al., 2021 ; Grafton-Clarke et al., 2022 ). Given that online learning is increasingly integrated into higher education ( Kebritchi et al., 2017 ) regardless of the presence of a pandemic, governments globally should increase investment in learning-related ICT infrastructure in higher education institutes. Meanwhile, schools should consider whether students can afford digital devices and network fees when implementing online learning activities. It is important to offer material support for students with poor economic status. Infrastructure issues are more prominent in developing countries because of the lack of monetary resources and a poor infrastructure base. Thus, international collaboration and aid are recommended to address these issues.

Second, since the lack of social interaction is a key factor that reduces online learning effectiveness, it is important to increase social interactions when implementing online learning activities. On the one hand, both students and instructors are encouraged to utilize network technologies to promote inter-individual interactions. On the other hand, both parties are also encouraged to engage in offline interaction activities if the risk is acceptable.

Third, special attention should be paid to students’ emotions during the online learning process, as online learning may bring many negative emotions to students, which undermine learning effectiveness ( Alawamleh et al., 2020 ; Gonzalez-Ramirez et al., 2021 ; Selco and Habbak, 2021 ). In addition, higher education institutes should prepare a contingency plan for emergency online learning to deal with potential crises in the future, e.g., wars, pandemics, and natural disasters.

3.5 Limitations and suggestions for future research

Past research on online learning effectiveness during the pandemic period has several limitations. The first is a lack of rigor in assessing learning effectiveness: empirical research with an RCT design, which is considered accurate, objective, and rigorous for assessing pedagogical models (Torgerson and Torgerson, 2001), is scarce. This scarcity of RCT research makes it difficult to accurately assess the effectiveness of online learning and to compare it with offline learning. Second, widely accepted criteria for assessing learning effectiveness are absent; past empirical studies used diverse procedures, techniques, instruments, and criteria for measuring online learning effectiveness, which makes research results difficult to compare. Third, learning effectiveness is a multi-dimensional construct, but past research largely ignored its multidimensionality. It is therefore difficult to evaluate which dimensions of learning effectiveness are promoted or undermined by online learning, and difficult to compare the results of different studies. Finally, very little is known about how online learning effectiveness differs across subjects. Subjects that depend on lab-based work (e.g., experimental physics, organic chemistry, and cell biology) are likely less suited to online learning than subjects that depend on desk-based work (e.g., economics, psychology, and literature).

To address these limitations, several recommendations can be made for future research on online learning effectiveness. First, future research is encouraged to adopt an RCT design and recruit large samples to quantify the effectiveness of online learning objectively, rigorously, and accurately. Second, scholars are encouraged to develop a new framework for assessing learning effectiveness comprehensively; such a framework should cover multiple dimensions of learning effectiveness and have strong generalizability. Finally, future research should compare the effectiveness of online learning across subjects.

4 Conclusion

This study carried out a systematic review of 25 empirical studies published between 2020 and 2023 to evaluate the effectiveness of online learning during the COVID-19 pandemic. These 25 studies were categorized into four groups according to how online learning effectiveness was assessed. The first group employed a cross-sectional design and assessed online learning based on students’ perceptions without a control group; less than half of these studies reported online learning as effective. The second group also employed a cross-sectional design and asked students to compare the effectiveness of online learning with offline learning; all of these studies reported online learning as less effective than offline learning. The third group, comprising only two studies, employed a longitudinal design and compared the effectiveness of online learning with offline learning without a control group; these studies reported online learning as more effective than offline learning. The fourth group, also comprising only two studies, employed an RCT design; both studies reported online learning as more effective than offline learning.

Overall, it is difficult to conclude whether online learning was effective during the pandemic because of the diverse research contexts, methods, and approaches in past research. Nevertheless, the review identifies a set of factors that positively or negatively influence the effectiveness of online learning, including infrastructure factors, instructional factors, the lack of social interaction, negative emotions, flexibility, and convenience. Although the effectiveness of online learning during the pandemic remains debated, it is generally agreed that the pandemic brought substantial challenges and difficulties to higher education, and the majority of students prefer offline learning to online learning. In addition, developing countries face greater challenges in online learning because of monetary and infrastructure issues.

The findings of this review offer significant pedagogical implications for online learning in higher education institutions, including developing ICT infrastructure, providing material support for economically disadvantaged students, enhancing social interaction, paying attention to students’ emotional status, and preparing contingency plans for emergency online learning.

The review also identifies several limitations in past research on online learning effectiveness during the pandemic, including a lack of rigor in assessing learning effectiveness, the absence of accepted assessment criteria, the neglect of the multidimensionality of learning effectiveness, and limited knowledge about how online learning effectiveness differs across subjects.

To address these limitations, future research is encouraged to adopt RCT designs with large samples to quantify online learning effectiveness objectively and rigorously, to develop a comprehensive and generalizable framework covering multiple dimensions of learning effectiveness, and to compare the effectiveness of online learning across subjects.

It should be noted that this review is not without limitations. First, only studies that quantitatively measured online learning effectiveness were included, so many other studies (e.g., qualitative studies) that investigated factors influencing online learning effectiveness were excluded, resulting in a relatively small sample and an incomplete synthesis of past research contributions. Second, because this review was qualitative, it could not accurately quantify the level of online learning effectiveness.

Data availability statement

The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.

Author contributions

WM: Writing – original draft, Writing – review & editing. LY: Writing – original draft, Writing – review & editing. CL: Writing – review & editing. NP: Writing – review & editing. XP: Writing – review & editing. YZ: Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research, authorship, and/or publication of this article.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Adnan, M., and Anwar, K. (2020). Online learning amid the COVID-19 pandemic: Students' perspectives. J. Pedagogical Sociol. Psychol. 1, 45–51. doi: 10.33902/JPSP.2020261309

Alawamleh, M., Al-Twait, L. M., and Al-Saht, G. R. (2020). The effect of online learning on communication between instructors and students during Covid-19 pandemic. Asian Educ. Develop. Stud. 11, 380–400. doi: 10.1108/AEDS-06-2020-0131

Almahasees, Z., Mohsen, K., and Amin, M. O. (2021). Faculty’s and students’ perceptions of online learning during COVID-19. Front. Educ. 6:638470. doi: 10.3389/feduc.2021.638470

Almusharraf, N., and Khahro, S. (2020). Students satisfaction with online learning experiences during the COVID-19 pandemic. Int. J. Emerg. Technol. Learn. (iJET) 15, 246–267. doi: 10.3991/ijet.v15i21.15647

Anderson, N., and Hajhashemi, K. (2013). Online learning: from a specialized distance education paradigm to a ubiquitous element of contemporary education. In 4th international conference on e-learning and e-teaching (ICELET 2013) (pp. 91–94). IEEE.

Arkorful, V., and Abaidoo, N. (2015). The role of e-learning, advantages and disadvantages of its adoption in higher education. Int. J. Instructional Technol. Distance Learn. 12, 29–42.

Baber, H. (2022). Social interaction and effectiveness of the online learning–a moderating role of maintaining social distance during the pandemic COVID-19. Asian Educ. Develop. Stud. 11, 159–171. doi: 10.1108/AEDS-09-2020-0209

Barnard-Brak, L., Paton, V. O., and Lan, W. Y. (2010). Profiles in self-regulated learning in the online learning environment. Int. Rev. Res. Open Dist. Learn. 11, 61–80. doi: 10.19173/irrodl.v11i1.769

Bughrara, M. S., Swanberg, S. M., Lucia, V. C., Schmitz, K., Jung, D., and Wunderlich-Barillas, T. (2023). Beyond COVID-19: the impact of recent pandemics on medical students and their education: a scoping review. Med. Educ. Online 28:2139657. doi: 10.1080/10872981.2022.2139657

Callahan, J. L. (2014). Writing literature reviews: a reprise and update. Hum. Resour. Dev. Rev. 13, 271–275. doi: 10.1177/1534484314536705

Camargo, C. P., Tempski, P. Z., Busnardo, F. F., Martins, M. D. A., and Gemperli, R. (2020). Online learning and COVID-19: a meta-synthesis analysis. Clinics 75:e2286. doi: 10.6061/clinics/2020/e2286

Chandrasiri, N. R., and Weerakoon, B. S. (2022). Online learning during the COVID-19 pandemic: perceptions of allied health sciences undergraduates. Radiography 28, 545–549. doi: 10.1016/j.radi.2021.11.008

Chang, J. Y. F., Wang, L. H., Lin, T. C., Cheng, F. C., and Chiang, C. P. (2021). Comparison of learning effectiveness between physical classroom and online learning for dental education during the COVID-19 pandemic. J. Dental Sci. 16, 1281–1289. doi: 10.1016/j.jds.2021.07.016

Choudhury, S., and Pattnaik, S. (2020). Emerging themes in e-learning: a review from the stakeholders’ perspective. Comput. Educ. 144:103657. doi: 10.1016/j.compedu.2019.103657

Conrad, C., Deng, Q., Caron, I., Shkurska, O., Skerrett, P., and Sundararajan, B. (2022). How student perceptions about online learning difficulty influenced their satisfaction during Canada's Covid-19 response. Br. J. Educ. Technol. 53, 534–557. doi: 10.1111/bjet.13206

Cranfield, D. J., Tick, A., Venter, I. M., Blignaut, R. J., and Renaud, K. (2021). Higher education students’ perceptions of online learning during COVID-19—a comparative study. Educ. Sci. 11, 403–420. doi: 10.3390/educsci11080403

Davis, J., Mengersen, K., Bennett, S., and Mazerolle, L. (2014). Viewing systematic reviews and meta-analysis in social research through different lenses. SpringerPlus 3, 1–9. doi: 10.1186/2193-1801-3-511

Desai, M. S., Hart, J., and Richards, T. C. (2008). E-learning: paradigm shift in education. Education 129, 1–20.

Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., and Lim, W. M. (2021). How to conduct a bibliometric analysis: an overview and guidelines. J. Bus. Res. 133, 264–269. doi: 10.1016/j.jbusres.2021.04.070

Fyllos, A., Kanellopoulos, A., Kitixis, P., Cojocari, D. V., Markou, A., Raoulis, V., et al. (2021). University students perception of online education: is engagement enough? Acta Informatica Medica 29, 4–9. doi: 10.5455/aim.2021.29.4-9

Gamage, D., Ruipérez-Valiente, J. A., and Reich, J. (2023). A paradigm shift in designing education technology for online learning: opportunities and challenges. Front. Educ. 8:1194979. doi: 10.3389/feduc.2023.1194979

García-Morales, V. J., Garrido-Moreno, A., and Martín-Rojas, R. (2021). The transformation of higher education after the COVID disruption: emerging challenges in an online learning scenario. Front. Psychol. 12:616059. doi: 10.3389/fpsyg.2021.616059

Gonzalez-Ramirez, J., Mulqueen, K., Zealand, R., Silverstein, S., Mulqueen, C., and BuShell, S. (2021). Emergency online learning: college students' perceptions during the COVID-19 pandemic. Coll. Stud. J. 55, 29–46.

Grafton-Clarke, C., Uraiby, H., Gordon, M., Clarke, N., Rees, E., Park, S., et al. (2022). Pivot to online learning for adapting or continuing workplace-based clinical learning in medical education following the COVID-19 pandemic: a BEME systematic review: BEME guide no. 70. Med. Teach. 44, 227–243. doi: 10.1080/0142159X.2021.1992372

Haningsih, S., and Rohmi, P. (2022). The pattern of hybrid learning to maintain learning effectiveness at the higher education level post-COVID-19 pandemic. Eurasian J. Educ. Res. 11, 243–257. doi: 10.12973/eu-jer.11.1.243

Hollister, B., Nair, P., Hill-Lindsay, S., and Chukoskie, L. (2022). Engagement in online learning: student attitudes and behavior during COVID-19. Front. Educ. 7:851019. doi: 10.3389/feduc.2022.851019

Hong, J. C., Lee, Y. F., and Ye, J. H. (2021). Procrastination predicts online self-regulated learning and online learning ineffectiveness during the coronavirus lockdown. Personal. Individ. Differ. 174:110673. doi: 10.1016/j.paid.2021.110673

Jiang, P., Namaziandost, E., Azizi, Z., and Razmi, M. H. (2023). Exploring the effects of online learning on EFL learners’ motivation, anxiety, and attitudes during the COVID-19 pandemic: a focus on Iran. Curr. Psychol. 42, 2310–2324. doi: 10.1007/s12144-022-04013-x

Joy, E. H., and Garcia, F. E. (2000). Measuring learning effectiveness: a new look at no-significant-difference findings. JALN 4, 33–39.

Kebritchi, M., Lipschuetz, A., and Santiague, L. (2017). Issues and challenges for teaching successful online courses in higher education: a literature review. J. Educ. Technol. Syst. 46, 4–29. doi: 10.1177/0047239516661713

Lalduhawma, L. P., Thangmawia, L., and Hussain, J. (2022). Effectiveness of online learning during the COVID-19 pandemic in Mizoram. J. Educ. e-Learning Res. 9, 175–183. doi: 10.20448/jeelr.v9i3.4162

Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gotzsche, P. C., Ioannidis, J. P., et al. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann. Intern. Med. 151, W-65. doi: 10.7326/0003-4819-151-4-200908180-00136

Linnenluecke, M. K., Marrone, M., and Singh, A. K. (2020). Conducting systematic literature reviews and bibliometric analyses. Aust. J. Manag. 45, 175–194. doi: 10.1177/0312896219877678

Mahyoob, M. (2021). Online learning effectiveness during the COVID-19 pandemic: a case study of Saudi universities. Int. J. Info. Commun. Technol. Educ. (IJICTE) 17, 1–14. doi: 10.4018/IJICTE.20211001.oa7

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., and PRISMA Group (2009). Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann. Intern. Med. 151, 264–269. doi: 10.3736/jcim20090918

Mok, K. H., Xiong, W., and Bin Aedy Rahman, H. N. (2021). COVID-19 pandemic’s disruption on university teaching and learning and competence cultivation: student evaluation of online learning experiences in Hong Kong. Int. J. Chinese Educ. 10:221258682110070. doi: 10.1177/22125868211007011

Muthuprasad, T., Aiswarya, S., Aditya, K. S., and Jha, G. K. (2021). Students’ perception and preference for online education in India during COVID-19 pandemic. Soc. Sci. Humanities open 3:100101. doi: 10.1016/j.ssaho.2020.100101

Noesgaard, S. S., and Ørngreen, R. (2015). The effectiveness of e-learning: an explorative and integrative review of the definitions, methodologies and factors that promote e-learning effectiveness. Electronic J. E-learning 13, 278–290.

Papaioannou, D., Sutton, A., and Booth, A. (2016). Systematic approaches to a successful literature review. London: Sage.

Pratama, H., Azman, M. N. A., Kassymova, G. K., and Duisenbayeva, S. S. (2020). The trend in using online meeting applications for learning during the period of pandemic COVID-19: a literature review. J. Innovation in Educ. Cultural Res. 1, 58–68. doi: 10.46843/jiecr.v1i2.15

Rahayu, R. P., and Wirza, Y. (2020). Teachers’ perception of online learning during pandemic covid-19. Jurnal penelitian pendidikan 20, 392–406. doi: 10.17509/jpp.v20i3.29226

Rahman, A. (2021). Using students’ experience to derive effectiveness of COVID-19-lockdown-induced emergency online learning at undergraduate level: evidence from Assam. India. Higher Education for the Future 8, 71–89. doi: 10.1177/2347631120980549

Rajaram, K., and Collins, B. (2013). Qualitative identification of learning effectiveness indicators among mainland Chinese students in culturally dislocated study environments. J. Int. Educ. Bus. 6, 179–199. doi: 10.1108/JIEB-03-2013-0010

Salas-Pilco, S. Z., Yang, Y., and Zhang, Z. (2022). Student engagement in online learning in Latin American higher education during the COVID-19 pandemic: a systematic review. Br. J. Educ. Technol. 53, 593–619. doi: 10.1111/bjet.13190

Selco, J. I., and Habbak, M. (2021). Stem students’ perceptions on emergency online learning during the covid-19 pandemic: challenges and successes. Educ. Sci. 11:799. doi: 10.3390/educsci11120799

Sharma, K., Deo, G., Timalsina, S., Joshi, A., Shrestha, N., and Neupane, H. C. (2020). Online learning in the face of COVID-19 pandemic: assessment of students’ satisfaction at Chitwan medical college of Nepal. Kathmandu Univ. Med. J. 18, 40–47. doi: 10.3126/kumj.v18i2.32943

Shirahmadi, S., Hazavehei, S. M. M., Abbasi, H., Otogara, M., Etesamifard, T., Roshanaei, G., et al. (2023). Effectiveness of online practical education on vaccination training in the students of bachelor programs during the Covid-19 pandemic. PLoS One 18:e0280312. doi: 10.1371/journal.pone.0280312

Snyder, H. (2019). Literature review as a research methodology: an overview and guidelines. J. Bus. Res. 104, 333–339. doi: 10.1016/j.jbusres.2019.07.039

Stojan, J., Haas, M., Thammasitboon, S., Lander, L., Evans, S., Pawlik, C., et al. (2022). Online learning developments in undergraduate medical education in response to the COVID-19 pandemic: a BEME systematic review: BEME guide no. 69. Med. Teach. 44, 109–129. doi: 10.1080/0142159X.2021.1992373

Swan, K. (2003). Learning effectiveness online: what the research tells us. Elements of quality online education, practice and direction 4, 13–47.

Tang, K. H. D. (2023). Impacts of COVID-19 on primary, secondary and tertiary education: a comprehensive review and recommendations for educational practices. Educ. Res. Policy Prac. 22, 23–61. doi: 10.1007/s10671-022-09319-y

Torgerson, C. J., and Torgerson, D. J. (2001). The need for randomised controlled trials in educational research. Br. J. Educ. Stud. 49, 316–328. doi: 10.1111/1467-8527.t01-1-00178

Tranfield, D., Denyer, D., and Smart, P. (2003). Towards a methodology for developing evidence-informed management knowledge by means of systematic review. Br. J. Manag. 14, 207–222. doi: 10.1111/1467-8551.00375

Tsang, J. T., So, M. K., Chong, A. C., Lam, B. S., and Chu, A. M. (2021). Higher education during the pandemic: the predictive factors of learning effectiveness in COVID-19 online learning. Educ. Sci. 11:446. doi: 10.3390/educsci11080446

Wallin, J. A. (2005). Bibliometric methods: pitfalls and possibilities. Basic Clin. Pharmacol. Toxicol. 97, 261–275. doi: 10.1111/j.1742-7843.2005.pto_139.x

Webster, J., and Watson, R. T. (2002). Analyzing the past to prepare for the future: writing a literature review. MIS Q. 26, xiii–xxiii.

Wong, J., Baars, M., Davis, D., Van Der Zee, T., Houben, G. J., and Paas, F. (2019). Supporting self-regulated learning in online learning environments and MOOCs: a systematic review. Int. J. Human–Computer Interaction 35, 356–373. doi: 10.1080/10447318.2018.1543084

Woo, Y., and Reeves, T. C. (2007). Meaningful interaction in web-based learning: a social constructivist interpretation. Internet High. Educ. 10, 15–25. doi: 10.1016/j.iheduc.2006.10.005

Zeitoun, H. (2008). E-learning: Concept, Issues, Application, Evaluation . Riyadh: Dar Alsolateah Publication.

Zhang, L., Carter, R. A. Jr., Qian, X., Yang, S., Rujimora, J., and Wen, S. (2022). Academia's responses to crisis: a bibliometric analysis of literature on online learning in higher education during COVID-19. Br. J. Educ. Technol. 53, 620–646. doi: 10.1111/bjet.13191

Zhang, Y., and Chen, X. (2023). Students’ perceptions of online learning in the post-COVID era: a focused case from the universities of applied sciences in China. Sustain. For. 15:946. doi: 10.3390/su15020946

Keywords: COVID-19 pandemic, higher education, online learning, learning effectiveness, systematic review

Citation: Meng W, Yu L, Liu C, Pan N, Pang X and Zhu Y (2024) A systematic review of the effectiveness of online learning in higher education during the COVID-19 pandemic period. Front. Educ. 8:1334153. doi: 10.3389/feduc.2023.1334153

Received: 06 November 2023; Accepted: 27 December 2023; Published: 17 January 2024.

Copyright © 2024 Meng, Yu, Liu, Pan, Pang and Zhu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Lei Yu, [email protected]

Articles on Online education

Displaying 1–20 of 82 articles.

  • With campus numbers plummeting due to online learning, do we need two categories of university degree? (Ananish Chaudhuri, University of Auckland, Waipapa Taumata Rau)
  • 5 challenges of doing college in the metaverse (Nir Kshetri, University of North Carolina – Greensboro)
  • Choosing university or college courses? 5 questions for students to consider (Terence Day, Simon Fraser University, and Paul N. McDaniel, Kennesaw State University)
  • COVID was a setback for indigenous languages: South African lecturers on what went wrong (Sisanda Nkoala, Cape Peninsula University of Technology)
  • Professor flexibility, recorded lectures: Some positive university legacies of the pandemic (Terence Day, Simon Fraser University)
  • South Africa’s COVID school closures hit girls hard – but they showed resilience too (Zoe Duby, South African Medical Research Council)
  • 4 lessons from online learning that should stick after the pandemic (F. Haider Alvi, Deborah Hurst, Janice Thomas, and Martha Cleveland-Innes, Athabasca University)
  • How unis can save millions by tackling the biggest causes of online students’ high dropout rates (Steven Greenland, Charles Darwin University; Catherine DT Moore, University of Liverpool; Ninh Nguyen and Roopali Misra, Charles Darwin University)
  • Teaching music online in the pandemic has yielded creative surprises, like mixing ‘Blob Opera’ and beatboxing (Robbie MacKay, Queen's University, Ontario)
  • Watch for these conflicts over education in 2022 (Joseph J. Ferrare, University of Washington, Bothell, and Kate Phillippo, Loyola University Chicago)
  • Why students don’t attend lectures: what we found at a South African university (Christie Swanepoel, Derek Yu, and Rochelle Beukes, University of the Western Cape)
  • South Africa’s universities are adopting an international lens: why it matters (Orla Quinlan, Rhodes University)
  • 3 things we need to get right to ensure online professional development works (Filia Garivaldis and Sarah Kneebone, Monash University)
  • What to look for when choosing a university as the digital competition grows (Gabriele Suder, RMIT University, and Angelito Calma, The University of Melbourne)
  • Online exam monitoring is now common in Australian universities — but is it here to stay? (Christopher O'Neill, Mark Andrejevic, Neil Selwyn, and Xin Gu, Monash University; Gavin JD Smith, Australian National University)
  • Ontario’s ‘choice’ of fully online school would gamble on children for profit (Lana Parker, University of Windsor)
  • ‘A lot of us can relate to struggling to keep on top of everything.’ This is what mature-age students need from online higher education (Ameena L. Payne, Swinburne University of Technology)
  • Motivation is a key factor in whether students cheat (Carlton J. Fong, Texas State University, and Megan Krou, Teachers College, Columbia University)
  • 5 ways for teachers to build a good rapport with their students online (Meredith Aquila, Northern Virginia Community College)
  • No joke: Using humor in class is harder when learning is remote (Scott Henderson, Furman University)


The future of online learning: the long-term trends accelerated by Covid-19

With the technology now available, it’s clear that simply broadcasting pre-recorded lectures is no longer an option for forward-thinking universities

For Prof John Domingue, director of the Open University’s pioneering research and development lab, the Knowledge Media Institute (KMI), the “online genie” is out of the bottle and won’t go back in.

“It’s slightly galling to see some universities trying to replicate online almost exactly what they delivered face-to-face before Covid. Standing before a camera and broadcasting is not online teaching. You need to do things differently,” he says.

So what can universities do to make online learning more than just a heavy focus on streaming and recording technology? Domingue points to artificial intelligence (AI): the concept of an online library for educators based on a Google-style search engine dedicated to education, and a Netflix-style recommendation tool that tracks down content to suit a lecturer’s own field, based on previous searches.

KMI is currently developing a personalised AI assistant or chatbot, an AI career coach and other tools that can analyse essays for marking and set up quizzes on revision topics.

Personalisation is also key to giving students and lecturers a better online experience. In 2017, Oxford’s Saïd Business School installed the first immersive virtual classroom of its kind in the UK: a bank of 27 HD screens able to simultaneously support up to 84 students from across the globe, called the Oxford Hub for International Virtual Education (or HIVE). An in-room camera follows lecturers moving around the room, who can respond – as in real life – to visual cues from and talk directly to individual students.

While such technology could be prohibitively expensive for many institutions, Duncan Peberdy, a consultant specialising in tech-enabled learning spaces and former adviser at the educational IT body Jisc, says ViewSonic has developed a much cheaper alternative: a 3–4m-wide screen offering a different dynamic based on simplified specifications. “We are now in talks with two UK universities to jointly develop it on their campuses,” he says.

Meanwhile on UK campuses, many universities are striving to make the online experience more than just a lecturer broadcasting in front of a camera.

“We didn’t want that approach so we ‘shifted’ academics who were simply recurating their material with PowerPoint slides and brought in new hardware and specialists to assist them,” says Guy Daly, deputy vice-chancellor (education and students) at Coventry University.

“We realised our academics either needed the skills or support to deliver online learning in a very engaging way in a now very different world. Since March, we’ve repurposed 2,500 course modules at under- and postgraduate level for delivery in the first term of this academic year.”

Coventry has moved virtually all its student assessments and exams online. “We also used to talk about the death of the traditional lecture and bringing in more student activity-based learning as opposed to traditional didactic methods, but we’ve accelerated that journey due to Covid,” says Daly.

Wholesale and now permanent changes have gone hand in hand with the launches of Coventry’s first online postgraduate certificate in education and the first online nursing degree in England.

Many taught postgrad students, particularly those using labs, have been among the students hardest hit, according to Prof Danielle George, associate dean for teaching and learning at the University of Manchester. “They only have one year to ensure they receive all their intended learning outcomes from their course. So we’ve invested in software to enable them to do prep work at home so they will then need less time in the lab itself,” she says.

“We have also helped them with time management, which is absolutely key [during short courses]. Covid took away their daily structure of going from room to room on campus so we’ve timetabled asynchronous activities – their lecturer will, say, be available ‘live’ at 9am to deliver a lecture and then answer questions, or they can choose to watch a recorded version later in their own time.

“My best advice to postgrads is to get involved in anything to do with induction – we’ve invested a lot more energy, time and passion in this area than we’ve done before and put on numerous practical online sessions,” says George.


How Effective Is Online Learning? What the Research Does and Doesn’t Tell Us


Editor’s Note: This is part of a continuing series on the practical takeaways from research.

The times have dictated school closings and the rapid expansion of online education. Can online lessons replace in-school time?

Clearly online time cannot provide many of the informal social interactions students have at school, but how will online courses do in terms of moving student learning forward? Research to date gives us some clues and also points us to what we could be doing to support students who are most likely to struggle in the online setting.

The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course. Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or take exams based on those lectures.

Most online courses, however, particularly those serving K-12 students, have a format much more similar to in-person courses. The teacher helps to run virtual discussion among the students, assigns homework, and follows up with individual students. Sometimes these courses are synchronous (teachers and students all meet at the same time) and sometimes they are asynchronous (non-concurrent). In both cases, the teacher is supposed to provide opportunities for students to engage thoughtfully with subject matter, and students, in most cases, are required to interact with each other virtually.


Online courses provide opportunities for students. Students in a school that doesn’t offer statistics classes may be able to learn statistics with virtual lessons. If students fail algebra, they may be able to catch up during evenings or summer using online classes, and not disrupt their math trajectory at school. So, almost certainly, online classes sometimes benefit students.

In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the “gold standard” method of comparing the results for students assigned randomly to online or in-person courses. Jessica Heppen and colleagues at the American Institutes for Research and the University of Chicago Consortium on School Research randomly assigned students who had failed second semester Algebra I to either face-to-face or online credit recovery courses over the summer. Students’ credit-recovery success rates and algebra test scores were lower in the online setting. Students assigned to the online option also rated their class as more difficult than did their peers assigned to the face-to-face option.

Most of the research on online courses for K-12 students has used large-scale administrative data, looking at otherwise similar students in the two settings. One of these studies, by June Ahn of New York University and Andrew McEachin of the RAND Corp., examined Ohio charter schools; I did another with colleagues looking at Florida public school coursework. Both studies found evidence that online coursetaking was less effective.

About this series


This essay is the fifth in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.


It is not surprising that in-person courses are, on average, more effective. Being in person with teachers and other students creates social pressures and benefits that can help motivate students to engage. Some students do as well in online courses as in in-person courses, some may actually do better, but, on average, students do worse in the online setting, and this is particularly true for students with weaker academic backgrounds.

Students who struggle in in-person classes are likely to struggle even more online. While the research on virtual schools in K-12 education doesn’t address these differences directly, a study of college students that I worked on with Stanford colleagues found very little difference in learning for high-performing students in the online and in-person settings. On the other hand, lower performing students performed meaningfully worse in online courses than in in-person courses.

But just because students who struggle in in-person classes are even more likely to struggle online doesn’t mean that’s inevitable. Online teachers will need to consider the needs of less-engaged students and work to engage them. Online courses might be made to work for these students on average, even if they have not in the past.

Just like in brick-and-mortar classrooms, online courses need a strong curriculum and strong pedagogical practices. Teachers need to understand what students know and what they don’t know, as well as how to help them learn new material. What is different in the online setting is that students may have more distractions and less oversight, which can reduce their motivation. The teacher will need to set norms for engagement—such as requiring students to regularly ask questions and respond to their peers—that are different than the norms in the in-person setting.

Online courses are generally not as effective as in-person classes, but they are certainly better than no classes. A substantial research base developed by Karl Alexander at Johns Hopkins University and many others shows that students, especially students with fewer resources at home, learn less when they are not in school. Right now, virtual courses are allowing students to access lessons and exercises and interact with teachers in ways that would have been impossible if an epidemic had closed schools even a decade or two earlier. So we may be skeptical of online learning, but it is also time to embrace and improve it.

A version of this article appeared in the April 01, 2020 edition of Education Week as How Effective Is Online Learning?


  • Research article
  • Open access
  • Published: 02 December 2020

Integrating students’ perspectives about online learning: a hierarchy of factors

  • Montgomery Van Wart 1 ,
  • Anna Ni 1 ,
  • Pamela Medina 1 ,
  • Jesus Canelon 1 ,
  • Melika Kordrostami 1 ,
  • Jing Zhang 1 &

International Journal of Educational Technology in Higher Education volume  17 , Article number:  53 ( 2020 ) Cite this article

147k Accesses

47 Citations

24 Altmetric

Metrics details

This article reports on a large-scale ( n  = 987), exploratory factor analysis study incorporating various concepts identified in the literature as critical success factors for online learning from the students’ perspective, and then determines their hierarchical significance. Seven factors--Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Online Social Comfort, Online Interactive Modality, and Social Presence--were identified as significant and reliable. Regression analysis indicates the minimal factors for enrollment in future classes—when students consider convenience and scheduling—were Basic Online Modality, Cognitive Presence, and Online Social Comfort. Students who accepted or embraced online courses on their own merits wanted a minimum of Basic Online Modality, Teaching Presence, Cognitive Presence, Online Social Comfort, and Social Presence. Students, who preferred face-to-face classes and demanded a comparable experience, valued Online Interactive Modality and Instructional Support more highly. Recommendations for online course design, policy, and future research are provided.

Introduction

While there are different perspectives of the learning process such as learning achievement and faculty perspectives, students’ perspectives are especially critical since they are ultimately the raison d’être of the educational endeavor (Chickering & Gamson, 1987 ). More pragmatically, students’ perspectives provide invaluable, first-hand insights into their experiences and expectations (Dawson et al., 2019 ). The student perspective is especially important when new teaching approaches are used and when new technologies are being introduced (Arthur, 2009 ; Crews & Butterfield, 2014 ; Van Wart, Ni, Ready, Shayo, & Court, 2020 ). With the renewed interest in “active” education in general (Arruabarrena, Sánchez, Blanco, et al., 2019 ; Kay, MacDonald, & DiGiuseppe, 2019 ; Nouri, 2016 ; Vlachopoulos & Makri, 2017 ) and the flipped classroom approach in particular (Flores, del-Arco, & Silva, 2016 ; Gong, Yang, & Cai, 2020 ; Lundin, et al., 2018 ; Maycock, 2019 ; McGivney-Burelle, 2013 ; O’Flaherty & Phillips, 2015 ; Tucker , 2012 ), along with extraordinary shifts in the technology, the student perspective on online education is profoundly important. Students’ perceptions of quality integrate their own sense of learning achievement, satisfaction with the support they receive, the technical proficiency of the process, intellectual and emotional stimulation, comfort with the process, and a sense of learning community. The factors that students perceive as constituting quality online teaching, however, have not been as clear as they might be, for at least two reasons.

First, it is important to note that the overall online learning experience for students is also composed of non-teaching factors, which we briefly mention. Three such factors are (1) convenience, (2) learner characteristics and readiness, and (3) antecedent conditions that may foster teaching quality but are not directly responsible for it. (1) Convenience is an enormous non-quality factor for students (Artino, 2010 ) which has driven up online demand around the world (Fidalgo, Thormann, Kulyk, et al., 2020 ; Inside Higher Education and Gallup, 2019 ; Legon & Garrett, 2019 ; Ortagus, 2017 ). This is important since satisfaction with online classes is frequently somewhat lower than with face-to-face classes (Macon, 2011 ). However, the literature generally supports the relative equivalence of face-to-face and online modes regarding learning achievement criteria (Bernard et al., 2004 ; Nguyen, 2015 ; Ni, 2013 ; Sitzmann, Kraiger, Stewart, & Wisher, 2006 ; see Xu & Jaggars, 2014 for an alternate perspective). These contrasts are exemplified in a recent study of business students, in which online students using a flipped classroom approach outperformed their face-to-face peers, but ironically rated instructor performance lower (Harjoto, 2017 ). (2) Learner characteristics, such as capacity for self-regulation in an active learning model, comfort with technology, and age, among others, affect both receptiveness to and readiness for online instruction (Alqurashi, 2016 ; Cohen & Baruth, 2017 ; Kintu, Zhu, & Kagambe, 2017 ; Kuo, Walker, Schroder, & Belland, 2013 ; Ventura & Moscoloni, 2015 ). (3) Finally, numerous antecedent factors may lead to improved instruction but are not themselves directly perceived by students, such as instructor training (Brinkley-Etzkorn, 2018 ) and the sources of faculty motivation (e.g., incentives, recognition, social influence, and voluntariness) (Wingo, Ivankova, & Moss, 2017 ).
Important as these factors are, mixing them with the perceptions of quality tends to obfuscate the quality factors directly perceived by students.

Second, while student perceptions of quality are used in innumerable studies, our overall understanding still needs to integrate them more holistically. Many studies use student perceptions of the quality and overall effectiveness of individual tools and strategies in online contexts, such as mobile devices (Drew & Mann, 2018 ), small groups (Choi, Land, & Turgeon, 2005 ), journals (Nair, Tay, & Koh, 2013 ), simulations (Vlachopoulos & Makri, 2017 ), and video (Lange & Costley, 2020 ). Such studies, however, cannot provide the overall context and comparative importance. Some studies have examined the overall learning experience of students with exploratory lists, but have mixed non-quality factors with quality-of-teaching factors, making it difficult to discern the instructor’s versus contextual roles in quality (e.g., Asoodar, Vaezi, & Izanloo, 2016 ; Bollinger & Martindale, 2004 ; Farrell & Brunton, 2020 ; Hong, 2002 ; Song, Singleton, Hill, & Koh, 2004 ; Sun, Tsai, Finger, Chen, & Yeh, 2008 ). The application of technology adoption studies also falls into this category by essentially aggregating all teaching quality in the single category of performance (Al-Gahtani, 2016 ; Artino, 2010 ). Some studies have used high-level teaching-oriented models, primarily the Community of Inquiry model (le Roux & Nagel, 2018 ), but empirical support has been mixed (Arbaugh et al., 2008 ); and its elegance (i.e., relying on only three factors) has not provided much insight to practitioners (Anderson, 2016 ; Cleveland-Innes & Campbell, 2012 ).

Research questions

Although the number of empirical studies of student perceptions of quality factors has increased, integration of the studies and concepts explored remains fragmented and confusing. It is important to have an empirical view of what students value in a single comprehensive study, and also to know whether there is a hierarchy of factors, ranging from students who are least to most critical of the online learning experience. This research study has two research questions.

The first research question is: What are the significant factors in creating a high-quality online learning experience from students’ perspectives? This is important to know because it should have a significant effect on the instructor’s design of online classes. The goal of this research question is to identify a more articulated and empirically supported set of factors capturing the full range of student expectations.

The second research question is: Is there a priority or hierarchy of factors related to students’ perceptions of online teaching quality that relate to their decisions to enroll in online classes? For example, is it possible to distinguish which factors are critical for enrollment decisions when students are primarily motivated by convenience and scheduling flexibility (minimum threshold)? Do these factors differ from students with a genuine acceptance of the general quality of online courses (a moderate threshold)? What are the factors that are important for the students who are the most critical of online course delivery (highest threshold)?

This article next reviews the literature on online education quality, focusing on the student perspective and reviews eight factors derived from it. The research methods section discusses the study structure and methods. Demographic data related to the sample are next, followed by the results, discussion, and conclusion.

Literature review

Online education is much discussed (Prinsloo, 2016 ; Van Wart et al., 2019 ; Zawacki-Richter & Naidu, 2016 ), but its perception is substantially influenced by where you stand and what you value (Otter et al., 2013 ; Tanner, Noser, & Totaro, 2009 ). Accrediting bodies care about meeting technical standards, proof of effectiveness, and consistency (Grandzol & Grandzol, 2006 ). Institutions care about reputation, rigor, student satisfaction, and institutional efficiency (Jung, 2011 ). Faculty care about subject coverage, student participation, faculty satisfaction, and faculty workload (Horvitz, Beach, Anderson, & Xia, 2015 ; Mansbach & Austin, 2018 ). For their part, students care about learning achievement (Marks, Sibley, & Arbaugh, 2005 ; O’Neill & Sai, 2014 ; Shen, Cho, Tsai, & Marra, 2013 ), but also view online education as a function of their enjoyment of classes, instructor capability and responsiveness, and comfort in the learning environment (e.g., Asoodar et al., 2016 ; Sebastianelli, Swift, & Tamimi, 2015 ). It is this last perspective, of students, upon which we focus.

It is important to note students do not sign up for online classes solely based on perceived quality. Perceptions of quality derive from notions of the capacity of online learning when ideal—relative to both learning achievement and satisfaction/enjoyment, and perceptions about the likelihood and experience of classes living up to expectations. Students also sign up because of convenience and flexibility, and personal notions of suitability about learning. Convenience and flexibility are enormous drivers of online registration (Lee, Stringer, & Du, 2017 ; Mann & Henneberry, 2012 ). Even when students say they prefer face-to-face classes to online, many enroll in online classes and re-enroll in the future if the experience meets minimum expectations. This study examines the threshold expectations of students when they are considering taking online classes.

When discussing students’ perceptions of quality, there is little clarity about the actual range of concepts because no integrated empirical studies exist comparing major factors found throughout the literature. Rather, there are practitioner-generated lists of micro-competencies such as the Quality Matters consortium for higher education (Quality Matters, 2018 ), or broad frameworks encompassing many aspects of quality beyond teaching (Open and Distant Learning Quality Council, 2012 ). While checklists are useful for practitioners and accreditation processes, they do not provide robust, theoretical bases for scholarly development. Overarching frameworks are heuristically useful, but not for pragmatic purposes or theory building. The most prominent theoretical framework used in the online literature is the Community of Inquiry (CoI) model (Arbaugh et al., 2008 ; Garrison, Anderson, & Archer, 2003 ), which divides instruction into teaching, cognitive, and social presence. As with many deductive theories, however, the supporting evidence is mixed (Rourke & Kanuka, 2009 ), especially regarding the importance of social presence (Annand, 2011 ; Armellini & De Stefani, 2016 ). Conceptually, the problem is not so much with the narrow articulation of cognitive or social presence; cognitive presence is how the instructor provides opportunities for students to interact with material in robust, thought-provoking ways, and social presence refers to building a community of learning that incorporates student-to-student interactions. However, teaching presence includes everything else the instructor does: structuring the course, providing lectures, explaining assignments, creating rehearsal opportunities, supplying tests, grading, answering questions, and so on. These challenges become even more prominent in the online context.
While the lecture as a single medium is paramount in face-to-face classes, it fades as the primary vehicle in online classes with increased use of detailed syllabi, electronic announcements, recorded and synchronous lectures, 24/7 communications related to student questions, etc. Amassing the pedagogical and technological elements related to teaching under a single concept provides little insight.

In addition to the CoI model, numerous concepts are suggested in single-factor empirical studies when focusing on quality from a student’s perspective, with overlapping conceptualizations and nonstandardized naming conventions. Seven distinct factors are derived here from the literature of student perceptions of online quality: Instructional Support, Teaching Presence, Basic Online Modality, Social Presence, Online Social Comfort, Cognitive Presence, and Interactive Online Modality.

Instructional support

Instructional Support refers to students’ perceptions of techniques by the instructor used for input, rehearsal, feedback, and evaluation. Specifically, this entails providing detailed instructions, designed use of multimedia, and the balance between repetitive class features for ease of use, and techniques to prevent boredom. Instructional Support is often included as an element of Teaching Presence, but is also labeled “structure” (Lee & Rha, 2009 ; So & Brush, 2008 ) and instructor facilitation (Eom, Wen, & Ashill, 2006 ). A prime example of the difference between face-to-face and online education is the extensive use of the “flipped classroom” (Maycock, 2019 ; Wang, Huang, & Schunn, 2019 ) in which students move to rehearsal activities faster and more frequently than traditional classrooms, with less instructor lecture (Jung, 2011 ; Martin, Wang, & Sadaf, 2018 ). It has been consistently supported as an element of student perceptions of quality (Espasa & Meneses, 2010 ).

Teaching presence

Teaching Presence refers to students’ perceptions about the quality of communication in lectures, directions, and individual feedback including encouragement (Jaggars & Xu, 2016 ; Marks et al., 2005 ). Specifically, instructor communication is clear, focused, and encouraging, and instructor feedback is customized and timely. If Instructional Support is what an instructor does before the course begins and in carrying out those plans, then Teaching Presence is what the instructor does while the class is conducted and in response to specific circumstances. For example, a course could be well designed but poorly delivered because the instructor is distracted; or a course could be poorly designed but an instructor might make up for the deficit by spending time and energy in elaborate communications and ad hoc teaching techniques. It is especially important in student satisfaction (Sebastianelli et al., 2015 ; Young, 2006 ) and also referred to as instructor presence (Asoodar et al., 2016 ), learner-instructor interaction (Marks et al., 2005 ), and staff support (Jung, 2011 ). As with Instructional Support, it has been consistently supported as an element of student perceptions of quality.

Basic online modality

Basic Online Modality refers to the competent use of basic online class tools—online grading, navigation methods, online grade book, and the announcements function. It is frequently clumped with instructional quality (Artino, 2010 ), service quality (Mohammadi, 2015 ), instructor expertise in e-teaching (Paechter, Maier, & Macher, 2010 ), and similar terms. As a narrowly defined concept, it is sometimes called technology (Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Sun et al., 2008 ). The only empirical study that did not find Basic Online Modality significant, as technology, was Sun et al. ( 2008 ). Because Basic Online Modality is addressed with basic instructor training, some studies assert the importance of training (e.g., Asoodar et al., 2016 ).

Social presence

Social Presence refers to students’ perceptions of the quality of student-to-student interaction. Social Presence focuses on the quality of shared learning and collaboration among students, such as in threaded discussion responses (Garrison et al., 2003 ; Kehrwald, 2008 ). Much emphasized but challenged in the CoI literature (Rourke & Kanuka, 2009 ), it has mixed support in the online literature. While some studies found Social Presence or related concepts to be significant (e.g., Asoodar et al., 2016 ; Bollinger & Martindale, 2004 ; Eom et al., 2006 ; Richardson, Maeda, Lv, & Caskurlu, 2017 ), others found Social Presence insignificant (Joo, Lim, & Kim, 2011 ; So & Brush, 2008 ; Sun et al., 2008 ).

Online social comfort

Online Social Comfort refers to the instructor’s ability to provide an environment in which anxiety is low, and students feel comfortable interacting even when expressing opposing viewpoints. While numerous studies have examined anxiety (e.g., Liaw & Huang, 2013 ; Otter et al., 2013 ; Sun et al., 2008 ), only one found anxiety insignificant (Asoodar et al., 2016 ); many others have not examined the concept.

Cognitive presence

Cognitive Presence refers to the engagement of students such that they perceive they are stimulated by the material and instructor to reflect deeply and critically, and seek to understand different perspectives (Garrison et al., 2003 ). The instructor provides instructional materials and facilitates an environment that piques interest, is reflective, and enhances inclusiveness of perspectives (Durabi, Arrastia, Nelson, Cornille, & Liang, 2011 ). Cognitive Presence includes enhancing the applicability of material for students’ potential or current careers. Cognitive Presence is supported as significant in many online studies (e.g., Artino, 2010 ; Asoodar et al., 2016 ; Joo et al., 2011 ; Marks et al., 2005 ; Sebastianelli et al., 2015 ; Sun et al., 2008 ). Further, while many instructors perceive that cognitive presence is diminished in online settings, neuroscientific studies indicate this need not be the case (Takamine, 2017 ). While numerous studies failed to examine Cognitive Presence, this review found no studies that diminished its significance for students.

Interactive online modality

Interactive Online Modality refers to the “high-end” usage of online functionality. That is, the instructor uses interactive online class tools—video lectures, videoconferencing, and small group discussions—well. It is often included in concepts such as instructional quality (Artino, 2010 ; Asoodar et al., 2016 ; Mohammadi, 2015 ; Otter et al., 2013 ; Paechter et al., 2010 ) or engagement (Clayton, Blumberg, & Anthony, 2018 ). While individual methods have been investigated (e.g. Durabi et al., 2011 ), high-end engagement methods have not.

Other independent variables affecting perceptions of quality include age, undergraduate versus graduate status, gender, ethnicity/race, discipline, educational motivation of students, and previous online experience. While the effect of age has been found to be small or insignificant, more notable effects have been reported at the level of study, with graduate students reporting higher “success” (Macon, 2011 ), and community college students having greater difficulty with online classes (Legon & Garrett, 2019 ; Xu & Jaggars, 2014 ). Effects of ethnicity and race have also been small or insignificant. Some situational variations and student preferences can be captured by paying attention to disciplinary differences (Arbaugh, 2005 ; Macon, 2011 ). Motivation levels of students have been reported to be significant in completion and achievement, with better students doing as well across face-to-face and online modes, and weaker students having greater completion and achievement challenges (Clayton et al., 2018 ; Lu & Lemonde, 2013 ).

Research methods

To examine the various quality factors, we apply a critical success factor methodology, initially introduced to schools of business research in the 1970s. In 1981, Rockhart and Bullen codified an approach embodying principles of critical success factors (CSFs) as a way to identify the information needs of executives, detailing steps for the collection and analysis of data to create a set of organizational CSFs (Rockhart & Bullen, 1981 ). CSFs describe the underlying or guiding principles which must be incorporated to ensure success.

Utilizing this methodology, CSFs in the context of this paper define key areas of instruction and design essential for an online class to be successful from a student’s perspective. Instructors implicitly know and consider these areas when setting up an online class and designing and directing activities and tasks important to achieving learning goals. CSFs make explicit those things good instructors may intuitively know and (should) do to enhance student learning. When made explicit, CSFs not only confirm the knowledge of successful instructors, but tap their intuition to guide and direct the accomplishment of quality instruction for entire programs. In addition, CSFs are linked with goals and objectives, helping generate a small number of truly important matters an instructor should focus attention on to achieve different thresholds of online success.

After a comprehensive literature review, an instrument was created to measure students’ perceptions about the importance of techniques and indicators leading to quality online classes. Items were designed to capture the major factors in the literature. The instrument was pilot studied during academic year 2017–18 with a 397 student sample, facilitating an exploratory factor analysis leading to important preliminary findings (reference withheld for review). Based on the pilot, survey items were added and refined to include seven groups of quality teaching factors and two groups of items related to students’ overall acceptance of online classes as well as a variable on their future online class enrollment. Demographic information was gathered to determine their effects on students’ levels of acceptance of online classes based on age, year in program, major, distance from university, number of online classes taken, high school experience with online classes, and communication preferences.

This paper draws evidence from a sample of students enrolled in educational programs at Jack H. Brown College of Business and Public Administration (JHBC), California State University San Bernardino (CSUSB). The JHBC offers a wide range of online courses for undergraduate and graduate programs. To ensure comparable learning outcomes, online classes and face-to-face classes of a certain subject are similar in size—undergraduate classes are generally capped at 60 and graduate classes at 30, and often taught by the same instructors. Students sometimes have the option to choose between both face-to-face and online modes of learning.

A Qualtrics survey link was sent out by 11 instructors to students who were unlikely to be cross-enrolled in classes during the 2018–19 academic year. 1 Approximately 2500 students were contacted, with some instructors providing class time to complete the anonymous survey. All students, whether they had taken an online class or not, were encouraged to respond. Nine hundred eighty-seven students responded, representing a 40% response rate. Although drawn from a single business school, it is a broad sample representing students from several disciplines—management, accounting and finance, marketing, information decision sciences, and public administration, as well as both graduate and undergraduate programs of study.

The sample age of students is young, with 78% being under 30. The sample has almost no lower division students (i.e., freshman and sophomore), 73% upper division students (i.e., junior and senior) and 24% graduate students (master’s level). Only 17% reported having taken a hybrid or online class in high school. There was a wide range of exposure to university level online courses, with 47% reporting having taken 1 to 4 classes, and 21% reporting no online class experience. As a Hispanic-serving institution, 54% self-identified as Latino, 18% White, and 13% Asian and Pacific Islander. The five largest majors were accounting & finance (25%), management (21%), master of public administration (16%), marketing (12%), and information decision sciences (10%). Seventy-four percent work full- or part-time. See Table  1 for demographic data.

Measures and procedure

To increase the reliability of evaluation scores, composite evaluation variables were formed after an exploratory factor analysis of individual evaluation items. A principal component method with Quartimin (oblique) rotation was applied to explore the factor structure of student perceptions of online teaching CSFs. Items with loading coefficients greater than .30 were retained, a commonly accepted threshold in factor analysis. A simple least-squares regression analysis was then applied to test the significance of the factors in predicting students' impressions of online classes.
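The extraction step described above can be sketched in a few lines of numpy. This is a minimal illustration on synthetic Likert-style data, not the study's dataset; the Quartimin rotation that follows extraction in the paper is omitted here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic Likert-style responses: 200 students x 6 survey items
# (illustrative data only -- not the study's survey).
items = rng.integers(1, 6, size=(200, 6)).astype(float)

# Principal component extraction from the item correlation matrix.
R = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)            # ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Unrotated loadings: eigenvectors scaled by sqrt(eigenvalue).
loadings = eigvecs * np.sqrt(eigvals)

# Retain items whose loadings exceed the paper's .30 threshold
# (an oblique Quartimin rotation would normally follow).
retained = np.abs(loadings[:, 0]) > 0.30
print(retained.sum(), "items load on the first component")
```

In practice this would be run on the 37 survey items, with the rotated loading matrix reported as in Table 2.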

Exploratory factor constructs

Using a threshold loading of 0.3 for items, 37 items loaded on seven factors, all of which were logically consistent. The first factor, with eight items, was labeled Teaching Presence. Items included providing clear instructions, staying on task, clear deadlines, and customized feedback on strengths and weaknesses. Teaching Presence items all related to instructor involvement during the course as a director, monitor, and learning facilitator. The second factor, with seven items, aligned with Cognitive Presence. Items included stimulating curiosity, opportunities for reflection, helping students construct explanations posed in online courses, and the applicability of material. The third factor, with six items, aligned with Social Presence, defined as providing student-to-student learning opportunities. Items included getting to know course participants for a sense of belonging, forming impressions of other students, and interacting with others. The fourth factor, with six new items as well as two ("interaction with other students" and "a sense of community in the class") shared with the third factor, was Instructional Support, which related to the instructor's role in providing students a cohesive learning experience. Items included providing sufficient rehearsal, structured feedback, techniques for communication, a navigation guide, a detailed syllabus, and coordinating student interaction and creating a sense of online community. This factor also included enthusiasm, which students generally interpreted as a robustly designed course rather than animation in a traditional lecture. The fifth factor, labeled Basic Online Modality, focused on the basic technological requirements for a functional online course. Items included allowing students to make online submissions, use of an online gradebook, and online grading; a fourth item was the use of online quizzes, viewed by students as mechanical practice opportunities rather than small tests, and a fifth was navigation, a key component of online modality. The sixth factor, which loaded on four items, was labeled Online Social Comfort. Items included comfort discussing ideas online, comfort disagreeing, developing a sense of collaboration via discussion, and considering online communication an excellent medium for social interaction. The final factor was called Interactive Online Modality because it included items for "richer" communications or interactions, whether one- or two-way. Items included videoconferencing, instructor-generated videos, and small-group discussions. Taken together, these seven factors explained 67% of the variance, which is considered an acceptable range in social science research for a robust model (Hair, Black, Babin, & Anderson, 2014). See Table 2 for the full list.

To test factor reliability, Cronbach's alpha was calculated for each composite variable. All produced values greater than 0.7, the standard threshold for reliability, except system trust, which was therefore dropped. To gauge students' sense of factor importance, item scores were averaged within each factor. Factor means (lower means indicating higher importance to students) ranged from 1.5 to 2.6 on a 5-point scale. Basic Online Modality was most important, followed by Instructional Support and Teaching Presence. Students deemed Cognitive Presence, Social Online Comfort, and Online Interactive Modality less important. Least important for this sample was Social Presence. Table 3 arrays the critical success factor means, standard deviations, and Cronbach's alphas.
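Cronbach's alpha itself is straightforward to compute from an item-score matrix: it compares the sum of the item variances to the variance of the total scale score. A small numpy sketch with made-up data (not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Sanity check: three perfectly correlated items yield alpha = 1.
perfect = np.tile(np.arange(1.0, 6.0), (3, 1)).T
print(round(cronbach_alpha(perfect), 3))   # -> 1.0
```

Factors whose items produce alpha below the 0.7 threshold, like system trust here, would be dropped from further analysis.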

To determine whether particular subgroups of respondents viewed factors differently, a series of ANOVAs was conducted using factor means as dependent variables. Six demographic variables were used as independent variables: graduate vs. undergraduate status, age, work status, ethnicity, discipline, and past online experience. To determine the strength of association of the independent variables with each of the seven CSFs, eta squared was calculated for each ANOVA. Eta squared indicates the proportion of variance in the dependent variable explained by the independent variable; values of .01, .06, and .14 are conventionally interpreted as small, medium, and large effect sizes, respectively (Green & Salkind, 2003). Table 4 summarizes the eta squared values for the ANOVA tests, with values less than .01 omitted.
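For a one-way ANOVA, eta squared is the between-group sum of squares divided by the total sum of squares. The sketch below uses hypothetical subgroup ratings (not the paper's data) to illustrate the calculation:

```python
import numpy as np

def eta_squared(groups):
    """Eta squared for a one-way ANOVA: SS_between / SS_total."""
    all_values = np.concatenate(groups)
    grand_mean = all_values.mean()
    ss_total = ((all_values - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2
                     for g in groups)
    return ss_between / ss_total

# Hypothetical factor-mean ratings from two respondent subgroups.
undergrad = np.array([2.0, 2.5, 3.0, 2.5])
graduate = np.array([1.5, 2.0, 1.5, 2.0])
es = eta_squared([undergrad, graduate])
print(round(es, 2))   # -> 0.6, a large effect by Green & Salkind's convention
```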

While there were no significant differences in factor means among students in different disciplines in the College, all five other independent variables had some small effect on some or all CSFs. Graduate students tend to rate Online Interactive Modality, Instructional Support, Teaching Presence, and Cognitive Presence higher than undergraduates. Older students place greater value on Online Interactive Modality. Full-time working students rate all factors except Social Online Comfort slightly higher than part-timers and non-working students. Latino and White students rate Basic Online Modality and Instructional Support higher; Asian and Pacific Islander students rate Social Presence higher. Students who have taken more online classes rate all factors higher.

In addition to factor scores, two variables were constructed to capture students' resultant impressions of online learning. Both were logically consistent, with Cronbach's α greater than 0.75. The first variable, with six items, labeled "online acceptance," included items such as "I enjoy online learning," "My overall impression of hybrid/online learning is very good," and "the instructors of online/hybrid classes are generally responsive." The second variable, labeled "face-to-face preference," combines four items, including enjoying, learning, and communicating more in face-to-face classes, as well as perceiving greater fairness and equity there. In addition to these two constructed variables, a one-item variable, "online enrollment," was also used subsequently in the regression analysis. That question asked: if hybrid/online classes are well taught and available, how much of your course selection going forward would they make up?

Regression results

As noted above, two constructed variables and one item were used as dependent variables for purposes of regression analysis. They were online acceptance, F2F preference, and the selection of online classes. In addition to seven quality-of-teaching factors identified by factor analysis, control variables included level of education (graduate versus undergraduate), age, ethnicity, work status, distance to university, and number of online/hybrid classes taken in the past. See Table  5 .
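The regression step amounts to ordinary least squares of each impression variable on the factor scores plus controls. The sketch below uses hypothetical predictors and coefficients as stand-ins for the factor scores, controls, and estimates reported in Tables 5–8; it is not the study's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 150

# Hypothetical design matrix: intercept, two CSF factor scores,
# and one control (number of prior online classes taken).
X = np.column_stack([
    np.ones(n),                   # intercept
    rng.normal(0, 1, n),          # e.g., Basic Online Modality score
    rng.normal(0, 1, n),          # e.g., Cognitive Presence score
    rng.integers(0, 10, n),       # prior online classes taken
])
beta_true = np.array([1.0, 0.5, 0.3, 0.1])     # assumed coefficients
y = X @ beta_true + rng.normal(0, 0.2, n)      # e.g., online acceptance

# Ordinary least squares, as in the paper's regression analysis.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta_hat, 2))
```

A full replication would add the remaining five factors and the other demographic controls as columns, and report significance levels for each coefficient.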

When eta squared values for ANOVA significance were measured for the control factors, only one approached a medium effect: graduate versus undergraduate status had a .05 effect (considered medium) on Online Interactive Modality, meaning graduate students were more sensitive to interactive modality than undergraduates. Multiple regression analyses of the critical success factors and online impressions were conducted to compare the conditions under which factors were significant. The only consistently significant control factor was the number of online classes taken: the more classes students had taken online, the more inclined they were to take online classes in the future. Level of program, age, ethnicity, and working status did not significantly affect students' choice or overall acceptance of online classes.

The least restrictive condition was online enrollment (Table  6 ). That is, students might not feel online courses were ideal, but because of convenience and scheduling might enroll in them if minimum threshold expectations were met. When considering online enrollment three factors were significant and positive (at the 0.1 level): Basic Online Modality, Cognitive Presence, and Online Social Comfort. These least-demanding students expected classes to have basic technological functionality, provide good opportunities for knowledge acquisition, and provide comfortable interaction in small groups. Students who demand good Instructional Support (e.g., rehearsal opportunities, standardized feedback, clear syllabus) are less likely to enroll.

Online acceptance was more restrictive (see Table  7 ). This variable captured the idea that students not only enrolled in online classes out of necessity, but with an appreciation of the positive attributes of online instruction, which balanced the negative aspects. When this standard was applied, students expected not only Basic Online Modality, Cognitive Presence, and Online Social Comfort, but expected their instructors to be highly engaged virtually as the course progressed (Teaching Presence), and to create strong student-to-student dynamics (Social Presence). Students who rated Instructional Support higher are less accepting of online classes.

Another restrictive condition was catering to the needs of students who preferred face-to-face classes (see Table 8). That is, they preferred face-to-face classes even when online classes were well taught. Unlike students more accepting of, or more likely to enroll in, online classes, this group rates Instructional Support as critical to enrolling, rather than merely a negative factor when absent. Again different from the other two groups, these students demand appropriate interactive mechanisms (Online Interactive Modality) that enable richer communication (e.g., videoconferencing). Student-to-student collaboration (Social Presence) was also significant. This group also rated Cognitive Presence and Online Social Comfort as significant, but only in their absence. That is, these students were most attached to direct interaction with the instructor and other students rather than to specific teaching methods. Interestingly, Basic Online Modality and Teaching Presence were not significant. Our interpretation is that this student group, the most critical of online classes for their loss of physical interaction, is beyond being concerned with mechanical technical interaction and demands higher levels of interactivity and instructional sophistication.

Discussion and study limitations

Some past studies have used robust empirical methods to identify a single factor or a small number of factors related to quality from a student's perspective, but have not sought to be relatively comprehensive. Others have used a longer series of itemized factors, but with less robust methods, and have not tied those factors back to the literature. This study used the literature to develop a relatively comprehensive list of items focused on quality teaching in a single rigorous protocol. While a beta test had identified five coherent factors, the current survey made substantial changes that sharpened the focus on quality factors rather than antecedent factors and better articulated the array of factors often lumped under the mantle of "teaching presence." In addition, the study examined these factors against threshold expectations: from minimal, such as when flexibility is the driving consideration, to modest, such as when students want a "good" online class, to high, when students demand an interactive virtual experience equivalent to face-to-face.

Exploratory factor analysis identified seven factors that were reliable, coherent, and significant under different conditions. In order of students' overall sense of importance, they are: Basic Online Modality, Instructional Support, Teaching Presence, Cognitive Presence, Social Online Comfort, Interactive Online Modality, and Social Presence. Students are most concerned with the basics of a course first, that is, technological functionality and instructor competence. Next they want engagement and virtual comfort. Social Presence, while valued, is the least critical from this overall perspective.

The factor analysis is quite consistent with the range of factors identified in the literature, indicating that students can differentiate among different aspects of what have been lumped together as larger concepts, such as teaching presence. Essentially, the instructor's role in quality can be divided into her/his command of basic online functionality, good design, and good presence during the class. The instructor's command of basic functionality is paramount. Because so much of an online class must be built in advance, the quality of the class design is rated more highly than the instructor's role in facilitating the class. Taken as a whole, the instructor's role in traditional teaching elements is primary, as we would expect it to be. Cognitive presence, especially the pertinence of the instructional material and its applicability to student interests, has always been found significant when studied, and was highly rated here as well as a single factor. Finally, the degree to which students feel comfortable with the online environment and enjoy the learner-learner aspect has received less support in empirical studies; it was found significant here, but was rated lowest among the quality factors.

Regression analysis paints a more nuanced picture, depending on student focus. It also helps explain some of the heterogeneity of previous studies, depending on what their dependent variables were. If convenience and scheduling are critical and students are less demanding, the minimum requirements are Basic Online Modality, Cognitive Presence, and Online Social Comfort. That is, students expect an instructor who knows how to use an online platform, delivers useful information, and provides a comfortable learning environment. They do not, however, expect much in terms of design quality, teaching presence, learner-to-learner interaction, or interactive teaching.

When students are signing up for critical classes, or have both F2F and online options, they hold a higher standard. They not only expect the factors underlying decisions about enrolling in noncritical classes, but also good Teaching and Social Presence. Students who simply need a class may be willing to teach themselves a bit more, but students who want a good class expect a highly present instructor in terms of responsiveness and immediacy. "Good" classes must not only create a comfortable atmosphere but, in social science classes at least, must provide strong learner-to-learner interactions as well. At the time of the research, most students believed that a class can be good without high interactivity via pre-recorded video and videoconferencing. That may, or may not, change over time as the various video media become easier to use, more reliable, and more commonplace.

The most demanding students are those who prefer F2F classes because of learning style preferences, poor past experiences, or both. Such students (seem to) assume that a worthwhile online class has basic functionality and that the instructor provides a strong presence. They are also critical of the absence of Cognitive Presence and Online Social Comfort. They want strong Instructional Support and Social Presence. In addition, and uniquely, they expect Online Interactive Modality, which provides the greatest possible verisimilitude to the traditional classroom. More than the other two groups, these students crave human interaction in the learning process, both with the instructor and with other students.

These findings shed light on the possible ramifications of the COVID-19 aftermath. Many universities around the world jumped from relatively low levels of online instruction at the beginning of spring 2020 to nearly 100% by mandate by the end of the spring term. The question becomes: what will happen after the mandate is removed? Will demand return to pre-crisis levels, increase modestly, or skyrocket? Time will be the best judge, but the findings here suggest that the ability and interest of instructors and institutions to "rise to the occasion" with quality teaching will have as much effect on demand as students becoming more acclimated to online learning. If, in the rush to get classes online, many students experience shoddy basic functional competence, poor instructional design, sporadic teaching presence, and poorly implemented cognitive and social elements, they may be quite willing to return to the traditional classroom. If faculty and the institutions supporting them are able to increase the quality of classes despite time pressures, then most students may be interested in more hybrid and fully online classes. If instructors are able to introduce high-quality interactive teaching, nearly the entire student population will be interested in more online classes. Of course students will have a variety of experiences, but this analysis suggests that those instructors, departments, and institutions that put greater effort into the temporary adjustment (and that resist it less) will be substantially more likely to see increases in demand beyond the modest national trajectory of the last decade or so.

There are several study limitations. First, the study does not include a sample of non-respondents, who may have a somewhat different profile. Second, the study draws from a single college and university; the profile derived here may vary significantly by type of student. Third, some survey statements may have led respondents to rate quality based on their own experience rather than to assess the general importance of online course elements; for example, "I felt comfortable participating in the course discussions" could be revised to "comfort in participating in course discussions." The authors judged differences among subgroups (e.g., among majors) to be small and statistically insignificant. However, it is possible that differences between, say, biology and marketing students would be significant, leading the factors to be ordered differently. Emphasis and ordering might also vary at a community college versus a research-oriented university (Gonzalez, 2009).

Availability of data and materials

We will make the data available.

Al-Gahtani, S. S. (2016). Empirical investigation of e-learning acceptance and assimilation: A structural equation model. Applied Computing and Informatics , 12 , 27–50.

Alqurashi, E. (2016). Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research (CIER) , 9 (1), 45–52.

Anderson, T. (2016). A fourth presence for the Community of Inquiry model? Retrieved from https://virtualcanuck.ca/2016/01/04/a-fourth-presence-for-the-community-of-inquiry-model/ .

Annand, D. (2011). Social presence within the community of inquiry framework. The International Review of Research in Open and Distributed Learning , 12 (5), 40.

Arbaugh, J. B. (2005). How much does “subject matter” matter? A study of disciplinary effects in on-line MBA courses. Academy of Management Learning & Education , 4 (1), 57–73.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S. R., Garrison, D. R., Ice, P., Richardson, J. C., & Swan, K. P. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education , 11 , 133–136.

Armellini, A., & De Stefani, M. (2016). Social presence in the 21st century: An adjustment to the Community of Inquiry framework. British Journal of Educational Technology , 47 (6), 1202–1216.

Arruabarrena, R., Sánchez, A., Blanco, J. M., et al. (2019). Integration of good practices of active methodologies with the reuse of student-generated content. International Journal of Educational Technology in Higher Education , 16 , #10.

Arthur, L. (2009). From performativity to professionalism: Lecturers’ responses to student feedback. Teaching in Higher Education , 14 (4), 441–454.

Artino, A. R. (2010). Online or face-to-face learning? Exploring the personal factors that predict students’ choice of instructional format. Internet and Higher Education , 13 , 272–276.

Asoodar, M., Vaezi, S., & Izanloo, B. (2016). Framework to improve e-learner satisfaction and further strengthen e-learning implementation. Computers in Human Behavior , 63 , 704–716.

Bernard, R. M., et al. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research , 74 (3), 379–439.

Bollinger, D., & Martindale, T. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning , 3 (1), 61–67.

Brinkley-Etzkorn, K. E. (2018). Learning to teach online: Measuring the influence of faculty development training on teaching effectiveness through a TPACK lens. The Internet and Higher Education , 38 , 28–35.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin , 3 , 7.

Choi, I., Land, S. M., & Turgeon, A. J. (2005). Scaffolding peer-questioning strategies to facilitate metacognition during online small group discussion. Instructional Science , 33 , 483–511.

Clayton, K. E., Blumberg, F. C., & Anthony, J. A. (2018). Linkages between course status, perceived course value, and students’ preferences for traditional versus non-traditional learning environments. Computers & Education , 125 , 175–181.

Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and Distributed Learning , 13 (4), 269–292.

Cohen, A., & Baruth, O. (2017). Personality, learning, and satisfaction in fully online academic courses. Computers in Human Behavior , 72 , 1–12.

Crews, T., & Butterfield, J. (2014). Data for flipped classroom design: Using student feedback to identify the best components from online and face-to-face classes. Higher Education Studies , 4 (3), 38–47.

Dawson, P., Henderson, M., Mahoney, P., Phillips, M., Ryan, T., Boud, D., & Molloy, E. (2019). What makes for effective feedback: Staff and student perspectives. Assessment & Evaluation in Higher Education , 44 (1), 25–36.

Drew, C., & Mann, A. (2018). Unfitting, uncomfortable, unacademic: A sociological reading of an interactive mobile phone app in university lectures. International Journal of Educational Technology in Higher Education , 15 , #43.

Durabi, A., Arrastia, M., Nelson, D., Cornille, T., & Liang, X. (2011). Cognitive presence in asynchronous online learning: A comparison of four discussion strategies. Journal of Computer Assisted Learning , 27 (3), 216–227.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education , 4 (2), 215–235.

Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education , 59 (3), 277–292.

Farrell, O., & Brunton, J. (2020). A balancing act: A window into online student engagement experiences. International Journal of Educational Technology in Higher Education , 17 , #25.

Fidalgo, P., Thormann, J., Kulyk, O., et al. (2020). Students’ perceptions on distance education: A multinational study. International Journal of Educational Technology in Higher Education , 17 , #18.

Flores, Ò., del-Arco, I., & Silva, P. (2016). The flipped classroom model at the university: Analysis based on professors’ and students’ assessment in the educational field. International Journal of Educational Technology in Higher Education , 13 , #21.

Garrison, D. R., Anderson, T., & Archer, W. (2003). A theory of critical inquiry in online distance education. Handbook of Distance Education , 1 , 113–127.

Gong, D., Yang, H. H., & Cai, J. (2020). Exploring the key influencing factors on college students’ computational thinking skills through flipped-classroom instruction. International Journal of Educational Technology in Higher Education , 17 , #19.

Gonzalez, C. (2009). Conceptions of, and approaches to, teaching online: A study of lecturers teaching postgraduate distance courses. Higher Education , 57 (3), 299–314.

Grandzol, J. R., & Grandzol, C. J. (2006). Best practices for online business Education. International Review of Research in Open and Distance Learning , 7 (1), 1–18.

Green, S. B., & Salkind, N. J. (2003). Using SPSS: Analyzing and understanding data , (3rd ed., ). Upper Saddle River: Prentice Hall.

Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2014). Multivariate data analysis: Pearson new international edition . Essex: Pearson Education Limited.

Harjoto, M. A. (2017). Blended versus face-to-face: Evidence from a graduate corporate finance class. Journal of Education for Business , 92 (3), 129–137.

Hong, K.-S. (2002). Relationships between students’ instructional variables with satisfaction and learning from a web-based course. The Internet and Higher Education , 5 , 267–281.

Horvitz, B. S., Beach, A. L., Anderson, M. L., & Xia, J. (2015). Examination of faculty self-efficacy related to online teaching. Innovation Higher Education , 40 , 305–316.

Inside Higher Education and Gallup. (2019). The 2019 survey of faculty attitudes on technology. Author .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers and Education , 95 , 270–284.

Joo, Y. J., Lim, K. Y., & Kim, E. K. (2011). Online university students’ satisfaction and persistence: Examining perceived level of presence, usefulness and ease of use as predictor in a structural model. Computers & Education , 57 (2), 1654–1664.

Jung, I. (2011). The dimensions of e-learning quality: From the learner’s perspective. Educational Technology Research and Development , 59 (4), 445–464.

Kay, R., MacDonald, T., & DiGiuseppe, M. (2019). A comparison of lecture-based, active, and flipped classroom teaching approaches in higher education. Journal of Computing in Higher Education , 31 , 449–471.

Kehrwald, B. (2008). Understanding social presence in text-based online learning environments. Distance Education , 29 (1), 89–106.

Kintu, M. J., Zhu, C., & Kagambe, E. (2017). Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. International Journal of Educational Technology in Higher Education , 14 , #7.

Kuo, Y.-C., Walker, A. E., Schroder, K. E., & Belland, B. R. (2013). Interaction, internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Education , 20 , 35–50.

Lange, C., & Costley, J. (2020). Improving online video lectures: Learning challenges created by media. International Journal of Educational Technology in Higher Education , 17 , #16.

le Roux, I., & Nagel, L. (2018). Seeking the best blend for deep learning in a flipped classroom – Viewing student perceptions through the Community of Inquiry lens. International Journal of Educational Technology in Higher Education , 15 , #16.

Lee, H.-J., & Rha, I. (2009). Influence of structure and interaction on student achievement and satisfaction in web-based distance learning. Educational Technology & Society , 12 (4), 372–382.

Lee, Y., Stringer, D., & Du, J. (2017). What determines students’ preference of online to F2F class? Business Education Innovation Journal , 9 (2), 97–102.

Legon, R., & Garrett, R. (2019). CHLOE 3: Behind the numbers . Published online by Quality Matters and Eduventures. https://www.qualitymatters.org/sites/default/files/research-docs-pdfs/CHLOE-3-Report-2019-Behind-the-Numbers.pdf

Liaw, S.-S., & Huang, H.-M. (2013). Perceived satisfaction, perceived usefulness and interactive learning environments as predictors of self-regulation in e-learning environments. Computers & Education , 60 (1), 14–24.

Lu, F., & Lemonde, M. (2013). A comparison of online versus face-to-face students teaching delivery in statistics instruction for undergraduate health science students. Advances in Health Science Education , 18 , 963–973.

Lundin, M., Bergviken Rensfeldt, A., Hillman, T., Lantz-Andersson, A., & Peterson, L. (2018). Higher education dominance and siloed knowledge: a systematic review of flipped classroom research. International Journal of Educational Technology in Higher Education , 15 (1).

Macon, D. K. (2011). Student satisfaction with online courses versus traditional courses: A meta-analysis . Dissertation: Northcentral University, CA.

Mann, J., & Henneberry, S. (2012). What characteristics of college students influence their decisions to select online courses? Online Journal of Distance Learning Administration , 15 (5), 1–14.

Mansbach, J., & Austin, A. E. (2018). Nuanced perspectives about online teaching: Mid-career senior faculty voices reflecting on academic work in the digital age. Innovative Higher Education , 43 (4), 257–272.

Marks, R. B., Sibley, S. D., & Arbaugh, J. B. (2005). A structural equation model of predictors for effective online learning. Journal of Management Education , 29 (4), 531–563.

Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of facilitation strategies that enhance instructor presence, connectedness, engagement and learning in online courses. Internet and Higher Education , 37 , 52–65.

Maycock, K. W. (2019). Chalk and talk versus flipped learning: A case study. Journal of Computer Assisted Learning , 35 , 121–126.

McGivney-Burelle, J. (2013). Flipping Calculus. PRIMUS: Problems, Resources, and Issues in Mathematics Undergraduate Studies , 23 (5), 477–486.

Mohammadi, H. (2015). Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Computers in Human Behavior , 45 , 359–374.

Nair, S. S., Tay, L. Y., & Koh, J. H. L. (2013). Students’ motivation and teachers’ teaching practices towards the use of blogs for writing of online journals. Educational Media International , 50 (2), 108–119.

Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching , 11 (2), 309–319.

Ni, A. Y. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education , 19 (2), 199–215.

Nouri, J. (2016). The flipped classroom: For active, effective and increased learning – Especially for low achievers. International Journal of Educational Technology in Higher Education , 13 , #33.

O’Neill, D. K., & Sai, T. H. (2014). Why not? Examining college students’ reasons for avoiding an online course. Higher Education , 68 (1), 1–14.

O'Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education , 25 , 85–95.

Open & Distance Learning Quality Council (2012). ODLQC standards . England: Author https://www.odlqc.org.uk/odlqc-standards .

Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education , 32 , 47–57.

Otter, R. R., Seipel, S., Graef, T., Alexander, B., Boraiko, C., Gray, J., … Sadler, K. (2013). Comparing student and faculty perceptions of online and traditional courses. Internet and Higher Education , 19 , 27–35.

Paechter, M., Maier, B., & Macher, D. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 , 292–329.

Prinsloo, P. (2016). (re)considering distance education: Exploring its relevance, sustainability and value contribution. Distance Education , 37 (2), 139–145.

Quality Matters (2018). Specific review standards from the QM higher Education rubric , (6th ed., ). MD: MarylandOnline.

Richardson, J. C., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior , 71 , 402–417.

Rockhart, J. F., & Bullen, C. V. (1981). A primer on critical success factors . Cambridge: Center for Information Systems Research, Massachusetts Institute of Technology.

Rourke, L., & Kanuka, H. (2009). Learning in Communities of Inquiry: A Review of the Literature. The Journal of Distance Education / Revue de l'ducation Distance , 23 (1), 19–48 Athabasca University Press. Retrieved August 2, 2020 from https://www.learntechlib.org/p/105542/ .

Sebastianelli, R., Swift, C., & Tamimi, N. (2015). Factors affecting perceived learning, satisfaction, and quality in the online MBA: A structural equation modeling approach. Journal of Education for Business , 90 (6), 296–305.

Shen, D., Cho, M.-H., Tsai, C.-L., & Marra, R. (2013). Unpacking online learning experiences: Online learning self-efficacy and learning satisfaction. Internet and Higher Education , 19 , 10–17.

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology , 59 (3), 623–664.

So, H. J., & Brush, T. A. (2008). Student perceptions of collaborative learning, social presence and satisfaction in a blended learning environment: Relationships and critical factors. Computers & Education , 51 (1), 318–336.

Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education , 7 (1), 59–70.

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education , 50 (4), 1183–1202.

Takamine, K. (2017). Michelle D. miller: Minds online: Teaching effectively with technology. Higher Education , 73 , 789–791.

Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate students’ perceptions of online learning: A comparative study. Journal of Information Systems Education , 20 (1), 29.

Tucker, B. (2012). The flipped classroom. Education Next , 12 (1), 82–83.

Van Wart, M., Ni, A., Ready, D., Shayo, C., & Court, J. (2020). Factors leading to online learner satisfaction. Business Educational Innovation Journal , 12 (1), 15–24.

Van Wart, M., Ni, A., Rose, L., McWeeney, T., & Worrell, R. A. (2019). Literature review and model of online teaching effectiveness integrating concerns for learning achievement, student satisfaction, faculty satisfaction, and institutional results. Pan-Pacific . Journal of Business Research , 10 (1), 1–22.

Ventura, A. C., & Moscoloni, N. (2015). Learning styles and disciplinary differences: A cross-sectional study of undergraduate students. International Journal of Learning and Teaching , 1 (2), 88–93.

Vlachopoulos, D., & Makri, A. (2017). The effect of games and simulations on higher education: A systematic literature review. International Journal of Educational Technology in Higher Education , 14 , #22.

Wang, Y., Huang, X., & Schunn, C. D. (2019). Redesigning flipped classrooms: A learning model and its effects on student perceptions. Higher Education , 78 , 711–728.

Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online: Exploring the literature using the technology acceptance model as an organizing framework. Online Learning , 21 (1), 15–35.

Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education , 85 (5), 633–659.

Young, S. (2006). Student views of effective online teaching in higher education. American Journal of Distance Education , 20 (2), 65–77.

Zawacki-Richter, O., & Naidu, S. (2016). Mapping research trends from 35 years of publications in distance Education. Distance Education , 37 (3), 245–269.

Download references

Acknowledgements

No external funding.

Author information

Authors and Affiliations

Development for the JHB College of Business and Public Administration, 5500 University Parkway, San Bernardino, California, 92407, USA

Montgomery Van Wart, Anna Ni, Pamela Medina, Jesus Canelon, Melika Kordrostami, Jing Zhang & Yu Liu


Contributions

All authors contributed equally. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Montgomery Van Wart .

Ethics declarations

Competing interests

We have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Van Wart, M., Ni, A., Medina, P. et al. Integrating students’ perspectives about online learning: a hierarchy of factors. Int J Educ Technol High Educ 17 , 53 (2020). https://doi.org/10.1186/s41239-020-00229-8


Received : 29 April 2020

Accepted : 30 July 2020

Published : 02 December 2020

DOI : https://doi.org/10.1186/s41239-020-00229-8


Keywords

  • Online education
  • Online teaching
  • Student perceptions
  • Online quality
  • Student presence


  • Open access
  • Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

  • Meixun Zheng 1 ,
  • Daniel Bender 1 &
  • Cindy Lyon 1  

BMC Medical Education volume 21, Article number: 495 (2021)


Background

The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, raised several issues that remain unresolved. While several studies have investigated dental students’ attitudes towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand the factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic affected students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government-issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance during the pandemic.

Methods

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction for the same courses before the pandemic, in summer quarter 2019.

Results

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, Chi-square tests demonstrated that in 16 out of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to receive an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.

Peer Review reports

Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by necessitating the need for online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitudes towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at University of Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experiences during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning has posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses to take full advantage of the possibilities of the online format. Some faculty may not have had prior online teaching experience and faced a steeper learning curve when it came to adopting online teaching methods [ 35 ]. Students may have been at risk of increased anxiety due to concerns about contracting the virus, on-time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic, and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students in a Pakistani college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and its impacting factors, focusing on instructional practices pertaining to students’ engagement/interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institutional-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

1. How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

2. How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

3. How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

Methods

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways, including live online classes via Zoom® [ 40 ], self-paced online modules on the school’s learning management system Canvas® [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk® [ 42 ], a cloud-based platform that supports gamified learning through the insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas® [ 41 ], and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 (n = 142; 98 %), D2 (n = 133; 93 %), D3 (n = 61; 43 %), I1 (n = 23; 88 %), and I2 (n = 20; 80 %). This resulted in a total of 379 (79 %) respondents across all classes.

The survey questions used a 4-point scale: Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “I could fully engage with the instructor and classmates in this course”; “The online format of this course supported my learning”; “Overall, this online course is effective”; and “I would have preferred face-to-face instruction for this course”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found helpful and to provide suggestions for improvement. For the purposes of this study, we focused on the quantitative data from the Likert-scale questions.

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
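The per-course regression described above can be sketched in a few lines of Python. This is a minimal illustration of simple least-squares regression, not the authors' analysis code; the Likert responses below are hypothetical values invented for the example, not study data.

```python
# Sketch of a per-course simple linear regression: perceived course
# effectiveness regressed on perceived engagement (both 4-point Likert).
# All response values here are hypothetical, for illustration only.

def linear_regression(x, y):
    """Return (slope, intercept, r_squared) for y regressed on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    syy = sum((yi - mean_y) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    r_squared = sxy ** 2 / (sxx * syy)  # coefficient of determination (effect size)
    return slope, intercept, r_squared

# Hypothetical responses: 1 = Strongly Disagree ... 4 = Strongly Agree
engagement = [4, 3, 3, 2, 4, 1, 2, 3]      # independent variable
effectiveness = [4, 3, 4, 2, 4, 2, 2, 3]   # dependent variable
slope, intercept, r2 = linear_regression(engagement, effectiveness)
print(f"slope={slope:.2f}, intercept={intercept:.2f}, r^2={r2:.2f}")
```

The r² returned here corresponds to the effect sizes reported in the results section; values closer to 1 indicate that perceived engagement explains more of the variation in perceived course effectiveness.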

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with those of a previous cohort who received face-to-face instruction for the same courses in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, not all courses offered in summer quarter 2020 had been offered in summer quarter 2019; some were new courses offered for the first time. Because these new courses had no previous face-to-face version to compare against, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After excluding the aforementioned courses, a total of 17 “comparable” courses remained for data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same through both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 differed from 2019, faculty reported that the overall exam difficulty level was similar. The main difference in assessment was testing conditions: the 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, while the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored each student during the exam through a web camera on their computer/laptop, and flagged suspicious activities in the recorded video for faculty review after exam completion.
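For the A-versus-non-A comparison reported in the results, a test of this kind reduces to a chi-square test of independence on a 2×2 contingency table. The sketch below computes the statistic from scratch; the grade counts are hypothetical and do not reproduce any table from the study.

```python
# Chi-square test of independence on a 2x2 table comparing the share of
# A grades in an online cohort vs. a face-to-face cohort.
# Counts below are hypothetical, for illustration only.

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 contingency table
    given as [[a, b], [c, d]] of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Rows: cohorts (online 2020, face-to-face 2019); columns: A grade, non-A grade
observed = [[52, 38],   # online cohort (hypothetical counts)
            [45, 47]]   # face-to-face cohort (hypothetical counts)
stat = chi_square_2x2(observed)
# For a 2x2 table there is 1 degree of freedom; the statistic exceeds
# 3.841 when the difference in A-grade rates is significant at alpha = 0.05.
print(f"chi-square = {stat:.3f}, significant = {stat > 3.841}")
```

The study compared full A-F grade distributions, which would use a larger contingency table with more degrees of freedom; the 2×2 case shown here is the simplest instance of the same test.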

Results

Students’ perceived effectiveness of online learning

Table 1 summarizes data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “Overall, this online course is effective”, the majority of courses received a mean score approaching or over 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined there was an even split in student responses to the question “I would have preferred face-to-face instruction for this course.” Additionally, regarding students’ preferred online delivery method for fully online courses, about half of the students in each class preferred a combination of synchronous and asynchronous online learning (see Fig. 1). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.60 %), D2 class (104; 80 %), and D3 class (49; 83.10 %).

While most online courses received favorable ratings, some variations did exist among courses. For D1 courses, “ Anatomy & Histology ” received lower ratings than others. This could be explained by its lab component, which didn’t lend itself as well to the online format. For D2 courses, several of them received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

Fig. 1 DDS students’ preferred online delivery method for fully online courses

Table 2 summarizes IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “Overall, this online course is effective”, all courses received a mean score approaching or over 3 points on the 4-point scale, suggesting that online learning was well accepted by students. For the survey question “I would have preferred face-to-face instruction for this course”, for most online courses examined, the percentage of students who would have preferred face-to-face instruction was similar to the percentage who preferred online instruction for the course. Like their DDS peers, about half of the IDS students in each class preferred a combination of synchronous and asynchronous online delivery for fully online courses (see Fig. 2). Finally, the majority of IDS students (I1, n = 18, 81.80 %; I2, n = 16, 84.20 %) wanted to continue with some online learning after the pandemic is over.

Fig. 2 IDS students’ preferred online delivery method for fully online courses

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated a significant positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across all courses. The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a significant positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across courses. The ranges of effect size were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students’ perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was “Integrated Clinical Science III (ICS III)”, which the I2 class took together with their D3 peers.

Impact of online learning on students’ course performance

Chi-square test results (Table 3) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 of the courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.

Discussion

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions of online learning during the pandemic, and the majority of them wanted to continue with some online learning post pandemic. Overall, our findings support several other studies in dental [ 18 , 20 ], medical [ 43 , 44 ], and nursing [ 45 ] education that have also reported students’ positive attitudes towards online learning during the pandemic. In their written survey comments, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to/from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies which have also demonstrated that online learning offers greater flexibility [ 46 , 47 ]. Meanwhile, consistent with the findings of other researchers [ 19 , 21 , 46 ], our students reported difficulty engaging with faculty and classmates in several online courses.

There were some variations among individual courses in students’ acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students’ perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students’ overall rating of the online course. For instance, some D2 students commented that the requirements of the course “ Integrated Case-based Seminars (ICS II) ” were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlighted the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, D1 course “ Anatomy and Histology ” had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty’s live demonstration during Zoom lectures, which may have resulted in a lower student satisfaction rating.

As for students’ preferred online delivery method for fully online courses during the pandemic, about half of them preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, in view of the result that over 80 % of the students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [ 48 ]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students’ perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students’ interactions with the instructor and classmates [49, 50, 51, 52]. Online students may feel isolated due to reduced interaction or a lack of it [53, 54]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [55]. Faculty’s role is not only to transmit content but also to promote the different types of interaction that are an integral part of the online learning process [33]. An online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the “sage on the screen” during a live class meeting on a video conferencing system is no different from the “sage on the stage” in the physical classroom: both provide limited space for engagement. Such a one-way monologue squanders the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student–instructor and student–student interaction. In the open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot! [56]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provided empirical evidence in dental education that it was possible to ensure the continuity of education, without sacrificing the quality of education provided to students, during the forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to earn an A grade than the face-to-face cohort from summer quarter 2019. Even for courses that had less student support for the online format (e.g., the D1 course “Anatomy and Histology”), there was a significant increase in the number of students who earned an A grade in 2020 compared with the previous year. The reduced capacity for technical training during the pandemic may have left students more study time for didactic content. Overall, our results resonate with several pre-pandemic studies in health sciences education showing that the quality of learning is comparable in face-to-face and online formats [9, 57, 58]. For the only course (“Integrated Case-based Seminars (ICS II)”) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students’ course performance was not as strong as expected.
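
The text does not state which statistical test underlies the grade comparison; as a hedged illustration, a two-proportion z-test is one common way to compare A-grade rates across two cohorts. The counts below are hypothetical, chosen only to show the mechanics, and are not the study’s data.

```python
# Hedged sketch of a two-proportion z-test; counts are hypothetical.
import math

def two_prop_z(a1, n1, a2, n2):
    """z statistic for H0: p1 == p2, given successes a and cohort sizes n."""
    p1, p2 = a1 / n1, a2 / n2
    p = (a1 + a2) / (n1 + n2)                       # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Hypothetical: 90/140 A grades online (2020) vs 70/140 face-to-face (2019)
z = two_prop_z(90, 140, 70, 140)
print(round(z, 2))
```

A |z| above 1.96 would indicate a difference significant at the 0.05 level (two-tailed); the study itself may have used a different test, such as chi-square.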

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have affected the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the courses could have influenced students’ performance. Some faculty may have been more compassionate in grading (e.g., more flexible with assignment deadlines) in summer quarter 2020, given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students’ exam anxiety, knowing that they were being monitored through a webcam. The existence and magnitude of these effects need to be further investigated.

The present study only examined the correlation between students’ perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might affect their acceptance of the online format need to be researched in future studies. Another future direction is to examine how students’ perceived online engagement correlates with their actual course performance. Because the survey data collected for the present study are anonymous, we cannot match students’ perceived online engagement data with their course grades to run this additional analysis. It should also be noted that this study focused on didactic online instruction. Future studies might examine how technical training was affected during the COVID building closure. It was also outside the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect students’ online learning experience and performance. We plan to conduct a follow-up study to examine which groups of students are most affected by the online format. Finally, this study was conducted in a single dental school, and so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. The findings contribute to the literature on online learning in health sciences education during the COVID-19 pandemic, and the results can also inform learning design as we re-envision the future of online learning.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005;31(4):547–52.

Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics coursesimproves student learning. Online Learn. 2015;19(3):70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.

Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011;75(5):589–97.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020;85(2):148–56.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020;85(2):128–34.

Rad FA, Otaki F, Baqain Z, Zary N, Al-Halabi M. Rapid transition to distance learning due to COVID-19: Perceptions of postgraduate dental learners and instructors. PLoS One. 2021; 16(2): e0246584.

Abbasi S, Ayoob T, Malik A, Memon SI. Perceptions of students regarding E-learning during Covid-19 at a private medical college. Pak J Med Sci. 2020;36:57–61.

Al-Azzam N, Elsalem L, Gombedza F. A cross-sectional study to determine factors affecting dental and medical students’ preference for virtual learning during the COVID-19 outbreak. Heliyon. 6(12). 2020. doi: https://doi.org/10.1016/j.heliyon.2020.e05704

Chen E, Kaczmarek K, Ohyama H. Student perceptions of distance learning strategies during COVID-19. J Dent Educ. 2020. doi: https://doi.org/10.1002/jdd.12339

Kaczmarek K, Chen E, Ohyama H. Distance learning in the COVID-19 era: Comparison of student and faculty perceptions. J Dent Educ. 2020. https://doi.org/10.1002/jdd.12469

Sarwar H, Akhtar H, Naeem MM, Khan JA, Waraich K, Shabbir S, et al. Self-reported effectiveness of e-learning classes during COVID-19 pandemic: a nation-wide survey of Pakistani undergraduate dentistry students. Eur J Dent. 2020;14(S01):S34–S43.

Al-Taweel FB, Abdulkareem AA, Gul SS, Alshami ML. Evaluation of technology‐based learning by dental students during the pandemic outbreak of coronavirus disease 2019. Eur J Dent Educ. 2021; 25(1): 183–190.

Elangovan S, Mahrous A, Marchini L. Disruptions during a pandemic: Gaps identified and lessons learned. J Dent Educ. 2020; 84 (11): 1270–1274.

Goodenow C. Classroom belonging among early adolescent students: Relationships to motivation and achievement. J Early Adolesc.1993; 13(1): 21–43.

Goodenow C. The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychol Sch. 1993; 30(1): 79–90.

St-Amand J, Girard S, Smith J. Sense of belonging at school: Defining attributes, determinants, and sustaining strategies. IAFOR Journal of Education. 2017; 5(2):105–19.

Peacock S, Cowan J. Promoting sense of belonging in online learning communities of inquiry at accredited courses. Online Learn. 2019; 23(2): 67–81.

Chan GM, Kanneganti A, Yasin N, Ismail-Pratt I, Logan SJ. Well‐being, obstetrics and gynecology and COVID‐19: Leaving no trainee behind. Aust N Z J Obstet Gynaecol. 2020; 60(6): 983–986.

Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020;27:1–12.

Means B, Bakia M, Murphy R. Learning online: What research tells us about whether, when and how. Routledge. 2014.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020; 84(6): 718–722.

Machado RA, Bonan PRF, Perez DEDC, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: discussing current and future perspectives. Braz Oral Res. 2020;34:e083.

Wu DT, Wu KY, Nguyen TT, Tran SD. The impact of COVID-19 on dental education in North America-Where do we go next? Eur J Dent Educ. 2020; 24(4): 825–827.

de Oliveira Araújo FJ, de Lima LSA, Cidade PIM, Nobre CB, Neto MLR. Impact of Sars-Cov-2 and its reverberation in global higher education and mental health. Psychiatry Res. 2020; 288:112977. doi: https://doi.org/10.1016/j.psychres.2020.112977

Persky AM, Lee E, Schlesselman LS. Perception of learning versus performance as outcome measures of educational research. Am J Pharm Educ. 2020;84(7):ajpe7782.

Zoom®. Zoom Video Communications, San Jose, CA, USA. https://zoom.us/

Canvas®. Instructure, Inc., Salt Lake City, UT, USA. https://www.instructure.com/canvas

SoftChalk®. SoftChalk LLC, San Antonio, TX, USA. https://www.softchalkcloud.com/

Agarwal S, Kaushik JS. Student’s perception of online learning during COVID pandemic. Indian J Pediatr. 2020; 87: 554–554.

Khalil R, Mansour AE, Fadda WA, Almisnid K, Aldamegh M, Al-Nafeesah A, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020; 20(1): 1–10.

Riley E, Capps N, Ward N, McCormack L, Staley J. Maintaining academic performance and student satisfaction during the remote transition of a nursing obstetrics course to online instruction. Online Learn. 2021; 25(1), 220–229.

Amir LR, Tanti I, Maharani DA, Wimardhani YS, Julia V, Sulijaya B, et al. Student perspective of classroom and distance learning during COVID-19 pandemic in the undergraduate dental study program Universitas Indonesia. BMC Med Educ. 2020; 20(1):1–8.

Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10(11).

Graham CR, Woodfield W, Harrison JB. A framework for institutional adoption and implementation of blended learning in higher education. Internet High Educ. 2013;18:4–14.

Sing C, Khine M. An analysis of interaction and participation patterns in online community. J Educ Techno Soc. 2006; 9(1): 250–261.

Bernard RM, Abrami PC, Borokhovski E, Wade CA, Tamim RM, Surkes MA, et al. A meta-analysis of three types of interaction treatments in distance education. Rev Educ Res. 2009; 79(3): 1243–1289.

Fedynich L, Bradley KS, Bradley J. Graduate students’ perceptions of online learning. Res High Educ. 2015; 27.

Tanis CJ. The seven principles of online learning: feedback from faculty and alumni on its importance for teaching and learning. Res Learn Technol. 2020;28. https://doi.org/10.25304/rlt.v28.2319

Dixson MD. Measuring student engagement in the online course: the Online Student Engagement scale (OSE). Online Learn. 2015;19(4).

Kwary DA, Fauzie S. Students’ achievement and opinions on the implementation of e-learning for phonetics and phonology lectures at Airlangga University. Educ Pesqui. 2018;44.

Vygotsky LS. Mind in society: The development of higher psychological processes. Cambridge (MA): Harvard University Press. 1978.

Kahoot!®. Oslo, Norway. https://kahoot.com/

Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomised controlled trial. BMC Med Educ. 2007; 7(1): 1–6.

Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomized controlled trial. Med Teach. 2008; 30(3): 302–307.

Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at School of Dentistry, University of the Pacific. She has a PhD in Education, with a specialty on learning sciences and technology. She has dedicated her entire career to conducting research on online learning, learning technology, and faculty development. Her research has resulted in several peer-reviewed publications in medical, dental, and educational technology journals. MZ has also presented regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at School of Dentistry, University of the Pacific. He has an EdD degree in education, with a concentration on learning and instruction. Over the past decades, DB has been overseeing and delivering faculty pedagogical development programs to dental faculty. His research interest lies in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presented regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education, School of Dentistry, University of the Pacific. She has a Doctor of Dental Surgery (DDS) degree and an EdD degree with a focus on educational leadership. Her professional interest lies in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed journals in health sciences education and presented regularly at national conferences.

Author information

Authors and affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon

Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL both provided assistance with research design, data collection, and reviewed and edited the manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng .

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21 , 495 (2021). https://doi.org/10.1186/s12909-021-02909-z

Received : 31 March 2021

Accepted : 26 August 2021

Published : 16 September 2021

DOI : https://doi.org/10.1186/s12909-021-02909-z

Keywords

  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

BMC Medical Education

ISSN: 1472-6920

A systematic review of research on online teaching and learning from 2009 to 2018

Systematic reviews of online learning research were conducted in the 1990s and early 2000s. However, no review has examined the broader research themes in online learning in the last decade. This systematic review addresses this gap by examining 619 research articles on online learning published in twelve journals in the last decade. These studies were examined for publication trends and patterns, research themes, research methods, and research settings, and compared with the research themes from the previous decades. While the number of studies on online learning dipped slightly in 2015 and 2016, it increased again in 2017 and 2018. The majority of the studies were quantitative in nature and were conducted in higher education. Online learning research was categorized into twelve themes, and a framework spanning the learner, course and instructor, and organizational levels was developed. Online learner characteristics and online engagement were examined in a high number of studies, consistent with three of the prior systematic reviews. However, there is still a need for more research on organization-level topics, such as leadership, policy, and management and access, culture, equity, inclusion, and ethics, and also on online instructor characteristics.

  • Twelve online learning research themes were identified in 2009–2018.
  • A framework with learner, course and instructor, and organizational levels was used.
  • Online learner characteristics and engagement were the most examined themes.
  • The majority of the studies used quantitative research methods and were conducted in higher education.
  • There is a need for more research on organization-level topics.

1. Introduction

Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase ( Allen & Seaman, 2017 ), and so has the research on online learning. There have been review studies conducted on specific areas on online learning such as innovations in online learning strategies ( Davis et al., 2018 ), empirical MOOC literature ( Liyanagunawardena et al., 2013 ; Veletsianos & Shepherdson, 2016 ; Zhu et al., 2018 ), quality in online education ( Esfijani, 2018 ), accessibility in online higher education ( Lee, 2017 ), synchronous online learning ( Martin et al., 2017 ), K-12 preparation for online teaching ( Moore-Adams et al., 2016 ), polychronicity in online learning ( Capdeferro et al., 2014 ), meaningful learning research in elearning and online learning environments ( Tsai, Shen, & Chiang, 2013 ), problem-based learning in elearning and online learning environments ( Tsai & Chiang, 2013 ), asynchronous online discussions ( Thomas, 2013 ), self-regulated learning in online learning environments ( Tsai, Shen, & Fan, 2013 ), game-based learning in online learning environments ( Tsai & Fan, 2013 ), and online course dropout ( Lee & Choi, 2011 ). While there have been review studies conducted on specific online learning topics, very few studies have been conducted on the broader aspect of online learning examining research themes.

2. Systematic Reviews of Distance Education and Online Learning Research

Distance education has evolved from offline to online settings with access to the internet, and COVID-19 has made online learning a common delivery method across the world. Tallent-Runnels et al. (2006) reviewed research from the late 1990s to the early 2000s, Berge and Mrozowski (2001) reviewed research from 1990 to 1999, and Zawacki-Richter et al. (2009) reviewed research from 2000 to 2008 on distance education and online learning. Table 1 shows the research themes from previous systematic reviews of online learning research. Some themes recur across the various reviews, and new themes also emerge. Though reviews were conducted in the 1990s and early 2000s, no review has examined the broader research themes in online learning in the last decade. Hence the need for this systematic review, which identifies the research themes in online learning from 2009 to 2018. In the following sections, we review these systematic review studies in detail.

Comparison of online learning research themes from previous studies.

2.1. Distance education research themes, 1990 to 1999 ( Berge & Mrozowski, 2001 )

Berge and Mrozowski (2001) reviewed 890 research articles and dissertation abstracts on distance education from 1990 to 1999. The four distance education journals chosen by the authors to represent distance education were the American Journal of Distance Education, Distance Education, Open Learning, and the Journal of Distance Education. This review overlapped in dates with the Tallent-Runnels et al. (2006) study. Berge and Mrozowski (2001) categorized the articles according to Sherry's (1996) ten themes of research issues in distance education: redefining the roles of instructors and students, technologies used, issues of design, strategies to stimulate learning, learner characteristics and support, issues related to operations, policies, and administration, access and equity, and costs and benefits.

In the Berge and Mrozowski (2001) study, more than 100 studies focused on each of three themes: (1) design issues, (2) learner characteristics, and (3) strategies to increase interactivity and active learning. Design issues centered on instructional systems design, covering topics such as content requirements, technical constraints, interactivity, and feedback. The next theme, strategies to increase interactivity and active learning, was closely related to design issues and focused on students’ modes of learning. Learner characteristics focused on accommodating various learning styles through customized instructional theory. Fewer than 50 studies focused on each of the three least examined themes: (1) cost–benefit tradeoffs, (2) equity and accessibility, and (3) learner support. Cost–benefit tradeoffs concerned the implementation costs of distance education based on school characteristics. Equity and accessibility focused on the equity of access to distance education systems. Learner support included topics such as teacher-to-teacher support as well as teacher-to-student support.

2.2. Online learning research themes, 1993 to 2004 ( Tallent-Runnels et al., 2006 )

Tallent-Runnels et al. (2006) reviewed research on online instruction from 1993 to 2004. They reviewed 76 articles focused on online learning by searching five databases, ERIC, PsycINFO, ContentFirst, Education Abstracts, and WilsonSelect. Tallent-Runnels et al. (2006) categorized research into four themes, (1) course environment, (2) learners' outcomes, (3) learners’ characteristics, and (4) institutional and administrative factors. The first theme that the authors describe as course environment ( n  = 41, 53.9%) is an overarching theme that includes classroom culture, structural assistance, success factors, online interaction, and evaluation.

For their second theme, Tallent-Runnels et al. (2006) found that studies focused on questions involving the process of teaching and learning and on methods to explore cognitive and affective learner outcomes (n = 29, 38.2%). The authors stated that they found the research designs flawed and lacking in rigor. However, the literature comparing traditional and online classrooms found both delivery systems to be adequate. Another research theme focused on learners’ characteristics (n = 12, 15.8%) and the synergy of learners, the design of the online course, and the system of delivery. Research findings revealed that online learners were mainly non-traditional, Caucasian, had different learning styles, and were highly motivated to learn. The final theme reported was institutional and administrative factors (n = 13, 17.1%) in online learning. The findings revealed a lack of scholarly research in this area: most institutions did not have formal policies in place for course development or for faculty and student support in training and evaluation. Their research confirmed that when universities offered online courses, student enrollment numbers improved.

2.3. Distance education research themes 2000 to 2008 ( Zawacki-Richter et al., 2009 )

Zawacki-Richter et al. (2009) reviewed 695 articles on distance education from 2000 to 2008 using the Delphi method for consensus in identifying areas and classified the literature from five prominent journals. The five journals selected due to their wide scope in research in distance education included Open Learning, Distance Education, American Journal of Distance Education, the Journal of Distance Education, and the International Review of Research in Open and Distributed Learning. The reviewers examined the main focus of research and identified gaps in distance education research in this review.

Zawacki-Richter et al. (2009) classified the studies into macro, meso and micro levels focusing on 15 areas of research. The five areas of the macro-level addressed: (1) access, equity and ethics to deliver distance education for developing nations and the role of various technologies to narrow the digital divide, (2) teaching and learning drivers, markets, and professional development in the global context, (3) distance delivery systems and institutional partnerships and programs and impact of hybrid modes of delivery, (4) theoretical frameworks and models for instruction, knowledge building, and learner interactions in distance education practice, and (5) the types of preferred research methodologies. The meso-level focused on seven areas that involve: (1) management and organization for sustaining distance education programs, (2) examining financial aspects of developing and implementing online programs, (3) the challenges and benefits of new technologies for teaching and learning, (4) incentives to innovate, (5) professional development and support for faculty, (6) learner support services, and (7) issues involving quality standards and the impact on student enrollment and retention. The micro-level focused on three areas: (1) instructional design and pedagogical approaches, (2) culturally appropriate materials, interaction, communication, and collaboration among a community of learners, and (3) focus on characteristics of adult learners, socio-economic backgrounds, learning preferences, and dispositions.

The top three research themes in the Zawacki-Richter et al. (2009) review were interaction and communities of learning (n = 122, 17.6%), instructional design (n = 121, 17.4%), and learner characteristics (n = 113, 16.3%). The fewest studies (less than 3% each) examined the themes of management and organization (n = 18), research methods in distance education and knowledge transfer (n = 13), globalization of education and cross-cultural aspects (n = 13), innovation and change (n = 13), and costs and benefits (n = 12).

2.4. Online learning research themes

These three systematic reviews provide a broad understanding of distance education and online learning research themes from 1990 to 2008. However, the volume of online learning research has grown considerably in the decade since, and the themes examined in this recent work have yet to be identified. Based on the previous systematic reviews ( Berge & Mrozowski, 2001 ; Hung, 2012 ; Tallent-Runnels et al., 2006 ; Zawacki-Richter et al., 2009 ), online learning research in this study is grouped into twelve research themes: Learner Characteristics; Instructor Characteristics; Course or Program Design and Development; Course Facilitation; Engagement; Evaluation and Quality Assurance; Course Assessment; Course Technologies; Access, Culture, Equity, Inclusion, and Ethics; Leadership, Policy, and Management; Instructor and Learner Support; and Learner Outcomes. Table 2 below describes each of the research themes, and a framework derived from them is shown in Fig. 1 .

Research themes in online learning.

Fig. 1

Online learning research themes framework.

The collection of research themes is presented as a framework in Fig. 1 . The themes are organized by domain or level to underscore the nested relationship that exists. As evidenced by the assortment of themes, research can focus on any domain of delivery or associated context. The “Learner” domain captures characteristics and outcomes related to learners and their interaction within the courses. The “Course and Instructor” domain captures elements about the broader design of the course and facilitation by the instructor, and the “Organizational” domain acknowledges the contextual influences on the course. It is important to note as well that due to the nesting, research themes can cross domains. For example, the broader cultural context may be studied as it pertains to course design and development, and institutional support can include both learner support and instructor support. Likewise, engagement research can involve instructors as well as learners.

In this introduction, we have reviewed three systematic reviews of online learning research ( Berge & Mrozowski, 2001 ; Tallent-Runnels et al., 2006 ; Zawacki-Richter et al., 2009 ). Based on these reviews and other research, we derived twelve themes and organized them into an online learning research framework nested in three levels: learner; course and instructor; and organization.

2.5. Purpose of this research

In two of the three previous reviews, design, learner characteristics, and interaction were examined in the highest number of studies, whereas cost-benefit tradeoffs, equity and accessibility, institutional and administrative factors, and globalization and cross-cultural aspects were examined in the fewest. One explanation may be a function of nesting: studies at the Organizational and Course levels may encompass several courses, or many more participants within courses. However, while some research themes recur, others vary across time, suggesting that the importance of research themes rises and falls. A critical examination of these trends is therefore helpful for understanding where research is needed most. Since no recent study has examined online learning research themes over the last decade, this study addresses that gap by focusing on recent research themes in the literature and also reviewing research methods and settings. Notably, one goal is to compare findings from this decade with the previous review studies. Overall, the purpose of this study is to examine publication trends in online learning research during the last ten years and compare them with the themes identified in previous review studies. Given the continued growth of online learning research into new contexts and among new researchers, we also examine the research methods and settings of the studies in this review.

The following research questions are addressed in this study.

  • 1. What percentage of the population of articles published in the journals reviewed from 2009 to 2018 were empirical studies of online learning?
  • 2. What is the frequency of online learning research themes in the empirical online learning articles of journals reviewed from 2009 to 2018?
  • 3. What is the frequency of research methods and settings that researchers employed in the empirical online learning articles of the journals reviewed from 2009 to 2018?

3. Method

The five-step systematic review process described in the U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse Procedures and Standards Handbook, Version 4.0 ( 2017 ) was used: (a) developing the review protocol, (b) identifying relevant literature, (c) screening studies, (d) reviewing articles, and (e) reporting findings.

3.1. Data sources and search strategies

The Education Research Complete database was searched for articles published between 2009 and 2018, using both the Title and Keyword fields with the following search terms:

“online learning" OR "online teaching" OR "online program" OR "online course" OR “online education”

3.2. Inclusion/exclusion criteria

The initial search of online learning research among journals in the database returned more than 3000 possible articles. We therefore limited our search to journals that focus on publishing peer-reviewed online learning and educational research. Our aim was to capture the journals that published the most articles on online learning; to incorporate rigor, we also used expert judgment to identify 12 peer-reviewed journals that publish high-quality online learning research. Dissertations and conference proceedings were excluded. To be included in this systematic review, each study had to meet all of the screening criteria described in Table 3 ; a study was excluded if it failed any criterion.

Inclusion/Exclusion criteria.

3.3. Process flow selection of articles

Fig. 2 shows the process flow involved in the selection of articles. The search in the database Education Research Complete yielded an initial sample of 3332 articles. Targeting the 12 journals removed 2579 articles. After reviewing the abstracts, we removed 134 articles based on the inclusion/exclusion criteria. The final sample, consisting of 619 articles, was entered into the computer software MAXQDA ( VERBI Software, 2019 ) for coding.
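The screening counts reported above can be sanity-checked with simple arithmetic; this minimal sketch uses only the figures given in the text:

```python
# Screening counts from the selection flow (Fig. 2):
# 3332 initial database hits, 2579 removed when restricting to the 12 journals,
# 134 removed on abstract review against the inclusion/exclusion criteria.
initial = 3332
removed_journal_filter = 2579
removed_abstract_screen = 134

final_sample = initial - removed_journal_filter - removed_abstract_screen
print(final_sample)  # 619
```

The result matches the final sample of 619 articles entered into MAXQDA.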

Fig. 2

Flowchart of online learning research selection.

3.4. Developing review protocol

A review protocol was designed as a codebook in MAXQDA ( VERBI Software, 2019 ) by the three researchers. The codebook was developed from the findings of the previous review studies and from an initial screening of the articles in this review. It included the 12 research themes listed earlier in Table 2 (Learner Characteristics; Instructor Characteristics; Course or Program Design and Development; Course Facilitation; Engagement; Evaluation and Quality Assurance; Course Assessment; Course Technologies; Access, Culture, Equity, Inclusion, and Ethics; Leadership, Policy, and Management; Instructor and Learner Support; and Learner Outcomes), four research settings (higher education, continuing education, K-12, corporate/military), and three research designs (quantitative, qualitative, and mixed methods). Fig. 3 below is a screenshot of the MAXQDA codebook used in the coding process.

Fig. 3

Codebook from MAXQDA.

3.5. Data coding

Research articles were coded by two researchers in MAXQDA. The two researchers independently coded 10% of the articles, then discussed discrepancies and updated the coding framework. The second author, a doctoral student, coded the remaining studies, and the researchers met bi-weekly to address coding questions that emerged. After the first phase of coding, we found that more than 100 studies fell into each of the Learner Characteristics and Engagement categories, so we pursued a second phase of coding to reexamine these two themes. Learner Characteristics was classified into the subthemes of Academic, Affective, Motivational, Self-regulation, Cognitive, and Demographic characteristics. Engagement was classified into the subthemes of Collaborating, Communication, Community, Involvement, Interaction, Participation, and Presence.
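The review reports checking inter-rater reliability on the double-coded subset but does not name the agreement statistic used. As a hedged illustration, Cohen's kappa is one common choice for two coders assigning categorical theme labels; the labels below are invented for demonstration:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical labels on the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    labels = set(coder_a) | set(coder_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes for six double-coded articles.
a = ["engagement", "learner", "engagement", "design", "learner", "engagement"]
b = ["engagement", "learner", "learner", "design", "learner", "engagement"]
print(round(cohens_kappa(a, b), 2))  # 0.74
```

Values near 1.0 indicate near-perfect agreement; disagreements like the one above would be resolved in the consensus discussions the review describes.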

3.6. Data analysis

Frequency tables were generated for each variable so that outliers could be examined and narrative data could be collapsed into categories. Once the data were cleaned and collapsed into a reasonable number of categories, descriptive statistics were used to describe each coded element. We first present the frequencies of publications related to online learning in the 12 journals. The total number of articles for each journal (collectively, the population) was hand-counted from journal websites, excluding editorials and book reviews. The publication trend of online learning research from 2009 to 2018 was also depicted. We then provide descriptive information for the 12 themes, including the subthemes of Learner Characteristics and Engagement. Finally, research themes are elaborated by research setting and methodology.
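A minimal sketch of this descriptive step, using the counts reported for the two largest themes (the helper function and table layout are ours):

```python
# Total empirical articles in the final sample.
TOTAL_ARTICLES = 619

# Counts for the two most frequent themes, as reported in the results.
theme_counts = {"Engagement": 179, "Learner Characteristics": 134}

def frequency_table(counts, total):
    """Map each theme to (count, percentage of the total sample)."""
    return {theme: (n, round(100 * n / total, 2)) for theme, n in counts.items()}

for theme, (n, pct) in frequency_table(theme_counts, TOTAL_ARTICLES).items():
    print(f"{theme}: n = {n} ({pct}%)")
```

The computed percentages (28.92% and 21.65%) reproduce the figures reported in the results section.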

4. Results

4.1. Publication trends on online learning

Publication patterns of the 619 articles reviewed from the 12 journals are presented in Table 4 . International Review of Research in Open and Distributed Learning had the highest number of publications in this review. Overall, about 8% of the articles appearing in these twelve journals consisted of online learning publications; however, several journals had concentrations of online learning articles totaling more than 20%.

Empirical online learning research articles by journal, 2009–2018.

Note . Journal's Total Article count excludes reviews and editorials.

The publication trend of online learning research is depicted in Fig. 4 . When disaggregated by year, the total frequency of publications shows an increasing trend. Online learning articles increased throughout the decade and hit a relative maximum in 2014. The greatest number of online learning articles ( n  = 86) occurred most recently, in 2018.

Fig. 4

Online learning publication trends by year.

4.2. Online learning research themes that appeared in the selected articles

The publications were categorized into the twelve research themes identified in Fig. 1 . The frequency counts and percentages of the research themes are provided in Table 5 below. A majority of the research is categorized into the Learner domain. The fewest number of articles appears in the Organization domain.

Research themes in the online learning publications from 2009 to 2018.

The specific themes of Engagement ( n  = 179, 28.92%) and Learner Characteristics ( n  = 134, 21.65%) were most often examined in publications. These two themes were further coded to identify sub-themes, which are described in the next two sections. Publications focusing on Instructor Characteristics ( n  = 21, 3.39%) were least common in the dataset.

4.2.1. Research on engagement

The largest number of studies focused on engagement in online learning, which the literature refers to and examines through a variety of terms. Hence, we explore this category in more detail, categorizing the articles into seven sub-themes reflecting those different lenses: presence, interaction, community, participation, collaboration, involvement, and communication. We use the term "involvement" because researchers sometimes used the term engagement broadly to describe their work without further description. Table 6 below provides the description, frequency, and percentage of the studies related to each sub-theme.

Research sub-themes on engagement.

In the sections below, we provide several examples of the different engagement sub-themes that were studied within the larger engagement theme.

Presence. This was the most researched sub-theme within engagement. Following the development of the community of inquiry framework, most of the studies in this sub-theme examined social presence ( Akcaoglu & Lee, 2016 ; Phirangee & Malec, 2017 ; Wei et al., 2012 ), teaching presence ( Orcutt & Dringus, 2017 ; Preisman, 2014 ; Wisneski et al., 2015 ), and cognitive presence ( Archibald, 2010 ; Olesova et al., 2016 ).

Interaction . This was the second most studied theme under engagement. Researchers examined increasing interpersonal interactions ( Cung et al., 2018 ), learner-learner interactions ( Phirangee, 2016 ; Shackelford & Maxwell, 2012 ; Tawfik et al., 2018 ), peer-peer interaction ( Comer et al., 2014 ), learner-instructor interaction ( Kuo et al., 2014 ), learner-content interaction ( Zimmerman, 2012 ), interaction through peer mentoring ( Ruane & Koku, 2014 ), interaction and community building ( Thormann & Fidalgo, 2014 ), and interaction in discussions ( Ruane & Lee, 2016 ; Tibi, 2018 ).

Community. Researchers examined building community in online courses ( Berry, 2017 ), supporting a sense of community ( Jiang, 2017 ), building an online learning community of practice ( Cho, 2016 ), building an academic community ( Glazer & Wanstreet, 2011 ; Nye, 2015 ; Overbaugh & Nickel, 2011 ), and examining connectedness and rapport in an online community ( Bolliger & Inan, 2012 ; Murphy & Rodríguez-Manzanares, 2012 ; Slagter van Tryon & Bishop, 2012 ).

Participation. Researchers examined engagement through participation in a number of studies. Topics include participation patterns in online discussion ( Marbouti & Wise, 2016 ; Wise et al., 2012 ), participation in MOOCs ( Ahn et al., 2013 ; Saadatmand & Kumpulainen, 2014 ), features that influence students' online participation ( Rye & Støkken, 2012 ), and active participation.

Collaboration. Researchers examined engagement through collaborative learning. Specific studies focused on cross-cultural collaboration ( Kumi-Yeboah, 2018 ; Yang et al., 2014 ), how virtual teams collaborate ( Verstegen et al., 2018 ), types of collaboration teams ( Wicks et al., 2015 ), tools for collaboration ( Boling et al., 2014 ), and support for collaboration ( Kopp et al., 2012 ).

Involvement. Researchers examined engaging learners through involvement in various learning activities ( Cundell & Sheepy, 2018 ), student engagement through various measures ( Dixson, 2015 ), how instructors included engagement to involve students in learning ( O'Shea et al., 2015 ), different strategies to engage the learner ( Amador & Mederer, 2013 ), and designed emotionally engaging online environments ( Koseoglu & Doering, 2011 ).

Communication. Researchers examined communication in online learning in studies using social network analysis ( Ergün & Usluel, 2016 ), using informal communication tools such as Facebook for class discussion ( Kent, 2013 ), and using various modes of communication ( Cunningham et al., 2010 ; Rowe, 2016 ). Studies have also focused on both asynchronous and synchronous aspects of communication ( Swaggerty & Broemmel, 2017 ; Yamagata-Lynch, 2014 ).

4.2.2. Research on learner characteristics

The second largest theme was learner characteristics, which we also explored in more detail. We categorized learner characteristics into self-regulation, motivational, academic, affective, cognitive, and demographic characteristics. Table 7 provides the number and percentage of studies examining each.

Research sub-themes on learner characteristics.

Online learning differs in important ways from the traditional face-to-face classroom, and so the characteristics of online learners also differ. Yukselturk and Top (2013) categorized the online learner profile into ten aspects: gender, age, work status, self-efficacy, online readiness, self-regulation, participation in discussion lists, participation in chat sessions, satisfaction, and achievement. Their categorization shows that online learners differ from learners in other settings in these aspects. Some of the aspects they discuss, such as participation and achievement, are treated under different research themes in this study. The sections below provide examples of the learner characteristics sub-themes that were studied.

Self-regulation. Several researchers have examined self-regulation in online learning. They found that successful online learners are academically motivated ( Artino & Stephens, 2009 ), have academic self-efficacy ( Cho & Shen, 2013 ), have grit and intention to succeed ( Wang & Baker, 2018 ), use time management and elaboration strategies ( Broadbent, 2017 ), set goals and revisit course content ( Kizilcec et al., 2017 ), and persist ( Glazer & Murphy, 2015 ). Researchers also found positive relationships between learners' self-regulation and interaction ( Delen et al., 2014 ) and between self-regulation and communication and collaboration ( Barnard et al., 2009 ).

Motivation. Researchers focused on the motivation of online learners, including different motivation levels ( Li & Tsai, 2017 ), what motivated online learners ( Chaiprasurt & Esichaikul, 2013 ), differences in motivation among online learners ( Hartnett et al., 2011 ), and motivation compared to face-to-face learners ( Paechter & Maier, 2010 ). Hartnett et al. (2011) found that online learner motivation was complex, multifaceted, and sensitive to situational conditions.

Academic. Several researchers have focused on academic aspects of online learner characteristics. Readiness for online learning has been examined as an academic factor ( Buzdar et al., 2016 ; Dray et al., 2011 ; Wladis & Samuels, 2016 ; Yu, 2018 ), with a specific focus on creating and validating measures of online learner readiness, including students' emotional intelligence. Researchers have also examined other academic factors such as academic standing ( Bradford & Wyatt, 2010 ), course-level factors ( Wladis et al., 2014 ), and academic skills in online courses ( Shea & Bidjerano, 2014 ).

Affective. Anderson and Bourke (2013) describe affective characteristics as those through which learners express feelings or emotions. Several studies focused on the affective characteristics of online learners: learner satisfaction with online learning has been examined by several researchers ( Cole et al., 2014 ; Dziuban et al., 2015 ; Kuo et al., 2013 ; Lee, 2014a ), along with student emotions towards online assessment ( Kim et al., 2014 ).

Cognitive. Researchers have also examined cognitive aspects of learner characteristics including meta-cognitive skills, cognitive variables, higher-order thinking, cognitive density, and critical thinking ( Chen & Wu, 2012 ; Lee, 2014b ). Lee (2014b) examined the relationship between cognitive presence density and higher-order thinking skills. Chen and Wu (2012) examined the relationship between cognitive and motivational variables in an online system for secondary physical education.

Demographic. Researchers have examined various demographic factors in online learning. Several researchers have examined gender differences in online learning ( Bayeck et al., 2018 ; Lowes et al., 2016 ; Yukselturk & Bulut, 2009 ), ethnicity, age ( Ke & Kwak, 2013 ), and minority status ( Yeboah & Smith, 2016 ) of online learners.

4.2.3. Less frequently studied research themes

While engagement and learner characteristics were studied the most, other themes were less often studied in the literature and are presented here, according to size, with general descriptions of the types of research examined for each.

Evaluation and Quality Assurance. There were 38 studies (6.14%) published in the evaluation and quality assurance theme. Some of the studies in this theme focused on course quality standards, using Quality Matters to evaluate quality, using the CIPP model for evaluation, online learning system evaluation, and course and program evaluations.

Course Technologies. There were 35 studies (5.65%) published in the course technologies theme. Some of the studies examined specific technologies such as Edmodo, YouTube, Web 2.0 tools, wikis, Twitter, WebCT, Screencasts, and Web conferencing systems in the online learning context.

Course Facilitation. There were 34 studies (5.49%) published in the course facilitation theme. Some of the studies in this theme examined facilitation strategies and methods, experiences of online facilitators, and online teaching methods.

Institutional Support. There were 33 studies (5.33%) published in the institutional support theme which included support for both the instructor and learner. Some of the studies on instructor support focused on training new online instructors, mentoring programs for faculty, professional development resources for faculty, online adjunct faculty training, and institutional support for online instructors. Studies on learner support focused on learning resources for online students, cognitive and social support for online learners, and help systems for online learner support.

Learner Outcome. There were 32 studies (5.17%) published in the learner outcome theme. Some of the studies that were examined in this theme focused on online learner enrollment, completion, learner dropout, retention, and learner success.

Course Assessment. There were 30 studies (4.85%) published in the course assessment theme. Some of the studies in the course assessment theme examined online exams, peer assessment and peer feedback, proctoring in online exams, and alternative assessments such as eportfolio.

Access, Culture, Equity, Inclusion, and Ethics. There were 29 studies (4.68%) published in the access, culture, equity, inclusion, and ethics theme. Some of the studies in this theme examined online learning across cultures, multi-cultural effectiveness, multi-access, and cultural diversity in online learning.

Leadership, Policy, and Management. There were 27 studies (4.36%) published in the leadership, policy, and management theme. Some of the studies on leadership, policy, and management focused on online learning leaders, stakeholders, strategies for online learning leadership, resource requirements, university policies for online course policies, governance, course ownership, and faculty incentives for online teaching.

Course Design and Development. There were 27 studies (4.36%) published in the course design and development theme. Some of the studies examined in this theme focused on design elements, design issues, design process, design competencies, design considerations, and instructional design in online courses.

Instructor Characteristics. There were 21 studies (3.39%) published in the instructor characteristics theme. Some of the studies in this theme were on motivation and experiences of online instructors, ability to perform online teaching duties, roles of online instructors, and adjunct versus full-time online instructors.

4.3. Research settings and methodology used in the studies

The research methods used in the studies were classified into quantitative, qualitative, and mixed methods ( Harwell, 2012 , pp. 147–163). The research setting was categorized into higher education, continuing education, K-12, and corporate/military. As shown in Table A in the appendix, the vast majority of the publications used higher education as the research setting ( n  = 509, 67.6%). Table B in the appendix shows that approximately half of the studies adopted the quantitative method ( n  = 324, 43.03%), followed by the qualitative method ( n  = 200, 26.56%). Mixed methods account for the smallest portion ( n  = 95, 12.62%).

Table A shows that the patterns of the four research settings were approximately consistent across the 12 themes, except for the themes of Learner Outcome and Institutional Support. Continuing education had a higher relative frequency in Learner Outcome (0.28), and K-12 had a higher relative frequency in Institutional Support (0.33), compared to their frequencies across all themes (0.09 and 0.08, respectively). Table B in the appendix shows that the distribution of the three methods was not consistent across the 12 themes. While quantitative and qualitative studies were roughly evenly distributed in Engagement, they diverged sharply in Learner Characteristics: 100 quantitative studies but only 18 qualitative studies were published in that theme.
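The relative-frequency comparison above can be made concrete with a small sketch; the 9-of-32 split below is a hypothetical breakdown consistent with the reported 0.28, not a figure from the review:

```python
def relative_frequency(setting_count, theme_total):
    """Share of one research setting within a theme, rounded to 2 decimals."""
    return round(setting_count / theme_total, 2)

# e.g., if roughly 9 of the 32 Learner Outcome studies were set in continuing
# education (a hypothetical split matching the reported relative frequency):
print(relative_frequency(9, 32))  # 0.28
```

Comparing such within-theme shares against a setting's share across all themes (0.09 for continuing education) is what flags Learner Outcome as an outlier.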

In summary, around 8% of the articles published in the 12 journals focus on online learning. Online learning publications showed an overall increasing trend over the past decade, albeit with fluctuations, with the greatest number occurring in 2018. Among the 12 research themes, Engagement and Learner Characteristics were studied the most and Instructor Characteristics the least. Most studies were conducted in higher education settings, and approximately half used quantitative methods. Examining the 12 themes by setting and method, we found that their patterns were not consistent across themes.

The quality of our findings was supported by systematic, thorough searches and by coding consistency. The selection of the 12 journals provides evidence of the representativeness and quality of the primary studies. In the coding process, difficulties and questions were resolved through consultation with the research team at bi-weekly meetings, supporting the intra-rater and inter-rater reliability of the coding. Together, these approaches strengthen the transparency and replicability of the process and the quality of our results.

5. Discussion

This review enabled us to identify the online learning research themes examined from 2009 to 2018. In the section below, we review the most studied research themes, engagement and learner characteristics along with implications, limitations, and directions for future research.

5.1. Most studied research themes

Three of the four systematic reviews informing the design of the present study found that online learner characteristics and online engagement were examined in a high number of studies. In this review, about half of the studies (50.57%) focused on online learner characteristics or online engagement, showing the continued importance of these two themes. In Tallent-Runnels et al.'s (2006) study, by contrast, learner characteristics was identified as a least studied theme; the authors noted that researchers were only beginning to investigate learner characteristics in the early days of online learning.

One difference found in this review is that course design and development was examined in fewer studies than in two prior systematic reviews ( Berge & Mrozowski, 2001 ; Zawacki-Richter et al., 2009 ). Zawacki-Richter et al. did not use a keyword search but reviewed all articles in five distance education journals, and Berge and Mrozowski (2001) used a research theme called design issues that covered all aspects of instructional systems design. In our study, in addition to course design and development, we had focused themes on learner outcomes, course facilitation, course assessment, and course evaluation. Because multiple themes covered such instructional design topics, the course design and development category may have attracted fewer studies. There is still a need for more studies on online course design and development.

5.2. Least frequently studied research themes

Three of the four systematic reviews discussed in the opening of this study found management and organization factors to be least studied. In this review, at the organizational level, Leadership, Policy, and Management was examined in 4.36% of the studies and Access, Culture, Equity, Inclusion, and Ethics in 4.68%. Equity and accessibility was also the least studied theme in the Berge and Mrozowski (2001) study. In addition, instructor characteristics was the least examined of the twelve themes in this review, appearing in only 3.39% of the studies. While some studies examined instructor motivation and experiences, instructors' ability to teach online, online instructor roles, and adjunct versus full-time online instructors, there is still a need to examine topics focused on instructors and online teaching. This theme was not included in the prior reviews, whose focus was on the learner and the course rather than the instructor. While it is encouraging to see research evolving on instructor-focused topics, more research on the online instructor is still needed.

5.3. Comparing research themes from current study to previous studies

The research themes from this review were compared with research themes from previous systematic reviews, which targeted prior decades. Table 8 shows the comparison.

Comparison of most and least studied online learning research themes from current to previous reviews.

L = Learner, C = Course, O = Organization.

5.4. Need for more studies on organizational level themes of online learning

In this review there is a greater concentration of studies on Learner domain topics and reduced attention to the broader, more encompassing research themes in the Course and Organization domains. Organizational-level topics such as Access, Culture, Equity, Inclusion, and Ethics, and Leadership, Policy, and Management need to be researched within the context of online learning. Examining access, culture, equity, inclusion, and ethics is especially important for supporting diverse online learners, particularly given the rapid expansion of online learning across all educational levels. This theme was also among the least studied in the Berge and Mrozowski (2001) systematic review.

Topics on leadership, policy, and management were among the least studied in this review and also in the Tallent-Runnels et al. (2006) and Zawacki-Richter et al. (2009) studies. Tallent-Runnels et al. categorized institutional and administrative aspects into institutional policies, institutional support, and enrollment effects. While we included support as a separate category, in this study leadership, policy, and management were combined. There is still a need for research on the leadership of those who manage online learning, on policies for online education, and on managing online programs. In the Zawacki-Richter et al. (2009) study, only a few studies examined management and organization focused topics, and management and organization was found to be strongly correlated with costs and benefits. In our study, costs and benefits were collectively included as an aspect of management and organization rather than as a theme by themselves. Such studies would provide research-based evidence for online education administrators.

6. Limitations

As with any systematic review, there are limitations to its scope. The search was limited to twelve journals in the field that typically publish research on online learning. Manuscripts were identified by searching the Education Research Complete database, which serves education students, professionals, and policymakers. Discipline-specific journals, dissertations, and conference proceedings were not included due to the volume of articles. The search was also restricted to five terms ("online learning" OR "online teaching" OR "online program" OR "online course" OR "online education") in the title and keywords. If authors did not use these terms, their work may have been excluded from this review even if it focused on online learning. Moreover, while these terms are common in North America, they may not be commonly used in other parts of the world. Additional studies may therefore exist outside this scope.
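The title/keyword inclusion rule described above can be sketched as a small screening function. This is a hypothetical illustration: the field names and sample records are invented, not taken from the actual database export.

```python
# Hypothetical sketch of the title/keyword screening rule described in the
# text. The sample titles and keyword lists below are illustrative only.
SEARCH_TERMS = [
    "online learning",
    "online teaching",
    "online program",
    "online course",
    "online education",
]

def matches_search(title, keywords):
    """Return True if any search term appears in the title or keywords."""
    haystacks = [title.lower()] + [k.lower() for k in keywords]
    return any(term in text for term in SEARCH_TERMS for text in haystacks)

# An article using a regional synonym (e.g., "distance education") is
# missed, which is exactly the limitation noted in the text.
print(matches_search("Engagement in online learning environments", []))  # True
print(matches_search("Engagement in distance education settings", []))   # False
```

The second call shows how a relevant study phrased with a different term would fall outside the search scope.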

The search strategy also shaped how we present results and limits generalization. We identified that only 8% of the articles published in these journals were related to online learning; however, because search terms were used to identify articles within selected journals, it was not feasible to determine the total number of research-based articles in the population. Furthermore, our review focused on the topics and general methods of the research and did not systematically assess the quality of the published studies. Lastly, some journals may prefer to publish studies on particular topics or using particular methods (e.g., quantitative methods), which introduces possible selection and publication biases and may skew the interpretation of results through over- or under-representation. Future studies should include more journals to minimize selection bias and obtain a more representative sample.

Certain limitations can be attributed to the coding process. Overall, the coding process worked well for most articles, as each tended to have a dominant focus described in its abstract, though several mentioned other categories that were likely considered simultaneously to a lesser degree. In some cases, however, a dominant theme was not apparent, and in the effort to create mutually exclusive groups for clearer interpretation, the coders were occasionally forced to choose between two categories. To facilitate this coding, the full texts were used to identify a study's focus through a consensus-seeking discussion among all authors. Likewise, some studies focused on topics that we associated with a particular domain, but their designs may have promoted an aggregated examination or integrated factors from multiple domains (e.g., engagement). Because of our reliance on author descriptions, construct validity is a concern that requires additional exploration, and our final grouping of codes may not align with the original authors' descriptions in their abstracts. Additionally, coding of broader constructs that disproportionately occur in the Learner domain, such as learner outcomes, learner characteristics, and engagement, likely introduced bias toward these codes for studies that involved multiple domains. Additional refinement to explore the intersection of domains within studies is needed.

7. Implications and future research

One of the strengths of this review is the set of research categories we have identified. We hope these categories will support future researchers and identify areas and levels of need for future research. Overall, there is some agreement between the research themes identified in previous reviews and in this one, though there are also some contradicting findings. We hope that the most- and least-researched themes give authors a sense of where research is established and which areas of need to focus on.

The leading theme found in this review is online engagement research. However, the presentation of this research was inconsistent and often lacked specificity. This is not unique to online environments, but the nuances of defining engagement in an online environment are, and they therefore need further investigation and clarification. This review points to seven distinct classifications of online engagement. Further research on engagement should indicate which type of engagement is sought. This level of specificity is necessary to establish instruments for measuring engagement and, ultimately, to test frameworks for classifying engagement and promoting it in online environments. It may also be important to examine the relationships among these seven sub-themes of engagement.

Additionally, this review highlights growing attention to learner characteristics, which constitutes a shift in focus away from instructional characteristics and course design. Although this is consistent with the focus on engagement, the roles of the instructor and of course design with respect to these outcomes remain important. Results of learner characteristics and engagement research, paired with course design, will have important ramifications for the teaching and learning professionals who support instruction. The review also points to a concentration of research in higher education settings. With an immediate and growing emphasis on online learning in K-12 and corporate settings, there is a critical need for further investigation in these contexts.

Lastly, because the present review did not focus on the overall effects of interventions, opportunities exist for dedicated meta-analyses. Particular attention to research on engagement and learner characteristics, and to how these vary by study design and outcome, would be a logical addition to the research literature.

8. Conclusion

This systematic review builds upon three previous reviews, which examined online learning research between 1990 and 2010, by extending the timeframe to the most recent set of published research. Covering the most recent decade, our review of 619 articles from 12 leading online learning journals points to a concentrated focus on the Learner domain, including engagement and learner characteristics, with more limited attention to topics at the course or organizational level. The review highlights an opportunity for the field to clarify terminology in online learning research, particularly in the area of learner outcomes, where there is a tendency to classify research more generally (e.g., as engagement). Using this sample of published literature, we propose a possible taxonomy for categorizing this research using subcategories. The field could benefit from a broader conversation about how these categories can shape a comprehensive framework for online learning research. Such efforts will enable the field to prioritize research aims over time and synthesize effects.

Credit author statement

Florence Martin: Conceptualization, Writing - original draft, Writing - review & editing, Supervision, Project administration. Ting Sun: Methodology, Formal analysis, Writing - original draft, Writing - review & editing. Carl Westine: Methodology, Formal analysis, Writing - original draft, Writing - review & editing, Supervision.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

1 Includes articles that are cited in this manuscript and also included in the systematic review (marked with an asterisk). The entire list of 619 articles used in the systematic review can be obtained by emailing the authors.

Appendix B Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2020.104009 .

Appendix A. 

Research Themes by the Settings in the Online Learning Publications

Research Themes by the Methodology in the Online Learning Publications


References 1

  • Ahn J., Butler B.S., Alam A., Webster S.A. Learner participation and engagement in open online courses: Insights from the Peer 2 Peer University. MERLOT Journal of Online Learning and Teaching. 2013;9(2):160–171. *
  • Akcaoglu M., Lee E. Increasing social presence in online learning through small group discussions. International Review of Research in Open and Distance Learning. 2016;17(3). *
  • Allen I.E., Seaman J. Digital compass learning: Distance education enrollment report 2017. Babson Survey Research Group; 2017.
  • Amador J.A., Mederer H. Migrating successful student engagement strategies online: Opportunities and challenges using jigsaw groups and problem-based learning. Journal of Online Learning and Teaching. 2013;9(1):89. *
  • Anderson L.W., Bourke S.F. Assessing affective characteristics in the schools. Routledge; 2013.
  • Archibald D. Fostering the development of cognitive presence: Initial findings using the community of inquiry survey instrument. The Internet and Higher Education. 2010;13(1–2):73–74. *
  • Artino A.R., Jr., Stephens J.M. Academic motivation and self-regulation: A comparative analysis of undergraduate and graduate students learning online. The Internet and Higher Education. 2009;12(3–4):146–151.
  • Barnard L., Lan W.Y., To Y.M., Paton V.O., Lai S.L. Measuring self-regulation in online and blended learning environments. Internet and Higher Education. 2009;12(1):1–6. *
  • Bayeck R.Y., Hristova A., Jablokow K.W., Bonafini F. Exploring the relevance of single-gender group formation: What we learn from a massive open online course (MOOC). British Journal of Educational Technology. 2018;49(1):88–100. *
  • Berge Z., Mrozowski S. Review of research in distance education, 1990 to 1999. American Journal of Distance Education. 2001;15(3):5–19. doi: 10.1080/08923640109527090.
  • Berry S. Building community in online doctoral classrooms: Instructor practices that support community. Online Learning. 2017;21(2):n2. *
  • Boling E.C., Holan E., Horbatt B., Hough M., Jean-Louis J., Khurana C., Spiezio C. Using online tools for communication and collaboration: Understanding educators' experiences in an online course. The Internet and Higher Education. 2014;23:48–55. *
  • Bolliger D.U., Inan F.A. Development and validation of the online student connectedness survey (OSCS). International Review of Research in Open and Distance Learning. 2012;13(3):41–65. *
  • Bradford G., Wyatt S. Online learning and student satisfaction: Academic standing, ethnicity and their influence on facilitated learning, engagement, and information fluency. The Internet and Higher Education. 2010;13(3):108–114. *
  • Broadbent J. Comparing online and blended learner's self-regulated learning strategies and academic performance. The Internet and Higher Education. 2017;33:24–32.
  • Buzdar M., Ali A., Tariq R. Emotional intelligence as a determinant of readiness for online learning. International Review of Research in Open and Distance Learning. 2016;17(1). *
  • Capdeferro N., Romero M., Barberà E. Polychronicity: Review of the literature and a new configuration for the study of this hidden dimension of online learning. Distance Education. 2014;35(3):294–310.
  • Chaiprasurt C., Esichaikul V. Enhancing motivation in online courses with mobile communication tool support: A comparative study. International Review of Research in Open and Distance Learning. 2013;14(3):377–401.
  • Chen C.H., Wu I.C. The interplay between cognitive and motivational variables in a supportive online learning system for secondary physical education. Computers & Education. 2012;58(1):542–550. *
  • Cho H. Under co-construction: An online community of practice for bilingual pre-service teachers. Computers & Education. 2016;92:76–89. *
  • Cho M.H., Shen D. Self-regulation in online learning. Distance Education. 2013;34(3):290–301.
  • Cole M.T., Shelley D.J., Swartz L.B. Online instruction, e-learning, and student satisfaction: A three-year study. International Review of Research in Open and Distance Learning. 2014;15(6). *
  • Comer D.K., Clark C.R., Canelas D.A. Writing to learn and learning to write across the disciplines: Peer-to-peer writing in introductory-level MOOCs. International Review of Research in Open and Distance Learning. 2014;15(5):26–82. *
  • Cundell A., Sheepy E. Student perceptions of the most effective and engaging online learning activities in a blended graduate seminar. Online Learning. 2018;22(3):87–102. *
  • Cung B., Xu D., Eichhorn S. Increasing interpersonal interactions in an online course: Does increased instructor email activity and voluntary meeting time in a physical classroom facilitate student learning? Online Learning. 2018;22(3):193–215.
  • Cunningham U.M., Fägersten K.B., Holmsten E. "Can you hear me, Hanoi?" Compensatory mechanisms employed in synchronous net-based English language learning. International Review of Research in Open and Distance Learning. 2010;11(1):161–177.
  • Davis D., Chen G., Hauff C., Houben G.J. Activating learning at scale: A review of innovations in online learning strategies. Computers & Education. 2018;125:327–344.
  • Delen E., Liew J., Willson V. Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education. 2014;78:312–320.
  • Dixson M.D. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning. 2015;19(4):n4. *
  • Dray B.J., Lowenthal P.R., Miszkiewicz M.J., Ruiz-Primo M.A., Marczynski K. Developing an instrument to assess student readiness for online learning: A validation study. Distance Education. 2011;32(1):29–47. *
  • Dziuban C., Moskal P., Thompson J., Kramer L., DeCantis G., Hermsdorfer A. Student satisfaction with online learning: Is it a psychological contract? Online Learning. 2015;19(2):n2. *
  • Ergün E., Usluel Y.K. An analysis of density and degree-centrality according to the social networking structure formed in an online learning environment. Journal of Educational Technology & Society. 2016;19(4):34–46. *
  • Esfijani A. Measuring quality in online education: A meta-synthesis. American Journal of Distance Education. 2018;32(1):57–73.
  • Glazer H.R., Murphy J.A. Optimizing success: A model for persistence in online education. American Journal of Distance Education. 2015;29(2):135–144.
  • Glazer H.R., Wanstreet C.E. Connection to the academic community: Perceptions of students in online education. Quarterly Review of Distance Education. 2011;12(1):55. *
  • Hartnett M., George A.S., Dron J. Examining motivation in online distance learning environments: Complex, multifaceted and situation-dependent. International Review of Research in Open and Distance Learning. 2011;12(6):20–38.
  • Harwell M.R. Research design in qualitative/quantitative/mixed methods. Section III: Opportunities and challenges in designing and conducting inquiry. 2012.
  • Hung J.L. Trends of e-learning research from 2000 to 2008: Use of text mining and bibliometrics. British Journal of Educational Technology. 2012;43(1):5–16.
  • Jiang W. Interdependence of roles, role rotation, and sense of community in an online course. Distance Education. 2017;38(1):84–105.
  • Ke F., Kwak D. Online learning across ethnicity and age: A study on learning interaction participation, perception, and learning satisfaction. Computers & Education. 2013;61:43–51.
  • Kent M. Changing the conversation: Facebook as a venue for online class discussion in higher education. MERLOT Journal of Online Learning and Teaching. 2013;9(4):546–565. *
  • Kim C., Park S.W., Cozart J. Affective and motivational factors of learning in online mathematics courses. British Journal of Educational Technology. 2014;45(1):171–185.
  • Kizilcec R.F., Pérez-Sanagustín M., Maldonado J.J. Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses. Computers & Education. 2017;104:18–33.
  • Kopp B., Matteucci M.C., Tomasetto C. E-tutorial support for collaborative online learning: An explorative study on experienced and inexperienced e-tutors. Computers & Education. 2012;58(1):12–20.
  • Koseoglu S., Doering A. Understanding complex ecologies: An investigation of student experiences in adventure learning programs. Distance Education. 2011;32(3):339–355. *
  • Kumi-Yeboah A. Designing a cross-cultural collaborative online learning framework for online instructors. Online Learning. 2018;22(4):181–201. *
  • Kuo Y.C., Walker A.E., Belland B.R., Schroder K.E. A predictive study of student satisfaction in online education programs. International Review of Research in Open and Distance Learning. 2013;14(1):16–39. *
  • Kuo Y.C., Walker A.E., Schroder K.E., Belland B.R. Interaction, Internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education. 2014;20:35–50. *
  • Lee J. An exploratory study of effective online learning: Assessing satisfaction levels of graduate students of mathematics education associated with human and design factors of an online course. International Review of Research in Open and Distance Learning. 2014;15(1).
  • Lee S.M. The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education. 2014;21:41–52. *
  • Lee K. Rethinking the accessibility of online higher education: A historical review. The Internet and Higher Education. 2017;33:15–23.
  • Lee Y., Choi J. A review of online course dropout research: Implications for practice and future research. Educational Technology Research & Development. 2011;59(5):593–618.
  • Li L.Y., Tsai C.C. Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers & Education. 2017;114:286–297.
  • Liyanagunawardena T., Adams A., Williams S. MOOCs: A systematic study of the published literature 2008–2012. International Review of Research in Open and Distance Learning. 2013;14(3):202–227.
  • Lowes S., Lin P., Kinghorn B.R. Gender differences in online high school courses. Online Learning. 2016;20(4):100–117.
  • Marbouti F., Wise A.F. Starburst: A new graphical interface to support purposeful attention to others' posts in online discussions. Educational Technology Research & Development. 2016;64(1):87–113. *
  • Martin F., Ahlgrim-Delzell L., Budhrani K. Systematic review of two decades (1995 to 2014) of research on synchronous online learning. American Journal of Distance Education. 2017;31(1):3–19.
  • Moore-Adams B.L., Jones W.M., Cohen J. Learning to teach online: A systematic review of the literature on K-12 teacher preparation for teaching online. Distance Education. 2016;37(3):333–348.
  • Murphy E., Rodríguez-Manzanares M.A. Rapport in distance education. International Review of Research in Open and Distance Learning. 2012;13(1):167–190. *
  • Nye A. Building an online academic learning community among undergraduate students. Distance Education. 2015;36(1):115–128. *
  • Olesova L., Slavin M., Lim J. Exploring the effect of scripted roles on cognitive presence in asynchronous online discussions. Online Learning. 2016;20(4):34–53. *
  • Orcutt J.M., Dringus L.P. Beyond being there: Practices that establish presence, engage students and influence intellectual curiosity in a structured online learning environment. Online Learning. 2017;21(3):15–35. *
  • Overbaugh R.C., Nickel C.E. A comparison of student satisfaction and value of academic community between blended and online sections of a university-level educational foundations course. The Internet and Higher Education. 2011;14(3):164–174. *
  • O'Shea S., Stone C., Delahunty J. "I 'feel' like I am at university even though I am online": Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education. 2015;36(1):41–58. *
  • Paechter M., Maier B. Online or face-to-face? Students' experiences and preferences in e-learning. Internet and Higher Education. 2010;13(4):292–297.
  • Phirangee K. Students' perceptions of learner-learner interactions that weaken a sense of community in an online learning environment. Online Learning. 2016;20(4):13–33. *
  • Phirangee K., Malec A. Othering in online learning: An examination of social presence, identity, and sense of community. Distance Education. 2017;38(2):160–172. *
  • Preisman K.A. Teaching presence in online education: From the instructor's point of view. Online Learning. 2014;18(3):n3. *
  • Rowe M. Developing graduate attributes in an open online course. British Journal of Educational Technology. 2016;47(5):873–882. *
  • Ruane R., Koku E.F. Social network analysis of undergraduate education student interaction in online peer mentoring settings. Journal of Online Learning and Teaching. 2014;10(4):577–589. *
  • Ruane R., Lee V.J. Analysis of discussion board interaction in an online peer mentoring site. Online Learning. 2016;20(4):79–99. *
  • Rye S.A., Støkken A.M. The implications of the local context in global virtual education. International Review of Research in Open and Distance Learning. 2012;13(1):191–206. *
  • Saadatmand M., Kumpulainen K. Participants' perceptions of learning and networking in connectivist MOOCs. Journal of Online Learning and Teaching. 2014;10(1):16. *
  • Shackelford J.L., Maxwell M. Sense of community in graduate online education: Contribution of learner to learner interaction. International Review of Research in Open and Distance Learning. 2012;13(4):228–249. *
  • Shea P., Bidjerano T. Does online learning impede degree completion? A national study of community college students. Computers & Education. 2014;75:103–111. *
  • Sherry L. Issues in distance learning. International Journal of Educational Telecommunications. 1996;1(4):337–365.
  • Slagter van Tryon P.J., Bishop M.J. Evaluating social connectedness online: The design and development of the social perceptions in learning contexts instrument. Distance Education. 2012;33(3):347–364. *
  • Swaggerty E.A., Broemmel A.D. Authenticity, relevance, and connectedness: Graduate students' learning preferences and experiences in an online reading education course. The Internet and Higher Education. 2017;32:80–86. *
  • Tallent-Runnels M.K., Thomas J.A., Lan W.Y., Cooper S., Ahern T.C., Shaw S.M., Liu X. Teaching courses online: A review of the research. Review of Educational Research. 2006;76(1):93–135. doi: 10.3102/00346543076001093.
  • Tawfik A.A., Giabbanelli P.J., Hogan M., Msilu F., Gill A., York C.S. Effects of success v failure cases on learner-learner interaction. Computers & Education. 2018;118:120–132.
  • Thomas J. Exploring the use of asynchronous online discussion in health care education: A literature review. Computers & Education. 2013;69:199–215.
  • Thormann J., Fidalgo P. Guidelines for online course moderation and community building from a student's perspective. Journal of Online Learning and Teaching. 2014;10(3):374–388. *
  • Tibi M.H. Computer science students' attitudes towards the use of structured and unstructured discussion forums in fully online courses. Online Learning. 2018;22(1):93–106. *
  • Tsai C.W., Chiang Y.C. Research trends in problem-based learning (PBL) research in e-learning and online education environments: A review of publications in SSCI-indexed journals from 2004 to 2012. British Journal of Educational Technology. 2013;44(6):E185–E190.
  • Tsai C.W., Fan Y.T. Research trends in game-based learning research in online learning environments: A review of studies published in SSCI-indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013;44(5):E115–E119.
  • Tsai C.W., Shen P.D., Chiang Y.C. Research trends in meaningful learning research on e-learning and online education environments: A review of studies published in SSCI-indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013;44(6):E179–E184.
  • Tsai C.W., Shen P.D., Fan Y.T. Research trends in self-regulated learning research in online learning environments: A review of studies published in selected journals from 2003 to 2012. British Journal of Educational Technology. 2013;44(5):E107–E110.
  • U.S. Department of Education, Institute of Education Sciences. What Works Clearinghouse procedures and standards handbook, version 3.0. Washington, DC: Institute of Education Sciences; 2017. Retrieved from https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf
  • Veletsianos G., Shepherdson P. A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. International Review of Research in Open and Distance Learning. 2016;17(2).
  • VERBI Software. MAXQDA 2020 online manual. 2019. Retrieved from maxqda.com/help-max20/welcome
  • Verstegen D., Dailey-Hebert A., Fonteijn H., Clarebout G., Spruijt A. How do virtual teams collaborate in online learning tasks in a MOOC? International Review of Research in Open and Distance Learning. 2018;19(4). *
  • Wang Y., Baker R. Grit and intention: Why do learners complete MOOCs? International Review of Research in Open and Distance Learning. 2018;19(3). *
  • Wei C.W., Chen N.S., Kinshuk. A model for social presence in online classrooms. Educational Technology Research & Development. 2012;60(3):529–545. *
  • Wicks D., Craft B.B., Lee D., Lumpe A., Henrikson R., Baliram N., Wicks K. An evaluation of low versus high collaboration in online learning. Online Learning. 2015;19(4):n4. *
  • Wise A.F., Perera N., Hsiao Y.T., Speer J., Marbouti F. Microanalytic case studies of individual participation patterns in an asynchronous online discussion in an undergraduate blended course. The Internet and Higher Education. 2012;15(2):108–117. *
  • Wisneski J.E., Ozogul G., Bichelmeyer B.A. Does teaching presence transfer between MBA teaching environments? A comparative investigation of instructional design practices associated with teaching presence. The Internet and Higher Education. 2015;25:18–27. *
  • Wladis C., Hachey A.C., Conway K. An investigation of course-level factors as predictors of online STEM course outcomes. Computers & Education. 2014;77:145–150. *
  • Wladis C., Samuels J. Do online readiness surveys do what they claim? Validity, reliability, and subsequent student enrollment decisions. Computers & Education. 2016;98:39–56.
  • Yamagata-Lynch L.C. Blending online asynchronous and synchronous learning. International Review of Research in Open and Distance Learning. 2014;15(2). *
  • Yang J., Kinshuk, Yu H., Chen S.J., Huang R. Strategies for smooth and effective cross-cultural online collaborative learning. Journal of Educational Technology & Society. 2014;17(3):208–221. *
  • Yeboah A.K., Smith P. Relationships between minority students' online learning experiences and academic performance. Online Learning. 2016;20(4):n4. *
  • Yu T. Examining construct validity of the student online learning readiness (SOLR) instrument using confirmatory factor analysis. Online Learning. 2018;22(4):277–288. *
  • Yukselturk E., Bulut S. Gender differences in self-regulated online learning environment. Educational Technology & Society. 2009;12(3):12–22.
  • Yukselturk E., Top E. Exploring the link among entry characteristics, participation behaviors and course outcomes of online learners: An examination of learner profile using cluster analysis. British Journal of Educational Technology. 2013;44(5):716–728.
  • Zawacki-Richter O., Backer E., Vogt S. Review of distance education research (2000 to 2008): Analysis of research areas, methods, and authorship patterns. International Review of Research in Open and Distance Learning. 2009;10(6):30. doi: 10.19173/irrodl.v10i6.741.
  • Zhu M., Sari A., Lee M.M. A systematic review of research methods and topics of the empirical MOOC literature (2014–2016). The Internet and Higher Education. 2018;37:31–39.
  • Zimmerman T.D. Exploring learner to content interaction as a success factor in online courses. International Review of Research in Open and Distance Learning. 2012;13(4):152–165.


Researchers warn of danger, call for pause in bringing AI to schools


In K-12 schools across the country, a new gold rush of sorts is underway: Educators nationwide are racing to bring the latest artificial intelligence tools, such as platforms powered by the chatbot ChatGPT, into the classroom.

Alex Molnar, a director of the National Education Policy Center (NEPC) at CU Boulder, sees a danger in this hurry to introduce AI to schools. These platforms, he said, use opaque and usually proprietary algorithms—making their inner workings mysterious to educators, parents and students alike.

“What you have is a pocketful of promises that AI will deliver as promised,” said Molnar, a research professor in the School of Education . “The problem is there is currently no way to independently evaluate the claims being made.” 

In a new report, Molnar and his colleagues highlight the potential pitfalls of AI in education and call for an indefinite “pause” in integrating AI into K-12 learning. Co-authors included Ben Williamson of the University of Edinburgh in the United Kingdom and Faith Boninger, assistant research professor of education at CU Boulder.  

Molnar gives his take on why AI is a risky gamble for education—and what concerned parents and others can do to get involved.


Does new technology pose risks to K-12 education?

There have been all kinds of issues associated with the use of digital platforms in schools, even before the widespread adoption of artificial intelligence. 

Student data are often not properly protected. For example, there have been all kinds of leaks from third-party vendors, and there's no law or effective policy that holds them accountable. You also have an awful lot of beta testing going on in schools. Marketing claims sound good, but digital platforms often don't produce the promised results and are riddled with technical issues.

Digital technologies have made it difficult or impossible to answer fundamental questions, such as: Who's deciding the curriculum content that gets built into these platforms? Who's reviewing their work?

Could AI make those issues worse?

All of the issues related to digital technologies tend to be amplified by artificial intelligence.

So-called AI uses algorithms and massive amounts of computing power to produce results based on countless calculations of probabilities. For example, what is the probability that the next word in a sequence will be ‘juice’? These calculations do not produce ‘truth’ or even, necessarily, accuracy. They produce probabilistic output. 
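Molnar's point about probabilistic output can be pictured with a toy sketch. The words and probabilities below are invented purely for illustration (real models score enormous vocabularies with billions of calculations), but the mechanism is the same: the system samples a likely continuation, not a verified truth.

```python
import random

# Toy next-word probabilities after a phrase like "a glass of orange ..."
# (illustrative numbers only, not from any real model)
next_word_probs = {"juice": 0.62, "soda": 0.21, "paint": 0.02, "ideas": 0.15}

def sample_next_word(probs, rng=random.random):
    """Pick a word in proportion to its probability -- probable, not 'true'."""
    r, cumulative = rng(), 0.0
    for word, p in probs.items():
        cumulative += p
        if r < cumulative:
            return word
    return word  # fallback for floating-point rounding at the boundary

print(sample_next_word(next_word_probs))  # usually "juice", but not always
```

Even in this cartoon version, nothing guarantees accuracy: a sufficiently unlucky draw yields "paint," and no step in the process checks the answer against the world.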

Currently, the construction and operation of AI algorithms is largely outside of public view and without any public accountability. Nevertheless, school people are being pushed, both by marketers and government entities, to be seen to be in the forefront of this alleged digital revolution—turning more and more school processes over to technologists with little or no knowledge of pedagogy or school curriculum.

A lot of people call AI tools a ‘black box.’ What does that mean?

To use an old-world explanation, imagine if you said, ‘I’d like to see my child’s geography textbook.’ You might say, ‘I have some issues here.’ You could talk to somebody about it, somebody who could possibly explain those issues. But with AI, you can’t do that.

You can’t go in and say, for example, ‘How did the scoring on this work?’ The answer would be, ‘Well, we don’t know.’ ‘How do we know that this content is accurate?’ ‘Well, we don’t know that, either.’ 

Is the concern, then, that AI might make decisions in place of educators or parents? 

You can use AI to assist you in determining if a child cheated. You use it to determine whether or not a child should be in this program or that program. You can use AI to decide all kinds of things about a child, and the child is locked in with little or no recourse. Parents can complain all they want. They still can’t get the information about the basis for a decision made by AI because the principal doesn’t have it. The teacher doesn’t have it. The superintendent doesn’t have it. It’s hidden behind a proprietary curtain by a private vendor.

You advocate for a ‘pause’ in the use of AI in schools. What would that look like?

The solution would be for state legislatures to, by statute, say, in essence: Public schools in this state may not adopt artificial intelligence programs unless and until those programs are certified by this governmental entity—they’d have to create the entity. It has reviewed these programs. It has said they are safe for use, and it defines what the appropriate uses of the program are and for whom.

In other words, nothing goes in the schools until we have the statutory and regulatory framework and institutional capacity in place to independently assess AI platforms that are proposed for school use.

What can parents, or anyone else concerned about this issue, do?

Demand that your representatives take these issues seriously—first of all, to legislate a pause in the adoption of AI in schools. Period. Then they can ask their representatives to create a state entity that is designed to regulate the use of AI in schools.

This is a political problem. This is not a technical problem.

We have a long history of tech companies failing to follow their own rules, which are themselves laughably inadequate. For anybody who's seriously trying to figure out how to responsibly use AI in education, if they're not talking political action, they're not really talking. The technologists won’t save us.


Screens Are Everywhere in Schools. Do They Actually Help Kids Learn?

By Jessica Grose

Opinion Writer

A few weeks ago, a parent who lives in Texas asked me how much my kids were using screens to do schoolwork in their classrooms. She wasn’t talking about personal devices. (Smartwatches and smartphones are banned in my children’s schools during the school day, which I’m very happy about; I find any argument for allowing these devices in the classroom to be risible.) No, this parent was talking about screens that are school sanctioned, like iPads and Chromebooks issued to children individually for educational activities.

I’m embarrassed to say that I couldn’t answer her question because I had never asked or even thought about asking. Partly because the Covid-19 era made screens imperative in an instant — as one ed-tech executive told my colleague Natasha Singer in 2021, the pandemic “sped the adoption of technology in education by easily five to 10 years.” In the early Covid years, when my older daughter started using a Chromebook to do assignments for second and third grade, I was mostly just relieved that she had great teachers and seemed to be learning what she needed to know. By the time she was in fifth grade and the world was mostly back to normal, I knew she took her laptop to school for in-class assignments, but I never asked for specifics about how devices were being used. I trusted her teachers and her school implicitly.

In New York State, ed tech is often discussed as an equity problem — with good reason: At home, less privileged children might not have access to personal devices and high-speed internet that would allow them to complete digital assignments. But in our learn-to-code society, in which computer skills are seen as a meal ticket and the humanities as a ticket to the unemployment line, there seems to be less chatter about whether there are too many screens in our kids’ day-to-day educational environment beyond the classes that are specifically tech focused. I rarely heard details about what these screens are adding to our children’s literacy, math, science or history skills.

And screens truly are everywhere. For example, according to 2022 data from the National Assessment of Educational Progress, only about 8 percent of eighth graders in public schools said their math teachers “never or hardly ever” used computers or digital devices to teach math, 37 percent said their math teachers used this technology half or more than half the time, and 44 percent said their math teachers used this technology all or most of the time.

As is often the case with rapid change, “the speed at which new technologies and intervention models are reaching the market has far outpaced the ability of policy researchers to keep up with evaluating them,” according to a dazzlingly thorough review of the research on education technology by Maya Escueta, Andre Joshua Nickow, Philip Oreopoulos and Vincent Quan published in The Journal of Economic Literature in 2020.

Despite the relative paucity of research, particularly on in-class use of tech, Escueta and her co-authors put together “a comprehensive list of all publicly available studies on technology-based education interventions that report findings from studies following either of two research designs, randomized controlled trials or regression discontinuity designs.”

They found that increasing access to devices didn’t always lead to positive academic outcomes. In a couple of cases, it just increased the amount of time kids were spending on devices playing games. They wrote, “We found that simply providing students with access to technology yields largely mixed results. At the K-12 level, much of the experimental evidence suggests that giving a child a computer may have limited impacts on learning outcomes but generally improves computer proficiency and other cognitive outcomes.”

Some of the most promising research is around computer-assisted learning, which the researchers defined as “computer programs and other software applications designed to improve academic skills.” They cited a 2016 randomized study of 2,850 seventh-grade math students in Maine who used an online homework tool. The authors of that study “found that the program improved math scores for treatment students by 0.18 standard deviations. This impact is particularly noteworthy, given that treatment students used the program, on average, for less than 10 minutes per night, three to four nights per week,” according to Escueta and her co-authors.
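For readers unused to standardized effect sizes, a quick back-of-the-envelope conversion may help. The numbers below (a test with a standard deviation of 100 points, normally distributed scores) are assumptions chosen for illustration, not figures from the Maine study, which reported its result only in standard-deviation units:

```python
from statistics import NormalDist

EFFECT_SIZE_SD = 0.18  # the gain reported for the online homework tool

def raw_gain(effect_size_sd: float, test_sd: float) -> float:
    """Convert a standardized effect size into raw test-score points."""
    return effect_size_sd * test_sd

def percentile_after_shift(effect_size_sd: float) -> float:
    """Percentile an average (50th-percentile) student would reach after
    the shift, assuming normally distributed scores."""
    return 100 * NormalDist().cdf(effect_size_sd)

print(raw_gain(EFFECT_SIZE_SD, 100))      # 18 points on the hypothetical scale
print(percentile_after_shift(EFFECT_SIZE_SD))  # roughly the 57th percentile
```

In other words, under these assumptions the average student moves from the 50th to about the 57th percentile: a modest but real gain, which is why the authors found the small nightly time investment noteworthy.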

They also explained that in the classroom, computer programs may help teachers meet the needs of students who are at different levels, since “when confronted with a wide range of student ability, teachers often end up teaching the core curriculum and tailoring instruction to the middle of the class.” A good program, they found, could help provide individual attention and skill building for kids at the bottom and the top, as well. There are computer programs for reading comprehension that have shown similar positive results in the research. Anecdotally: My older daughter practices her Spanish language skills using an app, and she hand-writes Spanish vocabulary words on index cards. The combination seems to be working well for her.

Though their review was published in 2020, before the data was out on our grand remote-learning experiment, Escueta and her co-authors found that fully online remote learning did not work as well as hybrid or in-person school. I called Thomas Dee, a professor at Stanford’s Graduate School of Education, who said that in light of earlier studies “and what we’re coming to understand about the long-lived effects of the pandemic on learning, it underscores for me that there’s a social dimension to learning that we ignore at our peril. And I think technology can often strip that away.”

Still, Dee summarized the entire topic of ed tech to me this way: “I don’t want to be black and white about this. I think there are really positive things coming from technology.” But he said that they are “meaningful supports on the margins, not fundamental changes in the modality of how people learn.”

I’d add that the implementation of any technology also matters a great deal; any educational tool can be great or awful, depending on how it’s used.

I’m neither a tech evangelist nor a Luddite. (Though I haven’t even touched on the potential implications of classroom teaching with artificial intelligence, a technology that, in other contexts, has so much destructive potential.) What I do want is the most effective educational experience for all kids.

Because there’s such a lag in the data and a lack of granularity to the information we do have, I want to hear from my readers: If you’re a teacher or a parent of a current K-12 student, I want to know how you and they are using technology — the good and the bad. Please complete the questionnaire below and let me know. I may reach out to you for further conversation.

Do your children or your students use technology in the classroom?

If you’re a parent, an educator or both, I want to hear from you.

Jessica Grose is an Opinion writer for The Times, covering family, religion, education, culture and the way we live now.
