

School Climate

45 Survey Questions to Understand Student Engagement in Online Learning

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.

Download Toolkit: 9 Virtual Learning Resources to Engage Students, Families, and Staff

45 Questions to Understand Student Engagement in Online Learning

For students (grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.
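Disaggregating results of this kind is, at its core, a group-by-and-average over survey responses. The sketch below illustrates the idea in plain Python; the field names and sample records are hypothetical and are not Panorama's actual data model.

```python
# A minimal sketch of disaggregating survey results: grouping Likert-scale
# scores by an attribute (grade level, school, etc.) and averaging per group.
# All field names and data here are made up for illustration.
from collections import defaultdict

def disaggregate(responses, by):
    """Average the numeric 'score' of each response, grouped by the `by` field."""
    totals = defaultdict(lambda: [0, 0])  # group -> [running sum, count]
    for r in responses:
        group = r[by]
        totals[group][0] += r["score"]
        totals[group][1] += 1
    return {g: s / n for g, (s, n) in totals.items()}

responses = [
    {"grade": "3-5", "school": "A", "score": 4},
    {"grade": "3-5", "school": "B", "score": 2},
    {"grade": "6-12", "school": "A", "score": 5},
]

print(disaggregate(responses, by="grade"))   # {'3-5': 3.0, '6-12': 5.0}
print(disaggregate(responses, by="school"))  # {'A': 4.5, 'B': 2.0}
```

Because the same function works for any attribute on a response record, a single response table can drive breakdowns by topic, demographic group, grade level, or school.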

To learn more about Panorama's survey platform, get in touch with our team.

Related Articles

44 Questions to Ask Students, Families, and Staff During the Pandemic

Identify ways to support students, families, and staff in your school district during the pandemic with these 44 questions.

Engaging Your School Community in Survey Results (Q&A Ep. 4)

Learn how to engage principals, staff, families, and students in the survey results when running a stakeholder feedback program around school climate.

Strategies to Promote Positive Student-Teacher Relationships

Explore four strategies for building strong student-teacher relationships in your school.




What we know about online learning and the homework gap amid the pandemic

A sixth grader completes his homework online in his family's living room in Boston on March 31, 2020.

America’s K-12 students are returning to classrooms this fall after 18 months of virtual learning at home during the COVID-19 pandemic. Some students who lacked the home internet connectivity needed to finish schoolwork during this time – an experience often called the “homework gap” – may continue to feel the effects this school year.

Here is what Pew Research Center surveys found about the students most likely to be affected by the homework gap and their experiences learning from home.


Methodology for each Pew Research Center poll can be found at the links in the post.

With the exception of the 2018 survey, everyone who took part in the surveys is a member of the Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way, nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the ATP’s methodology.
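The idea behind weighting a sample to population benchmarks can be sketched with a one-dimensional post-stratification example: each category's responses receive a weight equal to the ratio of its population share to its sample share. The categories and numbers below are made up for illustration; the ATP's actual weighting adjusts many dimensions jointly (raking), as described in its methodology.

```python
# A simplified, one-dimensional post-stratification sketch (hypothetical data).
# Real survey weighting balances many variables simultaneously.
def poststratify(sample_counts, population_shares):
    """Return a weight per category: population share / sample share."""
    n = sum(sample_counts.values())
    return {c: population_shares[c] / (sample_counts[c] / n)
            for c in sample_counts}

# Hypothetical example: the sample over-represents college graduates.
weights = poststratify(
    sample_counts={"college_grad": 600, "non_grad": 400},
    population_shares={"college_grad": 0.35, "non_grad": 0.65},
)
print(weights)  # college_grad responses are down-weighted, non_grad up-weighted
```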

The 2018 data on U.S. teens comes from a Center poll of 743 U.S. teens ages 13 to 17 conducted March 7 to April 10, 2018, using the NORC AmeriSpeak panel. AmeriSpeak is a nationally representative, probability-based panel of the U.S. household population. Randomly selected U.S. households are sampled with a known, nonzero probability of selection from the NORC National Frame, and then contacted by U.S. mail, telephone or face-to-face interviewers. Read more details about the NORC AmeriSpeak panel methodology.

Around nine-in-ten U.S. parents with K-12 children at home (93%) said their children have had some online instruction since the coronavirus outbreak began in February 2020, and 30% of these parents said it has been very or somewhat difficult for them to help their children use technology or the internet as an educational tool, according to an April 2021 Pew Research Center survey.

A bar chart showing that mothers and parents with lower incomes are more likely than fathers and those with higher incomes to have trouble helping their children with tech for online learning

Gaps existed for certain groups of parents. For example, parents with lower and middle incomes (36% and 29%, respectively) were more likely to report that this was very or somewhat difficult, compared with just 18% of parents with higher incomes.

This challenge was also prevalent for parents in certain types of communities – 39% of rural residents and 33% of urban residents said they have had at least some difficulty, compared with 23% of suburban residents.

Around a third of parents with children whose schools were closed during the pandemic (34%) said that their child encountered at least one technology-related obstacle to completing their schoolwork during that time. In the April 2021 survey, the Center asked parents of K-12 children whose schools had closed at some point about whether their children had faced three technology-related obstacles. Around a quarter of parents (27%) said their children had to do schoolwork on a cellphone, 16% said their child was unable to complete schoolwork because of a lack of computer access at home, and another 14% said their child had to use public Wi-Fi to finish schoolwork because there was no reliable connection at home.

Parents with lower incomes whose children’s schools closed amid COVID-19 were more likely to say their children faced technology-related obstacles while learning from home. Nearly half of these parents (46%) said their child faced at least one of the three obstacles to learning asked about in the survey, compared with 31% of parents with midrange incomes and 18% of parents with higher incomes.

A chart showing that parents with lower incomes are more likely than parents with higher incomes to say their children have faced tech-related schoolwork challenges in the pandemic

Of the three obstacles asked about in the survey, parents with lower incomes were most likely to say that their child had to do their schoolwork on a cellphone (37%). About a quarter said their child was unable to complete their schoolwork because they did not have computer access at home (25%), or that they had to use public Wi-Fi because they did not have a reliable internet connection at home (23%).

A Center survey conducted in April 2020 found that, at that time, 59% of parents with lower incomes who had children engaged in remote learning said their children would likely face at least one of the obstacles asked about in the 2021 survey.

A year into the outbreak, an increasing share of U.S. adults said that K-12 schools have a responsibility to provide all students with laptop or tablet computers in order to help them complete their schoolwork at home during the pandemic. About half of all adults (49%) said this in the spring 2021 survey, up 12 percentage points from a year earlier. An additional 37% of adults said that schools should provide these resources only to students whose families cannot afford them, and just 13% said schools do not have this responsibility.

A bar chart showing that roughly half of adults say schools have responsibility to provide technology to all students during pandemic

While larger shares of both political parties in April 2021 said K-12 schools have a responsibility to provide computers to all students in order to help them complete schoolwork at home, there was a 15-point change among Republicans: 43% of Republicans and those who lean to the Republican Party said K-12 schools have this responsibility, compared with 28% last April. In the 2021 survey, 22% of Republicans also said schools do not have this responsibility at all, compared with 6% of Democrats and Democratic leaners.

Even before the pandemic, Black teens and those living in lower-income households were more likely than other groups to report trouble completing homework assignments because they did not have reliable technology access. Nearly one-in-five teens ages 13 to 17 (17%) said they are often or sometimes unable to complete homework assignments because they do not have reliable access to a computer or internet connection, a 2018 Center survey of U.S. teens found.

A bar chart showing that in 2018, Black teens and those from lower-income households were especially likely to be impacted by the digital 'homework gap'

One-quarter of Black teens said they were at least sometimes unable to complete their homework due to a lack of digital access, including 13% who said this happened to them often. Just 4% of White teens and 6% of Hispanic teens said this often happened to them. (There were not enough Asian respondents in the survey sample to be broken out into a separate analysis.)

A wide gap also existed by income level: 24% of teens whose annual family income was less than $30,000 said the lack of a dependable computer or internet connection often or sometimes prohibited them from finishing their homework, but that share dropped to 9% among teens who lived in households earning $75,000 or more a year.


Katherine Schaeffer is a research analyst at Pew Research Center.


© 2024 Pew Research Center

Elsevier - PMC COVID-19 Collection

A systematic review of research on online teaching and learning from 2009 to 2018

Abstract

Systematic reviews of online learning research were conducted in the 1990s and early 2000s. However, no review has examined the broader research themes in online learning in the last decade. This systematic review addresses that gap by examining 619 research articles on online learning published in twelve journals in the last decade. These studies were examined for publication trends and patterns, research themes, research methods, and research settings, and compared with the research themes from the previous decades. While the number of studies on online learning decreased slightly in 2015 and 2016, it continued to increase in 2017 and 2018. The majority of the studies were quantitative in nature and were conducted in higher education. Online learning research was categorized into twelve themes, and a framework across learner, course and instructor, and organizational levels was developed. Online learner characteristics and online engagement were examined in a high number of studies, consistent with three of the prior systematic reviews. However, there is still a need for more research on organization-level topics, such as leadership, policy, and management and access, culture, equity, inclusion, and ethics, as well as on online instructor characteristics.

  • Twelve online learning research themes were identified in 2009–2018.
  • A framework with learner, course and instructor, and organizational levels was used.
  • Online learner characteristics and engagement were the most examined themes.
  • The majority of the studies used quantitative research methods and were conducted in higher education.
  • There is a need for more research on organization-level topics.

1. Introduction

Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase (Allen & Seaman, 2017), and so has the research on online learning. Review studies have been conducted on specific areas of online learning such as innovations in online learning strategies (Davis et al., 2018), empirical MOOC literature (Liyanagunawardena et al., 2013; Veletsianos & Shepherdson, 2016; Zhu et al., 2018), quality in online education (Esfijani, 2018), accessibility in online higher education (Lee, 2017), synchronous online learning (Martin et al., 2017), K-12 preparation for online teaching (Moore-Adams et al., 2016), polychronicity in online learning (Capdeferro et al., 2014), meaningful learning research in elearning and online learning environments (Tsai, Shen, & Chiang, 2013), problem-based learning in elearning and online learning environments (Tsai & Chiang, 2013), asynchronous online discussions (Thomas, 2013), self-regulated learning in online learning environments (Tsai, Shen, & Fan, 2013), game-based learning in online learning environments (Tsai & Fan, 2013), and online course dropout (Lee & Choi, 2011). While review studies have been conducted on specific online learning topics, very few have examined the broader research themes in online learning.

2. Systematic Reviews of Distance Education and Online Learning Research

Distance education has evolved from offline to online settings with access to the internet, and COVID-19 has made online learning a common delivery method across the world. Tallent-Runnels et al. (2006) reviewed research from the late 1990s to the early 2000s, Berge and Mrozowski (2001) reviewed research from 1990 to 1999, and Zawacki-Richter et al. (2009) reviewed research from 2000 to 2008 on distance education and online learning. Table 1 shows the research themes from previous systematic reviews of online learning research. Some themes re-occur across the various reviews, and new themes also emerge. Though reviews were conducted in the 1990s and early 2000s, no review has examined the broader research themes in online learning in the last decade; hence the need for this systematic review, which identifies the research themes in online learning from 2009 to 2018. In the following sections, we review these systematic review studies in detail.

Comparison of online learning research themes from previous studies.

Columns: 1990–1999 (Berge & Mrozowski, 2001); 1993–2004 (Tallent-Runnels et al., 2006); 2000–2008 (Zawacki-Richter et al., 2009). For each review, the table lists the themes with the most and the lowest number of studies.

2.1. Distance education research themes, 1990 to 1999 (Berge & Mrozowski, 2001)

Berge and Mrozowski (2001) reviewed 890 research articles and dissertation abstracts on distance education from 1990 to 1999. The four distance education journals chosen by the authors to represent distance education were the American Journal of Distance Education, Distance Education, Open Learning, and the Journal of Distance Education. This review overlapped in dates with the Tallent-Runnels et al. (2006) study. Berge and Mrozowski (2001) categorized the articles according to Sherry's (1996) ten themes of research issues in distance education: redefining the roles of instructors and students, technologies used, issues of design, strategies to stimulate learning, learner characteristics and support, issues related to operations, policies, and administration, access and equity, and costs and benefits.

In the Berge and Mrozowski (2001) study, more than 100 studies focused on each of the three themes: (1) design issues, (2) learner characteristics, and (3) strategies to increase interactivity and active learning. By design issues, the authors focused on instructional systems design and focused on topics such as content requirement, technical constraints, interactivity, and feedback. The next theme, strategies to increase interactivity and active learning, were closely related to design issues and focused on students’ modes of learning. Learner characteristics focused on accommodating various learning styles through customized instructional theory. Less than 50 studies focused on the three least examined themes: (1) cost-benefit tradeoffs, (2) equity and accessibility, and (3) learner support. Cost-benefit trade-offs focused on the implementation costs of distance education based on school characteristics. Equity and accessibility focused on the equity of access to distance education systems. Learner support included topics such as teacher to teacher support as well as teacher to student support.

2.2. Online learning research themes, 1993 to 2004 (Tallent-Runnels et al., 2006)

Tallent-Runnels et al. (2006) reviewed research on online instruction from 1993 to 2004. They reviewed 76 articles focused on online learning by searching five databases: ERIC, PsycINFO, ContentFirst, Education Abstracts, and WilsonSelect. Tallent-Runnels et al. (2006) categorized research into four themes: (1) course environment, (2) learners' outcomes, (3) learners' characteristics, and (4) institutional and administrative factors. The first theme, which the authors describe as course environment (n = 41, 53.9%), is an overarching theme that includes classroom culture, structural assistance, success factors, online interaction, and evaluation.

For their second theme, Tallent-Runnels et al. (2006) found that studies focused on questions involving the process of teaching and learning and on methods to explore cognitive and affective learner outcomes (n = 29, 38.2%). The authors stated that they found the research designs flawed and lacking rigor. However, the literature comparing traditional and online classrooms found both delivery systems to be adequate. Another research theme focused on learners' characteristics (n = 12, 15.8%) and the synergy of learners, design of the online course, and system of delivery. Research findings revealed that online learners were mainly non-traditional, Caucasian, had different learning styles, and were highly motivated to learn. The final theme they reported was institutional and administrative factors (n = 13, 17.1%) in online learning. Their findings revealed a lack of scholarly research in this area; most institutions did not have formal policies in place for course development or for faculty and student support in training and evaluation. Their research confirmed that when universities offered online courses, student enrollment numbers improved.

2.3. Distance education research themes, 2000 to 2008 (Zawacki-Richter et al., 2009)

Zawacki-Richter et al. (2009) reviewed 695 articles on distance education from 2000 to 2008 using the Delphi method for consensus in identifying areas and classified the literature from five prominent journals. The five journals selected due to their wide scope in research in distance education included Open Learning, Distance Education, American Journal of Distance Education, the Journal of Distance Education, and the International Review of Research in Open and Distributed Learning. The reviewers examined the main focus of research and identified gaps in distance education research in this review.

Zawacki-Richter et al. (2009) classified the studies into macro, meso and micro levels focusing on 15 areas of research. The five areas of the macro-level addressed: (1) access, equity and ethics to deliver distance education for developing nations and the role of various technologies to narrow the digital divide, (2) teaching and learning drivers, markets, and professional development in the global context, (3) distance delivery systems and institutional partnerships and programs and impact of hybrid modes of delivery, (4) theoretical frameworks and models for instruction, knowledge building, and learner interactions in distance education practice, and (5) the types of preferred research methodologies. The meso-level focused on seven areas that involve: (1) management and organization for sustaining distance education programs, (2) examining financial aspects of developing and implementing online programs, (3) the challenges and benefits of new technologies for teaching and learning, (4) incentives to innovate, (5) professional development and support for faculty, (6) learner support services, and (7) issues involving quality standards and the impact on student enrollment and retention. The micro-level focused on three areas: (1) instructional design and pedagogical approaches, (2) culturally appropriate materials, interaction, communication, and collaboration among a community of learners, and (3) focus on characteristics of adult learners, socio-economic backgrounds, learning preferences, and dispositions.

The top three research themes in this review by Zawacki-Richter et al. (2009) were interaction and communities of learning (n = 122, 17.6%), instructional design (n = 121, 17.4%), and learner characteristics (n = 113, 16.3%). The lowest number of studies (less than 3%) examined the following research themes: management and organization (n = 18), research methods in DE and knowledge transfer (n = 13), globalization of education and cross-cultural aspects (n = 13), innovation and change (n = 13), and costs and benefits (n = 12).

2.4. Online learning research themes

These three systematic reviews provide a broad understanding of distance education and online learning research themes from 1990 to 2008. However, the number of research studies on online learning has increased in this decade, and there is a need to identify the research themes recently examined. Based on the previous systematic reviews (Berge & Mrozowski, 2001; Hung, 2012; Tallent-Runnels et al., 2006; Zawacki-Richter et al., 2009), online learning research in this study is grouped into twelve research themes: Learner Characteristics; Instructor Characteristics; Course or Program Design and Development; Course Facilitation; Engagement; Course Assessment; Evaluation and Quality Assurance; Course Technologies; Access, Culture, Equity, Inclusion, and Ethics; Leadership, Policy and Management; Instructor and Learner Support; and Learner Outcomes. Table 2 below describes each of the research themes, and from these themes a framework is derived in Fig. 1.

Research themes in online learning.

1. Learner Characteristics: Focuses on understanding learner characteristics and how online learning can be designed and delivered to meet learners' needs. Online learner characteristics can be broadly categorized into demographic, academic, cognitive, affective, self-regulation, and motivational characteristics.

2. Learner Outcomes: Learner outcomes are statements that specify what the learner will achieve at the end of the course or program. Examining learner outcomes such as success, retention, and dropout is critical in online courses.

3. Engagement: Engaging the learner in the online course is vitally important, as learners are separated from the instructor and peers in the online setting. Engagement is examined through the lens of interaction, participation, community, collaboration, communication, involvement, and presence.

4. Course or Program Design and Development: Course design and development is critical in online learning, as it engages and assists students in achieving the learner outcomes. Several models and processes are used to develop online courses, employing different design elements to meet student needs.

5. Course Facilitation: The delivery or facilitation of the course is as important as course design. Facilitation strategies used in delivering the course, such as communication and modeling practices, are examined under this theme.

6. Course Assessment: Course assessments are adapted and delivered in an online setting. Formative assessments, peer assessments, differentiated assessments, learner choice in assessments, feedback systems, online proctoring, plagiarism in online learning, and alternate assessments such as eportfolios are examined.

7. Evaluation and Quality Assurance: Evaluation is making a judgment on a process, product, or program, either during or at the end. There is a need for research on evaluation and quality in online courses. This has been examined through course evaluations, surveys, analytics, social networks, and pedagogical assessments. Quality assessment rubrics such as Quality Matters have also been researched.

8. Course Technologies: A number of online course technologies, such as learning management systems, online textbooks, online audio and video tools, collaborative tools, and social networks to build online community, have been the focus of research.

9. Instructor Characteristics: With the increase in online courses, there has also been an increase in the number of instructors teaching online. Instructor characteristics can be examined through their experience, satisfaction, and roles in online teaching.

10. Institutional Support: Support for online learning is examined as both learner support and instructor support. Online students need social, academic, and cognitive support to be successful online learners. Online instructors need pedagogical and technological support to be successful online instructors.

11. Access, Culture, Equity, Inclusion, and Ethics: Cross-cultural online learning is gaining importance, along with access in global settings. In addition, providing inclusive opportunities for all learners, in ethical ways, is being examined.

12. Leadership, Policy and Management: Leadership support is essential for the success of online learning. Leaders' perspectives, challenges, and strategies are examined, and policy- and governance-related research is also being studied.

Fig. 1. Online learning research themes framework.

The collection of research themes is presented as a framework in Fig. 1 . The themes are organized by domain or level to underscore the nested relationship that exists. As evidenced by the assortment of themes, research can focus on any domain of delivery or associated context. The “Learner” domain captures characteristics and outcomes related to learners and their interaction within the courses. The “Course and Instructor” domain captures elements about the broader design of the course and facilitation by the instructor, and the “Organizational” domain acknowledges the contextual influences on the course. It is important to note as well that due to the nesting, research themes can cross domains. For example, the broader cultural context may be studied as it pertains to course design and development, and institutional support can include both learner support and instructor support. Likewise, engagement research can involve instructors as well as learners.

In this introduction section, we have reviewed three systematic reviews on online learning research ( Berge & Mrozowski, 2001 ; Tallent-Runnels et al., 2006 ; Zawacki-Richter et al., 2009 ). Based on these reviews and other research, we have derived twelve themes to develop an online learning research framework which is nested in three levels: learner, course and instructor, and organization.

2.5. Purpose of this research

In two of the three previous reviews, design, learner characteristics, and interaction were examined in the highest number of studies. On the other hand, cost-benefit tradeoffs, equity and accessibility, institutional and administrative factors, and globalization and cross-cultural aspects were examined in the fewest. One explanation may be a function of nesting: studies at the Organizational and Course levels may encompass several courses, or many more participants within courses. However, while some research themes recur, others vary across time, suggesting that the importance of research themes rises and falls over time. A critical examination of trends in themes is therefore helpful for understanding where research is needed most. Since no recent study has examined online learning research themes in the last decade, this study strives to address that gap by focusing on recent research themes in the literature and by reviewing research methods and settings. Notably, one goal is also to compare findings from this decade to the previous review studies. Overall, the purpose of this study is to examine publication trends in online learning research during the last ten years and compare them with the themes identified in previous review studies. Due to the continued growth of online learning research into new contexts and among new researchers, we also examine the research methods and settings found in the studies of this review.

The following research questions are addressed in this study.

  • 1. What percentage of the population of articles published in the journals reviewed from 2009 to 2018 was empirical research on online learning?
  • 2. What is the frequency of online learning research themes in the empirical online learning articles of the journals reviewed from 2009 to 2018?
  • 3. What is the frequency of research methods and settings that researchers employed in the empirical online learning articles of the journals reviewed from 2009 to 2018?

3. Method

The five-step systematic review process described in the U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse Procedures and Standards Handbook, Version 4.0 ( 2017 ) was used: (a) developing the review protocol, (b) identifying relevant literature, (c) screening studies, (d) reviewing articles, and (e) reporting findings.

3.1. Data sources and search strategies

The Education Research Complete database was searched for articles published between 2009 and 2018, using both the Title and Keyword functions with the following search terms:

"online learning" OR "online teaching" OR "online program" OR "online course" OR "online education"
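
As an illustrative sketch (not part of the study's actual tooling), the title/keyword screen can be expressed in a few lines of Python. The phrase list is the search string above; the record structure and sample records are hypothetical stand-ins for whatever the database export provides.

```python
# Hedged sketch: apply the boolean search string to candidate records.
SEARCH_PHRASES = [
    "online learning", "online teaching", "online program",
    "online course", "online education",
]

def matches_search(record):
    """Return True if any search phrase appears in the title or keywords."""
    text = " ".join([record.get("title", "")] + record.get("keywords", [])).lower()
    return any(phrase in text for phrase in SEARCH_PHRASES)

# Hypothetical records, for illustration only.
records = [
    {"title": "Engagement in an Online Course", "keywords": ["presence"]},
    {"title": "Classroom Management Strategies", "keywords": ["face-to-face"]},
    {"title": "Completion in MOOCs", "keywords": ["online education", "dropout"]},
]
hits = [r["title"] for r in records if matches_search(r)]
```

In practice the OR-of-phrases query runs inside the database itself; the sketch only mirrors its matching logic.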

3.2. Inclusion/exclusion criteria

The initial search of online learning research among journals in the database returned more than 3000 possible articles. We therefore limited the search to journals that focus on publishing peer-reviewed online learning and educational research. Our aim was to capture the journals that published the most articles on online learning; to incorporate rigor as well, we used expert judgment to identify 12 peer-reviewed journals that publish high-quality online learning research. Dissertations and conference proceedings were excluded. To be included in this systematic review, each study had to meet all of the screening criteria described in Table 3 ; a study that failed any criterion was excluded.

Table 3. Inclusion/exclusion criteria.

| Criteria | Inclusion | Exclusion |
|---|---|---|
| Focus of the article | Online learning | Articles that did not focus on online learning |
| Journals published | Twelve identified journals | Journals outside of the 12 journals |
| Publication date | 2009 to 2018 | Prior to 2009 or after 2018 |
| Publication type | Scholarly articles of original research from peer-reviewed journals | Book chapters, technical reports, dissertations, or proceedings |
| Research method and results | An identifiable method section describing how the study was conducted and a results section with findings; quantitative and qualitative methods were included | Reviews of other articles, opinion, or discussion papers that do not discuss procedures or data analysis, such as product reviews or conceptual articles |
| Language | Journal article written in English | Other languages were not included |

3.3. Process flow selection of articles

Fig. 2 shows the process flow involved in the selection of articles. The search in the database Education Research Complete yielded an initial sample of 3332 articles. Targeting the 12 journals removed 2579 articles. After reviewing the abstracts, we removed 134 articles based on the inclusion/exclusion criteria. The final sample, consisting of 619 articles, was entered into the computer software MAXQDA ( VERBI Software, 2019 ) for coding.
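
The screening arithmetic reported in Fig. 2 can be checked directly from the counts in the text:

```python
# Selection-flow counts as reported in the text (Fig. 2).
initial_search_results = 3332   # Education Research Complete hits
outside_target_journals = 2579  # removed when limiting to the 12 journals
failed_abstract_screen = 134    # removed against inclusion/exclusion criteria

after_journal_filter = initial_search_results - outside_target_journals
final_sample = after_journal_filter - failed_abstract_screen

print(after_journal_filter, final_sample)  # prints: 753 619
```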

Fig. 2. Flowchart of online learning research selection.

3.4. Developing review protocol

A review protocol was designed as a codebook in MAXQDA ( VERBI Software, 2019 ) by the three researchers. The codebook was developed based on findings from the previous review studies and from the initial screening of the articles in this review. It included the 12 research themes listed earlier in Table 2 (learner characteristics; instructor characteristics; course or program design and development; course facilitation; engagement; course assessment; evaluation and quality assurance; course technologies; access, culture, equity, inclusion, and ethics; leadership, policy and management; instructor and learner support; and learner outcomes), four research settings (higher education, continuing education, K-12, corporate/military), and three research designs (quantitative, qualitative, and mixed methods). Fig. 3 below is a screenshot of MAXQDA used for the coding process.

Fig. 3. Codebook from MAXQDA.

3.5. Data coding

Research articles were coded by two researchers in MAXQDA. The two researchers independently coded 10% of the articles, then discussed discrepancies and updated the coding framework. The second author, a doctoral student, coded the remaining studies, and the researchers met bi-weekly to address coding questions that emerged. After the first phase of coding, we found that more than 100 studies fell into each of the Learner Characteristics and Engagement categories, so we pursued a second phase of coding to reexamine these two themes. Learner Characteristics was classified into the subthemes of Academic, Affective, Motivational, Self-regulation, Cognitive, and Demographic Characteristics. Engagement was classified into the subthemes of Collaboration, Communication, Community, Involvement, Interaction, Participation, and Presence.
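
The study resolves coding disagreements by discussion rather than reporting a reliability coefficient. For readers who want a quantitative check on a double-coded sample, Cohen's kappa is a standard choice; the sketch below (with hypothetical theme labels) is one minimal way to compute it, and is illustrative rather than part of the study's procedure.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders assigning one nominal label per article."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both coders pick the same label at random.
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical double-coded sample of five articles.
coder_1 = ["Engagement", "Engagement", "Learner Characteristics",
           "Engagement", "Course Technologies"]
coder_2 = ["Engagement", "Learner Characteristics", "Learner Characteristics",
           "Engagement", "Course Technologies"]
kappa = cohens_kappa(coder_1, coder_2)
```

Here observed agreement is 4/5 and chance agreement 9/25, giving kappa = (0.8 − 0.36) / 0.64 = 0.6875.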

3.6. Data analysis

Frequency tables were generated for each variable so that outliers could be examined and narrative data could be collapsed into categories. Once cleaned and collapsed into a reasonable number of categories, descriptive statistics were used to describe each of the coded elements. We first present the frequencies of publications related to online learning in the 12 journals. The total number of articles for each journal (collectively, the population) was hand-counted from journal websites, excluding editorials and book reviews. The publication trend of online learning research from 2009 to 2018 was also depicted. Then, descriptive information for the 12 themes, including the subthemes of Learner Characteristics and Engagement, is provided. Finally, research themes are elaborated by research setting and methodology.
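
The descriptive percentages in the results tables follow directly from frequency counts over the 619-article sample. As a small check, using a subset of the theme counts reported later in Table 5:

```python
# Frequencies for a subset of the 12 themes (values from Table 5);
# the full review sample contains 619 articles.
SAMPLE_SIZE = 619
theme_counts = {
    "Engagement": 179,
    "Learner Characteristics": 134,
    "Evaluation and Quality Assurance": 38,
    "Instructor Characteristics": 21,
}

theme_percentages = {
    theme: round(100 * count / SAMPLE_SIZE, 2)
    for theme, count in theme_counts.items()
}
```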

4. Results

4.1. Publication trends on online learning

Publication patterns of the 619 articles reviewed from the 12 journals are presented in Table 4 . International Review of Research in Open and Distributed Learning had the highest number of publications in this review. Overall, about 8% of the articles appearing in these twelve journals consisted of online learning publications; however, several journals had concentrations of online learning articles totaling more than 20%.

Table 4. Empirical online learning research articles by journal, 2009–2018.

| Journal Name | Frequency of Empirical Online Learning Research | Percent of Sample | Percent of Journal's Total Articles |
|---|---|---|---|
| International Review of Research in Open and Distributed Learning | 152 | 24.40 | 22.55 |
| Internet & Higher Education | 84 | 13.48 | 26.58 |
| Computers & Education | 75 | 12.04 | 18.84 |
| Online Learning | 72 | 11.56 | 3.25 |
| Distance Education | 64 | 10.27 | 25.10 |
| Journal of Online Learning & Teaching | 39 | 6.26 | 11.71 |
| Journal of Educational Technology & Society | 36 | 5.78 | 3.63 |
| Quarterly Review of Distance Education | 24 | 3.85 | 4.71 |
| American Journal of Distance Education | 21 | 3.37 | 9.17 |
| British Journal of Educational Technology | 19 | 3.05 | 1.93 |
| Educational Technology Research & Development | 19 | 3.05 | 10.80 |
| Australasian Journal of Educational Technology | 14 | 2.25 | 2.31 |
| Total | 619 | 100.0 | 8.06 |

Note. Journal's total article count excludes reviews and editorials.

The publication trend of online learning research is depicted in Fig. 4 . When disaggregated by year, the total frequency of publications shows an increasing trend: online learning articles increased throughout the decade, with a local peak in 2014, and the greatest number of online learning articles ( n  = 86) occurred most recently, in 2018.

Fig. 4. Online learning publication trends by year.

4.2. Online learning research themes that appeared in the selected articles

The publications were categorized into the twelve research themes identified in Fig. 1 . The frequency counts and percentages of the research themes are provided in Table 5 below. A majority of the research is categorized into the Learner domain; the fewest articles appear in the Organization domain.

Table 5. Research themes in the online learning publications from 2009 to 2018.

| Research Theme | Frequency | Percentage |
|---|---|---|
| Engagement | 179 | 28.92 |
| Learner Characteristics | 134 | 21.65 |
| Learner Outcome | 32 | 5.17 |
| Evaluation and Quality Assurance | 38 | 6.14 |
| Course Technologies | 35 | 5.65 |
| Course Facilitation | 34 | 5.49 |
| Course Assessment | 30 | 4.85 |
| Course Design and Development | 27 | 4.36 |
| Instructor Characteristics | 21 | 3.39 |
| Institutional Support | 33 | 5.33 |
| Access, Culture, Equity, Inclusion, and Ethics | 29 | 4.68 |
| Leadership, Policy, and Management | 27 | 4.36 |

The specific themes of Engagement ( n  = 179, 28.92%) and Learner Characteristics ( n  = 134, 21.65%) were most often examined in publications. These two themes were further coded to identify sub-themes, which are described in the next two sections. Publications focusing on Instructor Characteristics ( n  = 21, 3.39%) were least common in the dataset.

4.2.1. Research on engagement

The largest number of studies focused on engagement in online learning, which the literature refers to and examines through a variety of terms; hence, we explore this category in more detail. In this review, we categorized the articles into seven sub-themes: presence, interaction, community, participation, collaboration, involvement, and communication. We use "involvement" for articles whose authors used the term engagement broadly without further description. Table 6 below provides the description, frequency, and percentage of the studies related to each sub-theme.

Table 6. Research sub-themes on engagement.

| Sub-theme | Description | Frequency | Percentage |
|---|---|---|---|
| Presence | Learning experience through social, cognitive, and teaching presence | 50 | 8.08 |
| Interaction | Process of interacting with peers, instructor, or content that results in learners' understanding or behavior | 43 | 6.95 |
| Community | Sense of belonging within a group | 25 | 4.04 |
| Participation | Process of being actively involved | 21 | 3.39 |
| Collaboration | Working with someone to create something | 17 | 2.75 |
| Involvement | Involvement in learning; includes articles that focused broadly on engagement of learners | 14 | 2.26 |
| Communication | Process of exchanging information with the intent to share information | 9 | 1.45 |

In the sections below, we provide several examples of the different engagement sub-themes that were studied within the larger engagement theme.

Presence. This sub-theme was the most researched within engagement. With the development of the community of inquiry framework, most of the studies in this sub-theme examined social presence ( Akcaoglu & Lee, 2016 ; Phirangee & Malec, 2017 ; Wei et al., 2012 ), teaching presence ( Orcutt & Dringus, 2017 ; Preisman, 2014 ; Wisneski et al., 2015 ), and cognitive presence ( Archibald, 2010 ; Olesova et al., 2016 ).

Interaction . This was the second most studied theme under engagement. Researchers examined increasing interpersonal interactions ( Cung et al., 2018 ), learner-learner interactions ( Phirangee, 2016 ; Shackelford & Maxwell, 2012 ; Tawfik et al., 2018 ), peer-peer interaction ( Comer et al., 2014 ), learner-instructor interaction ( Kuo et al., 2014 ), learner-content interaction ( Zimmerman, 2012 ), interaction through peer mentoring ( Ruane & Koku, 2014 ), interaction and community building ( Thormann & Fidalgo, 2014 ), and interaction in discussions ( Ruane & Lee, 2016 ; Tibi, 2018 ).

Community. Researchers examined building community in online courses ( Berry, 2017 ), supporting a sense of community ( Jiang, 2017 ), building an online learning community of practice ( Cho, 2016 ), building an academic community ( Glazer & Wanstreet, 2011 ; Nye, 2015 ; Overbaugh & Nickel, 2011 ), and examining connectedness and rapport in an online community ( Bolliger & Inan, 2012 ; Murphy & Rodríguez-Manzanares, 2012 ; Slagter van Tryon & Bishop, 2012 ).

Participation. Researchers examined engagement through participation in a number of studies. Some of the topics include participation patterns in online discussions ( Marbouti & Wise, 2016 ; Wise et al., 2012 ), participation in MOOCs ( Ahn et al., 2013 ; Saadatmand & Kumpulainen, 2014 ), features that influence students' online participation ( Rye & Støkken, 2012 ), and active participation.

Collaboration. Researchers examined engagement through collaborative learning. Specific studies focused on cross-cultural collaboration ( Kumi-Yeboah, 2018 ; Yang et al., 2014 ), how virtual teams collaborate ( Verstegen et al., 2018 ), types of collaboration teams ( Wicks et al., 2015 ), tools for collaboration ( Boling et al., 2014 ), and support for collaboration ( Kopp et al., 2012 ).

Involvement. Researchers examined engaging learners through involvement in various learning activities ( Cundell & Sheepy, 2018 ), student engagement through various measures ( Dixson, 2015 ), how instructors involved students in learning ( O'Shea et al., 2015 ), different strategies to engage the learner ( Amador & Mederer, 2013 ), and the design of emotionally engaging online environments ( Koseoglu & Doering, 2011 ).

Communication. Researchers examined communication in online learning in studies using social network analysis ( Ergün & Usluel, 2016 ), using informal communication tools such as Facebook for class discussion ( Kent, 2013 ), and using various modes of communication ( Cunningham et al., 2010 ; Rowe, 2016 ). Studies have also focused on both asynchronous and synchronous aspects of communication ( Swaggerty & Broemmel, 2017 ; Yamagata-Lynch, 2014 ).

4.2.2. Research on learner characteristics

The second largest theme was learner characteristics, which we explored further to identify several sub-themes. We categorized learner characteristics into self-regulation, motivational, academic, affective, cognitive, and demographic characteristics. Table 7 provides the number and percentage of studies examining each.

Table 7. Research sub-themes on learner characteristics.

| Learner Characteristics | Description | Frequency | Percentage |
|---|---|---|---|
| Self-regulation Characteristics | Controlling one's behavior, emotions, and thoughts to achieve specific learning and performance goals | 54 | 8.72 |
| Motivational Characteristics | Goal-directed activity that is instigated and sustained, such as beliefs and behavioral change | 23 | 3.72 |
| Academic Characteristics | Education characteristics such as educational type and educational level | 19 | 3.07 |
| Affective Characteristics | Characteristics that describe learners' feelings or emotions, such as satisfaction | 17 | 2.75 |
| Cognitive Characteristics | Characteristics related to cognitive elements such as attention, memory, and intellect (e.g., learning strategies, learning skills) | 14 | 2.26 |
| Demographic Characteristics | Information such as age, gender, language, socioeconomic status, and cultural background | 7 | 1.13 |

Online learning differs from the traditional face-to-face classroom, and so the characteristics of online learners differ as well. Yukselturk and Top (2013) categorized the online learner profile into ten aspects: gender, age, work status, self-efficacy, online readiness, self-regulation, participation in discussion lists, participation in chat sessions, satisfaction, and achievement. Their categorization shows that online learners differ from learners in other settings in these aspects. Some of the aspects they discuss, such as participation and achievement, are covered under different research themes in this study. The sections below provide examples of the learner characteristics sub-themes that were studied.

Self-regulation. Several researchers have examined self-regulation in online learning. They found that successful online learners are academically motivated ( Artino & Stephens, 2009 ), have academic self-efficacy ( Cho & Shen, 2013 ), have grit and intention to succeed ( Wang & Baker, 2018 ), have time management and elaboration strategies ( Broadbent, 2017 ), set goals and revisit course content ( Kizilcec et al., 2017 ), and persist ( Glazer & Murphy, 2015 ). Researchers found positive relationships between learners' self-regulation and interaction ( Delen et al., 2014 ) and between self-regulation and communication and collaboration ( Barnard et al., 2009 ).

Motivation. Researchers focused on motivation of online learners, including different motivation levels of online learners ( Li & Tsai, 2017 ), what motivated online learners ( Chaiprasurt & Esichaikul, 2013 ), differences in motivation of online learners ( Hartnett et al., 2011 ), and motivation compared to face-to-face learners ( Paechter & Maier, 2010 ). Hartnett et al. (2011) found that online learner motivation was complex, multifaceted, and sensitive to situational conditions.

Academic. Several researchers have focused on academic aspects of online learner characteristics. Readiness for online learning has been examined as an academic factor by several researchers ( Buzdar et al., 2016 ; Dray et al., 2011 ; Wladis & Samuels, 2016 ; Yu, 2018 ), specifically focusing on creating and validating measures of online learner readiness, including students' emotional intelligence as one such measure. Researchers have also examined other academic factors such as academic standing ( Bradford & Wyatt, 2010 ), course level factors ( Wladis et al., 2014 ), and academic skills in online courses ( Shea & Bidjerano, 2014 ).

Affective. Anderson and Bourke (2013) describe affective characteristics as those through which learners express feelings or emotions. Several research studies focused on the affective characteristics of online learners. Learner satisfaction with online learning has been examined by several researchers ( Cole et al., 2014 ; Dziuban et al., 2015 ; Kuo et al., 2013 ; Lee, 2014a ), along with student emotions towards online assessment ( Kim et al., 2014 ).

Cognitive. Researchers have also examined cognitive aspects of learner characteristics including meta-cognitive skills, cognitive variables, higher-order thinking, cognitive density, and critical thinking ( Chen & Wu, 2012 ; Lee, 2014b ). Lee (2014b) examined the relationship between cognitive presence density and higher-order thinking skills. Chen and Wu (2012) examined the relationship between cognitive and motivational variables in an online system for secondary physical education.

Demographic. Researchers have examined various demographic factors in online learning. Several researchers have examined gender differences in online learning ( Bayeck et al., 2018 ; Lowes et al., 2016 ; Yukselturk & Bulut, 2009 ), ethnicity, age ( Ke & Kwak, 2013 ), and minority status ( Yeboah & Smith, 2016 ) of online learners.

4.2.3. Less frequently studied research themes

While engagement and learner characteristics were studied the most, the remaining themes appeared less often in the literature. They are presented here in descending order of frequency, with general descriptions of the types of research examined for each.

Evaluation and Quality Assurance. There were 38 studies (6.14%) published in the theme of evaluation and quality assurance. Some of the studies in this theme focused on course quality standards, using Quality Matters to evaluate quality, using the CIPP model for evaluation, online learning system evaluation, and course and program evaluations.

Course Technologies. There were 35 studies (5.65%) published in the course technologies theme. Some of the studies examined specific technologies such as Edmodo, YouTube, Web 2.0 tools, wikis, Twitter, WebCT, Screencasts, and Web conferencing systems in the online learning context.

Course Facilitation. There were 34 studies (5.49%) published in the course facilitation theme. Some of the studies in this theme examined facilitation strategies and methods, experiences of online facilitators, and online teaching methods.

Institutional Support. There were 33 studies (5.33%) published in the institutional support theme which included support for both the instructor and learner. Some of the studies on instructor support focused on training new online instructors, mentoring programs for faculty, professional development resources for faculty, online adjunct faculty training, and institutional support for online instructors. Studies on learner support focused on learning resources for online students, cognitive and social support for online learners, and help systems for online learner support.

Learner Outcome. There were 32 studies (5.17%) published in the learner outcome theme. Some of the studies that were examined in this theme focused on online learner enrollment, completion, learner dropout, retention, and learner success.

Course Assessment. There were 30 studies (4.85%) published in the course assessment theme. Some of the studies in the course assessment theme examined online exams, peer assessment and peer feedback, proctoring in online exams, and alternative assessments such as eportfolio.

Access, Culture, Equity, Inclusion, and Ethics. There were 29 studies (4.68%) published in the access, culture, equity, inclusion, and ethics theme. Some of the studies in this theme examined online learning across cultures, multi-cultural effectiveness, multi-access, and cultural diversity in online learning.

Leadership, Policy, and Management. There were 27 studies (4.36%) published in the leadership, policy, and management theme. Some of the studies in this theme focused on online learning leaders, stakeholders, strategies for online learning leadership, resource requirements, university policies for online courses, governance, course ownership, and faculty incentives for online teaching.

Course Design and Development. There were 27 studies (4.36%) published in the course design and development theme. Some of the studies examined in this theme focused on design elements, design issues, design process, design competencies, design considerations, and instructional design in online courses.

Instructor Characteristics. There were 21 studies (3.39%) published in the instructor characteristics theme. Some of the studies in this theme were on motivation and experiences of online instructors, ability to perform online teaching duties, roles of online instructors, and adjunct versus full-time online instructors.

4.3. Research settings and methodology used in the studies

The research methods used in the studies were classified into quantitative, qualitative, and mixed methods ( Harwell, 2012 , pp. 147–163). The research setting was categorized into higher education, continuing education, K-12, and corporate/military. As shown in Table A in the appendix, the vast majority of the publications used higher education as the research setting ( n  = 509, 82.2%). Table B in the appendix shows that approximately half of the studies adopted quantitative methods ( n  = 324, 52.3%), followed by qualitative methods ( n  = 200, 32.3%); mixed methods accounted for the smallest portion ( n  = 95, 15.3%).

Table A shows that the patterns of the four research settings were approximately consistent across the 12 themes, except for the themes of Learner Outcome and Institutional Support. Continuing education had a higher relative frequency in Learner Outcome (0.28) and K-12 had a higher relative frequency in Institutional Support (0.33) compared to their frequencies across the themes as a whole (0.09 and 0.08, respectively). Table B in the appendix shows that the distribution of the three methods was not consistent across the 12 themes. While quantitative and qualitative studies were roughly evenly distributed in Engagement, they diverged sharply in Learner Characteristics: 100 quantitative studies but only 18 qualitative studies were published in that theme.
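
The relative-frequency comparison works by normalizing each theme's row of the theme-by-setting table. The sketch below uses hypothetical counts chosen only to echo the 0.28 and 0.33 values quoted above; the real counts live in Table A of the appendix.

```python
# Hypothetical theme-by-setting counts; NOT the paper's actual Table A data.
counts = {
    "Learner Outcome":       {"Higher Ed": 20, "Continuing Ed": 9, "K-12": 3},
    "Institutional Support": {"Higher Ed": 18, "Continuing Ed": 4, "K-12": 11},
}

def setting_shares(row):
    """Normalize one theme's counts into relative frequencies by setting."""
    total = sum(row.values())
    return {setting: round(n / total, 2) for setting, n in row.items()}

shares = {theme: setting_shares(row) for theme, row in counts.items()}
```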

In summary, around 8% of the articles published in the 12 journals focused on online learning. Online learning publications increased overall across the decade, albeit with fluctuations, with the greatest number occurring in 2018. Among the 12 research themes, Engagement and Learner Characteristics were studied the most and Instructor Characteristics the least. Most studies were conducted in the higher education setting, and approximately half used quantitative methods. Looking at the 12 themes by setting and method, we found that the patterns were not consistent across themes.

The quality of our findings was supported by systematic, thorough searches and by coding consistency. The selection of the 12 journals provides evidence of the representativeness and quality of the primary studies. In the coding process, difficulties and questions were resolved through consultation with the research team at bi-weekly meetings, supporting intra-rater and inter-rater reliability. Together, these approaches strengthen the transparency and replicability of the process and the quality of our results.

5. Discussion

This review enabled us to identify the online learning research themes examined from 2009 to 2018. In the sections below, we review the most studied research themes, engagement and learner characteristics, along with implications, limitations, and directions for future research.

5.1. Most studied research themes

Three out of the four systematic reviews informing the design of the present study found that online learner characteristics and online engagement were examined in a high number of studies. In this review, about half of the studies (50.57%) focused on online learner characteristics or online engagement, showing the continued importance of these two themes. In Tallent-Runnels et al.'s (2006) study, by contrast, learner characteristics was identified as the least studied theme; the authors noted that researchers were only beginning to investigate learner characteristics in the early days of online learning.

One difference found in this review is that course design and development was examined in fewer studies than in two prior systematic reviews ( Berge & Mrozowski, 2001 ; Zawacki-Richter et al., 2009 ). Zawacki-Richter et al. did not use a keyword search but reviewed all the articles in five distance education journals, and Berge and Mrozowski (2001) included a research theme called design issues covering all aspects of instructional systems design. In our study, in addition to course design and development, we also had focused themes on learner outcomes, course facilitation, course assessment, and course evaluation. Because these are all instructional design focused topics coded as separate themes, the course design and development category may have captured fewer studies. There is still a need for more studies on online course design and development.

5.2. Least frequently studied research themes

Three of the four systematic reviews discussed in the opening of this study found management and organization factors to be least studied. In this review, at the organizational level, Leadership, Policy, and Management was examined in 4.36% of the studies and Access, Culture, Equity, Inclusion, and Ethics in 4.68%. Equity and accessibility was also the least studied theme in the Berge and Mrozowski (2001) study. In addition, instructor characteristics was the least examined of the twelve research themes in this review, appearing in only 3.39% of the studies. While some studies examined instructor motivation and experiences, instructors' ability to teach online, online instructor roles, and adjunct versus full-time online instructors, there is still a need to examine topics focused on instructors and online teaching. This theme was not included in the prior reviews, whose focus was on the learner and the course rather than the instructor. While it is encouraging to see research evolving on instructor-focused topics, more research on the online instructor is needed.
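These percentages follow directly from the frequency counts reported in Appendix A, taken over the 619 reviewed articles. As a quick arithmetic check:

```python
TOTAL_ARTICLES = 619  # articles included in the systematic review

# Row totals from Appendix A (summed across the four setting columns)
theme_counts = {
    "Leadership, Policy and Management": 17 + 5 + 5 + 0,               # 27
    "Access, Culture, Equity, Inclusion and Ethics": 26 + 1 + 2 + 0,   # 29
    "Instructor Characteristics": 16 + 1 + 4 + 0,                      # 21
}

def pct(count, total=TOTAL_ARTICLES):
    """Share of reviewed articles, as a percentage rounded to two decimals."""
    return round(100 * count / total, 2)

shares = {theme: pct(n) for theme, n in theme_counts.items()}
# 27/619 -> 4.36, 29/619 -> 4.68, 21/619 -> 3.39
```

The computed shares (4.36%, 4.68%, 3.39%) match the figures reported in the text above.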

5.3. Comparing research themes from current study to previous studies

The research themes from this review were compared with research themes from previous systematic reviews, which targeted prior decades. Table 8 shows the comparison.

Table 8. Comparison of most and least studied online learning research themes from current to previous reviews.

Research Theme | Level | 1990–1999 (Berge & Mrozowski, 2001) | 1993–2004 (Tallent-Runnels et al., 2006) | 2000–2008 (Zawacki-Richter et al., 2009) | 2009–2018 (Current Study)
Learner Characteristics | L |   | X | X | X
Engagement and Interaction | L | X |   | X | X
Design Issues/Instructional Design | C | X |   | X |  
Course Environment | C |   | X |   |  
Learner Outcomes | L |   | X |   |  
Learner Support | L | X |   |   |  
Equity and Accessibility | O | X |   |   | X
Institutional & Administrative Factors | O |   | X |   | X
Management and Organization | O |   |   | X | X
Cost-Benefit | O |   |   | X |  

L = Learner, C = Course, O = Organization.

5.4. Need for more studies on organizational level themes of online learning

In this review, there is a greater concentration of studies focused on Learner domain topics and reduced attention to the broader, more encompassing research themes that fall into the Course and Organization domains. Organizational-level topics such as Access, Culture, Equity, Inclusion, and Ethics and Leadership, Policy, and Management need to be researched within the context of online learning. Examining access, culture, equity, inclusion, and ethics is especially important for supporting diverse online learners, particularly with the rapid expansion of online learning across all educational levels. This theme was also among the least studied in Berge and Mrozowski's (2001) systematic review.

Topics on leadership, policy, and management were least studied both in this review and in the Tallent-Runnels et al. (2006) and Zawacki-Richter et al. (2009) studies. Tallent-Runnels et al. categorized institutional and administrative aspects into institutional policies, institutional support, and enrollment effects. While we included support as a separate category, in this study leadership, policy, and management were combined. There is still a need for research on the leadership of those who manage online learning, on policies for online education, and on managing online programs. In the Zawacki-Richter et al. (2009) study, only a few studies examined management and organization topics; the authors also found management and organization to be strongly correlated with costs and benefits. In our study, costs and benefits were included as an aspect of management and organization rather than as a theme by themselves. Such studies would provide research-based evidence for online education administrators.

6. Limitations

As with any systematic review, there are limitations to its scope. The search was limited to twelve journals in the field that typically publish research on online learning. Manuscripts were identified by searching the Education Research Complete database, which serves education students, professionals, and policymakers. Other discipline-specific journals, as well as dissertations and conference proceedings, were not included due to the volume of articles. The search was also performed using five search terms, "online learning" OR "online teaching" OR "online program" OR "online course" OR "online education", in the title and keywords. If authors did not include these terms, their work may have been excluded from this review even if it focused on online learning. While these terms are commonly used in North America, they may not be commonly used in other parts of the world. Additional studies may therefore exist outside this scope.
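The title-and-keyword search described above amounts to a simple OR-filter over five phrases, which can be sketched as follows. The record structure here is hypothetical, not the actual Education Research Complete export format:

```python
# The five search phrases reported above, combined with OR
SEARCH_TERMS = [
    "online learning", "online teaching", "online program",
    "online course", "online education",
]

def matches_search(article):
    """True if any search phrase appears in the article's title or keywords.

    `article` is a hypothetical record, e.g.
    {"title": "...", "keywords": ["...", "..."]}.
    """
    text = " ".join([article["title"], *article["keywords"]]).lower()
    return any(term in text for term in SEARCH_TERMS)
```

A study of online learning whose title and keywords avoid all five phrases would be excluded by such a filter, which is precisely the limitation noted above.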

The search strategy also affected how we presented results and introduced limitations regarding generalization. We identified that only 8% of the articles published in these journals related to online learning; however, given the use of search terms to identify articles within select journals, it was not feasible to determine the total number of research-based articles in the population. Furthermore, our review focused on the topics and general methods of the research and did not systematically assess the quality of the published studies. Lastly, some journals may prefer to publish studies on a particular topic or using a particular method (e.g., quantitative methods), which introduces possible selection and publication biases that may skew the interpretation of results through over- or under-representation. Future studies should include more journals to minimize selection bias and obtain a more representative sample.

Certain limitations can be attributed to the coding process. Overall, the coding process worked well for most articles, as each tended to have a single dominant focus as described in the abstract, though several mentioned other categories that were likely considered to a lesser degree. In some cases, however, a dominant theme was not apparent, and in the effort to create mutually exclusive groups for clearer interpretation, the coders were occasionally forced to choose between two categories. To facilitate this coding, the full texts were used to identify a study's focus through a consensus-seeking discussion among all authors. Likewise, some studies focused on topics that we associated with a particular domain, but their designs may have promoted an aggregated examination or integrated factors from multiple domains (e.g., engagement). Because of our reliance on author descriptions, construct validity is a likely concern that requires additional exploration: our final grouping of codes may not have aligned with the original authors' descriptions in the abstracts. Additionally, the coding of broader constructs that disproportionately occur in the Learner domain, such as learner outcomes, learner characteristics, and engagement, likely introduced bias toward these codes for studies that involved multiple domains. Additional refinement to explore the intersection of domains within studies is needed.

7. Implications and future research

One of the strengths of this review is the set of research categories we have identified. We hope these categories will support future researchers and identify areas and levels of need for future research. Overall, there is some agreement between previous reviews and this one on online learning research themes, though there are also some contradictory findings. We hope the most- and least-researched themes give authors direction on the relative importance of topics and the areas of need to focus on.

The leading theme found in this review is online engagement research. However, the presentation of this research was inconsistent and often lacked specificity. This is not unique to online environments, but the nuances of defining engagement in an online environment are distinctive and therefore need further investigation and clarification. This review points to seven distinct classifications of online engagement. Further research on engagement should indicate which type of engagement is sought; this level of specificity is necessary to establish instruments for measuring engagement and, ultimately, to test frameworks for classifying engagement and promoting it in online environments. It may also be important to examine the relationships among these seven sub-themes of engagement.

Additionally, this review highlights growing attention to learner characteristics, which constitutes a shift in focus away from instructional characteristics and course design. Although this is consistent with the focus on engagement, the roles of the instructor and of course design with respect to these outcomes remain important. Results of learner characteristics and engagement research, paired with course design, will have important ramifications for the teaching and learning professionals who support instruction. The review also points to a concentration of research in higher education settings. With an immediate and growing emphasis on online learning in K-12 and corporate settings, there is a critical need for further investigation in these contexts.

Lastly, because the present review did not focus on the overall effect of interventions, opportunities exist for dedicated meta-analyses. Particular attention to research on engagement and learner characteristics as well as how these vary by study design and outcomes would be logical additions to the research literature.

8. Conclusion

This systematic review builds upon three previous reviews, which tackled the topic of online learning between 1990 and 2010, by extending the timeframe to the most recent set of published research. Covering the most recent decade, our review of 619 articles from 12 leading online learning journals points to a concentrated focus on the learner domain, including engagement and learner characteristics, with more limited attention to topics at the course or organizational level. The review highlights an opportunity for the field to clarify terminology in online learning research, particularly in areas such as learner outcomes where there is a tendency to classify research more generally (e.g., as engagement). Using this sample of published literature, we provide a possible taxonomy for categorizing this research using subcategories. The field could benefit from a broader conversation about how these categories can shape a comprehensive framework for online learning research. Such efforts will enable the field to effectively prioritize research aims over time and synthesize effects.

Credit author statement

Florence Martin: Conceptualization, Writing – original draft, Writing – review & editing, Supervision, Project administration. Ting Sun: Methodology, Formal analysis, Writing – original draft, Writing – review & editing. Carl Westine: Methodology, Formal analysis, Writing – original draft, Writing – review & editing, Supervision.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

1 Includes articles that are cited in this manuscript and also included in the systematic review (marked with an asterisk). The entire list of 619 articles used in the systematic review can be obtained by emailing the authors.

Appendix B Supplementary data to this article can be found online at https://doi.org/10.1016/j.compedu.2020.104009 .

Appendix A. 

Research Themes by the Settings in the Online Learning Publications

Research Theme | Higher Ed (n = 506) | Continuing Education (n = 58) | K-12 (n = 53) | Corporate/Military (n = 3)
Engagement | 153 | 15 | 12 | 0
Presence | 46 | 2 | 3 | 0
Interaction | 35 | 4 | 4 | 0
Community | 19 | 2 | 4 | 0
Participation | 16 | 5 | 0 | 0
Collaboration | 16 | 1 | 0 | 0
Involvement | 13 | 0 | 1 | 0
Communication | 8 | 1 | 0 | 0
Learner Characteristics | 106 | 18 | 9 | 1
Self-regulation Characteristics | 43 | 9 | 2 | 0
Motivation Characteristics | 18 | 3 | 2 | 0
Academic Characteristics | 17 | 0 | 2 | 0
Affective Characteristics | 12 | 3 | 1 | 1
Cognitive Characteristics | 11 | 1 | 2 | 0
Demographic Characteristics | 5 | 2 | 0 | 0
Evaluation and Quality Assurance | 33 | 3 | 2 | 0
Course Technologies | 33 | 2 | 0 | 0
Course Facilitation | 30 | 3 | 1 | 0
Institutional Support | 24 | 0 | 8 | 1
Learner Outcome | 24 | 7 | 1 | 0
Course Assessment | 23 | 2 | 5 | 0
Access, Culture, Equity, Inclusion and Ethics | 26 | 1 | 2 | 0
Leadership, Policy and Management | 17 | 5 | 5 | 0
Course Design and Development | 21 | 1 | 4 | 1
Instructor Characteristics | 16 | 1 | 4 | 0

Research Themes by the Methodology in the Online Learning Publications

Research Theme | Mixed Method (n = 95) | Quantitative (n = 324) | Qualitative (n = 200)
Engagement | 32 | 78 | 69
Presence | 11 | 25 | 14
Interaction | 9 | 20 | 14
Community | 2 | 9 | 14
Participation | 6 | 8 | 7
Collaboration | 2 | 5 | 10
Involvement | 2 | 6 | 6
Communication | 0 | 5 | 4
Learner Characteristics | 16 | 100 | 18
Self-regulation Characteristics | 5 | 43 | 6
Motivation Characteristics | 4 | 15 | 4
Academic Characteristics | 1 | 15 | 3
Affective Characteristics | 2 | 12 | 3
Cognitive Characteristics | 4 | 8 | 2
Demographic Characteristics | 1 | 6 | 0
Evaluation and Quality Assurance | 5 | 22 | 11
Course Technologies | 4 | 20 | 11
Course Facilitation | 7 | 14 | 13
Institutional Support | 12 | 9 | 12
Learner Outcome | 3 | 23 | 6
Course Assessment | 5 | 20 | 5
Access, Culture, Equity, Inclusion & Ethics | 3 | 13 | 13
Leadership, Policy and Management | 5 | 9 | 13
Course Design and Development | 2 | 8 | 17
Instructor Characteristics | 1 | 8 | 12

Appendix B. Supplementary data

The following are the Supplementary data to this article:

References 1

  • Ahn J., Butler B.S., Alam A., Webster S.A. Learner participation and engagement in open online courses: Insights from the Peer 2 Peer University. MERLOT Journal of Online Learning and Teaching. 2013; 9(2):160–171. *
  • Akcaoglu M., Lee E. Increasing social presence in online learning through small group discussions. International Review of Research in Open and Distance Learning. 2016; 17(3). *
  • Allen I.E., Seaman J. Digital compass learning: Distance education enrollment report 2017. Babson Survey Research Group; 2017.
  • Amador J.A., Mederer H. Migrating successful student engagement strategies online: Opportunities and challenges using jigsaw groups and problem-based learning. Journal of Online Learning and Teaching. 2013; 9(1):89. *
  • Anderson L.W., Bourke S.F. Assessing affective characteristics in the schools. Routledge; 2013.
  • Archibald D. Fostering the development of cognitive presence: Initial findings using the community of inquiry survey instrument. The Internet and Higher Education. 2010; 13(1–2):73–74. *
  • Artino A.R., Jr., Stephens J.M. Academic motivation and self-regulation: A comparative analysis of undergraduate and graduate students learning online. The Internet and Higher Education. 2009; 12(3–4):146–151.
  • Barnard L., Lan W.Y., To Y.M., Paton V.O., Lai S.L. Measuring self-regulation in online and blended learning environments. Internet and Higher Education. 2009; 12(1):1–6. *
  • Bayeck R.Y., Hristova A., Jablokow K.W., Bonafini F. Exploring the relevance of single-gender group formation: What we learn from a massive open online course (MOOC). British Journal of Educational Technology. 2018; 49(1):88–100. *
  • Berge Z., Mrozowski S. Review of research in distance education, 1990 to 1999. American Journal of Distance Education. 2001; 15(3):5–19. doi: 10.1080/08923640109527090.
  • Berry S. Building community in online doctoral classrooms: Instructor practices that support community. Online Learning. 2017; 21(2):n2. *
  • Boling E.C., Holan E., Horbatt B., Hough M., Jean-Louis J., Khurana C., Spiezio C. Using online tools for communication and collaboration: Understanding educators' experiences in an online course. The Internet and Higher Education. 2014; 23:48–55. *
  • Bolliger D.U., Inan F.A. Development and validation of the online student connectedness survey (OSCS). International Review of Research in Open and Distance Learning. 2012; 13(3):41–65. *
  • Bradford G., Wyatt S. Online learning and student satisfaction: Academic standing, ethnicity and their influence on facilitated learning, engagement, and information fluency. The Internet and Higher Education. 2010; 13(3):108–114. *
  • Broadbent J. Comparing online and blended learner's self-regulated learning strategies and academic performance. The Internet and Higher Education. 2017; 33:24–32.
  • Buzdar M., Ali A., Tariq R. Emotional intelligence as a determinant of readiness for online learning. International Review of Research in Open and Distance Learning. 2016; 17(1). *
  • Capdeferro N., Romero M., Barberà E. Polychronicity: Review of the literature and a new configuration for the study of this hidden dimension of online learning. Distance Education. 2014; 35(3):294–310.
  • Chaiprasurt C., Esichaikul V. Enhancing motivation in online courses with mobile communication tool support: A comparative study. International Review of Research in Open and Distance Learning. 2013; 14(3):377–401.
  • Chen C.H., Wu I.C. The interplay between cognitive and motivational variables in a supportive online learning system for secondary physical education. Computers & Education. 2012; 58(1):542–550. *
  • Cho H. Under co-construction: An online community of practice for bilingual pre-service teachers. Computers & Education. 2016; 92:76–89. *
  • Cho M.H., Shen D. Self-regulation in online learning. Distance Education. 2013; 34(3):290–301.
  • Cole M.T., Shelley D.J., Swartz L.B. Online instruction, e-learning, and student satisfaction: A three-year study. International Review of Research in Open and Distance Learning. 2014; 15(6). *
  • Comer D.K., Clark C.R., Canelas D.A. Writing to learn and learning to write across the disciplines: Peer-to-peer writing in introductory-level MOOCs. International Review of Research in Open and Distance Learning. 2014; 15(5):26–82. *
  • Cundell A., Sheepy E. Student perceptions of the most effective and engaging online learning activities in a blended graduate seminar. Online Learning. 2018; 22(3):87–102. *
  • Cung B., Xu D., Eichhorn S. Increasing interpersonal interactions in an online course: Does increased instructor email activity and voluntary meeting time in a physical classroom facilitate student learning? Online Learning. 2018; 22(3):193–215.
  • Cunningham U.M., Fägersten K.B., Holmsten E. "Can you hear me, Hanoi?" Compensatory mechanisms employed in synchronous net-based English language learning. International Review of Research in Open and Distance Learning. 2010; 11(1):161–177.
  • Davis D., Chen G., Hauff C., Houben G.J. Activating learning at scale: A review of innovations in online learning strategies. Computers & Education. 2018; 125:327–344.
  • Delen E., Liew J., Willson V. Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education. 2014; 78:312–320.
  • Dixson M.D. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learning. 2015; 19(4):n4. *
  • Dray B.J., Lowenthal P.R., Miszkiewicz M.J., Ruiz-Primo M.A., Marczynski K. Developing an instrument to assess student readiness for online learning: A validation study. Distance Education. 2011; 32(1):29–47. *
  • Dziuban C., Moskal P., Thompson J., Kramer L., DeCantis G., Hermsdorfer A. Student satisfaction with online learning: Is it a psychological contract? Online Learning. 2015; 19(2):n2. *
  • Ergün E., Usluel Y.K. An analysis of density and degree-centrality according to the social networking structure formed in an online learning environment. Journal of Educational Technology & Society. 2016; 19(4):34–46. *
  • Esfijani A. Measuring quality in online education: A meta-synthesis. American Journal of Distance Education. 2018; 32(1):57–73.
  • Glazer H.R., Murphy J.A. Optimizing success: A model for persistence in online education. American Journal of Distance Education. 2015; 29(2):135–144.
  • Glazer H.R., Wanstreet C.E. Connection to the academic community: Perceptions of students in online education. Quarterly Review of Distance Education. 2011; 12(1):55. *
  • Hartnett M., George A.S., Dron J. Examining motivation in online distance learning environments: Complex, multifaceted and situation-dependent. International Review of Research in Open and Distance Learning. 2011; 12(6):20–38.
  • Harwell M.R. Research design in qualitative/quantitative/mixed methods. Section III. Opportunities and challenges in designing and conducting inquiry. 2012.
  • Hung J.L. Trends of e-learning research from 2000 to 2008: Use of text mining and bibliometrics. British Journal of Educational Technology. 2012; 43(1):5–16.
  • Jiang W. Interdependence of roles, role rotation, and sense of community in an online course. Distance Education. 2017; 38(1):84–105.
  • Ke F., Kwak D. Online learning across ethnicity and age: A study on learning interaction participation, perception, and learning satisfaction. Computers & Education. 2013; 61:43–51.
  • Kent M. Changing the conversation: Facebook as a venue for online class discussion in higher education. MERLOT Journal of Online Learning and Teaching. 2013; 9(4):546–565. *
  • Kim C., Park S.W., Cozart J. Affective and motivational factors of learning in online mathematics courses. British Journal of Educational Technology. 2014; 45(1):171–185.
  • Kizilcec R.F., Pérez-Sanagustín M., Maldonado J.J. Self-regulated learning strategies predict learner behavior and goal attainment in Massive Open Online Courses. Computers & Education. 2017; 104:18–33.
  • Kopp B., Matteucci M.C., Tomasetto C. E-tutorial support for collaborative online learning: An explorative study on experienced and inexperienced e-tutors. Computers & Education. 2012; 58(1):12–20.
  • Koseoglu S., Doering A. Understanding complex ecologies: An investigation of student experiences in adventure learning programs. Distance Education. 2011; 32(3):339–355. *
  • Kumi-Yeboah A. Designing a cross-cultural collaborative online learning framework for online instructors. Online Learning. 2018; 22(4):181–201. *
  • Kuo Y.C., Walker A.E., Belland B.R., Schroder K.E. A predictive study of student satisfaction in online education programs. International Review of Research in Open and Distance Learning. 2013; 14(1):16–39. *
  • Kuo Y.C., Walker A.E., Schroder K.E., Belland B.R. Interaction, Internet self-efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. Internet and Higher Education. 2014; 20:35–50. *
  • Lee J. An exploratory study of effective online learning: Assessing satisfaction levels of graduate students of mathematics education associated with human and design factors of an online course. International Review of Research in Open and Distance Learning. 2014; 15(1).
  • Lee S.M. The relationships between higher order thinking skills, cognitive density, and social presence in online learning. The Internet and Higher Education. 2014; 21:41–52. *
  • Lee K. Rethinking the accessibility of online higher education: A historical review. The Internet and Higher Education. 2017; 33:15–23.
  • Lee Y., Choi J. A review of online course dropout research: Implications for practice and future research. Educational Technology Research & Development. 2011; 59(5):593–618.
  • Li L.Y., Tsai C.C. Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers & Education. 2017; 114:286–297.
  • Liyanagunawardena T., Adams A., Williams S. MOOCs: A systematic study of the published literature 2008–2012. International Review of Research in Open and Distance Learning. 2013; 14(3):202–227.
  • Lowes S., Lin P., Kinghorn B.R. Gender differences in online high school courses. Online Learning. 2016; 20(4):100–117.
  • Marbouti F., Wise A.F. Starburst: A new graphical interface to support purposeful attention to others' posts in online discussions. Educational Technology Research & Development. 2016; 64(1):87–113. *
  • Martin F., Ahlgrim-Delzell L., Budhrani K. Systematic review of two decades (1995 to 2014) of research on synchronous online learning. American Journal of Distance Education. 2017; 31(1):3–19.
  • Moore-Adams B.L., Jones W.M., Cohen J. Learning to teach online: A systematic review of the literature on K-12 teacher preparation for teaching online. Distance Education. 2016; 37(3):333–348.
  • Murphy E., Rodríguez-Manzanares M.A. Rapport in distance education. International Review of Research in Open and Distance Learning. 2012; 13(1):167–190. *
  • Nye A. Building an online academic learning community among undergraduate students. Distance Education. 2015; 36(1):115–128. *
  • Olesova L., Slavin M., Lim J. Exploring the effect of scripted roles on cognitive presence in asynchronous online discussions. Online Learning. 2016; 20(4):34–53. *
  • Orcutt J.M., Dringus L.P. Beyond being there: Practices that establish presence, engage students and influence intellectual curiosity in a structured online learning environment. Online Learning. 2017; 21(3):15–35. *
  • Overbaugh R.C., Nickel C.E. A comparison of student satisfaction and value of academic community between blended and online sections of a university-level educational foundations course. The Internet and Higher Education. 2011; 14(3):164–174. *
  • O'Shea S., Stone C., Delahunty J. "I 'feel' like I am at university even though I am online." Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education. 2015; 36(1):41–58. *
  • Paechter M., Maier B. Online or face-to-face? Students' experiences and preferences in e-learning. Internet and Higher Education. 2010; 13(4):292–297.
  • Phirangee K. Students' perceptions of learner-learner interactions that weaken a sense of community in an online learning environment. Online Learning. 2016; 20(4):13–33. *
  • Phirangee K., Malec A. Othering in online learning: An examination of social presence, identity, and sense of community. Distance Education. 2017; 38(2):160–172. *
  • Preisman K.A. Teaching presence in online education: From the instructor's point of view. Online Learning. 2014; 18(3):n3. *
  • Rowe M. Developing graduate attributes in an open online course. British Journal of Educational Technology. 2016; 47(5):873–882. *
  • Ruane R., Koku E.F. Social network analysis of undergraduate education student interaction in online peer mentoring settings. Journal of Online Learning and Teaching. 2014; 10(4):577–589. *
  • Ruane R., Lee V.J. Analysis of discussion board interaction in an online peer mentoring site. Online Learning. 2016; 20(4):79–99. *
  • Rye S.A., Støkken A.M. The implications of the local context in global virtual education. International Review of Research in Open and Distance Learning. 2012; 13(1):191–206. *
  • Saadatmand M., Kumpulainen K. Participants' perceptions of learning and networking in connectivist MOOCs. Journal of Online Learning and Teaching. 2014; 10(1):16. *
  • Shackelford J.L., Maxwell M. Sense of community in graduate online education: Contribution of learner to learner interaction. International Review of Research in Open and Distance Learning. 2012; 13(4):228–249. *
  • Shea P., Bidjerano T. Does online learning impede degree completion? A national study of community college students. Computers & Education. 2014; 75:103–111. *
  • Sherry L. Issues in distance learning. International Journal of Educational Telecommunications. 1996; 1(4):337–365.
  • Slagter van Tryon P.J., Bishop M.J. Evaluating social connectedness online: The design and development of the social perceptions in learning contexts instrument. Distance Education. 2012; 33(3):347–364. *
  • Swaggerty E.A., Broemmel A.D. Authenticity, relevance, and connectedness: Graduate students' learning preferences and experiences in an online reading education course. The Internet and Higher Education. 2017; 32:80–86. *
  • Tallent-Runnels M.K., Thomas J.A., Lan W.Y., Cooper S., Ahern T.C., Shaw S.M., Liu X. Teaching courses online: A review of the research. Review of Educational Research. 2006; 76(1):93–135. doi: 10.3102/00346543076001093.
  • Tawfik A.A., Giabbanelli P.J., Hogan M., Msilu F., Gill A., York C.S. Effects of success v failure cases on learner-learner interaction. Computers & Education. 2018; 118:120–132.
  • Thomas J. Exploring the use of asynchronous online discussion in health care education: A literature review. Computers & Education. 2013; 69:199–215.
  • Thormann J., Fidalgo P. Guidelines for online course moderation and community building from a student's perspective. Journal of Online Learning and Teaching. 2014; 10(3):374–388. *
  • Tibi M.H. Computer science students' attitudes towards the use of structured and unstructured discussion forums in fully online courses. Online Learning. 2018; 22(1):93–106. *
  • Tsai C.W., Chiang Y.C. Research trends in problem-based learning (PBL) research in e-learning and online education environments: A review of publications in SSCI-indexed journals from 2004 to 2012. British Journal of Educational Technology. 2013; 44(6):E185–E190.
  • Tsai C.W., Fan Y.T. Research trends in game-based learning research in online learning environments: A review of studies published in SSCI-indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44(5):E115–E119.
  • Tsai C.W., Shen P.D., Chiang Y.C. Research trends in meaningful learning research on e-learning and online education environments: A review of studies published in SSCI-indexed journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44(6):E179–E184.
  • Tsai C.W., Shen P.D., Fan Y.T. Research trends in self-regulated learning research in online learning environments: A review of studies published in selected journals from 2003 to 2012. British Journal of Educational Technology. 2013; 44(5):E107–E110.
  • U.S. Department of Education, Institute of Education Sciences. What Works Clearinghouse procedures and standards handbook, version 3.0. Washington, DC: Institute of Education Sciences; 2017. Retrieved from https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_procedures_v3_0_standards_handbook.pdf
  • Veletsianos G., Shepherdson P. A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. International Review of Research in Open and Distance Learning. 2016; 17(2).
  • VERBI Software. MAXQDA 2020 online manual. 2019. Retrieved from maxqda.com/help-max20/welcome
  • Verstegen D., Dailey-Hebert A., Fonteijn H., Clarebout G., Spruijt A. How do virtual teams collaborate in online learning tasks in a MOOC? International Review of Research in Open and Distance Learning. 2018; 19(4). *
  • Wang Y., Baker R. Grit and intention: Why do learners complete MOOCs? International Review of Research in Open and Distance Learning. 2018; 19(3). *
  • Wei C.W., Chen N.S., Kinshuk. A model for social presence in online classrooms. Educational Technology Research & Development. 2012; 60(3):529–545. *
  • Wicks D., Craft B.B., Lee D., Lumpe A., Henrikson R., Baliram N., Wicks K. An evaluation of low versus high collaboration in online learning. Online Learning. 2015; 19(4):n4. *
  • Wise A.F., Perera N., Hsiao Y.T., Speer J., Marbouti F. Microanalytic case studies of individual participation patterns in an asynchronous online discussion in an undergraduate blended course. The Internet and Higher Education. 2012; 15(2):108–117. *
  • Wisneski J.E., Ozogul G., Bichelmeyer B.A. Does teaching presence transfer between MBA teaching environments? A comparative investigation of instructional design practices associated with teaching presence. The Internet and Higher Education. 2015; 25:18–27. *
  • Wladis C., Hachey A.C., Conway K. An investigation of course-level factors as predictors of online STEM course outcomes. Computers & Education. 2014; 77:145–150. *
  • Wladis C., Samuels J. Do online readiness surveys do what they claim? Validity, reliability, and subsequent student enrollment decisions. Computers & Education. 2016; 98:39–56.
  • Yamagata-Lynch L.C. Blending online asynchronous and synchronous learning. International Review of Research in Open and Distance Learning. 2014; 15(2). *
  • Yang J., Kinshuk, Yu H., Chen S.J., Huang R. Strategies for smooth and effective cross-cultural online collaborative learning. Journal of Educational Technology & Society. 2014; 17 (3):208–221. * [ Google Scholar ]
  • Yeboah A.K., Smith P. Relationships between minority students online learning experiences and academic performance. Online Learning. 2016; 20 (4):n4. * [ Google Scholar ]
  • Yu T. Examining construct validity of the student online learning readiness (SOLR) instrument using confirmatory factor analysis. Online Learning. 2018; 22 (4):277–288. * [ Google Scholar ]
  • Yukselturk E., Bulut S. Gender differences in self-regulated online learning environment. Educational Technology & Society. 2009; 12 (3):12–22. [ Google Scholar ]
  • Yukselturk E., Top E. Exploring the link among entry characteristics, participation behaviors and course outcomes of online learners: An examination of learner profile using cluster analysis. British Journal of Educational Technology. 2013; 44 (5):716–728. [ Google Scholar ]
  • Zawacki-Richter O., Backer E., Vogt S. Review of distance education research (2000 to 2008): Analysis of research areas, methods, and authorship patterns. International Review of Research in Open and Distance Learning. 2009; 10 (6):30. doi: 10.19173/irrodl.v10i6.741. [ CrossRef ] [ Google Scholar ]
  • Zhu M., Sari A., Lee M.M. A systematic review of research methods and topics of the empirical MOOC literature (2014–2016) The Internet and Higher Education. 2018; 37 :31–39. [ Google Scholar ]
  • Zimmerman T.D. Exploring learner to content interaction as a success factor in online courses. International Review of Research in Open and Distance Learning. 2012; 13 (4):152–165. [ Google Scholar ]
Open access · Published: 16 September 2021

Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study

Meixun Zheng, Daniel Bender & Cindy Lyon

BMC Medical Education, volume 21, Article number: 495 (2021)


Background

The COVID-19 pandemic forced dental schools to close their campuses and move didactic instruction online. The abrupt transition to online learning, however, has raised several issues that have not been resolved. While several studies have investigated dental students’ attitudes towards online learning during the pandemic, mixed results have been reported. Additionally, little research has been conducted to identify and understand the factors, especially pedagogical factors, that impacted students’ acceptance of online learning during campus closure. Furthermore, how online learning during the pandemic impacted students’ learning performance has not been empirically investigated. In March 2020, the dental school studied here moved didactic instruction online in response to government-issued stay-at-home orders. This first-of-its-kind comparative study examined students’ perceived effectiveness of online courses during summer quarter 2020, explored pedagogical factors impacting their acceptance of online courses, and empirically evaluated the impact of online learning on students’ course performance during the pandemic.

Methods

The study employed a quasi-experimental design. Participants were 482 pre-doctoral students in a U.S. dental school. Students’ perceived effectiveness of online courses during the pandemic was assessed with a survey. Students’ course grades for online courses during summer quarter 2020 were compared with those of a control group who received face-to-face instruction for the same courses before the pandemic, in summer quarter 2019.

Results

Survey results revealed that most online courses were well accepted by the students, and 80 % of them wanted to continue with some online instruction post pandemic. Regression analyses revealed that students’ perceived engagement with faculty and classmates predicted their perceived effectiveness of the online course. More notably, Chi-square tests demonstrated that in 16 out of the 17 courses compared, the online cohort during summer quarter 2020 was equally or more likely to get an A course grade than the analogous face-to-face cohort during summer quarter 2019.

Conclusions

This is the first empirical study in dental education to demonstrate that online courses during the pandemic could achieve equivalent or better student course performance than the same pre-pandemic in-person courses. The findings fill in gaps in literature and may inform online learning design moving forward.


Introduction

Research across disciplines has demonstrated that well-designed online learning can lead to students’ enhanced motivation, satisfaction, and learning [ 1 , 2 , 3 , 4 , 5 , 6 , 7 ]. A report by the U.S. Department of Education [ 8 ], based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning outcomes equivalent to or better than face-to-face learning. The more recent systematic review by Pei and Wu [ 9 ] provided additional evidence that online learning is at least as effective as face-to-face learning for undergraduate medical students.

To take advantage of the opportunities presented by online learning, thought leaders in dental education in the U.S. have advocated for the adoption of online learning in the nation’s dental schools [ 10 , 11 , 12 ]. However, digital innovation has been a slow process in academic dentistry [ 13 , 14 , 15 ]. In March 2020, the COVID-19 pandemic brought unprecedented disruption to dental education by forcing an abrupt shift to online learning. In accordance with stay-at-home orders to prevent the spread of the virus, dental schools around the world closed their campuses and moved didactic instruction online.

The abrupt transition to online learning, however, has raised several concerns and questions. First, while several studies have examined dental students’ online learning satisfaction during the pandemic, mixed results have been reported. Some studies have reported students’ positive attitudes towards online learning [ 15 , 16 , 17 , 18 , 19 , 20 ]. Sadid-Zadeh et al. [ 18 ] found that 99 % of the surveyed dental students at the University of Buffalo, in the U.S., were satisfied with live web-based lectures during the pandemic. Schlenz et al. [ 15 ] reported that students in a German dental school had a favorable attitude towards online learning and wanted to continue with online instruction in their future curriculum. Other studies, however, have reported students’ negative online learning experiences during the pandemic [ 21 , 22 , 23 , 24 , 25 , 26 ]. For instance, dental students at Harvard University felt that learning during the pandemic had worsened and engagement had decreased [ 23 , 24 ]. In a study with medical and dental students in Pakistan, Abbasi et al. [ 21 ] found that 77 % of the students had negative perceptions about online learning and 84 % reported reduced student-instructor interactions.

In addition to these mixed results, little attention has been given to factors affecting students’ acceptance of online learning during the pandemic. With the likelihood that online learning will persist post pandemic [ 27 ], research in this area is warranted to inform online course design moving forward. In particular, prior research has demonstrated that one of the most important factors influencing students’ performance in any learning environment is a sense of belonging, the feeling of being connected with and supported by the instructor and classmates [ 28 , 29 , 30 , 31 ]. Unfortunately, this aspect of the classroom experience has suffered during school closure. While educational events can be held using a video conferencing system, virtual peer interaction on such platforms has been perceived by medical trainees to be not as easy and personal as physical interaction [ 32 ]. The pandemic highlights the need to examine instructional strategies most suited to the current situation to support students’ engagement with faculty and classmates.

Furthermore, there is considerable concern from the academic community about the quality of online learning. Pre-pandemic, some faculty and students were already skeptical about the value of online learning [ 33 ]. The longer the pandemic lasts, the more they may question the value of online education, asking: Can online learning during the pandemic produce learning outcomes that are similar to face-to-face learning before the pandemic? Despite the documented benefits of online learning prior to the pandemic, the actual impact of online learning during the pandemic on students’ academic performance is still unknown due to reasons outlined below.

On one hand, several factors beyond the technology used could influence the effectiveness of online learning, one of which is the teaching context [ 34 ]. The sudden transition to online learning posed many challenges to faculty and students. Faculty may not have had adequate time to carefully design online courses that take full advantage of the possibilities of the online format. Some faculty may not have had prior online teaching experience and so faced a steeper learning curve when adopting online teaching methods [ 35 ]. Students may have been at risk of increased anxiety due to concerns about contracting the virus, on-time graduation, finances, and employment [ 36 , 37 ], which may have negatively impacted learning performance [ 38 ]. Therefore, whether online learning during the pandemic could produce learning outcomes similar to those of online learning implemented during more normal times remains to be determined.

Most existing studies on online learning in dental education during the pandemic have only reported students’ satisfaction. The actual impact of the online format on academic performance has not been empirically investigated. The few studies that have examined students’ learning outcomes have only used students’ self-reported data from surveys and focus groups. According to Kaczmarek et al. [ 24 ], 50 % of the participating dental faculty at Harvard University perceived student learning to have worsened during the pandemic, and 70 % of the students felt the same. Abbasi et al. [ 21 ] reported that 86 % of medical and dental students at a Pakistani college felt that they learned less online. While student opinions are important, research has demonstrated a poor correlation between students’ perceived learning and actual learning gains [ 39 ]. As we continue to navigate the “new normal” in teaching, students’ learning performance needs to be empirically evaluated to help institutions gauge the impact of this grand online learning experiment.

Research purposes

In March 2020, the University of the Pacific Arthur A. Dugoni School of Dentistry, in the U.S., moved didactic instruction online to ensure the continuity of education during building closure. This study examined students’ acceptance of online learning during the pandemic and the factors affecting it, focusing on instructional practices pertaining to students’ engagement and interaction with faculty and classmates. Another purpose of this study was to empirically evaluate the impact of online learning during the pandemic on students’ actual course performance by comparing it with that of a pre-pandemic cohort. To understand the broader impact of the institution-wide online learning effort, we examined all online courses offered in summer quarter 2020 (July to September) that had a didactic component.

This is the first empirical study in dental education to evaluate students’ learning performance during the pandemic. The study aimed to answer the following three questions.

How well was online learning accepted by students during the summer quarter 2020 pandemic interruption?

How did instructional strategies, centered around students’ engagement with faculty and classmates, impact their acceptance of online learning?

How did online learning during summer quarter 2020 impact students’ course performance as compared with a previous analogous cohort who received face-to-face instruction in summer quarter 2019?

Methods

This study employed a quasi-experimental design. The study was approved by the university’s institutional review board (#2020-68).

Study context and participants

The study was conducted at the Arthur A. Dugoni School of Dentistry, University of the Pacific. The program runs on a quarter system. It offers a 3-year accelerated Doctor of Dental Surgery (DDS) program and a 2-year International Dental Studies (IDS) program for international dentists who have obtained a doctoral degree in dentistry from a country outside the U.S. and want to practice in the U.S. Students advance throughout the program in cohorts. IDS students take some courses together with their DDS peers. All three DDS classes (D1/DDS 2023, D2/DDS 2022, and D3/DDS 2021) and both IDS classes (I1/IDS 2022 and I2/IDS 2021) were invited to participate in the study. The number of students in each class was: D1 = 145, D2 = 143, D3 = 143, I1 = 26, and I2 = 25. This resulted in a total of 482 student participants.

During campus closure, faculty delivered remote instruction in various ways: live online classes via Zoom® [ 40 ], self-paced online modules on the school’s learning management system, Canvas® [ 41 ], or a combination of live and self-paced delivery. For self-paced modules, students studied assigned readings and/or viewings such as videos and pre-recorded slide presentations. Some faculty also developed self-paced online lessons with SoftChalk® [ 42 ], a cloud-based platform that supports gamified learning through the insertion of various mini learning activities. The SoftChalk lessons were integrated with Canvas® [ 41 ], and faculty could monitor students’ progress. After students completed the pre-assigned online materials, some faculty held virtual office hours or live online discussion sessions for students to ask questions and discuss key concepts.

Data collection and analysis

Student survey

Students’ perceived effectiveness of summer quarter 2020 online courses was evaluated by the school’s Office of Academic Affairs in lieu of the regular course evaluation process. A total of 19 courses for DDS students and 10 courses for IDS students were evaluated. An 8-question survey developed by the researchers (Additional file 1 ) was administered online in the last week of summer quarter 2020. Course directors invited students to take the survey during live online classes. The survey introduction stated that taking the survey was voluntary and that anonymous responses would be reported in aggregated form for research purposes. Students were invited to continue with the survey if they chose to participate; otherwise, they could exit the survey. The number of students in each class who took the survey was as follows: D1 ( n  = 142; 98 %), D2 ( n  = 133; 93 %), D3 ( n  = 61; 43 %), I1 ( n  = 23; 88 %), and I2 ( n  = 20; 80 %). This resulted in a total of 379 respondents (79 %) across all classes.

The survey questions used a 4-point scale: Strongly Disagree (1 point), Disagree (2 points), Agree (3 points), and Strongly Agree (4 points). Students were asked to rate each online course by responding to four statements: “I could fully engage with the instructor and classmates in this course”; “The online format of this course supported my learning”; “Overall this online course is effective”; and “I would have preferred face-to-face instruction for this course”. For the first three survey questions, a higher mean score indicated a more positive attitude toward the online course. For the fourth question, a higher mean score indicated that more students would have preferred face-to-face instruction for the course. Two additional survey questions asked students to select their preferred online delivery method for fully online courses during the pandemic from three given choices (synchronous online/live, asynchronous online/self-paced, and a combination of both), and to report whether they wanted to continue with some online instruction post pandemic. Finally, two open-ended questions at the end of the survey allowed students to comment on the aspects of the online format that they found to be helpful and to provide suggestions for improvement. For the purposes of this study, we focused on the quantitative data from the Likert-scale questions.
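To make the scoring concrete, the following is a minimal sketch (in Python, using hypothetical responses; the raw survey data are not published here) of how a per-item mean score on this 4-point scale can be computed:

```python
# Point values for the 4-point Likert scale used in the survey.
LIKERT_POINTS = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Agree": 3,
    "Strongly Agree": 4,
}

def item_mean(responses):
    """Mean score for one survey item across all respondents."""
    points = [LIKERT_POINTS[r] for r in responses]
    return sum(points) / len(points)

# Hypothetical responses to "Overall this online course is effective."
sample = ["Agree", "Strongly Agree", "Agree", "Disagree"]
mean_score = item_mean(sample)  # a mean near or above 3 suggests broad acceptance
```

Note that the fourth item ("I would have preferred face-to-face instruction for this course") is reverse-oriented: a higher mean there signals weaker support for the online format, so it is interpreted in the opposite direction rather than averaged with the other items.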

Descriptive data such as the mean scores were reported for each course. Regression analyses were conducted to examine the relationship between instructional strategies focusing on students’ engagement with faculty and classmates, and their overall perceived effectiveness of the online course. The independent variable was student responses to the question “ I could fully engage with the instructor and classmates in this course ”, and the dependent variable was their answer to the question “ Overall, this online course is effective .”
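The per-course analysis described above amounts to a simple ordinary-least-squares regression of the overall-effectiveness rating on the engagement rating. A minimal sketch, with invented Likert pairs standing in for the study's actual per-course data:

```python
def simple_regression(x, y):
    """Ordinary least squares with one predictor; returns slope, intercept, r^2."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    syy = sum((yi - mean_y) ** 2 for yi in y)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    r2 = sxy ** 2 / (sxx * syy)  # proportion of variance explained (effect size)
    return slope, intercept, r2

# Hypothetical paired Likert ratings: engagement -> overall effectiveness.
engagement = [1, 2, 2, 3, 3, 4, 4, 4]
effectiveness = [2, 2, 3, 3, 3, 3, 4, 4]
slope, intercept, r2 = simple_regression(engagement, effectiveness)
```

The r² value from such a fit is the per-course effect size reported in the results section.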

Student course grades

Using Chi-square tests, student course grade distributions (A, B, C, D, and F) for summer quarter 2020 online courses were compared with those of a previous cohort who received face-to-face instruction for the same courses in summer quarter 2019. Note that as a result of the school’s pre-doctoral curriculum redesign implemented in July 2019, not all courses offered in summer quarter 2020 had been offered the previous year in summer quarter 2019; some were new courses offered for the first time. Because these new courses did not have a previous face-to-face version to compare against, they were excluded from data analysis. For some other courses, while course content remained the same between 2019 and 2020, the sequence of course topics within the course had changed. These courses were also excluded from data analysis.

After excluding the aforementioned courses, a total of 17 “comparable” courses remained for data analysis (see the subsequent section). For these courses, the instructor, course content, and course goals were the same in both 2019 and 2020. The assessment methods and grading policies also remained the same across both years. For exams and quizzes, multiple-choice questions were the dominant format in both years. While some exam questions in 2020 differed from those in 2019, faculty reported that the overall exam difficulty was similar. The main difference in assessment was testing conditions: the 2019 cohort took computer-based exams in the physical classroom with faculty proctoring, whereas the 2020 cohort took exams at home with remote proctoring to ensure exam integrity. The remote proctoring software monitored each student during the exam through the web camera on their computer, and the recorded video flagged suspicious activities for faculty review after exam completion.
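The grade-distribution comparison can be sketched as a Pearson chi-square statistic over a cohort-by-grade contingency table. The counts below are invented for illustration; the study's actual tables appear in Table 3:

```python
def chi_square_stat(observed):
    """Pearson chi-square statistic for an r x c contingency table.

    Rows are cohorts (e.g., 2019 face-to-face vs. 2020 online);
    columns are grade categories (A, B, C, ...).
    """
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical A/B/C grade counts for the two cohorts of one course.
face_to_face_2019 = [80, 40, 20]
online_2020 = [95, 35, 10]
stat = chi_square_stat([face_to_face_2019, online_2020])
# The statistic is then referred to a chi-square distribution with
# (rows - 1) * (cols - 1) degrees of freedom; scipy.stats.chi2_contingency
# bundles this computation together with the p value.
```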

Results

Students’ perceived effectiveness of online learning

Table 1 summarizes data on DDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “Overall, this online course is effective”, the majority of courses received a mean score approaching or above 3 points on the 4-point scale, suggesting that online learning was generally well accepted by students. Despite overall positive online course experiences, for many of the courses examined there was an even split in student responses to the question “I would have preferred face-to-face instruction for this course.” Additionally, about half of the students in each class preferred a combination of synchronous and asynchronous online learning for fully online courses (see Fig. 1 ). Finally, the majority of students wanted faculty to continue with some online instruction post pandemic: D1 class (110; 78.6 %), D2 class (104; 80 %), and D3 class (49; 83.1 %).

While most online courses received favorable ratings, some variation did exist among courses. For D1 courses, “Anatomy & Histology” received lower ratings than others. This could be explained by its lab component, which did not lend itself as well to the online format. Several D2 courses received lower ratings than others, especially for the survey question on students’ perceived engagement with faculty and classmates.

Figure 1: DDS students’ preferred online delivery method for fully online courses

Table 2 summarizes IDS students’ perceived effectiveness of each online course during summer quarter 2020. For the survey question “Overall, this online course is effective”, all courses received a mean score approaching or above 3 points on the 4-point scale, suggesting that online learning was well accepted by students. For the question “I would have preferred face-to-face instruction for this course”, in most online courses examined the percentage of students who would have preferred face-to-face instruction was similar to the percentage who preferred online instruction. Like their DDS peers, about half of the IDS students in each class also preferred a combination of synchronous and asynchronous online delivery for fully online courses (see Fig. 2 ). Finally, the majority of IDS students (I1: n = 18, 81.8 %; I2: n = 16, 84.2 %) wanted to continue with some online learning after the pandemic.

Figure 2: IDS students’ preferred online delivery method for fully online courses

Factors impacting students’ acceptance of online learning

For all 19 online courses taken by DDS students, regression analyses indicated a significantly positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across all courses. The ranges of effect size (r²) were: D1 courses (0.26 to 0.50), D2 courses (0.39 to 0.65), and D3 courses (0.22 to 0.44), indicating moderate to high correlations across courses.

For 9 out of the 10 online courses taken by IDS students, there was a positive relationship between students’ perceived engagement with faculty and classmates and their perceived effectiveness of the course. The p value was 0.00 across courses. The ranges of effect size were: I1 courses (0.35 to 0.77) and I2 courses (0.47 to 0.63), indicating consistently high correlations across courses. The only course in which students’ perceived engagement with faculty and classmates did not predict perceived effectiveness of the course was “Integrated Clinical Science III (ICS III)”, which the I2 class took together with their D3 peers.

Impact of online learning on students’ course performance

Chi-square test results (Table 3 ) indicated that in 4 out of the 17 courses compared, the online cohort during summer quarter 2020 was more likely to receive an A grade than the face-to-face cohort during summer quarter 2019. In 12 courses, the online cohort was equally likely to receive an A grade as the face-to-face cohort. In the remaining course, the online cohort was less likely to receive an A grade than the face-to-face cohort.

Discussion

Students’ acceptance of online learning during the pandemic

Survey results revealed that students had generally positive perceptions of online learning during the pandemic, and the majority wanted to continue with some online learning post pandemic. Overall, our findings support several other studies in dental [ 18 , 20 ], medical [ 43 , 44 ], and nursing [ 45 ] education that have also reported students’ positive attitudes towards online learning during the pandemic. In their written survey comments, students cited enhanced flexibility as one of the greatest benefits of online learning. Some students also commented that typing questions in the chat box during live online classes was less intimidating than speaking in class. Others explicitly stated that not having to commute to and from school provided more time for sleep, which helped with self-care and mental health. Our findings are in line with previous studies which have also demonstrated that online learning offers greater flexibility [ 46 , 47 ]. Meanwhile, consistent with the findings of other researchers [ 19 , 21 , 46 ], our students reported difficulty engaging with faculty and classmates in several online courses.

There were some variations among individual courses in students’ acceptance of the online format. One factor that could partially account for the observed differences was instructional strategies. In particular, our regression analysis results demonstrated a positive correlation between students’ perceived engagement with faculty and classmates and their perceived overall effectiveness of the online course. Other aspects of course design might also have influenced students’ overall rating of the online course. For instance, some D2 students commented that the requirements of the course “Integrated Case-based Seminars (ICS II)” were not clear and that assessment did not align with lecture materials. It is important to remember that communicating course requirements clearly and aligning course content and assessment are principles that should be applied in any course, whether face-to-face or online. Our results highlight the importance of providing faculty training on basic educational design principles and online learning design strategies. Furthermore, the nature of the course might also have impacted student ratings. For example, the D1 course “Anatomy and Histology” had a lab component, which did not lend itself as well to the online format. Many students reported that it was difficult to see faculty’s live demonstrations during Zoom lectures, which may have resulted in a lower satisfaction rating.

As for students’ preferred online delivery method for fully online courses during the pandemic, about half of them preferred a combination of synchronous and asynchronous online learning. In light of this finding, as we continue with remote learning until public health directives allow a return to campus, we will encourage faculty to integrate these two online delivery modalities. Finally, in view of the result that over 80 % of the students wanted to continue with some online instruction after the pandemic, the school will advocate for blended learning in the post-pandemic world [ 48 ]. For future face-to-face courses on campus after the pandemic, faculty are encouraged to deliver some content online to reduce classroom seat time and make learning more flexible. Taken together, our findings not only add to the overall picture of the current situation but may inform learning design moving forward.

Role of online engagement and interaction

To reiterate, we found that students’ perceived engagement with faculty and classmates predicted their perceived overall effectiveness of the online course. This aligns with the larger literature on best practices in online learning design. Extensive research prior to the pandemic has confirmed that the effectiveness of online learning is determined by a number of factors beyond the tools used, including students’ interactions with the instructor and classmates [ 49 , 50 , 51 , 52 ]. Online students may feel isolated due to reduced or absent interaction [ 53 , 54 ]. Therefore, in designing online learning experiences, it is important to remember that learning is a social process [ 55 ]. Faculty’s role is not only to transmit content but also to promote the different types of interactions that are an integral part of the online learning process [ 33 ]. An online teaching model in which faculty upload materials online but teach them in the same way as in the physical classroom, without special effort to engage students, does not make the best use of the online format. Putting the “sage on the screen” during a live class meeting on a video conferencing system is no different from the “sage on the stage” in the physical classroom - both provide limited space for engagement. Such one-way monologue squanders the potential that online learning presents.

In light of the critical role that social interaction plays in online learning, faculty are encouraged to use the interactive features of online learning platforms to provide clear channels for student-instructor and student-student interactions. In their open-ended comments, students highlighted several instructional strategies that they perceived to be helpful for learning. For live online classes, these included conducting breakout room activities, using the chat box to facilitate discussions, polling, and integrating gameplay with apps such as Kahoot!® [ 56 ]. For self-paced classes, students appreciated that faculty held virtual office hours or subsequent live online discussion sessions to reinforce understanding of the pre-assigned materials.

Quality of online education during the pandemic

This study provided empirical evidence in dental education that it was possible to ensure the continuity of education without sacrificing quality during the forced migration to distance learning upon building closure. To reiterate, in all but one online course offered in summer quarter 2020, students were equally or more likely to get an A grade than the face-to-face cohort from summer quarter 2019. Even for courses that had less student support for the online format (e.g., the D1 course “Anatomy and Histology”), there was a significant increase in the number of students who earned an A grade in 2020 as compared with the previous year. The reduced capacity for technical training during the pandemic may have resulted in more study time for didactic content. Overall, our results resonate with several studies in health sciences education before the pandemic showing that the quality of learning is comparable in face-to-face and online formats [ 9 , 57 , 58 ]. In the only course (“Integrated Case-based Seminars”, ICS II) in which the online cohort performed worse than the face-to-face cohort, as mentioned earlier, students reported that assessment was not aligned with course materials and that course expectations were not clear. This might explain why students’ course performance was not as strong as expected.

Limitations

This study used a pre-existing control group from the previous year. There may have been individual differences between students in the online and the face-to-face cohorts, such as motivation, learning style, and prior knowledge, that could have impacted the observed outcomes. Additionally, even though course content and assessment methods were largely the same in 2019 and 2020, changes in other aspects of the courses could have impacted students’ performance. Some faculty may have been more compassionate with grading (e.g., more flexible with assignment deadlines) in summer quarter 2020 given the hardship students experienced during the pandemic. On the other hand, remote proctoring in summer quarter 2020 may have heightened some students’ exam anxiety knowing that they were being monitored through a webcam. The existence and magnitude of these effects need to be further investigated.

The present study only examined the correlation between students’ perceived online engagement and their perceived overall effectiveness of the online course. Other factors that might impact students’ acceptance of the online format need to be researched in future studies. Another future direction is to examine how students’ perceived online engagement correlates with their actual course performance. Because the survey data collected for the present study are anonymous, we could not match students’ perceived online engagement with their course grades to run this additional analysis. It should also be noted that this study focused on didactic online instruction. Future studies might examine how technical training was impacted during the COVID building closure. It was also beyond the scope of this study to examine how student characteristics, especially high and low academic performance as reflected by individual grades, affect the online learning experience and performance. We plan to conduct a follow-up study to examine which groups of students are most impacted by the online format. Finally, this study was conducted in a single dental school, so the findings may not be generalizable to other schools and disciplines. Future studies could be conducted in other schools or disciplines to compare results.

This study revealed that dental students had generally favorable attitudes towards online learning during the COVID-19 pandemic and that their perceived engagement with faculty and classmates predicted their acceptance of the online course. Most notably, this is the first study in dental education to demonstrate that online learning during the pandemic could achieve similar or better learning outcomes than face-to-face learning before the pandemic. Findings of our study could contribute significantly to the literature on online learning during the COVID-19 pandemic in health sciences education. The results could also inform future online learning design as we re-envision the future of online learning.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med. 2005; 31 (4): 547–552.

Ruiz JG, Mintzer MJ, Leipzig RM. The impact of e-learning in medical education. Acad Med. 2006; 81(3): 207–12.

Kavadella A, Tsiklakis K, Vougiouklakis G, Lionarakis A. Evaluation of a blended learning course for teaching oral radiology to undergraduate dental students. Eur J Dent Educ. 2012; 16(1): 88–95.

de Jong N, Verstegen DL, Tan FS, O’Connor SJ. A comparison of classroom and online asynchronous problem-based learning for students undertaking statistics training as part of a public health master’s degree. Adv Health Sci Educ. 2013; 18(2):245–64.

Hegeman JS. Using instructor-generated video lectures in online mathematics courses improves student learning. Online Learn. 2015; 19(3): 70–87.

Gaupp R, Körner M, Fabry G. Effects of a case-based interactive e-learning course on knowledge and attitudes about patient safety: a quasi-experimental study with third-year medical students. BMC Med Educ. 2016; 16(1):172.

Zheng M, Bender D, Reid L, Milani J. An interactive online approach to teaching evidence-based dentistry with Web 2.0 technology. J Dent Educ. 2017; 81(8): 995–1003.

Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation and Policy Development. Washington D.C. 2009.

Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019; 24(1):1666538.

Andrews KG, Demps EL. Distance education in the U.S. and Canadian undergraduate dental curriculum. J Dent Educ. 2003; 67(4):427–38.

Kassebaum DK, Hendricson WD, Taft T, Haden NK. The dental curriculum at North American dental institutions in 2002–03: a survey of current structure, recent innovations, and planned changes. J Dent Educ. 2004; 68(9):914–931.

Haden NK, Hendricson WD, Kassebaum DK, Ranney RR, Weinstein G, Anderson EL, et al. Curriculum changes in dental education, 2003–09. J Dent Educ. 2010; 74(5):539–57.

DeBate RD, Cragun D, Severson HH, Shaw T, Christiansen S, Koerber A, et al. Factors for increasing adoption of e-courses among dental and dental hygiene faculty members. J Dent Educ. 2011; 75 (5): 589–597.

Saeed SG, Bain J, Khoo E, Siqueira WL. COVID-19: Finding silver linings for dental education. J Dent Educ. 2020; 84(10):1060–1063.

Schlenz MA, Schmidt A, Wöstmann B, Krämer N, Schulz-Weidner N. Students’ and lecturers’ perspective on the implementation of online learning in dental education due to SARS-CoV-2 (COVID-19): a cross-sectional study. BMC Med Educ. 2020;20(1):1–7.

Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2020; https://doi.org/10.1111/eje.12624

Hung M, Licari FW, Hon ES, Lauren E, Su S, Birmingham WC, Wadsworth LL, Lassetter JH, Graff TC, Harman W, et al. In an era of uncertainty: impact of COVID-19 on dental education. J Dent Educ. 2020; 85 (2): 148–156.

Sadid-Zadeh R, Wee A, Li R, Somogyi‐Ganss E. Audience and presenter comparison of live web‐based lectures and traditional classroom lectures during the COVID‐19 pandemic. J Prosthodont. 2020. doi: https://doi.org/10.1111/jopr.13301

Wang K, Zhang L, Ye L. A nationwide survey of online teaching strategies in dental education in China. J Dent Educ. 2020; 85 (2): 128–134.

Rad FA, Otaki F, Baqain Z, Zary N, Al-Halabi M. Rapid transition to distance learning due to COVID-19: Perceptions of postgraduate dental learners and instructors. PLoS One. 2021; 16(2): e0246584.

Abbasi S, Ayoob T, Malik A, Memon SI. Perceptions of students regarding E-learning during Covid-19 at a private medical college. Pak J Med Sci. 2020; 36: 57–61.

Al-Azzam N, Elsalem L, Gombedza F. A cross-sectional study to determine factors affecting dental and medical students’ preference for virtual learning during the COVID-19 outbreak. Heliyon. 6(12). 2020. doi: https://doi.org/10.1016/j.heliyon.2020.e05704

Chen E, Kaczmarek K, Ohyama H. Student perceptions of distance learning strategies during COVID-19. J Dent Educ. 2020. doi: https://doi.org/10.1002/jdd.12339

Kaczmarek K, Chen E, Ohyama H. Distance learning in the COVID-19 era: Comparison of student and faculty perceptions. J Dent Educ. 2020. https://doi.org/10.1002/jdd.12469

Sarwar H, Akhtar H, Naeem MM, Khan JA, Waraich K, Shabbir S, et al. Self-reported effectiveness of e-learning classes during COVID-19 pandemic: A nation-wide survey of Pakistani undergraduate dentistry students. Eur J Dent. 2020; 14 (S01): S34-S43.

Al-Taweel FB, Abdulkareem AA, Gul SS, Alshami ML. Evaluation of technology‐based learning by dental students during the pandemic outbreak of coronavirus disease 2019. Eur J Dent Educ. 2021; 25(1): 183–190.

Elangovan S, Mahrous A, Marchini L. Disruptions during a pandemic: Gaps identified and lessons learned. J Dent Educ. 2020; 84 (11): 1270–1274.

Goodenow C. Classroom belonging among early adolescent students: Relationships to motivation and achievement. J Early Adolesc.1993; 13(1): 21–43.

Goodenow C. The psychological sense of school membership among adolescents: Scale development and educational correlates. Psychol Sch. 1993; 30(1): 79–90.

St-Amand J, Girard S, Smith J. Sense of belonging at school: Defining attributes, determinants, and sustaining strategies. IAFOR Journal of Education. 2017; 5(2):105–19.

Peacock S, Cowan J. Promoting sense of belonging in online learning communities of inquiry at accredited courses. Online Learn. 2019; 23(2): 67–81.

Chan GM, Kanneganti A, Yasin N, Ismail-Pratt I, Logan SJ. Well‐being, obstetrics and gynecology and COVID‐19: Leaving no trainee behind. Aust N Z J Obstet Gynaecol. 2020; 60(6): 983–986.

Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020; 27: 1–12.

Means B, Bakia M, Murphy R. Learning online: What research tells us about whether, when and how. Routledge. 2014.

Iyer P, Aziz K, Ojcius DM. Impact of COVID-19 on dental education in the United States. J Dent Educ. 2020; 84(6): 718–722.

Machado RA, Bonan PRF, Perez DEDC, Martelli Júnior H. COVID-19 pandemic and the impact on dental education: Discussing current and future perspectives. Braz Oral Res. 2020; 34: e083.

Wu DT, Wu KY, Nguyen TT, Tran SD. The impact of COVID-19 on dental education in North America-Where do we go next? Eur J Dent Educ. 2020; 24(4): 825–827.

de Oliveira Araújo FJ, de Lima LSA, Cidade PIM, Nobre CB, Neto MLR. Impact of Sars-Cov-2 and its reverberation in global higher education and mental health. Psychiatry Res. 2020; 288:112977. doi: https://doi.org/10.1016/j.psychres.2020.112977

Persky AM, Lee E, Schlesselman LS. Perception of learning versus performance as outcome measures of educational research. Am J Pharm Educ. 2020; 84(7): ajpe7782.

Zoom. Zoom Video Communications, San Jose, CA, USA. https://zoom.us/

Canvas. Instructure, Inc. Salt Lake City, UT, USA. https://www.instructure.com/canvas

SoftChalk. SoftChalk LLC. San Antonio, TX, USA. https://www.softchalkcloud.com/

Agarwal S, Kaushik JS. Student’s perception of online learning during COVID pandemic. Indian J Pediatr. 2020; 87: 554–554.

Khalil R, Mansour AE, Fadda WA, Almisnid K, Aldamegh M, Al-Nafeesah A, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020; 20(1): 1–10.

Riley E, Capps N, Ward N, McCormack L, Staley J. Maintaining academic performance and student satisfaction during the remote transition of a nursing obstetrics course to online instruction. Online Learn. 2021; 25(1), 220–229.

Amir LR, Tanti I, Maharani DA, Wimardhani YS, Julia V, Sulijaya B, et al. Student perspective of classroom and distance learning during COVID-19 pandemic in the undergraduate dental study program Universitas Indonesia. BMC Med Educ. 2020; 20(1):1–8.

Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10(11).

Graham CR, Woodfield W, Harrison JB. A framework for institutional adoption and implementation of blended learning in higher education. Internet High Educ. 2013; 18 : 4–14.

Sing C, Khine M. An analysis of interaction and participation patterns in online community. J Educ Techno Soc. 2006; 9(1): 250–261.

Bernard RM, Abrami PC, Borokhovski E, Wade CA, Tamim RM, Surkes MA, et al. A meta-analysis of three types of interaction treatments in distance education. Rev Educ Res. 2009; 79(3): 1243–1289.

Fedynich L, Bradley KS, Bradley J. Graduate students’ perceptions of online learning. Res High Educ. 2015; 27.

Tanis CJ. The seven principles of online learning: Feedback from faculty and alumni on its importance for teaching and learning. Res Learn Technol. 2020; 28 . https://doi.org/10.25304/rlt.v28.2319

Dixson MD. Measuring student engagement in the online course: The Online Student Engagement scale (OSE). Online Learn. 2015; 19 (4).

Kwary DA, Fauzie S. Students’ achievement and opinions on the implementation of e-learning for phonetics and phonology lectures at Airlangga University. Educ Pesqui. 2018; 44 .

Vygotsky LS. Mind in society: The development of higher psychological processes. Cambridge (MA): Harvard University Press. 1978.

Kahoot!. Oslo, Norway. https://kahoot.com/

Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomised controlled trial. BMC Med Educ. 2007; 7(1): 1–6.

Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence-based medicine: a randomized controlled trial. Med Teach. 2008; 30(3): 302–307.

Acknowledgements

Not applicable.

Authors’ information

MZ is an Associate Professor of Learning Sciences and Senior Instructional Designer at School of Dentistry, University of the Pacific. She has a PhD in Education, with a specialty on learning sciences and technology. She has dedicated her entire career to conducting research on online learning, learning technology, and faculty development. Her research has resulted in several peer-reviewed publications in medical, dental, and educational technology journals. MZ has also presented regularly at national conferences.

DB is an Assistant Dean for Academic Affairs at School of Dentistry, University of the Pacific. He has an EdD degree in education, with a concentration on learning and instruction. Over the past decades, DB has been overseeing and delivering faculty pedagogical development programs to dental faculty. His research interest lies in educational leadership and instructional innovation. DB has co-authored several peer-reviewed publications in health sciences education and presented regularly at national conferences.

CL is Associate Dean of Oral Healthcare Education, School of Dentistry, University of the Pacific. She has a Doctor of Dental Surgery (DDS) degree and an EdD degree with a focus on educational leadership. Her professional interest lies in educational leadership, oral healthcare education innovation, and faculty development. CL has co-authored several publications in peer-reviewed journals in health sciences education and presented regularly at national conferences.

Author information

Authors and affiliations

Office of Academic Affairs, Arthur A. Dugoni School of Dentistry, University of the Pacific, San Francisco, CA, USA

Meixun Zheng, Daniel Bender & Cindy Lyon

Contributions

MZ analyzed the data and wrote the initial draft of the manuscript. DB and CL assisted with research design and data collection, and reviewed and edited the manuscript. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Meixun Zheng .

Ethics declarations

Ethics approval and consent to participate

The study was approved by the institutional review board at University of the Pacific in the U.S. (#2020-68). Informed consent was obtained from all participants. All methods were carried out in accordance with relevant guidelines and regulations.

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Survey of online courses during COVID-19 pandemic.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Zheng, M., Bender, D. & Lyon, C. Online learning during COVID-19 produced equivalent or better student course performance as compared with pre-pandemic: empirical evidence from a school-wide comparative study. BMC Med Educ 21, 495 (2021). https://doi.org/10.1186/s12909-021-02909-z

Received: 31 March 2021

Accepted: 26 August 2021

Published: 16 September 2021

DOI: https://doi.org/10.1186/s12909-021-02909-z

  • Dental education
  • Online learning
  • COVID-19 pandemic
  • Instructional strategies
  • Interaction
  • Learning performance

BMC Medical Education

ISSN: 1472-6920

Open Access

Peer-reviewed

Research Article

COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis

Roles Data curation, Formal analysis, Methodology, Writing – review & editing

¶ ‡ JZ and YD contributed equally to this work as first authors.

Affiliation School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China

Roles Data curation, Formal analysis, Methodology, Writing – original draft

Affiliations School of Educational Information Technology, South China Normal University, Guangzhou, Guangdong, China, Hangzhou Zhongce Vocational School Qiantang, Hangzhou, Zhejiang, China

Roles Data curation, Writing – original draft

Roles Data curation

Roles Writing – original draft

Affiliation Faculty of Education, Shenzhen University, Shenzhen, Guangdong, China

Roles Conceptualization, Supervision, Writing – review & editing

* E-mail: [email protected] (JH); [email protected] (YZ)

  • Junyi Zhang, 
  • Yigang Ding, 
  • Xinru Yang, 
  • Jinping Zhong, 
  • XinXin Qiu, 
  • Zhishan Zou, 
  • Yujie Xu, 
  • Xiunan Jin, 
  • Xiaomin Wu, 

  • Published: August 23, 2022
  • https://doi.org/10.1371/journal.pone.0273016

Table 1

The COVID-19 outbreak brought online learning to the forefront of education. Scholars have conducted many studies on online learning during the pandemic, but only a few have performed quantitative comparative analyses of students’ online learning behavior before and after the outbreak. We collected review data from China’s massive open online course platform called icourse.163 and performed social network analysis on 15 courses to explore courses’ interaction characteristics before, during, and after the COVID-19 pandemic. Specifically, we focused on the following aspects: (1) variations in the scale of online learning amid COVID-19; (2a) the characteristics of online learning interaction during the pandemic; (2b) the characteristics of online learning interaction after the pandemic; and (3) differences in the interaction characteristics of social science courses and natural science courses. Results revealed that only a small number of courses witnessed an uptick in online interaction, suggesting that the pandemic’s role in promoting the scale of courses was not significant. During the pandemic, online learning interaction became more frequent among course network members whose interaction scale increased. After the pandemic, although the scale of interaction declined, online learning interaction became more effective. The scale and level of interaction in Electrodynamics (a natural science course) and Economics (a social science course) both rose during the pandemic. However, long after the pandemic, the Economics course sustained online interaction whereas interaction in the Electrodynamics course steadily declined. This discrepancy could be due to the unique characteristics of natural science courses and social science courses.

Citation: Zhang J, Ding Y, Yang X, Zhong J, Qiu X, Zou Z, et al. (2022) COVID-19’s impacts on the scope, effectiveness, and interaction characteristics of online learning: A social network analysis. PLoS ONE 17(8): e0273016. https://doi.org/10.1371/journal.pone.0273016

Editor: Heng Luo, Central China Normal University, CHINA

Received: April 20, 2022; Accepted: July 29, 2022; Published: August 23, 2022

Copyright: © 2022 Zhang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The data underlying the results presented in the study were downloaded from https://www.icourse163.org/ and are now shared fully on Github ( https://github.com/zjyzhangjunyi/dataset-from-icourse163-for-SNA ). These data have no private information and can be used for academic research free of charge.

Funding: The author(s) received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

1. Introduction

The development of the mobile internet has spurred rapid advances in online learning, offering novel prospects for teaching and learning and a learning experience completely different from traditional instruction. Online learning harnesses the advantages of network technology and multimedia technology to transcend the boundaries of conventional education [ 1 ]. Online courses have become a popular learning mode owing to their flexibility and openness. During online learning, teachers and students are in different physical locations but interact in multiple ways (e.g., via online forum discussions and asynchronous group discussions). An analysis of online learning therefore calls for attention to students’ participation. Alqurashi [ 2 ] defined interaction in online learning as the process of constructing meaningful information and thought exchanges between two or more people; such interaction typically occurs between teachers and learners, learners and learners, and the course content and learners.

Massive open online courses (MOOCs), a 21st-century teaching mode, have greatly influenced global education. Data released by China’s Ministry of Education in 2020 show that the country ranks first globally in the number and scale of higher education MOOCs. The COVID-19 outbreak has further propelled this learning mode, with universities being urged to leverage MOOCs and other online resource platforms to respond to the government’s “School’s Out, But Class’s On” policy [ 3 ]. Besides MOOCs, to reduce in-person gatherings and curb the spread of COVID-19, various online learning methods have since become ubiquitous [ 4 ]. Though Lederman asserted that the COVID-19 outbreak has positioned online learning technologies as the best way for teachers and students to obtain satisfactory learning experiences [ 5 ], it remains unclear whether the COVID-19 pandemic has encouraged interaction in online learning, as interactions between students and others play key roles in academic performance and largely determine the quality of learning experiences [ 6 ]. Similarly, it is also unclear what impact the COVID-19 pandemic has had on the scale of online learning.

Social constructivism paints learning as a social phenomenon. As such, analyzing the social structures or patterns that emerge during the learning process can shed light on learning-based interaction [ 7 ]. Social network analysis helps to explain how a social network, rooted in interactions between learners and their peers, guides individuals’ behavior, emotions, and outcomes. This analytical approach is especially useful for evaluating interactive relationships between network members [ 8 ]. Mohammed cited social network analysis (SNA) as a method that can provide timely information about students, learning communities, and interactive networks. SNA has been applied in numerous fields, including education, to identify the number and characteristics of inter-element relationships. For example, Lee et al. used SNA to explore the effects of blogs on peer relationships [ 7 ]. Therefore, adopting SNA to examine interactions in online learning communities during the COVID-19 pandemic can uncover potential issues with this online learning model.

Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, focusing on learners’ interaction characteristics before, during, and after the COVID-19 outbreak. We visually assessed changes in the scale of network interaction before, during, and after the outbreak along with the characteristics of interaction in Gephi. Examining students’ interactions in different courses revealed distinct interactive network characteristics, the pandemic’s impact on online courses, and relevant suggestions. Findings are expected to promote effective interaction and deep learning among students in addition to serving as a reference for the development of other online learning communities.

2. Literature review and research questions

Interaction is deemed central to the educational experience and is a major focus of research on online learning. Moore began to study the problem of interaction in distance education as early as 1989. He defined three core types of interaction: student–teacher, student–content, and student–student [ 9 ]. Lear et al. [ 10 ] described an interactivity/community-process model of distance education: they specifically discussed the relationships between interactivity, community awareness, and engaging learners and found interactivity and community awareness to be correlated with learner engagement. Zulfikar et al. [ 11 ] suggested that discussions initiated by students encourage more student engagement than discussions initiated by instructors. It is most important to afford learners opportunities to interact purposefully with teachers, and improving the quality of learner interaction is crucial to fostering profound learning [ 12 ]. Interaction is an important way for learners to communicate and share information, and a key factor in the quality of online learning [ 13 ].

Timely feedback is the main component of online learning interaction. Woo and Reeves discovered that students often become frustrated when they fail to receive prompt feedback [ 14 ]. Shelley et al. conducted a three-year study of graduate and undergraduate students’ satisfaction with online learning at universities and found that interaction with educators and students is the main factor affecting satisfaction [ 15 ]. Teachers therefore need to provide students with scoring justification, support, and constructive criticism during online learning. Some researchers examined online learning during the COVID-19 pandemic. They found that most students preferred face-to-face learning rather than online learning due to obstacles faced online, such as a lack of motivation, limited teacher-student interaction, and a sense of isolation when learning in different times and spaces [ 16 , 17 ]. However, this sense of isolation can be reduced by enhancing online interaction between teachers and students [ 18 ].

Research showed that interactions contributed to maintaining students’ motivation to continue learning [ 19 ]. Baber argued that interaction played a key role in students’ academic performance and influenced the quality of the online learning experience [ 20 ]. Hodges et al. maintained that well-designed online instruction can lead to unique teaching experiences [ 21 ]. Banna et al. mentioned that using discussion boards, chat sessions, blogs, wikis, and other tools could promote student interaction and improve participation in online courses [ 22 ]. During the COVID-19 pandemic, Mahmood proposed a series of teaching strategies suitable for distance learning to improve its effectiveness [ 23 ]. Lapitan et al. devised an online strategy to ease the transition from traditional face-to-face instruction to online learning [ 24 ]. The preceding discussion suggests that online learning goes beyond simply providing learning resources; teachers should ideally design real-life activities to give learners more opportunities to participate.

As mentioned, COVID-19 has driven many scholars to explore the online learning environment. However, most have ignored the uniqueness of online learning during this time and have rarely compared pre- and post-pandemic online learning interaction. Taking China’s icourse.163 MOOC platform as an example, we chose 15 courses with a large number of participants for SNA, centering on student interaction before and after the pandemic. Gephi was used to visually analyze changes in the scale and characteristics of network interaction. The following questions were of particular interest:

  • (1) Can the COVID-19 pandemic promote the expansion of online learning?
  • (2a) What are the characteristics of online learning interaction during the pandemic?
  • (2b) What are the characteristics of online learning interaction after the pandemic?
  • (3) How do interaction characteristics differ between social science courses and natural science courses?

3. Methodology

3.1 Research context

We selected several courses with a large number of participants and extensive online interaction among hundreds of courses on the icourse.163 MOOC platform. These courses had been offered on the platform for at least three semesters, covering three periods (i.e., before, during, and after the COVID-19 outbreak). To eliminate the effects of shifts in irrelevant variables (e.g., course teaching activities), we chose several courses with similar teaching activities and compared them on multiple dimensions. All course content was taught online. The teachers of each course posted discussion threads related to learning topics; students were expected to reply via comments. Learners could exchange ideas freely in their responses in addition to asking questions and sharing their learning experiences. Teachers could answer students’ questions as well. Conversations in the comment area could partly compensate for a relative absence of online classroom interaction. Teacher–student interaction was conducive to the formation of a social network structure and enabled us to examine teachers’ and students’ learning behavior through SNA. The comment areas in these courses were intended for learners to construct knowledge via reciprocal communication. Meanwhile, by answering students’ questions, teachers could encourage them to reflect on their learning progress. These courses’ successive terms also spanned several phases of COVID-19, allowing us to ascertain the pandemic’s impact on online learning.

3.2 Data collection and preprocessing

To avoid interference from invalid or unclear data, the following criteria were applied to select representative courses: (1) generality (i.e., public and professional courses were chosen from different schools across China); (2) time validity (i.e., courses were held before, during, and after the pandemic); and (3) notability (i.e., each course had at least 2,000 participants). We ultimately chose 15 courses across the social sciences and natural sciences (see Table 1). Codes are used to represent course names.


https://doi.org/10.1371/journal.pone.0273016.t001

To discern courses’ evolution during the pandemic, we gathered data on three terms held before, during, and after the COVID-19 outbreak, in addition to data from two terms completed well before the pandemic and long after it. Our final dataset thus comprised five sets of interaction data, totaling about 120,000 comments for SNA. Because each course had a different start time, in line with fluctuations in the number of confirmed COVID-19 cases in China and the opening dates of most colleges and universities, we divided our sample into five phases: well before the pandemic (Phase I); before the pandemic (Phase II); during the pandemic (Phase III); after the pandemic (Phase IV); and long after the pandemic (Phase V). We sought to preserve consistent time spans to balance the amount of data in each period ( Fig 1 ).


https://doi.org/10.1371/journal.pone.0273016.g001

3.3 Instrumentation

Participants’ comments and “thumbs-up” behavior data were converted into a network structure and compared using social network analysis (SNA). Network analysis, according to M’Chirgui, is an effective tool for clarifying network relationships by employing sophisticated techniques [ 25 ]. Specifically, SNA can help explain the underlying relationships among team members and provide a better understanding of their internal processes. Yang and Tang used SNA to discuss the relationship between team structure and team performance [ 26 ]. Golbeck argued that SNA could improve the understanding of students’ learning processes and reveal learners’ and teachers’ role dynamics [ 27 ].

To analyze Question (1), the number of nodes and the diameter of the generated network were treated as indicators of changes in network size. Social networks are typically represented as graphs with nodes and edges, and node count indicates the sample size [ 15 ]. Wellman et al. proposed that the larger the network scale, the greater the number of network members providing emotional support, goods, services, and companionship [ 28 ]. Jan’s study measured network size by counting the nodes, which represented students, lecturers, and tutors [ 29 ]. Similarly, network nodes in the present study indicated how many learners and teachers participated in the course, with more nodes indicating more participants. Furthermore, we investigated the network diameter, a structural feature of social networks and a common metric for measuring network size in SNA [ 30 ]. The network diameter is the longest shortest path between any two nodes in the network. There is evidence that a larger network diameter leads to greater spread of behavior [ 31 ]. Likewise, Gašević et al. found that larger networks were more likely to spread innovative ideas about educational technology when analyzing MOOC-related research citations [ 32 ]. Therefore, we employed node count and network diameter to measure the network’s spatial size and further explore the expansion of online courses. A brief introduction to these indicators appears in Table 2.
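Both size metrics can be derived directly from reply data. As a minimal sketch (the participant IDs and reply pairs below are hypothetical, not drawn from our dataset), node count and diameter follow from an adjacency list and breadth-first search:

```python
from collections import deque

# Hypothetical reply pairs (commenter, replied-to) from a course forum.
edges = [("t1", "s1"), ("s1", "s2"), ("s2", "s3"), ("t1", "s4")]

# Build an undirected adjacency list.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

def eccentricity(node):
    """Longest shortest-path distance from `node`, via BFS."""
    dist = {node: 0}
    queue = deque([node])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

num_nodes = len(adj)                           # network size (participants)
diameter = max(eccentricity(n) for n in adj)   # longest shortest path
print(num_nodes, diameter)
```

Tools such as Gephi report both quantities from the statistics panel; the sketch only makes explicit what those figures measure.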


https://doi.org/10.1371/journal.pone.0273016.t002

To address Question (2), a set of SNA interaction metrics was introduced to scrutinize learners’ interaction characteristics in online learning during and after the pandemic, as shown below:

  • (1) The average degree reflects the density of the network by calculating the average number of connections for each node. As Rong and Xu suggested, the average degree of a network indicates how active its participants are [ 33 ]. According to Hu, a higher average degree implies that more students are interacting directly with each other in a learning context [ 34 ]. The present study inherited the concept of the average degree from these previous studies: the higher the average degree, the more frequent the interaction between individuals in the network.
  • (2) Essentially, the weighted average degree of a network is calculated by weighting each node’s degree by the strength of its connections and then taking the average. Bydžovská took the strength of the relationship into account when determining the weighted average degree [ 35 ]. By calculating friendship’s weighted value, Maroulis assessed peer achievement within a small-school reform [ 36 ]. Accordingly, we considered the number of interactions as the weight of the degree, with a higher weighted average degree indicating more active interaction among learners.
  • (3) Network density is the ratio between actual connections and potential connections in a network. The more connections group members have with each other, the higher the network density. In SNA, network density is similar to group cohesion, i.e., a network of more strong relationships is more cohesive [ 37 ]. Network density also reflects how much all members are connected together [ 38 ]. Therefore, we adopted network density to indicate the closeness among network members. Higher network density indicates more frequent interaction and closer communication among students.
  • (4) Clustering coefficient describes local network attributes and indicates that two nodes in the network could be connected through adjacent nodes. The clustering coefficient measures users’ tendency to gather (cluster) with others in the network: the higher the clustering coefficient, the more frequently users communicate with other group members. We regarded this indicator as a reflection of the cohesiveness of the group [ 39 ].
  • (5) In a network, the average path length is the average number of steps along the shortest paths between all pairs of nodes. Olivares et al. observed that when the average path length is small, the route from one node to another is short when graphed [ 40 ], which in educational settings suggests that students can reach one another easily. We therefore consider that the smaller the average path length, the greater the possibility of interaction between individuals in the network.
  • (6) A network with a large number of nodes but a surprisingly small average path length is said to exhibit the small-world effect [ 41 ]. A higher clustering coefficient and a shorter average path length are the defining indicators of a small-world network: a shorter average path length enables the network to spread information faster and more accurately, while a higher clustering coefficient promotes frequent knowledge exchange within the group and boosts the timeliness and accuracy of knowledge dissemination [ 42 ]. A brief introduction to these indicators appears in Table 3.
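Gephi computes indicators (1)–(5) in its statistics panel; as an illustrative sketch over invented interaction data (the learner IDs and weights below are hypothetical), the same quantities can be computed directly:

```python
from collections import deque
from itertools import combinations

# Hypothetical weighted interaction pairs (a, b, number_of_exchanges).
weighted_edges = [("s1", "s2", 3), ("s1", "s3", 1), ("s2", "s3", 2), ("s3", "s4", 1)]

adj = {}
for a, b, w in weighted_edges:
    adj.setdefault(a, {})[b] = w
    adj.setdefault(b, {})[a] = w

n = len(adj)                  # nodes
m = len(weighted_edges)       # edges

avg_degree = 2 * m / n                                   # (1) mean links per node
avg_weighted_degree = 2 * sum(w for _, _, w in weighted_edges) / n  # (2)
density = 2 * m / (n * (n - 1))                          # (3) actual vs. possible edges

def local_clustering(u):
    """Fraction of u's neighbour pairs that are themselves linked."""
    nbrs = list(adj[u])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for x, y in combinations(nbrs, 2) if y in adj[x])
    return 2 * links / (k * (k - 1))

clustering = sum(local_clustering(u) for u in adj) / n   # (4) network average

def shortest_paths(src):
    """BFS distances from src to every reachable node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# (5) average path length over all connected ordered pairs
total = pairs = 0
for u in adj:
    for v, d in shortest_paths(u).items():
        if v != u:
            total += d
            pairs += 1
avg_path_length = total / pairs
```

A high `clustering` combined with a low `avg_path_length`, relative to a comparable random network, is the small-world signature described in indicator (6).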


https://doi.org/10.1371/journal.pone.0273016.t003

To analyze Question (3), we used closeness centrality, which measures how close a vertex is to the other vertices in the network. As Opsahl et al. explained, closeness centrality reveals how closely actors are coupled with their entire social network [ 43 ]. Analyzing social network-based engineering education, Putnik et al. found that closeness centrality was significantly correlated with grades [ 38 ]. We used closeness centrality to measure an individual’s position in the network. A brief introduction to this indicator appears in Table 4.
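As a hypothetical sketch of this measure (the network below is invented), closeness centrality for each actor can be computed as the number of reachable peers divided by the total distance to them, so higher scores mean more central positions:

```python
from collections import deque

# Hypothetical undirected interaction network as an adjacency map.
adj = {"s1": {"s2", "s3"}, "s2": {"s1", "s3"}, "s3": {"s1", "s2", "s4"}, "s4": {"s3"}}

def closeness(u):
    """Reachable peers divided by total shortest-path distance from u."""
    dist = {u: 0}
    q = deque([u])
    while q:
        x = q.popleft()
        for y in adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0

scores = {u: closeness(u) for u in adj}
central = max(scores, key=scores.get)  # most central actor in the network
```

In this toy network, the bridging node that every path runs through receives the highest score, which matches the intuition that central actors are coupled most tightly to the whole network.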


https://doi.org/10.1371/journal.pone.0273016.t004

3.4 Ethics statement

This study was approved by the Academic Committee Office (ACO) of South China Normal University ( http://fzghb.scnu.edu.cn/ ), Guangzhou, China. Research data were collected from the open platform and analyzed anonymously. There are thus no privacy issues involved in this study.

4. Results

4.1 COVID-19’s role in promoting the scale of online courses was not as important as expected

As shown in Fig 2 , the number of course participants and nodes closely tracked the pandemic’s trajectory. Because the number of participants in each course varied widely, we normalized the numbers of participants and nodes to more conveniently visualize course trends. Fig 2 depicts changes in the chosen courses’ numbers of participants and nodes before the pandemic (Phase II), during the pandemic (Phase III), and after the pandemic (Phase IV). The number of participants in most courses during the pandemic exceeded the numbers before and after it. However, the number of people participating in interaction did not increase in some courses.


https://doi.org/10.1371/journal.pone.0273016.g002

To better analyze the trend of interaction scale in online courses before, during, and after the pandemic, the selected courses were categorized according to their scale change. When the number of participants increased (decreased) by more than 20% (an empirically chosen threshold) and the diameter also increased (decreased), the course’s interaction scale was determined to have increased (decreased); otherwise, no significant change was identified. Courses were subsequently divided into three categories: increased interaction scale, decreased interaction scale, and no significant change. Results appear in Table 5.
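This classification rule can be stated compactly. The sketch below assumes hypothetical participant counts and diameters and mirrors the 20% threshold described above:

```python
def classify(participants_before, participants_during,
             diam_before, diam_during, threshold=0.2):
    """Label a course's interaction-scale change between two phases.

    'increased' ('decreased') requires the participant count to change by
    more than the threshold AND the network diameter to move in the same
    direction; anything else counts as 'no significant change'.
    """
    change = (participants_during - participants_before) / participants_before
    if change > threshold and diam_during > diam_before:
        return "increased"
    if change < -threshold and diam_during < diam_before:
        return "decreased"
    return "no significant change"
```

For example, a course going from 1,000 to 1,400 participants while its diameter grows from 6 to 8 would be labeled "increased", whereas a 10% rise alone would not qualify.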


https://doi.org/10.1371/journal.pone.0273016.t005

From before the pandemic until its outbreak, the interaction scale of five courses increased, accounting for 33.3% of the full sample; one course’s interaction scale declined, accounting for 6.7%; and nine courses’ interaction scale did not change significantly, accounting for 60%. The pandemic’s role in promoting online courses was thus not as important as anticipated, and most courses’ interaction scale did not change significantly throughout.

No courses displayed a growing interaction scale after the pandemic: the interaction scale of nine courses fell, accounting for 60%, and that of six courses did not shift significantly, accounting for 40%. Courses whose interaction scale increased during the pandemic did not maintain an upward trend. On the contrary, as pandemic conditions improved, learners’ enthusiasm for online learning waned. We next analyzed several interaction metrics to further explore course interaction during different pandemic periods.

4.2 Characteristics of online learning interaction amid COVID-19

4.2.1 During the COVID-19 pandemic, online learning interaction in some courses became more active.

Fig 3 presents changes in the indicators of the courses whose interaction scale grew during the pandemic (SS5, SS6, NS1, NS3, and NS8). The horizontal axis indicates the number of courses, with red representing a rise in the indicator named on the vertical axis and blue representing a decline.


https://doi.org/10.1371/journal.pone.0273016.g003

Specifically: (1) The average degree and weighted average degree of the five course networks demonstrated an upward trend. The emergence of the pandemic promoted students’ enthusiasm; learners were more active in the interactive network. (2) Fig 3 shows that network density increased in three courses and decreased in two. The higher the network density, the more communication within the group. Even though the pandemic accelerated the interaction scale and frequency, the closeness between learners in some courses did not improve. (3) The clustering coefficient of social science courses rose, whereas the clustering coefficient and small-world property of natural science courses fell. The higher the clustering coefficient and small-world property, the better the relationships between adjacent nodes and the higher the cohesion [ 39 ]. (4) Most courses’ average path length increased as the interaction scale increased. However, a growing average path length can have adverse effects: communication between learners might be limited to a small group without multi-directional interaction.

When the pandemic emerged, the only declining network scale belonged to a natural science course (NS2). The change in each course index is pictured in Fig 4. The abscissa indicates the magnitude of each value, with larger values to the right. The red dot indicates the index value before the pandemic; the blue dot indicates its value during the pandemic. If the blue dot is to the right of the red dot, then the index value increased; otherwise, it declined. Only the weighted average degree of the course network increased. The average degree and network density decreased, indicating that network members were not active and that learners’ degree of interaction and communication frequency lessened. Despite reduced learner interaction, the average path length was small and connectivity between learners remained adequate.


https://doi.org/10.1371/journal.pone.0273016.g004

4.2.2 After the COVID-19 pandemic, the scale decreased rapidly, but most course interaction was more effective.

Fig 5 shows the changes in various courses’ interaction indicators after the pandemic, including SS1, SS2, SS3, SS6, SS7, NS2, NS3, NS7, and NS8.


https://doi.org/10.1371/journal.pone.0273016.g005

Specifically: (1) The average degree and weighted average degree of most course networks decreased. The scope and intensity of interaction among network members declined rapidly, as did learners’ enthusiasm for communication. (2) The network density of seven courses also fell, indicating weaker connections between learners in most courses. (3) In addition, the clustering coefficient and small-world property of most course networks decreased, suggesting little possibility of small groups in the network. The scope of interaction between learners was not limited to a specific space, and the interaction objects had no significant tendencies. (4) Although the scale of course interaction became smaller in this phase, the average path length of members’ social networks shortened in nine courses. A shorter average path length expedites the spread of information within the network as well as communication and sharing among network members.

Fig 6 displays the evolution of course interaction indicators without significant changes in interaction scale after the pandemic, including SS4, SS5, NS1, NS4, NS5, and NS6.


https://doi.org/10.1371/journal.pone.0273016.g006

Specifically: (1) Some course members’ social networks exhibited increases in the average degree and weighted average degree. In these cases, even though the course network’s scale did not continue to increase, communication among network members rose and interaction became more frequent and deeper than before. (2) Network density and average path length are indicators of social network compactness. The greater the network density, the denser the social network; the shorter the average path length, the more concentrated the communication among network members. At this phase, however, the average path length and network density in most courses had increased. Yet the network density remained small despite having risen ( Table 6 ). Even with more frequent learner interaction, connections remained distant and the social network was comparatively sparse.


https://doi.org/10.1371/journal.pone.0273016.t006

In summary, the scale of interaction did not change significantly overall. Nonetheless, some course members’ frequency and extent of interaction increased, and the relationships between network members became closer. Interestingly, the interaction scale of Economics (a social science course) and Electrodynamics (a natural science course) expanded rapidly during the pandemic, and both retained their interaction scale thereafter. We next assessed these two courses to determine whether their level of interaction persisted after the pandemic.

4.3 Analyses of natural science courses and social science courses

4.3.1 Analyses of the interaction characteristics of Economics and Electrodynamics.

Economics and Electrodynamics are a social science course and a natural science course, respectively. Members’ interaction within these courses was similar: the interaction scale increased significantly when COVID-19 broke out (Phase III), and no significant changes emerged long after the pandemic (Phase V). We hence focused on course interaction long after the outbreak (Phase V) and compared changes across multiple indicators, as listed in Table 7.


https://doi.org/10.1371/journal.pone.0273016.t007

As pandemic conditions continued to improve, the number of participants and the diameter for Economics each declined long after the outbreak (Phase V) compared with after the pandemic (Phase IV). The interaction scale decreased, but interaction between learners deepened considerably. Specifically: (1) The weighted average degree, network density, clustering coefficient, and small-world property each reflected upward trends. The pandemic therefore exerted a strong impact on this course, and interaction was well maintained even after it; the smaller network scale promoted members’ interaction and communication. (2) Compared with after the pandemic (Phase IV), members’ network density increased significantly, showing that relationships between learners were closer and that cohesion was improving. (3) At the same time, as the clustering coefficient and small-world property grew, network members demonstrated strong small-group characteristics: communication between them was deepening and their enthusiasm for interaction was higher. (4) Long after the COVID-19 outbreak (Phase V), the average path length was shorter than in previous terms, knowledge flowed more quickly among network members, and the degree of interaction gradually deepened.

The average degree, weighted average degree, network density, clustering coefficient, and small-world property of Electrodynamics all decreased long after the COVID-19 outbreak (Phase V) and were lower than during the outbreak (Phase III). The level of learner interaction therefore gradually declined long after the outbreak (Phase V), and connections between learners were no longer active. Although the pandemic increased course members’ extent of interaction, this rise was merely temporary: students’ enthusiasm for learning waned rapidly and their interaction decreased after the pandemic (Phase IV). To further analyze the interaction characteristics of course members in Economics and Electrodynamics, we evaluated the closeness centrality of their social networks, as described in Section 4.3.2.

4.3.2 Analysis of the closeness centrality of Economics and Electrodynamics.

The change in the closeness centrality of social networks in Economics was small, with no sharp upward trend during the pandemic outbreak, as shown in Fig 7. The emergence of COVID-19 apparently fostered learners’ interaction in Economics, albeit without a significant impact. The pattern for Electrodynamics differed: upon the COVID-19 outbreak, its closeness centrality was significantly different from that of other semesters. Communication between learners was closer and interaction was more effective. Electrodynamics course members’ social network proximity decreased rapidly after the pandemic, and learners’ communication lessened. In general, the Economics course showed better interaction before the outbreak and was less affected by the pandemic; the Electrodynamics course was more affected by the pandemic and showed different interaction characteristics at different periods.


(Note: "****" indicates a significant difference in closeness centrality between two periods; otherwise, there was no significant difference.)

https://doi.org/10.1371/journal.pone.0273016.g007

5. Discussion

We referred to discussion forums from several courses on the icourse.163 MOOC platform to compare online learning before, during, and after the COVID-19 pandemic via SNA and to delineate the pandemic’s effects on online courses. Only 33.3% of courses in our sample increased in terms of interaction during the pandemic, and the scale of interaction did not rise in any course thereafter. When course scale rose, the scope and frequency of interaction showed upward trends during the pandemic, and the clustering coefficients of natural science and social science courses differed: the coefficient for social science courses tended to rise, whereas that for natural science courses generally declined. When the pandemic broke out, the interaction scale of a single natural science course decreased along with its interaction scope and frequency. Interaction in most courses shrank rapidly after the pandemic, and network members were not as active as they had been before. However, some courses saw declining interaction yet greater communication between members; interaction also became more frequent and deeper than before.

5.1 During the COVID-19 pandemic, the scale of interaction increased in only a few courses

The pandemic outbreak led to a rapid increase in the number of participants in most courses; however, the change in network scale was not significant. The scale of online interaction expanded swiftly in only a few courses; in others, the scale either did not change significantly or displayed a downward trend. After the pandemic, the interaction scale in most courses decreased quickly, as did communication between network members. Learners’ enthusiasm for online interaction declined as pandemic conditions improved, potentially because, during the pandemic, China’s Ministry of Education declared the “School’s Out, But Class’s On” policy. Major colleges and universities were encouraged to use the Internet and informational resources to provide learning support, hence the sudden increase in the number of participants and interaction in online courses [ 46 ]. After the pandemic, students’ enthusiasm for online learning gradually weakened, presumably due to the easing of the pandemic [ 47 ]. More activities also transitioned from online to offline, which tempered learners’ online discussion. Research has shown that long-term online learning can even bore students [ 48 ].

Most courses’ interaction scale decreased significantly after the pandemic. First, teachers and students occupied separate spaces during the outbreak, had few opportunities for mutual cooperation and friendship, and lacked a sense of belonging [ 49 ]. Students’ enthusiasm for learning dissipated over time [ 50 ]. Second, some teachers were especially concerned about adapting in-person instructional materials for digital platforms; their pedagogical methods were ineffective, and they did not provide learning activities germane to student interaction [ 51 ]. Third, although teachers and students in remote areas were actively engaged in online learning, some students could not continue to participate in distance learning due to inadequate technology later in the outbreak [ 52 ].

5.2 Characteristics of online learning interaction during and after the COVID-19 pandemic

5.2.1 During the COVID-19 pandemic, online interaction in most courses did not change significantly.

The interaction scale of only a few courses increased during the pandemic. The interaction scope and frequency of these courses climbed as well. Yet even as the degree of network interaction rose, course network density did not expand in all cases. The pandemic sparked a surge in the number of online learners and a rapid increase in network scale, but students found it difficult to interact with all learners. Yau pointed out that a greater network scale did not enrich the range of interaction between individuals; rather, the number of individuals who could interact directly was limited [ 53 ]. The internet facilitates interpersonal communication. However, not everyone has the time or ability to establish close ties with others [ 54 ].

In addition, the social science courses and natural science courses in our sample revealed disparate trends in this regard: the clustering coefficient of social science courses increased, while that of natural science courses decreased. Social science courses usually employ learning approaches distinct from those in natural science courses [ 55 ]. Social science courses emphasize critical and innovative thinking along with personal expression [ 56 ]; natural science courses focus on practical skills, methods, and principles [ 57 ]. Therefore, the content of social science courses can spur large-scale discussion among learners. Some course evaluations indicated that course content design was suboptimal as well: teachers paid close attention to knowledge transmission and much less to piquing students’ interest in learning. Moreover, the thread topics teachers posted lacked variety, and teachers’ questions lacked openness. These attributes could not spark active discussion among learners.

5.2.2 Online learning interaction declined after the COVID-19 pandemic.

Most courses’ interaction scale and intensity decreased rapidly after the pandemic, but some did not change. Courses with a larger network scale did not continue to expand after the outbreak, and students’ enthusiasm for learning faded. The pandemic’s reduced severity also influenced the number of participants in online courses. Meanwhile, restored school order moved many learning activities from virtual to in-person spaces. Face-to-face learning gradually replaced online learning, resulting in lower enrollment and less interaction in online courses. Prolonged online courses could also have led students to feel lonely and to lack a sense of belonging [ 58 ].

The scale of interaction in some courses did not change substantially after the pandemic yet learners’ connections became tighter. We hence recommend that teachers seize pandemic-related opportunities to design suitable activities. Additionally, instructors should promote student-teacher and student-student interaction, encourage students to actively participate online, and generally intensify the impact of online learning.

5.3 What are the characteristics of interaction in social science courses and natural science courses?

The level of interaction in Economics (a social science course) was significantly higher than that in Electrodynamics (a natural science course), and the small-world property in Economics increased as well. To boost online courses’ learning-related impacts, teachers can divide learners into groups based on the clustering coefficient and the average path length. Such small groups may benefit teachers in several ways: key actors can be identified within them and encouraged to participate actively in activities intended to expand students’ knowledge. Cultivating students’ keenness to participate in class activities and to self-manage can also help teachers guide learner interaction and foster deep knowledge construction.

As evidenced by comments posted in the Electrodynamics course, we observed less interaction between students. Teachers also rarely urged students to contribute to conversations. These trends may have arisen because teachers and students were in different spaces. Teachers might have struggled to discern students’ interaction status. Teachers could also have failed to intervene in time, to design online learning activities that piqued learners’ interest, and to employ sound interactive theme planning and guidance. Teachers are often active in traditional classroom settings. Their roles are comparatively weakened online, such that they possess less control over instruction [ 59 ]. Online instruction also requires a stronger hand in learning: teachers should play a leading role in regulating network members’ interactive communication [ 60 ]. Teachers can guide learners to participate, help learners establish social networks, and heighten students’ interest in learning [ 61 ]. Teachers should attend to core members in online learning while also considering edge members; by doing so, all network members can be driven to share their knowledge and become more engaged. Finally, teachers and assistant teachers should help learners develop knowledge, exchange topic-related ideas, pose relevant questions during course discussions, and craft activities that enable learners to interact online [ 62 ]. These tactics can improve the effectiveness of online learning.

As described, network members displayed distinct interaction behavior in Economics and Electrodynamics courses. First, these courses varied in their difficulty: the social science course seemed easier to understand and focused on divergent thinking. Learners were often willing to express their views in comments and to ponder others’ perspectives [ 63 ]. The natural science course seemed more demanding and was oriented around logical thinking and skills [ 64 ]. Second, courses’ content differed. In general, social science courses favor the acquisition of declarative knowledge and creative knowledge compared with natural science courses. Social science courses also entertain open questions [ 65 ]. Natural science courses revolve around principle knowledge, strategic knowledge, and transfer knowledge [ 66 ]. Problems in these courses are normally more complicated than those in social science courses. Third, the indicators affecting students’ attitudes toward learning were unique. Guo et al. discovered that “teacher feedback” most strongly influenced students’ attitudes towards learning social science courses but had less impact on students in natural science courses [ 67 ]. Therefore, learners in social science courses likely expect more feedback from teachers and greater interaction with others.

6. Conclusion and future work

Our findings show that the network interaction scale of some online courses expanded during the COVID-19 pandemic. The network scale of most courses did not change significantly, demonstrating that the pandemic did not notably alter the scale of course interaction. Among courses whose interaction scale increased, online learning interaction among network members also became more frequent during the pandemic. Once the outbreak was under control, although the scale of interaction declined, the level and scope of some courses’ interactive networks continued to rise; interaction was thus particularly effective in these cases. Overall, the pandemic appeared to have a relatively positive impact on online learning interaction. We considered a pair of courses in detail and found that Economics (a social science course) fared much better than Electrodynamics (a natural science course) in classroom interaction; learners were more willing to partake in class activities, perhaps due to these courses’ unique characteristics. Brint et al. also came to similar conclusions [ 57 ].

This study was intended to be rigorous. Even so, several constraints can be addressed in future work. The first limitation involves our sample: we focused on a select set of courses hosted on China’s icourse.163 MOOC platform. Future studies should involve an expansive collection of courses to provide a more holistic understanding of how the pandemic has influenced online interaction. Second, we only explored the interactive relationship between learners and did not analyze interactive content. More in-depth content analysis should be carried out in subsequent research. All in all, the emergence of COVID-19 has provided a new path for online learning and has reshaped the distance learning landscape. To cope with associated challenges, educational practitioners will need to continue innovating in online instructional design, strengthen related pedagogy, optimize online learning conditions, and bolster teachers’ and students’ competence in online learning.


ORIGINAL RESEARCH article

Insights into students' experiences and perceptions of remote learning methods: from the COVID-19 pandemic to best practice for the future.

Trang Nguyen

  • 1 Minerva Schools at Keck Graduate Institute, San Francisco, CA, United States
  • 2 Ronin Institute for Independent Scholarship, Montclair, NJ, United States
  • 3 Department of Physics, University of Toronto, Toronto, ON, Canada

This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students’ experiences with and perspectives on those methods of remote instruction in order to inform pedagogical decisions during the current pandemic and in future development of online courses and virtual learning experiences. Our survey gathered quantitative and qualitative data regarding students’ experiences with synchronous and asynchronous methods of remote learning and specific pedagogical techniques associated with each. A total of 4,789 undergraduate participants representing institutions across 95 countries were recruited via Instagram. We find that most students prefer synchronous online classes, and students whose primary mode of remote instruction has been synchronous report being more engaged and motivated. Our qualitative data show that students miss the social aspects of learning on campus, and it is possible that synchronous learning helps to mitigate some feelings of isolation. Students whose synchronous classes include active-learning techniques (which are inherently more social) report significantly higher levels of engagement, motivation, enjoyment, and satisfaction with instruction. Respondents’ recommendations for changes emphasize increased engagement, interaction, and student participation. We conclude that active-learning methods, which are known to increase motivation, engagement, and learning in traditional classrooms, also have a positive impact in the remote-learning environment. Integrating these elements into online courses will improve the student experience.

Introduction

The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning elected the online format, starting with individual online courses in the mid-1990s through today's robust online degree and certificate programs. These students prioritize convenience, flexibility, and the ability to work while studying, and are older than traditional college-age students ( Harris and Martin, 2012 ; Levitz, 2016 ). These students also find asynchronous elements of a course more useful than synchronous elements ( Gillingham and Molinari, 2012 ). In contrast, students who choose to take courses in person prioritize face-to-face instruction and connection with others and skew considerably younger ( Harris and Martin, 2012 ). This leaves open the question of whether students who prefer to learn in person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following a switch to remote learning during the COVID-19 pandemic indicates that students prefer synchronous over asynchronous course elements and find them more effective ( Gillis and Krull, 2020 ). Now that millions of traditional in-person courses have transitioned online, our survey expands the data on student preferences and explores whether those preferences align with pedagogical best practices.

An extensive body of research has explored what instructional methods improve student learning outcomes ( Fink, 2013 ). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in-person and online ( Freeman et al., 2014 ; Chen et al., 2018 ; Davis et al., 2018 ). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor ( Freeman et al., 2014 ). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning ( Bostock, 1998 ; Zull, 2002 ). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used ( Deslauriers et al., 2019 ). We examine student perceptions of remote learning experiences in light of these previous findings.

In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English speaking undergraduate students representing institutions across 95 countries. We aim to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.

Materials and Methods

Participant pool.

Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, an ethical lifestyle, and a positive mindset. For this reason, the audience is presumably academically inclined and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data, as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom, as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States, and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1 ).

Survey Design

The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.

This survey was initially created for a student assignment for the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used, and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey to avoid biasing the results and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions ( Fink, 2013 ; Chen et al., 2018 ; Davis et al., 2018 ). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.

The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data were collected from 5/21/20 to 5/23/20.

Qualitative Coding

We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice ( Garrison et al., 1999 ; Zull, 2002 ; Fink, 2013 ; Freeman et al., 2014 ). The initial codebook was revised collaboratively based on feedback from researchers after coding 20–80 qualitative comments each. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes. Discrepancies in terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook ( Supplementary Table 1 ). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of 4,776 responses to question 12 were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.

Statistical Analysis

The survey included two sets of Likert-scale questions, one consisting of a set of six statements about students’ perceptions of their experiences following the transition to remote learning ( Table 1 ). For each statement, students indicated their level of agreement with the statement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked the students to respond to the same set of statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses ( Mansinghka et al., 2016 ).


Table 1. Likert-scale questions.

Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U -test, and p -values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f , which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale data.
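The common-language effect size described above has a direct computational interpretation: count, over all cross-group pairs of responses, how often a response from subgroup 1 exceeds one from subgroup 2, with ties counted as half. A minimal sketch in plain Python (the sample Likert responses below are invented for illustration, not drawn from the survey data):

```python
from itertools import product

def common_language_effect_size(group1, group2):
    """Probability that a randomly drawn response from group1 exceeds a
    randomly drawn response from group2, counting ties as half wins.
    This is the common-language effect size f used in the text."""
    wins = sum(
        1.0 if a > b else 0.5 if a == b else 0.0
        for a, b in product(group1, group2)
    )
    return wins / (len(group1) * len(group2))

# Hypothetical Likert responses (1 = Strongly Disagree ... 5 = Strongly Agree)
synchronous = [5, 4, 4, 3, 5]
asynchronous = [3, 2, 4, 3, 2]
f = common_language_effect_size(synchronous, asynchronous)  # → 0.88
```

Note that f is symmetric in the sense that swapping the two groups yields 1 − f, so an f of 0.5 indicates no difference between subgroups.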

Students Prefer Synchronous Class Sessions

Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and across literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% (n = 2,045) of students identified live classes as their primary mode of learning, 54.6% (n = 2,613) of students preferred this mode ( Figure 1 ). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).


Figure 1. Actual (A) and preferred (B) primary modes of learning.

In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction and report higher levels of participation ( Table 2 and Supplementary Figure 2 ). Regardless of primary mode, over two-thirds of students reported they are often distracted during remote courses.


Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.

Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience

To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identify three different categories ( National Research Council, 2000 ; Freeman et al., 2014 ). Passive methods (P) include lectures, presentations, and explanation using diagrams, whiteboards, and/or other media. These methods all rely on instructor delivery rather than student participation. Our next category represents active learning through primarily one-on-one interactions (A). The methods in this group are in-class assessment, question-and-answer (Q&A), and classroom chat. Group interactions (F) included classroom discussions and small-group activities. Given these categories, Mann-Whitney U pairwise comparisons between the 7 possible combinations and Likert-scale responses about student experience showed that the use of a variety of methods resulted in higher ratings of experience than the use of a single method, regardless of whether that single method was active or passive ( Table 3 ). Indeed, students whose classes used methods from each category (PAF) had higher ratings of enjoyment, motivation, and satisfaction with instruction than those who chose any single method ( p < 0.0001) and also reported higher levels of participation and engagement compared to students whose only method was passive (P) or active through one-on-one interactions (A) ( p < 0.00001). Student ratings of distraction were not significantly different for any comparison. Given that sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to look at the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response ( Supplementary Figure 3 ).
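The grouping step described above, collapsing each student's reported methods into one of the 7 possible category combinations, can be sketched in a few lines of Python. The method names are paraphrased from the text and the helper name is our own; this is an illustration of the categorization, not the authors' analysis code:

```python
# Categories as described in the text: passive (P), one-on-one active (A),
# and group-interaction active (F). Method labels are paraphrased.
CATEGORY = {
    "lecture": "P", "presentation": "P", "diagram explanation": "P",
    "in-class assessment": "A", "Q&A": "A", "classroom chat": "A",
    "class discussion": "F", "small-group activity": "F",
}

def category_combo(methods):
    """Collapse a student's reported synchronous methods into a combination
    label such as 'P', 'PA', or 'PAF' (kept in P-A-F order for grouping)."""
    cats = {CATEGORY[m] for m in methods}
    return "".join(c for c in "PAF" if c in cats)

combo = category_combo(["lecture", "Q&A", "small-group activity"])  # → "PAF"
```

Responses grouped by these labels can then be compared pairwise (e.g., PAF vs. P) on each Likert measure.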


Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).

Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used vs. the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction ( Table 4 ). Even the comparison between four or fewer methods and more than four methods resulted in a 59% chance that the latter group enjoyed the courses more ( p < 0.00001) and a 60% chance that they felt more motivated to learn ( p < 0.00001). Students who selected more than four methods ( n = 417) also showed probabilities of 65.1% ( p < 0.00001), 62.9% ( p < 0.00001), and 64.3% ( p < 0.00001) of being more satisfied with instruction, more engaged, and more actively participating, respectively. Therefore, there was an overlap between how the number and variety of methods influenced students' experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons of subsets of data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and resulted in no significant findings (data not shown). Overall, from the data in our survey, there seems to be an interdependence between the number and variety of methods in their effect on students' learning experiences.


Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).

Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience

Along with synchronous pedagogical methods, students reported the asynchronous methods that were used for their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students' perceptions ( Table 5 ). For example, compared to providing learning materials only, providing learning materials, interaction, and testing improved enjoyment ( f = 0.546, p < 0.001), motivation ( f = 0.553, p < 0.0001), satisfaction with instruction ( f = 0.596, p < 0.00001), engagement ( f = 0.572, p < 0.00001), and active participation ( f = 0.563, p < 0.00001) (row 6). Similarly, compared to interacting methods alone, the combination of all three categories improved five of the six indicators, with distraction in class again the exception (row 11).


Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).

Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions ( Supplementary Table 2 ). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
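Ordinal logistic regression, as used above, treats each Likert response as falling between fitted cutpoints on a latent scale: a linear predictor shifts probability mass toward higher or lower categories. A minimal proportional-odds sketch in plain Python; the cutpoints and predictor values are invented for illustration and are not the fitted values from this study:

```python
import math

def likert_probs(eta, cutpoints):
    """Category probabilities under a proportional-odds (ordinal logistic)
    model: P(Y <= k) = sigmoid(c_k - eta), with category probabilities
    obtained as successive differences of the cumulative probabilities."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    cum = [sigmoid(c - eta) for c in cutpoints] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Hypothetical cutpoints for a 5-point Likert item; a larger eta (e.g., a
# class using more method types) shifts mass toward "Strongly Agree".
cutpoints = [-2.0, -0.5, 0.5, 2.0]
p_few = likert_probs(eta=0.0, cutpoints=cutpoints)
p_many = likert_probs(eta=1.0, cutpoints=cutpoints)
```

Here `p_many` assigns a higher probability to the top category than `p_few`, which is the pattern an ordinal regression reports when a predictor (such as the set of methods used) is associated with more positive responses.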

Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning

As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).

In addition to identifying social reasons for their preference for in-person learning, students' suggestions for improvements in online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, and 126 calls for increased participation, along with related suggestions such as, "Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don't say anything and don't participate in group work," and "Make it less of the professor reading the pdf that was given to us and more interaction."

Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.

The preference for synchronous vs. asynchronous modes of learning follows similar trends ( Table 6 ). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.


Table 6. Most prevalent themes for students based on their preferred mode of remote learning.

Student Perceptions Align With Research on Active Learning

The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning ( Freeman et al., 2014 ).

Though research shows that student learning improves in active-learning classes on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses ( Deslauriers et al., 2019 ). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside of the lecture itself. These might include the lecture hall environment, the physical social presence of peers, or normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online than in person.

A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.

This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.

Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on attention span, which suggests varying delivery after 10–15 min to retain students' attention ( Bradbury, 2016 ). This is likely even more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and there are few passive learning methods, we can assume that some combination of methods that includes active learning improves student experience. However, it is not clear whether we should predict that this benefit would come simply from increasing the number of different methods used, or if there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.

Students Value Social Presence in Remote Learning

Student responses across our open-ended survey questions show a striking difference in reasons for their preferences compared with traditional online learners, who prefer flexibility ( Harris and Martin, 2012 ; Levitz, 2016 ). Students' reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.

Short et al. (1976) outlined Social Presence Theory to describe how strongly students perceive each other as real people across different telecommunication media. These ideas translate directly to questions surrounding online education and pedagogy, particularly educational design in networked learning, where connection across learners and instructors improves learning outcomes, especially through "Human-Human interaction" ( Goodyear, 2002 , 2005 ; Tu, 2002 ). They also bear on asynchronous vs. synchronous learning: Tu reports that students respond positively to synchronous "real-time discussion in pleasantness, responsiveness and comfort with familiar topics," with real-time discussion edging out asynchronous computer-mediated communication in immediacy of replies and responsiveness. Tu's research indicates that students perceive more interaction in synchronous media such as discussions because of this immediacy, which enhances social presence and supports the use of active-learning techniques ( Gunawardena, 1995 ; Tu, 2002 ). Thus, verbal immediacy and communities with face-to-face interactions, such as those in synchronous learning classrooms, lessen the psychological distance of communicators online and can simultaneously improve instructional satisfaction and reported learning ( Gunawardena and Zittle, 1997 ; Richardson and Swan, 2019 ; Shea et al., 2019 ). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.

Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors ( Gay, 2000 ). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.

Limitations of These Data

Our undergraduate data consisted of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, who focuses on learning and wellness, these respondents may not represent the average student. Survey samples are often biased by their recruitment techniques; ours likely produced more robust and thoughtful responses to free-response questions and may have influenced the preference for synchronous classes. It is unlikely to have changed students' reporting of remote learning pedagogical methods, since those are out of student control.

Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.

This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.

Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.

Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning

Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the often-found negative reactions of students to these practices in traditional classrooms ( Shekhar et al., 2020 ). Our analysis of open-ended questions and preferences show that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may be the default for colleges during temporary emergencies. This has already been used at the K-12 level as snow days become virtual learning days ( Aspergren, 2020 ).

In addition to informing pedagogical decisions, the results of this survey can be used to inform future research. Although we surveyed a global population, our recruitment method selected for students who are English speakers, likely majority female, and interested in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine what the optimal combinations and implementations are for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.

Importantly, this paper focuses on a subset of responses from the full data set, which includes 10,563 students from secondary school, undergraduate, graduate, or professional school and additional questions about in-person learning. Our full data set is available for anyone to download for continued exploration: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/2TGOPH
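For readers who want to pull the full data set programmatically rather than through the web page, Harvard Dataverse exposes a data-access API keyed by a dataset's persistent DOI. The sketch below is a minimal example (not part of the original article); it assumes the standard Dataverse bundled-download endpoint (`/api/access/dataset/:persistentId`) and uses only Python's standard library.

```python
from urllib.request import urlopen

DATAVERSE = "https://dataverse.harvard.edu"
DOI = "doi:10.7910/DVN/2TGOPH"  # persistent ID cited in the article


def dataset_download_url(base: str, persistent_id: str) -> str:
    """Build the Dataverse data-access URL that returns all dataset files as a zip."""
    return f"{base}/api/access/dataset/:persistentId?persistentId={persistent_id}"


def download_dataset(dest: str = "remote_learning_survey.zip") -> str:
    """Fetch the zipped data set (network access required) and save it locally."""
    with urlopen(dataset_download_url(DATAVERSE, DOI), timeout=60) as resp:
        with open(dest, "wb") as f:
            f.write(resp.read())
    return dest
```

Calling `download_dataset()` saves the archive to the working directory; the zip can then be opened with any spreadsheet or statistics tool for continued exploration.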

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.647986/full#supplementary-material

Aspergren, E. (2020). Snow Days Canceled Because of COVID-19 Online School? Not in These School Districts. USA Today, sec. Education. Available online at: https://www.usatoday.com/story/news/education/2020/12/15/covid-school-canceled-snow-day-online-learning/3905780001/ (accessed December 15, 2020).

Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066

Bradbury, N. A. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 40, 509–513. doi: 10.1152/advan.00109.2016

Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369

Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117

Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6

Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.

Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4

Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263

Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work , eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4

Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.

Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970

Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211

Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.

Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press. doi: 10.17226/9853

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A Follow-up investigation of ‘teaching presence’ in the suny learning network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.

Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.

Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421

Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning , 1st Edn. Sterling, VA: Stylus Publishing.

Keywords : online learning, COVID-19, active learning, higher education, pedagogy, survey, international

Citation: Nguyen T, Netto CLM, Wilkins JF, Bröker P, Vargas EE, Sealfon CD, Puthipiroj P, Li KS, Bowler JE, Hinson HR, Pujar M and Stein GM (2021) Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future. Front. Educ. 6:647986. doi: 10.3389/feduc.2021.647986

Received: 30 December 2020; Accepted: 09 March 2021; Published: 09 April 2021.

Copyright © 2021 Nguyen, Netto, Wilkins, Bröker, Vargas, Sealfon, Puthipiroj, Li, Bowler, Hinson, Pujar and Stein. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Geneva M. Stein, [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

How Effective Is Online Learning? What the Research Does and Doesn’t Tell Us


Editor’s Note: This is part of a series on the practical takeaways from research.

The times have dictated school closings and the rapid expansion of online education. Can online lessons replace in-school time?

Clearly online time cannot provide many of the informal social interactions students have at school, but how will online courses do in terms of moving student learning forward? Research to date gives us some clues and also points us to what we could be doing to support students who are most likely to struggle in the online setting.

The use of virtual courses among K-12 students has grown rapidly in recent years. Florida, for example, requires all high school students to take at least one online course. Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or take exams based on those lectures.

Most online courses, however, particularly those serving K-12 students, have a format much more similar to in-person courses. The teacher helps to run virtual discussion among the students, assigns homework, and follows up with individual students. Sometimes these courses are synchronous (teachers and students all meet at the same time) and sometimes they are asynchronous (non-concurrent). In both cases, the teacher is supposed to provide opportunities for students to engage thoughtfully with subject matter, and students, in most cases, are required to interact with each other virtually.

Online courses provide opportunities for students. Students in a school that doesn’t offer statistics classes may be able to learn statistics with virtual lessons. If students fail algebra, they may be able to catch up during evenings or summer using online classes, and not disrupt their math trajectory at school. So, almost certainly, online classes sometimes benefit students.

In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the “gold standard” method of comparing the results for students assigned randomly to online or in-person courses. Jessica Heppen and colleagues at the American Institutes for Research and the University of Chicago Consortium on School Research randomly assigned students who had failed second semester Algebra I to either face-to-face or online credit recovery courses over the summer. Students’ credit-recovery success rates and algebra test scores were lower in the online setting. Students assigned to the online option also rated their class as more difficult than did their peers assigned to the face-to-face option.

Most of the research on online courses for K-12 students has used large-scale administrative data, looking at otherwise similar students in the two settings. One of these studies, by June Ahn of New York University and Andrew McEachin of the RAND Corp., examined Ohio charter schools; I did another with colleagues looking at Florida public school coursework. Both studies found evidence that online coursetaking was less effective.

About this series

This essay is the fifth in a series that aims to put the pieces of research together so that education decisionmakers can evaluate which policies and practices to implement.

The conveners of this project—Susanna Loeb, the director of Brown University’s Annenberg Institute for School Reform, and Harvard education professor Heather Hill—have received grant support from the Annenberg Institute for this series.

To suggest other topics for this series or join in the conversation, use #EdResearchtoPractice on Twitter.

It is not surprising that in-person courses are, on average, more effective. Being in person with teachers and other students creates social pressures and benefits that can help motivate students to engage. Some students do as well in online courses as in in-person courses, some may actually do better, but, on average, students do worse in the online setting, and this is particularly true for students with weaker academic backgrounds.

Students who struggle in in-person classes are likely to struggle even more online. While the research on virtual schools in K-12 education doesn’t address these differences directly, a study of college students that I worked on with Stanford colleagues found very little difference in learning for high-performing students in the online and in-person settings. On the other hand, lower performing students performed meaningfully worse in online courses than in in-person courses.

But just because students who struggle in in-person classes are even more likely to struggle online doesn’t mean that’s inevitable. Online teachers will need to consider the needs of less-engaged students and work to engage them. Online courses might be made to work for these students on average, even if they have not in the past.

Just like in brick-and-mortar classrooms, online courses need a strong curriculum and strong pedagogical practices. Teachers need to understand what students know and what they don’t know, as well as how to help them learn new material. What is different in the online setting is that students may have more distractions and less oversight, which can reduce their motivation. The teacher will need to set norms for engagement—such as requiring students to regularly ask questions and respond to their peers—that are different than the norms in the in-person setting.

Online courses are generally not as effective as in-person classes, but they are certainly better than no classes. A substantial research base developed by Karl Alexander at Johns Hopkins University and many others shows that students, especially students with fewer resources at home, learn less when they are not in school. Right now, virtual courses are allowing students to access lessons and exercises and interact with teachers in ways that would have been impossible if an epidemic had closed schools even a decade or two earlier. So we may be skeptical of online learning, but it is also time to embrace and improve it.

A version of this article appeared in the April 01, 2020 edition of Education Week as How Effective Is Online Learning?

Managing attention and distractibility in online learning

Research-backed answers to some of the most commonly asked questions regarding attention and distractibility in the virtual classroom.

  • Learning and Memory
  • Perception and Attention
  • Schools and Classrooms
  • Technology and Design

This year, as COVID-19 disrupted traditional K–12 education, even the most experienced teachers felt suddenly thrown back into their first day, or first years, of teaching. Appearing in their virtual classrooms, many teachers found themselves looking at an array of squares on a screen, some with students looking back, some with a bare desktop and chair, some missing entirely. For many, this new environment felt foreign as their go-to strategies in the classroom setting did not seem to translate readily online. As a result, teachers were left with many questions and few clear answers.

Although the existing literature specific to virtual learning environments is limited, there is a robust research base on attention, engagement, distractibility, and learning in general, much of which can be adapted and applied in virtual settings. Below, we offer research-backed answers to some of the most commonly asked questions regarding attention and distractibility in the virtual classroom.

What do attention and engagement look like in an online environment?

In face-to-face settings, teachers typically rely on perceiving and responding to overt student behaviors as evidence of their attention. In an online setting, teachers may be able to see only a student’s head and shoulders at most, which limits the information available. In these circumstances, teachers must turn to other sources of input. In their 2011 book, “Creating the Opportunity to Learn,” Boykin and Noguera offer the following description for behavioral, cognitive, and affective engagement:

Behavioral engagement is "on task behavior." In a virtual environment, on-task behavior may include students commenting in the chat function, asking and answering questions, seeking and providing help to peers, and participating in collaborative discussions.

Cognitive engagement refers to effort aimed at understanding complex material or learning challenging skills. In a virtual environment, cognitive engagement may include students showing that they are willing and able to take on a task even if it is challenging (Corno & Mandinach, 1983), the extent to which they persist on a task regardless of its difficulty, and the strategies they employ to assist them while learning (Richardson & Newby, 2006).

Affective engagement refers to students' emotional reactions, including showing interest in, curiosity about, or enjoyment of a task; communicating a positive attitude; and expressing the value, importance, or personal relevance of a task (Boykin & Noguera, 2011). When students are not affectively engaged, they are likely to show boredom, stress, or anxiety.

How do I know my students are paying attention and engaged while I’m teaching online or with online work?

How teachers know if their students are paying attention and engaged is an issue of assessment. The classroom assessment process begins with asking yourself, “What do I want to know about my students’ engagement?” To ensure representativeness, teachers can include questions on each of the types of engagement discussed previously. For example, one might ask, “Are my students persisting even when they encounter difficult work?” Or, “Do my students appear to be interested during class-wide discussions?”

After teachers establish what they want to know, the next step is to determine what might count as evidence to answer that particular question. For example, teachers may look for evidence of student persistence by observing what students do when they encounter hurdles or stumbling blocks. If students continue steadily working and adjust and adapt their plans as needed, it might serve as evidence of persistence.

Knowing what evidence to collect, however, is only half the battle. As teachers, it is also important to have a host of strategies and techniques to collect such evidence. Classroom assessment does little to affect student learning unless teachers use the information from assessment events to inform their next teaching steps or to craft feedback that moves learning forward. That is why it is imperative that teachers draw on their knowledge of the curriculum and typical learning trajectories to inform teaching and learning.

How can I structure my online teaching to best engage my students, and what strategies can I use to reengage students who are distracted?

Many of the strategies that teachers use to increase student engagement in face-to-face classrooms can also be adapted to structure online teaching. For example, it is important to recognize the types of learning for which synchronous (active online) and asynchronous (offline) modalities are advantageous and to use each modality strategically.

The synchronous format is useful for introducing new topics, discussing complex ideas and challenging work, and promoting collaborative learning and student-teacher interactions. One disadvantage of the synchronous format is that students might find it difficult to remain engaged for long durations, and teachers should expect the duration of engagement to vary with age: ninth graders will be able to stay engaged longer than fifth graders, fifth graders longer than third graders, and so on.

Asynchronous learning could be used to reinforce what was taught and discussed during synchronous sessions and for tasks and activities that can be self-paced and that might require more time to complete, such as long-term projects. Because students work independently during asynchronous learning, it is important to break up activities into smaller chunks as well as to vary the types of activities, such as answering questions after watching a brief video or writing a short essay after reading assigned pages of a book. Asynchronous learning also has the advantage of promoting student self-regulation and sense of control over the learning process, factors known to increase student engagement (Fredricks et al., 2004).

Finally, students are more likely to be engaged if they feel respected and valued by their teachers and peers, and if they feel that they belong to the classroom and school community. Teachers can reinforce student engagement with praise or by allowing students to do a fun activity. In addition, establishing specific times during the week when students can collaborate on a creative activity, watch a short and lighthearted video together, or just talk could go a long way to creating positive bonds and an engaged community in a virtual environment.


Computers & Education

The qualitative evidence behind the factors impacting online learning experiences as informed by the community of inquiry framework: a thematic synthesis

  • Three main categories: course design, instructors', and peers' actions.
  • Ten descriptive themes ranging from immediacy to ease of navigation.
  • Three analytical themes: accountability, being real, and supporting learning.
  • Responsibility for success is shared among designers, instructors and learners.
  • More transparent reporting is needed in future qualitative studies.


Along with social and teaching presence, cognitive presence constitutes a meaningful educational experience (Garrison et al., 2000). Since its introduction, the CoI framework has been one of the most researched frameworks for online learning (Caskurlu et al., 2021; Cleveland-Innes, 2019; Park & Shea, 2020). The importance of the framework is also evident in a recent systematic review of 619 research articles on online teaching and learning (Martin et al., 2020).

University of Minnesota, Twin Cities

Top 6 Questions People Ask About Online Learning

Since the invention of the internet, we have witnessed a huge change in the accessibility and flexibility of higher education. Not only can students earn their degrees at a distance and on their own schedule but they can also complete certifications and trade programs with more ease than ever before.

If you’re considering online classes as a means to achieving your goals, you likely have questions. Here are some of the most common ones, with answers!

What Is Online Learning?

So, just what is online learning? This term refers to education that takes place in a completely virtual environment using an internet connection and a computer or device to connect to the school. In the online "classroom," you can do all the same things that in-person students do, such as:

  • Listening to lectures
  • Answering questions from a professor
  • Completing readings
  • Turning in assignments
  • Taking quizzes and tests
  • Meeting as a group

Some schools, programs, or courses combine online learning with in-person learning experiences. This model is known as "hybrid education," wherein students participate online most of the time. However, when learning objectives call for hands-on experience (say, practicing skills for a health profession or laboratory experiments), they can head to campus.

That said, many programs allow their students to complete the entire curriculum virtually. Degrees such as a Bachelor of Science in Software Engineering, for example, may not call for in-person learning at all. You can always contact admissions or the specific department if you want to learn more about delivery format.

Why Online Learning Is Good for Students

Despite the widespread accessibility of remote education, some students remain skeptical about online classes. Are you really learning if there’s not a professor present at the front of a lecture hall? Can you really learn the skills you need without the in-person interaction between students and faculty?

Ease and Accessibility

While some people feel online education lacks the intimacy and immediacy of a "real" classroom, it offers an educational channel to students who might otherwise not have the time or resources to attend. Online access has made it possible for students to enroll and participate in online classes with greater ease, from nearly anywhere, in a way that fits their schedules.

Affordability

Online courses are usually more affordable as well. According to the Education Data Initiative, an online degree is $36,595 cheaper than an in-person degree when the costs of tuition and attendance are compared. The average cost of attending a private university is $129,800 for an in-person degree but only $60,593 for an online degree.

It’s also estimated that students who commute to college for in-person classes pay $1,360 per year in transportation costs that an online student wouldn’t have to pay. Add in factors such as cheaper meals at home and more time to work, and it’s not hard to see why many students opt for online learning.
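To put those figures together, here is a quick arithmetic sketch. The dollar amounts are the ones quoted above; the four-year commuting assumption is ours:

```python
# Cost comparison using the figures quoted above (Education Data Initiative).
in_person_private = 129_800   # average total cost of an in-person private degree
online_private = 60_593       # average total cost of an online degree
commute_per_year = 1_360      # estimated yearly transportation cost for commuters

degree_savings = in_person_private - online_private
commute_savings = commute_per_year * 4  # assuming a four-year program

print(f"Degree savings: ${degree_savings:,}")    # Degree savings: $69,207
print(f"Commute savings: ${commute_savings:,}")  # Commute savings: $5,440
```

Note that the $69,207 private-university gap is larger than the $36,595 overall average, since the overall figure averages across public and private institutions.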

Top Questions About Online Learning

Despite the benefits, you likely still have some questions about online learning. Let’s take a look at six of the most common.

1. Are You Able to Earn Your Degree Completely Online? Yes, many (but not all) schools do offer this as an option. We’re not just talking about certificates or minors, either.

For instance, you can earn a Master of Science in Electrical and Computer Engineering from U of M Online. If you complete the entire program virtually, you will pay in-state tuition costs from anywhere in the United States – a major bonus. A good school should offer you a searchable course catalog to compare options and view which have a required on-campus component.

2. How Long Does It Take to Earn a Degree Online? Most online programs mirror their in-person counterparts in terms of how long it takes to earn the degree. From certificates and minors to bachelor’s or master’s degrees, you’re looking at roughly the same timeline for equivalent programs. Some programs also offer part-time options for students who need that flexibility to accommodate work and family responsibilities.

Some schools or programs may limit how quickly you can move through the material. However, given the freedom and flexibility of online learning, it’s possible you can complete more coursework in less time than you could on campus. Talk to your admissions officer or program coordinator about specifics.

When first researching your options, you can again turn to the searchable course catalog. On each degree page, you should find the recommended timeline clearly listed.

3. Is an Online Degree Viewed Differently Than a Traditional Degree? Among the most common and pressing questions for online learning is whether future employers view online degrees with skepticism. The answer is an emphatic "no." Most online programs appear on your transcript the same as on-campus programs would.

You may also wonder if an online program will impact your plans for a higher degree later. As long as your degree is from an accredited institution, it won’t harm your chances of acceptance.

4. What Are Some Benefits of Online Learning? When you choose to learn online, you can:

  • Study more, due to the lack of commuting to, from, and around campus
  • Potentially take more classes, again because of the time savings
  • Get more immediate feedback from professors on assignments
  • Leverage the online resources that come with your course portal
  • Spend less money on your degree overall
  • Continue working or caring for family while going to school

5. Do Instructors Offer Help and Support to Students? Instructors are generally expected to give the same amount of time and energy to their online classes as they do to in-person groups. In fact, many professors are enthusiastic about virtual learning because it gives them more flexibility and means they don’t have to commute either.

6. Can Students Have Success and Excel in Online Learning? Lastly, can you learn new skills, attain knowledge, and become successful in online learning? Unequivocally, the answer is yes! Online degree programs still afford you tutoring and career resources as well as full access to academic resources such as the library.

Plus, you will have the ability to transfer credits either to or from the degree program, just as you would with an on-campus one. In other words, you will find yourself and your goals in no way hampered by taking the online approach.

Online Learning

In summary, online learning offers you a ton of freedom and savings. It allows you to complete your work anywhere, from the office to the living room to on the road. And you can rest assured that you’ll get the same level of professorial support as you would from an on-campus program, as well as a degree that’s worth just as much.

Learn More Today

Ready to learn more? Reach out to U of M Online to ask questions or get information about specific programs today!

  • Cost of Online Education vs. Traditional Education
  • The top 5 questions people ask about online learning
  • https://online.umn.edu/programs-search
  • https://online.umn.edu/tuition-fees-and-financial-aid
  • https://online.umn.edu/story/academic-tutoring-and-career-resources
  • https://online.umn.edu/story/u-m-libraries
  • https://online.umn.edu/transfer-credit
  • https://online.umn.edu/



Distance learning survey for students: Tips & examples


The COVID-19 pandemic changed learning in unprecedented ways. Students not only had to move to online learning but also keep a social distance from friends and family. Many found it challenging to adjust to the ‘new normal’ and missed the in-person interaction with their teachers; for some, it simply meant spending more time with their parents. A student interest survey helps customize teaching methods and curriculum to make learning more engaging and relevant to students’ lives.

Schools need to know how students feel about distance education and learn more about their experiences. To collect data, they can send out a survey on remote learning for students. Once they have the results, the management team can know what students like in the existing setup and what they would like to change.

Classroom response systems let students instantly answer multiple-choice questions and engage in real-time discussions.

Here are examples of distance learning survey questions for students that you can ask to collect their feedback.


Examples of distance learning survey questions for students

1. How do you feel overall about distance education?

  • Below Average

This question collects responses about students’ overall experience of online education. Schools can use this data to decide whether they should continue teaching online or move back to in-person learning.

2. Do you have access to a device for learning online?

  • Yes, but it doesn’t work well
  • No, I share with others

Students should have uninterrupted access to a device for learning online. Find out whether they face any challenges with the device’s hardware, or whether they share the device with others in the house and can’t access it when they need it.

3. What device do you use for distance learning?

Know whether students use a laptop, desktop, smartphone, or tablet for distance learning. A laptop or desktop would be an ideal choice for its screen size and quality. You can use a multiple-choice question type in your questionnaire for distance education students.

4. How much time do you spend on distance education each day, on average?

Learn how much time students spend taking online courses. Analyze whether they are overspending time and find out why. Students should also allocate some time to play and exercise while staying at home to take care of their health. Answers to this question can show whether they spend time on other activities as well.

5. How effective has remote learning been for you?

  • Not at all effective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

Depending on an individual’s personality, students may like to learn in the classroom with fellow students or alone at home. The classroom offers a more lively and interactive environment, whereas it is relatively calm at home. You can use this question to know if remote learning is working for students or not. 

6. How helpful has your [School or University] been in offering you the resources to learn from home?

  • Not at all helpful
  • Slightly helpful
  • Moderately helpful
  • Very helpful
  • Extremely helpful

School management teams need to offer full support to both teachers and students to make distance education comfortable and effective. They should provide support in terms of technological infrastructure and process framework. Given the pandemic, schools should also allow more flexibility and adopt less strict policies.

7. How stressful is distance learning for you during the COVID-19 pandemic?

Studying during a pandemic can be quite stressful, especially if you or someone in your family is not doing well. Measure students’ stress levels and identify ways to reduce them. For instance, you can organize an online dance party or a Lego game. The responses to this question can be crucial in deciding the future course of distance learning.

8. How well could you manage your time while learning remotely? (5 = extremely well, 1 = not at all)

  • Academic schedule

Staying at home all the time and balancing multiple things can be stressful for many people. It requires students to have good time-management skills and self-discipline. Students can rate their experience on a scale of 1-5 and share it with the school authorities. Use a multiple-choice matrix question type for such questions in your distance learning questionnaire for students.
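Once matrix responses come in, a per-row average is the usual first summary. A minimal sketch; the row labels and ratings below are made up for illustration:

```python
# Illustrative only: average 1-5 matrix ratings per row.
responses = {
    "Academic schedule": [4, 5, 3, 4, 2],
    "Breaks and exercise": [3, 3, 4, 2, 3],
}

# Mean rating per row, rounded to one decimal place.
averages = {row: round(sum(scores) / len(scores), 1)
            for row, scores in responses.items()}

print(averages)  # {'Academic schedule': 3.6, 'Breaks and exercise': 3.0}
```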


9. Do you enjoy learning remotely?

  • Yes, absolutely
  • Yes, but I would like to change a few things
  • No, there are quite a few challenges
  • No, not at all

Get a high-level view of whether students enjoy learning from home or are only doing it because they have to. Gain insights into how you can improve distance education and make it interesting for them.

10. How helpful are your teachers while studying online?

Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful.

You can also use a ready-made survey template to save time. The sample questionnaire for students can be easily customized as per your requirements.


Other important questions of distance learning survey for students

  • How peaceful is the environment at home while learning?
  • Are you satisfied with the technology and software you are using for online learning?
  • How important is face-to-face communication for you while learning remotely?
  • How often do you talk to your [School/University] classmates?
  • How often do you have a 1-1 discussion with your teachers?

How to create a survey?

The intent behind creating a remote learning questionnaire for students should be to learn how schools and teachers can better support them. Use online survey software like ours to create a survey, or start from a template. Distribute the survey through email, a mobile app, a website, or a QR code.

Once you get the survey results, generate reports and share them with your team. You can also download them in formats like .pdf, .doc, and .xls. To analyze data from multiple sources, you can integrate the survey software with third-party apps.
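If your survey tool can export results as a CSV file (one row per respondent, one column per question), tallying answers for a quick report takes only a few lines. A sketch, assuming a hypothetical export file and column name (neither is a real QuestionPro format):

```python
# Minimal sketch: tally answers from a CSV export of survey results.
# "results.csv" and the column name are hypothetical examples.
import csv
from collections import Counter

def tally(path: str, column: str) -> Counter:
    """Count how often each answer appears in the given column."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row[column] for row in csv.DictReader(f))

# Example: counts = tally("results.csv", "enjoy_remote_learning")
```

From the resulting Counter you can compute percentages or feed the counts into any charting library.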

If you need any help with designing a survey, customizing the look and feel, or deriving insights from it, get in touch with us. We’d be happy to help.




FAQs: How Online Courses Work


  • The Benefits of Online Education
  • How Online Education Works
  • The Effectiveness of Online Education
  • Choosing Online Degree Programs
  • Technical Skills and Considerations
  • Paying for Online Degree Programs

Recent reports detail just how quickly colleges adopted online learning. According to the Babson Survey Research Group, university and student participation in online education is at an all-time high. Even some of the largest and most prestigious universities now offer online degrees. Despite its growing popularity, online education is still relatively new, and many students and academics are completely unacquainted with it. Questions and concerns are normal. This page addresses some of the most frequently asked questions about online degree programs. All answers are thoroughly researched; we include links to relevant studies whenever possible.

Question: What are some of the advantages of attending college online?

[Answer] Online education is known for its flexibility, but studies have identified several additional benefits of attending class online. Among them:

  • Communication : Many students are more comfortable engaging in meaningful discussions online than in a classroom. These students might have hearing or speech impairments; speak different languages; have severe social anxiety; or simply need more time to organize their thoughts.
  • Personalized learning : Not all students learn the same way. Web-based learning allows instructors to deliver the same content using different media, like videos or simulations, personalizing learning. Online classes providing round-the-clock access to materials and lectures also let students study when they feel most focused and engaged.
  • Accessibility : Online programs transcend time, geographic, and other barriers to higher education. This can be helpful for those who work full-time, live in remote regions, or serve in the military.
  • Adaptability : Learning management systems that integrate text-to-speech and other adaptive technologies support learners with physical, behavioral, and learning challenges.
  • Efficiency : Studies show online students tend to achieve the same learning results in half the time as classroom-based students.
  • Engagement : Online instructors can use games, social media, virtual badges, and other engaging technologies to motivate students and enhance learning.

Question: How does online education work on a day-to-day basis?

[Answer] Instructional methods, course requirements, and learning technologies can vary significantly from one online program to the next, but the vast majority use a learning management system (LMS) to deliver lectures and materials, monitor student progress, assess comprehension, and accept student work. LMS providers design these platforms to accommodate a multitude of instructor needs and preferences. While some courses deliver live lectures using video conferencing tools, others allow students to download pre-recorded lectures and use message boards to discuss topics. Instructors may also incorporate simulations, games, and other engagement-boosters to enhance learning. Students should research individual programs to find out how and when they would report to class; how lectures and materials are delivered; how and how much they would collaborate with faculty and peers; and other important details. We address many of these instructional methods and LMS capabilities elsewhere in this guide.

Question: Can you really earn online degrees in hands-on fields like nursing and engineering?

[Answer] Yes and no. While schools do offer online and hybrid programs in these disciplines, students must usually meet additional face-to-face training requirements. Schools usually establish these requirements with convenience in mind. For example, students in fields like nursing, teaching, and social work may be required to complete supervised fieldwork or clinical placements, but do so through local schools, hospitals/clinics, and other organizations. For example, students enrolled in the University of Virginia’s Engineers PRODUCED in Virginia program can complete all their engineering classes online in a live format while gaining practical experience through strategic internships with employers across the state. Some online programs do require students to complete on-campus training, seminars and assessments, but visits are often designed to minimize cost and travel. Students should consider these requirements when researching programs.

The Effectiveness and Credibility of Online Education

Question: Is online education as effective as face-to-face instruction?

[Answer] Online education may seem relatively new, but years of research suggest it can be just as effective as traditional coursework, and often more so. According to a U.S. Department of Education analysis of more than 1,000 learning studies, online students tend to outperform classroom-based students across most disciplines and demographics. Another major review published the same year found that online students had the advantage 70 percent of the time, a gap the authors projected would only widen as programs and technologies evolve.

While these reports list several plausible reasons students might learn more effectively online—that they have more control over their studies, or more opportunities for reflection—medium is only one of many factors that influence outcomes. Successful online students tend to be organized self-starters who can complete their work without reporting to a traditional classroom. Learning styles and preferences matter, too. Prospective students should research programs carefully to identify which ones offer the best chance of success.

Question: Do employers accept online degrees?

[Answer] All new learning innovations are met with some degree of scrutiny, but skepticism subsides as methods become more mainstream. Such is the case for online learning. Studies indicate employers who are familiar with online degrees tend to view them more favorably, and more employers are acquainted with them than ever before. The majority of colleges now offer online degrees, including most public, not-for-profit, and Ivy League universities. Online learning is also increasingly prevalent in the workplace as more companies invest in web-based employee training and development programs.

Question: Is online education more conducive to cheating?

[Answer] The concern that online students cheat more than traditional students is perhaps misplaced. When researchers at Marshall University conducted a study to measure the prevalence of cheating in online and classroom-based courses, they concluded, “somewhat surprisingly, the results showed higher rates of academic dishonesty in live courses.” The authors suggest the social familiarity of students in a classroom setting may lessen their sense of moral obligation.

Another reason cheating is less common in online programs is that colleges have adopted strict anti-cheating protocols and technologies. According to a report published by the Online Learning Consortium, some online courses require students to report to proctored testing facilities to complete exams, though virtual proctoring using shared screens and webcams is increasingly popular. Sophisticated identity verification tools like biometric analysis and facial recognition software are another way these schools combat cheating. Instructors often implement their own anti-cheating measures, too, like running research papers through plagiarism-detection programs or incorporating challenge-based questions in quizzes and exams. When combined, these measures can reduce academic dishonesty significantly.

In an interview with OnlineEducation.com, Dr. Susan Aldridge, president of Drexel University Online, discussed the overall approach many universities take to curbing cheating–an approach that includes both technical and policy-based prevention strategies.

“Like most online higher education providers, Drexel University employs a three-pronged approach to maintaining academic integrity among its virtual students,” said Dr. Aldridge. “We create solid barriers to cheating, while also making every effort to identify and sanction it as it occurs or directly after the fact. At the same time, we foster a principled community of inquiry that, in turn, motivates students to act in ethical ways. So with this triad in mind, we have implemented more than a few strategies and systems to ensure academic integrity.”

Question: How do I know if online education is right for me?

[Answer] Choosing the right degree program takes time and careful research no matter how one intends to study. Learning styles, goals, and programs always vary, but students considering online colleges must consider technical skills, ability to self-motivate, and other factors specific to the medium. A number of colleges and universities have developed assessments to help prospective students determine whether they are prepared for online learning. You can access a compilation of assessments from many different colleges online. Online course demos and trials can also be helpful, particularly if they are offered by schools of interest. Students can call online colleges and ask to speak with an admissions representative who can clarify additional requirements and expectations.

Question: How do I know if an online degree program is credible?

[Answer] As with traditional colleges, some online schools are considered more credible than others. Reputation, post-graduation employment statistics, and enrollment numbers are not always reliable indicators of quality, which is why many experts advise students to look for accredited schools. In order for an online college to be accredited, a third-party organization must review its practices, finances, instructors, and other important criteria and certify that they meet certain quality standards. The certifying organization matters, too, since accreditation is only as reliable as the agency that grants it. Students should confirm online programs’ accrediting agencies are recognized by the U.S. Department of Education and/or the Council on Higher Education Accreditation before submitting their applications.

Online Student Support Services

Question: Do online schools offer the same student support services as traditional colleges?

[Answer] Colleges and universities tend to offer online students many of the same support services as campus-based students, though they may be administered differently. Instead of going to a campus library, online students may log in to virtual libraries stocked with digital materials, or work with research librarians by phone or email. Tutoring, academic advising, and career services might rely on video conferencing software, virtual meeting rooms, and other collaborative technologies. Some online colleges offer non-academic student support services as well. For example, Western Governors University’s Student Assistance Program provides online students with 24/7 access to personal counseling, legal advice, and financial consulting services. A list of student support services is usually readily available on online colleges’ websites.

Question: What technical skills do online students need?

[Answer] Online learning platforms are typically designed to be as user-friendly as possible: intuitive controls, clear instructions, and tutorials guide students through new tasks. However, students still need basic computer skills to access and navigate these programs. These skills include: using a keyboard and a mouse; running computer programs; using the Internet; sending and receiving email; using word processing programs; and using forums and other collaborative tools. Most online programs publish such requirements on their websites. If not, an admissions adviser can help.

Students who do not meet a program’s basic technical skills requirements are not without recourse. Online colleges frequently offer classes and simulations that help students establish computer literacy before beginning their studies. Microsoft’s online digital literacy curriculum is one free resource.

Question: What technology requirements must online students meet? What if they do not meet them?

[Answer] Technical requirements vary from one online degree program to the next, but most students need at minimum high-speed Internet access, a keyboard, and a computer capable of running specified online learning software. Courses using identity verification tools and voice- or web-conferencing software require webcams and microphones. Scanners and printers help, too. While online schools increasingly offer mobile apps for learning on-the-go, smartphones and tablets alone may not be sufficient.

Most online colleges list minimum technology requirements on their websites. Students who do not meet these requirements should contact schools directly to inquire about programs that can help. Some online schools lend or provide laptops, netbooks, or tablets for little to no cost, though students must generally return them right away if they withdraw from courses. Other colleges may offer grants and scholarships to help cover technical costs for students who qualify.

Question: Are online students eligible for financial aid?

[Answer] Students enrolled in qualifying online degree programs are eligible for many of the same loans, scholarships, and grants as traditional campus-based students. They are also free to apply for federal and state financial aid so long as they:

  • Attend online programs accredited by an organization recognized by either the U.S. Department of Education or the Council on Higher Education Accreditation.
  • Attend online schools that are authorized to operate in their state of residence.
  • Meet all additional application requirements, including those related to legal status, citizenship, age, and educational attainment.
  • Submit applications and all supporting materials by their deadlines.

Students can visit the U.S. Department of Education’s Federal Student Aid website to review all eligibility requirements and deadlines, and to submit their Free Application for Federal Student Aid (FAFSA). Note that many states, colleges, and organizations use the FAFSA to determine students’ eligibility for other types of aid, including grants, scholarships, and loans. Students can contact prospective schools directly to speak with financial aid advisors.

Disclaimer: Financial aid is never guaranteed, even among eligible online students. Contact colleges and universities directly to clarify their policies.

Question: Can students use military education benefits to pay for online education?

[Answer] Active-duty and veteran military service members can typically apply their military education benefits toward an online degree, though they must still meet many of the same eligibility requirements detailed in the previous answer. Many state-level benefits have additional residency requirements. Most colleges have whole offices dedicated to helping these students understand and use their benefits effectively. They may also clarify applicable aid programs and requirements on their official websites. When in doubt, students should contact schools directly or visit the nearest Department of Veterans Affairs office to learn more about their options.

"Educational Benefits of Online Learning," Blackboard Learning, presented by California Polytechnic State University, San Luis Obispo

"Four Proven Advantages of Online Learning (That Are NOT the Cost, Accessibility or Flexibility)," Coursera Blog, Coursera

"Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies," U.S. Department of Education

"Twenty Years of Research on the Academic Performance Differences Between Traditional and Distance Learning," M. Sachar, Y. Neumann, Journal of Online Learning and Teaching, MERLOT

"The Market Value of Online Degrees as a Credible Credential," Calvin D. Foggle, Devonda Elliott, accessed via New York University

"Cheating in the Digital Age: Do Students Cheat More in Online Courses?" George Watson, James Sottile, accessed via the University of Georgia

"Student Identity Verification Tools and Live Proctoring in Accordance With Regulations to Combat Academic Dishonesty in Distance Education," Vincent Termini, Franklin Hayes, Online Learning Consortium

"Student Readiness for Online Learning," G. Hanley, MERLOT

"Recognized Accrediting Organizations," Council for Higher Education Accreditation

"Digital Literacy," Microsoft, Inc.

"Free Application for Federal Student Aid," Office of Federal Student Aid, U.S. Department of Education



Online/Digital Learning Questionnaire

  • October 2020
  • Muhammad Adnan, Monash University (Australia)
  • Kainat Anwar, Government College University Faisalabad

COMMENTS

  1. PDF A Systematic Review of the Research Topics in Online Learning During

    Table 1 summarizes the 12 topics in online learning research in the current research and compares it to Martin et al.'s (2020) study, as shown in Figure 1. The top research theme in our study was engagement (22.5%), followed by course design and development (12.6%) and course technology (11.0%).

  2. 45 Survey Questions to Understand Student Engagement in Online Learning

    Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face. As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning.

  3. 206 questions with answers in ONLINE LEARNING

    Online Learning - Science topic. Explore the latest questions and answers in Online Learning, and find Online Learning experts. Questions (206) Publications (338,281) Questions related to Online ...

  4. A systematic review of the effectiveness of online learning in higher

    Zhang et al. (2022) implemented a bibliometric review to provide a holistic view of research on online learning in higher education during the COVID-19 pandemic period. They concluded that the majority of research focused on identifying the use of strategies and technologies, psychological impacts brought by the pandemic, and student perceptions.

  5. (Pdf) Research on Online Learning

    The CoI model has formed the basis for a good deal of research on online learning. Most of this research has focused on one of the three presences, social presence being the most frequently ...

  6. Online and face‐to‐face learning: Evidence from students' performance

    1.1. Related literature. Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education).

  7. Key findings about online learning and the ...

    Parents with lower incomes whose children's schools closed amid COVID-19 were more likely to say their children faced technology-related obstacles while learning from home. Nearly half of these parents (46%) said their child faced at least one of the three obstacles to learning asked about in the survey, compared with 31% of parents with ...

  8. Examining research on the impact of distance and online learning: A

    Distance learning has evolved over many generations into its newest form of what we commonly label as online learning. In this second-order meta-analysis, we analyze 19 first-order meta-analyses to examine the impact of distance learning and the special case of online learning on students' cognitive, affective and behavioral outcomes.

  9. Students' online learning challenges during the pandemic and how they

    To address the research questions, we used both quantitative and qualitative analyses. For the quantitative analysis, we entered all the data into an Excel spreadsheet. ... Clark T. Research and practice in K-12 online learning: A review of open access literature. The International Review of Research in Open and Distributed Learning. 2009; 10 ...

  10. A systematic review of research on online teaching and learning from

    1. Introduction. Online learning has been on the increase in the last two decades. In the United States, though higher education enrollment has declined, online learning enrollment in public institutions has continued to increase (Allen & Seaman, 2017), and so has the research on online learning. There have been review studies conducted on specific areas of online learning such as innovations ...

  11. Online learning during COVID-19 produced equivalent or better student

    Research across disciplines has demonstrated that well-designed online learning can lead to students' enhanced motivation, satisfaction, and learning [1,2,3,4,5,6,7]. A report by the U.S. Department of Education, based on examinations of comparative studies of online and face-to-face versions of the same course from 1996 to 2008, concluded that online learning could produce learning ...

  12. COVID-19's impacts on the scope, effectiveness, and ...

    The COVID-19 outbreak brought online learning to the forefront of education. Scholars have conducted many studies on online learning during the pandemic, but only a few have performed quantitative comparative analyses of students' online learning behavior before and after the outbreak. We collected review data from China's massive open online course platform called icourse.163 and ...

  13. Insights Into Students' Experiences and Perceptions of Remote Learning

    This result is consistent with prior research on the value of active learning (Freeman et al., 2014). Though research shows that student learning improves in active learning classes, on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses (Deslauriers et al., 2019 ...

  14. How Effective Is Online Learning? What the Research Does and Doesn't

    Online learning can take a number of different forms. Often people think of Massive Open Online Courses, or MOOCs, where thousands of students watch a video online and fill out questionnaires or ...

  15. Students' Learning Experiences and Perceptions of Online Course Content

    Research questions focused on how participants perceived their learning experiences in online courses and how they described interactions with instructors and other students. Data collection was multimodal. The interviews were conducted in face-to-face format, electronic mail, and Skype. The questionnaires were completed by electronic mail. Field

  16. Managing attention and distractibility in online learning

    As a result, teachers were left with many questions and few clear answers. Although the existing literature specific to virtual learning environments is limited, there is a robust research base on attention, engagement, distractibility, and learning in general, much of which can be adapted and applied in virtual settings.

  17. The qualitative evidence behind the factors impacting online learning

    Garrison et al. (2000) proposed the CoI framework to "illustrate the multifaceted components of teaching and learning in a text-based environment" (Anderson, Rourke, Garrison, & Archer, 2001, p. 3). The focus of the CoI framework is to create deep and meaningful online learning experiences through (a) social presence (SP): "the ability of participants to identify with the community (e.g ...

  18. (PDF) Engaging online learners: A quantitative study of postsecondary

    The online learning experimental questions were attached to the end of the NSSE online survey and sent to students at 45 U.S. baccalaureate degree-granting institutions. The 45

  19. Top 6 Questions People Ask About Online Learning

    Affordability. Online courses are usually more affordable as well. According to the Education Data Initiative, an online degree is $36,595 cheaper than an in-person degree when the costs of tuition and attendance are compared. The average cost of attending a private university is $129,800 for an in-person degree and only $60,593 for an online ...

  20. Distance learning survey for students

    Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful. You can also use a ready-made survey template to save time.

  21. The Impact of Online Learning on Student's Academic Performance

    online classes could affect the academic performance of students. This paper seeks to study the impact of online learning on the academic performance of university students and to determine whether education systems should increase the amount of online learning for traditional in-class subjects.

  22. Traditional Learning Compared to Online Learning During the COVID-19

    Accordingly, faculty members have been given online learning training on how to implement online teaching by online teaching experts through the university's electronic platforms to teach students remotely through the Internet (Rucker & Frass, 2017). This study aims to analyze the implications of the shift to online learning from a faculty ...

  23. PDF Students' Perceptions towards the Quality of Online Education: A

    Yi Yang and Linda F. Cornelius, Mississippi State University. How to ensure the quality of online learning in institutions of higher education has been a growing concern during the past several years. While several studies have focused on the perceptions of faculty and administrators, there has been a paucity of research conducted on ...

  24. Frequently Asked Questions About Online Education

    Recent reports detail just how quickly colleges adopted online learning. According to the Babson Survey Research Group, university and student participation in online education is at an all-time high. Even some of the largest and most prestigious universities now offer online degrees. Despite its growing popularity, online education is still ...

  25. Ask the Expert: Online learning vs. classroom learning

    Christine Greenhow, associate professor of educational technology in the College of Education, 2018 recipient of MSU's Teacher-Scholar Award, answers questions about online and classroom learning. Q: What are the advantages of online learning, compared to in-person classroom learning? A: Online learning can be as good or even better than in-person classroom learning. Research has shown that ...

  26. In-person versus online learning in relation to students' perceptions

    We examined students' perceptions of mattering during the pandemic in relation to in-person versus online learning in a sample of 6578 Canadian students in Grades 4-12. We found that elementary school students who attended school in-person reported mattering the most, followed by secondary school students who learned part-time in-person and the rest of the time online (blended learning group).

  27. Digitalised higher education: key developments, questions, and concerns

    Introduction. Higher education (HE) is already profoundly digitalised. Students and staff use digital technology routinely, including learning management systems (LMS), e-books, and many apps supporting teaching, learning, and research (Henderson, Selwyn, & Aston, 2017). University leaders are reorganising their institutions as data organisations that can benefit from data analytics ...

  28. When Meta-Learning Meets Online and Continual Learning: A Survey

    Over the past decade, deep neural networks have demonstrated significant success using the training scheme that involves mini-batch stochastic gradient descent on extensive datasets. Expanding upon this accomplishment, there has been a surge in research exploring the application of neural networks in other learning scenarios. One notable framework that has garnered significant attention is ...

  29. (PDF) Online/Digital Learning Questionnaire

    7) I am comfortable communicating electronically. 8) Learning is the same in class and at home on the Internet. ... than a regular course. 10) I believe a complete course can be given by the ...

  30. Digital storytelling: An educational approach for enhancing dyslexic

    The aim is to investigate how digital storytelling could enhance the writing skills of primary students with dyslexia (operational dimension), also considering the critical and cultural dimensions of Green's 3D socio-cultural theoretical framework. To achieve this aim, the following research questions (RQs) were formulated:
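Several of the resources above (for example, items 20 and 29) revolve around Likert-style survey items for online learners. As a minimal sketch of how responses to one such question might be scored, the snippet below maps a hypothetical five-point agreement scale to numbers and reports the mean score and percent-favorable rate; the scale labels, question text, and sample responses are all illustrative assumptions, not taken from any instrument cited here.

```python
# Minimal sketch: scoring Likert-scale responses to one survey question.
# The scale labels and sample data below are hypothetical.
from statistics import mean

# Hypothetical 5-point agreement scale mapped to numeric scores.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def summarize(responses):
    """Return (mean score, percent favorable) for one question.

    "Favorable" here means a response of Agree or Strongly agree (4 or 5),
    a common convention for reporting Likert results.
    """
    scores = [LIKERT[r] for r in responses]
    favorable = sum(1 for s in scores if s >= 4)
    return round(mean(scores), 2), round(100 * favorable / len(scores), 1)

# Example: responses to a hypothetical item such as
# "My teacher is available to help me when I need it."
sample = ["Agree", "Strongly agree", "Neutral", "Agree", "Disagree"]
avg, pct_favorable = summarize(sample)  # avg = 3.6, pct_favorable = 60.0
```

Reporting both the mean and the percent-favorable rate is a deliberate choice: the mean can mask polarized responses, while percent favorable shows how many students land on the positive end of the scale.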