Impact of online learning on student's performance and engagement: a systematic review

  • Open access
  • Published: 01 November 2024
  • Volume 3, article number 205 (2024)



  • Catherine Nabiem Akpen   ORCID: orcid.org/0009-0007-2218-2254 1 ,
  • Stephen Asaolu   ORCID: orcid.org/0000-0002-7116-6468 1 ,
  • Sunday Atobatele   ORCID: orcid.org/0000-0003-1947-2561 2 ,
  • Hilary Okagbue   ORCID: orcid.org/0000-0002-3779-9763 1 &
  • Sidney Sampson   ORCID: orcid.org/0000-0001-5303-5475 2  

The rapid shift to online learning during the COVID-19 pandemic has significantly influenced educational practices worldwide and increased the use of online learning platforms. This systematic review examines the impact of online learning on student engagement and performance, providing a comprehensive analysis of existing studies. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, a thorough literature search was conducted across three databases (PubMed, ScienceDirect, and JSTOR) for articles published between 2019 and 2024. The review included peer-reviewed studies that assessed student engagement and performance in online learning environments. After applying inclusion and exclusion criteria, 18 studies were selected for detailed analysis. The analysis revealed varied impacts of online learning on student performance and engagement. Some studies reported improved academic performance owing to the flexibility and accessibility of online learning, which enables students to learn at their own pace. However, other studies highlighted challenges such as decreased engagement, isolation, and reduced interaction with instructors and peers. The effectiveness of online learning was found to be influenced by factors such as the quality of digital tools, reliable internet access, and student motivation. Maintaining student engagement remains a challenge; interactive elements such as discussion forums and multimedia resources, alongside adequate instructor-student interaction, were critical in improving both engagement and performance.


1 Introduction

Online learning, also referred to as e-learning or remote learning, is essentially a web-based program that gives learners access to knowledge or information whenever needed, regardless of their proximity to a location or time constraints [1]. This form of learning has been around for a while: it started in the late 1990s and has advanced quickly. It has been considered a good choice, particularly for adult learners [2].

Online education promotes a student-centred approach, whereby students are expected to actively participate in the learning process. The digital tools used in online learning include interactive elements, computers, mobile devices, the internet, and other devices that allow students to receive and share knowledge [3]. Different types of online learning exist, such as microlearning, individualized learning, synchronous, asynchronous, blended, and massive open online courses [2]. Online learning offers several advantages to students, such as its adaptability to individual needs and its ease and flexibility of involvement. With user-friendly online learning applications on their personal computers (PCs) or laptops, students can take part in online courses from any convenient place and can take specific courses with fewer time and location restrictions [4].

Students' learning experiences and academic success are among the areas of difficulty in online education [5]. Furthermore, while technology facilitates the accessibility and ease of use of online learning platforms, it can also have restrictive effects: many students struggle to gain internet access [6], which in turn causes problems with participation and attendance in virtual classes and makes it difficult to adopt online learning platforms [7]. Other issues with e-learning include educational policy, learning pedagogy, accessibility, affordability, and flexibility [8]. Many developing countries have substantial issues with reliable internet connection and access to digital devices, especially among economically disadvantaged children [9]. Maintaining student engagement in an online classroom can be more difficult than in a traditional face-to-face setting [10]. Even with all the advantages of online learning, there is reduced interaction between students and course facilitators. Another barrier to online learning is the lack of opportunities for human connection, which was thought to be essential for building peer support and fostering in-depth group discussions on the subject [11].

Over the past four years, COVID-19 spread across the world, forcing schools to close; the pandemic compelled educators and learners at every level to swiftly adapt to online learning to curb the spread of the disease while ensuring continuous education [12]. The emergence of the pandemic rendered traditional face-to-face teaching and training methods unfeasible [13]. Some studies [14, 15, 16] acknowledged that the move to online learning was significant and sudden, but that it was also necessary to continue the learning process. This abrupt change sparked an argument regarding the standard of learning and satisfaction with learning among students [17].

While there are similarities between face-to-face (F2F) and online learning, they still differ in several ways [18]. Among the similarities: students are still expected to attend, comprehend the subject matter, turn in homework, and complete group projects, while teachers still need to create curricula, enhance the quality of their instruction, respond to student inquiries, inspire students to learn, and grade assignments [19]. One difference is that online learning is student-centred and necessitates active learning, whereas F2F learning is teacher-centred and demands passive learning from the student [19]. Another difference is that teaching and learning have to happen at the same time and location in face-to-face learning, while online learning is not restricted by time or location [20]. Online learning allows teaching and learning to be done separately using internet-based information delivery systems [21].

Finding more efficient strategies to increase student engagement in online learning settings is necessary, as the absence of F2F interactions between students and instructors or among students continues to be a significant issue with online learning [ 20 ]. Student engagement has been defined as how involved or interested students appear to be in their learning and how connected they are to their classes, their institutions, and each other [ 22 ]. Engagement has been pointed out as a major dimension of students’ level and quality of learning, and is associated with improvement in their academic achievement, their persistence versus dropout, as well as their personal and cognitive development [ 23 ]. In an online setting, student engagement is equally crucial to their success and performance [ 24 ].

A change in learning delivery method is accompanied by inquiries: when assessing whether online education is a practical replacement for traditional classroom instruction, cost-benefit evaluation, student experience, and student achievement are now being carefully considered [19]. This decision-making process will most likely continue as long as students seek greater learning opportunities and technology continues to advance [19].

An individual's academic performance is significant to their success during their time in an educational institution [25], and students' academic achievement is one indicator of their educational accomplishment. However, it is frequently observed that while students' learning capacities are average, the demands placed on them for academic achievement are rising, which is why students' academic performance often falls below expectations [25].

Numerous authors [ 11 , 13 , 18 , 26 , 27 , 28 , 29 ] have examined how students and teachers view online learning, but it is still important to understand how much students are learning from these platforms. After all, student performance determines whether a subject or course is successful or unsuccessful.

The increase in the use of online learning calls for a careful analysis of its impact on student performance and engagement. Investigating the online learning experiences of students will guide education policymakers, such as ministries, departments, and agencies in both the public and private sectors, in evaluating the potential pros and cons of adopting online education as against F2F education [30].

Given the foregoing, this study was carried out to: (1) investigate the online learning experiences of students, (2) review the academic performance of students using online learning platforms, and (3) explore the levels of students’ engagement when learning using online platforms.

2 Methodology

The study was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [31].

2.1 Search strategy and databases used

PubMed, ScienceDirect, and JSTOR were the databases used to search for articles using identified search terms. The three databases were selected for their extensive coverage of health sciences, social sciences, and educational articles. Articles published between 2019 and 2024 were searched, because online learning became popular during the COVID-19 pandemic, which started in 2019. Only English, open-access, free full-text articles were selected for review, to ensure that the data analysed are publicly available and that the review is transparent and reproducible. The search was carried out in February 2024. The search strategy terms used are shown in Table 1.

2.2 Inclusion and exclusion criteria

Articles included for review were studies conducted on students enrolled in any field at a higher education institution. Only articles in the English language, published between 2019 and 2024, that assessed student performance and engagement were included.

Excluded articles were studies involving pupils (primary school students), articles not written in English, and those published before 2019. Also excluded were studies that did not follow the Declaration of Helsinki on research ethics or lacked clear evidence of ethical consideration and approval.

2.3 Search outcomes

A total of 1078 articles were obtained from the databases searched. Four duplicate articles were removed. After removal of duplicates, the titles and abstracts of the remaining 1074 articles were evaluated against the inclusion criteria, and 1052 studies were excluded at this stage. The complete texts of the remaining 22 articles were read; four were found to be irrelevant to the review, leaving a total of 18 articles for the systematic review.
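As a quick arithmetic check, the screening counts reported above can be tallied step by step; the short sketch below simply reproduces the numbers from the text (the variable names are illustrative, not part of the original PRISMA record).

```python
# Minimal sketch tallying the screening counts reported above; the figures are
# taken directly from the text and the variable names are illustrative.
records_identified = 1078        # PubMed + ScienceDirect + JSTOR
duplicates_removed = 4
records_screened = records_identified - duplicates_removed            # 1074
excluded_on_title_abstract = 1052
full_texts_assessed = records_screened - excluded_on_title_abstract   # 22
excluded_on_full_text = 4        # judged irrelevant after full-text reading
studies_included = full_texts_assessed - excluded_on_full_text        # 18

assert studies_included == 18
print(records_screened, full_texts_assessed, studies_included)  # 1074 22 18
```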

The PRISMA flowchart shown in Fig.  1 illustrates the procedure used to screen and assess the articles.

Fig. 1 A PRISMA flow chart of studies included in the systematic review

2.4 Data analysis

A data synthesis table was developed to collect relevant information on the author, year the study was conducted, study design, study location, sampling methodology, sample size, population, assessment tool, findings on student performance and student engagement, other findings, and limitations. Data were collected on whether students’ performance and engagement improved or declined following the introduction of online learning in their education, and on the extent of the improvement or decline.
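For illustration only, the data-extraction form described above could be laid out as a flat table; the sketch below shows one hypothetical way to do this in Python, with column names that paraphrase the synthesis-table fields rather than reproduce the authors' exact template.

```python
# Hypothetical rendering of the data-synthesis table as CSV column headers.
import csv

FIELDS = [
    "author", "year", "study_design", "study_location",
    "sampling_methodology", "sample_size", "population", "assessment_tool",
    "performance_finding",   # improved / declined / no change, plus extent
    "engagement_finding",    # improved / declined / no change, plus extent
    "other_findings", "limitations",
]

with open("data_synthesis.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()  # one row per included study would follow
```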

2.5 Quality appraisal

A quality assessment was carried out using the Critical Appraisal Skills Programme (CASP) developed to appraise systematic reviews. The checklist was used to analyse the included articles.

3 Results

The characteristics of the 18 articles included in the study are presented in Table 2. Ten (55.6%) were cross-sectional studies [2, 10, 12, 13, 29, 32, 33, 34, 35, 36], three (16.7%) were mixed-methods studies [18, 26, 37], two (11.1%) were quasi-experimental studies [38, 39], two (11.1%) were longitudinal studies [3, 40], and one (5.5%) was a qualitative study [41].

The populations involved in the studies were a mix of students from various fields and departments, including medicine, nursing, pharmacy, psychology, management courses, and engineering [3, 12, 13, 18, 29, 32, 34, 35, 38, 39, 41]. The remaining studies involved undergraduates whose fields were not specified [2, 10, 26, 33, 36, 37, 40].

Study outcomes were categorized into three groups: student performance, student engagement, and studies that measured both student performance and engagement.

The summary of findings from the included studies is presented in Table 3. Questionnaire surveys were used in most of the studies; however, one study used focus group discussions [41] and another used a checklist to collect administrative data from student registers [40]. The study designs used in the included studies were cross-sectional, mixed methods, quasi-experimental, qualitative, and longitudinal. Studies were included from countries across all six inhabited continents: countries in Asia contributed most of the studies (n = 7), followed by Europe (n = 5) and North America (n = 2), while South America, Africa, and Australia each had one country represented among the study locations.

3.1 Students’ performance

The impact of online learning on student performance was documented in thirteen studies [3, 10, 12, 13, 18, 26, 32, 33, 34, 36, 38, 39, 40]. In the study conducted by Elnour et al. [12], about half of the respondents strongly agreed that online learning had a negative impact on their grades compared with when they were attending face-to-face classes; two other studies had similar findings, with students reporting a decline in their grades during online learning [34, 40].

Two studies conducted experiments comparing the grades achieved by students taking online classes (experimental group) with those of students taking face-to-face classes (control group) and found that those in the experimental group scored higher in examinations than those in the control group [38, 39]. Nine studies included in this review showed a positive impact of online learning on student performance [3, 10, 13, 26, 32, 33, 36, 38, 39]; students reported getting higher scores in examinations when they switched to online learning.

Two studies measured the performance of students before and during online learning [3, 40], with contrasting findings. One study found that when students started learning online, their grades improved on average from 4.7/10 to 5.15/10 and dropped back to 4.6/10 when they returned to face-to-face learning [3], while the other used students' registers to capture grades before and during online learning and found that the switch to online learning led to fewer credits being obtained by the students [40].

3.2 Students’ engagement

Student engagement during online learning was reported in ten of the reviewed articles [2, 10, 12, 13, 18, 29, 35, 36, 37, 41]. Students reported a negative effect of online learning on engagement with their peers and teachers. Nonetheless, in one of the studies [18], the respondents reported that online learning did not affect engagement with their lecturers, even though they felt least engaged with their peers. Students reported feeling isolated when studying and taking classes online compared with face-to-face learning [2, 41]; they revealed that the abrupt switch did not allow them to understand and adapt to the new form of learning, and it led to feelings of isolation and separation from their classmates and teachers [2].

For science-based courses, students reported concern about carrying out practical classes, as studying online did not grant them the opportunity to carry out practicals effectively [18]. Also, medical students reported dissatisfaction with interacting with their patients, which led to less engagement and connection [13]. One of the studies reviewed highlighted the role of engagement in increasing student performance over time; students stated that when they interact and engage with their teams and lecturers, they tend to perform better in their examinations [18].

4 Discussion

This study aimed to examine the impact of online learning on students' academic performance and engagement. The results underscore the varied impacts of online learning on student performance and engagement. While some students benefited from the flexibility and new opportunities presented by online learning, others struggled with the lack of direct interaction and practical engagement. This suggests that while online learning has potential, it requires careful implementation and support to address the challenges of engagement and practical application, particularly in fields requiring hands-on experience.

The majority of the articles in this review showed that online learning did not negatively affect students' academic performance, although the studies did not have a standardized method of measuring performance before and during online learning; most surveys were based on students' perceptions. These findings support other studies that reported an increase in students' grades when they studied online [42, 43, 44]. Possible reasons highlighted for the increase in performance include the availability of recorded videos: students were able to study and listen to past teachings at their own pace and review course content when necessary. This enabled them to manage their time better and strengthen their understanding of complex materials and courses. The use of computers and the availability of good internet connectivity were also major reasons emphasized by students as helping them achieve good grades. The incorporation of digital tools such as interactive quizzes, recorded videos, and learning management systems (LMS) provided students with an engaging avenue to learn, which enhanced their academic performance [45]. Many students found that independent learning suited them and matched their learning style better than F2F learning, which could be another reason for the improvement in their grades.

Despite reporting good grades with online learning, students still felt unsatisfied with this mode of learning; they reported poor internet connectivity, especially in studies conducted in Africa and Asia [1, 13]. Furthermore, no academic performance variations were found between rural and urban learners [46]; this differs from the finding of Bacher-Hicks et al. [47], who stated that students in rural communities require more support to bridge the academic gap with their peers in urban settings. Another author compared environmental conditions at home with student academic performance and found that students with poor lighting or those exposed to noisy environments performed poorly, suggesting that online learners need proper indoor lighting, ventilation, and a quiet environment for effective learning [42].

However, one of the studies found that online learning reduced students' academic grades; this could be because examinations were carried out on smartphones instead of computers, and because of inexperience with the Learning Management System (LMS) [34].

Several implications can arise from improved performance among students due to a shift from F2F to online learning platforms. For students, it can increase confidence and contentment [48], but because of the dependence on technology, students also need to learn time management skills and self-discipline [49], which are essential for success in an online environment. Families may feel less stressed about their children's academic success, but this might also result in more pressure to sustain these outcomes [50], particularly if the progress is linked directly to online learning. Societies benefit from increased academic performance through higher rates of employment and economic growth [51], but unequal access to technology could make the divide between socioeconomic classes more pronounced. Furthermore, improved student performance has the potential to elevate the overall quality of the workforce, accelerating economic growth and competitiveness in the global market [52]. However, disparities in online learning must be addressed to guarantee that every student has an equal chance of success.

In terms of student engagement, similar findings were seen across the reviewed articles: most students reported that online learning was less engaging, and they could not associate with their peers or lecturers, which made them feel isolated. This finding is supported by Hollister et al. [43], where students complained of less engagement in online classes despite attaining good grades; they missed the spontaneous conversations and collaborations that are typical of a classroom setting. Motivation is an important element in both online and offline learning, and students need self-motivation for overall learning outcomes [44]. Findings from this review indicate that students who reported being able to engage actively with their teams and lecturers attributed their success to self-motivation. Also, Cents-Boonstra et al. [53] investigated the role of motivating teaching behaviour and found that teachers who offered support and guidance during learning had more student engagement than teachers who did not offer any support or show enthusiasm for teaching. Courses that previously required hands-on experiences, such as clinical practice or laboratory work, were challenging to conduct online; medical students expressed dissatisfaction with not being able to conduct practical sessions in the laboratory or interact effectively with their patients, which made learning online an isolating experience. Their participation dropped as a result of the separation between the theoretical and practical components of their education. This supports the finding of Khalil et al. [54], where medical students stated that they missed having live clinical sessions and could not wait to return to F2F classes. Major barriers to participation included a lack of personal devices and inconsistent internet access, especially in rural or low-income areas. These barriers made it difficult for students to participate fully in online classes and also made them feel more frustrated and disengaged. This is similar to a study by Al-Amin et al. [11], where tertiary students studying online complained of less engagement in classroom activities.

Generally, students reported a negative effect of online learning on their engagement. This could be a result of poor technology skills, unavailability of personal computers or smartphones, or lack of internet services [ 55 ].

In a study conducted by Heilporn et al. [56], the authors examined strategies that teachers can use to improve student engagement in a blended learning environment: presenting a clear course structure, keeping a steady pace, engaging learners with interactive activities, and providing additional support and constant feedback all help improve overall student engagement. In a study by Gopal et al. [57], the quality of the instructor and the ability to use technological tools were found to be important elements influencing student engagement; the instructor needs to understand the psychology of the students in order to present the course material effectively.

A decrease in student engagement can have a detrimental effect on students' entire educational experience, affecting motivation and satisfaction; in the long term, this could lead to decreased academic achievement and increased dropout rates [58]. To maintain students' motivation and engagement, families might need to put in extra effort, especially if they simultaneously manage the online learning needs of several children [59]. This can result in additional stress or financial constraints in purchasing technological tools. In addition, for students studying online, it results in a less unified learning environment, which may diminish community bonds, and instructors will find it difficult to assist students who are disengaged and potentially falling behind [60].

The contrast between positive student performance and negative student engagement suggests that while online learning is a useful approach, it is less successful at fostering the interactive and social aspects of education. Online learning must include interactive components, such as discussion boards and group projects, that enable interpersonal communication [61]. Furthermore, it is essential to guarantee that students have access to sufficient technological tools and training to enable them to participate fully.

Some learners found it difficult to name the benefits of learning online, but none failed to name the benefits of face-to-face learning. In a study by Aguilera-Hermida [6], college students preferred studying in a physical classroom to studying online and found it hard to adapt to online classes, which decreased their level of participation and engagement. Also, an increase in good grades might be a result of cheating behaviours [3]: unlike face-to-face learning, where teachers are present to invigilate and validate that examinations are individual-based, in online learning it is difficult to determine whether examinations were truly carried out by the students themselves, giving students the option to share their answers with classmates or obtain them from internet resources. The studies did not state whether measures were put in place to ensure that exams taken online were devoid of cheating.

Furthermore, online learning is here to stay, but its planning and execution need to address the challenge of keeping students effectively engaged. Ignoring this could jeopardize the potential advantages of the approach [62].

4.1 Limitations

A major limitation of this systematic review is the paucity of studies that objectively measured performance and engagement in students before and after the introduction of online learning. Findings in fourteen (78%) of the included articles were self-reported by the students, which could lead to recall and/or desirability bias. In addition, the lack of a uniform measurement or scale for assessing students’ performance and engagement is also a limitation. Consequently, we suggest that standardized study tools should be developed and validated across various populations to more accurately and objectively evaluate the impact of the introduction of online learning on students’ performance and engagement. More studies should be conducted with clear pre- and post-intervention measurements using different pedagogical approaches to assess their effects on students’ performance and engagement. These studies should also design ways of measuring indicators objectively without recall or desirability biases. Furthermore, the exclusion of studies that are not open access and of articles not published in English are also limitations of this study.

5 Conclusion

The switch to online learning had both advantages and disadvantages. The flexibility and accessibility of online platforms played a major role in enhancing student performance, yet the decline in engagement underscores the need for more efficacious strategies to promote engagement. Online learning had a positive impact on student performance: most of the students reported either an increase or no change in grades when they changed to learning online, and only three studies reported a decline in student performance. Overall, students felt that with online learning they could not engage with their peers, teams, and teachers; they had a feeling of social isolation and felt that more engagement would have further improved their performance. Schools and policymakers must develop strategies to mitigate the challenge of student engagement in online learning. This is necessary to prepare institutions for potential future pandemics that would compel reliance on online learning, and it is critical for maintaining student satisfaction and overall learning outcomes.

In summary, online learning has the capacity to enhance academic achievement, but its effectiveness depends on effectively resolving the barriers associated with student involvement. Future studies should examine the long-term effects of online learning on students' performance and engagement, with emphasis on creating strategies to improve the social and interactive components of the learning process. This is essential to guarantee that, in the future, online learning will be a viable and productive educational medium and not just a band-aid fix during emergencies like the COVID-19 pandemic.

Data availability

The articles used for this systematic review are all cited and publicly available.

Bossman A, Agyei SK. Technology and instructor dimensions, e-learning satisfaction, and academic performance of distance students in Ghana. Heliyon. 2022;8(4):e09200. https://doi.org/10.1016/J.HELIYON.2022.E09200.


Rahman A, Islam MS, Ahmed NAMF, Islam MM. Students’ perceptions of online learning in higher secondary education in Bangladesh during COVID-19 pandemic. Soc Sci Human Open. 2023;8(1):100646. https://doi.org/10.1016/J.SSAHO.2023.100646 .

Pérez MA, Tiemann P, Urrejola-Contreras GP. The impact of the learning environment sudden shifts on students’ performance in the context of the COVID-19 pandemic. Educ Méd. 2023;24(3):100801. https://doi.org/10.1016/J.EDUMED.2023.100801 .

Basar ZM, Mansor AN, Jamaludin KA, Alias BS. The effectiveness and challenges of online learning for secondary school students—a case study. Asian J Univ Educ. 2021;17(3):119–29.

Rajabalee YB, Santally MI. Learner satisfaction, engagement and performances in an online module: Implications for institutional e-learning policy. Educ Inf Technol (Dordr). 2021;26(3):2623–56.

Aguilera-Hermida AP. College students’ use and acceptance of emergency online learning due to COVID-19. Int J Educ Res Open. 2020;1:100011. https://doi.org/10.1016/j.ijedro.2020.100011 .

Nambiar D. The impact of online learning during COVID-19: students’ and teachers’ perspective. Int J Indian Psychol. 2020;8(2):783–93.


Maheshwari M, Gupta AK, Goyal S. Transformation in higher education through e-learning: a shifting paradigm. Pac Bus Rev Int. 2021;13(8):49–63.

Pokhrel S, Chhetri R. A literature review on impact of COVID-19 pandemic on teaching and learning. High Educ Future. 2021;8(1):133–41. https://doi.org/10.1177/2347631120983481 .

Kedia P, Mishra L. Exploring the factors influencing the effectiveness of online learning: a study on college students. Soc Sci Human Open. 2023;8(1):100559. https://doi.org/10.1016/J.SSAHO.2023.100559 .

Al-Amin M, Al Zubayer A, Deb B, Hasan M. Status of tertiary level online class in Bangladesh: students’ response on preparedness, participation and classroom activities. Heliyon. 2021;7(1):e05943. https://doi.org/10.1016/J.HELIYON.2021.E05943 .

Elnour AA, Abou Hajal A, Goaddar R, Elsharkawy N, Mousa S, Dabbagh N, Mohamad Al Qahtani M, Al Balooshi S, Othman Al Damook N, Sadeq A. Exploring the pharmacy students’ perspectives on off-campus online learning experiences amid COVID-19 crises: a cross-sectional survey. Saudi Pharm J. 2023;31(7):1339–50. https://doi.org/10.1016/j.jsps.2023.05.024 .

Fahim A, Rana S, Haider I, Jalil V, Atif S, Shakeel S, Sethi A. From text to e-text: perceptions of medical, dental and allied students about e-learning. Heliyon. 2022;8(12):e12157. https://doi.org/10.1016/J.HELIYON.2022.E12157 .

Henriksen D, Creely E, Henderson M. Folk pedagogies for teacher transitions: approaches to synchronous online learning in the wake of COVID-19. J Technol Teach Educ. 2020;28(2):201–9.

Zhu X, Chen B, Avadhanam RM, Shui H, Zhang RZ. Reading and connecting: using social annotation in online classes. Inf Learn Sci. 2020;121(5/6):261–71.

Bao W. COVID-19 and online teaching in higher education: a case study of Peking University. Hum Behav Emerg Technol. 2020;2(2):113–5.

Baber H. Determinants of students’ perceived learning outcome and satisfaction in online learning during the pandemic of COVID-19. J Educ Elearn Res. 2020;7(3):285–92.

Afzal F, Crawford L. Student’s perception of engagement in online project management education and its impact on performance: the mediating role of self-motivation. Proj Leadersh Soc. 2022;3:100057. https://doi.org/10.1016/j.plas.2022.100057 .

Paul J, Jefferson F. A comparative analysis of student performance in an online vs. face-to-face environmental science course from 2009 to 2016. Front Comput Sci. 2019;1:7.

Francescucci A, Rohani L. Exclusively synchronous online (VIRI) learning: the impact on student performance and engagement outcomes. J Mark Educ. 2019;41(1):60–9.

Pei L, Wu H. Does online learning work better than offline learning in undergraduate medical education? A systematic review and meta-analysis. Med Educ Online. 2019;24(1):1666538.

Thang SM, Mahmud N, Mohd Jaafar N, Ng LLS, Abdul Aziz NB. Online learning engagement among Malaysian primary school students during the covid-19 pandemic. Int J Innov Creat Change. 2022;16(2):302–26.

Ribeiro L, Rosário P, Núñez JC, Gaeta M, Fuentes S. First-year students' background and academic achievement: the mediating role of student engagement. Front Psychol. 2019;10:2669.

Muzammil M, Sutawijaya A, Harsasi M. Investigating student satisfaction in online learning: the role of student interaction and engagement in distance learning university. Turk Online J Distance Educ. 2020;21(Special Issue-IODL):88–96.

Mandasari B. The impact of online learning toward students’ academic performance on business correspondence course. EDUTEC. 2020;4(1):98–110.

Chen LH. Moving forward: international students’ perspectives of online learning experience during the pandemic. Int J Educ Res Open. 2023;5:100276. https://doi.org/10.1016/j.ijedro.2023.100276 .

Wu YH, Chiang CP. Online or physical class for histology course: Which one is better? J Dent Sci. 2023;18(3):1295–300. https://doi.org/10.1016/j.jds.2023.03.004 .

Salahshouri A, Eslami K, Boostani H, Zahiri M, Jahani S, Arjmand R, Heydarabadi AB, Dehaghi BF. The university students’ viewpoints on e-learning system during COVID-19 pandemic: the case of Iran. Heliyon. 2022;8(2):e08984. https://doi.org/10.1016/j.heliyon.2022.e08984 .

Maqbool S, Farhan M, Abu Safian H, Zulqarnain I, Asif H, Noor Z, Yavari M, Saeed S, Abbas K, Basit J, Ur Rehman ME. Student’s perception of E-learning during COVID-19 pandemic and its positive and negative learning outcomes among medical students: a country-wise study conducted in Pakistan and Iran. Ann Med Surg (Lond). 2022;82:104713. https://doi.org/10.1016/j.amsu.2022.104713 .

Anderson T. Theories for learning with emerging technologies. In: Veletsianos G, editor. Emerging technologies in distance education. Athabasca: Athabasca University Press; 2010.

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Shamseer L, Tetzlaff JM, Akl EA, Brennan SE, Chou R, Glanville J, Grimshaw JM, Hróbjartsson A, Lalu MM, Li T, Loder EW, Mayo-Wilson E, McDonald S, McGuinness LA, Stewart LA, Thomas J, Tricco AC, Welch VA, Whiting P, Moher D. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int J Surg. 2021;88:105906. https://doi.org/10.1016/j.ijsu.2021.105906 .

Weerarathna RS, Rathnayake NM, Pathirana UPGY, Weerasinghe DSH, Biyanwila DSP, Bogahage SD. Effect of E-learning on management undergraduates’ academic success during COVID-19: a study at non-state Universities in Sri Lanka. Heliyon. 2023;9(9):e19293. https://doi.org/10.1016/j.heliyon.2023.e19293 .

Bossman A, Agyei SK. Technology and instructor dimensions, e-learning satisfaction, and academic performance of distance students in Ghana. Heliyon. 2022;8(4):e09200. https://doi.org/10.1016/J.HELIYON.2022.E09200 .

Mushtaha E, Abu Dabous S, Alsyouf I, Ahmed A, Raafat AN. The challenges and opportunities of online learning and teaching at engineering and theoretical colleges during the pandemic. Ain Shams Eng J. 2022;13(6):101770. https://doi.org/10.1016/J.ASEJ.2022.101770 .

Wester ER, Walsh LL, Arango-Caro S, Callis-Duehl KL. Student engagement declines in STEM undergraduates during COVID-19–driven remote learning. J Microbiol Biol Educ. 2021;22(1):22.1.50. https://doi.org/10.1128/jmbe.v22i1.2385 .

Lemay DJ, Bazelais P, Doleck T. Transition to online learning during the COVID-19 pandemic. Comput Hum Behav Rep. 2021;4:100130. https://doi.org/10.1016/j.chbr.2021.100130 .

Briggs MA, Thornton C, McIver VJ, Rumbold PLS, Peart DJ. Investigation into the transition to online learning due to the COVID-19 pandemic, between new and continuing undergraduate students. J Hosp Leis Sport Tour Educ. 2023;32:100430. https://doi.org/10.1016/J.JHLSTE.2023.100430 .

Nácher MJ, Badenes-Ribera L, Torrijos C, Ballesteros MA, Cebadera E. The effectiveness of the GoKoan e-learning platform in improving university students’ academic performance. Stud Educ Eval. 2021;70:101026. https://doi.org/10.1016/J.STUEDUC.2021.101026 .

Grønlien HK, Christoffersen TE, Ringstad Ø, Andreassen M, Lugo RG. A blended learning teaching strategy strengthens the nursing students’ performance and self-reported learning outcome achievement in an anatomy, physiology and biochemistry course—a quasi-experimental study. Nurse Educ Pract. 2021;52:103046. https://doi.org/10.1016/J.NEPR.2021.103046 .

De Paola M, Gioia F, Scoppa V. Online teaching, procrastination and student achievement. Econ Educ Rev. 2023;94:102378. https://doi.org/10.1016/J.ECONEDUREV.2023.102378 .

Goodwin J, Kilty C, Kelly P, O’Donovan A, White S, O’Malley M. Undergraduate student nurses’ views of online learning. Teach Learn Nurs. 2022;17(4):398–402. https://doi.org/10.1016/J.TELN.2022.02.005 .

Realyvásquez-Vargas A, Maldonado-Macías AA, Arredondo-Soto KC, Baez-Lopez Y, Carrillo-Gutiérrez T, Hernández-Escobedo G. The impact of environmental factors on academic performance of university students taking online classes during the COVID-19 Pandemic in Mexico. Sustainability. 2020;12(21):9194.

Hollister B, Nair P, Hill-Lindsay S, Chukoskie L. Engagement in online learning: student attitudes and behavior during COVID-19. Front Educ. 2022;7:851019. https://doi.org/10.3389/feduc.2022.851019 .

Hsu HC, Wang CV, Levesque-Bristol C. Reexamining the impact of self-determination theory on learning outcomes in the online learning environment. Educ Inf Technol. 2019;24(3):2159–74.

Bradley VM. Learning Management System (LMS) use with online instruction. Int J Technol Educ. 2021;4(1):68–92.

Clark AE, Nong H, Zhu H, Zhu R. Compensating for academic loss: online learning and student performance during the COVID-19 pandemic. China Econ Rev. 2021;68:101629.

Bacher-Hicks A, Goodman J, Mulhern C. Inequality in household adaptation to schooling shocks: Covid-induced online learning engagement in real time. J Public Econ. 2021;193:104345. https://doi.org/10.1016/j.jpubeco.2020.104345.

Liu YM, Hou YC. Effect of multi-disciplinary teaching on learning satisfaction, self-confidence level and learning performance in the nursing students. Nurse Educ Pract. 2021;55:103128.

Gelles LA, Lord SM, Hoople GD, Chen DA, Mejia JA. Compassionate flexibility and self-discipline: Student adaptation to emergency remote teaching in an integrated engineering energy course during COVID-19. Educ Sci (Basel). 2020;10(11):304.

Deng Y, et al. Family and academic stress and their impact on students’ depression level and academic performance. Front Psychiatry. 2022;13:869337. https://doi.org/10.3389/fpsyt.2022.869337 .

Gunderson M, Oreopoulos P. Returns to education in developed countries. In: The economics of education; 2020. p. 39–51. https://doi.org/10.1016/B978-0-12-815391-8.00003-3.

Prasetyo PE, Kistanti NR. Human capital, institutional economics and entrepreneurship as a driver for quality & sustainable economic growth. Entrep Sustain Issues. 2020;7(4):2575. https://doi.org/10.9770/jesi.2020.7.4(1) .

Cents-Boonstra M, Lichtwarck-Aschoff A, Denessen E, Aelterman N, Haerens L. Fostering student engagement with motivating teaching: an observation study of teacher and student behaviours. Res Pap Educ. 2021;36(6):754–79. https://doi.org/10.1080/02671522.2020.1767184 .

Khalil R, Mansour AE, Fadda WA, et al. The sudden transition to synchronized online learning during the COVID-19 pandemic in Saudi Arabia: a qualitative study exploring medical students’ perspectives. BMC Med Educ. 2020;20:1–10. https://doi.org/10.1186/s12909-020-02208-z .

Werang BR, Leba SMR. Factors affecting student engagement in online teaching and learning: a qualitative case study. Qualitative Report. 2022;27(2):555–77. https://doi.org/10.46743/2160-3715/2022.5165 .

Heilporn G, Lakhal S, Bélisle M. An examination of teachers’ strategies to foster student engagement in blended learning in higher education. Int J Educ Technol High Educ. 2021;18(1):25. https://doi.org/10.1186/s41239-021-00260-3 .

Gopal R, Singh V, Aggarwal A. Impact of online classes on the satisfaction and performance of students during the pandemic period of COVID 19. Educ Inf Technol (Dordr). 2021;26:6923–47. https://doi.org/10.1007/s10639-021-10523-1 .

Schnitzler K, Holzberger D, Seidel T. All better than being disengaged: student engagement patterns and their relations to academic self-concept and achievement. Eur J Psychol Educ. 2021;36(3):627–52. https://doi.org/10.1007/s10212-020-00500-6 .

Roksa J, Kinsley P. The role of family support in facilitating academic success of low-income students. Res High Educ. 2019;60:415–36. https://doi.org/10.1007/s11162-018-9517-z .

Antoni J. Disengaged and nearing departure: Students at risk for dropping out in the age of COVID-19. TUScholarShare Faculty/Researcher Works; 2020. https://doi.org/10.34944/dspace/396 .

Cavinato AG, Hunter RA, Ott LS, Robinson JK. Promoting student interaction, engagement, and success in an online environment. Anal Bioanal Chem. 2021;413:1513–20. https://doi.org/10.1007/s00216-021-03178-x .

Kumar S, Todd G. Effectiveness of online learning interventions on student engagement and academic performance amongst first-year students in allied health disciplines: a systematic review of the literature. Focus Health Prof Educ Multi-Prof J. 2022;23(3):36–55. https://doi.org/10.3316/informit.668657139008083 .


The authors did not receive funding from any agency/institution for this research.

Author information

Authors and Affiliations

Sydani Institute for Research and Innovation, Sydani Group, Abuja, Nigeria

Catherine Nabiem Akpen, Stephen Asaolu & Hilary Okagbue

Sydani Group, Abuja, Nigeria

Sunday Atobatele & Sidney Sampson


Contributions

CNA, SOA, SA, and SS initiated the topic, CNA, HO and SOA searched and screened the articles, CNA, SOA, and HO conducted the data synthesis for the manuscript, CNA wrote the initial draft of the manuscript, CNA and SOA wrote the second draft of the manuscript, and SOA, HO and SA provided supervision. All authors reviewed and approved the final manuscript.

Corresponding author

Correspondence to Stephen Asaolu .

Ethics declarations

Competing interests

The authors declare no competing interests regarding this research work.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Akpen, C.N., Asaolu, S., Atobatele, S. et al. Impact of online learning on student's performance and engagement: a systematic review. Discov Educ 3, 205 (2024). https://doi.org/10.1007/s44217-024-00253-0


Received : 18 July 2024

Accepted : 05 September 2024

Published : 01 November 2024

DOI : https://doi.org/10.1007/s44217-024-00253-0


  • Online learning
  • Student engagement
  • Student performance
  • Systematic review
  • Literature review


Evaluating blended learning effectiveness: an empirical study from undergraduates’ perspectives using structural equation modeling

Xiaotian Han


Edited by: Simone Belli, Complutense University of Madrid, Spain

Reviewed by: María-Camino Escolar-Llamazares, University of Burgos, Spain; Ramazan Yilmaz, Bartin University, Türkiye; Denok Sunarsi, Pamulang University, Indonesia

*Correspondence: Xiaotian Han, [email protected]

Received 2022 Oct 1; Accepted 2023 May 2; Collection date 2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

Following the global COVID-19 outbreak, blended learning (BL) has received increasing attention from educators. The purpose of this study was: (a) to develop a measurement to evaluate the effectiveness of blended learning for undergraduates; and (b) to explore the potential association between blended learning effectiveness and student learning outcomes. This research consisted of two stages. In Stage I, a measurement for evaluating undergraduates’ blended learning perceptions was developed. In Stage II, a non-experimental, correlational design was utilized to examine whether there is an association between blended learning effectiveness and student learning outcomes. SPSS 26.0 and AMOS 23.0 were utilized to implement factor analysis and structural equation modeling. The results of the study demonstrated that: (1) the hypothesized factors (course overview, course objectives, assessments, class activities, course resources, and technology support) were aligned as a unified system in blended learning; and (2) there was a positive relationship between the effectiveness of blended learning and student learning outcomes. Additional findings, explanations, and suggestions for future research are also discussed in the study.
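A minimal sketch of how such a measurement-plus-structural model could be specified is shown below. It assumes Python with the semopy package rather than the SPSS/AMOS workflow reported by the author, and the variable names (overview, objectives, ..., slo) are hypothetical subscale and outcome labels, not items from the published instrument.

```python
# Illustrative sketch only: a lavaan-style model description fitted with semopy.
# All observed-variable names are hypothetical placeholders.
import pandas as pd
import semopy

MODEL_DESC = """
# Measurement model: one latent blended-learning-effectiveness factor
# indicated by the six hypothesized component subscales
BLEffectiveness =~ overview + objectives + assessments + activities + resources + tech_support

# Structural model: effectiveness predicting student learning outcomes
slo ~ BLEffectiveness
"""

data = pd.read_csv("bl_survey.csv")      # hypothetical respondent-level data
model = semopy.Model(MODEL_DESC)
model.fit(data)

print(model.inspect())           # loadings, path coefficient, p-values
print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
```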

Keywords: blended learning effectiveness, measurement, undergraduates, student learning outcomes, structural equation modeling

1. Introduction

Following the global COVID-19 outbreak, blended learning (BL) has received increasing attention from educators. BL can be defined as an approach that combines face-to-face and online learning (Dos, 2014), which has become the default means of delivering educational content in the pandemic context worldwide due to its rich pedagogical practices, flexible approaches, and cost-effectiveness (Tamim, 2018; Lakhal et al., 2020). Moreover, empirical research has demonstrated that BL improves learners’ active learning strategies, multi-technology learning processes, and learner-centered learning experiences (Feng et al., 2018; Han and Ellis, 2021; Liu, 2021). Furthermore, students are increasingly requesting BL courses because they are unable to attend on campus (Brown et al., 2018). In addition, researchers have examined the positive effects of BL on engaging students, improving their academic performance, and raising student satisfaction (Alducin-Ochoa and Vázquez-Martínez, 2016; Manwaring et al., 2017).

In China, the Ministry of Education has strongly supported educational informatization since 2012 by issuing a number of policies (the Ministry of Education, 2012). In 2016, China issued the Guiding Opinions of the Ministry of Education on Deepening the Educational and Teaching Reform of Colleges and Universities, emphasizing the promotion of the BL model in higher education. In 2017, the Ministry of Education listed BL as one of the trends in driving education reform in the New Media Alliance Horizon Report: 2017 Higher Education Edition. In 2018, Minister Chen Baosheng of the Ministry of Education proposed at the National Conference on Undergraduate Education in Colleges and Universities in the New Era to focus on promoting classroom revolution and new teaching models such as flipped classroom and BL approach. In 2020, the first batch of national BL courses was identified, which pushed the development of BL to the forefront of teaching reform. During the pandemic era in China, BL was implemented in all universities and colleges.

However, a number of researchers have produced opposing results regarding the benefits of BL. Given the prerequisites, resources, and attitudes of students, the BL model may not be applicable to all courses, such as practicum courses (Boyle et al., 2003; Naffi et al., 2020). Moreover, students, teachers, and educational institutions may lack BL experience and may therefore not be sufficiently prepared (for example, in terms of technology access) to implement BL methods or to focus on the efficiency of BL initiatives (Xiao, 2016; Liliana, 2018; Adnan and Anwar, 2020). Another major concern is that BL practice is hard to evaluate because there are few standardized BL criteria (Yan and Chen, 2021; Zhang et al., 2022). In addition, a number of studies have concluded that BL made no significant contribution to student performance and test scores compared with traditional learning environments (U.S. Department of Education, 2009). Therefore, it is extremely necessary to explore the essential elements of BL in higher education and examine the effect of BL on student academic achievement. This paper offers important insights for those attempting to implement BL in classroom practice to effectively support student needs in higher education.

The purpose of this research was: (1) To develop a measurement with key components to evaluate BL in undergraduates; (2) To explore the associations between perceptions of BL effectiveness and student learning outcomes (SLOs) in a higher education course using the developed measurement.

The significance of the current study is as follows: (1) Only a few studies have focused on BL measurement in higher education and its effects on SLOs; therefore, the current study results will add to the literature regarding BL measurement and its validity. (2) The Ministry of Education in China has an explicit goal of updating university teaching methods and strategies in accordance with the demands of the twenty-first century; therefore, the current study will contribute to the national goals of the Ministry of Education in China, enhance understanding of BL, and provide a theoretical framework and its applicability. (3) Faculty members in higher education who attempt to apply the BL model in their instruction will become aware of the basic components of BL that contribute to SLOs.

2. Literature review

2.1. Definitions of BL

BL is also referred to as “hybrid,” “flexible,” “mixed,” “flipped” or “inverted” learning. The BL concept was first proposed in the late 20th century against the backdrop of growing technological innovation (Keogh et al., 2017). The general definition of BL is that it integrates traditional face-to-face teaching with a web-based approach.

However, this description has been hotly debated by researchers in recent years. Oliver and Trigwell (2005) posited that BL may have different attributions in relation to various theories, meaning that the concept should be revised. Others have attempted to clarify the significance of BL by classifying the proportion of online learning in BL and the different models that come under the BL umbrella. Allen and Seaman (2010) proposed that BL should include 30–70% online learning (otherwise, it would be considered online learning (more than 70% online) or traditional face-to-face learning (less than 30% online)). The Handbook of Blended Learning, edited by Bonk and Graham (2006), sets out three categories of BL: web-enhanced learning, reduced face-time learning, and transforming blends. Web-enhanced learning pertains to the addition of extra online materials and learning experiences to traditional face-to-face instruction. Reduced face-time learning shifts part of face-to-face lecture time to computer-mediated activities. Transforming blends mix traditional face-to-face instruction with web-based interactions, through which students are able to actively construct their knowledge.
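As a simple illustration of the Allen and Seaman (2010) proportion rule described above, the sketch below classifies a course by its share of online delivery; the function name and the treatment of the exact 30% and 70% boundaries are assumptions, since the source text does not specify them.

```python
# Illustrative classification of delivery mode by online share (0.0-1.0),
# following the 30-70% rule attributed to Allen and Seaman (2010).
def classify_delivery_mode(online_share: float) -> str:
    if not 0.0 <= online_share <= 1.0:
        raise ValueError("online_share must be between 0 and 1")
    if online_share < 0.30:
        return "traditional face-to-face learning"
    if online_share <= 0.70:
        return "blended learning"
    return "online learning"

print(classify_delivery_mode(0.5))   # blended learning
print(classify_delivery_mode(0.85))  # online learning
```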

This study views BL as an instructional approach that provides both synchronous and asynchronous modes of delivery through which students construct their own understandings and interact with others in these settings, which is widely accepted by numerous researchers ( Liliana, 2018 ; Bayyat et al., 2021 ). To phrase this in another way, this description emphasizes that learning has to be experienced by the learner.

2.2. Essential elements of BL

Previous studies, universities, and corporations have discussed the essential components of online learning courses. Blackboard assesses online learning environments on four scales (course design, cooperation, assessment, and learner support) with 63 items. Quality Matters evaluates online learning according to the following categories: course overview, objectives, assessment, teaching resources, activities and cooperation, course technology, learner support, and practicability. The California State Universities rate their criteria on ten scales with 58 items, including learning evaluation, cooperation and activities, technology support, mobile technology, accessibility, and course reflection. The New York State Universities evaluate BL under the following six subscales: course overview, course design, assignments, class activities, cooperation, and assessment. Given the lack of criteria specific to BL, these standards were considered in evaluating BL.

The present study utilized Biggs’ (1999) constructive alignment as the main theoretical framework to analyze BL courses. “Constructive means the idea that students construct meaning through relevant activities … and the alignment aspect refers to what the teachers do, which is to set up a learning environment” (Biggs, 1999, p. 13). Later, in the book Teaching for Quality Learning at University, Biggs and Tang (2011) elaborated on the two terms, ‘constructive’ and ‘alignment’, which originated from constructivist theory and curriculum theory, respectively. Constructivism was regarded as the idea that “learners use their own activity to construct their knowledge as interpreted through their own existing schemata.” The term “alignment” emphasized that the assessments set were relevant and conducive to the intended learning goals (Biggs and Tang, 2011, p. 97). According to Biggs, various critical components should be closely linked within the learning context, including learning objectives, teaching/learning activities, and assessment tasks. These main components have been defined in detail:

(1) Learning objectives indicate the expected level of student understanding and performance. They tell students what they have to do, how they should do it, and how they will be assessed. Both course overview and learning objectives involve intended learning outcomes.

(2) Teaching/learning activities are a set of learning processes that the students have to complete by themselves to achieve a given course’s intended learning outcomes. In BL, activities include both online and face-to-face activities where students are able to engage in collaborations and social interactions ( Hadwin and Oshige, 2011 ; Ellis et al., 2021 ). The interactive learning activities are chosen to best support course objectives and students’ learning outcomes ( Clark and Post, 2021 ). Examples of activities in BL include: group problem-solving, discussion with peers/teachers, peer instruction, answering clicker questions or in-class polls ( Matsushita, 2017 ).

(3) Assessment tasks are tools to determine students’ achievements based on evidence. In BL, assessments can be conducted either online or in-class. Examples of assessments in BL include: online quizzes, group projects, field-work notes, individual assignments.

(4) In addition, given the definition of BL and the advances in information technology in recent years, online resources and technological support have become essential components of BL courses ( Darling-Aduana and Heinrich, 2018 ; Turvey and Pachler, 2020 ). On a similar note, Ellis and Goodyear (2016) and Laurillard (2013) emphasized the role of technical devices in BL, whilst Zawacki-Richter (2009) regarded online resources and technological support as central to achieving BL course requirements. Liu (2021) further suggested that a BL model should include teaching objectives, operating procedures, teaching evaluation, and teaching resources before, during, and after class. With this in mind, the present study integrates both the essential curriculum components of the face-to-face course and information technology into the teaching and learning aspects of the BL course.

2.3. BL effectiveness and SLOs

Many researchers have demonstrated the benefits of the BL approach for SLOs, given its role in improving teaching methods and in developing learners’ skills, talents, and interest in learning. Garrison and Kanuka (2004) reported increased completion rates as a result of BL application, and other researchers reached similar conclusions. Kenney and Newcombe (2011) and Demirkol and Kazu (2014) conducted comparisons and found that students in BL environments had higher average scores than those in non-BL environments. Alsalhi et al. (2021) conducted a quasi-experimental study at Ajman University ( n  = 268) and indicated that the use of BL has a positive effect on students’ academic success in a statistics course. “BL helps to balance a classroom that contains students with different readiness, motivation, and skills to learn. Moreover, BL deviates from traditional teaching and memorizing of students” ( Alsalhi et al., 2021 , p. 253). No statistically significant difference was found among students based on the university they attended.

However, researchers have also shown that the BL approach may not suit all learners or improve their learning outcomes. The Oxford Group (2013) reported that about 16% of learners had negative attitudes toward BL, while 26% of learners chose not to complete BL. Kintu et al. (2017) examined the relationship between student characteristics, BL design, and learning outcomes ( n  = 238) and indicated that BL design helps raise student satisfaction. The study also found that BL predicted learning outcomes for learners with high self-regulation skills. Similarly, Siemens (2005) indicated that students with more learner interactions reported higher satisfaction and learning outcomes. Hara (2000) identified ambiguous course design and potential technical difficulties as major barriers in BL practice, leading to unsatisfactory learning outcomes. Clark and Post (2021) conducted a hybrid study in higher education to explore the effectiveness of different instructional approaches (face-to-face, eLearning, and BL) and found that individual students valued active learning in both face-to-face and eLearning classes. Moreover, completing the eLearning component prior to face-to-face classes helped students perform well on the assessment. However, the study also noted that attendance at face-to-face classes was positively associated with students’ final grades.

2.4. Research questions and hypotheses

To fill these gaps, the following research questions and hypotheses were posed in the present research:

RQ1 : What components (among course overview, course objectives, assessments, activities, course resources, and technology support) contribute to the measurement?
RQ2 : Is there an association between BL effectiveness and SLOs in higher education?
H1 : All components (among course overview, course objectives, assessments, class activities, course resources, and technology support) contribute to the BL course model.
H2 : There is an association between BL effectiveness and SLOs.

3. Methodology

The study employed a non-experimental, correlational design and used survey responses from undergraduates to address the research questions. Specifically, a higher education institution in Shanghai with a specialization in teacher education was studied. The present study was a part of an instructional initiative project at this institution designed to identify students’ perceptions of the effectiveness of BL and explore the possible relationships between BL effectiveness and SLOs.

The present research consisted of two stages: Stage I (from March 2021 to July 2021) aimed to develop a measurement for evaluating undergraduates’ BL perceptions through a survey of undergraduates who had experienced BL courses. Stage II (from September 2021 to January 2022) aimed to use the developed measurement to examine whether or not there is an association between BL effectiveness and SLOs.

3.1. Instruments

3.1.1. Effectiveness of BL scale (EBLS)

In Stage I, based on Biggs’ theoretical framework and the existing literature, the measurement used in this study was composed of six sub-scales: course overview, learning objectives, assessments, course resources, teaching/learning activities, and technology support. After comparing the criteria described above, an instrument titled “Blended Learning Evaluation” was derived from the Quality Matters Course Design Rubric Standards (QM Rubric) and revised. Following consultation with teaching experts experienced in BL design and application, the revised QM Rubric could be applied to both the online and face-to-face portions of the course. Table 1 details the modified measurement.

Items of measurement.

Then, a panel of two experts, two blended course design trainers, and two faculty members in the curriculum and instruction department was asked to evaluate the appropriateness and relevance of each item in the instrument. Subsequently, a group of 10 sophomore and senior students was asked to check how the questions read and were understood and to give feedback accordingly. Based on their comments and the panel’s suggestions, a few minor changes were made, and content validity was evaluated again by the panel of experts prior to the administration of the instrument.

Finally, the EBLS, modified from the Quality Matters Course Design Rubric Standards, was adopted as the initial scale to be submitted to construct validity and reliability checks. The EBLS comprised six sub-scales (25 items in total): course overview (4 items), course objectives (4 items), assessments (4 items), course resources (5 items), in-class and online activities (4 items), and technology support (4 items). The measurement applied a 5-point Likert scale (1-Strongly Disagree, 2-Disagree, 3-So-so, 4-Agree, and 5-Strongly Agree).
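For illustration only, the sub-scale structure described above can be represented as a simple data structure for later scoring. The sketch below is an assumption for readability; the item labels are invented placeholders, not the actual EBLS items.

```python
# Hypothetical representation of the EBLS structure; item labels are placeholders.
EBLS_SUBSCALES = {
    "course_overview":    ["ov1", "ov2", "ov3", "ov4"],
    "course_objectives":  ["obj1", "obj2", "obj3", "obj4"],
    "assessments":        ["as1", "as2", "as3", "as4"],
    "course_resources":   ["res1", "res2", "res3", "res4", "res5"],
    "class_activities":   ["act1", "act2", "act3", "act4"],
    "technology_support": ["tech1", "tech2", "tech3", "tech4"],
}  # 25 items in total

LIKERT = {1: "Strongly Disagree", 2: "Disagree", 3: "So-so", 4: "Agree", 5: "Strongly Agree"}

def subscale_means(responses: dict[str, int]) -> dict[str, float]:
    """Average one respondent's 5-point ratings within each sub-scale."""
    return {
        name: sum(responses[item] for item in items) / len(items)
        for name, items in EBLS_SUBSCALES.items()
    }
```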

3.1.2. Student learning outcomes

In Stage II, the course marks from the Curriculum and Instruction Theorem module were used as an indicator of students’ learning outcomes. Multiple regression analysis was performed in SPSS 25.0 to analyze the data.

This module was a semester-long mandatory course for 91 sophomores which ran for 16 weeks from September 2021 to January 2022. It aimed to develop not only students’ in-depth disciplinary and academic knowledge but also skills pertaining to cooperation, technology, inquiry, discussion, presentation, and reflection. The course was designed as a synchronous BL curriculum in which all students had both face-to-face and technologically-mediated interactions (see Table 2 ). Each week, there were 1.5 h of face-to-face learning that combined lectures, tutorials, and fieldwork. The lectures taught key concepts with examples and non-examples and connected teaching theories to practical issues. The tutorials provided opportunities for students to collaborate with peers in groups. The fieldwork offered opportunities for students to observe real classes and interview cooperating teachers or students in local elementary schools. Technologically-mediated interactions, supported by the Learning Management System (LMS), provided supplementary learning resources, reading materials, relevant videos, cases, assessments, and other resources from the Internet. Students were also required to complete online quizzes, assignments, projects, and discussions on the LMS.

Weekly blended learning design mode.

The final marks of the course were derived from both formative and summative assessments. The formative assessments covered attendance and participation, individual assignments (quizzes, reflections, discussions, case studies, and class observation reports) and group projects (lesson plan analysis, mini-instruction, reports). The summative assessment was a paper-based final examination, as required by the college administrators.

3.2. Participants

In Stage I of the study, the target population was sophomore and junior undergraduates from different majors at a higher education institution in Shanghai. Detailed demographic information is reported in the results section of this study. Notably, due to practical constraints, a convenience sample was employed. As explained by McMillan and Schumacher (2010) , although this limits the generalizability of the results, the findings are nevertheless useful when considering BL effectiveness. Thus, care was taken to gather the respondents’ demographic background information so that an accurate description of the participants could be achieved.

In Stage II of the study, the participants were 91 sophomores who took the synchronous BL course, Curriculum and Instruction Theorem, in School of Primary Education in the fall of 2021 (September 2021–January 2022).

3.3. Data collection procedures, analysis and presentation

Institutional Review Board (IRB) approval was obtained prior to data collection in Stage I and Stage II. In Stage I, the IRB-approved Informed Consent Form included a brief introduction to the study purpose, the length of time required to complete the survey, possible risks and benefits, the researcher’s contact information, and so on. It also clarified to potential respondents that the survey was voluntary and anonymous. SurveyMonkey 1 was used to administer the survey. In Stage II, the IRB-approved Informed Consent Form was also provided to participants, and the LMS was used for data collection.

To address RQ1, the study followed four steps:

First, an initial measurement was modified and translated from the QM Rubric, and its content validity was checked by the expert panel.

Second, the reliability of the measurement was examined.

Third, exploratory factor analysis (EFA) was conducted to test the construct validity.

Fourth, confirmatory factor analysis (CFA) was conducted to examine how well the hypothesized model fit the data. Ultimately, a revised BL measurement was developed with factor loadings and weights. In Stage I, SPSS 26.0 and AMOS 23.0 were utilized to implement factor analysis and structural equation modeling.

To address RQ2, the study followed two steps:

First, descriptive statistics (mean, standard deviation, minimum rating, and maximum rating) were calculated for the undergraduates’ perspectives on BL effectiveness.

Second, SLOs were regressed on perceived BL effectiveness to examine whether overall BL effectiveness was associated with student achievement. In Stage II, SPSS 26.0 was utilized to compute correlations and multiple regressions.

3.4. Limitations

Based on the threats to internal, external, construct, and statistical conclusion validity summarized by McMillan and Schumacher (2010) , the following limitations of this study are acknowledged. First, since the data were self-reported by participants, the answers they provided may not reflect their true feelings or behaviors. Second, the study used a convenience sample rather than a database of all undergraduates in higher education in Shanghai; therefore, population external validity is limited to students with characteristics similar to those of the respondents. Last, although care was taken to phrase the research questions in terms of association rather than effects, the correlational design limits the ability to draw causal inferences. The results are suggestive, but further research is needed before conclusions about BL impacts can be drawn.

4. Results

4.1. What factors (among course overview, course objectives, assessments, class activities, course resources, and technology support) contribute to the measurement?

4.1.1. Demographic information in Stage I

In Stage I, a survey with 25 items across 6 sub-scales was delivered to undergraduates who had experienced BL in higher education. In total, 295 valid questionnaires were collected (from March 2021 to July 2021). The demographic information of the participants was as follows: 27% of respondents were male and 73% were female. The respondents’ majors included education (51%), literature (22%), computer science (11%), business (10%), arts (5%), and others (1%). All respondents were single and aged 19–20 years old.

4.1.2. Reliability analysis

To address RQ1, reliability analysis and EFA were conducted on the questionnaire results. Test reliability refers to “the consistency of measurement – the extent to which the results are similar over different forms of the same instrument or occasions of data collection” ( McMillan and Schumacher, 2010 , p. 179). Specifically, the study examined internal consistency (Cronbach’s Alpha), composite reliability (CR), and average variance extracted (AVE) as evidence of reliability. According to Table 3 , the internal consistency for the full 25-item scale was 0.949 ( N  = 295). The alpha reliability values for the sub-scales were 0.859, 0.873, 0.877, 0.910, 0.902, and 0.881, respectively. Since the total scale’s alpha value and the sub-scales’ alpha values were all greater than 0.70, the reliability of the survey was relatively high and therefore acceptable. Moreover, the AVE of each sub-scale was greater than 0.50, indicating good convergence of the measurement. In addition, the CR values were all greater than 0.80, indicating high composite reliability. Therefore, this blended course evaluation measurement was deemed reliable.

Reliability results for the measurement ( N  = 295).
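As a rough illustration of the reliability indices reported above (a sketch, not the authors’ actual SPSS workflow), the following Python snippet computes Cronbach’s alpha from an item-response matrix and CR/AVE from standardized factor loadings; the input data below are simulated and will not reproduce the published values.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of Likert responses for one scale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def composite_reliability(loadings) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    lam = np.asarray(loadings, dtype=float)
    return lam.sum() ** 2 / (lam.sum() ** 2 + (1 - lam ** 2).sum())

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized loadings."""
    lam = np.asarray(loadings, dtype=float)
    return float((lam ** 2).mean())

# Simulated 5-point responses for a 4-item sub-scale (295 respondents), for demonstration only.
rng = np.random.default_rng(42)
fake_items = rng.integers(1, 6, size=(295, 4))
print(round(cronbach_alpha(fake_items), 3))
print(round(composite_reliability([0.84, 0.79, 0.81, 0.77]), 3),
      round(average_variance_extracted([0.84, 0.79, 0.81, 0.77]), 3))
```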

4.1.3. Exploratory factor analysis

According to the research design, EFA was then carried out in SPSS 26.0 to determine construct validity, that is, to identify whether some or all of the factors (course overview, course objectives, assessments, class activities, course resources, and technology support) perform well in the context of a blended course design. According to Bryant and Arnold (1995) , to run EFA the sample should be at least five times the number of variables (a subjects-to-variables ratio of 5 or greater), and every analysis should be based on “a minimum of 100 observations regardless of the subjects-to-variables ratio” (p. 100). This study included 25 variables and 295 valid responses, a subjects-to-variables ratio of nearly 12, well above the recommended minimum. Against the criteria proposed by Kaiser and Rice (1974) , the KMO of the measurement in this study was 0.932, well above 0.70, indicating that the sampling was more than adequate. According to Table 4 (Bartlett’s Test of Sphericity), the approximate Chi-square of Bartlett’s test of sphericity was 4124.801 ( p  < 0.001), indicating that the correlation matrix was suitable for factor analysis. Therefore, EFA could be used in this study.

Bartlett’s test of sphericity.
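The sampling-adequacy checks just described can also be reproduced with the open-source factor_analyzer package. This is a sketch under the assumption that the 295 × 25 item responses are available as a pandas DataFrame (the file name is hypothetical); it is not the authors’ SPSS output.

```python
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# responses: hypothetical DataFrame of shape (295, 25) holding the Likert item scores.
responses = pd.read_csv("ebls_responses.csv")  # assumed file name

chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)

# A KMO above 0.70 and a significant Bartlett test suggest the data are factorable.
print(f"Bartlett chi-square = {chi_square:.3f}, p = {p_value:.4f}")
print(f"Overall KMO = {kmo_overall:.3f}")
```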

EFA examines “how items are related to each other and how different parts of an instrument are related” ( McMillan and Schumacher, 2010 , p. 176). Factor analysis (principal components with varimax rotation) was deployed to assess the degree to which the 25 blended course design items in the “Blended Course Evaluation Survey” grouped together. According to the EFA results detailed in Table 5 (Rotated Factor Matrix), the 25 items loaded on six factors with eigenvalues greater than 1. The rotated factor matrix showed that the loadings were all close to or higher than 0.70 ( Comrey and Lee, 1992 ). Therefore, these six factors mapped well onto the intended dimensions, and the measurement can be considered to have relatively good construct validity. Hence, in answer to RQ1, all of the factors (course overview, course objectives, assessments, class activities, course resources, and technology support) performed well in the measurement.

Rotated factor matrix * .

Extraction Method: Principal Axis Factoring. Rotation Method: Varimax with Kaiser Normalization.

Rotation converged in 16 iterations. Bold values indicate factor loadings greater than 0.5.
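A six-factor varimax-rotated extraction along these lines could be obtained with the same package; this is again a sketch on the assumed responses DataFrame, not a reproduction of the published Table 5.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Extract six factors with varimax rotation (principal-factor extraction assumed here).
fa = FactorAnalyzer(n_factors=6, rotation="varimax", method="principal")
fa.fit(responses)  # responses: the hypothetical (295, 25) DataFrame from above

loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(2))            # rotated loading matrix
print(fa.get_factor_variance())     # variance explained per factor
```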

To further address the extent to which each factor contributes to the measurement, the hypothesized model was examined and the weight of each factor was then calculated for educators based on structural equation modeling. To discern whether the hypothesized model reflected the collected data, AMOS 23.0 was used to carry out confirmatory factor analysis. Comparing the fit indexes to the criteria in Table 6 (Comparison of Fit Indexes for Alternative Models of the Structure of the Blended Course Design Measurement), the Root Mean Square Error of Approximation (RMSEA) was 0.034, below the 0.05 rule of thumb that indicates a good model. The TLI (0.979) and CFI (0.981) were likewise above the targets for a good model. Moreover, CMIN/DF was 1.284 (below 3), GFI was 0.909 (above 0.8), AGFI was 0.886 (above 0.8), NFI was 0.922 (above 0.9), IFI was 0.982 (above 0.9), and RMR was 0.013 (below 0.08). Based on these criteria, the initial model fits the data well; in other words, the initial model can effectively explain and evaluate a blended course design.

Comparison of fit indexes for alternative models of the structure of the blended learning measurement.
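The fit indices reported in Table 6 were produced in AMOS. As a non-authoritative illustration of the same kind of check in Python, the semopy package can fit a comparable six-factor CFA; the model syntax, item names, and file name below are assumptions, not the study’s actual specification.

```python
import pandas as pd
import semopy

# Hypothetical measurement model: each latent factor is indicated by its (placeholder) items.
MODEL_DESC = """
CourseOverview   =~ ov1 + ov2 + ov3 + ov4
CourseObjectives =~ obj1 + obj2 + obj3 + obj4
Assessments      =~ as1 + as2 + as3 + as4
CourseResources  =~ res1 + res2 + res3 + res4 + res5
ClassActivities  =~ act1 + act2 + act3 + act4
TechSupport      =~ tech1 + tech2 + tech3 + tech4
"""

data = pd.read_csv("ebls_responses.csv")  # assumed file with the 25 item columns
model = semopy.Model(MODEL_DESC)
model.fit(data)

stats = semopy.calc_stats(model)          # chi2, CFI, TLI, RMSEA, GFI, AGFI, NFI, ...
print(stats.T)
print(model.inspect())                    # parameter estimates for loadings and factor covariances
```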

4.1.4. Confirmatory factor analysis

Focusing on the model itself, CFA was conducted to examine the relationships between the model and the data. Figure 1 shows that most subtests provided relatively strong measures of the intended construct. Specifically, each factor was positively correlated with the others. For instance, course overview was positively correlated with course objectives, assessments, course resources, class activities, and technology support, with correlation coefficients of 0.72, 0.52, 0.61, 0.63, and 0.55, respectively. In other words, courses rated higher on the course overview also tended to be rated higher on the other components. The results match the statement that “cognitive tests and cognitive factors are positively correlated” ( Keith, 2015 , p. 335). Additionally, this study tested the discriminant validity of the measurement to ensure that each factor performed distinctly within the model. According to Fornell and Larcker’s (1981) criteria, the square root of the AVE must be greater than the correlations between the constructs. The results in Table 7 show that the lowest square root of AVE (0.777) exceeded the greatest inter-factor correlation (0.72). From this, it can be confirmed that the hypothesized model had sufficient discriminant validity and therefore reflected the data well.

Figure 1

Standardized estimates for the initial blended course design six-factor model.

Discriminant validity.

The weight of each factor in the model was further calculated for educators based on the structural equation modeling (see Figure 2 ). For example, the weight of course overview = 0.84/(0.84 + 0.79 + 0.70 + 0.73 + 0.74 + 0.70) = 0.187. The other weights were calculated in the same way; the relevant calculations are shown below and the results are shown in Table 8 . The total score of a blended course design is calculated as follows: the score of course overview * 0.187 + the score of course objectives * 0.176 + the score of assessment * 0.155 + the score of course resources * 0.162 + the score of class activities * 0.164 + the score of technology support * 0.156. The full score for this measurement is 100.

Figure 2

Weight of factors in the present model.

Confirmatory factor loading and weightings.
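The weighting scheme above is simple arithmetic: each factor’s loading is divided by the sum of all six loadings, and the weighted sub-scale scores are then summed. A minimal sketch, using the loadings as reported in Figure 2 (small differences from the published weights come from rounding):

```python
# Loadings reported for the six factors (Figure 2).
loadings = {
    "course_overview": 0.84,
    "course_objectives": 0.79,
    "assessment": 0.70,
    "course_resources": 0.73,
    "class_activities": 0.74,
    "technology_support": 0.70,
}

total = sum(loadings.values())                       # 4.50
weights = {name: value / total for name, value in loadings.items()}
print({name: round(w, 3) for name, w in weights.items()})
# e.g. course_overview -> 0.84 / 4.50 = 0.187

def total_score(subscale_scores: dict[str, float]) -> float:
    """Weighted total of the six sub-scale scores (hypothetical scores as input)."""
    return sum(weights[name] * subscale_scores[name] for name in weights)
```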

4.2. Is there an association between the effectiveness of blended learning and student learning outcomes?

4.2.1. Demographic information in Stage II

In Stage II, responses from 91 students were collected through the LMS. The percentage of male respondents was 16% and the percentage of female respondents was 84%. All respondents in Stage II took the synchronous BL course, Curriculum and Instruction Theorem, in the School of Primary Education in Fall 2021 (September 2021–January 2022).

4.2.2. Descriptive statistics

In answering RQ2, descriptive statistics for the undergraduates’ perspectives on BL effectiveness and their SLOs are reported in Table 9 . Higher scores on this measure of BL effectiveness indicate that undergraduates perceive BL as more effective, with responses ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The results revealed that the six elements of BL effectiveness had an overall mean of 93.65 (corresponding to an item average of 3.74, i.e., “agree”). The scores of the sub-scales were very similar and, again, correspond to undergraduates agreeing with the efficacy of BL with respect to course overview, course objectives, assessment, course resources, class activities, and technology support. Table 9 also provides information about overall SLOs. Specifically, students’ learning outcomes (final marks composed of formative and summative assessments) in BL had an overall mean of 80.65, with maximum and minimum scores of 93.00 and 60.00, respectively.

Descriptive statistics for the overall scores and sub scales of the measures of blended learning effectiveness and student achievement ( N  = 91).

4.2.3. Regressions between BL effectiveness and SLOs

To further address the relationship between BL effectiveness and SLOs, SLOs were regressed on perceived BL effectiveness; this research question examined whether overall BL effectiveness was associated with student achievement. Additionally, Pearson correlations between the key variables were calculated (shown in Table 10 ). The results showed that the overall BL effectiveness score was significantly correlated with student achievement ( r  = 0.716, p  < 0.01).

Descriptive statistics and Pearson correlations between key variables in the regression models.

* p  < 0.05; ** p  < 0.01.

Table 11 shows the results for the regression of total student academic performance on the overall BL effectiveness scores across the six components (course overview, course objectives, assessment, course resources, class activities, and technology support). The full model was statistically significant: undergraduates’ perceived BL effectiveness explained 51.3% of the variance in student achievement, F (1, 89) = 93.843, p  < 0.001, ΔR 2  = 0.508, which is considered a large effect. Accordingly, when the perception of BL effectiveness increased by one point, predicted academic performance increased by 0.563 points ( b  = 0.563, p  < 0.001). Thus, in answer to the final research question, there is a positive association between the perceived effectiveness of BL and student learning achievement.

Summary of simultaneous multiple linear regression results predicting student achievement from perceptions of the blended learning effectiveness.

* p  < 0.05; ** p  < 0.01. R  = 0.716, R 2  = 0.513, F (1, 89) = 93.843, p  < 0.001.
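For readers who work outside SPSS, the correlation and simple regression reported here can be illustrated with scipy and statsmodels. The data below are simulated stand-ins for the 91 students, so the coefficients will not match the published values.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(7)

# Simulated stand-ins: overall BL-effectiveness scores and final course marks for 91 students.
bl_effectiveness = rng.normal(93.65, 8.0, size=91)
final_marks = 28.0 + 0.56 * bl_effectiveness + rng.normal(0.0, 4.0, size=91)

r, p = stats.pearsonr(bl_effectiveness, final_marks)
print(f"Pearson r = {r:.3f}, p = {p:.4f}")

X = sm.add_constant(bl_effectiveness)          # intercept + predictor
ols = sm.OLS(final_marks, X).fit()
print(ols.summary())                           # reports b, R-squared, F, and p-values
```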

5. Conclusion and discussion

BL is a combination of face-to-face interactions and online learning in which the instructor manages students in a technological learning environment. In the post-pandemic era, BL courses are widely used and accepted by educators, students, and universities. However, the effectiveness of BL remains controversial, and the lack of an accurate BL scale has been one of the main concerns. This study developed a measurement to evaluate BL for undergraduates and investigated the relationship between the effectiveness of BL and SLOs. Biggs’ (1999) constructive alignment, including factors such as course overview, learning objectives, teaching/learning activities, and assessment, was utilized as the primary theoretical framework for conceptualizing the scale. The related literature further indicated the importance of adding technology and resources as essential components. Therefore, a scale with six sub-scales was developed.

RQ1 explored the essential components of BL. Stage I recruited 295 undergraduates from different majors at a university in Shanghai. The hypothesized measurement comprising 6 sub-scales (25 items in total) was examined, and construct validity was tested with EFA and CFA. As a result, a 6-factor, 5-point Likert-type scale of BL effectiveness made up of 25 items was developed. The six factors of this scale explained 68.4% of the total variance. The internal consistency reliability coefficient (Cronbach’s alpha) for the total scale was 0.949, and the alpha reliability values for the sub-scales were 0.859, 0.873, 0.877, 0.910, 0.902, and 0.881, respectively. The results demonstrate that the hypothesized factors (course overview, course objectives, assessments, class activities, course resources, and technology support), mainly proposed by Biggs (1999) , are aligned as a unified system in BL. Furthermore, the results reflect the real concerns of students as they experience BL in higher education. However, the participants in the present study were selected from among students enrolled in BL at one university, so the sample characteristics are limited to those of the respondents. In future research, a larger sample including undergraduates at other universities could be recruited to test the validity.

RQ2 examined the association between BL effectiveness and SLOs. In Stage II, the study recruited 91 students who participated in a synchronous BL course at the College of Education. The results demonstrated a positive relationship between the effectiveness of BL and SLOs: the more effective undergraduates perceived BL to be, the better their SLOs. This supports the results of the previous literature ( Demirkol and Kazu, 2014 ; Alsalhi et al., 2021 ). Moreover, the descriptive analysis provided additional guidance for educators when designing and implementing BL for undergraduates. First, undergraduates expect a clear course overview of how to start the course, how to learn through the course, and how their learning outcomes will be evaluated; a clear syllabus with detailed explanations should therefore be prepared and distributed at the outset of BL. Second, undergraduates pay attention to curriculum objectives and continuously compare their work against them as they progress through the course; on this basis, outlining the objectives at the beginning of each chapter and showing the expected learning outcomes (such as rubrics) are recommended. Finally, undergraduates enjoy rich social interactions in both face-to-face activities and online interactions; therefore, a variety of classroom activities for different levels of students is recommended. In future studies, more detailed analyses could be considered. For example, it would be valuable to explore indirect effects of the effectiveness of BL on SLOs. In addition, qualitative research could be conducted to identify the underlying reasons why BL affects SLOs.

Data availability statement

The datasets presented in this article are not readily available because of the permissions obtained from the target group. Requests to access the datasets should be directed to XH, [email protected] .

Ethics statement

The studies involving human participants were reviewed and approved by the Shanghai Normal University Tianhua College. The patients/participants provided their written informed consent to participate in this study.

Author contributions

XH: drafting the manuscript, data analysis, and funding acquisition. JF: theoretical framework and methodology. BW: supervision. YC: data collection and curation. YW: reviewing and editing. All authors contributed to the work and approved it for publication.

Funding

This research was supported by the Chinese Association for Non-Government Education (grant number: CANFZG22268) and the Shanghai High Education Novice Teacher Training Funding Plan (grant number: ZZ202231022).

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Acknowledgments

I would like to express my appreciation to my colleagues: Prof. Jie Feng and Dr. Yinghui Chen. They both provided invaluable feedback on this manuscript.

1 www.surveymonkey.com

References

  • Adnan M., Anwar K. (2020). Online learning amid the COVID-19 pandemic: students’ perspectives. J. Pedagog. Sociol. Psychol. 2, 45–51. doi: 10.33902/JPSP.2020261309
  • Alducin-Ochoa J. M., Vázquez-Martínez A. I. (2016). Academic performance in blended-learning and face-to-face university teaching. Asian Soc. Sci. 12:207. doi: 10.5539/ass.v12n3p207
  • Allen I. E., Seaman J. (2010). Class differences: online education in the United States, 2010. The Sloan Consortium/Babson Survey Research Group. Available at: http://sloanconsortium.org/publications/survey/class_differences (Accessed October 7, 2014).
  • Alsalhi N., Eltahir M., Al-Qatawneh S., Quakli N., Antoun H., Abdelkader A., et al. (2021). Blended learning in higher education: a study of its impact on students’ performance. Int. J. Emerg. Technol. Learn. 16, 249–268. doi: 10.3991/ijet.v16i14.23775
  • Bayyat M., Muaili Z., Aldabbas L. (2021). Online component challenges of a blended learning experience: a comprehensive approach. Turk. Online J. Dist. Educ. 22, 277–294. doi: 10.17718/tojde.1002881
  • Biggs J. (1999). Teaching for quality learning at university. Buckingham: Open University Press/Society for Research into Higher Education.
  • Biggs J., Tang C. (2011). Teaching for quality learning at university: what the student does (4th Edn.). London: McGraw-Hill.
  • Bonk C., Graham C. (2006). The handbook of blended learning: global perspectives, local designs. San Francisco, CA: Pfeiffer Publishing.
  • Boyle T., Bradley C., Chalk P., Jones R., Pickard P. (2003). Using blended learning to improve student success rates in learning to program. J. Educ. Media 28, 165–178. doi: 10.1080/1358165032000153160
  • Brown C., Davis N., Sotardi V., Vidal W. (2018). Towards understanding of student engagement in blended learning: a conceptualization of learning without borders. ASCILITE 2018 Proceedings, 318–323. Available at: http://2018conference.ascilite.org/wp-content/uploads/2018/12/ASCILITE-2018-Proceedings-Final.pdf
  • Bryant F. B., Arnold P. R. (1995). “Principal-components analysis and exploratory and confirmatory factor analysis” in Reading and understanding multivariate statistics. eds. Grimm L. G., Arnold P. R. (Washington, DC: American Psychological Association), 99–136.
  • Clark C., Post G. (2021). Preparation and synchronous participation improve student performance in a blended learning experience. Australas. J. Educ. Technol. 37, 187–199. doi: 10.14742/ajet.6811
  • Comrey A. L., Lee H. B. (1992). A first course in factor analysis (2nd Edn.). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Darling-Aduana J., Heinrich C. J. (2018). The role of teacher capacity and instructional practice in the integration of educational technology for emergent bilingual students. Comput. Educ. 126, 417–432. doi: 10.1016/j.compedu.2018.08.002
  • Demirkol M., Kazu I. Y. (2014). Effect of blended environment model on high school students’ academic achievement. Turk. Online J. Educ. Technol. 13, 78–87.
  • Dos B. (2014). Developing and evaluating a blended learning course. Anthropologist 17, 121–128. doi: 10.1080/09720073.2014.11891421
  • Ellis R., Bliuc A. M., Han F. (2021). Challenges in assessing the nature of effective collaboration in blended university courses. Australas. J. Educ. Technol. 37, 1–14. doi: 10.14742/ajet.5576
  • Ellis R. A., Goodyear P. (2016). Models of learning space: integrating research on space, place and learning in higher education. Rev. Educ. 4, 149–191. doi: 10.1002/rev3.3056
  • Feng X., Wang R., Wu Y. (2018). A literature review on blended learning: based on analytical framework of blended learning. J. Dist. Educ. 12, 13–24. doi: 10.15881/j.cnki.cn33-1304/g4.2018.03.002
  • Fornell C., Larcker D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 18, 39–50. doi: 10.1177/002224378101800104
  • Garrison D. R., Kanuka H. (2004). Blended learning: uncovering its transformative potential in higher education. Internet High. Educ. 7, 95–105. doi: 10.1016/j.iheduc.2004.02.001
  • Hadwin A., Oshige M. (2011). Self-regulation, coregulation, and socially shared regulation: exploring perspectives of social in self-regulated learning theory. Teach. Coll. Rec. 113, 240–264. doi: 10.1007/s11191-010-9259-6
  • Han F., Ellis R. (2021). Patterns of student collaborative learning in blended course designs based on their learning orientations: a student approaches to learning perspective. Int. J. Educ. Technol. High. Educ. 18:66. doi: 10.1186/s41239-021-00303-9
  • Hara N. (2000). Students’ distress with a web-based distance education course. Inf. Commun. Soc. 3, 557–579. doi: 10.1080/13691180010002297
  • Kaiser H. F., Rice J. (1974). Little jiffy, mark IV. Educ. Psychol. Meas. 34, 111–117. doi: 10.1177/001316447403400115
  • Keith T. (2015). Multiple regression and beyond: an introduction to multiple regression and structural equation modeling (2nd Edn.). New York: Routledge.
  • Kenney J., Newcombe E. (2011). Adopting a blended learning approach: challenges encountered and lessons learned in an action research study. J. Asynchronous Learn. Netw. 15, 45–57. doi: 10.24059/olj.v15i1.182
  • Keogh J. W., Gowthorp L., McLean M. (2017). Perceptions of sport science students on the potential applications and limitations of blended learning in their education: a qualitative study. Sports Biomech. 16, 297–312. doi: 10.1080/14763141.2017.1305439
  • Kintu M., Zhu C., Kagambe E. (2017). Blended learning effectiveness: the relationship between student characteristics, design features and outcomes. Int. J. Educ. Technol. High. Educ. 14, 1–20. doi: 10.1186/s41239-017-0043-4
  • Lakhal S., Mukamurera J., Bedard M. E., Heilporn G., Chauret M. (2020). Features fostering academic and social integration in blended synchronous courses in graduate programs. Int. J. Educ. Technol. High. Educ. 17:5. doi: 10.1186/s41239-020-0180-z
  • Laurillard D. (2013). Rethinking university teaching: a conversational framework for the effective use of learning technologies. London: Routledge.
  • Liliana C. M. (2018). Blended learning: deficits and prospects in higher education. Australas. J. Educ. Technol. 34, 42–56. doi: 10.14742/ajet.3100
  • Liu Y. (2021). Blended learning of management courses based on learning behavior analysis. Int. J. Emerg. Technol. Learn. 16, 150–165. doi: 10.3991/IJET.V16I09.22741
  • Manwaring K. C., Larsen R., Graham C. R., Henrie C. R., Halverson L. R. (2017). Investigating student engagement in blended learning settings using experience sampling and structural equation modeling. Internet High. Educ. 35, 21–33. doi: 10.1016/j.iheduc.2017.06.002
  • Matsushita K. (2017). Deep active learning: toward greater depth in university education. Singapore: Springer.
  • McMillan J. H., Schumacher S. (2010). Research in education: evidence-based inquiry (7th Edn.). Upper Saddle River, NJ: Pearson Education, Inc.
  • Naffi N., Davidson A.-L., Patino A., Beatty B., Gbetoglo E., Duponsel N. (2020). Online learning during COVID-19: 8 ways universities can improve equity and access. The Conversation. Available at: https://theconversation.com/online-learning-during-covid-19-8-ways-universities-can-improve-equity-and-access-145286
  • Oliver M., Trigwell K. (2005). Can “blended learning” be redeemed? E-Learn. Digit. Media 2, 17–26. doi: 10.2304/elea.2005.2.1.17
  • Oxford Group (2013). Blended learning: current use, challenges and best practices. Available at: http://www.kineo.com/m/0/blended-learning-report-202013.pdf
  • Siemens G. (2005). Connectivism: a learning theory for the digital age. Int. J. Instr. Technol. Dist. Learn. 2, 3–10. Available at: http://itdl.org/Journal/Jan_05/Jan_05.pdf
  • Tamim R. M. (2018). Blended learning for learner empowerment: voices from the Middle East. J. Res. Technol. Educ. 50, 70–83. doi: 10.1080/15391523.2017.1405757
  • Turvey K., Pachler N. (2020). Design principles for fostering pedagogical provenance through research in technology supported learning. Comput. Educ. 146:103736. doi: 10.1016/j.compedu.2019.103736
  • U.S. Department of Education, Office of Planning, Evaluation, and Policy Development (2009). Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education.
  • Xiao J. (2016). Who am I as a distance tutor? An investigation of distance tutors’ professional identity in China. Distance Educ. 37, 4–21. doi: 10.1080/01587919.2016.1158772
  • Yan Y., Chen H. (2021). Developments and emerging trends of blended learning: a document co-citation analysis (2003–2020). Int. J. Emerg. Technol. Learn. 16, 149–164. doi: 10.3991/ijet.v16i24.25971
  • Zawacki-Richter O. (2009). Research areas in distance education: a Delphi study. International Review of Research in Open and Distributed Learning 10, 1–17.
  • Zhang Q., Zhang M., Yang C. (2022). Situation, challenges and suggestions of college teachers’ blended teaching readiness. E-Educ. Res. 12, 46–53. doi: 10.13811/j.cnki.eer.2022.01.006
