
Evidence-based practice for effective decision-making

Effective HR decision-making is based on considering the best available evidence combined with critical thinking.

People professionals are faced with complex workplace decisions and need to understand ‘what works’ in order to influence organisational outcomes for the better. 

Evidence-based practice helps them make better, more effective decisions by choosing reliable, trustworthy solutions and being less reliant on outdated received wisdom, fads or superficial quick fixes. 

At the CIPD, we believe this is an important step for the people profession to take: our Profession Map describes a vision of a profession that is principles-led, evidence-based and outcomes-driven. Taking an evidence-based approach to decision-making can have a huge impact on the working lives of people in all sorts of organisations worldwide.

This factsheet outlines what evidence-based practice is and why it is so important, highlighting the four sources of evidence to draw on and combine to ensure the greatest chance of making effective decisions. It then looks at the steps we can take to move towards an evidence-based people profession.


At the heart of evidence-based practice is the idea that good decision-making is achieved through critical appraisal of the best available evidence from multiple sources. When we say ‘evidence’, we mean information, facts or data supporting (or contradicting) a claim, assumption or hypothesis. This evidence may come from scientific research, the local organisation, experienced professionals or relevant stakeholders. We use the following definition from CEBMa:

“Evidence-based practice is about making decisions through the conscientious, explicit and judicious use of the best available evidence from multiple sources… to increase the likelihood of a favourable outcome.”


Information overload

In their report Evidence-based management: the basic principles, Eric Barends, Denise Rousseau and Rob Briner of CEBMa outline the challenge of biased and unreliable management decisions.

People professionals face all sorts of contradictory insights and claims about what works and what doesn’t in the workplace. As Daniel Levitin puts it:

"We're assaulted with facts, pseudo facts, jibber-jabber, and rumor, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting."

Assessing the reliability of evidence becomes more important as the mass of opinion grows. Yet faced with such a barrage of information, we inevitably use mental shortcuts to make decisions easier and to avoid overloading our brains.

Unfortunately, this means we are prone to biases. Our reports A head for hiring and Our minds at work outline the most common of these:

  • authority bias: the tendency to overvalue the opinion of a person or organisation that is seen as an authority
  • conformity bias: the tendency to conform to others in a group, also referred to as 'group think' or 'herd behaviour'
  • confirmation bias: looking to confirm existing beliefs when assessing new information
  • patternicity or the illusion of causality: the tendency to see patterns and assume causal relations by connecting the dots even when there is just random 'noise'.

So-called ‘best practice’

Received wisdom and the notion of ‘best practice’ also create bias. One organisation may look to another as an example of sound practice and decision-making, without critically evaluating the effectiveness of its actions. And while scientific literature on key issues in the field is vital, there’s a gap between this and the perceptions of practitioners, who are often unaware of the depth of research available.

Cherry-picking evidence

Even when looking at research, we can be naturally biased. We have a tendency to ‘cherry-pick’ research that backs up a perspective or opinion and to ignore research that does not, even if it gives stronger evidence on cause-and-effect relationships. This bad habit is hard to avoid – it's even common among academic researchers. So we need approaches that help us determine which research evidence we should trust.

Our ‘insight’ article When the going gets tough, the tough get evidence explains the importance of taking an evidence-based approach to decision-making in light of the COVID-19 pandemic. It discusses how decision-makers can and should become savvy consumers of research.

How can evidence-based practice help?

Our thought leadership article outlines the importance of evidence-based practice in more detail but, essentially, it has three main benefits:

  • It ensures that decision-making is based on fact, rather than outdated insights, short-term fads and natural bias.
  • It creates a stronger body of knowledge and, as a result, a more trusted profession.
  • It gives more gravitas to professionals, leads to increased influence on other business leaders and has a more positive impact at work.

The four sources of evidence

The issues above demonstrate the limitations of basing decisions on limited, unreliable evidence. Before making an important decision or introducing a new practice, an evidence-based people professional should start by asking: "What is the available evidence?" As a minimum, people professionals should consider four sources of evidence.

  • Scientific literature on people management has become more readily available in recent years, particularly on topics such as the recruitment and selection of personnel, the effect of feedback on performance and the characteristics of effective teams. People professionals’ ability to search for and appraise research for its relevance and trustworthiness is essential.
  • Organisational data must be examined as it highlights issues needing a manager’s attention. This data can come externally from customers or clients (customer satisfaction, repeated business), or internally from employees (levels of job satisfaction, retention rates). There’s also the comparison between ‘hard’ evidence, such as turnover rate and productivity levels, and ‘soft’ elements, like perceptions of culture and attitudes towards leadership. Gaining access to organisational data is key to determining the causes of problems, identifying possible solutions and implementing them.
  • Expertise and judgement of practitioners, managers, consultants and business leaders is important to ensure effective decision-making. This professional knowledge differs from opinion as it’s accumulated over time through reflection on outcomes of similar actions taken in similar contexts. It reflects specialised knowledge acquired through repeated experience of specialised activities.
  • Stakeholders, both internal (employees, managers, board members) and external (suppliers, investors, shareholders), may be affected by an organisation’s decisions and their consequences. Their values reflect what they deem important, which in turn affects how they respond to the organisation’s decisions. Acquiring knowledge of their concerns provides a frame of reference for analysing evidence.

Combining the evidence

One very important element of evidence-based practice is collating evidence from different sources. There are six steps – depicted in our infographic below – that encourage this:

Evidence based practice infographic

  • Asking – translating a practical issue or problem into an answerable question.
  • Acquiring – systematically searching for and retrieving evidence.
  • Appraising – critically judging the trustworthiness and relevance of the evidence.
  • Aggregating – weighing and pulling together the evidence.
  • Applying – incorporating the evidence into a decision-making process.
  • Assessing – evaluating the outcome of the decision taken, so as to increase the likelihood of a favourable outcome.

Through these six steps, practitioners can ensure the quality of evidence is not ignored. Appraisal varies depending on the source of evidence, but generally involves the same questions:

  • Where and how is evidence gathered?
  • Is it the best evidence available?
  • Is it sufficient to reach a conclusion?
  • Might it be biased in a particular direction? If so, why?

Evidence-based practice is about using the best available evidence from multiple sources to optimise decisions. Being evidence-based is not a question of looking for ‘proof’, as this is far too elusive. However, we can – and should – prioritise the most trustworthy evidence available. The gains in making better decisions on the ground, strengthening the body of knowledge and becoming a more influential profession are surely worthwhile.

To realise the vision of a people profession that’s genuinely evidence-based, we need to move forward on two fronts. 

First, we need to make sure that the body of professional knowledge is evidence-based – the CIPD’s Evidence review hub is one way in which we are doing this. 

Second, people professionals need to develop their capacity to engage with the best available evidence. Doing this as a non-researcher may feel daunting, but taking small steps towards more evidence-based decisions can make a huge difference. Our thought leadership article outlines a maturity model for being more evidence-based in more detail, but to summarise, we’d encourage people professionals to take the following steps:

  • Read research: engage with high-quality research on areas of interest through reading core textbooks and journals that summarise research.
  • Collect and analyse organisational data: in the long term, developing analytical capability should be an aim for the people profession. More immediately, HR leaders should have some knowledge of data analytics – enough to ask probing questions and make the case for the resources needed for robust measures.
  • Review published evidence, including conducting or commissioning short evidence reviews of scientific literature to inform decisions.
  • Pilot new practices: evaluate new interventions through applying the same principles used in rigorous cause-and-effect research.
  • Share your knowledge: strengthen the body of knowledge by sharing research insights at events or in publications.
  • Think critically: throughout this process, question assumptions and carefully consider where there are gaps in knowledge.

Developing this sort of capability is a long journey but one that people professionals should aspire to. As the professional body for HR and people development, the CIPD takes an evidence-based view on the future of work – and, importantly, what this means for our profession. By doing this, we can help prepare professionals and employers for what’s coming, while also equipping them to succeed and shape a changing world of work.

Our Profession Map has been developed to do this. It defines the knowledge, behaviours and values which should underpin today’s people profession. It has been developed as an international standard against which an organisation can benchmark its values. At its core are the concepts of being principles-led, evidence-based and outcomes-driven. This recognises the importance of using the four forms of evidence in a principled manner to develop positive outcomes for stakeholders. As evidence is often of varying degrees of quality, it’s important that people professionals consider if and how they should incorporate the different types of evidence into their work.

Evidence-based practice is a useful concept for understanding whether practices in HR lead to the desired outcomes, and whether these practices are being used to the best effect. 

Both our guide and thought leadership article offer a detailed, step-by-step approach to using evidence-based practice in your decision making.

All our evidence reviews are featured on our Evidence Hub. For a learning and development perspective, listen to our Evidence-based L&D podcast. There's also Using evidence in HR decision-making: 10 lessons from the COVID-19 crisis, part of our coronavirus webinar series.

Center for Evidence-Based Management (CEBMa)  

ScienceForWork - Evidence-based management  

Books and reports

Barends, E. and Rousseau, D. (2018) Evidence-based management: how to use evidence to make better organizational decisions. London: Kogan Page.

Levitin, D. (2015) The Organized Mind: Thinking Straight in the Age of Information Overload. London: Penguin.

Randell, G. and Toplis, J. (2014) Towards organizational fitness: a guide to diagnosis and treatment. London: Gower.

Visit the CIPD and Kogan Page Bookshop to see all our priced publications currently in print.

Journal articles

Petticrew, M. and Roberts, H. (2003) Evidence, hierarchies, and typologies: horses for courses. Journal of Epidemiology and Community Health. Vol 57(7): 527.

Rousseau, D. (2020) Making evidence-based decisions in an uncertain world. Organizational Dynamics. Vol 49, Issue 1, January–March. Reviewed in Bitesize research.

Severson, E. (2019) Real-life EBM: what it feels like to lead evidence-based HR. People + Strategy. Vol 42, No 1. pp22-27.

CIPD members can use our online journals to find articles from over 300 journal titles relevant to HR.

Members and People Management subscribers can see articles on the People Management website.

This factsheet was last updated by Jake Young, Research Associate, CIPD.

Jake’s research interests cover a number of workplace topics, notably inclusion and diversity. Jake is heavily involved with CIPD’s evidence reviews, looking at a variety of topics including employee engagement, employee resilience and virtual teams.




What is Evidence-Based Practice in Nursing?

5 min read • June 01, 2023

Evidence-based practice in nursing involves providing holistic, quality care based on the most up-to-date research and knowledge rather than traditional methods, advice from colleagues, or personal beliefs. 

Nurses can expand their knowledge and improve their clinical practice experience by collecting, processing, and implementing research findings. Evidence-based practice focuses on what's at the heart of nursing — your patient. Learn what evidence-based practice in nursing is, why it's essential, and how to incorporate it into your daily patient care.

How to Use Evidence-Based Practice in Nursing

Evidence-based practice requires you to review and assess the latest research. The knowledge gained from evidence-based research in nursing may indicate changing a standard nursing care policy in your practice. Discuss your findings with your nurse manager and team before implementation. Once you've gained their support and ensured compliance with your facility's policies and procedures, merge nursing implementations based on this information with your patient's values to provide the most effective care.

You may already be using evidence-based nursing practices without knowing it. Research findings support a significant percentage of nursing practices, and ongoing studies anticipate this will continue to increase.

Evidence-Based Practice in Nursing Examples

There are various examples of evidence-based practice in nursing, such as:

  • Use of oxygen to help with hypoxia and organ failure in patients with COPD 
  • Management of angina
  • Protocols regarding alarm fatigue
  • Recognition of a family member's influence on a patient's presentation of symptoms
  • Noninvasive measurement of blood pressure in children 

Improving patient care begins by asking how you can make it a safer, more compassionate, and personal experience. 

Learn about pertinent evidence-based practice information on our Clinical Practice Material page.

Five Steps to Implement Evidence-Based Practice in Nursing


Evidence-based nursing draws upon critical reasoning and judgment skills developed through experience and training. You can practice evidence-based nursing interventions by following five crucial steps that serve as guidelines for making patient care decisions. This process includes incorporating the best external evidence, your clinical expertise, and the patient's values and expectations.

  • Ask a clear question about the patient's issue and determine an ultimate goal, such as improving a procedure to help their specific condition. 
  • Acquire the best evidence by searching relevant clinical articles from legitimate sources.
  • Appraise the resources gathered to determine if the information is valid, of optimal quality compared to the evidence levels, and relevant for the patient.
  • Apply the evidence to clinical practice by making decisions based on your nursing expertise and the new information.
  • Assess outcomes to determine if the treatment was effective and should be considered for other patients.

Analyzing Evidence-Based Research Levels

You can compare current professional and clinical practices with new research outcomes when evaluating evidence-based research. But how do you know what's considered the best information?

Use critical thinking skills and consider levels of evidence to establish the reliability of the information when you analyze evidence-based research. These levels can help you determine how much emphasis to place on a study, report, or clinical practice guideline when making decisions about patient care.

The Levels of Evidence-Based Practice

Four primary levels of evidence come into play when you're making clinical decisions.

  • Level A acquires evidence from randomized, controlled trials and is considered the most reliable.
  • Level B evidence is obtained from well-designed controlled trials without randomization.
  • Level C typically gets implemented when there is limited information about a condition and acquires evidence from a consensus viewpoint or expert opinion.
  • Level ML (multi-level) is usually applied to complex cases and gets its evidence from more than one of the other levels.

Why Is Evidence-Based Practice in Nursing Essential?


Implementing evidence-based practice in nursing bridges the theory-to-practice gap and delivers innovative patient care using the most current health care findings. The topic of evidence-based practice will likely come up throughout your nursing career. Its origins trace back to Florence Nightingale: the iconic founder of modern nursing gathered data and drew conclusions regarding the relationship between unsanitary conditions and failing health. Its application remains essential today.

Other Benefits of Evidence-Based Practice in Nursing

Besides keeping health care practices relevant and current, evidence-based practice in nursing offers a range of other benefits to you and your patients:

  • Promotes positive patient outcomes
  • Reduces health care costs by preventing complications 
  • Contributes to the growth of the science of nursing
  • Allows for incorporation of new technologies into health care practice
  • Increases nurse autonomy and confidence in decision-making
  • Ensures relevancy of nursing practice with new interventions and care protocols 
  • Provides scientifically supported research to help make well-informed decisions
  • Fosters shared decision-making with patients in care planning
  • Enhances critical thinking 
  • Encourages lifelong learning

When you use the principles of evidence-based practice in nursing to make decisions about your patient's care, it results in better outcomes, higher satisfaction, and reduced costs. Implementing this method promotes lifelong learning and lets you strive for continuous quality improvement in your clinical care and nursing practice to achieve  nursing excellence .



Critical thinking in healthcare and education

  • Jonathan M Sharples, professor 1,
  • Andrew D Oxman, research director 2,
  • Kamal R Mahtani, clinical lecturer 3,
  • Iain Chalmers, coordinator 4,
  • Sandy Oliver, professor 1,
  • Kevan Collins, chief executive 5,
  • Astrid Austvoll-Dahlgren, senior researcher 2,
  • Tammy Hoffmann, professor 6
  • 1 EPPI-Centre, UCL Department of Social Science, London, UK
  • 2 Global Health Unit, Norwegian Institute of Public Health, Oslo, Norway
  • 3 Centre for Evidence-Based Medicine, Oxford University, Oxford, UK
  • 4 James Lind Initiative, Oxford, UK
  • 5 Education Endowment Foundation, London, UK
  • 6 Centre for Research in Evidence-Based Practice, Bond University, Gold Coast, Australia
  • Correspondence to: J M Sharples Jonathan.Sharples{at}eefoundation.org.uk

Critical thinking is just one skill crucial to evidence based practice in healthcare and education, write Jonathan Sharples and colleagues, who see exciting opportunities for cross sector collaboration

Imagine you are a primary care doctor. A patient comes into your office with acute, atypical chest pain. Immediately you consider the patient’s sex and age, and you begin to think about what questions to ask and what diagnoses and diagnostic tests to consider. You will also need to think about what treatments to consider and how to communicate with the patient and potentially with the patient’s family and other healthcare providers. Some of what you do will be done reflexively, with little explicit thought, but caring for most patients also requires you to think critically about what you are going to do.

Critical thinking, the ability to think clearly and rationally about what to do or what to believe, is essential for the practice of medicine. Few doctors are likely to argue with this. Yet, until recently, the UK regulator the General Medical Council and similar bodies in North America did not mention “critical thinking” anywhere in their standards for licensing and accreditation, 1 and critical thinking is not explicitly taught or assessed in most education programmes for health professionals. 2

Moreover, although more than 2800 articles indexed by PubMed have “critical thinking” in the title or abstract, most are about nursing. We argue that it is important for clinicians and patients to learn to think critically and that the teaching and learning of these skills should be considered explicitly. Given the shared interest in critical thinking with broader education, we also highlight why healthcare and education professionals and researchers need to work together to enable people to think critically about the health choices they make throughout life.

Essential skills for doctors and patients

Critical thinking …


Promoting critical thinking through an evidence-based skills fair intervention

Journal of Research in Innovative Teaching & Learning

ISSN: 2397-7604

Article publication date: 23 November 2020

Issue publication date: 1 April 2022

Purpose

The lack of critical thinking in new graduates has been a concern to the nursing profession. The purpose of this study was to investigate the effects of an innovative, evidence-based skills fair intervention on nursing students' achievements and perceptions of critical thinking skills development.

Design/methodology/approach

The explanatory sequential mixed-methods design was employed for this study.

Findings

The findings indicated participants perceived the intervention as a strategy for developing critical thinking.

Originality/value

The study provides educators helpful information in planning their own teaching practice in educating students.

Keywords: Critical thinking; evidence-based practice; skills fair intervention.

Gonzalez, H.C., Hsiao, E.-L., Dees, D.C., Noviello, S.R. and Gerber, B.L. (2022), "Promoting critical thinking through an evidence-based skills fair intervention", Journal of Research in Innovative Teaching & Learning, Vol. 15 No. 1, pp. 41-54. https://doi.org/10.1108/JRIT-08-2020-0041

Emerald Publishing Limited

Copyright © 2020, Heidi C. Gonzalez, E-Ling Hsiao, Dianne C. Dees, Sherri R. Noviello and Brian L. Gerber

Published in Journal of Research in Innovative Teaching & Learning. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode

Introduction

Critical thinking (CT) was defined as “cognitive skills of analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge” (Scheffer and Rubenfeld, 2000, p. 357). Critical thinking is the basis for all professional decision-making (Moore, 2007). The lack of critical thinking in student nurses and new graduates has been a concern to the nursing profession. It negatively affects the quality of service and directly relates to the high error rates among novice nurses that compromise patient safety (Arli et al., 2017; Saintsing et al., 2011). It was reported that as many as 88% of novice nurses commit medication errors, with 30% of these errors due to a lack of critical thinking (Ebright et al., 2004). Failure to rescue is another type of error common for novice nurses, reported as high as 37% (Saintsing et al., 2011). The failure to recognize trends or complications promptly, or to take action to stabilize the patient, occurs when health-care providers do not recognize the signs and symptoms of the early warnings of distress (Garvey and CNE series, 2015). Internationally, this lack of preparedness and critical thinking contributes to the reported 35–60% attrition rate of new graduate nurses in their first two years of practice (Goodare, 2015). The high attrition rate of new nurses carries expensive professional and economic costs of $82,000 or more per nurse and negatively affects patient care (Twibell et al., 2012). Facione and Facione (2013) reported that the failure to utilize critical thinking skills not only interferes with learning but also results in poor decision-making and unclear communication between health-care professionals, which ultimately leads to patient deaths.

Due to the importance of critical thinking, many nursing programs strive to infuse critical thinking into their curriculum to better prepare graduates for the realities of clinical practice that involves ever-changing, complex clinical situations and bridge the gap between education and practice in nursing (Benner et al., 2010; Kim et al., 2019; Park et al., 2016; Newton and Moore, 2013; Nibert, 2011). To help develop students' critical thinking skills, nurse educators must change the way they teach nursing, so they can prepare future nurses to be effective communicators, critical thinkers and creative problem solvers (Rieger et al., 2015). Nursing leaders also need to redefine teaching practice and educational guidelines that drive innovation in undergraduate nursing programs.

Evidence-based practice has been advocated to promote critical thinking and help reduce the research-practice gap (Profetto-McGrath, 2005; Stanley and Dougherty, 2010). Evidence-based practice was defined as “the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of the individual patient” (Sackett et al., 1996, p. 71). Skills fair intervention, one type of evidence-based practice, can be used to engage students, promote active learning and develop critical thinking (McCausland and Meyers, 2013; Roberts et al., 2009). Skills fair intervention helps promote a consistent teaching practice of the psychomotor skills to the novice nurse that decreased anxiety, gave clarity of expectations to the students in the clinical setting and increased students' critical thinking skills (Roberts et al., 2009). The researchers of this study had an opportunity to create an active, innovative skills fair intervention for a baccalaureate nursing program in one southeastern state. This intervention incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning in an attempt to promote and develop critical thinking in nursing students (Hsu and Hsieh, 2013; Oermann et al., 2011; Roberts et al., 2009). The effects of an innovative skills fair intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking development were examined in the study.

Literature review

The ability to use reasoned opinion focusing equally on processes and outcomes over emotions is called critical thinking (Paul and Elder, 2008). Critical thinking skills are desired in almost every discipline and play a major role in decision-making and daily judgments. The roots of critical thinking date back to Socrates 2,500 years ago and can be traced to the ancient philosopher Aristotle (Paul and Elder, 2012). Socrates challenged others by asking inquisitive questions in an attempt to challenge their knowledge. In the 1980s, critical thinking gained nationwide recognition as a behavioral science concept in the educational system (Robert and Petersen, 2013). Many researchers in both education and nursing have attempted to define, measure and teach critical thinking for decades. However, a theoretical definition has yet to be accepted and established by the nursing profession (Romeo, 2010). The terms critical literacy, CT, reflective thinking, systems thinking, clinical judgment and clinical reasoning are used synonymously in the reviewed literature (Clarke and Whitney, 2009; Dykstra, 2008; Jones, 2010; Swing, 2014; Turner, 2005).

Watson and Glaser (1980) viewed critical thinking not only as a cognitive skill but as a combination of skills, knowledge and attitudes. Paul (1993), the founder of the Foundation for Critical Thinking, offered several definitions of critical thinking and identified three essential components: elements of thought, intellectual standards and affective traits. Brunt (2005) framed critical thinking as a practical process, "the process of purposeful thinking and reflective reasoning where practitioners examine ideas, assumptions, principles, conclusions, beliefs, and actions in the contexts of nursing practice" (p. 61). In an updated definition, Ennis (2011) described critical thinking as "reasonable reflective thinking focused on deciding what to believe or do" (para. 1).

The most comprehensive attempt to define critical thinking was conducted under the direction of Facione and sponsored by the American Philosophical Association (Scheffer and Rubenfeld, 2000). Facione (1990) surveyed 53 experts from the arts and sciences using the Delphi method and defined critical thinking as a "purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as an explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which judgment is based" (p. 2).

To come to a consensus definition for critical thinking, Scheffer and Rubenfeld (2000) also conducted a Delphi study. Their study consisted of an international panel of nurses who completed five rounds of sequenced questions to arrive at a consensus definition. Critical thinking was defined as “habits of mind” and “cognitive skills.” The elements of habits of mind included “confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual integrity, intuition, open-mindedness, perseverance, and reflection” ( Scheffer and Rubenfeld, 2000 , p. 352). The elements of cognitive skills were recognized as “analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting, and transforming knowledge” ( Scheffer and Rubenfeld, 2000 , p. 352). In addition, Ignatavicius (2001) defined the development of critical thinking as a long-term process that must be practiced, nurtured and reinforced over time. Ignatavicius believed that a critical thinker required six cognitive skills: interpretation, analysis, evaluation, inference, explanation and self-regulation ( Chun-Chih et al. , 2015 ). According to Ignatavicius (2001) , the development of critical thinking is difficult to measure or describe because it is a formative rather than summative process.

Fero et al. (2009) noted that patient safety may be compromised if a nurse cannot provide clinically competent care due to a lack of critical thinking. The Institute of Medicine (2001) recommended five health care competencies: patient-centered care, interdisciplinary team care, evidence-based practice, informatics and quality improvement. Understanding the development and attainment of critical thinking is key to gaining these competencies (Scheffer and Rubenfeld, 2000). The development of a strong scientific foundation for nursing practice depends on habits such as contextual perspective, inquisitiveness, creativity, analysis and reasoning skills; therefore, how these critical thinking habits develop in nursing students needs to be explored through additional research (Fero et al., 2009). Despite critical thinking being listed as an accreditation outcome criterion for baccalaureate programs by the National League for Nursing since the 1980s, very little improvement has been observed in practice (McMullen and McMullen, 2009). James (2013) reported that the number of patient harm incidents associated with hospital care is much higher than previously thought: each year, between 210,000 and 440,000 patients go to the hospital for care and suffer some preventable harm that contributes to their death. James attributed preventable errors to many sources besides nursing care, but a nurse who can advocate and think critically for patients can make a positive impact on patient safety (James, 2013; Robert and Peterson, 2013).

Adopting teaching practices that promote critical thinking is a crucial component of nursing education. Research by Nadelson and Nadelson (2014) suggested evidence-based practice is best learned when integrated into multiple areas of the curriculum. Evidence-based practice developed its roots through evidence-based medicine, with philosophical origins extending back to the mid-19th century (Longton, 2014). Florence Nightingale, the pioneer of modern nursing, used evidence-based practice during the Crimean War when she recognized a connection between poor sanitary conditions and rising mortality rates among wounded soldiers (Rahman and Applebaum, 2011). In professional nursing practice today, a commonly used definition of evidence-based practice derives from Dr. David Sackett: the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient (Sackett et al., 1996, p. 71). For patient safety, it is imperative that professional nurses remain inquisitive and ask whether the care they provide is based on the available evidence. One of the core beliefs of the American Nephrology Nurses' Association's (2019) 2019–2020 Strategic Plan is that "ANNA must support research to develop evidence-based practice, as well as to advance nursing science, and that as individual members, we must support, participate in, and apply evidence-based research that advances our own skills, as well as nursing science" (p. 1). Longton (2014) reported that a lack of evidence-based practice in nursing resulted in negative patient outcomes; when evidence-based practice was implemented, changes in policies and procedures decreased reports of patient harm and associated health-care costs.
The Institute of Medicine (2011) recommended that nurses lead the transformation of the health-care system and achieve higher levels of education, providing the ability to critically analyze data to improve the quality of patient care. Student nurses must be taught to connect and integrate critical thinking and evidence-based practice throughout their program of study and to continue that practice throughout their careers.

One type of evidence-based practice that can be used to engage students, promote active learning and develop critical thinking is the skills fair intervention (McCausland and Meyers, 2013; Roberts et al., 2009). Skills fair interventions have promoted consistent teaching of psychomotor skills to novice nurses, decreasing anxiety, clarifying expectations for students in the clinical setting and increasing students' critical thinking skills (Roberts et al., 2009). The skills fair intervention used in this study was a teaching strategy that incorporated critical thinking prompts, Socratic questioning, group work, guided discussions, return demonstrations and blended learning in an attempt to develop critical thinking in nursing students (Hsu and Hsieh, 2013; Roberts et al., 2009). It melded evidence-based practice with simulated critical thinking opportunities while students practiced essential psychomotor skills.

Research methodology

Context – skills fair intervention.

According to Roberts et al. (2009), psychomotor skills decline over time even among licensed, experienced professionals: performance can deteriorate within as little as two weeks, and a skill may need to be relearned after two months without practice. Applying this to student nurses, for whom each skill is new, it is no wonder that competency diminishes after a summer break from nursing school. The skills fair intervention was a one-day event to assist baccalaureate students who had taken the summer off from their nursing studies, and all faculty participated in operating the stations. It incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning in an attempt to promote and develop critical thinking in baccalaureate students.

Students were randomly scheduled into eight teams, each named for an attribute of critical thinking as described by Wittmann-Price (2013): Team A – Perseverance, Team B – Flexibility, Team C – Confidence, Team D – Creativity, Team E – Inquisitiveness, Team F – Reflection, Team G – Analyzing and Team H – Intuition. The students rotated every 20 minutes through eight stations: Medication Administration (Intramuscular and Subcutaneous Injections), Initiating Intravenous Therapy, Ten-Minute Focused Physical Assessment, Foley Catheter Insertion, Nasogastric Intubation, Skin Assessment/Braden Score and Restraints, Vital Signs, and a Safety Station. After completing all eight stations, students went to the "Check-Out" booth to complete a brief evaluation of their perceptions of the intervention's effectiveness. Once the evaluations were complete, each critical thinking attribute team placed its index cards into a hat, and one student won a small prize. All Junior 2, Senior 1 and Senior 2 students were required to attend the Skills Fair. The Skills Fair Team strove to make the event as festive as possible, engaging nursing students with balloons, candy, tri-boards, signs and fun pre- and post-activities. The Skills Fair rubrics, schedule and instructions were shared electronically with students and faculty before the intervention to ensure adequate preparation and continued availability as a resource when students moved into their future clinical settings.

Research design

Institutional review board (IRB) approval was obtained from XXX University to conduct this study and protect human subject rights. An explanatory sequential mixed-methods design was employed: quantitative data first identified what effect the skills fair intervention had on senior baccalaureate nursing students' achievement on the Kaplan Critical Thinking Integrated Test (KCTIT), and follow-up individual interviews then explored those test results in more depth. In total, 52 senior nursing students completed the KCTIT; 30 had participated in the skills fair intervention and 22 had not. The KCTIT is a computerized 85-item exam scored one point per item, so a raw score of 85 equates to 100%. It has high reliability and validity (Kaplan Nursing, 2012; Swing, 2014), with reported reliability values ranging from 0.72 to 0.89. A t-test was used to analyze the test results.

A total of 11 participants were purposively selected for open-ended one-on-one interviews based on their KCTIT results: six high achievers and five low achievers. Each interview was conducted individually, lasted about 60 minutes and was guided by an open-ended interview protocol. The interviewees' ages ranged from 21 to 30 years, with an average of 24 years. One of the 11 interviewees was male. Seven were White, three were Black and one was Indian American. The data collected were used to answer the following research questions: (1) What was the difference in achievement on the KCTIT between senior baccalaureate nursing students who participated in the skills fair intervention and students who did not? (2) What were the senior baccalaureate nursing students' perceptions of internal and external factors impacting the development of critical thinking skills during the skills fair intervention? (3) What were the senior baccalaureate nursing students' perceptions of the skills fair intervention as a critical thinking developmental strategy?

Inductive content analysis was used to analyze the interview data, starting with close reading of the transcripts and memo-writing for initial coding, followed by analysis of patterns and relationships among the data for focused coding. Intercoder reliability was established for the qualitative analysis with a nursing expert. The lead researcher and the expert read the transcripts several times and assigned codes to significant units of text relevant to the research questions. The codes were compared for differences and similarities and sorted into subcategories and categories. Headings and subheadings based on similar comments were then used to develop central themes and patterns. Establishing intercoder reliability helped increase the dependability, confirmability and credibility of the findings (Graneheim and Lundman, 2004). In addition, methods of credibility, confirmability, dependability and transferability were applied to increase the trustworthiness of the study (Graneheim and Lundman, 2004). First, reflexivity was observed by keeping journals and memos, allowing the lead researcher to reflect on personal views and minimize bias. Data saturation was reached by following the recommended number of participants and through repeated immersion in the data during analysis until no new data surfaced. Member checking was accomplished by returning the transcript and its interpretation to the participants to check the accuracy and truthfulness of the findings. Finally, proper documentation was maintained to allow accurate cross-referencing throughout the study.
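Intercoder agreement of the kind described here is often quantified with a chance-corrected statistic such as Cohen's kappa. The study does not report one, so the sketch below is purely illustrative; the excerpt labels and both coders' assignments are invented:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: agreement between two coders beyond chance.
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is agreement expected from each coder's label frequencies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] / n * freq_b[c] / n for c in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by the lead researcher and the nursing
# expert to eight interview excerpts (labels invented for illustration)
lead   = ["confidence", "attitude", "attitude", "prompts",
          "confidence", "practice", "prompts", "practice"]
expert = ["confidence", "attitude", "confidence", "prompts",
          "confidence", "practice", "prompts", "attitude"]
print(round(cohens_kappa(lead, expert), 2))  # → 0.67
```

Values above roughly 0.6 are conventionally read as substantial agreement; discrepant excerpts (here, the third and eighth) would be discussed until consensus, as the study describes.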

Quantitative results

Results for the quantitative portion showed no significant difference in KCTIT scores between senior nursing students who participated in the skills fair intervention and those who did not, t(50) = −0.174, p = 0.86. The test scores of the nonparticipant group (M = 67.59, SD = 5.81) and the participant group (M = 67.88, SD = 5.99) were almost equal.
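The reported statistic can be reproduced from the group summary statistics alone using the pooled-variance (equal-variances-assumed) form of the independent-samples t-test, which the reported df of 50 implies; this is a sketch of the arithmetic, not the authors' actual analysis script:

```python
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Independent-samples t statistic from summary statistics,
    pooled-variance form; degrees of freedom df = n1 + n2 - 2."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df  # pooled variance
    se = sqrt(sp2 * (1 / n1 + 1 / n2))                # standard error
    return (m1 - m2) / se, df

# Summary statistics reported in the study
t, df = pooled_t(67.59, 5.81, 22,   # nonparticipant group
                 67.88, 5.99, 30)   # participant group
print(f"t({df}) = {t:.3f}")  # → t(50) = -0.175
```

The small difference from the reported −0.174 is rounding in the published means and standard deviations; either way the statistic is far from significance at alpha = 0.05.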

Qualitative results

Initial coding.

The results from the initial coding and generated themes are listed in Table 1 . First, the participants perceived the skills fair intervention as “promoting experience” and “confidence” by practicing previously learned knowledge and reinforcing it with active learning strategies. Second, the participants perceived the skills fair intervention as a relaxed, nonthreatening learning environment due to the festive atmosphere, especially in comparison to other learning experiences in the nursing program. The nonthreatening environment of the skills fair intervention allowed students to learn without fear. Third, the majority of participants believed their critical thinking was strengthened after participating. Several participants believed their perception of critical thinking was “enhanced” or “reinforced” rather than significantly changed.

Focused coding results

The final themes were derived from the analysis of patterns and relationships among the data using inductive content analysis (Saldana, 2009). The focused coding process examined: (1) factors impacting critical thinking skills development during the skills fair intervention and (2) the skills fair intervention as a critical thinking skills developmental strategy.

Factors impacting critical thinking skills development . The factors impacting the development of critical thinking during the skills fair intervention were divided into two themes: internal factors and external factors. The internal factors were characteristics innate to the students. The identified internal factors were (1) confidence and anxiety levels, (2) attitude and (3) age. The external factors were the outside influences that affected the students. The external factors were (1) experience and practice, (2) faculty involvement, (3) positive learning environment and (4) faculty prompts.

I think that confidence and anxiety definitely both have a huge impact on your ability to be able to really critically think. If you start getting anxious and panicking you cannot think through the process like you need too. I do not really think gender or age necessarily would have anything to do with critical thinking.
Definitely the confidence level, I think, the more advanced you get in the program, your confidence just keeps on growing. Level of anxiety, definitely… I think the people who were in the Skills Fair for the first time, had more anxiety because they did not really know to think, they did not know how strict it was going to be, or if they really had to know everything by the book. I think the Skills Fair helped everyone's confidence levels, but especially the Jr. 2's.

Attitude was an important factor in the development of critical thinking skills during the skills fair intervention as participants believed possessing a pleasant and positive attitude meant a student was eager to learn, participate, accept responsibility for completing duties and think seriously. Participant 6 believed attitude contributed to performance in the Skills Fair.

I feel like, certain things bring critical thinking out in you. And since I'm a little bit older than some of the other students, I have had more life experiences and am able to figure stuff out better. Older students have had more time to learn by trial and error, and this and that.
Like when I had clinical with you, you'd always tell us to know our patients' medications. To always know and be prepared to answer questions – because at first as a Junior 1 we did not do that in the clinical setting… and as a Junior 2, I did not really have to know my medications, but with you as a Senior 1, I started to realize that the patients do ask about their meds, so I was making sure that I knew everything before they asked it. And just having more practice with IVs – at first, I was really nervous, but when I got to my preceptorship – I had done so many IVs and with all of the practice, it just built up my confidence with that skill so when I performed that skill during the Fair, I was confident due to my clinical experiences and able to think and perform better.
I think teachers will always affect the ability to critically think just because you want [to] get the right answer because they are there and you want to seem smart to them [Laugh]. Also, if you are leading in the wrong direction of your thinking – they help steer you back to [in] the right direction so I think that was very helpful.
You could tell the faculty really tried to make it more laid back and fun, so everybody would have a good experience. The faculty had a good attitude. I think making it fun and active helped keep people positive. You know if people are negative and not motivated, nothing gets accomplished. The faculty did an amazing job at making the Skills Fair a positive atmosphere.

However, for some of the participants, a positive learning environment depended on their fellow students. The students were assigned alphabetically to groups, and the groups were assigned to starting stations at the Skills Fair. The participants reported that some students did not want to participate and displayed cynicism toward the intervention, and they believed this cynicism undermined the positive learning environment, making critical thinking more difficult during the Skills Fair.

Okay, when [instructor name] was demonstrating the Chevron technique right after we inserted the IV catheter and we were trying to secure the catheter, put on the extension set, and flush the line at what seemed to be all at the same time. I forgot about how you do not want to put the tape right over the hub of the catheter because when you go back in and try to assess the IV site – you're trying to assess whether or not it is patent or infiltrated – you have to visualize the insertion site. That was one of the things that I had been doing wrong because I was just so excited that I got the IV in the vein in the first place – that I did not think much about the tape or the tegaderm for sterility. So I think an important part of critical thinking is to be able to recognize when you've made a mistake and stop, stop yourself from doing it in the future (see Table 2 ).

Skills fair intervention as a developmental strategy for critical thinking . The participants identified that the skills fair intervention was effective as a developmental strategy for critical thinking, as revealed in two themes: (1) develops alternative thinking and (2) thinking before doing (see Table 3).

Develops alternative thinking . The participants perceived that the skills fair intervention enhanced critical thinking and confidence by developing alternative thinking. Alternative thinking was described as quickly generating alternative solutions to problems based on the latest evidence and using that information to determine what actions were warranted to prevent complications and injury. It helped students make better connections between knowledge and skills by learning the rationale behind them, and then apply that knowledge to prevent complications and errors and ensure patient safety. The participants stated that the rationale provided during the skills fair intervention, such as the evidence and critical thinking prompts included in the rubrics, helped reinforce this connection. The participants also shared that, after participating in the intervention, they developed alternative thinking by noticing trends in data to prevent potential complications in response to faculty prompts. Participant 1 stated that her instructor prompted her alternative thinking through questioning about noticing trends to prevent potential complications. She said the following:

Another way critical thinking occurred during the skills fair was when [instructor name] was teaching and prompted us about what it would be like to care for a patient with a fractured hip – I think this was at the 10-minute focused assessment station, but I could be wrong. I remember her asking, “What do you need to be on the look-out for? What can go wrong?” I automatically did not think critically very well and was only thinking circulation in the leg, dah, dah, dah. But she was prompting us to think about mobility alterations and its effect on perfusion and oxygenation. She was trying to help us build those connections. And I think that's a lot of the aspects of critical thinking that gets overlooked with the nursing student – trouble making connections between our knowledge and applying it in practice.

Thinking before doing . The participants perceived that thinking before doing, which included thinking through how and why certain procedures are performed, required self-examination prior to taking action. The hands-on situational learning in the skills fair intervention allowed participants to better notice assessment data and think at a higher level, as their previous learning of the skills had been perceived as memorization of steps. This higher level of learning allowed participants to consider different future outcomes and analyze pertinent data before taking action.

I think what helped me the most is considering outcomes of my actions before I do anything. For instance, if you're thinking, “Okay. Well, I need to check their blood pressure before I administer this blood pressure medication – or the blood pressure could potentially bottom out.” I really do not want my patient to bottom out and get hypotensive because I administered a medication that was ordered, but not safe to give. I could prevent problems from happening if I know what to be on alert for and act accordingly. So ultimately knowing that in the clinical setting, I can prevent complications from happening and I save myself, my license, and promote patient safety. I think knowing that I've seen the importance of critical thinking already in practice has helped me value and understand why I should be critically thinking. Yes, we use the 5-rights of medication safety – but we also have to think. For instance, if I am going to administer insulin – what do I need to know or do to give this safely? What is the current blood sugar? Has the patient been eating? When is the next meal scheduled? Is the patient NPO for a procedure? Those are examples of questions to consider and the level of thinking that needs to take place prior to taking actions in the clinical setting.

Although the quantitative data showed no significant difference in KCTIT scores between the participant and nonparticipant groups, during the interviews some participants attributed this result to the test not counting toward a course grade and believed students "did not try very hard to score well." The interviewed participants nevertheless identified the skills fair intervention as a developmental strategy for critical thinking that helped them develop alternative thinking and thinking before doing. The findings are supported in the literature: (1) nurses must recognize signs of clinical deterioration and act promptly to prevent potential complications (Garvey, 2015) and (2) nurses must analyze pertinent data and consider all possible solutions before deciding on the most appropriate action for each patient (Papathanasiou et al., 2014).

The skills fair intervention also enhanced the development of self-confidence by participants practicing previously learned skills in a controlled, safe environment. The nonthreatening environment of the skills fair intervention allowed students to learn without fear and the majority of participants believed their critical thinking was strengthened after participating. The interview data also revealed a combination of internal and external factors that influenced the development of critical thinking during the skills fair intervention including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

Conclusions, limitations and recommendations

A major concern in the nursing profession is the lack of critical thinking in student nurses and new graduates, which influences the decision-making of novice nurses and directly affects patient care and safety (Saintsing et al., 2011). Nurse educators must use evidence-based practice to prepare students to think critically within the complicated and constantly evolving environment of health care today (Goodare, 2015; Newton and Moore, 2013). Evidence-based practice has been advocated to promote critical thinking (Profetto-McGrath, 2005; Stanley and Dougherty, 2010), and the skills fair intervention is one type of evidence-based practice that can be used to do so (McCausland and Meyers, 2013; Roberts et al., 2009). The intervention used in this study incorporated evidence-based practice rationale with critical thinking prompts using Socratic questioning, evidence-based practice videos linked to the psychomotor skill rubrics, group work, guided discussions, expert demonstration followed by guided practice and blended learning in an attempt to promote and develop critical thinking in nursing students.

An explanatory sequential mixed-methods design was employed to investigate the effects of the innovative skills fair intervention on senior baccalaureate nursing students' achievements and their perceptions of critical thinking skills development. Although the quantitative results showed no significant difference in KCTIT scores between students who participated in the skills fair intervention and those who did not, the interviewed students perceived that their critical thinking was reinforced by the intervention and believed it was an effective developmental strategy, as it fostered alternative thinking and thinking before doing. This information is useful for nurse educators planning their own teaching practice to promote critical thinking and improve patient outcomes, and it gives schools and educators information to help them review their current approach to educating nursing students. As evidenced in the findings, developing critical thinking skills is crucial to becoming a safe, professional nurse. Internal and external factors impacting the development of critical thinking during the skills fair intervention were identified, including confidence and anxiety levels, attitude, age, experience and practice, faculty involvement, positive learning environment and faculty prompts. These factors should be considered when addressing the promotion and development of critical thinking.

There were several limitations to this study. A major limitation was students' limited exposure to the skills fair intervention, which was a single one-day event. Another limitation was the sample selection and size: the intervention was limited to one baccalaureate nursing program in one southeastern state, so the findings cannot be generalized and may not be representative of baccalaureate nursing programs in general. In addition, the study did not assess students' critical thinking achievement prior to the intervention, so no baseline measurement was available for a before-and-after comparison. Other factors in the nursing program, such as anxiety or motivation, could have affected students' KCTIT scores but were not taken into account in this study.

The recommendations for future research are to expand the topic by including other regions, larger samples and other baccalaureate nursing programs. In addition, future research should consider other participant perceptions, such as nurse educators, to better understand the development and growth of critical thinking skills among nursing students. Finally, based on participant perceptions, future research should include a more rigorous skills fair intervention to develop critical thinking and explore the link between confidence and critical thinking in nursing students.

Table 1. Initial coding results

Theme | Frequency
Experience and confidence contributed to critical thinking skills | 76
Skills fair intervention had a relaxed atmosphere | 23
Skills fair intervention reinforced critical thinking skills | 21

Table 2. Factors impacting critical thinking skill development during skills fair intervention

Theme | Subtheme | Frequency of mentions
Internal factors | | 33
| Confidence and anxiety levels | 17
| Attitude | 10
| Age | 6
External factors | | 62
| Experience and practice | 21
| Faculty involvement | 24
| Positive learning environment | 11
| Faculty prompts | 6

Table 3. Skills fair intervention as a developmental strategy for critical thinking

Theme | Subtheme | Frequency
Develops alternative thinking | | 13
| Application of knowledge and skills | 9
| Noticing trends to prevent complications | 4
Thinking before doing | | 10
| Considering future outcomes | 5
| Analyzing relevant data | 5

American Nephrology Nurses Association (ANNA) (2019), “Learning, leading, connecting, and playing at the intersection of nephrology and nursing-2019–2020 strategic plan”, viewed 3 Aug 2019, available at: https://www.annanurse.org/download/reference/association/strategicPlan.pdf.

Arli, S.D., Bakan, A.B., Ozturk, S., Erisik, E. and Yildirim, Z. (2017), “Critical thinking and caring in nursing students”, International Journal of Caring Sciences, Vol. 10 No. 1, pp. 471-478.

Benner, P., Sutphen, M., Leonard, V. and Day, L. (2010), Educating Nurses: A Call for Radical Transformation, Jossey-Bass, San Francisco.

Brunt, B. (2005), “Critical thinking in nursing: an integrated review”, The Journal of Continuing Education in Nursing, Vol. 36 No. 2, pp. 60-67.

Chun-Chih, L., Chin-Yen, H., I-Ju, P. and Li-Chin, C. (2015), “The teaching-learning approach and critical thinking development: a qualitative exploration of Taiwanese nursing students”, Journal of Professional Nursing, Vol. 31 No. 2, pp. 149-157, doi: 10.1016/j.profnurs.2014.07.001.

Clarke, L.W. and Whitney, E. (2009), “Walking in their shoes: using multiple-perspectives texts as a bridge to critical literacy”, The Reading Teacher, Vol. 62 No. 6, pp. 530-534, doi: 10.1598/RT.62.6.7.

Dykstra, D. (2008), “Integrating critical thinking and memorandum writing into course curriculum using the internet as a research tool”, College Student Journal, Vol. 42 No. 3, pp. 920-929, doi: 10.1007/s10551-010-0477-2.

Ebright, P., Urden, L., Patterson, E. and Chalko, B. (2004), “Themes surrounding novice nurse near-miss and adverse-event situations”, The Journal of Nursing Administration, Vol. 34, pp. 531-538, doi: 10.1097/00005110-200411000-00010.

Ennis, R. (2011), “The nature of critical thinking: an outline of critical thinking dispositions and abilities”, viewed 3 May 2017, available at: https://education.illinois.edu/docs/default-source/faculty-documents/robert-ennis/thenatureofcriticalthinking_51711_000.pdf.

Facione, P.A. (1990), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, The California Academic Press, Millbrae.

Facione, N.C. and Facione, P.A. (2013), The Health Sciences Reasoning Test: Test Manual, The California Academic Press, Millbrae.

Fero, L.J., Witsberger, C.M., Wesmiller, S.W., Zullo, T.G. and Hoffman, L.A. (2009), “Critical thinking ability of new graduate and experienced nurses”, Journal of Advanced Nursing, Vol. 65 No. 1, pp. 139-148, doi: 10.1111/j.1365-2648.2008.04834.x.

Garvey, P.K. (2015), “Failure to rescue: the nurse's impact” (CNE series), Medsurg Nursing, Vol. 24 No. 3, pp. 145-149.

Goodare, P. (2015), “Literature review: ‘Are you ok there?’ The socialization of student and graduate nurses: do we have it right?”, Australian Journal of Advanced Nursing, Vol. 33 No. 1, pp. 38-43.

Graneheim, U.H. and Lundman, B. (2004), “Qualitative content analysis in nursing research: concepts, procedures, and measures to achieve trustworthiness”, Nurse Education Today, Vol. 24 No. 2, pp. 105-112, doi: 10.1016/j.nedt.2003.10.001.

Hsu, L. and Hsieh, S. (2013), “Factors affecting metacognition of undergraduate nursing students in a blended learning environment”, International Journal of Nursing Practice, Vol. 20 No. 3, pp. 233-241, doi: 10.1111/ijn.12131.

Ignatavicius, D. (2001), “Six critical thinking skills for at-the-bedside success”, Dimensions of Critical Care Nursing, Vol. 20 No. 2, pp. 30-33.

Institute of Medicine (2001), Crossing the Quality Chasm: A New Health System for the 21st Century, National Academy Press, Washington.

James, J. (2013), “A new, evidence-based estimate of patient harms associated with hospital care”, Journal of Patient Safety, Vol. 9 No. 3, pp. 122-128, doi: 10.1097/PTS.0b013e3182948a69.

Jones, J.H. (2010), “Developing critical thinking in the perioperative environment”, AORN Journal, Vol. 91 No. 2, pp. 248-256, doi: 10.1016/j.aorn.2009.09.025.

Kaplan Nursing (2012), Kaplan Nursing Integrated Testing Program Faculty Manual, Kaplan Nursing, New York, NY.

Kim, J.S., Gu, M.O. and Chang, H.K. (2019), “Effects of an evidence-based practice education program using multifaceted interventions: a quasi-experimental study with undergraduate nursing students”, BMC Medical Education, Vol. 19, doi: 10.1186/s12909-019-1501-6.

Longton, S. (2014), “Utilizing evidence-based practice for patient safety”, Nephrology Nursing Journal, Vol. 41 No. 4, pp. 343-344.

McCausland, L.L. and Meyers, C.C. (2013), “An interactive skills fair to prepare undergraduate nursing students for clinical experience”, Nursing Education Perspectives, Vol. 34 No. 6, pp. 419-420, doi: 10.5480/1536-5026-34.6.419.

McMullen, M.A. and McMullen, W.F. (2009), “Examining patterns of change in the critical thinking skills of graduate nursing students”, Journal of Nursing Education, Vol. 48 No. 6, pp. 310-318, doi: 10.3928/01484834-20090515-03.

Moore, Z.E. (2007), “Critical thinking and the evidence-based practice of sport psychology”, Journal of Clinical Sport Psychology, Vol. 1, pp. 9-22, doi: 10.1123/jcsp.1.1.9.

Nadelson, S. and Nadelson, L.S. (2014), “Evidence-based practice article reviews using CASP tools: a method for teaching EBP”, Worldviews on Evidence-Based Nursing, Vol. 11 No. 5, pp. 344-346, doi: 10.1111/wvn.12059.

Newton, S.E. and Moore, G. (2013), “Critical thinking skills of basic baccalaureate and accelerated second-degree nursing students”, Nursing Education Perspectives, Vol. 34 No. 3, pp. 154-158, doi: 10.5480/1536-5026-34.3.154.

Nibert, A. (2011), “Nursing education and practice: bridging the gap”, Advance Healthcare Network, viewed 3 May 2017, available at: https://www.elitecme.com/resource-center/nursing/nursing-education-practice-bridging-the-gap/.

Oermann, M.H., Kardong-Edgren, S., Odom-Maryon, T., Hallmark, B.F., Hurd, D., Rogers, N. and Smart, D.A. (2011), “Deliberate practice of motor skills in nursing education: CPR as exemplar”, Nursing Education Perspectives, Vol. 32 No. 5, pp. 311-315, doi: 10.5480/1536-5026-32.5.311.

Papathanasiou, I.V., Kleisiaris, C.F., Fradelos, E.C., Kakou, K. and Kourkouta, L. (2014), “Critical thinking: the development of an essential skill for nursing students”, Acta Informatica Medica, Vol. 22 No. 4, pp. 283-286, doi: 10.5455/aim.2014.22.283-286.

Park, M.Y., Conway, J. and McMillan, M. (2016), “Enhancing critical thinking through simulation”, Journal of Problem-Based Learning, Vol. 3 No. 1, pp. 31-40, doi: 10.24313/jpbl.2016.3.1.31.

Paul, R. (1993), Critical Thinking: How to Prepare Students for a Rapidly Changing World, The Foundation for Critical Thinking, Santa Rosa.

Paul, R. and Elder, L. (2008), “Critical thinking: the art of Socratic questioning, part III”, Journal of Developmental Education, Vol. 31 No. 3, pp. 34-35.

Paul, R. and Elder, L. (2012), Critical Thinking: Tools for Taking Charge of Your Learning and Your Life, 3rd ed., Pearson/Prentice Hall, Boston.

Profetto-McGrath, J. (2005), “Critical thinking and evidence-based practice”, Journal of Professional Nursing, Vol. 21 No. 6, pp. 364-371, doi: 10.1016/j.profnurs.2005.10.002.

Rahman, A. and Applebaum, R. (2011), “What's all this about evidence-based practice? The roots, the controversies, and why it matters”, American Society on Aging, viewed 3 May 2017, available at: https://www.asaging.org/blog/whats-all-about-evidence-based-practice-roots-controversies-and-why-it-matters.

Rieger, K., Chernomas, W., McMillan, D., Morin, F. and Demczuk, L. (2015), “The effectiveness and experience of arts-based pedagogy among undergraduate nursing students: a comprehensive systematic review protocol”, JBI Database of Systematic Reviews and Implementation Reports, Vol. 13 No. 2, pp. 101-124, doi: 10.11124/jbisrir-2015-1891.

Robert, R.R. and Petersen, S. (2013), “Critical thinking at the bedside: providing safe passage to patients”, Medsurg Nursing, Vol. 22 No. 2, pp. 85-118.

Roberts, S.T., Vignato, J.A., Moore, J.L. and Madden, C.A. (2009), “Promoting skill building and confidence in freshman nursing students with a skills-a-thon”, Educational Innovations, Vol. 48 No. 8, pp. 460-464, doi: 10.3928/01484834-20090518-05.

Romeo, E. (2010), “Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance”, Journal of Nursing Education, Vol. 49 No. 7, pp. 378-386, doi: 10.3928/01484834-20100331-05.

Sackett, D., Rosenberg, W., Gray, J., Haynes, R. and Richardson, W. (1996), “Evidence-based medicine: what it is and what it isn't”, British Medical Journal, Vol. 312 No. 7023, pp. 71-72, doi: 10.1136/bmj.312.7023.71.

Saintsing, D., Gibson, L.M. and Pennington, A.W. (2011), “The novice nurse and clinical decision-making: how to avoid errors”, Journal of Nursing Management, Vol. 19 No. 3, pp. 354-359.

Saldana, J. (2009), The Coding Manual for Qualitative Researchers, Sage, Los Angeles.

Scheffer, B. and Rubenfeld, M. (2000), “A consensus statement on critical thinking in nursing”, Journal of Nursing Education, Vol. 39 No. 8, pp. 352-359.

Stanley, M.C. and Dougherty, J.P. (2010), “A paradigm shift in nursing education: a new model”, Nursing Education Perspectives, Vol. 31 No. 6, pp. 378-380, doi: 10.1043/1536-5026-31.6.378.

Swing, V.K. (2014), “Early identification of transformation in the proficiency level of critical thinking skills (CTS) for the first-semester associate degree nursing (ADN) student”, doctoral thesis, Capella University, Minneapolis, viewed 3 May 2017, ProQuest Dissertations & Theses database.

Turner, P. (2005), “Critical thinking in nursing education and practice as defined in the literature”, Nursing Education Perspectives, Vol. 26 No. 5, pp. 272-277.

Twibell, R., St Pierre, J., Johnson, D., Barton, D., Davis, C. and Kidd, M. (2012), “Tripping over the welcome mat: why new nurses don't stay and what the evidence says we can do about it”, American Nurse Today, Vol. 7 No. 6, pp. 1-10.

Watson, G. and Glaser, E.M. (1980), Watson-Glaser Critical Thinking Appraisal, Psychological Corporation, San Antonio.

Wittmann-Price, R.A. (2013), “Facilitating learning in the classroom setting”, in Wittmann-Price, R.A., Godshall, M. and Wilson, L. (Eds), Certified Nurse Educator (CNE) Review Manual, Springer Publishing, New York, NY, pp. 19-70.


What is Evidence-Based Practice in Nursing? (With Examples, Benefits, & Challenges)


Are you a nurse looking for ways to increase patient satisfaction, improve patient outcomes, and impact the profession? Have you found yourself caught between traditional nursing approaches and new patient care practices? Although evidence-based practices have been used for years, this concept is the focus of patient care today more than ever. Perhaps you are wondering, “What is evidence-based practice in nursing?” In this article, I will share information to help you begin understanding evidence-based practice in nursing, plus 10 examples of how to implement EBP.

What is Evidence-Based Practice in Nursing?

Article contents:

  • When was evidence-based practice first introduced in nursing?
  • Who introduced evidence-based practice in nursing?
  • What is the difference between evidence-based practice in nursing and research in nursing?
  • What are the benefits of evidence-based practice in nursing? (top 5 benefits to the patient; top 5 benefits to the nurse; top 5 benefits to the healthcare organization)
  • 10 strategies nursing schools employ to teach evidence-based practices: 1. Assigning case studies; 2. Journal clubs; 3. Clinical presentations; 4. Quizzes; 5. On-campus laboratory intensives; 6. Creating small work groups; 7. Interactive lectures; 8. Teaching research methods; 9. Requiring collaboration with a clinical preceptor; 10. Research papers
  • What are the 5 main skills required for evidence-based practice in nursing? 1. Critical thinking; 2. Scientific mindset; 3. Effective written and verbal communication; 4. Ability to identify knowledge gaps; 5. Ability to integrate findings into practice relevant to the patient’s problem
  • What are the 5 main components of evidence-based practice in nursing? 1. Clinical expertise; 2. Management of patient values, circumstances, and wants when deciding to utilize evidence for patient care; 3. Practice management; 4. Decision-making; 5. Integration of best available evidence
  • What are some examples of evidence-based practice in nursing? 1. Elevating the head of a patient’s bed between 30 and 45 degrees; 2. Implementing measures to reduce impaired skin integrity; 3. Implementing techniques to improve infection control practices; 4. Administering oxygen to a client with chronic obstructive pulmonary disease (COPD); 5. Avoiding frequently scheduled ventilator circuit changes; 6. Updating methods for bathing inpatient bedbound clients; 7. Performing appropriate patient assessments before and after administering medication; 8. Restricting the use of urinary catheterizations, when possible; 9. Encouraging well-balanced diets as soon as possible for children with gastrointestinal symptoms; 10. Implementing and educating patients about safety measures at home and in healthcare facilities
  • How to use evidence-based knowledge in nursing practice: Step 1: Assessing the patient and developing clinical questions; Step 2: Finding relevant evidence to answer the clinical question; Step 3: Acquiring evidence and validating its relevance to the patient’s specific situation; Step 4: Appraising the quality of evidence and deciding whether to apply the evidence; Step 5: Applying the evidence to patient care; Step 6: Evaluating effectiveness of the plan
  • 10 major challenges nurses face in the implementation of evidence-based practice: 1. Not understanding the importance of the impact of evidence-based practice in nursing; 2. Fear of not being accepted; 3. Negative attitudes about research and evidence-based practice in nursing and its impact on patient outcomes; 4. Lack of knowledge on how to carry out research; 5. Resource constraints within a healthcare organization; 6. Work overload; 7. Inaccurate or incomplete research findings; 8. Patient demands do not align with evidence-based practices in nursing; 9. Lack of internet access while in the clinical setting; 10. Some nursing supervisors/managers may not support the concept of evidence-based nursing practices
  • 12 ways nurse leaders can promote evidence-based practice in nursing: 1. Be open-minded when nurses on your teams make suggestions; 2. Mentor other nurses; 3. Support and promote opportunities for educational growth; 4. Ask for increased resources; 5. Be research-oriented; 6. Think of ways to make your work environment research-friendly; 7. Promote EBP competency by offering strategy sessions with staff; 8. Stay up-to-date about healthcare issues and research; 9. Actively use information to demonstrate EBP within your team; 10. Create opportunities to reinforce skills; 11. Develop templates or other written tools that support evidence-based decision-making; 12. Review evidence for its relevance to your organization
  • Bonus: 8 top suggestions from a nurse to improve your evidence-based practices in nursing: 1. Subscribe to nursing journals; 2. Offer to be involved with research studies; 3. Be intentional about learning; 4. Find a mentor; 5. Ask questions; 6. Attend nursing workshops and conferences; 7. Join professional nursing organizations; 8. Be honest with yourself about your ability to independently implement evidence-based practice in nursing
  • Useful resources to stay up to date with evidence-based practices in nursing: professional organizations & associations; blogs/websites; YouTube videos
  • My final thoughts
  • Frequently asked questions answered by our expert: 1. What did nurses do before evidence-based practice? 2. How did Florence Nightingale use evidence-based practice? 3. What is the main limitation of evidence-based practice in nursing? 4. What are the common misconceptions about evidence-based practice in nursing? 5. Are all types of nurses required to use evidence-based knowledge in their nursing practice? 6. Will lack of evidence-based knowledge impact my nursing career? 7. I do not have access to research databases; how do I improve my evidence-based practice in nursing? 8. Are there different levels of evidence-based practices in nursing?

  • Level One: Meta-analysis of random clinical trials and experimental studies
  • Level Two: Quasi-experimental studies. These are focused studies used to evaluate interventions.
  • Level Three: Non-experimental or qualitative studies
  • Level Four: Opinions of nationally recognized experts based on research
  • Level Five: Opinions of individual experts based on non-research evidence such as literature reviews, case studies, organizational experiences, and personal experiences

9. How Can I Assess My Evidence-Based Knowledge In Nursing Practice?


Enhancing Critical Thinking in Clinical Practice

Implications for critical and acute care nurses.

Shoulders, Bridget MS, ACNP-BC, CCRN-CMC; Follett, Corrinne MS, FNP-BC, CCRN, RN-BC, RCIS; Eason, Joyce MS, ANP-BC, RN-BC

Bridget Shoulders, MS, ACNP-BC, CCRN-CMC , is a nurse practitioner in the cardiology department at the James A. Haley VA Hospital in Tampa, Florida.

Corrinne Follett, MS, FNP-BC, CCRN, RN-BC, RCIS, is a nurse practitioner in the cardiology department at the James A. Haley VA Hospital in Tampa, Florida.

Joyce Eason, MS, ANP-BC, RN-BC, is a nurse practitioner in the cardiology department at the James A. Haley VA Hospital in Tampa, Florida.

The authors have disclosed that they have no significant relationship with, or financial interest in, any commercial companies pertaining to this article.

Address correspondence and reprint requests to: Bridget Shoulders, MS, ACNP-BC, 31047 Whitlock Dr, Wesley Chapel, FL 33543 ( [email protected] ).

The complexity of patients in the critical and acute care settings requires that nurses be skilled in early recognition and management of rapid changes in patient condition. The interpretation and response to these events can greatly impact patient outcomes. Nurses caring for these complex patients are expected to use astute critical thinking in their decision making. The purposes of this article were to explore the concept of critical thinking and provide practical strategies to enhance critical thinking in the critical and acute care environment.

The complexity of patients in the critical and acute care settings requires that nurses be skilled in early recognition and management of rapid changes in patients’ condition. Caring for patients with complex conditions, decreased length of stay, sophisticated technology, and increasing demands on time challenges new and experienced nurses alike to use astute critical thinking in clinical decision making. The decisions made directly affect patient care outcomes. 1 Bedside nurses, preceptors, and nurse leaders play a pivotal role in the development of critical thinking ability in the clinical setting. The purposes of this article were to explore the concept of critical thinking and to provide nurses with practical strategies to enhance critical thinking in clinical practice.

WHAT IS CRITICAL THINKING?

Critical thinking is a learned process 2 that occurs within and across all domains. There are numerous definitions of critical thinking in the literature, often described in terms of its components, features, and characteristics. Peter Facione, an expert in the field of critical thinking, led a group of experts from various disciplines to establish a consensus definition of critical thinking. The Delphi Report, 3 published in 1990, characterized the ideal critical thinker as “habitually inquisitive, well-informed, trustful of reason…, diligent in seeking relevant information, and persistent in seeking results.” Although this definition was the most comprehensive attempt to define critical thinking 4 at the time, it was not nursing specific.

Scheffer and Rubenfeld 4 used the Delphi technique to define critical thinking in nursing. An international panel of expert nurses in practice, education, and research provided input into what habits of the mind and cognitive skills were at the core of critical thinking. After discussion and analysis, the panel provided the following consensus statement: “Critical thinking in nursing is an essential component of professional accountability and quality nursing care. Critical thinkers in nursing exhibit these habits of the mind: confidence, contextual perspective, creativity, flexibility, inquisitiveness, intellectual integrity, intuition, open-mindedness, perseverance, and reflection. Critical thinkers in nursing practice the cognitive skills of analyzing, applying standards, discriminating, information seeking, logical reasoning, predicting and transforming knowledge.” This definition expanded on the consensus definition in the Delphi Report to include the additional components of creativity and intuition.

Nurses skilled in critical thinking respond quickly to changes in patients’ conditions, changing priorities of care based on the urgency of the situation. They accurately interpret data, such as subtle changes in vital signs or laboratory values. 5 They are not just looking at the numbers but also assessing the accuracy and relevancy of the findings. Critical thinking helps the nurse to recognize events as part of the bigger picture and home in on the problem.

Lack of critical thinking is evident when nurses depend heavily on structured approaches, such as protocols, to make clinical decisions. These guidelines should not be viewed as mandates because the practice is always more complex than what can be captured by pathways and protocols. 6 Without critical thinking, nurses are merely performing task-oriented care.

One example of how nurses use critical thinking is with medication administration. This task may appear to be primarily a technical process, but it requires astute critical thinking. Eisenhauer and Hurley 7 interviewed 40 nurses to illustrate their thinking processes during medication administration. The nurses described communicating with providers, sharing their interpretation of patient data to ensure safe administration of medication. They used their judgment about the timing of as-needed medication (eg, timing pain medication before physical therapy). Nurses integrated their knowledge of the patient’s laboratory values or pattern of response to medication to determine the need for a change in the drug dose or time. They assessed whether a medication was achieving the desired effect and took precautionary measures in anticipating potential side effects. It is evident in these examples that safe administration of medication involves critical thinking beyond the 5 rights that nurses are taught in the academic setting.

INTEGRATING RESEARCH, EVIDENCE-BASED PRACTICE, AND CRITICAL THINKING

Nursing research is a scientific process that validates and refines existing knowledge and generates new knowledge that influences nursing practice. 8 Evidence-based practice integrates the best available research with clinical expertise and patient’s needs and values. Different types of evidence have different strengths and weaknesses in terms of credibility. The typical evidence hierarchy places meta-analysis of randomized clinical trials at the top and expert opinion at the bottom of what counts as good evidence. 6

It is important to recognize that nursing knowledge is not always evidence based. Nurses have historically acquired knowledge through a variety of nonscientific sources such as trial and error, role modeling, tradition, intuition, and personal experiences. 8 Although these sources have been “handed down” over the years and continue to influence nursing practice, nurses are expected to use the best available evidence to guide their decision making. Evidence-based practice redirects nursing from making decisions based on tradition to practicing based on the best research evidence.

Barriers for nurses to implement evidence-based practices include lack of knowledge of research, difficulty interpreting findings and applying to practice, lack of time, and lack of autonomy to implement changes. 9 Universities can overcome these barriers by incorporating nursing research throughout all clinical and nonclinical courses. Joint endeavors between hospitals and universities to educate nurses in the use of research will increase the level of comfort with evidence-based practice. 10 Specialized research departments devoted to promotion and education of staff nurses in research evaluation, utilization, and implementation would allow nursing staff to experience an increased level of support and awareness of the need for research utilization.

Nurse leaders need to create an environment that supports transformation from outdated practices and traditions. Nurses must feel empowered to question nursing practice and have available resources to support the search for evidence. Critical thinking and evidence-based practice must be connected and integrated for nurses, starting in their basic education programs and fostered throughout their lifetime. 11

THE NURSING PROCESS AND CRITICAL THINKING

The nursing process is the nurse’s initial introduction to a thinking process used to collect, analyze, and solve patient care problems. The steps of the nursing process are similar to the scientific method. In both processes, information is gathered, observations are made, problems are identified, plans are developed, actions are taken, and processes are reviewed for effectiveness. 8 The nursing process, used as a framework for making clinical judgments, helps guide nurses to think about what they do in their practice.

Chabeli 12 described how critical thinking can be facilitated using the framework of the nursing process. During the assessment phase, the nurse systematically gathers information to identify the chief complaint and other health problems. The nurse uses critical thinking to examine and interpret the data, separating the relevant from the irrelevant and clarifying the meaning when necessary. During the diagnosis phase, nurses use the diagnostic reasoning process to draw conclusions and decide whether nursing intervention is indicated. The planning and implementation of interventions should be mutual, research based, and realistic and have measurable expected outcomes. The evaluation phase addresses the effectiveness of the plan of care and is ongoing as the patient progresses toward goal achievement. The author concludes that when the nursing process is used effectively for the intended purpose, it is a powerful scientific vehicle for facilitating critical thinking.

HOW DO WE LEARN CRITICAL THINKING IN NURSING?

Nurses initially learn to think critically in the academic environment, using assessments designed to measure critical thinking. It is conceivable that a nurse could pass an examination in the classroom but have difficulty making the transition to think critically in the clinical setting. Improving critical thinking ability should be viewed as a process and, as with the development of any skill, requires practice. 13

Most nurses develop their critical thinking ability as they gain clinical expertise. Patricia Benner 14 described the development of clinical expertise, as nurses transition from novice to expert. The beginning, or novice nurse, has theoretical knowledge as a foundation and minimal practical experiences to draw from. As similar situations are encountered, experience is accrued over time as the nurse evolves toward competency. As proficiency is developed, the nurse is able to perceive situations as a whole and recognize the significant aspects. As the proficient nurse reaches toward expertise, decision making becomes automatic, drawing from the enormous background of experience acquired over the years. Experience is more than the passage of time and is required at each stage before progressing to the next level of clinical expertise. As nurses progress along the novice-to-expert continuum and gain competence, they develop their ability to think critically. 15

Preceptors play a significant role in transitioning nurses into professional practice. It is essential that preceptors have the necessary skills to facilitate the critical thinking development of new nurses. Forneris and Peden-McAlpine 16 investigated the impact of the preceptor’s coaching component of a reflective learning intervention on novice nurses’ critical thinking skills. The following coaching strategies were used to educate preceptors: context (eg, understanding the big picture), dialogue, reflection, and time (eg, the use of past experiences to discern change over time). After completing the educational intervention, the preceptors used these strategies to coach the novice nurses in the development of their critical thinking skills. This study found that these strategies stimulated the novice nurses to engage in an intentional, reflective dialogue. The preceptors acknowledged a change in their preceptor style, moving from describing critical thinking as prioritizing and organizing tasks to engaging in a dialogue to share thinking and understand rationale.

Nurses must have the necessary dispositions (eg, attributes, attitudes, habits of the mind) to be effective critical thinkers. 11 Finn 17 defined thinking dispositions that influence critical thinking. Open mindedness was described as the willingness to seek out and consider new evidence or possibilities. Fair mindedness referred to an unprejudiced examination of evidence that might question beliefs or a viewpoint contrary to the nurse’s own beliefs. Reflectiveness was described as the willingness to gather relevant evidence to carefully evaluate an issue, rather than making hasty judgments. Counterfactual thinking referred to the willingness to ponder what could or would happen if the facts were considered under different conditions or perspectives. The opposite thinking styles directed toward maintaining the status quo included being close minded, biased, and rigid.

Rung-Chuang et al 18 investigated the critical thinking competence and disposition of nurses at different rankings on the clinical ladder. Using Benner’s novice-to-expert model as their theoretical framework, they classified a stratified random sample of 2300 nurses working at a medical center according to their position on the clinical ladder. Ten to fifteen percent of this population were randomly selected for each ladder group, with the final sample size totaling 269. Data were collected using a modified version of the Watson-Glaser Critical Thinking Appraisal tool, designed to assess critical thinking competence in the categories of inference, recognition of assumptions, deduction, interpretation, and evaluation. The participants’ cumulative average score for critical thinking competence was 61.8 of a possible 100, ranking highest in interpretation and lowest in inference. Participants also completed a modified version of the California Critical Thinking Disposition Inventory, designed to measure the following characteristics of critical thinking: inquisitiveness, systematic analytical approach, open-mindedness, and reflective thinking. Participants scored highest in reflective thinking and lowest in inquisitiveness.

Analysis of the data indicated that older nurses with more years of experience and a more prominent position on the clinical ladder were predictive of a higher critical thinking disposition. Overall, critical thinking was shown to be only partially developed. The authors recommended training programs, such as problem-based learning, group discussion, role-playing, and concept mapping be adopted to enhance nurse critical thinking skills.

Chang et al 19 examined the relationship between critical thinking and nursing competence, using the Watson-Glaser Critical Thinking Appraisal and the Nursing Competence Scale. A total of 570 clinical nurses participated in the study. These nurses scored highest in interpretation ability and lowest in inference ability, findings consistent with the results reported in the Rung-Chuang study. Analysis of the data indicated that critical thinking ability was significantly higher in older nurses and in nurses with more than 5 years of experience. The findings of this study indicated that critical thinking ability, working years, position/title, and education level were the 4 significant predictors of nursing competence. There were significantly positive correlations between critical thinking ability and nursing competence, indicating that the higher the critical thinking ability, the better the nursing competence.
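The positive association reported by Chang et al is a correlation between two scale scores, which can be illustrated with a standard Pearson coefficient. The sketch below uses made-up scores (not the study’s data) purely to show the computation.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical critical thinking and competence scores (illustrative only)
critical_thinking = [55, 60, 62, 68, 71, 75, 80]
competence = [70, 74, 73, 80, 85, 84, 90]

# A positive r indicates higher critical thinking accompanies higher competence
r = pearson_r(critical_thinking, competence)
```

A coefficient near +1 would correspond to the strongly positive relationship the study describes, although the published analysis also controlled for experience, position, and education.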

STRATEGIES TO ENHANCE CRITICAL THINKING ABILITY

To improve critical thinking, the learning needs of nurses must first be identified. The Performance-Based Development System, a scenario-based tool, was used in a study to identify the critical thinking learning needs of 2144 new and experienced nurses. 20 Results were reported as either meeting (identifying the appropriate actions) or not meeting the expectations. Most participants (74.9%) met the expectations by identifying the appropriate actions. Of the approximately 25% who did not meet the expectations, the learning needs identified included initiating appropriate nursing interventions (97.2%), differentiating urgency (67%), reporting essential clinical data (65.4%), anticipating relevant medical orders (62.8%), understanding decision rationale (62.6%), and problem recognition (57.1%). As expected, nurses with the most experience had the highest rate of identifying the appropriate actions on the Performance-Based Development System assessment. These findings were consistent with Benner’s novice to expert framework. These types of assessment tools can be used to identify learning needs and help facilitate individualized orientation. The authors acknowledged that further research is needed to identify areas of critical thinking deficiency and to test objective, educational strategies that enhance critical thinking in the nursing population.

The Institute of Medicine report on the future of nursing 21 emphasized the importance of nursing residency programs to provide hands-on experience for new graduates transitioning into practice. According to the report, these programs have been shown to help new nurses develop critical competencies in clinical decision making (eg, critical thinking) and autonomy in providing patient care. Implementing successful methods to expedite the development of critical thinking in new nurses has the potential to improve patient safety, nurse job satisfaction, and recruitment and retention of competent nurse professionals. 22

Although critical thinking skills are developed through clinical practice, there are many experienced nurses who possess less than optimal critical thinking skills. 5 As part of an initiative to elevate the critical thinking of nurses on the frontline, Berkow et al 23 reported the development of the Critical Thinking Diagnostic, a tool designed to assess critical thinking of experienced nurses. The tool includes 25 competencies, identified by nursing leaders as core skills at the heart of critical thinking. These competencies were grouped into 5 components of critical thinking: problem recognition, clinical decision making, prioritization, clinical implementation, and reflection. The potential application of this tool may enable nurse leaders to identify critical thinking strengths and individualize learning activities based on the specific needs of nurses on the frontline.

The critical thinking concepts, identified in the Delphi study of nurse experts, were used to teach critical thinking in a continuing education course. 24 The objective of the course was to help nurses develop the cognitive skills and habits of the mind considered important for practice. The course focused on the who, what, where, when, why, and how of critical thinking, using the case study approach. The authors concluded that critical thinking courses should include specific strategies for application of knowledge and opportunities to use cognitive strategies with clinical simulations.

Journal clubs encourage evidence-based practice and critical thinking by introducing nurses to new developments and broader perspectives of health care. 11 Lehna et al 25 described the virtual journal club (VJC) as an alternative to the traditional journal club meetings. The VJC uses an online blog format to post research-based articles and critiques, for generation of discussion by nurses. Recommendations for practice change derived from the analysis are forwarded to the appropriate decision-making body for consideration. The VJC not only exposes the nursing staff to scientific evidence to support changing their practice but also may lead to institutional policy changes that are based on the best evidence. The VJC overcomes the limitations of the traditional journal clubs by being available to all nurses at all times.

The integration of simulation technology in nursing exposes nursing students and nurses to complex patient care scenarios in a safe environment. Kirkman 26 reported a study investigating nursing students’ ability to transfer knowledge and skill learned during high-fidelity simulations to the clinical setting over time. The sample of 42 undergraduate students was rated on the ability to perform a respiratory assessment, using observation and a performance evaluation tool. The findings indicated a significant difference in transfer of learning demonstrated by participants over time. These results provide evidence that students were able to transfer knowledge and skills from high-fidelity simulations to the traditional clinical setting.

Jacobson et al 27 reported using simulated clinical scenarios to increase nurses’ perceived confidence and skill in handling emergency situations. During a 7-month period, the scenarios were conducted a total of 97 times with staff nurses. Each scenario presented a patient’s evolving story to challenge nurses to assess and synthesize the clinical information. The scenarios included a critical point at which the nurses needed to recognize and respond to significant deterioration in the patient’s condition. Postproject survey data found that most of the nurses perceived an improvement in their confidence and skill in managing emergency situations. More than half of the nurses reported that their critical thinking skills improved because of participation in this project.

Individual nurses can enhance critical thinking by developing a questioning attitude and habits of inquiry, where there is an appreciation and openness to other ways of doing things. Nurses should routinely reflect on the care provided and the outcomes of their interventions. Using reflection encourages nurses to think critically about what they do in everyday practice and learn from their experiences. 28 This strategy is beneficial for nurses to validate knowledge and examine nursing practice. 5 Nurses must be comfortable with asking and being asked “why” and “why not.” Seeking new knowledge and updating or refining current knowledge encourage critical thinking by practicing based on the evidence. “We’ve always done it that way” is no longer an acceptable answer. A list of other useful strategies for enhancing critical thinking is included in Table 1 .

[Table 1]

USING THE INTERACTIVE CASE STUDY APPROACH TO ENHANCE CRITICAL THINKING

Case studies provide a means to attain experience in high-risk and complex situations in a safe environment. The purpose of a case study is to apply acquired knowledge to a specific patient situation, using actual or hypothetical scenarios. Waxman and Telles 32 discussed using Benner’s model to develop simple to complex scenarios that match the learning level of the nurse. The case study should ideally provide all the relevant information for analysis, without directing the nurse’s thinking in a particular direction. Participants are encouraged to use thinking processes similar to those used in a real situation.

A well-developed case study defines objectives and expected outcomes. The questions should be geared toward the outcomes to be met. 30 The focus of the questions should be on the underlying thought processes used to arrive at the answer, rather than the answer alone. This helps nurses identify the reasons behind why a decision is made. In some cases, the case study may build on the information shared, instead of presenting all the information at one time. At the very least, case studies should have face validity or represent what they were developed to represent. 33

Case studies can be developed for specific purposes, such as analyzing data or improving the nurse’s skill in responding to specific clinical situations. 30 This strategy can be useful in building nurses’ confidence in managing complex or emergency situations. The case can be tailored to specific patient populations or clinical events. Covering the course of care that a patient receives over time is effective in putting together the whole picture. 31 For the purpose of improving patient outcomes, the case study should represent the overall patient experience. Case studies may be used to review specific actions that led to positive outcomes or the processes that led to negative outcomes. This can help determine if the care was the most appropriate for the situation. 34

The use of case studies with simulation technology provides nurses with the opportunity to think critically through a critical situation in a controlled setting. The latest human patient simulators (HPSs) are programmed to respond to the nurse’s intervention, with outcomes determined as a result of the intervention. Howard et al 35 compared the teaching strategies of HPSs and the traditional interactive case study (ICS) approach, using scenarios with the same subject matter. A sample of 49 senior nursing students was given a pretest and posttest designed to measure the students’ knowledge of the content presented and their ability to apply that content to clinical problems. Participants in the HPS group scored significantly higher on the posttest than the ICS group did. Students reported that the HPS assisted them in understanding concepts, was a valuable learning experience, and helped to stimulate their critical thinking. There was no significant difference between the HPS and ICS groups’ responses to the statement that the educational intervention was realistic.
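Between-group posttest comparisons like the one Howard et al report are commonly tested with an independent-samples t statistic. As a rough sketch under invented assumptions (the scores and group sizes below are hypothetical, not the published data), a Welch t value can be computed as:

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance, group a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)  # sample variance, group b
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Hypothetical posttest scores for the two teaching strategies
hps = [85, 88, 90, 82, 87, 91]  # human patient simulator group
ics = [78, 80, 83, 75, 79, 81]  # interactive case study group

t = welch_t(hps, ics)  # a large positive t favors the HPS group
```

A t value well above the critical value for the relevant degrees of freedom would correspond to the "significantly higher" posttest scores reported for the HPS group.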

The Figure depicts an example of a heart failure case study with the objective of applying critical thinking to a common problem encountered in practice. Expert clinical nurses would be ideal to serve as facilitators of this learning experience. Their role would be to present the scenario, describe the physiological findings, ask open-ended questions that require thinking and analysis, and guide the discussion and problem-solving process. Discussion and questioning strategies that are helpful in eliciting reflective responses during the learning experience are included in Table 2 . This case study could be tailored to meet the learning needs of the target audience.

[Table 2]

THE INFLUENCE OF THE WORKPLACE ENVIRONMENT

The workplace environment can enhance or hinder nurses’ motivation to develop their critical thinking abilities. Cornell and Riordan 36 reported an observational study that assessed workflow barriers to critical thinking in the workplace. A total of 2061 tasks were recorded on an acute care unit during 35.7 hours of observation. The activities found to consume nearly 70% of the nurses’ time included verbal communication, walking, administering medications, treatments, and documentation. Nurse workflow was characterized by frequent task switching, interruptions, and unpredictability. The authors recommended reallocating duties, delegating appropriate tasks to nonnursing personnel, reducing waste, deploying technology that reduces repetitive tasks, and continuing education and training to help nurses cope with the complex demands of nursing.

Factors in the work environment conducive to the development of critical thinking include an atmosphere of team support, staffing patterns that allow continuity of care, and exposure to a variety of patient care situations. Creating an environment where contributions are valued, nurses feel respected, and there is comfort with asking probing questions is very important in enhancing the development of critical thinking skills.

Critical thinking is an essential skill that impacts the entire spectrum of nursing practice. Studies have shown that the higher the critical thinking ability, the better the nursing competence. It is essential that the critical thinking of new and experienced nurses be assessed and learning activities developed based on the specific needs of the nurses. The concept of critical thinking should be included in orientation, ongoing education, and preceptor preparation curricula. These educational offerings should be designed to help nurses develop the cognitive skills and habits of the mind considered important for practice.

Bedside nurses can integrate a critical thinking approach by developing clinical expertise, making a commitment to lifelong learning, and practicing based on the evidence. Nurses should routinely reflect on the care provided and the outcomes of their interventions.

Further research is needed to identify areas of critical thinking deficiency and evaluate strategies aimed at enhancing critical thinking. These strategies will ultimately lead to improved clinical decision making and patient outcomes. Bedside nurses, preceptors, and nurse leaders are encouraged to work together collaboratively to create a culture where critical thinking is an integral part of nursing practice.

Keywords: Acute care; Critical thinking; Decision making



Evidence-Based Practice and Nursing Research

Evidence-based practice is now widely recognized as the key to improving healthcare quality and patient outcomes. Although the purposes of nursing research (conducting research to generate new knowledge) and evidence-based nursing practice (utilizing the best evidence as the basis of nursing practice) seem quite different, an increasing number of research studies have been conducted with the goal of translating evidence effectively into practice. Clearly, evidence from research (effective innovation) must be accompanied by effective implementation and an enabling context to achieve significant outcomes.

As mentioned by Professor Rita Pickler, “nursing science needs to encompass all manner of research, from discovery to translation, from bench to bedside, from mechanistic to holistic” (Pickler, 2018). I feel that The Journal of Nursing Research must provide an open forum for all kinds of research in order to help bridge the gap between research-generated evidence and clinical nursing practice and education.

In this issue, an article by Professor Ying-Ju Chang and colleagues at National Cheng Kung University presents an evidence-based practice curriculum for undergraduate nursing students developed using an action research-based model. This “evidence-based practice curriculum” spans all four academic years, integrates coursework and practicums, and sets different learning objectives for students at different grade levels. Also in this issue, Yang et al. apply a revised standard care procedure to increase the ability of critical care nurses to verify the placement of nasogastric tubes. After appraising the evidence, the authors conclude that the aspirate pH test is the most reliable and economical method for verifying nasogastric tube placement at the bedside. They subsequently develop a revised standard care procedure and a checklist for auditing the procedure, conduct education for nurses, and examine the effectiveness of the revised procedure.

I hope that these two studies help us all better appreciate that, in addition to innovation and new breakthrough discoveries, curriculum development and evidence-based quality improvement projects, though they may not seem novel, are also important areas of nursing research. Translating evidence into practice is sound science and merits more research.

Cite this article as: Chien, L. Y. (2019). Evidence-based practice and nursing research. The Journal of Nursing Research, 27 (4), e29. https://doi.org/10.1097/jnr.0000000000000346

  • Pickler R. H. (2018). Honoring the past, pursuing the future. Nursing Research, 67(1), 1–2. https://doi.org/10.1097/NNR.0000000000000255

Immediate Versus Delayed Low-Stakes Questioning: Encouraging the Testing Effect Through Embedded Video Questions to Support Students’ Knowledge Outcomes, Self-Regulation, and Critical Thinking

  • Original research
  • Open access
  • Published: 30 July 2024


  • Joseph T. Wong (ORCID: orcid.org/0000-0003-1890-6284)
  • Lindsey Engle Richland
  • Bradley S. Hughes


In light of the educational challenges brought about by the COVID-19 pandemic, there is a growing need to bolster online science teaching and learning by incorporating evidence-based pedagogical principles of Learning Experience Design (LXD). As a response to this, we conducted a quasi-experimental, design-based research study involving N = 183 undergraduate students enrolled across two online classes in an upper-division course on Ecology and Evolutionary Biology at a large R1 public university. The study extended over a period of 10 weeks, during which half of the students encountered low-stakes questions immediately embedded within the video player, while the remaining half received the same low-stakes questions after viewing all the instructional videos within the unit. Consequently, this study experimentally manipulated the timing of the questions across the two class conditions. These questions functioned as opportunities for low-stakes content practice and retention, designed to encourage learners to experience the testing effect and augment the formation of their conceptual understanding. Across both conditions, we assessed potential differences in total weekly quiz grades, page views, and course participation among students who encountered embedded video questions. We also assessed students’ self-reported engagement, self-regulation, and critical thinking. On average, the outcomes indicated that learners exposed to immediate low-stakes questioning exhibited notably superior summative quiz scores, increased page views, and enhanced participation in the course. Additionally, those who experienced immediate questioning demonstrated heightened levels of online engagement, self-regulation, and critical thinking. Moreover, our analysis delved into the interplay between treatment conditions, learners’ self-regulation, critical thinking, and quiz grades through a multiple regression model. Notably, the interaction between the immediate questioning condition and self-regulation emerged as a significant factor, suggesting that the influence of immediate questioning on quiz grades varies with learners’ self-regulation abilities. Collectively, these findings highlight the substantial positive effects of immediate questioning in online video lectures on both academic performance and cognitive skills within an online learning context. The discussion delves into the potential implications for institutions to continually refine their approach in order to effectively promote successful online science teaching and learning, drawing from the foundations of pedagogical learning experience design paradigms and the testing effect model.
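The interaction the authors describe (the effect of immediate questioning varying with self-regulation) corresponds to a condition by self-regulation product term in the regression. The sketch below simulates data with a known interaction and recovers it by ordinary least squares; all numbers are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical predictors: condition (0 = delayed, 1 = immediate questioning)
# and a self-regulation score on roughly a 1-5 scale.
condition = rng.integers(0, 2, size=n)
self_reg = rng.normal(3.5, 0.5, size=n)

# Simulated quiz scores with a condition x self-regulation interaction
quiz = 70 + 5 * condition + 4 * self_reg + 2 * condition * self_reg

# Design matrix: intercept, both main effects, and the interaction term
X = np.column_stack([np.ones(n), condition, self_reg, condition * self_reg])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)

# A nonzero interaction coefficient (last entry of beta) means the benefit of
# immediate questioning depends on the learner's self-regulation level.
```

In a real analysis the simulated outcome would be replaced by observed quiz grades, and the interaction coefficient would be accompanied by a standard error and significance test.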


1 Introduction

A recurring concern in traditional in-person and online courses is how best to maintain and sustain learners’ engagement throughout the learning process. Given the disruptions caused by the COVID-19 pandemic, these concerns were further exacerbated by competing “edtech” tools deployed in urgency to facilitate teaching and learning during a time of crisis. That is not to say that introducing “edtech” tools did not aid in supporting students’ learning trajectories during this period, but a major current concern is the widespread deployment of “edtech solutions” without proper alignment with evidence-based pedagogical learning frameworks (Asad et al., 2020; Chick et al., 2020; Sandars et al., 2020), and without evidence of whether the deployed tools were having the intended effect of supporting student learning. Between 2020 and 2022, the United States government distributed $58.4 billion through the Higher Education Emergency Relief Fund to public universities, which spent more than $1.2 billion on distance learning technologies (EDSCOOP, 2023; O’leary & June, 2023). Educational technology spending by universities included expenditures on software licenses, hardware (such as computers and tablets), learning management systems (LMS), online course development tools, audio-visual equipment, digital content, and various technology-related services. In light of the considerable resources dedicated to distance learning in recent years, the need to discern how to employ these “edtech” tools in a manner that is meaningful, impactful, and grounded in evidence-based pedagogies has grown substantially.

Higher education has been grappling with a myriad of technologies to deploy in order to support the exponential increase of undergraduates enrolled in online courses. Data from the United States in the fall of 2020 indicate that approximately 11.8 million (75%) undergraduate students were enrolled in at least one distance learning course, while 7.0 million (44%) of undergraduates exclusively took distance education courses (National Center for Education Statistics [NCES], 2022 ). In the Fall of 2021 with the return to in-person instruction, about 75% of all postsecondary degree seekers in the U.S. took at least some online classes with around 30% studying exclusively online (NCES, 2022 ). In the aftermath of the pandemic, the proportion of students engaged in online courses has declined to 60%. Nevertheless, this figure remains notably higher than the levels seen in the pre-pandemic era (NCES, 2022 ). To meet the increasing demand, universities possess substantial opportunities to explore effective strategies for enhancing the online learning experiences of undergraduate students. However, it’s important to note that merely introducing new tools into instructors’ technological toolkit may not be enough to foster impactful teaching and learning.

To address these concerns, this study employs a quasi-experimental design, implementing embedded video questions into an asynchronous undergraduate Biology course, anchored in the Learning Experience Design (LXD) pedagogical paradigm. The objective is to assess the effectiveness of the embedded video question assessment platform, utilizing video technologies and employing design-based research (DBR) methodologies to evaluate practical methods for fostering active learning in online educational settings. While video content integration in education is recognized as valuable for capturing learners’ attention and delivering complex concepts (Wong et al., 2023 , 2024 ), passive consumption of videos may not fully harness their potential to promote active learning and deeper engagement (Mayer, 2017 , 2019 ). Embedded video questions provide an avenue to transform passive viewing into an interactive and participatory experience (Christiansen et al., 2017 ; van der Meij & Bӧckmann, 2021 ). By strategically embedding thought-provoking questions within video segments, educators can prompt students to reflect on the material, assess comprehension, and immediately evaluate conceptual understanding. Additionally, analyzing the timing and placement of these questions within a video lesson may yield valuable insights into their effectiveness of facilitating the testing effect, a process in which implementing low-stakes retrieval practice over a period of time can help learners integrate new information with prior knowledge (Carpenter, 2009 ; Littrell-Baez et al., 2015 ; Richland et al., 2009 ). Understanding how variations in timing influence student responses and comprehension levels can inform instructional strategies for optimizing the use of interactive elements in educational videos in fostering engagement and enhancing learning performance.

This study aimed to compare students who received low-stakes questions after watching a series of lecture videos with those who encountered questions immediately embedded within the video player. The objective was to identify differences in total weekly quiz scores, course engagement, as well as learning behaviors such as critical thinking and self-regulation over a span of 10 weeks. While previous studies have examined the efficacy of embedded video questions, few have considered the interrelation of these learning behaviors within the context of the Learning Experience Design (LXD) paradigm and the testing effect model for undergraduate science courses. These findings will contribute to a deeper understanding of evidence-based designs for asynchronous online learning environments and will help in evaluating the effectiveness of embedding video questions with regards to question timing within the LXD paradigm. Considering the increasing demand and substantial investment in online courses within higher education, this study aims to assess the effectiveness of a research-practice partnership in implementing embedded video questions into two courses. The ultimate aim is to determine whether this approach could serve as a scalable model for effectively meeting educational needs in the future.

2 Literature Review

2.1 Learning Experience Design

Learning Experience Design (LXD) encompasses the creation of learning scenarios that transcend the confines of traditional classroom settings, often harnessing the potential of online and educational technologies (Ahn, 2019 ). This pedagogical paradigm involves crafting impactful learning encounters that are centered around human needs and driven by specific objectives, aimed at achieving distinct learning results (Floor, 2018 , 2023 ; Wong & Hughes, 2022 ; Wong et al., 2024 ). LXD differs from the conventional pedagogical process of “instructional design,” which primarily focuses on constructing curricula and instructional programming for knowledge acquisition (Correia, 2021 ). Instead, LXD can be described as an interdisciplinary integration that combines principles from instructional design, pedagogical teaching approaches, cognitive science, learning sciences, and user experience design (Weigel, 2015 ). LXD extends beyond the boundaries of traditional educational settings, leveraging online and virtual technologies (Ahn, 2019 ). As a result, the primary focus of LXD is on devising learning experiences that are human-centered and geared toward specific outcomes (Floor, 2018 ; Wong & Hughes, 2022 ).

Practically, LXD is characterized by five essential components: Human-Centered Approach, Objective-Driven Design, Grounded in Learning Theory, Emphasis on Experiential Learning, and Collaborative Interdisciplinary Efforts (Floor, 2018 ). Taking a human-centered approach considers the needs, preferences, and viewpoints of the learners, resulting in tailored learning experiences where learners take precedence (Matthews et al., 2017 ; Wong & Hughes, 2022 ). An objective-driven approach to course design curates learning experiences that are intentionally structured to align specific objectives, making every learning activity purposeful and pertinent to support students’ learning experiences (Floor, 2018 ; Wong et al., 2022 ). LXD also is grounded in learning theories, such that the design process is informed by evidence-based practices drawn from cognitive science and learning sciences (Ahn et al., 2019 ). Furthermore, LXD places a large emphasis on experiential learning where active and hands-on learning techniques, along with real-world applications, facilitate deeper understanding and retention (Floor, 2018 , 2023 ; Wong et al., 2024 ). Lastly, LXD is interdisciplinary, bringing together professionals from diverse backgrounds, including instructional designers, educators, cognitive scientists, and user experience designers, to forge comprehensive and well-rounded learning experiences (Weigel, 2015 ). Each of these facets underscores the significance of empathy, where both intended and unintended learning design outcomes are meticulously taken into account to enhance learners’ experiences (Matthews et al., 2017 ; Wong & Hughes, 2022 ). Consequently, LXD broadens the scope of learning experiences, enabling instructors and designers to resonate with learners and enrich the repertoire of learning design strategies (Ahn et al., 2019 ; Weigel, 2015 ), thus synergizing with the utilization of video as a powerful tool for teaching and learning online. 
In tandem with the evolving landscape of educational practices, LXD empowers educators to adapt and enhance their methodologies, fostering successful and enriched learning outcomes (Ahn, 2019 ; Floor, 2018 , 2023 ; Wong et al., 2022 ), while also embracing the dynamic potential of multimedia educational technologies like video in delivering effective and engaging instructional content.

2.2 Video as a Tool for Teaching and Learning

Video and multimedia educational technologies have been broadly used as “edtech” tools for teaching and learning over the last three decades, during in-person instruction and especially now with online learning modalities (Cruse, 2006; Mayer, 2019). Educational videos, also referred to as instructional or explainer videos, serve as a modality for delivering teaching and learning through audio and visuals that demonstrate or illustrate the key concepts being taught. Multiple researchers have found evidence for the affordances of video-based learning, citing benefits including reinforcement of reading and lecture materials, aiding the development of a common base of knowledge for students, enhancing comprehension, providing greater accommodations for diverse learning preferences, increasing student motivation, and promoting teacher effectiveness (Corporation for Public Broadcasting [CPB], 1997, 2004; Cruse, 2006; Kolas, 2015; Wong et al., 2023; Wong et al., 2024; Yousef et al., 2014). Proponents in the field of video research also cite specific video design features that support students’ learning experiences, such as searching, playback, retrieval, and interactivity (Giannakos, 2013; Yousef et al., 2014; Wong et al., 2023b). A study by Wong et al. (2023b), based on a survey of more than 600 undergraduates during the pandemic, sheds light on the limitations of synchronous Zoom video lectures and underscores the advantages of well-designed asynchronous videos in online courses, which may better accommodate student learning needs compared to traditional synchronous learning. Mayer’s (2001, 2019) framework for multimedia learning provides a theoretical and practical foundation for how video-based learning modalities can be used as cognitive tools to support students’ learning experiences.
While some researchers have argued that videos are a passive mode of learning, Mayer (2001) explains that viewing educational videos involves the high cognitive activity required for active learning, but that this can only occur through well-designed multimedia instruction that specifically fosters cognitive processing in learners, even though learners may appear behaviorally inactive (Mayer, 2009, 2019). Following Mayer’s (2019) principles, we designed multimedia lessons supporting students’ cognitive processing through segmenting, pre-training, temporal contiguity, modality matching, and signaling, all implemented through asynchronous embedded video questions.

2.3 Embedded Video Questions

Embedded video questions are an educational technology design feature that adds interactive quizzing capabilities while students engage in video-based learning. They involve incorporating formative assessments directly within online videos, prompting viewers to answer questions at specific points in the content. While a video is in progress, students viewing it are prompted with questions designed to encourage increased engagement and deeper cognitive processing (Christiansen et al., 2017; Kovacs, 2016; Wong et al., 2023; van der Meij et al., 2021). This is similar to an Audience Response System (ARS) in traditional in-person lectures, where an instructor uses a live polling system such as iClickers to present questions to the audience in a lecture hall (Pan et al., 2019). Within the context of online learning, however, students view videos independently at their convenience, and a set of on-screen questions emerges during playback. This allows learners to pause, reflect, and answer questions at their own pace, fostering a sense of control over the learning process (Ryan & Deci, 2017). These questions serve to promptly recapitulate key concepts, identify potential misconceptions, or promote conceptual understanding of the subject matter. Studies suggest that embedded video questions can significantly improve student engagement compared to traditional video lectures (Chi & Wylie, 2014). Research on the use of embedded video questions has already shown promising empirical results in the field, such as stimulating students’ retrieval practice, recognition of key facts, and behavioral changes such as rewinding, reviewing, or repeating the materials that were taught (Cummins et al., 2015; Haagsman et al., 2020; Rice et al., 2019; Wong & Hughes, 2022; Wong et al., 2024).
Embedded video questions have also been shown to transition learners from passively watching a video to actively engaging with the video content (Dunlosky et al., 2013; Kestin & Miller, 2022; Schmitz, 2020), a critically important factor given the expedited shift from in-person to online instruction during the pandemic. As a result, there are a myriad of affordances that showcase the potential effects of embedded video questions on student learning experiences, one of which is how question timing can be intentionally leveraged to support active information processing through the testing effect.

3 Testing Effect

Active information processing in the context of video-based learning is the process by which learners encode relevant information from a video, integrate that information with their prior knowledge, and retrieve the stored information at a later time (Johnson & Mayer, 2009; Schmitz, 2020). This active learning process of retrieval, the strategy of rehearsing learning materials through quizzing and testing, is grounded in the cognitive phenomenon known as the testing effect. From a cognitive learning perspective, the testing effect is a process in which implementing low-stakes retrieval practice over a period of time helps learners integrate new information with prior knowledge, increasing long-term retention and memory retrieval so that knowledge can be manipulated flexibly (Carpenter, 2009; Littrell-Baez et al., 2015; Richland et al., 2009). This shifts the narrative from viewing assessments as traditional high-stakes exams to viewing them as practice learning events that measure learners’ knowledge in the current moment, in order to more effectively encourage retention and acquisition of information not yet learned (Adesope et al., 2017; Carrier & Pashler, 1992; Richland et al., 2009). The connection between retrieval and the testing effect represents sustained, continual, and successive rehearsal of successfully retrieving accurate information from long-term memory storage (Schmitz, 2020).

The frequency of practice and the time allotted between practice sessions also play a role in memory retention. Equally important, the timing and intentionality of when these questions occur within a video may influence learner outcomes. The more instances learners have to retrieve knowledge from long-term memory as practice, the better they may recall and remember that information (Richland et al., 2009). This can come in the form of practice tests, which the cognitive testing literature has shown to be highly effective (Carpenter, 2009; Roediger & Karpicke, 2006), or, in this study, embedded video questions that facilitate the testing effect. By doing so, we can provide students with an interactive online alternative to rereading or re-studying the material (Adesope et al., 2017; Roediger & Karpicke, 2006). Learners are instead presented with opportunities to answer questions frequently and immediately as retrieval practice while watching a video. Active participation through answering questions keeps viewers focused and promotes deeper information processing (Azevedo et al., 2010). We can offer a focused medium for students to recall, retrieve, and recognize crucial concepts (Mayer et al., 2009; van der Meij et al., 2021). This approach aims to cultivate an active learning environment that engages learners’ cognitive processing during online education. It assists students in discerning which aspects of the learning material they have mastered and identifies areas that require further attention (Agarwal et al., 2008; Fiorella & Mayer, 2015, 2018; McDaniel et al., 2011).

4 The Testing Effect on Student Learning Behaviors

Embedded video questions present a potential learning modality that operationalizes the theoretical model of the testing effect which may have tremendous benefits on the nature of student-centered active learning opportunities within an online course, particularly with student learning behaviors such as student engagement, self-regulation, and critical thinking. As such, leveraging the testing effect and the LXD pedagogical paradigm synergistically through the medium of embedded video questions may amplify student learning behaviors in online courses. The following sections review the literature on engagement, self-regulation, and critical thinking.

Student engagement in the online learning environment has garnered significant attention due to its crucial role in influencing learning outcomes, satisfaction, and overall course success (Bolliger & Halupa, 2018; Wang et al., 2013; Wong et al., 2023b; Wong & Hughes, 2022). Broadly defined, student engagement can be characterized as the extent of student commitment or active involvement required to fulfill a learning task (Redmond et al., 2018; Ertmer et al., 2010). Engagement extends beyond mere participation and attendance, involving active involvement in discussions, assignments, collaborative activities, and interactions with peers and instructors (Hu & Kuh, 2002; Redmond et al., 2018; Wong et al., 2022). Within an online course, engagement encompasses the levels of attention, curiosity, interaction, and intrinsic interest that students display throughout an instructional module (Redmond et al., 2018), as well as the motivational characteristics that students may exhibit during their learning journey (Pellas, 2014). Several factors influence student online engagement, and they can be broadly categorized into individual, course-related, and institutional factors. Individual factors include self-regulation skills, prior experience with online learning, and motivation (Sansone et al., 2011; Sun & Rueda, 2012). Course-related factors encompass instructional design, content quality, interactivity, and opportunities for collaboration (Pellas, 2014; Czerkawski & Lyman, 2016). Institutional factors involve support services, technological infrastructure, and instructor presence (Swan et al., 2009; Picciano, 2023). Furthermore, research has established a noteworthy and favorable correlation between engagement and various student outcomes, including advancements in learning, satisfaction with the course, and overall course grades (Bolliger & Halupa, 2018; Halverson & Graham, 2019).
Instructional designers argue that to enhance engagement, instructors and educators can employ strategies like designing interactive and authentic assignments (Cummins et al., 2015; Floor, 2018), fostering active learning opportunities, and creating supportive online learning environments (Kuh et al., 2005; Wong et al., 2022). Engaged students tend to demonstrate a deeper understanding of the course material, a stronger sense of self-regulation, and improved critical thinking skills (Fredricks et al., 2004; Jaggars & Xu, 2016; Pellas, 2018).

Self-regulation pertains to the inherent ability of individuals to manage and control their cognitive and behavioral functions with the intention of attaining particular objectives (Pellas, 2014 ; Vrugt & Oort, 2008 ; Zimmerman & Schunk, 2001 ). In the context of online courses, self-regulation takes on a more specific definition, encapsulating the degree to which students employ self-regulated metacognitive skills–the ability to reflect on one’s own thinking–during learning activities to ensure success in an online learning environment (Wang et al., 2013 ; Wolters et al., 2013 ). Unlike conventional in-person instruction, asynchronous self-paced online courses naturally lack the physical presence of an instructor who can offer immediate guidance and support in facilitating the learning journey. While instructors may maintain accessibility through published videos, course announcements, and email communication, students do not participate in face-to-face interactions within the framework of asynchronous courses. However, the implementation of asynchronous online courses offers learners autonomy, affording them the flexibility to determine when, where, and for how long they engage with course materials (McMahon & Oliver, 2001 ; Wang et al., 2017 ). Furthermore, the utilization of embedded video questions in this course taps into Bloom’s taxonomy, featuring both lower and higher-order thinking questions to test learners’ understanding. This medium enables learners to immediately engage with and comprehend conceptual materials through processes such as pausing, remembering, understanding, applying, analyzing, and evaluating, negating the need to postpone these interactions until exam dates (Betts, 2008 ; Churches, 2008 ). While this shift places a significant responsibility on the learner compared to traditional instruction, embedded video questions contribute to a student-centered active learning experience (Pulukuri & Abrams, 2021 ; Torres et al., 2022 ). 
This approach nurtures students’ self-regulation skills by offering explicit guidance in monitoring their cognitive processes, setting both short-term and long-term objectives, allocating sufficient time for assignments, promoting digital engagement, and supplying appropriate scaffolding (Al-Harthy et al., 2010 ; Kanuka, 2006 ; Shneiderman & Hochheiser, 2001 ). Through this, students actively deploy numerous cognitive and metacognitive strategies to manage, control, and regulate their learning behaviors to meet the demands of their tasks (Moos & Bonde, 2016 ; Wang et al., 2013 ). Due to the deliberate application of LXD principles, the course has the capability to enhance the development of students’ self-regulation abilities in the context of online learning (Pulukuri & Abrams, 2021 ). Consequently, this empowers students to identify their existing knowledge and engage in critical evaluation of information that may need further refinement and clarification.

Leveraging the testing effect model through the integration of embedded video questions also yields notable advantages concerning students’ critical thinking capabilities. Critical thinking involves students’ capacity to employ both new and existing conceptual knowledge to make informed decisions, having evaluated the content at hand (Pintrich et al., 1993 ). In the context of online courses, critical thinking becomes evident through actions such as actively seeking diverse sources of representation (Richland & Simms, 2015 ), encountering and learning from unsuccessful retrieval attempts (Richland et al., 2009 ), and effectively utilizing this information to make informed judgments and draw conclusions (Uzuntiryaki-Kondakci & Capa-Aydin, 2013 ). To further elaborate, according to Brookfield ( 1987 ), critical thinking in the research context involves recognizing and examining the underlying assumptions that shape learners’ thoughts and actions. As students actively practice critical thinking within the learning environment, the research highlights the significance of metacognitive monitoring, which encompasses the self-aware assessment of one’s own thoughts, reactions, perceptions, assumptions, and levels of confidence in the subject matter (Bruning, 2005 ; Halpern, 1998 ; Jain & Dowson, 2009 ; Wang et al., 2013 ). As such, infusing embedded video questions into the learning process may serve as a strategic pedagogical approach that may catalyze students’ critical thinking skills.

In the context of embedded video questions, students must critically analyze questions, concepts, and scenarios, and make judgments about which answer best reflects the problem. As students engage with the videos, they are prompted to monitor their own thinking processes, question assumptions, and consider alternative perspectives, a quintessential aspect of metacognition that complements critical thinking (Bruning, 2005; Halpern, 1998; Jain & Dowson, 2009; Wang et al., 2013). Sometimes students answer incorrectly, but these unsuccessful attempts also contribute to the testing effect in a positive manner (Richland et al., 2009): they serve as opportunities to critically analyze and reflect during the low-stakes testing stage so that learners are better prepared later on. Furthermore, cultivating students’ aptitude for critical thinking also has the potential to enhance their transferable skills (Fries et al., 2020), a pivotal competency for STEM undergraduates at research-intensive (R1) institutions, bridging course content to real-world applications. In essence, the interplay between the testing effect model and the use of embedded video questions not only supports students’ critical thinking, but also underscores the intricate relationship between engagement, self-regulation, and course outcomes (Wang et al., 2013).

4.1 Current Study

This study builds on the work of Wong and Hughes (2023) on the implementation of LXD in STEM courses utilizing educational technologies. We adopted a Design-Based Research (DBR) approach, which employs learning theories to assess the effectiveness of designs and instructional tools within real-world learner contexts (DBR Collective, 2003; Siek et al., 2014). We used the same instructional videos, course content, and pedagogical learning design as Wong and Hughes (2023), but incorporated iterative design enhancements, namely embedded video questions, to assess their potential testing effect impacts on students’ learning experiences. This quasi-experimental research therefore compares students who participated in a 10-week undergraduate online science course. Half of these students encountered low-stakes questions integrated directly within the video player (immediate condition), while the other half received questions following a series of video lectures (delayed condition). The aim is to assess how the timing of low-stakes questioning might beneficially influence learners’ science content knowledge, engagement, self-regulation, and critical thinking. Additionally, we assessed students’ learning analytics within the online course, including online page views and course participation, as a proximal measure of learners’ online engagement, and compared these findings with their self-report survey responses to corroborate the results. With the implementation of a newly iterated online course grounded in the LXD paradigm and the testing effect model, this study is guided by the following research questions:

RQ1) To what extent does the effect of “immediate vs. delayed low-stakes questioning” influence learners’ total quiz grades, online page views, and course participation rate?

RQ2) To what extent does the effect of “immediate vs. delayed low-stakes questioning” influence learners’ engagement, self-regulation, and critical thinking?

RQ3) To what extent does the relationship between “immediate vs. delayed low-stakes questioning” and learner’s total quiz grades vary depending on their levels of self-regulation and critical thinking?
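RQ3 describes a moderation question: does the effect of questioning condition on quiz grades depend on self-regulation or critical thinking? The paper does not specify its analytic model at this point, but one standard way to probe such a question is a moderated regression with a condition-by-moderator interaction term. The sketch below is purely illustrative, using simulated data and hypothetical variable names rather than the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 183  # total sample size reported in the study

# Simulated data (illustrative only; not the study's data):
# condition: 0 = delayed questioning, 1 = immediate questioning
condition = rng.integers(0, 2, size=n)
self_reg = rng.normal(0, 1, size=n)  # standardized self-regulation score
quiz = (70 + 3 * condition + 2 * self_reg
        + 1.5 * condition * self_reg          # the hypothesized moderation
        + rng.normal(0, 5, size=n))           # residual noise

# Moderated regression: quiz ~ condition + self_reg + condition:self_reg
X = np.column_stack([np.ones(n), condition, self_reg, condition * self_reg])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)

for name, b in zip(["intercept", "condition", "self_regulation", "interaction"], beta):
    print(f"{name:>16}: {b:6.2f}")
```

A non-zero interaction coefficient would indicate that the benefit of immediate versus delayed questioning varies with learners' self-regulation; the same model form applies with critical thinking as the moderator.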

5 Methodology

5.1 Ethical Considerations

This study, funded by the National Science Foundation (NSF), adheres to stringent ethical standards mandated by both the university and the grant funding agency. The university obtained approval from its Institutional Review Board (IRB) to conduct human subjects research, ensuring compliance with ethical guidelines; the research was categorized as IRB-exempt because its online, anonymous data collection posed minimal risk to participants. All participants provided informed consent before any data collection commenced and received comprehensive information about the study, including its purpose, procedures, potential risks and benefits, confidentiality measures, and their right to withdraw without consequences. Participant data was treated with utmost confidentiality and anonymity, and the study’s questions, topics, and content were designed to avoid causing harm to students.

5.2 Quasi-experimental Design

This research employed a design-based research (DBR) approach, leveraging learning theories to evaluate the effectiveness of designs, instructional tools, or products in authentic, real-world settings (DBR Collective, 2003; Siek et al., 2014). The rationale for this methodology is to assess instructional tools in ecologically valid environments and explore whether these tools enhance students’ learning experiences (Scott et al., 2020). Our decision to adopt a DBR approach arises from the limited research investigating the efficacy of the Learning Experience Design (LXD) pedagogical paradigm with embedded video questions in online undergraduate science courses. We are also cognizant of previous research indicating that simply inserting questions directly into videos, without evidence-based pedagogical principles, intentional design, and instructional alignment, does not significantly improve learning outcomes (Deng et al., 2023; Deng & Gao, 2023; Marshall & Marshall, 2021). Thus, this DBR study utilizes an LXD approach to cultivate active learner engagement through the implementation of learning theories such as the testing effect model. We then compare the impact of embedded video questions on learning outcomes within the newly designed self-paced asynchronous online course (see Fig. 1). Subsequently, we test these designs with learners and use the findings to iterate, adapt, and redeploy these techniques continually, aiming to improve the efficacy of our embedded video question designs over successive iterations.

figure 1

Quasi-experimental research design.

The study involved two equivalently sized classes within the School of Biological Sciences at an R1 university in Southern California, with students voluntarily enrolling in either of the two classes. The two classes were taught by the same professor on the same topics in Ecology and Evolutionary Biology. This course was chosen to serve as a research-practice partnership (RPP), in which the professor, educational designers, researchers, and online course creators collaborated closely to customize a course aligned with the instructor’s and students’ needs as they returned from the COVID-19 remote learning environment.

The study spanned a 10-week period, allowing sufficient dosage for implementing our learning designs and effectively measuring their impact on students’ learning experiences (see Fig. 1). A quasi-experimental design allowed us to assess the impact of question timing and placement on students’ comprehension and retention of the material presented in the videos. Following quasi-experimental design principles, the study involved two classes, each assigned to a different treatment condition. Students who received low-stakes questions after watching a series of videos constituted the “Delayed Questioning” condition, and students who received low-stakes questions embedded directly within the video player constituted the “Immediate Questioning” condition. In the delayed condition, students encountered low-stakes questions only after watching all assigned video lectures for the week, while in the immediate condition, questions were embedded directly in the video player, time-stamped and deliberately synchronized with the presented conceptual content. The two conditions were carefully designed to isolate the effect of question timing while keeping all other variables constant: the low-stakes questions, the quantity of videos, and the number of questions were identical in both conditions, with the only experimental manipulation being the timing and placement of the questions.

Following the viewing of videos and answering of low-stakes questions, either embedded directly in the video or presented after all of the videos in the instructional unit, all students took an end-of-week quiz, a summative assessment released on Fridays. The end-of-week quiz was identical across both conditions and released at the same time and day, ensuring equitable testing conditions and minimizing potential confounding variables. This allowed for a controlled comparison between the two conditions, helping to determine whether embedding questions directly within the video player led to different learning outcomes than presenting questions after all of the videos, and facilitated a robust analysis of the impact of question placement on students’ learning experiences and outcomes.

5.3 Participants

The study encompassed a total of n = 183 undergraduate students actively enrolled in upper-division courses specializing in Ecology and Evolutionary Biology. Participants were selected based on their voluntary self-enrollment in these courses during the Winter 2021 enrollment period. No exclusion criteria were applied, allowing for a broad sample encompassing various backgrounds and levels of experience in Ecology and Evolutionary Biology. These courses were part of the curriculum at a prominent R1 research university located in Southern California and were offered within the School of Biological Sciences. Students were able to enroll in the upper-division courses so long as they were biological sciences majors and had met the lower-division prerequisites. Demographically, the sample included 1.2% of participants identifying as African American, 72.0% as Asian/Pacific Islander, 10.1% as Hispanic, 11.3% as white, and 5.4% as multiracial; the gender distribution was 69.0% female and 31.0% male (see Table 1). Participants self-selected into one of two course sections, each characterized by a different approach to course implementation: (1) the first section featured questions placed at the conclusion of all video scaffolds (n = 92), and (2) the second section incorporated questions embedded directly within the video scaffolds themselves (n = 91).

5.4 Learning Experience Design

5.4.1 Video Design

The curriculum delivery integrated innovative self-paced video materials crafted with the Learning Experience Design (LXD) paradigm in mind (Wong et al., 2024). These videos incorporated various digital learning features such as high-quality studio production, 4K multi-camera recording, green screen inserts, voice-over narrations, and animated infographics (see Fig. 2). Underpinning the pedagogical approach of the video delivery was situated cognition theory (SCT) for e-learning experience design, as proposed by Brown et al. (1989). In practice, the videos were structured to align with the key elements of SCT: modeling, coaching, scaffolding, articulation, reflection, and exploration (Collins et al., 1991; Wong et al., 2024). For instance, the instructor initiated each module by introducing a fundamental concept, offering in-depth explanations supported by evidence, presenting real-world instances demonstrating the application of the concept in research, and exploring the implications of the concept to align with the course’s educational objectives. This approach emphasized immersion in real-world applications, enhancing the overall learning experience.

figure 2

This figure visually depicts the embedded video question interface alongside the Bloom's Taxonomy pyramid, illustrating the connection between the video questions and the quiz questions for the week, specifically emphasizing the testing effect

In the video design process, we adopted an approach in which content equivalent to an 80-minute in-person lecture was broken down into smaller, more manageable segments lasting between five and seven minutes. This approach was taken to alleviate potential student fatigue, reduce cognitive load, and minimize opportunities for distraction (Humphris & Clark, 2021; Mayer, 2019). Moreover, we meticulously scripted the videos to align seamlessly with the course textbook. This alignment served to pre-train students in fundamental concepts and terminologies using scientific visuals and simplified explanations, thereby preparing them for more in-depth and detailed textbook study. As part of our video design strategy, we strategically integrated embedded questions at specific time points during video playback. These questions were designed to serve multiple purposes, including assessing students’ comprehension, sustaining their attention, and pinpointing areas of strength and weakness in their understanding. In line with Mayer’s (2019) principles of multimedia design, our videos were crafted to incorporate elements like pre-training, segmenting, temporal contiguity, and signaling (see Fig. 2). These principles ensured that relevant concepts, visuals, and questions were presented concurrently rather than sequentially (Mayer, 2003, 2019). This approach encouraged active engagement and processing by providing cues to learners within the video content.
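As a back-of-the-envelope check on the segmenting strategy described above, the sketch below computes the range of segment counts implied by cutting an 80-minute lecture into five- to seven-minute videos. The figures come from the text; the arithmetic is purely illustrative.

```python
import math

LECTURE_MIN = 80          # length of the original in-person lecture (minutes)
SEG_MIN, SEG_MAX = 5, 7   # target segment lengths from the design (minutes)

# Fewest segments results from using the longest allowed segment length;
# most segments results from using the shortest.
fewest = math.ceil(LECTURE_MIN / SEG_MAX)
most = math.ceil(LECTURE_MIN / SEG_MIN)

print(f"An {LECTURE_MIN}-minute lecture yields roughly {fewest}-{most} segments")
```

This rough count (about a dozen or more short videos per lecture) is consistent with the segmenting principle of giving learners frequent natural pause points.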

5.4.2 Question Design

Students in both the “immediate” and “delayed” conditions experienced low-stakes multiple-choice questions. Low-stakes multiple-choice questions were knowledge check questions that served as opportunities for content practice, retention, and reconstructive exercises, aiming to engage learners in the testing effect and enhance their conceptual understanding (Richland et al., 2009 ). Grounded in Bloom’s Taxonomy, the low-stakes questions were designed to emphasize lower-order thinking skills, such as “remembering and understanding” concepts in context (Bloom, 2001 ; Betts, 2008 ) (See Fig.  2 ). In contrast, students experienced high-stakes multiple-choice questions on the weekly summative quizzes consisting of higher-order thinking questions that required students to “apply, analyze, and evaluate” scenarios in ecology and evolutionary biology, encouraging learners to break down relationships and make judgments about the information presented (Bloom, 2001 ; Betts, 2008 ) (See Fig.  2 ).

For instance, an example low-stakes multiple-choice question that students encountered (see Fig. 2) was: “In the hypothetical fish example, the cost of reproduction often involves:” (A) shunting of fats and gonads to provision eggs, (B) shunting of fats to gonads to make more sperm, (C) using fats as a source of fuel for general locomotion, (D) fish face no resource limitations, (E) A and B. Upon reading the question, the learner is prompted to “remember” and “understand” what they just watched and to identify what they know or potentially do not know. Questions that prompt learners to “remember” and “understand” are considered lower-order thinking questions on the Bloom’s Taxonomy pyramid (Bloom, 2001). An example of the high-stakes questions that students encountered on their weekly summative quizzes is: “Given the tradeoff between survival and reproduction (fertility, the number of offspring), how does natural selection act on species?” (A) Natural selection will minimize the number of mating cycles, (B) Natural selection will maximize fecundity, (C) Natural selection will maximize survivability, (D) Natural selection will compromise between survival and fecundity, (E) None of the above. These high-stakes questions on the weekly summative quizzes are made up of higher-order thinking questions that require learners to “apply, analyze, and evaluate,” the top three pillars of the Bloom’s Taxonomy pyramid (Bloom, 2001). The notable differences between low-stakes and high-stakes questions lie in learners’ application of their conceptual understanding to elaborate on new and existing understandings, critically evaluate between concepts, and apply the concepts in a new scenario or context.
High-stakes questions, or higher-order thinking questions, have been shown to promote the transfer of learning, increase the application of concepts during retrieval practice, and prevent learners from simply recalling facts and memorizing the right answers by heart (Chan, 2010; McDaniel et al., 2013; Mayer, 2014; Richland et al., 2009). This active process allows students to organize the key learning concepts into higher orders and structures. Moreover, the student’s working memory connects new knowledge with prior knowledge, facilitating the transfer to long-term memory and enabling retrieval of this information at a later time (Mayer, 2014). Together, these strategic question design choices empower students to actively participate in constructive metacognitive evaluations, encouraging learners to contemplate “how and why” they reached their conclusions (See Fig. 2). Research has indicated that such an approach promotes critical thinking and the use of elaborative skills among learners in online learning contexts (Tullis & Benjamin, 2011; Wang et al., 2013). Furthermore, by having students answer questions and practice the concepts, our intention was that the prior low-stakes questioning would leverage the testing effect to better prepare students for the high-stakes questions on the weekly quizzes.

Depending on their condition, learners encountered the low-stakes questions either integrated directly within each video, synchronized to the concept being taught (“immediate”), or after watching a series of 6 or 7 lecture videos (“delayed”). We placed the “delayed” condition’s questions after a series of videos, rather than after every video, because this time delay allowed us to investigate the effects of timing and spacing between the two treatment conditions; questions after every video would themselves constitute a form of immediate questioning, as they would be directly related to the video students had just watched. Pooling all questions at the end of the 6 or 7 videos for an instructional unit maintained the treatment difference, helped avoid the recency effect, and minimized immediate recall for students in the “delayed” condition. Structuring the “delayed” condition this way allowed us to assess whether students retain and integrate knowledge over time, providing a more comprehensive understanding of the learning process and of the potential treatment differences between “delayed” and “immediate” questioning. Furthermore, this design spared students in the “delayed” condition the potential fatigue effects that frequent questioning interruptions might introduce in the “immediate” condition. Ultimately, placing the low-stakes questions after students watched the 6 or 7 videos for an instructional unit provided a clean treatment contrast between the immediate and delayed conditions.

5.4.3 Course Design and Delivery

The course was implemented within the Canvas Learning Management System (LMS), the official learning platform of the university. The videos recorded for this course were uploaded, designed, and deployed using the Yuja Enterprise Video Platform, a cloud-based content management system (CMS) for video storage, streaming, and e-learning content creation. For this study, we utilized Yuja to store the videos in the cloud, design the embedded video question platform, and record student grades. After uploading the videos, the questions and their corresponding answer options were entered into the Yuja system at specific time codes, determined by the concepts presented within each video. Typically, lower-order thinking questions (i.e., those requiring remembering or understanding) were placed immediately after a key concept’s definition was introduced, while higher-order thinking questions (i.e., those requiring analyzing or evaluating) were placed toward the end of the video so that students applied the concepts in context before moving on to the next video. Finally, each video was published from Yuja to Canvas using the Canvas Learning Tools Interoperability (LTI) integration so that all embedded video question responses were automatically graded and recorded directly in the Canvas grade book.
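The mapping from concepts to time-coded questions can be represented as simple records. The sketch below is a hypothetical illustration only: the field names, example entries, and `EmbeddedQuestion` type are our own invention, not Yuja's actual data model.

```python
from dataclasses import dataclass

@dataclass
class EmbeddedQuestion:
    """One time-coded question attached to a lecture video (hypothetical schema)."""
    timecode_sec: int   # when the video pauses to show the question
    bloom_level: str    # e.g. "remember", "understand", "analyze", "evaluate"
    stem: str
    options: list[str]
    answer_index: int

# Lower-order question placed right after a definition; higher-order near the end
video_questions = [
    EmbeddedQuestion(95, "remember",
                     "The cost of reproduction often involves:",
                     ["shunting fats to gonads", "no resource limits"], 0),
    EmbeddedQuestion(540, "evaluate",
                     "How does natural selection act on this tradeoff?",
                     ["maximize fecundity",
                      "compromise between survival and fecundity"], 1),
]

# Questions are deployed in timecode order within the video
deployed = sorted(video_questions, key=lambda q: q.timecode_sec)
```

Under this representation, the placement rule described above reduces to assigning small timecodes to lower-order items and late timecodes to higher-order items.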

5.5 Data Collection and Instrumentation

Data collection for this study was conducted electronically during the Winter 2021 academic term. All survey instruments were distributed online to participating students through the Qualtrics XM platform, an online survey tool provided through the university. Students accessed the surveys through hyperlinks integrated into their Canvas Learning Management System (LMS) course space, providing a user-friendly, FERPA-compliant, and secure centralized data collection environment. Students completed the surveys immediately after finishing their last lesson during the final week of the course (Week 10). When responding to all surveys, students were asked to reflect specifically on their learning experiences in the online course in which they were enrolled. Having students complete the surveys right after their last lesson was an intentional research design decision to maintain the rigor, robustness, and quality of responses.

5.5.1 Survey Instruments

Participants completed three survey measures: the critical thinking and self-regulation subscales of the Motivated Strategies for Learning Questionnaire (MSLQ) and the Perceived Engagement Scale. We maintained the original question count and structure for reliability but made slight adjustments, such as replacing “classroom” with “online course,” to align the items with the study’s asynchronous online course context. This approach, supported by prior research (Hall, 2016; Savage, 2018), preserves the instruments’ reliability across different learning modalities.

The MSLQ instrument utilized in this study was originally developed by a collaborative team of researchers from the National Center for Research to Improve Postsecondary Teaching and Learning and the School of Education at the University of Michigan (Pintrich et al., 1993). This well-established self-report instrument is designed to comprehensively assess undergraduate students’ motivations and their utilization of diverse learning strategies. Respondents were presented with a 7-point Likert scale to express their agreement with statements, ranging from 1 (completely disagree) to 7 (completely agree). To evaluate students in the context of the self-paced online course, we focused specifically on the self-regulation and critical thinking subscales of the MSLQ. Sample items in the self-regulation scale included statements such as “When studying for this course I try to determine which concepts I don’t understand well” and “When I become confused about something I’m watching for this class, I go back and try to figure it out.” Sample items for critical thinking include “I often find myself questioning things I hear or read in this course to decide if I find them convincing” and “I try to play around with ideas of my own related to what I am learning in this course.” According to the original authors, these subscales exhibit strong internal consistency, with Cronbach alpha coefficients reported at 0.79 and 0.80, respectively. In this study, Cronbach’s alphas for self-regulation and critical thinking were 0.86 and 0.85, respectively.
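Internal consistency coefficients like those reported above can be computed directly from the item-response matrix. Below is a minimal sketch of Cronbach's alpha on synthetic data (not the study's responses), assuming rows are respondents and columns are Likert items:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Perfectly consistent items (every item identical) yield alpha = 1.0
responses = np.tile(np.arange(1, 8).reshape(-1, 1), (1, 4))  # 7 respondents, 4 items
print(round(cronbach_alpha(responses), 2))  # -> 1.0
```

In practice, alpha for real Likert data lands below 1.0; values near the 0.85–0.86 reported for these subscales indicate strong but not redundant item agreement.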

To gauge students’ perceptions of their online engagement, we employed a 12-item survey adapted from Rossing et al. ( 2012 ). This survey encompassed a range of questions probing students’ views on the learning experience and their sense of engagement within the online course. Respondents conveyed their responses on a 5-point Likert scale, ranging from 1 (completely disagree) to 5 (completely agree). Sample items in the scale included statements such as “This online activity motivated me to learn more than being in the classroom” and “Online video lessons are important for me when learning at home.” Rossing et al. ( 2012 ) report that the internal consistency coefficient for this instrument was 0.90. Similarly, Wong et al. ( 2023b ) reported a coefficient of 0.88, further supporting the scale’s reliability across online learning contexts. This instrument demonstrated robust internal consistency, with a Cronbach alpha coefficient reported at 0.89, indicating its reliability in assessing students’ perceptions of online engagement.

5.5.2 Course Learning Analytics

Throughout the 10-week duration, individualized student-level learning analytics were gathered from the Canvas Learning Management System (LMS). These analytics encompassed several metrics, including total quiz grades, participation rates, and page views. Each weekly quiz served as a summative assessment with 10 multiple-choice questions; the total quiz grade was derived from the summation of the weekly quiz scores over the 10-week period, with each student completing one quiz per week for a total of 10 quizzes. Notably, the quizzes presented to students in both classes were identical in length, question count, and answer choices. Standardizing the quizzes across both classes ensured uniformity in assessment, enabling a fair comparison of learning outcomes between students who received embedded video questions and those who did not.

Pageviews and participation rates offered detailed insights into individual user behavior within the Canvas Learning Management System (LMS). Pageviews specifically monitored the total number of pages accessed by learners within the Canvas course environment, with each page load constituting a tracked event. This tracking provided a metric of the extent of learners’ interaction with course materials (Instructure, 2024), enabling a close examination of learner engagement and navigation patterns within the online course. Page view data thus serves as a proxy for student engagement rather than a definitive measure, helping to gauge activity and to facilitate comparisons among students within a course or across time. The total number of page views for both classes was examined and compared between students with and without embedded video questions.

Participation metrics within the Canvas LMS encompassed a broad spectrum of user interactions within the course environment. These included not only traditional activities such as submitting assignments and quizzes but also more dynamic engagements such as watching and rewatching videos, redoing low-stakes questions for practice, and contributing to discussion threads by responding to questions (Instructure, 2024 ). Each instance of learner activity was logged as an event within the Canvas LMS. These participation measures were comprehensive and captured the diverse range of actions undertaken by students throughout their learning journey. They provided invaluable insights into the level of engagement and involvement of each student within their respective course sections. By recording these metrics individually for each student, the Canvas LMS facilitated detailed analysis and tracking of learner behavior, enabling a nuanced understanding of student participation patterns and their impact on learning outcomes.
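Per-student page-view and participation counts of this kind reduce to aggregation over an event log. The sketch below is a hypothetical illustration: the tuple format, event names, and `PARTICIPATION_TYPES` set are our own simplification, not Canvas's actual export schema.

```python
from collections import Counter

# Hypothetical event log exported from the LMS: (student_id, event_type)
events = [
    ("s1", "page_view"), ("s1", "page_view"), ("s1", "quiz_submission"),
    ("s2", "page_view"), ("s2", "discussion_reply"), ("s2", "video_watch"),
]

# Event types counted as "participation" (assignment/quiz submissions,
# video watches, discussion replies), per the description above
PARTICIPATION_TYPES = {"quiz_submission", "discussion_reply", "video_watch"}

page_views = Counter(sid for sid, ev in events if ev == "page_view")
participation = Counter(sid for sid, ev in events if ev in PARTICIPATION_TYPES)

print(page_views["s1"], participation["s2"])  # -> 2 2
```

Aggregating this way yields one page-view count and one participation count per student, which is the per-student granularity the analyses below compare across conditions.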

5.6 Data Analysis Plan

We conducted scale reliability checks to assess the alpha coefficients for all measurement instruments. Additionally, a chi-square analysis verified that there were no pre-treatment disparities between conditions in gender, ethnicity, or student grade-level status. Next, descriptive analyses assessed the frequencies, distribution, and variability of learners’ total quiz grades, page views, and participation across the two conditions after 10 weeks of instruction (See Table 2). Then, a series of one-way Analyses of Variance (ANOVAs) examined differences between conditions on each dependent variable separately. Next, two Multivariate Analyses of Variance (MANOVAs) evaluated differences between treatment conditions on multiple dependent variables; MANOVA was chosen in order to assess multiple dependent variables simultaneously while comparing across two or more groups. The first MANOVA compared the means of learners with and without embedded video questions on three dependent variables: (D1) quiz grades, (D2) pageviews, and (D3) participation. A second MANOVA compared the conditions on three dependent variables: (D1) engagement, (D2) self-regulation, and (D3) critical thinking skills. Lastly, multiple regression analyses evaluated the effect of embedded video questions on learners’ quiz grades and whether this relation was moderated by learners’ self-regulation and critical thinking skills.
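The balance-check and univariate steps of this pipeline can be sketched with SciPy on synthetic data. This is an illustrative assumption-laden sketch, not the study's actual analysis code or data: the counts, group sizes, and score distributions below are invented.

```python
import numpy as np
from scipy.stats import chi2_contingency, f_oneway

rng = np.random.default_rng(0)

# Pre-treatment balance check, e.g. gender counts by condition (hypothetical table)
balance = np.array([[40, 37],   # delayed:   female, male
                    [38, 39]])  # immediate: female, male
chi2, p_balance, dof, _ = chi2_contingency(balance)
# A non-significant p_balance indicates no disparity between conditions

# One-way ANOVA on a single dependent variable (synthetic weekly quiz grades)
delayed = rng.normal(95, 2, size=77)
immediate = rng.normal(98, 2, size=77)
f_stat, p_val = f_oneway(delayed, immediate)
```

The MANOVA and moderated-regression steps extend this same per-variable logic to joint tests across the three dependent variables and to interaction terms, respectively.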

Descriptive Analysis.

Table 3 displays the average weekly quiz grades for the two instructional conditions, “Delayed Questioning” and “Immediate Questioning,” over the ten-week period from January 4th to March 8th. Fluctuations in quiz grades are evident across the observation period for both conditions. For instance, in Week 1, the average quiz grade for “Delayed Questioning” was 95.65, while it was notably higher at 99.2 for students in the “Immediate Questioning” condition. Similarly, in Week 6, quiz grades decreased for both conditions, with “Delayed Questioning” at 93.35 and “Immediate Questioning” at 96.9 (See Fig. 3). Comparing the average quiz grades between the two instructional conditions revealed consistent differences throughout the observation period: the “Immediate Questioning” condition consistently demonstrated higher quiz grades than “Delayed Questioning.” This difference is particularly evident in certain weeks, such as Week 3, where the average quiz grade was 97.6 for “Delayed Questioning” and 99.6 for “Immediate Questioning.” These descriptive findings suggest that embedding questions directly within the video content may positively influence learners’ quiz performance, potentially indicating higher engagement and comprehension of the course material. However, further analysis is required to test the significance of the differences in weekly quiz grades between the two instructional conditions.

figure 3

Descriptive comparison of students' weekly summative quiz by condition

figure 4

This figure presents the frequency of page views throughout the 10-week course

Figure 4 presents the frequency of page views throughout the 10-week course, acting as a proximal indicator of learner engagement, across weeks for the two instructional approaches: “Delayed Questioning” and “Immediate Questioning.” Higher page view counts indicate heightened interaction with course materials in a given week. For example, in Week 1, “Delayed Questioning” registered 9,817 page views, while “Immediate Questioning” recorded 12,104 page views, indicating peaks in engagement. Conversely, lower page view counts in subsequent weeks may imply reduced learner activity or engagement with the course content. Fluctuations in page view counts throughout the observation period highlight varying levels of learner engagement under each instructional condition. Notably, a comparative analysis between the two instructional methods revealed consistent patterns, with the “Immediate Questioning” condition exhibiting higher page view counts in most weeks. This initial examination suggests that embedding questions directly within the video player may enhance learner engagement, as evidenced by increased interaction with course materials.

Examination of the participation rates across the observation period shows that the “Immediate Questioning” condition consistently generated higher levels of engagement than the “Delayed Questioning” condition (See Fig. 5). For instance, in Week 4, the participation count for “Delayed Questioning” was 459, while it notably reached 847 for “Immediate Questioning.” Similarly, in Week 7, participation counts were 491 and 903 for “Delayed Questioning” and “Immediate Questioning,” respectively, indicating a substantial difference between the two instructional approaches. Moreover, both conditions experienced fluctuations in participation over time, with surges or declines in specific weeks. For instance, in Week 10, participation for “Delayed Questioning” dropped to 287, whereas it remained relatively higher at 677 for “Immediate Questioning.” Overall, the descriptive analysis depicted in Fig. 5 highlights the differences in participation rates across the two conditions and underscores how embedding video questions influences learners’ online behaviors.

figure 5

This figure presents the frequency of participation throughout the 10-week course

6.1 Multivariate Analysis of Variance on Dependent Variables

A MANOVA was conducted to compare the means of learners with and without embedded video questions on three dependent variables: (D1) quiz grades, (D2) pageviews, and (D3) participation (See Table 4). The multivariate test was significant, F(3, 150) = 188.8, p < 0.001; Pillai’s Trace = 0.791, partial η² = 0.791, indicating a difference between learners who experienced “Delayed” and “Immediate Questioning.” The univariate F tests showed statistically significant differences between the two conditions for total quiz grades (F(1, 152) = 6.91, p < 0.05, partial η² = 0.043), pageviews (F(1, 152) = 26.02, p < 0.001, partial η² = 0.146), and course participation rates (F(1, 152) = 569.6, p < 0.001, partial η² = 0.789). Bonferroni pairwise comparisons of mean differences for total quiz grades (p < 0.05), pageviews (p < 0.001), and course participation (p < 0.001) were statistically significant between the two conditions. Therefore, learners who experienced questions directly embedded within the video player had significantly higher total quiz grades, page views, and course participation across 10 weeks.
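As a sanity check, each reported partial η² can be recovered from its univariate F statistic and degrees of freedom via η²p = F·df₁ / (F·df₁ + df₂). The sketch below reproduces the three effect sizes from the F values reported above:

```python
def partial_eta_sq(f, df_effect, df_error):
    """Partial eta-squared recovered from an F statistic: F*df1 / (F*df1 + df2)."""
    return (f * df_effect) / (f * df_effect + df_error)

# Univariate tests reported in the text, all with df = (1, 152)
print(round(partial_eta_sq(6.91, 1, 152), 3))    # quiz grades   -> 0.043
print(round(partial_eta_sq(26.02, 1, 152), 3))   # pageviews     -> 0.146
print(round(partial_eta_sq(569.6, 1, 152), 3))   # participation -> 0.789
```

The recovered values match the reported effect sizes, confirming internal consistency of the univariate statistics.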

A second MANOVA compared the means of learners with and without embedded video questions on three dependent variables: (D1) engagement, (D2) self-regulation, and (D3) critical thinking skills (See Table 5). The multivariate test was significant, F(3, 179) = 5.09, p < 0.001; Pillai’s Trace = 0.079, partial η² = 0.079, indicating a difference between learners who experienced “Delayed” and “Immediate Questioning.” The univariate F tests showed statistically significant differences between learners with and without embedded video questions for engagement (F(1, 181) = 7.43, p < 0.05, partial η² = 0.039), self-regulation (F(1, 181) = 14.34, p < 0.001, partial η² = 0.073), and critical thinking skills (F(1, 181) = 6.75, p < 0.01, partial η² = 0.036). Bonferroni pairwise comparisons of mean differences for engagement (p < 0.05), self-regulation (p < 0.001), and critical thinking skills (p < 0.01) were statistically significant across the two conditions. Therefore, learners who experienced questions directly embedded within the video player had significantly higher engagement, self-regulation, and critical thinking skills.

6.2 Moderation Analyses

A multiple regression model investigated whether the association between condition (“Delayed” vs. “Immediate Questioning”) and learners’ total quiz grades depends on their levels of self-regulation and critical thinking (Table 6). The moderators for this analysis were learners’ self-reported self-regulation and critical thinking skills, while the outcome variable was learners’ total quiz grades after 10 weeks. Results show that experiencing “Immediate Questioning” (β = 1.15, SE = 4.72) significantly predicted total quiz grades. Additionally, the main effects of students’ self-regulation (β = 0.394, SE = 0.78) and critical thinking skills (β = 0.222, SE = 0.153) were statistically significant. Furthermore, the interaction between “Immediate Questioning” and self-regulation was also significant (β = 0.608, SE = 0.120), suggesting that the effect of condition on quiz grades depends on the level of learners’ self-regulation. However, the interaction between treatment condition and critical thinking was not significant (β = 0.520, SE = 0.231). Together, the variables accounted for approximately 20% of the variance in learners’ quiz grades, R² = 0.19, F(5, 158) = 9.08, p < 0.001.
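The moderation model amounts to a regression with condition × moderator interaction terms. Below is a minimal sketch on simulated data, assuming only NumPy; the coefficients, scales, and sample values are invented for illustration and are not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 164
condition = rng.integers(0, 2, n).astype(float)  # 0 = delayed, 1 = immediate
self_reg = rng.normal(5.0, 1.0, n)               # self-regulation (1-7 scale)
crit_think = rng.normal(5.0, 1.0, n)             # critical thinking (1-7 scale)

# Simulated grades where the condition effect grows with self-regulation
# (true interaction slope 0.6; all coefficients hypothetical)
quiz = (90 + 1.0 * condition + 0.4 * self_reg + 0.2 * crit_think
        + 0.6 * condition * self_reg + rng.normal(0, 0.3, n))

# Design matrix: intercept, main effects, and the two interaction terms
X = np.column_stack([np.ones(n), condition, self_reg, crit_think,
                     condition * self_reg, condition * crit_think])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)
# beta[4] estimates the condition x self-regulation interaction (near 0.6 here)
```

A significant positive `beta[4]`, as in the study's reported result, means the condition's effect on quiz grades increases with the learner's self-regulation; centering the moderators before forming interactions is common practice to ease interpretation of the main effects.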

7 Discussion

This study was part of a large-scale online learning research effort at the university, examining undergraduate experiences with pedagogically grounded educational technologies. Specifically, it implemented learning experience design, the testing effect model, and “edtech tools” aligned with evidence-based learning theories to enhance student knowledge, engagement, and transferable skills like self-regulation and critical thinking. A key goal was to use design-based research methodologies to evaluate student outcomes in real-world settings where instructors were applying these evidence-based practices, helping determine whether investments in educational technologies supported student learning outcomes. With the increased demand for online learning post-pandemic, this study investigated the impact of embedded video questions within an asynchronous online Biology course on engagement, self-regulation, critical thinking, and quiz performance. By comparing “Immediate Questioning” versus “Delayed Questioning,” this research explored how the timing of embedded video questions affected the efficacy of online learning, contributing to our understanding of effective online education strategies. This discussion interprets and contextualizes the findings within the broader landscape of online education, technology integration, and pedagogical design.

7.1 Impact on Student Course Outcomes

The first MANOVA results revealed significant positive effects of “Immediate Low-stakes Questioning” on learners’ summative quiz scores over the 10-week period compared to the “Delayed Low-stakes Questioning” condition. Notably, both groups had equal preparation time, with quizzes released at the same time and with the same deadlines each week. This indicates that the timing and interactive nature of embedded video questions, aimed at fostering the testing effect paradigm, contributed to increased learner activity and participation (Richland et al., 2009). The “Immediate Questioning” group, characterized by notably higher weekly quiz scores, benefitted from the active learning facilitated by concurrent processing of concepts: answering questions while watching the lecture videos. Embedded questions not only fostered an active learning environment but also captured students’ attention and engaged them differently than passive video-viewing learning modalities (Mayer, 2021; van der Meij et al., 2021). This approach allowed for immediate recall and practice, providing guided opportunities for students to reflect on their knowledge and validate their accuracies or improve upon their mistakes (Cummins et al., 2015; Haagsman et al., 2020). The strategic timing of questions synchronized with specific instructional topics gave students opportunities to recognize, reflect on, and decipher what they know and what they don’t know. Consequently, students approached their weekly quizzes with greater readiness, as strategically positioned embedded video questions fostered enhanced cognitive engagement through their intentional timing, placement, and deliberate use of low-stakes questioning (Christiansen et al., 2017; Deng & Gao, 2023).
Overall, the study’s results align with previous literature, indicating that interactive low-stakes quizzing capacities through intentionally timed questions within video-based learning effectively simulate the testing effect paradigm to foster retrieval practice over time (Littrell-Baez et al., 2015 ; Richland et al., 2009 ). These findings underscore the efficacy of integrating interactive elements into online learning environments to enhance student engagement and learning outcomes.

Additionally, students in the “Immediate Questioning” condition demonstrated significantly higher participation rates and page views within the course (Table 2). Page views were tracked at the individual student level, representing the total number of pages accessed, including watching and rewatching videos, accessing assignments, and downloading course materials. This indicates that students in the “Immediate Questioning” condition were more engaged with course content, preparing for weekly quizzes by actively engaging with various resources. In terms of participation rates, learners in the “Immediate Questioning” condition were more active compared to their counterparts (Table 2). Participation encompassed various actions within the Canvas LMS course, such as submitting assignments, watching videos, accessing course materials, and engaging in discussion threads. Students in this condition were more likely to ask questions, share thoughts, and respond to peers, fostering a deeper level of engagement. Moreover, there was a consistent pattern of students revisiting instructional videos, as reflected in page views. Research on embedded video questions has shown that they prompt positive learning behaviors, such as reviewing course materials (Cummins et al., 2015; Haagsman et al., 2020; Rice et al., 2019; Wong et al., 2022). These insights into student behavior highlight the impact of integrating questions within the video player, resulting in increased engagement indicated by higher page views and course participation.

7.2 Impacts on Student Learning Behaviors

In addition to learning analytics, we gathered data on students’ self-reported online engagement. Students in the “Immediate Questioning” condition reported higher engagement levels than their counterparts, possibly due to the anticipation of upcoming questions, fostering attention, participation, and interaction. This increased awareness can positively impact students’ engagement, retrieval, and understanding, as they mentally prepare for the questions presented (Dunlosky et al., 2013 ; Schmitz, 2020 ). Moreover, questions directly embedded within the video encourage thoughtful engagement with material, amplifying the benefits of repeated low-stakes testing in preparation for assessments (Kovacs, 2016 ; Richland et al., 2009 ). Our study manipulated the timing of these questions to enhance the saliency of the testing effect paradigm, aiming to transition learners from passive to active participants in the learning process. When considering both the first and second MANOVA results, students in the “Immediate Questioning” condition not only showed significant differences in participation and page views but also reported significantly higher engagement compared to those in the “Delayed Questioning” condition. These findings align with previous research on interactive learning activities and “edtech tools” in promoting engagement in online courses (Wong et al., 2022 ; Wong et al., 2024 ). We employed the same instructional videos from Wong and Hughes ( 2022 ), but our study was informed by the design constraints students identified regarding limited interactivity, practice opportunities, and student-centered active learning in asynchronous settings. By integrating embedded video questions to address these concerns, we offered students a more engaging and interactive learning experience. As a result, embedding questions directly within videos is suggested to be an effective strategy for enhancing learner engagement and participation in online courses. 
Our results also contribute to the literature by comparing self-report data with behavioral course data, shedding light on the beneficial impacts of embedded video questions.

The significant differences in self-regulation and critical thinking skills among learners in the “Immediate Questioning” condition, who experienced questions embedded directly in videos, highlight the value of this pedagogical approach. Engaging with questions intentionally timed and aligned with the instructional content requires learners to monitor and regulate their cognitive processes, fostering metacognitive awareness and self-regulated learning (Jain & Dowson, 2009; Wang et al., 2013). The cognitive effort exerted to critically analyze, reflect, and respond to these questions within the video enhances critical thinking skills, compelling learners to evaluate and apply their understanding in real-time contexts. Our intentional LXD aimed to enhance the testing effect model’s saliency, encouraging students to think about their own thinking through formative assessments and solidify their conceptual understanding before summative assessments (Richland & Simms, 2015). Repeated opportunities for metacognitive reflection and regulation empower students to gauge their comprehension, identify areas for further exploration, and manage their learning progress (Wang et al., 2017; Wong & Hughes, 2022; Wong et al., 2022). Furthermore, immediate questioning compared to delayed questioning facilitates higher-order cognitive skills, with students in the “Immediate Questioning” condition showing significantly higher critical thinking. Critical thinking, evident through actions like exploring varied sources, learning from unsuccessful retrieval attempts (Richland et al., 2009), and making inferences (Uzuntiryaki-Kondakci & Capa-Aydin, 2013), is influenced by the timing of these questions.

We employed Bloom’s Taxonomy as a foundation for shaping our question construction: the lower-order questions were formulated to underscore remembering, comprehending, and applying concepts in specific contexts (Bloom, 2001; Betts, 2008). Conversely, the higher-order questions were tailored to provoke the application and analysis of real-world scenarios in ecology and evolutionary biology, requiring students to deconstruct relationships and evaluate patterns in the information presented (Bloom, 2001; Betts, 2008). In combination, these question design choices give students the opportunity to engage in critical evaluation of course concepts, prompting learners to make inferences, inquire, and judge complex problems as they formulate their solutions. Immediate questioning prompts consideration of key concepts and assessment of understanding in real time (Jain & Dowson, 2009; Wang et al., 2013), whereas delayed questioning requires learners to retain the information for a longer duration in working memory while mitigating distractions from mind-wandering, as learners await a delayed opportunity to actively retrieve and practice the information gleaned from the videos (Richland et al., 2009; Richland & Simms, 2015; Wong et al., 2023b). Thus, promptly answering low-stakes questions embedded within videos while engaging with content enhances self-regulation, critical thinking, and overall engagement with instructional material. In this way, cultivating both self-regulation and critical thinking also holds the potential to bolster students’ transferable skills applicable across various contexts (Fries et al., 2020), a crucial competency for undergraduate students in STEM disciplines (Wong et al., 2023b).

7.3 Interplay between Student Learning Behaviors and Knowledge Outcomes

Our analysis explored the interplay between the two conditions, learners’ self-regulation, critical thinking, and quiz grades using a multiple regression model. The results revealed that treatment condition, self-regulation, and critical thinking were significant predictors of quiz grades (Table 4), suggesting that self-regulation may be one mechanism underlying the testing effect (Peng et al., 2019; Sotola & Crede, 2021). Notably, the interaction between the “Immediate Questioning” condition and self-regulation emerged as significant, indicating that the influence of embedded video questions on quiz grades varies with learners’ self-regulation abilities. In other words, learners in the “Immediate Questioning” condition who showed greater self-regulation tended to have significantly higher quiz grades. This pattern underscores the importance of considering learners’ metacognitive strategies when examining the impact of instructional interventions online. Conversely, the interaction between condition and critical thinking was not significant (Table 5). While there was a significant main effect for critical thinking, the timing of low-stakes questioning (delayed or immediate) did not significantly influence quiz scores based on students’ critical thinking skills. This implies that the effect of question timing on total quiz scores in this study may depend not on students’ levels of critical thinking but on their levels of self-regulation. Furthermore, self-regulation significantly influenced learners’ quiz grades throughout the 10-week course.
Conceptually synchronized questions immediately embedded in the video player served as metacognitive reflective learning opportunities, empowering students to gauge their comprehension, identify areas for further exploration, and actively manage their learning progress (Delen et al., 2014 ; Wang et al., 2013 ; Wong & Hughes, 2023 ). One of the many benefits of the testing effect paradigm is acknowledging errors during low-stakes practice, allowing learners to self-regulate by reassessing initial understandings and fostering conceptual change (Richland et al., 2009 ; Iwamoto et al., 2017 ; Sotola & Crede, 2021 ). Enhancing students’ metacognitive techniques like self-regulation can enrich skills applicable in various contexts, including other courses, workforce training, and time management (Barak et al., 2016 ; Fisher & Baird, 2005 ; Fries et al., 2020 ). For STEM undergraduates at research-intensive institutions, embedding questions directly into the video player nurtures these critical proficiencies by linking course content with real-world applications. The study highlights how the interplay between LXD, the testing effect model, and immediate questioning embedded in video supports critical thinking and underscores the relationship between engagement, self-regulation, and science knowledge outcomes.
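A regression of the kind described above can be sketched as follows. The data are simulated and all variable names, scales, and effect sizes are hypothetical; the sketch is meant only to illustrate fitting a model with a condition × self-regulation interaction term, not to reproduce the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical predictors: 1 = "Immediate Questioning", 0 = "Delayed Questioning"
condition = rng.integers(0, 2, size=n).astype(float)
self_reg = rng.normal(5.0, 1.0, size=n)     # e.g., survey-scale self-regulation scores
crit_think = rng.normal(5.0, 1.0, size=n)   # e.g., survey-scale critical thinking scores

# Simulated quiz outcome with a condition x self-regulation interaction
quiz = (70 + 3.0 * condition + 2.0 * self_reg + 1.5 * crit_think
        + 1.0 * condition * self_reg + rng.normal(0, 2.0, size=n))

# Design matrix: intercept, main effects, and the interaction term
X = np.column_stack([np.ones(n), condition, self_reg, crit_think,
                     condition * self_reg])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)

names = ["intercept", "condition", "self_reg", "crit_think",
         "condition:self_reg"]
for name, b in zip(names, beta):
    print(f"{name:>20}: {b:.2f}")
```

A nonzero estimate for the `condition:self_reg` coefficient corresponds to the moderation pattern reported above: the benefit of immediate questioning grows with learners' self-regulation.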

7.3.1 Alignment with Learning Experience Design and Learning Theories

The positive outcomes of this study also resonate with the principles of Learning Experience Design. LXD emphasizes human-centered, experiential, and evidence-based design to create meaningful and effective learning encounters (Floor, 2018). The incorporation of embedded video questions exemplifies how LXD principles can be applied intentionally to empathize with learners’ needs in online learning experiences (Wong & Hughes, 2023; Wong et al., 2023b). By incorporating interactivity through embedded video questions, the video lessons promoted active learning, with learners’ needs and behaviors in the course taken into account. This design choice transformed passive video consumption into an interactive and participatory experience, aligning with LXD’s focus on fostering engagement through experiential learning techniques (Floor, 2018). Additionally, the alignment of the study’s findings with LXD underscores the value of interdisciplinary collaboration when implementing educational technologies at scale. To make this study possible, we worked alongside the university instructor, an instructional designer, and a researcher to consider the integration of instructional design, learning sciences, theories of learning, and user experience design (Weigel, 2015). In doing so, we were able to ensure that the course was properly aligned with the LXD paradigm, grounded in learning theories such as the testing effect and Bloom’s Taxonomy, and deployed with an empathic lens to promote students’ active learning behaviors in online learning settings. Thus, our efforts led to the implementation of a technology-enhanced online learning experience that effectively supported learners’ quiz grades, engagement, self-regulation, and critical thinking.

7.4 Implications for Practice and Future Directions

The implications of this study for educators, instructional designers, and higher education administrators are significant. First, the incorporation of immediate low-stakes questioning directly within video content offers a promising avenue for enriching online learning experiences rooted in the Learning Experience Design (LXD) paradigm and the testing effect model. Educators can integrate these strategies and technological modalities into their course designs to foster active learning and deepen learners’ engagement with course material. Instructional designers, drawing on LXD principles, can create meaningful learning experiences that incorporate evidence-based pedagogical strategies, such as embedding low-stakes questions within instructional content. Facilitating the testing effect with low-stakes questioning can extend beyond videos to readings, assignments, and course activities. Moreover, higher education administrators and institutions should recognize the importance of integrating technology in line with evidence-based pedagogies. While the rapid introduction of educational technology (edtech) tools during the COVID-19 pandemic facilitated emergency remote learning, our study underscores the necessity of aligning these tools with pedagogical frameworks to optimize their effectiveness. By investing in the development and implementation of technologies that promote active learning and enhance learners’ engagement, self-regulation, and critical thinking, institutions can better equip students for success in online learning environments while capitalizing on existing edtech resources. An essential aspect of our study is to raise awareness of the range of tools already available to and supported by universities. Ensuring accessibility for instructors, designers, researchers, and students is imperative, enabling effective adoption of these tools alongside evidence-based strategies.
We aspire for this study to serve as an example of how university investments in tools can positively impact students’ learning experiences, encouraging others to adopt similar approaches as we continue to refine our support for students’ needs.

7.4.1 Limitations

Further research is needed to thoroughly assess the long-term benefits of embedding low-stakes questions directly into videos in online undergraduate courses. During this study, participants in both groups were presented with low-stakes questions throughout the course. Students in the immediate condition encountered questions embedded within the video player, triggered automatically and synchronized with the instructional content. In contrast, those in the delayed condition faced identical questions after viewing all of the lecture videos in the instructional unit. While the timing of the questions served as a deliberate experimental manipulation between the two groups, determining whether the testing effect was more pronounced in either condition poses a limitation of the study. Despite high weekly quiz grades in the mid to upper 90% range for both conditions, quiz scores were significantly higher for those who experienced questions directly embedded in the video. However, it is important to note that scores remained consistently high across both conditions, suggesting that the testing effect may manifest regardless of question timing or that the question difficulty may need to be adjusted. This highlights the need for further exploration of how the testing effect operates across instructional courses, topics, and learning contexts. Future research could involve a quasi-experimental study comprising a traditional control group without questions and treatment conditions integrating embedded video questions; larger sample sizes across STEM courses could reveal the true advantages of the testing effect. Moreover, future research could consider controlling for additional learning analytics, such as video completion rates, assignment submission times, and accuracy on the low-stakes questions, as predictors of learners’ course performance and learning outcomes.
Understanding these dynamics can refine instructional strategies for optimizing learning outcomes in online education settings. We deliberately refrained from introducing additional learning opportunities between groups to ensure equal access to course content. Our aim was to evaluate the timing and integration of questions within or following video content, scrutinizing the effectiveness and benefits of implementing the embedded video questioning platform within the framework of LXD.

As a future direction, we plan to investigate the long-term impacts of embedded video questions on knowledge retention and transferable skills. Additionally, analyzing question types, quantity, and difficulty, along with on-demand feedback and spacing intervals within videos, could inform optimal design choices for promoting knowledge outcomes and student learning behaviors. Enhanced designs might include direct feedback for each of the low-stakes questions, adjusting the quantity of low-stakes questions learners encounter, and refining the difficulty level to better cater to individual learning needs. Further research is warranted to explore underlying mechanisms, optimal design, and factors influencing cognitive aspects such as affect, cognitive load, and mind-wandering. Structural equation modeling, given adequate sample sizes, could provide insights into the intricate mechanisms exhibited by students. Lastly, exploring the scalability of this approach across different subject domains and learner populations could enhance understanding of its generalizability and of the benefits of operationalizing the testing effect through embedded video within the LXD paradigm.

8 Conclusion

The integration of low-stakes questioning embedded directly into the video player within an asynchronous online course grounded in the Learning Experience Design (LXD) paradigm showed significantly positive effects on learners’ engagement, self-regulation, and critical thinking compared to their counterparts. In addition, results showed that learners in the immediate condition had significantly higher quiz grades, pageviews, and course participation after 10 instructional weeks. Furthermore, findings revealed that one potential mechanism underpinning learners’ increased quiz grades might be their levels of self-regulation when experiencing embedded video questions. As evidenced by students’ learning analytics and self-reported online engagement, learners were more actively involved in the learning process, with the timing of the embedded questions activating students’ awareness to reflect on “what, how, and why” before critically deciding on answers to the conceptual questions. We suspect that learners experienced more of the benefits of the testing effect given our LX design decisions: where the questions were placed, when they appeared, and how they were designed. Thus, the results suggest that an LX-designed, self-paced online course deployed with low-stakes questions directly embedded in video is efficacious for students’ science learning outcomes and may have practical implications for the sustainability and rigor of undergraduate science distance learning. As a result, this study contributes to the growing body of literature on technology-enhanced pedagogical strategies for online learning and underscores the importance of aligning “edtech” tools with evidence-based pedagogical frameworks.
By fostering active learning through embedded low-stakes video questions, educators and instructional designers create online learning experiences that are more engaging, meaningful, and effective, ultimately enhancing students’ academic outcomes and transferable skills in digital learning environments. As institutions continue to invest in educational technology, the collaborative integration of expertise from diverse fields will be pivotal in designing and implementing effective and engaging online learning environments.

Data Availability

The data that support the findings of this study are available from the corresponding author, Joseph Wong, upon reasonable request.

Adesope, O. O., Trevisan, D. A., & Sundararajan, N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research , 87 (3), 659–701. https://doi.org/10.3102/0034654316689306 .

Agarwal, P. K., Karpicke, J. D., Kang, S. H., Roediger, H. L., III, & McDermott, K. B. (2008). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition , 22 (7), 861–876. https://doi.org/10.1002/acp.1391 .

Ahn, J. (2019). Drawing inspiration for learning experience design (LX) from diverse perspectives. The Emerging Learning Design Journal , 6 (1), 1. https://digitalcommons.montclair.edu/eldj/vol6/iss1/1 .

Al-Harthy, I. S., Was, C. A., & Isaacson, R. M. (2010). Goals, efficacy and metacognitive self-regulation: A path analysis. International Journal of Education , 2 (1), 1.

Asad, M. M., Hussain, N., Wadho, M., Khand, Z. H., & Churi, P. P. (2020). Integration of e-learning technologies for interactive teaching and learning process: An empirical study on higher education institutes of Pakistan. Journal of Applied Research in Higher Education . https://doi.org/10.1108/JARHE-04-2020-0103 .

Azevedo, R., Moos, D. C., Johnson, A. M., & Chauncey, A. D. (2010). Measuring cognitive and metacognitive regulatory processes during hypermedia learning: Issues and challenges. Educational psychologist, 45 (4), 210–223. https://doi.org/10.1080/00461520.2010.515934 .

Barak, M., Hussein-Farraj, R., & Dori, Y. J. (2016). On-campus or online: Examining self-regulation and cognitive transfer skills in different learning settings. International Journal of Educational Technology in Higher Education , 13 (1), 1–18. https://doi.org/10.1186/s41239-016-0035-9 .

Betts, S. C. (2008). Teaching and assessing basic concepts to advanced applications: Using Bloom’s taxonomy to inform graduate course design. Academy of Educational Leadership Journal , 12 (3), 99.

Bloom, H. (2001). How to read and why . Simon and Schuster.

Bolliger, D. U., & Halupa, C. (2018). Online student perceptions of engagement, transactional distance, and outcomes. Distance Education , 39 (3), 299–316. https://doi.org/10.1080/01587919.2018.1476845 .

Brookfield, S. (1995). Adult learning: An overview. International Encyclopedia of Education , 10 , 375–380.

Bruning, K. (2005). The role of critical thinking in the online learning environment. International Journal of Instructional Technology and Distance Learning , 2 (5), 21–31.

Carpenter, S. K. (2009). Cue strength as a moderator of the testing effect: The benefits of elaborative retrieval. Journal of Experimental Psychology: Learning Memory and Cognition , 35 (6), 1563. https://doi.org/10.1037/a0017021 .

Chan, J. C. (2010). Long-term effects of testing on the recall of nontested materials. Memory, 18 (1), 49–57. https://doi.org/10.1080/09658210903405737 .

Carrier, M., & Pashler, H. (1992). The influence of retrieval on retention. Memory & Cognition , 20 , 633–642. https://doi.org/10.3758/BF03202713 .

Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .

Chick, R. C., Clifton, G. T., Peace, K. M., Propper, B. W., Hale, D. F., Alseidi, A. A., & Vreeland, T. J. (2020). Using technology to maintain the education of residents during the COVID-19 pandemic. Journal of Surgical Education , 77 (4), 729–732. https://doi.org/10.1016/j.jsurg.2020.03.018 .

Christiansen, M. A., Lambert, A. M., Nadelson, L. S., Dupree, K. M., & Kingsford, T. A. (2017). In-class versus at-home quizzes: Which is better? A flipped learning study in a two-site synchronously broadcast organic chemistry course. Journal of Chemical Education , 94 (2), 157–163. https://doi.org/10.1021/acs.jchemed.6b00370 .

Churches, A. (2008). Bloom’s taxonomy blooms digitally. Tech & Learning , 1 , 1–6.

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator , 15 (3), 6–11. https://eric.ed.gov/?id=EJ440511 .

Corporation for Public Broadcasting (1997). Study of school uses of television and video. 1996–1997 School year summary report. (ERIC Document Reproduction Service No. ED 413 879).

Corporation for Public Broadcasting (2004). Television goes to school: The impact of video on student learning in formal education. Available: http://www.cpb.org/stations/reports/tvgoestoschool/ .

Correia, A. P. (2021). ID 2 LXD. From instructional design to learning experience design: The Rise of design thinking. Driving educational change: Innovations in action .

Cruse, E. (2006). Using educational video in the classroom: Theory, research and practice. Library Video Company , 12 (4), 56–80.

Cummins, S., Beresford, A. R., & Rice, A. (2015). Investigating engagement with in-video quiz questions in a programming course. IEEE Transactions on Learning Technologies , 9 (1), 57–66. https://doi.org/10.1109/TLT.2015.2444374 .

Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering student engagement in online learning environments. TechTrends , 60 , 532–539. https://doi.org/10.1007/s11528-016-0110-z .

Delen, E., Liew, J., & Willson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education , 78 , 312–320. https://doi.org/10.1016/j.compedu.2014.06.018 .

Deng, R., & Gao, Y. (2023). Effects of embedded questions in pre-class videos on learner perceptions, video engagement, and learning performance in flipped classrooms. Active Learning in Higher Education . https://doi.org/10.1177/14697874231167098

Deng, R., Feng, S., & Shen, S. (2023). Improving the effectiveness of video-based flipped classrooms with question-embedding. Education and Information Technologies , 1–26. https://doi.org/10.1007/s10639-023-12303-5 .

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest , 14 (1), 4–58. https://doi.org/10.1177/1529100612453266 .

EDSCOOP Staff (June 5, 2023). Colleges spent $1B on distance-learning tech at COVID-19 peak. https://edscoop.com/colleges-spent-distance-learning-tech-covid-19/#:~:text=Of%20more%20than%20%2426%20billion,students%2C%20according%20to%20the%20report .

Ertmer, P. A., Richardson, J. C., Lehman, J. D., Newby, T. J., Cheng, X., Mong, C., & Sadaf, A. (2010). Peer feedback in a large undergraduate blended course: Perceptions of value and learning. Journal of Educational Computing Research, 43 (1), 67–88. https://doi.org/10.2190/EC.43.1.e .

Fiorella, L., & Mayer, R. E. (2015). Learning as a generative activity . Cambridge University Press.

Fisher, M., & Baird, D. E. (2005). Online learning design that fosters student support, self-regulation, and retention. Campus-wide Information Systems , 22 (2), 88–107. https://doi.org/10.1108/10650740510587100 .

Floor, N. (2018). What is learning experience design . Springer.

Floor, N. (2023). This is learning experience design: What it is, how it works, and why it matters . New Riders.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74 (1), 59–109. https://doi.org/10.3102/00346543074001059 .

Fries-Britt, S., & White-Lewis, D. (2020). In pursuit of meaningful relationships: How black males perceive faculty interactions in STEM. The Urban Review , 52 (3), 521–540. https://doi.org/10.1007/s11256-020-00559-x .

Fiorella, L., & Mayer, R. E. (2018). What works and doesn't work with instructional video. Computers in Human Behavior, 89 , 465–470. https://doi.org/10.1016/j.chb.2018.07.015

Giannakos, M. N. (2013). Exploring the video-based learning research: A review of the literature. British Journal of Educational Technology , 44 (6), E191–E195. https://doi.org/10.1111/bjet.12070 .

Haagsman, M. E., Scager, K., Boonstra, J., & Koster, M. C. (2020). Pop-up questions within educational videos: Effects on students’ learning. Journal of Science Education and Technology , 29 , 713–724. https://doi.org/10.1007/s10956-020-09847-3 .

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: Disposition, skills, structure training, and metacognitive monitoring. American Psychologist , 53 (4), 449. https://doi.org/10.1037/0003-066X.53.4.449 .

Halverson, L. R., & Graham, C. R. (2019). Learner engagement in blended learning environments: A conceptual framework. Online Learning , 23 (2), 145–178. https://doi.org/10.24059/olj.v23i2.1481 .

Hu, S., & Kuh, G. D. (2002). Being (dis) engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education , 43 , 555–575. https://doi.org/10.1023/A:1020114231387 .

Humphries, B., & Clark, D. (2021). An examination of student preference for traditional didactic or chunking teaching strategies in an online learning environment. Research in Learning Technology . https://doi.org/10.25304/rlt.v29.2405

Instructure (2024, January 18). How do I view analytics for an individual student in new analytics? Instructure Community. https://community.canvaslms.com/t5/Instructor-Guide/How-do-I-view-analytics-for-an-individual-student-in-New/ta-p/801 .

Iwamoto, D. H., Hargis, J., Taitano, E. J., & Vuong, K. (2017). Analyzing the efficacy of the testing effect using KahootTM on student performance. Turkish Online Journal of Distance Education , 18 (2), 80–93. https://doi.org/10.17718/tojde.306561 .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers & Education , 95 , 270–284. https://doi.org/10.1016/j.compedu.2016.01.014 .

Jain, S., & Dowson, M. (2009). Mathematics anxiety as a function of multidimensional self-regulation and self-efficacy. Contemporary Educational Psychology , 34 (3), 240–249. https://doi.org/10.1016/j.cedpsych.2009.05.004 .

Johnson, C. I., & Mayer, R. E. (2009). A testing effect with multimedia learning. Journal of Educational Psychology , 101 (3), 621. https://doi.org/10.1037/a0015183 .

Kanuka, H. (2006). Instructional design and eLearning: A discussion of pedagogical content knowledge as a missing construct. E-Journal of Instructional Science and Technology, 9 (2), n2.

Kestin, G., & Miller, K. (2022). Harnessing active engagement in educational videos: Enhanced visuals and embedded questions. Physical Review Physics Education Research , 18 (1), 010148. https://doi.org/10.1103/PhysRevPhysEducRes.18.010148 .

Kolås, L. (2015, June). Application of interactive videos in education. In 2015 International Conference on Information Technology Based Higher Education and Training (ITHET) (pp. 1–6). IEEE. https://doi.org/10.1109/ITHET.2015.7218037 .

Kovacs, G. (2016, April). Effects of in-video quizzes on MOOC lecture viewing. In Proceedings of the third (2016) ACM conference on Learning@ Scale (pp. 31–40). https://doi.org/10.1145/2876034.2876041 .

Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2005). Never let it rest lessons about student success from high-performing colleges and universities. Change: The Magazine of Higher Learning , 37 (4), 44–51. https://doi.org/10.3200/CHNG.37.4.44-51 .

Littrell-Baez, M. K., Friend, A., Caccamise, D., & Okochi, C. (2015). Using retrieval practice and metacognitive skills to improve content learning. Journal of Adolescent & Adult Literacy , 58 (8), 682–689. https://doi.org/10.1002/jaal.420 .

Matthews, M. T., Williams, G. S., Yanchar, S. C., & McDonald, J. K. (2017). Empathy in distance learning design practice. TechTrends , 61 (5), 486–493. https://doi.org/10.1007/s11528-017-0212-2 .

Marshall, F. B., & Marshall, J. (2021, November). The effects of embedding knowledge-check questions in instructional videos. In innovate learning summit (pp. 319–327). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/220301/ .

Mayer, R. E. (2009). Constructivism as a theory of learning versus constructivism as a prescription for instruction. In Constructivist instruction (pp. 196–212). Routledge.

Mayer, R. E. (Ed.). (2005). The Cambridge handbook of multimedia learning . Cambridge University Press.

Mayer, R. E. (2014). Introduction to multimedia learning.

Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning , 33 (5), 403–423. https://doi.org/10.1111/jcal.12197 .

Mayer, R. E. (2019). Thirty years of research on online learning. Applied Cognitive Psychology , 33 (2), 152–159. https://doi.org/10.1002/acp.3482 .

Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology , 93 (1), 187. https://doi.org/10.1037/0022-0663.93.1.187 .

Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., ... & Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology , 34 (1), 51–57. https://doi.org/10.1016/j.cedpsych.2008.04.002

McDaniel, M. A., Thomas, R. C., Agarwal, P. K., McDermott, K. B., & Roediger, H. L. (2013). Quizzing in middle-school science: Successful transfer performance on classroom exams. Applied Cognitive Psychology , 27 (3), 360–372. https://doi.org/10.1002/acp.2914 .

McMahon, M., & Oliver, R. (2001). Promoting self-regulated learning in an on-line environment (pp. 1299–1305). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/8630/ .

van der Meij, H., & Böckmann, L. (2021). Effects of embedded questions in recorded lectures. Journal of Computing in Higher Education , 33 (1), 235–254. https://doi.org/10.1007/s12528-020-09263-x .

McDaniel, M. A., Agarwal, P. K., Huelser, B. J., McDermott, K. B., & Roediger III, H. L. (2011). Test-enhanced learning in a middle school science classroom: The effects of quiz frequency and placement. Journal of Educational Psychology, 103 (2), 399. https://doi.org/10.1037/a0021782 .

Moos, D. C., & Bonde, C. (2016). Flipping the classroom: Embedding self-regulated learning prompts in videos. Technology Knowledge and Learning , 21 , 225–242. https://doi.org/10.1007/s10758-015-9269-1 .

National Center for Education Statistics (2022). Postbaccalaureate Enrollment. Condition of Education. U.S. Department of Education, Institute of Education Sciences. Retrieved May 31, 2022, https://nces.ed.gov/programs/coe/indicator/chb .

O’Leary, B., & June, A. W. (2023, May 30). Higher ed received billions in Covid-relief money. Where did it go? The Chronicle of Higher Education. https://www.chronicle.com/article/higher-ed-received-billions-in-covid-relief-money-where-did-it-go

Pan, S. C., Cooke, J., Little, J. L., McDaniel, M. A., Foster, E. R., Connor, L. T., & Rickard, T. C. (2019). Online and clicker quizzing on jargon terms enhances definition-focused but not conceptually focused biology exam performance. CBE—Life Sciences Education , 18 (4), ar54. https://doi.org/10.1187/cbe.18-12-0248 .

Pellas, N. (2018). Is the flipped classroom model for all? Correspondence analysis from trainee instructional media designers. Education and Information Technologies, 23 (2), 757–775. https://doi.org/10.1007/s10639-017-9634-x .

Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior , 35 , 157–170. https://doi.org/10.1016/j.chb.2014.02.048 .

Peng, Y., Liu, Y., & Guo, C. (2019). Examining the neural mechanism behind testing effect with concrete and abstract words. Neuroreport , 30 (2), 113–119. https://doi.org/10.1097/WNR.0000000000001169 .

Picciano, A. G. (2023). Future technological trends and research. In Data Analytics and Adaptive Learning (pp. 303–322). Routledge.

Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the motivated strategies for learning questionnaire (MSLQ). Educational and Psychological Measurement , 53 (3), 801–813. https://doi.org/10.1177/0013164493053003024 .

Pulukuri, S., & Abrams, B. (2021). Improving learning outcomes and metacognitive monitoring: Replacing traditional textbook readings with question-embedded videos. Journal of Chemical Education , 98 (7), 2156–2166. https://doi.org/10.1021/acs.jchemed.1c00237 .

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning Journal , 22 (1), 183–204. https://doi.org/10.24059/olj.v22i1.1175 .

Rice, P., Beeson, P., & Blackmore-Wright, J. (2019). Evaluating the impact of a quiz question within an educational video. TechTrends , 63 (5), 522–532. https://doi.org/10.1007/s11528-019-00374-6 .

Richland, L. E., Kornell, N., & Kao, L. S. (2009). The pretesting effect: Do unsuccessful retrieval attempts enhance learning? Journal of Experimental Psychology: Applied , 15 (3), 243. https://doi.org/10.1037/a0016496 .

Richland, L. E., & Simms, N. (2015). Analogy, higher order thinking, and education. Wiley Interdisciplinary Reviews: Cognitive Science , 6 (2), 177–192. https://doi.org/10.1002/wcs.1336 .

Roediger, H. L., III, & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science , 1 (3), 181–210. https://doi.org/10.1111/j.1745-6916.2006.00012.x .

Rossing, J. P., Miller, W., Cecil, A. K., & Stamper, S. E. (2012). iLearning: The future of higher education? Student perceptions on learning with mobile tablets. https://hdl.handle.net/1805/7071 .

Ryan, R. M., & Deci, E. L. (2017). Self-determination theory: Basic psychological needs in motivation, development, and wellness . Guilford publications.

Sandars, J., Correia, R., Dankbaar, M., de Jong, P., Goh, P. S., Hege, I., & Pusic, M. (2020). Twelve tips for rapidly migrating to online learning during the COVID-19 pandemic. https://doi.org/10.15694/mep.2020.000082.1 .

Sansone, C., Fraughton, T., Zachary, J. L., Butner, J., & Heiner, C. (2011). Self-regulation of motivation when learning online: The importance of who, why and how. Educational Technology Research and Development , 59 , 199–212. https://doi.org/10.1007/s11423-011-9193-6 .

Scott, E. E., Wenderoth, M. P., & Doherty, J. H. (2020). Design-based research: A methodology to extend and enrich biology education research. CBE—Life Sciences Education, 19 (2), es11. https://doi.org/10.1187/cbe.19-11-0245 .

Schmitz, W. H. G. (2020). Embedded questions in text and video-based lectures (Master’s thesis, University of Twente). https://purl.utwente.nl/essays/82825 .

Shneiderman, B., & Hochheiser, H. (2001). Universal usability as a stimulus to advanced interface design. Behaviour & Information Technology , 20 (5), 367–376. https://doi.org/10.1080/01449290110083602 .

Siek, K. A., Hayes, G. R., Newman, M. W., & Tang, J. C. (2014). Field deployments: Knowing from using in context. In J. Olson & W. Kellogg (Eds.), Ways of knowing in HCI (pp. 119–142). New York, NY: Springer. https://doi.org/10.1007/978-1-4939-0378-8_6 .

Sotola, L. K., & Crede, M. (2021). Regarding class quizzes: A meta-analytic synthesis of studies on the relationship between frequent low-stakes testing and class performance. Educational Psychology Review , 33 , 407–426. https://doi.org/10.1007/s10648-020-09563-9 .

Sun, J. C. Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self‐regulation: Their impact on student engagement in distance education. British Journal of Educational Technology , 43 (2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x .

Swan, K., Garrison, D. R., & Richardson, J. C. (2009). A constructivist approach to online learning: The community of inquiry framework. In Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). IGI global.

Torres, D., Pulukuri, S., & Abrams, B. (2022). Embedded questions and targeted feedback transform passive educational videos into effective active learning tools. Journal of Chemical Education , 99 (7), 2738–2742. https://doi.org/10.1021/acs.jchemed.2c00342 .

Tullis, J. G., & Benjamin, A. S. (2011). On the effectiveness of self-paced learning. Journal of Memory and Language , 64 (2), 109–118. https://doi.org/10.1016/j.jml.2010.11.002 .

Uzuntiryaki-Kondakci, E., & Capa-Aydin, Y. (2013). Predicting critical thinking skills of university students through metacognitive self-regulation skills and chemistry self-efficacy. Educational Sciences: Theory and Practice , 13 (1), 666–670. https://eric.ed.gov/?id=EJ1016667 .

Vrugt, A., & Oort, F. J. (2008). Metacognition, achievement goals, study strategies and academic achievement: Pathways to achievement. Metacognition and Learning , 3 , 123–146. https://doi.org/10.1007/s11409-008-9022-4 .

Wang, H. H., Chen, H. T., Lin, H. S., & Hong, Z. R. (2017). The effects of college students’ positive thinking, learning motivation and self-regulation through a self-reflection intervention in Taiwan. Higher Education Research & Development , 36 (1), 201–216. https://doi.org/10.1080/07294360.2016.1176999 .

Wang, C. H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education , 34 (3), 302–323. https://doi.org/10.1080/01587919.2013.835779 .

Weigel, M. (2015). Learning experience design versus user experience: Moving from user to Learner. Sixredmarbles .

Wolters, C. A., & Benzon, M. B. (2013). Assessing and predicting college students’ use of strategies for the self-regulation of motivation. The Journal of Experimental Education , 81 (2), 199–221. https://doi.org/10.1080/00220973.2012.699901 .

Wong, J. T., Bui, N. N., Fields, D. T., & Hughes, B. S. (2023). A learning experience design approach to online professional development for teaching science through the arts: Evaluation of teacher content knowledge, self-efficacy and STEAM perceptions. Journal of Science Teacher Education, 34 , 1–31. https://doi.org/10.1080/1046560X.2022.2112552

Wong, J., Chen, E., Rose, E., Lerner, B., Richland, L., & Hughes, B. (2023). The cognitive and behavioral learning impacts of embedded video questions: Leveraging learning experience design to support students’ knowledge outcomes. In P. Blikstein, J. Van Aalst, R. Kizito, & K. Brennan (Eds.), Proceedings of the 17th international conference of the learning sciences - ICLS 2023 (pp. 1861–1862). International Society of the Learning Sciences. https://doi.org/10.22318/icls2023.356980

Wong, J. T., Chen, E., Au-Yeung, N., Lerner, B. S., & Richland, L. E. (2024). Fostering engaging online learning experiences: Investigating situational interest and mind-wandering as mediators through learning experience design. Education and Information Technologies, 1–27. https://doi.org/10.1007/s10639-024-12524-2

Wong, J. T., & Hughes, B. S. (2022). Leveraging learning experience design: Digital media approaches to influence motivational traits that support student learning behaviors in undergraduate online courses. Journal of Computing in Higher Education, 35 , 1–38. https://doi.org/10.1007/s12528-022-09342-1

Wong, J. T., Mesghina, A., Chen, E., Yeung, N. A., Lerner, B. S., & Richland, L. E. (2023b). Zooming in or zoning out: Examining undergraduate learning experiences with zoom and the role of mind-wandering. Computers and Education Open , 4 , 100118. https://doi.org/10.1016/j.caeo.2022.100118 .

Yousef, A. M. F., Chatti, M. A., & Schroeder, U. (2014). Video-based learning: A critical analysis of the research published in 2003–2013 and future visions. In eLmL 2014, The sixth international conference on mobile, hybrid, and on-line learning (pp. 112–119).

Zimmerman, B. J., & Schunk, D. H. (Eds.). (2001). Self-regulated learning and academic achievement: Theoretical perspectives . Routledge.

Download references

Acknowledgements

We thank all the participating students, the instructor, university staff, and administrators. We were impressed by their enthusiasm for adopting and learning new LXD strategies in the pandemic teaching and learning environment.

This work was supported by the National Science Foundation Graduate Research Fellowship, under grant number 2020304238 to the first author via the University of California, Irvine.

Author information

Authors and Affiliations

University of California, Irvine, 3200 Education Bldg, Irvine, CA, 92697, USA

Joseph T. Wong & Lindsey Engle Richland

University of California, Irvine, 301 Steinhaus Hall, Irvine, CA, 92697, USA

Bradley S. Hughes


Contributions

Joseph Wong: concept and design, data acquisition, data analysis/interpretation, manuscript writing, statistical analysis, technical support, material support. Lindsey Richland: critical manuscript revision, supervision, administration, material support. Bradley Hughes: instruction, concept and design, data acquisition, data analysis/interpretation, critical revision of the manuscript, administration, supervision.

Corresponding author

Correspondence to Joseph T. Wong .

Ethics declarations

Conflict of Interest

No potential conflict of interest was reported by the authors.

Ethical Approval

This study was approved by the Internal Review Board Ethics Committee at the University.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Wong, J.T., Richland, L.E. & Hughes, B.S. Immediate Versus Delayed Low-Stakes Questioning: Encouraging the Testing Effect Through Embedded Video Questions to Support Students’ Knowledge Outcomes, Self-Regulation, and Critical Thinking. Tech Know Learn (2024). https://doi.org/10.1007/s10758-024-09746-1

Download citation

Accepted : 20 May 2024

Published : 30 July 2024

DOI : https://doi.org/10.1007/s10758-024-09746-1


Keywords

  • Learning experience design
  • Testing effect
  • Embedded video questions
  • Critical thinking
  • Self-regulation



Managing Innovation: Integrating Technological, Market and Organizational Change, 7th Edition

ISBN: 978-1-119-71319-7

December 2020


Joe Tidd , John R. Bessant

Now in its seventh edition, Managing Innovation: Integrating Technological, Market and Organizational Change enables graduate and undergraduate students to develop the unique skill set and the foundational knowledge required to successfully manage innovation, technology, and new product development. This bestselling text has been fully updated with new data, new methods, and new concepts while still retaining its holistic approach to the subject. The text provides an integrated, evidence-based approach to innovation management that is supported by the latest academic research and the authors’ extensive experience in real-world management practice.

Students are provided with an impressive range of learning tools, including numerous case studies, illustrative examples, discussion questions, and key information boxes, to help them explore the innovation process and its relation to markets, technology, and the organization. "Research Notes" examine the latest evidence and topics in the field, while "Views from the Front Line" offer insights from practicing innovation managers and connect the covered material to actual experiences and challenges. Throughout the text, students are encouraged to apply their knowledge and critical thinking skills to business model innovation, creativity, entrepreneurship, service innovation, and many more current and emerging approaches and practices.

New to this Edition:

  • Updated case studies, examples, research notes, and references to reflect the most current information available
  • Unique chapter on digital innovation
  • Unique chapter on social and sustainable innovation
  • Enhanced instructor materials and support, including new seminar activities, project tools, and video tutorials

Wiley Advantage:

  • An enhanced ebook includes videos, audio content, additional cases and resources, and chapter concept check assessment questions
  • Provides practical and tested processes, models, and tools
  • Emphasizes real-life application over abstract theory
  • Includes video content, podcast material, an accompanying interactive e-book, and other multimedia supplements
  • Features interactive innovation tools and exercises to reinforce critical concepts
  • Includes access to the “Innovation Portal” for both student and instructor, containing searchable innovation tools, cases, and exercises
  • Offers an extensive range of teaching resources via an instructor’s companion website


Probing the Relationship Between Evidence-Based Practice Implementation Models and Critical Thinking in Applied Nursing Practice

  • PMID: 27031030
  • DOI: 10.3928/00220124-20160322-05

HOW TO OBTAIN CONTACT HOURS BY READING THIS ISSUE

Instructions: 1.2 contact hours will be awarded by Villanova University College of Nursing upon successful completion of this activity. A contact hour is a unit of measurement that denotes 60 minutes of an organized learning activity. This is a learner-based activity; Villanova University College of Nursing does not require submission of your answers to the quiz. A contact hour certificate will be awarded after you register, pay the registration fee, and complete the evaluation form online at http://goo.gl/gMfXaf. In order to obtain contact hours you must:

  1. Read the article, "Probing the Relationship Between Evidence-Based Practice Implementation Models and Critical Thinking in Applied Nursing Practice," found on pages 161-168, carefully noting any tables and other illustrative materials that are included to enhance your knowledge and understanding of the content. Be sure to keep track of the amount of time (number of minutes) you spend reading the article and completing the quiz.
  2. Read and answer each question on the quiz. After completing all of the questions, compare your answers to those provided within this issue. If you have incorrect answers, return to the article for further study.
  3. Go to the Villanova website to register for contact hour credit. You will be asked to provide your name, contact information, and a VISA, MasterCard, or Discover card number for payment of the $20.00 fee. Once you complete the online evaluation, a certificate will be automatically generated. This activity is valid for continuing education credit until March 31, 2019.

CONTACT HOURS This activity is co-provided by Villanova University College of Nursing and SLACK Incorporated. Villanova University College of Nursing is accredited as a provider of continuing nursing education by the American Nurses Credentialing Center's Commission on Accreditation.

Objectives:

  • Describe the key components and characteristics related to evidence-based practice and critical thinking.
  • Identify the relationship between evidence-based practice and critical thinking.

DISCLOSURE STATEMENT: Neither the planners nor the author have any conflicts of interest to disclose.

Evidence-based practice is not a new concept to the profession of nursing, yet its application and sustainability are inconsistent in nursing practice. Despite the expansion of efforts to teach evidence-based practice and to apply evidence practically at the bedside, a research-practice gap still exists. Several critical factors contribute to the successful application of evidence in practice, including critical thinking. The purpose of this article is to discuss the relationship between critical thinking and current evidence-based practice implementation models. Understanding this relationship will help nurse educators and clinicians cultivate critical thinking skills in nursing staff so that evidence is applied most effectively at the bedside. Critical thinking is a key element and is essential to the learning and implementation of evidence-based practice, as demonstrated by its integration into evidence-based practice implementation models.

Copyright 2016, SLACK Incorporated.




Journal of Occupational Therapy Education


Concept Mapping as an Instructional Method to Support Critical Thinking in Occupational Therapy Students: A Pilot Study

Alissa R. Baker, Western Michigan University; Cassandra C. Ginn, Eastern Kentucky University

Document Type: Original Research

In occupational therapy practice, critical thinking is a foundational skill for the delivery of effective care; however, there is limited evidence on the development of critical thinking skills in occupational therapy education. The purpose of this study was to explore the effects and student perceptions of concept mapping on critical thinking skills in occupational therapy education. This study used a quasi-experimental design with a retrospective pre-post assessment after two teaching conditions: (a) traditional lecture and (b) concept mapping. The same convenience sample of students was used for each condition. Additional outcome measures included assessment of student concept maps using a scoring rubric and a survey of students’ perceptions on the use of concept mapping. Results of the retrospective pre-post assessment indicated significant gains in student knowledge (p

Alissa Baker, OTD, OTR/L is an Assistant Professor at Western Michigan University in the Department of Occupational Therapy, Kalamazoo, MI.

Cassandra Ginn, OTD, OTR/L, CBIS is an Assistant Professor at Eastern Kentucky University in the Department of Occupational Science and Occupational Therapy. She has over a decade of experience in clinical practice, working in inpatient rehabilitation.

Declaration of Interest

The authors report no declarations of interest.

Recommended Citation

Baker, A. R., & Ginn, C. C. (2023). Concept Mapping as an Instructional Method to Support Critical Thinking in Occupational Therapy Students: A Pilot Study. Journal of Occupational Therapy Education, 7 (3). https://doi.org/10.26681/jote.2023.070307



ISSN: 2573-1378



Caring for Your Mental Health


Mental health includes emotional, psychological, and social well-being. It is more than the absence of a mental illness—it’s essential to your overall health and quality of life. Self-care can play a role in maintaining your mental health and help support your treatment and recovery if you have a mental illness.

How can I take care of my mental health?

Self-care means taking the time to do things that help you live well and improve both your physical health and mental health. This can help you manage stress, lower your risk of illness, and increase your energy. Even small acts of self-care in your daily life can have a big impact.

Here are some self-care tips:

  • Get regular exercise. Just 30 minutes of walking every day can boost your mood and improve your health. Small amounts of exercise add up, so don’t be discouraged if you can’t do 30 minutes at one time.
  • Eat healthy, regular meals and stay hydrated. A balanced diet and plenty of water can improve your energy and focus throughout the day. Pay attention to your intake of caffeine and alcohol and how they affect your mood and well-being; for some, decreasing caffeine and alcohol consumption can be helpful.
  • Make sleep a priority. Stick to a schedule, and make sure you’re getting enough sleep. Blue light from devices and screens can make it harder to fall asleep, so reduce blue light exposure from your phone or computer before bedtime.
  • Try a relaxing activity. Explore relaxation or wellness programs or apps, which may incorporate meditation, muscle relaxation, or breathing exercises. Schedule regular times for these and other healthy activities you enjoy, such as listening to music, reading, spending time in nature, and engaging in low-stress hobbies.
  • Set goals and priorities. Decide what must get done now and what can wait. Learn to say “no” to new tasks if you start to feel like you’re taking on too much. Try to appreciate what you have accomplished at the end of the day.
  • Practice gratitude. Remind yourself daily of things you are grateful for. Be specific. Write them down or replay them in your mind.
  • Focus on positivity. Identify and challenge your negative and unhelpful thoughts.
  • Stay connected. Reach out to friends or family members who can provide emotional support and practical help.

Self-care looks different for everyone, and it is important to find what you need and enjoy. It may take trial and error to discover what works best for you.

Learn more about healthy practices for your mind and body.

When should I seek professional help?

Seek professional help if you are experiencing severe or distressing symptoms that have lasted 2 weeks or more, such as:

  • Difficulty sleeping
  • Changes in appetite or unplanned weight changes
  • Difficulty getting out of bed in the morning because of mood
  • Difficulty concentrating
  • Loss of interest in things you usually find enjoyable
  • Inability to complete usual tasks and activities
  • Feelings of irritability, frustration, or restlessness

How can I find help?

If you have concerns about your mental health, talk to a primary care provider. They can refer you to a qualified mental health professional, such as a psychologist, psychiatrist, or clinical social worker, who can help you figure out the next steps. Find tips for talking with a health care provider about your mental health.

You can learn more about getting help on the NIMH website. You can also learn about finding support and locating mental health services in your area on the Substance Abuse and Mental Health Services Administration website.

If you or someone you know is struggling or having thoughts of suicide, call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. This service is confidential, free, and available 24 hours a day, 7 days a week. In life-threatening situations, call 911.

Suicide is preventable—learn about warning signs of suicide and action steps for helping someone in emotional distress.

Featured videos

GREAT: Helpful Practices to Manage Stress and Anxiety:  Learn about helpful practices to manage stress and anxiety. GREAT was developed by Dr. Krystal Lewis, a licensed clinical psychologist at NIMH.

Getting to Know Your Brain: Dealing with Stress: Test your knowledge about stress and the brain. Also learn how to create and use a “stress catcher” to practice strategies to deal with stress.

Guided Visualization: Dealing with Stress:  Learn how the brain handles stress and practice a guided visualization activity.

Mental Health Minute: Stress and Anxiety in Adolescents: Got 60 seconds? Take a mental health minute to learn about stress and anxiety in adolescents.

Featured fact sheets

My Mental Health

  • NIH Wellness Toolkits: NIH provides toolkits with strategies for improving your emotional health and social health.
  • MedlinePlus: How to Improve Mental Health: MedlinePlus provides health information and tips for improving your mental health.
  • CDC: Emotional Well-Being: CDC provides information on how to cope with stress and promote social connectedness.
  • SAMHSA: How to Cope: SAMHSA offers tips for taking care of your well-being and connecting with others for support.

Last Reviewed:  February 2024

Unless otherwise specified, the information on our website and in our publications is in the public domain and may be reused or copied without permission. However, you may not reuse or copy images. Please cite the National Institute of Mental Health as the source. Read our copyright policy to learn more about our guidelines for reusing NIMH content.


COMMENTS

  1. The Effectiveness of an Evidence-Based Practice (EBP) Educational Program on Undergraduate Nursing Students' EBP Knowledge and Skills: A Cluster Randomized Control Trial

    1. Introduction. Evidence-based practice (EBP) is defined as "clinical decision-making that considers the best available evidence; the context in which the care is delivered; client preference; and the professional judgment of the health professional" [] (p. 2). EBP implementation is recommended in clinical settings [2,3,4,5] as it has been attributed to promoting high-value health care ...

  2. Evidence-based practice for effective decision-making

    Evidence-based practice helps them make better, more effective decisions by choosing reliable, trustworthy solutions and being less reliant on outdated received wisdom, fads or superficial quick fixes. At the CIPD, we believe this is an important step for the people profession to take: our Profession Map describes a vision of a profession that ...

  3. Critical thinking and evidence-based practice

    Critical thinking (CT) is vital to evidence-based nursing practice. Evidence-based practice (EBP) supports nursing care and can contribute positively to patient outcomes across a variety of settings and geographic locations. The nature of EBP, its relevance to nursing, and the skills needed to support it should be required components of ...

  4. What is Evidence-Based Practice in Nursing?

    Evidence-based practice in nursing involves providing holistic, quality care based on the most up-to-date research and knowledge rather than traditional methods, advice from colleagues, or personal beliefs. ... Use critical thinking skills and consider levels of evidence to establish the reliability of the information when you analyze evidence ...

  5. Critical Thinking and Evidence-Based Practice

    CRITICAL THINKING (CT) is vital in developing evidence-based nursing practice. Evidence-based practice (EBP) supports nursing care that can be "individualized to patients and their families, is more effective, streamlined, and dynamic, and maximizes effects of clinical judgment" ( Youngblut & Brooten, 2001, p. 468).

  6. PDF The Importance of Critical Thinking in Evidenced-Based Practice

    One of the hallmarks of EBP is its focus on critical thinking. Astleitner (2002) defines critical thinking as a higher-order thinking skill which mainly consists of evaluating arguments. It is a purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation ...

  7. Critical thinking in healthcare and education

    Critical thinking is just one skill crucial to evidence-based practice in healthcare and education, write Jonathan Sharples and colleagues, who see exciting opportunities for cross-sector collaboration. Imagine you are a primary care doctor. A patient comes into your office with acute, atypical chest pain. Immediately you consider the patient's sex and age, and you begin to think about what ...

  8. Critical thinking in nursing clinical practice, education and research

    Critical thinking is a complex, dynamic process formed by attitudes and strategic skills, with the aim of achieving a specific goal or objective. The attitudes, including the critical thinking attitudes, constitute an important part of the idea of good care, of the good professional. It could be said that they become a virtue of the nursing ...

  9. Evidenced-Based Thinking for Scientific Thinking

    Evidenced-based thinking is part of critical thinking which involves identifying, evaluating, and using evidence. ... Over the course of a semester, this combining of evidence became more of a common practice for our students. Some students even noted that one type of evidence could serve as a launching pad for another. For example, one student ...

  10. Critical thinking and the process of evidence-based practice

    Critical thinking and the process of evidence-based practice by Eileen Gambrill, New York, NY, Oxford University Press, 2019, 338 pp., ISBN 978-0-190-46335-9 (paperback). Jerzy Szmagalski, The Maria Grzegorzewska University, Warsaw, Poland.

  11. Clinical Reasoning, Decisionmaking, and Action: Thinking Critically and

    Before research should be used in practice, it must be evaluated. There are many complexities and nuances in evaluating the research evidence for clinical practice. Evaluation of research behind evidence-based medicine requires critical thinking and good clinical judgment. Sometimes the research findings are mixed or even conflicting.

  12. Promoting critical thinking through an evidence-based skills fair

    One type of evidence-based practice that can be used to engage students, promote active learning and develop critical thinking is skills fair intervention (McCausland and Meyers, 2013; Roberts et al., 2009). Skills fair intervention promoted a consistent teaching approach of the psychomotor skills to the novice nurse that decreased anxiety ...

  13. What is Evidence-Based Practice in Nursing? (With Examples, Benefits

    Critical Thinking: Evidence-based practices in nursing require having the ability to evaluate data logically and weigh the evidence. 2. Scientific Mindset: ... Evidence-based practice in nursing involves several components such as creating answerable clinical questions, using resources to find the best evidence to answer the clinical question(s ...

  14. Enhancing Critical Thinking in Clinical Practice

    Journal clubs encourage evidence-based practice and critical thinking by introducing nurses to new developments and broader perspectives of health care. 11 Lehna et al 25 described the virtual journal club (VJC) as an alternative to the traditional journal club meetings. The VJC uses an online blog format to post research-based articles and ...

  15. Critical Thinking: Knowledge and Skills for Evidence-Based Practice

    Purpose: I respond to Kamhi's (2011) conclusion in his article "Balancing Certainty and Uncertainty in Clinical Practice" that rational or critical thinking is an essential complement to evidence-bas...

  16. PDF Critical Thinking: Knowledge and Skills for Evidence-Based Practice

    Critical thinking and rationality are terms that are sometimes used interchangeably (e.g., Stanovich, 1999). ABSTRACT: Purpose: I respond to Kamhi's (2011) conclusion in his article "Balancing Certainty and Uncertainty in Clinical Practice" that rational or critical thinking is an essential complement to evidence-based practice (EBP).

  17. Critical thinking: knowledge and skills for evidence-based practice

    Purpose: I respond to Kamhi's (2011) conclusion in his article "Balancing Certainty and Uncertainty in Clinical Practice" that rational or critical thinking is an essential complement to evidence-based practice (EBP). Method: I expand on Kamhi's conclusion and briefly describe what clinicians might need to know to think critically within an EBP ...

  18. Evidence-Based Practice and Nursing Research

    This "evidence-based practice curriculum" spans all four academic years, integrates coursework and practicums, and sets different learning objectives for students at different grade levels. Also in this issue, Yang et al. apply a revised standard care procedure to increase the ability of critical care nurses to verify the placement of ...

  20. Immediate Versus Delayed Low-Stakes Questioning: Encouraging ...

    These findings will contribute to a deeper understanding of evidence-based designs for asynchronous online learning environments and will help in evaluating the effectiveness of embedding video questions with regards to question timing within the LXD paradigm. ... As students actively practice critical thinking within the learning environment, ...

  21. Development of a Critical Reflection Mentorship Program to Enhance

    Critical thinking was evaluated using the Nursing Critical Thinking in Clinical Practice Questionnaire (N-CT-4 Practice) pre- and postprogram implementation. ... Effects of a work-based critical reflection program for novice nurses. BMC Medical Education, 18(1), ... Worldviews on Evidence-Based Nursing, 14(4), 257-264.

  22. Managing Innovation: Integrating Technological, Market and

    The text provides an integrated, evidence-based methodology to innovation management that is supported by the latest academic research and the authors' extensive experience in real-world management practice. Students are provided with an impressive range of learning tools—including numerous case studies, illustrative examples, discussions ...

  23. Probing the Relationship Between Evidence-Based Practice ...

    Despite the expansion of efforts to teach evidence-based practice and practically apply evidence at the bedside, a research-practice gap still exists. Several critical factors contribute to the successful application of evidence into practice, including critical thinking. The purpose of this article is to discuss the relationship between ...

  24. "Concept Mapping to Support Critical Thinking in OT Students" by Alissa

    In occupational therapy practice, critical thinking is a foundational skill for the delivery of effective care; however, there is limited evidence on the development of critical thinking skills in occupational therapy education. The purpose of this study was to explore the effects and student perceptions of concept mapping on critical thinking skills in occupational therapy education. This ...
