Portland Community College | Portland, Oregon

Core Outcomes: Critical Thinking and Problem Solving

Think Critically and Imaginatively

  • Engage the imagination to explore new possibilities.
  • Formulate and articulate ideas.
  • Recognize explicit and tacit assumptions and their consequences.
  • Weigh connections and relationships.
  • Distinguish relevant from non-relevant data, fact from opinion.
  • Identify, evaluate and synthesize information (obtained through library, world-wide web, and other sources as appropriate) in a collaborative environment.
  • Reason toward a conclusion or application.
  • Understand the contributions and applications of associative, intuitive and metaphoric modes of reasoning to argument and analysis.
  • Analyze and draw inferences from numerical models.
  • Determine the extent of information needed.
  • Access the needed information effectively and efficiently.
  • Evaluate information and its sources critically.
  • Incorporate selected information into one’s knowledge base.
  • Understand the economic, legal, and social issues surrounding the use of information, and access and use information ethically and legally.

Problem-Solve

  • Identify and define central and secondary problems.
  • Research and analyze data relevant to issues from a variety of media.
  • Select and use appropriate concepts and methods from a variety of disciplines to solve problems effectively and creatively.
  • Form associations between disparate facts and methods, which may be cross-disciplinary.
  • Identify and use appropriate technology to research, solve, and present solutions to problems.
  • Understand the roles of collaboration, risk-taking, multi-disciplinary awareness, and the imagination in achieving creative responses to problems.
  • Make a decision and take actions based on analysis.
  • Interpret and express quantitative ideas effectively in written, visual, aural, and oral form.
  • Interpret and use written, quantitative, and visual text effectively in presentation of solutions to problems.
Core Outcomes: Sample Indicators

Level 1: Limited demonstration or application of knowledge and skills.

  • Identifies the main problem, question at issue, or the source’s position.
  • Identifies implicit aspects of the problem and addresses their relationship to each other.

Level 2: Basic demonstration and application of knowledge and skills.

  • Identifies one’s own position on the issue, drawing support from experience and from information not available in assigned sources.
  • Addresses more than one perspective, including perspectives drawn from outside information.
  • Clearly distinguishes between fact and opinion, and acknowledges value judgments.

Level 3: Demonstrates comprehension and is able to apply essential knowledge and skills.

  • Identifies and addresses the validity of key assumptions that underlie the issue.
  • Examines the evidence and the source of evidence.
  • Relates cause and effect.
  • Illustrates existing or potential consequences.
  • Analyzes the scope and context of the issue, including an assessment of the audience of the analysis.

Level 4: Demonstrates thorough, effective, and/or sophisticated application of knowledge and skills.

  • Identifies and discusses conclusions, implications, and consequences of issues, considering context, assumptions, data, and evidence.
  • Objectively reflects upon one’s own assertions.


Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

Eight Instructional Strategies for Promoting Critical Thinking


(This is the first post in a three-part series.)

The new question-of-the-week is:

What is critical thinking and how can we integrate it into the classroom?

This three-part series will explore what critical thinking is, if it can be specifically taught and, if so, how can teachers do so in their classrooms.

Today’s guests are Dara Laws Savage, Patrick Brown, Meg Riordan, Ph.D., and Dr. PJ Caposey. Dara, Patrick, and Meg were also guests on my 10-minute BAM! Radio Show. You can also find a list of, and links to, previous shows here.

You might also be interested in The Best Resources On Teaching & Learning Critical Thinking In The Classroom.

Current Events

Dara Laws Savage is an English teacher at the Early College High School at Delaware State University, where she serves as a teacher, instructional coach, and lead mentor. Dara has been teaching for 25 years (career preparation, English, photography, yearbook, newspaper, and graphic design) and has presented nationally on project-based learning and technology integration:

There is so much going on right now and there is an overload of information for us to process. Did you ever stop to think how our students are processing current events? They see news feeds, hear news reports, and scan photos and posts, but are they truly thinking about what they are hearing and seeing?

I tell my students that my job is not to give them answers but to teach them how to think about what they read and hear. So what is critical thinking and how can we integrate it into the classroom? There are just as many definitions of critical thinking as there are people trying to define it. However, the Critical Thinking Consortium focuses on the tools to create a thinking-based classroom rather than a definition: “Shape the climate to support thinking, create opportunities for thinking, build capacity to think, provide guidance to inform thinking.” Using these four criteria and pairing them with current events, teachers can easily create learning spaces that thrive on thinking and keep students engaged.

One successful technique I use is the FIRE Write. Students are given a quote, a paragraph, an excerpt, or a photo from the headlines. Students are asked to Focus and respond to the selection for three minutes. Next, students are asked to Identify a phrase or section of the photo and write about it for two minutes. Third, students are asked to Reframe their response around a specific word, phrase, or section within their previous selection. Finally, students Exchange their thoughts with a classmate. Within the exchange, students also talk about how the selection connects to what we are covering in class.

There was a controversial Pepsi ad in 2017 involving Kendall Jenner and a protest with a police presence. The imagery in the photo was strikingly similar to a viral photo of a young woman standing opposite a police line. Using that image from a current event engaged my students and gave them the opportunity to think critically about events of the time.

Here are the two photos and a student response:

F - Focus on both photos and respond for three minutes

In the first picture, you see a strong and courageous black female, bravely standing in front of two officers in protest. She is risking her life to do so. Iesha Evans is simply proving to the world she does NOT mean less because she is black … and yet officers are there to stop her. She did not step down. In the picture below, you see Kendall Jenner handing a police officer a Pepsi. Maybe this wouldn’t be a big deal, except this was Pepsi’s weak, pathetic, and outrageous excuse of a commercial that belittles the whole movement of people fighting for their lives.

I - Identify a word or phrase, underline it, then write about it for two minutes

A white, privileged female in place of a fighting black woman was asking for trouble. A struggle we are continuously fighting every day, and they make a mockery of it. “I know what will work! Here Mr. Police Officer! Drink some Pepsi!” As if. Pepsi made a fool of themselves, and now their already dwindling fan base continues to ever shrink smaller.

R - Reframe your thoughts by choosing a different word, then write about that for one minute

You don’t know privilege until it’s gone. You don’t know privilege while it’s there—but you can and will be made accountable and aware. Don’t use it for evil. You are not stupid. Use it to do something. Kendall could’ve NOT done the commercial. Kendall could’ve released another commercial standing behind a black woman. Anything!

Exchange - Remember to discuss how this connects to our school song project and our previous discussions.

This connects two ways - 1) We want to convey a strong message. Be powerful. Show who we are. And Pepsi definitely tried. … Which leads to the second connection. 2) Not mess up and offend anyone, as had the one alma mater had been linked to black minstrels. We want to be amazing, but we have to be smart and careful and make sure we include everyone who goes to our school and everyone who may go to our school.

As a final step, students read and annotate the full article and compare it to their initial response.

Using current events and critical-thinking strategies like FIRE writing helps create a learning space where thinking is the goal rather than a score on a multiple-choice assessment. Critical-thinking skills can cross over to any of students’ other courses and into life outside the classroom. After all, we as teachers want to help the whole student be successful, and critical thinking is an important part of navigating life after they leave our classrooms.

usingdaratwo

‘Before-Explore-Explain’

Patrick Brown is the executive director of STEM and CTE for the Fort Zumwalt school district in Missouri and an experienced educator and author:

Planning for critical thinking focuses on teaching the most crucial science concepts, practices, and logical-thinking skills as well as the best use of instructional time. One way to ensure that lessons maintain a focus on critical thinking is to focus on the instructional sequence used to teach.

Explore-before-explain teaching is all about promoting critical thinking for learners to better prepare students for the reality of their world. What having an explore-before-explain mindset means is that in our planning, we prioritize giving students firsthand experiences with data, allow students to construct evidence-based claims that focus on conceptual understanding, and challenge students to discuss and think about the why behind phenomena.

Just think of the critical thinking that has to occur for students to construct a scientific claim. 1) They need the opportunity to collect data, analyze it, and determine how to make sense of what the data may mean. 2) With data in hand, students can begin thinking about the validity and reliability of their experience and the information collected. 3) They can consider what differences, if any, they might see if they completed the investigation again. 4) They can scrutinize outlying data points to determine whether they are an artifact of a true difference that merits further exploration or a misstep in the procedure, measuring device, or measurement. All of these intellectual activities help them form a more robust understanding and are evidence of their critical thinking.

In explore-before-explain teaching, all of these hard critical-thinking tasks come before teacher explanations of content. Whether we use discovery experiences, problem-based learning, or inquiry-based activities, strategies that are geared toward helping students construct understanding promote critical thinking because students learn content by doing the practices valued in the field to generate knowledge.

explorebeforeexplain

An Issue of Equity

Meg Riordan, Ph.D., is the chief learning officer at The Possible Project, an out-of-school program that collaborates with youth to build entrepreneurial skills and mindsets and provides pathways to careers and long-term economic prosperity. She has been in the field of education for over 25 years as a middle and high school teacher, school coach, college professor, regional director of N.Y.C. Outward Bound Schools, and director of external research with EL Education:

Although critical thinking often defies straightforward definition, most in the education field agree it consists of several components: reasoning, problem-solving, and decisionmaking, plus analysis and evaluation of information, such that multiple sides of an issue can be explored. It also includes dispositions and “the willingness to apply critical-thinking principles, rather than fall back on existing unexamined beliefs, or simply believe what you’re told by authority figures.”

Despite variation in definitions, critical thinking is nonetheless promoted as an essential outcome of students’ learning—we want to see students and adults demonstrate it across all fields, professions, and in their personal lives. Yet there is simultaneously a rationing of opportunities in schools for students of color, students from under-resourced communities, and other historically marginalized groups to deeply learn and practice critical thinking.

For example, many of our most underserved students spend class time filling out worksheets that promote high compliance but little engagement, inquiry, critical thinking, or creation of new ideas. At a time in our world when college and careers are critical for participation in society and the global, knowledge-based economy, far too many students struggle within classrooms and schools that reinforce low expectations and inequity.

If educators aim to prepare all students for an ever-evolving marketplace and develop skills that will be valued no matter what tomorrow’s jobs are, then we must move critical thinking to the forefront of classroom experiences. And educators must design learning to cultivate it.

So, what does that really look like?

Unpack and define critical thinking

To understand critical thinking, educators need to first unpack and define its components. What exactly are we looking for when we speak about reasoning or exploring multiple perspectives on an issue? How does problem-solving show up in English, math, science, art, or other disciplines—and how is it assessed? At Two Rivers, an EL Education school, the faculty identified five constructs of critical thinking, defined each, and created rubrics to generate a shared picture of quality for teachers and students. The rubrics were then adapted across grade levels to indicate students’ learning progressions.

At Avenues World School, critical thinking is one of the Avenues World Elements and is an enduring outcome embedded in students’ early experiences through 12th grade. For instance, a kindergarten student may be expected to “identify cause and effect in familiar contexts,” while an 8th grader should demonstrate the ability to “seek out sufficient evidence before accepting a claim as true,” “identify bias in claims and evidence,” and “reconsider strongly held points of view in light of new evidence.”

When faculty and students embrace a common vision of what critical thinking looks and sounds like and how it is assessed, educators can then explicitly design learning experiences that call for students to employ critical-thinking skills. This kind of work must occur across all schools and programs, especially those serving large numbers of students of color. As Linda Darling-Hammond asserts, “Schools that serve large numbers of students of color are least likely to offer the kind of curriculum needed to ... help students attain the [critical-thinking] skills needed in a knowledge work economy.”

So, what can it look like to create those kinds of learning experiences?

Designing experiences for critical thinking

After defining a shared understanding of “what” critical thinking is and “how” it shows up across multiple disciplines and grade levels, it is essential to create learning experiences that impel students to cultivate, practice, and apply these skills. There are several levers that offer pathways for teachers to promote critical thinking in lessons:

1. Choose Compelling Topics: Keep it relevant

A key Common Core State Standard asks students to “write arguments to support claims in an analysis of substantive topics or texts using valid reasoning and relevant and sufficient evidence.” That might not sound exciting or culturally relevant. But a learning experience designed for a 12th grade humanities class engaged learners in a compelling topic, policing in America, to analyze and evaluate multiple texts (including primary sources) and share the reasoning for their perspectives through discussion and writing. Students grappled with ideas and their beliefs and employed deep critical-thinking skills to develop arguments for their claims. Embedding critical-thinking skills in curriculum that students care about and connect with can ignite powerful learning experiences.

2. Make Local Connections: Keep it real

At The Possible Project, an out-of-school-time program designed to promote entrepreneurial skills and mindsets, students in a recent summer online program (modified from in-person due to COVID-19) explored the impact of COVID-19 on their communities and local BIPOC-owned businesses. They learned interviewing skills through a partnership with Everyday Boston, conducted virtual interviews with entrepreneurs, evaluated information from their interviews and local data, and examined their previously held beliefs. They created blog posts and videos to reflect on their learning and consider how their mindsets had changed as a result of the experience. In this way, we can design powerful community-based learning and invite students into productive struggle with multiple perspectives.

3. Create Authentic Projects: Keep it rigorous

At Big Picture Learning schools, students engage in internship-based learning experiences as a central part of their schooling. Their school-based adviser and internship-based mentor support them in developing real-world projects that promote deeper learning and critical-thinking skills. Such authentic experiences teach “young people to be thinkers, to be curious, to get from curiosity to creation … and it helps students design a learning experience that answers their questions, [providing an] opportunity to communicate it to a larger audience—a major indicator of postsecondary success.” Even in a remote environment, we can design projects that ask more of students than rote memorization and that spark critical thinking.

Our call to action is this: As educators, we need to make opportunities for critical thinking available not only to the affluent or those fortunate enough to be placed in advanced courses. The tools are available; let’s use them. Let’s interrogate our current curriculum and design learning experiences that engage all students in real, relevant, and rigorous experiences that require critical thinking and prepare them for promising postsecondary pathways.

letsinterrogate

Critical Thinking & Student Engagement

Dr. PJ Caposey is an award-winning educator, keynote speaker, consultant, and author of seven books who currently serves as the superintendent of schools for the award-winning Meridian CUSD 223 in northwest Illinois. You can find PJ on most social-media platforms as MCUSDSupe:

When I start my keynote on student engagement, I invite two people up on stage and give them each five paper balls to shoot at a garbage can also conveniently placed on stage. Contestant One shoots their shot, and the audience gives approval. Four out of five is a heckuva score. Then, just before Contestant Two shoots, I blindfold them and start moving the garbage can back and forth. I usually try to ensure that they can at least make one of their shots, but nobody succeeds in this unfair environment.

I thank them and send them back to their seats and then explain that this little activity was akin to student engagement. While we all know we want student engagement, we are shooting at different targets. More importantly, for teachers, it is near impossible for them to hit a target that is moving and that they cannot see.

Within the world of education and particularly as educational leaders, we have failed to simplify what student engagement looks like, and it is impossible to define or articulate what student engagement looks like if we cannot clearly articulate what critical thinking is and looks like in a classroom. Because, simply, without critical thought, there is no engagement.

The good news here is that critical thought has been defined and placed into taxonomies for decades already. This is not something new and not something that needs to be redefined. I am a Bloom’s person, but there is nothing wrong with DOK or some of the other taxonomies, either. To be precise, I am a huge fan of Daggett’s Rigor and Relevance Framework. I have used that as a core element of my practice for years, and it has shaped who I am as an instructional leader.

So, in order to explain critical thought, a teacher or a leader must familiarize themselves with these tried-and-true taxonomies. Easy, right? Yes, sort of. The issue is not understanding what critical thought is; it is the ability to integrate it into the classroom. In order to do so, there are four key steps every educator must take.

PLANNING

  • Integrating critical thought/rigor into a lesson does not happen by chance; it happens by design. Planning for critical thought and engagement is much different from planning for a traditional lesson. In order to plan for kids to think critically, you have to provide a base of knowledge and excellent prompts that allow them to explore their own thinking in order to analyze, evaluate, or synthesize information.
  • SIDE NOTE: Bloom’s verbs are a great way to start when writing objectives, but true planning will take you deeper than this.

QUESTIONING

  • If the questions and prompts given in a classroom have correct answers or if the teacher ends up answering their own questions, the lesson will lack critical thought and rigor.
  • Script five questions forcing higher-order thought prior to every lesson. Experienced teachers may not feel they need this, but it helps to create an effective habit.
  • If lessons are rigorous and assessments are not, students will do well on their assessments, and that may not be an accurate representation of the knowledge and skills they have mastered. If lessons are easy and assessments are rigorous, the exact opposite will happen. When deciding to increase critical thought, it must happen in all three phases of the game: planning, instruction, and assessment.

TALK TIME / CONTROL

  • To increase rigor, the teacher must DO LESS. This feels counterintuitive but is accurate. Rigorous lessons involving tons of critical thought must allow students to work on their own, collaborate with peers, and connect their ideas. This cannot happen in a room that is silent except for the teacher talking. In order to increase rigor, decrease talk time and become comfortable with less control. Asking questions and giving prompts that lead to no single correct answer also means less control. This is a tough ask for some teachers. Explained differently: if you assign one assignment and get 30 very similar products, you have most likely assigned a low-rigor recipe. If you assign one assignment and get multiple varied products, then the students have had a chance to think deeply, and you have successfully integrated critical thought into your classroom.

integratingcaposey

Thanks to Dara, Patrick, Meg, and PJ for their contributions!

Please feel free to leave a comment with your reactions to the topic or directly to anything that has been said in this post.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.

Education Week has published a collection of posts from this blog, along with new material, in an e-book form. It’s titled Classroom Management Q&As: Expert Strategies for Teaching.

Just a reminder: you can subscribe and receive updates from this blog via email (the RSS feed for this blog, and for all Ed Week articles, has been changed by the new redesign; new ones won’t be available until February). And if you missed any of the highlights from the first nine years of this blog, you can see a categorized list below.

  • This Year’s Most Popular Q&A Posts
  • Race & Racism in Schools
  • School Closures & the Coronavirus Crisis
  • Classroom-Management Advice
  • Best Ways to Begin the School Year
  • Best Ways to End the School Year
  • Student Motivation & Social-Emotional Learning
  • Implementing the Common Core
  • Facing Gender Challenges in Education
  • Teaching Social Studies
  • Cooperative & Collaborative Learning
  • Using Tech in the Classroom
  • Student Voices
  • Parent Engagement in Schools
  • Teaching English-Language Learners
  • Reading Instruction
  • Writing Instruction
  • Education Policy Issues
  • Differentiating Instruction
  • Math Instruction
  • Science Instruction
  • Advice for New Teachers
  • Author Interviews
  • Entering the Teaching Profession
  • The Inclusive Classroom
  • Learning & the Brain
  • Administrator Leadership
  • Teacher Leadership
  • Relationships in Schools
  • Professional Development
  • Instructional Strategies
  • Best of Classroom Q&A
  • Professional Collaboration
  • Classroom Organization
  • Mistakes in Education
  • Project-Based Learning

I am also creating a Twitter list including all contributors to this column.

The opinions expressed in Classroom Q&A With Larry Ferlazzo are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.


Essential Learning Outcomes: Critical/Creative Thinking

  • Civic Responsibility
  • Critical/Creative Thinking
  • Cultural Sensitivity
  • Information Literacy
  • Oral Communication
  • Quantitative Reasoning
  • Written Communication
  • Diversity, Equity & Inclusion

Description

Guide to Critical/Creative Thinking

Intended Learning Outcome:

Analyze, evaluate, and synthesize information in order to consider problems/ideas and transform them in innovative or imaginative ways (see below for definitions).

Assessment may include but is not limited to the following criteria and intended outcomes:

Analyze problems/ideas critically and/or creatively

  • Formulates appropriate questions to consider problems/issues
  • Evaluates costs and benefits of a solution
  • Identifies possible solutions to problems or resolution to issues
  • Applies innovative and imaginative approaches to problems/ideas

Synthesize information/ideas into a coherent whole

  • Seeks and compares information that leads to informed decisions/opinions
  • Applies fact and opinion appropriately
  • Expands upon ideas to foster new lines of inquiry
  • Synthesizes ideas into a coherent whole

Evaluate synthesized information in order to transform problems/ideas in innovative or imaginative ways

  • Applies synthesized information to inform effective decisions
  • Experiments with creating a novel idea, question, or product
  • Uses new approaches and takes appropriate risks without going beyond the guidelines of the assignment
  • Evaluates and reflects on the decision through a process that takes into account the complexities of an issue

From the Association of American Colleges & Universities’ LEAP outcomes and VALUE rubrics: Critical thinking is a habit of mind characterized by the comprehensive exploration of issues, ideas, artifacts, and events before accepting or formulating an opinion or conclusion.

Creative thinking is both the capacity to combine or synthesize existing ideas, images, or expertise in original ways and the experience of thinking, reacting, and working in an imaginative way characterized by a high degree of innovation, divergent thinking, and risk taking.

Elements, excerpts, and ideas borrowed with permission from Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics, edited by Terrel L. Rhodes. Copyright 2010 by the Association of American Colleges and Universities.

How to Align - Critical/Creative Thinking

  • Critical/Creative Thinking ELO Tutorial

Critical/Creative Thinking Rubric

Analyze, evaluate, and synthesize information in order to consider problems/ideas and transform them in innovative or imaginative ways.

Each criterion is rated Inadequate, Developing, Competent, or Proficient:

Analyze problems/ideas critically and/or creatively

  • Inadequate: Does not analyze problems/ideas
  • Developing: Analyzes problems/ideas but not critically and/or creatively
  • Competent: Begins to analyze the problems/ideas critically and/or creatively
  • Proficient: Analyzes the problems/ideas critically and/or creatively

Synthesize information/ideas into a coherent whole

  • Inadequate: Does not synthesize information/ideas
  • Developing: Begins to synthesize information/ideas but not into a coherent whole
  • Competent: Synthesizes information/ideas but not into a coherent whole
  • Proficient: Synthesizes information/ideas into a coherent whole

Evaluate synthesized information in order to transform problems/ideas in innovative or imaginative ways

  • Inadequate: Does not evaluate synthesized information in order to transform problems/ideas
  • Developing: Evaluates synthesized information and begins to transform problems/ideas
  • Competent: Evaluates synthesized information and transforms problems/ideas
  • Proficient: Evaluates synthesized information and transforms problems/ideas, accounting for their complexities or nuances


Sample Assignments

  • Cleveland Museum of Art tour (Just Mercy) Assignment contributed by Chris Wolken, Matt Lafferty, Luke Schuleter and Sara Clark.
  • Disaster Analysis This assignment was created by faculty at Durham College in Canada. The purpose of this assignment is to evaluate students’ ability to think critically about how natural disasters are portrayed in the media.
  • Laboratory Report-Critical Thinking Assignment contributed by Anne Distler.
  • (Re)Imaginings assignment ENG 1020 Assignment contributed by Sara Fuller.
  • Sustainability Project-Part 1 Waste Journal Assignment contributed by Anne Distler.
  • Sustainability Project-Part 2 Research Assignment contributed by Anne Distler.
  • Sustainability Project-Part 3 Waste Journal Continuation Assignment contributed by Anne Distler.
  • Sustainability Project-Part 4 Reflection Assignment contributed by Anne Distler.
  • Reconstructed Landscapes (VCPH) Assignment contributed by Jonathan Wayne
  • Book Cover Design (VCIL) Assignment contributed by George Kopec

Ask a Librarian


Learning outcomes and critical thinking – good intentions in conflict

  • Studies in Higher Education 44(2):1-11
  • CC BY-NC-ND 4.0

Martin G. Erikson at Högskolan i Borås


  • DOI: 10.1080/03075079.2018.1486813
  • Corpus ID: 51996395


  • Martin G. Erikson, Malgorzata Erikson
  • Published in Studies in Higher Education 19 June 2018
  • Education, Philosophy


SkillsYouNeed


What is Critical Thinking?

Critical thinking is the ability to think clearly and rationally, understanding the logical connection between ideas. Critical thinking has been the subject of much debate and thought since the time of early Greek philosophers such as Plato and Socrates, and it has continued to be a subject of discussion into the modern age, for example in the ability to recognise fake news.

Critical thinking might be described as the ability to engage in reflective and independent thinking.

In essence, critical thinking requires you to use your ability to reason. It is about being an active learner rather than a passive recipient of information.

Critical thinkers rigorously question ideas and assumptions rather than accepting them at face value. They will always seek to determine whether the ideas, arguments and findings represent the entire picture and are open to finding that they do not.

Critical thinkers will identify, analyse and solve problems systematically rather than by intuition or instinct.

Someone with critical thinking skills can:

  • Understand the links between ideas.
  • Determine the importance and relevance of arguments and ideas.
  • Recognise, build and appraise arguments.
  • Identify inconsistencies and errors in reasoning.
  • Approach problems in a consistent and systematic way.
  • Reflect on the justification of their own assumptions, beliefs and values.

Critical thinking is thinking about things in certain ways so as to arrive at the best possible solution in the circumstances that the thinker is aware of. In more everyday language, it is a way of thinking about whatever is presently occupying your mind so that you come to the best possible conclusion.

Critical Thinking is:

A way of thinking about particular things at a particular time; it is not the accumulation of facts and knowledge or something that you can learn once and then use in that form forever, such as the nine times table you learn and use in school.

The Skills We Need for Critical Thinking

The skills that we need in order to be able to think critically are varied and include observation, analysis, interpretation, reflection, evaluation, inference, explanation, problem solving, and decision making.

Specifically, we need to be able to:

  • Think about a topic or issue in an objective and critical way.
  • Identify the different arguments there are in relation to a particular issue.
  • Evaluate a point of view to determine how strong or valid it is.
  • Recognise any weaknesses or negative points that there are in the evidence or argument.
  • Notice what implications there might be behind a statement or argument.
  • Provide structured reasoning and support for an argument that we wish to make.

The Critical Thinking Process

You should be aware that none of us think critically all the time.

Sometimes we think in almost any way but critically, for example when our self-control is affected by anger, grief or joy or when we are feeling just plain ‘bloody minded’.

On the other hand, the good news is that, since our critical thinking ability varies according to our current mindset, most of the time we can learn to improve our critical thinking ability by developing certain routine activities and applying them to all problems that present themselves.

Once you understand the theory of critical thinking, improving your critical thinking skills takes persistence and practice.

Try this simple exercise to help you to start thinking critically.

Think of something that someone has recently told you. Then ask yourself the following questions:

Who said it?

Someone you know? Someone in a position of authority or power? Does it matter who told you this?

What did they say?

Did they give facts or opinions? Did they provide all the facts? Did they leave anything out?

Where did they say it?

Was it in public or in private? Did other people have a chance to respond and provide an alternative account?

When did they say it?

Was it before, during or after an important event? Is timing important?

Why did they say it?

Did they explain the reasoning behind their opinion? Were they trying to make someone look good or bad?

How did they say it?

Were they happy or sad, angry or indifferent? Did they write it or say it? Could you understand what was said?

What are you Aiming to Achieve?

One of the most important aspects of critical thinking is to decide what you are aiming to achieve and then make a decision based on a range of possibilities.

Once you have clarified that aim for yourself you should use it as the starting point in all future situations requiring thought and, possibly, further decision making. Where needed, make your workmates, family or those around you aware of your intention to pursue this goal. You must then discipline yourself to keep on track until changing circumstances mean you have to revisit the start of the decision making process.

However, there are things that get in the way of simple decision making. We all carry with us a range of likes and dislikes, learnt behaviours and personal preferences developed throughout our lives; they are the hallmarks of being human. A major contribution to ensuring we think critically is to be aware of these personal characteristics, preferences and biases and make allowance for them when considering possible next steps, whether they are at the pre-action consideration stage or as part of a rethink caused by unexpected or unforeseen impediments to continued progress.

The more clearly we are aware of ourselves, our strengths and weaknesses, the more likely our critical thinking will be productive.

The Benefit of Foresight

Perhaps the most important element of thinking critically is foresight.

Almost all decisions we make and implement don’t prove disastrous if we find reasons to abandon them. However, our decision making will be infinitely better and more likely to lead to success if, when we reach a tentative conclusion, we pause and consider the impact on the people and activities around us.

The elements needing consideration are generally numerous and varied. In many cases, consideration of one element from a different perspective will reveal potential dangers in pursuing our decision.

For instance, moving a business activity to a new location may improve potential output considerably but it may also lead to the loss of skilled workers if the distance moved is too great. Which of these is the more important consideration? Is there some way of lessening the conflict?

These are the sort of problems that may arise from incomplete critical thinking, a demonstration perhaps of the critical importance of good critical thinking.

Further Reading from Skills You Need

The Skills You Need Guide for Students

The Skills You Need Guide for Students

Skills You Need

Develop the skills you need to make the most of your time as a student.

Our eBooks are ideal for students at all stages of education, school, college and university. They are full of easy-to-follow practical information that will help you to learn more effectively and get better grades.

In Summary:

  • Critical thinking is aimed at achieving the best possible outcomes in any situation. In order to achieve this it must involve gathering and evaluating information from as many different sources as possible.
  • Critical thinking requires a clear, often uncomfortable, assessment of your personal strengths, weaknesses and preferences and their possible impact on decisions you may make.
  • Critical thinking requires the development and use of foresight as far as this is possible. As Doris Day sang, “the future’s not ours to see”.
  • Implementing the decisions made arising from critical thinking must take into account an assessment of possible outcomes and ways of avoiding potentially negative outcomes, or at least lessening their impact.
  • Critical thinking involves reviewing the results of the application of decisions made and implementing change where possible.

It might be thought that we are overextending our demands on critical thinking in expecting that it can help to construct focused meaning rather than examining the information given and the knowledge we have acquired to see if we can, if necessary, construct a meaning that will be acceptable and useful.

After all, almost no information we have available to us, either externally or internally, carries any guarantee of its life or appropriateness.  Neat step-by-step instructions may provide some sort of trellis on which our basic understanding of critical thinking can blossom but it doesn’t and cannot provide any assurance of certainty, utility or longevity.


Essential Learning Outcomes Resources


Critical Thinking


The ability to formulate an effective, balanced perspective on an issue or topic.

  • Bezanilla, María José, et al. (2019). Methodologies for teaching-learning critical thinking in higher education: The teacher’s view
  • Bjerkvik, Liv ; Hilli, Yvonne. (2019). Reflective writing in undergraduate clinical nursing education: A literature review
  • Cooke, Lori, Stroup, Harrington. (2019). Operationalizing the concept of critical thinking for student learning outcome development
  • D’alessio, Fernando A. et al. (2019). Studying the impact of critical thinking on the academic performance of executive MBA students
  • Janssen, Eva M., et al. (2019). Training higher education teachers’ critical thinking and attitudes towards teaching it
  • Morris, Richard, et al. (2019). Effectiveness of two methods for teaching critical thinking to communication sciences and disorders undergraduates
  • Plotnikova, N. F. ; Strukov, E. N. (2019). Integration of teamwork and critical thinking skills in the process of teaching students
  • Stephenson, Norda, et al. (2019). Impact of peer-led team learning and the science writing and workshop template on the critical thinking skills of first-year chemistry students
  • Venugopalan, Murali. (2019). Building critical thinking skills through literature
  • Yusuf, Nur Muthmainnah. (2019). Optimizing critical thinking skill through peer editing technique in teaching writing
  • Zucker, Andrew. (2019). Using critical thinking to counter misinformation
  • Center for Teaching Thinking (CTT)
  • Foundation for Critical Thinking
  • Critical Thinking Value Rubrics (AAC&U)
  • Stockton Institute for Faculty Development



Writing Student Learning Outcomes

Student learning outcomes state what students are expected to know or be able to do upon completion of a course or program. Course learning outcomes may contribute to, or map to, program learning outcomes, and are required in group instruction course syllabi.

At both the course and program level, student learning outcomes should be clear, observable and measurable, and reflect what will be included in the course or program requirements (assignments, exams, projects, etc.). Typically there are 3-7 course learning outcomes and 3-7 program learning outcomes.

When submitting learning outcomes for course or program approvals, or assessment planning and reporting, please:

  • Begin with a verb (exclude any introductory text and the phrase “Students will…”, as this is assumed)
  • Limit the length of each learning outcome to 400 characters
  • Exclude special characters (e.g., accents, umlauts, ampersands, etc.)
  • Exclude special formatting (e.g., bullets, dashes, numbering, etc.)
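As a rough illustration, the submission rules above (no introductory phrase, a 400-character limit, no special characters, no list formatting) can be checked mechanically; the "begin with a verb" rule is harder to automate and is not checked here. The helper below is a hypothetical sketch, not an official campus tool — the function name, messages, and the ASCII/ampersand approximation of "special characters" are all assumptions.

```python
import re

def check_outcome(outcome: str) -> list[str]:
    """Return a list of problems with a proposed learning outcome,
    based on the submission rules listed above (empty list = passes)."""
    problems = []
    # Rule: exclude introductory text such as "Students will...".
    if outcome.strip().lower().startswith("students will"):
        problems.append('drop the introductory phrase "Students will..."')
    # Rule: limit each outcome to 400 characters.
    if len(outcome) > 400:
        problems.append("longer than 400 characters")
    # Rule: exclude special characters; approximated here as any
    # non-ASCII character (accents, umlauts) or an ampersand.
    if not outcome.isascii() or "&" in outcome:
        problems.append("contains special characters (accents, umlauts, ampersands, etc.)")
    # Rule: exclude special formatting (leading bullets, dashes, numbering).
    if re.match(r"^\s*([-*\u2022]|\d+[.)])", outcome):
        problems.append("remove bullets, dashes, or numbering")
    return problems

print(check_outcome("Apply important chemical concepts and principles "
                    "to draw conclusions about chemical reactions"))  # → []
```

A real submission workflow would pair a check like this with a human review that the outcome actually begins with a measurable action verb.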

Writing Course Learning Outcomes Video

Watch Video

Steps for Writing Outcomes

The following are recommended steps for writing clear, observable and measurable student learning outcomes. In general, use student-focused language, begin with action verbs and ensure that the learning outcomes demonstrate actionable attributes.

1. Begin with an Action Verb

Begin with an action verb that denotes the level of learning expected. Terms such as know, understand, learn, and appreciate are generally not specific enough to be measurable. Levels of learning and associated verbs may include the following:

  • Remembering and understanding: recall, identify, label, illustrate, summarize.
  • Applying and analyzing: use, differentiate, organize, integrate, apply, solve, analyze.
  • Evaluating and creating: monitor, test, judge, produce, revise, compose.

Consult Bloom’s Revised Taxonomy (below) for more details. For additional sample action verbs, consult this list from The Centre for Learning, Innovation & Simulation at The Michener Institute of Education at UHN.

2. Follow with a Statement

Follow the action verb with a statement describing the specific knowledge or skill students will demonstrate. For example:

  • Identify and summarize the important features of major periods in the history of western culture
  • Apply important chemical concepts and principles to draw conclusions about chemical reactions
  • Demonstrate knowledge about the significance of current research in the field of psychology by writing a research paper

Keep each learning outcome to no more than 400 characters.

*Note: Any special characters (e.g., accents, umlauts, ampersands, etc.) and formatting (e.g., bullets, dashes, numbering, etc.) will need to be removed when submitting learning outcomes through HelioCampus Assessment and Credentialing (formerly AEFIS) and other digital campus systems.

Revised Bloom’s Taxonomy of Learning: The “Cognitive” Domain

Graphic depiction of Revised Bloom's Taxonomy, with a sampling of verbs that represent learning at each level; additional action verbs are also available.

*Text adapted from: Bloom, B.S. (Ed.) 1956. Taxonomy of Educational Objectives: The classification of educational goals. Handbook 1, Cognitive Domain. New York.

Anderson, L.W. (Ed.), Krathwohl, D.R. (Ed.), Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., & Wittrock, M.C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s Taxonomy of Educational Objectives (Complete edition). New York: Longman.

Examples of Learning Outcomes

Academic program learning outcomes.

The following examples of academic program student learning outcomes come from a variety of academic programs across campus, and are organized in four broad areas: 1) contextualization of knowledge; 2) praxis and technique; 3) critical thinking; and, 4) research and communication.

Student learning outcomes for each UW-Madison undergraduate and graduate academic program can be found in Guide. Click on the program of your choosing to find its designated learning outcomes.


Contextualization of Knowledge

Students will…

  • identify, formulate and solve problems using appropriate information and approaches.
  • demonstrate their understanding of major theories, approaches, concepts, and current and classical research findings in the area of concentration.
  • apply knowledge of mathematics, chemistry, physics, and materials science and engineering principles to materials and materials systems.
  • demonstrate an understanding of the basic biology of microorganisms.

Praxis and Technique

  • utilize the techniques, skills and modern tools necessary for practice.
  • demonstrate professional and ethical responsibility.
  • appropriately apply laws, codes, regulations, architectural and interiors standards that protect the health and safety of the public.

Critical Thinking

  • recognize, describe, predict, and analyze systems behavior.
  • evaluate evidence to determine and implement best practice.
  • examine technical literature, resolve ambiguity and develop conclusions.
  • synthesize knowledge and use insight and creativity to better understand and improve systems.

Research and Communication

  • retrieve, analyze, and interpret the professional and lay literature providing information to both professionals and the public.
  • propose original research: outlining a plan, assembling the necessary protocol, and performing the original research.
  • design and conduct experiments, and analyze and interpret data.
  • write clear and concise technical reports and research articles.
  • communicate effectively through written reports, oral presentations and discussion.
  • guide, mentor and support peers to achieve excellence in practice of the discipline.
  • work in multi-disciplinary teams and provide leadership on materials-related problems that arise in multi-disciplinary work.

Course Learning Outcomes

  • identify, formulate and solve integrative chemistry problems. (Chemistry)
  • build probability models to quantify risks of an insurance system, and use data and technology to make appropriate statistical inferences. (Actuarial Science)
  • use basic vector, raster, 3D design, video and web technologies in the creation of works of art. (Art)
  • apply differential calculus to model rates of change in time of physical and biological phenomena. (Math)
  • identify characteristics of certain structures of the body and explain how structure governs function. (Human Anatomy lab)
  • calculate the magnitude and direction of magnetic fields created by moving electric charges. (Physics)

Additional Resources

  • Bloom’s Taxonomy
  • The Six Facets of Understanding – Wiggins, G. & McTighe, J. (2005). Understanding by Design (2nd ed.). ASCD
  • Taxonomy of Significant Learning – Fink, L.D. (2003). A Self-Directed Guide to Designing Courses for Significant Learning. Jossey-Bass
  • College of Agricultural & Life Sciences Undergraduate Learning Outcomes
  • College of Letters & Science Undergraduate Learning Outcomes


  • J Athl Train
  • v.38(3); Jul-Sep 2003

Active Learning Strategies to Promote Critical Thinking

Stacy E. Walker, PhD, ATC, provided conception and design; acquisition and analysis and interpretation of the data; and drafting, critical revision, and final approval of the article.

Objective:

To provide a brief introduction to the definition and disposition to think critically along with active learning strategies to promote critical thinking.

Data Sources:

I searched MEDLINE and Educational Resources Information Center (ERIC) from 1933 to 2002 for literature related to critical thinking, the disposition to think critically, questioning, and various critical-thinking pedagogic techniques.

Data Synthesis:

The development of critical thinking has been the topic of many educational articles recently. Numerous instructional methods exist to promote thought and active learning in the classroom, including case studies, discussion methods, written exercises, questioning techniques, and debates. Three methods—questioning, written exercises, and discussion and debates—are highlighted.

Conclusions/Recommendations:

The definition of critical thinking, the disposition to think critically, and different teaching strategies are featured. Although not appropriate for all subject matter and classes, these learning strategies can be used and adapted to facilitate critical thinking and active participation.

The development of critical thinking (CT) has been a focus of educators at every level of education for years. Imagine a certified athletic trainer (ATC) who does not consider all of the injury options when performing an assessment or an ATC who fails to consider using any new rehabilitation techniques because the ones used for years have worked. Envision ATCs who are unable to react calmly during an emergency because, although they designed the emergency action plan, they never practiced it or mentally prepared for an emergency. These are all examples of situations in which ATCs must think critically.

Presently, athletic training educators are teaching many competencies and proficiencies to entry-level athletic training students. As Davies 1 pointed out, CT is needed in clinical decision making because of the many changes occurring in education, technology, and health care reform. Yet little information exists in the athletic training literature regarding CT and methods to promote thought. Fuller, 2 using the Bloom taxonomy, classified learning objectives, written assignments, and examinations as CT and non-CT. Athletic training educators fostered more CT in their learning objectives and written assignments than in examinations. The disposition of athletic training students to think critically exists but is weak. Leaver-Dunn et al 3 concluded that teaching methods that promote the various components of CT should be used. My purpose is to provide a brief introduction to the definition and disposition to think critically along with active learning strategies to promote CT.

DEFINITION OF CRITICAL THINKING

Four commonly referenced definitions of critical thinking are provided in Table 1. All of these definitions describe an individual who is actively engaged in the thought process. Not only is this person evaluating, analyzing, and interpreting the information, he or she is also analyzing inferences and assumptions made regarding that information. The use of CT skills such as analysis of inferences and assumptions shows involvement in the CT process. These cognitive skills are employed to form a judgment. Reflective thinking, defined by Dewey 8 as the type of thinking that consists of turning a subject over in the mind and giving it serious and consecutive consideration, can be used to evaluate the quality of judgment(s) made. 9 Unfortunately, not everyone uses CT when solving problems. Therefore, in order to think critically, there must be a certain amount of self-awareness and other characteristics present to enable a person to explain the analysis and interpretation and to evaluate any inferences made.

Various Definitions of Critical Thinking


DISPOSITION TO THINK CRITICALLY

Recently researchers have begun to investigate the relationship between the disposition to think critically and CT skills. Many believe that in order to develop CT skills, the disposition to think critically must be nurtured as well. 4 , 10 – 12 Although research related to the disposition to think critically has recently increased, as far back as 1933 Dewey 8 argued that possession of knowledge is no guarantee for the ability to think well but that an individual must desire to think. Open mindedness, wholeheartedness, and responsibility were 3 of the attitudes he felt were important traits of character to develop the habit of thinking. 8

More recently, the American Philosophical Association Delphi report on critical thinking 7 was released in 1990. This report resulted from a questionnaire regarding CT completed by a cross-disciplinary panel of experts from the United States and Canada. Findings included continued support for the theory that to develop CT, an individual must possess and use certain dispositional characteristics. Based upon the dispositional phrases, the California Critical Thinking Disposition Inventory 13 was developed. Seven dispositions (Table 2) were derived from the original 19 published in the Delphi report. 12 It is important to note that these are attitudes or affects that are sought after in an individual, not thinking skills. Facione et al 9 proposed that a person who thinks critically uses these 7 dispositions to form and make judgments. For example, if an individual is not truth seeking, he or she may not consider other opinions or theories regarding an issue or problem before forming an opinion. A student may possess the knowledge to think critically about an issue, but if these dispositional affects do not work in concert, the student may fail to analyze, evaluate, and synthesize the information to think critically. More research is needed to determine the relationship between CT and the disposition to think critically.

Table 2. Dispositions to Think Critically 12

METHODS TO PROMOTE CRITICAL THOUGHT

Educators can use various instructional methods to promote CT and problem solving. Although educators value a student who thinks critically about concepts, the spirit or disposition to think critically is, unfortunately, not always present in all students. Many college faculty expect their students to think critically. 14 Some common assumptions made by university nursing faculty are provided (Table 3) 15 because no similar research exists in athletic training. Espeland and Shanta 16 argued that faculty who select lecture formats as a large part of their teaching strategy may be enabling students. When lecturing, the instructor organizes and presents essential information without student input. This practice eliminates the opportunity for students to decide for themselves what information is important to know. For example, instead of telling our students via lecture what medications could be given to athletes with an upper respiratory infection, we could assign them to investigate medications and decide which one is appropriate.

Table 3. Common Assumptions of Nursing Faculty 15

Students need to be exposed to diverse teaching methods that promote CT in order to nurture the CT process. 14 , 17 – 19 As pointed out by Kloss, 20 sometimes students are stuck and unable to understand that various answers exist for one problem. Each ATC has a different method of taping a sprained ankle, performing special tests, and obtaining medical information. Kloss 20 stated that students must be exposed to ambiguity and multiple interpretations and perspectives of a situation or problem in order to stimulate growth. As students move through their clinical experiences, they witness the various methods for taping ankles, performing special tests, and obtaining a thorough history from an injured athlete. Paul and Elder 21 stated that many professors may try to encourage students to learn a body of knowledge by stating that body of knowledge in a sequence of lectures and then asking students to internalize knowledge outside of class on their own time. Not all students possess the thinking skills to analyze and synthesize information without practice. The following 3 sections present information and examples of different teaching techniques to promote CT.

Questioning

An assortment of questioning tactics exists to promote CT. Depending on how a question is asked, the student may use various CT skills, such as interpretation, analysis, and recognition of assumptions, to form a conclusion. Mills 22 suggested that the thoughtful use of questions may be the quintessential activity of an effective teacher. Questions are only as good as the thought put into them and should go beyond knowledge-level recall. 22 Researchers 23 , 24 have found that clinical teachers often asked significantly more lower-level cognitive questions than higher-level questions. Questions should be designed to promote evaluation and synthesis of facts and concepts. Asking a student to evaluate when proprioception exercises should be included in a rehabilitation program is more challenging than asking a student to define proprioception. Higher-level thinking questions should start or end with words or phrases such as “explain,” “compare,” “why,” “which is a solution to the problem,” “what is the best and why,” and “do you agree or disagree with this statement?” For example, a student could be asked to compare the use of parachlorophenylalanine versus serotonin for control of posttreatment soreness. Examples of words that can be used to begin questions that challenge students at the different levels of the Bloom Taxonomy 25 are given in Table 4. The Bloom Taxonomy 25 is a hierarchy of thinking skills that ranges from simple skills, such as knowledge, to complex thinking, such as evaluation. Depending on the initial words used in the question, students can be challenged at different levels of cognition.

Table 4. Examples of Questions 23

Another type of questioning technique is Socratic questioning. Socratic questioning is defined as a type of questioning that deeply probes or explores the meaning, justification, or logical strength of a claim, position, or line of reasoning. 4 , 26 Questions are asked that investigate assumptions, viewpoints, consequences, and evidence. Questioning methods, such as calling on students who do not have their hands up, can enhance learning by engaging students to think. The Socratic method focuses on clarification. A student's answer to a question can be followed by asking a fellow student to summarize the previous answer. Summarizing the information allows the student to demonstrate whether he or she was listening, had digested the information, and understood it enough to put it into his or her own words. Avoiding questions with one set answer allows for different viewpoints and encourages students to compare problems and approaches. Asking students to explain how the high school and the collegiate or university field experiences are similar and different is an example. There is no right or wrong answer because the answers depend upon the individual student's experiences. 19 Regardless of the answer, the student must think critically about the topic to form a conclusion of how the field experiences are different and similar.

In addition to using these questioning techniques, it is equally important to orient the students to this type of classroom interaction. Mills 22 suggested that provocative questions should be brief and contain only one or two issues at a time for class reflection. It is also important to provide deliberate silence, or “wait” time, for students upon asking questions. 22 , 27 Waiting at least 5 seconds allows the students to think and encourages thought. Elliot 18 argued that waiting even as long as 10 seconds allows the students time to think about possibilities. If a thought question is asked, time must be given for the students to think about the answer.

Classroom Discussion and Debates

Classroom discussion and debates can promote critical thinking. Various techniques are available. Bernstein 28 developed a negotiation model in which students were confronted with credible but antagonistic arguments. Students were challenged to deal with the tension between the two arguments. This tension is believed to be one component driving critical thought. Controversial issues in psychology, such as animal rights and pornography, were presented and discussed. Students responded favorably and, as the class progressed over time, they reported being more comfortable arguing both sides of an issue. In athletic training education, a negotiation model could be employed to discuss certain topics, such as the use of heat versus ice or the use of ultrasound versus electric stimulation in the treatment of an injury. Students could be assigned to defend the use of a certain treatment. Another strategy to encourage students to examine both sides of an issue is the pro-and-con grid. 29 Students create grids listing the pros and cons, or advantages and disadvantages, of an issue or treatment. Debate was used to promote CT in second-year medical students. 30 After debating, students reported improvements in literature searching, weighing risks and benefits of treatments, and making evidence-based decisions. Regardless of the teaching methods used, students should be exposed to analyzing the costs and benefits of issues, problems, and treatments to help prepare them for real-life decision making.

Observing the reasoning skills of another person was used by Galotti 31 to promote CT. Students were paired, and 4 reasoning tasks were administered. As the tasks were administered, students were told to talk aloud through the reasoning process of their decisions. Students who were observing were to write down key phrases and statements. This same process can be used in an injury-evaluation class. One student performs an evaluation while the others in the class observe. Classroom discussion can then follow. Another alternative is to divide students into pairs. One student performs an evaluation while the other observes. After the evaluation is completed, the students discuss the evaluation with each other (Table 5 presents examples). Another option is to have athletic training students observe a student peer or ATC during a field evaluation of an athlete. While observing, the student can write down any questions or topics to discuss after the evaluation, providing the student an opportunity to ask why certain evaluation methods were and were not used.

Table 5. Postevaluation Questions

Daily newspaper clippings directly related to current classroom content also allow an instructor to incorporate discussion into the classroom. 32 For example, an athlete who has been reported to have died as a result of heat illness could provide subject matter for classroom discussion or various written assignments. Such news also affords the instructor an opportunity to discuss the affective components involved. Students could be asked to step into the role of the ATC and think about the reported implications of this death from different perspectives. They could also list any assumptions made in the article or follow-up questions they would ask if they could interview the persons involved. This provides a forum that encourages students to think for themselves and to realize that not everyone in the room perceives the article the same way. Whatever the approach taken, investigators and educators agree that such assignments and arguments are useful to promote thought among students.

Written Assignments

In-class and out-of-class assignments can also serve as powerful vehicles for students to expand their thinking processes. Emig 33 believed that involving students in writing serves their learning uniquely because writing, as process and product, possesses a cluster of attributes that correspond uniquely to certain powerful learning strategies. As a general rule, assignments for the purpose of promoting thought should be short (not long term papers) and should focus on the aspect of thinking. 19 Research or single-topic papers may or may not reflect a student's own thoughts, and Meyers 32 argued that term papers often prove to be exercises in recapitulating the thoughts of others.

Allegretti and Frederick 34 used a variety of cases from a book to promote CT regarding different ethical issues. Countless case-study situations can be created to allow students to practice managing situations and to assess clinical decision making. For example, after reading the National Athletic Trainers' Association position statement on lightning, a student can be asked to address the following scenario: “Explain how you would handle a situation in which a coach has kept athletes outside practicing unsafely. What information would you use from this statement to explain your concerns? Explain why you picked the specific concerns.” These questions can be answered individually or in small groups and then discussed in class. The students will pick different concerns based on their thinking. This variety in answers not only shows that no single answer is right or wrong but also allows students to defend their answers to peers. Questions posed on listservs are excellent avenues to enrich a student's education. Using these real-life questions, students read about real issues and concerns of ATCs. These topics present excellent opportunities to ask senior-level athletic training students how they would handle the situation. This provides the students a safe place to analyze the problem and form a decision. Once the students make a decision, additional factors, assumptions, and inferences can be discussed by having all students share the solution they chose.

Lantz and Meyers 35 used personification and assigned students to assume the character of a drug. Students were to relate themselves to the drug, in the belief that drugs exhibit many unique characteristics, such as belonging to a family, interaction problems, adverse reactions, and so forth. The development of analogies comes from experience and comparing one theory or scenario to another with strong similarities.

Fopma-Loy and Ulrich 36 identified various CT classroom exercises educators can implement to promote higher-order thought (Table 6). Many incorporate a personal reaction from the student and allow the student to link that learning to his or her feelings. This personal reaction of feelings to cognitive information is important to show the relevance of material.

Table 6. Exercises to Promote Critical Thought 36

Last, poems are another avenue that can be used to promote CT. 20 Although poems are widely thought of as an assignment in an English class, athletic training students may benefit from this creative writing activity. The focus of this type of homework activity should be on reviewing content creatively. The lines of the poem need not rhyme as long as appropriate content is explained in the poem. For example, a poem on the knee could be required to include signs, symptoms, and anatomical content of one injury or various injuries. A poem on head injuries could focus on the different types of history questions that should be asked. Students should understand that the focus of the assignment is a creative review of the material and not a test of their poetic qualities. The instructor should complete a poem as well. To break the ice, the instructor's poem can be read first, followed by a student volunteering to read his or her poem.

CONCLUSIONS

Regardless of the methods used to promote CT, care must be taken to consider the many factors that may inhibit a student from thinking critically. The student's disposition to think critically is a major factor, and if a deficit in a disposition is noticed, that disposition should be nurtured. Students should be encouraged to be inquisitive, to ask questions, and not to believe and accept everything they are told. As pointed out by Loving and Wilson 14 and Oermann, 19 thought develops with practice and evaluation over time, using multiple strategies. Additionally, faculty should be aware of their course goals and learning objectives. If these goals and objectives are stated as higher-order thought outcomes, then activities that promote CT should be included in classroom activities and assignments. 14 Finally, it is important that CT skills be encouraged and reinforced in all classes by teaching faculty, not only at the college level but at every level of education. Although huge gains in CT may not be seen in all college students, we can still plant the seed and encourage students to use their thinking abilities in the hope that these will grow over time.

How educators can train critical thinking with Kialo Edu

It is vital to equip students with the 21st-century skills needed to face future challenges. Perhaps the trickiest skill to purposefully develop in students is critical thinking, which describes the ability to analyze information, evaluate evidence, and draw reasonable conclusions based on sound logic and reasoning.

Luckily, Kialo Edu is purpose-built for this task. Kialo discussions actively engage students to train their logic and reasoning skills and can help students become more autonomous and open-minded thinkers.

Let’s explore how educators can use Kialo discussions to advance students’ critical thinking skills.

How students benefit from critical thinking skills

1. Critical thinking improves student learning outcomes

When students learn how to think critically, they become more active learners capable of applying their knowledge across subject areas.

Cross-subject knowledge transfer means students are better able to learn independently, which in turn leads to better learning outcomes.

That’s because independent learners are self-motivated, making them more likely to persevere in pursuit of their learning goals. 

2. Critical thinking helps students become informed citizens

Critical thinking skills are a central pillar of information literacy, and allow students to better assess the reliability of information they come across — especially information found online.

This is essential for students as many get their information from online sources, including social media platforms.

As well as helping students identify misinformation, critical thinking complements the development of students’ civic literacy skills.

The ability to understand different points of view, question political and media rhetoric, and grasp the broader implications of policy decisions empowers students to participate in meaningful discussions about how society functions and their place within it.

3. Students can make better decisions with critical thinking skills

Critical thinking allows students to make informed decisions grounded in evidence. By reflecting on their own thought processes, questioning their assumptions, and understanding the impact of cognitive bias on thinking, students with developed critical thinking skills are better positioned to make good decisions as they progress through life.

4. Students can improve their problem-solving skills through critical thinking

By developing students’ rational capacities, critical thinking helps students become better problem-solvers.

Students who approach problems systematically and remain open to new solutions are better placed to tackle educational and professional challenges. They are also more capable of thinking up innovative solutions to larger societal problems.

Activities to train critical thinking with Kialo discussions

1. Map arguments on Kialo Edu to promote metacognition

Kialo discussions take the form of a map of all the different parts of an argument, providing a unique, visual method for students to see how ideas are related to each other.

Argument mapping has been shown to be one of the most effective ways of cultivating critical thinking skills, as well as facilitating a deeper understanding of the topic at hand.

As students evaluate the relationships between the connected claims, guide them to examine the thinking process being represented in the discussion.

Metacognitive practices like this — in which students think about their own and others’ thinking — help cultivate self-reflection, an essential component of critical thinking.

2. Use Kialo discussions to train students’ reasoning skills

The cornerstone of critical thinking is the use of reason to evaluate evidence, analyze arguments, and arrive at measured conclusions. To improve their reasoning capacities, students need plenty of practice in both developing their own arguments and critiquing those made by others.

Kialo discussions take the form of interconnected lines of reasoning. Within this structure, students are guided to think about the logical relationship between each point they make and the one they are responding to.

You can encourage them to consider whether each of their peers’ claims logically supports or weakens the one above, and even add some of your own faulty reasoning to the discussion to demonstrate common logical fallacies.

3. Use Kialo Edu as a framework for student-led inquiry

Adept critical thinkers are independent learners, capable of leading their own investigations into a topic and constructing their own understanding. Kialo discussions work to promote student autonomy by creating a framework for student-led inquiry into the discussion topic.

First, Kialo discussions begin with a central question. Then, students have complete freedom as to where and how they analyze, respond to, and contribute arguments.

This nonlinear nature of Kialo discussions promotes active learning by asking students to choose which part of the discussion to contribute to.

It also gives students a sense of ownership over their own learning, as they are encouraged to come to their own conclusions rather than simply reproduce answers provided to them.

As in other critical-thinking discussion activities, such as Socratic seminars, teachers in Kialo discussions can choose the level of guidance to give to students.

You can assign tasks to students to encourage certain types of participation or use the Discussion Chat to bring students’ attention to a particular branch of the discussion that you would like them to develop.

4. Have students collaborate on Kialo Edu to encourage flexible thinking

To become better critical thinkers, students must be open to new ideas — in other words, they must be flexible in the way that they think. Educators can utilize Kialo discussions to encourage flexible thinking in students.

Because Kialo discussions work great as collaborative activities, students are exposed to opinions they may never have even considered. Exposure to a variety of perspectives helps students become more open and flexible thinkers.

Plus, because students’ claims are open to critique by their peers, discussion participants are encouraged to reflect upon their own thinking.

This self-reflection can help students to question their implicit assumptions and biases, and develop a disposition more open to changing their minds on an issue. Upon collaborating in a discussion, ask students to develop counterarguments to their own claims to interrogate their original position on a topic.

5. Use Kialo discussions to practice research and citing sources

Beyond having purely logical connections between claims, Kialo discussions immerse students in developing their research skills and interrogating the evidence behind each other’s assertions. 

First, direct students to add sources to their claims. This is an important skill as it requires students to conduct independent research and find reliable sources that support the points being made. Practicing it on Kialo Edu can also greatly benefit students’ research skills for essay writing.

Then, have students practice source criticism by asking them to evaluate how reliable the sources supporting each other’s arguments truly are.

By doing so, they will need to check the veracity of each source, which is a vital component of information and media literacy . At the same time, they are practicing their reasoning skills.

Helping educators to develop students’ critical thinking skills is at the core of Kialo Edu’s mission, which is to make the world a more thoughtful place.

If you have feedback, thoughts, or suggestions about how we can achieve that goal, reach out to us on social media or directly at [email protected] .

Want to try Kialo Edu with your class?

Sign up for free and use Kialo Edu to have thoughtful classroom discussions and train students’ argumentation and critical thinking skills.

  • The Key is Being Metacognitive
  • The Big Picture
  • Learning Outcomes
  • Test your Existing Knowledge
  • Definitions of Critical Thinking
  • Learning How to Think Critically
  • Self Reflection Activity
  • End of Module Survey
  • Test Your Existing Knowledge
  • Interpreting Information Methodically
  • Using the SEE-I Method
  • Interpreting Information Critically
  • Argument Analysis
  • Learning Activities
  • Argument Mapping
  • Summary of Analyzing Arguments
  • Fallacious Reasoning
  • Statistical Misrepresentation
  • Biased Reasoning
  • Common Cognitive Biases
  • Poor Research Methods - The Wakefield Study
  • Summary of How Reasoning Fails
  • Misinformation and Disinformation
  • Media and Digital Literacy
  • Information Trustworthiness
  • Summary of How Misinformation is Spread

Critical Thinking Tutorial: Learning Outcomes

Why learn to construct an argument?

The logic video describes what the premise of an argument is in relation to its conclusion and demonstrates how to test if an argument is valid or sound. Testing for validity and soundness is a key critical thinking skill. At university, you will use these argument analysis skills to examine how accurate arguments are, to differentiate between good and poor arguments, and to construct sound arguments of your own.
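
The validity test described above can be sketched programmatically: an argument is valid exactly when no assignment of truth values makes every premise true while the conclusion is false. The sketch below is our own illustration, not part of the tutorial; the function name and example arguments are invented for demonstration.

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """An argument is valid when every truth assignment that makes
    all premises true also makes the conclusion true."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample: premises true, conclusion false
    return True

# Modus ponens ("If P then Q; P; therefore Q") is valid.
modus_ponens = [lambda e: (not e["P"]) or e["Q"], lambda e: e["P"]]
print(is_valid(modus_ponens, lambda e: e["Q"], ["P", "Q"]))   # True

# Affirming the consequent ("If P then Q; Q; therefore P") is invalid.
affirming = [lambda e: (not e["P"]) or e["Q"], lambda e: e["Q"]]
print(is_valid(affirming, lambda e: e["P"], ["P", "Q"]))      # False
```

Note that this brute-force check addresses only validity; soundness additionally requires the premises to actually be true, which no truth table can decide.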

  Learning Outcomes

At the end of this module on analyzing arguments, you should be able to

  • Recognize premises and conclusions in arguments
  • Differentiate a good argument from a poor argument
  • Determine when a simple argument is valid
  • Determine when a simple argument is sound
  • Classify arguments as either inductive or deductive

  • Last Updated: Dec 14, 2023 3:51 PM
  • URL: https://libguides.usask.ca/CriticalThinkingTutorial

Immediate Versus Delayed Low-Stakes Questioning: Encouraging the Testing Effect Through Embedded Video Questions to Support Students’ Knowledge Outcomes, Self-Regulation, and Critical Thinking

  • Original research
  • Open access
  • Published: 30 July 2024


  • Joseph T. Wong   ORCID: orcid.org/0000-0003-1890-6284 1 ,
  • Lindsey Engle Richland 1 &
  • Bradley S. Hughes 2  


In light of the educational challenges brought about by the COVID-19 pandemic, there is a growing need to bolster online science teaching and learning by incorporating evidence-based pedagogical principles of Learning Experience Design (LXD). As a response to this, we conducted a quasi-experimental, design-based research study involving N = 183 undergraduate students enrolled across two online classes in an upper-division course on Ecology and Evolutionary Biology at a large R1 public university. The study extended over a period of 10 weeks, during which half of the students encountered low-stakes questions immediately embedded within the video player, while the remaining half received the same low-stakes questions after viewing all the instructional videos within the unit. Consequently, this study experimentally manipulated the timing of the questions across the two class conditions. These questions functioned as opportunities for low-stakes content practice and retention, designed to encourage learners to experience the testing effect and augment the formation of their conceptual understanding. Across both conditions, we assessed potential differences in total weekly quiz grades, page views, and course participation among students who encountered embedded video questions. We also assessed students’ self-reported engagement, self-regulation, and critical thinking. On average, the outcomes indicated that learners exposed to immediate low-stakes questioning exhibited notably superior summative quiz scores, increased page views, and enhanced participation in the course. Additionally, those who experienced immediate questioning demonstrated heightened levels of online engagement, self-regulation, and critical thinking. Moreover, our analysis delved into the intricate interplay between treatment conditions, learners’ self-regulation, critical thinking, and quiz grades through a multiple regression model.
Notably, the interaction between those in the immediate questioning condition and self-regulation emerged as a significant factor, suggesting that the influence of immediate questioning on quiz grades varies based on learners’ self-regulation abilities. Collectively, these findings highlight the substantial positive effects of immediate questioning of online video lectures on both academic performance and cognitive skills within an online learning context. This discussion delves into the potential implications for institutions to continually refine their approach in order to effectively promote successful online science teaching and learning, drawing from the foundations of pedagogical learning experience design paradigms and the testing effect model.
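
A moderation model of the kind the authors describe — quiz grade regressed on condition, self-regulation, and their product — can be sketched with synthetic data. Everything below (variable names, scales, coefficients, sample values) is invented for illustration and is not the study's actual data, model, or results.

```python
import numpy as np

# Synthetic illustration of an interaction (moderation) model:
# quiz_grade ~ condition + self_regulation + condition:self_regulation.
# All numbers here are invented, not the study's data.
rng = np.random.default_rng(42)
n = 183
condition = rng.integers(0, 2, size=n)    # 0 = delayed, 1 = immediate questioning
self_reg = rng.normal(3.5, 0.6, size=n)   # hypothetical self-regulation score
noise = rng.normal(0.0, 2.0, size=n)
grade = 70 + 2.0 * condition + 3.0 * self_reg + 1.5 * condition * self_reg + noise

# Design matrix: intercept, main effects, and the interaction term.
X = np.column_stack([np.ones(n), condition, self_reg, condition * self_reg])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)

# A nonzero interaction coefficient (beta[3]) means the benefit of
# immediate questioning changes with a learner's self-regulation.
names = ["intercept", "condition", "self_regulation", "interaction"]
print(dict(zip(names, np.round(beta, 2))))
```

In this framing, a significant positive interaction coefficient would correspond to the paper's claim that the effect of immediate questioning on quiz grades varies with learners' self-regulation abilities.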


1 Introduction

A recurring concern in traditional in-person and online courses is how best to maintain and sustain learners’ engagement throughout the learning process. These concerns were further exacerbated by the disruptions caused by the COVID-19 pandemic and by the competing “edtech” tools that were deployed urgently to facilitate teaching and learning during a time of crisis. That is not to say that introducing “edtech” tools did not aid in supporting students’ learning trajectories during this period, but a major concern is the widespread deployment of “edtech solutions” without proper alignment with evidence-based pedagogical learning frameworks (Asad et al., 2020 ; Chick et al., 2020 ; Sandars et al., 2020 ) and without evidence of whether the tools being deployed were having the intended supportive effect on students’ learning. Between 2020 and 2022, the United States government distributed $58.4 billion through the Higher Education Emergency Relief Fund to public universities, which spent more than $1.2 billion on distance learning technologies (EDSCOOP, 2023 ; O’leary & June, 2023 ). Educational technology spending by universities included expenditures on software licenses, hardware (such as computers and tablets), learning management systems (LMS), online course development tools, audio-visual equipment, digital content, and various technology-related services, to name a few. In light of the considerable resources dedicated to distance learning in recent years, the need to discern how to employ these “edtech” tools in a manner that is meaningful, impactful, and grounded in evidence-based pedagogies has grown substantially.

Higher education has been grappling with a myriad of technologies to deploy in order to support the exponential increase in undergraduates enrolled in online courses. Data from the United States in the fall of 2020 indicate that approximately 11.8 million (75%) undergraduate students were enrolled in at least one distance learning course, while 7.0 million (44%) took distance education courses exclusively (National Center for Education Statistics [NCES], 2022 ). In the fall of 2021, with the return to in-person instruction, about 75% of all postsecondary degree seekers in the U.S. took at least some online classes, with around 30% studying exclusively online (NCES, 2022 ). In the aftermath of the pandemic, the proportion of students engaged in online courses has declined to 60%; nevertheless, this figure remains notably higher than pre-pandemic levels (NCES, 2022 ). To meet the increasing demand, universities have substantial opportunities to explore effective strategies for enhancing the online learning experiences of undergraduate students. However, it is important to note that merely introducing new tools into instructors’ technological toolkits may not be enough to foster impactful teaching and learning.

To address these concerns, this study employs a quasi-experimental design, implementing embedded video questions in an asynchronous undergraduate Biology course anchored in the Learning Experience Design (LXD) pedagogical paradigm. The objective is to assess the effectiveness of the embedded video question assessment platform, utilizing video technologies and employing design-based research (DBR) methodologies to evaluate practical methods for fostering active learning in online educational settings. While video content integration in education is recognized as valuable for capturing learners’ attention and delivering complex concepts (Wong et al., 2023 , 2024 ), passive consumption of videos may not fully harness their potential to promote active learning and deeper engagement (Mayer, 2017 , 2019 ). Embedded video questions provide an avenue to transform passive viewing into an interactive and participatory experience (Christiansen et al., 2017 ; van der Meij & Böckmann, 2021 ). By strategically embedding thought-provoking questions within video segments, educators can prompt students to reflect on the material, assess comprehension, and immediately evaluate conceptual understanding. Additionally, analyzing the timing and placement of these questions within a video lesson may yield valuable insights into their effectiveness in facilitating the testing effect, a process in which implementing low-stakes retrieval practice over a period of time can help learners integrate new information with prior knowledge (Carpenter, 2009 ; Littrell-Baez et al., 2015 ; Richland et al., 2009 ). Understanding how variations in timing influence student responses and comprehension levels can inform instructional strategies for optimizing interactive elements in educational videos to foster engagement and enhance learning performance.

This study aimed to compare students who received low-stakes questions after watching a series of lecture videos with those who encountered questions embedded immediately within the video player. The objective was to identify differences in total weekly quiz scores, course engagement, and learning behaviors such as critical thinking and self-regulation over a span of 10 weeks. While previous studies have examined the efficacy of embedded video questions, few have considered the interrelation of these learning behaviors within the context of the Learning Experience Design (LXD) paradigm and the testing effect model for undergraduate science courses. These findings will contribute to a deeper understanding of evidence-based designs for asynchronous online learning environments and will help in evaluating the effectiveness of embedded video questions with regard to question timing within the LXD paradigm. Considering the increasing demand for and substantial investment in online courses within higher education, this study also assesses the effectiveness of a research-practice partnership in implementing embedded video questions in two courses. The ultimate aim is to determine whether this approach could serve as a scalable model for effectively meeting educational needs in the future.

2 Literature Review

2.1 Learning Experience Design

Learning Experience Design (LXD) encompasses the creation of learning scenarios that transcend the confines of traditional classroom settings, often harnessing the potential of online and educational technologies (Ahn, 2019 ). This pedagogical paradigm involves crafting impactful learning encounters that are centered around human needs and driven by specific objectives, aimed at achieving distinct learning results (Floor, 2018 , 2023 ; Wong & Hughes, 2022 ; Wong et al., 2024 ). LXD differs from the conventional pedagogical process of “instructional design,” which primarily focuses on constructing curricula and instructional programming for knowledge acquisition (Correia, 2021 ). Instead, LXD can be described as an interdisciplinary integration that combines principles from instructional design, pedagogical teaching approaches, cognitive science, learning sciences, and user experience design (Weigel, 2015 ). As a result, the primary focus of LXD is on devising learning experiences that are human-centered and geared toward specific outcomes (Floor, 2018 ; Wong & Hughes, 2022 ).

Practically, LXD is characterized by five essential components: Human-Centered Approach, Objective-Driven Design, Grounded in Learning Theory, Emphasis on Experiential Learning, and Collaborative Interdisciplinary Efforts (Floor, 2018 ). Taking a human-centered approach considers the needs, preferences, and viewpoints of the learners, resulting in tailored learning experiences where learners take precedence (Matthews et al., 2017 ; Wong & Hughes, 2022 ). An objective-driven approach to course design curates learning experiences that are intentionally structured to align with specific objectives, making every learning activity purposeful and pertinent to supporting students’ learning experiences (Floor, 2018 ; Wong et al., 2022 ). LXD is also grounded in learning theories, such that the design process is informed by evidence-based practices drawn from cognitive science and the learning sciences (Ahn et al., 2019 ). Furthermore, LXD places a large emphasis on experiential learning, where active and hands-on learning techniques, along with real-world applications, facilitate deeper understanding and retention (Floor, 2018 , 2023 ; Wong et al., 2024 ). Lastly, LXD is interdisciplinary, bringing together professionals from diverse backgrounds, including instructional designers, educators, cognitive scientists, and user experience designers, to forge comprehensive and well-rounded learning experiences (Weigel, 2015 ). Each of these facets underscores the significance of empathy, where both intended and unintended learning design outcomes are meticulously taken into account to enhance learners’ experiences (Matthews et al., 2017 ; Wong & Hughes, 2022 ). Consequently, LXD broadens the scope of learning experiences, enabling instructors and designers to resonate with learners and enrich the repertoire of learning design strategies (Ahn et al., 2019 ; Weigel, 2015 ), thus synergizing with the utilization of video as a powerful tool for teaching and learning online.
In tandem with the evolving landscape of educational practices, LXD empowers educators to adapt and enhance their methodologies, fostering successful and enriched learning outcomes (Ahn, 2019 ; Floor, 2018 , 2023 ; Wong et al., 2022 ), while also embracing the dynamic potential of multimedia educational technologies like video in delivering effective and engaging instructional content.

2.2 Video as a Tool for Teaching and Learning

Video and multimedia educational technologies have been broadly used as “edtech” tools for teaching and learning over the last three decades in in-person instruction and, especially now, in online learning modalities (Cruse, 2006 ; Mayer, 2019 ). Educational videos, also referred to as instructional or explainer videos, serve as a modality for delivering teaching and learning through audio and visuals that demonstrate or illustrate key concepts being taught. Multiple researchers have found evidence for the affordances of video-based learning, citing benefits including reinforcement of reading and lecture materials, development of a common base of knowledge for students, enhanced comprehension, greater accommodation of diverse learning preferences, increased student motivation, and improved teacher effectiveness (Corporation for Public Broadcasting [CPB], 1997 , 2004 ; Cruse, 2006 ; Kolas, 2015 ; Wong et al., 2023 ; Wong et al., 2024 ; Yousef et al., 2014 ). Proponents in the field of video research also cite specific video design features that support students’ learning experiences, such as searching, playback, retrieval, and interactivity (Giannakos, 2013 ; Yousef et al., 2014 ; Wong et al., 2023b ). A study conducted by Wong et al. ( 2023b ) sheds light on the limitations of synchronous Zoom video lectures, based on a survey of more than 600 undergraduates during the pandemic. It underscores the advantages of well-designed asynchronous videos in online courses, which may better accommodate student learning needs when compared to traditional synchronous learning (Wong et al., 2023b ). Mayer’s ( 2001 , 2019 ) framework for multimedia learning provides a theoretical and practical foundation for how video-based learning modalities can be used as cognitive tools to support students’ learning experiences.
While some researchers have characterized video as a passive mode of learning, Mayer ( 2001 ) explains that viewing educational videos involves the high cognitive activity required for active learning; however, this can only occur through well-designed multimedia instruction that specifically fosters cognitive processing in learners, even though learners may appear behaviorally inactive (Mayer, 2009 , 2019 ). Following Mayer’s ( 2019 ) principles, we designed multimedia lessons supporting students’ cognitive processing through segmenting, pre-training, temporal contiguity, modality matching, and signaling, all implemented through asynchronous embedded video questions.

2.3 Embedded Video Questions

Embedded video questions are a type of educational technology design feature that adds interactive quizzing capacities while students engage in video-based learning. They involve incorporating formative assessments directly within online videos, prompting viewers to answer questions at specific points in the content. While a video is in progress, students viewing it are prompted with questions designed to encourage increased engagement and deeper cognitive processing (Christiansen et al., 2017 ; Kovacs, 2016 ; Wong et al., 2023 ; van der Meij et al., 2021 ). This is similar to an Audience Response System (ARS) during traditional in-person lectures where an instructor utilizes a live polling system in a lecture hall such as iClickers to present questions to the audience (Pan et al., 2019 ). Yet, within the context of online learning, students are tasked with independently viewing videos at their convenience, and a set of on-screen questions emerges. This allows learners to pause, reflect, and answer questions at their own pace, fostering a sense of control over the learning process (Ryan & Deci, 2017 ). These questions serve to promptly recapitulate key concepts, identify potential misconceptions, or promote conceptual understanding of the subject matter. Studies suggest that embedded video questions can significantly improve student engagement compared to traditional video lectures (Chi & Wylie, 2014 ). Research on the use of embedded video questions has already shown promising empirical results in the field, such as stimulating students’ retrieval and practice, recognition of key facts, and prompting behavioral changes to rewind, review, or repeat the materials that were taught (Cummins et al., 2015 ; Haagsman et al., 2020 ; Rice et al., 2019 ; Wong & Hughes et al., 2022 ; Wong et al., 2024 ). 
Embedded video questions have also been shown to transition learners from passively watching a video to actively engaging with the video content (Dunlosky et al., 2013 ; Kestin & Miller, 2022 ; Schmitz, 2020 ), a critically important factor when considering the rapid shift from in-person to online instruction due to the pandemic. As a result, there are a myriad of affordances that showcase the potential effects of embedded video questions on student learning experiences, one of which is how embedded video questions can be intentionally leveraged with regard to question timing to support active information processing facilitated through the testing effect.

3 Testing Effect

Active information processing in the context of video-based learning is the process by which learners encode relevant information from a video, integrate that information with their prior knowledge, and retrieve the stored information at a later time (Johnson & Mayer, 2009 ; Schmitz, 2020 ). This active learning process of retrieval, the learning strategy of rehearsing learning materials through quizzing and testing, is grounded in the cognitive process known as the testing effect. From a cognitive learning perspective, the testing effect is a process in which implementing low-stakes retrieval practice over a period of time can help learners integrate new information with prior knowledge, increasing long-term retention and memory retrieval so that knowledge can be manipulated flexibly (Carpenter, 2009 ; Littrell-Baez et al., 2015 ; Richland et al., 2009 ). This shifts the narrative from viewing assessments as traditional high-stakes exams to viewing them as practice learning events that provide a measure of learners’ knowledge in the current moment, in order to more effectively encourage retention and acquisition of knowledge not yet learned (Adesope et al., 2017 ; Carrier & Pashler, 1992 ; Richland et al., 2009 ). The connection between retrieval and the testing effect represents sustained, continual, and successive rehearsal of successfully retrieving accurate information from long-term memory storage (Schmitz, 2020 ).

The frequency of practice and the time allotted between practice sessions also play a role in memory retention. Equally important, the timing and intentionality of when these questions occur within a video may influence learner outcomes. As such, the more instances learners have to retrieve knowledge from their long-term memory as practice, the better learners may recall and remember that information (Richland et al., 2009 ). This can come in the form of practice tests, which have shown tremendous success in the cognitive testing literature (Carpenter, 2009 ; Roediger III & Karpicke, 2006 ), or, in this study, embedded video questions to facilitate the testing effect. By doing so, we can provide students with an interactive online alternative to rereading or re-studying the material (Adesope et al., 2017 ; Roediger et al., 2006 ): learners are presented with opportunities to answer questions frequently and immediately as retrieval practice while watching a video. Active participation through answering questions keeps viewers focused and promotes deeper information processing (Azevedo et al., 2010 ), offering a focused medium for students to recall, retrieve, and recognize crucial concepts (Mayer et al., 2009 ; van der Meij et al., 2021 ). This approach aims to cultivate an active learning environment that engages learners’ cognitive processing during online education. It assists students in discerning which aspects of the learning material they have mastered and identifies areas that require further attention (Agarwal et al., 2008 ; Fiorella & Mayer, 2015 , 2018 ; McDaniel et al., 2011 ).

4 The Testing Effect on Student Learning Behaviors

Embedded video questions present a potential learning modality that operationalizes the theoretical model of the testing effect and may substantially benefit student-centered active learning opportunities within an online course, particularly student learning behaviors such as engagement, self-regulation, and critical thinking. As such, leveraging the testing effect and the LXD pedagogical paradigm synergistically through the medium of embedded video questions may amplify student learning behaviors in online courses. The following sections review the literature on engagement, self-regulation, and critical thinking.

Student engagement in the online learning environment has garnered significant attention due to its crucial role in influencing learning outcomes, satisfaction, and overall course success (Bolliger & Halupa, 2018 ; Wang et al., 2013 ; Wong et al., 2023b ; Wong & Hughes, 2022 ). Broadly defined, student engagement can be characterized as the extent of student commitment or active involvement required to fulfill a learning task (Redmond et al., 2018 ; Ertmer et al., 2010 ). Additionally, engagement can extend beyond mere participation and attendance, entailing active involvement in discussions, assignments, collaborative activities, and interactions with peers and instructors (Hu & Kuh, 2002 ; Redmond et al., 2018 ; Wong et al., 2022 ). Within an online course, engagement can be elaborated as encompassing the levels of attention, curiosity, interaction, and intrinsic interest that students display throughout an instructional module (Redmond et al., 2018 ). This also extends to encompass the motivational characteristics that students may exhibit during their learning journey (Pellas, 2014 ). Several factors influence student online engagement, and they can be broadly categorized into individual, course-related, and institutional factors. Individual factors include self-regulation skills, prior experience with online learning, and motivation (Sansone et al., 2011 ; Sun & Rueda, 2012 ). Course-related factors encompass instructional design, content quality, interactivity, and opportunities for collaboration (Pellas, 2014 ; Czerkawski & Lyman, 2016 ). Institutional factors involve support services, technological infrastructure, and instructor presence (Swan et al., 2009 ; Picciano, 2023 ). Furthermore, research has established a noteworthy and favorable correlation between engagement and various student outcomes, including advancements in learning, satisfaction with the course, and overall course grades (Bolliger & Halupa, 2018 ; Halverson & Graham, 2019 ).
Instructional designers argue that to enhance engagement, instructors and educators can employ strategies like designing interactive and authentic assignments (Cummins et al., 2015 ; Floor, 2018 ), fostering active learning opportunities, and creating supportive online learning environments (Kuh et al., 2005 ; Wong et al., 2022 ). Thus, engaged students tend to demonstrate a deeper understanding of the course material, a stronger sense of self-regulation, and improved critical thinking skills (Fredricks et al., 2004 ; Jaggars & Xu, 2016 ; Pellas, 2018 ).

Self-regulation pertains to the inherent ability of individuals to manage and control their cognitive and behavioral functions with the intention of attaining particular objectives (Pellas, 2014 ; Vrugt & Oort, 2008 ; Zimmerman & Schunk, 2001 ). In the context of online courses, self-regulation takes on a more specific definition, encapsulating the degree to which students employ self-regulated metacognitive skills–the ability to reflect on one’s own thinking–during learning activities to ensure success in an online learning environment (Wang et al., 2013 ; Wolters et al., 2013 ). Unlike conventional in-person instruction, asynchronous self-paced online courses naturally lack the physical presence of an instructor who can offer immediate guidance and support in facilitating the learning journey. While instructors may maintain accessibility through published videos, course announcements, and email communication, students do not participate in face-to-face interactions within the framework of asynchronous courses. However, the implementation of asynchronous online courses offers learners autonomy, affording them the flexibility to determine when, where, and for how long they engage with course materials (McMahon & Oliver, 2001 ; Wang et al., 2017 ). Furthermore, the utilization of embedded video questions in this course taps into Bloom’s taxonomy, featuring both lower and higher-order thinking questions to test learners’ understanding. This medium enables learners to immediately engage with and comprehend conceptual materials through processes such as pausing, remembering, understanding, applying, analyzing, and evaluating, negating the need to postpone these interactions until exam dates (Betts, 2008 ; Churches, 2008 ). While this shift places a significant responsibility on the learner compared to traditional instruction, embedded video questions contribute to a student-centered active learning experience (Pulukuri & Abrams, 2021 ; Torres et al., 2022 ). 
This approach nurtures students’ self-regulation skills by offering explicit guidance in monitoring their cognitive processes, setting both short-term and long-term objectives, allocating sufficient time for assignments, promoting digital engagement, and supplying appropriate scaffolding (Al-Harthy et al., 2010 ; Kanuka, 2006 ; Shneiderman & Hochheiser, 2001 ). Through this, students actively deploy numerous cognitive and metacognitive strategies to manage, control, and regulate their learning behaviors to meet the demands of their tasks (Moos & Bonde, 2016 ; Wang et al., 2013 ). Due to the deliberate application of LXD principles, the course has the capability to enhance the development of students’ self-regulation abilities in the context of online learning (Pulukuri & Abrams, 2021 ). Consequently, this empowers students to identify their existing knowledge and engage in critical evaluation of information that may need further refinement and clarification.

Leveraging the testing effect model through the integration of embedded video questions also yields notable advantages concerning students’ critical thinking capabilities. Critical thinking involves students’ capacity to employ both new and existing conceptual knowledge to make informed decisions, having evaluated the content at hand (Pintrich et al., 1993 ). In the context of online courses, critical thinking becomes evident through actions such as actively seeking diverse sources of representation (Richland & Simms, 2015 ), encountering and learning from unsuccessful retrieval attempts (Richland et al., 2009 ), and effectively utilizing this information to make informed judgments and draw conclusions (Uzuntiryaki-Kondakci & Capa-Aydin, 2013 ). To further elaborate, according to Brookfield ( 1987 ), critical thinking in the research context involves recognizing and examining the underlying assumptions that shape learners’ thoughts and actions. As students actively practice critical thinking within the learning environment, the research highlights the significance of metacognitive monitoring, which encompasses the self-aware assessment of one’s own thoughts, reactions, perceptions, assumptions, and levels of confidence in the subject matter (Bruning, 2005 ; Halpern, 1998 ; Jain & Dowson, 2009 ; Wang et al., 2013 ). As such, infusing embedded video questions into the learning process may serve as a strategic pedagogical approach that may catalyze students’ critical thinking skills.

In the context of embedded video questions, students must critically analyze questions, concepts, and scenarios, and make judgments about which answer best reflects the problem. As students engage with the videos, they are prompted to monitor their own thinking processes, question assumptions, and consider alternate perspectives, a quintessential aspect of metacognition that complements critical thinking (Bruning, 2005 ; Halpern, 1998 ; Jain & Dowson, 2009 ; Wang et al., 2013 ). Sometimes students might get the answers wrong, but these unsuccessful attempts also contribute to the testing effect in a positive manner (Richland et al., 2009 ): they serve as opportunities to critically analyze and reflect during the low-stakes testing stage so that learners are better prepared later on. Furthermore, cultivating students’ aptitude for critical thinking also has the potential to enhance their transferable skills (Fries et al., 2020 ), a pivotal competency for STEM undergraduates at research-intensive (R1) institutions, bridging course content to real-world applications. In essence, the interplay between the testing effect model and the use of embedded video questions not only supports students’ critical thinking, but also underscores the intricate relationship between engagement, self-regulation, and course outcomes (Wang et al., 2013 ).

4.1 Current Study

This study builds on the work of Wong and Hughes ( 2023 ) on the implementation of LXD in STEM courses utilizing educational technologies. This Design-Based Research (DBR) approach employs learning theories to assess the effectiveness of design and instructional tools within real-world learner contexts (DBR Collective, 2003 ; Siek et al., 2014 ). In this study, we utilized the same instructional videos, course materials, and pedagogical learning design as Wong and Hughes ( 2023 ), but incorporated iterative design enhancements such as embedded video questions to assess their potential testing effect impacts on students’ learning experiences. This quasi-experimental research therefore compares students who participated in a 10-week undergraduate science online course: half of these students encountered low-stakes questions integrated directly within the video player (immediate condition), while the other half received questions following a series of video lectures (delayed condition). The aim is to assess how the timing of low-stakes questioning might beneficially influence learners’ science content knowledge, engagement, self-regulation, and critical thinking. Additionally, we assessed students’ learning analytics within the online course, including online page views and course participation, as a proximal measure of learners’ online engagement. We then compared these findings with their self-report survey responses within the online course to corroborate the results. With the implementation of a newly iterated online course grounded in the LXD paradigm and the testing effect model, this study is guided by the following research questions:

RQ1) To what extent does the effect of “immediate vs. delayed low-stakes questioning” influence learners’ total quiz grades, online page views, and course participation rate?

RQ2) To what extent does the effect of “immediate vs. delayed low-stakes questioning” influence learners’ engagement, self-regulation, and critical thinking?

RQ3) To what extent does the relationship between “immediate vs. delayed low-stakes questioning” and learner’s total quiz grades vary depending on their levels of self-regulation and critical thinking?
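RQ3 implies a moderation (interaction) analysis: does the effect of question timing on total quiz grades depend on learners' self-regulation and critical thinking? As a minimal sketch of the kind of model such a question entails, the following Python fragment fits an ordinary least squares regression with timing-by-trait interaction terms on synthetic data. All variable names, scale ranges, and values are illustrative assumptions, not the study's data or analysis code.

```python
import numpy as np

def moderation_fit(condition, self_reg, crit_think, quiz_total):
    """OLS fit of quiz_total on condition, self_reg, crit_think,
    and the two condition-by-trait interaction terms.
    Returns the coefficient vector
    [intercept, condition, self_reg, crit_think,
     condition*self_reg, condition*crit_think]."""
    condition = np.asarray(condition, dtype=float)   # 0 = delayed, 1 = immediate
    self_reg = np.asarray(self_reg, dtype=float)
    crit_think = np.asarray(crit_think, dtype=float)
    X = np.column_stack([
        np.ones_like(condition),   # intercept
        condition,                 # main effect of question timing
        self_reg,                  # main effect of self-regulation
        crit_think,                # main effect of critical thinking
        condition * self_reg,      # timing x self-regulation moderation
        condition * crit_think,    # timing x critical-thinking moderation
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(quiz_total, dtype=float), rcond=None)
    return beta

# Illustrative synthetic cohort of 183 learners (matching only the study's n)
rng = np.random.default_rng(7)
cond = rng.integers(0, 2, 183)            # hypothetical treatment assignment
sr = rng.normal(3.5, 0.6, 183)            # hypothetical 5-point self-regulation scores
ct = rng.normal(3.4, 0.7, 183)            # hypothetical 5-point critical-thinking scores
quiz = 60 + 2 * cond + 4 * sr + 3 * ct + 1.5 * cond * sr + rng.normal(0, 3, 183)
beta = moderation_fit(cond, sr, ct, quiz)  # beta[4] estimates timing x self-regulation
```

A clearly nonzero interaction coefficient (e.g., `beta[4]`) would indicate that the benefit of immediate questioning varies with self-regulation; a full analysis would also report standard errors and significance tests.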

5 Methodology

5.1 Ethical Considerations

This study, funded by the National Science Foundation (NSF), adheres to stringent ethical standards mandated by both the university and the grant funding agency. The university obtained approval from its Institutional Review Board (IRB) to conduct human subjects research, ensuring compliance with ethical guidelines. The research was categorized as IRB-exempt due to its online, anonymous data collection process, which posed minimal risk to participants. All participants provided informed consent before any data collection commenced and were given comprehensive information about the study, including its purpose, procedures, potential risks and benefits, confidentiality measures, and their right to withdraw without consequences. Participant data were treated with utmost confidentiality and anonymity, and the study’s questions, topics, and content were designed to avoid causing harm to students. The research protocol received formal approval from the university’s ethics committee.

5.2 Quasi-experimental Design

This research employed a design-based research (DBR) approach, leveraging learning theories to evaluate the effectiveness of design, instructional tools, or products in authentic, real-world settings (DBR Collective, 2003 ; Siek et al., 2014 ). The rationale for this research methodology is to assess instructional tools in ecologically valid environments and explore whether these tools enhance students’ learning experiences (Scott et al., 2020 ). Our decision to adopt a DBR approach arises from the limited research investigating the efficacy of the Learning Experience Design (LXD) pedagogical paradigm with embedded video questions in online undergraduate science courses. We are also cognizant of previous research indicating that simply inserting questions directly into videos, without evidence-based pedagogical principles, intentional design, and instructional alignment, does not significantly improve learning outcomes (Deng et al., 2023 ; Deng & Gao, 2023 ; Marshall & Marshall, 2021 ). Thus, this DBR study utilizes a Learning Experience Design (LXD) approach to cultivate active learner engagement through the implementation of learning theories such as the testing effect model. We then compare the impact of embedded video questions on learning outcomes within the newly designed self-paced asynchronous online course (See Fig.  1 ). Subsequently, we test these designs with learners and utilize the findings to iterate, adapt, and redeploy these techniques continually, aiming to enhance the efficacy of our embedded video question designs and support their gradual evolution in improving students’ learning experiences.

Figure 1. Quasi-experimental research design.

The study involved two equivalently sized classes within the School of Biological Sciences at an R1 university in Southern California, with students voluntarily enrolling in either of the two classes. The two classes were taught by the same professor on the same topics of Ecology and Evolutionary Biology. This particular course was chosen to serve as a research-practice partnership (RPP), collaborating closely with the professor, educational designers, researchers, and online course creators to customize a course that aligns with the instructor’s and students’ needs upon returning from the COVID-19 remote learning environment.

The study spanned a 10-week period, allowing sufficient dosage for implementing our learning designs and effectively measuring their impact on students’ learning experiences (See Fig.  1 ). Selecting a quasi-experimental design allowed us to assess the impact of question timing and placement on students’ comprehension and retention of the material presented in the videos. Following quasi-experimental design principles, the study involved two classes, each assigned to a different treatment condition. Students who experienced low-stakes questions after watching a series of videos were referred to as the “Delayed Questioning” condition, and students who experienced low-stakes questions immediately embedded within the video player were referred to as the “Immediate Questioning” condition. In the delayed questioning condition, students encountered low-stakes questions only after watching all assigned video lectures for the week, while in the immediate questioning condition, students faced questions directly embedded in the video player, time-stamped and deliberately synchronized with the presented conceptual content. The two treatment conditions, “Delayed Questioning” and “Immediate Questioning,” were carefully designed to isolate the effect of question timing while keeping all other variables constant. As such, the low-stakes questions, the quantity of videos, and the number of questions in both conditions were completely identical, with the only experimental manipulation being the timing and placement of the questions across conditions.
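The two conditions amount to one question bank deployed with different placement. As an illustrative sketch (the class and field names below are hypothetical, not the actual course platform's data model), the following Python fragment shows how an identical question plan would either be time-stamped inside each video (immediate) or pooled, untimed, after the full video series (delayed):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Question:
    prompt: str
    timestamp_s: Optional[float] = None  # None => asked after the video series

@dataclass
class VideoLesson:
    title: str
    duration_s: float
    embedded: List[Question] = field(default_factory=list)

def build_week(videos: List[VideoLesson],
               plan: List[Tuple[int, float, str]],
               condition: str) -> List[Question]:
    """Deploy one question plan under either condition.

    plan items are (video_index, timestamp_s, prompt). Returns the pooled
    end-of-unit questions (empty under the immediate condition). Question
    content is identical across conditions; only timing/placement differs.
    """
    end_of_unit: List[Question] = []
    for vid_idx, t, prompt in plan:
        if condition == "immediate":
            # Embed the question in the player at its synchronized timestamp
            videos[vid_idx].embedded.append(Question(prompt, timestamp_s=t))
        else:
            # Delayed: same question, untimed, after all videos are watched
            end_of_unit.append(Question(prompt))
    return end_of_unit

# One hypothetical weekly unit
plan = [(0, 95.0, "Which scenario illustrates directional selection?"),
        (1, 210.0, "How does population size affect genetic drift?")]
vids = [VideoLesson("Natural selection", 420.0),
        VideoLesson("Genetic drift", 380.0)]
pooled = build_week(vids, plan, condition="delayed")  # 2 pooled, 0 embedded
```

Keeping a single `plan` shared by both conditions mirrors the design constraint that only timing and placement were manipulated.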

After viewing the videos and answering the low-stakes questions, whether embedded directly in the videos or presented after all videos in the instructional unit, all students took an end-of-week quiz, a summative assessment released on Fridays. The end-of-week quiz was identical across conditions and released on the same day and at the same time. This approach ensured equitable testing conditions, minimized potential confounding variables, and allowed a controlled comparison to determine whether embedding questions directly within the video player led to different learning outcomes than presenting questions after all of the videos. This methodological rigor facilitated a robust analysis of the impact of question placement on students’ learning experiences and outcomes.

5.3 Participants

The study encompassed a total of n = 183 undergraduate students actively enrolled in upper-division courses in Ecology and Evolutionary Biology. Participants were selected based on their voluntary self-enrollment in these courses during the Winter 2021 enrollment period. No exclusion criteria were applied, allowing for a broad sample encompassing various backgrounds and levels of experience in Ecology and Evolutionary Biology. The courses were offered within the School of Biological Sciences at a prominent R1 research university in Southern California. Students could enroll in the upper-division course as long as they were biological sciences majors and had met the lower-division prerequisites. Demographically, 1.2% of participants identified as African American, 72.0% as Asian/Pacific Islander, 10.1% as Hispanic, 11.3% as white, and 5.4% as multiracial; 69.0% were female and 31.0% male (See Table 1). Participants self-selected into one of two course sections, each characterized by a different course implementation: (1) the first section featured questions placed at the conclusion of all video scaffolds (n = 92); (2) the second section incorporated questions embedded directly within the video scaffolds themselves (n = 91).

5.4 Learning Experience Design

5.4.1 Video Design

The curriculum delivery integrated innovative self-paced video materials crafted with the Learning Experience Design (LXD) paradigm in mind (Wong et al., 2024). These videos incorporated various digital learning features such as high-quality studio production, 4K multi-camera recording, green screen inserts, voice-over narrations, and animated infographics (See Fig. 2). Underpinning this pedagogical approach to video delivery was situated cognition theory (SCT) for e-learning experience design (Brown et al., 1989). In practice, the videos were structured to align with the key elements of SCT: modeling, coaching, scaffolding, articulation, reflection, and exploration (Collins et al., 1991; Wong et al., 2024). For instance, the instructor initiated each module by introducing a fundamental concept, offering in-depth explanations supported by evidence, presenting real-world instances demonstrating the application of the concept in research, and exploring the implications of the concept to align with the course’s educational objectives. This approach emphasized immersion in real-world applications, enhancing the overall learning experience.

figure 2

This figure visually depicts the embedded video question interface alongside the Bloom's Taxonomy pyramid, illustrating the connection between the video questions and the quiz questions for the week, specifically emphasizing the testing effect

In the video design process, we adopted an approach in which content equivalent to an 80-minute in-person lecture was broken down into smaller, more manageable segments lasting five to seven minutes. This approach was taken to alleviate potential student fatigue, reduce cognitive load, and minimize opportunities for distraction (Humphris & Clark, 2021; Mayer, 2019). Moreover, we meticulously scripted the videos to align with the course textbook. This alignment pre-trained students in fundamental concepts and terminology using scientific visuals and simplified explanations, preparing them for more in-depth and detailed textbook study. As part of our video design strategy, we strategically integrated embedded questions at specific time points during video playback. These questions served multiple purposes: assessing students’ comprehension, sustaining their attention, and pinpointing areas of strength and weakness in their understanding. In line with Mayer’s (2019) principles of multimedia design, our videos incorporated pretraining, segmenting, temporal contiguity, and signaling (See Fig. 2). These principles ensured that relevant concepts, visuals, and questions were presented concurrently rather than sequentially (Mayer, 2003, 2019), encouraging active engagement and processing by providing cues to learners within the video content.

5.4.2 Question Design

Students in both the “immediate” and “delayed” conditions experienced low-stakes multiple-choice questions. Low-stakes multiple-choice questions were knowledge check questions that served as opportunities for content practice, retention, and reconstructive exercises, aiming to engage learners in the testing effect and enhance their conceptual understanding (Richland et al., 2009 ). Grounded in Bloom’s Taxonomy, the low-stakes questions were designed to emphasize lower-order thinking skills, such as “remembering and understanding” concepts in context (Bloom, 2001 ; Betts, 2008 ) (See Fig.  2 ). In contrast, students experienced high-stakes multiple-choice questions on the weekly summative quizzes consisting of higher-order thinking questions that required students to “apply, analyze, and evaluate” scenarios in ecology and evolutionary biology, encouraging learners to break down relationships and make judgments about the information presented (Bloom, 2001 ; Betts, 2008 ) (See Fig.  2 ).

For instance, one low-stakes multiple-choice question students encountered (See Fig. 2) was: “In the hypothetical fish example, the cost of reproduction often involves:” (A) shunting of fats and gonads to provision eggs, (B) shunting of fats to gonads to make more sperm, (C) using fats as a source of fuel for general locomotion, (D) fish face no resource limitations, (E) A and B. The question prompts the learner to “remember” and “understand” what they just watched and to identify what they know or potentially do not know. Questions that prompt learners to “remember” and “understand” are considered lower-order thinking questions on the Bloom’s Taxonomy pyramid (Bloom, 2001). An example of the high-stakes questions that students encountered on their weekly summative quizzes is: “Given the tradeoff between survival and fecundity (the number of offspring), how does natural selection act on species?” (A) Natural selection will minimize the number of mating cycles, (B) Natural selection will maximize fecundity, (C) Natural selection will maximize survivability, (D) Natural selection will compromise between survival and fecundity, (E) None of the above. These high-stakes questions on the weekly summative quizzes consist of higher-order thinking questions that require learners to “apply, analyze, and evaluate,” the top three pillars of the Bloom’s Taxonomy pyramid (Bloom, 2001). The notable differences between low-stakes and high-stakes questions lie in learners’ application of their conceptual understanding: elaborating on new and existing understandings, critically evaluating between concepts, and applying the concepts in a new scenario or context.
High-stakes questions, or higher-order thinking questions, have been shown to promote the transfer of learning, increase the application of concepts during retrieval practice, and discourage simply recalling facts and memorizing the right answers by heart (Chan, 2010; McDaniel et al., 2013; Mayer, 2014; Richland et al., 2009). This active process allows students to organize key learning concepts into higher orders and structures. Moreover, students’ working memory connects new knowledge with prior knowledge, facilitating the transfer to long-term memory and enabling retrieval of this information at a later time (Mayer, 2014). Together, these strategic question design choices empower students to actively participate in constructive metacognitive evaluations, encouraging learners to contemplate “how and why” they reached their conclusions (See Fig. 2). Research has indicated that such an approach promotes critical thinking and the utilization of elaborative skills among learners in online learning contexts (Tullis & Benjamin, 2011; Wang et al., 2013). Furthermore, by having students answer questions and practice the concepts, we intended for students to be better prepared for the high-stakes questions on the weekly quizzes because the preceding low-stakes questions facilitated the testing effect.

Based on their respective conditions, learners would encounter low-stakes questions either after watching a series of 6 or 7 lecture videos or integrated directly within each video synchronized to the concept being taught. We opted to have the questions for the “delayed” condition after a series of videos instead of after every video because this time delay allowed us to investigate the effects of timing and spacing between the two treatment conditions. Having all the questions appear at the end of a series of lecture videos also helped to avoid the recency effect and minimize immediate recall for students in the “delayed” condition. Since having questions after every video could also be considered a form of immediate questioning, as the questions would be directly related to the video students just watched, we intentionally designed the “delayed” condition to have all the questions at the end of 6 or 7 videos for that instructional unit to maintain treatment differences. By structuring the questions in the “delayed” condition this way, we aimed to assess whether students retain and integrate knowledge over time, providing a more comprehensive understanding of the learning process and the potential treatment differences of “delayed” compared to “immediate” questioning. Furthermore, we considered that this design choice could mitigate potential fatigue effects that might arise from frequent interruptions of questioning for students in the “immediate” condition. Ultimately, the research design decision for the “delayed” condition to place the low-stakes questions after students watched 6 or 7 videos for that instructional unit provided an optimal treatment comparison between the immediate and delayed conditions.

5.4.3 Course Design and Delivery

The course was implemented within the Canvas Learning Management System (LMS), the official learning platform of the university. The videos recorded for this course were uploaded, designed, and deployed using the Yuja Enterprise Video Platform, a cloud-based content management system (CMS) for video storage, streaming, and e-learning content creation. For this study, we utilized Yuja to store the videos in the cloud, design the embedded video question platform, and record student grades. After the videos were uploaded, the questions and their corresponding answer options were entered into the Yuja system at specific time codes, which were determined by the concepts presented within each video. Typically, lower-order thinking questions (i.e., questions that required remembering and understanding) were placed immediately after the definition of a key concept was introduced, while higher-order thinking questions (i.e., analyzing and evaluating) were placed toward the end of the video so that students could apply the concepts in context before moving on to the next video. Finally, each video was published from Yuja to Canvas using the Canvas Learning Tools Interoperability (LTI) integration so that all student responses to the embedded video questions were automatically graded and directly updated in the Canvas grade book.
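The time-code mapping described above can be sketched as a simple data structure. This is an illustrative sketch only; the class names and fields are our assumptions for exposition and do not correspond to the Yuja or Canvas APIs:

```python
from dataclasses import dataclass, field

@dataclass
class EmbeddedQuestion:
    """One time-coded, low-stakes question attached to a video."""
    timestamp_sec: int        # point at which playback pauses for the question
    prompt: str
    choices: list[str]
    answer_index: int         # index of the correct choice
    bloom_level: str          # e.g. "remember", "understand", "analyze"

@dataclass
class LectureVideo:
    title: str
    duration_sec: int
    questions: list[EmbeddedQuestion] = field(default_factory=list)

# A lower-order question placed just after a key definition,
# mirroring the placement strategy described in the text
video = LectureVideo("Life-history tradeoffs", 420)
video.questions.append(EmbeddedQuestion(
    timestamp_sec=185,
    prompt="The cost of reproduction often involves:",
    choices=["A", "B", "C", "D", "E"],
    answer_index=4,
    bloom_level="remember",
))
```

Structuring questions this way makes the synchronization between a concept's time code and its question explicit, which is the core of the "Immediate Questioning" condition.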

5.5 Data Collection and Instrumentation

Data collection for this study was conducted electronically during the Winter 2021 academic term. All survey instruments were distributed online to participating students through the Qualtrics XM platform, an online survey tool provided through the university. Students accessed the surveys through hyperlinks integrated directly into their Canvas Learning Management System (LMS) course space, providing a user-friendly, FERPA-compliant, and secure centralized data collection environment. Students completed the surveys immediately after finishing their last lesson in Week 10, the final week of the course. When responding to the surveys, students were asked to reflect specifically on their learning experiences in the online course in which they were enrolled. Having students complete the surveys right after their last lesson was an intentional research design decision to maintain the rigor, robustness, and quality of student responses.

5.5.1 Survey Instruments

Three survey scales were administered to participants: the critical thinking and self-regulation subscales of the Motivated Strategies for Learning Questionnaire (MSLQ), and the Perceived Engagement Scale. We maintained the original question count and structure for reliability but made slight adjustments, such as replacing “classroom” with “online course,” to align with the study’s online course context. This approach, supported by research (Hall, 2016; Savage, 2018), preserves the survey instruments’ reliability, particularly across different learning modalities.

The MSLQ instrument utilized in this study was originally developed by a collaborative team of researchers from the National Center for Research to Improve Postsecondary Teaching and Learning and the School of Education at the University of Michigan (Pintrich et al., 1993). This well-established self-report instrument is designed to comprehensively assess undergraduate students’ motivations and their use of diverse learning strategies. Respondents rated their agreement with statements on a 7-point Likert scale, ranging from 1 (completely disagree) to 7 (completely agree). To evaluate students in the context of the self-paced online course, we focused specifically on the self-regulation and critical thinking subscales of the MSLQ. Sample items on the self-regulation scale included “When studying for this course I try to determine which concepts I don’t understand well” and “When I become confused about something I’m watching for this class, I go back and try to figure it out.” Sample items for critical thinking included “I often find myself questioning things I hear or read in this course to decide if I find them convincing” and “I try to play around with ideas of my own related to what I am learning in this course.” According to the original authors, these subscales exhibit strong internal consistency, with Cronbach alpha coefficients of 0.79 and 0.80, respectively. In this study, Cronbach’s alphas for self-regulation and critical thinking were 0.86 and 0.85, respectively.
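For readers unfamiliar with the reliability coefficients reported above, Cronbach's alpha can be computed directly from an item-response matrix. The sketch below uses an invented 6-respondent, 4-item Likert dataset purely for illustration; it is not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    # Sum of per-item variances vs. variance of respondents' total scores
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical data: 6 respondents answering 4 Likert items (1-7)
responses = np.array([
    [5, 6, 5, 6],
    [3, 3, 4, 3],
    [6, 7, 6, 7],
    [2, 2, 3, 2],
    [4, 5, 4, 5],
    [7, 6, 7, 6],
])
print(round(cronbach_alpha(responses), 2))  # → 0.97
```

Values above roughly 0.7 to 0.8 are conventionally read as acceptable internal consistency, which is the benchmark the reported subscale alphas (0.79 to 0.86) clear.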

To gauge students’ perceptions of their online engagement, we employed a 12-item survey adapted from Rossing et al. (2012). This survey encompassed a range of questions probing students’ views on the learning experience and their sense of engagement within the online course. Respondents answered on a 5-point Likert scale, ranging from 1 (completely disagree) to 5 (completely agree). Sample items included “This online activity motivated me to learn more than being in the classroom” and “Online video lessons are important for me when learning at home.” Rossing et al. (2012) reported an internal consistency coefficient of 0.90 for this instrument, and Wong et al. (2023b) reported a coefficient of 0.88, further supporting the scale’s reliability across online learning contexts. In the present study, the scale demonstrated robust internal consistency, with a Cronbach alpha coefficient of 0.89.

5.5.2 Course Learning Analytics

Throughout the 10-week duration, individualized student-level learning analytics were gathered from the Canvas Learning Management System (LMS). These analytics encompassed various metrics, including total quiz grades, participation rates, and page views. Each student completed one quiz per week, for a total of 10 quizzes over the course of the study; each weekly quiz served as a summative assessment with 10 multiple-choice questions. The total quiz grade was derived from the summation of the weekly quiz scores over the 10-week period. Notably, the quizzes presented to students in both classes were identical in length, question count, and answer choices. By standardizing the quizzes across both classes, we ensured uniformity of assessment, enabling a fair comparison of learning outcomes between students who received embedded video questions and those who did not.

Pageviews and participation rates offered detailed insights into individual user behavior within the Canvas Learning Management System (LMS). Pageviews specifically monitored the total number of pages accessed by learners within the Canvas course environment, with each page load constituting a tracked event. This meticulous tracking provided a metric of the extent of learners’ interaction with course materials (Instructure, 2024 ), enabling a close examination of learner engagement and navigation patterns within the online course. Consequently, page view data can serve as a reliable proxy for student engagement rather than a definitive measure, assisting in gauging the occurrence of activity and facilitating comparisons among students within a course or when analyzing trends over time. The total number of page views for both classes were examined and compared between students with and without embedded video questions.

Participation metrics within the Canvas LMS encompassed a broad spectrum of user interactions within the course environment. These included not only traditional activities such as submitting assignments and quizzes but also more dynamic engagements such as watching and rewatching videos, redoing low-stakes questions for practice, and contributing to discussion threads by responding to questions (Instructure, 2024 ). Each instance of learner activity was logged as an event within the Canvas LMS. These participation measures were comprehensive and captured the diverse range of actions undertaken by students throughout their learning journey. They provided invaluable insights into the level of engagement and involvement of each student within their respective course sections. By recording these metrics individually for each student, the Canvas LMS facilitated detailed analysis and tracking of learner behavior, enabling a nuanced understanding of student participation patterns and their impact on learning outcomes.

5.6 Data Analysis Plan

We conducted scale reliability checks to assess the alpha coefficients for all measurement instruments. Additionally, a chi-square analysis was performed to ensure that there were no disparities between conditions in gender, ethnicity, or student grade-level status prior to treatment. Next, descriptive analyses were conducted to assess the frequencies, distribution, and variability across the two conditions on learners’ total quiz grades, page views, and participation after 10 weeks of instruction (See Table 2). Then, a series of one-way Analyses of Variance (ANOVAs) examined the differences between conditions on each dependent variable separately. Next, two Multivariate Analyses of Variance (MANOVAs) were conducted to evaluate the difference between treatment conditions on multiple dependent variables; a MANOVA was chosen in order to assess multiple dependent variables simultaneously while comparing two or more groups. The first MANOVA compared the means of learners with and without embedded video questions on three dependent variables: (D1) quiz grades, (D2) pageviews, and (D3) participation. A second MANOVA compared the same groups on three dependent variables: (D1) engagement, (D2) self-regulation, and (D3) critical thinking skills. Lastly, multiple regression analyses were conducted to evaluate the effect of embedded video questions on learners’ quiz grades and whether this relation was moderated by learners’ self-regulation and critical thinking skills.
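The first analysis step, a one-way ANOVA comparing two conditions, can be sketched from first principles. The group values below are hypothetical quiz scores invented for illustration, not the study's data, and the function is a minimal two-group special case of the general ANOVA:

```python
import numpy as np

def one_way_anova_f(group_a, group_b):
    """F statistic for a one-way ANOVA with two groups
    (equivalent to the squared two-sample t statistic)."""
    groups = [np.asarray(group_a, float), np.asarray(group_b, float)]
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    # Between-group sum of squares (df = k - 1)
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = N - k)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical weekly quiz averages for the two conditions
delayed = [93.4, 95.7, 97.6, 94.1, 96.0]
immediate = [96.9, 99.2, 99.6, 98.0, 98.7]
f_stat = one_way_anova_f(delayed, immediate)  # F ≈ 12.54
```

A large F relative to its critical value at the chosen alpha indicates that between-condition variance exceeds within-condition variance; the MANOVAs reported below extend this logic to several dependent variables at once.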

Descriptive Analysis

Table  3 displays the average weekly quiz grades for two instructional conditions, “Delayed Questioning” and “Immediate Questioning,” over a ten-week period from January 4th to March 8th. Fluctuations in quiz grades are evident across the observation period for both conditions. For instance, on Week 1, the average quiz grade for “Delayed Questioning” was 95.65, while it was notably higher at 99.2 for students in the “Immediate Questioning” condition. Similarly, on Week 6, quiz grades decreased for both conditions, with “Delayed Questioning” at 93.35 and “Immediate Questioning” at 96.9 (See Fig.  3 ). Comparing the average quiz grades between the two instructional conditions revealed consistent differences throughout the observation period. The “Immediate Questioning” condition consistently demonstrated higher quiz grades compared to “Delayed Questioning.” Notably, this difference is particularly pronounced on certain dates, such as Week 3, where the average quiz grade for “Delayed Questioning” was 97.6, while it reached 99.6 for “Immediate Questioning.” These descriptive findings suggest that embedding questions directly within the video content may positively influence learners’ quiz performance, potentially indicating higher engagement and comprehension of the course material. However, further analysis is required to explore the significant differences in weekly quiz grades between the two instructional conditions.

figure 3

Descriptive comparison of students' weekly summative quiz by condition

figure 4

This figure presents the frequency of page views throughout the 10-week course

Figure 4 presents the frequency of page views throughout the 10-week course, acting as a proximal indicator of learner engagement, across different dates for the two instructional approaches: “Delayed Questioning” and “Immediate Questioning.” Higher page view counts indicate heightened interaction with course materials on specific dates. For example, on Week 1, “Delayed Questioning” registered 9,817 page views, while “Immediate Questioning” recorded 12,104 page views, indicating peaks in engagement. Conversely, lower page view counts on subsequent dates may imply reduced learner activity or engagement with the course content. Fluctuations in page view counts throughout the observation period highlight varying levels of learner engagement under each instructional condition. Notably, a comparative analysis between the two instructional methods unveiled consistent patterns, with the “Immediate Questioning” condition exhibiting higher page view counts across most observation dates. This initial examination suggests that embedding questions directly within the video player may enhance learner engagement, as evidenced by increased interaction with course materials.

Upon examination of the participation rates across the specified dates, it is evident that the “Immediate Questioning” condition consistently generated higher levels of engagement compared to the “Delayed Questioning” condition (See Fig.  5 ). For instance, on Week 4, the participation rate for “Delayed Questioning” was recorded as 459, while it notably reached 847 for “Immediate Questioning.” Similarly, on Week 7 participation rates were 491 and 903 for “Delayed Questioning” and “Immediate Questioning,” respectively, indicating a substantial difference in participation rates between the two instructional approaches. Moreover, both conditions experienced fluctuations in participation rates over time, with instances where participation rates surged or declined on specific dates. For instance, on Week 10, the participation rate for “Delayed Questioning” dropped to 287, whereas it remained relatively higher at 677 for “Immediate Questioning.” Overall, the descriptive analysis depicted in Fig.  5 highlights the differences in participation rates across the two conditions and underscores how embedding video questions influences learners’ online behaviors.

figure 5

This figure presents the frequency of participation throughout the 10-week course

6.1 Multivariate Analysis of Variance on Dependent Variables

A MANOVA was conducted to compare the means of learners with and without embedded video questions on three dependent variables: (D1) quiz grades, (D2) pageviews, and (D3) participation (See Table 4). The multivariate test was significant, F(3, 150) = 188.8, p < 0.001, Pillai’s Trace = 0.791, partial η2 = 0.791, indicating a difference between learners who experienced “Delayed” and “Immediate Questioning.” The univariate F tests showed statistically significant differences between the two conditions for total quiz grades (F(1, 152) = 6.91, p < 0.05, partial η2 = 0.043), pageviews (F(1, 152) = 26.02, p < 0.001, partial η2 = 0.146), and course participation rates (F(1, 152) = 569.6, p < 0.001, partial η2 = 0.789). Bonferroni pairwise comparisons of mean differences for total quiz grades (p < 0.05), pageviews (p < 0.001), and course participation (p < 0.001) were statistically significant between the two conditions. Therefore, learners who experienced questions directly embedded within the video player had significantly higher total quiz grades, page views, and course participation across the 10 weeks.

A second MANOVA compared the means of learners with and without embedded video questions on three dependent variables: (D1) engagement, (D2) self-regulation, and (D3) critical thinking skills (See Table 5). The multivariate test was significant, F(3, 179) = 5.09, p < 0.001, Pillai’s Trace = 0.079, partial η2 = 0.079, indicating a difference between learners who experienced “Delayed” and “Immediate Questioning.” The univariate F tests showed statistically significant differences between learners with and without embedded video questions for engagement (F(1, 181) = 7.43, p < 0.05, partial η2 = 0.039), self-regulation (F(1, 181) = 14.34, p < 0.001, partial η2 = 0.073), and critical thinking skills (F(1, 181) = 6.75, p < 0.01, partial η2 = 0.036). Bonferroni pairwise comparisons of mean differences for engagement (p < 0.05), self-regulation (p < 0.001), and critical thinking skills (p < 0.01) were statistically significant across the two conditions. Therefore, learners who experienced questions directly embedded within the video player reported significantly higher engagement, self-regulation, and critical thinking skills.

6.2 Moderation Analyses

A multiple regression model investigated whether the association between treatment condition (“Delayed” vs. “Immediate Questioning”) and learners’ total quiz grades depended on learners’ levels of self-regulation and critical thinking (Table 6). The moderators for this analysis were learners’ self-reported self-regulation and critical thinking skills, and the outcome variable was learners’ total quiz grades after 10 weeks. Results show that experiencing “Immediate Questioning” significantly predicted total quiz grades (β = 1.15, SE = 4.72). Additionally, the main effects of students’ self-regulation (β = 0.394, SE = 0.78) and critical thinking skills (β = 0.222, SE = 0.153) were statistically significant. Furthermore, the interaction between “Immediate Questioning” and self-regulation was also significant (β = 0.608, SE = 0.120), suggesting that the effect of condition on quiz grades depends on the level of learners’ self-regulation. However, the interaction between treatment condition and critical thinking was not significant (β = 0.520, SE = 0.231). Together, the variables accounted for approximately 20% of the explained variance in learners’ quiz grades, R2 = 0.19, F(5, 158) = 9.08, p < 0.001.
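A moderation model of this form can be sketched as an ordinary least squares regression with an interaction term. All data below are simulated under assumed effect sizes and do not reproduce the study's reported estimates; the point is only to show how the condition × self-regulation interaction enters the design matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 160

# Simulated predictors: condition (0 = delayed, 1 = immediate)
# and a mean-centered self-regulation score
condition = rng.integers(0, 2, n)
self_reg = rng.normal(0, 1, n)

# Simulated outcome with an assumed condition x self-regulation interaction
quiz = (95 + 1.2 * condition + 0.4 * self_reg
        + 0.6 * condition * self_reg + rng.normal(0, 1, n))

# Design matrix: intercept, two main effects, interaction term
X = np.column_stack([np.ones(n), condition, self_reg, condition * self_reg])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)
# beta[3] estimates the moderation (interaction) coefficient
```

A nonzero `beta[3]` means the slope of self-regulation on quiz grades differs between conditions, which is exactly the pattern the significant interaction in Table 6 reflects.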

7 Discussion

This study was part of a large-scale online learning research effort at the university, examining undergraduate experiences through pedagogically grounded educational technologies. Specifically, it implemented learning experience design, the testing effect model, and “edtech tools” aligned with evidence-based learning theories to enhance student knowledge, engagement, and transferable skills such as self-regulation and critical thinking. A key goal was to use design-based research methodologies to evaluate students in courses where instructors were applying these evidence-based practices in real-world settings, helping determine whether investments in educational technologies supported student learning outcomes. With the increased demand for online learning post-pandemic, this study investigated the impact of embedded video questions within an asynchronous online Biology course on engagement, self-regulation, critical thinking, and quiz performance. By comparing “Immediate Questioning” with “Delayed Questioning,” this research explored how the timing of embedded video questions affected the efficacy of online learning, contributing to our understanding of effective online education strategies. The discussion interprets and contextualizes the findings within the broader landscape of online education, technology integration, and pedagogical design.

7.1 Impact on Student Course Outcomes

The first MANOVA results revealed significant positive effects of “Immediate Questioning” on learners’ summative quiz scores over the 10-week period compared to the “Delayed Questioning” condition. Notably, both groups had equal preparation time, with quizzes released at the same time and with the same deadlines each week. This indicates that the timing and interactive nature of embedded video questions, aimed at fostering the testing effect paradigm, contributed to increased learner activity and participation (Richland et al., 2009). The “Immediate Questioning” group, characterized by notably higher weekly quiz scores, benefitted from the active learning facilitated by concurrently processing concepts and answering questions while watching the lecture videos. Embedded questions not only fostered an active learning environment but also captured students’ attention and engaged them differently than passive video viewing modalities (Mayer, 2021; van der Meij et al., 2021). This approach allowed for immediate recall and practice, providing guided opportunities for students to reflect on their knowledge and validate their accuracies or improve upon their mistakes (Cummins et al., 2015; Haagsman et al., 2020). The strategic timing of questions synchronized with specific instructional topics gave students opportunities to recognize, reflect on, and decipher what they know and what they do not know. Consequently, students approached their weekly quizzes with greater readiness, as strategically positioned embedded video questions fostered enhanced cognitive engagement through their intentional timing, placement, and deliberate use of low-stakes questioning (Christiansen et al., 2017; Deng & Gao, 2023).
Overall, the study’s results align with previous literature, indicating that interactive low-stakes quizzing capacities through intentionally timed questions within video-based learning effectively simulate the testing effect paradigm to foster retrieval practice over time (Littrell-Baez et al., 2015 ; Richland et al., 2009 ). These findings underscore the efficacy of integrating interactive elements into online learning environments to enhance student engagement and learning outcomes.

Additionally, students in the “Immediate Questioning” condition demonstrated significantly higher participation rates and page views within the course (Table 2). Page views were tracked at the individual student level, representing the total number of pages accessed, including watching and rewatching videos, accessing assignments, and downloading course materials. This indicates that students in the “Immediate Questioning” condition were more engaged with course content, preparing for weekly quizzes by actively engaging with various resources. In terms of participation rates, learners in the “Immediate Questioning” condition were more active compared to their counterparts (Table 2). Participation encompassed various actions within the Canvas LMS course, such as submitting assignments, watching videos, accessing course materials, and engaging in discussion threads. Students in this condition were more likely to ask questions, share thoughts, and respond to peers, fostering a deeper level of engagement. Moreover, there was a consistent pattern of students revisiting instructional videos, as reflected in page views. Research on embedded video questions has shown that they prompt positive learning behaviors, such as reviewing course materials (Cummins et al., 2015; Haagsman et al., 2020; Rice et al., 2019; Wong et al., 2022). These insights into student behavior highlight the impact of integrating questions within the video player, resulting in increased engagement indicated by higher page views and course participation.
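The two engagement measures above (page views and participation counts per student) can be illustrated with a minimal aggregation sketch. The event names and log format below are hypothetical illustrations, not Canvas's actual analytics schema.

```python
from collections import Counter

# Hypothetical LMS event log: (student_id, action) pairs. Action names
# are illustrative and do not reflect Canvas's real analytics schema.
events = [
    ("s1", "page_view"), ("s1", "page_view"), ("s1", "submit_assignment"),
    ("s1", "watch_video"), ("s2", "page_view"), ("s2", "discussion_post"),
    ("s2", "page_view"),
]

# Actions counted as "participation" (submissions, video views, posts).
PARTICIPATION = {"submit_assignment", "watch_video", "discussion_post"}

# Per-student totals, analogous to the two course-level metrics reported.
page_views = Counter(s for s, a in events if a == "page_view")
participation = Counter(s for s, a in events if a in PARTICIPATION)

print(dict(page_views))     # per-student page-view totals
print(dict(participation))  # per-student participation counts
```

Separating raw page views from a curated participation set mirrors the distinction drawn in the study between general content access and active course behaviors.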

7.2 Impacts on Student Learning Behaviors

In addition to learning analytics, we gathered data on students’ self-reported online engagement. Students in the “Immediate Questioning” condition reported higher engagement levels than their counterparts, possibly due to the anticipation of upcoming questions, fostering attention, participation, and interaction. This increased awareness can positively impact students’ engagement, retrieval, and understanding, as they mentally prepare for the questions presented (Dunlosky et al., 2013; Schmitz, 2020). Moreover, questions directly embedded within the video encourage thoughtful engagement with material, amplifying the benefits of repeated low-stakes testing in preparation for assessments (Kovacs, 2016; Richland et al., 2009). Our study manipulated the timing of these questions to enhance the saliency of the testing effect paradigm, aiming to transition learners from passive to active participants in the learning process. When considering both the first and second MANOVA results, students in the “Immediate Questioning” condition not only showed significant differences in participation and page views but also reported significantly higher engagement compared to those in the “Delayed Questioning” condition. These findings align with previous research on interactive learning activities and “edtech tools” in promoting engagement in online courses (Wong et al., 2022; Wong et al., 2024). We employed the same instructional videos from Wong and Hughes (2022), but our study was informed by the design constraints students identified regarding limited interactivity, practice opportunities, and student-centered active learning in asynchronous settings. By integrating embedded video questions to address these concerns, we offered students a more engaging and interactive learning experience. As a result, embedding questions directly within videos is suggested to be an effective strategy for enhancing learner engagement and participation in online courses.
Our results also contribute to the literature by comparing self-report data with behavioral course data, shedding light on the beneficial impacts of embedded video questions.

The significant differences in self-regulation and critical thinking skills among learners in the “Immediate Questioning” condition, who experienced questions embedded directly in videos, highlight the value of this pedagogical approach. Engaging with questions intentionally timed and aligned with the instructional content requires learners to monitor and regulate their cognitive processes, fostering metacognitive awareness and self-regulated learning (Jain & Dowson, 2009; Wang et al., 2013). The cognitive effort exerted to critically analyze, reflect on, and respond to these questions within the video enhances critical thinking skills, compelling learners to evaluate and apply their understanding in real-time contexts. Our intentional LXD aimed to enhance the testing effect model’s saliency, encouraging students to think about their own thinking through formative assessments and to solidify their conceptual understanding before summative assessments (Richland & Simms, 2015). Repeated opportunities for metacognitive reflection and regulation empower students to gauge their comprehension, identify areas for further exploration, and manage their learning progress (Wang et al., 2017; Wong & Hughes, 2022; Wong et al., 2022). Furthermore, immediate questioning, compared to delayed questioning, facilitates higher-order cognitive skills, with students in the “Immediate Questioning” condition showing significantly higher critical thinking. Critical thinking, evident through actions like exploring varied sources, learning from unsuccessful retrieval attempts (Richland et al., 2009), and making inferences (Uzuntiryaki-Kondakci & Capa-Aydin, 2013), is influenced by the timing of these questions.

Employing Bloom’s Taxonomy as a foundation for question construction, we formulated the lower-order questions to underscore the tasks of remembering, comprehending, and applying concepts in specific contexts (Bloom, 2001; Betts, 2008). Conversely, the higher-order questions were tailored to provoke the application and analysis of real-world scenarios in the field of ecology and evolutionary biology, requiring students to deconstruct relationships and evaluate patterns in the information presented (Bloom, 2001; Betts, 2008). In combination, these question design choices give students the opportunity to engage in critical evaluation of course concepts, prompting learners to make inferences, inquire, and judge complex problems as they formulate solutions. Immediate questioning prompts consideration of key concepts and assessment of understanding in real time (Jain & Dowson, 2009; Wang et al., 2013), whereas delayed questioning requires learners to retain the information for a longer duration in working memory while also mitigating distractions from mind-wandering as they await a delayed opportunity to actively retrieve and practice the information gleaned from the videos (Richland et al., 2009; Richland & Simms, 2015; Wong et al., 2023b). Thus, promptly answering low-stakes questions embedded within videos while engaging with content enhances self-regulation, critical thinking, and overall engagement with instructional material. In this way, cultivating both self-regulation and critical thinking also holds the potential to bolster students’ transferable skills that can be applied across various contexts (Fries et al., 2020), a crucial competency for undergraduate students in STEM disciplines (Wong et al., 2023b).
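The experimental manipulation described above can be sketched as a simple scheduling rule: immediate questions trigger at their synchronized timestamps during playback, while delayed questions appear only after the video ends. The field names and timestamps below are hypothetical, not the study's actual question bank or platform API.

```python
# Hypothetical embedded-question specs: Bloom level and the video
# timestamp each question is synchronized with (seconds).
QUESTIONS = [
    {"id": "q1", "bloom": "remember", "timestamp_s": 120},
    {"id": "q2", "bloom": "apply",    "timestamp_s": 300},
    {"id": "q3", "bloom": "analyze",  "timestamp_s": 540},
]

def schedule(questions, condition, video_length_s):
    """Return (question_id, trigger_time) pairs for a condition.

    "immediate": each question pops up at its synchronized timestamp
    during playback; "delayed": all questions appear after the video.
    """
    if condition == "immediate":
        return [(q["id"], q["timestamp_s"]) for q in questions]
    return [(q["id"], video_length_s) for q in questions]

immediate = schedule(QUESTIONS, "immediate", video_length_s=600)
delayed = schedule(QUESTIONS, "delayed", video_length_s=600)
print(immediate)
print(delayed)
```

Note that both conditions receive identical questions; only the trigger times differ, which is what isolates timing as the manipulated variable.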

7.3 Interplay between Student Learning Behaviors and Knowledge Outcomes

Our analysis explored the interplay between the two conditions, learners’ self-regulation, critical thinking, and quiz grades using a multiple regression model. The results revealed that treatment condition, self-regulation, and critical thinking were significant predictors of quiz grades (Table 4), suggesting a potential mechanistic role for self-regulation in the testing effect (Peng et al., 2019; Sotola & Crede, 2021). Notably, the interaction between the “Immediate Questioning” condition and self-regulation emerged as a significant factor, suggesting that the influence of embedded video questions on quiz grades varies based on learners’ self-regulation abilities. In other words, learners in the “Immediate Questioning” condition who showed greater self-regulation tended to have significantly higher quiz grades. This pattern underscores the importance of considering learners’ metacognitive strategies when examining the impact of instructional interventions online. Conversely, the interaction term between the two conditions and critical thinking was not significant (Table 5). While there was a significant main effect for critical thinking, the timing of low-stakes questioning (delayed or immediate) did not significantly influence quiz scores based on students’ critical thinking skills. This implies that the influence of question timing on total quiz scores in this study may depend not on students’ critical thinking skills but on their levels of self-regulation. Furthermore, self-regulation significantly influenced learners’ quiz grades throughout the 10-week course.
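The condition × self-regulation interaction in a model like this can be illustrated with a minimal sketch using simulated data; the coefficients, sample size, and noise level below are invented for illustration and do not reproduce the study's actual model or results.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100  # hypothetical students per condition

# Dummy-coded condition (1 = "Immediate Questioning") and a
# standardized self-regulation score.
cond = np.r_[np.ones(n), np.zeros(n)]
sreg = rng.normal(0, 1, 2 * n)

# Simulated quiz grades: main effects plus a condition x
# self-regulation interaction (all effect sizes are made up).
quiz = 90 + 2.0 * cond + 1.5 * sreg + 1.0 * cond * sreg \
       + rng.normal(0, 2, 2 * n)

# Design matrix: intercept, condition, self-regulation, interaction.
X = np.column_stack([np.ones(2 * n), cond, sreg, cond * sreg])
beta, *_ = np.linalg.lstsq(X, quiz, rcond=None)

for name, b in zip(["intercept", "condition", "self_regulation",
                    "interaction"], beta):
    print(f"{name:>16}: {b:+.2f}")
```

A positive interaction coefficient here corresponds to the pattern reported: the quiz-grade benefit of immediate questioning grows with a learner's self-regulation.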
Conceptually synchronized questions immediately embedded in the video player served as metacognitive reflective learning opportunities, empowering students to gauge their comprehension, identify areas for further exploration, and actively manage their learning progress (Delen et al., 2014; Wang et al., 2013; Wong & Hughes, 2023). One of the many benefits of the testing effect paradigm is acknowledging errors during low-stakes practice, allowing learners to self-regulate by reassessing initial understandings and fostering conceptual change (Richland et al., 2009; Iwamoto et al., 2017; Sotola & Crede, 2021). Enhancing students’ metacognitive techniques like self-regulation can enrich skills applicable in various contexts, including other courses, workforce training, and time management (Barak et al., 2016; Fisher & Baird, 2005; Fries et al., 2020). For STEM undergraduates at research-intensive institutions, embedding questions directly into the video player nurtures these critical proficiencies by linking course content with real-world applications. The study highlights how the interplay between LXD, the testing effect model, and immediate questioning embedded in video supports critical thinking and underscores the relationship between engagement, self-regulation, and science knowledge outcomes.

7.3.1 Alignment with Learning Experience Design and Learning Theories

The positive outcomes of this study also resonate with the principles of Learning Experience Design. LXD emphasizes human-centered, experiential, and evidence-based design to create meaningful and effective learning encounters (Floor, 2018). The incorporation of embedded video questions exemplifies how LXD principles can be applied intentionally to empathize with learners’ needs in online learning experiences (Wong & Hughes, 2023; Wong et al., 2023b). By incorporating interactivity through embedded video questions, the video lessons promoted active learning that accounted for learners’ needs and behaviors in the course. This design choice transformed passive video consumption into an interactive and participatory experience, aligning with LXD’s focus on fostering engagement through experiential learning techniques (Floor, 2018). Additionally, the alignment of the study’s findings with LXD underscores the value of interdisciplinary collaboration in implementing educational technologies at scale. To make this study possible, we worked alongside the university instructor, an instructional designer, and a researcher to integrate instructional design, learning sciences, theories of learning, and user experience design (Weigel, 2015). In doing so, we ensured that the course was properly aligned to the LXD paradigm, grounded in learning theories such as the testing effect and Bloom’s Taxonomy, and deployed with an empathic lens to promote students’ active learning behaviors in online learning settings. Thus, our efforts led to the implementation of a technology-enhanced online learning experience that effectively supported learners’ quiz grades, engagement, self-regulation, and critical thinking.

7.4 Implications for Practice and Future Directions

The implications of this study for educators, instructional designers, and higher education administrators are significant. Firstly, the incorporation of immediate low-stakes questioning directly within video content offers a promising avenue for enriching online learning experiences rooted in the Learning Experience Design (LXD) paradigm and the testing effect model. Educators can integrate these strategies and technological modalities into their course designs to foster active learning and deepen learners’ engagement with course material. Instructional designers, drawing on LXD principles, can create meaningful learning experiences that incorporate evidence-based pedagogical strategies, such as embedding low-stakes questions within instructional content. Facilitating the testing effect with low-stakes questioning can extend beyond videos and be incorporated into readings, assignments, and course activities. Moreover, higher education administrators and institutions should recognize the importance of integrating technology in line with evidence-based pedagogies. While the rapid introduction of educational technology (edtech) tools during the COVID-19 pandemic facilitated emergency remote learning, our study underscores the necessity of aligning these tools with pedagogical frameworks to optimize their effectiveness. By investing in the development and implementation of technologies that promote active learning and enhance learners’ engagement, self-regulation, and critical thinking, institutions can better equip students for success in online learning environments while capitalizing on existing edtech resources. An essential aspect of our study is to raise awareness about the range of tools already available to and supported by universities. Ensuring accessibility for instructors, designers, researchers, and students is imperative, enabling effective adoption of these tools while employing evidence-based strategies.
We aspire for this study to serve as an example of how university investments in tools can positively impact students’ learning experiences, encouraging others to adopt similar approaches as we continue to refine our support for students’ needs.

7.4.1 Limitations

Further research is needed to thoroughly assess the long-term benefits of embedding low-stakes questions directly into videos in online undergraduate courses. During this study, participants in both groups were presented with low-stakes questions throughout the course. Students in the immediate condition encountered questions embedded within the video player that triggered automatically, synchronized with instructional content. In contrast, those in the delayed condition faced identical questions after viewing all of the lecture videos in the instructional unit. While the timing of the questions served as a deliberate experimental manipulation between the two groups, determining whether the testing effect was more pronounced in either condition poses a limitation of the study. Despite high weekly quiz grades in the mid-to-upper 90% range for both conditions, quiz scores were significantly higher for those who experienced questions directly embedded in the video. However, it’s important to note that scores remained consistently high across both conditions, suggesting that the testing effect may manifest regardless of question timing or that the question difficulty may need to be adjusted. This highlights the need for further exploration of how the testing effect operates across instructional courses, topics, and learning contexts. Future research could involve a quasi-experimental study comprising a traditional control group without questions and treatment conditions integrating embedded video questions; utilizing larger sample sizes across STEM courses could reveal the true advantages of the testing effect. Moreover, future research could consider controlling for additional learning analytics, such as video completion rates, assignment submission times, and accuracy on low-stakes questions, as predictors of learners’ course performance and learning outcomes.
Understanding these dynamics can refine instructional strategies for optimizing learning outcomes in online education settings. We deliberately refrained from introducing additional learning opportunities between groups to ensure equal access to course content. Our aim was to evaluate the timing and integration of questions within or following video content, scrutinizing the effectiveness and benefits of implementing the embedded video questioning platform within the framework of LXD.

As a future direction, we plan to investigate the long-term impacts of embedded video questions on knowledge retention and transferable skills. Additionally, analyzing question types, quantities, and difficulty levels, along with on-demand feedback and spacing intervals within videos, could inform optimal design choices for promoting knowledge outcomes and student learning behaviors. Enhanced designs might include direct feedback for each low-stakes question, adjusting the quantity of low-stakes questions learners encounter, and refining the difficulty level to better cater to individual learning needs. Further research is warranted to explore underlying mechanisms, optimal design, and factors influencing cognitive aspects such as affect, cognitive load, and mind-wandering. Structural equation modeling, given sufficient sample sizes, could provide insights into the intricate mechanisms exhibited by students. Lastly, exploring the scalability of this approach across different subject domains and learner populations could enhance understanding of its generalizability and the benefits of operationalizing the testing effect through embedded video within the LXD paradigm.

8 Conclusion

The integration of low-stakes questioning embedded directly into the video player within an asynchronous online course grounded in the Learning Experience Design (LXD) paradigm showcased significantly positive effects on learners’ engagement, self-regulation, and critical thinking compared to their counterparts. In addition, results showed that learners in the immediate condition had significantly higher quiz grades, page views, and course participation after 10 instructional weeks. Furthermore, findings revealed that one potential mechanism underpinning learners’ increased quiz grades might be their levels of self-regulation when experiencing embedded video questions. As evidenced by students’ learning analytics and self-reported online engagement, learners were more actively involved in the learning process, with the timing of the embedded questions activating students’ awareness to reflect on “what, how, and why” before critically deciding on answer choices to the conceptual questions. We suspect that learners experienced more of the benefits of the testing effect given our LX design decisions: the timing and placement of the questions and their deliberately low-stakes design. Thus, results suggest that an LX-designed, self-paced online course deployed with low-stakes questions directly embedded in video is efficacious for students’ science learning outcomes and may have practical implications for the sustainability and rigor of undergraduate science distance learning. As a result, this study contributes to the growing body of literature on technology-enhanced pedagogical strategies for online learning and underscores the importance of aligning “edtech” tools with evidence-based pedagogical frameworks.
By fostering active learning through embedded low-stakes video questions, educators and instructional designers create online learning experiences that are more engaging, meaningful, and effective, ultimately enhancing students’ academic outcomes and transferable skills in digital learning environments. As institutions continue to invest in educational technology, the collaborative integration of expertise from diverse fields will be pivotal in designing and implementing effective and engaging online learning environments.

Data Availability

The data that support the findings of this study are available from the corresponding author, Joseph Wong, upon reasonable request.

Adesope, O. O., Trevisan, D. A., & Sundararajan, N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research , 87 (3), 659–701. https://doi.org/10.3102/0034654316689306 .


Agarwal, P. K., Karpicke, J. D., Kang, S. H., Roediger, H. L., III, & McDermott, K. B. (2008). Examining the testing effect with open- and closed-book tests. Applied Cognitive Psychology: The Official Journal of the Society for Applied Research in Memory and Cognition , 22 (7), 861–876. https://doi.org/10.1002/acp.1391 .

Ahn, J. (2019). Drawing inspiration for learning experience design (LX) from diverse perspectives. The Emerging Learning Design Journal , 6 (1), 1. https://digitalcommons.montclair.edu/eldj/vol6/iss1/1 .


Al-Harthy, I. S., Was, C. A., & Isaacson, R. M. (2010). Goals, efficacy and metacognitive self-regulation a path analysis. International Journal of Education , 2 (1), 1.

Asad, M. M., Hussain, N., Wadho, M., Khand, Z. H., & Churi, P. P. (2020). Integration of e-learning technologies for interactive teaching and learning process: An empirical study on higher education institutes of Pakistan. Journal of Applied Research in Higher Education . https://doi.org/10.1108/JARHE-04-2020-0103 .

Azevedo, R., Moos, D. C., Johnson, A. M., & Chauncey, A. D. (2010). Measuring cognitive and metacognitive regulatory processes during hypermedia learning: Issues and challenges. Educational psychologist, 45 (4), 210–223. https://doi.org/10.1080/00461520.2010.515934 .

Barak, M., Hussein-Farraj, R., & Dori, Y. J. (2016). On-campus or online: Examining self-regulation and cognitive transfer skills in different learning settings. International Journal of Educational Technology in Higher Education , 13 (1), 1–18. https://doi.org/10.1186/s41239-016-0035-9 .

Betts, S. C. (2008). Teaching and assessing basic concepts to advanced applications: Using Bloom’s taxonomy to inform graduate course design. Academy of Educational Leadership Journal , 12 (3), 99.

Bloom, H. (2001). How to read and why . Simon and Schuster.

Bolliger, D. U., & Halupa, C. (2018). Online student perceptions of engagement, transactional distance, and outcomes. Distance Education , 39 (3), 299–316. https://doi.org/10.1080/01587919.2018.1476845 .

Brookfield, S. (1995). Adult learning: An overview. International Encyclopedia of Education , 10 , 375–380.

Bruning, K. (2005). The role of critical thinking in the online learning environment. International Journal of Instructional Technology and Distance Learning , 2 (5), 21–31.

Carpenter, S. K. (2009). Cue strength as a moderator of the testing effect: The benefits of elaborative retrieval. Journal of Experimental Psychology: Learning Memory and Cognition , 35 (6), 1563. https://doi.org/10.1037/a0017021 .

Chan, J. C. (2010). Long-term effects of testing on the recall of nontested materials. Memory, 18 (1), 49–57. https://doi.org/10.1080/09658210903405737 .

Carrier, M., & Pashler, H. (1992). The influence of retrieval on retention. Memory & Cognition , 20 , 633–642. https://doi.org/10.3758/BF03202713 .

Chi, M. T., & Wylie, R. (2014). The ICAP framework: Linking cognitive engagement to active learning outcomes. Educational Psychologist, 49 (4), 219–243. https://doi.org/10.1080/00461520.2014.965823 .

Chick, R. C., Clifton, G. T., Peace, K. M., Propper, B. W., Hale, D. F., Alseidi, A. A., & Vreeland, T. J. (2020). Using technology to maintain the education of residents during the COVID-19 pandemic. Journal of Surgical Education , 77 (4), 729–732. https://doi.org/10.1016/j.jsurg.2020.03.018 .

Christiansen, M. A., Lambert, A. M., Nadelson, L. S., Dupree, K. M., & Kingsford, T. A. (2017). In-class versus at-home quizzes: Which is better? A flipped learning study in a two-site synchronously broadcast organic chemistry course. Journal of Chemical Education , 94 (2), 157–163. https://doi.org/10.1021/acs.jchemed.6b00370 .

Churches, A. (2008). Bloom’s taxonomy blooms digitally. Tech & Learning , 1 , 1–6.

Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American Educator , 15 (3), 6–11. https://eric.ed.gov/?id=EJ440511 .

Corporation for Public Broadcasting (1997). Study of school uses of television and video. 1996–1997 School year summary report. (ERIC Document Reproduction Service No. ED 413 879).

Corporation for Public Broadcasting (2004). Television goes to school: The impact of video on student learning in formal education. Available: http://www.cpb.org/stations/reports/tvgoestoschool/ .

Correia, A. P. (2021). ID 2 LXD. From instructional design to learning experience design: The Rise of design thinking. Driving educational change: Innovations in action .

Cruse, E. (2006). Using educational video in the classroom: Theory, research and practice. Library Video Company , 12 (4), 56–80.

Cummins, S., Beresford, A. R., & Rice, A. (2015). Investigating engagement with in-video quiz questions in a programming course. IEEE Transactions on Learning Technologies , 9 (1), 57–66. https://doi.org/10.1109/TLT.2015.2444374 .

Czerkawski, B. C., & Lyman, E. W. (2016). An instructional design framework for fostering student engagement in online learning environments. TechTrends , 60 , 532–539. https://doi.org/10.1007/s11528-016-0110-z .

Delen, E., Liew, J., & Willson, V. (2014). Effects of interactivity and instructional scaffolding on learning: Self-regulation in online video-based environments. Computers & Education , 78 , 312–320. https://doi.org/10.1016/j.compedu.2014.06.018 .

Deng, R., & Gao, Y. (2023). Effects of embedded questions in pre-class videos on learner perceptions, video engagement, and learning performance in flipped classrooms. Active Learning in Higher Education . https://doi.org/10.1177/14697874231167098

Deng, R., Feng, S., & Shen, S. (2023). Improving the effectiveness of video-based flipped classrooms with question-embedding. Education and Information Technologies , 1–26. https://doi.org/10.1007/s10639-023-12303-5 .

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest , 14 (1), 4–58. https://doi.org/10.1177/1529100612453266 .

EDSCOOP Staff (June 5, 2023). Colleges spent $1B on distance-learning tech at COVID-19 peak. https://edscoop.com/colleges-spent-distance-learning-tech-covid-19/#:~:text=Of%20more%20than%20%2426%20billion,students%2C%20according%20to%20the%20report .

Ertmer, P. A., Richardson, J. C., Lehman, J. D., Newby, T. J., Cheng, X., Mong, C., & Sadaf, A. (2010). Peer feedback in a large undergraduate blended course: Perceptions of value and learning. Journal of Educational Computing Research, 43 (1), 67–88. https://doi.org/10.2190/EC.43.1.e .

Fiorella, L., & Mayer, R. E. (2015). Learning as a generative activity . Cambridge University Press.


Fisher, M., & Baird, D. E. (2005). Online learning design that fosters student support, self-regulation, and retention. Campus-wide Information Systems , 22 (2), 88–107. https://doi.org/10.1108/10650740510587100 .

Floor, N. (2018). What is learning experience design . Springer.

Floor, N. (2023). This is learning experience design: What it is, how it works, and why it matters . New Riders.

Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research , 74 (1), 59–109. https://doi.org/10.3102/00346543074001059 .

Fries-Britt, S., & White-Lewis, D. (2020). In pursuit of meaningful relationships: How black males perceive faculty interactions in STEM. The Urban Review , 52 (3), 521–540. https://doi.org/10.1007/s11256-020-00559-x .

Fiorella, L., & Mayer, R. E. (2018). What works and doesn't work with instructional video. Computers in Human Behavior, 89 , 465–470. https://doi.org/10.1016/j.chb.2018.07.015

Giannakos, M. N. (2013). Exploring the video-based learning research: A review of the literature. British Journal of Educational Technology , 44 (6), E191–E195. https://doi.org/10.1111/bjet.12070 .

Haagsman, M. E., Scager, K., Boonstra, J., & Koster, M. C. (2020). Pop-up questions within educational videos: Effects on students’ learning. Journal of Science Education and Technology , 29 , 713–724. https://doi.org/10.1007/s10956-020-09847-3 .

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains: Disposition, skills, structure training, and metacognitive monitoring. American Psychologist , 53 (4), 449. https://doi.org/10.1037/0003-066X.53.4.449 .

Halverson, L. R., & Graham, C. R. (2019). Learner engagement in blended learning environments: A conceptual framework. Online Learning , 23 (2), 145–178. https://doi.org/10.24059/olj.v23i2.1481 .

Hu, S., & Kuh, G. D. (2002). Being (dis) engaged in educationally purposeful activities: The influences of student and institutional characteristics. Research in Higher Education , 43 , 555–575. https://doi.org/10.1023/A:1020114231387 .

Humphries, B., & Clark, D. (2021). An examination of student preference for traditional didactic or chunking teaching strategies in an online learning environment. Research in Learning Technology . https://doi.org/10.25304/rlt.v29.2405

Instructure (2024, January 18). How do I view analytics for an individual student in new analytics? Instructure Community. https://community.canvaslms.com/t5/Instructor-Guide/How-do-I-view-analytics-for-an-individual-student-in-New/ta-p/801 .

Iwamoto, D. H., Hargis, J., Taitano, E. J., & Vuong, K. (2017). Analyzing the efficacy of the testing effect using KahootTM on student performance. Turkish Online Journal of Distance Education , 18 (2), 80–93. https://doi.org/10.17718/tojde.306561 .

Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student performance? Computers & Education , 95 , 270–284. https://doi.org/10.1016/j.compedu.2016.01.014 .

Jain, S., & Dowson, M. (2009). Mathematics anxiety as a function of multidimensional self-regulation and self-efficacy. Contemporary Educational Psychology , 34 (3), 240–249. https://doi.org/10.1016/j.cedpsych.2009.05.004 .

Johnson, C. I., & Mayer, R. E. (2009). A testing effect with multimedia learning. Journal of Educational Psychology , 101 (3), 621. https://doi.org/10.1037/a0015183 .

Kanuka, H. (2006). Instructional design and eLearning: A discussion of pedagogical content knowledge as a missing construct. E-Journal of Instructional Science and Technology, 9 (2), n2.

Kestin, G., & Miller, K. (2022). Harnessing active engagement in educational videos: Enhanced visuals and embedded questions. Physical Review Physics Education Research, 18(1), 010148. https://doi.org/10.1103/PhysRevPhysEducRes.18.010148

Kolås, L. (2015, June). Application of interactive videos in education. In 2015 International Conference on Information Technology Based Higher Education and Training (ITHET) (pp. 1–6). IEEE. https://doi.org/10.1109/ITHET.2015.7218037

Kovacs, G. (2016, April). Effects of in-video quizzes on MOOC lecture viewing. In Proceedings of the Third (2016) ACM Conference on Learning @ Scale (pp. 31–40). https://doi.org/10.1145/2876034.2876041

Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2005). Never let it rest: Lessons about student success from high-performing colleges and universities. Change: The Magazine of Higher Learning, 37(4), 44–51. https://doi.org/10.3200/CHNG.37.4.44-51

Littrell-Baez, M. K., Friend, A., Caccamise, D., & Okochi, C. (2015). Using retrieval practice and metacognitive skills to improve content learning. Journal of Adolescent & Adult Literacy, 58(8), 682–689. https://doi.org/10.1002/jaal.420

Marshall, F. B., & Marshall, J. (2021, November). The effects of embedding knowledge-check questions in instructional videos. In Innovate Learning Summit (pp. 319–327). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/220301/

Matthews, M. T., Williams, G. S., Yanchar, S. C., & McDonald, J. K. (2017). Empathy in distance learning design practice. TechTrends, 61(5), 486–493. https://doi.org/10.1007/s11528-017-0212-2

Mayer, R. E. (Ed.). (2005). The Cambridge handbook of multimedia learning. Cambridge University Press.

Mayer, R. E. (2009). Constructivism as a theory of learning versus constructivism as a prescription for instruction. In Constructivist instruction (pp. 196–212). Routledge.

Mayer, R. E. (2014). Introduction to multimedia learning.

Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning, 33(5), 403–423. https://doi.org/10.1111/jcal.12197

Mayer, R. E. (2019). Thirty years of research on online learning. Applied Cognitive Psychology, 33(2), 152–159. https://doi.org/10.1002/acp.3482

Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93(1), 187. https://doi.org/10.1037/0022-0663.93.1.187

Mayer, R. E., Stull, A., DeLeeuw, K., Almeroth, K., Bimber, B., Chun, D., ... & Zhang, H. (2009). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51–57. https://doi.org/10.1016/j.cedpsych.2008.04.002

McDaniel, M. A., Agarwal, P. K., Huelser, B. J., McDermott, K. B., & Roediger, H. L., III (2011). Test-enhanced learning in a middle school science classroom: The effects of quiz frequency and placement. Journal of Educational Psychology, 103(2), 399. https://doi.org/10.1037/a0021782

McDaniel, M. A., Thomas, R. C., Agarwal, P. K., McDermott, K. B., & Roediger, H. L. (2013). Quizzing in middle-school science: Successful transfer performance on classroom exams. Applied Cognitive Psychology, 27(3), 360–372. https://doi.org/10.1002/acp.2914

McMahon, M., & Oliver, R. (2001). Promoting self-regulated learning in an on-line environment (pp. 1299–1305). Association for the Advancement of Computing in Education (AACE). https://www.learntechlib.org/primary/p/8630/

van der Meij, H., & Böckmann, L. (2021). Effects of embedded questions in recorded lectures. Journal of Computing in Higher Education, 33(1), 235–254. https://doi.org/10.1007/s12528-020-09263-x

Moos, D. C., & Bonde, C. (2016). Flipping the classroom: Embedding self-regulated learning prompts in videos. Technology, Knowledge and Learning, 21, 225–242. https://doi.org/10.1007/s10758-015-9269-1

National Center for Education Statistics. (2022). Postbaccalaureate enrollment. Condition of Education. U.S. Department of Education, Institute of Education Sciences. Retrieved May 31, 2022, from https://nces.ed.gov/programs/coe/indicator/chb

O’Leary, B., & June, A. W. (2023, May 30). Higher ed received billions in Covid-relief money. Where did it go? The Chronicle of Higher Education. https://www.chronicle.com/article/higher-ed-received-billions-in-covid-relief-money-where-did-it-go

Pan, S. C., Cooke, J., Little, J. L., McDaniel, M. A., Foster, E. R., Connor, L. T., & Rickard, T. C. (2019). Online and clicker quizzing on jargon terms enhances definition-focused but not conceptually focused biology exam performance. CBE—Life Sciences Education, 18(4), ar54. https://doi.org/10.1187/cbe.18-12-0248

Pellas, N. (2014). The influence of computer self-efficacy, metacognitive self-regulation and self-esteem on student engagement in online learning programs: Evidence from the virtual world of Second Life. Computers in Human Behavior, 35, 157–170. https://doi.org/10.1016/j.chb.2014.02.048

Pellas, N. (2018). Is the flipped classroom model for all? Correspondence analysis from trainee instructional media designers. Education and Information Technologies, 23(2), 757–775. https://doi.org/10.1007/s10639-017-9634-x

Peng, Y., Liu, Y., & Guo, C. (2019). Examining the neural mechanism behind testing effect with concrete and abstract words. Neuroreport, 30(2), 113–119. https://doi.org/10.1097/WNR.0000000000001169

Picciano, A. G. (2023). Future technological trends and research. In Data analytics and adaptive learning (pp. 303–322). Routledge.

Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement, 53(3), 801–813. https://doi.org/10.1177/0013164493053003024

Pulukuri, S., & Abrams, B. (2021). Improving learning outcomes and metacognitive monitoring: Replacing traditional textbook readings with question-embedded videos. Journal of Chemical Education, 98(7), 2156–2166. https://doi.org/10.1021/acs.jchemed.1c00237

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning Journal, 22(1), 183–204. https://doi.org/10.24059/olj.v22i1.1175

Rice, P., Beeson, P., & Blackmore-Wright, J. (2019). Evaluating the impact of a quiz question within an educational video. TechTrends, 63(5), 522–532. https://doi.org/10.1007/s11528-019-00374-6

Richland, L. E., Kornell, N., & Kao, L. S. (2009). The pretesting effect: Do unsuccessful retrieval attempts enhance learning? Journal of Experimental Psychology: Applied, 15(3), 243. https://doi.org/10.1037/a0016496

Richland, L. E., & Simms, N. (2015). Analogy, higher order thinking, and education. Wiley Interdisciplinary Reviews: Cognitive Science, 6(2), 177–192. https://doi.org/10.1002/wcs.1336

Roediger, H. L., III, & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181–210. https://doi.org/10.1111/j.1745-6916.2006.00012.x

Rossing, J. P., Miller, W., Cecil, A. K., & Stamper, S. E. (2012). iLearning: The future of higher education? Student perceptions on learning with mobile tablets. https://hdl.handle.net/1805/7071

Ryan, R. M., & Deci, E. L. (2017). Self-determination theory: Basic psychological needs in motivation, development, and wellness. Guilford Publications.

Sandars, J., Correia, R., Dankbaar, M., de Jong, P., Goh, P. S., Hege, I., & Pusic, M. (2020). Twelve tips for rapidly migrating to online learning during the COVID-19 pandemic. https://doi.org/10.15694/mep.2020.000082.1

Sansone, C., Fraughton, T., Zachary, J. L., Butner, J., & Heiner, C. (2011). Self-regulation of motivation when learning online: The importance of who, why and how. Educational Technology Research and Development, 59, 199–212. https://doi.org/10.1007/s11423-011-9193-6

Schmitz, W. H. G. (2020). Embedded questions in text and video-based lectures (Master’s thesis, University of Twente). https://purl.utwente.nl/essays/82825

Scott, E. E., Wenderoth, M. P., & Doherty, J. H. (2020). Design-based research: A methodology to extend and enrich biology education research. CBE—Life Sciences Education, 19(2), es11. https://doi.org/10.1187/cbe.19-11-0245

Shneiderman, B., & Hochheiser, H. (2001). Universal usability as a stimulus to advanced interface design. Behaviour & Information Technology, 20(5), 367–376. https://doi.org/10.1080/01449290110083602

Siek, K. A., Hayes, G. R., Newman, M. W., & Tang, J. C. (2014). Field deployments: Knowing from using in context. In J. Olson & W. Kellogg (Eds.), Ways of knowing in HCI (pp. 119–142). Springer. https://doi.org/10.1007/978-1-4939-0378-8_6

Sotola, L. K., & Crede, M. (2021). Regarding class quizzes: A meta-analytic synthesis of studies on the relationship between frequent low-stakes testing and class performance. Educational Psychology Review, 33, 407–426. https://doi.org/10.1007/s10648-020-09563-9

Sun, J. C. Y., & Rueda, R. (2012). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology, 43(2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x

Swan, K., Garrison, D. R., & Richardson, J. C. (2009). A constructivist approach to online learning: The community of inquiry framework. In Information technology and constructivism in higher education: Progressive learning frameworks (pp. 43–57). IGI Global.

Torres, D., Pulukuri, S., & Abrams, B. (2022). Embedded questions and targeted feedback transform passive educational videos into effective active learning tools. Journal of Chemical Education, 99(7), 2738–2742. https://doi.org/10.1021/acs.jchemed.2c00342

Tullis, J. G., & Benjamin, A. S. (2011). On the effectiveness of self-paced learning. Journal of Memory and Language, 64(2), 109–118. https://doi.org/10.1016/j.jml.2010.11.002

Uzuntiryaki-Kondakci, E., & Capa-Aydin, Y. (2013). Predicting critical thinking skills of university students through metacognitive self-regulation skills and chemistry self-efficacy. Educational Sciences: Theory and Practice, 13(1), 666–670. https://eric.ed.gov/?id=EJ1016667

Vrugt, A., & Oort, F. J. (2008). Metacognition, achievement goals, study strategies and academic achievement: Pathways to achievement. Metacognition and Learning, 3, 123–146. https://doi.org/10.1007/s11409-008-9022-4

Wang, C. H., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. https://doi.org/10.1080/01587919.2013.835779

Wang, H. H., Chen, H. T., Lin, H. S., & Hong, Z. R. (2017). The effects of college students’ positive thinking, learning motivation and self-regulation through a self-reflection intervention in Taiwan. Higher Education Research & Development, 36(1), 201–216. https://doi.org/10.1080/07294360.2016.1176999

Weigel, M. (2015). Learning experience design versus user experience: Moving from user to learner. Six Red Marbles.

Wolters, C. A., & Benzon, M. B. (2013). Assessing and predicting college students’ use of strategies for the self-regulation of motivation. The Journal of Experimental Education, 81(2), 199–221. https://doi.org/10.1080/00220973.2012.699901

Wong, J. T., Bui, N. N., Fields, D. T., & Hughes, B. S. (2023). A learning experience design approach to online professional development for teaching science through the arts: Evaluation of teacher content knowledge, self-efficacy and STEAM perceptions. Journal of Science Teacher Education, 34, 1–31. https://doi.org/10.1080/1046560X.2022.2112552

Wong, J., Chen, E., Rose, E., Lerner, B., Richland, L., & Hughes, B. (2023). The cognitive and behavioral learning impacts of embedded video questions: Leveraging learning experience design to support students’ knowledge outcomes. In P. Blikstein, J. Van Aalst, R. Kizito, & K. Brennan (Eds.), Proceedings of the 17th international conference of the learning sciences - ICLS 2023 (pp. 1861–1862). International Society of the Learning Sciences. https://doi.org/10.22318/icls2023.356980

Wong, J. T., Chen, E., Au-Yeung, N., Lerner, B. S., & Richland, L. E. (2024). Fostering engaging online learning experiences: Investigating situational interest and mind-wandering as mediators through learning experience design. Education and Information Technologies, 1–27. https://doi.org/10.1007/s10639-024-12524-2

Wong, J. T., & Hughes, B. S. (2022). Leveraging learning experience design: Digital media approaches to influence motivational traits that support student learning behaviors in undergraduate online courses. Journal of Computing in Higher Education, 35, 1–38. https://doi.org/10.1007/s12528-022-09342-1

Wong, J. T., Mesghina, A., Chen, E., Yeung, N. A., Lerner, B. S., & Richland, L. E. (2023b). Zooming in or zoning out: Examining undergraduate learning experiences with Zoom and the role of mind-wandering. Computers and Education Open, 4, 100118. https://doi.org/10.1016/j.caeo.2022.100118

Yousef, A. M. F., Chatti, M. A., & Schroeder, U. (2014). Video-based learning: A critical analysis of the research published in 2003–2013 and future visions. In eLmL 2014, The Sixth International Conference on Mobile, Hybrid, and On-line Learning (pp. 112–119).

Zimmerman, B. J., & Schunk, D. H. (Eds.). (2001). Self-regulated learning and academic achievement: Theoretical perspectives. Routledge.


Acknowledgements

We thank all the participating students, the instructor, and the university staff and administrators. We were impressed by their enthusiasm for adopting and implementing new LXD strategies in the pandemic teaching and learning environment.

This work was supported by the National Science Foundation Graduate Research Fellowship, under grant number 2020304238 to the first author via the University of California, Irvine.

Author information

Authors and affiliations

University of California, Irvine, 3200 Education Bldg, Irvine, CA, 92697, USA

Joseph T. Wong & Lindsey Engle Richland

University of California, Irvine, 301 Steinhaus Hall, Irvine, CA, 92697, USA

Bradley S. Hughes


Contributions

Joseph Wong: concept and design, data acquisition, data analysis/interpretation, manuscript writing, statistical analysis, technical support, material support. Lindsey Richland: critical manuscript revision, supervision, administration, material support. Bradley Hughes: instruction, concept and design, data acquisition, data analysis/interpretation, critical manuscript revision, administration, supervision.

Corresponding author

Correspondence to Joseph T. Wong.

Ethics declarations

Conflict of interest

No potential conflict of interest was reported by the authors.

Ethical Approval

This study was approved by the Institutional Review Board Ethics Committee at the University.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Wong, J.T., Richland, L.E. & Hughes, B.S. Immediate Versus Delayed Low-Stakes Questioning: Encouraging the Testing Effect Through Embedded Video Questions to Support Students’ Knowledge Outcomes, Self-Regulation, and Critical Thinking. Tech Know Learn (2024). https://doi.org/10.1007/s10758-024-09746-1


Accepted : 20 May 2024

Published : 30 July 2024

DOI : https://doi.org/10.1007/s10758-024-09746-1


Keywords

  • Learning experience design
  • Testing effect
  • Embedded video questions
  • Critical thinking
  • Self-regulation
