CRENC Learn

How to Create a Data Analysis Plan: A Detailed Guide

by Barche Blaise | Aug 12, 2020 | Writing

If a good research question equates to a story, then a roadmap is vital for good storytelling. We advise every student or researcher to write his or her own data analysis plan before seeking any advice. In this blog article, we will explore how to create a data analysis plan: its content and structure.

The data analysis plan serves as a roadmap for how the data collected will be organised and analysed. It includes the following aspects:

  • Clearly state the research objectives and hypotheses
  • Identify the dataset to be used
  • State the inclusion and exclusion criteria
  • Clearly define the research variables
  • State the statistical test hypotheses and the software for statistical analysis
  • Create shell tables

1. Stating research question(s), objectives and hypotheses:

All research objectives or goals must be clearly stated. They must be Specific, Measurable, Attainable, Realistic and Time-bound (SMART). Hypotheses are propositions drawn from personal experience or previous literature, and they lay the foundation for the statistical methods that will be applied to extrapolate results to the entire population.

2. The dataset:

The dataset that will be used for statistical analysis must be described and its important aspects outlined. These include: the owner of the dataset, how to get access to it, how it was checked for quality control, and the program in which it is stored (Excel, Epi Info, SQL, Microsoft Access, etc.).

3. The inclusion and exclusion criteria:

They guide the aspects of the dataset that will be used for data analysis. These criteria will also guide the choice of variables included in the main analysis.

4. Variables:

Every variable collected in the study should be clearly stated. Variables should be presented based on their level of measurement (nominal/ordinal or interval/ratio) or the role they play in the study (independent/predictor or dependent/outcome variables). The variable types should also be outlined. The variable type, in conjunction with the research hypothesis, forms the basis for selecting the appropriate statistical tests for inferential statistics. A good data analysis plan should summarize the variables as demonstrated in Figure 1 below.

Figure 1: Presentation of variables in a data analysis plan
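As a hedged sketch of such a summary, the variables can be listed with their level of measurement and role, then grouped by role. All variable names below are hypothetical, not from the article:

```python
# Hypothetical variable summary for a study of disease "X" (illustrative names).
# Each entry records the measurement level and the role the variable plays.
variables = [
    {"name": "age",           "level": "ratio",   "role": "independent"},
    {"name": "sex",           "level": "nominal", "role": "independent"},
    {"name": "disease_stage", "level": "ordinal", "role": "independent"},
    {"name": "disease_X",     "level": "nominal", "role": "outcome"},
]

# Group variable names by role, mirroring the summary table a plan should contain.
by_role = {}
for v in variables:
    by_role.setdefault(v["role"], []).append(v["name"])
```

Grouping this way makes it easy to check that every outcome in the hypotheses has predictors listed against it.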

5. Statistical software

There are tons of software packages for data analysis; some common examples are SPSS, Epi Info, SAS, STATA, and Microsoft Excel. Include the version number, year of release and author/manufacturer. Beginners tend to try different software packages without mastering any of them. It is better to select one and master it, because almost all statistical software performs the same for basic analyses and for the majority of the advanced analyses needed for a student thesis. This is what we recommend to all our students at CRENC before they begin writing their results section.

6. Selecting the appropriate statistical method to test hypotheses

Depending on the research question, hypothesis and type of variable, several statistical methods can be used to answer the research question appropriately. This aspect of the data analysis plan should clearly outline why each statistical method will be used to test hypotheses. The level of statistical significance (p-value), which is often but not always <0.05, should also be stated. Presented in Figures 2a and 2b are decision trees for some common statistical tests based on the variable type and research question.
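As an illustrative sketch of one branch of such a decision tree, the code below runs a Pearson chi-square test on a 2x2 contingency table using only the Python standard library (in practice a package such as SciPy, with `scipy.stats.chi2_contingency`, would do this in one call). The counts are made up:

```python
import math

# Illustrative 2x2 contingency table (made-up counts):
# rows = exposed / unexposed, columns = disease / no disease.
table = [[30, 70],
         [20, 80]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

# Pearson chi-square statistic: sum of (observed - expected)^2 / expected.
chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (obs - expected) ** 2 / expected

# For a 2x2 table (1 degree of freedom), the chi-square survival function
# reduces to erfc(sqrt(chi2 / 2)).
p_value = math.erfc(math.sqrt(chi2 / 2))

significant = p_value < 0.05  # the conventional significance level
```

For this table chi2 is about 2.67 and the p-value about 0.10, so at the 0.05 level the association would not be declared significant.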

A good analysis plan should clearly describe how missing data will be analysed.
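A minimal sketch of the first step, quantifying missingness before deciding how to handle it (the records and variable names below are hypothetical):

```python
# Hypothetical study records with missing values encoded as None.
records = [
    {"age": 34,   "bmi": 22.1, "smoker": "no"},
    {"age": None, "bmi": 25.4, "smoker": "yes"},
    {"age": 51,   "bmi": None, "smoker": None},
]

# Count missing values per variable; the analysis plan should then state
# what happens next (complete-case analysis, imputation, etc.).
missing = {}
for record in records:
    for key, value in record.items():
        if value is None:
            missing[key] = missing.get(key, 0) + 1
```

Reporting these counts per variable in the plan makes the chosen missing-data strategy auditable.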

Figure 2: How to choose a statistical method to determine association between variables

7. Creating shell tables

Data analysis involves three levels of analysis: univariable, bivariable and multivariable, in increasing order of complexity. Shell tables should be created in anticipation of the results that will be obtained from these different levels of analysis. Read our blog article on how to present tables and figures for more details. Suppose you carry out a study to investigate the prevalence and associated factors of a certain disease “X” in a population; the shell tables can then be represented as in Tables 1, 2 and 3 below.

Table 1: Example of a shell table from univariate analysis

Table 2: Example of a shell table from bivariate analysis

Table 3: Example of a shell table from multivariate analysis

aOR = adjusted odds ratio

Now that you have learned how to create a data analysis plan, here are the takeaway points. A data analysis plan should clearly state the:

  • Research question, objectives, and hypotheses
  • Dataset to be used
  • Variable types and their role
  • Statistical software and statistical methods
  • Shell tables for univariate, bivariate and multivariate analysis

Further reading

Creating a Data Analysis Plan: What to Consider When Choosing Statistics for a Study https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4552232/pdf/cjhp-68-311.pdf

Creating an Analysis Plan: https://www.cdc.gov/globalhealth/healthprotection/fetp/training_modules/9/creating-analysis-plan_pw_final_09242013.pdf

Data Analysis Plan: https://www.statisticssolutions.com/dissertation-consulting-services/data-analysis-plan-2/

Photo created by freepik – www.freepik.com

Barche Blaise

Dr Barche is a physician and holds a Masters in Public Health. He is a senior fellow at CRENC with interests in Data Science and Data Analysis.


Developing a data analysis plan

Map out the how and who of survey response analysis

Congratulations!

You’ve got survey results! It’s exciting. It’s empowering. It’s…a little overwhelming.

But before you start to worry, remember that you already set goals for your survey—and from your goals, you formed your response data analysis plan.

What is a data analysis plan?

A data analysis plan is a roadmap for how you’re going to organize and analyze your survey data—and it should help you achieve three objectives that relate to the goal you set before you started your survey:

  • Answer your top research questions
  • Use more specific survey questions to understand those answers
  • Segment survey respondents to compare the opinions of different demographic groups

The bigger picture: Go back to your goals

When you were planning your survey, you came up with general research questions that you wanted to answer by sending out a questionnaire. Remind yourself of your objectives when you start your data analysis plan.

Let’s say you held a conference for educators, and you wanted to know what the attendees thought of your event. Your survey goal was to get feedback from the people who attended your conference. To achieve that goal, you came up with general research questions you’d like insights on:

Conference Feedback Survey Goal: To get feedback from the people who attended my education conference. (I want feedback from attendees so I can assess my event’s strengths and weaknesses—and make targeted improvements accordingly.)

Research questions:

  • How did attendees rate the event overall?
  • What parts/aspects of the conference did attendees like the best?
  • What parts/aspects of the conference need to be improved?
  • Who are the attendees and what are their specific needs?

By going back to your goal and research questions, you should have your objectives fresh in your mind—and you’ll be ready to plan out how you’re going to organize your survey data.

Take a peek at the results for your top research questions

Typically, a data analysis plan will start with the survey questions that address your primary research question directly. In the case of your education conference, these are the two questions:

  • Overall, how satisfied were you with the conference?
  • How useful was this conference compared to other conferences you have attended?

From these two questions, you’ll know whether your conference was a success. When you report back to your boss or decide whether to hold the conference again next year, this is the information you’ll look to, and it’s the cornerstone of your topline results.

However, overall ratings don’t tell you anything about why attendees liked your conference or how you can make it even better.

Get granular: Organize your questions

Because you want to gain a more insightful understanding of what your data means, organize your thoughts by attributing your specific survey questions to each general research question. That way, when it comes to creating an effective final report, you’ll know exactly which data you need to answer your bigger questions.

If it helps, organize your questions in a table format:

How did attendees rate the event overall?
  1. Overall, how satisfied were you with the conference?
  2. How useful was this conference compared to other conferences you have attended?

What parts/aspects of the conference did attendees like the best? What parts/aspects of the conference need to be improved?
  3. How would you rate the difficulty of the workshop?
  4. Overall, do you think the conference provided too much, too little, or about the right amount of networking?
  5. In general, how would you rate the food at the conference?
  6. Do you feel the temperature in the conference building was too hot, too cold, or just right?

Who are the attendees and what are their specific needs?
  8. Are you a teacher, student, or administrator?
  9. How large is your school?
  10. How old are you?

Now, for example, when you want to answer the larger question, “What parts/aspects of the conference need to be improved?”, you know that you should draw on responses to survey questions 5 and 6.
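This kind of mapping can be sketched as a simple lookup. The wording is abbreviated and the structure is illustrative; the numbering follows the table above, which skips question 7:

```python
# Hypothetical lookup from each general research question to the numbered
# survey questions that answer it.
question_map = {
    "How did attendees rate the event overall?": [1, 2],
    "What parts/aspects of the conference need to be improved?": [3, 4, 5, 6],
    "Who are the attendees and what are their specific needs?": [8, 9, 10],
}

# To answer a larger research question, pull the mapped survey responses.
improvement_questions = question_map[
    "What parts/aspects of the conference need to be improved?"
]
```

Keeping the mapping explicit means the final report can be assembled question by question without re-reading the whole questionnaire.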

Demographic groups: Note the “who’s who” of your survey

You performed an event feedback survey because you wanted to know where you need to make improvements so you can host better future events. But one of the most important parts of understanding the significance of your data—and figuring out what you need to do to improve—is identifying different demographic groupings by segmenting your respondents.

To get a handle on who’s taking your survey, make sure to include demographic questions at the end of your survey, such as age, gender, job role, institution, and more. But why should you do this?

When you’re writing your data analysis plan, think about which groups you want to compare. You should plan to take into account who is taking your survey (and how many of them there are) so you can slice and dice the data in a meaningful way that will inform any improvements you make.

For example, what if your overall satisfaction scores are low, but you see that all the students at your conference loved it? You need to see how different demographic groups answered your survey questions. It’s possible that attendees over 60 didn’t enjoy events that required a deep knowledge of computers. And if enough of them took your survey, they may have lowered your overall scores.

But don’t fret—students were happy with your conference, so you know that your entire event wasn’t awful. Filtering your results by different demographic groups helps you gain perspective—and turn your data into valuable, actionable results.
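A minimal sketch of this kind of demographic segmentation, with made-up responses and groups:

```python
# Hypothetical responses: (respondent role, satisfaction score on a 1-5 scale).
responses = [
    ("student", 5), ("student", 4), ("student", 5),
    ("teacher", 2), ("teacher", 3),
]

# Accumulate (total score, count) per demographic group.
totals = {}
for role, score in responses:
    total, count = totals.get(role, (0, 0))
    totals[role] = (total + score, count + 1)

# Average satisfaction per group: the low teacher average would be invisible
# in the overall mean if students dominate the sample.
averages = {role: total / count for role, (total, count) in totals.items()}
```

Comparing per-group averages against the overall score is exactly the filtering step described above.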

Putting your analysis plan into action

Now that you know that writing an effective analysis plan involves starting with topline results, organizing your survey questions, and figuring out how you want to segment your survey population into subgroups, you’re ready to start analyzing the data!


New Prairie Press Open Book Publishing

5 Collecting Data in Your Classroom

ESSENTIAL QUESTIONS

  • What sort of methodological considerations are necessary to collect data in your educational context?
  • What methods of data collection will be most effective for your study?
  • What are the affordances and limitations associated with your data collection methods?
  • What does it mean to triangulate data, and why is it necessary?

As you develop an action plan for your action research project, you will be thinking about the primary task of conducting research, and probably contemplating the data you will collect. It is likely you have asked yourself questions related to the methods you will be using, how you will organize the data collection, and how each piece of data is related within the larger project. This chapter will help you think through these questions.

Data Collection

The data collection methods used in educational research originated in a variety of disciplines (anthropology, history, psychology, sociology), which has resulted in a variety of research frameworks to draw upon. As discussed in the previous chapter, the challenge for educator-researchers is to develop a research plan and related activities that are focused and manageable to study. Although human beings like structure and definitions, especially when we encounter new experiences, educators frequently set aside the accepted research frameworks and instead rely on subjective knowledge from their own pedagogical experiences when taking on the role of educator-researcher. Relying on subjective knowledge enables teachers to engage more effectively as researchers in their educational context: it negotiates between traditional research frameworks and the data collection possibilities of their practice, while also accounting for their unique educational context. This empowers educators, through action research, to be powerful agents for change in educational contexts.

Thinking about Types of Data

The research design, whether qualitative, quantitative or mixed-methods, will determine the methods you use to collect data. Qualitative research designs focus on collecting data that are relational, interpretive, subjective, and inductive, whereas a typical quantitative study collects data that are deductive, statistical, and objective. The two approaches contrast as follows:

  • Qualitative: relational, interpretive, subjective; inductive; expressed in language; small samples (1-15)
  • Quantitative: scientific, statistical, objective; deductive; expressed in numbers; large samples
Qualitative data is often in the form of language, while quantitative data typically involves numbers. Quantitative researchers require large numbers of participants for validity, while qualitative researchers use a smaller number of participants, and can even use one (Hatch, 2002). In the past, quantitative and qualitative educational researchers rarely interacted, sometimes held each other’s work in contempt, and even published articles in separate journals, reflecting distinct theoretical orientations toward data collection. Today there is greater appreciation for both quantitative and qualitative approaches, with scholars finding distinct value in each, yet in many circles the debate continues over which approach is more beneficial for educational research and in educational contexts.

The goal of qualitative data collection is to build a complex and nuanced description of social or human problems from multiple perspectives. Its flexibility and the ability to use a variety of data collection techniques reflect a distinct stance on research. Qualitative researchers are able to capture conversations and everyday language, as well as situational attitudes and beliefs. Qualitative data collection can be fitted to the study, with the goal of collecting the most authentic data, not necessarily the most objective. To researchers who strictly use quantitative methods, qualitative methods may seem wholly unstructured, eclectic, and idiosyncratic; for qualitative researchers, however, these characteristics are advantageous to their purpose. Quantitative research depends upon structure and is bounded to find relationships among variables and units of measurement, and it helps make sense of large amounts of data. Both quantitative and qualitative research help us address education challenges by better identifying what is happening, with the goal of understanding why it is happening and how we can address it.

Most educator-researchers who engage in research projects in schools and classrooms utilize qualitative methodologies for their data collection. Educator-researchers also use mixed methods that focus on qualitative methods, but also use quantitative methods, such as surveys, to provide a multidimensional approach to inquiring about their topic. While qualitative methods may feel more comfortable, there is a methodological rationale for using quantitative research.

Research methodologists use two distinct forms of logic to describe research: induction and deduction. Inductive approaches are focused on developing new or emerging theories, by explaining the accumulation of evidence that provides meaning to similar circumstances. Deductive approaches move in the opposite direction, and create meaning about a particular situation by reasoning from a general idea or theory about the particular circumstances. While qualitative approaches are inductive (observe first, then generate theories, for example), qualitative researchers will typically initiate studies with some preconceived notions of potential theories to support their work.

Flexible Research Design

A researcher’s decisions about data collection and activities involve personal choice, yet the choice of data sources must be responsive to the proposed project and topic. Logically, researchers will use whatever validated methods help them address the issue they are researching and will develop a research plan around activities to implement those methods. While a research plan is important to conducting valid research in schools and classrooms, it should also be flexible in design, allowing data to emerge and the best data for the research questions to be found. In this way, a research plan is recommended, but data collection methods are not always known in advance. As you, the educator-researcher, interact with participants, you may find it necessary to continue the research with additional data sources to better address the question at the center of your research. When educators are both researchers and participants in their own study, it is especially important to keep an open mind to the wide range of research methodologies. All in all, educator-researchers should understand that there are varied and multiple paths to move from research questions to addressing those questions.

Mixed Methods

As mentioned above, mixed methods is the use of both qualitative and quantitative methods. Researchers generally use mixed methods to clarify findings from the initial method of data collection, and mixed-methods designs give the educator-researcher increased flexibility in data collection. Mixed-methods studies often result in a combination of precise measurements (e.g., grades, test scores, surveys) along with in-depth qualitative data that provide meaningful detail to those measurements. The key advantage of using mixed methods is that qualitative conclusions expressed in vague terms such as usually, some, or most can be supported with a number or quantity, such as a percentage, an average, or the mean, median, and/or mode. One challenge for educator-researchers is that mixed methods require more time and resources to complete the study, and more familiarity with both qualitative and quantitative data collection methods.

Mixed methods in educator research, even if quantitative methods are only used minimally, provide an opportunity to clarify findings, fill gaps in understanding, and cross-check data. For example, if you are looking at the use of math journals to better engage students and improve their math scores, it would be helpful to understand their abilities in math and reading before analyzing the math journals. Looking at their test scores might give you a more nuanced understanding of why some students improved more than others after using the math journals. Pre- and post-surveys would also provide valuable information about students’ attitudes and beliefs toward math and writing. In line with this, some researchers suggest using either qualitative or quantitative approaches in different phases of the research process. In the previous example, pre- and post-test scores may quantitatively demonstrate growth or improvement after implementing the math journal; however, the qualitative data would provide detailed evidence as to why the math journals contributed to that growth or improvement. Quantitative methods can establish relationships among variables, while qualitative methods can explain the factors underlying those same relationships.
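The quantitative half of the pre/post comparison described above can be sketched as follows (the scores are invented for illustration):

```python
# Hypothetical pre- and post-journal math scores for the same four students,
# in the same order (a paired comparison).
pre  = [55, 62, 70, 48]
post = [63, 66, 78, 60]

# Mean per-student improvement; the qualitative data (the journals
# themselves) would then explain *why* scores changed.
gains = [b - a for a, b in zip(pre, post)]
mean_gain = sum(gains) / len(gains)
```

Here the quantitative result (an average gain) flags that something changed, while the journal entries supply the explanation, which is the mixed-methods division of labor described above.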

I caution the reader at this point to not simply think of qualitative methodologies as anecdotal details to quantitative reports. I only highlight mixed methods to introduce the strength of such studies, and to aid in moving educational research methodology away from the binary thinking of quantitative vs. qualitative. In thinking about data collection, possible data sources include questionnaires or surveys, observations (video or written notes), collaboration (meetings, peer coaching), interviews, tests and records, pictures, diaries, transcripts of video and audio recordings, personal journals, student work samples, e-mail and online communication, and any other pertinent documents and reports. As you begin to think about data collection you will consider the available materials and think about aspects discussed in the previous chapter: who, what, where, when, and how. Specifically:

  • Who are the subjects or participants for the study?
  • What data is vital evidence for this study?
  • Where will the data be collected?
  • When will the data be collected?
  • How will the data be collected?

If you find you are having trouble identifying data sources that support your initial question, you may need to revise your research question, making sure that what you are asking is researchable or measurable. The research question can always change throughout the study, but it should only change in relation to the data being collected.

Participant Data

As an educator, your pool of possible participants is narrower than most researchers encounter; however, it is important to be clear about participants’ role in the data design and collection. A study can involve one participant or multiple participants, and participants often serve as the primary source of data in the research process. Most studies by educator-researchers utilize purposeful sampling; in other words, they select participants who will be able to provide the most relevant information for the study. The study design therefore relies upon the participants and the information they can provide. The following is a description of some data collection methods: surveys or questionnaires, individual or group interviews, observations, field notes or diaries, narratives, documents, and elicitation.

Surveys, or questionnaires, are a research instrument frequently used to collect data about participants’ feelings, beliefs, and attitudes in regard to the research topic or activities. Surveys are often used for large sample sizes with the intent of generalizing from a sample population to a larger population. Surveys can be used with any number of participants and can be administered at different times during the study, such as pre-activity and post-activity, with the same participants to determine whether changes have occurred over the course of the activity, or simply over time. Researchers like surveys and questionnaires because they can be distributed and collected easily, especially with all of the recent online application possibilities (e.g., Google, Facebook, etc.). Surveys come in several forms: closed-ended, open-ended, or a mix of the two. Closed-ended surveys typically use multiple-choice questions or scales (e.g., 1-5, most likely to least likely) that allow participants to rate or select a response for each question. These responses can easily be tabulated into meaningful number representations, like percentages. For example, Likert scales are often used with a five-point range, with options such as strongly agree, agree, neutral, disagree, and strongly disagree. Open-ended surveys consist of prompts for participants to add their own perspectives in short-answer or limited-word responses. Open-ended surveys are not always as easy to tabulate, but can provide more detail and description.
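Tabulating closed-ended (Likert-style) responses into percentages, as described above, can be sketched like this (the responses are made up):

```python
from collections import Counter

# Hypothetical closed-ended (Likert) responses to a single survey item.
responses = ["agree", "agree", "neutral", "strongly agree",
             "agree", "disagree", "agree", "neutral"]

# Tabulate counts, then convert to percentages of all responses.
counts = Counter(responses)
percentages = {option: 100 * n / len(responses) for option, n in counts.items()}
```

The resulting percentages are the "meaningful number representations" closed-ended items make possible; open-ended responses would instead need coding before any such tabulation.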

Interviews and Focus Groups

Interviews are frequently used by researchers because they often produce some of the most worthwhile data. Interviews allow researchers to obtain candid verbal perspectives through structured or semi-structured questioning. Interview questions, either structured or semi-structured, are related to the research question or research activities in order to gauge the participants’ thoughts, feelings, motivations, and reflections. Some research relies on interviewing as the primary data source, but most often interviews are used to strengthen and support other data sources. Interviews can be time-consuming, but they are worthwhile in that you can gather richer and more revealing information than with other methods (Koshy, 2010). Lincoln and Guba (1985) identified five outcomes of interviewing:

Outcomes of Interviewing

  • Here and now explanations;
  • Reconstructions of past events and experiences;
  • Projections of anticipated experiences;
  • Verification of information from other sources;
  • Verification of previously obtained information (p. 268).

As mentioned above, interviews typically take two forms: structured and semi-structured. In terms of interviews, structured means that the researcher identifies a certain number of questions, in a prescribed sequence, and the researcher asks each participant these questions in the same order. Structured interviews qualitatively resemble surveys and questionnaires because they are consistent, easy to administer, provide direct responses, and make tabulation and analysis more consistent. Structured interviews use an interview protocol to organize questions, and maintain consistency.

Semi-structured interviews have a prescribed set of questions and protocol, just like structured interviews, but the researcher does not have to follow those questions or order explicitly. The researcher should ask the same questions to each participant for comparison reasons, but semi-structured interviews allow the researcher to ask follow-up questions that stray from the protocol. The semi-structured interview is intended to allow for new, emerging topics to be obtained from participants. Semi-structured questions can be included in more structured protocols, which allows for the participant to add additional information beyond the formal questions and for the researcher to return to preplanned formal questions after the participant responds. Participants can be interviewed individually or collectively, and while individual interviews are time-consuming, they can provide more in-depth information.

When considering more than two participants for an interview, researchers will often use a focus group format. Focus group interviews typically involve three to ten participants and seek to gain socially dependent perspectives or organizational viewpoints. When using focus group interviews with students, researchers often find them beneficial because they allow students’ reflections and ideas to build on each other. This is important because oftentimes students feel shy or hesitant to share their ideas with adults, but once another student sparks or confirms their idea, belief, or opinion, they are more willing to share. Focus group interviews are very effective as pre- and post-activity data sources. Researchers can use either a structured or semi-structured interview protocol for focus group interviews; however, with multiple participants it may be difficult to maintain the integrity of a structured protocol.

Observations

One of the simplest, and most natural, forms of data collection is to engage in formal observation. Observing humans in a setting provides us contextual understanding of the complexity of human behavior and interrelationships among groups in that setting. If a researcher wants to examine the ways teachers approach a particular area of pedagogical practice, then observation would be a viable data collection tool. Formal observations are truly unique and allow the researcher to collect data that cannot be obtained through other data sources. Ethnography is a qualitative research design that provides a descriptive account based on researchers’ observations and explorations to examine the social dynamics present in cultures and social systems – which includes classrooms and schools. Taken from anthropology, the ethnographer uses observations and detailed note taking, along with other forms of mapping or making sense of the context and relationships within. For Creswell (2007), several guidelines provide structure to an observation:

Structuring Observations

  • Identify what to observe
  • Determine the role you will assume — observer or participant
  • Design observational protocol for recording notes
  • Record information such as physical situation, particular events and activities
  • Thank participants and inform them of the use of and their accessibility to the data (pp. 132– 134)

As an educator-researcher, you may take on a role that exceeds that of an observer and participate as a member of the research setting. In this case, the data source would be called participant observation, to clearly identify the degree of involvement you have in the study. In participant observation, the researcher embeds themselves in the actions of the participants. It is important to understand that participant observation will provide completely different data than simply observing someone else. Ethnographies, or studies focused completely on observation as a data source, often extend longer than other data sources, ranging from several months to even years. Extended time gives the researcher the ability to obtain more detailed and accurate information, because it takes time to observe the patterns and other details that are significant to the study. Self-study is another consideration for educators who want to use observation and be a participant observer: they can make video and audio recordings of their activities and use those recordings as the source of observation.

Field Diaries and Notes

Utilizing a field diary, or keeping field notes, can be a very effective and practical data collection method. A field diary or field notes keep a record of what happens during the research activities, and can be useful in tracking how and why your ideas and the research process evolved. Many educators keep daily notes about their classes, and in many ways this is a more focused and narrower version of documenting the daily happenings of a class. A field diary or notes can also serve as an account of your reflections and commentary on your study, and can be a starting place for your data analysis and interpretations. A field diary or notes are typically valuable when researchers begin to write about their project because they allow them to draw upon their authentic voice. The reflective process that a diary represents can also serve as an additional layer of professional learning for researchers. The format and length of a field diary or notes will vary depending on the researcher and the topic; however, the ultimate goal should be to facilitate data collection and analysis.

Narratives

Data narratives and stories are a fairly new form of formalized data. While researchers have collected bits and pieces of narratives in other forms of data, asking participants to compose a narrative (whether written, spoken, or performed) as a whole allows researchers to examine how participants embrace the complexities of the context and social interactions. Humans are programmed to engage with and share narratives to develop meaningful and experiential knowledge. Educator autobiographies bring to life personal stories shaped by knowledge, values, and feelings that developed from their classroom experiences. Narrative data includes three primary areas: temporality, sociality, and place (Clandinin & Connelly, 2000). In terms of temporality, narratives have a past, present, and future because stories are time-based and transitional. Sociality highlights the social relationships in narratives, as well as personal and moral dispositions. Place includes the spaces where the narratives happen. Furthermore, bell hooks (1991) notes that narratives, or storytelling, as inquiry can be a powerful way to study how contexts are influenced by power structures, often linking and intersecting the structural dynamics of social class, race, and gender to highlight the struggle.

Documents

Documents provide a way to collect data that is unobtrusive to the participant. Documents are unobtrusive data because they are collected without modifying or disturbing the research context. Educational settings maintain records on all sorts of activities in schools: content standards, state mandates, student discipline records, student attendance, student assessments, performance records, parental engagement, records of how teachers spend PTO money, etc. Documents often provide background and contextual material, offering a snapshot of school policies, demographic information, ongoing records over a period of time, and contextual details from the site of the research study. Documents can be characterized, as in historical research, as primary and secondary. Primary materials are first-hand sources from someone in the educational context, such as minutes from a school board or faculty meeting, photographs, video recordings, and letters. Secondary sources typically include analyses or interpretations of a primary source by others, such as texts, critiques, and reviews. Both types of sources are valuable in action research.

Elicitation Methods

We have talked about several methods of data collection that each have useful ways of documenting, inquiring, and thinking about the research question. However, how does a researcher engage participants in ways that allow them to demonstrate what they know, feel, think, or believe? Asking participants directly about their thinking, feeling, or beliefs will only take you so far depending on the comfort and rapport the participant has with the researcher. There are always a variety of hurdles in extracting participants’ knowledge. Even the manner in which questions are framed and the way researchers use materials in the research process are equally important in getting participants to provide reliable, comparable, and valid responses. Furthermore, all individuals who participate in research studies vary in their ability to recall and report what they know, and this affects the value of traditional data collection, especially structured and semi-structured interviewing. In particular, participants’ knowledge or other thinking of interest may be implicit and difficult for them to explicate in simple discussion.

Elicitation methods help researchers uncover unarticulated participant knowledge through a potential variety of activities. Researchers will employ elicitation methods and document the participants’ actions and typically the description of why they took those particular actions. Educators may be able to relate the process of elicitation methods to a “think aloud” activity in which the researcher wants to record or document the activity. Elicitation methods can take many forms. What follows are some basic ideas and formats for elicitation methods.

Brainstorming/Concept Map

Most educators are probably familiar with the process of brainstorming or creating a concept map. These can be very effective elicitation methods when the researcher asks the participant to create a concept map or representation of brainstorming, and then asks the participant to explain the connections between concepts or ideas on the brainstorming or concept map.

Sorting

Sorting provides an engaging way to gather data from your participants. Sorting, as you can imagine, involves participants sorting, grouping, or categorizing objects or photographs in meaningful ways. Once participants have sorted the objects or photographs, the researcher records or documents the participant explaining why they sorted or grouped the objects or photographs the way they did. As a former history teacher, I would often use sorting to assess my students’ understanding of related concepts and events in a world history class, using pictures as the means for students to sort and demonstrate what they understood from the unit. For a broader discussion of elicitation techniques in history education, see Barton (2015).

Listing/Ranking

Listing can be an effective way to examine participants’ thinking about a topic. Researchers can have participants construct a list in many different ways to fit the focus of the study and then have the participants explain their list. For example, if an educator was studying middle school student perceptions of careers, they could ask them to complete three lists: Careers in Most Demand; Careers with Most Education/Training; Careers of Most Interest.

  Careers in Most Demand    Careers with Most Education/Training    Careers of Most Interest
  1.                        1.                                      1.
  2.                        2.                                      2.
  3.                        3.                                      3.
  4.                        4.                                      4.
  5.                        5.                                      5.

Then, once participants have filled out the lists, the most important part is documenting them explaining their thinking, and why they filled out the lists the way they did. As you may imagine, in this example, every participant would have a list that is different based on their personal interests.

Researchers can also elicit responses by simply giving participants a prompt, and then asking them to recall whatever they know about that prompt. Researchers will have the participants do this in some sort of demonstrative activity. For example, at the end of a world history course, I might ask students to explain what “culture” means to them and to explain their thinking.

Re-articulation (writing or drawing)

A unique way to engage participants in elicitation methods is to have them write about, rewrite, or draw visual representations of either life experiences or literature they have read. For example, you could ask them to rewrite a part of the literature they did not like, add a part they thought should be there, or simply extend the ending. Participants can either write or draw these re-articulations. I find that drawing works just as well because, again, the goal is to have participants describe their thinking based on the activity.

Scenario Decision-Making

Elicitation methods can also examine skills. Researchers can provide participants scenarios and ask them to make decisions. The researchers can document those decisions and analyze the extent to which the participant understands the skill.

Document, Photograph, or Video Analysis

This is the most basic elicitation, in which the researcher provides a document, photograph, or video for the participant to examine. Then, the researcher asks questions about the participant’s interpretations of the document, photograph, or video. One method that supports this sort of elicitation is to ask participants to provide images from their everyday worlds, for example, asking students to document the literacy examples in their homes (e.g., pictures of calendars, bookshelves, etc.). With the availability of one-to-one technology, such as iPads, participant documentation is easier than ever.

There are many more methods of data collection, as well as many variations of the methods described above. The goal is to find the data collection methods that will give you the best data to answer your research question. If you are unsure, there is nothing wrong with collecting more data than you need to make sure you use effective methods – the only thing you have to lose is time!

Use of Case Studies

Case studies are a popular way for studying phenomena in settings using qualitative methodology. Case studies typically encompass qualitative studies which look closely at what happens when researchers collect data, analyze the data, and present the results. Case studies can focus on a single case or examine a phenomenon across multiple cases. Case studies frame research in a way that allows for rich description of data and depth of analysis.

An advantage of using case study design is that the reader often identifies with the case or phenomenon, as well as with the participants in the study. Yin (2003) describes case study methodology as inquiry that investigates a contemporary phenomenon within its authentic context. Case studies are particularly appropriate when the boundaries and relationship between the phenomenon and the context are not clear. Case studies relate well to the processes involved in action research. Critics of action research case studies sometimes point to their inevitable subjectivity, echoing general criticisms of action research. Case studies provide researchers opportunities to explore both the how and the why of phenomena in context, while being both exploratory and descriptive.

We want to clarify the difference between methodologies and methods of research. There are methodologies of research, like case study and action research, and methods of data collection. Methodologies like ethnography, narrative inquiry, and case study draw from similar methods of data collection, including interviews, collection of artifacts (writings, drawings, images), and observations. The differences between the methodologies include the timeframe for research, the boundaries of the research, and the epistemology.

Triangulation of Data

Triangulation is a method used by qualitative researchers to check and establish trustworthiness in their studies by using and analyzing multiple (three or more) data collection methods to address a research question and develop a consistency of evidence across data sources. Triangulation thus facilitates trustworthiness through cross-verification of evidence, supporting claims with more than two data collection sources. Triangulation also tests the consistency of findings obtained through different data sources and instruments, while minimizing bias in the researcher’s interpretations of the data.

If we think about the example of studying the use of math journals in an elementary classroom, the researcher would want to collect at least three sources of data – the journal prompts, assessment scores, and interviews. When the researcher is analyzing the data, they will want to find themes or evidence across all three data sources to address their research question. In a very basic analysis, if the students demonstrated a deeper level of reflection about math in the journals, their assessment scores improved, and their interviews demonstrated they had more confidence in their number sense and math abilities – then, the researcher could conclude, on a very general level, that math journals improved their students’ math skills, confidence, or abilities. Ideally, the study would examine specific aspects of math to enable deeper analysis of math journals, but this example demonstrates the basic idea of triangulation. In this example, all of the data provided evidence that the intervention of a math journal improved students’ understanding of math, and the three data sources provided trustworthiness for this claim.

Data Collection Checklist

  • Based on your research question, what data might you need?
  • What are the multiple ways you could collect that data?
  • How might you document this data, or organize it so that it can be analyzed?
  • What methods are most appropriate for your context and timeframe?
  • How much time will your data collection require? How much time can you allow for?
  • Will you need to create any data sources (e.g., interview protocol, elicitation materials)?
  • Do your data sources all logically support the research question, and each other?
  • Does your data collection provide for multiple perspectives?
  • How will your data achieve triangulation in addressing the research question?
  • Will you need more than three data sources to ensure triangulation of data?

Action Research Copyright © by J. Spencer Clark; Suzanne Porath; Julie Thiele; and Morgan Jobe is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.


Action Research Tutorials-CCAR

Tutorial 8: Analyzing Data - Evidence

A. Understanding Your Data

Action Research is not a single research project; rather, it is an ongoing iterative approach that takes place across cycles of innovation and reflection. It is a way of learning from and through systematic inquiry into one's practice. Central to this process is the collection and analysis of data. The image below (Riel, 2014) uses color to represent the growing knowledge of the action researcher. After a number of cycles, there is a reporting out or sharing function. In the last tutorial, we discussed what data artifacts you could collect. Now we are going to talk about the analysis of these data artifacts. This is often the step that practitioners find most difficult, because they have not been trained to analyze data.


B.  Organizing your data -- What is your storyline?

You will need to report the "findings" you discover from exploring your data. To develop findings, you have to engage in some serious "looking" to find meaning in what you collected. Often new action researchers will simply describe a survey and then paste the questions with the group responses into their reports. But this does not help the reader make sense of what you learned; you haven't processed the data to learn from it. It is helpful to begin with your research questions or your near- and medium-term outcomes on your logic model. What does the data you collected tell you that will help answer your research question? How does the data help you understand whether you see evidence of the outcomes you predicted in your logic model? To answer these questions, take the responses to your survey and group them around what you are trying to learn. You need to find the storyline that you will share with your audience.

An Example: Katherine Korte Flips Her Government Course

Consider an educator teaching a government class who flipped her classroom as her "action" and, for data collection, asked her students to provide feedback on their learning in a survey.

RESEARCH QUESTION: If I flip my classroom, using video to send lessons home and iBook technology, how will this affect the quality of knowledge-building dialogues in the classroom?

She was interested in what students learned, how they learned, and changes in their attitudes and engagement in learning. Therefore, she might take her ten survey questions and organize them under these three topics, with one question not fitting any topic.

  1. Knowledge (three survey questions)
  2. Process (four survey questions)
  3. Attitudes (two survey questions)

Now she would compute responses to these groups of questions and think about other data sources besides the survey: what tests or quizzes did students take that would help answer the first question about knowledge? How might field notes shed light on what students were learning? What artifacts were produced that could be examined? For the second topic, there were student responses that addressed their work in discussion circles, what they learned from each other, and some of the problems from group work. How did students' comments match her notes? Were there more or fewer problems that students needed help with? Then she might organize the analysis into three sections:

Changes in Student Knowledge
  • analysis of three survey questions
  • classroom quizzes
  • end-of-term assessment
  • iBook notetaking
  • performances

Changes in the Way Students Worked
  • analysis of four survey questions
  • field notes on how well the groups worked and what problems they had
  • teacher assessment of how prepared they were for the classroom discussions
  • speed at which they accomplished their tasks

Student Reactions to the Flipped Classroom
  • analysis of two survey questions
  • student blogs
  • observations about the comfort of students contributing ideas or making presentations

This organization helps her to find a storyline that she will tell about the data.  
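Computing responses for each group of questions can be as simple as averaging within each topic. A minimal sketch, where the question groupings mirror the example above but every number is hypothetical (per-question class means on a 1-5 scale):

```python
from statistics import mean

# Hypothetical mean class response (1-5 scale) for each of ten survey questions
question_means = {"q1": 4.2, "q2": 3.8, "q3": 4.0,            # knowledge
                  "q4": 3.5, "q5": 3.9, "q6": 4.1, "q7": 3.2,  # process
                  "q8": 4.4, "q9": 4.6,                        # attitudes
                  "q10": 2.9}                                  # did not fit a topic

# Group the questions under the three topics from the example
topics = {
    "knowledge": ["q1", "q2", "q3"],
    "process": ["q4", "q5", "q6", "q7"],
    "attitudes": ["q8", "q9"],
}

# Average the question means within each topic group
for topic, questions in topics.items():
    avg = mean(question_means[q] for q in questions)
    print(f"{topic}: {avg:.2f}")
```

The grouped averages give one number per topic to set alongside the quizzes, field notes, and other data sources in each section of the analysis.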

Validity of Measures

You can see now that multiple sources of data are being examined to see what they say about the outcomes the educator cares about. The analysis should be an honest hunt to figure out what can be learned from what happened. You are looking for valid measures that will answer your question. For example, if a teacher listed grades as evidence of student engagement, one might challenge this as not a valid measure of engagement: a student could have high grades and not be engaged, while another student could be engaged and yet have a low grade. This is why researchers consider the validity of measures.

Participatory Data Collection

Empowering the group to participate in data collection and analysis helps everyone have a sense of ownership over the outcomes and a deeper understanding of change. These guides and articles might help you think about who should be involved in the data collection and analysis work in your action research. Even if you collect and analyze data within a group, you still have your own reflections on what this process means to you as an action researcher.

Guijt, I. (2014). Participatory Approaches. This guide, written by Irene Guijt for UNICEF, looks at the use of participatory approaches in impact evaluation. While not exactly the same as action research, it is very close, and this guide will be useful if you are involving stakeholders in your process of data collection and analysis.

Increasing Participation in Evaluating – Bruner Foundation Guide  - This guide discusses how Organization Staff, Evaluators and Funders are involved in Participatory Evaluation.

Campilan, D. (2000). Participatory Evaluation of Participatory Research. Forum on Evaluation of International Cooperation Projects: Centering on Development of Human Resources in the Field of Agriculture. Nagoya, Japan, International Potato Center.  http://ir.nul.nagoya-u.ac.jp/jspui/bitstream/2237/8890/1/39-56.pdf

Chambers, R. (2009) Making the Poor Count: Using Participatory Options for Impact Evaluation in Chambers, R., Karlan, D., Ravallion, M. and Rogers, P. (Eds) Designing impact evaluations: different perspectives . New Delhi, India, International Initiative for Impact Evaluation. http://www.3ieimpact.org/admin/pdfs_papers/50.pdf

Guijt, I. and J. Gaventa (1998). Participatory Monitoring and Evaluation: Learning from Change. IDS Policy Briefing. Brighton, UK: University of Sussex. In participatory monitoring and evaluation, stakeholders work together to decide how progress should be measured and how to understand the outcomes of change.

Zukoski, A. and M. Luluquisen (2002). "Participatory Evaluation: What is it? Why do it? What are the challenges?" Policy & Practice(5). http://depts.washington.edu/ccph/pdf_files/Evaluation.pdf

C.  Exploring your data -- What is your story?

Coding your Data

The Center for Evaluation Research at the University of California, Davis, has a good, brief guide on coding, as well as other tools for data analysis, that will be helpful to read before you begin your coding process. If you are coding data (for example, coding what students write in their blogs), it is helpful to develop a codebook where you list the codes and examples to help you make decisions. This creates consistency, or reliability, in your coding. Good research practice involves creating a codebook for any qualitative analysis you do. It can be very simple or more complex, depending on the nature of your data and the scale of analysis you plan to do. This is similar to creating a rubric to assess student work. If you are writing an action research dissertation, the use of a codebook is highly recommended. If you want to see how researchers use codebooks and coding schemes, you will find a number of them collected by Kimberly Neuendorf as an accompaniment to her Content Analysis Guidebook. I enjoyed exploring the codebook for the study of female roles in James Bond 007 movies. In this video, Kevin Meethan also describes the process of going from an interview to codes; if you go to the YouTube version, you can find a link to the texts given in the examples, but you can just listen to the examples and get a sense of the coding process.

Reliability

In any coding, an important question to ask is: how reliable is the coding process? In other words, if this teacher told us what theme or content to look for and we coded the data, would our coding be the same as hers? To achieve reliability, researchers often have two people code the data and compare. If they agree in 85% of the cases, we say the coding is reliable. If you cannot find another person to compare your coding with, you can check your coding against yourself at different times: code a few cases, wait a day, and recode. Do they match? The closer the match, the more reliable your coding. Teachers engage in a very similar process to grade papers in a consistent, or reliable, way: they create a rubric and then use that rubric to assess the student work. The rubric is their codebook, and the assessment is the coding process.

Content Analysis

Let's stay with the teacher who flipped her classroom and had students keep notes in their iBooks as they watched the videos at home. This teacher wants to analyze this textual data. How should she approach it? Depending on which category the action researcher chooses, there are many ways to do a qualitative analysis of data. Some are more time-consuming than others, and a large amount of data might be more than a teacher can examine, so our teacher might want to create a sample to analyze. She could take a random set of three entries for each student, or all students' last blog entries before the test, or she might decide to examine all of the blogs of six students. These six could be selected at random from all of the students or from groups of students. For example, the teacher might select two students at random from those who received an A, B, or C grade in the course or in a previous course, or one student from each discussion group. A content analysis helps us know what was contained in these blogs.

Suppose there are some central ideas or concepts, and our teacher wants to see how often, if at all, they appear in student writing. She might use a single blog entry, or a response to a question, as the "unit of analysis" and then mark yes or no for each of a number of ideas she has decided on. She would take each entry and code it, say 1 or 0, for each of the concepts. This would help her see how frequently individuals, and the class as a whole, explored these concepts. A different approach to content analysis would involve coding all of the topics covered in each entry. Then she could examine the topics students selected as important; this approach would help her understand what topics students judged to be important.

Theme Analysis

Theme analysis is similar to content analysis but a bit more difficult. Here, you are looking not for concepts but for themes. Some process themes might be positive or negative attitudes about the different forms of technology used in the classroom. As in content analysis, these could be set up by the teacher, or they could emerge from the data, as when the action researcher discovers themes through repeated reading, grouping them, and constructing a final set of themes to use.

Developing a Critical Eye

Action research is not about proving how successful you were at some new attempt; it is about learning from your efforts. It is not so much about whether it worked or not, but a more thoughtful examination of how it worked, for whom it worked, and why things turned out as they did. So be ready to hear that your plans did not work as you expected. Try not to be defensive; instead, see this as a time to learn from your inquiry. What worked, and why? What did not work, and why? It is important to examine evidence that contradicts or opposes your research question. You might find the "What Else Test" developed by Jess Dart helpful as you develop your critical eye.
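Both the 85% agreement check and the 1/0 concept coding can be computed directly. A minimal sketch, with entirely hypothetical codes: two coders have each marked the same twenty blog entries as 1 (concept present) or 0 (absent):

```python
# Hypothetical 1/0 codes from two coders for the same 20 blog entries
coder_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0]
coder_b = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0]

# Percent agreement: fraction of entries where the two coders match
matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(f"Agreement: {agreement:.0%}")

# Concept frequency across all entries, per coder
print(f"Coder A found the concept in {sum(coder_a)} of {len(coder_a)} entries")
```

With these invented codes the agreement comes out above the 85% threshold mentioned above; real coding would repeat this check for each code in the codebook.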

Advice for Data Analysis in Social Services

The Institute for Research and Innovation in Social Services (IRISS) is a charitable company with a mission to "…promote positive outcomes for the people who use Scotland's social services by enhancing the capacity and capability of the social services workforce to access and make use of knowledge and research for service innovation and improvement." They have a number of resources that can help you think about action research, starting with this animated video:

Outcomes Toolbox

A toolbox of resources relevant to an outcomes-focused approach in social services. Developed by IRISS in partnership with the Coalition of Care and Support Providers in Scotland (CCPS), the Outcomes Toolbox brings together a range of resources and knowledge relevant to an outcomes-focused approach in the social services.

Understanding and Measuring Outcomes: The Role of Qualitative Data, by Emma Miller and Ellen Daly

D.  Displaying your Data -- How will you Tell a Compelling Research Story?

Think about what your data has told you, and tell others what you learned. Use charts, graphs, or tables to help people quickly see what you have found. With all of the choices available, you will find yourself wondering which type of chart to use. While there are suggested guidelines for these choices, you may find it works just as well to display the data in several different ways and see what you learn from each. Then test it with someone who does not know the data: what do they see in your chart? Consider the signal-to-noise ratio: the signal is your data, and the noise is the graphical elements around the data that you might not need. You want to encourage the eye to quickly zero in on what is relevant. This video shows some examples that might help you think about how to represent your data without unnecessary distortion. While design can be subjective (and while I don't agree with all of the comments), the suggestion to minimize "chart junk" is important. Try to find ways to clearly communicate your data in an honest and pleasing image.

Presentation and Visualization of Data. A site that might be helpful for understanding some basic concepts in statistics and how to display data can be found at  http://www.shodor.org/interactivate/lessons/HistogramsBarGraph/  This site is arranged to support high school math teachers, but there are some interactives that might help you think about data displays. (And it is a fun way to play with math and statistics concepts.) Many Eyes ( http://www.many-eyes.com ) is a free tool for creating visualizations of data sets. For examples, see http://www-958.ibm.com/software/data/cognos/manyeyes/visualizations?sort=rating
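The advice above about maximizing signal and minimizing chart junk can be tried out even before opening charting software. A minimal sketch (in Python, with illustrative survey data; the function name and values are hypothetical) that renders values as bare text bars, so the signal is all that is on the page:

```python
# A quick way to "look at the data in a number of different ways" before
# building a polished chart: a plain text bar chart with no decoration.
# The survey responses below are illustrative only.

def text_bars(data, width=40):
    """Render {label: value} pairs as plain horizontal bars."""
    longest = max(len(label) for label in data)
    biggest = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / biggest)
        lines.append(f"{label.ljust(longest)} | {bar} {value}")
    return "\n".join(lines)

responses = {"Strongly agree": 14, "Agree": 22, "Neutral": 6, "Disagree": 3}
print(text_bars(responses))
```

Once the pattern is visible in this stripped-down form, you can decide which graphical elements, if any, a final chart actually needs.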

Sage Research Methods Community

Qual Data Analysis & Action Research

by Janet Salmons, Ph.D., Research Community Manager for Sage Methodspace

Qualitative data analysis varies by methodology. In this post let’s focus on analysis in action research studies.


Action research is a flexible research methodology uniquely suited to researching and supporting change. It integrates social research with exploratory action to promote development. In its classic form, action research involves fluid and overlapping cycles of investigation, action planning, piloting of new practices, and evaluation of outcomes incorporating at all stages the collection and analysis of data and the generation of knowledge (Given, 2008).

Learn about analytic approaches for action and participatory action research in these open access Sage journal articles. Find more posts, interviews, and resources about action research here.

Benjamin-Thomas, T. E., Corrado, A. M., McGrath, C., Rudman, D. L., & Hand, C. (2018). Working Towards the Promise of Participatory Action Research: Learning From Ageing Research Exemplars. International Journal of Qualitative Methods, 17(1). https://doi.org/10.1177/1609406918817953 Abstract. Within research addressing issues of social justice, there is a growing uptake of participatory action research (PAR) approaches that are ideally committed to equitable participation of community members in all phases of the research process in order to collaboratively enact social transformation. However, the utilization of such approaches has not always matched the ideal, with inconsistencies in how participation and action are incorporated. “Participation” within various research processes is displayed differently, with the involvement of community members varying from full participation to their involvement as simply participants for data collection. Similarly, “action” is varyingly enacted from researchers proposing research implications for policy and practice to the meaningful involvement of community members in facilitating social change. This inconsistency in how PAR is utilized, despite widespread publications outlining key principles and central tenets, suggests there are challenges preventing researchers from fully embracing and enacting the central tenets of equitable participation and social transformation. This article intends to provide one way forward, for scholars intending to more fully enact the central tenets of PAR, through critically discussing how, and to what extent, the principles of PAR were enacted within 14 key exemplars of PAR conducted with older adults. More specifically, we display and discuss key principles for enacting the full commitment of PAR, highlight a critical appraisal guide, critically analyze exemplars, and share strategies that researchers have used to address these commitments. 
The critical appraisal guide and associated research findings provide useful directions for researchers who desire to more fully embrace commitments and practices commensurate with enacting the promise of PAR for equitable collaboration and social transformation.

Fletcher, A. J., MacPhee, M., & Dickson, G. (2015). Doing Participatory Action Research in a Multicase Study: A Methodological Example. International Journal of Qualitative Methods, 14(5). https://doi.org/10.1177/1609406915621405

Abstract. In this article, we describe an approach for conducting participatory action research (PAR) in a longitudinal multicase study, with particular focus on cross-case analysis. Existing literature has documented the practice of PAR in single-case studies, but far less has been written on how to conduct PAR across multiple cases. There is also a need for instructional examples of multicase study application, particularly methods of cross-case analysis. In PAR, research methods—including data analysis methods—have the power to shape participant inclusion or exclusion, involvement or attrition, and mobilization of knowledge in real time. In response to these challenges, we discuss the analysis methods used in a PAR study of health leadership in Canada. The project, which consisted of six case studies of leadership in major health system change, involved health leaders as collaborators. We address the challenges of doing PAR with collaborators facing time limitations and suggest a project structure for involving collaborators at critical junctures. We present a detailed, two-part method for conducting cross-case data analysis. Our method involved targeted collaborator involvement in data interpretation while also ensuring faithfulness to the coded data. We describe our process for mobilizing study findings through a deliberative dialogue with health leaders.

Jensen, C., Hoben, M., Chamberlain, S. A., K. Marshall, S., Young, R. A., & Gruneir, A. (2022). Data Analyses using the Action Project Method Coding Technique: A Guide. International Journal of Qualitative Methods, 21. https://doi.org/10.1177/16094069221108035 Abstract. The qualitative action-project method (A-PM) was developed in counseling psychology and is useful for studying human actions in various contexts. With this article we provide a guide to A-PM data analysis with a focus on the method’s coding technique. We briefly outline the theory underpinning the method as well as the different phases of data collection. The A-PM data analysis happens in parallel from a bottom-up and top-down approach, where researchers consider the data closely for what participants are doing, how they are doing it and the ways in which their actions are directed by their overall goals. We add to the existing literature by detailing the coding technique, providing examples at each stage of analysis, as well as reflect on the possibilities for adapting the protocol for different types of research. Our aim is to support researchers in their efforts to undertake the method.

Newton, P., & Burgess, D. (2008). Exploring Types of Educational Action Research: Implications for Research Validity. International Journal of Qualitative Methods, 7(4), 18–30. https://doi.org/10.1177/160940690800700402 Abstract. In this paper the authors argue that there are three modes of educational action research: emancipatory, practical, and knowledge generating. Furthermore, they suggest that much of action research, although predicated on notions of emancipatory research, is often not primarily emancipatory in nature. There are considerable risks involved when action research fails to adequately justify its truth claims because of a dependence on validities that primarily assess the emancipatory features of the research. Consequently, the authors propose that the various modes of action research require emphasis on different validities that are dependent on the purposes of the research. In doing this, they offer a reconceptualization of Anderson and Herr's (1999) influential approach to validity in action research.

Nind, M. (2011). Participatory data analysis: a step too far? Qualitative Research, 11(4), 349–363. https://doi.org/10.1177/1468794111404310

Abstract. Interest in participatory research methods has grown considerably in the spheres of research with children and young people and research with people with learning disabilities. This growth is rooted in different but related paradigm shifts in childhood and disability. I argue that despite developments in participatory approaches, participatory data analysis has been attempted less than participation in other aspects of research with either children or people with learning disabilities, and that the challenges involved in this are particularly under-explored and important with the latter where we need to investigate what is possible. I discuss why participation in analysis is often neglected before reviewing different responses to the challenge including examples of informal and formal, unstructured and structured, trained and untrained, explicit and implicit approaches. Finally, I make the case for authentic reciprocal learning in exploring the potential benefits of participatory analysis to people and to research.

Peltier, C. (2018). An Application of Two-Eyed Seeing: Indigenous Research Methods With Participatory Action Research. International Journal of Qualitative Methods, 17(1). https://doi.org/10.1177/1609406918812346

Abstract. In this time of reconciliation, Indigenous researchers-in-relation are sharing research paradigms and approaches that align with Indigenous worldviews. This article shares an interpretation of the Mi’kmaw concept of Two-Eyed Seeing as the synthesis of Indigenous methodology and participatory action research situated within an Indigenous paradigm of relevant, reciprocal, respectful, and responsible research. Two-Eyed Seeing is discussed as a guiding approach for researchers offering Indigenous voices and ways of knowing as a means to shift existing qualitative research paradigms. The author offers practical considerations for conducting research with Indigenous peoples in a “good and authentic way.” Through the co-creation of knowledge with Indigenous communities, a collective story was produced as a wellness teaching tool to foster the transfer of knowledge in a meaningful way.

Price, R., Wrigley, C., & Matthews, J. (2021). Action researcher to design innovation catalyst: Building design capability from within. Action Research, 19(2), 318–337. https://doi.org/10.1177/1476750318781221

Abstract. Design as a creative way of framing and solving problems is considered an essential business capability in an innovation era. Organizations with design capability can improve the lives of their customers, stakeholders and employees by creating valuable products, services and experiences. Design-led innovation is a framework that assists organizations to develop design capability for creating a better future as well as profitability. However, implementing design-led innovation requires support. This article presents insights from an action research extended to design innovation catalyst. The catalyst’s aim was to facilitate implementation of design-led innovation in an Australian Airport Corporation to develop design capability. To date, this extended role of action researcher as design innovation catalyst has received limited attention. Therefore, the purpose of this paper is to present insights from the experience of the action researcher as a design innovation catalyst. This paper contributes conceptual and practical insight into the research design, action research cycles and critical reflection of an action researcher operating as design innovation catalyst.

Rix, J., Carrizosa, H. G., Sheehy, K., Seale, J., & Hayhoe, S. (2022). Taking risks to enable participatory data analysis and dissemination: a research note. Qualitative Research, 22(1), 143–153. https://doi.org/10.1177/1468794120965356 Abstract. The involvement of all participants within all aspects of the research process is a well-established challenge for participatory research. This is particularly evident in relation to data analysis and dissemination. A novel way of understanding and approaching this challenge emerged through a large-scale international, 3-year participatory research project involving over 200 disabled people. This approach enabled people to be involved at all stages of the research in a manner that was collectively recognised to be participatory and also delivered high-quality findings. At the heart of this emergent approach to participatory research is an engagement with risk. This research note explores the types of risks involved in delivering research that seeks to be authentically participatory.

Wagemans, A., & Witschge, T. (2019). Examining innovation as process: Action research in journalism studies. Convergence, 25(2), 209–224. https://doi.org/10.1177/1354856519834880

Abstract. In this article, we discuss how ‘action research’ as an experiential research approach allows us to address challenges encountered in researching a converged and digital media landscape. We draw on our experiences as researchers, co-developers and marketeers in the European Union-funded Innovation Action project ‘INnovative Journalism: Enhanced Creativity Tools’ (INJECT) aimed at developing a technological tool for journalism. In this media innovation process, as in other media practices, longstanding delineations no longer hold, due to converging professional disciplines and blurring roles of users and producers. First, we discuss four features of innovation in the current ‘digital’ media landscape that come with specific methodological requirements: (a) the iterative nature of innovation; (b) converged practices, professions and roles; (c) the dispersed geographic nature of media production and innovation processes and (d) the impact of human and non-human actors. We suggest action research as a possible answer to these requirements of the digital media landscape. Drawing on our experiences in the INJECT project, we illustrate how adopting an action research approach provides insight into the non-linear, iterative and converged character of innovation processes by highlighting: (a) how innovation happens at various moments, in various places and by various people; (b) how perceptions and enactments of professions change over time and (c) how roles are (re)combined and expanded in such a way that clear delineation is not easy. Ultimately, we argue that  experiencing  convergence through action research enables us to do justice to the complexity of the current media landscape.

Given, L. M. (2008). Action research. In The SAGE encyclopedia of qualitative research methods. Thousand Oaks, CA: Sage.

More Methodspace Posts about Data Analysis

Stats Literacy

Listen to this interview, and check out Rhys Jones’ latest book: Statistical Literacy: A Beginner's Guide.

Recent Advances in Partial Least Squares Structural Equation Modeling: Disclosing Necessary Conditions

One of the emerging methodological extensions in the dynamic PLS-SEM landscape is necessary condition analysis (NCA); learn about the options available.

Research Stages: A 2023 Recap

Looking back at 2023, find all posts here! We explored stages of a research project, from concept to publication. In each quarter we focused on one part of the process. In this recap for the year you will find original guest posts, interviews, curated collections of open-access resources, recordings from webinars or roundtable discussions, and instructional resources.

Methods Film Fest: Researchers Share Insights

Methods Film Fest! We can read what they write, but what do researchers say? What are they thinking about, what are they exploring, what insights do they share about methodologies, methods, and approaches? In 2023 Methodspace produced 32 videos, and you can find them all in this post!

Choosing digital tools for qualitative data analysis

Christina Silver explains why and how to use qualitative data analysis software to manage and analyze your notes, literature, materials, and data. Sign up for her upcoming (free) symposium!

Use Research Cases to Teach Methods for Large-Scale Data Analysis

Use research cases as the basis for individual or team activities that build skills.

Finding gems in limited data: How we went from “ungeneralizable” to valuable findings

How do you find gems in a research project when the data is too thin for generalizations? In this post researchers discuss creative ways to learn from (and write about) the experience.

Analyzing Qualitative and/or Quantitative Data

The focus for Q3 of 2023 was on analyzing and interpreting qualitative and quantitative data. Find all the posts, interviews, and resources here!

What is randomness?

Dr. Stephen Gorard defines and explains randomness in a research context.

The power of prediction

Mentor in Residence Stephen Gorard explains how researchers can think about predicting results.

Part Two: Equity Approaches in Quantitative Analysis

The Career and Technical Education (CTE) Equity Framework approach draws high-level insights from this body of work to inform equity in data analysis that can apply to groups of people who may face systemic barriers to CTE participation. Learn more in this two-part post!

Part One: The Need for Equity Approaches in Quantitative Analysis

The Career and Technical Education (CTE) Equity Framework approach draws high-level insights from this body of work to inform equity in data analysis that can apply to groups of people who may face systemic barriers to CTE participation. This is part 2, find the link to part 1 and previous posts about the Equity Framework.

Teaching and learning quantitative research methods in the social sciences

Instructional tips for teaching quantitative data analysis.

How can we judge the trustworthiness of a research finding?

In an era of rampant misinformation and disinformation, what research can you trust? Dr. Stephen Gorard offers guidance!

Analysing complex qualitative data - a brief guide for undergraduate social science research

Learn how inductive and deductive styles of reasoning are used to interpret qualitative research findings.

Image as data: Automated visual content analysis for social science

Images contain information absent in text, and this extra information presents opportunities and challenges. It is an opportunity because one image can document variables with which text sources (newspaper articles, speeches or legislative documents) struggle or on datasets too large to feasibly code manually. Learn how to overcome the challenges.

What to do about missing data?

Tips for dealing with missing data from Dr. Stephen Gorard, author of How to Make Sense of Statistics.

How Standard is Standard Deviation?

Learn more about standard deviation from a paper and presentation from Dr. Stephen Gorard.

Video Data Analysis: How 21st century video data reshapes social science research

Video capture is ubiquitous. What does it mean for researchers, and how can we analyze such data?

Qual Data Analysis & Phenomenology

Qualitative data analysis varies by methodology. Learn about approaches for phenomenological studies through this collection of open access articles.

Qual Data Analysis & Narrative Research

Learn about qualitative data analysis approaches for narrative and diary research in these open access articles.

Qual Data Analysis & Ethnography

Ethnography involves the production of highly detailed accounts of how people in a social setting lead their lives, based on systematic and long-term observation of, and discussion with, those within the setting.

Qual Data Analysis & Grounded Theory

Qualitative data analysis varies by methodology. Discover diverse ways to analyze data for grounded theory studies in these open access articles.

Qual Data Analysis & Action Research

Qualitative data analysis varies by methodology. Learn about approaches for action research in these open access articles.

Analysing Politics, Protest, and Digital Popular Culture

How can you study digital culture and activism? Watch this interview with Dr. Lyndon Wray.

Seeing and Hearing the Problem: Using Video in Qualitative Research

Look at the choices of video methods made by authors of four research articles.

Analyzing Video Data: Qualitative

This collection of open-access articles includes qualitative examples of analysis strategies to use with multimedia video data.

Analyzing Video Data: Quantitative

This collection of open-access articles includes quantitative examples of analysis for video data.

Analyzing Visual Data

How do we understand and interpret visual or video data? See these open-access articles for ideas and examples.

Analyzing Photos in Photovoice Studies

Find a collection of open-access articles about analyzing and interpreting photos generated by participants using photovoice methods.

Qual Data Analysis & Grounded Theory

Learn about ‘technological reflexivity’ in qualitative research design.



What is Action Research?



Action research is a qualitative method that focuses on solving problems in social systems, such as schools and other organizations. The emphasis is on solving the presenting problem by generating knowledge and taking action within the social system in which the problem is located. The goal is to generate shared knowledge of how to address the problem by bridging the theory-practice gap (Bourner & Brook, 2019). A general definition of action research is the following: “Action research brings together action and reflection, as well as theory and practice, in participation with others, in the pursuit of practical solutions to issues of pressing concern” (Bradbury, 2015, p. 1). Johnson (2019) defines action research in the field of education as “the process of studying a school, classroom, or teacher-learning situation with the purpose of understanding and improving the quality of actions or instruction” (p. 255).

Origins of Action Research

Kurt Lewin is typically credited with being the primary developer of Action Research in the 1940s. Lewin stated that action research can “transform…unrelated individuals, frequently opposed in their outlook and their interests, into cooperative teams, not on the basis of sweetness but on the basis of readiness to face difficulties realistically, to apply honest fact-finding, and to work together to overcome them” (1946, p.211).

Sample Action Research Topics

Some sample action research topics might be the following:

  • Examining how classroom teachers perceive and implement new strategies in the classroom--How is the strategy being used? How do students respond to the strategy? How does the strategy inform and change classroom practices? Does the new strategy improve test scores? Do classroom teachers perceive the strategy as effective for student learning?
  • Examining how students are learning a particular content or objectives--What seems to be effective in enhancing student learning? What skills need to be reinforced? How do students respond to the new content? What is the ability of students to understand the new content?
  • Examining how education stakeholders (administrator, parents, teachers, students, etc.) make decisions as members of the school’s improvement team--How are different stakeholders encouraged to participate? How is power distributed? How is equity demonstrated? How is each voice valued? How are priorities and initiatives determined? How does the team evaluate its processes to determine effectiveness?
  • Examining the actions that school staff take to create an inclusive and welcoming school climate--Who makes and implements the actions taken to create the school climate? Do members of the school community (teachers, staff, students) view the school climate as inclusive? Do members of the school community feel welcome in the school? How are members of the school community encouraged to become involved in school activities? What actions can school staff take to help others feel a part of the school community?
  • Examining the perceptions of teachers with regard to the learning strategies that are more effective with special populations, such as special education students, English Language Learners, etc.—What strategies are perceived to be more effective? How do teachers plan instructionally for unique learners such as special education students or English Language Learners? How do teachers deal with the challenges presented by unique learners such as special education students or English Language Learners? What supports do teachers need (e.g., professional development, training, coaching) to more effectively deliver instruction to unique learners such as special education students or English Language Learners?

Remember—The goal of action research is to find out how individuals perceive and act in a situation so the researcher can develop a plan of action to improve the educational organization. While these topics listed here can be explored using other research designs, action research is the design to use if the outcome is to develop a plan of action for addressing and improving upon a situation in the educational organization.

Considerations for Determining Whether to Use Action Research in an Applied Dissertation

  • When considering action research, first determine the problem and the change that needs to occur as a result of addressing the problem (i.e., research problem and research purpose). Remember, the goal of action research is to change how individuals address a particular problem or situation in a way that results in improved practices.
  • If the study will be conducted at a school site or educational organization, you may need site permission. Determine whether site permission will be given to conduct the study.
  • Consider the individuals who will be part of the data collection (e.g., teachers, administrators, parents, other school staff, etc.). Will there be a representative sample willing to participate in the research?
  • If students will be part of the study, does parent consent and student assent need to be obtained?
  • As you develop your data collection plan, also consider the timeline for data collection. Is it feasible? For example, if you will be collecting data in a school, consider winter and summer breaks, school events, testing schedules, etc.
  • As you develop your data collection plan, consult with your dissertation chair, Subject Matter Expert, NU Academic Success Center, and the NU IRB for resources and guidance.
  • Action research is not an experimental design, so you are not trying to accept or reject a hypothesis. There are no independent or dependent variables. It is not generalizable to a larger setting. The goal is to understand what is occurring in the educational setting so that a plan of action can be developed for improved practices.

Considerations for Action Research

Below are some things to consider when developing your applied dissertation proposal using Action Research (adapted from Johnson, 2019):

  • Research Topic and Research Problem -- Decide the topic to be studied and then identify the problem by defining the issue in the learning environment. Use references from current peer-reviewed literature for support.
  • Purpose of the Study —What needs to be different or improved as a result of the study?
  • Research Questions —The questions developed should focus on “how” or “what” and explore individuals’ experiences, beliefs, and perceptions.
  • Theoretical Framework -- What are the existing theories (theoretical framework) or concepts (conceptual framework) that can be used to support the research? How does existing theory link to what is happening in the educational environment with regard to the topic? What theories have been used to support similar topics in previous research?
  • Literature Review -- Examine the literature, focusing on peer-reviewed studies published in journals within the last five years, with the exception of seminal works. What about the topic has already been explored and examined? What were the findings, implications, and limitations of previous research? What is missing from the literature on the topic? How will your proposed research address the gap in the literature?
  • Data Collection —Who will be part of the sample for data collection? What data will be collected from the individuals in the study (e.g., semi-structured interviews, surveys, etc.)? What are the educational artifacts and documents that need to be collected (e.g., teacher lesson plans, student portfolios, student grades, etc.)? How will they be collected and during what timeframe? (Note--A list of sample data collection methods appears under the heading of “Sample Instrumentation.”)
  • Data Analysis —Determine how the data will be analyzed. Some types of analyses that are frequently used for action research include thematic analysis and content analysis.
  • Implications —What conclusions can be drawn based upon the findings? How do the findings relate to the existing literature and inform theory in the field of education?
  • Recommendations for Practice--Create a Plan of Action—This is a critical step in action research. A plan of action is created based upon the data analysis, findings, and implications. In the Applied Dissertation, this Plan of Action is included with the Recommendations for Practice. The plan includes specific steps that individuals should take to change practices; recommendations for how those changes will occur (e.g., professional development, training, school improvement planning, committees to develop guidelines and policies, curriculum review committee, etc.); and methods to evaluate the plan’s effectiveness.
  • Recommendations for Research —What should future research focus on? What types of studies need to be conducted to build upon or further explore your findings?
  • Professional Presentation or Defense —This is where the findings will be presented in a professional presentation or defense as the culmination of your research.

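The thematic and content analysis mentioned under Data Analysis typically ends in a tally of how often each code appears, and in how many transcripts it appears. A minimal sketch, assuming segments have already been hand-coded by the researcher; the transcripts and code names here are hypothetical:

```python
from collections import Counter

# Each transcript is a list of codes the researcher assigned to segments.
# Transcripts and codes below are hypothetical.
coded_transcripts = {
    "teacher_01": ["time_pressure", "peer_support", "time_pressure"],
    "teacher_02": ["peer_support", "training_gap"],
    "teacher_03": ["time_pressure", "training_gap", "training_gap"],
}

def code_frequencies(transcripts):
    """Tally codes overall, and count how many transcripts mention each code."""
    overall = Counter()
    spread = Counter()  # number of distinct transcripts mentioning the code
    for codes in transcripts.values():
        overall.update(codes)
        spread.update(set(codes))
    return overall, spread

overall, spread = code_frequencies(coded_transcripts)
for code, n in overall.most_common():
    print(f"{code}: {n} segments across {spread[code]} transcripts")
```

Counting transcripts separately from segments helps distinguish a theme that many participants raised from one that a single participant raised repeatedly.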

Considerations for Sampling and Data Collection

Below are some tips for sampling, sample size, data collection, and instrumentation for Action Research:

Sampling and Sample Size

Action research uses non-probability sampling. This most commonly means a purposive sampling method with specific inclusion and exclusion criteria. However, convenience sampling can also be used (e.g., a teacher’s classroom).
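Applying the inclusion and exclusion criteria of a purposive sample amounts to filtering the candidate pool against each criterion in turn. A sketch with hypothetical criteria, names, and participant records:

```python
# Hypothetical purposive-sampling criteria: include classroom teachers with
# at least two years at the site; exclude anyone on the research team.
def meets_criteria(person, research_team):
    include = person["role"] == "teacher" and person["years_at_site"] >= 2
    exclude = person["name"] in research_team
    return include and not exclude

candidates = [
    {"name": "Rivera", "role": "teacher", "years_at_site": 5},
    {"name": "Chen", "role": "teacher", "years_at_site": 1},
    {"name": "Okafor", "role": "administrator", "years_at_site": 8},
    {"name": "Diaz", "role": "teacher", "years_at_site": 3},
]
research_team = {"Diaz"}

sample = [p["name"] for p in candidates if meets_criteria(p, research_team)]
print(sample)
```

Writing the criteria down this explicitly, even if the actual screening is done by hand, makes it easy to report in the dissertation exactly who was eligible and why.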

Critical Concepts in Data Collection

Triangulation—Dosemagen and Schwalbach (2019) discussed the importance of triangulation in Action Research: it enhances trustworthiness by providing multiple sources of data to analyze and confirm evidence for findings.

Trustworthiness —Trustworthiness assures that research findings fulfill four critical elements—credibility, dependability, transferability, and confirmability. Reflect on the following: Are there multiple sources of data? How have you ensured credibility, dependability, transferability, and confirmability? Have the assumptions, limitations, and delimitations of the study been identified and explained? Was the sample a representative sample for the study? Did any individuals leave the study before it ended? How have you controlled researcher biases and beliefs? Are you drawing conclusions that are not supported by data? Have all possible themes been considered? Have you identified other studies with similar results?
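One practical way to act on the triangulation advice above is to keep a simple matrix of which data sources support each finding, and to flag any finding that rests on a single source. A sketch with hypothetical findings and sources:

```python
# Map each emerging finding to the data sources that support it.
# Findings and sources below are hypothetical.
evidence = {
    "staff want more planning time": {"interviews", "staff survey", "focus group"},
    "new rubric improved feedback": {"document analysis", "interviews"},
    "students feel unheard": {"focus group"},
}

def triangulation_report(evidence, minimum=2):
    """Split findings into corroborated (>= minimum sources) and single-source."""
    corroborated = {f for f, sources in evidence.items() if len(sources) >= minimum}
    weak = set(evidence) - corroborated
    return corroborated, weak

corroborated, weak = triangulation_report(evidence)
print("Needs more evidence:", weak)
```

A single-source finding is not necessarily wrong, but the flag is a prompt to collect confirming data or to report the limitation explicitly.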

Sample Instrumentation

Below are some of the possible methods for collecting action research data:

  • Pre- and Post-Surveys for students and/or staff
  • Staff Perception Surveys and Questionnaires
  • Semi-Structured Interviews
  • Focus Groups
  • Observations
  • Document analysis
  • Student work samples
  • Classroom artifacts, such as teacher lesson plans, rubrics, checklists, etc.
  • Attendance records
  • Discipline data
  • Journals from students and/or staff
  • Portfolios from students and/or staff

A benefit of Action Research is its potential to influence educational practice. Many educators are, by nature of the profession, reflective, inquisitive, and action-oriented. The ultimate outcome of Action Research is to create a plan of action using the research findings to inform future educational practice. A Plan of Action is not meant to be a one-size-fits-all plan. Instead, it is meant to include specific data-driven and research-based recommendations that result from a detailed analysis of the data, the study findings, and the implications of the Action Research study. An effective Plan of Action includes an evaluation component and opportunities for professional educator reflection that allow for authentic discussion aimed at continuous improvement.

When developing a Plan of Action, the following should be considered:

  • How can this situation be approached differently in the future?
  • What should change in terms of practice?
  • What are the specific steps that individuals should take to change practices?
  • What is needed to implement the changes being recommended (professional development, training, materials, resources, planning committees, school improvement planning, etc.)?
  • How will the effectiveness of the implemented changes be evaluated?
  • How will opportunities for professional educator reflection be built into the Action Plan?

Sample Action Research Studies

Anderson, A. J. (2020). A qualitative systematic review of youth participatory action research implementation in U.S. high schools. American Journal of Community Psychology, 65(1/2), 242–257. https://onlinelibrary-wiley-com.proxy1.ncu.edu/doi/epdf/10.1002/ajcp.12389

Ayvaz, Ü., & Durmuş, S. (2021). Fostering mathematical creativity with problem posing activities: An action research with gifted students. Thinking Skills and Creativity, 40. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=edselp&AN=S1871187121000614&site=eds-live

Bellino, M. J. (2018). Closing information gaps in Kakuma Refugee Camp: A youth participatory action research study. American Journal of Community Psychology, 62 (3/4), 492–507. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ofs&AN=133626988&site=eds-live

Beneyto, M., Castillo, J., Collet-Sabé, J., & Tort, A. (2019). Can schools become an inclusive space shared by all families? Learnings and debates from an action research project in Catalonia. Educational Action Research, 27 (2), 210–226. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=135671904&site=eds-live

Bilican, K., Senler, B., & Karısan, D. (2021). Fostering teacher educators’ professional development through collaborative action research. International Journal of Progressive Education, 17 (2), 459–472. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=149828364&site=eds-live

Black, G. L. (2021). Implementing action research in a teacher preparation program: Opportunities and limitations. Canadian Journal of Action Research, 21 (2), 47–71. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=149682611&site=eds-live

Bozkuş, K., & Bayrak, C. (2019). The Application of the dynamic teacher professional development through experimental action research. International Electronic Journal of Elementary Education, 11 (4), 335–352. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=135580911&site=eds-live

Christ, T. W. (2018). Mixed methods action research in special education: An overview of a grant-funded model demonstration project. Research in the Schools, 25(2), 77–88. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=135047248&site=eds-live

Jakhelln, R., & Pörn, M. (2019). Challenges in supporting and assessing bachelor’s theses based on action research in initial teacher education. Educational Action Research, 27 (5), 726–741. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=140234116&site=eds-live

Klima Ronen, I. (2020). Action research as a methodology for professional development in leading an educational process. Studies in Educational Evaluation, 64 . https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=edselp&AN=S0191491X19302159&site=eds-live

Messiou, K. (2019). Collaborative action research: facilitating inclusion in schools. Educational Action Research, 27 (2), 197–209. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ehh&AN=135671898&site=eds-live

Mitchell, D. E. (2018). Say it loud: An action research project examining the afrivisual and africology, Looking for alternative African American community college teaching strategies. Journal of Pan African Studies, 12 (4), 364–487. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ofs&AN=133155045&site=eds-live

Pentón Herrera, L. J. (2018). Action research as a tool for professional development in the K-12 ELT classroom. TESL Canada Journal, 35 (2), 128–139. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ofs&AN=135033158&site=eds-live

Rodriguez, R., Macias, R. L., Perez-Garcia, R., Landeros, G., & Martinez, A. (2018). Action research at the intersection of structural and family violence in an immigrant Latino community: a youth-led study. Journal of Family Violence, 33 (8), 587–596. https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=ccm&AN=132323375&site=eds-live

Vaughan, M., Boerum, C., & Whitehead, L. (2019). Action research in doctoral coursework: Perceptions of independent research experiences. International Journal for the Scholarship of Teaching and Learning, 13 . https://proxy1.ncu.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=edsdoj&AN=edsdoj.17aa0c2976c44a0991e69b2a7b4f321&site=eds-live

Sample Journals for Action Research

Educational Action Research

Canadian Journal of Action Research

Sample Resource Videos

Call-Cummings, M. (2017). Researching racism in schools using participatory action research [Video]. Sage Research Methods  http://proxy1.ncu.edu/login?URL=https://methods.sagepub.com/video/researching-racism-in-schools-using-participatory-action-research

Fine, M. (2016). Michelle Fine discusses community based participatory action research [Video]. Sage Knowledge. http://proxy1.ncu.edu/login?URL=https://sk-sagepub-com.proxy1.ncu.edu/video/michelle-fine-discusses-community-based-participatory-action-research

Getz, C., Yamamura, E., & Tillapaugh. (2017). Action research in education [Video]. YouTube. https://www.youtube.com/watch?v=X2tso4klYu8

Bradbury, H. (Ed.). (2015). The handbook of action research (3rd edition). Sage.

Bradbury, H., Lewis, R. & Embury, D.C. (2019). Education action research: With and for the next generation. In C.A. Mertler (Ed.), The Wiley handbook of action research in education (1st edition). John Wiley and Sons. https://ebookcentral.proquest.com/lib/nu/reader.action?docID=5683581&ppg=205

Bourner, T., & Brook, C. (2019). Comparing and contrasting action research and action learning. In C.A. Mertler (Ed.), The Wiley handbook of action research in education (1st edition). John Wiley and Sons. https://ebookcentral.proquest.com/lib/nu/reader.action?docID=5683581&ppg=205

Bradbury, H. (2015). The Sage handbook of action research . Sage. https://www-doi-org.proxy1.ncu.edu/10.4135/9781473921290

Dosemagen, D. M., & Schwalbach, E. M. (2019). Legitimacy of and value in action research. In C.A. Mertler (Ed.), The Wiley handbook of action research in education (1st edition). John Wiley and Sons. https://ebookcentral.proquest.com/lib/nu/reader.action?docID=5683581&ppg=205

Johnson, A. (2019). Action research for teacher professional development. In C.A. Mertler (Ed.), The Wiley handbook of action research in education (1st edition). John Wiley and Sons. https://ebookcentral.proquest.com/lib/nu/reader.action?docID=5683581&ppg=205

Lewin, K. (1946). Action research and minority problems. In G.W. Lewin (Ed.), Resolving social conflicts: Selected papers on group dynamics (compiled in 1948). Harper and Row.

Mertler, C. A. (Ed.). (2019). The Wiley handbook of action research in education. John Wiley and Sons. https://ebookcentral.proquest.com/lib/nu/detail.action?docID=5683581


National University

© Copyright 2024 National University. All Rights Reserved.



Data Analysis

Module 5: Data Analysis & Reciprocity

At this stage, you’re probably carrying out your planned intervention or action and gathering data to address your research question. Many newcomers to action research believe that analysis should only start after all the data has been collected; in fact, analysis begins while data collection is still underway.

An interim analysis is part of the continuous, ongoing data analysis. It is part of the ongoing reflective planning process of action research (Hendricks, 2013).

Your action research projects will typically involve both quantitative and qualitative data. The methods for simplifying quantitative data, such as reporting, comparing, and displaying data, differ significantly from those used for qualitative data, which involve analyzing the data to identify patterns and themes.

New researchers often feel disappointed when their interventions don’t lead to the anticipated results. However, even in these situations, exploring the data to understand why things didn’t work as expected can provide valuable insights. This process can guide you in refining your intervention to achieve better results in the future.

Remember! Action research is an iterative process, so what you learn from this cycle of your research project will inform your next iteration of action research.

Analysis of Quantitative Data: Reporting & Comparing

Quantitative data is usually gathered via:

  • Test scores
  • Rubric-scored work
  • Tally sheets
  • Behavioural scales
  • Attitude scales
  • Closed-ended survey items

For example, counting or averaging the number of responses for each item:

  • Closed-ended responses (strong, average, weak) can reflect counts for the number of respondents who chose each response.
  • For the behavioural scale item, which includes numerical responses, the actual number chosen for each item could be tallied and the numbers could be averaged to describe results (Hendricks, 2013).
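The counting and averaging steps above can be sketched in a few lines of Python (the responses shown are invented for illustration):

```python
from collections import Counter
from statistics import mean

# Closed-ended survey item: count how many respondents chose each response.
responses = ["strong", "average", "strong", "weak", "average", "strong"]
counts = Counter(responses)
print(counts["strong"])  # 3

# Behavioural scale item (numerical responses): tally and average the
# numbers chosen to describe the results.
scale_ratings = [4, 3, 5, 4, 2, 4]
print(mean(scale_ratings))  # ≈ 3.67
```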

Quick Tips to Analyze Quantitative Data

According to Shank, “themes do not emerge from data. What emerges, after much hard work and creative thought, is an awareness in the mind of the researcher that there are patterns of order that seem to cut across various aspects of the data. When these patterns become organized, and when they characterize different segments of data, then we can call them ‘themes’” (cited in Hendricks, 2013).

[Checklist infographic with three items]

Analysis of Qualitative Data: Looking for Themes & Patterns

Analysis of qualitative data is a process of making meaning from data sources that can be interpreted in several ways, and it helps answer the why questions.

These data sources can be explained and used to answer your research question only after they have been interpreted. This process requires a deeper analysis of data than the processes used to explain quantitative data sources (Hendricks, 2013).

Verification

Verification is knowing when you “got it right.” Reaching valid conclusions in your study is a critical step in the action research cycle. Conclusions must be reasonable in light of the results obtained.

Quick Tips to Analyze Qualitative Data

  • Write your research question(s). Refer to your research question(s) often as you go through the process of qualitative data analysis.
  • Compile qualitative data sources. Convert non-textual data (e.g., audio or video recordings) to textual form.
  • Read text sources several times and over several days. Disassemble and code data.
  • Create a codebook. Define codes and illustrate them with quotes or examples from text sources.
  • Look for themes as you reassemble data.
  • Interpret results and write up major findings. Describe the patterns and themes specifically related to your research question(s).
  • Look for ways that the results of the different types of data you have collected (artifacts, observations, and inquiry data) support each other.
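The codebook-and-coding steps can also be sketched in code. Real qualitative coding is interpretive work done by the researcher; this hypothetical keyword-matching example only illustrates how a codebook supports assigning codes and tallying themes:

```python
# Minimal sketch of codebook-based coding. The codebook and excerpts are
# invented; real coding relies on researcher judgment, not keyword matching.
codebook = {
    "engagement": ["participate", "hands-on", "involved"],
    "confidence": ["confident", "sure of myself", "self-belief"],
}

excerpts = [
    "Students were eager to participate in the hands-on activity.",
    "I felt more confident presenting my ideas.",
    "The group stayed involved throughout the lesson.",
]

def assign_codes(text, codebook):
    """Assign every code whose keywords appear in the excerpt."""
    text = text.lower()
    return [code for code, keywords in codebook.items()
            if any(kw in text for kw in keywords)]

coded = {excerpt: assign_codes(excerpt, codebook) for excerpt in excerpts}

# Tally how often each code appears, a first step toward identifying themes.
theme_counts = {}
for codes in coded.values():
    for code in codes:
        theme_counts[code] = theme_counts.get(code, 0) + 1
print(theme_counts)  # {'engagement': 2, 'confidence': 1}
```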

Action Research Handbook Copyright © by Dr. Zabedia Nazim and Dr. Sowmya Venkat-Kishore is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.

Share This Book

Have a language expert improve your writing

Run a free plagiarism check in 10 minutes, generate accurate citations for free.

  • Knowledge Base

Methodology

  • What Is Action Research? | Definition & Examples

What Is Action Research? | Definition & Examples

Published on January 27, 2023 by Tegan George . Revised on January 12, 2024.

[Figure: the action research cycle]


There are 2 common types of action research: participatory action research and practical action research.

  • Participatory action research emphasizes that participants should be members of the community being studied, empowering those directly affected by outcomes of said research. In this method, participants are effectively co-researchers, with their lived experiences considered formative to the research process.
  • Practical action research focuses more on how research is conducted and is designed to address and solve specific issues.

Both types of action research are more focused on increasing the capacity and ability of future practitioners than contributing to a theoretical body of knowledge.


Action research is often reflected in 3 action research models: operational (sometimes called technical), collaboration, and critical reflection.

  • Operational (or technical) action research is usually visualized like a spiral following a series of steps, such as “planning → acting → observing → reflecting.”
  • Collaboration action research is more community-based, focused on building a network of similar individuals (e.g., college professors in a given geographic area) and compiling learnings from iterated feedback cycles.
  • Critical reflection action research serves to contextualize systemic processes that are already ongoing (e.g., working retroactively to analyze existing school systems by questioning why certain practices were put into place and developed the way they did).

Action research is often used in fields like education because of its iterative and flexible style.

Example: Participatory action research — After the information was collected, the students were asked where they thought ramps or other accessibility measures would be best utilized, and the suggestions were sent to school administrators.

Example: Practical action research — Science teachers at your city’s high school have been witnessing a year-over-year decline in standardized test scores in chemistry. In seeking the source of this issue, they studied how concepts are taught in depth, focusing on the methods, tools, and approaches used by each teacher.

Action research differs sharply from other types of research in that it seeks to produce actionable processes over the course of the research rather than contributing to existing knowledge or drawing conclusions from datasets. In this way, action research is formative , not summative , and is conducted in an ongoing, iterative way.


As such, action research is different in purpose, context, and significance and is a good fit for those seeking to implement systemic change.


Action research comes with advantages and disadvantages.

Advantages

  • Action research is highly adaptable , allowing researchers to mold their analysis to their individual needs and implement practical individual-level changes.
  • Action research provides an immediate and actionable path forward for solving entrenched issues, rather than suggesting complicated, longer-term solutions rooted in complex data.
  • Done correctly, action research can be very empowering , informing social change and allowing participants to effect that change in ways meaningful to their communities.

Disadvantages

  • Due to their flexibility, action research studies are plagued by very limited generalizability  and are very difficult to replicate . They are often not considered theoretically rigorous due to the power the researcher holds in drawing conclusions.
  • Action research can be complicated to structure in an ethical manner . Participants may feel pressured to participate or to participate in a certain way.
  • Action research is at high risk for research biases such as selection bias , social desirability bias , or other types of cognitive biases .

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Normal distribution
  • Degrees of freedom
  • Null hypothesis
  • Discourse analysis
  • Control groups
  • Mixed methods research
  • Non-probability sampling
  • Quantitative research
  • Inclusion and exclusion criteria

Research bias

  • Rosenthal effect
  • Implicit bias
  • Cognitive bias
  • Selection bias
  • Negativity bias
  • Status quo bias

Action research is conducted in order to solve a particular issue immediately, while case studies are often conducted over a longer period of time and focus more on observing and analyzing a particular ongoing phenomenon.

Action research is focused on solving a problem or informing individual and community-based knowledge in a way that impacts teaching, learning, and other related processes. It is less focused on contributing theoretical input, instead producing actionable input.

Action research is particularly popular with educators as a form of systematic inquiry because it prioritizes reflection and bridges the gap between theory and practice. Educators are able to simultaneously investigate an issue as they solve it, and the method is very iterative and flexible.

A cycle of inquiry is another name for action research . It is usually visualized in a spiral shape following a series of steps, such as “planning → acting → observing → reflecting.”

Sources in this article

We strongly encourage students to use sources in their work. You can cite our article (APA Style) or take a deep dive into the articles below.

George, T. (2024, January 12). What Is Action Research? | Definition & Examples. Scribbr. Retrieved September 16, 2024, from https://www.scribbr.com/methodology/action-research/
Cohen, L., Manion, L., & Morrison, K. (2017). Research methods in education (8th edition). Routledge.
Naughton, G. M. (2001).  Action research (1st edition). Routledge.



What is a data analysis plan?

Data analysis is the process of assessing the data you’ve gathered to extract useful statistics and draw conclusions. It allows you to make sense of the information gathered and answer your key research questions.

But when the data comes rolling in, you may feel a little overwhelmed. It’s often hard to know where to start, especially because there are a few different options. We recommend developing a plan and thinking carefully about how you’re going to organise and analyse your survey data.

How to analyse your data

1. Check your research questions

First things first, what questions did you set out to answer? What did you want your survey to tell you? These are your research questions.

For instance, let’s say you held a conference for people working in education, and you wanted to know what the attendees thought of your event. Normally, you will have come up with several key questions you wanted answers to, including:

  • How did attendees rate the event overall?
  • Which parts/aspects of the conference did attendees like the best?
  • Which parts/aspects of the conference need to be improved?
  • Who are the attendees and what are their specific needs?

2. Relate each survey question to a research question

Attribute each survey question to the relevant research question. We’ve provided an example in the following table. Doing this will help you know which survey questions to refer to for specific research topics. For example, to find out which parts of the conference attendees liked the best, look at the answers to questions 3–6.

How did attendees rate the event overall?
  1. Overall, how satisfied were you with the conference?
  2. How useful was this conference compared to other conferences you have attended?

Which parts/aspects of the conference did attendees like the best? / Which parts/aspects of the conference need to be improved?
  3. How would you rate the difficulty of the workshop?
  4. Overall, do you think the conference provided too much, too little or about the right amount of networking?
  5. In general, how would you rate the food at the conference?
  6. Do you feel the temperature in the conference building was too hot, too cold or just right?

Who are the attendees and what are their specific needs?
  7. Are you a teacher, student or administrator?
  8. How large is your school?
  9. How old are you?
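The mapping above can also be kept as a simple lookup structure, so each research question points to its related survey items (question numbering follows this example):

```python
# The research-question-to-survey-question mapping from the example above,
# stored as a lookup so analysis code knows which items to pull.
question_map = {
    "How did attendees rate the event overall?": [1, 2],
    "Which parts/aspects of the conference did attendees like the best?": [3, 4, 5, 6],
    "Which parts/aspects of the conference need to be improved?": [3, 4, 5, 6],
    "Who are the attendees and what are their specific needs?": [7, 8, 9],
}

# To analyse a research question, look up the relevant survey items:
relevant = question_map["Which parts/aspects of the conference did attendees like the best?"]
print(relevant)  # [3, 4, 5, 6]
```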

3. Choose your data analysis methods

There are different ways of analysing data, depending on whether it’s qualitative or quantitative. Because most surveys include a mixture, you’ll probably use both types of data analysis techniques.

Quantitative data analysis

Quantitative research focuses on facts and figures. Answers are often presented numerically, but they can also be expressed as words, as with yes/no or multiple-choice questions.

When it comes to analysis techniques for quantitative data, some basic statistics can help you understand your results and identify patterns. This includes descriptive statistics such as minimum and maximum, mean (or average), median and standard deviation (looking at the distance from the mean). And the good news is, there’s no need for other data analysis tools or number crunching—all these statistics are automatically calculated in the Analyze Results [A1] section of your SurveyMonkey survey.
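As a sketch of what those descriptive statistics mean, here they are computed by hand with Python’s standard library for a small, invented set of 1–5 satisfaction ratings (in practice, SurveyMonkey calculates these for you):

```python
from statistics import mean, median, stdev

# Invented 1-5 satisfaction ratings, for illustration only.
ratings = [4, 5, 3, 4, 2, 5, 4, 3]

print(min(ratings), max(ratings))  # minimum and maximum: 2 5
print(mean(ratings))               # mean (average): 3.75
print(median(ratings))             # median: 4.0
print(round(stdev(ratings), 2))    # sample standard deviation: 1.04
```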

Qualitative data analysis

Qualitative research often investigates opinions and perceptions. While it’s a little harder to analyse than quantitative data, it provides some valuable insights. For example, it often looks at the motivations behind decisions, meaning it’s a useful way of adding context to your quantitative data.

Two key types of qualitative data analysis are content analysis and grounded theory. Content analysis actually refers to a range of methods for analysing text-based data. One example of this is Sentiment Analysis, which is a way of identifying the emotion behind people’s comments. When this functionality is enabled in SurveyMonkey, your survey answers will be categorised as Positive, Neutral, Negative or Undetected. Meanwhile, grounded theory involves looking at the qualitative data to explain a pattern. You can analyse individual comments manually and then keep track of them by creating and adding your own tags in SurveyMonkey. You can also filter responses by tag.
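To illustrate the idea behind sentiment categorisation (this is not SurveyMonkey’s actual implementation, which uses its own models), here is a toy keyword-based tagger that sorts comments into the same Positive/Neutral/Negative buckets:

```python
# Toy sentiment tagger: counts positive and negative keywords in a comment.
# Keyword lists are invented; real sentiment analysis is far more nuanced.
POSITIVE = {"great", "loved", "excellent", "helpful"}
NEGATIVE = {"poor", "boring", "cold", "disappointing"}

def tag_sentiment(comment):
    words = set(comment.lower().replace(".", "").split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "Positive"
    if neg > pos:
        return "Negative"
    return "Neutral"

comments = [
    "The workshops were excellent and the speakers helpful.",
    "The venue was cold and the food disappointing.",
    "The conference ran on schedule.",
]
print([tag_sentiment(c) for c in comments])  # ['Positive', 'Negative', 'Neutral']
```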

4. Interpret and present your findings using data analysis techniques

Segmentation and comparison

As well as looking at overall survey results or at results for each question, it can be incredibly insightful to segment your data and compare results across different segments. You could choose to segment your results by demographic characteristics such as gender, age, location or occupation and compare between them. For instance, you might discover that those aged 30–35 enjoyed your event more than those of other age groups. Or perhaps that, overall, teachers enjoyed the networking sessions more than students.

You can also use some nifty filtering methods for data analysis and comparison survey-wide. For example, SurveyMonkey allows you to filter your entire survey data based on sentiment analysis, on specific demographics or on tags, so you can compare results across these groups.
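Segmentation amounts to grouping responses by a demographic field and comparing summaries across the groups. A minimal sketch, with invented data and hypothetical field names:

```python
from collections import defaultdict
from statistics import mean

# Invented responses: each row has a demographic field and a 1-5 rating.
responses = [
    {"role": "teacher", "satisfaction": 5},
    {"role": "teacher", "satisfaction": 4},
    {"role": "student", "satisfaction": 3},
    {"role": "student", "satisfaction": 4},
    {"role": "teacher", "satisfaction": 5},
]

# Group ratings by the demographic segment...
by_segment = defaultdict(list)
for r in responses:
    by_segment[r["role"]].append(r["satisfaction"])

# ...then compare the average satisfaction across segments.
for segment, scores in by_segment.items():
    print(segment, round(mean(scores), 2))  # teacher 4.67, then student 3.5
```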

Data visualisation

Humans are visual creatures. This means we find information easier to understand and remember when it’s displayed as an icon, graph or image, rather than as screeds of text or rows of numbers. Given this, it’s a good idea to display your findings visually before sharing them with your team or adding them to your report. Some common visualisation options for quantitative data are pie charts, bar graphs and line graphs. With SurveyMonkey, you can change between these different charts and graphs, choosing the most compelling option for each result and even customising them. And with your qualitative data, you can create a word cloud, which is a visual representation of the most common words and phrases from your open-ended responses.
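A word cloud is driven by word frequencies in open-ended responses. This sketch computes those frequencies, using invented responses and an illustrative stop-word list:

```python
from collections import Counter

# Invented open-ended responses; a word cloud scales each word by frequency.
open_ended = [
    "More networking time would be great",
    "Networking sessions were the highlight",
    "Great speakers, great networking",
]
# Tiny illustrative stop-word list; real tools use much larger ones.
STOP_WORDS = {"more", "would", "be", "were", "the"}

words = " ".join(open_ended).lower().replace(",", "").split()
freq = Counter(w for w in words if w not in STOP_WORDS)
print(freq.most_common(2))  # 'networking' and 'great' each appear 3 times
```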


Can J Hosp Pharm. 2015;68(4).

Creating a Data Analysis Plan: What to Consider When Choosing Statistics for a Study

There are three kinds of lies: lies, damned lies, and statistics. – Mark Twain 1

INTRODUCTION

Statistics represent an essential part of a study because, regardless of the study design, investigators need to summarize the collected information for interpretation and presentation to others. It is therefore important for us to heed Mr Twain’s concern when creating the data analysis plan. In fact, even before data collection begins, we need to have a clear analysis plan that will guide us from the initial stages of summarizing and describing the data through to testing our hypotheses.

The purpose of this article is to help you create a data analysis plan for a quantitative study. For those interested in conducting qualitative research, previous articles in this Research Primer series have provided information on the design and analysis of such studies.2,3 Information in the current article is divided into 3 main sections: an overview of terms and concepts used in data analysis, a review of common methods used to summarize study data, and a process to help identify relevant statistical tests. My intention here is to introduce the main elements of data analysis and provide a place for you to start when planning this part of your study. Biostatistical experts, textbooks, statistical software packages, and other resources can certainly add more breadth and depth to this topic when you need additional information and advice.

TERMS AND CONCEPTS USED IN DATA ANALYSIS

When analyzing information from a quantitative study, we are often dealing with numbers; therefore, it is important to begin with an understanding of the source of the numbers. Let us start with the term variable , which defines a specific item of information collected in a study. Examples of variables include age, sex or gender, ethnicity, exercise frequency, weight, treatment group, and blood glucose. Each variable will have a group of categories, which are referred to as values , to help describe the characteristic of an individual study participant. For example, the variable “sex” would have values of “male” and “female”.

Although variables can be defined or grouped in various ways, I will focus on 2 methods at this introductory stage. First, variables can be defined according to the level of measurement. The categories in a nominal variable are names, for example, male and female for the variable “sex”; white, Aboriginal, black, Latin American, South Asian, and East Asian for the variable “ethnicity”; and intervention and control for the variable “treatment group”. Nominal variables with only 2 categories are also referred to as dichotomous variables because the study group can be divided into 2 subgroups based on information in the variable. For example, a study sample can be split into 2 groups (patients receiving the intervention and controls) using the dichotomous variable “treatment group”. An ordinal variable implies that the categories can be placed in a meaningful order, as would be the case for exercise frequency (never, sometimes, often, or always). Nominal-level and ordinal-level variables are also referred to as categorical variables, because each category in the variable can be completely separated from the others. The categories for an interval variable can be placed in a meaningful order, with the interval between consecutive categories also having meaning. Age, weight, and blood glucose can be considered as interval variables, but also as ratio variables, because the ratio between values has meaning (e.g., a 15-year-old is half the age of a 30-year-old). Interval-level and ratio-level variables are also referred to as continuous variables because of the underlying continuity among categories.
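The distinction between nominal and ordinal levels of measurement can be made explicit in analysis software. As a brief sketch (the values are illustrative and the pandas library is assumed), an ordinal variable can be declared as an ordered category so that "never < sometimes < often < always" is respected, while a nominal variable carries no order:

```python
import pandas as pd

# Nominal variable: categories are names with no inherent order
sex = pd.Categorical(["male", "female", "female", "male"])

# Ordinal variable: categories can be placed in a meaningful order
exercise = pd.Categorical(
    ["sometimes", "never", "always", "often"],
    categories=["never", "sometimes", "often", "always"],
    ordered=True,
)

# Ordered categories support meaningful min/max comparisons
print(exercise.min(), exercise.max())
```

Declaring the order up front prevents the software from treating the categories alphabetically during analysis.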

As we progress through the levels of measurement from nominal to ratio variables, we gather more information about the study participant. The amount of information that a variable provides will become important in the analysis stage, because we lose information when variables are reduced or aggregated—a common practice that is not recommended. 4 For example, if age is reduced from a ratio-level variable (measured in years) to an ordinal variable (categories of < 65 and ≥ 65 years) we lose the ability to make comparisons across the entire age range and introduce error into the data analysis. 4

A second method of defining variables is to consider them as either dependent or independent. As the terms imply, the value of a dependent variable depends on the value of other variables, whereas the value of an independent variable does not rely on other variables. In addition, an investigator can influence the value of an independent variable, such as treatment-group assignment. Independent variables are also referred to as predictors because we can use information from these variables to predict the value of a dependent variable. Building on the group of variables listed in the first paragraph of this section, blood glucose could be considered a dependent variable, because its value may depend on values of the independent variables age, sex, ethnicity, exercise frequency, weight, and treatment group.

Statistics are mathematical formulae that are used to organize and interpret the information that is collected through variables. There are 2 general categories of statistics, descriptive and inferential. Descriptive statistics are used to describe the collected information, such as the range of values, their average, and the most common category. Knowledge gained from descriptive statistics helps investigators learn more about the study sample. Inferential statistics are used to make comparisons and draw conclusions from the study data. Knowledge gained from inferential statistics allows investigators to make inferences and generalize beyond their study sample to other groups.

Before we move on to specific descriptive and inferential statistics, there are 2 more definitions to review. Parametric statistics are generally used when values in an interval-level or ratio-level variable are normally distributed (i.e., the entire group of values has a bell-shaped curve when plotted by frequency). These statistics are used because we can define parameters of the data, such as the centre and width of the normally distributed curve. In contrast, interval-level and ratio-level variables with values that are not normally distributed, as well as nominal-level and ordinal-level variables, are generally analyzed using nonparametric statistics.
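Whether values approximate a normal distribution can be checked formally as well as visually. The sketch below (simulated data; the scipy library is assumed) applies the Shapiro–Wilk test, one common normality test, to a bell-shaped sample and a skewed sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
normal_values = rng.normal(loc=100, scale=15, size=200)  # bell-shaped
skewed_values = rng.exponential(scale=10, size=200)      # right-skewed

# Shapiro-Wilk test: a small p value suggests the data are not
# normally distributed, pointing toward nonparametric statistics
_, p_normal = stats.shapiro(normal_values)
_, p_skewed = stats.shapiro(skewed_values)
print(f"normal sample p={p_normal:.3f}, skewed sample p={p_skewed:.2e}")
```

Visual checks such as histograms and Q–Q plots are usually recommended alongside formal tests, especially in small samples.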

METHODS FOR SUMMARIZING STUDY DATA: DESCRIPTIVE STATISTICS

The first step in a data analysis plan is to describe the data collected in the study. This can be done using figures to give a visual presentation of the data and statistics to generate numeric descriptions of the data.

Selection of an appropriate figure to represent a particular set of data depends on the measurement level of the variable. Data for nominal-level and ordinal-level variables may be interpreted using a pie graph or bar graph . Both options allow us to examine the relative number of participants within each category (by reporting the percentages within each category), whereas a bar graph can also be used to examine absolute numbers. For example, we could create a pie graph to illustrate the proportions of men and women in a study sample and a bar graph to illustrate the number of people who report exercising at each level of frequency (never, sometimes, often, or always).

Interval-level and ratio-level variables may also be interpreted using a pie graph or bar graph; however, these types of variables often have too many categories for such graphs to provide meaningful information. Instead, these variables may be better interpreted using a histogram . Unlike a bar graph, which displays the frequency for each distinct category, a histogram displays the frequency within a range of continuous categories. Information from this type of figure allows us to determine whether the data are normally distributed. In addition to pie graphs, bar graphs, and histograms, many other types of figures are available for the visual representation of data. Interested readers can find additional types of figures in the books recommended in the “Further Readings” section.

Figures are also useful for visualizing comparisons between variables or between subgroups within a variable (for example, the distribution of blood glucose according to sex). Box plots are useful for summarizing information for a variable that does not follow a normal distribution. The lower and upper limits of the box identify the interquartile range (or 25th and 75th percentiles), while the midline indicates the median value (or 50th percentile). Scatter plots provide information on how the categories for one continuous variable relate to categories in a second variable; they are often helpful in the analysis of correlations.

In addition to using figures to present a visual description of the data, investigators can use statistics to provide a numeric description. Regardless of the measurement level, we can find the mode by identifying the most frequent category within a variable. When summarizing nominal-level and ordinal-level variables, the simplest method is to report the proportion of participants within each category.

The choice of the most appropriate descriptive statistic for interval-level and ratio-level variables will depend on how the values are distributed. If the values are normally distributed, we can summarize the information using the parametric statistics of mean and standard deviation. The mean is the arithmetic average of all values within the variable, and the standard deviation tells us how widely the values are dispersed around the mean. When values of interval-level and ratio-level variables are not normally distributed, or we are summarizing information from an ordinal-level variable, it may be more appropriate to use the nonparametric statistics of median and range. The first step in identifying these descriptive statistics is to arrange study participants according to the variable categories from lowest value to highest value. The range is used to report the lowest and highest values. The median or 50th percentile is located by dividing the number of participants into 2 groups, such that half (50%) of the participants have values above the median and the other half (50%) have values below the median. Similarly, the 25th percentile is the value with 25% of the participants having values below and 75% of the participants having values above, and the 75th percentile is the value with 75% of participants having values below and 25% of participants having values above. Together, the 25th and 75th percentiles define the interquartile range .

PROCESS TO IDENTIFY RELEVANT STATISTICAL TESTS: INFERENTIAL STATISTICS

One caveat about the information provided in this section: selecting the most appropriate inferential statistic for a specific study should be a combination of following these suggestions, seeking advice from experts, and discussing with your co-investigators. My intention here is to give you a place to start a conversation with your colleagues about the options available as you develop your data analysis plan.

There are 3 key questions to consider when selecting an appropriate inferential statistic for a study: What is the research question? What is the study design? and What is the level of measurement? It is important for investigators to carefully consider these questions when developing the study protocol and creating the analysis plan. The figures that accompany these questions show decision trees that will help you to narrow down the list of inferential statistics that would be relevant to a particular study. Appendix 1 provides brief definitions of the inferential statistics named in these figures. Additional information, such as the formulae for various inferential statistics, can be obtained from textbooks, statistical software packages, and biostatisticians.

What Is the Research Question?

The first step in identifying relevant inferential statistics for a study is to consider the type of research question being asked. You can find more details about the different types of research questions in a previous article in this Research Primer series that covered questions and hypotheses. 5 A relational question seeks information about the relationship among variables; in this situation, investigators will be interested in determining whether there is an association ( Figure 1 ). A causal question seeks information about the effect of an intervention on an outcome; in this situation, the investigator will be interested in determining whether there is a difference ( Figure 2 ).

Figure 1. Decision tree to identify inferential statistics for an association.

Figure 2. Decision tree to identify inferential statistics for measuring a difference.

What Is the Study Design?

When considering a question of association, investigators will be interested in measuring the relationship between variables ( Figure 1 ). A study designed to determine whether there is consensus among different raters will be measuring agreement. For example, an investigator may be interested in determining whether 2 raters, using the same assessment tool, arrive at the same score. Correlation analyses examine the strength of a relationship or connection between 2 variables, like age and blood glucose. Regression analyses also examine the strength of a relationship or connection; however, in this type of analysis, one variable is considered an outcome (or dependent variable) and the other variable is considered a predictor (or independent variable). Regression analyses often consider the influence of multiple predictors on an outcome at the same time. For example, an investigator may be interested in examining the association between a treatment and blood glucose, while also considering other factors, like age, sex, ethnicity, exercise frequency, and weight.

When considering a question of difference, investigators must first determine how many groups they will be comparing. In some cases, investigators may be interested in comparing the characteristic of one group with that of an external reference group. For example, is the mean age of study participants similar to the mean age of all people in the target group? If more than one group is involved, then investigators must also determine whether there is an underlying connection between the sets of values (or samples ) to be compared. Samples are considered independent or unpaired when the information is taken from different groups. For example, we could use an unpaired t test to compare the mean age between 2 independent samples, such as the intervention and control groups in a study. Samples are considered related or paired if the information is taken from the same group of people, for example, measurement of blood glucose at the beginning and end of a study. Because blood glucose is measured in the same people at both time points, we could use a paired t test to determine whether there has been a significant change in blood glucose.

What Is the Level of Measurement?

As described in the first section of this article, variables can be grouped according to the level of measurement (nominal, ordinal, or interval). In most cases, the independent variable in an inferential statistic will be nominal; therefore, investigators need to know the level of measurement for the dependent variable before they can select the relevant inferential statistic. Two exceptions to this consideration are correlation analyses and regression analyses ( Figure 1 ). Because a correlation analysis measures the strength of association between 2 variables, we need to consider the level of measurement for both variables. Regression analyses can consider multiple independent variables, often with a variety of measurement levels. However, for these analyses, investigators still need to consider the level of measurement for the dependent variable.

Selection of inferential statistics to test interval-level variables must include consideration of how the data are distributed. An underlying assumption for parametric tests is that the data approximate a normal distribution. When the data are not normally distributed, information derived from a parametric test may be wrong. 6 When the assumption of normality is violated (for example, when the data are skewed), then investigators should use a nonparametric test. If the data are normally distributed, then investigators can use a parametric test.
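Putting this decision into practice (illustrative data; scipy assumed): when a group contains strongly skewed values, the Mann–Whitney U test is the usual nonparametric alternative to the unpaired t test:

```python
from scipy import stats

# Group A contains extreme outliers, so its values are skewed
group_a = [3.1, 3.4, 3.2, 8.9, 3.3, 3.5, 9.4, 3.2]
group_b = [2.0, 2.2, 2.1, 2.3, 2.4, 2.1, 2.2, 2.5]

# Mann-Whitney U: rank-based, so it is robust to the outliers
u_stat, p = stats.mannwhitneyu(group_a, group_b)
print(f"U={u_stat}, p={p:.4g}")
```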

ADDITIONAL CONSIDERATIONS

What Is the Level of Significance?

An inferential statistic is used to calculate a p value, the probability of obtaining data at least as extreme as those observed if there were truly no effect. Investigators can then compare this p value against a prespecified level of significance, which is often chosen to be 0.05. This level of significance accepts a 1 in 20 chance of concluding that an effect exists when it does not, which is considered an acceptable level of error.

What Are the Most Commonly Used Statistics?

In 1983, Emerson and Colditz 7 reported the first review of statistics used in original research articles published in the New England Journal of Medicine . This review of statistics used in the journal was updated in 1989 and 2005, 8 and this type of analysis has been replicated in many other journals. 9 – 13 Collectively, these reviews have identified 2 important observations. First, the overall sophistication of statistical methodology used and reported in studies has grown over time, with survival analyses and multivariable regression analyses becoming much more common. The second observation is that, despite this trend, 1 in 4 articles describe no statistical methods or report only simple descriptive statistics. When inferential statistics are used, the most common are t tests, contingency table tests (for example, χ 2 test and Fisher exact test), and simple correlation and regression analyses. This information is important for educators, investigators, reviewers, and readers because it suggests that a good foundational knowledge of descriptive statistics and common inferential statistics will enable us to correctly evaluate the majority of research articles. 11 – 13 However, to fully take advantage of all research published in high-impact journals, we need to become acquainted with some of the more complex methods, such as multivariable regression analyses. 8 , 13
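Of the commonly used statistics listed here, the contingency table tests deserve a brief sketch (hypothetical counts; scipy assumed):

```python
from scipy import stats

# 2x2 contingency table: treatment group (rows) vs outcome (columns)
#            improved  not improved
table = [[30, 10],    # intervention
         [18, 22]]    # control

chi2, p, dof, expected = stats.chi2_contingency(table)
alpha = 0.05
print(f"chi2={chi2:.2f}, df={dof}, p={p:.4f}, significant={p < alpha}")
```

If any expected cell count fell below 5, the Fisher exact test (stats.fisher_exact) would be preferred, as noted in the glossary below.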

What Are Some Additional Resources?

As an investigator and Associate Editor with CJHP , I have often relied on the advice of colleagues to help create my own analysis plans and review the plans of others. Biostatisticians have a wealth of knowledge in the field of statistical analysis and can provide advice on the correct selection, application, and interpretation of these methods. Colleagues who have “been there and done that” with their own data analysis plans are also valuable sources of information. Identify these individuals and consult with them early and often as you develop your analysis plan.

Another important resource to consider when creating your analysis plan is textbooks. Numerous statistical textbooks are available, differing in levels of complexity and scope. The titles listed in the “Further Reading” section are just a few suggestions. I encourage interested readers to look through these and other books to find resources that best fit their needs. However, one crucial book that I highly recommend to anyone wanting to be an investigator or peer reviewer is Lang and Secic’s How to Report Statistics in Medicine (see “Further Reading”). As the title implies, this book covers a wide range of statistics used in medical research and provides numerous examples of how to correctly report the results.

CONCLUSIONS

When it comes to creating an analysis plan for your project, I recommend following the sage advice of Douglas Adams in The Hitchhiker’s Guide to the Galaxy : Don’t panic! 14 Begin with simple methods to summarize and visualize your data, then use the key questions and decision trees provided in this article to identify relevant statistical tests. Information in this article will give you and your co-investigators a place to start discussing the elements necessary for developing an analysis plan. But do not stop there! Use advice from biostatisticians and more experienced colleagues, as well as information in textbooks, to help create your analysis plan and choose the most appropriate statistics for your study. Making careful, informed decisions about the statistics to use in your study should reduce the risk of confirming Mr Twain’s concern.

Appendix 1. Glossary of statistical terms

  • 1-way ANOVA: Uses 1 variable to define the groups for comparing means. This is similar to the Student t test when comparing the means of 2 groups.
  • Kruskal–Wallis 1-way ANOVA: Nonparametric alternative for the 1-way ANOVA. Used to determine the difference in medians between 3 or more groups.
  • n -way ANOVA: Uses 2 or more variables to define groups when comparing means. Also called a “between-subjects factorial ANOVA”.
  • Repeated-measures ANOVA: A method for analyzing whether the means of 3 or more measures from the same group of participants are different.
  • Friedman ANOVA: Nonparametric alternative for the repeated-measures ANOVA. It is often used to compare rankings and preferences that are measured 3 or more times.
  • Fisher exact: Variation of chi-square that accounts for cell counts < 5.
  • McNemar: Variation of chi-square that tests statistical significance of changes in 2 paired measurements of dichotomous variables.
  • Cochran Q: An extension of the McNemar test that provides a method for testing for differences between 3 or more matched sets of frequencies or proportions. Often used as a measure of heterogeneity in meta-analyses.
  • 1-sample t test: Used to determine whether the mean of a sample is significantly different from a known or hypothesized value.
  • Independent-samples t test (also referred to as the Student t test): Used when the independent variable is a nominal-level variable that identifies 2 groups and the dependent variable is an interval-level variable.
  • Paired t test: Used to compare 2 pairs of scores between 2 groups (e.g., baseline and follow-up blood pressure in the intervention and control groups).

Lang TA, Secic M. How to report statistics in medicine: annotated guidelines for authors, editors, and reviewers. 2nd ed. Philadelphia (PA): American College of Physicians; 2006.

Norman GR, Streiner DL. PDQ statistics. 3rd ed. Hamilton (ON): B.C. Decker; 2003.

Plichta SB, Kelvin E. Munro’s statistical methods for health care research . 6th ed. Philadelphia (PA): Wolters Kluwer Health/ Lippincott, Williams & Wilkins; 2013.

This article is the 12th in the CJHP Research Primer Series, an initiative of the CJHP Editorial Board and the CSHP Research Committee. The planned 2-year series is intended to appeal to relatively inexperienced researchers, with the goal of building research capacity among practising pharmacists. The articles, presenting simple but rigorous guidance to encourage and support novice researchers, are being solicited from authors with appropriate expertise.

Previous articles in this series:

  • Bond CM. The research jigsaw: how to get started. Can J Hosp Pharm . 2014;67(1):28–30.
  • Tully MP. Research: articulating questions, generating hypotheses, and choosing study designs. Can J Hosp Pharm . 2014;67(1):31–4.
  • Loewen P. Ethical issues in pharmacy practice research: an introductory guide. Can J Hosp Pharm. 2014;67(2):133–7.
  • Tsuyuki RT. Designing pharmacy practice research trials. Can J Hosp Pharm . 2014;67(3):226–9.
  • Bresee LC. An introduction to developing surveys for pharmacy practice research. Can J Hosp Pharm . 2014;67(4):286–91.
  • Gamble JM. An introduction to the fundamentals of cohort and case–control studies. Can J Hosp Pharm . 2014;67(5):366–72.
  • Austin Z, Sutton J. Qualitative research: getting started. Can J Hosp Pharm . 2014;67(6):436–40.
  • Houle S. An introduction to the fundamentals of randomized controlled trials in pharmacy research. Can J Hosp Pharm . 2014; 68(1):28–32.
  • Charrois TL. Systematic reviews: What do you need to know to get started? Can J Hosp Pharm . 2014;68(2):144–8.
  • Sutton J, Austin Z. Qualitative research: data collection, analysis, and management. Can J Hosp Pharm . 2014;68(3):226–31.
  • Cadarette SM, Wong L. An introduction to health care administrative data. Can J Hosp Pharm. 2014;68(3):232–7.

Competing interests: None declared.

Further Reading

  • Devor J, Peck R. Statistics: the exploration and analysis of data. 7th ed. Boston (MA): Brooks/Cole Cengage Learning; 2012.
  • Lang TA, Secic M. How to report statistics in medicine: annotated guidelines for authors, editors, and reviewers. 2nd ed. Philadelphia (PA): American College of Physicians; 2006.
  • Mendenhall W, Beaver RJ, Beaver BM. Introduction to probability and statistics. 13th ed. Belmont (CA): Brooks/Cole Cengage Learning; 2009.
  • Norman GR, Streiner DL. PDQ statistics. 3rd ed. Hamilton (ON): B.C. Decker; 2003.
  • Plichta SB, Kelvin E. Munro’s statistical methods for health care research. 6th ed. Philadelphia (PA): Wolters Kluwer Health/Lippincott, Williams & Wilkins; 2013.

Data Analysis Plan: Ultimate Guide and Examples


Once you get survey feedback, you might think that the job is done. The next step, however, is to analyze those results. Creating a data analysis plan will guide you through how to analyze the data and come to logical conclusions.

So, how do you create a data analysis plan? It starts with the goals you set for your survey in the first place. This guide will help you create a data analysis plan that will effectively utilize the data your respondents provided.

What can a data analysis plan do?

Think of data analysis plans as a guide to your organization and analysis, which will help you accomplish your ultimate survey goals. A good plan will make sure that you get answers to your top questions, such as “how do customers feel about this new product?” through specific survey questions. It will also separate respondents to see how opinions among various demographics may differ.
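In practice, "separating respondents" is a group-by operation. A minimal sketch, assuming survey results live in a pandas DataFrame with hypothetical column names:

```python
import pandas as pd

# Hypothetical survey responses
responses = pd.DataFrame({
    "age_group": ["25-45", "25-45", "45-65", "65+", "65+", "25-45"],
    "satisfaction": [5, 4, 4, 2, 3, 5],  # 1-5 rating
})

# Compare opinions across demographic groups
by_age = responses.groupby("age_group")["satisfaction"].mean()
print(by_age)
```

The same pattern extends to any demographic column you collect, such as industry, location, or education level.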

Creating a data analysis plan

Follow these steps to create your own data analysis plan.

Review your goals

When you plan a survey, you typically have specific goals in mind. That might be measuring customer sentiment, answering an academic question, or achieving another purpose.

If you’re beta testing a new product, your survey goal might be “find out how potential customers feel about the new product.” You probably came up with several topics you wanted to address, such as:

  • What is the typical experience with the product?
  • Which demographics are responding most positively? How well does this match with our idea of the target market?
  • Are there any specific pain points that need to be corrected before the product launches?
  • Are there any features that should be added before the product launches?

Use these objectives to organize your survey data.

Evaluate the results for your top questions

Your survey questions probably included at least one or two questions that directly relate to your primary goals. For example, in the beta testing example above, your top two questions might be:

  • How would you rate your overall satisfaction with the product?
  • Would you consider purchasing this product?

Those questions offer a general overview of how your customers feel. Whether their sentiments are generally positive, negative, or neutral, this is the main data your company needs. The next goal is to determine why the beta testers feel the way they do.

Assign questions to specific goals

Next, you’ll organize your survey questions and responses by which research question they answer. For example, you might assign questions to the “overall satisfaction” section, like:

  • How would you describe your experience with the product?
  • Did you encounter any problems while using the product?
  • What were your favorite/least favorite features?
  • How useful was the product in achieving your goals?

Under demographics, you’d include responses to questions like:

  • Education level

This helps you determine which questions and answers will answer larger questions, such as “which demographics are most likely to have had a positive experience?”

Pay special attention to demographics

Demographics are particularly important to a data analysis plan. Of course you’ll want to know what kind of experience your product testers are having with the product—but you also want to know who your target market should be. Separating responses based on demographics can be especially illuminating.

For example, you might find that users aged 25 to 45 find the product easier to use, but people over 65 find it too difficult. If you want to target the over-65 demographic, you can use that group’s survey data to refine the product before it launches.

Other demographic segregation can be helpful, too. You might find that your product is popular with people from the tech industry, who have an easier time with a user interface, while those from other industries, like education, struggle to use the tool effectively. If you’re targeting the tech industry, you may not need to make adjustments—but if it’s a technological tool designed primarily for educators, you’ll want to make appropriate changes.

Similarly, factors like location, education level, income bracket, and other demographics can help you compare experiences between the groups. Depending on your ultimate survey goals, you may want to compare multiple demographic types to get accurate insight into your results.

Consider correlation vs. causation

When creating your data analysis plan, remember to consider the difference between correlation and causation. For instance, being over 65 might correlate with a difficult user experience, but the cause of the experience might be something else entirely. You may find that your respondents over 65 are primarily from a specific educational background, or have issues reading the text in your user interface. It’s important to consider all the different data points, and how they might have an effect on the overall results.

Moving on to analysis

Once you’ve assigned survey questions to the overall research questions they’re designed to answer, you can move on to the actual data analysis. Depending on your survey tool, you may already have software that can perform quantitative and/or qualitative analysis. Choose the analysis types that suit your questions and goals, then use your analytic software to evaluate the data and create graphs or reports with your survey results.

At the end of the process, you should be able to answer your major research questions.

Power your data analysis with Voiceform

Once you have established your survey goals, Voiceform can power your data collection and analysis. Our feature-rich survey platform offers an easy-to-use interface, multi-channel survey tools, multimedia question types, and powerful analytics. We can help you create and work through a data analysis plan. Find out more about the product, and book a free demo today!


educational research techniques


Developing a Data Analysis Plan

It is extremely common for beginners, and perhaps even experienced researchers, to lose track of what they are trying to achieve when completing a research project. The open nature of research allows for a multitude of equally acceptable ways to complete a project, which can lead to an inability to make decisions and to stay on course.


Data Analysis Plan

A data analysis plan incorporates many features of a research project, with particular emphasis on mapping out how each research question will be answered and what is needed to answer it. Below is a sample template of the analysis plan.

[Figure: blank analysis plan template]

The majority of this diagram should be familiar to anyone who has done research. At the top, you state the problem: the overall focus of the paper. Next comes the purpose: the overarching goal of the research project.

After the purpose come the research questions. The research questions are questions about the problem that are answerable. People often struggle with developing clear and answerable research questions. It is critical that research questions are written so that they can be answered and that they are clearly derived from the problem. Poor questions mean poor, or even no, answers.

After the research questions, it is important to know what variables are available for the entire study and, specifically, which variables can be used to answer each research question. Lastly, you must indicate what analysis or visual you will develop in order to answer your research questions about your problem. This requires you to know how you will answer your research questions.

Below is an example of a completed analysis plan for a simple undergraduate-level research paper.

[Image: example of a completed analysis plan]

In the example above, the student wants to understand university students' perceptions of cafeteria food quality and their satisfaction with the university. There were four research questions: a demographic descriptive question, a descriptive question about the two main variables, a comparison question, and lastly a relationship question.

The variables available for answering the questions are listed on the left side. Under that, the student indicates the variables needed to answer each question. For example, the demographic variables of sex, class level, and major are needed to answer the question about the demographic profile.

The last section is the analysis. For the demographic profile, the student found the percentage of the population in each subgroup of the demographic variables.
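
Computing subgroup percentages like these takes only a few lines. A minimal sketch, using made-up responses for one hypothetical demographic variable:

```python
from collections import Counter

# Hypothetical survey responses for the demographic variable "class level".
class_levels = ["freshman", "sophomore", "freshman", "senior",
                "junior", "freshman", "sophomore", "senior"]

counts = Counter(class_levels)
total = len(class_levels)

# Percentage of respondents in each subgroup, as in the demographic profile.
profile = {level: round(100 * n / total, 1) for level, n in counts.items()}
print(profile)
```

The same pattern repeats for each demographic variable in the plan.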

A data analysis plan provides an excellent way to determine what needs to be done to complete a study. It also helps researchers clearly understand what they are trying to do and provides a visual for those with whom they want to communicate about the progress of the study.


Data Analysis Plan

With a data analysis plan, you will know what to do when you analyze the data you have gathered. It is one of the most essential tools for guiding appropriate data collection. You will want to make sure you create an effective plan so that the information you gather answers the questions you want to investigate. A good plan also saves time. Include only data that makes sense in your data analysis plan; otherwise, you may end up disappointed and feel that the work is worthless.

10+ Data Analysis Plan Examples

1. Data Analysis Plan Template
2. Survey Data Analysis Plan Template
3. Qualitative Data Analysis Plan Template
4. Scientific Data Analysis Plan
5. Standard Data Analysis Plan
6. Formative Data Analysis Plan
7. Observational Study Data Analysis Plan
8. Data Analysis Plan and Products
9. Summary of Data Analysis Plan
10. Professional Data Analysis Plan
11. National Data Analysis Plan

Data Analysis Plan Definition

A data analysis plan is a roadmap that tells you how to properly analyze and organize a particular set of data. It starts with three main objectives. First, answer your research questions. Second, use questions that are more specific, so that the answers are easy to understand. Third, segment respondents so you can compare their opinions with those of other groups.

Data Analysis Methods

Data analysts work with both qualitative data and quantitative data. Below are some of the most common methods used.

1. Regression Analysis

This is commonly used when you want to determine the relationship between variables. You look at the correlation between a dependent variable and one or more independent variables, aiming to estimate how an independent variable may impact the dependent variable. This is essential when you are going to make predictions and forecasts.
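
A minimal sketch of simple linear regression, using the closed-form least-squares estimates on made-up data (a real analysis would use a statistics package):

```python
# Fit y = a + b*x by least squares, then forecast an unseen x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]  # illustrative data, roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope: covariance of x and y divided by variance of x.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
a = mean_y - b * mean_x

# Predict (forecast) the outcome for a new value of x.
prediction = a + b * 6.0
```

The fitted slope here is about 1.99, so the model predicts roughly 12 when x = 6.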

2. Monte Carlo Simulation

Expect different outcomes whenever you make a decision. As individuals, we tend to weigh the pros against the cons, but we cannot easily tell which path to take without calculating the potential risks. In a Monte Carlo simulation, you generate a range of potential outcomes. It is usually used in risk analysis, where it gives you a better forecast of what might happen in the future.
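
A minimal sketch of a Monte Carlo simulation, assuming a hypothetical project whose three task durations are uncertain; all numbers are illustrative:

```python
import random

random.seed(42)  # fixed seed so the estimate is reproducible

def simulate_once():
    # Each task's duration is drawn from an assumed uniform range (days).
    design = random.uniform(5, 10)
    build = random.uniform(10, 20)
    test = random.uniform(3, 8)
    return design + build + test

# Repeat the simulation many times and estimate the probability
# that the project finishes within 30 days.
trials = 10_000
on_time = sum(1 for _ in range(trials) if simulate_once() <= 30)
probability = on_time / trials
```

Rather than a single best guess, the simulation yields a probability, which is what makes the method useful for risk analysis.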

3. Factor Analysis

This technique reduces a large number of observed variables to a smaller number of underlying factors. It works whenever multiple observable variables correlate with each other. It is useful because it can uncover hidden patterns, allowing you to explore concepts that are not easy to measure directly.

4. Cohort Analysis

A cohort analysis divides your users into small groups and monitors those groups over time. By examining their behavior, you can identify patterns across a customer's lifecycle. This is especially useful for businesses, because it lets them tailor their service to specific cohorts.
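
A minimal sketch of a cohort analysis on a hypothetical event log, grouping users into cohorts by signup month and computing one-month retention:

```python
from collections import defaultdict

# Hypothetical events: (user_id, signup_month, months_since_signup_when_active)
events = [
    ("u1", "2024-01", 0), ("u1", "2024-01", 1),
    ("u2", "2024-01", 0),
    ("u3", "2024-02", 0), ("u3", "2024-02", 1), ("u3", "2024-02", 2),
    ("u4", "2024-02", 0),
]

# Group users into cohorts by signup month, and note who was
# still active one month after signing up.
cohorts = defaultdict(set)
retained = defaultdict(set)
for user, month, offset in events:
    cohorts[month].add(user)
    if offset == 1:
        retained[month].add(user)

# Month-1 retention rate per cohort.
retention = {m: len(retained[m]) / len(cohorts[m]) for m in cohorts}
```

Comparing retention across cohorts is what reveals the lifecycle patterns described above.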

5. Cluster Analysis

This type of method identifies structures within a dataset. Its aim is to sort data into clusters whose members are similar to each other and dissimilar to members of other clusters. This helps you gain insight into how your data are distributed.
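
A minimal sketch of cluster analysis: a tiny k-means with k = 2 on one-dimensional made-up data (a real study would use a library such as scikit-learn):

```python
# Two obvious groups: values near 1 and values near 8.
points = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]
centers = [0.0, 10.0]  # initial guesses for the cluster centers

for _ in range(10):  # a few refinement passes
    # Assign each point to its nearest center...
    groups = [[], []]
    for p in points:
        nearest = min(range(2), key=lambda i: abs(p - centers[i]))
        groups[nearest].append(p)
    # ...then move each center to the mean of its group.
    centers = [sum(g) / len(g) for g in groups]
```

The centers converge to roughly 1.0 and 8.1, recovering the two groups in the data.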

6. Time Series Analysis

This is a statistical method used to identify trends. It measures the same variable at regular intervals in order to forecast how that variable will fluctuate in the future. Time series analysis looks for three main patterns: trends, seasonality, and cyclic patterns.
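
A minimal sketch of trend extraction with a centered 3-point moving average, which smooths short-term fluctuation so the underlying trend is easier to see; the monthly figures are illustrative:

```python
# Hypothetical monthly sales figures.
sales = [10, 12, 11, 13, 15, 14, 16]

# Centered 3-point moving average: each value is replaced by the mean
# of itself and its two neighbors (endpoints are dropped).
trend = [
    (sales[i - 1] + sales[i] + sales[i + 1]) / 3
    for i in range(1, len(sales) - 1)
]
```

Here the smoothed series rises steadily (11, 12, 13, 14, 15), even though the raw figures bounce up and down.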

7. Sentiment Analysis

There are insights you can learn from what other people write about you. Using sentiment analysis, you can sort and understand textual data. Its goal is to interpret the emotions conveyed in the data, which can tell you how other people feel about your brand or service.
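
A minimal sketch of lexicon-based sentiment scoring; the word lists are illustrative, and real sentiment analysis typically relies on trained models or much larger lexicons:

```python
# Tiny illustrative lexicons of positive and negative words.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "slow", "terrible", "poor"}

def sentiment(text: str) -> str:
    """Score text by counting positive minus negative words."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

For example, `sentiment("I love this great service")` returns `"positive"`, while a complaint about "terrible and slow" support scores negative.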

What do you mean by aspect-based sentiment analysis?

An aspect-based sentiment analysis determines the type of emotion a customer expresses about a particular feature of a product or campaign.

What is NLP?

NLP stands for Natural Language Processing. It is helpful in sentiment analysis because NLP systems are trained to associate inputs with outputs.

Why is identifying demographic groupings important?

It helps you understand the significance of your data and figure out what steps to take to improve.

There are many methods to choose from in a data analysis plan, but a good start is to familiarize yourself with the kind of data you have and with the insights that would be useful to the analysis. Having a good data plan can save your entire research project: think logically so you can avoid errors before they happen. Keep careful track of your data collection records; the moment you lose track of your variables and your data, your plan becomes useless.

Tool or Template

Data Analysis and Action Planning Templates

It’s critical to regularly analyze and act on data in order to make informed decisions and guide project implementation. To facilitate ongoing learning and project improvement, teams should hold regular data analysis and action planning meetings throughout the project cycle. These Data Analysis and Action Planning Templates can be used to plan for data analysis, document observations from analysis meetings, and document actions required to address issues that staff identify.

  • Published by: Switchboard, May 30, 2019
  • Target audience: Service providers
  • Sub-topics: Data Analysis, Monitoring and Evaluation, Project Design and M&E Planning
