
How to do a thematic analysis


What is a thematic analysis?


Thematic analysis is a broad term that describes an approach to analyzing qualitative data. This approach can encompass diverse methods and is usually applied to a collection of texts, such as survey responses and transcriptions of interviews or focus group discussions. Learn more about different research methods.

A researcher performing a thematic analysis will study a set of data to pinpoint repeating patterns, or themes, in the topics and ideas that are expressed in the texts.

In analyzing qualitative data, thematic analysis focuses on concepts, opinions, and experiences, as opposed to pure statistics. This requires an approach to data that is complex and exploratory and can be anchored by different philosophical and conceptual foundations.

A six-step system was developed to help establish clarity and rigor around this process, and it is this system that is most commonly used when conducting a thematic analysis. The six steps are:

  • Familiarization
  • Generating codes
  • Generating themes
  • Reviewing themes
  • Defining and naming themes
  • Creating the report

It is important to note that even though the six steps are listed in sequence, thematic analysis is not necessarily a linear process that advances forward in a one-way, predictable fashion from step one through step six. Rather, it involves a more fluid shifting back and forth between the phases, adjusting to accommodate new insights when they arise.

And arriving at insight is a key goal of this approach. A good thematic analysis doesn’t just seek to present or summarize data. It interprets and makes a statement about it; it extracts meaning from the data.

When is thematic analysis used?

Since thematic analysis is used to study qualitative data, it works best in cases where you’re looking to gather information about people’s views, values, opinions, experiences, and knowledge.

Some examples of research questions that thematic analysis can be used to answer are:

  • What are senior citizens’ experiences of long-term care homes?
  • How do women view social media sites as a tool for professional networking?
  • How do non-religious people perceive the role of the church in society?
  • What are financial analysts’ ideas and opinions about cryptocurrency?

To begin answering these questions, you would need to gather data from participants who can provide relevant responses. Once you have the data, you would then analyze and interpret it.

Because you’re dealing with personal views and opinions, there is a lot of room for flexibility in terms of how you interpret the data. In this way, thematic analysis is systematic but not purely scientific.

Braun and Clarke’s reflexive thematic analysis

A landmark 2006 paper by Virginia Braun and Victoria Clarke (“Using thematic analysis in psychology”) established parameters around thematic analysis—what it is and how to go about it in a systematic way—which had until then been widely used but poorly defined.

Since then, their work has been updated, with the name being revised, notably, to “reflexive thematic analysis.”

One common misconception that Braun and Clarke have taken pains to clarify about their work is that they do not believe that themes “emerge” from the data. To think otherwise is problematic since this suggests that meaning is somehow inherent to the data and that a researcher is merely an objective medium who identifies that meaning.

Instead, Braun and Clarke view analysis as an interpretive process in which the researcher is an active participant in constructing meaning, rather than simply identifying it.

The six stages they presented in their paper are still the benchmark for conducting a thematic analysis. They are presented below.

1. Familiarization

This step is where you take a broad, high-level view of your data, looking at it as a whole and taking note of your first impressions.

This typically involves reading through written survey responses and other texts, transcribing audio, and recording any patterns that you notice. It’s important to read through and revisit the data in its entirety several times during this stage so that you develop a thorough grasp of all your data.

2. Generating codes

After familiarizing yourself with your data, the next step is coding notable features of the data in a methodical way. This often means highlighting portions of the text and applying labels, aka codes, to them that describe the nature of their content.

In our example scenario, we’re researching the experiences of women over the age of 50 on professional networking social media sites. Interviews were conducted to gather data; the following excerpt is taken from one interview.

In the example interview snippet, portions have been highlighted and coded. The codes describe the idea or perception described in the text.

It pays to be exhaustive and thorough at this stage. Good practice involves scrutinizing the data several times, since new information and insights that didn’t jump out at first glance may become apparent upon further review. Multiple rounds of analysis also allow you to generate additional codes.

Once the text is thoroughly reviewed, it’s time to collate the data into groups according to their code.
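Collating can be sketched in a few lines of code. The following is a minimal Python illustration, assuming coded data is held as (code, excerpt) pairs; the codes and excerpt strings below are invented for this example and do not come from real interview data:

```python
from collections import defaultdict

# Hypothetical (code, excerpt) pairs produced during coding.
coded_excerpts = [
    ("uncertainty about skills", "I'm never sure I'm using the site right."),
    ("negative interactions", "Some of the replies I got were dismissive."),
    ("uncertainty about skills", "Everyone else seems to know the tricks."),
]

# Collate excerpts into groups keyed by their code.
by_code = defaultdict(list)
for code, excerpt in coded_excerpts:
    by_code[code].append(excerpt)

for code, excerpts in sorted(by_code.items()):
    print(f"{code}: {len(excerpts)} excerpt(s)")
```

Grouping the data this way makes it easy to see which codes recur, which is exactly the raw material needed when themes are generated in the next step.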

3. Generating themes

Now that we’ve created our codes, we can examine them, identify patterns within them, and begin generating themes.

Keep in mind that themes are more encompassing than codes. In general, you’ll be bundling multiple codes into a single theme.

To draw on the example we used above about women and networking through social media, codes could be combined into themes in the following way:

You’ll also be curating your codes and may elect to discard some on the basis that they are too broad or not directly relevant. You may also choose to redefine some of your codes as themes and integrate other codes into them. It all depends on the purpose and goal of your research.

4. Reviewing themes

This is the stage where we check that the themes we’ve generated accurately and relevantly represent the data they are based on. Once again, it’s beneficial to take a thorough, back-and-forth approach that includes review, assessment, comparison, and inquiry. The following questions can support the review:

  • Has anything been overlooked?
  • Are the themes definitively supported by the data?
  • Is there any room for improvement?

5. Defining and naming themes

With your final list of themes in hand, the next step is to name and define them.

In defining them, we want to nail down the meaning of each theme and, importantly, how it allows us to make sense of the data.

Once you have your themes defined, you’ll need to apply a concise and straightforward name to each one.

In our example, our “perceived lack of skills” theme may be adjusted to reflect that the texts expressed uncertainty about skills rather than a definitive absence of them. In this case, a more apt name for the theme might be “questions about competence.”

6. Creating the report

To finish the process, we put our findings down in writing. As with all scholarly writing, a thematic analysis should open with an introduction section that explains the research question and approach.

This is followed by a statement about the methodology that includes how data was collected and how the thematic analysis was performed.

Each theme is addressed in detail in the results section, with attention paid to the frequency and presence of the themes in the data, as well as what they mean, and with examples from the data included as supporting evidence.

The conclusion section describes how the analysis answers the research question and summarizes the key points.

In our example, the conclusion may assert that it is common for women over the age of 50 to have negative experiences on professional networking sites, and that these are often tied to interactions with other users and a sense that using these sites requires specialized skills.

The advantages and disadvantages of thematic analysis

Thematic analysis is useful for analyzing large data sets, and it allows a lot of flexibility in terms of designing theoretical and research frameworks. Moreover, it supports the generation and interpretation of themes that are backed by data.

There are times when thematic analysis is not the best approach to take because it can be highly subjective, and, in seeking to identify broad patterns, it can overlook nuance in the data.

What’s more, researchers must be judicious about reflecting on how their own position and perspective bears on their interpretations of the data and if they are imposing meaning that is not there or failing to pick up on meaning that is.

Thematic analysis offers a flexible and recursive way to approach qualitative data that has the potential to yield valuable insights about people’s opinions, views, and lived experience. It must be applied, however, in a conscientious fashion so as not to allow subjectivity to taint or obscure the results.

Frequently asked questions about thematic analysis

The purpose of thematic analysis is to find repeating patterns, or themes, in qualitative data. Thematic analysis can encompass diverse methods and is usually applied to a collection of texts, such as survey responses and transcriptions of interviews or focus group discussions. In analyzing qualitative data, thematic analysis focuses on concepts, opinions, and experiences, as opposed to pure statistics.

A big advantage of thematic analysis is that it allows a lot of flexibility in terms of designing theoretical and research frameworks. It also supports the generation and interpretation of themes that are backed by data.

A disadvantage of thematic analysis is that it can be highly subjective and can overlook nuance in the data. Also, researchers must be aware of how their own position and perspective influences their interpretations of the data and if they are imposing meaning that is not there or failing to pick up on meaning that is.

How many themes make sense in your thematic analysis depends, of course, on your topic and the material you are working with. In general, it makes sense to have no more than 6-10 broader themes, rather than many very detailed ones. You can then identify further nuances and differences under each theme as you dive deeper into the topic.

Since thematic analysis is used to study qualitative data, it works best in cases where you’re looking to gather information about people’s views, values, opinions, experiences, and knowledge. Therefore, it makes sense to use thematic analysis for interviews.

After familiarizing yourself with your data, the next step of a thematic analysis is coding notable features of the data in a methodical way. This often means highlighting portions of the text and applying labels, aka codes, to them that describe the nature of their content.


Grad Coach

What (Exactly) Is Thematic Analysis?

Plain-Language Explanation & Definition (With Examples)

By: Jenna Crosley (PhD). Expert Reviewed By: Dr Eunice Rautenbach | April 2021

Thematic analysis is one of the most popular qualitative analysis techniques we see students opting for at Grad Coach – and for good reason. Despite its relative simplicity, thematic analysis can be a very powerful analysis technique when used correctly. In this post, we’ll unpack thematic analysis using plain language (and loads of examples) so that you can conquer your analysis with confidence.

Thematic Analysis 101

  • Basic terminology relating to thematic analysis
  • What is thematic analysis
  • When to use thematic analysis
  • The main approaches to thematic analysis
  • The three types of thematic analysis
  • How to “do” thematic analysis (the process)
  • Tips and suggestions

First, the lingo…

Before we begin, let’s first lay down some terminology. When undertaking thematic analysis, you’ll make use of codes. A code is a label assigned to a piece of text, and the aim of using a code is to identify and summarise important concepts within a set of data, such as an interview transcript.

For example, if you had the sentence, “My rabbit ate my shoes”, you could use the codes “rabbit” or “shoes” to highlight these two concepts. The process of assigning codes is called qualitative coding. If this is a new concept to you, be sure to check out our detailed post about qualitative coding.
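In data terms, a code is nothing more than a label attached to a span of text. A minimal sketch of the rabbit example above, where the span boundaries are an illustrative assumption:

```python
sentence = "My rabbit ate my shoes"

# Each code is a label attached to a span of the source text.
codes = [
    {"label": "rabbit", "span": "My rabbit"},
    {"label": "shoes", "span": "my shoes"},
]

# Every coded span should be traceable back to the source text.
for code in codes:
    assert code["span"] in sentence
    print(f"{code['label']!r} -> {code['span']!r}")
```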

Codes are vital as they lay a foundation for themes. But what exactly is a theme? Simply put, a theme is a pattern that can be identified within a data set. In other words, it’s a topic or concept that pops up repeatedly throughout your data. Grouping your codes into themes serves as a way of summarising sections of your data in a useful way that helps you answer your research question(s) and achieve your research aim(s).

Alright – with that out of the way, let’s jump into the wonderful world of thematic analysis…


What is thematic analysis?

Thematic analysis is the study of patterns to uncover meaning. In other words, it’s about analysing the patterns and themes within your data set to identify the underlying meaning. Importantly, this process is driven by your research aims and questions, so it’s not necessary to identify every possible theme in the data, but rather to focus on the key aspects that relate to your research questions.

Although the research questions are a driving force in thematic analysis (and pretty much all analysis methods), it’s important to remember that these questions are not necessarily fixed. As thematic analysis tends to be a bit of an exploratory process, research questions can evolve as you progress with your coding and theme identification.

Thematic analysis is about analysing the themes within your data set to identify meaning, based on your research questions.

When should you use thematic analysis?

There are many potential qualitative analysis methods that you can use to analyse a dataset. For example, content analysis , discourse analysis , and narrative analysis are popular choices. So why use thematic analysis?

Thematic analysis is highly beneficial when working with large bodies of data, as it allows you to divide and categorise large amounts of data in a way that makes it easier to digest. Thematic analysis is particularly useful when looking for subjective information, such as a participant’s experiences, views, and opinions. For this reason, thematic analysis is often conducted on data derived from interviews, conversations, open-ended survey responses, and social media posts.

Your research questions can also give you an idea of whether you should use thematic analysis or not. For example, thematic analysis would suit research questions along the lines of:

  • How do dog walkers perceive rules and regulations on dog-friendly beaches?
  • What are students’ experiences with the shift to online learning?
  • What opinions do health professionals hold about the Hippocratic code?
  • How is gender constructed in a high school classroom setting?

These research questions all centre on the subjective experiences of participants and aim to assess experiences, views, and opinions. Therefore, thematic analysis presents a possible approach.

In short, thematic analysis is a good choice when you want to categorise large bodies of data (although the data doesn’t necessarily have to be large), particularly when you are interested in subjective experiences.

Thematic analysis allows you to divide and categorise large amounts of data in a way that makes it far easier to digest.

What are the main approaches?

Broadly speaking, there are two overarching approaches to thematic analysis: inductive and deductive. The approach you take will depend on what is most suitable in light of your research aims and questions. Let’s have a look at the options.

The inductive approach

The inductive approach involves deriving meaning and creating themes from data without any preconceptions. In other words, you’d dive into your analysis without any idea of what codes and themes will emerge, and thus allow these to emerge from the data.

For example, if you’re investigating typical lunchtime conversational topics in a university faculty, you’d enter the research without any preconceived codes, themes or expected outcomes. Of course, you may have thoughts about what might be discussed (e.g., academic matters because it’s an academic setting), but the objective is to not let these preconceptions inform your analysis.

The inductive approach is best suited to research aims and questions that are exploratory in nature, and cases where there is little existing research on the topic of interest.

The deductive approach

In contrast to the inductive approach, a deductive approach involves jumping into your analysis with a pre-determined set of codes. Usually, this approach is informed by prior knowledge and/or existing theory or empirical research (which you’d cover in your literature review).

For example, a researcher examining the impact of a specific psychological intervention on mental health outcomes may draw on an existing theoretical framework that includes concepts such as coping strategies, social support, and self-efficacy, using these as a basis for a set of pre-determined codes.

The deductive approach is best suited to research aims and questions that are confirmatory in nature, and cases where there is a lot of existing research on the topic of interest.

Regardless of whether you take the inductive or deductive approach, you’ll also need to decide what level of content your analysis will focus on – specifically, the semantic level or the latent level.

A semantic-level focus ignores the underlying meaning of data and identifies themes based only on what is explicitly or overtly stated or written – in other words, things are taken at face value.

In contrast, a latent-level focus concentrates on the underlying meanings and looks at the reasons for semantic content. Furthermore, in contrast to the semantic approach, a latent approach involves an element of interpretation, where data is not just taken at face value, but meanings are also theorised.

“But how do I know when to use what approach?”, I hear you ask.

Well, this all depends on the type of data you’re analysing and what you’re trying to achieve with your analysis. For example, if you’re aiming to analyse explicit opinions expressed in interviews and you know what you’re looking for ahead of time (based on a collection of prior studies), you may choose to take a deductive approach with a semantic-level focus.

On the other hand, if you’re looking to explore the underlying meaning expressed by participants in a focus group, and you don’t have any preconceptions about what to expect, you’ll likely opt for an inductive approach with a latent-level focus.

Simply put, the nature and focus of your research, especially your research aims, objectives and questions, will inform the approach you take to thematic analysis.

The main approaches to thematic analysis are inductive and deductive, each combined with either a semantic or a latent focus. The choice of approach depends on the type of data and what you’re trying to achieve.

What are the types of thematic analysis?

Now that you’ve got an understanding of the overarching approaches to thematic analysis, it’s time to have a look at the different types of thematic analysis you can conduct. Broadly speaking, there are three “types” of thematic analysis:

  • Reflexive thematic analysis
  • Codebook thematic analysis
  • Coding reliability thematic analysis

Let’s have a look at each of these:

Reflexive thematic analysis takes an inductive approach, letting the codes and themes emerge from the data. This type of thematic analysis is very flexible, as it allows researchers to change, remove, and add codes as they work through the data. As the name suggests, reflexive thematic analysis emphasizes the active engagement of the researcher in critically reflecting on their assumptions, biases, and interpretations, and how these may shape the analysis.

Reflexive thematic analysis typically involves iterative and reflexive cycles of coding, interpreting, and reflecting on data, with the aim of producing nuanced and contextually sensitive insights into the research topic, while at the same time recognising and addressing the subjective nature of the research process.

Codebook thematic analysis, on the other hand, lies at the opposite end of the spectrum. Taking a deductive approach, this type of thematic analysis makes use of structured codebooks containing clearly defined, predetermined codes. These codes are typically drawn from a combination of existing theory, empirical studies and prior knowledge of the situation.

Codebook thematic analysis aims to produce reliable and consistent findings. Therefore, it’s often used in studies where a clear and predefined coding framework is desired to ensure rigour and consistency in data analysis.

Coding reliability thematic analysis requires multiple coders and is specifically designed for research teams. With this type of analysis, codebooks are typically fixed and are rarely altered.

The benefit of this form of analysis is that it brings an element of intercoder reliability: coders need to agree on the codes used, which makes the outcome more rigorous because the element of subjectivity is reduced. In other words, multiple coders discuss which codes should and shouldn’t be used, and this consensus reduces the bias of having one individual coder decide on the themes.
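Intercoder agreement is often quantified, and one common measure is Cohen's kappa, which corrects raw agreement for chance. A rough sketch, assuming two coders have each assigned one code to the same ten excerpts; the codes and assignments are made up for illustration:

```python
from collections import Counter

# Hypothetical code assignments by two coders for the same ten excerpts.
coder_a = ["support", "support", "barrier", "barrier", "support",
           "barrier", "support", "support", "barrier", "support"]
coder_b = ["support", "barrier", "barrier", "barrier", "support",
           "barrier", "support", "support", "support", "support"]

n = len(coder_a)
# Raw (observed) agreement: fraction of excerpts coded identically.
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement: probability both coders pick the same code at random,
# based on each coder's own code frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum((freq_a[c] / n) * (freq_b[c] / n)
               for c in set(coder_a) | set(coder_b))

# Cohen's kappa: observed agreement corrected for chance.
kappa = (observed - expected) / (1 - expected)
print(f"observed agreement: {observed:.2f}, kappa: {kappa:.2f}")
```

A kappa near 1 indicates strong agreement beyond chance, while values around 0 suggest the coders agree no more often than random labelling would.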

Quick Recap: Thematic analysis approaches and types

To recap, the two main approaches to thematic analysis are inductive and deductive. Then we have the three types of thematic analysis: reflexive, codebook and coding reliability. Which type of thematic analysis you opt for will need to be informed by factors such as:

  • The approach you are taking. For example, if you opt for an inductive approach, you’ll likely utilise reflexive thematic analysis.
  • Whether you’re working alone or in a group. It’s likely that, if you’re doing research as part of your postgraduate studies, you’ll be working alone. This means that you’ll need to choose between reflexive and codebook thematic analysis.

Now that we’ve covered the “what” in terms of thematic analysis approaches and types, it’s time to look at the “how” of thematic analysis.


How to “do” thematic analysis

At this point, you’re ready to get going with your analysis, so let’s dive right into the thematic analysis process. Keep in mind that what we’ll cover here is a generic process, and the relevant steps will vary depending on the approach and type of thematic analysis you opt for.

Step 1: Get familiar with the data

The first step in your thematic analysis involves getting a feel for your data and seeing what general themes pop up. If you’re working with audio data, this is where you’ll do the transcription, converting audio to text.

At this stage, you’ll want to form preliminary thoughts about what you’ll code and which codes will accurately describe your content. It’s a good idea to revisit your research topic and your aims and objectives at this stage. For example, if you’re looking at what people feel about different types of dogs, you can code according to when different breeds are mentioned (e.g., border collie, Labrador, corgi) and when certain feelings/emotions are brought up.

As a general tip, it’s a good idea to keep a reflexivity journal. This is where you’ll write down how you coded your data, why you coded it in that particular way, and what the outcomes of this coding are. Using a reflexivity journal from the start will benefit you greatly in the final stages of your analysis, because you can reflect on the coding process and assess whether you have coded in a reliable manner and whether your codes and themes support your findings.

As you can imagine, a reflexivity journal helps to increase reliability as it allows you to analyse your data systematically and consistently. If you choose to make use of a reflexivity journal, this is the stage where you’ll want to take notes about your initial codes and list them in your journal so that you’ll have an idea of what exactly is being reflected in your data. At a later stage in the analysis, this data can be more thoroughly coded, or the identified codes can be divided into more specific ones.

Keep a research journal for thematic analysis

Step 2: Search for patterns or themes in the codes

Step 2! You’re going strong. In this step, you’ll want to look out for patterns or themes in your codes. Moving from codes to themes is not necessarily a smooth or linear process. As you become more and more familiar with the data, you may find that you need to assign different codes or themes according to new elements you find. For example, if you were analysing a text about wildlife, you may come across the codes “pigeon”, “canary” and “budgerigar”, which can fall under the theme of birds.
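The birds example can be sketched as a simple code-to-theme mapping; the mapping below is an illustrative assumption, not a prescribed structure:

```python
# Each code is assigned to a broader theme (mapping is illustrative).
code_to_theme = {
    "pigeon": "birds",
    "canary": "birds",
    "budgerigar": "birds",
}

# Codes observed in the data roll up into a much smaller set of themes.
codes_found = ["pigeon", "canary", "budgerigar", "pigeon"]
themes = {code_to_theme[c] for c in codes_found}
print(themes)
```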

As you work through the data, you may start to identify subthemes, which are subdivisions of themes that focus specifically on an aspect within the theme that is significant or relevant to your research question. For example, if your theme is a university, your subthemes could be faculties or departments at that university.

In this stage of the analysis, your reflexivity journal entries need to reflect how codes were interpreted and combined to form themes.

Step 3: Review themes

By now you’ll have a good idea of your codes, themes, and potentially subthemes. Now it’s time to review all the themes you’ve identified. In this step, you’ll want to check that everything you’ve categorised as a theme actually fits the data, that the themes do indeed exist in the data, and that no themes are missing, so that you can move on to the next step knowing you’ve coded all your themes accurately and comprehensively. If you find that a theme has become too broad and there is far too much information under it, it may be useful to split it into more themes so that you can be more specific with your analysis.

In your reflexivity journal, you’ll want to write about how you understood the themes and how they are supported by evidence, as well as how the themes fit in with your codes. At this point, you’ll also want to revisit your research questions and make sure that the data and themes you’ve identified are directly relevant to these questions.

If you find that your themes have become too broad and there is too much information under one theme, you can split them up into more themes, so that you can be more specific with your analysis.

Step 4: Finalise Themes

By this point, your analysis will really start to take shape. In the previous step, you reviewed and refined your themes, and now it’s time to label and finalise them. It’s important to note here that, just because you’ve moved on to the next step, it doesn’t mean you can’t go back and revise or rework your themes. In contrast to the previous step, finalising your themes means spelling out exactly what the themes consist of and describing them in detail. If you struggle with this, you may want to return to your data to make sure that your data and coding do represent the themes, and to check whether you need to divide your themes into more themes (i.e., return to step 3).

When you name your themes, make sure that you select labels that accurately encapsulate the properties of the theme. For example, a theme name such as “enthusiasm in professionals” leaves the question of “who are the professionals?”, so you’d want to be more specific and label the theme as something along the lines of “enthusiasm in healthcare professionals”.

It is very important at this stage that you make sure that your themes align with your research aims and questions. When you’re finalising your themes, you’re also nearing the end of your analysis and need to keep in mind that your final report (discussed in the next step) will need to fit in with the aims and objectives of your research.

In your reflexivity journal, you’ll want to write down a few sentences describing your themes and how you decided on them. Here, you’ll also want to mention how each theme will contribute to the outcomes of your research, and what it means in relation to your research questions and the focus of your research.

By the end of this stage, you’ll be done with your themes – meaning it’s time to write up your findings and produce a report.

It is very important at the theme finalisation stage to make sure that your themes align with your research questions.

Step 5: Produce your report

You’re nearly done! Now that you’ve analysed your data, it’s time to report on your findings. A typical thematic analysis report consists of:

  • An introduction
  • A methodology section
  • Your results and findings
  • A conclusion

When writing your report, make sure that you provide enough information for a reader to be able to evaluate the rigour of your analysis. In other words, the reader needs to know the exact process you followed when analysing your data and why. The questions of “what”, “how”, “why”, “who”, and “when” may be useful in this section.

So, what did you investigate? How did you investigate it? Why did you choose this particular method? Who does your research focus on, and who are your participants? When did you conduct your research, when did you collect your data, and when was the data produced? Your reflexivity journal will come in handy here as within it you’ve already labelled, described, and supported your themes.

If you’re undertaking a thematic analysis as part of a dissertation or thesis, this discussion will be split across your methodology, results and discussion chapters. For more information about those chapters, check out our detailed post about dissertation structure.

It’s absolutely vital that, when writing up your results, you back up every single one of your findings with quotations. The reader needs to be able to see that what you’re reporting actually exists within the results. Also make sure that, when reporting your findings, you tie them back to your research questions. You don’t want your reader to be looking through your findings and asking, “So what?”, so make sure that every finding you present is relevant to your research topic and questions.

Quick Recap: How to “do” thematic analysis

Getting familiar with your data: Here you’ll read through your data and get a general overview of what you’re working with. At this stage, you may identify a few general codes and themes that you’ll make use of in the next step.

Search for patterns or themes in your codes: Here you’ll dive into your data and pick out the themes and codes relevant to your research question(s).

Review themes: In this step, you’ll revisit your codes and themes to make sure that they are all truly representative of the data, and that you can use them in your final report.

Finalise themes: Here’s where you “solidify” your analysis and make it report-ready by describing and defining your themes.

Produce your report: This is the final step of your thematic analysis process, where you put everything you’ve found together and report on your findings.

Tips & Suggestions

In the video below, we share 6 time-saving tips and tricks to help you approach your thematic analysis as effectively and efficiently as possible.

Wrapping Up

In this article, we’ve covered the basics of thematic analysis – what it is, when to use it, the different approaches and types of thematic analysis, and how to perform a thematic analysis.

If you have any questions about thematic analysis, drop a comment below and we’ll do our best to assist. If you’d like 1-on-1 support with your thematic analysis, be sure to check out our research coaching services.


Thematic Analysis – A Guide with Examples

Published by Alvin Nicolas on August 16th, 2021; revised on August 29, 2023

Thematic analysis is one of the most important types of analysis used for qualitative data. When researchers have to analyse audio or video transcripts, they often prefer thematic analysis. A researcher needs to look closely at the content to identify the context and the message conveyed by the speaker.

Thematic analysis also helps simplify complex data.

Importance of Thematic Analysis

Thematic analysis offers several unique and practical features. It is used because:

  • It is flexible.
  • It works well for complex data sets.
  • It applies to qualitative data sets.
  • It is less complex than many other analytical approaches.

Researchers often prefer thematic analysis because of its effectiveness in research.

How to Conduct a Thematic Analysis?

In any research, if your data and procedure are clear, it will be easier for your reader to understand how you reached your conclusions. This adds considerable clarity to your research.

Understand the Data

This is the first step of your thematic analysis. At this stage, you have to understand the data set. You need to read the entire data set rather than just a small portion of it. If the data is not in textual form, you will have to transcribe it.

Example: Suppose you are studying an adult dating website. First, build a data corpus. Read and re-read the data and consider several profiles; this will give you an idea of how adults represent themselves on dating sites. You may collect profiles such as the following:

I am a tall, single(widowed), easy-going, honest, good listener with a good sense of humor. Being a handyperson, I keep busy working around the house, and I also like to follow my favourite hockey team on TV or spoil my two granddaughters when I get the chance!! Enjoy most music except Rap! I keep fit by jogging, walking, and bicycling (at least three times a week). I have travelled to many places and RVD the South-West U.S., but I would now like to find that special travel partner to do more travel to warm and interesting countries. I now feel it’s time to meet a nice, kind, honest woman who has some of the same interests as I do; to share the happy times, quiet times, and adventures together

I enjoy photography, lapidary & seeking collectibles in the form of classic movies & 33 1/3, 45 & 78 RPM recordings from the 1920s, ’30s & ’40s. I am retired & looking forward to travelling to Canada, the USA, the UK & Europe, China. I am unique since I do not judge a book by its cover. I accept people for who they are. I will not demand or request perfection from anyone until I am perfect, so I guess that means everyone is safe. My musical tastes range from Classical, big band era, early jazz, classic ’50s & 60’s rock & roll & country since its inception.

Development of Initial Coding:

At this stage, you have to do the coding. It’s an essential step of your research. You have two options: you can code manually or use a software tool. Software such as NVivo is commonly used for computer-assisted coding.

For manual coding, you can follow the steps given below:

  • Write the data out in a proper format so that it is easier to proceed.
  • Use a highlighter to mark all the essential points in the data.
  • Note as many points as possible.
  • Take careful notes at this stage.
  • Apply as many themes as possible.
  • Check for themes that share the same pattern or concept.
  • Merge all identical themes into a single one.
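
The last three points, applying codes, spotting those that share a concept, and merging duplicates into one, can be sketched in a few lines of Python. The labels, aliases, and excerpts below are invented for illustration; in practice they would come from your own highlighted data.

```python
from collections import defaultdict

# Hypothetical manual-coding bookkeeping: near-duplicate labels are
# normalised so that codes with the same meaning merge into one.
raw_codes = [
    ("enjoys travel", "would now like to find that special travel partner"),
    ("likes travelling", "looking forward to travelling to Canada, the USA"),
    ("music taste", "Enjoy most music except Rap!"),
    ("music taste", "My musical tastes range from Classical to country"),
]

# Map alternative spellings/phrasings to a single canonical label
aliases = {"likes travelling": "enjoys travel"}

merged = defaultdict(list)
for label, excerpt in raw_codes:
    merged[aliases.get(label, label)].append(excerpt)

for code, excerpts in sorted(merged.items()):
    print(f"{code}: {len(excerpts)} excerpt(s)")
```

Running this collapses the four raw labels into two codes, each holding two excerpts.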

Example: For better understanding, the example from Step 1 is continued here. You can observe the coded profiles below:

Make Themes

At this stage, you have to create the themes. Themes should be categorised based on the codes: all previously generated codes should be turned into themes, and with the help of the codes, sub-themes can also be created. This process is usually done with visuals so that a reader can grasp the structure at first glance.

Extracted Data Review

Now take an in-depth look at all the assigned themes again. Check whether the themes are organised properly. Be careful and focused, because you need to verify their coherence. If the themes are not coherent, revise them. You can also reshape the data so that the themes and the dataset align.

For better understanding, a mind-mapping example is given here:

Extracted Data

Reviewing all the Themes Again

You need to review the themes after coding them. At this stage, you can work with your themes in more detail: break larger themes into smaller ones, or combine similar themes into a single theme. This stage involves two sub-steps.

First, examine the coded data separately so that you have a precise view. If the assigned themes follow the dataset, that is fine; otherwise, you may have to rearrange the data again to achieve coherence in the coded data.

Corpus Data

Second, reconsider all the corpus data. Check how the themes are arranged, and use visuals to examine the relationships between them. If things are not in order, revisit the previous steps to refine the process; otherwise, move on to the next step. Make sure that all the themes are satisfactory and that nothing remains confusing.

When these two sub-steps are completed, you need to make a more precise mind map. An example following the previous cases is given below:

Corpus Data

Define all the Themes here

Now you have to define all the themes you have assigned to your data set. Recheck them carefully: if some of them fit into one concept, keep them, and eliminate the irrelevant ones. Definitions should be precise and clear, with no ambiguity. Also consider your main idea and check that all the themes align with it; this can change the concept for you.

Theme names should give any reader a clear idea of your findings. They should not contradict your thematic analysis; everything should be organised accurately.


Also, read about discourse analysis, content analysis, and survey research; we have provided comprehensive guides.

Make a Report

At this stage, you need to produce the final report of all your findings. Include the dataset, findings, and every aspect of your analysis.

While writing the final report, do not forget to consider your audience. For instance, whether you are writing for a newsletter, a journal, or public awareness, your report should be tailored accordingly. It should be concise and logical, and it should not be repetitive. You can use references to other relevant sources as evidence to support your discussion.

Frequently Asked Questions

What is meant by thematic analysis?

Thematic Analysis is a qualitative research method that involves identifying, analyzing, and interpreting recurring themes or patterns in data. It aims to uncover underlying meanings, ideas, and concepts within the dataset, providing insights into participants’ perspectives and experiences.


How to do thematic analysis

Last updated: 8 February 2023

Reviewed by Miroslav Damyanov

Uncovering themes in data requires a systematic approach. Thematic analysis organizes data so you can easily recognize the context.

  • What is thematic analysis?

Thematic analysis is a method for analyzing qualitative data that involves reading through a data set and looking for patterns to derive themes. The researcher's subjective experience plays a central role in finding meaning within the data.


  • What are the main approaches to thematic analysis?

Inductive thematic analysis approach

Inductive thematic analysis entails deriving meaning and identifying themes from data with no preconceptions. You analyze the data without any expected outcomes.

Deductive thematic analysis approach

In the deductive approach, you analyze data with a set of expected themes. Prior knowledge, research, or existing theory informs this approach.

Semantic thematic analysis approach

With the semantic approach, you ignore the underlying meaning of the data. You identify themes at face value, based on what is written or explicitly stated.

Latent thematic analysis approach

Unlike the semantic approach, the latent approach focuses on underlying meanings in data and looks at the reasons for semantic content. It involves an element of interpretation where you theorize meanings and don’t just take data at face value.

  • When should thematic analysis be used?

Thematic analysis is beneficial when you’re working with large bodies of data. It allows you to divide and categorize huge quantities of data in a way that makes it far easier to digest.  

The following scenarios warrant the use of thematic analysis:

You’re new to qualitative analysis

You need to identify patterns in data

You want to involve participants in the process

Thematic analysis is particularly useful when you’re looking for subjective information, such as experiences and opinions, in surveys, interviews, conversations, or social media posts.

  • What are the advantages and disadvantages of thematic analysis?

Thematic analysis is a highly flexible approach to qualitative data analysis that you can modify to meet the needs of many studies. It enables you to generate new insights and concepts from data. 

Beginner researchers who are just learning how to analyze data will find thematic analysis very accessible. It’s easy for most people to grasp and can be relatively quick to learn.

The flexibility of thematic analysis can also be a disadvantage. It can feel intimidating to decide what’s important to emphasize, as there are many ways to interpret meaning from a data set.

  • What is the step-by-step process for thematic analysis?

The basic thematic analysis process requires recognizing codes and themes within a data set. A code is a label assigned to a piece of data that you use to identify and summarize important concepts within a data set. A theme is a pattern that you identify within the data. Relevant steps may vary based on the approach and type of thematic analysis, but these are the general steps you’d take:

1. Familiarize yourself with the data (pre-coding work)

Before you can successfully work with data, you need to understand it. Get a feel for the data to see what general themes pop up. Transcribe audio files and observe any meanings and patterns across the data set. Read through the transcript, and jot down notes about potential codes to create. 

2. Create the initial codes (open code work)

Create a set of initial codes to represent the patterns and meanings in the data. Make a codebook to keep track of the codes. Read through the data again to identify interesting excerpts and apply the appropriate codes. You should use the same code to represent excerpts with the same meaning. 
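
As a minimal sketch, a codebook can be a plain mapping from each code to a short definition plus the excerpts it has been applied to. The code names, definitions, and excerpts here are invented for illustration:

```python
# Hypothetical codebook: each code keeps a definition and the excerpts
# it has been applied to, so the same code is reused for excerpts
# with the same meaning.
codebook = {
    "wait_times": {
        "definition": "Participant mentions delays in receiving care",
        "excerpts": [],
    },
    "staff_kindness": {
        "definition": "Participant comments on friendliness of staff",
        "excerpts": [],
    },
}

def apply_code(codebook, code, excerpt):
    """Attach an excerpt to an existing code in the codebook."""
    codebook[code]["excerpts"].append(excerpt)

apply_code(codebook, "wait_times", "I sat in the waiting room for two hours.")
apply_code(codebook, "wait_times", "Getting an appointment took three months.")
apply_code(codebook, "staff_kindness", "The nurse was incredibly patient.")
```

Keeping definitions next to excerpts makes it easy to check, on a second read, that a code is being applied consistently.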

3. Collate codes with supporting data (clustering of initial codes)

Now it's time to group all excerpts associated with a particular code. If you’re doing this manually, cut out codes and put them together. Thematic analysis software will automatically collate them.
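
What collation software does here amounts to a simple group-by over (code, excerpt) pairs. A rough sketch, with invented coded excerpts:

```python
from collections import defaultdict

# Hypothetical (code, excerpt) pairs gathered from several transcripts
coded_excerpts = [
    ("cost_concerns", "It was just too expensive for us."),
    ("ease_of_use", "The app was simple to figure out."),
    ("cost_concerns", "I couldn't justify the monthly fee."),
]

# Collate: gather every excerpt that shares a code
collated = defaultdict(list)
for code, excerpt in coded_excerpts:
    collated[code].append(excerpt)

# collated["cost_concerns"] now holds both cost-related excerpts
```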

4. Group codes into themes (clustering of selective codes)

Once you’ve finalized the codes, you can sort them into potential themes. Themes reflect trends and patterns in data. You can combine some codes to create sub-themes.

5. Review, revise, and finalize the themes (final revision)

Now that you’ve decided upon the initial themes, you can review and adjust them as needed. Each theme should be distinct, with enough data to support it. You can merge similar themes and remove those lacking sufficient supporting data. Begin formulating themes into a narrative.

6. Write the report

The final step of telling the story of a set of data is writing the report. You should fully consider the themes to communicate the validity of your analysis.

A typical thematic analysis report contains the following:

An introduction

A methodology section

Results and findings

A conclusion

Your narrative must be coherent, and it should include vivid quotes that can back up points. It should also include an interpretive analysis and argument for your claims. In addition, consider reporting your findings in a flowchart or tree diagram, which can be independent of or part of your report.  

In conclusion, a thematic analysis is a method of analyzing qualitative data. By following the six steps, you will identify common themes from a large set of texts. This method can help you find rich and useful insights about people’s experiences, behaviors, and nuanced opinions.

  • How to analyze qualitative data

Qualitative data analysis is the process of organizing, analyzing, and interpreting non-numerical and subjective data . The goal is to capture themes and patterns, answer questions, and identify the best actions to take based on that data. 

Researchers can use qualitative data to understand people’s thoughts, feelings, and attitudes. For example, qualitative researchers can help business owners draw reliable conclusions about customers’ opinions and discover areas that need improvement. 

In addition to thematic analysis, you can analyze qualitative data using the following:

Content analysis

Content analysis examines and counts the presence of certain words, subjects, and contexts in documents and communication artifacts, such as: 

Text in various formats

This method transforms qualitative input into quantitative data. You can do it manually or with electronic tools that recognize patterns to make connections between concepts.  
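
The manual version of this counting can be done in a few lines. The document text and term list below are invented for illustration:

```python
import re
from collections import Counter

# Hypothetical document and target terms for a simple content analysis
document = ("Patients mentioned cost often; cost and access "
            "came up repeatedly, but quality was rarely discussed.")
terms = {"cost", "access", "quality"}

# Tokenise to lowercase words, then count only the target terms
words = re.findall(r"[a-z]+", document.lower())
counts = Counter(w for w in words if w in terms)

print(counts["cost"], counts["access"], counts["quality"])  # prints: 2 1 1
```

This is the sense in which content analysis turns qualitative input into quantitative data: the output is a frequency table rather than an interpretation.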

Narrative analysis

Narrative analysis interprets research participants' stories from testimonials, case studies, interviews, and other text or visual data. It provides valuable insights into the complexity of people's feelings, beliefs, and behaviors.

Discourse analysis

In discourse analysis , you analyze the underlying meaning of qualitative data in a particular context, including: 

Historical 

This approach allows us to study how people use language in text, audio, and video to unravel social issues, power dynamics, or inequalities. 

For example, you can look at how people communicate with their coworkers versus their bosses. Discourse analysis goes beyond the literal meaning of words to examine social reality.

Grounded theory analysis

In grounded theory analysis, you develop theories by examining real-world data. The process involves creating hypotheses and theories by systematically collecting and evaluating this data. While this approach is helpful for studying lesser-known phenomena, it might be overwhelming for a novice researcher. 

  • Challenges with analyzing qualitative data

While qualitative data can answer questions that quantitative data can't, it still comes with challenges.

If done manually, qualitative data analysis is very time-consuming.

It can be hard to choose a method. 

Avoiding bias is difficult.

Human error affects accuracy and consistency.

To overcome these challenges, you should fine-tune your methods by using the appropriate tools in collaboration with teammates.


What is thematic analysis in qualitative research?

Thematic analysis is a method of analyzing qualitative data. It is applied to texts, such as interviews or transcripts. The researcher closely examines the data to identify common patterns and themes.

Can thematic analysis be done manually?

You can do thematic analysis manually, but it is very time-consuming without the help of software.

What are the two types of thematic analysis?

The two main types of thematic analysis include codebook thematic analysis and reflexive thematic analysis.

Codebook thematic analysis uses predetermined codes and structured codebooks to analyze from a deductive perspective. You draw codes from a review of the data or an initial analysis to produce the codebooks.

Reflexive thematic analysis is more flexible and does not use a codebook. Researchers can change, remove, and add codes as they work through the data. 

What makes a good thematic analysis?

The goal of thematic analysis is more than simply summarizing data; it's about identifying important themes. Good thematic analysis interprets, makes sense of data, and explains it. It produces trustworthy and insightful findings that are easy to understand and apply. 

What are examples of themes in thematic analysis?

Grouping codes into themes summarizes sections of data in a useful way, helping to answer research questions and achieve objectives. A theme identifies an area of the data and tells the reader something about it. A good theme can stand alone without requiring descriptive text beneath it.

For example, if you were analyzing data on wildlife, codes might be owls, hawks, and falcons. These codes might fall beneath the theme of birds of prey. If your data were about the latest trends for teenage girls, codes such as mini skirts, leggings, and distressed jeans would fall under fashion.  
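
The wildlife and fashion examples above can be expressed directly as a code-to-theme lookup, then inverted to list the codes grouped beneath each theme:

```python
from collections import defaultdict

# Code-to-theme mapping taken from the examples in the text
theme_of = {
    "owls": "birds of prey",
    "hawks": "birds of prey",
    "falcons": "birds of prey",
    "mini skirts": "fashion",
    "leggings": "fashion",
    "distressed jeans": "fashion",
}

# Invert the mapping: theme -> list of codes grouped beneath it
themes = defaultdict(list)
for code, theme in theme_of.items():
    themes[theme].append(code)

print(sorted(themes["birds of prey"]))  # prints: ['falcons', 'hawks', 'owls']
```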

Thematic analysis is straightforward and intuitive enough that most people have no trouble applying it.



Practical thematic analysis: a guide for multidisciplinary health services research teams engaging in qualitative analysis

  • Catherine H Saunders , scientist and assistant professor 1 2 ,
  • Ailyn Sierpe , research project coordinator 2 ,
  • Christian von Plessen , senior physician 3 ,
  • Alice M Kennedy , research project manager 2 4 ,
  • Laura C Leviton , senior adviser 5 ,
  • Steven L Bernstein , chief research officer 1 ,
  • Jenaya Goldwag , resident physician 1 ,
  • Joel R King , research assistant 2 ,
  • Christine M Marx , patient associate 6 ,
  • Jacqueline A Pogue , research project manager 2 ,
  • Richard K Saunders , staff physician 1 ,
  • Aricca Van Citters , senior research scientist 2 ,
  • Renata W Yen , doctoral student 2 ,
  • Glyn Elwyn , professor 2 ,
  • JoAnna K Leyenaar , associate professor 1 2
  • on behalf of the Coproduction Laboratory
  • 1 Dartmouth Health, Lebanon, NH, USA
  • 2 Dartmouth Institute for Health Policy and Clinical Practice, Geisel School of Medicine at Dartmouth College, Lebanon, NH, USA
  • 3 Center for Primary Care and Public Health (Unisanté), Lausanne, Switzerland
  • 4 Jönköping Academy for Improvement of Health and Welfare, School of Health and Welfare, Jönköping University, Jönköping, Sweden
  • 5 Highland Park, NJ, USA
  • 6 Division of Public Health Sciences, Department of Surgery, Washington University School of Medicine, St Louis, MO, USA
  • Correspondence to: C H Saunders catherine.hylas.saunders{at}dartmouth.edu
  • Accepted 26 April 2023

Qualitative research methods explore and provide deep contextual understanding of real world issues, including people’s beliefs, perspectives, and experiences. Whether through analysis of interviews, focus groups, structured observation, or multimedia data, qualitative methods offer unique insights in applied health services research that other approaches cannot deliver. However, many clinicians and researchers hesitate to use these methods, or might not use them effectively, which can leave relevant areas of inquiry inadequately explored. Thematic analysis is one of the most common and flexible methods to examine qualitative data collected in health services research. This article offers practical thematic analysis as a step-by-step approach to qualitative analysis for health services researchers, with a focus on accessibility for patients, care partners, clinicians, and others new to thematic analysis. Along with detailed instructions covering three steps of reading, coding, and theming, the article includes additional novel and practical guidance on how to draft effective codes, conduct a thematic analysis session, and develop meaningful themes. This approach aims to improve consistency and rigor in thematic analysis, while also making this method more accessible for multidisciplinary research teams.

Through qualitative methods, researchers can provide deep contextual understanding of real world issues, and generate new knowledge to inform hypotheses, theories, research, and clinical care. Approaches to data collection are varied, including interviews, focus groups, structured observation, and analysis of multimedia data, with qualitative research questions aimed at understanding the how and why of human experience. 1 2 Qualitative methods produce unique insights in applied health services research that other approaches cannot deliver. In particular, researchers acknowledge that thematic analysis is a flexible and powerful method of systematically generating robust qualitative research findings by identifying, analysing, and reporting patterns (themes) within data. 3 4 5 6 Although qualitative methods are increasingly valued for answering clinical research questions, many researchers are unsure how to apply them or consider them too time consuming to be useful in responding to practical challenges 7 or pressing situations such as public health emergencies. 8 Consequently, researchers might hesitate to use them, or use them improperly. 9 10 11

Although much has been written about how to perform thematic analysis, practical guidance for non-specialists is sparse. 3 5 6 12 13 In the multidisciplinary field of health services research, qualitative data analysis can confound experienced researchers and novices alike, which can stoke concerns about rigor, particularly for those more familiar with quantitative approaches. 14 Since qualitative methods are an area of specialisation, support from experts is beneficial. However, because non-specialist perspectives can enhance data interpretation and enrich findings, there is a case for making thematic analysis easier, more rapid, and more efficient, 8 particularly for patients, care partners, clinicians, and other stakeholders. A practical guide to thematic analysis might encourage those on the ground to use these methods in their work, unearthing insights that would otherwise remain undiscovered.

Given the need for more accessible qualitative analysis approaches, we present a simple, rigorous, and efficient three step guide for practical thematic analysis. We include new guidance on the mechanics of thematic analysis, including developing codes, constructing meaningful themes, and hosting a thematic analysis session. We also discuss common pitfalls in thematic analysis and how to avoid them.

Summary points

Qualitative methods are increasingly valued in applied health services research, but multidisciplinary research teams often lack accessible step-by-step guidance and might struggle to use these approaches

A newly developed approach, practical thematic analysis, uses three simple steps: reading, coding, and theming

Based on Braun and Clarke’s reflexive thematic analysis, our streamlined yet rigorous approach is designed for multidisciplinary health services research teams, including patients, care partners, and clinicians

This article also provides companion materials including a slide presentation for teaching practical thematic analysis to research teams, a sample thematic analysis session agenda, a theme coproduction template for use during the session, and guidance on using standardised reporting criteria for qualitative research

In their seminal work, Braun and Clarke developed a six phase approach to reflexive thematic analysis. 4 12 We built on their method to develop practical thematic analysis ( box 1 , fig 1 ), which is a simplified and instructive approach that retains the substantive elements of their six phases. Braun and Clarke’s phase 1 (familiarising yourself with the dataset) is represented in our first step of reading. Phase 2 (coding) remains as our second step of coding. Phases 3 (generating initial themes), 4 (developing and reviewing themes), and 5 (refining, defining, and naming themes) are represented in our third step of theming. Phase 6 (writing up) also occurs during this third step of theming, but after a thematic analysis session. 4 12

Key features and applications of practical thematic analysis

Step 1: reading.

All manuscript authors read the data

All manuscript authors write summary memos

Step 2: Coding

Coders perform both data management and early data analysis

Codes are complete thoughts or sentences, not categories

Step 3: Theming

Researchers host a thematic analysis session and share different perspectives

Themes are complete thoughts or sentences, not categories

Applications

For use by practicing clinicians, patients and care partners, students, interdisciplinary teams, and those new to qualitative research

When important insights from healthcare professionals are inaccessible because they do not have qualitative methods training

When time and resources are limited

Fig 1

Steps in practical thematic analysis

  • Download figure
  • Open in new tab
  • Download powerpoint

We present linear steps, but as qualitative research is usually iterative, so too is thematic analysis. 15 Qualitative researchers circle back to earlier work to check whether their interpretations still make sense in the light of additional insights, adapting as necessary. While we focus here on the practical application of thematic analysis in health services research, we recognise our approach exists in the context of the broader literature on thematic analysis and the theoretical underpinnings of qualitative methods as a whole. For a more detailed discussion of these theoretical points, as well as other methods widely used in health services research, we recommend reviewing the sources outlined in supplemental material 1. A strong and nuanced understanding of the context and underlying principles of thematic analysis will allow for higher quality research. 16

Practical thematic analysis is a highly flexible approach that can draw out valuable findings and generate new hypotheses, including in cases with a lack of previous research to build on. The approach can also be used with a variety of data, such as transcripts from interviews or focus groups, patient encounter transcripts, professional publications, observational field notes, and online activity logs. Importantly, successful practical thematic analysis is predicated on having high quality data collected with rigorous methods. We do not describe qualitative research design or data collection here. 11 17

In supplemental material 1, we summarise the foundational methods, concepts, and terminology in qualitative research. Along with our guide below, we include a companion slide presentation for teaching practical thematic analysis to research teams in supplemental material 2. We provide a theme coproduction template for teams to use during thematic analysis sessions in supplemental material 3. Our method aligns with the major qualitative reporting frameworks, including the Consolidated Criteria for Reporting Qualitative Research (COREQ). 18 We indicate the corresponding step in practical thematic analysis for each COREQ item in supplemental material 4.

Familiarisation and memoing

We encourage all manuscript authors to review the full dataset (eg, interview transcripts) to familiarise themselves with it. This task is most critical for those who will later be engaged in the coding and theming steps. Although time consuming, it is the best way to involve team members in the intellectual work of data interpretation, so that they can contribute to the analysis and contextualise the results. If this task is not feasible given time limitations or large quantities of data, the data can be divided across team members. In this case, each piece of data should be read by at least two individuals who ideally represent different professional roles or perspectives.

We recommend that researchers reflect on the data and independently write memos, defined as brief notes on thoughts and questions that arise during reading, and a summary of their impressions of the dataset. 2 19 Memoing is an opportunity to gain insights from varying perspectives, particularly from patients, care partners, clinicians, and others. It also gives researchers the opportunity to begin to scope which elements of and concepts in the dataset are relevant to the research question.

Data saturation

The concept of data saturation ( box 2 ) is a foundation of qualitative research. It is defined as the point in analysis at which new data tend to be redundant of data already collected. 21 Qualitative researchers are expected to report their approach to data saturation. 18 Because thematic analysis is iterative, the team should discuss saturation throughout the entire process, beginning with data collection and continuing through all steps of the analysis. 22 During step 1 (reading), team members might discuss data saturation in the context of summary memos. Conversations about saturation continue during step 2 (coding), with confirmation that saturation has been achieved during step 3 (theming). As a rule of thumb, researchers can often achieve saturation in 9-17 interviews or 4-8 focus groups, but this will vary depending on the specific characteristics of the study. 23

Data saturation in context

Braun and Clarke discourage the use of data saturation to determine sample size (eg, number of interviews), because it assumes that there is an objective truth to be captured in the data (sometimes known as a positivist perspective). 20 Qualitative researchers often try to avoid positivist approaches, arguing that there is no one true way of seeing the world, and will instead aim to gather multiple perspectives. 5 Although this theoretical debate within qualitative methods is important, we recognise that a priori estimates of saturation are often needed, particularly for investigators newer to qualitative research who might want a more pragmatic and applied approach. In addition, saturation based sample size estimation can be particularly helpful in grant proposals. However, researchers should still follow a priori sample size estimation with a discussion confirming that saturation has been achieved.

Definition of coding

We describe codes as labels for concepts in the data that are directly relevant to the study objective. Historically, the purpose of coding was to distil the large amount of data collected into conceptually similar buckets so that researchers could review it in aggregate and identify key themes. 5 24 We advocate for a more analytical approach than is typical with thematic analysis. With our method, coding is both the foundation for and the beginning of thematic analysis—that is, early data analysis, management, and reduction occur simultaneously rather than as different steps. This approach moves the team more efficiently towards being able to describe themes.

Building the coding team

Coders are the research team members who directly assign codes to the data, reading all material and systematically labelling relevant data with appropriate codes. Ideally, at least two researchers would code every discrete data document, such as one interview transcript. 25 If this task is not possible, individual coders can each code a subset of the data that is carefully selected for key characteristics (sometimes known as purposive selection). 26 When using this approach, we recommend that at least 10% of data be coded by two or more coders to ensure consistency in codebook application. For practical reasons, and to maintain that consistency, we also recommend coding teams of no more than four to five people.

Clinicians, patients, and care partners bring unique perspectives to coding and enrich the analytical process. 27 Therefore, we recommend choosing coders with a mix of relevant experiences so that they can challenge and contextualise each other’s interpretations based on their own perspectives and opinions ( box 3 ). We recommend including both coders who collected the data and those who are naive to it, if possible, given their different perspectives. We also recommend all coders review the summary memos from the reading step so that key concepts identified by those not involved in coding can be integrated into the analytical process. In practice, this review means coding the memos themselves and discussing them during the code development process. This approach ensures that the team considers a diversity of perspectives.

Coding teams in context

The recommendation to use multiple coders is a departure from Braun and Clarke. 28 29 When the views, experiences, and training of each coder (sometimes known as positionality) 30 are carefully considered, having multiple coders can enhance interpretation and enrich findings. When these perspectives are combined in a team setting, researchers can create shared meaning from the data. Along with the practical consideration of distributing the workload, 31 inclusion of these multiple perspectives increases the overall quality of the analysis by mitigating the impact of any one coder’s perspective. 30

Coding tools

Qualitative analysis software facilitates coding and managing large datasets but does not perform the analytical work. The researchers must perform the analysis themselves. Most programs support queries and collaborative coding by multiple users. 32 Important factors to consider when choosing software can include accessibility, cost, interoperability, the look and feel of code reports, and the ease of colour coding and merging codes. Coders can also use low tech solutions, including highlighters, word processors, or spreadsheets.
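As one illustration of such a low tech solution, coded excerpts kept in a spreadsheet can be turned into a simple code report with a few lines of a general purpose language. This is a hypothetical sketch (the column names, codes, and excerpts are invented), not a tool prescribed by the method:

```python
import csv
import io
from collections import defaultdict

# Low tech coding in a spreadsheet: one row per coded excerpt.
# In practice this text would come from a saved .csv file.
rows = """transcript,code,excerpt
T01,Participants worry about side effects,"I was scared of a reaction"
T02,Participants worry about side effects,"My arm was sore for days"
T01,Participants trust their clinician,"My doctor said it was safe"
"""

# A simple "code report": every excerpt grouped under its code,
# mirroring the code reports produced by qualitative software.
report = defaultdict(list)
for row in csv.DictReader(io.StringIO(rows)):
    report[row["code"]].append((row["transcript"], row["excerpt"]))

for code, excerpts in report.items():
    print(code, len(excerpts))
```

Grouping excerpts this way makes it straightforward to review all data tagged with a given code, which becomes useful again during the theming step.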

Drafting effective codes

To draft effective codes, we recommend that the coders review each document line by line. 33 As they progress, they can assign codes to segments of data representing passages of interest. 34 Coders can also assign multiple codes to the same passage. Consensus among coders on what constitutes a minimum or maximum amount of text for assigning a code is helpful. As a general rule, meaningful segments of text for coding are shorter than one paragraph, but longer than a few words. Coders should keep the study objective in mind when determining which data are relevant ( box 4 ).

Code types in context

Similar to Braun and Clarke’s approach, practical thematic analysis does not specify whether codes are based on what is evident from the data (sometimes known as semantic) or whether they are based on what can be inferred at a deeper level from the data (sometimes known as latent). 4 12 35 It also does not specify whether they are derived from the data (sometimes known as inductive) or determined ahead of time (sometimes known as deductive). 11 35 Instead, it should be noted that health services researchers conducting qualitative studies often adopt all these approaches to coding (sometimes known as hybrid analysis). 3

In practical thematic analysis, codes should be more descriptive than general categorical labels that simply group data with shared characteristics. At a minimum, codes should form a complete (or full) thought. An easy way to conceptualise full thought codes is as complete sentences with subjects and verbs ( table 1 ), although full sentence coding is not always necessary. With full thought codes, researchers think about the data more deeply and capture this insight in the codes. This coding facilitates the entire analytical process and is especially valuable when moving from codes to broader themes. Experienced qualitative researchers often intuitively use full thought or sentence codes, but this practice has not been explicitly articulated as a path to higher quality coding elsewhere in the literature. 6

Example transcript with codes used in practical thematic analysis 36


Depending on the nature of the data, codes might either fall into flat categories or be arranged hierarchically. Flat categories are most common when the data deal with topics on the same conceptual level. In other words, one topic is not a subset of another topic. By contrast, hierarchical codes are more appropriate for concepts that naturally fall above or below each other. Hierarchical coding can also be a useful form of data management and might be necessary when working with a large or complex dataset. 5 Codes grouped into these categories can also make it easier to naturally transition into generating themes from the initial codes. 5 These decisions between flat versus hierarchical coding are part of the work of the coding team. In both cases, coders should ensure that their code structures are guided by their research questions.
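The difference between flat and hierarchical code structures can be illustrated with simple data structures. A minimal sketch, using invented full thought code labels rather than codes from any real study:

```python
# Flat codebook: all codes sit on the same conceptual level.
flat_codes = [
    "Participants worry about vaccine side effects",
    "Participants trust their own clinician's advice",
    "Participants find appointment scheduling burdensome",
]

# Hierarchical codebook: more specific codes nest under a broader concept.
hierarchical_codes = {
    "Participants describe barriers to care": [
        "Participants find appointment scheduling burdensome",
        "Participants report transport difficulties",
    ],
    "Participants describe facilitators of care": [
        "Participants trust their own clinician's advice",
    ],
}

# Walking the hierarchy yields every child code with its parent context.
for parent, children in hierarchical_codes.items():
    for child in children:
        print(f"{parent} -> {child}")
```

In either structure the full thought form of each code is preserved; the hierarchy only adds a layer of data management on top.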

Developing the codebook

A codebook is a shared document that lists code labels and comprehensive descriptions for each code, as well as examples observed within the data. Good code descriptions are precise and specific so that coders can consistently assign the same codes to relevant data or articulate why another coder would do so. Codebook development is iterative and involves input from the entire coding team. However, as those closest to the data, coders must resist influence, real or perceived, from team members with conflicting opinions; in particular, it is important to mitigate the risk that more senior researchers, such as principal investigators, exert undue influence on the coders’ perspectives.
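The three elements of a codebook entry (label, description, example) can be sketched as a simple record structure. Everything below is a hypothetical illustration of the format, not content from the study:

```python
# A minimal codebook representation: one entry per code, with a label,
# a precise description of when to apply it, and an example excerpt.
codebook = [
    {
        "label": "Participants worry about vaccine side effects",
        "description": (
            "Apply when a participant expresses concern, fear, or "
            "hesitancy about adverse effects of vaccination."
        ),
        "example": "I kept putting it off because of what I had read.",
    },
    {
        "label": "Participants trust their own clinician's advice",
        "description": (
            "Apply when a participant describes their clinician's "
            "recommendation as a deciding factor."
        ),
        "example": "My doctor said it was safe, so I went ahead.",
    },
]

# A quick consistency check: every entry has all three fields filled in.
for entry in codebook:
    assert entry["label"] and entry["description"] and entry["example"]
```

Keeping descriptions precise enough to pass a check like this, conceptually if not literally, is what allows different coders to apply the same codes consistently.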

In practical thematic analysis, coders begin codebook development by independently coding a small portion of the data, such as two to three transcripts or other units of analysis. Coders then individually produce their initial codebooks. This task will require them to reflect on, organise, and clarify codes. The coders then meet to reconcile the draft codebooks, which can often be difficult, as some coders tend to lump several concepts together while others will split them into more specific codes. Discussing disagreements and negotiating consensus are necessary parts of early data analysis. Once the codebook is relatively stable, we recommend soliciting input on the codes from all manuscript authors. Yet, coders must ultimately be empowered to finalise the details so that they are comfortable working with the codebook across a large quantity of data.

Assigning codes to the data

After developing the codebook, coders will use it to assign codes to the remaining data. While the codebook’s overall structure should remain constant, coders might continue to add codes corresponding to any new concepts observed in the data. If new codes are added, coders should review the data they have already coded and determine whether the new codes apply. Qualitative data analysis software can be useful for editing or merging codes.

We recommend that coders periodically compare their code occurrences ( box 5 ), with more frequent check-ins if substantial disagreements occur. In the event of large discrepancies in the codes assigned, coders should revise the codebook to ensure that code descriptions are sufficiently clear and comprehensive to support coding alignment going forward. Because coding is an iterative process, the team can adjust the codebook as needed. 5 28 29
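A periodic comparison of code occurrences can be as simple as tallying each coder's assignments and flagging divergences. A hedged sketch with hypothetical coders and shortened code labels:

```python
from collections import Counter

# Hypothetical code assignments by two coders on the same transcript.
coder_a = ["barriers", "barriers", "trust", "side effects"]
coder_b = ["barriers", "trust", "trust", "side effects"]

counts_a, counts_b = Counter(coder_a), Counter(coder_b)

# Flag codes whose occurrence counts diverge between coders; large
# discrepancies would prompt a check-in and possibly a codebook revision.
all_codes = set(counts_a) | set(counts_b)
discrepancies = {
    code: (counts_a[code], counts_b[code])
    for code in sorted(all_codes)
    if counts_a[code] != counts_b[code]
}
print(discrepancies)  # → {'barriers': (2, 1), 'trust': (1, 2)}
```

As the surrounding text notes, such counts serve only as a proxy for alignment between coders, not as a result to report.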

Quantitative coding in context

Researchers should generally avoid reporting code counts in thematic analysis. However, counts can be a useful proxy for maintaining alignment between coders on key concepts. 26 In practice, therefore, researchers should make sure that all coders working on the same piece of data assign the same codes with a similar pattern and that their memoing and overall assessment of the data are aligned. 37 The frequency of a code alone, however, is not an indicator of its importance. It is more important that coders agree on the most salient points in the data; reviewing and discussing summary memos can be helpful here. 5

Researchers might disagree on whether to calculate and report inter-rater reliability. We note that quantitative tests for agreement, such as kappa statistics or intraclass correlation coefficients, can be distracting and might not provide meaningful results in qualitative analyses. Similarly, Braun and Clarke argue that expecting perfect alignment on coding is inconsistent with the goal of co-constructing meaning. 28 29 Overall consensus on codes’ salience and contributions to themes is the most important factor.
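For readers who want to see what such an agreement statistic involves, Cohen's kappa for two coders making yes/no decisions about a single code can be computed as below. The decisions are hypothetical, and, per the discussion above, reporting such statistics is optional and can distract from consensus on salience:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    labels = set(rater1) | set(rater2)
    # Observed agreement: proportion of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if the raters assigned labels independently.
    p_e = sum(
        (rater1.count(label) / n) * (rater2.count(label) / n)
        for label in labels
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no (1/0) decisions on whether one code applies
# to each of ten excerpts.
r1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
r2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(r1, r2), 2))  # → 0.58
```

A value like 0.58 reflects moderate agreement beyond chance, but, as argued above, it says nothing about whether the coders agree on what matters in the data.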

Definition of themes

Themes are meta-constructs that rise above codes and unite the dataset ( box 6 , fig 2 ). They should be clearly evident, repeated throughout the dataset, and relevant to the research questions. 38 While codes are often explicit descriptions of the content in the dataset, themes are usually more conceptual and knit the codes together. 39 Some researchers hypothesise that theme development is loosely described in the literature because qualitative researchers simply intuit themes during the analytical process. 39 In practical thematic analysis, we offer a concrete process that should make developing meaningful themes straightforward.

Themes in context

According to Braun and Clarke, a theme “captures something important about the data in relation to the research question and represents some level of patterned response or meaning within the data set.” 4 Similarly, Braun and Clarke advise against themes as domain summaries. While different approaches can draw out themes from codes, the process begins by identifying patterns. 28 35 Like Braun and Clarke and others, we recommend that researchers consider the salience of certain themes, their prevalence in the dataset, and their keyness (ie, how relevant the themes are to the overarching research questions). 4 12 34

Fig 2

Use of themes in practical thematic analysis

Constructing meaningful themes

After coding all the data, each coder should independently reflect on the team’s summary memos (step 1), the codebook (step 2), and the coded data itself to develop draft themes (step 3). It can be illuminating for coders to review all excerpts associated with each code, so that they derive themes directly from the data. Researchers should remain focused on the research question during this step, so that themes have a clear relation with the overall project aim. Use of qualitative analysis software will make it easy to view each segment of data tagged with each code. Themes might neatly correspond to groups of codes. Or, more likely, they will unite codes and data in unexpected ways. A whiteboard or presentation slides might be helpful to organise, craft, and revise themes. We also provide a template for coproducing themes (supplemental material 3). As with codebook development, team members will ideally produce individual drafts of the themes that they have identified in the data. They can then discuss these with the group and reach alignment or consensus on the final themes.

The team should ensure that all themes are salient, meaning that they are: supported by the data, relevant to the study objectives, and important. Similar to codes, themes are framed as complete thoughts or sentences, not categories. While codes and themes might appear to be similar to each other, the key distinction is that the themes represent a broader concept. Table 2 shows examples of codes and their corresponding themes from a previously published project that used practical thematic analysis. 36 Identifying three to four key themes that comprise a broader overarching theme is a useful approach. Themes can also have subthemes, if appropriate. 40 41 42 43 44

Example codes with themes in practical thematic analysis 36

Thematic analysis session

After each coder has independently produced draft themes, a carefully selected subset of the manuscript team meets for a thematic analysis session ( table 3 ). The purpose of this session is to discuss and reach alignment or consensus on the final themes. We recommend a session of three to five hours, held in person or virtually.

Example agenda of thematic analysis session

The composition of the thematic analysis session team is important, as each person’s perspectives will shape the results. This group is usually a small subset of the broader research team, with three to seven individuals. We recommend that primary and senior authors work together to include people with diverse experiences related to the research topic. They should aim for a range of personalities and professional identities, particularly those of clinicians, trainees, patients, and care partners. At a minimum, all coders and primary and senior authors should participate in the thematic analysis session.

The session begins with each coder presenting their draft themes with supporting quotes from the data. 5 Through respectful and collaborative deliberation, the group will develop a shared set of final themes.

One team member facilitates the session. A firm, confident, and consistent facilitation style with good listening skills is critical. For practical reasons, this person is not usually one of the primary coders. Hierarchies in teams cannot be entirely flattened, but acknowledging them and appointing an external facilitator can reduce their impact. The facilitator can ensure that all voices are heard. For example, they might ask for perspectives from patient partners or more junior researchers, and follow up on comments from senior researchers to say, “We have heard your perspective and it is important; we want to make sure all perspectives in the room are equally considered.” Or, “I hear [senior person] is offering [x] idea, I’d like to hear other perspectives in the room.” The role of the facilitator is critical in the thematic analysis session. The facilitator might also privately discuss with more senior researchers, such as principal investigators and senior authors, the importance of being aware of their influence over others and respecting and eliciting the perspectives of more junior researchers, such as patients, care partners, and students.

To our knowledge, this discrete thematic analysis session is a novel contribution of practical thematic analysis. It helps efficiently incorporate diverse perspectives using the session agenda and theme coproduction template (supplemental material 3) and makes the process of constructing themes transparent to the entire research team.

Writing the report

We recommend beginning the results narrative with a summary of all relevant themes emerging from the analysis, followed by a subheading for each theme. Each subsection begins with a brief description of the theme and is illustrated with relevant quotes, which are contextualised and explained. The write-up should not simply be a list, but should contain meaningful analysis and insight from the researchers, including descriptions of how different stakeholders might have experienced a particular situation differently or unexpectedly.

In addition to weaving quotes into the results narrative, quotes can be presented in a table. This strategy is particularly helpful when submitting to clinical journals with tight word count limitations. Quote tables might also be effective in illustrating areas of agreement and disagreement across stakeholder groups, with columns representing different groups and rows representing each theme or subtheme. Quotes should include an anonymous label for each participant and any relevant characteristics, such as role or gender. The aim is to produce rich descriptions. 5 We recommend against repeating quotations across multiple themes in the report, so as to avoid confusion. The template for coproducing themes (supplemental material 3) allows documentation of quotes supporting each theme, which might also be useful during report writing.

Visual illustrations such as a thematic map or figure of the findings can help communicate themes efficiently. 4 36 42 44 If a figure is not possible, a simple list can suffice. 36 Either way, the main themes and any subthemes should be clearly presented. Thematic figures can facilitate confirmation that the researchers’ interpretations reflect the study populations’ perspectives (sometimes known as member checking), because authors can invite discussions about the figure and descriptions of findings and supporting quotes. 46 This process can enhance the validity of the results. 46

In supplemental material 4, we provide additional guidance on reporting thematic analysis consistent with COREQ. 18 Commonly used in health services research, COREQ outlines a standardised list of items to be included in qualitative research reports ( box 7 ).

Reporting in context

We note that use of COREQ or any other reporting guideline does not in itself produce high quality work and should not be used as a substitute for general methodological rigour. Rather, researchers must consider rigour throughout the entire research process. As the issue of how to conceptualise and achieve rigorous qualitative research continues to be debated, 47 48 we encourage researchers to explicitly discuss how they have addressed methodological rigour in their reports. Specifically, we point researchers to Braun and Clarke’s 2021 tool for evaluating thematic analysis manuscripts for publication (“Twenty questions to guide assessment of TA [thematic analysis] research quality”). 16

Avoiding common pitfalls

Awareness of common mistakes can help researchers avoid improper use of qualitative methods. Improper use can, for example, prevent researchers from developing meaningful themes and can risk drawing inappropriate conclusions from the data. Braun and Clarke also warn of poor quality in qualitative research, noting that “coherence and integrity of published research does not always hold.” 16

Weak themes

An important distinction between high and low quality themes is that high quality themes are descriptive and complete thoughts. As such, they often contain subjects and verbs, and can be expressed as full sentences ( table 2 ). Themes that are simply descriptive categories or topics could fail to impart meaningful knowledge beyond categorisation. 16 49 50

Researchers will often move from coding directly to writing up themes, without performing the work of theming or hosting a thematic analysis session. Skipping concerted theming often results in themes that look more like categories than unifying threads across the data.

Unfocused analysis

Because data collection for qualitative research is often semi-structured (eg, interviews, focus groups), not all data will be directly relevant to the research question at hand. To avoid unfocused analysis and a correspondingly unfocused manuscript, we recommend that all team members keep the research objective in front of them at every stage, from reading to coding to theming. During the thematic analysis session, we recommend that the research question be written on a whiteboard so that all team members can refer back to it, and so that the facilitator can ensure that conversations about themes occur in the context of this question. Consistently focusing on the research question can help to ensure that the final report directly answers it, as opposed to the many other interesting insights that might emerge during the qualitative research process. Such insights can be picked up in a secondary analysis if desired.

Inappropriate quantification

Presenting findings quantitatively (eg, “We found 18 instances of participants mentioning safety concerns about the vaccines”) is generally undesirable in practical thematic analysis reporting. 51 Descriptive terms are more appropriate (eg, “participants had substantial concerns about the vaccines,” or “several participants were concerned about this”). This descriptive presentation is critical because qualitative data might not be consistently elicited across participants, meaning that some individuals might share certain information while others do not, simply based on how conversations evolve. Additionally, qualitative research does not aim to draw inferences outside its specific sample. Emphasising numbers in thematic analysis can lead to readers incorrectly generalising the findings. Although peer reviewers unfamiliar with thematic analysis often request this type of quantification, practitioners of practical thematic analysis can confidently defend their decision to avoid it. If quantification is methodologically important, we recommend simultaneously conducting a survey or incorporating standardised interview techniques into the interview guide. 11

Neglecting group dynamics

Researchers should deliberately consider group dynamics within the research team. Particular attention should be paid to power relations and the personalities of team members: who most often speaks, who defines concepts, and who resolves disagreements that might arise within the group. 52

The perspectives of patients and care partners are particularly important to cultivate. Ideally, patient partners are meaningfully embedded in studies from start to finish, not just for practical thematic analysis. 53 Meaningful engagement can build trust, which makes it easier for patient partners to ask questions, request clarification, and share their perspectives. Professional team members should actively encourage patient partners by emphasising that their expertise is critically important and valued. Noting when a patient partner might be best positioned to offer their perspective can be particularly powerful.

Insufficient time allocation

Researchers must allocate enough time to complete thematic analysis. Working with qualitative data takes time, especially because it is often not a linear process. As the strength of thematic analysis lies in its ability to make use of the rich details and complexities of the data, we recommend careful planning for the time required to read and code each document.

Estimating the necessary time can be challenging. For step 1 (reading), researchers can roughly calculate the time required based on the time needed to read and reflect on one piece of data. For step 2 (coding), the total amount of time needed can be extrapolated from the time needed to code one document during codebook development. We also recommend three to five hours for the thematic analysis session itself, although coders will need to independently develop their draft themes beforehand. Although the time required for practical thematic analysis is variable, teams should be able to estimate their own required effort with these guidelines.
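The extrapolation described above amounts to simple arithmetic. A minimal sketch, with every number a hypothetical placeholder for a team's own pilot measurements:

```python
# Back-of-the-envelope time estimate for practical thematic analysis,
# extrapolated from pilot timings as described above.
n_transcripts = 20
hours_to_read_one = 0.75     # measured on one pilot transcript (step 1)
hours_to_code_one = 1.5      # measured during codebook development (step 2)
coders_per_transcript = 2    # two coders per document, as recommended
session_hours = 4            # thematic analysis session (3-5 h suggested)

reading_hours = n_transcripts * hours_to_read_one
coding_hours = n_transcripts * hours_to_code_one * coders_per_transcript
total_hours = reading_hours + coding_hours + session_hours
print(total_hours)  # → 79.0
```

Even a rough total like this makes the resource commitment visible early, before the team is midway through coding.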

Discussion

Practical thematic analysis builds on the foundational work of Braun and Clarke. 4 16 We have reframed their six phase process into three condensed steps of reading, coding, and theming. While we have maintained important elements of Braun and Clarke’s reflexive thematic analysis, we believe that practical thematic analysis is conceptually simpler and easier to teach to less experienced researchers and non-researcher stakeholders. For teams with different levels of familiarity with qualitative methods, this approach presents a clear roadmap to the reading, coding, and theming of qualitative data. Our practical thematic analysis approach promotes efficient, experiential learning by doing. 12 29 Practical thematic analysis avoids the risk of relying on complex descriptions of methods and theory and places more emphasis on obtaining meaningful insights from those close to real world clinical environments. Although practical thematic analysis can be used to perform intensive theory based analyses, it lends itself more readily to accelerated, pragmatic approaches.

Strengths and limitations

Our approach is designed to smooth the qualitative analysis process and yield high quality themes. Yet, researchers should note that poorly performed analyses will still produce low quality results. Practical thematic analysis is a qualitative analytical approach; it does not address study design, data collection, or other important elements of qualitative research. It also might not be the right choice for every qualitative research project. We recommend it for applied health services research questions, where diverse perspectives and simplicity might be valuable.

We also urge researchers to improve internal validity through triangulation methods, such as member checking (supplemental material 1). 46 Member checking could include soliciting input on high level themes, theme definitions, and quotations from participants. This approach might increase rigour.

Implications

We hope that by providing clear and simple instructions for practical thematic analysis, a broader range of researchers will be more inclined to use these methods. Increased transparency and familiarity with qualitative approaches can enhance researchers’ ability both to interpret qualitative studies and to offer new findings themselves. It can also be useful in training and reporting. A major strength of this approach is that it facilitates meaningful inclusion of patient and care partner perspectives, because their lived experiences can be particularly valuable in data interpretation and the resulting findings. 11 30 As clinicians are especially pressed for time, they might also appreciate a practical set of instructions that can be immediately used to leverage their insights and access to patients and clinical settings, and increase the impact of qualitative research through timely results. 8

Practical thematic analysis is a simplified approach to performing thematic analysis in health services research, a field where the experiences of patients, care partners, and clinicians are of inherent interest. We hope that it will be accessible to those new to qualitative methods, including patients, care partners, clinicians, and other health services researchers. We intend to empower multidisciplinary research teams to explore unanswered questions and make new, important, and rigorous contributions to our understanding of clinical care and health systems.

Acknowledgments

All members of the Coproduction Laboratory provided input that shaped this manuscript during laboratory meetings. We acknowledge advice from Elizabeth Carpenter-Song, an expert in qualitative methods.

Coproduction Laboratory group contributors: Stephanie C Acquilano ( http://orcid.org/0000-0002-1215-5531 ), Julie Doherty ( http://orcid.org/0000-0002-5279-6536 ), Rachel C Forcino ( http://orcid.org/0000-0001-9938-4830 ), Tina Foster ( http://orcid.org/0000-0001-6239-4031 ), Megan Holthoff, Christopher R Jacobs ( http://orcid.org/0000-0001-5324-8657 ), Lisa C Johnson ( http://orcid.org/0000-0001-7448-4931 ), Elaine T Kiriakopoulos, Kathryn Kirkland ( http://orcid.org/0000-0002-9851-926X ), Meredith A MacMartin ( http://orcid.org/0000-0002-6614-6091 ), Emily A Morgan, Eugene Nelson, Elizabeth O’Donnell, Brant Oliver ( http://orcid.org/0000-0002-7399-622X ), Danielle Schubbe ( http://orcid.org/0000-0002-9858-1805 ), Gabrielle Stevens ( http://orcid.org/0000-0001-9001-178X ), Rachael P Thomeer ( http://orcid.org/0000-0002-5974-3840 ).

Contributors: Practical thematic analysis, an approach designed for multidisciplinary health services teams new to qualitative research, was based on CHS’s experiences teaching thematic analysis to clinical teams and students. We have drawn heavily from qualitative methods literature. CHS is the guarantor of the article. CHS, AS, CvP, AMK, JRK, and JAP contributed to drafting the manuscript. AS, JG, CMM, JAP, and RWY provided feedback on their experiences using practical thematic analysis. CvP, LCL, SLB, AVC, GE, and JKL advised on qualitative methods in health services research, given extensive experience. All authors meaningfully edited the manuscript content, including AVC and RKS. The corresponding author attests that all listed authors meet authorship criteria and that no others meeting the criteria have been omitted.

Funding: This manuscript did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Competing interests: All authors have completed the ICMJE uniform disclosure form at https://www.icmje.org/disclosure-of-interest/ and declare: no support from any organisation for the submitted work; no financial relationships with any organisations that might have an interest in the submitted work in the previous three years; no other relationships or activities that could appear to have influenced the submitted work.

Provenance and peer review: Not commissioned; externally peer reviewed.

  • A Hybrid Approach to Thematic Analysis in Qualitative Research: Using a Practical Example. 2018. https://methods.sagepub.com/case/hybrid-approach-thematic-analysis-qualitative-research-a-practical-example
  • Vindrola-Padros C. Rapid Ethnographies: A Practical Guide. Cambridge University Press 2021. https://play.google.com/store/books/details?id=n80HEAAAQBAJ
  • Padgett DK. Qualitative and Mixed Methods in Public Health. SAGE Publications 2011. https://play.google.com/store/books/details?id=LcYgAQAAQBAJ
  • Guest G, Namey EE, Mitchell ML. Collecting Qualitative Data: A Field Manual for Applied Research. SAGE 2013. https://play.google.com/store/books/details?id=-3rmWYKtloC
  • Yen RW, Schubbe D, Walling L, et al. Patient engagement in the What Matters Most trial: experiences and future implications for research. Poster presented at International Shared Decision Making conference, Quebec City, Canada. July 2019.
  • Got questions about Thematic Analysis? We have prepared some answers to common ones. https://www.thematicanalysis.net/faqs/ (accessed 9 Nov 2022).
  • Braun V, Clarke V. Thematic Analysis. SAGE Publications. 2022. https://uk.sagepub.com/en-gb/eur/thematic-analysis/book248481
  • Understanding Thematic Analysis. https://www.thematicanalysis.net/understanding-ta/



Nursing Open, v.6(3); 2019 Jul

Qualitative thematic analysis based on descriptive phenomenology

Annelie J. Sundler

1 Faculty of Caring Science, Work Life and Social Welfare, University of Borås, Borås, Sweden

Elisabeth Lindberg

Christina Nilsson, Lina Palmér

The aim of this paper was to discuss how to understand and undertake thematic analysis based on descriptive phenomenology. Methodological principles to guide the process of analysis are offered, grounded in phenomenological philosophy. This is further discussed in relation to how scientific rigour and validity can be achieved.

This is a discursive article on thematic analysis based on descriptive phenomenology.

This paper takes thematic analysis based on a descriptive phenomenological tradition forward and provides a useful description of how to undertake the analysis. Ontological and epistemological foundations of descriptive phenomenology are outlined. Methodological principles are explained to guide the process of analysis, as well as to help understand validity and rigour. Researchers and students in nursing and midwifery conducting qualitative research need comprehensible and valid methods to analyse the meaning of lived experiences and organize data in meaningful ways.

1. INTRODUCTION

Qualitative research in health care is an increasingly complex research field, particularly when doing phenomenology. In nursing and midwifery, qualitative approaches dealing with the lived experiences of patients, families and professionals are necessary. Today, there are a number of diverse research approaches. Still, approaches for thematic analysis are not yet fully described in the literature, and only a few papers describe thematic analysis (Ho, Chiang, & Leung, 2017; Vaismoradi, Turunen, & Bondas, 2013). It may be difficult to find a single paper that can guide researchers and students in doing thematic analysis in phenomenology.

From our research experience, it can be complex to read and understand phenomenological approaches. Similarly, the process of analysis can be challenging to comprehend. This makes methodological issues related to the clarity of ontological and epistemological underpinnings, and discussions of validity and rigour, complex. Norlyk and Harder (2010) point to difficulties in finding a guide for phenomenological research. There is a need for understandable guidelines to take thematic analysis forward. Useful approaches are required to give researchers and students guidance in the process of thematic analysis. With this paper, we hope to clarify some important methodological stances related to the thematic analysis of meaning from lived experiences that is grounded in descriptive phenomenology and useful to teachers and researchers in nursing and midwifery.

1.1. Background

Phenomenology has been widely used to understand human phenomena in nursing and midwifery practices (Matua, 2015 ). Today, there are several phenomenological approaches available. When using phenomenology, the researcher needs an awareness of basic assumptions to make important methodological decisions. Thus, it is important to understand the underpinnings of the approach used (Dowling & Cooney, 2012 ). Phenomenological underpinnings may, however, be difficult to understand and apply in the research process.

Thematizing meaning has been emphasized as one of a few shared aspects across different qualitative approaches (Holloway & Todres, 2003 ), suggesting that some qualitative research strategies are more generic than others. Although different approaches sometimes overlap, they have different ontological and epistemological foundations. A range of approaches are used to thematize meaning, but some of them would benefit from clarifying ontological and epistemological assumptions. In hermeneutic phenomenological traditions, thematizing meaning can be understood as related to the interpretation of data, illuminating the underlying or unspoken meanings embodied or hidden in lived experiences (Ho et al., 2017 ; van Manen, 2016 ). Another commonly used approach to thematic analysis is the method presented in the psychology literature by Braun and Clarke ( 2006 ). The method is frequently used to find repeated patterns of meaning in the data. However, there is a lack of thematic analysis approaches based on the traditions of descriptive phenomenology.

Researchers must make methodological considerations. In phenomenology, an awareness of the philosophical underpinning of the approach is needed when it is used in depth (Dowling & Cooney, 2012 ; Holloway & Todres, 2003 ). This places demands on methods to be comprehensible and flexible yet consistent and coherent. Questions remain regarding how thematic analysis can be further clarified and used based on descriptive phenomenology.

In this discursive paper, we provide guidance for thematic analysis based on descriptive phenomenology, which, to our knowledge, has not been made explicit in this way previously. This can be used as a guiding framework to analyse lived experiences in nursing and midwifery research. The aim of this paper was to discuss how to understand and undertake thematic analysis based on descriptive phenomenology. Methodological principles to guide the process of analysis are offered, grounded in phenomenological philosophy. This is further discussed in relation to how scientific rigour and validity can be achieved.

2. ONTOLOGICAL AND EPISTEMOLOGICAL FOUNDATIONS OF DESCRIPTIVE PHENOMENOLOGY

Phenomenology is a complex philosophical tradition in human science, containing different concepts interpreted in various ways. One main division among phenomenological methods is that between descriptive and interpretive phenomenology (Norlyk & Harder, 2010). Both traditions are commonly used in nursing and midwifery research. Several phenomenological methods have been recognized within the descriptive or interpretative approaches (Dowling, 2007; Dowling & Cooney, 2012; Norlyk & Harder, 2010). The descriptive tradition of phenomenology originated from the writings of Husserl and was further developed by Merleau‐Ponty, while the interpretive approach was developed mainly from Heidegger and Gadamer.

The thematic analysis in this paper uses a descriptive approach with a focus on lived experience, which refers to our experiences of the world. Phenomenology is the study of phenomena, that is, things as they are experienced (or lived) by human beings and how they appear in our experiences. Consequently, there is a strong emphasis on lived experiences in phenomenological research (Dowling & Cooney, 2012; Norlyk & Harder, 2010). In this paper, lived experience is understood from a lifeworld approach originating in the writings of Husserl (Dahlberg, Dahlberg, & Nyström, 2008). The lifeworld is crucial and becomes the starting point for understanding lived experiences. Hence, the lifeworld forms the ontological and epistemological foundation for our understanding of lived experiences. In the lifeworld, our experiences must be regarded in the light of the body and the lifeworld of a person (i.e., our subjectivity). Consequently, humans cannot be reduced to biological or psychological beings (Merleau‐Ponty, 2002/1945). When understanding the meaning of lived experiences, we need to be aware of the lifeworld, our bodily being in the world and how we interact with others.

The understanding of lived experiences is closely linked to the idea of the intentionality of consciousness, or how meaning is experienced. Intentionality encompasses the idea that our consciousness is always directed towards something, which means that when we experience something, the “thing” is experienced as “something” that has meaning for us: for example, a birthing woman's experience of pain, or caregiving as it is experienced by a nurse. In a descriptive phenomenological approach based on the writings of Husserl (Dahlberg et al., 2008), such meanings can be described. From this point of view, there is no need for interpretation of these meanings, although this may be argued differently in interpretive phenomenology. Intentionality is also linked to our natural attitude. In our ordinary life, we take ourselves and our life for granted; this is our natural attitude and how we approach our experiences. We usually take for granted that the world around us is as we perceive it and that others perceive it as we do. We also take for granted that the world exists independently of us. Within our natural attitude, we do not normally analyse our experiences. In phenomenology, an awareness of the natural attitude is important.

3. METHODOLOGICAL PRINCIPLES

The ontological and epistemological foundations of descriptive phenomenology suggest some methodological principles and how these should be managed throughout the research process. Phenomenological studies have been criticized for lacking clarity on philosophical underpinnings (Dowling & Cooney, 2012; Norlyk & Harder, 2010). Thus, philosophical stances must be understood and clarified for the reader of a study. Our suggestion is to let the entire research process, from data gathering to data analysis and reporting the findings, be guided by the methodological principles of emphasizing openness, questioning pre‐understanding and adopting a reflective attitude. We acknowledge that the principles presented here may not be totally distinct from, and do not follow, any particular phenomenological research approach. However, the outlined approach has some commonalities with the approaches of, for example, Dahlberg et al. (2008) and van Manen (2016).

When researching lived experiences, openness to the lifeworld and the phenomenon in focus must be emphasized (i.e., having curiosity and maintaining an open mind when searching for meaning). The researcher must adopt an open stance with sensitivity to the meaning of the lived experiences currently in focus. Openness involves being observant, attentive and sensitive to the expression of experiences (Dahlberg et al., 2008). It also includes questioning one's understanding of the data (Dahlberg & Dahlberg, 2003). Thus, researchers must strive to maintain an attitude that assumes the researcher does not yet know the participants' experiences and wants to understand the studied phenomenon in a new light, so that invisible aspects of the experience become visible.

When striving for openness, researchers need to question their pre‐understanding, which means identifying and becoming aware of preconceptions that might influence the analysis. Throughout the research process, and particularly the analysis, researchers must deal with the natural attitude and previous assumptions when analysing and understanding the data. Questioning involves attempting to set aside one's experiences and assumptions as much as possible; it means maintaining a critical stance and reflecting on one's understanding of the data and the phenomenon. This is similar to bracketing, a commonly used term in descriptive phenomenology based on Husserl, but bracketing has been criticized (Dowling & Cooney, 2012). Some would argue that bracketing means putting aside such assumptions entirely, which may not be possible. Gadamer (2004) deals with this in a different way, arguing that such assumptions are part of our understanding. Instead of using bracketing, our intention is to build on questioning as a way to describe what something means. Accordingly, researchers need to recognize personal beliefs, theories or other assumptions that can restrict their openness. Otherwise, the researcher risks describing his or her own pre‐understanding instead of the participants' experiences. Our pre‐understanding, described as “prejudice” in interpretive phenomenology by Gadamer (2004), is what we already know, or think we know, about a phenomenon. As humans, we always have such a pre‐understanding or prejudice; Gadamer (2004) posits that this is the tradition of our lived context and emphasizes that our tradition has a powerful influence on us. This means that it might be more difficult to see something new in the data than to describe something already known to the researcher. Therefore, an open and sensitive stance is needed towards oneself, one's pre‐understanding and one's understanding of the data. One must also be reflective and critical towards the data, as well as towards how meanings are understood from the data. Questioning can help researchers become aware of their pre‐understanding and set aside previous assumptions about the phenomenon (Dahlberg et al., 2008).

Questioning one's pre‐understanding is closely linked to having a reflective attitude. With a reflective attitude, the researcher needs to shift from the ordinary, natural understanding of everyday life to a more self‐reflective and open stance towards the data (Dahlberg et al., 2008). An inquiring approach throughout the research process helps researchers become more aware of their assumptions and reflect in the context of the actual research. For instance, researchers may need to reflect on why some meanings occur, how meanings are described and whether meanings are grounded in the data. In striving for an awareness of the natural attitude, a reflective attitude becomes imperative. With such an awareness, some of the pitfalls related to our natural attitude can be handled in favour of an open and reflective mind.

To summarize, methodological principles have been described in terms of emphasizing openness, questioning pre‐understanding and adopting a reflective attitude, three related concepts. To emphasize openness, one needs to reflect on preconceptions and judgements concerning the world and our experiences, with a reflective approach, to become aware of the natural attitude and the process of understanding. Engaging in critical reflection throughout the research process may facilitate an awareness of how the researcher influences the research process. These methodological principles, related to the ontological and epistemological foundations of phenomenology, are suggested to guide the research process, particularly the analysis.

4. THEMATIC ANALYSIS OF LIVED EXPERIENCES

The thematic analysis approach described in this paper is inductive. A prerequisite for the analysis is that it includes data on lived experiences, such as interviews or narratives. Themes derived from the analysis are data driven (i.e., grounded in the data and the experiences of the participants). The analysis begins with a search for meaning and continues with different meanings being identified and related to each other. The analysis aims to understand the complexity of meanings in the data rather than measure their frequency. It involves the researcher engaging with the data and the analysis. The analysis contains a search for patterns of meaning, which are further explored to determine how such patterns can be organized into themes. Moreover, the analysis must be guided by openness. Thus, the analysis involves a reflective process designed to illuminate meaning. Although the process of analysis is similar to descriptive phenomenological approaches focusing on the understanding and description of meaning‐oriented themes (Dahlberg et al., 2008; van Manen, 2016), there are important differences. While the thematic analysis in this paper focuses on how to organize patterns of meaning into themes, some would argue that an essential, general structure of meaning, rather than fragmented themes, is preferable (van Wijngaarden, Meide, & Dahlberg, 2017) and that such an essential meaning structure is a strength. We argue that meaning‐oriented themes can contribute to robust qualitative research findings. Still, it is important that the findings move between concrete expressions and descriptive text on the meanings of lived experiences.

4.1. The process of analysis

The goal of the thematic analysis is to achieve an understanding of patterns of meaning in data on lived experiences (i.e., informants' descriptions of experiences related to the research question in, e.g., interviews or narratives). The analysis begins with data, which needs to be textual, and aims to organize meanings found in the data into patterns and, finally, themes. While conducting the analysis, the researcher strives to understand the meanings embedded in experiences and describe these meanings textually. Through the analysis, details and aspects of meaning are explored, requiring reading and reflective writing. Parts of the text need to be understood in terms of the whole and the whole in terms of its parts. The researcher also needs to move between being close to and distant from the data. Overall, the process of analysis can be complex and the researcher needs to be flexible. This process is summarized in Figure 1 and detailed in the description below.

Figure 1. Summary of thematic analysis

To begin the analysis, the researcher needs to achieve familiarity with the data through open‐minded reading. The text must be read several times in its entirety. This is an open‐ended reading that puts the principle of openness into practice with the intention of opening one's mind to the text and its meanings. When reading, the researcher starts to explore experiences expressed in the data, such as determining how these are narrated and how meanings can be understood. The goal is to illuminate novel information rather than confirm what is already known while keeping the study aim in mind.

Thereafter, the parts of the data are further illuminated and the search for meanings and themes deepens. By moving back and forth between the whole and its parts, a sensitive dialogue with the text may be facilitated. While reading, meanings corresponding to the study's aim are marked. Notes and short descriptive words can be used to give meanings a preliminary name. As the analysis progresses, meanings are compared with each other to identify differences and similarities. Meanings need to be related to each other to get a sense of patterns. Patterns of meaning are then further examined. It is important not to make meanings definite too rapidly, but to slow down the understanding of the data and its meanings. This demands openness from the researcher to let meanings emerge.

Lastly, the researcher needs to organize the themes into a meaningful whole. The methodological principles must remind the researcher to maintain a reflective mind while meanings are further developed into themes. Meanings are organized into patterns and, finally, themes. While deriving meaning from the text, it is helpful to compare the derived meanings and themes against the original data. Nothing is taken for granted, and the researcher must be careful and thoughtful during this part of the process. It can be valuable to discuss and reflect on tentative themes emerging from the data. Findings need to be meaningful, and the naming and wording of themes becomes important. The writing up of the themes aims to outline the meanings inherent in the described experiences. At this point, findings are written and rewritten. Faithful descriptions of meanings usually need more than a single word, and the writing is important.
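The organizing work described above (marked meaning units with preliminary names, related to each other and gathered into named themes, with quotes retained to illustrate each theme) is usually done in QDA software or on paper, but its structure can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the quotes, preliminary labels, and theme names below are invented, not drawn from any study.

```python
# A sketch of organizing coded meaning units into themes during thematic
# analysis. All quotes, labels, and theme names are hypothetical examples.
from collections import defaultdict

# Meaning units marked in the text, each given a preliminary descriptive name.
meaning_units = [
    {"quote": "I never knew what the next day would bring.", "label": "uncertainty"},
    {"quote": "The nurse sat down and really listened.", "label": "feeling heard"},
    {"quote": "Nobody could tell me how long it would last.", "label": "uncertainty"},
    {"quote": "She asked what mattered to me.", "label": "feeling heard"},
]

# Preliminary labels, once compared for similarities and differences, are
# related to each other and organized under a named theme (revised iteratively
# in the real analysis; fixed here for simplicity).
theme_of_label = {
    "uncertainty": "Living with an unpredictable illness",
    "feeling heard": "Being met as a person",
}

# Gather the supporting quotes under each theme, keeping the concrete
# expressions so the written findings can move between quote and description.
themes = defaultdict(list)
for unit in meaning_units:
    themes[theme_of_label[unit["label"]]].append(unit["quote"])

for theme, quotes in themes.items():
    print(f"{theme}: {len(quotes)} supporting quotes")
```

Note that the script only groups text; the interpretive work of naming, comparing, and revising meanings remains with the researcher, consistent with the paper's emphasis on understanding complexity rather than measuring frequency.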

To conclude, the process of thematic analysis based in a descriptive phenomenological approach goes from the original data to the identification of meanings, organizing these into patterns and writing the results as themes related to the study aim and the actual context. When the findings are reported, they are presented in the reverse order (i.e., starting with the themes and the descriptive text, illustrated with quotes). Thus, meanings found in participants' experiences are described in a meaningful text organized into themes.

4.2. Validity and Rigour

Here we discuss scientific quality in terms of validity and rigour in the thematic analysis process. There is no consensus on which concepts should be used regarding validity in qualitative and phenomenological research. The term validity is typically used in relation to quantitative methods; however, qualitative researchers claim that the term is suitable in all paradigms as a generic term implying whether the research conclusions are sound, just and well‐founded (Morse, 2015; Whittemore, Chase, & Mandle, 2001). Rolfe (2006) states that scientific rigour can be judged based on how the research is presented to the reader, and that appraising research lies with both the reader and the writer of the research. Thus, clarity regarding the methodological principles used becomes necessary. Porter (2007) argues that a more realistic approach is needed and that scientific rigour needs to be taken seriously in qualitative research. It has been stressed that strategies are needed to ensure rigour and validity; such strategies must be built into the research process and not solely evaluated afterwards (Cypress, 2017). Therefore, we further discuss scientific rigour and phenomenological validity in relation to reflexivity, credibility and transferability.

Reflexivity is closely connected to the previously described methodological principles of a reflective attitude and questioning one's pre‐understanding. Reflexivity must be maintained during the entire process, and the researcher needs to sustain a reflective attitude. In particular, reflexivity must involve questioning the understanding of the data and the themes derived. Qualitative researchers are closely engaged in this process and must reflect on what the data actually state, which may differ from the researcher's understanding. This means the researcher should question the findings instead of taking them for granted. Malterud (2001) claims that multiple researchers might strengthen a study, since they can give supplementary views and question each other's statements, while an independent researcher must find other strategies. Another way to maintain reflexivity is to compare the original data with the descriptive text of the themes derived. Moreover, findings need to be illustrated with original data to demonstrate how the derived descriptions are grounded in the data rather than in the researcher's understanding. Furthermore, information is needed on the setting so the reader can understand the context of the findings.

Credibility refers to the meaningfulness of the findings and whether they are well presented (Kitto, Chesters, & Grbich, 2008). Credibility and reflexivity are not totally distinct but are correlated with each other. Credibility stresses that nothing can be taken for granted and is associated with the methodological principles described above. The researcher needs to attend to how the analysis and findings are presented to the reader. The analysis needs to be transparent, which means that the researcher should present it as thoroughly as possible to strive for credibility. The reader needs information concerning the methodology used and the methodological decisions and considerations made. This includes, for instance, how the thematic analysis was performed, descriptions of how meanings were derived from the data and how themes were identified. Descriptions need to be clear and consistent, and it must be possible for the reader to follow and agree with the logic of the findings and themes. Credibility lies both in the methodology and in the presentation of findings. Thus, in striving for credibility, the procedures and methods need to be presented as thoroughly and transparently as possible. The themes described must be illustrated with quotes to ensure the content and described meanings are consistent.

Transferability refers to the usefulness and relevance of the findings; the method used does not guarantee transferability in itself. Transferability is not explicitly related to any of the methodological principles, but it may be a result of them. Transferability concerns whether the findings are sound and whether the study adds new knowledge to what is already known. The clarity of findings is also important. Thus, findings must be understandable and transferable to other research (i.e., findings need to be recognizable and relevant to a specific or broader context beyond the original study). Specifically, the relevance, usefulness and meaningfulness of research findings to other contexts are important components of a study's transferability.

To conclude, reflexivity, credibility and transferability are concepts important to acknowledge and consider throughout the research process to engender validity and rigour. We maintain that meaning‐oriented themes can contribute to robust findings if reported in a text describing patterns of meaning illustrated with examples of expressions from lived experiences. Questions researchers need to ask themselves in relation to validity when conducting a thematic analysis are presented in Figure 2. Since the method in itself is no guarantee of validity and rigour, discussions related to these areas are needed.

Figure 2. Overview of questions useful to uphold the reflexivity, credibility and transferability of the research process in the thematic analysis of meanings

5. IMPLICATIONS FOR NURSING AND MIDWIFERY

In this paper, a method for thematic analysis based on phenomenology has been outlined. Doing phenomenological research is challenging. Therefore, we hope this paper contributes to the understanding of the phenomenological underpinnings and methodological principles of thematic analysis based on descriptive phenomenology. This approach can be useful for teachers and researchers in nursing and midwifery. The thematic analysis presented can offer guidance on how to understand meaning and analyse lived experiences. Methodological stances of descriptive phenomenology are clarified, linking the process of analysis with its theoretical underpinnings. Methodological principles are explained to guide the analysis and help understand validity and rigour. Thus, this paper has the potential to provide researchers and students with an interest in research on lived experiences with a comprehensive and useful method for thematic analysis in phenomenology. Nurses and midwives conducting qualitative research on lived experiences need robust methods to ensure high quality in health care, to the benefit of patients, childbearing women and their families.

6. CONCLUSION

We provide researchers in nursing and midwifery with some clarity regarding thematic analysis grounded in the tradition of descriptive phenomenology. We argue that researchers need to comprehend phenomenological underpinnings and be guided by these in the research process. In thematic analysis, descriptive phenomenology is a useful framework when analysing lived experiences with clarified applicable ontological and epistemological underpinnings. Emphasizing openness, questioning pre‐understanding and adopting a reflective attitude were identified as important methodological principles that can guide researchers throughout the analysis and help uphold scientific rigour and validity. For novice researchers, the present paper may serve as an introduction to phenomenological approaches.

CONFLICT OF INTEREST

No conflict of interest has been declared by the authors.

AUTHOR CONTRIBUTIONS

AS, EL, CN, LP: Made substantial contributions to conception and design, or acquisition of data, or analysis and interpretation of data; involved in drafting the manuscript or revising it critically for important intellectual content; given final approval of the version to be published and each author should have participated sufficiently in the work to take public responsibility for appropriate portions of the content; and agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Sundler AJ, Lindberg E, Nilsson C, Palmér L. Qualitative thematic analysis based on descriptive phenomenology. Nursing Open. 2019;6:733–739. doi:10.1002/nop2.275

  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
  • Cypress, B. S. (2017). Rigor or reliability and validity in qualitative research: Perspectives, strategies, reconceptualization and recommendations. Dimensions of Critical Care Nursing, 36(4), 253–263. doi:10.1097/DCC.0000000000000253
  • Dahlberg, H., & Dahlberg, K. (2003). To not make definite what is indefinite: A phenomenological analysis of perception and its epistemological consequences. Journal of the Humanistic Psychologist, 31(4), 34–50.
  • Dahlberg, K., Dahlberg, H., & Nyström, M. (2008). Reflective lifeworld research. Lund, Sweden: Studentlitteratur.
  • Dowling, M. (2007). From Husserl to van Manen: A review of different phenomenological approaches. International Journal of Nursing Studies, 44(1), 131–142. doi:10.1016/j.ijnurstu.2005.11.026
  • Dowling, M., & Cooney, A. (2012). Research approaches related to phenomenology: Negotiating a complex landscape. Nurse Researcher, 20(2), 21–27. doi:10.7748/nr2012.11.20.2.21.c9440
  • Gadamer, H. G. (2004). Truth and method. London, UK: Continuum.
  • Ho, K. H. M., Chiang, V. C. L., & Leung, D. (2017). Hermeneutic phenomenological analysis: The ‘possibility’ beyond ‘actuality’ in thematic analysis. Journal of Advanced Nursing, 73(7), 1757–1766. doi:10.1111/jan.13255
  • Holloway, I., & Todres, L. (2003). The status of method: Flexibility, consistency and coherence. Qualitative Research, 3(3), 345–357. doi:10.1177/1468794103033004
  • Kitto, S. C., Chesters, J., & Grbich, C. (2008). Quality in qualitative research. The Medical Journal of Australia, 188(4), 243–246.
  • Malterud, K. (2001). Qualitative research: Standards, challenges and guidelines. Lancet, 358(9280), 483–488.
  • Matua, G. A. (2015). Choosing phenomenology as a guiding philosophy for nursing research. Nurse Researcher, 22(4), 30–34. doi:10.7748/nr.22.4.30.e1325
  • Merleau‐Ponty, M. (2002/1945). Phenomenology of perception. London, UK: Routledge Classics.
  • Morse, J. M. (2015). Critical analysis of strategies for determining rigor in qualitative inquiry. Qualitative Health Research, 9, 1212–1222. doi:10.1177/1049732315588501
  • Norlyk, A., & Harder, I. (2010). What makes a phenomenological study phenomenological? An analysis of peer‐reviewed empirical nursing studies. Qualitative Health Research, 20(3), 420–431. doi:10.1177/1049732309357435
  • Porter, S. (2007). Validity, trustworthiness and rigour: Reasserting realism in qualitative research. Journal of Advanced Nursing, 60(1), 79–86. doi:10.1111/j.1365-2648.2007.04360.x
  • Rolfe, G. (2006). Validity, trustworthiness and rigor: Quality and the idea of qualitative research. Journal of Advanced Nursing, 53, 304–310.
  • Vaismoradi, M., Turunen, H., & Bondas, T. (2013). Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & Health Sciences, 5(3), 398–405. doi:10.1111/nhs.12048
  • Van Manen, M. (2016). Phenomenology of practice. New York, NY: Routledge.
  • van Wijngaarden, E., Meide, H. V., & Dahlberg, K. (2017). Researching health care as a meaningful practice: Toward a nondualistic view on evidence for qualitative research. Qualitative Health Research, 11, 1738–1747. doi:10.1177/1049732317711133
  • Whittemore, R., Chase, S. K., & Mandle, C. L. (2001). Validity in qualitative research. Qualitative Health Research, 11(4), 522–537. doi:10.1177/104973201129119299


How to Do Thematic Analysis | Guide & Examples

Published on 5 May 2022 by Jack Caulfield.

Thematic analysis is a method of analysing qualitative data. It is usually applied to a set of texts, such as interview transcripts. The researcher closely examines the data to identify common themes: topics, ideas and patterns of meaning that come up repeatedly.

There are various approaches to conducting thematic analysis, but the most common form follows a six-step process:

  • Familiarisation
  • Coding
  • Generating themes
  • Reviewing themes
  • Defining and naming themes
  • Writing up

This process was originally developed for psychology research by Virginia Braun and Victoria Clarke . However, thematic analysis is a flexible method that can be adapted to many different kinds of research.

Table of contents

  • When to use thematic analysis
  • Different approaches to thematic analysis
  • Step 1: Familiarisation
  • Step 2: Coding
  • Step 3: Generating themes
  • Step 4: Reviewing themes
  • Step 5: Defining and naming themes
  • Step 6: Writing up

Thematic analysis is a good approach to research where you’re trying to find out something about people’s views, opinions, knowledge, experiences, or values from a set of qualitative data – for example, interview transcripts , social media profiles, or survey responses .

Some types of research questions you might use thematic analysis to answer:

  • How do patients perceive doctors in a hospital setting?
  • What are young women’s experiences on dating sites?
  • What are non-experts’ ideas and opinions about climate change?
  • How is gender constructed in secondary school history teaching?

To answer any of these questions, you would collect data from a group of relevant participants and then analyse it. Thematic analysis allows you a lot of flexibility in interpreting the data, and allows you to approach large datasets more easily by sorting them into broad themes.

However, it also involves the risk of missing nuances in the data. Thematic analysis is often quite subjective and relies on the researcher’s judgement, so you have to reflect carefully on your own choices and interpretations.

Pay close attention to the data to ensure that you’re not picking up on things that are not there – or obscuring things that are.


Once you’ve decided to use thematic analysis, there are different approaches to consider.

There’s the distinction between inductive and deductive approaches:

  • An inductive approach involves allowing the data to determine your themes.
  • A deductive approach involves coming to the data with some preconceived themes you expect to find reflected there, based on theory or existing knowledge.

There’s also the distinction between a semantic and a latent approach:

  • A semantic approach involves analysing the explicit content of the data.
  • A latent approach involves reading into the subtext and assumptions underlying the data.

After you’ve decided thematic analysis is the right method for analysing your data, and you’ve thought about the approach you’re going to take, you can follow the six steps developed by Braun and Clarke .

The first step is to get to know our data. It’s important to get a thorough overview of all the data we collected before we start analysing individual items.

This might involve transcribing audio , reading through the text and taking initial notes, and generally looking through the data to get familiar with it.

Next up, we need to code the data. Coding means highlighting sections of our text – usually phrases or sentences – and coming up with shorthand labels or ‘codes’ to describe their content.

Let’s take a short example text. Say we’re researching perceptions of climate change among conservative voters aged 50 and up, and we have collected data through a series of interviews. An extract from one interview looks like this:

In this extract, we’ve highlighted various phrases in different colours corresponding to different codes. Each code describes the idea or feeling expressed in that part of the text.

At this stage, we want to be thorough: we go through the transcript of every interview and highlight everything that jumps out as relevant or potentially interesting. As well as highlighting all the phrases and sentences that match these codes, we can keep adding new codes as we go through the text.

After we’ve been through the text, we collate all the data into groups identified by code. These codes give us a condensed overview of the main points and common meanings that recur throughout the data.
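Mechanically, this collation step amounts to grouping extracts by their code label. Here is a minimal Python sketch; the passages and code names below are invented illustrations echoing the climate-change example, not real interview data:

```python
from collections import defaultdict

# Each coded extract pairs a highlighted passage with its shorthand code.
# All passages and codes here are hypothetical examples.
coded_extracts = [
    ("I don't know who to believe anymore", "uncertainty"),
    ("the scientists keep changing their story", "distrust of experts"),
    ("they used to call it global warming", "changing terminology"),
    ("you can't trust what you read these days", "distrust of experts"),
]

# Collate: group every extract under its code for a condensed overview.
by_code = defaultdict(list)
for extract, code in coded_extracts:
    by_code[code].append(extract)

for code, extracts in sorted(by_code.items()):
    print(f"{code}: {len(extracts)} extract(s)")
```

In practice this grouping is usually done in qualitative analysis software or with highlighters and index cards, but the logic is the same: one pass to label the text, one pass to collate by label.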

Next, we look over the codes we’ve created, identify patterns among them, and start coming up with themes.

Themes are generally broader than codes. Most of the time, you’ll combine several codes into a single theme. In our example, we might start combining codes into themes like this:

At this stage, we might decide that some of our codes are too vague or not relevant enough (for example, because they don’t appear very often in the data), so they can be discarded.

Other codes might become themes in their own right. In our example, we decided that the code ‘uncertainty’ made sense as a theme, with some other codes incorporated into it.

Again, what we decide will vary according to what we’re trying to find out. We want to create potential themes that tell us something helpful about the data for our purposes.
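Combining codes into themes can be thought of as a many-to-one mapping, where discarded codes simply have no entry. A sketch in Python, using hypothetical code and theme names drawn from the worked example:

```python
# Hypothetical many-to-one mapping from codes to broader themes;
# codes we decided to discard are simply absent from the mapping.
theme_map = {
    "uncertainty": "Uncertainty",
    "changing terminology": "Uncertainty",
    "distrust of experts": "Distrust of experts",
    "leave it to the scientists": "Distrust of experts",
}

def themes_for(codes):
    """Distinct themes covered by a list of codes, ignoring discarded ones."""
    return sorted({theme_map[c] for c in codes if c in theme_map})

# "weather anecdotes" was judged too vague, so it maps to no theme.
print(themes_for(["uncertainty", "weather anecdotes", "distrust of experts"]))
# → ['Distrust of experts', 'Uncertainty']
```

Keeping the mapping explicit like this also makes the later review step easy: moving a code between themes is a one-line change.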

Now we have to make sure that our themes are useful and accurate representations of the data. Here, we return to the dataset and compare our themes against it. Are we missing anything? Are these themes really present in the data? What can we change to make our themes work better?

If we encounter problems with our themes, we might split them up, combine them, discard them, or create new ones: whatever makes them more useful and accurate.

For example, we might decide upon looking through the data that ‘changing terminology’ fits better under the ‘uncertainty’ theme than under ‘distrust of experts’, since the data labelled with this code involves confusion, not necessarily distrust.

Now that you have a final list of themes, it’s time to name and define each of them.

Defining themes involves formulating exactly what we mean by each theme and figuring out how it helps us understand the data.

Naming themes involves coming up with a succinct and easily understandable name for each theme.

For example, we might look at ‘distrust of experts’ and determine exactly who we mean by ‘experts’ in this theme. We might decide that a better name for the theme is ‘distrust of authority’ or ‘conspiracy thinking’.

Finally, we’ll write up our analysis of the data. Like all academic texts, writing up a thematic analysis requires an introduction to establish our research question, aims, and approach.

We should also include a methodology section, describing how we collected the data (e.g., through semi-structured interviews or open-ended survey questions ) and explaining how we conducted the thematic analysis itself.

The results or findings section usually addresses each theme in turn. We describe how often the themes come up and what they mean, including examples from the data as evidence. Finally, our conclusion explains the main takeaways and shows how the analysis has answered our research question.
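Reporting how often themes come up is, at its simplest, a frequency table. A small Python sketch with invented counts (in a real write-up these would be derived from the collated extracts):

```python
from collections import Counter

# Invented tally of coded extracts per theme, for illustration only.
theme_counts = Counter({
    "Uncertainty": 14,
    "Distrust of experts": 9,
    "Misinformation": 5,
})

# Report themes from most to least frequent, with share of all extracts.
total = sum(theme_counts.values())
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n} extracts ({n / total:.0%})")
```

Such counts support the qualitative discussion rather than replace it; the findings section still needs quoted examples and interpretation for each theme.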

In our example, we might argue that conspiracy thinking about climate change is widespread among older conservative voters, point out the uncertainty with which many voters view the issue, and discuss the role of misinformation in respondents’ perceptions.


Caulfield, J. (2022, May 05). How to Do Thematic Analysis | Guide & Examples. Scribbr. Retrieved 22 April 2024, from https://www.scribbr.co.uk/research-methods/thematic-analysis-explained/


Meta-thematic synthesis of research on early childhood coding education: A comprehensive review

  • Open access
  • Published: 24 April 2024


  • Mehmet Başaran   ORCID: orcid.org/0000-0003-1871-520X 1 ,
  • Şermin Metin   ORCID: orcid.org/0000-0001-5984-6359 2 &
  • Ömer Faruk Vural 3  

The growing significance of coding in 21st-century early childhood education extends beyond technical proficiency, encompassing cognitive development, problem-solving, and creativity. Coding is being integrated globally into educational curricula to prepare students for the digital era. This research examines coding’s potential impact on cognitive and socio-emotional development and emphasizes the need for evidence-based analysis. A meta-thematic analysis was conducted to synthesize qualitative data from studies on coding’s effects on preschool children’s cognitive and socio-emotional development, focusing on two themes: cognitive contributions and socio-emotional contributions. Thirteen suitable studies were identified from 942 records, a selection process visualized using the PRISMA flow diagram. The findings indicate that coding education enhances cognitive and socio-emotional skills in preschoolers, with implications for curriculum integration. In summary, coding’s holistic benefits in early childhood education are explored, and the meta-thematic analysis investigates its influence on cognitive and socio-emotional domains in preschoolers, emphasizing the need for rigorous evidence-based research.


1 Introduction

Technological developments require new generations to acquire specific skills (P21). As technology has become an integral part of our lives, understanding basic computing structures and applications has become essential knowledge required in the 21st century (Czerkawski, 2015 , October). Therefore, it is widely recognized that digital literacy is essential in today’s information society (Barendsen & Stoker, 2013 ). Beyond digital literacy, coding, which refers to using languages that enable computing, is increasingly recognized as a new literacy (Bers, 2020 ; Burke et al., 2016 ; Vee, 2013 ).

When Papert ( 1980 ) developed LOGO, the first programming language to support children’s mathematical skills, he firmly believed that it influenced children’s thinking and led them to think, build, and design in new ways (Papert, 1980 , 2000 , 2005 ). Interest in Papert’s views, which draw attention to the basic concepts of computer science, has increased. This interest has led to the need to enable individuals to take an active and creative role in the use of new cognitive skills and technologies, such as code literacy, and the promotion of programming skills in the early years as essential educational support (Muñoz-Repiso & González, 2019 ). Lin and Weintrop ( 2021 ) stated that computing and the technologies it enables are reshaping the world, and they emphasized that every aspect of our lives is influenced by technology, from how we work and learn to how we play and socialize. Given this increasing presence in our lives, providing opportunities and tools to help people understand how technologies work and train them to control them is becoming an increasing focus of computer education efforts.

Coding is being promoted as a new literacy for all students at all levels of education, including very young children, and is seen as a necessity of the 21st century (Bers, 2019; Lye & Koh, 2014). For this reason, in recent years, efforts to teach coding and computational thinking, the basic concepts of computer science, in the early years and to integrate them into educational processes have increased. These efforts have also accelerated classroom practices and research in this field. However, most of these studies focus narrowly on children’s coding and computational thinking skills (Macrides et al., 2022; Papadakis et al., 2016; Popat & Starkey, 2019), whereas Papert (1980) argued that building with technology and writing code is a new way of thinking for children and that children develop many skills while writing code. For this reason, it is necessary to examine and support the effects of coding on children’s developmental areas in preschool.

Coding is defined as an essential 21st-century skill and literacy that affects all areas of life (Bers et al., 2019; McLennan, 2018; Monteiro et al., 2021; Vee, 2013). It is the process of writing correct syntax in a rule-governed, sequential manner using command sets and developing applications in order to solve problems, provide human-computer interaction, and enable computers to perform specific tasks (Bers et al., 2019; Demirer & Sak, 2016; Fesakis & Serafeim, 2009; Kalelioğlu et al., 2016; Li et al., 2020; McLennan, 2018; Vorderman, 2019; Wing, 2006). Coding is also the process of developing systematic ways to solve problems by creating algorithms: sets of instructions that describe each step needed to perform a specific task or solve a problem (Campell & Walsh, 2017; Ching et al., 2018; Lee & Junoh, 2019; Lee & Björklund Larsen, 2019; McLennan, 2017; Vorderman, 2017). The style of thinking in coding is seen as numerical thinking: solving problems using algorithms and a logical approach, analyzing and organizing data, dividing problems into small, manageable parts, transforming them into specific algorithms, and expressing and organizing them in programming languages (Arabacıoğlu et al., 2007; Bers et al., 2019; Futschek, 2006; Futschek & Moschitz, 2011; Gibson, 2012; Li et al., 2020; Sullivan et al., 2017; Van Roy & Haridi, 2004).

2.1 Coding in preschool

Coding, a new form of literacy, has become a fundamental tool for reading and interpreting data and communicating with others in a digital society, providing an opportunity to connect children with technology. Thus, coding goes beyond algorithmic thinking and offers children a symbolic language to read and write (Bers, 2018a , 2018b; Mclennan, 2017 ). Despite different conceptual approaches, coding, which is seen not only as a set of technical skills but also as a social and cultural issue involving different fields of knowledge, basically involves thinking like a computer scientist (Grover & Pea, 2018 ), creating and collaborating (Kafai & Burke, 2014 ), and using computing languages, which are especially important for future generations (Monteiro et al., 2021 ). Bers ( 2019 ) argues that, similar to natural languages, children should be introduced to and familiarized with these new artificial languages from an early age. Monteiro et al. ( 2021 ) emphasize that this artificial language should develop children’s perceptual, expressive, and creative skills and lay a strong foundation for developing critical and functional competencies. They also cite understanding “artificial languages” used to create digital structures and transformations as a fundamental skill. In this context, Rushkoff ( 2010 ) states that being able to use the language of computers is emerging as an inevitable skill that allows us to participate fully and effectively in the digital reality that surrounds us. González ( 2015 ) and Bers ( 2019 ) state that individuals will join the new world as code literate when they can read and write in the language of computers and other machines and think numerically.

The literature emphasizes that coding as literacy in preschool education enables the development of personal and social skills that enable children to express, share, and create using computer science languages, ways of thinking, and creativity (Bers, 2020 ; Grover & Pea, 2018 ; Kafai & Burke, 2014 ; Monteiro et al., 2021 ; Resnick & Rusk, 2020 ; Vee, 2013 ). Coding is increasingly recognized as a new literacy that should be encouraged at the right age (Monteiro et al., 2021 ). In recent years, countries and scholars have emphasized the importance and necessity for children to develop the fundamental understandings, skills, and thinking approaches emerging in computer science, such as coding, programming, and computational thinking (García-Valcárcel et al., 2017; Liu et al., 2017 ; Webb et al., 2017 ; Wilson et al., 2010 ). Education stakeholders have begun to emphasize that coding, like mathematics and literacy, is essential for everyone. On January 17, 2018, the European Commission presented a new “Digital Education Action Plan” for Europe to help educational institutions and education systems better adapt individuals to live and work in an era of rapid digital change (Bocconi et al., 2018 ; Webb et al., 2017 ; Wilson et al., 2010 ). The European Commission has also taken an active role in this regard and started to promote coding as today’s literacy (Moreno-León et al., 2015 ).

When the studies on coding skills are examined, it is emphasized that coding provides children with an essential skill necessary for participation in the digital society and contributes to developing all children into computational participants (Kafai & Burke, 2014). In addition, while coding develops children’s critical and creative thinking skills, it also supports their computational competencies (Grover & Pea, 2013). The coding process develops problem-solving, reasoning, the acquisition of mathematical concepts, and meta-cognitive skills (Akyol-Altun, 2018; Baytak & Land, 2011; Clements & Nastasi, 1999; Çiftçi & Bildiren, 2019; Fessakis et al., 2013; Israel et al., 2015; Lai & Yang, 2011; Lambert & Guiffre, 2009; Sengupta et al., 2013), as well as creative thinking skills (Kim, Chunk, & Yu, 2013). As Papert (1980), one of the pioneers of computer science education, emphasized, coding can be generalized to children’s lifelong learning and development, giving them a valuable intellectual structure. In the last decade, numerous research and policy initiatives have focused on the conceptual and technical aspects of introducing coding to young children and the cognitive and social aspects underlying this trend (Monteiro et al.).

Studies on coding in early childhood show that intensive efforts are being made to teach coding skills to children in their early years. It is seen that there have been significant developments in areas such as how to teach coding, instructional approaches, and the assessment of these skills. However, it is necessary to reveal how children and educators conceptualize coding in early childhood and their views on its contribution to development.


2.2 The effect of coding on development

Many countries have incorporated coding education into school curricula (Heintz et al., 2016; Hsu, 2019). The United States, 16 European countries (Austria, Bulgaria, Czech Republic, Denmark, Estonia, France, Hungary, Ireland, Israel, Lithuania, Malta, Spain, Poland, Portugal, Slovakia, and the United Kingdom), as well as New Zealand, Australia, Singapore, and the Nordic countries have integrated coding into the curriculum at the national, regional, or local level (Bers, 2018b; Bocconi et al., 2018; Digital News Asia, 2015; European Schoolnet, 2015). This effort has made coding a new focus of instructional processes starting from early childhood (Barron et al., 2011; Bers, 2018a, 2018b; CSTA, 2020; Grover & Pea, 2013; ISTE, 2019; NAEYC, 2012; US Department of Education, 2010; K-12 CS Framework, https://k12cs.org/).

In recent years, the widespread use of innovative coding platforms, especially screenless programmable robots, has made it possible to integrate coding into early childhood education (Su et al., 2023 ), but classroom applications have not gained momentum. However, Macrides et al. ( 2022 ) and Papadakis et al. ( 2016 ) revealed that these studies were primarily aimed at supporting coding and IS skills. Popat and Starkey ( 2019 ) stated that the revival of coding in the school curriculum promises to prepare students for the future beyond just learning to code. In their review, Popat and Starkey ( 2019 ) found that various other educational outcomes, such as problem-solving, critical thinking, social skills, self-management, and academic skills, can also be learned through teaching coding.

2.3 Effects on cognitive development

There is still a limited understanding of the effects of learning to code on the cognitive development of young children. Although more studies are needed in this area (Relkin et al., 2021), existing studies point to positive effects of coding on children’s cognitive attitudes, knowledge, and skills (Bers et al., 2014; Çiftci & Bildiren, 2020; Sullivan & Bers, 2016). Coding contributes to developing skills involving analysis, problem-solving, concept development, transforming problems into specific algorithms and programming languages (García-Peñalvo et al., 2016), and spatial reasoning and logic (NAEYC, 2012). García-Peñalvo et al. (2016) argued that since children develop their thinking skills through language, learning to use a programming language involving logical sequencing, abstraction, and problem-solving also supports their analytical thinking skills. In a rapidly changing digital society, coding is thought to help children develop computational thinking skills (Bers et al., 2014; Chou, 2020), mathematical thinking (Goldenberg & Carter, 2021), problem-solving, critical thinking, and higher-order thinking (Ackermann, 2001; Bers et al., 2002; Bers, 2010; Bers & Horn, 2010; Clements & Gullo, 1984; Clements & Meredith, 1993; Kazakoff & Bers, 2012; Lee et al., 2013; Popat & Starkey, 2019; Portelance et al., 2016; Strawhacker et al., 2015).

Coding helps develop cognitive abilities such as systematic thinking, problem-solving, seeing relationships between events, and creative thinking (Fesakis & Serafeim, 2009). For this reason, studies show that coding practices contribute significantly to children’s cognitive development (Grover & Pea, 2013; Kazakoff & Bers, 2012; Kazakoff et al., 2013; Papadakis et al., 2016). Recent studies on this subject have examined cognitive development (Flannery et al., 2013), sequencing skills (Caballero-Gonzalez et al., 2019; Kazakoff et al., 2013; Kazakoff & Bers, 2014), problem-solving skills (Akyol-Altun, 2018; Bers et al., 2014; Fessakis et al., 2013; Koç, 2019; Saxena et al., 2020), executive functions (Di Lieto et al., 2017), creativity (Flannery & Bers, 2013; Resnick, 2006; Siper-Kabadayı, 2019; Sullivan & Bers, 2017, 2019; Wang et al., 2011), computational thinking (Batı, 2022; Bers et al., 2014, 2019; Caballero-Gonzalez et al., 2019; Kalogiannakis & Papadakis, 2017; Kazakoff et al., 2013; Papadakis et al., 2016), and visuospatial skills (Bers et al., 2014; Flannery et al., 2013).

2.4 Effect on social-emotional development

Bers (2020), who sees coding as another language and a new literacy and presents its general framework, refers to coding as “expressive symbolic systems” and “computational thinking tools.” However, she emphasizes that focusing only on information processing ignores the symbolic-language aspect of coding as an expressive tool, and that a language is only truly a language when it has both a social and a mental side. Moreover, she emphasizes that coding as literacy should include not only thinking, like a natural language, but also expression and communication, or social interaction, which involves doing, creating, and bringing into being. Bers (2008) states that coding, like writing, is a tool for human expression and emphasizes that in this process children seek new ways of thinking and of expressing new ideas, and develop new thinking, feeling, and communication skills through this expressive process.

Coding provides the necessary motivation for children to learn programming in more detail and supports their emotional development by enabling them to transform ideas into products (Heikkilä, 2020; Toh et al., 2016). Machines have become a part of our lives, and we communicate with them just as we do with other individuals. For this reason, García-Peñalvo et al. (2016) stated that coding enables children to collaborate better with machines.

Fox and Farmer ( 2011 ) state that children not only manipulate objects and learn rules while creating concrete products through coding but also write codes, build artifacts in virtual environments, and review, share, and revise them. For this reason, it is emphasized that coding activities allow students to cooperate with their peers and provide highly sustainable participation in problem-solving and reasoning (Fox & Farmer, 2011 ). Studies have found that computers can act as a catalyst for social interaction in early childhood education classrooms (Clements, 1999 ) and that children have twice as much social interaction in front of computers as in other activities (Svensson, 2000 ) and speak twice as many words as in non-technology-related activities (New & Cochran, 2007 ). Coding education, whether provided through block-based applications or robotic tools and activities, can improve children’s peer collaboration, communication, and social relations (Bers et al., 2019 ; Lee et al., 2013 , 2017 ; Sullivan & Bers, 2018 ; Wartella & Jennings, 2000 ), social development and socially oriented development (Bers, 2012 ; Caballero-Gonzalez et al., 2019; Critten et al., 2022 ; Fessakis et al., 2013 ; Flannery et al., 2013; Pugnali et al., 2017 ; Strawhacker & Bers, 2015 ) and self-regulation skills (Kazakoff, 2014 ).

The findings of this study provide evidence that coding contributes to several areas of children’s development. In addition, the participants’ opinions and perceptions of coding are a factor that can contribute to the field. The views of children who receive coding education, and of the teachers who work with them, on the effects of coding on development are considered necessary to guide studies in this field and the practices and curricula to be developed.

2.5 Review studies on coding

Many systematic review studies have been conducted on coding at the K-12 level. Lye and Koh ( 2014 ), who conducted one of these studies, revealed that empirical studies on early childhood were lacking. Since Lye and Koh ( 2014 ) drew attention to this deficiency, studies in early childhood have increased rapidly, and the work conducted in this field has itself begun to be analyzed. Still, review studies on preschool children remain limited in number. Papadakis et al. ( 2016 ) present a literature review of 18 studies on how the ScratchJr application affects preschool children’s computational thinking (CT), coding, and general literacy skills. The review emphasized that ScratchJr appears to be a helpful application that positively affects children’s CT and coding skills. Popat and Starkey ( 2019 ) included 11 studies in their review analyzing the educational outcomes of children learning coding at school. Of these studies, only one addressed the problem-solving skills of 5-6-year-old children; the others focused primarily on primary school children. Popat and Starkey ( 2019 ) stated that these studies show that students can learn coding and, through coding instruction, achieve several other educational outcomes (such as mathematical problem-solving, critical thinking, social skills, self-management, and academic skills).

Sulistyaningtyas et al. ( 2021 , September) reviewed 9 studies on coding in early childhood published between 2015 and 2020. The review had two main objectives: to examine coding practices in early childhood and the impact of coding on early childhood development. It found that both unplugged and plugged activities were used in early childhood, and that children’s planning and inhibition skills, together with communication, collaboration, and creativity, were reported as learning outcomes. Macrides et al. ( 2022 ) analyzed 34 studies on programming in early childhood education for children aged 3–8 years, 5 of which were conducted with children over 6. These findings show that studies on preschool children have increased significantly in recent years. The intervention programs examined in these studies primarily focus on teaching coding (11 studies) and CT skills (11 studies), with limited attention given to supporting children’s overall development. Among the studies targeting developmental areas, the emphasis is mainly on cognitive aspects, particularly problem-solving and creativity. Zurnacı and Turan ( 2022 ) reported that, in Turkey, there were 30 studies on preschool coding, including 11 qualitative, 11 quantitative, and 4 mixed-methods studies. These studies predominantly address coding and CT skills but also address academic, cognitive, language, and social skills.

Su et al. ( 2023 ) reviewed 20 studies on early childhood coding curricula published between 2012 and 2021. The review examined in depth how the curricula in educational practices for children were designed, which coding platforms or applications were used, what pedagogical approaches were adopted, what research methods were employed, and what findings these studies obtained. In recent years, educational approaches to support preschool children’s coding skills have multiplied, and robotics, Web 2.0 tools, and web-based applications have been developed for this purpose. These studies have revealed that children can acquire coding skills early on. However, it is essential to examine how coding skills contribute to children’s other developmental areas and to develop research and applications in this direction. This body of review work has contributed significantly to establishing the current state of the art in the field, as well as its needs and future research directions. Resnick and Rusk ( 2020 ) note that over the past decade, it has become possible to extend coding experiences to millions of children worldwide. At the same time, they emphasize that extraordinary challenges remain, that coding has in many places been introduced in ways that undermine its potential and promise, and that the educational strategies and pedagogies used to introduce coding must be carefully discussed. In addition to quantitative data on coding, knowing how teachers and children interpret coding can therefore shed light on future studies. For this reason, this study aims to illuminate future work by comprehensively examining qualitative studies on preschool children and the effects of coding on children’s developmental areas reported in these studies.

3 Methodology

3.1 Research model

This research seeks to determine the impact of coding instruction on preschool-aged children’s cognitive and socio-emotional development. The primary objective is to systematically analyze qualitative primary data, identify recurring themes and topics that elucidate the effects of coding education on children’s development, and synthesize these themes into comprehensive conclusions. To this end, the study uses the meta-thematic analysis approach to analyze the primary qualitative data (Thomas & Harden, 2008 ). Specifically, it adopts a meta-thematic framework to synthesize qualitative studies on preschool children and their engagement with coding education. Within this framework, three overarching themes are examined:

Theme 1: “What are the cognitive ramifications of incorporating coding education in preschool settings?”

Theme 2: “What are the socio-emotional implications stemming from integrating coding education in preschool contexts?”

Theme 3: “How do the data from theses compare with the data from research articles?”

These themes provide the structural foundation for the comprehensive investigation into the multifaceted impacts of coding education on preschool-aged children’s cognitive and socio-emotional development.

3.2 Studies included in the study

In this study, studies on coding education at the preschool education level were investigated within the scope of meta-thematic analysis. The criteria for the inclusion of the study in the meta-thematic analysis were determined as follows:

Being at the level of preschool education (0–6 years),

Aiming to measure the effects and limitations of coding education on students’ cognitive, emotional, and social context,

Meeting standards of scientific quality and rigor,

Including direct participant views,

Being an experimental study,

Being a thesis or article,

The studies were selected according to these criteria.

In the study, seven databases, including “Science Direct-SD,” “Taylor and Francis-TF,” “Higher Education Council Thesis Center (YokTez-YT),” “Dergipark,” “ProQuest-PQ,” “ERIC-E,” and “Web of Science-WOS,” were utilized. The databases were searched with the keywords “preschool coding,” “early childhood coding,” “computer-free coding,” “preschool programming,” and “early childhood programming.”

The articles and theses retrieved from the databases were selected based on the above criteria. The search yielded 942 studies. Based on the evaluation against the criteria, 13 studies were included in the meta-thematic analysis. The numbers of included and excluded studies are presented in Fig.  1 using the PRISMA flow diagram (Moher et al., 2009 ).

figure 1

Flow diagram of the studies included in the meta-thematic analysis

According to the criteria presented in the PRISMA flow diagram in Fig.  1 , 942 studies examining the research topic were retrieved. Based on the evaluation against the research criteria, studies were eliminated in stages. Two of the studies retrieved from the databases were eliminated as duplicates. Of the remaining studies, 653 were eliminated because their topics were irrelevant. Of the remaining 287 studies, 182 were eliminated at abstract screening because they did not fit the primary purpose. Of the remaining 105 studies, 88 were eliminated at the qualitative evaluation stage: 62 because they contained no qualitative interview data and 26 because they were not experimental studies. Among the remaining 17 studies, examination at the level of findings determined that the data of four studies were not sufficient or appropriate in terms of content, and these were eliminated. Thus, 13 studies remained after screening, and the meta-thematic analysis is limited to these 13 studies. Although this is considered a limitation of the study, it is in keeping with the nature of meta-thematic studies (Batdı, 2017 , 2019 ).
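As a check on the screening figures reported above, the elimination stages can be reproduced arithmetically; the short Python sketch below is purely illustrative and uses only the counts stated in the text:

```python
# Reproduce the PRISMA screening stages described in the text.
records = 942       # retrieved from the 7 databases
records -= 2        # duplicates removed                         -> 940
records -= 653      # irrelevant topics                          -> 287
records -= 182      # excluded at abstract screening             -> 105
records -= 62 + 26  # no interview data (62) / not experimental (26) -> 17
records -= 4        # insufficient data at findings-level review -> 13

print(records)  # → 13 studies included in the meta-thematic analysis
```

Each stage subtracts cleanly to the next reported total, so the counts in the flow diagram are internally consistent.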

The reasons for excluding studies from the meta-thematic analysis are shown in Table  1 . Accordingly, 942 studies were collected from 7 databases; 929 were eliminated for the reasons shown in Table  1 , and 13 studies were included in the meta-thematic analysis.

General information on the articles and the theses used in this study is given in Table  2 below.

The provided sources offer a diverse range of perspectives and insights on the integration of coding into education. Despite this diversity, the common thread across all sources is their emphasis on the importance and benefits of integrating coding into educational settings. They highlight how this integration can address various challenges educators face, such as teaching abstract concepts, fostering creativity, and enhancing problem-solving skills among students. Moreover, the sources underscore the significance of providing resources and support for educators to incorporate coding into their teaching practices effectively. However, differences emerge in the themes explored and the depth of analysis offered. For instance, some sources delve into the practical challenges educators face in implementing coding activities (E1, SD), while others focus on the pedagogical benefits and implications of such integration (WOS, PQ). Overall, while the sources vary in their approach and emphasis, they collectively advocate for integrating coding as a valuable tool for enhancing education and preparing students for the demands of the digital age.

The codes obtained in the meta-thematic analysis related to coding education in preschool were grouped under three themes. In this context, the titles “Contributions of coding education in preschool to the cognitive domain,” “Contributions of coding education in preschool to the social-emotional domain,” and “Comparison of theses data and research articles data” were accepted as themes.

In the current study, each theme created by the researcher in relation to the research topic, and the codes that make up that theme, were discussed separately and presented with the findings. In interpreting the findings, the sources from which the codes were drawn were quoted directly to support the presentation of the themes and codes.
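One way to picture the analysis procedure described above is as a nested mapping from themes to codes to the source excerpts that support them. The sketch below is a hypothetical illustration only; the theme, code, and source labels are examples taken from this study’s findings, but the structure itself is not part of the original methodology:

```python
from collections import defaultdict

# Hypothetical structure for a meta-thematic analysis:
# theme -> code -> list of supporting source references (study label + page).
analysis = defaultdict(lambda: defaultdict(list))

analysis["cognitive domain"]["problem-solving"].append("E2-p.753")
analysis["cognitive domain"]["problem-solving"].append("YT2-p.55")
analysis["social-emotional domain"]["being fun"].append("YT6-p.118")

# The frequency of a code is simply the number of references grouped under it.
freq = len(analysis["cognitive domain"]["problem-solving"])
print(freq)  # → 2
```

In this picture, the frequency and percentage tables reported in the findings are summaries over the innermost lists.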

4.1 Contributions of coding education in preschool to the cognitive domain

In the meta-thematic analysis, the sub-problem of the research, “Contributions of coding education in preschool to the cognitive domain,” was taken as a theme. Participant opinions in the studies were analyzed, and codes were created from their statements, covering features such as developing students’ intelligence, developing cognitive skills, and reinforcing what is learned.

figure 2

Contributions of coding education in preschool to the cognitive domain

As a result of the meta-thematic analysis, three sub-categories and ten codes were reached under the theme “Contributions of Coding Education in Preschool to the Cognitive Domain.” These codes are shown in Fig.  2 and Table  3 with their frequency and percentage values. Two experts (academicians) from the field of educational sciences worked on the codes and grouped them into the three sub-categories.

The skills development sub-category covers the skills that students are expected to develop, especially those widely referred to as 21st-century skills; during the coding process, students were observed to develop these skills in particular. The learning enhancement sub-category covers skills related to applying what is learned in daily life and making learning permanent. Interdisciplinary contribution is a dimension of education that is becoming increasingly important today; in this study, it emerged as a sub-category, albeit a very small one.

Table  3 shows that the codes are grouped around three sub-categories. Among these sub-categories, skills development has the highest rate, with 75.3%. Learning enhancement is the sub-category with the second highest rate of 23.6%. Interdisciplinary contribution is the sub-category with the lowest rate of 1.1%. In this context, it can be said that coding education develops skills in preschool children in general.
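The percentage values in Table 3 follow directly from the code frequencies. The sketch below illustrates the calculation; note that the total of 178 cognitive-domain references is an inference from the stated figures (47 problem-solving references ≈ 26.4%), not a number given in the text:

```python
def pct(frequency, total):
    """Share of all coded references, as a percentage rounded to one decimal."""
    return round(100 * frequency / total, 1)

# 178 is an inferred total (47 / 0.264 ≈ 178), not stated in the text.
TOTAL_REFERENCES = 178

print(pct(47, TOTAL_REFERENCES))  # problem-solving code → 26.4
```

The same function reproduces any of the reported percentages once the corresponding frequency and the grand total are known.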

These codes belong to the skills development sub-category. In the cognitive dimension, the main contributions of coding education were the development of problem-solving skills (26.4%) and directing (commanding) skills (24.7%). The latter can also be expressed as a computational thinking skill; this code emerged from statements about students giving commands to a robot or computer and directing it. In the thesis coded YT3-p.73, the statements “Then it would be like this. First, I program it to turn silently, then play a birthday song, and then turn it off.” and “It is to teach ways to tell tools such as computers and phones what to do.”, and in the article coded E3-p.10, the statement “I need to stick the arrows in the right direction and take this character to dinner by following the path…” can be shown as examples.

The code for problem-solving skills appeared 47 times in the studies. Among the statements referenced under this code are “I believe that it will contribute to the development of children’s abilities in areas such as thinking skills, logic development, problem-solving, etc.” in the article coded E2-p.753 and “It is an approach that provides problem-solving, creativity and analytical thinking skills.” in the thesis coded YT2-p.55.

For the code related to the development of creativity, the statement “They did not have difficulty in applying the new rule as before, they created new rules themselves and turned this situation into a new game” in the thesis coded YT6-p.117, the statement “We adjust those things when we press it, it does the coding we want, it does the coding according to our imagination.” in the thesis coded YT1-p.68, and the statement “It develops creative thinking and improves cooperative learning. It was collaborative training because we carried out the activities in two groups.” in the article coded PQ-p.304 can be given as examples.

These codes serve as crucial indicators of the impact of coding education on cognitive dimensions, showcasing its role in enhancing problem-solving skills, directing abilities (such as computational thinking), and fostering creativity among students. They are supported by specific statements and instances extracted from the qualitative research studies, demonstrating real-world applications and observations.

These codes belong to the learning enhancement sub-category. For the code of transferring to daily life, the statement “There were touches about life related to the general program. In other words, you always tried to associate it with life rather than sitting down and doing fashion mode robotics training…” in the thesis coded YT4-p.119 and the statement “They reach places that we cannot reach… For example, lifting large items…” in the thesis coded YT3-p.76 can be given as examples.

Regarding the effective learning code: In the article coded E1-p.63, the statement “ Taking some concepts through disconnected activities that they already had some experience with and using them to apply them with technology helped them respond quickly and understand better .” can be given as an example.

Codes related to permanent learning: in the thesis coded YT6-p.115, the statement “They did not forget the order of events in the story. Each child made small changes in the story for his/her next friend, and the other child had no difficulty remembering or practicing.” can be given as an example. Regarding the code of facilitating learning: in the thesis coded YT4-p.120, the statement “…They had much difficulty in the activities we did about graphics. At the end of the training process, they were able to do such activities much more easily.” can be given as an example. Regarding the code of being comprehensive: in the article coded PT4-p.119, the statement “The activities in the implemented education program were very comprehensive and numerous. Turkish language, art, science, mathematics, drama, play, etc. activities in the preschool program were all included…” can be given as an example.

These codes collectively illustrate how coding education transcends theoretical learning, promoting practical application in daily life, improving learning efficacy, supporting long-term knowledge retention, enhancing skill mastery, and contributing to a comprehensive educational experience across different subject areas.

These codes belong to the interdisciplinary contribution sub-category. For the code of contributing to different disciplines: in the article coded PQ-p.311, the statement “ For example, I can use it in animals, colors, shapes, internal organs, and mathematics activities. ” can be given as an example. Regarding the code for the development of intelligence and manual skills: in the article coded E2-p.755, the statement “ I think it was beneficial for the development of intelligence. Being careful helped a lot in the development of manual skills. I also believe using the materials will improve the sensory organs .” can be shown as an example.

These codes emphasize the broad spectrum of benefits associated with coding education. They show how coding contributes to diverse subject areas and is pivotal in enhancing cognitive abilities, fostering manual dexterity, and potentially improving sensory perception through materials and hands-on experiences.

4.2 Contributions of coding education in preschool to the social-emotional domain

In the meta-thematic analysis, the sub-problem of the study, “Contributions of coding education in preschool to the social-emotional domain,” was taken as a theme. The participants’ opinions in the articles and theses were examined, and codes such as motivating, being fun, and cooperative learning were created from their statements. As a result of the meta-thematic analysis, eight codes were found under this theme. These codes are given in Fig.  3 , and Table  4 below shows their frequency and percentage values.

figure 3

Contributions of coding education in preschool to the social-emotional domain

As a result of the meta-thematic analysis, two sub-categories and eight codes were reached under the theme “Contributions of Coding Education in Preschool to the Social-Emotional Domain.” These codes are shown in Fig.  3 and Table  4 with their frequency and percentage values. Two experts (academicians) from the field of educational sciences worked on the codes and grouped them into the two sub-categories.

These sub-categories encompass crucial facets of children’s overall growth. Social and behavioral development entails acquiring the proficiencies needed for effective engagement, collaboration, and adjustment in diverse social contexts. Personal development and empowerment concentrates on individual growth, nurturing resilience, self-assurance, and autonomy so that individuals can navigate life with confidence. Together, these categories represent multiple dimensions of human maturation and skill development.

Table  4 shows that the codes are grouped around two sub-categories. Social and behavioral development has the highest rate among these sub-categories, with 76.5%. Personal development and empowerment is the sub-category with the second highest rate of 23.5%. In this context, it can be said that coding education develops social-emotional aspects in preschool children in general.

These codes belong to the social and behavioral development sub-category. The code with the highest percentage value was being fun, with 25.9%. For this code, the statement “They had much fun in the game of reaching the nest through obstacles. They put the obstacles in different places and continued to play.” in the thesis coded YT6-p.118 and the statement “It should be included in the school curriculum. It provides cognitive thinking as it both entertains and provides problem-solving skills and even cooperation…” in the article coded PQ-p.309 can be given as examples.

The codes related to supporting cooperative learning and communication can be referenced as follows: “In the field of social-emotional development, the fact that children look for solutions together, communicate and help each other during programming activities supports the development of a collaborative attitude in children.” in the thesis coded YT2-p.64 and “…The fact that group activities were given much space and the groups were mixed strengthened their communication.” in the thesis coded YT4-p.120 can be given as examples. Regarding the curiosity code, the statement “I want to place the cubes immediately for my character to move.” in the thesis coded YT5-p.78 can be exemplified.

These codes underscore how coding activities not only impart technical proficiencies but also yield considerable benefits for cultivating intangible skills among students, such as collaboration, effective communication, intrinsic motivation, and intellectual curiosity.

These codes belong to the personal development and empowerment sub-category. In the present study, the code of increasing motivation accounted for 9.9%. The statements “They were also eager to put the blocks together to create different dances.” and “…KIBO was an extraordinary source of motivation for our students,” from the articles coded WOS-p.341 and SD-p.142, can be cited as examples. Regarding the code of gaining responsibility, the statement “…Progress was made in supporting values such as respect for a partner and their ideas, the ability to wait, the development of responsibility and autonomy, and the care of materials…” in the article coded SD-p.141 can be given as an example. Regarding the code of increasing self-confidence, the statement “…Learning new things makes children feel good and increases their self-confidence. They express that they are happy after the activity.” in the article coded PT2-p.64 can be given as an example. Referring to the code of providing focus, the statement “The application contributed to the development of children in areas such as cooperation, sharing, focusing and attention…” in the article coded E2-p.754 can be exemplified.

These codes highlight how coding education transcends technical skills, fostering personal growth by enhancing motivation, instilling a sense of responsibility, boosting self-confidence, and refining essential behavioral attributes like focus and attention.

4.3 Comparison of theses data and research articles data

When the studies are classified as theses and articles, their similarities and differences in Target Age Group, Learning Focus, Main Tools, Activities, Benefits, Challenges, Educational Impact, and Teacher Involvement are presented in detail in Table 5 .

The research-article data delve into the educational application of robotics and coding activities, aimed primarily at young children in preschool and early elementary school. The emphasis is on hands-on learning experiences that integrate technology tools such as KIBO and Bee-Bot into the classroom environment. These tools are designed to introduce children to foundational concepts of programming and computational thinking in a playful and interactive way.

One of the key observations from the research articles’ data is the positive impact of these activities on various aspects of child development. Through engaging with robotics and coding, students demonstrate enhanced teamwork by collaborating with peers to solve problems and complete tasks. The iterative nature of these activities encourages perseverance and determination as students persist in their efforts to achieve success, boosting their confidence along the way.

Teachers and researchers also note the benefits of using structured materials, such as wooden blocks, in conjunction with technology tools. These materials provide tangible, hands-on experiences that help students develop spatial reasoning, problem-solving, and fine motor skills. Moreover, using concrete materials ensures that learning activities are accessible and engaging for all students, regardless of their prior experience or background knowledge.

However, integrating robotics and coding into the curriculum presents its own set of challenges. Educators highlight the importance of starting with unplugged, concrete activities to build foundational understanding before introducing technology-based tools. They also stress the need for adequate teacher training and resources to support effective implementation, particularly in designing developmentally appropriate activities and scaffolding learning experiences to meet the diverse needs of students.

In summary, the data from the research articles underscores the potential of robotics and coding activities to foster critical thinking, collaboration, and creativity among young learners. By providing hands-on experiences with technology tools, educators can help students develop essential skills for success in the digital age while promoting a positive attitude towards learning and exploration. However, achieving these goals requires careful planning, ongoing support, and a commitment to inclusive and equitable education for all students.

The theses data center on educational activities promoting active participation, problem-solving skills, and curriculum integration. Teachers engage students in diverse activities that target various learning outcomes, including motor skills and cognitive development. These activities are adaptable to different age groups and subjects, allowing flexibility in implementation.

Teachers reflect on the effectiveness of these activities, considering factors such as student engagement, comprehension, and skill acquisition. While the specific nature of the activities is not detailed, they likely involve hands-on experiences, group collaboration, and exploration of different concepts.

Overall, the theses data highlight the importance of engaging students in interactive and multidimensional learning experiences that cater to their developmental needs and enhance their understanding of various subjects.

5 Discussion

The fact that computer science is seen as a skill that all individuals should acquire in the early years has increased interest in coding. In addition, innovative coding platforms such as screenless programmable robotics, whose importance has grown in recent years for supporting 21st-century and STEM skills, have increasingly entered children’s early years (Macrides et al., 2022 ). This growing interest has increased countries’ efforts to integrate coding into their educational curricula and, in turn, accelerated research in this field. The view that coding is not only about teaching computer science concepts to children but also about skills and literacy has gained importance, as has the view that coding gives children a new perspective, way of thinking, and way of behaving. However, Popat and Starkey ( 2019 ) and Su et al. ( 2023 ) emphasize that recent studies on coding in early childhood have mainly focused on children’s coding or computational thinking, and Su et al. ( 2023 ) pointed out that studies on the effects of coding on development are limited and should be conducted. Therefore, in this study, qualitative studies on coding were examined to reveal the effects of coding on development, in the expectation that this will contribute significantly to the emerging field by showing the work already done, what remains to be done, and what gaps exist.

The meta-thematic analysis aimed to answer the primary research question: “What are the contributions of coding in early childhood education to the cognitive domain?” The findings indicate opinions that coding contributes to directive (command-giving) skills, problem-solving abilities, and fostering creativity. Cognitive-weighted learning outcomes such as transferring knowledge to daily life, effective and lasting learning, and facilitating learning have been highlighted, emphasizing their contributions to various disciplines. Quantitative studies have demonstrated that coding affects sequencing (Kazakoff & Bers, 2012 ; Kazakoff et al., 2013 ; Muñoz-Repiso & Caballero-González, 2019), problem-solving (Akyol-Altun, 2018 ; Bers et al., 2014 ; Çiftci & Bildiren, 2020 ; Fessakis et al., 2013 ), and executive functions (Di Lieto et al., 2017 ). Furthermore, coding and robotics education have significantly supported early mathematical reasoning skills in children (Blanchard et al., 2010 ; Caballero-Gonzalez et al., 2019; Di Lieto et al., 2017 ; Flannery et al., 2013; Kazakoff et al., 2013 ). Canbeldek and Işıkoğlu (2023) observed that coding and robotics education programs positively affected preschool children’s cognitive development, language skills, and creativity. Mısırlı and Komis (2014) found that their implemented program supported the development of mathematical concepts such as sequencing and repetition, algorithmic thinking, measurement, and spatial orientation in children.

Popat and Starkey ( 2019 ) highlighted that researchers reported that the inclusion of coding in school curricula provides a range of learning outcomes applicable beyond computer science. Meanwhile, Su et al. ( 2023 ) reviewed studies on coding in early childhood and emphasized that it is a new field focusing on imparting coding skills. The authors suggested evaluating the effects of coding curricula on holistic learning outcomes in early childhood, such as school readiness skills (e.g., literacy, numeracy, spatial, and social skills). They emphasized the need to assess more critical child developmental outcomes like language, self-regulation, and metacognitive skills to understand the impact of coding curricula. Zurnacı and Turan ( 2022 ) reviewed studies on coding in preschool education in Turkey, revealing that the most addressed topic was cognitive skills such as problem-solving abilities (in 7 studies), attention, sequencing, and analysis. The findings of the present study likewise show that research has emphasized a limited set of skills, even though cognitive development related to coding is a multidimensional process.

The study sought to address the question “What are the contributions of using coding in early childhood education to the socio-emotional domain?” as the second sub-problem of the research. The findings indicated that coding contributes to the socio-emotional domain by enhancing enjoyment, increasing motivation, fostering collaborative learning, improving communication skills, promoting personal development, encouraging responsibility, enhancing self-confidence, and facilitating focus. Bers (2008, 2012), who studies coding in early childhood, argues that children should be motivated while using technology and that working in a social, collaborative environment should support their social and emotional skills alongside technical ones. Building on the positive youth development approach, she developed the Positive Technological Development (PTD) framework for programs and applications designed for children and applied it in her own work. For unplugged and block-based applications, she outlined learning environments in which children can remain motivated while coding and develop social skills through collaborative work. She thereby offered a road map for challenging the view that technology harms children’s social and emotional development and for actively supporting these developmental domains.

Consistent with the results of this study, similar studies also indicate that coding supports socio-emotional development. Coding-focused applications have been shown to support children’s peer collaboration, communication, and social relationships (Bers et al., 2019; Caballero-Gonzalez et al., 2019; Critten et al., 2022; Fessakis et al., 2013; Flannery et al., 2013; Lee et al., 2013; Pugnali et al., 2017; Sullivan & Bers, 2016). Studies have also shown that coding supports children’s self-regulation skills (Canbeldek & Işıkoğlu, 2023; Di Lieto et al., 2017; Kazakoff, 2014). Heikkilä (2020) observed that coding-supported robotics applications generated significant interest in children, increased their patience and enthusiasm, and reduced gender-biased perspectives.

The study sought to address the question “How do the data from theses compare with the data from research articles?” as the third sub-problem of the research. The thesis data, which focus on LEGO-based education, primarily target elementary and middle school students, offering activities that foster creativity, problem-solving, and engineering skills. Students build structures, mechanisms, and robots using LEGO bricks, motors, and sensors. This approach develops learners’ spatial reasoning and engineering abilities, although the complexity of designs and motor programming can present challenges. Teachers in this context typically serve as facilitators, guiding students through exploration and experimentation.

In contrast, the research-article data revolve around robotics and coding education for preschool and early elementary school students. This strand emphasizes computational thinking, coding skills, and teamwork, often using tools such as KIBO and Bee-Bot. Students engage in sequencing, programming, and interactive storytelling, which promote collaboration, critical thinking, and fine motor skills. However, integrating technology and ensuring age-appropriateness can be significant challenges for educators in this domain. Teachers play a more active role in designing activities and scaffolding learning experiences to suit the developmental needs of young learners.
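To make concrete what “sequencing” means in tools like Bee-Bot, a floor robot that executes a queued list of movement commands, the following Python sketch simulates such a command sequence. This is illustrative only: the grid model, function name, and exact command set are assumptions for the example, not taken from the article or from any specific robot’s firmware.

```python
def run_sequence(commands, start=(0, 0), heading=(0, 1)):
    """Execute a Bee-Bot-style list of commands and return the final grid cell.

    Supported commands: 'forward', 'back' (move one cell), and
    'left', 'right' (turn 90 degrees in place without moving).
    """
    x, y = start
    dx, dy = heading
    for cmd in commands:
        if cmd == "forward":
            x, y = x + dx, y + dy
        elif cmd == "back":
            x, y = x - dx, y - dy
        elif cmd == "left":      # rotate heading 90° counter-clockwise
            dx, dy = -dy, dx
        elif cmd == "right":     # rotate heading 90° clockwise
            dx, dy = dy, -dx
        else:
            raise ValueError(f"unknown command: {cmd}")
    return (x, y)

# A child's plan: two steps ahead, turn right, one more step.
print(run_sequence(["forward", "forward", "right", "forward"]))  # → (1, 2)
```

The pedagogical point is that the child must plan the whole sequence before execution and then debug it against the robot’s actual path, which is the sequencing-and-revision cycle the studies above measure.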

While both topics aim to enhance students’ learning experiences and skill development, they differ in target age groups, learning focus, main tools, and teacher involvement. LEGO-based education leans towards older students and emphasizes hands-on building and engineering, while robotics and coding education caters to younger learners and prioritizes computational thinking and programming skills. Despite these differences, both approaches foster the creativity, problem-solving, and critical thinking skills essential for success in the 21st century.

Due to the nature of meta-thematic research (Batdı, 2019), the data used in this study consisted only of articles and theses that presented experimental studies and direct participant views, so the comparison between articles and theses was limited to these sources. A more detailed comparison is recommended as a contribution to the field.

Reviews conducted on coding in early childhood (Lye & Koh, 2014; Macrides et al., 2022; Papadakis et al., 2016; Su et al., 2023) have revealed significant findings. These reviews indicate that intervention programs focus primarily on children’s coding and computational thinking skills, with only a limited number examining their impact on developmental domains. The present study, in contrast, examined coding’s influence on cognitive and socio-emotional development specifically. A notable finding is that studies of coding concentrate on only a few foundational skills within cognitive and socio-emotional development.

Previous review studies have contributed significantly to understanding the coding practices, approaches, methods, techniques, materials, and assessments used in these interventions, and they have outlined a framework for studies centered on coding. In addition, identifying the views, thoughts, and trends of practitioners and researchers regarding coding can make a substantial contribution to the field, ultimately strengthening future studies.

This study points to a trend suggesting that coding contributes to the cognitive and socio-emotional domains and supports various aspects of development within them. These views on coding’s developmental impacts now need to be validated and confirmed through empirical studies.

Data availability

The data used to support the findings of this study are available from the corresponding author upon request.

Ackermann, E. (2001). Piaget’s constructivism, Papert’s constructionism: What is the difference. Future of Learning Group Publication , 5 (3), 438.


Akyol-Altun, C. (2018). Okul öncesi öğretim programına algoritma ve kodlama eğitimi entegrasyonunun öğrencilerin problem çözme becerisine etkisi (Master’s thesis), Ankara University, Ankara.

Arabacıoğlu, T., Bülbül, H. İ., & Filiz, A. (2007). Bilgisayar programlama öğretiminde yeni bir yaklaşım. Akademik Bilişim , 193–197.

Barendsen, E., & Stoker, I. (2013, November). Computational thinking in CS teaching materials: A pilot study. In Proceedings of the 13th Koli Calling International Conference on Computing Education Research (pp. 199–200). https://doi.org/10.1145/2526968.2526995 .

Barron, B., Cayton-Hodges, G., Bofferding, L., Copple, C., Darling-Hammond, L., & Levine, M. H. (2011). Take a giant step: A blueprint for teaching young children in a digital age . Joan Ganz Cooney Center at Sesame Workshop.

Batdı, V. (2017). The effect of multiple intelligences on academic achievement: A meta-analytic and thematic study. Educational Sciences: Theory & Practice , 17 (6), 2057–2092.

Batdı, V. (2019). Meta tematik analiz örnek uygulamalar . Anı Yayıncılık.

Baytak, A., & Land, S. M. (2011). An investigation of the artifacts and process of constructing computers games about environmental science in a fifth grade classroom. Educational Technology Research and Development , 59 (6), 765–782.


Bers, M. U. (2008). Civic identities, online technologies: From designing civics curriculum to supporting civic experiences . MacArthur Foundation Digital Media and Learning Initiative.

Bers, M. U. (2010). The TangibleK robotics program: Applied computational thinking for young children. Early Childhood Research & Practice , 12 (2), n2.

Bers, M. U. (2012). Designing digital experiences for positive youth development: From playpen to playground . OUP USA.

Bers, M. U. (2018a). Coding and computational thinking in early childhood: The impact of ScratchJr in Europe. European Journal of STEM Education , 3 (3), 8.

Bers, M. U. (2019). Coding as another language: A pedagogical approach for teaching computer science in early childhood. Journal of Computers in Education , 6 (4), 499–528. https://doi.org/10.1007/s40692-019-00147-3 .

Bers, M. U. (2020). Coding as a playground: Programming and computational thinking in the early childhood classroom . Routledge.

Bers, M. U. (2018b, April). Coding, playgrounds, and literacy in early childhood education: The development of KIBO robotics and ScratchJr. In 2018 IEEE Global Engineering Education Conference (EDUCON) (pp. 2094–2102). https://doi.org/10.1109/EDUCON.2018.8363498 .

Bers, M. U., & Horn, M. S. (2010). Tangible programming in early childhood. High-tech tots: Childhood in a Digital World , 49 , 49–70.

Bers, M. U., Ponte, I., Juelich, C., Viera, A., & Schenker, J. (2002). Teachers as designers: Integrating robotics in early childhood education. Information Technology in Childhood Education Annual , 2002 (1), 123–145.

Bers, M. U., Flannery, L., Kazakoff, E. R., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education , 72 , 145–157. https://doi.org/10.1016/j.compedu.2013.10.020 .

Bers, M. U., González-González, C., & Armas–Torres, M. B. (2019). Coding as a playground: Promoting positive learning experiences in childhood classrooms. Computers & Education , 138 , 130–145. https://doi.org/10.1016/j.compedu.2019.04.013 .

Blanchard, S., Freiman, V., & Lirrete-Pitre, N. (2010). Strategies used by elementary schoolchildren solving robotics-based complex tasks: Innovative potential of technology. Procedia-Social and Behavioral Sciences , 2 (2), 2851–2857. https://doi.org/10.1016/j.sbspro.2010.03.427 .

Bocconi, S., Chioccariello, A., & Earp, J. (2018). The Nordic approach to introducing Computational Thinking and programming in compulsory education. Report prepared for the Nordic@ BETT2018 Steering Group , 42 .

Burke, Q., O’Byrne, W. I., & Kafai, Y. B. (2016). Computational participation: Understanding coding as an extension of literacy instruction. Journal of Adolescent & Adult Literacy , 59 (4), 371–375. https://doi.org/10.1002/jaal.496 .

Caballero-Gonzalez, Y. A., Muñoz-Repiso, A. G. V., & García-Holgado, A. (2019, October). Learning computational thinking and social skills development in young children through problem-solving with educational robotics. In Proceedings of the seventh international conference on technological ecosystems for enhancing Multiculturality (pp. 19–23). https://doi.org/10.1145/3362789.3362874 .

Campbell, C., & Walsh, C. (2017). Introducing the new digital literacy of coding in the early years. Practical Literacy , 22 (3), 10–12.

Canbeldek, M., & Isikoglu, N. (2023). Exploring the effects of productive children: Coding and robotics education program in early childhood education. Education and Information Technologies , 28 (3), 3359–3379. https://doi.org/10.1007/s10639-022-11315-x .

Ching, Y. H., Hsu, Y. C., & Baldwin, S. (2018). Developing computational thinking with educational technologies for young learners. TechTrends , 62 , 563–573. https://doi.org/10.1007/s11528-018-0292-7 .

Chou, P. N. (2020). Using ScratchJr to foster young children’s computational thinking competence: A case study in a third-grade computer class. Journal of Educational Computing Research , 58 (3), 570–595. https://doi.org/10.1177/0735633119872908 .

Çiftci, S., & Bildiren, A. (2020). The effect of coding courses on the cognitive abilities and problem-solving skills of preschool children. Computer Science Education , 30 (1), 3–21. https://doi.org/10.1080/08993408.2019.1696169 .

Clements, D. H. (1999). The future of educational computing research: The case of computer programming. Information Technology in Childhood Education Annual , 1999 (1), 147–179.

Clements, D. H., & Gullo, D. F. (1984). Effects of computer programming on young children’s cognition. Journal of Educational Psychology , 76 (6), 1051. https://doi.org/10.1037/0022-0663.76.6.1051 .

Clements, D. H., & Meredith, J. S. (1993). Research on Logo: Effects and efficacy. Journal of Computing in Childhood Education , 4 (4), 263–290.

Critten, V., Hagon, H., & Messer, D. (2022). Can preschool children learn programming and coding through guided play activities? A case study in computational thinking. Early Childhood Education Journal , 50 (6), 969–981. https://doi.org/10.1007/s10643-021-01236-8 .

Computer Science Teachers Association (CSTA). (2020). CSTA K–12 computer science standards .

Czerkawski, B. (2015, October). Computational thinking in virtual learning environments. In E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education (pp. 1227–1231). Association for the Advancement of Computing in Education (AACE).

Demirer, V., & Sak, N. (2016). Programming education and new approaches around the world and in Turkey. Eğitimde Kuram Ve Uygulama , 12 (3), 521–546.

Di Lieto, M. C., Inguaggiato, E., Castro, E., Cecchi, F., Cioni, G., Dell’Omo, M., & Dario, P. (2017). Educational Robotics intervention on executive functions in preschool children: A pilot study. Computers in Human Behavior , 71 , 16–23. https://doi.org/10.1016/j.chb.2017.01.018 .

Digital News Asia (2015). IDA launches S $1.5 m pilot to roll out tech toys for preschoolers.

European Schoolnet (2015). Programming & coding policies: Where are we at European level? http://www.eun.org/news/detail?articleId=649724

Fesakis, G., & Serafeim, K. (2009). Influence of the familiarization with scratch on future teachers’ opinions and attitudes about programming and ICT in education. ACM SIGCSE Bulletin , 41 (3), 258–262. https://doi.org/10.1145/1595496.1562957 .

Fessakis, G., Gouli, E., & Mavroudi, E. (2013). Problem-solving by 5–6 years old kindergarten children in a computer programming environment: A case study. Computers & Education , 63 , 87–97. https://doi.org/10.1016/j.compedu.2012.11.016 .

Flannery, L. P., Silverman, B., Kazakoff, E. R., Bers, M. U., Bontá, P., & Resnick, M. (2013, June). Designing ScratchJr: Support for early childhood learning through computer programming. In Proceedings of the 12th international conference on interaction design and children (pp. 1–10). https://doi.org/10.1145/2485760.2485785 .

Fox, R. W., & Farmer, M. E. (2011). The effect of computer programming education on the reasoning skills of high school students. In Proceedings of the International Conference on Frontiers in Education: Computer Science and Computer Engineering (FECS) (p. 1). The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).

Futschek, G. (2006). Algorithmic thinking: the key for understanding computer science. In International conference on informatics in secondary schools-evolution and perspectives (pp. 159–168). Berlin, Heidelberg: Springer Berlin Heidelberg.

Futschek, G., & Moschitz, J. (2011). Learning algorithmic thinking with tangible objects eases transition to computer programming. In Informatics in Schools. Contributing to 21st Century Education: 5th International Conference on Informatics in Schools: Situation, Evolution and Perspectives, ISSEP 2011, Bratislava, Slovakia, October 26–29, 2011. Proceedings 5 (pp. 155–164). Springer Berlin Heidelberg.

García-Peñalvo, F. J., Rees, A. M., Hughes, J., Jormanainen, I., Toivonen, T., & Vermeersch, J. (2016, November). A survey of resources for introducing coding into schools. In Proceedings of the Fourth International Conference on Technological Ecosystems for enhancing multiculturality (pp. 19–26).

García-Valcárcel Muñoz-Repiso, A., & Caballero González, Y. A. (2017). Development of computational thinking and collaborative learning in kindergarten using programmable educational robots: A teacher training experience. https://doi.org/10.1145/3144826.3145353 .

Gibson, J. P. (2012). Teaching graph algorithms to children of all ages. In Proceedings of the 17th ACM annual conference on Innovation and technology in computer science education (pp. 34–39). https://doi.org/10.1145/2325296.2325308 .

Goldenberg, E. P., & Carter, C. J. (2021). Programming as a language for young children to express and explore mathematics in school. British Journal of Educational Technology , 52 (3), 969–985. https://doi.org/10.1111/bjet.13080 .

González, M. R. (2015). Computational thinking test: Design guidelines and content validation. In EDULEARN15 Proceedings (pp. 2436–2444). IATED.

Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher , 42 (1), 38–43. https://doi.org/10.3102/0013189X12463051 .

Grover, S., & Pea, R. (2018). Computational thinking: A competency whose time has come. Computer Science Education: Perspectives on Teaching and Learning in School , 19 (1), 19–38.

Heikkilä, M. (2020). What happens when the robot gets eyelashes? Gender perspective on programming in preschool. STEM Education across the Learning Continuum: Early Childhood to Senior Secondary , 29–44. https://doi.org/10.1007/978-981-15-2821-7_3 .

Heintz, F., Mannila, L., & Färnqvist, T. (2016). A review of models for introducing computational thinking, computer science, and computing in K–12 education. Teoksessa 2016 IEEE frontiers in education conference (FIE).

Hsu, T. C. (2019). A study of the readiness of implementing computational thinking in compulsory education in Taiwan. Computational Thinking Education , 295–314.

Israel, M., Pearson, J. N., Tapia, T., Wherfel, Q. M., & Reese, G. (2015). Supporting all learners in school-wide computational thinking: A cross-case qualitative analysis. Computers & Education , 82 , 263–279.

ISTE (2019). International Society for Technology in Education [Conference presentation]. Philadelphia.

Kafai, Y. B., & Burke, Q. (2014). Connected code: Why children need to learn programming . MIT Press.

Kalogiannakis, M., & Papadakis, S. (2017). Pre-service kindergarten teacher’s acceptance of ScratchJr as a tool for learning and teaching computational thinking and Science education.

Kazakoff, E. R. (2014). Cats in Space, Pigs that Race: Does self-regulation play a role when kindergartners learn to code? (Doctoral dissertation, Tufts University).

Kazakoff, E., & Bers, M. (2012). Programming in a robotics context in the kindergarten classroom: The impact on sequencing skills. Journal of Educational Multimedia and Hypermedia , 21 (4), 371–391.

Kazakoff, E. R., & Bers, M. U. (2014). Put your robot in, put your robot out: Sequencing through programming robots in early childhood. Journal of Educational Computing Research , 50 (4), 553–573. https://doi.org/10.2190/EC.50.4.f .

Kazakoff, E. R., Sullivan, A., & Bers, M. U. (2013). The effect of a classroom-based intensive robotics and programming workshop on sequencing ability in early childhood. Early Childhood Education Journal , 41 , 245–255. https://doi.org/10.1007/s10643-012-0554-5 .

Lai, A. F., & Yang, S. M. (2011, September). The learning effect of visualized programming learning on 6th graders’ problem solving and logical reasoning abilities. In 2011 International Conference on Electrical and Control Engineering (pp. 6940–6944). IEEE.

Lambert, L., & Guiffre, H. (2009). Computer science outreach in an elementary school. Journal of Computing Sciences in Colleges , 24 (3), 118–124.

Lee, F., & Björklund Larsen, L. (2019). How should we theorize algorithms? Five ideal types in analyzing algorithmic normativities. Big Data & Society , 6 (2), 2053951719867349. https://doi.org/10.1177/2053951719867349 .

Lee, J., & Junoh, J. (2019). Implementing unplugged coding activities in early childhood classrooms. Early Childhood Education Journal , 47 , 709–716. https://doi.org/10.1007/s10643-019-00967-z .

Lee, K. T., Sullivan, A., & Bers, M. U. (2013). Collaboration by design: Using robotics to foster social interaction in kindergarten. Computers in the Schools , 30 (3), 271–281. https://doi.org/10.1080/07380569.2013.805676 .

Lee, J., Jung, Y., & Park, H. (2017). Gender differences in computational thinking, creativity, and academic interest on elementary SW education. Journal of the Korean Association of Information Education , 21 (4), 381–391.

Lin, Y., & Weintrop, D. (2021). The landscape of Block-based programming: Characteristics of block-based environments and how they support the transition to text-based programming. Journal of Computer Languages , 67 , 101075. https://doi.org/10.1016/j.cola.2021.101075 .

Liu, H. P., Perera, S. M., & Klein, J. W. (2017). Using model-based learning to promote computational thinking education. Emerging research, practice, and policy on computational thinking , 153–172. https://doi.org/10.1007/978-3-319-52691-1_10 .

Lye, S. Y., & Koh, J. H. L. (2014). Review on teaching and learning of computational thinking through programming: What is next for K-12? Computers in Human Behavior , 41 , 51–61. https://doi.org/10.1016/j.chb.2014.09.012 .

Macrides, E., Miliou, O., & Angeli, C. (2022). Programming in early childhood education: A systematic review. International Journal of Child-Computer Interaction , 32 , 100396. https://doi.org/10.1016/j.ijcci.2021.100396 .

Mclennan, D. P. (2017). Creating coding stories and games. Teaching Young Children , 10 (3), 18–21.

McLennan, D. P. (2018). Code breaker: Increase creativity, remix assessment, and develop a class of coder ninjas. Journal of Teaching and Learning , 12 (1), 51–52.


Misirli, A., & Komis, V. (2014). Robotics and programming concepts in early Childhood Education: A conceptual framework for designing educational scenarios. Research on e-learning and ICT in education: Technological, pedagogical and instructional perspectives (pp. 99–118). Springer New York. https://doi.org/10.1007/978-1-4614-6501-0_8 .

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & PRISMA Group*. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Annals of Internal Medicine , 151 (4), 264–269. https://doi.org/10.7326/0003-4819-151-4-200908180-00135 .

Monteiro, A. F., Miranda-Pinto, M., & Osório, A. J. (2021). Coding as literacy in preschool: A case study. Education Sciences , 11 (5), 198. https://doi.org/10.3390/educsci11050198 .

Moreno-León, J., Robles, G., & Román-González, M. (2015). Dr. Scratch: análisis automático de proyectos Scratch para evaluar y fomentar el pensamiento computacional. Revista de Educación a Distancia (RED) , (46).

Muñoz-Repiso, A. G. V., & González, Y. A. C. (2019). Robótica para desarrollar El pensamiento computacional en Educación Infantil. Comunicar: Revista Científica Iberoamericana De Comunicación Y Educación , 59 , 63–72.

National Association for the Education of Young Children (NAEYC) (2012). Technology and interactive media as tools in early childhood programs serving children from birth through age 8 (DRAFT). Retrieved from http://www.naeyc.org/content/technology-and-young-children .

New, R. S., & Cochran, M. (Eds.). (2007). Early childhood education: An international encyclopedia (Vol. 4). Greenwood Publishing Group.

Papadakis, S., Kalogiannakis, M., & Zaranis, N. (2016). Developing fundamental programming concepts and computational thinking with ScratchJr in preschool education: A case study. International Journal of Mobile Learning and Organization , 10 (3), 187–202. https://doi.org/10.1504/IJMLO.2016.077867 .

Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas . Basic Books.

Papert, S. (2000). What is the big idea? Toward a pedagogy of idea power. IBM Systems Journal , 39 (3.4), 720–729.

Papert, S. (2005). Teaching children thinking. Contemporary Issues in Technology and Teacher Education , 5 (3), 353–365.

Pila, S., Aladé, F., Sheehan, K. J., Lauricella, A. R., & Wartella, E. A. (2019). Learning to code via tablet applications: An evaluation of daisy the Dinosaur and Kodable as learning tools for young children. Computers & Education , 128 , 52–62.

Popat, S., & Starkey, L. (2019). Learning to code or coding to learn? A systematic review. Computers & Education , 128 , 365–376. https://doi.org/10.1016/j.compedu.2018.10.005 .

Portelance, D. J., Strawhacker, A. L., & Bers, M. U. (2016). Constructing the ScratchJr programming language in the early childhood classroom. International Journal of Technology and Design Education , 26 , 489–504. https://doi.org/10.1007/s10798-015-9325-0 .

Pugnali, A., Sullivan, A., & Bers, M. U. (2017). The impact of user interface on young children’s computational thinking. Journal of Information Technology Education Innovations in Practice , 16 , 171.

Relkin, E., de Ruiter, L. E., & Bers, M. U. (2021). Learning to code and the acquisition of computational thinking by young children. Computers & Education , 169 , 104222. https://doi.org/10.1016/j.compedu.2021.104222 .

Resnick, M. (2006). Computer as paintbrush: Technology, play, and the creative society. In Play = learning: How play motivates and enhances children’s cognitive and social-emotional growth (pp. 192–208).

Resnick, M., & Rusk, N. (2020). Coding at a crossroads. Communications of the ACM , 63 (11), 120–127.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age . Or Books.

Saxena, A., Lo, C. K., Hew, K. F., & Wong, G. K. W. (2020). Designing unplugged and plugged activities to cultivate computational thinking: An exploratory study in early childhood education. The Asia-Pacific Education Researcher , 29 (1), 55–66. https://doi.org/10.1007/s40299-019-00478-w .

Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies , 18 (2), 351–380.

Siper-Kabadayı, G. (2019). The effects of robotic activities on pre-school children’s creative thinking skills. (Master thesis). Hacettepe Unıversity, Ankara.

Strawhacker, A., & Bers, M. U. (2015). I want my robot to look for food: Comparing Kindergartner’s programming comprehension using tangible, graphic, and hybrid user interfaces. International Journal of Technology and Design Education , 25 , 293–319. https://doi.org/10.1007/s10798-014-9287-7 .

Strawhacker, A., Portelance, D., Lee, M., & Bers, M. U. (2015). Designing tools for developing minds: the role of child development in educational technology. In Proceedings of the 14th International Conference on Interaction Design and Children.

Su, J., Yang, W., & Li, H. (2023). A scoping review of studies on coding curriculum in early childhood: Investigating its design, implementation, and evaluation. Journal of Research in Childhood Education , 37 (2), 341–361.

Sulistyaningtyas, R. E., Yuliantoro, P., Astiyani, D., & Nugraheni, C. (2021, September). A literature review of coding for early childhood. In Proceedings of the 2nd Borobudur International Symposium on Humanities and Social Sciences, BIS-HSS 2020, November 18, 2020, Magelang, Central Java, Indonesia.

Sullivan, A., & Bers, M. U. (2016). Robotics in the early childhood classroom: Learning outcomes from an 8-week robotics curriculum in pre-kindergarten through second grade. International Journal of Technology and Design Education , 26 , 3–20. https://doi.org/10.1007/s10798-015-9304-5 .

Sullivan, A., & Bers, M. U. (2018). Dancing robots: Integrating art, music, and robotics in Singapore’s early childhood centers. International Journal of Technology and Design Education , 28 , 325–346. https://doi.org/10.1007/s10798-017-9397-0 .

Sullivan, A., & Bers, M. U. (2019). Computer science education in early childhood: The case of ScratchJr. Journal of Information Technology Education Innovations in Practice , 18 , 113.

Sullivan, A. A., Bers, M. U., & Mihm, C. (2017). Imagining, playing, and coding with KIBO: Using robotics to foster computational thinking in young children (p. 110). Siu-cheung KONG The Education University of Hong Kong.

Svensson, A. K. (2000). Computers in school: Socially isolating or a tool to promote collaboration? Journal of Educational Computing Research , 22 (4), 437–453. https://doi.org/10.2190/30KT-1VLX-FHTM-RCD6 .

Thomas, J., & Harden, A. (2008). Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Medical Research Methodology , 8 (1), 1–10. https://doi.org/10.1186/1471-2288-8-45 .

Toh, L. P. E., Causo, A., Tzuo, P. W., Chen, I. M., & Yeo, S. H. (2016). A review on the use of robots in education and young children. Journal of Educational Technology & Society , 19 (2), 148–163.

U.S. Department of Education, Office of Educational Technology (2010). Transforming American education: Learning powered by technology. U.S. Department of Education, Office of Educational Technology. http://www.ed.gov/technology/netp-2010 (Accessed 12 June 2023).

Van Roy, P., & Haridi, S. (2004). Concepts, techniques, and models of computer programming . MIT Press.

Vee, A. (2013). Understanding computer programming as a literacy. Literacy in Composition Studies , 1 (2), 42–64. http://d-scholarship.pitt.edu/id/eprint/21695 .

Vorderman, C. (2019). Computer coding for kids: A unique step-by-step visual guide, from binary code to building games . Dorling Kindersley Ltd.

Wang, D., Zhang, C., & Wang, H. (2011). T-Maze: A tangible programming tool for children. Proceedings of the 10th International Conference on Interaction Design and Children , 127–135. https://doi.org/10.1145/1999030.1999045 .

Wartella, E. A., & Jennings, N. (2000). Children and computers: New technology. Old concerns. The Future of Children , 31–43. https://doi.org/10.2307/1602688 .

Webb, M., Davis, N., Bell, T., Katz, Y. J., Reynolds, N., Chambers, D. P., & Sysło, M. M. (2017). Computer science in K-12 school curricula of the 2lst century: Why, what and when? Education and Information Technologies , 22 , 445–468. https://doi.org/10.1007/s10639-016-9493-x .

Wilson, C., Sudol, L. A., Stephenson, C., & Stehlik, M. (2010). Running on empty: The failure to teach k–12 computer science in the digital age . ACM.

Wing, J. M. (2006). Computational thinking. Communications of the ACM , 49 (3), 33–35. https://doi.org/10.1145/1118178.1118215 .

Zurnaci, B., & Turan, Z. (2022). Türkiye’de okul öncesinde kodlama eğitimine ilişkin yapılan çalışmaların incelenmesi. Kocaeli Üniversitesi Eğitim Dergisi , 5 (1), 258–286.


Open access funding provided by the Scientific and Technological Research Council of Türkiye (TÜBİTAK).

Author information

Authors and Affiliations

Department of Educational Sciences, Gaziantep University, Gaziantep, Turkey

Mehmet Başaran

Department of Preschool Teacher Education, Hasan Kalyoncu University, Gaziantep, Turkey

Şermin Metin

Department of Educational Sciences, Sakarya University, Sakarya, Turkey

Ömer Faruk Vural


Corresponding author

Correspondence to Mehmet Başaran .

Ethics declarations

Conflict of interest.

All authors declare that there are no conflicts of interest.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Başaran, M., Metin, Ş. & Vural, Ö.F. Meta-thematic synthesis of research on early childhood coding education: A comprehensive review. Educ Inf Technol (2024). https://doi.org/10.1007/s10639-024-12675-2


Received: 01 October 2023

Accepted: 02 April 2024

Published: 24 April 2024

DOI: https://doi.org/10.1007/s10639-024-12675-2


Keywords

  • Coding education
  • Early childhood development
  • Cognitive growth
  • Socio-emotional development
  • Preschool curriculum integration


Published on 23.4.2024 in Vol 26 (2024)

This is a member publication of University of Oxford (Jisc)

Empowering School Staff to Support Pupil Mental Health Through a Brief, Interactive Web-Based Training Program: Mixed Methods Study

Authors of this article:


Original Paper

  • Emma Soneson 1, 2, PhD;
  • Emma Howarth 3, PhD;
  • Alison Weir 4, 5, MA, MSc;
  • Peter B Jones 2*, PhD;
  • Mina Fazel 1*, DM

1 Department of Psychiatry, University of Oxford, Oxford, United Kingdom

2 Department of Psychiatry, University of Cambridge, Cambridge, United Kingdom

3 School of Psychology, University of East London, London, United Kingdom

4 Faculty of Education, University of Cambridge, Cambridge, United Kingdom

5 Howard Community Academy, Anglian Learning multi-academy trust, Bury St Edmunds, United Kingdom

*these authors contributed equally

Corresponding Author:

Emma Soneson, PhD

Department of Psychiatry

University of Oxford

Warneford Lane

Oxford, OX3 7JX

United Kingdom

Phone: +44 1865 613127

Email: [email protected]

Background: Schools in the United Kingdom and elsewhere are expected to protect and promote pupil mental health. However, many school staff members do not feel confident in identifying and responding to pupil mental health difficulties and report wanting additional training in this area.

Objective: We aimed to explore the feasibility of Kognito’s At-Risk for Elementary School Educators, a brief, interactive web-based training program that uses a simulation-based approach to improve school staff’s knowledge and skills in supporting pupil mental health.

Methods: We conducted a mixed methods, nonrandomized feasibility study of At-Risk for Elementary School Educators in 6 UK primary schools. Our outcomes were (1) school staff’s self-efficacy and preparedness to identify and respond to pupil mental health difficulties, (2) school staff’s identification of mental health difficulties and increased risk of mental health difficulties, (3) mental health support for identified pupils (including conversations about concerns, documentation of concerns, in-class and in-school support, and referral and access to specialist mental health services), and (4) the acceptability and practicality of the training. We assessed these outcomes using a series of questionnaires completed at baseline (T1), 1 week after the training (T2), and 3 months after the training (T3), as well as semistructured qualitative interviews. Following guidance for feasibility studies, we assessed quantitative outcomes across time points by comparing medians and IQRs and analyzed qualitative data using reflexive thematic analysis.

Results: A total of 108 teachers and teaching assistants (TAs) completed T1 questionnaires, 89 (82.4%) completed T2 questionnaires, and 70 (64.8%) completed T3 questionnaires; 54 (50%) completed all 3. Eight school staff members, including teachers, TAs, mental health leads, and senior leaders, participated in the interviews. School staff reported greater confidence and preparedness in identifying and responding to mental health difficulties after completing the training. The proportion of pupils whom they identified as having mental health difficulties or increased risk declined slightly over time (median T1=10%; median T2=10%; median T3=7.4%), but findings suggested a slight increase in accuracy compared with a validated screening measure (the Strengths and Difficulties Questionnaire). In-school mental health support outcomes for identified pupils improved after the training, with increases in formal documentation and communication of concerns as well as provision of in-class and in-school support. Referrals and access to external mental health services remained constant. The qualitative findings indicated that school staff perceived the training as useful, practical, and acceptable.

Conclusions: The findings suggest that brief, interactive web-based training programs such as At-Risk for Elementary School Educators are a feasible means to improve the identification of and response to mental health difficulties in UK primary schools. Such training may help address the high prevalence of mental health difficulties in this age group by helping facilitate access to care and support.

Introduction

In recent years, there has been an increased emphasis on the role of schools in supporting children’s mental health [ 1 - 3 ]. This enhanced focus has been driven in large part by an apparent increase in mental health difficulties (including behavioral, social, and emotional difficulties) present in school-aged populations [ 4 - 6 ]—a concern that became increasingly prominent in the context of the COVID-19 pandemic and the associated school closures and social distancing measures [ 7 , 8 ]. There is also a growing recognition of the many unique advantages of using the school setting to promote and protect pupil mental health [ 9 ]. First, most lifetime disorders begin during the schooling years [ 10 ], which suggests that schools may be an ideal setting for early identification and intervention. Second, schools have access to most children, meaning that they are an important component of any public health approach to address child mental health difficulties [ 11 - 14 ]. Third, schools benefit from prolonged engagement with pupils, which can facilitate the implementation of mental health promotion and prevention strategies as well as support and interventions for pupils with identified mental health needs [ 12 ]. Finally, mental health support in schools is often more accessible to families than other types of support [ 15 ].

However, while school staff are increasingly expected to support children’s mental health [ 1 ], many do not feel prepared to do so [ 16 - 19 ] due in part to receiving limited training and supervision in this area [ 20 ]. Therefore, improving school staff’s confidence and preparedness are important considerations for supporting them in taking an expanded role in pupil mental health [ 21 ]. Most schools offer some form of mental health training [ 22 , 23 ], but many staff members believe that they could benefit from additional training [ 18 - 20 , 24 - 26 ]. One area where staff training may be particularly beneficial is the identification of and first response to pupils who have mental health difficulties or who are believed to be at increased risk of developing them. However, although there is evidence suggesting that school staff, parents, and practitioners see such training as an acceptable, feasible, and potentially useful way to support pupil mental health [ 20 , 27 - 29 ], empirical evidence for the effectiveness of such training is limited and focuses primarily on intermediate outcomes (eg, staff knowledge and confidence) rather than downstream outcomes (eg, accurate identification, access to support, and mental health outcomes) [ 30 , 31 ]. Furthermore, there are several potential barriers to implementing training programs in schools, including time, cost, and resource requirements [ 28 ].

At-Risk for Elementary School Educators: A Brief, Interactive Web-Based Training

Training programs that address these barriers may be beneficial for supporting schools in identifying and responding to pupil mental health difficulties. Brief, interactive web-based training programs are a particularly promising avenue as they have the potential to be more affordable, flexible, and scalable than other training formats. One such training is At-Risk for Elementary School Educators (hereinafter, At-Risk), a virtual simulation-based program developed by the American company Kognito [ 32 ]. The program, which has been completed by >125,000 teachers in the United States, aims to improve pupil mental health by “[building] awareness, knowledge, and skills about mental health, and [preparing] users to lead real-life conversations with pupils, parents, and caregivers about their concerns and available support” [ 33 ].

The program addresses many common implementation barriers to school-based mental health training. For example, At-Risk only requires approximately 1 hour to complete, which is much shorter than many other available training programs [ 31 , 34 ]. This comparatively low time commitment may address the concern that training programs are overly time intensive and, thus, make the training more feasible for busy schools [ 28 , 34 , 35 ]. The web-based format of At-Risk may also address concerns about school-based mental health programs being resource intensive [ 28 ]. Nearly all school mental health training programs documented in the literature are face-to-face sessions led by external facilitators [ 34 , 36 ], with only a few examples of web-based training [ 37 - 39 ]. For schools with limited budgets, programs requiring external facilitators can prove unsustainable and have limited scalability. In terms of financial resources, the costs of At-Risk vary depending on the number of licenses purchased, but the maximum cost is approximately £22 (US $30) per user, a price point that is feasible for many UK schools. In the United States, there have been many examples of bulk purchases at the district or state level that have made the training even more affordable per teacher. In many areas, the training is even free at point of use due to state- or district-wide licensing agreements [ 40 ].

To date, 3 randomized studies have examined the effectiveness and acceptability of At-Risk among samples of American teachers [ 17 , 41 ] and teachers in training [ 42 ] across school years. Each study found high satisfaction ratings, with between 75% and 85% of participants rating the training as useful, well constructed, relevant, and easy to use, and nearly all (88%-95%) reporting that they would recommend it to colleagues. The training also improved teachers’ self-rated preparedness, self-efficacy, and likelihood of identifying and discussing concerns about pupils’ mental health and referring them to appropriate support when needed. These improvements were reflected in the teachers’ behaviors—compared with teachers in the control group, those who completed At-Risk self-reported significantly more helping behaviors (eg, identifying psychological distress, discussing concerns with pupils and parents, and consulting with parents about options for care and support) and gatekeeping behaviors (ie, connecting pupils with care and support) after the training and at 3 months after the training. The findings of these studies indicate that At-Risk may help improve teachers’ ability to identify and respond to pupil mental health needs and lead to positive behavior change in terms of discussing concerns and facilitating access to care and support.

At-Risk in a UK Context: Considerations for Transportability

These 3 studies suggest that At-Risk may be a promising intervention for improving children’s mental health; however, there is still much to be learned about the training’s effectiveness, feasibility, and acceptability. Furthermore, to date, no evaluation of the training has been conducted outside the United States. There is increasing focus on the influence of context on the effectiveness of complex interventions [ 43 - 48 ], and while some interventions have shown success in terms of transportability [ 48 ], other interventions that have evidence of effectiveness in one context have demonstrated null or even negative effects in another [ 46 ]. Furthermore, information that could inform “transportability” is often not collected as part of evaluations [ 44 ], making it difficult to determine the likelihood of success in a new setting.

There are many contextual differences between the United States and the United Kingdom that could mean that school-based interventions developed in one country may not translate well to the other. Cross-country differences in education systems and (mental) health services are particularly relevant to this study. Differences in the education system include the length and content of initial teacher training, the number and roles of teaching assistants (TAs), and school funding structures. There are also key differences in the structure and availability of school-based mental health provision. In the United States, schools often have staff whose sole or at least main responsibility is mental health, such as school psychologists. While these roles are becoming more common in the United Kingdom with the implementation of the Green Paper recommendations [ 1 ], in most UK primary schools, mental health is included within the broader roles of the special educational needs coordinator (SENCo) and pastoral team. Finally, differences in the wider health care systems across the countries also mean that the process and outcomes of external referrals to specialist mental health services vary across settings, another fact that may influence the transportability of school-based interventions such as At-Risk.

Given these uncertainties regarding intervention transportability, additional evaluation of At-Risk is needed to understand whether it is a potentially useful and feasible tool to improve the identification of and response to mental health difficulties in UK primary schools. To explore the potential value of the training in this new context, we conducted a mixed methods feasibility study of At-Risk in 6 UK primary schools covering pupils aged 4 to 11 years. We aimed to examine the influence of At-Risk on staff confidence and preparedness, identification of pupils with mental health difficulties or increased risk of developing mental health difficulties, mental health support outcomes for identified children, and intervention acceptability and practicality.

Study Design

We used a mixed methods, nonrandomized, pretest-posttest study design to explore the feasibility of At-Risk in UK primary schools. While feasibility studies are acknowledged as a key stage of intervention design and evaluation [ 49 , 50 ], there is no universally agreed-upon definition of a feasibility study [ 50 , 51 ]. Therefore, we focused on 3 criteria from the guidance by Bowen et al [ 52 ]: acceptability, practicality, and limited effectiveness testing.

Intervention: At-Risk for Elementary School Educators

At-Risk is a web-based training that is delivered individually and requires only a log-in and internet connection. Using a simulation-based teaching model, the training aims to (1) improve mental health awareness and knowledge, (2) empower users to approach pupils about what they have noticed, (3) impart skills to have meaningful conversations with pupils and parents, and (4) train users to refer pupils to further support. The diagram in Figure 1 illustrates how the training might lead to improved mental health outcomes for pupils.

The simulation begins with an introduction by a virtual coach, who defines and explains how to recognize the warning signs of psychological distress and specific mental health difficulties and provides guidance and practical advice for discussing and acting upon concerns. Users then practice 2 virtual scenarios. The first scenario involves a fifth-grade (UK Year 6; ages of 10-11 years) teacher speaking with the parent of a pupil showing signs of behavioral difficulties. The second involves a third-grade (UK Year 4; ages of 8-9 years) teacher speaking with a pupil showing signs of emotional difficulties. During the conversations, users choose what to say via drop-down menus organized into categories (eg, “bring up concerns” or “ask a question”) and phrases (eg, “Mia sometimes seems a little agitated in class”). Throughout the conversation, users receive feedback through a “comfort bar” (based on how the pupil or parent perceives the conversation), opportunities to “see” the thoughts of the pupil or parent, and suggestions from the virtual coach.

Importantly, there is no one “right” way to conduct the conversations, and several approaches can lead to a positive outcome. Throughout the conversation, users can “undo” actions to backtrack after receiving an undesirable response or to explore what the response would have been had they chosen another option. At the end of each conversation, the pupil or parent provides feedback on the conversation. The training finishes with a short segment on connecting pupils with further support.

For this feasibility study, we used an unmodified version of the training (ie, the standard training designed for American schools, not tailored to the UK context) provided free of cost by Kognito. The potential need for adaptation and tailoring was an important consideration that we explicitly examined as part of our exploration of the acceptability and practicality of At-Risk in this new setting.

[Figure 1. Diagram of how the At-Risk training might lead to improved mental health outcomes for pupils]

Recruitment

We originally sought to purposively sample 5 primary schools from Cambridgeshire or Norfolk that (1) had a higher-than-average proportion of pupils eligible for free school meals or (2) were located in an area in the top tertile of deprivation as measured using the Index of Multiple Deprivation [ 53 ], which we calculated with the publicly available Schools, pupils and their characteristics data [ 54 ]. We emailed headteachers, SENCos, and mental health leads from 131 candidate schools in September and October 2019 about participating in the study. To increase recruitment, we contacted additional schools in January 2020 for a study start date of March 2020. However, the study was suspended in March 2020 due to the in-person school closures associated with the onset of the COVID-19 pandemic. As some of the participating schools dropped out due to the pandemic, we reopened recruitment for a January 2021 study start date. In this round, we did not restrict participation by the 2 deprivation criteria described previously (ie, free school meal eligibility and Index of Multiple Deprivation), so any UK-based mainstream primary school was eligible to participate. The January 2021 start date was again delayed by the pandemic, but there was no subsequent recruitment.

Teachers and TAs

Schools were responsible for recruiting individual teachers and TAs to participate in the training. We encouraged schools to invite all teachers and TAs to participate, but schools made a variety of decisions in this regard. Three schools (schools D, E, and F) had all staff complete the training during INSET (in-service training) days or other designated times, 2 schools (schools A and C) had staff volunteer to participate, and 1 school (school B) selected 2 to 3 staff members in each year group to participate.

Measures and Materials

School Characteristics

The characteristics of the participating schools, including school type, school sex (ie, whether they were single or mixed sex), urbanicity, head count, area-level deprivation, level of free school meal eligibility, ethnic composition, and proportion of pupils with special educational needs, were obtained from publicly available data from the Department for Education [ 54 , 55 ].

Teacher and TA Identification Form

The purpose of the Teacher and TA Identification Form ( Multimedia Appendix 1 ) was to understand which pupils participants would identify as having mental health difficulties or an increased risk of developing mental health difficulties. As systematic reviews in this area have identified no suitable questionnaires [ 28 , 30 ], we developed a bespoke questionnaire, which was reviewed by a school staff advisory group to ensure accuracy and relevance. The questionnaire begins with instructions, including explanations and examples of what is meant by “mental health difficulties or risk for mental health difficulties.” Full definitions are provided in Multimedia Appendix 1 , but in brief, “mental health difficulties” are described as “behavioural and social-emotional problems” regardless of formal diagnosis, and “risk for mental health difficulties” is described as experiences that increase the chance of a child developing mental health difficulties in the future.

For all pupils in their class, participants first indicated whether they thought a pupil had mental health difficulties or increased risk. If yes, they answered 9 subsequent questions about mental health support outcomes. The first 4 outcomes concerned communication of concerns, namely whether they had (1) formally documented their concerns with the school, (2) communicated concerns to the SENCo, pastoral care lead, or mental health lead, (3) communicated concerns to another member of the school staff, or (4) communicated concerns to the child or their parents. The next 5 outcomes pertained to the provision of mental health support, namely whether the pupil (5) received in-class support; (6) received in-school support or had an in-house support plan; (7) had documented social, emotional, and mental health (SEMH) status (a type of special educational need focused on mental health difficulties); (8) had been referred to external mental health services; or (9) had access to external mental health services.

Strengths and Difficulties Questionnaire

The teacher-report Strengths and Difficulties Questionnaire (SDQ) [ 56 - 59 ] served as the comparator for findings about teachers’ and TAs’ identification of pupils. The SDQ includes 25 positive and negative psychological attributes across 5 scales: emotional symptoms, conduct problems, hyperactivity/inattention, peer relationship problems, and prosocial behavior. The first 4 scales sum to a Total Difficulties Score (0-40, with higher scores representing greater difficulties). The SDQ has demonstrated acceptable psychometric properties in primary school samples [ 60 ]. It is important to note that the SDQ is not an exact comparator as it measures a narrower concept than the Teacher and TA Identification Form (which also includes increased risk). However, this comparison could still yield valuable information regarding feasibility.
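
To make the scoring concrete, the Total Difficulties calculation can be sketched as follows. This is an illustrative Python sketch, not the study's actual scoring code (the authors used R scripts from the Youthinmind website); the function name is a hypothetical example.

```python
# Illustrative sketch of SDQ Total Difficulties scoring (not the study's
# actual code). Each of the 5 scales comprises 5 items scored 0-2, so each
# scale score ranges from 0 to 10.

def total_difficulties(emotional, conduct, hyperactivity, peer):
    """Sum the 4 difficulty scales; the prosocial scale is excluded."""
    for score in (emotional, conduct, hyperactivity, peer):
        if not 0 <= score <= 10:
            raise ValueError("each SDQ scale score must lie in 0-10")
    # Total ranges 0-40; higher scores represent greater difficulties
    return emotional + conduct + hyperactivity + peer
```

Note that the prosocial behavior scale is deliberately left out of the total, which is why the maximum is 40 rather than 50.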

Pre- and Posttraining Surveys

Kognito uses pre- and posttraining surveys to assess its training. These surveys (based on the validated Gatekeeper Behavior Scale [ 61 ]) explore teachers’ self-efficacy in identifying and responding to mental health difficulties and whether their attitudes, self-efficacy, or practice have changed since completing the At-Risk training. The posttraining survey also includes questions on perceptions of the training’s impact. We independently (ie, with no input from Kognito) reviewed the merits of these questionnaires and decided to use them in this study because (1) they covered relevant and useful concepts related to our aims and (2) using them increased comparability to the other 3 US-based studies of At-Risk. We slightly adapted the surveys to make them more relevant to the UK context ( Multimedia Appendix 2 ).

Interview Schedules

For the pretraining interviews with SENCos and mental health leads, we developed a topic guide about current practice ( Multimedia Appendix 3 ) with the specific purpose of creating Mental Health Resource Maps for each school (refer to the Procedures section). The main topics pertained to formal and informal procedures for when staff members suspect that a child might have mental health difficulties or increased risk, as well as the types of support available.

For the posttraining interviews with teachers, TAs, and strategic stakeholders (ie, those with key leadership roles, including senior leadership teams [SLTs], school governors, and SENCos and mental health leads), we developed 3 separate topic guides ( Multimedia Appendix 3 ), which were informed by our research questions, systematic reviews [ 28 , 30 ], and the Consolidated Framework for Implementation Research [ 62 , 63 ]. For teachers and TAs who completed At-Risk and strategic stakeholders, interview topics included the acceptability of the training, the practicality of implementing it in schools, the utility of further refinement and testing, possible harms associated with the training (if any), and suggestions for adaptations. For teachers and TAs who did not complete the training, topics included reasons for not completing it, barriers to acceptability and practicality, and suggestions for adaptations.

Interviews With SENCos and Mental Health Leads

We conducted a pretraining interview with each school’s SENCo or mental health lead to develop a “Mental Health Resource Map” with information on referral processes and available support. These maps served an ethical purpose by ensuring that pupils identified as potentially having mental health difficulties would have the best possible chance of being linked to care and support.

Completing At-Risk

Schools’ timelines for the study varied due to the pandemic and other commitments. School D completed the training in December 2020; school E completed the training in March 2021; schools B, C, and F completed the training in May 2021; and school A completed the training in June 2021. At baseline (T1), participants completed a Teacher and TA Identification Form and the pretraining survey. They then completed the At-Risk training. We encouraged schools to designate specific time for the training, which 3 schools (schools D, E, and F) did. One week after training (T2), participants were asked to complete a second Teacher and TA Identification Form and the posttraining survey. Three months after the training or at the end of the school year (whichever came first; T3), participants completed a third Teacher and TA Identification Form as well as SDQs for all pupils. All questionnaires were completed on the University of Cambridge Qualtrics platform (Qualtrics International Inc).

Feedback Provision

After T2, we provided all SENCos and mental health leads but not teachers or TAs with feedback regarding which children had been identified as having mental health difficulties or increased risk. After T3, we provided SDQ scores for each child as well as whole-class distributions (where available). This feedback was provided to ensure the ethical conduct of the study.

Interviews With Teachers, TAs, and Strategic Stakeholders

We aimed to recruit at least 3 teachers or TAs who completed the training per school, 3 to 5 teachers or TAs who had not completed the training across all schools, and up to 3 strategic stakeholders per school for posttraining semistructured interviews. Schools contacted staff members directly with an invitation to complete a virtual interview.

Quantitative Outcomes

Analytical Samples

For the main analysis, participants were included if they (1) completed at least the pretraining (T1) questionnaires and the training itself and (2) had what we judged to be a typical number of children they regularly worked with. For the latter criterion, given that the average UK primary school class size is approximately 27 to 28 pupils [ 64 ], we excluded teachers and TAs who worked with <10 children (as we suspected this would not be a random selection of pupils and would therefore influence aggregate identification rates) and those who worked with >60 children (as we believed that it would be difficult for a teacher or TA to know >2 classes’ worth of children well enough to make accurate judgments about their mental health).
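
The inclusion rules above amount to a simple filter. The sketch below is a hypothetical illustration; the parameter names are assumptions, not identifiers from the study's materials.

```python
# Hypothetical sketch of the analytical-sample filter described above.
# Parameter names are illustrative assumptions.

def in_analytical_sample(completed_t1, completed_training, n_pupils):
    """Keep participants who completed the T1 questionnaires and the
    training, and who regularly work with a typical number of pupils:
    at least 10 and at most 60 (exclusions were <10 and >60)."""
    typical_class_load = 10 <= n_pupils <= 60
    return completed_t1 and completed_training and typical_class_load
```

For example, a teacher with a standard class of 28 pupils who completed T1 and the training would be retained, whereas a TA working one-to-one with a single child would be excluded.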

Teacher and TA Self-Efficacy and Preparedness

To assess teachers’ and TAs’ preparedness, self-efficacy, and perceptions of training impact, we calculated the absolute and relative frequencies of responses to the pre- and posttraining surveys. Participants were eligible for inclusion in this analysis only if they had pretraining (T1) data.

Identification Outcomes

On the basis of the Teacher and TA Identification Forms, we calculated the number and percentage of pupils in each class whom teachers and TAs perceived as having mental health difficulties or increased risk at each time point. We summarized these across all participants using medians and IQRs.

We then calculated SDQ scores, which we compared with responses from the Teacher and TA Identification Form by calculating (1) the median and IQR for the percentage of children identified by participants who did not have elevated SDQ scores and (2) the median and IQR for the percentage of children with elevated SDQ scores who were not identified by participants. To be included in these analyses, participants had to have completed all 3 time points. For the first outcome, they had to have completed an SDQ for all children they identified in the Teacher and TA Identification Form. For the second outcome, they had to have completed SDQs for at least 80% of their class. Where it was possible to match pupil IDs between teachers and TAs, we pooled SDQ data such that, if one participant did not meet the inclusion criteria themselves, they could still be included if the SDQ data were available from another staff member working with the same children.
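
The descriptive summaries above (per-participant identification rates, reported as a median with an IQR) could be computed along the following lines. This is a minimal sketch with assumed data shapes, not the study's analysis code (which was written in R and Excel).

```python
# Minimal sketch (assumed data shapes) of the descriptive summaries above:
# per-participant identification rates, summarized as a median with an IQR.
from statistics import median, quantiles

def identification_rate(n_identified, class_size):
    """Percentage of a participant's class identified as having mental
    health difficulties or increased risk."""
    return 100 * n_identified / class_size

def median_iqr(rates):
    """Median and interquartile range of rates across participants."""
    q1, _, q3 = quantiles(rates, n=4)  # quartile cut points
    return median(rates), (q1, q3)
```

Usage might look like `median_iqr([identification_rate(3, 30), ...])` over all eligible participants at one time point, repeated for T1, T2, and T3.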

Mental Health Support Outcomes

Finally, for each time point, we calculated medians and IQRs for the proportion of identified children with each of the 9 mental health support outcomes (refer to the Teacher and TA Identification Form section for the outcomes).

Sensitivity Analyses

We also conducted 2 post hoc sensitivity analyses. The first sensitivity analysis excluded all participants from school D. When we prepared feedback for school D (the first school to complete the training), we learned that most participants at the school had misinterpreted the Teacher and TA Identification Form. We edited the form and instructions to address this issue, which meant that school D participants completed a slightly different form than the other schools. The second sensitivity analysis was a complete case analysis intended to explore observed differences in outcomes according to whether participants had completed all 3 time points. For the analysis of outcomes pertaining to preparedness, self-efficacy, and perceptions of training impact, we included all participants who completed the surveys at least at T1 and T2.

Statistical Analysis

For all quantitative outcomes, we focused on preliminary, descriptive comparisons across the 3 time points and did not perform any formal hypothesis testing. This aligns with established recommendations for feasibility studies, which generally lack the statistical power necessary for a clear interpretation of hypothesis-testing results [ 65 - 68 ]. We conducted all quantitative analyses in R (version 4.0.3; R Foundation for Statistical Computing) [ 69 ] except for the comparison of Teacher and TA Identification Forms and SDQ scores, for which we used Microsoft Excel (Microsoft Corp). We created all plots using the ggplot2 [ 70 ] and likert [ 71 ] packages. To score the SDQs, we used the freely available R code on the Youthinmind website [ 72 ].

Qualitative Outcomes

We considered 3 analysis approaches for the interview and qualitative questionnaire data: content analysis [ 73 ], framework analysis [ 74 ], and reflexive thematic analysis [ 75 , 76 ]. We initially decided to use content analysis for the survey comments and reflexive thematic analysis for the interviews; however, as we familiarized ourselves with the data, we realized that there was considerable overlap between the survey comments and interviews and decided that analyzing them separately would not be useful. As our main aim was to generate insights into the program and its future potential, we decided to use the 6-phase reflexive thematic analysis by Braun and Clarke [ 76 ] for all qualitative data due to its flexibility and ability to generate themes both inductively and deductively. ES developed the initial themes, and MF and EH helped clarify and enrich them. ES and MF worked together to name and refine the themes before the final write-up. We managed and coded all qualitative data in ATLAS.ti (version 9.1.3; ATLAS.ti Scientific Software Development GmbH) and additionally created manual thematic maps to better visualize and understand patterns in our data.

Ethical Considerations

This study was approved by the University of Cambridge Psychology Research Ethics Committee (PRE 2019.076). We obtained active informed consent from all teachers and TAs who took part in the study. We used an opt-out model for parental consent whereby parents received (directly from the schools via their preferred communication routes) an information sheet detailing study aims, procedures, how data would be used, and the right to opt their child out of participation. Parents had 2 weeks to opt their child out of the study by returning a hard copy of the opt-out form or emailing or calling the school. Schools kept track of all opt-outs and instructed teachers and TAs not to include these children in their forms. All quantitative data were collected using anonymous pupil and staff identifiers generated by the participating schools, and all qualitative data were deidentified before analysis, with identifiable information stored on secure servers at the University of Cambridge. Teachers and TAs received £20 (approximately US $28) vouchers for completing the training and questionnaires for at least 2 of the 3 time points and an additional £10 (approximately US $14) for taking part in an interview. School staff members who created the anonymous identifiers received £10 (approximately US $14) vouchers to thank them for their time.

Participants

A total of 6 schools participated in this study (Table S1 in Multimedia Appendix 4 [ 40 ]). Among these 6 schools, there were 4 (67%) from Cambridgeshire and 1 (17%) each from Greater London and Merseyside; 5 (83%) were located in urban areas and 1 (17%) was located in a rural area. All but 1 school (5/6, 83%) were situated in areas of above-average deprivation, and 50% (3/6) of the schools had a higher-than-average proportion of pupils eligible for free school meals. In total, 67% (4/6) of the schools had a high proportion of White pupils (>80%), and 33% (2/6) of the schools were more diverse, with approximately 20% of pupils from Black, Black British, Caribbean, or African backgrounds (school B) or Asian or Asian British backgrounds (school E).

A total of 108 teachers and TAs completed the T1 questionnaires and the training itself, 89 (82.4%) completed the T2 questionnaires, and 70 (64.8%) completed the T3 questionnaires ( Table 1 ), with 54 (50%) having completed all 3. After excluding those teachers and TAs who did not meet the inclusion criteria for the analyses, the final analytical samples were as follows:

  • Main analysis of identification and mental health support outcomes: n=97 at T1, n=75 at T2, and n=57 at T3.
  • Main analysis of preparedness, self-efficacy, and training impact outcomes: n=107 at T1 and n=83 at T2.
  • Main analysis comparing identification outcomes with SDQ scores: n=28 and n=25 (refer to the following section).
  • Complete case sensitivity analysis: n=51 at T1, T2, and T3.
  • Sensitivity analysis excluding all teachers and TAs from school D: n=70 at T1, n=54 at T2, and n=41 at T3.

Compared with the 2019-2020 national workforce statistics for teachers and TAs working in state-funded nursery and primary schools [ 77 ], our sample had a similar proportion of women (81/89, 91% in our sample vs 90.9% nationally) and a slightly higher proportion of White staff members (82/89, 92% in our sample vs 90.5% nationally).

A total of 7.4% (8/108) of school staff members from 67% (4/6) of the schools completed an interview ( Table 2 ).

a N=89 because this information was collected only at T2.

b N NA =4 (number with missing data for this question).

c Percentages add up to >100 because some participants had multiple roles.

d SENCo: special educational needs coordinator.

e N NA =7 (number with missing data for this question).

a PSHE: Personal, Social, Health and Economic.

b SENCo: special educational needs coordinator.

c TA: teaching assistant.

d HLTA: higher-level teaching assistant.

Pretest-posttest changes suggested that participating in the training was beneficial for the staff and that they had positive perceptions of the training. Findings regarding preparedness ( Figure 2 ) suggest improvements across all domains of recognizing and acting upon concerns about pupils’ mental health, particularly in terms of using key communication strategies and working with parents. Findings regarding self-efficacy ( Figure 3 ) suggest that participants were more confident in their abilities to discuss their concerns about pupils’ mental health after the training than before. Again, the largest changes were observed in discussing concerns with parents and applying key communication strategies. Finally, findings regarding teachers’ and TAs’ perceptions of the impact of applying the skills of the training ( Figure 4 ) suggest that they were generally positive about the possible effects of the training on pupil outcomes (ie, attendance and academic success), teacher-pupil rapport, and the classroom environment. The results from the complete case analysis ( Multimedia Appendix 5 ) were nearly identical to those of the main analysis (all differences were ≤3 percentage points in magnitude).


In terms of how many pupils were identified as having mental health difficulties or increased risk, participants identified similar proportions of their pupils before and immediately after the training and then fewer over time. The median percentage of pupils whom participants believed had mental health difficulties or increased risk was 10% (IQR 6.7%-18.2%) at T1, 10% (IQR 4.5%-16.7%) at T2, and 7.4% (IQR 5.0%-16.7%) at T3. The directions of change were similar for both sensitivity analyses (whereby teachers and TAs identified fewer children over time), with slight differences. For the sensitivity analysis excluding school D ( Multimedia Appendix 6 ), the percentages were slightly (approximately 2 percentage points) higher. For the complete case analysis, the decrease was already apparent 1 week after the training, from 10% (IQR 6.7%-17.3%) at T1 to 8% (IQR 3.9%-16.7%) at T2 and 7.4% (IQR 5.7%-16.7%) at T3.

In terms of the accuracy of identification, teachers and TAs seemed to become slightly more accurate over time when compared against pupils’ SDQ scores (although it is important to acknowledge the limitations described in the Methods section regarding questionnaire comparability). The median percentage of children identified by participants who did not have elevated SDQ scores was 40% (IQR 0%-50%) at T1, 27.2% (IQR 0%-50%) at T2, and 25% (IQR 0%-50%) at T3. The median percentage of children with elevated SDQ scores who were not identified by participants was 68.8% (IQR 42.9%-87.5%) at T1, 66.7% (IQR 50%-88.2%) at T2, and 57.1% (IQR 33.3%-87.5%) at T3. In the sensitivity analysis excluding school D, the results were similar (typically within 5 percentage points); one small difference was that the median percentage of children identified by teachers and TAs who did not have elevated SDQ scores was 0% (IQR 0%-50%) at T2. The results of the complete case analysis were identical to those of the main analysis.
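The 2 accuracy measures described in the preceding paragraph can be computed per participant as in this minimal sketch. The pupil identifiers and SDQ classifications are hypothetical, and this is an illustration of the comparison logic, not the study's actual code.

```python
def accuracy_measures(identified, elevated_sdq):
    """For one participant, compute the 2 accuracy measures used in the
    study: the percentage of identified pupils without elevated SDQ
    scores (possible overidentification), and the percentage of pupils
    with elevated SDQ scores who were not identified (underidentification)."""
    identified, elevated_sdq = set(identified), set(elevated_sdq)
    not_elevated = identified - elevated_sdq   # identified, but SDQ not elevated
    missed = elevated_sdq - identified         # elevated SDQ, but not identified
    pct_over = 100 * len(not_elevated) / len(identified) if identified else None
    pct_under = 100 * len(missed) / len(elevated_sdq) if elevated_sdq else None
    return pct_over, pct_under

# Hypothetical participant: identifies pupils 1, 2, and 5, while pupils
# 2, 3, and 4 have elevated SDQ scores.
over, under = accuracy_measures({1, 2, 5}, {2, 3, 4})
print(round(over, 1), round(under, 1))  # prints: 66.7 66.7
```

The medians and IQRs reported in the text would then be taken over these per-participant percentages at each time point.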

Overall, the findings suggest that the training may be beneficial for facilitating conversations and access to school-based support (but not external support) for pupils with identified mental health difficulties or increased risk. Figure 5 presents the findings for the 9 mental health support outcomes among identified children across the 3 study time points. As before the training, there was typically wide variation in outcomes.

A comparison across time points suggests that participants formally documented their concerns and spoke with the SENCo, pastoral lead, or mental health lead for a greater proportion of identified pupils after the training than before. For example, at T1, teachers and TAs documented concerns for a median of 50% (IQR 0%-100%) of identified pupils; this increased to 56.3% (IQR 4.2%-100%) at T2 and 75.7% (IQR 0%-100%) at T3. The equivalent statistics for speaking with the SENCo, pastoral lead, or mental health lead were a median of 66.7% (IQR 16.7%-100%) at T1, 75% (IQR 50.0%-100%) at T2, and 95.5% (IQR 50.0%-100%) at T3. There was no change in speaking with another staff member, but this was because nearly all participants did so across all time points. Finally, the percentage of pupils whom teachers and TAs spoke with (or whose parents they spoke with) also increased after the training, with a median of 33.3% (IQR 0%-87.5%) at T1, 61.9% (IQR 0%-100%) at T2, and 50% (IQR 0%-100%) at T3.

A comparison across time points also suggests increases in school-based support for identified children after the training compared with before. The median percentage of pupils identified by teachers and TAs who received in-class support increased from 75% (IQR 35.4%-100%) at T1 to 100% at T2 and T3 (IQR 50%-100% and 66.7%-100%, respectively). There was a more modest increase in the receipt of in-school support or in-house support plans, with a median of 40% (IQR 0%-71.4%) of identified pupils receiving them at T1 compared with 50% at T2 and T3 (IQR 3.6%-100% and 8.3%-81.4%, respectively). There was very little change in documented SEMH status or referral or access to specialist mental health services. For each of these outcomes, the median percentage of identified pupils was 0% across time points.

The findings from the sensitivity analyses were similar to those of the main analysis in terms of direction, although improvements across time in the complete case analysis ( Multimedia Appendix 5 ) tended to be more modest than for the main analysis.


Acceptability and Practicality

Quantitative Findings

Quantitative data from the posttraining survey showed that participants were generally positive about the training. Of the 83 participants who completed the survey, 53 (64%) rated it as “good” and 13 (16%) rated it as “very good.” An additional 17% (14/83) rated it as “fair,” 2% (2/83) rated it as “poor,” and 1% (1/83) rated it as “very poor.” A total of 84% (70/83) of the teachers and TAs said that the scenarios in the training were relevant to them. Finally, most participants (74/83, 89%) would recommend the training to other educators.

Qualitative Findings

Qualitative data also suggested that school staff generally found the training practical and acceptable. We generated 3 themes from our survey and interview data:

  • Individual fit: positive perceptions, self-efficacy, and change.
  • Institutional fit: alignment with school values and context.
  • Taking it forward: improvements and implementation.

Additional findings on possible harms are presented in Multimedia Appendix 7 .

Individual Fit: Positive Perceptions, Self-Efficacy, and Change

In general, participants perceived the program to be a “good fit” with their personal philosophies and practice. Regarding the training itself, many appreciated the included scenarios, particularly in terms of their relevance to their practice. The format of the training—primarily that it was web-based and required active role-play—was also viewed as engaging and novel, which might have contributed to its perceived usefulness. For example, one teacher commented:

The interactive elements of the training were brilliant and something which I have never encountered before! [Survey respondent (SR) 56; school E]

One teacher and well-being lead described:

I think it definitely made you think. [...] you had to really think about what was being said and the response that you would give, reflecting back on sort of the knowledge that they’d given you beforehand, so I thought that was good. [Interviewee 1]

Other participants suggested that opportunities to practice skills during the training improved the likelihood of using those skills in day-to-day practice.

Participants also believed that they had learned a lot from the training, especially in terms of skills and strategies. These included but were not limited to the skills within the At-Risk “EASING” strategy (check your Emotions, Ask for permission, be Specific, use I statements, keep it Neutral, and show Genuine curiosity). Importantly, there was evidence that participants had also applied new skills. Several participants described having new conversations with pupils or parents facilitated by the skills and strategies from the training. For example, one teacher described:

It was that permission thing [...] I wanted to ask [a child] about his home life [...] and kind of he just cried and didn’t want to speak about it anymore, and then when I asked him if we were OK to talk about it, he said, “Actually no, because I think I’m going to cry again,” so then we left it. And then he came to me the following week, and [...] said, “Can we talk about it now?” [...] so actually me asking that, it was the wrong time for him to talk about it, he wasn’t ready, he would have just been emotional, and wouldn’t have been able to get his words out, and actually the week after, him coming to me and saying, “Can we have a little chat,” works perfectly [...] And now we’re more aware of his situation. [Interviewee 2]

Asking for permission seems to have enabled the pupil to have this conversation with the teacher in a manner (in terms of time, place, and chosen person) that suited him. Other participants provided similar examples, referencing how skills from the training had facilitated better outcomes.

However, it is important to note that the perceived usefulness of the training varied. Most notably, some participants indicated that their previous training or role made the training less impactful. Illustratively, when asked how the training had impacted their practice, one TA responded:

Having previously received similar training, due to my role, I do not have any recent cases where the training would have changed the way I carried out discussions. [SR 60; school E]

Institutional Fit: Alignment With School Values and Context

Sustainable school-based programs should also align with the values of the school more broadly. Participants often referenced the importance of schools’ prioritization of pupil mental health. For example, one teacher described:

[Mental health is] a conversation which is constantly ongoing and trying to constantly better our practices and make sure we’re looking after them as best as we can and spotting things as best we can as well. [Interviewee 3]

This description demonstrates how prioritizing mental health can promote the critical evaluation of related school practices as well as the additional provision of training opportunities. In many cases, support from the SLT led to formal recognition of pupil mental health within school policies or plans. One strategic stakeholder explained:

I think because our school have well-being and mental health as such a focus, SLT are very supportive of doing things like this and they’re very accommodating. So when I said we had the training and people were going to have to take part in the training, it was very flexible, although they had other ones lined up, they were quite happy to move things around to make things work. And I think, the fact it is such a priority in our school definitely makes that easier. [Interviewee 1]

In this school, mental health and well-being together formed 1 of the 3 main school priorities. As indicated previously, direction setting from the SLT is key to ensuring momentum and impetus. However, as others noted, it is important that support from the SLT is genuine rather than being “just another tick box” (Interviewee 4) exercise.

Another facet of institutional fit pertained to the practical aspects of the training. Schools are time- and resource-limited settings, so mental health training needs to fit within this context. The format of At-Risk, especially its flexibility and relatively low time requirements, was viewed as beneficial, with comments such as “For the amount of time [...] I got a huge amount from it” (Interviewee 4). Others made direct comparisons with other training courses. For example, one higher-level TA had previously completed a 1-day, in-person training course with a similar purpose to that of At-Risk. While she preferred the in-person training, she listed the benefits of both types:

[In the in-person training] you can then query and question to your trainer, so you’ve got that interaction, so that obviously isn’t there, is it, on the computer one. [...] if I was looking from a management point of view, I would say, budgetary, I’m sure it’s cheaper [...] to use [At-Risk], not just cheaper as in [...] money, [...] but also cheaper in time [...] So probably if I was looking [...] with my management hat on, I would say the computer-based [training] would get the same message, or similar message, across for a wider audience for probably a cheaper cost. [Interviewee 7]

In terms of efficiency, this participant highlighted the favorable input-to-output ratio of At-Risk, which could allow more staff members to participate in training. This quote also highlights that schools could use At-Risk flexibly. For example, schools might assign staff members to different training programs based on their roles and previous experience, with more intensive, in-person training for staff members with more significant mental health roles and At-Risk for those with fewer responsibilities or less experience.

Taking It Forward: Improvements and Implementation

Participants offered key insights into how to take the training forward in terms of both changes to the training itself and how best to implement it, primarily by tailoring it to the UK context. In terms of language, there was some reference to the American accent, but more so, participants highlighted the need to adapt some of the terminology and signposting resources to reflect UK support systems. They also made suggestions about additional training that could be useful with different topics (such as bullying) and age groups (particularly for younger children).

In addition to improvements and adaptations to the training itself, participants illustrated the importance of implementation. A common theme was that, to maximize impact, the training should include follow-up discussions or live workshopping. One teacher suggested:

I think some kind of “live” element to conclude the training—to have a “real” person to ask questions to as part of a group video chat could have been useful. Also, maybe to ask advice about particular scenarios that we may have found ourselves in in the past. [SR 56; school E]

By facilitating greater engagement and critical thinking, a live element could enhance the impact of the training and potentially make At-Risk more acceptable to those who generally prefer face-to-face training. Participants indicated that someone internal, for example, the SENCo, would be best placed to lead a live element, which would also enable staff to practice role-play based on situations and scenarios specific to each school.

There was also wide acknowledgment that any training had to lie within a strong support system. This began with having a clear referral pathway for identified concerns, which was viewed as important for facilitating access to support. In some cases, teachers and TAs were able to find new ways to support children after completing the training. However, in many cases, participants—and strategic stakeholders in particular—explained that support had not always been readily available. For example, one strategic stakeholder recounted what happened after the training:

A lot of them are people saying to me, “What are you going to do about it?” about different children. And I, because some of our support staff don’t know the sort of route for getting extra support, or they’re really shocked to find actually there’s nothing out there for these children...it’s about what we can do in school, and I think people have been really quite shocked about that. You know, they just presume I can make a phone call and these children will get face-to-face counselling. [Interviewee 5]

This shows the importance of embedding the training within a wider support system, including collaboration with external agencies. However, many interviewees referenced the systemic issues that schools face in helping pupils access specialist support, particularly in terms of the high thresholds and long waiting lists that exist for many external services. While schools may be able to provide beneficial support for children, particularly for those with lower-level difficulties, this indicates an ongoing area of need for schools and their pupils.

Summary of Findings

This study offers the first UK evidence for Kognito’s At-Risk for Elementary School Educators, extending findings from 3 US-based trials and providing needed evidence regarding the potential utility, acceptability, and practicality of brief, interactive web-based mental health training for school staff. Overall, the findings showed that At-Risk is a feasible means of improving the identification of and response to pupil mental health difficulties in UK primary schools. Quantitative findings showed that staff preparedness and self-efficacy in identifying and responding to mental health difficulties increased after the training. Identification rates did not increase (and, in fact, decreased at the 3-month follow-up), but there was some suggestion that teachers’ and TAs’ identification became slightly more accurate in comparison with SDQ scores. Crucially, for those pupils identified as having mental health difficulties or increased risk, in-school mental health support outcomes (ie, documentation or discussion of concerns, conversations with pupils and parents, and in-class and in-school support) increased after the training, but more “downstream” outcomes (ie, documented SEMH status and referral and access to external mental health services) did not. Qualitative findings indicated that participants generally found the training acceptable and practical, with many explaining how they intended to use or had already used the skills they learned to improve their practice. Participants also suggested several useful improvements for the training and its implementation, including making it more relevant to the United Kingdom, adding more scenarios, and including a live element in the implementation of the training.

Findings regarding confidence and preparedness reflect those of the 3 US-based studies of At-Risk [ 17 , 41 , 42 ] and the wider literature surrounding teacher mental health training [ 31 ]. In general, mental health training seems to be effective in improving staff confidence. For example, 2 Australian-based studies [ 37 , 78 ] found that secondary school teachers who completed training felt more confident discussing their concerns and helping pupils with their mental health. Another UK-based study of a psychoeducational training program to improve recognition of depression in secondary schools [ 79 ] found significant pretest-posttest improvements in teacher confidence in their knowledge of symptoms, ability to recognize symptoms, and knowledge about how to speak with pupils about their mental health. However, not all studies have shown an impact, with a prominent UK-based study of mental health first aid training finding no effect on educators’ confidence in helping pupils with their mental health [ 80 ].

The general decrease in the proportion of pupils identified as having mental health difficulties or increased risk stands in contrast to previous studies of At-Risk, which found that school staff identified significantly more pupils of concern after completing the training [ 17 , 41 ]. Evidence of the effect of other training programs on identification is extremely limited [ 30 , 31 , 36 ], and differences in context, training content and delivery, baseline knowledge, and outcome measurement make it difficult to compare findings across studies. Two vignette-based studies showed little effect of either mental health first aid [ 78 ] or psychoeducational [ 81 ] training on identification (although each study also reported high recognition of difficulties before the training), whereas studies focused on real-world identification have shown mixed results [ 79 , 82 ]. However, changes in the proportion of identified pupils must be contextualized within the accuracy of identification. There are consequences of both over- and underidentification [ 83 , 84 ], most notably in terms of inefficient allocation of limited mental health support resources. While comparison with the SDQ suggested that there was some improvement in terms of the accuracy of identification following the training, underidentification remained a substantial challenge, with between one-half and two-thirds of pupils with elevated SDQ scores remaining unidentified by teachers and TAs. The underidentification of children with mental health difficulties in educational settings, particularly for children with internalizing as opposed to externalizing problems [ 85 ], has been well documented in the literature [ 30 ], and it is likely that a combination of identification models is required to address this challenge [ 27 , 29 ].

Promisingly, the training appeared to be useful in terms of connecting pupils with care and support, an outcome not frequently measured in other studies [ 30 , 31 , 34 ]. First, the findings suggested that participants had conversations about or documented concerns for a greater proportion of identified pupils following the training, which reflects findings from previous studies of At-Risk [ 17 , 41 ]. This outcome is rarely reported in the literature; other training evaluations have found no difference between training and control groups in terms of conversations with pupils and colleagues [ 78 ]. Importantly, this study went beyond conversations to include outcomes pertaining to in-school and external support. The increases in in-class and in-school support for identified pupils reflect findings of the UK-based study by Kidger et al [ 80 ] of mental health first aid training and the Australian-based pilot study by Parker et al [ 37 ] of a web-based training program, each of which found a positive effect of the training on helping behaviors. Although in-class and in-school support seemed to increase following the training, it is notable that referrals and access to specialist services did not. There are several plausible explanations for this finding. For example, it is likely that school staff were already aware of children with the most severe mental health difficulties and were confident and able to support newly identified pupils—who might have had lower-level mental health difficulties—within the school setting. However, if the training did lead to the identification of children who might benefit from specialist care, there are many barriers to accessing such support (eg, availability and long waiting lists) that might have influenced these outcomes, as reflected in both the qualitative interviews and the wider literature [ 23 , 86 ].

In addition, quantitative and qualitative findings suggested that the program was a good fit for individuals and schools, which aligns with previous research on the acceptability and perceived need for mental health training for school staff [ 18 , 20 , 27 - 29 , 87 , 88 ]. The training’s format seemed to be a key contributor to its feasibility. With a few exceptions [ 37 , 39 , 89 ], the web-based, simulation-driven format of At-Risk is unique among training programs and is well aligned with teachers’ preferences. For example, in their focus group study of UK secondary school teachers, Shelemy et al [ 20 ] found that participants wanted engaging, interactive, and concise training that included practical strategies and illustrative case studies, all of which are central to At-Risk . While the authors found that teachers disagreed over the usefulness of web-based training, it is possible that these concerns would have decreased during the COVID-19 pandemic as staff became more accepting of web-based opportunities to learn.

Qualitative findings also demonstrated the importance of school context and culture, which have been highlighted in previous research [ 27 ]. In particular, participants noted the importance of school culture in adopting mental health interventions into regular practice. In their systematic review, Moore et al [ 90 ] identified school culture, values, and policies as key facilitators of sustaining mental health interventions. A related area of focus was support from the SLT. This support is a well-recognized factor contributing to intervention success and sustainability for several reasons, including these leaders’ practical role in communicating about interventions and allocating specific time and resources to them [ 43 , 90 , 91 ]. However, it is important to recognize that mental health training for school staff may be even more needed and impactful in schools where mental health is not as much of a priority.

Limitations

Our mixed methods approach, wide range of outcomes, and diverse sample of participating schools offer rich information regarding the feasibility of At-Risk in the United Kingdom. These strengths notwithstanding, there are also several limitations to consider when interpreting the findings. The nonrandomized design, while common for feasibility studies, prevents any conclusions regarding causality and also limits the exploration of other factors that may have influenced outcomes (eg, providing teachers and TAs with the Mental Health Resource Maps or SENCos and mental health leads with feedback on identified pupils). In terms of recruitment and retention, the study had 50% (54/108) attrition. Several factors may have influenced this, including the increased pressure on school staff due to the COVID-19 pandemic, the timing of the study within the school year, and the requirement to communicate with participants only via the study link person. While we tried to explore the effect of attrition through a complete case sensitivity analysis, we lacked important information on the characteristics of those who dropped out as this information was collected only at T2. Furthermore, we were only able to recruit 8 staff members for the posttraining interviews, which was far below our recruitment target. Low participation rates could again be due to several factors, including the impact of the COVID-19 pandemic or competing priorities. Of note, we were not able to recruit anyone who did not complete the training, any headteachers, or any staff from 2 of the schools (schools A and D). This could mean that we lack viewpoints that may be important for understanding the feasibility and utility of the training.

There were also limitations associated with the study measures. While the Teacher and TA Identification Form was informed by the literature and reviewed by our primary school staff advisory group, its validity and reliability are unknown. In addition, the questionnaire only measured mental health support outcomes for those pupils identified as having mental health difficulties or increased risk. Therefore, we do not have information on those who were not identified. The measure is also based on teacher and TA reports and so may not have complete information about all types of support that pupils receive. Another important limitation pertains to the comparator used to assess the identification outcomes. To understand the potential utility of the training program, it is important to have a robust comparator. While we chose to use the teacher-report SDQ, it would also have been interesting to compare identification outcomes with parent-rated mental health difficulties, particularly in light of the low interrater agreement of common measures of child mental health difficulties [ 92 ]. An even stronger comparator would be to assess the teacher and TA identification outcomes against a clinical interview; however, this was not feasible in this study.

Finally, at the time of writing, At-Risk is not available for use as Kognito restructures its offerings. This reflects an unfortunately common occurrence in the field of mental health, whereby many evidence-based digital tools are unavailable to potential end users [ 93 ]. Nonetheless, the lessons from this feasibility study offer rich information on what type of content and format may be useful for training programs in this area and, as such, can support further development and evaluation in the field.

Implications for Practice

Studies have consistently demonstrated that school staff would appreciate additional training on how best to support pupil mental health [ 18 , 20 , 87 , 88 ]. However, to be scalable, such programs must be realistic in terms of time, cost, and resource requirements [ 28 , 90 , 91 ]. Contextualized within the wider literature on school-based mental health interventions, the findings from this study suggest that mental health training is a feasible option for upskilling school staff to identify and respond to pupil mental health difficulties. They further highlight several specific factors that might positively contribute to feasibility and scalability, many of which are reflected in the broader literature on mental health training [ 20 , 28 ]. For example, teachers and TAs appreciated that the training actively engaged them in learning and applying new skills and that it used realistic examples to demonstrate the real-world applicability of the training, whereas school leaders identified the relatively low time and cost requirements and flexibility as key factors that could make the training feasible for their school context.

However, this is not to say that there are no implementation barriers associated with At-Risk or similar training programs. While the resources required to implement At-Risk are relatively low compared with other training programs, they must still be considered within the context of other school priorities. As demonstrated in the interviews and the wider literature [ 3 , 43 , 90 , 91 ], support from school leadership is essential for securing the time and budget required to implement a training program such as At-Risk , and in schools where mental health is not a priority, there are likely to be many barriers to implementation. Even in schools with strong support from the leadership team, it may be difficult to find the requisite budget, time, and human resources to devote to the training. Finally, as is the case with any school-based mental health intervention, it is important that schools do not bear sole responsibility for pupils’ mental health. Active partnership between schools and mental health services is key to ensuring that schools feel empowered and supported in this role [ 21 , 90 , 94 ]. While the schools in this study worked hard to support pupils as best they could, interviewees expressed frustration about the difficulty of accessing external support for children who could benefit from it. This is not an uncommon theme in the wider literature on school-based interventions [ 20 , 23 , 91 ] and is a key consideration for scaling up training programs.

Implications for Future Research

The promising findings of this study suggest that additional research is needed to explore the role of scalable mental health training in supporting schools to protect and promote children’s mental health. On the basis of gaps in the literature, particular areas of interest include training for primary school staff (as most existing programs focus on secondary school staff), web-based training (as opposed to traditional time- and resource-intensive in-person training), and training that takes a “whole school approach” by including all school staff members (rather than only teachers). This final area is especially interesting as findings from this study and others [ 27 ] have highlighted stakeholders’ preference that training programs include all school staff members. While our study jointly analyzed findings for teachers and TAs, future research would do well to consider how the unique roles and perspectives of these professionals—as well as other staff members within the school setting—might influence outcomes. Furthermore, future studies should be more inclusive in their choice of outcomes: too often, evaluations of school staff training programs have focused on intermediate outcomes such as knowledge or confidence [ 31 ] without considering more “downstream” outcomes such as access to support. Finally, as demonstrated in our study, there is great value in using mixed methods approaches and including information about wider issues of feasibility and implementation; studies that take this broader lens can help identify programs that are scalable, sustainable, and effective.

Conclusions

School staff would welcome additional mental health training to enable them to respond to pupil mental health difficulties, but there are many barriers to implementing such training at scale. Therefore, training programs that have relatively low time and resource requirements have great potential to fulfill an unmet need in schools. This mixed methods feasibility study showed that At-Risk for Elementary School Educators —an example of a brief, interactive web-based training program—is a feasible means of empowering school staff to accurately identify and respond to pupil mental health difficulties and increased risk.

Acknowledgments

The authors would like to thank Professor Paul Ramchandani for his input in the design of this study, the Cambridge Mental Health in Schools Advisory Group for sharing their views and advice throughout the study, and the staff at the 6 schools that took part in this study for their time and effort. This study was funded by the UK Research and Innovation Emerging Minds network (grant ES/S004726/2), and the training was provided to the schools free of cost by Kognito. ES was funded by a Gates Cambridge Scholarship (grant OPP1144) for the duration of the study. MF is funded by the National Institute for Health and Care Research (NIHR) Oxford and Thames Valley Applied Research Collaboration at the Oxford Health National Health Service (NHS) Foundation Trust. PBJ is funded by the NIHR (grant 0616-20003). All research in the Cambridge Department of Psychiatry is supported by the NIHR Applied Research Collaboration East of England and the Cambridge Biomedical Research Centre at the Cambridgeshire and Peterborough NHS Foundation Trust. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR, the Department of Health and Social Care, or the Bill & Melinda Gates Foundation.

Data Availability

The data sets generated and analyzed during this study are not publicly available due to restrictions associated with our ethics approvals but are available from the corresponding author on reasonable request.

Conflicts of Interest

Kognito is a for-profit company. After reviewing their questionnaires on preparedness and self-efficacy and having found them rigorous and unbiased, we independently decided to include them as outcomes in our study. Kognito had no role in the study design, analysis, or publication. PBJ was a scientific advisory board member for MSD. All other authors declare no other conflicts of interest.

Teacher and Teaching Assistant Identification Form.

Kognito pre- and posttraining surveys.

Interview topic guides.

School characteristics.

Results from complete case sensitivity analysis.

Results from the sensitivity analysis excluding school D.

Quantitative and qualitative findings pertaining to the potential harms of At-Risk.

  • Transforming children and young people's mental health provision: a green paper. Department of Health and Department for Education, United Kingdom Government. Dec 2017. URL: https://assets.publishing.service.gov.uk/media/5a823518e5274a2e87dc1b56/Transforming_children_and_young_people_s_mental_health_provision.pdf [accessed 2024-03-25]
  • Future in mind: promoting, protecting and improving our children and young people’s mental health and wellbeing. National Health Service England. URL: https://assets.publishing.service.gov.uk/media/5a80b26bed915d74e33fbe3c/Childrens_Mental_Health.pdf [accessed 2024-03-25]
  • Promoting children and young people's mental health and wellbeing. Office for Health Improvement and Disparities and Department for Education, United Kingdom Government. Mar 20, 2015. URL: https://www.gov.uk/government/publications/promoting-children-and-young-peoples-emotional-health-and-wellbeing [accessed 2024-03-25]
  • Newlove-Delgado T, Williams T, Robertson K, McManus S, Sadler K, Vizard T, et al. Mental health of children and young people in England 2021 - wave 2 follow up to the 2017 survey. National Health Service Digital. 2021. URL: https://openaccess.city.ac.uk/id/eprint/28718/ [accessed 2024-03-25]
  • Vizard T, Sadler K, Ford T, Newlove-Delgado T, McManus S, Marcheselli F, et al. Mental health of children and young people in England, 2020. National Health Service Digital. Oct 22, 2020. URL: https://www.infocoponline.es/pdf/mhcyp_2020_rep.pdf [accessed 2024-03-25]
  • Pitchforth J, Fahy K, Ford T, Wolpert M, Viner RM, Hargreaves DS. Mental health and well-being trends among children and young people in the UK, 1995-2014: analysis of repeated cross-sectional national health surveys. Psychol Med. Jun 2019;49(8):1275-1285. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Newlove-Delgado T, McManus S, Sadler K, Thandi S, Vizard T, Cartwright C, et al. Child mental health in England before and during the COVID-19 lockdown. Lancet Psychiatry. May 2021;8(5):353-354. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Creswell C, Shum A, Pearcey S, Skripkauskaite S, Patalay P, Waite P. Young people's mental health during the COVID-19 pandemic. Lancet Child Adolesc Health. Aug 2021;5(8):535-537. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Fazel M, Hoagwood K, Stephan S, Ford T. Mental health interventions in schools 1: mental health interventions in schools in high-income countries. Lancet Psychiatry. Oct 2014;1(5):377-387. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Kessler RC, Berglund P, Demler O, Jin R, Merikangas KR, Walters EE. Lifetime prevalence and age-of-onset distributions of DSM-IV disorders in the National Comorbidity Survey Replication. Arch Gen Psychiatry. Jun 2005;62(6):593-602. [ CrossRef ] [ Medline ]
  • Greenberg MT, Domitrovich CE, Weissberg RP, Durlak JA. Social and emotional learning as a public health approach to education. Future Child. 2017;27(1):13-32. [ CrossRef ]
  • Humphrey N, Wigelsworth M. Making the case for universal school-based mental health screening. Emot Behav Difficulties. 2016;21(1):22-42. [ CrossRef ]
  • Kutcher S, Wei Y. School mental health: a necessary component of youth mental health policy and plans. World Psychiatry. Jun 2020;19(2):174-175. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Fazel M, Soneson E. Current evidence and opportunities in child and adolescent public mental health: a research review. J Child Psychol Psychiatry. Dec 2023;64(12):1699-1719. [ CrossRef ] [ Medline ]
  • Stephan SH, Weist M, Kataoka S, Adelsheim S, Mills C. Transformation of children's mental health services: the role of school mental health. Psychiatr Serv. Oct 2007;58(10):1330-1338. [ CrossRef ]
  • Day L, Blades R, Spence C, Ronicle J. Mental health services and schools link pilots: evaluation brief. Department for Education, United Kingdom Government. Feb 2017. URL: https://assets.publishing.service.gov.uk/media/5a817dfaed915d74e33fe812/Evaluation_of_the_MH_services_and_schools_link_pilots-RB.pdf [accessed 2024-03-25]
  • Long MW, Albright G, McMillan J, Shockley KM, Price OA. Enhancing educator engagement in school mental health care through digital simulation professional development. J Sch Health. Sep 2018;88(9):651-659. [ CrossRef ] [ Medline ]
  • Rothì DM, Leavey G, Best R. On the front-line: teachers as active observers of pupils’ mental health. Teach Teach Educ. Jul 2008;24(5):1217-1231. [ CrossRef ]
  • Mazzer KR, Rickwood DJ. Teachers' role breadth and perceived efficacy in supporting student mental health. Adv Sch Ment Health Promot. 2015;8(1):29-41. [ CrossRef ]
  • Shelemy L, Harvey K, Waite P. Supporting students’ mental health in schools: what do teachers want and need? Emot Behav Difficulties. 2019;24(1):100-116. [ CrossRef ]
  • Fazel M, Soneson E, Sellars E, Butler G, Stein A. Partnerships at the interface of education and mental health services: the utilisation and acceptability of the provision of specialist liaison and teacher skills training. Int J Environ Res Public Health. Feb 24, 2023;20(5):4066. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Marshall L, Wishart R, Dunatchik A, Smith N. Supporting mental health in schools and colleges: quantitative survey. Department for Education, United Kingdom Government. Aug 2017. URL: https://assets.publishing.service.gov.uk/media/5a82e5bb40f0b6230269d46c/Supporting_Mental-Health_survey_report.pdf [accessed 2024-03-25]
  • Sharpe H, Ford T, Lereya ST, Owen C, Viner RM, Wolpert M. Survey of schools' work with child and adolescent mental health across England: a system in need of support. Child Adolesc Ment Health. Sep 2016;21(3):148-153. [ CrossRef ] [ Medline ]
  • Danby G, Hamilton P. Addressing the ‘elephant in the room’. The role of the primary school practitioner in supporting children’s mental well-being. Pastor Care Educ. 2016;34(2):90-103. [ CrossRef ]
  • Reinke WM, Stormont M, Herman KC, Puri R, Goel N. Supporting children's mental health in schools: teacher perceptions of needs, roles, and barriers. Sch Psychol Q. 2011;26(1):1-13. [ CrossRef ]
  • Kidger J, Gunnell D, Biddle L, Campbell R, Donovan J. Part and parcel of teaching? Secondary school staff's views on supporting student emotional health and well-being. Br Educ Res J. Jan 02, 2013;36(6):919-935. [ CrossRef ]
  • Soneson E, Burn AM, Anderson JK, Humphrey A, Jones PB, Fazel M, et al. Determining stakeholder priorities and core components for school-based identification of mental health difficulties: a Delphi study. J Sch Psychol. Apr 2022;91:209-227. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Soneson E, Howarth E, Ford T, Humphrey A, Jones PB, Thompson Coon J, et al. Feasibility of school-based identification of children and adolescents experiencing, or at-risk of developing, mental health difficulties: a systematic review. Prev Sci. Jul 2020;21(5):581-603. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Childs-Fegredo J, Burn AM, Duschinsky R, Humphrey A, Ford T, Jones PB, et al. Acceptability and feasibility of early identification of mental health difficulties in primary schools: a qualitative exploration of UK school staff and parents’ perceptions. Sch Ment Health. Nov 18, 2020;13:143-159. [ CrossRef ]
  • Anderson JK, Ford T, Soneson E, Coon JT, Humphrey A, Rogers M, et al. A systematic review of effectiveness and cost-effectiveness of school-based identification of children and young people at risk of, or currently experiencing mental health difficulties. Psychol Med. Sep 13, 2018;49(1):9-19. [ CrossRef ]
  • Yamaguchi S, Foo JC, Nishida A, Ogawa S, Togo F, Sasaki T. Mental health literacy programs for school teachers: a systematic review and narrative synthesis. Early Interv Psychiatry. Feb 2020;14(1):14-25. [ CrossRef ] [ Medline ]
  • Kognito homepage. Kognito. URL: https://kognito.com/ [accessed 2024-03-26]
  • At-risk for elementary school educators. Kognito. URL: https://kognito.com/products/at-risk-for-elementary-schools [accessed 2024-03-25]
  • Anderson M, Werner-Seidler A, King C, Gayed A, Harvey SB, O’Dea B. Mental health training programs for secondary school teachers: a systematic review. Sch Ment Health. Oct 3, 2018;11:489-508. [ CrossRef ]
  • Ekornes S. Teacher perspectives on their role and the challenges of inter-professional collaboration in mental health promotion. Sch Ment Health. Mar 17, 2015;7:193-211. [ CrossRef ]
  • Ohrt JH, Deaton JD, Linich K, Guest JD, Wymer B, Sandonato B. Teacher training in K–12 student mental health: a systematic review. Psychol Sch. Mar 03, 2020;57(5):833-846. [ CrossRef ]
  • Parker BL, Anderson M, Batterham PJ, Gayed A, Subotic-Kerry M, Achilles MR, et al. Examining the preliminary effectiveness and acceptability of a web-based training program for Australian secondary school teachers: pilot study of the BEAM (building educators' skills in adolescent mental health) program. JMIR Ment Health. Oct 22, 2021;8(10):e29989. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • McVey G, Gusella J, Tweed S, Ferrari M. A controlled evaluation of web-based training for teachers and public health practitioners on the prevention of eating disorders. Eat Disord. 2009;17(1):1-26. [ CrossRef ] [ Medline ]
  • Pereira CA, Wen CL, Miguel EC, Polanczyk GV. A randomised controlled trial of a web-based educational program in child mental health for schoolteachers. Eur Child Adolesc Psychiatry. Aug 2015;24(8):931-940. [ CrossRef ] [ Medline ]
  • Kognito At-Risk in PK-12. Oklahoma Department of Mental Health and Substance Abuse Services. URL: https://oklahoma.gov/odmhsas/trainings/training-institute/at-risk-pk-12.html [accessed 2024-03-14]
  • Albright G, Fazel M, Khalid N, McMillan J, Hilty D, Shockley K, et al. High school educator training by simulation to address emotional and behavioral concerns in school settings: a randomized study. J Technol Behav Sci. 2022;7(3):277-289. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Greif Green J, Levine RS, Oblath R, Corriveau KH, Holt MK, Albright G. Pilot evaluation of preservice teacher training to improve preparedness and confidence to address student mental health. Evid Based Pract Child Adolesc Ment Health. 2020;5(1):42-52. [ CrossRef ]
  • Domitrovich CE, Bradshaw CP, Poduska JM, Hoagwood K, Buckley JA, Olin S, et al. Maximizing the implementation quality of evidence-based preventive interventions in schools: a conceptual framework. Adv Sch Ment Health Promot. Jul 2008;1(3):6-28. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Bonell C, Prost A, Melendez-Torres GJ, Davey C, Hargreaves JR. Will it work here? A realist approach to local decisions about implementing interventions evaluated as effective elsewhere. J Epidemiol Community Health. Jan 2021;75(1):46-50. [ CrossRef ] [ Medline ]
  • Domitrovich CE, Greenberg MT. The study of implementation: current findings from effective programs that prevent mental disorders in school-aged children. J Educ Psychol Consult. 2000;11(2):193-221. [ CrossRef ]
  • Evans RE, Craig P, Hoddinott P, Littlecott H, Moore L, Murphy S, et al. When and how do 'effective' interventions need to be adapted and/or re-evaluated in new contexts? The need for guidance. J Epidemiol Community Health. Jun 2019;73(6):481-482. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Gardner F, Montgomery P, Knerr W. Transporting evidence-based parenting programs for child problem behavior (age 3-10) between countries: systematic review and meta-analysis. J Clin Child Adolesc Psychol. 2016;45(6):749-762. [ CrossRef ] [ Medline ]
  • Leijten P, Melendez-Torres GJ, Knerr W, Gardner F. Transported versus homegrown parenting interventions for reducing disruptive child behavior: a multilevel meta-regression study. J Am Acad Child Adolesc Psychiatry. Jul 2016;55(7):610-617. [ CrossRef ] [ Medline ]
  • Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new medical research Council guidance. BMJ. Sep 29, 2008;337:a1655. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Moore L, Hallingberg B, Wight D, Turley R, Segrott J, Craig P, et al. Exploratory studies to inform full-scale evaluations of complex public health interventions: the need for guidance. J Epidemiol Community Health. Oct 2018;72(10):865-866. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Eldridge SM, Lancaster GA, Campbell MJ, Thabane L, Hopewell S, Coleman CL, et al. Defining feasibility and pilot studies in preparation for randomised controlled trials: development of a conceptual framework. PLoS One. 2016;11(3):e0150205. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Bowen DJ, Kreuter M, Spring B, Cofta-Woerpel L, Linnan L, Weiner D, et al. How we design feasibility studies. Am J Prev Med. May 2009;36(5):452-457. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • McLennan D, Noble S, Noble M, Plunkett E, Wright G, Gutacker N. The English indices of deprivation 2019: technical report. Ministry of Housing, Communities and Local Government, United Kingdom Government. Sep 2019. URL: https://dera.ioe.ac.uk/34259/1/IoD2019_Technical_Report.pdf [accessed 2024-03-25]
  • Schools, pupils and their characteristics: January 2019. Department for Education, United Kingdom Government. Jun 2019. URL: https://www.gov.uk/government/statistics/schools-pupils-and-their-characteristics-january-2019 [accessed 2024-03-25]
  • Special educational needs in England: January 2019. Department for Education, United Kingdom Government. Jul 4, 2019. URL: https://assets.publishing.service.gov.uk/media/5d1c91cce5274a08df3d35bd/SEN_2019_Text.docx.pdf [accessed 2024-03-25]
  • Goodman R. The Strengths and Difficulties Questionnaire: a research note. J Child Psychol Psychiatry. Jul 07, 1997;38(5):581-586. [ CrossRef ] [ Medline ]
  • Goodman R. The extended version of the Strengths and Difficulties Questionnaire as a guide to child psychiatric caseness and consequent burden. J Child Psychol Psychiatry. Jul 1999;40(5):791-799. [ CrossRef ]
  • Goodman R, Ford T, Simmons H, Gatward R, Meltzer H. Using the Strengths and Difficulties Questionnaire (SDQ) to screen for child psychiatric disorders in a community sample. Br J Psychiatry. Dec 02, 2000;177(6):534-539. [ CrossRef ] [ Medline ]
  • Goodman R, Meltzer H, Bailey V. The Strengths and Difficulties Questionnaire: a pilot study on the validity of the self-report version. Eur Child Adolesc Psychiatry. Sep 12, 1998;7(3):125-130. [ CrossRef ] [ Medline ]
  • Stone LL, Otten R, Engels RC, Vermulst AA, Janssens JM. Psychometric properties of the parent and teacher versions of the Strengths and Difficulties Questionnaire for 4- to 12-year-olds: a review. Clin Child Fam Psychol Rev. Sep 2010;13(3):254-274. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Albright GL, Davidson J, Goldman R, Shockley KM, Timmons-Mitchell J. Development and validation of the Gatekeeper Behavior Scale. Crisis. Jul 2016;37(4):271-280. [ CrossRef ] [ Medline ]
  • Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. Aug 07, 2009;4:50. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Lyon AR, Bruns EJ. From evidence to impact: joining our best school mental health practices with our best implementation strategies. School Ment Health. Mar 1, 2019;11(1):106-114. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Schools, pupils and their characteristics: January 2020. Department for Education, United Kingdom Government. Jun 25, 2020. URL: https://www.gov.uk/government/statistics/schools-pupils-and-their-characteristics-january-2020 [accessed 2024-03-25]
  • Eldridge SM, Chan CL, Campbell MJ, Bond CM, Hopewell S, Thabane L, et al. CONSORT 2010 statement: extension to randomised pilot and feasibility trials. BMJ. Oct 24, 2016;355:i5239. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Wilson DT, Walwyn RE, Brown J, Farrin AJ, Brown SR. Statistical challenges in assessing potential efficacy of complex interventions in pilot or feasibility studies. Stat Methods Med Res. Jun 2016;25(3):997-1009. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Lancaster GA, Dodd S, Williamson PR. Design and analysis of pilot studies: recommendations for good practice. J Eval Clin Pract. May 2004;10(2):307-312. [ CrossRef ] [ Medline ]
  • Thabane L, Ma J, Chu R, Cheng J, Ismaila A, Rios LP, et al. A tutorial on pilot studies: the what, why and how. BMC Med Res Methodol. Jan 06, 2010;10:1. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • R: a language and environment for statistical computing. The Comprehensive R Archive Network. Feb 10, 2015. URL: https://www.gbif.org/tool/81287/r-a-language-and-environment-for-statistical-computing [accessed 2024-03-25]
  • Wickham H. Ggplot2: Elegant Graphics for Data Analysis. New York, NY. Springer; 2009.
  • Bryer J, Speerschneider K. Analysis and visualization of Likert based items. GitHub. URL: http://github.com/jbryer/likert
  • YouthinMind homepage. YouthinMind. URL: https://youthinmind.com/ [accessed 2024-03-26]
  • Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. Nov 2005;15(9):1277-1288. [ CrossRef ] [ Medline ]
  • Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. Sep 18, 2013;13:117. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77-101. [ CrossRef ]
  • Braun V, Clarke V. Thematic Analysis: A Practical Guide. Thousand Oaks, CA. SAGE Publications; 2021.
  • School workforce in England. United Kingdom Government. Jun 8, 2023. URL: https://explore-education-statistics.service.gov.uk/find-statistics/school-workforce-in-england [accessed 2024-03-14]
  • Jorm AF, Kitchener BA, Sawyer MG, Scales H, Cvetkovski S. Mental health first aid training for high school teachers: a cluster randomized trial. BMC Psychiatry. Jun 24, 2010;10:51. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Moor S, Maguire A, McQueen H, Wells EJ, Elton R, Wrate R, et al. Improving the recognition of depression in adolescence: can we teach the teachers? J Adolesc. Feb 2007;30(1):81-95. [ CrossRef ] [ Medline ]
  • Kidger J, Stone T, Tilling K, Brockman R, Campbell R, Ford T, et al. A pilot cluster randomised controlled trial of a support and training intervention to improve the mental health of secondary school teachers and students - the WISE (Wellbeing in Secondary Education) study. BMC Public Health. Oct 06, 2016;16(1):1060. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Vieira MA, Gadelha AA, Moriyama TS, Bressan RA, Bordin IA. Evaluating the effectiveness of a training program that builds teachers' capability to identify and appropriately refer middle and high school students with mental health problems in Brazil: an exploratory study. BMC Public Health. Feb 28, 2014;14:210. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • McLaughlin RJ, Holcomb JD, Jibaja-Rusth ML, Webb J. Teacher ratings of student risk for substance use as a function of specialized training. J Drug Educ. 1993;23(1):83-95. [ CrossRef ]
  • Soneson E, Childs-Fegredo J, Anderson JK, Stochl J, Fazel M, Ford T, et al. Acceptability of screening for mental health difficulties in primary schools: a survey of UK parents. BMC Public Health. Dec 22, 2018;18(1):1404. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Levitt JM, Saka N, Hunter Romanelli L, Hoagwood K. Early identification of mental health problems in schools: the status of instrumentation. J Sch Psychol. Apr 2007;45(2):163-191. [ CrossRef ]
  • Loades ME, Mastroyannopoulou K. Teachers' recognition of children's mental health problems. Child Adolesc Ment Health. Sep 2010;15(3):150-156. [ CrossRef ] [ Medline ]
  • Anderson JK, Howarth E, Vainre M, Jones PB, Humphrey A. A scoping literature review of service-level barriers for access and engagement with mental health services for children and young people. Child Youth Serv Rev. Jun 2017;77:164-176. [ CrossRef ]
  • Moon J, Williford A, Mendenhall A. Educators' perceptions of youth mental health: implications for training and the promotion of mental health services in schools. Child Youth Serv Rev. Feb 2017;73:384-391. [ CrossRef ]
  • Graham A, Phelps R, Maddison C, Fitzgerald R. Supporting children’s mental health in schools: teacher views. Teach Teach. 2011;17(4):479-496. [ CrossRef ]
  • Atkins MS, Graczyk PA, Frazier SL, Abdul-Adil J. Toward a new model for promoting urban children's mental health: accessible, effective, and sustainable school-based mental health services. Sch Psychol Rev. 2003;32(4):503-514. [ CrossRef ]
  • Moore A, Stapley E, Hayes D, Town R, Deighton J. Barriers and facilitators to sustaining school-based mental health and wellbeing interventions: a systematic review. Int J Environ Res Public Health. Mar 17, 2022;19(6):3587. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Gee B, Wilson J, Clarke T, Farthing S, Carroll B, Jackson C, et al. Review: delivering mental health support within schools and colleges - a thematic synthesis of barriers and facilitators to implementation of indicated psychological interventions for adolescents. Child Adolesc Ment Health. Feb 2021;26(1):34-46. [ CrossRef ] [ Medline ]
  • Fält E, Wallby T, Sarkadi A, Salari R, Fabian H. Agreement between mothers', fathers', and teachers' ratings of behavioural and emotional problems in 3-5-year-old children. PLoS One. Nov 01, 2018;13(11):e0206752. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Bear HA, Ayala Nunes L, DeJesus J, Liverpool S, Moltrecht B, Neelakantan L, et al. Determination of markers of successful implementation of mental health apps for young people: systematic review. J Med Internet Res. Nov 09, 2022;24(11):e40347. [ FREE Full text ] [ CrossRef ] [ Medline ]
  • Durbin K, Harmon S. We are on the same team: child psychiatry and the school system. JAACAP Connect. 2020;7(1):32-36. [ CrossRef ]

Abbreviations

Edited by T Leung; submitted 24.02.23; peer-reviewed by E Widnall, B Fernandes, J Burns, K Cohen; comments to author 27.08.23; accepted 01.03.24; published 23.04.24.

©Emma Soneson, Emma Howarth, Alison Weir, Peter B Jones, Mina Fazel. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 23.04.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.



  8. General-purpose thematic analysis: a useful qualitative method for

    Thematic analysis involves a process of assigning data to a number of codes, grouping codes into themes and then identifying patterns and interconnections between these themes. 2 Thematic analysis allows for a nuanced understanding of what people say and do within their particular social contexts. Of note, thematic analysis can be used with interviews and focus groups and other sources of data ...

  9. Practical thematic analysis: a guide for multidisciplinary health

    In particular, researchers acknowledge that thematic analysis is a flexible and powerful method of systematically generating robust qualitative research findings by identifying, analysing, and reporting patterns (themes) within data.3456 Although qualitative methods are increasingly valued for answering clinical research questions, many ...

  10. Qualitative thematic analysis based on descriptive phenomenology

    In this discursive paper, we provide guidance for thematic analysis based on descriptive phenomenology, which, to our knowledge, has not been made explicit in this way previously. This can be used as a guiding framework to analyse lived experiences in nursing and midwifery research. The aim of this paper was to discuss how to understand and ...

  11. Thematic Analysis & Coding: An Overview of the Qualitative Paradigm

    Qualitative data analysis is the process of organising, eliciting meaning, and presenting conclusions from collected data. It could be a tedious process, as it. involves a large volume of data ...

  12. (PDF) How to use Thematic Analysis in Qualitative Research

    The first step is to become acquainted with the research data. In the second phase, we generate initial codes; in the third, we search for themes; and in the fourth, we review these themes. The ...

  13. How to Do Thematic Analysis

    There are various approaches to conducting thematic analysis, but the most common form follows a six-step process: Familiarisation. Coding. Generating themes. Reviewing themes. Defining and naming themes. Writing up. This process was originally developed for psychology research by Virginia Braun and Victoria Clarke.

  14. Thematic analysis of qualitative data: AMEE Guide No. 131

    Abstract. Thematic analysis is a widely used, yet often misunderstood, method of qualitative data analysis. It is a useful and accessible tool for qualitative researchers, but confusion regarding the method's philosophical underpinnings and imprecision in how it has been described have complicated its use and acceptance among researchers.

  15. (PDF) A Brief Introduction to Thematic Analysis

    Thematic Analysis. Thematic analysis is a data analysis procedure that centres on identification, description, explanation, substantiation and l inkages of themes. It is premised. on the view that ...

  16. PDF Thematic Analysis: a Critical Review of Its Process and Evaluation

    This paper finds that thematic analysis is a comprehensive process where researchers are able to identify numerous cross-references between the data the research's evolving themes (Hayes 1997). It provides flexibility for approaching research patterns in two ways, i.e. inductive and deductive (Frith and Gleeson 2004; Hayes 1997; ...

  17. Conducting Thematic Analysis with Qualitative Data

    qualitative research, research methods, thematic analysis . Creative Commons License . This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 International ... it is that the analyst will state that code as the basis of a theme. While thematic analysis may consider coding frequencies in the production of themes ...

  18. (PDF) Thematic Analysis

    Thematic analysis is a poorly demarcated and rarely-acknowledged, yet widely-used qualitative. analytic method (see Boyatzis, 1998; Roulston, 2001) within and beyond psychology. In this paper, we ...

  19. Thematic Analysis

    In summary, there are two different types of 'themes' that researchers tend to narrate in research papers: 1. A domain summary is a summary of an area (domain) of the data. For example, a summary of everything the participants said in relation to an interview question or a particular theme. So for example, a domain summary type theme could ...

  20. Meta-thematic synthesis of research on early childhood ...

    The growing significance of coding in 21st-century early childhood education extends beyond technical proficiency, encompassing cognitive development, problem-solving, and creativity. Coding is being integrated globally into educational curricula to prepare students for the digital era. This research examines coding's potential impact on cognitive and socio-emotional development and ...

  21. Journal of Medical Internet Research

    Background: This paper explores the widely discussed relationship between electronic media use and sleep quality, indicating negative effects due to various factors. However, existing meta-analyses on the topic have some limitations. Objective: The study aims to analyze and compare the impacts of different digital media types, such as smartphones, online games, and social media, on sleep quality.

  22. Theme development in qualitative content analysis and thematic analysis

    struction in qualitative content analysis and thematic analysis. are addressed in this paper to help with an analytical clar-. ification, and to increase rigour and acceptability of data ...

  23. Journal of Medical Internet Research

    Background: Schools in the United Kingdom and elsewhere are expected to protect and promote pupil mental health. However, many school staff members do not feel confident in identifying and responding to pupil mental health difficulties and report wanting additional training in this area. Objective: We aimed to explore the feasibility of Kognito's At-Risk for Elementary School Educators, a ...

  24. (PDF) Thematic analysis.

    thematic map is a visual (see Braun & Clarke, 2006) or sometimes text-based (see Frith &. Gleeson, 2004) tool to map out the facets of your developing analysis and identify main themes, subthemes ...