
Open-Ended Questions in Qualitative Research: Strategies, Examples, and Best Practices

Table of Contents

  • Understanding Open-Ended Questions
  • Designing Open-Ended Questions
  • Types of Open-Ended Questions
  • Conducting Interviews and Focus Groups with Open-Ended Questions
  • Analyzing and Interpreting Open-Ended Responses
  • Challenges and Limitations of Using Open-Ended Questions
  • Best Practices for Using Open-Ended Questions in Qualitative Research

Definition of Open-Ended Questions

Open-ended questions are a research tool that allows for a wide range of possible answers and encourages respondents to provide detailed and personalized responses. These questions typically begin with words such as “How,” “What,” or “Why,” and require the respondent to provide their thoughts and opinions.

Open-ended questions are crucial in the following scenarios:

Understanding complex phenomena: When a topic is complex, multi-faceted, or difficult to measure with numerical data, qualitative research can provide a more nuanced and detailed understanding.

Studying subjective experiences: When the focus is on people’s perceptions, attitudes, beliefs, or experiences, qualitative research is better suited to capture the richness and diversity of their perspectives.

Developing theories: When a researcher wants to develop a model or theory to explain a phenomenon, qualitative research can provide a rich source of data to support the development of such hypotheses.

Evaluating programs or interventions: Qualitative research can help to evaluate the effectiveness of programs or interventions by collecting feedback from participants, stakeholders, or experts.

Researchers use open-ended methods in research, interviews, counseling, and other situations that may require detailed and in-depth responses.

Benefits of Using Open-Ended Questions in Qualitative Research

Qualitative research is most appropriate when the research question is exploratory, complex, subjective, theoretical, or evaluative. These questions are valuable in qualitative research for the following reasons:

More In-depth Responses

Open-ended questions allow participants to share their experiences and opinions in their own words, often leading to more in-depth and detailed responses.  For example, if a researcher is studying cancer survivors’ experiences, an open-ended question like, “Can you tell me about your experience with cancer?” may elicit a more detailed and nuanced response than a closed-ended question like “Did you find your cancer diagnosis to be difficult?”

Flexibility

Open-ended questions give the participant flexibility to respond to the questions in a way that makes sense to them, often revealing vital information that the researcher may have overlooked.

Better Understanding

Open-ended questions provide the researcher with a better understanding of the participant’s perspectives, beliefs, attitudes, and experiences, which is crucial in gaining insights into complex issues.

Uncovering New Insights

Open-ended questions can often lead to unexpected responses and reveal new information. When participants freely express themselves in their own words, they may bring up topics or perspectives that the researcher had not considered.

Building Rapport

Open-ended questions help build rapport with the participant, allowing the researcher to show interest in the participant’s responses and provide a space for them to share their experiences without feeling judged. This can lead to a positive research experience for participants, which may increase the likelihood of their continued participation in future studies.

Validating or Challenging Existing Theories

By allowing participants to provide their own perspectives and experiences, researchers can compare and contrast these responses with existing theories to see if they align or diverge. If the data from participants align with existing theories, this can provide additional support for those theories. On the other hand, if the data diverge from existing theories, this can indicate a need for further investigation or for revising those theories.

Avoiding Bias and Preconceived Notions

Researchers may unintentionally guide participants towards a particular answer or perspective when using closed-ended questions. This can introduce bias into the data and limit the range of responses that participants provide. By using open-ended questions, researchers can avoid this potential source of bias and allow participants to express their unique perspectives.

Differences Between Open-Ended and Closed-Ended Questions

Open-ended questions invite a wide range of responses and allow respondents to share their thoughts and opinions. Words such as “What,” “How,” or “Why” are used to phrase open-ended questions, which are designed to elicit more detailed and expansive answers. Researchers use open-ended questions in ethnography, interviews, and focus groups to gather comprehensive information and participants’ insights.

Some examples of open-ended questions include:

  • What do you think about the current state of the economy?
  • How do you feel about global warming?
  • Why did you choose to pursue a career in law?

On the other hand, closed-ended questions allow only a limited set of responses and are typically answered with a “Yes” or “No” or a specific option from a list of choices. These questions are handy in surveys, customer service interactions, and questionnaires for collecting quantitative data that can be easily analyzed and quantified. They are most useful when you need to gather specific information quickly or confirm a particular fact.

Some examples of closed-ended questions include:

  • Were you satisfied with your shopping experience with our company?
  • Have you ever traveled to Europe before?
  • Which of these brands do you prefer: Nike, Adidas, or Puma?

Both open-ended and closed-ended questions have their place in research and communication. Open-ended questions can provide rich and detailed information, while closed-ended questions can provide specific and measurable data. The appropriate question type typically depends on the research or communication goals, context and the information required.

Designing open-ended questions requires careful consideration and planning. Open-ended questions elicit more than just a simple “yes” or “no” response and instead allow for a broad range of answers that provide insight into the respondent’s thoughts, feelings, or experiences. When designing open-ended questions in qualitative research, it is critical to consider the best practices below:


Before designing your questions, you must predetermine what you want to learn from your respondents. This, in turn, will help you craft clear and concise questions that are relevant to your research goals. Use simple language and avoid technical terms or jargon that might confuse respondents.

Avoid leading or biased language that could influence and limit the respondents’ answers. Instead, use neutral wording that allows participants to share their authentic thoughts and opinions. For example, instead of asking, “Did you enjoy the food you ate?” ask, “What was your experience at the restaurant?”

One of the advantages of open-ended questions is that they allow respondents to provide detailed and personalized responses. Encourage participants to elaborate on their answers by asking follow-up questions or probing for additional information.

One can deliver open-ended questions in various formats, including interviews, surveys, and focus groups. Consider which one is most appropriate for your research goals and target audience. Additionally, before using your questions in a survey or interview, test them with a small group of people to make sure they are clear and functional.

Open-ended questions give a participant the freedom to answer without restriction. Furthermore, these questions evoke detailed responses from participants, unlike closed-ended questions that tend to lead to one-word answers.

Open-Ended Questions Categories

Exploratory Questions

Researchers use these questions to explore a topic or phenomenon that is not well understood and to generate hypotheses and insights. For instance, “Can you tell me more about your thoughts on animal poaching in Africa?” or “What is your opinion on the future of social media in business?”

Reflective Questions

Researchers use these questions to prompt respondents to think more deeply about a particular topic or experience, sometimes drawing on anecdotes related to a specific topic. For example, “What did you learn from that experience?” or “How do you think you could have handled that situation differently?”

Probing Questions

Researchers use probing questions to gain deeper insight into a participant’s response. These questions aim to understand the reasoning and emotion behind a particular answer. For example, “What did you learn from that mistake?” or “How do you think you could have handled that situation differently?”

Clarifying Questions

These questions seek more information or clarify a point. For example, “Can you explain that further?” or “Can you give me an example?”

Hypothetical Questions

These questions ask the respondents to imagine a hypothetical scenario and provide their thoughts or reactions. Examples of hypothetical questions include “What would you do if you won the lottery?” or “How do you think society would be different if everyone had access to free healthcare?”

Descriptive Questions

These questions ask the respondent to describe something in detail, such as a person, place, or event. Examples of descriptive questions include “Can you tell me about your favorite vacation?” or “How would you describe your ideal job?”

When preparing for an interview, it is important to understand the types of interviews available, what topics will be covered, and how to ask open-ended questions.

Questions should be asked in terms of past, present, and future experiences and should be worded in such a way as to invite a more detailed response from the participant. It is also important to establish a clear sequence of questions so that all topics are addressed without interrupting the flow of conversation.

Planning and Preparing For Interviews and Focus Groups

Before starting an interview or focus group, creating a list of topics or areas you want to explore during your research is essential. Consider what questions will help you gain the most insight into the topic.

Once you’ve identified the topics, you can create more specific questions that will be used to guide the conversation. It can be helpful to categorize your questions into themes to ensure all topics are addressed during the interview.

As you write your questions, aim to keep them as open-ended as possible so that the participant has space to provide detailed feedback. Avoid leading questions and try to avoid yes or no answers. Also, allow participants to provide any additional thoughts they may have on the topic.

Let’s say you’re researching customer experience with an online store. Your broad topic categories might be customer service, product selection, ease of use, and shipping. Your questions could cover things like:

  • How satisfied are you with the customer service?
  • What do you think about the product selection?
  • How easy is it to find the products you’re looking for?

Best Practices

During the conversation, only one person should speak at a time, and everyone should have the chance to contribute. To ensure participants understand the questions being asked, try asking them in multiple ways.

It is also important to pause briefly and review the question that has just been discussed before moving on. In addition, brief pauses and silences before and after asking a new question may help facilitate the discussion. If participants begin talking about something that may be an answer to a different question during the discussion, then feel free to allow the conversation to go in that direction.

With these strategies, examples, and best practices in mind, you can ensure that your interviews and focus groups are successful.

Tips For Asking Open-Ended Questions During Interviews and Focus Groups

Asking open-ended questions during interviews and focus groups is critical to qualitative research. Open-ended questions allow you to explore topics in-depth, uncover deeper insights, and gain valuable participant feedback.

However, crafting your questions with intention and purpose is important to ensure that you get the most out of your research.


Start With General Questions

When crafting open-ended questions for interviews or focus groups, it’s important to start with general questions and move towards more specific ones. This strategy helps you uncover various perspectives and ideas before getting into the details.

Using neutral language helps to avoid bias and encourages honest answers from participants. It’s also important to determine the goal of the focus group or interview before asking any questions; these goals will help guide your conversation and keep it on track.

Use of Engagement Questions

To get the conversation started during interviews or focus groups, engagement questions are a great way to break the ice. These types of questions can be about anything from personal experiences to interests.

For example: “How did you get here, and what was one unusual thing you saw on your way in?”, “What do you like to do to unwind in your free time?” or “When did you last purchase a product from this line?”.

Use of Exploratory Questions

Exploratory questions about features are also useful in this type of research. Questions such as: “What features would you talk about when recommending this product to a friend?”, “If you could change one thing about this product, what would you change?”, or “Do you prefer this product or that product, and why?” all help to uncover participants’ opinions and preferences.

Exploratory questions about experiences are also helpful; questions such as “Tell me about a time you experienced a mishap when using this product” help to identify potential problems that need to be addressed.

Researchers can gain valuable insights from participants by using these tips for asking open-ended questions during interviews and focus groups.

Strategies For Active Listening and Follow-Up Questioning

Active listening is an important skill to possess when conducting qualitative research. It’s essential to ensure you understand and respond to the person you are interviewing effectively. Here are some strategies for active listening and follow-up questioning:

Pay Attention to Non-Verbal Cues

When listening, pay attention to non-verbal cues such as body language, facial expressions, and tone of voice, as these help you better understand what the participant is saying. Make sure not to interrupt, as interruptions can make participants feel that their opinions aren’t being heard.

Listen Without Judging or Jumping to Conclusions

It is important to listen without judgment or jumping to conclusions. Don’t plan what to say next while listening, as this will stop you from understanding what the other person is saying.

Use Non-Verbal Signals to Show That You’re Listening

Nodding, smiling, and making small noises like “yes” and “uh huh” can show that you are listening. These signals can help the person feel more comfortable and open up more.

Don’t Impose Your Opinions or Solutions

When interviewing someone, it is important not to impose your opinions or solutions. It is more important to understand the other person and try to find common ground than it is to be right.

Stay Focused While Listening

Finally, it is critical to stay focused while listening. Don’t let yourself get distracted by your own thoughts or daydreaming. Remain attentive and listen with an open mind.

These are all key elements in effectively gathering data and insights through qualitative research.


Qualitative research depends on understanding the context and content of the responses to open-ended questions. Analyzing and interpreting these responses can be challenging for researchers, so it’s important to have a plan and strategies for getting the most value out of open-ended responses.

Strategies For Coding and Categorizing Responses

Coding qualitative data means categorizing and organizing responses to open-ended questions in a research study. It is an essential part of the qualitative data analysis process and helps identify patterns, themes, and trends in the responses.

Thematic Analysis and Qualitative Data Analysis Software

Thematic analysis and qualitative data analysis software are two common approaches to coding customer feedback. Thematic analysis is the process of identifying patterns within qualitative data. This can be done by manually sorting through customer feedback or by using a software program to do the work for you.

Qualitative data analysis software also facilitates coding by providing powerful visualizations that allow users to identify trends and correlations between different customer responses.

Manual Coding

Manual coding is another method of coding qualitative data, in which coders sort through responses and assign labels based on common themes. Coding the data in this way makes it easier to interpret customer feedback and draw meaningful conclusions from it.

Coding customer feedback helps researchers make data-driven decisions based on customer satisfaction. It helps quantify the common themes in customer language, making it easier to interpret and analyze customer feedback accurately.

Strategies for manual coding include using predetermined codes for common words or phrases and assigning labels to customers’ responses according to certain categories. Examples of best practices for coding include using multiple coders to review responses for accuracy and consistency and creating a library of codes for ease of use.
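To make this concrete, here is a minimal Python sketch of coding responses against a predetermined codebook; the codebook, keywords, and responses are hypothetical placeholders for ones you would develop from your own data.

```python
from collections import defaultdict

# Hypothetical codebook: each code maps to keywords or phrases that signal it.
CODEBOOK = {
    "ease_of_use": ["easy to use", "intuitive", "simple"],
    "negative_experience": ["didn't like", "frustrating", "disappointed"],
    "customer_service": ["support", "helpful staff", "service"],
}

def code_response(response: str) -> list[str]:
    """Assign every code whose keywords appear in the response (case-insensitive)."""
    text = response.lower()
    return [code for code, keywords in CODEBOOK.items()
            if any(keyword in text for keyword in keywords)]

def code_responses(responses: list[str]) -> dict[str, list[int]]:
    """Map each code to the indices of the responses it was assigned to."""
    assignments = defaultdict(list)
    for index, response in enumerate(responses):
        for code in code_response(response):
            assignments[code].append(index)
    return dict(assignments)

if __name__ == "__main__":
    sample = [
        "The site was easy to use and the support staff were helpful.",
        "I didn't like the checkout process; it was frustrating.",
    ]
    print(code_responses(sample))
```

In practice, keyword matching is only a first pass: a second coder should review the assignments, and the codebook should be refined as new themes emerge from the data.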

Identifying Themes and Patterns in Responses

These processes involve reviewing the responses and searching for commonalities regarding words, phrases, topics, or ideas. Doing so can help researchers to gain a better understanding of the material they are analyzing.

There are several strategies that researchers can use when it comes to identifying themes and patterns in open-ended responses.

Manual Scan

One strategy is manually scanning the data and looking for words or phrases that appear multiple times.

Automatic Scan

Another approach is to use qualitative analysis software that can provide coding, categorization, and data analysis.

For example, if a survey asked people about their experience with a product, a researcher could look for common phrases such as “it was easy to use” or “I didn’t like it.” The researcher could then look for patterns regarding how frequently these phrases were used.
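A minimal Python sketch of that frequency check might look like the following; the phrases and responses shown are hypothetical stand-ins for ones noted during a first reading of your own data.

```python
from collections import Counter

# Hypothetical phrases of interest noted during a first read of the responses.
PHRASES = ["easy to use", "didn't like it", "too expensive"]

def phrase_frequencies(responses: list[str]) -> Counter:
    """Count how many responses mention each phrase at least once."""
    counts = Counter()
    for response in responses:
        text = response.lower()
        for phrase in PHRASES:
            if phrase in text:
                counts[phrase] += 1
    return counts

responses = [
    "Overall it was easy to use.",
    "Honestly, I didn't like it, and it felt too expensive.",
    "Easy to use, but too expensive for what you get.",
]
print(phrase_frequencies(responses).most_common())
```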

Concept Indicator Model

This model is an important part of the coding process in classic grounded theory. It involves a continuous process of exploring and understanding open-ended responses, which can often lead to the development of new conceptual ideas.

Coding Process

The coding process is broken down into two parts: substantive coding and theoretical coding. Substantive coding involves organizing data into meaningful categories, while theoretical coding looks at how those categories relate.

Forms of Coding

Within the concept indicator model are two forms of coding: open coding and selective coding. Open coding is used to explore responses without predetermined theories or preconceived ideas. It is an iterative process involving connecting categories and generating tentative conclusions.

On the other hand, selective coding uses predetermined theories or ideas to guide data analysis.

The concept indicator model also uses a cycling approach known as constant comparison and theoretical sampling. Constant comparison is the process of constantly comparing new data with previous data until saturation is reached.

Theoretical sampling involves examining different data types to determine which ones will be more useful for exploring the concepts and relationships under investigation.

Gaining experience and confidence in exploring and confirming conceptual ideas is essential for success in the concept indicator model.

Strategies such as brainstorming and creating examples can help analysts better understand the various concepts that emerge from the data.

Best practices such as involving multiple coders in the process, triangulating data from different sources, and including contextual information can also help increase the accuracy and reliability of coding results.

Interpreting and Analyzing Open-Ended Responses in Relation to Your Research Questions

  • Ensure Objectives are Met: For any study or project, you must ensure your objectives are met. To achieve this, the responses to open-ended questions must be categorized according to their subject, purpose, and theme. This step will help in recognizing patterns and drawing out commonalities.
  • Choose A Coding Method: Once you have identified the themes, you must choose a coding method to interpret and analyze the data.

There are various coding strategies that can be employed. For example, a directed coding strategy will help you focus on the themes you have identified in your research objectives. In contrast, an axial coding method can be used to connect related concepts together. With a coding method, it will be easier to make sense of the responses.

Use Narrative Analysis

This process involves looking for story elements such as plot, characters, setting, and conflict in the text. It can be useful for identifying shared experiences or values within a group.

By looking for these narrative elements, you can better understand how individuals perceive their own experiences and those of others.

Analyze the Findings

However, to understand the meanings that the responses may have, it is also important to analyze them. This stage is where techniques such as in-depth interviews, focus groups, and textual analysis come in.

These methods provide valuable insights into how the responses are related to each other and can help uncover potential connections and underlying motivations.

Summarize Your Findings

Once you have interpreted and analyzed the data, it is time to decide on your key findings. For example, you can summarize your findings according to different themes, discuss any implications of your research or suggest ways in which further research can be carried out.

These strategies provide valuable insights into the qualitative data collected from open-ended questions. However, to get the most out of that data, you also need to familiarize yourself with the best practices in qualitative research.

Open-ended questions have the potential to generate rich and nuanced data in qualitative research. However, they also present certain challenges and limitations that researchers and educators need to be aware of.

We will now explore some of the challenges associated with using open-ended questions, including potential biases and subjectivity in responses, social desirability bias, and response bias.

We will also discuss strategies to address these challenges, such as balancing open-ended and closed-ended questions in research design. By understanding these limitations and employing best practices, researchers and educators can use open-ended questions to gather meaningful data and insights.

Addressing potential biases and subjectivity in responses

When we use open-ended questions in qualitative research, it’s crucial to be mindful of potential biases and subjectivity in responses. It’s natural for participants to bring their own experiences and beliefs to the table, which can impact their answers and skew the data. To tackle these challenges, we can take several steps to ensure that our research findings are as accurate and representative as possible.

One way to minimize subjectivity is to use neutral and unbiased language when framing our questions. By doing so, we can avoid leading or loaded questions that could influence participants’ responses. We can also use multiple methods to verify data and check responses, like conducting follow-up interviews or comparing responses with existing literature.

Another important consideration is to be open and transparent about the research process and participants’ rights. Addressing these biases also includes providing informed consent and guaranteeing confidentiality so that participants feel comfortable sharing their genuine thoughts and feelings. By recruiting diverse participants and ensuring that our data is representative and inclusive, we can also reduce potential biases and increase the validity of our findings.

By tackling biases and subjectivity in responses head-on, we can gather reliable and insightful data that can inform future research and enhance teaching methods.

Dealing with social desirability bias and response bias

In qualitative research, social desirability bias and response bias can pose significant challenges when analyzing data. Social desirability bias occurs when participants tend to respond in ways that align with social norms or expectations, rather than expressing their true feelings or beliefs. Response bias, on the other hand, happens when participants provide incomplete or inaccurate information due to factors like memory lapse or misunderstanding of the question.

To address these biases, researchers can use various strategies to encourage participants to be more candid and honest in their responses.

For instance, researchers can create a safe and supportive environment that fosters trust and openness, allowing participants to feel comfortable sharing their true thoughts and experiences. Researchers can also use probing techniques to encourage participants to elaborate on their answers, helping to uncover underlying beliefs and attitudes.

It’s also a good idea to mix up the types of questions you ask, utilizing both open-ended and closed-ended inquiries to get a variety of responses. Closed-ended questions can help verify or confirm participants’ comments, while open-ended questions allow for a more in-depth exploration of themes and encourage participants to provide detailed and personal responses.

Balancing open-ended and closed-ended questions in your research design

An appropriate combination of open-ended and closed-ended questions is essential for developing an effective research design. Open-ended questions allow participants to provide detailed, nuanced responses and offer researchers the opportunity to uncover unexpected insights.

However, too many open-ended questions can make analysis challenging and time-consuming. Closed-ended questions, on the other hand, can provide concise and straightforward data that’s easy to analyze but may not capture the complexity of participants’ experiences.

Balancing the use of open-ended and closed-ended questions necessitates a careful evaluation of the study objectives, target audience, and issue under examination. Researchers must also consider the available time and resources for analysis.

When designing a research study, it’s essential to prioritize the research goals and choose questions that align with those goals. Careful selection of questions helps ensure that the data gathered is pertinent and adds to a greater understanding of the topic under consideration. Researchers should also consider the participants’ backgrounds and experiences and select questions that are appropriate and sensitive to their needs. Furthermore, adopting a mix of open-ended and closed-ended questions can assist researchers in triangulating data, which allows them to cross-validate their findings by comparing results from multiple sources or techniques.

Lastly, we will be exploring the best practices for utilizing open-ended questions in qualitative research. We cover a range of helpful tips and strategies for creating a research design that fosters rich and nuanced data while maintaining the integrity of your research.

Building an effective connection with your research participants, developing clear research questions that align with your research objectives, remaining flexible and adaptable in your approach, and prioritizing ethical considerations throughout your research process are some of the key best practices we explore.

Building Rapport with Participants

Building rapport with research participants is an essential component of conducting effective qualitative research. Building rapport is all about creating trust and providing a comfortable environment where participants can feel free to share their thoughts and experiences.

The first thing a researcher should do is introduce themselves and explain to the participant why the research is significant. Additionally, active listening is critical in building rapport: listening attentively to your participants’ responses and asking follow-up questions demonstrates your interest in their experiences and perspectives.

Maintaining a nonjudgmental, impartial position is also essential in developing rapport. Participants must feel free to express their opinions and experiences without fear of being judged or prejudiced.

Using respectful language, maintaining eye contact, and nodding along to participants’ responses can show that you are invested in their stories and care about their experiences.

Overall, establishing rapport with participants is an ongoing process that requires attention, care, and empathy.

Developing clear research questions

In research, developing clear research questions is an essential component of qualitative research using open-ended questions. The research questions provide a clear direction for the research process, enabling researchers to gather relevant and insightful data.

To create effective research questions, they must be specific, concise, and aligned with the overall research objectives. It is crucial to avoid overly broad or narrow questions that could impact the validity of the research.

Additionally, researchers should use language that is easy to understand. Researchers should avoid any technical jargon that may lead to confusion.

The order of the questions is also significant; they should flow logically, building on each other and ensuring they make sense. By developing clear research questions, researchers can collect and analyze data in a more effective and meaningful manner.                      

Maintaining a flexible and adaptable approach

When conducting qualitative research, maintaining a flexible and adaptable approach is crucial. Flexibility enables researchers to adjust their research methods and questions to ensure they capture rich and nuanced data that can answer their research questions.

However, staying adaptable can be a daunting task, as researchers may need to modify their research approach based on participants’ responses or unforeseen circumstances.

To maintain flexibility, researchers must have a clear understanding of their research questions and goals, while also remaining open to modifying their methods if necessary. It is also essential to keep detailed notes and regularly reflect on research progress to determine if adjustments are needed.

Staying adaptable is equally important as it requires researchers to be responsive to changes in participants’ attitudes and perspectives. Being able to pivot research direction and approach based on participant feedback is critical to achieving accurate and meaningful results.

Maintaining a flexible and adaptive strategy allows researchers to collect the most extensive and accurate data possible, resulting in a more in-depth understanding of the research topic. While it can be challenging to remain flexible and adaptable, doing so will ultimately lead to more robust research findings and greater insights into the topic at hand.

Being aware of ethical considerations

When conducting research, it is critical to keep in mind the ethical considerations that govern how researchers and participants interact and how these considerations shape the research. Ethical considerations refer to the principles or standards that should guide research to ensure it is conducted in an honest, transparent, and respectful manner.

Before beginning the study, researchers must obtain informed consent from participants. Obtaining consent means providing clear and comprehensive information about the research, its purpose, what participation entails, and the potential risks and benefits. Researchers must ensure that participants understand the information and voluntarily consent to participate.

Protecting the privacy and confidentiality of participants is essential. Researchers should safeguard personal information, use pseudonyms or codes to protect identities, and secure any identifying information collected.

Researchers must avoid asking questions that are too personal, sensitive, or potentially harmful. If harm or distress occurs, researchers should provide participants with appropriate support and referral to relevant services.

Using open-ended questions in qualitative research presents both challenges and benefits. To address potential limitations, researchers should remain objective and neutral, create a safe and non-judgmental space, and use probing techniques. Best practices include building rapport, developing clear research questions, and being flexible. Open-ended questions offer the benefits of revealing rich and nuanced data, allowing for flexibility, and building rapport with participants. Ethical considerations must also be a top priority.

Open-ended interview questions and saturation

Susan C. Weller

1 Department of Preventive Medicine & Community Health, University of Texas Medical Branch, Galveston, Texas, United States of America

Ben Vickers

H. Russell Bernard

2 Institute for Social Research, Arizona State University, Tempe, Arizona/University of Florida, Gainesville, Florida, United States of America

Alyssa M. Blackburn

3 Department of Molecular and Human Genetics, Baylor College of Medicine, Houston, Texas, United States of America

Stephen Borgatti

4 Department of Management, University of Kentucky, Lexington, Kentucky, United States of America

Clarence C. Gravlee

5 Department of Anthropology, University of Florida, Gainesville, Florida, United States of America

Jeffrey C. Johnson

Associated data

All relevant data are available as an Excel file in the Supporting Information files.

Abstract

Sample size determination for open-ended questions or qualitative interviews relies primarily on custom and finding the point where little new information is obtained (thematic saturation). Here, we propose and test a refined definition of saturation as obtaining the most salient items in a set of qualitative interviews (where items can be material things or concepts, depending on the topic of study) rather than attempting to obtain all the items. Salient items have higher prevalence and are more culturally important. To do this, we explore saturation, salience, sample size, and domain size in 28 sets of interviews in which respondents were asked to list all the things they could think of in one of 18 topical domains. The domains—like kinds of fruits (highly bounded) and things that mothers do (unbounded)—varied greatly in size. The datasets comprise 20–99 interviews each (1,147 total interviews). When saturation was defined as the point where less than one new item per person would be expected, the median sample size for reaching saturation was 75 (range = 15–194). Thematic saturation was, as expected, related to domain size. It was also related to the amount of information contributed by each respondent but, unexpectedly, was reached more quickly when respondents contributed less information. In contrast, a greater amount of information per person increased the retrieval of salient items. Even small samples (n = 10) produced 95% of the most salient ideas with exhaustive listing, but only 53% of those items were captured with limited responses per person (three). For most domains, item salience appeared to be a more useful concept for thinking about sample size adequacy than finding the point of thematic saturation. Thus, we advance the concept of saturation in salience and emphasize probing to increase the amount of information collected per respondent to increase sample efficiency.

Introduction

Open-ended questions are used alone or in combination with other interviewing techniques to explore topics in depth, to understand processes, and to identify potential causes of observed correlations. Open-ended questions may produce lists, short answers, or lengthy narratives, but in all cases, an enduring question is: How many interviews are needed to be sure that the range of salient items (in the case of lists) and themes (in the case of narratives) is covered? Guidelines for collecting lists, short answers, and narratives often recommend continuing interviews until saturation is reached. The concept of theoretical saturation—the point where the main ideas and variations relevant to the formulation of a theory have been identified—was first articulated by Glaser and Strauss [1, 2] in the context of how to develop grounded theory. Most of the literature on analyzing qualitative data, however, deals with observable thematic saturation—the point during a series of interviews where few or no new ideas, themes, or codes appear [3–6].

Since the goal of research based on qualitative data is not necessarily to collect all or most ideas and themes but to collect the most important ideas and themes, salience may provide a better guide to sample size adequacy than saturation. Salience (often called cultural or cognitive salience) can be measured by the frequency of item occurrence (prevalence) or the order of mention [ 7 , 8 ]. These two indicators tend to be correlated [ 9 ]. In a set of lists of birds, for example, robins are reported more frequently and appear earlier in responses than are penguins. Salient terms are also more prevalent in everyday language [ 10 – 12 ]. Item salience also may be estimated by combining an item’s frequency across lists with its rank/position on individual lists [ 13 – 16 ].

In this article, we estimate the point of complete thematic saturation and the associated sample size and domain size for 28 sets of interviews in which respondents were asked to list all the things they could think of in one of 18 topical domains. The domains—like kinds of fruits (highly bounded) and things that mothers do (unbounded)—varied greatly in size. We also examine the impact of the amount of information produced per respondent on saturation and on the number of unique items obtained by comparing results generated by asking respondents to name all the relevant things they can with results obtained from a limited number of responses per question, as with standard open-ended questioning. Finally, we introduce an additional type of saturation based on the relative salience of items and themes— saturation in salience —and we explore whether the most salient items are captured at minimal sample sizes. A key conclusion is that saturation may be more meaningfully and more productively conceived of as the point where the most salient ideas have been obtained .

Recent research on saturation

Increasingly, researchers are applying systematic analysis and sampling theory to untangle the problems of saturation and sample size in the enormous variety of studies that rely on qualitative data—including life-histories, discourse analysis, ethnographic decision modeling, focus groups, grounded theory, and more. For example, Guest et al.[ 17 ] and others[ 18 – 19 ] found that about 12–16 interviews were adequate to achieve thematic saturation. Similarly, Hagaman and Wutich [ 20 ] found that they could reliably retrieve the three most salient themes from each of the four sites in the first 16 interviews.

Galvin[ 21 ] and Fugard and Potts[ 22 ] framed the sample size problem for qualitative data in terms of the likelihood that a specific idea or theme will or will not appear in a set of interviews, given the prevalence of those ideas in the population. They used traditional statistical theory to show that small samples retrieve only the most prevalent themes and that larger samples are more sensitive and can retrieve less prevalent themes as well. This framework can be applied to the expectation of observing or not observing almost anything. Here it would apply to the likelihood of observing a theme in a set of narrative responses, but it applies equally well for situations such as behavioral observations, where specific behaviors are being observed and sampled[ 23 ]. For example, to obtain ideas or themes that would be reported by about one out of five people (0.20 prevalence) or a behavior with the same prevalence, there is a 95% likelihood of seeing those themes or behaviors at least once in 14 interviews—if those themes or behaviors are independent.
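As a quick check on that figure, under independence the probability of seeing a theme of prevalence p at least once in n interviews is 1 - (1 - p)^n. The short Python sketch below (an illustration, not code from the paper) reproduces the roughly 95% value for p = 0.20 and n = 14.

```python
def prob_at_least_once(prevalence: float, n_interviews: int) -> float:
    """P(a theme of given prevalence appears in at least one of n independent interviews)."""
    return 1 - (1 - prevalence) ** n_interviews

print(round(prob_at_least_once(0.20, 14), 3))  # ~0.956, i.e. about a 95% chance
```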

Saturation and sample size have also begun to be examined with multivariate models and simulations. Tran et al. [ 24 ] estimated thematic saturation and the total number of themes from open-ended questions in a large survey and then simulated data to test predictions about sample size and saturation. They assumed that items were independent and found that sample sizes greater than 50 would add less than one new theme per additional person interviewed.

Similarly, Lowe et al. [ 25 ] estimated saturation and domain size in two examples and in simulated datasets, testing the effect of various parameters. Lowe et al. found that responses were not independent across respondents and that saturation may never be reached. In this context, non-independence refers to the fact that some responses are much more likely than others to be repeated across people. Instead of complete saturation, they suggested using a goal such as obtaining a percentage of the total domain that one would like to capture (e.g., 90%) and the average prevalence of items one would like to observe to estimate the appropriate sample size. For example, to obtain 90% of items with an average prevalence of 0.20, a sample size of 36 would be required. Van Rijnsoever [ 26 ] used simulated datasets to study the accumulation of themes across sample size increments and assessed the effect of different sampling strategies, item prevalence, and domain size on saturation. Van Rijnsoever’s results indicated that the point of saturation was dependent on the prevalence of the items.

As modeling estimates to date have been based on only one or two real-world examples, it is clear that more empirical examples are needed. Here, we use 28 real-world examples to estimate the impact of sample size, domain size, and amount of information per respondent on saturation and on the total number of items obtained. Using the proportion of people in a sample that mentioned an item as a measure of salience, we find that even small samples may adequately capture the most salient items.

Materials and methods

The datasets comprise 20–99 interviews each (1,147 total interviews). Each example elicits multiple responses from each individual in response to an open-ended question (“Name all the … you can think of”) or a question with probes (“What other … are there?”).

Data were obtained by contacting researchers who published analyses of free lists. Examples with 20 or more interviews were selected so that saturation could be examined incrementally through a range of sample sizes. Thirteen published examples were obtained on: illness terms [ 27 ] (in English and in Spanish); birds, flowers, and fabrics [ 28 ]; recreational/street drugs and fruits [ 29 ]; things mothers do (online, face-to-face, and written administration) and racial and ethnic groups [ 30 ] (online, face-to-face, and written administration). Fifteen unpublished classroom educational examples were obtained on: soda pops (Weller, n.d.); holidays (two replications), things that might appear in a living room, characteristics of a good leader (two replications), a good team (two replications), and a good team player (Johnson, n.d.); and bad words, industries (two replications), cultural industries (two replications), and scary things (Borgatti, n.d.). (Original data appear online in S1 Appendix The Original Data for the 28 Examples.)

Some interviews were face to face, some were written responses, and some were administered on-line. Investigators varied in their use of prompts, using nonspecific (What other … are there?), semantic (repeating prior responses and then asking for others), and/or alphabetic prompts (going through the alphabet and asking for others). Brewer [ 29 ] and Gravlee et al. [ 30 ] specifically examined the effect of prompting on response productivity, although the Brewer et al. examples in these analyses contain results before extensive prompting and the Gravlee et al. examples contain results after prompting. The 28 examples, their topic, source, sample size, the question used in the original data collection, and the three most frequently mentioned items appear in Table 1 . All data were collected and analyzed without personal identifying information.

For each example, statistical models describe the pattern of obtaining new or unique items with incremental increases in sample size. Individual lists were first analyzed with Flame [ 31 , 32 ] to provide the list of unique items for each example and the Smith [ 14 ] and Sutrop [ 15 ] item salience scores. Duplicate items due to spelling, case errors, spacing, or variations were combined.
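For readers who want to compute item salience themselves, the following Python sketch implements a common formulation of Smith's salience index (an item's inverse-rank weight on each list, averaged over all respondents); the free lists shown are hypothetical, and the study itself used the Flame program for this step.

```python
def smith_salience(freelists: list[list[str]]) -> dict[str, float]:
    """Smith's salience index: average inverse-rank weight of each item across all lists.

    An item at position r on a list of length L contributes (L - r + 1) / L;
    items absent from a list contribute 0. Scores are averaged over all respondents.
    """
    n_lists = len(freelists)
    scores: dict[str, float] = {}
    for items in freelists:
        length = len(items)
        for rank, item in enumerate(items, start=1):
            scores[item] = scores.get(item, 0.0) + (length - rank + 1) / length
    return {item: total / n_lists for item, total in scores.items()}

# Hypothetical fruit free lists from three respondents.
lists = [
    ["apple", "banana", "orange"],
    ["banana", "apple"],
    ["apple", "grape", "banana", "kiwi"],
]
for item, score in sorted(smith_salience(lists).items(), key=lambda kv: -kv[1]):
    print(f"{item}: {score:.2f}")
```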

To help develop an interviewing stopping rule, a simple model was used to predict the number of unique items contributed by each additional respondent. Generalized linear models (GLM, log-linear models for count data) were used to predict the number of unique items added by each respondent (incrementing sample size), because the number of unique items added by each respondent (count data) is approximately Poisson distributed. For each example, models were fit with ordinary least squares linear regression, Poisson, and negative binomial probability distributions. Respondents were assumed to be in random order, in the order in which they occurred in each dataset, although in some cases they were in the order they were interviewed. Goodness-of-fit was compared across the three models with minimized deviance (the Akaike Information Criterion, AIC) to find the best-fitting model [33]. Using the best-fitting model for each example, the point of saturation was estimated as the point where the expected number of new items was one or less. Sample size and domain size were estimated at the point of saturation, and total domain size was estimated for an infinite sample size from the model for each example as the limit of a geometric series (assuming a negative slope).
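The paper does not include its model-fitting code, so the following Python sketch is only a rough analogue of this step: it fits Poisson and negative binomial GLMs (both with a log link) to hypothetical counts of new items per respondent using statsmodels, picks the lower-AIC fit, and solves the fitted model for the sample size at which the expected number of new items drops to one.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: number of NEW (previously unseen) items contributed by
# respondent 1, 2, 3, ... taken in interview order.
new_items = np.array([14, 9, 7, 6, 5, 4, 4, 3, 3, 2, 2, 2, 1, 1, 1, 1, 0, 1, 0, 0])
x = np.arange(1, len(new_items) + 1)
X = sm.add_constant(x)  # intercept plus the sample-size increment

fits = {
    "poisson": sm.GLM(new_items, X, family=sm.families.Poisson()).fit(),
    # Note: the negative binomial dispersion parameter is left at statsmodels' default here.
    "negative binomial": sm.GLM(new_items, X, family=sm.families.NegativeBinomial()).fit(),
}
best_name, best_fit = min(fits.items(), key=lambda kv: kv[1].aic)  # lower AIC wins

# With a log link, E[new items] = exp(b0 + b1 * n). Saturation (E <= 1) occurs where
# b0 + b1 * n <= 0, i.e. n >= -b0 / b1, assuming a negative slope b1.
b0, b1 = best_fit.params
n_saturation = int(np.ceil(-b0 / b1))
print(f"best model: {best_name}, saturation at n = {n_saturation}")
```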

Because the GLM models above used only incremental sample size to predict the total number of unique items (domain size) and ignored variation in the number of items provided by each person and variation in item salience, an additional analysis was used to estimate domain size while accounting for subject and item heterogeneity. For that analysis, domain size was estimated with a capture-recapture estimation technique used for estimating the size of hidden populations. Domain size was estimated from the total number of items on individual lists and the number of matching items between pairs of lists with a log-linear analysis. For example, population size can be estimated from the responses of two people as the product of their number of responses divided by the number of matching items (assumed to be due to chance). If Person#1 named 15 illness terms and Person#2 named 31 terms and they matched on five illnesses, there would be 41 unique illness terms and the estimated total number of illness terms based on these two people would be (15 x 31) /5 = 93.
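That two-person calculation is the familiar Lincoln-Petersen capture-recapture estimate; a tiny sketch (not code from the paper) shows the arithmetic.

```python
def two_list_domain_estimate(items_person1: int, items_person2: int, matches: int) -> float:
    """Lincoln-Petersen style estimate of domain size from two respondents' free lists."""
    return items_person1 * items_person2 / matches

print(two_list_domain_estimate(15, 31, 5))  # 93.0, as in the worked illness-terms example
```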

A log-linear solution generalizes this logic from a 2 x 2 table to a 2^K table [34]. The capture–recapture solution estimates total population size for hidden populations using the pattern of recapture (matching) between pairs of samples (respondents) to estimate the population size. An implementation in R with GLM uses a log-linear form to estimate population size based on recapture rates (Rcapture [35, 36]). In this application, it is assumed that the population does not change between interviews (closed population) and models are fit with: (1) no variation across people or items (M_0); (2) variation only across respondents (M_t); (3) variation only across items (M_h); and (4) variation due to an interaction between people and items (M_ht). For each model, estimates were fit with binomial, Chao’s lower bound estimate, Poisson, Darroch log normal, and gamma distributions [35]. Variation among items (heterogeneity) is a test for a difference in the probabilities of item occurrence and, in this case, is equivalent to a test for a difference in item salience among the items. Due to the large number of combinations needed to estimate these models, Rcapture software estimates are provided for all four models only up to a sample of size 10. For larger sample sizes (all examples in this study had sample sizes of 20 or larger), only model 1 with no effects for people or items (the binomial model) and model 3 with item effects (item salience differences) were tested. Therefore, models were fit at size 10, to test all four models, and then at the total available sample size.

Descriptive information for the examples appears in Table 2 . The first four columns list the name of the example, the sample size in the original study, the mean list length (with the range of the list length across respondents), and the total number of unique items obtained. For the Holiday1 example, interviews requested names of holidays (“Write down all the holidays you can think of”), there were 24 respondents, the average number of holidays listed per person (list length) was 13 (ranging from five to 29), and 62 unique holidays were obtained.

(Table 2 model abbreviations: nbi = negative binomial-identity; p = Poisson-log; c = Chao’s lower bound; g = gamma.)

Predicting thematic saturation from sample size

The free-list counts showed a characteristic descending curve where an initial person listed new themes and each additional person repeated some themes already reported and added new items, but fewer and fewer new items were added with incremental increases in sample size. All examples were fit using the GLM log-link and identity-link with normal, Poisson, and negative binomial distributions. The negative binomial model resulted in a better fit than the Poisson (or identity-link models) for most full-listing examples, providing the best fit to the downward sloping curve with a long tail. Of the 28 examples, only three were not best fit by negative binomial log-link models: the best-fitting model for two examples was the Poisson log-link model (GoodTeam1 and GoodTeam2Player) and one was best fit by the negative binomial identity-link model (CultInd1).

Sample size was a significant predictor of the number of new items for 21 of the 28 examples. Seven examples did not result in a statistically significant fit (Illnesses-US, Holiday2, Industries1, Industries2, GoodTLeader, GoodTeam2Player, and GoodTeam3). The best-fitting model was used to predict the point of saturation and domain size for all 28 examples ( S2 Appendix GLM Statistical Model Results for the 28 Examples).

Using the best-fitting GLM models, we estimated the predicted sample size for reaching saturation. Saturation was defined as the point where less than one new item would be expected for each additional person interviewed. Using the models to solve for the sample size (X) when only one item was obtained per person (Y = 1) and rounding up to the nearest integer provided the point of saturation (Y≤1.0). Table 2, column five, reports the sample size where saturation was reached (N_SAT). For Holiday1, one or fewer new items were obtained per person when X = 16.98. Rounding up to the next integer provides the saturation point (N_SAT = 17). For the Fruit domain, saturation occurred at a sample size of 15.

Saturation was reached at sample sizes of 15–194, with a median sample size of 75. Only five examples (Holiday1, Fruits, Birds, Flowers, and Drugs) reached saturation within the original study sample size and most examples did not reach saturation even after four or five dozen interviews. A more liberal definition of saturation, defined as the point where less than two new items would be expected for each additional person (solving for Y≤2), resulted in a median sample size for reaching saturation of 50 (range 10–146).

Some domains were well bounded and were elicited with small sample sizes. Some were not. In fact, most of the distributions exhibited a very long tail—where many items were mentioned by only one or two people. Fig 1 shows the predicted curves for all examples for sample sizes of 1 to 50. Saturation is the point where the descending curve crosses Y = 1 (or Y = 2). Although the expected number of unique ideas or themes obtained for successive respondents tends to decrease as the sample size increases, this occurs rapidly in some domains and slowly or not at all in other domains. Fruits, Holiday1, and Illness-G are domains with the three bottom-most curves and the steepest descent, indicating that saturation was reached rapidly and with small sample sizes. The three top-most curves are the Moms-F2F, Industries1, and Industries2 domains, which reached saturation at very large sample sizes or essentially did not reach saturation.

[Fig 1. Predicted number of new items per additional respondent for all 28 examples, for sample sizes 1 to 50.]

Estimating domain size

Because saturation appeared to be related to domain size and some investigators state that a percentage of the domain might be a better standard [25], domain size was also estimated. First, total domain size was estimated with the GLM models obtained above. Domain size was estimated at the point of saturation by cumulatively summing the number of items obtained for sample sizes n = 1, n = 2, n = 3, … to N_SAT. For the Holiday1 sample, summing the number of predicted unique items for sample sizes n = 1 to n = 17 should yield 51 items (Table 2, Domain Size at Saturation, D_SAT). Thus, the model predicted that approximately 51 holidays would be obtained by the time saturation was reached.

The total domain size was estimated using a geometric series, summing the estimated number of unique items obtained cumulatively across people in an infinitely large sample. For the Holiday1 domain, the total domain size was estimated as 57 (see Table 2, Total Domain Size D TOT). So although the Holiday1 domain was estimated to contain 57 holidays in total, the model predicted that saturation would occur when the sample size reached 17, at which point 51 holidays should have been retrieved. Model predictions were close to the empirical data, as 62 holidays were obtained with a sample of 24.
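
Because a log-link model predicts a geometric decline in new items, both the cumulative count at saturation and the total domain size have closed-form sums. A sketch continuing the hypothetical fit above (the values will not reproduce Table 2):

```python
# Continuing the hypothetical fit above: the log-link predictions form a geometric
# series exp(b0) * r**n with ratio r = exp(b1), so the sums have closed forms.
r = np.exp(b1)                                      # ratio between successive predictions
d_sat = np.exp(b0) * r * (1 - r**n_sat) / (1 - r)   # predicted items accumulated by N SAT
d_tot = np.exp(b0) * r / (1 - r)                    # limit for an infinitely large sample (r < 1)
print("Domain size at saturation:", round(d_sat), "| Total domain size:", round(d_tot))
```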

Larger sample sizes were needed to reach saturation in larger domains; the largest domains were MomsF2F, Industries1, and Industries2, each estimated to have about 1,000 items and each requiring more than 100 interviews to approach saturation. Saturation (Y ≤ 1) tended to occur at about 90% of the total domain size. For Fruits, the domain size at saturation was 51 and the total domain size was estimated at 53 (51/53 = 96%); for MomsF2F, the domain size at saturation was 904 and the total domain size was 951 (95%).

Second, total domain size was estimated using a capture-recapture log-linear model with a parameter for item heterogeneity [ 35 , 36 ]. A descending, concave curve is diagnostic of item heterogeneity and was present in almost all of the examples. The estimated population sizes using R-Capture appear in the last column of Table 2 . When the gamma distribution provided the best fit to the response data, the domain size increased by an order of magnitude as did the standard error on that estimate. When responses fit a gamma distribution, the domain may be extremely large and may not readily reach saturation.

Inclusion of the pattern of matching items across people with a parameter for item heterogeneity (overlap in items between people due to salience) resulted in larger population size estimates than those above without heterogeneity. Estimation from the first two respondents was not helpful and provided estimates much lower than those from any of the other methods. The simple model without subject or item effects (the binomial model) did not fit any of the examples. Estimation from the first 10 respondents in each example suggested that more variation was due to item heterogeneity than to item and subject heterogeneity, so we report only the estimated domain size with the complete samples accounting for item heterogeneity in salience.

Overall, the capture–recapture estimates incorporating the effect of salience were larger than the GLM results above without a parameter for salience. For Fruits, the total domain size was estimated as 45 from the first two people; as 88 (gamma distribution estimate) from the first 10 people with item heterogeneity and as 67 (Chao lower bound estimate) with item and subject heterogeneity; and using the total sample ( n = 33) the binomial model (without any heterogeneity parameters) estimated the domain size as 62 (but did not fit the data) and with item heterogeneity the domain size was estimated as 73 (the best-fitting model used the Chao lower bound estimate). Thus, the total domain size for Fruits estimated with a simple GLM model was 53 and with a capture–recapture model (including item heterogeneity) was 73 ( Table 2 , last column). Similarly, the domain size for Holiday1 was estimated at 57 with the simple GLM model and 100 with capture-recapture model. Domain size estimates suggest that even the simplest domains can be large and that inclusion of item heterogeneity increases domain size estimates.
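
The Chao lower bound mentioned above has a simple closed form based on how many items are mentioned by exactly one person (f1) and exactly two people (f2). A minimal sketch with hypothetical counts (not the values behind Table 2):

```python
# Chao's lower-bound estimate of total domain size (bias-corrected form, safe when f2 = 0).
def chao_lower_bound(s_obs: int, f1: int, f2: int) -> float:
    """s_obs = distinct items observed; f1 = items named by exactly one person; f2 = by exactly two."""
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

print(chao_lower_bound(s_obs=62, f1=20, f2=9))  # hypothetical counts -> 81.0
```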

Saturation and the number of responses per person

The original examples used an exhaustive listing of responses to obtain about a half dozen (GoodLeader and GoodTeam2Player) to almost three dozen responses per person (Industries1 and Industries2). A question is whether saturation and the number of unique ideas obtained might be affected by the number of responses per person. Since open-ended questions may obtain only a few responses, we limited the responses to a maximum of three per person, truncating lists to see the effect on the number of items obtained at different sample sizes and the point of saturation.

When more information (a greater number of responses) was collected per person, more unique items were obtained even at smaller sample sizes ( Table 3 ). The amount of information retrieved in a sample can be thought of in terms of bits of information, roughly the average number of responses per person times the sample size. All other things being equal, a larger sample with less probing should therefore yield about the same amount of information as a smaller sample with more probing. So, for a given sample size, a study with six responses per person should obtain twice as much information as a study with three responses per person. In the GoodLeader, GoodTeam1, and GoodTeam2Player examples, the average list length was approximately six, and when the sample size was 10 (6 x 10 = 60 bits of information), approximately twice as many items were obtained as when lists were truncated to three responses (3 x 10 = 30 bits of information).

Increasing the sample size proportionately increases the amount of information, but not always. For Scary Things, 5.6 times as much information was collected per person with full listing (16.9 average list length) as with three or fewer responses per person (3.0 list length), and the number of items obtained in a sample of 10 with full listing (102) was roughly 5.6 times greater than that obtained with three responses per person (18 items). However, at a sample size of 20 the number of unique items with full lists was only 4.5 times larger (153) than the number obtained with three responses per person (34). Across examples, interviews that obtained more information per person were more productive and obtained more unique items overall, even with smaller sample sizes, than did interviews with only three responses per person.

Using the same definition of saturation (the point where less than one new item would be expected for each additional person interviewed), less information per person resulted in reaching saturation at much smaller sample sizes. Fig 2 shows the predicted curves for all examples when the number of responses per person is three (or fewer). The Holiday examples reached saturation (fewer than one new item per person) with a sample size of 17 (Holiday1) with 13.0 average responses per person and 87 (Holiday2) with 17.8 average responses ( Table 2 ), but reached saturation with a sample size of only 9 (Holiday1 and Holiday2) when there were a maximum of three responses per person ( Table 3 , last column). With three or fewer responses per person, the median sample size for reaching saturation was 16 (range: 4–134). Thus, fewer responses per person resulted in reaching saturation at smaller sample sizes and in fewer domain items.

[Fig 2: predicted number of new items per additional respondent for all 28 examples when responses are limited to three or fewer per person (pone.0198606.g002.jpg).]

Salience and sample size

Saturation did not seem to be a useful guide for determining a sample size stopping point, because it was sensitive both to domain size and the number of responses per person. Since a main goal of open-ended interviews is to obtain the most important ideas and themes, it seemed reasonable to consider item salience as an alternative guide to assist with determining sample size adequacy. Here, the question would be: Whether or not complete saturation is achieved, are the most salient ideas and themes captured in small samples?

A simple and direct measure of item salience is the proportion of people in a sample who mentioned an item [ 37 ]. We also examined the correlation between the sample proportions and two salience indices that combine the proportion of people mentioning an item with the item’s list position [ 13 – 15 ]. Because the item frequency distributions have long tails—there are many items mentioned by only one or two people—we focused only on items mentioned by two or more people (24–204 items) and used the full lists provided by each respondent. The average Spearman correlation between the Smith and Sutrop indices across the 28 examples was 0.95 (average Pearson correlation 0.96, 95% CI: 0.92, 0.98), between the Smith index and the sample proportions was 0.89 (average Pearson 0.96, 95% CI: 0.915, 0.982), and between the Sutrop index and the sample proportions was 0.86 (average Pearson 0.88, 95% CI: 0.753, 0.943). Thus, the three measures were highly correlated across 28 examples that varied in content, number of items, and sample size, validating the measurement of a single construct.
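
For reference, Smith's salience index can be computed directly from the free lists: each mention contributes its reverse rank divided by the respondent's list length, averaged over all respondents. A minimal sketch with hypothetical lists (the Sutrop index and the sample proportions can be computed analogously):

```python
from collections import defaultdict

# Hypothetical free lists (one list per respondent), most salient items listed first.
free_lists = [
    ["apple", "banana", "orange", "mango"],
    ["banana", "apple", "grape"],
    ["orange", "apple"],
]

scores = defaultdict(float)
for items in free_lists:
    L = len(items)
    for rank, item in enumerate(items, start=1):
        scores[item] += (L - rank + 1) / L          # reverse rank, scaled by list length

N = len(free_lists)
smith = {item: total / N for item, total in scores.items()}  # average over all respondents
for item, s in sorted(smith.items(), key=lambda kv: -kv[1]):
    print(f"{item}: {s:.2f}")
```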

To test whether the most salient ideas and themes were captured in smaller samples or with limited probing, we used the sample proportions to estimate item salience and compared the set of most salient items across sample sizes and across more and less probing. Specifically, we defined the set of salient items for each example as those mentioned by 20% or more of respondents in a sample of size 20 (because all examples had at least 20 respondents) with full listing (because full lists described the domains in the most detail). We then compared this set of salient items with the set of items obtained at smaller sample sizes and with fewer responses per person.

The set size for salient items (prevalence ≥ 20%) was not related to overall domain size, but was an independent characteristic of each domain and whether there were core or prototypical items with higher salience. Most domains had about two dozen items mentioned by 20% or more of the original listing sample ( n = 20), but some domains had only a half dozen or fewer items (GoodLeader, GoodTeam2Player, GoodTeam3). With full listing, 26 of 28 examples captured more than 95% of the salient ideas in the first 10 interviews: 18 examples captured 100%, eight examples captured 95–99%, one example captured 91%, and one captured 80% ( Table 4 ). With a maximum of three responses per person, about two-thirds of the salient items (68%) were captured with 20 interviews and about half of the items (53%) were captured in the first 10 interviews. With a sample size of 20, a greater number of responses per person resulted in approximately 50% more items than with three responses per person. Extensive probing resulted in a greater capture of salient items even with smaller sample sizes.

Summary and discussion

The strict notion of complete saturation as the point where few or no new ideas are observed is not a useful concept to guide sample size decisions, because it is sensitive to domain size and the amount of information contributed by each respondent. Larger sample sizes are necessary to reach saturation for large domains and it is difficult to know, when starting a study, just how large the domain or set of ideas will be. Also, when respondents only provide a few responses or codes per person, saturation may be reached quickly. So, if complete thematic saturation is observed, it is difficult to know whether the domain is small or whether the interviewer did only minimal probing.

Rather than attempting to reach complete saturation with an incremental sampling plan, a more productive focus might be on gaining more depth through probing and seeking the most salient ideas. Rarely do we need all the ideas and themes; rather, we tend to be looking for the important or salient ones. A greater number of responses per person resulted in the capture of a greater number of salient items. With exhaustive listing, the first 10 interviews obtained 95% of the salient ideas (defined here as items with a prevalence of 0.20 or more), while only 53% of those ideas were obtained in 10 interviews with three or fewer responses per person.

We used a simple statistical model to predict the number of new items added by each additional person and found that complete saturation was not a helpful concept for free-lists, as the median sample size was 75 to get fewer than one new idea per person. It is important to note that we assumed that interviews were in a random order or were in the order that the interviews were conducted and were not reordered to any kind of optimum. The reordering of respondents to maximally fit a saturation curve may make it appear that saturation has been reached at a smaller sample size [ 31 ].

Most of the examples examined in this study needed sample sizes larger than most qualitative researchers use to reach saturation. Mason’s [ 6 ] review of 298 PhD dissertations in the United Kingdom, all based on qualitative data, found a mean sample size of 27 (range 1–95). Here, few of the examples reached saturation with fewer than four dozen interviews. Even with large sample sizes, some domains may continue to add new items. For very large domains, an incremental sampling strategy may lead to dozens and dozens of interviews and still not reach complete saturation. The problem is that most domains have very long tails in the distribution of observed items, with many items mentioned by only one or two people. A more liberal definition of complete saturation (allowing up to two new items per person) allowed saturation to occur at smaller sample sizes, but saturation still did not occur until a median sample size of 50.

In the examples we studied, most domains were large and domain size affected when saturation occurred. Unfortunately, there did not seem to be a good or simple way at the outset to tell whether a domain would be large or small. Most domains were much larger than expected, even on simple topics. Domain size varied by substantive content, sample, and degree of heterogeneity in salience. Domain size and saturation were sample dependent, as the holiday examples showed. Also, a domain size estimate of 73 does not mean that there are only 73 fruits; rather, the pattern of naming fruits—for this particular sample—indicated a set size of 73.

It was impossible to know, when starting, if a topic or domain was small and would require 15 interviews to reach saturation or if the domain was large and would require more than 100 interviews to reach saturation. Although eight of the examples had sample sizes of 50–99, sample sizes in qualitative studies are rarely that large. Estimates of domain size were even larger when models incorporated item heterogeneity (salience). The Fruit example had an estimated domain size of 53 without item heterogeneity, but 73 with item heterogeneity. The estimated size of the Fabric domain increased from 210 to 753 when item heterogeneity was included.

The number of responses per person affected both saturation and the number of obtained items. A greater number of responses per person resulted in a greater yield of domain items. The bits of information obtained in a sample can be approximated by the product of the average number of responses per person (list length) and the number of people in a sample. However, doubling the sample size did not necessarily double the unique items obtained because of item salience and sampling variability. When only a few items are obtained from each person, only the most salient items tend to be provided by each person and fewer items are obtained overall.

Brewer [ 29 ] explored the effect of probing or prompting on interview yield. Brewer examined the use of a few simple prompts: simply asking for more responses, providing alphabetical cues, or repeating the last response(s) and asking again for more information. Semantic cueing (repeating prior responses and asking for more information) increased the yield by approximately 50%. The results here indicated a similar pattern: when more information was elicited per person, about 50% more domain items were retrieved than when people provided a maximum of three responses.

Interviewing to obtain multiple responses also affects saturation. With few responses per person, complete saturation was reached rapidly. Without extensive interview probing, investigators may reach saturation quickly and assume they have a sample sufficient to retrieve most of the domain items. Unfortunately, differences in salience among items may lead respondents to repeat similar ideas—the most salient ones—without elaborating on less salient or less prevalent ideas, resulting in a set containing only the ideas with the very highest salience. If an investigator wishes to obtain most of the ideas that are relevant in a domain, a small sample with extensive probing (listing) will prove much more productive than a large sample with casual or no probing.

Recently, Galvin [ 21 ] and Fugard and Potts [ 22 ] framed sample size estimation for qualitative interviewing in terms of binomial probabilities. However, results for the 28 examples with multiple responses per person suggest that this may not be appropriate because of the interdependencies among items due to salience. The capture–recapture analysis indicated that none of the 28 examples fit the binomial distribution. Framing the sample size problem in terms that a specific idea or theme will or will not appear in a set of interviews may facilitate thinking about sample size, but such estimates may be misleading.

If a binomial distribution is assumed, sample size can be estimated from the prevalence of an idea in the population, from how confident you want to be in obtaining these ideas, and from how many times you would like these ideas to minimally appear across participants in your interviews. A binomial estimate assumes independence (no difference in salience across items) and predicts that if an idea or theme actually occurs in 20% of the population, there is a 90% or higher likelihood of obtaining those themes at least once in 11 interviews and a 95% likelihood in 14 interviews. In contrast, our results indicated that the heterogeneity in salience across items causes these estimates to underestimate the necessary sample size as items with ≥20% prevalence were captured in 10 interviews in only 64% of the samples with full listing and in only 4% (one) of samples with three or fewer responses.
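
The binomial figures quoted here are easy to verify: assuming independence, the smallest sample size n with probability 1 - (1 - p)^n of observing a theme of prevalence p at least once is the ceiling of log(1 - confidence) / log(1 - p). A quick check in Python:

```python
import math

# Smallest n such that a theme with population prevalence p appears at least once
# with the desired confidence, assuming independence across interviews and items.
def binomial_n(prevalence: float, confidence: float) -> int:
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

print(binomial_n(0.20, 0.90))  # 11 interviews
print(binomial_n(0.20, 0.95))  # 14 interviews
```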

Lowe et al. [ 25 ] also found that items were not independent and that binomial estimates significantly underestimated sample size. They proposed sample size estimation from the desired proportion of items at a given average prevalence. Their formula predicts that 36 interviews would be necessary to capture 90% of items with an average prevalence of 0.20, regardless of degree of heterogeneity in salience, domain size, or amount of information provided per respondent. Although they included a parameter for non-independence, their model does not seem to be accurate for cases with limited responses or for large domains.

Conclusions

In general, probing and prompting during an interview seem to matter more than the number of interviews. Thematic saturation may be an illusion and may result from a failure to use in-depth probing during the interview. A small sample (n = 10) can collect some of the most salient ideas, but a small sample with extensive probing can collect most of the salient ideas. A larger sample (n = 20) is more sensitive and can collect more prevalent and more salient ideas, as well as less prevalent ideas, especially with probing. Some domains, however, may not have items with high prevalence. Several of the domains examined had only a half dozen or fewer items with a prevalence of 20% or more. The direct link between salience and population prevalence offers a rationale for sample size and facilitates study planning. If the goal is to get a few widely held ideas, a small sample size will suffice. If the goal is to explore a larger range of ideas, a larger sample size or extensive probing is needed. Sample sizes of one to two dozen interviews should be sufficient with exhaustive probing (listing interviews), especially in a coherent domain. Empirically observed stabilization of item salience may indicate an adequate sample size.

A next step would be to test whether these conclusions and recommendations hold for other types of open-ended questions, such as narratives, life histories, and open-ended questions in large surveys. Open-ended survey questions are inefficient and result in thin or sparse data with few responses per person because of a lack of prompting. Tran et al. [ 24 ] reported an item prevalence of 0.025 in answers to a large Internet survey, suggesting few responses per person. In contrast, we used an item prevalence of 0.20 and higher to identify the most salient items in each domain, and the highest prevalence in each domain ranged from 0.30 to 0.80 ( Table 1 ). The inefficiency of open-ended survey questions is likely due to their dual purpose: they try to define the range of possible answers and to get the respondent’s answer. A better approach might be to precede survey development with a dozen free-listing interviews to get the range of possible responses and then use that content to design structured survey questions.

Another avenue for investigation is how our findings on thematic saturation compare to theoretical saturation in grounded theory studies [ 2 , 38 , 39 ]. Grounded theory studies rely on theoretical sampling, an iterative procedure in which a single interview is coded for themes; the next respondent is selected to discover new themes and relationships between themes; and so on, until no more relevant themes or inter-relationships are discovered and a theory is built to explain the facts/themes of the case under study. In contrast, this study examined thematic saturation, the simple accumulation of ideas and themes, and found that saturation in salience was more attainable, and perhaps more important, than thematic saturation.

Supporting information

S1 Appendix
S2 Appendix

Acknowledgments

We would like to thank Devon Brewer and Kristofer Jennings for providing feedback on an earlier version of this manuscript. We would also like to thank Devon Brewer for providing data from his studies on free-lists.

Funding Statement

This project was partially supported by the Agency for Healthcare Research and Quality (R24HS022134). Funding for the original data sets was from the National Science Foundation (#BCS-0244104) for Gravlee et al. (2013), from the National Institute on Drug Abuse (R29DA10640) for Brewer et al. (2002), and from the Air Force Office of Scientific Research for Brewer (1995). Content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.

Data Availability


Qualitative research: open-ended and closed-ended questions



From a very young age, we have been taught what open-ended and closed-ended questions are. How are these terms applied to qualitative research methods, and in particular to interviews?

Kathryn J. Roulston gives her definitions of open-ended and closed-ended questions in qualitative interviews in the SAGE Encyclopedia of Qualitative Research Methods. If you want to better understand how qualitative methods fit within a market research approach, we suggest you take a look at our step-by-step guide to market research, which can be downloaded from our white papers section (free of charge and direct; we won’t ask you for any contact details first).



Introduction

  • Closed-ended question
  • Open-ended question

  • Examples of closed and open-ended questions for satisfaction research

  • Examples of closed and open-ended questions for innovation research
  • Some practical advice

Let us begin by pointing out that open-ended and closed-ended questions do not, at first glance, serve the same purpose in market research. Broadly speaking, open-ended questions are used in qualitative research and closed-ended questions in quantitative research. But this is not an absolute rule.

In this article, you will therefore discover the definitions of closed and open-ended questions. We will also explain how to use them. Finally, you will find examples of how to reformulate closed-ended questions into open-ended questions in the case of:

  • satisfaction research
  • innovation research

Essential elements to remember

Open-ended questions:

  • for qualitative research (interviews and focus groups)
  • very useful in understanding in detail the respondent and his or her position concerning a defined topic/situation
  • particularly helpful in revealing new aspects , sub-themes, issues, and so forth that are unknown or unidentified

Closed-ended questions:

  • for quantitative research (questionnaires and surveys)
  • suitable for use with a wide range of respondents
  • allow a standardised analysis of the data
  • are intended to confirm the hypotheses (previously stated in the qualitative part)

A closed-ended question

A closed-ended question offers, as its name suggests, a limited number of possible answers. For example, the interviewee may choose a response from a set of given options or answer with a simple “yes” or “no”. Closed-ended questions are intended to provide a precise, clearly identifiable, and easily classified answer.

This type of question is used in particular during interviews whose answers are to be coded according to pre-established criteria. Unlike open-ended questions, they leave no room for free expression. Often, this type of question is integrated into 1-to-1 interview guides and focus groups and allows the interviewer to collect the same information from a wide range of respondents in the same format. Indeed, closed-ended questions are designed and oriented to follow a pattern and framework predefined by the interviewer.


Two forms of closed-ended questions were identified by the researchers: specific closed-ended questions , where respondents are offered choice answers, and implicit closed-ended questions , which include assumptions about the answers that can be provided by respondents.

A specific closed-ended question would be formulated as follows, for example: “How many times a week do you eat pasta: never, once or twice a week, 3 to 4 times, 5 times a week or more?” The adapted version in the form of an implicit closed-ended question would be: “How many times a week do you eat pasta?” The interviewer then assumes that the answers will be given as numbers.

[Image: Net Promoter Score question at Proximus]

The Net Promoter Score (NPS) is an example of a closed-ended question (see the image above).

While some researchers consider the use of closed-ended questions to be restrictive, others see in these questions – combined with open-ended questions – the possibility of generating different data for analysis. How these closed-ended questions can be used, formulated, sequenced, and introduced in interviews depends heavily upon the studies and research conducted upstream.


In what context are closed-ended questions used?

  • Quantitative research (tests, confirmation of the qualitative research and so on).
  • Research with a large panel of respondents (> 100 people)
  • Recurrent research whose results need to be compared
  • When you need confirmation and the possible answers are limited in number

An open-ended question

An open-ended question is a question that allows the respondent to express himself or herself freely on a given subject. This type of question is, in contrast to closed-ended questions, non-directive and allows respondents to use their own terms and to shape their response as they see fit.

Because open-ended questions carry no presumptions about the answer, they can be used to see which aspects stand out from the responses and could therefore be interpreted as a fact, behaviour, reaction, etc. typical of a defined panel of respondents.

For example, we can very easily imagine open-ended questions such as “describe your morning routine”. Respondents are then free to describe their routine in their own words, which is an important point to consider. Indeed, the vocabulary used is also conducive to analysis and will be an element to be taken into account when adapting an interview guide, for example, and/or when developing a quantitative questionnaire.


As we detail in our market research whitepaper, one of the recommendations to follow when using open-ended questions is to start with more general questions and end with more detailed ones. For example, after describing a typical day, the interviewer may ask for clarification on one of the aspects mentioned by the respondent. Open-ended questions can also be directed so that the interviewee elaborates on his or her feelings about a situation mentioned earlier.

In what context are open-ended questions used?

  • Mainly in qualitative research (interviews and focus groups)
  • To recruit research participants
  • During research to test a design, a proof-of-concept, a prototype, and so on, where it is essential to identify the most appropriate solution
  • Analysis of consumers and purchasing behaviour
  • Satisfaction research , reputation, customer experience and loyalty research, and so forth.
  • To specify the hypotheses that will enable the quantitative questionnaire to be drawn up and to propose a series of relevant answers (to closed-ended questions ).

It is essential for the interviewer to give respondents a framework when using open-ended questions. Without this context, interviewees could be lost in the full range of possible responses, and this could interfere with the smooth running of the interview. Another critical point concerning this type of question is the analytical aspect that follows. Indeed, since respondents are free to formulate their answers, the data collected will be less easy to classify according to fixed criteria.

The use of open-ended questions in quantitative questionnaires

Rules, as is well known, are made to be broken. Most quantitative questionnaires therefore contain free-text fields in which the respondent is invited to express his or her opinions more freely. But how should these answers be interpreted?

When the quantity of answers collected is small (about ten), it is easy to proceed manually, possibly by coding the responses. You will thus quickly identify the main trends and recurring themes.

On the other hand, if you collect hundreds or even thousands of answers, analysing these free answers will be much more tedious. How can you do it? In this case, we advise you to use a semantic analysis tool. This is most often an online solution, specific to a language, based on an NLP (Natural Language Processing) algorithm. The algorithm will very quickly analyse your corpus and bring out the recurring themes. It is not a question here of calculating word frequencies, but of working on semantics to analyse the repetition of a subject.
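
As an illustration of what such a tool does under the hood, here is a minimal, hypothetical sketch (not any specific commercial solution): embed each free-text answer with a publicly available sentence-embedding model, then cluster the embeddings so that answers about the same theme land together. The model name and the example answers are assumptions for illustration only.

```python
# Assumes the sentence-transformers and scikit-learn packages are installed;
# "all-MiniLM-L6-v2" is just one publicly available sentence-embedding model.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

answers = [
    "Delivery was late and nobody warned me",
    "The parcel arrived two days after the promised date",
    "Great customer service, very friendly staff",
    "The support team answered all my questions quickly",
]

embeddings = SentenceTransformer("all-MiniLM-L6-v2").encode(answers)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embeddings)

for label, text in sorted(zip(labels, answers)):
    print(label, "-", text)
```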

Of course, the use of open-ended questions in interviews does not exclude the use of closed-ended questions. Alternating these two types of questions, whether in 1-to-1 interviews, group conversations or focus groups, is conducive not only to maintaining a certain dynamic during the interview but also to framing specific responses while leaving certain fields of expression free. In general, it is useful for all parties if the interview ends with an open-ended question in which the interviewer asks the interviewee whether he or she has anything to add or any questions.

In innovation research, you confront the respondent with a new, innovative product or service. It is therefore important not to collect superficial opinions but to understand in depth the respondent’s attitude towards the subject of the market research.

As you will have understood, open-ended questions are particularly suitable for qualitative research (1-to-1 interviews and focus groups). How should they be formulated?

The Five Ws (who, what, where, when, and why) questioning method should be used rigorously and sparingly:

  • Questions such as “Who? What? Where? When? How? How much?” are particularly useful for qualitative research and allow your interlocutor to develop and elaborate a constructed and informative answer.
  • Use the CIT (Critical Incident Technique) method with formulations that encourage your interviewee to go into the details of an experience: “Can you describe/tell me…?”, “What did you feel?”, “According to you…”
  • Avoid asking “Why?”: this question may push the interviewee into a corner, where he or she seeks a logical justification for a previous answer. Be gentle with your respondents by asking them to tell you more or to give you specific examples.

In contrast, closed-ended questions are mainly used and adapted to quantitative questionnaires since they facilitate the analysis of the results by framing the participants’ answers.






Semi-Structured Interview | Definition, Guide & Examples

Published on January 27, 2022 by Tegan George. Revised on June 22, 2023.

A semi-structured interview is a data collection method that relies on asking questions within a predetermined thematic framework. However, the questions are not set in order or in phrasing.

In research, semi-structured interviews are often qualitative in nature. They are generally used as an exploratory tool in marketing, social science, survey methodology, and other research fields.

They are also common in field research with many interviewers, giving everyone the same theoretical framework, but allowing them to investigate different facets of the research question .

  • Structured interviews : The questions are predetermined in both topic and order.
  • Unstructured interviews : None of the questions are predetermined.
  • Focus group interviews : The questions are presented to a group instead of one individual.

Table of contents

  • What is a semi-structured interview?
  • When to use a semi-structured interview
  • Advantages of semi-structured interviews
  • Disadvantages of semi-structured interviews
  • Semi-structured interview questions
  • How to conduct a semi-structured interview
  • How to analyze a semi-structured interview
  • Presenting your results (with example)
  • Other interesting articles
  • Frequently asked questions about semi-structured interviews

Semi-structured interviews are a blend of structured and unstructured types of interviews.

  • Unlike in an unstructured interview, the interviewer has an idea of what questions they will ask.
  • Unlike in a structured interview, the phrasing and order of the questions is not set.

Semi-structured interviews are often open-ended, allowing for flexibility. Asking set questions in a set order allows for easy comparison between respondents, but it can be limiting. Having less structure can help you see patterns, while still allowing for comparisons between respondents.

Semi-structured interviews are best used when:

  • You have prior interview experience. Spontaneous questions are deceptively challenging, and it’s easy to accidentally ask a leading question or make a participant uneasy.
  • Your research question is exploratory in nature. Participant answers can guide future research questions and help you develop a more robust knowledge base for future research.

Just like in structured interviews, it is critical that you remain organized and develop a system for keeping track of participant responses. However, since the questions are less set than in a structured interview, the data collection and analysis become a bit more complex.

Differences between different types of interviews

Make sure to choose the type of interview that suits your research best. This table shows the most important differences between the four types.

Semi-structured interviews come with many advantages.

Best of both worlds

No distractions

Detail and richness

However, semi-structured interviews also have their downsides.

Low validity

High risk of research bias

Difficult to develop good semi-structured interview questions

Since they are often open-ended in style, it can be challenging to write semi-structured interview questions that get you the information you’re looking for without biasing your responses. Here are a few tips:

  • Define what areas or topics you will be focusing on prior to the interview. This will help you write a framework of questions that zero in on the information you seek.
  • Write yourself a guide to refer to during the interview, so you stay focused. It can help to start with the simpler questions first, moving into the more complex ones after you have established a comfortable rapport.
  • Be as clear and concise as possible, avoiding jargon and compound sentences.
  • How often per week do you go to the gym? a) 1 time; b) 2 times; c) 3 times; d) 4 or more times
  • If yes: What feelings does going to the gym bring out in you?
  • If no: What do you prefer to do instead?
  • If yes: How did this membership affect your job performance? Did you stay longer in the role than you would have if there were no membership?

Once you’ve determined that a semi-structured interview is the right fit for your research topic , you can proceed with the following steps.

Step 1: Set your goals and objectives

You can use guiding questions as you conceptualize your research question, such as:

  • What are you trying to learn or achieve from a semi-structured interview?
  • Why are you choosing a semi-structured interview as opposed to a different type of interview, or another research method?

If you want to proceed with a semi-structured interview, you can start designing your questions.

Step 2: Design your questions

Try to stay simple and concise, and phrase your questions clearly. If your topic is sensitive or could cause an emotional response, be mindful of your word choices.

One of the most challenging parts of a semi-structured interview is knowing when to ask follow-up or spontaneous related questions. For this reason, having a guide to refer back to is critical. Hypothesizing what other questions could arise from your participants’ answers may also be helpful.

Step 3: Assemble your participants

There are a few sampling methods you can use to recruit your interview participants, such as:

  • Voluntary response sampling : For example, sending an email to a campus mailing list and sourcing participants from responses.
  • Stratified sampling based on a particular characteristic of interest to your research, such as age, race, ethnicity, or gender identity.

Step 4: Decide on your medium

It’s important to determine ahead of time how you will be conducting your interview. You should decide whether you’ll be conducting it live or with a pen-and-paper format. If conducted in real time, you also need to decide if in person, over the phone, or via videoconferencing is the best option for you.

Note that each of these methods has its own advantages and disadvantages:

  • Pen-and-paper may be easier for you to organize and analyze, but you will receive more prepared answers, which may affect the reliability of your data.
  • In-person interviews can lead to nervousness or interviewer effects, where the respondent feels pressured to respond in a manner they believe will please you or incentivize you to like them.

Step 5: Conduct your interviews

As you conduct your interviews, keep environmental conditions as constant as you can to avoid bias. Pay attention to your body language (e.g., nodding, raising eyebrows), and moderate your tone of voice.

Relatedly, one of the biggest challenges with semi-structured interviews is ensuring that your questions remain unbiased. This can be especially challenging with any spontaneous questions or unscripted follow-ups that you ask your participants.

After you’re finished conducting your interviews, it’s time to analyze your results. First, assign each of your participants a number or pseudonym for organizational purposes.

The next step in your analysis is to transcribe the audio or video recordings. You can then conduct a content or thematic analysis to determine your categories, looking for patterns of responses that stand out to you and test your hypotheses .

Transcribing interviews

Before you get started with transcription, decide whether to conduct verbatim transcription or intelligent verbatim transcription.

  • If pauses, laughter, or filler words like “umm” or “like” affect your analysis and research conclusions, conduct verbatim transcription and include them.
  • If not, you can conduct intelligent verbatim transcription, which excludes fillers, fixes any grammatical issues, and is usually easier to analyze.

Transcribing presents a great opportunity for you to cleanse your data . Here, you can identify and address any inconsistencies or questions that come up as you listen.

Your supervisor might ask you to add the transcriptions to the appendix of your paper.

Coding semi-structured interviews

Next, it’s time to conduct your thematic or content analysis . This often involves “coding” words, patterns, or recurring responses, separating them into labels or categories for more robust analysis.

Due to the open-ended nature of many semi-structured interviews, you will most likely be conducting thematic analysis, rather than content analysis.

  • You closely examine your data to identify common topics, ideas, or patterns. This can help you draw preliminary conclusions about your participants’ views, knowledge or experiences.
  • After you have been through your responses a few times, you can collect the data into groups identified by their “code.” These codes give you a condensed overview of the main points and patterns identified by your data.
  • Next, it’s time to organize these codes into themes. Themes are generally broader than codes, and you’ll often combine a few codes under one theme. After identifying your themes, make sure that these themes appropriately represent patterns in responses.
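
To make the coding-to-themes step concrete, here is a minimal, hypothetical sketch of tallying coded excerpts and rolling codes up into broader themes (the participants, codes, and themes are invented for illustration):

```python
from collections import Counter

# Hypothetical (participant, code) pairs produced while coding interview transcripts.
coded_excerpts = [
    ("P1", "cost"), ("P1", "animal welfare"), ("P2", "climate"),
    ("P2", "cost"), ("P3", "climate"), ("P3", "health"),
]

# Broader themes, each grouping several related codes.
themes = {
    "Ethics and environment": {"animal welfare", "climate"},
    "Personal factors": {"health", "cost"},
}

code_counts = Counter(code for _, code in coded_excerpts)
for theme, codes in themes.items():
    print(theme, "->", sum(code_counts[c] for c in codes), "coded excerpts")
```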

Analyzing semi-structured interviews

Once you’re confident in your themes, you can take either an inductive or a deductive approach.

  • An inductive approach is more open-ended, allowing your data to determine your themes.
  • A deductive approach is the opposite. It involves investigating whether your data confirm preconceived themes or ideas.

After your data analysis, the next step is to report your findings in a research paper .

  • Your methodology section describes how you collected the data (in this case, describing your semi-structured interview process) and explains how you justify or conceptualize your analysis.
  • Your discussion and results sections usually address each of your coded categories.
  • You can then conclude with the main takeaways and avenues for further research.

Example of interview methodology for a research paper

Let’s say you are interested in vegan students on your campus. You have noticed that the number of vegan students seems to have increased since your first year, and you are curious what caused this shift.

You identify a few potential options based on literature:

  • Perceptions about personal health or the perceived “healthiness” of a vegan diet
  • Concerns about animal welfare and the meat industry
  • Increased climate awareness, especially in regards to animal products
  • Availability of more vegan options, making the lifestyle change easier

Anecdotally, you hypothesize that students are more aware of the impact of animal products on the ongoing climate crisis, and this has influenced many to go vegan. However, you cannot rule out the possibility of the other options, such as the new vegan bar in the dining hall.

Since your topic is exploratory in nature and you have a lot of experience conducting interviews in your work-study role as a research assistant, you decide to conduct semi-structured interviews.

You have a friend who is a member of a campus club for vegans and vegetarians, so you send a message to the club to ask for volunteers. You also spend some time at the campus dining hall, approaching students at the vegan bar asking if they’d like to participate.

Here are some questions you could ask:

  • Do you find vegan options on campus to be: excellent; good; fair; average; poor?
  • How long have you been a vegan?
  • Follow-up questions can probe the strength of this decision (i.e., was it overwhelmingly one reason, or more of a mix?)

Depending on your participants’ answers to these questions, ask follow-ups as needed for clarification, further information, or elaboration.

  • Do you think consuming animal products contributes to climate change? → The phrasing implies that you, the interviewer, do think so. This could bias your respondents, incentivizing them to answer affirmatively as well.
  • What do you think is the biggest effect of animal product consumption? → This phrasing ensures the participant is giving their own opinion, and may even yield some surprising responses that enrich your analysis.

After conducting your interviews and transcribing your data, you can then conduct thematic analysis, coding responses into different categories. Since you began your research with several theories about campus veganism that you found equally compelling, you would use the inductive approach.

Once you’ve identified themes and patterns from your data, you can draw inferences and conclusions. Your results section usually addresses each theme or pattern you found, describing each in turn, as well as how often you came across them in your analysis. Feel free to include lots of (properly anonymized) examples from the data as evidence, too.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.


A semi-structured interview is a blend of structured and unstructured types of interviews. Semi-structured interviews are best used when:

  • You have prior interview experience. Spontaneous questions are deceptively challenging, and it’s easy to accidentally ask a leading question or make a participant uncomfortable.

The four most common types of interviews are:

  • Structured interviews : The questions are predetermined in both topic and order.
  • Semi-structured interviews : A few questions are predetermined, but other questions aren’t planned.
  • Unstructured interviews : None of the questions are predetermined.
  • Focus group interviews : The questions are presented to a group instead of one individual.

Social desirability bias is the tendency for interview participants to give responses that will be viewed favorably by the interviewer or other participants. It occurs in all types of interviews and surveys , but is most common in semi-structured interviews , unstructured interviews , and focus groups .

Social desirability bias can be mitigated by ensuring participants feel at ease and comfortable sharing their views. Make sure to pay attention to your own body language and any physical or verbal cues, such as nodding or widening your eyes.

This type of bias can also occur in observations if the participants know they’re being observed. They might alter their behavior accordingly.

The interviewer effect is a type of bias that emerges when a characteristic of an interviewer (race, age, gender identity, etc.) influences the responses given by the interviewee.

There is a risk of an interviewer effect in all types of interviews , but it can be mitigated by writing really high-quality interview questions.

Inductive reasoning is a bottom-up approach, while deductive reasoning is top-down.

Inductive reasoning takes you from the specific to the general, while in deductive reasoning, you make inferences by going from general premises to specific conclusions.



9.2 Qualitative interviews

Learning objectives

  • Define interviews from the social scientific perspective
  • Identify when it is appropriate to employ interviews as a data-collection strategy
  • Identify the primary aim of in-depth interviews
  • Describe what makes qualitative interview techniques unique
  • Define the term interview guide and describe how to construct an interview guide
  • Outline the guidelines for constructing good qualitative interview questions
  • Describe how writing field notes and journaling function in qualitative research
  • Identify the strengths and weaknesses of interviews

Knowing how to create and conduct a good interview is an essential skill. Interviews are used by market researchers to learn how to sell their products, and journalists use interviews to get information from a whole host of people from VIPs to random people on the street. Police use interviews to investigate crimes.


In social science,  interviews are a method of data collection that involves two or more people exchanging information through a series of questions and answers. The questions are designed by the researcher to elicit information from interview participants on a specific topic or set of topics. These topics are informed by the research questions. Typically, interviews involve an in-person meeting between two people—an interviewer and an interviewee — but interviews need not be limited to two people, nor must they occur in-person.

The question of when to conduct an interview might be on your mind. Interviews are an excellent way to gather detailed information. They also have an advantage over surveys—they can change as you learn more information. In a survey, you cannot change what questions you ask if a participant’s response sparks some follow-up question in your mind. All participants must get the same questions. The questions you decided to put on your survey during the design stage determine what data you get. In an interview, however, you can follow up on new and unexpected topics that emerge during the conversation. Trusting in emergence and learning from participants are hallmarks of qualitative research. In this way, interviews are a useful method to use when you want to know the story behind the responses you might receive in a written survey.

Interviews are also useful when the topic you are studying is rather complex, requires lengthy explanation, or needs a dialogue between two people to investigate thoroughly. Also, if you want people to describe the process by which a phenomenon occurs, such as how a person makes a decision, then interviews may be the best method for you. For example, you could use interviews to gather data about how people reach the decision not to have children and how others in their lives have responded to that decision. To understand these “how’s,” you would need to have some back-and-forth dialogue with respondents. When they begin to tell you their story, inevitably new questions that hadn’t occurred to you from prior interviews would come up, because each person’s story is unique. Also, because the process of choosing not to have children is complex for many people, describing that process by responding to closed-ended questions on a survey wouldn’t work particularly well.

Interview research is especially useful when:

  • You wish to gather very detailed information
  • You anticipate wanting to ask respondents follow-up questions based on their responses
  • You plan to ask questions that require lengthy explanation
  • You are studying a complex or potentially confusing topic to respondents
  • You are studying processes, such as how people make decisions

Qualitative interviews are sometimes called intensive or in-depth interviews. These interviews are semi-structured ; the researcher has a particular topic about which she would like to hear from the respondent, but questions are open-ended and may not be asked in exactly the same way or in exactly the same order to each and every respondent. For in-depth interviews , the primary aim is to hear from respondents about what they think is important about the topic at hand and to hear it in their own words. In this section, we’ll take a look at how to conduct qualitative interviews, analyze interview data, and identify some of the strengths and weaknesses of this method.

Constructing an interview guide

Qualitative interviews might feel more like a conversation than an interview to respondents, but the researcher is in fact usually guiding the conversation with the goal in mind of gathering specific information from a respondent. Qualitative interviews use open-ended questions, which are questions that a researcher poses but does not provide answer options for. Open-ended questions are more demanding of participants than closed-ended questions because they require participants to come up with their own words, phrases, or sentences to respond.


In a qualitative interview, the researcher usually develops an interview guide in advance to refer to during the interview (or memorizes in advance of the interview). An interview guide is a list of questions or topics that the interviewer hopes to cover during the course of an interview. It is called a guide because it is simply that—it is used to guide the interviewer, but it is not set in stone. Think of an interview guide like an agenda for the day or a to-do list—both probably contain all the items you hope to check off or accomplish, though it probably won’t be the end of the world if you don’t accomplish everything on the list or if you don’t accomplish it in the exact order that you have it written down. Perhaps new events will come up that cause you to rearrange your schedule just a bit, or perhaps you simply won’t get to everything on the list.

Interview guides should outline issues that a researcher feels are likely to be important. Because participants are asked to provide answers in their own words and to raise points they believe are important, each interview is likely to flow a little differently. While the opening question in an in-depth interview may be the same across all interviews, from that point on, what the participant says will shape how the interview proceeds. Sometimes participants answer a question on the interview guide before it is asked. When the interviewer comes to that question later on in the interview, it’s a good idea to acknowledge that they already addressed part of this question and ask them if they have anything to add to their response.  All of this uncertainty can make in-depth interviewing exciting and rather challenging. It takes a skilled interviewer to be able to ask questions; listen to respondents; and pick up on cues about when to follow up, when to move on, and when to simply let the participant speak without guidance or interruption.

As we’ve discussed, interview guides can list topics or questions. The specific format of an interview guide might depend on your style, experience, and comfort level as an interviewer or with your topic. Figure 9.1 provides an example of an interview guide for a study of how young people experience workplace sexual harassment. The guide is topic-based, rather than a list of specific questions. The ordering of the topics is important, though how each comes up during the interview may vary.

Figure 9.1: An interview guide that uses topics rather than questions

For interview guides that use questions, there can also be specific words or phrases for follow-up in case the participant does not mention those topics in their responses. These probes, as well as the questions, are written out in the interview guide but may not always be used. Figure 9.2 provides an example of an interview guide that uses questions rather than topics.

Figure 9.2: An interview guide that uses questions rather than topics

As you might have guessed, interview guides do not appear out of thin air. They are the result of thoughtful and careful work on the part of a researcher. As you can see in both of the preceding guides, the topics and questions have been organized thematically and in the order in which they are likely to proceed (though keep in mind that the flow of a qualitative interview is in part determined by what a respondent has to say). Sometimes qualitative interviewers may create two versions of the interview guide: one version contains a very brief outline of the interview, perhaps with just topic headings, and another version contains detailed questions underneath each topic heading. In this case, the researcher might use the very detailed guide to prepare and practice in advance of actually conducting interviews and then just bring the brief outline to the interview. Bringing an outline, as opposed to a very long list of detailed questions, to an interview encourages the researcher to actually listen to what a participant is saying. An overly detailed interview guide can be difficult to navigate during an interview and could give respondents the mistaken impression that the interviewer is more interested in the questions than in the participant’s answers.

Constructing an interview guide often begins with brainstorming. There are no rules at the brainstorming stage—simply list all the topics and questions that come to mind when you think about your research question. Once you’ve got a pretty good list, you can begin to pare it down by cutting questions and topics that seem redundant and grouping similar questions and topics together. If you haven’t done so yet, you may also want to come up with question and topic headings for your grouped categories. You should also consult the scholarly literature to find out what kinds of questions other interviewers have asked in studies of similar topics and what theory indicates might be important. As with quantitative survey research, it is best not to place very sensitive or potentially controversial questions at the very beginning of your qualitative interview guide. You need to give participants the opportunity to warm up to the interview and to feel comfortable talking with you. Finally, get some feedback on your interview guide. Ask your friends, other researchers, and your professors for some guidance and suggestions once you’ve come up with what you think is a strong guide. Chances are they’ll catch a few things you hadn’t noticed. Once you begin your interviews, your participants may also suggest revisions or improvements.

In terms of the specific questions you include in your guide, there are a few guidelines worth noting. First, avoid questions that can be answered with a simple yes or no. Try to rephrase your questions in a way that invites longer responses from your interviewees. If you choose to include yes or no questions, be sure to include follow-up questions. Remember, one of the benefits of qualitative interviews is that you can ask participants for more information—be sure to do so. While it is a good idea to ask follow-up questions, try to avoid asking “why” as your follow-up question, as this particular question can come off as confrontational, even if that is not your intent. Often people won’t know how to respond to “why,” perhaps because they don’t even know why themselves. Instead of asking “why,” you might say something like, “Could you tell me a little more about that?” This allows participants to explain themselves further without feeling that they’re being doubted or questioned in a hostile way.

Also, try to avoid phrasing your questions in a leading way. For example, rather than asking, “Don’t you think most people who don’t want to have children are selfish?” you could ask, “What comes to mind for you when you hear someone doesn’t want to have children?” Finally, remember to keep most, if not all, of your questions open-ended. The key to a successful qualitative interview is giving participants the opportunity to share information in their own words and in their own way. Documenting the decisions made along the way regarding which questions are used, thrown out, or revised can help a researcher remember the thought process behind the interview guide when she is analyzing the data. Additionally, it promotes the rigor of the qualitative project as a whole, ensuring the researcher is proceeding in a reflective and deliberate manner that can be checked by others reviewing her study.

Recording qualitative data

Even after the interview guide is constructed, the interviewer is not yet ready to begin conducting interviews. The researcher has to decide how to collect and maintain the information that is provided by participants. Researchers keep field notes, written notes produced by the researcher during the data collection process. Field notes can be taken before, during, or after interviews. Field notes help researchers document what they observe, and in so doing, they form the first step of data analysis. Field notes may contain many things—observations of body language or environment, reflections on whether interview questions are working well, and connections between ideas that participants share.


Unfortunately, even the most diligent researcher cannot write down everything that is seen or heard during an interview. In particular, it is difficult for a researcher to be truly present and observant if she is also writing down everything the participant is saying. For this reason, it is quite common for interviewers to create audio recordings of the interviews they conduct. Recording interviews allows the researcher to focus on the interaction with the interview participant.

Of course, not all participants will feel comfortable being recorded and sometimes even the interviewer may feel that the subject is so sensitive that recording would be inappropriate. If this is the case, it is up to the researcher to balance excellent note-taking with exceptional question-asking and even better listening.

Whether you will be recording your interviews or not (and especially if not), practicing the interview in advance is crucial. Ideally, you’ll find a friend or two willing to participate in a couple of trial runs with you. Even better, find a friend or two who are similar in at least some ways to your sample. They can give you the best feedback on your questions and your interview demeanor.

Another issue interviewers face is documenting the decisions made during the data collection process. Qualitative research is open to new ideas that emerge through the data collection process. For example, a participant might suggest a new concept you hadn’t thought of before or define a concept in a new way. This may lead you to create new questions or ask questions in a different way to future participants. These decisions should be documented through journaling or memoing. Journal entries are notes to yourself about reflections or methodological decisions that emerge during the data collection process. Documenting these is important, as you’d be surprised how quickly you can forget what happened. Journaling makes sure that when it comes time to analyze your data, you remember how, when, and why certain changes were made. The discipline of journaling in qualitative research helps to ensure the rigor of the research process—that is, its trustworthiness and authenticity, which we will discuss later in this chapter.

Strengths and weaknesses of qualitative interviews

As we’ve mentioned in this section, qualitative interviews are an excellent way to gather detailed information. Any topic can be explored in much more depth with interviews than with almost any other method. Not only are participants given the opportunity to elaborate in a way that is not possible with other methods such as survey research, but they also are able to share information with researchers in their own words and from their own perspectives. Quantitative research, by contrast, asks participants to fit their perspectives into the limited response options provided by the researcher. And because qualitative interviews are designed to elicit detailed information, they are especially useful when a researcher’s aim is to study social processes or the “how” of various phenomena. Yet another, and sometimes overlooked, benefit of in-person qualitative interviews is that researchers can make observations beyond those that a respondent is orally reporting. A respondent’s body language, and even their choice of time and location for the interview, might provide a researcher with useful data.

Of course, all these benefits come with some drawbacks. As with quantitative survey research, qualitative interviews rely on respondents’ ability to accurately and honestly recall specific details about their lives, circumstances, thoughts, opinions, or behaviors. Further, as you may have already guessed, qualitative interviewing is time-intensive and can be quite expensive. Creating an interview guide, identifying a sample, and conducting interviews are just the beginning. Writing out what was said in interviews and analyzing the qualitative interview data are time consuming processes. Keep in mind you are also asking for more of participants’ time than if you’d simply mailed them a questionnaire containing closed-ended questions. Conducting qualitative interviews is not only labor-intensive but can also be emotionally taxing. Seeing and hearing the impact that social problems have on respondents is difficult. Researchers embarking on a qualitative interview project should keep in mind their own abilities to receive stories that may be difficult to hear.

Key Takeaways

  • Understanding how to design and conduct interview research is a useful skill to have.
  • In a social scientific interview, two or more people exchange information through a series of questions and answers.
  • Interview research is often used when detailed information is required and when a researcher wishes to examine processes.
  • In-depth interviews are semi-structured interviews where the researcher has topics and questions in mind to ask, but questions are open-ended and flow according to how the participant responds to each.
  • Interview guides can vary in format but should contain some outline of the topics you hope to cover during the course of an interview.
  • Qualitative interviews allow respondents to share information in their own words and are useful for gathering detailed information and understanding social processes.
  • Field notes and journaling are ways to document thoughts and decisions about the research process
  • Drawbacks of qualitative interviews include reliance on respondents’ accuracy and their intensity in terms of time, expense, and possible emotional strain.
  • Field notes- written notes produced by the researcher during the data collection process
  • In-depth interviews- interviews in which researchers hear from respondents about what they think is important about the topic at hand in the respondent’s own words
  • Interviews- a method of data collection that involves two or more people exchanging information through a series of questions and answers
  • Interview guide- a list of questions or topics that the interviewer hopes to cover during the course of an interview
  • Journaling- making notes of emerging issues and changes during the research process
  • Semi-structured interviews- questions are open ended and may not be asked in exactly the same way or in exactly the same order to each and every respondent

Image attributions

interview restaurant a pair by alda2 CC-0

questions by geralt CC-0

Figure 9.1 is copied from Blackstone, A. (2012) Principles of sociological inquiry: Qualitative and quantitative methods. Saylor Foundation. Retrieved from: https://saylordotorg.github.io/text_principles-of-sociological-inquiry-qualitative-and-quantitative-methods/ Shared under CC-BY-NC-SA 3.0 License

writing by StockSnap CC-0

Foundations of Social Work Research Copyright © 2020 by Rebecca L. Mauldin is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License , except where otherwise noted.


UX Mastery

How to Write Effective Qualitative Interview Questions


Qualitative interviewing is an effective technique to quickly understand more about a target user group. It is a key skill that any aspiring user researcher should develop. It is important to carefully craft the questions to ensure the sessions run efficiently and get the desired information. This article outlines best practice tips on creating effective session guides, ensuring your questions produce great results.

Don’t Ask Leading Questions

A leading question guides the respondent toward a desired answer by implying that there is a correct one. People tend to give socially desirable answers, so if a question guides them, they will likely provide the answer they believe you want to hear. Leading questions can be useful for persuasion, but they should not be used when trying to uncover new information or understand an audience: they reduce the objectivity of the session and, therefore, the reliability of the results.

Example: Leading: ‘Why would you prefer to use our product?’ Better: ‘What are your thoughts about using our product?’

The leading example implies that the respondent already prefers the product and simply asks why. The respondent may list reasons they like the product but leave out crucial information about where they believe it could improve. Asking about their opinions and thoughts gives them a platform to discuss the product freely.

Example: Leading: Would you prefer to use the product to improve efficiencies or to gain an overview? Better: Why might you use this product?

In this example, the interviewer supplies two possible reasons for using the product, which may be the only reasons the interviewer has considered. Simply asking why they might use the product achieves the same goal while also allowing the respondent to consider other options.

To avoid leading questions, act as if you know nothing about the topic. Note down what you would ask if you had no information at all. Keep the questions simple, neutral, and free of emotionally loaded words. It also helps to have an independent observer review your questions, as it is easier for them to spot bias.

Mix Behavioural and Attitudinal Questions

People often hold beliefs that do not match their behaviours. Attitudinal questions are used to understand their opinions and motivations; behavioural questions are used to find out how a participant actually does something. Using a mixture of the two uncovers not only what a person does, but also what they think about their actions.

Example: Attitudinal: How often should you brush your teeth? Behavioural: How many times did you brush your teeth last week?

Try to keep all behavioural questions about the user’s past, as future behaviours are influenced by opinions and attitudes. It is best practice to repeat questions from a different angle. Don’t be afraid of users repeating themselves or going over a topic multiple times.

Ask Open-Ended Questions Instead of Closed Questions

Open-ended questions are ones that require more than one word to answer. Closed questions typically result in a yes/no answer. Open-ended questions are used to find out people’s goals, motivations and pain points. They provide an opportunity for the participant to speak freely on the topic.

Example: Yes/No: Do you like coffee? Open: What are your thoughts on coffee?

Closed questions should be avoided unless you want to clarify a point or gain more context about the user’s situation. Yes/no questions close down conversations and can be considered quantitative. The following examples are both fine to use in an interview, as they will put other details into perspective.

Context: Do you drink coffee? Clarify: You mentioned you drink coffee, correct?

When creating your questionnaire, try and stick with ‘how’, ‘why’, ‘what’, ‘when’ and ‘where’ questions.

Don’t Use Double-Barreled Questions

Sometimes interviewers get excited and want to ask multiple things at once. Double-barreled questions touch on more than one topic. This can be overwhelming to answer, and respondents may either try to answer both at once or answer only one part of the question. If you want to ask something on multiple topics, it is best to split them into two different questions.

Example: Double-barreled: What do you like about coffee and new coffee products? Better: What do you like about coffee products?

It is normal in casual conversation to ask questions in such a manner. Interviewing is best when the questions are short and to the point, focusing on one topic.

Differentiate Between Quantitative and Qualitative Questions

Quantitative and qualitative questions both have their own strengths and weaknesses. Quantitative questions are typically reserved for surveys but can be used in interviewing to add some context and allow the interviewer to ask more follow-up questions. They mostly uncover ‘who’ and ‘what’. Qualitative questions will provide detailed information on the topic of interest, uncovering the ‘why’ and ‘how’.

Examples of quantitative questions:

  • Numerical answers: How many coffees do you drink a day?
  • Preferences: What type of coffee drink do you prefer?
  • Single word answers: What brand of coffee do you drink?

The quantitative nature of these questions is not immediately obvious; you can tell by the low complexity of the data they gather. If you ask these questions of participants, you will get a straightforward answer. However, the responses are not statistically valid on their own and require further investigation. In an in-depth one-on-one session, your time is better spent asking qualitative questions such as:

Examples of qualitative questions:

  • Recount your morning routine.
  • Why do you prefer one brand over another?
  • Why do you drink coffee every day?

Shifting to why and how people do things, outlining goals, motivations, pain points and delights gives a much more in-depth perspective. These insights can be validated later through other techniques, but interviewing is the quickest and easiest way to gather them.

For qualitative interviewing, there are few clear best practices. Each interviewer has their own way of gathering information and forming questions. The tips above are there to guide you but are not definitive rules that one cannot break. I hope these help to elevate your interviewing process and gather better insights.


John Lassen

John started his career in market research and marketing where he constantly championed the experience of the users and customer journeys. John jumped into the profession of UX Research because of the ability to create products and understand users on a qualitative level. As an advocate of design thinking, he is constantly in touch with users, creating strategic outputs and reinforcing the business value of research.



TPR Teaching


Open-Ended Questions: +20 Examples, Tips and Comparison

Open-ended questions allow for a wide range of responses, unlike closed-ended questions with limited response options. They are often used in surveys or interviews to gather qualitative data, providing more detailed and insightful information than closed-ended questions. 

Let’s explore the definition, purpose, and benefits of open-ended questions and tips for crafting and asking effective ones.

See next: Close-Ended Questions: Examples, Tips, and When To Use

What Are Open-Ended Questions?

An open-ended question encourages a full, meaningful answer using the subject’s knowledge, experience, attitude, and feelings. 

A good open-ended question should be broad enough to invite thoughtful responses yet specific enough to provide direction. It should avoid leading the respondent to a particular answer, eliminating bias. 

Additionally, the language should be simple and clear to ensure understanding and comfort for the respondent. Lastly, it should be relevant and purposeful to align with the overall objectives of the survey or interview .

Characteristics of Effective Open-Ended Questions

  • Non-leading question
  • Relevant to the topic or issue discussed
  • Should allow for a variety of free-form responses
  • Cannot be answered with a “yes” or “no,” “true” or “false” response

Examples of Open-Ended Questions

Here are some examples of open-ended questions that can be used in surveys, discussions, or interviews:

  • What do you believe are the biggest challenges facing our society today?
  • Can you describe a time when you felt most fulfilled in your work?
  • How has your perspective on [topic] evolved over time? 
  • What does [concept] mean to you?
  • How do you see technology shaping our future?
  • Can you share a personal experience that has shaped your values and beliefs? 
  • In your opinion, what are the most important qualities of a leader?
  • What factors do you consider when making important decisions? 
  • How can we better address issues of diversity and inclusion in our community? 
  • What impact do you think [policy/decision] will have on our environment?

Tips for Crafting Effective Open-Ended Questions

Crafting effective open-ended questions requires careful consideration of the wording and structure. Here are some tips to keep in mind:

Start With “What,” “How” or “Why”

Beginning a question with “what,” “how,” or “why” encourages the respondent to think critically and provide a detailed answer. 

Other words and phrases commonly used to open an open-ended question include “describe,” “tell me about,” and “what do you think about…”

Avoid Leading or Loaded Questions

Leading questions or loaded questions can bias the respondent towards a certain answer and limit the range of responses. It is important to avoid using language that suggests a preferred or expected answer.

For example, instead of asking, “Don’t you agree that…?” a more effective open-ended question would be, “What is your opinion on…?”

Use Simple and Clear Language

Open-ended questions should be easy to understand and answer. Using complicated or technical language can confuse the respondent and result in incomplete or inaccurate responses.

Be Specific and Direct

It is important to be specific with open-ended questions to gather relevant and useful information. Avoiding broad or vague questions can help elicit more focused and detailed responses.

Consider the Order of Questions

The order of questions in a survey or interview can impact the responses. It is often best to start with closed-ended questions before moving on to open-ended ones, as this can help warm up and engage the respondent.

For instance, in the initial stages, closed questions can be helpful to gather information about customer demographics such as age, marital status, religion, and so on.

Tips for Asking Open-Ended Questions

When asking open-ended questions, here are some other tips to keep in mind.

Appropriateness

Decide if an open-ended question is necessary or appropriate for the situation. 

Consider the purpose of the question and evaluate whether a closed-ended question may be more suitable.

Don’t Ask Too Many Open-Ended Questions

It is important to balance open-ended and closed-ended questions to gather relevant information effectively.

Too many open-ended questions can overwhelm respondents and lead to incomplete answers, or they might abandon the survey altogether.

Consider using a mix of open-ended and closed-ended questions to gather both detailed responses and specific data or statistics.

Change Close-Ended to Open-Ended Questions

Sometimes, a closed-ended question can be rephrased to elicit more detailed, open-ended responses.

For example, instead of asking, “Do you like our product?” which only allows for a yes or no answer, you could ask, “What do you like or dislike about our product?”

Listen Carefully to Responses

When using open-ended questions, it is important to listen actively and take note of the responses. This can provide valuable insights and help identify areas for improvement or potential new ideas.

Allow Time for Responses

Open-ended questions require more thought and reflection, so give respondents enough time to formulate their responses. Avoid rushing them or interrupting them before they have finished speaking.

Advantages of Open-Ended Questions

Open-ended questions offer several advantages over other question forms. Here are some key benefits:

1. Encourages Thoughtful Responses

Open-ended questions require the respondent to think and provide a more detailed answer rather than simply selecting from a list of predetermined options.

This allows for more thoughtful and insightful responses, providing a deeper understanding of the subject matter.

2. Allows for Individual Perspectives

Since open-ended questions do not limit the response options, they allow individuals to express their unique perspectives and experiences.

Open-ended questions can provide diverse answers and a more holistic view of the topic.

3. Provides Rich and Detailed Data

The open-ended nature of these questions allows for a wider range of responses, providing richer and more detailed data compared to closed-ended questions.

This can be especially useful in qualitative research and allows researchers to uncover deeper meaning and understanding.

4. Promotes Engagement

Open-ended questions often require the respondent to provide longer answers, which can promote engagement and interest in the topic being discussed.  

Open Ended Questions Vs. Close Ended Questions

Closed-ended questions can be answered with a simple “yes” or “no” or are limited to a predetermined set of options. Unlike open-ended questions, they do not allow respondents to expand on their answers or provide additional information. 

Common closed-ended questions include multiple-choice, ranking scale, or binary questions (yes/no). 

These questions are often used in quantitative research, where the objective is to gather statistical data. For instance, they are commonly found in surveys where data needs to be analyzed swiftly and uniformly. 

These questions provide a straightforward way for researchers to categorize responses and draw conclusions from the data. However, they may not offer the depth and nuance of information that open-ended questions can provide.

Examples of Close-Ended Questions

As mentioned, closed-ended questions are useful for gathering specific data or statistics. Here are some examples:

  • Do you agree or disagree with the statement?
  • On a scale of 1 to 10, how satisfied are you with your job?
  • What is your age?
  • Which of the following options best describes your educational background?
  • How much is your monthly phone bill? Select from the range below.
  • Have you ever used our product/service before? 
  • Would you consider using our product/service again?
  • Do you prefer [option A] or [option B] for [specific situation]?
  • Were you happy with your purchase?
  • Was this helpful?

Open Ended Questions to Ask Your Customers or Clients

  • What do you value most in a product/service?
  • How has our product/service improved your business?
  • Can you explain how our product/service has helped you achieve a specific goal? 
  • What improvements or changes would you like to see in our product/service? 
  • How likely are you to recommend our product/service to others ? Why or why not?
  • Can you share a specific experience or interaction with our brand that stands out in your mind? 
  • What do you think sets us apart from our competitors? 
  • Can you describe a time when our product/service exceeded your expectations?
  • How has using our product/service impacted your daily routine or workflow?  
  • In your opinion, what is the biggest benefit of our product/service? 

Open-ended questions are valuable for gathering qualitative data and gaining deeper insights into a topic. They offer several advantages over closed-ended questions, including promoting engagement and providing rich, detailed data.

By following the tips provided, researchers can craft practical, open-ended questions that elicit thoughtful and meaningful responses from participants.  


Caitriona Maria is an education writer and founder of TPR Teaching, crafting inspiring pieces that promote the importance of developing new skills. For 7 years, she has been committed to providing students with the best learning opportunities possible, both domestically and abroad. Dedicated to unlocking students' potential, Caitriona has taught English in several countries and continues to explore new cultures through her travels.



Open-ended Questions Vs. Closed-ended Questions In User Research

Simbar Dube

So you have decided to conduct a usability test for your product to understand it from the users’ perspective. You then realize that you have to come up with relevant questions to ask participants.

“ That sounds like a piece of cake ,” some may presume. 

Writing effective usability questions seems deceptively easy, but the harsh reality is that it’s not as simple as it looks. There are quite a number of mistakes to avoid.

How you phrase your questions will directly impact the quality and value of your user testing results. Ask your participants the wrong questions, or use the wrong words to structure the questions, and you will gather incorrect feedback. Wrong questions can contaminate the whole research effort, leading to misleading quantitative and qualitative data.

Before you come up with any usability questions, think carefully about what you intend to learn from the test. Begin by asking yourself this: what information do I need from this research?

Taking time to answer that question will help you rule out the wrong directions you might otherwise head in before getting to the qualitative or quantitative data you need. The ultimate goal could be as simple as finding out if users will click on your search result listing.

When you have a clearly defined goal, writing usability questions isn’t a task that is hard to do.

By default, your usability questions could either be open-ended questions or close-ended questions.


Whenever we conduct any user research at Invesp, our conversations with our participants have a natural rhythm.

We ensure this by using open-ended questions and closed-ended questions in unison.

This article will explore open-ended questions and close-ended questions in great detail, focusing on how to write them and when to use them. 

What are open-ended questions? 

An open-ended question is what it is: a question that is open to any answer. In the context of user research, open-ended questions are questions that do not limit users to one- or two-word answers. Instead, they have multiple potential responses, and they often give room for further probing by the moderator.  

Open-ended questions are versatile in nature, and they prompt users to describe their feelings and thoughts in their own voice. In this regard, the Digital Marketing Evangelist for Google, Avinash Kaushik , says:

The greatest nuggets of insights are in open ended questions because it is the Voice of the Customer speaking directly to you (not cookies and shopper_ids but customers).

There is something about asking open-ended questions that makes participants feel more comfortable during a usability test . People, in general, tend to open up and express themselves better when they are given room to answer in their own words.


For example, rather than asking, “Was the new feature easy to use?” you can try something like, “How would you describe your experience of using the new feature?”

The most common response to the first question would be “Yes, it was” or “No, it wasn’t,” and there would be no way for you to understand the context behind the user’s response. The second question, however, allows users to respond freely, and there is a high chance of getting unique answers that you might not have anticipated.

Open-ended questions are ideal for starting and holding a conversation in any circle. They empower users, giving them 100% control of what they intend to say —and this is something that cannot be done using closed-ended questions.

Tips for writing effective open-ended questions.

Coming up with the right kind of questions requires practice. There’s an art to asking questions that prompt people to think before giving a response. Unless a question is open-ended, it is unlikely to motivate users to give a detailed answer. Here are a few tips that will help you write effective open-ended questions.

1. Begin your question with how, why, and what

What makes a question open-ended is the wording of the sentence. For an effective open-ended question, start the question with words such as how, what, why, and can. This way, you give your users freedom to say more, and in the process, there is a possibility of uncovering rich insights. 


Avoid using more specific words such as did, would, which, when and was—these usually prompt one-word answers.

For example, a question like “Which part of your experience was unsatisfactory?” does not prompt users to give in-depth details about what they found unsatisfactory with the application. Instead, you can prompt users to reflect on their experience and give a more insightful response by asking, “What challenges did you face during your experience with XYZ?”

You can often attain a more precise answer simply by changing the wording of your question in this way.

2. Clarity and Analysis

Ensure that your question requires users to be analytical and to clarify their points. One defining trait of an open-ended question is its ability to prompt users to put more thought into their responses.

Analytical questions do not require users to generalize their answers. For instance, in conversion optimization research, you might ask your participants these questions:

  • What is the significance of a certain element in a website?  
  • How important is the new feature on the site?
  • Why did you choose to use this service/product?

All these three questions are different, but they all have one thing in common: they would require participants to be clear and describe their answers in more detail.

Because they motivate users to give clarity in their responses, open-ended questions can also be used to ask participants to verify or expand on their answers, especially when your previous question was closed-ended.

Suppose you ask this close-ended question: “Did you find the product you were looking for?” You can then verify the given answer by asking this follow-up open-ended question: “Why were you looking for that product?”


3. Avoid leading participants to a certain answer

In any research approach, open-ended questions are asked so as to elicit valuable insights from users, not to confirm the moderator’s existing beliefs. So, if your questioning subtly prompts users to answer in a certain way or gives hints at the expected answer, then you need to revise the phrasing of your questions. 

The wording of the question shouldn’t be suggestive of any answers to the participants as this biases the users into giving a predetermined answer. 

Let’s say you ask this question: “ Which feature made you visit our site ?” 

The problem with this question is that it already suggests an answer for the users. It implies that it was a feature that made users visit the site. Come to think of it, what if it wasn’t a feature but a service that lured the users to the site?

Examples Of Open-Ended Questions.

You can use these sample questions as conversation starters and to encourage your participants to explain more.

1. How satisfied or dissatisfied are you with this process?

2. What would (did) you expect to happen when you … ?

3. How did you go about finding it?

4. How would this fit into your work?

5. How might this change the way you do that today?

6. What do you think about that?

7. What kinds of questions or difficulties have you had when doing this in the past?

8. What happened when you did this before?

9. Please describe your level of experience with …

10. What’s most confusing or annoying about … ?

11. What worked well for you?

12. How do you know … ?

13. How do you normally … ?

14. What just happened?

15. What was that?

16. What would you most want to change about … ?

17. Which things did you like the best about … ?

18. What were you expecting?

How Do You Ask Open-Ended Questions In UX Research?

Open-ended questions in UX research are used to gather qualitative data and gain insights into users’ thoughts, feelings, and behaviors. 

These types of questions allow participants to provide detailed and unstructured responses rather than simply choosing from a set of pre-determined options.

To ask open-ended questions in UX research, you can use prompts such as:

  • “Can you tell me about a time when you used a similar product or service?”
  • “What do you like/dislike about the current design?”
  • “How would you describe your experience using this feature?”
  • “What would you change about this feature if you could?”
  • “Can you walk me through your thought process as you complete this task?”

When asking open-ended questions, it is important to create a comfortable and non-threatening environment for participants and to listen actively, probing for more information as needed. Additionally, it is important to avoid leading questions and to keep each question open-ended to gather unbiased answers.

When To Use Open-Ended Questions.

In some situations, the only way to get valuable insights is to give respondents some sense of control over the conversation by allowing them to answer in their own words. There is a high chance of bumping into something completely unique and valuable if you allow users to have the freedom to express themselves.

1. In sales 


SalesHacker made an interesting observation about interactive discussions prompted by open-ended questions: when you have a conversation with your potential customers and they talk for at least 30% of the time, your sales conversions will likely increase. But if they talk for less than 30% of the time, your sales conversion rates will drop.

Open-ended questions not only tend to increase sales; they can also help you:

  • Explore the needs of your customers.
  • Get a better idea of what your customers think about your product.
  • Foresee and minimize risks.
  • Trigger meaningful and insightful conversations with your customers.
  • Discover new opportunities.
  • Build good rapport with your customers.

However, not all open-ended questions are good. An article by Business 2 Community gave examples of “bad open-ended questions” that won’t work well in sales:

  • How much are you willing to spend?
  • What is your worst pain?
  • What kind of goods or services are you ready to pay for?
  • What don’t you like about our service?

2. Open-ended questions in conversion optimization research

An effective CRO program is not only about getting tactics right and testing this and that; it’s also about knowing the mindset of your customers. You first have to see your product or service through the customer’s lens to deliver the product or services they desire.


One way of getting into the customer’s head is by asking open-ended questions.

In this regard, JeremySaid put out a handy list of questions you can ask your current customers when you intend to increase your conversion rate. Here are some of the open-ended questions they recommended: 

  • How was your overall experience? 
  • Why are you here today? 
  • What about this product/service stood out to you? 
  • What do you know about our company? 
  • What would you like to know about our company? 
  • What problems have you experienced in the past with similar products? 
  • What would you like to see us do more online?  

The more comfortable your customers feel, the more they will reveal about what drove them to consider purchasing your product. So, whether you are conducting a usability test, focus group, customer interviews, or surveys, keep your customers talking and use the information to your advantage. And the best way of doing this is by opening up the conversation.

What Are Closed-Ended Questions (With Examples)

A close-ended question is one that restricts participants to a set of predefined answers. It aims to get precise, clear-cut answers without leaving any room for users to express themselves.

According to Wikipedia : 

A close ended question refers to any question for which a researcher provides research participants with options from which to choose a response. Close ended questions are sometimes phrased as a statement which requires a response. A close ended question contrasts with an open ended question, which cannot easily be answered with specific information .


Asking closed-ended questions will give you specific answers, aka quantitative data. Do they want to purchase your product? Are they shopping around with your competitors for the same service? The answers are simple and direct.

Although close-ended questions provide limited insights, that doesn’t make them any less important. In most cases, close-ended questions are used in a quantitative research approach where insights gathered are numerical. 

Close-ended questions have their place in user research, and they are wonderfully effective in guiding participants into giving certain answers. 

For example, you conduct a usability test to determine if your app store listing will convert well. So you ask your participants this question: 

Which of the following pieces of information made you download this app on the Play Store?  

  • Screenshots 
  • Customer reviews 

In the example above, the question eliminates any element of surprise by setting boundaries for the participants’ responses. Participants are not expected to give an answer outside the set of predefined responses.

So, if close-ended questions do not require participants to express themselves, then isn’t that a disadvantage? Well, it is. Respondents are biased into responding in a certain way. But sometimes, it’s necessary to use these types of questions in user research as they make it easier and quicker for respondents to answer . 


Although close-ended questions have different forms, they all have this in common: they are similar in the kind of answer they draw out from the respondents — clear-cut answers.

Specific questions

Specific questions are precise and clearly defined, and they leave no ambiguity as to the intended meaning. At times, they come as multiple-choice questions that consist of two sections: (1) the stem, which is the question itself, and (2) a list of response alternatives from which respondents select an answer.

Example: Suppose you want to evaluate your marketing channels and find out on which platform your brand is most visible. So you ask your participants:

How did you first learn about our product/website? 

You would then give participants a set of alternative responses to choose from, covering the marketing channels you want to evaluate.

Specific or multiple-choice questions are considered the most versatile type of question. In user research, you can use them to discover facts or to gain an understanding of user behavior.

Implicit questions

If anything is said to be implicit, it means that it is not directly pointed out but is somehow suggested in the statement. So an implicit statement is an expression that prompts a certain reaction.

With that said, what then is an implicit question ? 


An implicit question can be defined as a leading question that gives hints about the type of answer needed. Think of it as a leading question that pushes participants to respond in a specific manner.

Example: If you ask users this question: How many times do you visit our website?

Using “how many times” in the question implies that the participants have visited the website before. There is an element of assumption here, and participants are persuaded to give a numerical answer.

If the question was to be phrased as a direct question, it would have been: 

Have you ever visited our website? 

This second question doesn’t influence the participants’ responses, which means it doesn’t cultivate any bias in respondents.

Tips For Writing Effective Close-Ended Questions

Close-ended questions should not always be thought of as simple questions that anyone can easily answer merely because they do not require a detailed answer.  

1. Begin sentences with Where, Which, When, Did 

To make a question close-ended, there is a certain way you should phrase it. In his book Conversationally Speaking, Alan Garner suggests beginning close-ended questions with words such as “where,” “which,” “when,” and “did.”


Using these words, here’s a list of some examples of close-ended questions you can use: 

  • Are you happy with your experience when using our site? 
  • Would you recommend our product/service? 
  • Did you face any challenges when you were using this website?
  • Which elements on our mobile applications were easy to use? 

None of these closed-ended questions prompt participants to give detailed answers. They all can be answered with a one-word answer, as they aim to find out the ‘what’ and not the ‘why’.

2. Be clear and simple 

Needless to say, when you ask a clear and simple question, you allow the possibility of a clear-cut answer. So the starting point is to remove extra verbiage that may end up distracting or confusing the respondents. 

Good example: Would you recommend our website?

Bad example: You have used the website for more than 10 minutes, you have visited all the pages and clicked on everything, so does that mean you will recommend our website to other people?

Once you frustrate users with wordy questions, you risk compromising the value of your feedback. Here’s a list of clear and simple close-ended questions : 

  • Did you experience good customer service? 
  • Would you consider using our product or service again? 
  • Did you like our product or service? 
  • What product or service were you looking for today? 
  • Are you happy with your experience with us? 
  • Did you find what you were looking for today?   


Clear and simple close-ended questions not only make it easy for users to infer the intended meaning; they also make the given answers easier for you, as a moderator, to understand without any hassle.

3. Relevant answer choices 

If you intend to use multiple-choice questions, then make sure that your suggested answers are plausible. Participants usually have different experiences even after using the same product, so you should offer several alternatives that cover the most likely answers.

Use at least four alternatives in each multiple-choice question to give users enough variety.

When to use close-ended questions.

Generally, an online poll can include both close-ended and open-ended questions, as long as they require only short feedback. But to give users a simple experience, you can use close-ended questions, since they are easy to answer and do not require a detailed response. Because they require limited answers, Susan Farrell from the Nielsen Norman Group says this:

Closed ended questions are often good for surveys , because you get higher response rates when users don’t have to type so much. Also, answers to closed ended questions can easily be analyzed statistically, which is what you usually want to do with survey data.

Here’s an example of one of the close-ended questions we use in FigPii polls. The question asked users the reason for their visit to the site and allowed them to select one answer from four alternatives.


Such multiple-choice close-ended questions have higher completion rates because users do not have to come up with their own responses. Although the answers provide only a general sense of user sentiment, you can always follow up with an open-ended question to see things from the users’ perspective.

For instance, you can ask users to further elaborate on their answers by asking this as a follow-up:

What is the most important feature of our product/service for you? This way you can understand the context behind the users’ decisions.

Considering that they don’t demand much explanation from respondents, close-ended questions are perfect for quantitative usability research, where you need to measure usability metrics such as task completion rates, error rates, and post-task satisfaction.

Insights gained using close ended questions allow researchers to categorize respondents based on the answers they have selected. How so? 

Let’s say you have an online store, and you need to know the demographics of people who visit(ed) your site and left without completing a purchase. To decipher this demographic information, you can conduct an online survey that asks these close ended questions: 

Question 1: Can you please specify your gender?

Question 2: Which age group are you in?

  • 18-24 years 
  • 25-34 years
  • 35-44 years
  • 45-54 years
  • 55-64 years

Question 3: What is your annual income range?


This knowledge would help you target the right kind of marketing campaign to the exact customers you’d want to attract. 
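As a rough illustration of that categorization step, the Python sketch below uses invented data and column names (nothing here comes from a real survey); it counts respondents per age group and cross-tabulates age group against gender, which is the kind of breakdown you would use to target a campaign.

    import pandas as pd

    # Hypothetical survey results; in practice this would come from your survey export.
    responses = pd.DataFrame({
        "gender": ["Female", "Male", "Female", "Female", "Male"],
        "age_group": ["18-24 years", "25-34 years", "25-34 years", "35-44 years", "18-24 years"],
        "income_range": ["Under $25k", "$25k-$50k", "$50k-$75k", "$25k-$50k", "Under $25k"],
    })

    # Count respondents in each age group among those who left without purchasing.
    print(responses["age_group"].value_counts())

    # Cross-tabulate two close-ended answers to spot patterns across segments.
    print(pd.crosstab(responses["age_group"], responses["gender"]))

Because every answer comes from a fixed set of options, this kind of tally is trivial to produce, which is exactly the appeal of close-ended questions for quantitative analysis.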

Similarly, the Nielsen Norman Group gave a list indicating when to use close-ended questions. Here are the situations where you should use this type of question:

  • In quantitative usability studies, where you are measuring time on task and error rates, and you need to compare results among users
  • In surveys where you expect many respondents (1,000+)
  • When collecting data that must be measured carefully over time, for example with repeated (identical) research efforts
  • When the set of possible answers is strictly limited for some reason
  • After you have done enough qualitative research that you have excellent multiple-choice questions that cover most of the cases. 

Final Thoughts

Whenever you hear any CRO consultant or agency saying that they will fish out all the ‘ barriers that inhibit conversion’ on your site, all they mean is that they will ask relevant questions until they achieve better results. 

In the CRO business, assumptions can ruin what might have been a good relationship with your customers. Foster a culture of asking questions; after all, valuable knowledge is attained by asking questions.

So, whether you decide to use open-ended or closed-ended questions, make sure you can answer yes to the following before writing any of them:

  • Do your participants have relevant prior knowledge needed to respond to your questions?
  • Does your question address one of the important aspects that your users may have experienced during the course of the research? 
  • Is your question clearly outlined, using the appropriate language that can be easily interpreted by your customers/users? 
  • Are your questions grammatically correct? 



How to analyze open-ended questions in 5 steps [template included]


Open-ended questions  are great for getting authentic feedback because they give people a chance to describe what they’re experiencing in their own voice. Analyzing such  survey questions  yourself is an excellent opportunity to empathize with your audience, gather essential insights, and make the right decisions.

But you may be wondering...

How do you efficiently analyze more than 100 replies? Or even 1,000?

Here’s a system we use at Hotjar to categorize and visually represent large volumes of  qualitative data —and it’s easier than you might think! You’ll have to work with the technique a bit before you become comfortable with it, but once you get it, you’ll be sorting through mountains of qualitative data in no time.

What you’ll need:

Working knowledge of spreadsheets (Google Sheets or Excel)

A quiet space with some uninterrupted focus time

Hotjar’s open-ended question analysis template


To help you learn this technique,  we created a data sample that you can download  and use to follow along.

Now let’s begin…

Table of contents

Step 1: Get your data into the template

Step 2: Identify response categories

Step 3: Record the individual responses

Step 4: Organize your categories

Step 5: Represent your data visually

Step 1: get your data into the template

1) Export the data  from your survey or poll into a .CSV or .XLS file.


2) Copy the data  from your .CSV or .XLS file and paste it into the sheet ‘CSV Export’ of the template.


🏆   Pro tip : use 'Paste special' to paste 'Values Only' in the Hotjar analysis template, so no formulas or formatting are copied over.

This is what your data should look like after being copy-pasted into the ‘CSV Export’ sheet

3) Copy the column from the ‘CSV Export’ sheet  containing the open-ended question you want to  analyze  first and paste it into the ‘Question 1’ sheet, in the cell marked with < Paste answers to first open-ended question here >.

Your open-ended answers should look like the above after being copy-pasted into the ‘Question 1’ sheet

4) Choose wrap text for the entire column , so the data fits the column width and is easier for you to read later on.

The 'wrap' function is available from the main menu in Google Sheets
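If you'd rather follow along in code than in a spreadsheet, this first step can be sketched in Python with pandas. The file name 'survey_export.csv' and the column name 'answer' are placeholders for illustration only; your export will use its own names.

    # A minimal sketch of step 1: load the survey export and keep the open-ended replies.
    # 'survey_export.csv' and the 'answer' column are hypothetical names for illustration.
    import pandas as pd

    df = pd.read_csv("survey_export.csv")          # the exported survey data
    answers = df["answer"].dropna().astype(str)    # only the open-ended responses
    print(f"{len(answers)} responses loaded")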

Step 2: identify response categories

A response category is a set of replies that can be grouped because they are part of the same theme, even if they’re worded differently.

In the sample dataset we use for this tutorial, we asked Hotjar customers to explain how their employer measures their performance (e.g., revenue, conversions, traffic). In theory, you could go through every answer to identify your response categories one-by-one, but that wouldn’t be very efficient. Instead, we’re going to use a series of techniques that help you identify the broad categories.

A) Use a text analyzer:  text analyzers take your data and analyze it for the most commonly used words in your text, which helps you identify broad categories of responses.

🏆  Pro tip :  Textalyser  is a simple, free resource that does this well.


If you do this with the sample data we’ve provided above, you’ll find that  ‘sales,’ ‘conversion,’ and ‘traffic’ are some of the most commonly used words in the data set:

‘Sales,’ ‘conversion,’ and ‘traffic’ are some of the most commonly used words in the data set and could be used as response categories

As such, they represent some of the most popular replies to the question we asked. They don’t represent  all  the answers, of course, but they’re a good place to start when building the list of response categories.

Add each category to the top of a separate column (replacing the text that reads 'Response Category 01,' 'Response Category 02,' etc.):


Note : some of the popular words in our text analyzer mean the same thing (e.g., 'sales' and 'revenue'), so you’ll want to create a single category for those responses called 'Sales/Revenue.' Other popular words will NOT become categories because, as stand-alone words, they tell us nothing useful (e.g., 'our,' 'rate').
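If you prefer to do the word counting in code, here is a rough Python sketch of the same idea. It is not Textalyser's actual method, and the stop-word list is only illustrative; 'answers' comes from the earlier loading sketch.

    # Count the most common words across all responses to surface candidate categories.
    import re
    from collections import Counter

    stop_words = {"our", "the", "and", "rate", "for", "with", "are", "then"}
    word_counts = Counter()
    for response in answers:
        for word in re.findall(r"[a-z']+", response.lower()):
            if word not in stop_words and len(word) > 2:
                word_counts[word] += 1

    # The top words ('sales', 'conversion', 'traffic', ...) suggest response categories.
    print(word_counts.most_common(10))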

B) Sort your responses alphabetically:  when you sort alphabetically, you’ll notice that specific patterns emerge, and you can create more categories based on the trends you spot.

In Google Sheets, select the range, right-click, and sort it alphabetically

In our sample data, every sentence beginning with the word 'Revenue' gets grouped when you sort alphabetically. Of course, we already have a category for 'Sales/Revenue,' so there’s no need to add that category in this case—but grouping the data alphabetically will allow groups to stand out.

Sorting responses alphabetically helps you to uncover themes easily

Alphabetical sorting will also draw your attention to certain stand-alone responses. For example, someone replied 'Huh?' and another person told us they didn’t understand the question. This information allows us to add a new category called 'Didn’t understand the question.'


Scan the alphabetically sorted responses for other categories, such as 'It’s not measured,' 'Traffic,' 'Conversions,' etc. Be on the lookout for synonyms, but don’t worry if you create a few redundant categories for now. You will combine the categories that mean the same thing at the end.
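The alphabetical sort can also be reproduced in code. Here is a small sketch, building on the earlier pandas example, that sorts the responses case-insensitively so similarly worded replies sit next to each other:

    # Sort responses alphabetically (ignoring case) so recurring themes stand out.
    # 'answers' comes from the earlier loading sketch; requires pandas 1.1+ for the key argument.
    sorted_answers = answers.sort_values(key=lambda s: s.str.lower())
    for response in sorted_answers.head(20):
        print(response)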

Step 3: record the individual responses

1) Place a '1' in each cell  where a response (the row) matches a category (the column) to identify a positive response in each category. Add categories as you go.

For example, our alphabetically sorted sample data includes a response that simply reads 'Huh?' If you added 'Didn’t understand the question' to Column E, you’d place a '1' in Column E for that row (cell E36 in our template).


Note:  In our example, many respondents indicated that their performance was measured by multiple factors (e.g., lead gen + sales + customer satisfaction). Be sure to place a '1' in each matching category. In other words, the row for the single answer 'Revenue, then conversion rate, then traffic' will record three positive responses.


When you input your first '1,' the cell in Row 3 (below the category) will change to indicate the number of positive responses in that category. Row 4 will change from a '#DIV/0' error to the percentage of responses that fall into each category.

2) Use the 'Find' feature to search for words related to each category:  begin with the first category (in our example, that’s 'sales') and search the data column for any response that mentions 'sales.' Read the entire response to ensure it fits the category you searched for, then place a '1' in the appropriate column for that response.


3) Fill in the gaps:  read each row that hasn’t been categorized and place a '1' under the appropriate category, creating new categories as necessary. As you create new categories, search your data for those terms to quickly find similar responses.

⚠️ Important :  when adding a new category as you go through the responses, make sure to retroactively check previous answers that might fit in this new category.

As you create new categories and fill in the gaps, some interesting trends will start to appear
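Here is a rough Python sketch of the same tagging logic: it places the '1's automatically by keyword matching and leaves unmatched rows for a manual read. The keyword lists are assumptions based on the sample data described above, not a definitive mapping.

    # Put a 1 under every category whose keywords a response mentions.
    # 'answers' comes from the earlier sketches; the keywords are illustrative only.
    categories = {
        "Sales/Revenue": ["sales", "revenue"],
        "Conversions": ["conversion"],
        "Traffic": ["traffic"],
        "Lead Gen / Form Submissions": ["lead gen", "form"],
        "Didn't understand the question": ["huh", "don't understand"],
    }

    tagged = pd.DataFrame(0, index=answers.index, columns=list(categories))
    for name, keywords in categories.items():
        pattern = "|".join(keywords)
        tagged[name] = answers.str.lower().str.contains(pattern, regex=True).astype(int)

    print(tagged.sum())                              # responses per category (the sheet's Row 3)
    unmatched = answers[tagged.sum(axis=1) == 0]     # these still need a manual category
    print(f"{len(unmatched)} responses left to categorize by hand")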

Step 4: organize your categories

1) Group your data:  you will almost certainly find responses that belong together but ended up in different categories because respondents used different words to describe the same concept. In our sample data, we found the terms 'Lead Gen' and 'Form Submissions,' which belong in the same category.

Drag these columns next to each other, and apply a color (any color) to the group of columns you plan to merge—this marks them as a group so you can return to them in a bit when it’s time to combine them. Repeat this step for each set of categories you plan to join.

Column K is dragged next to Column H because both response categories are related

Add a new column to the left-hand side of each group. For example, with 'Lead Gen' and 'Form Submissions,' you’ll create a new category called 'Lead Gen / Form Submissions,' add up the Row 3 totals for the two old categories, and enter the new total under the new group. Copy and paste the percentage formula from any Row 4 cell, then delete the old categories.


⚠️ Important : when merging multiple categories, make sure to re-add the '1s' under the newly merged category, or you run the risk of losing your data.

Repeat this step for every group you plan to merge.

2) Arrange your categories from large to small:  sort the categories in descending order from left to right. For those that contribute only a small percentage of the total (2% or less), use the grouping method above to merge them into one category called 'Others,' which you'll leave on the far right.

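If you are following along in code, the merging and 'Others' grouping can be sketched like this, continuing from the tagging example above:

    # Total each category, then fold anything contributing 2% or less into 'Others',
    # which stays on the far right. 'tagged' and 'answers' come from the earlier sketches.
    totals = tagged.sum().sort_values(ascending=False)
    share = totals / len(answers)

    small = share[share <= 0.02].index
    if len(small) > 0:
        others_total = totals[small].sum()
        totals = totals.drop(small)
        totals["Others"] = others_total

    print((100 * totals / len(answers)).round(1))    # percentage of respondents per category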

Step 5: represent your data visually

1) Prep your data to create a bar chart.  First, select and copy the top three rows of your spreadsheet (those that make up the 'Response Categories,' 'Total respondents who answered X,' and '% respondents who answered X').



Paste them into the ‘Graph Question 1’ sheet using the 'Paste special' feature to paste only the values (so the formulas don’t copy over).


Select and copy the table you just pasted, and choose 'Paste special' again—this time using 'Paste transposed' to invert the rows and columns (this makes your data more chart-friendly).

The table pasted with 'Paste transposed' into cell A9

This is what you should see:

Your table containing categories, the volume of responses, and percentages should look like the above

2) Create your chart:  insert your chart, selecting the percentage column as your 'Series' and the categories as your 'X-axis.' Resize the chart however you see fit.

Your open-ended answers are now visualized in a graph

And there you have it—a visual representation of your data! Feel free to experiment with different formats if you’re putting the chart into a formal presentation.
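If you have been working in Python rather than the spreadsheet, a matplotlib sketch of the same bar chart looks like this, assuming 'totals' and 'answers' from the earlier sketches:

    # Draw the category percentages as a bar chart, mirroring the spreadsheet chart.
    import matplotlib.pyplot as plt

    percentages = 100 * totals / len(answers)
    plt.figure(figsize=(8, 4))
    plt.bar(percentages.index, percentages.values)
    plt.ylabel("% of respondents")
    plt.title("How does your employer measure your performance?")
    plt.xticks(rotation=30, ha="right")
    plt.tight_layout()
    plt.show()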

Analyzing open-ended questions efficiently and empathizing with your audience take some practice, but the more you do it, the easier it becomes. Your mind will begin to recognize patterns the more you practice this technique, so don’t be afraid to dive into it.

Hotjar's open-ended question analysis template

Want to efficiently analyze a large volume of qualitative data? Get our Google Sheets/Excel template to get started.


The Ultimate Guide to Open-Ended Questions vs. Closed-Ended Questions


For stronger connections,  better insights , and more business, experts recommend one conversational tool above all in the demo or discovery phase: open-ended questions. Profile writers use them all the time to elicit thoughts and anecdotes from their subjects.

Smart marketers also use them to maximize authentic engagement with new business leads and current clients. However, asking open-ended questions well takes method and skill… and part of that skill is recognizing and leveraging the equally important benefits of closed-ended questions.

In this article, we’ll go over the best habits to get into for asking open-ended questions, when to use closed-ended questions instead, scenarios when you might need to use both, the different ways they impact data collection , and some examples of open versus closed questions as used in marketing, sales, and content interviews.

But first, a little teaser of examples for each approach…

Examples of open-ended questions:

  • Where would you like your business to grow from here?
  • What would success look like to you?
  • What campaigns are out there right now that caught your eye, and for what reasons?
  • What are a couple of day-to-day practices of yours that people can implement for greater success/fulfillment in their own lives?
  • Can you give me a few dates for a follow-up call?

Examples of closed-ended questions

  • Are you satisfied with your current sales numbers?
  • What is your #1 goal?
  • Did you like your competitor’s latest campaign/commercial?
  • Where can someone go to learn more about what you do?
  • When would you like to set a follow-up?

What is an open-ended question?

An open-ended question is one that can only be answered by a unique thought or statement in someone’s own words — it cannot be answered in one word, or by yes/no, or by multiple choice. Open-ended questions encourage people to come up with a more thoughtful and filled-out answer incorporating more of their own information and point of view.

People who want to keep an exchange of information and flow of thoughts going with whomever they’re interviewing will generally stick with open-ended questions. These questions encourage interviewees to explore their “why” and to give context to their decisions.

They illuminate the reasoning behind decisions and opinions. In interviews, they help the writer/producer get to know and understand a subject… and then pass that insight along to readers.

Why/when are open-ended questions recommended/important?

They can be used at any time when it’s more important to the interviewer to elicit thoughts and opinions and insights than to get definitive answers.

Situations may include:

  • Informational interviews with business prospects
  • Discovery sessions with potential or new clients
  • Feedback sessions with existing clients
  • Testimonial interviews
  • Interviews for profiles
  • Market research — when you’re trying to gauge people’s perception of a brand
  • Market research — customer insight interviews
  • Customer satisfaction surveys —  solicit people’s opinions

Do’s for crafting open-ended questions:

  • Do start off with “Why…” or “What…” If you fear that even with that opening your question will lead to a succinct answer, build in a request for the interviewee to share their thoughts rather than get straight to the point.
  • Do ask people to explain something.
  • Do ask people for their thoughts on something.
  • Do ask for an example.
  • Do remember, an open-ended question can also be phrased as a statement: “Tell me about a moment when…”
  • Do follow a closed-ended question with an open-ended question — to get exact data, and then an explanation of the data provided.

Don’ts for crafting open-ended questions:

  • Don’t make them so broad that people get confused.
  • Don’t encourage lengthy answers to every question (especially if this is a survey situation).
  • Don’t overuse them and forget to get quantitative data.
  • Don’t make them two-part questions where each part requires its own separate train of thought.
  • Don’t prompt an answer or make any suggestions that could push an answer in a certain direction.


What is a closed-ended question?

We’ve briefly touched upon closed-ended questions just to compare with open-ended ones. Now, let’s define exactly what they are and in what scenarios it’s better to use them.

Closed-ended questions require one specific answer — either a yes/no or a choice between a few options. Sometimes they’re in pursuit of a fact, and sometimes a decision. These types of questions are used to collect quantitative data , which can be mapped out on charts or graphs.

The answers are also used to come up with numerical ratings of how a company is performing or meeting customer expectations. When used by salespeople, closed-ended questions can also be a tactic to assess how cold or warm a lead is, and to move the sales process along.

For interviewers such as writers, closed-ended questions are often used to establish background facts about a topic or person. They can also be used for winding up an exploratory Q+A session with some definitive conclusions.

You see this on reality TV interviews often. One person shares her drama with another cast member, explores the person’s possible motivations, speculates on her intentions, and then the interviewer asks:  Do you trust that person?   No.   Do you still think of her as a friend?   No.

It puts a bow on the conversation and lets viewers know where the storyline is headed.

Why/when are closed-ended questions important to use?

  • When you want to get fast facts or basic biographical details
  • When you need answers to be exact
  • When you are collecting quantitative data
  • When the answer will determine whether or not it makes sense to continue pursuing a lead (especially regarding budget and timeline)
  • When you are setting goals and KPIs that you’ll be expected to deliver against
  • When you’re fact-checking
  • When your legal department is going to want to put information into a contract

Do’s for crafting closed-ended questions:

  • Do begin the question with Have , Will or Do/Did .
  • Do switch up the question structure between yes/no, multiple-choice, rating scale multiple-choice, and fact-based answers.
  • Do create the questions according to what data you need to get from a study, survey, or questionnaire.
  • Do follow (or lead) a closed-ended question with an open-ended question to get both quantitative and qualitative information.

Don’ts for crafting closed-ended questions:

  • Don’t provide a selection of multiple choice answers that’s too limited to cover the full range of possibilities.
  • Don’t assume that everyone will be able to make a yes/no answer based on their experience of something.
  • Don’t attempt to craft complex or two-part questions as you might with an open-ended question.
  • Don’t use this format to explore emotions or feelings.
  • Don’t create a survey or study that is only closed-ended questions; at minimum have an open-ended question at the end of each section that allows people to explain their answers or give context to them.

Open-ended vs. closed-ended questions

Let’s have a look at the different purposes they serve, how they complement each other, what kind of data they garner, and how each can be used in our three scenarios (a sales call, a marketing exercise, a writers’ interview).

  • An open-ended question opens up a topic for exploration and discussion while a closed-ended question leads to a closed-off conversational path. After “Yes” or “No” or the specific one-word answer to the question, the thread is done.
  • Open-ended questions lead to qualitative answers while closed-ended questions lead to quantitative answers.
  • Open-ended questions ask people for their why while closed-ended questions ask people for their decision .

In shopper behavior analysis:

  • Open-ended questions spend time peeling back the layers of why someone feels some way about a product.
  • Closed-ended questions take a person through their buying habits: how often do they buy a product, which brand do they typically buy, have they heard of your brand, do they buy it.

In sales meetings:

  • Open-ended questions help you understand your potential customer better.
  • Closed-ended questions help you realistically decide whether there’s business to close.

In marketing research:

  • Open-ended questions are good for getting customer insights.
  • Closed-ended questions are good for establishing who is a loyal customer and who has little brand awareness or loyalty.

In writing profiles or bios:

  •  Open-ended questions are good for establishing a connection, getting lots of nuanced details, and pulling back the curtain on a person’s life.
  • Closed-ended questions are good for establishing their credentials , hitting biographical details, and fact-checking anecdotes you discovered during preliminary research.

Sample open-ended questions vs. closed-ended questions

10 open-ended vs. closed question set examples for sales professionals

When you’re in sales, open-ended questions are good for understanding more about your customer and opening up a real dialogue. Closed-ended questions are good for getting prospects to let you know whether they have any intentions of signing a contract any time soon.

Sales example 1:

CLOSED : Were you happy with your former [agency/SaaS provider/other competing product or vendor]? OPEN : What was it about your former [competing product/vendor] that has you looking for a new vendor?

Sales example 2:

CLOSED:  Are you satisfied with your current sales numbers? OPEN : Where would you like your business to grow from here?

Sales example 3:

CLOSED : Have you ever executed the kind of project/campaign we specialize in before, either on your own or with a different partner? OPEN : Tell me about a case study or existing campaign/project in the market that is in this category that you really like. It can be one of your own, or another company.

Sales example 4:

CLOSED : (after a product demo) Do you have any questions? OPEN :  We went through a lot of information just now. What part stood out to you the most, either because you loved it or because you’d like a little more time to understand?

Sales example 5:

CLOSED : (after going through prices) Does this fall more-or-less into the budget range you have in mind? OPEN:  Could you tell me how you’d want to customize a scope-of-work or what services would be important to you? That way I can come up with a price quote.

Sales example 6:

CLOSED : What’s your main goal that you’re hoping I can help with? OPEN :  What are your immediate and also your big-picture goals?

Sales example 7:

CLOSED : Are you interested in buying/subscribing to/getting a membership to the product I’ve shown you today? OPEN : Now that we’ve previewed our product/service together, what are you thinking your next step will be?

Sales example 8:

CLOSED : When would you like to set a follow-up? OPEN : Can you give me a few dates for a follow-up call?

Sales example 9:

CLOSED : Do you feel like you got all the information you needed? OPEN : Before we wrap, can you tell me what you’d like to look over again — either here or as an email follow-up?

Sales example 10:

CLOSED : On a scale of 1-10, how would you rate our team’s service up to this point? OPEN : Please share anything specific that stood out to you about the service you’ve received from our team so far.

10 open-ended vs. closed question set examples for marketers

Marketers are constantly interacting with customers, stakeholders, current clients and leads — their lives are an interesting mix of collecting data and fostering connection.

Just look at a social media manager’s day-to-day: Half may be spent analyzing paid campaign results and crunching numbers. The other half may be spent following up on an angry customer’s Facebook tirade or getting people’s permission to use content for UGC.

Today’s marketer needs to be able to flip from analyzing facts to feelings, balance new trends with the tried-and-true, switch from closed-ended to open-ended questions in an instant, and then explain their findings to the non-marketers they work with or hope to work with soon.

Marketing example 1:

CLOSED : Are you satisfied with the quantity and quality of new business leads you’re currently getting? OPEN : What are your thoughts on the new business/lead-gen process at your company as it is now?

Marketing example 2:

CLOSED : What is your #1 goal? OPEN : What would success look like to you?

Marketing example 3:

CLOSED : Have you considered putting your budget toward X channel or tactic? OPEN : What channels and tactics do you feel are important to include in your next marketing plan?

Marketing example 4:

CLOSED : Did you like your competitor’s latest campaign/commercial? OPEN : What campaigns are out there right now that caught your eye, and for what reason?

Marketing example 5:

CLOSED : Which of the four logos shown here is best in your opinion? OPEN : Why did that one stand out to you?

Marketing example 6:

CLOSED : On a scale of 1 to 10, how satisfied were you with the information provided on our website? OPEN : What areas/sections do you think we can improve and how?

Marketing example 7:

CLOSED : Did you like the first version of the video I just sent over? OPEN : If you had a chance to watch the video I sent, what’s your feedback?

Marketing example 8:

CLOSED : What’s your budget for this activation/campaign/partnership? OPEN : There are a few ways we’ve discussed that a partnership could play out. How flexible is your budget if I were to send three different options?

Marketing example 9:

CLOSED : Are you mainly looking at reach, engagement or conversion as the key metric to gauge success in this campaign? OPEN : Let’s discuss what KPIs will be used to determine success in this campaign.

Marketing example 10:

CLOSED : Can we move forward with X project at $X budget for the dates presented? OPEN : We are ready to answer any final questions you might have before moving forward with this project.

10 open-ended vs. closed question set examples for interviewers

One common pitfall to be cautious of with experts and executives is the false open-ended question. This is a question phrased so that it could lead to a personal anecdote or insight, but could also be answered with a “No.”

While experts and execs usually like to talk about their work , they will sometimes answer something with a simple “No” because they haven’t thought about it before, and they don’t really have an opinion.

All the open-ended sample questions here are crafted to avoid the possibility of a “No.”

Interview example 1:

CLOSED : What’s your job title? OPEN : How would you describe your professional specialty/expertise/niche?

Interview example 2:

CLOSED : What’s your focus right now? OPEN : Tell me one of your key focuses right now and why you’re interested in it.

Interview example 3:

CLOSED : Do you like X trend? OPEN : Name three of your favorite trends in our industry right now and why you like them.

Interview example 4:

CLOSED : What would you consider your key accomplishment in your field to be? OPEN : Please walk us through the accomplishment that gave you the most satisfaction in your career.

Interview example 5:

CLOSED : What degrees, awards or certifications do you have? OPEN : Of the degrees and awards you’ve received, which would you say are the most meaningful, and why?

Interview example 6:

CLOSED : Was it difficult to transition from [#1 well-documented career] to [#2]? OPEN : You successfully transitioned from [#1 well-documented career] to [#2]. Explain to us how that happened.

Interview example 7:

CLOSED : Can you tell us who will be in your next project/speaking at your next event? OPEN : How do you choose collaborators or speakers for your projects/events?

Interview example 8:

CLOSED : Where can someone go to learn more about what you do? OPEN : What are a couple day-to-day practices of yours that people can implement for greater success/fulfillment in their own lives?

Interview example 9:

CLOSED : What’s new/next for you? OPEN : What upcoming project or venture are you most excited about and why?

Interview example 10:

CLOSED : What social channels can we find you at? OPEN : If we all go follow you on Instagram or Twitter, what kind of content are we going to see?

Open- and closed-ended questions are equally valuable.

While open-ended questions are a buzzword among salespeople and business coaches right now, we think the proper mix of open- and closed-ended questions is essential to any discovery process.

If you understand the difference between them, know how and for what purpose to use each, and can rework a closed-ended question into an open-ended question on the fly when needed, then you’re halfway to being a great interviewer .

Whether in sales or medical research or journalism, questions are a means to create connections and explore stories. They’re also a way to get useful data. One leads to the “why,” and the other leads to the “yes.”

The real question is: What’s next?

Now that you’re an expert on open and closed-ended questions, you’ll be a master at creating authentic engagement with your brand. But if you need some help, ClearVoice has got you covered. Our managed content creation and expert teams can help you produce content that can maximize your brand’s growth and impact. Connect with us here to see how.

20 Good Questions to Ask in an Interview


In a job interview , there are few things worse than responding to an interviewer’s final question, “Do you have anything to ask me?” by saying “No, I’m all set.”

According to Sara Hutchison, CEO and executive career consultant at Get Your Best Resume , not coming prepared with questions will “kill the tone” — even if it was a great interview up to that point.

“The questions you asked show whether or not you did research,” Hutchison said. “It shows that you’re genuinely interested in this organization or in the technologies.” 

Best Questions to Ask in an Interview

  • What do you do to foster an inclusive team?
  • What does career growth for this role look like? 
  • How does your company support its employees?
  • Why is this position open now?
  • What tools and platforms do your teams work with?
  • If you left this company, what is the biggest thing you would miss?
  • What is the biggest challenge facing this team right now?
  • How is performance rewarded?
  • How does your company nurture innovation?

To help you leave a positive impression in your next interview, we asked HR leaders and career advisers to provide the best types of questions to ask during a job interview. It might just be the difference between securing an offer and not.

Questions About Management and Leadership

1. What Do You Do to Foster an Inclusive Team?

A broader related question could be, “What is your company doing to encourage  workplace diversity ?”

When Ji Park, a software developer at LaunchPad Lab , first applied to work there, it was important to her to work for a company that emphasized diversity. She asked her interviewer about diversity statistics at the company, and found out that the team was mostly made up of white men, but her interviewer also mentioned that they were making efforts to make their team more inclusive . “In a case like that, I think it’s important to keep asking, ‘What are those efforts? What plans do you have to hire more diverse candidates?’” she said. 

Being intentional with your questions pushes companies to be accountable and can get them to better focus on issues like diversity and inclusion that often get overlooked.

2. What Are the Strengths and Weaknesses of the Company’s Leadership?

It’s important to have a good understanding of how the company’s leadership works because their actions ultimately affect employees at all levels of the organization. Dawid Wiacek, career and interview coach and founder of  The Career Fixer , said related questions you can ask are “How long has the leadership been in place?” “What’s their leadership style ?” “What are they really great at?” “What are their gaps?” 

“You’ll want to understand the management style of the person who can make your life great or a miserable living hell,” Wiacek said.

3. How Do You Practice and Implement Your Company’s Values?

This question can provide a closer look at how authentic company executives are in practicing what the business preaches. It also shows how effective leaders and managers are in getting employees to buy into a  company culture and abide by specific values. Strong, positive leadership by leaders at various levels of an organization is necessary for a company to have a thriving culture that everyone believes in.

4. What Excites You About the Company’s Direction? 

It’s much easier to find purpose and growth at a company where leaders possess a clear vision for where they want to take the business moving forward. Asking about the company’s direction can lead to key intel on whether a company has goals and whether these goals align with yours.


Questions About Employee Development and Job Growth

5. What Does Career Growth in This Role Look Like?

This question will help you determine if there are opportunities for you to grow at this company and help you envision how the role fits into your  career path . Plus, it shows that you are excited about the potential of sticking with a company for years to come.

“When people ask those questions in interviews, it suggests that they want to stay at this company in the long term, that they’re not just looking at this as a waystation, and that’s really appealing to employers,” said Erin Brown, associate director of graduate student career services at UCLA.

6. How Does the Company Invest in Training and Development?

This question will give you a sense of whether or not the company cares about nurturing its talent and growing existing employees’ skills. Another related question is, “What is manager coaching and training like?” This question is good to ask, even if you’re not pursuing a managerial role. 

“If people are like, ‘I don’t know what happens there,’ then that makes it clear the company doesn’t  invest in management , which is so critical to everyone’s experience,” said Emily Connery, senior director of people and talent at people analytics platform ChartHop . 

7. How Does Your Company Support Its Employees?

At any job, you’re going to run into challenges or snags that you’ll need help overcoming. Before joining a new company, you want to be positive that they care about their employees and will support you when things get tough.

When applying to jobs in the middle of the pandemic, Park knew that jumping into a new role while remote would be tricky. She wanted to make sure that whatever company she joined would provide her with adequate support to make the transition .

“In my interview I made sure to ask what resources the company provided to make people feel well-adjusted,” Park said. “I wanted to know that they were aware of the common challenges teammates might face and were ready to help them out.”

8. What’s the Typical Career Path for Employees in This Role?

This is a great question to explore what kind of movement is possible within a company. Perhaps employees who thrive in a role follow a specific career track within their department and receive promotions . Or maybe they’ve moved laterally to other departments, applying transferable skills to a variety of roles. 

You could also follow this up by asking what the most popular paths are that employees follow within a department. It’s a promising sign if an interviewer not only provides a detailed explanation of what employees are doing now, but also lays out a process for how the company has helped employees get to their current positions.


Questions About the Job Role

9. Why Is This Position Open Now?

While there’s a risk that this question could put the interviewer on the defensive if the last person in the role left on bad terms, asking this question can help you understand important information about the team. You could ask, “Is the role brand-new, and if so, what prompted its creation?” If you’re pursuing an established role, you might want to know how many people have held the position lately. If there’s been a revolving door of people in the role and high turnover on the team, that might be a red flag.

You could even ask, “Where did the previous person in the role go? Did they stay at the company and climb up?” Wiacek said.

10. What’s a Non-Obvious Skill That Would Make Someone a Great Fit for This Role?

This is a question that can help you stand out in a later-stage interview. You’re ultimately asking the interviewer what would be the skills that their dream candidate would have. Maybe you actually have this skill, and this presents an opportunity to talk about it. Or, if you don’t have that skill and are interested in learning , you could talk about ways you would be willing to acquire it. 

11. What Tools and Platforms Do Your Teams Work With?

If you’re applying for a software development or data science role, you’ll likely be expected to work with a variety of technology stacks, and some might be unfamiliar. Ask about what platforms or  tools you’ll need to use as a part of your role and find out what kind of training resources they offer to help you learn new technologies .

“Asking what value the customers will get from what we build shows that you’re not just myopically thinking about how to write a line of Python or build a machine learning model,” said David Fellows, chief digital officer at analytics company Acuity Knowledge Partners . “You’re actually thinking about providing solutions that people can use.”

Generally, don’t save this question until the end of the interview process, and don’t pose it to a recruiter or someone who isn’t on the technical side.

12. What Is the 90-Day Plan for This Role?

To understand what roadmap and support exist for a certain role, a helpful question to ask is, “What is the  90-day plan for this role?”

“It should be clear. They should really understand what the first 90 days should look like, and if it’s not, I think that tells you a lot about the level of organization,” Connery said.

Ultimately, you could ask the more common but important questions like, “How will the success of this candidate be measured?” 

“It helps you to kind of have goals for yourself for those first three to six months,” Hutchison said. “It gives them an idea of what their expectations are and how much guidance you’re going to have before they let you on your own.”


Questions About the Company and Company Culture

13. If You Left This Company, What Would You Miss the Most?

This is a way to flip around the question that candidates often hear, “ Why do you want to work at this company? ” You want to understand the best parts of the company and why employees stay. 

“This gets the person talking and loosens them up and engages them on a different level, rather than just talking about the sometimes dull job description,” Wiacek said. “It humanizes the interview experience.”

14. What Surprised You About Working at This Company?

This can elicit a positive or negative answer, but either way, it will give you important insight about the  workplace culture and company dynamics.

“It’s a surprising question and can help you be more memorable as a candidate and can help you stand out against those who ask boring questions or don’t engage the interviewer,” Wiacek said.

15. What Is the Biggest Challenge Facing This Team Right Now?

Every company has areas for improvement, and this helps you start to understand what challenges you might encounter should you be offered the role.

Stacy Ulery, assistant director for career education and engagement at UCLA Career Center, said that asking this question allows you an opportunity to showcase your problem-solving skills or talk about another similar project you worked on.

“It’s another opportunity for you first to demonstrate that you’ve done your research, that you understand the industry, you understand the company’s place in the marketplace but also what can you bring to the table to help them,” she said. 

If you’ve done advanced research or learned about a challenge in a previous interview, Lily Valentin, head of operations for North America at job search engine  Adzuna , suggests presenting a potential solution to the company’s problem.

“It’s most important to hear questions from a job seeker that really embeds themselves in the business and the business framework,” she said.

16. How Does This Company Handle Failure?

The answer to this question will tell you a lot about a company’s resiliency and how it supports people when mistakes and shortcomings inevitably happen. It’ll be helpful to learn what systems and tools of support the company offers employees to ensure success. Do you get mentorship and coaching in these instances? 

“Use this question wisely. It may not be appropriate for Type-A companies or interviewers. But if you have succeeded in previous roles and have every reason to believe you’ll give 100 percent effort in the new role, then it’s a fair question to ask of the employer,” Wiacek said. “For some of my clients, they only want to work for companies that invest in their people, and actually put their money where their mouth is.”

17. How Do You Think This Company Stacks Up Against Your Direct Competition?

This question will give you a sense of how the company perceives itself and how it is thinking about maintaining a competitive advantage against other players in the industry. 

“A weak answer might give you pause. A good answer will give you confidence that the company is proactive, transparent, honest, prepared,” Wiacek said. “You need to grill the company as much as they want to grill you.”

18. Can You Tell Me About How Communication Happens Here? 

Does the company host all-hands meetings? How often should you expect one-on-one meetings with your supervisor? Are there team meetings? You should get answers to these questions by asking about  communication . 

Another communication question you could ask is, “How does the company interact with the executive team?”

“That could be very telling in terms of how the executive team shows up. Are they like Oz behind the curtains, or are they really a part of the teams?” Connery said.

It is also helpful to learn how different teams communicate with each other, especially if you’re in a highly collaborative role. Wiacek said many of his clients in the tech industry cite challenges in communication between tech teams and nontechnical departments, so it’s a good idea to learn how the company works through communication challenges like that. 

19. How Is Performance Rewarded?

Some companies might reward excellent performance with bonuses, while others focus more on  awards or recognition . If a company doesn’t place a high value on feedback or acknowledge exceptional work at all, you could end up frustrated in your role.

“People might ask questions more about compensation or promotions but not necessarily, ‘How is performance rewarded?’ I think if people stumble in answering that question, it might not be an environment where people feel recognized,” Connery said. 

Should you be offered the role and be looking at a promotion with the company down the road, it would be helpful to have information at the start of your tenure about how leveling is determined for roles and how promotion decisions are made , so feel free to ask about that during the interview process as well.

20. How Does Your Company Plan to Keep Innovating?

Your interviewer might be excited to answer a question about how the company is innovating . This question will help you understand how the company feels about new ideas, new technologies and adapting in the ever-changing tech world.

It’s also important to understand what the vision for the company is and how the company plans to innovate for the future. Kimberly Terrill, associate director for career education and development at UCLA, suggests asking questions about how the company’s mission and focus might change in the future. What are the hopes and aspirations for the company? 

“Tech changes so quickly. Even five years is a long time in tech,” she said. 

Ask All the Logistical Questions Early

While it might seem poor form to ask about salary range in an early interview, experts are now saying it’s best to gather all of the important basic information right away. This saves everyone time if plans suddenly change or the expectations for compensation and benefits don’t align. These can be a part of the questions you ask during the interview, too. 

Make sure you have answers to the following questions from interviewers before proceeding with future interviews:

  • How many interviews are there going to be? 
  • When are you expecting to have this role filled by? 
  • What is the salary range?
  • What are the benefits offered?
  • How is the title for this role determined? 
  • What are the day-to-day responsibilities of this role?
  • How many hours a week would be spent working on certain tasks?

How Do I Come Up With Good Questions?  

Do your homework and learn about the company ahead of time, so you can get answers that are truly useful to you should you be  faced with deciding whether or not to accept the job offer . Don’t ask questions you already know the answers to or could easily find from a Google search — your questions need to be well-thought-out and specific to the company and role you’re pursuing.

“I always do a ton of research into companies that I’m interviewing with, gathering as much as I can from their website and blog posts,” said Park. “I want to get a sense of the kind of people they hire. That usually gives me an idea of questions I want to ask.”

How Many Questions Should I Ask?

Typically, you should ask between two and five questions at every interview. You may not get a chance to ask them all, but it’s better to come prepared.

“It is a huge red flag whenever a job seeker comes into an interview and has no questions,” Valentin said. “It really doesn’t matter at what stage in the interview process you are.”

Frequently Asked Questions

What are common questions to ask in an interview?

Common questions to ask in an interview include “Why is this position open now?” “What is unique about your company’s culture?” and “What’s the biggest pain point your team is facing right now?”

How many questions to ask at the end of an interview?

It’s best to prepare two to five questions to ask at the end of an interview, with the expectation that the recruiter may not have time to answer all of them.


COMMENTS

  1. Qualitative Interview Questions: Guidance for Novice Researchers

    practice to write open-ended questions; the hallmark of a qualitative interview" (Sofaer, 2002, ... In addition to being aligned closely with the research question, the interview questions .

  2. Open-Ended Questions in Qualitative Research

    Open-ended questions encourage numerous responses and allow respondents to provide their thoughts and opinions. " What ," " How, " or " Why " are some of the words used to phrase open-ended questions and are designed to elicit more detailed and expansive answers. Researchers use open-ended questions in ethnography, interviews, and ...

  3. Interview Techniques

    Depending on the research question, interviews require many of the same skills as medical interviews, for example, developing rapport and mixing open-ended questions with focus questions. Furthermore, developing an open mind and active listening is important for researchers to conduct effective research interviews.

  4. Open-ended interview questions and saturation

    Abstract. Sample size determination for open-ended questions or qualitative interviews relies primarily on custom and finding the point where little new information is obtained (thematic saturation). Here, we propose and test a refined definition of saturation as obtaining the most salient items in a set of qualitative interviews (where items ...

  5. Qualitative research: open-ended and closed-ended questions

    Introduction. Let us begin by pointing out that open and closed-ended questions do not at first glance serve the same purpose in market research. Instead, open-ended questions are used in qualitative research (see the video above for more information) and closed-ended questions are used in quantitative research. But this is not an absolute rule.

  6. Types of Interviews in Research

    Depending on the type of interview you are conducting, your questions will differ in style, phrasing, and intention. Structured interview questions are set and precise, while the other types of interviews allow for more open-endedness and flexibility. Here are some examples. Structured. Semi-structured.

  7. Preparing Questions for a Qualitative Research Interview

    Writing the Qualitative Research Interview Questions. After deciding the type of interview and nature of information you'd like to gather, the next step is to write the actual questions. Using Open-Ended Questions. Open-ended questions are the backbone of qualitative research interviews. They encourage participants to share their experiences ...

  8. Semi-Structured Interview

    Semi-structured interview questions. Since they are often open-ended in style, it can be challenging to write semi-structured interview questions that get you the information you're looking for without biasing your responses. Here are a few tips: Define what areas or topics you will be focusing on prior to the interview. This will help you ...

  9. PDF Interviewing in Qualitative Research

    Introducing questions: Broad, open-ended questions to start a conversation. They should be general and non-threatening, to start the conversation on the friendly note; e.g., "I want to ask you about your career in sports and how it developed." Follow-up questions: Questions that rephrase the interviewee's answer and ask them to

  10. 9.2 Qualitative interviews

    Figure 9.2 provides an example of an interview guide that uses questions rather than topics. Figure 9.2 Interview guide displaying questions rather than topics. As you might have guessed, interview guides do not appear out of thin air. They are the result of thoughtful and careful work on the part of a researcher.

  11. How to Write Effective Qualitative Interview Questions

    Quantitative questions are typically reserved for surveys but can be used in interviewing to add some context and allow the interviewer to ask more follow-up questions. They mostly uncover 'who' and 'what'. Qualitative questions will provide detailed information on the topic of interest, uncovering the 'why' and 'how'.

  12. How to Use Open-Ended Questions in Qualitative Interviews

    Using open-ended questions in qualitative interviews requires skillful listening and probing, as well as flexibility and adaptability. To do so effectively, you should follow the interview guide ...

  13. Open-Ended Questions: +20 Examples, Tips and Comparison

    Encourages thoughtful responses: open-ended questions require the respondent to think and provide a more detailed answer rather than simply selecting from a list of predetermined options. This allows for more thoughtful and insightful responses, providing a deeper understanding of the subject matter.

  14. Analyzing Open-ended Questions for Qualitative Research

    Open-ended questions can also provide a greater depth of insight than a closed-ended question may; Farber (2006) agrees with this notion.

  15. Open-ended Questions Vs. Closed-ended Questions In User Research

    Here are a few tips for writing effective open-ended questions. What makes a question open-ended is the wording of the sentence: for an effective open-ended question, start with words such as how, what, why, or can.

  16. How to analyze open-ended questions in 5 steps [template included]

    A spreadsheet-template workflow: copy the column from the 'CSV Export' sheet containing the open-ended question you want to analyze first and paste it into the 'Question 1' sheet, in the cell marked < Paste answers to first open-ended question here >; then enable wrap text for the whole column so the responses fit the column width and are easier to read. A code-based sketch of the same column-extraction step follows below.
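
    For teams that prefer working in code rather than a spreadsheet, here is a hedged sketch of that same first step using pandas: pulling one open-ended question's answers out of a survey CSV export so they can be read and coded. The file name and column header are assumptions, not part of the linked template.

```python
# Pull one open-ended question's answers out of a survey CSV export.
import pandas as pd

export = pd.read_csv("survey_export.csv")  # the raw CSV export (hypothetical file name)

# Column header of the open-ended question to analyze first (hypothetical).
question = "What did you find most difficult about the program?"

# Drop empty rows and tidy whitespace so the responses are easier to read.
answers = export[question].dropna().astype(str).str.strip()
answers = answers[answers != ""]

# Save the single question to its own file, mirroring the 'Question 1' sheet.
answers.to_frame(name=question).to_csv("question_1_answers.csv", index=False)
print(f"{len(answers)} responses ready for coding")
```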

  17. Open-Ended Questions vs. Closed: 30 Examples & Comparisons

    An open-ended question opens up a topic for exploration and discussion while a closed-ended question leads to a closed-off conversational path. After "Yes" or "No" or the specific one-word answer to the question, the thread is done. Open-ended questions lead to qualitative answers while closed-ended questions lead to quantitative answers.
