JMIR Form Res. 2022 Apr;6(4).

Strategies for Quantitative and Qualitative Remote Data Collection: Lessons From the COVID-19 Pandemic

Keenae Tiersma

1 Department of Radiology, Massachusetts General Hospital, Boston, MA, United States

2 Department of Psychiatric Oncology, Massachusetts General Hospital, Boston, MA, United States

Mira Reichman

3 Integrated Brain Health Clinical and Research Program, Department of Psychiatry, Massachusetts General Hospital, Boston, MA, United States

Paula J Popok

Maura Barry

A Rani Elwy

4 Implementation Science Core, Department of Psychiatry and Human Behavior, Brown University, Providence, RI, United States

5 Center for Healthcare Organization and Implementation Research, VA Bedford Healthcare System, Bedford, MA, United States

Efrén J Flores

Kelly E Irwin

Ana-Maria Vranceanu

The COVID-19 pandemic has necessitated a rapid shift to web-based or blended design models for both ongoing and future clinical research activities. Research conducted virtually not only has the potential to increase the patient-centeredness of clinical research but may also further widen existing disparities in research participation among underrepresented individuals. In this viewpoint, we discuss practical strategies for quantitative and qualitative remote research data collection based on previous literature and our own ongoing clinical research to overcome challenges presented by the shift to remote data collection. We aim to contribute to and catalyze the dissemination of best practices related to remote data collection methodologies to address the opportunities presented by this shift and develop strategies for inclusive research.

Introduction

The COVID-19 pandemic is transforming the landscape of clinical research. The pandemic has necessitated the unexpected adaptation of ongoing clinical research activities to web-based or blended design (ie, part web-based, part in-person) models [ 1 ] and has rapidly accelerated a shift within clinical research toward web-based study designs. Despite the high levels of patient and health care provider satisfaction with telemedicine and virtually conducted clinical research [ 2 , 3 ], many challenges exist to the web-based conduct of rigorous, efficient, and patient-centered clinical research, particularly related to the engagement of diverse and marginalized populations [ 4 ]. The aim of this paper is to discuss practical strategies to guide researchers in the remote collection of quantitative and qualitative data, derived from both previous literature and our own ongoing clinical research.

Many health care providers and clinical researchers have marveled at the way the COVID-19 pandemic catalyzed the widespread adoption and expansion of telemedicine, seemingly overnight [ 5 , 6 ]. Despite the sluggish adoption of telemedicine observed in academic medical centers over the past several decades [ 7 , 8 ], the pandemic has spurred rapid changes in public and organizational policy regulating telemedicine in the United States, facilitating a tipping point toward the web-based provision of both health care and conduct of clinical research [ 1 , 2 , 5 , 6 ]. Enabled by fast-tracked institutional review board policies and amendments [ 1 ], researchers have adapted clinical research study procedures in innovative ways: engaging in web-based outreach for study recruitment, collecting electronic informed consent, conducting study visits, delivering interventions over the phone or live video, and using remote methods to collect data [ 1 ]. Several studies have reported high satisfaction of both providers and patients with the use of telemedicine during COVID-19 and a willingness to continue using telemedicine after the pandemic, including for clinical research [ 2 , 3 ].

This shift toward virtually conducted clinical research creates many opportunities to increase the accessibility of clinical research. Virtually conducted research reduces many burdens on patients associated with research participation, including time and monetary costs involved in travel to research facilities. This enables researchers to include patients who lack access to transportation or the ability to travel independently. Furthermore, web-based patient outreach allows researchers to recruit geographically diverse participants, enabling researchers to target populations through disease-specific registries, internet-based patient communities, and advocacy groups without geographical constraints [ 9 ]. By centering patients rather than investigative sites in the study design and operation, virtually conducted research has the potential to increase the patient-centeredness of clinical research [ 9 ].

At the same time, the transition to virtually conducted clinical research also presents many challenges to patient engagement and data collection. Losing supervision of the physical setting of research activities challenges researchers’ ability to ensure patients’ adherence to study protocols, engagement and interest in research activities, and privacy protections. Researchers are faced with complex decisions regarding the appropriateness of data collection methodologies or specific measures and assessments for web-based delivery [ 10 ]. Furthermore, there are barriers associated with the technology used for remote data collection (eg, telephones, electronic databases, live videoconferencing software, and ecological momentary assessment), including a lack of technology literacy and challenges using technology among both patients and research staff [ 1 , 4 , 8 , 11 ]. Finally, some patients lack access to smartphones, the internet, or secure and stable housing, which may preclude their participation in web-based clinical research unless researchers can allocate funding to provide these devices. Consequently, the transition to virtually conducted clinical research may further marginalize people in low-income and rural settings [ 4 ].

To thoughtfully respond to the challenges associated with remote data collection and ensure that disparities in access to clinical research do not widen, there is a critical need for practical strategies for researchers. By integrating recommendations from previous literature with examples from the ongoing clinical research projects of this authorship team with extensive patient and provider populations (ie, adults and adolescents with neurofibromatosis, older adults with chronic pain and cognitive decline, adults with cancer and serious mental illness, adults with young-onset dementia, and orthopedic medical providers), we present a discussion of practical strategies for researchers to support the rigorous, efficient, and patient-centered collection of quantitative and qualitative data remotely. Summary tables present a list of strategies related to the remote collection of quantitative ( Table 1 ) and qualitative ( Table 2 ) data.

Table 1. Challenges in remote quantitative data collection and associated strategies.

a REDCap: Research Electronic Data Capture (Vanderbilt University).

b Qualtrics Survey Distribution (Qualtrics XM Platform).

c HIPAA: Health Insurance Portability and Accountability Act.

Table 2. Challenges in remote qualitative data collection and associated strategies.

a HIPAA: Health Insurance Portability and Accountability Act.

Strategies for Remote Data Collection

Optimizing Quantitative Measures for Effective Remote Distribution and Delivery

Asynchronous distribution and measure completion (eg, electronic distribution of surveys) maximize efficiency for the study team and flexibility for study participants but necessitate additional consideration for participants with varying levels of familiarity with and access to technology. Secure web platforms (eg, REDCap [Research Electronic Data Capture; Vanderbilt University] and Qualtrics [Qualtrics XM Platform]) are ideal for asynchronous distribution because they have functionalities that promote study team efficiency and organization (eg, scheduling survey distribution in advance and sending automatic reminders to participants to complete surveys) while enabling participants to complete measures independently and at a time most convenient for them [ 9 ]. Although these platforms are widely compliant with the Health Insurance Portability and Accountability Act (HIPAA) and related regulatory requirements, study teams should ensure that platforms meet institution-specific clinical research regulatory requirements before use (and consider potential differences between clinical research and clinical care requirements).

Many of these platforms also offer participant screening, consenting functionality, and mobile device compatibility, which maximize the utility for study teams [ 9 ]. Study teams relying on web-based platforms and asynchronous measure completion should also consider the adoption of flexible alternative options for measure completion to maximize completion rates and the engagement of participants. For example, study teams might offer participants the option to complete measures on paper through physically mailed surveys or over the phone with a member of the study staff, depending on participant technology access and preference. Similarly, in addition to electronic reminders integrated within the distribution platform, study teams will likely need to use other methods to contact participants and remind them to complete measures (eg, calling, texting, and reminding in person). To decrease the burden on participants and increase adherence to study procedures, participants should be informed of how many of these reminders to expect.

Validating participant credentials in studies where research staff have no personal interaction with participants (ie, web-based survey studies) is another challenge with web-based research. Data quality checks, such as eligibility, attention, and manipulation checks (see Table 1 for examples), can be introduced to protect against duplicate responses or participants falsifying information. IP address tracking is another feature offered by some survey platforms (eg, Qualtrics). As with all data collection, it is imperative that participants are aware of how their information is being collected and that researchers have obtained prior institutional review board approval.
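The data quality checks described above can be illustrated as a simple screening pass over survey responses. This is a hedged sketch, not code from REDCap, Qualtrics, or any specific platform; the field names (`ip`, `attention_check`) and the expected attention-check answer are hypothetical.

```python
# Illustrative sketch: flag web survey responses that share an IP address
# (possible duplicates) or fail an embedded attention check.
# Field names and the expected attention-check answer are hypothetical.

def flag_responses(responses, attention_answer="agree"):
    """Return (duplicate_ids, inattentive_ids) for a list of response dicts."""
    seen_ips = set()
    duplicates, inattentive = [], []
    for r in responses:
        if r["ip"] in seen_ips:
            duplicates.append(r["id"])  # same IP seen before: review manually
        else:
            seen_ips.add(r["ip"])
        if r["attention_check"] != attention_answer:
            inattentive.append(r["id"])  # failed the attention-check item
    return duplicates, inattentive
```

Flagged responses would typically be reviewed by study staff rather than excluded automatically, consistent with the manual-review spirit of the checks described above.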

We use REDCap and rely on predominantly asynchronous measure completion to collect quantitative data in an ongoing randomized controlled trial of a mind-body intervention to promote quality of life in adults with a genetic condition called neurofibromatosis [ 12 ]. Participants receive links via email to complete surveys at all time points (ie, baseline, posttest, and 6- and 12-month follow-ups), and we set automatic email reminders to go out at defined intervals every 3 days until participants complete surveys. The frequency of reminders should be determined by the study team to balance the burden on study staff and participants with the desire to have high survey completion rates. We find that participants enjoy the flexibility of completing measures at their convenience from the comfort of their homes and using personal devices.
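The reminder cadence described above (automatic emails every 3 days until completion) amounts to a simple date calculation, sketched below. This is illustrative only; in practice the schedule would be configured inside the survey platform, and the 14-day survey window here is a hypothetical default.

```python
from datetime import date, timedelta

# Illustrative sketch: generate reminder dates at a fixed interval after a
# survey is sent, stopping when the survey window closes. The trial described
# in the text used 3-day intervals; the window length here is hypothetical.

def reminder_dates(sent, window_days=14, interval_days=3):
    """Return the list of reminder dates within the survey window."""
    dates = []
    d = sent + timedelta(days=interval_days)
    while (d - sent).days <= window_days:
        dates.append(d)
        d += timedelta(days=interval_days)
    return dates
```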

For quantitative measures other than self-report surveys, study teams may need to use innovative methods to adapt data collection methods for remote delivery. Although not all measures can be adapted for remote delivery (eg, imaging data collection), many can through a combination of creative and flexible strategies, including using mobile device data collection, mailing materials and devices to participants, and conducting assessments over live videoconferencing. Even the collection of biomarker data, common in quantitative research clinical trials, can sometimes be adapted for remote conduct through mailing of devices and use of smartphone technology, such as mailing saliva or nicotine strips for the verification of tobacco abstinence or the provision of personal devices to measure expired carbon monoxide that are compatible with smartphones [ 1 ]. In adapting measures for remote delivery, it is essential to examine previous literature to assess the availability of remote alternatives and evidence to support the validity of remote alternatives or adaptations [ 10 ]. Study teams’ attention to usability and patient burden is essential [ 10 ]. It may also be important to account for the modality of data collection during data analysis (eg, evaluating whether the mode of data collection is a confounder in multimodal studies).

In our randomized controlled trial with patients with chronic pain and cognitive decline, we conducted a literature search to identify remote methods for assessing cognitive functioning as well as performance-based physical function. The Montreal Cognitive Assessment [ 13 ], a measure we previously used in our in-person study [ 14 ], has been adapted and validated for remote administration over live videoconferencing [ 15 , 16 ]. Accordingly, we developed a standardized protocol for applying the Montreal Cognitive Assessment audiovisual procedures, including mailing participants a paper with items that required drawing and instructing them to display the paper to the camera for us to screenshot over videoconferencing [ 17 ]. Our literature review also identified a validated, free-of-charge mobile app that uses GPS coordinates to measure the distance walked in 6 minutes to replace the 6-minute walk test (6MWT) [ 18 ] that we had previously conducted in our in-person study [ 14 ]. Before using the app with participants, we piloted the app and developed a standardized protocol to assist participants in downloading the app, using the app, and reporting the results [ 17 ].
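As a rough illustration of how a GPS-based app can stand in for the in-person 6MWT, distance walked can be estimated by summing great-circle distances between successive GPS fixes. This sketch is not the validated app the authors used; it only shows the underlying calculation.

```python
import math

# Illustrative sketch (not the validated 6MWT app referenced in the text):
# estimate distance covered by summing haversine distances between
# successive GPS fixes sampled during the walk.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def total_distance_m(fixes):
    """fixes: list of (lat, lon) tuples in the order they were sampled."""
    return sum(haversine_m(*a, *b) for a, b in zip(fixes, fixes[1:]))
```

A production app would also need to filter GPS noise (eg, discard implausible jumps between fixes), which is omitted here for brevity.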

In the process of adapting quantitative measures for remote completion, the safety of the participants must be considered. For example, in our randomized controlled trial with adults with neurofibromatosis, we used the Patient Health Questionnaire-9, which contains an item assessing suicidal ideation, to measure depression. We developed a standardized protocol to respond to cases in which participants endorse suicidality, including collecting the name and number of an emergency contact for each study participant during enrollment, having the study clinician and principal investigator receive immediate notification from REDCap, and having the study clinician or principal investigator follow up over phone with the participant within 24 hours to complete a safety assessment [ 12 ]. Similarly, in our randomized controlled trial with older adults with chronic pain and cognitive decline, we considered the safety risks associated with asking participants to complete the 6MWT independently (eg, falls). Participants were asked to create a plan to complete the 6MWT on a familiar route at a designated date and time, with support from a friend or family member when possible [ 17 ].
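A rule-based safety trigger of the kind described above might look like the following sketch. The function name and return values are hypothetical; in the authors' trial, the notification itself was configured within REDCap rather than hand-coded.

```python
# Illustrative sketch of a safety trigger like the one described in the text:
# a nonzero response to PHQ-9 item 9 (suicidal ideation) flags the record
# for clinician follow-up within 24 hours. Names and return values are
# hypothetical; real alerts would be configured in the survey platform.

def safety_check(phq9_item_scores):
    """phq9_item_scores: list of 9 item scores (each 0-3). Returns an action label."""
    if len(phq9_item_scores) != 9:
        raise ValueError("PHQ-9 requires exactly 9 item scores")
    if phq9_item_scores[8] > 0:  # item 9 assesses suicidal ideation
        return "notify_clinician_within_24h"
    return "no_action"
```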

Synchronously Assisting Participants in Remote Completion of Quantitative Measures

Depending on the study protocol and population, the best practice may be the synchronous completion of measures (ie, in which a study team member administers the assessment to the participant in real time). The synchronous completion of self-report measures enables study staff to directly support participants in completing measures, including ensuring participants’ best effort, attention, focus, and comprehension during measure completion. Assisting participants synchronously in completing self-report measures also allows study staff to ensure that data are supplied directly by the intended participants and to eliminate the possibility that participants are being influenced by others, such as spouses or parents. The factors to consider when making this decision include participants’ age, cognitive ability, previous experience with technology, and preference. When assisting participants with assessment completion remotely, multiple modalities can be used. First, calling participants by phone requires minimal technology access and familiarity for participants and enables study staff to catch participants at an opportune moment and ensure prompt survey completion with minimal effort on the part of the participant. Over the phone, study staff can ensure comprehension of every item (important for data validity); however, reading aloud every question-and-answer option can also be tedious for both study staff and participants. Strategies to address comprehension and focus include pausing to ask if clarification is needed, breaking up longer questions, and asking participants if they wish to take a break throughout the conversation.

For some participants, a visual component can be beneficial in enhancing their comprehension of measure items. Video calling a participant using a HIPAA-compliant, secure platform [ 1 ] (eg, Zoom or WebEx) and screen sharing the measure is a novel strategy to support participants in completing measures remotely. This screen share method provides the opportunity for the participant to see the questions in addition to hearing them and can enable better comprehension as well as more efficient measure completion (eg, study staff may not need to read every answer choice for items when participants can read them on-screen). Mailing participants paper copies of surveys in advance of phone calls is another method for allowing participants to have questions in front of them while also receiving live assistance in responding.

We use this novel screen share strategy in an ongoing randomized controlled trial of a mind-body intervention to promote quality of life in geographically diverse adolescents aged 12 to 17 years with neurofibromatosis [ 19 ]. We decided to rely on synchronous measure completion for this population, given the age of participants and high rates of learning disabilities, leading to anticipated challenges with thoughtful independent measure completion, as well as anticipated challenges with comprehension of items. The method has been effective in engaging participants during data collection to ensure participant comprehension of items and thoughtfulness when selecting answer choices. This method has also allowed us to identify and eliminate situations in which participants’ parents are inappropriately coaching participants during data collection. Notably, videoconferencing does require a higher level of access and familiarity with technology; therefore, creative problem-solving abilities with participants are essential. As with all forms of technology used in data collection, study teams should consider ease of use for participants and be prepared to provide both emotional and technical support [ 11 ].

For group-based interventional studies and situations in which study staff want to be available to answer potential questions related to measure completion (about either technology use or specific items) but do not want to walk participants through every item, a group support procedure could be used using videoconferencing. In this strategy, a member of study staff can email participants the links to complete surveys on their own devices and schedule a time in which the group of participants joins a videoconferencing call to complete the measures at the same time. We use this strategy in our randomized controlled trial for older adults with chronic pain and cognitive decline [ 17 ]. Participants in a group video call are supported in navigating to their email to open the secure link to complete the questionnaires. Although completing their questionnaires independently, participants turn their video on or off, and we mute all participants and the study staff host to enhance focus and privacy and to replicate an in-person visit [ 14 , 17 ]. This method allows us to assist as needed when a participant takes themselves off mute to ask a question, physically raises their hand, or privately chats us. In addition, we periodically ask if anyone needs assistance, particularly after noticing that participants are not progressing as expected because REDCap allows the ability to monitor progress in real time.

As with the shift to remote clinical care, the privacy and confidentiality of patients is not as easy to ensure as it is in person. Research staff have an obligation to ensure participant privacy and confidentiality to adhere to the principles of good clinical practice [ 20 ] and to ensure the acceptability of study procedures to participants, for whom concerns about being overheard are common [ 11 ]. Informing (or reminding) participants of the sensitive nature of the questions (eg, pertaining to physical health, mental health, and intimate relationships) and ensuring that they are in a space where they feel comfortable answering is the best practice. Working with participants to ensure the highest level of privacy may be necessary. Suggestions include using headphones (both participants and research staff), inquiring about participants’ location and privacy, and allowing participants to determine the best time for the call [ 5 , 11 ]. Additional safety protocols are necessary when providing devices to participants, as devices could be exposed to data theft or lost. We suggest enabling password protection on devices and limiting the data stored on the actual device to protect patient safety. Ultimately, although providing devices introduces the risk of having to replace lost or damaged hardware, it is a readily integrable strategy to address the digital divide and increase access to research [ 21 ]. Participants should be reminded of the privacy risks associated with remote study participation (eg, possible breaches to the security of data collected remotely) and informed of the measures study staff are taking to safeguard against these risks (eg, encryption of devices and deidentification of data).

Motivating Participants to Complete Quantitative Measures Remotely

Building relationships with study participants is central to engaging participants in study procedures and ensuring thorough and thoughtful data collection. Survey fatigue and general fatigue related to research participation pose real challenges to data collection as well as study retention [ 9 ]. Interactions with participants vary in length and frequency depending on study protocols; however, each interaction should be viewed as an opportunity to build rapport with participants. Strategies to build rapport include smiling (if on a video call), communicating clearly and confidently, and providing adequate emotional and technical support [ 5 , 11 ] ( Figure 1 ). Researchers, clinicians, and patients alike cite increased mental health symptoms, stress, and added duties owing to the pandemic [ 22 ]. It is important to keep these additional burdens in mind when communicating with participants. Adjusting calls about study measures to be more conversational (eg, making time to ask participants about their day and how they are doing) can aid in establishing and maintaining rapport in the study team–participant relationship. The shared experience of COVID-19 is unifying and can be a source of common ground to relate to participants. Engaging in this way and expressing gratitude for participants’ time can help build participant investment in the study.

Figure 1. Building rapport.

Study teams face additional challenges in prompting participants to complete measures when participants are difficult to reach or are unresponsive. Persistence with creative outreach methods, such as calling and texting participants using HIPAA-compliant technologies (eg, Cisco Jabber and Twilio) [ 1 ], is essential. Study teams should consider adopting standardized procedures for attempted contact with participants to limit the burden on both participants and the study team. Often, research coordinators or research assistants are the first line of communication with participants and will attempt to call participants a certain number of times. It is helpful to consider when participants are usually home (ie, what time of the day is best to call) and to try different times throughout the day to achieve higher response rates. Study teams should standardize the maximum number of outreach attempts by research coordinators. Once that number is reached, it has proven useful in our experience to pass the communication up the chain to a study clinician or principal investigator. Study teams can also use this approach to allow a clinician to assess whether disengagement may be related to any concerns regarding the participant’s well-being. Other strategies to bolster participant motivation include involving family members in study procedures, accommodating participants’ preferred methods of communication (eg, texting, email, and phone call), and providing monetary or other forms of incentives [ 9 , 11 ].
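The standardized outreach ladder described above (a fixed number of coordinator attempts, then escalation to a clinician or principal investigator) can be sketched as follows. The threshold of 5 attempts is a hypothetical example; each study team would set its own.

```python
# Illustrative sketch of a standardized outreach escalation ladder:
# a research coordinator makes up to a fixed number of contact attempts,
# after which outreach is escalated to a study clinician.
# The threshold of 5 attempts is hypothetical.

def next_outreach_step(attempts_made, coordinator_max=5):
    """Return the role responsible for the next contact attempt."""
    if attempts_made < coordinator_max:
        return "research_coordinator"
    return "study_clinician"
```

Recording each attempt (date, time of day, modality) alongside this rule would also let teams learn which call times yield the highest response rates, as the text suggests.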

Promoting Health Equity and Overcoming Barriers to Web-Based Engagement Among Participants With Varying Levels of Technology Access and Familiarity

As the COVID-19 pandemic continues to lay bare the existing health disparities in racially, ethnically, and socioeconomically minoritized groups, concerns that the increased reliance on digital technologies for clinical care and research will exacerbate the digital divide rather than mitigate systemic health inequities are prevalent [ 23 ]. Indeed, digital access is considered a social determinant of health, with 21 million adults in the United States lacking access to broadband internet [ 24 ]. With the transition to web-based research, we risk compounding this structural disadvantage and not realizing the potential to expand research access to increasingly diverse and underrepresented populations [ 1 ] without targeted measures to address digital access and literacy [ 21 , 25 , 26 ].

Building capacity for person-centered, equitable research can be facilitated by providing smartphones or internet plans to participants if access to these technologies is an inclusion requirement [ 1 , 11 ] as well as using multiple outreach modalities. Enabling outreach through multiple modalities has led to successful data collection during the pandemic in our ongoing randomized controlled trial for patients with serious mental illness and a new cancer diagnosis [ 27 ]. In this trial, we use multiple traditional outreach methods for data collection (ie, phone, email, and letter mail) in addition to nontraditional methods such as partnering with family caregivers and staff in congregate living settings. Despite a slower study accrual because of fewer new oncology consultations during the pandemic, we maintained consent and survey completion rates for a marginalized population with flexible, multimodal, patient-centered outreach [ 28 ].

Providing adequate technology support is also of utmost importance. Study teams must provide training to participants for all forms of technology used, through manual documentation, prerecorded videos, or live assistance (eg, over the phone) [ 11 ]. Proactive outreach to individuals for technology coaching can promote efficiency and decrease participant frustration. Test-driving technologies and creating a short list of common technology challenges encountered by participants can help study teams troubleshoot and identify unnecessarily confusing aspects of instructions or procedures that can be changed. Study teams can also consider engaging family members in study procedures, which has been shown to aid in the adoption of technology for older populations with cognitive impairment [ 11 ]. We commonly use the approach of meeting participants where they are by first assessing participants’ comfort with technology during a study enrollment phone call. This allows us to provide extra support where necessary, such as detailed instructions on software installation, test calls with study staff, and encouragement. We also prioritize conducting qualitative exit interviews to obtain feedback on study procedures to refine study protocols and participant instruction materials [ 14 ]. Technical support activities may increase the total time spent both preparing for and conducting a session with a participant. However, the time invested in participants proactively will contribute to improved data quality by ensuring patient understanding of the technology and study measures. Furthermore, digital solutions tailored for specific populations can aid in realizing the potential for web-based research to increase accessibility to underrepresented individuals.

Practical and Logistical Considerations to Conducting Qualitative Interviews and Focus Groups Remotely

Focus groups and interviews are conducted synchronously; therefore, time (and time zone) coordination is required. For individual interviews, offering flexible hours that prioritize participants’ preferences may assist in study enrollment because participants will be able to schedule and mark their calendars for a study visit in real time. To coordinate a focus group, study staff can ask participants about their availability within multiple potential time blocks and choose a time that maximizes attendance. Once a specified time frame has the minimum target focus group size, study staff may call unavailable participants to assess whether there has been a change in schedule or continue recruiting to reach the maximum focus group limit, which can range anywhere from 4 to 12 participants [ 29 ], with smaller groups often preferred for web-based conduct. In general, participants should be made aware before the interview or group of what the policies will be (ie, how long the group will run, expectations for video on or off, and the audio-recording plan).
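The scheduling approach described above can be sketched as choosing the time block with the greatest participant availability, subject to a minimum group size. The block labels and the minimum size of 4 below are illustrative, drawing on the 4 to 12 range cited in the text.

```python
# Illustrative sketch: pick a focus-group time block from participants'
# stated availability, requiring a minimum group size. Block labels and
# min_size are hypothetical.

def pick_time_block(availability, min_size=4):
    """availability: dict mapping participant id -> set of available blocks.
    Returns (block, sorted attendee ids) for the best-attended block,
    or None if no block reaches min_size."""
    counts = {}
    for pid, blocks in availability.items():
        for b in blocks:
            counts.setdefault(b, []).append(pid)
    best = max(counts.items(), key=lambda kv: len(kv[1]), default=None)
    if best is None or len(best[1]) < min_size:
        return None  # keep recruiting or call unavailable participants
    return best[0], sorted(best[1])
```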

HIPAA-compliant videoconferencing software (eg, Zoom and WebEx) is necessary for the conduct of remote qualitative interviews or focus groups (as opposed to phone calls) to facilitate rapport building between study staff and participants and to ensure that participants feel at ease. Many types of videoconferencing software contain features, such as waiting rooms and passcodes, that maximize participants’ security and confidentiality. Still, participants should be informed of the privacy risks associated with participation in remote focus groups (eg, the unsanctioned audio or video recording of groups), and the rules for participation (eg, use of headphones and a prohibition on recording the group) should be clearly articulated at the start of every group. Features such as breakout rooms can also be innovatively used to conduct multiple interviews at one time, such as in the case of exit interviews after focus groups. Microphone and video camera positioning should be considered for both the interviewer and the interviewee, and 5 to 10 minutes should be allotted to ensure the proper placement and functioning of microphones and video cameras to enhance the quality of data. Automated live captioning of the interview conversation (closed captions) may also benefit participants who have difficulty hearing.

Having study staff on call during interviews and focus groups is essential to provide technological support to participants in case of issues. Study staff can provide individual support to participants and troubleshoot issues related to remote participation, including poor connectivity with the internet, audio or camera issues, the use of videoconferencing software, and environmental disruptions [ 11 ]. In the case of challenges that cannot be solved within a reasonable amount of time, study staff should have backup strategies in place to conduct interviews over the phone, allow participants to join focus groups by phone, or reschedule meetings flexibly. These procedures were used in qualitative interviews with patients with young-onset dementia and their caregivers [ 30 ], as well as in focus groups with orthopedic medical providers to enable the recruitment of geographically diverse participants.

We used these strategies at the beginning of the pandemic to transition from an in-person focus group study investigating barriers to smoking cessation clinical trials for Hispanic, Latino, or Latina individuals to remote procedures. Before the pandemic, we recruited Hispanic, Latino, or Latina individuals for focus groups conducted in both English and Spanish. After transitioning to remote research, we ran the web-based focus groups with smaller numbers than intended in person (3-4 people) to ease the burden on the study team while we navigated the new technology and ensured that each participant was able to receive one-on-one assistance. We faced challenges with technology, including finding solutions for individuals who did not have email or webcam access, a noted disparity among older Hispanic individuals [ 31 ]. To increase access, we mailed information to all participants (eg, study information sheet and materials to be discussed during the group) 1 week before the group and expanded our protocol to include both telephone conference calls and videoconferencing calls to accommodate participants’ varying levels of technology access. Despite technological challenges, we found that offering web-based focus groups was helpful for both participants and study staff because we could more flexibly schedule groups with the bilingual study staff member who facilitated the groups. We also offered participants the option to have a test call with a member of the study staff to ensure adequate internet connection, microphone or camera functioning, and confidence navigating the video software. An alternative method would be to include a brief introduction to the video software at the beginning of a qualitative interview or focus group and encourage participants to test different functions (eg, toggling audio and video on and off).

Adapting Facilitation Strategies for Remote Qualitative Data Collection

Although remotely conducted interviews and focus groups may pose some challenges to interviewers in engaging participants, connecting with participants, and encouraging open and active dialogue among participants, there are many verbal and nonverbal strategies that interviewers can adopt. Before the interview, study staff should begin building rapport with participants ( Figure 1 ), explain who will be conducting the interview and their credentials, and provide information on what topics the interview will cover (particularly important for sensitive topics). At the start of the interview or focus group, interviewers should warmly introduce themselves and provide additional reminders to set the appropriate tone. For example, interviewers should encourage participants to be in a quiet and private space (or use headphones) with efforts to minimize environmental distractions (eg, participants should not be driving, doing chores, or eating) [ 11 ]. Interviewers may want to encourage participants to keep their camera on, if they are able, to facilitate engagement and rapport building, but to mute themselves when they are not talking to reduce background noise. If participants are muted, interviewers should be prepared to probe them more enthusiastically than usual to motivate active dialogue and participation. It may be helpful for interviewers to continually encourage participants to share, particularly those who have been quiet. Encouraging diversity of opinion among groups can also help participants feel comfortable expressing their personal experiences and differing perspectives.

Assuming that they are visible to participants, interviewers should also pay attention to their nonverbal communication. If interviewers must take notes during qualitative data collection and are therefore unable to maintain eye contact throughout the interview or focus group, participants should be informed to avoid potential nonverbal miscommunication. Reactive facial expressions are critical in remote qualitative data collection, as body language cannot be observed as it typically would be in person, although some aspects, such as posture, may be observed. Reacting appropriately and nonverbally to what participants share is vital to encourage participants to be open and honest during an interview. The key aspects of nonverbal communication include maintaining eye contact (toward the participant or the camera), using facial expressions to demonstrate understanding and listening, and using body language such as nodding [ 11 ].

For structured and semistructured interviews and focus groups, keeping track of the timing during the interview is also necessary to ensure that all questions are answered, with appropriate time allocated to each section or question. This is particularly important for remotely conducted interviews, in which participants may only reserve the exact expected amount of time for the call (eg, 60 minutes) and when adequate attention and focus might be more difficult to maintain than in person. To support interviewers in managing time, we commonly include time stamps in interview guides and denote the questions to be prioritized. In focus groups, it is recommended to have 2 interviewers on the call if possible. That way, at least one interviewer can be primarily concerned with active listening and engagement with the participants, whereas another interviewer can focus on note-taking and timekeeping.

In our recent qualitative study with patients with young-onset dementia and their caregivers (dyadic interviews), we found it critical to consider in facilitation the specific cognitive challenges of persons with dementia as well as the sensitive nature of dyadic interviews. All questions were piloted with experts in young-onset dementia before the interviews to ensure clarity. Interviewers were prepared to repeat questions several times as well as define or explain keywords as needed. Because couples were asked to share their perspectives regarding the person with dementia’s symptoms and illness progression, as well as relationship satisfaction, in front of each other, we prefaced the interview by validating the difficulty of openly sharing and encouraging participants to be as open as possible. When participants were hesitant to share, we found that sitting with the silence before moving on to a new question encouraged participants to reflect and add to the conversation. Before asking about relationship challenges, the interviewer acknowledged that this might be the first time couples were discussing certain questions and assured couples that we would be available to provide support together or individually after the interview. It is particularly important to consider participants' emotional safety and sense of support in the case of remote interviews.

Essential Training Competencies for Study Staff

At the forefront of training competencies to conduct remote data collection is ensuring study staff have familiarity with practices to promote participant privacy and security, including encrypting computer devices; using secure, encrypted video and audio software; and conducting qualitative data collection in private, quiet locations. Equipping the study team with institutionally encrypted equipment (laptops with webcams and phones) and software programs facilitates standardized and HIPAA-protected data collection [ 1 ]. It is essential that study staff have sufficient familiarity with all technologies used so that they can troubleshoot any problems that may arise for either themselves or the participants and provide technical support as needed [ 11 ]. Therefore, study staff must be thoroughly trained in the use of any relevant technology as well as provided with resources to contact in the case of questions or issues.

Given the unique challenges to rapport building and participant engagement through remote encounters, it is also important to provide study staff with adequate training in verbal and nonverbal communication. For study staff with less experience with participant interaction and without clinical training, providing some level of peer or hierarchical supervision may be helpful in supporting them in developing effective communication skills.

In this paper, we integrated recommendations from previous literature with examples from our ongoing clinical research to identify and respond to specific challenges to remote data collection (Tables 1 and 2). We hope to catalyze other research teams to think critically about the strategies they use in remote data collection and contribute to the collective body of knowledge on best practices through the publication of protocol papers and other methodologically oriented works. It is imperative that research teams thoughtfully and creatively solve problems in response to the challenges they face in remote data collection to ensure the validity and quality of data as well as the patient-centeredness of study procedures.

Future Directions

Future research is needed to evaluate whether data collected through web-based study designs are of the same nature and quality as data collected through traditional in-person approaches and to continue to identify strategies to maximize the validity of data collected remotely. The shift toward more web-based designs prompted by the COVID-19 pandemic brings with it the opportunity to remove many barriers of access to clinical research and engage more diverse participant populations while minimizing the burden on participants. However, without proper capacity building for web-based research, we risk widening the digital divide and perpetuating existing disparities. We discussed our experiences with conducting web-based research with different populations, including individuals underrepresented in research, such as Hispanic, Latino, or Latina individuals and those with serious mental illness, and those who face increased barriers to research participation, such as older adults with dementia and adolescents with learning disabilities. The strategies presented (eg, device provision, increasing technological support, and using multiple modalities to conduct research) are examples of mechanisms to promote equity in research participation. We acknowledge the significant participant burden in using technology for research and that the same digital health solutions do not work for all individuals. Therefore, it is imperative that researchers assess barriers specific to their study designs and populations of interest to mitigate the threat of increasing existing disparities. Additional research is needed to further characterize strategies that can be used to ensure accessibility of virtually conducted research to marginalized and underrepresented populations.


Conflicts of Interest: None declared.

  • Open access
  • Published: 03 October 2022

Quantitative data collection approaches in subject-reported oral health research: a scoping review

  • Carl A. Maida 1 ,
  • Di Xiong 1 , 2 ,
  • Marvin Marcus 1 ,
  • Linyu Zhou 1 , 2 ,
  • Yilan Huang 1 , 2 ,
  • Yuetong Lyu 1 , 2 ,
  • Jie Shen 1 ,
  • Antonia Osuna-Garcia 3 &
  • Honghu Liu 1 , 2 , 4  

BMC Oral Health volume  22 , Article number:  435 ( 2022 )


This scoping review reports on studies that collect survey data using quantitative research to measure self-reported oral health status outcome measures. The objective of this review is to categorize measures used to evaluate self-reported oral health status and oral health quality of life used in surveys of general populations.

The review is guided by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) with the search on four online bibliographic databases. The criteria include (1) peer-reviewed articles, (2) papers published between 2011 and 2021, (3) only studies using quantitative methods, and (4) containing outcome measures of self-assessed oral health status, and/or oral health-related quality of life. All survey data collection methods are assessed and papers whose methods employ newer technological approaches are also identified.

Of the 2981 unduplicated papers, 239 meet the eligibility criteria. Half of the papers use impact scores such as the OHIP-14; 10% use functional measures such as the GOHAI; 26% use two or more measures; and 8% use rating scales of oral health status. The review identifies four data collection methods: in-person, mail-in, Internet-based, and telephone surveys. Most (86%) employ in-person surveys, and 39% are conducted in Asia-Pacific and Middle East countries, with 8% in North America. Sixty-six percent of the studies recruit participants directly from clinics and schools, where the surveys were carried out. The top three sampling methods are convenience sampling (52%), simple random sampling (12%), and stratified sampling (12%). Among the four data collection methods, in-person surveys have the highest response rate (91%), while the lowest response rate occurs in Internet-based surveys (37%). Telephone surveys are used to cover a wider population compared to other data collection methods. There are two noteworthy approaches: (1) sample selection, where researchers employ different platforms to access subjects, and (2) mode of interaction with subjects, with the use of computers to collect self-reported data.

The study provides an assessment of oral health outcome measures, including subject-reported oral health status and notes newly emerging computer technological approaches recently used in surveys conducted on general populations. These newer applications, though rarely used, hold promise for both researchers and the various populations that use or need oral health care.


A fundamentally different approach is currently needed to address the oral health of populations worldwide, namely by considering the perspective of patients or populations and not only dental professionals' views [ 1 ]. It seems increasingly necessary to integrate self-reported perceptions of oral health, as they can complement or even replace clinical measures of dental status in surveys of populations. Indeed, such subjective measures are easy to use in large-scale populations and can provide a broader health perspective as compared to clinically determined measures of dental status alone [ 2 , 3 ]. Since the topic is broad, this scoping review sets out to identify methods employed in population surveys that discuss self-reported perceptions of oral health, and the extent to which new computer-oriented technological approaches are being incorporated in the research methods.

The literature on oral health and dental-related scoping and systematic reviews includes studies that focus on specific populations in terms of disease or clinical conditions, treatments, or political or social status, and these typically do not explore oral health status outcome measures [ 4 , 5 , 6 , 7 , 8 , 9 , 10 , 11 , 12 , 13 , 14 , 15 ]. These studies only occasionally provide perspectives on general populations. A review by Mittal et al. identifies dental Patient-Reported Outcomes (dPROs) and dental Patient-Reported Outcome Measures (dPROMs) related to oral function, orofacial pain, and psychosocial impact [ 16 ]. The study affords a valuable and extensive review of self-reported oral health and quality of life measures, many of which are found in this paper. This scoping review, then, seeks approaches used in subject-reported surveys, including those with general populations, which may broaden the perspective on oral health outcome measures.

The objective of this review is to categorize measures used to evaluate self-reported oral health status and oral health quality of life used in surveys of general populations.

This work is implemented following the framework of scoping reviews [ 17 , 18 , 19 ] and is presented according to the recommendations of the Preferred Reporting of Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR), as listed in Additional file 1 : Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist [ 20 ]. Additional file 5 : Glossary of Terms provides definitions for the important terms used across the paper.

Search strategy and data sources

A health science librarian assisted in the development of a search strategy that identified papers concerning subject-reported oral health status surveys. The search terms consisted of three broad categories: survey methods, subject-reported outcomes, and oral health and disease (see Additional file 2 : Search Terms for the full list of search strings). The search comprised peer-reviewed journal articles, conference proceedings, and reviews with at least one keyword from each of the three categories. Four online databases were used: Ovid Medline, Embase, Web of Science, and Cochrane Reviews and Trials. In addition, a manual search used similar keywords for the gray literature archived on medRxiv. The search focused on peer-reviewed papers written in English and published between 2011 and September 2021. Publications in the last decade were reviewed to investigate the extent to which different methods were being used and the trends that occurred during this period. The final search was completed on September 29, 2021. Using the current decade provides a period in which there is considerable interest in non-clinical oral health status outcome measures and the potential for examining technological innovation. All references were imported for review and appraisal. Duplicates were identified using Mendeley (Mendeley, London, UK) and manually verified. After removing the duplicates, data were tabulated in Microsoft Excel (Microsoft, Redmond, WA, USA) for recording screening results and data charting.
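Deduplication of exported references of this kind can also be scripted. The sketch below uses a deliberately crude normalized-title key; Mendeley's actual matching logic is more sophisticated, and the sample records are hypothetical:

```python
import re

def normalize(title):
    """Build a crude matching key: lowercase, alphanumerics only."""
    return re.sub(r"[^a-z0-9]", "", title.lower())

def deduplicate(records):
    """Keep the first record seen for each normalized title."""
    seen, unique = set(), []
    for rec in records:
        key = normalize(rec["title"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

refs = [
    {"title": "Oral Health Surveys: A Review", "db": "Medline"},
    {"title": "Oral health surveys - a review.", "db": "Embase"},
    {"title": "Telephone Surveys of OHRQoL", "db": "Web of Science"},
]
print(len(deduplicate(refs)))  # 2 unique records remain
```

In practice, keys built from DOIs, or from first author plus publication year, are more robust than titles alone, which is one reason manual verification remains necessary.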

Study inclusion criteria

Studies that did not align with the research objective were excluded using a screening tool (Additional file 3 : Search Tool). First, the titles and abstracts of publications were screened to determine if studies conducted quantitative surveys and to assess if self-reported and/or proxy-reported oral health status (OHS) was a primary objective. Only surveys with more than three questions related to OHS were considered. Studies with secondary analysis were excluded because their data collection methods were normally not developed as part of the research but had been developed previously. Papers whose sole purpose was to validate well-known measures of oral health were also rejected since the intent was not to assess the OHS of a population. Literature reviews were likewise excluded, as were papers describing results from focus groups and other qualitative studies. Papers whose objectives were to validate measures or predict specific oral disease entities, such as caries or gingival bleeding, rather than overall OHS, were also excluded. Studies primarily focusing on general health status or other systemic diseases instead of OHS were eliminated. Randomized Controlled Trials (RCT) or quasi-RCT studies that tested an active agent (e.g., therapy, experiment, or medicine) were excluded because the main research purpose was a comparison of treatments rather than an assessment of subject-reported OHS.

The research team performed the secondary screening through a full-text review. We dropped papers whose full text was missing or not in English. Then, we screened the available full-text works against the inclusion criteria described above and further excluded papers without information about data collection methods.

Selection strategies

Figure  1 outlines the review process utilizing the PRISMA-ScR framework. The title-and-abstract screening was completed by a researcher (D.X.) against the inclusion criteria using a screening tool (Additional file 3 : Search Tool). To check for reliability and consistency, one of the researchers (L.Z.) randomly screened 10% of articles independently and compared the inclusion decisions. Given the result of title-and-abstract screening, two researchers (L.Z. and Y.H.) verified the eligibility of the remaining articles independently through full-text review. Inclusion discrepancies were resolved by an additional researcher (D.X.).

figure 1

PRISMA framework with additional examples

Data extraction

The data charting form (Additional file 4 : Data Charting Form) consists of quantitative and qualitative variables for the data collection methods and their characteristics, such as outcome measures, use of assistive devices/tools or data sources, and report type. The form was pre-tested by two project staff (C.M. and M.M.) before being utilized. Two researchers (Y.H. and L.Z.) extracted data using the form. Two project staff (C.M. and M.M.) collaborated to review the charted study characteristics, and discrepancies were addressed through discussion.

Data synthesis and analysis

The scoping review synthesizes the research findings based on dimensions and attributes of major oral health survey data collection methods using descriptive and content analyses. The review provides an overview of various related data collection methods in the recent literature, which refer to the quantitative methods to collect information from a pool of respondents, and the trends in using these new technological approaches, which involve computerized modes, Internet-supported devices and interactive web technologies. Through the literature review, we locate four major types of data collection methods: in-person, Internet-based, telephone-based, and mail-in based approaches.

Screening and study selection

After removing duplicates, the initial search yielded 2981 articles from four online databases for title-and-abstract screening; 2503 of these were excluded after being examined against the inclusion criteria. The interrater reliability of screening was measured by Kappa agreement as 0.94 (95% confidence interval [0.89, 0.99]) for title-and-abstract screening, which implies almost perfect agreement [ 21 ]. After full-text review, a further 239 articles were excluded, and the remaining 239 studies were summarized and categorized based on the pre-tested data charting form. In addition, we identified 12 studies with various technological approaches to data collection. Figure  1 presents the PRISMA framework used for this scoping review.
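Interrater agreement of the kind reported here (Cohen's kappa) can be computed directly from the two screeners' paired include/exclude decisions. A minimal pure-Python sketch, where the decision vectors are hypothetical rather than the review's actual screening data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' paired categorical decisions."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: proportion of items rated identically
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: product of each rater's marginal proportions
    categories = set(a) | set(b)
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical double-screening decisions (1 = include, 0 = exclude)
rater1 = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
rater2 = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0]
print(round(cohens_kappa(rater1, rater2), 2))  # → 0.8
```

Values above 0.81 are conventionally interpreted as almost perfect agreement, consistent with the 0.94 reported in this review.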

General characteristics of included studies

Table 1 presents various characteristics of the 239 articles that meet inclusion criteria that were published from 2011 to September 2021. Fifty-six percent of the papers are published in dental journals. About 40% of the papers are published in journals from the Asia-Pacific and Middle East region (APAC), and only 8.4% are from North America (NA). The majority of studies (69%) focus on the general population. Most (88.6%) of the studies use in-person surveys. Around two-thirds of the studies invite and recruit participants from the study sites, e.g., schools, clinics, and hospitals. Some studies recruit participants by having the research team visit communities (16%) or by sampling directly from a database (13%). In the latter case, participants are selected using probability and/or non-probability sampling methods, including convenience sampling (52%), simple random sampling (12%), and stratified sampling (12%). Most studies (193 or 80.8%) investigate self-reported outcomes. Dental examinations accompany the survey in 54% of the studies, while 32% of studies do not use any clinical exam or records. The data charting details are listed in Additional file 4 : Data Charting Form.

Characteristics of data collection methods

The four main data collection methods include in-person (N = 206, 86.2%), mail-in (N = 15, 6.3%), Internet-based (N = 6, 2.5%), and telephone-based (N = 3, 1.3%) surveys. The characteristics of the various data collection methods are summarized in Table 2 .

The majority of the studies using in-person surveys have high response rates, with an average of 90.6%. Among studies using in-person survey methods, 55.8% employ face-to-face interviews, while 35.4% use a paper-and-pencil approach. Participants for 58.7% of the studies are recruited directly from clinics [ 22 ], hospitals [ 23 ], and community care centers [ 24 ]. For those sites with electronic records, additional data sources are directly linked to the survey, for example, clinical dental exams with visual components (e.g., X-ray [ 25 ] and pictures [ 26 ]) and medical records [ 23 , 27 , 28 ]. Moreover, different qualitative assessments (e.g., Malocclusion Assessment [ 22 ] and Masticatory Performance Test [ 24 , 29 ]) are captured in patient progress notes.

The mail-in survey method is used by 15 studies and may be more cost-effective than in-person delivery; surveys were distributed through two main channels, via post (80%) and by carriers (20%). Mail-in surveys have a relatively high response rate, averaging 72%, especially when children or other respondents bring surveys home to complete. Similar to in-person surveys, mail-in surveys can incorporate additional resources, such as photographs and explanations of clinical conditions and treatments [ 30 , 31 ].

Only six studies are identified as using an Internet-based survey, mainly through computer-assisted web interviews (4 studies) and email (2 studies). Three papers employ direct recruitment, and another three recruit participants through websites and databases. The average response rate is as low as 36.7% for this method, with small sample sizes (median of 259 participants).

Three studies use a telephone survey method, covering larger populations compared to other survey methods, with more responders on average. Two of these studies recruit participants through an existing database, and all surveys used interviewers. Computer-Assisted Telephone Interviews (CATI) [ 32 ] and Voice Response Systems [ 33 ], which are commonly used in industry, are not found in the studies.

In addition to the data collection methods, we further categorize the measures found in the 239 articles. Table 3 presents the frequencies and percentages of the various self-reported outcome measures. The three basic approaches are oral health impact measures [ 34 ], functional measures [ 34 ], and self- or proxy-ratings of OHS, with the terms defined in Additional file 5 : Glossary of Terms. These are used as single measures or in combination. The Oral Health Impact Profile-14 (OHIP-14) is the most prevalent single measure, with 69 papers and 29% overall, of which 25 papers concern child impact, representing 10% of the total number of selected papers. The Geriatric Oral Health Assessment Index (GOHAI), a functional measure, is second with 21 papers and 9% overall. The GOHAI ranks first among studies of the elderly. There are also two adolescent papers, representing 9% of the functioning category. The self- or proxy-rating of OHS has 18 single-measure papers, representing 8% of these articles. Of these, 12, or 80%, are children's measures, representing 5% of all selected papers.

There is a total of 63 papers using more than one type of measure, either combining functional and impact measures (36, or 15%) or combining self-rated OHS with one or more of the other measures (27, or 11%). The group of single impact measures makes up 50% of the overall total and is also represented where two or more measures are used. The single functional measure, the GOHAI, accounts for only 9% of all measures but also plays a role in combination with other measures. Finally, self-reported OHS as a single measure represents 8% of the studies; its role is mainly in combination with other measures, representing another 15% of the articles. In total, children's oral health measures form a considerable portion of the self-reported oral health outcome research papers, representing 16% of all studies. There are additional studies where children’s measures are used in combination with adult measures.

Recently, the use of technological approaches has emerged in the field of survey research to improve the quality and quantity of data collection. After reviewing and charting all 239 qualifying articles, the twelve studies that employ technological approaches are summarized in Table 4 .

This scoping review provides an overview of data collection methods used for subject-reported surveys to measure oral health outcomes. Studies are characterized by four survey methods (in-person, mail-in, Internet-based, and telephone) and by summarized dimensions and attributes of data collection for each method, such as technological approaches, survey population, or sampling methods. Studies typically employ in-person surveys, and more studies were conducted in Asia-Pacific and Middle East countries than in any other world region. Most studies recruit participants directly from study sites. Both probability and non-probability sampling methods are employed, typically convenience sampling, simple random sampling, and stratified sampling. Studies that achieve the highest response rate on average use in-person surveys, while the lowest rate occurs in Internet-based surveys. Telephone surveys are used to cover a wider population compared to other data collection methods. Many studies, especially those using in-person and mail-in data collection methods, incorporate supplemental data types and technological approaches. Outcome measures are frequently used to evaluate impacts caused by functional limitations related to physical, psychological, and social factors.

Frequently used self-reported oral health status and OHRQoL measures are the OHIP-14, an impact measure, and the GOHAI, a functional measure. Children’s oral health outcome measures form a considerable portion of the self-reported oral health outcome research papers. Although the OHIP-14 is the most utilized single measure, many other papers use only portions of this measure while adding other outcome measures, such as dental care needs, satisfaction, and oral health status. The validity of these measures is therefore compromised and cannot provide insight into the degree to which the studies are measuring self-reported oral health status or quality of life [ 4 ]. Other measures rate an individual’s oral health status using a simple self-rating scale, from very poor to excellent. This approach is more directly related to a person’s oral conditions, and therefore their perceptions and behavior tend to be more consistent with this rating [ 56 , 57 ]. These self-rating measures focus on the overall dimension of perceived oral health status. Unlike the measures previously discussed, these simple ratings do not delineate the psychological, social, and physical dimensions of oral health. Nevertheless, such measures can enable researchers to identify hidden dimensions by analyzing independent variables that account for the respondent’s perception.

This review identifies research that employs more conventional methods. The face-to-face interview and the pencil-and-paper format are conventionally used in many studies, along with a clinical dental exam. While offering unique flexibility and easier administration, in-person approaches are more labor-intensive and normally take more time compared to other methods. Countries such as Brazil have relied for years on these techniques to develop national epidemiological oral health surveys [ 28 , 58 , 59 ]. Although these surveys are very well organized and established throughout the country, this review does not find that newer technological approaches have been introduced into their conventional approach. In this case, there may be little incentive to change because their methods are well understood and employing more technological approaches may be costly.

The use of Internet-based surveys is increasingly common in the medical field. Although these surveys tend to yield lower response rates, this approach is normally more cost-effective [ 60 ]. Internet-based surveys have many notable advantages, including easy administration, a fast data collection process, lower cost, wider population coverage, and better data quality with fewer overall data errors and fewer missing items [ 61 , 62 , 63 ]. However, this data collection method is constrained by sample bias, topic salience, data security concerns, and low digital literacy, all of which may affect response rates [ 62 ]. In settings where Internet-based surveys are not practical, longstanding and effective conventional oral health data collection methods will continue to be used in research. It is evident from this review that the use of computerized technological approaches is limited. While such approaches improve the quality and quantity of data collection in survey research, only twelve studies in this review employ them. The most widely used technical approaches are Computer-Assisted Personal Interviewing (CAPI) and online survey platforms (e.g., Google Forms and SurveyMonkey).

Two noteworthy approaches to survey research methodology emerge from this review: (1) sample selection and (2) mode of interaction with research subjects. North American researchers have used different platforms to reach subjects for their studies. Canadian studies use random digit dialing to recruit participants and conduct computer-assisted interviews [54]. In the United States, researchers access existing polling panels or use Amazon's MTurk platform, whose "workers" are paid small amounts for each survey they complete [64]. The second approach is the use of computers to collect self-reported data. The basic technique is CAPI, in which the interviewer enters responses directly into a database. Computer-Assisted Telephone Interviewing (CATI) is a related technique in which the interviewer follows a scripted interview guided by a questionnaire displayed on the screen. A third, fully Internet-based technique, Computer-Assisted Web Interviewing (CAWI), requires no live interviewer: the respondent follows a script built in a web-interview design program that may include images, audio and video clips, and web-based information.
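To make the distinction concrete, the core of a CATI/CAPI workflow can be sketched in a few lines: the interviewer (or, in CAWI, the respondent) is walked through a scripted questionnaire with simple skip logic, and each validated answer is written directly to a database. The item IDs, wording, and schema below are illustrative assumptions, not taken from any reviewed study.

```python
import sqlite3

# Hypothetical questionnaire script for a CATI/CAPI-style flow: each item is
# shown on screen and the keyed answer goes straight into a database.
SCRIPT = [
    {"id": "q1", "text": "How would you rate your oral health?",
     "options": ["excellent", "good", "fair", "poor"]},
    {"id": "q2", "text": "Have you visited a dentist in the past 12 months?",
     "options": ["yes", "no"]},
    {"id": "q2a", "text": "Was the visit for a check-up or treatment?",
     "options": ["check-up", "treatment"],
     "ask_if": ("q2", "yes")},  # skip logic: only asked after a "yes" to q2
]

def run_interview(conn, respondent_id, answer_source):
    """Walk the scripted questionnaire, validating and storing each answer."""
    conn.execute("CREATE TABLE IF NOT EXISTS responses "
                 "(respondent TEXT, item TEXT, answer TEXT)")
    answers = {}
    for item in SCRIPT:
        gate = item.get("ask_if")
        if gate and answers.get(gate[0]) != gate[1]:
            continue  # skip pattern: the gating answer was not given
        answer = answer_source(item)
        if answer not in item["options"]:
            raise ValueError(f"{item['id']}: {answer!r} is not a valid option")
        answers[item["id"]] = answer
        conn.execute("INSERT INTO responses VALUES (?, ?, ?)",
                     (respondent_id, item["id"], answer))
    conn.commit()
    return answers
```

In a live interview, `answer_source` would prompt the interviewer or respondent on screen; passing it as a callable here simply makes the flow easy to simulate and test.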

An innovative technological approach worth noting is OralCam, which enables self-examination with a smartphone camera [65]. The study builds on medical research that detects liver problems and other diseases from facial photographs [66]. The smartphone camera captures an intraoral picture, taken with a mouth opener, which is analyzed by diagnostic algorithms, specifically a deep convolutional neural network trained with a multitask learning approach on over three thousand intraoral photos of teeth and gingiva. The algorithms analyze the captured picture, along with survey data, to assess several dental conditions, including caries, chronic gingival inflammation, and dental calculus. This use of multitask learning, together with the extensive availability of cell phones, may revolutionize oral health research and care.
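The multitask learning idea behind such systems can be illustrated with a toy sketch: one shared feature representation (standing in for the CNN backbone, which is omitted here) feeds a separate output head per dental condition. The features, weights, and condition names below are invented for illustration and are not OralCam's actual model.

```python
import math

# Toy multitask-learning sketch (not OralCam's real architecture): a shared
# representation feeds one small output head per dental condition.
CONDITIONS = ["caries", "gingival_inflammation", "calculus"]

def shared_features(pixel_stats, survey):
    # Stand-in for a CNN backbone: combine image statistics with survey answers.
    return [pixel_stats["redness"], pixel_stats["texture"],
            1.0 if survey["bleeding_gums"] else 0.0]

HEADS = {  # one linear head per task; weights and biases are illustrative only
    "caries": ([0.2, 1.5, 0.1], -1.0),
    "gingival_inflammation": ([1.8, 0.2, 1.2], -1.5),
    "calculus": ([0.3, 1.1, 0.4], -0.8),
}

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(pixel_stats, survey):
    """Return one probability-like score per condition from shared features."""
    x = shared_features(pixel_stats, survey)
    return {c: sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
            for c, (weights, bias) in HEADS.items()}
```

The design point is that all heads share one representation, so evidence useful to several conditions (e.g., gingival redness) is learned once; in the real system the shared features come from a trained convolutional network rather than hand-picked statistics.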

This scoping review is limited to oral health survey-based studies published in English in peer-reviewed journals and medRxiv between 2011 and 2021. A further limitation is that many of the reviewed papers do not adequately describe their data collection methods. Publications using secondary data from national studies are excluded because those researchers were not involved in designing the methods or conducting the data collection; such publications typically refer readers to the original study for methodological detail, and the original data collection may predate this review's time frame. The fifteen papers using secondary data published during the study period represent only about six percent of the reviewed papers, so the overall impact of this exclusion on the review's results is minimal.

Conclusions

This scoping review provides an assessment of oral health outcome measures, including subject-reported oral health status, and notes newly emerging computer-based technological approaches used in surveys of general populations. Such approaches, although rare in the reviewed studies, hold promise for both researchers and the various populations that use or need oral health care. Future studies that employ more fully developed computer applications to boost recruitment and participation of subjects with wide and diverse backgrounds across nearly unlimited geographic areas can provide a broader perspective on oral health survey methods and outcomes.

Availability of data and materials

All data generated or analyzed during this study are included in this published article and its supplementary information files.

Abbreviations

PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews

AP: Asia-Pacific (including the Middle East)

LA: Latin America

NA: North America

CAPI: Computer-Assisted Personal Interviewing

CAWI: Computer-Assisted Web Interviewing

CATI: Computer-Assisted Telephone Interviewing

OHRQoL: Oral Health-Related Quality of Life

OHIP-14: Oral Health Impact Profile-14

GOHAI: Geriatric Oral Health Assessment Index

OHS: Oral Health Status

Watt RG, Daly B, Allison P, Macpherson LMD, Venturelli R, Listl S, et al. Ending the neglect of global oral health: time for radical action. Lancet. 2019;394:261–72. https://doi.org/10.1016/S0140-6736(19)31133-X .

Liu H, Hays R, Wang Y, Marcus M, Maida C, Shen J, et al. Short form development for oral health patient-reported outcome evaluation in children and adolescents. Qual Life Res. 2018;27:1599–611. https://doi.org/10.1007/S11136-018-1820-9 .

Wang Y, Hays R, Marcus M, Maida C, Shen J, Xiong D, et al. Development of a parents’ short form survey of their children’s oral health. Int J Pediatr Dent. 2019;29:332–44. https://doi.org/10.1111/ipd.12453 .

Yang C, Crystal YO, Ruff RR, Veitz-Keenan A, McGowan RC, Niederman R. Quality appraisal of child oral health-related quality of life measures: a scoping review. JDR Clin Transl Res. 2020;5:109–17. https://doi.org/10.1177/2380084419855636 .

Gupta M, Bosma H, Angeli F, Kaur M, Chakrapani V, Rana M, et al. A mixed methods study on evaluating the performance of a multi-strategy national health program to reduce maternal and child health disparities in Haryana. India BMC Public Health. 2017;17:698. https://doi.org/10.1186/s12889-017-4706-9 .

Keboa MT, Hiles N, Macdonald ME. The oral health of refugees and asylum seekers: a scoping review. Glob Health. 2016;12:1–11. https://doi.org/10.1186/S12992-016-0200-X/TABLES/2 .

Wilson NJ, Lin Z, Villarosa A, Lewis P, Philip P, Sumar B, et al. Countering the poor oral health of people with intellectual and developmental disability: a scoping literature review. BMC Public Health. 2019;19:1–16. https://doi.org/10.1186/S12889-019-7863-1/TABLES/1 .

Ajwani S, Jayanti S, Burkolter N, Anderson C, Bhole S, Itaoui R, et al. Integrated oral health care for stroke patients—a scoping review. J Clin Nurs. 2017;26:891–901. https://doi.org/10.1111/JOCN.13520 .

Shrestha AD, Vedsted P, Kallestrup P, Neupane D. Prevalence and incidence of oral cancer in low- and middle-income countries: a scoping review. Eur J Cancer Care. 2020;29:66. https://doi.org/10.1111/ECC.13207 .

Patterson-Norrie T, Ramjan L, Sousa MS, Sank L, George A. Eating disorders and oral health: a scoping review on the role of dietitians. J Eat Disord. 2020;8:1–21. https://doi.org/10.1186/S40337-020-00325-0/TABLES/1 .

Lansdown K, Smithers-Sheedy H, Mathieu Coulton K, Irving M. Oral health outcomes for people with cerebral palsy: a scoping review protocol. JBI Database System Rev Implement Rep 2019;17:2551–8. https://doi.org/10.11124/JBISRIR-2017-004037 .

Beaton L, Humphris G, Rodriguez A, Freeman R. Community-based oral health interventions for people experiencing homelessness: a scoping review. Community Dent Health. 2020;37:150–60. https://doi.org/10.1922/CDH_00014BEATON11 .

Marquillier T, Lombrail P, Azogui-Lévy S. Social inequalities in oral health and early childhood caries: How can they be effectively prevented? A scoping review of disease predictors. Rev Epidemiol Sante Publique. 2020;68:201–14. https://doi.org/10.1016/J.RESPE.2020.06.004 .

Como DH, Duker LIS, Polido JC, Cermak SA. The persistence of oral health disparities for African American children: a scoping review. Int J Environ Res Public Health. 2019;16:66. https://doi.org/10.3390/IJERPH16050710 .

Stein K, Farmer J, Singhal S, Marra F, Sutherland S, Quiñonez C. The use and misuse of antibiotics in dentistry: a scoping review. J Am Dent Assoc. 2018;149:869-884.e5. https://doi.org/10.1016/J.ADAJ.2018.05.034 .

Mittal H, John MT, Sekulić S, Theis-Mahon N, Rener-Sitar K. Patient-reported outcome measures for adult dental patients: a systematic review. J Evid Based Dent Pract. 2019;19:53–70. https://doi.org/10.1016/J.JEBDP.2018.10.005 .

Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Method Theory Pract. 2005;8:19–32. https://doi.org/10.1080/1364557032000119616 .

Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69. https://doi.org/10.1186/1748-5908-5-69 .

Paré G, Trudel M-C, Jaana M, Kitsiou S. Synthesizing information systems knowledge: a typology of literature reviews. Inf Manag. 2015. https://doi.org/10.1016/j.im.2014.08.008 .

Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169:467–73. https://doi.org/10.7326/M18-0850 .

Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159. https://doi.org/10.2307/2529310 .

Masood M, Masood Y, Newton T, Lahti S. Development of a conceptual model of oral health for malocclusion patients. Angle Orthod. 2015;85:1057–63. https://doi.org/10.2319/081514-575.1 .

Massarente DB, Domaneschi C, Marques HHS, Andrade SB, Goursand D, Antunes JLF. Oral health-related quality of life of paediatric patients with AIDS. BMC Oral Health. 2011;11:2. https://doi.org/10.1186/1472-6831-11-2 .

Lu TY, Chen JH, Du JK, Lin YC, Ho PS, Lee CH, et al. Dysphagia and masticatory performance as a mediator of the xerostomia to quality of life relation in the older population. BMC Geriatr. 2020;20:66. https://doi.org/10.1186/S12877-020-01901-4 .

Strömberg E, Holmèn A, Hagman-Gustafsson ML, Gabre P, Wardh I. Oral health-related quality-of-life in homebound elderly dependent on moderate and substantial supportive care for daily living. Acta Odontol Scand. 2013;71:771–7. https://doi.org/10.3109/00016357.2012.734398 .

Morgan JP, Isyagi M, Ntaganira J, Gatarayiha A, Pagni SE, Roomian TC, et al. Building oral health research infrastructure: the first national oral health survey of Rwanda. Glob Health Act. 2018;11:66. https://doi.org/10.1080/16549716.2018.1477249 .

Preciado A, del Río J, Suárez-García MJ, Montero J, Lynch CD, Castillo-Oyagüe R. Differences in impact of patient and prosthetic characteristics on oral health-related quality of life among implant-retained overdenture wearers. J Dent. 2012;40:857–65. https://doi.org/10.1016/J.JDENT.2012.07.006 .

de Quadros Coelho M, Cordeiro JM, Vargas AMD, de Barros Lima Martins AME, de Almeida Santa Rosa TT, Senna MIB, et al. Functional and psychosocial impact of oral disorders and quality of life of people living with HIV/AIDS. Qual Life Res. 2015;24:503–11. https://doi.org/10.1007/S11136-014-0778-5 .

Said M, Otomaru T, Aimaijiang Y, Li N, Taniguchi H. Association between masticatory function and oral health-related quality of life in partial maxillectomy patients. Int J Prosthodont. 2016;29:561–4. https://doi.org/10.11607/IJP.4852 .

Owens J, Jones K, Marshman Z. The oral health of people with learning disabilities—a user–friendly questionnaire survey. Community Dent Health. 2017;34:4–7. https://doi.org/10.1922/CDH_3867OWENS04 .

Abuzar MA, Kahwagi E, Yamakawa T. Investigating oral health-related quality of life and self-perceived satisfaction with partial dentures. J Investig Clin Dent. 2012;3:109–17. https://doi.org/10.1111/J.2041-1626.2012.00111.X .

Wilson D, Taylor A, Chittleborough C. The second Computer Assisted Telephone Interview (CATI) Forum: the state of play of CATI survey methods in Australia. Aust NZ J Public Health. 2001;25:272–4. https://doi.org/10.1111/J.1467-842X.2001.TB00576.X .

Lee H, Friedman ME, Cukor P, Ahern D. Interactive voice response system (IVRS) in health care services. Nurs Outlook. 2003;51:277–83. https://doi.org/10.1016/S0029-6554(03)00161-1 .

Campos JADB, Zucoloto ML, Bonafé FSS, Maroco J. General Oral Health Assessment Index: a new evaluation proposal. Gerodontology. 2017;34:334–42. https://doi.org/10.1111/GER.12270 .

Slade GD. Derivation and validation of a short-form oral health impact profile. Community Dent Oral Epidemiol. 1997;25:284–90. https://doi.org/10.1111/J.1600-0528.1997.TB00941.X .

Adulyanon S, Vourapukjaru J, Sheiham A. Oral impacts affecting daily performance in a low dental disease Thai population. Community Dent Oral Epidemiol. 1996;24:385–9. https://doi.org/10.1111/J.1600-0528.1996.TB00884.X .

Pahel BT, Rozier RG, Slade GD. Parental perceptions of children’s oral health: the Early Childhood Oral Health Impact Scale (ECOHIS). Health Qual Life Outcomes. 2007;5:6. https://doi.org/10.1186/1477-7525-5-6 .

Gherunpong S, Tsakos G, Sheiham A. Developing and evaluating an oral health-related quality of life index for children: the CHILD-OIDP. Community Dent Health. 2004;21:161–9.

Slade GD, Spencer AJ. Development and evaluation of the Oral Health Impact Profile. Community Dent Health. 1994;11:3–11.

Broder HL, Wilson-Genderson M. Reliability and convergent and discriminant validity of the Child Oral Health Impact Profile (COHIP Child’s version). Community Dent Oral Epidemiol. 2007;35(Suppl 1):20–31. https://doi.org/10.1111/J.1600-0528.2007.0002.X .

Atieh MA. Arabic version of the geriatric oral health assessment Index. Gerodontology. 2008;25:34–41. https://doi.org/10.1111/j.1741-2358.2007.00195.x .

Wright WG, Spiro A, Jones JA, Rich SE, Garcia RI. Development of the teen oral health-related quality of life instrument. J Public Health Dent. 2017;77:115–24. https://doi.org/10.1111/JPHD.12181 .

Jokovic A, Locker D, Tompson B, Guyatt G. Questionnaire for measuring oral health-related quality of life in eight- to ten-year-old children. Pediatr Dent. 2004;26:512–8.

Jokovic A, Locker D, Stephens M, Kenny D, Tompson B, Guyatt G. Validity and reliability of a questionnaire for measuring child oral-health-related quality of life. J Dent Res. 2002;81:459–63. https://doi.org/10.1177/154405910208100705 .

Broughton JR, TeH Maipi J, Person M, Randall A, Thomson WM. Self-reported oral health and dental service-use of rangatahi within the rohe of Tainui. NZ Dent J. 2012;108:90–4.

Monaghan N, Karki A, Playle R, Johnson I, Morgan M. Measuring oral health impact among care home residents in Wales. Community Dent Health. 2017;34:14–8. https://doi.org/10.1922/CDH_3950MORGAN05 .

Echeverria MS, Silva AER, Agostini BA, Schuch HS, Demarco FF. Regular use of dental services among university students in southern Brazil. Revista de Saude Publica 2020;54:85. https://doi.org/10.11606/S1518-8787.2020054001935 .

Mohamad Fuad MA, Yacob H, Mohamed N, Wong NI. Association of sociodemographic factors and self-perception of health status on oral health-related quality of life among the older persons in Malaysia. Geriatr Gerontol Int. 2020;20(Suppl 2):57–62. https://doi.org/10.1111/GGI.13969 .

Hanisch M, Wiemann S, Bohner L, Kleinheinz J, Susanne SJ. Association between oral health-related quality of life in people with rare diseases and their satisfaction with dental care in the health system of the Federal Republic of Germany. Int J Environ Res Public Health. 2018. https://doi.org/10.3390/IJERPH15081732 .

Nam SH, Kiml HY, IlChun D. Influential factors on the quality of life and dental health of university students in a specific area. Biomed Res. 2017;28:12.

Mortimer-Jones S, Stomski N, Cope V, Maurice L, Théroux J. Association between temporomandibular symptoms, anxiety and quality of life among nursing students. Collegian. 2019;26:373–7. https://doi.org/10.1016/J.COLEGN.2018.10.003 .

Liu C, Zhang S, Zhang C, Tai B, Jiang H, Du M. The impact of coronavirus lockdown on oral healthcare and its associated issues of pre-schoolers in China: an online cross-sectional survey. BMC Oral Health. 2021;21:66. https://doi.org/10.1186/s12903-021-01410-9 .

Makizodila BAM, van de Wijdeven JHE, de Soet JJ, van Selms MKA, Volgenant CMC. Oral hygiene in patients with motor neuron disease requires attention: a cross-sectional survey study. Spec Care Dent. 2021. https://doi.org/10.1111/SCD.12636 .

Kotzer RD, Lawrence HP, Clovis JB, Matthews DC. Oral health-related quality of life in an aging Canadian population. Health Qual Life Outcomes. 2012;10:50. https://doi.org/10.1186/1477-7525-10-50 .

Hakeberg M, Wide U. General and oral health problems among adults with focus on dentally anxious individuals. Int Dent J. 2018;68:405–10. https://doi.org/10.1111/IDJ.12400 .

Lawal FB, Olawole WO, Sigbeku OF. Self rating of oral health status by student dental surgeon assistants in Ibadan, Nigerian—a Pilot Survey. Ann Ibadan Postgrad Med. 2013;11:12.

Locker D, Wexler E, Jokovic A. What do older adults’ global self-ratings of oral health measure? J Public Health Dent. 2005;65:146–52. https://doi.org/10.1111/J.1752-7325.2005.TB02804.X .

Saintrain MVDL, de Souza EHA. Impact of tooth loss on the quality of life. Gerodontology. 2012;29:66. https://doi.org/10.1111/J.1741-2358.2011.00535.X .

Grando LJ, Mello ALSF, Salvato L, Brancher AP, del Moral JAG, Steffenello-Durigon G. Impact of leukemia and lymphoma chemotherapy on oral cavity and quality of life. Spec Care Dent. 2015;35:236–42. https://doi.org/10.1111/SCD.12113 .

Ebert JF, Huibers L, Christensen B, Christensen MB. Paper- or Web-Based Questionnaire Invitations as a method for data collection: cross-sectional comparative study of differences in response rate, completeness of data, and financial cost. J Med Internet Res. 2018;20:66. https://doi.org/10.2196/JMIR.8353 .

Hohwü L, Lyshol H, Gissler M, Jonsson SH, Petzold M, Obel C. Web-based versus traditional paper questionnaires: a mixed-mode survey with a Nordic perspective. J Med Internet Res. 2013;15:66. https://doi.org/10.2196/JMIR.2595 .

Maymone MBC, Venkatesh S, Secemsky E, Reddy K, Vashi NA. Research techniques made simple: Web-Based Survey Research in Dermatology: conduct and applications. J Invest Dermatol. 2018;138:1456–62. https://doi.org/10.1016/J.JID.2018.02.032 .

Weigold A, Weigold IK, Natera SN. Response rates for surveys completed with paper-and-pencil and computers: using meta-analysis to assess equivalence. Soc Sci Comput Rev. 2018;37:649–68. https://doi.org/10.1177/0894439318783435 .

Burnham MJ, Le YK, Piedmont RL. Who is Mturk? Personal characteristics and sample consistency of these online workers. Ment Health Relig Cult. 2018;21:934–44. https://doi.org/10.1080/13674676.2018.1486394 .

Liang Y, Fan HW, Fang Z, Miao L, Li W, Zhang X, et al. OralCam: enabling self-examination and awareness of oral health using a smartphone camera. In: Conference on human factors in computing systems—proceedings, vol 20. New York: Association for Computing Machinery; 2020. p. 1–13. https://doi.org/10.1145/3313831.3376238 .

Ding X, Jiang Y, Qin X, Chen Y, Zhang W, Qi L. Reading face, reading health: Exploring face reading technologies for everyday health. In: Conference on human factors in computing systems—proceedings. New York: Association for Computing Machinery; 2019. p. 1–13. https://doi.org/10.1145/3290605.3300435 .

Acknowledgements

Not applicable.

Funding

This research was supported by an NIDCR/NIH grant to the University of California, Los Angeles (UCLA) (U01DE029491). The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author information

Authors and Affiliations

Division of Oral and Systemic Health Sciences, School of Dentistry, University of California, Los Angeles, 10833 Le Conte Ave, Los Angeles, CA, USA

Carl A. Maida, Di Xiong, Marvin Marcus, Linyu Zhou, Yilan Huang, Yuetong Lyu, Jie Shen & Honghu Liu

Department of Biostatistics, Fielding School of Public Health, University of California, Los Angeles, 650 Charles E Young Drive South, Los Angeles, CA, USA

Di Xiong, Linyu Zhou, Yilan Huang, Yuetong Lyu & Honghu Liu

Louise M. Darling Biomedical Library, University of California, Los Angeles, 12-077 Center for Health Sciences, Los Angeles, CA, USA

Antonia Osuna-Garcia

Division of General Internal Medicine and Health Services Research, Geffen School of Medicine, University of California, Los Angeles, 10833 Le Conte Ave, Los Angeles, CA, USA

Contributions

C.M., D.X., M.M. and H.L. conceptualized the study, designed the data collection form, and established the data analysis plan. A.O. developed search strategies and carried out searches of multiple databases. D.X., Y.L., Y.H., J.S. and Y.L. performed additional searching and tested the data charting form. D.X., Y.L., and Y.H. helped to screen studies for relevance and chart the data. C.M. and M.M. reviewed full-text papers and verified the data charting results. C.M., D.X., and M.M. drafted the original manuscript. D.X. and L.Z. prepared Tables 1, 2 and Fig. 1. C.M., D.X., M.M., and L.Z. prepared Tables 3 and 4. All authors read, provided substantial comments and edits on, and approved the final manuscript.

Corresponding author

Correspondence to Honghu Liu .

Ethics declarations

Ethics approval and consent to participate

Consent for publication

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist.

Additional file 2.

Search Terms.

Additional file 3.

Search Tool.

Additional file 4.

Data Charting Form.

Additional file 5.

Glossary of Terms.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

Reprints and permissions

About this article

Cite this article.

Maida, C.A., Xiong, D., Marcus, M. et al. Quantitative data collection approaches in subject-reported oral health research: a scoping review. BMC Oral Health 22 , 435 (2022). https://doi.org/10.1186/s12903-022-02399-5

Download citation

Received : 31 December 2021

Accepted : 17 August 2022

Published : 03 October 2022

DOI : https://doi.org/10.1186/s12903-022-02399-5

Keywords

  • Quantitative research
  • Patient-reported outcomes
  • Dental disease experience
  • Oral health-related quality of life
  • Data collection

BMC Oral Health

ISSN: 1472-6831

Handbook of Research Methods in Health Social Sciences pp 27–49 Cite as

Quantitative Research

  • Leigh A. Wilson
  • Reference work entry
  • First Online: 13 January 2019

Quantitative research methods are concerned with the planning, design, and implementation of strategies to collect and analyze data. Descartes, the seventeenth-century philosopher, suggested that how the results are achieved is often more important than the results themselves, as the journey taken along the research path is a journey of discovery. High-quality quantitative research is characterized by the attention given to the methods and the reliability of the tools used to collect the data. The ability to critique research in a systematic way is an essential component of a health professional’s role in order to deliver high-quality, evidence-based healthcare. This chapter is intended to provide a simple overview of the way new researchers and health practitioners can understand and employ quantitative methods. The chapter offers practical, realistic guidance in a learner-friendly way and uses a logical sequence to understand the process of hypothesis development, study design, data collection and handling, and finally data analysis and interpretation.

Author information

Authors and Affiliations

School of Science and Health, Western Sydney University, Penrith, NSW, Australia

Leigh A. Wilson

Faculty of Health Science, Discipline of Behavioural and Social Sciences in Health, University of Sydney, Lidcombe, NSW, Australia

Corresponding author

Correspondence to Leigh A. Wilson .

Editor information

Editors and affiliations.

Pranee Liamputtong

Rights and permissions

Reprints and permissions

Copyright information

© 2019 Springer Nature Singapore Pte Ltd.

About this entry

Cite this entry.

Wilson, L.A. (2019). Quantitative Research. In: Liamputtong, P. (eds) Handbook of Research Methods in Health Social Sciences. Springer, Singapore. https://doi.org/10.1007/978-981-10-5251-4_54

Download citation

DOI : https://doi.org/10.1007/978-981-10-5251-4_54

Published : 13 January 2019

Publisher Name : Springer, Singapore

Print ISBN : 978-981-10-5250-7

Online ISBN : 978-981-10-5251-4

eBook Packages : Social Sciences Reference Module Humanities and Social Sciences Reference Module Business, Economics and Social Sciences

BOOK REVIEW article

Book Review: Data Collection Research Methods in Applied Linguistics

Yu Zhang

  • School of Foreign Languages, Northeast Normal University, Jilin, China

A Book Review on Data Collection Research Methods in Applied Linguistics

Heath Rose, Jim McKinley, and Jessica Briggs Baffoe-Djan (London: Bloomsbury Academic), 2020, 296 pages, ISBN: 978-1-3500-2583-7

Choosing the appropriate data collection methods is the key to obtaining reliable and valid research data. Data Collection Research Methods in Applied Linguistics highlights the importance of data collection, presents a variety of approaches for obtaining data and provides practical guidance for applied linguistics researchers with ample examples from published articles and books.

This book contains 12 chapters. Chapter 1 briefly introduces various research designs, including experiments, surveys, and case studies in applied linguistics. Chapters 2 to 9, which form the main body of the book, describe direct and indirect data collection methods. Chapters 2 to 5 analyze direct ways of acquiring data on participants, including language elicitation tasks (Chapter 2), introspective and retrospective tasks (Chapter 3), tests (Chapter 4), and observations (Chapter 5). Chapters 6 to 9 consider indirect ways of collecting data on participants through self-reporting, including interviews (Chapter 6), diaries, journals, and logs (Chapter 7), questionnaires (Chapter 8), and focus groups (Chapter 9). Chapters 10 and 11 outline direct and indirect techniques for obtaining spoken and written discourses to construct and use corpora, such as the British National Corpus. Finally, Chapter 12 discusses how to improve data validity and reliability by using triangulation and how to ensure research transparency, which allows future researchers to replicate studies.

This book distinguishes itself from other research methodology books through its explicit focus on data collection, its unique categorization of data collection methods, and structural components that promote reader interaction. First, this book primarily presents data collection methods without associating them with particular research designs. This explicit focus on data collection creates more space to feature a variety of approaches, including widely used methods, such as questionnaires, and less commonly used methods, such as logs and focus groups. Thus, this book can encourage researchers to flexibly and creatively integrate various data collection methods within a research design.

Second, this book categorizes data collection methods based on whether the data are obtained from participants directly, such as through role playing and storytelling, or indirectly, such as through written interviews. Compared with other research methodology books, such as those that classify data collection methods according to whether the data are primary or secondary (e.g., Kothari, 2004 ) or quantitative or qualitative (e.g., Dörnyei, 2007 ), this new classification approach aims to “provide guidance in this area by squarely focusing on the things researchers do to obtain data in their research projects” ( Rose et al., 2020 , p. vii).

Third, this book's structural arrangement includes reflective activities and instructive examples to foster reader interaction. Each chapter starts with pre-reading activities, which involve thinking, discussing, and imagining, and ends with post-reading activities, which involve reflecting, expanding, and applying, in order to provoke contemplation and further discussion. Each chapter also explains key concepts and provides specific ways to improve data reliability and validity. For example, Chapter 8 introduces many strategies to reduce bias when constructing questionnaires, such as writing concise items, using simple language, avoiding negative and multiple-choice questions, and considering the order effect and the response rate. To provide readers with practical guidance, data collection methods are analyzed using instructive examples from published studies, which involve not only quantitative and qualitative designs but also mixed-method designs. For instance, Chapter 9 contains six examples of studies that employed focus groups as a data collection method in different research designs, including quasi-experimental and mixed-method research.
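As a rough illustration, some of this item-writing advice can even be operationalized as automated checks on draft questionnaire items. The rules and the word-count threshold below are the reviewer's own assumptions for the sketch, not taken from the book.

```python
# Hypothetical questionnaire-item checker: flags wording patterns that the
# item-writing advice above warns against (thresholds are illustrative).
NEGATIONS = {"not", "never", "no", "none"}

def check_item(text, max_words=20):
    """Return a list of potential wording problems for one draft item."""
    words = [w.strip("?.,").lower() for w in text.split()]
    issues = []
    if len(words) > max_words:
        issues.append("item is not concise")
    if NEGATIONS & set(words):
        issues.append("negatively worded")
    return issues
```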

Meanwhile, some aspects of this book could benefit from further development in future editions. For instance, Chapter 5 introduces many observation instruments, such as the Motivation Orientation of Language Teaching ( Guilloteaux and Dörnyei, 2008 ); however, the brief explanations may make it difficult for novice researchers to apply these instruments effectively. Novice researchers would benefit from the inclusion of complete observation sheets and more detailed depictions of what to observe and how to record observations.

In summary, novice researchers and postgraduate students will find this book an essential reference as they embark on their research journeys in applied linguistics. By introducing a variety of data collection methods, this book can inspire researchers and students to creatively adopt different approaches to obtaining data when conducting research, such as using questionnaires and observations in a case study.

Author Contributions

The author confirms being the sole contributor of this work and has approved it for publication.

Funding

This paper was supported by the Project of Discipline Innovation and Advancement (PODIA) - Foreign Language Education Studies at Beijing Foreign Studies University (Grant No. 2020SYLZDXM011) and the Project of the Tertiary Education Reform at Northeast Normal University titled Empowering the English Micro-teaching Class via PBLI (Grant No. 421-131003198).

Conflict of Interest

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Dörnyei, Z. (2007). Research Methods in Applied Linguistics: Quantitative, Qualitative, and Mixed Methodologies. Oxford: Oxford University Press.

Guilloteaux, M. J., and Dörnyei, Z. (2008). Motivating language learners: a classroom-oriented investigation of the effects of motivational strategies on student motivation. TESOL Q . 42, 55–77. doi: 10.2307/40264425

Kothari, C. R. (2004). Research Methodology: Methods and Techniques, 2nd Edn . New Delhi: New Age International.

Rose, H., McKinley, J., and Baffoe-Djan, J. B. (2020). Data Collection Research Methods in Applied Linguistics . London: Bloomsbury Academic.

Keywords: data collection, applied linguistics, qualitative research method, quantitative research method, mixed-method approach

Citation: Zhang Y (2021) Book Review: Data Collection Research Methods in Applied Linguistics. Front. Psychol. 12:668712. doi: 10.3389/fpsyg.2021.668712

Received: 17 February 2021; Accepted: 01 March 2021; Published: 24 March 2021.

Edited and reviewed by: Xuesong Gao , University of New South Wales, Australia

Copyright © 2021 Zhang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Yu Zhang, zhangy435@nenu.edu.cn

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
