What is Evidence-Based Practice in Nursing? (With Examples, Benefits, & Challenges)


Are you a nurse looking for ways to increase patient satisfaction, improve patient outcomes, and impact the profession? Have you found yourself caught between traditional nursing approaches and new patient care practices? Although evidence-based practices have been used for years, this concept is the focus of patient care today more than ever. Perhaps you are wondering, “What is evidence-based practice in nursing?” In this article, I will share information to help you begin to understand evidence-based practice in nursing, along with 10 examples of how to implement EBP.


Are There Different Levels of Evidence-Based Practice in Nursing?

Yes. Evidence used in nursing practice is commonly ranked by strength:

  • Level One: Meta-analyses of randomized clinical trials and experimental studies
  • Level Two: Quasi-experimental studies (focused studies used to evaluate interventions)
  • Level Three: Non-experimental or qualitative studies
  • Level Four: Opinions of nationally recognized experts based on research
  • Level Five: Opinions of individual experts based on non-research evidence such as literature reviews, case studies, organizational experiences, and personal experiences


Key EBP Nursing Topics: Enhancing Patient Results through Evidence-Based Practice


This article was written in collaboration with Christine T. and ChatGPT, our little helper developed by OpenAI.


Evidence-based practice (EBP) is the use of the best available evidence to inform clinical decision-making in nursing. EBP has become increasingly popular in nursing practice because it ensures that patient care is based on the most current and relevant research. In this article, we will discuss the latest evidence-based practice nursing research topics, how to choose them, and where to find EBP project ideas.

What is Evidence-Based Practice Nursing?

EBP nursing involves a cyclical process of asking clinical questions, seeking the best available evidence, critically evaluating that evidence, and then integrating it with the nurse’s clinical expertise and the patient’s values and circumstances to make informed decisions. By following this process, nurses can provide the best care for their patients and ensure that their practice is informed by the latest research.

One of the key components of EBP nursing is the critical appraisal of research evidence. Nurses must be able to evaluate the quality of studies, including study design, sample size, and statistical analysis. This requires an understanding of research methodology and the ability to apply critical thinking skills to evaluate research evidence.

EBP nursing also involves the use of clinical practice guidelines and protocols: recommendations for clinical practice developed by expert groups on the basis of the best available evidence. By following these guidelines, nurses can ensure that their practice is in line with the latest research and provide the best possible care for their patients.

Finally, EBP nursing involves continuous professional development and a commitment to lifelong learning. Nurses must keep abreast of the latest research and clinical practice guidelines so that their practice stays current, which means attending conferences, reading scholarly articles, and participating in continuing education programs.

You can also learn more about evidence-based practice in nursing to gain a deeper understanding of the definition, stages, benefits, and challenges of implementing it.


How to Choose Evidence-Based Practice Nursing Research Topics

Choosing an evidence-based practice topic for nursing research can be a daunting task, especially if you are new to the field. Here are some tips to help you choose a relevant and interesting EBP topic:

  • Look for controversial or debated issues

Look for areas of nursing practice that are controversial or have conflicting evidence. These topics often have the potential to generate innovative and effective research.

  • Consider ethical issues

Consider topics related to ethical issues in nursing practice. For example, bereavement care, informed consent, and patient privacy are all ethical issues that can be explored in an EBP project.

  • Explore interdisciplinary topics

Nursing practice often involves collaboration with other health professionals such as physicians, social workers, and occupational therapists. Consider interdisciplinary topics that may be useful from a nursing perspective.

  • Consider local or regional issues

Consider topics that are relevant to your local or regional healthcare facility. These topics are more directly applicable to your practice and can have a greater impact on patient outcomes in your community.

  • Check out the latest research

Review recent research in your area of interest to identify gaps in the literature or areas where further research is needed. This can help you develop a research question that is relevant and innovative.

With these tips in mind, you can expand your options for EBP nursing research topics and find a topic that fits your interests and goals. Remember that patient outcomes should be at the forefront of your research and choose a topic that has the potential to improve treatment and patient outcomes.

Where to Get EBP Project Ideas

There are several diverse sources that nurses can use to get EBP project ideas. By exploring them, nurses can find research questions that align with their interests and that address gaps in the literature. These sources include:

  • Clinical Practice Guidelines

Look for clinical practice guidelines developed by professional organizations or healthcare institutions. These provide evidence-based recommendations for clinical practice and can help identify areas where further research is needed.

  • Research databases

Explore research databases such as PubMed, CINAHL, and the Cochrane Library to find the latest studies and systematic reviews. These databases can help you identify gaps in the literature and areas where further research is needed (a short, programmatic search sketch is included at the end of this section).

  • Clinical Experts

Consult with clinical experts in your practice area. These experts may have insights into areas where further research is needed or may provide guidance on areas of practice that may benefit from an EBP project.

  • Quality Improvement Projects

Review quality improvement projects that have been implemented in your healthcare facility. These projects may identify areas where further research is needed or identify gaps in the literature that could be addressed in an EBP project.

  • Patient and family feedback

Consider patient and family feedback to identify areas where further research is needed. Patients and families can provide valuable information about areas of nursing practice that can be improved or that could benefit from further research.

Remember, when searching for ideas for EBP nursing research projects, it is important to consider the potential impact on patient care and outcomes. Select a topic that has the potential to improve patient outcomes and consider the feasibility of the project in terms of time, resources, and access to data. By choosing a topic that matches your interests and goals and is feasible at your institution, you can conduct a meaningful and productive EBP research project in nursing.
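For nurses comfortable with a little scripting, the database step above can even be automated. The sketch below is a minimal, illustrative example (not part of any guideline or study cited here) that queries PubMed through the publicly documented NCBI E-utilities interface; the search term, result limit, and sort order are placeholder assumptions you would adapt to your own clinical question.

```python
# Minimal sketch: pull recent PubMed records for an EBP question via NCBI E-utilities.
# Assumes the third-party `requests` package is installed; the query below is only an example.
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def search_pubmed(query: str, max_results: int = 20) -> list[str]:
    """Return PubMed IDs (PMIDs) matching the query, newest first."""
    params = {
        "db": "pubmed",         # search the PubMed database
        "term": query,          # keyword/MeSH query for your clinical question
        "retmax": max_results,  # cap the number of IDs returned
        "retmode": "json",      # request a JSON response
        "sort": "pub_date",     # most recently published first
    }
    response = requests.get(ESEARCH_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    # Example question drawn from the topic list below: nurse-led discharge education
    # and readmission rates for heart failure patients.
    pmids = search_pubmed('"heart failure" AND "patient discharge" AND nurs* AND readmission')
    print(f"{len(pmids)} recent PMIDs:", pmids)
```

From the returned PMIDs you would normally fetch titles and abstracts (for example with the companion efetch endpoint) and then appraise the studies yourself; no script replaces that critical appraisal step.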

Nursing EBP Topics You Can Use in Your Essay

Here are some of the latest evidence-based practice nursing research topics that you can use in your essay or explore further in your own research:

  • The impact of telehealth on patient outcomes in primary care
  • The use of music therapy to manage pain in post-operative patients
  • The effectiveness of mindfulness-based stress reduction in reducing stress and anxiety in healthcare workers
  • Combating health care-associated infections: a community-based approach
  • The impact of nurse-led discharge education on readmission rates for heart failure patients
  • The use of simulation in nursing education to improve patient safety
  • The effectiveness of early mobilization in preventing post-operative complications
  • The use of aromatherapy to manage agitation in patients with dementia
  • The impact of nurse-patient communication on patient satisfaction and outcomes
  • The effectiveness of peer support in improving diabetes self-management
  • The impact of cultural competence training on patient outcomes in diverse healthcare settings
  • The use of animal-assisted therapy in managing anxiety and depression in patients with chronic illnesses
  • The effectiveness of nurse-led smoking cessation interventions in promoting smoking cessation among hospitalized patients
  • Importance of literature review in evidence-based research
  • The impact of nurse-led care transitions on hospital readmission rates for older adults
  • The effectiveness of nurse-led weight management interventions in reducing obesity rates among children and adolescents
  • The impact of medication reconciliation on medication errors and adverse drug events
  • The use of mindfulness-based interventions to manage chronic pain in older adults
  • The effectiveness of nurse-led interventions in reducing hospital-acquired infections
  • The impact of patient-centered care on patient satisfaction and outcomes
  • The use of art therapy to manage anxiety in pediatric patients undergoing medical procedures
  • Pediatric oncology: working towards better treatment through evidence-based research
  • The effectiveness of nurse-led interventions in improving medication adherence among patients with chronic illnesses
  • The impact of team-based care on patient outcomes in primary care settings
  • The use of music therapy to improve sleep quality in hospitalized patients
  • The effectiveness of nurse-led interventions in reducing falls in older adults
  • The impact of nurse-led care on maternal and infant outcomes in low-resource settings
  • The use of acupressure to manage chemotherapy-induced nausea and vomiting
  • The effectiveness of nurse-led interventions in promoting breastfeeding initiation and duration
  • The impact of nurse-led palliative care interventions on end-of-life care in hospice settings
  • The use of hypnotherapy to manage pain in labor and delivery
  • The effectiveness of nurse-led interventions in reducing hospital length of stay for surgical patients
  • The impact of nurse-led transitional care interventions on readmission rates for heart failure patients
  • The use of massage therapy to manage pain in hospitalized patients
  • The effectiveness of nurse-led interventions in promoting physical activity among adults with chronic illnesses
  • The impact of technology-based interventions on patient outcomes in mental health settings
  • The use of mind-body interventions to manage chronic pain in patients with fibromyalgia
  • Optimizing the diagnostic workup of stomach cancer
  • The effectiveness of nurse-led interventions in reducing medication errors in pediatric patients
  • The impact of nurse-led interventions on patient outcomes in long-term care settings
  • The use of aromatherapy to manage anxiety in patients undergoing cardiac catheterization
  • The effectiveness of nurse-led interventions in improving glycemic control in patients with diabetes
  • The impact of nurse-led interventions on patient outcomes in emergency department settings
  • The use of relaxation techniques to manage anxiety in patients with cancer
  • The effectiveness of nurse-led interventions in improving self-management skills among patients with heart failure
  • The impact of nurse-led interventions on patient outcomes in critical care settings
  • The use of yoga to manage symptoms in patients with multiple sclerosis
  • The effectiveness of nurse-led interventions in promoting medication safety in community settings
  • The impact of nurse-led interventions on patient outcomes in home healthcare settings
  • The role of family involvement in the rehabilitation of stroke patients
  • Assessing the effectiveness of virtual reality in pain management
  • The impact of pet therapy on mental well-being in elderly patients
  • Exploring the benefits of intermittent fasting on diabetic patients
  • The efficacy of acupuncture in managing chronic pain in cancer patients
  • Effect of laughter therapy on stress levels among healthcare professionals
  • The influence of a plant-based diet on cardiovascular health
  • Analyzing the outcomes of nurse-led cognitive behavioral therapy sessions for insomnia patients
  • The role of yoga and meditation in managing hypertension
  • Exploring the benefits of hydrotherapy in post-operative orthopedic patients
  • The impact of digital health applications on patient adherence to medications
  • Assessing the outcomes of art therapy in pediatric patients with chronic illnesses
  • The role of nutrition education in managing obesity in pediatric patients
  • Exploring the effects of nature walks on mental well-being in patients with depression
  • The impact of continuous glucose monitoring systems on glycemic control in diabetic patients

The Importance of Incorporating EBP in Nursing Education

Evidence-based practice is not just a tool for seasoned nurses; it’s a foundational skill that should be integrated early into nursing education. By doing so, students learn not only the mechanics of nursing but also the research-grounded rationale behind various interventions.

  • Bridging Theory and Practice:

Introducing EBP in the curriculum helps students bridge the gap between theoretical knowledge and clinical practice. They learn how to perform a task and why it’s done a particular way.

  • Critical Thinking:

EBP promotes critical thinking. By regularly reviewing and appraising research, students develop the ability to discern the quality and applicability of studies. This skill is invaluable in a rapidly evolving field like healthcare.

  • Lifelong Learning:

EBP instills a culture of continuous learning. It encourages nurses to regularly seek out the most recent research findings and adapt their practices accordingly.

  • Improved Patient Outcomes:

At the heart of EBP is the goal of enhanced patient care. We ensure patients receive the most effective, up-to-date care by teaching students to base their practices on evidence.

  • Professional Development:

Familiarity with EBP makes it easier for nurses to contribute to professional discussions, attend conferences, and conduct research. It elevates their professional stature and opens doors to new opportunities.

To truly prepare nursing students for the challenges of modern healthcare, it’s essential to make EBP a core part of their education.

In summary, evidence-based practice nursing is an essential component of providing quality patient care. As a nurse, it is important to stay up to date on the latest research in the field and incorporate evidence-based practices into your daily work. Choosing a research topic that aligns with your interests and addresses a gap in the literature can lead to valuable contributions to the field of nursing.

When it comes to finding EBP project ideas, there are many sources available, including professional organizations, academic journals, and healthcare conferences. By collaborating with colleagues and seeking feedback from mentors, you can refine your research question and design a study that is rigorous and relevant.

The nursing evidence-based practice topics listed above provide a starting point for further exploration and investigation. By studying the effectiveness of various nursing interventions and techniques, we can continue to improve patient outcomes and deliver better care. Ultimately, evidence-based practice nursing is about using the best available research to inform our decisions and provide the highest quality care possible to our patients.



  • Research article
  • Open access
  • Published: 07 January 2021

Evidence-based practice beliefs and implementations: a cross-sectional study among undergraduate nursing students

  • Nesrin N. Abu-Baker (ORCID: orcid.org/0000-0001-9971-1328),
  • Salwa AbuAlrub,
  • Rana F. Obeidat &
  • Kholoud Assmairan

BMC Nursing volume 20, Article number: 13 (2021)


Integrating evidence-based practice (EBP) into the daily practice of healthcare professionals has the potential to improve the practice environment as well as patient outcomes. It is essential for nurses to build their body of knowledge, standardize practice, and improve patient outcomes. This study aims to explore nursing students’ beliefs and implementations of EBP, to examine the differences in students’ beliefs and implementations by prior training of EBP, and to examine the relationship between the same.

A cross-sectional survey design was used with a convenience sample of 241 nursing students from two public universities. Students were asked to answer the questions in the Evidence-Based Practice Belief and Implementation scales.

This study revealed that the students reported a mean total belief score of 54.32 out of 80 (SD = 13.63). However, they reported a much lower implementation score of 25.34 out of 72 (SD = 12.37). Students who received EBP training reported significantly higher total belief and implementation scores than those who did not. Finally, there was no significant relationship between belief and implementation scores (p > .05).

To advance nursing science, enhance practice for future nurses, and improve patient outcomes, it is critical to teach nursing students not only the value of evidence-based knowledge, but also how to access this knowledge, appraise it, and apply it correctly as needed.


Evidence-based practice (EBP) integrates the clinical expertise, the latest and best available research evidence, as well as the patient’s unique values and circumstances [ 1 ]. This form of practice is essential for nurses as well as the nursing profession as it offers a wide variety of benefits: It helps nurses to build their own body of knowledge, minimize the gap between nursing education, research, and practice, standardize nursing practices [ 2 ], improve clinical patient outcomes, improve the quality of healthcare, and decrease healthcare costs [ 3 ]. Thus, clinical decision-making by nurses should be based on the best and most up-to-date, available research evidence [ 4 ].

Earlier studies of EBP implementation by nurses in their everyday clinical practice have shown that it is suboptimal [ 5 , 6 , 7 ]. Implementation of EBP is defined as its application in clinical practice [ 8 ]. Findings from previous studies indicate that nurses’ implementation of EBP can be promoted by improving their belief about EBP. Belief is the perception of the value and benefits of EBP and the perceived self-confidence in one’s knowledge and skills of EBP [ 8 ]. Nurses with a strong belief in EBP implement it more than nurses with a weak belief in the same [ 7 , 9 ].

Preparing nurses for practice and ensuring that they have met a set of minimum core competencies at the point of graduation is achieved through their undergraduate education [ 10 ]. Several formal entities such as the Institute of Medicine (IOM) [ 4 ] and the Accreditation Commission for Education in Nursing (ACEN) [ 11 ] consider EBP as one of the core competencies that should be included in health care clinicians’ education. However, this does not necessarily guarantee the actual implementation of EBP in everyday clinical practice [ 12 ]. It is essential to educate undergraduate nursing students on EBP to improve their knowledge about it, to strengthen their belief regarding its benefits to patients and nurses, and to enhance their self-efficacy in implementing EBP. In order to effect this change, it is crucial to improve the education process and to focus more on the knowledge and implementation of EBP.

There is consistent evidence showing that while undergraduate nursing students hold positive beliefs about EBP and its value in patient care, they also report many challenges regarding its actual implementation in clinical practice. For instance, a mixed-methods study indicated that 118 American undergraduate nursing students found it difficult to distinguish between EBP and research. Students were able to search for evidence, but were less able to integrate evidence to plan EBP changes or disseminate best practices [13]. Additionally, a correlational study was conducted in Jordan using a sample of 612 senior nursing students. The study reported that students held positive attitudes towards research and 75% of them agreed on using nursing research in clinical practice. Students strongly believed in the usefulness of research. However, they did not believe strongly in their ability to conduct research [14]. A cross-sectional study was conducted among 188 Saudi undergraduate nursing students. Students reported positive beliefs about EBP; however, they reported a low mean score in EBP implementation (22.57 out of 72). Several significant factors have been reported as influencing EBP implementation, such as age, gender, awareness, and training on EBP [15]. Finally, a comparative survey of 1383 nursing students from India, Saudi Arabia, Nigeria, and Oman reported that having no authority to change patient care policies, the slow publication of evidence, and the lack of time in the clinical area to implement the evidence were major barriers to implementing EBP according to the participating students [16].

In Jordan, evidence-based knowledge with critical thinking is one of the seven standards for the professional practice of registered nurses that were released by the Jordan Nursing Council [ 17 ]. Despite the plethora of studies on undergraduate nursing students’ beliefs about EBP and its implementation in everyday clinical practice, this topic has not been fully addressed among Jordanian undergraduate nursing students. Thus, the purpose of this study is to explore the self-reported beliefs and implementations of EBP among undergraduate nursing students in Jordan. The specific aims of this study were to (1) explore nursing students’ beliefs and implementations of EBP, (2) examine the differences in students’ beliefs and implementations by prior training of EBP, and (3) examine the relationship between nursing students’ beliefs and implementations of EBP.

Design and setting

A cross-sectional, correlational research survey design was used to meet the study aims. Recruitment of study participants was undertaken at two governmental universities in the northern part of Jordan. The two universities offer a four-year undergraduate nursing program aimed at graduating competent general nurses with baccalaureate degrees. The nursing research course is included as a compulsory course in the undergraduate nursing curricula in both universities.

Population and sample

The target population of this study was the undergraduate nursing students in Jordan. The accessible population was undergraduate nursing students who are currently enrolled in the four-year BSN program in two governmental universities in the northern region of Jordan. We calculated the sample size using the G*Power software (2014). Using a conventional power estimate of 0.8, with alpha set at 0.05, and medium effect size, it was estimated that for a Pearson Correlation test, a total of 100 participants would need to be recruited to examine the relationship between the beliefs and implementations of EBP. To counteract anticipated non-response and to enhance the power of the study, 300 students were approached. The inclusion criteria of the study participants were as follows: a) senior nursing students who are in the 3 rd or 4th-year level, b) students who are currently taking a clinical course with training in a clinical setting/hospital, c) and students who have successfully passed the nursing research course.
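As a rough illustration of the power calculation described above (the authors used G*Power, and the exact settings behind "medium effect size" are not reported), the following sketch applies the common Fisher-z approximation for the sample size needed to detect a Pearson correlation:

```python
# Illustrative sample-size approximation for a two-tailed Pearson correlation test.
# Assumption: "medium effect size" is taken as r = 0.3 (Cohen's convention); the study's
# exact G*Power settings are not reported, so this is a sketch, not a reproduction.
import math
from scipy.stats import norm

def n_for_correlation(r: float, alpha: float = 0.05, power: float = 0.80) -> int:
    z_alpha = norm.ppf(1 - alpha / 2)              # critical z for two-tailed alpha
    z_beta = norm.ppf(power)                       # z corresponding to the desired power
    fisher_z = 0.5 * math.log((1 + r) / (1 - r))   # Fisher z-transform of the effect size
    return math.ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)

print(n_for_correlation(0.3))  # ~85 under these assumptions; different settings give a different n
```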

Measurement

A structured questionnaire composed of two parts was used for data collection. The first part aimed to gather the demographic data of the participants: gender, age, study year level, university, and any previous EBP training received in the nursing research course. The second part contained the EBP Belief Scale and EBP Implementation scale developed by Melnyk et al. (2008) [ 18 ]. Both scales had previous satisfactory psychometric properties with a Cronbach’s alpha of more than 0.9 and good construct validity. The Evidence-Based Practice Belief Scale (EBPB) consists of 16 statements that describe the respondent’s beliefs of EBP. Students were asked to report on a five-point Likert scale their agreement or disagreement with each of the 16 statements in the scale. Response options on this scale ranged from strongly disagree (1 point) to strongly agree (5 points). All statements were positive except for two statements (statements 11 and 13), which were reversed before calculating the total and mean scores. Total scores on the EBPB ranged from 16 to 80, with a higher total score indicating a more positive belief toward EBP. In the current study, the scale showed satisfactory internal consistency reliability with a Cronbach’s Alpha of .92 for the total scale.

The Evidence-Based Practice Implementation Scale (EBPI) consists of 18 statements related to the respondent’s actual implementation of EBP in the clinical setting. Students were asked to report the frequency of the application of these statements over the past 8 weeks. The answers were ranked on a Likert scale that ranged from 0 to 4 points (0 = 0 times, 1 = 1–3 times, 2 = 4–5 times, 3 = 6–8 times, and 4 = more than 8 times). The total score ranged from 0 to 72, with the higher total score indicating a more frequent utilization of EBP.
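To make the scoring of the two scales concrete, here is a small illustrative snippet (not code from the study); the item responses are invented, and only the scoring rules described above are taken from the text:

```python
# Illustrative scoring of the EBPB and EBPI scales as described above.
# The responses below are invented examples, not study data.

REVERSED_EBPB_ITEMS = {11, 13}  # statements reversed before totalling, per the text

def score_ebpb(responses: list[int]) -> int:
    """EBP Belief Scale: 16 items rated 1-5 (strongly disagree to strongly agree); total 16-80."""
    assert len(responses) == 16 and all(1 <= r <= 5 for r in responses)
    return sum(6 - r if item in REVERSED_EBPB_ITEMS else r
               for item, r in enumerate(responses, start=1))

def score_ebpi(responses: list[int]) -> int:
    """EBP Implementation Scale: 18 items rated 0-4 (frequency in the past 8 weeks); total 0-72."""
    assert len(responses) == 18 and all(0 <= r <= 4 for r in responses)
    return sum(responses)

belief = score_ebpb([4, 5, 3, 4, 4, 2, 5, 4, 3, 4, 2, 4, 1, 5, 4, 3])
implementation = score_ebpi([1, 0, 2, 1, 0, 3, 1, 0, 2, 1, 1, 0, 4, 2, 1, 0, 1, 2])
print(belief, implementation)  # 63 and 22 for these invented answers
```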

Both scales were introduced to the participating students in their original language of English because English is the official language of teaching and instruction in all schools of nursing in Jordan.

Ethical considerations

The Institutional Review Board (IRB) at the first author’s university granted ethical approval for this study (Reference #19/122/2019). The code of ethics was addressed in the cover letter of the questionnaire. The principal investigator met the potential eligible students, provided them with an explanation about the study purpose and procedures, and gave them 5 min to read the questionnaires and to decide whether to participate in the study or not. Students who agreed to participate in the study were assured of voluntary participation and the right to withdraw from the study at any time. Questionnaires were collected anonymously without any identifying information from the participating students. The principal investigator explained to participating students that the return of completed questionnaires is an implicit consent to participate in the study. Permission to use the EBP belief scale and the EBP implementation scale for the purpose of this study was obtained from the authors of the instrument.

Data collection procedure

After ethical approval was granted to conduct the study, data was collected during the second semester of the academic year 2018/2019 (i.e., January through June 2019). The questionnaires were distributed to the nursing students during the classroom lectures after taking permission from the lecturer. The researchers explained the purpose, the significance of the study, the inclusion criteria, and the right of the students to refuse participation in the study. Students were screened for eligibility to participate. Students who met the eligibility criteria and agreed to participate were provided with the study package that included a cover letter and the study questionnaire. Students were given 20 min to complete the questionnaire and return it to the principal investigator who was available to answer students’ questions during the data collection process.

Data analysis

Descriptive statistics (e.g., means, standard deviations, frequencies, and percentages) were performed to describe the demographic characteristics of the participating students and the main study variables. For the belief scale, the two agreement categories (4 = agree, 5 = strongly agree) were collapsed to one category to indicate a positive belief. For the implementation scale, the three categories (2 = 4–5 times, 3 = 6–8, and 4 ≥ 8 times in the past 8 weeks) were collapsed to one category as (≥ 4 times) to indicate frequent implementation. Pearson’s correlation test was used to determine the relationship between the total scores of the EBP belief and implementation scales. A chi-square test was used to examine the difference between trained and untrained students in terms of agreement toward each EBP belief (disagreement vs. agreement) and in terms of frequency of each EBP implementation (less than 4 times vs. 4 times or more in the past 8 weeks). Finally, an independent samples t -test was used to examine the difference between trained and untrained students in terms of the total mean scores of EBP beliefs. The Statistical Package for Social Sciences (SPSS) software (version 22) was used for data analysis.
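For readers who want to see what this analysis plan looks like in code, the snippet below runs the same three tests with SciPy on invented data (the scores and counts are made up, so the outputs will not match the study’s results):

```python
# Illustrative versions of the three tests in the analysis plan (invented data, not the study's).
import numpy as np
from scipy.stats import pearsonr, chi2_contingency, ttest_ind

rng = np.random.default_rng(0)
belief_total = rng.integers(16, 81, size=241)     # simulated total EBPB scores (16-80)
implement_total = rng.integers(0, 73, size=241)   # simulated total EBPI scores (0-72)

# 1) Pearson correlation between total belief and total implementation scores.
r, p_corr = pearsonr(belief_total, implement_total)

# 2) Chi-square test of trained vs. untrained agreement with one belief item
#    (2x2 table of invented counts: rows = trained/untrained, columns = agree/disagree).
table = np.array([[99, 64],
                  [32, 46]])
chi2, p_chi, dof, _ = chi2_contingency(table, correction=False)

# 3) Independent-samples t-test of mean total belief score, trained vs. untrained
#    (roughly 163 of the 241 students reported prior EBP training).
trained, untrained = belief_total[:163], belief_total[163:]
t_stat, p_t = ttest_ind(trained, untrained)

print(f"r = {r:.3f} (p = {p_corr:.3f}); chi2 = {chi2:.2f} (p = {p_chi:.3f}); t = {t_stat:.2f} (p = {p_t:.3f})")
```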

Among the 300 approached students, 35 students did not meet the inclusion criteria and 24 students refused to participate. Thus, a total of 241 undergraduate nursing students from both universities completed the study questionnaire for a response rate of 91%. The mean age of the participants was 22.09 years ( SD  = 1.55). The majority of the participants were females (73.4%) and in the fourth year of the undergraduate nursing program (85.1%). Further, more than half of the participants (67.6%) stated that they received EBP training before (Table  1 ).

The total mean score of the EBP belief scale was 54.32 out of 80 ( SD  = 13.63). Overall, between 50.5 and 73.4% of students agreed or strongly agreed on the 16 statements on the EBP belief scale, which indicates positive beliefs. However, students held a more positive belief regarding the importance and the usefulness of EBP in quality patient care than in their ability to implement EBP. For example, while the majority of students believed that “EBP results in the best clinical care for patients” and that “evidence-based guidelines can improve clinical care” (73.4 and 72.2%, respectively), only about 54% of them cited that they “knew how to implement EBP sufficiently enough to make practice changes” or were “confident about their ability to implement EBP where they worked”. Students who received previous training on EBP reported more agreements (i.e., more positive beliefs) toward all items of EBP compared to those who did not receive training; however, the difference between the two groups was not always significant. For example, 60.7% of trained students believed that “they are sure that they can implement EBP” compared to 41% of untrained students χ 2 (1, n  = 241) = 8.26, p  = .004. Furthermore, 58.3% of trained students were “clear about the steps of EBP” compared to 41% of untrained students χ 2 (1, n  = 241) = 6.30, p  = .021 (Table  2 ).

In contrast, students reported a much lower total score on the EBP implementation scale: 25.34 out of 72 ( SD  = 12.37). Less than half the students reported implementing all the listed EBPs four times or more in the last 8 weeks. For example, only about one-third of all students reported that they “used evidence to change their clinical practice”, “generated a PICO question about clinical practice”, “read and critically appraised a clinical research study”, and “accessed the database for EBP four times or more in the past eight weeks” (32.4, 33.6, 31.9, and 31.6%, respectively). The only EBP that was implemented by more than half of the students (54.8%) four times or more in the past 8 weeks was “collecting data on a patient problem”. Students who had previous training on EBP reported more frequent implementations of all listed EBPs compared to those who did not receive training; however, the difference between the two groups was not always significant. For example, 50.9% of trained students reported that they “shared an EBP guideline with a colleague” four times or more in the past 8 weeks compared to 30.8% of untrained students χ 2 (1, n  = 241) = 8.68, p  = .003. Almost 50 % of the trained students “shared evidence from a research study with a patient/family member” four times or more in the past 8 weeks, compared to 28.2% of the untrained students χ 2 (1, n  = 241) = 9.95, p  = .002 (Table  3 ).

There was a significant difference between students’ total scores on the EBP belief scale with respect to previous training on EBP. Students who received previous training on EBP had a significantly higher mean score on the EBP belief scale compared to students who did not receive previous training on EBP ( t (239) = 2.04, p  = .042). In addition, there was a significant difference in the total score of EBP implementation by previous training on EBP. Students who received previous training on EBP had a significantly higher mean score on the EBP implementation scale compared to students who did not receive previous training on EBP ( t (239) = 3.08, p  = .002) (Table  4 ).

Finally, results of the Pearson correlation test revealed that there was no significant association between the total score of the EBP belief scale and the total score of the EBP implementation scale ( r  = 0.106, p  = 0.101).

This study aimed to explore the self-reported beliefs regarding and implementation of EBP among undergraduate nursing students in Jordan. It is observed that Jordanian undergraduate nursing students valued EBP and its importance in delivering quality patient care as over 70% of them believed that EBP results in the best clinical care for patients and that evidence-based guidelines can improve clinical care. However, a lower percentage of students believed in their ability to implement EBP where they worked and an even lower percentage of them actually implemented EBP frequently in their everyday clinical practice. For illustration, only one-third of the students accessed a database for EBP, have read and critically appraised a clinical research study, or used evidence to change their clinical practice four times or more in the last 8 weeks. Our results are consistent with previous studies among Jordanian nursing students which also showed students had positive attitudes towards research and its usefulness to providing quality patient care but had insufficient ability to utilize research evidence in clinical practice [ 14 ]. Further, a recent study has shown that nursing students in Jordan had low knowledge about EBP regardless of their admitting university [ 19 ]. These results indicate that there could be a gap in the education process of undergraduate nursing students in Jordan about EBP. Thus, schools of nursing in Jordan have to critically review their current educational strategies on EBP and improve it to enhance students’ knowledge of EBP as well as their abilities to implement evidence in clinical practice.

The results of the current study revealed that despite the positive beliefs of the nursing students, their implementation of EBP was very low. There was no significant relationship between the total score of EBP belief and the total score of EBP implementation. Our results are consistent with those reported among Saudi as well as American nursing students who also had positive beliefs about EBP but implemented it less frequently in their everyday clinical practice [ 13 , 15 ]. Moreover, in line with previous studies which showed that training on EBP was one of the significant predictors of beliefs and implementation [ 15 ], students who previously received EBP training had significantly higher total belief and implementation scores than those who did not, in this study. This finding is expected as EBP training has been shown to improve knowledge, self-efficacy in implementation, and by extension, implementation practices among nurses and nursing students [ 20 , 21 , 22 ]. On the other hand, in this study, we asked students whether they have received training on EBP during the nursing research course taught at their universities. More than one-third of participating students in our study cited that they had not received previous training on EBP even though all of them have successfully passed the nursing research course offered at their universities. One possible explanation for this finding could be that there is an inconsistency in the way the nursing research course is taught. It seems that EBP practice is not always included in the content taught in this course. Thus, nursing schools in Jordan have to revise their curricula to ensure that EBP is included and is taught to all students before graduation.

The results of the current study have several international implications that involve academic education and nursing curricula. There is a pressing need to enhance the education process and to focus more on the knowledge and skills of EBP. Incorporating EBP into the nursing curricula, especially the undergraduate program is critical as it is the first step to prepare the students for their professional roles as registered nurses. Sin and Bliquez (2017) stated that creative and enjoyable strategies are fundamental in order to encourage students’ commitment to and learning about EBP [ 23 ]. One of these effective strategies is teaching the EBP process by asking a clinical question, acquiring and searching for evidence, appraising then applying this evidence, and finally evaluating the effectiveness of its application in clinical practice [ 8 ]. A thematic review study demonstrated that various interactive teaching strategies and clinically integrated teaching strategies have been emphasized to enhance EBP knowledge and skills [ 24 ].

Gaining knowledge about undergraduate nursing students’ beliefs and their ability to implement EBP in a clinical setting is essential for nursing educators at the national and the international level. This knowledge might help them to evaluate and improve the current strategies utilized to educate undergraduate students about EBP. Furthermore, academic administrators and teachers should design their courses to apply EBP concepts. They should promote EBP training courses, workshops, and seminars. For example, the research course should focus more on this topic and should include clinical scenarios that involve the application of EBP. In addition, clinical courses should include assignments for the purpose of integrating EBP within their clinical cases. The scale used in this study could be implemented in clinical courses to evaluate students’ practical skills concerning EBP. Finally, nursing instructors, leaders, and practitioners should always update their EBP knowledge and skills through continuous education and workshops. Since they are the role models and instructors, they should be competent enough to teach and evaluate their students. They should also cooperate to facilitate the implementation of EBP in clinical settings to overcome any barrier.

Study limitations and recommendations

This study sheds light on the existing gap between the belief in and the implementation of EBP among nursing students. However, convenience sampling, using two universities only, and self-report bias are all limitations of this study. In addition, the researchers did not investigate the type of EBP training that was received by the students in this study. More studies are needed in Jordan and the Middle Eastern region about EBP using larger random samples in different settings. It is also recommended to investigate the barriers that prevent nursing students from implementing EBP other than not receiving training on it. Furthermore, conducting qualitative studies might help examine and understand students’ perceptions as well as provide suggestions to bridge the gap between education and practice. Finally, future experimental studies are needed to test the effect of certain interventions on enhancing the implementation of EBP among nursing students.

Evidence-based practice is essential for nursing students worldwide. However, having strong beliefs about EBP and its benefits does not necessarily mean that it is frequently implemented. On the other hand, providing training courses on EBP is an essential step in the enhancement of EBP implementation. This means that in order to advance nursing science and enhance nursing care for future nurses, it is vital to incorporate EBP within the nursing curricula. It is also critical to teach nursing students the value of evidence-based knowledge as well as how to access this knowledge, appraise it, and apply it correctly as needed. This can be achieved through rigorous cooperation between nursing administrators, clinicians, teachers, and students to enhance the implementation process.

Availability of data and materials

Data are available from the corresponding author upon reasonable request and with permission of Jordan University of Science and Technology.

Abbreviations

EBP: Evidence-Based Practice

IOM: Institute of Medicine

ACEN: Accreditation Commission for Education in Nursing

EBPB: Evidence-Based Practice Belief Scale

EBPI: Evidence-Based Practice Implementation Scale

SPSS: The Statistical Package for Social Sciences

References

1. Straus SE, Glasziou P, Richardson WS, Haynes RB. Evidence-based medicine: how to practice and teach it. Edinburgh: Churchill Livingstone Elsevier; 2011.

2. Stevens K. The impact of evidence-based practice in nursing and the next big ideas. Online J Issues Nurs. 2013;18(2):4.

3. Emparanza JI, Cabello JB, Burls AJ. Does evidence-based practice improve patient outcomes? An analysis of a natural experiment in a Spanish hospital. J Eval Clin Pract. 2015;21(6):1059–65. https://doi.org/10.1111/jep.12460

4. Institute of Medicine. The future of nursing: Focus on education. 2010. http://iom.nationalacademies.org/Reports/2010/The-Future-of-Nursing-Leading-Change. Accessed 15 May 2019.

5. AbuRuz ME, Hayeah HA, Al-Dweik G, Al-Akash HY. Knowledge, attitudes, and practice about evidence-based practice: a Jordanian study. Health Sci J. 2017;11(2):1.

6. Thorsteinsson HS. Icelandic nurses’ beliefs, skills, and resources associated with evidence-based practice and related factors: a national survey. Worldviews Evid-Based Nurs. 2013;10(2):116–26.

7. Verloo H, Desmedt M, Morin D. Beliefs and implementation of evidence-based practice among nurses and allied healthcare providers in the Valais hospital, Switzerland. J Eval Clin Pract. 2017;23(1):139–48.

8. Melnyk BM, Fineout-Overholt E. Evidence-based practice in nursing & healthcare: a guide to best practice. Philadelphia: Lippincott Williams & Wilkins; 2015.

9. Stokke K, Olsen NR, Espehaug B, Nortvedt MW. Evidence-based practice beliefs and implementation among nurses: a cross-sectional study. BMC Nurs. 2014;13(1):8.

10. Lopez V. Implementing evidence-based practice to develop nursing curriculum. Nurs Pract Today. 2015;2(3):85–7.

11. Accreditation Commission for Education in Nursing. Accreditation manual. Section III Standards and criteria glossary. 2013. https://www.ncsbn.org/SC2013.pdf. Accessed 15 May 2019.

12. Moch SD, Cronje RJ, Branson J. Part 1. Undergraduate nursing evidence-based practice education: envisioning the role of students. J Prof Nurs. 2010;26(1):5–13.

13. Lam CK, Schubert C. Evidence-based practice competence in nursing students: an exploratory study with important implications for educators. Worldviews Evid-Based Nurs. 2019;16(2):161–8.

14. Halabi JO, Hamdan-Mansour A. Attitudes of Jordanian nursing students towards nursing research. J Res Nurs. 2010;17(4):363–73.

15. Cruz JP, Colet PC, Alquwez N, Alqubeilat H, Bashtawi MA, Ahmed EA, Cruz CP. Evidence-based practice beliefs and implementation among the nursing bridge program students of a Saudi University. Int J Health Sci. 2016;10(3):405.

16. Labrague LJ, McEnroe-Pettite D, Tsaras K, D’Souza MS, Fronda DC, Mirafuentes EC, Yahyei AA, Graham MM. Predictors of evidence-based practice knowledge, skills, and attitudes among nursing students. Nurs Forum. 2019;54(2):238–45.

17. Jordanian Nursing Council. National Nursing and Midwifery Strategy: A Road Map to 2025. 2016. www.jnc.gov.jo. Accessed 30 May 2019.

18. Melnyk BM, Fineout-Overholt E, Mays MZ. The evidence-based practice beliefs and implementation scales: psychometric properties of two new instruments. Worldviews Evid-Based Nurs. 2008;5(4):208–16.

19. Al Qadire M. Undergraduate student nurses’ knowledge of evidence-based practice: a short online survey. Nurse Educ Today. 2019;72:1–5.

20. Spiva L, Hart PL, Patrick S, Waggoner J, Jackson C, Threatt JL. Effectiveness of an evidence-based practice nurse mentor training program. Worldviews Evid-Based Nurs. 2017;14(3):183–91.

21. Ramos-Morcillo AJ, Fernández-Salazar S, Ruzafa-Martínez M, Del-Pino-Casado R. Effectiveness of a brief, basic evidence-based practice course for clinical nurses. Worldviews Evid-Based Nurs. 2015;12(4):199–207.

22. Mena-Tudela D, González-Chordá VM, Cervera-Gasch A, Maciá-Soler ML, Orts-Cortés MI. Effectiveness of an evidence-based practice educational intervention with second-year nursing students. Rev Lat Am Enfermagem. 2018;26:e3026.

23. Sin MK, Bliquez R. Teaching evidence-based practice to undergraduate nursing students. J Prof Nurs. 2017;33(6):447–51.

24. Horntvedt ME, Nordsteien A, Fermann T, Severinsson E. Strategies for teaching evidence-based practice in nursing education: a thematic literature review. BMC Med Educ. 2018;18(1):172.

Acknowledgments

This study was funded by Jordan University of Science and Technology Grant # (20190141). The funding source had no role in the design of the study and collection, analysis, and interpretation of data or in writing the manuscript.

Author information

Authors and affiliations.

Faculty of Nursing, Community and Mental Health Nursing Department, Jordan University of Science & Technology, P.O Box 3030, Irbid, 22110, Jordan

Nesrin N. Abu-Baker

Faculty of Irbid College, Department of Applied Sciences, Al-Balqa Applied University, P.O. Box 1293, Irbid, Jordan

Salwa AbuAlrub

Faculty of Nursing, Zarqa University, 247D Khawarezmi Building, Zarqa, Jordan

Rana F. Obeidat

Faculty of Nursing, Al-Albayt University, P.O Box 130040, Mafraq, 25113, Jordan

Kholoud Assmairan


Contributions

All authors (NA, SA, RO, and KA) had active contributions to the conception and design, and/or the collection and analysis of the data, and/or the drafting of the paper. NA and RO also made critical revisions for important content and finalized the final version of the manuscript. All authors approved the final version of the manuscript.

Corresponding author

Correspondence to Nesrin N. Abu-Baker .

Ethics declarations

Ethics approval and consent to participate.

Obtained from the Institutional Review Board (IRB) at Jordan University of Science and Technology (Reference # 19/122/2019). All participants were asked to sign a consent form before data collection.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing or conflict of interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Abu-Baker, N.N., AbuAlrub, S., Obeidat, R.F. et al. Evidence-based practice beliefs and implementations: a cross-sectional study among undergraduate nursing students. BMC Nurs 20 , 13 (2021). https://doi.org/10.1186/s12912-020-00522-x


Received: 12 May 2020

Accepted: 15 December 2020

Published: 07 January 2021

DOI: https://doi.org/10.1186/s12912-020-00522-x


Keywords

  • Implementations
  • Evidence-based practice
  • Nursing students


evidence based practice skills essay

This website is intended for healthcare professionals

British Journal of Nursing

  • { $refs.search.focus(); })" aria-controls="searchpanel" :aria-expanded="open" class="hidden lg:inline-flex justify-end text-gray-800 hover:text-primary py-2 px-4 lg:px-0 items-center text-base font-medium"> Search

Search menu

Brechin A. Introducing critical practice. In: Brechin A, Brown H, Eby MA (eds). London: Sage/Open University; 2000

Introduction to evidence informed decision making. 2012. https://cihr-irsc.gc.ca/e/45245.html (accessed 8 March 2022)

Cullen L, Adams SL. Planning for implementation of evidence-based practice. JONA: The Journal of Nursing Administration. 2012; 42:(4)222-230 https://doi.org/10.1097/NNA.0b013e31824ccd0a

DiCenso A, Guyatt G, Ciliska D. Evidence-based nursing. A guide to clinical practice.St. Louis (MO): Mosby; 2005

Implementing evidence-informed practice: International perspectives. In: Dill K, Shera W (eds). Toronto, Canada: Canadian Scholars Press; 2012

Dufault M. Testing a collaborative research utilization model to translate best practices in pain management. Worldviews Evid Based Nurs. 2004; 1:S26-S32 https://doi.org/10.1111/j.1524-475X.2004.04049.x

Epstein I. Promoting harmony where there is commonly conflict: evidence-informed practice as an integrative strategy. Soc Work Health Care. 2009; 48:(3)216-231 https://doi.org/10.1080/00981380802589845

Epstein I. Reconciling evidence-based practice, evidence-informed practice, and practice-based research: the role of clinical data-mining. Social Work. 2011; 56:(3)284-288 https://doi.org/10.1093/sw/56.3.284

Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: a synthesis of the literature. 2005. https://tinyurl.com/mwpf4be4 (accessed 6 March 2022)

Graham ID, Logan J, Harrison MB, et al. Lost in knowledge translation: time for a map?. J Contin Educ Health Prof. 2006; 26:(1)13-24 https://doi.org/10.1002/chp.47

Greenhalgh T, Robert G, Bate P, MacFarlane F, Kyriakidou O. Diffusion of innovations in health service organisations. A systematic literature review. Malden (MA): Blackwell; 2005

Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis?. BMJ. 2014; 348 https://doi.org/10.1136/bmj.g3725

Haynes RB, Devereaux PJ, Guyatt GH. Clinical expertise in the era of evidence-based medicine and patient choice. BMJ Evidence-Based Medicine. 2002; 7:36-38 https://doi.org/10.1136/ebm.7.2.36

Hitch D, Nicola-Richmond K. Instructional practices for evidence-based practice with pre-registration allied health students: a review of recent research and developments. Adv Health Sci Educ Theory Pract. 2017; 22:(4)1031-1045 https://doi.org/10.1007/s10459-016-9702-9

Jerkert J. Negative mechanistic reasoning in medical intervention assessment. Theor Med Bioeth. 2015; 36:(6)425-437 https://doi.org/10.1007/s11017-015-9348-2

McSherry R, Artley A, Holloran J. Research awareness: an important factor for evidence-based practice?. Worldviews Evid Based Nurs. 2006; 3:(3)103-115 https://doi.org/10.1111/j.1741-6787.2006.00059.x

McSherry R, Simmons M, Pearce P. An introduction to evidence-informed nursing. In: McSherry R, Simmons M, Abbott P (eds). London: Routledge; 2002

Implementing excellence in your health care organization: managing, leading and collaborating. In: McSherry R, Warr J (eds). Maidenhead: Open University Press; 2010

Melnyk BM, Fineout-Overholt E, Stillwell SB, Williamson KM. Evidence-based practice: step by step: the seven steps of evidence-based practice. AJN, American Journal of Nursing. 2010; 110:(1)51-53 https://doi.org/10.1097/01.NAJ.0000366056.06605.d2

Metz AJR, Blase K, Bowie L. Implementing evidence-based practices: six ‘drivers’ of success. Part 3 in a Series on Fostering the Adoption of Evidence-Based Practices in Out-Of-School Time Programs. 2007. https://tinyurl.com/mu2y6ahk (accessed 8 March 2022)

Muir-Gray JA. Evidence-based healthcare. How to make health policy and management decisions. Edinburgh: Churchill Livingstone; 1997

Nevo I, Slonim-Nevo V. The myth of evidence-based practice: towards evidence-informed practice. British Journal of Social Work. 2011; 41:(6)1176-1197 https://doi.org/10.1093/bjsw/bcq149

Newhouse RP, Dearholt S, Poe S, Pugh LC, White K. The Johns Hopkins Nursing Evidence-based Practice Rating Scale. Baltimore (MD): The Johns Hopkins Hospital/Johns Hopkins University School of Nursing; 2005

Nursing and Midwifery Council. The Code. 2018. https://www.nmc.org.uk/standards/code (accessed 7 March 2022)

Nutley S, Walter I, Davies HTO. Promoting evidence-based practice: models and mechanisms from cross-sector review. Research on Social Work Practice. 2009; 19:(5)552-559 https://doi.org/10.1177/1049731509335496

Reed JE, Howe C, Doyle C, Bell D. Successful Healthcare Improvements From Translating Evidence in complex systems (SHIFT-Evidence): simple rules to guide practice and research. Int J Qual Health Care. 2019; 31:(3)238-244 https://doi.org/10.1093/intqhc/mzy160

Rosswurm MA, Larrabee JH. A model for change to evidence-based practice. Image J Nurs Sch. 1999; 31:(4)317-322 https://doi.org/10.1111/j.1547-5069.1999.tb00510.x

Rubin A. Improving the teaching of evidence-based practice: introduction to the special issue. Research on Social Work Practice. 2007; 17:(5)541-547 https://doi.org/10.1177/1049731507300145

Shlonsky A, Mildon R. Methodological pluralism in the age of evidence-informed practice and policy. Scand J Public Health. 2014; 42:18-27 https://doi.org/10.1177/1403494813516716

Straus SE, Tetroe J, Graham I. Defining knowledge translation. CMAJ. 2009; 181:(3-4)165-168 https://doi.org/10.1503/cmaj.081229

Titler MG, Everett LQ. Translating research into practice. Considerations for critical care investigators. Crit Care Nurs Clin North Am. 2001; 13:(4)587-604 https://doi.org/10.1016/S0899-5885(18)30026-1

Titler MG, Kleiber C, Steelman V, et al. Infusing research into practice to promote quality care. Nurs Res. 1994; 43:(5)307-313 https://doi.org/10.1097/00006199-199409000-00009

Titler MG, Kleiber C, Steelman VJ, et al. The Iowa model of evidence-based practice to promote quality care. Crit Care Nurs Clin North Am. 2001; 13:(4)497-509 https://doi.org/10.1016/S0899-5885(18)30017-0

Ubbink DT, Guyatt GH, Vermeulen H. Framework of policy recommendations for implementation of evidence-based practice: a systematic scoping review. BMJ Open. 2013; 3:(1) https://doi.org/10.1136/bmjopen-2012-001881

Wang LP, Jiang XL, Wang L, Wang GR, Bai YJ. Barriers to and facilitators of research utilization: a survey of registered nurses in China. PLoS One. 2013; 8:(11) https://doi.org/10.1371/journal.pone.0081908

Warren JI, McLaughlin M, Bardsley J, Eich J, Esche CA, Kropkowski L, Risch S. The strengths and challenges of implementing EBP in healthcare systems. Worldviews Evid Based Nurs. 2016; 13:(1)15-24 https://doi.org/10.1111/wvn.12149

Webber M, Carr S. Applying research evidence in social work practice: Seeing beyond paradigms. In: Webber M (ed). London: Palgrave; 2015

Woodbury MG, Kuhnke JL. Evidence-based practice vs. evidence-informed practice: what's the difference?. 2014. https://tinyurl.com/2p8msjaf (accessed 8 March 2022)

Evidence-informed practice: simplifying and applying the concept for nursing students and academics

Elizabeth Adjoa Kumah

Nurse Researcher, Faculty of Health and Social Care, University of Chester, Chester


Robert McSherry

Professor of Nursing and Practice Development, Faculty of Health and Social Care, University of Chester, Chester


Josette Bettany-Saltikov

Senior Lecturer, School of Health and Social Care, Teesside University, Middlesbrough

Paul van Schaik

Professor of Research, School of Social Sciences, Humanities and Law, Teesside University, Middlesbrough


Background:

Nurses' ability to apply evidence effectively in practice is a critical factor in delivering high-quality patient care. Evidence-based practice (EBP) is recognised as the gold standard for the delivery of safe and effective person-centred care. However, decades following its inception, nurses continue to encounter difficulties in implementing EBP and, although models for its implementation offer stepwise approaches, factors, such as the context of care and its mechanistic nature, act as barriers to effective and consistent implementation. It is, therefore, imperative to find a solution to the way evidence is applied in practice. Evidence-informed practice (EIP) has been mooted as an alternative to EBP, prompting debate as to which approach better enables the transfer of evidence into practice. Although there are several EBP models and educational interventions, research on the concept of EIP is limited. This article seeks to clarify the concept of EIP and provide an integrated systems-based model of EIP for the application of evidence in clinical nursing practice, by presenting the systems and processes of the EIP model. Two scenarios are used to demonstrate the factors and elements of the EIP model and define how it facilitates the application of evidence to practice. The EIP model provides a framework to deliver clinically effective care, and the ability to justify the processes used and the service provided by referring to reliable evidence.

Evidence-based practice (EBP) was first mentioned in the literature by Muir-Gray, who defined EBP as ‘an approach to decision-making in which the clinician uses the best available evidence in consultation with the patient to decide upon the option which suits the patient best’ (1997:97). Since this initial definition was set out in 1997, EBP has gained prominence as the gold standard for the delivery of safe and effective health care.

There are several models for implementing EBP. Examples include:

  • Rosswurm and Larrabee's (1999) model
  • The Iowa model ( Titler et al, 2001 )
  • Collaborative research utilisation model (Dufault, 2004)
  • DiCenso et al's (2005) model
  • Greenhalgh et al's (2005) model
  • Johns Hopkins Nursing model ( Newhouse et al, 2005 )
  • Melnyk et al's (2010) model.

Although a comprehensive review of these models is beyond the scope of this article, a brief assessment reveals some commonalities among them. These include a) asking or selecting a practice question, b) searching for the best evidence, c) critically appraising and applying the evidence, d) evaluating the outcome(s) of patient care delivery, and e) disseminating the outcome(s).

Despite the benefits of EBP, and the existence of multiple EBP models intended to facilitate the application of evidence into practice, health professionals, including nurses, continue to struggle to implement it effectively (Ubbink et al, 2013). Critics of EBP have questioned its validity (Rubin, 2007; Nevo and Slonim-Nevo, 2011); the best practice and setting to support its use (Nutley et al, 2009); its failure to address the complexity of health and health care, as well as the patient's context (Muir-Gray, 1997; Reed et al, 2019); and its mechanistic approach (Epstein, 2009; Jerkert, 2015). Some of these criticisms are outlined below.

For example, previous studies have reported the barriers health professionals face in implementing EBP successfully. Ubbink et al (2013) conducted a systematic review to determine nurses' and doctors' views on the knowledge, attitudes, skills, barriers, and behaviour required to implement EBP. The review included 31 studies from 17 countries: eight from North America and 11 from Europe. The results revealed that organisational and individual barriers prevent the uptake of EBP among nurses and doctors. Organisational barriers included a lack of material and human resources and a lack of support from managers and leaders; individual barriers included a knowledge deficit regarding EBP, time and workload (Ubbink et al, 2013). Researchers such as Hitch and Nicola-Richmond (2017) and Warren et al (2016) found similar barriers to implementing EBP reported by health professionals.

Effective and consistent implementation of EBP in healthcare settings depends on complex interdependent factors, such as the characteristics of an organisation (eg the internal and external healthcare environment, and organisational structures and values); the EBP intervention (eg reduction of hospital-acquired infections); and the attitudes of the individual practitioner towards EBP (Titler and Everett, 2001; Cullen and Adams, 2012). Yet, existing approaches to EBP have been ineffective in facilitating its implementation (Greenhalgh et al, 2014).

Consequently, authors such as Cullen and Adams (2012) and Greenhalgh et al (2014) have called for a resurgence of the concept, especially concerning the components of EBP associated with involving patients in decision-making, and with expert judgement and experience. Greenhalgh et al (2014:3) consider it is time to return to implementing ‘real EBP’, where person-centred care is the priority, and health professionals and their patients ‘are free to make appropriate care decisions that may not match what best evidence seems to suggest’. Nonetheless, researchers including McSherry et al (2002) , Epstein (2009) and Nevo and Slonim-Nevo (2011) have proposed an alternative, holistic approach to the application of evidence into practice, termed evidence-informed practice (EIP).

Journey towards evidence-informed practice

The problems with the uptake and effective implementation of EBP led to the emergence of the EIP concept. This concept is based on the premise that healthcare practice should, as a matter of principle, be informed by, rather than based on, evidence ( Nevo and Slonim-Nevo, 2011 ). This implies that other forms of evidence (for example, patient experiences, the nurse's expertise and experiences), not just the ‘research evidence’, should be considered in applying evidence in practice.

McSherry et al (2002) defined EIP as the assimilation of professional judgment and research evidence regarding the efficiency of interventions. This definition was further elaborated as an approach to patient care where:

‘Practitioners are encouraged to be knowledgeable about findings coming from all types of studies and to use them in an integrative manner, taking into consideration clinical experience and judgment, clients' preferences and values, and context of the interventions.’

Nevo and Slonim-Nevo (2011:18)

It has been over two decades since EIP emerged in the literature; however, primary research on the concept has been limited. Hence, although the term EIP has gained momentum in recent times, the methods needed to implement it effectively are not widely known (McSherry, 2007; Woodbury and Kuhnke, 2014). While some proponents of EIP (eg Epstein, 2011; Webber and Carr, 2015) have identified significant differences between EBP and EIP, most researchers (eg Ciliska, 2012; Shlonsky and Mildon, 2014) have used the terms interchangeably.

Ciliska (2012) , for instance, developed an evidence-informed decision making (EIDM) module, but referred to the steps of EBP (ie Ask, Acquire, Appraise, Integrate, Adapt, Apply, Analyse) as the processes to be followed in implementing EIDM. Ciliska (2012) explained that the term EIDM was adopted to signify that other types of evidence are useful in clinical decision-making and to attempt to get beyond the criticisms of EBP. This notwithstanding, the author maintained the existing process for implementing EBP. Similarly, Shlonsky and Mildon (2014) used the terms EBP and EIP interchangeably, as they consistently referred to an EBP approach as EIP. Examples include referring to the steps of EBP as ‘the steps of EIP’ ( Shlonsky and Mildon, 2014:3 ) and referring to Haynes et al's (2002) expanded EBP model as a ‘revised EIP model’ ( Shlonsky and Mildon, 2014:2 ).

Another term that is often used interchangeably with EIP is ‘knowledge translation’. This term has been explored extensively. For example, the Canadian Institutes of Health Research (CIHR) has adopted knowledge translation to signify the use of high-quality research evidence to make informed decisions (Straus et al, 2009). The CIHR (Graham et al, 2006) developed a ‘knowledge to action’ model intended to integrate the creation and application of knowledge. The model acknowledges the non-linear process of applying evidence in practice, where each stage is influenced by the next, as well as the preceding, stage. In a typical clinical setting, the actual process of applying evidence in practice is not linear, as acknowledged by the proponents of EBP, but cyclical and interdependent. Ciliska (2012) linked Graham et al's (2006) model to the components of evidence-informed decision-making. According to Ciliska (2012:7), the knowledge-to-action model ‘fits with the steps of evidence-informed decision-making’. However, like EBP, the term ‘knowledge translation’ differs significantly from the EIP concept because it focuses on the ‘research evidence’ in decision-making.

The apparent confusion surrounding EIP is due to inadequate information about its components and the methods involved in implementing the concept. To foster a culture of EIP among health professionals, they must first be made aware of the actual components of the concept and the strategies involved in its successful implementation. The following section uses case scenarios to provide a description of the factors and elements of the EIP model and defines how it facilitates the application of evidence into clinical nursing practice.

Systems thinking

The clinical setting within which nurses work is a complex system made up of several interdependent and inter-related parts. Problems with healthcare delivery and management must therefore be perceived as a consequence of the exchanges between elements of the system, rather than as the outcome of the malfunctioning of a particular element. This is what McSherry and Warr (2010) have referred to as ‘systems thinking’.

Effective implementation of EIP demands an understanding of the various parts of the system that come together to aid the application of evidence in practice.

The evidence-informed practice model

The original model.

The earliest version of the evidence-informed practice model is depicted in Figure 1. It was developed specifically for nurses and was originally named ‘the evidence-informed nursing model’. The model was developed through PhD research conducted by Robert McSherry (2007), with the aim of exploring, through a mixed-methods study design, why the use of research as evidence in support of clinical nursing practice remains problematic. Study participants were registered nurses practising in a hospital trust located in north-east England.

[Figure 1. The original evidence-informed nursing model (McSherry, 2007)]

The results of McSherry's (2007) study showed that, to effectively apply evidence in clinical nursing practice, nurses needed to be informed of, and be able to interact with, several key elements. The evidence-informed nursing model was developed as an alternative framework for facilitating the application of evidence in clinical nursing practice and was grounded in the principles and practices of systems thinking. This is because, primarily, the model provided an integrated process for applying evidence into practice, consisting of:

  • A clearly defined input: encouraging nurses to use evidence in practice
  • Throughput: facilitation of the processes associated with the elements
  • Output: improved standards of professional practice

The revised model

The evidence-informed nursing model has been adapted into the evidence-informed practice model. The new model (Figure 2) differs from the original in several ways. First, it has been made all-inclusive, so that it can be applied to any health profession. Second, the model has been simplified to show the interconnectedness of the various factors and elements that enable a professional to use evidence in support of their clinical decision-making. Third, the model demonstrates the ongoing complexity within which health professionals work in the quest to apply evidence to clinical practice. Last, the EIP model incorporates the principles and components of EBP, which is particularly evident in the EIP cycle (the throughput phase of the model).

[Figure 2. The evidence-informed practice (EIP) model]

The factors and elements of the EIP model ( Figure 2 ) are explored in more detail below with reference to two scenarios, which are used to apply the EIP model to clinical nursing practice within both a scientific and the wider context within which nursing care takes place.

The first factor of the EIP model is ‘Factor 1. Drivers for evidence-informed practice’ ( Figure 2 ). In order for nurses to enhance patient care and experiences, along with improving their knowledge and skills of the patient's condition and associated signs and symptoms, they need to be aware of what EIP is, what it involves, and the principles required to make it happen. Applying the scenarios, it is essential that the nurse understands and can identify the key elements that drive successful implementation of the EIP concept. This is referred to as the drivers for EIP, which are illustrated in Figure 3 and discussed below.

[Figure 3. Drivers for evidence-informed practice]

Drivers for EIP

Staff selection.

Recruiting, interviewing and redeploying existing staff or hiring new staff are part of the staff selection process (Dill and Shera, 2012). The purpose of this driver is to identify personnel who are qualified to implement the EIP programme or model. Additionally, it aims to select individuals within the organisation (for example, coaches, supervisors and trainers) who will ensure that the organisational changes required to support nurses in the effective implementation of EIP are made.

In-service or pre-service training

Training on EIP programmes involves activities that are related to offering instruction, providing specialist information or skills development in a structured manner to nurses and other key healthcare staff involved in the EIP programme. Nurses, as well as other members of staff, must learn when, how, where, and with whom to use new approaches and skills in applying evidence to practice ( Metz et al, 2007 ).

Coaching, supervision and mentoring

The coaching and mentoring approach enables new skills to be introduced to nurses on the ward with the support of a coach. The duty of a coach is to offer expert information and support, together with encouragement, opportunities and advice to practise and apply skills that are specific to the EIP programme. Effective implementation of human service interventions (such as EIP) requires changes in behaviour at administrative, supervisory and practitioner levels ( Dill and Shera, 2012 ). Coaching and mentoring are the main ways to bring about a change in behaviour for staff who have been successfully involved in the beginning stage of the implementation process and throughout the life of the EIP programme.

Systems-level partnership

This refers to the improvement of partnerships with the broader and immediate systems to ensure access to required funds, and institutional and human resources necessary to support nurses' work. The immediate systems-level partnership refers to working with individuals or organisations that directly influence healthcare delivery (for example, nurses and doctors).

Partnerships within the broader system, on the other hand, refer to policymakers, funders or other organisations that may support the EIP programme, but are not directly involved in delivering health care. A variety of activities may be conducted as part of the development of systems-level partnerships to aid the implementation of EIP. These may include fundraising activities to support the implementation of EIP programmes, as well as the use of external coaches and consultants to assist with mentoring, technical assistance and training on an ongoing basis.

Internal management support

This involves activities that are associated with establishing processes and structures within an EIP programme to enhance effective implementation of the programme. This is necessary in order to inform healthcare decision-making as well as keep staff organised and focussed on desired care outcomes ( Fixsen et al, 2005 ). Instances of internal management support include the formation of institutional structures and processes, the allocation of resources to support selection of suitable staff, and administrative support for efficient training.

Staff performance and programme evaluation

This involves evaluation of staff performance and the overall EIP programme to determine whether the objectives of the programme have been achieved. To do this effectively, it is important to evaluate the outcomes of the above-defined drivers, in particular, staff selection, in-service training, as well as coaching and mentoring. This will offer managers and stakeholders insight about the effectiveness of staff selection, training, and mentoring in facilitating the application of evidence into clinical practice ( Dill and Shera, 2012 ).

Elements of the EIP model

The first element of the EIP model is professional accountability, depicted as an ‘input’ in Figure 2 . This is an essential part of a nurse's roles and responsibilities and is reaffirmed in the nursing Code ( Nursing and Midwifery Council, 2018 ) of professional practice, the contract of employment and job description. In both case scenarios involving Mitchell and Yvonne ( Box 1 ), professional accountability is evident on several fronts: the nurse must establish a caring, compassionate and therapeutic relationship with the patients by involving and engaging them in shared decision-making regarding all aspects of their care, treatments, and interventions; the nurse is accountable and answerable to the patient and his or her professional colleagues throughout the patient's journey.

Box 1. Patient scenarios
Scenario 1: Yvonne, aged 31, is admitted to the emergency medical unit following a visit to her GP for a non-healing wound to her right big toe. The GP also reported that Yvonne has had a recurring sore throat, extreme tiredness and a low white blood cell count. The GP requested an urgent investigation of these symptoms. Yvonne was placed in a side room as a precaution.
Scenario 2: Mitchell, aged 58, arrives in the emergency department complaining of severe chest pain. He is diaphoretic (sweating excessively) and says his pain is radiating down his left arm and up into his jaw, and he adds that he feels nauseated. A few minutes after admission, he suffers a cardiac arrest. He is resuscitated and transferred to the intensive care unit. He is intubated, placed on a ventilator and has a central line catheter in place.

Throughput: the evidence-informed practice cycle

The EIP cycle (located in the ‘throughput’ of Figure 2) involves the processes or methods through which nurses apply evidence in support of their decision-making in clinical nursing practice. This often occurs in a clinical nursing environment that is complex, constantly changing, and involves numerous members of the multidisciplinary team, patients and their family. Effective communication (verbal and written) is essential for ensuring that the various elements are interchanging, interconnecting and communicating between, and with, each other. The case of Yvonne in scenario 1 (Box 1) underlines the importance of good communication: it is important to explain to the patient and her family the reason for nursing her in a side room rather than on the main ward. In this situation, avoiding and preventing cross-infection is essential to safeguard Yvonne from harm.

Ensuring that the EIP cycle proceeds effectively requires the nurse (the health professional) to act as the conduit for the interplay between the different elements of the model (ie Element 2: informed decision-making; Element 3: research awareness; Element 4: application of knowledge; and Element 5: evaluation). These elements are explored further below.

Element 2. Informed decision-making

This involves two-way communication between the nurse and the patient(s), and is critical in ensuring there is a robust relationship (honesty, openness, transparency) founded on the principles of person-centred care ( McSherry and Warr, 2010 ). It reaffirms the ethical principle of a patient's right to make an informed decision about what is suitable for them, and takes into account their beliefs, values, priorities and personal circumstances. In case scenario 2, applying the EIP model, the critical care nurse will be expected to involve Mitchell's (the patient's) relatives, medical staff and other members of the healthcare team in making decisions about, for example, ventilator management and care of the central line catheter. However, decision-making in an intensive care unit can be complex, and some of the decisions may involve the nurse only. Similarly, applying the EIP model in case scenario 1, the nurse will be expected to communicate with the patient (Yvonne), carers and colleagues about the importance of hand hygiene, wound care and the importance of using precautions to avoid hospital-acquired infections when caring for the patient.

In both case scenarios, the nurse must endeavour to involve the patient/family members in the process of decision-making by providing them with timely, appropriate and relevant information needed to make often complex and life-changing decisions.

Element 3. Research awareness

This element refers to motivating practitioners to acquire skills and knowledge, as well as to conceptualise what research and evidence involves and the significance they have in improving standards of healthcare practice ( McSherry et al, 2006 ). Research awareness is reliant on the nurse's attitudes towards research, the acquisition of knowledge and confidence about the value of research to practice, and on having supportive managers and colleagues.

This element of the EIP cycle, contained within the model, incorporates three of the steps of EBP: asking a clinical question, searching the literature for research evidence to answer the question, and critically appraising the evidence obtained. Although the nurse is not required to be a researcher to implement the EIP model effectively, they must be knowledgeable about relevant databases and search engines (such as Medline and Google), as well as critical appraisal tools, in order to be able to include high-quality research evidence when making patient care decisions.

However, the EIP model acknowledges that research evidence may not always be readily available, and that nurses may not have the necessary hardware and software in the care environment to enable them to search for research evidence. Hence, following the recommendations of Greenhalgh et al (2014), the EIP model positions nurses as critical thinkers and doers, which allows them to make appropriate care decisions based on patient preferences and actions, the clinical state, and the clinical setting and circumstances. The model also advocates that nurses apply their own knowledge, expertise and clinical experience in clinical decision-making, which may not necessarily match what the research evidence seems to suggest.

With reference to scenario 2 (and similarly for scenario 1), to adhere to the EIP model the nurse would take the following steps:

  • Update his/her knowledge about Mitchell's clinical presentation
  • Search Medline for research evidence on ‘chest pain’, and ‘cardiac arrest’ and its associated symptoms. Based on the number of articles obtained, the nurse reads the titles and abstracts, and then, the full text of selected articles to exclude irrelevant articles. The remaining articles are then critically appraised to include the best research evidence in patient care decisions.

In situations where the above steps are not possible, the model advocates that the nurse endeavours to make the best care decisions possible based on patient preferences, clinical state, context and circumstances, and the nurse's own expertise and experience, as well as the experience of the patient and family members where possible.

Element 4. Application of knowledge

This is a complex element that requires the gathering and assimilation of various sources of information, evidence, quality and standards, and policy and guidance, to support the nurse's decision-making in clinical practice. In relation to both scenarios, the nurse would need to:

  • Apply knowledge acquired from the patients (Mitchell and Yvonne), along with information from their relatives
  • Apply evidence from reviewing the findings from research
  • Take into account information gleaned from engaging with the multidisciplinary team
  • Ensure they follow recommended local and national guidance and policy on the management of each patient's condition.

It is imperative that the nurse is experienced, knowledgeable and competent in order to make the most appropriate care decisions together with the patient, the family and the wider multidisciplinary team. To do this effectively, the nurse requires certain personal attributes; it is also important for the organisation within which the nurse works to have specific institutional characteristics. Institutional features include culture, education and training, and workload/skill mix, whereas personal characteristics include improved confidence, attitude, understanding and behaviour towards the application of evidence into practice.

Element 5. Evaluation

This element of the EIP cycle within the model measures the effects of decision-making and actions of the nurse on care outcomes and in creating an optimal care environment. In both scenarios, the nurse would need to periodically evaluate specific processes and outcomes of care. For example, with regards to scenario 2, this would include:

  • Monitoring how Mitchell is performing on the ventilator
  • Taking the necessary infection prevention precautions to avoid the development of infections related to the insertion of a central line and transmission of hospital-acquired infection
  • Monitoring improvement in Mitchell's general wellbeing.

Depending on the outcome of the evaluation, Mitchell's care plan would be either revised or continued.

Element 6. Conditions affecting research utilisation

Research utilisation involves critically appraising research findings, disseminating, and using the knowledge obtained from research to cause changes in an existing healthcare practice ( Titler et al, 1994 ). The conditions that affect research utilisation are grouped into five domains ( Wang et al, 2013 ):

  • The process involved in utilising research findings
  • Accessibility to research
  • The quality of research
  • The knowledge and attitudes of the nurse (health professional) regarding the use of research findings
  • The organisation within which the findings of research are to be implemented.

In the two scenarios ( Box 1 ), the nurse needs to be aware of the potential barriers to research utilisation and identify ways to overcome these in order to effectively apply evidence to healthcare practice. In addition, the clinical environment within which nurses work must provide sufficient support in order to enhance the effective and consistent application of evidence to practice. Nurses must be supported to acquire the necessary knowledge, skills, and understanding needed to practise safely (ie competently and confidently). In addition, the resources necessary to obtain research evidence, such as IT (computers and internet), must be readily available in the clinical setting for easy access to information.

Factor 2 (Output). Critical thinker and doer, the professional nurse

To ensure that nurses inform their decisions with the best available evidence, it is imperative that they have a sound understanding and knowledge of what constitutes the EIP model ( Figure 2 ). Successfully engaging with the various factors and elements of this model will lead to the desired outcome—that of a professional who is a critical thinker and doer, a professional nurse who, as argued by Brechin (2000:44) , is ‘knowledgeable and skilled, yet welcomes alternative ideas and belief systems, appreciating and respecting alternative views’. In this context, it is about creating a caring and compassionate environment in which excellence in nursing practice occurs. This can only be exemplified by ensuring that decisions and actions are based on the best available evidence.

The benefits of the EIP model for the nurse, patient and family are that it simplifies a highly complex series of systems and processes pertaining to how evidence is used to support decisions made in clinical practice. The EIP model simply illustrates the why, the how and the sequencing of getting evidence into clinical practice. It also complements the evidence-based movement by offering a holistic systems-based approach to facilitating the application of evidence into clinical practice.

EIP is a holistic integrated approach to applying evidence into practice, which incorporates the steps of EBP within its system and processes. In other words, EBP is a subset of the EIP model, made explicit within the EIP cycle. Thus, EIP is neither an alternative to, nor a replacement for, EBP. The EIP model provides a framework for nurses (indeed all health practitioners) to deliver clinically effective care and enable them to justify the processes used and the service provided by referring to reliable evidence. Using two scenarios, this article demonstrated how the EIP model can be applied to clinical nursing practice. Future initiatives should focus on developing EIP educational interventions and determining the effects of such interventions on healthcare students' knowledge of, and attitudes towards, the application of evidence to practice.

  • Two main concepts have been associated with the application of evidence into practice: evidence-based practice (EBP) and evidence-informed practice (EIP)
  • The main feature that distinguishes EIP from EBP is the processes used in implementing the concepts
  • EIP provides the mechanisms or processes to follow in implementing EBP
  • EIP is not a substitute or replacement for EBP. EIP is an integrated approach to applying evidence to practice, which incorporates the steps of EBP in its processes

CPD reflective questions

  • Make a list of the challenges you encounter in implementing EBP
  • Use the same list and indicate how these challenges prevent you from using evidence to support your nursing clinical decisions and actions in practice
  • How does viewing health and healthcare delivery as a complex system impact on your patient care?
  • Make a list of the drivers that are encouraging you to support your clinical nursing decisions and actions with evidence
  • Using your own experience to date and the information presented in the text, make a list of why and how you think evidence-informed practice forms part of your professional accountability and professional registration

Evidence Based Practice


Learning Objectives

After completing this lesson you will be able to:

(1) Define evidence-based practice (EBP)

(2) Describe the role of research and EBP in clinical practice

(3) Discuss the differences between research and EBP

Shay is a clinical nurse on the Bone Marrow Transplant & Hematology Inpatient Unit. She is assigned four patients, three of whom have multiple blood products ordered. Throughout the day, Shay notices the Health Care Assistant (HCA) is “frazzled,” running from room to room to obtain frequent vital signs on these patients. A fall-risk patient in room 25 hits their call light to ask for help to the restroom, but the HCA is busy in room 31 with more vital signs. The patient decides to get up without assistance and falls. Shay starts to wonder, “Is there a better way to do this?” She also realizes that she has never personally seen a blood transfusion reaction happen on the unit, and asks herself, “How often do transfusion reactions actually happen? Does the evidence support vital signs being done this frequently?”

What is evidence-based practice?

Evidence-based practice (EBP) is applying or translating research findings in our daily patient care practices and clinical decision-making.

EBP also involves integrating the best available evidence with clinical knowledge and expertise, while considering patients’ unique needs and personal preferences. When EBP is used consistently, optimal patient outcomes are more likely to be achieved.

Using EBP means abandoning outdated care delivery practices and choosing effective, scientifically validated methods to meet individual patient needs. Health care providers who use EBP must be skilled at discerning the value of research for their specific patient population. 

How to apply EBP in clinical practice

Evaluating all of the available evidence on a subject would be a nearly impossible task. Luckily, there are a number of EBP processes that have been developed to help health care providers implement EBP in the workplace.

The most common process follows these six steps:

1. ASK a question. Is there something in your clinical setting that you are wondering about? Perhaps you wonder if a new intervention is more effective than the one currently used. Ask yourself: What works well and what could be improved? And, more importantly, WHY? Evaluate the processes and workflow that impact, or are impacted by, the identified practice gap. We’ll use a format called PICO(T) (pronounced “pee ko”). Learn more about PICO(T) questions in the next module.

2. ACQUIRE the current evidence. You’ll do this by conducting a literature search. Your search will be guided by your clinical question.

3. APPRAISE the literature. Or, in other words, sort, read, and critique peer-reviewed literature.

4. APPLY your findings to clinical decision-making. Integrate the evidence with clinical expertise and patient preferences and values. Then make evidence-based recommendations for day-to-day practice.

5. EVALUATE your outcomes. Review data and document your approach. Be sure to include any revisions or changes. Keep close tabs on the outcomes of your intervention. Evaluate and summarize the outcome.

6. DISSEMINATE the information. Share the results of your project with others. Sharing helps promote best practices and prevent duplicative work. It also adds to the existing resources that support or oppose the practice.

Though we may learn how to apply EBP by participating in project-based work, integrating EBP in our daily practice can help us strive to achieve the best possible patient outcomes. It requires us to be thoughtful about our practice and ask the right questions.

It's important to note that although applying evidence at the bedside can be conducted individually, working collaboratively as a team is more likely to result in lasting improvement.

Before there was evidence…

As health care providers, we should let the delivery of patient care stimulate questions about the evidence behind our daily practice.

For instance, there was a time when neutropenic patients were placed in strict isolation to protect them from developing life-threatening infections. Research findings were evaluated for best evidence and it was noted that using strict isolation precautions did not result in more favorable patient outcomes when compared to proper handwashing procedures coupled with standard precautions—and it seemed that we unnecessarily subjected patients to the negative psychological effects caused by extreme isolation. 

As clinicians, we sometimes follow outdated policies or practices without questioning their relevance, accuracy, or the evidence that supports their continued use.

What’s the difference between research and EBP?

There is a common misconception that EBP and research are one and the same. Not true! While there are similarities, one of the fundamental differences lies in their purpose. The purpose of conducting research is to generate new knowledge or to validate existing knowledge based on a theory. Research involves systematic, scientific inquiry to answer specific questions or test hypotheses using disciplined, rigorous methods. For research results to be considered reliable and valid, researchers must apply the scientific method in orderly, sequential steps.

In contrast, the purpose of EBP isn’t about developing new knowledge or validating existing knowledge—it’s about translating the evidence and applying it to clinical practice and decision-making. The purpose of EBP is to use the best available evidence to make informed patient-care decisions. Most of the best evidence stems from research, but EBP goes beyond research and includes the clinical expertise of the clinician and healthcare teams, as well as patient preferences and values.   

Before you begin – a few important considerations

Do you have more than just evidence?


Need help getting patient feedback? Learn about U of U Health's  Patient Design Studio .

Research findings, in the absence of other considerations, should not be used independently to justify a change in practice. Other factors that must be considered include:

  • Patient values and preferences
  • Experience of the health care provider
  • Patient assessment and laboratory findings
  • Data obtained from other sources, such as unit-based metrics and workflow

For EBP strategies to result in the best patient outcomes, all of these factors must be considered.

Do you have adequate sponsorship and resources?


For more helpful tips to get started, read " Ask These Four Questions Before Starting Any Improvement ."

To implement EBP, we also need to consider if the implementation of the project will be supported by administration and institutional resources. For example, suppose there is a strong body of evidence showing reduced incidence of depression in pregnant women who receive cognitive therapy sessions when they are hospitalized for extended periods of time. While this might be a great idea, budget constraints may prevent hiring a therapist to offer this treatment.  

While you are thinking of resources, think about people, or human resources. Who in your organization can assist you with the project? Are there content experts or key stakeholders that you should involve early on?

Do you have access to data and a plan for measuring progress?

Just like research, we must evaluate and monitor any changes in outcomes after implementing an EBP project so that positive effects are supported and negative effects are remedied. An intervention may be highly effective in a rigorously controlled trial, but that doesn’t always indicate it will work exactly the same way in your clinical setting or for your individual patients.   

The goal of conducting EBP is to utilize current knowledge and connect it with patient preferences and clinical expertise to standardize and improve care processes and, ultimately, patient outcomes. 

Resources to get started:

  • Contact the Evidence-based Practice Council (U of U Health) An interprofessional collective dedicated to incorporating evidence-based practice into daily work.
  • Clinical Skills: Clinical Staff Education (U of U Health) The “Clinical Skills” tab offers a host of evidence-based practice changes to start applying today.
  • Eccles Health Sciences Library Resources (EHSL U of U Health) Resources, tips, and tools for evidence-based practice in health care.  

This article originally appeared 12/18/19. It was updated to reflect current practice 2/26/21.

Barbara Wilson

Mary-Jean (Gigi) Austria

You have a good idea about what you want to study, compare, understand or change. But where do you go from there? First, you need to be clear about exactly what it is you want to find out. In other words, what question are you attempting to answer? Librarian Tallie Casucci and nursing leaders Gigi Austria and Barb Wilson help us understand how to formulate searchable, answerable questions using the PICO(T) framework.

Librarian Tallie Casucci and college of nursing leader Barb Wilson review the steps to conduct a literature search, as well as provide some local resources to help if you get stuck.

The practice of medicine is recognized as a high-risk, error-prone environment. Anesthesiologist Candice Morrissey and internist and hospitalist Peter Yarbrough help us understand the importance of building a supportive, no-blame culture of safety.




Evidence-based practice education for healthcare professions: an expert view


  • Elaine Lehane 1 ,
  • Patricia Leahy-Warren 1 ,
  • Cliona O’Riordan 1 ,
  • Eileen Savage 1 ,
  • Jonathan Drennan 1 ,
  • Colm O’Tuathaigh 2 ,
  • Michael O’Connor 3 ,
  • Mark Corrigan 4 ,
  • Francis Burke 5 ,
  • Martina Hayes 5 ,
  • Helen Lynch 6 ,
  • Laura Sahm 7 ,
  • Elizabeth Heffernan 8 ,
  • Elizabeth O’Keeffe 9 ,
  • Catherine Blake 10 ,
  • Frances Horgan 11 ,
  • Josephine Hegarty 1
  • 1 Catherine McAuley School of Nursing and Midwifery , University College Cork , Cork , Ireland
  • 2 School of Medicine , University College Cork , Cork , Ireland
  • 3 Postgraduate Medical Training , Cork University Hospital/Royal College of Physicians , Cork , Ireland
  • 4 Postgraduate Surgical Training, Breast Cancer Centre , Cork University Hospital/Royal College of Surgeons , Cork , Ireland
  • 5 School of Dentistry , University College Cork , Cork , Ireland
  • 6 School of Clinical Therapies , University College Cork , Cork , Ireland
  • 7 School of Pharmacy , University College Cork , Cork , Ireland
  • 8 Nursing and Midwifery Planning and Development Unit , Kerry Centre for Nurse and Midwifery Education , Cork , Ireland
  • 9 Symptomatic Breast Imaging Unit , Cork University Hospital , Cork , Ireland
  • 10 School of Public Health, Physiotherapy and Sports Science , University College Dublin , Dublin , Ireland
  • 11 School of Physiotherapy , Royal College of Surgeons in Ireland , Dublin , Ireland
  • Correspondence to Dr Elaine Lehane, Catherine McAuley School of Nursing and Midwifery, University College Cork, Cork T12 K8AF, Ireland; e.lehane{at}ucc.ie

https://doi.org/10.1136/bmjebm-2018-111019



Keywords: qualitative research

Introduction

To highlight and advance clinical effectiveness and evidence-based practice (EBP) agendas, the Institute of Medicine set a goal that by 2020, 90% of clinical decisions will be supported by accurate, timely and up-to-date clinical information and will reflect the best available evidence to achieve the best patient outcomes. 1 To ensure that future healthcare users can be assured of receiving such care, healthcare professions must effectively incorporate the necessary knowledge, skills and attitudes required for EBP into education programmes.

The application of EBP continues to be observed irregularly at the point of patient contact. 2 5 7 The effective development and implementation of professional education to facilitate EBP remains a major and immediate challenge. 2 3 6 8 Momentum for continued improvement in EBP education in the form of investigations which can provide direction and structure to developments in this field is recommended. 6

As part of a larger national project looking at current practice and provision of EBP education across healthcare professions at undergraduate, postgraduate and continuing professional development programme levels, we sought key perspectives from international EBP education experts on the provision of EBP education for healthcare professionals. The two other components of this study, namely a rapid review synthesis of EBP literature and a descriptive, cross-sectional, national, online survey relating to the current provision and practice of EBP education to healthcare professionals at third-level institutions and professional training/regulatory bodies in Ireland, will be described in later publications.

EBP expert interviews were conducted to ascertain current and nuanced information on EBP education from an international perspective. Experts from the UK, Canada, New Zealand and Australia were invited by email to participate based on their contribution to peer-reviewed literature on the subject area and recognised innovation in EBP education. Over a 2-month period, individual ‘Skype’ interviews were conducted and recorded. The interview guide (online supplementary appendix A) focused on current practice and provision of EBP education, with specific attention given to EBP curricula, core EBP competencies, assessment methods, teaching initiatives and key challenges to EBP education within respective countries. Qualitative content analysis techniques as advised by Bogner et al 9 for examination of expert interviews were used. Specifically, a six-step process was applied, namely transcription, reading through/paraphrasing, coding, thematic comparison, sociological conceptualisation and theoretical generalisation. To ensure trustworthiness, a number of practices were followed, including explicit description of the methods used, the participant profile, extensive use of interview transcripts by way of representative quotations, peer review (PL-W) of the data analysis process, and inviting interviewees to give feedback on the overall findings.

Supplementary file 1

Five EBP experts participated in the interviews ( table 1 ). All experts waived their right to anonymity.


Table 1. EBP education expert profile

Three main categories emerged, namely (1) ‘EBP curriculum considerations’, (2) ‘Teaching EBP’ and (3) ‘Stakeholder engagement in EBP education’. These categories informed the overarching theme of ‘Improving healthcare through enhanced teaching and application of EBP’ ( figure 1 ).


Figure 1. Summary of data analysis findings from evidence-based practice (EBP) expert interviews: theme, categories and subcategories.

EBP curriculum considerations

Definitive advice in relation to curriculum considerations was provided, with a clear emphasis on the need for EBP principles to be integrated throughout all elements of healthcare professions curricula. Educators, regardless of teaching setting, need to be able to ‘draw out evidence-based components’ from any and all aspects of curriculum content, including its incorporation into assessments and examinations. Integration of EBP into clinical curricula in particular was considered essential to successful learning and practice outcomes. If students perceive a dichotomy between EBP and actual clinical care, then “never the twain shall meet” (GG), requiring integration in such a way that it is “seen as part of the basics of optimal clinical care” (GG). Situating EBP as a core element within the professional curriculum and linking it to professional accreditation processes places further emphasis on the necessity of teaching EBP:

…it is also core in residency programmes. So every residency programme has a curriculum on evidence-based practice where again, the residency programmes are accredited…They have to show that they’re teaching evidence-based practice. (GG)

In terms of the focus of curriculum content, all experts emphasised the oft-cited steps of asking questions, acquiring, appraising and applying evidence to patient care decisions. With regard to identifying and retrieving information, the following in particular was noted:

…the key competencies would be to identify evidence-based sources of information, and one of the key things is there should be no expectation that clinicians are going to go to primary research and evaluate primary research. That is simply not a realistic expectation. In teaching it…they have to be able to identify the pre-processed sources and they have to be able to understand the evidence and they have to be able to use it… (GG)

In addition to attaining proficiency in the fundamental EBP steps, developing competence in communicating evidence to others, including the patient, and facilitating shared decision-making were also highlighted:

…So our ability to communicate risks, benefits, understand uncertainty is so poor…that’s a key area we could improve… (CH)
…and a big emphasis [is needed] on the applicability of that information on patient care, how do you use and share the decision making, which is becoming a bigger and bigger deal. (GG)

It was suggested that these EBP ‘basics’ can be taught “from the start in very similar ways” (GG), regardless of whether the student is at an undergraduate or postgraduate level. The concept of ‘ developmental milestones’ was raised by one expert. This related to different levels of expectations in learning and assessing EBP skills and knowledge throughout a programme of study with an incremental approach to teaching and learning advocated over a course of study:

…in terms of developmental milestones. So for the novice…it’s really trying to get them aware of what the structure of evidence-based practice is and knowing what the process of asking a question and the PICO process and learning about that…in their final year…they’re asked to do critically appraised topics and relate it to clinical cases…It’s a developmental process… (LT)

Teaching EBP

Adoption of effective strategies and practical methods to realise successful student learning and understanding was emphasised. Of particular note was the grounding of teaching strategies and associated methods in a clinically relevant perspective, with student exposure to EBP facilitated in a dynamic and interesting manner. The use of patient examples and clinical scenarios was repeatedly expressed as one of the most effective instructional practices:

…ultimately trying to get people to teach in a way where they go, “Look, this is really relevant, dynamic and interesting”…so we teach them in loads of different ways…you’re teaching and feeding the ideas as opposed to “Here’s a definitive course in this way”. (CH)
…It’s pretty obscure stuff, but then I get them to do three examples…when they have done that they have pretty well got their heads around it…I build them lots of practical examples…clinical examples otherwise they think it’s all didactic garbage… (BA)

EBP role models were emphasised as being integral to demonstrating the application of EBP in clinical decision-making and facilitating the contextualisation of EBP within a specific setting/organisation.

…where we’ve seen success is where organisations have said, “There’s going to be two or three people who are going to be the champions and lead where we’re going”…the issue about evidence, it’s complex, it needs to be contextualised and it’s different for each setting… (CH)

It was further suggested that these healthcare professionals have the ‘X-factor’ required of EBP. The acquisition of such expertise, which enables a practitioner to integrate individual EBP components culminating in evidence-based decisions, was proposed as a definitive target for all healthcare professionals.

And we call it the X factor…the idea is that the clinician who has the X factor is the good clinician. It’s actually integrating the evidence, the patient values, the patient’s pathophysiology, etc. It could be behavioural issues, systems issues…Those are the four quadrants and the clinical expertise is about integrating those together…You’re not actually adding clinical expertise. It seems to me that the clinical expertise is the ability to integrate those four quadrants. (RJ)

The provision of training for educators to aid the further development of skills and use of resources necessary for effective EBP teaching was recommended:

…so we choose the option to train people as really good teachers and give them really high level skills so that they can then seed it across their organisation… (CH)

Attaining a critical mass of people who are ‘trained’ was also deemed important in making a sustained change:

…and it requires getting the teachers trained and getting enough of them. You don’t need everybody to be doing it to make an impression, but you need enough of them really doing it. (GG)

Stakeholder engagement in EBP education

Engagement of national policy makers, healthcare professionals and patients with EBP was considered to have significant potential to advance its teaching and application in clinical care. The lack of a coherent government and national policy on EBP teaching was cited as a barrier to the implementation of the EBP agenda, resulting in a somewhat ‘ad-hoc’ approach dependent on individual educational or research institutions:

…there’s no cohesive or coherent policy that exists…It’s not been a consistent approach. What we’ve tended to see is that people have started going around particular initiatives…but there’s never been any coordinated approach even from a college perspective, to say we are about improving the uptake and use of evidence in practice and/or generating evidence in practice. And so largely, it’s been left to research institutions… (CH)

To further ingrain EBP within healthcare professional practice, it was suggested that EBP processes, whether related to developing, disseminating or implementing evidence, be embedded in a more structured way into everyday clinical care to promote active and consistent engagement with EBP on a continuous basis:

…we think it should be embedded into care…we’ve got to have people being active in developing, disseminating and implementing evidence…developing can come in a number of formats. It can be an audit. It can be about a practice improvement. It can be about doing some aspect like a systematic review, but it’s very clearly close to healthcare. (CH)

Enabling patients to engage with evidence with a view to informing healthcare professional/patient interactions and care decisions was also advocated:

…I think we really need to put some energy into…this whole idea of patient-driven care, patient-led care and putting some of these tools in the hands of the consumers so that they’re enabled to be able to ask the right questions and to go into an interaction with some background knowledge about what treatments they should be expecting. (LT)

If patients are considered as recipients of EBP rather than key stakeholders, the premise of shared decision-making for care cannot be achieved.

Successful EBP education is necessary not only so that learners understand the importance of EBP and become competent in the fundamental steps, but ultimately so that it influences decision-making behaviour through the application of EBP in their professional practice. In essence, it serves the function of developing practitioners who value EBP and have the knowledge and skills to implement such practice. The ultimate goal of this agenda is to enhance the delivery of healthcare for improved patient outcomes. The overarching theme of ‘Improving healthcare through enhanced teaching and application of EBP’ represents the focus and purpose of the effort required to optimally structure healthcare professional (HCP) curricula, promote effective EBP teaching and learning strategies, and engage with key stakeholders for the overall advancement of EBP education as noted:

…we think that everyone in training should be in the game of improving healthcare…It’s not just saying I want to do some evidence-based practice…it’s ultimately about…improving healthcare. (CH)

Discussion and recommendations

Education programmes and associated curricula act as a key medium for shaping healthcare professional knowledge, skills and attitudes, and therefore play an essential role in determining the quality of care provided. 10 Unequivocal recommendations were made in relation to the pervasive integration of EBP throughout the academic and clinical curricula. Such integration is facilitated by the explicit inclusion of EBP as a core competency within professional standards and requirements in addition to accreditation processes. 11

Further emphasis on communication skills was also noted as key to enhancing EBP competency, particularly in relation to realising shared decision-making between patients and healthcare practitioners when making evidence-based decisions. A systematic review by Galbraith et al, 12 which examined a ‘real-world’ approach to evidence-based medicine in general practice, corroborates this recommendation by calling for further attention to the communication skills of healthcare practitioners within the context of being an evidence-based practitioner. This resonates with recommendations by Gorgon et al 13 for the need to expose students to the intricacies of ‘real world’ contexts in which EBP is applied.

Experts in EBP, together with trends throughout empirical research and recognised educational theory, repeatedly make a number of recommendations for enhancing EBP teaching and learning strategies. These include (1) clinical integration of EBP teaching and learning, (2) a conscious effort on behalf of educators to embed EBP throughout all elements of healthcare professional programmes, (3) the use of multifaceted, dynamic teaching and assessment strategies which are context-specific and relevant to the individual learner/professional cohort, and (4) ‘scaffolding’ of learning.

At a practical level this requires a more concerted effort to move away from a predominant reliance on stand-alone didactic teaching towards clinically integrative and interactive teaching. 10 14–17 An example provided by one of the EBP experts represents such integrated teaching and experiential learning through the performance of GATE/CATs (Graphic Appraisal Tool for Epidemiological studies/Critically Appraised Topics) while on clinical rotation, with assessment conducted by a clinician in practice. Such an activity fulfils the criteria of being reflective of practice, facilitating the identification of gaps between current and desired levels of competence, identifying solutions for clinical issues and allowing re-evaluation and opportunity for reflection on decisions made with a practitioner. This level of interactivity facilitates ‘deeper’ learning, which is essential for knowledge transfer. 8 Such practices are also essential to bridge the gap between academic and clinical worlds, enabling students to experience ‘real’ translation of EBP in the clinical context. 6 ‘Scaffolding’ of learning, whereby EBP concepts and their application increase in complexity and are reinforced throughout a programme, was also highlighted as an essential instructional approach which is in keeping with recent literature specific both to EBP education and from a broader curriculum development perspective. 3 6 18 19

In addition to addressing challenges such as curriculum organisation and programme content/structure, identifying salient barriers to implementing optimal EBP education is recommended as an expedient approach to effecting positive change. 20 Highlighted strategies to overcome such barriers included (1) ‘Training the trainers’, (2) development of and investment in a national coherent approach to EBP education, and (3) structural incorporation of EBP learning into workplace settings.

National surveys of EBP education delivery 21 22 found that a lack of academic and clinical staff knowledgeable in teaching EBP was a barrier to effective and efficient student learning. This was echoed by findings from EBP expert interviews, which correspond with assertions by Hitch and Nicola-Richmond 6 that while recommended educational practices and resources are available, their uptake is somewhat limited. Effective teacher/leader education is required to improve EBP teaching quality. 10 16 23 24 Such formal training should extend to academic and clinical educators. Supporting staff to have confidence and competence in teaching EBP and providing opportunities for learning throughout education programmes is necessary to facilitate tangible change in this area.

A national and coherent plan with associated investment in healthcare education specific to the integration of EBP was highlighted as having an important impact on educational outcomes. The lack of a coordinated and cohesive approach and perceived value of EBP in the midst of competing interests, particularly within the context of the healthcare agenda, was suggested to lead to an ‘ad-hoc’ approach to the implementation of and investment in EBP education and related core EBP resources. Findings from a systematic scoping review of recommendations for the implementation of EBP 16 draw attention to a number of interventions at a national level that have potential to further promote and facilitate EBP education. Such interventions include government-level policy direction in relation to EBP education requirements across health profession programmes and the establishment and financing of a national institute for the development of evidence-based guidelines.

Incorporating EBP activities into routine clinical practice has potential to promote the consistent participation in and implementation of EBP. Such incorporation can be facilitated at various levels and in different settings. At a health service level, the provision of computer and internet facilities at the point of care with associated content management/decision support systems allowing access to guidelines, protocols, critically appraised topics and condensed recommendations was endorsed. At a local workplace level, access to EBP mentors and the implementation of consistent and regular journal clubs, grand rounds, audits and research meetings are important to embed EBP within the healthcare and education environments. This in turn can nurture a culture which practically supports the observation and actualisation of EBP in day-to-day practice 16 and could in theory allow the coherent development of cohorts of EBP leaders.

There are study limitations which must be acknowledged. Four of the five interviewees were medical professionals. Further inclusion of allied healthcare professionals may have increased the representativeness of the findings. However, the primary selection criterion for participants was extensive and recognised expertise in relation to EBP education, the fundamental premises of which traverse specific professional boundaries.

Despite positive attitudes towards EBP and a predominant recognition of its necessity for the delivery of quality and safe healthcare, its consistent translation at the point of care remains elusive. To this end, continued investigations which seek to provide further direction and structure to developments in EBP education are recommended. 6 Although the quality of evidence has remained variable regarding the efficacy of individual EBP teaching interventions, consistent trends in relation to valuable andragogically sound educational approaches, fundamental curricular content and preferential instructional practices are evident within the literature in the past decade. The adoption of such trends is far from prevalent, which brings into question the extent of awareness that exists in relation to such recommendations and accompanying resources. There is a need to translate EBP into an active clinical resolution, which will have a positive impact on the delivery of patient care. In particular, an examination of current discourse between academic and clinical educators across healthcare professions is required to progress a ‘real world’ pragmatic approach to the integration of EBP education which has meaningful relevance to students and engenders active engagement from educators, clinicians and policy makers alike. Further attention is needed on strategies that not only focus on issues such as curricula structure, content and programme delivery but which support educators, education institutions, health services and clinicians to have the capacity and competence to meet the challenge of providing such EBP education.

Summary Box

What is already known?

Evidence-based practice (EBP) is established as a fundamental element and key indicator of high-quality patient care.

Both achieving competency and delivering instruction in EBP are complex processes requiring a multimodal approach.

Currently there is only modest utilisation of the existing resources available to further develop EBP education.

What are the new findings?

In addition to developing competence in the fundamental EBP steps of ‘Ask’, ‘Acquire’, ‘Appraise’, ‘Apply’ and ‘Assess’, developing competence in effectively communicating evidence to others, in particular patients/service users, is an area newly emphasised as requiring additional attention by healthcare educators.

The successful expansion of the assessment and evaluation of EBP requires a pragmatic amplification of the discourse between academic and clinical educators.

How might it impact on clinical practice in the foreseeable future?

Quality of care is improved through the integration of the best available evidence into decision-making as routine practice and not in the extemporised manner often currently practised.

Acknowledgments

Special thanks to Professor Leanne Togher, Professor Carl Heneghan, Professor Bruce Arroll, Professor Rodney Jackson and Professor Gordon Guyatt, who provided key insights on EBP education from an international perspective. Thank you to Dr Niamh O’Rourke, Dr Eve O’Toole, Dr Sarah Condell and Professor Dermot Malone for their helpful direction throughout the project.

  • 1. Institute of Medicine (IOM) (US) Roundtable on Evidence-Based Medicine. Leadership Commitments to Improve Value in Healthcare: Finding Common Ground: Workshop Summary. Washington (DC): National Academies Press (US), 2009.

Contributors This project formed part of a national project on EBP education in Ireland of which all named authors are members. The authors named on this paper made substantial contributions to both the acquisition and analysis of data, in addition to reviewing the report and paper for submission.

Funding This research was funded by the Clinical Effectiveness Unit of the National Patient Safety Office (NPSO), Department of Health, Ireland.

Competing interests None declared.

Patient consent Not required.

Ethics approval All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional ethical committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. Ethical approval was granted by the Social Research Ethics Committee, University College Cork (Log 2016–140).

Provenance and peer review Not commissioned; externally peer reviewed.

Data sharing statement The full report entitled ‘Research on Teaching EBP in Ireland to healthcare professionals and healthcare students’ is available on the National Clinical Effectiveness, Department of Health website.


Improving healthcare quality, patient outcomes, and costs with evidence-based practice

Setting the stage.


Tina Magers (nursing professional development and research coordinator at Mississippi Baptist Health Systems) and her team wondered why catheter-associated urinary tract infections (CAUTIs) affect as many as 25% of all hospitalized patients and questioned what evidence exists that could inform a practice change to reduce these infections in their hospital. (This is Step #0 in the seven-step evidence-based practice [EBP] process, which we describe in detail later in this chapter.) As a result, the team formed the following question in a format called PICOT (Patient population, Intervention or Interest area, Comparison intervention or group, Outcome, and Time; Step #1 in EBP) that enabled them to conduct an expedited, effective search for the best evidence (Magers, 2015):

In adult patients hospitalized in a long-term acute care hospital (P), how does the use of a nurse-driven protocol for evaluating the appropriateness of short-term urinary catheter continuation or removal (I) compared to no protocol (C) affect the number of catheter days and CAUTI rates (O) over a six-month post-intervention period (T)?

The team conducted an evidence search to answer this clinical question using the Cumulative Index to Nursing and Allied Health Literature (CINAHL), the Cochrane Database of Systematic Reviews, Cochrane Central Register of Controlled Trials, the Database of Abstracts of Reviews of Effects (DARE), Ovid Clinical Queries, and PubMed (Step #2 in EBP), followed by rapid critical appraisal of 15 studies found in the search (Step #3 in EBP). A synthesis of the 15 studies led the team to conclude that early removal of urinary catheters would likely reduce catheter days and CAUTIs (the identified outcomes). Therefore, the team wrote a protocol based on the evidence, listing eight criteria for the continuation of a short-term urinary catheter (Step #4 in EBP).

After the protocol was presented to the medical executive committee at their hospital for approval, a process for the change was put into practice, including an education plan with an algorithm, delivered through small-group in-services for the nurses, posters, and written handouts for physicians. An outcomes evaluation (Step #5 in the EBP process) revealed a significant reduction in catheter days and a clinically significant reduction of 33% in CAUTIs. The team disseminated the outcomes of the project to internal audiences (e.g., their Nursing Quality Council, the EBP and Research Council, Nursing Leadership Council, Organization Infection Control Committee) and external venues (presentations at regional conferences and a publication in the American Journal of Nursing) (Magers, 2013) (Step #6 in the EBP process).

This is a stellar exemplar of how a team with a spirit of inquiry and a commitment to improving healthcare quality can use the seven-step EBP process discussed in this chapter to improve patient outcomes and reduce hospital costs.

Evidence-based practice and the quadruple aim in healthcare

Findings from an extensive body of research support that EBP improves the quality and safety of healthcare, enhances health outcomes, decreases geographic variation in care, and reduces costs (McGinty & Anderson, 2008; Melnyk & Fineout-Overholt, 2015; Melnyk, Fineout-Overholt, Gallagher-Ford, & Kaplan, 2012a). In the United States, EBP has been recognized as a key factor in meeting the Triple Aim in healthcare, defined as (Berwick, Nolan, & Whittington, 2008):

  • Improving the patient experience of care (including quality and satisfaction)
  • Improving the health of populations
  • Reducing the per capita cost of healthcare

The Triple Aim has now been expanded to the Quadruple Aim: the fourth goal being to improve work life and decrease burnout in clinicians (Bodenheimer & Sinsky, 2014).

Because EBP has been found to empower clinicians and result in higher levels of job satisfaction (Strout, 2005), it also can assist healthcare systems in achieving the Quadruple Aim. However, despite its tremendous positive outcomes, EBP is not the standard of care in healthcare systems throughout the United States or the rest of the world because of multiple barriers that have persisted over the past decades. Some of these barriers include (Melnyk & Fineout-Overholt, 2015; Melnyk et al., 2012a; Melnyk et al., 2012b; Melnyk et al., 2016; Pravikoff, Pierce, & Tanner, 2005; Titler, 2009):

  • Inadequate knowledge and skills in EBP by nurses and other healthcare professionals
  • Lack of cultures and environments that support EBP
  • Misperceptions that EBP takes too much time
  • Outdated organizational politics and policies
  • Limited resources and tools available for point-of-care providers, including budgetary investment in EBP by chief nurse executives
  • Resistance from colleagues, nurse managers, and leaders
  • Inadequate numbers of EBP mentors in healthcare systems
  • Academic programs that continue to teach baccalaureate, master’s, and doctor of nursing practice students the rigorous process of how to conduct research instead of taking an evidence-based approach to care

Urgent action is needed to rapidly accelerate EBP in order to reduce the tremendously long lag between the generation of research findings and their implementation in clinical settings. Many interventions or treatments that have been found to improve outcomes through research are not standard of care throughout healthcare systems or have never been used in clinical settings. It took more than 20 years for neonatal and pediatric intensive care units to adopt the Creating Opportunities for Parent Empowerment (COPE) Program for parents of preterm infants and critically ill children even though multiple intervention studies supported that COPE reduced parent depression and anxiety, enhanced parental-infant interaction, and improved child outcomes (Melnyk & Fineout-Overholt, 2015). It was not until findings from a National Institute of Nursing Research-funded randomized controlled trial showed that COPE reduced neonatal intensive care unit (NICU) length of stay in premature infants by 4 days (8 days in preterm infants born at less than 32 weeks), with substantial associated cost savings, that NICUs across the country began to implement the intervention as standard of care (Melnyk & Feinstein, 2009; Melnyk et al., 2006).

If not for an improvement in “so-what” outcomes (outcomes of importance to the healthcare system, such as decreased length of stay and costs), COPE would not have been translated into NICU settings to improve outcomes in vulnerable children and their families. On the other hand, many interventions or practices that do not have a solid body of evidence to support them continue to be implemented in healthcare, including double-checking pediatric medications, assessing nasogastric tube placement with air, and taking vital signs every 2 or 4 hours for hospitalized patients. These practices, which are steeped in tradition rather than based upon the best evidence, result in less-than-optimal care, poor outcomes, and wasteful healthcare spending.

Definition of evidence-based practice

As EBP evolved, it was defined as the conscientious use of current best evidence to make decisions about patient care (Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000). Since this earlier definition, EBP has been broadened to include a lifelong problem-solving approach to how healthcare is delivered that integrates the best evidence from high-quality studies with a clinician’s expertise and also a patient’s preferences and values (Melnyk & Fineout-Overholt, 2015; see Figure 1.1).

Incorporated within a clinician’s expertise are:

  • Clinical judgment
  • Internal evidence from the patient’s history and physical exam, as well as data gathered from EBP, quality improvement, or outcomes management projects
  • An evaluation of available resources required to deliver the best practices

Some barriers inhibit the uptake of EBP across all venues and disciplines within healthcare. Although the strongest level of evidence to guide clinical practice interventions (i.e., Level I evidence) comes from systematic reviews of randomized controlled trials, followed by well-designed randomized controlled trials (i.e., Level II evidence), there is a limited number of systematic reviews and intervention studies in the nursing profession. Single descriptive quantitative and qualitative studies, which are considered lower-level evidence, continue to dominate the field; see Table 1.1 for the levels of evidence used to guide clinical interventions.

However, all studies that are relevant to the clinical question should be included in the body of evidence that guides clinical practice. In addition, clinicians often lack critical appraisal skills needed to determine the quality of evidence that is produced by research. Critical appraisal of evidence is an essential step in EBP given that strength or level of evidence plus quality of that evidence gives clinicians the confidence to act and change practice. If Level I evidence is published but is found to lack rigor and be of poor quality through critical appraisal, a clinician would not want to make a practice change based on that evidence.


TABLE 1.1 RATING SYSTEM FOR THE HIERARCHY OF EVIDENCE TO GUIDE CLINICAL INTERVENTIONS

Source: Modified from Elwyn et al. (2015) and Harris et al. (2001).

The seven steps of evidence-based practice

Evidence-based practice was originally described as a five-step process including (Sackett et al., 2000):

  • Ask the clinical question in PICOT format.
  • Search for the best evidence.
  • Critically appraise the evidence.
  • Integrate the evidence with a clinician’s expertise and a patient’s preferences and values.
  • Evaluate the outcome of the practice change.

In 2011, Melnyk and Fineout-Overholt added two additional steps to the process, resulting in the following seven-step EBP process (see Table 1.2).

TABLE 1.2   THE SEVEN STEPS OF EVIDENCE-BASED PRACTICE

Step #0: Cultivate a spirit of inquiry within an EBP culture and environment

The first step in EBP is to cultivate a spirit of inquiry, which is a continual questioning of clinical practices. When delivering care to patients, it is important to consistently question current practices: For example, is Prozac or Zoloft more effective in treating adolescents with depression? Does use of bronchodilators with metered dose inhalers (MDIs) and spacers versus nebulizers in the emergency department (ED) with asthmatic children lead to better oxygenation levels? Does double-checking pediatric medications lead to fewer medication errors?

Cultures and environments that support a spirit of inquiry are more likely to facilitate and sustain a questioning spirit in clinicians. Some key components of an EBP culture and environment include (Melnyk, 2014; Melnyk & Fineout-Overholt, 2015; Melnyk et al., 2012a, 2016):

  • An organizational vision, mission, and goals that include EBP
  • An infrastructure with EBP tools and resources
  • Orientation sessions for new clinicians that communicate an expectation of delivering evidence-based care and meeting the EBP competencies for practicing registered nurses (RNs) and advanced practice nurses (APNs)
  • Leaders and managers who “walk the talk” and support their clinicians to deliver evidence-based care
  • A critical mass of EBP mentors to work with point-of-care clinicians in facilitating evidence-based care
  • Evidence-based policies and procedures
  • Orientations and ongoing professional development seminars that provide EBP knowledge and skills-building along with an expectation for EBP
  • Integration of the EBP competencies in performance evaluations and clinical ladders
  • Recognition programs that reward evidence-based care

Step #1: Ask the burning clinical question in PICOT format

After a clinician asks a clinical question, it is important to place that question in PICOT format to facilitate an evidence search that is effective in getting to the best evidence in an efficient manner. PICOT represents:

  • P: Patient population
  • I: Intervention or Interest area
  • C: Comparison intervention or group
  • O: Outcome
  • T: Time (if relevant)

Sometimes there is not a time element; therefore, you see PICO rather than PICOT.

For example, the clinical questions asked in Step #0 that all involve interventions or treatments should be rephrased in the following PICOT format to result in the most efficient and effective database searches:

  • In depressed adolescents (P), how does Prozac (I) compared to Zoloft (C) affect depressive symptoms (O) 3 months after starting treatment (T)?
  • In asthmatic children seen in the ED (P), how do bronchodilators delivered with MDIs with spacers (I) compared to nebulizers (C) affect oxygenation levels (O) 1 hour after treatment (T)?
  • In hospitalized children (P), how does double-checking pediatric medications with a second nurse (I) compared to not double-checking (C) affect medication errors (O) during a 30-day time period (T)?

In addition to intervention or treatment questions, other types of PICOT questions include meaning questions, diagnosis questions, etiology questions, and prognosis questions that are addressed in Chapter 3.

Step #2: Search for and collect the most relevant best evidence

After the clinical question is placed in PICOT format with the proper template, each keyword in the PICOT question should be used to systematically search for the best evidence; this strategy is referred to as keyword searching. For example, to gather the evidence to answer the intervention PICOT questions in Step #1, you would first search databases for systematic reviews and randomized controlled trials given that they are the strongest levels of evidence to guide practice decisions.

However, the search should extend to include all evidence that answers the clinical question. Each keyword or phrase from the PICOT question (e.g., depressed adolescents, Prozac, Zoloft, depressive symptoms) should be entered individually and searched. Searching controlled vocabulary that matches the keywords is the next step in a systematic approach to searching.

In the final step, combine the keywords and controlled vocabulary previously searched. This systematic approach typically yields a small, targeted set of studies that answer the PICOT question, whereas a less systematic approach usually produces a large number of irrelevant studies. More specific information about searching is covered in Chapter 4.
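
The short sketch below is purely illustrative and not part of the original text: assuming some hypothetical synonym lists for the depression example above, it shows how the individually searched PICOT keywords and matching controlled vocabulary might be combined into a single boolean query. Any real search would, of course, be built within the database's own interface.

```python
# Illustrative only: assembling PICOT keywords and controlled vocabulary
# into a combined boolean search string (hypothetical terms).
picot_terms = {
    "P": ["depressed adolescents", "adolescent depression"],
    "I": ["Prozac", "fluoxetine"],
    "C": ["Zoloft", "sertraline"],
    "O": ["depressive symptoms"],
}

def or_block(terms):
    """Join the synonyms/controlled vocabulary for one PICOT element with OR."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Each element is searched on its own, then the blocks are combined with AND,
# mirroring the keyword -> controlled vocabulary -> combine sequence described above.
query = " AND ".join(or_block(terms) for terms in picot_terms.values())
print(query)
# ("depressed adolescents" OR "adolescent depression") AND ("Prozac" OR "fluoxetine")
#   AND ("Zoloft" OR "sertraline") AND ("depressive symptoms")
```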

Step #3: Critically appraise the evidence

After relevant evidence has been found, critical appraisal begins. First, it is important to conduct a rapid critical appraisal (RCA) of each study from the search to determine whether it is a keeper study: that is, whether it indeed answers the clinical question. This process includes answering the following questions:

  • Are the results of the study valid? Did the researchers use the best methods to conduct the study (study validity)? For example, assessment of a study’s validity determines whether the methods used to conduct the study were rigorous.
  • What are the results? Do the results matter, and can I get similar results in my practice (study reliability)?
  • Will the results help me in caring for my patients? Is the treatment feasible to use with my patients (study applicability)?

Rapid critical appraisal checklists can assist clinicians in evaluating validity, reliability, and applicability of a study in a time-efficient way. See Chapter 5 for one example of an RCA checklist for randomized controlled trials and Melnyk & Fineout-Overholt (2015) for a variety of RCA checklists. After an RCA is completed on each study, those found to be keepers are included in the evaluation and synthesis of the body of evidence to determine whether a practice change should be made. Chapter 5 contains more information on critically appraising, evaluating, and synthesizing evidence.

Step #4: Integrate the best evidence with one’s clinical expertise and patient preferences and values in making a practice decision or change

After the body of evidence from the search is critically appraised, evaluated, and synthesized, it should be integrated with a clinician’s expertise and a patient’s preferences and values to determine whether the practice change should be made. Providing the patient with evidence-based information and involving him or her in the decision regarding whether to receive a certain intervention is an important step in EBP. To facilitate greater involvement of patients in making decisions about their care in collaboration with healthcare providers, there has been an accelerated movement in creating and testing patient-decision support tools, which provide evidence-based information in a relatable, understandable format (Elwyn et al., 2015).

Step #5: Evaluate outcomes of the practice decision or change based on evidence

After making a practice change based on the best evidence, it is critical to evaluate outcomes—the consequences of an intervention or treatment. For example, an outcome of providing a baby with a pacifier might be a decrease in crying. Outcomes evaluation is essential to determine the impact of the practice changes on healthcare quality and health outcomes. It is important to target “so-what” outcomes that the current healthcare system considers important, such as complication rates, length of stay, rehospitalization rates, and costs, given that hospitals are currently being reimbursed based on their performance on these outcomes (Melnyk & Morrison-Beedy, 2012). A more thorough discussion of approaches to outcomes evaluation is included in Chapter 7.
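
As a concrete illustration of this step, the brief sketch below uses entirely hypothetical counts (they are not the figures from the CAUTI exemplar described earlier in this chapter) to show how a “so-what” outcome such as a CAUTI rate per 1,000 catheter days, and its percentage reduction after a practice change, might be calculated.

```python
# Hypothetical outcome evaluation: CAUTI rate per 1,000 catheter days
# before and after a practice change. All counts are invented for illustration.

def cauti_rate_per_1000(cauti_count: int, catheter_days: int) -> float:
    """Standard device-associated infection metric: infections per 1,000 device days."""
    return cauti_count / catheter_days * 1000

pre_rate = cauti_rate_per_1000(cauti_count=12, catheter_days=4000)   # 3.0 per 1,000 days
post_rate = cauti_rate_per_1000(cauti_count=7, catheter_days=3500)   # 2.0 per 1,000 days

percent_reduction = (pre_rate - post_rate) / pre_rate * 100
print(f"Pre: {pre_rate:.1f}, Post: {post_rate:.1f}, Reduction: {percent_reduction:.0f}%")
# Pre: 3.0, Post: 2.0, Reduction: 33%
```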

Step #6: Disseminate the outcomes of the EBP decision or change

Silos often exist, even within the same healthcare organization. So that others can benefit from the positive changes resulting from EBP, it is important to disseminate the findings. Various avenues for dissemination include institutional EBP rounds; poster and podium presentations at local, regional, and national conferences; and publications. More detailed information about disseminating outcomes of EBP is included in Chapter 9.

Rationale for the new EBP competencies

This chapter discussed how evidence-based practice (EBP) improves healthcare quality and patient outcomes and reduces costs, yet multiple barriers persist in healthcare settings that need to be rapidly overcome. Ensuring that clinicians meet the newly established EBP competencies, along with creating cultures and environments that support EBP, are key strategies to transform the current state of nursing practice and healthcare delivery to its highest level.


References

American Nurses Association. (2010). Nursing: Scope and standards of practice (2nd ed.). Silver Spring, MD: American Nurses Association.

Berwick, D. M., Nolan, T. W., & Whittington, J. (2008). The Triple Aim: Care, health, and cost. Health Affairs, 27 (3), 759–769.

Bodenheimer, T., & Sinsky, C. (2014). From Triple to Quadruple Aim: Care of the patient requires care of the provider. Annals of Family Medicine, 12 , 573–576.

Elwyn, G., Quinlan, C., Mulley, A., Agoritsas, T., Vandvik, P. O., & Guyatt, G. (2015). Trustworthy guidelines—excellent; customized care tools—even better. BioMed Central Medicine, 13(1), 199. Modified from Guyatt, G., & Rennie, D. (2002), Users’ guides to the medical literature. Chicago, IL: American Medical Association.

Harris, R. P., Helfand, M., Woolf, S. H., Lohr, K. N., Mulrow, C. D., Teutsch, S. M., & Atkins, D. (2001). Current methods of the U.S. Preventive Services Task Force: A review of the process. American Journal of Preventive Medicine, 20, 21–35.

Magers, T. (2013). Using evidence-based practice to reduce catheter-associated urinary tract infections. American Journal of Nursing, 113 (6), 34–42.

Magers, T. L. (2015). Using evidence-based practice to reduce catheter-associated urinary tract infections in a long-term acute care facility. In B. M. Melnyk & E. Fineout-Overholt (Eds.), Evidence-based practice in nursing & healthcare. A guide to best practice (3rd ed.) (pp. 70–73). Philadelphia, PA: Wolters Kluwer.

McGinty, J., & Anderson, G. (2008). Predictors of physician compliance with American Heart Association guidelines for acute myocardial infarction. Critical Care Nursing Quarterly, 31 (2), 161–172.

Melnyk, B. M. (2014). Building cultures and environments that facilitate clinician behavior change to evidence-based practice: What works? Worldviews on Evidence-Based Nursing, 11 (2), 79–80.

Melnyk, B. M., & Fineout-Overholt, E. (2011). Evidence-based practice in nursing & healthcare. A guide to best practice (pp. 1–24). Philadelphia, PA: Wolters Kluwer/Lippincott Williams & Wilkins.

Melnyk, B. M., & Fineout-Overholt, E. (2015). Evidence-based practice in nursing & healthcare. A guide to best practice (3rd ed.) (pp. 3–23). Philadelphia, PA: Wolters Kluwer.

Melnyk, B. M., Fineout-Overholt, E., Gallagher-Ford, L., & Kaplan, L. (2012a). The state of evidence-based practice in US nurses: Critical implications for nurse leaders and educators. Journal of Nursing Administration, 42 (9), 410–417.

Melnyk, B. M., Grossman, D., Chou, R., Mabry-Hernandez, I., Nicholson, W., Dewitt, T.G. . . . & Flores, G. (2012b). USPSTF perspective on evidence-based preventive recommendations for children. Pediatrics, 130 (2), e399–e407.

Melnyk, B. M., & Feinstein, N. (2009). Reducing hospital expenditures with the COPE (Creating Opportunities for Parent Empowerment) program for parents and premature infants: An analysis of direct healthcare neonatal intensive care unit costs and savings. Nursing Administrative Quarterly, 33 (1), 32–37.

Melnyk, B. M., Feinstein, N. F., Alpert-Gillis, L., Fairbanks, E., Crean, H. F., Sinkin, R., & Gross, S. J. (2006). Reducing premature infants’ length of stay and improving parents’ mental health outcomes with the COPE NICU program: A randomized clinical trial. Pediatrics, 118 (5), e1414–e1427.

Melnyk, B. M., Gallagher-Ford, L., Thomas, B. K., Troseth, M., Wyngarden, K., & Szalacha, L. (2016). A study of chief nurse executives indicates low prioritization of evidence-based practice and shortcomings in hospital performance metrics across the United States. Worldviews on Evidence-Based Nursing, 13 (1), 6–14.

Melnyk, B. M., Gallagher-Ford, L., Long, L., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing nurses and advanced practice nurses in real-world clinical settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11 (1), 5–15.

Melnyk, B. M., & Morrison-Beedy, D. (2012). Setting the stage for intervention research: The “so what,” “what exists” and “what’s next” factors. In B. M. Melnyk & D. Morrison-Beedy (Eds.), Designing, conducting, analyzing and funding intervention research. A practical guide for success (pp. 1–9). New York, NY: Springer Publishing Company.

Pravikoff, D. S., Pierce, S. T., & Tanner A. (2005). Evidence-based practice readiness study supported by academy nursing informatics expert panel. Nursing Outlook, 53 (1), 49–50.

Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM . London, UK: Churchill Livingstone.

Strout, T. D. (2005). Curiosity and reflective thinking: Renewal of the spirit. In Clinical scholars at the bedside: An EBP mentorship model for today [electronic version]. Excellence in Nursing Knowledge . Indianapolis, IN: Sigma Theta Tau International.

Titler, M. G. (2009). Developing an evidence-based practice. In G. LoBiondo-Wood & J. Haber (Eds.), Nursing research: Methods and critical appraisal for evidence-based practice (7th ed.) (pp. 385–437). St Louis, MO: Mosby.

Book authors: Bernadette Mazurek Melnyk, PhD, RN, CPNP/PMHNP, FAANP, FNAP, FAAN, is associate vice president for health promotion, university chief wellness officer, and professor and dean of the College of Nursing at The Ohio State University. She also is professor of pediatrics and professor of psychiatry at Ohio State’s College of Medicine.

Lynn Gallagher-Ford, PhD, RN, DPFNAP, NE-BC, is director of the Center for Transdisciplinary Evidence-based Practice (CTEP) and clinical associate professor in the College of Nursing at The Ohio State University.

Ellen Fineout-Overholt, PhD, RN, FNAP, FAAN, is the Mary Coulter Dowdy Distinguished Nursing Professor in the College of Nursing & Health Sciences at the University of Texas at Tyler.


Evidence-Based Practice’s Impact on Nursing Essay (Article)

Article: apa format, brief discussion, ebp discussions.

Reid, J., Briggs, J., Carlisle, S., Scott, D., & Lewis, C. (2017). Enhancing utility and understanding of the evidence-based practice through undergraduate nurse education. BMC Nursing, 16 (58), 1-8. Web.

The selected article offers meaningful insights that can empower nursing educationists and practitioners to embrace the power of evidence-based practice (EBP). The authors describe a new course (Evidence-Based Nursing 1) that was implemented as part of an undergraduate nursing program. The researchers observed that the targeted learners were willing to make evidence-based practices part of their nursing philosophies after completing the course. The practice can encourage practitioners to integrate EBP into their respective care delivery models (Reid, Briggs, Carlisle, Scott, & Lewis, 2017). The judicious use of emerging or current evidence in care delivery and health decision-making processes can result in improved patient outcomes and support advanced practice nursing. This article describes the meaning of EBP and how it can be implemented in nursing institutions to ensure that advanced practice nurses (APNs) are prepared to meet their patients’ health needs. The use of emerging evidence and concepts from research studies can guide nurses to offer advanced care. When APNs embrace the power of EBP, they will achieve their potential and offer quality and equitable health services.

The concept of EBP revolves around the use of best evidence to improve patient outcomes. Mackey and Bassendowski (2016) indicate that external clinical findings, results from systematic studies, and personal nursing expertise constitute “best evidence” for EBP. Nurses should combine such concepts to develop appropriate care delivery models and make desirable decisions to support their patients. EBP is a powerful approach that can be used at the point of care. Proficient nurses can diagnose and educate patients depending on their conditions. Such practitioners will identify signs and symptoms, offer timely patient education, and empower individuals to engage in disease management practices. These tasks at the point of care will be informed by each nurse’s current evidence and by information backed by the latest research findings.

Informatics can bring the best available evidence to support AGPC practice. Modern technologies empower nurses to use standardized terminologies that can result in desirable health outcomes. Digital sources of timely or latest evidence can also be used to meet patients’ needs. Practitioners can use informatics processes to acquire and apply evidence to different clinical situations (Reid et al., 2017). Informatics competencies empower nurses to minimize sentinel events and meet patients’ needs.

I am planning to embrace the future by using EBP in my practice. I will incorporate the concept through a deliberate strategy grounded in lifelong learning. I will also undertake further research and use modern informatics to improve my nursing philosophy. Unfortunately, some barriers can affect the implementation and development of an EBP culture. The first is the existing gap between education and practice. This limitation affects nurses’ ability to use evidence accurately and efficiently. The lack of appropriate policies to support the use of EBP is the second challenge (Mackey & Bassendowski, 2016). The third obstacle is that many institutions and practitioners have failed to embrace the power of informatics. These gaps affect patients’ health outcomes negatively.

EBP is expected to impact advanced nursing practice positively. The concept can enable practitioners to make informed decisions and offer desirable care in line with their patients’ expectations. The approach results in improved care delivery systems. It also encourages practitioners to improve their nursing philosophies using emerging ideas (or concepts) and their competencies (Reid et al., 2017). EBP empowers nurses to make informed decisions, develop superior care delivery models, and update their skills. APNs using the concept will, therefore, offer safe, affordable, and sustainable care to their patients.

Mackey, A., & Bassendowski, S. (2016). The history of evidence-based practice in nursing education and practice. Journal of Professional Nursing, 33 (1), 51-55. Web.

Reid, J., Briggs, J., Carlisle, S., Scott, D., & Lewis, C. (2017). Enhancing utility and understanding of evidence based practice through undergraduate nurse education. BMC Nursing, 16 (58), 1-8. Web.



Interventions, methods and outcome measures used in teaching evidence-based practice to healthcare students: an overview of systematic reviews

  • Lea D. Nielsen 1 ,
  • Mette M. Løwe 2 ,
  • Francisco Mansilla 3 ,
  • Rene B. Jørgensen 4 ,
  • Asviny Ramachandran 5 ,
  • Bodil B. Noe 6 &
  • Heidi K. Egebæk 7  

BMC Medical Education, volume 24, Article number: 306 (2024)


To fully implement the internationally acknowledged requirements for teaching in evidence-based practice, and support the student’s development of core competencies in evidence-based practice, educators at professional bachelor degree programs in healthcare need a systematic overview of evidence-based teaching and learning interventions. The purpose of this overview of systematic reviews was to summarize and synthesize the current evidence from systematic reviews on educational interventions being used by educators to teach evidence-based practice to professional bachelor-degree healthcare students and to identify the evidence-based practice-related learning outcomes used.

An overview of systematic reviews. Four databases (PubMed/Medline, CINAHL, ERIC and the Cochrane library) were searched from May 2013 to January 25th, 2024. Additional sources were checked for unpublished or ongoing systematic reviews. Eligibility criteria included systematic reviews of studies among undergraduate nursing, physiotherapist, occupational therapist, midwife, nutrition and health, and biomedical laboratory science students, evaluating educational interventions aimed at teaching evidence-based practice in classroom or clinical practice setting, or a combination. Two authors independently performed initial eligibility screening of title/abstracts. Four authors independently performed full-text screening and assessed the quality of selected systematic reviews using standardized instruments. Data was extracted and synthesized using a narrative approach.

A total of 524 references were retrieved, and 6 systematic reviews (with a total of 39 primary studies) were included. Overlap between the systematic reviews was minimal. All the systematic reviews were of low methodological quality. Synthesis and analysis revealed a variety of teaching modalities and approaches. The outcomes were to some extent assessed in accordance with the Sicily group’s categories of “skills”, “attitude” and “knowledge”, whereas “behaviors”, “reaction to educational experience”, “self-efficacy” and “benefits for the patient” were rarely used.

Conclusions

Teaching of evidence-based practice is widespread in undergraduate healthcare education, and a variety of interventions are used and recognized. Not all categories of outcomes suggested by the Sicily group are used to evaluate the outcomes of evidence-based practice teaching. There is a need for studies measuring the effect on outcomes in all the Sicily group categories, to enhance the sustainability and transition of evidence-based practice competencies to the context of healthcare practice.


Evidence-based practice (EBP) enhances the quality of healthcare, reduces the cost, improves patient outcomes, empowers clinicians, and is recognized as a problem-solving approach [ 1 ] that integrates the best available evidence with clinical expertise and patient preferences and values [ 2 ]. A recent scoping review of EBP and patient outcomes indicates that EBPs improve patient outcomes and yield a positive return of investment for hospitals and healthcare systems. The top outcomes measured were length of stay, mortality, patient compliance/adherence, readmissions, pneumonia and other infections, falls, morbidity, patient satisfaction, patient anxiety/ depression, patient complications and pain. The authors conclude that healthcare professionals have a professional and ethical responsibility to provide expert care which requires an evidence-based approach. Furthermore, educators must become competent in EBP methodology [ 3 ].

According to the Sicily statement group, teaching and practicing EBP requires a 5-step approach: 1) pose an answerable clinical question (Ask), 2) search and retrieve relevant evidence (Search), 3) critically appraise the evidence for validity and clinical importance (Appraise), 4) apply the results in practice by integrating the evidence with clinical expertise, patient preferences and values to make a clinical decision (Integrate), and 5) evaluate the change or outcome (Evaluate/Assess) [ 4 , 5 ]. Thus, according to the World Health Organization, educators, e.g., within undergraduate healthcare education, play a vital role by “integrating evidence-based teaching and learning processes, and helping learners interpret and apply evidence in their clinical learning experiences” [ 6 ].

A scoping review by Larsen et al. of 81 studies on interventions for teaching EBP within Professional bachelor-degree healthcare programs (PBHP) (in English, undergraduate/bachelor programs) shows that the majority of EBP teaching interventions include the first four steps, but the fifth step, “evaluate/assess”, is less often applied [ 5 ]. PBHP include bachelor-degree programs characterized by combined theoretical education and clinical training within nursing, physiotherapy, occupational therapy, radiography, and biomedical laboratory science. Furthermore, an overview of systematic reviews focusing on practicing healthcare professionals’ EBP competencies testifies that although graduates may have moderate to high levels of self-reported EBP knowledge, skills, attitudes, and beliefs, this does not translate into their subsequent EBP implementation [ 7 ]. Although this cannot be seen as direct evidence of inadequate EBP teaching during undergraduate education, it is irrefutable that insufficient EBP competencies among clinicians across healthcare disciplines impede their efforts to attain the highest care quality and improved patient outcomes in clinical practice after graduation.

Research shows that teaching about EBP includes different types of modalities. An overview of systematic reviews, published by Young et al. in 2014 [ 8 ] and updated by Bala et al. in 2021 [ 9 ], synthesizes the effects of EBP teaching interventions including under- and postgraduate healthcare professionals, the majority being medical students. They find that multifaceted interventions with a combination of lectures, computer lab sessions, small group discussion, journal clubs, use of current clinical issues, portfolios and assignments lead to improvement in students’ EBP knowledge, skills, attitudes, and behaviors compared to single interventions or no interventions [ 8 , 9 ]. Larsen et al. find that within PBHP, collaboration with clinical practice is the second most frequently used intervention for teaching EBP and most often involves four or all five steps of the EBP teaching approach [ 5 ]. The use of clinically integrated teaching in EBP is only sparsely identified in the overviews by Young et al. and Bala et al. [ 8 , 9 ]. Therefore, the evidence obtained within the Bachelor of Medicine, which is a theoretical education [ 10 ], may not be directly transferable for use in PBHP, which combine theoretical and mandatory clinical education [ 11 ].

Since the overview by Young et al. [ 8 ], several reviews of interventions for teaching EBP used within PBHP have been published [ 5 , 12 , 13 , 14 ].

We therefore wanted to explore the newest evidence for teaching EBP focusing on PBHP, as these programs are characterized by a large proportion of clinical teaching. These healthcare professions are certified through a PBHP at a level corresponding to a university bachelor degree, but with a strong focus on professional practice achieved by combining theoretical studies with mandatory clinical teaching. In Denmark, almost half of a PBHP takes place in clinical practice. These applied science programs qualify “the students to independently analyze, evaluate and reflect on problems in order to carry out practice-based, complex, and development-oriented job functions” [ 11 ]. Thus, both the purpose of these PBHP and the amount of clinical practice included in the educations contrast with, for example, medicine.

Thus, this overview identifies the newest evidence for teaching EBP specifically within PBHP, including reviews using quantitative and/or qualitative methods.

We believe that such an overview provides important knowledge for educators seeking to take EBP teaching for the healthcare professions to a higher level. Reviewing and describing EBP-related learning outcomes, categorized according to the seven assessment categories developed by the Sicily group [ 2 ], will also be useful to educators in the healthcare professions. These seven assessment categories for EBP learning (reaction to the educational experience, attitudes, self-efficacy, knowledge, skills, behaviors, and benefits to patients) can be linked to the five-step EBP approach. For example, reaction to the educational experience: did the educator’s teaching style enhance learners’ enthusiasm for asking questions? (Ask); self-efficacy: how well do learners think they critically appraise evidence? (Appraise); skills: can learners come to a reasonable interpretation of how to apply the evidence? (Integrate) [ 2 ]. Thus, this set of categories can be seen as a basic set of EBP-related learning outcomes for classifying the impact of EBP educational interventions.

Purpose and review questions

A systematic overview of which evidence-based teaching interventions and which EBP-related learning outcomes are used will give teachers access to important knowledge on what to implement and how to evaluate EBP teaching.

Thus, the purpose of this overview is to synthesize the latest evidence from systematic reviews about EBP teaching interventions in PBHP. This overview adds to the existing evidence by focusing on systematic reviews that a) include qualitative and/or quantitative studies regardless of design, b) are conducted among PBHP within nursing, physiotherapy, occupational therapy, midwifery, nutrition and health, and biomedical laboratory science, and c) incorporate the Sicily group's 5-step approach and seven assessment categories when analyzing the EBP teaching interventions and EBP-related learning outcomes.

The questions of this overview of systematic reviews are:

Which educational interventions are described and used by educators to teach EBP to Professional Bachelor-degree healthcare students?

What EBP-related learning outcomes have been used to evaluate teaching interventions?

The study protocol was guided by the Cochrane Handbook on Overviews of Reviews [ 15 ] and the review process was reported in accordance with The Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) statement [ 16 ] when this was consistent with the Cochrane Handbook.

Inclusion criteria

Eligible reviews fulfilled the inclusion criteria for publication type, population, intervention, and context (see Table  1 ). Failing a single inclusion criterion implied exclusion.

Search strategy

On January 25th, 2024, a systematic search was conducted in PubMed/Medline, CINAHL (EBSCOhost), ERIC (EBSCOhost), and the Cochrane Library, covering May 2013 to January 25th, 2024, to identify systematic reviews published after the overview by Young et al. [ 8 ]. In collaboration with a research librarian, a search strategy of controlled vocabulary and free-text terms related to systematic reviews, the student population, teaching interventions, teaching context, and evidence-based practice was developed (see Additional file 1 ). For each database, the search strategy was peer reviewed, revised, modified, and subsequently pilot tested. No language restrictions were imposed.
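
As an illustration of how controlled vocabulary and free-text terms are typically combined in such a strategy, a simplified PubMed-style query block might look like the sketch below. The terms and their grouping are hypothetical and do not reproduce the actual strategy in Additional file 1.

```
("Students, Nursing"[Mesh] OR "Students, Health Occupations"[Mesh] OR "undergraduate student*"[tiab])
AND ("Teaching"[Mesh] OR "educational intervention*"[tiab])
AND ("Evidence-Based Practice"[Mesh] OR "evidence based practice"[tiab])
AND ("systematic review"[pt] OR "systematic review"[tiab])
```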

To identify further eligible reviews, the following methods were used: setting email alerts from the databases to provide weekly updates on new publications; backward and forward citation searching based on the included reviews by screening reference lists and using the “cited by” and “similar results” functions in PubMed and CINAHL; broad searching in Google Scholar (advanced search), Prospero, JBI Evidence Synthesis, and the OpenGrey database; and contacting experts in the field via email to first authors of included reviews and via queries on Twitter and ResearchGate about unpublished or ongoing reviews of relevance.

Selection and quality appraisal process

Database search results were merged, duplicate records were removed, and title/abstract were initially screened via Covidence [ 17 ]. The assessment process was pilot tested by four authors independently assessing eligibility and methodological quality of one potential review followed by joint discussion to reach a common understanding of the criteria used. Two authors independently screened each title/abstract for compliance with the predefined eligibility criteria. Disagreements were resolved by a third author. Four authors were paired for full text screening, and each pair assessed independently 50% of the potentially relevant reviews for eligibility and methodological quality.

For quality appraisal, two independent authors used AMSTAR 2 (A MeaSurement Tool to Assess systematic Reviews) for reviews including intervention studies [ 18 ] and the Joanna Briggs Institute Checklist for Systematic Reviews and Research Syntheses (JBI checklist) [ 19 ] for reviews including both quantitative and qualitative or only qualitative studies. Uncertainties in assessments were resolved by requesting clarifying information from first authors of the reviews and/or discussion with a co-author of the present overview.

Overall methodological quality for the included reviews was assessed using the overall confidence criteria of AMSTAR 2, based on scorings in seven critical domains [ 18 ], and appraised as high (none or one non-critical flaw), moderate (more than one non-critical flaw), low (one critical weakness), or critically low (more than one critical weakness) [ 18 ]. For systematic reviews of qualitative studies [ 13 , 20 , 21 ], the critical domains of AMSTAR 2 not specified in the JBI checklist were added.
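
Read as a decision rule, the overall confidence rating described above can be sketched as follows. This is a minimal illustration of the scheme as summarized in this overview, not an official AMSTAR 2 implementation, and the function and argument names are ours.

```python
def overall_confidence(critical_weaknesses: int, non_critical_flaws: int) -> str:
    """Illustrative mapping of flaw counts to the overall confidence rating described above."""
    if critical_weaknesses > 1:
        return "critically low"   # more than one critical weakness
    if critical_weaknesses == 1:
        return "low"              # one critical weakness
    if non_critical_flaws > 1:
        return "moderate"         # more than one non-critical flaw
    return "high"                 # none or one non-critical flaw

# Example: a review with two critical weaknesses is rated "critically low".
print(overall_confidence(critical_weaknesses=2, non_critical_flaws=0))
```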

Data extraction and synthesis process

Data were initially extracted by the first author, confirmed or rejected by the last author and finally discussed with the whole author group until consensus was reached.

Data extraction included 1) Information about the search and selection process according to the PRISMA statement [ 16 , 22 ], 2) Characteristics of the systematic reviews inspired by a standard in the Cochrane Handbook (15), 3) A citation index inspired by Young et al. [ 8 ] used to illustrate overlap of primary studies in the included systematic reviews, and to ensure that data from each primary study were extracted only once [ 15 ], 4) Data on EBP teaching interventions and EBP-related outcomes. These data were extracted, reformatted (categorized inductively into two categories: “Collaboration interventions” and “  Educational interventions ”) and presented as narrative summaries [ 15 ]. Data on outcome were categorized according to the seven assessment categories, defined by the Sicily group, to classify the impact from EBP educational interventions: Reaction to the educational experience, attitudes, self-efficacy, knowledge, skills, behaviors and benefits to patients [ 2 ]. When information under points 3 and 4 was missing, data from the abstracts of the primary study articles were reviewed.

Results of the search

The database search yielded 691 references after duplicates were removed. Title and abstract screening deemed 525 references irrelevant. Searching via other methods yielded two additional references. Of the 28 study reports assessed for eligibility, 22 were excluded, leaving a total of six systematic reviews. Screening resulted in 100% agreement among the authors. Figure 1 details the search and selection process. Reviews that might seem relevant but did not meet the eligibility criteria [ 15 ] are listed in Additional file 2 . One protocol for a potentially relevant review was identified as ongoing [ 23 ].

Figure 1. PRISMA flow diagram on search and selection of systematic reviews

Characteristics of included systematic reviews and overlap between them

The six systematic reviews originated from the Middle East, Asia, North America, Europe, Scandinavia, and Australia. Two of the six reviews did not identify themselves as systematic reviews but did fulfill this eligibility criterion [ 12 , 20 ]. Together, the six reviews represented a total of 64 primary studies and a total population of 6649 students (see Table 2 ). However, five of the six systematic reviews contained a total of 17 primary studies that were not eligible for our overview focus (e.g., postgraduate students) (see Additional file 3 ). Results from these primary studies were not extracted. Of the remaining primary studies, six were included in two systematic reviews and one was included in three. Data from these studies were extracted only once to avoid double-counting. Thus, the six systematic reviews represented a total of 39 primary studies and a total population of 3394 students, of whom 3280 were nursing students. One sample of 58 nutrition and health students and one sample of 56 mixed nursing and midwifery students were included, but none from physiotherapy, occupational therapy, or biomedical laboratory science. The majority ( n  = 28) of the 39 primary studies had a quantitative design, of which 18 were quasi-experimental (see Additional file 4 ).
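
The count of 39 unique primary studies follows from the figures reported above, assuming the 64 reported studies count each appearance across reviews; the short tally below is purely illustrative.

```python
# Illustrative tally based on the counts reported above.
total_appearances  = 64                                 # primary studies across the six reviews
ineligible         = 17                                 # e.g., postgraduate samples; not extracted
eligible           = total_appearances - ineligible     # 47 eligible appearances
surplus_duplicates = 6 * (2 - 1) + 1 * (3 - 1)          # six studies in two reviews, one in three
unique_primary     = eligible - surplus_duplicates
print(unique_primary)                                   # 39
```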

Quality of the systematic reviews

All the included systematic reviews were assessed as having critically low quality, with 100% concordance between the two assigned authors (see Fig. 2 ) [ 18 ]. The main reasons for the low quality of the reviews were a) not demonstrating a protocol registered prior to the review [ 13 , 20 , 24 , 25 ], b) not providing a list of excluded studies with justification for exclusion [ 12 , 13 , 21 , 24 , 25 ], and c) not accounting for the quality of the individual studies when interpreting the results of the review [ 12 , 20 , 21 , 25 ].

Figure 2. Overall methodological quality assessment of the included systematic reviews. Reviews of quantitative studies [ 12 , 24 , 25 ] were assessed following the AMSTAR 2 critical domain guidelines. Reviews of qualitative studies [ 13 , 20 , 21 ] were assessed following the JBI checklist; for the overall classification, they were also assessed against the critical AMSTAR 2 domains not specified in the JBI checklist (item 2: is the protocol registered before commencement of the review; item 7: justification for excluding individual studies; item 13: consideration of risk of bias when interpreting the results of the review).

The two non-critical items of AMSTAR 2 and the JBI checklist most often not met were missing reporting of sources of funding for the primary studies and not describing the included studies in adequate detail.

Most of the included reviews reported research questions including components of PICO, performed study selection and data extraction in duplicate, used appropriate methods for combining studies, and used satisfactory techniques for assessing risk of bias (see Fig. 2 ).

Main findings from the systematic reviews

As illustrated in Table 2 , this overview synthesizes evidence on a variety of approaches to promote EBP teaching in both classroom and clinical settings. The systematic reviews describe various interventions used for teaching EBP, which can be summarized into two themes: collaboration interventions and educational interventions.

Collaboration interventions to teach EBP

In general, the reviews point out that interdisciplinary collaboration among health professionals and others, e.g., librarians and information technology professionals, is relevant when planning and teaching EBP [ 13 , 20 ].

Interdisciplinary collaboration was also described as relevant when planning teaching in EBP [ 13 , 20 ]. Specifically, regarding literature searching, Wakibi et al. found that collaboration between librarians, computer laboratory technicians, and nurse educators enhanced students’ skills [ 13 ]. Collaboration between faculty, libraries, clinical institutions, and teaching institutions was also used to create transfer between EBP teaching and clinical practice [ 13 , 20 ].

Regarding collaboration with clinical practice, Ghaffari et al. found that teaching EBP integrated in clinical education could promote students’ knowledge and skills [ 25 ]. Horntvedt et al. found that during a six-week course in clinical practice, students obtained better skills in reading research articles and orally presenting the findings to staff and fellow students [ 20 ]. Participation in clinical research projects combined with instruction in analyzing and discussing research findings also “led to a positive approach and EBP knowledge” [ 20 ]. Moreover, reading research articles during the clinical practice period enhanced the students’ critical thinking skills. Furthermore, Horntvedt et al. mention that students found it meaningful to conduct a “mini” research project in clinical settings, as the identified evidence became relevant [ 20 ].

Educational interventions

Educational interventions can be described as “framing interventions,” understood as different ways to set up a framework for teaching EBP, and “teaching methods,” understood as specific methods used when teaching EBP.

Various educational interventions were described in most reviews [ 12 , 13 , 20 , 21 ]. According to Patelarou et al., no specific educational intervention, regardless of framing and methods, was favored to “increase knowledge, skills and competency as well as improve the beliefs, attitudes and behaviors of nursing students” [ 12 ].

Framing interventions

The approaches used to set up a framework for teaching EBP were labelled in different ways: programs, interactive teaching strategies, educational programs, courses, etc. Approaches of various durations, from hours to months, were described, as well as stepwise interventions [ 12 , 13 , 20 , 21 , 24 , 25 ].

Some frameworks [ 13 , 20 , 21 , 24 ] were based on the assessment categories described by the Sicily group [ 2 ], based on theory [ 21 ], or, as mentioned above, clinically integrated [ 20 ]. Wakibi et al. identified interventions used to foster a spirit of inquiry and an EBP culture reflecting the “5-step approach” of the Sicily group [ 4 ]: asking PICOT questions, searching for the best evidence, critical appraisal, integrating evidence with clinical expertise and patient preferences to make clinical decisions, evaluating outcomes of EBP practice, and disseminating outcomes [ 13 ]. Ramis et al. found that teaching interventions based on theories such as Bandura’s self-efficacy theory or Rogers’ diffusion of innovations theory led to positive effects on students’ EBP knowledge and attitudes [ 21 ].

Teaching methods

A variety of teaching methods were used, such as lectures [ 12 , 13 , 20 ], problem-based learning [ 12 , 20 , 25 ], group work, discussions [ 12 , 13 ], and presentations [ 20 ] (see Table 2 ). The most effective way to achieve the skills required to practice EBP, as described in the “5-step approach” by the Sicily group, appears to be a combination of different teaching methods such as lectures, assignments, discussions, group work, and exams/tests.

Four systematic reviews identified such combinations or multifaceted approaches [ 12 , 13 , 20 , 21 ]. Patelarou et al. state that “EBP education approaches should be blended” [ 12 ]. Thus, combinations of video, voice-over, PowerPoint, problem-based learning, lectures, team-based learning, projects, and small groups were found in different studies, and this combination was shown “to be effective” [ 12 ]. Similarly, Horntvedt et al. found that nursing students reported that various teaching methods improved their EBP knowledge and skills [ 20 ].

According to Ghaffari et al., including problem-based learning in teaching plans “improved the clinical care and performance of the students,” while the problem-solving approach “promoted student knowledge” [ 25 ]. Other teaching methods identified, e.g., flipped classroom [ 20 ] and virtual simulation [ 12 , 20 ], were also characterized as useful interactive teaching interventions. Furthermore, face-to-face approaches seem “more effective” than online teaching interventions for enhancing students’ research and appraisal skills, and journal clubs enhance students’ critical appraisal skills [ 12 ].

As the reviews included in this overview are primarily based on qualitative, mixed-methods, and quasi-experimental studies, and only to a minor extent on randomized controlled trials (see Table 2 ), it is not possible to draw conclusions about the most effective methods. However, a combination of methods and an innovative collaboration between librarians, information technology professionals, and healthcare professionals seems to be the most promising approach to achieving the skills required for EBP.

EBP-related outcomes

Most of the systematic reviews presented a wide array of outcome assessments applied in EBP research (see Table 3 ). Analyzing the outcomes according to the Sicily group’s assessment categories revealed that “knowledge” (used in 19 of the 39 primary studies), “skills” (18 of 39), and “attitudes” (17 of 39) were by far the most frequently used assessment categories, whereas outcomes within the categories of “behaviors” (eight studies), “reaction to the educational experience” (five studies), “self-efficacy” (two studies), and “benefits to patients” (one study) were used to a far lesser extent. Additionally, outcomes that we were not able to categorize within the seven assessment categories were “future use” and “global EBP competence.”

The purpose of this overview of systematic reviews was to collect and summarize evidence on the diversity of EBP teaching interventions and outcomes measured among professional bachelor-degree healthcare students.

Our results give an overview of “the state of the art” of using and measuring EBP in PBHP education. However, the quality of the included systematic reviews was rated critically low. Thus, the results cannot support guidelines for best practice.

The analysis of the interventions and outcomes described in the 39 primary studies included in this overview reveals a wide variety of teaching methods and interventions being used and described in the scientific literature on EBP teaching of PBHP students. The results show some evidence of the five-step EBP approach in accordance with the inclusion criterion “interventions aimed at teaching one or more of the five EBP steps: Ask, Search, Appraise, Integrate, Assess/Evaluate.” Most authors state that the students’ EBP skills, attitudes, and knowledge improved with almost any of the described methods and interventions. However, descriptions of how the improvements were measured were less frequent.

We evaluated the described outcome measures and assessments according to the seven categories proposed by the Sicily group and found that most assessments concerned “attitudes,” “skills,” and “knowledge,” sometimes “behaviors,” and very seldom “reaction to the educational experience,” “self-efficacy,” and “benefits to the patients.” To our knowledge, no systematic review or overview has made this evaluation of outcome categories before, but Bala et al. [ 9 ] also state that knowledge, skills, and attitudes are the most commonly evaluated effects.

Comparing the outcomes measured among mainly medical [ 9 ] and nursing students, the most prevalent outcomes in both groups are knowledge, skills, and attitudes around EBP. In contrast, measuring the students’ patient care or the impact of EBP teaching on benefits for patients is less prevalent. However, Wu et al.’s systematic review shows that among clinical nurses, educational interventions supporting the implementation of EBP projects can change patient outcomes positively, although they also conclude that direct causal evidence of the educational interventions is difficult to obtain because of the diversity of EBP projects implemented [ 26 ]. Regarding EBP behavior, the Sicily group recommend that this category be assessed by monitoring the frequency of the five-step EBP approach, e.g., ASK questions about patients, APPRAISE evidence related to patient care, and EVALUATE one's EBP behavior and identify areas for improvement [ 2 ]. The results also showed evidence relevant to the student-clinician transition: “future use” was identified in two systematic reviews [ 12 , 13 ] and categorized as “other,” as this outcome is not included in the seven Sicily categories. However, a systematic review of predictive modelling studies shows that future use, or the intention to use EBP after graduation, is influenced by the students’ EBP familiarity, EBP capability beliefs, EBP attitudes, and academic and clinical support [ 27 ].

Teaching and evaluating EBP needs to move beyond aiming at changes in knowledge, skills, and attitudes and also start focusing on changing and assessing behavior, self-efficacy, and benefits to the patients. We recommend doing this using validated tools for the assessment of outcomes and in prospective studies with longer follow-up periods, preferably evaluating the adoption of EBP in clinical settings, bearing in mind that best teaching practice happens across sectors and settings, supported and supervised by multiple professions.

Based on a systematic review and an international Delphi survey, a set of interprofessional EBP core competencies detailing the competence content of each of the five steps has been published to inform curriculum development and benchmark EBP standards [ 28 ]. This consensus statement may be used by educators as a reference for both learning objectives and EBP content descriptions in future intervention research. The collaboration with clinical institutions and the integration of EBP teaching components, such as EBP assignments or participation in clinical research projects, are important results, specifically in light of the dialectic between theoretical and clinical education as a core characteristic of professional bachelor-degree healthcare educations.

Our study has some limitations that need consideration when interpreting the results. A search in the EMBASE and Scopus databases was not included in the search strategy, although it might have yielded additional sources. Most of the 22 excluded reviews included primary studies among other levels or healthcare groups of students or had not critically appraised their primary studies; the latter constitutes insufficient adherence to methodological guidelines for systematic reviews and limits the completeness of the reviews identified. Often, the result sections of the included reviews were poorly reported, which made it necessary to extract some, but not always sufficient, information from the primary study abstracts. As the present study is an overview and not a new systematic review, we did not extract information from the result sections of the primary studies. Thus, the comprehensiveness and applicability of the results of this overview are limited by the methodological limitations of the six included systematic reviews.

The existing evidence is based on different types of study designs, and this heterogeneity is seen in all the included reviews. Thus, the present overview only conveys trends regarding the comparative effectiveness of the different ways to frame EBP teaching and the methods used. This can be seen as a weakness for the clarity and applicability of the overview results. Also, our protocol is unpublished, which may weaken the transparency of the overview approach; however, our search strategies are available as additional material (see Additional file 1 ). In addition, the validity of the data extraction can be discussed. We extracted data consecutively by the first and last author and, if needed, reached consensus by discussion with the entire research group. This method might have been strengthened by using two blinded reviewers to extract data and presenting the data with supporting kappa values.

The generalizability of the results of this overview is limited to undergraduate nursing students. Nevertheless, we consider it a strength that the results represent a broad international perspective on framing EBP teaching, as well as on the teaching methods and outcomes used by educators in EBP. Primary studies exist among occupational therapy and physiotherapy students [ 5 , 29 ] but have not been systematically synthesized. However, the evidence is almost non-existent among midwifery, nutrition and health, and biomedical laboratory science students. This has implications for further research efforts, because evidence from within these student populations is paramount for future-proofing the quality assurance of clinical evidence-based healthcare practice.

Another implication is the need to compare how the EBP teaching is framed and which methods are used, both inter- and mono-professionally, among these professional bachelor-degree students. Lastly, we support the recommendations of Bala et al. to use validated tools, to increase the focus on measuring behavior change in clinical practice and patient outcomes, and to report in accordance with the GREET guidelines for educational intervention studies [ 9 ].

This overview demonstrates a variety of approaches to promote EBP teaching among professional bachelor-degree healthcare students. Teaching EBP is based on collaboration with clinical practice and on different approaches to framing the teaching as well as different teaching methods. Furthermore, this overview has elucidated that interventions are often evaluated according to changes in the students’ skills, knowledge, and attitudes towards EBP, but very rarely on self-efficacy, behaviors, benefits to the patients, or reaction to the educational experience, as suggested by the Sicily group. This indicates that educators need to move on to measuring the effect of EBP teaching on outcomes comprising all categories, which is important to enhance sustainable behavior and the transition of knowledge into the practice contexts where better healthcare education should have an impact. In our perspective, these gaps in EBP teaching are best met by focusing on more collaboration with clinical practice, which is the context where the final endpoint of teaching EBP should be anchored and evaluated.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

EBP: Evidence-Based Practice

PBHP: Professional bachelor-degree healthcare programs

Mazurek Melnyk B, Fineout-Overholt E. Making the Case for Evidence-Based Practice and Cultivating a Spirit of Inquiry. In: Mazurek Melnyk B, Fineout-Overholt E, editors. Evidence-Based Practice in Nursing and Healthcare: A Guide to Best Practice. 4th ed. Wolters Kluwer; 2019. p. 7–32.

Tilson JK, Kaplan SL, Harris JL, Hutchinson A, Ilic D, Niederman R, et al. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ. 2011;11(78):1–10.

Connor L, Dean J, McNett M, Tydings DM, Shrout A, Gorsuch PF, et al. Evidence-based practice improves patient outcomes and healthcare system return on investment: Findings from a scoping review. Worldviews Evid Based Nurs. 2023;20(1):6–15.

Dawes M, Summerskill W, Glasziou P, Cartabellotta N, Martin J, Hopayian K, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1–7.

Larsen CM, Terkelsen AS, Carlsen AF, Kristensen HK. Methods for teaching evidence-based practice: a scoping review. BMC Med Educ. 2019;19(1):1–33.

World Health Organization. Nurse educator core competencies. 2016 https://apps.who.int/iris/handle/10665/258713 Accessed 21 Mar 2023.

Saunders H, Gallagher-Ford L, Kvist T, Vehviläinen-Julkunen K. Practicing healthcare professionals’ evidence-based practice competencies: an overview of systematic reviews. Worldviews Evid Based Nurs. 2019;16(3):176–85.

Young T, Rohwer A, Volmink J, Clarke M. What Are the Effects of Teaching Evidence-Based Health Care (EBHC)? Overview of Systematic Reviews. PLoS ONE. 2014;9(1):1–13.

Bala MM, Poklepović Peričić T, Zajac J, Rohwer A, Klugarova J, Välimäki M, et al. What are the effects of teaching Evidence-Based Health Care (EBHC) at different levels of health professions education? An updated overview of systematic reviews. PLoS ONE. 2021;16(7):1–28.

Copenhagen University. Bachelor in medicine. 2024 https://studier.ku.dk/bachelor/medicin/undervisning-og-opbygning/ Accessed 31 Jan 2024.

Ministry of Higher Education and Science. Professional bachelor programmes. 2022 https://ufm.dk/en/education/higher-education/university-colleges/university-college-educations Accessed 31 Jan 2024.

Patelarou AE, Mechili EA, Ruzafa-Martinez M, Dolezel J, Gotlib J, Skela-Savič B, et al. Educational Interventions for Teaching Evidence-Based Practice to Undergraduate Nursing Students: A Scoping Review. Int J Env Res Public Health. 2020;17(17):1–24.

Wakibi S, Ferguson L, Berry L, Leidl D, Belton S. Teaching evidence-based nursing practice: a systematic review and convergent qualitative synthesis. J Prof Nurs. 2021;37(1):135–48.

Fiset VJ, Graham ID, Davies BL. Evidence-Based Practice in Clinical Nursing Education: A Scoping Review. J Nurs Educ. 2017;56(9):534–41.

Pollock M, Fernandes R, Becker L, Pieper D, Hartling L. Chapter V: Overviews of Reviews. In: Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page M, et al., editors. Cochrane Handbook for Systematic Reviews of Interventions version 6.2. 2021 https://training.cochrane.org/handbook Accessed 31 Jan 2024.

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:1–9.

Covidence. Covidence - Better systematic review management. https://www.covidence.org/ Accessed 31 Jan 2024.

Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ. 2017;21(358):1–9.

Joanna Briggs Institute. Critical Appraisal Tools. https://jbi.global/critical-appraisal-tools Accessed 31 Jan 2024.

Horntvedt MT, Nordsteien A, Fermann T, Severinsson E. Strategies for teaching evidence-based practice in nursing education: a thematic literature review. BMC Med Educ. 2018;18(1):1–11.

Ramis M-A, Chang A, Conway A, Lim D, Munday J, Nissen L. Theory-based strategies for teaching evidence-based practice to undergraduate health students: a systematic review. BMC Med Educ. 2019;19(1):1–13.

Rethlefsen ML, Kirtley S, Waffenschmidt S, Ayala AP, Moher D, Page MJ, et al. PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews. Syst Rev. 2021;10(1):1–19.

Song CE, Jang A. Simulation design for improvement of undergraduate nursing students’ experience of evidence-based practice: a scoping-review protocol. PLoS ONE. 2021;16(11):1–6.

Cui C, Li Y, Geng D, Zhang H, Jin C. The effectiveness of evidence-based nursing on development of nursing students’ critical thinking: A meta-analysis. Nurse Educ Today. 2018;65:46–53.

Ghaffari R, Shapoori S, Binazir MB, Heidari F, Behshid M. Effectiveness of teaching evidence-based nursing to undergraduate nursing students in Iran: a systematic review. Res Dev Med Educ. 2018;7(1):8–13.

Wu Y, Brettle A, Zhou C, Ou J, Wang Y, Wang S. Do educational interventions aimed at nurses to support the implementation of evidence-based practice improve patient outcomes? A systematic review. Nurse Educ Today. 2018;70:109–14.

Ramis MA, Chang A, Nissen L. Undergraduate health students’ intention to use evidence-based practice after graduation: a systematic review of predictive modeling studies. Worldviews Evid Based Nurs. 2018;15(2):140–8.

Albarqouni L, Hoffmann T, Straus S, Olsen NR, Young T, Ilic D, et al. Core competencies in evidence-based practice for health professionals: consensus statement based on a systematic review and Delphi survey. JAMA Netw Open. 2018;1(2):1–12.

Hitch D, Nicola-Richmond K. Instructional practices for evidence-based practice with pre-registration allied health students: a review of recent research and developments. Adv Health Sci Educ Theory Pr. 2017;22(4):1031–45.


Acknowledgements

The authors would like to acknowledge research librarian Rasmus Sand for competent support in the development of literature search strategies.

This work was supported by the University College of South Denmark, which was not involved in the conduct of this study.

Author information

Authors and Affiliations

Nursing Education & Department for Applied Health Science, University College South Denmark, Degnevej 17, 6705, Esbjerg Ø, Denmark

Lea D. Nielsen

Department of Oncology, Hospital of Lillebaelt, Beriderbakken 4, 7100, Vejle, Denmark

Mette M. Løwe

Biomedical Laboratory Science & Department for Applied Health Science, University College South Denmark, Degnevej 17, 6705, Esbjerg Ø, Denmark

Francisco Mansilla

Physiotherapy Education & Department for Applied Health Science, University College South Denmark, Degnevej 17, 6705, Esbjerg Ø, Denmark

Rene B. Jørgensen

Occupational Therapy Education & Department for Applied Health Science, University College South Denmark, Degnevej 17, 6705, Esbjerg Ø, Denmark

Asviny Ramachandran

Department for Applied Health Science, University College South Denmark, Degnevej 17, 6705, Esbjerg Ø, Denmark

Bodil B. Noe

Centre for Clinical Research and Prevention, Section for Health Promotion and Prevention, Bispebjerg and Frederiksberg Hospital, Nordre Fasanvej 57, 2000, Frederiksberg, Denmark

Heidi K. Egebæk


Contributions

All authors have made substantial contributions to the conception and design of the study, acquisition of data, analysis and interpretation of data, writing the main manuscript, preparing figures and tables, and revising the manuscript.

Corresponding author

Correspondence to Lea D. Nielsen .

Ethics declarations

Ethics approval and consent to participate.

Not applicable.

Consent for publication

Competing interests.

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary material 1.

Supplementary material 2.

Supplementary material 3.

Supplementary material 4.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Nielsen, L.D., Løwe, M.M., Mansilla, F. et al. Interventions, methods and outcome measures used in teaching evidence-based practice to healthcare students: an overview of systematic reviews. BMC Med Educ 24 , 306 (2024). https://doi.org/10.1186/s12909-024-05259-8

Received : 29 May 2023

Accepted : 04 March 2024

Published : 19 March 2024

DOI : https://doi.org/10.1186/s12909-024-05259-8


  • MH "Students, Health occupations+"
  • MH "Students, occupational therapy"
  • MH "Students, physical therapy"
  • MH "Students, Midwifery"
  • “Students, Nursing"[Mesh]
  • “Teaching"[Mesh]
  • MH "Teaching methods+"
  • "Evidence-based practice"[Mesh]




Evidence-Based Practices in Addiction Treatment: Review and Recommendations for Public Policy

The movement in recent years towards evidence-based practice (EBP) in health care systems and policy has permeated the substance abuse treatment system, leading to a growing number of federal and statewide initiatives to mandate EBP implementation. Nevertheless, due to a lack of consensus in the addiction field regarding procedures or criteria to identify EBPs, the optimal processes for disseminating empirically based interventions into real-world clinical settings have not been identified. Although working lists of interventions considered to be evidence-based have been developed by a number of constituencies advocating for EBP dissemination in addiction treatment settings, the use of EBP lists to form policy-driven mandates has been controversial. This article examines the concept of EBP, critically reviews criteria used to evaluate the evidence basis of interventions, and highlights the manner in which such criteria have been applied in the addictions field. Controversies regarding EBP implementation policies and practices in addiction treatment are described, and suggestions are made to shift the focus of dissemination efforts from manualized psychosocial interventions to specific skill sets that are broadly applicable and easily learned by clinicians. Organizational and workforce barriers to EBP implementation are delineated, with corresponding recommendations to facilitate successful dissemination of evidence-based skills.

The importance of translating scientific advances in disease-specific interventions into clinical practice has been emphasized throughout the health care system, largely stemming from the consistent observation of a wide gap between research and practice [ 1 ]. As a move towards “evidence-based practice” has permeated health care systems and policy, several working groups in the addiction treatment field, both within and outside of the United States, have considered ways to align with this initiative. In the U.S., these efforts have been channeled through various legislative mandates and programs requiring implementation of evidence-based practices. Concurrently, national-level programs have been initiated outside of the U.S. to implement extensive rollouts of evidence-based treatments (e.g., the Improving Access to Psychological Therapies program in the United Kingdom) [ 2 ]. Likewise, the largest international training initiative in the addictions, developed by the United Nations Office on Drugs and Crime, involved dissemination of addiction treatment practices to Treatnet, a network of 20 drug dependence treatment resources around the world. The inclusion of evidence-based addiction treatment practices with a strong empirical foundation was a major emphasis of the training curriculum for Treatnet, which was used successfully in regions of the world with both highly developed and relatively less developed addiction treatment systems [ 3 ].

The rationale for the recent movement emphasizing dissemination and implementation of evidence-based practices is straightforward: if clinical decision-making and practice are informed by experimental studies that have established the effectiveness of particular interventions for specified clinical populations, this should (i) increase treatment effectiveness, (ii) facilitate consistency in practice, (iii) establish accountability of health service providers to funding sources, (iv) increase cost-effectiveness of treatment, and (v) improve the overall quality of treatment. In the field of addiction, however, consensus regarding the optimal procedures for identifying practices with sufficient empirical foundation to be considered “evidence-based” has not yet been reached. Nevertheless, the concept of “evidence-based practice” (EBP) is increasingly emphasized by providers, managers, payers, and regulators of behavioral health care. In this review, extant definitions and variations of this concept are reviewed, issues that should be considered prior to implementing EBP in real-world clinical settings are outlined, and finally, recommendations are delineated for policymakers who are shaping the role of EBP in addiction treatment.

For the present review of the concept of EBP in psychosocial addiction treatment, a literature search for publications concerning this topic within the past 10 years was conducted using databases and search engines including PubMed, Google, and Google Scholar, incorporating the following terms: evidence-based practices, psychotherapy, behavioral treatments, addiction, substance dependence, practice guidelines, principles, best practices, promising practices, criteria. In addition, current published documents concerning evidence-based practices for addiction treatment were reviewed from various sources both within and outside of the United States, including the National Institute on Drug Abuse, the Substance Abuse and Mental Health Services Administration, the American Psychological Association, the National Institute for Health and Clinical Excellence, and Cochrane Reviews. Because pharmacotherapies were outside the scope of this review, searches were limited to publications and guidelines germane to psychosocial interventions for addiction.

What is EBP?

Stemming from the concept of “evidence-based medicine,” coined by clinical epidemiologists in the 1980s, the most widely cited formal definition of EBP comes from the Institute of Medicine. In 2001, the Institute of Medicine’s Committee on Quality of Health Care in America produced the Quality Chasm report, which underscored quality shortcomings in the U.S. health system, emphasizing the gross disparity between the care patients receive and the clinical practices supported by empirical evidence [ 4 ]. Implementing evidence-based diagnostic and treatment processes was therefore one of the Institute of Medicine’s recommendations to facilitate an urgently needed health system redesign. Adapting Sackett et al.’s definition [ 5 , 6 ], the Quality Chasm report characterized EBP as: “the integration of best research evidence with clinical expertise” and patient values (p.71). The most debated components of this definition are the concepts of (i) best research evidence, and (ii) clinical expertise.

Best Research Evidence

Although it has been appropriately noted that the definition of best research evidence depends upon the nature of the clinical question (e.g., etiological questions versus identification of the most efficacious treatment for a particular disease) [ 7 ], descriptions of this concept to date uniformly acknowledge a variety of sources from which data can be brought to bear on clinical decision making. These sources include randomized clinical trials (RCTs), quasi-experimental investigations, correlational studies, field studies, case reports, and clinical guidelines based upon professional consensus [ 8 ]. Moreover, it has been argued that the rigor of the study design may correspond to a hierarchical ranking denoting experimental integrity; on this basis, the value of the evidence for a particular approach could be “graded” [ 9 ]. Evidentiary value encompasses the concepts of (a) evidence quality , or the extent to which bias is minimized in the context of the experimental design, and (b) evidence strength , which is inferred collectively on the basis of evidence quality, the size of treatment effects, the extent to which the outcomes reflect valid information about the populations and settings in which the study was conducted (i.e., internal validity), and the clinical utility and generalizability (i.e., external validity) of the findings [ 6 , 9 ]. Within this framework, RCTs are considered to be the least subject to bias, and therefore the most empirically “sound” source of evidence (i.e., that of the highest evidence quality). Table 1 depicts a hierarchical model of research evidence, drawing upon common elements of previously described systems for grading study quality [ 9 – 11 ].

Table 1. Levels of Evidence Used in Evidence-Based Practice

As an alternative to weighting individual studies and drawing corresponding conclusions regarding the evidence basis for particular interventions, clinicians may draw upon published syntheses of study findings, typically in the form of systematic reviews and meta-analyses. Hierarchical models of research evidence place both of these methods in the highest tier alongside RCTs. Systematic reviews evaluate research evidence based upon pre-defined objective criteria. Over the past decade, the Cochrane Collaboration and the Agency for Healthcare Research and Quality have accelerated the dissemination of synthesized information concerning health care practices through systematic reviews [ 12 , 13 ]. Although these reviews may or may not use meta-analytic techniques, the standard methodology for evaluating the strength of evidence in meta-analyses involves the calculation of an effect size, or summary statistic depicting the magnitude of the treatment effect, averaged across studies. While this method provides a useful metric from which to infer the utility of a treatment approach for a population or subgroup, some disadvantages of this technique warrant consideration. Specifically, the use of aggregate estimates of effect size may obscure qualitative differences between the individual studies, including variations in experimental integrity, subject characteristics, and study endpoints. Moreover, meta-analyses are subject to publication bias (also known as the “file drawer problem”), in that studies showing an effect of a treatment are more likely to be published than those showing no effect, thereby biasing the pool of clinical data from which meta-analyses are conducted. Nevertheless, synthesized reports on treatment effectiveness, whether in the form of meta-analyses or systematic reviews, remain a valuable resource to inform clinical decision-making.
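
As a concrete illustration of the averaging step mentioned above, one widely used (fixed-effect, inverse-variance) pooling formula, shown here for orientation rather than as the method any particular review used, is:

$$
\hat{\theta}_{\text{pooled}} = \frac{\sum_{i=1}^{k} w_i \,\hat{\theta}_i}{\sum_{i=1}^{k} w_i},
\qquad
w_i = \frac{1}{\widehat{\operatorname{Var}}(\hat{\theta}_i)},
$$

where $\hat{\theta}_i$ is the effect estimate (e.g., a standardized mean difference) from study $i$ of $k$ studies; studies with more precise estimates receive proportionally more weight in the summary effect size.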

Among the many constituencies advocating for the use of EBP in behavioral health, some have proposed highly specified criteria that reflect the number and types of trials required to establish a treatment as “evidence-based.” To this end, in 1995, the American Psychological Association’s Division of Clinical Psychology published criteria for identifying evidence-based treatments, formally labeled “empirically supported treatments.” Highlighting the distinction between treatment efficacy (i.e., clinical benefit produced by the intervention in the context of controlled research) and effectiveness (i.e., clinical benefit produced by the intervention in a clinical setting under naturalistic conditions), the criteria for empirically supported treatments require demonstration of efficacy in at least two investigations conducted by independent research teams [ 14 ]. Using a graded system, the criteria for empirically supported treatments specify that an intervention with evidence in favor of its use from a single study or from multiple studies conducted by the same research group is considered possibly efficacious pending replication. While studies demonstrating clinical utility of an intervention outside of the experimental setting are considered important for the translation of manualized interventions studied in RCTs into real-world clinical settings, the designation of a treatment as an empirically supported treatment does not require evidence of effectiveness; rather, efficacy is considered paramount [ 15 ].
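
Put schematically, and purely as an illustration of the graded criteria summarized above (the labels and thresholds below are a simplification, not the APA Division 12 wording), the designation logic resembles:

```python
def est_designation(efficacy_trials: int, independent_teams: int) -> str:
    """Simplified sketch of the empirically supported treatment grading described above."""
    if efficacy_trials >= 2 and independent_teams >= 2:
        return "empirically supported"                      # efficacy shown in >= 2 independent trials
    if efficacy_trials >= 1:
        return "possibly efficacious, pending replication"  # single study, or replication only by the same group
    return "not yet empirically supported"

# Example: two trials from the same research group remain "possibly efficacious".
print(est_designation(efficacy_trials=2, independent_teams=1))
```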

Critics of empirically supported treatments’ dissemination have argued that interventions established through efficacy research are unlikely to generalize to “real world” clinical settings [ 16 , 17 ]. Contrary to this argument, however, an emerging body of effectiveness research has yielded promising evidence for psychotherapy approaches previously established through efficacy research [ 18 , 19 ]. Effectiveness studies are not without their limitations, however; clients in some effectiveness studies receive more treatment than is routinely provided in efficacy trials and achieve treatment effects of a smaller magnitude than that observed in efficacy studies [ 20 ]. Moreover, though high in external validity, the research designs employed are often problematic with respect to internal validity.

Clinical Expertise

The notion that the clinician’s expertise and experience are key components of the science-to-practice translation is well accepted [ 14 , 15 ]. Numerous elements constitute clinical expertise, including scientific expertise to guide evaluation and use of research evidence, awareness of individual patient characteristics as they influence treatment needs, interpersonal ability, awareness of the limits of one’s clinical skill set, and clinical decision-making [ 15 ].

However, standards and regulations regarding the level of training required of clinicians delivering addiction treatment directly impact the extent to which clinicians in such settings demonstrate these elements of expertise, and recent studies suggest that the current educational requirements for substance abuse counselors fall short as a means of preparing them to adopt EBPs. Specifically, the minimum educational requirements in most states fail to support the development of skills to review and understand research evidence; according to a recent study of state requirements for the training of addiction counselors in 31 states, only one state (3%) mandated coursework in research and evaluation [ 21 ]. Moreover, the association between level of training (i.e., as indicated by degrees and certification) and innovation adoption is well-established [ 22 , 23 ]; thus, the potential success of EBP dissemination efforts in addiction treatment settings will depend, in part, on the pre-existing level of education and training of the workforce. Nevertheless, the majority of U.S. states require less than a college education for entry to the field [ 21 ]. Increasing the rigor of training, certification, and licensing requirements for the addiction treatment workforce is therefore an important consideration as a means of facilitating successful technology transfer [ 22 , 24 ].

A wealth of evidence suggests that interpersonal skill is a particularly important aspect of clinical expertise [ 25 ], and individual therapist effects account consistently and significantly for variance in outcomes [ 26 – 28 ]. As such, scientific experts recommend that psychosocial EBP dissemination efforts in addiction focus on a limited set of core change principles with corresponding skill sets that can be widely applied to clinicians with varying levels of experience [ 29 ]. Arguably, a key component of these principles would involve skills to establish and maintain a therapeutic alliance, along with techniques that facilitate use of the alliance to promote behavior change.

Variations in the concept of EBP

Although several aspects of the guidelines set forth by the APA Division of Clinical Psychology have been sources of controversy [ 30 ], their initiative to disseminate information on empirically supported treatments was followed by numerous ongoing efforts to develop EBP guidelines. In the process of accelerated efforts towards EBP implementation, however, a variety of terms have been utilized to describe documented sets of guidelines, some of which have meaningful distinctions. Apart from the term EBP, two additional broad categories of documents that summarize recommendations for translating research evidence into clinical practice are practice guidelines and best practices . Although numerous other terms have been used, they generally fall within the scope of one of these three categories of recommendations.

Practice Guidelines

According to the Institute of Medicine, practice guidelines are “systematically developed statements to assist practitioner and patient decisions about appropriate health care for specific clinical circumstances” [ 31 ]. These statements are developed through a consensus process that includes clinical and research experts in the appropriate field, and may also elicit input from health care provider organizations, consumer groups, and government agencies, depending upon the scope and purpose of the guidelines. In terms of content, practice guidelines may include approaches to the prevention, diagnosis, or treatment of an illness [ 32 ]. Content may be drawn from various theoretical frameworks, and flexibility is allowed in the actual implementation of the practice.

Practice guidelines may be referred to as protocols, standards, or algorithms, and vary widely in their level of detail, ranging from an extensive manual to a summary article in a peer-reviewed journal. For example, the American Psychiatric Association’s practice guidelines for addiction treatment are a 276-page chapter of a book on practice guidelines in psychiatry [ 33 ]. This comprehensive document synthesizes research evidence and clinical and expert consensus in the form of both a literature review and clinical recommendations to guide the selection of appropriate modalities, levels of care, and disease management practices for the major substances of abuse. Another set of practice guidelines was put forth by the National Institute on Drug Abuse; formally termed principles of drug abuse treatment, the publication describes 13 concepts or themes defined as “a set of overarching principles that characterize the most effective drug abuse and addiction treatments and their implementation” [ 34 , 35 ]. Although the NIDA principles include broad concepts such as “effective treatment attends to multiple needs of the individual, not just his/her drug use,” these principles may be appropriately categorized as practice guidelines, given that they are intended to help clinicians make empirically informed treatment decisions. Unlike EBP manuals for addiction treatment, which contain highly specific descriptions of session content, generally centering around a specific theoretical orientation, practice guidelines (a) are not based on a single theoretical base; and (b) vary widely in the extent to which specifications are provided to inform the implementation process.

Best Practices

Rather than serving as a clinician’s guide, the purpose of best practice documents is to guide treatment program planning and to outline processes that facilitate dissemination of research-based intervention strategies to clinical settings [ 32 ]. The content of these documents often includes guidelines for service delivery, such as recommended scope of services, assessment and intervention techniques, considerations when treating special populations, and processes for coordinating treatment with other types of services. As such, best practice documents often inform policy by describing optimum standards of treatment service delivery for addicted populations and subgroups with special needs. In so doing, these recommendations may also inform the advancement of standards for training of clinicians in the addiction field. Best practice documents germane to addiction treatment have been published by the Addiction Technology Transfer Center [ 36 ], the Institute of Medicine [ 1 ], the Network for the Improvement of Addiction Treatment [ 37 ], the National Quality Forum [ 38 ] and the Iowa Consortium for Substance Abuse Research and Evaluation [ 39 ], among others.

Evaluation Research on Guidelines

Practice guidelines and best practices are not always mutually exclusive, posing a challenge to the standardization of terminology used to inform EBP dissemination efforts. For example, the Center for Substance Abuse Treatment set forth a comprehensive set of 47 consensus-based Treatment Improvement Protocols (TIPs), which have also been referred to as “best practice guidelines.” Although some of the TIP manuals are indistinguishable from practice guidelines (e.g., TIP 40: Clinical guidelines for the use of buprenorphine in the treatment of opioid addiction), a sizable subset fall clearly into the “best practice” domain as defined above, such as TIP 46, which addresses administrative issues fundamental to running an outpatient treatment program, and TIP 38, which describes the importance of integrating vocational and addiction treatment services with corresponding process recommendations.

The TIPs were originally developed in 1993 and were disseminated gradually to all state Alcohol and Substance Abuse Directors within the United States, Addiction Technology Transfer Centers, and individuals in the U.S. Department of Health and Human Services. Subsequently the outcomes of the TIPs dissemination efforts were evaluated scientifically to continue to inform the process of using empirically derived knowledge to refine clinical service delivery [ 40 ]. Aside from the TIPs evaluation project, however, the quality and outcomes of implementation of practice guidelines in addiction treatment have rarely been studied [ 41 ]. The absence of relevant data to shed light on the utility of best practice and practice guideline documents in the technology transfer process is limiting; not only is it unclear whether these guidelines adequately meet the needs of treatment professionals, but without such data, there is a limited basis for improving this type of dissemination resource.

Despite awareness of and positive attitudes towards the TIPs, the majority of clinicians surveyed reported difficulty in using them in practice [ 41 ]. Studies are needed to (i) evaluate the effectiveness of existing protocols as tools for EBP implementation, and (ii) identify key components of protocols and guideline materials to maximize clinical utility and the likelihood of implementation and sustainability.

What are EBPs for the Treatment of Substance Use Disorders?

The idea of utilizing EBPs in community treatment settings for substance abusers is a controversial one. As reviewed earlier, although several sets of criteria for designation as an EBP have been published, there is currently no consensus in the field of addiction treatment research as to which evidence standards to use for defining EBPs. Nevertheless, there are increasing federal and state initiatives that make the implementation of EBPs in addiction treatment settings a priority. At the federal level, for example, SAMHSA has named the use of “evidence-based programs and strategies” among the 10 indicators of quality care in the context of the National Outcomes Monitoring System [ 42 ]. Among the many state initiatives currently underway, Oregon’s Senate Bill 267 represents a phased, yet fiscally aggressive, effort to implement EBPs for youth and adults at high risk for involvement in the criminal justice system, including substance abuse treatment settings. This legislation requires that state agencies spend 75% of their budgets (federal and state dollars) on EBP-related activities [ 43 ]. Correspondingly, a list of EBPs for substance use disorders was generated to guide implementation of this legislation [ 44 ]. This list joins several others developed by various research and professional consensus groups to determine which treatments meet sufficient standards of evidence quantity and quality, albeit varied ones, to be considered EBPs. Among the sources of compiled lists are the American Psychiatric Association, the American Psychological Association, SAMHSA’s National Registry of Evidence-Based Programs and Practices (NREPP) [ 45 ], the University of Washington Alcohol and Drug Abuse Institute, and numerous meta-analyses and reviews [ 8 , 20 , 46 – 48 ]. See Table 2 for a description of the criteria for inclusion in each of these lists.

Although such lists serve a legitimate purpose, they also have drawbacks. Among the most frequently cited concerns are that (i) EBPs might be used incorrectly or with insufficient fidelity by clinicians who do not have relevant training and/or expertise to facilitate proper delivery of the interventions; (ii) the use of manualized EBPs will result in less individualized treatment and, consequently, poorer quality of care [ 30 ]; and (iii) EBPs may be used for political purposes [ 49 ]. Moreover, in light of research delineating the importance of the therapeutic relationship in affecting treatment outcomes, it has been argued that the “practice” itself is not as important as the nature of the therapist-client relationship; thus, in contrast to the emphasis on treatments or techniques that work, Division 29 (Psychotherapy) of the American Psychological Association compiled a list of “psychotherapy relationships that work” [ 50 ]. This list highlights the core elements of the therapeutic relationship that impact the course and outcomes of treatment across many, if not all, forms of intervention for addictions. Nevertheless, not all elements of the therapeutic relationship can be readily evaluated using rigorous experimental methodologies (e.g., RCTs). This raises the question of whether the evidence hierarchy ought to be modified so as not to preferentially select therapy techniques while deemphasizing the therapeutic relationship [ 51 ].

Models for evaluating strength of evidence: minimum criteria for EBP designation

When lists of approved practices based on several widely cited EBP criteria are compared, the number and types of EBPs they include vary widely for reasons apart from the quality of the evidence. Likewise, more rigorous criteria sets are not consistently associated with smaller numbers of corresponding EBPs. First, the selection processes for interventions to be reviewed are disparate; NREPP, for example, accepts voluntary submissions for review, so the interventions meeting minimum criteria are essentially “self-selected,” thereby biasing the pool of treatments. Second, because lists are not always updated in a manner commensurate with the expansion of the corresponding scientific literature, they can easily become outdated. This is particularly problematic for the EBP list generated by the American Psychological Association, which was last updated formally by a designated task force in 1998; this list excludes well-studied interventions such as 12-Step Facilitation, contingency management, and others that would clearly meet the criteria. Finally, pharmacological treatments have not been consistently evaluated by the various EBP workgroups. The Oregon Addictions and Mental Health Division (AMH) lists some, but not all, FDA-approved medication treatments for substance dependence; moreover, the practice of “medication management” is listed as an EBP with no description of the pharmacotherapies that fall under that term. Furthermore, NREPP and the American Psychological Association do not include any pharmacotherapies on their EBP lists.

Treatments that Don’t Work

It has been cogently argued that identifying the interventions for which there is the strongest empirical evidence of efficacy and/or effectiveness is as important as knowing which treatments are ineffective or perhaps even detrimental to the clinical course and outcomes of addicted patients. As such, to the extent that lists of EBPs are utilized to drive the development and refinement of addiction treatment programs, lists of discredited treatments may help providers avoid ineffective ones. To this end, a recent Delphi survey delineated a set of intervention approaches that are contraindicated treatments for addiction [ 52 ]. A list of these treatments is provided in Table 3.

Discredited Techniques in Addiction Treatment

Source: Fala et al. (2008)

While the treatments that are considered to be effective are established as such through empirical research, the absence of efficacy studies does not render an intervention approach ineffective [ 8 ]. This important argument underscores one of the problems with reliance upon well-studied interventions in shaping the treatment system: it may, in effect, exclude interventions that have not had the opportunity to accumulate evidence in support of their use. Approaches to addiction treatment that are easily standardized, or already well standardized (e.g., pharmacotherapies, cognitive behavioral therapy), are more likely to be tested in RCTs, thereby biasing the pool of available interventions toward those that have generated the most scientific interest.

Considerations for dissemination of EBPs

In the face of rapidly burgeoning enthusiasm about disseminating EBPs, there is an urgent need to inform the implementation process with new empirical knowledge. Because dissemination research is a relatively new area of study in the addiction field [ 41 ], little is known about how to optimize evidence-based innovation adoption and sustainability. The extant body of research in this area does, however, point to some fundamental components of the implementation process to consider when forming a dissemination plan. These components are reviewed in this section.

Workforce Barriers

Workforce characteristics are important determinants of EBP adoption. Providers’ familiarity with EBPs, perceptions of their effectiveness, and attitudes towards them are each associated with the likelihood of successful implementation [ 53 – 55 ]. Spreading awareness of EBPs and their effectiveness is a complex process which requires support from the addiction treatment system’s infrastructure. Nevertheless, surveys of the addiction treatment system, such as the National Survey of Substance Abuse Treatment Services [ 56 ], consistently reveal gross inadequacies in this infrastructure, particularly in the leadership, workforce, and information systems fundamental to supporting quality evidence-based care. System-level change has therefore been an emphasis of numerous discussions in recent years concerning EBP dissemination strategies [ 56 – 59 ].

Intervention Fidelity

In the face of nonstandard implementation, the transfer of EBPs into clinical settings can be problematic. Research consistently shows that accurate implementation of EBP protocols is associated with positive clinical outcomes [ 60 , 61 ]. As such, fidelity measurement is used to assess the extent to which an EBP is being implemented as intended. As yet, there is no consensus as to how to optimize fidelity assessment of EBPs for substance use disorders. Typically, survey research methods aimed at providers are utilized to examine the extent of EBP implementation in treatment programs. This approach is not without drawbacks, however. When surveyed, providers may over-estimate the extent to which they utilize EBPs, including those for which they have received no formal training [ 62 ]. Inaccuracies in reporting are even more likely in the context of pressures arising from mandates regarding EBP delivery. Likewise, although direct observation of clinical activities might overcome some of the limitations of survey research methodology, such observation at a single time point may capture the practitioner’s ability to conduct an EBP, while leaving the nature of their routine clinical practices largely unknown.

Implementation researchers are nevertheless making strides in developing fidelity assessment methods that overcome some of these problems. One such approach was described in a recent report from the National Implementing Evidence-Based Practices Project [ 63 ], from which the first systematic study of the fidelity of EBP implementation across a large number of sites (N=53) was performed. In this longitudinal study of dissemination of 5 EBPs for mental health and co-occurring disorders, trained fidelity assessors conducted site visits before and repeatedly after a 1-year EBP implementation phase. Fidelity was assessed using a multimethod approach that included interviews with site practitioners, observation of clinical activities, interviews with clients, and chart reviews. The integration of these evaluation methods, coupled with repeated measurements over time, provides a rigorous model for fidelity assessment, one that is consistent with prior recommendations based upon a review of fidelity measurement methods in psychosocial treatment research [ 64 ].
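
To make the multimethod logic concrete, the brief sketch below combines item ratings from the four data sources named above into a single site-level fidelity score. It is an illustration only: the 1–5 rating scale, the number of items, the equal weighting across sources, and the cut points are assumptions made for this example, not the scales or scoring rules used by the National Implementing Evidence-Based Practices Project.

```python
from statistics import mean

# Hypothetical fidelity ratings (1 = not implemented ... 5 = fully implemented)
# gathered at a single site visit from the four data sources described above.
site_visit = {
    "practitioner_interviews": [4, 5, 3],
    "session_observation":     [3, 4, 4],
    "client_interviews":       [4, 4, 5],
    "chart_review":            [5, 3, 4],
}

def composite_fidelity(visit):
    """Average item ratings within each data source, then weight sources equally."""
    return mean(mean(ratings) for ratings in visit.values())

def classify(score):
    """Illustrative cut points only; real projects define their own."""
    if score >= 4.0:
        return "high fidelity"
    if score >= 3.0:
        return "moderate fidelity"
    return "low fidelity"

if __name__ == "__main__":
    score = composite_fidelity(site_visit)
    print(f"Composite fidelity at this visit: {score:.2f} ({classify(score)})")
    # Repeating the assessment before and after the implementation year, as in
    # the project described above, yields a fidelity trajectory over time.
```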

Practice-based Evidence

Given that the evidence base for psychosocial treatments for addiction is acquired largely from RCTs, EBP dissemination efforts focus on transporting specific theoretically based approaches with a relative de-emphasis on the level of competence of the individual therapist. Variability in clinicians’ level of competency is minimized in psychosocial RCTs through the selection of highly qualified and educated therapists coupled with rigorous training, supervision, and use of manuals to inform practice. Nevertheless, within the psychotherapy literature a number of studies and meta-analyses have reported moderate to large effects of individual therapists on clinical outcomes [ 65 ], often in the absence of observed differences between psychosocial treatment approaches of varying orientations [ 66 ]. These observations have formed the basis of a rationale for an approach that emphasizes practice-based evidence.

Using practice-based evidence, client outcome data are gathered from routine practice and used to provide therapists with real-time feedback regarding the impact of their interventions on client functioning. The delivery of feedback to therapists, when coupled with suggestions to improve clinical performance, has been touted as a complementary and effective means to improve the quality of care and outcomes [ 67 , 68 ]. This model, which historically has been implemented in psychological treatment settings in the United Kingdom [ 69 ], is currently the focus of a statewide mental health services initiative in Utah, in which “mental health vital signs” are monitored routinely using a 5-minute self-report instrument [ 67 ]. These “vital signs,” which correspond to several domains of functioning, are analyzed using a software system that tracks clinical trajectories. Using this system, patients who are at risk for “treatment failure” are flagged, with corresponding feedback and recommendations delivered to clinicians. The practice-based evidence model can also be used at the systems level; for example, in the context of performance-based contracting, administrative and clinical practices that are linked with the achievement of targeted clinical performance indices can be adopted by the system, an approach that was successfully adapted and instituted recently in Delaware [ 70 ].
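
The flag-and-feedback logic underlying such outcome-monitoring systems can be sketched in a few lines of code. The example below is not the Utah system or its instrument; the distress scale, the expected-improvement trajectory, and the risk threshold are hypothetical placeholders used only to show how clients whose scores lag an expected trajectory might be flagged for clinician feedback.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class ClientRecord:
    client_id: str
    # Session-by-session scores on a brief self-report measure; higher = more
    # distress. The 0-100 scale is a hypothetical placeholder.
    scores: list = field(default_factory=list)

def expected_score(baseline, session, weekly_gain=2.0):
    """Hypothetical expected trajectory: modest, steady improvement from baseline."""
    return max(baseline - weekly_gain * session, 0.0)

def flag_at_risk(record, tolerance=10.0):
    """Flag a client whose last three scores fall well short of the expected trajectory."""
    if len(record.scores) < 3:
        return False  # too few observations to judge a trajectory
    baseline = record.scores[0]
    recent = range(len(record.scores) - 3, len(record.scores))
    shortfall = mean(record.scores[s] - expected_score(baseline, s) for s in recent)
    return shortfall > tolerance

if __name__ == "__main__":
    client = ClientRecord("A-102", scores=[68, 70, 72, 74, 75])
    if flag_at_risk(client):
        # In a practice-based evidence system this flag would trigger feedback
        # and recommendations to the treating clinician.
        print(f"{client.client_id}: at risk of treatment failure; review the plan")
```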

Evidence-based Caveats

Given that the primary goal of implementing EBPs is to improve client treatment outcomes, treatment-related procedures that are thought to positively affect such outcomes are worthy of consideration. The Network for the Improvement of Addiction Treatment (NIATx), for example, developed process improvement strategies that affect important outcome dimensions for substance abusers, including treatment retention and access to care [ 71 , 72 ]. Although the NIATx approach is, like EBPs, a manualized approach to improving treatment process and outcome, it does not lend itself to testing in RCTs and thus does not fit squarely into the EBP model. Other such treatment processes include urine monitoring, a common practice in addiction treatment that has not been subjected to testing in controlled research. Conceptually, it is unclear where these procedures fall on the continuum of EBPs; despite their apparent utility, how they will be viewed, and to what extent they will be supported, remains uncertain given the emphasis of current policy on EBP implementation.

Recommendations

In planning the necessary steps to align treatment providers with policy driven EBP initiatives, a clear set of goals must be established. To successfully impact real-world treatment, a feasible set of objectives that are adaptable to different types of settings and that take into account the limitations of the current system of care is needed. There is compelling evidence reviewed in this paper that the current treatment system in the United States is ill-suited for the immediate goal of implementing manualized psychosocial EBPs. On the other hand, the potential for implementing manualized EBPs successfully outside of the United States is likely best in areas with relatively undeveloped addiction treatment systems. In these regions, new providers are likely to be receptive to the idea of building the foundation of their treatment programs on EBPs, provided that the protocols are appropriately adapted for use by their respective cultures. In parts of the world with well-developed addiction systems of care, such as the U.S., Europe, and Australia, rather than focusing efforts on disseminating manualized EBPs, a more realistic proximal goal is to inform addiction treatment with scientific evidence by providing clinicians with training in core evidence-based skills that tangibly influence their practice. The proper use of these skills should be expected to improve clinical outcomes, for which measurement systems are already in place (e.g., the National Outcomes Monitoring System model). Of note, clinicians report that skills that can be learned and put into practice rapidly are more appealing than those that require large-scale system change [ 73 ], underscoring the need to focus training efforts on a limited set of key techniques. Moreover, a number of steps will be required to create a workforce that is receptive to “evidence-based skills training” as well as pharmacotherapies for addiction. Recommendations are offered below regarding target clinical skills for evidence-based skills training and strategies to encourage innovation adoption:

  • In determining which practices are to be target EBPs for implementation, employ a stakeholder consensus process. Four sets of criteria for designating EBPs as such were described in this review. The most stringent criteria, set forth by the American Psychological Association, greatly limit the practices that can be considered evidence-based, while the Oregon Addictions and Mental Health Division criteria, requiring an evidence base that is less robust, allow for a greater array of empirically based techniques. Though each approach clearly has its strengths and limitations, any working EBP list for the purposes of informing policy decisions must be developed with consideration of the context of local patient needs and available treatment resources. Rather than relying exclusively upon one of these established EBP criteria sets, it is recommended that any working list of evidence-based techniques be reviewed by stakeholders to arrive at consensus recommendations for dissemination, taking into consideration the interventions that are the most feasible, affordable, and suitable to the patient needs of the specific region.

  • Target four core clinical skill sets for evidence-based skills training. First, principles of contingency management should be targets for dissemination, emphasizing the effects of reinforcing abstinence or other non-drug alternative behaviors on clinical outcomes, including treatment retention, adherence, and abstinence. In this context, urine monitoring procedures can be introduced as a means of promoting improved outcomes, including reinforcing compliance with pharmacological interventions aimed at reducing cravings and/or psychiatric symptoms. Second, motivational interviewing and brief intervention skills training is recommended as a means of promoting reductions in substance use. Given that the impairment in brain regions associated with impulse control can be at least partially reversed with abstinence, motivational interviewing is considered a cost-effective approach for targeting impulsive drug-seeking and use. Third, core cognitive-behavioral coping skills and relapse prevention strategies, including coping with risky situations and cravings, respectively, can be easily taught and understood by clinicians with a range of education and experience. Fourth, training in couples and family counseling skills is suggested as a means of optimizing the substance user’s social environment. Skills training in this area would be aimed at engaging the support of the substance abuser’s family for behavior change as well as restructuring couple and family interaction patterns in ways conducive to abstinence. As suggested by Carroll and Rounsaville, to the extent that any and all of these evidence-based skills can be taught using Web-based training techniques, a larger number of trainees can be reached while minimizing cost and enabling clinicians to train at their own pace. Nevertheless, maintenance and practice of these skills will require some supervisor-facilitated demonstration and rehearsal, according to dissemination research [ 74 ]. See Table 4 for examples of manualized psychosocial interventions that apply each of the four skill sets that are recommended targets for dissemination; an illustrative reinforcement schedule for the contingency management component is also sketched after this list of recommendations.

Examples of psychosocial interventions corresponding to four recommended skill sets for dissemination to clinicians in addiction treatment settings

  • Couple fidelity monitoring with ongoing supervision, consultation, and feedback. Given the wealth of evidence indicating that feedback on practice patterns strongly impacts practice behaviors [ 75 , 81 , 82 ], fidelity monitoring [ 63 ] as well as ongoing supervision and consultation [ 74 ] are essential components of the technology transfer process. Indeed, studies comparing various counselor training methods for EBPs, such as motivational interviewing and cognitive behavioral therapy, have found that supervision and feedback increase post-training proficiency, relative to counselors who receive training via a workshop or seminar without feedback or supervision [ 83 – 85 ]. Optimally, feedback provided in the context of fidelity monitoring can be complemented with feedback to practitioners regarding client outcomes. Given the modest association between fidelity and treatment outcomes, striking a balance between these two forms of feedback should enable clinicians to adapt the practice to meet individual client needs while maintaining an acceptable degree of fidelity. When the implementation model described above was coupled with the multimethod fidelity evaluation protocol described earlier, moderate to high fidelity was achieved [ 63 ].

  • Assess organizational readiness for change and adapt the implementation approach as needed. Addressing the psychological dynamics of change at both the individual and program levels is fundamental to instituting new practices [ 86 ]. Programs and providers must be sufficiently motivated to engage and sustain a change process before implementation efforts can be effective. The Change Book [ 87 ], an Addiction Technology Transfer Center publication, describes 10 steps for strategic development, implementation, and evaluation of innovation adoption efforts for addiction treatment systems. The fourth of the 10 steps involves assessing the program’s readiness to change. This can be achieved by using instruments such as the Texas Christian University Organizational Readiness for Change survey, which was designed specifically for use in addiction treatment and health services fields [ 88 ]. Direct conversations and focus groups are also powerful means of gathering these data, with the goals of identifying organizational barriers to change; supports for implementing changes (e.g., funding, desire to improve outcomes); implications of the change for agencies, administrators, counselors, and clients; features of the organizational structure currently in place to support change; and the organization’s stage of readiness to change. Because organizations may naturally resist change, adapting implementation strategies to an agency’s stage of readiness to change can increase the odds of success.
  • Increase access to training and informational resources. Counselor attitudes regarding acceptability of EBPs may be amenable to change using management practices that enhance access to new, clinically relevant knowledge. Training is an effective means of disseminating information about the utility of EBPs and may be associated with more favorable attitudes towards them [ 89 ]. To this end, the Addiction Technology Transfer Centers of NIDA and SAMHSA have a number of training materials available for providers. The use of external sources of information facilitates the transfer of research information into real-world practice settings [ 90 ]; thus, providing Internet access, encouraging use of research-based publications, and promoting clinicians’ involvement in professional development activities are both necessary and effective means of enhancing absorptive capacity. Recently, the use of a relatively simple, low-cost counselor toolkit for implementation of a motivational interviewing exercise proved to be an effective means of translating core evidence-based techniques into practice across six community-based addiction treatment sites [ 91 ]. Introducing these types of resources coupled with training activities would serve as part of a persuasion process to set the stage for successful EBP adoption [ 23 ].
  • Increase clinician and organizational exposure to EBPs. For addiction treatment programs, it is now well documented that involvement in a research network such as NIDA’s Clinical Trials Network enhances EBP adoption [ 92 ], particularly for pharmacotherapies [ 93 ]. The unique opportunity provided by such networks for addiction treatment programs to implement a novel intervention approach on a time-limited basis, a condition referred to as “trialability” [ 94 ], not only provides exposure to treatment innovations, but also dampens the financial burden that might otherwise be posed by EBP mandates by providing training, study materials, and financial support needed for successful implementation. Exposure to contingency management in the context of Clinical Trials Network participation resulted in successful adoption of these techniques in at least one large hospital system in New York [ 95 ]. Thus, increasing the involvement of addiction treatment organizations in research networks is an important step towards successful EBP adoption.
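
As a concrete illustration of the contingency management principles recommended above, the sketch below implements an escalating-reinforcement voucher schedule: the voucher value grows with each consecutive drug-negative urine sample and resets after a positive or missed sample. The dollar amounts, escalation step, and reset rule are illustrative assumptions rather than a schedule recommended in this review.

```python
def run_voucher_schedule(urine_results, start=2.50, step=1.25, reset_on_lapse=True):
    """Return (total_earned, per_visit_values) for a series of urine screens.

    urine_results: sequence of booleans, True = drug-negative sample.
    The voucher value escalates with each consecutive negative sample and
    resets to the starting value after a positive or missed sample.
    """
    value = start
    per_visit = []
    for negative in urine_results:
        if negative:
            per_visit.append(value)   # reinforce documented abstinence
            value += step             # escalate for sustained abstinence
        else:
            per_visit.append(0.0)     # no voucher this visit
            if reset_on_lapse:
                value = start         # reset removes the "banked" escalation
    return sum(per_visit), per_visit

if __name__ == "__main__":
    # Two negative samples, one lapse, then three more negative samples.
    total, vouchers = run_voucher_schedule([True, True, False, True, True, True])
    print("Per-visit vouchers:", vouchers)
    print(f"Total earned: ${total:.2f}")
```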

Conclusions

The transfer of knowledge acquired through addiction intervention research into clinical settings has great promise as a means of increasing treatment effectiveness and facilitating greater consistency in practice. The recommendations made towards this end in this paper are based upon a review of the literature concerning EBP criteria and dissemination in the area of psychosocial addiction treatment. Of note, because pharmacotherapy was outside the scope of this review, we did not address the important issue of evidence-based pharmacotherapy practices for addicted populations. Moreover, because there is a paucity of data to inform how practice guidelines, best practices, and EBP lists influence patient outcomes, conclusive recommendations regarding the use and future development of these resources cannot be made at this time. These limitations notwithstanding, several conclusions regarding EBP dissemination are warranted.

In initiating the process of EBP dissemination, it is important to consider not only the strength of the evidence basis for the target intervention(s), but also the features of the organization and workforce that will be instituting the practices. At both the organizational and individual counselor levels, receptivity to EBPs and research more generally are important precursors to successful dissemination. Facilitating positive perceptions and attitudes regarding the acceptability of EBPs may be achieved by enriching the research culture in clinical settings using some of the recommended dissemination strategies presented in this article. Variations in levels of training and competency of individual clinicians employed in community addiction treatment settings add another layer of complexity to EBP dissemination efforts, particularly given that manualized EBPs are typically delivered and validated in the research context by highly trained and educated clinicians. As such, the utility of EBP lists of manualized interventions that are currently informing policy around EBP mandates in addiction treatment is limited. Instead, the alternative to manualized EBP dissemination presented in this article (i.e., evidence-based skills training) is thought to be a more effective and feasible means of transferring evidence-based intervention strategies into clinical settings.

Acknowledgments

Preparation of this article was supported in part by National Institute on Drug Abuse Grant 1K23DA020085 and California Department of Alcohol and Drug Programs Grant 07-00176.


