Introduction to Evidence-Based Medicine

What is EBM?

Evidence-based medicine (EBM) requires the integration of the best research evidence with our clinical expertise and our patient’s unique values and circumstances.

  • By best research evidence, we mean clinically relevant research, sometimes from the basic sciences of medicine, but especially from patient-centered clinical research into the accuracy and precision of diagnostic tests (including the clinical examination), the power of prognostic markers, and the efficacy and safety of therapeutic, rehabilitative, and preventive strategies.
  • By clinical expertise, we mean the ability to use our clinical skills and past experience to rapidly identify each patient’s unique health state and diagnosis, their individual risks and benefits of potential interventions/exposures/diagnostic tests, and their personal values and expectations. Moreover, clinical expertise is required to integrate evidence with patient values and circumstances.
  • By patient values, we mean the unique preferences, concerns and expectations each patient brings to a clinical encounter and which must be integrated into shared clinical decisions if they are to serve the patient; and by patient circumstances we mean their individual clinical state and the clinical setting.

Why the interest in EBM?

Interest in EBM has grown exponentially since the coining of the term1 in 1992 by a group led by Gordon Guyatt at McMaster University, from 1 Medline citation in 1992 to over 119,000 by December 2016. Searching Google and Google Scholar with the terms ‘evidence based medicine’ retrieves almost 40 million and more than 1.5 million hits respectively. We encourage interested readers to review ‘an oral history of EBM’, published in 2014 by JAMA and the BMJ and presented by Dr. Richard Smith. This online resource outlines the origins and development of EBM, including discussions with Drs. David Sackett, Brian Haynes and Gordon Guyatt. We also recommend taking a look at the James Lind Library, which provides a more detailed history of the development of ‘fair tests of treatments in health care’, including many of the seminal moments in the history of EBM. As a teaching tip, we use many of the resources provided in the James Lind Library, such as the story of James Lind’s 1753 ‘Treatise of the Scurvy’ and the randomised trial of streptomycin treatment for pulmonary tuberculosis, published in 1948. These are great articles to engage learners and stimulate interest in EBM, while highlighting that EBM isn’t a new concept but instead builds on a solid foundation: the work of countless people worldwide who have been interested in using evidence to support decision making!

Evidence-based practice has become incorporated into many health care disciplines including occupational therapy, physiotherapy, nursing, dentistry, and complementary medicine, amongst many others. Indeed, we’ve been told by one publisher that adding evidence-based to the title of a book can increase sales – regardless of whether the book is evidence-based! Similarly, its use has spilled over into many other domains, including justice, education and policy making. When we first started working in this area, while we looked for the day when politicians would talk freely about using research evidence to inform their decision making, we did not anticipate it happening so soon or across so many countries! [Trudeau in Canada, Norwegian govt]

Because of the recognition that EBM is critical for decision making, professional organizations and training programs for various health care professionals have moved from whether to teach EBM to how to teach it, resulting in an explosion in the number of courses, workshops and seminars offered on its practice. Similarly, EBM educational interventions for the public, policy makers and health care managers have grown. And colleagues have extended training on critical appraisal to primary and secondary school students, highlighting that everyone should develop the ability to understand research evidence and use it in their own decision making, thereby enhancing health literacy. The format for teaching EBM to these diverse audiences has also evolved, showing less focus on didactic sessions and more focus on interactive, case-based discussion, opportunistic teaching, and use of different media including online platforms and social media. Indeed, we hope that this ebook stimulates interest in sharing content and curricula worldwide, and in developing collaborative educational opportunities such as Twitter journal clubs and massive open online courses.

While champions and opinion leaders have facilitated the rapid spread of EBM over the last 25 years, its growth has arisen from several realizations:

  1. our daily clinical need for valid and quantitative information about diagnosis, prognosis, therapy and prevention (up to five times per in-patient2 and twice for every three outpatients3)
  2. the inadequacy of traditional sources for this information because they are out of date (traditional textbooks4), frequently wrong (experts5), ineffective (didactic continuing medical education6) or too overwhelming in their volume and too variable in their validity for practical clinical use (medical journals7)
  3. the disparity between our diagnostic skills and clinical judgment, which increase with experience, and our up-to-date knowledge8 and clinical performance9, which decline
  4. our inability to afford more than a few seconds per patient for finding and assimilating this evidence10 or to set aside more than half an hour per week for general reading and practice reflection11
  5. the gaps between evidence and practice (including overuse and underuse of evidence), which lead to variations in practice and quality of care12-14. This issue in particular has gained increasing recognition over the last few years, including through a recent series in the Lancet on research waste that highlights the inadequate return on investment in research. Recognition of this issue has created a moral and financial imperative for funders, clinicians, researchers and policy makers to try to bridge these evidence-to-practice gaps.

These challenges have been met with a number of potential solutions, which facilitate the practice of EBM:

  1. the development of strategies for efficiently tracking down and appraising evidence (for its validity and relevance)
  2. the creation and explosion in development of evidence synopsis and summary services, which allow us to find and use high-quality preappraised evidence15
  3. the creation of information systems for bringing these evidence resources to us within seconds10 including ‘meta-search’ engines that search across multiple resources and strategies for integrating evidence with electronic health records (creating both pull and push strategies for evidence)
  4. the identification and application of effective strategies for lifelong learning and for improving our clinical performance16
  5. the engagement of other stakeholders including patients, the public and policy makers in seeking and applying evidence

This book is devoted to describing some of these innovations, demonstrating their application to clinical problems, and showing how they can be learned and practiced by clinicians who have just 30 minutes per week to devote to their continuing professional development.

Why the need for a new edition of this book?

Given the explosion in interest in EBM and its incorporation into educational curricula worldwide, why the need for another edition of this book? We sent an email survey to 40 colleagues worldwide asking them what they see as the challenges facing EBM practitioners and teachers now and over the next 5 years. They identified a number of issues that we will attempt to tackle in this book. Through their comments and our own reflections, we believe there is still work to be done to advance EBM. We encourage readers to offer their own tips for overcoming these challenges for inclusion in this book.

How do we practise EBM?

The complete practice of EBM comprises five steps, and this book addresses each in turn:

  • Step 1 – converting the need for information (about prevention, diagnosis, prognosis, therapy, causation, etc.) into an answerable question (Ch. 1)
  • Step 2 – tracking down the best evidence with which to answer that question (Ch. 2)
  • Step 3 – critically appraising that evidence for its validity (closeness to the truth), impact (size of the effect), and applicability (usefulness in our clinical practice) (the first halves of Chs 3–7)
  • Step 4 – integrating the critical appraisal with our clinical expertise and with our patient’s unique biology, values and circumstances (the second halves of Chs 3–7)
  • Step 5 – evaluating our effectiveness and efficiency in executing steps 1–4 and seeking ways to improve them both for next time (Ch. 9).

When we examine our practice and that of our colleagues and trainees in this five-step fashion, we find that clinicians can incorporate evidence into their practice in three ways. First is the ‘doing’ mode, in which at least the first four steps above are completed. Second is the ‘using’ mode, in which searches are restricted to evidence resources that have already undergone critical appraisal by others, such as evidence summaries (thus skipping step 3). Third is the ‘replicating’ mode, in which the decisions of respected opinion leaders are followed (abandoning at least steps 2 and 3). All three of these modes involve the integration of evidence (from whatever source) with our patient’s unique biology, values and circumstances of step 4, but they vary in the execution of the other steps.

For the conditions we encounter every day (e.g. acute coronary syndrome and venous thromboembolism) we need to be “up to the minute” and very sure about what we are doing. Accordingly, we invest the time and effort necessary to carry out both steps 2 (searching) and 3 (critically appraising), and operate in the ‘doing’ mode; all the chapters in this book are relevant to this mode. For the conditions we encounter less often (e.g. salicylate overdose), we conserve our time by seeking out critical appraisals already performed by others who describe (and stick to!) explicit criteria for deciding what evidence they selected and how they decided whether it was valid. We omit the time-consuming step 3 (critically appraising) and carry out just step 2 (searching), but restrict the latter to sources that have already undergone rigorous critical appraisal (e.g. ACP Journal Club). Only the third portions (“Can I apply this valid, important evidence to my patient?”) of Chapters 3–7 are strictly relevant here, and the growing database of pre-appraised resources (described in Chapter 2) is making this “using” mode more and more feasible for busy clinicians.
For the problems we’re likely to encounter very infrequently (e.g. graft vs. host disease in a bone marrow transplant recipient), we “blindly” seek, accept and apply the recommendations we receive from authorities in the relevant branch of medicine. This “replicating” mode also characterizes the practice of medical students and clinical trainees when they haven’t yet been granted independence and have to carry out the orders of their consultants. The trouble with the “replicating” mode is that it is “blind” to whether the advice received from the experts is authoritative (evidence-based, resulting from their operating in the “doing” mode) or merely authoritarian (opinion-based). Sometimes we can gain clues about the validity of our expert source (Do they cite references?). If we traced the care we give when operating in the “replicating” mode back to the literature and critically appraised it, we would find that some of it was effective, some useless, and some harmful. But in the “replicating” mode we’ll never be sure which.

The authors of this book don’t practise as EBM doers all of the time; we find that we move between the different modes of practising EBM depending on the clinical scenario, the frequency with which it arises, and the time and resources available to address our clinical questions. And, while some clinicians may want to become proficient in practising all five steps of EBM, many others would instead prefer to focus on becoming efficient users (and knowledge managers) of evidence. This book tries to meet the needs of these various end-users. And, for those readers who are teachers of EBM, we try to describe various ways in which the learning needs of different learners can be met, including those who want to be primarily users or doers of EBM.

Can clinicians practise EBM?

Surveys conducted amongst clinicians and students from various disciplines and from different countries have found that clinicians are interested in learning the necessary skills for practising EBM.17,18 One survey of UK GPs suggests that many clinicians already practise in the ‘using’ mode, using evidence-based summaries generated by others (72%) and evidence-based practice guidelines or protocols (84%).18 Far fewer claimed to understand (and to be able to explain) the “appraising” tools of NNTs (35%) and confidence intervals (20%). Several studies have found that participants’ understanding of EBM concepts is quite variable and that substantial barriers to its practice persist.19,20
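For readers less familiar with the number needed to treat (NNT) mentioned above, a brief worked example may help; the event rates here are hypothetical, chosen only to illustrate the arithmetic:

```latex
% Hypothetical two-arm trial: control event rate (CER) 20%, experimental event rate (EER) 15%
\[
\text{ARR} = \text{CER} - \text{EER} = 0.20 - 0.15 = 0.05
\]
\[
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.05} = 20
\]
```

That is, under these assumed event rates, 20 patients would need to be treated to prevent one additional adverse event.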

If clinicians have the necessary skills for practising EBM, can it be done in real-time? One of the first studies showing how this could be accomplished was conducted on a busy (180+ admissions per month) in-patient medical service. Electronic summaries of evidence previously appraised either by team members (“CATs”) or by synopsis resources were brought to working rounds, and it was documented that, on average, the former could be accessed in 10 seconds and the latter in 25 seconds.10 Moreover, when assessed from the viewpoint of the most junior member of the team caring for the patient, this evidence changed 25% of their diagnostic and treatment suggestions and added to a further 23% of them. This study has been replicated in other clinical settings including an obstetrical service.21 Finally, clinical audits from many practice settings have found that there is a significant evidence base for the primary interventions that are encountered on these clinical services.22-29

What’s the ‘E’ for EBM?

There is an accumulating body of evidence relating to the impact of EBM on health care professionals, from systematic reviews of training in the skills of EBM30 to qualitative research describing the experience of EBM practitioners31. Indeed, since the last edition of this book was published, there has been an explosion in the number of studies evaluating EBM educational interventions targeting primary and secondary school students, undergraduates, postgraduates and practising clinicians. However, these studies of the effect of teaching and practising EBM are challenging to conduct. In many studies, the intervention has been difficult to define, and it is unclear what the appropriate ‘dose’ or ‘formulation’ should be. Some studies evaluate EBM as an overall approach to clinical practice, while others evaluate training in one of its discrete ‘microskills’, such as Medline searching32 or critical appraisal33. Studies have evaluated online, in-person, small group and large group educational interventions.34 Learners have different learning needs and styles, and these differences must be reflected in the educational experiences provided.

Just as the intervention has proved difficult to define, the evaluation of whether the intervention has met its goals has been challenging. Effective EBM interventions will produce a wide range of outcomes. Changes in knowledge and skills are relatively easy to detect and demonstrate; changes in attitudes and behaviours are harder to confirm. Randomised studies of EBM educational interventions have shown that these interventions can change knowledge and attitudes.35 Similarly, randomised trials have shown that these interventions can enhance EBM skills.34,36 And one study has shown that a multi-faceted EBM educational intervention (including access to evidence resources and a seminar series using real clinical scenarios) significantly improved evidence-based practice patterns in a district general hospital.37 Still more challenging is detecting changes in clinical outcomes. Studies of undergraduate and postgraduate educational interventions have shown limited impact on ongoing behaviour or clinical outcomes.34,38 Studies demonstrating better patient survival when practice is evidence-based (and worse when it isn’t) are limited to outcomes research.39,40 We are still waiting to see a trial where access to evidence is withheld from control clinicians. Finally, it is also important to explore impact on all of these various outcomes over time.[ref]

Along with the interest in EBM, interest in evaluating EBM and developing evaluation instruments has grown. Several instruments are available for evaluating EBM educational interventions, including those that assess attitudes, knowledge and skills. We encourage interested readers to review the systematic review that addresses this topic, but note that it has not been updated since its publication in 2006 and should therefore serve as a starting point only.41 For any educational intervention, we encourage teachers and researchers to assess changes in performance and outcomes over time, because EBM requires lifelong learning, which cannot be measured over the short term.

By questioning the ‘E’ for EBM, are we asking the right question? It has been recognized that providing evidence from clinical research is a necessary but not sufficient condition for the provision of optimal care. This has created interest in knowledge translation: the scientific study of the methods for closing the knowledge-to-practice gap, and the analysis of the barriers and facilitators inherent in this process.42 (As a side note, while in Canada and Australia we call this knowledge translation, other terms are used elsewhere, including implementation science in the UK and dissemination and implementation in the US.)[ref] Proponents of knowledge translation have identified that changing behaviour is a complex process requiring comprehensive approaches directed towards patients, physicians, managers and policy makers, and that provision of evidence is but one component. In this edition, we’ll touch briefly on knowledge translation, which focuses on evidence-based implementation. This is not the primary focus of the book, which instead targets the practice of individual clinicians, patients and teachers.

What are the limitations of EBM?

Discussion about the practice of EBM naturally engenders both negative and positive reactions from clinicians. Some of the criticisms focus on misunderstandings and misperceptions of EBM, such as the concerns that it ignores patient values and preferences and promotes a cookbook approach (for interested readers, we refer you to an early systematic review of the criticisms of EBM and an editorial discussing these43,44). We have noted that discussion of these same criticisms bubbles up in the literature periodically. An examination of the definition and steps of EBM quickly dismisses these criticisms. Evidence, whether strong or weak, is never sufficient to make clinical decisions. Individual values and preferences must be weighed alongside this evidence to achieve optimal shared decision making, which highlights that the practice of EBM is not a ‘one-size-fits-all’ approach. Other critics have expressed worry that EBM will be hijacked by managers to promote cost-cutting. However, it is not an effective cost-cutting tool, since providing evidence-based care directed toward maximizing patients’ quality of life often increases the costs of their care and raises the ire of some health economists.45 The self-reported employment of the ‘using’ mode by a great majority of front-line GPs dispels the contention that EBM is an ivory-tower concept, another common criticism. Finally, we hope that the rest of this book will put to rest the concern that EBM leads to therapeutic nihilism in the absence of randomised trial evidence. Proponents of EBM acknowledge that several sources of evidence inform clinical decision making. The practice of EBM stresses finding the best available evidence to answer a question, and this evidence may come from randomised trials, rigorous observational studies, or even anecdotal reports from experts. Hierarchies of evidence have been developed to help describe the quality of evidence that may be found to answer clinical questions.
Randomised trials and systematic reviews of randomised trials provide the highest quality evidence (that is, the lowest likelihood of bias, and thus the lowest likelihood of misleading us) for establishing the effects of interventions. However, they are not usually the best sources for answering questions about diagnosis, prognosis, or the harmful impact of potentially noxious exposures.

This debate has highlighted limitations unique to the practice of EBM that must be considered. For example, the need to develop new skills in seeking and appraising evidence must not be underestimated. And the need to develop and apply these skills within the time constraints of our clinical practice must be addressed.

This book attempts to tackle these limitations and offers potential solutions. For example, EBM skills can be acquired at any stage in clinical training and members of clinical teams at various stages of training can collaborate by sharing the searching and appraising tasks. Incorporating the acquisition of these skills into grand rounds, as well as postgraduate and undergraduate seminars integrates them with the other skills being developed in these settings. These strategies are discussed at length in Chapter 8. Important developments to help overcome the limited time and resources include the growing numbers of evidence-based journals and evidence-based summary services. These are discussed throughout the book and in detail in Chapter 2. Indeed, one of the goals of this edition of the book is to provide tips and tools for practising EBM in ‘real-time’. And, we encourage readers to use the website to let us know about ways in which they’ve managed to meet the challenges of practising EBM in real-time.

How is this resource organized?

The overall package is designed to help practitioners from any health care discipline learn how to practise evidence-based health care. Thus, although the book is written from the perspectives of internal medicine and general practice, the website provides clinical scenarios, questions, searches, critical appraisals, and evidence summaries from other disciplines, permitting readers to apply the strategies and tactics of evidence-based practice to any health discipline.

For those of you who want to become more proficient ‘doers’ of EBM, we’d suggest that you take a look at Chapters 1 through 9. For readers who want to become ‘users’ of EBM, we’d suggest tackling Chapters 1 and 2, focusing on question formulation and matching those questions to the various evidence resources. We have also provided tips on practising EBM in real-time throughout the book. With the move to an ebook, we have been able to incorporate many of the tools/tips/strategies directly in the discussion where they are relevant. We hope this makes it easier for you to use the materials and we encourage you to use the online forum to let us know your thoughts and how this book can be more user-friendly. Finally, for those interested in teaching the practice of EBM, we have dedicated Chapter 7 to this topic.

The chapters and appendices that comprise this book constitute a traditional way of presenting our ideas about EBM. They offer the ‘basic’ version of the model for practising EBM. For those who want more detailed discussion, we’d suggest you review some other resources.46


  1. Evidence-based Medicine Working Group. Evidence-based medicine. A new approach to teaching the practice of medicine. JAMA 1992;268:2420-5.
  2. Osheroff J A, Forsythe D E, Buchanan B G, Bankowitz R A, Blumenfeld B H, Miller R A. Physicians’ information needs: analysis of questions posed during clinical teaching. Ann Intern Med 1991; 114: 576–81.
  3. Covell D G, Uman G C, Manning P R. Information needs in office practice: are they being met? Ann Intern Med 1985; 103: 596–9.
  4. Antman E M, Lau J, Kupelnick B, Mosteller F, Chalmers T C. A comparison of results of meta-analyses of randomised control trials and recommendations of clinical experts. JAMA 1992; 268: 240–8.
  5. Oxman A, Guyatt G H. The science of reviewing research. Ann NY Acad Sci 1993; 703: 125–34.
  6. Davis D, O’Brien MA, Freemantle N et al. Impact of formal continuing medical education. JAMA 1999;282:867-74.
  7. Haynes R B. Where’s the meat in clinical journals [editorial]? ACP Journal Club 1993; 119: A-22–3.
  8. Evans C E, Haynes R B, Birkett N J et al. Does a mailed continuing education program improve clinician performance? Results of a randomised trial in antihypertensive care. JAMA 1986; 255: 501–4.
  9. Sackett D L, Haynes R B, Taylor D W, Gibson E S, Roberts R S, Johnson A L. Clinical determinants of the decision to treat primary hypertension. Clinical Research 1977; 24: 648.
  10. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds: the ‘evidence cart’. JAMA 1998;280:1336-8.
  11. Sackett DL. Using evidence-based medicine to help physicians keep up-to-date. Serials 1997;9:178-81.
  12. Shah BR, Mamdani M, Jaakkimainen L, Hux JE. Risk modification for diabetic patients. Can J Clin Pharmacol 2004;11:239-44.
  13. Pimlott NJ, Hux JE, Wilson LM et al. Educating physicians to reduce benzodiazepine use by elderly patients. CMAJ 2003;168:835-9.
  14. Kennedy J, Quan H, Ghali WA, Feasby TE. Variations in rates of appropriate and inappropriate carotid endarterectomy for stroke prevention in 4 Canadian provinces. CMAJ 2004;171:455-9.
  15. Haynes RB, Cotoi C, Holland J et al. Second-order peer review of the medical literature for clinical practitioners. JAMA 2006;295:1801-8.
  16. Effective Practice and Organisation of Care Group. The Cochrane Library. Wiley, 2009.
  17. McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practicing clinician: a survey of Canadian general internists. JGIM 1999;14:236-42.
  18. McColl A, Smith H, White P, Field J. General practitioners’ perceptions of the route to evidence-based medicine: a questionnaire survey. BMJ 1998;316:361-5.
  19. Young JM, Glasziou P, Ward J. General practitioners’ self ratings of skills in evidence based medicine: validation study. BMJ 2002;324:950-1.
  20. Sekimoto M, Imanaka Y, Kitano N et al. Why are physicians not persuaded by scientific evidence? BMC Health Serv Res 2006;6:92.
  21. Deshpande N, Publicover M, Gee H, Khan KS. Incorporating the views of obstetric clinicians in implementing evidence-supported labour and delivery suite ward rounds: a case study. Health Info Libr J. 2003 Jun;20(2):86-94.
  22. Ellis J, Mulligan I, Rowe J, Sackett D L. Inpatient general medicine is evidence based. Lancet 1995; 346: 407–10.
  23. Geddes J R, Game D, Jenkins N E, Peterson L A, Pottinger G R, Sackett D L. In-patient psychiatric care is evidence-based. Proceedings of the Royal College of Psychiatrists Winter Meeting, Stratford, UK, January 23–25, 1996.
  24. Howes N, Chagla L, Thorpe M, McCulloch P. Surgical practice is evidence based. Br J Surg 1997; 84: 1220–3.
  25. Kenny S E, Shankar K R, Rintala R, Lamont G L, Lloyd D A. Evidence-based surgery: interventions in a regional paediatric surgical unit. Arch Dis Child 1997; 76: 50–3.
  26. Gill P, Dowell A C, Neal R D, Smith N, Heywood P, Wilson A E. Evidence based general practice: a retrospective study of interventions in one training practice. BMJ 1996; 312: 819–21.
  27. Moyer VA, Gist AK, Elliott EJ. Is the practice of pediatric inpatient medicine evidence-based? J Pediatr Child Health 2002;38:347-51.
  28. Waters KL, Wiebe N, Cramer K et al. Treatment in the pediatric emergency department is evidence based: a retrospective analysis. BMC Pediatr 2006;6;26.
  29. Lai TY, Wong VW, Leung GM. Is ophthalmology evidence based? Br J Ophthalmol 2003;87:385-90.
  30. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database of Systematic Reviews 2001, Issue 3. Art. No.: CD001270. DOI: 10.1002/14651858.CD001270
  31. Greenhalgh T, Douglas HR. Experiences of general practitioners and practice nurses of training courses in evidence-based health care: a qualitative study. Br J Gen Pract 1999;49:536-40.
  32. Rosenberg W, Deeks J, Lusher A et al. Improving searching skills and evidence retrieval. J Roy Coll Phys 1998;328:557-63.
  33. Taylor RS, Reeves BC, Ewings PE, Taylor RJ. Critical appraisal skills training for health care professionals: a randomised controlled trial. BMC Med Educ 2004;4:30.
  34. Bradley P, Oterholt C, Herrin J et al. Comparison of directed and self-directed learning in evidence-based medicine: a randomised controlled trial. Med Educ 2005;39:1027-35.
  35. Johnston J, Schooling CM, Leung GM. A randomised controlled trial of two educational modes for undergraduate evidence-based medicine learning in Asia. BMC Med Educ 2009;9:63.
  36. Shuval K, Berkovits E, Netzer D et al. Evaluating the impact of an evidence-based medicine educational intervention on primary care doctors’ attitudes, knowledge and clinical behaviour: a controlled trial and before and after study. J Eval Clin Pract 2007;13:581-98.
  37. Straus SE, Ball C, Balcombe N, Sheldon, J McAlister FA. Teaching evidence-based medicine skills can change practice in a community hospital. JGIM 2005 April;20(4):340-343.
  38. Kim S, Willett LR, Murphy DJ et al. Impact of an evidence-based medicine curriculum on resident use of electronic resources. JGIM 2008;23:1804-8.
  39. Mitchell J B, Ballard D J, Whisnant J P, Ammering C J, Samsa G P, Matchar D B. What role do neurologists play in determining the costs and outcomes of stroke patients? Stroke 1996; 27: 1937–43.
  40. Wong J H, Findlay J M, Suarez-Almazor M E. Regional performance of carotid endarterectomy appropriateness, outcomes and risk factors for complications. Stroke 1997; 28: 891–8.
  41. Shaneyfelt T, Baum KD, Bell D et al. Instruments for evaluating education in evidence-based practice. JAMA 2006;296:1116-27.
  42. Straus SE, Tetroe J, Graham ID. Defining knowledge translation. CMAJ 2009;181:165-8.
  43. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ 2000;163:837-41
  44. Straus SE, Glasziou P, Haynes RB, Dickersin K, Guyatt GH. Misunderstandings, misperceptions and mistakes. ACP JC 2007;146:A8.
  45. Maynard A. Evidence-based medicine: an incomplete method for informing treatment choices. Lancet 1997; 349: 126–8.
  46. Guyatt G, Rennie D, Meade M, Cook DJ. Ed. Users’ guides to the medical literature. A manual for evidence-based clinical practice. AMA Press: Chicago, 2008