Otolaryngol Clin North Am 40 (2007) 1261–1274
The term evidence-based medicine (EBM) has been used by many educators and physicians to describe the need for applying the ever-enlarging body of knowledge to everyday care. Practicing EBM is a structured process that integrates the best available literature with patient desires and clinical experience to direct patient care decisions. The goal of this article is to provide a framework for teaching EBM in otolaryngology, particularly in a resident training program.
This article begins by defining EBM: what it is and what it is not. Next, potential barriers to EBM teaching are identified, and then examples of structured learning environments are discussed to provide effective frameworks for learning. Later, key concepts to consider when critically evaluating papers are discussed. Finally, the article reviews an otolaryngologic example of applying an evidence-based review to address a clinical question. Examples and literature from multiple disciplines are cited.
* Corresponding author.
E-mail address: mgs2002@med.cornell.edu (M.G. Stewart).
0030-6665/07/$ - see front matter © 2007 Elsevier Inc. All rights reserved.
doi:10.1016/j.otc.2007.07.006 oto.theclinics.com
1262 LIU & STEWART
effectiveness, not all problems are amenable to such evaluation. A mock systematic review of RCTs on the effectiveness of parachutes in preventing major trauma [5], for example, was one author's tongue-in-cheek way of expressing that RCTs cannot be performed to answer every clinical question. The judicious review of other available types of evidence (cohort, case control, case reports, and even expert opinion) can provide significant guidance in clinical decision making. A related misconception about EBM is that "lack of evidence" means "lack of effectiveness." This is also untrue. Many effective treatments have not been studied systematically, and external evidence cannot answer every question.
Practicing EBM is not a skill acquired instantaneously. It is developed over time, and learning should begin with resident physicians in training. Indeed, EBM involves appraisal of one's own practice experience and assessment of whether it could be improved, which is part of practice-based learning and improvement, a core competency for residency training. However, unlike taking a good history or performing a thorough physical exam, the process of acquiring and critically analyzing literature to inform clinical decision making is not uniformly well taught in medical schools. In one study, Caspi and colleagues [6] evaluated the perceived competence and performance of graduating medical students in EBM techniques across the United States. They first evaluated the respondents' self-assessment of competency in critically evaluating evidence, and then tested their actual skills using an online instrument. The study found that although respondents felt competent in their ability to evaluate problems across multiple competencies, the average score of 55% correct on the assessment instrument suggested otherwise. This is further evidence that residency training programs need to refine the skills of graduating medical students as they proceed toward becoming practicing physicians.
Residency programs have always provided residents with strong training in the "clinical judgment" and "physician experience" aspects of medical care, as well as the art of integrating patient preference and expectation into decision making. However, searching for, identifying, and appraising the best evidence are often relegated to monthly "journal club" conferences and maybe one or two didactic lectures, often from a nonphysician (ie, a statistician). The process of EBM should instead be a key skill taught during residency training.
Barriers to teaching
Teaching EBM to residents remains a significant challenge because barriers exist at multiple levels. These barriers can be divided into three broad categories: institutional, resident related, and attending staff related. Much of this section is derived from an excellent study by Bhandari and colleagues [7], which used structured interviews with surgical residents.
Taken together, many barriers can exist to the teaching of EBM in residency. These barriers lie in three areas: resident, attending, and institutional. Strategies and mechanisms to overcome institutional barriers to teaching EBM are beyond the scope of this article. However, assuming that resources are available, the ability to provide good EBM teaching is mostly limited by attending and resident physician factors.
The ability to provide EBM teaching to residents necessarily begins with a commitment by attending surgeons. Many of the factors identified by Bhandari and colleagues [7] stem from a lack of background or commitment among attending physicians. Establishing a fundamental knowledge base in EBM and widespread acceptance of its value are important steps toward overcoming these key barriers. A later section discusses structured strategies that training programs have reported for teaching EBM.
[Fig. 1. The steps of practicing EBM.]
Step I: Formulate a focused clinical question.
Step II: Search for and retrieve appropriate literature.
Step III: Review and grade the literature.
Step IV: Summarize and formulate recommendations.
(Alternative to Steps III and IV: critically evaluate published evidence-based reviews or guidelines.)
Step V: Integrate summary recommendations with clinical experience and patient preferences.
and return to regular oral intake (outcome) when compared with electrocautery technique (comparison)?" This focused question lends itself to literature searches, and gives the best chance for a successful answer.
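The PICO framework (patient, intervention, comparison, outcome) that underlies focused questions like the one above can be sketched as a small data structure. The comparison and outcome values below are drawn from the text; the patient and intervention fields are hypothetical placeholders, since the original question is only partially quoted.

```python
from dataclasses import dataclass

# A minimal sketch of the PICO framework behind a focused clinical
# question. The comparison and outcome come from the quoted question;
# the patient and intervention values are hypothetical placeholders.
@dataclass
class PICOQuestion:
    patient: str       # P: the patient or problem
    intervention: str  # I: the intervention being considered
    comparison: str    # C: the alternative being compared against
    outcome: str       # O: the outcome of interest

    def as_question(self) -> str:
        return (f"In {self.patient}, does {self.intervention}, "
                f"compared with {self.comparison}, "
                f"improve {self.outcome}?")

q = PICOQuestion(
    patient="patients undergoing tonsillectomy",       # hypothetical
    intervention="an alternative surgical technique",  # hypothetical
    comparison="electrocautery technique",
    outcome="postoperative return to regular oral intake",
)
print(q.as_question())
```

Filling in all four slots before searching is what makes the question answerable by the literature.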
Literature search
The second step is the literature search. MEDLINE, a computerized database covering most of the world's biomedical literature, is maintained by the National Library of Medicine. It can be searched free of charge in the United States through the PubMed Web site at www.pubmed.gov. Executing a literature search requires practice and experience. There are definite techniques for improving the effectiveness of literature searching, but a detailed discussion is beyond the scope of this article; others have addressed this question in more detail [10]. The third and fourth steps, critical literature review and summary, are covered later in this article.
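For readers who want to script such searches, a query can be built against the NCBI E-utilities "esearch" endpoint, the documented public API behind PubMed; this sketch only constructs the URL (the endpoint and the db/term/retmax parameters are part of the E-utilities interface, while the search terms are just an illustration).

```python
from urllib.parse import urlencode

# The documented NCBI E-utilities search endpoint backing PubMed.
EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def build_pubmed_query(terms, retmax=20):
    """Return an esearch URL that combines the given terms with AND."""
    return EUTILS_ESEARCH + "?" + urlencode({
        "db": "pubmed",               # search the PubMed database
        "term": " AND ".join(terms),  # boolean conjunction of terms
        "retmax": retmax,             # maximum number of IDs returned
    })

url = build_pubmed_query(["peritonsillar abscess", "drainage",
                          "randomized controlled trial"])
print(url)
```

Fetching this URL with any HTTP client returns an XML list of matching PubMed IDs, which can then be retrieved in full with the companion "efetch" endpoint.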
Although it is important for physicians to understand the concepts behind comprehensive searching, and to perform literature searches if needed, in reality this process can be lengthy and tedious. Therefore, the
clinician successfully practicing EBM should be able to identify and use
TEACHING EVIDENCE-BASED MEDICINE IN OTOLARYNGOLOGY 1267
available resources to assist in EBM practice. In many cases, the steps of literature search, critical review, and evidence synthesis have already been performed and published. The clinician can then use these summaries, either as the definitive review or as a key starting point. Evidence-based reviews and working group or committee recommendations are typically produced by large groups of experts (often multidisciplinary) and undergo peer review. The time saved by using these resources can be significant, especially for the busy clinician. However, because these articles require faith that the process was thorough and unbiased, it is important that clinicians still understand the skills needed to perform such reviews. In addition, evidence-based reviews have not been performed for some questions, or only one or two articles are available, again pointing out the clinician's need to be able to review and synthesize evidence independently.
Although UpToDate [8] and the Cochrane Library [11] are respected resources for EBM reviews on selected topics, there is no consolidated source for otolaryngologic evidence-based reviews. The Agency for Healthcare Research and Quality (www.ahrq.gov) has published some reviews of otolaryngology topics, such as its review of acute bacterial rhinosinusitis [12]. Reviews such as these are sometimes also published in otolaryngology journals [13,14].
There are some problems with relying on existing evidence-based reviews. They usually tackle only problems of general interest, and many questions that challenge a clinician may not have "made the radar" for evaluation. Some organizations that perform evidence-based reviews choose to review and report only the highest-level evidence (eg, RCTs). The philosophy is that compiling and summarizing lower-quality evidence is not worth the time, and that efforts should be directed at questions with higher-level evidence. This creates a problem in several areas, particularly surgical questions, where there may be a paucity of randomized or controlled trials, and therefore the evidence that is available is never systematically reviewed. Also, some databases are proprietary and expensive, putting them out of reach for clinicians outside large organizations.
The final step, integration of the best available evidence with clinical judgment and the patient's wishes, is the "balancing the three-legged stool" concept discussed earlier. It is important to remember that the evidence was gathered from populations of patients but is being applied to the care of an individual patient.
being used by 95% of internal medicine programs [15] and 73% of family medicine programs [16]. The varying strategies and structures used in journal clubs have been described in detail elsewhere [17]. Most journal clubs are structured around reviewing a single landmark or recent publication in depth. The research methodology, statistical analysis, premises, and conclusions are among the items reviewed to help evaluate the quality of the paper. The paper is also reviewed in the greater context of the practice of medicine to determine its applicability. Discussing all these aspects allows residents to learn how to perform better critical analysis. In this way, the journal club emphasizes critical analysis of the literature. Although a useful exercise, it is limited because the discussions usually occur in a general context and not in reference to a specific patient. In addition, studies should be assigned specific levels of evidence based on methodology in each journal club, a step that does not uniformly occur.
In open evidence-based curricula [17], a clinical scenario, usually based on an actual patient, is presented. Residents then perform literature and evidence reviews to answer a clinical question regarding appropriate management of the presented patient. Doing so allows many steps to occur that go beyond the traditional journal club. As in a journal club, critical appraisal of the literature is necessary to determine the quality of the evidence. However, open evidence-based curricula differ in that participants also engage in the literature search process, summarization, and integration with clinical experience. The discussion that ensues revolves not only around the quality of the evidence but also around the search method and the different styles of integrating evidence with clinical experience. The incorporation of these additional steps makes this a powerful teaching strategy.
Finally, professor rounds offer a structured way of teaching EBM [18]. In this format, a patient is presented to an attending physician who is unfamiliar with the case, and the diagnosis, workup, and treatment are discussed. When questions arise regarding any part of this process, they are refined into an answerable form. In the interval between sessions, residents divide the questions and explore the literature. The results are synthesized and presented at the following session. As in the open curricula, the emphasis is less on critical appraisal of the literature per se and more on directing the literature toward the care of a single patient. The review of the literature and discussion of the evidence as it applies to an individual case encompass the broader aspects of the EBM process.
These three strategies differ slightly in their method of teaching EBM. Journal club puts a heavy focus on critical journal review, a key part of being able to deliver EBM. The other two activities, open curriculum and professor rounds, emphasize the integration of the literature into the care of a single patient.
1. RCT
2. Cohort studies
3. Case control studies
4. Case series
5. Expert opinion
Table 1
Levels of evidence

Level 1a
  Therapy/prevention, etiology/harm: SR (with homogeneity) of RCTs
  Prognosis: SR (with homogeneity) of inception cohort studies; CDR validated in different populations
  Economic and decision analyses: SR (with homogeneity) of Level 1 economic studies

Level 1b
  Therapy/prevention, etiology/harm: Individual RCT (with narrow confidence interval)
  Prognosis: Individual inception cohort study with >80% follow-up; CDR validated in a single population
  Economic and decision analyses: Analysis based on clinically sensible costs or alternatives; systematic review(s) of the evidence; and including multiway sensitivity analyses

Level 1c
  Therapy/prevention, etiology/harm: All or none
  Prognosis: All or none case series
  Economic and decision analyses: Absolute better-value or worse-value analyses

Level 2a
  Therapy/prevention, etiology/harm: SR (with homogeneity) of cohort studies
  Prognosis: SR (with homogeneity) of either retrospective cohort studies or untreated control groups in RCTs
  Economic and decision analyses: SR (with homogeneity) of Level >2 economic studies

Level 2b
  Therapy/prevention, etiology/harm: Individual cohort study (including low quality RCT; eg, <80% follow-up)
  Prognosis: Retrospective cohort study or follow-up of untreated control patients in an RCT; derivation of CDR or validated on split sample only
  Economic and decision analyses: Analysis based on clinically sensible costs or alternatives; limited review(s) of the evidence, or single studies; and including multiway sensitivity analyses

Level 2c
  Therapy/prevention, etiology/harm: "Outcomes" research; ecological studies
  Prognosis: "Outcomes" research
  Economic and decision analyses: Audit or outcomes research

Level 3a
  Therapy/prevention, etiology/harm: SR (with homogeneity) of case control studies
  Economic and decision analyses: SR (with homogeneity) of Level 3b and better studies

Level 3b
  Therapy/prevention, etiology/harm: Individual case control study
  Economic and decision analyses: Analysis based on limited alternatives or costs, poor quality estimates of data, but including sensitivity analyses incorporating clinically sensible variations

Level 4
  Therapy/prevention, etiology/harm: Case series (and poor quality cohort and case control studies)
  Prognosis: Case series (and poor quality prognostic cohort studies)
  Economic and decision analyses: Analysis with no sensitivity analysis

Level 5
  Therapy/prevention, etiology/harm: Expert opinion without explicit critical appraisal, or based on physiology, bench research, or "first principles"
  Prognosis: Expert opinion without explicit critical appraisal, or based on physiology, bench research, or "first principles"
  Economic and decision analyses: Expert opinion without explicit critical appraisal, or based on economic theory or "first principles"

Abbreviations: CDR, clinical decision rule; MA, meta-analysis; RCT, randomized controlled trial; SR, systematic review.
Adapted from Phillips B, Ball C, Sackett D, et al. Levels of evidence and grades of recommendation. Oxford Centre for Evidence-Based Medicine, 2001. Available at: http://www.cebm.net/levels_of_evidence.asp.
Table 2
Evidence-based grading system
Grade A: Consistent Level 1 studies
Grade B: Consistent Level 2 or 3 studies, or extrapolations from Level 1 studies
Grade C: Level 4 studies, or extrapolations from Level 2 or 3 studies
Grade D: Level 5 evidence, or troublingly inconsistent or inconclusive studies of any level
Adapted from Phillips B, Ball C, Sackett D, et al. Levels of evidence and grades of recommendation. Oxford Centre for Evidence-Based Medicine, 2001. Available at: http://www.cebm.net/levels_of_evidence.asp.
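The mapping from evidence levels to recommendation grades in Table 2 can be sketched as a simple rule. This is a deliberate simplification for illustration: it keys only on the best integer level present and on consistency, and it ignores the "extrapolation" cases that Table 2 also assigns to Grades B and C.

```python
def grade_evidence(levels, consistent=True):
    """Assign a recommendation grade (A-D) from a list of integer
    evidence levels (1 = RCT/SR of RCTs ... 5 = expert opinion),
    following a simplified reading of Table 2."""
    if not consistent:
        return "D"   # troublingly inconsistent studies of any level
    best = min(levels)
    if best == 1:
        return "A"   # consistent Level 1 studies
    if best in (2, 3):
        return "B"   # consistent Level 2 or 3 studies
    if best == 4:
        return "C"   # Level 4 studies
    return "D"       # Level 5 evidence

print(grade_evidence([1, 1]))                    # consistent RCT-level evidence
print(grade_evidence([2, 3]))                    # cohort/case control evidence
print(grade_evidence([1, 2], consistent=False))  # inconsistent findings
```

The point of encoding the rule is pedagogical: it forces the reviewer to state explicitly both the level of each study and whether the studies agree.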
example, if the study evaluated quality of life as an outcome, did the authors use a previously validated instrument or a proprietary one? What did the authors consider the "gold standard" for diagnosis? These are important considerations that should be evaluated before study results are translated into clinical practice.
Finally, the physician should understand the potential difference between statistical significance and clinical significance. Statistical significance measures the probability that the results seen might have arisen by chance, and, in many ways, it is a function of the sample size. It is possible to find a statistically significant difference that is not clinically significant. For example, after use of an oral appliance, patients with sleep apnea might have a mean apnea-hypopnea index of 46, versus 49 in a nontreated group. Even if that difference achieves statistical significance (eg, P<.05, probably achieved only with a very large sample size), it means only that the difference seen was likely not due to chance. From a clinical standpoint, both groups still have severe sleep apnea, and a difference of 3 has no clinical significance. The clinician should therefore assess the clinical significance of results as well as their statistical significance; this helps determine whether the results are clinically useful.
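The sleep apnea example can be made concrete with a quick calculation. The group means (46 versus 49) come from the text above; the standard deviation of 10 and the group sizes are assumptions chosen purely for illustration, and a normal approximation stands in for a proper t-test.

```python
import math

def two_sided_p(mean1, mean2, sd, n):
    """Two-sided p-value for a difference of two means, assuming
    equal SDs and equal group sizes (normal approximation)."""
    se = math.sqrt(2 * sd ** 2 / n)     # standard error of the difference
    z = abs(mean1 - mean2) / se
    return math.erfc(z / math.sqrt(2))  # two-sided tail probability

# The same 3-point AHI difference, at two very different sample sizes:
p_small = two_sided_p(46, 49, sd=10, n=20)   # small trial per group
p_large = two_sided_p(46, 49, sd=10, n=500)  # very large trial per group
print(f"n=20 per group:  p = {p_small:.3f}")
print(f"n=500 per group: p = {p_large:.2e}")
```

With 500 patients per group the trivial difference sails past P<.05, even though both groups remain severely apneic, which is exactly the distinction between statistical and clinical significance.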
Another measure of the clinical impact of an intervention is the number needed to treat (NNT). In an RCT, the NNT is the number of patients who would need to receive a particular therapy to prevent one additional negative outcome. This is a helpful statistical calculation for understanding the relative magnitude of differences in outcome, and it gives a sense of the population impact of changes in treatment.
successful and definitive than needle aspiration alone. The attending faculty
physician suggests that the resident team perform a literature search to look
for evidence on each potential treatment.
An online literature search using MEDLINE identifies a published evidence-based review of PTA treatment by Johnson and colleagues [21].
The resident team reviews this article: three published RCTs (Level 1 evidence) have compared I&D versus needle aspiration. None of these studies provided statistical analysis, and the reported success rates were nearly the same in all studies. The review pooled the data from those studies; the success rate was approximately 93.7% for I&D and 91.6% for needle aspiration, and the difference was not statistically significant. An NNT analysis from the review showed that 48 patients would need to undergo I&D to spare one patient a treatment failure from needle aspiration. Overall, the grade of this evidence was C, even though there were some limited RCTs.
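The review's NNT figure can be reproduced from the pooled success rates quoted above: the absolute risk reduction is the difference between the two failure rates, and the NNT is its reciprocal.

```python
def number_needed_to_treat(success_rx, success_control):
    """NNT = 1 / absolute risk reduction, computed here from the
    failure rates implied by two success proportions."""
    arr = (1 - success_control) - (1 - success_rx)
    return 1 / arr

# Pooled rates from the peritonsillar abscess review:
# 93.7% success for I&D versus 91.6% for needle aspiration.
nnt = number_needed_to_treat(0.937, 0.916)
print(round(nnt))  # -> 48
```

A 2.1% absolute difference yields an NNT of about 48, matching the figure reported in the review and underscoring how modest the advantage of I&D is at the population level.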
After the findings are discussed, the participants agree that the evidence is
appropriate for their patient, and that either needle aspiration or I&D
would be appropriate treatment. Both options are presented to the patient,
who then relates his significant fear of needles and expresses his preference
for I&D. This is successfully performed.
This example illustrates the EBM process. A clinical question was identified in the resident clinic and presented to the supervising faculty. A literature search quickly identified a published review that addressed the desired question, which was then reviewed in depth. This information was then considered again in the context of the individual patient. Although the less invasive option (needle aspiration) was offered first, the patient's preference directed that an I&D be performed. Although this example is not one of the formal structured teaching strategies discussed previously, it shows how adequate resources, sufficient time in a resident clinic, and attending motivation can allow the EBM process to occur when clinical questions arise, all while resulting in good patient care.
Summary
EBM is rapidly becoming a cornerstone of practicing good medicine. It brings the best available evidence, together with the experience of clinicians and experts, to inform the clinical decisions faced by practitioners every day. Practicing good EBM is a structured intellectual process. Although EBM is taught in medical schools, the skills of medical graduates in applying it are generally inadequate. It falls to residency training to develop and refine these skills.
EBM should always begin with a question about an individual patient, which is restructured into a readily answerable form. An appropriate review of the literature is then performed through database searches and critical review of journal articles; reviewing published evidence-based guidelines and reviews is a possible alternative. This evidence is summarized and ultimately combined with one's clinical experience and patient factors to render appropriate treatment.
Teaching this process faces many potential barriers at the institutional, resident, and attending physician levels; however, EBM can be successfully taught in a residency training program. EBM is probably best taught by example, from faculty physicians. The residents in training do not need to see every step of the process every time, but they should see the end result: knowledge of the best available evidence, and integration of that knowledge into the care of individual patients.
References
[1] Sackett DL, Rosenberg WM, Gray JA, et al. Evidence based medicine: what it is and what it
isn’t. BMJ 1996;312(7023):71–2.
[2] Eddy DM. Evidence-based medicine: a unified approach. Health Aff (Millwood) 2005;24(1):
9–17.
[3] Anderson GL, Limacher M, Assaf AR, et al. Effects of conjugated equine estrogen in post-
menopausal women with hysterectomy: the Women’s Health Initiative randomized con-
trolled trial. JAMA 2004;291(14):1701–12.
[4] Bailey BJ, Johnson JT, Newlands SD. Head & neck surgery–otolaryngology. 4th edition. Philadelphia: Lippincott Williams & Wilkins; 2006. p. 33–40.
[5] Smith GC, Pell JP. Parachute use to prevent death and major trauma related to gravitational
challenge: systematic review of randomised controlled trials. BMJ 2003;327(7429):1459–61.
[6] Caspi O, McKnight P, Kruse L, et al. Evidence-based medicine: discrepancy between per-
ceived competence and actual performance among graduating medical students. Med Teach
2006;28(4):318–25.
[7] Bhandari M, Montori V, Devereaux PJ, et al. Challenges to the practice of evidence-based
medicine during residents’ surgical training: a qualitative study using grounded theory.
Acad Med 2003;78(11):1183–90.
[8] UpToDate. [Electronic online resource]. Available at: http://www.utdol.com. Accessed
June, 2007.
[9] Sackett DL. Evidence-based medicine: how to practice and teach EBM. 2nd edition. New
York: Churchill Livingstone; 2000.
[10] Stewart MG, Kuppersmith RB, Moore AS. Searching the medical literature on the internet.
Otolaryngol Clin North Am 2002;35(6):1163–74, v–vi.
[11] The Cochrane Library. Wiley InterScience. Available at: http://www.thecochranelibrary.com/. Accessed June, 2007.
[12] Diagnosis and treatment of acute bacterial rhinosinusitis. Summary, evidence report/tech-
nology assessment: Number 9, March 1999. Agency for Health Care Policy and Research.
Available at: http://www.ahrq.gov/clinic/epcsums/sinussum.htm. Accessed June, 2007.
[13] Benninger MS, Sedory Holzer SE, Lau J. Diagnosis and treatment of uncomplicated acute
bacterial rhinosinusitis: summary of the Agency for Health Care Policy and Research evi-
dence-based report. Otolaryngol Head Neck Surg 2000;122(1):1–7.
[14] Rosenfeld RM, Casselbrant ML, Hannley MT. Implications of the AHRQ evidence report
on acute otitis media. Otolaryngol Head Neck Surg 2001;125(5):440–8 [discussion: 439].
[15] Sidorov J. How are internal medicine residency journal clubs organized, and what makes
them successful? Arch Intern Med 1995;155(11):1193–7.
[16] Heiligman RM, Wollitzer AO. A survey of journal clubs in U.S. family practice residencies.
J Med Educ 1987;62(11):928–31.
[17] Green ML. Evidence-based medicine training in graduate medical education: past, present
and future. J Eval Clin Pract 2000;6(2):121–38.
[18] Haines SJ, Nicholas JS. Teaching evidence-based medicine to surgical subspecialty residents.
J Am Coll Surg 2003;197(2):285–9.
[19] McAlister FA, Straus SE, Guyatt GH, et al. Users’ guides to the medical literature: XX.
Integrating research evidence with the care of the individual patient. Evidence-Based Med-
icine Working Group. JAMA 2000;283(21):2829–36.
[20] CEBM (Center for Evidence Based Medicine). "Levels of Evidence." Available at: http://www.cebm.net/index.aspx?o=1047. Accessed June, 2007.
[21] Johnson RF, Stewart MG, Wright CC. An evidence-based review of the treatment of peri-
tonsillar abscess. Otolaryngol Head Neck Surg 2003;128(3):332–43.