
CONSENSUS STATEMENT

Meta-analysis of Observational Studies in Epidemiology
A Proposal for Reporting
Donna F. Stroup, PhD, MSc; Jesse A. Berlin, ScD; Sally C. Morton, PhD; Ingram Olkin, PhD; G. David Williamson, PhD; Drummond Rennie, MD; David Moher, MSc; Betsy J. Becker, PhD; Theresa Ann Sipe, PhD; Stephen B. Thacker, MD, MSc; for the Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Group

Objective: Because of the pressure for timely, informed decisions in public health and clinical practice and the explosion of information in the scientific literature, research results must be synthesized. Meta-analyses are increasingly used to address this problem, and they often evaluate observational studies. A workshop was held in Atlanta, Ga, in April 1997, to examine the reporting of meta-analyses of observational studies and to make recommendations to aid authors, reviewers, editors, and readers.

Participants: Twenty-seven participants were selected by a steering committee, based on expertise in clinical practice, trials, statistics, epidemiology, social sciences, and biomedical editing. Deliberations of the workshop were open to other interested scientists. Funding for this activity was provided by the Centers for Disease Control and Prevention.

Evidence: We conducted a systematic review of the published literature on the conduct and reporting of meta-analyses in observational studies using MEDLINE, Educational Resources Information Center (ERIC), PsycLIT, and the Current Index to Statistics. We also examined reference lists of the 32 studies retrieved and contacted experts in the field. Participants were assigned to small-group discussions on the subjects of bias, searching and abstracting, heterogeneity, study categorization, and statistical methods.

Consensus Process: From the material presented at the workshop, the authors developed a checklist summarizing recommendations for reporting meta-analyses of observational studies. The checklist and supporting evidence were circulated to all conference attendees and additional experts. All suggestions for revisions were addressed.

Conclusions: The proposed checklist contains specifications for reporting of meta-analyses of observational studies in epidemiology, including background, search strategy, methods, results, discussion, and conclusion. Use of the checklist should improve the usefulness of meta-analyses for authors, reviewers, editors, readers, and decision makers. An evaluation plan is suggested and research areas are explored.

JAMA. 2000;283:2008-2012

Author Affiliations: Centers for Disease Control and Prevention, Atlanta, Ga (Drs Stroup, Williamson, and Thacker); University of Pennsylvania School of Medicine, Philadelphia (Dr Berlin); RAND Corporation, Santa Monica (Dr Morton), University of California, San Francisco (Dr Rennie), and Stanford University, Stanford (Dr Olkin), Calif; JAMA, Chicago, Ill (Dr Rennie); Thomas C. Chalmers Centre for Systematic Reviews, Children's Hospital of Eastern Ontario Research Institute, Ottawa (Mr Moher); Michigan State University, East Lansing (Dr Becker); and Georgia State University, Atlanta (Dr Sipe). A complete list of members of the MOOSE Group appears at the end of this article.

Corresponding Author and Reprints: Donna F. Stroup, PhD, MSc, Centers for Disease Control and Prevention, 1600 Clifton Rd NE, Mail Stop C08, Atlanta, GA 30333 (e-mail: dfs2@cdc.gov).

Because of pressure for timely and informed decisions in public health and medicine and the explosion of information in the scientific literature, research results must be synthesized to answer urgent questions.1-4 Principles of evidence-based methods to assess the effectiveness of health care interventions and set policy are cited increasingly.5 Meta-analysis, a systematic approach to identifying, appraising, synthesizing, and (if appropriate) combining the results of relevant studies to arrive at conclusions about a body of research, has been applied with increasing frequency to randomized controlled trials (RCTs), which are considered to provide the strongest evidence regarding an intervention.6,7

However, in many situations randomized controlled designs are not feasible, and only data from observational studies are available.8 Here, we define an observational study as an etiologic or effectiveness study using data from an existing database, a cross-sectional study, a case series, a case-control design, a design with historical controls, or a cohort design.9 Observational designs may lack the experimental element of a random allocation to an intervention and rely on studies of association between changes or differences in 1 characteristic (eg, an exposure or intervention) and changes or differences in an outcome of interest.


These designs have long been used in the evaluation of educational programs10 and exposures that might cause disease or injury.11 Studies of risk factors generally cannot be randomized because they relate to inherent human characteristics or practices, and exposing subjects to harmful risk factors is unethical.12 At times, clinical data may be summarized in order to design a randomized comparison.13 Observational data may also be needed to assess the effectiveness of an intervention in a community as opposed to the special setting of a controlled trial.14 Thus, a clear understanding of the advantages and limitations of statistical syntheses of observational data is needed.15

Although meta-analysis restricted to RCTs is usually preferred to meta-analyses of observational studies,16-18 the number of published meta-analyses concerning observational studies in health has increased substantially during the past 4 decades (678 in 1955-1992, 525 in 1992-1995, and more than 400 in 1996 alone).19

While guidelines for meta-analyses have been proposed, many are written from the meta-analyst's (author's) rather than from the reviewer's, editor's, or reader's perspective20 and restrict attention to reporting of meta-analyses of RCTs.21,22 Meta-analyses of observational studies present particular challenges because of inherent biases and differences in study designs23; yet, they may provide a tool for helping to understand and quantify sources of variability in results across studies.24

We describe here the results of a workshop held in Atlanta, Ga, in April 1997, to examine concerns regarding the reporting of Meta-analysis Of Observational Studies in Epidemiology (MOOSE). This article summarizes the deliberations of the 27 participants (the MOOSE group) and the evidence leading to recommendations regarding the reporting of meta-analyses. Meta-analysis of individual-level data from different studies, sometimes called "pooled analysis" or "meta-analysis of individual patient data,"25,26 has unique challenges that we will not address here. We propose a checklist of items for reporting that builds on similar activities for RCTs22 and is intended for use by authors, reviewers, editors, and readers of meta-analyses of observational studies.

METHODS

We conducted a systematic review of the published literature on the conduct and reporting of meta-analyses in observational studies. Databases searched included MEDLINE, Educational Resources Information Center, PsycLIT (http://www.wesleyan.edu/libr), and the Current Index to Statistics. In addition, we examined reference lists and contacted experts in the field. We used the 32 articles retrieved to generate the conference agenda and set topics of bias, searching and abstracting, heterogeneity, study categorization, and statistical methods. We invited experts in meta-analysis from the fields of clinical practice, trials, statistics, epidemiology, social sciences, and biomedical editing.

The workshop included an overview of the quality of reporting of meta-analyses in education and the social sciences. Plenary talks were given on the topics set by the conference agenda. For each of 2 sessions, workshop participants were assigned to 1 of 5 small discussion groups, organized around the topic areas. For each group, 1 of the authors served as facilitator, and a recorder summarized points of discussion for issues to be presented to all participants. Time was provided for the 2 recorders and 2 facilitators for each topic to meet and prepare plenary presentations given to the entire group.

We proposed a checklist for meta-analyses of observational studies based on the deliberations of the independent groups. Finally, we circulated the checklist for comment to all conference attendees and representatives of several constituencies who would use the checklist.

RESULTS

The checklist resulting from workgroup deliberations is organized around recommendations for reporting background, search strategy, methods, results, discussion, and conclusions (TABLE).

Background

Reporting of the background should include the definition of the problem under study, statement of hypothesis, description of the study outcome(s) considered, type of exposure or intervention used, type of study design used, and complete description of the study population. When combining observational studies, heterogeneity of populations (eg, US vs international studies), design (eg, case-control vs cohort studies), and outcome (eg, different studies yielding different relative risks that cannot be accounted for by sampling variation) is expected.8

Search

Reporting of the search strategy should include qualifications of the searchers, specification of databases used, search strategy and index terms, use of any special features (eg, "explosion"), search software used, use of hand searching and contact with authors, use of materials in languages other than English, use of unpublished material, and exclusion criteria used. Published research shows that use of electronic databases may find only half of all relevant studies, and contacting authors may be useful,27 although this result may not be true for all topic areas.28 For example, a meta-analysis of depression in elderly medical inpatients29 used 2 databases for the search. In addition, bibliographies of retrieved papers were searched. However, the authors did not report their search strategy in enough detail to allow replication. An example of a thorough "reject log" can be found in the report of a meta-analysis of electrical and magnetic field exposure and leukemia.30 Examples of a table characterizing studies included can be found in Franceschi et al31 and Saag et al.32 Complete specification of search strategy is not uniform; a review of 103 published meta-analyses in education showed that search procedures were described inadequately in the majority of the articles.10

Methods

Items in this checklist section are concerned with the appropriateness of any quantitative summary of the data; degree to which coding of data from the articles was specified and objective; assessment of confounding, study quality, and heterogeneity; use of statistical methods; and display of results. Empirical evidence shows that reporting of procedures for classification and coding and of quality assessment is often incomplete: fewer than half of the meta-analyses reported details of classifying and coding the primary study data, and only 22% assessed quality of the primary studies.10

We recognize that the use of quality scoring in meta-analyses of observational studies is controversial, as it is for RCTs,16,33 because scores constructed in an ad hoc fashion may lack demonstrated validity, and results may not be associated with quality.34 Nevertheless, some particular aspects of study quality have been shown to be associated with effect, eg, adequate concealment of allocation in randomized trials.35 Thus, key components of design, rather than aggregate scores themselves, may be important. For example, in a study of blinding (masking) of readers participating in meta-analyses, masking essentially made no difference in the summary odds ratios across the 5 meta-analyses.36 We recommend the reporting of quality scoring if it has been done and also recommend subgroup or sensitivity analysis rather than using quality scores as weights in the analysis.37,38

While some control over heterogeneity of design may be accomplished through the use of exclusion rules, we recommend using broad inclusion criteria for studies and then performing analyses relating design features to outcome.8 In cases when heterogeneity of outcomes is particularly problematic, a single summary measure may well be inappropriate.39 Analyses stratified by study feature, or regression analyses with design features as predictors, can be useful in assessing whether study outcomes indeed vary systematically with these features.40
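As one way to make the preceding recommendation concrete, the sketch below (not a method prescribed by the workshop) runs an inverse-variance weighted least-squares meta-regression of study effect sizes on a single design feature in Python; the effect sizes, standard errors, and cohort/case-control coding are invented for illustration.

```python
import numpy as np

# Hypothetical study-level data (invented for illustration only):
# log relative risks, their standard errors, and a design indicator
# (1 = cohort design, 0 = case-control design).
log_rr = np.array([0.10, 0.25, 0.40, 0.05, 0.55, 0.30])
se     = np.array([0.12, 0.20, 0.15, 0.18, 0.25, 0.10])
cohort = np.array([1,    0,    1,    0,    1,    0   ])

# Inverse-variance weighted least squares: regress effect sizes on the
# design feature (a simple fixed-effect meta-regression).
w = 1.0 / se**2
X = np.column_stack([np.ones_like(log_rr), cohort])   # intercept + design feature
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ log_rr)
cov = np.linalg.inv(X.T @ W @ X)                       # weights treated as known inverse variances
se_beta = np.sqrt(np.diag(cov))

print(f"Intercept (case-control studies): {beta[0]:.3f} (SE {se_beta[0]:.3f})")
print(f"Shift for cohort studies:         {beta[1]:.3f} (SE {se_beta[1]:.3f})")
```

A coefficient on the design indicator that is large relative to its standard error would suggest that study outcomes vary systematically with that feature, which argues for stratified reporting rather than a single pooled estimate.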
Investigating heterogeneity was a key feature of a meta-analysis of observational studies of asbestos exposure and risk of gastrointestinal cancer.41 The authors of the meta-analysis hypothesized that studies allowing for a latent period between the initiation of exposure and any increases in risk should show, on average, appropriately higher standardized mortality ratios than studies that ignored latency. In other words, the apparent effect of exposure would be attenuated by including the latent period in the calculation of time at risk (the "denominator"), since exposure-related deaths (the "numerator") would, by definition, not occur during that latent period (FIGURE).

In fact, the data suggested that studies allowing for latent periods found on average somewhat higher standardized mortality ratios than studies ignoring latency. This example shows that sources of bias and heterogeneity can be hypothesized prior to analysis and subsequently confirmed by the analysis.

Results

Recommendations for reporting of results include graphical summaries of study estimates and any combined estimate, a table listing descriptive information for each study, results of sensitivity testing and any subgroup analysis, and an indication of statistical uncertainty of findings.

Table. A Proposed Reporting Checklist for Authors, Editors, and Reviewers of Meta-analyses of Observational Studies

Reporting of background should include
  Problem definition
  Hypothesis statement
  Description of study outcome(s)
  Type of exposure or intervention used
  Type of study designs used
  Study population

Reporting of search strategy should include
  Qualifications of searchers (eg, librarians and investigators)
  Search strategy, including time period included in the synthesis and keywords
  Effort to include all available studies, including contact with authors
  Databases and registries searched
  Search software used, name and version, including special features used (eg, explosion)
  Use of hand searching (eg, reference lists of obtained articles)
  List of citations located and those excluded, including justification
  Method of addressing articles published in languages other than English
  Method of handling abstracts and unpublished studies
  Description of any contact with authors

Reporting of methods should include
  Description of relevance or appropriateness of studies assembled for assessing the hypothesis to be tested
  Rationale for the selection and coding of data (eg, sound clinical principles or convenience)
  Documentation of how data were classified and coded (eg, multiple raters, blinding, and interrater reliability)
  Assessment of confounding (eg, comparability of cases and controls in studies where appropriate)
  Assessment of study quality, including blinding of quality assessors; stratification or regression on possible predictors of study results
  Assessment of heterogeneity
  Description of statistical methods (eg, complete description of fixed or random effects models, justification of whether the chosen models account for predictors of study results, dose-response models, or cumulative meta-analysis) in sufficient detail to be replicated
  Provision of appropriate tables and graphics

Reporting of results should include
  Graphic summarizing individual study estimates and overall estimate
  Table giving descriptive information for each study included
  Results of sensitivity testing (eg, subgroup analysis)
  Indication of statistical uncertainty of findings

Reporting of discussion should include
  Quantitative assessment of bias (eg, publication bias)
  Justification for exclusion (eg, exclusion of non-English-language citations)
  Assessment of quality of included studies

Reporting of conclusions should include
  Consideration of alternative explanations for observed results
  Generalization of the conclusions (ie, appropriate for the data presented and within the domain of the literature review)
  Guidelines for future research
  Disclosure of funding source
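To make the checklist items on statistical methods and heterogeneity concrete, the following minimal Python sketch (an illustration, not part of the MOOSE recommendations) computes an inverse-variance fixed-effect summary, Cochran's Q and I2, and a DerSimonian-Laird random-effects summary from hypothetical log relative risks and standard errors.

```python
import numpy as np

def pool_log_rr(log_rr, se):
    """Inverse-variance fixed-effect and DerSimonian-Laird random-effects
    pooling of study-level log relative risks (illustrative sketch)."""
    w = 1.0 / se**2                                    # fixed-effect weights
    fixed = np.sum(w * log_rr) / np.sum(w)
    se_fixed = np.sqrt(1.0 / np.sum(w))

    # Cochran's Q and I^2 quantify between-study heterogeneity.
    q = np.sum(w * (log_rr - fixed) ** 2)
    df = len(log_rr) - 1
    i2 = 100 * max(0.0, (q - df) / q) if q > 0 else 0.0

    # DerSimonian-Laird moment estimate of the between-study variance tau^2.
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
    rand = np.sum(w_re * log_rr) / np.sum(w_re)
    se_rand = np.sqrt(1.0 / np.sum(w_re))
    return fixed, se_fixed, rand, se_rand, q, i2, tau2

# Hypothetical data (invented for illustration only).
log_rr = np.array([0.18, 0.41, -0.05, 0.30, 0.22])
se     = np.array([0.15, 0.22,  0.20, 0.12, 0.18])

fixed, se_f, rand, se_r, q, i2, tau2 = pool_log_rr(log_rr, se)
print(f"Fixed-effect RR:   {np.exp(fixed):.2f} "
      f"(95% CI, {np.exp(fixed - 1.96*se_f):.2f}-{np.exp(fixed + 1.96*se_f):.2f})")
print(f"Random-effects RR: {np.exp(rand):.2f} "
      f"(95% CI, {np.exp(rand - 1.96*se_r):.2f}-{np.exp(rand + 1.96*se_r):.2f})")
print(f"Q = {q:.2f}, I^2 = {i2:.0f}%, tau^2 = {tau2:.3f}")
```

Reporting both summaries, together with Q, I2, and tau2, lets readers judge whether a single combined estimate is defensible or whether heterogeneity should instead be explored through the stratified or regression analyses discussed above.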
Figure. Effect of Latent Period on Heterogeneity. [Schematic timeline: a latent period follows the start of exposure, during which exposure-related deaths do not occur; once the at-risk period begins, exposure-related deaths occur during the time at risk.]

Discussion

The discussion should include issues related to bias, including publication bias, confounding, and quality. Bias can occur in the original studies (resulting from flaws in the study design that tend to distort the magnitude or direction of associations in the data) or from the way in which studies are selected for inclusion.42 Publication bias, the selective publication of studies based on the magnitude (usually larger) and direction of their findings, represents a particular threat to the validity of meta-analysis of observational studies.43-45 Thorough specifications of quality assessment can contribute to understanding some of the variations in the observational studies themselves. Methods should be used to aid in the detection of publication bias, eg, fail-safe procedures or funnel plots.46 Schlesselman47 comments on such biases in assessing the possible association between endometrial cancer and oral contraceptives. This meta-analysis combined both cohort and case-control studies and used a sensitivity analysis to illustrate the influence of specific studies, such as those published in English.
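As a hedged illustration of the graphical checks mentioned above, the Python sketch below draws a funnel plot of hypothetical study estimates against their standard errors, with pseudo 95% confidence limits around a fixed-effect summary; asymmetry in such a plot is only suggestive of publication bias, not proof of it.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical study estimates (invented for illustration only).
log_rr = np.array([0.35, 0.10, 0.52, 0.28, -0.05, 0.41, 0.15, 0.60])
se     = np.array([0.30, 0.12, 0.35, 0.18,  0.10, 0.28, 0.15, 0.40])

# Fixed-effect summary used as the center line of the funnel.
w = 1.0 / se**2
pooled = np.sum(w * log_rr) / np.sum(w)

fig, ax = plt.subplots()
ax.scatter(log_rr, se)
ax.axvline(pooled, linestyle="--", label="Pooled estimate")

# Pseudo 95% confidence limits fan out as the standard error grows,
# tracing the funnel expected in the absence of publication bias.
se_grid = np.linspace(0.01, 1.1 * se.max(), 100)
ax.plot(pooled - 1.96 * se_grid, se_grid, color="gray")
ax.plot(pooled + 1.96 * se_grid, se_grid, color="gray")

ax.invert_yaxis()                      # most precise studies plotted at the top
ax.set_xlabel("Log relative risk")
ax.set_ylabel("Standard error")
ax.legend()
plt.show()
```

Quantitative counterparts, such as fail-safe N calculations or regression-based asymmetry tests, can complement the visual check, but they share the interpretive limitations discussed in this section.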
Conclusion

Because of these biases in observational studies, the conclusion of the report should contain consideration of alternative explanations for observed results and appropriate generalizations of the conclusion. A carefully conducted meta-analysis can reveal areas warranting further research. Finally, since funding source has been shown to be an important source of heterogeneity,48 the sponsoring organization should be disclosed and any effect on the analysis should be examined.

COMMENT

Taking stock of what is known in any field involves reviewing the existing literature, summarizing it in appropriate ways, and exploring the implications of heterogeneity of population and study for heterogeneity of study results. Meta-analysis provides a systematic way of performing this research synthesis, while indicating when more research is necessary.

The application of formal meta-analytic methods to observational studies has been controversial.42 One reason for this has been that potential biases in the original studies, relative to the biases in RCTs, make the calculation of a single summary estimate of effect of exposure potentially misleading. Similarly, the extreme diversity of study designs and populations in epidemiology makes the interpretation of simple summaries problematic, at best. In addition, methodologic issues related specifically to meta-analysis, such as publication bias, could have particular impact when combining results of observational studies.44,47

Despite these challenges, meta-analyses of observational studies continue to be one of the few methods for assessing efficacy and effectiveness and are being published in increasing numbers. Our goal is to improve the reporting of these meta-analyses so that readers can understand what was done in a given analysis, who did it, and why it was done. If bias is a problem, we suggest that an informative approach is to use broad inclusion criteria for studies and then to perform analyses (when the data permit) relating suspected sources of bias and variability to study findings.

Methodologic and interpretational concerns make the clear and thorough reporting of meta-analyses of observational studies absolutely essential. Our workshop was convened to address the problem of increasing diversity and variability that exist in reporting meta-analyses of observational studies. In constructing the checklist, we have attempted, where possible, to provide references to literature justifying the inclusion of particular items.

Assessment of the usefulness of recommendations for reporting is dependent on a well-designed and effectively conducted evaluation. The workshop participants proposed a 3-pronged approach to determine the usefulness and implementation of these recommendations.

First, further comments should be incorporated into revisions of the checklist, to ensure its usefulness to journal reviewers and editors. The US Food and Drug Administration (FDA) receives and reviews petitions and applications for approval of regulated products and/or their labeling. The FDA's Center for Food Safety and Applied Nutrition is now receiving applications that use results of meta-analyses in support of the requested action. The revised checklist should be tested during the review of an application. One might randomly assign FDA reviewers who encounter systematic reviews of observational studies to use the checklist or not. Since the requirements for reporting for regulatory purposes might not completely coincide with those in the checklist and since the sample size (the number of formal systematic reviews received by the FDA) might be small, this evaluation should document any potential incompatibility between requirements for regulatory reporting and the checklist.

Second, we will work with the Cochrane Collaboration to promote the use of these recommendations by Cochrane collaborative review groups.49 Members of the Cochrane Collaboration are involved routinely in performing systematic reviews. Some are now incorporating nonrandomized studies out of necessity. A trial of use of the checklist could be compared with the FDA experience.

Third, an evaluation of the checklist by authors, reviewers, readers, and editors could compare objective measures of the quality of articles written with and without the formal use of the guidelines. A challenge to the use of quality measures would be arriving at a valid measure of quality. A more important end point for trials in journals is process measures.

Questions of interest include whether the use of the checklist makes preparation and evaluation of manuscripts easier or is otherwise helpful. Again, defining the constructs of interest presents crucial challenges to this research.

Less formal evaluations, based on comments from users in any of the above groups, would certainly be helpful as well. One would need to be concerned about contamination of the control groups when evaluating the checklist, as journals, for example, might adopt the checklist even in the absence of evidence of its efficacy from randomized trials.

In conclusion, the conference participants noted that meta-analyses are themselves observational studies, even when applied to RCTs.50 If a role for meta-analyses of observational studies in setting policy is to be achieved,51 standards of reporting must be maintained to allow proper evaluation of the quality and completeness of meta-analyses.

The Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Group: Centers for Disease Control and Prevention, Atlanta, Ga: Gary Jeng, PhD, Rob Lyerla, PhD, Thomas Peterman, MD, Donna F. Stroup, PhD, MSc, Stephen B. Thacker, MD, MSc, and G. David Williamson, PhD; University of Pennsylvania School of Medicine, Philadelphia: Jesse A. Berlin, ScD; RAND Corporation, Santa Monica, Calif: Sally C. Morton, PhD; Stanford University, Stanford, Calif: Ingram Olkin, PhD; University of California, San Francisco, and JAMA, Chicago, Ill: Drummond Rennie, MD; Thomas C. Chalmers Centre for Systematic Reviews, Children's Hospital of Eastern Ontario Research Institute, Ottawa: David Moher, MSc; Michigan State University, East Lansing: Betsy J. Becker, PhD; Georgia State University, Atlanta: Theresa Ann Sipe, PhD; Centre for Statistics in Medicine, Oxford, England: Douglas Altman, PhD; Memorial Sloan-Kettering Cancer Center, New York, NY: Colin Begg, PhD; Hamilton Wentworth Regional Public Health Department, Hamilton, Ontario: Larry Chambers, PhD; Harvard Medical School and Brigham and Women's Hospital, Boston, Mass: Graham Colditz, PhD; University of Maryland, Baltimore: Kay Dickersin, PhD; AT&T Labs, Murray Hill, NJ: William DuMouchel, PhD; University of Colorado, Denver: Karen Kafadar, PhD; Cleveland, Ohio: Tom Lang, MA; Food and Drug Administration, Washington, DC: Lynn Larsen, PhD; University of Minnesota, Minneapolis: Thomas A. Louis, PhD; University of British Columbia, Vancouver: Parminder Raina, PhD; University of Pittsburgh, Pittsburgh, Pa: Allan Sampson, PhD; Family Health International, Research Triangle Park, NC: Ken Schulz, PhD, MSc; and Institute of Medicine, Washington, DC: Mike Stoto, PhD.

Acknowledgment: We thank Susan Eastwood, MA, University of California, San Francisco; Christine Friedenreich, PhD, Alberta Cancer Board and University of Calgary, Calgary; Sander Greenland, DrPH, CStat, MA, MS, University of California, Los Angeles; Richard Horton, MBBCh, MD, Lancet, London, England; Diana Petitti, MD, MPH, Southern California Kaiser Permanente Medical Care Program, Pasadena; Duncan Saunders, MBBCh, PhD, University of Alberta, Edmonton; and Stephen D. Walter, PhD, McMaster University, Hamilton, Ontario, for their advice and substantive comments on an earlier draft of this article. In addition, we thank Barbara McDonnell, CDC, for managing the complex logistics of this project.

REFERENCES

1. Greenland S. Quantitative methods in the review of epidemiologic literature. Epidemiol Rev. 1987;9:1-30.
2. Chalmers TC, Lau J. Meta-analytic stimulus for changes in clinical trials. Stat Methods Med Res. 1993;2:161-172.
3. Badgett RG, O'Keefe M, Henderson MC. Using systematic reviews in clinical education. Ann Intern Med. 1997;126:886-891.
4. Ohlsson A. Systematic reviews. Scand J Clin Lab Invest Suppl. 1994;219:25-32.
5. Bero LA, Jadad AR. How consumers and policymakers can use systematic reviews for decision making. Ann Intern Med. 1997;127:37-42.
6. Thacker SB. Meta-analysis. JAMA. 1988;259:1685-1689.
7. Petitti D. Meta-Analysis, Decision Analysis, and Cost Effectiveness Analysis. New York, NY: Oxford University Press; 1994.
8. Berlin JA. Invited commentary. Am J Epidemiol. 1995;142:383-387.
9. Peipert JF, Phipps MG. Observational studies. Clin Obstet Gynecol. 1998;41:235-244.
10. Sipe TA, Curlette WL. A meta-synthesis of factors related to educational achievement. Int J Educ Res. 1997;25:583-598.
11. Ioannidis JP, Lau J. Pooling research results. Jt Comm J Qual Improv. 1999;25:462-469.
12. Lipsett M, Campleman S. Occupational exposure to diesel exhaust and lung cancer: a meta-analysis. Am J Public Health. 1999;89:1009-1017.
13. Vickers A, Cassileth B, Ernst E, et al. How should we research unconventional therapies? Int J Technol Assess Health Care. 1997;13:111-121.
14. Mann CC. Can meta-analysis make policy? Science. 1994;266:960-962.
15. Blettner M, Sauerbrei W, Schlehofer B, et al. Traditional reviews, meta-analyses and pooled analyses in epidemiology. Int J Epidemiol. 1999;28:1-9.
16. Greenland S. Invited commentary. Am J Epidemiol. 1994;140:290-296.
17. Lau J, Ioannidis JP, Schmid CH. Summing up evidence. Lancet. 1998;351:123-127.
18. Shapiro S. Meta-analysis/shmeta-analysis. Am J Epidemiol. 1994;140:771-778.
19. Stroup DF, Thacker SB, Olson CM, Glass RM. Characteristics of meta-analyses submitted to a medical journal. From: International Congress on Biomedical Peer Review and Global Communications; September 17-21, 1997; Prague, Czech Republic.
20. Lang TA, Secic M. How to Report Statistics in Medicine. Philadelphia, Pa: American College of Physicians; 1997.
21. Cook DJ, Sackett DL, Spitzer WO. Methodologic guidelines for systematic reviews of randomized control trials in health care from the Potsdam consultation on meta-analysis. J Clin Epidemiol. 1995;48:167-171.
22. Moher D, Cook DJ, Eastwood S, et al. Improving the quality of reports of meta-analyses of randomised controlled trials. Lancet. 1999;354:1896-1900.
23. Huston P. Health services research. CMAJ. 1996;155:1697-1702.
24. Egger M, Schneider M, Davey-Smith G. Meta-analysis. BMJ. 1998;316:140-144.
25. Stewart LA, Parmar MK. Meta-analysis of the literature or of individual patient data? Lancet. 1993;341:418-422.
26. Steinberg K, Smith SF, Lee N, et al. Comparison of effect estimates from a meta-analysis of summary data from published studies and from a meta-analysis using individual patient data for ovarian cancer studies. Am J Epidemiol. 1997;145:917-925.
27. McManus RJ, Wilson S, Delaney BC, et al. Review of the usefulness of contacting other experts when conducting a literature search for systematic reviews. BMJ. 1998;317:1562-1563.
28. Hetherington J, Dickersin K, Chalmers I, Meinert CL. Retrospective and prospective identification of unpublished controlled trials. Pediatrics. 1989;84:374-380.
29. Cole MG, Bellavance F. Depression in elderly medical inpatients. CMAJ. 1997;157:1055-1060.
30. Kheifets LI, Afifi AA, Buffler PA, et al. Occupational electric and magnetic field exposure and leukemia. J Occup Environ Med. 1997;39:1074-1091.
31. Franceschi S, La Vecchia C, Talamini R. Oral contraceptives and cervical neoplasia. Tumori. 1986;72:21-30.
32. Saag KG, Criswell LA, Sems KM, et al. Low-dose corticosteroids in rheumatoid arthritis. Arthritis Rheum. 1996;39:1818-1825.
33. Emerson JD, Burdick E, Hoaglin DC, et al. An empirical study of the possible relation of treatment differences to quality scores in controlled randomized clinical trials. Control Clin Trials. 1990;11:339-352.
34. Jüni P, Witschi A, Bloch R, Egger M. The hazards of scoring the quality of clinical trials for meta-analysis. JAMA. 1999;282:1054-1060.
35. Schulz KF, Chalmers I, Hayes RJ, Altman DG. Empirical evidence of bias. JAMA. 1995;273:408-412.
36. Berlin JA, for the University of Pennsylvania Meta-analysis Blinding Study Group. Does blinding of readers affect the results of meta-analyses? Lancet. 1997;350:185-186.
37. Hasselblad V, Eddy DM, Kotchmar DJ. Synthesis of environmental evidence. J Air Waste Manag Assoc. 1992;42:662-671.
38. Friedenreich CM, Brant RF, Riboli E. Influence of methodologic factors in a pooled analysis of 13 case-control studies of colorectal cancer and dietary fiber. Epidemiology. 1994;5:66-67.
39. Berlin JA, Rennie D. Measuring the quality of trials. JAMA. 1999;282:1083-1085.
40. Colditz GA, Burdick E, Mosteller F. Heterogeneity in meta-analysis of data from epidemiologic studies. Am J Epidemiol. 1995;142:371-382.
41. Frumkin H, Berlin J. Asbestos exposure and gastrointestinal malignancy: review and meta-analysis [published correction appears in Am J Ind Med. 1988;14:493]. Am J Ind Med. 1988;14:79-95.
42. Blettner M, Sauerbrei W, Schlehofer B, et al. Traditional reviews, meta-analyses and pooled analyses in epidemiology. Int J Epidemiol. 1999;28:1-9.
43. Rosenthal R. The file drawer problem and tolerance for null results. Psychol Bull. 1979;86:638-641.
44. Easterbrook PJ, Berlin JA, Gopalan R, Matthews DR. Publication bias in clinical research. Lancet. 1991;337:867-872.
45. Dickersin K, Min YI. NIH clinical trials and publication bias. Online J Curr Clin Trials [serial online]. 1993 Apr 28: Doc No 50.
46. Hedges LV, Olkin I. Statistical Methods for Meta-analysis. Boston, Mass: Academic Press; 1985.
47. Schlesselman JJ. Risk of endometrial cancer in relation to use of combined oral contraceptives. Hum Reprod. 1997;12:1851-1863.
48. Jadad A, Sullivan C, Luo D, et al. Patients' preferences for Turbuhaler or pressurized metered dose inhalers (pMDIs) in the treatment. From: Annual Meeting of the American Academy of Allergy, Asthma, and Immunology; March 3-8, 2000; San Diego, Calif.
49. Huston P. Cochrane Collaboration helping unravel tangled web woven by international research. CMAJ. 1996;154:1389-1392.
50. Moher D, Pham B, Jones A, et al. Does the quality of reports of randomised trials affect estimates of intervention efficacy reported in meta-analyses? Lancet. 1998;352:609-613.
51. Berlin JA, Colditz GA. The role of meta-analysis in the regulatory process for foods, drugs, and devices. JAMA. 1999;281:830-834.
