
REJECT ANALYSIS

A Low Tech Solution?

Andy Rogers
Head of Radiology Physics
Nottingham City Hospital NHS Trust
OVERVIEW

• Definition of terms
• Literature review
• Potential dose reduction
• Incidental issues
• Summary
• Lunch
DEFINITIONS

• A Reject – a film deemed useless & discarded, with another film being taken.
• A Repeat – a film retaken to provide extra/missing diagnostic information. Sent with the original for reporting.
DEFINITIONS

Reject Rate (%) = (# rejects / # examinations) × 100

Repeat Rate (%) = (# repeats / # examinations) × 100
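
A minimal sketch of these two formulas in code (illustrative only, not from the original slides):

    def reject_rate(n_rejects, n_exams):
        # Reject Rate (%) = (# rejects / # examinations) x 100
        return 100.0 * n_rejects / n_exams

    def repeat_rate(n_repeats, n_exams):
        # Repeat Rate (%) = (# repeats / # examinations) x 100
        return 100.0 * n_repeats / n_exams

    # e.g. 120 rejected films over 1200 examinations -> 10.0 %
    print(reject_rate(120, 1200))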
DATA COLLECTION
• Collect rejects & repeats for a defined time period
• Assign reasons for reject/repeat e.g.:
– exposure (over/under)
– positioning
– movement
– processing
DATA COLLECTION
• Assign reference data for analysis e.g.:
– date/time
– Operator ID code
– room
– exam type
– reject or repeat?
• Data to be collected depends on the output required from the reject/repeat analysis – when planning, specify the data analysis strategy first, so you know ‘up-front’ what data to collect (see the record sketch below)
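
A sketch of one record per reject/repeat, carrying the reference data listed above (the field names are my illustrative assumptions, not a prescribed schema):

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class RejectRecord:
        # one rejected or repeated film, with its reference data
        when: datetime        # date/time
        operator_id: str      # operator ID code
        room: str
        exam_type: str        # e.g. "Chest", "Knee"
        is_repeat: bool       # False = reject, True = repeat
        reason: str           # e.g. "exposure", "positioning", "movement", "processing"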
DATA ANALYSIS
• Depends on what you want!
• Single global figure
– quick & easy
– hides specific problems
• Rate by exam type, reason for reject etc
– more complicated data collection/analysis
– may be more productive in terms of useful results (see the sketch below)
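
A sketch of the ‘by exam type / by reason’ breakdown (it assumes the RejectRecord fields from the earlier sketch; the counting logic is the point, not the names):

    from collections import Counter

    def rates_by_exam(records, exams_per_type):
        # {exam_type: reject rate %}, given reject records and total exam counts per type
        rejects = Counter(r.exam_type for r in records if not r.is_repeat)
        return {exam: 100.0 * rejects[exam] / n for exam, n in exams_per_type.items()}

    def reasons_by_exam(records):
        # {exam_type: Counter of reasons}, to rank the main causes per exam type
        out = {}
        for r in records:
            out.setdefault(r.exam_type, Counter())[r.reason] += 1
        return out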
LITERATURE REVIEW

• Quick Medline search yields ~20 papers
• One review claims 11 reports of 48 studies
• Most recent studies tend to be comparative, e.g. film/CR comparison
• Older ones focus on the role of reject analysis in providing info for local quality improvement programs
LITERATURE REVIEW

• Reject rates quoted: 6.4% – 28%
• Reject rate reductions of 40% – 90%
• Reject rates of 5% – 10% for DGH-type organisations seem ‘satisfactory’
• Less in smaller & private organisations? [~3%]
Examples (1)

• ‘Analysis of an image quality assurance program’ (1985)
– QA led to a 45% reduction in reject rate!!
– 4.5% reduction in operational/maintenance costs!
– I deduce about a 10% reject rate prior to QA
Examples (2)
• ‘Comparative reject analysis in conventional film-screen & digital storage phosphor radiography’ (1999)
– 28% film-screen reject rate!!!
– 2.3% digital reject rate
– most film-screen rejects were exposure and processing issues
– most digital retakes were ‘positioning’ [hardly surprising!]
Examples (3)

• ‘Continuing reject-repeat film analysis program’ (1989)
– rate dropped from 15% to 6.4% over 9 years
– diminishing returns
Examples (last one)
• ‘X-ray film reject analysis as a quality
indicator’ (1998)
– Overall rate ~ 8%
– hid specifics of:
» knees = 26%
» all spines > 10%
» chests = 6.5%
– chest rate dominated the overall rate, chests being about 50% of total exams!
Examples (last one)

Exam Type   Reject/repeat rate (%)   Main reason      Second reason    Third reason
Chest                6.5             position 54%     exposure 29%     other 15%
Abdomen              4.0             position 47%     exposure 21%     other 7%
Knee                26.4             position 85%     other 9%         exposure 7%
Pelvis               5.9             position 52%     exposure 10%     other 10%
C Spine             10.0             exposure 44%     position 30%     other 18%
T Spine             18.6             exposure 67%     other 13%        movement 6%
L Spine             14.3             position 67%     other 18%        exposure 16%
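
The ‘hiding’ effect falls out of a simple weighted average. Apart from chests at roughly 50% of the workload (stated above), the exam-mix fractions below are illustrative assumptions only:

    # reject/repeat rate (%) per exam type, from the table above
    rates = {"Chest": 6.5, "Abdomen": 4.0, "Knee": 26.4, "Pelvis": 5.9,
             "C Spine": 10.0, "T Spine": 18.6, "L Spine": 14.3}

    # assumed share of total examinations (chests ~50% per the slide; the rest illustrative)
    mix = {"Chest": 0.50, "Abdomen": 0.18, "Knee": 0.05, "Pelvis": 0.10,
           "C Spine": 0.07, "T Spine": 0.04, "L Spine": 0.06}

    overall = sum(rates[e] * mix[e] for e in rates)
    print(f"Overall rate ~ {overall:.1f}%")   # ~8%, despite knees at 26%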
DOSE REDUCTION

• Model the dose reduction capabilities:
– Use NRPB Report W-4 (for exam frequencies & collective doses)
– Use the literature search to show general rates and rate reductions
– Use my local situation to see if it is any different
DOSE REDUCTION
• Consider plain film radiography only (~20% of collective dose, ~60% of exams)
• Input Parameters
– 10% reject rate (about right)
– 50% reject rate reduction (optimistic)
• Results
– < 1% collective dose reduction
– ~5% time saving in plain film radiography effort
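
A back-of-envelope reconstruction of those numbers (a sketch; it assumes each reject is retaken exactly once, using the shares and rates quoted above):

    plain_film_dose_share = 0.20   # plain film ~20% of collective dose (NRPB W-4)
    reject_rate = 0.10             # 10% reject rate
    reduction = 0.50               # 50% reduction in the reject rate (optimistic)

    films_before = 1 + reject_rate                   # 1.10 films per examination
    films_after = 1 + reject_rate * (1 - reduction)  # 1.05 films per examination
    film_saving = 1 - films_after / films_before     # ~4.5% fewer films taken

    print(f"Plain-film time/effort saving: {100 * film_saving:.1f}%")                   # ~5%
    print(f"Collective dose saving: {100 * film_saving * plain_film_dose_share:.1f}%")  # <1%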
Nottingham Dose Reduction
• Input Parameters
– local reject rates & doses
– 50% reject rate reduction
– 2002 examination frequencies/doses
• Results
– 4% time/cost savings
– spines/abdo/pelvis: 20% of rejects, 95% of dose savings (0.2 man.Sv)
– chests: 50% of rejects, 5% of dose savings
SIDE ISSUE
• Locally, 0.2 man.Sv potential reject saving annually, for:
– a few £k in project costs
– ongoing training, monitoring, device purchase etc.
• For comparison, BaE dose reduction project:
– £10k project costs (including publication/presentation)
– 0.3 man.Sv savings annually
SUMMARY
• It’s certainly ‘low tech’ & NOT the whole solution
• Its cost-effectiveness seems commensurate with other local projects
• Continuing reject analysis may yield diminishing returns, so maybe move from ‘specific’ to ‘global’ analysis as time passes?
• Provides useful info to local quality improvement programs
• However – is a ~1% collective UK dose saving worth the effort?
