
Food safety assurance and

veterinary public health

– volume 4 –

Towards a risk-based
chain control

edited by:
Frans J.M. Smulders
Towards a risk-based chain control
Dedicated to the memories of Prof. Dr. Roberto Chizzolini, Prof. Dr. Maurizio Severini and
Dr. Jos Snijders, founding fathers of the European College of Veterinary Public Health,
whose support in establishing ECVPH will not be forgotten.


Wageningen Academic Publishers
This work is subject to copyright. All rights are reserved, whether the whole or part of the
material is concerned. Nothing from this publication may be translated, reproduced, stored in
a computerised system or published in any form or in any manner, including electronic,
mechanical, reprographic or photographic, without prior written permission from the publisher,
Wageningen Academic Publishers, P.O. Box 220, 6700 AE Wageningen, the Netherlands.

ISBN: 978-90-76998-97-8
e-ISBN: 978-90-8686-583-3
DOI: 10.3920/978-90-8686-583-3

First published, 2006

© Wageningen Academic Publishers, The Netherlands, 2006

The individual contributions in this publication and any liabilities arising from them remain
the responsibility of the authors.

The publisher is not responsible for possible damages, which could be a result of content
derived from this publication.

Preface

In 2000 an Austrian/Irish consortium of scientists was awarded a considerable EU grant to
organise a series of conferences with the objective of systematically addressing public health
hazards, the prevention and control of which is a function of veterinary public health. The
format of the conference series [three consecutive events in Vienna (2001 and 2002) and
Dublin (2003)] was based on a longitudinal approach to food quality and safety
assurance programmes. For its primary substance the conference series relied on the input of
recognised senior experts engaged at the cutting edge of research on the safety of foods of
animal origin. In addition, the EU grants allowed for supporting the participation of more
junior colleagues embarking on a career in food safety assurance and veterinary public health.
The organisers were fortunate to find Wageningen Academic Publishers who were prepared
to include all written output in a carefully edited “Food Safety Assurance and Veterinary
Public Health” book series, which was to comprise three volumes: 1) Food Safety Assurance in
the Pre-Harvest Phase, 2) Safety Assurance during Food Processing and 3) Risk Management
Strategies: Monitoring and Surveillance.

The aforementioned conferences were held at the same time the European College of Veterinary
Public Health (ECVPH) came into being, and its founding fathers took the conference
series as an opportunity to hold ECVPH’s Annual General Meetings in parallel.
ECVPH is a veterinary specialty college with the primary objectives to advance veterinary
public health and its subspecialties population medicine and food science in Europe and
to increase the competence of those who are active in these fields by: i) establishing
guidelines for postgraduate education and training and training prerequisites for specialisation
in Veterinary Public Health, ii) examining and authenticating veterinarians as specialists
in order to serve the livestock population (at herd, regional and national levels), the
livestock owners and the general public, iii) encouraging research and other contributions
and promoting the communication and dissemination of knowledge, and, finally iv) improving
the quality of service to the public. Since its creation, ECVPH has recognised more than 250
veterinarians across EU Member States as Diplomates of Veterinary Public Health and hence
has become one of the largest colleges accredited by the European Board of Veterinary
Specialisation (EBVS). The interested reader is referred to ECVPH’s website for additional,
more detailed information.

After having issued the originally envisaged three-volume book series and in consideration of
the positive response of reviewers from the scientific press and other professional readership,
the publisher approached the editors with the request to consider continuation of the
series. This was to be based on contributions delivered during the annual general meetings
and scientific conferences of ECVPH. The Council of ECVPH decided to accept the offer.
Consequently, the readership can expect a sizeable number of additional volumes over the
next years. In accordance with the format originally chosen, these will include the edited
proceedings of ECVPH’s conferences and comprise not only the scientific keynote addresses
during these events but also the synopses of other contributions and occasionally additional
chapters written on invitation. Depending on duration and thematic scope, it is intended to
include the output of either one or more scientific conferences in a single volume. Hence, new
releases are expected with intervals of approximately two years.

The theme and contents of this book – volume 4 in the series – originate from a conference
held on the 22nd and 23rd of October, 2004, which was co-organised by ECVPH and the
Istituto Zooprofilattico Sperimentale delle Regioni Lazio e Toscana, under the coordination
of local organiser Dr. Romano Zilli. The conference was hosted by the Food and Agriculture
Organisation (FAO) and staged at its premises in Rome. Contributions ranged from reviews
on risk analysis in the food chain, epidemiological monitoring and surveillance in primary
production and processing of foods of animal origin, antimicrobial resistance and transfer
in these foods, to those on risk modelling and management strategies. Finally, recent food
legislation aspects were discussed.

In consideration of the time elapsed between the conference and the release of this book, the
authors were requested to update their contributions where this was considered necessary.

On the occasion of the publication of this volume, I take the opportunity to thank a number
of individuals who over the years have been supportive of this (what turned out to be a
continuing) project. First of all, the invaluable input of my co-editor of the first three
volumes, Emeritus Prof. Dan Collins is gratefully acknowledged. His ‘seniority’ on so many
issues has been instrumental in our attempts to produce the series in a form that could truly
be considered a platform for exchange of views in the area of Veterinary Public Health and
his continuing inspiration has made the exercise challenging and gratifying. Mr. Andreas
Wunsch deserves credit for ably managing budgetary and other organisational issues of the
first three conferences and for rendering additional assistance whenever I needed it. The
Council of the European College of Veterinary Public Health is thanked for embracing the idea
of issuing the present and following volumes under the aegis of ECVPH. I am very grateful
to the various scientists for their willingness to generate well-written contributions strictly
following the rather demanding Instructions for Authors, and to do so without complaining. Last
but not least, I thank Ms. Alexandra Bauer, Dr. Johann Hiesberger and Mr. Ronald Matky
for their help in solving formatting and word-processing problems that needed to be dealt
with before the end-result could be filed with the publisher. May our joint efforts serve the
veterinary profession in its endeavours to become even more professional.

Vienna, July, 2006

Frans J.M. Smulders

 Towards a risk-based chain control


Table of contents

Preface 7
Frans J.M. Smulders

Keynote contributions

Risk assessment as a tool for evaluating risk management options for food safety 19
Riitta Maijala
Summary 19
1. Introduction 19
2. Risk assessment 20
3. Use of risk assessment in decision making 25
4. Conclusions 30
References 31

Food safety: A must for the food chain 33

Ivar Vågsholm
Summary 33
1. Introduction 33
2. Food safety; an economic perspective 34
3. Foodborne zoonoses 34
4. Prolongation of the food chain 36
5. Integrated food production system 38
6. Pre- and post-harvest control 40
7. Conclusions 42
References 43

Risk assessment of feed additives and contaminants 45

Alberto Mantovani and Roberto Cozzani
Summary 45
1. Feed additives and food safety 45
2. Risk assessment of feed additives in Europe 46
3. Examples of EFSA evaluation of feed additives 47
4. Risk assessment of feed contaminants in Europe 51
5. Examples of EFSA evaluations of feed contaminants 52
6. Conclusions 54
References 54

Field data availability and needs for use in microbiological risk assessment 57
John N. Sofos
Summary 57
1. Introduction 57
2. Risk analysis based pathogen control 59
3. Data gaps, needs and flow 60



4. Strategy for pathogen control 62

5. Antimicrobial interventions to control of pathogens in live animals 62
6. Antimicrobial interventions to control of pathogens at slaughter 66
7. Difficulties in risk assessment 71
References 71

Chemical residues in foods of animal origin: Assessing risk and implementing control
strategies 75
Sarah M. Cahill, Ezzeddine Boutrif and Maria de Lourdes Costarrica G.
Summary 75
1. Introduction 75
2. Sources of chemical residues in foods of animal origin 76
3. Regulatory control of chemicals in foods 77
4. International activities on residues of veterinary drugs in foods 79
5. Conclusions 87
References 88

Quantitative risk assessment of aflatoxicosis associated with milk consumption in Italy
(2000-2004) 91
Marcello Trevisani, Andrea Serraino, Alessandra Canever, Giorgio Varisco and Paolo
Summary 91
1. Introduction: identification of the problem and consideration of the
context 92
2. Overview of the risk assessment 94
3. Estimation of aflatoxin level in milk 95
4. Modelling process of milk 96
5. Estimation of milk consumption 96
6. Hazard characterization 97
7. Risk model and calculation 101
8. Aflatoxin concentration in milk 101
9. Production module 102
10. Food consumption module 103
11. Body weight module 104
12. Demographic data 106
13. Prevalence of carriers of Hepatitis B virus in the Italian population 106
14. Risk characterization module 107
15. Discussion 109
16. Conclusions 111
References 112

Use of sensors for early disease detection - visions on proactive disease control in the
primary animal production 115
Marcus G. Doherr
Summary 115
1. Introduction 115



2. Disease “surveillance” 117

3. Test principles and characteristics 119
4. Modelling 125
5. Sensors 126
6. Bringing everything together 127
References 128

Antimicrobial resistance and transfer in foodborne pathogens 129

Friederike Hilbert
Summary 129
1. Introduction 129
2. Material and methods 130
3. Results 133
4. Conclusions 136
References 138

Antibiotic resistance monitoring in veterinary medicine 139

Antonio Battisti and Alessia Franco
Summary 139
1. Introduction 140
2. Monitoring and surveillance of antibiotic resistance in veterinary
medicine 141
3. Antimicrobial resistance monitoring in veterinary medicine: the current
situation in Italy 144
4. Data drawn from the Italian Veterinary Antimicrobial Resistance
Monitoring (ITAVARM) 147
5. Conclusions 162
References 162

Use of veterinary epidemiology to improve food safety along the food chain:
An industry perspective on Salmonella 165
Lis Alban and Stine G. Goldbach
Summary 165
1. Introduction 165
2. Pre-harvest initiatives in the early phase 166
3. Post-harvest initiatives in the second phase 169
4. Conclusions 173
4.3. What needs to be done 173
References 174

Epidemiological surveillance in primary and processing food production in the network of
“Istituti Zooprofilattici” in Italy 177
Giorgio Varisco, Giuseppe Bolzoni, Elena Cosciani Cunico and Paolo Boni
Summary 177
1. Introduction 177
2. Scientific data organization 178



3. The surveillance network of the “Istituti Zooprofilattici Sperimentali” in Italy 179
4. Primary food production 180
5. Conclusions 195
References 200

The new EU legislation on food control and how veterinarians fit in 201
Frans J.M. Smulders, Reinhard Kainz and Martijn J.B.M. Weijtens
Summary 201
1. Introduction 201
2. Regulation (EC) No 178/2002 on ‘the General Food Law’ 203
3. Regulation (EC) 852/2004 on the hygiene of foodstuffs (H1), and
Regulation (EC) 853/2004 on specific hygiene rules for the hygiene of
foods of animal origin (H2) 208
4. Regulation (EC) No 882/2004 “on official controls performed to ensure
the verification of compliance with feed and food law, animal health and
animal welfare rules” 208
5. Regulation (EC) No. 854/2004 on official controls of foods of animal
origin (H3) 212
6. Council Directive 2002/99/EC on animal health rules (H4) 215
7. Consequences of the new EU food legislation for the veterinary profession 216
8. Conclusions 222
Acknowledgements 223
References 223

Synopses of other conference contributions

Is Vibrio parahaemolyticus a risk pathogen in Brazil? 227

S.D. Amorim, C.S. Pereira, A. Lafisca and D.P. Rodrigues

Implementation of a risk based chain control through the detection of some Escherichia coli
genes in faecal swabs and food products with multiplex PCR assay 229
E. Bartocci, A. Codega de Oliveira, R. Ortenzi, S. Costarelli, S. Crotti, S. Scuota, A.
Zicavo, A. Vizzani and B.T. Cenci Goga

Real-Time PCR method as a powerful tool to detect Escherichia coli O157:H7 in wastewater
produced from Mozzarella cheese factories 232
L. Beneduce, G. Spano, S. Baldassarre, V. Terzi, G. La Salandra and S. Massa

NIR analysis of veal meat as an easy way to discriminate illicit hormones treatment 233
P. Berzaghi, S. Segato, E. Soardo, L. Serva, M. Mirisola, A. Corato and I. Andrighetto



Analysis of Listeria monocytogenes, Salmonella typhimurium and enteritidis and
Staphylococcus aureus death rate in Grana Padano DOP cheese 236
P. Boni, P. Daminelli, E. Cosciani Cunico, P. Monastero, B. Bertasi, F. Rossi and L.

Contamination distributions and virulence factors of Listeria monocytogenes in raw meat of
avian, bovine, and swine origin 240
P. Bonilauri, G. Merialdi, L. Casadei, G. Liuzzo, S. Bentley and M. Dottori

Monitoring programs on aquaculture products 245

T. Bossu, E. Ingle, S. Saccares, R.N. Brizioli and S. Cataudella

The slaughterhouse as an epidemiological observatory for the surveillance of caseous
lymphadenitis in sheep 247
R. Branciari, R. Mammoli, D. Ranucci, D. Miraglia, G. Gorziglia, F. Feliziani and P.

Risk based surveillance of milk and dairy products 250

F. Brülisauer, T. Berger, B. Klein and J. Danuser

Are food recalls risk based food safety tools? 253

L. Bucchini and L. Caricchio

Subtyping of Salmonella enterica serotype Typhimurium of human and animal origin as a tool
for estimating the fraction of human infections attributable to a given source 257
L. Busani, C. Graziani, I. Luzzi, A.M. Dionisi, C. Scalfaro, A. Caprioli and A. Ricci

Enter-Net Italia: surveillance of verocytotoxin-producing Escherichia coli infections in
Italy 261
A. Caprioli, S. Morabito, F. Minelli, M.L. Marziano, A. Fioravanti, R. Tozzoli, G. Scavia,
L. Busani, G. Rizzoni, A. Gianviti, M.A. Procaccino and A.E. Tozzi

Effect of the introduction of HACCP on the microbiological quality of meals at a university
restaurant 262
A. Codega de Oliveira, R. Ortenzi, E. Bartocci, A. Vizzani and B.T. Cenci Goga

Monitoring of safety and quality of donkey’s milk 265

F. Conte, M.L. Scatassa, G. Monsù, V. Lo Verde, A. Finocchiaro and M. De Fino

The frequency and capacity for dissemination of brain tissue embolism associated with
pre-slaughter stunning of cattle 269
R.R. Coore, S. Love and M.H. Anil



Resistance of Salmonella enteritidis and Salmonella spp. to quinolones in poultry in Styria
(Austria) 271
F. Dieber, P. Wagner and J. Köfer

Changes in histamine and microbiological analysis in fresh and frozen tuna muscle during
temperature abuse 273
V.K. Economou, M.M. Brett, C. Papadopoulou and T. Nichols

VRE (Vancomycin-resistant Enterococci) from human, animal, and environmental samples in
Styria, Austria 278
A. Eisner, G. Feierl, G. Gorkiewicz, F. Dieber, E. Marth and J. Köfer

Risk analysis and food safety: a new EU approach 281

M. Ferri, V. Giaccone and V. Tepedino

The food chain as a source for dissemination of Salmonella infantis multidrug resistant
clone 283
E.L. Fonseca, E.M.F. Reis, M.D. Asensi, A. Lafisca and D.P. Rodrigues

Risk of transmission of infection by exporting fresh boar semen from Switzerland to
Norway 286
T. Fuchs, F. Brülisauer and K.D.C. Stärk

Enter-Net Italia: Integrated medical and veterinary surveillance of salmonellosis in Italy 290
P. Galetta, E. Filetici, A. Dionisi, S. Arena, I. Benedetti, S. Lana, A. Ricci, D. Vio, L.
Busani, C. Graziani, A. Caprioli and I. Luzzi

The impact of feed supplementation with oregano essential oil and α-tocopheryl acetate on
the microbial growth and lipid oxidation of turkey breast during refrigerated storage 293
A. Govaris, E. Botsoglou, P. Florou-Paneri, A.N. Moulas and G. Papageorgiou

“Boil it, cook it, peel it or forget it!” does not always work; or: low effect of
boiling on Vibrio biodiversity in mussels sold in Rio de Janeiro (Brazil) 296
A. Lafisca, C. Soares Pereira, V. Giaccone, D. dos Prazeres Rodrigues

Pre-Harvest Salmonella control in the EU pig herds is feasible 301

L. Leontides and C. Enoe

Detection of bovine noroviruses in Italian herds: Does their genetic relatedness to human
noroviruses imply a risk of interspecies transmission? 305
S. Magnino, R. Santoni, S. Crudeli, I.D. Bartolo and F.M. Ruggeri

Risk classification of food establishments 309

A. Mancuso, L. Decastelli, L. Prato and M. Voghera



Risk management: The Codex Alimentarius approach 314

J. Maskeliunas and A. Bruno

A source of extensive scientific issues on post-mortem findings in slaughtered Sarda sheep
as basis for risk assessment 319
D. Meloni, R. Mazzette, E.P.L. De Santis, G.P. Mangia and A.M. Cosseddu

Contaminant levels in the main freshwater fish of Latium (Italy) for the
evaluation of consumption risks 323
E. Orban, M. Masci and L. Gambelli

Risk management and science for safety assurance of foods 325

R. Ortenzi, E. Bartocci, A. Codega de Oliveira, A. Vizzani and B.T. Cenci Goga

Occurrence of Taenia saginata cysticercosis in slaughtered cattle in the North of Italy:
Results of a ten-year monitoring program 329
A. Padovani, A. Pelloni, M. Trevisani and G. Bettini

Risk assessment of veterinary drug residues: The case of a new LC method for
sulphmethazine in swine tissue 332
E.P. Papapanagiotou, D.J. Fletouris and I.E. Psomas

Assessment of human enteric viruses in shellfish collected in the Adriatic Sea 336
E. Pavoni, M.N. Losio, L. Croci, C. Panteghini, N. Zanardini, G. Maccabiani, M. Tilola,
F. D’Abrosca and P. Boni

Characteristics of prevalent Salmonella serovars isolated from food in Brazil: Implications
for public health 339
C.S. Pereira, R.G. Costa, M.L. Festivo, L.M. Seki, E.M.F. Reis, A. Lafisca and D.P.

Ensuring the safety of beef, operatives and abattoir environment during slaughter with a
view to BSE risks: Alternative slaughter technology 342
S.B. Ramantanis

Risk analysis on human salmonellosis from pork products in the Veneto region
of Italy: preliminary results 346
A. Ricci, R. Mioni, V. Cibin, L. Busani, P. Zavagnin and L. Barco

Antibiotic resistance trend and analysis of Salmonella enterica isolates between 2002-2004
in Marche region 350
M. Staffolani, E. Micci, G. Striano, G. Perugini and S. Fisichella

Escherichia coli O157:H7 in slaughtered cattle. A surveillance study in the Ravenna District
(North of Italy) 354
M. Trevisani, S. Albonetti, L. Alberghini, S. Alonso and O. Peppi



Characterization of product quality from Lazio and Toscana seafood breedings 358
A. Ubaldi, K. Russo, R. Cozzani, T. Bossù, S. Berretta and P. Di Giustino

Residues of antimicrobials in bovine milk samples in Lombardia region 361

G. Varisco, G. Bolzoni, L. Bertocchi

Bovine spongiform encephalopathy in the Lombardy region: descriptive epidemiology after
three years of active surveillance 365
G. Zanardi, V. Tranquillo, D. Avisani, C. Nassuato and C. Bonacina

Biographies 369

Index 377


Keynote contributions

Risk assessment as a tool for evaluating risk management options for food safety

Riitta Maijala
Department of Food and Environmental Hygiene, Faculty of Veterinary Medicine, Helsinki
University, Finland

Summary

In the area of food safety, the use of risk assessment in decision making has expanded from
regulatory toxicology to environmental toxicology and microbiological hazards. Different
types of risk assessment are used. In this chapter, a short comparison of the risk assessment
approaches of the Office International des Épizooties (OIE) and Codex Alimentarius is given.
For the successful use of risk assessment in decision making several questions need to
be addressed, including who conducts risk assessments, how to define the question to be
answered, what is the basic outline of the risk assessment process, availability and quality
of data, use of expert opinions as well as expertise and resources needed. As an example,
the work done for the Finnish Salmonella Control programme, consisting of both a quantitative
microbiological risk assessment and an economic evaluation, is presented.

Keywords: risk assessment, risk management, food safety, OIE, Codex Alimentarius

1. Introduction

The use of risk assessment, in its various forms, has increased tremendously since the
Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) was
accepted at the negotiations for the World Trade Organization. In the SPS Agreement it is
stipulated that “Members shall ensure that their sanitary or phytosanitary measures are
based on an assessment, as appropriate to the circumstances, of the risks to human, animal
or plant life or health, taking into account risk assessment techniques developed by the
relevant international organizations”.

Although it originally referred exclusively to scientific risk assessment, risk assessment has
since been used in many different contexts where a concept of “evaluating first - decision
thereafter” is expected. Currently, many risk management decisions are supposed to be
based on risk assessment. Risk managers from local production plants up to national
ministries are assumed to be able to assess risks in their own work and/or commission the
scientific risk assessment work to expert groups. As a consequence, the assessment of risks
has been (and is) used in various fields including exercises dedicated to determining chemical
and microbiological hazards, licensing of feed and food additives, veterinary medicines and
vaccines, developing HACCP-systems, targeting food control resources as well as discussing
international trade barriers.


The use of risk assessment in decision making is simple and difficult at the same time. The
basic concepts of the risk analysis process as a whole are those which should anyway be part
of good decision making, i.e. collecting the information available, assessing public health
risks as well as the consequences of different control options, evaluating other factors (e.g.
practical, political and financial) and taking decisions based on all these considerations. In
the process of data collection and processing as well as for decision making, facts, opinions
and decisions must be communicated among all relevant parties. This process has been defined
in a food safety context as risk analysis, which comprises risk assessment, risk management
and risk communication (Codex Alimentarius, 1999; OIE, 2003).

However simple the basic risk analysis concept may seem, its application to decision
making requires generating additional knowledge and, not least, bringing about changes
in the attitudes of scientists, authorities, industry and consumers. The use of risk
assessment increases the role of science in the process of decision making. It also
defines more clearly than in the past the area in which scientists should not be involved,
namely risk management. Furthermore, it forces authorities to define more clearly the
basic arguments underlying their decisions, especially when these decisions differ from the
recommendations made in risk assessment reports. It also dictates an increased generation
of reliable, high-quality data, which may be expensive for both industry and academia to
realize. Last but not least, the risk analysis concept is based on the premise that risks
must be evaluated, and these usually deviate from zero. Even if the risk estimate achieved
at the end of a risk assessment exercise deviates only slightly from zero (i.e. a “negligible”
risk), communicating to stakeholders that a “zero-risk” situation exists is generally very
difficult in the area of food safety.

In this contribution, the types of risk assessment used in decision making, the commissioning
of risk assessment by risk managers, and the relationship between risk assessment and risk
management are discussed. In addition to risk assessment in its pure scientific context as
discussed here, several other closely related terms are in use within the food safety area.
These include e.g. hazard identification, assessment of risks, risk profiling and scientific
opinions presented as “risk assessments”. It seems that risk assessment has become a general
term to express that the opinion of scientists should be sought before decisions are made.
By the same token, all decision making appears to be termed “risk management”. As many
organisations have been structured to base their strategies on risk analysis, it is crucial to
understand what aspects are important when risk management decisions are to be based on
risk assessment exercises.

2. Risk assessment

2.1. Use of risk assessment by managers

In principle, the use of risk assessment by risk managers is based either on safety evaluations
or on the comparison of different management options. A significant part, perhaps the majority,
of all risk assessment exercises currently conducted in the field of food safety is targeted
towards product approval (feed additives, etc.). Yet, safety evaluations of specific food
commodities or types of food, comparisons between hazards, and evaluations of imports are
increasingly called for. Risk assessment can, for instance, also be used to evaluate the effect
of different control options for the prevention, reduction or elimination of a hazard: to select
options, to develop legislation or control programmes for feeds, animals, food production or
consumption, to improve self-checking systems and HACCP, or for contingency planning.

2.2. Codex Alimentarius and OIE approach

The international standards, guidelines and recommendations for risk analysis according
to the SPS Agreement are given by the Codex Alimentarius Commission (CAC), the Office
International des Épizooties (OIE) and the International Plant Protection Convention (IPPC).
Although the structures of the risk analysis processes described by these organizations
vary, the main questions in all of these risk assessment processes are:

• What can cause risk?
• How can it cause risk?
• What is the probability of risks occurring?
• What are the consequences?
• What are the prerequisites for risks to indeed occur?

In addition, the OIE, for instance, also includes cost evaluation in the risk assessment,
whereas in the approach of Codex Alimentarius the latter is a part of risk management. In
the area of food safety, the Codex Alimentarius and OIE approaches are the most important
ones. In the Codex Alimentarius (1999) and OIE (2003) guidelines, the basic concepts - hazard
and risk - are defined differently. This is only logical when one realises that OIE mainly
focuses on import risks, whereas Codex Alimentarius stresses domestic risks or risks related
to specific products.

A hazard is defined by the Codex Alimentarius Commission as a “biological, chemical or
physical agent or property of food with the potential to cause an adverse health effect”. In
the OIE code, a hazard is “any pathogenic agent that can produce adverse consequences on
the importation of a commodity”. Consequently, in the framework of the Codex Alimentarius
definition, a risk is a function of the probability of an adverse health effect and the severity
of that effect, consequential to one or more hazards in a food. The OIE approach implies
that a risk is the likelihood of occurrence and the likely magnitude of the consequences of
an adverse event to animal or human health in the importing country during a specified
time period.

The CAC has defined risk assessment as a scientifically based process consisting of the following
steps: (1) hazard identification, (2) hazard characterization, (3) exposure assessment and
(4) risk characterization (CAC, 1999). In the OIE code, risk assessment follows hazard
identification and is defined as the evaluation of the likelihood and the biological and
economic consequences of entry, establishment, or spread of a pathogenic agent within the
territory of an importing country (Figure 1).


[Figure 1 diagram: two parallel schemes. Codex Alimentarius: risk assessment comprising
hazard identification, hazard characterization, exposure assessment and risk characterization;
OIE: hazard identification, followed by risk assessment comprising release assessment,
exposure assessment, consequence assessment and risk estimation; in both schemes, risk
assessment is linked to risk communication and risk management.]

Figure 1. Risk assessment in the context of the risk analysis process as defined by Codex
Alimentarius (1999) and OIE (2003).

Although - when the OIE and Codex Alimentarius approaches are compared - hazard
identification (and hazard characterisation) represent different steps in risk analysis, they
both include the identification of hazard(s) and establish whether further assessment is
needed. This step also includes the description of these hazards (in Codex Alimentarius) and
of the dose-response relationship (Figure 2). In the OIE code, release assessment (the risk
of entry) is followed by exposure assessment (the exposure of animals within the country).
This approach highlights the different aspects involved in import risk assessment. In the
risk assessment of Codex Alimentarius, exposure assessment covers the whole transmission
route of a pathogen or the relevant exposure path for a chemical hazard. Therefore, the
release and exposure assessments of the OIE and Codex Alimentarius approaches share many
similar features.

The most significant difference between both approaches is the inclusion by OIE of consequence
assessment as a part of the risk assessment process. The consequence assessment consists
of describing the relationship between specified exposures to a biological agent and the
consequences of those exposures producing adverse health or environmental consequences,
which may in turn lead to socioeconomic consequences. The direct consequences include e.g.

[Figure 2 diagram: hazard identification, followed by hazard characterization (hazard and
human data) and exposure assessment (contamination and consumption data), both feeding into
risk characterization.]

Figure 2. Datasets needed for a risk assessment according to Codex Alimentarius (1999).
animal infection, disease and production losses as well as public health consequences. The indirect consequences include e.g. surveillance, control and compensation costs, potential trade losses and adverse effects on the environment. Following the OIE approach, the risk estimate at the end of the risk assessment process is therefore based on health, economic and environmental consequences alike. In contrast, the Codex Alimentarius approach addresses only public health consequences. In reality, many risk assessment exercises in the food safety area are modifications of these two basic guidelines. However, should the major purpose of a risk assessment be international trade, particular care must be taken that either the principles of OIE or those of Codex Alimentarius are followed strictly.

2.3. Types of risk assessment

The basic question in risk management is "What is the true risk?" The estimate can be based on different statistics, e.g. human outbreaks, sporadic cases and the results of surveys. However, these statistics often give only a general picture of the real situation and, depending on the question, different types of risk assessment can be applied, i.e. qualitative, semi-quantitative or quantitative. Quantitative risk assessments can be either deterministic (i.e. calculations are based on point estimates, e.g. means) or stochastic, i.e. models in which at least part of the input values are presented as probability distributions. The type of risk assessment usually depends on the time constraints, resources, data availability and the main questions. The recent development of software has enabled a vast increase in the use of the probabilistic approach, which is generally considered to describe reality and its uncertainties better. Although the features of microbiological hazards present in food differ from those of most chemical hazards, the two have much in common, so the risk assessment process for chemical and microbiological hazards is generally similar.
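The difference between deterministic and stochastic (Monte Carlo) calculations can be sketched as follows; all input values and distributions below are invented purely for illustration:

```python
import random

# All numbers below are invented for illustration.
PREVALENCE_MEAN = 0.02   # assumed mean prevalence of the hazard in servings
P_ILL_MEAN = 0.001       # assumed mean probability of illness per contaminated serving
SERVINGS = 1_000_000     # annual servings considered

def deterministic_estimate():
    # Point estimates only: the output is a single expected number of cases.
    return SERVINGS * PREVALENCE_MEAN * P_ILL_MEAN

def stochastic_estimate(iterations=10_000, seed=1):
    # Monte Carlo: inputs are drawn from probability distributions, so the
    # output is itself a distribution of case numbers rather than one number.
    rng = random.Random(seed)
    cases = []
    for _ in range(iterations):
        prevalence = rng.betavariate(2, 98)   # assumed uncertainty, mean 0.02
        p_ill = rng.betavariate(1, 999)       # assumed uncertainty, mean 0.001
        cases.append(SERVINGS * prevalence * p_ill)
    return sorted(cases)

point = deterministic_estimate()
dist = stochastic_estimate()
interval_95 = (dist[int(0.025 * len(dist))], dist[int(0.975 * len(dist))])
```

A deterministic run returns only the point value, whereas the stochastic run also yields a 95% probability interval, which is why the probabilistic approach is considered to describe uncertainty better.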

Quantitative risk assessment is often preferred even when the availability of data or time constraints argue against it. Yet, many qualitative or semi-quantitative risk assessments have proven very useful in decision making. One example is the Geographical BSE Risk (GBR) evaluation, which has been used to support the decision making of the European Commission. The GBR evaluation is based on data on imports and domestic factors and generates a risk estimate for the BSE agent in the domestic cattle population. It is based on the qualitative method developed by the Scientific Steering Committee (SSC) of the European Commission, which formed the basis for classifying countries into four GBR levels (EC/SSC, 2000, 2002). In essence, the model can be broken down into two parts, relating to challenge and stability. Stability is the ability of a BSE/cattle system to prevent the introduction and to reduce the spread of the BSE agent within the borders of a country. A "stable" system would eliminate BSE over time; an "unstable" system would amplify it. The main factors in stability are feeding procedures, rendering processes and SRM removal.
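The stability concept can be caricatured with a toy recycling model (purely illustrative; this is not the SSC's actual GBR method): a combined system factor below one drives the infectious load towards zero, while a factor above one amplifies it:

```python
def simulate_bse_loop(r, challenge, initial=1.0, years=20):
    """Toy recycling model (not the SSC's actual GBR method): each year the
    infectious load in the cattle/feed loop is multiplied by a combined
    system factor r (feeding procedures, rendering, SRM removal) and
    increased by any external challenge. r < 1 corresponds to a 'stable'
    system, r > 1 to an 'unstable' one."""
    load = initial
    trajectory = [load]
    for _ in range(years):
        load = r * load + challenge
        trajectory.append(load)
    return trajectory

stable = simulate_bse_loop(r=0.5, challenge=0.0)      # load decays towards zero
unstable = simulate_bse_loop(r=1.5, challenge=0.0)    # load is amplified
```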

After the assessment of the first 23 countries, EU legislation changed dramatically. In 2000, rapid post-mortem testing for BSE was introduced, the use of SRM (Specified Risk Material) was prohibited, surveillance was greatly reinforced and a temporary ban on the use of meat and bone meal was established. When these risk management decisions are compared with the BSE/cattle model developed by the SSC, it can be seen that the various measures focus on all the important points for cutting the BSE/cattle loop described in the GBR method, as well as in other opinions given by the SSC. It can therefore be said that, in the development of BSE legislation in the EU, scientific risk assessment has had a significant effect on the risk analysis process. The work of the SSC on TSE (Transmissible Spongiform Encephalopathy) has been continued by the European Food Safety Authority (EFSA). By the summer of 2005, 66 countries had been assessed once or more for their GBR level, and the assessments were made public via the Internet. Provided surveillance is adequately conducted in the various countries, it will show how correct the GBR estimates have been and will be.

2.4. Problems in risk assessments

There are several problems in scientific risk assessment. The main problem is the lack of data on the various production steps. This problem can be addressed by identifying the main hazards, using previous risk assessments to determine which important information is lacking (and establishing programmes to collect it), using data from countries with similar production systems, establishing databanks and combining control programmes with the data needs of risk assessment. Furthermore, the efforts of various institutes, universities and industry can be combined, and special attention paid to the way results are presented. Often only the mean and range of results are presented, although stochastic quantitative risk assessment can only be conducted if the distribution type of the results (e.g. normal or beta distribution) is known. Publishing can also be a problem, since usually only the most recent data can be published in the scientific literature. However, these may not reflect the normal situation, and industry and research institutes should therefore be encouraged to publish data on normal conditions in order to avoid unnecessarily increasing the number of worst-case scenarios.

Even if data are available, they may be of poor quality. This situation can be improved by focusing data collection on essential issues in the various parts of the production chain, training researchers in the data needs of risk assessment, better planning of research studies and reporting, using these raw data for generating expert opinions, and combining the efforts of various groups. It is essential that these issues are transparently presented in the risk assessment reports. In generating expert opinions, the selection of experts must be equally transparent. These individuals should cover various fields of expertise and must be trained in risk assessment. Different elicitation techniques can be used, but special care must be taken when expert opinions are relied on, and the final report must state clearly and transparently where, why and what kind of experts were used.

Dose-response modelling aims at mathematically describing the probability of adverse health effects following exposure to different doses of a hazard. The data used can be obtained from experimental studies involving humans, animals or laboratory media. For microbiological hazards, however, such data are usually either of limited use or ethically inappropriate to obtain. In national risk assessments, deriving dose-response curves from the published data of other risk assessments is sometimes a good solution. However, in each country the methods for investigating foodborne outbreaks could be developed in order to collect information on:


• people exposed to food;
• people with symptoms / infection;
• level of contamination in contaminated foodstuffs;
• consumption of the implicated food items in exposed groups (with and without symptoms / infection).
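Two dose-response model families commonly used in microbial risk assessment, the exponential model and the approximate beta-Poisson model, can be sketched as follows; any parameter values supplied to them here are illustrative, not fitted to any pathogen:

```python
import math

def exponential_dose_response(dose, r):
    """Exponential model: P(illness) = 1 - exp(-r * dose), i.e. each
    ingested organism independently causes illness with probability r."""
    return 1.0 - math.exp(-r * dose)

def beta_poisson_dose_response(dose, alpha, beta):
    """Approximate beta-Poisson model: P(illness) = 1 - (1 + dose/beta)**(-alpha),
    often preferred when host and pathogen variability make a single r
    unrealistic."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)
```

Both curves start at zero risk for a zero dose and increase monotonically towards one, which is the behaviour against which dose-response data from outbreak investigations would be fitted.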

For microbiological risk assessment purposes, special attention should be paid to collecting national information on the consumption of raw vs. processed foodstuffs, and on production, importation, product types, large-scale kitchens and retail shops. The behaviour of consumers is an important factor affecting the risks. Therefore, research projects on consumer behaviour, normal storage conditions (temperature / time) in domestic kitchens, preparation patterns (temperature / time), cross-contamination (hygiene) and home cooking / ready-to-eat foods / catering are important. These patterns may differ vastly from one country to another.

At the national level, expertise in risk assessment is often lacking. This can be improved by training, by learning from other sciences (e.g. economics, occupational health and environmental sciences) and by forming groups combining different types of expertise. Supporting networking through forums, seminars and formal network structures, increasing domestic and international co-operation, and establishing validation methods are also helpful. However time- and resource-consuming scientific risk assessment exercises generally are, such investments are essential for achieving high-quality output.

The final risk assessment document should include the estimated number of herds, flocks, animals or people likely to experience health effects over time; probability distributions, confidence intervals and other expressions of variability and uncertainty; the assumptions made; an analysis of dependence and correlation between the model inputs; a sensitivity analysis; and a presentation of costs, resources and time.

3. Use of risk assessment in decision making

3.1. Who should make risk assessments?

The use of risk assessment in decision making demands skills of both scientists and risk managers. Although risk analysis can be seen as a general approach, the use of scientific risk assessment focuses mainly on national and international questions. Commissioning risk assessment at the national level can be done in different ways, i.e. (1) by establishing scientific panels to deal with specific questions, (2) by assigning responsibility for risk assessments to research institutes or universities, (3) by purchasing risk assessments from private companies or universities, and/or (4) by establishing joint projects between different organizations. In addition to risk assessments made at the request of risk managers, researchers themselves can initiate risk assessment studies. At the international level, WHO/FAO consultations as well as the work of EFSA and the scientific committees of the European Commission have, for instance, played an important role in scientific risk assessment. Their work has included both conducting risk assessments and generating scientific opinions on risks.


Within the food safety sector in Finland, mainly the first two options have been used so far. In scientific panels, the scientific advice has often been compiled in cooperation with risk managers (e.g. the Board for Gene Technology and the Advisory Committee on Novel Foods). The advantages of this approach are that the best experts in a particular field write the risk assessment report, generate a common approach on how to deal with the questions and, finally, that their interpretations are directly discussed with the risk managers. However, such experts are usually employed by other organizations and can therefore dedicate only a limited amount of time to this work. As an example of the second approach, a Department of Risk Assessment has been established in the National Veterinary and Food Research Institute to produce risk assessments on foods of animal origin and contagious animal diseases. The advantages of this approach are that (1) the data and modelling techniques used for one assessment can easily be developed further when addressing new questions and (2) the method of data collection can be influenced. Also, resources are specifically allocated for this work, the output of which is subsequently combined with input from external experts. Disadvantages include the high costs of the assessments so produced. In both of these approaches, the national food control authorities (the ministries or the National Food Agency) act as risk managers.

As risk assessment is used for decision making, its results should be reliable. The higher the impact of a risk assessment, the greater the need for independence. A local, practical risk assessment done by one food control officer should be comparable to that done by another officer in a different region of the country. To achieve this, a clear food control structure, training of personnel, comparison of their risk assessment results and written guidelines are needed. For scientific risk assessment at the national level, a clear separation of risk management and risk assessment, transparent selection of experts and transparency throughout the whole process are essential.

3.2. The question defines the assessment

One of the main issues in the risk assessment process is to clearly formulate the question which the risk assessment should answer. Ideally, a risk profile should already be available when the question is formulated. Unfortunately, this is not always possible due to limitations of resources and time. Regardless, risk managers should be able to formulate clear, definite questions and indicate whether the effects of particular risk management options should also be considered. In regulatory assessments, e.g. for market approval, the estimation of risk is often the main output needed and the effects of implementing various risk management options may be less important. In other cases, however, scrutinizing these effects may have a big influence on the structure of the whole work. Risk assessment, especially of a non-regulatory type, is an iterative process which can go on for years unless a clear question is formulated and the available resources are fixed. Resources are a key element in achieving what is needed and should be allocated in proportion to the importance of the question.

In a full risk assessment covering the production chain from farm to fork, it is essential to establish the main focus of the exercise. Some risk assessments start as early as primary production, involving even feed manufacturing or breeder animals, whereas others focus on more specific areas such as slaughter or criteria for foodstuffs at retail level (Figure 3).


Chain steps: primary production, processing, storage & transport, retail, consumption.

1. Geographical BSE risk in cattle
2. E. coli O157:H7 in apples
3. Salmonella in broiler production
4. Campylobacter in chickens
5. Salmonella in eggs
6. E. coli O157:H7 in beef hamburgers
7. Listeria monocytogenes criteria

Figure 3. Examples of some microbiological risk assessments which have focused on different steps in the
production chain. References: (1) EC/SSC, 2002; (2) Duffy and Schaffner, 2002; (3) Ranta and Maijala,
2002, Maijala et al., 2005; (4) Christensen et al., 2001; (5) WHO/FAO, 2002; (6) Cassin et al., 1998;
(7) WHO/FAO, 2004.

There are several examples of microbial risk assessments that have focused on one or more steps of the production chain, e.g. (1) steps where the hazard is introduced into the food production chain, such as the introduction (or occurrence) of BSE in domestic cattle, (2) steps where the raw material is contaminated at slaughter or before processing, e.g. contamination of chickens by Campylobacter or contamination of apples with EHEC, (3) steps where consumption takes place, e.g. Salmonella and the consumption of eggs or broiler meat, or (4) the probability of different health effects, e.g. EHEC and the consumption of ground beef hamburgers (Cassin et al., 1998; EC/SSC, 2002; Duffy and Schaffner, 2002; Hartnett et al., 2001; Hope et al., 2002; Oscar, 2004; Pawitan et al., 2004; Sugiura et al., 2003; USDA/FSIS, 1998; Wahlström et al., 2002).

Although the aim is often the risk estimate itself, risk managers are sometimes more interested in the effect of various risk management options. In these cases, a risk assessment may focus on e.g. (1) options during primary production, e.g. removal of Salmonella-positive parent flocks, (2) using domestic vs. imported raw material, e.g. Salmonella Typhimurium DT 104 and dry-sausage manufacturing, (3) the effects of additional guarantees, or (4) different criteria or consumer education, e.g. Campylobacter and improving home hygiene (Alban et al., 2002; Christensen et al., 2001; Nauta et al., 2000; Ranta and Maijala, 2002; Ranta et al., 2004; Maijala et al., 2005a, b; WHO/FAO, 2004).

In defining the question and interpreting the result of a risk assessment, one must be extremely careful. The latter is illustrated by the different conclusions that can be drawn when risks are calculated from slightly different angles, as in the WHO/FAO risk assessment of Listeria monocytogenes in ready-to-eat foods (WHO/FAO, 2004). In this work, two different measures for annual risk were presented: the number of cases of listeriosis per 1 million servings and the number of cases of listeriosis per 10 million people. When risk was calculated per serving, the most risky food was smoked fish (0.021 cases per million servings), with milk second (0.005 cases). However, when cases of listeriosis per 10 million people were calculated, milk was estimated to cause the highest risk (9.1 cases) and smoked fish only the second highest (0.46 cases). For decision makers these kinds of issues may sometimes be very important, and careful planning of both the question asked and the presentation of the results is therefore essential.
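The flip in ranking can be reproduced arithmetically. In the sketch below, the per-serving risks are the figures quoted above, while the annual servings per person are back-calculated assumptions chosen only so that the two published measures are mutually consistent; they are not data from the WHO/FAO report:

```python
# Per-serving risks are quoted from WHO/FAO (2004); servings per person
# are illustrative assumptions, back-calculated for consistency.
POPULATION = 10_000_000

foods = {
    "smoked fish": {"cases_per_million_servings": 0.021, "servings_per_person": 2.2},
    "milk":        {"cases_per_million_servings": 0.005, "servings_per_person": 182.0},
}

def annual_cases_per_10m(food):
    # Population risk = per-serving risk scaled by how often the food is eaten.
    total_servings = POPULATION * foods[food]["servings_per_person"]
    return foods[food]["cases_per_million_servings"] * total_servings / 1_000_000
```

Per serving, smoked fish carries the higher risk (0.021 vs. 0.005), yet because milk is consumed far more often, milk dominates the population measure (about 9.1 vs. 0.46 cases per 10 million people).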

3.3. Commissioning a risk assessment

For risk assessment questions that do not focus on a specific product (e.g. a feed additive), adhering to a general format is useful to ensure that the questions are clearly understood by both risk assessors and risk managers. The following format has been used in Finland for questions other than product approval in the field of food of animal origin and contagious animal diseases:

• the title of the risk assessment, i.e. what is the main question;
• the purpose of the risk assessment:
– basis in legislation;
– domestic / international purposes;
– relation to other studies done / planned;
• type of risk assessment expected (qualitative / quantitative);
• content of the risk assessment:
– part of the production chain involved;
– feedstuffs, animals and/or foodstuffs involved;
– pathogen / chemical hazard involved;
– target population for the risk estimate;
• possible control options to be evaluated;
• exclusions (those products, production steps, contamination routes, years, etc. which are not included in the assessment);
• the risk assessment report (language, target group, etc.);
• timetable, resources, funding;
• contact persons in risk assessment and risk management;
• the person responsible for the work and the risk assessors involved;
• planned co-operation with risk managers, other scientists and other stakeholders.
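As a sketch, such a commissioning format could be captured in a simple structured record; the class and field names below are hypothetical illustrations, not an official Finnish schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RiskAssessmentCommission:
    """Hypothetical record mirroring the checklist above; the class and
    field names are illustrative, not an official schema."""
    title: str                                   # the main question
    purpose: str                                 # legal basis, domestic/international use
    assessment_type: str                         # "qualitative" or "quantitative"
    chain_parts: List[str] = field(default_factory=list)
    commodities: List[str] = field(default_factory=list)  # feedstuffs, animals, foodstuffs
    hazard: str = ""                             # pathogen / chemical hazard
    target_population: str = ""
    control_options: List[str] = field(default_factory=list)
    exclusions: List[str] = field(default_factory=list)
    timetable: Optional[str] = None
    contacts: List[str] = field(default_factory=list)

# Illustrative commission; all values are invented.
commission = RiskAssessmentCommission(
    title="Salmonella in broiler production",
    purpose="domestic; support of the national control programme",
    assessment_type="quantitative",
    chain_parts=["primary production", "slaughter"],
    hazard="Salmonella",
)
```

Capturing the commission as a structured record makes it explicit which items (e.g. exclusions or control options) were left empty when the work was agreed.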

3.4. An example: Salmonella Control Program in Finland

In 1995, when Finland became a member of the European Union, the Finnish Salmonella Control Programme (FSCP) was established, based on the low Salmonella prevalence in domestic livestock production. The FSCP was accepted by European Commission (EC) Decision 94/968/EC and forms the basis for the additional guarantees granted to Finland by the EC concerning imported eggs and meat. The objective of the FSCP is to protect consumers and to maintain Salmonella prevalence below 1% in swine, cattle and poultry production as well as in the meat and eggs derived from these animals.


The Ministry of Agriculture and Forestry wanted to determine the efficiency of the FSCP and its effect on public health. Therefore, a number of projects consisting of both quantitative microbiological risk assessments and economic evaluations were launched, which allowed a deeper insight than an analysis based exclusively on the apparent results of the surveillance programme (Figure 4) (Maijala et al., 2005b).

The effects of the main interventions used in the FSCP have been quantitatively evaluated for broiler and pork production using transmission models (Ranta and Maijala, 2002; Ranta et al., 2004; Maijala et al., 2005a). Similar work is currently in progress for beef and egg production (Lievonen et al., 2004; Ranta et al., 2005). In addition, economic analyses have included a study on the incentive structures of the FSCP, a cost-of-illness type evaluation and a "willingness-to-pay" analysis (Maijala and Peltola, 2000; Peltola et al., 2001; Kangas et al., 2003; Maijala et al., 2005b; Aakkula et al., 2005).

In the primary broiler production model, predictive distributions were derived for the true number of infected broiler flocks rather than for the number of detected Salmonella-positive broiler flocks (Ranta and Maijala, 2002). The true flock prevalence was estimated to be 0.9-5.8% (95% probability interval). When this primary production model was combined with the secondary production model, the public health effects of eliminating from production those breeder flocks which tested positive for Salmonella, and of heat-treating the meat of detected positive broiler flocks, could be simulated (Maijala et al., 2005a, b). Based on the whole model, it was concluded that if detected positive broiler breeder flocks were

(Diagram: the health side covers the apparent situation based on monitoring and the quantitative risk picture (true prevalence, effect of interventions); the economics side covers the incentive structure (isolation paradox, freeriders, information imperfections) and cost-of-illness; together these determine the efficiency of the Finnish Salmonella Control Program.)

Figure 4. The methods used to evaluate the efficiency of the Finnish Salmonella Control Program. Reprinted
from Food Control (Maijala et al., 2005b).


not removed, this would result in 1.0-2.5 times more reported human cases than the number expected under the current FSCP (95% predictive interval). Without heat treatment of meat the increase would be 2.9-5.4 fold, and without both interventions 3.8-9.0 fold. Replacing half of the current retail broiler meat with meat at a 20-40% contamination level could result in 33-93 times more human cases than the expected value under the current situation. However, several interventions included in the FSCP, e.g. the imposition of restrictive orders and the cleaning and disinfection of broiler houses, as well as voluntary interventions such as the use of competitive exclusion, were not included in the model. The model created may therefore underestimate the effect of Salmonella control.
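The idea of estimating true rather than detected prevalence can be illustrated with a simplified Monte Carlo sketch (this is not the Bayesian model of Ranta and Maijala; the detection sensitivity and survey figures below are invented):

```python
import random

def true_prevalence_samples(detected_pos, flocks_tested, sensitivity,
                            iterations=20_000, seed=42):
    """Simplified Monte Carlo sketch (not the actual Ranta & Maijala model):
    if only a fraction `sensitivity` of truly infected flocks is ever
    detected, the apparent prevalence understates the true one. The
    apparent prevalence is sampled from a Beta posterior (uniform prior)
    and scaled up by the assumed sensitivity."""
    rng = random.Random(seed)
    samples = []
    for _ in range(iterations):
        apparent = rng.betavariate(detected_pos + 1,
                                   flocks_tested - detected_pos + 1)
        samples.append(min(apparent / sensitivity, 1.0))
    return sorted(samples)

# Invented survey figures: 20 detected positives out of 1,000 flocks,
# assumed detection sensitivity of 50%.
samples = true_prevalence_samples(detected_pos=20, flocks_tested=1000, sensitivity=0.5)
interval_95 = (samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))])
```

With these invented figures, an apparent prevalence of 2% translates into a true-prevalence interval centred around 4%, illustrating why predictive distributions for the true number of infected flocks sit above the raw detection counts.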

According to analyses based on the direct costs of the FSCP, the cost-benefit ratios for egg and meat production, with and without market adjustments, varied between 5.4 and 258.1. This means that for every euro invested in the control programme, 5.4 to 258.1 euros return as benefits to society as a whole (Maijala and Peltola, 2000). A more detailed analysis was made for the FSCP in broiler production: in 2000, the costs of the FSCP were 0.02 euros per kg of broiler meat produced (Kangas et al., 2003). In addition to these calculations, a contingent valuation method was used to analyse consumer attitudes (willingness to pay). Based on a questionnaire sent to 2,000 people (response rate 55%), consumers are willing to pay on average roughly 5.8 euros per household per month for running the FSCP (Peltola et al., 2001).
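The economic quantities cited above reduce to simple ratios, sketched below with illustrative figures (only the interpretation of each measure, not the underlying Finnish data, is taken from the text):

```python
def benefit_cost_ratio(benefits_eur, costs_eur):
    # How many euros return as benefits for every euro invested;
    # Maijala and Peltola (2000) report ratios between 5.4 and 258.1.
    return benefits_eur / costs_eur

def programme_cost_per_kg(total_cost_eur, kg_produced):
    # In 2000 the FSCP cost about 0.02 EUR per kg of broiler meat produced.
    return total_cost_eur / kg_produced

def aggregate_annual_wtp(households, wtp_per_household_per_month=5.8):
    # Scales the survey figure of roughly 5.8 EUR per household per month
    # (Peltola et al., 2001) up to an annual total; the household count
    # is the caller's assumption.
    return households * wtp_per_household_per_month * 12
```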

This work is an example of risk assessment and economic evaluation carried out for an existing surveillance programme, i.e. after the decision to establish such a programme had been made. The evaluation work has been useful in defining the exact role and benefits of the FSCP in controlling Salmonella in animal production, and it has identified the needs for further development of the programme. Indeed, risk assessment can provide valuable insight into a problem or into past decisions, and makes it possible to study the effects of various control options. If needed, quantitative risk assessment can also be combined with economic evaluations. However, the work has demanded significant resources, which prevents this approach from being used for all important zoonoses.

4. Conclusions

4.1. What has been achieved

In general, risk assessment should be based on science; be transparent; be qualitative and/or quantitative; be conducted according to a structured approach; be reassessed and re-evaluated over time; be flexible; and reflect real-life situations as much as possible. To achieve this, multidisciplinary, high-quality work is needed. This implies that the resources used for risk assessment can be significant.

4.2. What has been neglected

Risk assessment is a valuable tool for decision making by risk managers and it may be applied
for a variety of purposes, such as:


1. developing legislation or control programs;
2. comparing risks caused by various hazards in foodstuffs;
3. collecting and presenting information available for decision making;
4. harmonizing decision making in different regions;
5. targeting the limited resources in food safety work.

4.3. What needs to be done

The use of risk assessment in international trade as well as in practical applications will increase in the near future. Provided it is properly applied, it can improve decision making. In addition to expertise in basic risk assessment, increased competence and resources for both risk assessors and risk managers are called for, to allow the effects of different intervention strategies and the costs involved to be studied, and risks and benefits to be compared in different decision scenarios.

References

Aakkula, J., Peltola, J., Maijala, R. and Siikamäki, J., 2005. Consumer attitudes. Underlying perceptions and actions associated with food quality and safety. Journal of Food Products Marketing (in press).
Alban, L., Olsen, A-M., Nielsen, B., Sørensen, R. and Jessen, B., 2002. Qualitative and quantitative risk assessment for human salmonellosis due to multi-resistant Salmonella Typhimurium DT 104 from consumption of Danish dry-cured pork sausages. Prev. Vet. Med. 52, 251-265.
Cassin, M.H., Lammerding, A.M., Todd, E.C.D., Ross, W. and McColl, R.S., 1998. Quantitative risk assessment for Escherichia coli O157:H7 in ground beef hamburgers. Int. J. Food Microbiol. 41, 21-44.
Christensen, B., Sommer, H., Rosenquist, H. and Nielsen, N., 2001. Risk assessment on Campylobacter jejuni in chicken
products. The Danish Veterinary and Food Administration, Institute of Food Safety and Toxicology, p. 138.
CAC (Codex Alimentarius Commission), 1999. Principles and guidelines for the conduct of microbiological risk assessment. CAC/GL-30.
Duffy, S. and Schaffner, D.W., 2002. Monte Carlo simulation of the risk of contamination of apples with Escherichia coli O157:H7. Int. J. Food Microbiol. 78, 245-255.
European Food Safety Authority. Home page:
European Commission, Scientific Committees, 2000, 2002. Home page:
Hartnett, E., Kelly, L., Newell, D., Wooldridge, M. and Gettinby, G., 2001. A quantitative risk assessment for the
occurrence of Campylobacter in chickens at the point of slaughter. Epidemiol. Infect. 127, 195-206.
Hope, B.K., Baker, A.R., Edel, E.D., Hogue, A.T., Schlosser, W.D., Whiting, R., McDowell, R.M. and Morales, R.A., 2002. An overview of the Salmonella Enteritidis Risk Assessment for Shell Eggs and Egg Products. Risk Analysis 22, 203-218.
Kangas, S., Lyytikäinen, T., Peltola, J., Ranta, J. and Maijala, R., 2003. Economic impacts of the Finnish Salmonella
Control Programme for broilers. Working paper of the National Veterinary and Food Research Institute. EELAn
julkaisuja 02/2003, 75 p.
Lievonen, S., Havulinna, A. and Maijala, R., 2004. Egg consumption patterns and Salmonella risk in Finland. J.
Food Protect. 67, 2416-2423.
Maijala, R. and Peltola, J., 2000. Economics of Food Safety in Finland – case: National Salmonella Control Program.
Agricultural economics research institute. In Finnish, summary in English. Working papers 13/200, p. 55.


Maijala, R., Ranta, J., Seuna, E., Pelkonen, S. and Johansson, T., 2005a. A quantitative risk assessment of the public
health impact of the Finnish Salmonella Control Program for broilers. Int. J. Food Microbiology 102, 21-35.
Maijala, R., Ranta, J., Seuna, E. and Peltola, J., 2005b. The efficiency of the Finnish Salmonella Control Programme.
Food Control vol. 16 (8), 669-675.
Nauta, M.J., van de Giessen, A.W. and Henken, A.M., 2000. A model for evaluating intervention strategies to control
Salmonella in the poultry meat production chain. Epidemiol. Infect. 124, 365-373.
OIE, 2003. Terrestrial Animal Health Code, 2003.
Oscar, T.P. 2004. A quantitative risk assessment model for Salmonella and whole chickens. Int. J. Food Microb. 93,
Pawitan, Y., Griffin, J.M. and Collins, J.D., 2004. Analysis and prediction of the BSE incidence in Ireland. Prev. Vet.
Med. 62, 267-283.
Peltola, J., Aakkula, J., Maijala, R. and Siikamäki, J., 2001. Valuation of economic benefits from the Finnish
Salmonella Control Program. Agrifood Research Finland, Economic Research. Working papers 30/2001.
Ranta, J. and Maijala, R., 2002. A probabilistic transmission model of Salmonella in the primary broiler production
chain. Risk Analysis 22, 47-58.
Ranta, J., Tuominen, P., Rautiainen, E. and Maijala, R., 2004. Salmonella in pork production in Finland – a
quantitative risk assessment. Working paper of the National Veterinary and Food Research Institute. EELAn
julkaisuja 03/2004,107 p.
Ranta, J., Tuominen, P. and Maijala, R., 2005. Estimation of true Salmonella prevalence jointly in cattle herd and
animal populations using Bayesian hierarchical modelling. Risk Analysis 25, 23-37.
Sugiura, K., Ito, K., Yokoyama, R., Kumagai, S. and Onodera, T., 2003. A model to assess the risk of the introduction into Japan of the bovine spongiform encephalopathy agent through imported animals, meat and meat-and-bone meal. Rev. Sci. Tech. Off. Int. Epiz. 22, 777-794.
USDA, FSIS, 1998. Salmonella Enteritidis Risk Assessment. Shell Eggs and Egg Products.
Wahlström, H., Elvander, M., Engvall, A. and Vågsholm, I., 2002. Risk of introduction of BSE into Sweden by import
of cattle from the United Kingdom. Prev. Vet. Med. 54, 131-139.
WHO. Microbiological risks publications. Home page:
WHO/FAO, 2002. Risk assessments of Salmonella in eggs and broiler chickens. Microbiological Risk Assessment Series 1.
WHO/FAO, 2004. Risk assessment of Listeria monocytogenes in ready-to-eat foods.


Food safety: A must for the food chain

Ivar Vågsholm
Swedish Zoonoses Center, National Veterinary Institute (SVA), S 751 89, Uppsala, Sweden,


A food market requires foreseeable food safety and quality to function well. Another prerequisite is that any deviations are transparent to all stakeholders. In the food chain, from feed production to consumption, there are incentives for food business operators to cheat one another. There is a need to acknowledge the shared responsibility of all food business operators for the quality and safety of the food chain, and for its seamless supervision. The choice of appropriate control measures will always be made on a case-by-case basis; applying HACCP principles will be helpful, while for the trade in live animals the benefits to be gained must be weighed against the risks and the possible control options.

Keywords: food chain, integrated production systems, moral hazard, externalities, risk
management options

1. Introduction

Somewhat simplistically, one might define the food chain as including all food business operators involved in producing food, ranging from feed producers, farmers, processors, wholesalers and retailers to caterers, up to the consumers. However, most pathogens are not uniquely foodborne and may have several other transmission paths. The point at which a raw material becomes a foodstuff has moved backwards in the food chain: dairy farmers thus produce the foodstuff milk, not a raw material.

The basic proposition of this paper is that food safety is a prerequisite for a well functioning
market for foodstuffs and animal products.

The second proposition is that there are incentives to cheat (moral hazard) in the trade between food business operators along the food chain, unless the joint responsibility for the good functioning of the food market is acknowledged and assumed.

The third proposition is that risk management measures are usually complementary and that
in most cases no single measure will suffice. Finding the right mixture of food safety options
adapted to local conditions will determine whether one can succeed in keeping food safe. This
will always be a decision on a case-by-case basis if the food safety system is integrated along
the food chain. The HACCP principles can be a helpful tool in identifying the best control
options for a particular food chain.


The fourth proposition is that risk management measures are usually more efficient and more
cost effective the earlier they are applied in the food chain. A related proposition is that
control early in the food chain could also contribute to the control of zoonoses with other
transmission paths.

The final proposition is that a food chain should be seamlessly supervised to afford food safety.

This contribution will examine the evolution of the food chains, the need for integrated
production systems, and the possibilities for pre- and post-harvest control including HACCP,
and will finally address the issue of trade in live animals.

2. Food safety: an economic perspective

Why is food safety a must for a well functioning food market? One of the basic assumptions
of a well functioning market (Varian, 1984: 290-305) is that all stakeholders along the food
chain, from food business operators to consumers, are well informed about the goods they
sell or purchase. This does not mean that a foodstuff must be of perfect quality and safety, it
rather implies that the quality and safety should be foreseeable entities and that deviations
from the expected safety and quality should be transparent for all parties. If this is not the
case, the consumers will usually respond to uncertainty as regards quality and safety by
demanding less of the foodstuff. For example, the BSE episode caused a crisis in the EU beef
market with the consequence of lower beef prices; the European Community's response is
outlined in the speech that the previous Food Safety Commissioner, David Byrne, gave to
European farmers in February 2001 on the measures needed to regain consumer confidence
(Byrne, 2001). Since foodstuffs have steep supply and demand curves, a reduction in demand
will result in a disproportionately large drop in prices, assuming the market were left to match
supply and demand by itself. If the market is regulated, as is the case for the EU beef market,
this will tend to be expensive for the tax-payer as second-best policies are implemented, such
as increased payments for price support or increased storage of commodities. The choice of
the best policy is difficult; Gardener (1987: 273-313) reviews the possible policies for handling
risk and uncertainty under different agricultural policy scenarios. However, good and
foreseeable food safety will always be beneficial for the good functioning of the market.

3. Foodborne zoonoses

3.1. Additional transmission paths

Foodborne zoonoses have several transmission paths in addition to the foodborne one.
For example, in the case of enterohaemorrhagic Escherichia coli O157 [(EHEC) also referred
to as human pathogenic verotoxin producing Escherichia coli O157 (HP-VTEC O157)] these
also include direct animal contact (children playing with calves and lambs), contact with
recreational water sources when swimming, or cross contamination due to irrigation with
contaminated water (Mead and Griffin, 1998). Nevertheless, if control measures for foodborne
zoonoses are taken early in the food chain, these could also contribute to the control of the
same zoonoses along non-food pathways.

The food chains have become longer and more complex in recent years. It might therefore be
helpful to start by examining the evolution of the food chains and to pinpoint certain important
issues: the chain has become less transparent, there are externalities or incentives to cheat,
and there is a need for integrated production systems.

It should be noted that eating ready-to-eat foods (e.g. sandwiches) that are often cold-stored
for a long time and transported over long distances has become customary. One example is
the sandwiches purchased from the fridge at petrol stations.

The increased marketing of fresh foods raises the issue of how control over food production
can be afforded. For example, there is a persistent risk of infection with HP-VTEC O157
(Österberg, 2005) or Shigella sp. when eating fresh foods such as salads (Long et al.,
2002) that have been contaminated by irrigation with contaminated water.

3.2. Old solutions - new problems

The traditional solutions to assure food safety are sometimes associated with new problems.
One example from the Far East is the use of wet markets full of live birds for the trade of
poultry meats (Perdue and Swayne, 2005). The local opinion is that only clinically healthy
birds are safe to eat, while frozen poultry meat could be unsafe. Although this control option
makes sense with respect to many foodborne diseases that affect birds clinically, it also
allows zoonotic transmission of pathogens such as avian influenza (HPAI) through aerosol
formation. This illustrates a general point, i.e. that the solution of one problem can be the
origin of the next problem.

3.3. Fresh produce – a simple and difficult food chain

Fruits and vegetables can become contaminated with HP-VTEC O157 or other faecal pathogens,
whilst growing in fields, or during harvest, handling, washing/cleaning, processing,
distribution, retail, preparation, and final use (Beuchat, 1996). Contamination may be
associated with the use of improperly treated manure as fertiliser, exposure to faecally
contaminated irrigation or washing water, or contact with animals, birds or insects pre- and
post-harvest. The extent and the impact of this kind of contamination on consumer health
are unclear, since limited data are available. Also important is to consider when a vegetable
stops being a raw material and becomes a foodstuff. Is it when the salad is growing in the
fields and exposed to irrigation and fertilisation, when the salad is harvested, when it is
packaged for marketing or at the point of sale? In this context it is important to realize that
the point of contamination is often the irrigation or fertilisation, particularly when carried
out just before the salad is harvested.


3.3.1. Irrigation

Good examples of irrigation-related foodborne illnesses are a number of recent VTEC O157:H7
outbreaks in the USA that have been linked to irrigation with contaminated water (CDC,
1999). Also the largest outbreak of VTEC O157:H7 in Sweden, with more than 100 cases, was
due to contaminated salad (Österberg, 2005). The transfer of foodborne pathogenic micro-
organisms from irrigation water to fruits and vegetables depends on the irrigation technique
used (e.g. sprinklers) and on the nature of the produce e.g. carrots or lettuce. It may be
noted that VTEC O157 will survive for prolonged periods in fresh water, especially at low
temperatures (Wang and Doyle, 1998).

3.3.2. Fertilising

Sewage, manure, slurry, sludge and compost of human and animal origin are commonly used
as organic fertilisers for fruit and vegetable production. Several epidemiological investigations
have identified manure as the source of contamination of VTEC outbreaks (Nguyen-the and
Carlin, 2000).

It appears that, to afford acceptable food safety for fresh produce, the food chain must be
controlled at pre-harvest stages even though fresh produce is generally considered a raw
material. Subsequently it should also be controlled at the point of consumption, since there
are no processing steps that remove contamination from fresh foods.

4. Prolongation of the food chain

Historically the food chains were short and transparent, and more often than not the consumer
was also the producer or was able to observe how the food was produced. However, modern
food chains are specialised and there is a greater distance between the food producer and the
consumer. This has resulted in little transparency between food business operators and
consumers. It is not possible, by just looking at a packaged piece of beef from Argentina,
mutton from New Zealand or rocket salad from the Bay of Naples, to judge whether the
foodstuff is safe or not. Thus, the consumer has no other option than to trust all food
business operators along the food chain with regard to food safety and quality.

Also, food and feed markets are becoming global. This becomes evident when visiting a
supermarket where animal and vegetable foods originating from all over the world are offered
for sale. The same is the case for animal feedstuffs that are imported from all over the world
and fed to locally produced animals. The question, then, is whether beef or pork is really
locally produced or part of a global food chain.

Moreover, with higher production intensity in farming has come the need for a more energy-
concentrated and protein-rich feedstuff usually supplied from sources outside the farm. The
farmer has no supervision of the production of concentrate feedstuffs and must trust the
supplier(s) of feedstuffs that the feed is safe. In other words, the traditional food production
has been replaced by a less transparent system in which food business operators along the
food chain must rely on mutual trust and on compliance with the legislation.

In the food business there are economic incentives to cheat both suppliers and customers
with regard to the quality and safety of a foodstuff, in particular if the failure of a product
is not likely to be detected; the latter is a direct consequence of the lack of transparency of
the food chain. Thus, an individual food business operator or a group of food business
operators could, at the expense of an entire industry, profit from selling unsafe or substandard
food products. In economic analysis this is usually referred to as "externalities of the market"
or the problem of "moral hazard" (Varian, 1984: 298-299).

In addressing the question how to deal with cheating it should be noted that all food
business operators from feed producers, farmers, processors, to retailers have a joint interest
in optimizing the efficiency of the food chain.

Possible solutions to the problem of externalities along the food chain include government
regulation, contractual agreements stipulating that all food business operators involved
jointly share the gains or losses of the end product, and/or vertical integration of the food
chain whereby one company or corporation takes control of the entire food chain.

Government regulation aims at managing the externalities along the food chain, thereby
securing that buyers and sellers are aware that, through compliance with the regulations,
the foodstuffs placed on the market adhere to certain food safety standards. These regulations
have usually been responses to the emerging problems of the day, rather than the result of a
structured general approach. Food safety was assumed to be inherent in the regulations, on
the assumption that the government could guarantee the wholesomeness of a food: as long
as non-compliance was not detected, food was thought to be safe. This approach afforded
some success, e.g. in protecting consumers against specific hazards such as trichinellosis,
where compulsory control for Trichinella in pork also increased the consumers' confidence in
the safety of pork. Another example is the compulsory pasteurization of milk, which has
resulted in pasteurized milk being seen as one of the safest foodstuffs.

However, these regulations did not suffice as the food chains became international, more
complex and intertwined, as illustrated by several food crises in Europe experienced during
the past 20 years:

• The epidemic of Salmonella enteritidis PT 4 via eggs infected through the ovaries
(Humphrey, 1991; Humphrey et al., 1998; Mawer et al., 1989). This is an example of
the emergence of a new transmission path for a pathogen with public health relevance.
Currently eggs still appear to be the main source of human salmonellosis within the EU
(Kaesbohrer, 2004).
• The Bovine Spongiform Encephalopathy (BSE) epidemic. The origin of BSE is somewhat
ambiguous, possibly appearing as undetected cases in cattle in the late 1970s according
to the BSE Inquiry in the UK released in 2001.
However, it appears that changes in rendering practices (lower temperatures and pressures,
cessation of fat extraction with ether) were important in allowing the agent to be recycled
and transmitted through the use of meat and bone meal as a feed additive.
• Increased risk for listeriosis due to new eating habits and technological changes in the
food chain:
– Ready-to-eat foods have been a commercial success. For the convenience stores
the prospect of long storage times for fresh sandwiches through cold storage is
a profitable feature of the trade. However, this opens up a new transmission path
for Listeria monocytogenes, with an increased risk of human listeriosis in particular
amongst vulnerable groups. In cold storage (0-4 °C) the normal bacterial flora
is not able to grow, while Listeria monocytogenes, with its ability to grow at cold-
storage temperatures (Farber and Peterkin, 1991), can reach concentrations above
the infectious dose before the end of the cold-storage period, while the food still
appears to be fresh and wholesome (McLauchlin and Van der Mee-Marquet, 1998). The
infectious dose for Listeria monocytogenes infections is reported to vary between
millions of c.f.u., causing gastrointestinal symptoms in healthy people, and hundreds
to thousands of c.f.u. for vulnerable groups (SCVMPH, 1999).
– The increase of obesity in industrialised countries has resulted in the introduction of
new diets including sprouted seeds, which have been implicated in several outbreaks
involving bacterial pathogens, e.g. verotoxigenic Escherichia coli (VTEC) O157:H7
(NACMCF, 1999). The largest outbreak involved VTEC O157:H7 in contaminated radish
sprouts, with over 6,000 infected people in Japan (Michino et al., 1999).
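The cold-storage scenario for Listeria monocytogenes described above can be made concrete with a simple exponential-growth sketch. The initial count and the doubling time below are illustrative assumptions only; actual growth rates at 0-4 °C vary widely with the food matrix:

```python
def cfu_after(initial_cfu: float, days: float, doubling_time_days: float) -> float:
    """Exponential growth: N(t) = N0 * 2**(t / t_double)."""
    return initial_cfu * 2 ** (days / doubling_time_days)

# Assumed for illustration: 100 c.f.u. initially, doubling roughly every
# 1.5 days at refrigeration temperature.
for day in (0, 7, 14, 21):
    print(day, round(cfu_after(100, day, 1.5)))
```

Under these assumptions the count climbs from about 10^2 to 10^5-10^6 c.f.u. within a two-to-three-week shelf life, i.e. into the dose range reported for vulnerable groups, while the sandwich still looks fresh.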

In response to these events the European Commission issued the White Paper on Food
Safety, which outlines the basis and principles of modern food safety regulations. The
guiding principles are that all stakeholders (defined as food and feed business operators)
have an individual and joint responsibility for the safety of the foodstuff and that all
legislation shall be on a farm-to-table basis.

Complementing the evolution of food safety legislation, the advent of brands could be seen
as a signal to consumers that a foodstuff of a particular brand has a high and foreseeable
quality and safety, thus justifying a higher price. The owners of brands for foodstuffs,
be it milk, cheese or hamburgers, are to guarantee the wholesomeness of the food to the
consumer. Thus product quality should be meticulously guarded to assure consumer confidence
and a high profitability of the industry.

Hence, in response to both the legal evolution outlined in the White Paper and the
associated advent of brands guaranteeing food safety, there is a need to develop integrated
food production systems.

5. Integrated food production system

The Scientific Committee of Veterinary Measures relating to Public Health (SCVMPH, 2001)
issued an opinion about integrated food production systems where meat inspection systems
could be modernized. The following points were suggested as guiding principles for what
constitutes an integrated production system in the food chain:

• The possibility to assess the system as an integrated part of a singular epidemiological
unit, for instance in the form of an epidemiological framework/organisation that could
use all information collected along the food/feed chain to exploit the synergies between
the parts of the food chain and to maximise food safety.
• The integrated system should allow parties or food business operators in this system to
be clearly defined and identifiable. Furthermore, in some settings it would be desirable
to define these integrated production systems geographically.
• No participant should be able to enter or leave without a clearly defined procedure, thus
ensuring that those entering fulfil all the requirements of the system and those
leaving "go through one door", avoiding half-in or half-out participation.
• There should be a free flow of information and transparency between all parties in the
system.
• No feed or animals are allowed to enter the system or reach slaughter unless they
originate from feed or holdings that comply with the system's requirements. The farms
or animal holdings should not be able to deliver animals to abattoirs outside the system.
If abattoirs take deliveries from holdings outside the system, these should be separated
all along the food chain, and safeguards put in place to protect the integrity of the
integrated system.
• No foodstuff (meat or meat products) should leave the system unless complying with
the system requirements.
• There should be comprehensive veterinary supervision of the complete system and it should
be possible to establish the responsibilities and accountability for its good functioning.
The supervision of the system should target the entire epidemiological unit rather than
its particular parts and include the possibility of withdrawing the approval or recognition
of the integrated system.
• Those responsible for the epidemiological monitoring of the system should be clearly
identified. Furthermore, this responsibility would include bringing together and analysing
all collected information to obtain an estimate of the risks in the system; in other words,
an ongoing risk assessment that should prompt risk management measures to be taken
if needed.

While this is not a list of binding requirements, it indicates how the food chains ought to
evolve to afford food safety. Thus, from a food safety perspective, a vertical integration of
the food chain industry would be desirable. However, this does not necessarily imply that one
company should control the food chain; it is entirely conceivable that a farmers' or
consumers' cooperative takes such initiatives. The key point is that the joint interest and
mutual benefit of an integrated food chain be served in the interest of all food business
operators and other stakeholders.


6. Pre- and post-harvest control

Measures should be taken early in the food chain to keep it free of foodborne pathogens.
Control of bovine tuberculosis, bovine spongiform encephalopathy (BSE) and
Salmonella spp. serve as examples of this approach. The advantages to be derived from
controlling food hazards in primary production include:

• Lower risk of zoonoses along non-food pathways.
• Limiting the possibility of cross-contamination between raw and ready-to-eat foods
along the food chain.
• Facilitating the elimination or reduction of zoonotic pathogens such as Salmonella,
Campylobacter and HP-VTEC O157, provided there is no contamination of the raw
foodstuffs entering the food chain; for other pathogens (e.g. Listeria monocytogenes)
this would be less relevant.
• Offering an indirect solution to antibiotic resistance problems along the food chain, because
the prevalence of foodborne pathogens that could acquire resistance is reduced.

On the other hand, it could be argued that risk management should take place just before
consumption, to ensure the food is safe before it is consumed, complementing the measures
taken early in the food chain. Pasteurization of milk, cooking or heat treatment of food,
and measures to avoid cross-contamination during food preparation are examples of the latter.

Hence it is only by linking all these strategies that a reasonable control policy can be
designed for a food chain. The control options for zoonoses in the feed chain will, for
example, include exclusion of certain ingredients such as mammalian meat and bone meal
(MMBM) to avoid transmission of BSE through the feeding stuffs. While the MMBM might
have contained prions, it is usually free from Salmonella due to the heat treatment during
rendering. However, the replacement protein feedstuff may be contaminated with Salmonella.
Hence it is again possible that the solution of the BSE problem in the EU food chain will
create a larger Salmonella problem in other food chains.

Noordhuizen et al. (2001) and Johnston (2002) suggested using a control strategy based on
hazard analysis critical control points (HACCP) principles in the primary production. When
comparing the HACCP to the Good Manufacturing Practice approach and the ISO 9000 system,
Noordhuizen et al. (2001) noted that the HACCP system is based on a bottom-up approach,
which is easier to integrate with procedures for integrated food chain quality assurance. Thus,
it provides an intellectual framework by which all the control options in the food chain are
applied. The following criteria were suggested for identifying a critical control point:

• it should have a causal relationship with the hazard;
• it should be possible to measure and monitor the control point;
• acceptable target and tolerance levels should be established;
• corrective measures should be feasible and cost-effective;
• correction must lead to restoration of lost control.
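The five criteria above act as a conjunctive checklist: a candidate point qualifies as a CCP only if every criterion is met. The class and field names below are a hypothetical illustration of that logic, not part of the frameworks cited:

```python
from dataclasses import dataclass

@dataclass
class CandidateControlPoint:
    # Field names mirror the five criteria listed above (illustrative only).
    name: str
    causal_link_to_hazard: bool
    measurable_and_monitorable: bool
    targets_and_tolerances_defined: bool
    correction_feasible_and_cost_effective: bool
    correction_restores_control: bool

    def is_critical_control_point(self) -> bool:
        # All five criteria must hold simultaneously.
        return all((self.causal_link_to_hazard,
                    self.measurable_and_monitorable,
                    self.targets_and_tolerances_defined,
                    self.correction_feasible_and_cost_effective,
                    self.correction_restores_control))

candidate = CandidateControlPoint("feed heat treatment", True, True, True, True, True)
print(candidate.is_critical_control_point())  # True
```

A candidate failing any single criterion, e.g. one where corrective measures are not cost-effective, would be managed as an ordinary control point instead.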


Mortimore et al. (2002) outlined the 7 principles of HACCP, which are operationalised in 12
tasks when establishing a HACCP system. These tasks can give guidance on how to set
up a HACCP system on the farm.

It might be of interest to look at the Swedish Salmonella control program (Sternberg-Lewerin
et al., 2005; Wierup et al., 1992, 1995; Lindqvist, 1999) where at least three major critical
control points were identified, i.e. the breeding pyramid, the feed production and on-farm
production. The Swedish Salmonella program has evolved over the past 50 years on a case-
by-case basis, and might therefore also be referred to as an accidental HACCP program. In
principle, the Salmonella control is built on surveillance of the production, and is based on
bacteriological examination and, when Salmonella is found, on always taking corrective action
to remove the Salmonella contamination from the food chain. While the diagnostic sensitivity
of Salmonella bacteriology is not perfect (Lo Fo Wong et al., 2003), it has - by using several
test points along the food chain - been possible to detect Salmonella contamination and to
keep Salmonella out of the Swedish animal product food chain.
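The compensating effect of several test points on an imperfect test can be sketched with elementary probability. Assuming, purely for illustration (this is not a figure from Lo Fo Wong et al., 2003), independent sampling occasions with a fixed sensitivity, the chance that a contamination escapes all of them shrinks geometrically:

```python
def detection_probability(sensitivity: float, n_test_points: int) -> float:
    # Probability that at least one of n independent test points
    # detects an existing contamination.
    return 1 - (1 - sensitivity) ** n_test_points

# Assumed per-occasion sensitivity of 60 %:
for n in (1, 2, 3):
    print(n, round(detection_probability(0.6, n), 3))
```

With three test points along the chain, roughly 94 % of contaminated lots would be caught under this assumption, even though each single test misses 40 % of them.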

6.1. Trade

A particular control point in pre-harvest control is the control of trade and movements of
live animals. Animal movements represent a risk for transmitting disease both via infected
animals and contaminated transport vehicles. It is believed that the great flow of animals
contributed to the severity of the FMD outbreak in the United Kingdom (Gibbens and
Wilesmith, 2002). Moreover, in a case-control study of risk factors for Salmonella typhimurium
DT 104 in bovine herds in the UK, animal movements onto and off the farm were identified
as risk factors (Evans and Davies, 1996).

Moreover, for example in the Swedish Campylobacter program, a major part of the problem
was the fraction of birds infected during the transport to slaughter (Hansson et al., 2004). It
should be noted that transports of foodstuffs or animal products represent much less risk than
transport of live animals. Hence, from a food safety point of view, transports of live animals
should be limited as much as possible, while the animals transported should be subject to
biosecurity measures equivalent to those on the farm.

The transport of animals, be it in the breeding, production or slaughter circuit, requires
specific attention, as the probabilities of spreading agents causing zoonotic and epizootic
disease between herds, regions or countries are considerable.

The risk of disease transmission is highest in the trade of live animals for breeding, due
to the potential for infecting many other animals. In the trade of animals intended for
production and slaughter the risks are lower. Even safer with regard to disease transmission
is the trade in sperm and embryos. It is usually safer to slaughter the animal at the point of
origin and to transport the meat and animal products to the point of consumption; such a
policy is also better in terms of animal welfare and the environment. The potential for
transmission of diseases by breeding animals can be exemplified by the trade in grand-parent
breeding birds.


6.1.1. Trade in live animals for breeding

The reason to pay particular attention to biosecurity and disease control at the top of the
breeding pyramid is that a single animal there may infect many animals at the production
stage. For example, in poultry production, every great-grandparent female (Elite) could
theoretically be the origin of between 156,000 and 300,000 broilers, or of between 160,000
and 300,000 laying hens producing between 4.16 x 10^7 and 9.00 x 10^7 table eggs
(Anonymous, 2004). Thus the introduction of Salmonella enteritidis at the top
of the breeding pyramid would represent a large zoonotic risk. Control options include (1)
test and removal from the breeding pyramid if positive, (2) providing increased resistance
through vaccination, (3) strategic intervention with antimicrobials or disinfectants and/or
(4) high biosecurity (i.e. change of clothes and footwear, controlled ventilation) on the site
of production. The benefit and/or the risk reduction vs. the costs relationship is usually very
favourable for interventions in the breeding pyramid.
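The multiplication down the pyramid is plain arithmetic; the factors below are back-calculated from the lower bound of the ranges quoted above and are illustrative rather than industry figures:

```python
# Back-calculated illustrative factors for the lower bound of the quoted range.
layers_per_elite_female = 160_000   # laying hens traceable to one Elite female
table_eggs_per_layer = 260          # table eggs per hen over its productive life

table_eggs_per_elite = layers_per_elite_female * table_eggs_per_layer
print(f"{table_eggs_per_elite:.2e}")  # 4.16e+07
```

A single Salmonella enteritidis introduction at the Elite level thus scales into tens of millions of table eggs.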

In the primary production the same control options are applicable but the benefit-cost ratios
are usually less favourable. Hence the choice of control options has to be more judicious and
finely balanced. With safe feedstuffs and breeding pyramids being kept free from pathogens, a
good start has been made for the control of Salmonella and BSE. However; for Campylobacter
in poultry or HP-VTEC in beef and dairy production the food safety job commences on the
production site. The optimal control strategy on one farm or region is not necessarily the
same as in other regions, and requires fine judgement and the establishment of good
communication between veterinarians, producers and other food business operators.

However, a number of other factors need to be considered. Firstly one should aim for an
all-encompassing solution for the farm controlling both zoonotic and epizootic risks, rather
than considering them separately. Cost-efficiency analyses coupled with an analysis of critical
control points would be helpful in finding the best risk management options.

Control options should include test-and-removal strategies such as test and stamping-out
(TB or brucellosis) or test followed by heat or freezing treatment (Salmonella or
Campylobacter); avoidance of cross-contamination, also referred to as logistic slaughter;
vaccination of either breeding or production animals; strategic treatments with antibiotics;
biosecurity measures; control and hygiene of animal and personnel movements to and from
the farm; and control of feed to keep it free of pathogens (BSE and Salmonella).

In summary, the decision on control strategies ought to be local and made on a case-by-case
basis, while considering the general principles, some of which were discussed in this
contribution.

7. Conclusions

7.1. What has been achieved

Food safety is a prerequisite for a well-functioning market for foodstuffs and animal products.
This can be achieved if and when all operators along the food chain acknowledge this
fact and jointly take responsibility for realising a well-functioning food chain. This realisation
is increasingly taking hold in industry, partly driven by the recent changes in European food
legislation.

It has been established that risk management measures are generally most effective and
cost-efficient the earlier in the food chain they are applied.

7.2. What has been neglected

Up until fairly recently food business operators have contributed insufficiently to achieving
food safety, partly because legal incentives existed only to a limited extent, and partly because
integrated longitudinal control options have only partly been implemented. This occasionally
leads to the ignoring of good production practices or even to cheating among these operators.

7.3. What needs to be done

Risk management measures are complementary and no single measure will suffice. The
challenge is to find the right mixture of food safety management options and to assure that
the food chain is seamlessly supervised both by governmental bodies and through industry
self-control. To achieve the latter it is crucial that food operators take joint responsibility.


References

Anonymous, 2004. Opinion of the Scientific Panel on Biological Hazards on a request from the Commission related
to the use of antimicrobials for the control of Salmonella in poultry. The EFSA Journal 115, 1-76.
Beuchat, L.R., 1996. Pathogenic micro-organisms associated with fresh produce. J. Food Prot. 59, 204-216.
Byrne, D., 2001. Speech by David Byrne, European Commissioner for Health and Consumer Protection. The Commission
policy on the health aspects of BSE - Address to COPA, Brussels, 9 February 2001. Accessed on January 31,
CDC (Centers for Disease Control), 1999. Outbreaks of Escherichia coli O157:H7 and Campylobacter among attendees
of Washington county fair-New York. Morbidity and Mortality Weekly Report 48, 803-804.
Evans, S. and Davies, R., 1996. Case control study of multiple-resistant Salmonella typhimurium DT104 infection of
cattle in Great Britain. Vet. Rec. 139, 557-558.
Farber, J.M. and Peterkin, P.I., 1991. Listeria monocytogenes a food borne pathogen. Microbiological Reviews 55,
Gardener, B.L., 1987. The economics of agricultural policies. MacMillan Publishing Company, New York, NY, USA.
Gibbens, J.C. and Wilesmith, J.W., 2002. Temporal and geographical distribution of cases of foot-and-mouth disease
during the early weeks of the 2001 epidemic in Great Britain. Vet. Rec. 151, 407-12.
Hansson, E., Olsson Engvall, E., Lindblad, J., Gunnarson, A. and Vågsholm, I., 2004. The Campylobacter surveillance
program for broilers in Sweden, July 2001-June 2002. Veterinary Record 155, 193-196.
Humphrey, T.J., 1991. Food poisoning – a change in patterns? Veterinary Annual 31, 32-37.
Humphrey, T.J., Threllfall, E.J. and Cruickshank, J.G., 1998. Salmonellosis. In: Palmer, S.R., Soulsby, L. and Simpson,
D.I.H. (Eds.). Zoonoses, Biology, Clinical practice and Public Health Control. Oxford University Press, Oxford,
UK, p. 191-206.
Johnston, A.M., 2002. HACCP in farm production. In: Foodborne pathogens, Hazards, risk analysis and control.
Blackburn, C.W. and McClure, P.J. (Eds.). Woodhead Publishing limited, Cambridge, UK, p. 127-150.


Kaesbohrer, A., 2004. Salmonella control in food of animal origin in the European Union. Veterinärmötet Nov 11-12,
2004, Uppsala, Sweden, p. 71-76.
Lindqvist, H., 1999. Control of Salmonella infection in commercial layer flocks in Sweden. In: Proceedings of a
workshop within COST Action 97, Pathogenic micro-organisms in poultry and eggs. 12. Field experience on
Salmonella control in poultry.
Lo Fo Wong, D.M., Dahl, J., van der Wolf, P.J., Wingstrand, A., Leontides, L. and von Altrock, A., 2003. Recovery of
Salmonella enterica from seropositive finishing pig herds. Vet. Microbiol. 97, 201-214.
Long, S.M., Adak, G.K., O’Brien, S.J. and Gillespie, I.A., 2002. General outbreaks of infectious intestinal disease linked
with salad vegetables and fruit, England and Wales, 1992-2000. Commun. Dis. Public Health. 5, 101-105.
Mawer, S.L., Spain, G.E. and Rowe, B., 1989. Salmonella enteritidis phage type 4 and hens eggs. Lancet 333, 280-281.
McLauchlin, J. and Van der Mee-Marquet, N., 1998. Listeriosis. In: Palmer, S.R., Soulsby, L. and Simpson, D.I.H.
(Eds.). Zoonoses, Biology, Clinical practice and Public Health Control. Oxford University Press, Oxford, UK, p.
Mead, P.S. and Griffin, P.M., 1998. Escherichia coli O157:H7. Lancet 352, 1207-1212.
Michino, H., Araki, K., Minami, S., Takaya, S., Sakai, N., Miyazaki, M., Ono, A. and Yanagawa, H., 1999. Massive
outbreak of Escherichia coli O157:H7 infection in school children, Sakai City, Japan, associated with consumption
of white radish sprouts. Am. J. Epidemiol. 150, 787-796.
Mortimore, S., Mayes, T. and Colworth, D., 2002. The effective implementation of HACCP systems in food processing.
In Foodborne pathogens, Hazards, risk analysis and control. Blackburn, C.W. and McClure, P.J. (Eds.). Woodhead
Publishing limited, Cambridge, UK, p. 229-256.
NACMCF (National Advisory Committee on Microbiological Criteria for Foods), 1999. National Advisory Committee on
Microbiological Criteria for Foods. Microbiological safety evaluations and recommendations on fresh produce.
Food Control 10, 117-143.
Nguyen-the, C. and Carlin, F., 2000. Fresh and Processed vegetables. In: The microbiological safety and quality of
foods. Lund, B.M., Baird-Parker, T.C. and Gould, G.W. (Eds.). Aspen Publication, Gaithersburg, p. 620-684.
Noordhuizen, J.P.T.M., Frankema, K. and Welpelo, H.J., 2001. Applying HACCP principles to Animal health care at
farm level. In: Application of Quantitative methods in Veterinary Epidemiology. Noordhuizen, J.P.T.M., Frankema,
K., Thrusfield, M.V. and Graat E.A.M. (Eds.). Wageningen Press, Wageningen, The Netherlands, p. 285-298.
Österberg, P., 2005. EHEC på Västkusten. (EHEC on the West Coast). Epiaktuelt 2005, 38. Accessed Jan 30, 2006. http://å%20Västkusten (in Swedish)
Perdue, M.L. and Swayne, D.E., 2005. Public health risk from avian influenza viruses. Avian Diseases 49, 317-327.
SCVMPH, 1999. Opinion of the Scientific Committee on Veterinary Measures relating to Public Health on Listeria
monocytogenes, September 23, 1999, 44 p.
SCVMPH, 2001. Opinion of the Scientific Committee on Veterinary Measures relating to Public Health on identification
of species/categories of meat producing animals in integrated production systems where meat inspection may
be revised, 13 p.
Sternberg-Lewerin, S., Boqvist, S., Engström, B. and Häggblom, P., 2005. The effective control of Salmonella in
Swedish poultry. In: Food safety control in the poultry industry. Mead, G.C. (Ed.). Woodhead Publishing in Food
Science and Technology, p. 195-215.
Varian, H.R., 1984. Microeconomic analysis. W.W. Norton & Company, London, UK, 348 p.
Wang, G. and Doyle, M.P., 1998. Survival of enterohaemorrhagic Escherichia coli O157:H7 in water. J. Food Prot.
61, 662-667.
Wierup, M., Wahlström. H. and Engström. B., 1992. Experience of a 10-year use of competitive exclusion treatment as
part of the Salmonella control programme in Sweden. International Journal of Food Microbiology 15, 287-291.
Wierup, M., Engström, B., Engvall, A. and Wahlström, H., 1995. Control of Salmonella enteritidis in Sweden.
International Journal of Food Microbiology 25, 219-226.


Risk assessment of feed additives and contaminants

Alberto Mantovani1 and Roberto Cozzani2
1Department of Food Safety and Veterinary Public Health, National Health Institute, Viale Regina Elena 299, 00161 Rome, Italy
2Chemical Laboratory, Animal Health Institute of Latium and Tuscany Regions, Via Appia Nuova 1411, 00178 Rome, Italy

Feed additives make up the bulk of chemicals used in animal production, thus representing
a major issue for safety of foods of animal origin. This paper summarizes the approaches
currently adopted by the European Food Safety Authority in order to perform risk analysis
of feed additives as regards the whole food production chain, thus including target species,
consumers, occupational exposure and the environment. Examples considered with their
peculiar critical issues are coccidiostats, essential elements, amino acids, and enzymes.
Moreover, attention is given to environmental contaminants; in particular feeds can be a major
vehicle for human dietary intake of persistent pollutants such as polychlorinated biphenyls
and some insecticides. Further examples considered include heavy metals and mycotoxins.
Critical issues include toxicological characterization, pathways of feed contamination as well
as transfer to animal products.

Keywords: safety, efficacy, residues, exposure, environment, European Food Safety Authority


1. Feed additives and food safety

Foods are produced by living organisms, either plants or animals; this may easily appear as
a “Mr. Lapalisse” statement but it has, in fact, important bearings on most issues of food
safety and veterinary public health. The environment in which food-producing organisms grow, including the feeds eaten by food-producing animals, is an essential determinant of the wholesomeness and quality of our diet (Hinton, 2000).

According to the above considerations, feeds represent a complex topic. Feeds must satisfy
the nutritional requirements of the relevant animal species. Such requirements, however,
are not simply the “physiological” ones, supporting basal metabolism, postnatal growth and
reproduction. Instead, feed composition in the industrialised world, as well as in a growing
fraction of developing countries, should support cost-effective and timely production of meat,
milk and eggs by selected, specialised breeds (e.g. Thong and Liebert, 2004; Young et al.,
2005). Therefore, whilst the origin and choice of feed ingredients have to take into account
the requirements of mass production, they also increasingly have to meet consumer demands for foods with a given taste, texture or colour, as well as safety concerns. It is well worth considering that some major recent European food safety emergencies
have originated from the contamination of feed ingredients, e.g. the bovine spongiform encephalopathy outbreak (Rickets, 2004) and the dioxin-polychlorinated biphenyls (PCB) contamination of poultry products in Belgium (van Larebeke et al., 2001). Such episodes had consequences from both the regulatory and the research points of view. For instance, the
central role played by meat-and-bone meal in bovine spongiform encephalopathy resulted
in the new regulations on the incorporation of animal proteins into diets fed to ruminants
and other farmed animals as well as in the investigation on new sources of nutrients (Sellier,
2003). As for the episodes of feed-to-food transfer and related pollutants, they prompted
the production of scientific data for a more refined risk analysis of dioxins and dioxin-like
PCB (EFSA, 2004).

Besides ingredients (and their possible contaminants), feeds utilized in intensive farming require a diverse range of additives, much like the foods consumed by populations throughout the industrialized world. In fact, feed additives make up the bulk of chemicals used in animal
production, thus representing a major issue for safety of foods of animal origin. As laid down
by the EC regulation (EC, 2003) they are a large and heterogeneous group of compounds added
to feeds due to their nutritional (vitamins, trace elements), zootechnical (such as growth
promoters, coccidiostats and anti-blackhead compounds), sensory (colorants and flavours)
or technological (antioxidants, preservatives, emulsifiers, etc.) role; moreover, the increasing
importance of enzymes and micro-organisms as probiotics must be taken into account. Such
a diverse range of compounds bears relevance to a number of specific issues besides the
general objective of ensuring that possible residues in animal products would not pose any
appreciable risk to consumers.

2. Risk assessment of feed additives in Europe

The “farm-to-fork” approach promoted by the European Union (CEC, 2000) requires the
assessment and control of major components of the food production chain, with emphasis on
primary production. Accordingly, Europe has paid significant attention to the assessment of
feed additives. Criteria for authorization have been established by Council Directive 70/524/
EEC (EU, 1970), followed by many updates; the Directive is based on three main principles:
(1) pre-market authorization, (2) positive list principle, and (3) thorough assessment of
possible effects on human and animal health as well as on the environment. Moreover, up
to 2002, scientific advice was provided by the Scientific Committee on Animal Nutrition,
established under the auspices of the European Commission.

Since 2003, risk analysis of feed additives has been a task of the Panel on additives and products
or substances used in animal feed (FEEDAP) within the European Food Safety Authority.
EFSA was the combined result of several alarms, including major feed-related episodes (van Larebeke et al., 2001; Rickets, 2004) that undermined consumer confidence in the safety of the food chain, and of the establishment of consumer health protection as a primary European objective (CEC, 2000). The role of EFSA is to provide independent scientific advice
on all matters linked to food and feed safety, including animal health and welfare, as a
sound basis for regulatory decisions and risk management by the European Commission and
Member States.


2.1. Risk analysis of feed additives

Assessing the risks associated with a feed additive is a complex process that requires a comprehensive, multidisciplinary approach covering all aspects relevant to the use of a
given substance. Compounds intended for deliberate addition/use in animal feed should
have a proven efficacy, should be safe for animals and consumers at the intended dose
levels; moreover the safety for the user/worker and for the environment should be assessed.
Accordingly, the EFSA FEEDAP Panel includes a broad range of expertise, from animal welfare through to chemistry, pharmacology, microbiology, toxicology and ecotoxicology. In particular, the general procedure for risk analysis of feed additives addresses the following elements:

1. general characteristics, including mode of action, stability in feeds, availability of validated analytical methods, etc.;
2. efficacy under the given conditions of use, including possible effects on the quality of animal products;
3. safety for the target species, including possible interactions with other additives or veterinary drugs;
4. metabolism and residues, including the determination of a marker residue;
5. safety for consumers, which includes the evaluation of a range of tests on medium- and long-term effects, including genotoxicity and reproductive toxicity; the overall assessment of toxicological characteristics leads to the determination of the Acceptable Daily Intake (ADI), Maximum Residue Limits (MRLs) and the withdrawal period, along the same lines as for veterinary drugs (Macrì and Mantovani, 2002);
6. occupational safety for manufacturers and users, considering the exposure to powders and dust; parameters thus include inhalation exposure, sensitization and irritancy;
7. last but not least, assessment of potential ecotoxicity, since mass use of feed additives in intensively farmed animals may lead to significant environmental exposure through animal excreta (Wollenberger et al., 2000).

Moreover, specific issues may be important for some groups of substances. For instance,
mineral salts need a careful evaluation of the balance between requirements and possible
excess in both target species and consumers (Phillips, 1997). As regards micro-organisms
and enzymes (Becquet, 2003), concerns about residues are unlikely; safety evaluation is
focussed on such issues as production of toxins, residual pathogenicity and induction of cross-
resistance; sensitization of workers to microbial protein products also deserves attention.
Also for essential amino acids (EFSA, 2005b) potential safety concerns might be primarily
associated with the production method and the resulting impurities.

3. Examples of EFSA evaluation of feed additives

3.1. A coccidiostat: Narasin (EFSA, 2004b)

Within a general evaluation at European level of coccidiostats for compliance with regulatory
requirements, FEEDAP was requested to evaluate the efficacy and safety of Monteban® G100, a product containing not less than 10% of narasin activity as the active substance.
Although previous data indicate that narasin is effective as a coccidiostat for chickens for fattening at a dose range of 60-80 mg kg-1 complete feed, no recent field studies are available to prove that the compound is still efficacious at this dose range. The development of resistance
against coccidiostats, including narasin, is well recognized. However, it may effectively be
counteracted in practice by rotation or by shuttle programs, and no unusual resistance to
narasin is expected to appear. Monteban G100 at the use level for chickens is dangerous
to horses, turkeys and rabbits. Tolerance tests showed a small margin of safety (about 1.4)
in target animals; moreover, the interaction with some medicinal substances (i.e. tiamulin) justifies a warning label against the simultaneous use of these substances. Narasin, at the levels
used for treatment of coccidiosis, is also effective in the prevention of necrotic enteritis in
chickens. The compound is active against Gram-positive bacteria, while Enterobacteriaceae
are resistant. There is no cross-resistance to other antimicrobials except to salinomycin.
Increased shedding of Salmonella is unlikely to occur under practical conditions. Narasin is
absorbed to an unknown extent and excreted rapidly by the chicken. The excretion routes
are not established in the chicken whereas faecal excretion prevails in the rat. The main
metabolic pathway is similar in the chicken and rat and it involves oxidative processes.
Narasin metabolites in tissues and excreta are qualitatively similar. The liver is the target
tissue for total residues. However, unchanged narasin disappears quickly from tissues, while it is somewhat more persistent in the skin/fat, where it represents the major fraction. Each of the many narasin metabolites represents less than 10% of the total tissue residues; therefore,
for food control purposes narasin could be retained as a practical marker residue and skin/fat
as marker tissue. Narasin may concentrate in the egg yolk (Rokka et al., 2005); however, the
compound is not intended for use in laying hens.

Not all the toxicological studies on narasin were of satisfactory quality. This was a serious
problem for other coccidiostats, such as robenidine hydrochloride, for which, as a result of the inadequacy of the available data, FEEDAP could not establish an ADI (EFSA, 2004c).
As for narasin, the critical effects were focal degeneration of skeletal muscles, including the
diaphragm, and peripheral neuropathy in dogs: accordingly, the NOEL of 0.5 mg kg-1 bw day-1 seen in the one-year dog study was used to set the ADI of 5 µg kg-1 bw (equal to 300 µg day-1 for a person of 60 kg bodyweight).
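
The NOEL-to-ADI step above can be made explicit as a worked calculation. The sketch below assumes the conventional 100-fold uncertainty factor (10 for inter-species times 10 for intra-species variation); the default factor is a standard convention, not stated in the opinion itself:

```python
def adi_from_noel(noel_mg_per_kg_bw_day: float, uncertainty_factor: int = 100) -> float:
    """Derive an Acceptable Daily Intake (mg/kg bw/day) from a NOEL by
    applying an uncertainty factor (default 100 = 10 inter-species x 10
    intra-species variation)."""
    return noel_mg_per_kg_bw_day / uncertainty_factor

# Narasin: NOEL of 0.5 mg/kg bw/day (focal muscle degeneration and
# peripheral neuropathy in the one-year dog study)
adi_mg = adi_from_noel(0.5)            # 0.005 mg/kg bw = 5 ug/kg bw
allowance_ug_day = adi_mg * 1000 * 60  # for a 60 kg person: 300 ug/day
```

Both figures reproduce those quoted in the text (5 µg kg-1 bw and 300 µg day-1).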

A uniform MRL (maximum residue limit) of 0.05 mg narasin kg-1 wet tissue is proposed for
all tissues. The withdrawal time of 1 day would be considered sufficient. Validated methods
are available which allow monitoring of narasin in premixes and complete feeding stuffs and
the determination of the marker residue, narasin, in the liver, kidney, muscle and skin/fat
of the chicken. As regards occupational safety, Monteban® G100 can cause irritation to the
eyes but not to the skin. Inhalation studies in dogs showed that narasin is potentially highly
toxic by the inhalation route, compared with the oral route. Moreover, it has sensitization
potential by skin contact and by inhalation. However, the product is formulated as granules
with a low dusting potential. For this reason, it is expected that workers will not be exposed
by inhalation to toxic levels of narasin dust as a result of its handling. Nevertheless, the
FEEDAP Panel recommended the use of appropriate personal protective equipment for the workers handling the product.

As in the case of other coccidiostats, the data were insufficient as regards environmental
risk assessment. Based on the available information on the toxicity, fate and behaviour of
narasin, it cannot be excluded that the use at the recommended dose range poses a risk for
soil organisms. Insufficient data were provided to assess the risk for the aquatic environment
and secondary poisoning of birds and mammals. Therefore, although an ADI and MRL could
be defined, FEEDAP noted a deficiency of data on both efficacy and environmental impact
of narasin.
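
As a consistency check, the proposed uniform MRL can be compared with the ADI via a theoretical maximum daily intake (TMDI). The consumption basket used below (300 g muscle, 100 g liver, 10 g kidney, 90 g skin/fat) is the kind of standard figure used in EU residue evaluations; it is assumed here purely for illustration and is not taken from the opinion:

```python
# Hypothetical daily consumption basket of poultry tissues (kg) -- an
# illustrative assumption, not a value from the FEEDAP opinion.
basket_kg = {"muscle": 0.300, "liver": 0.100, "kidney": 0.010, "skin_fat": 0.090}
mrl_mg_per_kg = 0.05   # uniform MRL proposed for all tissues
adi_ug_per_day = 300   # narasin ADI for a 60 kg person

# TMDI if every tissue contained residues exactly at the MRL
tmdi_ug = sum(qty * mrl_mg_per_kg for qty in basket_kg.values()) * 1000
fraction_of_adi = tmdi_ug / adi_ug_per_day  # well below 1
```

Under these assumptions the TMDI is 25 µg/day, i.e. under a tenth of the ADI, which is the kind of margin that supports a uniform MRL across tissues.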

3.2. A trace element: iodine (EFSA, 2004a)

Iodine is a well-known essential trace element for humans and animals, due to its incorporation into the thyroid hormones; low iodine intake leads to hypothyroidism, with dramatic effects on growth and development. The European Commission asked EFSA to evaluate the
physiological requirements for iodine of the different animal species and to advise on the
possible detrimental effect of the current levels authorised under Directive 70/524/EC (4,
20 and 10 mg/kg feed for horses, fish and all other species, respectively).

The iodine requirements for farm animals vary between 0.1 and 1.1 mg/kg feed. Within
species the requirements are influenced by physiological demands for growth, reproduction or
lactation, and also by dietary factors (e.g. goitrogens). In most cases iodine supplementation of the daily ration is necessary due to the low iodine content of plant feeding stuffs. Although
large European areas are iodine-deficient, clinically evident iodine deficiency in animals is nowadays rare due to feed supplementation.

Based on the limited available data, maximum tolerable dietary iodine levels can be defined
for some species, e.g. 5 mg/kg feed for laying hens and higher than 60 mg/kg feed for farmed
fish. The iodine tolerance of pigs and fish is far above the levels set by EU regulations; moreover, the
tolerances are 3 to 10-fold higher than the requirement, allowing sufficient compensation for
potential goitrogenic substances in feed. However, at present the upper safe level for dairy cows, calves, chickens for fattening, turkeys, sheep, goats and rabbits cannot be determined.

Higher dietary iodine supply results in increasing iodine excretion mainly by urine, but also
via milk and eggs, and to a considerably smaller extent in body deposition (except sea food).
Among food from terrestrial animals milk and eggs show the highest iodine concentrations.
All available data on iodine concentrations in foods of animal origin as well as estimates
of dietary intake in Europe do not support an association between current levels of iodine
feed supplementation and risks of excessive iodine intake in humans. It must be noted,
however, that the actual, current levels of use in mammals and birds are lower than the
maximum levels authorised under Directive 70/524/EC. On the other hand, worst-case scenario model calculations for milk and eggs, based on the authorized maximum iodine level in feed, show that the upper tolerated limit of iodine intake in humans could be exceeded
for adults and adolescents (i.e. 120-130 µg day-1) (Scientific Committee on Food, 1992).
Reducing iodine to a maximum of 4 mg/kg complete feed for dairy cows and laying hens
would result in a satisfactory margin of safety for the consumption of milk and eggs, still
fulfilling the iodine requirements in farm animals. As for farmed fish, supplementation of the diet with the maximum recommended levels (20 mg iodine/kg) will still result in lower tissue concentrations than those found in wild marine fish.
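
The worst-case reasoning for milk can be sketched as a simple linear carry-over model: the dietary iodine dose is multiplied by a feed-to-milk transfer fraction, diluted in the daily milk yield, and scaled by human milk consumption. All numerical inputs below (feed intake, transfer fraction, milk yield, human consumption) are illustrative assumptions, not the values used by FEEDAP:

```python
def iodine_intake_from_milk_ug(feed_mg_per_kg: float, feed_kg_day: float,
                               transfer_fraction: float, milk_yield_kg_day: float,
                               milk_consumed_kg_day: float) -> float:
    """Human iodine intake (ug/day) from milk under a linear carry-over model."""
    dose_mg_day = feed_mg_per_kg * feed_kg_day
    milk_conc_mg_kg = dose_mg_day * transfer_fraction / milk_yield_kg_day
    return milk_conc_mg_kg * milk_consumed_kg_day * 1000

# Illustrative inputs only: 20 kg feed/day, 25% transfer to milk,
# 25 kg milk yield/day, 0.5 kg milk consumed/day by a person.
worst_case = iodine_intake_from_milk_ug(10, 20, 0.25, 25, 0.5)  # max authorised 10 mg/kg
reduced = iodine_intake_from_milk_ug(4, 20, 0.25, 25, 0.5)      # proposed 4 mg/kg
```

The point of the model is structural rather than numerical: because the model is linear, lowering the feed limit from 10 to 4 mg/kg scales the estimated human intake down proportionally, which is how the proposed reduction restores a margin of safety.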

FEEDAP stressed that iodine-supplemented feeds are not the only, and possibly not the major, source of iodine in the human diet. Iodine-enriched salt, supplemented food items, tablets, and
beverages may all contribute to the overall iodine intake. Moreover milk iodine may originate
from feeding as well as several other sources (notably disinfectants).

Iodine in feed enters the environment via direct excretion of faeces and urine on pasture or
spreading of sludge and slurry. The resulting environmental concentration is well below the
background concentration and it is therefore not expected to pose an environmental risk.

Overall, FEEDAP stressed the need for more and updated data on iodine requirement and
tolerance in animals as well as on the actual impact of iodine supplements in feeds on total
iodine dietary intake of humans.

3.3. A bacterial feed additive: Biomin BBSH 797 (EFSA, 2005c)

The active ingredient of the product Biomin BBSH 797 is a strain of Eubacterium sp., originally isolated from the bovine rumen; this feed additive was developed specifically to combat feed contamination. In fact, the bacterium was selected for its capacity to reduce trichothecene mycotoxins, commonly encountered as contaminants of cereal grain, to a less toxic form.
Accordingly, the product is intended to be applied to cereals to be used in animal feed that
are contaminated (or presumed to be contaminated) with trichothecenes. In particular, it is
intended for use in farming of piglets, pigs for fattening and chickens for fattening.

Because the DSM 11798 strain of Eubacterium used in the Biomin product cannot be assigned to an existing species, there is no historical information on its prevalence within the digestive
tract of livestock (or humans). Consequently, the degree of natural exposure to this or similar
strains cannot be assessed. Under these circumstances the tolerance tests made with the
target species assume a greater importance.

No tolerance problems were observed in studies on piglets and chickens in which the additive was supplied at a ten-fold overdose. However, the design of the studies was inadequate, also because the feeds used were contaminated with mycotoxins. Moreover, no tolerance studies
were made in growing/fattening pigs. Therefore, FEEDAP concluded that safety for any of the
proposed target species has not been demonstrated. Numbers of the main groups of bacteria
other than Eubacteria contributing to the flora of the digestive tract in pigs and chickens were
not significantly affected by the inclusion of the product at the recommended dose or, in the case of piglets, ten times the recommended dose. However, the studies did not include any
counts of total Eubacteria or strain DSM 11798. As a result the potential of the organism to
colonise the digestive tract in competition with an existing eubacterial flora is unknown.

There is some, but not fully conclusive, evidence from several efficacy trials that the product
can lead to an improvement in relative growth performance.


FEEDAP recognises that animal products make only a small contribution to the human
exposure to trichothecenes and is satisfied that addition of the additive will not increase
the total exposure of consumers to trichothecenes and their metabolites. However, it cannot
be assumed that the use of the product will reduce the risk; FEEDAP remains concerned that
relying on the presumed efficacy of the product may lead to unsuitable feed material being
treated and used with adverse consequences for the target species. As for occupational
safety, the product does contain proteinaceous material and so it is likely to pose a risk
of sensitization. On the other hand, the DSM 11798 isolate is not expected to present any
specific or unique risks to those handling the additive. Eubacteria are found as one of the
major groups of bacteria in the digestive tract of livestock and humans and so are naturally
occurring within the environment. Although little is known about the specific ecology of
strain DSM 11798 it would be expected to behave as any other Eubacteria and, as a strict
anaerobe, would not be expected to survive in the wider environment.

Overall, the data base did not adequately address many of the specific issues presented by this particular additive.

4. Risk assessment of feed contaminants in Europe

Within the EFSA, evaluating the impact of feed contaminants is the primary task of the Panel on contaminants in the food chain (EFSA, 2004e), which also deals with undesirable substances not covered by any other panel, such as mycotoxins (Hussein and Brasel, 2001).

Contamination of feeds by environmental xenobiotics is not at all a minor topic for modern
veterinary public health. For instance, feeds can be a major vehicle for the presence in the human diet of PCBs and other persistent organic pollutants that bioaccumulate in fatty tissues. Examples are the Belgian PCB/dioxin incident (van Larebeke et al.,
2001) and the recurrent alarms over the high concentration of such pollutants in fish meals
used as feeds for farming of salmonids and other fish species (Jacobs et al., 2002). Heavy
metals (Wilkinson, 2003) are other examples of feed contaminants which raise concern for
their impact on animal health and safety of foods of animal origin. Other environmental
contaminants might also deserve more attention, due to their potential for bioaccumulation.
Examples are potential endocrine disrupters such as brominated flame retardants (Harino et
al., 2000) and the biocide organotins (Janak et al., 2005). In particular, attention could be
paid to the potential contamination of feeds based on fish meals.

The stepwise risk analysis of feed additives cannot be applied to contaminants: obviously, the data on efficacy and tolerance are not relevant, whereas in some instances (e.g. fluorine) chronic exposure through feeds and/or pasture does induce long-term toxic effects in farm animals (EFSA, 2005d). For the risk analysis of feed contaminants, a case-by-case approach
is applied: critical issues include characterization of toxicological hazards, the possible
pathways of feed contamination as well as transfer of parent compound or metabolites
to foods of animal origin. Thus, a comprehensive risk analysis would pinpoint potential
situations of higher exposure that may require measures for risk management.


5. Examples of EFSA evaluations of feed contaminants

5.1. A heavy metal: arsenic (EFSA, 2005a)

Arsenic is a naturally occurring element, present in soil, ground water and plants. Regions
with high geological occurrence of inorganic arsenic have been identified in particular in
Asia and other non-European countries. In Europe, environmental arsenic levels are rather
low, with the exception of distinct geological or industrial areas.

Arsenic is a metalloid, displaying different valences resulting in a broad variety of arsenic compounds with diverse chemical characteristics. Inorganic and organic forms of arsenic also
differ significantly in their toxicity, the organic arsenic compounds exhibiting a very low
toxic potential (Akter et al., 2005). Consequently, the potential adverse effects of arsenic
to animal (and human) health are determined by the inorganic fraction in a given feed (or
food) product, and data reporting only total arsenic in food materials are difficult to interpret
in terms of the ability to induce adverse effects.

Drinking water may contain significant amounts of inorganic arsenic and upper limits have
been set in most countries. Seafood and fish have been identified as major sources of arsenic
in the human diet, and in animal feed materials that contain products derived from fish or
other marine organisms. In seafood and fish, arsenic is present predominantly in the organic
forms of arsenobetaine and arsenocholine, which are virtually non-toxic.

Analytical data from Europe on total arsenic in feed materials do not indicate arsenic levels
of concern in materials other than fish-derived products, for which further data on chemical speciation are needed to identify the actual levels of inorganic arsenic. As the carry-over of
arsenic in its inorganic form into edible tissue of mammals and poultry is low, food derived
from terrestrial animals contributes only insignificantly to human exposure.

In conclusion, arsenic contamination of water may be of greater concern than that of foods, whether of vegetable or animal origin, because of the bioavailability of the inorganic fraction. Conversely, failure to consider the different arsenic species and their
bioavailability could introduce a substantial bias into the estimation of risks associated with
exposure (Akter et al., 2005).

5.2. A pesticide: camphechlor (EFSA, 2005e)

Camphechlor (also called Toxaphene) is a non-systemic insecticide that was used on crops
and animals. It was the most heavily applied pesticide in many areas, having replaced DDT in the early 1970s. However, its use has now been phased out in most of the world.

Technical camphechlor mixtures show a complex composition, with at least 202 different
compounds identified. Due to its persistence, camphechlor has become widely distributed in the environment.
Environmental biotransformation and accumulation in the aquatic environment has led to
relatively high levels of certain camphechlor congeners in fish, marine mammals and sea
birds while other congeners rapidly degrade.


Camphechlor is readily absorbed from the gastrointestinal tract and distributed to the lipid
portion of the organism. It passes the placenta and transfer to milk has been shown in
animals and humans. As with other chlorinated insecticides, its toxic effects target the nervous system, liver and thyroid; immunotoxicity appears to be the critical effect, with a NOEL of 100 µg/kg b.w. in the macaque.

As with other chlorinated pollutants (Jacobs et al., 2002), fish oil and fish meal are the main sources of camphechlor exposure of farmed animals, particularly fish. Human dietary exposure, mainly from fatty fish, is estimated to be between 1 and 25 ng/kg b.w./day. High fish consumers may have intakes of about 60 ng/kg b.w./day, which still leaves a good safety margin relative to the most sensitive NOEL.
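
The safety margin referred to can be expressed as a margin of exposure, i.e. the NOEL divided by the estimated intake; the arithmetic below uses only the figures quoted in the text:

```python
# Figures quoted in the text, converted to a common unit (ng/kg b.w./day)
noel_ng_kg_bw_day = 100 * 1000   # NOEL of 100 ug/kg b.w./day (macaque)
intake_high_fish_ng = 60         # high fish consumers
intake_upper_typical_ng = 25     # upper end of the 1-25 ng/kg b.w./day range

moe_high = noel_ng_kg_bw_day / intake_high_fish_ng         # ~1700-fold margin
moe_typical = noel_ng_kg_bw_day / intake_upper_typical_ng  # 4000-fold margin
```

Even for high fish consumers, the estimated intake sits more than three orders of magnitude below the most sensitive NOEL, which is the basis for the "good safety margin" conclusion.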

The congeners CHB 26, 50 and 62, which accumulate in the food chain, can serve as indicators
of camphechlor contamination. Moreover, congeners CHB 40, 41, 42 and 44, should also be
included in analytical studies as they are also found in fish samples and as CHB 42 appears to
be one of the most toxic congeners. Furthermore, CHB 32 should be included as an indicator
for a recent contamination.

There are substantial data gaps for camphechlor. Detailed data on the prevalence of
camphechlor in feedingstuffs and food of animal (other than fish) and plant origin are
lacking. There is also a general lack of congener-specific toxicity data, as well as of data on oral toxicity for farmed fish. Thus, it is difficult to perform a proper risk assessment.

5.3. A mycotoxin: zearalenone (EFSA, 2004d)

Zearalenone is a mycotoxin produced by several field fungi, including Fusarium graminearum and Fusarium culmorum. The toxin is common in maize and maize products, but can be
found in soybeans and various cereals and grains, and their by-products as well. Moreover,
zearalenone seems to occur on grass, hay and straw resulting in additional exposure of
animals from roughage and bedding. Co-occurrence with other Fusarium toxins, particularly
deoxynivalenol, nivalenol, and fumonisins is regularly observed.

Zearalenone is an endocrine disrupter. In all mammalian species, including farm animals, the
mycotoxin interacts with oestrogen receptors, resulting in an apparent hyperoestrogenism,
including reduced fertility. Female pigs of all age groups are considered to be the most
sensitive animal species, but the hormonal effects vary in intensity according to age and
reproductive cycle. Ruminants and poultry show a lower responsiveness to zearalenone.
However, monitoring of feedingstuffs is needed to improve exposure assessment, and dose-
response studies are essential to establish safe levels of exposure to zearalenone in feed
materials for all individual farm animal species, including minor species such as rabbits and
small ruminants.

As far as human intake is concerned, due to the rapid biotransformation and excretion of
zearalenone in animals, secondary human exposure resulting from residues in meat, milk
and eggs is expected to be low; thus, foods of animal origin would normally contribute only
marginally to the daily intake.


6. Conclusions

Feed additives require a careful evaluation, supported by up-to-date scientific information,
of their efficacy and safety under modern farm animal rearing conditions. Feed contaminants
are an unavoidable problem, but they can be reduced by good farming practices (e.g. adequate
storage conditions in the case of mycotoxins); moreover, strategies for detoxification should
be developed (Molnar et al., 2004), as well as new nutritional sources less liable to
contamination (Ricketts, 2004).

Moreover, the examples presented indicate the need for new research data, with particular
regard for certain areas, such as:

• the nutritional requirements, efficacy and tolerance under modern rearing conditions,
including highly genetically selected breeds and so-called “minor species” that may
nevertheless represent important production sectors in certain areas (e.g. the rabbit or the
dairy ewe) (EFSA, 2004a; EFSA, 2004b; Thong and Liebert, 2004; Young et al., 2005);
• the evaluation of actual exposure of the consumers through the feed-to-food transfer of
residues and contaminants, in comparison with other sources of intake, and considering
different dietary patterns and age- and sex-related susceptibilities within the population
(EFSA, 2004a; EFSA, 2004d; EFSA, 2005a; EFSA, 2005e);
• last but not least, feed ingredients and additives as modifiers of the nutritional quality
of foods of animal origin. A notable example is iodine supplementation in British dairy
herds, which resulted in iodine contamination of milk and dairy products; the resulting
contribution to the prevention of endemic goitre has been described as “an accidental
public health triumph” (Phillips, 1997). A more recent approach is the modulation of the
content of cardioprotective omega-3 polyunsaturated fatty acids in fish tissues by
differential feeding (Seierstad et al., 2005). Thus, in some instances feeds and feed
ingredients might support preventive strategies to reduce the burden of human disease.

Overall, this short excursus indicates that compounds present in feeds are an essential
component of the quality and safety of foods of animal origin, as outlined by the “farm-to-
fork” conceptual framework of European food safety.


References

Akter, K.F., Owens, G., Davey, D.E. and Naidu, R., 2005. Arsenic speciation and toxicity in biological systems. Rev.
Environ. Contam. Toxicol. 184, 97-149.
Becquet, P., 2003. EU assessment of enterococci as feed additives. Int. J. Food Microbiol. 88, 247-254.
CEC (Commission of the European Communities), 1999. White Paper on Food Safety. Brussels, 12 January 2000, COM
(1999) 719 final.
EC, 2003. Regulation (EC) N° 1831/2003 of the European Parliament and of the Council of 22 September 2003 on
additives for use in animal nutrition.
EFSA (European Food Safety Authority). Panel on additives and products or substances used in animal feed. http://


EFSA, 2004. EFSA Scientific Colloquium – Methodologies and principles for setting tolerable intake levels for dioxins,
furans and dioxin-like PCBs. 28-29 June, 2004, Brussels.
EFSA, 2004a. Opinion of the FEEDAP Panel on the use of iodine in feedingstuffs.
EFSA, 2004b. Opinion of the Scientific Panel on Additives and Products or Substances used in Animal Feed on
a request from the Commission on the re-evaluation of efficacy and safety of the coccidiostat Monteban®
G100 in accordance with article 9G of Council Directive 70/524/EEC.
EFSA, 2004c. Opinion of the Scientific Panel on Additives and Products or Substances used in Animal Feed on a
request from the Commission to update the opinion on the safety of “Cycostat 66G” based on robenidine
hydrochloride, as a feed additive in accordance with Council Directive 70/524/EEC.
EFSA, 2004d. Opinion of the Scientific Panel on Contaminants in the Food Chain on a request from the Commission
related to Zearalenone as undesirable substance in animal feed.
EFSA, 2004e. Panel on contaminants in the food chain
EFSA, 2005a. Opinion of the Scientific Panel on Contaminants in the Food Chain related to Arsenic as undesirable
substance in animal feed.
EFSA, 2005b. Opinion of the FEEDAP Panel on the safety and the bioavailability of product L-Histidine monohydrochloride
monohydrate for salmonids.
EFSA, 2005c. Opinion of the FEEDAP Panel on the safety of the product “Biomin BBSH 797” for piglets, pigs for
fattening and chickens for fattening.
EFSA, 2005d. Opinion of the Scientific Panel on Contaminants in the Food Chain on a request from the Commission
related to Fluorine as undesirable substance in animal feed.
EFSA, 2005e. Opinion of the Scientific Panel on Contaminants in the Food Chain on a request from the Commission
related to camphechlor as undesirable substance in animal feed.
European Union, 1970. Council Directive 70/524/EEC concerning additives in feeding stuffs. EC OJ L270,
Harino, H., Fukushima, M. and Kawai, S., 2000. Accumulation of butyltin and phenyltin compounds in various fish
species. Arch. Environ. Contam. Toxicol. 39, 13-19.
Hinton, M.H., 2000. Infections and intoxications associated with animal feed and forage which may present a hazard
to human health. Vet. J. 159, 124-138.
Hussein, H.S. and Brasel, J.M., 2001. Toxicity, metabolism, and impact of mycotoxins on humans and animals.
Toxicology 167, 101-34.
Jacobs, M., Ferrario, J. and Byrne, C., 2002. Investigation of polychlorinated dibenzo-p-dioxins, dibenzo-p-furans
and selected coplanar biphenyls in Scottish farmed Atlantic salmon (Salmo salar). Chemosphere 47, 183-191.
Janak, K., Covaci, A., Voorspoels, S. and Becher, G., 2005. Hexabromo-cyclododecane in marine species from the
Western Scheldt Estuary: diastereoisomer- and enantiomer-specific accumulation. Environ. Sci. Technol. 39,
Macrì, A. and Mantovani, A., 2002. Endocrine effects in the hazard assessment of drugs used in animal production.
J. Exp. Clin. Cancer Res. 21, 445-456.
Molnar, O., Schatzmayr, G., Fuchs, E. and Prillinger, H., 2004. Trichosporon mycotoxinivorans sp. nov., a new yeast
species useful in biological detoxification of various mycotoxins. Syst. Appl. Microbiol. 27, 661-671.


Phillips, D.I., 1997. Iodine, milk, and the elimination of endemic goitre in Britain: the story of an accidental public
health triumph. J. Epidemiol. Community Health 51, 391-393.
Ricketts, M.N., 2004. Public health and the BSE epidemic. Curr. Top. Microbiol. Immunol. 284, 99-119.
Rokka, M., Eerola, S., Perttila, U., Rossow, L., Venalainen, E., Valkonen, E., Valaja, J. and Peltonen, K., 2005. The
residue levels of narasin in eggs of laying hens fed with unmedicated and medicated feed. Mol. Nutr. Food
Res. 49, 38-42.
Scientific Committee on Food, 1992. Nutrition and energy intakes for the European Community. http://ec.europa.
Seierstad, S.L., Seljeflot, I., Johansen, O., Hansen, R., Haugen, M., Rosenlund, G., Froyland, L. and Arnesen, H.,
2005. Dietary intake of differently fed salmon; the influence on markers of human atherosclerosis. Eur. J. Clin.
Invest. 35, 52-59.
Sellier, P., 2003. Protein nutrition for ruminants in European countries, in the light of animal feeding regulations
linked to bovine spongiform encephalopathy. Rev. Sci. Tech. 22, 259-269.
Thong, H.T. and Liebert, F., 2004. Potential for protein deposition and threonine requirement of modern genotype
barrows fed graded levels of protein with threonine as the limiting amino acid. J. Anim. Physiol. Anim. Nutr.
(Berl.) 88, 196-203.
van Larebeke, N., Hens, L., Schepens, P., Covaci, A., Baeyens, J., Everaert, K., Bernheim, J.L., Vlietinck, R. and
De Poorter, G., 2001. The Belgian PCB and dioxin incident of January-June 1999: exposure data and potential
impact on health. Environ. Health Perspect. 109, 265-273.
Wilkinson, J.M., Hill, J. and Phillips, C.J., 2003. The accumulation of potentially-toxic metals by grazing ruminants.
Proc. Nutr. Soc. 62, 267-277.
Wollenberger, L., Halling-Sorensen, B. and Kusk, K.O., 2000. Acute and chronic toxicity of veterinary antibiotics to
Daphnia magna. Chemosphere 40, 723-730.
Young, M.G., Tokach, M.D., Aherne, F.X., Main, R.G., Dritz, S.S., Goodband, R.D. and Nelssen, J.L., 2005. Effect of
sow parity and weight at service on target maternal weight and energy for gain in gestation. J. Anim. Sci.
83, 255-261.


Field data availability and needs for use in microbiological risk assessment

John N. Sofos
Department of Animal Sciences, Colorado State University, Fort Collins, Colorado 80523-1171,
USA
As is well known, animals may be contaminated with, or be asymptomatic carriers of, pathogenic
bacteria and thus serve as sources of subsequent meat contamination or contamination
of other food commodities through contaminated manure, soil and water. Microbial
contaminants, especially pathogenic bacteria of enteric origin such as Escherichia coli O157:
H7, Salmonella and Campylobacter, are of major concern because they can compromise food
safety. Thus, there is a need to control pathogenic micro-organisms in animals, their products
and other foods in order to enhance the safety of our food supply. Progress has been made
in developing interventions for pathogen control following harvest of animals and plant
products, but pre-harvest pathogen control in animals has major constraints. Approaches
to pathogen control should be based on results of risk analysis activities. For example,
pathogen control at the pre-harvest level should consider the results of research addressing
pathogen ecology and risk analysis of animal management, handling, feeding, and shipping
for slaughter practices. It is also important to realize that control or management of food
safety risks should be based on an integrated effort and approach that addresses all sectors,
from the producer through the processor, distributor, packer, retailer, food service worker
and consumer. Nevertheless, reduction of pre-harvest pathogen prevalence may lead to a
reduced probability that errors occurring in subsequent parts of the food chain will lead to
foodborne illness. Overall, however, areas of emphasis for pathogen control and extent of
efforts and resources committed to such control should be based on risk assessments and
establishment of food safety objectives (FSO). This report examines these issues and presents
an overview of related knowledge.

Keywords: animal contamination, pre-harvest pathogen control, meat pathogen control,
microbiological risk assessment

1. Introduction

It is inevitable that raw foods, including fresh meat and poultry, become contaminated with
micro-organisms, including human foodborne bacterial pathogens. Contamination originates
from soil, decaying material and animal waste, which contaminate water, air, animals,
plants, processing facilities, equipment, utensils, and humans. These sources of original or
cross-contamination result in a complete contamination cycle (CAST, 2004; Koutsoumanis
and Sofos, 2004; Koutsoumanis et al., 2005; Sofos, 1994, 2002, 2004, 2005). Overall, the
microbiological status of the food products that reach the consumers, either as raw meat or


processed foods, depends on the extent of exposure to contamination and its control during
all steps of the food production, processing, distribution, storage, retailing and preparation
for consumption chain.

In recent years, some highly publicized outbreaks of foodborne disease caused by pathogenic
bacteria, such as Escherichia coli O157:H7, have increased consumer concerns and interest
in food safety. As a result, United States health agencies have set targets for reductions in
foodborne illness incidences, while regulatory authorities and the industry have undertaken
efforts to improve the microbiological quality of meat and other foods, in order to protect
public health and help meet these goals. Actions taken by the Food Safety and Inspection
Service of the United States Department of Agriculture (FSIS/USDA), the meat and poultry
product regulatory agency of the United States, include implementation of a new inspection
regulation (FSIS, 1996) which requires meat and poultry plants to: (1) establish and
implement sanitation standard operating procedures; (2) operate under the principles of
the hazard analysis critical control point (HACCP) process management system; and, (3)
meet microbiological performance criteria and standards for Escherichia coli biotype I and
Salmonella, as a verification of HACCP and pathogen reduction, respectively. In addition,
Escherichia coli O157:H7 has been declared an adulterant in ground beef and other nonintact
beef products in the United States. This is enforced through sampling and testing for presence
of this pathogen in these products. The results of this testing activity are presented in
Table 1 and demonstrate a decrease in positive samples through the years of testing.

Table 1. Prevalence of Escherichia coli O157:H7 in ground beef in the United States.

Year       Samples tested   Positive   Percent positive

1994          891              0        0
1995         5407              3        0.06
1996         5703              4        0.07
1997 (1)     6065              4        0.07
1998         8080             14        0.17
1999 (2)     7786             32        0.41
2000         6375             55        0.86
2001         7010             59        0.84
2002         7025             55        0.78
2003         6584             20        0.30
2004         8010             14        0.17
2005 (3)     7345             14        0.19
Total       76281            274        0.36

(1) Sample size tested was increased from 25 g to 325 g.
(2) Method of detection became more sensitive through inclusion of immunomagnetic bead concentration.
(3) As of 25 August 2005.

Another obvious observation from the data of Table 1 is the inefficiency of relying on
microbial testing for pathogen control: in a period of over 10 years, only 274 of 76,281
samples tested (0.36%) were found positive and removed from the marketplace, if the product
was still present at the conclusion of testing.
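The yearly and overall rates in Table 1 can be reproduced directly from the tested/positive counts; a small sketch (figures copied from the table):

```python
# Recompute the percent-positive rates of Table 1 from the raw counts.
# Mapping: year -> (samples tested, samples positive).
samples = {
    1994: (891, 0),   1995: (5407, 3),  1996: (5703, 4),
    1997: (6065, 4),  1998: (8080, 14), 1999: (7786, 32),
    2000: (6375, 55), 2001: (7010, 59), 2002: (7025, 55),
    2003: (6584, 20), 2004: (8010, 14), 2005: (7345, 14),
}

total_tested = sum(t for t, _ in samples.values())     # 76281
total_positive = sum(p for _, p in samples.values())   # 274

for year, (tested, positive) in sorted(samples.items()):
    print(year, round(100 * positive / tested, 2))
print("overall %:", round(100 * total_positive / total_tested, 2))  # 0.36
```

Running this reproduces the column of percentages, including the 0.36% overall rate that underpins the point about the inefficiency of end-product testing.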

In efforts to meet regulatory requirements and commercial specifications for raw products
in the processing plant, as well as avoid product recalls from the marketplace when samples
of ground beef are positive for E. coli O157:H7, and hopefully to provide safer products to
consumers, the United States meat industry has employed various carcass decontamination
interventions in sequence or as multiple hurdles during slaughter and carcass dressing. These
interventions have proven effective, as indicated by data showing major decreases
in contamination levels at the end of the dressing process compared to the point of hide
removal, as well as by the data of Table 1 (Sofos, 2005). Currently, regulators and scientists
are placing emphasis on development of pathogen reduction interventions at the pre-
slaughter and pre-harvest stage. In general, reduction of pathogen prevalence on animals
pre-slaughter and on raw products of animal origin is beneficial because: (1) it results in
products meeting regulatory and commercial standards and specifications, respectively;
(2) processes designed to inactivate pathogens during meat processing will not fail due
to excessive initial contamination; and (3) it minimizes sources of factory environmental
contamination and reduces the risk of biofilm formation and cross-contamination (Sofos,
2002, 2005; Stopforth and Sofos, 2005).

2. Risk analysis based pathogen control

Academic, industry and government scientists, nationally and internationally, agree in recent
years that food safety regulatory actions should be based on decisions made through the
process of risk analysis, which consists of the components of risk assessment, risk management,
and risk communication (Lammerding, 1997; Walls and Buchanan, 2005; Whiting, 1996).
Thus, there is a need to conduct microbiological risk assessments in order to identify risk
factors and to establish food safety objectives (FSO), before setting performance and process
criteria for the industry to achieve by developing proper process management practices, such
as hazard analysis critical control point (HACCP), and critical control points with appropriate
critical limits (ICMSF, 2002). The risk assessment/risk management process defines the
problem and develops risk estimates to select acceptable levels of risk or appropriate levels
of protection (e.g. cases of illness or deaths per 100,000 persons per year). This is then
used to develop FSO (i.e., allowable level of a pathogen in food). Following evaluation and
confirmation of the feasibility of the FSO, risk managers from the government, academia and
industry develop effective control measures or treatments to achieve target performance,
process or product criteria that achieve the FSO. These control measures and criteria are then
introduced into HACCP plans for implementation by the industry (CAC, 1997; FSIS, 1996;
ICMSF, 2002; Lammerding, 1997; NACMCF, 1998).
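The link between an FSO and the hazard levels along the chain is commonly expressed through the ICMSF (2002) inequality H0 − ΣR + ΣI ≤ FSO, with all terms in log10 CFU/g; a minimal sketch with hypothetical numbers (the values below are illustrative only, not from any actual risk assessment):

```python
# ICMSF (2002) framework: a process meets a food safety objective (FSO) when
#     H0 - sum(R) + sum(I) <= FSO
# H0: initial hazard level, R: cumulative reductions (e.g. cooking),
# I: cumulative increases (e.g. growth during storage); all in log10 CFU/g.

def meets_fso(h0, reductions, increases, fso):
    """True if the final hazard level does not exceed the FSO."""
    final_level = h0 - sum(reductions) + sum(increases)
    return final_level <= fso

# Hypothetical numbers: raw material at 1 log CFU/g, a 5-log cooking step,
# 0.5 log growth during storage, against an FSO of -2 log CFU/g
# (i.e. at most 1 cell per 100 g):
print(meets_fso(1.0, [5.0], [0.5], -2.0))  # True
```

This is the arithmetic through which performance and process criteria (e.g. a required log reduction at a critical control point) are derived from an FSO.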

Various risk assessments have been conducted in recent years (e.g. Risk Assessments of
Salmonella in Eggs and Broiler Chickens; Risk Assessment of Listeria monocytogenes in Ready-


to-Eat Foods) by groups such as the World Health Organization (WHO) and the Food and Agriculture
Organization (FAO) of the United Nations, the United States Food and Drug Administration
(FDA) and the FSIS/USDA. In 2001, the FSIS/USDA conducted a farm-to-table preliminary risk
assessment to evaluate the health impact from E. coli O157:H7 in ground beef
(FRPubs/00-023N/exec_sum-00-023Nrpt.pdf). The risk assessment was an effort to provide a
baseline reflecting a full range of current practices, behaviours and conditions in the farm-
to-table chain, including the steps of production, slaughter, processing, transportation,
storage, preparation, and consumption. An effort was made to consider and integrate data
available through July 2001 into the general framework for microbiological risk assessments
which includes hazard identification, exposure assessment, hazard characterization, and risk
characterization (Doyle et al., 2002). The hazard identification component characterized E. coli
O157:H7 using available data from ecology, pathology, epidemiology, and microbiology. The
exposure assessment was comprised of the modules of production, slaughter and preparation,
and used probabilistic techniques to model the prevalence and concentration of E. coli
O157:H7 in live cattle, carcasses, beef trimming, and a single serving of ground beef. Data that
were considered for the exposure assessment included herd and within-herd prevalence of E.
coli O157:H7 including seasonal variation, slaughter conditions and carcass decontamination
interventions, product storage, cooking, and consumer demographics. Hazard characterization
quantified the nature and severity of illness or the death (response) associated with a given
number of cells in a ground beef serving (dose) consumed. Risk characterization integrated
the results of the exposure assessment and hazard characterization to estimate the risk of
illness.
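The probabilistic modelling described above can be sketched as a simple Monte Carlo simulation; all distributions and parameter values below are illustrative placeholders, not those of the FSIS/USDA model:

```python
# Illustrative Monte Carlo sketch of a farm-to-table exposure assessment:
# sample whether a serving is contaminated and at what level, apply a
# cooking reduction, and score illness with an exponential dose-response
# model. Every distribution and parameter here is a hypothetical placeholder.
import math
import random

random.seed(1)

def simulate_serving(prevalence=0.01, mean_log_conc=2.0, sd_log_conc=1.0,
                     cook_log_reduction=2.0, r=1e-3, serving_g=100.0):
    """Return True if one simulated ground beef serving causes illness."""
    if random.random() > prevalence:                # serving not contaminated
        return False
    log_conc = random.gauss(mean_log_conc, sd_log_conc)       # log10 CFU/g, raw
    dose = serving_g * 10 ** (log_conc - cook_log_reduction)  # surviving cells
    p_ill = 1 - math.exp(-r * dose)                 # exponential dose-response
    return random.random() < p_ill

n = 100_000
risk = sum(simulate_serving() for _ in range(n)) / n
print(f"estimated risk of illness per serving: {risk:.2e}")
```

Repeating such sampling many times yields the output distributions, and hence the variability and uncertainty estimates, that a quantitative risk assessment reports.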

This FSIS/USDA ground beef draft risk assessment yielded intermediate and final outputs in
the form of distributions that characterized the variability and uncertainty in estimates of
a variety of risk assessment endpoints or human illnesses. Overall, the draft risk assessment
report indicated that its objective was to present the state of knowledge of that time in the
United States on: (1) the occurrence of E. coli O157:H7 in cattle, carcasses and ground beef;
and (2) the subsequent risk of human illness, as estimated from the limited data available
to complete a risk assessment. Thus, the risk assessment was declared as a draft and it was
structured in a way to allow incorporation of additional data as they become available, in
order to improve its outcomes.

3. Data gaps, needs and flow

It was determined during development of the draft ground beef risk assessment, and stated
in the document, that the certainty of the estimates calculated would be strengthened with
additional data on: (1) prevalence and levels of E. coli O157:H7 contamination on cattle
and carcasses; (2) effects of carcass decontamination and chilling processes on final raw
meat contamination; (3) changes in contamination levels and prevalence of the pathogen
during various stages of meat processing, storage and preparation; (4) concentration of E.
coli O157:H7 cells in a ground beef serving before and after cooking; and (5) data on product
contamination changes in foodservice and at the consumer level.


In general, the FSIS/USDA considered the ground beef draft risk assessment a “work-in-
progress” and an effort to present the state of knowledge on the occurrence of E. coli O157:H7
on cattle, carcasses and ground beef at that time, based on the limited data available.
The draft risk assessment was released for public comment, and in addition, FSIS/USDA asked
the Institute of Medicine of the National Academy of Sciences (NAS) of the United States
to form a committee that peer reviewed the risk assessment. Keeping in mind that the draft
risk assessment was a work-in-progress, the NAS committee developed a document reviewing
the work of FSIS/USDA (Doyle et al., 2002). Acknowledging that the effort in developing the
draft risk assessment was impressive, that the task of collecting, analyzing and integrating
the existing information was extraordinary, and that the risk assessors faced a number of
substantial, new and peculiar methodological hurdles, the committee discussed a number of
issues applicable to each segment of the risk assessment, with the objective of improving it
in the future. As in other microbiological risk assessments, most of the comments, however,
dealt with issues raised due to the lack of data needed for adequate completion of a better
risk assessment.

Data gaps were recognized as to: (1) the extent and levels of animal contamination with E.
coli O157:H7; (2) the extent of the contribution of faecal, hide, environmental sources, and
transportation contamination to carcass contamination; (3) the contribution of dehiding,
evisceration, dressing, chilling and boning processes on carcass and meat contamination with
E. coli O157:H7; (4) the extent of pathogen spreading and cross-contamination at slaughter,
dressing, chilling and boning on final product contamination; (5) the contribution of carcass
decontamination processes on final carcass and product contamination; (6) information on
cell numbers to supplement data on pathogen prevalence levels; (7) the influence of plant
variability (e.g. size, design, product flow, employee experience, plant location, etc.) and
processing practices on contamination levels; (8) information on contribution of variations
in product handling by various types of food service operations to the problem; and (9) the
contribution to foodborne illness of cross-contamination during food preparation.

In general, quantitative data needs for establishing risk assessment based pathogen control
strategies include: (1) sources and qualitative and quantitative extent of contamination;
(2) effect of processes through the chain on pathogen levels and control; (3) impact of
contamination on human health; and (4) disease investigation data. Such data should
be collected at the pre-harvest, post-harvest, processing, distribution, storage, retail,
foodservice and consumption stage of the food chain. In the United States, research for
data collection at the pre-harvest level is conducted by universities and USDA agencies such
as the Animal and Plant Health Inspection Service (APHIS) and the Agricultural Research Service (ARS).
Post-harvest research and data collection is done by ARS, the FDA, universities, as well as
industry and private laboratories. Disease investigation is done by the Centers for Disease
Control and Prevention (CDC) and state and local health departments, while pathogenicity
work is the responsibility of various health and university laboratories.


4. Strategy for pathogen control

There has been a substantial amount of activity in recent years involving investigations
to fill data gaps by determining pathogen sources and levels in the animal environment,
as well as in reducing and controlling micro-organisms, especially pathogens, in livestock
prior to and during slaughter (Sofos, 2005). It is recognized that
control of food safety risks should be based on an integrated approach that addresses
all sectors, from the producer through the packer, processor, distributor, retailer, food
service, and consumer. The most comprehensive strategy for improving the microbiological
quality of meat is to apply technologies that: (1) control contamination sources to reduce
prevalence and levels of contamination on the raw product (live animal pre-harvest, and
raw and processed product along the processing, retailing and serving chain); (2) minimize
the access of micro-organisms to the product (carcasses, meat and processed ready-to-eat
products); (3) reduce the contamination that has gained access to the product (carcass
washing and decontamination); (4) inactivate micro-organisms on the product without
cross-contamination (carcass decontamination, and product further processing or cooking);
and (5) prevent or inhibit growth of micro-organisms which have gained access to the meat
and have not been inactivated (carcasses, raw meat, and further processed or ready-to-
eat products) (Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005; Stopforth and
Sofos, 2005). There is widespread agreement among sectors including regulators, educators,
consumers, health authorities, research scientists, and the industry that there should be
proactive efforts to reduce, eliminate or control pathogens at all stages of the food chain,
as well as include educational activities on safe handling of foods for consumers and those
working in the food industry.

5. Antimicrobial interventions to control pathogens in live animals

Recently, questions have been raised about animal production practices and their resulting
impact on the incidence of E. coli O157:H7 in beef. Many researchers believe that management
practices at the feedlot play an important role in animal health, carcass quality, and potentially
food safety. Over the last five years, substantial research has been conducted to identify
pathogen intervention systems and livestock management practices to reduce the prevalence
of E. coli O157 in and on market-ready feedlot cattle (Sofos, 2005; Stopforth and Sofos,
2005). Efforts for pathogen control have involved research studies to determine pre-harvest
pathogen sources and niches as well as pre-harvest or field interventions for pathogen control.
In addition, other areas being investigated in recent years include: (1) determination of
pathogen cell numbers, in addition to prevalence; (2) potential pathogen sources at the
processing plant; (3) genetic comparison of pathogen isolates from various sources in order to
potentially track contamination sources; (4) development and validation of pre-slaughter and
processing decontamination interventions; and (5) potential influence of decontamination
interventions on pathogen stressing and cross-protection (Samelis and Sofos, 2003; Sofos,
2005; Stopforth and Sofos, 2005). As additional data are generated and some of the issues
and concerns are addressed, additional questions are raised and the complexity of the subject
of animal-meat contamination-decontamination becomes more evident.


The rationale for emphasizing efforts to reduce contamination pre-slaughter is that reduction
of pathogen populations in the animal environment will result in lower levels of pathogens
on/in animals, which in turn should lead to a reduced probability of introducing such
pathogens at subsequent steps in the process and should enhance the effectiveness of
subsequent pathogen reduction and control interventions during slaughtering and further
processing of meat (Sofos, 2002, 2005). Overall benefits of pathogen control in the field
include: (1) reduction of sources and levels of pathogens; (2) control of natural water
contamination; (3) reduction of the potential for cross-contamination of foods of plant
origin; and (4) reduction of the potential for direct animal-to-human pathogen transmission.
Unpublished data from our studies have indicated that when E. coli O157:H7 prevalence on the
pen floor of cattle feedyards was less than 20%, contamination levels on animal hides, feces
and carcasses before evisceration and application of decontamination treatments were 5%,
7.5% and 6.3%, respectively. In contrast, corresponding prevalence levels when feedyard pen
floor contamination levels exceeded 20%, were 25.7%, 51.4% and 14.3%. The most important
source of beef carcass and processing plant environmental contamination is considered to be
the hide and mouth of cattle, which have been found to carry E. coli O157:H7 at prevalences in
the range of 40 to 75% (Keen and Elder, 2002). Studies also exist indicating the presence of
isolates with matching genetic profiles in animal holding pens, corridors, stunning boxes,
worker aprons, and knives (Tutenel et al., 2003). Quantitative data have indicated that most
carcass samples are contaminated with shiga toxin producing E. coli (STEC) levels of less than
3 log cycles per 100 cm2 (Arthur et al., 2004; Barkocy-Gallagher et al., 2003).
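For orientation, log-cycle counts such as the 3 log cycles per 100 cm² cited above convert to absolute numbers as follows (a trivial conversion, shown for clarity):

```python
# Convert a log10 count per 100 cm2 into absolute CFU counts:
# 3 log cycles per 100 cm2 is 10^3 = 1000 CFU per 100 cm2, i.e. 10 CFU/cm2.
def log_to_cfu(log_count_per_100cm2: float):
    cfu_per_100cm2 = 10 ** log_count_per_100cm2
    return cfu_per_100cm2, cfu_per_100cm2 / 100.0

print(log_to_cfu(3.0))  # (1000.0, 10.0)
```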

It should be noted that pre-harvest reduction of pathogen prevalence and levels in animals
and their environment is a challenge because: (1) there are numerous complicating variables
involved; (2) scientific information for selection of effective decontamination interventions
and their validation is limited; and (3) the cost effectiveness of any intervention needs to
be considered. Nevertheless, pre-harvest control of animal contamination and its sources is
necessary because, as indicated, it should reduce contamination, not only on carcasses and
meat, but also should decrease the likelihood of contamination of other food products and
water, as well as the potential for animal-to-human transmission of pathogens.

Live animal contamination is considered as the most significant source of carcass and factory
environmental contamination. Potential approaches to controlling pathogens in live animals
(Stopforth and Sofos, 2005) include: (1) use of feed additives to act against pathogens
in the gastrointestinal system; (2) diet modification to result in changes in dominant
micro-organisms in the gastrointestinal system; (3) antimicrobial/antibiotic treatments
against target pathogens such as E. coli O157:H7; (4) prebiotics, probiotics and competitive
exclusion micro-organisms; (5) treatment with bacteriophages specific against pathogens of
concern; (6) application of specific vaccines; and (7) improved husbandry and management
practices (e.g. market classification of animals, clean housing, clean water, clean feed,
pest control, transport/lairage control, animal cleaning, etc.). With the exception of good
production practices, all other approaches are still in the experimental stage or of limited
use (Sofos, 2005; Stopforth and Sofos, 2005).

Proper management practices may be beneficial more through control of contamination levels
in the environment than on the animals themselves.

Towards a risk-based chain control  63

John N. Sofos

This, however, is also desirable because, as indicated, it may result in lower pathogen levels and prevalence in water and plant
foods. Market classification of animals may be beneficial for pathogen control if we accept
evidence that prevalence of E. coli O157 may be lower in heavier/older cattle (Dargatz et
al., 1997). Other studies, however, have found no difference in E. coli O157 prevalence in
cows of different ages or between cows and their calves one week postpartum (Riley et al.,
2003). Number of cattle in a pen, or pen density, may be associated with faecal shedding
of E. coli O157:H7, while it is debatable how effective cleaning of animal pens would be
in reducing long-term E. coli O157:H7 prevalence and levels (Stopforth and
Sofos, 2005). A concern exists, however, with the spreading of manure on fields as
fertilizer; this may result in introduction of the pathogens onto other foods such as fruit
and vegetables, which are often consumed without cooking. Thus, it is important to control
manure contamination before use as fertilizer. Treatment with carbonate or with ammonia
combined with carbonate may inactivate pathogens in manure (Park and Diez-Gonzalez,
2003; Stopforth and Sofos, 2005).

Animal drinking water and water troughs have often been suggested as important sources
of pathogen contamination and spreading in live animals (Shere et al., 2002). Therefore,
heat, UV irradiation, chlorine, sodium caprylate, ozone, and electrolyzed water treatments
have been proposed to reduce E. coli O157:H7 in water (Stevenson et al., 2004). However,
extensive and frequent cleaning of water troughs did not affect E. coli O157:H7 prevalence
in cattle (Stopforth and Sofos, 2005). Presence of E. coli O157:H7 in animal feed and in feed
bunks is documented but there is no evidence of an association between its presence in
feed and in animals. Although pests, like flies, may serve to spread E. coli O157:H7 among
animals in feedlots, it is difficult to determine how they could be controlled and how great
the benefit would be. Proposed fly management strategies for potential pathogen control
include use of bait traps, biological control through parasitic wasps, chemical control with
foliar application or insecticide baits, drainage of standing water, scraping of animal pens,
removal of uneaten feed, good maintenance of water troughs, and composting of manure
(Stopforth and Sofos, 2005); the contribution of these interventions to pathogen control in
the field would still be debatable.

Efforts have been made to determine whether animal diet manipulations may influence E.
coli O157 prevalence and shedding by animals. In addition to diet alterations, inclusion of
feed supplements has also been evaluated for potential effects in reducing E. coli O157:H7
carriage and shedding in cattle. A shift from high-concentrate feeds to high roughage diets
has been proposed as a control of pathogen shedding but the results of studies have been
conflicting (Diez-Gonzalez et al., 1998). In addition, such practices may be impractical due
to potential adverse effects on animal performance. Other studies have indicated that feeding
cattle whole cottonseed (15% of total feed) resulted in animals with lower E. coli O157:H7
prevalence, while feeding barley appeared to increase shedding of E. coli and E. coli
O157:H7 in cattle (Berg et al., 2004). Inclusion of ionophores as feed additives has also been
suggested for controlling pathogens in the rumen of cattle, but ionophores were found to have
little effect on populations of E. coli O157:H7 and Salmonella in rumen fluid (Edrington
et al., 2003). A brown seaweed extract (Tasco-14™) has been presented as effective in
reducing levels of E. coli O157:H7 on cattle hides and in faeces (Braden et al., 2004), while
there is evidence that caprylic acid reduces E. coli O157:H7 in bovine rumen fluid and may
assist in reducing pathogen carriage in cattle (Annamalai et al., 2004). Several studies have
found that inclusion of sodium chlorate as a feed additive or in the water of cattle, sheep,
and swine may effectively reduce pathogen populations in the rumen and faeces of animals
(Anderson et al., 2000; Callaway et al., 2002, 2003); approval of sodium chlorate for use in
animal diets is pending.

Feeding of animals with bacteria that compete with pathogens in the animal’s gastrointestinal
system is another approach that may lead to reduced pathogen loads in the live animal,
the feedlot, and the slaughter facility (Schamberger and Diez-Gonzalez, 2004). A mixture of
undefined micro-organisms to control foodborne pathogens in livestock is typically termed
competitive exclusion, while feeding of individual or combinations of specific microbial
strains may be termed probiotic or direct-fed microbial treatment. In contrast, the term
prebiotic is used to indicate feeding of a carbohydrate substrate that selectively stimulates
commensal bacteria, which may displace pathogens incapable of prebiotic metabolism.
Several studies have found that use of Lactobacillus spp., Streptococcus bovis, E. coli, Proteus
mirabilis, Enterococcus faecalis, Pediococcus acidilactici, Propionibacterium freudenreichii,
and Leuconostoc spp. (Brashears et al., 2003a, b; Schamberger and Diez-Gonzalez, 2004;
Tkalcic et al., 2003; Younts-Dahl et al., 2004; Zhao et al., 2003) may be effective in reducing
faecal shedding of E. coli O157:H7 in cattle, while E. faecalis, S. bovis, Clostridium spp., and
Bacteroides spp. may reduce Salmonella in swine (Genovese et al., 2003).

Antibiotics such as tilmicosin, neomycin sulfate, and oxytetracycline have been found
effective in reducing E. coli O157:H7 in cattle (Stevenson et al., 2004). Use of antibiotics
as feed additives is of concern, however, because of the potential for foodborne pathogens
to develop antibiotic resistance. Bacteriophages (viruses infecting bacteria) have been
evaluated for control of Salmonella and E. coli in poultry and cattle (Goode et al., 2003).
Since the results have been promising as well as conflicting, additional studies are needed
in order to determine the potential usefulness of such treatments (O’Flynn et al., 2004).
Development of anti-pathogen vaccines for use in animals has also been pursued. Results
of relevant studies have indicated effectiveness of vaccines against E. coli O157:H7 in small
scale cattle trials (Stopforth and Sofos, 2005).

It has been found that transportation from the feedlot and lairage at the slaughter facility may
influence pathogen prevalence in or on other animals due to increased shedding and cross-
contamination or animal-to-animal transfer during transport (Barham et al., 2002; Larsen
et al., 2004). It is logical that control of this environment should limit the contribution of
vehicles and pens as sources of contamination (Bach et al., 2004; Schmidt et al., 2004).
Pathogen control during transport and lairage may be approached through cleaning and
disinfecting surfaces of trailers prior to animal loading, and cleaning and disinfecting holding
pens at the slaughter facility (Stopforth and Sofos, 2005).

It is obvious that approaches to pre-harvest pathogen control have major constraints. The
major target of pre-harvest pathogen control, E. coli O157:H7, may be used as an example
to point out major difficulties encountered (Sofos, 2002): (1) a documented source of the
pathogen are cattle farms and feedlots; (2) the pathogen may persist for long periods in the
farm environment but not on the same animals; (3) persistence of the pathogen on animals
varies with season and animal age; (4) cattle carriers may not show signs of illness; and
(5) the pathogen is often found in environmental sources such as feed, water and water
troughs (Hancock and Dargatz, 1995; Hancock et al., 1997). These facts complicate the
issue of applying pre-harvest measures to control pathogen prevalence and levels in order
to enhance the safety of resulting food products. Nevertheless, as indicated by Elder et al.
(2000), “the association between faecal prevalence and carcass contamination indicates a
role for control…in cattle on the farm… Unfortunately, no effective control methods are
currently available for producers to use…development of such control methods remains an
area of active research…” Areas of needed research for pathogen control pre-harvest include:
(1) pathogen ecology; (2) risk analysis for dairy, beef, steers, heifers, poultry, seafood, at
the farm or feedlot level; (3) pre-slaughter risk factors associated with animal management
and handling; (4) influence on risks of animal and feed management practices; (5) factors
affecting bacterial attachment and detachment in the rumen; and (6) influence on risks of
animal stressing associated with handling, transportation and lairage. Differences in one
or more of these parameters may be responsible for variation in pathogen prevalence rates
associated with animals from different lots or farms, and may be important in pathogen
transfer from animals to foods and humans (Sofos, 2002).

Overall, however, as indicated, pre-harvest control of foodborne pathogens is useful because
their presence on animals and their environment leads to animal product contamination, as
well as contamination of other foods through contaminated manure and water. Reduction of
pathogen incidence in the farm and on animals should reduce the likelihood of contamination
of animal and plant food products and water. However, scientifically defined and verified
critical control points or management practices at the pre-harvest level are presently
unavailable. Ongoing research should continue and be expanded to define risks and develop
effective and practical controls for such risks. It should be noted, however, that elimination
of pathogenic micro-organisms at the pre-harvest level is unlikely; nevertheless, their
reduction and management should be sought. As pathogen control strategies are developed
and implemented at the pre-harvest level, their validation, verification, auditing and
economic aspects should also be considered. Finally, it is important to realize that control
or management of food safety risks from foodborne pathogenic micro-organisms should be
part of an integrated effort and approach that includes all sectors, from the producer through
the processor, distributor, packer, retailer, food service worker, and consumer. Reduction of
pre-harvest pathogen prevalence may lead to a reduced probability that errors occurring in
subsequent parts of the food chain, through cooking and preparation for consumption, will
lead to foodborne illness (Sofos, 2002, 2005).
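The closing point can be illustrated with a toy serial-failure model (all probabilities below are hypothetical, chosen purely for illustration): if illness requires contamination to be present pre-harvest and every downstream barrier to fail, expected per-serving risk scales linearly with pre-harvest prevalence.

```python
def p_illness(prevalence, p_barrier_failure, p_ill_given_exposure):
    """Toy serial model: a serving leads to illness only if contamination is
    present at pre-harvest, downstream barriers (processing, cooking,
    handling) fail, and exposure actually causes infection."""
    return prevalence * p_barrier_failure * p_ill_given_exposure

# Hypothetical numbers for illustration only.
baseline = p_illness(prevalence=0.10, p_barrier_failure=0.05, p_ill_given_exposure=0.50)
improved = p_illness(prevalence=0.05, p_barrier_failure=0.05, p_ill_given_exposure=0.50)

print(baseline, improved)  # halving prevalence halves the per-serving risk
```

The sketch makes the multiplicative structure explicit: even with unchanged downstream error rates, halving pre-harvest prevalence halves the expected illness burden.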

6. Antimicrobial interventions to control pathogens at slaughter

Although it is well accepted that live animals constitute the major source of food contamination
with pathogens of enteric origin, in the United States the FSIS/USDA has no legal authority
for on-farm regulatory inspection, while the APHIS of USDA manages animal health pre-
harvest and has conducted numerous major surveys on pathogen prevalence in live animals
(Sofos, 2002). The FSIS/USDA, however, has regulatory authority at the processing level,
and it has: (1) formed an Animal Production Food Safety group to provide a mechanism and
encouragement for the development of food safety programs at the pre-harvest level; and
(2) required, in the HACCP meat and poultry inspection regulation, slaughtering plants to
“conduct a hazard analysis to determine the food safety hazards reasonably likely to occur
before, during and after entry into the establishment.” Obviously, the regulation places
responsibility for identification of pre-harvest (before entry) hazards on the processor. In
addition to meeting microbial performance standards, “plants are responsible for preventing
illegal, or violating residues from adulterating their meat and poultry products”. Therefore, in
order to verify their own HACCP plan and to comply with FSIS/USDA regulatory requirements,
slaughtering plants need to develop proper preventative approaches, which may include: (1)
rejection of at-risk animals; (2) sorting of animals in groups based on risk; (3) additional
testing; (4) review of producer records; (5) demand for application of quality assurance
programs; (6) visits and inspections of producers and suppliers; and (7) application of
their own quality assurance programs. Animal producer associations have developed quality
assurance programs available for application by their members (Sofos, 2002).

In addition to the above, and in order to meet regulatory microbiological performance
criteria and customer specifications and to enhance the safety of their products, the meat
processing industry has adopted decontamination processes that may include animal cleaning,
chemical dehairing of cattle hides, spot-cleaning of carcasses before evisceration by knife-
trimming or steam and vacuum, and spraying, rinsing, or deluging of carcasses with hot
water, chemical solutions or steam immediately after hide removal and at the end of the
dressing process; recently, such treatments are also considered for application immediately
before carcass boning (Sofos, 2005). Presently, unlike the European Union where no chemical
decontamination of carcasses is permitted, in the United States, chemical decontamination
interventions of meat animal carcasses (e.g. acetic, lactic, peroxyacetic, acidified sodium
chlorite, cetylpyridinium chloride, lactoferrin, etc.) are approved for use if they are generally
recognized as safe, do not render the product adulterated, do not create labelling issues, and
there is scientific evidence that they are effective. Numerous studies have demonstrated,
validated and verified the decontaminating efficacy of various interventions and treatments,
and their contribution in reducing pathogen prevalence on carcasses (Bacon et al., 2000;
Huffman, 2003; Smulders and Greer, 1998; Sofos and Smith, 1998; Sofos et al., 1999;
Sofos, 2005). However, it should be noted that data on the influence of pre- and post-
harvest interventions on reduction of pathogen cell numbers are limited, even though such
information is necessary for completion of farm-to-fork risk assessments.

External animal hide contamination may be reduced by cleaning or washing the hide of the
animal with water or chemical solutions at slaughter (Stopforth and Sofos, 2005). There are,
however, concerns related to pre-slaughter washing of animals since faecal material and the
micro-organisms associated with it may be more readily spread through wet animals. Animal
washing is generally difficult due to climatic conditions and may require special facilities or
chambers. Pre-slaughter animal washing has been applied for sheep in New Zealand, for cattle
in Australia, and by some plants in the United States. Cattle washing has been evaluated
with solutions of chlorine, cetylpyridinium chloride, sodium hydroxide, trisodium phosphate
and phosphoric acid. A specially designed online hide-washing cabinet is employed
by at least one beef slaughtering company in the United States. The process is applied after
animal exsanguination and involves washing of the hide with a sodium hydroxide solution,
followed by rinsing with a chlorine solution, and vacuuming of certain parts of the hide
before its opening for removal (Bosilevac et al., 2004a, b; Koutsoumanis and Sofos, 2004;
Koutsoumanis et al., 2005; Mies et al., 2004; Sofos and Smith, 1998; Sofos, 2005; Stopforth
and Sofos, 2005).

Another cattle hide decontamination process is chemical dehairing, which may be applied
early during slaughter to remove hair and associated external contaminants before the
animal/carcass enters the slaughter room of the factory; this should minimize the importance
of animal hides as sources of environmental and carcass contamination (Nou et al., 2003;
Sofos and Smith, 1998; Sofos, 2005). Laboratory studies with artificially contaminated
beef hide samples showed significant reductions of E. coli O157:H7, Salmonella and Listeria
monocytogenes. In studies carried out in commercial beef slaughtering facilities, chemical
dehairing was found to reduce the need for carcass trimming with knives to remove visibly
soiled tissue, to significantly reduce bacterial levels on carcasses immediately after hide
removal, and to significantly lower the prevalence of E. coli O157:H7. Problems associated with
this process include the need for disposal of generated waste which includes hydrolyzed
hair and dehairing chemical residues of sodium sulfide and hydrogen peroxide (Sofos and
Smith, 1998).

As indicated, animal slaughtering factories in the United States apply carcass decontamination
technologies immediately after hide removal (before evisceration) and at the end of the dressing
process, before carcass chilling; recently, such treatments are also applied or considered
for application after chilling and immediately before carcass boning. Decontamination
interventions applied include spot-cleaning of carcasses before evisceration by knife-trimming
or steam and vacuum, and spraying, rinsing, or deluging of carcasses with hot water, chemical
solutions or steam. These processes only reduce contamination because it is difficult to
eliminate micro-organisms and still maintain the raw state properties of foods (Koutsoumanis
and Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith, 1998; Sofos, 2005; Stopforth
and Sofos, 2005). Under the ‘zero tolerance’ policy for visible carcass contamination in
the United States, cutting with a knife (knife-trimming) is required to remove visible
contamination on carcasses before any washing or other decontamination treatment is
applied. An alternative to cutting with a knife is to use steam-vacuuming for removal of
faecal and ingesta contamination spots of ≤2.5 cm in diameter. This is accomplished with
hand-held equipment, which applies hot water and/or steam to loosen soil and inactivate
bacteria, followed by removal of the contaminants through application of vacuum. The
effectiveness of knife-trimming and steam-vacuuming in reducing carcass contamination
depends on employee diligence of application and the operational status of equipment
(Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith, 1998; Sofos,
2005; Stopforth and Sofos, 2005).

Immediately after hide removal and before opening the carcass to remove the viscera, the
whole carcass is sprayed with water and possibly organic acid (lactic or acetic) solution to
reduce microbial contamination acquired during the hide removal process. These treatments
are considered necessary and effective because they are applied soon after hide removal
and before bacteria have attached strongly to the carcass. It has been reported that
pre-evisceration washing of carcasses may lead to an alteration of the surface physical
characteristics (e.g. contact angle and surface free energy) of the carcass tissue, resulting
in less attachment and potential biofilm formation by micro-organisms (Koutsoumanis and
Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith, 1998; Sofos, 2005; Stopforth and
Sofos, 2005).

Carcass sides (halves) that have passed the zero tolerance inspection for visible physical
contamination are washed with water sprays of various pressures to remove bone dust and
blood at the end of dressing. Following washing, carcasses are generally decontaminated with
chemical solutions and/or hot water or steam. Hot water (≥74°C) may be applied through
immersion or dipping, deluging, rinsing at low pressures, and spraying at higher pressures.
Each of these approaches has advantages and disadvantages, and the selection varies with
type of product treated and facilities available. Thermal energy may also be applied for
beef carcass decontamination in the form of pressurized steam. A patented process for
carcass decontamination developed by Frigoscandia and Cargill, Inc. (the Frigoscandia Steam
Pasteurization System™) has been approved and is used in the United States. Potential
advantages of using steam rather than hot water include efficiency of heat transfer and lower
water and energy usage. Steam pasteurization, however, requires a major capital investment
(Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith, 1998; Sofos,
2005; Stopforth and Sofos, 2005).
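The lethality of thermal treatments such as hot water or steam is commonly characterised with D- and z-values; the sketch below implements the classical first-order model with hypothetical parameter values (the D- and z-values shown are illustrative assumptions, not measured data for any particular carcass intervention).

```python
def d_value(temp_c, d_ref, t_ref_c, z_c):
    """D-value (minutes per 1-log10 reduction) at temp_c, using the
    classical z-value model: D(T) = D_ref * 10**((T_ref - T) / z)."""
    return d_ref * 10 ** ((t_ref_c - temp_c) / z_c)

def log_reduction(temp_c, time_min, d_ref, t_ref_c, z_c):
    """Log10 reduction delivered by holding for time_min at temp_c."""
    return time_min / d_value(temp_c, d_ref, t_ref_c, z_c)

# Hypothetical parameters for illustration: D = 0.1 min at 74 degC, z = 6 degC.
D_REF, T_REF, Z = 0.1, 74.0, 6.0

# A brief hot-water application: 0.2 min of surface exposure at 74 degC.
print(f"{log_reduction(74.0, 0.2, D_REF, T_REF, Z):.1f} log10 reduction")
```

The same two functions show why the ≥74 °C threshold matters: with a z-value of 6 °C, dropping the surface temperature by 6 °C increases the required exposure time tenfold for the same reduction.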

Decontamination with organic acid (lactic or acetic) solutions (1-5%) is also applied to reduce
the bacterial load of meat and poultry carcasses. In addition to the immediate microbial
reduction, acid decontamination results in a residual antimicrobial effect during storage (Ikeda
et al., 2003; Koutsoumanis et al., 2004). The effectiveness of acids in the decontamination
of meat is enhanced when the temperature of the solution is 55 °C. In general, organic acid
spraying is used extensively in beef carcass decontamination after treatment with hot water
or steam (Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005; Smulders and Greer,
1998; Sofos and Smith, 1998; Sofos, 2005; Stopforth and Sofos, 2005).

Additional chemical solutions tested and in some instances approved and used in the
decontamination of meat and poultry include chlorine and chlorine dioxide, trisodium
phosphate, acidified (usually with citric acid) sodium chlorite, hydrogen peroxide, ozonated
water, cetylpyridinium chloride, peroxyacetic acid-based preparations, and activated
lactoferrin. A variety of other chemical compounds such as polyphosphates, benzoates,
propionates, sodium hydroxide, sodium metasilicate, sodium bisulfate, etc. have been tested,
with varying degrees of success, for the decontamination of meat and poultry. Application of
these or other chemicals as meat and poultry decontaminants in the future will depend on their
efficacy, product and application safety, effects on product quality and cost (Koutsoumanis
and Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith, 1998; Sofos, 2005; Stopforth
and Sofos, 2005).

The use of more than one treatment may lead to synergistic or additive decontamination
effects and could be considered as a multiple hurdle approach. In fresh meat decontamination,
the multiple hurdle decontamination approach may involve simultaneous (e.g. warm acid
solutions) or sequential application (e.g. hide cleaning, steam vacuuming, pre-evisceration
washing, hot water or steam treatment, organic acid rinsing) of treatments. The effectiveness
of these treatments in reducing microbial contamination is affected by pressure, temperature,
chemicals used and their concentration, duration of exposure (which depends on speed of
slaughter and length of the application chamber), method of application and time or stage
of application (Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005; Sofos and Smith,
1998; Sofos, 2005; Stopforth and Sofos, 2005).
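Under the additive view of hurdle effects described above, the bookkeeping amounts to summing log10 reductions across the sequential treatments; a minimal sketch follows (the per-hurdle reduction values are hypothetical placeholders, not validated figures for any plant).

```python
def apply_hurdles(initial_log_count, log_reductions):
    """Combine sequential interventions as additive log10 reductions,
    flooring at zero (no negative log counts in this simple sketch)."""
    return max(initial_log_count - sum(log_reductions), 0.0)

# Hypothetical sequence mirroring the text: hide cleaning, steam vacuuming,
# pre-evisceration washing, hot water/steam, organic acid rinsing.
hurdles = {"hide cleaning": 1.5, "steam vacuuming": 1.0,
           "pre-evisceration wash": 0.5, "hot water": 1.5, "acid rinse": 1.0}

start = 4.0  # hypothetical initial contamination, log10 CFU/100 cm2
final = apply_hurdles(start, hurdles.values())
print(f"{start} -> {final} log10 CFU/100 cm2 after {len(hurdles)} hurdles")
```

The additive assumption is the simplest case; true synergy would deliver more than the sum, and antagonism less, which is why the combined effect of a given sequence still needs experimental validation.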

Despite the generally accepted effectiveness of decontamination technologies in reducing
prevalence and numbers of bacteria on carcasses, there are a number of concerns associated
with their use. Depending on the concentration and intensity of application, hot water, steam
and acid treatments may result in undesirable effects in colour and/or flavour of the products.
Such effects, however, should be only slight and reversible at acid concentrations below 2% or
hot water/steam treatments of short duration. Discoloration problems can also be prevented
by using buffered acids. Application of spraying/rinsing treatments may lead to spreading
and redistribution of bacteria over the carcass or penetration into the tissue. These problems,
however, can be avoided by appropriate selection and adjustment of factors affecting the
efficacy of decontamination. For example, the issue of bacterial redistribution may be
addressed by using interventions that inactivate (hot water, steam, chemical solutions),
rather than remove contamination. As mentioned previously, the period of time before
decontamination has an important effect on bacterial attachment and biofilm formation;
thus, decontamination treatments applied before evisceration will be more effective since
bacterial attachment is still weak (Koutsoumanis and Sofos, 2004; Koutsoumanis et al., 2005;
Sofos and Smith, 1998; Sofos, 2005; Stopforth and Sofos, 2005).

Another important concern associated with the use of decontamination technologies is the
potential development of stress-resistant pathogens. Heat or acid resistance are important
physiological characteristics that may influence the behaviour of pathogens during meat
processing, cooking or in host systems (gastric secretions, phagocytosomal vacuoles) where
acidity is the final barrier that the pathogen must overcome before pathogenesis. The
potential concern for development of stress-resistant pathogens can be attributed to the
‘stress hardening’ phenomenon, which refers to the increased tolerance of a pathogen to
a lethal stress after adaptation to the same or a different sub-lethal stress environment.
Studies with model systems have demonstrated that adaptation of pathogenic bacteria such
as L. monocytogenes, E. coli O157:H7 and Salmonella to a mildly stressing environment may
result in increased survival under stress conditions that would be lethal for non-adapted
cells. In addition to increased stress resistance, adaptation may lead to enhanced virulence.
Thus, additional studies are needed to evaluate the potential development of stress-resistant
pathogens with prolonged use of decontamination treatments, while strategies to control
stress resistance of bacteria should involve optimization of decontamination interventions,
in type, intensity and sequence, to maximize microbial destruction and minimize resistance
development. It should be stressed, however, that irrespective of potential stress adaptation
inducement on survivors, decontamination treatments are highly effective in reducing
microbial contamination and prevalence of pathogens on carcasses, and, thus, allowing meat
operations to meet regulatory performance standards and industry specifications. It should
also be noted that, for risk assessment based pathogen control, additional studies are needed
to evaluate the influence of various factors presented throughout this report, and to also
determine the efficacy of decontamination interventions in reducing pathogen cell numbers,
in addition to data on changes in prevalence. A complete risk assessment should be able to
estimate, with the highest possible accuracy, numbers of pathogen cells per serving of food
or numbers of food servings with a given number of pathogen cells (Koutsoumanis and Sofos,
2004; Koutsoumanis et al., 2005; Samelis and Sofos, 2003; Sofos, 2005).

7. Difficulties in risk assessment

Quantitative microbial risk assessments in food production are still in their infancy, and
they cannot provide complete outcomes for use in setting food safety objectives (FSO).
In general, the outcomes of present risk assessments are not yet ready for direct use. The value or
contributions of present risk assessment exercises include: (1) development and improvement
of approaches for risk assessments; (2) identification of components of the food chain to be
considered in risk assessments; (3) selection of mathematical approaches to be employed; and
(4) identification of data gaps and research needs for completion of risk assessments. There
is a need for collaboration and coordination among various interested groups, organizations
and agencies, nationally and internationally, to develop plans for improvement of risk
assessment outcomes. Studies for data collection to be used in risk assessments should cover
the production-to-consumption continuum.
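One reason such assessments are demanding is the estimation task named earlier: propagating prevalence and concentration distributions down to pathogen cells per serving. A minimal Monte Carlo sketch of that step follows (the prevalence, concentration distribution and serving size are all hypothetical inputs chosen only for illustration).

```python
import random

def simulate_servings(n, prevalence, mean_log_cfu_g, sd_log_cfu_g, serving_g=100.0):
    """Draw pathogen counts per serving: each serving is contaminated with
    probability `prevalence`; contaminated servings get a concentration whose
    log10 CFU/g is normally distributed (a common modelling simplification)."""
    counts = []
    for _ in range(n):
        if random.random() < prevalence:
            log_conc = random.gauss(mean_log_cfu_g, sd_log_cfu_g)  # log10 CFU/g
            counts.append(10 ** log_conc * serving_g)              # CFU per serving
        else:
            counts.append(0.0)
    return counts

random.seed(1)
# Hypothetical inputs: 2% prevalence, mean -2 log10 CFU/g, sd 0.8 log10 CFU/g.
counts = simulate_servings(100_000, prevalence=0.02,
                           mean_log_cfu_g=-2.0, sd_log_cfu_g=0.8)
contaminated = sum(1 for c in counts if c > 0)
print(f"{contaminated} of {len(counts)} simulated servings carry the pathogen")
```

From such simulated counts one can read off exactly the two quantities the text asks for: the distribution of cells per serving, and the fraction of servings exceeding any given cell number; in practice, the data gaps identified above concern the input distributions, not the simulation itself.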


References

Annamalai, T., Nair, M.K.M., Marek, P., Vasudevan, P., Schreiber, D., Knight, R., Hoagland, T. and Venkitanarayanan,
K., 2004. In vitro inactivation of Escherichia coli O157:H7 in bovine rumen fluid by caprylic acid. J. Food Prot.
67, 884-888.
Anderson, R.C., Buckley, S.A., Kubena, L.F., Stanker, L.H., Harvey, R.B. and Nisbet, D.J., 2000. Bactericidal effect
of sodium chlorate on Escherichia coli O157:H7 and Salmonella Typhimurium DT104 in rumen contents in vitro.
J. Food Prot. 63, 1038-1042.
Arthur, T.M., Bosilevac, J.M., Nou, X., Shackelford, S.D., Wheeler, T.L., Kent, M.P., Jaroni, D., Pauling, B., Allen,
D.M. and Koohmaraie, M., 2004. Escherichia coli O157 prevalence and enumeration of aerobic bacteria,
Enterobacteriaceae, and Escherichia coli O157 at various steps in commercial beef processing plants. J. Food
Prot. 67, 658-665.
Bach, S.J., McAllister, T.A., Mears, G.J. and Schwartzkopf-Genswein, K.S., 2004. Long-haul transport and lack of
preconditioning increases fecal shedding of Escherichia coli O157:H7 by calves. J. Food Prot. 67, 672-678.
Bacon, R.T., Belk, K.E., Sofos, J.N., Clayton, R.P., Reagan, J.O. and Smith, G.C., 2000. Microbial populations of animal
hides and beef carcasses at different stages of slaughter in plants employing multiple-sequential interventions
for decontamination. J. Food Prot. 63, 1080-1086.
Barham, A.R., Barham, B.L., Johnson, A.K., Allen, D.M., Blanton, J.R., Jr. and Miller, M.F., 2002. Effects of the
transportation of beef cattle from the feedyard to the packing plant on prevalence levels of Escherichia coli
O157 and Salmonella spp. J. Food Prot. 65, 280-283.
Barkocy-Gallagher, G.A., Arthur, T.M., Rivera-Betancourt, M., Nou, X., Shackelford, S.D., Wheeler, T.L. and Koohmaraie,
M., 2003. Seasonal prevalence of Shiga toxin-producing Escherichia coli, including O157:H7 and non-O157
serotypes, and Salmonella in commercial beef processing plants. J. Food Prot. 66, 1978-1986.
Berg, J., McAllister, T., Bach, S., Stilborn, R., Hancock, D. and LeJeune, J., 2004. Escherichia coli O157:H7 excretion
by commercial feedlot cattle fed either barley- or corn-based finishing diets. J. Food Prot. 67, 666-671.
Bosilevac, J.M., Arthur, T.M., Wheeler, T.L., Shackelford, S.D., Rossman, M., Reagan, J.O. and Koohmaraie, M.,
2004a. Prevalence of Escherichia coli O157 and levels of aerobic bacteria and Enterobacteriaceae are reduced
when hides are washed and treated with cetylpyridinium chloride at a commercial beef processing plant. J.
Food Prot. 67, 646-650.
Bosilevac, J.M., Wheeler, T.L., Rivera-Betancourt, M., Nou, X., Arthur, T.M., Shackelford, S.D., Kent, M.P., Jaroni,
D., Osborn, M., Rossman, M., Reagan, J.O. and Koohmaraie, M., 2004b. Protocol for evaluating the efficacy of
cetylpyridinium chloride as a beef hide intervention. J. Food Prot. 67, 303-309.
Braden, K.W., Blanton, J.R., Barham, A.R., Allen, V.G., Pond, K.R. and Miller, M.F., 2004. Ascophyllum nodosum
supplementation: a preharvest intervention for reducing Escherichia coli O157:H7 and Salmonella spp. in feedlot
steers. J. Food. Prot. 67, 1824-1828.
Brashears, M.M., Galyean, M.L., Loneragan, G.H., Mann, J.E. and Killinger-Mann, K., 2003a. Prevalence of Escherichia
coli O157:H7 and performance by beef feedlot cattle given Lactobacillus direct-fed microbials. J. Food Prot.
66, 748-754.
Brashears, M.M., Jaroni, D. and Trimble, J., 2003b. Isolation, selection, and characterization of lactic acid bacteria
for a competitive exclusion product to reduce shedding of Escherichia coli O157:H7 in cattle. J. Food Prot. 66,
CAC (Codex Alimentarius Commission), 1997. Joint FAO/WHO Food Standards Programme, Codex Committee on Food
Hygiene. Food Hygiene, Supplement to Volume 1B-1997. Principles for the Establishment of and Application of
Microbiological Criteria for Foods. CAC/GL 21-1997. Secretariat of the Joint FAO/WHO Food Standards Programme.
Rome: Food and Agriculture Organization of the United Nations.
Callaway, T.R., Anderson, R.C., Genovese, K.J., Poole, T.L., Anderson, T.J., Byrd, J.A., Kubena, L.F. and Nisbet, D.J.,
2002. Sodium chlorate supplementation reduces E. coli O157:H7 populations in cattle. J. Anim. Sci. 80, 1683-
Callaway, T.R., Edrington, T.S., Anderson, R.C., Genovese, K.J., Poole, T.L., Elder, R.O., Byrd, J.A., Bischoff, K.M. and
Nisbet, D.J., 2003. Escherichia coli O157:H7 populations in sheep can be reduced by chlorate supplementation.
J. Food Prot. 66, 194-199.
CAST (Council for Agricultural Science and Technology), 2004. Intervention Strategies for the Microbiological Safety
of Food of Animal Origin, Issue Paper; Council for Agricultural Science and Technology: Ames, Iowa, USA, Issue
Paper # 25, January.
Dargatz, D.A., Wells, S.J., Thomas, L.A., Hancock, D.D. and Garber, L.P., 1997. Factors associated with the presence
of Escherichia coli O157:H7 in feces of feedlot cattle. J. Food Prot. 61, 466-470.
Diez-Gonzalez, F., Callaway, T.R., Kizoulis, M.G. and Russell, J.B., 1998. Grain feeding and the dissemination of
acid-resistant Escherichia coli from cattle. Science 281, 1666-1668.
Doyle, M.P., Ferson, S., Hancock, D.D., Levine, M.M., Paoli, G., Peterson, B.J., Sofos, J.N. and Sumner, S.S., 2002.
Escherichia coli O157:H7 in Ground Beef; Review of a Draft Risk Assessment. Institute of Medicine of the National
Academies. The National Academies Press, Washington, D.C. 161 p.
Edrington, T.S., Callaway, T.R., Varey, P.D., Jung, Y.S., Bischoff, K.M., Elder, R.O., Anderson, R.C., Kutter, E., Brabban,
A.D. and Nisbet, D.J., 2003. J. Appl. Microbiol. 94, 207-213.
Elder, R.O., Keen, J.E., Siragusa, G.R., Barkocy-Gallagher, G.A., Koohmaraie, M. and Laegreid, W.W., 2000. Correlation
of enterohemorrhagic Escherichia coli O157 prevalence in feces, hides, and carcasses of beef cattle during
processing. Proc. Natl. Acad. Sci. USA 97, 2999-3003.
Genovese, K.J., Anderson, R.C., Harvey, R.B., Callaway, T.R., Poole, T.L., Edrington, T.S., Fedorka-Cray, P.J. and
Nisbet, D.J., 2003. Competitive exclusion of Salmonella from the gut of neonatal and weaned pigs. J. Food
Prot. 66, 1353-1359.
Goode, D., Allen, V.M. and Barrow, P.A., 2003. Reduction of experimental Salmonella and Campylobacter of chicken
skin by application of lytic bacteriophages. Appl. Environ. Microbiol. 69, 5032-5036.


FSIS (Food Safety and Inspection Service), 1996. Pathogen reduction: hazard analysis critical control point (HACCP)
systems; final rule. Fed. Regist. 61, 38806-38989.
Hancock, D. and Dargatz, D., 1995. Implementation of HACCP on the farm. Hazard Analysis and Critical Control Point
(HACCP) Symposium. Presented in association with the 75th Annual Meeting of the Conference of Research
Workers in Animal Diseases, November 12, Ramada Congress Hotel, Chicago, IL, 6 p.
Hancock, D.D., Besser, T.E., Rice, D.H., Herriot, D.E. and Tarr, P.I., 1997. A longitudinal study of Escherichia coli
O157 in fourteen cattle herds. Epidemiol. Infect. 118, 193-195.
Huffman, R.D., 2002. Current and future technologies for the decontamination of carcasses and fresh meat. Meat
Sci. 62, 285-294.
ICMSF (International Commission on Microbiological Specifications for Foods), 2002. Micro-organisms in Foods 7:
Microbiological Testing in Food Safety Management. New York: Kluwer Academic/Plenum Publishers, 362 p.
Ikeda, J.S., Samelis, J., Kendall, P.A., Smith, G.C. and Sofos, J.N., 2003. Acid adaptation does not promote survival
or growth of Listeria monocytogenes on fresh beef following acid and non-acid decontamination treatments.
J. Food. Prot. 66, 985-992.
Keen, J.E. and Elder, R.O., 2002. Isolation of shiga-toxigenic Escherichia coli O157 from hide surfaces and the oral
cavity of finished beef feedlot cattle. J. Amer. Vet. Med. Assoc. 220, 756-763.
Koutsoumanis, K.P., Ashton, L.V., Geornaras, I., Belk, K.E., Scanga, J.A., Kendall, P.A., Smith, G.C. and Sofos, J.N.,
2004. Effect of single and sequential hot water and lactic acid decontamination treatments on the survival and
growth of Listeria monocytogenes and spoilage microflora during aerobic storage of fresh beef at 4, 10, and 25 °C.
J. Food Prot. 67, 2703-2711.
Koutsoumanis, K. and Sofos, J.N., 2004. Microbial contamination of carcasses and cuts. In: Encyclopedia of Meat
Sciences. W.K. Jensen (Ed.). Elsevier Academic Press, Amsterdam, The Netherlands. pp. 727-737.
Koutsoumanis, K.P., Geornaras, I. and Sofos, J.N., 2005. Microbiology of land muscle foods. In: Handbook of Food
Science. Y.H. Hui (Ed.). Marcel Dekker Inc., New York, NY (In press).
Lammerding, A.M., 1997. An overview of microbial food safety risk assessment. J. Food Prot. 60, 1420-1425.
Larsen, S.T., Hurd, H.S., McKean, J.D., Griffith, R.W. and Wesley, I.V., 2004. Effect of short-term lairage on the
prevalence of Salmonella enterica in cull sows. J. Food Prot. 67, 1489-1493.
Mies, P.D., Covington, B.R., Harris, K.B., Lucia, L.M., Acuff, G.R. and Savell, J.W., 2004. Decontamination of cattle
hides prior to slaughter with and without antimicrobial agents. J. Food Prot. 67, 579-582.
NACMCF (National Advisory Committee on Microbiological Criteria for Foods), 1998. Hazard analysis and critical
control point principles and application guidelines. J. Food Prot. 61, 762-775.
Nou, X., Rivera-Betancourt, M., Bosilevac, J.M., Wheeler, T.L., Shackelford, S.D., Gwartney, B.L., Reagan, J.O. and
Koohmaraie, M., 2003. Effect of chemical dehairing on the prevalence of Escherichia coli O157:H7 and levels
of aerobic bacteria and Enterobacteriaceae on carcasses in a commercial beef processing plant. J. Food Prot.
66, 2005-2009.
O’Flynn, G., Ross, R.P., Fitzgerald, G.F. and Coffey, A., 2004. Evaluation of a cocktail of three bacteriophages for
biocontrol of Escherichia coli O157:H7. Appl. Environ. Microbiol. 70, 3417-3424.
Park, G.W. and Diez-Gonzalez, F., 2003. Utilization of carbonate and ammonia-based treatments to eliminate Escherichia
coli O157:H7 and Salmonella Typhimurium DT104 from cattle manure. J. Appl. Microbiol. 94, 675-685.
Riley, D.G., Gray, J.T., Loneragan, G.H. and Barling, K.S., 2003. Escherichia coli O157:H7 prevalence in fecal samples
of cattle from a southeastern beef cow-calf herd. J. Food Prot. 66, 1778-1782.
Schamberger, G.P. and Diez-Gonzalez, F. 2004. Characterization of colicinogenic Escherichia coli strains inhibitory
to enterohemorrhagic Escherichia coli. J. Food Prot. 67, 486-492.
Schmidt, P.L., O’Connor, A.M., McKean, J.D. and Hurd, H.S., 2004. The association between cleaning and disinfection
of lairage pens and the prevalence of Salmonella enterica in swine at harvest. J. Food Prot. 67, 1384-1388.


Samelis, J. and Sofos, J.N., 2003. Strategies to control stress-adapted pathogens and provide safe foods. In: Microbial
Adaptation to Stress and Safety of New-Generation Foods. A.E. Yousef and V.K. Juneja (Eds.). CRC Press, Inc.
Boca Raton, FL., p. 303-351.
Shere, J.A., Kaspar, C.W., Bartlett, K.J., Linden, S.E., Norell, B., Francey, S. and Schaefer, D.M., 2002. Shedding of
Escherichia coli O157:H7 in dairy cattle housed in a confined environment following waterborne inoculation.
Appl. Environ. Microbiol. 68, 1947-1954.
Smulders, F.J.M. and Greer, G.G., 1998. Integrating microbial decontamination with organic acids in HACCP
programmes for muscle foods: prospects and controversies. Int. J. Food Microbiol. 44, 149-169.
Sofos, J.N., 1994. Microbial growth and its control in meat poultry and fish. In: A.M. Pearson and T.R. Dutson (Eds.).
Quality Attributes and their Measurements in Meat, Poultry and Fish Products. London: Blackie Academic and
Professional, p. 359-403.
Sofos, J.N., 2002. Approaches to pre-harvest food safety assurance. In: Food Safety Assurance and Veterinary Public
Health; Volume 1, Food Safety Assurance in the Pre-Harvest Phase. F.J.M. Smulders and J.D. Collins (Eds.).
Wageningen Academic Publishers, Wageningen, The Netherlands, p. 23-48.
Sofos, J.N., 2004. Pathogens in animal products: sources and control. In: Encyclopedia of Animal Science. W. Pond
and A. Bell (Eds.). Marcel Dekker, Inc., New York, NY, p. 701-703.
Sofos, J.N (Editor), 2005. Improving the Safety of Fresh Meat. CRC/Woodhead Publishing Limited, Cambridge, UK.
780 p.
Sofos, J.N., Belk, K.E. and Smith, G.C., 1999. Processes to reduce contamination with pathogenic micro-organisms
in meat. Proceedings of the 45th International Congress of Meat Science and Technology, Yokohama, Japan,
p. 596-605.
Sofos, J.N. and Smith, G.C., 1998. Non-acid meat decontamination technologies: model studies and commercial
applications. Int. J. Food Microbiol. 44, 171-188.
Stevenson, S.M.L., Cook, S.R., Bach, S.J. and McAllister, T.A., 2004. Effects of water source, dilution, storage, and
bacterial and fecal loads on the efficacy of electrolyzed oxidizing water for the control of Escherichia coli O157:
H7. J. Food Prot. 67, 1377-1383.
Stopforth, J.D. and Sofos, J.N., 2005. Recent advances in pre- and post-slaughter intervention strategies for control
of meat contamination. In: Recent Advances in Intervention Strategies to Improve Food Safety. V.J. Juneja
(Ed.). American Chemical Society (In press).
Tkalcic, S., Zhao, T., Harmon, B.G., Doyle, M.P., Brown, C.A. and Zhao, P., 2003. Fecal shedding of enterohemorrhagic
Escherichia coli in weaned calves following treatment with probiotic Escherichia coli. J. Food Prot. 66, 1184-
Tutenel, A.V., Pierard, D., Van Hoof, J. and De Zutter, L., 2003. Molecular characterization of Escherichia coli O157
contamination routes in a cattle slaughterhouse. J. Food Prot. 66, 1564-1569.
Walls, I. and Buchanan, R.L., 2005. Use of food safety objectives as a tool for reducing foodborne listeriosis. Food
Control 16, 795-799.
Whiting, R.C., 1996. Risk assessment and predictive microbiology. J. Food Prot. Supplement, 31-36.
Younts-Dahl, S.M., Galyean, M.L., Loneragan, G.H., Elam, N.A. and Brashears, M.M., 2004. Dietary supplementation
with Lactobacillus- and Propionibacterium-based direct-fed microbials and prevalence of Escherichia coli O157
in beef feedlot cattle and on hides at harvest. J. Food Prot. 67, 889-893.
Zhao, T., Tkalcic, S., Doyle, M.P., Harmon, B.G., Brown, C.A. and Zhao, P., 2003. Pathogenicity of enterohemorrhagic
Escherichia coli in neonatal calves and evaluation of fecal shedding by treatment with probiotic Escherichia
coli. J. Food Prot. 66, 924-930.


Chemical residues in foods of animal origin:

Assessing risk and implementing control strategies
Sarah M. Cahill, Ezzeddine Boutrif and Maria de Lourdes Costarrica G.
Food Quality and Standards Service, Food and Nutrition Division, Food and Agriculture
Organization of the United Nations, Viale delle Terme di Caracalla, 00100 Rome, Italy, sarah.

The use of chemicals in food production has long caused concern in terms of the negative
health impact of any residues remaining in the food at the time of consumption. This has
resulted in extensive assessment both at national and international levels of the safety of
the chemicals that are used. In describing the risk assessment process the case of residues
of veterinary drugs in foods is examined, including the role of ADIs and MRLs in regulation
and control. Despite years of work there remains a gap between the number of veterinary
drugs for which an ADI/MRL has been established and those that are in use. Recent trade
disruptions have highlighted the negative impact of this situation and the importance of
closing the gap. Moreover, it demonstrates that the implementation of control strategies
requires a cross border approach and thus emphasises the need to assist countries to develop
their capacities to adequately monitor and control veterinary drug residues in foods.

Keywords: chemical residues, assessment, JECFA, residues of veterinary drugs, control


1. Introduction

The application of specific chemicals such as veterinary drugs and pesticides in the production
of foods of animal origin as well as in the production of animal feedstuffs has long been
used to boost production, facilitate processing, ensure an adequate food supply and increase
profitability. While the benefits of these various chemical substances have been many, their
use requires considerable guidance and regulation to ensure that any residues remaining
in the final food product are safe in terms of human health. The challenge is to balance a
high quality affordable and adequate food supply with the need to protect consumers from
unnecessary and potentially harmful exposure to chemicals. This challenge is all the greater
as chemical residues in the food supply are not only a consequence of their direct use in food
production and processing, but also arise as a result of contamination of the food supply
from the environment, e.g. heavy metals, naturally occurring chemical contaminants, e.g.
mycotoxins, as well as accidental contamination of food or feed stuffs.

Currently food safety is a priority issue in many countries and regulators, producers, processors
and consumers are faced with a myriad of food safety concerns. Despite the emergence of so
called “new” food safety issues such as Bovine Spongiform Encephalopathy (BSE) and the
emergence of anti-microbial resistant micro-organisms, the concern of consumers regarding
the presence of chemicals in their food is still significant. In a report of a survey undertaken
in the United Kingdom in 2004, 40% of consumers surveyed expressed concern regarding
the use of antibiotics in meats (FSA, 2005). Although a decrease has been observed in the
number of consumers in the United States of America that consider residues as a serious
health concern, it is still an important food safety issue for 66% of the population (Figure 1)
(Food Marketing Institute, 1989 – 1997). A study in Australia and Japan has also indicated
that the presence of chemical residues in food was among the top three food safety concerns
(Smith and Riethmuller, 2000). Another indicator of consumer concern about residue levels
in foods is the recent trend towards organic and other agricultural production systems
that feature a reduced reliance on artificial chemical inputs, a sector which has seen an
overall growth rate of around 25% per year in the European Union for the last ten years
(FAO, 2000). Although information from less industrialised countries is not readily available,
these statistics serve to highlight the continued importance of the assessment of the risks
associated with chemical hazards in foods and the need to implement control strategies to
minimise the risk.






[Figure 1: chart of the percentage of U.S. consumers, 1989-1997, rating each of the following as a serious health risk: bacteria, residues, irradiation, animal drugs, nitrites, additives.]

Figure 1. Consumer attitudes about risks in food in the United States of America and the percent rating
specific food safety issues as a “serious health risk” (Food Marketing Institute, 1989-97).

2. Sources of chemical residues in foods of animal origin

Chemical residues and contaminants in foods of animal origin can be divided into three groups
according to their source. The first group comprises residues arising from the direct application
of an agricultural or veterinary chemical such as the treatment of animals with veterinary drugs
prior to slaughter. Of particular relevance are antimicrobial substances or antibiotics which
are widely used as therapeutic agents, as well as prophylactics or feed additives. In addition
to potentially giving rise to unacceptable levels of residues in foods of animal origin they can
also lead to the development of resistant strains of bacteria (WHO, 2004).

The second group relates to the presence of chemicals in animal feedstuffs, which may
be contaminated via a number of routes. These include the application of pesticides or
fungicides to feed crops during primary production, where inappropriate use may result
in the presence of unacceptable residue levels in the harvested crop (FAO, 1998). Specific
chemicals such as antibiotics or hormones may be added to feedstuffs to prevent infection
and boost production. Mycotoxins are another important group of naturally occurring toxicants
that could be present in ingredients for feedstuffs such as wheat, maize and barley (FAO, 2001).
Consumption of feed contaminated with mycotoxins can lead to the presence of these toxins
in foods of animal origin, for example aflatoxin M1 in milk.

Other potential contaminants in feedstuffs include environmental contaminants such as
polychlorinated biphenyls (PCBs), dioxins and heavy metals including mercury, lead, or
cadmium (Bernard et al., 1999; Angelova et al., 2005). Foods of animal origin are the greatest
source of human exposure to PCBs and dioxins and animal feeds may be an important source of
contamination for livestock. In addition, contaminated fats or oils added either intentionally
or unintentionally to manufactured feeds can be a source of dioxins and PCBs. Pastureland
for livestock may become contaminated with these industrial pollutants which, when emitted
into the air, can contaminate soil and water. Plant materials growing in areas with high levels
of other environmental pollutants, such as radio-nuclides and heavy metals, that are used as
feed may also lead to unacceptably high levels of contamination in food products of animal
origin. Similarly, fish oils or meal used as animal feed ingredients, may contain high levels of
contaminants if they are produced from fish grown in polluted areas (FAO, 2000).

Thirdly, foods of animal origin may become contaminated during processing either accidentally
or intentionally. For example, inadequate removal or rinsing of chemicals used in the cleaning
of equipment prior to processing may be a potential source of contamination (Folks and
Burson, 2001). In some cases illegal food additives, such as the carcinogenic dye Sudan I,
find their way into foods. The presence of this industrial dye in chilli powder led to a huge
recall of a range of food products in countries worldwide in early 2005 (CFIA, 2005; China
Economy, 2005; FSA, 2005; FSAI, 2005).

3. Regulatory control of chemicals in foods

Regulatory control of chemicals in foods is an essential component of any national food
control system. It allows us to reap the benefits of a range of chemical substances while at
the same time striving to minimise any risk to the health of the consumer. The regulatory
system for chemicals is composed of a number of elements as follows:

• Legislation controlling chemical use and registration.
• Establishment of limits (through evaluation/assessment).
• Monitoring programs to ensure established limits are adhered to and illegal chemicals are
not used (inspection services, laboratory services, training and education).
• Guidelines for the responsible use and application of chemicals as well as guidelines to
prevent contamination of food products, e.g. Good Agriculture Practices (GAPs), Good
Manufacturing Practices (GMPs), Good Animal Husbandry Practices, Good Animal Feeding
Practices, etc.

The control of chemicals in foods thus requires a comprehensive system and infrastructure.
However, many countries have neither the means nor the resources to implement such a system
and in particular to evaluate the safety or risk of the vast range of chemicals that are found in
foods. This has long been recognized and in 1956 the Joint FAO/WHO Expert Committee on
Food Additives (JECFA), which is an international expert scientific body administered by FAO
and WHO, was established. Although initially evaluating the safety of food additives only,
this committee now also evaluates contaminants, naturally occurring toxicants and residues
of veterinary drugs in foods. The Joint FAO/WHO Meeting on Pesticide Residues (JMPR) is a
similar body which has met annually since 1963 to conduct scientific evaluations of pesticide
residues in food. It provides advice on the acceptable levels of pesticide residues in food
moving in international trade. Issues related to the chemical residues or contaminants in
foods that are not addressed by either JECFA or JMPR are taken on board by ad hoc FAO/
WHO expert consultations as required. As well as undertaking safety evaluations and risk
assessments of chemicals found in foods, the international activities of FAO and WHO also
promote harmonization of assessment methods, contribute to the development of guidelines
for good practices and provide scientific advice to the Codex Alimentarius Commission, which
can be considered as the international food safety risk manager, for the establishment of
limits and the development of other risk management strategies.

The activities undertaken by FAO, together with WHO, on issues related to chemicals in
foods are inextricably linked with other international activities as well as the work underway
in the member countries of both organizations (Figure 2). Through Codex, the member

e Member trade ernation
rtis agree al
e ment
a, exp Countries s
Dat ific
Sci vice Needs, Standards, Agreements
ad feasibility, guidelines,
inputs, etc. related texts
risk assessment
n chm rds
JECFA, International Be nda
JMPR, Scientific sta
advice risk manager
ad hoc expert Requests for
consultations advice, risk
Figure 2. Graphic representation of the interlinkage among the international risk assessment activities of
FAO and WHO and other activities at both country and international levels.

78  Towards a risk-based chain control

Sarah M. Cahill, Ezzeddine Boutrif and Maria de Lourdes Costarrica G.

countries identify the specific areas in which they need risk management guidance. Codex
can request JECFA and JMPR to undertake risk assessment and provide scientific advice on
these specific issues to enable them to provide the appropriate risk management guidance.
The international risk assessment work of JECFA and JMPR is carried out using the data and
expertise that is provided by countries, their research institutes and their industry. The
resulting risk assessment and scientific advice can then be used by countries as part of their
chemical residue and contaminant control program. It can also be used by the international
risk manager on food safety issues, Codex, in the development of standards, guidelines and
related texts. These Codex texts have been recognised by the World Trade Organization’s (WTO)
Agreement on the Application of Sanitary and Phytosanitary Measures (SPS) as benchmark
standards for all Members of the WTO (WTO, 1995).

4. International activities on residues of veterinary drugs in foods

The international activities on chemicals in foods are broad, reflecting the diversity of issues
to be considered when controlling chemicals in foods. To provide an example of how risk
assessment and control of chemical residues in foods is addressed at the international level
the area of residues of veterinary drugs in foods is examined in more detail.

4.1. Risk assessment – JECFA

Risk assessment has been defined by Codex as a scientifically based process consisting of
hazard identification, exposure assessment, hazard characterization and risk characterization
(Table 1).

Table 1. Risk assessment defined by the Codex Alimentarius.

Hazard Identification The identification of biological, chemical, and physical agents capable of causing
adverse health effects and which may be present in a particular food or group of foods.
Exposure Assessment The qualitative and/or quantitative evaluation of the likely intake of biological,
chemical, and physical agents via food as well as exposures from other sources if relevant.
Hazard Characterization The qualitative and/or quantitative evaluation of the nature of the adverse health
effects associated with biological, chemical and physical agents which may be
present in food. For chemical agents, a dose response assessment should be
performed. For biological or physical agents, a dose-response assessment should be
performed if the data are obtainable.
Dose-Response Assessment The determination of the relationship between the magnitude of exposure (dose)
to a chemical, biological or physical agent and the severity and/or frequency of
associated adverse health effects (response).
Risk Characterization The qualitative and/or quantitative estimation, including attendant uncertainties,
of the probability of occurrence and severity of known or potential adverse
health effects in a given population based on hazard identification, hazard
characterization and exposure assessment.
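For chemical agents, the dose-response assessment in Table 1 typically yields a no-observed-adverse-effect level (NOAEL) from animal studies, from which a health-based guidance value such as the ADI is derived by applying safety (uncertainty) factors. The following is a minimal sketch of that arithmetic; the NOAEL value is hypothetical, and the 10 × 10 interspecies/intraspecies factors are the conventional defaults, assumed here purely for illustration:

```python
# Sketch: deriving an ADI from a dose-response assessment.
# All numbers are hypothetical; 10 x 10 are the conventional default
# safety factors for interspecies and intraspecies uncertainty.

NOAEL_MG_PER_KG_BW = 6.0    # hypothetical NOAEL from a chronic animal study
INTERSPECIES_FACTOR = 10.0  # animal-to-human extrapolation
INTRASPECIES_FACTOR = 10.0  # variability within the human population

def derive_adi(noael: float, *factors: float) -> float:
    """Divide the NOAEL by the product of all safety (uncertainty) factors."""
    total = 1.0
    for f in factors:
        total *= f
    return noael / total

adi = derive_adi(NOAEL_MG_PER_KG_BW, INTERSPECIES_FACTOR, INTRASPECIES_FACTOR)
print(f"ADI = {adi:.3f} mg/kg bw per day")  # ADI = 0.060 mg/kg bw per day
```

A temporary ADI simply enlarges the divisor with a higher-than-normal safety factor, pending the submission of further data.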


In addition to Codex definitions and related texts, scientific bodies such as JECFA have more
detailed guidelines on the manner in which they undertake their risk assessment work with
the objective of providing the required scientific advice (FAO/WHO, 2000). As well as ensuring
the work proceeds in an efficient and orderly manner, such guidelines allow those outside
of the JECFA process to understand how the work is undertaken and the way in which the
scientific advice is developed.

JECFA work in the area of residues of veterinary drugs in foods comprises three areas:
• The establishment of recommended Maximum Residue Limits (MRLs).
• The determination of Acceptable Daily Intakes (ADIs).
• The development of principles for evaluating the safety of residues of veterinary drugs
in food and for establishing ADIs and MRLs for certain drugs when they are administered
to food-producing animals in accordance with good veterinary practices.

The MRL is the maximum concentration of residue resulting from the use of a veterinary drug
(expressed in mg/kg or µg/kg on a fresh weight basis) that is acceptable in or on a food.
It is based on the type and amount of residue considered to be without toxicological hazard
for human health as expressed by the ADI, or on the basis of a temporary ADI. It also takes
into account other relevant public health risks as well as food technological aspects and
estimated food intakes. When establishing an MRL, consideration is also given to residues that
occur in foods of plant origin and/or the environment. Furthermore, the MRL may be reduced
to be consistent with good practices in the use of veterinary drugs and to the extent that
practical analytical methods are available. The MRLs elaborated by JECFA are “recommended
MRLs” that are forwarded to the Codex Committee on Residues of Veterinary Drugs in Foods
(CCRVDF) for consideration. A Temporary MRL is established by JECFA when a temporary ADI
has been established and/or when it has been found necessary to provide time to generate
and evaluate further data on the nature and quantification of residues. In some cases, it is
concluded that there is no need to specify a numerical MRL. This can occur when the available
data on the identity and concentration of residues of the veterinary drug in animal tissues
indicate a large margin of safety for consumption of residues in food when the drug is used
according to good practice in the use of veterinary drugs. In such cases the output of the
JECFA evaluation is known as MRL “not specified”.

The ADI is an estimate of the amount of a substance in food or drinking water that can be
ingested daily over a lifetime without appreciable risk. It is expressed on a body-weight
basis (standard human = 60 kg). The ADI is listed in units of mg per kg of body weight. A
Temporary ADI is established by JECFA when data are sufficient to conclude that use of the
substance is safe over the relatively short period of time required to generate and evaluate
further safety data, but are insufficient to conclude that use of the substance is safe over a
lifetime. A higher-than-normal safety factor is used when establishing a temporary ADI and
an expiration date is established by which time appropriate data to resolve the safety issue
should be submitted to JECFA. With regard to veterinary drug residues an ADI “not specified”
results when the available data on the toxicity and intake of the veterinary drug indicate a
large margin of safety for consumption of residues in food when the drug is used according
to good practice in the use of veterinary drugs. In some cases it may be that no ADI is
allocated. There are various reasons for this, ranging from a lack of information to data on
adverse effects that call for advice that the veterinary drug should not be used at all.
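One common way to relate proposed MRLs to the ADI is to estimate the theoretical maximum daily intake (TMDI) of residues that would result if every tissue contained residues at the MRL, using a model diet for a standard 60 kg consumer, and to check that this stays below the upper bound of the ADI. A minimal sketch follows; the model-diet portions and the per-tissue MRL values are illustrative assumptions, not taken from any specific JECFA evaluation:

```python
# Sketch: checking intake from MRL-compliant foods against an ADI.
# Model-diet portions and MRL values below are illustrative assumptions.

ADI_UPPER_UG_PER_KG_BW = 60.0  # upper bound of a hypothetical ADI, ug/kg bw/day
BODY_WEIGHT_KG = 60.0          # standard human

MODEL_DIET_KG = {              # assumed daily consumption per tissue, kg
    "muscle": 0.300, "liver": 0.100, "kidney": 0.050,
    "fat": 0.050, "milk": 1.500,
}
MRL_UG_PER_KG = {              # illustrative MRLs per tissue, ug/kg
    "muscle": 500, "liver": 500, "kidney": 10_000,
    "fat": 500, "milk": 1_500,
}

def theoretical_max_daily_intake(diet, mrls):
    """TMDI in ug/person/day: sum of portion x MRL over all tissues."""
    return sum(portion * mrls[tissue] for tissue, portion in diet.items())

tmdi = theoretical_max_daily_intake(MODEL_DIET_KG, MRL_UG_PER_KG)
ceiling = ADI_UPPER_UG_PER_KG_BW * BODY_WEIGHT_KG
print(f"TMDI = {tmdi:.0f} ug/day vs ADI ceiling = {ceiling:.0f} ug/day")
print("MRLs acceptable" if tmdi <= ceiling else "MRLs must be lowered")
```

With these assumed numbers the TMDI (2975 µg/day) stays within the ADI ceiling (3600 µg/day); if it did not, the risk assessor would recommend lower MRLs or a revised use pattern for the drug.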

The ADI and MRL are the primary outputs of a JECFA evaluation of veterinary drug residues.
Taking neomycin, which was considered at the 60th JECFA meeting, as an example, the
conclusions of the Committee with regard to the ADI, which is essentially the output of the
hazard characterization step, and the MRLs are presented in Table 2. This is the information
that is used by risk managers, be they the international risk managers in the guise of
Codex or risk managers at national level, to make decisions to minimise exposure to these
hazards and thereby minimise the risk to human health. Such information is normally made
available within one to two weeks of a JECFA meeting through a summary report placed on
the JECFA web pages of FAO and WHO. The deliberations and the scientific basis behind these
outputs are equally important, but as these are much more extensive and expansive documents,
more time is required to finalise them for public dissemination.

Table 2. Outputs of a JECFA assessment of residues of veterinary drugs using neomycin as an example.

Acceptable daily intake: The ADI of 0-60 µg/kg bw (established at the forty-seventh meeting of the
Committee (WHO, 1998)) was maintained.
Residue definition: Neomycin

Recommended maximum residue limits (MRLs)a

Species    Liver (µg/kg)    Kidney (µg/kg)    Milk (µg/kg)

Cattle     500              10 000            1500

aThe MRLs of 500 µg/kg for cattle muscle and fat and all other MRLs recommended at the forty-seventh meeting
of the Committee (WHO, 1998) were maintained.

4.1.1. The JECFA stepwise process

The establishment of ADIs and recommended MRLs for certain drugs when they are administered
to food-producing animals in accordance with good practice in the use of veterinary drugs
follows a stepwise process. Hazard identification, the first of these steps, relies on the
submission of data from sponsors, usually the manufacturers of veterinary drugs, as well
as data from the published literature. A range of data are collected and considered at this
stage, including the intrinsic toxicological properties of the hazard (i.e. the specific veterinary
drug), any available toxicological and pharmacokinetic studies in laboratory animals, and the
main metabolites of the drug and their potency compared to the parent substance. Other
issues to be considered at this stage include whether the residues are bound in some form,
which may impact on whether they are extractable and detectable. In cases where data are
limited, the possibility of extrapolating metabolic data from one species to another may

Towards a risk-based chain control 81

Sarah M. Cahill, Ezzeddine Boutrif and Maria de Lourdes Costarrica G.

have to be investigated. The usefulness of this depends primarily on the likelihood that
the drug is metabolized in the same or a similar way and on the mode of administration.

The hazard characterization similarly relies on the submission of data from sponsors. The
specifics of this step may vary depending on the veterinary drug being evaluated. For those
veterinary drugs with a long history of use, data which do not necessarily meet modern
criteria may be used (WHO, 1993). It is recognized that all evaluations must adequately address issues of
pharmacological effects, general toxicity, reproductive toxicity, embryotoxicity/fetotoxicity,
genotoxicity, carcinogenicity, other effects identified as being of importance, metabolism,
tissue residues and analytical methodology. The committee, therefore, developed a specific
approach for evaluating veterinary drugs with a long history of use that takes into account
each of these concerns. In general, the hazard characterization step requires toxicological
studies on the parent compound as well as information on the biological/toxicological
potency of the major metabolites. The issue of antimicrobial resistance is also considered in
a very specific context, that is whether the veterinary drug residues, when ingested, pose a
danger to human health by putting selective pressure on the microbial flora of the human
gut. Long discussions within the committee have led to the development of a decision-tree
for determining the potential adverse effects of residues of veterinary antimicrobial drugs
on the human intestinal microflora. Another issue considered at this stage is the allergic
potential of the compound (FAO/WHO, 2000).

The output of this step is normally a No Observed Effect Level (NOEL), i.e. the highest
dose at which no adverse effect is observed in the animal or human toxicological data.
From the NOEL an ADI is derived by applying a safety factor, by default 100, so that the
ADI represents the amount that can be ingested daily over a lifetime without appreciable
risk to human health.
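As a minimal sketch of this arithmetic (the NOEL value below is purely illustrative, and the 10 × 10 decomposition of the default factor is the conventional allowance for interspecies differences and human variability):

```python
def derive_adi(noel_mg_per_kg_bw: float, safety_factor: float = 100.0) -> float:
    """Derive an ADI (mg/kg bw per day) from a NOEL by applying a safety factor.

    The default factor of 100 is conventionally decomposed as 10 for
    interspecies differences times 10 for inter-individual human variability.
    """
    return noel_mg_per_kg_bw / safety_factor

# Illustrative only: a NOEL of 6 mg/kg bw per day yields an ADI of 0.06 mg/kg bw per day.
adi = derive_adi(6.0)
```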

The exposure assessment aims to estimate the intake of the residue via food consumption.
Currently the exposure assessment is considered to be extremely conservative due to the
assumptions that are used. These include assumptions such as all animals are treated at the
maximum of the recommended dose range for the maximum duration, all residues in foods
are at the MRL and residues are consumed daily for a lifetime. Furthermore, very high food
consumption estimates are used. Essentially, this comprises a food basket approach that
models human food consumption on the “safe” side (muscle: 300 g, liver: 100 g, kidney:
50 g, tissue fat: 50 g, eggs: 100 g, milk: 1.5 litre). This highly unlikely scenario results in
an unrealistically high exposure or intake estimate. The output of this step is called the
theoretical maximum daily intake (TMDI).
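The calculation can be sketched as follows; the basket amounts are those quoted above, while the MRL values are illustrative figures loosely based on the neomycin example (in µg/kg), not an authoritative data set:

```python
# Model food basket used for the TMDI (amounts from the text, in kg per day)
FOOD_BASKET_KG = {
    "muscle": 0.300, "liver": 0.100, "kidney": 0.050,
    "fat": 0.050, "eggs": 0.100, "milk": 1.500,
}

def tmdi_ug_per_day(mrls_ug_per_kg: dict) -> float:
    """Theoretical maximum daily intake (ug/person/day): every food in the
    basket is assumed to contain residues at the MRL and to be eaten daily."""
    return sum(FOOD_BASKET_KG[food] * mrl for food, mrl in mrls_ug_per_kg.items())

# Illustrative MRLs (ug/kg), loosely based on the neomycin example
mrls = {"muscle": 500, "liver": 500, "kidney": 10000,
        "fat": 500, "eggs": 500, "milk": 1500}
intake = tmdi_ug_per_day(mrls)   # 3025 ug/day
```

The TMDI is then compared with the intake permitted by the upper bound of the ADI for a standard body weight; in this illustrative case, an ADI of 0-60 µg/kg bw for a 60 kg person permits up to 3600 µg/day, so the sketched TMDI of 3025 µg/day stays below it.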

The unrealistic nature of the approach means that changes to this step are being discussed.
Currently a joint FAO/WHO project to “Update Principles and Methods for the Risk
Assessment of Chemicals in Food” is underway, investigating the procedures for risk
assessment for all classes of chemicals in food, including residues of veterinary drugs. While
a more realistic exposure scenario is desirable it is also recognized that any alternative
approach will require substantial amounts of data which are currently not readily available.
Furthermore, any changes may possibly lead to the reconsideration of the MRLs for a large
number of veterinary drugs. Thus, the discussion continues and it is unlikely that any big
changes will be introduced in the immediate future.

The final step in the risk assessment is the risk characterization which uses the outputs from
the hazard characterization and exposure assessment together with any other information
required. In evaluating veterinary drugs the data used at this stage in order to come to a
final conclusion include:

• Chemical identity and properties of the drug.
• Recommended dose level and frequency.
• Pharmacokinetic, metabolic and pharmacodynamic studies in laboratory and food-
producing animals and humans studies if and when available.
• Residue depletion studies with radiolabelled drug in target animals from zero withdrawal
time to periods extending beyond the estimated withdrawal time (total residues, including
free and bound residues, major residue components for selection of marker residue and
target tissue).
• Residue depletion studies with unlabeled drug (analysis of marker residue, formulations,
route of application, species, at the maximum recommended dose), including the analytical
methodology used.
• Mode of administration, dose, and formulation of the drug, which should be the same as
the proposed intended use(s).
• Meaningful statistical analysis of the data.
• Depletion of residues as a function of time to enable a comparison of the recommended
MRLs and the residues resulting under the established conditions of good practice in the
use of veterinary drugs.
• Routine analytical method(s) for regulatory purposes (sensitivity equal to or less than
the MRL; ideally ≤ 0.5 MRL).
• Impact of residues of antimicrobial agents on food processing.
• Concurrence with requirements of GLP guidelines.

The risk characterization step ultimately establishes the MRL based on available data. When
establishing an MRL it is desirable to determine MRLs for four main tissues, normally muscle,
fat, liver and kidney. For the purposes of national and international trade at least two target
tissues are required for sampling and thus MRLs are established for a minimum of two tissue
types. In some cases it is possible to harmonise the tissue MRL established for different
species when supported by the scientific data such as an indication of comparable metabolism
and ratios of marker residue to total residues between species. Another issue that needs to
be considered when establishing the MRL is the definition of “minor” species and “major”
species. A major species in one country is often a minor species in another country, or vice
versa, and the definition of a major or minor species may change with time and agricultural
practice.

The outputs of JECFA are made available in different formats. A summary report is issued
within one to two weeks of the meeting to make known the output of the committee's
work in a timely manner. A full report of the meeting (WHO Technical Report Series), a
residue monograph together with analytical methods (FAO Food and Nutrition Paper 41) and
toxicological monographs and intake assessments (WHO Food Additive Series) are published
subsequently. The outputs are also made available on the FAO and WHO webpages.

4.2. Risk management - Codex Alimentarius

The JECFA recommended MRLs are used by the relevant Codex committees in their standard
setting processes. In the case of veterinary drug residues the information is taken up by
the Codex Committee on Residues of Veterinary Drugs in Foods (CCRVDF). It goes through
the Codex step procedure, which provides several opportunities for countries to comment and
decide whether the recommended MRL is finally adopted as a Codex MRL (Figure 3). All Codex
Alimentarius MRLs for veterinary drug residues are then published and can also be found in
the Codex database (FAOSTAT data, 2004).

As a risk manager, the Codex Alimentarius Commission has also established a Recommended
International Code of Practice for Control of the Use of Veterinary Drugs (CAC, 1993). This code
sets out guidelines on the prescription, application, distribution and control of drugs used
for treating animals, protecting animal health and improving animal production. It includes
Good Practices in the Use of Veterinary Drugs (GPVD), including premixes for the manufacture
of medicated feedstuffs. Codex has also developed a Code of Practice on Good Animal Feeding
(CAC, 2004) which addresses, among many other issues, that of the use of veterinary drugs
in animal feedstuffs. It advises that the use of veterinary drugs in medicated feed should
comply with the provisions of the aforementioned Codex Recommended International Code of
Practice for the Control of the Use of Veterinary Drugs and notes that it may be important to
establish borderlines between feed additives and veterinary drugs used in medicated feed to
avoid misuse. It also notes that antibiotics should not be used in feed for growth promoting
purposes in the absence of a public health safety assessment. This latter recommendation
is based on the WHO Global Principles for the Containment of Antimicrobial Resistance in
Animals Intended for Food (WHO, 2000).

Figure 3. Process by which a recommended MRL established by JECFA becomes a Codex standard: the
recommended MRL enters the CCRVDF at Step 3, is adopted by the CAC at Step 5, returns to the CCRVDF
and is finally adopted by the CAC at Step 8 (or, under the accelerated procedure, at Step 5/8).

Despite the amount of work that has been undertaken by JECFA and Codex on residues of
veterinary drugs in foods and the establishment of MRLs, there remains a big gap between
the number of veterinary drugs available and the number for which an international MRL has
been established. The absence of an MRL can pose problems in terms of regulation and can
have an impact on trade in foods of animal origin. This was particularly highlighted by the
recent disruptions in food trade caused by the detection of trace amounts of certain residues
in animal products (Food Market Exchange, 2003).

In late 2001 and early 2002, several control laboratories in member countries of the European
Union detected trace amounts of chloramphenicol and nitrofurans in imported animal products
(e.g. shrimps, chicken). These findings were triggered mainly by improvements of analytical
methods which significantly lowered the levels of detection for residues of these drugs.
Following the European Union's safeguard provisions for imports of animal products, some
producers and producing countries were temporarily withdrawn from the list of approved
exporters, while others were forced to rapidly implement drastic measures. Such rapid
progress of analytical methods has resulted in large improvements in detection capabilities
of low residue levels of veterinary drugs, and has exposed gaps in the current national and
international regulatory systems, with major international trade implications.

In 2004, FAO and WHO convened a technical workshop to provide both organizations and
Codex with an analysis of the situation (FAO/WHO, 2004). The workshop pointed out that
decisive and innovative action, which is both realistic and flexible, is needed to address these
gaps and identified a number of areas where action is needed. These included alternatives to
using the limit of detection of the analytical method as the basis for regulatory actions; ways
to more rapidly bridge the gap between the number of veterinary drugs in use and those for
which MRLs are established and the need to improve the capacity of developing countries
in particular to enable more comprehensive regulation and control of the veterinary drugs
used in food production.

The meeting considered that the establishment of recommended performance levels (RPLs),
which consider the toxicological risk of the veterinary drug residue or the control strategy
chosen by the competent authority, and of thresholds of toxicological concern for residues
of veterinary drugs without ADIs or MRLs, may be a workable alternative to the current
situation, in which the limit of detection of the analytical methodology is so critical.

Bridging the gap in terms of quickly establishing international MRLs for all veterinary drugs
in use is a great challenge. A stepwise process to achieve this was considered. Firstly,
substances whose residues are generally recognised as highly toxic and which should not
be used as veterinary drugs have to be addressed at an international level and the CCRVDF
should identify those compounds not to be used in food animals. For veterinary drugs for
which national MRLs have been established it was recommended that work on international
MRLs for these veterinary drugs be completed within the next ten years. A possible means
of achieving this could be to work with JECFA to establish a list of temporary MRLs based on
national/regional evaluations, which after a certain time period could be made permanent
if the original evaluations were not put into question or JECFA was able to establish an ADI
and propose an MRL. Drugs which are seen as important in developing countries and have a
national approval should be assessed by a consultative process that may involve JECFA and
subsequently be added to the abovementioned list of temporary MRLs.

The workshop also noted that the regulatory frameworks can differ significantly amongst
countries in relation to the comprehensive nature of a regulatory control programme including
its MRLs for veterinary drugs. A number of measures were identified to overcome some of
these but their implementation is likely to require innovative approaches to capacity building.
Some possible measures and actions to address better coordination of capacity building
activities include increasing the availability and quality of information on international
standards and requirements of trading blocks for developing countries, support for the
establishment of regional reference laboratories and/or laboratory networks, and creation
of a network/platform and a mentorship approach to share experience, knowledge and data
between experts and officials from developed and developing countries.

4.3. Implementing control strategies – Capacity building

While committees such as JECFA can provide assessments of the risk associated with residues
of veterinary drugs in foods and recommend MRLs, and Codex can establish standards
and develop guidelines and codes of good practice, these will have no impact unless there is
adequate capacity at country level to implement them. As was recognised at the FAO/WHO
workshop in 2004 great differences exist among countries in terms of their regulatory and
control capacity.

Five building blocks have been identified as critical to any functioning and effective food
control system. These include (1) food law and regulations, (2) food control management,
(3) inspection services, (4) laboratory services (food monitoring and epidemiological data)
and (5) information, education, communication and training (FAO/WHO, 2003). Effective
control of chemicals in foods requires a functioning food control system as this provides
the basic infrastructure for food safety management. As a first step, Codex standards and
guidelines can be incorporated into national food law and regulations. This provides a legal
basis for further control activities. If a national food law is not already in existence, then it
will be necessary to establish one. A model food law has been developed by FAO and WHO
(FAO/WHO, 1976) to assist countries in establishing their own law by providing a basic
document which can then be tailored to their specific needs.

One of the differences highlighted by the recent trade disruptions due to residues was
the rapid development and implementation of analytical methods for detection of residues
in industrialised countries compared to less industrialised and developing countries. As
it is unrealistic to expect these countries to quickly reach a comparable level in terms of
development and application of new analytical methods, it is critical that they have the
facilities and the expertise to implement the well recognized methods for detection of
residues in foods. This is the basis for the establishment of a successful monitoring system for
chemicals in foods. A monitoring program allows a country to get an overview of its situation
in terms of chemical residues in foods thereby allowing it to determine and target effective
interventions. The success of such a program will depend on good planning, identification
of the residues to be tested, selection of the method of analysis to be used, judgement of
whether the measured level complies with the established regulatory MRL and, if not, the
action to be taken.
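A minimal sketch of this compliance-judgement step, with a hypothetical MRL and hypothetical monitoring results:

```python
def check_compliance(results_ug_per_kg, mrl_ug_per_kg):
    """Split monitoring results into non-compliant samples and a compliance
    rate relative to a regulatory MRL (hypothetical decision rule: any
    result above the MRL triggers follow-up action)."""
    non_compliant = [r for r in results_ug_per_kg if r > mrl_ug_per_kg]
    compliance_rate = 1 - len(non_compliant) / len(results_ug_per_kg)
    return non_compliant, compliance_rate

# Hypothetical monitoring results for a residue with an MRL of 100 ug/kg
samples = [12.0, 85.0, 101.5, 40.0, 250.0]
flagged, rate = check_compliance(samples, 100.0)   # flags 101.5 and 250.0
```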

As the needs in many countries are numerous in terms of improving capacity for monitoring
and control of veterinary drug residues, a number of meetings have been convened to
determine the best approaches to take to meet the country needs (FAO/IAEA, 2003, 2004).
It is recognized that there is a need to address policy makers and public health officials as well
as the scientists, technicians and related staff involved in a control program on a day-to-day
basis. As well as making specific recommendations to policy makers in developing countries
these meetings have also identified a number of areas where international organizations
could provide further assistance. These include assuring a level playing field for requirements
for analytical methods and laboratories, considering the limited resources available in many
developing countries and ensuring that veterinary drugs exported or donated to developing
countries are fit for use as well as establishing laboratory networks, regional reference
laboratories, forums for knowledge sharing and support for training.

While the capacity building programs of organizations such as FAO aim to address these
issues and have already implemented and continue to undertake technical assistance projects
to address these issues, it is clear that there is still work to be done in narrowing the gap
in terms of the regulatory systems in different countries. In order to narrow this gap the
assessment and control of chemicals in foods need to be a joint effort between those working
at national and international levels (Figure 4).

Figure 4. Contribution of national and international level activities to the assessment and control of
chemical residues in foods: assessment and evaluation, guidelines and tools, and good practices link
the international and national levels.

5. Conclusions

This chapter describes some of the activities that are underway at the international level
to address the issue of chemical residues in foods. While the ongoing work is substantial, it
is also clear that the issue of chemical residues in foods still remains an important concern
for consumers. In addition, chemical residues in foods can have a huge trade and economic
impact. The rejection of foods and closure of markets as a result of the detection of chemical
residues can be a particular blow to those countries still in the process of establishing their
export markets and which rely on revenue from such markets to improve their domestic
infrastructure and economic development. Scientific committees such as JECFA and JMPR
as well as the Codex Alimentarius Commission have an important role to play in ensuring the
availability of international standards which facilitate the regulation of chemical residues in
foods. However, it is important that international activities are not limited to this field but
are also extended to cover improvement in the capacity of developing countries to establish
the necessary programmes and infrastructure to meet international standards, protect the
health of their population and facilitate trade of their agricultural produce.

References


Angelova, V., Ivanova, R. and Ivanov, K., 2005. Study accumulation of heavy metals by plants in field condition.
Geophysical Research Abstracts 7, 03931.
Bernard, A., Hermans, C., Broeckaert, F., De Poorter, G., De Cock, A. and Houins, G., 1999. Food contamination by
PCBs and dioxins. Nature 401, 231-232.
CAC (Codex Alimentarius Commission), 1993. Recommended International Code of Practice for Control of the Use of
Veterinary Drugs (CAC/RCP 38-1993).
CAC, 2004. Code of Practice on Good Animal Feeding (CAC/RCP 54-2004).
CFIA (Canadian Food Inspection Agency), 2005. Health hazard alert - certain food products may contain Sudan dyes.
23 February, 2005.
China Economy, 2005. Sudan 1; Latest news, advice and recalls. 9th March, 2005.
FAO (Food and Agriculture Organization of the United Nations), 2000. Food safety and quality as affected by
animal feedstuff. 22nd FAO Regional Conference for Europe, Porto, Portugal, 24–28 July, 2000.
FAO, 1998. Animal Feeding and Food Safety. Report of an FAO Expert Consultation, Rome, 10–14 March, 1997. FAO
Food and Nutrition Paper No. 69, FAO, Rome.
FAO, 2001. Manual on the application of the HACCP system in mycotoxin prevention and control. FAO Food and
Nutrition Paper No. 73, FAO, Rome.
FAO/IAEA (International Atomic Energy Agency), 2003. Summary Report FAO/IAEA Workshop: “Strengthening
Capacities for Implementing Codex Standards, Guidelines and the Recommended International Codes of Practice
for the Control of the Use of Veterinary Drugs” (PFL/INT/858/PFL). 20–24 October, 2003, Vienna International
Centre, Austria.
FAO/IAEA, 2002. Report on the First Research Co-ordination Meeting of the Co-ordinated Research Project: “The
Development of Strategies for the Effective Monitoring of Veterinary Drug Residues in Livestock and Livestock
Products in Developing Countries”. 2-6 September, 2002, Vienna International Centre, Vienna, Austria.
FAO/WHO (World Health Organization), 2000. FAO/WHO Joint Expert Committee on Food Additives (JECFA): Procedures
for recommending maximum residue limits – residues of veterinary drugs in foods (1987-1999). Rome.
FAO/WHO, 1976. FAO/WHO Model Food Law.
FAO/WHO, 2003. Assuring Food Safety and Quality: Guidelines for Strengthening National Food Control Systems.
FAO Food and Nutrition Paper 76. FAO, Rome.
FAO/WHO, 2004. Technical workshop on residues of veterinary drugs without ADI/MRL. 24–26 August, 2004,
Bangkok, Thailand. FAO, Rome.
FAOSTAT data, 2004. CODEX ALIMENTARIUS: Veterinary Drug Residues in Food: Maximum residue limits. Last updated
13 April, 2005.
Folks, H. and Burson, D., 2001. Food Safety: Chemical Hazards. University of Nebraska Cooperative Extension.
Food Market Exchange, 2003. Shrimp: A review of the news in 2002.
Food Marketing Institute, 1989–97. Trends in the United States: Consumer attitudes and the supermarket.
FSA (Food Standards Agency), 2005. Consumer attitudes to food standards 2004 (wave 5). United Kingdom Report.
FSA, 2005. Sudan I: Latest news, advice and recalls.
FSAI (Food Safety Authority of Ireland), 2005. Food Safety Authority Issues Warning on Illegal Food Colourant:
SUDAN RED 1, 18 February, 2005.
Smith, D. and Riethmuller, P., 2000. Consumer concerns about food safety in Australia and Japan. British Food
Journal 102, 835-855.
WHO (World Health Organization), 2004. Report of a Joint FAO/OIE/WHO Expert Workshop on Non-Human Antimicrobial
Usage and Antimicrobial Resistance: Scientific assessment. Geneva, December 1–5, 2003.
WHO, 2000. Global Principles for the Containment of Antimicrobial Resistance in Animals Intended for Food. June,
2000, Geneva, Switzerland.
WHO, 1998. Evaluation of Certain Veterinary Drug Residues in Food. Forty-seventh Report of the Joint FAO/WHO
Expert Committee on Food Additives, WHO Technical Report Series No. 876. WHO, Switzerland.
WHO, 1993. Evaluation of Certain Veterinary Drug Residues in Food. Fortieth Report of the Joint FAO/WHO Expert
Committee on Food Additives, WHO Technical Report Series No. 832. WHO, Switzerland.
WTO (World Trade Organization), 1995. Agreement on the Application of Sanitary and Phytosanitary Measures. In:
The Results of the Uruguay Round of Multilateral Trade Negotiations; The Legal Texts. WTO, Switzerland.


Quantitative risk assessment of aflatoxicosis associated with milk consumption in Italy (2000-2004)

Marcello Trevisani1, Andrea Serraino1, Alessandra Canever1, Giorgio Varisco2 and Paolo Boni2
1Department of Veterinary Public Health and Animal Pathology, Faculty of Veterinary Medicine,
University of Bologna, via Tolara di Sopra 50, 40137 Ozzano Emilia, Bologna, Italy
2Istituto Zooprofilattico Sperimentale della Lombardia e dell’Emilia, Italy


Information was gathered to assess the risk of aflatoxin contamination in milk and to give
public risk managers tools for the evaluation of mitigation strategies that have already
been imposed, or could be imposed, to reduce risk. The assessment concerns milk for direct
consumption produced in Italy from 2001 to 2004; it also estimates the uncertainty
associated with the available data and determines whether the available data have been
critical in driving the overall risk assessment. Data on milk contamination were
generated by two independent control systems, namely a monitoring scheme conducted
by nationally relevant private industry during the period January 2001-July 2004 and an
extensive surveillance system implemented by Public Veterinary Services during the period
September 2003-July 2004 as a consequence of critical environmental conditions leading to
an “aflatoxin crisis in milk”. Both data sets concern raw milk produced in the North and Centre of
Italy during the last five years and comprise more than 9,000 milk samples, which were
analysed in two laboratories using validated ELISA methods. The mean estimated level of
aflatoxin M1 in milk observed during the crisis period was approximately 0.035 µg/kg in
both data sets (Industry and Public sampling plans) and 95th percentile values were 0.073
and 0.080 µg/kg respectively. These values were higher than those relative to data recorded
before September 2003 (mean = 0.027 µg/kg; 95th percentile = 0.080 µg/kg). The data have
been evaluated for their capability to represent the overall variability of the aflatoxin level
in milk produced in the region and for the associated degree of uncertainty (accuracy and
bias) of the analytical methods.

The relative amount of milk presenting different levels of aflatoxin contamination has also
been considered, because the monitoring data provided by industry were correlated with
records of milk weight. No processing step, apart from mixing, is capable of producing a
change in the aflatoxin level in the production of pasteurised or UHT milk. A mixing model
considering the capacity of storage tanks at the processing plant and the relative weight of
bulk milk supplied with different aflatoxin levels has produced probability distributions for
aflatoxin in milk. An exposure assessment concerning aflatoxin M1 in milk has been carried
out by using the contamination data in combination with milk consumption observed in the
Italian population. To this end data on milk consumption of the Italian Institute of Nutrition
have been used to produce second order parametric probability distributions for children
(1-9 y) adolescents (10-17 y) adults (18-64 y) and the elderly (>= 65 y). The aflatoxin

Towards a risk-based chain control 91

Marcello Trevisani, Andrea Serraino, Alessandra Canever, Giorgio Varisco and Paolo Boni

M1 hazard was characterized by using the carcinogen potency estimates outlined by the
WHO panel of experts. This panel has recently analysed all available toxicological studies
concerning aflatoxin M1 and has used the comparative experimental studies with aflatoxin
B1 to estimate its relative carcinogen potency. The carcinogen potency ranges should
encompass the different sensitivities to aflatoxin observed in the human population. The
genotypic and phenotypic variability is at the basis of these differences, but a mechanistic
modelling of cancer initiation and progression, which could directly account for them, cannot
be accomplished at present, because the available probabilistic quantitative data are
inadequate for risk assessment purposes.
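Returning to the exposure side, the volume-weighted mixing step described above can be sketched as follows; the consignment-level and weight distributions are illustrative placeholders, not the distributions fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def mixed_tank_levels(n_tanks: int = 10_000, consignments: int = 20) -> np.ndarray:
    """Aflatoxin M1 level in storage tanks, each filled by several bulk-milk
    consignments; mixing is the only step assumed to change the level."""
    # AFM1 level of each consignment (ug/kg), right-skewed (placeholder parameters)
    levels = rng.lognormal(mean=np.log(0.03), sigma=0.6, size=(n_tanks, consignments))
    # consignment weights (kg), reflecting different farm sizes (placeholder)
    weights = rng.uniform(500.0, 5000.0, size=(n_tanks, consignments))
    # tank level = weight-averaged consignment levels
    return (levels * weights).sum(axis=1) / weights.sum(axis=1)

tanks = mixed_tank_levels()
# mixing narrows the distribution: tank levels vary far less than consignment levels
```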

Carriers of the Hepatitis B virus are significantly more susceptible to aflatoxin carcinogenicity
than healthy individuals, and their prevalence in the population has been accounted for in
the risk assessment. However, many factors, including body weight, milk consumption and
the prevalence of Hepatitis B carriers, depend on the age of the exposed individuals and are
therefore correlated, and consequently have to be analysed by age class. In order to assess the risk for the
different age classes, we relied on: (1) the recent statistics relative to the prevalence of Viral
Hepatitis infections in Italy in different age classes published by the Public Authorities, (2)
the estimates of body weight based on results of a cross-sectional study of the Italian Society
of Paediatric Endocrinology and Diabetes and on reports of the Italian Institute of Statistics,
and (3) milk consumption data reported in a study by the Italian Institute of Nutrition. The
integration of the distributions generated in the exposure assessment and in the hazard
characterization by use of iterative simulations (Monte-Carlo LHS) have shown that due to
the low prevalence of Hepatitis Virus carriers, the estimates of risk never give cause for any
serious concern. The mean risk estimates for the number of cases of hepatocarcinoma per
million people, possibly related to aflatoxin contamination in milk, ranged from 5.77·10-3
to 1.17·10-3. On the basis of the contamination level observed over the last four years in
children (who as a result of high intake and lower body weight are more susceptible), the
possibility of milk consumption related HCC in children is irrelevant (p <0.01% or: having
0.04 extra cases per million children per year).
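A stripped-down version of such a simulation is sketched below; all distribution parameters and potency figures are illustrative placeholders (not the fitted values of the study), and plain Monte Carlo sampling stands in for Latin hypercube sampling:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # iterations

# Placeholder input distributions for one age class (children)
afm1_ug_per_kg = rng.lognormal(np.log(0.03), 0.5, n)            # AFM1 in milk
milk_kg_per_day = np.clip(rng.normal(0.25, 0.08, n), 0.0, None)  # daily milk intake
body_weight_kg = np.clip(rng.normal(30.0, 8.0, n), 10.0, None)   # body weight
hbv_carrier = rng.random(n) < 0.01                               # 1% HBV prevalence

# Potency: extra cancers per year per 100,000 people per ng AFM1/kg bw/day,
# higher for HBV carriers (placeholder values, assuming AFM1 at roughly
# one tenth of the AFB1 potency)
potency = np.where(hbv_carrier, 0.03, 0.001)

dose_ng_per_kg_bw = afm1_ug_per_kg * 1000 * milk_kg_per_day / body_weight_kg
extra_cases_per_million = 10 * potency * dose_ng_per_kg_bw   # rescale 1e5 -> 1e6
mean_risk = extra_cases_per_million.mean()                   # of the order of 1e-3
```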

The monitoring strategies and the veterinary provisions taken during the crisis period have
proved to be adequate to manage the risk. Continued monitoring of the aflatoxin content in
milk as part of industry-driven quality systems has proved to be very important for rapid alert
and for producing the most valuable data for quantitative risk assessment.

Keywords: aflatoxin M1, milk, risk assessment, risk management, carcinogenicity, quantitative
risk analysis, mathematical models

1. Introduction: identification of the problem and consideration of the


In the Autumn of 2003 abnormal aflatoxin B1 contamination was reported to have occurred
in maize grown in Italy. Drought-stressed maize and other concomitant factors such as
insect pest damage and inadequate moisture control of grains acted as facilitators of fungal
infection and caused critical conditions during that season. Maize represents 50% of the
concentrated feed normally used in Italy for lactating cows, and contamination with aflatoxin B1 was capable of raising the aflatoxin M1 level in the milk produced in this period. Monitoring activities revealed that the level of aflatoxin M1 often exceeded the statutory limit for milk (0.05 µg/kg). Surveillance activities were initiated soon after, in particular in the Po river valley, the most important milk-producing region in Italy (approximately 75% of the national production, and more than 50% of the milk consumed on the national market, originates in this area). The crisis prompted the Public Health Authorities (i.e. the Veterinary Services) to trace the sources of contamination and to oblige farmers to analyse feedstuffs produced on the farm and intended for lactating cows. Farmers were obliged to withdraw feedstuffs exceeding the statutory limit, and also to refrain from using maize close to this limit as part of a mixed feed for lactating cows. Maize at levels close to the 0.02 mg/kg limit set for this commodity, fed directly to the animals, can expose them to more aflatoxin B1 than the maximum level allowed for complete animal feed (0.005 mg/kg); especially in high-yielding cows, in which aflatoxin carry-over is higher, the risk of producing milk above the statutory limit is then high (EFSA, 2004). The results of the extensive monitoring plans implemented by industry, combined with active surveillance plans, permitted the detection and withdrawal of bulk milk consignments exceeding the statutory limit. Moreover, the farms that supplied such consignments were placed under control and their milk was checked at farm level until the aflatoxin level was driven below the limit through changes in the feedstuffs administered to the cows.

Concern about the capacity of the Public Laboratories to analyse thousands of samples in a short time prompted the Authorities to adopt rapid but reliable screening tests. Aflatoxin M1 ELISA tests were chosen for this purpose, but only after validation procedures proved they were capable of discriminating samples above or below 0.05 µg/kg.

The Public Authorities decided that all samples with aflatoxin M1 above 0.04 µg/kg were to be regarded as suspect, and controls at farm level consequently had to be initiated. HPLC confirmation of analytical results was performed at the beginning of the crisis in order to take action on the milk concerned, but it proved too time-consuming and expensive to serve as a routine control tool in this context. Nevertheless, comparative quantitative tests with artificially or naturally contaminated milk samples provided data that were used to validate ELISA as a quantitative method within the range 0.005-0.100 µg/kg of aflatoxin M1. Although the analytical effort was intensive, some weeks were needed for an initial screening of all farms, and Public Authorities and Industry were concerned about the quality of the milk being delivered every day to the processing plants.

The analytical effort was intended not only to detect farms producing milk above the statutory limit, but also to re-sample milk from farms causing concern so that bans could eventually be lifted. This was economically onerous for producers and led to the expensive destruction of banned milk. Because all feed commodities have to be tested to avoid further problems, and these checks can only be done for industrially produced feed or on individual farms once they have become involved, a systematic surveillance activity was set up not only to detect new farms involved in the crisis but also to exert further control over misuse of the contaminated feed still present after the lifting of milk bans.


Our objective in this risk assessment activity is to provide managers with a tool to estimate the actual risk for milk consumers in Italy, to evaluate the outcome of the Veterinary Provisions, and also to compare the current situation with that of previous years. Undoubtedly, consumer exposure should be considered over a longer period, because only prolonged exposure to aflatoxin actually increases the risk of cancer. Moreover, the aflatoxin potency used for risk assessment purposes has been determined on the basis of life-long experiments (or of epidemiological data) involving chronic exposure to very high levels of aflatoxin. Exposure varies over a lifetime because the level of aflatoxin in food commodities depends on agro-climatic changes in the regions where they are produced and also on public health provisions. With this in mind, inputs to the risk assessment model have also included data from sampling plans collected since the year 2000 within specific monitoring programmes carried out by both Industrial and Public Veterinary Laboratories. These have been evaluated for their ability to produce useful quantitative probability distributions, in that they can be regarded as representative of a production area.

2. Overview of the risk assessment

Acute aflatoxicosis is not a matter of concern because the level of aflatoxin in milk is low; only chronic exposure can cause disease, i.e. hepatocarcinoma. The major concern in Europe stems from the high level of milk consumption, but the relative risk is significantly affected by hepatitis B virus carrier status. Other possible risks, such as a reduction in immune response capability, are still poorly characterized and relate mostly to breast feeding in regions with exposure to high levels of aflatoxin B1 (Williams, 2004). In order to characterize the carcinogenic risk of aflatoxins we have to integrate the quantitative probability distributions generated in the exposure assessment and hazard characterization frameworks.

With regard to exposure assessment, several alternatives to external dose calculation (i.e. integration of food contamination and food consumption data) have been proposed, such as measurement of aflatoxin biomarkers (i.e. albumin adducts in peripheral blood and/or aflatoxin-DNA adducts in liver tissue), because these proved useful in modelling epidemiological data in dose-response studies (Montesano et al., 1997). The internal dose measurements reported in existing studies do not, however, provide a good quantitative measure of aflatoxin exposure in humans over the long term (WHO, 1998) and cannot discriminate between different sources of aflatoxins, such as milk or vegetables, which have to be investigated through nutritional studies. In this assessment only milk for direct consumption has been considered as a source of aflatoxin M1. Other factors, such as the concentration of aflatoxin during the processing of cheeses and other milk-derived products and estimates of their relative consumption, are the subject of extensive studies that remain outside the scope of the present risk assessment, although the results obtained by our research could provide the basis for further developments.


3. Estimation of aflatoxin level in milk

Data used to produce probability distributions for milk are often "the best data available", placing more emphasis on the accuracy of the analytical methods used than on the source of the data and the peculiarities of the season when the samples were taken. In fact, most milk testing is directed at detecting consignments above the statutory limit, and the criteria adopted to make samples representative of milk production in the region are often not reported or accounted for.

In analysing the variability observed between lots of milk, emphasis should instead be given to farm characteristics and management, as well as to recurrent agro-climatic changes, and sampling plans should be structured accordingly. Even in regions with a temperate climate, agro-climatic changes may create conditions under which the majority of farmers cannot effectively control the growth of Aspergillus in feed material, with a consequent overall increase in aflatoxin levels in both vegetable commodities and milk. Extensive data sets from monitoring plans allow more emphasis to be put on source variability than the results of official controls do. A screening method such as ELISA needs to be confirmed by an accepted reference method for regulatory purposes, but it can nevertheless provide useful quantitative data for risk assessment if the method's accuracy and bias, as assessed beforehand by appropriate validation procedures, are accounted for.

The ELISA method that provided data for this risk assessment had been validated within the range 0.005-0.100 µg/kg, and the mean recovery, assessed by analysing the reference samples, was 102.4%, with mean coefficients of variation of 6.2% and 7.5% in the two laboratories involved. During the crisis period, however, many samples showed aflatoxin concentrations above the validated limit of quantification. The cumulative empirical probability distribution generated from the data therefore had to refer to an assumed maximum, which increases the uncertainty regarding values above the range of measurements (Vose, 2001). We set the possible upper limit at 0.300 µg/kg on the basis of the available HPLC results and of ELISA readings outside the valid quantification range. We consider this assumption acceptable because the ELISA method adopted tends to overestimate values above the quantification limit, so the assumed maximum cannot produce an underestimation of risk. Uncertainty was introduced into the exposure assessment module by using a second-order empirical probability distribution.
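The bounded empirical distribution described above can be sketched as follows. This is a minimal illustration, not the study's model: the class counts are hypothetical placeholders rather than the monitoring data, and readings above the 0.100 µg/kg quantification limit are simply pooled into a final class capped at the assumed 0.300 µg/kg maximum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Class upper bounds in ug/kg; the last class pools all readings above the
# 0.100 ug/kg quantification limit, capped at the assumed 0.300 maximum.
bounds = np.array([0.010, 0.020, 0.030, 0.040, 0.050,
                   0.060, 0.070, 0.080, 0.090, 0.100, 0.300])
# Hypothetical sample counts per class (NOT the study's data)
counts = np.array([1200, 600, 300, 150, 90, 60, 40, 25, 20, 10, 17])

cdf = np.cumsum(counts / counts.sum())

def sample_conc(n):
    """Inverse-transform sampling: choose a class from the empirical CDF,
    then interpolate uniformly within that class interval."""
    idx = np.minimum(np.searchsorted(cdf, rng.random(n)), len(bounds) - 1)
    lower = np.concatenate(([0.0], bounds[:-1]))[idx]
    return lower + rng.random(n) * (bounds[idx] - lower)

x = sample_conc(100_000)
```

Pinning the last class to an explicit maximum keeps every simulated value inside the assumed range, so the censored upper tail cannot inflate the risk estimate beyond the stated bound.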

The capability of the data sets to represent the entire production area was assessed by comparing the variability in different sampling plans carried out independently in the same period and in overlapping areas with the same agro-climatic conditions. A first data set was generated by the monitoring activity of a nationally relevant dairy industry, aimed at controlling the quality of the raw milk processed in two large production units in Lombardy and Emilia. Aflatoxin was analysed in bulk milk collected from transport tanks (each combining milk from 2-6 farms) every two weeks for each area. A total of 2,512 samples, representative of 837 farms located in the North and Centre of Italy, were analysed from 2001 to July 2004. A second data set was generated by the surveillance activity carried out by the Public Veterinary Services and the "Istituto Zooprofilattico della Lombardia e dell'Emilia" during the aflatoxin crisis between September 2003 and July 2004 (MinSan, 2003; RER, 2004). We took into account the data from samples of bulk milk collected from the transport tanks supplying the processing plants, but not samples of individual farm milk or of bulk milk withdrawn as a consequence of veterinary provisions. Overall, 4,190 samples were analysed and included in this data set, relating to 180 milk processing plants (mostly cheese producers) that collected milk from 7,000 farms located in Lombardy. A third data set was recorded in the period June 1999-March 2000, when samples were taken from 438 transport tanks at 70 milk processing plants (for cheese production), approximately 5-6 tanks being checked per plant.

4. Modelling process of milk

Most studies aimed at evaluating the fate of aflatoxin M1 during milk pasteurization or
sterilization showed that the concentration is not appreciably reduced (Rustom, 1997) and no
reduction in the aflatoxin M1 level is observed after skimming milk (Yousef and Marth, 1989).
The only process currently applied that has an effect on the aflatoxin concentration in milk
for direct consumption is the mixing of bulk milk consignments of different contamination
levels. We have gathered data relative to the weight of each bulk milk consignment from
the Industry monitoring activity and the combination of amount and contamination data
has allowed a simulation of the concentration of aflatoxin in milk we would observe after
processing (mixing). We have used this model output for estimating the contamination
of milk at consumption level. We have had to simulate and not calculate the actual data
because samples were collected by transport tanks every day from two to three different
production areas and for every consignment. In order to simulate a mixing process we have
also accounted for the capacity of the different storage tanks (and their number) present
in the two processing units of Milan (Lombardy) and Bologna (Emilia). The other two data
sets have not been combined with the first one because we had not gathered any objective
information on the amount of each consignment of milk from the many small/medium/
large size plants the majority of which were intended for cheese production. Moreover the
surveillance sampling plan scheme adopted was addressing the need for a rapid screening of
all milk producers within the area and control of those farms involved in the crisis, rather than
a reassessment of each producer several times during the year at fixed time intervals. Instead,
we have used the data generated by the two independent sampling plans, which referred
however to overlapping production areas, to compare patterns (probability distributions) of
aflatoxin concentration. As is shown further on in our study no differences were observed
between data set 1 and 2 relative to aflatoxin M1 in milk.
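The mixing step can be illustrated with a short sketch. The tank capacity, consignment weights and concentrations below are hypothetical examples, not values from the Industry records; the function simply computes the weighted blend Σ(Ci·Wi)/V for a tank filled from successive consignments.

```python
def mix_tank(capacity, consignment_concs, consignment_weights):
    """Fill a storage tank from successive bulk consignments and return
    the aflatoxin M1 concentration of the blend, Sum(Ci*Wi)/V.
    Concentrations in ug/kg, weights and capacity in tonnes."""
    filled, toxin = 0.0, 0.0
    for conc, weight in zip(consignment_concs, consignment_weights):
        take = min(weight, capacity - filled)  # stop once the tank is full
        toxin += conc * take
        filled += take
        if filled >= capacity:
            break
    return toxin / filled

# Hypothetical example: three consignments blended in a 60 t tank
conc = mix_tank(60.0, [0.030, 0.080, 0.020], [25.0, 20.0, 30.0])
```

In this example the consignment at 0.080 µg/kg is diluted below the 0.05 µg/kg statutory limit by the two cleaner consignments, which is exactly the effect the simulated mixing step is meant to capture.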

5. Estimation of milk consumption

Milk consumption data used in this assessment were taken from a recent study carried out by the Italian Institute of Nutrition, based on a sample of 1,147 households corresponding to 2,734 individuals of different age, sex and area (Turrini et al., 2001). Consumption data are reported as mean, standard deviation and geometric mean. It was therefore possible to fit probability distribution curves to these parameters to estimate variability within the different age classes, which were analysed separately (i.e. 1-9, 10-17, 18-64 and 65+ years of age). The contaminant exposure associated with milk ingestion at different life stages was used to calculate stage-specific cancer risk by applying the carcinogenic potency without averaging exposure in the early stages of life over the other stages of the life span. This approach was used instead of the one proposed by JECFA, based on GEMS/Food regional diets, in which analysis rested on average regional data and childhood and adolescent exposure was prorated over the entire life span. In fact, in Italy, milk consumption is much higher in children and adolescents than in adults, and their lower body weight could increase the relative risk.
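Since the survey reports consumption as an arithmetic mean and standard deviation, fitting a lognormal curve amounts to converting those two parameters into the distribution's mu and sigma. A minimal sketch of that conversion follows; the intake figures are illustrative placeholders, not the survey's values.

```python
import math

def lognormal_params(mean, sd):
    """Convert the arithmetic mean and standard deviation of intake
    (e.g. g milk/day) into the mu/sigma parameters of the lognormal
    distribution with the same first two moments."""
    sigma2 = math.log(1.0 + (sd / mean) ** 2)
    mu = math.log(mean) - sigma2 / 2.0
    return mu, math.sqrt(sigma2)

# Hypothetical intake: mean 250 g/day, sd 180 g/day (not the survey's figures)
mu, sigma = lognormal_params(250.0, 180.0)
```

The round trip exp(mu + sigma²/2) recovers the arithmetic mean, which is a convenient check that the fitted curve matches the published summary statistics.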

6. Hazard characterization

Toxicological studies have provided sufficient evidence for the carcinogenicity of naturally occurring mixtures of aflatoxins and of aflatoxins B1, G1 and M1, whereas there is limited evidence for aflatoxin B2 and inadequate evidence for aflatoxin G2 (IARC, 1993). Aflatoxin B1 is usually found in the greatest concentration in animal feeds, and the most important metabolite in milk is aflatoxin M1, the 4-hydroxy metabolite of aflatoxin B1. The International Agency for Research on Cancer has evaluated the results of in vitro and in vivo toxicological studies on the aflatoxin metabolites found in milk and has concluded that aflatoxin M1 is a possible human carcinogen (class 2B). Studies indicate that aflatoxin M1 has hepatotoxic and hepatocarcinogenic potential (JECFA, 2001). Its acute toxicity appears similar to or slightly less than that of aflatoxin B1, but the M1 level in milk is too low to produce symptoms of acute intoxication (Krishnamachari et al., 1975; Edds, 1973; WHO, 1998). Long-term exposure to the relatively low levels of aflatoxin M1 occurring in milk and milk products can, on the other hand, pose a hazard because of its hepatocarcinogenic potential. Major concern about aflatoxin M1 in milk exists in countries where consumption of this commodity is high, such as the European countries (WHO, 1998); the greatest concern is for babies, children and adolescents, owing to their higher ratio of milk intake to body weight, and for carriers of viral hepatitis infections (i.e. hepatitis B).

Hazard characterization of aflatoxin M1 has been performed by JECFA (2001), but no epidemiological studies are available that bear on the relationship between the risk of primary liver cancer (PLC) and the intake of aflatoxin M1; risk characterization has therefore been based on studies evaluating the relative carcinogenic potency of aflatoxins M1 and B1 in animal models. The use of animal species in the risk characterization of carcinogenic substances is common, but there are differences, including dose, frequency of exposure, genotype and consequently metabolism. It has been observed, for example, that the relative risk of hepatocarcinoma (HCC) is higher when people exposed to aflatoxin B1 are also carriers of hepatitis B virus, as evidenced by the presence of viral antigen in the blood (HBsAg+); the carcinogenic potency has been estimated to be 30 times higher than in non-infected people. However, only a few animal species are naturally infected by the hepatitis B virus. Although hepatitis B itself may cause hepatocarcinoma, not all infected individuals develop the cancer, and other factors, such as aflatoxin B1 exposure, may act as multiplicative risk factors (Sylla et al., 1999).


In addition to liver infection/inflammation due to viral hepatitis, genetic variability has a strong effect on aflatoxin susceptibility, as a consequence of the capability of bio-transforming aflatoxin B1 into excretable metabolites. Thus, among rodents, rats are naturally sensitive but mice are not, a difference related to a high level of glutathione S-transferase iso-enzymes (Wild and Turner, 2002), and the animal models used for hazard characterization must be evaluated for their capability of reproducing the molecular mechanisms of carcinogenesis possible in humans. To this end, advances in genetic engineering have led to the development of mouse models of cancers that simulate human cancer and can also reproduce the metabolic changes observed in human liver cells as a consequence of hepatitis B infection (Sell, 2003). Most studies concern the carcinogenesis mechanism of the most potent aflatoxin, B1, but some show that aflatoxin M1, like aflatoxin B1, is capable of forming DNA adducts both in vivo and in vitro, leading to mutagenesis (Marien et al., 1987; Shinahara et al., 1995; WHO, 1998). However, aflatoxin M1 is a poorer substrate for epoxidation and is therefore less mutagenic than aflatoxin B1. Furthermore, it is known that the epoxidation of aflatoxin depends on the expression levels of cytochrome enzymes in the human liver, that the expression of some cytochromes is polymorphic, and that extra-hepatic metabolism of aflatoxin, particularly in the small intestine, may be important in modulating the toxic and carcinogenic effect (Wild and Turner, 2002).

Aflatoxin B1 ingested by animals and humans is partially absorbed, and its bio-transformation proceeds through a cytochrome P450-mediated oxidation that produces hydroxy-derivatives analogous to aflatoxin M1 and 8,9-epoxide derivatives. These epoxide derivatives are highly reactive with DNA and produce adducts. Although DNA adduct formation is a well known step in mutagenesis and in the possible initiation of carcinogenesis, adduct formation need not result in cancer initiation, because the DNA repair nuclease, DNA polymerase and DNA ligase can repair the damage. In other cases carcinogenesis is initiated and is subsequently affected by proto-oncogenes and other factors, with cancer development and progression to malignancy as a possible outcome (Sell, 2003). Epidemiological studies in the human population have correlated a high aflatoxin content in the diet with a specific mutation at the third base of codon 249 of the p53 gene (yielding a serine at position 249). This gene encodes a protein that plays a relevant role in the cell cycle: it blocks cell division, enhances DNA repair, increases apoptosis (cell death) and blocks growth-activating proteins (Sell, 2003). How mutation of the p53 gene may affect human cancer development is well described in Vogelstein et al. (2002) and Hussain and Harris (2000).

It has been pointed out that the metabolic steps of aflatoxins, and the variability introduced by genetic differences, greatly influence the rate of DNA adduct formation and consequently genotoxicity, but common metabolic pathways have been observed in animal and human tissues. The mutational properties of primary AfB1-DNA adducts have been studied (Bailey et al., 1996; Smela et al., 2002). It was shown that AfB1 8,9-epoxide reacts with N7 guanyl moieties of DNA such as those present in the p53 gene, and that destabilization of 8,9-dihydro-8-(N7-guanyl)-9-hydroxy aflatoxin B1 results in the formation of AfB1 formamidopyrimidine (FAPY) and of an apurinic site. It was shown that, in the presence of bypass polymerases, these DNA adducts may lead bases to be incorporated, often incorrectly, opposite the site of DNA damage. It is supposed that in times of crisis cells may be induced to express the bypass polymerases and that, if gene mutations fall within the restriction site of DNA, defective nucleotide


sequences are produced (Smela et al., 2002). It was also shown that FAPY adducts have a more subtle effect on DNA architecture and may profoundly affect the local melting point of the DNA duplex; thus AfB1-N7-gua adducts may arrest repair. Different species of FAPY adduct are formed: the major adducts could be responsible for cell toxicity (acute toxicosis), whereas the minor adducts could be responsible for mutagenicity. A mix of major and minor adducts, which is not as lethal as the major adducts on their own, could be responsible for carcinogenesis, while DNA or protein adducts (blocked enzymes) could be responsible for acute aflatoxin toxicity.

The effect of glutathione S-transferase polymorphisms on aflatoxin-induced carcinogenesis has been investigated in some recent studies (Bian et al., 2000; Sun et al., 2001). These enzymes are thought to be factors in individual susceptibility to aflatoxin-associated cancer because they may regulate an individual's ability to metabolize and detoxify the exo-epoxide of aflatoxin, reducing the possibility of DNA adduct formation. These authors showed that the aflatoxin-related risk of developing hepatocarcinoma (HCC) may be modulated by GST (GSTT1 and GSTM1) genotypes. Both of these iso-enzymes have null genotypes, and the GSTT1 non-null genotype is a risk factor for HCC (Bian et al., 2000). The authors pointed out that an association between chronic exposure to aflatoxin (individuals with a high serum AFB1-albumin adduct level) and HCC risk was observed among HBsAg chronic carriers who had GSTM1 and/or GSTT1 null genotypes, exhibiting absence of enzymatic activity, but not among those who had non-null genotypes. In particular, a functional GSTT1 genotype was associated with an increased risk of aflatoxin-associated HCC that is not directly related to the level of aflatoxin adduct in the serum. The authors attributed the higher risk of HCC in HBsAg chronic carriers with a non-null GSTT1 genotype to the production of carcinogenic metabolic intermediates, favoured by the interplay between aflatoxin induction of GSTT1 and the inflammatory reaction consequent to liver injury caused by hepatitis B infection. Another genetic polymorphism that could affect HCC risk in individuals exposed to aflatoxin B1 is epoxide hydrolase (Guengerich et al., 1996). Enzymes of this family are in fact capable of hydrolysing aflatoxin 8,9-epoxide to a dihydrodiol, but their role has not been strongly supported by experimental studies (Wild and Turner, 2002).

The genotoxicity of aflatoxin B1 has also been studied in F344 rats, in which a mutation different from that at codon 243 of the rat p53 gene (comparable to codon 249 in humans) was observed, affecting instead codon 12 of the ki-ras gene (Riley et al., 1997). Ki-ras is known to be a proto-oncogene active in the control of malignant transformation, and species differences in the oncogenic mutations caused by the same carcinogen could underlie a different molecular mechanism of oncogenesis, which should be considered in toxicological studies aimed at defining carcinogenic potential. If confirmed, these studies show that the rat model used in the comparative assessment of aflatoxins B1 and M1 has a different molecular mechanism of oncogenesis, although it might produce comparable outcomes.

In addition to the molecular mechanisms of oncogenesis in humans and animals, the level of exposure has also received attention. A study carried out by Sotomayor et al. (2003) on the effect of intermittent exposure to aflatoxin B1 on nucleic acid adduct formation in rat liver shows that the binding of AfB1 to hepatic DNA is a linear function of the ingested dose, both after continuous exposure and after cycles of dosing and rest. This assessment could be extended to the varying levels of aflatoxins found in human food, and it is important because exposure to the varying levels of aflatoxin observed in humans would produce the same outcome as the constant levels used in animal experiments; this would confirm the validity (with respect to dosing) of the previous animal experiments used to estimate the relative carcinogenic potency of aflatoxin M1.

The studies reported here show that present knowledge can only partially shed light on the molecular mechanisms of aflatoxin-associated hepatocarcinogenesis, and that genetic studies in the exposed population will be required to produce mechanistic models. Probability distributions for the different events in target organ metabolism (activation/detoxification), initiation (mutagenesis), progression (activation of regulatory genes), cancer development and malignant transformation cannot be hypothesised at present, but once available they will clarify the conflicting results of previous toxicological studies. For now, carcinogenic potency may be used as a surrogate (mathematical) model for carcinogenesis in humans, but it remains essentially a black box.

Other factors concerning the use of carcinogenic potency should be considered. The risk of cancer due to the various forms of aflatoxin is based on the cumulative lifetime dose, and the carcinogenic potency refers to an increase in the probability of cancer that is a function of the dose of the contaminant (i.e. the amount ingested). The overall dose of contaminant ingested during the experimental study period (i.e. 2 years for rats), over which an increased number of cancer cases was observed in comparison with the controls, is divided by the number of days (i.e. 365 x 2), and the dose is standardized for body weight (i.e. µg/kg b.w. per day). The slope of the line between the baseline risk at dose zero (D0) and the added risk at doses D1 to Dn is the dose-response relationship, which may or may not be linear. For carcinogens that are mutagens, like aflatoxins, no dose without response can be assumed in the risk assessment (no biological threshold level), and a linear approach to dose-response assessment is generally recommended. This approach, i.e. linear extrapolation from a dose within the experimental range to a much lower dose range, known as the model-free approach, is often used in assessing equivalency (relative potency) between carcinogens, but other methods of selecting a point of departure for extrapolation to low doses, such as the LED10 and LED25, have been proposed for modelling dose-response curves from animal bioassays (EPA, 1999). The studies of Cullen et al. (1987) were discussed by JECFA (1999) and used to evaluate the carcinogenicity of dietary aflatoxin M1 in Fischer rats compared with aflatoxin B1, and to extrapolate these results to humans using the model-free approach. These studies showed that the carcinogenic potential of aflatoxin M1 is one order of magnitude (one tenth) lower than that of aflatoxin B1. The potency estimates for aflatoxin B1 based upon human epidemiological data lie in the range 0.05-0.5 for HBsAg+ individuals (carriers of hepatitis B virus) and 0.002-0.035 for HBsAg- individuals, and the experts assumed that values of 0.3 and 0.01 are the best estimators of carcinogenic potency for individuals carrying or not carrying the HB virus, respectively. The carcinogenic potency of aflatoxin B1, and that of aflatoxin M1 (one tenth that of aflatoxin B1), is used to express the relationship between the dose of the contaminant in the diet (i.e. ng/kg body weight per day) and the incidence of primary liver cancer (new cases of hepatocarcinoma per year).
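The potency calculation described above can be sketched as follows, using the JECFA convention of potency expressed as cancers per year per 100,000 people per ng aflatoxin B1/kg body weight per day, with aflatoxin M1 taken as one tenth as potent. The HBsAg+ prevalence and the dose used in the example are hypothetical placeholders, not the study's inputs.

```python
# JECFA best-estimate potencies: cancers/year per 100,000 people
# per ng aflatoxin B1/kg bw per day
POTENCY_B1_HBSAG_POS = 0.3    # HBsAg+ carriers
POTENCY_B1_HBSAG_NEG = 0.01   # non-carriers
M1_RELATIVE_POTENCY = 0.1     # aflatoxin M1 ~ one tenth as potent as B1

def hcc_cases_per_million(dose_ng_kg_bw_day, hbsag_prevalence):
    """Expected extra hepatocarcinoma cases per million people per year
    for a given aflatoxin M1 dose (ng/kg bw per day), weighting the two
    potency estimates by the HBsAg carrier prevalence."""
    potency_b1 = (hbsag_prevalence * POTENCY_B1_HBSAG_POS
                  + (1.0 - hbsag_prevalence) * POTENCY_B1_HBSAG_NEG)
    potency_m1 = M1_RELATIVE_POTENCY * potency_b1   # per 100,000 people
    return potency_m1 * dose_ng_kg_bw_day * 10.0    # rescale to per million

# Hypothetical example: 0.5 ng/kg bw/day at 1% HBsAg+ prevalence
cases = hcc_cases_per_million(0.5, 0.01)
```

With these illustrative inputs the result is of the order of 10⁻³ extra cases per million people per year, the same order of magnitude as the mean estimates quoted in the abstract.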


7. Risk model and calculation

An Excel spreadsheet model was created in which the inputs for assessing exposure were represented by distributions:
(1) monitoring data of aflatoxin concentration in bulk raw milk;
(2) records of milk consignment weights;
(3) capacity of the milk storage tanks at the Bologna and Milan milk processing plants;
(4) milk consumption data based on the study of the Italian Institute for Research on Food and Nutrition;
(5) anthropometric data for children and adolescents (body weight, height) from a study by the Italian Society for Paediatric Endocrinology and Diabetes;
(6) body mass index of the Italian adult population (from statistical reports of the Italian Institute for Statistics);
(7) prevalence of HBsAg carriers among pregnant women who attended antenatal screening in public and private hospitals in six Italian regions during 2001;
(8) prevalence of HBsAg carriers among children after application of vaccination programmes in Italy;
(9) demographic data (number of people of different ages) from the Italian Institute of Statistics (year 2002);
(10) carcinogenic potency of aflatoxin M1 (estimates of the JECFA panel of experts).
An outline of the model constructed to assess the risk is shown in Figure 1.

8. Aflatoxin concentration in milk

On the basis of the data set 1 a second order cumulative distribution was used to model
variability/uncertainty of concentration of aflatoxin in raw milk. The validity of this data
set concerning samples systematically collected by Industry since 2001 is justified by the
evidence provided in the comparing of data sets 1 and 2 for the same period (September 2003-
July 2004), which produced the same probability distributions (Table 1 and Figure 2).

To produce the cumulative probability distribution, the relative frequency of the data in classes
1 to i (corresponding to the intervals 0-10, 10-20, … 90-100, >100 ng/kg) was calculated to define
a discrete distribution. The probability for each class, and the uncertainty due to sampling,
were then estimated with a Beta(k+1, n-k+1) function, where n is the total number of
samples tested and k is the number of samples in the class (Vose, 2001). The probability
values were normalised so that they summed to 1, and the discrete probabilities
were converted into a cumulative distribution. One hundred Latin Hypercube samples were
taken from the cumulative distribution and imported back into the spreadsheet model. These
data were then used to perform multiple simulations of uncertainty (Fx1 to i) using the
@RISK RiskSimtable function (Vose, 2001), whereas variability was estimated with a
Cumulative(1, 2, … i; Fx1, 2, … i) function with minimum and maximum set at 0 and 300 ng/kg.
The maximum value was set above the highest recorded measurements of aflatoxin
concentration, even though these lay outside the range of the validated method. To
account for the uncertainty deriving from the analytical method, the Excel function
NORMINV(RAND(), a·x, b·x) was used. It makes reference to the simulated value 'x' for aflatoxin
concentration generated by the cumulative function and, by iterative calculation, produces
values randomly sampled from a Normal(a·x, b·x) distribution, where 'a' and 'b' are respectively
the mean recovery and the coefficient of variation of the analytical method.
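The construction described above can be sketched in Python rather than the original Excel/@RISK workbook. The class counts below are hypothetical stand-ins for the monitoring data set; the Beta(k+1, n-k+1) sampling uncertainty and the Normal(a·x, b·x) analytical error follow the description in the text, with assumed values for the recovery and coefficient of variation.

```python
import random

# Hypothetical class counts per 10 ng/kg interval (0-10, 10-20, ... 90-100,
# >100 ng/kg), standing in for the industry monitoring data described above.
CLASS_EDGES = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 300]  # ng/kg
CLASS_COUNTS = [210, 320, 280, 180, 120, 70, 40, 25, 15, 10, 5]

def sample_concentration(rng, recovery=1.0, cv=0.10):
    """One second-order draw of aflatoxin M1 concentration (ng/kg).

    Per-class sampling uncertainty is modelled as Beta(k+1, n-k+1), as in
    Vose (2001); analytical error as Normal(a*x, b*x), with a = mean
    recovery and b = coefficient of variation (both assumed values here).
    """
    n = sum(CLASS_COUNTS)
    # Uncertain class probabilities, renormalised to sum to 1.
    probs = [rng.betavariate(k + 1, n - k + 1) for k in CLASS_COUNTS]
    total = sum(probs)
    probs = [p / total for p in probs]
    # Variability: pick a class from the cumulative distribution, then a
    # value within its interval.
    u, acc, x = rng.random(), 0.0, CLASS_EDGES[-1]
    for (lo, hi), p in zip(zip(CLASS_EDGES, CLASS_EDGES[1:]), probs):
        acc += p
        if u <= acc:
            x = rng.uniform(lo, hi)
            break
    # Analytical uncertainty, truncated at zero.
    return max(0.0, rng.gauss(recovery * x, cv * x))

rng = random.Random(1)
draws = [sample_concentration(rng) for _ in range(10000)]
```

Each call is one combined draw of sampling and analytical uncertainty; repeating the whole procedure with fresh Beta draws corresponds to the second-order simulation in the text.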

Towards a risk-based chain control  101

Marcello Trevisani, Andrea Serraino, Alessandra Canever, Giorgio Varisco and Paolo Boni

Figure 1. Schematic diagram of the structure of the model used in the risk assessment. The model links the following components:

- Concentration (Xraw) of aflatoxin M1 in bulk raw milk (classes from 10 to 100 or >100 ng/kg): Cumulative(0, 0.300, xraw 0.002 to 0.100 µg/kg; F(x)), where 0.300 µg/kg is the assumed maximum level for samples >0.100 µg/kg; a second-order model accounting for uncertainty due to sampling and analytical error.
- Weight (W) of bulk milk at concentration X; mixing of bulk milk in storage tanks: sum of aflatoxin equivalents (Ci · Wi) up to the capacity (Vx) of a randomly chosen storage tank, Xproc = Σ(Ci · Wi)/Vx, with Vx = Discrete(v1-r, p1-r).
- Consumption of milk (IM) per day for children/adolescents/adults/the elderly: IM = Lognormal(mu, stdev), with uncertainty simulated by bootstrap.
- Daily intake of aflatoxin with milk: DAF = Xproc · IM.
- Carcinogenic potency in carriers of Hepatitis B virus, PCa = Pert(0.005, 0.03, 0.05), or in non-infected individuals, PCb = Pert(0.0002, 0.001, 0.003).
- Prevalence of Hepatitis B carriers in the population: HBsAg+ = Uniform(min, max); HBsAg- = 1 - HBsAg+.
- Body weight (Bw) for children/adolescents/adults/the elderly: Bw = Discrete(Bage, Fage), where Bage = Triang(5%, 50%, 95% percentile weights) and Fage is the frequency of people in each population age class.
- Output: new hepatocarcinoma cases per year per million people = DAF · [PCa · (HBsAg+) + PCb · (HBsAg-)]/Bw.

The cumulative probability distributions for aflatoxin in milk, relative to the different data
sets, are reported in Figure 2a-d, whereas the classical statistical results are reported in
Table 1.

9. Production module

The model uses the Excel function VLOOKUP to relate the aflatoxin concentration to the
weight of bulk milk. In the production module this function makes reference to the weights
of bulk milk samples whose aflatoxin concentrations are close to the simulated


Table 1. Level of aflatoxin M1 (ng/kg) observed in milk samples.

                    Data set 1        Data set 1        Data set 2        Data set 3
Period              Sep. 03-May 04    Jan. 01-Aug. 03   Oct. 03-Jul. 04   Jun. 99-Mar. 00
Origin              Centre-North It.  Centre-North It.  Lombardy          Lombardy
Mean                33.57             29.39             34.65             20.38
Standard deviation  28.17             53.36             33.92             16.56
Median              29.12a            16.61b            29.63a            17.84c
95th percentile     72.63             79.51             79.67             45.74
Number of samples   1,275             1,237             4,190             2,344

a,b,c Medians with different superscripts proved to be different by the Kruskal-Wallis test; p<0.01.

ones, producing an estimate of the aflatoxin equivalents (ng) added to the tanks feeding the
pasteurizer. The amount of milk in the vats is computed by simulating the filling of the
tanks to their maximum capacity with individual bulk milk consignments at varying aflatoxin
levels; the tank is chosen with Discrete(V1 to r, p1 to r), referring to the capacities and
numbers of tanks present in the two milk processing plants of Milan and Bologna. The output
of the production module is the concentration of aflatoxin in the processed milk, calculated
by dividing the sum of aflatoxin equivalents (total aflatoxin content) by the amount of milk
filling each tank. The cumulative probability distribution for aflatoxin in pasteurized milk
(calculated from the industrial monitoring data for the period 2001-2004) is reported in
Figure 2b. The model assumes that the aflatoxin content of milk pack units produced from the
milk in a given vat is uniform; this assumption was verified by testing the aflatoxin level
in milk pack units produced from the milk present in a vat (data not reported).
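A minimal Python sketch of this tank-mixing step is shown below. The consignment concentrations, weights and tank capacities are invented for illustration; the actual model draws them from the Milan and Bologna plant records described in the text.

```python
import random

# Hypothetical consignments (aflatoxin ng/kg, weight kg) and tank
# capacities; the real model uses plant records from Milan and Bologna.
CONSIGNMENTS = [(12.0, 8000), (45.0, 6000), (80.0, 3000),
                (25.0, 10000), (110.0, 1500), (30.0, 7000)]
TANK_CAPACITIES = [15000, 20000, 30000]  # kg, drawn with equal probability

def mix_tank(rng):
    """Fill a randomly chosen storage tank with random consignments and
    return the aflatoxin concentration of the blended milk (ng/kg)."""
    capacity = rng.choice(TANK_CAPACITIES)
    filled, equivalents = 0.0, 0.0
    while filled < capacity:
        conc, weight = rng.choice(CONSIGNMENTS)
        weight = min(weight, capacity - filled)  # top the tank up exactly
        equivalents += conc * weight             # ng of aflatoxin added
        filled += weight
    return equivalents / filled                  # Xproc = sum(Ci*Wi) / V

rng = random.Random(7)
blended = [mix_tank(rng) for _ in range(1000)]
```

The blending naturally narrows the distribution: each tank value is a weighted average of its consignments, which is why the pasteurised-milk distribution in Figure 2b is tighter than the raw-milk one.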

10. Food consumption module

The mean, standard deviation and geometric mean of milk consumption in the Italian
population, analysed separately for children, adolescents, adults and the elderly, are
reported in a study by Turrini et al. (2001). The original data set was not available, but it
was evident that a normal distribution could not be assumed to describe the variability: the
report tables show a very large spread of the data around the mean (standard deviation values
close to the arithmetic means) and the geometric means were lower than the arithmetic means.
The observed probability distributions could not have been normal, because they encompass the
many factors affecting milk consumption, such as body weight, metabolism/lifestyle and age.
On this basis, and after examining which distributions best fitted the reported parameters,
we assumed that milk consumption data follow a lognormal distribution.

Figure 2. Frequency distribution of aflatoxin level in milk: (a) raw milk, period 2001-2004; (b) pasteurised milk, period 2001-2004; (c) data set 1 and (d) data set 2, period September 2003-July 2004.

Because the geometric mean is a more stable parameter than the arithmetic mean in
lognormal distributions fitted to a limited number of samples, we used the geometric
means and the standard deviations for each age class to define the two-parameter lognormal
distributions used in our risk assessment. The uncertainty linked to this assumption was
accounted for with a bootstrap method (Vose, 2001): 500 bootstrap samples were generated,
by iterative calculation, from the lognormal distributions defined by the parameters reported
in the study cited above; the means and standard deviations of these bootstrap samples were
then used to simulate uncertainty over 10,000 Monte Carlo (Latin Hypercube) iterations; and
the output data sets were fitted (with the Anderson-Darling best-fit function in @RISK) to
obtain a second-order distribution for the mean and standard deviation of the two-parameter
lognormal distribution describing variability in milk consumption. The probability
distributions for milk consumption in children and adolescents are reported in Figure 3.
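The bootstrap step can be illustrated as follows. The mean and standard deviation passed in are hypothetical (not the published Turrini et al., 2001 values), and the stdlib `lognormvariate` stands in for the @RISK machinery; the mean/SD conversion to lognormal parameters is the standard moment-matching formula.

```python
import math
import random

def lognormal_params(mean, sd):
    """Convert an arithmetic mean/SD pair to lognormal mu/sigma."""
    sigma2 = math.log(1.0 + (sd / mean) ** 2)
    return math.log(mean) - sigma2 / 2.0, math.sqrt(sigma2)

def bootstrap_consumption(mean, sd, n_boot=500, n_obs=200, seed=0):
    """Generate bootstrap (mean, SD) pairs from a fitted lognormal,
    mimicking the uncertainty step described in the text."""
    rng = random.Random(seed)
    mu, sigma = lognormal_params(mean, sd)
    stats = []
    for _ in range(n_boot):
        sample = [rng.lognormvariate(mu, sigma) for _ in range(n_obs)]
        m = sum(sample) / n_obs
        v = sum((x - m) ** 2 for x in sample) / (n_obs - 1)
        stats.append((m, math.sqrt(v)))
    return stats

# Hypothetical consumption parameters (g/day), not the published values.
boot = bootstrap_consumption(208.0, 180.0)
```

The spread of the 500 (mean, SD) pairs is what the text then summarises with a fitted second-order distribution.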

Figure 3. Milk consumption in: (a) children; (b) adolescents; (c) adults; (d) the elderly.

11. Body weight module

Distributions used to describe the body weight of Italians were based on data from two
studies. The first is a cross-sectional study of the Italian Society for Paediatric
Endocrinology and Diabetes covering a sample of 27,421 girls and 27,374 boys, aged 6-20
years, from 16 of the 20 Italian regions (Cacciari et al., 2002). The second is a report of
the Italian Institute of Statistics on health problems and risk factors, including
overweight (ISTAT, 2002). The body mass index (BMI = body weight / height²) reported in this
study refers to the population over 18 years of age. These data allowed an estimation of
body weight in adults, taking the height data at 18 years of age reported in the previous
study, which can be assumed definitive (body weight = BMI · height²). Available results of
cross-sectional studies on the weight of children between 2 and 6 years old are reported in
the WHO Global Database on Child Growth and Malnutrition, but these data were collected some
20 years ago and are no longer representative of the present population. Therefore
height/weight paediatric reference growth charts were used in this study for infants (1-6
years old). All the data sets reported percentile values. A triangular distribution defined
by the 5th, 50th and 95th percentiles was used to estimate body weight variability. The
probability distribution for body weight is reported in Figure 4 and the aflatoxin intake
from milk consumption estimated for the different age classes is reported in Figure 5.
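A sketch of the Triang(5th, 50th, 95th percentile) body-weight draw described above; the percentile weights are made-up placeholders, not the values from the growth charts or ISTAT data.

```python
import random

# Hypothetical (5th, 50th, 95th) percentile weights in kg per age class;
# the real model takes these from growth charts and ISTAT data.
PERCENTILES = {"children": (12.0, 20.0, 34.0),
               "adolescents": (32.0, 52.0, 74.0),
               "adults": (52.0, 70.0, 94.0),
               "elderly": (51.0, 70.0, 94.0)}

def sample_body_weight(age_class, rng):
    """Draw a body weight from Triang(5th, 50th, 95th percentile), as the
    text describes for each age class."""
    p5, p50, p95 = PERCENTILES[age_class]
    return rng.triangular(p5, p95, p50)  # (low, high, mode)

rng = random.Random(3)
weights = [sample_body_weight("children", rng) for _ in range(5000)]
```

Note that using the 5th and 95th percentiles as hard minimum and maximum truncates the tails; this is a deliberate simplification of the triangular approximation, not a property of the underlying weight data.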


Figure 4. Frequency distribution for body weight in: (a) children; (b) adolescents; (c) adults; (d) the elderly.

12. Demographic data

To simulate the consumption of milk at varying levels of aflatoxin M1, population data
elaborated by the Italian Institute for Statistics (ISTAT, 2004) were used to produce
discrete probability distributions. These distributions were then used to produce, by Monte
Carlo (Latin Hypercube) iterative calculation, samples of individuals from the different age
classes. This simulation was used in the risk characterization module to simulate the
probability that individuals of different ages, with different body weights and milk
consumption habits, drink milk with varying levels of aflatoxin contamination.
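Drawing simulated consumers from a discrete age-class distribution might look like this; the population shares below are illustrative, not the ISTAT 2004 census figures.

```python
import random

# Hypothetical age-class shares of the population (the real model uses
# ISTAT 2004 census counts) turned into a discrete sampling distribution.
AGE_CLASSES = ["children", "adolescents", "adults", "elderly"]
SHARES = [0.10, 0.09, 0.62, 0.19]

rng = random.Random(11)
# Draw a simulated cohort of consumers for the risk characterization step.
cohort = rng.choices(AGE_CLASSES, weights=SHARES, k=100_000)
adult_share = cohort.count("adults") / len(cohort)
```

Each simulated consumer is then given a body weight and a milk consumption value from the age-class-specific distributions of the previous modules.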

13. Prevalence of carriers of Hepatitis B virus in the Italian population

Hepatitis B prevalence data in Italy were taken from three recent epidemiological studies.
We used data on a sample of 10,881 pregnant women who attended HBsAg antenatal screening in
public and private hospitals in six Italian regions during 2001 to estimate the prevalence
of Hepatitis B carriers among adults (Stroffolini et al., 2003). It was assumed that
prevalence was comparable between females and males throughout adult life and into old age.
The overall HBsAg prevalence reported for pregnant women was 1.7% (95% CI: 1.4-1.9). Data
reported in two studies evaluating the impact of universal vaccination programmes in Italy
were used to estimate prevalence in children and adolescents (Bonanni et al., 2003; Da Villa
et al., 1998). The estimated prevalence was 0-0.7% in children and 0.7-0.9% in adolescents.
A Uniform(min, max) distribution was used to simulate the prevalence of carriers in each age
class.

Figure 5. Aflatoxin intake (µg/kg body weight) in: (a) children; (b) adolescents; (c) adults; (d) the elderly.

14. Risk characterization module

Carcinogenic potency can be used as a surrogate (mathematical) model for carcinogenesis in
humans. The risk of cancer due to the various forms of aflatoxin is based on the cumulative
lifetime dose. However, as carcinogenic potency is related to aflatoxin intake, which depends
on milk consumption and body weight as well as on the level of milk contamination, it was
decided to analyse the different phases of human consumption habits separately (i.e.
children, adolescents, adults, the elderly) and to add the cancer risk for each life period
to yield a total lifetime cancer risk (Ginsberg, 2003). The average potency for aflatoxin M1
estimated by the WHO panel of experts (JECFA, 1999; WHO, 1998) was 0.03 cancers/100,000 per
year per ng/kg body weight per day for carriers of Hepatitis B virus (range 0.005-0.05) and
0.001 cancers/100,000 per year per ng/kg body weight per day for non-infected individuals
(range 0.0002-0.0035). These dose-response relationships were used to estimate the
additional number of hepatocarcinoma cases in the Italian population associated with
aflatoxin in milk for direct consumption. The formula adopted by WHO (1998) to estimate the
number of HCC cases (per 100,000 people per year) was:

Nr. HCC cases = AFM1 ng/kg bw · (pHBsAg+ · potHBsAg+ + pHBsAg- · potHBsAg-)

where potHBsAg+ and potHBsAg- are sampled from PERT distributions (minimum, most likely
value, maximum); pHBsAg+ and pHBsAg- are the prevalences of carriers and non-carriers of
Hepatitis B; and AFM1 ng/kg bw is the ratio between the amount of aflatoxin M1 ingested and
the body weight of the individual.

Results of the risk characterization are reported in Figure 6. Demographic data allow an
estimation of the number of hepatocarcinoma cases in the Italian population (cases per
100,000 people per year for each age class, multiplied by the number of individuals in each
age class).
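The WHO (1998) formula can be evaluated as in the sketch below. Stdlib triangular distributions approximate the PERT shapes of the model, and the intake and prevalence values are example inputs, not results from the assessment.

```python
import random

def hcc_cases_per_100k(afm1_ng_per_kg_bw, prev_hbsag, rng):
    """Expected additional HCC cases per 100,000 people per year for a
    given daily aflatoxin M1 intake (ng/kg body weight) and HBsAg+
    prevalence, following the WHO (1998) formula quoted above. The PERT
    potency distributions are approximated by stdlib triangulars."""
    pot_pos = rng.triangular(0.005, 0.05, 0.03)      # HBsAg+ carriers
    pot_neg = rng.triangular(0.0002, 0.0035, 0.001)  # non-infected
    return afm1_ng_per_kg_bw * (prev_hbsag * pot_pos
                                + (1.0 - prev_hbsag) * pot_neg)

rng = random.Random(5)
# Example inputs: 0.1 ng/kg bw/day intake, 1.7% carrier prevalence.
cases = [hcc_cases_per_100k(0.1, 0.017, rng) for _ in range(10000)]
```

Averaging many such draws, and weighting by the demographic distribution, yields the age-class case estimates shown in Figure 6.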

Figure 6. Cases of hepatocarcinoma per million people per year in: (a) children; (b) adolescents; (c) adults; (d) the elderly.


15. Discussion

Analysis of the data from before and after September 2003 shows a significant increase in
the mean aflatoxin level in milk samples, but the distribution tail (95th percentile) shows
that samples above the statutory limit had been detected sporadically in previous years as
well. Data sets 1 and 2 (industrial monitoring and official surveillance) collected in the
period September 2003-July 2004 showed the same patterns, demonstrating that the monitoring
data reproduced the variability observed in much greater detail by the extensive
surveillance plan, on the assumption that critical conditions were similar in Lombardy, in
the other regions of the Po river valley, and in other regions of Central Italy. The data
recorded in data set 1 up to August 2003 show that the mean aflatoxin level was lower, but
the 95th percentile was similar to that recorded subsequently, proving that sporadic samples
above the statutory limit of 0.05 µg/kg of aflatoxin in milk did occur and that a systematic
monitoring activity maintained throughout the different seasons was able to detect them. The
data recorded in Lombardy in the period 2000-2001, on the other hand, which show a
relatively low aflatoxin level, were not collected using a scheduled sampling scheme
involving all the farmers supplying milk, and this could have affected the distribution
curve (i.e. its upper tail).

The risk assessment outputs produced using data set 1 (period 2001-2004) show that the
estimated mean risk (i.e. the additional number of hepatocarcinoma cases per year per
million people exposed to aflatoxin M1) varies from 6.23 × 10⁻³ cases per million in
children to 1.23 × 10⁻³ cases per million in adults. The differences are due to the higher
milk consumption and lower body weight of children compared with adults; however, the low
prevalence of hepatitis carriers reduces the carcinogenic potency of aflatoxin M1 in this
population stratum. By modelling the data from before and after September 2003 separately,
it was estimated that the mean risk in children had roughly doubled, from 4.33 × 10⁻³ to
8.89 × 10⁻³ cases per million, as a consequence of the "aflatoxin crisis". These estimates
give an idea of how much the relative risk increased during the aflatoxin crisis that
occurred in Italy in 2003-2004, and they support the effectiveness of the present management
strategies: because the majority of samples above the statutory limit were detected and
removed from the milk production chain, the protection of the Italian population from the
aflatoxin hazard was assured. Even though their lower body weight and higher milk
consumption place younger people at greater risk, the very low prevalence of Hepatitis B
carriers among children and adolescents, a consequence of the vaccination campaign launched
more than ten years ago, has strongly reduced the relative risk of aflatoxin M1.

15.1. Public health management tools and control strategies

Contamination by aflatoxin-producing moulds often arises in the field during the pre-harvest
phase of crops, when the regional climate provides the conditions (temperature and
moisture/water activity) necessary for growth of the fungi. Higher levels of aflatoxins are
therefore found in raw feedstuffs produced in tropical or subtropical regions, but increased
A. flavus infection and aflatoxin production in the field can also be observed in maize
produced in regions with a mild climate, like Italy, during unfavourable seasons.


Periods of drought and heat stress, especially during pollination and kernel maturation,
predispose the plants to increased A. flavus infection and aflatoxin production in the field,
and kernels damaged by insects are more prone to A. flavus infection (Brown et al., 1999;
Chen et al., 2004). Maize kernels are highly susceptible to increased aflatoxin
contamination if not dried to a suitable moisture content. Post-harvest, fungal growth and
aflatoxin production cease when grain moisture falls below 13% and aw < 0.8 at 20 °C;
appropriate drying of the maize kernels, and separation of damaged ones (which can cause
local moisture increases during storage), is therefore fundamental for control of the
hazard. However, insect pests may act as promoters or facilitators of fungal infection
during storage. Pests feeding on stored material can cause local heating and moisture
generation through their metabolic activity. These warmer, damper areas provide ideal
conditions for the initiation and development of local fungal growth and mycotoxin
production (Riley and Norred, 1999; Lopez-Garcia et al., 1999; Reyneri et al., 2004).
Localised effects such as these lead to the development of heterogeneous conditions in large
bulk stores. Heterogeneity, and the resultant difficulty in obtaining representative
analytical samples, is a central issue in storage technology (Commission Directive
98/53/EC). A complicating factor is that once growth is initiated (perhaps in a localised
region of sufficiently high moisture content) the metabolic water produced by the growing
fungus can perpetuate and amplify the process.
Once formed, mycotoxins are relatively difficult to remove using available decontamination
methods and precise mycotoxin analysis is costly and often too slow to be of real use in
many commodity chains (FAO, 2001; Park et al., 1999). The amount of aflatoxin M1 (the
hydroxylated metabolite of aflatoxin B1) in milk can be estimated approximately from the
level of contamination of the cows' feed using regression equations (e.g. ng aflatoxin M1
per kg of milk = 1.9 + 1.19 · µg aflatoxin B1 ingested per day). Therefore, if the daily
feed ration contains 35% maize at 20 µg/kg of aflatoxin B1 (the limit for aflatoxin
contamination in maize intended for feed production) and 8 kg of this compound feedstuff is
given per cow per day, the aflatoxin B1 intake adds up to 56 µg per cow per day and the
estimated amount of aflatoxin M1 will approximate 0.068 µg/kg of milk produced. Applying the
same formula, the statutory limit of 0.05 µg/kg of aflatoxin M1 is reached with a daily
ration containing 40.5 µg of aflatoxin B1, i.e. approximately 2 kg of maize flour at 20
µg/kg (Bertocchi et al., 2004; Caggioni and Pietri, 1999). Farmers need to make this
estimate when aflatoxin levels in maize approach the statutory limit and they wish to
produce complete feedstuffs for lactating cows on the farm. Directive 2003/100/EC
established that in complete feedstuffs for lactating cows the total amount of aflatoxin B1
must not exceed 0.005 mg/kg (relative to feedstuffs with a moisture content of 12%), but the
necessary analysis cannot easily be carried out by individual farmers, and other factors,
such as the stage of lactation, milk yield and udder health status, also affect the
carry-over of aflatoxin into milk, complicating an accurate prediction of the level of
contamination of bulk milk at the farms.
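The feed-to-milk arithmetic in this paragraph can be checked with a small worked example using the regression quoted in the text (the function names are illustrative, not part of any standard tool):

```python
# Worked check of the carry-over arithmetic quoted in the text, using the
# regression  ng aflatoxin M1 per kg milk = 1.9 + 1.19 * (ug B1 per day).

def afm1_in_milk(afb1_ug_per_day):
    """Predicted aflatoxin M1 in milk (ng/kg) from daily B1 intake."""
    return 1.9 + 1.19 * afb1_ug_per_day

def afb1_for_limit(afm1_ng_per_kg):
    """Daily B1 intake (ug) that yields a given M1 level in milk."""
    return (afm1_ng_per_kg - 1.9) / 1.19

# 8 kg/day of compound feed containing 35% maize at 20 ug/kg B1:
daily_b1 = 8 * 0.35 * 20             # about 56 ug B1 per cow per day
m1 = afm1_in_milk(daily_b1)          # about 68.5 ng/kg = 0.068 ug/kg milk
b1_at_limit = afb1_for_limit(50.0)   # about 40.4 ug/day at the 0.05 ug/kg limit
```

This reproduces the figures in the paragraph: roughly 0.068 µg/kg of milk from the example ration, and a daily B1 intake of around 40.5 µg at the statutory M1 limit.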

The carry-over rate of aflatoxin from contaminated feed into the milk of dairy cows varies
from 0.5-0.6% up to 6.2% in high-yielding cows. The highest percentages are found under
conditions of greater permeability of the alveolar cell membranes, such as in the early and
late lactation stages of cows with high milk production, up to 40 L per day. A much lower
carry-over (0.5-0.6%) was shown in studies conducted with cows producing 10-20 L of milk per
day (EFSA, 2004). After withdrawal of the contaminated feed, clearance of aflatoxin from
animal tissues is relatively fast and low aflatoxin levels in milk can be observed within
72-96 hours. Control based on the inspection of feed commodities at farm level, i.e.
excluding feeds with visible mould growth, is useful but inadequate to completely prevent
the intake of contaminated feed. Aflatoxin M1 appears in the milk of cows and other
lactating animals, peaking within 2 days after ingestion of a contaminated commodity. These
facts lead to the conclusion that farmers need analytical support during emergency
conditions, and that adequate (systematic) monitoring programmes planned by industry are the
only tool to rapidly discover critical conditions, whereas public surveillance should
intervene to adopt protective provisions and to support farmers in solving the problems they
face. The importance of improved agricultural practice must also be emphasized, as it is an
essential tool to protect feed commodities.

The Codex Alimentarius Commission could not reach consensus among countries regarding a
maximum level for aflatoxin M1 in milk. Some countries, including the European Union, had
adopted the maximum level of 0.05 µg/kg proposed by the 30th Codex Committee on Food
Additives and Contaminants (CCFAC, 2001), while other countries, including the US, had
adopted a maximum level of 0.5 µg/kg on the basis of the risk assessment conducted by JECFA
at its 56th session (JECFA, 2001; EC, 2001). That risk assessment compared the levels of
0.05 µg/kg and 0.5 µg/kg using monitoring data from all regions of the world, and concluded
that the additional risk of liver cancer predicted for a maximum level between 0.05 and 0.5
µg/kg was negligible. On present knowledge, the fulcrum of the risk assessment is the
carcinogenic potency estimate, which has to be considered weak (EC, 2000). Therefore, while
studies of carcinogenicity are shedding light on the molecular mechanisms involved, research
towards a mechanistic risk assessment model still lags behind; the adoption of a
precautionary approach is advisable for the time being, and the efforts already made to
reduce the prevalence of Hepatitis B carriers are of the utmost importance.

16. Conclusions

16.1. What has been achieved

Monitoring sampling plans maintained over several years and involving all milk producers
provide adequate data for risk assessment, even when the aflatoxin quantification data are
produced with rapid tests such as ELISA. Indeed, the uncertainty related to the analytical
method can be accounted for provided the method has been validated within an adequate
quantification range. Monitoring plans can also detect critical conditions, which may become
widespread in unfavourable agro-climatic seasons, and permit the implementation of more
stringent sampling plans (surveillance activity). The effectiveness of veterinary provisions
can be evaluated by maintaining monitoring during the critical season, and the data can be
used to assess the increased risk. Recent milk consumption and anthropometric data enabled
the implementation of a model for quantitative exposure assessment. Moreover, recent
prevalence data on Hepatitis B carriers can be used to characterize the risk associated with
aflatoxins. The vaccination campaign launched in Italy more than ten years ago has
substantially reduced the risk related to aflatoxins.


16.2. What needs to be done

The range of quantification of the method used to determine aflatoxins should be extended so
that levels exceeding the present limit can be measured, producing data that can be used as
input for risk assessment. Lack of knowledge of the genetic variability in the human
population, and the need for further advances in the understanding of the molecular
mechanisms of aflatoxin-related carcinogenesis, hamper the adoption of mechanistic models.
The uncertainty in the assumptions underlying the carcinogenic potency concept therefore
cannot be defined, and the sources of the epidemiological data (population genetics) can
drive the dose-response curves. The use of such data should therefore be carefully evaluated
and, in the absence of an alternative, the adoption of the precautionary principle is
justified.


References

Bailey, E.A., Iyer, R.S., Stone, M.P. and Harris, T.M., 1996. Mutational properties of the primary aflatoxin B1-DNA
adduct. Proc. Natl. Acad. Sci. USA 93, 1535-1539.
Bertocchi, L., Biancardi, A., Boni, P. and Bonacina, C., 2004. Emergenza aflatossine nella provincia di Brescia:
esperienza di campo. L’Osservatorio 7, 4-13.
Bian, J.C.B., Shen, F.M., Shen, L., Wang, T.R., Wang, X.H., Chen, G.C. and Wang, J.B., 2000. Susceptibility to
hepatocellular carcinoma associated with null genotypes of GSTM1 and GSTT1. World Journal of Gastroenterology
6, 228-230.
Bonanni, P., Pesavento, G., Bechini, A., Piscione, E., Mannelli, F., Berucci, C. and Lo Nostro, A., 2003. Impact of
universal vaccination programmes on the epidemiology of hepatitis B: 10 years of experience in Italy. Vaccine
21, 685-691.
Brown, R.L., Chen, Z.Y., Cleveland, T.E. and Russin, J.S., 1999. Advances in the Development of host resistance in
corn to aflatoxin contamination by Aspergillus flavus. Phytopathology 89, 113-117.
Cacciari, E., Milani, S., Balsamo, A., Dammacco, F., De Luca, F., Chiarelli, F., Pasquino, A.M., Tonini, G. and Vanelli,
M., 2002. Italian cross-sectional growth charts for height, weight and BMI (6-20 y). European Journal of Clinical
Nutrition 56, 171-180.
Caggioni, C. and Pietri, A., 1999. Le aflatossine nel latte: dove nasce il problema e come prevenirlo. L’informatore
agrario 45, 45-50.
CCFAC, 2001. Comments submitted on the draft maximum level for aflatoxin M1 in milk. Codex Committee on Food
Additives and Contaminants. Thirty-third Session, The Hague, The Netherlands, 12-16 March 2001. CX/FAC
Chen, Z.Y., Brown, R.L. and Cleveland, T.H., 2004. Evidence for an association in corn between stress tolerance
and resistance to Aspergillus flavus infection and aflatoxin contamination. African Journal of Biotechnology
3, 693-699.
Commission Directive 98/53/EC of 16 July 1998 laying down the sampling methods and the methods of analysis
for the official control of the levels for certain contaminants in foodstuffs. Official Journal of the European
Communities, L 201/93, 17.7.98.
Cullen, J.M., Ruebner, B.H., Hsieh, L.S., Hyde, D.M. and Hsieh, D.P., 1987. Carcinogenicity of dietary aflatoxin M1
in male Fisher rats compared to aflatoxin B1. Cancer Res. 47, 1913-1917.
Da Villa, G., Piccinino, F., Scolastico, C., Fusco, M., Piccinino, R. and Sepe, A., 1998. Long-term epidemiological
survey of hepatitis B virus infection in a hyperendemic area (Afragola, southern Italy): results of a pilot
vaccination project. Res. Virol. 149, 263-270.


Decreto Legislativo 10 maggio 2004, n. 149. Attuazione delle direttive 2001/102/CE, 2002/32/CE, 2003/57/CE
e 2003/100/CE, relative alle sostanze ed ai prodotti indesiderabili nell’alimentazione degli animali. Gazzetta
Ufficiale N. 139 del 16 Giugno 2004.
EC, 2000. First Report on the harmonisation of Risk Assessment Procedures. Part 2: Appendices 26-27 October 2000
(published on the internet on 20.12.2000. http://
EC, 2001. European Community Comments for the Codex Committee on Food Additives and Contaminants, The Hague,
12-16 March 2001 – Agenda item 15 a) Comments on the Draft Maximum Level for Aflatoxin M1 in milk. (http://
Edds, G.T., 1973. Acute aflatoxicosis: a review. J. Am. Vet. Med. Assoc. 162, 304-309.
EFSA, 2004. Opinion of the Scientific Panel on Contamination in the Food Chain on a request from the Commission
related to Aflatoxin B1 as undesirable substance in animal feed. The EFSA Journal 39, 1-27.
EPA, 1999. Guidelines for carcinogen risk assessment. NCEA-F-0644, July 1999, Review Draft. Risk assessment forum.
U.S. Environmental Protection Agency. Washington D.C.
FAO, 2001. Manual on the application of HACCP system in mycotoxin prevention and control. FAO, Food and Nutrition
paper n. 73.
Ginsberg, G.L., 2003. Assessing Cancer Risk from Short-Term Exposure in Children. Risk Analysis 23, 19-34.
Guengerich, F.P., Johnson, W.W., Ueng, Y.F., Yamazaki H. and Shimada T., 1996. Involvement of cytochrome P450,
glutathione S-transferase, and epoxide hydrolase in the metabolism of aflatoxin B1 and relevance to risk of
human liver cancer. Environ Health Perspect. 104 Suppl 3, 557-562.
Hussain, S.P. and Harris, C.C., 2000. Molecular epidemiology and carcinogenesis: endogenous and exogenous
carcinogens. Mutation Research 462, 311-322.
IARC, 1993. Some Naturally Occurring Substances: Food Items and Constituents. Heterocyclic Aromatic Amines, and
Mycotoxins. IARC Monographs on the Evaluation of Carcinogenic Risk of Chemicals to Humans, vol. 56. Lyon,
France: International Agency for Research on Cancer. 571 p.
ISTAT, 2002. Indagine multiscopo sulle famiglie. “Condizioni di salute e ricorso ai servizi sanitari”. Anni 1999-2000.
ISTAT, 2004. 14° Censimento Generale della Popolazione e delle Abitazioni. http://dawinci/
JEFCA, 1999. Evaluation of certain food additives and contaminants. Forty-ninth report of the Joint FAO/WHO Expert
Committee on Food Additives. WHO Technical Report Series 884, WHO, Geneva (1999).
JEFCA, 2001. Evaluation of certain mycotoxin in food. Fifty-sixth report of the Joint FAO/WHO Expert Committee
on Food Additives. WHO Technical Report Series 906, WHO, Geneva (2002).
Krishnamachari, K.A.V.R., Bhat, R.V., Nagerajan, V. and Tilak, T.B.G., 1975. Hepatitis due to aflatoxicosis. An outbreak
in Western India. Lancet 305, 1061-1063.
Lopez-Garcia, R., Park, D.L. and Philliphs, T.D., 1999. Integrated mycotoxin management systems. In: Food, Nutrition
and Agriculture. FAO Food and Nutrition Division 23, p. 38-48.
Marien, K., Moyer, R., Loveland P., Van Holde, K. and Bailey, G., 1987. Comparative binding and sequence interaction
specificities of aflatoxin B1, aflatoxicol, aflatoxin M1, and aflatoxicol M1 with purified DNA. The Journal of
Biological Chemistry 262, 7455-7462.
MinSan, 2003. Nota del Ministero della Salute prot. N. 614/1774/388 del 12/12/2003, “Linee guida direttrici per
il controllo delle aflatossine nei mangimi e nel latte”.
Montesano, R., Hainaut, P. and Wild, C.P., 1997. Hepatocellular Carcinoma: From Gene to Public Health. J. Nat.
Canc. Inst. 89, 1844-1851.
Park, D.L., Njapau, H. and Boutrif, E., 1999. Minimising risks posed by mycotoxins utilizing the HACCP concept. In:
Food, Nutrition and Agriculture. FAO Food and Nutrition Division 23, p. 49-55.
RER, 2004. Lettera 536 Prot. N. VET/04/13901 del 13/04/2004, “Sistema Regionale di sorveglianza per la presenza
di micotossine nei cereali, mangimi, latte, prodotti a base di latte”.

Towards a risk-based chain control  113

Marcello Trevisani, Andrea Serraino, Alessandra Canever, Giorgio Varisco and Paolo Boni

Reyneri A., Blandino M., Vanara F., Matta A., Lazzari A., Alma A., Lessio F., Ferrero C., Minelli L., Cavallero A.,
Turletti A., Spanna F. and Bersani L., 2004. Impiego di tecniche agronomiche per contenere la contaminazione
da micotossine nella granella di mais. L’Informatore Agrario 14, 53-57.
Riley, J., Mandel, G., Sinha, S., Judah, D.J. and Neal, G.E., 1997. In vitro activation of the human Harvey-ras proto-
oncogene by aflatoxin B1. Carcinogenesis 18, 905-910.
Riley, R.T. and Norred, W.P., 1999. Mycotoxin prevention and decontamination – a case study on maize. In: Food,
Nutrition and Agriculture. FAO Food and Nutrition Division 23, p. 25-32.
Rustom, I.Y.S., 1997. Aflatoxin in food and feed: occurrence, legislation and inactivation by physical methods.
Food Chemistry 59, 57-67.
Sell, S., 2003. Mouse model to study the interaction of risk factors for human liver cancer. Cancer Research 63,
Shinahara, T., Ogawa, H.I., Ryo, H. and Fujikawa, K., 1995. DNA-damaging potency and genotoxicity of aflatoxin
M1 in somatic cells in vivo of Drosophila melanogaster. Mutagenesis 10, 161-164.
Smela, M.E., Hamm, M.L., Henderson, P.T., Harris, C.M., Harris, T.M. and Essigman, J.M., 2002. Proc. Natl. Acad.
Sci. USA 99, 6655-6660.
Sotomayor, R.E., Washington, M., Nguyen, L., Nyang’anyi, R., Hinton, D.M. and Chou, M., 2003. Effect of intermittent
exposure to aflatoxin B1 on DNA and RNA adduct formation in rat liver: dose-response and temporal patterns.
Toxicological Sciences 73, 329-338.
Stroffolini, T., Bianco, E., Szklo, A., Pernacchia, R., Bove, C., Colucci, M., Coppola, R.C., D’Argenio, P., Lo Palco,
P., Parlato, A., Ragni, P., Simonetti, A., Zotti, C. and Mele, A., 2003. Factors affecting the compliance of the
antenatal hepatitis B screening programme in Italy. Vaccine 21, 1246-1249.
Sun, C.H., Wang, L.Y., Chen, C.J., Lu, S.N., You, S.L., Wang, L.W., Wang, Q., Wu, D.M. and Santella, R.M., 2001.
Genetic polymorphisms of glutathione S-transferases M1 and T1 associated with susceptibility to aflatoxin-
related hepatocarcinogenesis among chronic hepatitis B carriers: a nested case-control study in Taiwan.
Carcinogenesis 22, 1289-1294.
Sylla, A., Diallo, M.S., Castegnaro, J.J. and Wild, C.P., 1999. Interaction between hepatitis virus infection and
exposure to aflatoxins in the development of hepatocellular carcinoma: a molecular epidemiological approach.
Mutation Research 428, 187-196.
Turrini, A., Saba, A., Perrone, D., Cialda, E. and D’Amicis, A., 2001. Food Consumption Patterns in Italy: the INN-CA
Study 1994-96. European Journal of Clinical Nutrition 55, 571-588.
Vogelstein, B., Lane, D. and Levine, A.J., 2000. Surfing the p53 network. Nature 408, 307-310.
Vose, D., 2001. Risk Analysis. A quantitative guide. 2nd Edition, John Wiley & Sons Ltd, Chichester, UK, p. 217-
Yousef, A.E. and Marth, E.H., 1989. Stability and degradation of Aflatoxin M1. In: Mycotoxins in dairy products.
H.P. Van Edgmond (Ed.). Elsevier Science Publisher Ltd, England, p. 127-162.
WHO, 1998. International programme on chemical safety. Safety evaluation of certain food additives and contaminants.
WHO Food Additives Series 40.
Wild, C.P. and Turner, P.C., 2002. The toxicology of aflatoxins as a basis of public health decisions. Mutagenesis
17, 471-481.
Williams, J.H., 2004. Human aflatoxicosis in developing countries: a review of toxicology, exposure, potential health
consequences, and interventions. Am J Clin Nutr. 80, 1106-1122.

114  Towards a risk-based chain control

Marcus G. Doherr

Use of sensors for early disease detection - visions on proactive disease control in the primary
animal production
Marcus G. Doherr
Division of Clinical Research, Department of Clinical Veterinary Medicine, Vetsuisse Faculty,
University of Bern, Bremgartenstrasse 109a, PO Box 8466, CH – 3001 Bern, Switzerland,


The overall objective of epidemiological research is to understand and subsequently control
health-related problems at the population level. One of the objectives of the production of
food of animal origin is to ensure that these products are safe for (human) consumption.
It is generally accepted that safe food should originate from healthy animals, thus linking
epidemiology and food production. Various aspects such as the general approaches used
to monitor diseases and pathogens, the evaluation of diagnostic tests used within these
monitoring programs, modeling approaches to simplify and further explore these complex
biological systems, and (bio-)sensor technology that finds its way into various areas of
health-status monitoring, are briefly addressed, and an attempt is made to critically discuss
their respective values.

Keywords: surveillance, food safety, biosensors, animal production, diagnostic tests

1. Introduction

Epidemiology, in the veterinary context, is concerned with diseases in animal populations
(Thrusfield, 1995). This definition can be broadened to: "Epidemiology is the study of the
distribution and the determinants of health-related states or events in defined populations".

The overall objective of epidemiological research thus is to understand and subsequently
control health-related problems at the population level. Applied epidemiology integrates a
broad range of activities and methodological tools in order to serve this overall objective
(Figure 1).

Three of the primary objectives in the production of food of animal origin are (1) to generate
a profit for the industries involved, (2) to deliver products that the market demands, which is
related to (1), and (3) to ensure that they are considered safe for (human) consumption. In
that context, a generally accepted – if not demanded – concept is that safe (often confused
with “healthy”) food should originate from healthy animals!


Figure 1. The main activities related to veterinary epidemiology (schematic): risk factor
identification and disease prevention; data collection and maintenance; routine data description
(screening); advanced data analyses (statistics, modelling); diagnostic test approaches; risk
analysis; economic evaluation of diseases and control strategies; design of monitoring and
control strategies; implementation of control strategies.

Economically profitable production, in times of strong consumer pressure for low-cost products,
requires finding a balance between animal health and welfare (and the often increased costs
associated with higher health and welfare standards), production intensity, food safety measures
and market demands. Unfortunately, this balance often is strongly affected by the marked (public)
demands for low-cost and – at the same time – low- or even zero-risk products, while the
producers' economic situation is ignored. At the same time, increasing consumer demands for
"ready-to-cook" or even "ready-to-eat" foods have increased the number of stages in food
production (Wall, 2005) while decreasing the consumers' interaction with the product and its
origin, the animal. This further decreases the consumers' understanding of the efforts required
at all stages of production to produce high-quality and safe food!

The reliable and ongoing assessment of the health status of farm animals (“pre-harvest”), and
of products of animal origin along the whole food production chain (“post-harvest”), is one
of the central components for safe food production, and therefore plays an important role
in the veterinary public health context. In other words, safe food is fundamental to human
health (Wall, 2005). Increasing attention thus needs to be given to all compartments along
the whole food production chain (stable-to-table concept; Figure 2): monitoring and the
subsequent risk-reducing interventions should not focus only on the abattoirs and food processing
stages, but should incorporate the full "chain information". This, however, requires data
capture and communication systems that bring together and utilize information from all
segments of the production system (Johnston, 2005).

The ongoing health-status assessment at the animal or farm level (pre-harvest) has become
increasingly important in the detection of foodborne zoonotic agents such as Salmonella,
Escherichia coli O157 or Campylobacter that have their reservoir in farm animals.

Figure 2. Different industry sectors involved in the pre- and post-harvest production and
processing of food from animal origin; the stable-to-table concept (schematic): feed industry and
primary production industry (farms) [pre-harvest], followed by the slaughter & processing industry
and the consumer [post-harvest]; safe production and processing "from stable to table".

At the same
time it is crucial in the early detection of outbreaks of highly contagious diseases such as
foot-and-mouth disease (FMD) and classical swine fever (CSF) in order to prevent large
epidemics associated with high economic losses. Thirdly, ongoing monitoring is required to
validate the absence of those (production) diseases, often characterized by delayed onsets
and only subtle clinical signs, that have been controlled or even eradicated mainly due to
their economic losses, but potentially also due to their zoonotic potential. Examples are
bovine infections with the leucosis virus, with Mycobacterium bovis (bovine tuberculosis)
and with Mycobacterium avium subsp. paratuberculosis (Johne’s disease).

The critical importance of early disease detection became evident after the recent outbreaks
of classical swine fever (CSF) in the Netherlands in 1997/1998, and in the 2001 foot-and-mouth
disease (FMD) outbreak in the UK. The combination of a lack of distinct clinical symptoms in at
least some of the affected species or animals, the lack of disease awareness (and reporting)
by some personnel involved, delays in routine testing and the fact that no reliable and fast
"on-site" diagnostic test was available at that time resulted in a delayed recognition of
and response to the problem by several days. As a consequence of that delay the epidemics
became larger in the number of affected herds and in the resulting economic losses.

The objective of this contribution is to develop and then use the conceptual framework of
(1) disease detection (monitoring and surveillance), (2) the principles of diagnostic test
characteristics and (3) infectious disease modeling in order to present and critically discuss
the value of new tools such as "sensors" used for easier or earlier detection of diseases or
pathogens.

2. Disease “surveillance”

The concept of - and term - surveillance can be traced back to the French revolution and
at that time meant “…to keep watch over a group of persons thought to be subversive”
(Eylenbosch and Noah, 1988). In more recent documents produced by the World Organisation
for Animal Health (OIE) and other international bodies, a clear distinction is being made
between monitoring and surveillance.


Monitoring (to watch, follow, observe) is defined as a continuous (ongoing) process of
collecting data on the health status (health-related events) within animal populations over
a defined period of time.

Surveillance (to monitor and control) is an extension of monitoring in which control or
eradication action is taken once a predefined level of the health-related event (often disease
prevalence) has been reached.

This terminology unfortunately has not been used consistently; quite frequently the term
surveillance is applied very globally to describe any activity related to detecting cases of
disease within populations. One of the reasons could be that (in veterinary public health)
basically all animal diseases that are monitored are also regulated within specific control
programs. There the use of surveillance is indeed appropriate. If, however, the prevalence of
Newcastle Disease (ND) in wild birds is routinely assessed by testing hunted and found-dead
birds, but no control action is taken if the agent is found in this sample, then this would
constitute an example of a monitoring approach. Doherr and Audigé (2001) recommend clearly
distinguishing between the two activities, and using monitoring and surveillance
systems (MOSS) as the generic term.

One of the most frequent reasons for the implementation of a MOSS is either to assess, with
defined levels of precision, how frequent (prevalence, incidence) a disease is, or to document
that the outcome (disease) of interest is below a certain threshold level. This could be done
at the sample, animal, herd or even larger aggregate levels of animals. That information
would then be used (1) to support trade, (2) for the design of disease control programs
and to assess their effectiveness, (3) for the assessment of the economic losses related to
a specific disease and (4) in risk analysis and research. The most fundamental approach of
collecting animal disease data from respective target populations is either through baseline
passive monitoring or through targeted active sampling (and testing).

Passive MOSS is defined as the (often mandatory) reporting of clinical suspect cases to
the (veterinary) authorities. This system relies on the awareness of the animal owners and
veterinary practitioners for the disease to recognize suspicious cases, and their willingness to
report them. This system has a long history and was instrumental in the control of mainly the
highly contagious diseases. It uses an infrastructure (farmers, vets) that is already in place
(low cost for the individual disease), and can simultaneously cover a broad range of diseases.
This approach, however, can only be used for diseases that present clear clinical signs, and
it works best for those diseases that are highly contagious (fast spreading) and that have a
short incubation period, thus resulting in a large number of obviously sick animals during a
relatively short period of time (an “outbreak”). However, the approach often underestimates
the true level of disease, and in some instances the disease can go undetected (or detected
but not reported) for extended periods of time. Therefore, reported cases indicate that the
disease is present at that level or higher. The absence of reported cases, however, cannot
automatically be taken as proof that a monitored population is indeed free of the disease,
as recent history in the context of BSE monitoring in several continental European countries
has clearly shown.


Active MOSS is defined as the ongoing (continuous) or periodic (once or repeated) scientifically
based collection of information on specific disease-related events from a predefined target
(animal) population. It is a cost-intensive approach that needs a good scientifically based
design. The results, however, should be representative for the target population and therefore
valid (unbiased). This approach generally works well if rapid and low-cost diagnostic test
systems are available that can detect the condition of interest, and if a target population in
which that condition is likely to be higher when compared to the overall population can be
reliably identified. If no such target population can be identified, then a general population
survey needs to be performed, resulting in higher costs.

The outcome of any MOSS can be expressed by measures of outcome (disease) frequency.
The most commonly used measures are prevalence and incidence (Thrusfield, 1995). Prevalence
is defined as the number of existing (measurable) events (cases) in a defined population
at risk (of being a case) at a specific point in time. It reflects a cross-section through the
population at risk. Incidence in general relates to the number of new cases observed in a
population at risk over a defined period of time. Incidence count is just the number of cases
over that time period. Cumulative incidence (risk), the most commonly expressed incidence,
is the number of new cases over a specified time period (numerator) divided by the number
of animals at risk of becoming a case during that time period (denominator). The new cases
are counted as for the incidence count. The difficulty lies with measuring the population at
risk (denominator), especially in a dynamic population with exits and new entries. Frequent
approaches are to either take the population at risk present at the beginning of the time
interval, that at the midpoint of the interval, or the average population during the interval.
The result is expressed as that proportion of animals at risk that became diseased (positive)
during the specified time period (month, year etc.).

New cases for the incidence density (rate) are counted as before. However, the denominator
now is an accumulation of exact animal time at risk, and the resulting incidence density
expresses the number of new cases per animal time (months) at risk in the given population.
This measure is rarely used in veterinary medicine since exact data on animal time at risk
are not available.
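As a sketch, the three frequency measures described above can be computed as follows (the herd numbers and function names are hypothetical, not from the chapter):

```python
# Sketch of the three outcome-frequency measures: point prevalence,
# cumulative incidence (risk) and incidence density (rate).

def prevalence(existing_cases, population_at_risk):
    # Existing cases at one point in time / population at risk at that point.
    return existing_cases / population_at_risk

def cumulative_incidence(new_cases, population_at_risk):
    # New cases over a period / animals at risk of becoming a case in that period
    # (here: the population at risk at the beginning of the interval).
    return new_cases / population_at_risk

def incidence_density(new_cases, animal_time_at_risk):
    # New cases per unit of exact animal-time at risk (e.g. animal-months).
    return new_cases / animal_time_at_risk

# Hypothetical herd of 500 animals: 25 are cases today, 40 new cases over one
# year, and the animals contributed 5,700 animal-months at risk in total.
print(prevalence(25, 500))            # 0.05 -> 5% point prevalence
print(cumulative_incidence(40, 500))  # 0.08 -> 8% annual risk
print(incidence_density(40, 5700))    # ~0.007 new cases per animal-month at risk
```

The three functions make explicit that prevalence is a cross-section, while both incidence measures require following the population at risk over time.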

3. Test principles and characteristics

Within most of the MOSS activities, samples, animals or higher aggregate units such as flocks,
herds or even populations are classified into either positive (clinical disease, presence of the
agent, of antibodies, toxins, etc.) or negative in order to calculate the appropriate measures
of disease (outcome) frequency. Other scenarios include the single, repeated or continuous
measurement of defined performance indicators such as milk yield and quality, reproduction
parameters and weight gain, or more general health and welfare indicators such as animal
behaviour, body temperature, heart and respiratory rate, animal movement pattern, feed
intake and – at the farm level - animal losses over time or production cycle.

All such measurement systems can be referred to as tests, and all single tests and test
combinations have their inherent properties such as the underlying "technical" principle, the
outcome to be measured, the aggregate level (sample, animal, herd, population), the result
format (qualitative classification or quantitative measurement), the time required until a
result is available, the related costs of testing, and – very important – the precision and
accuracy of the measurements. The precision of a test describes the closeness of agreement
among repeated measures of the same sample, while accuracy is defined as the level of
agreement between the measured (test) results and the true value (Greiner and Gardner, 2000).

In the context of disease testing, interval- or ordinal-type test results (such as temperature,
optical density, dilution titers) are often dichotomized into a positive/negative outcome
by a predefined cut-off value. When this dichotomized test outcome (positive/negative) is
compared to the true status as defined by a gold standard (test), specific diagnostic test
characteristics can be calculated using a 2x2 table approach (Figure 3) (Thrusfield, 1995).
This is one of the most important steps in the test development and evaluation process since
it provides the developer and potential users of the respective tests with essential information
on the abilities and limitations of the (test) system.

a. General approach:

                True status (gold standard)
New test        Pos (+)    Neg (-)    Total
Pos (+)         a          b          T+
Neg (-)         c          d          T-
Total           D+         D-         n

b. Example:

                True status (gold standard)
New test        Pos (+)    Neg (-)    Total
Pos (+)         90         15         105
Neg (-)         10         285        295
Total           100        300        400

Diagnostic sensitivity    = a/(a+c)         = a/D+    SE  = 90.0%
Diagnostic specificity    = d/(b+d)         = d/D-    SP  = 95.0%
Apparent prevalence       = (a+b)/(a+b+c+d) = T+/n    AP  = 26.3%
True prevalence           = (a+c)/(a+b+c+d) = D+/n    TP  = 25.0%
Positive predictive value = a/(a+b)         = a/T+    PPV = 85.7%
Negative predictive value = d/(c+d)         = d/T-    NPV = 96.6%

Figure 3. (a) General 2x2 table approach to derive diagnostic test characteristics when compared
against the true disease status (gold standard); (b) Example using 100 truly diseased and 300
truly non-diseased individuals.

3.1. Diagnostic test sensitivity and specificity

The (diagnostic test) sensitivity (SE) is defined as the proportion of truly diseased ("gold
standard" positive) individuals that the test correctly classifies as (test) positive. It can be
expressed as a function of the cell frequencies in the 2x2 table (Figure 3a), or as the
conditional probability of a test-positive outcome (T+) given that the animal is diseased (D+):

SE = a / (a+c) = P(T+| D+)


The (diagnostic test) specificity (SP) is accordingly defined as the proportion of truly
non-diseased ("gold standard" negative) individuals that the test correctly classifies as (test)
negative. It can be expressed as a function of the cell frequencies (Figure 3a) or as the
conditional probability of a test-negative outcome (T-) given that the animal is non-diseased (D-):

SP = d / (b+d) = P(T-| D-)

Example (Figure 3b): If, out of 100 truly diseased (cell D+; gold standard positive) individuals,
90 were correctly classified as (test) positive (cell a), the resulting diagnostic sensitivity
of the test (SE) would be 90%. The remaining 10 animals in this gold standard positive group
are false negative in the test (c). If in the same test 285 (d) of 300 truly non-diseased (D-;
gold standard negative) individuals were classified as test-negative, then the resulting
diagnostic specificity (SP) would be 95%. The 15 animals classified as positive by the test
(b) are false positive.

Test developers initially are primarily interested in the analytic test properties. These are
defined as (1) the analytic sensitivity, i.e. the lower detection limit or the smallest still
detectable amount of the substance (antigen, antibody, chemical, protein etc.) that the test
is able to measure, and (2) the analytic specificity (cross-reaction profile), i.e. the ability
of the test not to react to any other substances. Analytic test properties will influence
the diagnostic performance of the test in the target population, and thus the diagnostic
sensitivity and specificity of the test!

A very important issue in diagnostic test evaluation is the definition of what constitutes the
“gold standard (reference status)” classification for infected and non-infected individuals
against which the new test is validated. The absolute (positive) gold standard is the
demonstration of the infectious agent after (confirmed) natural infection and subsequent
clinical disease. This could be from animals in a natural disease outbreak from which the
infectious agent was isolated by culture. Also possible is the demonstration of clear and
unique pathological lesions. Other indirect measures of disease (or exposure) such as the
presence of antibodies in a different test system are defined as relative reference (gold
standard) tests. Experimental infections and the use of animals from historically known
negative populations can be considered in addition.

Whenever test characteristics such as SE or SP need to be interpreted, careful evaluation
of the gold standard definition to which the (new) test was compared is important. One
example is the validation of the initial three rapid screening assays for bovine spongiform
encephalopathy (BSE). The gold standard positive pool consisted of 300 brain samples of
good quality from UK clinical BSE cases that were confirmed both by histology and immuno-
histochemistry (IHC). The gold standard negative pool consisted of 1,000 good-quality
brain samples from a population assumed to be historically free of BSE that tested negative
both in histology and IHC. These two groups clearly represent the two extreme ends of the
possible spectrum with respect to possible levels of detectable agent, and tissue quality of
the samples. It thus was hardly surprising that the three tests correctly classified all samples
within this trial. However, in the general population, agent levels and sample quality will
show more variation, thereby reducing the overall test performance. Another example would
be the validation of a new test for classical swine fever (CSF). Infected pigs that develop the
typical acute disease will show increased body temperature (fever) between days 3 and 8,
excrete CSF virus between days 3 and 10, show clinical symptoms and develop (pathologically
detectable) lesions between days 4 and 8 or 9, and have infection-related increases in
antibody titers after day 10. Using early clinical disease with virus excretion as the gold
standard (positive) for an antibody ELISA would result in a very low sensitivity estimate
since the pigs, though infected, did not yet seroconvert.

Even though it is generally assumed that the SE and SP are fixed properties of the respective
tests, empirical evidence suggests that these characteristics vary within and among animal
populations (Greiner and Gardner, 2000). This is one of the reasons why it is increasingly
required that new diagnostic tests, after a smaller laboratory and field validation, have to
undergo extensive field trials within the designated target population before being officially
recognized by the respective international bodies such as the EU or the OIE.

3.2. Format of test results and cut-off

In order to validate diagnostic tests using the traditional 2x2 table approach, test results
need to be coded in a binary or dichotomous format such as 0/1, neg/pos or no/yes. For
certain tests the results are already recorded as such; they can be used directly. For tests with
an ordinal (dilution titer) or continuously measured outcome (such as temperature, optical
density or chemiluminescence), a cut-off value is required to classify samples as test-
positive or -negative. Afterwards, a comparison to the gold standard status using the 2x2
table approach is possible. Often the continuous measurements (provided by a new test)
show overlapping ranges for the gold standard-positive and -negative populations (Figure
4). If this is the case, increasing the cut-off will increase the number of false-negative (FN)
results in the test, thus reducing the SE of the test in comparison to the gold standard. At
the same time, the frequency of false-positive test results (FP) decreases, and the SP of the
test is improved. Lowering the cut-off value would increase the SE and decrease the SP of
the test.

This relationship between cut-off values, test SE and test SP can be visualized in the receiver-
operating characteristic (ROC) analysis that – in a plot – shows the possible sensitivity and
specificity combinations over a range of predefined cut-off values (Greiner et al., 2000). This
allows the selection of that cut-off that provides the optimal SE and SP combination for
the intended purpose of the test. When screening individual animals for diseases that have
intermittent excretion or shedding of the target agent, or when testing for the infrequent
bacterial contamination of carcasses or meat samples along a production line, repetition of
the sampling will increase the probability of detecting the agent-shedding animal or the
bacterial contamination. The required repetition (sampling) frequency depends on the average
time lag between truly positive samples.
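The cut-off/SE/SP trade-off underlying a ROC analysis can be sketched with a small sweep (the optical-density values below are invented for illustration only):

```python
# Sketch: sensitivity/specificity pairs over a range of cut-off values, as
# visualized in a ROC plot. Scores mimic hypothetical ELISA optical densities.

diseased     = [3.1, 4.0, 4.6, 5.2, 5.8, 6.5, 7.1, 8.0]   # gold standard positive
non_diseased = [1.0, 1.4, 1.9, 2.2, 2.6, 3.0, 3.4, 4.2]   # gold standard negative

def se_sp_at_cutoff(cutoff):
    # Samples at or above the cut-off are classified as test-positive.
    se = sum(x >= cutoff for x in diseased) / len(diseased)
    sp = sum(x < cutoff for x in non_diseased) / len(non_diseased)
    return se, sp

for cutoff in (2.0, 3.0, 4.0, 5.0):
    se, sp = se_sp_at_cutoff(cutoff)
    print(f"cut-off {cutoff}: SE={se:.2f}, SP={sp:.2f}")
```

Because the two score distributions overlap, raising the cut-off improves SP at the expense of SE (and vice versa), which is exactly the trade-off a ROC curve makes visible.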

Figure 4. Hypothetical frequency distribution (y-axis) of ELISA optical densities (x-axis) for
2,000 gold standard negative and 200 gold standard positive samples (histogram); moving the
cut-off value influences the frequencies of the four cells in the 2x2 table.

Often, the interest lies more in the classification of herds or other aggregates of individuals
(resp. samples) as positive or negative. Herd-level test characteristics (herd SE and herd
SP), in addition to the individual animal-level characteristics, depend on (1) the number of
animals from a herd that is tested (in relation to the herd size), (2) the true within-herd
prevalence in infected herds, (3) the selected cut-off value (i.e. 1, 2, 3 … test-positive
individuals) to classify the herd as positive, and (4) the potential variation (between all
herds in the sample) of individual animal test characteristics, within-herd prevalence and
proportion of animals per herd tested (Christensen and Gardner, 2000). In general, as the
number of tested animals per herd and the within-herd prevalence increases, the herd-level
sensitivity increases. This can be useful if the sensitivity of the individual animal test is poor.
However, if at the same time the animal-level SP of the test is low, then testing more animals
per herd will result in an increasing number of false-positive test results, thus reducing the
herd-level specificity (Christensen and Gardner, 2000). The use of reliable confirmation assays
can resolve this problem.
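The herd-level behaviour described above can be sketched under the common simplifying assumptions of independent test results and a cut-off of one test-positive animal to declare a herd positive (the function names and numeric inputs are illustrative, not from the chapter; cf. Christensen and Gardner, 2000, for the full treatment):

```python
# Sketch: herd-level sensitivity and specificity when a herd is classified
# positive if >= 1 of n tested animals is test-positive, assuming independent
# test results.

def herd_se(se, sp, within_herd_prev, n_tested):
    # Probability that a random animal from an infected herd tests positive
    # (true positives plus false positives among the non-infected animals):
    p_test_pos = se * within_herd_prev + (1 - sp) * (1 - within_herd_prev)
    # The herd is detected if at least one of the n animals tests positive.
    return 1 - (1 - p_test_pos) ** n_tested

def herd_sp(sp, n_tested):
    # A disease-free herd is correctly classified only if all n animals test negative.
    return sp ** n_tested

# Individual test: SE=90%, SP=95%; within-herd prevalence 20%; 10 animals tested.
print(f"herd SE: {herd_se(0.90, 0.95, 0.20, 10):.3f}")  # rises with n and prevalence
print(f"herd SP: {herd_sp(0.95, 10):.3f}")              # falls as n increases
```

The sketch reproduces both effects noted in the text: testing more animals raises herd-level sensitivity, but with an imperfect animal-level SP it simultaneously erodes herd-level specificity.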

3.3. The user's view of test results

Users of diagnostic tests, besides cost and time-to-result considerations, have two main
questions in relation to test performance.
If a diagnostic test is used to assess the proportion of test reactors within a sample, the
result will be the test-positive or apparent prevalence (AP, Figure 3). If the test is indeed
perfect, i.e. 100% sensitive and 100% specific, then this apparent prevalence will be identical
to the true prevalence in the sample, and will be a good estimate for the true prevalence in
the population. If the test is not perfect, however, then these two prevalence figures, AP and
TP, differ. From the AP and the known test characteristics (SE, SP), the true sample prevalence
and an estimate for the true population prevalence (with 95% confidence intervals) can be
derived using the Rogan-Gladen estimator (Equation 1; Rogan and Gladen, 1978).

TP = (AP + SP - 1) / (SE + SP - 1)    (Eq. 1)

When confronted with an individual test result (such as a positive pregnancy test strip),
we will automatically try to assess how reliable that test result can be. This probability of a
test to be right is called the predictive value of an individual (either positive or negative)
test result. Predictive values are calculated separately for the test-positive and test-negative
sample. The positive predictive value (PPV or PV+) is defined as the proportion of test-positive
individuals that is truly diseased (“gold standard” positive). It can also be expressed as a
conditional probability of having a truly diseased individual (D+) given that the individual is
test positive (T+). The negative predictive value (NPV or PV-) is defined as that proportion
of test-negative individuals that is truly non-diseased (“gold standard” negative); it can
be expressed as a conditional probability of having a truly non-diseased individual (D-)
given that the individual is test negative (T-) (Figure 3a). Predictive values depend on the
diagnostic test characteristics, with a higher NPV for tests with a high SE, and higher PPV
for tests with a high SP. However, they also depend on the true prevalence of the
disease in the population from which the (tested) individual came. In populations with a
high prevalence, there is a generally higher overall probability that a test-positive individual
is truly diseased, resulting in a higher PPV. Conversely, in populations with a very low disease
prevalence, the NPV is correspondingly high.
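The prevalence dependence of the predictive values follows directly from Bayes' theorem and can be sketched as follows; the choice of SE = SP = 0.95 and the two prevalences are illustrative values:

```python
# PPV and NPV from animal-level SE, SP and the true prevalence in the
# population the tested individual comes from (Bayes' theorem).

def predictive_values(se: float, sp: float, prev: float):
    true_pos = prev * se                   # P(T+ and D+)
    false_pos = (1.0 - prev) * (1.0 - sp)  # P(T+ and D-)
    true_neg = (1.0 - prev) * sp           # P(T- and D-)
    false_neg = prev * (1.0 - se)          # P(T- and D+)
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# The same test in a high- and in a very-low-prevalence population:
for prev in (0.30, 0.001):
    ppv, npv = predictive_values(0.95, 0.95, prev)
    print(f"prev={prev}: PPV={ppv:.3f}, NPV={npv:.4f}")
```

At 30% prevalence the PPV is about 0.89; at 0.1% prevalence it collapses to under 2% while the NPV approaches 1, mirroring the association described above.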

3.4. Combination of tests

In the context of screening animal populations for specific outcomes such as disease, a
combination of tests often is used to increase the overall performance of the detection
system. In the most common scenario, reactors from a very sensitive first or screening test
are followed up by a second more specific test that is able to distinguish between true- and
false-positive samples from the initial testing. Screening-test negative samples, however,
are considered to be correct, i.e. accepted as coming from truly negative (non-diseased)
individuals. A simple example of this sequential (serial) combination of tests is the clinical
suspect reporting (as a screening test) and subsequent laboratory confirmation of those
suspects. If, in a country or region considered free of FMD, no suspect cases are reported,
then the population is considered to be free of FMD. If, however, suspects are reported,
then they are further investigated by more specific tests. Other examples are the use of a
BSE rapid screening test on slaughtered cattle and the submission of test-positive samples
to the reference laboratory for confirmation by immunohistochemistry (IHC) and specific
western blotting.

In order to maximize the overall performance of a serial test combination, the (first) screening
test should have a very high sensitivity (approaching 100%). This ensures that (hopefully)
all positive individuals are captured in the screening, but it will also generate a proportion
of false-positive test results, especially when the screening test is not highly specific. The
follow-up (confirmatory) test needs a good sensitivity but – even more importantly – a very
high (approaching 100%) specificity to clearly distinguish between truly diseased and truly
non-diseased (but screening-test false-positive) individuals. Specific software packages and
modelling approaches can be used to identify – for a given disease and MOSS objective – the
combination of tests (with given characteristics) that best reaches that objective.
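Under the common assumption of conditionally independent test errors, the combined characteristics of such a serial scheme follow simple product rules. A minimal sketch; the test characteristics used below are illustrative:

```python
# Serial (screening -> confirmation) testing: an individual is declared
# positive only if BOTH tests are positive. Assumes the two tests err
# independently, which real assays may violate.

def serial_combination(se1: float, sp1: float, se2: float, sp2: float):
    se_serial = se1 * se2                        # both must detect a true case
    sp_serial = 1.0 - (1.0 - sp1) * (1.0 - sp2)  # both must err on a negative
    return se_serial, sp_serial

# A very sensitive screening test followed by a very specific confirmation:
se, sp = serial_combination(se1=0.999, sp1=0.90, se2=0.95, sp2=0.9999)
print(f"serial SE = {se:.5f}, serial SP = {sp:.5f}")
```

The combined specificity approaches 100% (a false positive must slip through both tests), while the combined sensitivity is slightly below that of either test alone, which is why the screening test should start as close to 100% SE as possible.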

4. Modelling

When dealing with complex biological systems of diseases spreading through populations,
zoonotic agents being propagated along the food chain, and tests of various characteristics
used to detect these diseases, agents or other outcomes, modelling approaches are commonly
used. A model – by definition - is a mathematical description of the behaviour of a (biological
or physical) system (Thrusfield, 1995). It can be defined as a tool that enables the researcher
to express (describe) specific hypotheses so that quantitative or qualitative predictions
are possible and can be compared to the real world. No such model will capture all the
(biological) variability present in the real world; they are simplifications of reality. Therefore
all models will be wrong to some extent, but still useful by providing further information and
understanding about the problem of interest.

The course of infectious diseases through susceptible animal populations often is modelled
using what is called deterministic or stochastic compartment or state–transition models
(Thrusfield, 1995). An example of a relatively simple generic animal-level SEIR compartment
(or state) model with susceptible, exposed, infected/infectious and recovered/vaccinated
states, replacement into the pool of susceptible animals, animals leaving all compartments
by death and transition rates that define the proportion of animals changing from one
compartment (state) into another per unit time is presented in Figure 5.

Approaches using these or other types of models have been extensively utilized before,
during and after the 2001 outbreak of FMD in the UK, and their value in various contexts
including research to understand the disease epidemiology, prediction of the course of the
epidemic during an outbreak, evaluation of various response scenarios during an outbreak,
and preparedness for future outbreaks was reviewed (Anon., 2002; Kao, 2002; Kitching et
al., 2005). One general conclusion was that especially during an initial phase of a rapidly
moving epidemic, even “off-the-shelf” models, due to the non-availability of essential


Figure 5. Simple animal-level SEIR compartment (or state) model with Susceptible, Exposed, Infected/
Infectious and Recovered/Vaccinated states, replacement (r) into the pool of susceptible animals, animals
leaving all compartments by death (µ1-µ4), and transition rates (α, β, γ, ρ, ω) that define the proportion
of animals changing from one compartment (state) into another per unit time.


data, would have difficulty correctly predicting the future course of the outbreak, and the
development of new models would require too much time for them to be of much use in
(control) decision making.

Fitting models to completed (past) outbreak information, on the other hand, has proven to
be very useful for the evaluation of various activities that, in a similar outbreak situation,
would influence some of the transmission parameters between compartments as described
in Figure 5. This could include demonstrating the effect of various vaccination strategies
that would reduce the number of susceptible animals by “moving” them into the vaccinated
compartment, of culling strategies that remove animals from several compartments, and of
improvements in diagnosis (testing) that might shorten the time that diseased animals can
infect susceptible animals until they are identified or removed.
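A deterministic, discrete-time sketch of the SEIR model in Figure 5 shows how such transmission parameters can be experimented with. The transition-rate names follow the figure; the numeric values, herd size and step count are illustrative assumptions, and replacement and death (r, µ) are set to zero for simplicity:

```python
# One-time-unit update of the SEIR compartments from Figure 5
# (deterministic version; r = replacement, mu = uniform death rate).

def seir_step(s, e, i, rec, beta, gamma, alpha, r=0.0, mu=0.0):
    n = s + e + i + rec
    new_exposed = beta * s * i / n   # effective contacts:  S -> E
    new_infectious = gamma * e       # incubation ends:     E -> I
    new_recovered = alpha * i        # recovery/removal:    I -> R
    s_next = s + r - new_exposed - mu * s
    e_next = e + new_exposed - new_infectious - mu * e
    i_next = i + new_infectious - new_recovered - mu * i
    rec_next = rec + new_recovered - mu * rec
    return s_next, e_next, i_next, rec_next

# Seed one infectious animal in a closed herd of 1,000 and iterate;
# lowering beta (e.g. through biosecurity) or raising alpha (faster
# removal of infectious animals) changes the outcome, which is the kind
# of scenario run described above.
state = (999.0, 0.0, 1.0, 0.0)
for _ in range(200):
    state = seir_step(*state, beta=0.4, gamma=0.2, alpha=0.1)
print([round(x, 1) for x in state])
```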

5. Sensors

By a rather general definition, a (hardware) sensor is an electronic device used to measure a
physical quantity such as temperature, pressure or loudness and convert it into an electronic
signal of some kind (such as voltage). Alternative definitions include (1) a device that
responds to a physical stimulus, such as thermal energy, electromagnetic energy, acoustic
energy, pressure, magnetism, or motion, by producing a signal, usually electrical, and (2) any
device that can sense an abnormal condition within the system and – in response – provide
a signal indicating the presence or nature of the abnormality to either a local or remote
alarm indicator.

Biosensors use the selective binding capabilities of biological molecules such as antibodies,
nucleic acids, enzymes or antigens to a specific receptor (molecule), and each binding
event (such as between antibody and antigen receptor, or between enzyme and substrate)
is directly converted by a microchip into an electronic signal. This is referred to as the lab-
on-a-chip concept. The specific binding event is similar to that of known diagnostic test
systems; however, the signal processing does not require time-consuming incubation steps
once the sample with the target agent is prepared. Certainly, this technology works best in
situations when the target molecule is not bound (fixed within tissues or cells such as the
BSE-related proteinase-K resistant prion protein) or otherwise masked and thus not directly
available for binding to the specific receptor on the microchip.

As for any other diagnostic test, a result (signal) validation against a known gold standard,
with determination of a cut-off signal above which the result is classified as positive, is
required, and the aforementioned approaches and sample size considerations apply.
A range of biosensors measuring various parameters are already used in animal research and
production. Examples include:
• the periodic or continuous monitoring of parameters such as physical activity, body or
environmental temperature, blood pressure, vaginal mucus conductivity, geographic
position (tracking), etc. in laboratory and farm animals as well as in wildlife species;
• milking robots in dairy operations that, in addition to the “visual” assessment of udder
shape and colour, record a range of data on the milking procedure (time, amount, etc.)


and on milk composition and quality (temperature, cells, conductivity, etc.), and that
have the potential to monitor for specific diseases;
• the monitoring of groups of animals in feedlots or pens for their individual feed
consumption, movement within the feedlot, temperature, lameness, and weight gain; often
without any human intervention or presence;
• sensor and instrumentation technologies currently under development in the food safety
context that might offer fast (practically real time) and consistent (repeatable) inspection
capabilities to – for example - detect down to 10 cells of Listeria monocytogenes in food
of animal origin (Anon., 2004).
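The alarm-type sensor definition given earlier boils down to comparing readings against a validated cut-off. The following is a minimal sketch of such threshold logic; the class name, the 39.5 °C cut-off and the three-reading persistence window are all illustrative assumptions, not values from the text:

```python
from collections import deque

# Threshold alarm over a stream of sensor readings: an alert is raised
# only after several consecutive exceedances of a validated cut-off,
# which damps false alarms from single noisy readings.

class ThresholdMonitor:
    def __init__(self, cutoff: float, window: int = 3):
        self.cutoff = cutoff
        self.recent = deque(maxlen=window)   # last `window` readings

    def update(self, reading: float) -> bool:
        """Record one reading; return True if the alarm should fire."""
        self.recent.append(reading)
        return (len(self.recent) == self.recent.maxlen
                and all(r > self.cutoff for r in self.recent))

# Hypothetical body-temperature stream for one cow (cut-off 39.5 °C):
monitor = ThresholdMonitor(cutoff=39.5)
alarms = [monitor.update(t) for t in (38.6, 38.9, 39.7, 39.9, 40.1)]
print(alarms)  # the alarm fires only on the third consecutive exceedance
```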

In the aftermath of the 2001 UK FMD outbreak, it was reported that several rapid (“pen-
site” or “on-farm”) diagnostic tests and biosensors (animal sensors) for highly contagious
animal diseases such as FMD are becoming available (Anon., 2002). However, at the time
of the outbreak they were not yet fully validated, and any further development in epidemic
“peace times” was considered difficult due to the lack of interest and funding.

6. Bringing everything together

The validity of MOSS activities related to animal diseases, foodborne agents, toxins and
other health hazards is highly dependent on the diagnostic test systems available in order
to reliably detect these conditions. Thus, careful evaluation of these diagnostic tests is
required in order to understand the advantages – and limitations – of specific tests for a given
problem. The use of new diagnostic tools such as rapid tests for infectious diseases based
on (bio-)sensors, computer-guided video surveillance of the behaviour of pigs in pens, real-
time monitoring for foodborne agents along the whole food production chain, milk parameter
testing as an indicator for udder diseases or even more general health-problems in dairy cows
etc. might improve our ability to detect specific health problems or contaminations before
they become a risk! Modelling plays an important role in this context by allowing scenarios
to be run on the “net effect” of improving the time-to-diagnosis (and control). In times of
increasing labour costs and trends towards larger farm animal operations, it is still debated
whether to have continuous surveillance of various parameters in place, with managers becoming
active only when certain predefined threshold values have been reached. Whether this image
(better: vision) of the “fully wired cow” is technically or economically feasible, and culturally
acceptable, remains an area for discussion.

The technology is certainly moving on, and sensor technology, in one way or another,
is finding its way into the animal and food-producing industry. Further development and
validation of this technology might be easier in the area of food production (and foodborne
diseases) since technology (in general) is further advanced there. In more traditional farm-
animal systems it is more difficult to accept that the farmer caring for and at the same
time observing the health of his livestock can easily be replaced by a computer-controlled
surveillance system – despite the fact that the latter might be more reliable at detecting
developing problems! A careful evaluation (including modelling, economic and social sciences)
is needed to assess where such technology can be successfully implemented.


Decisions on which approach to use, however, need to be based on the evaluation of a rather
complex system, and must include issues such as the target population, the agent or parameter
to be measured, the availability of validated tests including biosensors, test characteristics
(screening and confirmation), the time requirements, the costs of testing and of misdiagnoses
(i.e. false-positive and false-negative results), as well as other parameters and constraints.

References


Anon., 2002. Infectious diseases in Livestock – Scientific questions related to the transmission, prevention and
control of epidemic outbreaks of infectious diseases in livestock in Great Britain. Policy document 15/02 of
the Royal Society, London, UK, p. 66-86.
Anon., 2004. Description of Sensor Technology in the food safety and quality context. http://www.csrees.usda.
Christensen, J. and Gardner, I.A., 2000. Herd-level interpretation of test results for epidemiologic studies of animal
diseases. Prev. Vet. Med. 45, 83-106.
Doherr, M.G. and Audige, L., 2001. Monitoring and surveillance for rare health-related events - A review from the
veterinary perspective. Phil. Trans. R. Soc. Lond. B. 356, 1097-1106.
Eylenbosch, W.J. and Noah, N.D., 1988. Surveillance in Health and Disease. Oxford University Press, UK.
Greiner, M. and Gardner, I.A., 2000. Epidemiologic issues in the validation of veterinary diagnostic tests. Prev. Vet.
Med. 45, 3-22.
Greiner, M., Pfeiffer, D. and Smith, R.D., 2000. Principles and practical applications of the receiver-operating
characteristic analysis for diagnostic tests. Prev. Vet. Med. 45, 23-41.
Johnston, M., 2005. Meat inspection and chain information as part of the Farm to Fork Approach. In: Food Safety
assurance and veterinary public health – vol. 3 – Risk management strategies: monitoring and surveillance.
F.J.M. Smulders and J.D. Collins (Eds.). Wageningen Academic Publishers, Wageningen, NL, p. 257-258.
Kao, R.R., 2002. The role of mathematical modeling in the control of the 2001 FMD epidemic in the UK. Trends in
Microbiol. 10, 279-286.
Kitching, R.P., Hutber, A.M. and Thrusfield, M.V., 2005. A review of foot-and-mouth disease with special consideration
for the clinical and epidemiological factors relevant to predictive modelling of the disease. The Vet. Journal
169, 97-209.
Rogan, W.J. and Gladen, B., 1978. Estimating prevalence from the results of a screening test. Am. J. Epidemiol.
107, 71-76.
Thrusfield, M., 1995. Veterinary Epidemiology, 2nd Ed. Blackwell Science, Oxford, UK, p. 1, 39-41, 266-280, 296-
Wall, P.G., 2005. Risk management strategies in food safety: some issues for the EU. In: Food Safety assurance and
veterinary public health – vol. 3 – Risk management strategies: monitoring and surveillance. F.J.M. Smulders
and J.D. Collins (Eds.). Wageningen Academic Publishers, Wageningen, NL, p. 19.


Antimicrobial resistance and transfer in foodborne

Friederike Hilbert
Department of Veterinary Public Health of the University of Veterinary Medicine Vienna,
Veterinaerplatz 1, 1210 Wien, Austria

A constant collection of data on the prevalence of foodborne pathogens in different food
items is essential to predict their impact on human foodborne infections. In particular, major
meat species (pork, beef and poultry) have to be analysed for thermophilic Campylobacter,
Salmonella, Yersinia enterocolitica, pathogenic E. coli and Listeria monocytogenes (Schlundt,
2002). Besides determining the prevalence of a pathogen in a certain food to estimate
effects on human health, virulence properties and antimicrobial resistance profiles have
also to be taken into account. In a recent study, conducted in Austria, the prevalence
of major foodborne pathogens was assessed in meat and their antimicrobial resistance
profile determined. Transfer of antimicrobial resistance between pathogens was assumed
on the basis of their prevalence in the same food and on common antimicrobial resistance
genes and elements detected. Thermophilic Campylobacter were mainly isolated from poultry
meat (51%). Although Salmonella is strictly controlled in poultry production in Austria, we
nevertheless found 16% of poultry meat to be contaminated with Salmonella (Mayrhofer et
al., 2004). Pathogenic Yersinia enterocolitica is only a minor problem in pork production in
Austria (16%); pathogenic Yersinia enterocolitica was found in only 3 beef samples, most
likely cross-contaminated during meat processing. Pathogenic E. coli (as confirmed by
PCR of the stx1 and stx2 (shigatoxins) as well as the intimin gene) were found in beef only
(5%). Whereas thermophilic Campylobacter, Salmonella and pathogenic E. coli were shown to
have about the same overall resistance rate, Yersinia enterocolitica and Listeria monocytogenes
food isolates were not found to be resistant to any of the tested antimicrobials. Resistances
to quinolones and tetracyclines were found most often (Mayrhofer et al., 2004). Based on
statistics, the transfer of antimicrobial resistance between different foodborne pathogenic
species is likely to occur. However, accurate genetic determination of resistance genes and
elements could not prove such a transfer of resistance genes in our study.

Keywords: foodborne pathogens, antimicrobial resistance, resistance transfer, resistance
genes, resistance elements

1. Introduction

Although antimicrobial resistance in foodborne pathogens does in most cases not represent
a major human health risk, it may occasionally result in fatalities. This is particularly the
case when antibiotics are used for treatment of diseased immuno-suppressed people or
when fluoroquinolones – as a first prophylactic choice to reduce travellers’ disease – need


to be administered. The efficiency of drug use might be low at the level of resistance seen
in Campylobacter or Salmonella – the most important foodborne pathogens (Aarestrup and
Engberg, 2001; Di John and Levine 1988; Takahashi et al., 1997). For instance, high rates
of resistance to nalidixic acid – a quinolone of the first generation – are a sign of stepwise
resistance development (mutation) in Enterobacteriaceae. As in our experiments resistance to
nalidixic acid (especially in Salmonella) was rather high (42%) and resistance to nalidixic acid
is most often combined with low-level fluoroquinolone resistance in Salmonella, an infection
with such a pathogen might cause fluoroquinolone therapy to fail.

In nature, resistance to quinolones occurs only by mutation, whereas other resistance elements
(e.g. resistance to tetracycline or ampicillin) are transferred horizontally (Kruse and Sorum,
1994). At the moment we have no proof of resistance transfer between different bacteria,
whether commensals, pathogens or food spoilage flora.

Discussions on the effects of antimicrobial substances used in human or veterinary medicine
and in agriculture, as well as on resistance development in foodborne pathogens, are
continuing (Monnet, 2000). The application of antimicrobials as growth promoters is now
prohibited in the European Union. Soon data will become available to study the effect of
their use on the development of antimicrobial resistance, agriculture profitability and animal
health. But even when the use of antibiotics is exclusively restricted to prophylactic and
therapeutic veterinary application, resistance development in foodborne pathogens may
ensue. The latter can only be counteracted by following strict recommendations for the use
of veterinary drugs, which are based on constantly updated data, i.e. the recommendation
to apply only specifically defined antimicrobial substances that have previously been tested
for antimicrobial resistance in an animal food production setting.

2. Material and methods

2.1. Sample collection

Over a three-year period, 922 meat samples including pork (n=220), beef (n=134), chicken
(n=288), turkey (n=266), and minced meat (n=14) were collected from supermarkets,
butchers, street markets and slaughterhouses throughout Austria. Meat samples were randomly
selected.
2.2. Isolation procedure

2.2.1. Thermophilic Campylobacter

A total of 699 meat samples (105 pork, 84 beef, 243 chicken, 256 turkey and 11 minced meat
samples) were analysed for thermophilic Campylobacter. For the isolation of Campylobacter the
ISO approved procedure for food isolates was followed. In brief, 25g of the meat sample was
inoculated in Bolton broth (Oxoid CM983 with supplement SR208E, Basingstoke, England) and
incubated for 48 hours at 42°C under microaerophilic conditions (Oxoid Gas Generating Kit
Campylobacter System BR060A). The enrichment was streaked onto modified CCDA agar (Oxoid


CM739 and supplement SR155E) and incubated at 42°C under microaerophilic conditions for
48h. Colonies selected on the basis of morphology, were streaked in parallel onto modified
CCDA and plate count agar (Oxoid CM325) and were subsequently grown under microaerophilic
and aerobic conditions, respectively. Strains without aerobic growth on plate count agar were
biochemically verified as Campylobacter jejuni or Campylobacter coli by the Api–Campy (Bio
Merieux 20800, Marcy L’Etoile, France) and stored at –80°C for further characterisation.

2.2.2. Salmonella

A total of 821 meat samples (166 pork, 100 beef, 281 chicken, 262 turkey and 12 minced
meat samples) were analysed for Salmonella according to ISO–6579 using 25g of the food
sample. In addition to XLD agar (Merck 1.13919, Darmstadt, Germany) and MacConkey agar (Oxoid
CM115) as selective agars, we assessed the swarming ability at 42°C on MSRV (Oxoid CM910)
agar, followed by sero agglutination (Hoechst-Behring, Marburg, Germany, Polyvalent I;
Anti-O6, 7, 8; Anti-O3, 10, 15; Anti-O4, 5; Anti-O9).

2.2.3. Yersinia enterocolitica

A total of 304 meat samples (90 pork, 91 beef, 89 chicken, 21 turkey and 13 minced meat
samples) were analysed for Yersinia enterocolitica. These samples were analysed according
to ISO-10273 using alkali-treatment and CIN agar plates (Oxoid CM 653 and SR109E) with
minor modification as described in Hilbert et al. (2003) – for rapid urease screening - using
25g of the food sample. Urease-positive colonies were picked and discriminated biochemically
(API 20E, Bio Mérieux 20100).

2.2.4. Pathogenic E. coli

A total of 500 meat samples (120 pork, 134 beef, 126 chicken, 106 turkey and 14 minced
meat samples) were analysed for E. coli as follows. A 25g portion of the food sample was
enriched in EC-broth (Oxoid CM853) under aerobic conditions at 42°C for 24h. The enrichment
culture was streaked onto Coli-ID-Agar plates (Bio Merieux 42017), incubated for 24h at
42°C and typical colonies were isolated on plate count agar. After biochemical verification
as E. coli by using the Api-20E (Bio Merieux 20100) pathogenic E. coli were differentiated
by hemolysin production on enterohemolysin agar (Oxoid PB 5105A) and by PCR for stx1
(primers forward: gtggttgcgaaggaatttacc reverse: actgatccctgcaacacgctg), stx2 (forward:
atcctattcccgggagtttacg reverse: gcgtcatcgtatacacaggagc), and eae (intimin) (forward:
atgcccggacccggcacaag; reverse: aagagtctcgccagtattcg) occurrence.

2.2.5. Listeria monocytogenes

A total of 304 meat samples (90 pork, 91 beef, 89 chicken, 21 turkey and 13 minced meat
samples) were analysed for Listeria monocytogenes. 25 g of the meat sample was analysed
according to the ISO-11290-2 procedure and differentiated by streaks on PALCAM-agar plates
(Oxoid CM 877 and SR 150). For identification of Listeria monocytogenes typical colonies
were subjected to CAMP testing (Pichhardt, 1993) and biochemical testing by API-Listeria
(Bio Merieux 10300).


2.2.6. Pseudomonas spp.

The isolation of Pseudomonadaceae from meat was conducted in parallel to the isolation of the
afore-mentioned foodborne pathogens. In brief, a 1:10 dilution of the sample in peptone
water was streaked directly on Kielwein agar and incubated for 24h at 30°C under aerobic
conditions. Presumptive colonies were confirmed by biochemical identification.

2.3. Determination of antimicrobial susceptibility

Cultures were analysed for antimicrobial resistance using the disc diffusion assay recommended
by the National Committee for Clinical Laboratory Standards (NCCLS, 2000) on Mueller Hinton
agar plates (Oxoid CM337). For testing Campylobacter, Mueller Hinton agar plates contained
5% (v/v) sheep blood and were incubated under microaerophilic conditions. Briefly, fresh
bacterial colonies were inoculated in 0.8% NaCl suspension to a turbidity equivalent to
0.5 McFarland standard. With a sterile cotton swab the culture was swabbed on the agar
plate and standard discs (Oxoid Antimicrobial Susceptibility Test Discs) were applied using
a disc dispenser. After incubation the size of the inhibition zone was determined according
to the NCCLS guidelines for aerobically grown bacteria. Resistance and sensitivity of the
bacterial isolates were determined using the breakpoints of the NCCLS for Enterobacteriaceae.
As standard strains, E. coli ATCC 25922 (DSM 1103), Campylobacter coli (DSM 4689)
and Campylobacter jejuni subsp. jejuni (DSM 4688) were used. For Listeria monocytogenes,
breakpoints from Vela et al. (2001) were applied in addition to those of the NCCLS; Staphylococcus
aureus ATCC 25923 (DSM 1104) and Listeria monocytogenes ATCC 7644 were used as standard
strains. For Yersinia enterocolitica testing, the standard strain DSM 11504 was used in addition
to E. coli ATCC 25922.
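Interpretation of the disc diffusion assay reduces to comparing inhibition-zone diameters with tabulated breakpoints. The following sketches that classification step; the zone-diameter breakpoints below are placeholder values, NOT the official NCCLS figures, which must be looked up per drug and organism:

```python
# Classify a disc-diffusion inhibition zone as resistant, intermediate or
# susceptible using (resistant_max_mm, susceptible_min_mm) breakpoints.
# NOTE: the breakpoint numbers here are illustrative placeholders.

BREAKPOINTS = {
    "tetracycline": (14, 19),
    "nalidixic acid": (13, 19),
    "ampicillin": (13, 17),
}

def classify(drug: str, zone_mm: float) -> str:
    r_max, s_min = BREAKPOINTS[drug]
    if zone_mm <= r_max:
        return "resistant"
    if zone_mm >= s_min:
        return "susceptible"
    return "intermediate"

print(classify("tetracycline", 12))  # resistant
print(classify("tetracycline", 16))  # intermediate
print(classify("tetracycline", 22))  # susceptible
```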

2.4. Genetic analyses

Tetracycline genes (tetA, tetB, tetO, tetG) were determined by PCR using the primer sets
and conditions:

• tetA-for: 5’ gctacatcctgcttgccttc 3’ and tetA-rev: 5’ catagatcgccgtgaagagg 3’ (210 Bp);
• tetB-for: 5’ ttggttaggggcaagttttg 3’ and tetB-rev: 5’ gtaatgggccaataacaccg 3’ (659 Bp);
• tetO-for: 5’ aacttaggcattctggctcac 3’ and tetO-rev: 5’ tcccactgttccatatcgtca 3’ (515 Bp)
(reference for primers Ng et al., 2001);
• tetG-for: 5’ gctcggtggtatctctgc 3’ and tetG-rev: 5’ agcaacagaatcgggaac 3’ (500 Bp)
(reference Guerra et al., 2001).

Plasmid isolation and determination: plasmids were isolated following the manufacturer’s
recommendations (Qiagen plasmid midi kit, cat. 12145).

2.5. Statistics

Statistical analysis was done with the Statgraphics for Windows software package, v. 2.


3. Results

3.1. Prevalence and antimicrobial resistance rate of thermophilic Campylobacter isolated from meat

Campylobacter was mainly isolated from poultry meat. A low percentage of pork was
contaminated, mainly with Campylobacter coli. In beef, thermophilic Campylobacter were
detected in less than 3% of samples (mainly cross-contamination).

Antimicrobial resistance was determined against tetracycline, ampicillin, nalidixic acid,
ciprofloxacin, erythromycin, chloramphenicol, streptomycin and gentamicin. The highest
resistance rate was seen for quinolones, at 40.7%; most isolates were cross-resistant to
nalidixic acid and ciprofloxacin. This was followed by resistance to tetracycline, at 16.4%.
Resistance to streptomycin and ampicillin was 9.7% and 8.6%, respectively. Low resistance
rates were seen for erythromycin, gentamicin and chloramphenicol. Tetracycline as well as
quinolone resistance was seen more frequently in isolates from turkey meat than in isolates
from chicken meat (p<0.005). An overall resistance rate of 53% was found among these
isolates (see Figure 1).


Figure 1. 268 Thermophilic Campylobacter food isolates were analysed for antimicrobial resistance
to nalidixic acid (NA), ciprofloxacin (CIP), tetracycline (TET), streptomycin (S), ampicillin (AMP),
erythromycin (E), gentamicin and chloramphenicol using the disc diffusion assay according to the National
Committee for Clinical Laboratory Standards (NCCLS).

3.2. Prevalence and antimicrobial resistance rate of Salmonella isolated from meat

As expected, Salmonella was most often found in poultry meat (prevalence of 16.4%). Three
isolates of Salmonella enterica serovar typhimurium were found on pork. A single one of these
was typed as phage type 120, the others as phage type 104H. In beef as well as in minced
meat (beef and pork) no Salmonella could be isolated. Salmonella of serovars Enteritidis,
Blockley, Virchow, Heidelberg, Paratyphi B variant Java, Typhimurium, Hadar, Rissen and
Kentucky were identified in poultry.


All strains were subjected to antimicrobial resistance testing against tetracycline, ampicillin,
nalidixic acid, ciprofloxacin, chloramphenicol and streptomycin. Almost 60% of the isolates
exhibited a resistant phenotype and most of these strains showed resistance to more than
one antimicrobial tested. The highest resistance rate was seen for nalidixic acid followed by
tetracycline resistant isolates. Streptomycin, ampicillin and chloramphenicol resistant strains
as well as ciprofloxacin resistance were also found. The resistance pattern varied significantly
between different serovars (see Figure 2).


Figure 2. 52 Salmonella food isolates were analysed for antimicrobial resistance to nalidixic acid (NA),
ciprofloxacin (CIP), tetracycline (TET), streptomycin (S), ampicillin (AMP), and chloramphenicol (C) using
the disc diffusion assay according to the National Committee for Clinical Laboratory Standards (NCCLS).

3.3. Prevalence and antimicrobial resistance rate in pathogenic E. coli isolated from meat

All haemolytic strains and all strains that were positive by stx1, stx2 or eae PCR were
classified as potentially pathogenic E. coli. Shigatoxin-producing strains (STEC) were only
found in beef and, to a lesser extent, in pork, but most isolates carried only one or two and
only rarely all three genetic loci. In poultry meat only haemolytic E. coli were detected
carrying neither the genetic loci for shigatoxin nor intimin. All strains were subjected to
antimicrobial resistance testing against tetracycline, ampicillin, gentamicin, kanamycin,
sulphonamides, nalidixic acid, trimethoprim and streptomycin. Of the total of 28 isolates
(all haemolytic strains including those classified as non STEC) 15 were resistant to at least
one tested substance (54%) (see Figure 3).

3.4. Prevalence and antimicrobial resistance rate of Yersinia enterocolitica isolated from meat

The pathogenic serovar O:3 was identified mainly in pork at a prevalence of 16.7% and in 3
samples of beef. In all other meat samples only the apathogenic biotype 1A was detected.

All Yersinia enterocolitica isolates (both pathogenic and apathogenic biotypes) were subjected
to resistance testing against tetracycline, gentamicin, kanamycin, sulphonamides, nalidixic




Figure 3. 28 E. coli food isolates positive for stx1, stx2, or eae-PCR and/or haemolytic strains were analysed
for antimicrobial resistance to tetracycline (TET), sulphonamides (S3), trimethoprim (W), ampicillin
(AMP), streptomycin (S), nalidixic acid (NA), kanamycin, and gentamicin using the disc diffusion assay
according to the National Committee for Clinical Laboratory Standards (NCCLS).

acid, trimethoprim, chloramphenicol and streptomycin. One strain isolated from chicken meat
exhibited a tetracycline resistant phenotype and was confirmed as biotype 1A. All other
isolates were sensitive to the antimicrobial substances tested.

3.5. Prevalence and antimicrobial resistance rate of Listeria monocytogenes isolated from meat

In the different meat species Listeria monocytogenes was found at prevalences ranging from 12 to
26%. Antimicrobial resistance profiling of Listeria monocytogenes against tetracycline,
penicillin, gentamicin, vancomycin, cotrimoxazole, erythromycin, chloramphenicol and
streptomycin revealed no resistant isolate.

3.6. Prevalence and antimicrobial resistance rate of Pseudomonas spp. isolated from meat

In almost every meat sample analysed, different Pseudomonas spp. could be detected. As
the resistance rate was of interest only for transfer studies between different genera,
resistance profiling was restricted to isolates from those food samples in which foodborne
pathogens could be isolated simultaneously. Resistance against tetracycline was seen most often.

3.7. Prevalence of tetracycline, ampicillin and nalidixic acid resistance in different foodborne pathogens and commensals isolated from one and the same meat sample

Tetracycline resistance and ampicillin resistance were used as examples of transferable
resistance loci to study the possible rate of resistance transfer between foodborne
pathogens and commensals of different genera. For comparison, the rates of nalidixic acid
resistance (which, at least in nature, arises by mutation only) in the different genera were
determined as well (see Figure 4). The results clearly demonstrate that the rate of coexisting
resistance, even between different genera found on the same food sample, is significantly
higher for transferable resistance determinants than for non-transferable resistances.



Figure 4. Prevalence of tetracycline resistance and ampicillin resistance versus nalidixic acid resistance in
different bacteria isolated from the same food sample. When a tetracycline resistant Salmonella isolate
was found in a certain food sample, in 91% of cases a tetracycline resistant E. coli isolate could be
recovered from that sample as well (isolated in parallel without antibiotic pressure). The same was seen
for ampicillin resistant isolates in 83% of cases, but only in 50% of cases for nalidixic acid resistant
isolates. For Campylobacter and E. coli the corresponding co-isolation rates were 72% and 70% versus 25%.
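The percentages in Figure 4 are conditional rates: among samples that yielded a resistant isolate of one genus, the fraction that also yielded a resistant isolate of a second genus. A minimal sketch of that calculation, using invented sample data:

```python
# Sketch of the co-occurrence calculation behind Figure 4, with invented data.
# Each sample maps a genus to the set of drugs its isolate was resistant to.

def co_resistance_rate(samples, genus_a, genus_b, drug):
    """P(genus_b resistant to drug | genus_a resistant to drug), per sample."""
    a_resistant = [s for s in samples if drug in s.get(genus_a, set())]
    if not a_resistant:
        return None  # no sample with a resistant genus_a isolate
    both = [s for s in a_resistant if drug in s.get(genus_b, set())]
    return len(both) / len(a_resistant)

samples = [
    {"Salmonella": {"TET"}, "E. coli": {"TET"}},
    {"Salmonella": {"TET"}, "E. coli": set()},
    {"Salmonella": set(),   "E. coli": {"TET"}},
    {"Salmonella": {"TET"}, "E. coli": {"TET"}},
]
rate = co_resistance_rate(samples, "Salmonella", "E. coli", "TET")
# three samples carry tetracycline resistant Salmonella; two of them also
# carry tetracycline resistant E. coli, so rate == 2/3
```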

3.8. Resistance loci in different bacteria isolated from one and the same meat sample

Analysis of the tetracycline resistance determinants showed that in Salmonella, Escherichia
coli and Pseudomonas spp. most often the tetA and tetB genes could be detected, whereas in
thermophilic Campylobacter only the tetO gene was found (see Figure 5). Isolates of different
bacteria harbouring the same resistance genes were analysed for the resistance locus (plasmid
or chromosomal location). So far, the isolates analysed from one and the same meat sample
have revealed neither the same gene nor the same resistance locus.
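The per-sample comparison behind this conclusion can be sketched as follows (the sample identifiers and gene assignments are invented for illustration). An empty result corresponds to the finding that no two genera from the same sample shared a resistance gene:

```python
# Sketch (invented data): given the tet gene(s) detected per genus in each
# meat sample, list the cases where two different genera in the same sample
# carry the same tetracycline resistance gene.

def shared_gene_samples(samples):
    """Return (sample_id, genus_a, genus_b, shared genes) tuples."""
    shared = []
    for sample_id, genera in samples.items():
        pairs = list(genera.items())
        for i, (genus_a, genes_a) in enumerate(pairs):
            for genus_b, genes_b in pairs[i + 1:]:
                common = genes_a & genes_b
                if common:
                    shared.append((sample_id, genus_a, genus_b, common))
    return shared

samples = {
    "pork-07":    {"Salmonella": {"tetA"}, "E. coli": {"tetB"}},
    "chicken-12": {"Campylobacter": {"tetO"}, "E. coli": {"tetA"}},
}
assert shared_gene_samples(samples) == []  # no genus pair shares a gene
```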

4. Conclusions

Antimicrobial resistance and its transfer have been widely discussed both in the scientific
community and in public. Concomitantly, drug application in veterinary and human medicine,
but especially in food producing animals, has raised public health concern. In particular,
the application of anti-infectives for therapeutic and prophylactic purposes, and not least
for growth promotion, has prompted questions about the development of resistant microbes
in food animals and the transfer of resistance to humans. At the same time as strict rules
and regulations restricting the use of anti-infectives are being formulated, discussions are
going on about resistance genes or elements and their transfer between micro-organisms in
different environments.

Our studies indicate that the resistance rates in Campylobacter, Salmonella and pathogenic
E. coli are similar in the three pathogenic genera (53%, 57%, 54%). We hypothesize that



Figure 5. Prevalence of the tetO gene in Campylobacter isolates. In tetracycline resistant E. coli isolates
from the same samples (isolated in parallel without antibiotic pressure) the tetO gene could never be
detected.

adaptation to antimicrobials used in animal husbandry is an essential survival strategy for
enteric microbes having their main environment within the host. In contrast, no resistant
isolate of pathogenic Yersinia enterocolitica or of Listeria monocytogenes was detected
(apart from one tetracycline resistant Yersinia enterocolitica of biotype 1A).

The aforementioned pathogens are found in different environments, such as slaughterhouses
and food production plants, representing niches with refrigeration temperatures. Thus, the
acquisition of resistance to antibiotic substances might not be as essential here as appears
to be the case for enteric microbes.

In Austria, quinolone resistance in Campylobacter, but also in Salmonella, is as high as 40%.
Austrian poultry is often treated with ciprofloxacin (a fluoroquinolone) to prevent Salmonella
infections (Feierl et al., 1999). Consistent with the literature (Taylor et al., 1985; Gootz and
Martin, 1991), resistance to nalidixic acid (a quinolone) was in most cases found to be associated
with ciprofloxacin (fluoroquinolone) resistance in Campylobacter. Additionally, we detected
the typical resistance phenotype of nalidixic acid resistance combined with low-level resistance
to ciprofloxacin in Salmonella and E. coli (Threlfall, 2002).

Therefore, drugs previously considered useful for the treatment of both severe campylobacteriosis
and salmonellosis (Graninger et al., 1996) can no longer be recommended (Feierl et al., 1999),
as treatment failure could be predicted in too many cases during therapy (Fish et al., 1995;
Aarestrup et al., 2003).

To study the transfer of resistance between different species and genera, we first analysed
the possibly transferable resistance loci of tetracycline resistance and ampicillin resistance
compared to nalidixic acid resistance (caused by mutation). Most interesting was the high
resistance overlap of isolates from different genera but from the same food sample (see Figure
4). For further studies we used the tetracycline resistant isolates. So far, we have not been able
to detect isolates of different genera from the same food sample harbouring the same
resistance genes and elements causing tetracycline resistance. We therefore conclude that transfer


of resistance between different genera of foodborne pathogens and commensals outside the
human or animal host could be a rare event.

To reduce resistance rates in these pathogens, prudent use of antimicrobials and alternatives
to antimicrobial treatment, such as vaccination [as recommended by the WHO (WHO, 2001)],
may be most effective.

References


Aarestrup, F.M. and Engberg, J., 2001. Antimicrobial resistance of thermophilic Campylobacter. Veterinary Research
32, 311-321.
Aarestrup, F.M., Wiuff, C., Molbak, K. and Threlfall, E.J., 2003. Is It Time To Change Fluoroquinolone Breakpoints
for Salmonella spp.? Antimicrobial Agents and Chemotherapy 47, 827-829.
Di John, D. and Levine, M.M., 1988. Treatment of diarrhea. Infectious Disease Clinics of North America 2, 719-745.
Feierl, G., Berghold, C., Furnpass, T. and Marth, E., 1999. Further increase in ciprofloxacin-resistant Campylobacter
jejuni/coli in Styria, Austria. Clinical Microbiology and Infection 5, 59-60.
Fish, D.N., Piscitelli, S.C. and Danzinger, L.H., 1995. Development of resistance during antimicrobial therapy. A
review of antibiotic classes and patient characteristics in 173 studies. Pharmacotherapy 15, 279-291.
Gootz, T.D. and Martin, B.A., 1991. Characterization of high-level quinolone resistance in Campylobacter jejuni.
Antimicrobial Agents and Chemotherapy 35, 840-845.
Graninger, W., Zedtwitz-Liebenstein, K., Laferl, H. and Burgmann, H., 1996. Quinolones in gastrointestinal infections.
Chemotherapy. 42 Suppl 1, 43-53.
Guerra, B., Junker, E., Miko, A., Helmuth, R. and Mendoza, M.C., 2004. Characterization and localization of drug
resistance determinants in multidrug-resistant, integron-carrying Salmonella enterica serotype Typhimurium
strains. Microbial Drug Resistance 10, 83-91.
Kruse, H. and Sorum, H., 1994. Transfer of multiple drug resistance plasmids between bacteria of diverse origins in
natural microenvironments. Applied and Environmental Microbiology 60, 4015-4021.
Mayrhofer, S., Paulsen, P., Smulders, F.J.M. and Hilbert, F., 2004. Antimicrobial resistance profile of five major food-
borne pathogens isolated from beef, pork and poultry. International Journal of Food Microbiology 97, 23-29.
Monnet, D.L., 2000. Toward multinational antimicrobial resistance surveillance systems in Europe. International
Journal of Antimicrobial Agents 15, 91-101.
NCCLS, 2000. Performance standards for antimicrobial disk susceptibility tests. Approved standard. 7th ed. Vol. 20
No. 1. M2-A7.
Ng, L.K., Martin, I., Alfa, M. and Mulvey, M., 2001. Multiplex PCR for the detection of tetracycline resistant genes.
Molecular and Cellular Probes 15, 209-215.
Schlundt, J., 2002. New directions in foodborne disease prevention. International Journal of Food Microbiology
78, 3-17.
Takahashi, K., Narita, K., Kato, Y., Sugiyama, T., Koide, N., Yoshida, T. and Yokochim, T., 1997. Low-level release of
Shiga-like toxin (verocytotoxin) and endotoxin from enterohemorrhagic Escherichia coli treated with imipenem.
Antimicrobial Agents and Chemotherapy 41, 2295-2296.
Taylor, D.E., Ng, L.K. and Lior, H., 1985. Susceptibility of Campylobacter species to nalidixic acid, enoxacin, and
other DNA gyrase inhibitors. Antimicrobial Agents and Chemotherapy 28, 708-710.
Threlfall, E.J., 2002. Antimicrobial drug resistance in Salmonella: problems and perspectives in food- and water-
borne infections. FEMS Microbiology Reviews 26, 141-148.
WHO, 2001. Antibiotic resistance: synthesis of recommendations by expert policy groups. WHO/CDS/DRS/2001.10.


Antonio Battisti and Alessia Franco

Antibiotic resistance monitoring in veterinary medicine

Antonio Battisti and Alessia Franco
Istituto Zooprofilattico Sperimentale delle Regioni Lazio e Toscana, Rome, Italy, National
Veterinary Reference Centre for antibiotic resistance

The international spread of infectious agents that have developed resistance to antimicrobials
must be regarded as a global problem requiring common strategies, especially in developed
countries. Some key elements of an overall strategy for the containment of the spread of
resistant bacteria of animal origin are education and prudent use of antibiotics among
veterinary practitioners and food animal producers, research, monitoring of resistance and
monitoring of the use of antimicrobials in animal production.

The overall objectives of a surveillance programme concerning antibiotic resistance in bacteria
of animal origin are summarized in the final recommendations of ARBAO (Concerted Action
FAIR PL 97-3654), which stipulate that it is crucial to collect accurate and reliable data on
resistance and on antibiotic consumption in food producing animals and companion animals.

The surveillance of antibiotic resistance should target different bacterial species such as
bacteria exclusively causing clinical infections in animals (“veterinary” pathogens), bacteria
isolated from animals but capable of also causing human infections (zoonotic bacteria) and
commensal bacteria isolated from healthy animals (indicator bacteria).

Two different Concerted Actions (EU 4th and 5th Framework Programmes) have been conducted
to create a network of national veterinary reference laboratories in Europe and to establish a
surveillance system for monitoring the occurrence and the emergence of antibiotic resistance
in bacteria of animal origin (ARBAO and ARBAO II). In Italy, the National Veterinary Reference
Centre for Antibiotic Resistance (IZS Lazio e Toscana, Rome, Italy), is an active member of
the EU network. The backbone of the monitoring system is based on collecting representative
data from different regions (northern, central, and southern Italy) and from food animals
(bovine, ovine, swine, and poultry) and companion animals. The information on the resistance
situation is provided for the three categories of bacteria of animal origin: animal pathogens
(i.e. Pasteurellaceae, Actinobacillus pleuropneumoniae, Staphylococcus aureus, Streptococci),
zoonotic bacteria (Salmonella, Escherichia coli EHEC), and indicator bacteria (E. coli,
Enterococci). The first report of the Italian monitoring system on antibiotic resistance in
veterinary medicine (2003) was issued at the end of 2004. From 2004 onwards, the surveillance
system has also included the monitoring of resistance among the zoonotic thermo-tolerant
Campylobacter species from food animals (poultry, swine, bovine). Representative examples
from the Italian veterinary monitoring system (2003) are discussed.

Keywords: antibiotic resistance, monitoring, surveillance, veterinary, zoonoses


1. Introduction

Antibiotic resistance is the emergence and propagation of determinants of resistance
to antibiotics triggered by the selective pressure exerted on bacterial populations by the use of
antimicrobial drugs. In human medicine, the extent of antibiotic resistance is reaching levels
that give cause for concern, in terms of its frequency and speed of diffusion, creating serious
therapeutic problems and even placing the survival of patients at risk. Infections with
resistant bacteria in humans are estimated to cause health-care costs of approximately 4
billion dollars per year in the USA alone, as a result of increased morbidity, mortality and the
costs associated with illness, both in hospital and in the community.

The international spread of infectious agents resistant to antimicrobials is currently considered

one of the major global public health problems requiring common strategies, especially in
developed countries. Antimicrobial agents have been used in food animals in developed
countries for almost 50 years. Many of these drugs are identical to or belong to classes also
used in human medicine (penicillins, tetracyclines, sulphonamides, phenicols, quinolones,
fluoro-quinolones, aminoglycosides, macrolides, cephalosporins, lincosamides, polypeptides)
and have been employed in therapy or as growth promoters to enhance feed efficiency.
The extensive use of antibiotics in veterinary medicine has, besides increasing the
efficiency of the farm animal industry, unfortunately also caused the emergence of antibiotic
resistance in animal pathogenic bacteria, in commensal bacteria and in disease agents that
can be transmitted to man through the food chain (zoonotic pathogens). Over the last
decade, an intense debate has taken place about the impact on human health of the use of
such antimicrobials in food animals and the risk of spread of resistant zoonotic pathogens
or resistance determinants from food animals to humans (Anon., 1998, 1999; Witte, 1998;
Threlfall et al., 2000; van den Bogaard and Stobberingh, 2000; WHO, 2001). Since 1997, the
use in animals of several antimicrobials related to human drugs (avoparcin, virginiamycin,
tylosin, spiramycin and bacitracin) has been banned in the European Union, on the basis of
the precautionary principle and Danish research findings (Aarestrup et al., 1998; European
Commission, 1998, 1999). Other drugs are expected to be banned beginning on the 1st of
January 2006. In the veterinary sector, too, relevant observations have been made: the use
of antibiotics has been noted to lead to increased resistance in animal pathogens, making
it difficult to control infectious diseases among livestock, and likely leading to resistant
strains of animal origin being transmitted to man through food, animals and the environment.
Despite these prospects, the use of antibiotics remains extremely widespread in veterinary
medicine and animal husbandry.

In-depth investigation of antibiotic resistance is considered of strategic importance by the

World Health Organization (WHO), the Office International des Epizooties (OIE) and the
Health and Consumer Protection Directorate General of the European Community, and has
drawn the attention of veterinary public health agencies in many EU member countries. As
a consequence, guidelines on the prudent (“judicious”) use of antimicrobials in veterinary
medicine have in recent years been issued by several international organisations, such as the
WHO, the OIE, and some veterinary associations (American Veterinary Medical Association,
World Veterinary Association). Also available are consensus guidelines (WVA/IFAP/COMISA,
1999) developed by different stakeholders, including the farm animal industry, animal health


industry and veterinary associations, such as those drawn up by the World Veterinary Association
(WVA), World Federation of the Animal Health Industry (COMISA) and the International
Federation of Agricultural Producers (IFAP - FIPA). In general, these guidelines are intended
to maximize therapeutic efficacy and minimize selection of resistant organisms as an integral
part of good veterinary practice.

Although the role of veterinarians is essential in the use of antibiotics in animal therapy,
little is known about their attitudes and practices and their adherence to the guidelines
on prudent use. In Italy, for example, veterinary antimicrobial drugs are prescription-only
medicinal products and may be sold only with a written prescription from a veterinarian,
such that any distribution and usage without prescription is considered illegal.

In veterinary medicine and animal husbandry antimicrobials are still commonly used not only
for therapy, but also for the prevention of enteric diseases, respiratory diseases, or mastitis. In
many countries, beef calves are mass-medicated upon arrival at feedlots, often for treatment
or prevention of respiratory disease. While this metaphylactic treatment is thought to reduce
losses due to clinical disease and mortality (Harland et al., 1991; Guichon et al., 1993) it
remains one of the major concerns for the spread of resistant bacteria in animal production.
Additionally, the extensive use of fluoroquinolones and third-generation cephalosporins in
farm and companion animal therapy is also of concern: these antimicrobials are still generally
administered despite the availability of other antibiotics (of lesser importance in human
therapy) that can equally effectively treat the bacteria responsible for the major bacterial
diseases affecting different body systems (e.g. the respiratory system, intestine or mammary
gland).
promote research on antibiotic resistance, organising a plan for European surveillance in both
the human and animal field. It has been recommended that the surveillance network consist
of several diagnostic laboratories coordinated by national reference centres. On the human
side, the monitoring of antibiotic resistance is currently being carried out by diagnostic
laboratories at the hospital level, as in the case of the European Antimicrobial Resistance
Surveillance System (EARSS, in which Italy also participates: /files/rapporto_AR.pdf).

2. Monitoring and surveillance of antibiotic resistance in veterinary medicine

Monitoring constitutes on-going programmes directed at the detection of changes in the
prevalence of disease in a given population and in its environment. Surveillance in
animal health is the continuous investigation of a given population to detect the occurrence
of disease for control purposes, and may involve testing of a part of the population (OIE
definitions in the International Animal Health Code). What differentiates surveillance from
monitoring is essentially that an additional step is taken beyond the collection,
organization and reporting of data (monitoring): the transformation of such data into the
information needed for actions to secure public health (surveillance).


The European Community recently enacted the recommendations on surveillance and the flow
of information on zoonotic pathogens already included in the Council Directive 92/117/EC,
and considers antibiotic resistance as a transversal zoonosis in the recent Directive 2003/99/
EC, which recommends that the member countries establish a system for monitoring antibiotic
resistance in bacteria of animal origin. The Directive 2003/99/EC states that “the alarming
emergence of resistance to antimicrobial agents (such as antimicrobial medicinal products
and antimicrobial feed additives) is a characteristic that should be monitored. Provision
should be made for such monitoring to cover not only zoonotic agents (i.e. Salmonella spp.,
Campylobacter jejuni and Campylobacter coli) but also, in so far as they present a threat
to public health, other agents. In particular, the monitoring of indicator organisms (i.e.
Escherichia coli, Enterococcus faecium, Enterococcus faecalis) might be appropriate. Such
organisms constitute a reservoir of resistance genes, which they can transfer to pathogenic
bacteria”. In addition, the Directive states that “Member States shall ensure, in accordance
with the requirements set out in Annex II, that monitoring provides comparable data on the
occurrence of antimicrobial resistance in zoonotic agents and, in so far as they present a
threat to public health, other agents” (Article 7). Annex II includes the general and the
specific requirements for the monitoring of antimicrobial resistance: at least a minimum data
set is needed covering the animal and bacterial species included in the monitoring system,
the sampling strategy, the laboratory methodology and the antimicrobial drugs employed.
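As an illustration of what such a minimum per-isolate record might contain, the sketch below defines one possible data structure; the field names and example values are our own and do not reproduce the Directive's wording:

```python
# Hypothetical sketch of a minimum per-isolate monitoring record, loosely
# following the Annex II headings (animal and bacterial species, sampling
# strategy, laboratory methodology, antimicrobial tested). Field names are
# our own illustration.

from dataclasses import dataclass

@dataclass
class ResistanceRecord:
    isolate_id: str
    bacterial_species: str   # e.g. "Campylobacter jejuni"
    animal_species: str      # e.g. "poultry"
    sampling_strategy: str   # e.g. "active, healthy animals at slaughter"
    laboratory_method: str   # e.g. "disc diffusion (NCCLS M2-A7)"
    antimicrobial: str       # drug tested
    interpretation: str      # "S", "I" or "R"

record = ResistanceRecord(
    isolate_id="IT-2003-0042",
    bacterial_species="Campylobacter jejuni",
    animal_species="poultry",
    sampling_strategy="active, healthy animals at slaughter",
    laboratory_method="disc diffusion (NCCLS M2-A7)",
    antimicrobial="ciprofloxacin",
    interpretation="R",
)
```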

The specific requirements ask for “relevant information at least with regard to a representative
number of isolates of Salmonella spp., Campylobacter jejuni and Campylobacter coli from cattle,
pigs, and poultry and food of animal origin derived from those species”.

2.1. Antimicrobial resistance monitoring programmes in Europe

Even before the Directive 2003/99/EC many countries had established systems for the
surveillance of antibiotic resistance and the monitoring of the use of antimicrobial agents
both in human and veterinary medicine. The main goals of such monitoring programmes in
the veterinary field are to detect undesired rises and trends in resistance frequencies, to
produce data for Risk Assessment, to provide the basis for policy recommendations, to
evaluate strategies for action and to measure the effects of intervention.
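The first of these goals, detecting undesired rises in resistance frequencies, can be sketched as below. A real monitoring system would apply a formal trend test (e.g. Cochran-Armitage); here a strictly increasing annual resistance proportion is simply flagged, using invented counts:

```python
# Minimal sketch of trend flagging in annual resistance frequencies.
# Counts are invented; a production system would use a proper trend test.

def resistance_frequencies(counts):
    """counts: year -> (resistant, tested); returns year -> proportion."""
    return {year: resistant / tested
            for year, (resistant, tested) in counts.items()}

def rising_trend(counts):
    """True if the resistance proportion increased in every successive year."""
    props = [p for _, p in sorted(resistance_frequencies(counts).items())]
    return all(later > earlier for earlier, later in zip(props, props[1:]))

counts = {2001: (12, 100), 2002: (18, 100), 2003: (27, 100)}
assert rising_trend(counts)  # 12% -> 18% -> 27%, a steady rise
```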

In this respect, the Danish monitoring system on antimicrobial use and occurrence of
antimicrobial resistance in bacteria from animals, foods and humans (DANMAP) has been active
since 1995, followed by the Swedish system (SVARM) in 1998, and similar monitoring systems
in the Netherlands (MARAN, http://www.cidc-lelystad.nl/docs/MARAN-2003-web.pdf) and Norway
(NORM-VET, asset/24009/2/24009_2.pdf). Also active is a network in Spain (VAV) and, since
2002, the Italian Veterinary Monitoring System (ITAVARM), which produced its first report
with data of 2003.


2.2. European network for resistance monitoring in bacteria of animal origin

Two Concerted Actions (EU 4th and 5th Framework Programme) have been conducted to create
a network of national veterinary reference laboratories in Europe and establish a surveillance
system for monitoring the occurrence and the emergence of antibiotic resistance in bacteria of
animal origin. The first EU Concerted Action was started in the 4th EC Framework Programme
and named “Antibiotic Resistance in Bacteria of Animal Origin” (ARBAO, FAIR PL 97-3654,
project leader Pascal Sanders, AFSSA, France), including participants from Member States
(Italy included) and some non-EC countries.
The aim of the Concerted Action was to establish the state of the art and to promote the
standardization and harmonization of methodologies for assessing and reporting the presence
and trends of resistance.

At present there is an ongoing Concerted Action in Europe (EC 5th Framework Programme)
named “ARBAO II” (FAIR-QLK2-2002-01146, Project Leader: Frank Aarestrup, Danish Zoonosis
Centre), with the aim of creating a stable EU network for the harmonization of methods and
criteria for the production of comparable and representative data on antibiotic resistance.
The Italian Veterinary National Reference Centre, IZS delle Regioni Lazio e Toscana, and
the National Institute of Health (ISS) participate for Italy in the veterinary field and in
the human field, respectively. The ARBAO II objectives also include the development of an
external quality control of the capability to obtain reproducible susceptibility testing data,
which has been running since 2003 and involves the reference laboratories of the participating
countries. At present, the aim of the network is to provide stakeholders and decision makers
in the European Union with comparable data.

2.3. Features of surveillance programmes

The overall objectives of a surveillance programme concerning antibiotic resistance in bacteria
of animal origin should cover (see the ARBAO recommendations):

• The analysis of the trends in antimicrobial resistance prevalence in major food producing
animal species.
• Monitoring the emergence of particular resistant clones (e.g. Salmonella Typhimurium DT104
and other multi-resistant clones).
• Monitoring of the development of new resistance phenotypes.
• Detection of new emerging resistance mechanisms.
• Development of an early alert system.

The data obtained by this approach should be used for:

• Control measures in order to prevent epidemic spread of antimicrobial resistance.

• Control measures relating to antibiotic use and prescription policy.
• Collection of input data for Risk Assessments.
• Correlation of the prevalence of antibiotic resistance to the consumption of antibiotics
in both humans and animals.


For this purpose it is crucial to collect accurate and reliable data on antibiotic consumption
by food producing animals and companion animals. The surveillance of antibiotic resistance
should target different bacterial species such as those exclusively causing clinical infections
in animals (“veterinary” pathogens), bacteria isolated from animals but capable of causing
human infections (zoonotic bacteria) and commensal bacteria isolated from healthy animals
(indicator bacteria). In Italy, the National Veterinary Reference Centre for Antibiotic Resistance
(IZS Lazio e Toscana, Rome, Italy), is an active member of the EU network.

3. Antimicrobial resistance monitoring in veterinary medicine: the current situation in Italy
Only in recent years has the monitoring of antimicrobial resistance in the veterinary field
been improved in Italy. Until 1999, no reliable data at national level regarding bacteria of
animal origin were available. This was due in part to the heterogeneity of laboratory methods
and drugs tested and to the lack of quality control protocols to ensure the validity of the
data generated (reproducibility and repeatability). However, numerous focussed and small-scale
epidemiological studies showed convincingly that, as in other European countries, the
problem existed in Italy as well. Molecular studies in Italy demonstrated the emergence
and spread of multi-resistance and of new genetic elements. To meet the criteria of the EU
recommendations, some projects were started by IZS Lazio e Toscana, which in 2003 was
appointed as Veterinary Reference Centre for Antibiotic Resistance, through a research
programme granted by the Italian Ministry. The objectives were the production of reliable
data on antibiotic resistance and the implementation of a continuous monitoring system
in Italy. The project has been carried out in collaboration with the network of the Istituti
Zooprofilattici Sperimentali (IIZZSS) and the Veterinary National Reference Centre for
Salmonellosis (IZS delle Venezie).

3.1. Characteristics of the Italian veterinary monitoring system

The backbone of the Italian monitoring system is based on representative data collected
in different regions from: (1) food animals (bovine, ovine, swine, and poultry), and (2)
companion animals (dogs, cats, horses).

The information on the resistance situation is provided for three categories of bacteria of
animal origin:

• Animal pathogens (e.g. Pasteurellaceae, coagulase positive Staphylococci, Streptococci,

E. coli) as they are the “main reason” for administering antimicrobial drugs to food and
companion animals, at least for therapeutic purposes. Besides, some animal pathogens
are suspected to transfer resistance determinants to other human and animal pathogens
(e.g. Pasteurellaceae, Actinobacillus pleuropneumoniae).
• Zoonotic bacteria (Salmonella, E. coli EHEC) for the relevant public health issues.
• Commensal bacteria from the intestines of farm animals (Enterococci, E. coli) as a
potential source of resistance genes that can spread horizontally to zoonotic bacteria
through the food chain (Neidhardt, 1996; Winokur et al., 2001). These organisms are


indicators of the selective pressure exerted by the use of antimicrobials on intestinal

populations of food animals. In the latter case, the isolates must be collected randomly
from the intestinal content of healthy animals at slaughter, following an active sampling
procedure. While resistance data on zoonotic and animal pathogens are generally obtained
through laboratory-based passive surveillance, in the latter case an active surveillance
system is needed.

The Italian monitoring system also considers animal pathogens and indicator bacteria from
companion animals, in order to collect information about the presence and trends of resistance
in such animals as well (Table 1).

Table 1. Reasons why companion animals should be included in an integrated antimicrobial resistance monitoring.

• Exposure often similar to that of their owners
• 60% of animal pathogens have zoonotic significance
• Often share food sources with their owners
• Living together increases the risk of exchange
• Frequent “off label” usage in pets
• Use of many identical drugs used in human medicine
• The choice of a specific drug may not follow economic or cost/benefit principles
• Long-lasting therapy may be more frequent

3.2. Additional information required in the surveillance of antimicrobial resistance

Further objectives of an integrated surveillance system on antimicrobial resistance in
veterinary medicine are not to limit the collection of information to laboratory
surveillance alone, but to extend it to the use of antimicrobials in veterinary
clinical practice and in farm animal husbandry (i.e. veterinarians’ attitudes towards the use of
antibiotics, information on consumption). In this way the Italian community will be provided
with additional tools to guide public health activities in terms of assessing risks for the
consumer, regulatory policies for veterinary pharmaceuticals and information and instruction
in the animal husbandry system and in professional training for veterinarians.

3.2.1. Surveys on antimicrobial drugs usage

In this respect, to study the knowledge and attitudes regarding antibiotic use of veterinarians
who work with beef and dairy cattle, the National Reference Centre, IZS delle Regioni Lazio e
Toscana, in collaboration with the National Institute of Health, performed a national study
between June and September 2002 (Busani et al., 2004). One aim was to explore the extent to
which their practices are in keeping with the principles of prudent antibiotic use as published
by various international organizations. The study had the following objectives:

• To evaluate the use of antibiotics and conformity with the principles of prudent use.
• To determine factors associated with non-judicious use.
• To assess the frequency with which alternatives to antibiotics are employed (vaccination,
use of probiotics).
• To evaluate to what extent antibiotic resistance is perceived as a risk.

From the membership lists of two scientific associations (1143 members), 250 were selected
using simple random sampling. Only those veterinarians in private practice who served as
consulting veterinarians for meat and dairy cattle were eligible for inclusion. Those selected
were contacted by telephone, and after ascertaining study eligibility, an interview was
conducted concerning:

• The type and size of the farms covered by their practices.
• Attitudes regarding vaccination against respiratory and enteric bacterial infections.
• Use of the laboratory for diagnosis and susceptibility testing.
• Use of antibiotics for therapy and for the prophylaxis of mastitis, enteritis in calves
(neonatal and during the weaning process), and respiratory infections.
• Perceptions regarding antibiotic resistance.
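The selection step described above — 250 of the 1143 association members drawn by simple random sampling — can be sketched as follows; the membership list and seed below are hypothetical, not the study's actual sampling frame.

```python
import random

def draw_survey_sample(members, n=250, seed=2002):
    """Simple random sample without replacement, as used to select
    veterinarians from the associations' membership lists."""
    rng = random.Random(seed)  # fixed seed so the draw is reproducible
    return rng.sample(members, n)

# Hypothetical membership list of 1143 veterinarians
members = [f"vet_{i:04d}" for i in range(1143)]
sample = draw_survey_sample(members)
print(len(sample))  # 250
```

Fixing the seed makes the draw reproducible, which is convenient when the sampling procedure has to be documented.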

One hundred six veterinarians were interviewed, representing 42% of the original sample. Of
the remaining 144, 48 were not eligible, 92 could not be contacted despite multiple attempts,
and 4 refused to participate. The interviewed veterinarians provide care for about 5% of
the total cattle population of the country; most (81%) worked in the north and provided
consultation for dairy cattle. The veterinarians recommended vaccination against respiratory
infections for 3% of the dairy farms and 34% of the cattle farms in their practices; for neonatal
enteritis, the corresponding figures were 24% and 30%.

Laboratory diagnosis was used by 67% of the veterinarians “sometimes” or “always” for
mastitis, by 37% for neonatal enteritis, and by 17% for respiratory infections. However, more
than 60% of the veterinarians practice empiric therapy while awaiting culture and sensitivity
results.

“New generation” antibiotics (3rd and 4th generation cephalosporins, new aminoglycosides,
and fluoroquinolones) were listed among the drugs of first choice by 12% of veterinarians for
treating mastitis; these drugs were among the first choices of 68% of veterinarians for
neonatal enteritis treatment and of 28% for respiratory diseases; this preference was especially
common among veterinarians providing care for large cattle farms. An additional 12% listed
phenicols (florfenicol) as their first choice for respiratory infections.

Antibiotic use was also common for prophylaxis, with 20% reporting using antibiotics for the
prevention of neonatal enteritis, 28% to prevent respiratory diseases, and 62% to prevent
mastitis during the drying off period.

Therapeutic failure was reported “often” by 21% and “sometimes” by 64% of the veterinarians.
Those who experienced failures were more likely to use new generation antibiotics. A series
of multivariate analyses showed a significant association between (1) the perception of
antibiotic failure “often” and the use of new antibiotics for mastitis (adjusted odds ratio
(AOR) 4.1; 95% confidence intervals (CI) 1.1-14.3) and (2) the perception of failure “often”
or “sometimes” and the use of fluoroquinolones for neonatal enteritis (AOR 6.2; 95%CI 1.6-
23.8). None of the other variables in the models, including training and specialization, area
of practice, or continuing education experiences were significantly associated with the use
of these antibiotics.
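The association measures above are adjusted odds ratios from multivariate models; as a minimal sketch of the underlying arithmetic, the crude (unadjusted) odds ratio and its Wald 95% confidence interval can be computed from a 2×2 table. The counts below are hypothetical, not the study data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: perceiving failure "often" vs. first-choice use
# of new-generation antibiotics for mastitis
or_, lo, hi = odds_ratio_ci(12, 8, 20, 55)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

An adjusted OR, as reported in the study, would instead come from a multivariable logistic regression that controls for the other covariates.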

More than 75% of the veterinarians had participated in conferences or continuing education
courses in the previous year, were subscribers to Italian journals, and received updates from
the drug industry; 39% participated in mailing lists and 24% subscribed to international
journals; approximately 20% used all of the above methods to remain current. More than
20% responded correctly to all questions regarding antibiotic resistance.

The veterinarians interviewed were young, used a variety of methods to remain current,
and were aware of the potential risks of antibiotic resistance, but a substantial number
nonetheless used “new generation” antibiotics or fluoroquinolones when other antibiotics
were clearly available. This use was not influenced by continuing education experience,
perception of the problem of resistance, or use of the laboratory; instead it may result from
the perceived need to rapidly solve a problem and the belief that this will be more likely to
occur if such antibiotics are used. Indeed, we found a seemingly paradoxical relationship
between use of the laboratory for culture and sensitivity for enteric infections and the use
of fluoroquinolones; of those using laboratory testing, 38% recommended fluoroquinolones
as their first choice drug.

These practices, which may pose public health risks, were seemingly unaffected by levels of
knowledge and training. Increased use of such antibiotics is of particular concern for the
treatment of conditions such as neonatal enteritis, which frequently is viral in origin and for
which treatment is recommended only when the animal shows signs of systemic infection.

4. Data drawn from the Italian Veterinary Antimicrobial Resistance Monitoring (ITAVARM)

In the following sections, some examples are presented from the integrated Italian veterinary
antimicrobial resistance monitoring.

4.1. The Italian demographic picture

The population residing in Italy (31 December 2002) was 57,321,070 inhabitants, of which
25,782,796 (45%) lived in the northern part of the country, 10,980,912 (19.2%) in central
Italy and 20,557,362 (35.9%) in southern Italy. The Italian farm animal demographic picture
is shown in Table 2 and Table 3 (data drawn from the Italian Institute for Statistics, ISTAT, 2000).

Table 2. Number of farms in Italy, shown by region and animal species (ISTAT, 2000).

Region Farms by animal species

Total Bovine Water buffalo Swine Ovine Caprine Horse Poultry

Piemonte 42,521 18,530 16 3,546 2,214 3,638 2,920 27,403

Valle d’Aosta 2,822 1,586 - 107 169 282 145 1,489
Lombardia 35,589 19,660 59 7,487 2,857 3,551 4,602 19,980
Trentino-Alto Adige 17,789 11,217 5 5,885 2,515 2,245 2,389 11,262
Bolzano-Bozen 12,812 9,476 4 5,475 2,136 1,725 1,798 8,562
Trento 4,977 1,741 1 410 379 520 591 2,700
Veneto 84,555 21,575 27 10,674 1,054 2,385 3,581 71,586
Friuli-Venezia Giulia 14,679 3,761 9 3,095 231 624 647 11,827
Liguria 11,636 1,617 4 355 1,331 1,037 762 9,746
Emilia-Romagna 49,012 11,938 19 4,498 1,871 1,577 3,480 41,426
Toscana 49,805 4,964 13 5,471 4,635 2,028 4,233 42,057
Umbria 25,526 3,553 8 7,503 3,815 740 1,699 22,701
Marche 39,479 5,310 27 14,979 3,853 1,234 1,332 36,409
Lazio 68,721 10,872 647 18,881 13,037 3,442 5,996 58,907
Abruzzo 37,559 5,945 7 15,933 9,646 1,607 1,932 33,338
Molise 14,374 4,043 20 7,714 3,884 1,364 855 13,008
Campania 70,278 15,350 1,298 34,641 8,560 5,317 2,180 60,964
Puglia 7,946 4,386 46 1,310 2,462 1,424 1,245 3,841
Basilicata 20,306 3,730 13 11,639 8,119 4,467 1,902 16,175
Calabria 37,229 6,086 11 26,246 5,726 5,813 1,694 27,752
Sicilia 18,443 9,045 9 2,416 6,482 2,496 2,575 6,771
Sardegna 27,566 8,685 8 12,945 14,478 3,290 4,492 4,897

Total 675,835 171,853 2,246 195,325 96,939 48,561 48,661 521,539

North 258,603 89,884 139 35,647 12,242 15,339 18,526 194,719
Central 183,531 24,699 695 46,834 25,340 7,444 13,260 160,074
South 233,701 57,270 1,412 112,844 59,357 25,778 16,875 166,746

4.2. Antimicrobial resistance in foodborne pathogens (zoonotic)

4.2.1. Salmonella spp. infections in humans and animals

Human data on Salmonella resistance monitoring are drawn from the ENTER-NET network,
which is coordinated by the Italian Institute of Health, while data on animals originate
from the ENTER-VET network, coordinated by the Reference Centre for Salmonellosis (IZS
delle Venezie).

Table 3. Number of farm animals in Italy, shown by region and animal species (ISTAT, 2000).

Region Animal species

Bovine Water buffalo Swine Ovine Caprine Horse Poultry

Piemonte 818,538 598 924,162 88,162 46,176 11,750 13,966,635

Valle d’Aosta 38,888 - 1,072 2,216 3,399 260 14,515
Lombardia 1,604,620 4,393 3,809,192 91,223 50,627 20,400 27,285,623
Trentino-Alto Adige 189,343 24 22,158 60,381 21,177 6,739 1,362,251
– Bolzano 144,196 22 15,804 39,739 15,714 4,725 250,863
– Trento 45,147 2 6,354 20,642 5,463 2,014 1,111,388
Veneto 931,337 1,364 701,685 30,910 12,647 13,243 47,983,231
Friuli-Venezia Giulia 100,766 569 191,663 6,270 6,128 2,310 8,638,393
Liguria 16,468 20 1,477 17,717 7,672 2,585 277,338
Emilia-Romagna 621,399 1,179 1,552,437 78,673 10,483 15,654 29,036,967
Toscana 103,008 521 171,641 554,679 17,158 18,589 3,484,039
Umbria 62,994 126 250,492 149,814 6,302 8,251 8,170,282
Marche 78,329 493 147,750 162,774 6,929 5,064 7,693,313
Lazio 239,457 33,518 89,206 636,499 38,849 22,795 3,322,691
Abruzzo 82,862 58 115,120 281,613 15,403 8,436 3,601,858
Molise 56,594 489 47,447 113,160 10,322 2,474 4,034,421
Campania 212,267 130,732 141,772 227,232 49,455 4,967 5,765,546
Puglia 152,723 5,604 27,145 217,963 52,135 7,550 1,981,935
Basilicata 77,711 547 82,906 335,757 97,545 5,093 496,363
Calabria 101,976 169 101,095 236,962 139,358 3,631 1,410,145
Sicilia 307,876 563 41,649 708,182 122,150 8,453 1,678,455
Sardegna 249,350 984 193,947 2,808,713 209,487 16,487 1,139,323

Total 6,046,506 181,951 8,614,016 6,808,900 923,402 184,731 171,343,324

North 4,321,359 8,147 7,203,846 375,552 158,309 72,941 128,564,953
Central 483,788 34,658 659,089 1,503,766 69,238 54,699 22,670,325
South 1,241,359 139,146 751,081 4,929,582 695,855 57,091 20,108,046

In Italy, human salmonellosis is subject to obligatory notification (class II illnesses and
epidemic episodes of class IV), and the data are collected by the Ministry of Health (bulletin
of infectious diseases). Surveillance in animals focuses primarily on poultry species (Dir.
92/117/EEC, replaced by Dir. 2003/99/EC), though there has also been monitoring of cattle,
swine and other species at the local level (Veneto, Latium, Lombardy and Abruzzo regions).
Data on animals are also generated by the diagnostic activities of the IIZZSS. Surveillance of
animal and vegetable foodstuffs is required under the law, and every year more than 20,000
specimens are analysed throughout the national territory for Salmonella. Also active in Italy
is the Enter-NET Italia network, a surveillance system based on laboratories that gather
information annually (serotypes, phage types, profiles of resistance to antibiotics) on more
than 12,000 strains of Salmonella, of which roughly half are of human origin.

In recent years the number of cases notified has ranged from 15,000 in 1996 to 10,000 in
2002, with the levels of incidence ranging from 7.1 cases for every 100,000 inhabitants in
the Apulia region to 56.9 cases for every 100,000 inhabitants in the Autonomous Province
of Bolzano. In a number of regions the figures are considerably underestimated, both on
account of the tendency of the entire system to omit notification and because of the lack
of microbiological diagnosis for a portion of the cases. According to Italian Institute for
Statistics (ISTAT) data on causes of death, Salmonella infections cause approximately
20 deaths per year (ISTAT, 1997-2001), mainly among the elderly.
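The incidence figures quoted above are simple rates per 100,000 inhabitants. Using the notified cases for 2002 (about 10,000) and the resident population given in Section 4.1, the national average can be sketched as:

```python
def incidence_per_100k(cases, population):
    """Annual incidence rate per 100,000 inhabitants."""
    return cases / population * 100_000

# Figures from the text: ~10,000 notified cases in 2002,
# resident population 57,321,070 (31 December 2002)
rate = incidence_per_100k(10_000, 57_321_070)
print(round(rate, 1))  # national average, cases per 100,000
```

The regional spread quoted in the text (7.1 in Apulia to 56.9 in Bolzano) brackets this national average, consistent with the under-notification discussed above.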

Significant food-borne epidemics fall under class IV of the classification of infectious diseases
and are also notified to the National Health System (SSN). In 2002, notification was made of
406 outbreaks (in 280 of which the etiologic agent was identified), resulting in more than
4,000 cases of infection; however, there is no way of establishing how many were due to
Salmonella, because the individual etiologic agents are not listed in the notification data.

The serotypes most frequently isolated in humans over the last 4 years are S. typhimurium, S.
enteritidis, S. infantis, S. derby, S. panama (ITAVARM, 2003). Salmonella enteritidis (SE) and S.
typhimurium (STM) account for approximately three quarters of the human isolates, distinct
in different phage types for epidemiological purposes. The main serotypes isolated in Italy
and reported by the passive laboratory surveillance system ENTER-VET (years 2002-2003) in
animals are STM (23.9%), S. blockley (7%), S. virchow (6.9%), S. hadar (6.9%), S. heidelberg
(6.5%), SE (4.2%), while in foodstuffs of animal origin are STM (20.6%), S. derby (12.1%),
S. hadar (6.8%), S. blockley (6.2%), S. heidelberg (5.6%), S. bredeney (4.2%).

4.2.2. Salmonella spp. resistance in animals and food of animal origin

As previously reported, STM is the serotype most frequently isolated from different animal
species and foodstuffs (23.9% and 20.6% respectively), while SE is rarely isolated from animals
(essentially associated with layers) but is found in foodstuffs, especially eggs, shellfish and
poultry meat (ITAVARM, 2003). As far as the other serotypes are concerned, associations are
found between serotypes and animal species, as in the case of S. blockley in turkeys, S. virchow
and S. hadar in chickens and S. derby in swine, while other animal species show a clear-cut
prevalence of STM.

Resistance to antibiotics is closely related to the Salmonella serotype considered and to the
animal species from which the strain originates. Multi-resistance is a further condition
associated with various serotypes, in particular STM DT 104 and STM DT NT (non-typable;
Table 4), S. hadar (Table 5), S. virchow and S. blockley.
A large number of isolates present 4, 5, 6 or more resistances: in the case of STM DT 104,
the profile of multi-resistance to ampicillin, chloramphenicol, streptomycin, sulphonamides
and tetracycline (ACSSuT) is the most common, while the NT phage type presents a profile
of 4 resistances (ASSuT).

Table 4. Number of multiple resistances (% of isolates) in S. typhimurium isolates, shown by phage type and origin,
Italy, 2003 (data from ENTER-VET).

S. typhimurium S. typhimurium DT 104 S. typhimurium DT NT

No. of resistances Swine Chicken Bovine Turkey Swine Chicken Bovine Turkey Swine Chicken Bovine Turkey

0 16.1 25.0 13.6 15.8 13.3 11.1 40.0
1 7.3 7.1 9.1 5.3 6.7 8.3
2 5.1 14.3 4.5 6.7 5.6
3 7.3 14.3 4.5 6.7 8.3
4 28.5 7.1 13.6 15.8 6.7 41.7 20.0 100.0 100.0
5 27.0 21.4 40.9 15.8 46.7 100.0 87.5 33.3 25.0
6 or more 8.8 10.7 13.6 47.4 13.3 12.5 66.7 40.0

Total 137 28 22 19 15 5 8 6 36 5 1 2

Table 5. Number of multiple resistances (% of isolates) in S. hadar isolates, shown by origin (data from ENTER-VET, 2003).

S. hadar

No. of resistances Swine Chicken Bovine Turkey

0 1.5
2 1.5
3 29.9 100.0 66.7
4 100.0 34.3 33.3
5 23.9
6 or more 9.0

Total 1 67 1 3

In Tables 6 and 7 the evaluation of resistance is shown for the two main phage types, DT NT
and DT 104. In general, STM shows elevated resistance to ampicillin, streptomycin,
tetracycline, sulfonamides and, especially in DT 104, chloramphenicol. Resistance to
trimethoprim is greater than 10% only in isolates from humans and swine, while resistance
to nalidixic acid is frequent in isolates from turkeys, especially in DT 104. Furthermore,
DT104 isolates of human and animal origin show a high frequency of resistance to
chloramphenicol (from 53.3% in swine to 100% in other species), while this resistance is
far less frequent in DT NT (13.9%).
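Multi-resistance profiles such as ACSSuT and ASSuT are obtained by concatenating each isolate's resistances in a conventional drug order. A minimal sketch, with hypothetical isolate records rather than ENTER-VET data:

```python
from collections import Counter

# One-letter codes as used in profiles such as ACSSuT:
# A = ampicillin, C = chloramphenicol, S = streptomycin,
# Su = sulphonamides, T = tetracycline
ORDER = ["A", "C", "S", "Su", "T"]

def profile(resistances):
    """Concatenate an isolate's resistances in conventional order."""
    return "".join(code for code in ORDER if code in resistances)

# Hypothetical STM isolates with their sets of resistances
isolates = [
    {"A", "C", "S", "Su", "T"},   # ACSSuT (the profile typical of DT 104)
    {"A", "C", "S", "Su", "T"},
    {"A", "S", "Su", "T"},        # ASSuT (the profile typical of DT NT)
]
counts = Counter(profile(r) for r in isolates)
print(counts.most_common(1))  # most frequent profile and its count
```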

Table 6. Antimicrobial resistance (%) in S. typhimurium, phage type NT isolates from humans and swine (data from
ENTER-NET, 2003 and ENTER-VET, 2003).

Human (68) Swine (36)

Cefotaxime 0.0 0.0

Ampicillin 92.6 77.8
Streptomycin 88.2 72.2
Gentamicin 1.5 0.0
Kanamycin 1.5 0.0
Nalidixic acid 19.1 11.1
Ciprofloxacin 0.0 0.0
Chloramphenicol 7.4 13.9
Tetracycline 89.7 80.6
Sulfonamides 91.2 77.8
Trimethoprim 10.3 2.8

Table 7. Antimicrobial resistance (%) in S. typhimurium DT104 from humans and different animal species (data from
ENTER-NET, 2003 and ENTER-VET, 2003).

Human (87) Swine (15) Chicken (5) Bovine (8) Turkey (6)

Cefotaxime 0.0 0.0 0.0 0.0 0.0

Ampicillin 98.9 66.7 100.0 100.0 100.0
Streptomycin 97.7 80.0 100.0 100.0 100.0
Gentamicin 1.1 0.0 0.0 0.0 0.0
Kanamycin 3.4 6.7 0.0 0.0 66.7
Nalidixic acid 10.3 6.7 0.0 12.5 66.7
Ciprofloxacin 0.0 0.0 0.0 0.0 0.0
Chloramphenicol 92 53.3 100.0 100.0 100.0
Tetracycline 98.9 73.3 100.0 100.0 100.0
Sulfonamides 97.7 80.0 100.0 100.0 100.0
Trimethoprim 9.2 6.7 0.0 0.0 0.0

In some cases, resistance patterns can be useful as epidemiologic markers when associated
with other features (e.g. PFGE pattern XB0079, resistance to trimethoprim and the ASSuT
pattern are useful to trace back human STM infections caused by strains of swine origin;
Busani et al.).

Resistances in SE are distinctly lower than in STM. In this serotype the most frequent
resistances are to sulfonamides (36.7% in isolates from chicken), nalidixic acid (41.7% in
isolates from swine) and streptomycin (41.7% in isolates from swine), and no multi-resistant
isolate has been detected.

Other noteworthy serotypes included in the monitoring of resistance to antibiotics are S.
infantis, S. hadar and S. virchow, which are isolated primarily from human cases and chickens
(Table 8). Resistance in S. infantis primarily regards sulfonamides, tetracycline, nalidixic acid,
streptomycin and ampicillin, and is greater in isolates of human origin than in those from
chickens. S. hadar shows a significant resistance to kanamycin (16.7%) and, in isolates of
human origin, to gentamicin as well (5.7%). S. virchow shows high levels of resistance to
ampicillin, nalidixic acid and sulfonamides and, especially in human strains, resistance to
streptomycin, gentamicin and trimethoprim. Concern is raised by resistance to cefotaxime
(6.7%), a third-generation cephalosporin, with significant implications for public health.

Table 8. Percentage resistance in Salmonella infantis, S. hadar and S. virchow isolates from humans and chickens (data
from ENTER-NET, 2003 and ENTER-VET, 2003).

S. infantis S. hadar S. virchow

Human (51) Chicken (23) Human (20) Chicken (67) Human (15) Chicken (54)

Cefotaxime 0.0 0.0 0.0 0.0 6.7 1.9

Ampicillin 19.6 8.7 50.0 85.1 56.3 74.1
Streptomycin 20.0 8.7 69.2 47.8 37.5 1.9
Gentamicin 0.0 0.0 5.6 0.0 18.8 0.0
Kanamycin 0.0 0.0 16.7 4.5 0.0 0.0
Nalidixic acid 18.4 8.7 75.0 97.0 75.0 79.6
Ciprofloxacin 2.0 0.0 0.0 0.0 0.0 0.0
Chloramphenicol 0.0 4.3 0.0 3.0 0.0 0.0
Tetracycline 24.4 17.4 58.8 52.2 50.0 5.6
Sulfonamides 8.3 30.4 0.0 41.8 20.0 29.6
Trimethoprim 3.7 4.3 0.0 0.0 16.7 1.9
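Percentages such as those in Table 8 are simply the share of resistant isolates among those tested for each drug. A minimal sketch, with hypothetical per-isolate calls whose proportions were chosen to match the S. hadar human column:

```python
def resistance_percent(results):
    """Percentage of resistant isolates per drug from raw
    susceptibility calls ('R' = resistant, 'S' = susceptible)."""
    pct = {}
    for drug, calls in results.items():
        pct[drug] = 100.0 * calls.count("R") / len(calls)
    return pct

# Hypothetical calls for 20 S. hadar isolates of human origin
results = {
    "nalidixic acid": ["R"] * 15 + ["S"] * 5,   # 15/20 resistant
    "ampicillin":     ["R"] * 10 + ["S"] * 10,  # 10/20 resistant
}
pct = resistance_percent(results)
print(pct)
```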

4.2.3. Enterohaemorrhagic Escherichia coli (EHEC, STEC) and enteropathogenic Escherichia coli (EPEC, AEEC)

Enterohaemorrhagic E. coli (EHEC, STEC) include serotypes carrying genes that code for the
adhesion factor (eae gene) and for Shiga-like toxins (Stx), also known as verocytotoxins (VT),
with the result that they are referred to as STEC or VTEC. In addition, they can produce other
virulence factors. They can cause enteric forms in humans, with extremely serious sequelae
in children (haemolytic-uremic syndrome, thrombocytopenic purpura). The serotypes most
frequently linked to illness in humans include E. coli O:157, O:26, O:111, O:103 and O:145.
A number of these can occasionally be associated with enteric forms in calves or swine. In
swine, oedema disease is associated with E. coli that carry the adhesion factor and secrete
a specific variety of verocytotoxin (serogroups O:138, O:139, O:149). Enteropathogenic E.
coli (AEEC, EPEC) only produce the adhesion factor (eae) and are capable of producing a
characteristic lesion on the surface of the enterocytes, referred to as the “attaching and
effacing lesion”. They do not produce Shiga toxins. They cause diarrhoea in humans, young
ruminants and rabbits. The resistance data presented in this Chapter were produced from
isolates received by the National Institute of Health, Department of Food and Animal Health,
between 2002 and 2003 for confirmation and characterisation. The isolates were assayed at
the National Reference Centre for Antibiotic Resistance, on the basis of a specific panel of
antibiotics, using both the agar diffusion method and the MIC method (Table 9).
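With the broth microdilution (MIC) method mentioned above, each isolate's MIC is compared against drug-specific breakpoints to yield a susceptible/intermediate/resistant call. A minimal sketch; the breakpoint values below are hypothetical, not those of the panel actually used:

```python
def classify_mic(mic, susceptible_bp, resistant_bp):
    """Classify an isolate from its MIC (ug/ml) against breakpoints:
    'S' if MIC <= susceptible breakpoint, 'R' if MIC >= resistant
    breakpoint, 'I' (intermediate) otherwise."""
    if mic <= susceptible_bp:
        return "S"
    if mic >= resistant_bp:
        return "R"
    return "I"

# Hypothetical breakpoints for one drug (ug/ml): S <= 1, R >= 4
print(classify_mic(0.5, 1, 4))  # S
print(classify_mic(2, 1, 4))    # I
print(classify_mic(8, 1, 4))    # R
```

The agar diffusion method works analogously, but compares inhibition-zone diameters against zone breakpoints, with the inequalities reversed (larger zones mean greater susceptibility).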

Despite the fact that the number of isolates from some species is very low, the proportion
of resistant isolates varies in relation to the animal species and, most likely, to different
antimicrobial usage in different farming systems. The resistances in isolates from sheep
and water buffalo are generally lower than those observed in other food animals (between
0% and 3%). In the case of cattle, many drugs present resistances between 4% and 11%,
and no resistance was observed against aminoglycosides or fluoroquinolones; also of interest
is an isolate resistant to cefotaxime. Finally, the isolates from rabbits present higher
resistances (e.g. gentamicin, 83%), at levels of up to 100% for certain molecules
(chloramphenicol, sulfonamides, trimethoprim-sulfamethoxazole, tetracycline).

Table 9. Antimicrobial resistance (%) in Escherichia coli EHEC (STEC) and EPEC (AEEC) isolates from different animal
species, 2002-2003.

Water buffalo (38)* Bovine (17)** Ovine (4)*** Rabbit (6)****

Ampicillin (a) 2.6 12.0 0.0 83
Amoxi/clav ac (a) 0.0 6.0 0.0 33
Cefazolin (a) 0.0 6.0 0.0 0
Cefoxitin (b) 7.9 0.0 0.0 17
Ceftazidime (b) 0.0 0.0 0.0 0
Cefuroxime (b) 0.0 6.0 0.0 0
Cefepime (b) 5.3 6.0 0.0 0
Cefotaxime (a) 0.0 6.0 0.0 0
Tetracycline (a) 2.6 12.0 0.0 100
Streptomycin (c) 2.9 6.0 0.0 100
Kanamycin (c) 0.0 6.0 0.0 83
Gentamicin (a) 0.0 0.0 0.0 83
Amikacin (a) 0.0 0.0 0.0 0
Sulfonamides (c) 2.6 6.0 0.0 100
Trim/sulfa (a) 2.6 6.0 0.0 100
Nalidixic acid (c) 0.0 6.0 0.0 0
Enrofloxacin (c) 0.0 0.0 0.0 0
Chloramphenicol (a) 0.0 6.0 0.0 100

*38/38 VT+ve; **14/17 VT+ve; ***1/4 VT+ve; ****6/6 VT+ve.
(a) tested both with the agar diffusion and the broth microdilution method.
(b) tested with the broth microdilution method.
(c) tested with the agar diffusion method.

4.3. Antimicrobial resistance in animal pathogens

4.3.1. Escherichia coli

Escherichia coli is a common commensal micro-organism found in the intestine of animals
and humans. Being a very flexible species, E. coli can act as an opportunistic pathogen and
cause localised or systemic infections (e.g. urinary infections, septicaemia, enteritis). In farm
animals, particular strains (enterotoxigenic E. coli) are associated with neonatal diarrhoea,
others with weaning and post-weaning enteric diseases and septicaemia. It can also cause
mastitis in dairy cattle and sheep (hyper-acute forms, with fever, depression and, less
frequently, death from endotoxic shock).

The resistances observed in E. coli from clinical isolates (localised and systemic infections,
Table 10) in different animal species may be significantly higher than those of indicator
E. coli. This is due in part to the fact that the collection of isolates from clinical forms may
be affected by sampling bias: biological samples are usually sent to the laboratory following
severe forms of illness, forms with a high incidence rate, or therapeutic failures. In such cases
the laboratory receives a subset of the pathogenic agents that often presents resistance
frequencies higher than those of the total population of pathogens.

Table 10. Antimicrobial resistance (%) in clinical isolates of Escherichia coli from different animal species, Italy.

Bovine (166) Ovine (230) Dogs (122)

Ampicillin 49.1 16.1 40.3
Amoxi/clav ac 10.8 2.2 15.8
Cefazolin 8.8 2.2 15
Cefotaxime 1.2 0 11.5
Tetracycline 57.8 26.3 44.6
Streptomycin 53.2 18.6 35.9
Kanamycin 26.4 3.8 14.8
Spectinomycin 15.7 5.3 15.8
Gentamicin 6.1 1.7 7.6
Amikacin 0.6 0.4 0
Sulfonamides 54.1 18.1 41.5
Trim/sulfa 29.9 8.6 30
Nalidixic acid 31.3 5.1 24.4
Enrofloxacin 25.6 1.3 17.2
Chloramphenicol 28 6.1 19
Colistin 2.4 1.3 5.7

For some farm animal species, such as bovine (Figure 1), resistance to quinolones in clinical
isolates was 31% vs. 15% in the indicator isolates, and resistance to fluoroquinolones
was 25.6% vs. 11.2%.

As regards companion animals (Figure 2), the results presented were produced by monitoring
activities carried out in the City of Rome between 2002 and 2003. In isolates from clinical
forms, the resistance proved to be even higher, especially for certain antimicrobial classes
(the most recent beta-lactams, quinolones, fluoroquinolones and phenicols). A considerable
pool of resistances, both in pathogens and indicators, is evident from these data. Resistances
reported for fluoroquinolones and cephalosporins are 17.2% and 11.5%, respectively. The
frequencies of resistance in the indicators, although not as high, are nevertheless significant.
Many isolates present multi-resistance, including to extended-spectrum beta-lactams,
fluoroquinolones and aminoglycosides. Third-generation cephalosporin resistance in E. coli
from dogs and cats is due to major β-lactamases (CMY-2, SHV-12 and CTX-M), reported for
the first time in E. coli from sick and healthy dogs and cats (Carattoli et al., 2005). Molecular
characterization suggests the presence of several combinations of β-lactamase genes in E. coli
from companion animals. These results confirm the importance of monitoring isolates
from companion animals, as they live in close contact with their owners, thus increasing the
risk of an exchange of pathogens or resistance determinants via commensal bacteria.

[Figure: bar chart of resistance (%) by antimicrobial; series: bovine indicators (n=433) and bovine infections (n=166).]

Figure 1. Antimicrobial resistance (%) in E. coli clinical isolates and E. coli indicators of bovine origin.

[Figure: bar chart of resistance (%) by antimicrobial; series: E. coli indicators (n=189) and E. coli infections (n=122).]

Figure 2. Antimicrobial resistance (%) in E. coli clinical isolates and E. coli indicators from dogs and cats, 2002-2003.

4.3.2. Pasteurellaceae (respiratory pathogens in ruminants)

The bacteria monitored belong to the Pasteurellaceae family. The collection of information
on the following species is thought to be of particular importance: the Pasteurella
haemolytica sensu lato group (Mannheimia haemolytica, Pasteurella trehalosi) and
Pasteurella multocida.

M. haemolytica is definitely the main bacterial pathogen responsible for respiratory disease
in cattle, and the primary target of antimicrobial therapy for respiratory forms in this species.
Its pathogenicity is due to a variety of factors (leukotoxin synthesis, an iron-capturing
system, capsule, LPS, fimbriae, proteases active on immunoglobulins, etc.). It is important
to monitor its resistance, both because of the selection pressure exerted by antimicrobial
drugs on this species and on account of the likelihood that it can transfer resistance genes
to other commensal or zoonotic bacteria. Similar conclusions can be drawn for the other
species of the family.

4.3.3. Mannheimia haemolytica and Pasteurella haemolytica in cattle and sheep

For M. haemolytica in cattle, the only resistances registered were to ampicillin (11.1%),
streptomycin (25%) and nalidixic acid (22.2%). No isolate was resistant to fluoroquinolones
or to phenicols. Considering that all the isolates come from respiratory forms, these results
seem to indicate a prevalence of susceptible micro-organisms (Figure 3).

[Figure: bar chart of resistance (%) by antimicrobial; series: ovine (n=20) and bovine (n=9).]

Figure 3. Antimicrobial resistance (%) in Pasteurella haemolytica sensu lato, clinical isolates of ovine and bovine origin.

In the case of sheep, resistances in P. haemolytica sensu lato seem comparable to those
observed in cattle, with two isolates (9.5%) resistant to nalidixic acid, one to ampicillin and
one to trimethoprim-sulfamethoxazole (4.5%), and none resistant to fluoroquinolones or to
phenicols. It should be noted that the selective pressure of antibiotics in sheep is generally
lower than in cattle, since rearing is primarily extensive and many of the antimicrobial drugs
already used in cattle have not yet been employed in this species.

The resistance to ampicillin and similar molecules is caused by the ROB1 gene, found on
a plasmid and described for the first time in a human isolate of Haemophilus influenzae
responsible for a case of meningitis in the USA. The origin of the gene would appear to be
traceable to Gram-positive bacterial species, with a subsequent transfer to Pasteurellaceae
(P. multocida, M. haemolytica) and Actinobacillus pleuropneumoniae. This example confirms
the importance of monitoring the emergence and diffusion of resistance in animal pathogens,
which are the fundamental reason for the use of antibiotics in livestock.

4.3.4. Staphylococcus aureus and coagulase-positive Staphylococci

Staphylococci are Gram-positive cocci, responsible for various forms of illness in livestock,
pets and humans. Among bovine livestock (as well as sheep and goats) they cause mastitis,
one of the most economically significant infectious diseases in dairy cattle. These infections
prove difficult to treat, often becoming chronic or subclinical, and they require careful
preventive action to limit their spread to other animals. Even in subclinical form, they are
one of the main causes of increased somatic cell counts in milk.

In human medicine, hospital-acquired clones may present reduced susceptibility or
resistance to certain classes of valuable drugs. Of particular concern are methicillin-resistant S.
aureus (MRSA) strains, a possible cause of serious nosocomial infections. In this case the
resistance to beta-lactams is due to the presence of a cassette gene named Staphylococcal
cassette chromosome mec (SCCmec), a mobile genetic element composed of the mec gene
complex, which encodes methicillin resistance. In the case of MRSA infections, the only effective
therapy is with the glycopeptides vancomycin or teicoplanin. Recently, in addition to the
MRSA strains, other strains with reduced susceptibility or resistance to glycopeptides have
emerged (Glycopeptide Intermediate Staphylococcus aureus, GISA; Glycopeptide Resistant
Staphylococcus aureus, GRSA), creating significant therapeutic difficulties, with a high risk
of severe consequences for the patient.

Coagulase-positive Staphylococci from bovine mastitis

The resistance frequencies for coagulase-positive Staphylococci from bovine mastitis (Figure
4) are high for penicillin and ampicillin (43% and 44% respectively) and significant for
tetracycline (13%) and streptomycin (8%), molecules widely used in veterinary medicine
in the last decades. No resistance was observed to oxacillin, the drug tested to reveal
methicillin resistance (MRSA).

Coagulase-positive Staphylococci from dogs



Figure 4. Antimicrobial resistance (%) in coagulase + ve Staphylococci isolated from mastitis milk from
dairy cattle (n=115).


For the first year of activity, 78 isolates tested in 2002-2003 are available. Of these, 27%
came from necropsied animals, while the remaining 63% were from infections in diseased
animals (otitis, conjunctivitis, endometritis, etc.).

The highest frequencies of resistance are observed for molecules widely used in the veterinary
practice of companion animals (Figure 5), such as the beta-lactams (more than 65% resistance
to ampicillin and penicillin), tetracyclines (46.7%) and streptomycin (25.6%). No isolates



Figure 5. Antimicrobial resistance (%) in coagulase + ve Staphylococcus clinical isolates from dogs.

were found resistant to first-generation cephalosporins or to oxacillin. Only 4 out of 78
isolates tested (5.1%) showed resistance to fluoroquinolones, which are widely used in the
treatment of companion animals.

4.4. Antimicrobial resistance in indicator bacteria

The surveillance of resistance to antibiotics in commensal bacteria isolated from the intestinal
content of randomly selected animals at slaughter provides valuable data on the pool of
resistance determinants found in bacteria of animal origin. This pool is the consequence of the
selective pressure to which the bacterial species have been subjected as a result of the use
of antibiotics both for therapeutic purposes and for growth promotion. In this respect, E.
coli, as the indicator for Gram-negative bacteria, and Enterococcus faecium and E. faecalis, for
Gram-positive bacteria, have been studied. The monitoring of resistance frequencies among
different animal species and different production-lines within the same species (e.g. dairy cattle, beef cattle, veal calves) allows comparison of the effects of selective pressure and is
a useful tool, as an early alert system, for tracking emerging resistances in livestock.

Data regarding indicator bacteria are drawn from the active monitoring carried out in the
years 2002 and 2003. The monitoring was performed on cattle, swine, sheep and poultry.

4.4.1. Indicator E. coli

Figure 6 presents the results of antibiotic resistance monitoring in indicator E. coli from
different farm animal species.

The frequencies of resistance to the panel of antibiotics assayed vary among the different
drugs tested and among the animal species of origin. In general, the resistances in sheep
provide a “baseline” for the other livestock species, owing to the extensive approach to
sheep raising in Italy. The pool of resistances rises when bovine livestock are considered,
with peaks for several drugs in swine and chickens, which are subject to intensive raising
systems that probably entail a wider use of antibiotic drugs to treat bacterial diseases under
high concentrations of animals per unit of surface area.

Resistance to ampicillin, streptomycin, sulphonamides and tetracycline varies from 11% to
19% in isolates from sheep and from 24% to 41% in isolates from cattle, and is approximately
50% in isolates from swine and chickens. Resistances to nalidixic acid and cefotaxime appear
to be associated with poultry (49.6% and 2.0% in chickens, respectively) and resistance
[Figure 6: resistance (%) by antimicrobial; ovine n=346, bovine n=660, chicken n=258, swine n=255]
Figure 6. Antimicrobial resistance in indicator faecal E. coli from different farm animal species.

to spectinomycin with swine (28.6%). As far as fluoroquinolones are concerned, resistance to
enrofloxacin was greater than 10% in isolates from cattle and chickens, less than 5% in
isolates from sheep and 1% in isolates from swine. Resistance to cefotaxime was less than
5% in chickens, and very low (0.3-0.6%) in ruminants.


5. Conclusions

5.1. What has been achieved

An integrated monitoring system for antimicrobial resistance in the veterinary field was
established in Italy, relying on an epidemiologic approach and complemented by epidemiologic
studies exploring the attitudes and practices of veterinarians regarding the use of antimicrobial
drugs in certain farming systems. These studies provide valuable information, especially when
quantitative data on the use of antimicrobials are not readily available.

5.2. What has been neglected / what needs to be done

In the near future, the main objectives to be achieved for the Italian veterinary resistance
monitoring are:

• Obtain reliable information on the usage of antimicrobials in farm and companion animals
in Italy. At present, no web-based information system is available for collecting data on
prescriptions, the number of animals treated, or the duration of treatment.
• Make the collection of isolates to be tested more representative of the Italian territory,
to obtain more accurate estimates of resistance rates in all classes of bacteria.
• Include representative data on thermophilic Campylobacter resistance in livestock. This
objective will be achieved from 2004 onwards, as soon as resistance data become available
from the main farm animal species: poultry, swine and cattle.




Use of veterinary epidemiology to improve food

safety along the food chain: An industry perspective
on Salmonella
Lis Alban and Stine G. Goldbach
Danish Bacon and Meat Council, Vinkelvej 11, DK-8620, Denmark,


This paper describes some of the initiatives implemented by the Danish pig industry with
the aim of reducing the prevalence of Salmonella. Initially, the main focus was on pre-harvest
initiatives and the correct identification of herds with high levels of Salmonella.
Subsequently, the focus shifted to post-harvest initiatives, such as improved slaughter hygiene.
Recently, decontamination applied after slaughter and cost-effectiveness in surveillance have
received increased attention. The Danish system has proved successful in that the
number of human salmonellosis cases attributable to pork has declined. However, if the aim
is to obtain further reductions, post-harvest initiatives will be more cost-effective than
pre-harvest initiatives. This knowledge can be used to develop and implement appropriate
types of surveillance programmes in other countries.

Keywords: Salmonella, surveillance, pre-harvest, post-harvest, food safety

1. Introduction

Infection with Salmonella is rarely associated with clinical disease in pigs. However, control
is important because the public is concerned about the human health impact. In Denmark,
the number of human cases of salmonellosis due to the consumption of pork increased during
the 1990s and reached a maximum in 1993, when a total of 1,100 cases were reported,
corresponding to an incidence of around 20 per 100,000 inhabitants (Figure 1; Anon., 2004).
This was politically unacceptable and led to the development and implementation of a
surveillance-and-control programme for slaughter pig herds in 1995 (Mousing et al., 1997).
Since then, several initiatives have been carried out to identify risk-mitigating factors
along the entire food chain, ranging from the control of feedstuffs, through separate transport
of finishers from herds with high levels of Salmonella, to the development of slaughter
procedures leading to improved hygiene. Recently, attention has been concentrated on the
application of different decontamination procedures at the slaughterhouse, with a focus on
the cost-effectiveness of both surveillance and control.

The action plan has been expensive for the Danish pig industry, as around 90 million Euros
were spent during the years 1995 to 2002. At the same time, a substantial amount of money
was saved in societal costs; e.g. one study indicated 28 to 66 million Euros in saved costs,
depending on the assumed degree of underreporting of human cases. Around



[Figure 1: cases per 100,000 inhabitants by year; sources: broilers, pork, eggs]

Figure 1. The estimated major sources for human salmonellosis in Denmark 1998-2003. Source: The
Danish Zoonosis Centre.

70-90% of these costs are related to working days lost (Nielsen and Korsgaard, 2003).
However, the question should be raised whether the same improvement in food safety might
have been achieved in a cheaper way. Additionally, how can a cost-efficient further reduction
in the level of Salmonella in pork be obtained? To address these issues, the evolution of the
Danish surveillance-and-control programme for slaughter pig herds will be presented.

2. Pre-harvest initiatives in the early phase

Basically, Salmonella can be controlled in the pre-harvest phase (in the herd), in the post-
harvest phase (at the abattoir), or in both. In the early years, the focus was mainly on pre-
harvest initiatives. In experiments and large-scale epidemiological studies it was observed
that acidified feed (Jørgensen et al., 2001) or home-mixed feed (Kranker et al., 2001) resulted
in a reduced Salmonella burden in pigs. Likewise, rodent control and limited commingling of
pigs seemed to be important to avoid transmission of Salmonella.

2.1. The phase-1 hypothesis

The phase-1 hypothesis was that if pigs presented at slaughter harboured little or no
Salmonella, then food safety would be assured. This implied that it was essential to classify
herds according to the Salmonella level found in the herd. Therefore, a classification scheme
was developed aiming at dividing the slaughter pig herds into three levels based on the
prevalence data in the previous three months (Mousing et al., 1997):

• Level 1.
– No or few reactors.
– No interventions in the herd.
• Level 2.
– A moderate proportion of reactors.
– Pen faecal samples must be taken and a reduction plan is recommended.

166  Towards a risk-based chain control

Lis Alban and Stine G. Goldbach

• Level 3.
– An unacceptably high proportion of reactors.
– Pen faecal samples must be taken and a reduction plan is recommended.
– Sanitary slaughter of all finishers is compulsory.

2.2. The Salmonella Mix-ELISA

Concurrent improvements in serology made it possible to measure antibodies against
Salmonella on a large scale by using meat-juice samples collected at the abattoir in connection
with the slaughter of finishers (Nielsen et al., 1995; Nielsen et al., 1998). The antibody
content was analysed with the Danish Mix-ELISA, which detects 90% of the Salmonella
serotypes known to occur in Danish pigs (Nielsen et al., 1995).

2.3. The revision of the classification scheme

The design of the initial classification scheme was based on laboratory results and theoretical
knowledge. After 5 years, sufficient data were collected and the classification system was
revised. The aim was to improve the association between serology and bacteriology on herd
level and to identify positive herds sooner than in the first classification scheme. The data
used for the revision originated from two different screenings: a random screening in 1,962
finisher herds for Salmonella Typhimurium DT104 (Anon., 1998) and a detailed screening
for Salmonella in 167 herds (Sørensen et al., 2004). In both sets of data, information was
available about serology and bacteriology for ten finishers per herd.

Our epidemiological analyses showed a clear association between herd serology and the
probability of identifying Salmonella bacteria both in the caecal content (Figure 2; Alban
et al., 2002) and on the carcass (Sørensen et al., 2004). Sample sizes were adjusted with
the aim of being able to detect Salmonella, if present, at a minimum prevalence of 5% seen
over one year. This could be obtained if 60 samples were taken annually in a herd

[Figure 2: scatter plot; x-axis: % positive meat-juice samples evaluated at cut-off 40 OD% (0-80); y-axis: % positive caecal samples; series: random screening in 1,962 herds and the 167 study herds]

Figure 2. The association between serology and bacteriology (based on 10 caecal samples per herd) for
Salmonella in finishers based on two Danish studies, one in 1,962 herds and another in 167 herds.


producing <2,000 finishers, 75 in a herd producing 2,001-5,000 finishers, and 100 samples
in a herd producing more than 5,000 finishers a year (Alban et al., 2002).
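As a rough check on these sample sizes, the probability of detecting at least one positive at prevalence p with n samples is 1 - (1 - p)^n. The sketch below assumes independent samples, which real within-herd sampling only approximates, and the handling of a herd producing exactly 2,000 finishers is an assumption (the text specifies <2,000 and 2,001-5,000):

```python
def annual_samples(finishers_per_year):
    """Annual meat-juice sample size by herd size (after Alban et al., 2002).
    Treating exactly 2,000 finishers as the smallest class is an assumption."""
    if finishers_per_year <= 2000:
        return 60
    if finishers_per_year <= 5000:
        return 75
    return 100

def detection_probability(n_samples, prevalence=0.05):
    """Probability of at least one positive, assuming independent samples."""
    return 1.0 - (1.0 - prevalence) ** n_samples
```

With 60 annual samples and a 5% within-herd prevalence this gives roughly a 95% chance of at least one positive over the year, consistent with the stated design goal.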

Studies also indicated that the lower the cut-off of the individual meat-juice result, the
higher the correlation between serology and bacteriology. Unfortunately, the lower the cut-off,
the higher the likelihood of a false-positive result. It was of interest to introduce
a Level 0 for seronegative herds; however, a false-positive result would be detrimental for
a Level-0 herd. Therefore, we only lowered the cut-off from 40 OD% to 20 OD%, and not
down to 10 OD%.

Moreover, it was noted that the association between serology and microbiology improved
when the Salmonella results of the previous three months were weighted 3:1:1. In this way,
the result of the most recent month counts three times as much as the results of the
two preceding months. The weights were identified as the (rounded-off) parameters that
gave the best fit in a logistic regression model describing the association between the
proportion of Salmonella-positive caecal samples and serology. The weighted average of the
seroprevalence is called the serological Salmonella index for slaughter pig herds. The effect
of the weighting is illustrated by the example in Table 1. Note that a herd with an
increasing seroprevalence will enter a higher level sooner when weighting is used than
without weighting. Likewise, when the seroprevalence is decreasing, the herd will leave
the high Salmonella level sooner than in a system without weighting of the monthly results.
The farmers have appreciated the effect of the weighting because it acts as an incentive to
reduce Salmonella in the herd.

The revised scheme started in 2001 and resulted in a 20% reduction in the number of
samples collected, without compromising food safety. By March 2005, 95.7% of the slaughter
pig herds were in Level 1, 3.3% in Level 2, and 1.0% in Level 3; the latter were subjected to
sanitary slaughter.

Table 1. Example of the effect of weighting serology results on the allocation of a finisher herd into one of the
3 herd levels of the Danish Salmonella scheme.

Results of serology (measured in OD%) in previous 3 months

August: 10, 15, 10, 15, 10 = 0 / 5 positive ~ 0%
September: 15, 25, 15, 30, 10 = 2 / 5 positive ~ 40%
October: 25, 30, 15, 28, 10 = 3 / 5 positive ~ 60%

Weighted average of herd prevalence

0.6 x 60% + 0.2 x 40% + 0.2 x 0% = 44% => level 2

Simple average of previous example

(60% + 40% + 0%) / 3 = 33% => level 1
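The index calculation in Table 1 can be written out directly; a small sketch, assuming (as in the table) that a sample counts as positive when its value exceeds the 20 OD% cut-off:

```python
CUT_OFF_OD = 20  # revised cut-off, in OD%

def monthly_seroprevalence(od_values, cut_off=CUT_OFF_OD):
    """Percentage of meat-juice samples above the cut-off in one month."""
    positive = sum(1 for od in od_values if od > cut_off)
    return 100.0 * positive / len(od_values)

def salmonella_index(current, previous, before_previous):
    """Serological Salmonella index: monthly seroprevalences (%) weighted
    3:1:1, most recent month first."""
    return (3 * current + previous + before_previous) / 5.0

# Reproducing Table 1 (October is the most recent month):
october = monthly_seroprevalence([25, 30, 15, 28, 10])  # 3/5 positive -> 60.0
index = salmonella_index(october, 40.0, 0.0)            # 44.0 -> Level 2
```

The simple (unweighted) average of the same three months is 33%, which illustrates why the weighted index reacts faster to a rising seroprevalence.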

168  Towards a risk-based chain control

Lis Alban and Stine G. Goldbach

A deduction system was also introduced as an incentive for the farmers in Level 2 and Level
3. Currently (2004), the deduction is 2% of the carcass value for a Level-2 finisher. For
Level-3 finishers, the deduction is related to the number of months the herd has been in
Level 3: it begins at 4%, then increases to 6% and eventually reaches 8%. Only a few
farmers will be able to make money while paying these deductions.
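The deduction schedule can be sketched as a simple function. Note that the text gives the percentages but not the exact month thresholds at which the Level-3 deduction escalates, so the thresholds below are illustrative assumptions:

```python
def deduction_pct(level, months_in_level3=0):
    """Deduction as % of carcass value under the 2004 scheme. The month
    thresholds for the 4% -> 6% -> 8% escalation are not given in the
    text; the values used here are assumptions for illustration."""
    if level == 1:
        return 0.0
    if level == 2:
        return 2.0
    # Level 3: the deduction grows with time spent in Level 3.
    if months_in_level3 <= 3:   # assumed threshold
        return 4.0
    if months_in_level3 <= 6:   # assumed threshold
        return 6.0
    return 8.0
```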

2.4. Risk-based surveillance

Recently, a study on the effect of introducing risk-based surveillance demonstrated that
it is possible to reduce sampling in herds with no Salmonella without jeopardizing human
health (Enøe et al., 2004). According to the present programme, 5 samples should be taken
monthly in a herd that produces 201-3,000 finishers annually (Alban et al., 2002). However,
simulation using surveillance data indicated that the number of samples in herds with a zero
seroprevalence could be reduced to 1 sample per month. Risk-based Salmonella surveillance
will be implemented in Denmark in mid-2005 and is expected to lower the number of samples
taken by 20-25% (Danish Bacon and Meat Council).

3. Post-harvest initiatives in the second phase

3.1. The phase-2 hypothesis

There were indications that even if the swine herds with the highest Salmonella burden
were excluded, there would still be enough Salmonella in the remaining herds to fuel the
system with Salmonella (Alban and Stärk, 2005). This changed the focus of Salmonella
reduction from pre-harvest to post-harvest. Moreover, it was of interest to identify how a
further reduction could be obtained in a cost-effective way. A simulation model was built
to represent pig production from the piggery to the slaughter plant. The model was based
on all available data and expert opinion. Figure 3 describes the Salmonella contamination
of the pig/carcass all the way from loading on the farm to pork after chilling. It
is noted that the prevalence of Salmonella increases during transport and lairage, and
reaches its maximum at the time of slaughter. Thereafter the Salmonella prevalence is reduced
substantially due to singeing, but increases again because of polishing, evisceration, and
veterinary inspection, etc. (Figure 3). This led to the phase-2 hypothesis: if one focuses
on identifying ways of hygienic slaughter, a low Salmonella burden on the final
carcass will result. The simulation model was used to study the effect of different
intervention measures on the Salmonella prevalence of the final carcass. One example was to
halve the proportion of herds with a large Salmonella burden. This strategy turned out to
be very expensive and had only a limited effect, because the Salmonella load in the remaining
system is sufficient to fuel it with Salmonella. The simulations performed demonstrated that
concurrent improvements throughout the production chain are needed to further reduce
the prevalence of Salmonella on pig carcasses (Alban and Stärk, 2005). This implies e.g. a
high temperature at singeing, enclosing the anus and rectum in a plastic bag, and improved
disinfection of tools (Alban and Stärk, 2005).
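The stage-by-stage bookkeeping such a model performs can be sketched in toy form. The multiplicative stage effects below are purely hypothetical illustrations, not the parameters of Alban and Stärk (2005):

```python
# Hypothetical stage effects on carcass Salmonella prevalence: factors > 1
# model cross- or recontamination, factors < 1 model reduction steps.
STAGE_EFFECTS = [
    ("transport", 1.3),
    ("lairage", 1.2),
    ("singeing", 0.1),
    ("polishing/evisceration", 1.8),
    ("chilling", 0.8),
]

def simulate_chain(start_prevalence):
    """Return the prevalence after each stage, capped at 100%."""
    prevalence = start_prevalence
    trace = [("loading", prevalence)]
    for stage, factor in STAGE_EFFECTS:
        prevalence = min(1.0, prevalence * factor)
        trace.append((stage, prevalence))
    return trace
```

With a starting prevalence of 5%, these toy factors reproduce the qualitative pattern of Figure 3: a rise until slaughter, a sharp dip at singeing, and a partial rebound thereafter.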



Figure 3. Simulated contamination of Salmonella from loading at the farm until pork after chilling (based
on Alban and Stärk, 2005).

3.2. Cost-benefit analyses of national Salmonella control strategies

In the past few years, the Danish pig industry has focussed more on the costs of reducing
Salmonella. Recently, a cost-benefit analysis was conducted with the aim of identifying the
economic efficiency of a range of possible national control strategies against Salmonella.
The following four strategies (two pre-harvest and two post-harvest) were evaluated:

1. Use of acidified feed for finishers.
2. Use of home-mixed feed for finishers.
3. Sanitary slaughter of all pigs from higher levels of Salmonella (Levels 2 and 3).
4. Hot-water decontamination after slaughter of all pigs.

Only hot-water decontamination, which implies showering pig carcasses with 80°C hot water
for 14-16 seconds directly after slaughter, turned out to be socio-economically efficient, and
only provided that the pre-harvest surveillance was reduced (Table 2). For all four strategies,
the industry as a whole would bear all expenses, and society would gain all benefits (Goldbach
and Alban, 2005). A disadvantage associated with the present type of hot-water decontamination
is its large consumption of water, which might be considered environmentally problematic.
Furthermore, the equipment requires a lot of space, which makes it difficult to install at
several abattoirs.
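The discounted net benefits reported in such analyses are ordinary present-value sums over the 2005-2020 horizon; a minimal sketch, in which the 3% discount rate and the benefit stream are illustrative assumptions rather than figures from the study:

```python
def net_present_value(annual_net_benefits, discount_rate=0.03):
    """Discount a stream of annual net benefits (years 1..n) to today.
    The 3% discount rate is an illustrative assumption."""
    return sum(benefit / (1.0 + discount_rate) ** year
               for year, benefit in enumerate(annual_net_benefits, start=1))
```

For instance, a single net benefit of 103 million Euros received one year from now is worth about 100 million Euros today at a 3% rate.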

A subsequent study investigated the cost-effectiveness of different scenarios for reducing the
Salmonella prevalence in pigs and pork along the stable-to-table chain (Nielsen et al., 2005).
The aim was to analyse how to attain a Salmonella prevalence of 1.2% in pork at the lowest
possible costs. This prevalence level was agreed upon by the Danish Bacon and Meat Council
and the veterinary authorities. The intention of the industry was to provide a catalogue of
170  Towards a risk-based chain control

Lis Alban and Stine G. Goldbach

Table 2. Results of a cost-benefit analysis of four different national strategies against Salmonella in Danish pork.

Discounted net benefits (mill. EUR)   Hot water decontamination      Sanitary       Home-mixed   Acidified
for the time period 2005-2020         With existing  With reduced    slaughter (B)  feed (C)     feed (D)
                                      control (A1)   control (A2)

Net benefits, consumers                   23.5           23.5            1.9            8.9         11.6
Net benefits, national authorities         9.7           14.4            0.7            2.6          3.4
Sub total                                 33.2           37.9            2.5           11.5         15.0
Net benefits, farmers                        0           21.5            -           -354.5        -95.0a
Net benefits, DBMCb                          0            8.5            -              -            -
Net benefits, slaughterhouses            -50.6          -53.3          -50.1           -0.8          3.3a
Sub total pig sector                     -50.6          -23.3          -50.1         -355.3        -91.7
Total net present value                  -17.3           14.6          -47.6         -343.9        -76.7

a Net benefits without penalties; b Danish Bacon & Meat Council.

the effects of individual intervention measures for the future control of Salmonella in the
Danish pig sector. Nine scenarios were listed reflecting all socially acceptable strategies to
reduce the Salmonella prevalence. These were:

1. Increased efforts in herds with a high Salmonella level.
2. Use of home-mixed feed for finishers.
3. Use of acidified feed for finishers.
4. Separate transportation and lairage of herds with a low Salmonella level.
5. Increased efforts at abattoirs dealing with pigs from herds with a high Salmonella prevalence.
6. Hot water decontamination of herds with a high Salmonella level (above index 20).
7. Hot water decontamination at specific abattoirs.
8. Use of hand-held steam-sucking.
9. Hot water decontamination of all finishers.

For each scenario, it was evaluated how it would affect the Salmonella prevalence and whether
the goal to attain a prevalence of 1.2% would be realistic. Furthermore, each scenario was
evaluated with respect to its feasibility, investment and annual costs as well as market
aspects. The results are summarized in Table 3.

It was concluded that further pre-harvest initiatives would not be cost-effective compared
with post-harvest measures. Furthermore, none of the pre-harvest scenarios would reduce the
Salmonella prevalence in pork by 2006 to a degree sufficient


Table 3. Results of a cost-effectiveness analysis of nine national strategies against Salmonella in Danish pork (based
on Nielsen et al., 2005).

Scenario                                 Feasibility     Effect            PV mill. Euro  1st year costs  Attained a
                                                                          (15 years)     mill. Euro      prevalence
                                                                                                         of 1.2%

Increased effort in herds with a         Difficult       Small             ÷154           14              No
high Salmonella level
Use of home-mixed feed for finishers     Very difficult  Small             ÷355           366             No
Use of acidified feed for finishers      Very difficult  Small             ÷92            8.6             No
Separate transportation and lairage      Difficult       Moderate          ÷50            5.2             No
of herds with low Salmonella level
Increased effort at abattoirs dealing    Difficult       Moderate to good  NC             NC              Yes
with pigs from herds with a high
Salmonella prevalence
Hot water decontamination of herds       Difficult       Good              ÷22            3.4             Yes
with high Salmonella level (above
index 20)
Hot water decontamination at             Possible        Good              ÷8 - ÷11       2.4-3.2         Yes
specific abattoirs
Use of hand-held steam-sucking           Possible        Moderate to good  ÷13 - ÷22      1.6-2.7         Yes
Hot water decontamination of all         Difficult       Good              ÷23            12              Yes
finishers

PV: Present Value for the period 2005-2020; NC: Not Calculated; ÷ denotes a negative value.

to attain the prevalence level agreed upon with the veterinary authorities (Nielsen et al., 2005).

Other decontamination initiatives are under development. Currently, the Danish Bacon and
Meat Council is testing hand-held decontamination by use of steam on high-risk areas of
the carcass. Another promising alternative is treatment of carcasses with a combination of
steam and ultrasound, which will be pursued further in the future.

3.3. Blast chilling is effective

As previously mentioned, there are several measures at the abattoir that might be introduced
to reduce the level of Salmonella. Studies on E. coli have demonstrated that effective blast
chilling may reduce the level of surface bacteria by 0.7-0.9 log (Dalsgaard and Andersson,
1999; Jensen and Christensen, 2000). Dalsgaard and Andersson (1999) also found that the
prevalence of Salmonella was reduced by 31%. So even if the killing effect is limited when
measured against E. coli, blast chilling might result in a substantial reduction in the number
of Salmonella-positive carcasses. When using blast chilling, processing too many carcasses
per hour should be avoided, since this may cause capacity problems and reduce the effect.
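To put the 0.7-0.9 log figures in perspective, a log10 reduction of r leaves a fraction 10^(-r) of the original surface count; the sketch below is simple arithmetic, not data from the cited studies:

```python
def surviving_fraction(log_reduction):
    """Fraction of surface bacteria remaining after a given log10 reduction."""
    return 10.0 ** (-log_reduction)

# A 0.7-0.9 log reduction leaves roughly 13-20% of the original count.
best_case, worst_case = surviving_fraction(0.9), surviving_fraction(0.7)
```

So although more than 80% of indicator bacteria are removed, carcasses carrying low initial numbers of Salmonella may fall below the detection threshold entirely, which is consistent with the reported 31% drop in prevalence.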

4. Conclusions

4.1. What has been achieved

Veterinary epidemiology, in combination with disciplines like microbiology and food hygiene,
has contributed considerably to the development and evaluation of the Danish surveillance-
and-control programme for slaughter pig herds. The entire process, from the beginning in
1995 until today, has resulted in successful surveillance for Salmonella through increased
knowledge about the pre- and post-harvest dynamics. The number of human salmonellosis
cases attributable to pork has declined, e.g. to 202 in 2003 (Figure 1; Anon., 2004). In spite
of these achievements, it has also been noticed that the Salmonella prevalence in pork has
remained constant at around 1.4% since 2001 (Anon., 2004).

4.2. What has been neglected

To continue the process of reducing the Salmonella level of pork, the Danish Salmonella
control strategy seems to need a revision. Moreover, the Danish pig industry pays the major
part of the expenses related to the national Salmonella programme in pigs. This situation has
increased the industry's focus on the cost-effectiveness of ways to obtain further reductions
of the Salmonella prevalence in pork.

4.3. What needs to be done

4.3.1. The industry and the authorities in collaboration

Food safety is a consumer demand, and although the food industry is interested in meeting
this demand, the public authorities often enforce some degree of control on the industry
to ensure that a certain level of food safety is indeed afforded. Such control measures are
accepted by the industry.

The industry acts as a player in a society with social norms for what is acceptable and what
is not. In Denmark, the general norm is that food safety – in this case the “absence” of
Salmonella – should be achieved without the use of irradiation or decontamination with lactic acid or chlorine. The industry must comply with this norm, and does so. However, it
should be legitimate for the industry to obtain the agreed level of food safety in the most
cost-effective way – as long as it acts within the given social norms.

The competitive power of the pig industry depends strongly on production costs. If very costly Salmonella control measures are enforced on the industry by the public authorities, this will inevitably affect its competitive power relative to pig industries in other countries
that are not subject to the same demands. If the pig sector is to maintain its competitiveness,
it will need to focus heavily on production costs. Therefore, it is necessary that a given level of food safety is obtained at the lowest possible cost. Furthermore, it is preferable that regulations are enforced at EU level, because this will ensure fair competition between pig production systems in different countries.

4.3.2. Future Salmonella surveillance in an EU perspective

The new EU Zoonoses Directive (2003/99/EC) will be implemented in the years to come, and in this context surveillance programmes for Salmonella and possibly other zoonoses will be developed and implemented in many EU countries. It is recommended that the development of these surveillance programmes takes place in a dialogue between the pig industry and the veterinary authorities, as has been the case in Denmark. The lessons learned in Denmark – good as well as bad – can be used to develop and implement the appropriate type of surveillance programme for the individual country. This will depend – among other things – on how widespread Salmonella is in the national pig industry and which initiatives have already been put in place.

4.3.3. Where and how to combat Salmonella

It is possible to reduce the Salmonella burden pre-harvest, but it would be very expensive to eliminate Salmonella from the infected herds. It is questionable whether this is the most cost-effective approach if increased food safety is the goal. If further reductions are sought, our analyses show that post-harvest initiatives will be more cost-effective than pre-harvest initiatives. There are various ways to reduce Salmonella post-harvest – and in general a combination of measures is the most cost-effective. For the consumer, the main interest must be whether food is safe to eat – not how this is achieved.
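The cost-effectiveness comparison above can be made concrete as a cost per avoided human case. The figures below are hypothetical placeholders for illustration only, not results from the cited cost-benefit analyses.

```python
# Sketch: comparing a pre-harvest and a post-harvest intervention by cost per
# avoided human salmonellosis case. All numbers are invented placeholders.

def cost_per_avoided_case(annual_cost: float, cases_avoided: float) -> float:
    return annual_cost / cases_avoided

interventions = {
    "pre-harvest (e.g. herd sanitation)": {"annual_cost": 5_000_000, "cases_avoided": 50},
    "post-harvest (e.g. blast chilling)": {"annual_cost": 2_000_000, "cases_avoided": 80},
}

for name, d in interventions.items():
    ratio = cost_per_avoided_case(d["annual_cost"], d["cases_avoided"])
    print(f"{name}: {ratio:,.0f} per avoided case")

# The intervention with the lowest ratio delivers the agreed level of food
# safety at the lowest cost - the criterion the text argues for.
best = min(interventions, key=lambda k: cost_per_avoided_case(**interventions[k]))
```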

References

Anon., 1998. Bacteriological screening for Salmonella in finisher herds – Multi-resistant Salmonella Typhimurium
DT104 (in Danish). Danish Veterinary and Food Administration. 46 p. plus appendix.
Anon., 2004. Annual Report on Zoonoses in Denmark 2003. Danish Zoonosis Centre, Copenhagen, Denmark. 31 p.
Alban, L. and Stärk, K.D.C., 2005. Where should the effort be put to reduce the Salmonella prevalence in the
slaughtered swine carcass effectively? Prev. Vet. Med. 68, 63-79.
Alban, L., Stege, H. and Dahl, J., 2002. The new classification system for slaughter-pig herds in the Danish Salmonella
surveillance-and-control program. Prev. Vet. Med. 53, 133-146.
Dalsgaard, B. and Andersson, M., 1999. Assessment of the safety of products associated with blast chilling (in
Danish). Danish Meat Research Institute, Roskilde, Denmark. Report No. 48,231, March 29, 1999.
Enøe, C., Wachmann, H. and Boes, J., 2004. Low intensity serological surveillance for Salmonella enterica in slaughter
pigs from low prevalence herds in Denmark. In: 10th International Symposium for Veterinary Epidemiology and
Economics, Nov. 17-21, 2003. Viña del Mar, Chile. Proceeding No. 745.
Goldbach, S.G. and Alban, L., 2005. Cost-benefit analysis of measures to reduce Salmonella in Danish pork. In:
Proceedings from SafePork. Sept. 6-9, 2005. Rohnert Park, California, USA.
Jensen, T. and Christensen, H., 2000. Decontamination – Documentation of effect related to investigations of the
slaughtering (in Danish). Danish Meat Research Institute, Roskilde, Denmark. Report No. 19,361, August 23, 2000.
Jørgensen, L., Kjærsgaard, H. D., Wachmann, H., Borg Jensen, B. and Knudsen, K.E.B., 2001. Effect of pelleting
and use of lactic acid in feed on Salmonella prevalence and productivity in weaners. In: Proceedings Salinpork,
September 2-5, 2001. Leipzig, Germany, p. 109-111.
Kranker, S., Dahl, J. and Wingstrand, A., 2001. Bacteriological and serological examination and risk factor analysis
for Salmonella occurrence in sow herds, including risk factors for high Salmonella seroprevalence in receiver
finisher herds. Berl. Münch. Tierärztlich. Wochenschr. 114, 350-352.
Mousing, J., Thode Jensen, P., Halgaard, C., Bager, F., Feld, N., Nielsen, B., Nielsen, J.P. and Bech-Nielsen, S.,
1997. Nation-wide Salmonella enterica surveillance and control in Danish slaughter swine herds. Prev. Vet.
Med. 29, 247-261.
Nielsen, B., Baggesen, D.L., Bager, F., Haugegaard, J. and Lind, P., 1995. The serological response to Salmonella
serovars Typhimurium and Infantis in experimentally infected pigs. The time course followed with an indirect
anti-LPS ELISA and bacteriological examinations. Vet. Microbiol. 47, 205-218.
Nielsen, B., Ekeroth, L., Bager, F. and Lind, P., 1998. Use of muscle juice as a source of antibodies for large scale
serological surveillance of Salmonella infection in slaughter pig herds. J. Vet. Diag. Invest. 10, 158-163.
Nielsen, B. and Korsgaard, H.B., 2003. Estimated society cost for pork-related Salmonella and Yersinia in Denmark
2002. In: Proceedings SafePork, international symposium on the epidemiology and control of foodborne
pathogens in pork, Crete, Greece, Oct. 1-4, 2003, p. 114-116.
Nielsen, B., Dahl, J., Goldbach, S.G. and Christensen, H., 2005. Cost-effectiveness analyses of the Danish Salmonella
Control Strategy. In: Proceedings from SafePork, Sept. 6-9, 2005. Rohnert Park, California, USA.
Sørensen, L.L., Alban, L., Nielsen, B. and Dahl, J., 2004. The correlation between Salmonella serology and isolation
of Salmonella in Danish pigs at slaughter. Vet. Microbiol. 101, 131-141.


Epidemiological surveillance in primary and processing food production in the network of "Istituti Zooprofilattici" in Italy
Giorgio Varisco, Giuseppe Bolzoni, Elena Cosciani Cunico and Paolo Boni
Istituto Zooprofilattico Sperimentale della Lombardia e dell'Emilia Romagna "B. Ubertini", Via Bianchi 9, 25124 Brescia, Italy.


Abstract

The expansion of the European Union and the globalisation of international markets have
fundamentally changed the agricultural-industrial productive sector. These changes initially
affected the large production industries but are increasingly also influencing primary
agricultural production (rearing, cultivation, animal feed) as well as small production units
(traditional and typical food products). The latter are numerous in certain European countries,
particularly in Italy. Such a fragmented market, with diverse needs arising from very particular situations, complicates the provision of technical-health assistance and of controls that guarantee affordable health safety for the consumer. Thus, a single information system needs to be created that can epidemiologically survey primary production, analyse food processing, give complete risk assessments and create an efficient control system for public health needs. An example of this approach is the SIVARS (Veterinary Information System for Sanitary Risk Assessment) developed by the Italian Istituti Zooprofilattici Sperimentali, the research and service laboratories of the national veterinary health system in Italy. In this contribution we outline some examples of the existing and planned activities to show the potential of a system that, albeit still under construction, covers the whole food production chain.

Keywords: food safety, surveillance, SIVARS, veterinary public health, challenge test

1. Introduction

Guaranteeing food safety to serve consumer health protection has been a well-known objective of the EU since the first drafting of its White Paper on Food Safety. Food safety can only be guaranteed through three interdependent steps:

• Creation of an information network on food safety.
• Risk analysis based on collected information.
• Interaction with the consumer based on risk analysis and the information network.

All European laws on food safety surveillance have particularly highlighted the following points:

• The surveillance has to include the whole production chain (from stable to table), because only through the homogeneous extension of control and checking systems can an adequate level of effectiveness be reached.
• Risk prevention and prediction activities have to prevail over restrictive ones in a clear
and open system.
• The only way to realize adequate prevention systems is to base these on thorough and
widespread knowledge on risks and their analysis, that can only be gained through
experimentation, research and field data collection.
• The scientific information collected should be distributed and shared as widely as possible, because this is indispensable and represents the common basis for all operators of the various food production and transformation sectors all over the EU. From this comes the need for uniform systems for data finding and collection, and integrated systems for information sharing.

2. Scientific data organization

Control and surveillance systems and information networks represent an indispensable support, not only for each epidemiological exercise (for example the surveillance of a particular disease), but especially because they constitute an operational base that, at its best, can be activated any time a new safety need or an unexpected dangerous situation arises. Thus, the modularity and continuity of the supervision system are as essential as the control and availability of the collected epidemiological data themselves.

Furthermore, the importance of control and surveillance is independent of any specific purpose they may have been designed for. However, they are fundamental when risk analysis is to be based on scientific evidence. Traditional food safety approaches, even if supported by epidemiological and statistical data with no contrary evidence, do not suffice. Each assertion of food safety must be based on scientific and properly documented evidence. Documentation means not only objectively demonstrating a particular event but also assessing its influence – for each production step or variable – on the safety of that particular food. Pasteurized milk, for example, poses no particular problem because the impact on pathogen growth of both the production technology and the pasteurization process is known. With isothermal microbial inactivation, it is possible to establish the absence of a specific pathogen (performance criterion) at a certain time-temperature combination (process criterion). This does, however, not apply for most food transformation processes – including cheese and pasteurized milk – when new variables are introduced.
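The link between process criterion (time-temperature combination) and performance criterion (log reduction) can be sketched with the classical D- and z-value model of isothermal inactivation. The parameter values below (D = 0.5 min at 63 °C, z = 5 °C) are illustrative assumptions, not validated data for any particular pathogen.

```python
# D/z-value model: D(T) is the time for a 1-log (decimal) reduction at
# temperature T; z is the temperature shift that changes D tenfold.

def d_value(temp_c: float, d_ref: float, t_ref: float, z: float) -> float:
    """Decimal reduction time (min) at temp_c, given D at a reference temperature."""
    return d_ref * 10 ** ((t_ref - temp_c) / z)

def time_for_reduction(target_log: float, temp_c: float,
                       d_ref: float, t_ref: float, z: float) -> float:
    """Holding time (min) at temp_c that meets a target_log performance criterion."""
    return target_log * d_value(temp_c, d_ref, t_ref, z)

D_REF, T_REF, Z = 0.5, 63.0, 5.0    # assumed: D = 0.5 min at 63 degC, z = 5 degC

# The same 6-log performance criterion is met by different process criteria:
for temp in (63.0, 68.0, 72.0):
    t_needed = time_for_reduction(6.0, temp, D_REF, T_REF, Z)
    print(f"{temp:.0f} degC: {t_needed * 60:.1f} s for a 6-log reduction")
```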

Veterinary public health implies not only controlling whether operators apply hygiene norms in food production, but also creating an information structure that can guide health control and a critical evaluation of the producers' ability to ensure public health. Unlike ten years ago, we can now communicate without huge economic or human resource costs. Still, the internet, mobile phones, automation and control systems are only instruments; the real challenge is using them in the right way. A great amount of data already exists: farmers (sometimes using computer systems) record their own data, there are dairy records for cheese production, and the analysis laboratories have identified both the animals and their production data. However, each represents a closed system, generally without links with or access to the other systems. What we need is to create an open system capable of exchanging this information for better use. The new laws on food safety are binding and are observed for this reason but, by themselves, they do not necessarily improve food safety. Technological, sanitary and scientific knowledge can be put to even better use because – however complex the challenge – the more knowledge is acquired, the more it will contribute to this purpose. This is the motivation for collecting homogeneous data in a standard and comparable way throughout Europe.

3. The surveillance network of the "Istituti Zooprofilattici Sperimentali" in Italy

The Istituti Zooprofilattici use a national network to communicate with both national and international institutions, sharing information and research results. This network could serve as the basis of an information system for the epidemiological surveillance of primary production and processing, as part of a general food safety information system. The Institutes' laboratories comprise 10 central headquarters and 90 diagnostic departments located throughout Italy. This allows a methodical and independent collection of a great amount of data concerning animal health, farm hygiene and food production. The data collected by the larger veterinary laboratory networks in Europe may, if correctly collected and organized, be an important source for an integrated food safety information system. Under Regional Veterinary Authority coordination and National Health Ministry surveillance, the Italian information system can be considered a support to veterinary public health, both for prevention and for planning. The acquisition of information on the characteristics of technological processes to control or prevent pathogens in food is the basis for the development of risk prevention in food. Finally, in order to improve food quality standards for consumers' health, information stored in databases can also be used to interact with and inform consumers.

3.1. The constitution of an information system

National information collection by the Istituti Zooprofilattici Sperimentali surveillance network serves an important and multifaceted function. The information system SIVARS (Veterinary Information System for Sanitary Risk Assessment) relies on a computerized database of laboratory findings divided into two parts: the first (SIPP) concerns primary production, farm characteristics and raw materials; the second (SITA) concerns food transformation and product technologies. The information is organized in data series, some accessible to everybody, others restricted to certain areas. The primary production section has a database of all sanitary information from the producer for every type of food. Connected to a registry, the system can show the general situation of all available sanitary data, from animal health to food quality, deriving from both HACCP programmes and official planning and controls. Data are accessible to both farmers and sanitary authorities and respect the laws on privacy.
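The two-part structure described above might be modelled, in a much simplified form, as linked records. The field names below are illustrative assumptions, not the actual SIVARS schema.

```python
# Sketch of linked primary-production (SIPP) and food-transformation (SITA)
# records; the product-to-farm link is what enables stable-to-table tracing.

from dataclasses import dataclass, field

@dataclass
class SippRecord:                     # primary production section (hypothetical fields)
    farm_id: str
    species: str
    sanitary_findings: list = field(default_factory=list)   # lab results, HACCP data

@dataclass
class SitaRecord:                     # food transformation section (hypothetical fields)
    product_name: str
    source_farm_ids: list
    technology: str
    public: bool = True               # consumer-facing vs restricted data

farm = SippRecord("IT-BS-001", "dairy cattle")
farm.sanitary_findings.append({"test": "S. aureus bulk milk", "result": "negative"})

cheese = SitaRecord("Grana-type cheese", [farm.farm_id], "cooked hard cheese")
print(cheese.source_farm_ids)
```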

The food transformation section contains elements relating to each food, with access rights depending on the type of data. Each food product is described starting from its trade name and image, followed by information about its appearance to the consumer, the producer and the trade network. Production technologies, product composition, distinctive characteristics and other information for consumers (e.g. allergenic components, factors of food intolerance, or the exclusion of particular items for people with specific pathologies) are also collected. In a more reserved section there are also specifics on the product, such as production technologies, acceptability standards and risk analysis elements. Scientific knowledge is collected for predictive analysis of microbial development and of the characteristics of each food with regard to their interaction with the growth of experimentally added pathogens (challenge tests). Furthermore, historical data on specific pathogens are collected for each product, such as critical process points, as well as on products, environments and materials that constitute complementary risks.
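The predictive analysis of microbial development mentioned above can be illustrated with a minimal three-phase growth curve of the kind fitted to challenge-test data. All parameters (initial level, lag time, growth rate, maximum density, acceptability limit) are invented for illustration.

```python
# Minimal predictive-microbiology sketch: log-linear growth of an inoculated
# pathogen with a lag phase and a maximum density, then a search for the time
# at which an assumed acceptability limit is exceeded.

def log_count(t_h: float, n0_log: float = 2.0, lag_h: float = 5.0,
              mu_log_per_h: float = 0.3, nmax_log: float = 8.0) -> float:
    """log10 CFU/g at time t_h (hours) for a simple three-phase growth curve."""
    if t_h <= lag_h:
        return n0_log                                   # lag phase: no growth
    return min(n0_log + mu_log_per_h * (t_h - lag_h), nmax_log)

limit = 5.0            # assumed acceptability limit, log10 CFU/g
t = 0.0
while log_count(t) < limit:
    t += 0.5
print(f"limit of {limit} log10 CFU/g reached after about {t:.1f} h")
```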

The information is grouped by geography, typology and product characteristics, for example typical or traditional products.

4. Primary food production

The principal objective of epidemiological surveillance is to collect standardised data concerning the health of production animals. Animal health problems are the first concern but, given that the animals' final destination is human consumption, the safety of all animal products is indirectly guaranteed provided these problems are solved.

The cost of controlling and monitoring primary production is the most restricting factor. Whether the costs are met by industry or by the public sector, they still determine the difference between what 'should' or 'would be nice' to be done and what 'can' be done. One only needs to think of the costs of sampling various biological materials (blood, faeces, milk, etc.), which generally requires a great deal of time and energy.

Thus, the fact that these operations occur within an optimised programme is no longer simply an operational alternative to improve the cost-benefit ratio, but rather becomes the principal, indispensable element of a useful and efficient system. The more the various needs are concentrated into a single operative system, the better the final logistical and cost results will be.

It may seem obvious to use the same biological matrix (e.g. milk in the case of cattle farming) for a series of diagnostic trials in a single general programme. However, it is one of the weaknesses of traditional control systems that they have always been planned and developed independently. Furthermore, analytical laboratory activities (much of which have already been carried out for different means and aims) are great sources of information that are only minimally used due to the lack of a single system for data collection, management and distribution. This represents a waste of resources and time. Therefore, it is essential to use the same matrix for a whole series of analyses with various end results, to be able to realize the positive effects of epidemiological surveillance. Indeed, it is exactly these aspects that make control economically possible. The truly innovative aspect is the central collection and management system for analytical data. The Information System for Primary Production (S.I.P.P.) of the Italian Istituti Zooprofilattici Sperimentali is an example of the possibilities of this type of approach to epidemiological surveillance. Once the obvious technical problems of language, applied analytical methodology and interpretation have been solved, the system yields results that are both relevant and all-embracing. The 'input' phase of the system merely consists of sharing the tasks already carried out on a daily basis by the individual laboratories.

4.1. Milk as a source of information

As was the case in many parts of Europe, the quality-based milk payment system started in Lombardy at the end of the 1970s. It developed through the 1980s with a single aim: the redistribution of milk income on the basis of quality. From this activity the system has been progressively improved, and farmers, dairies and the processing industry have all benefited.

The assessment of quality is based on periodic sampling (2-4 times a month) and is carried out by milk collection personnel who have been trained for this purpose (Table 1). Once at the laboratory, these samples can also be used for a wide range of other purposes, for example:

• Analysis, on request of the farmer, for pathogenic agents or specific antibodies to monitor
the health of a herd or to carry out general self-controls.
• Physical-chemical analysis for technological purposes (casein, coagulation aspects, etc.)
on request of the dairy.
• Particular controls dictated by the veterinary services during emergencies (as in the case of Aflatoxin M1 at the end of 2003), or to monitor compliance with Italian Law 169/89 (D.M. 185 concerning high-quality milk) for those farmers whose product is destined for drinking milk, or with DPR 54/97 (EU Directive 92/46) for the sale of milk.

Table 1. Milk quality payment – analytical activity 2004 (lab. Produzioni Zootecniche e Sorveglianza Epidemiologica degli allevamenti, I.Z.S.L.E.R. Brescia, Italy).

Dairy farms: 6,473            Samples: 155,375
Cheese industry: 247          Total analyses: 1,486,094

Obligatory parameters      Number     Average
Somatic cells              314,917    155,375 /ml
Total bacteria count       155,375    66,250 /ml
Fat                        155,375    3.93 mg/100 ml
Protein                    155,375    3.37 mg/100 ml
Lactose                    155,375    5.04 mg/100 ml
Inhibitory residues        155,375    0.343% of

Optional parameters        Number     Average
Casein                     118,306    2.64 mg/100 ml
Anaerobic spore formers    62,213     325 spores/L
R.S.M.                     113,086    9.14%
Urea                       121,218    23.25
Acidity                    2,151      3.25 °SH/50 ml
Freezing point             134,808    -0.526 °C

More importantly, these samples can be used for even wider research that goes beyond the single farm. For example, they can offer information about particular environmental contamination when large geographical areas need to be monitored for general pollution and health safety reasons.

It should be stressed that the creation of an epidemiological surveillance system does not simply multiply existing data, i.e. increase the analyses carried out on individual samples. The latter would simply increase costs unless included in a generalised system that has the very aim of creating a single efficient network of data elaboration and distribution in which the whole sector collaborates.

4.2. From udder health to the safety of the milk

Undoubtedly, mastitis is the main sanitary and production problem in intensive dairy farming. Mastitis prevention programmes that cover whole-dairy management significantly increase the economic yield, since contagious mastitis is associated with lengthy (sometimes multi-annual) interventions that include periodic controls of the infection status of individual cattle by bacteriological examination of milk from both single quarters and whole udder samples.

For a farm, the considerable amount of laboratory analysis has a single aim: to identify infected cattle and to monitor that cured animals maintain their healthy status. However, these data are also the basis for other measurements that go beyond the farmers' needs and – if included in a widespread system – could prove useful, for instance, for epidemiological research and geographic and health studies. In the case of Staphylococcus aureus mastitis, for example, a control programme has been in force in the province of Brescia for over 7 years (Bertocchi et al., 2005). Apart from the obvious benefits for individual farms (Table 2), this has provided:

1. For animal health:
• Indications of the level of spread of this pathogen over time in one geographic area.
• Indications of the spread of antibiotic resistance among the strains, with particular reference to pharmacological resistance.
2. For food safety:
• The evaluation of strains causing mastitis, and of their capability to produce exotoxins.
• The evaluation of strains found in the milk, with particular regard to their origin from the environment (milking, housing, animal – i.e. the environment in the broad sense) or from the udder.
• Identification of process steps to eliminate milk contamination, above all in dairy products made from unpasteurised milk.
• Identification of the process conditions capable of eliminating or reducing the production of toxins by contaminating strains.

These various (partially experimental) research areas are still only major examples of the use of such information. It is obvious that including other micro-organisms (e.g. zoonotic agents) or other types of farm health problems in the control programme would increase the possible applications.

Table 2. Data of 100 dairy farms applying an intramammary Staphylococcus aureus infection control programme (from Bertocchi et al., 2005).

                          Initial situation 1998   Current situation 2004
Number of cows            9,085                    9,315
Number of infected cows   3,494                    866
% of infected cows        38.5%                    9.5%
BTSCC (mean)              446,000                  279,000

BTSCC (mean), totality of dairy farms in Province of Brescia:
1998: 338,000    2001: 311,000    2004: 288,000

4.3. Pharmacological surveillance

In a single year, the bacteriological laboratory of the milk department of the Brescia IZSLER analyses over 100,000 samples to identify pathogenic micro-organisms of the udder from mammary or quarter milk samples, and evaluates the pharmacological resistance of about 1,000 different bacterial strains (Table 3). This, for instance, enables assessment of the sensitivity of isolated bacterial strains to the principal active antibiotics used in mastitis treatment (Table 4 shows some examples of the degree of sensitivity; Varisco et al., 2004). These data, together with data collected from other laboratories, appear ideal for conducting a pharmacological resistance study and for detecting the appearance of multi-resistant strains, which constitute a major problem in human medicine (Bolzoni et al., 2005).

This activity is replicated (albeit to varying degrees), both in the other laboratories of the Istituto Zooprofilattico della Lombardia e dell'Emilia and indeed in the other 9 institutes in Italy,

Table 3. Bacteriological analytical activity (lab. Produzioni Zootecniche e Sorveglianza Epidemiologica degli allevamenti, I.Z.S.L.E.R. Brescia, Italy).

Type of analysis                                            Samples 2004   Samples 2003
Antibiogram (Kirby-Bauer)                                   902            796
Bacteriological isolation and identification                3,303          3,797
Mycoplasma                                                  31             37
Staphylococcus aureus (Baird-Parker with RPF supplement)    49,878         45,973
Streptococcus agalactiae (T.K.T.)                           60,444         52,333
Moulds and yeasts                                           57             38
Total analysed                                              114,615        102,974

Table 4. Antimicrobial sensitivity of bacteria (%), examples 2001-2004 (from Varisco et al., 2004).

Antimicrobial   S. aureus  E. coli   E. faecalis  Serratia sp.  Klebsiella sp.  Staf. C– Haem.  Staf. C– Not Haem.  Str. uberis
agent           (n = 767)  (n = 195) (n = 305)    (n = 64)      (n = 98)        (n = 304)       (n = 41)            (n = 138)

Erythromycin    67.57      0.52      44.75        1.58          0               81.73           82.93               54.07
Penicillin      34.42      0         91.38        0             0               48.08           65.85               80.98
Tetracycline    72.89      56.25     35.89        19.05         45.4            81.73           70                  40.6
Rifaximin       96.47      0.86      74.88        0             0               98.53           100                 88.24
Tylosin         84.6       0         48.44        0             0               86.49           n.t.                63.16
Ampicillin      37.89      51.81     92.94        6.95          0               58.65           65.85               80.88
Spiramycin      59.05      0.56      41.69        0             0               75              73.17               52.71
Cefquinome      96.21      99.45     85.87        n.t.          60              97.94           100                 95.38
Cephalothin     98.43      39.89     81.73        10.94         87.2            97.12           97.56               95.49
Spectinomycin   21.09      62.96     25.23        61.29         55.2            53              35                  38.76
Kanamycin       91.48      76.56     49.69        93.75         66.3            89.42           97.56               52.24
Pirlimycin      89.8       0         64.54        2.08          3.7             90.91           88.57               75

as well as in numerous public and private laboratories (e.g. sections of the provincial farmers' associations, universities, etc.). Thus, the value of central collection and evaluation of these still heterogeneous data becomes enormous.
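Once centrally collected, sensitivity results of the kind shown in Table 4 support simple cross-laboratory queries. The sketch below filters the S. aureus column for agents above a sensitivity threshold; the 90% cut-off is an illustrative choice, not a clinical breakpoint.

```python
# Query sketch over centrally collected antimicrobial-sensitivity data,
# using the S. aureus percentages from Table 4.

s_aureus_sensitivity = {          # % of isolates sensitive (Table 4)
    "Erythromycin": 67.57, "Penicillin": 34.42, "Tetracycline": 72.89,
    "Rifaximin": 96.47, "Tylosin": 84.6, "Ampicillin": 37.89,
    "Spiramycin": 59.05, "Cefquinome": 96.21, "Cephalothin": 98.43,
    "Spectinomycin": 21.09, "Kanamycin": 91.48, "Pirlimycin": 89.8,
}

def widely_effective(data: dict, threshold: float = 90.0) -> list:
    """Agents to which at least `threshold` % of isolates were sensitive."""
    return sorted(agent for agent, pct in data.items() if pct >= threshold)

print(widely_effective(s_aureus_sensitivity))
```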

It should be repeated, however, that our true mission, rather than simply "putting the information together", is making it compatible and comparable (overcoming technical, logistical and organisational problems that in certain cases should not be underestimated). The same argument is valid at international community level, with the difference that, e.g. at EU level, the problems are amplified given the lack of efficient databases to define health and food safety guidelines.

4.4. Data for public veterinary health activity

Obligatory vaccination campaigns have historically been important for combating infectious diseases, even to the point of eradicating whole categories of disease (Foot and Mouth Disease, Tuberculosis, Brucellosis, Classical Swine Fever, etc.).

Depending on the appearance of a disease, its spread, its pathogenesis in both animals and man, and finally the costs involved, decisions to vaccinate are taken periodically. Obviously, the cost/benefit evaluation is very important, as the possibility of preventing or limiting a disease must be balanced against economic consequences that vary between strategies (stamping out, blocking or limiting exports, etc.).

The availability of accurate information that reflects the reality of the situation will condition these types of decisions. Unfortunately, information is usually extremely fragmented, not evenly distributed and not necessarily up to date, as it is generated daily in tens of laboratories and by thousands of operators. It is in emergency situations that the lack of a single monitoring system is felt. The appearance of new diseases (BSE or highly pathogenic Avian Influenza), outbreaks of infections so far considered exotic (Contagious Bovine Pleuropneumonia) or considered eradicated (Foot and Mouth Disease), and the sudden appearance of toxic substances in foodstuffs (Aflatoxin M1 in milk) are examples of such situations.

The fact is that widespread and continuous controls for diseases that are not currently present would be extremely uneconomical. An epidemiological surveillance system would allow monitoring of this type of situation, as it would simply mean adding pre-defined controls that generate meaningful statistics to an already operating system. Assessing the economic impact of such situations would also become possible with this system.

On a limited scale, the "Aflatoxin M1 case" (Bertocchi et al., 2004) that hit the dairy sector in Italy at the end of 2003 is illustrative. The periodic monitoring carried out by 4 milk treatment companies (Figure 1), as well as that carried out for the epidemiological surveillance of bulk tank milk samples from individual dairies by the Istituto Zooprofilattico of Brescia (Figure 2), showed in real time the emergency situation that occurred between September and October 2003. This supplied the information needed by the health authorities for an emergency intervention which, albeit with varying results, made the following possible within a period of only 2-3 months:

• To quickly conduct a quantitative evaluation of the problem.
• To reduce to a minimum the risks of consuming potentially dangerous quantities of toxin.
• To reduce the economic consequences by quickly controlling stored milk and – in case findings were abnormal – intervening only in the dairies where abnormal results were found.

Figure 1. Average milk Aflatoxin M1 (AFM1, ppt) in 4 dairy factories of Brescia province, 2000-2003, with 2003 shown by quarter (from Bertocchi et al., 2004).


Figure 2. Milk samples with Aflatoxin M1 values > 50 ppt (legal limit), January-September 2003 (from Bertocchi et al., 2004).

Overall, an “unexpected” emergency that would otherwise have been difficult to cope with in
such a complex practical situation could thus be faced rapidly and efficiently. An indirect
demonstration of the validity of surveillance is the fact that during 2004 and 2005 more
dairies and food processing companies in this sector requested periodic controls for
Aflatoxin M1 in their milk.
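The alert logic behind such periodic monitoring - comparing bulk tank milk results against the 50 ppt legal limit shown in Figure 2 - can be sketched in a few lines; the dairy codes and AFM1 values below are invented for illustration:

```python
# Sketch of the surveillance check behind Figure 2: flag bulk tank milk
# samples whose Aflatoxin M1 level exceeds the 50 ppt legal limit.
# Dairy codes and values are invented for illustration.
LEGAL_LIMIT_PPT = 50.0

samples = [
    {"dairy": "A", "month": "2003-09", "afm1_ppt": 32.0},
    {"dairy": "B", "month": "2003-09", "afm1_ppt": 71.5},
    {"dairy": "C", "month": "2003-10", "afm1_ppt": 55.0},
]

over_limit = [s for s in samples if s["afm1_ppt"] > LEGAL_LIMIT_PPT]
share = len(over_limit) / len(samples)
print(f"{len(over_limit)} of {len(samples)} samples over limit ({share:.0%})")
```

Only the dairies flagged in `over_limit` would then be targeted for intervention, as described above.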

Other opportunities offered by an Epidemiological Surveillance System are retrospective surveys,
important for the serological diagnosis of viral infections but also applicable to other sectors.
Essentially, this is a variant of the ‘serum bank’ concept, which is generally feasible only in certain
centralised and highly specialised structures. The conservation of samples, or the registration
of additional ‘beyond routine’ analytical parameters, allows particularly useful research
where this is necessary to understand infection routes in new geographical situations. Such
activities are generally very expensive and cumbersome when carried out by individual
laboratories, and they are difficult to execute when a comprehensive approach is required.
However, if programmed and coordinated within a single information system, a holistic
approach becomes feasible thanks to the subdivision of tasks between the various operative units.

A small example is a recent study on “meat juice” samples obtained from pig carcasses (easier
to collect than traditional blood samples from live animals) slaughtered in different areas.
These samples were taken principally to detect pharmacologically active residues. However,
they could also be used for specific diagnostic tests for purposes of virus
research, or for assessing antibody levels to evaluate the presence of viral pathogens or the
antibody coverage of vaccinations. Finally, they could be used to determine certain metabolic
parameters for measuring stress in transported or slaughtered animals.

Milk samples, rather than blood samples, from cattle farms were analyzed for purposes of
detecting Brucellosis, but these were also used for an initial evaluation of the spread of Q Fever
in an Italian region. The process that was started for food processing (with the European Food
Safety Authority and the relevant national agencies) should therefore result in a uniform
system for primary production. Farmers are part of the same system and should
consequently be held equally responsible as food sector operators (see Regulation (EC) No.
178/2002), while at the same time pursuing their own particular needs and tasks. This is the
way forward for a surveillance system which, while it may seem a contradiction, is both more
effective and more cost-efficient than traditional systems.

4.5. Processed food production

Processing the data on principal products is the most technically complex phase for risk
analysis. Each product has both general and specific characteristics. The general characteristics
define the generic commercial product category, e.g. Parma ham, Bresaola, Grana Padano,
etc. The specific characteristics reflect the differences between products within the same
commercial category. For instance, the Milan salami of one producer may differ from that of
another - even when the same recipe and processes are adhered to - due to small variables
such as the environment or the process itself. Therefore, technical descriptions and details
of processing are important for the risk analysis of food safety, and constitute one of the main
priorities of SITA (Sistema Informativo sulla Trasformazione Alimentare, Information System on
Processed Food Production) within the SIVARS information system.

Another ambitious objective is to collect and organize, for each product, data that describe
the life cycles of the various bacteria present in the food, both as normal contaminating
flora and as added ‘starters’. Indeed, the concentration of micro-organisms varies over time
according to the production process, ripening and storage of the foodstuff until the moment
of consumption. Environmental parameters like temperature and water activity change
the fermentation processes in foodstuffs like salami and cheese. The resulting changes in micro-
environmental parameters such as temperature, pH and aw affect the preservation of the food
(McMeekin et al., 1993). The application of different production processes to obtain a particular
foodstuff leads to different microbiological profiles and therefore to distinct health
risks. These considerations underline the need to characterize products thoroughly for their
microbiological profile, temperature, pH and aw value, allowing one to look beyond commercial
categories or brand names. Some examples are shown in Figures 3, 4 and 5.
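As a minimal sketch of how such a characterization might be stored, a record could couple a product's commercial category and producer (its general and specific characteristics) with a time series of process parameters. The field names and values below are illustrative assumptions, not the actual SITA/SIVARS schema:

```python
# Illustrative record structure for a product's process profile
# (temperature, pH, aw over time); field names and example values are
# hypothetical, not the actual SITA/SIVARS schema.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProcessPoint:
    time_h: float
    temperature_c: Optional[float] = None
    ph: Optional[float] = None
    aw: Optional[float] = None

@dataclass
class ProductRecord:
    commercial_category: str   # general characteristic, e.g. a D.O.P. name
    producer: str              # specific characteristics differ per producer
    profile: List[ProcessPoint] = field(default_factory=list)

record = ProductRecord("Grana Padano D.O.P.", "factory A")
record.profile.append(ProcessPoint(time_h=0.0, temperature_c=33.0, ph=6.4))
record.profile.append(ProcessPoint(time_h=1.5, temperature_c=54.0, ph=6.3))
```

Two records sharing a commercial category but differing in producer and profile capture exactly the general/specific distinction described above.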

4.6. Microbiological product standards

Using an information system, we can use the analytical data on various products, collected
during the microbiological characterization phases of these products, as control parameters for
processing. This becomes particularly important in the production of traditional products, which
are defined by their bacterial flora. Research and analysis laboratories can support these products
with specialized analysis programmes and data management. For example, the data obtained
from the bacterial counts of a product can be processed mathematically to calculate the
specific bacterial growth rate, using e.g. the DMfit software (Baranyi and Roberts, 1994). Knowing
the dominant bacterial behaviour [e.g. the lactic flora of fermented products (Figure 6)] allows one
to control other indicative parameters of the process. If the microbiological profile is coherent
with that of other production lots analyzed throughout the year (the process standard), this
indicates that, e.g., acidification has occurred correctly, or that bio-competition against possible
pathogens or against flora of low hygiene standard possibly present in the raw material has been
effective. If, on the other hand, the microbiological profile does not match the process standard,
then important modifications must have occurred and further evaluation of the risks associated
with the foodstuff is indicated.

Figure 3. Temperature profile during the cheese making of Grana Padano D.O.P. cheese
(temperature, °C, vs. time, h; stages marked: curd adding, curd breaking, start of cooking,
curd leaving).

Figure 4. pH profile at the beginning of Grana Padano D.O.P. cheese making (pH vs. time,
0-48 h; fitted curve with max/min bounds).

Collecting and comparing microbiological profiles of foodstuffs with the same name
(e.g. Bergamo Salami) but originating from different factories, allows the definition of
microbiological standards for these products (Figure 7).

The production of a foodstuff with microbiological profiles different from the product standard
would necessitate a re-evaluation of the possible risks insofar as they differ from those of
the defined ‘basic’ product. SIVARS not only records the defined microbiological profiles of
foodstuffs (lactic bacteria, micrococci, yeasts and moulds), but also the growth, survival and
death characteristics of micro-organisms that are potentially dangerous for human health or
that indicate low hygiene standards (coagulase-positive staphylococci, Escherichia coli,
coliforms, enterococci, sulphite-reducing clostridia) (Figure 8).

Figure 5. aw profile during “Salame della Bergamasca” salami processing and ripening (aw vs.
time, 0-70 d; fitted curve with max/min bounds).

Time (days)         0       1        4           5            6-21
Fitted log cfu/g    4.65    5.49     7.99        8.82         8.83
Fitted cfu/g        44,668  309,030  97,723,722  660,693,448  676,082,975
Standard error      0.73

Salami type: Cacciatorino; bacteria: Lactobacilli mesophili.
Figure 6. Lactobacillus mesophili behaviour during the processing and ripening of “Salame
Cacciatorino” (log cfu/g vs. time, 0-700 h; fitted curve with max/min bounds). 3 batches were
analysed, each batch included 3 salami. Time on the graph is in hours; in the table, in days.


Time (days)         7            98
Fitted log cfu/g    8.83         8.83
Fitted cfu/g        676,082,975  676,082,975
Standard error      0.51

Salami type: “Salame della Bergamasca”; bacteria: Lactobacilli mesophili.
Figure 7. Lactobacillus mesophili behaviour during the processing and ripening of “Salame della
Bergamasca” (log cfu/g vs. time, 0-2,500 h; fitted curve with max/min bounds). 26 batches were
analysed, each batch included 1 salami, from three different companies. Time on the graph is in
hours; in the table, in days.

4.7. Health Risks

The intra- and extra-Community trade in foodstuffs calls for food safety programs that ensure
that production processes conform to the food standards of the importing country and
safeguard consumers’ health in general. The information system is therefore structured to
include a section, called ‘health risks’, in which production process data are evaluated against
the norms of the USDA and other international organizations, aiming to safeguard foodstuffs
meant for exportation. The following section presents some examples of challenge tests made
for this purpose.

“Grana Padano” D.O.P., “Bergamo Salami” and spicy “Calabrian Salami” were all experimentally
contaminated with high concentrations (about 10^7 cfu/g) of pathogens such as Salmonella
enterica serovar Typhimurium, Salmonella enterica serovar Enteritidis, Listeria monocytogenes,
Staphylococcus aureus, Escherichia coli O157:H7, Clostridium botulinum and Bacillus cereus. In
the case of Grana Padano D.O.P., about 900 litres of milk were contaminated inside typical
copper heaters. The cheese was produced from the contaminated milk according to production


Time (days)         15     30    3 months  4 months
Fitted log cfu/g    3.13   2.93  2.11      1.7
Fitted cfu/g        1,349  851   129       50
Standard error      0.78

Cheese type: Formaggio Branzi; bacteria: Escherichia coli.
Figure 8. Escherichia coli behaviour during the processing and ripening of “Formaggio Branzi”
cheese (log cfu/g vs. time, 0-10,000 h; fitted curve with max/min bounds). 12 batches were
analysed, each batch included 1 cheese slice, from one company. Time on the graph is in hours.

processes and with the aid of dairy experts (Losio et al., 2003; Boni et al., 2004). Bergamo
Salami and spicy Calabrian Salami were produced and ripened at ISZLER according to the
indications, and with the aid, of the producers. The meat was contaminated, then processed and
ripened according to the time, temperature and water activity indicated by the production
criteria (Cosciani Cunico et al., 2004; Daminelli et al., 2003). During the production process, the
death rate of the bacteria and the decimal reduction time were calculated using the DMfit software.

For Grana Padano D.O.P., made from raw milk, it was shown that the curdling phase, in
combination with the storage of the curd under whey, produced - given the time/temperature
relationship - an effect similar to pasteurization. Similarly, Mycobacterium bovis did not survive
the production technology of Grana Padano: indeed, it had already disappeared by the salting
phase (Benedetti et al., 2003). The salami that were analysed, given that the meat used in
their production was uncooked, derived their safety from the drying process, as shown
by the reduction of the pathogen populations by 4-7 logarithmic cycles. An exception was
Listeria monocytogenes, which disappeared due to biocompetition and acidification. Some of
the graphs that summarize these results, obtained at the Department of Food Safety of the


ISZLER are shown in Figures 9 to 15. The death rate was constant over time during the whole
production process for nearly all pathogens.
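Since the death rate is log-linear, the D values reported in the figures can be recovered from the fitted counts in their tables by simple regression; as an example, using the Listeria monocytogenes values from the Figure 9 table:

```python
# Reproducing a D value from fitted log counts: linear regression of
# log10(cfu/g) on time gives the death rate, and D = -1/slope. The data
# are the fitted Listeria monocytogenes counts from the Figure 9 table.
import numpy as np

t_h = np.array([0.5, 0.75, 1.0, 2.5, 3.0])        # time (h)
log_n = np.array([6.87, 6.34, 5.81, 2.62, 1.56])  # fitted log cfu/g

slope, intercept = np.polyfit(t_h, log_n, 1)
d_hours = -1.0 / slope            # decimal reduction time
five_log_h = 5.0 * d_hours        # time for a 5 log reduction
print(f"D = {60 * d_hours:.1f} min; 5 log reduction in {five_log_h:.2f} h")
```

This reproduces, within rounding, the 28' 13'' D value and the 2 h 21' five-log reduction time reported in Figure 9.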
Figures 9, 10 and 11 show the behaviour of the pathogens during the production process
of Grana Padano D.O.P. cheese that was experimentally contaminated. Each table includes
the name of the foodstuff contaminated, the pathogen used, the decimal reduction time (D),
the time needed for a five log cycle reduction and the pathogen’s concentration

Time (h)            0.5        0.75       1        2.5  3
Fitted log cfu/g    6.87       6.34       5.81     2.62 1.56
Fitted cfu/g        7,413,102  2,187,762  645,654  417  36
Standard error      0.56

Kind of cheese: Grana Padano D.O.P.; pathogen: Listeria monocytogenes.
D value during cooking and lying phase: 28' 13'' (max limit: 31'; min limit: 25' 53'').
Time to reach a 5 log reduction: 2 h 21'.
Pathogen concentration in milk for cheese making: 9,120,108 cfu/g (max limit: 7.56 log cfu/g,
36,307,805 cfu/g; min limit: 6.79 log cfu/g, 6,165,950 cfu/g).

Figure 9. Listeria monocytogenes, Grana Padano D.O.P. challenge test (log cfu/g vs. time, h;
fitted curve with max/min bounds).

expressed as a logarithm and in colony-forming units per gram.


Time (h)            0.5         0.75     1       1.5
Fitted log cfu/g    7.54        5.82     4.1     0.66
Fitted cfu/g        34,673,685  660,693  12,589  5
Standard error      0.24

Kind of cheese: Grana Padano D.O.P.; pathogen: Salmonella spp.
D value during cooking and lying phase: 8' 44'' (max limit: 9' 50''; min limit: 7' 50'').
Time to reach a 5 log reduction: 43' 37''.
Pathogen concentration in milk for cheese making: 7.79 log cfu/g (61,659,500 cfu/g).

Figure 10. Salmonella typhimurium and Salmonella enteritidis, Grana Padano D.O.P. challenge
test (log cfu/g vs. time, h; fitted curve with max/min bounds).

Figures 12, 13, 14 and 15 show the behaviour of Salmonella Typhimurium, Escherichia coli
O157:H7 and Listeria monocytogenes in contaminated Bergamo Salami. The pathogen
concentrations decreased constantly up to 2 months into the ripening process. Escherichia coli
remained constant for nearly three weeks of ripening and then decreased linearly in the
following months. In the first week of ripening, Leuconostoc spp. were antagonistic towards
Listeria monocytogenes, increasing the death rate of the pathogen. The tables show the name
of the foodstuff contaminated, the pathogen used, the decimal reduction time (D), the time
needed for a five log cycle reduction and the pathogen’s concentration expressed as a
logarithm and in colony-forming units per gram.

The pathogens’ death rate can change during processing. For example, the logarithmic
concentration of Staphylococcus aureus in Grana Padano D.O.P. cheese decreased more quickly
during the curdling process and the storage of the curd under whey than in the following
stage, when the cheese’s temperature decreased over time (Figure 11). Another example is the
logarithmic concentration of Escherichia coli O157:H7 in Bergamo Salami (Figure 13), which
remained constant before decreasing; this initial phase is called the ‘shoulder’.


Time (h)            0.5        0.75       1        1.25    2.5    5.5
Fitted log cfu/g    6.9        6          5.11     4.21    3.21   0.82
Fitted cfu/g        7,943,282  1,000,000  128,825  16,218  1,622  7
Standard error      0.21

Kind of cheese: Grana Padano D.O.P.; pathogen: Staphylococcus aureus.
D value during cooking and lying phase: 16' 45'' (max limit: 18' 39''; min limit: 15' 11'').
D value after the lying phase: 1 h 15' (max limit: 1 h 23'; min limit: 1 h 8').
Time to reach a 5 log reduction: 1 h 53'.
Pathogen concentration in milk for cheese making: 6.54 log cfu/g (3,467,369 cfu/g).

Figure 11. Staphylococcus aureus, Grana Padano D.O.P. challenge test (log cfu/g vs. time, h;
fitted curve with max/min bounds).

This system allowed the construction of mathematical models of predictive microbiology
(Rosso et al., 1995; Gibson et al., 1988). The Istituto Zooprofilattico Sperimentale di Brescia,
in collaboration with the Institute of Food Research of Norwich and Dr. Baranyi, created a
model capable of predicting the behaviour of Salmonella spp. in salami from their aw and pH
profiles as well as the ripening temperature (Figure 16; Cosciani Cunico et al., 2005).

Figure 16 shows the dispersion of the data around the equivalence line: the model-predicted
LogD of Salmonella spp. versus the observed LogD. The death rate was calculated as a
function of aw, temperature and pH. The equation of the model is shown under the graph. The
filled symbols represent the LogD values obtained from the challenge tests on two types of
salami.

The producer or the Veterinary Public Health inspector can establish the behaviour of a
pathogen present in the product in a specific phase of production. Modifying the process, e.g.
including a faster drying phase, will modify the prediction of Salmonella spp. (Figure 17).
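The effect shown in Figure 17 can be sketched by integrating a time-varying death rate over two different aw profiles for the same salami; the function `d_value` below is a hypothetical stand-in for the authors' fitted secondary model, and the profiles are invented:

```python
# Sketch of the tertiary-model idea behind Figure 17: the same salami,
# two drying regimes. d_value(aw) is a hypothetical placeholder for the
# fitted secondary model (D in hours, shrinking as aw drops).
import numpy as np

def d_value(aw):
    return 10.0 ** (2.0 + 10.0 * (aw - 0.85))

def predict_log_conc(log_n0, times_h, aw_profile):
    """log10 concentration over time, from d(logN)/dt = -1/D(aw(t))."""
    log_n = [log_n0]
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        aw_mid = 0.5 * (aw_profile[i] + aw_profile[i - 1])
        log_n.append(max(log_n[-1] - dt / d_value(aw_mid), 0.0))
    return np.array(log_n)

t = np.linspace(0.0, 1400.0, 141)
aw_slow = 0.96 - 0.06 * t / 1400.0                    # aw falls slowly
aw_fast = 0.96 - 0.06 * np.minimum(t, 400.0) / 400.0  # aw falls quickly
n_slow = predict_log_conc(7.0, t, aw_slow)
n_fast = predict_log_conc(7.0, t, aw_fast)
```

As in Figure 17, the faster the aw decreases, the faster the predicted log concentration of Salmonella spp. falls (`n_fast` stays below `n_slow`).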

This type of approach supports the achievement of food safety considerably: first of all, one
does not only address the presence/absence of the pathogen but also how its concentration changes over


Time (days)         2          15       30     40
Fitted log cfu/g    6.25       5        3.58   2.6
Fitted cfu/g        1,778,279  100,000  3,802  398
Standard error      0.33

Kind of salami: “Salame della Bergamasca”; pathogen: Salmonella typhimurium.
D value during ripening: 251 h 30'.
Time to reach a 5 log reduction: 52 d 9 h.
Pathogen concentration in the dough for salami making: 8.53 log cfu/g (338,844,156 cfu/g).

Figure 12. Salmonella typhimurium, “Salame della Bergamasca” salami challenge test (log cfu/g
vs. time, h; fitted curve with max/min bounds).

time. Secondly, commercial decisions can be made: e.g. should the meat be contaminated, the
ripening time can be increased. Furthermore, the collected documentation improves the
basis of the hazard analysis and critical control points approach to the process, quantifies the
risk and makes it possible to correct processes according to international criteria.

5. Conclusions

The collection and organisation of scientific and technological information may allow the
creation of an information system for food safety. This would be the basis for adding value
to traditional products based on milk or meat (but to other foodstuffs too) and would provide
information both to safeguard the products and to guarantee food safety.

The epidemiological management of collected data, shared among all operators in Veterinary
Public Health, would allow practical results to be drawn from the work carried out. This would
improve the present situation of risk analysis and assessment, both for animal health and for
food safety. From this surveillance process, information can and must be made available as the
reference information that is so often missing at decision moments. Based on risk assessment


Time (days)         6          15         21         40       62
Fitted log cfu/g    6.94       6.68       6.37       5.12     3.55
Fitted cfu/g        8,709,636  4,786,301  2,344,229  131,826  3,548
Standard error      0.17

Kind of salami: “Salame della Bergamasca”; pathogen: Escherichia coli O157:H7.
D value during ripening: 329 h 57'.
Time to reach a 5 log reduction: 68 d 44 h 24'.
Pathogen concentration in the dough for salami making: 7.18 log cfu/g.

Figure 13. Escherichia coli O157:H7, “Salame della Bergamasca” salami challenge test (log cfu/g
vs. time, h; fitted curve with max/min bounds).

pertaining to each aspect or production phase, we can, for example, gain the scientific
justification to introduce technical modifications or modifications of tolerance limits, and in
certain cases even to loosen controls, thus saving labour and economic resources.

The development of a complete information system has many applications, and the very
use of the information can create new developmental needs, adaptations and revisions.
Its principal characteristics would therefore need to be flexibility and adaptability: a system
created to make data available, and not just to record them (even if collection is necessary
for the final objective). The crucial benefit of such a system is indeed its cosmopolitan nature,
and therefore data must be shared to allow the system’s development and continual updating.

The fact that the supplier of the data is also the user of the information would make the data
ever more useful and would represent a strong argument for keeping them updated. Each
participant in the food production chain must become a ‘builder’ of the information system,
with all due protection and guidelines. The food production processes currently coded in the
SITA would soon become obsolete unless the producers contribute to the updating, to add


Time (days)         0           10          20          50         70
Fitted log cfu/g    7.68        7.35        7.02        6.03       5.38
Fitted cfu/g        47,863,009  22,387,211  10,471,285  1,071,519  239,883
Standard error      0.58

Kind of salami: “Salame della Bergamasca”; pathogen: Listeria monocytogenes.
D value during ripening: 729 h 39' 36''.
Time to reach a 5 log reduction: 153 d.
Pathogen concentration in the dough for salami making: 7.68 log cfu/g.

Figure 14. Listeria monocytogenes in “Salame della Bergamasca”; salami challenge test (log
cfu/g vs. time, h; fitted curve with max/min bounds).

value to the products and guarantee health standards. The latter goal is the same as the one
pursued by the inspection system of health and safety of food products.

The improvement of food safety (both qualitatively and quantitatively) in terms of ‘do-ability’,
economic and social impact as well as health benefits, can only be achieved if the
mathematical, analytical and predictive models can rely on updated and consultable databases.

Finally, it needs to be stressed once more that each activity needs to be able to supply
statistically and epidemiologically useful data that characterize animal health and food
product safety. Today we have computer resources available, and a capacity to transmit
information, that were unimaginable only 10-15 years ago. Risk assessment for the spread
of disease, the mapping of production sites, and the registration of goods and animal flows are
all available now. What is lacking is a single shared system that would allow us to make the
best use of this potential.


Time (days)         0           1          2          4        10      21     42   63
Fitted log cfu/g    7.3         6.83       6.36       5.41     4.54    3.84   2.5  1.16
Fitted cfu/g        19,952,623  6,760,830  2,290,868  257,040  34,674  6,918  316  14
Standard error      0.54

Kind of salami: “Salame della Bergamasca”; pathogen: Listeria monocytogenes in coculture
with Leuconostoc spp.
D value during ripening: 376 h 21'.
Time to reach a 5 log reduction: 44 d 12 h.
Pathogen concentration in the dough for salami making: 7.3 log cfu/g.

Figure 15. Listeria monocytogenes in competition with Leuconostoc spp., “Salame della
Bergamasca” salami challenge test (log cfu/g vs. time, h; fitted curve with max/min bounds).


Figure 16. Predicted vs. observed LogD values for Salmonella spp. (LogD predicted vs. LogD
observed, with the equality line; symbols: model results and the two pepperoni salami, A and B,
used for validation). Canonic form of the polynomial function of the secondary model:

LogD = -10.38 - 0.34·TEMP + 7.35·PH - 0.44·BW + 0.07·(TEMP·PH) + 0.44·(TEMP·BW)
       + 3.75·(PH·BW) - 0.0066·TEMP² - 0.79·PH² + 25.15·BW² (± 0.17),

where BW = (1 - aw).

Figure 17. Salmonella spp. prediction in salami. The two salami are identical; only the water
activity profile during ripening differs. The dotted line represents the water activity profile vs.
time and the solid line the predicted Salmonella concentration vs. time (left axis: predicted
concentration, log cells/g; right axis: aw; time 0-1,400 h). Note: the more the aw decreases, the
faster the log concentration of Salmonella spp. decreases.

References
Baranyi, J. and Roberts, T.A., 1994. A dynamic approach to predicting bacterial growth in food. Int. J. Food
Microbiol. 23, 277-294.
Benedetti, M., Daminelli, P., Varisco, G., Bolzoni, G., Belluzzi, G. and Boni, P., 2003. Sopravvivenza di Mycobacterium
bovis in formaggio grana padano. In: Proceedings V Congresso SidILV, Pisa 20-21/11/03, 2003, p. 201-202.
Bertocchi, L., Varisco, G., Bolzoni, G., Bravo, R. and Bonometti, G., 2005. An intramammary Staphylococcus aureus
infection control program in dairy herds of the province of Brescia. In: Proceedings of 4th IDF International
Mastitis Conference 12-15 June 2005, Maastricht, The Netherlands.
Bertocchi, L., Biancardi, A., Boni, P., Varisco, G. and Bolzoni, G., 2004. Milk aflatoxin M1 occurrence in the province of
Brescia. In: International Conference “Veterinary public health and food safety”, Roma 23/10/04, p. 39-41.
Bolzoni, G., Varisco, G., Bertocchi, L., Cornoldi, M., Posante, A. and Bravo, R., 2005. Microorganism isolation
prevalence in bovine milk samples of the province of Brescia and in vitro antibiotic sensitivity. In: Proceedings
of 4th IDF International Mastitis Conference 12-15 June 2005, Maastricht, The Netherlands.
Boni, P., Daminelli, P., Cosciani Cunico, E., Monastero, P., Bertasi, B., Rossi, F. and Bornati, L., 2004. Analysis
of Listeria monocytogenes, Salmonella typhimurium and enteritidis and Staphylococcus aureus death rate in
Grana Padano DOP cheese. In: International Conference “Veterinary Public Health and food Safety”, Roma
Cosciani Cunico, E., Daminelli, P., Boni, P. and Baranyi, J., 2004. Predicting the survival of Salmonella in Italian
salami. 2nd Central European Congress of Food, Budapest 26-28 April 2004.
Cosciani Cunico, E., Finazzi, G., Abrati, F., Bonometti, E., Daminelli, P., Monti, A., Bortolotti, M. and Boni, P., 2004.
L. monocytogenes, E. coli O157:H7, S. dublin e S. typhimurium nel salame della bergamasca: calcolo del D value
e validazione del modello predittivo. Abano (PD) VI° Convegno nazionale Si.Di.L.V. 2004.
Cosciani Cunico, E., Monastero, P., Finazzi, G., Daminelli, P., Boni, P., Le Marc, Y. and Baranyi, J., 2005. Concetti di
microbiologia predittiva: sopravvivenza di Salmonella spp. nel salame. Industrie Alimentari 443, 1-10.
Daminelli, P., Bertasi, B., Pavoni, E., Agnelli, E., Losio, N.M., Bonometti, E., Bornati, L., Monastero, P., Cosciani,
E. and Boni, P., 2003. Dinamica della sopravvivenza di Salmonella enteritidis e Salmonella typhimurium nella
tecnologia del salame tipo pizza. Enternet Italia, Roma 09/10/2003.
Gibson, A.M., Bratchell, N. and Roberts, T.A., 1988. Predicting microbial growth: growth responses of salmonellae
in a laboratory medium as affected by pH, sodium chloride and storage temperature. Int. J. Food Microbiol.
6, 155-178.
Losio, M.N., Pavoni, E., Bonometti, E., Monastero, P., Panteghini, C., Daminelli, P. and Boni P., 2003. Dinamica
della sopravvivenza di Salmonella spp, E. coli O157: H7 ed Enterovirus nella tecnologia di lavorazione del Grana
Padano. Enternet Italia, Roma 09/10/2003.
McMeekin, T.A., Olley, J.N., Ross, T. and Ratkowsky, D.A., 1993. Predictive Microbiology. John Wiley & Sons Ltd.
Chichester, UK.
Rosso, L., Lobry, J.R., Bajard, S. and Flandrois, J.P., 1995. A convenient model to describe the combined effects of
temperature and pH on microbial growth. Appl. Env. Microbiol. 61, 610-616.
Varisco, G., Bolzoni, G. and Bertocchi, L., 2004. Residues of antimicrobials in bovine milk samples in the Lombardia
region. In: International Conference “Veterinary public health and food safety”, Roma 23/10/04, p. 120-123.


The new EU legislation on food control and how veterinarians fit in
Frans J.M. Smulders1, Reinhard Kainz1,2 and Martijn J.B.M. Weijtens3
1Department of Veterinary Public Health and Food Science, University of Veterinary Medicine,
Veterinärplatz 1, Vienna, Austria
2Department of Craft Associations of Food Business in the Austrian Chamber of Commerce,
Coordination Bureau for the Meat Industry, Wiedner Hauptstraße 63, 1040 Vienna, Austria
3Ministry of Agriculture, Nature and Food Quality, P.O. Box 20401, 2500 EK The Hague,
The Netherlands

Abstract

Following the outlines formulated in its ‘White Paper on Food Safety’ (European Commission,
2000; further commented on by Daelman, 2002), the European Commission has in the past few
years issued various pieces of legislation that represent fundamental changes in Community
food law. These changes are characterized by buzz words such as ‘changing responsibilities’,
‘longitudinal approach (farm-to-table)’, ‘improved traceability’, ‘risk analysis based’, ‘other
legitimate factors’/‘precautionary principle’, and ‘transparency’. The EU-uniform general
principles stipulated in the General Food Law and the associated laws referred to as the
‘hygiene package’, to be applied in food control, imply that both the food industry and the
competent authorities are faced with new and wide-ranging tasks. This contribution deals
specifically with the consequences of the new EU food legislation for both official veterinarians
and veterinary practitioners. For the successful implementation of these laws, the cooperation
of these professional branches with food producers will be of great importance.

Keywords: EU food legislation, official veterinarian, food inspection, veterinary competence,
veterinary education

1. Introduction

The White Paper on Food Safety (EC, 2000) lists a large number of factors relevant to the
safety of all foods (be they of animal or non-animal origin) and addresses such issues as
genetically modified organisms (GMOs), the application of food colourants and other additives,
food labelling, specific problems related to BSE, etc. Before we discuss the legislation related
to food safety, it is important that the readership realises that our contribution necessarily
deals only with those fragments of the White Paper, and of the subsequently issued legislation
based on this position paper, that are particularly relevant for the veterinary profession.

Over the past decades the reasons why EU legislation related to the production, processing and
distribution of foods of animal origin needed to be changed have been clearly recognized.


Major incentives for restructuring food legislation included ‘external’ factors such as
major food scares [e.g. the BSE crisis (1996-present), the dioxin crisis (1999-2000), and the
Foot-and-Mouth Disease crisis (2001)], which led to a significant loss of consumer confidence
in foods of animal origin (and which had vast political repercussions), as well as factors inherent
to changes in agricultural infrastructure and the associated health status of production
animals in Europe. The latter have had major implications for the efficacy of ante- and post-
mortem inspection. Illustrative of this are, for instance, the lack of information on the
history and treatment of slaughter animals, and the fact that the majority of industrially
produced meat animals originate from essentially uniform and generally healthy herds and
- even when their meat might pose a potential threat to public health (e.g. in the case of
carriage of Salmonella or Campylobacter) - show no, or only unclear, clinical symptoms, which
are easily overlooked during ante-mortem inspection. Also, one has come to realize that
neither many of the pathological-anatomical abnormalities detected, nor many of the bacteria
isolated at post-mortem inspection, pose a threat to public health (Berends et al., 1993;
Hathaway and MacKenzie, 1991; Smulders and Paulsen, 1997). Consequently, the sensitivity
of traditional meat inspection approaches is low (Harbers, 1991; Harbers et al., 1992). Even
worse, the routine incisions prescribed in traditional meat inspection have been shown to
constitute a significant source of cross-contamination with pathogens such as Salmonella
and Campylobacter spp. Therefore, from a veterinary point of view, there were ample reasons
to re-evaluate the legislation.

Inconsistencies were also observed in the structure of the legislation governing the foods
ultimately produced from animals. In the history of Community legislation, the construction
of a harmonized set of food laws proved complicated and the various laws were issued in a
rather fragmentary fashion. Neither a product-specific approach to harmonization (so-called
‘vertical’ EC legislation) nor the creation of uniform regulations for distinct thematic areas
relating to all foods, or at least to various food groups (i.e. themes such as food additives,
food labelling or general hygiene), resulted in a common basic concept. Consequently, the EC
laws differed significantly both in concept and in their general principles. Such contrasts
represented potential impediments to free trade in foods within the Community and
occasionally led to competitive industrial warfare and an impairment of the European
internal market.

Hence, in recent years major changes in European food legislation have been adopted. In the
framework of this contribution the following need to be mentioned:

• Regulation (EC) No. 178/2002 (the so-called “General Food Law”).

• Regulation (EC) No. 882/2004 on the control of all feeds/foods.

and the so-called ‘hygiene package’ consisting of the following:

• Regulation (EC) No. 852/2004 on the hygiene of foodstuffs (all foods), colloquially
referred to as ‘H1’.
• Regulation (EC) No. 853/2004 on specific hygiene rules for foods of animal origin, referred
to as ‘H2’.

• Regulation (EC) No. 854/2004 on official controls of products of animal origin, referred
to as ‘H3’.
• Council Directive 2002/99/EC on animal health rules, referred to as ‘H4’.
• A final Regulation of the hygiene package (‘H5’), repealing 17 previous EU Directives that
are now covered by the above, was also issued.

Similar legislation specific to feed hygiene has also been issued [e.g. Regulation (EC) No.
183/2005 (EC, 2005a)], but this will not be discussed in this contribution. Depending on the
dates on which the above regulations are to be implemented Europe-wide, some ‘old’ European
and national legislation persists on an interim basis. Specific areas - e.g. those on identification marking,
genetically modified microorganisms, Trichinella, protective measures against transmissible
spongiform encephalopathies (TSE’s) which are not addressed in this contribution - are
specified in additional law texts.

Not all the legislation listed will be extensively discussed here. Instead, we will primarily
focus on those law texts that pertain to the obligations of the competent authorities and
the official veterinarian and which bear great relevance to the veterinary profession. Though
both food and feed (control) are dealt with in current legislation, space limitations dictate
that we concentrate on foods, and more specifically on those of animal origin. To this end law
texts are presented in as condensed a form as possible, or are cited verbatim where deemed
necessary.

Figure 1 is an attempt to provide a simple overview of the current food-related European
laws, particularly those that relate to veterinary activities.

2. Regulation (EC) No 178/2002 on ‘the General Food Law’

Regulation (EC) No 178/2002 (EC, 2002), adopted 28-1-2002, was conceived to represent
the common framework on which national legislation in the various Member States must be
based. It foresees different effective dates for the implementation of the separate articles,
hereby allowing varying, partially multiannual, time frames for the formulation of national
legislation. By the creation of an EU uniform basis for measures to be taken in the food- and
feed sector, said Regulation should allow the adaptation of the various food law concepts,
and general principles and procedures in the individual Member States.

The objective of Regulation (EC) No 178/2002, viz. “... to provide the basis for the assurance
of a high level of protection of human health and consumers’ interest in relation to food...”, is
to be realised at all levels of production, processing and distribution of food and feed, as
every link of this chain can influence the safety of a foodstuff.

The central concept of the law is that of ‘food’. Foods are defined as: “... any substance
or product, whether processed, partially processed or unprocessed, intended to be, or reasonably
expected to be ingested by humans...” and include drinks, chewing gum, and any substance -
including water - intentionally incorporated into the food during its manufacture, preparation
or treatment.


[Figure 1: schematic overview. Top: the General Food Law [Reg. (EC) No. 178/2002]. Below it:
feed hygiene [Reg. (EC) No. 183/2005] and other feed legislation; feed/food control [Reg. (EC)
No. 882/2004] and other specific legislation (e.g. on marking, GMO’s, Trichinae, TSE’s); H1,
hygiene of foodstuffs [Reg. (EC) No. 852/2004]; H2, hygiene of foods of animal origin [Reg.
(EC) No. 853/2004]; H3, official controls of foods of animal origin [Reg. (EC) No. 854/2004];
H4, animal health rules [Dir. 2002/99/EC]; supplemented by Reg. (EC) No. 2074/2005 and Reg.
(EC) No. 2076/2005.]

Figure 1. Graphic overview of the current (2006) EU food/feed legislation; left: primarily targeted at the
feed business operator; middle: primarily targeted at the food business operator; right: primarily targeted
at the competent authority/official veterinarian.

For the implementation of the aforementioned objectives the Regulation foresees a general
framework for measures to be taken:

2.1. General principles of the Food Law

The general principles of the Food Law include the objectives for pursuing a high level of
consumer protection, the principle of risk analysis, the precautionary principle, and public
consultation during preparation, evaluation and revision of food law. Member States are
required to adapt existing legislation as soon as possible, but not later than the 1st of
January 2007. Until then, existing national legislation is to be applied taking into account
the general principles formulated in the new law.

The first general objective is formulated as “... pursuing one or more of the general objectives of
a high level of protection of human life and health and the protection of consumers’ interests,
including fair practices in food trade...”.

Any legislation to be issued by a Member State or the Community should be based on risk
analysis, which includes risk assessment, risk management and risk communication. In the
preamble to the Food Law it is recognised that scientific risk assessment alone cannot,
in some cases, provide all information on which a risk management decision should be
based and that other factors relevant to the matter under consideration (e.g. societal,
economic, traditional, ethical and environmental as well as the feasibility of controls) should
legitimately be taken into account.

In cases where a certain risk for harmful effects on health is identified, but scientific
uncertainty still persists, provisional risk management measures may be taken pending further
scientific information. Such measures must be proportionate, no more restrictive of trade
than is required for purposes of consumer health protection, and are to be reviewed within
a reasonable period of time. This concept is known as the ‘precautionary principle’.

Principles of transparency, i.e. the authorities’ duty to assure an open and transparent public
consultation, are laid down in Section 2 of the Regulation.

2.2. General obligations of the food trade

Importation of foods into the Community and the subsequent distribution and marketing
should either comply with relevant Community legislation or with regulations considered
equivalent, or, finally, be based on a specific agreement between the Community and the
exporting country.

Food and feed exports are to comply with Community requirements or with those of the
importing Member State. In other cases, exportation or re-exportation is subject to the
importing country’s express approval, which may follow only after the relevant Member
State has been fully informed why and under which circumstances the food or feed was not
allowed on the market.

Finally, under the heading of ‘international standards’ the objectives and principles of the
Community’s and Member States’ contribution to the development of international standards
and agreements (e.g. those addressed by Codex Alimentarius) are defined.

As of the 1st of January 2005, food and feed trade obligations have come into force.

2.3. General requirements of the Food Law

Central in the Food Law with respect to food safety are the following questions: (1) “When can
a food item be considered safe?”, (2) “Who is responsible for ensuring that food safety is
assured?” and, finally, (3) “Who is to take action in case a food is considered unsafe?”.

The general requirement that unsafe foods should not be put on the market is made concrete
by indicating when foods are unsafe (“injurious to health”, “unfit for human consumption”).
Food items that comply with specific Community food safety provisions are considered safe
(obviously only with regard to the aspects dealt with in those provisions). Nevertheless,
should there be substantiated suspicion that a food is unsafe, the competent national
authorities can draw up provisions to limit the marketing of such foods or issue market
recall provisions. Where no specific Community provisions exist, a food is deemed to be safe
when it conforms to the specific national provisions of the Member State in whose territory
the food is marketed.

For feeds the situation is analogous, i.e. feeds are considered to be unsafe when they have
an adverse effect on human or animal health or would render the food derived from food-
producing animals unsafe for human consumption.

The food or feed business operators at all stages of production, processing and distribution
within the business under their control are responsible for ensuring that foods or feeds satisfy
the requirements. The Member States are required to enforce food law and to monitor and
verify that all relevant requirements are fulfilled. To that end they must maintain a system
of official controls and other surveillance and monitoring activities, and they must implement
effective, proportionate and dissuasive sanctions.

Regulation (EC) No. 178/2002 includes a section, in which the responsibilities of food and
feed business operators are explained and which outlines their duties (to inform, to act, to
cooperate, not to prevent) in case a food or feed does not fulfill the requirements. Several
situations are distinguished.

Firstly, food business operators who have recognized (or have reason to believe) that a food
is not in compliance with the food safety requirements are required to immediately initiate
procedures to withdraw the food from the market (‘duty to act’) and to inform the competent
authorities (‘duty to inform’). In cases where the food in question might already have reached
the consumer, the operator has to effectively and accurately inform the consumer of the
reason for withdrawal (‘duty to inform’) and, if necessary, to recall the product (‘duty
to act’). Recall is indicated where other measures cannot assure a high level of consumer
health protection.

A food business operator responsible for retail or distribution activities, which do not affect
the packaging, safety, labelling or integrity of the food, must take non-compliant foods from
the market (‘duty to act’), provide information necessary to trace the food (‘duty to inform’ the
authorities), and cooperate in the actions taken by producers, processors, manufacturers and
competent authorities (‘duty to cooperate’). Should an operator have placed on the market
a food injurious to health, he is required to immediately notify the authorities (‘duty to
inform’) and not to prevent or discourage any person from cooperating - in accordance with
national law and legal practice - with the competent authorities (‘duty not to prevent’).
In general, food business operators shall cooperate with the authorities in actions taken
to avoid or reduce risks (‘duty to cooperate’).
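The cascade of duties just described can be summarised in a small decision sketch. This is purely illustrative: the function, its arguments and the duty labels are our own simplification of the Regulation's wording, not the legal text itself.

```python
def operator_duties(non_compliant, reached_consumer, injurious_to_health):
    """Simplified illustration of the withdraw/inform/recall cascade of
    Regulation (EC) 178/2002; not a statement of the law itself."""
    duties = []
    if non_compliant:
        duties.append("withdraw food from market ('duty to act')")
        duties.append("inform competent authorities ('duty to inform')")
        if reached_consumer:
            duties.append("inform consumers of reason for withdrawal ('duty to inform')")
            duties.append("recall product if other measures are insufficient ('duty to act')")
    if injurious_to_health:
        duties.append("notify authorities immediately ('duty to inform')")
        duties.append("do not discourage cooperation with authorities ('duty not to prevent')")
    # the duty to cooperate applies in general
    duties.append("cooperate in risk-reduction actions ('duty to cooperate')")
    return duties

for duty in operator_duties(non_compliant=True, reached_consumer=True,
                            injurious_to_health=False):
    print(duty)
```

For a non-compliant food that has already reached the consumer, the sketch yields the full withdraw-inform-recall sequence plus the general duty to cooperate.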

Feed operators have an analogous catalogue of duties.

Experience shows that the internal food or feed market does not function well, unless food or
feed items can be traced back to their source of origin. Regulation (EC) 178/2002 therefore
stipulates that the traceability of foods, feeds, food-producing animals or any substance
intended or expected to be incorporated in a food or feed be established at all stages of
production, processing or distribution. To achieve this, any food or feed business operator is
obliged to identify his suppliers and customers and to have in place systems and procedures
that ensure such traceability.
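The record keeping that such systems boil down to is often described as 'one step back, one step forward': each lot is linked to the supplier it came from and the customers it went to. A minimal sketch (class and method names are illustrative only):

```python
from collections import defaultdict

class TraceabilityRegister:
    """Minimal 'one step back, one step forward' record keeping:
    each lot is linked to its supplier and to the customers to
    whom it was dispatched (illustrative sketch)."""

    def __init__(self):
        self.suppliers = {}                 # lot_id -> supplier
        self.customers = defaultdict(list)  # lot_id -> [customers]

    def receive(self, lot_id, supplier):
        self.suppliers[lot_id] = supplier   # one step back

    def dispatch(self, lot_id, customer):
        self.customers[lot_id].append(customer)  # one step forward

    def trace_back(self, lot_id):
        return self.suppliers.get(lot_id)

    def trace_forward(self, lot_id):
        return self.customers.get(lot_id, [])

reg = TraceabilityRegister()
reg.receive("LOT-001", "Farm A")
reg.dispatch("LOT-001", "Retailer X")
reg.dispatch("LOT-001", "Retailer Y")
print(reg.trace_back("LOT-001"))     # prints: Farm A
print(reg.trace_forward("LOT-001"))  # prints: ['Retailer X', 'Retailer Y']
```

In case of a withdrawal, the forward links identify every customer to whom an affected lot was delivered, while the backward links identify the source.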

Last but not least, the Regulation forbids misleading the consumer through inappropriate
labelling, advertising and presentation of foods and feeds.

As of the 1st of January 2005 the general requirements of food law have come into force.

2.4. The European Food Safety Authority

The European Food Safety Authority (EFSA), established under Regulation (EC) 178/2002, has
been operational since the 1st of January 2002. Its mission is to provide “... scientific advice
and scientific and technical support for the Community’s legislation and policies in all fields
which have a direct or indirect impact on food and feed safety...”. The regulation contains
detailed sections on the tasks of EFSA, its organisation, mode of operation, independence,
transparency, confidentiality and communication, and its financial provisions.

Meanwhile the EFSA Headquarters have been established in Parma, Italy.

2.5. Rapid alert system, crisis management and emergencies

The previous rapid alert system - laid down in the Council Directive on general product safety
of 29-6-1992 and encompassing only foods and industrial products - has been replaced, under
Regulation (EC) 178/2002, by a new rapid alert system for food and feed, in which the Member
States, the Commission and EFSA participate.
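Conceptually, a rapid-alert notification is a structured message circulated to all participants in the system at once. The following sketch is our own illustration; the field names are assumptions, not the official notification template:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AlertNotification:
    """Illustrative rapid-alert message; fields are assumptions,
    not the official notification format."""
    notifying_state: str
    product: str
    hazard: str
    serious_risk: bool
    notified_on: date = field(default_factory=date.today)

    def recipients(self):
        # every notification reaches all participants in the system
        return ["European Commission", "EFSA", "all other Member States"]

alert = AlertNotification("Austria", "raw milk cheese",
                          "Listeria monocytogenes", serious_risk=True)
print(alert.recipients())
```

The point of the design is that a single notification fans out to the Commission, EFSA and the other Member States simultaneously, rather than travelling bilaterally.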

In case of emergencies (i.e. a food or feed constitutes a serious risk to human or animal health
or to the environment and such risks cannot be contained satisfactorily by measures taken
by the Member States concerned) the Commission, at its own initiative or at the request of
the Member State, must immediately adopt measures. Where the Commission has not taken
immediate action, Member States may adopt interim protective measures.

The Commission, in cooperation with EFSA and the Member States, is to draw up a general
plan for crisis management in the field of food and feed safety. This general plan specifies
the types of situation involving direct or indirect risks, the practical procedures to manage a
crisis, the principles of transparency and a communication strategy. Should the Commission
identify that the health risks are not satisfactorily reduced or eliminated by already existing
provisions or by emergency measures, Member States and EFSA are to be informed and a crisis
unit (in which EFSA participates) to be set up immediately. Said crisis unit is to collect and
evaluate all relevant information, to identify all possible measures to deal with the crisis,
and to keep the public informed of the risks involved and the measures taken.

The decisions establishing this concept of crisis management came into force as of 21-2-2002.


3. Regulation (EC) 852/2004 on the hygiene of foodstuffs (H1), and
Regulation (EC) 853/2004 on specific hygiene rules for foods of
animal origin (H2)

Two regulations on the hygiene of food, principally targeted at food business operators (and
only secondarily at the competent authority), have been issued. The major features of both
can be summarized as follows.

Characteristic of ‘H1’, Regulation (EC) 852/2004 (EC, 2004a), is that it relates to all foods,
addresses all sectors from stable to table and all stages of production, processing, storage,
distribution and export, and designates the food business operators as primarily responsible
for the hygiene of their establishments and of the products produced there. Major requirements
relate to: (1) registration of all food businesses, (2) hygiene requirements (including those
at farm level), (3) following a HACCP approach (excluding farm level), (4) adhering to guides
of good practice, (5) the fulfilment of microbiological criteria (laid down separately; see
below), (6) temperature control, and (7) health marking.

Supplementing H1 is Regulation (EC) 853/2004 (EC, 2004b; also referred to as H2), which
specifically targets operators involved in the production, processing and distribution of foods
of animal origin. The latter include milk and dairy products, red and white meat, farmed
and hunted wild game, minced meat and meat preparations, mechanically separated meat,
meat products, live bivalve molluscs, fishery products, eggs (and products thereof), frogs’
legs, snails, rendered animal fat and greaves, treated stomachs, bladders and intestines,
and, finally, gelatin and collagen. Major requirements relate to: (1) the conditional approval
of establishments, (2) identification marking by the operator, (3) health marking of red
meat carcasses by the official veterinarian, (4) a HACCP-based approach, (5) (simplified)
requirements for slaughterhouses and cutting plants, and (6) emergency slaughter.
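To give a concrete flavour of how the HACCP-based approach and temperature control intersect at establishment level, consider monitoring a chilling critical control point. The 7 °C limit, the function name and the readings below are illustrative examples of the technique, not values prescribed by H1/H2:

```python
def check_ccp_temperature(readings_celsius, critical_limit=7.0):
    """Monitor a chilling critical control point: any reading above
    the critical limit is a deviation requiring a documented
    corrective action (the limit used here is illustrative)."""
    deviations = [(i, t) for i, t in enumerate(readings_celsius)
                  if t > critical_limit]
    status = "corrective action required" if deviations else "under control"
    return {"status": status, "deviations": deviations}

print(check_ccp_temperature([4.2, 5.1, 8.3, 6.9]))
# {'status': 'corrective action required', 'deviations': [(2, 8.3)]}
```

In a HACCP plan the deviation record (which reading, when, how far above the limit) is exactly what the operator must document and what the official controls later verify.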

For purposes of adequate implementation of the above laws, additional legislation has recently
been issued, e.g. Commission Regulation (EC) No. 2074/2005 (EC, 2005c) and Commission
Regulation (EC) No. 2076/2005 (EC, 2005e), which, incidentally, include adaptations of H1
and H2.

Where previously operators were bound to adhere to Commission Decision 2001/471/EC
- listing the principles of HACCP, specifying how and where to sample for purposes of
microbiological checks, and laying down how to separate acceptable from unacceptable
test results - these matters have now partly been included in H1 (HACCP principles) or in
new specifications, e.g. Commission Regulation (EC) No. 2073/2005 (EC, 2005b).
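Regulation (EC) No. 2073/2005 expresses many of its criteria as two- or three-class attributes sampling plans, defined by n (the number of sample units), c (the maximum number of units tolerated with results between the limits m and M) and the microbiological limits m and M. The evaluation logic of a three-class plan can be sketched as follows (the function name, result labels and example values are ours):

```python
def evaluate_three_class(counts, c, m, M):
    """Three-class attributes plan: satisfactory if all counts <= m;
    acceptable if no count exceeds M and at most c counts lie
    between m and M; otherwise unsatisfactory."""
    if any(x > M for x in counts):
        return "unsatisfactory"
    marginal = sum(1 for x in counts if m < x <= M)
    if marginal == 0:
        return "satisfactory"
    if marginal <= c:
        return "acceptable"
    return "unsatisfactory"

# a plan with n=5 sample units, c=2, and illustrative limits m and M
print(evaluate_three_class([10, 20, 150, 90, 40], c=2, m=100, M=1000))
# acceptable
```

Here one of the five counts lies between m and M, which is within the tolerance c = 2, so the lot is acceptable; a single count above M would make it unsatisfactory outright.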

4. Regulation (EC) No 882/2004 “on official controls performed to
ensure the verification of compliance with feed and food law, animal
health and animal welfare rules”

In the Whitepaper on Food Safety the European Commission announced a Regulation that
should represent a common framework for the control systems to be instituted by individual
Member States. Major incentives for adapting and harmonizing the existing Community
regulations on official control in the feed, food and veterinary sector included discrepancies
between different Community laws, control loopholes in certain parts of the feed and food
sector, and efficiency deficits of the control services of the Commission itself.

According to the Whitepaper, the three core elements in the Community framework of
legislation on the control of food are: (1) uniform EU operational criteria for the national
authorities, (2) common control guidelines, and (3) a stronger administrative cooperation
in the development and the operational execution of control systems.

With the adoption and publication of Regulation (EC) No 882/2004 on 29-4-2004 (EC, 2004d),
the European Commission has created the aforementioned common framework, taking
into account the general objectives of the food law [health protection - here: avoiding risks
for humans and animals - and consumer protection]. There are also links with elements
of Regulation (EC) 178/2002 (see above), particularly with regard to the definitions, the
stipulation that Member States have the basic competence for official food and feed control
(see section 2.3 of this contribution) and the rapid alert system/emergency measures (see
section 2.5 of this contribution).

The articles in Regulation No 882/2004 can be assigned to three categories: (1) rules for
official controls by the Member States, (2) rules for the official controls by the Community,
and (3) specifications for their execution.

4.1. Official controls by Member States

The primary competence for official controls lies with the Member States. Their competent
authorities are obliged to fulfill a number of operational criteria, to ensure their impartiality
and efficiency. For instance, they should have sufficient, suitably qualified and experienced
staff and adequate laboratory facilities at their disposal. When a Member State confers the
competence to carry out official controls on an authority or authorities other than a central
competent authority (regional or local), efficient and effective coordination between all
the competent authorities involved should be ensured. Under certain conditions competent
authorities may delegate specific control tasks to non-governmental control units.

The Member States are obliged to conduct control activities during all phases of production,
processing and distribution and marketing of feeds and foods. These include inspection of (1)
primary producers’ installations, including their surroundings, premises, offices, equipment,
installations and machinery, transport, and the feed and food, (2) raw materials, ingredients,
processing aids and other products used for the preparation and production of feed and
food, (3) semi-finished products, (4) materials intended to come into contact with food,
(5) cleaning and maintenance products and processes, and pesticides, and (6) labelling,
presentation and advertising.

The frequency of official controls depends on various factors. Besides a basic routine control
program, established risks, experience and knowledge gained from previous inspections, the
reliability of the self-control measures of feed and food business operators, as well as suspicion
of violations or mala fide practices need to be considered. Control methods and techniques
include monitoring activities, verification, audits, inspection, sampling and analysis.
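A competent authority might operationalise these factors as a simple risk score that sets the interval between controls. The factors, weights and intervals below are entirely illustrative assumptions of ours; the Regulation prescribes no such formula:

```python
def inspection_interval_days(identified_risk, past_compliance,
                             own_checks_reliable, suspicion):
    """Toy risk score: higher risk -> shorter interval between official
    controls. All factor weights and intervals are illustrative."""
    score = 0
    score += {"low": 0, "medium": 2, "high": 4}[identified_risk]
    score += 0 if past_compliance else 3       # previous inspection record
    score += 0 if own_checks_reliable else 2   # reliability of self-control
    score += 5 if suspicion else 0             # suspected violations
    # map the score onto a control interval (in days)
    if score >= 7:
        return 7      # weekly controls
    if score >= 4:
        return 30     # monthly controls
    return 180        # routine semi-annual baseline

print(inspection_interval_days("high", past_compliance=False,
                               own_checks_reliable=True, suspicion=False))
# 7  (i.e. weekly controls)
```

The design point is simply that the baseline routine programme is tightened by each aggravating factor, which mirrors the risk-based allocation of control effort the Regulation intends.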

Control tasks are manifold. They particularly involve verification of the efficiency of the
self-control activities of the feed and food business operator and of their hygiene controls,
assessment of Good Manufacturing Practices (GMP), Good Hygiene Practices (GHP), good
farming practices and HACCP, examination of documentation and other records relevant
to the assessment of compliance, reading of values recorded by feed or food business
measuring instruments, checks with the competent authorities’ own instruments to verify
measurements taken by feed and food business operators, and interviews with feed and
food business operators and their staff.

Control is to be conducted according to specified, documented procedures. To support the
staff performing controls, guidelines on official control - e.g. with regard to the
implementation of HACCP principles or quality control systems (EC, 2006 Website) - can be
made available. The efficiency and efficacy of official controls are also to be tested through
verification procedures run by the national competent authorities.

The new control concept assumes great competence and scientific expertise of the control
bodies. They must show comprehensive knowledge of the various potential hazards, the market
mechanisms and the concrete problems associated with particular processing procedures. The
modern control procedures dictate that feed and food control staff are highly qualified, to
assure that controls are effective, objective and proportionate. The necessary education
requires a multidisciplinary approach. To this end the Regulation foresees the establishment
of training programmes at national and Community level, to ensure that official controls are
executed in an EU-wide uniform manner. The Regulation includes an extensive list of subjects
to be taught in such courses.

Further rules are related to the criteria for sampling and analysis, as well as to Member
States’ obligation to establish and adapt operational contingency plans, as stipulated in the
general plans for crisis management according to Regulation (EC) 178/2002 (see section
2.5 of this contribution).

The existing regulations for the control of goods imported from third countries are extended
to include feed and foods of animal- and non-animal origin.

The costs of official controls are also to be harmonized EU-wide. Member States have to
ensure that sufficient financial means are made available for the organisation of official
controls. The Member States will charge feed and food business operators the costs for
‘beyond routine’ control activities in the form of control fees, the level of which is to be set
according to common EU wide criteria. Similar control fees are to be fixed for feed, food and
animal imports from third countries.

In accordance with the stipulations of Regulation (EC) No. 178/2002, Member States are to
establish a multiannual control plan according to EU uniform guidelines (see section 7.1.2
of this contribution).


Finally, it is relevant for the veterinary profession to mention that separate legislation has
been issued [Regulation (EC) No. 2075/2005 (EC 2005d)] in which the testing for Trichinella
and the associated procedures have been specified.

4.2. Community controls

The Regulation represents the legal basis for Community experts, supported by the Member
States, to regularly conduct audits on all feeds and foods in the Member States, with the
main purpose of verifying that the official controls by the Member States are in accordance
with their national control plans and with Community law (see section 4.1). When necessary,
these audits can be supplemented by specific audits and inspections addressing specific
sectors or problems. The outcome of such audits and the associated recommendations are
reported, whereupon Member States are obliged to take any measures necessary to remedy
the situation.

Community control in third countries is based on the following concept. As it is obviously not
economically feasible for Community experts to inspect individual third-country enterprises,
control bodies in third countries are to guarantee that feed and food exported to the EU
comply with Community legislation or with national export-country legislation deemed
equivalent to EU law. In addition, third countries are required to establish control plans
similar to those existing in EU Member States, and these constitute the basis for further audits
and inspections by Community experts. Conversely, third countries are allowed to conduct
control activities in EU Member States that export to these third countries. Community experts
may support third country officials during these controls.

Further decisions relate to the conditions and procedures for the import of feeds and foods
from third countries, the training of Member States’ and third countries’ control staff, and
the coordination of Community control activities.

4.3. Enforcement measures

The Member States are primarily responsible for enforcing Community regulations. For this
purpose Regulation (EC) No. 178/2002 stipulates the Member States’ obligation to enforce the
Food Law and, in case of violations, to take measures and apply effective, proportionate and
dissuasive sanctions (see section 2.3). Regulation (EC) 882/2004 expresses this obligation in
concrete terms. The Regulation subsequently lists the administrative enforcement measures in
case of non-compliance, which range from limiting or prohibiting the marketing or recall of
goods, through suspension of operations or closure of all or part of the business concerned,
to withdrawal of the operator’s licence. In deciding on the sanction, previous violations of
the food and feed law are also taken into consideration.

Member States are also to lay down rules on sanctions applicable to infringements of the
feed and food law and to take measures to ensure that these are implemented (§ 55 of the
Regulation).


5. Regulation (EC) No. 854/2004 on official controls of foods of
animal origin (H3)

In contrast with the previous legal situation, Regulation (EC) No. 854/2004 (EC, 2004c; also
referred to as H3) addresses in one legal act the requirements for official controls of foods of
animal origin. For its scope the Regulation refers to Regulation (EC) No 853/2004, which lays
down the specific rules for the hygiene of products of animal origin (EC, 2004b).

The focus of official control in the context of this Regulation is, on the one hand, verifying
the compliance with hygiene rules and, on the other, the product-specific requirements for the
protection of human and animal health. The essential element to be audited is the compliance
with Good Hygiene Practices and the HACCP systems applied by the food business operators.
For businesses requiring approval in accordance with Community law, the Regulation includes
the procedures for the granting, surveillance or withdrawal of the approval. In addition to
other traceability requirements, the orderly use by these businesses of identification marks
is to be controlled.

The nature and intensity of auditing tasks in respect of individual establishments depends
on the assessed risk for human and animal health, animal welfare, the type of throughput
and the processes carried out, and the food operator’s past record as regards compliance
with the Food Law.

Actions to be taken in case of non-compliance are specifically indicated, as are the procedures
concerning import of foods of animal origin.

The decision who is most competent to conduct audits is principally left to the Member States.
However, the competence to inspect slaughterhouses, game-handling establishments, and
cutting plants that distribute fresh meat, is expressly incumbent upon official veterinarians,
who may in certain cases be supported by official auxiliaries or slaughterhouse personnel.

The specific control tasks for various products of animal origin are extensively listed in
Annexes to the Regulation. In the following, this is illustrated using fresh meat (Annex I)
and raw milk and dairy products (Annex IV) as major examples of tasks which the majority
of official veterinarians are confronted with in their daily professional life.

5.1. Control tasks of the official veterinarian during fresh meat production

Three specific control tasks [i.e. auditing tasks (Section I, Chapter 1), inspection tasks (Section
I, Chapter 2) and health marking (Section I, Chapter 3)] have been formulated for the official
veterinarian. In Section II of the Regulation the actions following control are described.

Auditing tasks include the auditing of Good Hygiene Practices, as well as verifying continuous
compliance with food business operators’ own procedures concerning any collection, transport,
storage, handling, processing and use or disposal of animal (by-) products (including specified
risk material) for which the food business operator is responsible. In the auditing of the
operators’ own procedures it is the veterinarian’s task to check that meat: (1) does not contain

212  Towards a risk-based chain control

Frans J.M. Smulders, Reinhard Kainz and Martijn J.B.M. Weijtens

patho-physiological abnormalities or changes, (2) does not bear faecal or other contamination,
and (3) does not contain specified risk material, except as provided for under Community
legislation, and has been produced in accordance with Community legislation on TSEs.

Inspection tasks include the following categories: (1) collecting and analysing food chain
information, (2) ante-mortem inspection, (3) animal welfare, (4) post-mortem inspection, (5)
specified risk material and other animal by-products, and, finally, (6) laboratory testing.

The official veterinarian has to take into consideration any pertinent information on the food
chain (e.g. from the records of the holding of provenance of animals intended for slaughter,
official certificates accompanying the animals, declarations by veterinary practitioners and
official and approved veterinarians carrying out controls during primary production, as well as
documentation from voluntary quality control systems of operators). Incidentally, Regulation
(EC) No. 2076/2005 stipulates that the competent authorities may allow that food chain
information is delivered to the abattoir concurrently with the animals to be slaughtered, as
long as such procedures do not jeopardize the objectives of Regulation (EC) No. 853/2004.

Within 24 hours of arrival at the slaughterhouse and less than 24 hours before slaughter
all animals are to be subjected to ante-mortem inspection by an official veterinarian.
Exceptions include emergency slaughter outside the slaughterhouse, hunted wild game, and
- in cases where Community legislation allows ante-mortem inspection - at the holding of
provenance. The primary objective of ante-mortem inspection is to determine if animal welfare
is compromised, or animal/zoonotic diseases prevail [as listed by the Office International
d’Épizooties (OIE)]. In addition to regular ante-mortem inspection, a clinical examination
must be carried out in those cases where the operator or the official auxiliary has put aside
slaughter animals. Finally, detailed specific rules are listed for the ante-mortem inspection
of domestic swine, poultry, and farmed game.
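
The two 24-hour constraints described above amount to a single time-window check. The following sketch merely illustrates that logic; the function and parameter names are our own and are not part of the Regulation.

```python
from datetime import datetime, timedelta

MAX_DELAY = timedelta(hours=24)

def ante_mortem_window_ok(arrival: datetime,
                          inspection: datetime,
                          slaughter: datetime) -> bool:
    """True when ante-mortem inspection takes place within 24 h of arrival
    at the slaughterhouse AND less than 24 h before slaughter."""
    return (arrival <= inspection <= slaughter          # events in plausible order
            and inspection - arrival <= MAX_DELAY       # within 24 h of arrival
            and slaughter - inspection < MAX_DELAY)     # less than 24 h before slaughter
```

For example, an animal arriving at 06:00, inspected at noon and slaughtered the next morning satisfies both constraints, whereas an inspection 30 hours after arrival does not.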

The official veterinarian is to verify compliance with the relevant Community and national
rules on animal welfare, e.g. at the time of slaughter and during transport.

Post-mortem inspection includes the viewing of all external surfaces - with minimal handling
of the carcass and offal - paying particular attention to the detection of (OIE listed) zoonotic
diseases, additional necessary inspection through palpation and/or excision of carcass parts,
and mandatory lengthwise carcass and head splitting in certain animal species. Special
additional rules are formulated for the post-mortem inspection of carcasses and by-products
of cattle under/over 6 months of age, domestic swine over 4 months of age, sheep and goats,
domestic solipeds, free-living game, hunted wild game, poultry and rabbits.

Other inspection activities relate to the removal and separation of specified risk material and
other animal by-products and, where appropriate, the marking of such products.

Finally, the official veterinarian ensures that, where appropriate, sampling for laboratory
analysis takes place, that samples are appropriately identified and handled, and that they are
sent to the appropriate laboratory for further analysis.

Health marking and the correct application of the marks specified as suitable for the various
species is to be inspected by the official veterinarian.

Actions following control include (1) the recording and evaluation of inspection activities,
(2) informing the operators (both primary producers and food business operators), veterinary
practitioners and where necessary the competent authority, should inspection have revealed
a disease or condition affecting public or animal health, or compromising animal welfare.

The Regulation formulates a wide range of decisions official veterinarians can take based on
their findings. These range from ordering additional controls (e.g. in cases where the food
chain documentation does not correspond with the true conditions or is deceptive), denying
approval to slaughter animals, ordering additional sampling, separate killing (in case certain
diseases are suspected), and declaring carcasses as unfit for human consumption (e.g. when
the necessary food chain information is not provided within 24 hours after arrival at the
slaughterhouse, or in animals which have been treated with forbidden substances). In addition,
specific cases where meat should be declared unfit for human consumption are listed.

Finally, the frequency of official controls has been laid down (see section 7.1.3 of this contribution).

5.2. Control tasks of the official veterinarian during the production of raw milk and
dairy products

Two major areas are distinguished: the control of milk production holdings and the control
of raw milk upon collection.

In official controls attention is primarily focussed on the animals on the one hand (verifying
that the health requirements for raw milk production and in particular the health status
of the animals and the use of veterinary medicinal products, are being complied with),
and on the other on the milk production holdings themselves (verifying that the hygiene
requirements are being complied with, or inspections and/or monitoring of controls carried
out by professional organizations). In case of suspicion that the requirements for animal
health status are not fulfilled, veterinary checks by official or approved veterinarians are to
be carried out (see also sections 7.1.2 and 7.1.3 of this contribution). If a food business
operator collecting raw milk has not corrected any non-compliance with the criteria with
regard to plate count and somatic cell count, delivery of raw milk from the holding is
suspended or - in accordance with a specific authorisation of (or general instructions from)
the competent authority - subjected to requirements concerning its treatment and use
necessary to protect public health.
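
The criteria in question are laid down in Regulation (EC) No. 853/2004 as rolling geometric averages (for raw cows' milk, a plate count at 30 °C of at most 100 000 per ml and a somatic cell count of at most 400 000 per ml). A minimal sketch of such a compliance check follows; the function names are our own, and the sampling periods required by the Regulation are omitted for brevity.

```python
import math

def rolling_geometric_mean(samples):
    """Geometric mean of a series of counts, as used for the raw-milk
    plate count and somatic cell count criteria."""
    return math.exp(sum(math.log(s) for s in samples) / len(samples))

def milk_compliant(plate_counts, cell_counts,
                   plate_limit=100_000, scc_limit=400_000):
    """True if both rolling geometric averages respect the limits
    laid down for raw cows' milk in Regulation (EC) No. 853/2004."""
    return (rolling_geometric_mean(plate_counts) <= plate_limit
            and rolling_geometric_mean(cell_counts) <= scc_limit)
```

The geometric (rather than arithmetic) average dampens the influence of a single outlying sample, which is why the Regulation prescribes it for these hygiene criteria.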

5.3. Professional qualifications for the official veterinarian

Chapter IV of the Regulation stipulates that the competent authority may only appoint
individuals as ‘official veterinarian’, provided they have successfully passed a test showing
that they meet the requirements. The latter have been specifically listed, and include,
verbatim, the following 22 elements:

1. National and Community legislation on veterinary public health, food safety, animal
health, animal welfare and pharmaceutical substances.
2. Principles of the Common Agricultural Policy, market measures, export refunds and fraud
detection (including the global context: WTO, SPS, Codex Alimentarius, OIE).
3. Essentials of food processing and food technology.
4. Principles, concepts and methods of Good Manufacturing Practice and Quality Management.
5. Pre-harvest quality management (Good Farming Practices).
6. Promotion and use of food hygiene, food related safety (Good Hygiene Practices).
7. Principles, concepts and methods of risk analysis.
8. Principles, concepts and methods of HACCP, use of HACCP throughout the food
production chain.
9. Prevention and control of foodborne hazards related to human health.
10. Population dynamics of infection and intoxication.
11. Diagnostic epidemiology.
12. Monitoring and surveillance systems.
13. Auditing and regulatory assessment of food safety management systems.
14. Principles and diagnostic applications of modern testing methods.
15. Information and communication technology as related to veterinary public health.
16. Data handling and applications of biostatistics.
17. Investigations of outbreaks of foodborne diseases in humans.
18. Relevant aspects concerning TSE’s.
19. Animal welfare at the level of production, transport and slaughter.
20. Environmental issues related to food production (including waste management).
21. Precautionary principle and consumer concerns.
22. Principles of training personnel working in the production plants.

Although none of these responsibilities are unfamiliar to the veterinary profession, never
before were official veterinarians required by law to be able to execute all of these functions, and
never before did EU legislation dictate the contents of continuing education for veterinarians
in a control function in such detail. In Section 7 of this contribution we will therefore revisit
this issue in greater detail.

6. Council Directive 2002/99/EC on animal health rules (H4)

In essence, this Directive [(EC, 2004d), considered part of the hygiene package, also referred
to as H4] aims at assuring that only those products are brought on the market that originate
from healthy animals. To this end, general animal health requirements (and derogations
thereof) applicable to all stages of production, processing and distribution of products
of animal origin within the Community have been laid down, which are underpinned by
veterinary certification and official veterinary controls by EU Member States, and (in case
of importation from third countries) by Community experts, which should document that
animals and products thereof comply with Community rules.

7. Consequences of the new EU food legislation for the veterinary profession

7.1. Opportunities and challenges

7.1.1. With regard to the General Food Law

The General Food Law (Article 17) stipulates the fundamental principle that the food business
operators bear the primary responsibility for the safety of their products. This has a clear
impact on the veterinary profession.

For the veterinary practitioner it implies that the food business operator (including the
farmers) should call upon the veterinary practitioner to give advice on how to produce safe
and wholesome foods. This obviously covers the prevention and treatment of diseases, the
use of veterinary drugs and general hygiene considerations. Though not very different from
the previous situation, the General Food Law explicitly lays down the rules for the various
players for the first time in European history. Veterinary practitioners cannot take over
the primary responsibility for producing safe food from the farmer; their role is clearly that
of an advisor.

Article 17 of the General Food Law states that “.. the food and feed business operators at
all stages of production, processing and distribution within the businesses under their control
shall ensure that foods and feeds satisfy the requirements of food law which are relevant to
their activities and shall verify that such requirements are met”. The role of the competent
authority and the official veterinarian is “... to enforce food law and to monitor and verify that
the relevant requirements of food law are fulfilled by food and feed business operators at all
stages of production”. In principle, this impacts on the activities of the official veterinarian in
situations where the latter was so far accustomed to acting very much ‘hands-on’, e.g. typically
in a slaughterhouse setting, where veterinarians traditionally used to personally, or with
the help of an auxiliary, carry out all kinds of inspection activities. One might argue that
this is no longer in line with the requirement that the operator shall ensure the food safety
requirements of the food law [see the relevant sub-chapter of Regulation (EC) No. 854/2004].
However, ante- and post-mortem activities also play a role in the monitoring and control of
contagious animal diseases, which represent a (shared) government responsibility.

The requirements of the General Food Law regarding traceability are clearly stated in Article
18, which obliges operators to identify the source (person) of a food, feed, animal, or any
substance intended or expected to be incorporated into a food or feed and to have systems
and procedures in place to provide this information on demand. This has implications for the
veterinary practitioner, who - e.g. when providing a farmer with veterinary drugs - will not
only be required to document his own actions but, in general, may be asked by the farmer to
assist in fulfilling the legal requirements as regards traceability of, e.g., feed and animals.
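
Article 18 in effect obliges each operator to keep, for every lot, a record of its immediate supplier and of the lots incorporated into it, retrievable on demand. The toy sketch below (all names and structures are our own, purely for illustration) shows how such "one step back" records support tracing an ingredient chain.

```python
from dataclasses import dataclass, field

# Hypothetical record-keeping sketch for Article 18-style traceability:
# for each lot, who supplied it and which lots went into it.
@dataclass
class Lot:
    lot_id: str
    supplier: str                               # person/business it came from
    inputs: list = field(default_factory=list)  # lot_ids incorporated into it

registry: dict[str, Lot] = {}

def trace_back(lot_id: str) -> set[str]:
    """All suppliers upstream of a lot, found by walking the records."""
    lot = registry[lot_id]
    sources = {lot.supplier}
    for upstream in lot.inputs:
        sources |= trace_back(upstream)
    return sources
```

Given such records, the information Article 18 requires "on demand" is a simple recursive lookup: querying a dairy product returns both the processor and the farm that supplied the raw milk.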

7.1.2. As regards Regulation (EC) No. 882/2004 (control of feed and all foods)

The adoption of this regulation introduces new rules for all veterinarians involved in official
controls. These are of a general nature but may occasionally clearly affect their activities.

Chain approach
One of the elements of the Regulation is that official controls: “ … shall be carried out at
any of the stages of production, processing and distribution of feed or food and of animals and
animal products “ (Article 3). As a consequence, official controls on live animals cannot be
limited to the level of the slaughterhouse, but should, in one way or the other, also cover
the farm level. This inherently influences the scope of activities of the veterinary authorities,
especially in those Member States, where official veterinarians do not monitor the farm level.
However, as further described below, there are possibilities to delegate such control tasks
to private veterinarians.

“The competent authorities shall ensure that staff carrying out official controls are free from
any conflict of interest” (Article 4.2). This is relevant for situations where official controls
are carried out on a part-time basis by private practitioners. The competent authorities shall
make sure that these veterinarians do not encounter conflicts of interest. This might require
renewed arrangements between official authorities and private practitioners carrying out
certain control tasks.

Laboratory capacity
“The competent authority shall also ensure that staff carrying out official controls have, or
have access to, an adequate laboratory testing capacity for testing and a sufficient number
of qualified staff so that official controls and control duties can be carried out efficiently and
effectively” (Article 4.2). This is relevant in situations where official controls are carried out
in remote areas, e.g. as is the case for Trichinella testing in some regions. However, the
European Commission is studying possible solutions for these specific cases.

Article 4.3 deals with the conferring of the competence to carry out controls to an authority
other than a central competent authority, in which case particular care must be taken to
coordinate the various control activities. This is clearly relevant for all those Member States
(for instance Austria) where the veterinary authorities are regionalized in the framework
of federal arrangements. In Article 4.6 it has been laid down that internal audits by the
competent authorities, or external audits by a third party, are compulsory. This again requires
appropriate coordination measures.

Delegation of controls
Article 5 contains rules for the delegation of control tasks from the competent authority to
other control bodies. This could be of interest to the veterinary authorities. For example,
it allows them to delegate certain inspection activities at the farm or slaughterhouse level
to third, even private, parties. The final decision if this route should be followed lies with
the Member States.

Article 6 stipulates that the competent authority shall ensure that all of its staff performing
official controls: (1) “receive, for their area of competence, appropriate training enabling them
to undertake their duties competently and to carry out official controls in a consistent manner”,
(2) “keep up-to-date in their area of competence and receive regular additional training as
necessary”, and, finally, (3) “have aptitude for multidisciplinary collaboration”.

This is quite relevant for official veterinarians. It does not only confirm the consensus opinion
that university training does not suffice for a lifetime career, but also that veterinarians have
no legal basis to monopolize control tasks where other professions are also suitably qualified
or indeed inherently involved in control.

Article 51 elaborates on transnational EU training activities, which may be organized by the
Commission, and to which staff of the competent authorities will be invited. These will serve
to develop a harmonized approach to official control in Member States. This is of clear interest
to official veterinarians as it allows the exchange of experience with colleagues from other
Member States and promotes the development of common approaches and strategies.

Article 7 states: “... The competent authority shall ensure that they carry out their activities
with a high level of transparency. For that purpose, relevant information held by them shall be
made available to the public as soon as possible”. Notwithstanding the premise that a fair
collaboration with industry is paramount, and without prejudice to other legal requirements
applying in this regard, this does open the possibility of disseminating negative results should
the cooperation of industry be insufficient. Informing the public would surely represent a
powerful instrument for official veterinarians to lend weight to their actions and a strong
incentive for food business operators to improve their performance.

Multiannual national control plans

Article 41 states that each Member State shall prepare a single integrated multiannual
national control plan. Article 43 further elaborates on this issue. The multiannual control
plan shall take account of the guidelines to be drawn up by the Commission, aiming, among
other things, at the following: (1) promoting a consistent, comprehensive and integrated
approach to official controls of feed and food, animal health and animal welfare legislation,
and embracing all sectors and all stages of the feed and food chain, including import and
introduction, (2) identifying risk-based priorities and criteria for the risk categorisation
of the activities concerned and the most effective control procedures, (3) identifying the
stages of production, processing and distribution of feed and food, including the use of
feed, which will provide the most reliable and indicative information about compliance with
feed and food law, and, finally, (4) encouraging the adoption of best practices at all levels
of the control system. Multiannual control plans and the related guidelines shall, where
appropriate, be adapted on the basis of the conclusions and recommendations contained in
yearly Commission reports.

Such control plans obviously integrate the veterinary controls and, more interestingly, place
these in the broader perspective of all official controls carried out in the Member States in
the field of feed and food. This should allow for fine-tuning the veterinary controls in relation
to all other controls and for operating in a more strategic way throughout the food chain,
which should have a positive effect on food safety and consumer protection as a whole.

7.1.3. With regard to Regulation (EC) No. 854/2004 (controls of food of animal origin)

This Regulation more particularly lays down the specific rules for meat inspection. It therefore
impacts in a very direct way on the official veterinarians involved in this type of activity.
Although the new rules retain the central role of the official veterinarian, the activities to be
carried out change considerably, as they become more risk-based, more chain-oriented, and
have the character of gaining oversight rather than necessarily involving hands-on actions.
Overall, the role of the official veterinarian becomes more challenging.

Control of the systems put in place by the operator

Under the new hygiene rules [contained in Regulations (EC) No. 852/2004 and 853/2004],
food business operators such as slaughterhouse operators are required to put in place Good
Hygiene Practices and procedures based on the principles of HACCP. This enables them to take
their (primary) responsibility for the safety of their products. The official veterinarian
will carry out audits and act on weaknesses in the systems.

This is a challenge for official veterinarians involved in meat inspection. To be able to carry
out these kinds of audits, they need a clear understanding of how systems work. As, in
Europe, the latter is not universally provided by undergraduate University training, additional
education is crucial. Furthermore, this type of activity places official veterinarians in another
role vis-à-vis the food business operator, demanding that they acquire better communication
skills and adopt a special attitude in their role as auditor.

Food chain information

Animals may only be delivered for slaughter provided they are accompanied by relevant information
from the farm. This must cover all aspects that may be relevant for the safety of the meat, and
includes information on animal diseases, the use of veterinary drugs, production data, feed
used etc. This so-called Food Chain Information is to be drawn up by the farmers themselves
- although they may ask the private practitioner for assistance - and to be passed to the
slaughterhouse operator, who subsequently decides whether or not to slaughter the animals.
Ultimately, the official veterinarian receives and analyses the Food Chain Information and
will put it to use for purposes of inspection.

Assuming this requirement is implemented in such a way that a credible system is created, this
would offer official veterinarians much more insight into the slaughter-animals’ background,
allowing them to carry out inspection in a more risk-based fashion, and would make their
tasks more interesting and challenging.

Ante-mortem inspection procedures

The new rules for ante-mortem inspection do not fundamentally change current practice.
Ante-mortem inspection should be carried out at the slaughterhouse by official veterinarians
themselves (possibly supported by official auxiliaries). However, in certain cases, for example for poultry
and fattening pigs, inspection may be carried out at the farm. This may be done by a so-called
‘approved veterinarian’, i.e. a private practitioner designated by the competent authority. This
is another aspect of the ‘chain approach’ that is being introduced in meat inspection.

The role of the official veterinarian as regards animal welfare is now clearly a part of the
meat inspection system.

Post-mortem inspection procedures

The post-mortem inspection procedures for slaughter pigs have been revised on the basis
of scientific advice. Fattening pigs may, under certain conditions, be inspected by visual
inspection procedures only. However, the official veterinarian may decide that certain
palpation or incision activities should be carried out on a routine basis. Furthermore, all
(visually) abnormal carcasses should be inspected in more detail by conducting palpation
and incision. In practice, this means that the official veterinarian will have to make an
evaluation based on a risk analysis for each batch of slaughter pigs and decide whether
additional palpation or incision is necessary. This is a major change from the previous
situation, where all pigs were always inspected in the same way, and implies that, in practice,
fewer people will be involved in carrying out routine post-mortem inspection procedures.
Ultimately, it might also stimulate farmers to improve husbandry practices.
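
Expressed as decision logic, the batch-level evaluation might look like the following sketch. The field names and triggering rules are our own simplification for illustration, not the Regulation's wording.

```python
# Hypothetical batch-level decision sketch for fattening pigs:
# visual-only inspection is the default; palpation/incision is added
# when the risk evaluation for the batch warrants it.
def inspection_regime(batch: dict) -> str:
    """Pick the post-mortem regime for a batch of fattening pigs."""
    # Any visually abnormal carcass triggers detailed inspection.
    if batch.get("abnormal_carcasses", 0) > 0:
        return "detailed"        # palpation and incision required
    # Risk signals in the food chain information do the same.
    if batch.get("food_chain_flags"):
        return "detailed"
    return "visual-only"         # default under the new rules
```

The point of the sketch is that the regime is no longer fixed in advance: the same slaughter line may run visual-only for one batch and revert to palpation and incision for the next, depending on the information available.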

The post-mortem inspection procedures for other categories of animals, such as sheep, goats
and veal calves, are in the process of being revised. We assume that the procedures for these
species will be changed as well.

In summary, post-mortem inspection is gradually evolving from a ‘standard recipe’ to a
risk-based set of procedures, with equal or better guarantees for the consumer.

Laboratory procedures
Regarding laboratory activities, certain routine analyses that were previously compulsory may
no longer be so in the future, as long as appropriate guarantees are provided for. Probably the
best example is Trichinella testing. Fattening pigs that originate from certified Trichinella-free
herds need no longer be investigated for Trichinella on a routine basis.

In general, laboratory tests that provide results at least equivalent to the tests described in
the new legislation will be allowed for use. Furthermore, the use of alternative approaches
that give a better result, e.g. serological testing in the case of cysticercosis, is being
considered.

Feedback of information to the farm

All relevant information resulting from inspection shall be fed back by the official veterinarian
to the private practitioner and the farmer. This will allow the latter, supported by the private
practitioner, to take appropriate measures to prevent recurrence of the problem.

This is obviously rather time-consuming and represents an additional burden for the official
veterinarian. At the national level appropriate procedures, possibly based on electronic
systems and databases, should be developed to make this process as efficient as possible
and to guarantee consistency of information.

Presence of the official veterinarian

Without prejudice to various particular cases the rules can be summarized as follows: in
slaughterhouses the presence of at least one official veterinarian is required during the entire
period of ante- and post-mortem inspection, and in enterprises handling game during the
entire period of post-mortem inspection. In cutting plants the official veterinarian or auxiliary
has to be present as frequently as necessary to achieve the goals of the Regulation.

Company staff
As was the case previously, the new Regulation allows for the involvement of company staff
in post-mortem inspection of poultry and lagomorphs. However, clear requirements are now
laid down regarding the training of the relevant company staff.

The new regulations contain specific requirements regarding training of official veterinarians.
These are supposed to be well-trained in the elements indicated under Section 5.3 of this
contribution.

For certain older veterinarians, or those official veterinarians working on a part-time basis,
this extra training may be experienced as a burden. Yet, not only is the training vital to secure
appropriate official control, it should in general also help to guarantee the professionalism
of the job, which is similarly demanded of modern slaughterhouse staff. Unless veterinarians
accept being looked upon as a costly, inevitable burden to the operator, the profession is
well-advised to shape up and, as a result, be considered as capable professionals and worthy
counterparts to the companies’ own quality assurance staff.

National derogations
The new Regulation allows, under certain conditions, for derogations on a national basis
to accommodate specific situations (e.g. remote areas, small plants, traditional production
methods). It also allows for the development of innovative approaches and the preparation
of the legislation’s revision.

7.2. Why the European veterinary profession needs to shape up

Veterinary public health control structures that have so far been in existence in European
countries are diverse, and involve varying (some argue not necessarily proportionate) numbers
of veterinarians. Whereas in some European countries only full- or part-time veterinarians are
involved in control tasks such as meat inspection, in others (e.g. in the Netherlands, where
routine meat inspection is conducted by lay meat inspectors, or in the United Kingdom, where
the bulk of the meat control activities is traditionally carried out by environmental health
officers), a considerable number of non-veterinary auxiliaries already participate in food
control (Buschulte and Reuter, 1996; Smulders, 1999). Consequently, the ramifications of the
legislation reform for the labour market of veterinarians in a food or feed control function
will be distinctly different across Europe.

In countries where their involvement has been limited, job opportunities for veterinarians
may indeed be created (the major motivation obviously being that veterinarians - more than
any other profession - should by the nature of their education be best qualified to fulfil the
various control functions and to assist in the further development of reliable and adequate
control methods). On the other hand, the new legislation implies that, within a very short
time, the number of veterinarians (and/or auxilliaries) that up to now were involved in
end-product control will decrease significantly. Fortunately, some control functions have
shifted to other parts of the food chain and can still be carried out by, e.g., (‘approved’) private
veterinarians as indicated in Section 7.1 of this contribution and by Lebrun (2004). The
latter author sketches how this approach [underpinned by the establishment of a solid quality
control system of the private veterinary profession along the lines described by De Bruin
and De Ruijter (2004)] could lead to a de facto territorial network of competent, logistically
independent professionals, acting as ad hoc public service agents.

8. Conclusions

8.1. What has been achieved

A coherent, well-structured set of Community laws on food safety assurance has been issued,
which forms the basis for the adaptation of national legislation at Member State level.
Responsibilities are now explicitly placed with the producer, and the tasks of competent
authorities and official veterinarians have been largely restricted to verifying that the
requirements stipulated in the law are complied with. This approach promises to offer better opportunities
for safeguarding animal health and welfare and to ensure that foods of animal origin are safe
and wholesome for the consumer. The changes in legislation have considerable consequences
for the veterinary profession.

8.2. What has been neglected

The veterinary profession needs to confront the fact that the legal basis to claim many
traditional control functions has been significantly reduced, and that the remaining control
functions will only be assigned to veterinarians, provided they can demonstrate their increased
competence. Although in most European countries the educational basis for their ability
to carry out control functions is in place in the form of curricular core provisions of the
various establishments for veterinary training, it must be conceded that the transition from
simply providing ‘starting competences’ to a seamless system of more targeted under- and
postgraduate education in veterinary public health has not yet taken place. This requires
the joint efforts of the entire profession, both academic teaching staff and the veterinary
professionals working ‘on the front’. There is work to be done.

8.3. What needs to be done

At the undergraduate university training level in the area of veterinary public health, the
above dictates significant curriculum restructuring to assure that a more solid scientific
basis for the subsequent postgraduate education of veterinarians in an (official) control

222  Towards a risk-based chain control

Frans J.M. Smulders, Reinhard Kainz and Martijn J.B.M. Weijtens

function is provided (Smulders, 1999; Korkeala et al., 2003). The European Association of
Establishments for Veterinary Education (EAEVE) seems well-advised to insist more explicitly
that these curricular elements be firmly based on the requirements formulated in the current
legislation on veterinary public health and food safety, as discussed here.

At the continuing education level it is vital that the national competent authorities carefully
identify where their postgraduate courses should take over. Currently, working groups in the
various European countries are elaborating national continuing education schemes. It would
seem prudent for those made responsible to keep in close contact with each other, to allow
a truly uniform European approach.


Acknowledgements

The Austrian Society of Veterinary Medicine, editor of the Wiener Tierärztliche Monatsschrift
(Veterinary Medicine Austria), is gratefully acknowledged for its kind permission to reproduce
(translated) parts of publications by Kainz et al. (2004) and Weijtens et al. (2005).


References

Berends, B.R., Snijders, J.M.A. and Van Logtestijn, J.G., 1993. Efficacy of current meat inspection procedures and
some proposed revisions with respect to microbiological safety: a critical review. Vet. Rec. 133, 411-415.
Buschulte, A. and Reuter, G., 1996. Ausbildung und Funktion des Tierarztes im öffentlichen Gesundheitswesen
(‘Veterinary Public Health’) im Bereich der Lebensmittelüberwachung in einzelnen Ländern der EU. In:
Deutsche Veterinärmedizinische Gesellschaft (DVG), Proc.: Teil I (Vorträge), 37. Tagung des Arbeitsgebietes
‘Lebensmittelhygiene’, 30 September-2 Oktober, 1996, Garmisch-Partenkirchen, p. 344-353.
Daelman, W., 2002. The EU Food Safety action plan. In: F.J.M. Smulders and J.D. Collins (Eds.) Food Safety Assurance
and Veterinary Public Health, Vol 1. Food Safety Assurance in the Pre-Harvest Phase. Wageningen Academic
Publishers, Wageningen, The Netherlands, p. 17-22.
De Groot, S.J. and De Ruijter, T., 2004. Quality control of the veterinary profession in The Netherlands. Rev. sci.
techn. Off. int. Épiz. 23, 175-185.
EC, 2002. Regulation (EC) No. 178/2002 of the European Parliament and the Council of 28 January 2002, laying down
the general principles and requirements of food law, establishing the European Food Safety Authority and laying
down procedures in matters of food safety. Official Journal of the European Union, Brussels, O.J. L 31/1.
EC, 2003. Council Directive 2002/99/EC of 16 December 2002 laying down the rules governing the production,
processing, distribution and introduction of products of animal origin for human consumption, Official Journal
of the European Union, Brussels, O.J. L18/11.
EC, 2004a. Regulation (EC) No. 852/2004 of the European Parliament and of the Council of 29 April 2004, on the
hygiene of foodstuffs. Official Journal of the European Union, Brussels, O.J. L138/1.
EC, 2004b. Regulation (EC) No. 853/2004 of the European Parliament and of the Council of 29 April 2004, containing
specific rules for the hygiene of products of animal origin. Official Journal of the European Union, Brussels,
O.J. L139/55.
EC, 2004c. Regulation (EC) No. 854/2004 of the European Parliament and of the Council of 29 April 2004, containing
specific rules concerning official controls on products of animal origin. Official Journal of the European Union,
Brussels, O.J. L155/206.


EC, 2004d. Regulation (EC) No. 882/2004 of the European Parliament and of the Council of 29 April 2004, on official
controls performed to ensure the verification of compliance with feed and food law, animal health and welfare
rules. Official Journal of the European Union, Brussels O.J. L165.
EC, 2005a. Regulation (EC) No. 183/2005 of the European Parliament and the Council of 12 January 2005, laying
down requirements for feed hygiene. Official Journal of the European Union, Brussels, O.J. L35/1.
EC, 2005b. Commission Regulation (EC) No. 2073/2005 of 15 November 2005 on microbiological criteria for
foodstuffs. Official Journal of the European Union, Brussels, O.J. L338/1.
EC, 2005c. Commission Regulation (EC) No. 2074/2005 of 5 December 2005 laying down implementing measures for
certain products under Regulation (EC) No. 853/2004 of the European Parliament and of the Council and for
the organisation of official controls under Regulation (EC) No. 854/2004 of the European Parliament and of
the Council and Regulation (EC) No. 882/2004 of the European Parliament and of the Council, derogating from
Regulation (EC) No. 852/2004 of the European Parliament and of the Council and amending Regulations (EC)
No. 853/2004 and (EC) No. 854/2004. Official Journal of the European Union, Brussels, O.J. L338/27.
EC, 2005d. Commission Regulation (EC) No. 2075/2005 of 5 December 2005 laying down specific rules on official
controls for Trichinella in meat. Official Journal of the European Union, Brussels, O.J. L338/60.
EC, 2005e. Commission Regulation (EC) No. 2076/2005 of 5 December 2005 laying down transitional arrangements
for the implementation of Regulations (EC) No. 853/2004, (EC) No. 854/2004 and (EC) No. 882/2004 of the
European Parliament and of the Council and amending Regulations (EC) No. 853/2004 and (EC) No. 854/2004.
Official Journal of the European Union, Brussels, O.J. L338/83.
EC, 2006, Website (HACCP guidelines) visited May 2006, pathway:
European Commission, 2000. White Paper on Food Safety. COM (1999) 719 final.
Harbers, A.H.M., 1991. Aspects of meat inspection in an integrated quality control system for slaughter pigs. PhD
Thesis, Fac. Vet. Med., University of Utrecht, The Netherlands.
Harbers, A.H.M., Smeets, J.F.M., Faber, J.A.J., Snijders, J.M.A. and Van Logtestijn, J.G., 1992. A comparative study
into procedures for post mortem inspection of finishing pigs. J. Food Prot. 54, 471-475.
Hathaway, S.C. and MacKenzie, A.I., 1991. Postmortem meat inspection programs; separating science and tradition.
J. Food Prot. 54, 471-475.
Kainz, R., Smulders, F., Luf, W. and Weijtens, M.J.B.M., 2004. Lebensmittelrecht und -kontrolle im Umbruch, Teil A.
Rechtliche Grundlagen. Vet. Med. Austria/Wien. Tierärztl. Mschr. 91 (suppl. 1) 13-24.
Korkeala, H., Lindström, M. and Fredriksson-Ahomaa, M., 2003. Food hygienic research and education in veterinary
schools: the presence and the future. Archiv für Lebensmittelhygiene 54, 97-152.
Le Brun, Y., 2004. Mechanisms for collaboration between public and private veterinarians: the animal health
accreditation mandate, Rev. sci. techn. Off. int. Épiz. 23, 69-77.
Smulders, F.J.M., 1999. Veterinary public health and -food science in Europe; the current status on university
education, continuing professional development and specialisation. In: F.J.M. Smulders (Ed.). Veterinary aspects
of meat production, processing and inspection; an update of recent developments in Europe, Publ. ECCEAMST,
Utrecht, The Netherlands, p. 387-403.
Smulders, F.J.M. and Paulsen, P., 1999. Reform der Fleischuntersuchung und warum? oder: vorgefasste Meinungen,
Wissenschaft und die Leistungsfähigkeit des gegenwärtigen Fleischuntersuchungssystems gegenüber zukünftigen
Systemen. Vet. Med. Austria/Wien. tierärztl. Mschr. 84, 280-287.
Weijtens, M.J.B.M., Smulders, F.J.M., Stangl, P.-V., Sterneberg-van der Maaten, T., Kainz, R. and Luf, W. 2005.
Fundamental changes in food legislation; Part B: The new EU food safety legislation - consequences for the
veterinary profession. Vet. Med. Austria/Wien. Tierärztl. Mschr. 92, 191-195.


Synopses of other conference contributions

Is Vibrio parahaemolyticus a risk pathogen in Brazil?

S.D. Amorim1, C.S. Pereira1, A. Lafisca2 and D.P. Rodrigues1
1Oswaldo Cruz Foundation, Rio de Janeiro, Brazil
2Dipartimento Sanità Pubblica, patologia comparata e igiene veterinária – Universitá di
Padova, Italy


Abstract

A prospective study was conducted to evaluate virulence and clonal diversity in 319
V. parahaemolyticus strains from human sources (29), the environment (97) and seafood (193),
isolated from 1992 to 2004 in different geographic areas of Brazil. The isolates were identified
as Vibrio parahaemolyticus according to the B.A.M. (FDA, 2001), as well as by urease production
in Christensen's urea agar supplemented with different NaCl concentrations (0.5%, 1%, 2%
and 3%). To characterize the serotype prevalence, the method described in the FDA manual
(2004) was followed. The prevalent serogroups identified were O1, O10, O3, O11, O8 and
O5. Hemolysin production was observed in 17.2% of strains from human, 2% from food
and 1% from environmental sources. Recent increases in the incidence of Vibrio infection
(especially outbreaks due to Vibrio parahaemolyticus) call for a better understanding of
the virulence mechanisms, clonal diversity and implications for prevention and control in
tropical areas. Comparative analysis among environmental, clinical and seafood strains
might be helpful for elucidating the potential risk in Brazil.


Introduction

Vibrio parahaemolyticus is an important human pathogen; the number of documented
outbreaks associated with the consumption of raw or undercooked seafood has increased
globally since 1995. By the end of the 1990s its prevalence had increased worldwide. Among
the Vibrio parahaemolyticus serovars, three have been attributed a pandemic potential:
O3:K6, O4:K68 and O1:KUN. Although trends in population-based incidence have not been
reported for Vibrio, it is likely that Brazil's tropical weather is responsible for the year-round
high prevalence of this pathogen in the environment. Studies implicate the thermostable
direct hemolysin (TDH) and the TDH-related hemolysin (TRH), encoded by the tdh and trh genes,
respectively, as the major virulence factors of this organism. The presence of the tdh gene
is marked by phenotypic characteristics such as β-hemolysis on Wagatsuma agar, and that of
the trh gene by a positive urease test.

Material and methods

A prospective study has been conducted to evaluate the virulence and clonal diversity in 319
V. parahaemolyticus strains from human sources (29), environment (97) and seafood (193)
isolated from 1992 to 2004 in different geographical areas of Brazil.


All strains were kept on buffered Nutrient Agar with 3% NaCl at room temperature, sub-cultured
in Alkaline Peptone Water (APW) with 1% NaCl and streaked onto TCBS Agar (37°C for 16 h).
Colonies with blue or green centres were stabbed on Nutrient Agar with 3% NaCl, Kligler Iron
Agar and Lysine Iron Agar, followed by biochemical identification according to the B.A.M.
(FDA, 2001). Christensen's urea Agar (Difco) was used to evaluate urease hydrolysis.

The Kanagawa phenomenon, related to the TDH hemolysin, was assessed on Wagatsuma Agar
(Miyamoto et al., 1969). The O:K serovar of each test strain was determined by agglutination
tests with specific antisera (Denka Seiken, Tokyo) according to the B.A.M. (FDA, 2004).


Results

The urease activity among the 319 strains showed 10 positive results (5 human, 4 food and 1
environmental). Among the 122 food strains analyzed, the most prevalent serotypes identified
were: O10:KUN (21), O1:KUN (9), O8:KUN (8), O5:K17 (7), O3:K72 (6), O10:K69 (6),
O11:KUN (5) and O3:K57 (4).

Beta hemolysis on Wagatsuma Agar was observed in 12% (7 food and 8 human strains) of
the 125 (80 food, 5 environment and 13 human) strains analyzed.
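As a quick arithmetic check, the percentage reported above can be reproduced with a short tally (an illustrative sketch only; the counts are those quoted in the text, not the original dataset):

```python
# Beta-hemolysis on Wagatsuma agar: positive strains per source, as reported in the text
positives = {"food": 7, "human": 8}
analyzed_total = 125  # total number of strains screened

total_positive = sum(positives.values())      # 7 + 8 = 15
pct = 100 * total_positive / analyzed_total   # 15/125 = 12.0
print(f"{total_positive}/{analyzed_total} strains ({pct:.0f}%) were beta-hemolytic")
```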


Conclusions

We found high rates of strains producing urease and beta-hemolysin (phenotypic evaluation),
characteristics that do not always reflect the strains' real virulence potential (tdh and trh
genes). Continued molecular evaluation is therefore warranted, especially for the detection
of the O1:KUN and O3:K6 serotypes, which are recognized as potentially pandemic, in order
to elucidate the real risk in Brazil.


References

Miyamoto, Y., Kato, T., Obara, Y., Akiyama, S., Takizawa, K.S. and Yamai, S., 1969. In vitro hemolytic characteristic
of Vibrio parahaemolyticus: its close correlation with human pathogenicity. J. Bacteriol. 100, 1147-1150.
U.S. Food and Drug Administration (FDA), 2001. Bacteriological Analytical Manual on line, Jan. 2001 End.: www.
U.S. Food and Drug Administration (FDA), 2004. Bacteriological Analytical Manual on line, May. 2004 End.: www.


Implementation of a risk based chain control through the detection of some Escherichia
coli genes in faecal swabs and food products with multiplex PCR assay
E. Bartocci1, A. Codega de Oliveira1, R. Ortenzi1, S. Costarelli2, S. Crotti2,
S. Scuota2, A. Zicavo2, A. Vizzani1 and B.T. Cenci Goga1
1Dipartimento di Scienze degli Alimenti, Facoltà di Medicina Veterinaria, Università degli
Studi di Perugia, Italy
2Istituto Zooprofilattico Sperimentale dell’Umbria e delle Marche, Perugia, Italy


Enterohemorrhagic Escherichia coli (EHEC) have been globally recognized as important
food-borne pathogens and are considered a major cause of gastrointestinal disease (Fagan
et al., 1999). The illness can range from a mild form of diarrhoea to more severe forms
known as hemorrhagic colitis (HC), haemolytic uremic syndrome (HUS) and thrombotic
thrombocytopenic purpura (TTP), caused by Escherichia coli O157:H7 strains (Cebula et al.,
1995; UNI10980, 2002). Infection has been associated with the consumption of raw or
undercooked meat of food animals or of other foods contaminated with animal faeces (Kumar
et al., 2004). The aim of the present study was to determine the molecular characteristics
of Shiga toxin-producing E. coli (STEC) isolated from faecal samples and foodstuffs, in order
to gather epidemiological data, to reduce their diffusion along the food chain and, finally,
to implement the control of this microbial risk. A multiplex polymerase chain reaction (mPCR)
assay was developed for the detection of genes encoding the most important virulence
factors implicated in the pathogenesis of the disease: stx1 and stx2 (Shiga toxins STX1 and
STX2), eaeA (intimin protein) and hly-A (entero-haemolysin).

Material and methods

Isolation of E. coli strains: Eighty-two faecal samples originating from cattle and fifty-six food
products, including meats and dairy products, were investigated for the presence of E.
coli. A total of 82 E. coli strains were isolated by microbiological methods: 66 from
faecal samples (64 from swabs and 2 from calves' faeces) and 16 from foods.

Faecal samples were directly inoculated on MacConkey's agar plates; after overnight
incubation at 37°C, colonies were picked and sub-cultured on blood agar plates. Pure
cultures were subjected to standard morphological and biochemical tests (Buchanan et al.,
1994). Foodstuffs were investigated for the presence of Escherichia coli according to Norma
UNI 10980-2002 (2002).

DNA extraction: Bacterial DNA extraction was performed from pure colonies on blood agar
plates using the QIAamp minikit® (Qiagen, Milano, Italy) (Bartocci et al., 2004).


Multiplex PCR: Primers and the predicted lengths of the PCR amplification products are listed
in Table 1 (Fagan et al., 1999). Reagent concentrations and temperature conditions are
described in Table 2 (Bartocci et al., 2004). PCR was carried out in a GeneAmp PCR System
9700 Gold thermocycler (Applied Biosystems, Foster City, USA). The amplified PCR products
were electrophoresed on a 2% agarose gel, stained with ethidium bromide (0.5 µg/ml)
and photographed using a Fotodine 3-3102 (Celbio, Milano, Italy).

Table 1. Oligonucleotide primers used in multiplex PCR.

Target gene Primer Sequence 5’-3’ Amplicon size References

EHEC hlyA hly-f acg atg tgg ttt att ctg ga 165 bp (Fratamico et al., 1995)
hly-r ctt cac gtg acc ata cat at
stx1 stx1-f aca ctg gat gat ctc agt gg 614 bp (Gannon et al., 1992)
stx1-r ctg aat ccc cct cca tta tg
stx2 stx2-f cca tga caa cgg aca gca gtt 779 bp (Gannon et al., 1992)
stx2-r cct gtc aac tga gca gca ctt tg
eaeA eaeA-f gtg gcg aat act ggc gag act 890 bp (Gannon et al., 1997)
eaeA-r ccc cat tct ttt tca ccg tcg

Table 2. PCR conditions.

Target genes        Mix conditions                               Temperature conditions

EHEC hlyA, eaeA,    Buffer: 1X; MgCl2: 1.5 mM;                   95°C x 3'
stx1, stx2          dNTP: 200 µM; Taq: 4 U;                      95°C x 20''
                    primer stx1: 0.2 µM; primer stx2: 0.2 µM;    58°C x 40''   35 cycles
                    primer eaeA: 0.2 µM; primer hlyA: 0.4 µM     72°C x 90''
                                                                 72°C x 5'


Results

Multiplex PCR results are described in Table 3. Thirteen of the 82 E. coli strains were identified
as both stx1 and stx2 positive (15.9%).


Table 3. Multiplex PCR results.

Samples          Screened   E. coli strains   Virulence factors' genes
                            isolated          EHEC-hlyA   stx1         stx2         eaeA       stx1/stx2

Faecal samples   82         66                2 (2.4)*    14 (17.1)*   19 (23.2)*   1 (1.2)*   13 (15.9)*
Foods            56         16                0           0            0            0          0
Total            138        82                2           14           19           1          13

*percentage values refer to the total number of Escherichia coli strains isolated
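The percentages in Table 3 follow directly from the isolate counts; a minimal sketch of the calculation (counts copied from the table, gene labels as in the text):

```python
# Virulence-gene counts among the 82 E. coli isolates (from Table 3)
n_isolates = 82
gene_counts = {"EHEC-hlyA": 2, "stx1": 14, "stx2": 19, "eaeA": 1, "stx1/stx2": 13}

for gene, n in gene_counts.items():
    # percentage relative to the total number of isolated strains
    print(f"{gene}: {n}/{n_isolates} = {100 * n / n_isolates:.1f}%")
```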


Conclusions

Escherichia coli strains containing virulence factor genes derived only from faecal samples;
these observations suggest that contamination of the foodstuffs examined via the faecal route
is unlikely. It is possible to postulate that secondary contamination during food preparation
could play a role in the diffusion of STEC (Shiga toxin-producing E. coli).


References

Bartocci, E., Codega de Oliveira, A., Ortenzi, R., Costarelli, S., Crotti, S., Scuota, S., Vizzani, A. and Cenci Goga, B.T.,
2004. Messa a punto di una metodica di m-PCR per la ricerca dei geni hlyA, stx1, stx2 ed eaeA in Escherichia
coli enteroemorragici. XIII Conferenza Nazionale OXOID, 11 March 2004, Bologna, Italy.
Buchanan, R.E. and Gibbon, N.E., 1994. Bergey’s Manual of Determinative Bacteriology, 9th edition, Baltimore,
Williams & Wilkins, p. 787.
Cebula, T.A., Payne, W.L. and Feng, P., 1995. Simultaneous Identification of Strains of E. coli Serotype O157:H7
and their Shiga-Like Toxin by Mismatch Amplification Mutation Assay-Multiplex PCR. J. Clin. Microbiol. 33,
Fagan, P.K., Hornitzky, M.A., Bettelheim, K.A. and Djordjevic, S.P., 1999. Detection of Shiga-Like Toxin (stx1 and
stx2), Intimin (eaeA), and Enterohemorrhagic Escherichia coli (EHEC) Hemolysin (EHEC hlyA) Genes in Animal
Faeces by Multiplex PCR. Applied and Environmental Microbiology 65, 868-872.
Fratamico, P.M., Sackitey, S.K., Wiedmann, M. and Deng, M.Y., 1995. Detection of Escherichia coli O157:H7 by
multiplex PCR. J. Clin. Microbiol. 33, 2188-2191.
Gannon, V.P.J., King, R.K., Kim, J.Y. and Golsteyn Thomas, E.J., 1992. Rapid and sensitive method for detection
of Shiga-like toxin-producing Escherichia coli in ground beef using the polymerase chain reaction. Applied and
Environmental Microbiology 58, 3809-3815.
Gannon, V.P.J., Souza, S.D., Graham, T., King, R.K., Rahn, K. and Read, S., 1997. Use of the flagellar H7 gene as a
target in multiplex PCR assays and improved specificity in identification of enterohaemorrhagic Escherichia coli
strains. J. Clin. Microbiol. 35, 656-662.
Kumar, H.S., Karunasagar, I. and Teitzou, T., 2004. Characterisation of Shiga toxin-producing Escherichia coli (STEC)
isolated from seafood and beef. FEMS Microbiol. Letters 233, 173-178.
UNI10980, 2002. Metodo di routine per la conta di E. coli ß-glucuronidasi positivi: tecnica di conta delle colonie
a +44°C.


Real-Time PCR method as a powerful tool to detect Escherichia coli O157:H7 in wastewater
produced from Mozzarella cheese factories
L. Beneduce1, G. Spano1, S. Baldassarre1, V. Terzi2, G. La Salandra3 and
S. Massa1
1Department of Food Science, Foggia University, via Napoli 25, 71100 Foggia, Italy,
2Istituto Sperimentale per la Cerealicoltura, Via S. Protaso 302, 29017 Fiorenzuola d'Arda (PC), Italy
3Istituto Zooprofilattico della Puglia e della Basilicata, 71100 Foggia, Italy


Although methods for the effective treatment of wastewater exist and are essential to ensure
the safety of drinking water and crops, wastewater originating from dairies and cattle
housing may harbour bacterial species including human pathogens such as Escherichia coli
O157:H7. Enterohaemorrhagic Escherichia coli O157:H7 is one of the most studied foodborne
pathogenic bacteria, because of its widespread diffusion, low infectious dose and the severe
symptoms associated with infection. Since dairy cattle are considered a major reservoir of
Escherichia coli O157:H7, this pathogen may potentially contaminate the drinking water supply
via cattle waste wash water. Conventional culture techniques for the detection of E. coli
O157:H7 in water and the environment are labour-intensive and time-consuming, and often
fail to recover this bacterium when its concentration in the sample is low (below the threshold
of sensitivity) or when cells enter a viable but non-culturable (VNC) state. In this study we
developed and evaluated a molecular method based on real-time PCR for the identification
of E. coli O157:H7 in cattle and dairy wastewater samples produced from Mozzarella cheese
factories, without a pre-enrichment step before DNA extraction. Moreover, we were able to
identify E. coli O157:H7 in cattle and dairy wastewater samples in which previous methods
(including standard PCR) failed.


NIR analysis of veal meat as an easy way to discriminate illicit hormone treatment
P. Berzaghi1, S. Segato1, E. Soardo1, L. Serva1, M. Mirisola1, A. Corato1
and I. Andrighetto2
1Department of Animal Science, University of Padua, 35020 Legnaro (PD), Italy
2Istituto Zooprofilattico delle Venezie, 35020 Legnaro (PD), Italy

The use of illicit hormones, like boldenone, in animal production enhances animal performance.
Control methods in veal and beef, however, are not effective in detecting this illegal practice.
The main reason is that after such illegal treatments, hormones and their metabolites are
rapidly cleared and very difficult to detect in blood, urine or meat. On the other hand, the
use of boldenone affects the chemical composition of animal carcasses, so it is expected
that its analysis may reveal differences between treated and control calves.

Near infrared spectroscopy (NIRS) is a fast and non-destructive method of analysis, based
on the absorption of light by organic bonds.

Given the signal-to-noise ratio, the low concentration of boldenone and its derivatives in
meat and the high moisture level (water strongly affects the spectrum because of its high
absorption), NIRS might seem an inadequate method of analysis. However, NIRS is routinely
used for meat analysis and may be able to detect differences in meat composition due to
hormonal treatment. The aim of this study was to evaluate the feasibility of using NIRS
to discriminate between meat of hormone-treated and untreated veal calves during rearing.

Material and methods

Two datasets of beef samples, from the longissimus dorsi of German Brown male calves, were
scanned with a FOSS NIR System 5000 (1,100-2,498 nm, gap 2 nm). Samples (fresh ground and
freeze-dried ground meat) were scanned in duplicate until a low RMS (root mean square)
between sub-sample spectra was obtained. The first dataset was composed of samples obtained
from 13 calves treated with boldenone (BOLD) and its derivatives (ADD). The second dataset
was composed of 13 control samples.

Three samples were randomly selected from each dataset and used to build a validation (VAL)
dataset. The remaining samples were used to perform the NIR calibration (CAL). Samples were
scanned as freshly ground meat and again after freeze-drying and grinding. Using discriminant
analysis (first 2 principal components) on both fresh ground and freeze-dried ground meat,
five of the six VAL samples were correctly predicted and one was not.

Discriminant analysis was performed using modified partial least squares analysis (MPLS)
after scatter correction (standard normal variate and detrending) and second derivative with
a gap and smoothing of 10 nm, using the WinISI 1.5 software (Infrasoft International, USA).
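WinISI's MPLS algorithm is proprietary, but the general idea of the discriminant step — project the spectra onto a few principal components and assign each sample to the nearer class — can be sketched with plain PCA and a nearest-centroid rule. This is not the authors' implementation; all data below are synthetic stand-ins for NIR spectra:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for NIR spectra: 20 samples x 100 wavelengths,
# with a class-dependent offset mimicking a treatment effect
X = rng.normal(size=(20, 100))
y = np.array([1] * 10 + [0] * 10)  # 1 = treated, 0 = control
X[y == 1] += 1.0                   # treatment shifts the spectra

# Centre the spectra and take the first two principal components via SVD
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # 20 x 2 score matrix

# Nearest-centroid discriminant rule in score space
centroids = {c: scores[y == c].mean(axis=0) for c in (0, 1)}
pred = np.array([min(centroids, key=lambda c: np.linalg.norm(p - centroids[c]))
                 for p in scores])
print("training accuracy:", (pred == y).mean())
```

In the study proper, the dummy-variable regression (1 = treated, 0 = control) plays the role of the centroid rule here, and validation is done on the held-out VAL samples rather than on the training set.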


Table 1. Use of discriminant analysis in the prediction of 6 fresh ground samples used as validation dataset.

                    Predicted             Total
                    Treated    Control

Actual   Treated    3          0          3
         Control    1          2          3
Errors              1          0

Results and discussion

Strong spectral differences were observed in scores 2 and 3, but not in component 1 or in
components higher than 3. Therefore it was decided to use only two components (adding
further scores would cause overfitting in the calibration of the dataset).

Under control conditions, differences were observed in the principal component analysis.
Plotting components 2 and 3 in a two-dimensional plane, two groups are easily recognized
(Figure 2).

Calibration on fresh ground meat was performed with MPLS, assigning a dummy value of 1 to
positive samples and 0 to control samples, obtaining a predicting equation with a fraction
of explained variance of 0.77. Validating with the VAL dataset on fresh ground meat resulted
in an R2 of 0.76. Inadequate performance was obtained on freeze-dried meat. It was noted
that animals submitted to hormonal treatment had a lower water-holding capacity; thus, in
fresh ground meat it is easier to discriminate treated samples from the controls than in
freeze-dried meat.

Figure 1. Plot of the first 13 scores and differences in the first 4 scores in fresh ground meat.
Black line = treated, red line = control.

Figure 2. Plot of scores 2 and 3 in fresh ground meat.

Table 2. Use of multivariate approach; calibration (CAL) and validation (VAL) performance (R2) on fresh ground
or freeze-dried ground meat.

          Fresh ground meat       Freeze-dried and ground meat
          CAL        VAL          CAL        VAL

Treated   0.77       0.76         0.49       0.16


Conclusions

Near infrared spectral analysis was able, in most cases, to detect and correctly discriminate
between meat of boldenone-treated and untreated animals. These promising results must be
confirmed by future experiments to test the robustness of the method under different
rearing conditions.


Acknowledgements

The research was supported by Regione Veneto, Direzione Regionale per la Prevenzione,
Dorsoduro 3493, 30125 Venezia.


Analysis of Listeria monocytogenes, Salmonella typhimurium and enteritidis and
Staphylococcus aureus death rate in Grana Padano DOP cheese
P. Boni, P. Daminelli, E. Cosciani Cunico, P. Monastero, B. Bertasi, F. Rossi
and L. Bornati
Istituto Zooprofilattico Sperimentale della Lombardia e dell’Emilia Romagna “B. Ubertini”,
Via Bianchi 9, 25124 Brescia, Italy


During the past few years, cheese and cheese products have been demonstrated to contain
pathogenic micro-organisms and to have caused human illness. Some cheeses have been
linked to foodborne outbreaks and illnesses caused by Salmonella and Listeria monocytogenes,
and other cheeses were found to contain excessive levels of Staphylococcus aureus. The
highest contamination risk belongs to soft cheeses, and the use of raw milk for cheese making
is forbidden in many countries (FDA/CFSAN, 2003). For this reason international trade
needs to guarantee the food safety of products, such as Grana Padano DOP, made from raw milk.
Results of studies on the survival of L. monocytogenes and Salmonella spp. have shown that if
the pathogens are present in sufficient numbers in milk, the ready-to-consume cheese will
contain the pathogen (Ahmed and Marth, 1990). To perform the safety evaluation, instead of
traditional microbiological controls applied to the final food products, an alternative solution
based on simple concepts of predictive microbiology was used. This is an area of food
microbiology that allows us to store cumulative knowledge and data using advanced
computational tools (databases, expert systems) and to better understand the microbial
responses to some main controlling factors. Besides, using statistical and mathematical
techniques it is possible to summarise the results and predict microbial responses. By this
approach the effect of the Grana Padano DOP cheese process conditions on the fate of the
pathogens described above was validated. In particular, the curd heat treatment at around
53°C during manufacture may minimize such hazards in Grana Padano DOP cheese.

Material and methods

Bacterial cultures: Listeria monocytogenes (isolated at IZSLER from Gorgonzola cheese),
Staphylococcus aureus producing toxin B (isolated at IZSLER from an infected cow), S. enteritidis
(n°670 IZSLER) and S. typhimurium (ATCC 6994) were used in this study. The pathogens were
cultured on BHI (Brain Heart Infusion agar with 10% serum) slants and in Roux flasks at 37°C
for 24 h. An amount of each pathogen culture, sufficient to give ca. 10^7 cells/ml of cheese
milk, was diluted with physiological solution and added to the contents of a cheese vat.

Cheese manufacture, sampling and enumeration of pathogens: Grana Padano DOP cheese was
manufactured following the regulation imposed by Grana Padano DOP Consortium (n°1107/96


CE). Each vat was filled with semi-whole milk (ca. 1,000 L). One vat was used as control,
the second was inoculated with the L. monocytogenes culture, the third with L. monocytogenes,
S. enteritidis and S. typhimurium, and the fourth with L. monocytogenes and Staphylococcus
aureus. The contaminated milk, or dilutions thereof, was surface plated on specific media
immediately after sampling. Salmonella spp. was isolated using Hektoen Enteric Agar, Oxoid
CM0419, incubated for 24 h at 37°C; L. monocytogenes was isolated using Listeria selective
agar (Oxford), Oxoid CM0856, and Staphylococcus aureus was isolated using Baird Parker
Agar Base, Oxoid CM0275, both incubated at 37°C for 48 h. Pathogen cells were identified,
counted and confirmed (UNI CEI EN ISO/IEC 17025). Samples were collected before, during
and after curd cooking, more frequently during the cooking phase and at longer intervals from
one hour later. Curd or cheese samples were immediately diluted (1/3) and homogenized; the
dilutions were surface plated as described above. The initial temperature increase to 53°C
is enough to keep the temperature high for about one and a half hours. Sampling times
differed for each vat, in order to obtain as much information as possible about the behaviour
of L. monocytogenes over time in this type of cheese.

Mathematical model: ComBase, an Internet-based database of microbial responses to food or
media environments, was used to establish the experimental design (sampling time intervals).
The search criterion was to find out how the pathogens described above survive in a constant
environment (culture media) with pH, aw and temperature ranges similar to those of Grana
Padano DOP cheese during manufacture. The resulting data (not shown; log cfu vs. time;
50-55°C; pH 5) were analysed with the DMFit program, based on the model of Baranyi (Baranyi
and Roberts, 1994). This Excel add-in allows sigmoid, linear and two-phase-linear (bi-phasic)
functions to be fitted to data using regression analysis. DMFit was used to fit bacterial death
curves and to calculate the time for a 90% reduction in the number of surviving organisms
(D value) of the pathogens. The sampling times for Grana Padano DOP cheese during its
manufacture were selected on the basis of the D values obtained. The different heat treatment
at the beginning of and during curd cooking suggested fitting the data with a bi-phasic primary
model. The D values, death rates and relative standard errors (se) were calculated from the
slopes of the two linear phases that are joined at a breakpoint (rate = log(final concentration/
initial concentration)/time; D = -1/rate). The data used to calculate the death rates of
L. monocytogenes were obtained from three different contaminated vats.
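The rate and D-value definitions above can be sketched in code. This is a minimal illustration of the bi-phasic fit, assuming made-up survivor counts (log10 cfu vs. hours); the data below are not the authors' measurements.

```python
# Sketch of the bi-phasic death-curve calculation: a linear slope per phase
# (log10 units/h) and D = -1/rate. Survivor data are illustrative only.
import numpy as np

def linear_rate(times, log_counts):
    """Least-squares slope of log10 counts vs. time (log10 units/h)."""
    slope, _intercept = np.polyfit(times, log_counts, 1)
    return slope

def d_value(rate):
    """Time for a 90% (1 log10) reduction; rate must be negative."""
    return -1.0 / rate

# Hypothetical bi-phasic survivor data joined at a breakpoint of ~1.7 h
t1 = np.array([0.0, 0.5, 1.0, 1.5])   # cooking phase (h)
y1 = np.array([7.0, 6.0, 5.0, 4.0])   # log10 cfu/g
t2 = np.array([1.7, 3.0, 5.0])        # slow cooling phase (h)
y2 = np.array([3.6, 2.8, 1.6])        # log10 cfu/g

rate1, rate2 = linear_rate(t1, y1), linear_rate(t2, y2)
print(f"phase 1: rate={rate1:.2f} log/h, D={d_value(rate1)*60:.0f} min")
print(f"phase 2: rate={rate2:.2f} log/h, D={d_value(rate2)*60:.0f} min")
```

The same two slopes, joined at the breakpoint, are what DMFit estimates when fitting the bi-phasic primary model.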

Results and discussion

The results obtained show that the logarithm of the cell concentration decreases faster over
time in the cooking phase, in which the Grana Padano DOP curd remains at ca. 53°C for more
than one hour. Below 50°C, on the other hand, the surviving cell population dies off more
slowly. Table 1 reports, for each pathogen, the death rates (1 is the faster, 2 the slower),
the standard error of the rate and the time at which the rate changes (breakpoint).

In Figure 1 the observed data and the fitted curves can be seen. The death rates of
L. monocytogenes and Staphylococcus aureus during the Grana Padano DOP cheese process
are similar (not constant), while the death rate of Salmonella spp. is constant. This suggests


Table 1. Different death rates of L. monocytogenes, Staphylococcus aureus and Salmonella spp. during the Grana Padano DOP
cheese process. 1 is the faster rate at 53°C, 2 the slower rate after the linear breakpoint.

Pathogen               Linear phase  Rate   se    Breakpoint

L. monocytogenes       1             -2.16  0.16  1h 42'
                       2             -0.60  0.19
Staphylococcus aureus  1             -2.26  0.28  1h 43'
                       2             -0.70  0.23
Salmonella spp.        1             -5.01  0.60

Figure 1. Observed data (log cfu vs. time in hours) and fitted curves during the Grana Padano DOP
challenge test: 1 Listeria monocytogenes in three vats; 2 Staphylococcus aureus; 3 Salmonella spp.

fitting the data with linear and two-phase-linear (bi-phasic) functions, respectively. The
breakpoint, where the functions are joined, falls 1 h 42' after the beginning of cheese making.
The temperatures measured at sampling show that during that period the temperature is around
53°C, and then decreases slowly (data not shown). The curd (ca. 90 kg) can maintain such a high
temperature for such a long time because it remains at the bottom of the typical conical vat
under ca. 900 L of whey for one hour (according to the manufacturing method). The D value for
L. monocytogenes in the cooking phase is 28' 22'' and becomes 1 h 38' when the temperature
of the cheese decreases. The D value for Staphylococcus aureus is 26' 58'' in the first phase
and 1 h 26' in the second. The D value for Salmonella spp. is 12' 37''. The results obtained
show that the Grana Padano DOP cheese-making process produces an 8-D kill for
L. monocytogenes, Staphylococcus aureus and Salmonella spp. None of the pathogens was
detected any longer 6, 24 and 48 hours after the beginning of the cheese-making process.
99.99% of the L. monocytogenes and Staphylococcus aureus populations is eliminated in the
first 1 h 42', while a further 99.9% is eliminated during the following four hours. The
concentration of Salmonella spp. decreases by 8 orders of magnitude in the first hour and a
half. The experiment performed makes it possible to guarantee the safety level of Grana
Padano DOP as required for international trade (USDA/FSIS).
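As a rough cross-check, the rates and breakpoint reported in Table 1 for L. monocytogenes (-2.16 and -0.6 log10/h, breakpoint at 1 h 42') can be turned into a time to an 8-log reduction. This back-calculation is a sketch under the bi-phasic model, not a figure from the original study.

```python
# Rough back-calculation of the time needed for an 8-log10 reduction under
# the bi-phasic rates of Table 1 for L. monocytogenes.
def time_for_reduction(target_logs, rate1, rate2, breakpoint_h):
    """Hours needed for target_logs of reduction: phase 1 until the
    breakpoint, then the slower phase 2 for the remainder."""
    logs_phase1 = -rate1 * breakpoint_h   # reduction achieved in phase 1
    if logs_phase1 >= target_logs:
        return target_logs / -rate1
    return breakpoint_h + (target_logs - logs_phase1) / -rate2

t = time_for_reduction(8.0, -2.16, -0.6, 1.7)
print(f"~{t:.1f} h to an 8-log reduction")
```

With these rates the 8-D kill is reached in roughly nine hours, which fits the observation that the pathogens were no longer detected in the 6-48 h samplings.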



Acknowledgements

We thank the Department of Food Safety of IZSLER for providing technical assistance and
consultation. We also thank Dr. Jozsef Baranyi and his staff for their kind teaching and advice.
Our appreciation is also extended to Angelo Stroppa (Grana Padano DOP Consortium) for
supplies, service, equipment and consultation. This work was supported by the Grana Padano
DOP Consortium.


References

Ahmed, E.Y. and Marth, E.H., 1990. Fate of Listeria monocytogenes during the manufacture and ripening of Parmesan
cheese. J. Dairy Sci. 73, 3351-3356.
Baranyi, J. and Roberts, T.A., 1994. A dynamic approach to predicting bacterial growth in food. Int. J. Food
Microbiol. 23, 277-294.
FDA/CFSAN, 2003.


Contamination distributions and virulence factors of Listeria monocytogenes in raw meat of
avian, bovine and swine origin

P. Bonilauri1, G. Merialdi1, L. Casadei1, G. Liuzzo2, S. Bentley3 and M. Dottori1
1IZSLER, Sezione di Reggio Emilia, Italy
2AUSL Modena, Italy
3Dipartimento di Salute Animale, Sezione di Ispezione degli alimenti di Origine Animale,

Università di Parma, Italy


Exposure Assessment is the qualitative and/or quantitative evaluation of the likely intake
of a biological agent via food, as well as exposure from other sources if relevant, and is a
function of the quantity of food consumed and the level of contamination in that food
(USDA/FDA, 2003). Over the last fifteen years many studies have been published that
describe the likelihood of the presence of L. monocytogenes in different raw meats, but very
few studies report enumeration data expressing the level of contamination. Hazard
Characterization is the qualitative and/or quantitative evaluation of the nature of the
adverse health effects associated with the hazard (FAO/WHO, 1995; CAC, 1999). In other
words, Hazard Characterization is a measure of the negative effects of ingesting a dose of
micro-organisms and can be represented by a dose-response model. The dose-response model
is influenced by various host factors (age, immunity, pregnancy, etc.), by matrix factors (e.g.
fat content) and by the virulence of the ingested bacteria. It is now well known that strains
of differing virulence coexist within the species Listeria monocytogenes, and variation in
virulence is demonstrable among strains. This issue is of major importance in human
listeriosis Risk Assessment and can influence the number of organisms required to produce
illness and possibly the severity or manifestation of illness (USDA/FDA, 2003).
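The definition above, exposure as a function of the quantity consumed and the level of contamination, can be sketched as a small Monte Carlo calculation. Every distribution parameter below is an illustrative assumption, not survey data.

```python
# Toy exposure-assessment sketch: dose = concentration (cfu/g) x serving (g),
# with an assumed prevalence of contaminated servings. All parameters are
# hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

prevalence = 0.20                                   # assumed fraction of contaminated servings
serving_g = rng.normal(100, 20, n).clip(10)         # serving size in grams, assumed
log10_conc = rng.normal(-4.3, 3.5, n).clip(max=1)   # log10 cfu/g, assumed

contaminated = rng.random(n) < prevalence
dose = np.where(contaminated, (10.0 ** log10_conc) * serving_g, 0.0)  # cfu ingested

print(f"servings with any exposure: {contaminated.mean():.1%}")
print(f"median dose among contaminated servings: {np.median(dose[contaminated]):.2g} cfu")
```

A dose distribution of this kind is the input that a dose-response model would then translate into a probability of illness.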

The aim of this study is to interpret the L. monocytogenes contamination distribution in raw
meat of avian, bovine and swine origin and to give an example of virulence index determination
for strains isolated from samples of the above-mentioned animal species.

Material and methods

A total of 523 25-g raw meat samples (168 bovine, 176 avian, 179 swine) were subjected to
qualitative detection of L. monocytogenes by an enrichment technique. In positive samples the
contamination was quantified using the EN ISO 11290-2 method. All strains were serotyped using
Denka Seiken (Tokyo, Japan) antisera. Forty-eight strains representative of all isolated serotypes
were submitted to genomic virulence evaluation, as described by Wiedmann et al. (1997). Using


presence/absence data, three prevalence Beta distributions were obtained for bovine, avian
and swine raw meat. To describe the level of contamination in each species, three contamination
Lognormal distributions were modelled using the following approach: the enumeration data
(log10) of each species were fitted to three Normal distributions, whose standard deviations
were used as the standard deviations of the respective Lognormal curves. The mean of each
contamination Lognormal curve was obtained by "sliding" the mean until the percentage
of positive samples (>0.04 CFU/g) corresponded to the presence/absence data (USDA/
FDA, September 2003). For each lineage (Wiedmann et al., 1997) a virulence factor (f) was
arbitrarily fixed: for lineage I, f was fixed equal to 1; for lineage II, f was fixed equal to 0.1;
and for lineage III, f was fixed equal to 0.
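The "sliding" step can be sketched as follows. Rather than sliding iteratively, the sketch solves the same condition in closed form: the area of the log10 Normal above the detection limit (0.04 CFU/g) must equal the observed prevalence. It uses the avian standard deviation from Table 3 and 34 positives out of 176 samples.

```python
# Closed-form version of the "sliding mean" step: choose the mean of the
# log10 contamination Normal so that P(contamination > 0.04 cfu/g) matches
# the observed prevalence of positives.
import math
from scipy.stats import norm

def slide_mean(sd, prevalence, detection_limit=0.04):
    """Mean of log10 contamination such that the tail area above the
    detection limit equals the observed prevalence."""
    limit = math.log10(detection_limit)
    return limit - sd * norm.ppf(1.0 - prevalence)

mu = slide_mean(3.4543, 34 / 176)
print(f"slid mean = {mu:.3f} log10 cfu/g")  # compare the -4.35 reported in Table 3
```

The small difference from the tabulated mean is expected, since the published value was obtained by sliding against the Beta prevalence distribution rather than the raw positive fraction.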

For each detected serotype, a virulence value (v) was calculated as the sum of the fractions
of strains classified into the three lineages, each multiplied by the virulence factor (f) of
its lineage:

v(serotype) = (LI/n(serotype)) * fI + (LII/n(serotype)) * fII + (LIII/n(serotype)) * fIII

where LI is the number of strains assigned to lineage I, LII is the number of strains assigned
to lineage II, LIII is the number of strains assigned to lineage III and n(serotype) is the number
of strains tested for that serotype.

A virulence index V(species) for raw meat of bovine, swine and avian origin is proposed by
combining the serotype data (frq(serotype) = frequency of each serotype in each matrix) with
the virulence values v(serotype):

V(species) = Σ v(serotype) * frq(serotype)
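The two formulas can be checked against Tables 4 and 5. The integer lineage counts below are back-inferred from the rounded fractions in Table 5 (n = 12 strains per serotype); with them the sketch reproduces the virulence values quoted in the text and the avian index V = 0.135.

```python
# Virulence value per serotype and virulence index for avian raw meat,
# computed from Tables 4-5. Lineage counts (I, II, III) out of n = 12 are
# inferred from the rounded fractions published in Table 5.
F = {"I": 1.0, "II": 0.1, "III": 0.0}   # arbitrary lineage virulence factors

lineage_counts = {
    "1/2a": (0, 8, 4),
    "1/2b": (5, 0, 7),
    "1/2c": (1, 9, 2),
    "4b":   (9, 0, 3),
}
# serotype frequencies for avian raw meat (Table 4)
freq_avian = {"1/2a": 0.706, "1/2b": 0.029, "1/2c": 0.207, "4b": 0.058}

def v(serotype):
    """v(serotype): lineage fractions weighted by the virulence factors."""
    LI, LII, LIII = lineage_counts[serotype]
    n = LI + LII + LIII
    return (LI / n) * F["I"] + (LII / n) * F["II"] + (LIII / n) * F["III"]

V_avian = sum(v(s) * freq_avian[s] for s in freq_avian)
print({s: round(v(s), 3) for s in freq_avian})
print(f"V(avian) = {V_avian:.3f}")
```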


Table 1. Number of positive samples and the enumeration (CFU/g) of contamination for avian, bovine and swine raw meat.

Species  N    Positive  0.04-10 CFU/g  10-100 CFU/g  100-1,000 CFU/g

Avian    176  34        29             5             0
Bovine   168  14        14             0             0
Swine    179  30        30             0             0


Table 2. Prevalence Beta distributions of L. monocytogenes in raw meat of avian, bovine and swine origin.

Species  Distribution   Median  5%    95%

Avian    Beta(35; 143)  0.20    0.15  0.25
Bovine   Beta(15; 155)  0.09    0.06  0.13
Swine    Beta(31; 150)  0.17    0.13  0.22
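The distributions in Table 2 appear consistent with the standard Beta(s+1, n-s+1) construction from s positives out of n samples in Table 1; a quick numerical check:

```python
# Reproduce the prevalence Beta distributions of Table 2 from the
# positive counts of Table 1 via Beta(s+1, n-s+1).
from scipy.stats import beta

samples = {"avian": (34, 176), "bovine": (14, 168), "swine": (30, 179)}
for species, (s, n) in samples.items():
    dist = beta(s + 1, n - s + 1)
    med, lo, hi = dist.median(), dist.ppf(0.05), dist.ppf(0.95)
    print(f"{species}: Beta({s + 1}; {n - s + 1}) "
          f"median={med:.2f} 5%={lo:.2f} 95%={hi:.2f}")
```

For the avian meat this also matches the 5% and 95% markers shown in Figure 1 (0.149 and 0.247).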

Table 3. Contamination Lognormal distributions and their principal parameters (mean and standard deviation of
log10 CFU/g; distributions truncated at max).

Species  Distribution  Mean     sd      5%      95%    Max

Avian    Lognormal     -4.3482  3.4543  -10.03  1.33   10
Bovine   Lognormal     -5.1000  2.9463  -9.95   -0.25  10
Swine    Lognormal     -4.4900  3.2049  -9.76   0.78   10

Figure 1. Prevalence Beta curve for avian raw meat.



Figure 2. Contamination Lognormal curve for avian raw meat.

Table 4. Frequencies of serotypes per species.

Species N 1/2a 1/2b 1/2c 4b other total

Avian 34 0.706 0.029 0.207 0.058 0 1

Bovine 14 0.643 0.143 0.143 0.071 0 1
Swine 30 0.600 0.133 0.167 0.100 0 1

Table 5. Fraction of strains classified into the three virulence lineages per serotype.

Serotype  N   Lineage I  Lineage II  Lineage III  Total

1/2a      12  0          0.667       0.333        1
1/2b      12  0.417      0           0.583        1
1/2c      12  0.08       0.75        0.17         1
4b        12  0.75       0           0.25         1

The virulence value (v) for each serotype, calculated as described above, is 0.067 for serotype 1/2a, 0.417 for
serotype 1/2b, 0.158 for serotype 1/2c and 0.750 for serotype 4b. Finally, the virulence index (V) resulted 0.135
(0.06-0.24) for avian raw meat, 0.178 (0.04-0.37) for bovine raw meat and, for swine raw meat, 0.197 (0.08-


The results obtained are to be considered an attempt by the authors to reorganize data
obtained in a recent epidemiological study on L. monocytogenes contamination of raw meat
in a form usable for Risk Assessment. Raw meat is not usually associated with human
listeriosis, but it is the basis of many ready-to-eat preparations, and for this reason the prevalence and

the contamination curves presented may be useful for an Italian "from farm to fork"
L. monocytogenes risk assessment study. The virulence indexes for avian, bovine and swine
strains are presented as an attempt to approach an important item such as Hazard Characterization.


Acknowledgements

Part of the results were obtained within Progetto di Ricerca Corrente IZS LER 08/2000, Ministero
della Salute.


References

Codex Alimentarius Commission, 1999. Principles and guidelines for the conduct of a microbiological risk assessment.
FAO, Rome. CAC/GL-30.
FAO/WHO, 1995. Application of risk analysis to food standards issues. Report of the Joint FAO/WHO Expert
Consultation. WHO, Geneva. WHO/FNU/FOS/95.3.
USDA/FDA, 2003. Quantitative Assessment of the Relative Risk to Public Health from Foodborne Listeria monocytogenes
Among Selected Categories of Ready-To-Eat Foods, September 2003.
Wiedmann, M., Bruce, J.L., Keating, C., Johnson, A.E., McDonough, P.L. and Batt, C.A., 1997. Ribotypes and virulence
gene polymorphisms suggest three distinct Listeria monocytogenes lineages with differences in pathogenic
potential. Infect. Immun. 65, 2707-2716.


Monitoring programs on aquaculture products

T. Bossu1, E. Ingle1, S. Saccares1, R.N. Brizioli1 and S. Cataudella2
1Istituto Zooprofilattico Sperimentale delle Regioni Lazio e Toscana, Rome, Italy
2Dipartimento di Biologia, Universita di Roma “Tor Vergata”, Rome, Italy

Fishery products from aquaculture have in recent years acquired a more prominent position
for several reasons, which can be summarised as follows:

• Relevance of the problems linked to the state of living aquatic resources, with increasing
attention being paid to the entire ecosystem rather than concentrating only on those species
which are commercially most interesting.
• Decline of fishery resources caused by the overexploitation of most fishing grounds
in the world and the emerging need for new management models and international and
local regulation systems, resulting in fish farming replacing the harvesting of wild animals
as a food source.
• Rapid growth of aquaculture activities, especially in Asia and, for some species, in
Europe and in the U.S.
• Growing attention, in fishery and aquaculture production as well as in other farming
activities, to the problems related to food safety, hygiene and quality within a global
market characterized by high levels of risk and competition and different needs for each
geographical area. Parasitic infections, foodborne diseases associated with pathogens
and chemical residues can be identified as hazards.

In comparison with other types of agrifood production, fishery production has specific
features, high complexity and historical delays. It requires the establishment of a suitable
database on which to build an appropriate system of regulation and control.

In this context it is clear that steps are needed to improve the level of knowledge of
producers and of the authorities competent for official controls. A working group of the Istituto
Zooprofilattico Sperimentale delle Regioni Lazio e Toscana identified useful elements of
monitoring programs for aquaculture production. Such programs should be in agreement with
national and European laws and must take into account the specificity of the production and
of the regional market for which the Istituto is responsible.

Risk analysis is impossible when epidemiological data are lacking, and such data can only
be provided in a well-known environment. An integrated approach is needed to realise a
system for monitoring animal and foodborne diseases, drug administration and environmental
conditions related to fish farming, and to collect such data.

The following is necessary:

• Constitution of a production chain model in aquaculture complying with the regulations

in force, with a precise identification of critical points to be addressed to Veterinary


• Identification of field of competence (health, agriculture, environment, etc.).

• Identification of the weak points (knowledge gaps, absence of objective tools, frame
of the chain) in order to realize a correct policy for prevention and, where appropriate,
• The conception of a management and operative model to set up a local network of services
to ensure the collection of field data for a proper risk analysis, with the aim of developing
safe aquaculture production and processing procedures.

More specifically, safety problems must be related to quality, tradition and the
local procedures, which differ for each geographical area concerned.

Risk analysis in the aquaculture industry therefore depends on the identification of suitable
monitoring models, based on significant and realistic interdisciplinary


The slaughterhouse as an epidemiological observatory for the surveillance of caseous
lymphadenitis in sheep

R. Branciari1, R. Mammoli1, D. Ranucci1, D. Miraglia1, D. Gorziglia2, F. Feliziani3 and P. Avellini1
1Dipartimento di Scienze degli Alimenti, Sezione Sicurezza e Qualità degli Alimenti di Origine
Animale, Università degli Studi di Perugia, Via S. Costanzo, 06126 Perugia, Italy
2Veterinary practitioner, Italy
3Istituto Zooprofilattico Sperimentale dell’Umbria e delle Marche, Via Salvemini, 06126

Perugia, Italy


Corynebacterium pseudotuberculosis is the causative agent of caseous lymphadenitis in
goats and sheep, a chronic disease characterized by suppurative, necrotizing inflammation
usually localized in the lymph nodes (Brown and Olander, 1987). The disease can be found
in most parts of the world (Williamson, 2001), especially in areas where intensive husbandry
of small ruminants is practised (Literák et al., 1999). This pathology is important both for
the economic losses it causes (Rizvi et al., 1997) and for its possible transmission to man
(Goldberger et al., 1981). It is therefore important to estimate the presence of this disease,
as well as the localization of the lesions, in order to have useful data in case eradication
plans should be implemented.

Materials and methods

During one year of surveillance of two EC sheep slaughterhouses situated in central Italy,
a total of 8,032 adult animals were examined. An accurate post-mortem examination was
performed in order to detect and record lesions possibly due to C. pseudotuberculosis in
lymph nodes, organs (lungs, liver, kidneys, etc.) and subcutaneous tissues. The post-mortem
examination data were grouped according to the country of origin of the animals.

The information collected was used for statistical analysis (chi-square test) in order to
detect possible significant differences in the observed prevalence values. The relative risk
(RR) was used as a measure of comparison among the countries considered. The chi-square
test was also employed to investigate the prevalence of lesions localized in the various organs.
Statistical calculations were performed using the Epi Info software distributed by the Centers
for Disease Control and Prevention, Atlanta, USA.
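The relative-risk comparison can be sketched with the Spain and Italy counts later reported in Table 1 (351 of 5,497 and 71 of 1,616 positive animals); here scipy's chi-square test (with Yates' continuity correction by default) stands in for the Epi Info calculation.

```python
# Relative risk and chi-square test for caseous lymphadenitis prevalence,
# Spain vs. Italy, using the counts from Table 1.
from scipy.stats import chi2_contingency

pos_es, n_es = 351, 5497   # Spain: positives, animals observed
pos_it, n_it = 71, 1616    # Italy: positives, animals observed

rr = (pos_es / n_es) / (pos_it / n_it)   # relative risk

table = [[pos_es, n_es - pos_es],
         [pos_it, n_it - pos_it]]
chi2, p, _dof, _expected = chi2_contingency(table)
print(f"RR = {rr:.2f}, chi2 = {chi2:.1f}, p = {p:.4f}")
```

The result reproduces the roughly 1.5-fold higher risk in Spanish animals quoted in the text, with the difference statistically significant.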


Results and discussion

The overall prevalence of lesions related to caseous lymphadenitis was estimated at 5.5%,
with a confidence interval (CI) of 5.0-6.0 (95% confidence limits, CL). As reported in
Table 1, when the animals were divided into groups on the basis of their country of origin, a
difference in prevalence values was found.

Lesions due to caseous lymphadenitis were absent or extremely rare in animals imported
from Romania or Hungary, while they were found in animals from Spain, France and Italy.
Among these three countries the prevalence values obtained differ significantly; in
particular, the RR of finding lesions in Spanish animals is 1.8 times higher than in the other
two countries taken as a whole and 1.5 times higher than in Italian animals. This finding is not
surprising, as Spain has already been shown to have problems with caseous lymphadenitis
(Girones et al., 1992; Severini et al., 2003).

As seen in Table 2, most of the lesions were found in the thoracic region, even if some animals
had extra-thoracic lesions in association with them (3 in the liver and 4 in the subcutaneous
tissues). Only in two cases were lesions seen exclusively in the liver or in the subcutaneous
tissues. The predominant presence of lesions in the pulmonary region (organs and regional
lymph nodes) is also well documented by other authors (Gilmour, 1990; Ziino et al., 1999;
Pekelder, 2000). It may therefore be considered sufficient to inspect only this region when
screening for caseous lymphadenitis, since the probability of false negatives with this
practice is only about 0.5%.

Table 1. Prevalence of caseous lymphadenitis in sheep in relation to the country of origin.

Country Number of animals observed Number of positive animals Prevalence (%) CI (95% CL)

Spain 5,497 351 6.4 5.8-7.0

France 306 17 5.6 3.4-8.9
Italy 1,616 71 4.4 3.5-5.5
Hungary 476 1 0.2 0.0-1.3
Romania 137 0 <2.16* -

*maximum estimable prevalence value

Table 2. Location of the lesions in different districts.

Location Number of positive animals Prevalence (%) CI (95% CL)

Thorax 438 99.5 98.2-99.9

Multiple locations (including thorax) 7 1.6 0.7-3.4
Other than thorax 2 0.5 0.0-1.8



Conclusions

Considering these elements, the importance of the slaughterhouse as an epidemiological
observatory clearly emerges. The information collected at this level, mainly by accurately
inspecting the thoracic region, can be used to spot the disease. For this information to be more
useful, a proper system of identification of the farm of origin of individual animals would be
necessary, as this could then be a means of implementing a control system at farm level, which
would be useful for eradicating caseous lymphadenitis.

Due to the possibility of direct transmission to man, mainly causing suppurative granulomatous
lymphadenitis, slaughterhouse operators should be advised to pay greater attention when
handling animals whose origin can be considered more at risk.


References

Brown, C.C. and Olander, H.J., 1987. Caseous lymphadenitis of goats and sheep, a review. Vet. Bull. 57, 1-12.
Gilmour, N.J.L., 1990. Caseous lymphadenitis: a cause for concern. Vet. Rec. 126, 566.
Girones, O., Simon, M.C. and Alonso, J.L., 1992. Linfadenitis caseosa. I. Importancia economico-sanitaria. Etiologia,
epidemiologia y patogenia. Medicina Veterinaria 9, 135-148.
Goldberger, A.C., Lipsky, B.A. and Plorde, J.J., 1981. Suppurative granulomatous lymphadenitis caused by
Corinebacterium ovis (pseudotuberculosis). Am. J. Clin. Pathol. 76, 486-490.
Literák, I., Horváthova, A., Jahnova, M., Rychlik, I. and Skalka, B., 1999. Phenotype and genotype characteristics
of the Slovak and Czech Corynebacterium pseudotuberculosis strains isolated from sheep and goats. Small Rum.
Res. 32, 107-111.
Pekelder, J.J., 2000. Caseous lymphadenitis. In: Martin (Ed.), Diseases of sheep, Blackwell Science, Oxford, UK,
p. 270-274.
Rizvi, S., Green, L.E. and Glover, M.J., 1997. Caseous lymphadenitis: an increasing cause for concern. Vet. Rec.
140, 586-587.
Severini, M., Ranucci, D., Miraglia, D. and Cenci Goga, B.T., 2003. Pseudotuberculosis in sheep as a concern of
veterinary public health. Vet. Res. Comm. 27, suppl. 1, 315-318.
Williamson, L.H., 2001. Caseous lymphadenitis in small ruminants. Veterinary Clinics of North America: Food Animal
Practice 17, 359-371.
Ziino, G., Giuffrida, A., Ferrara, M.C. and Panebianco, A., 1999. Indagine sull’incidenza della pseudotubercolosi
ovina in due macelli della provincia di Messina. Problematiche ispettive alla luce del D.L.vo 286/94. Atti IX
Convegno Nazionale AIVI, Colle Val d’Elsa (SI), 5-7 November 1999, p. 279-284.


Risk based surveillance of milk and dairy products

F. Brülisauer1, T. Berger2, B. Klein3 and J. Danuser1
1Federal Veterinary Office, Berne, Switzerland
2Agroscope Liebefeld-Posieux, Berne, Switzerland
3Cantonal Laboratory, Epalinges, Switzerland


Microbiological safety of milk and dairy products is a major public health issue.
Microbiological contamination may occur through direct excretion of micro-organisms from the
udder, faecal contamination during milking or inadequate hygiene measures during further
processing steps. In order to export milk and dairy products to the EU, Switzerland has to
meet the requirements of EU Directive 92/46/EEC. To ensure implementation of that directive,
the Federal Veterinary Office (FVO) carried out a first national monitoring programme in co-
operation with the Association of Swiss Cantonal Chemists in 2003/04.

Material and Methods

A risk assessment on the public health impact of milk and dairy products was conducted by
the FVO in 2001. Based on this assessment and on the input of the cantonal food safety
authorities, a risk-based sampling plan for milk and dairy products was established. Pasteurised
milk and 16 dairy products were sampled over one year. The samples were collected from
industrial plants, dairies and mountain dairies by the cantonal authorities. In small cantons
all dairies were sampled; in large cantons at least 50 percent of all dairies were randomly
chosen and sampled. Samples were transported to the cantonal laboratories and processed,
depending on the product, either immediately or around their expiration date. Aerobic plate
counts and Enterobacteriaceae counts were performed; furthermore, the presence of coagulase-
positive Staphylococcus, Listeria monocytogenes, Salmonella spp., E. coli and yeasts was
determined. The summaries of the cantonal laboratory results were checked for consistency
and analysed with descriptive statistics at the FVO. Results were reported as the prevalence
of samples overstepping threshold values.
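The confidence intervals reported with each prevalence in Table 1 are consistent with exact (Clopper-Pearson) binomial intervals. A sketch for the pasteurised milk aerobic plate count row, where 40 positives out of 483 samples is an assumed count that reproduces the reported 8.3%:

```python
# Exact (Clopper-Pearson) 95% confidence interval for a binomial
# prevalence, via the Beta-quantile formulation.
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact two-sided (1 - alpha) CI for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

k, n = 40, 483   # assumed count reproducing the reported 8.3%
lo, hi = clopper_pearson(k, n)
print(f"{k / n:.1%} (95% CI {lo:.1%} - {hi:.1%})")
```

The interval obtained matches the 6 - 11.1% range given in the table for that row.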

Results and Discussion

A total of 10,024 samples of milk and dairy products were examined. The results showed a low
prevalence of pathogens and a high microbial quality standard of milk and dairy products (Table
1). However, room for quality improvement was revealed. In milk and cream no pathogenic
micro-organisms were found, but more emphasis needs to be placed on production
hygiene. Butter produced in mountain dairies was frequently found to be contaminated with E.
coli. This could be avoided if the statutory provisions were followed more strictly by producers;
their compliance therefore needs to be checked by official inspections on a regular basis.
As expected, hard and semi-hard cheese made of cow milk was seldom subject to threshold


Table 1. Comparison of the assessed (risk assessment 2001) versus observed (survey 2002/03) prevalence of overstepping
threshold values in milk and dairy products (N = 10,024).

Product  Parameter  Threshold  Overstepping assessed*  Overstepping observed  CI 95%  N

Pasteurised Milk aerobic plate count e n.a. 8.3% 6 - 11.1% 483

Enterobacteriaceae a n.a. 4.4% 2.7 - 6.6% 478
Pasteurised Cream aerobic plate count e n.a. 16.1% 12.5 - 20.2% 373
E. coli a 1 - 2.5% 1.7% 0.6 - 3.9% 295
Enterobacteriaceae a n.a. 10.5% 7.6 - 14.1% 370
coagul. pos. Staph. d < 1% 0.3% 0 - 1.8% 307
Salmonella spp g < 1% 0.0% 0 - 1.2% 311
Butter E. coli a < 1% 9.0% 6.2 - 12.6% 332
Hard Cow Cheese E. coli a < 1% 2.5% 1.5 - 3.9% 750
coagul. pos. Staph. b < 1% 1.2% 0.6 - 2.3% 749
coagul. pos. Staph. d < 1% 0.3% 0 - 1.0% 749
Semihard Cow Cheese E. coli c 1 - 2.5% 5.4% 3.9 - 7.3% 774
coagul. pos. Staph. c 1 - 2.5% 5.2% 3.7 - 7.0% 773
coagul. pos. Staph. d 1 - 2.5% 1.3% 0.6 - 2.4% 773
L. monocytogenes g 1 - 2.5% 0.3% 0 - 0.9% 758
Semihard Goat Cheese E. coli c 2.5 - 5% 4.7% 1.3 - 11.5% 86
coagul. pos. Staph. c 2.5 - 5% 7.0% 2.6 - 14.6% 86
coagul. pos. Staph. d 2.5 - 5% 2.3% 0.3 - 8.1% 86
L. monocytogenes g 2.5 - 5% 0.0% 0 - 4.2% 86
Soft Cow Cheese E. coli c 2.5 - 5% 14.1% 9.3 - 20.3% 170
coagul. pos. Staph. c 2.5 - 5% 9.9% 5.9 - 15.4% 172
coagul. pos. Staph. d 2.5 - 5% 6.4% 3.2 - 11.2% 172
L. monocytogenes g 2.5 - 5% 1.9% 0.4 - 5.3% 161
Salmonella spp g 2.5 - 5% 0.6% 0 - 3.5% 157
Fresh Cow Cheese Enterobacteriaceae c n.a. 10.1% 6.3 - 15.2% 198
coagul. pos. Staph. b 2.5 - 5% 3.7% 1.6 - 7.1% 219
coagul. pos. Staph. d 2.5 - 5% 0.0% 0 - 1.7% 219
L. monocytogenes g 2.5 - 5% 0.0% 0 - 1.7% 211
Fresh Goat Cheese Enterobacteriaceae c > 5% 18.2% 9.1 - 30.9% 55
coagul. pos. Staph. b > 5% 22.8% 12.7 - 35.8% 57
coagul. pos. Staph. d > 5% 5.3% 1.1 - 14.6% 57
L. monocytogenes g > 5% 0.0% 0 - 6.7% 53
Yogurt Enterobacteriaceae a n.a. 2.3% 1.3 - 3.8% 686
Yeasts c n.a. 11.2% 9 - 13.8% 685
Specialities aerobic plate count f n.a. 4.4% 0.5 - 15.1% 45
Enterobacteriaceae b n.a. 1.5% 0 - 8.2% 66
coagul. pos. Staph. b 1 - 2.5% 0.0% 0 - 9.7% 36
Salmonella spp g 1 - 2.5% 0.0% 0 - 8.4% 42

*: result of a risk assessment; CI 95%: 95% confidence interval, survey 2002/2003; n.a.: previously not assessed; a: 10
colony forming units (CFU); b: 100 CFU; c: 1,000 CFU; d: 10,000 CFU; e: 100,000 CFU; f: 1 million CFU; g: absent in 25 g


overstepping. In soft cheese the legal boundary level for coagulase-positive Staphylococcus
was overstepped in 6 percent of all samples. Listeria monocytogenes and Salmonella spp. were
sporadically found in this product. This finding might be explained by the absence of
pasteurisation and the short maturing period. Hygiene indicators such as E. coli and
Enterobacteriaceae were found in soft cheese (14.1% of the samples were contaminated with
E. coli), in fresh cheese (10.1% of the samples were contaminated with Enterobacteriaceae)
and especially in fresh cheese made of goat milk (18.2% of the samples were contaminated
with Enterobacteriaceae). The improvement of soft and fresh cheese therefore needs to be
given top priority. Only a few products made of goat and sheep milk could be sampled, so
little can be said about their quality status.

The prevalence of most micro-organisms found was in accordance with the assumptions of
the risk assessment. However, the results of the first national survey helped to further adjust
the risk based sampling plan.


Conclusions

The risk-based surveillance of milk and dairy products had a twofold use. First, it specified the
bacteriological quality of Swiss dairy products, especially for those products and production
methods which are prone to contamination. Secondly, its results represented a helpful and
practical tool to further increase the level of hygiene practice in the dairy industry. The results
of the national survey were discussed with various dairy experts, and recommendations for the
enhancement of production were formulated and distributed to interested parties, such as the
cantonal food inspectors and dairy consulting services.

To increase the explanatory power of the monitoring, more detailed information on the analysed
products is needed. In the future, individual results with supplementary specifications of the
manufacturing process or factory will be collected. In addition, special emphasis must be placed
on standardised sampling and analysis due to the high number of partners involved.


Are food recalls risk based food safety tools?

L. Bucchini and L. Caricchio
Hylobates Consulting srl, Rome, Italy,


Foods unfit for safe consumption should not be on the market. Food batches that prove
defective can be recalled voluntarily by manufacturing companies or seized by authorities. The
common goal is the prevention of foodborne disease, which may be caused by contaminating
hazards such as micro-organisms, chemicals or other agents. The global size of the food
distribution network, among other factors, makes food recalls complex and expensive. An
important question is whether food recalls and preventive customs import refusals target
actual risks. If they do not, the consequences include damage to individual companies or whole
countries and inadequate focusing of food safety investment. Relatively few data on food
recalls are made publicly available in the European Union (EU), but some data have recently
become accessible thanks to the European Commission (EC). We collected food recall
and import refusal data from the United States (US) and Europe for the period 12 May to
31 December 2003. Our aim was to compare data on the foods targeted and hazards involved
with available data on disease incidence and prevalence of pathogens in food, in order to gain
insights into the relationship between food recalls and actual risks.


Material and methods

The data consisted of four datasets. The first data set, named “internal FDA sample”, included
all food recalls published by the US Food and Drug Administration (FDA) on the 2003
Enforcement Reports. International trade data for the US were collected from the OASIS
database, maintained by the FDA (“international FDA sample”). Both datasets include limited
data on meat, poultry and eggs, as such foods are generally not subject to FDA’s authority.
Weekly notifications were collected from the EC rapid alert system for food and feed (RASFF),
and constituted two datasets (Alerts and Information). Information notifications, as opposed
to Alerts, relate to products that have not entered the European market or that have been
completely withdrawn when the notification is published. Attempts to complete the European
data sets with the January to April weekly notifications were unsuccessful as access to the
data is restricted (EC, personal communication). Data were transferred to spreadsheets and
coded. Analysis was performed using MS Excel.
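The coding and tallying step just described can be sketched in a few lines; the hazard-class labels and the mini-dataset below are hypothetical placeholders illustrating the structure of such records, not the authors' actual coding scheme (which was implemented in MS Excel).

```python
# Hedged sketch: tallying coded recall/notification records by hazard class.
# Records and category labels are hypothetical, for illustration only.
from collections import Counter

def tally(records):
    """Count records per hazard class and report percentages of the total."""
    counts = Counter(r["hazard_class"] for r in records)
    total = len(records)
    return {cls: (n, round(100.0 * n / total, 1)) for cls, n in counts.items()}

# Hypothetical mini-dataset mimicking the structure of FDA/RASFF entries
records = [
    {"id": 1, "hazard_class": "microbiological", "agent": "Listeria monocytogenes"},
    {"id": 2, "hazard_class": "microbiological", "agent": "Salmonella spp."},
    {"id": 3, "hazard_class": "chemical", "agent": "aflatoxin"},
    {"id": 4, "hazard_class": "other", "agent": "undeclared allergen"},
]

print(tally(records))
# → {'microbiological': (2, 50.0), 'chemical': (1, 25.0), 'other': (1, 25.0)}
```

Applied to the full datasets, such a tally yields counts and percentages of the kind reported in Table 1.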

Results and discussion

The internal FDA sample included 130 entries: 10.8% were recalls due to microbiological
hazards and 6.2% to chemical hazards (Table 1), with the rest due to labelling, documentation
or allergen concerns. Among the microbiological food recalls in this dataset, 9 were due to the
presence of Listeria monocytogenes (LM) and 4 to the presence of Salmonella spp. (Table 2).


In this dataset, LM-caused recalls (4/9 due to smoked salmon) are twice as frequent as the
Salmonella-related ones (3/4 related to spices), even though the incidence rate of listeriosis in
the US is about 40 times lower than that of salmonellosis. The greater severity of listeriosis
probably underlies the strict US LM-control policy evidenced by the data.
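The size of this mismatch can be made explicit with a back-of-the-envelope calculation; the sketch below combines the recall counts from Table 2 with the US incidence figures, and is our illustration rather than an analysis from the paper.

```python
# Rough over-representation of LM among internal FDA recalls relative to disease burden.
lm_recalls, salm_recalls = 9, 4      # internal FDA sample (Table 2)
lm_rate, salm_rate = 3.3, 145.0      # US incidence, cases per million (FoodNet 2003)

recall_ratio = lm_recalls / salm_recalls       # LM recalls per Salmonella recall
incidence_ratio = lm_rate / salm_rate          # LM cases per Salmonella case
over_representation = recall_ratio / incidence_ratio

print(round(salm_rate / lm_rate))    # → 44: listeriosis is roughly 40x rarer
print(round(over_representation))    # → 99: per reported case, LM is ~100x more recall-prone
```

In other words, per reported case of disease, an LM recall was on the order of a hundred times more likely than a Salmonella recall in this sample.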

In the FDA international sample, 11.3% of the import refusals were due to a bacterial agent
(mostly Salmonella and LM) and 25.8% to a chemical hazard (Table 1). Most chemical refusals
were due to unsafe levels of a colour (919 entries), followed by pesticide residues (622) and
unsafe additive content (177).

In the European Alert sample, 28.5% of notifications were related to microbiological hazards
and 62.4% to chemical hazards (Table 1). Among the 221 chemical alerts, 118 were due to
various contaminants (90 Sudan colours, 10 heavy metals), 42 to residues of veterinary
drugs (primarily nitrofurans and chloramphenicol) and 29 to mycotoxins (fumonisins, aflatoxins).
In contrast, in the European Information sample, member countries reported 1,113 resolved
problems: 15.0% were related to microbiological hazards and 72.2% to chemical hazards
(Table 1). Of the 804 chemical notifications, 457 were related to mycotoxins; of these, 440
were due to aflatoxins, mostly in pistachios from Iran.

Table 1. Characterization of datasets; other causes of recalls not included.

Data set Entries Microbiological hazards Chemical hazards

Internal FDA 130 14 (10.8%) 8 (6.2%)

International FDA 8,187 930 (11.3%) 2,115 (25.8%)
EU Alert 354 101 (28.5%) 221 (62.4%)
EU Information 1,113 167 (15.0%) 804 (72.2%)

Table 2. Recalls caused by selected foodborne pathogens compared to the corresponding US 2003 incidence rates
(Foodnet, 2003) and reported cases, where available, in selected EU member states for 2002 (EC, 2003).

                        Internal FDA  International FDA  European Alert  European     US incidence        Reported
                                                                         Information  (cases per million)  cases in EU

Salmonella spp.         4 (28.6%)     604 (64.9%)        60 (59.4%)      78 (47.3%)   145                  145,231
Listeria monocytogenes  9 (64.3%)     261 (28.1%)        28 (27.7%)      19 (11.4%)   3.3                  -
Campylobacter spp.      0             0                  0               0            126                  149,287
E. coli O157            0             0                  0               1 (0.59%)    11                   -
Yersinia spp.           0             0                  0               0            40                   10,147
Shigella spp.           0             0                  0               0            73                   -


The EU and the FDA international datasets show a larger proportion of chemical-hazard
notifications relative to recalls caused by micro-organisms. However, although chemicals do
pose risks to health, biological agents are held responsible for the majority of foodborne illness.
In the European Information sample, Vibrio parahaemolyticus accounted for 21 notifications
(almost as many as LM) and Salmonella recalls were 4.1 times more frequent than those for
LM, suggesting that control of listeriosis is not as high a priority in Europe as in the US
(although disease rates are presumably similar).

Table 2 shows that, among bacterial pathogens selected for high US incidence and high
numbers of reported cases in Europe, only Salmonella and LM resulted in food recalls. Also, in
the four datasets, only one recall was caused by a viral agent. Foods at risk of an LM-caused
recall differed in Europe and in the US. In Europe, meat, cheese and fishery products were the
most frequent causes of an LM recall, whereas in the US (meat is excluded from the US samples)
sauces and prepared foods outnumbered other sources, followed by cheese and fish. Available
data indicate that LM prevalence in foods is highest in fish, prepared foods and meat, followed
by cheese. Therefore, it appears that the European system is not targeting an important
category of food at risk. Concerning Salmonella, the US samples implicate mostly fish,
spices and vegetables (but not fruit, a vehicle often implicated in outbreaks). The European
samples associate the bacterium with meat (mostly pork), herbal teas, fish, fruit and spices.
Interestingly, fish is not normally included in lists of foods at high risk of transmitting
salmonellosis, yet it may be eaten raw. Unexpectedly, Salmonella-caused recalls rarely target
eggs and egg-based products.

Fishery products are among the five foods most at risk for chemical (4/4 datasets) or
microbiological (4/4) hazards according to all data sets. Spices represent both a high
microbiological (3/4) and chemical (2/4) risk in most data sets, whereas cheese is mostly
recalled for microbiological reasons. Fruit also ranks quite high owing to the combination of
microbiological risk and pesticide residues.

When cases of disease in Europe are compared by country, Germany reports a high number
of cases of disease (72,377) and of recalls due to Salmonella (41). Italy reports seven times
fewer cases of salmonellosis and only 4 Salmonella-caused recalls. For listeriosis, France and
Germany each reported approximately the same number of cases and recalls. On the other
hand, the UK reported many cases but few European notifications, perhaps because of failures
of the control system. Italy tops the European dataset for products recalled due to LM, but
reports few cases of listeriosis; probably both the microbiological quality of selected foods
and disease reporting need to be improved.

Country-specific comparisons have been made for several countries. In the case of France, for
instance, US data picture a country whose products are at risk for LM and, to some degree,
for chemical hazards; European data confirm the LM-related risk profile, including in meat. In
the case of Germany, only the European system highlights problems with microbiological
hazards (both Salmonella and LM). The risk profile for Italy includes LM, contaminants (Sudan
Red) and fumonisins. Comparisons were also made for India, the UK, Mexico, China, Brazil and



Conclusions

Food recalls and import refusals are relevant components of a safe food system, and this
preliminary investigation of their features has provided interesting insights, which seem to
hold true even with the limited datasets analysed. Specifically, among other findings: US
import-oriented and EU recall policies focus more on chemical hazards than on pathogens;
recall of LM-contaminated products does not seem to be part of the EU listeriosis control
strategy; and several pathogens are virtually absent from recalls. This type of analysis may
prove useful to regulators, industry and other stakeholders.


References

European Commission (EC), 2003. Trends and sources of zoonotic agents in animals, feedingstuffs, food and man
in the European Union and Norway in 2003. SANCO/339/2005.
Foodnet, CDC, 2003. Preliminary FoodNet Data on the Incidence of Infection with Pathogens Transmitted Commonly
Through Food - Selected Sites, United States. MMWR 2004, 53(16), 338-343.


L. Busani, C. Graziani, I. Luzzi, A.M. Dionisi, C. Scalfaro, A. Caprioli and A. Ricci

Subtyping of Salmonella enterica serotype Typhimurium of human and animal origin
as a tool for estimating the fraction of human infections attributable to a given source
L. Busani1, C. Graziani1, I. Luzzi1, A.M. Dionisi1, C. Scalfaro1, A. Caprioli1
and A. Ricci2
1Istituto Superiore di Sanità, Rome, Italy,
2Istituto Zooprofilattico Sperimentale delle Venezie, Legnaro (PD), Italy.


Salmonella typhimurium (STM) is a ubiquitous serotype, isolated from nearly all farmed
and wild animal species, which may act as reservoirs of human infection (Thorns, 2000).
Because of this broad diffusion, phage typing, molecular typing and antimicrobial resistance
profiles are needed to characterise the isolates and to provide clues about the sources of
infection. In Italy, about 12,000 human cases of Salmonella infection are reported each
year, and STM accounts for about 40% of them. In the last three years, the number of human
cases of salmonellosis caused by STM in Italy has exceeded the number of cases due to
S. enteritidis (SE). SE is the most common serotype isolated from humans in Europe and the
USA, and the main sources of infection for humans are eggs and egg products (Herikstad et
al., 2002). This picture suggests that the sources of human salmonellosis change over time
and that multiple sources of infection probably occur. The national laboratory surveillance
system (Enter-net Italia) yearly provides information on about 6,000 isolates from human
infections and 2,000 isolates each from animal, food and environmental sources. In particular,
swine are a common reservoir for STM, but the scarce epidemiological information about
exposure and the risk factors of human cases has not allowed an estimate of the contribution
of this source to human infections. The aim of this work was to use different subtyping tools
to provide clues for identifying the origin of human STM infections in the absence of
epidemiological data, and to define features useful as indicators for monitoring activities.

Material and methods

Two hundred and twenty-one STM isolates from human cases and 1,054 STM isolates from
swine, chicken, turkey and cattle, collected in Italy during 2002 and 2003, were analysed. Data
on serotyping of STM isolates were obtained from the laboratories of the Enter-net surveillance
network for human cases and from the Enter-vet surveillance network for isolates from animals
and from food of animal origin. Data on phage typing, PFGE typing and antimicrobial
susceptibility were obtained from the national reference laboratories for human (Istituto
Superiore di
 Recipients of the 2004 Roberto Chizzolini Memorial Poster Award


Sanità, Rome) and for animal (Istituto Zooprofilattico delle Venezie, Legnaro, Padova) isolates.

Serotyping was performed by the slide agglutination method using commercial O and H
antisera (Statens Serum Institut, Copenhagen, Denmark). Phage typing was performed using
standard methods and phages provided by the International Phage-typing Reference
Laboratory (Health Protection Agency, London, UK). Antimicrobial susceptibility was
determined by the disk diffusion method, following the National Committee for Clinical
Laboratory Standards (NCCLS) recommendations. PFGE was performed using the Bio-Rad
CHEF® system following an agreed protocol (Busani et al., 2004).

Data analysis was performed using Epi Info 2002 ver. 3.2.2 (CDC, USA) and Stata ver. 8
(StataCorp, USA).


Results

STM was the most important serotype involved in human cases in 2002 and 2003, accounting
for about 40% of the isolates. It was also found in cattle (71%), swine (54%), turkeys (13%)
and poultry (5%). Phage typing of human STM showed that DT104 (25%) and DT NT (not
typable; 25%) were the most represented phage types. DT104 was also found in isolates of
STM from the different animal species (26% of isolates from poultry, 51% from cattle, 46%
from turkeys, 18% from pigs). The DT NT phage type was found in 20% of STM from pigs,
10% of strains from poultry and less than 5% of strains from the other sources. PFGE analysis
of NT strains showed that 7 out of 8 of the swine isolates had the same profile (XB0079),
which was also observed in 66 out of 70 (78%) of the human strains, but rarely in isolates
from other animal species.

Resistance to ampicillin (Am), streptomycin (S), sulphonamides (Su) and tetracycline (Te)
was the most commonly found in STM isolates, irrespective of their origin. Resistance to
trimethoprim (Tmp) was found in 14% of STM of human origin, in 21% of the isolates from
swine and in 6-8% of isolates from the other species. Resistance to chloramphenicol (C)
was a feature linked to the DT104 phage type, which showed an R-type AmCSSuT (82% of
DT104), while the R-type AmSSuT was found in DT NT (58%) but also in DT U302 (27%) and
RDNC (11%) isolates of STM. The R-type AmSSuT, sometimes with additional resistance to
nalidixic acid (Na) or to Na and Tmp, was observed in 91% of the XB0079 strains, whereas
that R-type was found in only 40% of the NT strains with other PFGE profiles. The phage
types of the human STM strains with XB0079 and AmSSuT were DT NT (34 out of 35),
DT U302 (27 out of 30) and RDNC (5 out of 5). The factors mainly associated with the swine
origin of STM strains are reported in Table 1. The overall probability that a human STM
isolate comes from swine, estimated on the basis of resistance to Tmp and the R-type
AmSSuT, was about 30% (Figure 1).


Discussion

Antimicrobial susceptibility testing is commonly performed for epidemiological and clinical
purposes; it is also useful as a typing tool to trace back the source of isolates. On the
other hand, phage typing and PFGE typing are expensive and time-consuming techniques,


Table 1. Risk factors associated with the swine origin of Salmonella typhimurium isolates (Enter-vet, Italy, 2002-2003).

Risk factors Crude OR (95% CI) Adjusted OR (95% CI)

ASSuT R pattern 2.9 (1.9-4.5) 5.7 (2.9-11.1)

Phage type DT 208 4.6 (1.9-10.9) 4.5 (2.2-9.3)
Phage type DT NT 2.3 (1.5-3.8) 1.7 (0.8-3.5)
Resistance to TMP 4.8 (2.9-8.1) 4.4 (2.1-9.1)
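As a reminder of how crude odds ratios such as those in Table 1 are derived, the sketch below computes an OR with a Wald-type 95% confidence interval from a 2x2 table; the counts are invented for illustration and are not the Enter-vet data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
         a = marker-positive swine isolates, b = marker-positive other,
         c = marker-negative swine isolates, d = marker-negative other."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only
or_, lo, hi = odds_ratio_ci(40, 20, 60, 90)
print(round(or_, 1), round(lo, 1), round(hi, 1))  # → 3.0 1.6 5.6
```

The adjusted ORs in Table 1 would instead come from a multivariable (logistic) model, which is beyond this sketch.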

[Figure 1, flowchart: human S. Typhimurium cases fall into three marker-defined branches -
resistant to Tmp (10% of cases: 76% swine origin, 24% other); R-type ASSuT (27% of cases:
72% swine origin, 28% other); ASSuT plus Tmp resistance (4% of cases: 80% swine origin,
20% other).]
Figure 1. Model based on the resistance to Tmp and the R-Type AmSSuT to assess the probability of swine
origin of human S. typhimurium infections (Enter-net and Enter-vet, Italy, 2002-2003).
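The branch percentages in Figure 1 combine into the two summary estimates quoted in the text (about 40% of human STM cases carry the markers, and about 30% of all human STM cases are attributable to swine). The short check below multiplies each marker stratum's share of cases by its estimated probability of swine origin; the stratum labels reflect our reading of the flowchart.

```python
# (share of human STM cases, estimated probability of swine origin) per marker stratum
strata = {
    "Tmp resistant":         (0.10, 0.76),
    "R-type ASSuT":          (0.27, 0.72),
    "ASSuT + Tmp resistant": (0.04, 0.80),
}

marker_bearing = sum(share for share, _ in strata.values())
swine_attrib = sum(share * p_swine for share, p_swine in strata.values())

print(f"{marker_bearing:.0%} of human STM cases carry the markers")   # → 41%
print(f"{swine_attrib:.0%} estimated attributable to swine")          # → 30%
```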

not easily available in peripheral diagnostic laboratories, but they provide more targeted
information about the source of the isolates. PFGE analysis of STM showed a close association
between the XB0079 profile and the R-type AmSSuT, irrespective of the phage type and the
origin of the isolates, suggesting a clonal origin of ASSuT strains. Swine origin of STM can
be indicated by Tmp resistance and the R-type AmSSuT; these markers appear not to be
associated with each other and can be analysed separately.

These results indicate that: (1) human isolates of STM with Tmp resistance and/or the R-type
AmSSuT likely have a swine origin; (2) human cases due to isolates with these features
account for about 40% of total STM cases; and (3) these markers could be useful for assessing
the contribution of swine farming and swine products to the burden of human STM
infections.


The limitation of using laboratory data to perform such an estimate is the small number
of isolates submitted to PFGE analysis, and the fact that the statistical inference refers only
to the subset of strains with the considered features. This work provides information
about the role of swine as a reservoir of human infection in Italy, but these data should
be confirmed by epidemiological data from outbreak investigations and risk factor analyses,
to provide a better estimate of the risk factors for human infections.


References

Busani, L., Graziani, C., Battisti, A., Franco, A., Vio, D., Digiannatale, E., Paterlini, F., D´Incau, M., Owczarek, S.,
Caprioli, A. and Luzzi, I., 2004. Antibiotic resistance in Salmonella enterica serotypes Typhimurium, Enteritidis
and Infantis isolated from human infections, foodstuffs and farm animals in Italy. Epidemiology and Infection
132, 245-251.
Herikstad, H., Motarjemi, Y. and Tauxe, R.V., 2002. Salmonella surveillance: a global survey of public health
serotyping. Epidemiology and Infection 129, 1-8.
Thorns, C.J., 2000. Bacterial food-borne zoonoses. Rev. Sci. Tech. Off. Int. Epiz. 19, 226-239.


Caprioli et al.

Enter-Net Italia: surveillance of verocytotoxin-producing Escherichia coli infections in Italy
A. Caprioli1, S. Morabito1, F. Minelli1, M.L. Marziano1, A. Fioravanti1,
R. Tozzoli1, G. Scavia1, L. Busani1, G. Rizzoni2, A. Gianviti2, M.A. Procaccino1
and A.E. Tozzi
1Istituto Superiore di Sanità, Italy
2Ospedale Bambino Gesù, Rome, Italy

Enter-net is the European surveillance network for Salmonella and verocytotoxin (VT)-producing
E. coli (VTEC) infections. Istituto Superiore di Sanità (ISS) represents Italy in the
network. VTEC infection and hemolytic-uremic syndrome (HUS) are not notifiable diseases,
and surveillance is carried out on a voluntary basis. HUS cases are notified to ISS, and
stool and serum specimens are submitted for laboratory diagnosis of VTEC infection. ISS
also receives presumptively identified VTEC strains and/or clinical specimens from clinical
microbiology laboratories.

Microbiologic diagnosis is based on the isolation of VTEC strains and on the detection of
free VT in stools. Serologic diagnosis is based on the detection of serum antibodies to the
lipopolysaccharides (LPS) of the VTEC serogroups O157, O26, O103, O111, and O145 by ELISA
and immunoblotting.

Between 1988 and 2003, 314 cases of VTEC infection were identified. A fairly constant number
of cases per year was detected, and two community-wide outbreaks were observed, in 1992
and 1993 respectively. Most cases were observed during the warm season, and the majority
of them were children with HUS, since most cases were notified by the HUS surveillance
system. The serogroup most commonly detected was O157, particularly phage types 2, 8
and 14. A relevant number of cases, including many with HUS, were associated with non-O157
serogroups. In particular, infections with VTEC O26 were common and have been increasing
during the last 5 years.

In conclusion: (1) the incidence of VTEC infection in Italy is relatively low, especially
compared with that reported for other countries; (2) a considerable proportion of cases is
due to infections by non-O157 VTEC; and (3) the ratio between cases of HUS and cases with
other clinical manifestations indicates that surveillance of uncomplicated VTEC infections
is insufficient and has to be improved by increasing the number of hospital laboratories
looking for E. coli O157.


A. Codega de Oliveira, R. Ortenzi, E. Bartocci, A. Vizzani and B.T. Cenci Goga

Effect of the introduction of HACCP on the microbiological quality of meals at a university
A. Codega de Oliveira, R. Ortenzi, E. Bartocci, A. Vizzani and B.T. Cenci Goga
Dipartimento di Scienze degli Alimenti, Facoltà di Medicina Veterinaria, Università degli
Studi di Perugia, Italy


Meals served in restaurants are often implicated in foodborne disease outbreaks. The main
causes of foodborne disease are connected with poor manufacturing practices in meal
production. A systematic approach to the identification and evaluation of food safety
hazards, such as the HACCP system, must be applied to achieve food safety. The HACCP
system has been used in foodservice establishments, and the European Commission is promoting
harmonisation of HACCP principles according to the 1995 food safety regulations implementing
the Directive on Food Hygiene (93/43/EEC) (Council Directive, 1993). The aim of this
study was to determine the microbiological quality of the three most frequently consumed
categories of meals in a university restaurant, before and after implementation of the HACCP
system, which is illustrated in detail.

Material and methods

A university restaurant, which prepares 1,000 meals a day, was monitored. Meal
samples were collected during the years 1999, 2000 and 2001, for a total of 103 samples.
Gastronomy products, cooked products served warm and cooked products served cold were
tested for bacterial contamination. Triplicate samples (25 g) were weighed aseptically into
sterile stomacher bags, diluted in 225 ml of 0.1% buffered peptone water (BPW), homogenised
in a Stomacher 400 (PBI, Milan, Italy), serially diluted in BPW and plated in triplicate. Aerobic
plate counts (APCs) were determined by surface spreading of homogenate dilutions (1.0 ml)
on Plate Count Agar Standard (PCA, Oxoid, Basingstoke, UK). Bacillus cereus was isolated
on Bacillus cereus Agar Base (Biolife, Powys, UK). Staphylococcus aureus was determined on
Baird-Parker Agar Base (Oxoid); suspect colonies were examined microscopically, Gram
stained, and tested for coagulase (Staphylase test, Oxoid) and thermonuclease (T-test, Oxoid).
Escherichia coli and coliform organisms were isolated on Chromogenic Coliform Agar (Biolife).
For Salmonella spp. isolation, the ISO 6579 method was used (ISO, 1993). Isolation of
Listeria monocytogenes was conducted with the ISO 11290-1 and ISO 11290-2 methods (ISO,
1996; ISO, 1998).

Statistical analysis: all bacterial counts obtained were transformed to log10 CFU/g for
subsequent data analysis. For the three groups of food products, the effect of year of sampling
within each microbial group was compared using analysis of variance, with significance
defined at the 95% level (P ≤ 0.05). StatView 5 software (SAS Institute Inc., Cary, NC, USA)
was used for the statistical analysis of the results. No data analysis was performed when
fewer than 11 samples exceeded the detection thresholds of 300 or 3,000 CFU/g (i.e. 2.48
and 3.48 log10 CFU/g, respectively).
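The year-of-sampling comparison described above amounts to a one-way analysis of variance on the log10-transformed counts. The sketch below, with invented CFU/g values rather than the study's data, computes the F statistic from first principles instead of with StatView.

```python
import math

def log10_counts(cfu_per_g):
    """Transform raw CFU/g counts to log10 CFU/g, as in the study."""
    return [math.log10(x) for x in cfu_per_g]

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA across groups of log10 counts."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g)/len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g)/len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical APCs (CFU/g) for one product category in three sampling years
y1999 = log10_counts([2.0e7, 3.1e7, 2.4e7, 1.8e7])
y2000 = log10_counts([1.2e5, 9.0e4, 1.5e5, 1.1e5])
y2001 = log10_counts([1.3e5, 1.0e5, 1.6e5, 1.2e5])

F = one_way_anova_F([y1999, y2000, y2001])
print(F > 4.26)  # 5% critical value F(2, 9) ≈ 4.26; True means a significant year effect
```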


Results

Table 1 shows the results. The mean APC for the three years combined, over all samples taken,
was 4.29 log10 CFU/g. The lowest APCs were recorded in 2001 for cooked products served
warm, while the highest values were recorded for gastronomy products in 1999. E. coli counts
were always < 2.48 log10 CFU/g, and positive samples were detected in gastronomy products
in 2000 (10% positive samples) and in cooked served cold products in 1999 (17% positive
samples). The mean count of coliform organisms for the three years combined, over all samples
taken, was 2.95 log10 CFU/g. Coliform organisms were detected in practically all categories
throughout all years (with positive samples ranging from 7% for cooked served warm products
in 2001 to 100% for gastronomy and cooked served cold products in 1999), and counts were
always below 2.48 log10 CFU/g in

Table 1. Microbial profile of meal samples.

Year Gastronomy cooked served warm cooked served cold

mean log10 sd +/n§ mean log10 sd +/n§ mean log10 sd +/n§
CFU/g (% CFU/g (% CFU/g (%
positive*) positive) positive)

APC** 1999 7.36 (100%) 0.1 64/64 4.30 (100%) 1.8 85/85 5.03 (100%) 1.1 69/69
2000 5.11 (100%) 1.0 73/73 3.43 (100%) 1.4 98/98 4.35 (100%) 0.9 74/74
2001 5.13 (100%) 1.5 120/120 3.21 (100%) 1.1 185/185 3.85 (100%) 1.2 126/126
E. coli 1999 <2.48 (0%) - 0/64 < 2.48 (0%) - 0/85 < 2.48 (17%) - 12/69
2000 <2.48 (10%) - 7/73 < 2.48 (0%) - 0/98 < 2.48 (0%) - 0/74
2001 <2.48 (0%) - 0/120 < 2.48 (0%) - 0/185 < 2.48 (0%) - 0/126
Coliforms 1999 4.48 (100%) 0.9 64/64 < 2.48 (0%) - 0/85 2.78 (100%) 0.8 69/69
2000 2.63 (63%) 1.4 46/73 < 2.48 (12%) - 12/98 < 2.48 (59%) - 44/74
2001 <2.48 (60%) - 72/120 < 2.48 (7%) - 13/185 < 2.48 (50%) - 63/126
S. aureus 1999 <3.48 (100%) - 64/64 < 3.48 (0%) - 0/85 < 3.48 (32%) - 22/69
2000 <3.48 (63%) - 46/73 < 3.48 (23%) - 23/98 < 3.48 (80%) - 59/74
2001 <3.48 (0%) - 0/120 < 3.48 (7%) - 13/185 < 3.48 (0%) - 0/126
B. cereus 1999 <3.48 (33%) - 21/64 < 3.48 (14%) - 12/85 < 3.48 (30%) - 21/69
2000 <3.48 (10%) - 7/73 < 3.48 (0%) - 0/98 < 3.48 (41%) - 30/74
2001 <3.48 (0%) - 0/120 < 3.48 (0%) - 0/185 < 3.48 (0%) - 0/126

sd: standard deviation; *: % of samples in which at least one typical colony was isolated; **: aerobic plate count;
§n: number of samples positive/number of samples tested.


cooked served warm products. The highest counts for coliform organisms were recorded for
gastronomy products in 1999 (4.48 log10 CFU/g). S. aureus was detected in gastronomy
products and in cooked served cold products in 1999 and 2000, and in cooked served warm
products in 2000 and 2001. S. aureus counts were always below 3.48 log10 CFU/g for all
categories in all years, and positive samples ranged from 100% in gastronomy products in
1999 to 7% in cooked served warm products in 2001. B. cereus was detected in 1999 in all
categories and in 2000 in gastronomy and cooked served cold products, while it was never
detected in 2001. B. cereus counts were always below 3.48 log10 CFU/g. Salmonella spp.
and L. monocytogenes were never detected in any of the samples studied.


Conclusions

Correct implementation of HACCP is effective in reducing both the bacterial load and the
percentage of positive samples: S. aureus and B. cereus were never detected in 2001 in any
category of products. Coliform organisms, although detected throughout all years, showed an
even more impressive drop, especially for the two categories of cold products. Cooked products
served warm, as expected, were always less contaminated than the other two categories. After
the full implementation of the HACCP system (i.e. in 2001), cooked served warm samples
were positive for coliform organisms in only 7% of cases. Similar results were achieved for
S. aureus and B. cereus.

Our study demonstrated that personnel training, along with the implementation of HACCP
in a university restaurant, was effective in reducing overall contamination and the detection
of markers of bacteriological safety. It should be noted that avoidable contamination
of raw products unnecessarily increases the severity of processing required, while
recontamination of food after processing for safety nullifies all previous efforts. Adequate
control of storage temperature all along the food chain is one of the most important points
in codes of good manufacturing and distribution practice (Mossel et al., 1995). The microbial
results of this study demonstrate that personnel training together with HACCP application
contributed to improved food safety in the studied restaurant, and also that the total
mesophilic count is a good indicator of food safety, as shown by the consistency between
APCs and the detection of markers of bacteriological safety.


References

Council Directive 93/43/EEC of 14 June 1993 on the hygiene of foodstuffs. Official Journal L 175, 10/07/1993.
ISO 6579, 1993. Microbiology, General guidance on methods for the detection of Salmonella. International
Organization for Standardization. Geneva.
ISO 11290-1, 1996. Microbiology of food and animal feeding stuff. Horizontal method for detection and enumeration
of Listeria monocytogenes. Detection Method. International Organization for Standardization, Geneva.
ISO 11290-2, 1998. Microbiology of food and animal feeding stuff. Horizontal method for detection and enumeration
of Listeria monocytogenes. Enumeration Method. International Organization for Standardization, Geneva.
Mossel, D.A.A., Corry, J.E.L., Struijk, C.B. and Baird, R.M., 1995. Essentials of the microbiology of food. John Wiley
& Sons Ltd., Chichester, UK.


F. Conte, M.L. Scatassa, G. Monsù, V. Lo Verde, A. Finocchiaro and M. De Fino

Monitoring of safety and quality of donkey’s milk

F. Conte1, M.L. Scatassa2, G. Monsù3, V. Lo Verde2, A. Finocchiaro3 and
M. De Fino3
1Faculty of Veterinary Medicine, University of Messina, Italy
2Istituto Zooprofilattico Sperimentale della Sicilia “A. Mirri”, Palermo, Italy
3Veterinary surgeon, Italy

The use of donkey’s milk has been encouraged as an important option for the treatment of
infants suffering from hypersensitivity to the milk proteins of some animal species, or from
multiple food intolerance (Carroccio et al., 2000), because its composition is close to that of
human milk.

Safety and quality criteria for the production system of donkey’s milk must be established
as soon as possible, especially in view of its hoped-for future use. Furthermore, in EC
Regulation No 853/2004 (CE, 2004) the donkey is implicitly included as a milk animal; in the
same context, sanitary requirements for milk animals and standard plate count and somatic
cell count limits for raw milk are given, while the formulation of specific criteria related to
the quality of raw milk and of dairy products is still awaited. At present, equine milk has been
studied far less than bovine milk. Therefore, the application of EC requirements to the whole
milk chain and, consequently, to HACCP-based process control, must be supported by a
scientific basis; this will allow better adherence to HACCP guidelines so as to guarantee the
safety and the quality of donkey’s milk as well.

On the basis of the above, some results for donkey’s milk from different parts of Sicily are
presented. Attention was given to the farm because European regulation integrates farm
production into the milk chain. The specification of some microbiological criteria was
emphasized; such criteria would allow the application of risk analysis, which is becoming a new
concern in producing acceptable, safe food (Hasell and Salter, 2003). In the present paper,
criteria were considered for the main hygiene indicators, for potential food-borne microbial
agents and for mastitis bacteria. Chemical and physical parameters were included as a tool to
verify freshness and nutritional value, which could be considered optional in view of potential
applications of donkey’s milk.
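The kind of farm-level screening envisaged here can be sketched as a simple check of measured values against criterion limits. In the sketch below the parameter names mirror those of the study, but the limit values are invented placeholders, not the criteria of Regulation (EC) No 853/2004 or of this paper.

```python
# Sketch: screening one raw-milk sample against microbiological criteria.
# The limit values below are invented placeholders, NOT legal limits.
LIMITS_LOG10 = {
    "SPC": 5.0,        # standard plate count, log10 cfu/ml (assumed limit)
    "coliforms": 3.0,  # log10 cfu/ml (assumed limit)
    "SCC": 5.0,        # somatic cell count, log10/ml (assumed limit)
}

def check_sample(sample):
    """Return the parameters whose value exceeds the assumed limit."""
    return [name for name, limit in LIMITS_LOG10.items()
            if sample.get(name, 0.0) > limit]

# Donkey 3 in Table 1: SPC 5.07, coliforms 2.57, SCC 4.67 (log10 values)
print(check_sample({"SPC": 5.07, "coliforms": 2.57, "SCC": 4.67}))  # -> ['SPC']
```

Against these placeholder limits only the SPC of that sample would be flagged; real limits would have to come from the regulation or from studies such as this one.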

The study included two parts. In the first, donkey’s milk samples were collected from one
farm; the second considered three farms. The donkeys were of varying ages. The trials were
carried out monthly; milk was aseptically collected immediately after milking by hand or
machine, into sterile bottles; samples were delivered to the laboratory in a cool box and
analysed upon arrival for: enumeration of total plate count, coliforms, sulphite reducing
anaerobes, Escherichia coli, Staphylococcus spp.; detection of Listeria spp., Salmonella spp.,
Streptococcus spp., Pasteurella spp., Pseudomonas spp. and Corynebacterium spp. Somatic
cell count (SCC), fat (FAT), total nitrogen (PRT), lactose (LCT), dry matter content (dm) and
pH were also considered. Official and/or accredited (SINAL) methods were applied for the
examination of the milk. The first study, carried out on seven donkeys for almost the whole
lactation period, gave the results summarized in Table 1. Mean values over an eight-month
observation period during 2000 and 2001 are reported.

Towards a risk-based chain control  265

F. Conte, M.L. Scatassa, G. Monsù, V. Lo Verde, A. Finocchiaro and M. De Fino

Table 1. Mean values for the seven donkeys studied (columns 1-7). Microbial counts and SCC are in log10 cfu ml-1; FAT, PRT, LCT and dm are in mg/100 ml of milk.

Parameter            1     2     3     4     5     6     7

SPC                  3.50  4.34  5.07  4.96  4.17  4.23  4.88
Coliforms            2.14  1     2.57  2.30  2.85  1.89  2.11
E. coli              2     1     <1    <1    1.43  <1    <1
Staphylococcus spp.  <1    <1    2.47  2.30  1.60  <1    <1
SCC                  4.13  4.33  4.67  4.55  4.07  4.34  4.70
FAT                  0.66  0.79  1.05  1.35  0.49  0.56  0.79
PRT                  1.68  1.79  1.60  1.76  1.54  1.35  1.63
LCT                  6.09  6.23  5.71  6.09  6.17  6.46  6.09
dm                   8.48  8.73  8.01  8.57  8.41  8.51  8.43
pH                   7.23  7.17  7.30  7.25  7.11  7.27  7.19

Streptococcus zooepidemicus, S. intermedius, Pseudomonas aeruginosa and Staphylococcus
hyicus were recovered from some animals during different sampling trials. Three cases
of subclinical mastitis were detected; the affected animals were not included in the study
because milk secretion was cut off. Pseudomonas spp. and Staphylococcus aureus were isolated
from the milk of two animals as aetiological agents.

The second study comprised a total of 23 donkeys from three farms (A, B, C) during 2004. On
farm A and on farm B donkeys were hand milked; on farm C milking was done by machine.
Mean values in milk from the different farms are summarized in Table 2.

Staphylococcus aureus (<30 cfu/ml of milk) was detected in different animals from two
farms. Staphylococcus epidermidis and Streptococcus dysgalactiae were isolated from one
farm. The presence of mastitis was ruled out. Salmonella spp., Listeria spp., Pasteurella and
Corynebacterium spp. were not isolated in either study.

Overall, the data in the present report show an acceptable microbiological quality of donkey’s
milk. The management systems of the two kinds of farms (hand milking and mechanical milking)
could explain the differences in SPC and coliform counts and in fat, protein, lactose and
dry matter content.
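Because the counts are reported on a log10 scale, the gap between the machine-milked and hand-milked farms can be translated into a fold difference. A minimal sketch using the SPC values of Table 2 (the interpretation, not the arithmetic, is ours):

```python
# Log10 counts differ additively; the corresponding fold change is 10**difference.
# SPC values (log10 cfu/ml) from Table 2; farms A and B hand milked, C machine milked.
spc = {"A": 4.253, "B": 4.161, "C": 5.872}

hand_mean = (spc["A"] + spc["B"]) / 2   # mean of the two hand-milked farms
diff = spc["C"] - hand_mean             # difference in log10 units (~1.67)
fold = 10 ** diff                       # roughly 46-fold higher SPC on farm C
print(round(diff, 3), round(fold))
```

A difference of about 1.7 log10 units thus corresponds to a plate count some tens of times higher on the machine-milked farm.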

The strong defence system of the donkey’s mammary gland (e.g. the high amount of lysozyme
in milk) could explain the low counts or the absence of some micro-organisms. The low mean SCC
confirmed the difference between donkey’s milk and the milk of other dairy species (bovine,
ovine, caprine), except in cases of mastitis. The study confirmed that mastitis is a rare
pathology in the donkey and would not be of concern for milk safety and quality. A variable
milk composition could reflect farm management conditions and the animals’ status as well.
The prevalence of S. aureus must not be neglected; further investigations


Table 2. Mean values per farm (A, B, C). Microbial counts and SCC are in log10 cfu ml-1; FAT, PRT, LCT and dm are in mg/100 ml of milk.

Parameter                 A      B      C

SPC                       4.253  4.161  5.872
Coliforms                 2.837  1.848  4.641
E. coli                   <1     <1     <1
Staphylococcus spp.       <1     <1     <1
Sulphite reducing anaer.  <1     <1     <1
SCC                       4.645  4.069  4.481
FAT                       0.709  0.516  0.327
PRT                       1.931  1.845  1.704
LCT                       6.092  6.795  6.590
dm                        8.718  9.369  9.072
pH                        7.015  6.980  7.015

are needed to assess whether this micro-organism in donkey’s milk should be considered a
hazard, in view of the fact that infants could be fed raw milk.

The microbiological criteria of the present study could be useful for monitoring donkey’s milk
at the farm-production level. Further information on a larger number of donkeys could
subsequently be integrated. At present it is not easy to collect milk samples from a
statistically significant number of farms in Sicily; it is hoped that milk samples representative
of the different parts of Sicily where farms are located can be examined in future.

Microbiological data, after further confirmation, will support the definition of criteria
and critical limits for HACCP system application; physical and chemical data could support
the legal specification of quality reference values. Risk assessment for microbial hazards
in foods is a rapidly developing discipline. Concerted efforts are needed to acquire better
data and an improved understanding of processes and interactions in the food chain. Even in
the absence of complete data, the risk assessment approach is a valuable tool for gaining
insight into food safety issues by encouraging the collection and analysis of information
(APHA, 2001).

The data of the present paper have to be considered preliminary; our research must be
continued with the following aims: to allow the application of an HACCP system to the donkey
milk chain; to give sufficient indications for the formulation of the specific quality criteria
indicated in the EC regulation; and to improve knowledge of the nutritional value of donkey’s
milk as a substitute for human milk, or for milk from other species, for infants suffering from
milk protein hypersensitivity.



References
American Public Health Association (APHA), 2001. Microbiological examination of foods. F.P. Downes and K. Ito
(Eds.), Washington.
Carroccio, A., Cavataio, F., Montalto, G., D’Amico, D., Alabrese, L. and Iacono, G., 2000. Intolerance to hydrolysed
cow’s milk proteins in infants: clinical characteristics and dietary treatment. Clin. and Experim. Allergy 30,
Hasell, S.K. and Salter, M.A., 2003. Review of the microbiological standards for foods. Food Control 14, 391-398.
Regulation (EC) No 853/2004 of the European Parliament and of the Council of 29 April 2004 laying down specific
hygiene rules for food of animal origin. Official Journal of the European Union L 139/55, 30 April 2004.


The frequency and capacity for dissemination of brain tissue embolism associated with
pre-slaughter stunning of cattle

R.R. Coore1, S. Love2 and M.H. Anil1
1Department of Clinical Veterinary Science, University of Bristol, Bristol BS40 5DU, United Kingdom
2Department of Neuropathology, Institute of Clinical Neurosciences, Frenchay Hospital, Bristol BS16 1LE, United Kingdom

The discovery of brain tissue fragments in the jugular blood of some cattle after the use of
captive bolt gun (CBG) stunning has called into question the continued use of these devices
in light of the BSE epidemic (Anil et al., 1999). As a consequence, there is now a need to
carry out in-depth studies to assess and establish the risk of brain tissue embolism posed
by current stunning methods and the potential for contamination of the carcass by central
nervous system (CNS) tissue.

Frequency of brain tissue embolism following the use of captive bolt


Two groups of one hundred cattle were sampled for the presence of CNS tissue in the jugular
return from the head after CBG stunning using a previously described technique (Anil et
al., 1999).

Brain tissue emboli were identified in four animals in association with the use of the
penetrating CBG, and in a further two animals following the use of the non-penetrating CBG.
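For orientation, observed frequencies such as these can be given an approximate confidence interval. The sketch below uses the Wilson score interval; this is our choice of method for illustration, not one reported in the original study.

```python
import math

def wilson_ci(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# 4 of 100 animals with emboli after penetrating CBG stunning
lo, hi = wilson_ci(4, 100)
print(f"{lo:.3f}-{hi:.3f}")  # roughly 0.016-0.098
```

An observed 4% thus remains compatible with a true frequency anywhere from about 2% to about 10%, which underlines the need for the larger studies called for above.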

Dissemination of brain tissue emboli in the carcass

A suspension of macerated brain tissue was introduced into the jugular veins of anaesthetised
cattle while simultaneously sampling the aortic blood and stunning by the use of a penetrating
CBG. All samples were analysed for the presence of CNS tissue. CNS tissue was detected in
blood samples from three animals. This study has confirmed the potential for particulate
brain tissue fragments present in the jugular return to pass through the bovine pulmonary
capillaries to enter the systemic circulation.

The venous drainage of the bovine head was investigated using a method of corrosion casting.
Fresh specimens of a head and neck were injected with methyl methacrylate resin (Tensol
70, Evode Speciality Adhesives, UK) and all tissues were subsequently digested in a bath of
concentrated sodium hydroxide over a 3-6 month period. The casts obtained demonstrated


the relative size and capacity of sinuses and veins that may transport emboli from the head
after stunning.

Brain tissue emboli and electrical stunning

Examination of cattle brains and analysis of exsanguinated blood following the use of
electrical stunning suggests that brain tissue embolism would be unlikely after the use of
this stunning method.


Acknowledgements
The project was funded by UK Food Standards Agency FSA grant MO 3012.


References
Anil, M.H., Love, S., Williams, S., Shand, A., McKinstry, J.L., Helps, C.R., Waterman-Pearson, A., Seghatchian, J.
and Harbour, D.A., 1999. Potential contamination of beef carcases with brain tissue at slaughter. Vet Rec.
145, 460-462.


Resistance of Salmonella enteritidis and Salmonella spp. to quinolones in poultry in
Styria (Austria)

F. Dieber, P. Wagner and J. Köfer
Department of Veterinary Administration in Styria, Animal Health Service, A-8010 Graz, Austria

In contrast to infections caused by S. Typhi, nontyphoidal Salmonella infections are mainly
spread by the consumption of contaminated food, especially food of animal origin. Berghold
et al. (2004) reported 8,271 cases of illness caused by Salmonella spp. in Austria in 2003.
More than 85% of all human infections registered in that year were caused by S. enteritidis,
in most cases as a result of the consumption of eggs (Berghold et al., 2004). Reports of
fluoroquinolone-resistant Salmonella spp. are cause for concern, because fluoroquinolones are
the drugs of choice for treating invasive gastrointestinal infections (Piddock et al., 1998). The
goal of our study was to investigate the extent of resistance of Salmonella spp. to quinolones
against the background of the emerging discussion about the correct breakpoint (Moeller-Aarestrup
et al., 2003).

Material and methods

Salmonella enteritidis and other Salmonella spp. were isolated from the faeces of poultry and
from the surface of poultry meat. Subsequently, they were tested for resistance to ciprofloxacin
using SENSITITRE® susceptibility plates. Sampling was done in three poultry slaughterhouses.

Results and discussion

A total of 129 strains of S. enteritidis isolated from poultry faeces, 24 strains isolated
from the surface of poultry meat and 48 strains of Salmonella spp. isolated from poultry
faeces were tested for resistance to ciprofloxacin. No resistance of Salmonella spp. to
ciprofloxacin was found using the current National Committee for Clinical Laboratory Standards
(NCCLS) breakpoints for resistance to fluoroquinolones of MIC ≥ 2 µg/ml for enrofloxacin and
MIC ≥ 4 µg/ml for ciprofloxacin (Table 1). However, there is a justifiable suspicion that the
use of these breakpoints obscures the true occurrence of resistance to quinolones among
Salmonella strains (Chen et al., 1996; Drlica et al., 1997). Decreased susceptibility to
ciprofloxacin (MIC 0.125 to 2 µg/ml) develops as a result of a single mutation in the gyrA
gene; full resistance to fluoroquinolones is caused by a second mutation in either the gyrA
or gyrB gene. If MIC ≥ 0.125 µg/ml is used as the breakpoint, 4.8% of the S. enteritidis
isolates and up to 48% of the Salmonella spp. isolates prove to be resistant to ciprofloxacin,
or at least show a decreased susceptibility (Table 1).
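The effect of moving the breakpoint can be illustrated by reclassifying a set of MIC values; the MIC list below is invented for illustration and does not reproduce the study’s isolates.

```python
# Share of isolates classified resistant at a given MIC breakpoint.
def pct_resistant(mics, breakpoint):
    """Percentage of MIC values (µg/ml) at or above the breakpoint."""
    return 100 * sum(m >= breakpoint for m in mics) / len(mics)

# Invented MIC values for 100 hypothetical isolates (µg/ml)
mics = ([0.015] * 45 + [0.03] * 3 + [0.06] * 2 +
        [0.125] * 20 + [0.25] * 28 + [0.5] * 2)

print(pct_resistant(mics, 4.0))    # conventional breakpoint -> 0.0
print(pct_resistant(mics, 0.125))  # lower breakpoint -> 50.0
```

The same isolate set can thus appear fully susceptible or half resistant depending solely on where the breakpoint is placed, which is the crux of the discussion cited above.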


Table 1. Resistance to ciprofloxacin (CIP) of S. enteritidis and Salmonella spp. isolated from faeces and meat of poultry, depending on the breakpoint applied: 2 µg/ml or 0.125 µg/ml.

                          Breakpoint (µg/ml)  N    CIP (%)  CI 95     MIC distribution (%) over 0.015-8 µg/ml

S. enteritidis in faeces  2                   129  0.0      [0-2.8]   90.7 4.7 0.8 1.6 1.6 0.8
S. enteritidis in faeces  0.125               129  4.8      [0-2.8]   90.7 4.7 0.8 1.6 1.6 0.8
S. enteritidis in meat    2                   24   0.0      [0-13.7]  91.7 4.2 4.2
S. enteritidis in meat    0.125               24   4.2      [0-13.7]  91.7 4.2 4.2
S. spp. in faeces         2                   48   0.0      [0-7.3]   45.8 6.3 14.6 31.3 2.1
S. spp. in faeces         0.125               48   48.0     [0-7.3]   45.8 6.3 14.6 31.3 2.1


References
Berghold, Ch. and Kornschober, K., 2004. Mitteilungen der Sanitätsverwaltung 105, Heft 4, 8-12.
Chen, C.-R., Malik, M., Snyder, M. and Drlica, K., 1996. DNA gyrase and topoisomerase IV on the bacterial chromosome:
quinolone-induced DNA cleavage. J. Mol. Biol. 258, 627-637.
Drlica, K. and Zhao, X., 1997. DNA gyrase, topoisomerase IV, and the 4-quinolones. Microbiol. and Molecular Biol.
Rev. 61, 377-392.
Moeller-Aarestrup, F., Wiuff, C., Molbak, K. and Threlfall, E.J., 2003. Is it time to change fluoroquinolone breakpoints
for Salmonella spp.? Antimicrob. Agents Chemother. 47, 827-829.


Changes in histamine and microbiological analysis in fresh and frozen tuna muscle during
temperature abuse

V.K. Economou¹, M.M. Brett², C. Papadopoulou¹ and T. Nichols³
¹Department of Food and Water Microbiology, Laboratory of Microbiology, Medical School of
Ioannina, P.O. Box 1186, 45110 Ioannina, Greece
²Food Safety Microbiology Laboratory, PHLS Central Public Health Laboratory, 61 Colindale
Av., London NW9 5HT, United Kingdom
³PHLS Statistics Unit, PHLS Communicable Disease Surveillance Centre, 61 Colindale Av.,
London NW9 5EQ, United Kingdom


Chemical and microbial spoilage in tuna steaks, fresh tuna loins and frozen-and-thawed tuna
loins was studied under different storage and abuse procedures. Tuna was stored at 0-2 °C, 3-4 °C
and 6-7 °C and abused at 20 °C, 25 °C and 30 °C for 7 and 12 days. At the end of the procedure,
steaks stored at 0-2 °C and abused at 25 °C for 1 h daily, loins stored at 3-4 °C and abused
at 20 °C for 2 h daily, and loins stored at 0-2 °C and abused at 30 °C for 2 h daily contained
histamine concentrations that were not toxic, whereas steaks stored at 0-2 °C and abused at
25 °C for 3 h daily, steaks stored at 5-7 °C and abused at 25 °C for 1 and 3 h daily, and loins
stored at 6-7 °C and abused at 30 °C for 2 h daily contained histamine concentrations that
were toxic. There was an increase over time in all microbial counts tested, with the exception
of sulfite-reducing bacteria. No significant correlation was observed between bacterial counts
and histamine concentration.

Scombroid fish poisoning has been reported in many countries and is the most prevalent
seafood-borne disease in the United States (Lehane and Olley, 2000). Symptoms include skin
rashes, urticaria, oedema, localized inflammation, nausea, vomiting, diarrhoea, cramping,
hypotension, headaches, palpitations and an oral burning and blistering sensation (Lehane and
Olley, 2000). The incubation period is short and symptoms resolve after a few hours. Scombroid
fish poisoning is associated with the ingestion of foods with a high content of histamine.
Histamine is produced by decarboxylation of L-histidine by bacterial decarboxylases (Lehane
and Olley, 2000). Bacteria associated with the production of histidine decarboxylase include
members of the family Enterobacteriaceae (Taylor, 1986), lactic acid bacteria (Joosten and
Northolt, 1989), Vibrionaceae and Photobacterium spp. (Silla Santos, 1996). Histamine is
not inactivated by heat during cooking or processing. The formation of histamine in fish is
prevented by storage at low temperatures (Taylor, 1986).
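When histamine results are interpreted, they are commonly compared against regulatory guide levels. The sketch below assumes the frequently cited US FDA guidance values (50 ppm defect action level, 500 ppm toxicity level); these thresholds are our assumption, not values stated in this paper.

```python
def classify_histamine(ppm):
    """Rough classification against assumed US FDA guidance values."""
    if ppm >= 500:   # assumed toxicity level
        return "potentially toxic"
    if ppm >= 50:    # assumed defect action level
        return "decomposed / defect level"
    return "acceptable"

for value in (15, 101.5, 2500):  # example histamine concentrations (ppm)
    print(value, classify_histamine(value))
```

On such a scale the day-0 values reported below would pass, while several end-of-abuse values would fall into the defect or toxic ranges.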


The aim of this study was to assess the histamine risk due to tuna mishandling in restaurants,
where fish are exposed to high temperatures for short periods of time. The different storage
procedures and their contribution to histamine build-up, and the correlation between the
production of histamine and bacterial growth, were also assessed.

Material and Methods

Eight tuna steaks and 28 tuna loins were used. All samples were kept at temperatures below
5 °C during transport to the laboratory and stored at 3-4 °C until use. Fish were skinned and
placed into sterile plastic bags. The tuna steaks were divided into four groups of two steaks.
Two groups were stored at 0-2 °C and abused at 25 °C for 1 and 3 h respectively for 11
consecutive days, and two groups were stored at 5-7 °C and abused at 25 °C for 1 and 3 h
respectively daily for 11 days. The tuna loins were divided into three groups. The first group
was stored at 3-4 °C and was divided into four subgroups. One subgroup was defrosted at
3-4 °C and served as controls; three subgroups were defrosted using different regimes: at 3-4 °C,
at room temperature (20 °C) and using a microwave oven (Philips, Cooktronic M710). These
subgroups were temperature abused for 2 h daily at 20 °C for 7 days. The second group of loins
was stored at 0-2 °C; two fresh loins served as controls, and three frozen loins and three fresh
loins were temperature abused for 2 h daily at 30 °C for 12 days. The third group was stored at
6-7 °C; three fresh loins served as controls, and three frozen loins and three fresh loins were
temperature abused for 2 h daily at 30 °C for 12 days.

Ten grams of homogenized flesh was mixed with 90 ml of maximum recovery diluent and
ten-fold serial dilutions were performed. The counts determined were aerobic counts (Plate
Count Agar), Enterobacteriaceae (MacConkey agar No. 3), staphylococci (Baird-Parker agar),
Pseudomonadaceae (Pseudomonas selective agar), sulfite-reducing bacteria (Tryptose Sulfite
Cycloserine agar), lactic acid bacteria (DeMan-Rogosa-Sharpe agar) and Vibrio spp. (TCBS
agar). All media were supplied by the PHLS Media Department (ISO 9001 standard) and were
quality controlled by the Media Department Quality Control to UKAS standards. A fluorometric
method was used for histamine analysis as described by Taylor et al. (1978). The
fluorometer used was a Kontron M25 (Kontron Analytical, Kontron Instruments Ltd, Kontron
House, Campfield Rd, St Albans, Herts, AL1 5JG, UK).
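The counting arithmetic behind such a dilution series (10 g of flesh in 90 ml of diluent gives a 10^-1 homogenate, and each further ten-fold step adds one to the dilution exponent) can be sketched as follows; the colony count used is an invented example.

```python
import math

def log10_cfu_per_g(colonies, dilution_exponent):
    """log10 CFU/g from a plate count.

    10 g of flesh in 90 ml of diluent gives a 10**-1 homogenate; each
    further ten-fold step adds one to `dilution_exponent`. Assumes 1 ml
    of the dilution was plated.
    """
    return math.log10(colonies * 10 ** dilution_exponent)

# Invented example: 150 colonies on the 10**-4 plate
print(round(log10_cfu_per_g(150, 4), 2))  # -> 6.18 (i.e. 1.5e6 CFU/g)
```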


The histamine concentrations and microbiological counts for the different tuna treatments
are shown in Tables 1a-d. Vibrio spp. counts were negligible.


Table 1a. Histamine concentrations and microbial counts in tuna steaks stored at 0-2 °C and 5-7 °C and abused at 25 °C.

Days  H(a)    ACC(b)  E(c)  LF(d)  NLF(e)  ST(f)

Temperature abuse: 1 h (25 °C). Storage: 0-2 °C
0     5       4.9     5.8   3      5.8     4.3
11    101.5   >8.3    6.8   5.8    6.7     7.2
Temperature abuse: 3 h (25 °C). Storage: 0-2 °C
0     14.5    6.1     5.5   3.5    5.5     4.6
11    2500    >8.3    7.7   5.8    7.7     8.1
Temperature abuse: 1 h (25 °C). Storage: 5-7 °C
0     15      6.4     5.4   2.6    5.4     3.2
11    1240    >8.3    7.8   4.6    7.8     7.74
Temperature abuse: 3 h (25 °C). Storage: 5-7 °C
0     14.5    6.1     5.3   2.6    5.3     4.4
11    2140    >8.3    8     6      8       8.3

(a) Histamine (ppm), (b) Aerobic Colony Count (30 °C), (c) Enterobacteriaceae, (d) Lactose Fermenters, (e) Non-Lactose Fermenters, (f) Staphylococci, (g) Sulfite-Reducing Bacteria, (h) Lactic Acid Bacteria, (i) Pseudomonadaceae.

Table 1b. Histamine concentrations and bacterial counts in tuna loins stored at 3-4 °C and abused at 20 °C (histamine (ppm) as averages; bacterial counts (CFU/g) as averages of log10 values).

Days  H(a)  ACC(b)  E(c)  LF(d)  NLF(e)  ST(f)  SRB(g)  LAB(h)  PS(i)

Controls (Storage: 3-4 °C. Defrosting: fridge temperature)
0     11.4  3.1     2.9   2.3    2.8     <2     <1      <2      2
7     30    6.3     4     2.7    3.9     4.9    <1      3.6     5.9
Temperature abuse: 2 h (20 °C). Storage: 3-4 °C. Defrosting: fridge
0     17.1  3.8     3.4   3.1    2.9     2.2    0.6     1.3     2.2
7     29.1  6.9     5.1   4.4    5.7     6.8    0.3     5.8     7.1
Temperature abuse: 2 h (20 °C). Storage: 3-4 °C. Defrosting: RT
0     20.4  4.1     2.8   2.5    1.6     3.3    <1      2.2     3.3
7     36    8.1     6.6   3.9    6.7     6.6    <1      6.5     8
Temperature abuse: 2 h (20 °C). Storage: 3-4 °C. Defrosting: MW
0     16.6  4       3.5   2.8    3.3     1.1    <1      1.6     3.1
7     63    7       5.1   3.6    5.1     4.9    0.5     4.3     6.8


Table 1c. Histamine concentrations and microbial counts in fresh and frozen tuna loins stored at 0-2 °C and abused at 30 °C.

Controls. Storage: 0-2 °C. Fresh tuna
4.8  4    2.9  3.9  3.5  1.5  3.5  3.8
7.4  6    2.3  6    6.4  1.3  6.5  6.4
Temperature abuse: 2 h (30 °C). Storage: 0-2 °C. Fresh tuna
5.6  4.1  3.5  3.7  4.4  1.7  4    4.6
7.1  6    3.8  5.9  6.4  2.3  6.7  6.7
Temperature abuse: 2 h (30 °C). Storage: 0-2 °C. Defrosting: fridge
4.8  3    2.7  2.2  3.6  1.7  3.7  4.2
7.7  5.5  4.3  5.5  7.3  3.1  7    6.7

Table 1d. Histamine concentrations and microbial counts in fresh and frozen tuna loins stored at 6-7 °C and abused at 30 °C.

Controls. Storage: 6-7 °C. Fresh tuna
4.8  3.5  2.9  3.3  1.9  0.3  4.7  3
7.2  4.8  3.8  4.4  5.9  1.6  7.6  7.2
Temperature abuse: 2 h (30 °C). Storage: 6-7 °C. Fresh tuna
4.6  2.8  1.5  2.8  1.2  <1   5.1  1.4
7.9  5.8  5.1  5.3  6.2  1.6  7.8  6.9
Temperature abuse: 2 h (30 °C). Storage: 6-7 °C. Defrosting: fridge
4.6  3.5  0.9  3.5  3.2  0.3  3.8  2.2
7.6  6.4  4.9  6.3  6.9  3    7.2  6.7


With shorter periods of temperature abuse, the main factor affecting histamine production
appears to be storage temperature; with longer times of abuse, storage temperature appears
to be less important. Histamine accumulation in steaks appears to be faster than in loins.
When toxic levels of histamine were present, they were first detected between four and
twelve days of abuse. Our results were comparable to those obtained by Gingerich et al.
(1999), Kim and Price (1999) and López-Sabater et al. (1996). The length and temperature
of both storage and abuse are important. The final histamine concentrations were also higher
in fresh than in defrosted loins. Microbial analysis of steaks did not show any significant
difference between the different storage and temperature abuse procedures, and no correlation
between histamine production and the microbial parameters tested was observed. Nevertheless,


in loins with the highest histamine concentrations (those abused at 30 °C and stored at
6-7 °C), counts of Enterobacteriaceae and lactose fermenters were higher than in controls.
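A correlation check of the kind reported here (histamine versus bacterial counts) can be sketched with a plain Pearson coefficient; the paired values below are invented for illustration and are not the study’s data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented paired observations: log10 aerobic count vs histamine (ppm)
acc = [3.1, 4.5, 5.8, 6.9, 8.1]
histamine = [11.0, 25.0, 18.0, 40.0, 30.0]
print(round(pearson_r(acc, histamine), 2))  # -> 0.75
```

With so few paired points even a moderate coefficient carries wide uncertainty, which is one reason a formally non-significant correlation, as reported here, is plausible despite rising counts.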


Acknowledgements
This work was supported by the British Council and the Greek Secretariat of Research and
Technology, Greek Ministry of Development.

References
Gingerich, T.M., Lorca, T., Flick, G.J., Pierson, M.D. and McNair, H.M., 1999. Biogenic amine survey and organoleptic
changes in fresh, stored, and temperature-abused bluefish (Pomatomus saltatrix). J. Food Prot. 62, 1033-
Joosten, H. and Northolt, M., 1989. Detection, growth, and amine-producing capacity of lactobacilli in cheese.
Appl. Environ. Microbiol. 55, 2356-2359.
Kim, S.H., An, H. and Price, R.J., 1999. Histamine formation and bacterial spoilage of albacore harvested off the
U.S. Northwest coast. J. Food Sci. 64, 340-343.
Lehane, L. and Olley, J., 2000. Histamine fish poisoning revisited. Int. J. Food Microbiol. 58, 1-37.
López-Sabater, E.I., Rodríguez-Jerez, J.J., Hernández-Herrero, M., Roig-Sagués, A.X. and Mora-Ventura, M.T., 1996.
Sensory quality and histamine formation during controlled decomposition of tuna (Thunnus thynnus). J. Food
Prot. 57, 167-174.
Silla Santos, M.H., 1996. Biogenic amines: their importance in foods. Int. J. Food Microbiol. 29, 213-231.
Taylor, S., 1986. Histamine food poisoning: toxicology and clinical aspects. CRC Crit. Rev. Toxicol. 17, 91-128.
Taylor, S., Lieber, E. and Leatherwood, M., 1978. A simplified method for histamine analysis of foods. J. Food Sci.
43, 247-250.


VRE (Vancomycin-resistant Enterococci) from human, animal, and environmental samples in
Styria, Austria

A. Eisner1, G. Feierl1, G. Gorkiewicz1, F. Dieber1, E. Marth1 and J. Köfer2
1Department of Hygiene, Medical University Graz, Austria
2Department of Veterinary Administration in Styria, Austria


Over the last few years, enterococci have increasingly become responsible for serious
nosocomial infections ranging from urinary tract and wound infections to endocarditis,
bacteremia and neonatal sepsis. This increase in the incidence of enterococcal infections is
mainly due to their remarkable ability to rapidly develop high-level resistance to antimicrobial
agents, which makes them difficult to treat. During the last decade, in particular, resistance
to glycopeptides has become a major cause of concern. Vancomycin-resistant enterococci (VRE)
were first isolated in 1986 in Europe, and in 1987 in the United States. Since then, their
presence has increasingly been detected throughout the world.

In Europe the rise of VRE is linked to the use of the antibiotic avoparcin, a glycopeptide
that shows complete cross-resistance to vancomycin, as a growth promoter in food animals.
Because of the risk of VRE transmission from animals to humans, avoparcin was banned in
all EU countries in 1997. Transmission of VRE of animal origin to humans through the food
chain has been proposed as the most likely connection between animal VRE reservoirs and
humans in the community. Contact between animals and humans at avoparcin-exposed
farms has also been shown to be associated with human VRE colonization. Recent data from
different European countries indicate a decrease in the prevalence of VRE isolated from broiler
chickens, poultry meat samples and healthy humans.

The objective of this study was to investigate the prevalence and to determine the genotypes
of VRE from human, animal and environmental samples in Styria, Austria.


Material and methods
Between April 2003 and May 2004, 200 human faecal specimens derived from patients with
preceding antibiotic therapy and 200 derived from non-hospitalised persons who had not
received antibiotic therapy during the previous four weeks were investigated for the presence
of VRE.

During the same period, 619 animal faecal samples were taken from different slaughterhouses.
A total of 208 cattle faecal samples were collected from 136 farms, 206 pig faecal samples


were collected from 150 farms, and 205 broiler faecal samples were collected from 58 farms. In
addition 112 liquid manure samples from pig farms were screened for the presence of VRE.

One ml of a 10-fold dilution of each faecal or liquid manure sample in 0.9% NaCl was added
to 9 ml of Enterococcosel Broth (BD Diagnostic Systems, Sparks, Md.). After incubation at
35 °C for 24 hours, 100 µl were sub-cultured onto Vancomycin Screen Agar plates containing
6 mg vancomycin per liter (BD) and onto Columbia Blood Agar without vancomycin (BD),
and incubated at 35 °C for a further 24 hours. Three colonies were randomly selected from each
vancomycin screen agar plate and subcultured onto blood agar. All enterococcal isolates were
identified on the basis of colony morphology, Gram stain, catalase, pyrrolidonyl arylamidase,
Lancefield group D antigen, motility, yellow pigment production, and the API 20 Strep and
Rapid ID 32 Strep tests (bioMérieux, Marcy l’Etoile, France).

MICs (minimal inhibitory concentrations) for vancomycin and teicoplanin were determined by the
Etest method (AB Biodisk, Solna, Sweden) according to the manufacturer’s instructions. The
concentration gradient for vancomycin and teicoplanin ranged from 0.016 to 256 µg/ml. The
Etest inoculum was prepared in Brain Heart Infusion (BHI) broth (Oxoid, Basingstoke, UK) to
an inoculum density of 2 McFarland. 100 µl of this suspension was pipetted onto BHI agar
(Oxoid) and the plates were incubated at 35 °C in ambient air for 24 h. MICs were confirmed
after 48 h of incubation. The NCCLS (National Committee for Clinical Laboratory Standards)
breakpoints were used for interpretation of the results. Determination of glycopeptide
resistance genotypes (VanA, VanB, VanC1, VanC2/3) was performed by PCR.
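Interpretation of the Etest readings against glycopeptide breakpoints can be sketched as follows. The vancomycin category limits used here (susceptible ≤ 4, intermediate 8-16, resistant ≥ 32 µg/ml) are the commonly cited NCCLS/CLSI values for enterococci, not figures taken from this paper, and should be confirmed against the standard actually applied.

```python
def interpret_vancomycin_mic(mic):
    """Categorise an enterococcal vancomycin MIC (µg/ml).

    Uses the commonly cited NCCLS/CLSI breakpoints (S <= 4, I 8-16,
    R >= 32); confirm against the current edition of the standard.
    """
    if mic <= 4:
        return "susceptible"
    if mic <= 16:
        return "intermediate"
    return "resistant"

for mic in (1.0, 8.0, 256.0):  # example Etest readings (µg/ml)
    print(mic, interpret_vancomycin_mic(mic))
```

In practice the phenotypic category is then cross-checked against the van genotype determined by PCR, since vanA and vanB typically confer high-level resistance while vanC gives low-level, intrinsic resistance.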