Critical Aspects of Safety and Loss Prevention

About this ebook

Critical Aspects of Safety and Loss Prevention reflects the author's experience in management and in safety operations. The book is a collection of almost 400 thoughts and observations on safety and loss prevention, illustrated by accounts of accidents. The items, mostly short, are arranged alphabetically and cross-references are provided. The accident reports in this volume highlight ignorance, incompetence and folly, but also originality and inventiveness, in the cause of accident prevention. The book also argues for the importance of loss prevention over the traditional safety approach. It will be of interest to those who work in design, operations and maintenance, and to safety professionals.
Language: English
Release date: June 28, 2014
ISBN: 9781483192352

    Book preview

    Critical Aspects of Safety and Loss Prevention - Trevor A. Kletz


    Abbeystead

    In 1984 an explosion in a water pumping station at Abbeystead, Lancashire, killed 16 people, most of them local residents who were visiting the plant. Water was pumped from one river to another through a tunnel. When pumping was stopped, some water was allowed to drain out of the tunnel, leaving a void. Methane from the rocks below accumulated in the void and, when pumping was restarted, was pushed through vent valves into a pumphouse where it exploded.

    If the presence of methane had been suspected, or even considered possible, it would have been easy to prevent the explosion by keeping the tunnel full of water or by discharging the gas from the vent valves into the open air. In addition, smoking, the probable source of ignition, could have been prohibited in the pumping station (though we should not rely on this alone). None of these things was done because no-one realized that methane might be present. Although there were references to dissolved methane in water supply systems in published papers, they were not known to engineers concerned with water supply schemes.

    The official report¹ recommended that the hazards of methane in water supplies should be more widely known, but this will prevent the last accident rather than the next. Many more accidents have occurred because information on the hazards, though well known to some people, was not known to those concerned; the knowledge was in the wrong place. See lost knowledge, need to know and LFA, Chapter 14.

    The Courts ruled that the consulting engineers were responsible for damages as they should have foreseen that methane might be present. However, one judge said that an ordinary, competent engineer could not have foreseen the danger².

    Abdication, management by

    We have heard of management by delegation, management by participation and management by exception. More common but less often mentioned is management by abdication. This is illustrated by accident reports which say that the accident was due to human failing and that the injured man should take more care; this does nothing to prevent the accident happening again and is merely an abdication of management responsibility.

    In some factories over 50%, sometimes over 80%, of the accidents that occur are said to be due to human failing. In other factories it is 10%. There is no difference in the accidents, only in the managers. See EVHE.

    Another example of management by abdication is turning a blind eye.

    Aberfan

    This village in South Wales was the scene of one of Britain’s worst industrial accidents. In 1966 a colliery waste tip collapsed; a school lay in its path and the 144 people killed were mainly children. The immediate cause was the construction of the tip over a stream but the underlying causes were:

    • A failure to learn from the past. Forty years earlier the causes of tip instability were recognized and warned against but the warning went unheeded, as none of the earlier collapses caused any loss of life.

    • A failure to inspect adequately. There were no regular inspections of the tip and when it was inspected only the tipping equipment was looked at, not the tip itself.

    • A failure to employ competent and well-trained people. Tips were the responsibility of mechanical, not civil, engineers and they received no training on choice of sites or inspection. The official report¹ said, ‘it was the blind leading the blind in a system inherited from the blind’.

    See alertness and LFA, Chapter 13.

    Absolute requirements

    Under UK safety legislation employers are not required to do everything possible to prevent an accident, only what is ‘reasonably practicable’. However, there are some absolute requirements. For example, dangerous machinery must be securely fenced (guarded) even if the chance of anyone being injured is low and the cost of fencing is high.

    The difference between the two approaches is not as great as it seems at first sight. To quote from a Factory Inspector, ‘… inspectors have been reluctant to press for an absolute standard of fencing as required by the statute where … the consequences of achieving that standard would mean that the machine became unworkable… This Nelsonian approach to the realities of industrial life, relying as it does on the experience and judgement of inspectors, has generally proved satisfactory. Indeed, so successful have been these informal (and in some cases formal) arrangements that there has been little desire to amend the primary legislation to reflect the need for a more pragmatic approach to machinery safety¹.’

    Abstractions

    For children, abstractions do not exist. Concrete things like chairs and tables and trees and gardens are real, and so are actions like running or shouting, but abstractions are not.

    As we get older we learn to use abstract thought, but we often forget that children are right: abstractions do not really exist but are only a convenient shorthand to simplify our thoughts and conversations.

    Thus, it is often convenient to talk about attitudes or policies. We say that someone’s attitude to safety is wrong, meaning that he deals with safety matters in what we think is the wrong way. Instead of trying to change his attitude – difficult because it does not exist – let us discuss his problems with him and try to persuade him to deal with them in a different way. If we are successful we may say that his attitude has changed; so it has, but not as the result of a direct, head-on attack.

    Attitude and policy are examples of what philosophers call an epiphenomenon, something that does not exist on its own but only as a sort of glow or halo around other things. If you want to get a halo you don’t try to make one or buy one; instead you behave in a saint-like way and hope that a halo will appear, and so it goes with attitudes.

    Generalizations such as the chemical industry, technology or modern youth do not really exist. Critics blame the chemical industry (or technology) for causing pollution or for making chemical weapons but the industry (or technology) does not have a mind of its own. There are only individual companies, made up of individual people, who have different aims, morals, etc.

    The headings in this book include many abstractions (cause, perception of risk, perspective and so on) as they provide convenient headings for discussing related subjects.

    Acceptable risk

    A phrase used, mainly by engineers, to describe a risk which is so small, compared with all the other risks to which we are exposed, that we would not be justified in using our resources of time and money to reduce it even further.

    However, the phrase is best avoided when talking to a wider audience as it may cause them to switch off. ‘What right have you’, they may say, ‘to decide what risk is acceptable to me?’ and, of course, we have no right to decide what risk is acceptable to others. We should never knowingly fail to act when someone else is at risk but we cannot do everything at once. We have to do some things first and others later so we should talk about priorities rather than acceptable risks. Everybody has problems with priorities and if we talk about them we are more likely to keep the attention of our audience.

    The phrase ‘tolerable risk’ may be more acceptable than ‘acceptable risk’ and is used in the official UK publication, The Tolerability of Risk from Nuclear Power Stations¹.

    See Canvey Island, risk criteria, fatal accident rate and unlikely hazards.

    Access

    ‘As he climbed down he held on to what he thought was a fixed part of the unit. It was not’¹.

    This quotation reminds us that accidents can occur because people think about the hazards of carrying out a job but not about the hazards of getting to the site and back. For example, a railwayman was seriously injured while walking down the track at night to the site of an engineering job. In their report the Railway Inspectorate said that ‘no thought appears to have been given to the safe means of access for train crews required to reach the site of engineering work at night’.

    A repair had to be carried out to a pipeline on a large pipebridge. There was a walkway on one side of the pipebridge but the line to be repaired was on the far side. Scaffolding was erected so that the repair could be carried out safely but access to the scaffolding was a problem. A ladder would have blocked a roadway and traffic would have had to be diverted so instead the two men who were carrying out the repair were asked to crawl, on planks, between the pipes, to reach the scaffolding. They got there without trouble, but the job went wrong, there was an unexpected release of carbon monoxide gas from the pipe and one of the men was overcome. The rescuers had a difficult job dragging him through the pipebridge to the walkway; fortunately he made a complete recovery.

    People have been overcome inside vessels. When authorizing entry to a vessel or other confined space always ask how the person entering will be rescued if he collapses inside. If a vessel is entered from the top through a manhole a rope is not usually adequate. One man cannot pull another out of a manhole on a rope; a hoist is usually needed.

    According to the UK Factories Act, Section 30 the diameter of a manhole must be at least 18 inches if dangerous fumes are liable to be present. (If the manhole is oval the minimum size is 18 inches by 16 inches.) In fact, it is difficult to get through an 18 inch manhole wearing breathing apparatus or protective clothing and many companies specify a minimum diameter of 24 inches nominal (22.5 inches actual).

    Accident

    An accident is often defined as something that happens by chance and is beyond control. If this is so, then there are very few, if any, accidents in industry (or on the roads or in the home). Most accidents are predictable. I know that during the coming year, in every large chemical works and in many small ones, a tank will be sucked in, a tanker will be overfilled and another will drive away before the filling or emptying hose has been disconnected. A man will be injured while disconnecting a hose and someone will open up the wrong pipeline¹. (See identification of equipment.) I do not know exactly when or where but I am sure these accidents will occur. More serious accidents are also predictable but they occur less often².

    Most accidents are preventable. We know how to prevent them – or someone does if we do not – but we lack the will to do so, or to find out how to do so. We cannot make gold from lead because we do not know how. In contrast, accidents occur not because we lack knowledge but because we lack energy, drive and commitment.

    In 1884 the Board of Trade’s first report on the working of the Boiler Explosions Act said, ‘The terms inevitable accident and accident are entirely inapplicable to these explosions … So far from the explosions being accidental, the only accidental thing about them is that the explosions should have been so long deferred’.

    The Shorter Oxford Dictionary defines an accident as ‘Anything that happens’ which is a better definition than the usual one but too broad. I prefer ‘An undesired event that results in harm to people, damage to property or loss to process’ (F Bird).

    Some companies call an incident an accident only when someone is injured. If no one is injured they call it a dangerous occurrence or dangerous incident. It is good practice to call them all accidents in order to draw attention to the fact that it is often a matter of chance whether or not injuries occur and that the investigation and follow up should be the same in each case.

    See mechanical accidents, repeated accidents, (visit accident) sites and triangles.

    Accident chains

    See chains.

    Accident investigation

    Craven,¹ an experienced investigator of fires and explosions, suggests that accident investigation should be split into seven steps, to which I have added an eighth. Some of the comments are my own.

    Remit

    It should be made clear to the investigator or investigating team that it is not their job to allocate blame (unless there has been arson, horseplay or reckless indifference to the safety of others). If witnesses feel that they, or their fellow-workers, may be blamed they will keep quiet, we will never know what happened and we will be unable to prevent it happening again. A tolerant attitude towards errors of judgement or omission is a price worth paying to find out the facts. See Australia and LFA, Introduction. Make sure that the relevant authorities, and the insurance company if appropriate, have been informed. (In the UK many accidents have to be reported to the Health and Safety Executive.) The authorities and the insurers may wish to carry out their own investigations. If possible the various investigators should work together but do not delay if the others are slow to arrive.

    A brief survey

    To see what there is to see.

    The state of the plant before the accident

    Was it normal or abnormal? If the investigator is not familiar with the plant and process he should look at plant descriptions, photographs, drawings and similar plants.

    Examination of damage

    Nothing should be moved, unless essential to make the plant safe, until the investigators, and any experts they wish to call in, have seen it and photographs have been taken. (See (visit accident) sites).

    Interviews with witnesses

    Do not put ideas into their minds. Avoid questions to which the answer is ‘yes’ or ‘no’. It is easier for a witness to say ‘yes’ or ‘no’ than describe what they think happened, especially when they are tired or shocked. (See falsification and memory.)

    Research and analysis

    It may be necessary to ask for metallurgical, chemical or other tests to be carried out or to look for information in libraries or company files. In particular a search should be made for reports of similar accidents that have occurred before, on the same plant or elsewhere. Computerized data bases make this searching easier than in the past but the memories of people with long experience in the plant and industry are often better. ‘Old boy’ relationships with people in other companies can be valuable.

    The recommendations

    Craven says that these are not part of the investigation but a later stage, possibly carried out by different people. Most of us consider the recommendations to be the most important part of the investigation; if we cannot make any, the whole exercise is pointless.

    Investigators should look beyond the immediate technical recommendations and see if they can find ways of avoiding the hazard and of improving the management system. Accident investigation is like peeling an onion: beneath each layer of causes and recommendations there are other, deeper layers. For example, consider the fire described under amateurism. The immediate cause was a missing slip-plate and the obvious recommendations concern improved procedures for controlling slip-plating. However, the hazard could have been reduced by a more spacious layout, by installing underground drains and by avoiding a mixture of series and parallel units. The investigators should also ask if it was essential to have so much flammable material in the plant (‘What you don’t have, can’t leak’ – see intensification) and if it had to be so hot. Even if it is impossible to carry out these recommendations on the existing plant they should be noted for the future.

    Finally, why were slip-plating procedures so poor? Did the managers not know what was going on or did they know but not realize that the procedures were poor? Why was the design so poor? Why did those concerned not learn from the other companies in the group? (See insularity.)

    LFA analyses some accidents in detail and shows that we can learn many lessons, many more than we usually do, from the known facts. We pay a high price for accidents and then fail to use all the gold that is in the mine. See chains, Challenger, dangerous occurrence and Zeebrugge.

    The report

    See accident reports.

    Finally, plan ahead. Draw up a procedure for accident investigation so that everyone knows what to do when one occurs.

    Accident-prone

    This term is used to describe people who, as a result of their personal failings, have more than their fair share of accidents. We all know clumsy people who are always dropping things but in industry, especially the process industries, it is generally agreed that accident-prone people are responsible for only a small proportion of the total number of accidents. See EVHE, §4.3.

    Some people will have more than their fair share of accidents by chance. Suppose that in a factory employing 200 people there are 100 accidents in a year. There are not enough accidents to go round, so many people will have no accident. If the accidents are distributed at random then the Poisson equation shows that:

    • 121 people will have no accidents

    • 61 will have one accident

    • 15 will have two accidents

    • 3 will have three or more accidents.

    The last three people are not accident-prone, just unlucky. To call them accident-prone would be like calling a die loaded because three sixes came up in a row. To prove that a group of people is accident-prone we have to show that they have more accidents than we would expect by chance¹.
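    The figures above follow from the Poisson distribution with a mean of 0.5 accidents per person per year (100 accidents shared among 200 people). As an illustration only (this sketch is not part of the original text and the names in it are mine), a few lines of Python reproduce them:

        # Reproduce the Poisson figures quoted above: 200 people, 100 accidents,
        # i.e. a mean of 0.5 accidents per person per year.
        from math import exp, factorial

        people = 200
        mean = 100 / people  # 0.5 accidents per person per year

        def poisson(k, lam):
            """Probability of exactly k accidents when the expected number is lam."""
            return lam ** k * exp(-lam) / factorial(k)

        p0, p1, p2 = (poisson(k, mean) for k in range(3))
        print(round(people * p0))                  # about 121 people with no accidents
        print(round(people * p1))                  # about 61 with one accident
        print(round(people * p2))                  # about 15 with two accidents
        print(round(people * (1 - p0 - p1 - p2)))  # about 3 with three or more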

    If people are really accident-prone this may be due to physical problems such as poor sense–muscle coordination, or to personality. Accident-prone people are often insubordinate, excitable and extrovert and are often absent due to sickness. In some cases they may be so unsuitable for their occupation that they may have to be moved. However, ‘Let us beware lest the concept of the accident-prone person be stretched beyond the limits within which it can be a fruitful idea. We should indeed be guilty of a grave error if for any reason we discouraged the manufacture of safe machinery’².

    It is not very helpful to talk about accident-prone people, but it is helpful to try to identify accident-prone plants or systems of work. Many plants and systems contain traps for those who work on them. It is no use telling people to take more care and avoid the traps. A trap is, by definition, something people fall into. We should accept people as we find them and try to change plant designs and methods of working so as to remove opportunities for error (or to protect against the consequences). See friendly plants, human failing and EVHE.

    It is easier to match the job to the man than the man to the job.

    Accident reports

    An accident report should tell us:

    1. What happened.

    2. Why it happened.

    3. What we should do differently in future to prevent it happening again, on other plants as well as on the plant where it actually occurred.

    4. Who should make the changes.

    5. When the changes will be complete. The report can then be brought forward at this time.

    6. What the changes will cost.

    Item 3 is, of course, the most important but it is surprising how many reports fail to give this vital information or tell us only some of the actions we ought to take. Many reports discuss only the immediate technical causes of an accident but not ways of avoiding the hazards or ways of improving the management system. See accident investigation and chains.

    When we investigate an accident we often find a lot wrong, faults that could cause an accident though they did not do so. Recommendations should cover these.
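    As an illustration only, the six headings above, together with the recommendations arising from other faults found, can be captured in a simple tracking record so that a report can be brought forward when its actions fall due. The field names below are hypothetical and do not come from the book; a sketch in Python:

        # Hypothetical sketch of a report-tracking record based on the six headings
        # listed above; none of these field names appear in the original text.
        from dataclasses import dataclass, field
        from datetime import date
        from typing import List

        @dataclass
        class Recommendation:
            action: str            # what should be done differently in future (item 3)
            owner: str             # who should make the change (item 4)
            due: date              # when the change will be complete (item 5)
            estimated_cost: float  # what the change will cost (item 6)
            complete: bool = False

        @dataclass
        class AccidentReport:
            what_happened: str     # item 1
            why_it_happened: str   # item 2
            recommendations: List[Recommendation] = field(default_factory=list)

            def bring_forward(self, today: date) -> List[Recommendation]:
                """Recommendations that are due (or overdue) and not yet complete."""
                return [r for r in self.recommendations
                        if not r.complete and r.due <= today]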

    Reports should be made available to all those who have similar hazards and may be able to profit from our misfortunes, in other companies as well as our own. Do not send them more information than they need or the essential message may not be recognized amongst a mass of detail. The ideal is two reports: one giving the full story, for those who want to see it, and the other drawing attention to the essentials. However, do not make them too brief. Include enough detail to make the story convincing or, as W S Gilbert says, to give ‘verisimilitude to an otherwise bald and unconvincing narrative’.

    See carelessness, communication, falsification, honesty, publication, smoke screens and LFA.

    Accident statistics

    In many companies the safety officer spends a lot of his time preparing complex tables of accident data. For example, one annual report contains 12 pages of tables in which nine types of accident (e.g. number of lost-time accidents, number of alternative work accidents, number of minor accidents, number of fires, lost-time accident frequency rate etc.) are tabulated against 11 variables: location, age, service, occupation, nature of accidents (e.g. falls on level, falls from structure, falls from stairways – 34 headings), cause, type of injury (19 headings), part of body injured etc. In some cases separate figures are quoted for each quarter and one page compares the previous five years.

    No conclusions are drawn from all these data and no recommendations are made. Nevertheless returns of this sort are not unusual. I have seen a 39 page report from one large oil company. Who studies these data and why are they produced?

    The theory is that a detailed analysis of the data will show if there has been an increase in, say, accidents involving ladders, especially to electricians, or an increase in hand injuries, especially to men with short service, allowing us to take action to reverse the increase. In practice, conclusions of this sort rarely, if ever, come out of the data. If there has been an increase in ladder accidents the managers and the safety officer should have picked this up from the accident reports as they come in and by talking to people as they go round the site. No magic remedies for accidents will ever come from analysing figures. Most of us already know what we need to do. We just lack the will to get on with it. See Grimaldi and old-timer.

    I suspect that these accident statistics, prepared with so much labour, are given only a quick glance and then filed, never to see daylight again. The safety officer may be preparing them because he thinks the managers want them, while they wonder why he spends so much time in the office instead of getting out on the site.

    Obviously we do need a few figures to show us whether or not the accident record is improving and how it compares with other companies (see comparisons), and it is true that no single figure, such as the lost-time accident rate, will suffice. Figures that I suggest should be reported are:

    • Lost-time accident rate.

    • Fatal accident rate (even in large companies only a five-year moving average is meaningful).

    • Minor accident rate.

    • Damage (insured and uninsured) and consequential loss.

    • A measure of the results of safety audits, such as that provided by the five star grading system. This sets out the objectives or elements of a successful safety programme and awards marks for the standard attained. (See criteria.)

    Do not congratulate people on an improvement, or blame them for a worsening performance, unless it is statistically significant. For example, suppose a plant averages seven accidents per year; the number would have to fall to two before we could be 90% confident that the accident rate had fallen. See Lees, §27.5 and Myths, §34.
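    The ‘seven down to two’ figure can be checked with a simple test for comparing two yearly accident counts. One common choice (an assumption on my part; the text does not say which test lies behind its figure) is the conditional binomial test: if the underlying rates are equal, this year’s share of the combined total is binomially distributed with p = 0.5. A short Python sketch:

        # A minimal sketch, assuming a one-sided conditional binomial test for
        # comparing two Poisson counts; the book does not specify which test it uses.
        from math import comb

        def one_sided_p(previous: int, current: int) -> float:
            """P(this year's count is `current` or fewer) if the true rates are equal."""
            n = previous + current
            return sum(comb(n, k) for k in range(current + 1)) / 2 ** n

        print(one_sided_p(7, 3))  # ~0.17: not yet significant at the 90% level
        print(one_sided_p(7, 2))  # ~0.09: significant at the 90% level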

    Another pitfall is the assumption that all accidents are reported. A map of the distribution of bats in Yorkshire showed a large number of sightings near Helmsley but it is probably bat-watchers rather than bats that are common in the area. If one factory reports more dangerous occurrences than other factories, perhaps they have more or perhaps they are more honest. See under-reporting.

    A senior manager who looks only at accident statistics will not know what is going on. He should also look at the detailed reports on at least some of the accidents that occur. See management.

    Accuracy

    See confidence limits.

    Acids

    See corrosive chemicals.

    Action

    The job of the safety professional is not complete when he has made his recommendations, or even when they have been accepted. It is not complete until they have been carried out and shown to be effective. He should not lose interest in the problem until this stage is reached.

    Accident reports should be brought forward after an appropriate time to see if their recommendations for action have been carried out. Hazard and operability study reports should be brought forward after two or three months to check that team members have carried out the actions they agreed to take.

    What instigates action in loss prevention? Undoubtedly the commonest and most effective motive is a serious incident. The nearer the incident the greater its effect. A fire in our own company has more effect than one in another company which in turn has more effect than one in another country.

    Nevertheless in many cases awareness of a problem has resulted in action before an accident occurred. I have described elsewhere¹ the ‘springs of action’ of four major changes in which I was involved:

    1. Improvements in the system for the preparation of equipment for maintenance: this followed a serious fire in the company concerned.

    2. Improvements in the methods used for testing trips and alarms: this change resulted from recognition of a problem, our increasing dependence on trips and
