
Quality in management

In this file, you can find useful information about quality in management, such as quality in
management forms, tools for quality in management, and quality in management strategies. If you
need more assistance with quality in management, please leave your comment at the end of the file.
Other useful material for quality in management:
qualitymanagement123.com/23-free-ebooks-for-quality-management
qualitymanagement123.com/185-free-quality-management-forms
qualitymanagement123.com/free-98-ISO-9001-templates-and-forms
qualitymanagement123.com/top-84-quality-management-KPIs
qualitymanagement123.com/top-18-quality-management-job-descriptions
qualitymanagement123.com/86-quality-management-interview-questions-and-answers

I. Contents of quality in management


==================
Since this manual is aimed at improving the performance of a laboratory, the activities involved focus on
the term "quality". The quality of the product, in the present case analytical results, should obviously be
acceptable. To establish whether the product fulfils the quality requirements, these requirements have to be
defined first. Only then can it be decided whether the product is satisfactory and, if not, what corrective
actions need to be taken.

1.1 What is Quality?


The term "quality" has a relative meaning. This is expressed by the ISO definition: "The totality of
features and characteristics of a product or service that bear on its ability to satisfy stated or implied
needs". In simpler words, one can say that a product has good quality when it "complies with the
requirements specified by the client". When projected on analytical work, quality can be defined
as "delivery of reliable information within an agreed span of time under agreed conditions, at agreed
costs, and with necessary aftercare". The "agreed conditions" should include a specification as to the
precision and accuracy of the data, which is directly related to "fitness for use" and which may differ for
different applications. Yet, in many cases the reliability of data is not questioned and the request for
specifications is omitted. Many laboratories work according to established methods and procedures which
are not readily changed and have inherent default specifications. Moreover, not all future uses of the data
and reports can be foreseen so that specifications about required precision and accuracy cannot even be
given. Consequently, this aspect of quality is usually left to the discretion of the laboratory. However, all
too often the embarrassing situation exists that a laboratory cannot evaluate and account for its quality
simply because the necessary documentation is lacking.

In the ensuing discussions numerous activities aimed at maintaining the production of quality are dealt
with. In principle, three levels of organization of these activities can be distinguished. From the top down
these levels are:
1. Quality Management (QM)
2. Quality Assurance (QA)
3. Quality Control (QC)

1.2 Quality Management


Quality Management is the assembly and management of all activities aimed at the production of quality
by organizations of various kinds. In the present case this implies the introduction and proper running of a
"Quality System" in laboratories. A statement of objectives and policy to produce quality should be made
for the organization or department concerned (by the institute's directorate). This statement also identifies
the internal organization and responsibilities for the effective operation of the Quality System.
Quality Management can be considered a somewhat wider interpretation of the concept of "Good
Laboratory Practice" (GLP). Therefore, inevitably the basics of the present Guidelines largely coincide
with those of GLP. These are discussed below in Section 1.5.
Note. An even wider concept of quality management is presently coming into vogue: "Total Quality
Management" (TQM). This concept includes additional aspects such as leadership style, ethics of the
work, social aspects, relation to society, etc. For an introduction to TQM the reader is referred to Parkany
(1995).

1.3 Quality Assurance


Proper Quality Management implies the consistent implementation of the next level: Quality
Assurance. The ISO definition reads: "the assembly of all planned and systematic actions necessary to
provide adequate confidence that a product, process, or service will satisfy given quality
requirements." The result of these actions aimed at the production of quality, should ideally be checked by
someone independent of the work: the Quality Assurance Officer. If no QA officer is available, then
usually the Head of Laboratory performs this job as part of his quality management task. In case of
special projects, customers may require special quality assurance measures or a Quality Plan.

1.4 Quality Control


A major part of the quality assurance is the Quality Control, defined by ISO as "the operational
techniques and activities that are used to satisfy quality requirements." An important part of the quality
control is the Quality Assessment: the system of activities to verify if the quality control activities are
effective, in other words: an evaluation of the products themselves.

Quality control is primarily aimed at the prevention of errors. Yet, despite all efforts, it remains inevitable
that errors are made. Therefore, the control system should have checks to detect them. When errors or
mistakes are suspected or discovered, it is essential that the "Five Ws" are traced:
- what error was made?
- where was it made?
- when was it made?
- who made it?
- why was it made?
Only when all these questions are answered can proper action be taken to correct the error and prevent
the same mistake from being repeated.
The techniques and activities involved in Quality Control can be divided into four levels of operation:
1. First-line control: Instrument performance check.
2. Second-line control: Check of calibration or standardization.
3. Third-line control: Batch control (control sample, identity check).
4. Fourth-line control: Overall check (external checks: reference samples, interlaboratory exchange
programmes).
Because the first two control levels both apply to the correct functioning of the instruments they are often
taken together and then only three levels are distinguished. This designation is used throughout the
present Guidelines:
1. First-line control: Instrument check / calibration.
2. Second-line control: Batch control
3. Third-line control: External check
It will be clear that producing quality in the laboratory is a major enterprise requiring a continuous human
effort and input of money. The rule of thumb is that 10-20% of the total costs of analysis should be spent on
quality control. Therefore, for quality work at least four conditions should be fulfilled:
- means are available (adequate personnel and facilities)
- efficient use of time and means (costs aspect)
- expertise is available (answering questions; aftercare)
- upholding and improving level of output (continuity)
In quality work, management aspects and technical aspects are inherently interwoven, and for a clear
insight into and proper functioning of the laboratory these aspects have to be broken down into their
components. This is done in the ensuing chapters of this manual.

1.5 Good Laboratory Practice (GLP)


Quality Management in the present context can be considered a modern version of the hitherto much used
concept "Good Laboratory Practice" (GLP) with a somewhat wider interpretation. The OECD Document
defines GLP as follows: "Good Laboratory Practice (GLP) is concerned with the organizational process
and the conditions under which laboratory studies are planned, performed, monitored, recorded, and
reported."
Thus, GLP requires a laboratory to work according to a system of procedures and protocols. This
implies that the organization of the activities and the conditions under which these take place are controlled,
reported and filed. GLP is a policy for all aspects of the laboratory which influence the quality of the
analytical work. When properly applied, GLP should then:
- allow better laboratory management (including quality management)
- improve efficiency (thus reducing costs)
- minimize errors
- allow quality control (including tracking of errors and their cause)
- stimulate and motivate all personnel
- improve safety
- improve communication possibilities, both internally and externally.
The result of GLP is that the performance of a laboratory is improved and its operation effectively
controlled. An important aspect is also that the standards of quality are documented and can be
demonstrated to authorities and clients. This results in an improved reputation for the laboratory (and for
the institute as a whole). In short, the message is:
- say what you do
- do what you say
- do it better
- be able to show what you have done
The basic rule is that all relevant plans, activities, conditions and situations are recorded and that these
records are safely filed and can be produced or retrieved when necessary. These aspects differ strongly in
character and need to be attended to individually.
As an assembly, the documents involved constitute a so-called Quality Manual. This then comprises all
relevant information on:
- Organization and Personnel
- Facilities
- Equipment and Working materials
- Analytical or testing systems
- Quality control
- Reporting and filing of results.

Since institutions having a laboratory are of divergent natures, there is no standard format and each has to
make its own Quality Manual. The present Guidelines contain examples of forms, protocols, procedures
and artificial situations. They need at least to be adapted and many new ones will have to be made
according to the specific needs, but all have to fulfil the basic requirement of usefulness and verifiability.
As already indicated, the guidelines for Quality Management given here are mainly based on the
principles of Good Laboratory Practice as they are laid down in various relevant documents such as ISO
and ISO/IEC guides, ISO 9000 series, OECD and CEN (EN 45000 series) documents, national standards
(e.g. NEN standards)*, as well as a number of text books. The consulted documents are listed in the
Literature. Use is also made of documents developed by institutes which have obtained accreditation or
are working towards this. This concerns mainly so-called Standard Operating Procedures (SOPs) and
Protocols. Sometimes these documents are hard to acquire as they are classified information for reasons
of competitiveness. The institutes and persons which cooperated in the development of these Guidelines
are listed in the Acknowledgements.
* ISO: International Organization for Standardization; IEC: International Electrotechnical Commission; OECD:
Organisation for Economic Co-operation and Development; CEN: European Committee for
Standardization; EN: European Standard; NEN: Dutch Standard.

==================

II. Quality management tools

1. Check sheet
The check sheet is a form (document) used to collect data in real time at the location where the data is
generated. The data it captures can be quantitative or qualitative. When the information is quantitative, the
check sheet is sometimes called a tally sheet.
The defining characteristic of a check sheet is that data are recorded by making marks ("checks") on it. A
typical check sheet is divided into regions, and marks made in different regions have different significance.
Data are read by observing the location and number of marks on the sheet.
Check sheets typically employ a heading that answers the Five Ws:
- Who filled out the check sheet
- What was collected (what each check represents, an identifying batch or lot number)
- Where the collection took place (facility, room, apparatus)
- When the collection took place (hour, shift, day of the week)
- Why the data were collected
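
As an illustration of the tally idea, the following minimal sketch (not part of the original text; the locations and defect types are invented) counts the "checks" recorded per location and defect type:

```python
from collections import Counter

# Each observation is recorded as a "check": (where, what) at the moment it occurs.
# The locations and defect types below are hypothetical examples.
observations = [
    ("Room A", "contaminated sample"),
    ("Room A", "labelling error"),
    ("Room B", "labelling error"),
    ("Room A", "contaminated sample"),
    ("Room B", "instrument drift"),
]

tally = Counter(observations)  # number of checks per (location, defect type)

print("Where   | What                | Count")
for (where, what), count in sorted(tally.items()):
    print(f"{where:<7} | {what:<19} | {count}")
```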

2. Control chart
Control charts, also known as Shewhart charts
(after Walter A. Shewhart) or process-behavior
charts, in statistical process control are tools used
to determine if a manufacturing or business
process is in a state of statistical control.
If analysis of the control chart indicates that the
process is currently under control (i.e., is stable,
with variation only coming from sources common
to the process), then no corrections or changes to
process control parameters are needed or desired.
In addition, data from the process can be used to
predict the future performance of the process. If
the chart indicates that the monitored process is
not in control, analysis of the chart can help
determine the sources of variation, as this will
result in degraded process performance.[1] A
process that is stable but operating outside of
desired (specification) limits (e.g., scrap rates
may be in statistical control but above desired
limits) needs to be improved through a deliberate
effort to understand the causes of current
performance and fundamentally improve the
process.
The control chart is one of the seven basic tools of quality control.[3] Typically control charts are used for
time-series data, though they can also be used for data that have logical comparability (i.e. you want to
compare samples that were all taken at the same time, or the performance of different individuals);
however, the type of chart used to do this requires consideration.
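
As a rough sketch of the underlying calculation (an illustrative example, not taken from the source; the measurement values are invented, and the limits are simplified to the mean plus or minus three sample standard deviations rather than the moving-range estimate commonly used for individuals charts):

```python
import statistics

# Hypothetical repeat measurements of a monitored process parameter
measurements = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.2]

centre = statistics.mean(measurements)
sd = statistics.stdev(measurements)   # sample standard deviation

ucl = centre + 3 * sd                 # upper control limit
lcl = centre - 3 * sd                 # lower control limit
print(f"centre line = {centre:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")

# Points outside the limits suggest the process is not in statistical control
outliers = [x for x in measurements if not lcl <= x <= ucl]
print("points outside control limits:", outliers)
```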

3. Pareto chart
A Pareto chart, named after Vilfredo Pareto, is a type
of chart that contains both bars and a line graph, where
individual values are represented in descending order
by bars, and the cumulative total is represented by the
line.
The left vertical axis is the frequency of occurrence,
but it can alternatively represent cost or another
important unit of measure. The right vertical axis is
the cumulative percentage of the total number of
occurrences, total cost, or total of the particular unit of
measure. Because the values are in decreasing order, the cumulative function is a concave function. For
example, if the first three causes of late arrivals account for 78% of the total, it is sufficient to address those
three issues to reduce late arrivals by 78%.
The purpose of the Pareto chart is to highlight the
most important among a (typically large) set of
factors. In quality control, it often represents the most
common sources of defects, the highest occurring type
of defect, or the most frequent reasons for customer
complaints, and so on. Wilkinson (2006) devised an
algorithm for producing statistically based acceptance
limits (similar to confidence intervals) for each bar in
the Pareto chart.
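
A minimal sketch of how the descending bars and the cumulative line are derived (illustrative only; the defect counts are invented):

```python
# Hypothetical defect counts per cause
causes = {"mislabelling": 42, "contamination": 25, "late delivery": 18,
          "instrument drift": 9, "other": 6}

# Sort causes in descending order of frequency (the bars of the Pareto chart)
ordered = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)

total = sum(causes.values())
cumulative = 0
print(f"{'cause':<18} {'count':>5} {'cum %':>7}")
for cause, count in ordered:
    cumulative += count
    # The running percentage traces the cumulative line of the chart
    print(f"{cause:<18} {count:>5} {100 * cumulative / total:>6.1f}%")
```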

4. Scatter plot Method


A scatter plot, scatterplot, or scattergraph is a type of
mathematical diagram using Cartesian coordinates to
display values for two variables for a set of data.
The data is displayed as a collection of points, each
having the value of one variable determining the position
on the horizontal axis and the value of the other variable
determining the position on the vertical axis.[2] This kind
of plot is also called a scatter chart, scattergram, scatter
diagram,[3] or scatter graph.
A scatter plot is used when a variable exists that is under the control of the experimenter. If a parameter
exists that is systematically incremented and/or decremented by the experimenter, it is called the control
parameter or independent variable and is customarily plotted along the horizontal
axis. The measured or dependent variable is customarily
plotted along the vertical axis. If no dependent variable
exists, either type of variable can be plotted on either axis
and a scatter plot will illustrate only the degree of
correlation (not causation) between two variables.
A scatter plot can suggest various kinds of correlations between variables with a certain confidence
interval. For example, to examine the relation between weight and height, weight would be plotted on the
x-axis and height on the y-axis. Correlations may be
positive (rising), negative (falling), or null (uncorrelated).
If the pattern of dots slopes from lower left to upper right,
it suggests a positive correlation between the variables
being studied. If the pattern of dots slopes from upper left
to lower right, it suggests a negative correlation. A line of
best fit (alternatively called 'trendline') can be drawn in
order to study the correlation between the variables. An
equation for the correlation between the variables can be
determined by established best-fit procedures. For a linear correlation, the best-fit procedure is known as
linear regression and is guaranteed to generate a correct solution in a finite time. No universal best-fit
procedure is guaranteed to generate a correct solution for arbitrary relationships. A scatter plot is also very
useful when we
wish to see how two comparable data sets agree with each
other. In this case, an identity line, i.e., a y=x line, or an
1:1 line, is often drawn as a reference. The more the two
data sets agree, the more the scatters tend to concentrate in
the vicinity of the identity line; if the two data sets are
numerically identical, the scatters fall on the identity line
exactly.
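
As a brief illustration of the best-fit step (the paired values below are invented), an ordinary least-squares line and the correlation coefficient can be computed as follows:

```python
import numpy as np

# Hypothetical paired observations: x = independent variable, y = measured variable
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8, 12.1])

# Fit a straight line y = slope * x + intercept by least squares
slope, intercept = np.polyfit(x, y, deg=1)

# Pearson correlation coefficient indicates the strength and direction of the linear relation
r = np.corrcoef(x, y)[0, 1]

print(f"trendline: y = {slope:.2f} x + {intercept:.2f}, r = {r:.3f}")
```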

5. Ishikawa diagram
Ishikawa diagrams (also called fishbone diagrams,
herringbone diagrams, cause-and-effect diagrams, or
Fishikawa) are causal diagrams created by Kaoru
Ishikawa (1968) that show the causes of a specific event.
[1][2] Common uses of the Ishikawa diagram are product
design and quality defect prevention, to identify potential
factors causing an overall effect. Each cause or reason for
imperfection is a source of variation. Causes are usually
grouped into major categories to identify these sources of
variation. The categories typically include:
- People: Anyone involved with the process
- Methods: How the process is performed and the specific requirements for doing it, such as policies, procedures, rules, regulations and laws
- Machines: Any equipment, computers, tools, etc. required to accomplish the job
- Materials: Raw materials, parts, pens, paper, etc. used to produce the final product
- Measurements: Data generated from the process that are used to evaluate its quality
- Environment: The conditions, such as location, time, temperature, and culture in which the process operates
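
A small sketch (not part of the original text; the effect and causes are hypothetical) showing how candidate causes can be grouped under these categories before drawing the diagram:

```python
# Hypothetical causes for the effect "out-of-specification analytical result",
# grouped under the usual Ishikawa categories.
causes = {
    "People":       ["insufficient training", "sample mix-up"],
    "Methods":      ["outdated SOP", "ambiguous procedure"],
    "Machines":     ["balance out of calibration"],
    "Materials":    ["expired reagent"],
    "Measurements": ["rounding errors in data entry"],
    "Environment":  ["temperature fluctuations in the lab"],
}

effect = "out-of-specification analytical result"
print(f"Effect: {effect}")
for category, items in causes.items():
    print(f"  {category}:")
    for cause in items:
        print(f"    - {cause}")
```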

6. Histogram method
A histogram is a graphical representation of the
distribution of data. It is an estimate of the probability
distribution of a continuous variable (quantitative
variable) and was first introduced by Karl Pearson.[1] To
construct a histogram, the first step is to "bin" the range of
values -- that is, divide the entire range of values into a
series of small intervals -- and then count how many
values fall into each interval. A rectangle is drawn with
height proportional to the count and width equal to the bin
size, so that the rectangles abut each other. A histogram may also be normalized to display relative
frequencies. It then shows the proportion of cases that fall into each of several categories, with the sum of
the heights equaling 1. The bins are usually specified as consecutive, non-overlapping intervals of a
variable. The bins (intervals) must be adjacent, and are usually of equal size.[2] The rectangles of a
histogram are drawn so that they touch each other to
indicate that the original variable is continuous.[3]
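
A short sketch of the binning and counting step (the measurement values and number of bins are assumed for illustration):

```python
import numpy as np

# Hypothetical continuous measurements
values = np.array([4.1, 4.7, 5.0, 5.2, 5.3, 5.6, 5.8, 6.0, 6.1, 6.4, 6.9, 7.3])

# Divide the range into equal-width bins and count how many values fall in each
counts, edges = np.histogram(values, bins=4)
for left, right, n in zip(edges[:-1], edges[1:], counts):
    print(f"{left:.2f} to {right:.2f}: {n}")

# Normalized form: relative frequencies that sum to 1
print("relative frequencies:", (counts / counts.sum()).round(2))
```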

III. Other topics related to Quality in management (pdf download)
quality management systems
quality management courses
quality management tools
iso 9001 quality management system
quality management process
quality management system example
quality system management
quality management techniques
quality management standards
quality management policy
quality management strategy
quality management books
