
White Paper

2015

THE ROLE OF SOFTWARE QUALITY ASSURANCE IN DO-178B/C SOFTWARE DEVELOPMENT PROCESS

Contents

Introduction
Definitions
Background
Discussion
Conclusion
Author
References
About Cyient


Introduction
DO-178B is a document published by the Radio Technical Commission for Aeronautics (RTCA) that is used as guidance for the development of software for civil certifiable airborne systems.

In December 2011, RTCA issued DO-178C to provide clarification, to address inconsistencies in DO-178B, and to introduce technology advancements in certifiable software development.

DO-178B/C requires that the software product in question satisfy specific objectives related to software quality assurance (SQA). These objectives can be fulfilled by performing specific quality assurance activities within the SQA process.

The purpose of this paper is to clarify the role of SQA in a DO-178B/C software life cycle process and to provide suggestions for fulfilling the required software product quality objectives.

Definitions
For the purpose of this paper, the terms
development and development process
are used to indicate the complete software
life cycle (including planning, design and
verification) unless otherwise specified.
The term SQE refers to software quality engineers working on both airborne software and tools.

Background
In software engineering, the interpretation
of a quality product varies depending on the
perspective.


Software developers consider software that meets the requirements to be a quality product. However, from the customers' perspective, high-quality software is software that meets their needs.
DO-178B does not provide a definition of quality. However, it provides a definition of assurance: "the planned and systematic actions necessary to provide adequate confidence and evidence that a product or process satisfies given requirements."
Based on this definition, software quality assurance for DO-178B can be considered to be the sum of all activities and actions performed to satisfy the given quality requirements. DO-178B identifies the quality requirements for airborne software by means of objectives that are listed in Annex A of the document.

This paper discusses these objectives and how to achieve them through an SQA process implementation that is compliant with DO-178B/C. The characteristics of an effective SQA organization are presented; then common process and product quality issues are presented and discussed. Finally, possible methods to improve overall process and product quality are proposed.

Discussion
Software Quality Assurance
Quality assurance is often interpreted as the sum of the evaluation activities performed on a finished product to assess its quality.

DO-178B/C Software Quality Assurance Process and Objectives
DO-178B defines the SQA process as an integral process. This means that it runs continuously throughout the software planning, development, verification, and final compliance efforts.

Fig. 1 | SQA Objectives - DO-178B vs DO-178C

DO-178B identifies the following objectives for the SQA process:
- The software development processes comply with the applicable plans and standards
- The transition criteria between software life cycle processes are satisfied
- A software conformity review is conducted

Quality assurance activities are required to ensure that plans and standards are developed and that they are followed throughout the software life cycle. The SQA records are the outcome of these activities, and they are used as evidence of compliance with the given SQA objectives.
At the end of the software development, the software quality engineers (SQEs) also conduct a software conformity review to verify that:
- The software product and associated life cycle data comply with the plans and standards
- Traceability is in place and correct
- Deviations are recorded and approved
- The executable object code can be regenerated from the archived source code (a minimal check of this kind is sketched after this list)
- The approved software can be successfully loaded on the target hardware by means of released instructions
- Problem reports comply with the software configuration management plan, and they have been evaluated and their status recorded
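For illustration, the regeneration check can be automated. The following is a minimal sketch, assuming a hypothetical make-based build and archive layout; a real project would follow its own released build instructions under configuration management.

# Minimal sketch of a regeneration check: rebuild the executable object code
# from the archived source and compare it against the released binary.
# The build command, paths, and file names are illustrative assumptions.
import hashlib
import subprocess
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_regeneration(archive_dir: Path, released_binary: Path) -> bool:
    """Rebuild from the archived source tree and compare digests."""
    # Hypothetical build step; a real project would invoke its released
    # build instructions (scripts, makefiles) kept under configuration control.
    subprocess.run(["make", "-C", str(archive_dir), "all"], check=True)
    rebuilt = archive_dir / "build" / released_binary.name
    return sha256(rebuilt) == sha256(released_binary)

if __name__ == "__main__":
    ok = verify_regeneration(Path("archive/src"), Path("release/fms_app.bin"))
    print("Executable object code regenerated identically" if ok
          else "MISMATCH: investigate the build environment and archive")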

However, DO-178B goes beyond the SQA objectives in Table A-9. The following additional SQA objectives are in Table A-1:
- Software plans comply with this document (i.e. DO-178B)
- Software plans are coordinated

To fulfill these objectives, the SQEs have to ensure that the software life cycle process and associated activities described in the plans comply with DO-178B and that the plans are consistent with one another.
DO-178C clarifies the SQA objectives. Section 8.1 has been modified by adding bullet point a., and it now reads:
The objectives of the SQA process are to obtain assurance that:
- Software plans and standards are developed and reviewed for compliance with this document and for consistency
- Software life cycle processes, including those of suppliers, comply with approved software plans and standards


- The transition criteria for the software life cycle processes are satisfied
- A conformity review of the software product is conducted
Also, the SQA objectives that were implicit in Table A-1 have been moved to Table A-9, so that all the SQA-related objectives are clearly presented and grouped in one place. The SQA objectives have also been reworded to improve clarity.

Fig. 1 gives a pictorial representation of the changes in Table A-9 of DO-178 between Rev. B and Rev. C:
- DO-178B Table A-1, objectives #6 and #7 are now combined in objective #1 of Table A-9
- DO-178B Table A-9, objective #1 is now split into objectives #2 and #3 of Table A-9
- DO-178B Table A-9, objective #2 is now objective #3
- DO-178B Table A-9, objective #3 is now objective #4
SQA process implementation
DO-178B requires SQEs to verify that the plans and standards are correctly implemented. This often leads to a process-oriented interpretation of the SQA activities. From this perspective, SQEs assure that the software development process follows defined plans and standards and that these documents are kept current. SQA activities are usually performed by filling in checklists that are then put under configuration control, becoming part of the SQA records (a minimal sketch of such a record follows the activity list below).
SQA activities include the following:
- Preparing the SQA Plan (SQAP)
- Reviewing the software plans
- Auditing the software life cycle processes
- Driving process improvements and defect elimination
- Assessing transition criteria
- Auditing life cycle data
- Auditing the environment
- Participating in the change control board
- Witnessing formal tests
- Monitoring and auditing the change control process and artifacts
- Auditing suppliers
- Approving deviations and waivers
- Performing the conformity review
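As an illustration of how such records can be structured, the following sketch shows one possible representation of an SQA checklist record placed under configuration control; the fields, questions, and activity names are assumptions, not a prescribed format.

# Illustrative sketch of an SQA checklist record as it might be captured and
# placed under configuration control; all field names are assumptions.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ChecklistItem:
    question: str          # e.g. "Do the reviewed requirements trace to system requirements?"
    compliant: bool
    finding: str = ""      # description of any non-compliance observed

@dataclass
class SQARecord:
    activity: str          # e.g. "Design review audit"
    project: str
    auditor: str
    performed_on: date
    items: List[ChecklistItem] = field(default_factory=list)

    def open_findings(self) -> List[ChecklistItem]:
        """Items that require a corrective action before closure."""
        return [i for i in self.items if not i.compliant]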
In addition to these activities, SQEs should also be in charge of assessing the readiness of the organization to undergo Stage of Involvement (SOI) audits with the certification authority.
In 1998, the FAA published the Job Aid "Conducting Software Reviews Prior to Certification"; a revised version (Rev. 1) was issued on January 16, 2004. The Job Aid itself clearly states that it should be used as a reference tool during the software review process, that it is not intended to replace DO-178B, and that it should be used in conjunction with DO-178B.
With these considerations in mind, the FAA Job Aid is very helpful in assessing the maturity of the software product and its compliance with DO-178B/C. SQEs should use it to assess the maturity of the software by applying its checklists in their entirety, not only the portion applicable to SQA objectives. The FAA Job Aid can also be used as a training aid to improve SQEs' assessment and audit skills.

The FAA has not yet published a revised version of the Job Aid that accounts for the new DO-178C; however, a revised version will probably be made available in the near future.
A purely process-oriented SQA can be inadequate to completely fulfill the DO-178B, and now DO-178C, objectives.


It should be noted that it has never been the intent of DO-178B to limit the SQA activities to process compliance verification. Objectives #6 and #7 in Table A-1 of DO-178B imply that SQA shall ensure that the software development and verification processes, and all associated activities and artifacts, are compliant with the document's objectives.

To achieve this objective, SQEs should be able to evaluate the technical content of the plans and the artifacts created during the software life cycle. SQEs need a strong technical background in order to identify possible issues. Without a technical understanding of the product, SQA cannot effectively perform its role and be respected by the development and verification organizations.

To do so effectively, SQEs should have certain prerequisites. These are discussed in the next section.

Characteristics of an effective SQA
An effective SQA organization is involved in every aspect of the software life cycle while maintaining the required independence and authority to ensure that the necessary corrective actions are implemented.

Quality is not only the responsibility of the SQEs; a strong commitment to quality must come from every engineer involved in the project. Quality starts with well-disciplined, conscientious, and talented engineers.
Most companies maintain an ISO 9000 certification as well as, in most cases, other industry-wide standards such as the SEI Capability Maturity Model Integration (CMMI) and Six Sigma. These standards provide a quality assurance framework. However, it is important that the company-wide quality goals integrate with and include the specific DO-178B/C software quality assurance objectives. This is a key factor for SQA to be effective; without it, a high-quality product cannot be obtained.


Continuous training and improvement are the other key factors in maintaining an effective SQA organization. Quality engineers should be knowledgeable in the software development and verification processes. Specific training on DO-178B/C and on the preparation of SOI audits should be provided to all SQEs to ensure adequate auditing and assessment skills.
Possible causes of ineffective SQA
If the SQA organization is not effective, the possible causes are usually found in the following three areas:
- Management support: When the company's management does not buy into the commitment to quality, it jeopardizes the authority of the quality organization. SQEs' inputs will be disregarded and corrective actions will not be taken.
- Resource allocation: An understaffed SQA organization can negatively impact the quality of the product. SQEs tend to be overworked and assigned to too many programs to effectively carry out their job.
- Training: Unqualified SQEs also impact the effectiveness of the SQA organization. If the SQA engineers lack experience in software development and verification and overall knowledge of the project they are assessing, then their inputs will be disregarded by the engineering organization and no corrective action will be taken.
That is why it is important for SQA personnel to be trained on a continual basis. SQEs should be trained in the areas of software development and verification, auditing, and assessment skills. SQEs should also be trained on the specificities of the projects they have been assigned to. This will promote deeper technical knowledge and closer interaction with the engineering team.
Examples and Considerations on Common Process and Product Issues
Various quality issues might present themselves in a software development effort. The most common are:
Process
- Requirements and design reviews improperly performed or skipped altogether
- Unrepeatable build process because it is not adequately documented
- Source code is changed without modifying requirements and design
- Configuration of the life cycle data is not maintained
- Plans and standards are not followed
- Development and verification environments are not defined and are not under configuration control
- Transition criteria are not met

Product
- Software does not meet all the requirements
- Requirements are ambiguous and/or incomplete; therefore, customers' needs are not met
- Software is not robust: it works in normal scenarios but crashes when abnormal inputs are provided
- Source code does not align with requirements and/or design
- Software is shipped to the customer with defects
Process issues can lead to product issues if not addressed. If SQA proactively participates in every phase of the software life cycle, potential issues can be identified early and corrected, minimizing the impact on the final product.


For example, incomplete or missing records for requirements or design reviews can be a symptom of inadequate or late involvement of the SQA personnel. Usually, SQA personnel should be invited to all the formal and gate reviews of a software life cycle (e.g. SSR, PDR, CDR, transition criteria reviews, etc.).
However, when the SQA organization suffers from one or a combination of the problems presented earlier in this document (e.g. understaffing, lack of technical background), the need for SQA participation can easily be forgotten. The result is that required quality assurance activities are not performed at the right moment and valuable inputs are missed. In this scenario, it is easy to spot SQA checklists and audit reports created well after the actual review took place, with the sole goal of producing all the mandated SQA records and satisfying the certification authorities.
An improperly performed requirements or design review can lead to a software product that does not meet all the requirements or is not robust. Therefore, it is important that the SQEs conducting the review have the required engineering background to evaluate the technical correctness of the life cycle data under review.

It is also important that the methods and tools used in performing quality assurance are adequate to ensure software compliance with DO-178B requirements.
In large projects with hundreds of artifacts, it is common for SQEs to perform their assessment by applying some type of sampling criterion. These criteria and their rationale have to be included in the SQAP. The SQEs' experience and technical background then become fundamental to ensuring that the right sampling pool is chosen.


For example, requirements reviews should cover every functional area of the software, including both functional and non-functional requirements. When verification artifacts are sampled, SQEs should ensure that all the verification environments and methods are represented by the chosen samples.
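One way to implement such a sampling criterion is a simple stratified selection that guarantees at least one artifact per verification environment and method. The following sketch illustrates the idea; the artifact fields and the size of the random top-up are assumptions.

# Sketch of a sampling strategy that guarantees every (environment, method)
# combination appears in the audit sample; the artifact fields are assumptions.
import random
from collections import defaultdict
from typing import Dict, List, Tuple

def sample_artifacts(artifacts: List[dict], extra: int = 5, seed: int = 0) -> List[dict]:
    """Pick at least one artifact per (environment, method) stratum,
    then top up with a few random extras for breadth."""
    rng = random.Random(seed)  # fixed seed keeps the sampling pool reproducible in the SQA record
    strata: Dict[Tuple[str, str], List[dict]] = defaultdict(list)
    for a in artifacts:
        strata[(a["environment"], a["method"])].append(a)

    sample = [rng.choice(group) for group in strata.values()]
    remaining = [a for a in artifacts if a not in sample]
    sample += rng.sample(remaining, min(extra, len(remaining)))
    return sample

# Example: test procedures run on host simulation vs. target hardware,
# verified by test or by analysis.
pool = [
    {"id": "TP-001", "environment": "host", "method": "test"},
    {"id": "TP-014", "environment": "target", "method": "test"},
    {"id": "AN-003", "environment": "host", "method": "analysis"},
]
print([a["id"] for a in sample_artifacts(pool, extra=0)])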

Checklists are the main tool for SQEs. Depending on the SQA organization and project size, these checklists are part of the project-specific SQAP or part of a company-level SQA manual.

In both cases, it is important that the checklists are designed in such a way that all the key contents of the artifact under review are covered. At the same time, checklists cannot be overly detailed, to allow for deviations and the specificities of each project.

If the checklists are part of a company-level manual, they are usually designed to be applicable to all the software projects run in the company. This can lead to overly generic checklists, with the risk that they are applied inadequately.

Another critical aspect of company-wide SQA checklists is their obsolescence. Their non-specificity and large-scale impact can create inertia against any modification or improvement. Nevertheless, the SQA organization should ensure that its plans and checklists are regularly updated and expanded to include the new techniques implemented in software projects.

Two techniques having a significant impact on modern software projects are the model-based development (MBD) approach and the increased use of automatic tools.

The RTCA Special Committee (SC-205) in charge of updating DO-178B recognized that new software development techniques can lead to new issues. Therefore, supplements have been created to provide additional guidance for specific techniques. MBD is addressed in DO-331, Model-Based Development and Verification Supplement to DO-178C and DO-278A.

In an MBD project, the software requirements and/or software design are represented by models. Quality assurance checklists shall account for the peculiarities and the additional requirements that DO-178C has for models.

When sampling requirements expressed by models and the associated engineering reviews, SQEs should verify that the models comply with the applicable software model standards and that any deviation is properly justified. The engineering reviews of requirements expressed by models should also be checked to verify the correct application of the model standard and compliance with DO-178B/C.


When compliance with model standards is verified using automatic tools, SQEs should verify that the tool output does not contain warnings or errors; neither should be present unless properly justified. For unqualified tools, the engineering review of the tool's output should also be assessed by SQA.
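A simple way to support this check is to scan the tool report for findings that are not covered by an approved justification. The sketch below assumes a hypothetical plain-text report format, message identifiers, and justification list.

# Sketch: scan a model-standards checker report for warnings/errors and flag
# anything not covered by an approved justification. The report format, message
# identifiers, and the justification list are illustrative assumptions.
import re
from pathlib import Path

FINDING = re.compile(r"^(WARNING|ERROR)\s+(\S+):\s*(.*)$")

def unjustified_findings(report: Path, justified_ids: set) -> list:
    """Return findings whose identifier has no approved justification."""
    problems = []
    for line in report.read_text().splitlines():
        m = FINDING.match(line)
        if m and m.group(2) not in justified_ids:
            problems.append(f"{m.group(1)} {m.group(2)}: {m.group(3)}")
    return problems

if __name__ == "__main__":
    approved = {"MDL-RULE-042"}  # deviations approved via the problem-report process
    for issue in unjustified_findings(Path("model_check_report.txt"), approved):
        print("Unjustified finding:", issue)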
In an MBD process using design models, the verification team is also required to perform model coverage analysis. The purpose of this activity is to determine which requirements expressed by the models were not exercised by verification based on the requirements from which the model was developed. In other words, model coverage analysis verifies the thoroughness of the model verification activities.
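Conceptually, the gap-identification step can be viewed as a set difference between the model elements that express requirements and the elements exercised by the requirements-based verification cases, as in the following simplified sketch (element and test-case names are illustrative).

# Simplified sketch of model coverage analysis as a set difference:
# which model elements expressing requirements were never exercised by the
# requirements-based verification cases. Element names are illustrative.
from typing import Dict, Set

def model_coverage_gaps(model_elements: Set[str],
                        exercised: Dict[str, Set[str]]) -> Set[str]:
    """Return model elements not exercised by any verification case."""
    covered = set().union(*exercised.values()) if exercised else set()
    return model_elements - covered

elements = {"Mode_Logic", "Rate_Limiter", "Fault_Latch"}
cases = {
    "TC-101": {"Mode_Logic"},
    "TC-102": {"Mode_Logic", "Rate_Limiter"},
}
gaps = model_coverage_gaps(elements, cases)
print("Coverage gaps to be analyzed and resolved:", gaps)  # -> {'Fault_Latch'}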

SQA should verify that this analysis has been performed in accordance with the plans and standards and in compliance with DO-178B/C requirements. The resolution of the analysis should also be reviewed to check that gaps have been correctly identified and resolved in compliance with the project standards and with the guidelines of DO-331.

With DO-178C, the guidance on tool qualification has been moved to a separate document. All requirements for tools are now collected in a domain-independent document, DO-330 (Software Tool Qualification Considerations). The document structure is similar to that of DO-178B/C, and SQEs working on software projects requiring compliance with DO-178C should familiarize themselves with the new guidance for tool qualification presented in DO-330.

DO-330 is a process- and objectives-oriented document. The applicable objectives vary depending on the tool qualification level (TQL). DO-330 also clarifies the objectives applicable to the tool developer and to the tool user for commercial-off-the-shelf (COTS) tools. This is particularly important considering that most of the tools used in modern software development are commercial tools (e.g. SCADE, Simulink, Code Composer Studio, etc.).

Specific quality assurance objectives are identified in Table T-9; these objectives are very similar to the SQA objectives of DO-178B/C. However, SQA checklists created for airborne software are probably not adequate to assess the quality of tool artifacts without modification. Additional checklists should be created specifically for tool quality assurance (TQA). The DO-330 objectives for tool quality assurance are:
- Obtain assurance that tool development plans and standards are developed and reviewed for consistency
- Obtain assurance that tool development processes and integral processes, including those of suppliers, comply with approved tool plans and standards
- Obtain assurance that the transition criteria for the tool life cycle processes are satisfied
- Conduct a conformity review of the tool product


The TQL concept replaces the old definitions of development and verification tools. Guidance to identify the correct TQL is provided in DO-178C, Section 12.2.
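The determination can be summarized as a lookup of the applicable tool criteria against the software level. The sketch below reflects the mapping commonly summarized from DO-178C Table 12-1 and should be confirmed against the published document before use.

# Sketch of the TQL determination mapping summarized from DO-178C Table 12-1
# (tool criteria 1-3 vs. software level A-D); confirm against the published
# document before relying on it.
TQL_TABLE = {
    # (criteria, software_level): TQL
    (1, "A"): "TQL-1", (1, "B"): "TQL-2", (1, "C"): "TQL-3", (1, "D"): "TQL-4",
    (2, "A"): "TQL-4", (2, "B"): "TQL-4", (2, "C"): "TQL-5", (2, "D"): "TQL-5",
    (3, "A"): "TQL-5", (3, "B"): "TQL-5", (3, "C"): "TQL-5", (3, "D"): "TQL-5",
}

def tool_qualification_level(criteria: int, software_level: str) -> str:
    """Look up the TQL for a tool given its criteria and the software level."""
    return TQL_TABLE[(criteria, software_level.upper())]

# Example: a tool whose output becomes part of Level B airborne software
# (criteria 1) would be qualified at TQL-2.
print(tool_qualification_level(1, "B"))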

Similarly to the SQA activities performed on airborne software, the TQA activities include the following:
- Preparing the TQA Plan (TQAP)
- Reviewing the tool plans
- Auditing the tool life cycle processes
- Assessing transition criteria
- Auditing tool life cycle data
- Auditing the tool environment
- Monitoring and auditing the change control process and artifacts
- Auditing suppliers
- Approving deviations and waivers
- Performing the tool conformity review

Tool quality assurance checklists should be created to assist the quality engineers in performing these activities. Particular attention must go to the operational aspects of the tools. SQEs should review a sample of the tool operational requirements, tool test cases and procedures, and tool verification results.

TQA checklists should also cover the correct use of the tool by verifying that the operational environment is correctly set up and matches the environment in which the tool was qualified.
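A basic way to support this verification is to compare the recorded qualification environment with the operational environment, as in the following sketch; the recorded keys and values are illustrative assumptions.

# Sketch: compare the tool's operational environment against the environment
# recorded during qualification. The recorded keys are illustrative assumptions.
from typing import Dict, List

def environment_mismatches(qualified: Dict[str, str],
                           operational: Dict[str, str]) -> List[str]:
    """List any qualification-relevant settings that differ in operation."""
    return [
        f"{key}: qualified={qualified[key]!r}, operational={operational.get(key)!r}"
        for key in qualified
        if operational.get(key) != qualified[key]
    ]

qualified_env = {"tool_version": "6.4.1", "os": "RHEL 7.9", "compiler": "gcc 4.8.5"}
operational_env = {"tool_version": "6.4.2", "os": "RHEL 7.9", "compiler": "gcc 4.8.5"}
for mismatch in environment_mismatches(qualified_env, operational_env):
    print("TQA finding:", mismatch)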

Conclusion
In quality assurance, evaluation activities are essential, but they are not sufficient to achieve the specified quality. The quality of a product cannot be measured, tested, or analyzed in the same way as other characteristics; it can only be built into the software development process.
To satisfy not only the letter but also the spirit of DO-178B and DO-178C, a completely process-oriented SQA is not enough.
DO-178B/C considers product quality as
important as process quality and expects SQA
personnel to be able to assess the compliance of
software products from a technical standpoint.
Continuous training in the areas of software development, software verification, and application of DO-178B/C is paramount to ensuring effective SQA personnel.
DO-178C has also expanded the guidance
for tool qualification and for new software
development techniques such as MBD. This
poses new challenges for the SQA organizations
in terms of the adequacy of the process and
methods used to achieve quality assurance, as
well as training of SQEs.
The concept of product quality assurance (PQA) from the Software Engineering Institute (SEI) can be adopted to ensure product quality. With the PQA approach, product quality engineers (PQEs) ensure product quality while SQEs ensure process compliance.
Through the synergy created by PQEs and SQEs
working closely together, product and process
quality can be assured.

Author
Armando Ragni has over 12 years of experience
in airborne software certification. He is currently
working as Software Discipline Chief with Cyient
for the UTC Embedded Systems, Software and
Electronics CoE and he is deployed at UTAS
premises in Rockford, IL.
In this role, he oversees software compliance to
RTCA/DO-178B/C and UTC processes for all the
software development and verification programs
under UTC ESSE responsibility.
He previously worked as the focal point for the Software and Complex Electronic Hardware certification of the CSeries Electric Power Generation & Distribution System, as well as a Compliance Verification Engineer (EASA Part 21) for the Alenia C-27J JCA program.
He holds an Aerospace Engineering degree from the Politecnico di Torino, Italy.

References
1. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification (Washington, DC: RTCA, Inc., December 1992)
2. RTCA/DO-178C, Software Considerations in Airborne Systems and Equipment Certification (Washington, DC: RTCA, Inc., December 2011)
3. RTCA/DO-331, Model-Based Development and Verification Supplement to DO-178C and DO-278A (Washington, DC: RTCA, Inc., December 2011)
4. RTCA/DO-330, Software Tool Qualification Considerations (Washington, DC: RTCA, Inc., December 2011)
5. Job Aid: Conducting Software Reviews Prior to Certification, Rev. 1 (FAA ACS, January 2004)
6. Rierson, L., Developing Safety-Critical Software: A Practical Guide for Aviation Software and DO-178C Compliance (CRC Press, 2013)


About Cyient
Cyient is a global provider of engineering,
data analytics, networks and operations
solutions. We collaborate with our clients
to achieve more and shape a better
tomorrow.
With decades of experience, Cyient is
well positioned to solve problems. Our
solutions include product development and
life-cycle support, process and network
engineering, and data transformation
and analytics. We provide expertise in the
aerospace, consumer, energy, medical,
oil and gas, mining, heavy equipment,
semiconductor, rail transportation, telecom
and utilities industries.
Strong capabilities combined with a
network of more than 12,900 associates
across 36 global locations enable us to
deliver measurable and substantial benefits
to major organizations worldwide.
For more information about Cyient,
visit our website.

NAM Headquarters
Cyient, Inc.
330 Roberts Street, Suite 400
East Hartford, CT 06108
USA
T: +1 860 528 5430
F: +1 860 528 5873
EMEA Headquarters
Cyient GmbH
Mollenbachstr. 37
71229 Leonberg
Germany
T: +49 7152 94520
F: +49 7152 945290
APAC Headquarters
Cyient Limited
Level 1, 350 Collins Street
Melbourne, Victoria, 3000
Australia
T: +61 3 8605 4815
F: +61 3 8601 1180
Global Headquarters
Cyient Limited
Plot No. 11
Software Units Layout
Infocity, Madhapur
Hyderabad - 500081
India
T: +91 40 6764 1000
F: +91 40 2311 0352

www.cyient.com
insights@cyient.com

© 2015 Cyient Limited. Cyient believes the information in this publication is accurate as of its publication date; such information is subject to change without notice. Cyient acknowledges the proprietary rights of the trademarks and product names of other companies mentioned in this document.
Published June 2015

