
The Automated Production Control Documentation System: A Case Study in Cleanroom Software Engineering

CARMEN J. TRAMMELL and LEON H. BINDER, Maryville College
CATHERINE E. SNYDER, Martin Marietta Energy Systems

A prototype software system was developed for the U.S. Naval Underwater Systems Center (NUSC) as a demonstration of the Cleanroom Software Engineering methodology. The Cleanroom method is a team approach to the incremental development of software under statistical quality control. Cleanroom's formal methods of Box Structure specification and design, functional verification, and statistical testing were used by a four-person team to develop the Automated Production Control Documentation (APCODOC) system, a relational database application. As is typical in Cleanroom developments, correctness of design and code was ensured through team reviews. Eighteen errors were found during functional verification of the design, and nineteen errors were found during walkthrough of the 1820 lines of FOXBASE code. The software was not executed by developers prior to independent testing (i.e., there was no debugging). There were no errors in compilation, no failures during statistical certification testing, and the software was certified at the target levels of reliability and confidence. Team members attribute the ultimate error-free compilation and failure-free execution of the software to the rigor of the methodology and the intellectual control afforded by the team approach.

Categories and Subject Descriptors: D.2.1 [Software Engineering]: Requirements/Specifications - methodologies; D.2.4 [Software Engineering]: Program Verification - correctness proofs, reliability; D.2.9 [Software Engineering]: Management - programming teams; D.2.10 [Software Engineering]: Design - methodologies; K.6.3 [Management of Computing and Information Systems]: Software Management - software development

General Terms: Design, Management, Reliability, Verification

Additional Key Words and Phrases: Box structures, Cleanroom software engineering, statistical quality control, statistical testing

Work was performed at Maryville College (Data Systems Department, Maryville, TN 37801) under contract to Martin Marietta Energy Systems (K-1001, MS 7180, P.O. Box 2003, Oak Ridge, TN 37831) for the U.S. Naval Underwater Systems Center (Newport, RI). Authors' address: University of Maryland, Unit 29216, APO AE 09102. Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission. © 1992 ACM 1049-331X/92/0100-0081 $01.50
ACM Transactions on Software Engineering and Methodology, Vol. 1, No. 1, January 1992, Pages 81-94.


INTRODUCTION

Zero compilation errors and zero failures in execution testing were the results in a case study in Cleanroom Software Engineering conducted for the Naval Underwater Systems Center (NUSC). Statistical testing, the testing approach used under the Cleanroom discipline, produced results supporting scientifically valid certification of the Automated Production Control Documentation (APCODOC) system as 95% reliable at the 95% level of confidence (the target level).

Cleanroom Software Engineering is a team approach to the incremental development of software under formal statistical quality control. The Cleanroom name comes from the semiconductor industry, where a literal cleanroom exists to prevent the introduction of defects during hardware fabrication. In Cleanroom software development, the emphasis is on defect prevention rather than defect removal, through methods that are used to support defect prevention during specification and design.

The Cleanroom method is based on ideas that were developed by Harlan Mills during his direction of the IBM Software Engineering Program during the 1970s and 80s. Cleanroom developments in both small academic projects and large industrial projects have yielded software of extremely high quality and reliability [4, 5, 9, 13, 16]. This paper describes the application, the development method, technical results, and lessons learned in the project.

THE APPLICATION

The software developed in this Cleanroom case study was a relational database application. The impetus for the APCODOC system was the need for up-to-date program documentation for the frequently modified software used by production control operators in NUSC financial operations. No such system for generating program documentation was in place; the APCODOC system was a new development effort.

NUSC financial software is implemented on the Unisys 1100, using Unisys Executive Control Language (ECL) for job control and COBOL as the programming language. The production control function involves the submission of data processing jobs to the 1100, manipulation of the ECL, and determination of recovery and restart requirements when programs fail. Production documentation is needed to carry out these tasks. Existing production documentation is incomplete, and in many cases consists of informal notes maintained by specific individuals. Automated production control documentation will provide several benefits, such as reducing failures during the production cycle, decreasing the effort required to determine correct rerun procedures, minimizing the need for communication in crisis mode, and reducing vulnerability to the consequences of personnel turnover.

The Cleanroom method involves statistical quality control throughout a life cycle of product releases, with development of each release in a series of increments. Release 1 of the APCODOC system is an operational prototype designed to demonstrate system function. Release 2 will be an enhanced version of the system, and Release 3 will be a production version.

Release 1 of the APCODOC system was to have been developed in three increments. For reasons unrelated to the project, the prototyping effort ended with Increment 1, and further work on the project will continue as Release 2 under new sponsorship. This summary of the Cleanroom case study conducted through Martin Marietta Energy Systems for the Naval Underwater Systems Center documents the result of Increment 1 of Release 1 of the APCODOC system.

THE METHOD

The Cleanroom method is a team approach to software engineering in which intellectual control of the work is ensured in three ways:

- ongoing peer review within a small, well-qualified team,
- use of the formal methods of Box Structure specification and design, functional verification, and statistical testing, and
- statistical quality control of an incremental development process in which tested increments accumulate to become a system release of certifiable reliability.

Several overviews of the Cleanroom method have been published in recent years [3, 9, 12, 14], as have several works on its formal methods of Box Structure specification and design [8, 10, 11], functional verification [6], and statistical testing [2, 15, 17].

The APCODOC project was a full implementation of the Cleanroom method. The project was conducted from May 1989 through July 1990 and involved four part-time team members who devoted approximately half of their time to the project.1 This project organization was used because (1) Cleanroom maintains that small teams afford greater quality and process control than either individuals or large teams, and (2) to illustrate the usefulness of the Cleanroom discipline in the very common situation in which developers are devoting part-time, stop-and-start attention to the work at hand.

The phases of work in the project and the percentage of time devoted to each are given in Table I. After requirements had been established, development and test planning occurred in parallel. When the code had been written and compiled, developers observed as independent testers conducted the first execution of the software.2 Statistical testing against an established quality standard

1 Team members were Leon Binder, Elaine Galbraith, Carmen Trammell, and Vicki Wester. Two team members were involved in all phases, one team member in verification/coding/testing only, and one in testing only. Specialists were also involved for consultation on details of ECL and FOXBASE.
2 Independent testers and observers included Jerome Blaylock (President, Compumetrics, Houston, TX), William Dent (Chair, Maryville College Department of Math, Physics, and Computer Science), Jesse Poore (Chair, University of Tennessee Department of Computer Science), and Cathy Snyder (Project Manager, Martin Marietta Energy Systems, Inc., Data Systems Research and Development Program).


Table I. Distribution of Project Time across Cleanroom Activities

 5%  Planning: Review of customer statement of work; development of incremental development plan and schedule; formation of project team
39%  Requirements: Definition of requirements for function, performance, reliability, security, auditability, interoperability; design of user interface; characterization of usage conditions; initial consideration of target hardware, operating system, and programming language
22%  Box Structure specification and design: Top-down decomposition of system function yielding a hierarchy of modules; specification of both data flow and control flow
 5%  Functional verification: Verification of correctness of the design
 6%  Coding and code walkthrough: Transliteration of design to target language; verification of correctness of transliteration
22%  Statistical test planning: Definition of a test case; generation of test cases sufficient for reliability requirements; identification of oracle for definition of correct output; determination of correct output for each test case; generation of test script
 1%  Statistical testing: First execution; testing according to the prepared script; certification of the software

There was no Cleanroom training time involved in the project. Three of the team members had been exposed to Cleanroom in graduate courses in computer science.

provided the measure of Cleanroom process control. The software met the quality standard, and was certified at the target reliability level. (If software does not meet quality standards during Cleanroom testing, the development process is not under control. Testing stops and developers return to design.) Figure 1 illustrates the flow of events in a Cleanroom development increment.

Team members attribute the ultimate error-free compilation and execution of the software to the rigor of the methodology and the intellectual control afforded by the team approach. Formal methods and ongoing peer review provided a framework that facilitated coherent team work and continuity of effort of part-time participants. An overview of the specification, design, verification, coding, and test phases of the project follows.

Box Structure Specification and Design. Box structures represent system behavior in a hierarchy of black box, state box, and clear box forms. The black box is a specification of function in terms of stimuli and responses. The state box is an elaboration of function in terms of internal data structures. The clear box is the procedural design, and includes decomposition of function into lower level black boxes. New black boxes (i.e., new specifications) are similarly decomposed until the corresponding clear boxes (i.e., designs) contain no new black boxes. The development of a system from the highest level of abstraction to the lowest level of procedure is a stepwise process of box structure decomposition. Specification and design are developed in parallel, resulting in a seamlessly integrated hierarchy affording complete system verifiability and traceability.

Fig. 1. Flow of events in a Cleanroom development increment. (The figure shows requirements feeding Box Structure specification and design, with statistical test planning in parallel, followed by functional verification, coding and code walkthrough, statistical testing, and certification.)
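The three box forms can be made concrete with a small sketch. The counter component below is hypothetical and written in Python purely for illustration (the project itself used BDL and FOXBASE); it shows how a black box maps stimulus histories to responses, how a state box introduces state data, and how a clear box invokes a lower-level black box ("use BB" in BDL terms).

```python
# Hypothetical counter component in the three box forms (illustrative only).

# Black box: each response is a function of the stimulus history alone.
def counter_black_box(stimulus_history):
    """Respond with the number of 'increment' stimuli received so far."""
    return sum(1 for s in stimulus_history if s == "increment")

# State box: the same external behavior, with the history encapsulated
# as state data introduced by the refinement.
class CounterStateBox:
    def __init__(self):
        self.count = 0  # state data

    def stimulate(self, stimulus):
        if stimulus == "increment":
            self.count += 1
        return self.count  # response

# Clear box: the procedural design; the lower-level addition operation
# appears as a new black box to be decomposed in the next refinement step.
def add_black_box(a, b):
    """Lower-level black box ('use BB' in BDL terms)."""
    return a + b

class CounterClearBox(CounterStateBox):
    def stimulate(self, stimulus):
        if stimulus == "increment":
            self.count = add_black_box(self.count, 1)
        return self.count
```

Verifying such a sketch mirrors the verification obligations described below: the state box must completely implement the black box's transition rule, and the clear box must completely implement the state box.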

The box structure representation of the APCODOC system was expressed in a design language called Box Description Language (BDL) [11]. The use of BDL afforded several advantages: it enabled a language-independent representation of the system, thus preserving the option of conversion to more than one target language; it freed designers from time-consuming side issues associated with the details of a given target language; and it enabled team members not familiar with the target language to fully participate in design.

An example of BDL is given in Table II. The black box specification lists all external stimuli used by the module and all responses produced. Logical stimuli are underlined to distinguish them from physical parameters, but both stimulus parameter types are given for completeness of specification. The black box transition rule specifies the usage of each stimulus parameter and the conditions for production of each response. The state box in this example is the same as the black box since there is no state that must be preserved between invocations of the module. The clear box is the procedure, and the mapping of procedural segments to their specifications is given explicitly in bracketed comments. The clear box contains a new black box (marked by keywords use BB), which represents the next step in the functional decomposition of the system.

Peer review during system decomposition afforded numerous benefits. Typical among issues discussed in development reviews were the following: interpretation of requirements; correctness, completeness, and efficiency of algorithms;

Table II. Box Structure Example

 1  define BB Help
 2    stimulus
 3      filename : char (8)
 4      invocation screen : screen display
 5      operator_request: leave_help : keystroke
 6    response
 7      help screen : screen display
 8      invocation screen : screen display
 9      error message : screen display
10    transition
11      Check for the existence of the help file named filename.
12      If the file exists, save the invocation screen and write the contents
13      of the help file to the screen. Operator_request: leave_help is given
14      by any keystroke. Upon leave_help, restore the invocation screen.
15      If the file does not exist, print the error message "HELP FILE
16      (filename) NOT FOUND. PRESS ANY KEY TO CONTINUE." at row 22,
17      column 1. When the user hits a key, clear line 22.
18  define SB Help
19    stimulus
20      same as BB
21    response
22      same as BB
23    state data
24      none
25    transition
26      same as BB
27  define CB Help
28    stimulus
29      same as BB
30    response
31      same as BB
32    state data
33      none
34    local data
35      OK = 1
36      file_status : boolean
37    proc
38      [BB line 11]
39      use BB Check_File_Existence
40          (in: filename;
41           out: file_status);
42      [BB lines 12-14]
43      if (file_status = OK)

opportunities for economy of code through use of common services (reusable modules); and conventions and standards. Designs were viewed as working drafts until peer review had been completed, and many design improvements resulted from each review session.

The APCODOC system is microcomputer-based, single-user, interactive, and menu-driven. The user interface includes 35 screens (5 menus, 11



Table II. Box Structure Example (continued)

44      then
45        do
46          save screen;
47          put file (filename);
48          get (keystroke);
49          restore screen;
50        od
51      [BB lines 15-17]
52      else
53        do
54          put "HELP FILE (filename) NOT FOUND. PRESS ANY KEY TO CONTINUE." at row 22, column 1;
55          get (keystroke);
56          clear line 22;
57        od
58      fi
59  corp
documentation displays, 19 help screens) and 80 user functions (the sum of all options on all screens). Two source files are parsed, creating 3 relational files. The box structure representation of the APCODOC system is a seven-level hierarchy with 29 black boxes and 13 common services. The 13 common services are used a total of 77 times in the hierarchy, for an average of 6 uses per common service. The common service with maximum use is used 15 times.

Functional Verification. Cleanroom verification is based on a view of programs as mathematical functions subject to proof of correctness. Although practitioners learn formal mathematical proof techniques in Cleanroom training, a balance of formality and economy of effort is emphasized in practice. The usual approach to verification is the application of mathematical arguments to correctness questions [5]. When there is easy agreement about correctness, verification proceeds rapidly. Formal techniques are only used in instances in which it is not easy to get consensus about correctness through informal means.

After the stepwise design and review process had been completed, a new review of the design was conducted to verify its correctness. Participants in verification sessions varied, but always included the author of the BDL and a person who had not previously seen the BDL. Verification was performed in ten sessions of approximately four hours each.

Verification began with the highest level box structure and proceeded top-down. Each black box transition rule (the mapping of stimuli to responses) was verified as being completely implemented by the corresponding state box, and each state box was verified as being completely implemented by the corresponding clear box. Verification of BDL was performed primarily by assertion, with mathematical proof techniques employed only in cases where participants were not all convinced of correctness (as was prescribed by Linger et al. [6]). A checklist created by the developers was used to ensure systematic checks on specific aspects of the design.

A total of 18 errors was found during functional verification. Five of these errors would have been caught in compilation, and 13 were errors that would have caused failures in execution. Of the 13 potential execution errors, 6 were serious errors with a high probability of occurrence (e.g., a defect in a basic part of the parser), 3 were serious errors with a low probability of occurrence (e.g., incorrect check of an improbable boundary condition), and 4 were minor errors (e.g., a cosmetic error in a screen display).

Although 9 of the 18 errors found during functional verification could have had a serious impact on correct execution, only one was a deep logical error. Writing to the wrong file, for example, would have seriously corrupted the database, but use of the wrong file name in the write statement was just a "dumb goof" that was easy to find and fix. The one deep logical error related to control coupling, and required more extensive rethinking to correct.

A summary of errors using categories defined by Basili and Selby [1] is given in Table III. A complete list of errors found during functional verification is given in Table IV. Error significance codes are as follows.

S/H  Serious error, high probability of causing failure
S/L  Serious error, low probability of causing failure
M    Minor error
C    Error that would have been caught in compilation

The results of functional verification indicated that peer reviews during stepwise design had indeed served to ensure a quality design. With a single exception, errors found in functional verification were simple goofs rather than fundamental flaws.

Coding and Code Walkthrough. Once the design had been verified as correct, the BDL was transliterated into the target language, FOXBASE, yielding 1820 executable lines of code.3 (Transliteration is commonly distinguished from translation in that the former implies preservation of both semantics and syntax, while the latter implies preservation of semantics only.) The conversion of the BDL design to FOXBASE was mostly a matter of transliteration, but also required some translation and some redesign to accommodate peculiarities of FOXBASE. Most of the errors discovered during the code walkthrough were introduced in instances of translation or redesign.

When coding had been completed, the team reassembled for a code walkthrough. The purpose of the walkthrough was to ensure (1) semantic equivalence between the BDL and the code, and (2) syntactic correctness of the code. The walkthrough was performed in three sessions of approximately four

3 FOXBASE is a fourth-generation language. Studies comparing 3GL and 4GL code size on similar projects have found that 3GL code size is an average of 10 times larger [7]. Inference of functional complexity and programmer productivity from APCODOC code size is further complicated by the fact that Cleanroom designs strongly emphasize economy of code through use of common services.

Table III. Summary of Errors Found in Functional Verification

                  Errors of Commission    Errors of Omission
Computation               1                       0
Control                   3                       2
Cosmetic                  2                       2
Data                      2                       4
Initialization            1                       0
Interface                 1                       0
Total                    10                       8

Table IV. Description of Errors Found in Functional Verification

Category        Occurrences  Significance  Description of Error

Errors of Commission
Computation          1           S/H       Incorrect incrementing of current_position in line being parsed
Control              1           S/H       Loop control error resulting from control coupling in module interface
Control              1           S/H       Get (string length) occurred before get (string)
Control              1           S/L       <= should have been < in check for improbable boundary condition
Cosmetic             1           M         Function key descriptor missing from screen display
Cosmetic             1           M         Screen title display off by one row
Data                 2           S/H       Write to wrong file
Initialization       1           S/H       Constant set to wrong value
Interface            1           C         Missing parameter in parameter list

Errors of Omission
Control              2           S/L       Missing check for improbable boundary condition
Cosmetic             1           M         Error message erased from screen too quickly
Cosmetic             1           M         Failure to accept lowercase Y/N caused looping until uppercase entered
Data                 4           C         Missing declaration

Design Improvements
Removed unnecessary declaration (5); removed unnecessary assignment statement (2); removed unnecessary parameter from module interface (1); added check for (improbable) null string before searching file for string (1)

hours each. The same four people participated in every session: the author of the BDL, the author of the code, a team member who had also participated in functional verification, and an unbiased FOXBASE programmer.

The code walkthrough began with the highest level procedure and proceeded top-down. The previously verified BDL was compared line-by-line with the FOXBASE code. A checklist created by the developers was used to ensure systematic checks on specific aspects of the code. Logic was reverified only where the FOXBASE constructs did not exactly match the BDL. In those

cases, the function performed by the code was verified to be the same as the function represented by the BDL. Deviations from the BDL pertained to such capabilities as file access and implementation of function keys.

A total of 19 errors was found during the code walkthrough. Of these 19, 3 were errors that would have been caught in compilation, and 16 were errors that would have caused failures in execution. Of the 16 potential execution errors, 10 were serious errors with a high probability of occurrence, 1 was a serious error with a low probability of occurrence, and 5 were minor errors. Four of the serious/high errors were the result of a misunderstanding of FOXBASE. Twelve of the 19 errors had been introduced in transliteration, and 7 were design errors that had not been caught in functional verification.

A summary of errors found during the code walkthrough is given in Table V. A complete list of errors is given in Table VI. Categories and codes used in classifying errors are the same as those used in functional verification. An additional category, Language, is for errors that resulted from a misunderstanding of the target language. Two additional codes indicate whether the error was introduced during design (D) or transliteration (T).
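To make the BDL-to-code step concrete, the Help clear box of Table II might be rendered as follows. This is a hypothetical Python sketch (the project's actual target language was FOXBASE); the screen is modeled as a list of 24 row strings, and the file and keyboard operations are injected so the module can run without a console.

```python
def show_help(filename, file_exists, read_help_file, screen, get_keystroke):
    """Python analogue of the Help clear box in Table II (hypothetical).

    file_exists(filename) -> bool, read_help_file(filename) -> 24 rows,
    and get_keystroke() stand in for the BDL's file and keyboard primitives.
    """
    if file_exists(filename):
        saved = list(screen)                  # save invocation screen
        screen[:] = read_help_file(filename)  # write help file to screen
        get_keystroke()                       # leave_help: any keystroke
        screen[:] = saved                     # restore invocation screen
    else:
        # Error message at row 22, column 1; cleared after a keystroke.
        screen[21] = f"HELP FILE ({filename}) NOT FOUND. PRESS ANY KEY TO CONTINUE."
        get_keystroke()
        screen[21] = ""
```

The if/else branches correspond to the bracketed [BB lines 12-14] and [BB lines 15-17] segments of the clear box in Table II.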

Coding proved to be a step in which new errors were introduced into a verified design. In fact, the majority of errors found during the code walkthrough had been introduced during conversion to the target language. In the absence of (1) a project objective to preserve the option of conversion to more than one programming language or (2) a design-language-to-target-language code generator, the advantages of language-independent design may be outweighed by the disadvantages of error-prone transliteration. Furthermore, the task of maintaining the final version of the BDL and the code in parallel will be somewhat awkward unless the code is sufficiently close to the BDL. If language independence is an important objective in a development effort, then this parallel maintenance task is worthwhile. If, on the other hand, there is only one intended target language and future conversion to other languages is not anticipated, it will be helpful to use the target language for clear boxes in the box structure hierarchy.

Statistical Test Planning. Statistical test planning was based on three aspects of the system that were established during the requirements phase: the reliability requirements, the user interface, and the usage conditions for the software.

The reliability requirements established for the APCODOC prototype (95% reliability at the 95% level of confidence) determined the number of test cases that would have to be processed correctly to certify the reliability of the software at the required level. The requisite number of test cases was 59. (The mathematical basis for determining the number of test cases needed is given by Poore et al. [15] and Sexton [17].)

The APCODOC system user interface is menu-driven, and a test case was defined as a series of events beginning with either system invocation or


Table V. Summary of Errors Found in Code Walkthrough

                  Errors of Commission    Errors of Omission
Computation               0                       0
Control                   1                       2
Cosmetic                  3                       2
Data                      3                       1
Initialization            2                       1
Interface                 0                       0
Language                  4                       0
Total                    13                       6

Table VI. Description of Errors Found in Code Walkthrough

Category        Occurrences  Origin  Significance  Description of Error

Errors of Commission
Control              1         T        S/H       File access occurred before file status check
Cosmetic             1         D        M         Screen title contained extraneous characters
Cosmetic             1         T        M         Character in display should have been capitalized
Cosmetic             1         T        M         Blank line intended in screen display was missing
Data                 1         D        S/H       Wrong file name
Data                 1         T        C         Misspelled variable name
Data                 1         T        S/H       Extraneous skip statement would have skipped a record in file
Initialization       1         D        C         Wrong type in declaration
Initialization       1         T        S/H       Variable initialized to wrong value
Language             1         T        S/H       clear after store invalidated store
Language             1         T        S/H       String positions addressed as 0-79; adjusted for functions expecting 1-80
Language             1         T        S/H       close impacted all databases; intended for current database only
Language             1         T        S/H       get of null variable generates an automatic (unintended) (CR)

Errors of Omission
Control              1         T        S/L       Incorrect operator in predicate related to improbable boundary condition
Control              1         T        S/H       Missing file status check
Cosmetic             1         D        M         Failure to accept lowercase Y/N caused looping until uppercase entered
Cosmetic             1         D        M         Failure to prevent display of name on screen
Data                 1         D        S/H       Failure to recognize lowercase characters when parsing file
Initialization       1         D        C         Missing declaration

selection from the Main Menu, and ending with either a return to the Main Menu or system termination. Test cases were prepared by random sampling of user inputs from a probability-based state transition matrix constructed from the usage conditions determined for the software. (The usage conditions had been determined during the requirements phase through customer interviews and observations at the customer site.) The correct output for each test case was established in advance, with the oracle for correctness being the user interface specification and the original source files processed by the APCODOC system. The test inputs and expected outputs formed the test script.

Statistical Testing. The purpose of Cleanroom statistical testing is not to find errors, but to certify the reliability of the software. Developers ensure the quality of the software through the inspection methods previously discussed; there is no debugging of the software prior to statistical testing.

In the certification testing of APCODOC, independent testers were to test the software (according to the test script) until either the requisite number of test cases had been processed correctly or a failure occurred. For failures within the maximum allowed under the Cleanroom quality standard (i.e., five errors per thousand lines of code4), developers were allowed to fix the software, and testers would then resume testing. If the maximum allowable number of failures was exceeded, testing was to be discontinued and developers were to return to design.

All test cases were successfully processed by the APCODOC system, and the software was certified as 95% reliable at the 95% level of confidence. This certification may be viewed as a lower bound on reliability, since it is based on a conservative approach to inference of reliability, i.e., statistical sampling without a reliability model. When failure data is available during statistical testing, the Certification Model [2] is used to provide a more realistic estimate of reliability. (The mathematical basis for regarding the prediction of reliability by statistical sampling without a model as conservative

4 The following discussion of errors in 4GL development is presented by Martin [7]:

    There are many statistics on human error rates with conventional programming. The early experience with fourth-generation languages indicates that the human typically creates one order of magnitude fewer bugs in building a given application. It appears that if the human writes one-tenth of the lines of code to create an application, he makes one-tenth of the mistakes. In other words, he makes about the same number of mistakes per line of code. If he obtains results ten times as fast, he can accomplish ten times as much work. In so doing, however, he makes about the same number of mistakes per day, saying [as] Richard Hamming [suggested] that there is an approximate law of conservation of the total number of mistakes a person creating programs will make, regardless of the power of the programming language. (Italics and brackets added)

If Martin's report of the same number of mistakes per line of code holds, then the application of the Cleanroom standard used with 3GL developments (i.e., maximum five errors/KLOC) is appropriate in 4GL developments as well.

The Automated Production Control Documentation tive and the estimate of reliability et al. [15]). by the Certification

System Model

93

as more

realistic

is given

by Poore
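The Certification Model of Currit, Dyer, and Mills [2] predicts reliability from the growth trend in times-between-failure observed during statistical testing. The paper does not reproduce the model's equations; the following is a simplified sketch of the underlying idea (a least-squares log-linear fit of interfail times, with illustrative data):

```python
import math

def mttf_growth_rate(interfail_times):
    """Fit log(t_k) = a + b*k to successive times-between-failure by
    least squares; exp(b) estimates the factor by which MTTF grows
    with each correction (a simplified form of the model in [2])."""
    n = len(interfail_times)
    xs = list(range(1, n + 1))
    ys = [math.log(t) for t in interfail_times]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return math.exp(slope)

# Illustrative interfail times that double after each correction:
print(mttf_growth_rate([10, 20, 40, 80, 160]))  # approximately 2.0
```

A growth rate above 1.0 indicates MTTF is increasing with each correction, i.e., the reliability of the software is improving as testing proceeds.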

Statistical testing against established standards (e.g., errors per thousand lines of code, rate of growth in mean-time-to-failure, rate of growth in reliability) provides the measure of process control in Cleanroom. Testing stops under one of two conditions: either the certification goal is met (indicating that the development process was under quality control), or testing is stopped when the predicted failure rate shows that certification standards will not be met (indicating that the process was not under control). The APCODOC project team's success in meeting the established standards allowed the customer to conclude that the development was in control.

LESSONS LEARNED

The APCODOC developers had each been involved in Cleanroom projects prior to this effort. Some observations about this project simply reinforced previous lessons learned, and some were new lessons. Briefly, lessons learned were as follows.

In the planning of increments, smaller is better. As the team develops increasing competence in using Cleanroom's formal methods, larger increments will be within the team's intellectual control.

If implementation in more than one programming language is anticipated, use the language-independent BDL. If, instead, one and only one programming language is intended for use, use that language in clear boxes of the box structure hierarchy.

Maximal use of common services is enabled by (1) consistency in the user interface (which creates opportunities for common services), and (2) completeness of requirements specification prior to box structure design (which creates awareness of opportunities for common services in both the user interface and elsewhere).

Time-consuming formal proof is generally not necessary in functional verification. Verifiers who have been trained in proof techniques can differentiate between cases in which verification by assertion is sufficient and cases where more mathematical techniques are needed.

Verify corrections. Independent checks on corrections should be a routine team activity.

Above all, stay in the Cleanroom! Shortcuts in the process will only yield a shortfall in reliability.

REFERENCES

1. BASILI, V. R., AND SELBY, R. W. Comparing the effectiveness of software testing strategies. IEEE Trans. Softw. Eng. 13, 12 (Dec. 1987), 1278-1296.
2. CURRIT, P. A., DYER, M., AND MILLS, H. D. Certifying the reliability of software. IEEE Trans. Softw. Eng. 12, 1 (Jan. 1986), 3-11.
3. DECK, M. D., AND HAUSLER, P. A. Cleanroom software engineering: Theory and practice. In Proceedings of Software Engineering and Knowledge Engineering 90 (Skokie, Ill., June 1990).
4. KOUCHAKDJIAN, A. Evaluation of the Cleanroom methodology: A case study at NASA/Goddard. In Proceedings of the Seventh International Conference on Testing Computer Software (San Francisco, June 1990).
5. LINGER, R. C., AND MILLS, H. D. A case study in Cleanroom software engineering: The IBM COBOL Structuring Facility. In Proceedings of COMPSAC 88 (Chicago, Ill., Oct. 1988).
6. LINGER, R. C., MILLS, H. D., AND WITT, B. I. Structured Programming: Theory and Practice. Addison-Wesley, Reading, Mass., 1979.
7. MARTIN, J. Fourth-Generation Languages, Volume 1: Principles. Prentice Hall, Englewood Cliffs, N.J., 1985.
8. MILLS, H. D. Stepwise refinement and verification in box-structured systems. Computer 21, 6 (June 1988), 23-36.
9. MILLS, H. D., DYER, M., AND LINGER, R. C. Cleanroom software engineering. IEEE Softw. 4, 5 (Sept. 1987), 19-25.
10. MILLS, H. D., LINGER, R. C., AND HEVNER, A. R. Box structured information systems. IBM Syst. J. 26, 4 (1987), 395-413.
11. MILLS, H. D., LINGER, R. C., AND HEVNER, A. R. Principles of Information Systems Analysis and Design. Academic Press, New York, 1986.
12. MILLS, H. D., AND POORE, J. H. Bringing software under statistical quality control. Quality Progress 21, 11 (Nov. 1988), 52-55.
13. POORE, J. H., MILLS, H. D., HOPKINS, S. L., AND WHITTAKER, J. A. Cleanroom reliability manager: A case study using Cleanroom with Box Structures ADL. Software Engineering Technology Rep., IBM STARS CDRL 1940, May 1990.
14. POORE, J. H., AND MILLS, H. D. An overview of the Cleanroom software development process. In Proceedings of the ACM International Workshop on Formal Methods in Software Development (Halifax, Nova Scotia, July 1989).
15. POORE, J. H., MUTCHLER, D., AND MILLS, H. D. STARS-Cleanroom reliability: Cleanroom ideas in the STARS environment. Software Engineering Technology Rep., IBM STARS CDRL 1710, Sept. 1989.
16. SELBY, R. W., BASILI, V. R., AND BAKER, F. T. Cleanroom software development: An empirical evaluation. IEEE Trans. Softw. Eng. 13, 9 (Sept. 1987), 1027-1037.
17. SEXTON, B. C. Statistical testing of software. Master's thesis, Dept. of Computer Science, Univ. of Tennessee, Dec. 1988.

Received November 1990; revised July 1991; accepted October 1991
