
Information & Management 40 (2003) 677–690

The impact of software process improvement on quality: in theory and practice

Noushin Ashrafi*
Department of MSIS, College of Management, University of Massachusetts, 100 Morrissey Blvd., Boston, MA 02125, USA
* Tel.: +1-617-287-7883; fax: +1-617-287-7877. E-mail address: noushin.ashrafi@umb.edu (N. Ashrafi).

Accepted 22 August 2002

Abstract

To remain competitive, software companies must establish practices that enhance quality and advance process management.
To this end, they have increasingly turned to software process improvement (SPI) methodologies, of which the ISO 9000
standards and the capability maturity model (CMM) are the best known. The underlying principle of both methodologies is to
assess organizational capabilities to produce quality software, but they depend on different underlying processes.
Whether the practices advocated by these methodologies lead to high-quality software has been the topic of ongoing debates.
Both scholars and practitioners are looking for hard evidence to justify the time and effort required by such guidelines to improve
the software-development process and its end product.
In this paper, we investigate the impact of SPI methodologies on software quality, first by theoretical comparison and then
with empirical data. Our findings reveal that each methodology has had a different level of impact on software quality factors.
These findings could help software-development organizations select the methodology or combination that best meets their
quality requirements.
© 2002 Elsevier Science B.V. All rights reserved.

Keywords: Software process improvement; Software quality assurance; The capability maturity model; ISO 9000-3; Quality factors

1. Introduction

Despite rapid advances in all facets of technology, the software industry is still struggling with the formidable task of developing software applications that meet quality standards, time pressures, and budget constraints [26,29]. Cost and time elements are quantitative and can therefore be measured and evaluated. Quality, on the other hand, is a multi-dimensional concept that is difficult to define and measure. The focus of this paper is on the quality aspect of software development.

Software organizations differ in their strategies for improving the quality of their products. In the past, looking for a quick solution, some organizations tried to apply special packages or approaches to their software-development processes, hoping to get quality software products out fast and within budget. Since these efforts did not deliver as promised, slower-paced and more methodical approaches, such as software process improvement (SPI), started to gain momentum [13].

Two established sources, the ISO 9000-3 standards and the capability maturity model (CMM), have provided guidance for software-development companies. While these SPI techniques may be considered mainstream approaches in the software industry, their deployment is expensive and time consuming. Therefore, organizations are looking for hard evidence to justify the time and effort required to use these guidelines to improve the software-development process and thus, hopefully, the end product.

The purpose of this paper is to examine SPI methodologies, specifically ISO 9000-3 and the CMM, to find out to what extent they address quality factors and, once implemented, what impact they have on the various quality factors. This line of analysis should be of value in determining which SPI methodology should be adopted to build quality into the software-development process.

2. Background

The elusive concept of software quality has been the topic of debate since computers were first invented [19]. Manufacturing quality concepts initiated by Deming [5] and continued by Juran and Gryna [17], such as conformance to requirements and fitness for customer use, have been applied to software products. Pressman [30] and Humphrey [14] showed quality to be a multi-dimensional concept perceived differently in different domains. Fournier [9] has indicated that "quality very often signifies different things to different people in the context of a software system."

To illustrate the various aspects of software quality, software professionals pioneered a quality model that attempted to identify factors representing the behavioral characteristics of the software system. Each factor was subsequently associated with two or more software quality attributes used to enforce the definition of a specific quality factor [23]. Over the years, experts have come up with different versions of the model. While varying in detail, all versions lay a foundation that covers all dimensions of quality for a software product, ensuring that quality is built into the software system package being created [7,10,11,27]. The Software Quality Assurance (SQA) group, a professional organization founded to promote the quality assurance profession through the proliferation and advancement of high professional standards, embraced this framework as a guideline for judging the quality of a software system. Table 1 shows a version of the software quality model printed in the handbook of SQA. This model identifies 14 quality factors in three stages of the development life cycle: design, performance, and adaptation. Although it is difficult, if not impossible, to develop a piece of software that adheres to all 14 factors, the model provides a good frame of reference for understanding software quality. We used this quality model throughout the study.

Table 1
Software quality factors

Quality of design
  Correctness: extent to which the software conforms to its specifications and to its declared objectives
  Maintainability: ease of effort for locating and fixing a software failure within a specified time period
  Verifiability: ease of effort to verify software features and performance based on its stated objectives

Quality of performance
  Efficiency: extent to which the software is able to do more with less system (hardware, operating system, communications, etc.) resources
  Integrity: extent to which the software is able to withstand intrusion by unauthorized users or software within a specified time period
  Reliability: extent to which the software will perform (according to its stated objectives) within a specified time period
  Usability: relative ease of learning and operating the software
  Testability: ease of testing the program to verify that it performs a specified function

Quality of adaptation
  Expandability: relative effort required to expand software capabilities and/or performance by enhancing current functions or by adding new functionality
  Flexibility: ease of effort for changing the software's mission, functions, or data to meet changing needs and requirements
  Portability: ease of effort to transport software to another environment and/or platform
  Reusability: ease of effort to use the software (or its components) in other software systems and applications
  Interoperability: relative effort needed to couple the software on one platform to other software and/or another platform
  Intra-operability: effort required for communications between components in the same software system

Source: the Handbook of Software Quality Assurance, Prentice Hall, 1998.

Finding a single way to improve software quality for all software-development organizations is not possible; what is considered the ultimate target for quality in one organization may not be important to another. For example, an organization that produces mission-critical software considers reliability to be the most important factor, while portability may be a necessity for an organization that produces a software product for a variety of platforms. Obviously, increasing some quality factors may cause others to decline.

What an experienced software developer wants is an approach that reflects his or her organization and its goals. Organizations that find themselves in the process of choosing a methodology to improve their quality management system face a dilemma: which, if any, of these methodologies meets their quality requirements [8]?

Also, there has been some concern about the lack of hard evidence that SPI methodologies do indeed improve the quality of software products [16,24,28,31]. Recently, a number of empirical studies have addressed the effect of the CMM or ISO on quality. These studies, however, measure software quality as a reduced number of defects and less rework, hence offering a narrow definition of quality. CMM specifies 18 key process areas (KPAs) categorized into five process maturity levels. Studies have shown that a higher maturity level leads to increased productivity and quality, defined as reduced defects and rework [12,15,21,22]. A similar study investigated the emerging ISO/IEC 15504 international standard on software process assessment [6]. Its findings indicate that improvements in process capability are associated with a reduction in rework.

A recent report from the Software Engineering Institute summarized the observations from a 1999 survey of high-maturity CMM-based efforts. It basically provided a perspective on the practices of high-maturity-level organizations regarding their management, engineering, tools, and technology [4].

Several other studies have been conducted to compare and contrast ISO 9000-3 and CMM. Most have focused on a one-on-one comparison of CMM key process areas and ISO 9000-3 requirements [1–3,20,25,33]. Although very useful in answering questions such as "Is complying with a certain level of CMM equivalent to complying with ISO standards?", they do not assess and compare the impact of the two approaches. In fact, there has apparently been no effort to investigate SPI methodologies on the various dimensions of software quality: design, performance, and adaptation.

In our attempt to do this, we first need to compare and contrast the SPI and SQA approaches to the development process and how they define and embrace quality issues.

3. Comparing SPI with the SQA factors

Although both SQA guidelines and SPI methodologies more or less claim to address the same subject, quality assurance (QA), they attempt to do so from different starting points.

SQA treats processes as part of a "package" with factors and attributes that can be simply illustrated in tables and cross-charts. Fournier notes that "Quality ... encompasses the complete system package being created, including its supportive elements, such as system documentation, staff training, and even post-installation support processes, in other words, all those items that meet the expectations of our customers ... in addition to the software product."

In contrast, SPI sees the development process as the deciding factor in software improvement and tends to take most other (but by no means all) factors from outside the domain of QA. For example, CMM describes the principles and practices intended to help organizations improve the maturity of their software process development through an evolutionary path from ad hoc and chaotic to mature and disciplined [34]. A formal definition given by the Software Engineering Institute (SEI) shows the emphasis on process rather than product: "the capability maturity model summarizes the key practices of a key process area and can be used to determine whether an organization or project has effectively implemented the key process areas."

The ISO 9000-3 guidelines emphasize consistency through extensive documentation. They state: "... this part of ISO 9000 is intended to describe the suggested controls and methods for producing software which meets a purchaser's requirements. This is done primarily by preventing non-conformity at all stages from development through to maintenance." [18].

Though expectations may have been heightened by the title of ISO 9000-3, "quality management and quality assurance standards: guidelines for the application of ISO 9001 to the development, supply and maintenance of software," it has a very narrow focus, as spelled out in its Section 1: "this part of ISO 9000 deals primarily with situations where specific software is developed as part of a contract according to purchaser's specifications."

Both of these definitions lead us to believe that SPI models tend to reflect a developer's view, while SQA reflects more of a user's view. This contrast creates a Heisenberg effect: following one or the other of these approaches, you concentrate either on the development processes or on the external attributes of the product, but not on both at the same time.

Basically, SPI methodologies provide a generic recipe for improving the software-development process. This would have a better chance of improving the product if we could establish a framework within which improvement can be tailored to meet business needs and organizational quality requirements.

Questions such as "Do process improvement activities reflect an understanding of the relationship between process characteristics and quality improvement goals?" have remained unanswered. To address such concerns, and specifically to identify the strengths and weaknesses of SPI methodologies as they relate to quality, we mapped the software quality factors to ISO clauses and to the key process areas of CMM.

This mapping provided theoretical evidence that ISO and CMM cover quality factors as described in the handbook of SQA. The next step for this study was to find out whether there is a correlation between theory and practice when deploying SPI methodologies. Hence, we conducted an empirical study in which we surveyed the users of these methodologies (software developers) and asked for their perception of the impact of SPI methodologies on quality factors.

For our theoretical assessment, we used SQA as the initial basis for our mapping. The quality factors are divided into three categories: design, performance, and adaptation. Table 2 depicts the mapping of the quality factors affecting design to ISO standards and CMM key process areas.

Thus, ISO 9000-3 covers all three factors that improve the quality of design, whereas CMM falls short of addressing maintainability. It should be noted that ISO uses more direct language than CMM to address correctness and verifiability.

Next, we examined the coverage of the quality factors for performance by ISO 9000-3 clauses and CMM process areas. Table 3 demonstrates the mapping of these quality factors.

All factors for the quality of performance are addressed by both ISO 9000-3 and CMM, but the language used is not as clear as that for the quality of design. For example, both methodologies cover efficiency; however, they refer to efficiency of the process rather than efficiency of the product. This raises the question: does improving the efficiency of the process improve the efficiency of the product?

Table 2
Quality of design

Correctness (extent to which the software conforms to its specifications and to its declared objectives)
  ISO standards, 5.9.2: provisions should be made for verifying the correctness and completeness of copies of the software product delivered
  CMM key process area goal, product engineering: consistently performs a well-defined engineering process that integrates all the software engineering activities to produce correct, consistent software products effectively and efficiently

Maintainability (ease of effort for locating and fixing a software failure within a specified time period)
  ISO standards, 5.6.2 (subsequent processes): the product should be designed to the extent practical to facilitate testing, maintenance and use
  CMM key process area goal: not available

Verifiability (ease of effort to verify software features and performance based on its stated objectives)
  ISO standards, 5.4.6: the supplier should draw up a plan for verification of all development phase outputs at the end of each phase
  CMM key process area goal, project tracking and oversight: establishes a common understanding between the customer and software project of the customer's requirements that will be addressed by the software project

Table 3
Quality of performance

Efficiency (extent to which the software is able to do more with less system (hardware, operating system, communications, etc.) resources)
  ISO standards, 5.4.1: the organization of the project resources, including the team structure, responsibilities, use of sub-contractors and material resources to be used
  CMM key process area goal, product engineering: consistently performs a well-defined engineering process that integrates all the software engineering activities to produce correct, consistent software products effectively and efficiently

Integrity (extent to which the software is able to withstand intrusion by unauthorized users or software within a specified time period)
  ISO standards, 5.3.1: ..., the supplier should have a complete, unambiguous set of functional requirements; in addition, these requirements ... include, but are not limited to, the following: performance, safety, reliability, security, and privacy; these requirements should be stated precisely enough so as to allow validation during product acceptance
  CMM key process area goal, configuration management: establishes and maintains the integrity of the products of the software project throughout the project's software life cycle

Reliability (extent to which the software will perform (according to its stated objectives) within a specified time period)
  ISO standards, 5.3.1: ..., the supplier should have a complete, unambiguous set of functional requirements. In addition, these requirements ... include, but are not limited to, the following: performance, safety, reliability, security, and privacy; these requirements should be stated precisely enough so as to allow validation during product acceptance
  CMM key process area goal, project tracking and oversight: establishes a common understanding between the customer and software project of the customer's requirements that will be addressed by the software project

Usability (relative ease of learning and operating the software)
  ISO standards, 6.9: the supplier should establish and maintain procedures for identifying the training needs and provide for training of all personnel performing activities affecting quality
  CMM key process area goal, training program: the purpose of the training program key process area is to develop the skills and knowledge of individuals so they can perform their roles effectively and efficiently

Testability (ease of testing the program to verify that it performs a specified function)
  ISO standards, 5.7: testing may be required at several levels, from the individual software item to the complete software product; there are several different approaches to testing and integration
  CMM key process area goal, peer reviews: the purpose of peer review is to remove defects from the software work products early and efficiently...
Both address usability through training rather than ease of use of the product. Although the ultimate goal is the same, enhancing the ability to use the product via training focuses on the users, whereas ease of use focuses on the product. Table 4 shows the mapping of the quality factors for adaptation to CMM key process areas and ISO standards.

Comparatively, the ISO 9000-3 guidelines provide better coverage. However, the ISO 9000-3 clauses that cover the quality factors for adaptation and performance are not as strong as those for design.

The mapping revealed that the coverage of the quality factors by the two SPI methodologies ranges from explicit to implicit to not covered at all. For example, both ISO and CMM cover the design factors explicitly, except for maintainability. Both cover all of the performance factors, but the level of coverage is mixed. ISO addresses adaptation implicitly, except for portability; CMM does not address the adaptation factors at all, except for flexibility, which is covered only lightly.

4. The empirical study

To find out the consequences of this coverage in practice, and to attempt to assess whether using these methodologies does indeed improve the quality of a piece of software, we embarked upon an empirical study. We targeted developers who have used CMM and ISO and asked them to evaluate the impact of SPI on the quality of the design, performance, and adaptation of their software products. We compared their responses on the basis of the type of SPI methodology they used.

Table 4
Quality of adaptation

Expandability (relative effort required to expand software capabilities and/or performance by enhancing current functions or by adding new functionality)
  ISO standards, 5.10.1.c (functional expansion or performance improvement): functional expansion or performance improvement of existing functions may be required by the purchaser in the maintenance stage
  CMM key process area goal: not available

Flexibility (ease of effort for changing the software's mission, functions or data to meet changing needs and requirements)
  ISO standards, 5.10.7: the supplier and purchaser should agree and document procedures for incorporating changes in a software product resulting from the need to maintain performance
  CMM key process area goal, project tracking and oversight: establishes a common understanding between the customer and software project of the customer's requirements that will be addressed by the software project

Portability (ease of effort to transport software to another environment and/or platform)
  ISO standards: not available
  CMM key process area goal: not available

Reusability (ease of effort to use the software (or its components) in other software systems and applications)
  ISO standards, 5.6.2.c (use of past design experience): utilizing lessons learned from past design experiences, the supplier should avoid recurrence of the same or similar problems
  CMM key process area goal: not available

Interoperability (relative effort needed to couple the software on one platform to other software and/or another platform)
  ISO standards, 6.1.3.1.c: (procedures should be applied to ensure that) all interfaces to other software items and to hardware (can be identified for each version of the software)
  CMM key process area goal: not available

Intra-operability (effort required for communications between components in the same software system)
  ISO standards, 5.7.3.b: any discovered problems and their possible impacts on other parts of the software should be noted and those responsible notified
  CMM key process area goal: not available
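To make the pattern across Tables 2-4 easier to scan, the coverage can be encoded and queried programmatically. The following Python sketch is illustrative only (it is not the author's analysis code); it simply records which factors carry a "not available" entry for each methodology in the tables above.

```python
# Illustrative sketch: encode the coverage reported in Tables 2-4 and list
# the factors each methodology leaves unaddressed ("not available" cells).
COVERAGE = {
    # factor: (covered by ISO 9000-3, covered by CMM)
    "correctness":       (True,  True),
    "maintainability":   (True,  False),
    "verifiability":     (True,  True),
    "efficiency":        (True,  True),
    "integrity":         (True,  True),
    "reliability":       (True,  True),
    "usability":         (True,  True),
    "testability":       (True,  True),
    "expandability":     (True,  False),
    "flexibility":       (True,  True),
    "portability":       (False, False),
    "reusability":       (True,  False),
    "interoperability":  (True,  False),
    "intra-operability": (True,  False),
}

def gaps(column: int) -> list:
    """Factors not covered by the methodology in the given column (0 = ISO, 1 = CMM)."""
    return [factor for factor, covered in COVERAGE.items() if not covered[column]]

print("ISO 9000-3 gaps:", gaps(0))  # ['portability']
print("CMM gaps:", gaps(1))         # maintainability plus five adaptation factors
```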

We selected four professional organizations in New England and distributed our survey to their members: the SPI Network (SPIN), the Association for Computing Machinery (ACM) Boston Chapter, the American Society for Quality (ASQ) of Massachusetts, and the New England Software Quality Assurance Forum (NESQAF). The method of data gathering was that of semi-structured interviews: the surveys were handed out in person, and questions were answered as they were raised. We approached more than 200 members of these organizations and screened out those who were not part of a software-development team. Our findings and analysis are based on the 67 responses that we collected.

The first part of the questionnaire was designed to identify the users and non-users of SPI methodologies. We separated projects that were developed using an SPI methodology from those that were not. Table 5 depicts this distribution.

Table 5
Percentage of respondents using an SPI methodology

              ISO     CMM     ISO & CMM     Other     Total
  SPI         16%     10%     12%           22%       60%
  Non-SPI      -       -       -             -        40%

In the second part of the survey, we asked questions about the impact of SPI methodologies on quality; for design, this is determined by correctness, maintainability, and verifiability. To analyze the impact of the SPI methodologies, we asked the respondents to rank their perceived impact from very low = 1, low = 2, average = 3, high = 4, to very high = 5. Fig. 1 shows histograms indicating the proportion of developers who perceived the impact of the methodology they use on the design factors, from very low to very high.

Fig. 1. Impact of software process improvement methodology on the quality of design.

Overall, this shows that our respondents perceived the impact of using ISO on the design factors to be much stronger than that of CMM. They reported, however, that when companies adopt both ISO and CMM, a higher level of correctness, maintainability, and verifiability is achieved.

Histograms provide a graphical description of our respondents' perceptions; however, they do not tell the whole story. For example, while the mode shows the highest frequency for a level of perception, it does not measure variability. Numerical descriptive measures, such as the mean and standard deviation, provide a measure of central tendency and of variability, respectively, hence providing additional insight for comparing the methodologies. Low variability indicates agreement between respondents on the level of impact and manifests predictability, which is an important factor in selecting a methodology. Table 6 shows the average and standard deviation for the design factors for each group.

Table 6
Mean and standard deviations for design quality factors

                Correctness      Maintainability    Verifiability
  ISO           3.64 (1.21)      3.27 (1.19)        3.50 (1.27)
  CMM           3.44 (1.88)      3.33 (2.12)        3.00 (2.12)
  ISO and CMM   3.67 (1.03)      3.67 (1.51)        3.33 (1.21)
  Other         3.50 (0.80)      3.18 (0.98)        3.25 (0.87)

Values are means, with standard deviations in parentheses.

The table suggests that the mean values for the four groups are all above average (3) and quite close. The standard deviations provide additional information regarding the consistency of the perception among the users of a methodology. The largest variability was detected among those who use only CMM, whereas those who use other, mostly "home grown," methodologies were cohesive in their perception of the impact.

Fig. 1 and Table 6 are consistent with our mapping for ISO, whose requirements explicitly address the issues of correctness, maintainability, and verifiability. The empirical study indicates that most of the respondents thought that ISO had a high impact on these factors. The results, however, are somewhat inconsistent for CMM: although it covers correctness and verifiability but not maintainability, the responses for all three are moderate and close.

Using both techniques is considered the best. Obviously, deploying both ISO 9000-3 and CMM is costly and time consuming, and only companies that are truly committed to improving the development process would use both. In practice, the developers of mission-critical software, where correctness is of crucial importance, use both methodologies [32]. In our study, 12% of the respondents reported using both.

Many companies that cannot afford to deploy both techniques should select the methodology that meets their requirements. Our study suggests that using ISO would have a greater impact on the quality of design than using CMM or other methodologies.

The quality of a software product depends upon good design and good performance. The latter is measured by the efficiency, integrity, reliability, testability, and usability of a software product. Fig. 2 demonstrates our respondents' perspectives on the impact of SPI on the performance quality factors.

Our data for performance are not as clear-cut as those for design. Using both ISO and CMM has the greatest impact on reliability and testability, which are two related factors. It seems that the other methodologies, combined, showed the highest impact on efficiency, and that usability can be improved using ISO alone. It is not clear which methodology has the highest impact on integrity. Table 7 depicts the mean and standard deviation for each factor and clarifies some of the ambiguity in Fig. 2. While the histograms did not indicate which methodology has the highest impact on integrity, Table 7 shows it to be ISO and CMM, thus indicating that the combination is the best.
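The figures in Tables 6-8 are ordinary descriptive statistics over the 1-5 ratings. As a minimal illustration of that computation, the Python sketch below groups ratings by methodology and reports the mean and sample standard deviation; the ratings shown are invented for illustration and are not the survey data.

```python
# Minimal sketch of the descriptive statistics behind Tables 6-8: mean and
# standard deviation of 1-5 impact ratings, grouped by SPI methodology.
# The ratings below are made up for illustration, not the study's data.
from statistics import mean, stdev

ratings = {  # methodology -> perceived impact on one factor (1 = very low ... 5 = very high)
    "ISO":         [4, 3, 5, 2, 4, 4],
    "CMM":         [5, 1, 4, 2, 5],
    "ISO and CMM": [4, 3, 4, 4],
    "Other":       [3, 4, 3, 4],
}

for methodology, scores in ratings.items():
    print(f"{methodology:12s} mean = {mean(scores):.2f}  s.d. = {stdev(scores):.2f}")
```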

Table 7
Mean and standard deviations for performance quality factors

                Efficiency     Integrity      Reliability    Usability      Testability
  ISO           2.64 (1.12)    2.64 (0.81)    3.64 (1.03)    3.73 (0.90)    3.70 (1.06)
  CMM           2.75 (2.05)    2.75 (2.19)    3.67 (2.00)    3.33 (2.00)    3.78 (1.99)
  ISO and CMM   2.83 (0.75)    3.17 (0.98)    4.00 (1.10)    2.50 (1.22)    3.67 (1.51)
  Other         3.18 (1.17)    3.00 (1.18)    3.58 (1.08)    3.42 (1.31)    3.67 (1.30)

Values are means, with standard deviations in parentheses.

Fig. 2. Impact of software process improvement methodology on the quality of performance.



Fig. 3. Impact of software process improvement methodology on the quality of adaptation.

As we pointed out earlier, the performance quality factors are not addressed as strongly as those for design, except for reliability and testability. Table 7 shows mean values mostly below average (3) for efficiency and integrity, above average for reliability and testability, and mixed for usability. In Table 3, the language used by ISO to address efficiency is not very strong, and Fig. 2 indicates that the percentages of respondents who perceived its impact as high or very high are 27 and 0%, respectively.

Table 8
Mean and standard deviations for adaptation quality factors

                Expandability   Flexibility    Portability    Reusability    Interoperability   Intra-operability
  ISO           3.60 (0.97)     3.50 (1.08)    3.10 (1.60)    3.50 (1.08)    3.20 (1.23)        4.10 (0.99)
  CMM           4.00 (1.87)     3.00 (1.93)    3.13 (2.03)    3.13 (2.10)    3.00 (1.77)        3.33 (2.18)
  ISO and CMM   3.80 (1.10)     3.40 (1.14)    3.60 (1.34)    2.80 (0.84)    3.40 (0.89)        3.40 (1.14)
  Other         2.80 (1.14)     3.40 (1.17)    2.80 (1.55)    3.20 (1.40)    3.30 (1.25)        3.55 (1.29)

Values are means, with standard deviations in parentheses.

Fig. 4. Decision tree for selecting SPI methodology.
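The figure itself is not reproduced in this text version. As a rough, speculative sketch of the kind of selection logic such a decision tree encodes, the Python function below follows the recommendations discussed in Section 4.1 of this paper; the rules and names are illustrative, not a transcription of the figure.

```python
# Speculative sketch of the selection logic suggested by Fig. 4 and the
# recommendations in Section 4.1; illustrative only, not the author's figure.
def recommend_spi(primary_concern: str) -> str:
    """Map a dominant quality concern to a suggested SPI approach."""
    if primary_concern in {"correctness", "maintainability", "verifiability", "usability"}:
        return "ISO 9000-3"
    if primary_concern in {"integrity", "reliability", "testability"}:
        return "ISO 9000-3 and CMM"
    if primary_concern == "efficiency":
        return "other (e.g. home-grown) methodology"
    return "no clear-cut recommendation (adaptation factors)"

print(recommend_spi("reliability"))  # -> ISO 9000-3 and CMM
```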



Also, CMM apparently covers efficiency only lightly, and its histogram shows that respondents are not impressed by its impact on efficiency. The histograms in Fig. 3 depict the reported impact on the quality-of-adaptation factors; it was quite uneven.

Altogether, the histograms do not provide a clear-cut distinction between the methodologies in their impact on the adaptation factors. Table 8 shows the means and standard deviations for these factors.

Table 8 indicates that using ISO, CMM, or both has a greater impact on most adaptation factors. This contradicts our mapping results, which showed that CMM does not cover expandability, portability, reusability, interoperability, and intra-operability, and covers flexibility only lightly. According to the respondents, ISO ranks high in its impact on all adaptation factors, even though it does not cover portability at all and its coverage of the rest is quite light and indirect.

4.1. The results

Our empirical data basically support the theoretical framework and indicate that process improvement enhances software quality factors; however, the level of impact on the various quality factors differs depending on which methodology is used. To put our overall findings in perspective, we created a decision tree that organizations could use as a guideline for selecting an SPI methodology that meets their quality requirements. Fig. 4 shows this decision tree.

Real-time software products, where performance is a critical factor, include reservation and trading software applications. Organizations involved in developing this type of software product should use ISO if their primary concern is usability; use ISO and CMM if integrity, reliability, and testability are the most important quality factors for them; and use other methodologies if efficiency is the primary concern.

The impact on the adaptation quality factors is the most difficult to analyze. The histograms did not provide a clear-cut selection for those organizations with adaptation factors as quality requirements. Moreover, while the empirical findings more or less supported the results of the theoretical investigation for the design and performance factors, the results for the adaptation factors mostly contradicted the findings of our mapping.

The intrinsic characteristics of CMM and ISO 9000 could explain their shortcomings here. Both methodologies primarily deal with the status quo and try to improve the existing development process rather than engaging in innovation, which ultimately requires the adaptation quality factors. Their guidelines reflect their philosophy. Hence, one would believe that for organizations involved in developing software products that need to adapt to constant change, "home grown" SPI methodologies are the best strategy. However, our empirical findings did not support this idea.

The majority of our respondents ranked the impact of using ISO and CMM together first for those quality factors that address technical aspects of software development (correctness, maintainability, verifiability, reliability, and testability). On the other hand, improving managerial aspects (efficiency, usability, flexibility, reusability, and intra-operability) did not require using both ISO and CMM. This seems counterintuitive, because ISO and CMM both address managerial issues extensively.

The CMM users consistently showed a wide discrepancy in their perceptions of CMM's impact on design, performance, and adaptation factors. One possible reason is that the respondents belonged to firms at different levels of CMM maturity.

5. Conclusions and limitations

The key to survival is to develop and market quality products, on time and within budget. In this study we examined SPI methodologies as they relate to quality guidelines. We cross-referenced quality factors to ISO 9000-3 clauses and key process areas of the CMM to learn whether SPI methodologies deal with SQA guidelines as part of their process improvement strategy. We conducted an empirical study to find the extent of the impact of SPI methodologies on quality factors in practice. Based on our findings, we built a decision tree, which could be used as an instrument for methodology selection.

Our study produced some inconclusive results, including, for example, the inconsistency between the theoretical framework and the empirical results for the adaptation factors, the wide variability among CMM users, and the possible biases inherent in the perceptions of technical people regarding managerial issues. This research has two limitations:

1. The statistical analysis is descriptive rather than inferential. We chose descriptive statistics because they do not require a scientific sampling procedure. Although our sample size is large enough for inferential statistics, our primary concern was the integrity and accuracy of our responses; therefore, rather than random sampling, we concentrated on developers whom we could meet face-to-face and whose opinions and perceptions of SPI methodologies we could gauge.

2. The ISO 9000-3 clauses and the CMM key process areas are intentionally vague to allow organizations to interpret them according to their own quality system. Our interpretation of their strengths and weaknesses as they relate to quality factors is also subjective and may differ from that of others.

References

[1] S.R. Bailey, A.J. Donnell, R.M. Hough, Process control (ISO 9000) and process improvement (APEX) in the Transmission Systems Business Unit, AT&T Technical Journal (1994) 17–25.
[2] R.C. Bamford, W.J. Deibler, When the CMM meets ISO 9001, CrossTalk: The Journal of Defense Software Engineering (1998) 3–8.
[3] R.C. Bamford, W.J. Deibler, Comparing and contrasting ISO 9001 and the SEI capability maturity model, IEEE Computer 26 (10), 1993, pp. 68–70.
[4] CMU/SEI-93-TR-25, Key Practices of the Capability Maturity Model, Version 1.1, Carnegie Mellon University, Pittsburgh, 1993, pp. O-1–O-11.
[5] W.E. Deming, Quality, Productivity, and Competitive Position, MIT Center for Advanced Engineering Study, Cambridge, MA, 1982.
[6] K.E. Emam, A. Birk, Validating the ISO/IEC 15504 measure of software requirements analysis process capability, IEEE Transactions on Software Engineering 26 (6), 2000, pp. 541–566.
[7] M.W. Evans, J. Marciniak, Software Quality Assurance and Management, Wiley, New York, NY, 1987.
[8] B. Fazlollahi, M.R. Tanniru, Selecting a requirement determination methodology: contingency approach revisited, Information and Management 21, 1991, pp. 291–303.
[9] R. Fournier, Practical Guide to Structured System Development and Maintenance, Prentice-Hall, Englewood Cliffs, NJ, 1991, Chapter 13, pp. 316–322.
[10] D. Garvin, What does "product quality" really mean?, Sloan Management Review, 1998, pp. 25–45.
[11] T. Gilb, Principles of Software Engineering Management, Addison-Wesley, Reading, MA, 1987.
[12] D.H. Harter, M.S. Krishan, S.A. Slaughter, Effects of maturity on quality, cycle time, and effort in software product development, Management Science 46 (4), 2000, pp. 451–466.
[13] J. Herbsleb, D. Zubrow, D. Goldenson, W. Hayes, M. Paulk, Software quality and the capability maturity model, Communications of the ACM 40 (6), 1997, pp. 30–40.
[14] W.S. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989.
[15] J. Ingalbe, D. Shoemaker, V. Jovanovic, A metamodel for the capability maturity model for software, in: Proceedings of the AIS, August 2001.
[16] C. Jones, The economics of software process improvement, IEEE Computer 29 (1), 1996, pp. 95–97.
[17] J.M. Juran, F.M. Gryna, Quality Planning and Analysis, McGraw-Hill, New York, 1993.
[18] R. Kehoe, A. Jarvis, ISO 9000-3: A Tool for Software Product and Process Improvement, New York, 1996.
[19] B. Kitchenham, S.L. Pfleeger, Software quality: the elusive target, IEEE Software 13 (1), 1996, pp. 12–21.
[20] D.H. Kitson, Relating the SPICE framework and SEI approach to software process assessment, Software Quality Journal 5 (3), 1996, pp. 145–156.
[21] M.S. Krishan, C.H. Kriebel, S. Keker, T. Mukhopadhyay, An empirical analysis of productivity and quality in software products, Management Science 46 (6), 2000, pp. 754–759.
[22] M.S. Krishan, M.I. Keller, Measuring process consistency: implications for reducing software defects, IEEE Transactions on Software Engineering 25 (6), 1999, pp. 800–815.
[23] J.A. McCall, et al., Factors in Software Quality, RADC-TR-77-369, Rome Air Development Center, Griffiss Air Force Base, NY, November 1977, pp. 1–3.
[24] L. Osterweil, et al., Strategic directions in software quality, ACM Computing Surveys 28 (4), 1996, pp. 738–750.
[25] M. Paulk, How ISO 9001 compares with the CMM, IEEE Software 12 (1), 1995, pp. 74–83.
[26] M.J. Pearson, C.S. McCahon, R.T. Hightower, Total quality management: are information systems managers ready?, Information and Management 29, 1995, pp. 251–263.
[27] W.E. Perry, Quality Assurance for Information Systems: Methods, Tools, and Techniques, QED Technical Publishing Company, Wellesley, MA, 1991.
[28] S.L. Pfleeger, Does organizational maturity improve quality?, IEEE Software 13 (5), 1996, pp. 109–110.
[29] D.D. Phan, J.F. George, D.R. Vogel, Managing software quality in a very large development project, Information and Management 29, 1995, pp. 277–283.
[30] R.S. Pressman, Software Engineering: A Beginner's Guide, McGraw-Hill, New York, 1988.
[31] R.S. Pressman, Software process perceptions, IEEE Software 13 (6), 1996, pp. 16–18.
[32] H. Saiedian, L. McClanahan, Frameworks for quality software process: SEI capability maturity model versus ISO 9000, Software Quality Journal 5, 1996, pp. 1–23.
[33] S.A. Sheard, The frameworks quagmire, CrossTalk: The Journal of Defense Software Engineering (1997) 1–8.

[34] G.J. van der Pijl, G.J.P. Swinkels, J.G. Verrijdt, ISO 9000 versus CMM: standardization and certification of IS development, Information and Management 32 (6), 1997, pp. 267–274.

Noushin Ashrafi is an associate professor in the Management Science and Information Systems Department, College of Management, at the University of Massachusetts Boston. Dr. Ashrafi received her PhD in management and information systems from the University of Texas at Arlington. Her extensive publications on software reliability, software process improvement, technology and society, and the application of mathematical models to assess fault-tolerant software have appeared in journals such as IEEE Transactions on Software Engineering, IEEE Transactions on Software Reliability, IEEE Computer Society, Information and Management, The Journal of Information Technology Management, Information and Software Technology, Information Systems Management, Computers and Society, and Journal of Database Management. Dr. Ashrafi has made numerous presentations on various topics in information systems at national and international conferences.
