
Software Project Management
4th Edition
Chapter 5: Software effort estimation

• A successful project is one delivered on time, within budget and with the
  required quality. This implies that targets are set which the project
  manager then tries to meet.
• Realistic estimates are therefore crucial; an incorrect initial estimate
  is a problem.
• As a project proceeds, the accuracy of the estimates should improve as
  knowledge increases.

What makes a successful project?
Delivering:
• agreed functionality
• on time
• at the agreed cost
• with the required quality
Stages:
1. set targets
2. attempt to achieve targets

BUT what if the targets are not achievable?


Over- and under-estimating
• Parkinson's Law: 'Work expands to fill the time available' - that is,
  given an easy target, staff will work less hard.
• An over-estimate is likely to cause the project to take longer than it
  would otherwise.
• Weinberg's Zeroth Law of Reliability: 'a software project that does not
  have to meet a reliability requirement can meet any other requirement.'
continued
• An over-estimate of the effort could lead to more staff being allocated
  and to increased managerial overhead.
• An under-estimated project might not be completed on time or to cost,
  but it might still be delivered in a shorter time than a project with a
  more generous estimate.
• The danger with an under-estimate is the effect on quality.
A taxonomy of estimating methods
• Bottom-up - activity-based, analytical: component tasks are identified
  and sized, and these individual estimates are aggregated.
• Parametric or algorithmic models, e.g. function points: use
  characteristics of the target system and its environment to predict effort.
• Expert opinion - just guessing, based on the advice of knowledgeable staff.
• Analogy - case-based, comparative: a similar completed project is
  identified and its actual effort is used as the basis for the estimate.

continued
• Parkinson and 'price to win'
  - where the staff effort available to do the project becomes the estimate
  - where the estimate is a figure that seems low enough to win the contract
• Top-down: where an overall estimate for the whole project is broken down
  into the effort required for component tasks.
Bottom-up versus top-down
• A project manager will probably try to get a number of different estimates
  from different people using different methods.
• Bottom-up: calculate the effort for each activity and aggregate to get an
  overall estimate (useful where the project is novel).
• Bottom-up
  – use when you have no data about similar past projects
  – identify all tasks that have to be done – so quite time-consuming
• Top-down
  – produce an overall estimate based on project cost drivers
  – based on past project data
  – divide the overall estimate between the jobs to be done
• Having calculated the overall effort, the problem is then to allocate
  proportions of that effort to the various activities.
Bottom-up estimating
1. Break project into smaller and smaller
components
[2. Stop when you get to what one person
can do in one/two weeks]
3. Estimate costs for the lowest level
activities
4. At each higher level calculate estimate
by adding estimates for lower levels
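A minimal sketch of the aggregation step in Python, assuming the work
breakdown is represented as nested dictionaries with leaf activities holding
person-day estimates (the structure and figures are illustrative, not from
the text):

```python
def aggregate(node):
    """Sum effort up the work breakdown: leaves are person-day estimates."""
    if isinstance(node, (int, float)):  # lowest-level activity
        return node
    return sum(aggregate(child) for child in node.values())

# Illustrative work breakdown structure
wbs = {
    "design": {"data model": 5, "screen layouts": 8},
    "code":   {"module A": 10, "module B": 7},
    "test":   {"test plan": 3, "system test": 9},
}

print(aggregate(wbs))  # 42 person-days for the whole project
```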
Top-down estimates
• Produce an overall project estimate using effort driver(s),
  e.g. 100 days overall.
• Distribute proportions of the overall estimate to the components,
  e.g. design 30% (30 days), code 30% (30 days), test 40% (40 days).
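A minimal sketch of the distribution step, using the 100-day overall
estimate and the 30/30/40 split from the example above:

```python
# Split an overall top-down estimate across phases by fixed proportions.
overall_days = 100
proportions = {"design": 0.30, "code": 0.30, "test": 0.40}

for phase, share in proportions.items():
    print(f"{phase}: {share * overall_days:.0f} days")
# design: 30 days, code: 30 days, test: 40 days
```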

Algorithmic/Parametric models
• COCOMO: COnstructive COst MOdel
• A model to forecast software development effort has two key components:
  1. a method of assessing the amount of work needed
  2. a method of assessing the rate at which the work can be done
• Parametric models:
  1. focus on task size, e.g. function points
  2. focus on productivity factors, e.g. COCOMO
Algorithmic/Parametric models
• COCOMO (lines of code) and function points are examples of these
• Problem with COCOMO etc.:
  guess -> algorithm -> estimate
  but what is desired is
  system characteristic -> algorithm -> estimate
Parametric models - continued
• Examples of system characteristics:
  – no. of screens x 4 hours
  – no. of reports x 2 days
  – no. of entity types x 2 days
• The quantitative relationship between the input and output products of a
  process can be used as the basis of a parametric model.

Parametric models - the need for historical data
• simplistic model for an estimate:
  estimated effort = (system size) / productivity
  e.g. system size = lines of code,
       productivity = lines of code per day
• productivity = (system size) / effort
  – based on past projects
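A worked sketch of this simplistic model; the size and effort figures are
invented purely for illustration:

```python
# Calibrate productivity from a past project, then apply it to a new system.
past_size_loc = 12_000      # lines of code of a completed project
past_effort_days = 300      # effort that project actually took

productivity = past_size_loc / past_effort_days    # 40 LOC per day

new_size_loc = 20_000       # estimated size of the new system
estimated_effort_days = new_size_loc / productivity
print(estimated_effort_days)  # 500.0 person-days
```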

Parametric models
• Some models focus on task or system size, e.g. Function Points
• FPs were originally used to estimate lines of code, rather than effort
• The model takes the number of file types and the numbers of input and
  output transaction types as inputs, and produces a measure of 'system size'
Parametric models
• Other models focus on productivity, e.g. COCOMO
• Lines of code (or FPs etc.) are an input
• The model combines system size with productivity factors to produce the
  estimated effort
Function points Mark II
• Developed by Charles R. Symons
• 'Software sizing and estimating - Mk II FPA', Wiley & Sons, 1991
• Builds on work by Albrecht
• Work originally done for CCTA:
  – should be compatible with SSADM; mainly used in UK
• Has developed in parallel to IFPUG FPs
Function points Mk II continued
For each transaction, count:
 – the data items input (Ni)
 – the data items output (No)
 – the entity types accessed (Ne)

FP count = Ni * 0.58 + Ne * 1.66 + No * 0.26
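A short sketch applying the Mk II formula across several transactions; the
transaction counts are invented for illustration, while the 0.58 / 1.66 /
0.26 weights are the ones given above:

```python
WEIGHT_INPUT, WEIGHT_ENTITY, WEIGHT_OUTPUT = 0.58, 1.66, 0.26

transactions = [
    # (data items input Ni, entity types accessed Ne, data items output No)
    (10, 2, 6),
    (4, 1, 12),
]

# Sum the weighted counts over all transactions to get the system FP count.
fp_count = sum(ni * WEIGHT_INPUT + ne * WEIGHT_ENTITY + no * WEIGHT_OUTPUT
               for ni, ne, no in transactions)
print(round(fp_count, 2))  # 17.78
```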
Function points for embedded systems
• Mark II and IFPUG function points were designed for information systems
  environments
• COSMIC FPs attempt to extend the concept to embedded systems
• Embedded software is seen as being in a particular 'layer' in the system
• It communicates with other layers and also with other components at the
  same level

Layered software
A software component sits in a particular layer. It receives requests for
services from higher layers and supplies services to them; it makes requests
to lower layers and receives their services; it reads from and writes to
persistent storage; and it communicates peer-to-peer with components in the
same layer.

COSMIC FPs
The following are counted:
• Entries: movement of data into software
component from a higher layer or a peer
component
• Exits: movements of data out
• Reads: data movement from persistent storage
• Writes: data movement to persistent storage
Each counts as 1 ‘COSMIC functional size unit’
(Cfsu)
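A minimal sketch of the counting rule, each identified data movement
contributing one Cfsu; the movement tallies are invented for illustration:

```python
# Each Entry, Exit, Read and Write counts as 1 COSMIC functional size unit.
data_movements = {"entries": 7, "exits": 5, "reads": 4, "writes": 3}

cosmic_size_cfsu = sum(data_movements.values())
print(cosmic_size_cfsu)  # 19 Cfsu
```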

COCOMO81
• Based on industry productivity standards - the database is constantly
  updated
  - database: contains the performance details of executed projects
• Allows an organization to benchmark its software development productivity
• Basic model:
  effort = c x (size)^k
• Effort is measured in PM (person-months)
• c and k depend on the type of system: organic, semi-detached, embedded
• Size is measured in kloc, i.e. thousands of lines of code
The COCOMO constants

System type                              c     k
Organic (broadly, information systems)   2.4   1.05
Semi-detached                            3.0   1.12
Embedded (broadly, real-time)            3.6   1.20

k is an exponent ('to the power of...'): it adds disproportionately more
effort to larger projects, taking account of bigger management overheads.

The COCOMO constants
• Boehm found that larger projects tended to be less productive than small
  ones because they needed more effort for management and coordination, e.g.
  effort = 2.4 x 5^1.05 ≈ 13.0 person-months (organic, 5 kloc)
  effort = 3.6 x 10^1.20 ≈ 57.1 person-months (embedded, 10 kloc)
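A minimal sketch of the basic model, reproducing the two worked figures
above with the constants from the table:

```python
# Basic COCOMO81: effort (person-months) = c * size**k, size in kloc.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def basic_cocomo_effort(kloc, system_type):
    c, k = COCOMO_CONSTANTS[system_type]
    return c * kloc ** k

print(round(basic_cocomo_effort(5, "organic"), 1))    # ~13.0 person-months
print(round(basic_cocomo_effort(10, "embedded"), 1))  # ~57.1 person-months
```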
Development effort multipliers
(dem)
According to COCOMO, the major productivity
drivers include:
Product attributes: required reliability, database
size, product complexity
Computer attributes: execution time constraints,
storage constraints, virtual machine (VM) volatility
Personnel attributes: analyst capability,
application experience, VM experience,
programming language experience
Project attributes: modern programming
practices, software tools, schedule constraints
Using COCOMO development effort multipliers (dem)
An example, for analyst capability:
• Assess capability as very low, low, nominal, high or very high
• Extract the multiplier:
  very low   1.46
  low        1.19
  nominal    1.00
  high       0.80
  very high  0.71
• Adjust the nominal estimate, e.g. 32.6 x 0.80 = 26.1 staff-months
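A minimal sketch of applying the multiplier, using the analyst-capability
values and the 32.6 staff-month nominal estimate from the example above:

```python
# Adjust a nominal COCOMO estimate by a development effort multiplier.
ACAP_MULTIPLIERS = {
    "very low": 1.46, "low": 1.19, "nominal": 1.00,
    "high": 0.80, "very high": 0.71,
}

nominal_estimate = 32.6                        # staff-months
adjusted = nominal_estimate * ACAP_MULTIPLIERS["high"]
print(round(adjusted, 1))                      # ~26.1 staff-months
```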

Estimating by analogy
• Source cases: completed projects, each with known attribute values and
  actual effort
• Target case: the new project, with known attribute values but unknown
  effort
• Select the source case with the closest attribute values and use its
  effort as the basis for the estimate



• Also called case-based reasoning.
• The estimator identifies completed projects (source cases) with similar
  characteristics to the new project (the target case).

Stages: identify
• Significant features of the current
project
• previous project(s) with similar features
• differences between the current and
previous projects
• possible reasons for error (risk)
• measures to reduce uncertainty

Machine assistance for source selection (ANGEL)
• Source cases and the target are plotted by attribute values, e.g. number
  of inputs and number of outputs
• The source closest to the target is selected, using

  Euclidean distance = sqrt((It - Is)^2 + (Ot - Os)^2)

  where It, Ot are the target's numbers of inputs and outputs and Is, Os
  are a source's
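A minimal sketch of source selection in the spirit of ANGEL, with invented
project data; the closest source's actual effort would then be used as the
basis for the estimate:

```python
from math import sqrt

sources = {
    # name: ((number of inputs, number of outputs), actual effort in days)
    "Source A": ((20, 35), 120),
    "Source B": ((45, 60), 310),
}
target = (25, 40)  # attribute values of the new project

def distance(case, tgt):
    """Euclidean distance between two (inputs, outputs) points."""
    return sqrt((tgt[0] - case[0]) ** 2 + (tgt[1] - case[1]) ** 2)

closest = min(sources, key=lambda name: distance(sources[name][0], target))
print(closest, "- effort used as basis:", sources[closest][1], "days")
```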


Some conclusions: how to
review estimates
Ask the following questions about an estimate
• What are the task size drivers?
• What productivity rates have been used?
• Is there an example of a previous project of
about the same size?
• Are there examples of where the productivity
rates used have actually been found?

Conclusion
• Use more than one method of estimating.
• Collect as much information as possible from previous projects.
• Seek a range of opinions.
• Document your method of doing estimates and record all your assumptions.
• Be careful about using historical productivity data from elsewhere,
  especially if it comes from a different environment.
