
Decision Making Tools

50 Management Tools

S P Jain School of Global Management, Dubai-Mumbai-Singapore-Sydney - For internal circulation only

CONTENTS

MARKETING
1. PRODUCT / MARKET GRID (ANSOFF BUSINESS UNIT STRATEGY MODEL)
2. BRAND ASSET VALUATOR
3. 3 CS
4. MARKETING MIX
5. PRODUCT LIFE CYCLE
6. GE / MCKINSEY MATRIX
7. SALES MIX

IT
8. NOLAN'S IT GROWTH STAGES
9. ENTERPRISE ARCHITECTURE
10. INNOVATION ADOPTION CURVE
11. PROJECT MANAGEMENT LIFE CYCLE
12. RASCI MODEL
13. SYSTEMS THINKING / DYNAMICS
14. STRATEGIC ALIGNMENT OF BUSINESS WITH IT

FINANCE
15. CAPITAL ASSET PRICING MODEL
16. DISCOUNTED CASH FLOW
17. VALUATION MODELS
18. RISK / RETURN TRADE-OFF
19. CFROI AND TSR MODELS
20. PLAUSIBILITY THEORY FOR UNKNOWN RISKS

ORG / PEOPLE / HR
21. MINTZBERG CONFIGURATION
22. KEPNER-TREGOE MATRIX DECISION MAKING MODEL
23. BELBIN'S TEAM ROLES
24. GREINER'S GROWTH MODEL
25. CHANGE BEHAVIOUR
26. MASLOW'S HIERARCHY OF NEEDS
27. EFQM MODEL
28. PATH-GOAL THEORY LEADERSHIP MODEL

SCM / IT / OPS MANAGEMENT
29. BENCHMARKING
30. TOYOTA'S TOTAL PRODUCTION SYSTEM / JIT
31. DEMING CYCLE
32. MICHAEL PORTER'S VALUE CHAIN
33. TQM: SIX SIGMA, KAIZEN
34. EXPERIENCE CURVE
35. CRISIS MANAGEMENT

STRATEGY
36. SEPT / PEST
37. SWOT / TOWS
38. BALANCED SCORECARD
39. BCG MATRIX
40. PORTER'S FIVE FORCES MODEL
41. STRATPORT PADSS

ACCOUNTING
42. ACTIVITY BASED COSTING
43. ECONOMIC VALUE ADDED / BREAKEVEN ANALYSIS
44. RATIO ANALYSIS / Z-SCORE
45. VARIANCE ANALYSIS

ECONOMICS
46. FORECASTING
47. PRICING MODEL
48. STOCHASTIC MODEL
49. SCENARIO PLANNING
50. HARROD-DOMAR MODEL


Marketing
1. Product / Market Grid Ansoff business unit strategy model
The product/market grid of Ansoff is a model that has proven to be very useful in
business unit strategy processes to determine business growth opportunities. The
product/market grid has two dimensions: products and markets.
Across these two dimensions, four growth strategies can be formed (a minimal classification sketch follows this list):
- market penetration,
- market development,
- product development, and
- diversification.
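For readers who prefer a compact restatement, here is a minimal sketch in Python of the grid logic; the boolean flags and function name are illustrative, not part of Ansoff's model:

    # Minimal sketch of the Ansoff product/market grid (illustrative only).
    def ansoff_strategy(new_product: bool, new_market: bool) -> str:
        if not new_product and not new_market:
            return "market penetration"    # existing product, existing market
        if not new_product and new_market:
            return "market development"    # existing product, new market
        if new_product and not new_market:
            return "product development"   # new product, existing market
        return "diversification"           # new product, new market

    # Example: taking an existing product into a foreign market
    print(ansoff_strategy(new_product=False, new_market=True))  # -> market development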
Market Penetration:
Company strategies based on market penetration normally focus on changing incidental clients into regular clients, and regular clients into heavy clients. Typical tools are volume discounts, bonus cards and customer relationship management.
Market Development:
Company strategies based on market development often try to lure clients away from
competitors or introduce existing products in foreign markets or introduce new brand names
in a market.
Product Development:
Company strategies based on product development often try to sell other products to
(regular) clients. This can be accessories, add-ons, or completely new products. Often
existing communication channels are leveraged.
Diversification:
Company strategies based on diversification are the riskiest type of strategy. Often there is a credibility focus in the communication to explain why the company enters new markets with new products. This fourth quadrant (diversification) of the product/market grid can be further split into four types:
- horizontal diversification (new product, current market)
- vertical diversification (move into firms supplier's or customer's business)
- concentric diversification (new product closely related to current product in new market)
- conglomerate diversification (new product in new market).
Although decades old, the product/market grid of Ansoff remains a valuable model for communicating about business unit strategy processes and business growth.


2. Brand Asset Valuator


The Brand Asset Valuator measures Brand Value by applying four broad factors:
1. Differentiation - Differentiation is the ability of a brand to stand apart from its
competitors. A brand should be as unique as possible. Brand health is built and maintained
by offering a set of differentiating promises to consumers and delivering those promises to
leverage value.
2. Relevance - Relevance is the actual and perceived importance of the brand to a large
consumer market segment. This gauges the personal appropriateness of a brand to
consumers and is strongly tied to household penetration (the percentage of households that
purchase the brand).
3. Esteem - Esteem is the perceived quality and consumer perceptions about the growing or
declining popularity of a brand. Does the brand keep its promises? The consumer's
response to a marketer's brand-building activity is driven by his perception of two factors:
quality and popularity, both of which vary by country and culture.
4. Knowledge - Knowledge is the extent of the consumer's awareness of the brand and understanding of its identity. The awareness levels about the brand and what it stands for show the intimacy that consumers share with the brand. True knowledge of the brand
comes through brand-building.
Differentiation and Relevance taken together say a lot about a brand's growth potential ("Brand
Vitality"), while Esteem and Knowledge determine the current power of a brand ("Brand
Stature").
A survey based on the Brand Asset Valuator is conducted annually, containing data about 20,000 brands, based on the opinions of over 230,000 respondents in 44 countries.


3. 3 Cs
The 3C's model (three C's framework) of Kenichi Ohmae, a famous Japanese strategy
guru, stresses that a strategist should focus on three key factors for success. "In the
construction of any business strategy, three main players must be taken into account:

- the corporation itself,
- the customer, and
- the competition."
Only by integrating the three C's (Customer, Competitor, and Company) into a strategic triangle can sustained competitive advantage exist. He refers to these key factors as the three C's or the strategic triangle.
3C's model: Customer-based strategies are the basis of all strategy. ..."There is no doubt
that a corporation's foremost concern ought to be the interest of its customers rather than
that of its stockholders and other parties. In the long run, the corporation that is genuinely
interested in its customers is the one that will be interesting to investors".
3C's framework: Corporate-based strategies. They aim to maximize the corporation's
strengths relative to the competition in the functional areas that are critical to success in the
industry.
3 C's model: Competitor-based strategies according to Kenichi Ohmae can be
constructed by looking at possible sources of differentiation in functions ranging from
purchasing, design, and engineering to sales and servicing.
The power of an image:
Both Sony and Honda outsell their competitors as they invested more heavily in public
relations and promotion and managed these functions more carefully than did their
competitors. When product performance and mode of distribution are very difficult to
differentiate, image may be the only source of positive differentiation. But as the case of the
Swiss watch industry reminds us, a strategy built on image can be risky and must be
monitored constantly.
Capitalizing on profit- and cost-structure differences:
Firstly, the difference in the source of profit might be exploited, e.g. profit from new product sales versus profit from services. Secondly, a difference in the ratio of fixed cost to variable cost might also be exploited strategically; e.g. a company with a lower fixed cost ratio can
lower prices in a sluggish market and win market share. This hurts the company with a
higher fixed cost ratio as the market price is too low to justify its high-fixed-cost-low-volume
operation.
Tactics for flyweights:
If a smaller company (a "flyweight") chooses to compete in mass-media advertising or massive R&D efforts,
the additional fixed costs will absorb such a large portion of its revenue that its giant
competitors will inevitably win. It could though calculate its incentives on a graduated
percentage basis rather than on absolute volume, thus making the incentives variable by
guaranteeing the dealer a larger percentage of each extra unit sold. The Big Three, of
course, cannot afford to offer such high percentages across the board to their respective
franchised stores; their profitability would soon be eroded if they did.


4. Marketing Mix
The Marketing Mix model (also known as the 4 Ps) can be used by marketers as a tool to
assist in implementing strategy. Managers use this method to attempt to generate the
optimal response in the target market by blending 4 (or 5, or 7) variables in an optimal way.
It is important to understand that the MM principles are controllable variables. The MM can be adjusted on a frequent basis to meet the changing needs of the target group and the other dynamics of the marketing environment.

Product
Historically, the thinking was: a good product will sell itself. However, there are no bad products anymore in today's highly competitive markets. Plus, there are many laws giving customers the right to return products that they perceive as bad. Therefore the question on product has become: does the organization create what its intended customers want? Define the characteristics of your product or service that meet the needs of your customers.
Typical decision elements: Functionality, Quality, Appearance, Packaging, Brand, Service, Support, Warranty.

Price
How much are the intended customers willing to pay? Here we decide on a pricing strategy - do not let it just happen! Even if you decide not to charge for a service (a loss leader), you must realize that this is a conscious decision and forms part of the pricing strategy. Although competing on price is as old as mankind, the consumer is often still sensitive to price discounts and special offers. Price also has an irrational side: something that is expensive must be good. Permanently competing on price is, for many companies, not a very sensible approach.
Typical decision elements: List Price, Discounts, Allowances, Financing, Leasing Options.

Place
Available at the right place, at the right time, in the right quantities? Some of the revolutions in business have come about by changing Place. Think of the internet and mobile telephones.
Typical decision elements: Locations, Logistics, Channel Members, Channel Motivation, Market Coverage, Service Levels, Internet, Mobile.

Promotion
(How) are the chosen target groups informed or educated about the organization and its products? This includes all the weapons in the marketing armory - advertising, selling, sales promotions, Public Relations, etc. While the other three P's have lost much of their meaning in today's markets, Promotion has become the most important P to focus on.
Typical decision elements: Advertising, Public Relations, Message, Direct Sales, Sales, Media, Budget.

The function of the MM is to help develop a package (mix) that will not only satisfy the needs of the customers within the target markets, but simultaneously maximize the performance of the organization. There have been many attempts to increase the number of P's from 4 to 5 in the MM model, the most frequently mentioned addition being People or Personnel.


5. Product Life Cycle


The Product Life Cycle model can help in analyzing Product and Industry Maturity Stages.
Any Business is constantly seeking ways to grow future cash flows by maximizing revenue
from the sale of products and services. Cash Flow allows a company to maintain viability,
invest in new product development and improve its workforce; all in an effort to acquire
additional market share and become a leader in its respective industry.
Also, product life cycles are becoming shorter and shorter and many products in mature
industries are revitalized by product differentiation and market segmentation. Organizations
increasingly reassess product life cycle costs and revenues as the time available to sell a
product and recover the investment in it shrinks.
Even as product life cycles shrink, the operating life of many products is lengthening. For
example, the operating life of some durable goods, such as automobiles and appliances,
has increased substantially. This leads the companies that produce these products to take
their market life and service life into account when planning. Increasingly, companies are
attempting to optimize life cycle revenue and profits through the consideration of product
warranties, spare parts, and the ability to upgrade existing products.
It is clear that the concept of life cycle stages has a significant impact upon business strategy
and performance. The Product Life Cycle method identifies the distinct stages affecting
sales of a product, from the product's inception until its retirement.
In the Introduction stage, the product is introduced to the market through a focused and
intense marketing effort designed to establish a clear identity and promote maximum
awareness. Many trial or impulse purchases will occur at this stage. Next, consumer interest
will bring about the Growth stage, distinguished by increasing sales and the emergence of
competitors. The Growth stage is also characterized by sustaining marketing activities on
the vendor's side, with customers engaged in repeat purchase behavior patterns. Arrival of
the product's Maturity stage is evident when competitors begin to leave the market, sales
velocity is dramatically reduced, and sales volume reaches a steady state. At this point in
time, mostly loyal customers purchase the product. Continuous decline in sales signals entry
into the Decline stage.


6. GE / McKinsey Matrix
The GE matrix / McKinsey matrix is a model to perform a business portfolio analysis on
the Strategic Business Units of a corporation.
A business portfolio is the collection of Strategic Business Units that make up a corporation. The optimal business portfolio is one that fits perfectly to the company's strengths and helps to exploit the most attractive industries or markets. A Strategic Business Unit (SBU) can either be an entire mid-size company or a division of a large corporation that formulates its own business level strategy and has separate objectives from the parent company.
The aims of a portfolio analysis are to:
1) analyze the current business portfolio and decide which SBUs should receive more or less investment,
2) develop growth strategies for adding new products and businesses to the portfolio, and
3) decide which businesses or products should no longer be retained.
The GE / McKinsey Matrix is more sophisticated than the BCG Matrix in three aspects:
1. Market (Industry) attractiveness replaces market growth as the dimension of industry
attractiveness. Market Attractiveness includes a broader range of factors other than just the
market growth rate that can determine the attractiveness of an industry / market.
2. Competitive strength replaces market share as the dimension by which the competitive
position of each SBU is assessed. Competitive strength likewise includes a broader range of
factors other than just the market share that can determine the competitive strength of a
Strategic Business Unit.
3. Finally the GE / McKinsey Matrix works with a 3*3 grid, while the BCG Matrix has only
2*2. This also allows for more sophistication.
Often, Strategic Business Units are portrayed as circles plotted in the GE / McKinsey Matrix, whereby:
- the size of each circle represents the Market Size,
- the size of each pie slice represents the Market Share of the SBU, and
- arrows represent the direction and the movement of the SBUs in the future.
A minimal sketch of the underlying 3x3 grid logic is given below.
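In Python, with illustrative low/medium/high ratings; the zone names and the simple score-sum rule are a common textbook reading of the nine cells, not part of the original McKinsey material:

    # Illustrative mapping of the GE / McKinsey 3x3 grid to three strategic zones.
    LEVELS = {"low": 0, "medium": 1, "high": 2}

    def ge_mckinsey_zone(industry_attractiveness: str, competitive_strength: str) -> str:
        score = LEVELS[industry_attractiveness] + LEVELS[competitive_strength]
        if score >= 3:
            return "invest / grow"
        if score == 2:
            return "selectivity / earnings"
        return "harvest / divest"

    # Example: an attractive market in which the SBU has only a weak position
    print(ge_mckinsey_zone("high", "low"))  # -> selectivity / earnings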
Some important limitations of the GE matrix / McKinsey Matrix are:
- Valuation of the realization of the various factors
- Aggregation of the indicators is difficult
- Core competencies are not represented
- Interactions between Strategic Business Units are not considered.


7. Sales Mix
The term sales mix refers to the relative proportion in which a company's products are sold.
The concept is to achieve the combination, which will yield the greatest amount of profits.
Most companies have many products, and often these products are not equally profitable.
Hence, profits will depend to some extent on the company's sales mix. Profits will be greater
if high margin rather than low margin items make up a relatively large proportion of total
sales. Changes in sales mix can cause interesting variations in profits. A shift in sales mix from low-margin items to high-margin items can have the reverse effect: total profit may increase even though total sales decrease. It is one thing to achieve a particular sales volume; it is quite a different thing to sell the most profitable mix of products.
Sales mix also refers to the proportion of total sales that each product or product line generates, which needs to be appropriately balanced to achieve the maximum amount of gross profit. Thus, a sales mix is the proportion of sales coming from different products or
services. Changes in sales mix often affect profits because different products often have
different profit margins, therefore a change in the sales mix can have an impact on profits
even if total revenues are unchanged. Selling less of a more profitable product but making
up the sales with a less profitable product still leaves one with lower profits. Profit margin is
simply profit divided by sales. This means that there are as many measures of profit margin
as there are measures of profit. It is usual to be specific and refer to gross margin or profit
margin. Profit margins can provide a comparison between companies in the same industry,
and can help identify trends in the numbers for a company from year to year. In the latter
case it separates the effect on profits of growth or decline in sales from changes related to
efficiency and price levels. It does not do this perfectly as margins naturally increase with
sales (this is called operational gearing), particularly when a company has a high level of
fixed costs or high sales growth or decline.
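As a small worked illustration (the two products, their prices and margins are hypothetical), the sketch below shows how shifting volume toward the higher-margin product raises total profit even though total revenue is unchanged:

    # Hypothetical sales-mix example: product -> (unit price, profit margin on sales).
    products = {"A": (10.0, 0.40), "B": (10.0, 0.10)}

    def total_profit(volumes):
        # Profit = sum over products of units sold * unit price * margin.
        return sum(volumes[p] * price * margin for p, (price, margin) in products.items())

    mix_1 = {"A": 400, "B": 600}   # revenue 10,000
    mix_2 = {"A": 600, "B": 400}   # revenue still 10,000, richer in high-margin A

    print(total_profit(mix_1))  # 2200.0
    print(total_profit(mix_2))  # 2800.0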
Sales mix decisions can also be guided by the following pricing matrix, which connects profit margin and sales volume with total profits:

Pricing Matrix
Profit Margin    Sales Volume    Total Profits
High Margin      High Volume     High Profits
High Margin      Low Volume      High Profits
High Margin      Low Volume      Low Profits
Low Margin       High Volume     High Profits
Low Margin       Low Volume      Low Profits

While formulating a pricing policy, a multiproduct company can use the above pricing matrix to adjust its profit margin and sales mix appropriately for price-sensitive and price-insensitive customers and so achieve a higher overall profit.


IT
8. Nolan's IT Growth Stages
Nolan and Gibson described four stages a company goes through in learning how to manage IT (Initiation, Contagion, Control, Maturity) and the four growth processes that influenced progress (Applications, Technologies, Organization and Management, Users).
Originally designed to examine all of IT in a company, I suggest we can employ this model to examine the progress and prospects for social media within companies.
If you consider your own
company and its use of
social media in business,
you should be able to place
yourself on this map, growth process by growth process. I suspect most companies are
finding that social media use is proliferating, that it is doing so on multiple, replicating
platforms, that the organization that supports it is becoming more than technical, that they're
supporting expansion, and that the users of the technology are both excited and uninformed.
Healthy progress requires that the four growth processes are at similar stages of maturity. If
your users remain totally unaware while your governance actions reflect the strictures of the
30-year-old IT function, you will have a mismatch, conflict and waste. If you are trying to
"proliferate" social media across your possible uses but do so only with the one technology
your initial trial used, you will have another mismatch, more conflict and more waste.
You may find that the customer care organization's use of social media has made more, or less, progress through the stages than your product development organization. Examine where each of them is and the overall structural health of their efforts. Take actions to make them
healthy and set a date to examine whether to rationalize their applications and begin a
process of Integration.
Your customers have the option of buying into a new product or technology. Inside a
corporation, there are two models for the diffusion of innovations: collective innovation
decisions and authority innovation decisions. The collective-innovation decision occurs when
the adoption of an innovation has been made by a consensus among the members of an
organization. The authority-innovation decision occurs when the adoption of an innovation
has been made by very few individuals with high positions of power within an organization
(Rogers 2005, p. 403).
If consensus can't be achieved, then an authority-intervention decision will be made. For this instance, Brooks, this time playing Louis XIV at Versailles, reminds us that "It's good to be the king."


9. Enterprise Architecture
In 1987, John Zachman wrote: "To keep the business from disintegrating, the concept of information systems architecture is becoming less of an option and more of a necessity." Since then, the Enterprise Architecture Framework of Zachman has evolved and become the model around which many major organizations view and communicate their enterprise information infrastructure. It provides a blueprint, or architecture, for the organization's current and future information infrastructure.
Instead of representing the process as a series of steps, he organized it around the points of
view (perspectives) taken by the various players. These players included: 1. someone who
has undertaken to do business in a particular industry, 2. the business people who run the
organization, 3. the systems analyst who wants to represent the business in a disciplined
form, 4. the designer, who applies specific technologies to solve the problems of the
business, 5. the builder of the system, and finally 6. the system itself. These perspectives
are represented as rows in the matrix.
The columns in the framework represent the data manipulated by an organization (what), its
functions and processes (how), locations where business is conducted (where), events that
trigger business activities (when), the people and organizations involved (who), and the
motivations and constraints which determine how the business behaves (why).
Terminology:
An Enterprise is a business association consisting of a recognized set of interacting
business functions, able to operate as an independent, standalone entity. With this
definition, there can be enterprises within enterprises.
Architecture provides the underlying framework, which defines and describes the platform
required by the enterprise to attain its objectives and achieve its business vision.
The framework's grid can be summarized perspective by perspective, with each cell read against the six columns (Data/What, Function/How, Network/Where, People/Who, Time/When, Motivation/Why):

Objectives / Scope
- Data (What): list of things important to the enterprise
- Function (How): list of processes the enterprise performs
- Network (Where): list of locations where the enterprise operates
- People (Who): list of organizational units
- Time (When): list of business events / cycles
- Motivation (Why): list of business goals / strategies

Model of the Business
- Data (What): entity relationship diagram (including m:m, n-ary, attributed relationships)
- Function (How): business process model (physical data flow diagram)
- Network (Where): logistics network (nodes and links)
- People (Who): organization chart, with roles; skill sets; security issues
- Time (When): business master schedule
- Motivation (Why): business plan

Model of the Information System
- Data (What): data model (converged entities, fully normalized)
- Function (How): essential data flow diagram; application architecture
- Network (Where): distributed system architecture
- People (Who): human interface architecture (roles, data, access)
- Time (When): dependency diagram, entity life history (process structure)
- Motivation (Why): business rule model

Technology Model
- Data (What): data architecture (tables and columns); map to legacy data
- Function (How): system design: structure chart, pseudocode
- Network (Where): system architecture (hardware, software types)
- People (Who): user interface (how the system will behave); security design
- Time (When): "control flow" diagram (control structure)
- Motivation (Why): business rule design

Detailed Representation
- Data (What): data design (denormalized), physical storage design
- Function (How): detailed program design
- Network (Where): network architecture
- People (Who): screens, security architecture (who can see what?)
- Time (When): timing definitions
- Motivation (Why): rule specification in program logic

Functioning System
- Data (What): converted data
- Function (How): executable programs
- Network (Where): communications facilities
- People (Who): trained people
- Time (When): business events
- Motivation (Why): enforced rules


10. Innovation Adoption Curve


The innovation adoption curve of Rogers is a model that classifies adopters of innovations into various categories, based on the idea that certain individuals are inevitably more open to adopting innovations than others. It is also referred to as Multi-Step Flow Theory or Diffusion of Innovations Theory.
Innovators
Brave people, pulling the change. Innovators are very important for communication.
Early Adopters
Respectable people, opinion leaders, try out new ideas, but in a careful way.
Early Majority
Thoughtful people, careful but accepting change more quickly than the average.
Late Majority
Skeptic people, will use new ideas or products only when the majority is using it.
Laggards
Traditional people, caring for the "old ways", are critical towards new ideas and will only
accept it if the new idea has become mainstream or even tradition.
The diffusion of innovations curve (innovation adoption curve) of Rogers is a useful reminder that trying to quickly and massively convince the masses of a new, controversial idea is useless. It makes more sense in these circumstances to start by convincing
innovators and early adopters first. Also the categories and percentages can be used as a
first draft to estimate target groups for communication purposes.
Diffusion research focused on five elements: 1) the characteristics of an innovation
which may influence its adoption; 2) the decision-making process that occurs when
individuals consider adopting a new idea, product or practice; 3) the characteristics of
individuals that make them likely to adopt an innovation; 4) the consequences for individuals
and society of adopting an innovation; and 5) communication channels used in the adoption
process.


11. Project Management Life Cycle


Project Initiation
Project Initiation is the first phase in the Project Life Cycle and essentially involves starting
up the project. You initiate a project by defining its purpose and scope, the justification for
initiating it and the solution to be implemented. You will also need to recruit a suitably skilled
project team, set up a Project Office and perform an end-of-phase review. The Project Initiation phase involves six key steps.

Project Planning
After defining the project and appointing the
project team, you're ready to enter the detailed
Project Planning phase. This involves creating a
suite of planning documents to help guide the
team throughout the project delivery. The
Planning Phase involves completing ten key steps.
Project Execution
With a clear definition of the project and a suite of
detailed project plans, you are now ready to enter
the Execution phase of the project. This is the
phase in which the deliverables are physically
built and presented to the customer for
acceptance. While each deliverable is being
constructed, a suite of management
processes are undertaken to monitor and control
the deliverables being output by the project.
These processes include managing time, cost,
quality, change, risks, issues, suppliers,
customers and communication. Once all the
deliverables have been produced and the
customer has accepted the final solution, the
project is ready for closure.
Project Closure
Project Closure involves releasing the final deliverables to the customer, handing over
project documentation to the business, terminating supplier contracts, releasing project
resources and communicating project closure to all stakeholders. The last remaining step is
to undertake a Post Implementation Review to identify the level of project success and note
any lessons learned for future projects.


12. RASCI Model


The RACI model is a relatively straightforward tool that can be used for identifying roles and
responsibilities during an organizational change process. After all, transformation processes
do not process themselves; people have to "do" something to make the processes happen.
Therefore it is useful to describe what should be done by whom to make a transformation
process happen.
Instead of the term RACI, sometimes the terms RASCI or RASIC are also used.
RASCI is an abbreviation for:
R = Responsible - owns the problem / project
A = to whom "R" is Accountable - who must sign off (Approve) on the work before it is effective
(S = can be Supportive) - can provide resources or can play a supporting role in implementation
C = to be Consulted - has information and/or capability necessary to complete the work
I = to be Informed - must be notified of results, but need not be consulted
The technique is typically supported by a RACI chart, a matrix of activities against roles, which helps to clearly discuss, agree and communicate the roles and responsibilities.
Typical steps in a RACI process:
1. Identify all of the processes / activities involved and list them down the left hand
side of the chart.
2. Identify all of the roles and list them along the top of the chart.
3. Complete the cells of the chart: identify who has the R, A, S, C, I for each
process.
4. Every process should preferably have one and only one R as a general principle. A gap occurs when a process exists with no R (no role is responsible); an overlap occurs when multiple roles have an R for a given process (a small validation sketch follows this list).
5. Resolve overlaps - every process in a role responsibility map should contain one and only one R to indicate a unique process owner. In the case of multiple Rs, there is a need to zoom in and further detail the associated sub-processes in order to separate the individual responsibilities.
6. Resolve gaps - the simpler case to address is the resolution of a gap. Where no role is identified as responsible for a process, the individual with the authority for role definition must determine which existing role is responsible or which new role is required, update the RASCI map, and clarify this with the individual(s) that assume that role.
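A minimal sketch of the gap/overlap check from step 4, in Python; the chart contents are a hypothetical example and the letters follow the RASCI definitions above:

    # Hypothetical RASCI chart: process -> {role: letter}.
    chart = {
        "Define requirements": {"Analyst": "R", "Sponsor": "A", "Developer": "C"},
        "Build solution": {"Developer": "R", "Architect": "R", "Tester": "I"},
        "Deploy": {"Sponsor": "A", "Ops": "S"},
    }

    # Step 4: every process should have exactly one R; flag gaps and overlaps.
    for process, assignments in chart.items():
        responsible = [role for role, letter in assignments.items() if letter == "R"]
        if not responsible:
            print(f"GAP: no role is Responsible for '{process}'")
        elif len(responsible) > 1:
            print(f"OVERLAP: multiple Responsible roles for '{process}': {responsible}")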


13. Systems Thinking / Dynamics


Systems Thinking is an approach for studying and managing complex feedback systems, such as one finds in business and other social systems. In fact it has been used to address practically every sort of feedback system.
System Dynamics (SD) is more or less the same as Systems Thinking, but emphasizes the use of computer simulation tools.
The term System means an interdependent group of items forming a unified
pattern. Feedback refers to the situation of X affecting Y and Y in turn affecting X perhaps
through a chain of causes and effects. One cannot study the link between X and Y and,
independently, the link between Y and X and predict how the system will behave. Only the
study of the whole system as a feedback system will lead to correct results.
The Steps in the SD methodology are roughly as follows:
Identify a problem,
Develop a dynamic hypothesis explaining the cause of the problem,
Build a computer simulation model of the system at the root of the problem,
Test the model to be certain that it reproduces the behavior seen in the real world,
Devise and test in the model alternative policies that alleviate the problem, and
Implement the solution.
Often these steps have to be reviewed and refined going back to an earlier step. For
instance, the first problem identified may be only a symptom of a still greater problem.
SD is based on Systems Thinking, but takes the additional steps of constructing and testing
a computer simulation model.
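As a minimal illustration of what such a simulation model looks like, here is a sketch in Python of a single stock with one feedback loop (an inventory adjusted toward a target); the structure and the numbers are purely hypothetical:

    # Minimal stock-and-flow sketch: inventory with corrective feedback (illustrative only).
    target_inventory = 100.0
    inventory = 40.0         # the stock
    adjustment_time = 4.0    # periods needed to close the gap
    demand = 10.0            # constant outflow per period

    for week in range(1, 13):
        # Feedback: the larger the gap to the target, the larger the ordered inflow.
        orders = demand + (target_inventory - inventory) / adjustment_time
        inventory += orders - demand      # the stock integrates inflow minus outflow
        print(f"week {week:2d}: inventory = {inventory:6.1f}")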
The SD field developed initially from the book Industrial Dynamics by Jay W. Forrester.
Typical applications of SD can be found in:
strategy and corporate planning
business process development
public management and policy
biological and medical modeling
energy and the environment
theory development in the natural and social sciences
dynamic decision making
complex nonlinear dynamics


14. Strategic Alignment of business with IT


The Strategic Alignment Model of Venkatraman, Henderson and Oldach is a framework for aligning business and IT strategy.
Venkatraman et al. argued in 1993 that the difficulty of realizing value from IT investments is firstly due to the lack of alignment between the business and IT strategies of the organizations making the investments, and secondly due to the lack of a dynamic administrative process to ensure continuous alignment between the business and IT domains.
Four Dominant Alignment
Perspectives:
Strategy Execution: this
perspective views the business strategy as the driver of both organization design choices
and the logic of IS infrastructure (the classic, hierarchical view of strategic management).
Top Management is strategy formulator, IS Management is strategy implementer. [Arrow 1]
Technology Potential: this perspective also views the business strategy as the driver; however, it involves the articulation of an IT strategy to support the chosen business strategy
and the corresponding specification of the required IS infrastructure and processes. The top
management should provide the technology vision to articulate the logic and choices
pertaining to IT strategy that would best support the chosen business strategy, while the role
of the IS manager should be that of the technology architect - who efficiently and effectively
designs and implements the required IS infrastructure that is consistent with the external
component of IT strategy (scope, competences and governance). [Arrow 2]
Competitive Potential: this alignment perspective is concerned with the exploitation of
emerging IT capabilities to impact new products and services (i.e., business scope),
influence the key attributes of strategy (distinctive competences), as well as develop new
forms of relationships (i.e. business governance). Unlike the two previous perspectives that
considered business strategy as given (or a constraint for organizational transformation), this
perspective allows the modification of business strategy via emerging IT capabilities. The
specific role of the top management to make this perspective succeed is that of the business
visionary, who articulates how the emerging IT competences and functionality as well as
changing governance patterns in the IT marketplace would impact the business strategy.
[Arrow 3]
Service Level: this alignment perspective focuses on how to build a world-class IT/IS organization within an organization. In this perspective, the role of business strategy is
indirect. This perspective is often viewed as necessary (but not sufficient) to ensure the
effective use of IT resources and be responsive to the growing and fast-changing demands
of the end-user population. The specific role of the top management to make this
perspective succeed is that of the prioritizer, who articulates how best to allocate the scarce
resources both within the organization as well as in the IT marketplace (in terms of joint
ventures, licensing, minority equity investments, etc.). The role of the IS manager, in
contrast, is one of business leadership, with the specific tasks of making the internal business succeed within the operating guidelines from the top management. [Arrow 4]


Finance
15. Capital Asset Pricing Model
The Capital Asset Pricing Model (CAPM) is an economic model for valuing stocks,
securities, derivatives and/or assets by relating risk and expected return. CAPM is based
on the idea that investors demand additional expected return (called the risk premium) if
they are asked to accept additional risk.
The CAPM model says that this expected return that these investors would demand is equal
to the rate on a risk-free security plus a risk premium. If the expected return does not
meet/beat the required return, the investors will refuse to invest and the investment should
not be undertaken.
Expected Security Return = Riskless Return + Beta x (Expected Market Risk Premium), or:
r = Rf + Beta x (RM - Rf)
where:
- r is the expected return rate on a security;
- Rf is the rate of a "risk-free" investment, i.e. cash;
- RM is the return rate of the appropriate asset class.
Beta reflects the risk of investing in the overall market, like the New York Stock Exchange; the Beta of the overall market is, by definition, exactly 1.0.
Each company also has a Beta. A company's Beta is that company's risk compared to the Beta (risk) of the overall market. If the company has a Beta of 3.0, then it is said to be 3 times more risky than the overall market. Beta measures the volatility of the security relative to the asset class.
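A minimal numeric sketch of the formula above (the rates and the beta are hypothetical):

    # CAPM: expected return = risk-free rate + beta * (market return - risk-free rate).
    def capm_expected_return(risk_free, beta, market_return):
        return risk_free + beta * (market_return - risk_free)

    # Hypothetical inputs: 4% risk-free rate, 8% expected market return, beta of 1.5.
    r = capm_expected_return(risk_free=0.04, beta=1.5, market_return=0.08)
    print(f"Expected return: {r:.1%}")  # -> Expected return: 10.0%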
A consequence of CAPM-thinking is that it implies that investing in individual stocks is
pointless, because one can duplicate the reward and risk characteristics of any security just
by using the right mix of cash with the appropriate asset class. This is why die-hard followers
of CAPM avoid stocks, and instead build portfolios merely out of low-cost index funds.
Note! The Capital Asset Pricing Model is a ceteris paribus model. It is only valid within a
special set of assumptions. These are:
- Investors are risk-averse individuals who maximize the expected utility of their end-of-period wealth. Implication: the model is a one-period model.
- Investors have homogeneous expectations (beliefs) about asset returns. Implication: all investors perceive identical opportunity sets; that is, everyone has the same information at the same time.
- Asset returns follow the normal distribution.
- There exists a risk-free asset, and investors may borrow or lend unlimited amounts of this asset at a constant rate: the risk-free rate.
- There is a definite number of assets and their quantities are fixed within the one-period world.
- All assets are perfectly divisible and priced in a perfectly competitive market. Implication: human capital, for example, is non-existent in the model (it is not divisible and it cannot be owned as an asset).
- Asset markets are frictionless and information is costless and simultaneously available to all investors. Implication: the borrowing rate equals the lending rate.
- There are no market imperfections such as taxes, regulations, or restrictions on short selling.
Although the assumptions mentioned above are normally not all valid or met, CAPM remains one of the most widely used investment models to determine risk and return.


16. Discounted Cash Flow


Discounted cash flow (DCF) analysis is a way of calculating the value of money flowing into and out of
an organisation over time (the cash flows) based on the concept of its present value. Future cash flows
are estimated and then discounted to give their net present values (NPVs). The discount rate used
varies, but one commonly used method of discounting is to calculate how much money would have to
be invested in the present at a given rate of return in order to achieve the cash flow in the future.
The rate of return may be based on current interest rates plus a risk premium that reflects the risk that
the future cash flow may not be forthcoming. The future value (FV) of an investment can be calculated
as:
FV = NPV x (1 + i)^n
Where NPV is the net present value of the future cash flow, i is the interest rate (including any risk
premium), and n is the time in years until the cash flow occurs (the unit of time used can be adjusted
to reflect the time within which a return on investment is anticipated). Rearranging the formula to get
the net present value, we get:
NPV = FV / (1 + i)^n
This formula represents a relatively simplistic approach that assumes that the interest rate (including
the risk premium) remains constant. The formula is applied to both positive and negative cash flows.
In a typical project scenario, the cash flows are likely to be negative until some point after the project
is completed. The important thing to remember is that at some point in the future (the break-even
point), the NPV (represented by the sum of the discounted cash flows up to that point) should
acquire a positive value (i.e. it should be greater than zero). Essentially, if the NPV has a positive
value at some point the project will have paid for itself. If not, it will have lost money. To take a simple
example, let's assume a project has a total cost of US$ 100,000 over a twelve month period (Year
0). The project is expected to produce cost benefits in the form of an additional annual income of
US$ 25,000 per annum, and is expected to pay for itself within five years from completion of the
project. If a value for i of 7.5% is used, the result will be a small positive NPV, as illustrated below
(note that increasing the value of i to 8.0% results in a small negative NPV).
DCF Example (values in US$)

Details                        Year 0      Year 1    Year 2    Year 3    Year 4    Year 5    Total (net)
Cash Flow                     -100,000     25,000    25,000    25,000    25,000    25,000       25,000
Net Present Value at 7.5%     -100,000     23,256    21,633    20,124    18,720    17,414        1,147
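The same arithmetic can be reproduced in a few lines of Python (using the cash flows and the 7.5% rate from the example above):

    # Discount each year's cash flow: NPV_t = CF_t / (1 + i)^t, then sum.
    cash_flows = [-100_000, 25_000, 25_000, 25_000, 25_000, 25_000]  # Year 0..5
    i = 0.075

    discounted = [cf / (1 + i) ** t for t, cf in enumerate(cash_flows)]
    print([round(d) for d in discounted])  # [-100000, 23256, 21633, 20124, 18720, 17414]
    print(round(sum(discounted)))          # 1147 -> small positive NPV at 7.5%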

The calculations involved can be set up in any spreadsheet program in a few minutes, enabling a
range of outcomes to be calculated by varying factors such as the value of i. The notion of
discounted cash flow can be used to calculate how much value a project will add to the organisation
over a given period of time. If the net present value at some pre-determined future date is greater
than zero, then the project has added value to the organisation. If not, there will be no point (from a
financial point of view at least) in continuing with the project. The example given is a relatively simple
one and has been used in order to demonstrate the basic principles. In reality such calculations can
be affected by a great number of variables, but it is beyond the scope of these pages to consider the
implications of this any further. Bear in mind also that even if the calculations result in an
unfavourable outcome from a purely monetary point of view, the project may still have long term
intangible benefits that cannot be measured in purely financial terms. There could be serious
negative consequences as a result of not going ahead with a project. The DCF is widely used in
capital budgeting, valuation of companies, shares, assets, business etc.


17. Valuation Models


The valuation of a business, asset or share will commence with the financial statements. The valuation of a company is necessary when the company goes for a merger, acquisition, reconstruction, restructuring, listing, etc. Three classifications of valuation are: 1) asset based valuation, 2) entity based valuation, and 3) market based valuation.

Asset Based Valuation

Historical Cost Method
Formula: (Net book value of assets - External liabilities) / Number of shares
Brief details: the value of assets is taken on the basis of book value as per the company's latest Balance Sheet.

Current Value or Market Value Method
Formula: (Current value of assets - External liabilities) / Number of shares
Brief details: the value of assets is taken on the basis of the current value of assets, which is arrived at on the basis of replacement cost, realisable value or value to the business.

Goodwill Method
Formula: (Goodwill + Current value of business - External liabilities) / Number of shares
Brief details: after arriving at the current value of assets, goodwill is also added to the value of the assets.

Entity Based Valuation

Net Present Value
Formula: Present value of returns = (Annual cash flow / Interest rate) x [1 - 1 / (1 + Interest rate)^n]
Brief details: calculation of the net present value of cash flows by discounting, with the use of present value tables, to ascertain the time value of the investment and its income. This method can be further amplified with annual growth of the cash flow.

Super Profits Method
Formula: V = A + (P - r x A) / k, where V = value of business, A = net assets, P = yearly profit of the established business, r = rate of return on assets in the new business, k = capitalisation rate of super profits
Brief details: purchasing a company at the normalised value of its assets plus a number of years of super profits.

Market Based Valuation

Market Value Method (quoted company)
Formula: market value of shares
Brief details: the market value of the shares will be the basis of negotiations.

Price Earnings Ratio (unquoted company)
Formula: P/E ratio = market price of a share of a quoted company in a similar business / earnings per share
Brief details: the P/E ratio relates earnings per share to share value.

Dividend Yield Method
Formula: dividend per share / market price per share of a quoted company in a similar business
Brief details: the percentage of dividend indicates the rate of return to be received by shareholders.

Capital Asset Pricing Model
Formula: Rs = Rrf + B x (Rm - Rrf), where Rs = expected return, Rrf = risk-free rate of return, Rm = return from the market as a whole, B = Beta factor
Brief details: this model recognises the risk associated with the share price and the returns comprising dividend and capital gain or loss.

Free Cash Flow Method
Formula: free cash flow = revenue - costs - investments
Brief details: valuation is based on a number of years of free cash flow, with or without using the discounted cash flow method.


18. Risk / Return Trade-off


One of the central concepts of investment theory is that there is a positive relationship
between the level of risk of an investment and its expected level of return - ie the higher the
risk the higher the expected return, and vice versa. Although logically most investors would
prefer low risk, the risk/return trade-off limits the potential for higher returns. Although some asset classes (particularly shares and property) can demonstrate significant volatility over the short term, history has shown that over the long term these fluctuations can be smoothed out and higher returns can be generated by implementing two main strategies:
- diversifying your funds across and within a range of different investments, and
- recognising that different investments have different time frames.
The risk/return trade-off is usually represented as a graph on which expected return rises as the level of risk increases. The main types of investment risk include the following.
Mismatch Risk: The chosen investment may not be suitable for your needs, goals and circumstances.
Inflation Risk: The real purchasing power of your invested funds may not keep pace with inflation.
Reinvestment Risk: If you rely on fixed rate investments you may have to reinvest maturing
money at a lower rate of interest.
Market Risk: Movements in the market mean the value of your investment can go down as
well as up - and sometimes suddenly.
Timing Risk: Trying to time entry to and exit from markets can expose you to potentially
greater short-term volatility.
Risk of not Diversifying: If you put all of your capital into one market a fall in that market will
adversely affect all of your capital.
Liquidity Risk: You may not be able to access your money as quickly as you need to without
suffering a fall in value.
Credit Risk: The institution you have invested with may not be able to make the required
interest payments or repay your funds.
Legislative Risk: Your investment strategies or products could be affected by changes in
current laws and regulations.
Value Risk: You may pay too much for the investment or sell it too cheaply.


19. CFROI and TSR Models


Cash Flow Return on Investment (CFROI), originally developed by HOLT Value Associates (since January 2002 CSFB Holt, based in Chicago), is an economic profit (cash flow) based corporate performance and valuation framework, mainly used by portfolio managers and corporations.
CFROI is normally calculated on an annual basis and is compared to an inflation-adjusted
cost of capital to determine whether a corporation has earned returns superior to its costs of
capital. Cash Flow Return on Investment can help compare across companies with
disparate asset compositions, across borders and time.
An advantage of CFROI is that it ties performance measurement to the factor that capital
markets prize most: the ability of a corporation to generate cashflow.
Also CFROI is inflation-adjusted.
Cash Flow Return on Investment (CFROI) can be calculated at divisional (Strategic
Business Unit) level and can also be used for private held companies.
The calculation of CFROI (the formula) is rather complex and too large to fit here. It is an approximation of the average real internal rate of return earned by a firm on all its operating assets. The inflation-adjusted CFROI metric is calculated from a recurring stream of after-tax cash flows generated by a company's growing base of depreciating and non-depreciating assets. Over time, CFROI fades, or regresses, to the long-term corporate average. By applying the ROI to the total assets, a net cash receipt forecast can be calculated. This forecast is discounted back to the present to arrive at a current value for the company.
CSFB Holt maintains a CFROI database of over 18,000 companies, consisting of 20 years of historical data for US companies and 10 years of historical data for non-US companies.
Total Shareholder Return (TSR) represents the change in capital value of a listed/quoted
company over a period (typically 1 year or longer), plus dividends, expressed as a plus or
minus percentage of the opening value.
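A minimal sketch of that definition (the opening price, closing price and dividend are hypothetical):

    # TSR = (change in share price + dividends) / opening share price, as a percentage.
    def total_shareholder_return(open_price, close_price, dividends):
        return (close_price - open_price + dividends) / open_price

    # Hypothetical year: the share opens at 50.00, closes at 54.00 and pays 2.00 in dividends.
    print(f"TSR: {total_shareholder_return(50.0, 54.0, 2.0):.1%}")  # -> TSR: 12.0%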
Due to its nature, TSR cannot be calculated at the divisional (Strategic Business Unit) level and below. Also due to its nature, TSR cannot be observed for privately held companies.
TSR can be easily compared from company to company, and benchmarked against industry
or market returns, without having to worry about size bias (Total Shareholder Return
(TSR) is a percentage).


20. Plausibility Theory for Unknown Risks


According to Collins & Michalski, "something is plausible if it is conceptually supported by prior knowledge". The Plausibility Theory of Wolfgang Spohn (1985-), Collins & Michalski (reasoning, 1989), Lemaire & Fayol (arithmetic problem solving, 1995) and Connell & Keane (cognitive model of plausibility, 2002) provides new insights into decision making with unknowable risks. Although plausibility is an ineluctable and ubiquitous phenomenon of everyday life, it was ignored in cognitive science for a long time and treated only as an operational variable rather than being explained or studied in itself.
Until the arrival of the Plausibility Theory, the common theory used by scientists to explain and predict decision making was Bayesian statistics, named for Thomas Bayes, an 18th-century English minister who developed rules for weighing the likelihood of different events and their expected outcomes. Bayesian statistics were popularized in the 1960s by Howard Raiffa for usage in business environments. According to Bayesian theory, managers make, and should make, decisions based on a calculation of the probabilities of all the possible outcomes of a situation. By weighing the value of each outcome by its probability and summing the totals, Bayesian decision makers calculate "expected values" for a decision that must be taken. If the expected value is positive, the decision should be accepted; if negative, avoided.
This may seem an orderly way to proceed. Unfortunately, however, the Bayesian way of explaining decisions faces at least two phenomena that are difficult to explain:
1. Downside risk appreciation (why do people take a gamble with a 50% chance to make $10 when they have to pay $5 if they lose, but generally refuse to take the same gamble at a 50% chance if they can win $1,000,000 versus a potential loss of $500,000?)
2. Dealing with unknowable risks (these kinds of risks, which do not involve predictable odds, are typical for business situations! Why do managers prefer risks that are known over risks that cannot be known?)
Both of these phenomena can be dealt with if the Bayesian expected value calculation is replaced by the risk threshold of the Plausibility Theory. Like its predecessor, the Plausibility Theory assesses the range of possible outcomes, but it focuses on the probability of hitting a threshold point - such as a net loss - relative to an acceptable risk. For example: a normally profitable decision is rejected if there is a higher than 2% risk of making a (major) loss. Clearly, plausibility can resolve the weaknesses of Bayesian thinking: both the tendency of managers to avoid unacceptable downside risks and their acceptance of unknowable risks can be explained.
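A small numeric sketch of the contrast, using the two gambles quoted above; the 2% threshold is the example figure from the text, while the 100,000 "major loss" level is an illustrative assumption:

    # Both gambles have a positive Bayesian expected value ...
    small_gamble = [(0.5, 10), (0.5, -5)]
    large_gamble = [(0.5, 1_000_000), (0.5, -500_000)]

    def expected_value(outcomes):
        return sum(p * v for p, v in outcomes)

    print(expected_value(small_gamble))  # 2.5
    print(expected_value(large_gamble))  # 250000.0

    # ... but a risk-threshold rule rejects any decision whose chance of a major
    # loss (here: losing 100,000 or more) exceeds the acceptable 2% level.
    def acceptable(outcomes, loss_limit=-100_000, max_probability=0.02):
        probability_major_loss = sum(p for p, v in outcomes if v <= loss_limit)
        return probability_major_loss <= max_probability

    print(acceptable(small_gamble))  # True  - the worst case is only a 5 loss
    print(acceptable(large_gamble))  # False - a 50% chance of losing 500,000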
A typical example of the application of the Plausibility Theory is the set of Basel II rules for capital allocation in the financial services industry.


Org / People / HR
21. Mintzberg Configuration
The organizational configurations framework of Mintzberg is a model that describes six valid
organizational configurations:
1. entrepreneurial organization
2. machine organization
3. professional organization
4. diversified organization
5. innovative organization
6. missionary organization
and 7. the political organization (an organization lacking a real coordinating mechanism)

According to the organizational configurations model of Mintzberg each organization can


consist of a maximum of six basic parts:
1. Strategic Apex (top management)
2. Middle Line (middle management)
3. Operating Core (operations, operational processes)
4. Technostructure (analysts that design systems, processes, etc)
5. Support Staff (support outside of operating workflow)
6. Ideology (halo of beliefs and traditions; norms, values, culture)

According to the organizational configurations framework there are six valid coordinating
mechanisms in organizations:
1. Direct supervision (typical for entrepreneurial organizations)
2. Standardization of work (typical for machine organizations)
3. Standardization of skills (typical for professional organizations)
4. Standardization of outputs (typical for diversified organizations)
5. Mutual Adjustment (typical for innovative organizations)
6. Standardization of norms (typical for missionary organizations)


22. Kepner-Tregoe Matrix Decision Making Model


The Kepner-Tregoe Matrix is a special, well-orchestrated, synchronized and
documented Root Cause analysis and decision-making method.
It is a conscious, step-by-step approach for systematically solving problems, making good
decisions, and analyzing potential risks and opportunities. It helps you maximize critical
thinking skills, systematically organize and prioritize information, set objectives, evaluate
alternatives, and analyze impact.
Kepner-Tregoe describes the following steps to approach decision analysis (a small weighted-scoring sketch of steps 3, 5 and 6 follows the list):
1. Prepare a decision statement having both an action and a result component
2. Establish strategic requirements (Musts), operational objectives (Wants), and restraints
(Limits)
3. Rank objectives and assign relative weights
4. Generate alternatives
5. Assign a relative score for each alternative on an objective-by-objective basis
6. Calculate weighted score for each alternative and identify top two or three
7. List adverse consequences for each top alternative and evaluate probability (high,
medium, low) and severity (high, medium, low)
8. Make a final, single choice between top alternatives
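A minimal sketch of steps 3-6 (weighting objectives, scoring alternatives and computing
weighted scores); the objectives, weights and supplier alternatives below are hypothetical
and only illustrate the arithmetic:

# Hypothetical Kepner-Tregoe style scoring sketch: weights and scores are invented.

weights = {"cost": 10, "delivery time": 7, "vendor reliability": 5}   # relative weights (step 3)

# Relative scores (1-10) for each alternative on each objective (step 5).
alternatives = {
    "Supplier A": {"cost": 8, "delivery time": 6, "vendor reliability": 9},
    "Supplier B": {"cost": 6, "delivery time": 9, "vendor reliability": 7},
    "Supplier C": {"cost": 9, "delivery time": 5, "vendor reliability": 6},
}

# Weighted score per alternative (step 6).
totals = {
    name: sum(weights[obj] * score for obj, score in scores.items())
    for name, scores in alternatives.items()
}

for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {total}")   # top two or three go on to adverse-consequence review (step 7)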


23. Belbins Team Roles


In the 1970s, Dr Meredith Belbin and his research team at Henley Management College set
about observing teams, with a view to finding out where and how differences in team
performance come about. They wanted to control the dynamics of teams to discover if and
how problems could be pre-empted and avoided. As the research progressed, it revealed
that the difference between success and failure for a team was not dependent on factors
such as intellect, but more on behaviour. The research team began to identify separate
clusters of behaviour, each of which formed distinct team contributions or Team Roles. A
Team Role came to be defined as: "a tendency to behave, contribute and interrelate with
others in a particular way". It was found that different individuals displayed different Team
Roles to varying degrees.
The first Team Role to be identified was the Plant. The role was so-called because one
such individual was planted in each team. They tended to be highly creative and good at
solving problems in unconventional ways.
One by one, the other Team Roles began to emerge. The Monitor Evaluator was needed
to provide a logical eye, make impartial judgements where required and to weigh up the
team's options in a dispassionate way.
Co-ordinators were needed to focus on the team's objectives, draw out team members and
delegate work appropriately.
When the team was at risk of becoming isolated and inwardly focused, Resource
Investigators provided inside knowledge on the opposition and made sure that the team's
idea would carry to the world outside the team.
Implementers were needed to plan a practical, workable strategy and carry it out as
efficiently as possible.
Completer Finishers were most effectively used at the end of a task, to polish and
scrutinise the work for errors, subjecting it to the highest standards of quality control.
Teamworkers helped the team to gel, using their versatility to identify the work required and
complete it on behalf of the team.
Challenging individuals, known as Shapers, provided the necessary drive to ensure that the
team kept moving and did not lose focus or momentum.
It was only after the initial research had been completed that the ninth Team Role,
Specialist, emerged. The simulated management exercises had been deliberately set up to
require no previous knowledge. In the real world, however, the value of an individual with
in-depth knowledge of a key area came to be recognised as yet another essential team
contribution or Team Role.
Balance is Key
Whilst some Team Roles were more high profile and some team members shouted more
loudly than others, each of the behaviours was essential in getting the team successfully
from start to finish. The key was balance. For example, Meredith Belbin found that a team
with no Plant struggled to come up with the initial spark of an idea with which to push
forward. However, once too many Plants were in the team, bad ideas concealed good ones
and non-starters were given too much airtime. Similarly, with no Shaper, the team ambled
along without drive and direction, missing deadlines. With too many Shapers, in-fighting
began and morale was lowered.


24. Greiners Growth Model


The growth phases model of Greiner suggests that organizations go through 5 (6)
stages of growth and need appropriate strategies and structures to cope. It is a
descriptive framework that can be used to understand why certain management styles,
organizational structures and coordination mechanisms work and don't work at certain
phases in the development of an organization. The 1972 model of Greiner describes five
(six) phases of organizational development and growth:

- Growth through creativity (start-up company, entrepreneurial, informal communication,
hard work and poor pay) [ending in a leadership crisis].
- Growth through direction (sustained growth, functional organization structure, accounting,
capital management, incentives, budgets, standardized processes) [ending in an autonomy
crisis].
- Growth through delegation (decentralized organizational structure, operational and market
level responsibility, profit centers, financial incentives, decision making based on periodic
reviews, top management acts by exception, formal communication) [ending in a control
crisis].
- Growth through coordination and monitoring (formation of product groups, thorough review
of formal planning, centralization of support functions, corporate staff oversees coordination,
corporate capital expenditures, accountability for ROI at product group level, motivation
through lower-level profit sharing) [ending in a red tape crisis].
- Growth through collaboration (new evolutionary path, team action for problem solving,
cross-functional task teams, decentralized support staff, matrix organization, simplified
control mechanisms, team behavior education programs, advanced information systems,
team incentives) [ending in an internal growth crisis].


25. Change Behaviour


The Theory of Planned Behavior (TPB) of Icek Ajzen (1988, 1991) helps to understand
how we can change the behavior of people. The TPB is a theory, which predicts deliberate
behavior, because behavior can be deliberative and planned.
TPB is the successor of the similar Theory of Reasoned Action of Ajzen and Fishbein
(1975, 1980). The succession was the result of the discovery that behavior appeared not to
be 100% voluntary and under control, which resulted in the addition of perceived behavioral
control. With this addition the theory was called the Theory of Planned Behavior.
The theory distinguishes three kinds of considerations that guide behavior:
1. Behavioral Beliefs (beliefs about the likely consequences of the behavior)
2. Normative Beliefs (beliefs about the normative expectations of others)
3. Control Beliefs (beliefs about the presence of factors that may facilitate or impede
performance of the behavior).
Ajzen's three considerations are crucial in circumstances / projects / programs where the
behavior of people must be changed.
In their respective aggregates, behavioral beliefs produce a favorable or unfavorable attitude
toward the behavior, normative beliefs result in perceived social pressure or subjective norm,
and control beliefs give rise to perceived behavioral control. In combination, attitude toward
the behavior, subjective norm, and perception of behavioral control lead to the formation of a
behavioral intention. As a general rule, the more favorable the attitude and subjective norm
and the greater the perceived control, the stronger should be the persons intention to
perform the behavior in question.
More recently (2002), Ajzen investigated residual effects of past behavior on later behavior.
He came to the conclusion that this factor indeed exists, but that it cannot be ascribed to
habituation, as many people think. A review of existing evidence suggests that the residual
impact of past behavior is attenuated when measures of intention and behavior are
compatible, and vanishes when intentions are strong and well formed, expectations are
realistic, and specific plans for intention implementation have been developed. A research
project in the travel industry concluded that past travel choice contributes to the prediction of
later behavior only if circumstances remain relatively stable.
Example: The Theory of Planned Behavior of Ajzen can help to explain why advertising
campaigns merely providing information do not work. Increasing knowledge alone does not
help to change behavior very much. Campaigns that aim at attitudes, perceived norms and
control in making the change or buying certain goods have better results.
Similarly, in Value Based Management, programs that focus only on explaining the
importance of Managing for Value (knowledge transfer) will likely not succeed. Rather, one
should convince people to change their intentions by giving a lot of attention to attitudes,
subjective norms and perceived behavioral control.

26. Maslow Hierarchy of Needs


The Hierarchy of Needs model of Abraham Maslow
Each human being is motivated by needs. Our most basic needs are inborn, having evolved
over tens of thousands of years. Abraham
Maslow's Hierarchy of Needs helps to explain
how these needs motivate us all.
Hierarchy of Needs - Physiological needs
These are the very basic needs such as air, water,
food, sleep, sex, etc. When these are not
satisfied we may feel sickness, irritation, pain,
discomfort, etc. These feelings motivate us to
alleviate them as soon as possible to establish
homeostasis. Once they are alleviated, we may
think about other things.
Hierarchy of Needs - Safety needs
These have to do with establishing stability and
consistency in a chaotic world. These needs are
mostly psychological in nature. We need the
security of a home and family. However, if a
family is dysfunctional, e.g. with an abusive husband,
the wife cannot move to the next level because
she is constantly concerned for her safety. Love and belongingness have to wait until she is
no longer cringing in fear. Many in our society cry out for law and order because they do not
feel safe enough to go for a walk in their neighborhood.
Hierarchy of Needs - Love and belongingness needs
These are next on the ladder. Humans have a desire to belong to groups: clubs, work
groups, religious groups, family, gangs, etc. We need to feel loved (non-sexual) by others, to
be accepted by others. Performers appreciate applause. We need to be needed.
Hierarchy of Needs - Self-Esteem needs
There are two types of esteem needs. First is self-esteem which results from competence or
mastery of a task. Second, there's the attention and recognition that comes from others. This
is similar to the belongingness level, however, wanting admiration has to do with the need
for power.
Hierarchy of Needs - The need for self-actualization
This is "the desire to become more and more what one is, to become everything that one is
capable of becoming." People whose more basic needs are largely satisfied can maximize their potential. They can
seek knowledge, peace, esthetic experiences, self-fulfillment, oneness with God, etc.


27. EFQM Model


Total Quality Management (TQM) is the idea that controlling quality is not something that is
left to a "quality controller, a person who stands at the end of a production line checking final
output. It is (or should be) something that permeates an organization from the moment its
raw materials arrive to the moment its finished products leave the premises.
The EFQM Model is a non-prescriptive TQM framework based on nine criteria. Five of
these are 'Enablers' and four are 'Results'. The 'Enabler' criteria cover what an organization
does. The 'Results' criteria cover what an organization achieves. 'Results' are caused by
'Enablers' and feedback from 'Results' help to improve 'Enablers'.
The EFQM Model, which recognizes there are many approaches to achieving sustainable
excellence in all aspects of performance, is based on the premise that excellent results with
respect to Performance, Customers, People and Society are achieved through Leadership
driving Policy and Strategy, delivered through People, Partnerships and Resources, and
Processes.
Within this non-prescriptive approach there are some fundamental concepts which underpin
the EFQM Model:
- Results Orientation: achieving results that delight all the organization's stakeholders.
- Customer Focus: creating sustainable customer value.
- Leadership & Constancy of Purpose: visionary and inspirational leadership, coupled with
constancy of purpose.
- Management by Processes & Facts: managing the organization through a set of
interdependent and interrelated systems, processes and facts.
- People Development & Involvement: maximizing the contribution of employees through
their development and involvement.
- Continuous Learning, Innovation & Improvement: challenging the status quo and effecting
change by using learning to create innovation and improvement opportunities.
- Partnership Development: developing and maintaining value-adding partnerships.
- Corporate Social Responsibility: exceeding the minimum regulatory framework in which
the organization operates and striving to understand and respond to the expectations of
its stakeholders in society.
The EFQM Model is one of the most widely used organizational frameworks in Europe.


28. Path-Goal Theory Leadership Model


The Path-Goal Theory of Robert House says that a leader can affect the performance,
satisfaction, and motivation of a group by:
- offering rewards for achieving performance goals,
- clarifying paths towards these goals,
- removing obstacles to performance.
However, whether leadership behavior can do so effectively also depends on situational
factors.
According to House, there are four different types of leadership styles depending on the
situation:
1. Directive Leadership: The leader gives subordinates specific guidance on how to perform
their tasks.
2. Supportive Leadership: The leader is friendly and shows concern for the subordinates.
3. Participative Leadership: The leader consults with subordinates and considers their
suggestions.
4. Achievement-oriented Leadership: The leader sets high goals and expects
subordinates to have high-level performance.
The Situational Factors of the Path-Goal Theory are:
I) Subordinates' Personality:
A. Locus of Control (a participative leader is suitable for subordinates with an internal locus
of control; a directive leader is suitable for subordinates with an external locus of control).
B. Self-perceived ability (subordinates who perceive themselves as having high ability do
not like directive leadership).
II) Characteristics of the environment:
- When working on a task that has a high structure, directive leadership is redundant and
less effective.
- When a highly formal authority system is in place, directive leadership can again reduce
workers' satisfaction.
- When subordinates are in a team environment that offers great social support, the
supportive leadership style becomes less necessary.


SCM / IT / Ops Management


29. Benchmarking
Benchmarking is a systematic comparison of organizational processes and performance to
create new standards or to improve processes. Benchmarking models are used to
determine how well a business unit, division, organization or corporation is performing
compared with other similar organizations. A benchmark is often used for improving
communication, professionalizing the organization / processes or for budgetary reasons.
Traditionally, performance measures have been compared with previous measures from the
same organization at different times. Although this can be a good indication of the rate of
improvement within the organization, it could be that although the organization is improving,
the competition is improving faster.

There are four types of benchmarking methods:


1. internal (benchmark within a corporation, for example between business units)
2. competitive (benchmark performance or processes with competitors)
3. functional (benchmark similar processes within an industry)
4. generic (comparing operations between unrelated industries)

Typically, benchmarking models involves the following steps:


- scope definition
- choose benchmark partner(s)
- determine measurement methods, units, indicators and data collection method
- data collection
- analysis of the discrepancies
- present the results and discuss implications / improvement areas and goals
- make improvement plans or new procedures
- monitor progress and plan ongoing benchmark.

Benchmarking is a tough process that needs a lot of commitment to succeed. More than
once, benchmarking projects end in the 'they are different from us' syndrome, or competitive
sensitivity prevents the free flow of information that is necessary. However, comparing
performance and processes with 'best in class' is important and should ideally be done on a
continuous basis (the competition is improving its processes too...).


30. Toyotas Total Production System / JIT


Just-in-time, pioneered by Taiichi Ohno in Japan at the Toyota car assembly plants in the early
1970s, is a manufacturing organization philosophy. JIT cuts waste by supplying parts only when the
assembly process requires them. At the heart of JIT lies the kanban, the Japanese word for card.
This kanban card is sent to the warehouse to reorder a standard quantity of parts as and when they
have been used up in the assembly/manufacturing process. JIT requires precision, as the right parts
must arrive "just-in-time" at the right position (work station at the assembly line). It is used primarily
for high-volume repetitive flow manufacturing processes.
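The kanban reorder loop described above can be sketched as a trivial simulation; the bin
size, daily usage figures and immediate replenishment below are invented assumptions,
purely to show the trigger logic:

# Hypothetical kanban sketch: a card is sent to the warehouse to reorder a standard
# quantity whenever a bin of parts has been used up. Numbers are illustrative only.

BIN_SIZE = 50                       # standard quantity on one kanban card
on_hand = 100                       # two full bins at the work station
daily_usage = [30, 40, 25, 35, 20]  # assumed consumption at the assembly line

for day, used in enumerate(daily_usage, start=1):
    on_hand -= used
    cards_sent = 0
    while on_hand <= BIN_SIZE:      # a bin has been emptied -> send its card back
        on_hand += BIN_SIZE         # warehouse delivers one standard quantity
        cards_sent += 1
    print(f"Day {day}: used {used:2d}, kanban cards sent {cards_sent}, on hand {on_hand}")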
Historically, the JIT philosophy arose out of two other things:
1. Japan's wish to improve the quality of its production. At that time, Japanese companies had a bad
reputation as far as quality of manufacturing and car manufacturing in particular was concerned.
2. Kaizen, also a Japanese method of continuous improvement.
The Just-in-time framework regards inventories as a poor excuse for bad planning, inflexibility,
wrong machinery, quality problems, etc. The target of JIT is to speed up customer response while
minimizing inventories at the same time. Inventories help to respond quickly to changing customer
demands, but inevitably cost money and increase the needed working capital.
Typical attention areas of JIT implementations include:
- inventory reduction
- smaller production lots and batch sizes
- quality control
- complexity reduction and transparency
- flat organization structure and delegation
- waste minimization
With the arrival of the Internet and Supply Chain Planning software, companies have in the
meantime extended Just-in-time manufacturing externally, by requiring their suppliers to deliver
inventory to the factory only when it is needed for assembly, making JIT manufacturing, ordering and
delivery processes even speedier, more flexible and more efficient. In this way Integrated Supply
Networks (Demand Networks) or Electronic Supply Chains are being formed. Just-in-time is
sometimes referred to as 'Lean Production'.


31. Deming Cycle


The Deming cycle, or PDSA cycle, is a continuous quality improvement model consisting
of a logical sequence of four repetitive steps for continuous improvement and learning: Plan,
Do, Study (Check) and Act. The PDCA cycle is also known as the Deming Cycle, or as the
Deming Wheel or as the Continuous Improvement Spiral. It originated in the 1920s with the
eminent statistics expert Mr. Walter A. Shewhart, who introduced the concept of PLAN, DO
and SEE. The late Total Quality Management (TQM) guru and renowned statistician W.
Edwards Deming modified the Shewhart cycle as: PLAN, DO, STUDY, and ACT.
Along with the other well-known American quality guru, J.M. Juran, Deming went to Japan as
part of the occupation forces of the allies after World War II. Deming taught a lot of Quality
Improvement methods to the Japanese, including the usage of statistics and the PLAN, DO,
STUDY, ACT cycle.
Benefits of the PDCA cycle:
- daily routine management, for the individual and/or the team
- problem-solving process
- project management
- continuous development
- vendor development
- human resources development
- new product development
- process trials

The Deming cycle, or PDSA cycle:
PLAN: plan ahead for change. Analyze and predict the results.
DO: execute the plan, taking small steps in controlled circumstances.
STUDY (CHECK): study the results.
ACT: take action to standardize or improve the process.

In her book "The Deming Management Method" Mary Watson tells about the life of the
business guru the late W. Edwards Deming. The industrial miracle in Japan was a prime
example of what can happen when a nation commits itself to quality and long-range vision
instead of the latest illness: "Turning a Fast Buck-itis." In less then 50 years, Japan went
from making rubber dog-shit, to turning out some of the highest quality precision work in the
world. When Dr. Deming first began speaking in America, America was still riding along on
the post-war victory wave. No one would listen to him. The Japanese welcomed him, and
even today, traces of his quality-control methods are still seen in the industrial workplace.


32. Michael Porters Value Chain


The Value Chain framework of Michael Porter is a model that helps to analyze specific
activities through which firms can create value and competitive advantage.
Inbound Logistics
Includes receiving, storing, inventory control, transportation scheduling.
Operations
Includes machining, packaging, assembly, equipment maintenance, testing and all other
value-creating activities that transform the inputs into the final product.
Outbound Logistics
The activities required to get the finished product to the customers: warehousing, order
fulfillment, transportation, distribution management.
Marketing and Sales
The activities associated with getting buyers to purchase the product, including channel
selection, advertising, promotion, selling, pricing, retail management, etc.
Service
The activities that maintain and enhance the product's value, including customer support,
repair services, installation, training, spare parts management, upgrading, etc.
Procurement
Procurement of raw materials, servicing, spare parts, buildings, machines, etc.
Technology Development
Includes technology development to support the value chain activities, such as Research
and Development, process automation, design, redesign.
Human Resource Management
The activities associated with recruiting, development (education), retention and
compensation of employees and managers.
Firm Infrastructure
Includes general management, planning management, legal, finance, accounting, public
affairs, quality management, etc.


33. TQM: Six Sigma, Kaizen


The Six Sigma model is a highly disciplined approach that helps companies focus on
developing and delivering near-perfect products and services. It builds on the statistical
work of quality pioneers such as Joseph Juran, the Romanian-born US pioneer of quality
management. "Sigma" is the Greek letter used for the statistical term that measures how far
a given process deviates from perfection (the standard deviation). The higher the sigma
number, the closer to perfection: one sigma is not very good, while six sigma means only
3.4 defects per million opportunities. The
central idea behind Six Sigma is that if you can measure how many "defects" you have in a
process, you can systematically figure out how to eliminate them and get as close to "zero
defects" as possible.
The martial arts influence on Six Sigma can be seen in the system of "belts" it uses. If you
are a newcomer and go on a basic training, you get a green belt. Anyone who has the
responsibility for leading a Six Sigma team is called a black belt. Finally, there is a special
elite group called Master Black Belts who supervise the Black Belts.
The Kaizen method of continuous incremental improvements is an originally Japanese
management concept for incremental (gradual, continuous) change (improvement). Kaizen is
actually a way-of-life philosophy, assuming that every aspect of our life deserves to be
constantly improved. The Kaizen philosophy lies behind many Japanese management
concepts such as Total Quality Control, Quality Control circles, small group activities, and
labor relations. Key elements of Kaizen are quality, effort, involvement of all employees,
willingness to change, and communication.
Japanese companies distinguish between innovation (radical) and Kaizen (continuous).
Kaizen means literally: change (kai) to become good (zen).
The foundation of the Kaizen method consists of five founding elements: teamwork, personal
discipline, improved morale, quality circles and suggestions for improvement.
Out of this foundation, three key factors in Kaizen arise:
- elimination of waste (muda) and inefficiency
- the Kaizen five-S framework for good housekeeping
1. Seiri - tidiness
2. Seiton - orderliness
3. Seiso - cleanliness
4. Seiketsu - standardized clean-up
5. Shitsuke - discipline
- standardization.
When to apply the Kaizen philosophy? Although it is difficult to give generic advice, it is
clear that it fits well in incremental change situations that require long-term change and in
collective cultures. More individualistic cultures that are focused on short-term success are
often more conducive to concepts such as business process re-engineering.


34. Experience Curve


The Experience Curve Effects were first described by BCG founder Bruce Henderson in the
1960s. Henderson found that there is a consistent relationship between the cost of
production and the cumulative production quantity.
Simply put, it states that the more often a task is performed, the lower the cost of performing
it will be. Each time cumulative volume doubles, value-added costs (including administration,
marketing, distribution, and manufacturing) fall by a constant and predictable percentage.
Researchers since then have observed experience curve effects for various industries
ranging from 10 to 30 percent.
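The doubling rule can be written as C(n) = C(1) * n^b with b = log2(1 - e), where e is the
experience effect (e.g. 20% for an "80% curve"). A small illustrative Python sketch, with an
assumed 20% effect and an assumed starting cost:

# Hypothetical experience-curve sketch: a 20% cost reduction per doubling of
# cumulative volume (an "80% curve"). Starting cost and volumes are invented.
import math

def unit_cost(cumulative_volume, first_unit_cost=100.0, experience_effect=0.20):
    # Cost of the n-th unit: C(n) = C(1) * n ** log2(1 - experience_effect)
    b = math.log2(1.0 - experience_effect)          # ~ -0.32 for an 80% curve
    return first_unit_cost * cumulative_volume ** b

for n in (1, 2, 4, 8, 16):
    print(f"cumulative volume {n:2d}: unit cost {unit_cost(n):6.1f}")
# Each doubling of cumulative volume cuts the unit cost by the same 20%.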
The Experience Curve is a major enabler for a cost leadership strategy. If a company
can grasp a big market share quickly in a new market, it has a competitive cost
advantage because it can produce products cheaper than its competitors. Provided the cost
savings are passed on to the buyers as price decreases (rather than kept as profit margin
increases), this advantage is sustainable. If a business could accelerate its production
experience by increasing its market share, it could gain a cost advantage in its industry that
would be hard to match. The result is that many companies try to gain a large market share
quickly by investing heavily and pricing their products or services aggressively in new
markets. The investment can be recovered later, once the company has become a market
leader and has built itself a cash cow.
Some limitations of an experience curve-based strategy include:
- There are also other business strategies besides Cost Leadership Strategies.
- Competitors may also pursue a similar strategy, increasing the necessary investment
levels while decreasing the returns for both.
- Competitors that copy manufacturing methods may achieve even lower production costs by
not having to recover R&D investments.
- Technology breakthroughs may enable even bigger experience curve effects. This is
beneficial for later entrants.


35. Crisis Management


Obviously, any corporation hopes not to face "situations causing a significant business
disruption which stimulate extensive media coverage" (a crisis). The public scrutiny that
results from this media coverage often affects the normal operations of the company and can
have a (negative) financial, political, legal and governmental impact. Substantial value
destruction is to be feared, especially when the crisis is not handled well in the perception
of the media / public opinion. Crisis management deals with giving the right crisis
response (precautionary, structural and ad hoc).

Some generic help and tips on crisis management:


1. Prepare contingency plans in advance (a crisis management team and its members can
then be formed at very short notice; rehearse crises of various kinds).
2. Immediately and clearly announce internally that the only persons to speak about the
crisis to the outside world are the crisis team members.
3. Move quickly (the first hours after the crisis breaks are extremely important, because the
media often build upon the information from those first hours).
4. Use crisis management consultants (the objectivity of external PR consultants is
important; bring in specialist corporate image expertise).
5. Give accurate and correct information (trying to manipulate information will seriously
backfire if it is discovered, also internally!).
6. When deciding upon actions, consider not only the short-term losses, but focus also on
the long-term effects.
Executives at all levels of the organization are employed to manage crises and often do so
on a daily basis. Their skills are really tested when they have to manage significant crises
that have the potential to disrupt the organization's value creation process, income sources,
operating expenses, stock price, competitive position and ongoing business.
The most effective crisis management occurs when potential crises are detected and dealt
with quickly--before they can impact the organization's business. In those instances they
never come to the attention of the organization's key stakeholders or the general public via
the news media.
In instances where the crisis already has erupted, or it is inevitable the crisis will impact the
organization's key stakeholders, a business continuity plan is helpful to minimize the
disruption and damage. Developing such a plan can seem like a daunting task, but in
actuality it is a common-sense document. It involves identifying those functions and
processes that are critical to the business, then designing the operational and
communications contingency plans to deal with the potential failure of one or more of them
and how key stakeholders will react when they find out.
Corporations with business continuity plans for responding to likely disruptions will be in a
better position to minimize the business impact and financial damage. However, their
executives find the process of developing these plans has an indirect benefit. Their
organizations are more sensitive to possible crisis situations that could disrupt the business
and affect its operating expenses, profits and overall growth. As a result their managers
respond more rapidly and effectively to head them off.


Strategy
36. SEPT / PEST
The PEST Analysis is a framework that strategy consultants use to scan the
external macro-environment in which a firm operates. PEST is an acronym for the
following factors:

- Political factors
- Economic factors
- Social factors, and
- Technological factors.

PEST factors play an important role in the value creation opportunities of a strategy.
However, they are usually beyond the control of the corporation and must normally be
considered as either threats or opportunities. Remember that macro-economic factors can
differ per continent, country or even region, so normally a PEST analysis should be
performed per country.
Examples of PEST factors:

Political (incl. Legal):
- Environmental regulations and protection
- Tax policies
- International trade regulations and restrictions
- Contract enforcement law, consumer protection
- Employment laws
- Government organization / attitude
- Competition regulation
- Political stability
- Safety regulations

Economic:
- Economic growth
- Interest rates & monetary policies
- Government spending
- Unemployment policy
- Taxation
- Exchange rates
- Inflation rates
- Stage of the business cycle
- Consumer confidence

Social:
- Income distribution
- Demographics, population growth rates, age distribution
- Labor / social mobility
- Lifestyle changes
- Work/career and leisure attitudes, entrepreneurial spirit
- Education
- Fashion, hypes
- Health consciousness & welfare, feelings on safety
- Living conditions

Technological:
- Government research spending
- Industry focus on technological effort
- New inventions and development
- Rate of technology transfer
- Life cycle and speed of technological obsolescence
- Energy use and costs
- (Changes in) Information Technology
- (Changes in) Internet
- (Changes in) Mobile Technology


37. SWOT TOWS


A SWOT analysis is an instrumental framework in Value Based Management and Strategy
Formulation to identify the Strengths, Weaknesses, Opportunities and Threats for a
particular company.
Strengths and Weaknesses are internal value-creating (or destroying) factors such as assets,
skills or resources a company has at its disposal relative to its competitors. They can be
measured using internal assessments or external benchmarking.
Opportunities and Threats are external value creating (or destroying) factors a company
cannot control, but emerge from either the competitive dynamics of the industry/market or
from demographic, economic, political, technical, social, legal or cultural factors.

Strengths:
- specialist marketing expertise
- exclusive access to natural resources
- patents
- new, innovative product or service
- location of your business
- cost advantage through proprietary know-how
- quality processes and procedures
- strong brand or reputation

Weaknesses:
- lack of marketing expertise
- undifferentiated products and services (i.e. in relation to your competitors)
- location of your business
- competitors have superior access to distribution channels
- poor quality goods or services
- damaged reputation

Opportunities:
- developing market (China, the Internet)
- mergers, joint ventures or strategic alliances
- moving into new attractive market segments
- a new international market
- loosening of regulations
- removal of international trade barriers
- a market led by a weak competitor

Threats:
- a new competitor in your home market
- price war
- competitor has a new, innovative substitute product or service
- new regulations
- increased trade barriers
- taxation may be introduced on your product or service


38. Balanced Score Card


The BSC method of Kaplan and Norton is a strategic approach and performance
management system that enables organizations to translate a company's vision and strategy
into implementation, working from 4 perspectives:
1. financial perspective,
2. customer perspective,
3. business process perspective,
4. learning and growth perspective.
Ad 1: The BSC / Financial perspective: Kaplan and Norton do not disregard the traditional
need for financial data. Timely and accurate funding data will always be a priority, and
managers will do whatever necessary to provide it. In fact, often there is more than enough
handling and processing of financial data. With the implementation of a corporate database,
it is hoped that more of the processing can be centralized and automated. But the point is
that the current emphasis on financials leads to the "unbalanced" situation with regard to
other perspectives. There is perhaps a need to include additional financial-related data, such
as risk assessment and cost-benefit data in this category.
Ad 2: The BSC / Customer perspective: recent management philosophy has shown an
increasing realization of the importance of customer focus and customer satisfaction in any
business. These are leading indicators: if customers are not satisfied, they will eventually
find other suppliers that will meet their needs. Poor performance from this perspective is
thus a leading indicator of future decline, even though the current financial picture may look
good. In developing metrics for satisfaction, customers should be analyzed in terms of kinds
of customers and the kinds of processes for which we are providing a product or service to
those customer groups.
Ad 3: The BSC / Business Process perspective refers to internal business processes.
Metrics based on this perspective allow the managers to know how well their business is
running, and whether its products and services conform to customer requirements (the
mission). These metrics have to be carefully designed by those who know these processes
most intimately. In addition to the strategic management process, two kinds of business
processes may be identified: a) mission-oriented processes, and b) support processes.
Mission-oriented processes are the special functions of government offices, and many
unique problems are encountered in these processes. The support processes are more
repetitive in nature, and hence easier to measure and benchmark using generic metrics.
Ad 4: The BSC / Learning and Growth perspective includes employee training and
corporate cultural attitudes related to both individual and corporate self-improvement. In a
knowledge-worker organization, people are the main resource. In the current climate of rapid
technological change, it is becoming necessary for knowledge workers to be in a continuous
learning mode. Government agencies often find themselves unable to hire new technical
workers and at the same time is showing a decline in training of existing employees. Kaplan
and Norton emphasize that 'learning' is more than 'training'; it also includes things like
mentors and tutors within the organization, as well as that ease of communication among
workers that allows them to readily get help on a problem when it is needed. It also includes
technological tools such as an Intranet.
The integration of these four perspectives into a graphically appealing picture has made
the Balanced Scorecard method a very successful methodology within the Value Based
Management philosophy.

39. BCG Matrix


The BCG matrix method is based on the product life cycle theory that can be used to
determine what priorities should be given in the product portfolio of a business unit. To
ensure long-term value creation, a company should have a portfolio of products that
contains both high-growth products in need of cash inputs and low-growth products that
generate a lot of cash. It has 2 dimensions: market share and market growth. The basic
idea behind it is that the bigger the market share a product has, or the faster the product's
market grows, the better it is for the company.
Placing products in the BCG matrix results in 4 categories in a portfolio of a company:
1. Stars (=high growth, high market share)
- use large amounts of cash and are leaders in the business so they should also generate
large amounts of cash.
- frequently roughly in balance on net cash flow. However, every attempt should be made to
hold market share, because the reward will be a cash cow if market share is kept.
2. Cash Cows (=low growth, high market share)
- profits and cash generation should be high, and because of the low growth, investments
needed should be low. Keep profits high.
- Foundation of a company
3. Dogs (=low growth, low market share)
- avoid and minimize the number of dogs in a company.
- beware of expensive turn around plans.
- deliver cash, otherwise liquidate
4. Question Marks (= high growth, low market share)
- have the worst cash characteristics of all, because of high cash demands and low returns
due to low market share
- if nothing is done to change the market share, question marks will simply absorb great
amounts of cash and later, as growth stops, become dogs.
- either invest heavily, or sell off, or invest nothing and generate whatever cash they can:
increase market share or deliver cash.
The BCG Matrix method can help understand a frequently made strategy mistake: having a
one-size-fits-all approach to strategy, such as a generic growth target (9 percent per year)
or a generic return on capital of say 9.5% for an entire corporation. In such a scenario:
A. Cash Cow Business Units will beat their profit target easily; their management have an
easy job and are often praised anyhow. Even worse, they are often allowed to reinvest
substantial cash amounts in their businesses, which are mature and not growing anymore.
B. Dog Business Units fight an impossible battle and, even worse, investments are made
now and then in hopeless attempts to 'turn the business around'.
C. As a result, (all) Question Mark and Star Business Units get mediocre-sized investment
funds. In this way they are unable to ever become cash cows. These inadequately invested
sums of money are a waste. Either these SBUs should receive enough investment funds to
enable them to achieve real market dominance and become a cash cow (or star), or
companies are advised to disinvest and try to get whatever cash possible out of the question
marks that were not selected.
Some limitations of the Boston Consulting Group Matrix include:
- High market share is not the only success factor.
- Market growth is not the only indicator of the attractiveness of a market.
- Sometimes Dogs can earn even more cash than Cash Cows.
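A minimal sketch of the two-dimensional classification described above, with hypothetical
threshold values (10% market growth, relative market share of 1.0) and an invented product
portfolio:

# Hypothetical BCG classification sketch. The growth threshold (10%), the
# relative-market-share threshold (1.0) and the portfolio are assumptions.

GROWTH_THRESHOLD = 0.10     # market growth rate separating high/low growth
SHARE_THRESHOLD = 1.0       # relative market share (own share / largest competitor)

def classify(market_growth, relative_share):
    high_growth = market_growth >= GROWTH_THRESHOLD
    high_share = relative_share >= SHARE_THRESHOLD
    if high_growth and high_share:
        return "Star"
    if not high_growth and high_share:
        return "Cash Cow"
    if high_growth and not high_share:
        return "Question Mark"
    return "Dog"

portfolio = {               # product: (market growth, relative market share)
    "Product A": (0.15, 1.8),
    "Product B": (0.03, 2.2),
    "Product C": (0.18, 0.4),
    "Product D": (0.02, 0.3),
}
for product, (growth, share) in portfolio.items():
    print(f"{product}: {classify(growth, share)}")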


40. Porters Five Forces Model


The Five Forces model of Porter is an outside-in business unit strategy tool that is used to
make an analysis of the attractiveness (value...) of an industry structure. The Competitive
Forces analysis is made by the identification of five fundamental competitive forces:
- the entry of competitors (how easy or difficult is it for new entrants to start to compete,
which barriers exist)
- the threat of substitutes (how easily can our product or service be substituted, especially
by something cheaper)
- the bargaining power of buyers (how strong is the position of buyers, can they work
together to order large volumes)
- the bargaining power of suppliers (how strong is the position of sellers, are there many
or only few potential suppliers, is there a monopoly)
- the rivalry among the existing players (is there strong competition between the existing
players, is one player very dominant or are all roughly equal in strength/size)
- as a sixth factor could be added: government.
Porter's competitive forces model is probably one of the most often used business
strategy tools and has proven its usefulness on numerous occasions. Porter's model is
particularly strong in thinking outside-in. Care should therefore be taken not to
underestimate or underemphasize the importance of the (existing) strengths of the
organization (inside-out) when applying this five competitive forces framework of Porter.


41. STRATPORT PADSS


The STRATPORT model of Larreche and Srinivasan (1981, 1982) is a decision support
system for the allocation of a firm's financial resources across its Strategic Business Units
(portfolio analysis). The approach models the impact of general marketing expenditures on
both market share and on the firm's cost structure. Given a specific portfolio strategy, the
system can evaluate the profit and cash flow implications of following that strategy over time.
Alternatively, the approach can determine the optimal allocation of marketing expenditures
across Strategic Business Units in order to maximize net present value over a specified time
horizon.
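STRATPORT itself is a full mathematical model; as a toy illustration of the allocation idea
only, the sketch below brute-forces the split of a marketing budget across three hypothetical
SBUs, each with an assumed diminishing-returns response of NPV to spend (the budget,
SBU names and response functions are invented and bear no relation to the actual
STRATPORT formulation):

# Toy illustration: allocate a marketing budget across SBUs to maximize total NPV.
import itertools
import math

BUDGET = 10  # total marketing budget, in $m, allocated in steps of $1m

# Assumed diminishing-returns NPV response to marketing spend for each SBU.
response = {
    "SBU 1": lambda spend: 12 * math.log1p(spend),
    "SBU 2": lambda spend:  8 * math.log1p(spend),
    "SBU 3": lambda spend:  5 * math.log1p(spend),
}

best_value, best_plan = -1.0, None
# Enumerate every split of the budget (in $1m steps) over the three SBUs.
for a, b in itertools.product(range(BUDGET + 1), repeat=2):
    if a + b > BUDGET:
        continue
    plan = {"SBU 1": a, "SBU 2": b, "SBU 3": BUDGET - a - b}
    value = sum(response[sbu](spend) for sbu, spend in plan.items())
    if value > best_value:
        best_value, best_plan = value, plan

print(f"Best allocation: {best_plan}, total NPV: {best_value:.1f}")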
In the Boston Consulting Group approach (BCG Matrix), relative market share and the market
growth are used to classify business units as Question Marks, Stars, Cash Cows, or Dogs.
In the General Electric/McKinsey approach, the business units are classified into nine groups according
to company strength and industry attractiveness. The position of a given business unit on
each of these dimensions is determined qualitatively from a number of market, competitive,
environmental, and internal factors. The Royal Dutch Shell approach is somewhat similar
although the two dimensions are called company's competitive capabilities and prospects for
sector profitability, and the set of factors and their integration into these composite
dimensions are also different. The philosophy underlying these approaches is, however,
similar. At a given point in time, each business unit has a specific role in the portfolio
according to its short-term and long-term economic potential. This role determines the
allocation of financial resources among elements of the portfolio. Minimum or maintenance
investments will be made in a group of business units so that they generate a maximum
cash flow in the short term.
The STRATPORT (STRATegic PORTfolio planning) decision support system is an on-line
computerized mathematical model utilizing empirical and (managerial) judgment-based data.
This system was designed to assist top managers and corporate planners in the evaluation
and formulation of business portfolio strategies, and it represents both an operationalization
and an extension of the business portfolio analysis approaches previously mentioned.


Accounting
42. Activity Based Costing
Activity Based Costing (ABC) is an alternative to the traditional way of cost
accounting. Traditionally it is believed that high-volume customers are profitable customers,
that a loyal customer is also a profitable one, and that profits will follow a happy customer.
Studies on customer profitability have revealed that the above is not necessarily true. ABC is
a costing model that identifies the cost pools, or activity centers, in an organization and
assigns costs to products and services based on cost drivers: the number of events or
transactions involved in the process of providing a product or service. As a result, Activity
Based Costing can help managers see how to maximize shareholder value and improve
corporate performance.
Historically, cost accounting models allocated indirect costs on the basis of volume.
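A minimal sketch of the ABC mechanics, with invented cost pools, cost drivers and products:
each pool's cost is divided by its total driver volume to get a rate, and products are charged
according to the driver volume they consume.

# Hypothetical Activity Based Costing sketch. Cost pools, drivers and products are
# invented; the point is the pool-rate and driver-based assignment arithmetic.

cost_pools = {                      # activity center: (total cost, total driver volume, driver)
    "machine setups":   (50_000, 200,   "setups"),
    "order processing": (30_000, 1_500, "orders"),
    "quality checks":   (20_000, 500,   "inspections"),
}

products = {                        # driver volumes consumed by each product
    "Product X": {"setups": 150, "orders": 500,   "inspections": 400},
    "Product Y": {"setups": 50,  "orders": 1_000, "inspections": 100},
}

# Rate per driver unit for each activity pool.
rates = {driver: cost / volume for cost, volume, driver in cost_pools.values()}

for product, usage in products.items():
    assigned = sum(rates[driver] * volume for driver, volume in usage.items())
    print(f"{product}: overhead assigned = ${assigned:,.0f}")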
Typical benefits of Activity-Based Costing (also: 'Activity Based Management') include:
- Identifying the most and least profitable customers, products and channels.
- Determining the true contributors to, and detractors from, financial performance.
- Accurately predicting costs, profits and resource requirements associated with changes
in production volumes, organizational structure and resource costs.
- Easily identifying the root causes of poor financial performance.
- Tracking costs of activities and work processes.
- Equipping managers with cost intelligence to drive improvements.
- Facilitating a better marketing mix.
- Enhancing bargaining power with the customer.
- Achieving better positioning of products.

With costing based on activities, the cost of serving a customer can be ascertained
individually. Deducting the product cost and the cost to serve each customer, one can arrive
at the profitability of each customer. This method of dealing with customer cost and product
cost separately has led to identifying the profitability of each customer and to positioning
products and services accordingly.
ABC implementation can help employees understand the various costs involved, which will
in turn enable them to analyze the cost, identify the value-added and non-value-added
activities, implement the improvements and realize the benefits. This is a continuous
improvement process in terms of analyzing cost, reducing or eliminating non-value-added
activities and achieving overall efficiency.
ABC has helped enterprises answer the market need for better quality products at
competitive prices. By analyzing product profitability and customer profitability, the ABC
method has contributed effectively to top management's decision-making process. With
ABC, enterprises are able to improve their efficiency and reduce costs without sacrificing
value for the customer.


43. Economic Value Added / Breakeven Analysis


Economic Value Added (EVA) is a financial performance method to calculate the true
economic profit of a corporation. EVA can be calculated as net operating profit after taxes
(NOPAT) minus a charge for the opportunity cost of the capital invested.
EVA is an estimate of the amount by which earnings exceed or fall short of the required
minimum rate of return for shareholders or lenders at comparable risk.
Unlike Market-based measures, such as MVA, EVA can be calculated at divisional
(Strategic Business Unit) level.
Unlike Stock measures, EVA is a flow and can be used for performance evaluation over time.
Unlike accounting profit, such as EBIT, Net Income and EPS, EVA is Economic and is based
on the idea that a business must cover both the operating costs AND the capital costs.
Usage of the EVA method
EVA can be used for the following purposes:
- setting organizational goals
- performance measurement
- determining bonuses
- communication with shareholders and investors
- motivation of managers
- capital budgeting
- corporate valuation
- analyzing equity securities
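A minimal sketch of the calculation described above (EVA = NOPAT minus a capital charge),
using invented figures and an assumed cost of capital:

# Hypothetical EVA sketch: all figures (NOPAT, invested capital, WACC) are invented.

nopat = 1_200_000             # net operating profit after taxes
invested_capital = 8_000_000  # capital invested in the business
wacc = 0.10                   # weighted average cost of capital (10%, assumed)

capital_charge = wacc * invested_capital
eva = nopat - capital_charge

print(f"Capital charge: ${capital_charge:,.0f}")
print(f"EVA: ${eva:,.0f}")    # positive: earnings exceed the required minimum return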
The Break-even point is, in general, the point at which gains equal losses: the point
where sales or revenues equal expenses, or where total costs equal total revenues. There
is no profit made or loss incurred at the break-even point. This is important for anyone who
manages a business, since the break-even point is the lower limit of profit when setting
prices and determining margins.
Breaking even today does not recover the losses incurred in the past, build up a reserve
for future losses, or provide a return on your investment (the reward for exposure to risk).
The Break-even method can be applied to a product, an investment, or the entire
company's operations and is also used in the options world. In options, the break-even point
is the market price that a stock must reach for option buyers to avoid a loss if they exercise.
For a call, it is the strike price plus the premium paid. For a put, it is the strike price minus
the premium paid.
The Break-even point analysis must not be mistaken for the payback period, the time it
takes to recover an investment.
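A minimal sketch of the break-even arithmetic for a product, using invented fixed costs, price
and variable cost per unit; the option break-even mentioned above follows the same "gains
equal losses" idea:

# Hypothetical break-even sketch: fixed costs, price and unit variable cost are invented.

fixed_costs = 120_000          # per period
price_per_unit = 50.0
variable_cost_per_unit = 30.0

contribution_margin = price_per_unit - variable_cost_per_unit
breakeven_units = fixed_costs / contribution_margin
breakeven_revenue = breakeven_units * price_per_unit

print(f"Break-even volume:  {breakeven_units:,.0f} units")
print(f"Break-even revenue: ${breakeven_revenue:,.0f}")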


44. Ratio Analysis Z-Score


The Z-Score formula for Predicting Bankruptcy of Edward Altman is a multivariate
formula for a measurement of the financial health of a company and a powerful diagnostic
tool that forecasts the probability of a company entering bankruptcy within a 2 year period.
Studies measuring the effectiveness of the Z-Score have shown that the model is often
accurate in predicting bankruptcy (72%-80% reliability).
The Z-Score was developed in 1968 by Dr. Edward I. Altman, Ph.D., a financial economist
and professor at New York University's Stern School of Business.
The Z-Score bankruptcy predictor combines five common business ratios, using a
weighting system calculated by Altman to determine the likelihood of a company going
bankrupt. It was derived based on data from manufacturing firms, but has since proven to be
effective as well (with some modifications) in determining the risk a service firm will go
bankrupt.
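For reference, the original public-manufacturer version combines the five ratios with Altman's
published fixed weights; the sketch below shows that calculation with invented balance-sheet
and income figures (the inputs are assumptions, the cut-offs match the ranges given below):

# Original (public manufacturer) Altman Z-Score:
# Z = 1.2*X1 + 1.4*X2 + 3.3*X3 + 0.6*X4 + 1.0*X5. Input figures are invented.

def altman_z(working_capital, retained_earnings, ebit, market_value_equity,
             total_liabilities, sales, total_assets):
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

z = altman_z(working_capital=2_000_000, retained_earnings=3_000_000, ebit=1_500_000,
             market_value_equity=6_000_000, total_liabilities=5_000_000,
             sales=12_000_000, total_assets=10_000_000)

if z >= 3.0:
    zone = "bankruptcy not likely"
elif z <= 1.8:
    zone = "bankruptcy likely"
else:
    zone = "gray area"
print(f"Z-Score: {z:.2f} ({zone})")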
How should the results be judged? It depends:
- Original Z-SCORE [For Public Manufacturer] If the score is 3.0 or above - bankruptcy is
not likely. If the Score is 1.8 or less - bankruptcy is likely. A score between 1.8 and 3.0 is the
gray area. Probabilities of bankruptcy within the above ranges are 95% for one year and
70% within two years. Obviously, a higher score is desirable.
- Model A Z'-Score [For Private Manufacturer] Model A of Altman's Z-Score is appropriate
for a private manufacturing firm. Model A should not be applied to other companies. A score
of 2.90 or above indicates that bankruptcy is not likely, but a score of 1.23 or below is a
strong indicator that bankruptcy is likely. Probabilities of bankruptcy in the above ranges are
95% for one year and 70% within two years. Obviously, a higher score is desirable.
- Model B Z'-Score [For Private General Firm] Edward Altman developed this version of
the Altman Z-Score to predict the likelihood of a privately owned non-manufacturing
company going bankrupt within one or two years. Model B is appropriate for a private
general (non-manufacturing) firm. Model B should not be applied to other companies. A
score of 1.10 or lower indicates that bankruptcy is likely, while a score of 2.60 or above can
be an indicator that bankruptcy is not likely. A score between the two is the gray area.
Probabilities of bankruptcy in the above ranges are 95% for one year and 70% within two
years. Again, obviously, a higher score is desirable.


45. Variance Analysis


In budgeting (or management accounting in general), a variance is the difference between a
budgeted, planned or standard amount and the actual amount incurred/sold. Variances can
be computed for both costs and revenues.
The concept of variance is intrinsically connected with planned and actual results and effects
of the difference between those two on the performance of the entity or company.
Types of variances
Variances can be divided according to their effect or nature of the underlying amounts.
When effect of variance is concerned, there are two types of variances:
When actual results are better than expected results, the variance is described as a
favorable variance. In common use, a favorable variance is denoted by the letter F - usually
in parentheses (F).
When actual results are worse than expected results, the variance is described as an
adverse variance, or unfavorable variance. In common use, an adverse variance is denoted
by the letter A or the letter U - usually in parentheses (A).
The second typology (according to the nature of the underlying amount) is determined by the
needs of users of the variance information and may include e.g.:
Variable cost variances
Direct material variances
Direct labour variances
Variable production overhead variances
Fixed production overhead variances
Sales variances
Variance Analysis
Variance analysis, in budgeting (or management accounting in general), is a tool of
budgetary control by evaluation of performance by means of variances between budgeted
amount, planned amount or standard amount and the actual amount incurred/sold. Variance
analysis can be carried out for both costs and revenues.
Variance analysis is usually associated with explaining the difference (or variance) between
actual costs and the standard costs allowed for the good output. For example, the difference
in materials costs can be divided into a materials price variance and a materials usage
variance. The difference between the actual direct labor costs and the standard direct labor
costs can be divided into a rate variance and an efficiency variance. The difference in
manufacturing overhead can be divided into spending, efficiency, and volume variances. Mix
and yield variances can also be calculated.
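A minimal sketch of the materials price and usage variances described above, with invented
standard and actual figures; positive results are flagged as adverse (A), negative results as
favorable (F):

# Hypothetical direct materials variance sketch; all quantities and prices are invented.

standard_price = 4.00        # $ per kg allowed in the standard
standard_qty_allowed = 2_500 # kg allowed for the actual good output
actual_price = 4.30          # $ per kg actually paid
actual_qty = 2_650           # kg actually used

def flag(v):                 # positive = adverse (A), negative = favorable (F)
    return "A" if v > 0 else "F"

price_variance = (actual_price - standard_price) * actual_qty
usage_variance = (actual_qty - standard_qty_allowed) * standard_price
total_variance = price_variance + usage_variance

print(f"Materials price variance: ${price_variance:,.0f} ({flag(price_variance)})")
print(f"Materials usage variance: ${usage_variance:,.0f} ({flag(usage_variance)})")
print(f"Total materials variance: ${total_variance:,.0f} ({flag(total_variance)})")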
Variance analysis helps management to understand the present costs and then to control
future costs.

Economics
46. Forecasting
Forecasting can be broadly considered as a method or a technique for estimating many
future aspects of a business or other operation. There are numerous techniques that can be
used to accomplish the goal of forecasting. For example, a retailing firm that has been in
business for 25 years can forecast its volume of sales in the coming year based on its
experience over the 25-year period; such a forecasting technique bases the future forecast
on past data.
QUALITATIVE FORECASTING METHODS
Qualitative forecasting techniques generally employ the judgment of experts in the
appropriate field to generate forecasts. They can be applied in situations where historical
data are simply not available.
DELPHI TECHNIQUE: In the Delphi technique, an attempt is made to develop forecasts
through "group consensus." Usually, a panel of experts is asked to respond to a series of
questionnaires. The experts, physically separated from and unknown to each other, are
asked to respond to an initial questionnaire (a set of questions). Then, a second
questionnaire is prepared incorporating information and opinions of the whole group. Each
expert is asked to reconsider and to revise his or her initial response to the questions. This
process is continued until some degree of consensus among experts is reached.
SCENARIO WRITING: Under this approach, the forecaster starts with different sets of
assumptions. For each set of assumptions, a likely scenario of the business outcome is
charted out. Thus, the forecaster would be able to generate many different future scenarios
(corresponding to the different sets of assumptions) to make a final decision.
SUBJECTIVE APPROACH: The subjective approach allows individuals participating in the
forecasting decision to arrive at a forecast based on their subjective feelings and ideas.
"Brainstorming sessions" are frequently used as a way to develop new ideas or to solve
complex problems.
QUANTITATIVE FORECASTING METHODS
Quantitative forecasting methods are based on an analysis of historical data concerning the
time series of the specific variable of interest and possibly other related time series.
TIME SERIES METHODS OF FORECASTING: A time series comprises four separate
components: a trend component, a cyclical component, a seasonal component, and an
irregular component. Taken together, these four components determine the values of the
time series. In a time series, measurements are taken at successive
points or over successive periods. The measurements may be taken every hour, day, week,
month, or year, or at any other regular (or irregular) interval. A trend emerges due to one or
more long-term factors, such as changes in population size, changes in the demographic
characteristics of population, and changes in tastes and preferences of consumers.
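As a very simple illustration of a time series method (a naive sketch, not a full decomposition
into the four components), the code below forecasts the next period as a moving average of the
most recent observations. The sales figures are made up for the example.

    # Naive time series forecast: simple moving average of the last k observations.
    sales = [120, 132, 128, 140, 151, 147, 158, 163]   # hypothetical quarterly sales

    def moving_average_forecast(series, k=4):
        window = series[-k:]                           # most recent k observations
        return sum(window) / len(window)

    print("Next-period forecast:", moving_average_forecast(sales))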
CAUSAL METHODS OF FORECASTING: Causal methods use the cause-and-effect
relationship between the variable whose future values are being forecast and other related
variables or factors. The most widely known causal method is regression analysis, a
statistical technique used to develop a mathematical model showing how a set of variables is
related. This mathematical relationship can be used to generate forecasts.
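A minimal sketch of a causal (regression) forecast is given below, assuming NumPy is available.
The advertising-spend and sales figures are hypothetical, used only to show the mechanics of
fitting a line and projecting from it.

    # Causal forecasting sketch: simple linear regression of sales on ad spend.
    import numpy as np

    ad_spend = np.array([10, 12, 15, 18, 20, 24])      # hypothetical driver variable
    sales    = np.array([110, 118, 131, 142, 150, 166])

    slope, intercept = np.polyfit(ad_spend, sales, 1)  # least-squares straight line

    planned_spend = 28
    forecast = slope * planned_spend + intercept
    print(f"sales = {slope:.2f} * ad_spend + {intercept:.2f}; forecast = {forecast:.1f}")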


47. Pricing Model


Product or service is free, revenue from ads and critical mass. This is the most common
model touted by Internet startups today, the so-called Facebook model, where the service is
free, and the revenue comes from click-through advertising. It's great for customers, but not
for startups, unless you have deep pockets. If you have real guts, try the Twitter model of no
revenue, counting on the critical mass value from millions of customers.
Product is free, but you pay for services. In this model, the product is given away for free
and the customers are charged for installation, customization, training or other services. This
is a good model for getting your foot in the door, but be aware that this is basically a services
business with the product as a marketing cost.
Freemium model. In this variation on the free model, used by LinkedIn and many other
Internet offerings, the basic services are free, but premium services are available for an
additional fee. This also requires a huge investment to get to critical mass, and real work to
differentiate and sell premium services to users who are already locked in as free users.
Cost-based model. In this more traditional product pricing model, the price is set at two to
five times the product cost. If your product is a commodity, the margin may be as thin as ten
percent. Use it when your new technology gives you a tremendous cost improvement. Skip it
where there are many competitors.
Value model. If you can quantify a large value or cost savings to the customer, charge a
price commensurate with the value delivered. This doesn't work well with "nice to have"
offerings, like social networks, but does work for new drugs that solve critical health
problems.
Portfolio pricing. This model is relevant only if you have multiple products and services,
each with a different cost and utility. Here your objective is to make money with the portfolio,
some with high markups and some with low, depending on competition, lock-in, value
delivered, and loyal customers. This one takes expert management to work.
Tiered or volume pricing. In certain product environments, where a given enterprise
product may have one user or hundreds of thousands, a common approach is to price by
user group ranges, or volume usage ranges. Keep the number of tiers small for
manageability (a small illustrative sketch follows at the end of this section). This approach
doesn't typically apply to consumer products and services.
Competitive positioning. In heavily competitive environments, the price has to be
competitive, no matter what the cost or volume. This model is often a euphemism for pricing
low in certain areas to drive competitors out, and high where competition is low. Competing
on price alone is a good way to kill your startup.
Feature pricing. This approach works if your product can be sold bare-bones for a low
price, and price increments added for additional features. It can be a very competitive
approach, but the product must be designed and built to provide good utility at many levels.
This is a very costly development, testing, documentation, and support challenge.
Razor blade model. In this model, like cheap printers with expensive ink cartridges, the
base unit is often sold below cost, with the anticipation of ongoing revenue from expensive
supplies. This is another model that requires deep pockets to start, so is normally not an
option for startups.
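As a concrete example of the tiered or volume pricing model described above, here is a minimal
Python sketch of a per-user tier lookup. The tier boundaries and prices are entirely hypothetical;
real schedules would be set from cost, value and competitive data.

    # Hypothetical tiered (per-user) pricing lookup: a few tiers, each with a flat
    # per-user monthly price applied to the whole user count.
    TIERS = [            # (max_users, price_per_user_per_month)
        (10,    50.0),
        (100,   35.0),
        (1000,  20.0),
        (None,  12.0),   # top tier: no upper bound
    ]

    def monthly_price(users):
        for max_users, per_user in TIERS:
            if max_users is None or users <= max_users:
                return users * per_user

    for n in (8, 250, 5000):
        print(n, "users ->", monthly_price(n))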


48. Stochastic Model


"Stochastic" means being or having a random variable. A stochastic model is a tool for
estimating probability distributions of potential outcomes by allowing for random variation in
one or more inputs over time. The random variation is usually based on fluctuations
observed in historical data for a selected period using standard time-series techniques.
Distributions of potential outcomes are derived from a large number of simulations
(stochastic projections) which reflect the random variation in the input(s).
Its application initially started in physics. It is now being applied in engineering, life sciences,
social sciences, and finance. Like any other company, an insurer has to show that its assets
exceed its liabilities to be solvent. In the insurance industry, however, assets and liabilities
are not known entities. They depend on how many policies result in claims, inflation from
now until the claim, investment returns during that period, and so on. So the valuation of an
insurer involves a set of projections, looking at what is expected to happen, and thus coming
up with the best estimate for assets and liabilities, and therefore for the company's level of
solvency.
Deterministic approach
The simplest way of doing this, and indeed the primary method used, is to look at best
estimates. The projections in financial analysis usually use the most likely rate of claim, the
most likely investment return, the most likely rate of inflation, and so on. The projections in
engineering analysis usually use both the most likely rate and the most critical rate. The
result is either a point estimate (the best single estimate of the company's current solvency
position) or multiple point estimates, depending on how the problem is defined. Selecting
and identifying parameter values is frequently a challenge for less experienced analysts.
Stochastic models help to assess the interactions between variables, and are useful tools to
numerically evaluate quantities, as they are usually implemented using Monte Carlo
simulation techniques. While there is an advantage here, in estimating quantities that would
otherwise be difficult to obtain using analytical methods, a disadvantage is that such
methods are limited by computing resources as well as simulation error. Below are some
examples:
Percentiles
This idea is seen again when one considers percentiles (see percentile). When assessing
risks at specific percentiles, the factors that contribute to these levels are rarely at these
percentiles themselves. Stochastic models can be simulated to assess the percentiles of the
aggregated distributions.
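A minimal Monte Carlo sketch of this idea is shown below, assuming NumPy is available. The
claim-frequency and severity distributions and their parameters are illustrative assumptions only,
not taken from the text.

    # Monte Carlo sketch: a percentile of an aggregate loss distribution.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sims = 20_000

    # Assumed model: claim counts ~ Poisson(5), individual severities ~ lognormal.
    counts = rng.poisson(lam=5, size=n_sims)
    aggregate = np.array([rng.lognormal(mean=10, sigma=1, size=c).sum() for c in counts])

    print("mean aggregate loss:", round(aggregate.mean()))
    print("95th percentile    :", round(np.percentile(aggregate, 95)))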
Truncations and censors
Truncating and censoring of data can also be estimated using stochastic models. For
instance, applying a non-proportional reinsurance layer to the best estimate losses will not
necessarily give us the best estimate of the losses after the reinsurance layer. In a simulated
stochastic model, the simulated losses can be made to "pass through" the layer and the
resulting losses assessed appropriately.
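Continuing the same idea, the sketch below passes simulated losses through a hypothetical
non-proportional layer and compares the mean retained loss with the (generally different) figure
obtained by applying the layer to the mean loss. The distribution and layer terms are assumptions
for illustration only.

    # Passing simulated losses through a non-proportional reinsurance layer.
    import numpy as np

    rng = np.random.default_rng(7)
    losses = rng.lognormal(mean=12, sigma=1.2, size=50_000)   # hypothetical gross losses

    attachment, limit = 500_000, 1_000_000                    # assumed layer: 1m xs 0.5m

    recoveries = np.clip(losses - attachment, 0, limit)       # what the layer pays
    retained = losses - recoveries                            # what the insurer keeps

    print("mean gross loss               :", round(losses.mean()))
    print("mean retained loss (simulated):", round(retained.mean()))
    # Applying the layer to the mean loss instead gives a different answer:
    naive = losses.mean() - min(max(losses.mean() - attachment, 0), limit)
    print("layer applied to the mean loss:", round(naive))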


49. Scenario Planning


Scenario planning is a model for learning about the future in which a corporate strategy is
formed by drawing up a small number of scenarios: stories about how the future may unfold
and how this may affect an issue that confronts the corporation.
Royal Dutch Shell, one of the first and leading adopters, defines scenarios as follows:
"Scenarios are carefully crafted stories about the future embodying a wide variety of ideas
and integrating them in a way that is communicable and useful. Scenarios help us link the
uncertainties we hold about the future to the decisions we must make today."
The scenario planning method works by understanding the nature and impact of the most
uncertain and important driving forces affecting the future. It is a group process which
encourages knowledge exchange and development of mutual deeper understanding of
central issues important to the future of your business. The goal is to craft a number of
diverging stories by extrapolating uncertain and heavily influential driving forces. The
stories, together with the work of getting there, have the dual purpose of increasing
knowledge of the business environment and widening both the receivers' and participants'
perception of possible future events. The method is most widely used as a strategic
management tool, but it is also used for enabling group discussion about a common future.
Typically, the scenario planning process is as follows:
- identify people who will contribute a wide range of perspectives
- conduct comprehensive interviews/workshops about how participants see big shifts coming
  in society, economics, politics, technology, etc.
- cluster or group these views into connected patterns
- the group draws up a list of priorities (the best ideas)
- sketch out rough pictures of the future based on these priorities (stories, rough scenarios)
- work these up further into detailed impact scenarios (determine in what way each scenario
  will affect the corporation)
- identify early warning signals (things that indicate that a particular scenario is starting to
  unfold)
- monitor, evaluate and review the scenarios
Some traps to avoid in Scenario Planning:
1) treating scenarios as forecasts
2) constructing scenarios based on too simplistic a difference, such as optimistic and
pessimistic
3) failing to make scenarios global enough in scope
4) failing to focus scenarios in areas of potential impact on the business
5) treating scenarios as an informational or instructional tool rather than for participative
learning / strategy formation
6) not having an adequate process for engaging executive teams in the scenario planning
process
7) failing to put enough imaginative stimulus into the scenario design
8) not using an experienced facilitator


50. Harrod-Domar Model


The model suggests that the economy's rate of growth depends on the level of saving and
the productivity of investment, i.e. the capital-output ratio. The Harrod-Domar model was
developed to help analyse the business cycle. However, it was later adapted to explain
economic growth. It concluded that:
1) Economic growth depends on the amount of labour and capital.
2) As LDCs often have an abundant supply of labour, it is a lack of physical capital that holds
back economic growth and development.
3) More physical capital generates economic growth.
4) Net investment leads to more capital accumulation, which generates higher output and
income.
5) Higher income allows higher levels of saving.
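In its standard textbook form (stated here as an added assumption, since the text above does not
give the equation), the Harrod-Domar warranted growth rate is the saving ratio divided by the
capital-output ratio, g = s / v. A tiny sketch:

    # Textbook Harrod-Domar relation: growth rate = saving ratio / capital-output ratio.
    def harrod_domar_growth(saving_ratio, capital_output_ratio):
        return saving_ratio / capital_output_ratio

    # Hypothetical economy: 24% of income saved, 4 units of capital per unit of output.
    g = harrod_domar_growth(saving_ratio=0.24, capital_output_ratio=4.0)
    print(f"implied growth rate: {g:.1%}")    # -> 6.0%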
Lewis Structural Change (dual-sector) Model:
Many LDCs have dual economies:
The traditional agricultural sector was assumed to be of a subsistence nature characterised
by low productivity, low incomes, low savings and considerable underemployment.
The industrial sector was assumed to be technologically advanced, with high levels of
investment, operating in an urban environment. Lewis suggested that the modern industrial
sector would attract workers from the rural areas. Industrial firms, whether private or publicly
owned, could offer wages that would guarantee a higher quality of life than remaining in the
rural areas could provide.
Furthermore, as the level of labour productivity was so low in traditional agricultural areas
people leaving the rural areas would have virtually no impact on output.
Indeed, the amount of food available to the remaining villagers would increase, as the same
amount of food could be shared amongst fewer people. This might generate a surplus which
could then be sold, generating income. Those people who moved away from the villages to
the towns would earn increased incomes:
1) Higher incomes generate more savings.
2) Increased savings mean more funds available for investment.
3) Increased investment means more capital and increased productivity in the industrial
sector, higher wages, and more incentive to move from low-productivity agriculture to high-
productivity industry, and so the cycle continues.
Rostow's Model - the 5 Stages of Economic Development:
In 1960, the American economic historian W. W. Rostow suggested that countries pass
through five stages of economic development. According to Rostow, development requires
substantial investment in capital. For the economies of LDCs to grow, the right conditions for
such investment have to be created. If aid is given or foreign direct investment occurs at
stage 3, the economy needs to have reached stage 2; once stage 2 has been reached,
injections of investment may lead to rapid growth.

