
Tutorial: Risk Analysis and Monte Carlo Simulation

Risk analysis is the systematic study of uncertainties and risks while Monte
Carlo simulation is a powerful quantitative tool often used in risk analysis.
By Dan Fylstra
Uncertainty and risk are issues that virtually every business analyst must deal with, sooner
or later. The consequences of not properly estimating and dealing with risk can be
devastating. The 2008-2009 financial meltdown – with its many bankruptcies, homes lost to
foreclosure, and stock market losses – began with inadequate estimation of risk in bonds
that were backed by subprime mortgages. But in every year, there are many less-publicized
instances where unexpected (or unplanned for) risks bring an end to business ventures and
individual careers.
There’s a positive side to uncertainty and risk, as well: Almost every business venture
involves some degree of risk-taking. Properly estimating and planning for the upside is just
as important as doing so for the downside. Risk analysis is the systematic study of
uncertainties and risks we encounter in business, engineering, public policy, and many other
areas. Monte Carlo simulation is a powerful quantitative tool often used in risk analysis.
Are Uncertainty and Risk Different?
Uncertainty is an intrinsic feature of some parts of nature – it is the same for all observers.
But risk is specific to a person or company – it is not the same for all observers. The
possibility of rain tomorrow is uncertain for everyone; but the risk of getting wet is specific to
me if (a) I intend to go outdoors and (b) I view getting wet as undesirable. The possibility
that stock A will decline in price tomorrow is an uncertainty for both you and me; but if you
own the stock long and I do not, it is a risk only for you. If I have sold the stock short, a
decline in price is a desirable outcome for me.
Many, but not all, risks involve choices. By taking some action, we may deliberately expose
ourselves to risk – normally because we expect a gain that more than compensates us for
bearing the risk. If you and I come to a bridge across a canyon that we want to cross, and
we notice signs of weakness in its structure, there is uncertainty about whether the bridge
can hold our weight, independent of our actions. If I choose to walk across the bridge to
reach the other side, and you choose to stay where you are, I will bear the risk that the
bridge will not hold my weight, but you will not. Most business and investment decisions are
choices that involve “taking a calculated risk” – and risk analysis can give us better ways to
make the calculation.
How to Deal with Risk
If the stakes are high enough, we can and should deal with risk explicitly, with the aid of a
quantitative model. As humans, we have heuristics or “rules of thumb” for dealing with risk,
but these don’t serve us very well in many business and public policy situations. In fact,
much research shows that we have cognitive biases, such as over-weighting the most recent
adverse event and projecting current good or bad outcomes too far into the future, that work
against our desire to make the best decisions. Quantitative risk analysis can help us escape
these biases and make better decisions.
It helps to recognize up front that when uncertainty is a large factor, the best decision does
not always lead to the best outcome. The “luck of the draw” may still go against us. Risk
analysis can help us analyze, document, and communicate to senior decision-makers and
stakeholders the extent of uncertainty, the limits of our knowledge, and the reasons for
taking a course of action.
What-If Models
The advent of spreadsheets made it easy for business analysts to “play what-if:” Starting
with a quantitative model of a business situation in Excel or Google Sheets, it’s easy to
change a number in an input cell or parameter, and see the effects ripple through the
calculations of outcomes. If you’re reading this magazine, you’ve almost certainly done
“what-if analysis” to explore various alternatives, perhaps including a “best case,” “worst
case,” and “expected case.”
But trouble arises when the actual outcome is substantially worse than our “worst case”
estimate – and it isn’t so great when the outcome is far better than our “best case” estimate,
either. This often happens when there are many input parameters: Our “what-if analysis”
exercises only a few values for each, and we never manage to exercise all the possible
combinations of values for all the parameters. It doesn’t help that our brains aren’t very good
at estimating statistical quantities, so we tend to rely on shortcuts that can turn out quite
wrong.
Simulation Software: The Next Step
Simulation software, properly used, is a relatively easy way to overcome the drawbacks of
conventional what-if analysis. We use the computer to do two things that we aren’t very good
at doing ourselves:
1. Instead of a few what-if scenarios done by hand, the software runs thousands or tens
of thousands of what-if scenarios, and collects and summarizes the results (using
statistics and charts).
2. Instead of arbitrarily choosing input values by hand, the software makes sure that all
the combinations of input parameters are tested, and values for each parameter cover
the full range.
This sounds simple, but it’s very effective. There’s just one problem: If there are more than
a few input parameters, and the values of those parameters cover a wide range, the number
of what-if scenarios needed to be comprehensive is too great, even for today’s fast
computers. For example, if we have just 10 suppliers, and the quantities of parts they supply
have just 10 different values, there are 10^10 or 10 billion possible scenarios. Even an
automated run of 1,000 or 10,000 scenarios doesn’t come close. What can we do?
The Monte Carlo method was invented in the 1940s by scientists working on the atomic
bomb, who named it for the city in Monaco famed for its casinos and games of chance.
They were trying to model the behavior of a complex process (neutron diffusion). They had
access to one of the earliest computers – MANIAC – but their models involved so many
inputs or “dimensions” that running all the scenarios was prohibitively slow. However, they
realized that if they randomly chose representative values for each of the inputs, ran the
scenario, saved the results, and repeated this process, then statistically summarized all their
results – the statistics from a limited number of runs would quite rapidly “converge” to the
true values they would get by actually running all the possible scenarios. Solving this
problem was a major “win” for the United States, and accelerated the end of World War II.
Since that time, Monte Carlo methods have been applied to an incredibly diverse range of
problems in science, engineering, and finance — and business applications in virtually every
industry. Monte Carlo simulation is a natural match for what-if analysis in a spreadsheet.
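Although this tutorial's examples live in a spreadsheet, the convergence idea the Manhattan Project scientists exploited can be sketched in a few lines of Python (a hypothetical illustration, not part of the original tutorial): estimate the mean of a Uniform(0, 1) input by random sampling, and watch the estimate approach the true value of 0.5 as the number of trials grows.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def estimate_mean(n_trials):
    """Estimate the mean of a Uniform(0, 1) input by random sampling."""
    return sum(random.random() for _ in range(n_trials)) / n_trials

# The true mean is 0.5; the Monte Carlo estimate converges toward it
# as the number of trials grows.
estimates = {n: estimate_mean(n) for n in (100, 10_000, 1_000_000)}
```

Even one million trials is a tiny fraction of "all possible scenarios," yet the statistics are already very close to the true values.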
Randomly Choosing Representative Values
Now for the hard part (not really that hard): How do we randomly choose representative
values, using a computer? This is the heart of the Monte Carlo method, and it’s where we
need some probability and statistics, and knowledge of the business situation or process
that we’re trying to model.
Choosing randomly is the easier part: In the external world, if there were only two possible
values, we might use a coin toss, or if there were many, we might spin a roulette wheel. The
analog in software is a (pseudo) random number generator or RNG – like the RAND()
function in Excel. This is just an algorithm that returns an “unpredictable” value every time it
is called, always falling in a range (between 0 and 1 for RAND()). The values we get from a
(pseudo) random number generator are effectively “random” for our purposes, but they
aren’t truly unpredictable – after all, they are generated by an algorithm. The RNG’s key
property is that, over millions of function calls, the values it returns are “equidistributed” over
the range specified.
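The equidistribution property is easy to check empirically. As a rough sketch (using Python's `random.random()` as a stand-in for Excel's RAND()), draw many values and bucket them into 10 equal bins; each bin should receive roughly 10 percent of the draws.

```python
import random

random.seed(0)

# Draw many pseudo-random values, as Excel's RAND() would, and bucket
# them into 10 equal bins: an equidistributed RNG puts roughly 10% of
# its draws in each bin.
n = 100_000
counts = [0] * 10
for _ in range(n):
    counts[int(random.random() * 10)] += 1

shares = [c / n for c in counts]
```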
To ensure that the values randomly chosen are representative of the actual input parameter,
we need some knowledge of the behavior of the process underlying that parameter. Here
are three histogram charts of the possible values of an input parameter. The first two are
probably familiar.

Three histogram charts of the possible values of an input parameter.
On the top is a Uniform probability distribution, where all the values between 0 and 1 are
equally likely to occur. This is the distribution of values returned by the RAND() function.
In the middle is a Normal probability distribution, the most common distribution found in
nature, business and the economy. Note that, unlike the Uniform distribution, the Normal
distribution is unbounded – there is a small chance of very large or very small/negative
values.
On the bottom is an Exponential probability distribution, which is commonly used to model
failure rates of equipment or components over time. It reflects the fact that most failures
occur early. Note that it has a lower bound of 0, but no strict upper bound.
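The three distributions and their bounds can be sampled directly with Python's standard library (the parameter values below are arbitrary placeholders, chosen only to illustrate the shapes): the Uniform draws stay inside [0, 1), the Normal draws are unbounded, and the Exponential draws are never negative, with most "failures" occurring before the mean time.

```python
import random

random.seed(1)
n = 100_000

uniform_draws = [random.random() for _ in range(n)]             # bounded in [0, 1)
normal_draws = [random.gauss(100, 15) for _ in range(n)]        # unbounded in both directions
failure_times = [random.expovariate(1 / 50) for _ in range(n)]  # bounded below by 0

# The Exponential captures "most failures occur early": a bit more than
# 63% of the draws fall below the mean time of 50.
early_share = sum(1 for t in failure_times if t < 50) / n
```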

Our task as business analysts is to choose a probability distribution that fits the actual
behavior of the process underlying our input parameter. Most distributions have their own
input parameters you can use to closely fit the values in the distribution to the values of the
process. Software such as @RISK from Palisade, ModelRisk from Vose Software,
and Analytic Solver Simulation from Frontline Systems (sponsors of this magazine) offers
you many – 50 or more – options for probability distributions.
What Happens in a Monte Carlo Simulation
Given a random number generator and appropriate probability distributions for the uncertain
input parameters, what happens when you run a Monte Carlo simulation is pretty simple:
Under software control, the computer does 1,000 or 10,000 “what-if” scenario calculations
– one such calculation is called a Monte Carlo “trial.” On each trial, the software uses the
RNG to randomly choose a “sample” value for each input parameter, respecting the relative
frequencies of its probability distribution. For example, for a Normal distribution, values near
the peak of the curve will be sampled more frequently. If you’ve specified correlations, it
modifies these values to respect the correlations. Then the model is calculated, and values
for outputs you’ve specified are saved. It’s as simple as that!
At the end of the simulation run, you have results from 1,000 or 10,000 “what-if” scenarios.
You can step through them one at a time, and inspect the results (on the spreadsheet, if
you’re using one), but it’s generally easier to look at statistics and charts to analyze all the
results at once. For example, here’s a histogram of one calculated result, the Net Present
Value of future cash flows from a project that involves developing and marketing a new
product. The chart shows a wide range of outcomes, with an average (mean) outcome of
$117 million. But the full range of outcomes (in the Statistics on the right) is from negative
$77.5 million to positive $295 million! And from the percentiles, we see a 5 percent chance
of a negative NPV of $7.3 million or more, and a 5 percent chance of a positive NPV of $224
million or more.

Histogram: Net Present Value of future cash flows from a project.


Another view, below, from this same chart shows us how sensitive the outcome is to certain
input parameters, using “moment correlation,” one of three available presentations. $C$31
is customer demand in Year 1, $C$32 is customer demand in Year 2, and $C$35 is
marketing and sales cost in Year 1 (treated as uncertain in this model). This is often called
a Tornado chart, because it ranks the input parameters by their impact on the outcome, and
displays them in ranked order. On the right, we are displaying Percentiles instead of
summary Statistics.

This chart shows how sensitive the outcome is to certain input parameters.
How did we construct this model? We started from a standard “what-if” model, then replaced
the constant values in three input cells with “generator functions” for probability distributions.
We also selected the output cell for Net Present Value, as an outcome we wanted to see
from the simulation. We can pretty much always go from a what-if model to a Monte Carlo
simulation model in a similar way. The chart below shows our chosen distribution – a
“truncated” Normal distribution, which excludes certain extreme values – for customer
demand in Year 2.

A truncated Normal distribution for customer demand in Year 2.
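Commercial packages provide truncated distributions as built-in generator functions, but one simple way to realize the idea is rejection sampling: draw from the full Normal and discard any value outside the truncation bounds. The demand figures below are hypothetical.

```python
import random

random.seed(3)

def truncated_normal(mu, sigma, lower, upper):
    """Sample Normal(mu, sigma), rejecting draws outside [lower, upper].
    Rejection sampling is a simple way to realize a truncated Normal."""
    while True:
        x = random.gauss(mu, sigma)
        if lower <= x <= upper:
            return x

# Hypothetical Year 2 demand: Normal around 12,000 units, truncated so
# no sampled scenario falls below 5,000 or above 20,000 units.
demand_y2 = [truncated_normal(12_000, 3_000, 5_000, 20_000) for _ in range(10_000)]
```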

Steps to Build a Monte Carlo Simulation Model
If you have a good “what-if” model for the business situation, the steps involved in creating
a Monte Carlo simulation model for that situation are straightforward:
 Identify the input parameters that you cannot predict or control. Different
software may call these “inputs,” “forecasts,” or “uncertain variables” (Analytic Solver’s
term). For these parameters (input cells in a spreadsheet), you will replace fixed
numbers with a “generator function” based on a specific probability distribution.
 Choose a probability distribution for each of these input parameters. If you have
historical data for the input parameter, you can use “distribution fitting” software
(included in most products) to quickly see which distributions best fit the data, and
automatically fit the distribution parameters to the data. Software makes it easy to
place the generator function for this distribution into the input parameter cell.
 If appropriate, define correlations between these input parameters. Sometimes,
you know that two or more uncertain input parameters are related to each other, even
though they aren’t predictable. Using tools such as rank-order correlation or copulas,
which modify the behavior of the generator functions in a simulation, you can take this
into account.
Thinking about the first step, if you can predict the value of an input parameter, it’s really a
constant in the model. But if you have a prediction that’s only an estimate in a range—or
within ‘confidence intervals’—then it should be replaced with a generator function. If you
can control the value, this parameter is really a decision variable that can be used later in
simulation-based what-if analysis, or in simulation optimization.
Thinking about the third step: If you know the exact relationship between input parameter A
and input parameter B (say that B = 2*A), you can just define a probability distribution for A,
and use a formula to calculate B. Correlation methods are intended for cases where you
know there is a relationship, but the exact form of that relationship is uncertain. For example,
airline stocks tend to rise when oil stocks fall, because both are influenced by the price of
crude oil and jet fuel – but the relationship is far from exact.
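Simulation products implement correlation through rank-order methods or copulas, but the underlying idea can be sketched with the simpler Gaussian approach: combine independent Normal draws through a hand-written 2x2 Cholesky factor so the pair comes out with a target correlation. The airline/oil correlation of -0.6 below is an assumed value for illustration only.

```python
import math
import random

random.seed(5)

def correlated_pair(rho):
    """Two standard Normal draws with correlation rho
    (the 2x2 Cholesky factor written out by hand)."""
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    return z1, z2

# Hypothetical: airline returns tend to move opposite to oil returns,
# with an assumed correlation of -0.6.
pairs = [correlated_pair(-0.6) for _ in range(50_000)]

# Sample correlation of the generated pairs, to verify the target held.
n = len(pairs)
mean_x = sum(x for x, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs) / n
std_x = math.sqrt(sum((x - mean_x) ** 2 for x, _ in pairs) / n)
std_y = math.sqrt(sum((y - mean_y) ** 2 for _, y in pairs) / n)
sample_rho = cov / (std_x * std_y)
```

The relationship is deliberately noisy: each individual pair can move in any direction, but across thousands of trials the negative co-movement holds, which is exactly the "far from exact" relationship described above.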
So, Why Do This?
We’ve seen that it really isn’t very difficult to go from a good what-if model to a Monte Carlo
simulation model. The main effort is focusing on the real uncertainties in the business
situation, and how you can accurately model them. The software does all the work to analyze
thousands of what-if scenarios, and it gives you a new visual perspective on all of them at
once.
We also saw at the beginning of this tutorial that uncertainty and risk are present in virtually
every business situation, and indeed most life situations – and the consequence
of not estimating risk properly, and taking steps to mitigate it, can mean an early career end,
or even a business failure. That’s just the negative side – risk analysis can also show you
that there’s more upside than you ever imagined.
With the effort so modest and the payoff so great, the answer should be obvious: Monte
Carlo simulation should be a frequently-used tool in every business analyst’s toolkit.
Dan Fylstra is President of Frontline Systems, Incline Village, Nev.
Filed Under: Features, July 2017

Tutorial: Risk Analysis — A Breath of Better Decisions and Success
“Uncertainty is the only certainty there is, and knowing how to live with
insecurity is the only security.”
– John Allen Paulos
By Tianyang Wang, Francisco Zagmutt, and Huybert Groenendaal

Understanding and managing risks are crucial parts of every business. Conceptually,
running a business is like navigating a ship in the ocean: There will always be a variety of
risks driven by wind, wave, tide, and storm – or even by unpredictable icebergs like the one
that sank the “unsinkable” Titanic. As risk is such an integral part of the journey, there is an inevitable
need to analyze, understand, and manage it. Unfortunately, most organizations (and people)
spend a lot more time on planning for what they believe is the “most likely future,” than on
understanding the uncertainties around their decisions, forecasts, and budgets, and on
preparing to confidently navigate or mitigate future risks.
Risk analysis by using Monte Carlo simulation (hereafter also referred to as simulation
modeling) is a very beneficial tool to help organizations better understand future risks and
scenarios, and make informed decisions. Based on the “Monte Carlo tutorial” in the previous
issue,1 this article will continue the discussion on the importance of risk management and
the perils of ignoring it. We will first make the case for how risk analysis and Monte Carlo
simulations can help with making better business (and personal) decisions using a simple
model; we will explain how simulation modeling works with a more complex/complete case-
study; and finally, we will discuss a range of different applications, as well as several factors
that are critical to benefitting from the use of simulation modeling to improve decision
making.
Analyzing Risk: Start the Dialogue with a Simple Model
The basic idea of Monte Carlo simulation is to (quantitatively) play out many what-if
scenarios and to statistically examine how a decision or situation will perform under all
possible future (simulated) scenarios. While it is typically too costly or sometimes even
impossible to perform an experiment in the “real world,” we can easily simulate it in a Monte
Carlo model. Many professional software applications (such as the spreadsheet-based
Analytic Solver Simulation, an Excel add-in) make it easier and more user-friendly to
develop and use a Monte Carlo model.
To demonstrate the mechanics of preparing and using a Monte Carlo simulation model, we
start with a simple example. Suppose a global pharmaceutical company is considering its
5-year planning budget, and its base budget of operating profits (OP) is displayed in Figure
1.

Figure 1. Base Budget of Operating Profits
During its budgeting process, the company carefully sets its ‘base budget’ at a level that
management thought was reasonable and realistic. However, a critical challenge of setting
a base budget is that the company’s budget contains many risks: uncertain commercial
success of products, key drug R&D and approval risks, regulatory and legal uncertainties,
etc. And even though management knows about all of these individual risks, it is very difficult
to determine the aggregate impacts of so many different risks on the budget. Fortunately,
developing a well-thought-out risk-based simulation model can help.
To build the simulation model, the scope of relevant risks must first be understood. While
this step is not covered in this article, its importance should not be underestimated. In fact,
a lot of value can often already be obtained from brainstorming and discussing the potential
risks with diverse stakeholders (e.g., R&D, marketing, legal, regulator, and/or finance
departments). In the second step, a simulation model can be developed to incorporate the
identified risks. While some of the identified inputs may be available from existing data (e.g.,
past sales and costs), others may require estimates of future revenues and risks that have
not yet occurred. If available and relevant, historical data can be used to determine the
appropriate distributions for uncertain variables. When historical data are not available, or
there is a reason to believe that the future behavior of a variable will be significantly different
from the past, then expert opinion from different business units can be used to estimate the
probabilities and potential impacts of future risks. For example, we can identify the
probability distribution of growth in annual sales by fitting historical annual sales-growth
data to a Normal distribution (Figure 2). In addition, Figure 3 shows the potential future peak
sales (in thousands of units) of a new product estimated as a PERT distribution, a
probability distribution that is often used to model expert opinion. The example distribution
shows that the company forecasts sales to be between 10,000 and 25,000 units, with a
most likely estimate of 15,000.

Figure 2. Annual Sales Growth

Figure 3. Future Peak Sales
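A PERT distribution like the one in Figure 3 is a rescaled Beta distribution whose shape parameters come directly from the expert's minimum, most-likely, and maximum estimates. A minimal Python sketch, using the standard PERT parametrization and the article's example numbers:

```python
import random

random.seed(9)

def pert(minimum, mode, maximum):
    """Sample a PERT distribution: a rescaled Beta whose shape parameters
    come from an expert's minimum / most-likely / maximum estimates."""
    span = maximum - minimum
    alpha = 1 + 4 * (mode - minimum) / span
    beta = 1 + 4 * (maximum - mode) / span
    return minimum + span * random.betavariate(alpha, beta)

# The article's expert estimate: peak sales between 10,000 and 25,000
# units, with 15,000 the most likely value.
samples = [pert(10_000, 15_000, 25_000) for _ in range(50_000)]
mean_sales = sum(samples) / len(samples)  # classic PERT mean: (min + 4*mode + max) / 6
```

Unlike the Normal, every sampled value stays inside the expert's stated range, which is one reason PERT is popular for encoding expert opinion.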


It should also be noted that, wherever appropriate, relationships (often measured by linear
or rank order correlations) should be considered among the different variables within a
model. For instance, there typically is a strong relationship between sales and costs. Such
relationships can be either estimated from the historical data, or modeled based on expert
opinions, and then be incorporated in the simulation model. Including relationships within a
simulation model is important, since not considering relationships in a model often results in
a significant mis-estimation of risks.

For instance, stock price movements are often correlated, and the correlations can increase
significantly in high volatility periods, as we have observed during the stock market
correction of 2008. Mistakenly ignoring such correlations will underestimate the risks of
investments in the stock market.
The next step in using a simulation model involves recalculating (also called simulating) the
model thousands of times or more to generate simulated scenarios for model output (e.g.,
next year’s budget, Net Present Value (NPV), total project costs, etc.). All professional Excel
add-ins such as Analytic Solver Simulation provide a user friendly interface for generating
random samples and graphical and statistical summaries of the simulated data, making it
relatively convenient to conduct simulation in spreadsheets.
The thousands of simulation scenarios are one of the main outputs of a simulation model,
and provide us with an estimate of the possible future scenarios given the different risks
included in the analysis. However, instead of showing management the thousands of
possible future scenarios, we can use these simulated scenarios in various cases: to
construct a frequency distribution of the performance measure; to compute risk measures
such as expected value, percentiles and confidence intervals; to estimate the probability of
the performance measure to be greater (or less) than the targeted/expected performance;
and to develop potential business scenarios to evaluate. The risky budget, for example,
allows senior management to gain insight into questions such as: (1) how realistic is the
current budget, (2) how confident can they be of meeting or exceeding the budget, and
(3) at what level should the budget be set, given all the relevant uncertainties?
For the global pharmaceutical company, Figure 4 is a histogram that displays all 10,000
simulations of the 2018 OP considering all the risks and opportunities identified. As the
Figure shows, based on the various risks included in the analysis, the 2018 OP could vary
from less than $10M to more than $17M. In fact, while the expected (i.e., mean) OP is
$13.5M, the company would have 90 percent confidence that the OP would be
between $11M and $16M. Figure 4 also shows that most of the simulation results for the
2018 OP are lower than the “base budget” of $15M for 2018. In fact, Figure 5, the cumulative
distribution of the same 10,000 iterations, clearly shows that the company would have less
than a 20 percent confidence that it would meet or exceed the $15M base budget that it
originally thought was “reasonable.”

Figure 4. Histogram of Simulations

Figure 5. Cumulative Distribution of Simulations
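The percentile and confidence figures read off Figures 4 and 5 can be computed directly from the simulated scenarios. The sketch below uses a Normal sample as a hypothetical stand-in for the article's 10,000 simulated 2018 OP values (the real numbers would come from the risk model, not a single distribution):

```python
import random

random.seed(11)

# Hypothetical stand-in for the article's 10,000 simulated 2018 OP
# values, in $M; a real model would produce these from the risk inputs.
op = sorted(random.gauss(13.5, 1.5) for _ in range(10_000))

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted sample."""
    idx = min(len(sorted_vals) - 1, int(p / 100 * len(sorted_vals)))
    return sorted_vals[idx]

p5, p95 = percentile(op, 5), percentile(op, 95)          # 90% confidence interval
meet_budget = sum(1 for x in op if x >= 15.0) / len(op)  # chance of hitting the $15M base budget
```

The fraction of scenarios at or above the base budget is exactly the "confidence of meeting or exceeding the budget" that the cumulative distribution in Figure 5 displays.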


In this example, the global pharmaceutical company’s focus is a five-year budget, so we
would develop the simulation model to consider risks affecting the OP from 2018 through
2022. Figure 6 summarizes the results for each of the five years. The results from the risk-
adjusted budget could be eye-opening for managers, as the probability of achieving the
company’s base five-year budget is often (much) less than 50 percent, which suggests that
the budget is a stretch goal and may be unrealistic. Based on these results, management
may decide to adjust the budget. In addition, the management team now would have a better
understanding of how much risk there is around each of the years’ OP.

Figure 6. Risk Adjusted Budget vs. Base Budget


Monitoring Risk: Using Sensitivity Analysis
Of course, the next natural question is what uncertainties have the greatest possible impact
on the budget? And what risks, if any, could be mitigated to reduce their negative impacts
on the budget? If we can identify the most influential risk drivers that are driving the overall
risks in, for example, next year’s OP, the company would be able to potentially work on
reducing or mitigating them (e.g., spend more resources on a product to expedite the launch,
or hedge the foreign exchange risk). Sensitivity analysis addresses such questions by
examining how sensitive the output results are to various individual risks. One of the most
frequently used sensitivity analyses in simulation modeling is a Tornado chart that visually
displays the importance of each of the individual risks (e.g., product risks, regulatory risks,
legal risks) to the overall risk in a company’s key metrics, such as its next year’s OP.
For instance, based on the previous example, the Tornado chart 2 in Figure 7 shows two
main results explaining the relative importance of each risk on the 2018 OP of the company
worldwide: (1) which risks are the greatest drivers of the overall 2018 OP and (2) how much
effect each of the risk drivers has on 2018 OP. The Figure clearly shows that the four main
risk drivers are “regulatory risk #1,” “commercial risk #8,” “legal risk #7,” and “R&D risk #6.”
In addition, it shows that “regulatory risk #1” alone could cause the 2018 OP to swing from
$10.64M to $15.25M. Having a better understanding of and potentially mitigating this
regulatory risk could therefore have a considerable effect on the 2018 OP. The results of
the analysis provide management and the board of directors with a better understanding of
what the most important risk drivers are, and can direct management’s attention to the risks
that deserve most of their focus.

Figure 7. Tornado Chart
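The ranking a Tornado chart displays can be sketched by correlating each sampled risk driver with the simulated output and sorting by magnitude. Everything below is hypothetical: three made-up drivers with deliberately different spreads, feeding a toy OP formula.

```python
import math
import random

random.seed(13)

def moment_corr(xs, ys):
    """Pearson ("moment") correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# Hypothetical risk drivers with different impacts on OP (in $M):
n = 10_000
regulatory = [random.gauss(0, 2.0) for _ in range(n)]  # big driver
commercial = [random.gauss(0, 1.0) for _ in range(n)]  # medium driver
legal = [random.gauss(0, 0.3) for _ in range(n)]       # small driver
op = [15.0 - r - c - l for r, c, l in zip(regulatory, commercial, legal)]

# Rank the inputs by the magnitude of their correlation with the output,
# which is the ordering a Tornado chart displays top to bottom.
tornado = sorted(
    [("regulatory", moment_corr(regulatory, op)),
     ("commercial", moment_corr(commercial, op)),
     ("legal", moment_corr(legal, op))],
    key=lambda item: abs(item[1]),
    reverse=True,
)
```

The driver with the widest spread dominates the output, so it lands at the top of the ranking, just as "regulatory risk #1" tops Figure 7.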
In summary, simulation and sensitivity analysis provide a business with valuable insights
into the company’s long-term risks and opportunities, and serve as a powerful tool in
identifying and focusing attention on the risks and opportunities that affect budgets most
significantly.
Managing Risk: Weigh Your Options
After discussing the basics, we now introduce a second example that touches more
advanced topics involving options and decisions. Suppose a multinational company with
specific technologies and expertise (e.g., to sell products B2B that require upfront R&D) has
an ongoing challenge to find the right balance of risk and reward in agreements with
business partners. Agreements can be as simple as a supply agreement, where the
supplier is paid cost plus a markup, or more complex, where a supplier takes on a greater
share of the cost and risk and, in return, shares the profit margin of the final product.
Simulation can be a powerful decision-supporting tool to understand the risk-reward picture
of alternative partnership agreements: product/business development, private
equity/venture capital, joint ventures, etc. Even though the way partnership agreements are
structured can greatly influence a business’ risk/reward profile, often businesses only
quantify the “most likely” rewards (e.g., NPV, IRR, sales), and not the risks.
In comparison, using Monte Carlo simulation, we can develop risk assessment tools and
simulation models to analyze both the risks and potential rewards of pursuing a variety of
different partnership agreements and business models. Such quantitative tools would
transparently and consistently quantify the risk/reward trade-off, and provide clarity and
insight into alternative partnership structures and business models. This will greatly assist
the company in deciding which opportunities to pursue, and what partnership agreements
to negotiate.
As we discussed previously, first we must gain a good understanding of the main risks and
uncertainties that impact the financial risks and returns of the opportunity in question. In this
example, risks can be wide-ranging but often include R&D uncertainties, market size,
competition and market share, legal landscape, costs, and timelines. The next step is to
build a risk-based financial profit and loss (P&L) and a cash flow model that incorporates all
identified risks and uncertainties, with all relationships between the risks, and importantly,
does not over-complicate the models.
Once the financial returns and risks are incorporated in the P&L model, the model then
“overlays” the various relevant partnership agreement structures or business models,
resulting in a full simulation risk analysis of each alternative “business partnership structure.”
A great strength of this approach is that the results include risk and decision insights on the
individual strategic opportunity as well as the various agreement structures.
Figure 8 shows an example of some of the results of the analysis for this company under
three different agreement structures. In Figure 9, the dots represent the expected “risked
NPV” and the error bars show the amount of risk under each structure. The appeal of each
option depends on how risk averse you and your potential partner are, what the individual
risks are and how they are shared with your partner, and how much risk the rest of your
product portfolio has. In other words, to make an informed decision on the strategy of one
product, it is important to understand the risks in the rest of the organization’s portfolio.
Similarly, because this analysis is based on a simulation model, a Tornado chart (not shown
here) can show the main risk drivers and help the company focus on how best to
understand and potentially mitigate risks in the partnership agreement.

Figure 8. Different Partnership Structure

Figure 9. Risked NPV under Each Partnership


Many Applications
The authors have successfully applied risk analysis methods in hundreds of projects in fields
ranging from pharmaceuticals, oil and gas, finance, manufacturing, and mining to food and
beverages, health and food safety. We gained and applied our analytical expertise by helping
private and public institutions worldwide make decisions in the presence of uncertainty.
The following are several additional applications of Monte Carlo simulation to improve
decision making.
A large international firm wishes to evaluate the risks and rewards of a potential large
business-development deal. The complicating factor is that the proposed M&A contract
allows either of the two parties involved to change the business terms, after the deal is
signed (i.e., “flip” the terms). A Monte Carlo simulation model can capture the operational,
commercial, and financial risks of the deal, as well as the “optionality” within the potential
M&A deal terms, and allow the client to adopt a more favorable negotiation strategy that can
result in a successful licensing deal.
A venture capital firm is in the process of structuring a new fund. The firm needs to determine
the optimal distribution structure of the different future cash flows to various stakeholders
(entrepreneurs, investors, and general partners), taking risks into account. A comprehensive
Monte Carlo model that simulates the investment performance and relates the various cash
flow streams will help the managing director better understand the risks to each of the
stakeholders involved under the different fund structures and scenarios, support the
decisions on how to structure the fund, and give investors a quantitative view of risk and
return.
A gas transmission and storage company manages the costs and schedules of several large
projects. The managers wonder whether there is a better way to understand and manage
risks. Monte Carlo simulation would give the company better insight into each project and
its risks.
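For a cost question like this, a Monte Carlo sketch can be as small as the following. The triangular task-cost distributions and their parameters are assumptions for illustration, not the company's data: each uncertain task cost is drawn at random, the draws are summed into a total project cost, and the trials yield a mean and a 90th-percentile (P90) estimate.

```python
# Minimal Monte Carlo sketch of total project cost (hypothetical data).
import random

random.seed(42)
# Each task cost in $M as (low, most likely, high) -- assumed figures.
tasks = [(2.0, 3.0, 5.0), (1.0, 1.5, 4.0), (4.0, 6.0, 9.0)]
trials = sorted(sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
                for _ in range(10_000))
mean = sum(trials) / len(trials)
p90 = trials[int(0.9 * len(trials))]     # 90th-percentile total cost
print(f"mean ${mean:.1f}M, P90 ${p90:.1f}M")
```

The gap between the mean and the P90 is a simple, communicable measure of the cost risk the managers are asking about.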
Using Simulation Successfully
While Monte Carlo simulation can yield great advantages in making better-informed
decisions, many organizations have yet to put it to use. In our
experience with many diverse companies and organizations worldwide, we have found the
following three factors to be critical for the successful and continuous use of simulation
modeling:
 Support from senior management: Senior management’s interest in, and support
for, the use of simulation modeling to improve decision making is critically important.
While there are multiple ways of obtaining and maintaining such support, an effective
approach we’ve frequently employed is to start with one or two important (and visible)
example projects in which the benefits of simulation modeling can be convincingly shown.
 Making simulation modeling an integral part of the decision-making process: To
benefit from the advantages of Monte Carlo simulation, its use should be made part
of the regular decision-making process (as noted above, its use should be expected
by management). Simulation should not, of course, be a substitute for business
professionals learning with and from real business contexts, but rather a way of
enhancing their decision-making. For decisions that are made regularly (e.g.,
investment decisions), templates can often facilitate and speed up the analysis. One
of the companies we’ve collaborated with has made Monte Carlo analysis mandatory
for the review and approval of every capex project over $5M.
 Developing and gaining expertise: To achieve credible simulation results, the model
must closely resemble reality while remaining conceptually tractable. Choosing the
right level of detail, a suitable model structure, and valid inputs requires expert
experience with the implementation of simulation models. This means that obtaining
the right expertise and skills to develop and use simulation models is critically
important.
Organizations that do get the right support, processes, and skills in place, though, can
expect a true competitive advantage from making more informed decisions.
Summary
We’ve discussed the importance of risk analysis and risk management using Monte Carlo
simulation. In business, risk is everywhere. “Man is a deterministic device thrown into a
probabilistic Universe. In this match, surprises are expected.”3 Simulation modeling can help
organizations better navigate business oceans full of unexpected icebergs. In the end,
what separates successful businesses from those that fail is the attention to risk and the
capability to manage it.
Resources
ModelAssist: A free and comprehensive quantitative risk analysis training and reference
software. http://www.epixanalytics.com/ModelAssist.html.

Practical Spreadsheet Risk Modeling for Management, Lehman, D., Groenendaal, H., and
Nolder, G. Chapman and Hall/CRC, September 1, 2011. Hardback textbook, 284 pages.

Footnotes
1 See Solver International’s premier issue, July 2017, which has a great introductory guide
on modeling risks with Monte Carlo simulation.
2 There are various versions of Tornado charts available, but all focus on the question of
what the main risk drivers are for a certain objective, such as a 2018 OP, the total cost of a
project, NPV, etc.
3 Quote by Amos Tversky, in “The Undoing Project” by Michael Lewis

Filed Under: Features, September 2017

Tutorial: Optimization for Better Decisions
Optimization requires that we define, in quantitative terms, a model that specifies all the ways, times or
places our resources may be allocated, and all the significant constraints on resources and uses that must be
met. Here is the way to do that.
Dan Fylstra, CEO Frontline Systems
Every day, in business, government, and even our personal lives, we make decisions about how to best use
the resources – such as time and money – available to us. It is challenging enough for us to decide which
items to buy with our available funds, or which of several priorities we should tackle this morning. For even
medium-size organizations, this challenge is multiplied many times over: How to best schedule every hour
for a staff of 30 people in a call center? How to load packages on a fleet of 100 trucks, and which routes they
should drive to make deliveries in the least time? How to assign crews and aircraft to 1,000 airline flights, as
they move across the country throughout a day?
These decisions – how to allocate (usually limited) resources to different uses, when there are so many
options, with so many interrelationships – are prime candidates for optimization. At leading firms, all the
foregoing decisions are routinely made with the aid of optimization. To use optimization, we need to define,
in quantitative terms, a model that specifies all the ways, times or places our resources may be allocated,
and all the significant constraints on resources and uses that must be met. Then a solver searches for and
finds the best resource allocation decisions.
Decision Variables, Objective and Constraints
Our quantified decision variables are the amount of resources allocated to each individual use – for example,
the number of call center employees working on each shift, or the number of packages of a given size loaded
onto each truck. To determine what “best” means, we must define a quantity called the objective that we
can calculate from the values of decision variables—for example, costs that we’d like to minimize or profits
to be maximized. To complete the model, we must define each constraint, or limit on the ways resources
may be allocated, that reflects the real-world situation. We usually have both simple constraint limits such
as “up to 100 trucks,” and constraints calculated from the decision variables, such as “our beginning
inventory, plus units received minus units shipped, must equal our ending inventory.”
Let’s consider a very simple example of a call center employee scheduling problem, shown in Figure 1 (this
example Excel model is included with Frontline’s Analytic Solver software). Our problem is to schedule
enough employees to work each day of the week (decision variables) to handle our predicted call volume (a
constraint), while minimizing total payroll cost (our objective).

Figure 1: Optimization at work: A Simple Call Center Scheduling Example
Our personnel policy, that employees should work five consecutive days and have two consecutive days off,
determines the possible ways that resources can be used: There are seven possible weekly schedules, each
one starting on a different day of the week. These are labeled A through G in the Excel model. For
example, employees on Schedule A have Sunday and Monday off, then they work Tuesday through Saturday.
Our decision variables – the number of employees working on each schedule – are in cells D15:D21; they are
summed in cell D22.
In this simple model, all employees are paid at the same rate, $40 per day at cell N15. Our objective, payroll
costs to be minimized, is just =D22*N15*5 (5 working days per week).
We must meet a constraint that the number of employees working each day of the week is greater than or
equal to the “Minimum Required Per Day” figures in row 25. We assume here that we actually know these
numbers – in many real-world call centers, the call volumes per day are uncertain, and we have only a range
or probability distribution for each one. In this tutorial, we’re covering only conventional or deterministic
optimization; in a future tutorial, we’ll describe stochastic optimization, where we must allocate resources
under conditions of uncertainty in the objective and/or constraints (something that Frontline’s software can
handle very well).
The 1s and 0s in the middle of the worksheet help us calculate the number of employees we’ll have in the
call center on each day of the week. For example, on Sunday we’ll have the employees on Schedules B
through F, but those on Schedules A and G will have the day off. So the number of employees working Sunday
is just =SUMPRODUCT(D15:D21,F15:F21) – and similarly for the other days of the week.
Our SUMPRODUCT formulas are in row 24, and we want each value to be greater than or equal to the
corresponding “minimum required” number in row 25. We can express this as F24:L24 >= F25:L25, or using
Excel defined names, as “Employees per day >= Required per day.” You can see the Solver model taking shape
in the right-hand Task Pane in Figure 1.
In this model – as in many others – we must be careful to define all the limits on resources, including “non-
negativity.” We cannot have a negative number of employees on any schedule. This may be obvious to us,

but Solver does allow negative values for decision variables unless we say otherwise – so we include a
constraint D15:D21 >= 0 (or with defined names, “Employees per schedule >= 0”).
There’s one more constraint we haven’t yet discussed: Solver allows any whole number or fractional value
for a decision variable, but we can’t actually assign one-half or two-thirds of an employee to a schedule.
Indeed, the optimal solution without this integer constraint assigns fractional values to four weekly
schedules, such as 2.67 employees for Schedule A, and 6.67 employees for Schedule C – minimizing payroll
cost to $4,933.
We complete the Solver model by adding a constraint “Employees per schedule = integer.” (In some
optimization software, this is treated as a property of the decision variables, but since it limits the possible
solutions, Solver treats these integer requirements as constraints.) We can now solve the model by clicking
the Optimize button on the Ribbon, or the green arrow on the Task Pane. The optimal solution ($5,000 payroll
cost) is shown in Figure 2: Note that the integer constraint “cost us” something – indeed, additional
constraints always yield a “same or worse” objective.

Figure 2: Optimal Solution of the Call Center Scheduling Example


Even more important than the objective value are the decisions that will realize this outcome. These are
amounts of resources to be allocated to each use – in this case, the number of employees to be assigned to
each of the seven possible weekly schedules: 2, 5, 7, 4, 6 and 1 for Schedules A, B, C, D, E and F. Because our
call volumes on weekends are so high, we assign no employees to Schedule G.
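As a cross-check outside Excel, the coverage and payroll arithmetic can be reproduced in a few lines. The day-off matrix below is inferred from the article's description (Schedule A has Sunday and Monday off, with each later schedule's days off shifted by one day):

```python
# Reconstructing the call-center schedule arithmetic (day-off pattern inferred).
days = ["Sun", "Mon", "Tue", "Wed", "Thu", "Fri", "Sat"]
# works[i][j] = 1 if employees on schedule i work on day j;
# schedule i has days i and i+1 (mod 7) off, so Schedule A (i=0) is off Sun/Mon.
works = [[0 if j in (i, (i + 1) % 7) else 1 for j in range(7)] for i in range(7)]
staff = [2, 5, 7, 4, 6, 1, 0]            # optimal employees per schedule (Figure 2)
# Same calculation as the worksheet's SUMPRODUCT formulas in row 24.
per_day = [sum(staff[i] * works[i][j] for i in range(7)) for j in range(7)]
payroll = sum(staff) * 40 * 5            # $40/day, 5 working days per week
print(dict(zip(days, per_day)), payroll)
```

Summing staff on Schedules B through F gives the Sunday head count, matching the article's SUMPRODUCT logic, and the payroll works out to the $5,000 optimum.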
Optimization: Exploiting Structure
How did Solver find this solution, and how do we know that it is the best possible solution? You can play
“what-if” with this model, trying different values in cells D15 through D21, searching for a good combination
of values. (You’ll find that there is no better combination of values that satisfies all the constraints.) But to
really answer this question, we must probe more deeply into “how Solver works,” and talk about some key
ideas: optimality conditions, linearity and convexity. Some math and geometry follow(!) – but if you make it
all the way through this tutorial, you’ll be rewarded with a much deeper understanding of Solver and
optimization.
Solver does try different values for the decision variables, searching for the best solution – indeed all
optimization algorithms work this way – but the search is much more sophisticated than randomly chosen
“what-if” trials. By computing partial derivatives and testing for satisfaction of the KKT (Karush-Kuhn-Tucker)

conditions, Solver can determine that it has found the “top of a peak” or “bottom of a valley” – there are no
better solutions “nearby” (a locally optimal solution) – and if the model is convex (discussed later), there are
no better solutions anywhere (a globally optimal solution). This yields the message “Solver found a solution.
All constraints and optimality conditions are satisfied.”
In our simple Call Center example, Solver had to search a seven-dimensional “space” of possible values for
the decision variables (one dimension for each variable), for better objective values, while ensuring that the
“Minimum Required” and the non-negativity and integer constraints were satisfied. In a problem with 200
decision variables, this becomes a 200-dimensional search! And Frontline’s enhanced Solvers are routinely
used to optimize models with thousands to millions of decision variables.
How? Solver can do this by exploiting the (algebraic) structure of the model. In this case, the objective (recall
it is =D22*N15*5, which equals SUM(D15:D21)*40*5) is a linear function of the decision variables. Each of
the seven constraints is of the form =SUMPRODUCT(D15:D21, constants), also a linear function of the
decision variables. Without the integer constraint, this is a linear programming problem, the easiest type of
optimization problem to solve, and one that always yields a globally optimal solution.
When we add the integer constraint, the model becomes a linear mixed-integer programming problem, or
LP/MIP as shown in the “Model Diagnosis” area at the bottom of the Task Pane. These problems are
significantly harder to solve, but there are sophisticated search algorithms available for LP/MIP problems,
and Frontline’s Solvers use them.
Linearity and Convexity: The Keys to Solvability
A linear function, such as SUM or SUMPRODUCT, or any chain of formulas where decision variables are only
multiplied by constants and the result added or subtracted, can be plotted as a straight line. In full Analytic
Solver software, you can create such a plot (“slicing through” N-dimensional space) with two mouse clicks,
Decisions – Plot, as shown for the objective in Figure 3.

Figure 3: Plot of the linear objective function, Total Payroll Cost.

The constraints in our Call Center model also plot as straight lines (in seven-dimensional space). Since they
must all be satisfied at the same time, Solver can limit its search to points (i.e., combinations of values for the
decision variables) in the intersection of these linear constraints – forming the feasible region. In many
dimensions, the intersection of linear constraints is a geometric form called a polytope, but in two
dimensions this would be a polygon, as shown (for a different two-variable maximization problem) in Figure
4. The objective – shown in red – is also a straight line that slides up and to the right as it is maximized; the
optimal solution is at point D. But the key point is that the polygon or N-dimensional polytope – the feasible
region – can be efficiently searched.

Figure 4: Graphical depiction of a two-variable linear programming problem.
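The geometry of Figure 4 can be made concrete with a small, made-up two-variable problem: maximize 3x + 2y subject to a few linear constraints. Because the feasible region is a convex polygon, it suffices to enumerate its vertices (intersections of pairs of constraint boundary lines that satisfy all constraints) and pick the best one. This is an illustration of the idea only, not how Solver's algorithms are actually implemented:

```python
# Tiny two-variable LP in the spirit of Figure 4 (hypothetical numbers):
# maximize 3x + 2y subject to x + y <= 4, x <= 3, y <= 3, x >= 0, y >= 0.
from itertools import combinations

# Each constraint as (a, b, c), meaning a*x + b*y <= c.
cons = [(1, 1, 4), (1, 0, 3), (0, 1, 3), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    # Solve the 2x2 system where both constraint boundary lines are tight.
    (a1, b1, r1), (a2, b2, r2) = c1, c2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None                      # parallel boundary lines
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

feasible = lambda p: all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)
vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best, 3 * best[0] + 2 * best[1])   # vertex with the best objective
```

The optimum lands at a vertex of the polygon, just as the sliding red objective line in Figure 4 suggests.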


From the viewpoint of optimization as a search process, the straight lines in Figure 4 are less important than
the overall shape of the feasible region, which is convex. (An intersection of linear constraints
is always convex.) A disk, defined by a constraint such as x^2 + y^2 <= r^2, is also convex and easy to search;
but a non-convex region becomes exponentially harder to search as its dimensionality increases.
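The chord property can also be checked numerically. The sketch below is a heuristic test over sample points, not a proof: a function is convex if every chord lies on or above the graph, i.e., f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y) for all t in [0, 1].

```python
# Numerical chord check of convexity over a grid of sample points (a sketch).
def looks_convex(f, points, steps=50):
    for x in points:
        for y in points:
            for k in range(steps + 1):
                t = k / steps
                # A point on the chord must not dip below the function value.
                if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-9:
                    return False
    return True

xs = [i / 4 - 3 for i in range(25)]               # sample points in [-3, 3]
print(looks_convex(lambda v: v * v, xs))          # x^2: convex
print(looks_convex(lambda v: abs(v) ** 0.5, xs))  # sqrt(|x|): not convex
```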
Figure 5 shows simple examples of convex and non-convex polygons, in two dimensions. Notice that in the
non-convex region, the straight line (called a chord) goes into and out of the feasible region multiple times.
A non-convex region has “nooks and crannies,” which take more and more time to search as the
dimensionality of the region increases. Imagine, for example, searching a 200-dimensional version of this
figure.

Figure 5: Convex and non-convex regions.
When an optimization problem’s objective and constraints are both convex – as is always true in a linear
programming problem – the problem will have one optimal solution, which is globally optimal. But a non-
convex problem may have many locally optimal solutions.
The Convexity Killers
In Excel models, it is common to use IF functions to make “either-or” choices, and to use CHOOSE or LOOKUP
functions to select among multiple values. These functions are very useful, and it’s fine to use them in Solver
models, as long as their arguments do not depend on the decision variables.
But if a model has IF, CHOOSE, or LOOKUP functions that do depend on the decision variables, this will quickly
make the model both non-convex and non-smooth. Figure 6 shows the type of plot we get for a formula such
as =IF(D15<2,F24,3*F24) in our Call Center model.

Figure 6: Plot of IF Function that Depends on Decision Variables.
Compare the “kinks” in this plot to the non-convex region in Figure 5. To make matters worse, the “kinks”
make the function non-smooth, which means that Solver cannot reliably compute derivatives at those points
– another way of saying that Solver cannot reliably follow the “rate of change” in the function. Again, as the
number of dimensions (decision variables) and the number of constraints increases, the time needed to
search for an optimal solution increases exponentially.
What can you do, if you need to make “either-or” choices, or select among multiple values in your model?
There is a better way: You can express the same conditions using integer variables and linear constraints.
Frontline’s Solver User Guide explains how to do this, and our full Analytic Solver software, as part of its
model diagnosis, can detect IF, CHOOSE, and LOOKUP functions and automatically replace them with
equivalent integer variables and linear constraints, up to a certain level of complexity.
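As a sketch of what such a replacement can look like (our own illustration using a standard "big-M" construction; the User Guide's actual transformation may differ), an either-or IF such as y = IF(x < 2, a, b) can be recast with one binary variable z and linear constraints: x <= 2 - eps + M*(1 - z), x >= 2 - M*z, and y = a*z + b*(1 - z). The small eps leaves a tiny excluded gap just below the threshold, a standard caveat of big-M modeling:

```python
# Big-M linearization of an either-or IF (illustrative construction).
M, eps, a, b = 1000, 1e-6, 5, 15

def feasible_z(x):
    # Return the binary values of z allowed by the big-M constraints at x.
    return [z for z in (0, 1)
            if x <= 2 - eps + M * (1 - z) and x >= 2 - M * z]

for x in (0.0, 1.9, 2.0, 7.5):
    zs = feasible_z(x)
    # The constraints pin z, hence y, to the branch the IF would have taken.
    print(x, zs, [a * z + b * (1 - z) for z in zs])
```

For x below the threshold only z = 1 (hence y = a) is feasible; at or above it only z = 0 (hence y = b) survives, so the linear model reproduces the IF without any non-smooth formula.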
It is worth noting that introducing integer variables into an optimization model actually makes the model
non-convex. In Figure 5, because only integer-valued points (which would appear as dots) are feasible, the
chords will go “in and out of the feasible region” multiple times. But if integer variables are the only source
of non-convexity in the model, then powerful algorithms for handling these integer variables can be applied
to greatly cut down on search time. The Gurobi Solver and XPRESS Solver, available as optional plug-in Solver
Engines for any of Frontline’s Solver products, are highly effective at solving even large linear mixed-integer
and quadratic mixed-integer problems.
Global Optimization
If your model simply cannot be expressed as a linear programming or linear mixed-integer problem, you can
still use optimization. In most cases, this means you’ll have to accept an approximate globally optimal
solution, a locally optimal solution, or (for a non-convex, non-smooth model) just a “good” solution – better
than what you were doing before (this can still yield a great business payoff). Figure 7 is a plot of a smooth
but non-convex objective function. You can see that it has multiple “peaks” and “valleys” – the KKT conditions

would be satisfied at each peak (when maximizing) or valley (when minimizing), but the globally optimal
points are in dark red and dark blue.

Figure 7: A global optimization problem, with just two variables.


Solver in Excel includes basic facilities for global optimization, using either the “Multistart option” for the
GRG Nonlinear Solver, or the Evolutionary Solver – and Frontline’s enhanced Solver products offer more
powerful methods for global optimization, such as the Interval Global Solver, OptQuest Solver, and
Frontline’s hybrid Evolutionary Solver – an engine that combines classical (linear and nonlinear optimization)
methods with genetic algorithms, scatter search, local search, and heuristic methods.
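The multistart idea can be illustrated with a deliberately simple sketch (ours, not Frontline's algorithm): run a crude local descent from many random starting points and keep the best result, so that at least one start is likely to land in the basin of the globally optimal valley.

```python
# A minimal "multistart" sketch on a non-convex function (illustration only).
import random

def f(x):                       # objective with two valleys; the deeper one
    return (x * x - 4) ** 2 + x # sits near x = -2

def local_descent(x, step=0.1, iters=2000):
    # Crude local search: try a step left and right, keep any improvement,
    # and slowly shrink the step to refine the local minimum.
    for _ in range(iters):
        for trial in (x - step, x + step):
            if f(trial) < f(x):
                x = trial
        step *= 0.999
    return x

random.seed(1)
starts = [random.uniform(-4, 4) for _ in range(20)]
best = min((local_descent(s) for s in starts), key=f)
print(round(best, 2), round(f(best), 2))
```

A single descent started near x = +2 would stop in the shallower valley; running many starts and keeping the best is exactly what the Multistart option automates (with far more sophisticated local searches).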
If you’ve followed this tutorial to its conclusion, you now know a lot more than most people about
optimization! While we’ve illustrated these concepts with Excel models and simple plots, the ideas of linearity
and convexity are fundamental, and applicable to any kind of optimization problem, solution algorithm, or
software. Hopefully, you also realize how optimization problems can become very difficult to solve, but how
powerful software is available to help you find good solutions, even for the most challenging problems.

Dan Fylstra is President and CEO of Frontline Systems, Incline Village, Nev.
Filed Under: Features, September 2017

