“We have it in our power to begin the world over again. A situation, similar to
the present, hath not happened since the days of Noah until now. The birthday
of a new world is at hand.”
1. Multi-Disciplinary Perspective
This treatise centers on introducing the notion of basis instruments contracts (BICs) as the building blocks for pricing or hedging any derivative security.
We present a very simple and practical method for redefining the most general derivatives contract as a function of the realized values of an observable over a specified period. In addition, rather than defining the notional of a derivatives contract as a real number, we define it as a function of the values of the observable up to the contract's premium payment time. This alone is a central insight of our analysis.
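As a toy illustration of this definition, the following sketch (the names and structure are ours, purely illustrative, and not the book's notation) represents a contract whose payout and whose notional are both functions of the realized path of an observable, each reading only the history available at its own payment date:

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class DerivativeContract:
    premium_time: int          # index of the premium payment date
    payout_time: int           # index of the payout payment date
    notional: Callable[[Sequence[float]], float]   # reads path up to premium_time
    payout: Callable[[Sequence[float]], float]     # reads path up to payout_time

    def payout_amount(self, path: Sequence[float]) -> float:
        # The notional scales the payout; each reads only its allowed history.
        n = self.notional(path[: self.premium_time + 1])
        return n * self.payout(path[: self.payout_time + 1])

# Example: a contract paying (S_T - 100)^+ whose notional doubles if the
# observable ever traded above 110 before the premium payment date.
contract = DerivativeContract(
    premium_time=2,
    payout_time=4,
    notional=lambda h: 2.0 if max(h) > 110 else 1.0,
    payout=lambda h: max(h[-1] - 100.0, 0.0),
)
path = [100.0, 112.0, 108.0, 115.0, 120.0]
print(contract.payout_amount(path))  # 2 * max(120 - 100, 0) = 40.0
```

The point of the sketch is only that a path-dependent notional fits naturally once both notional and payout are treated as functions of observables rather than constants.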
Each set of BICs is naturally associated with a BIC-basis that describes premium payment amounts for any specified payout payment amounts format.
This enables us to describe the decomposition formula that shows precisely how any derivative security decomposes into these BICs in the chosen payout payment amounts format. The importance of the definition of these BICs cannot be overemphasized.

4 THE CASE FOR BICs

A top-level understanding of their relevance may start by asking
the following two questions, accessible to the generally educated and with no apparent bearing on the issue at hand: Why was the mapping of the human genome in 2000 hailed by the medical research community as one of the major scientific achievements of the century in the field? Why was Mendeleyev's (1869) description of the periodic table of elements such a foundational achievement in chemistry and physics?
In answer to the first question, it has been explained to us that identifying the role of each gene in our DNA will help us understand the causes of most genetic diseases, facilitating tailor-made cures. Although many of these benefits are still in their early stages, governments and the private sector have invested, and continue to invest, massively in this endeavor. In the second case, we now know for sure that Mendeleyev's table helped scientists understand that all physical matter is just a combination of his basic elements. As a result, it became clear to chemists that any desired material could be built through just such combinations. The issue then shifted to economics: how could one make these combinations most cost-efficiently? This led to the development of chemical industries that took advantage of the opportunity and helped build the wealth of nations by satisfying our needs for various types of materials. Indeed, research continues to expand in the search for more cost-effective processes to manufacture needed products.
There are many more examples where individuals or groups have attempted to lay down structural foundations. Very often, such attempts fail miserably, usually when uninformed optimism results in a substantial underestimation of the scope of the endeavor, or when an early focus on esthetics subdues the critical mind. Nevertheless, when successful, they can usher in new eras of unparalleled development.
In human civilization, the appearance of writing and of the various alphabets is referred to as the single event marking the line of demarcation with prehistoric times. Languages that are more efficient, whether spoken or computational, have increasingly become vectors of greater development.
In financial derivatives risk management, our identification of BICs and of the characteristics of each of them is akin to the identification of the gene as the unit element in the expression of each living creature's descriptive features, together with the inventory of all possible genes. In chemistry or physics, the analogy is the atom and the inventory of all atoms given by Mendeleyev's table. Our decomposition formula(s) is analogous to describing the genetic composition of each possible living being once it is identified; in physics or chemistry, our decomposition formula would be the equivalent of providing the atomic composition of each described material, whether solid, liquid or gas.
These analogies may seem somewhat pretentious, or, in the words of Karl Popper, naïve "holism." It is therefore important, first, to acknowledge that evolutionary efforts to build such a structure have a long history in financial derivatives analysis, and, for maximum efficiency, to present our argument in an apparent framework of theoretical inclusiveness rather than rejection. This will provide the appropriate calibrating referentials for understanding the material provided, and help the reader penetrate its genuine contributive substance with minimal "intellectual transaction costs."
The thrust of our argument in this book is to reformulate the earlier paradigms in a more general framework that unearths previously hidden relationships and, surprisingly, lends itself to a scale of practical implementation never before contemplated.
One such practical implementation is the possibility of trading arcane and complex derivatives contracts merely through BIC trading and linear recombination.
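The linear-recombination idea can be sketched in a finite state space. The following is a toy of ours, not the book's actual BIC construction: any payout vector over a finite set of terminal states is a linear combination of a spanning set of basis payouts, and the premium is then the same combination of the basis premiums.

```python
import numpy as np

# Five terminal states for the underlying (an illustrative grid).
states = np.array([80.0, 90.0, 100.0, 110.0, 120.0])

# A hypothetical spanning basis: bond, underlying, and calls at three strikes.
basis = np.column_stack([
    np.ones_like(states),               # unit bond
    states,                             # the underlying itself
    np.maximum(states - 90.0, 0.0),     # call struck at 90
    np.maximum(states - 100.0, 0.0),    # call struck at 100
    np.maximum(states - 110.0, 0.0),    # call struck at 110
])

# Target payout: a 95/105 call spread, defined state by state.
target = np.minimum(np.maximum(states - 95.0, 0.0), 10.0)

# Decomposition weights (least squares; exact when the basis spans the space).
weights, *_ = np.linalg.lstsq(basis, target, rcond=None)
replicated = basis @ weights
print(np.allclose(replicated, target))
```

Given market premiums for the basis instruments, the premium of the target contract is simply the dot product of those premiums with `weights`, which is the sense in which complex contracts trade through linear recombination of basis instruments.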
The recent empirical evidence from behavioral finance seems to indicate that such a view is at least partly misguided. Works by Kahneman and Tversky [130], Thaler [178], and Barberis [15], among others, have documented, within the prospect theory argument and outside it, numerous persistent biases in human behavior and general patterns that are not far from amounting to laws of human nature.
3. Overview of Contributions
Our analytic framework also provides a clean method for extending and generalizing transaction cost economics results in derivatives pricing and hedging.
Through the power of efficient mathematical formulations, our work will have the effect of clarifying and simplifying current pricing issues. In philosophical agreement with Wittgenstein's principles, the benefit is at the very least a gain of "absence of confusion," without necessarily increasing computational complexity.
While easily generalized to many other fields of engineering, our results will be
very explicitly exemplified in the resolution of a number of derivatives pricing,
hedging, risk management and trading issues.
And it should not be surprising that some of the toughest problems in financial economics will in fact appear here not to be problems of importance, if problems at all. An illustration of this will appear in the formulation and establishment of the fundamental theorems of asset pricing in the most general practical frameworks.
One leading practitioner of derivatives theory once stated “pricing is a science and
hedging is an art.” This book will make it clear that if pricing is a science, then
hedging too is a science, because a hedging issue can always be reformulated—in
a complete or incomplete market framework—as a pricing problem.
One will then be able to see how a number of risk management issues, ranging from coherent Value at Risk (VaR) measurement to derivatives portfolio allocation to hedging effectiveness, can be reformulated as pricing issues, with clarifying implications for incomplete markets analysis and for derivatives accounting.
prices on call/put prices for all strikes and maturities under various model assumptions. In particular, it becomes clear that for most properly formulated calibration issues, the difficulty lies more in narrowing the possible choices to a unique solution than in the absence of a solution, as many least-squares approximate calibration methods suggest.
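A toy construction of ours (not the book's setup) makes the uniqueness-versus-existence point concrete: calibrating four model parameters to three observed prices in a linearized model leaves a whole family of exact calibrations, and the real difficulty is choosing among them.

```python
import numpy as np

# Hypothetical linearized calibration: 3 observed prices, 4 model parameters.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 4))          # sensitivities of prices to parameters
observed = rng.normal(size=3)        # observed market prices

# Minimum-norm exact calibration via least squares.
params, *_ = np.linalg.lstsq(A, observed, rcond=None)
assert np.allclose(A @ params, observed)   # exact fit, not an approximation

# Any null-space direction of A yields another, equally exact, calibration.
_, _, vt = np.linalg.svd(A)
null_dir = vt[-1]                    # A @ null_dir is (numerically) zero
other = params + 5.0 * null_dir
print(np.allclose(A @ other, observed))    # True: a second exact solution
```

Both `params` and `other` reprice the observed data exactly; a least-squares routine silently returns one of them, which is precisely the selection problem the text describes.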
Faced with the lack of satisfactory tools in the mathematical literature, our analysis leads us to develop an independent and general theory of interpolation that is then used to support the resolution of the calibration problems considered. One particularly interesting insight of our analysis there is to deconstruct the relationship between so-called "parametric" and "non-parametric" estimation methods and to show how one can coherently transform a complex non-parametric estimation problem into a simpler parametric one.
Our analysis also provides a clear framework for elucidating and classifying computational complexity issues prior to providing and justifying approximation methods to address them.
On the other hand, we elevate and synthesize deeper philosophical principles and mathematical properties that provide significant computational benefits in effectively addressing the issues we face.
Our independent argument parallels an issue that has been at the heart of many debates in physics for millennia. According to the quantum physicist David Deutsch, Zeno's (c. 490-425 BC) paradox is the earliest known critique of the common-sense idea that we live in a "continuum," an infinitely divisible, smoothly structured space. By the nineteenth century the continuum analysis seemed to have won, with the triumph of the wave theory of light. In 1900, when Max Planck solved the black body problem by postulating that atoms could absorb or emit energy only in discrete amounts, the quantum age began.
OPERATIONAL FRAMEWORKS REFORMULATION 9
However, we could view that development as only the latest installment of a strange loop in the intellectual history of scientific investigation replaying itself once again, a repeating strange loop that Eric Temple Bell essayed upon in 1934 in The Search for Truth [19]: the periodic emergence and re-emergence of the discrete from the continuous, and of the continuous from the discrete, in Western thought.
Indeed, led by quantitative analysts trained in applied mathematics and physics, continuous time/continuous space assumptions in the modeling of the various underlyings under consideration took over, and have pervaded financial derivatives analysis since the nineteen sixties, leveraging Bachelier's seminal but long-discarded work [13] from the turn of the 20th century.
One of the most appealing supports of this development was that it leveraged well-established theoretical tools, such as diffusion and Lévy processes, well-studied partial differential equations, and martingale theory, to obtain easy-to-compute pricing and hedging formulas in the early days of the industry's development.
Our analysis also sheds a new investigative light on Stephen Wolfram's insight that, although continuous systems are often assumed to be computationally more sophisticated than discrete ones, it has in practice proved surprisingly difficult to make continuous systems emulate discrete ones [186].
In mathematical finance, Ayache [12] recently and very effectively questioned the validity of continuous time/space solutions as the absolute expression of perfection in quantitative analysis. His central argument is that discrete numerical solutions to PDEs are no worse than their continuous time counterparts, as continuous time limits are themselves often proxies for otherwise irreducibly discrete problems.
Indeed, there have been numerous earlier papers that provide discrete time pricing of derivatives. However, in these papers the discrete time analysis is presented as merely a special case of the more general continuous time analysis. This view is evident in papers dealing with so-called "discretely monitored derivatives" such as barrier options.
Our finding is that continuous time/space and infinite time/space are very often a hindrance to understanding and solving relevant issues in derivatives pricing and hedging. In many other cases they force us to use a sledgehammer to swat what would otherwise be a gnat. Our conclusion is that continuous time/space or infinite time/space should only be used, where possible, as mere approximation tools to reduce computational costs in tasks such as integration or optimization.
Our original criticism of these continuous time/space assumptions, and of the unnecessary burdens they place on effective theory development, traces its origins to the investigation of methods for inferring the so-called Arrow-Debreu densities from observed option prices.
This supposes a priori that the price of the option is at least twice differentiable with respect to the strike. In practice, however, one seldom finds a continuous stream of option prices available in the market, and up to now it has been difficult to argue that one interpolation method is better than another. Given the instability of approximation by differentiation, this approach is very questionable.
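The instability referred to here is easy to exhibit numerically. The following sketch (our worked example, under illustrative parameters, with zero rates so no discounting term appears) differentiates Black-Scholes call prices twice in strike, which is the standard Breeden-Litzenberger route to the risk-neutral density, and then repeats the computation after perturbing the prices by an amount smaller than a typical bid/ask spread:

```python
import math, random

def bs_call(f, k, vol, t):
    # Black-Scholes call on the forward, zero rates (standard formula).
    d1 = (math.log(f / k) + 0.5 * vol * vol * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return f * N(d1) - k * N(d2)

f, vol, t, h = 100.0, 0.2, 1.0, 0.5           # h: strike grid spacing
strikes = [80.0 + i * h for i in range(81)]
clean = [bs_call(f, k, vol, t) for k in strikes]

random.seed(1)
noisy = [c + random.uniform(-0.005, 0.005) for c in clean]  # half-cent noise

def second_diff(prices, i):
    # Central second difference: approximates the density at strikes[i].
    return (prices[i - 1] - 2.0 * prices[i] + prices[i + 1]) / (h * h)

i = 40                                         # at-the-money strike K = 100
print(second_diff(clean, i))                   # close to the true density
print(second_diff(noisy, i))                   # can be badly off, even negative
```

With noise of order epsilon, the second-difference error is of order epsilon divided by the grid spacing squared, which here swamps the true density level; this is the sense in which differentiating sparse, noisy option prices is very questionable.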
The importance of this problem was reinforced when it also appeared that, in order to obtain the price of a derivatives contract that is a function of the underlying at a future time as a linear combination of vanilla options, forwards, and bonds, one needed, as shown for instance by Carr and Madan [150] among others, to resort to Schwartz distributions. Otherwise,1 the function of the underlying needed to be twice differentiable with respect to that underlying. In addition, there were convergence requirements on the integral of the second derivative of the function multiplied by the price of far out-of-the-money options.
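The replication identity at issue can be stated explicitly. For a payoff $f$ that is twice differentiable, expanding around the forward $F$ (a standard form of the result, in our notation):

$$f(S_T) = f(F) + f'(F)\,(S_T - F) + \int_0^{F} f''(K)\,(K - S_T)^{+}\,dK + \int_F^{\infty} f''(K)\,(S_T - K)^{+}\,dK,$$

so the payoff is replicated by $f(F)$ in bonds, $f'(F)$ in forwards, and puts struck below $F$ and calls struck above $F$ in amounts $f''(K)\,dK$. This is exactly where both the twice-differentiability requirement and the convergence condition on $f''(K)$ against far out-of-the-money option prices enter.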
Intuitively, however, it seemed very much that the C2 requirement did not underscore a relevant structural problem; what really seemed to matter was the availability of enough option prices. The Exhibit below shows our early approach to circumventing the problem, using a recently rediscovered lemma2 formulated by Emil Post in the 1930s. The approximate limiting formula proposed, by relying on integrals rather than derivatives of option prices, appeared likely to smooth away the problems associated with the lack of a continuous stream of option prices. Indeed, one can guess that this integral approach is somewhat related to the non-parametric approaches for the estimation of Arrow-Debreu densities studied by Ait-Sahalia and Lo [142]3 among others.
Our original criticism has been further reinforced by several other instances of misuse of the continuous time framework. One can cite the case of average-rate options and volatility swaps, where pricing approaches are based on continuous sampling assumptions to compute averages or realized volatility. Such assumptions are false: actually traded contracts always specify rules for discrete sampling. What is sometimes troubling here is that there is little rigorous estimation of the extent to which such approximations reflect reality. In fact, in the prevailing literature, these continuously sampled estimates are treated as if they were the most exact description of actually traded contracts.
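The discrepancy is easy to illustrate. In the sketch below (our toy simulation, not a traded contract specification), realized volatility is computed the way a typical variance/volatility swap confirmation defines it, as a sum over discrete daily sampling dates; even when the data-generating volatility is exactly 20%, the discretely sampled figure differs from it by sampling noise:

```python
import math, random

random.seed(7)
vol, dt, n = 0.2, 1.0 / 252.0, 252            # daily sampling over one year

# Simulate a lognormal path with zero drift apart from the Ito correction.
prices = [100.0]
for _ in range(n):
    z = random.gauss(0.0, 1.0)
    prices.append(prices[-1] * math.exp(-0.5 * vol * vol * dt
                                        + vol * math.sqrt(dt) * z))

# Annualized realized volatility from discrete log-returns, per a typical
# contract rule: sqrt( (1 / (n * dt)) * sum of squared log-returns ).
rv = math.sqrt(sum(math.log(prices[i + 1] / prices[i]) ** 2
                   for i in range(n)) / n / dt)
print(rv)   # fluctuates around 0.2; it is not the "continuous" 0.2 exactly
```

The gap between `rv` and the continuous-sampling idealization is precisely the quantity for which, as the text notes, little rigorous estimation is usually offered.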
The primary benefit of the discrete time/space approach is that we are able to physically isolate what we call BICs and construct what we term the BIC-basis. These enable us to clarify the notion of completeness in a more general sense and to develop quantitative tools to measure it.
We also expose how marginal the need is for the more sophisticated theoretical tools used in the treatment of PDEs/PIDEs, without coming short of resolving the issues for which they were designed. In order to deal with regularity issues in continuous time/space, one must in general use increasingly sophisticated tools to obtain meaningful results. For instance, one must use distributions to tackle functions that may not be sufficiently differentiable, or perform operations such as convolutions to regularize functions. Convolutions lead to approximations.
non-trivial nature of these sets has led to extensive studies of topological vector
spaces beyond the familiar categories of Hilbert and Banach spaces, studies that,
in turn, have provided useful new insights in some areas of analysis, such as partial
differential equations or functions of several complex variables” [180].
One of our central decomposition formulas in BICs will use, in its proof, distribution-theory ideas applied to a trivial space of test functions: functions defined on a discrete finite space, with discrete notions of derivatives and integrals. However, this is done merely for purposes of clarity, and the abstract notion of the dual of a functional space is avoided in the actual applications.
Indeed, when we use PDEs or PIDEs as the primary tool for the study of the dynamical systems explored in derivatives theory, the solution, when regularity and convergence issues are required but uncertain, is often to take functions in the Sobolev-space subset of distributions. This shows that the study in continuous time/space of generalized pricing and hedging issues would have to be done in increasingly abstract and complex spaces, and various abstract notions of solutions would then be considered to deal with the problems at hand; that is how notions of viscosity solutions for PDEs were created. All those concepts were developed without derivatives pricing and hedging in mind, and they contribute to killing financial intuition. Further, in the context of derivatives pricing and hedging, the requirement to master all these tools puts substantial additional costs on the development and dissemination of effective risk management practices.
The case for truly discrete time trading could be further substantiated with empirically and psychologically based arguments. Continuous time trading can act as a hindrance against rationally taking relevant information into account in trading decisions. Further, what is left unaccounted for in the continuous time trading analysis is the substantial psychological cost/pain that increases with trading frequency. The case for this argument is brilliantly made, among others, by Taleb [177].
The assumption of space/time infiniteness is likewise unrealistic. In the real world, we are in a discrete framework: underlying values change by minimal pre-specified increments, and all underlyings within short time frames predictably trade in a bounded range, and can thus be assumed to take values in a finite set. Hence, the problem of convergence of spatial integrals becomes a non-issue. With finite spaces, limits always exist and all sums or integrals always converge. Similarly, derivatives contracts have finite maturities.
This may thus be an instance where we would concur with Mr. Keynes that in the
long run, we will all be dead.
Further, in the more general BICs framework developed in this book, we lift the traditional limitation of describing underlyings through stochastic differential equations (SDEs) and Markovian stochastic processes, and present the possibility of successfully handling the increased computational cost. For example, we can show how an underlying can be described via heteroskedastic Black-Scholes implied volatility functions determined by parameters lagging several (one or two) time steps behind, in order, for instance, to incorporate the empirical evidence brought by recent econometric research such as the family of GARCH models.
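The lagged, heteroskedastic description alluded to can be made concrete with a minimal GARCH(1,1) recursion. This is a generic textbook sketch with illustrative parameter values, not the book's own specification: today's conditional variance depends on the squared return and the variance one step behind.

```python
import math, random

omega, alpha, beta = 0.00001, 0.08, 0.90      # stationary: alpha + beta < 1

random.seed(3)
var = omega / (1.0 - alpha - beta)            # start at the long-run variance
variances = []
ret = 0.0
for _ in range(1000):
    var = omega + alpha * ret * ret + beta * var   # lag-1 variance recursion
    ret = math.sqrt(var) * random.gauss(0.0, 1.0)  # heteroskedastic return
    variances.append(var)

long_run = omega / (1.0 - alpha - beta)
avg = sum(variances) / len(variances)
print(long_run, avg)   # compare the sample average to the long-run level
```

The recursion uses only quantities observable one step earlier, which is exactly the kind of lagged parameterization that a discrete-time framework accommodates directly.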
We will anchor our most central pricing and hedging argument for exact solutions around the idea of "backward analysis," which will shed new light on, or provide a new interpretation of, Kierkegaard's quote: "Life can only be understood backwards; but it must be lived forwards."
In particular, this backward analysis should be contrasted with the Monte Carlo forward summing approach. While Monte Carlo analysis helps one see the bad events that can happen, it does not do as well at helping one plan ahead of time what to do in the event of potential adverse outcomes. This is specifically why the Monte Carlo forward summing approach performs so poorly in pricing American options. It may also in part explain why pure Monte Carlo forward summing approaches are not so good at providing the Greek sensitivities for hedging purposes, Malliavin calculus based methods and the usefulness of the Greeks notwithstanding.
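Backward analysis in its simplest concrete form is the standard binomial-tree valuation of an American option, stepping backwards from maturity and comparing continuation against exercise at each node, which is exactly what a forward Monte Carlo summation cannot do directly. The sketch below is a textbook Cox-Ross-Rubinstein tree with illustrative parameters, not the book's own method:

```python
import math

def american_put(s0, k, r, vol, t, steps):
    dt = t / steps
    u = math.exp(vol * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    disc = math.exp(-r * dt)

    # Option payoffs at the steps + 1 terminal nodes.
    values = [max(k - s0 * u**j * d**(steps - j), 0.0) for j in range(steps + 1)]

    # Backward induction: at each node, value = max(exercise, continuation).
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1.0 - p) * values[j])
            exercise = max(k - s0 * u**j * d**(i - j), 0.0)
            values[j] = max(cont, exercise)
    return values[0]

print(american_put(100.0, 100.0, 0.05, 0.2, 1.0, 200))
```

The early-exercise decision is resolved node by node on the way backwards; a forward simulation sees each path only once and has no direct access to the continuation value needed for that comparison.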
Despite our criticism of the continuous time/space framework, our claim is not that it is not useful. Rather, our contention is that it has been misconstrued in the prevailing literature as the most accurate representation of reality, sometimes creating unnecessary problems. Its use should be confined to that of an efficient approximation tool when a discrete representation is computationally impractical. This, in our view, is the right balance in the use of the discrete and the continuous.
However, in our functional variables framework, the Karush-Kuhn-Tucker (KKT) theorem is often used in an iterative backward fashion, of the dynamic programming kind, as a more general alternative to Hamilton-Jacobi-Bellman (HJB) theorems in the study of multi-period optimization problems.
In the next chapter, we start by redefining a derivatives contract, laying down the foundation for subsequently exposing our concepts and results. Although the notion is intuitively simple to recognize, judging from previous attempts at formalism, ascribing a definition to it appears to be not so simple a proposition.
This section illustrates, through the case of the derivation of the market-implied risk-neutral density from option prices, anecdotal evidence of the irrelevance of regularity requirements in derivatives analysis. While the mathematical finesse displayed may seduce the trained mathematician, the case must be made that the difficulties were created in the first place by a poor choice of operational framework.
Given a function $f(t)$ defined for $t > 0$, the Laplace transform $F(s)$ is defined as:
$$F(s) = \int_0^{\infty} e^{-st} f(t)\,dt.$$
If
$$\sup_{t>0}\, e^{-bt} f(t) < \infty \text{ for some } b, \text{ then } f(x) = \lim_{n\to+\infty} \frac{(-1)^n}{n!} \left(\frac{n}{x}\right)^{n+1} F^{(n)}\!\left(\frac{n}{x}\right).$$
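As a sanity check on this inversion formula, the pair $f(t) = e^{-t}$, $F(s) = 1/(1+s)$ admits closed-form derivatives, $F^{(n)}(s) = (-1)^n\, n!/(1+s)^{n+1}$, so the limit can be evaluated directly. This is a worked example of ours, not from the text; for this pair the formula collapses to $(n/(n+x))^{n+1}$, which tends to $e^{-x}$.

```python
import math

def post_approx(x, n):
    # (-1)^n / n! * (n/x)^(n+1) * F^(n)(n/x), with the closed-form derivative
    # of F(s) = 1/(1+s) plugged in.
    s = n / x
    deriv = (-1.0) ** n * math.factorial(n) / (1.0 + s) ** (n + 1)
    return (-1.0) ** n / math.factorial(n) * s ** (n + 1) * deriv

x = 1.0
for n in (5, 10, 30):
    print(n, post_approx(x, n), math.exp(-x))
```

The convergence is slow, of order $1/n$, which is consistent with the text's point that the value of the formula lies in trading derivatives of option prices for integrals, not in fast numerics.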
Proof. See note 2 below.
$$\Gamma_n^{+}\!\left(\frac{F}{K}, x\right) = \left(\frac{F}{K}\right)^{2} n(n-1)\,x^{-n+2} - 2n^{2}\,\frac{F}{K}\,x^{-n+1} + \frac{nF}{K}\,x^{-n},$$
$$\Gamma_n^{-}\!\left(\frac{F}{K}, x\right) = \left(\frac{F}{K}\right)^{2} n(n-1)\,x^{n-2} - 2n^{2}\,\frac{F}{K}\,x^{n-1} + \frac{nF}{K}\,x^{n}.$$
EXHIBIT: REGULARITY REQUIREMENTS IN QUESTION? 17
Then,
$$\mathrm{Density}(K, T) = \frac{1}{B(0,T)} \lim_{n\to+\infty} \frac{F^{\,n-1}}{n!} \left(\frac{n}{K}\right)^{n+1} \times \left[ F e^{-\frac{nF}{K}} + \int_0^1 \Gamma_n^{+}\!\left(\frac{F}{K}, x\right) e^{-\frac{nF}{xK}}\, \mathrm{Call}\!\left(\frac{F}{x}, T\right) \frac{dx}{x^2} + \int_0^1 \Gamma_n^{-}\!\left(\frac{F}{K}, x\right) e^{-\frac{nFx}{K}}\, \mathrm{Put}(xF, T)\, dx \right],$$
where B(0, T) is the price of the unit zero coupon bond between time zero and T, and F is the forward price at time zero for maturity T. Indeed, Density, Call, and Put are all functions of the strike (K) and the maturity (T).
Proof. The proof of this result starts by obtaining the Laplace transform as a function of calls and puts, as provided in this volume for any function of underlyings. Then, using Post's lemma above, one can deduce the density as a limit involving the derivatives of the Laplace transform.
1 In an earlier print, "Otherwise" appeared as "Further." This was, as fairly pointed out to the author by P. Carr, indeed an unfortunate typo. However, it may have hidden a "Freudian" instance of justified unease: using the function satisfactorily as a distribution when it is not twice differentiable requires test functions that are infinitely smooth with compact support. Thus, the use of regular distributions is not sufficient in the referred paper to get around the regularity issue in the continuous space framework if one uses densities (the applicable test functions here) that are not infinitely smooth with compact support, assuming a constant discount factor in the usual frictionless case.
2 See:
or
∼
3 See:
∼