
CHAPTER I

THE CASE FOR BICs

“We have it in our power to begin the world over again. A situation, similar to
the present, hath not happened since the days of Noah until now. The birthday
of a new world is at hand.”

Thomas Paine, Common Sense (1776)

1. Multi-Disciplinary Perspective

This treatise centers on the introduction of the notion of basis instruments contracts (BICs) as the building blocks for pricing or hedging any derivative security.

We present a very simple and practical method for redefining the most general derivatives contract: as a function of the realized values of an observable over a specified period. In addition, rather than defining the notional of a derivatives contract as a real number, we define it as a function of the values of observables up to the contract's premium payment time. This alone is a central insight of our analysis.
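For concreteness, here is a minimal Python sketch of this redefinition, under assumptions of our own choosing; the contract, names and numbers are hypothetical illustrations, not the book's formalism:

```python
from typing import Sequence

# A path of realized observable values, e.g. daily fixings of an index.
Path = Sequence[float]

# Under the definition above, a derivatives contract's payout is a function
# of the realized values of an observable over a period, and its notional is
# itself a function of the observables realized up to the premium payment
# time, rather than a fixed real number.

def asian_call_payout(path: Path, strike: float) -> float:
    """Payout as a function of the whole realized path: an average-rate call."""
    average = sum(path) / len(path)
    return max(average - strike, 0.0)

def notional(path_to_premium_date: Path) -> float:
    """Notional as a function of observables up to premium payment time:
    here, illustratively, proportional to the last fixing seen so far."""
    return 100.0 * path_to_premium_date[-1]

# Example: the contract's payout given four realized fixings, with the
# notional set from the observables known at the (earlier) premium date.
fixings = [98.0, 101.0, 103.0, 100.0]
print(notional(fixings[:2]) * asian_call_payout(fixings, strike=100.0))
```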

Each set of BICs is naturally associated with a BIC-basis that describes premium payment amounts for any specified payout payment amounts format.

This enables us to describe the decomposition formula that shows precisely how any derivative security decomposes into these BICs in the chosen payout payment amounts formats. The importance of the definition of these BICs cannot be overemphasized. A top-level understanding of their relevance may start by asking the following two questions, accessible to the generally educated and with no apparent bearing on the issue at hand: Why was the mapping of the human genome in 2000 hailed by the medical research community as one of the major scientific achievements of the century in the field? Why was Mendeleyev's (1869) description of the periodic table of elements such a foundational achievement in chemistry and physics?

In answer to the first question, we have been told that identifying the role of each gene in our DNA will help us understand the causes of most genetic diseases, facilitating tailor-made cures. Although many of these benefits are still in their early stages, governments and the private sector have invested, and continue to invest, massively in this endeavor. In the second case, we now know for sure that Mendeleyev's table helped scientists understand that all physical matter is just a combination of his basic elements. As a result, it became clear to chemists that any desired material could be built through just such combinations. The issue shifted to economics: how could one make these combinations most cost-efficiently? This has led to the development of chemical industries that have taken advantage of the opportunity and helped build the wealth of nations by satisfying our needs for various types of materials. Indeed, research continues to expand to find more cost-effective processes to manufacture needed products.

There are many more examples where individuals or groups have attempted to lay down structural foundations. Very often, such attempts fail miserably, usually when uninformed optimism results in a substantial underestimation of the scope of the endeavor, or when an early focus on esthetics subdues the critical mind. Nevertheless, when successful, they can usher in new eras of unparalleled development.

In human civilization, the appearance of writing and of the various alphabets is referred to as the single event providing a line of demarcation with prehistoric times. Languages that are more efficient, whether spoken or computational, have increasingly become vectors of greater development.

To further stress the importance of efficiency contributions, it is noteworthy that the advent of the printing press in the fifteenth century (circa 1450, Johannes Gutenberg) gave birth to unparalleled levels of intellectual and literacy development, leading to accelerated economic development. The printing press unleashed intellectual fires at the end of the Middle Ages, helping usher in a new era of enlightenment. Without the development of the printing press and its subsequent enhancements, the Renaissance and the Reformation in Western Europe may never have happened, and most of the literary and scientific treasures of Western civilization might never have been created; indeed, their widespread diffusion would almost certainly never have happened.

2. Financial Derivatives Perspective

In financial derivatives risk management, our identification of BICs and of the characteristics of each one of them is akin to the identification of the gene as the unit element in the expression of each living creature's descriptive features, together with the inventory of all possible genes. In chemistry or physics, the analogy is the atom and the inventory of all atoms given by the Mendeleyev table. Our decomposition formulas are analogous to describing the genetic composition of each possible living being, once identified; in physics or chemistry, they are equivalent to providing the atomic composition of each described material, whether solid, liquid or gas.

We can relate the various computational efficiency methods proposed in these books, and their subsequent improvements, to the effect the development of the printing press had on the practical empowerment of humankind through the possibility of widespread written communication.

These analogies may seem somewhat pretentious, or in the words of Karl Popper, naïve "holism," so it is important first to acknowledge that evolutionary efforts to build such a structure have a long history in financial derivatives analysis, and to present our argument, for maximum efficiency, within a framework of theoretical inclusiveness rather than rejection. This will provide the appropriate calibrating referentials for understanding the material provided, and help the reader penetrate its genuine contributive substance with minimum "intellectual transaction costs."

In one example of inclusiveness, we generalize the existing framework of vanilla puts and calls to multidimensional BICs. In the process, we develop a concept of implied correlation, on the model of implied volatility, that in turn allows us to "link" various univariate distributions into a desired bivariate or larger multivariate distribution. This enables us to obtain an intuitively meaningful and more natural alternative to the celebrated mathematical theory of copulas for achieving distributions' "linkage," and leads to many other a priori unintended practical applications.
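As a point of comparison, the sketch below shows the standard copula-style device for "linking" two given univariate distributions through a single correlation parameter. This is the benchmark construction the text contrasts itself with, not the BIC-based implied-correlation construction; the marginals and parameter value are illustrative:

```python
import numpy as np
from scipy.stats import norm, expon, lognorm

# Link two arbitrary marginals through one correlation parameter by passing
# correlated Gaussians through the normal CDF (a Gaussian copula, shown here
# only as the standard benchmark for distributions' "linkage").
rng = np.random.default_rng(0)
rho = 0.6                                   # the linking correlation
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

u = norm.cdf(z)                             # correlated uniforms
x = expon.ppf(u[:, 0])                      # first desired marginal
y = lognorm.ppf(u[:, 1], s=0.5)             # second desired marginal

print(np.corrcoef(x, y)[0, 1])              # induced dependence between x and y
```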

The thrust of our argument in this book is to reformulate the earlier paradigms in a more general framework that unearths previously hidden relationships and, surprisingly, lends itself to a scale of practical implementations never before contemplated.

One such practical implementation is the possibility of trading arcane and complex derivatives contracts merely through BICs trading and linear recombination.

A number of derivatives practitioners with prior research backgrounds in theoretical physics have wondered whether a Theory Of Everything (TOE) was possible in financial derivatives theory, on the model of the string theories of physics. Many of these practitioners entered the field with the expectation that such a theoretical framework could quickly be formulated, only to discover after a few years and several attempts that this generalization effort was indeed a non-trivial task, apparently vindicating Popper's philosophical skepticism on social and political change as applied to financial engineering. (A rigorous probabilistic mind would stress the "apparently" qualification: if all swans one has ever seen are white, that still does not prove that a swan cannot be black.)

A conventional wisdom response by seasoned professionals to newcomers with such impulses is: "it ain't physics." Physical phenomena are external entities that can be observed and modeled without causing the laws governing them to change in response. By contrast, the argument goes, the social phenomena that drive derivatives prices are endogenous, and hence formulating laws governing them is likely to cause those laws to change, driven by social agents' efforts to take advantage of predicted laws.

The recent empirical evidence from behavioral finance seems to indicate that such a view is at least partly misguided. Works by Kahneman and Tversky [130], Thaler [178], and Barberis [15], among others, have documented, within the prospect theory argumentation and outside it, numerous persistent biases in human behavior and general patterns that are not so far from amounting to laws of human nature.

However, we ground ourselves in the methodological spirit of the Austrian economists, with logically and mathematically deductive arguments, while sidestepping these economists' distrust of empirical evidence. Further, contrary to the Austrian school, we do not embrace its pessimistic view of the possibilities of effective mathematical modeling of economic phenomena.

3. Overview of Contributions

Our goal, as opposed to that of behavioral finance investigations or other empirically testable hypotheses, is not to formulate predictive laws of the behavior of future observables, but rather to propose a resilient structural framework into which stylized empirical observations, evolving anticipations or postulated laws can be seamlessly incorporated in the pricing and hedging of financial derivatives. In doing this, we obtain, without making many empirical assumptions, both qualitative philosophical results and quantitatively measurable mathematical results. This is possible through a priori assumptions of non-parametric descriptions.

Our analytic framework also provides a clean method for extending and generalizing transaction cost economics results in derivatives pricing and hedging.

In addition, it provides an opportunity for better anchoring the emerging financial and structural finance analysis in the organization of trading venues and trading participants.

The established taxonomy of works in financial economics usually differentiates between normative (prescriptive) and positive (descriptive) theories. We can summarize our path as approaching normative issues from a strong positive standpoint.

Through the power of efficient mathematical formulations, our work will have the effect of clarifying and simplifying current pricing issues. In philosophical agreement with Wittgenstein's principles, the benefit is at the very least a gain of "absence of confusion," without necessarily increasing computational complexity.

While easily generalized to many other fields of engineering, our results will be
very explicitly exemplified in the resolution of a number of derivatives pricing,
hedging, risk management and trading issues.

It should not appear surprising, then, that some of the toughest problems in financial economics will in fact appear here not to be problems of importance, if problems at all. An illustration of this will appear in the formulation and establishment of the fundamental theorems of asset pricing in the most general practical frameworks.

One leading practitioner of derivatives theory once stated “pricing is a science and
hedging is an art.” This book will make it clear that if pricing is a science, then
hedging too is a science, because a hedging issue can always be reformulated—in
a complete or incomplete market framework—as a pricing problem.

This will lead in particular to the rejection of Greeks computation as the central/primary tool for derivatives hedging. As a result, the relevance of entire fields of research, such as the application of Malliavin calculus in finance, is very much in question.

One will then be able to see how a number of risk management issues, ranging from coherent Value at Risk (VaR) measurement to derivatives portfolio allocation to hedging effectiveness, can be reformulated as pricing issues, with clarifying implications for incomplete markets analysis and derivatives accounting.

In the BICs framework, we will be able to reformulate thorny issues of calibration in an almost trivial manner, illustrating this with the exercise of calibrating derivatives prices to call/put prices for all strikes and maturities under various model assumptions. In particular, it becomes clear that for most properly formulated calibration issues, the difficulty lies more in narrowing the possible choices down to a unique solution than in the absence of a solution, as many least-squares approximate calibration methods suggest.

Faced with the lack of satisfactory tools in the mathematical literature, our analysis leads us to develop an independent and general theory of interpolation that is then used to support the resolution of the calibration problems considered. One particularly interesting insight of our analysis is the deconstruction of the relationship between so-called "parametric" and "non-parametric" estimation methods, showing how one can coherently transform a complex non-parametric estimation problem into a simpler parametric one.
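A toy illustration of that deconstruction, with basis and data of our own choosing: once a finite basis is fixed (here, low-degree polynomials fitted to hypothetical call quotes), the "non-parametric" problem of recovering an unknown price function reduces to the parametric estimation of a few coefficients:

```python
import numpy as np

# Scattered "market quotes" (hypothetical numbers, for illustration only).
strikes = np.array([80.0, 90.0, 100.0, 110.0, 120.0])
prices = np.array([21.2, 13.1, 7.0, 3.2, 1.3])

# Fixing a finite basis turns the problem parametric: estimate 4 coefficients.
coeffs = np.polynomial.polynomial.polyfit(strikes, prices, deg=3)

# The fitted parametric curve now fills in the strikes the market left out.
dense = np.linspace(80.0, 120.0, 9)
print(np.polynomial.polynomial.polyval(dense, coeffs).round(2))
```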

Our analysis also provides a clear framework for elucidating and classifying computational complexity issues prior to providing and justifying approximation methods to address them.

In order to achieve our objectives, we challenge and debunk a number of current mathematical methods and approaches that have been used to address our issues, and expose their narrower relevance.

On the other hand, we elevate and synthesize deeper philosophical principles and
mathematical properties that provide significant computational benefits to help
effectively address issues we face.

4. Operational Frameworks Reformulation

4.1. Against Continuous time/space, infinite time/space modeling assumptions

Our primary methodological challenge concerns the use/misuse of the continuous time/space paradigm, as well as space/time infiniteness, in dominant derivatives pricing and hedging analyses.

Our independent argument parallels an issue that has been at the heart of many debates in physics for millennia. According to the quantum physicist David Deutsch, Zeno's (490 BC–425 BC) paradox is the earliest known critique of the common-sense idea that we live in a "continuum"—an infinitely divisible, smoothly structured space. By the nineteenth century the continuum analysis seemed to have won, with the triumph of the wave theory of light. In 1900, when Max Planck solved the black body problem by postulating that atoms could absorb or emit energy only in discrete amounts, the quantum age began.

However, we could view that development as only the latest installment of a strange loop replaying itself in the intellectual history of scientific investigation, a strange loop that Eric Temple Bell essayed upon in 1934 in The Search for Truth [19]: the periodic emergence and re-emergence of the discrete from the continuous, and of the continuous from the discrete, in Western thought.

Indeed, led by quantitative analysts trained in applied mathematics and physics, the continuous time/continuous space assumptions in the modeling of the various underlyings under consideration took over, and have pervaded financial derivatives analysis since the nineteen sixties, leveraging Bachelier's seminal and long-discarded work [13] from the turn of the 20th century.

One of the most appealing features of this development was that it leveraged well-established theoretical tools, such as diffusion and Lévy processes, well-studied partial differential equations and martingale theory, to obtain easy-to-compute pricing or hedging formulas in the early days of the industry's development.

Our analysis also sheds a new investigative light on Stephen Wolfram’s insight that
despite it often being assumed that continuous systems are computationally more
sophisticated than discrete ones, it has in practice proved surprisingly difficult to
make continuous systems emulate discrete ones [186].

In mathematical finance, Ayache [12] recently very effectively questioned the validity of continuous time/space solutions as the absolute expression of perfection in quantitative analysis. His central argument is that discrete numerical solutions to PDEs are no worse than their continuous time counterparts, as continuous time limits are themselves often proxies for otherwise irreducibly discrete situations. While such an argument is agreeable, it does not go so far as to question the importance of PDEs as tools of analysis in dynamical systems, a central conclusion of our criticism. However, Ayache's focus is merely a reflection of the state of the art in dynamical systems analysis. A recent rapid survey of the articles of the journal Discrete and Continuous Dynamical Systems (DCDS), sponsored by the American Institute of Mathematical Sciences, showed that virtually all of them relate to or use as operational tools ODEs or PDEs/PIDEs.

Indeed, there have been numerous earlier papers that provide discrete time pricing of derivatives. However, in these papers the discrete time analysis is presented as merely a narrow case of the more general continuous time analysis. This view is evident in papers dealing with so-called "discretely monitored derivatives" such as barrier options.

Our finding is that continuous time/space and infinite time/space are very often a hindrance to understanding and solving relevant issues in derivatives pricing and hedging. In many other cases they force us to use a sledgehammer to swat a gnat. Our conclusion is that continuous time/space or infinite time/space should only be used, where possible, as mere approximation tools to reduce computational costs in tasks such as integration or optimization.

Our original criticism of these continuous time/space assumptions, and of the unnecessary burdens they place on effective theory development, traces its origins to the investigation of methods to infer the so-called Arrow-Debreu densities from observed options prices.

The so-called Breeden-Litzenberger relationship implies that by differentiating a continuum of option prices twice with respect to the strike, we should obtain the Arrow-Debreu density.

This supposes a priori that the price of the option is at least twice differentiable with respect to the strike. In practice, however, one seldom finds a continuous stream of option prices available in the market, and up to now it has been difficult to argue that one interpolation method is better than another. Given the instability of approximations by differentiation, this approach is very questionable.
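The sketch below makes the concern concrete. With zero rates, the Breeden-Litzenberger density is the second strike-derivative of the call price; here it is recovered by differencing Black-Scholes prices on a fine grid (parameters illustrative). On the sparse, noisy quotes found in practice, this same differencing step is exactly what becomes unstable:

```python
import numpy as np
from scipy.stats import norm

S0, sigma, T = 100.0, 0.2, 1.0      # illustrative Black-Scholes world, r = 0

def bs_call(K):
    d1 = (np.log(S0 / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    return S0 * norm.cdf(d1) - K * norm.cdf(d1 - sigma * np.sqrt(T))

# Second finite difference in strike: the Breeden-Litzenberger density.
K = np.linspace(60.0, 160.0, 201)
h = K[1] - K[0]
density = (bs_call(K[2:]) - 2.0 * bs_call(K[1:-1]) + bs_call(K[:-2])) / h**2

# Compare with the exact lognormal density at K = 100.
d = (np.log(100.0 / S0) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
print(density[K[1:-1] == 100.0], norm.pdf(d) / (100.0 * sigma * np.sqrt(T)))
```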

The importance of this problem was reinforced when it also appeared that, in order to obtain the price of a derivatives contract that is a function of the underlying at a future time as a linear combination of vanilla options, forwards and bonds, one needed, as shown for instance by Carr and Madan [150] among others, to resort to Schwartz distributions. Otherwise,¹ the function of the underlying needed to be twice differentiable with respect to that underlying. In addition, there were convergence requirements on the integral of the second derivative of the functional multiplied by the price of far out-of-the-money options.

Intuitively, however, it seemed very much that the $C^2$ requirement did not underscore a relevant structural problem; what really seemed to matter was the availability of enough option prices. The Exhibit below shows our early approach to circumventing the problem, using a recently rediscovered lemma² formulated by Emil Post in the 1930s. The approximate limiting formula proposed, by relying on integrals rather than derivatives of options prices, appeared likely to smooth the problems associated with the lack of a continuous stream of options prices. Indeed, one can guess that this integral approach is somewhat related to the non-parametric approaches for the estimation of Arrow-Debreu densities studied by Aït-Sahalia and Lo [142]³ among others.

Another leading illustration is the requirement of twice-continuous differentiability of the derivatives contract price with respect to the underlying for the application of Ito's lemma, from which the pricing PDE is established when the underlying follows a diffusion process.

Our original criticism has been further reinforced by several other instances of misuse of the continuous time framework. One can cite the case of average rate options and volatility swaps, where pricing approaches are based on continuous sampling assumptions to compute averages or realized volatility. Such assumptions are false: actually traded contracts always specify rules for discrete sampling. What is sometimes troubling here is that there are few rigorous estimates of the extent to which such approximations reflect reality. In fact, in the prevailing literature, these continuously sampled estimates are treated as if they were the most exact description of actually traded contracts.

4.2. The Discrete space/time choice and its comparative benefits

In order to clearly sort out the relevant facts in derivatives pricing or hedging, it appeared very reasonable to state the problem in the discrete space framework corresponding to actual trading practice. As seen in our presentation, this resolves all the issues of functional regularity outlined above.

The primary benefit of the discrete time/space approach is that we are able to physically isolate what we call BICs and construct what we term the BIC-basis. These enable us to clarify the notion of completeness in a more general sense and to develop quantitative tools to measure it.
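In the simplest finite-state setting, such a quantitative notion of completeness can be sketched as follows; the instruments below are generic stand-ins for a basis, not the book's BICs:

```python
import numpy as np

states = np.array([80.0, 100.0, 120.0])      # a three-state underlying
payouts = np.array([
    np.ones_like(states),                    # zero-coupon bond
    states,                                  # the underlying itself
    np.maximum(states - 100.0, 0.0),         # a 100-strike call
])

# Completeness is measurable: the basis spans all payouts iff its payout
# matrix has full rank, one dimension per state.
print("complete:", np.linalg.matrix_rank(payouts) == len(states))

# Decomposition of an arbitrary derivative (a 90-strike put) is then a
# linear solve: 50 bonds, short 0.5 underlying, long 0.5 call.
target = np.maximum(90.0 - states, 0.0)
print(np.linalg.solve(payouts.T, target))
```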

These also enable us to provide a relatively simple and encompassing definition of a derivatives contract, which then facilitates its decomposition and subsequent pricing and hedging.

Surprisingly enough, by choosing discrete time/space assumptions and defining integrals and derivatives as subordinated to a space partition, we are able to reach a substantial number of critical insights almost trivially.
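A two-line illustration of why the regularity questions dissolve: once derivatives and integrals are subordinated to a finite partition, the fundamental theorem of calculus becomes an exact telescoping identity, with no smoothness assumptions at all:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 11)      # the partition
f = np.abs(t - 0.5)                # not differentiable in the classical sense

df = np.diff(f)                    # discrete derivative (increments over cells)
print(df.sum(), f[-1] - f[0])      # exactly equal, by telescoping
```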

For example, a major consequence of the BICs framework developed in these books is the marginalization of PDE/PIDE analysis in the study of the evolution of stochastic dynamic systems, and in particular in the pricing and hedging of financial derivatives.

We also expose how marginal the need is for the more sophisticated theoretical tools employed in the treatment of PDEs/PIDEs, without coming short of resolving the issues for which they were designed. In order to deal with regularity issues in continuous time/space, one must in general use increasingly sophisticated tools to obtain meaningful results. For instance, one must use distributions to tackle functions that may not be sufficiently differentiable, or perform operations such as convolutions to regularize functions. Convolutions lead to approximations.

It is comparatively instructive to recall the motivations of the theory of distributions pioneered by Laurent Schwartz in the 1940s. "Schwartz's idea (in 1947) was to give a unified interpretation of all the generalized functions that had infiltrated analysis as (continuous) linear functionals on the space $C_c^\infty$ of infinitely differentiable functions vanishing outside compact sets. He provided a systematic and rigorous description, entirely based on abstract functional analysis and on duality. Because of the demands of differentiability in distribution theory, the spaces of test-functions are the $C_c^\infty$ sets and their duals are the so-called distributions. The non-trivial nature of these sets has led to extensive studies of topological vector spaces beyond the familiar categories of Hilbert and Banach spaces, studies that, in turn, have provided useful new insights in some areas of analysis, such as partial differential equations or functions of several complex variables" [180].

The proof of one of our central decomposition formulas for BICs will apply distribution-theory-type ideas to a trivial space of test functions: functions defined on a discrete finite space, with discrete notions of derivatives and integrals. However, this is done merely for clarity purposes, and the abstract notion of the dual of a functional space is avoided in the actual applications.

Indeed, when we use PDEs or PIDEs as the primary tool for the study of the dynamical systems we explore in derivatives theory, the solution, when regularity and convergence properties are required but uncertain, is often to take functions in the Sobolev-space subset of distributions. This shows that the study of generalized pricing and hedging issues in continuous time/space would be conducted in increasingly abstract and complex spaces, with various abstract notions of solutions then considered to deal with the problems at hand; that is how notions of viscosity solutions for PDEs were created. All those concepts were developed without derivatives pricing and hedging in mind and contribute to killing financial intuition. Further, in the context of derivatives pricing and hedging, the requirement to master all these tools puts substantial additional costs on the development and dissemination of effective risk management practices.

We could summarize this criticism in a simple acronym, COR-IP: the Convergence Or Regularity Irrelevance Proposition.

The case for truly discrete time trading can be further substantiated with empirical and psychological arguments. Continuous time trading can act as a hindrance against rationally taking relevant information into account in trading decisions. Further, what is left unaccounted for in the continuous time trading analysis is the substantial psychological cost/pain, which increases with trading frequency. The case for this argument is brilliantly made, among others, by Taleb [177].

4.3. The finite space/time choice and its justification

The assumption of space/time infiniteness is not realistic. In the real world, we are in a discrete framework: underlyings' values change by minimal pre-specified increments, and all underlyings predictably trade within a bounded range over short time frames, and can thus be assumed to take values in a finite set. Hence, the convergence of spatial integrals becomes a non-issue: with finite spaces, limits always exist and all sums or integrals converge. Similarly, derivatives contracts of infinite maturity may be assumed to be contracts of arbitrarily large finite maturity, where that maturity is set at a time when the discounted value of the remaining payouts has a negligible impact on computed premiums.

This may thus be an instance where we would concur with Mr. Keynes that in the
long run, we will all be dead.

4.4. Sample benefits of the BICs framework

Further, in the more general BICs framework developed in this book, we lift the traditional limitation of describing the underlying through stochastic differential equations (SDEs) and Markovian stochastic processes, and present the possibility of successfully handling the increased computational cost. For example, we can show how an underlying can be described via heteroskedastic Black-Scholes implied volatility functions determined by parameters lagging several (one or two) time steps behind, in order, for instance, to incorporate the empirical evidence brought by recent econometric research, such as the family of GARCH models.
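As a reference point, the sketch below simulates a generic GARCH(1,1) path, the standard econometric model alluded to, in which today's variance is driven by parameters lagging one step behind (yesterday's shock and yesterday's variance); the parameter values are illustrative, and this is not the book's implied-volatility-function construction:

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, beta = 1e-6, 0.08, 0.90        # illustrative GARCH(1,1) parameters
n = 10_000
var = np.empty(n)
ret = np.empty(n)
var[0] = omega / (1.0 - alpha - beta)        # start at the unconditional variance

for t in range(n):
    ret[t] = np.sqrt(var[t]) * rng.standard_normal()
    if t + 1 < n:                            # next variance lags one step behind
        var[t + 1] = omega + alpha * ret[t] ** 2 + beta * var[t]

print(ret.std(), np.sqrt(omega / (1.0 - alpha - beta)))   # sample vs model vol
```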

4.5. Elevation of deeper mathematical properties

In contrast to our criticism of continuous time/space assumptions, a set of deeper philosophical and mathematical methodologies is elevated to help find exact solutions to derivatives pricing and hedging issues. When computational complexity renders exact solutions impractical, we outline the fundamental mathematical laws we rely upon to enable predictably effective approximate solutions.

We will anchor our most central pricing and hedging argument for exact solutions around the idea of "backward analysis," which will shed new light on, or provide a new interpretation of, Kierkegaard's quote: "Life can only be understood backwards; but it must be lived forwards."

In particular, this backward analysis should be contrasted with the Monte Carlo forward summing approach. While Monte Carlo analysis helps one see the bad events that can happen, it does little to help one plan ahead of time what to do in the event of potential adverse outcomes. This is specifically why the Monte Carlo forward summing approach performs so poorly in pricing American options. It may also in part explain why pure Monte Carlo forward summing approaches are not so good at providing the Greek sensitivities for hedging purposes, Malliavin-calculus-based methods notwithstanding, and setting aside the question of the usefulness of the Greeks themselves.
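A standard binomial backward induction makes the contrast concrete: the exercise decision of an American put is "game planned" at every node on the way back, which a forward Monte Carlo sum cannot do directly. This is the textbook CRR tree with illustrative parameters, not the book's own method:

```python
import numpy as np

S0, K, r, sigma, T, n = 100.0, 100.0, 0.05, 0.2, 1.0, 200
dt = T / n
u = np.exp(sigma * np.sqrt(dt))                  # up factor; down factor is 1/u
p = (np.exp(r * dt) - 1.0 / u) / (u - 1.0 / u)   # risk-neutral up-probability
disc = np.exp(-r * dt)

# Option values at maturity, over the n + 1 terminal nodes.
S = S0 * u ** (2.0 * np.arange(n + 1) - n)
V = np.maximum(K - S, 0.0)

# Step backwards: at each node, compare continuation with immediate exercise.
for step in range(n - 1, -1, -1):
    S = S0 * u ** (2.0 * np.arange(step + 1) - step)
    V = np.maximum(K - S, disc * (p * V[1:] + (1.0 - p) * V[:-1]))

print(V[0])                                      # American put value at time 0
```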

Despite our criticism of the continuous time/space framework, our claim is not that it is not useful. Rather, our strong point is that it has been misconstrued in the prevailing literature as the most accurate representation of reality, sometimes creating unnecessary problems. Its use should effectively be confined to that of an efficient approximation tool when a discrete representation is computationally impractical. This, in our view, is the right balance in the use of the discrete and the continuous.

In particular, for the resolution of a number of optimization problems in the practical applications we consider, we assume variables/arguments to be continuous. Thus, the Karush-Kuhn-Tucker (KKT) theorem, and the trivial simplifications of it that we repeatedly use, provide an effective method for locating a narrower, discrete set of local extrema from which the desired global maxima/minima can be computed.
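As an illustration, with an objective and constraint of our own choosing, a numerical solver such as SciPy's SLSQP locates exactly such a KKT point for a small constrained problem:

```python
from scipy.optimize import minimize

# Minimize the distance to (1, 2) subject to w0 + w1 = 1 and w >= 0.
# SLSQP solves the KKT conditions numerically; the optimum here is (0, 1).
objective = lambda w: (w[0] - 1.0) ** 2 + (w[1] - 2.0) ** 2
constraints = [{"type": "eq", "fun": lambda w: w[0] + w[1] - 1.0}]
bounds = [(0.0, None), (0.0, None)]

result = minimize(objective, x0=[0.5, 0.5], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x)
```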

However, in our functional-variables framework, the KKT theorem is often used in an iterative backward fashion, of the dynamic programming kind, as a more general alternative to the Hamilton-Jacobi-Bellman (HJB) theorems in the study of multi-period optimization problems.

For the approximation of solutions to computationally intensive problems, we anchor our arguments around the multidisciplinary, established Law of the Few: information that would have to be obtained by sampling an infinitely large number of inputs can be effectively approximated by processing merely a cleverly targeted few. This notion, recently popularized by Malcolm Gladwell [93] and illustrated in various fields of knowledge, is now part of the common strategic vernacular. For instance, election results in the US can effectively be approximated by the results in a few key states. Mathematically, this can be articulated through Hilbert space projection principles: we are able to store the relevant descriptive information about a function in the form of merely a few coordinates in a wisely selected Hilbert basis or orthogonalized set of functions. That is why Fourier analysis and wavelet theory are so important in engineering. Another example is the use of the concentration of measure phenomenon to target our sampling efforts to the areas where probability mass is concentrated, in order to facilitate the numerical computation of expectations. The theory of concentration of measure⁴ is new and rapidly growing, and it can be expected that many of its results will be put to use in devising appropriate approximate pricing schemes. Already, in a general sense, low-discrepancy sequences for large-dimensional integrals can be seen as a very fruitful anticipating application.
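The Hilbert-space reading of the Law of the Few can be sketched in a few lines: project a function onto a Fourier basis and keep only its few largest coefficients (the function and the number kept are illustrative):

```python
import numpy as np

n = 1024
t = np.linspace(0.0, 1.0, n, endpoint=False)
f = np.sign(np.sin(2.0 * np.pi * t))         # a square wave

coeffs = np.fft.rfft(f)                      # coordinates in the Fourier basis
few = np.zeros_like(coeffs)
keep = np.argsort(np.abs(coeffs))[-8:]       # the cleverly targeted few
few[keep] = coeffs[keep]
approx = np.fft.irfft(few, n)

print(np.linalg.norm(f - approx) / np.linalg.norm(f))   # ~0.16 relative error
```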

In the next chapter, we start by redefining a derivatives contract, laying down the foundation for subsequently exposing our concepts and results. Although a derivatives contract is intuitively simple to recognize, judging from previous attempts at formalism, ascribing a definition to it appears not to be so simple a proposition.

5. Exhibit: Regularity Requirements in Question?

This section illustrates, through the case of the derivation of the market-implied risk-neutral density from options prices, anecdotal evidence of the irrelevance of regularity requirements in derivatives analysis. While the mathematical finesse displayed may seduce the trained mathematician, the case must be made that the difficulties were created in the first place by a poor choice of operational framework.

5.1. Post’s lemma

Given a function $f(t)$ defined for $t > 0$, the Laplace transform $F(s)$ is defined as:

$$F(s) = \int_0^{\infty} e^{-st} f(t)\,dt.$$

If $f$ is continuous and such that there exists $b > 0$ satisfying $\sup_{t>0} e^{-bt}\,|f(t)| < \infty$, then

$$f(x) = \lim_{n \to +\infty} \frac{(-1)^n}{n!} \left(\frac{n}{x}\right)^{n+1} F^{(n)}\!\left(\frac{n}{x}\right).$$

Proof. See the reference given in Note 2 at the end of this chapter. $\square$

5.2. Application to Nonparametric Density Derivation

Applying Post’s lemma above to the characteristic function of the density of an


underlying, we can deduce the Risk Neutral Density from option prices when those
are not given by a continuum of call or put prices by underlying level K as:

Proposition I.1 (Density Approximation). If we note:

$$\Gamma_n^{+}\!\left(\frac{F}{K}, x\right) = n(n-1)\,x^{-n+2} - 2n^2\,\frac{F}{K}\,x^{-n+1} + \left(\frac{nF}{K}\right)^{2} x^{-n},$$

$$\Gamma_n^{-}\!\left(\frac{F}{K}, x\right) = n(n-1)\,x^{n-2} - 2n^2\,\frac{F}{K}\,x^{n-1} + \left(\frac{nF}{K}\right)^{2} x^{n}.$$

Then,

$$\mathrm{Density}(K, T) = \frac{1}{B(0, T)} \lim_{n \to +\infty} \frac{F^{n-1}}{n!} \left(\frac{n}{K}\right)^{n+1} \left( F e^{-\frac{nF}{K}} + \int_0^1 \Gamma_n^{+}\!\left(\frac{F}{K}, x\right) e^{-\frac{nF}{xK}}\, \mathrm{Call}\!\left(\frac{F}{x}, T\right) \frac{dx}{x^2} + \int_0^1 \Gamma_n^{-}\!\left(\frac{F}{K}, x\right) e^{-\frac{nFx}{K}}\, \mathrm{Put}(xF, T)\, dx \right),$$

where $B(0, T)$ is the price of the unit zero-coupon bond between time zero and $T$, and $F$ is the forward price at time zero for maturity $T$. Density, Call and Put are all functions of the strike ($K$) and the maturity ($T$).

Proof. The proof of this result starts by obtaining the Laplace transform as a function of calls and puts, as provided in this volume for any function of underlyings. Using Post's lemma above, one can then deduce the density as a limit involving the derivatives of the Laplace transform. $\square$
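To close the exhibit, a quick numerical sanity check of Post's lemma on a pair where everything is known in closed form: for $f(t) = e^{-t}$, $F(s) = 1/(1+s)$ and $F^{(n)}(s) = (-1)^n\, n!/(1+s)^{n+1}$, so the $n$-th approximant collapses to $(n/(x+n))^{n+1}$:

```python
import numpy as np

# Post's approximant for f(t) = exp(-t), evaluated at x = 1: it should
# converge to exp(-1) as n grows, which it does, at the slow O(1/n) rate
# characteristic of this inversion formula.
x = 1.0
for n in (1, 10, 100, 1000):
    print(n, (n / (x + n)) ** (n + 1), np.exp(-x))
```

The slow convergence visible here also suggests why the formula's appeal in the text lies in replacing derivatives of option prices with integrals of them, rather than in raw numerical accuracy.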

Notes For Chapter I

1 In an earlier print, "Otherwise" read "Further." This was, as fairly pointed out to the author by P. Carr, indeed an unfortunate typo. However, it may have hidden a "Freudian" instance of justified unease: using the function satisfactorily as a distribution in case it is not twice differentiable requires test functions that are infinitely smooth with compact support. Thus, the use of regular distributions is not sufficient in the referred paper to get around the regularity issue in the continuous space framework if one uses densities (the applicable test functions here) that are not infinitely smooth with compact support, assuming a constant discount factor in the usual frictionless case.

2 See E. L. Post, "Generalized Differentiation," Transactions of the American Mathematical Society, 32 (1930).

3 See Aït-Sahalia and Lo [142].

4 See the works of the field's pioneers, Talagrand and Ledoux; for instance, M. Ledoux, The Concentration of Measure Phenomenon, American Mathematical Society (2001).
