
Chaos and Randomness

Ryan Marshall and Max Proctor


April 2, 2014
Abstract
We explore correlations between the outputs of chaotic mappings and compare them with known statistical theorems.

Independent Random Variables

In our project we will reinforce some basic statistical theorems that allow us to compare and contrast independent random variables with sets of numbers produced by dynamical systems. We know that dynamical systems are prone to chaos (sensitive dependence on initial conditions); we wish to study the mappings of dynamical systems in order to better determine the behaviour of a system in the presence of chaos.

We will begin our study with an understanding of randomness by considering sets of independent random variables. Although it may seem that the very definition of random suggests there is no correlation between sets of randomly generated numbers, mathematics can show that there are intimate relationships between these numbers. The ideas we will be studying do not give us a relationship between two elements of the set, but rather describe the behaviour of the set of independent random variables as a whole.

Consider the Law of Large Numbers (LLN), which states that the sample average converges almost surely to the expected value as the sample size increases to infinity. This theorem tells us that sample means become predictable in the long term. This appears to stand in stark contrast to chaotic dynamical systems: sensitive dependence on initial conditions is, on some level, the idea that solutions are unpredictable in the long term.
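In symbols, the strong law can be written as follows (the notation here is ours; the original text omits a formal statement):

\[
  \bar{X}_n \;=\; \frac{1}{n}\sum_{i=1}^{n} X_i \;\xrightarrow{\text{a.s.}}\; \mu
  \qquad \text{as } n \to \infty,
\]

for i.i.d. random variables X_1, X_2, ... with expected value μ.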

The proof of the law of large numbers is time-consuming and does not lead to a broader understanding of the LLN, so we will omit it. We have, however, taken advantage of computer simulation to gain insight into the LLN. The idea behind our simulation was to generate random samples (with some chosen standard deviation and mean) of different sizes, determine the mean of each generated sample, and then plot each sample's mean against its size (i.e. mean vs. n). To demonstrate the theorem, one would expect a plot that deviates from the expected value for small n and then, as n increases, converges to the expected value.
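A minimal sketch of such a simulation in Python (our reconstruction, since the original code is not shown; normally distributed samples with the parameters reported below are assumed):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()
mu, sigma = 10.0, np.sqrt(3.0)    # chosen mean mu = 10 and variance sigma^2 = 3
sizes = range(10, 10_001, 10)     # sample sizes n up to n_max = 10,000

# For each size n, draw a normal sample and record its mean.
means = [rng.normal(mu, sigma, n).mean() for n in sizes]

plt.plot(list(sizes), means)
plt.axhline(mu, linestyle="--")   # the expected value mu = 10
plt.xlabel("n")
plt.ylabel("Mean")
plt.title("Law of Large Numbers")
plt.show()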

[Figure: Law of Large Numbers: sample mean (y-axis, roughly 8 to 12) plotted against sample size n (x-axis, up to 10,000)]

The above graph demonstrates the simulation with values as follows:

μ = 10
σ² = 3
n_max = 10,000

You can clearly infer that as n increases the mean tends to converge to μ = 10.

The predictable nature of these sets of independent random variables prompts us to investigate other relations between sets of independent random variables, so we would like to quickly review the Central Limit Theorem (CLT).

Given that X_1, X_2, ... is a sequence of i.i.d. random variables, each with expected value μ and variance σ², let S_n = X_1 + ... + X_n. The CLT states that for large n, S_n will be approximately a normal random variable with expected value nμ and variance nσ². As a result, P((S_n - nμ)/(σ√n) ≤ x) is approximately the standard normal distribution function Φ(x), with the approximation becoming more exact as n grows larger.

This is a subtler result than one might think. The implication of the hypotheses is that the quantity S_n will be approximately a normal random variable with expected value nμ and variance nσ²; the second part of the theorem is essentially a corollary.

To simplify things, the CLT says that if you sum together numerous independent random variables from the same probability distribution, then as you increase the number of terms in the sum (i.e. increase n), the sums converge to a normal distribution. This also implies that (S_n - nμ)/(σ√n) converges to the standard normal distribution.

This is pretty useful: it gives you a way to easily compute many different probabilities from many different distributions.

We attempted to demonstrate the CLT through simulation as follows. First we generate many samples to create the S_n values, followed by calculation of the normalized S_n through the equation (S_n - nμ)/(σ√n); finally we compute the probability distribution function (PDF) of these normalized S_n's. From there we can compare this PDF to the known PDF of the standard normal distribution.
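A sketch of this procedure in Python (our reconstruction; the Poisson parameter λ = 1 is an assumption, since the original does not state it):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()
lam = 1.0                  # assumed Poisson rate; for Poisson, mu = sigma^2 = lam
n, trials = 10_000, 10_000

z = np.empty(trials)
for i in range(trials):
    s = rng.poisson(lam, n).sum()            # S_n = X_1 + ... + X_n
    z[i] = (s - n * lam) / np.sqrt(n * lam)  # (S_n - n*mu) / (sigma * sqrt(n))

plt.hist(z, bins=50)
plt.xlabel("normalized S_n")
plt.ylabel("Frequency")
plt.title("PDF")
plt.show()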

[Figure: PDF: histogram (Frequency vs. normalized S_n) of the simulated values]

We ran this simulation with 10,000 S_n's, each formed from 10,000 generated Poisson random variables. As you can see, the result is very close to the PDF given by the standard normal distribution, reinforcing the validity of our simulation.

From our review of theorems from statistics we can clearly see that there are predictable trends in random numbers. This will help us develop our original question of whether or not mappings of chaotic dynamical systems exhibit similar behaviour, and will guide our study in understanding what the mappings do correlate to.

Chaotic Maps

We now consider the dynamical system T(z) = 4z(1 - z). It is worth noting that this is similar to the logistic map x_{n+1} = rx_n(1 - x_n). A good place to first consider this dynamical system is to look at the bifurcations of the logistic map. Below is the relevant bifurcation diagram:

[Figure: bifurcation diagram of the logistic map]

We certainly expect to see chaos for the map T(z) = 4z(1 - z). Setting z_0 = 0.3 and iterating using z_n = 4z_{n-1}(1 - z_{n-1}) we get z_1000 = 0.0401, but if z_0 = 0.299 then we get z_1000 = 0.8439. This may not prove that there is chaos, but it certainly looks like that is the case.
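This experiment is easy to reproduce; a minimal sketch:

def T(z):
    return 4.0 * z * (1.0 - z)

def iterate(z0, n):
    # Iterate z_k = T(z_{k-1}) n times starting from z0.
    z = z0
    for _ in range(n):
        z = T(z)
    return z

print(iterate(0.3, 1000))    # the text reports z_1000 = 0.0401
print(iterate(0.299, 1000))  # the text reports z_1000 = 0.8439
# Exact outputs depend on floating-point rounding, but the two orbits
# clearly diverge despite the nearly identical starting points.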
So if the map T(z) = 4z(1 - z) creates chaos, then maybe we can use this map to compare chaos to independent random variables. We have already discussed the strong law of large numbers, but do the z_n of the map follow this law? As this map creates chaos, we find that as n tends to infinity our values z_0, ..., z_n are dense in the unit interval [0, 1]. It is worth checking this numerically. To do so we compute the z_n iteratively, sum all n of them, and divide the sum by n; this is the mean of the z_n's. This is easy to do on a computer, as in the sketch below. We do this for z_0 = 0.3421 (chosen arbitrarily).
For n = 10: sum(z)/n = 0.556455392058685
For n = 1, 000: sum(z)/n = 0.515545540139079
For n = 100, 000: sum(z)/n = 0.50246319958991
For n = 10, 000, 000: sum(z)/n = 0.500034797296272
We can see that as n tends to infinity, sum(z)/n tends towards 0.5. In fact we get this same result for any z_0 chosen in (0, 1) barring 1/2. With the z_n's appearing fairly random on the unit interval, it is no surprise to expect a mean of 0.5 for a large number of them, and this result is analogous to the strong law of large numbers for independent random variables. This is certainly a similarity between chaos and randomness.

Again we use our recently acquired statistical knowledge to complete another comparison, this time with the CLT. To do this we generated 1000 random variables with μ = 0.5 and σ² = 0.25, in an attempt to keep them within [0, 1], and used them as our initial conditions. We then mapped each initial condition 1000 times, and from these sets we created our S_n's (analogous to the CLT). From here it was easy to create a histogram to see if it matched up with the PDF of the standard normal distribution.
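A sketch of this experiment (our reconstruction; the original does not specify exactly how the S_n were normalized, so standardizing with the sample mean and standard deviation, and clipping the initial conditions into (0, 1), are both our assumptions):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()
n_orbits, n_iters = 1000, 1000

# Normal initial conditions with mu = 0.5, sigma^2 = 0.25 (sigma = 0.5),
# clipped into (0, 1) so every orbit stays in the unit interval.
z = np.clip(rng.normal(0.5, 0.5, n_orbits), 1e-6, 1 - 1e-6)

S = np.zeros(n_orbits)
for _ in range(n_iters):
    z = 4.0 * z * (1.0 - z)
    S += z

# Normalize the sums before plotting the histogram.
Z = (S - S.mean()) / S.std()

plt.hist(Z, bins=40)
plt.xlabel("normalized S_n")
plt.ylabel("Frequency")
plt.title("CLT with Chaos")
plt.show()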

[Figure: CLT with Chaos: histogram (Frequency vs. normalized S_n)]

It is clear that even for fairly small n (1000) the data correlates very closely with what we observed earlier while introducing the CLT.

Another way we can compare chaos and randomness is to plot z_n against z_{n+2} and x_n against x_{n+2}. For this example we produce 1000 binomially distributed random numbers x_1 to x_1000 with parameters n = 10, p = 0.6. These numbers are independent of one another, and so we expect this to be reflected when we plot x_n against x_{n+2}:

[Figure: scatter plot of x_n against x_{n+2}]

The expectation for these random variables is np = 6, so this diagram appears reasonable. If we were to produce another 1000 binomial random numbers with the same parameters we would expect a slightly different diagram, despite it still being centred on the expectation. We get a similar diagram if we do this with numbers generated from a different distribution (Poisson, for example), the biggest difference being where each diagram is centred: each is roughly centred on its own expectation, as above.
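A sketch of how such a plot can be produced (our reconstruction):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()
x = rng.binomial(10, 0.6, 1000)   # x_1, ..., x_1000 with n = 10, p = 0.6

# Plot x_n against x_{n+2}; independence means no visible structure,
# just a cloud centred near the expectation np = 6.
plt.scatter(x[:-2], x[2:], s=5)
plt.xlabel("x_n")
plt.ylabel("x_{n+2}")
plt.show()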
So do we get a similar graph if we do the same for the z_n's output by the map T(z) = 4z(1 - z)? One would expect the graph to appear less random, since each z_n is derived from z_{n-1}; in other words they are not truly independent. Using the z_0, ..., z_n generated from z_n = 4z_{n-1}(1 - z_{n-1}) with z_0 = 0.7884 (again, chosen arbitrarily), z_n plotted against z_{n+2} looks like:

[Figure: z_n plotted against z_{n+2}]

It looks like all of the points (z_n, z_{n+2}) fall in what is roughly an M-shape. In fact, if we plot each of these points as a dot without connecting them, we see the M-shape much more clearly. We do this below for even larger n, here n = 1,000,000:

[Figure: scatter plot of (z_n, z_{n+2}) for n = 1,000,000, showing the M-shaped curve]

This is very interesting indeed: each point (z_n, z_{n+2}) falls on this M-shape perfectly. This diagram certainly implies the dependence of the z_n. This is in striking contrast to what we witnessed in the case of the i.i.d. random numbers.
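A sketch of this plot (our reconstruction; a smaller n than the 1,000,000 above is used here to keep it quick, and every point still falls on the same curve, since z_{n+2} = T²(z_n)):

import numpy as np
import matplotlib.pyplot as plt

z = np.empty(100_000)
z[0] = 0.7884
for k in range(1, len(z)):
    z[k] = 4.0 * z[k - 1] * (1.0 - z[k - 1])

# Every point (z_n, z_{n+2}) lies on the graph of T^2, the M-shaped curve.
plt.scatter(z[:-2], z[2:], s=1)
plt.xlabel("z_n")
plt.ylabel("z_{n+2}")
plt.show()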

It may now be of interest to plot z_n against z_{n+1} for the chaotic case:

[Figure: z_n plotted against z_{n+1}]

Here we see that there is one maximum on this graph. Could this be linked to the fact that z_{n+1} = T(z_n)? We had two maxima in the previous case, and so it may be worth noting that z_{n+2} = T(z_{n+1}) = T²(z_n). We can now check how many maxima occur when we plot z_n against z_{n+3} (bearing in mind that z_{n+3} = T³(z_n)):

[Figure: z_n plotted against z_{n+3}]

Clearly our hypothesis regarding the maxima has broken down. What about plotting z_n against z_{n+4}? We get:

[Figure: z_n plotted against z_{n+4}]

Now what happens if we make similar plots when the x_n are i.i.d. random numbers? We have already seen what this looks like for x_n plotted against x_{n+2}. It turns out that the graph is extremely similar when we plot x_n against x_{n+1}, or even if we plot x_n against x_{n+1000}. Should this surprise us? Not really; we have already outlined the fact that these numbers are independent. x_{n+1} is not dependent on x_n, so clearly x_{n+2} is not dependent on x_n either.
This outlines a major difference between chaos and randomness. Chaos may sometimes appear random, but it really isn't, due to the deterministic nature of the map. We have sensitive dependence on initial conditions, but if we use exactly the same z_0 to calculate z_1000 then we will get the exact same value for z_1000 every single time. Clearly this is not the case with independent random numbers: generating two i.i.d. random numbers will give us two different numbers with probability extremely close to 1.
We noted earlier that chaos seems to obey the strong law of large numbers. It would certainly be of interest to see if the same applies for the central limit theorem. Is chaos normally distributed? To check this we find z_1, ..., z_{10,000} starting from some z_0, before computing (z_1 + ... + z_{10,000})/10,000. We do this 1,000 times using different values of z_0 (randomly chosen from a uniform distribution). We then plot all of these values on a histogram, as in the sketch below.
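A sketch of this computation (our reconstruction; the uniform range (0, 1) for the initial conditions is an assumption, and the optional f covers the f(z_n) variant discussed next):

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng()

def orbit_mean(z0, n=10_000, f=lambda s: s):
    # Mean of f(z_1), ..., f(z_n) for the orbit of z0 under T(z) = 4z(1 - z).
    z, total = z0, 0.0
    for _ in range(n):
        z = 4.0 * z * (1.0 - z)
        total += f(z)
    return total / n

# 1,000 orbit means from uniformly chosen initial conditions.
means = [orbit_mean(z0) for z0 in rng.uniform(0.0, 1.0, 1000)]
plt.hist(means, bins=30)
plt.show()

# For the transformed case below, pass f=lambda s: 3*s**2 - 2*s + 7.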

We can see that this histogram loosely follows the bell-shaped curve of a
normal distribution. Chaos appears to satisfy the central limit theorem. Now
what happens when we do this for y_n = f(z_n) or y_n = f(x_n), where f is an arbitrary function? We find that the f(x_n) are normally distributed for i.i.d. x_n. To consider the chaos case we take f(s) = 3s² - 2s + 7. Doing the same as above, but with (y_1 + ... + y_{10,000})/10,000, we get:

[Figure: histogram of the means of y_n = f(z_n)]

Once again it looks like these values are obeying the central limit theorem, a trait of random numbers.
From the above it appears that chaos behaves similarly to randomness, albeit in a more constrained way. We have seen that chaos obeys the strong law of large numbers and the central limit theorem, indicating that the behaviour of both chaos and randomness settles down when we consider a large number of chaotic z_n or random x_n. The fundamental difference is that chaos is fully dependent on what has happened in the past, in stark contrast to random numbers, which are independent of one another. This was certainly apparent when plotting x_n against x_{n+i} and z_n against z_{n+i} for various i. We also noted that plotting the chaotic z_n against z_{n+i} was consistent with our comment about constraint, whereas doing the same for random x_n was certainly less constrained. Despite sharing some similarities, it is clear that chaos is not the same as randomness.
