
[ADVANCED] ANALYSIS, PROBABILITY AND STATISTICS EXERCISES [10]

Proof is the idol before whom the pure mathematician tortures himself.
SIR ARTHUR EDDINGTON
Statistics: The only science that enables different experts
using the same figures to draw different conclusions.
EVAN ESAR
Life is a school of probability.
WALTER BAGEHOT

1. [Bounds for order statistics] Prove that if $x_1 + x_2 + \ldots + x_n = 0$ and $x_1^2 + x_2^2 + \ldots + x_n^2 = 1$, where
$x_1 \leq x_2 \leq \ldots \leq x_n$, then
$$-\sqrt{\frac{n-1}{n}} \leq x_1 \leq -\frac{1}{\sqrt{n(n-1)}},$$
$$-\sqrt{\frac{n-i}{ni}} \leq x_i \leq \sqrt{\frac{i-1}{n(n+1-i)}} \qquad \text{for all } i \in \{2, 3, \ldots, n-1\},$$
$$\frac{1}{\sqrt{n(n-1)}} \leq x_n \leq \sqrt{\frac{n-1}{n}},$$

where all the inequalities are the best possible.


[1.] A.V. BOYD: Bounds for order statistics. Univ. Beograd. Publ. Elektrotehn. Fak. Ser. Mat. Fiz. 380(1971) 31–32.
[2.] D.M. HAWKINS: On the bounds of the range of order statistics. J. Amer. Statist. Assoc. 66(1971) 644–645.
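A quick numerical sanity check of these bounds, not part of the original sheet: the sketch below (assuming numpy is available; the helper name check_bounds is our own) draws random vectors, enforces the two constraints by centering and normalizing, and verifies that every ordered component lies inside the stated interval.

# Illustrative numerical check of the bounds in Problem 1 (not a proof).
import numpy as np

def check_bounds(n, trials=10_000, seed=0):
    rng = np.random.default_rng(seed)
    i = np.arange(1, n + 1)
    lower = -np.sqrt((n - i) / (n * i))              # lower bound for x_i
    upper = np.sqrt((i - 1) / (n * (n + 1 - i)))     # upper bound for x_i
    upper[0] = -1.0 / np.sqrt(n * (n - 1))           # sharper bound for x_1
    lower[-1] = 1.0 / np.sqrt(n * (n - 1))           # sharper bound for x_n
    for _ in range(trials):
        x = rng.normal(size=n)
        x -= x.mean()                  # enforce x_1 + ... + x_n = 0
        x /= np.linalg.norm(x)         # enforce x_1^2 + ... + x_n^2 = 1
        x.sort()
        assert np.all(lower - 1e-12 <= x) and np.all(x <= upper + 1e-12)
    print(f"n = {n}: all {trials} sampled vectors respect the bounds")

check_bounds(5)
check_bounds(20)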

2. [Gamma distribution] The median of the gamma distribution with (positive) parameter x is defined
implicitly by the formula
$$\frac{1}{\Gamma(x)} \int_0^{m(x)} e^{-t} t^{x-1}\, dt = \frac{1}{2} \qquad \text{or} \qquad \int_0^{m(x)} e^{-t} t^{x-1}\, dt = \frac{1}{2} \int_0^{\infty} e^{-t} t^{x-1}\, dt.$$

Prove that the function $x \mapsto m(x)$ is continuous and increasing on $(0, \infty)$, while the function $x \mapsto m(x) - x$
is continuous, decreasing and strictly convex on $(0, \infty)$.
[1.] C. BERG and H.L. PEDERSEN: The Chen-Rubin conjecture in a continuous setting. Methods Appl. Anal. 13(2006) 63–88.
[2.] H. ALZER: A convexity property of the median of the gamma distribution. Statist. Probab. Lett. 76(2006) 1510–1513.
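A numerical illustration of the claimed behaviour, not a proof: assuming numpy and scipy are available, the sketch below computes m(x) as the 50% quantile of the gamma distribution with shape x and unit scale, and inspects monotonicity and convexity through finite differences on a grid.

# Illustrative computation for Problem 2.
import numpy as np
from scipy.stats import gamma

x = np.linspace(0.1, 10.0, 200)
m = gamma.ppf(0.5, x)          # m(x): the median of the gamma(x) distribution
d = m - x

print("m(x) increasing:                      ", np.all(np.diff(m) > 0))
print("m(x) - x decreasing:                  ", np.all(np.diff(d) < 0))
print("m(x) - x convex (2nd differences > 0):", np.all(np.diff(d, 2) > 0))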

3. [Skewness and kurtosis] Let $x_1, x_2, \ldots, x_n$ be real numbers and let $n\alpha_m = x_1^m + x_2^m + \ldots + x_n^m$,
where m is an arbitrary real number. Prove that if $\alpha_1 = 0$ and $\alpha_2 = 1$, then the following inequalities
hold:
$$\alpha_4 \geq \alpha_3^2 + 1 \qquad \text{and} \qquad \alpha_3 \leq \frac{n-2}{\sqrt{n-1}}, \quad \alpha_4 \leq n - 2 + \frac{1}{n-1} \qquad \text{for all } n \geq 2.$$

More generally, prove that if $m = p/q \geq 3$ and p, q are integers with q odd, then for all integers $n \geq 2$ we have
$$\alpha_{2m} \geq \alpha_{m+1}^2 + \alpha_m^2 \qquad \text{and} \qquad \alpha_m \leq \frac{(n-1)^{m-1} + (-1)^m}{n(n-1)^{m/2-1}}.$$
[1.] K. PEARSON: Mathematical contributions to the theory of evolution. XIX: Second supplement to a memoir on skew variation. Phil.
Trans. Roy. Soc. A 216(1916) 432.
[2.] J.E. WILKINS: A note on skewness and kurtosis. Ann. Math. Statistics 15(1944) 333–335.
[3.] M.C. CHAKRABARTI: A note on skewness and kurtosis. Bull. Calcutta Math. Soc. 38(1946) 133–136.
[4.] M. LAKSHMANAMURTI: On the upper bound of $\sum_{i=1}^n x_i^m$ subject to the conditions $\sum_{i=1}^n x_i = 0$ and $\sum_{i=1}^n x_i^2 = n$. Math. Student 18(1950) 111–116.
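A numerical spot check of the first group of inequalities, not part of the original sheet: assuming numpy is available, the sketch samples vectors, rescales them so that α₁ = 0 and α₂ = 1, and tests the three bounds for m = 3, 4.

# Illustrative check of the Problem 3 inequalities for m = 3, 4.
import numpy as np

rng = np.random.default_rng(1)

def alpha(x, m):
    return np.mean(x ** m)             # n * alpha_m = x_1^m + ... + x_n^m

for n in (3, 5, 10, 50):
    for _ in range(5_000):
        x = rng.normal(size=n)
        x -= x.mean()                      # alpha_1 = 0
        x /= np.sqrt(np.mean(x ** 2))      # alpha_2 = 1
        a3, a4 = alpha(x, 3), alpha(x, 4)
        assert a4 >= a3 ** 2 + 1 - 1e-10
        assert a3 <= (n - 2) / np.sqrt(n - 1) + 1e-10
        assert a4 <= n - 2 + 1 / (n - 1) + 1e-10
print("all sampled vectors satisfy the stated inequalities")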


4. [Upper bound for the dispersion] Let f : [a, b] ⊆ R → [0, ∞) be the probability density function of
a random variable X whose expectation and dispersion are respectively given by
$$E(X) = \int_a^b t f(t)\, dt \qquad \text{and} \qquad D(X) = \sqrt{\int_a^b t^2 f(t)\, dt - \left(\int_a^b t f(t)\, dt\right)^2}.$$

Prove that if the expectation and the dispersion of X exist, then D(X) ≤ min{max{|a|, |b|}, b − a}.
[1.] N.K. AGBEKO: Some p.d.f.-free upper bounds for the dispersion $\sigma(X)$ and the quantity $\sigma^2(X) + (x - EX)^2$. JIPAM. J. Inequal. Pure
Appl. Math. 7(5)(2006), Article 186, 3 pp. (electronic).
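A numerical sanity check of the stated bound, not a proof: assuming numpy and scipy are available, the sketch below tries beta densities rescaled to random intervals [a, b] (an arbitrary family of test densities chosen here) and compares the standard deviation with min{max{|a|, |b|}, b − a}.

# Illustrative check of the dispersion bound in Problem 4.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(2)

for _ in range(1_000):
    a = rng.uniform(-5, 5)
    width = rng.uniform(0.1, 10)
    b = a + width
    p, q = rng.uniform(0.5, 5, size=2)     # shape parameters of the beta density
    X = beta(p, q, loc=a, scale=width)     # a density supported on [a, b]
    assert X.std() <= min(max(abs(a), abs(b)), b - a) + 1e-12
print("D(X) <= min{max{|a|, |b|}, b - a} held for every sampled density")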

5. [Moments and central moments] Let n be an arbitrary natural number and let f : [a, b] ⊆ R →
[0, ∞) be the probability density function of a random variable X whose expectation and dispersion are
respectively given by
$$E(X) = \int_a^b t f(t)\, dt \qquad \text{and} \qquad D(X) = \sqrt{\int_a^b t^2 f(t)\, dt - \left(\int_a^b t f(t)\, dt\right)^2}.$$

By definition, the nth moment (about zero) and the nth central moment of a probability
density function f are the expected values of $X^n$ and of $(X - E(X))^n$, respectively, i.e.
$$\alpha_n = E(X^n) = \int_a^b t^n f(t)\, dt \qquad \text{and} \qquad \mu_n = E((X - E(X))^n) = \int_a^b (t - \alpha_1)^n f(t)\, dt.$$
For n = 1 we have $\alpha_1 = E(X)$, and for n = 2 we have $\mu_2 = D^2(X)$. Moreover, the skewness (or the so-called
third standardized moment) is written as $\gamma_1$ and defined as $\gamma_1 = \mu_3/\sigma^3$, where $\mu_3$ is the third moment
about the mean and $\sigma = D(X)$ is the standard deviation. The kurtosis, or the fourth standardized moment,
is defined as $\mu_4/\sigma^4$, where $\mu_4$ is the fourth moment about the mean and $\sigma$ is the standard deviation.
Kurtosis is more commonly defined as the fourth cumulant divided by the square of the variance of the
probability distribution, $\gamma_2 = \mu_4/\sigma^4 - 3$, which is known as excess kurtosis. The minus 3 at the end of this
formula is often explained as a correction to make the kurtosis of the normal distribution equal to zero.
Problem: find the mean, median, mode, variance, skewness, excess kurtosis, nth moment and nth cen-
tral moment of the following continuous distributions: beta, uniform, rectangular, KUMARASWAMY,
logarithmic, triangular, truncated normal, chi, non-central chi, chi-square, non-central chi-square, ex-
ponential, FISHER-SNEDECOR, non-central FISHER-SNEDECOR, gamma, ERLANG, half-normal,
LÉVY, logistic, log-logistic, log-normal, PARETO, RAYLEIGH, RICE, WEIBULL, CAUCHY, GUM-
BEL, FISHER-TIPPETT, LAPLACE, normal, STUDENT, MAXWELL, VON MISES.
[1.] http://en.wikipedia.org/wiki/Probability_distribution
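As a computational companion to this problem (an illustrative sketch, assuming scipy is available), the snippet below queries mean, variance, skewness and excess kurtosis, plus one raw moment, for a handful of the listed distributions; the remaining laws can be handled the same way through their scipy.stats counterparts.

# Illustrative starting point for Problem 5.
from scipy import stats

distributions = {
    "uniform (rectangular)": stats.uniform(),
    "exponential": stats.expon(),
    "gamma (k = 3)": stats.gamma(3),
    "chi-square (5 d.o.f.)": stats.chi2(5),
    "normal": stats.norm(),
    "Rayleigh": stats.rayleigh(),
    "Laplace": stats.laplace(),
    "Gumbel": stats.gumbel_r(),
}

for name, d in distributions.items():
    mean, var, skew, ex_kurt = d.stats(moments="mvsk")    # "k" is excess kurtosis
    print(f"{name:>22}: mean={float(mean):.4f}  var={float(var):.4f}  "
          f"skew={float(skew):.4f}  excess kurtosis={float(ex_kurt):.4f}  "
          f"3rd raw moment={d.moment(3):.4f}")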

6. [WALLIS' inequality and WALLIS' formula] Let Γ be the well-known EULER gamma function.
Prove that for all integers $n \geq 1$ the following improved WALLIS' inequality holds
$$\frac{1}{\sqrt{\pi(n + \mu_1)}} \leq \frac{\Gamma\!\left(n + \tfrac{1}{2}\right)}{\sqrt{\pi}\,\Gamma(n + 1)} < \frac{1}{\sqrt{\pi(n + \mu_2)}}, \qquad \text{where} \qquad \frac{\Gamma\!\left(n + \tfrac{1}{2}\right)}{\sqrt{\pi}\,\Gamma(n + 1)} = \frac{(2n-1)!!}{(2n)!!},$$
and the constants $\mu_1 = 4/\pi - 1$ and $\mu_2 = 1/4$ are the best possible. Using this WALLIS' inequality prove
the following WALLIS' formulae
$$\frac{\pi}{2} = \prod_{n \geq 1} \frac{(2n)^2}{(2n-1)(2n+1)} = \lim_{n \to \infty} \left(\frac{(2n)!!}{(2n-1)!!}\right)^2 \frac{1}{2n+1} \qquad \text{and} \qquad \pi = \sum_{n \geq 0} \frac{(n!)^2\, 2^{n+1}}{(2n+1)!}.$$
[1.] C.P. CHEN and F. QI: The best bounds in Wallis' inequality. Proc. Amer. Math. Soc. 133(2005) 397–401.
[2.] S. KOUMANDOS: Remarks on a paper by Chao-Ping Chen and Feng Qi. Proc. Amer. Math. Soc. 134(2005) 1365–1367.
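A quick numerical check of the improved inequality, not part of the original sheet: assuming numpy and scipy are available, the sketch evaluates the ratio Γ(n + 1/2)/(√π Γ(n + 1)) through log-gamma and compares it with both bounds; a tiny tolerance absorbs rounding at n = 1, where the left bound is attained.

# Illustrative verification of the improved Wallis inequality (Problem 6).
import numpy as np
from scipy.special import gammaln

mu1, mu2 = 4 / np.pi - 1, 0.25

for n in range(1, 1_001):
    # (2n-1)!!/(2n)!! = Gamma(n + 1/2) / (sqrt(pi) * Gamma(n + 1))
    ratio = np.exp(gammaln(n + 0.5) - gammaln(n + 1)) / np.sqrt(np.pi)
    lo = 1 / np.sqrt(np.pi * (n + mu1))
    hi = 1 / np.sqrt(np.pi * (n + mu2))
    assert lo <= ratio + 1e-12 and ratio < hi
print("inequality verified for n = 1, ..., 1000")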

7. [Monotone form of l'HOSPITAL's rule] For $a, b \in \mathbb{R}$ let $f_1, f_2 : [a, b] \to \mathbb{R}$ be continuous on $[a, b]$,
and differentiable on $(a, b)$. Further let $f_2'(x) \neq 0$ for all $x \in (a, b)$. Prove that if $f_1'/f_2'$ is increasing
(decreasing) on $(a, b)$, then so are
$$x \mapsto \frac{f_1(x) - f_1(a)}{f_2(x) - f_2(a)} \qquad \text{and} \qquad x \mapsto \frac{f_1(x) - f_1(b)}{f_2(x) - f_2(b)}.$$
[1.] G.D. ANDERSON, M.K. VAMANAMURTHY and M. VUORINEN: Inequalities for quasiconformal mappings in space. Pacific J. Math.
160(1)(1993) 1–18.

[2.] G.D. ANDERSON, M.K. VAMANAMURTHY and M. VUORINEN: Conformal Invariants, Inequalities, and Quasiconformal Maps. John
Wiley & Sons, New York, 1997.
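A small numerical illustration of the rule, not part of the original sheet and purely our own choice of example: take f₁(x) = x² and f₂(x) = sin x on (0, π/2), so that f₁'/f₂' = 2x/cos x is increasing; assuming numpy is available, both quotient functions should then be increasing on a grid.

# Illustrative example for the monotone form of l'Hospital's rule (Problem 7).
import numpy as np

a, b = 0.0, np.pi / 2
x = np.linspace(a + 1e-3, b - 1e-3, 1_000)

f1, f2 = x ** 2, np.sin(x)
q_a = (f1 - a ** 2) / (f2 - np.sin(a))     # (f1(x) - f1(a)) / (f2(x) - f2(a))
q_b = (f1 - b ** 2) / (f2 - np.sin(b))     # (f1(x) - f1(b)) / (f2(x) - f2(b))

print("quotient anchored at a is increasing:", np.all(np.diff(q_a) > 0))
print("quotient anchored at b is increasing:", np.all(np.diff(q_b) > 0))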

8. [Log-concave distributions] Let $f : [a, b] \subseteq \mathbb{R} \to [0, \infty)$ be a continuously differentiable probability
density function. Moreover, let us consider the cumulative distribution function $F : [a, b] \to [0, 1]$,
the survival function $\overline{F} : [a, b] \to [0, 1]$, the left hand integral of the cumulative distribution function
$G : [a, b] \to [0, \infty)$, the right hand integral of the reliability function $H : [a, b] \to [0, \infty)$ and the failure
rate $r : [a, b] \to [0, \infty)$, defined by
$$F(x) = \int_a^x f(t)\, dt, \quad \overline{F}(x) = \int_x^b f(t)\, dt, \quad G(x) = \int_a^x F(t)\, dt, \quad H(x) = \int_x^b \overline{F}(t)\, dt, \quad r(x) = \frac{f(x)}{\overline{F}(x)}.$$
[Using the monotone form of l'HOSPITAL's rule] Prove that the following implications are true:
a. f is log-concave =⇒ F is log-concave =⇒ G is log-concave.
b. f(a) = 0 and f is log-convex =⇒ F is log-convex =⇒ G is log-convex.
c. f is log-concave =⇒ $\overline{F}$ is log-concave =⇒ H is log-concave.
d. f(b) = 0 and f is log-convex =⇒ $\overline{F}$ is log-convex =⇒ H is log-convex.
e. f is log-concave (log-convex) =⇒ r is increasing (decreasing).

[1.] M. BAGNOLI and T. BERGSTROM: Log-concave probability and its applications. Econ. Theory 26(2)(2005) 445–469.
[2.] S. ANDRÁS and Á. BARICZ: Properties of the probability density function of the non-central chi-squared distribution. J. Math. Anal.
Appl. 346(2)(2009) 395–402.
[3.] Á. BARICZ: Geometrically concave univariate distributions. J. Math. Anal. Appl. 363(1)(2010) 182–196.
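A numerical illustration for one concrete log-concave density, not a proof: assuming numpy and scipy are available, the sketch takes the standard normal density on [−4, 4] and checks the middle links of implications a and c (log-concavity of F and of the survival function, via second differences) and the conclusion of e (an increasing failure rate).

# Illustrative check of Problem 8 for the (log-concave) standard normal density.
import numpy as np
from scipy.stats import norm

x = np.linspace(-4, 4, 2_000)
f, F = norm.pdf(x), norm.cdf(x)
S = norm.sf(x)                    # survival function 1 - F
r = f / S                         # failure (hazard) rate

print("log F concave:          ", np.all(np.diff(np.log(F), 2) < 0))
print("log survival concave:   ", np.all(np.diff(np.log(S), 2) < 0))
print("failure rate increasing:", np.all(np.diff(r) > 0))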

9. [Reliability function] Let X be a continuous random variable and suppose that its expectation
exists. Prove that if x tends to infinity, then x(1 − F (x)) tends to zero, where F : [0, ∞) → [0, 1] is the
cumulative distribution function of the random variable X. Using this property prove that
$$E(X) = \int_0^{\infty} \overline{F}(t)\, dt = \int_0^{\infty} (1 - F(t))\, dt,$$

where $\overline{F} : [0, \infty) \to [0, 1]$, defined by $\overline{F}(t) = 1 - F(t)$, is the survival (or reliability) function of the random
variable X.
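A numerical companion, not a proof: assuming numpy and scipy are available, the sketch below takes a gamma random variable (an arbitrary nonnegative example), integrates its survival function over [0, ∞), compares the result with the mean, and shows x(1 − F(x)) dying out.

# Illustrative check of Problem 9 for a gamma random variable.
import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

X = gamma(a=2.5, scale=1.3)              # an arbitrary nonnegative example
integral, _ = quad(X.sf, 0, np.inf)      # integral of the survival function

print("E(X)                      =", X.mean())
print("integral of 1 - F over R+ =", integral)
for x in (10, 20, 40):
    print(f"x (1 - F(x)) at x = {x}: {x * X.sf(x):.3e}")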

10. [NBU property] Let us consider the continuously differentiable function ϕ : [0, ∞) → (0, ∞). Prove
that if ϕ(0) ≥ 1 and ϕ is log-concave, then for all x, y ≥ 0 we have ϕ(x + y) ≤ ϕ(x)ϕ(y). Moreover, if
ϕ(0) ≤ 1 and ϕ is log-convex, then the above inequality is reversed.
Now, let f be a continuously differentiable probability density function which has support [0, ∞).
Using the above result prove that if f is log-concave, then for all x, y ≥ 0 we have $\overline{F}(x + y) \leq \overline{F}(x)\overline{F}(y)$,
where $\overline{F} : [0, \infty) \to [0, 1]$, defined by $\overline{F}(x) = 1 - F(x)$, is the survival (or reliability) function of the random
variable X. Moreover, if f is log-convex, then the above inequality is reversed. The above inequality is
called in economic theory the new-is-better-than-used property, since if X is the time of death of a
physical object, then the probability $P(X \geq x) = \overline{F}(x)$ that a new unit will survive to age x is greater
than the probability
$$\frac{P(X \geq x + y)}{P(X \geq y)} = \frac{\overline{F}(x + y)}{\overline{F}(y)}$$
that a unit which has already survived to age y will survive for an additional time x.
[1.] M.Y. AN: Log-concave probability distributions: Theory and statistical testing. Technical report, Economics Department, Duke University,
Durham, NC 27708-0097, 1995.
[2.] Á. BARICZ: A functional inequality for the survival function of the gamma distribution. JIPAM. J. Inequal. Pure Appl. Math. 9(1)
(2008), art. 13, 5 pp. (electronic).
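A grid-based numerical check of the NBU inequality for one log-concave example, not a proof: assuming numpy and scipy are available, the sketch uses the gamma density with shape 2 (log-concave on [0, ∞)) and verifies the submultiplicativity of its survival function over a grid of pairs (x, y).

# Illustrative check of the NBU inequality (Problem 10) for a gamma density.
import numpy as np
from scipy.stats import gamma

X = gamma(a=2.0)                   # shape 2: a log-concave density on [0, oo)
grid = np.linspace(0, 15, 300)
x, y = np.meshgrid(grid, grid)

lhs = X.sf(x + y)                  # survival function evaluated at x + y
rhs = X.sf(x) * X.sf(y)
print("NBU inequality holds on the grid:", np.all(lhs <= rhs + 1e-15))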
