Estimation of the Inverse Gaussian Distribution Function
Author(s): Raj S. Chhikara and J. Leroy Folks
Source: Journal of the American Statistical Association, Vol. 69, No. 345 (Mar., 1974), pp. 250-254
Published by: Taylor & Francis, Ltd. on behalf of the American Statistical Association
Stable URL: https://www.jstor.org/stable/2285537
Estimation of the Inverse Gaussian Distribution Function

RAJ S. CHHIKARA and J. LEROY FOLKS*

* Raj S. Chhikara is research scientist and lecturer, Graduate Program in Mathematical Sciences, University of Texas at Dallas, Dallas, Tex. 75230. J. Leroy Folks is professor and chairman, Department of Statistics, Oklahoma State University, Stillwater, Okla. 74074. © Journal of the American Statistical Association, March 1974, Volume 69, Number 345, Theory and Methods Section.

Minimum variance unbiased estimates of the inverse Gaussian distribution function for all possible cases are given. A direct relationship is established between its density function and the normal density function, which throws more light on its salient features and possibly on its application in statistical inference. It is shown that the estimates are very similar in nature to those of the normal distribution and can be evaluated from the normal and Student's t distribution tables.

1. INTRODUCTION

Estimation of the distribution function F(x) at any point x is of considerable importance, particularly in quality control and reliability work. In quality control, the fraction of acceptable product is given by F(x) or 1 − F(x) in the case of one-sided specification limits and by F(x) − F(y) in the case of two-sided specification limits. In reliability work, one is usually interested in the reliability function 1 − F(x), the probability of failure-free operation of a certain device during time x.

Although not completely satisfactory, minimum variance unbiased estimates (MVUE) of the distribution function have a certain appeal. In this article we present such estimates for the inverse Gaussian distribution, which is suitable as a stochastic model for certain skewed populations. It should be considered as a possible alternative model to the Weibull, gamma, generalized gamma, or lognormal distributions.

The method of finding the MVUE is that considered by Kolmogorov [7] for obtaining such estimates for the normal distribution function. It utilizes the Rao-Blackwell theorem: given a complete sufficient statistic T for θ and an unbiased estimate ĝ of a parametric function g(θ), the MVUE of g(θ) is given by E[ĝ | T]. The choice of initial unbiased estimate is purely one of convenience. For the sample X₁, X₂, ..., Xₙ, a convenient such estimate of F(x) is given by

ĝ = 1 if X₁ < x,
  = 0 otherwise.

Using the same method, estimates have been given for the normal distribution by Lieberman and Resnikoff [10], for the binomial, Poisson and normal distributions by Barton [1], and for the exponential model by Laurent [8]. Tate [14] considered this problem for distributions involving location and scale parameters and found estimates for the exponential, gamma and Weibull distributions by using the Laplace-transform method, which does not require having an initial unbiased estimate. Utilizing a similar technique, Washio, Morimoto and Ikeda [17] treated the problem for the one-parameter exponential family. Later Basu [2] briefly summarized some of these cases.

Recently Sathe and Varde [12] and Eaton and Morris [4] have derived such estimates by considering ancillary statistics. The same approach is adopted in [9, Theorem 1, Ch. 5], in the slightly different context of hypothesis testing, where one is concerned with deriving a uniformly most powerful unbiased test. Although this approach is elegant in simplifying the algorithm underlying the Rao-Blackwell theorem or the Neyman structure of a test, there is a difficulty in determining appropriate ancillary statistics.

We give MVUE's of the inverse Gaussian distribution function evaluated at x₀, F(x₀), for all possible cases. The estimates are given in forms analogous to those for the normal distribution function given by Folks, Pierce and Stewart [5]. Evaluation of the distribution function F(x) when both parameters are known is also given to complete the analogy.

2. THE INVERSE GAUSSIAN DISTRIBUTION

A random variable X is distributed as inverse Gaussian, denoted as X ~ I(μ, λ), if its density function is given by

f(x; μ, λ) = [λ/(2πx³)]^{1/2} exp[−λ(x − μ)²/(2μ²x)], x > 0,
           = 0, otherwise,   (2.1)

where μ and λ are assumed to be positive. The density function is skewed and unimodal. The characteristic function of X is φ(t) = exp{(λ/μ)[1 − (1 − 2itμ²/λ)^{1/2}]}. Consequently, it follows that all moments exist and, in particular, one gets

E[X] = μ and Var[X] = μ³/λ.

Furthermore, if X₁, X₂, ..., Xₙ are i.i.d. I(μ, λ), then X̄ = Σᵢ₌₁ⁿ Xᵢ/n ~ I(μ, nλ). For more details on the inverse Gaussian distribution, see [6, Ch. 15], [15] and [16].
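As a computational aside (not part of the original article), the density (2.1) and the stated moments can be checked numerically. The following is a minimal Python sketch, assuming NumPy and SciPy are available; the mapping to scipy.stats.invgauss's parameterization is our own annotation, not something stated in the paper.

```python
import numpy as np
from scipy import integrate, stats

def ig_pdf(x, mu, lam):
    """Inverse Gaussian density f(x; mu, lam) of (2.1); valid for x > 0."""
    return np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu)**2 / (2 * mu**2 * x))

mu, lam = 2.0, 3.0

# The density integrates to 1 and reproduces E[X] = mu, Var[X] = mu^3 / lam.
total, _ = integrate.quad(lambda x: ig_pdf(x, mu, lam), 0, np.inf)
mean,  _ = integrate.quad(lambda x: x * ig_pdf(x, mu, lam), 0, np.inf)
m2,    _ = integrate.quad(lambda x: x**2 * ig_pdf(x, mu, lam), 0, np.inf)
print(total, mean, m2 - mean**2)   # ~1, ~2, ~ mu^3/lam = 8/3

# SciPy uses shape mu/lam and scale lam: I(mu, lam) == invgauss(mu/lam, scale=lam).
x = np.linspace(0.01, 10, 5)
print(np.allclose(ig_pdf(x, mu, lam), stats.invgauss.pdf(x, mu / lam, scale=lam)))
```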


Zigangirov [20] and later Shuster [13] expressed the inverse Gaussian distribution function F(x) in terms of the normal distribution function Φ(x). However, for the sake of brevity and greater insight, we give another proof of their result by establishing a direct relationship between the inverse Gaussian and the normal density functions in the following:

Theorem: Let X ~ I(μ, λ) and Y = √λ (X − μ)/(μ√X). Then the density function of Y is given by

g(y; λ/μ) = [1 − y/(4λ/μ + y²)^{1/2}] (2π)^{−1/2} e^{−y²/2}, −∞ < y < ∞.   (2.2)

Proof: The transformation y = √λ (x − μ)/(μ√x) is one-to-one, and as x varies from 0 to ∞, y varies from −∞ to ∞. It can be seen that

x = [μ/(2λ)][(2λ + μy²) + y(4λμ + μ²y²)^{1/2}]

and

dx/dy = 2μx^{3/2}/[√λ (x + μ)].

Then the density function of Y is given by

g(y; μ, λ) = f(x; μ, λ) |dx/dy| = [2μ/(x + μ)] (2π)^{−1/2} e^{−y²/2}.

After further simplification, using (x − μ)/(x + μ) = y/(4λ/μ + y²)^{1/2}, we obtain the density function of Y as given in (2.2). This completes the proof.

Due to the preceding theorem, to evaluate the inverse Gaussian distribution function F(x), one can write

F(x) = ∫_{−∞}^{y} (2π)^{−1/2} e^{−z²/2} dz − ∫_{−∞}^{y} [z/(4λ/μ + z²)^{1/2}] (2π)^{−1/2} e^{−z²/2} dz,   (2.3)

where y = √λ (x − μ)/(μ√x). To evaluate the second term on the right side of (2.3), let u = (4λ/μ + z²)^{1/2}. Then

2nd term = −(2π)^{−1/2} e^{2λ/μ} ∫_{(4λ/μ+y²)^{1/2}}^{∞} e^{−u²/2} du, if y < 0,

         = −(2π)^{−1/2} e^{2λ/μ} [∫_{(4λ/μ)^{1/2}}^{∞} e^{−u²/2} du − ∫_{(4λ/μ)^{1/2}}^{(4λ/μ+y²)^{1/2}} e^{−u²/2} du], if y > 0,

so that in either case

2nd term = −e^{2λ/μ} Φ[−(4λ/μ + y²)^{1/2}], −∞ < y < ∞,

where Φ stands for the cumulative standard normal distribution function. Hence,

F(x) = Φ(y) + e^{2λ/μ} Φ[−(4λ/μ + y²)^{1/2}],   (2.4)

where y = √λ (x − μ)/(μ√x), or

F(x) = Φ[√(λ/x)(x/μ − 1)] + e^{2λ/μ} Φ[−√(λ/x)(x/μ + 1)].   (2.5)

Thus when both parameters μ and λ are known, the inverse Gaussian distribution function can be evaluated using the normal distribution table.
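A short Python sketch (ours, not the paper's) of (2.5), assuming SciPy. The factor e^{2λ/μ} can overflow for large λ/μ, so it is paired with the small normal tail on the log scale; scipy.stats.invgauss serves only as an independent cross-check.

```python
import numpy as np
from scipy import stats

def ig_cdf(x, mu, lam):
    """F(x) for X ~ I(mu, lam) via (2.5), using only the standard normal CDF."""
    x = np.asarray(x, dtype=float)
    a = np.sqrt(lam / x) * (x / mu - 1.0)
    b = -np.sqrt(lam / x) * (x / mu + 1.0)
    # exp(2*lam/mu) * Phi(b), computed on the log scale to avoid overflow
    tail = np.exp(2.0 * lam / mu + stats.norm.logcdf(b))
    return stats.norm.cdf(a) + tail

mu, lam = 2.0, 3.0
x = np.array([0.5, 1.0, 2.0, 5.0])
print(ig_cdf(x, mu, lam))
print(stats.invgauss.cdf(x, mu / lam, scale=lam))  # reference values agree
```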

3. ESTIMATION OF F(x₀)

3.1 MVUE of F(x₀) When λ Is Known

The sample mean X̄ = Σᵢ₌₁ⁿ Xᵢ/n ~ I(μ, nλ) is a complete sufficient statistic for μ. So, to obtain the MVUE of F(x₀) given by P[X₁ < x₀ | X̄], we need to find the conditional distribution of X₁ given X̄.

The joint density function of the random variables X₁ and Ȳ = Σᵢ₌₂ⁿ Xᵢ/(n − 1) is

f(x₁, ȳ) = [(n−1)^{1/2} λ/(2π)] (x₁³ȳ³)^{−1/2} exp{−[λ/(2μ²)][(x₁ − μ)²/x₁ + (n−1)(ȳ − μ)²/ȳ]}.

With the transformation ȳ = (nx̄ − x₁)/(n − 1), we obtain the joint density function of X₁ and X̄ as

f(x₁, x̄) = [n(n−1)λ/(2π)] [x₁(nx̄ − x₁)]^{−3/2} exp{−[λ/(2μ²)][(x₁ − μ)²/x₁ + (n(x̄ − μ) − (x₁ − μ))²/(nx̄ − x₁)]}, 0 < x₁ < nx̄.

We already know that the density function of X̄ is

g(x̄) = [nλ/(2πx̄³)]^{1/2} exp[−nλ(x̄ − μ)²/(2μ²x̄)], x̄ > 0.

So, given x̄, the conditional density function h(x₁|x̄) of X₁, given by f(x₁, x̄)/g(x̄), is

h(x₁|x̄) = (n−1)[nλx̄³/(2π)]^{1/2} [x₁(nx̄ − x₁)]^{−3/2} exp[−nλ(x₁ − x̄)²/(2x₁x̄(nx̄ − x₁))], 0 < x₁ < nx̄.   (3.1)

Hence, we derive the MVUE of F(x₀) as

F̂(x₀) = ∫₀^{x₀} h(x₁|x̄) dx₁, x₀ < nx̄,   (3.2)

with h(x₁|x̄) in (3.1) and F̂(x₀) = 1 for all x₀ ≥ nx̄.
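Equation (3.2) can be evaluated directly by numerical quadrature of (3.1); this also provides a check on the closed form (3.4) derived below. The following is a minimal sketch (ours, not from the paper), assuming SciPy; the sample values are hypothetical.

```python
import numpy as np
from scipy import integrate

def h_cond(x1, xbar, n, lam):
    """Conditional density h(x1 | xbar) of (3.1); support is 0 < x1 < n*xbar."""
    c = (n - 1) * np.sqrt(n * lam * xbar**3 / (2 * np.pi))
    q = x1 * (n * xbar - x1)
    return c * q**-1.5 * np.exp(-n * lam * (x1 - xbar)**2 / (2 * xbar * q))

def F_hat(x0, xbar, n, lam):
    """MVUE of F(x0) by numerical integration of (3.2)."""
    if x0 <= 0:
        return 0.0
    if x0 >= n * xbar:
        return 1.0
    val, _ = integrate.quad(h_cond, 0.0, x0, args=(xbar, n, lam))
    return val

n, xbar, lam = 5, 1.8, 3.0
total, _ = integrate.quad(h_cond, 0.0, n * xbar, args=(xbar, n, lam))
print(total)                  # ~1: (3.1) is a proper density on (0, n*xbar)
print(F_hat(1.0, xbar, n, lam))
```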


Next, to simplify the right side of (3.2), let

w = (nλ)^{1/2}(x₁ − x̄)/[x₁x̄(nx̄ − x₁)]^{1/2}.

This is a one-to-one transformation, and w varies from −∞ to ∞ as x₁ varies from 0 to nx̄. Now, by following a technique similar to that of the theorem in Section 2, one can deduce that

F̂(x₀) = (2π)^{−1/2} ∫_{−∞}^{w₀} {1 − [(n−2)/n] w/(4λ/μ′ + w²)^{1/2}} e^{−w²/2} dw,   (3.3)

where

w₀ = (nλ)^{1/2}(x₀ − x̄)/[x₀x̄(nx̄ − x₀)]^{1/2}

and μ′ = nx̄/(n − 1). Now recognizing the similarity between (3.3) and (2.3), and thereby using the result in (2.4), we obtain

F̂(x₀) = Φ(w₀) + [(n−2)/n] e^{2(n−1)λ/nx̄} Φ(−w₀′), x₀ < nx̄.

Hence, the MVUE of F(x₀) is

F̂(x₀) = 0 for x₀ ≤ 0; = Φ(w₀) + [(n−2)/n] e^{2(n−1)λ/nx̄} Φ(−w₀′) for 0 < x₀ < nx̄; = 1 for x₀ ≥ nx̄,   (3.4)

where

w₀ = (nλ)^{1/2}(x₀ − x̄)/[x₀x̄(nx̄ − x₀)]^{1/2} and w₀′ = (λ/n)^{1/2}[nx̄ + (n−2)x₀]/[x₀x̄(nx̄ − x₀)]^{1/2}.   (3.5)
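In code, (3.4)-(3.5) need only the standard normal distribution function. A sketch (ours), assuming SciPy, guarded against overflow in the exponential factor as before:

```python
import numpy as np
from scipy import stats

def F_mvue_lambda_known(x0, xbar, n, lam):
    """MVUE (3.4)-(3.5) of F(x0) when lambda is known, computed from xbar alone."""
    if x0 <= 0:
        return 0.0
    if x0 >= n * xbar:
        return 1.0
    d = np.sqrt(x0 * xbar * (n * xbar - x0))
    w0  = np.sqrt(n * lam) * (x0 - xbar) / d
    w0p = np.sqrt(lam / n) * (n * xbar + (n - 2) * x0) / d
    # combine exp and the normal tail on the log scale to avoid overflow
    term = ((n - 2) / n) * np.exp(2 * (n - 1) * lam / (n * xbar) + stats.norm.logcdf(-w0p))
    return stats.norm.cdf(w0) + term

# Agrees with numerically integrating (3.1)-(3.2) for the same inputs:
print(F_mvue_lambda_known(1.0, 1.8, 5, 3.0))
```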
3.2 MVUE of F(x₀) When μ Is Known

The statistic T = Σᵢ₌₁ⁿ (Xᵢ − μ)²/Xᵢ is completely sufficient for λ. Also, it is easy to prove that the characteristic function of (X − μ)²/X is (1 − 2itμ²/λ)^{−1/2}, which is like that of (μ²/λ)χ²(1). Hence, the distribution of T is like that of (μ²/λ)χ²(n). Once again we want to find the conditional distribution of X₁ given T.

The joint density function of the random variables X₁ and T can be obtained as

f(x₁, t) = [λ/(2πx₁³)]^{1/2} [λ/(2μ²)]^{(n−1)/2} [Γ((n−1)/2)]^{−1} [t − (x₁ − μ)²/x₁]^{(n−3)/2} e^{−λt/2μ²}, 0 < (x₁ − μ)²/x₁ < t.

The density function of T is

g(t) = [λ/(2μ²)]^{n/2} [Γ(n/2)]^{−1} t^{(n−2)/2} e^{−λt/2μ²}, t > 0.

So the conditional density function of X₁, given T = t, is

h(x₁|t) = [Γ(n/2)/(Γ(1/2)Γ((n−1)/2))] μ t^{−1/2} x₁^{−3/2} [1 − (x₁ − μ)²/(tx₁)]^{(n−3)/2}, 0 < (x₁ − μ)²/x₁ < t.   (3.6)

We now obtain the MVUE of F(x₀) given by

F̂(x₀) = ∫_{½[2μ+t−(4μt+t²)^{1/2}]}^{x₀} h(x₁|t) dx₁, ½[2μ + t − (4μt + t²)^{1/2}] ≤ x₀ ≤ ½[2μ + t + (4μt + t²)^{1/2}],   (3.7)

with h(x₁|t) in (3.6); F̂(x₀) = 0 for all x₀ below this interval and F̂(x₀) = 1 for all x₀ > ½[2μ + t + (4μt + t²)^{1/2}].

Next, by considering the transformation

w = (n−1)^{1/2}(x₁ − μ)/[tx₁ − (x₁ − μ)²]^{1/2}

in (3.7), one can show that

F̂(x₀) = ∫_{−∞}^{w₀} [Γ(n/2)/(Γ((n−1)/2)(π(n−1))^{1/2})] {1 − w/[w² + (4μ(n−1)/t)(1 + w²/(n−1))]^{1/2}} [1 + w²/(n−1)]^{−n/2} dw,   (3.8)

where w₀ = (n−1)^{1/2}(x₀ − μ)[tx₀ − (x₀ − μ)²]^{−1/2}. Then, after making the transformation

u = [w² + (4μ(n−1)/t)(1 + w²/(n−1))]^{1/2}

in the second term on the right side of (3.8), the MVUE of F(x₀) can be derived as

F̂(x₀) = 0 for x₀ < ½[(2μ + t) − (4μt + t²)^{1/2}]; = 1 for x₀ > ½[(2μ + t) + (4μt + t²)^{1/2}]; and otherwise

F̂(x₀) = F_{t,n−1}(w₀) + [(t + 4μ)/t]^{(n−2)/2} F_{t,n−1}(−w₀′),   (3.9)

where

w₀ = (n−1)^{1/2}(x₀ − μ)/[tx₀ − (x₀ − μ)²]^{1/2}, w₀′ = (n−1)^{1/2}(x₀ + μ)/[tx₀ − (x₀ − μ)²]^{1/2},   (3.10)

and F_{t,n−1} denotes the cumulative Student's t distribution function with n − 1 degrees of freedom.
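A Python sketch (ours, not the paper's) of (3.9)-(3.10), assuming SciPy, together with a quadrature cross-check against the conditional density (3.6); the numerical inputs are hypothetical.

```python
import numpy as np
from scipy import stats, integrate
from scipy.special import gamma

def F_mvue_mu_known(x0, t, n, mu):
    """MVUE (3.9)-(3.10) of F(x0) when mu is known; t = sum_i (Xi - mu)^2 / Xi."""
    disc = np.sqrt(4 * mu * t + t**2)
    lo, hi = (2 * mu + t - disc) / 2, (2 * mu + t + disc) / 2
    if x0 <= lo:
        return 0.0
    if x0 >= hi:
        return 1.0
    d = np.sqrt(t * x0 - (x0 - mu)**2)
    w0  = np.sqrt(n - 1) * (x0 - mu) / d
    w0p = np.sqrt(n - 1) * (x0 + mu) / d
    coef = ((t + 4 * mu) / t) ** ((n - 2) / 2)
    # stats.t.sf(w, df) = 1 - F_t(w) = F_t(-w) by symmetry of Student's t
    return stats.t.cdf(w0, n - 1) + coef * stats.t.sf(w0p, n - 1)

def h_cond(x1, t, n, mu):
    """Conditional density (3.6), for cross-checking by quadrature."""
    c = gamma(n / 2) / (gamma(0.5) * gamma((n - 1) / 2))
    return c * mu / (np.sqrt(t) * x1**1.5) * (1 - (x1 - mu)**2 / (t * x1)) ** ((n - 3) / 2)

n, mu, t, x0 = 6, 2.0, 1.5, 1.8
lo = (2 * mu + t - np.sqrt(4 * mu * t + t**2)) / 2
num, _ = integrate.quad(h_cond, lo, x0, args=(t, n, mu))
print(F_mvue_mu_known(x0, t, n, mu), num)   # the two values should agree
```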


3.3 MVUE of F(x₀) When Both Parameters Are Unknown

The statistic T = (X̄, V), where X̄ = Σᵢ₌₁ⁿ Xᵢ/n and V = Σᵢ₌₁ⁿ (1/Xᵢ − 1/X̄), forms a complete sufficient statistic for (μ, λ). Tweedie [15] showed that X̄ and V are stochastically independent, with X̄ ~ I(μ, nλ) and V distributed like (1/λ)χ²(n − 1). To find the MVUE of F(x₀), we will first obtain the conditional distribution of X₁ given T.

Let

Ȳ = [1/(n − 1)] Σᵢ₌₂ⁿ Xᵢ and V₁ = Σᵢ₌₂ⁿ (1/Xᵢ − 1/Ȳ).

Then the joint density of the random variables X₁, Ȳ and V₁ is

f(x₁, ȳ, v₁) = [(n−1)^{1/2} λ^{n/2}/(2^{n/2} π Γ((n−2)/2))] (x₁³ȳ³)^{−1/2} v₁^{(n−4)/2} exp{−(λ/2)[(x₁ − μ)²/(μ²x₁) + (n−1)(ȳ − μ)²/(μ²ȳ) + v₁]}.

By considering the transformation

x̄ = [(n−1)ȳ + x₁]/n, v = v₁ + (n−1)/ȳ + 1/x₁ − n/x̄,

the joint density function of X₁, X̄ and V is obtained as

f(x₁, x̄, v) = [n(n−1)λ^{n/2}/(2^{n/2} π Γ((n−2)/2))] [x₁(nx̄ − x₁)]^{−3/2} [v − n(x₁ − x̄)²/(x₁x̄(nx̄ − x₁))]^{(n−4)/2} exp{−(λ/2)[v + n(x̄ − μ)²/(μ²x̄)]},

for 0 < x₁ < nx̄ and 0 < n(x₁ − x̄)²/(x₁x̄(nx̄ − x₁)) < v.

The joint density function of the statistics X̄ and V is

g(x̄, v) = [nλ/(2πx̄³)]^{1/2} e^{−nλ(x̄−μ)²/2μ²x̄} [λ^{(n−1)/2}/(2^{(n−1)/2} Γ((n−1)/2))] v^{(n−3)/2} e^{−λv/2}, x̄ > 0, v > 0.

Hence, the conditional density function of X₁, given T = (x̄, v), is obtained as

h(x₁|x̄, v) = [Γ((n−1)/2)/(Γ((n−2)/2)Γ(1/2))] (n−1)(nx̄³/v)^{1/2} [x₁(nx̄ − x₁)]^{−3/2} [1 − n(x₁ − x̄)²/(vx₁x̄(nx̄ − x₁))]^{(n−4)/2}, L < x₁ < U,   (3.11)

where

L = {x̄/[2(n + vx̄)]} {n(2 + vx̄) − [4n(n−1)vx̄ + n²v²x̄²]^{1/2}},
U = {x̄/[2(n + vx̄)]} {n(2 + vx̄) + [4n(n−1)vx̄ + n²v²x̄²]^{1/2}}.   (3.12)

Consequently, the MVUE of F(x₀) is

F̂(x₀) = ∫_L^{x₀} h(x₁|x̄, v) dx₁, L ≤ x₀ ≤ U,   (3.13)

with h(x₁|x̄, v) in (3.11); F̂(x₀) = 0 for all x₀ < L and F̂(x₀) = 1 for all x₀ > U, where L and U are given in (3.12).

Next, we simplify (3.13) so that F̂(x₀) can be evaluated using the Student's t distribution table [11]. Letting

w = [n(n−2)]^{1/2}(x₁ − x̄)/[vx₁x̄(nx̄ − x₁) − n(x₁ − x̄)²]^{1/2},

(3.13) can be simplified as

F̂(x₀) = ∫_{−∞}^{w₀} [Γ((n−1)/2)/(Γ((n−2)/2)(π(n−2))^{1/2})] {1 − (n−2)(vx̄)^{1/2} w/[4n(n−1)(n−2 + w²) + n²vx̄w²]^{1/2}} [1 + w²/(n−2)]^{−(n−1)/2} dw,   (3.14)

where w₀ = [n(n−2)]^{1/2}(x₀ − x̄)[vx₀x̄(nx̄ − x₀) − n(x₀ − x̄)²]^{−1/2}.

Let F_{t,n−2} stand for the cumulative Student's t distribution function with n − 2 degrees of freedom. Then, after simplifying (3.14), it can easily be shown that the MVUE of F(x₀) is

F̂(x₀) = 0 for x₀ ≤ L; = 1 for x₀ ≥ U; and otherwise

F̂(x₀) = F_{t,n−2}(w₀) + [(n−2)/n][1 + 4(n−1)/(nvx̄)]^{(n−3)/2} F_{t,n−2}(−w₀′),   (3.15)


where

w₀ = [n(n−2)]^{1/2}(x₀ − x̄)/[vx₀x̄(nx̄ − x₀) − n(x₀ − x̄)²]^{1/2},
w₀′ = [(n−2)/n]^{1/2}[nx̄ + (n−2)x₀]/[vx₀x̄(nx̄ − x₀) − n(x₀ − x̄)²]^{1/2},   (3.16)

and

L = {x̄/[2(n + vx̄)]} {n(2 + vx̄) − [4n(n−1)vx̄ + n²v²x̄²]^{1/2}},
U = {x̄/[2(n + vx̄)]} {n(2 + vx̄) + [4n(n−1)vx̄ + n²v²x̄²]^{1/2}}.
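A Python sketch (ours) of (3.15)-(3.16) for the case where both parameters are unknown, assuming SciPy; the simulated data are purely illustrative, and the true F(x₀) is printed only for rough comparison with the estimate.

```python
import numpy as np
from scipy import stats

def F_mvue_both_unknown(x0, xbar, v, n):
    """MVUE (3.15)-(3.16) of F(x0); xbar = sample mean, v = sum(1/Xi - 1/xbar)."""
    disc = np.sqrt(4 * n * (n - 1) * v * xbar + (n * v * xbar) ** 2)
    L = xbar * (n * (2 + v * xbar) - disc) / (2 * (n + v * xbar))
    U = xbar * (n * (2 + v * xbar) + disc) / (2 * (n + v * xbar))
    if x0 <= L:
        return 0.0
    if x0 >= U:
        return 1.0
    d = np.sqrt(v * x0 * xbar * (n * xbar - x0) - n * (x0 - xbar) ** 2)
    w0  = np.sqrt(n * (n - 2)) * (x0 - xbar) / d
    w0p = np.sqrt((n - 2) / n) * (n * xbar + (n - 2) * x0) / d
    coef = ((n - 2) / n) * (1 + 4 * (n - 1) / (n * v * xbar)) ** ((n - 3) / 2)
    return stats.t.cdf(w0, n - 2) + coef * stats.t.sf(w0p, n - 2)

rng = np.random.default_rng(1)
x = stats.invgauss.rvs(2.0 / 3.0, scale=3.0, size=10, random_state=rng)  # I(mu=2, lam=3)
xbar, v = x.mean(), np.sum(1.0 / x - 1.0 / x.mean())
print(F_mvue_both_unknown(1.0, xbar, v, len(x)))
print(stats.invgauss.cdf(1.0, 2.0 / 3.0, scale=3.0))  # true F(1.0), for comparison
```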

4. CONCLUDING REMARKS

Skew data are by no means exceptional, and so various distributions, e.g., the gamma, Weibull, etc., have been considered as appropriate models to describe physical phenomena in some suitable way. Since the inverse Gaussian distribution arises as the distribution of first-passage time in the Wiener process (see [3, p. 210]), its applicability to a life-testing or life-time situation is a natural consequence. This does not, however, exclude other areas for its applications. It may be considered appropriate in any situation involving skewed positive data. But interestingly enough, it has the advantage over some other skewed distributions that exact small sample theory is tractable, and in some cases it parallels that of the normal distribution.

The minimum variance unbiased estimates of F(x₀) presented in this article should be of some utility in reliability and quality assurance problems. The estimates can easily be calculated from existing statistical tables. Comparison of these estimates as given in (3.4), (3.9) and (3.15) with those of their counterparts for the normal distribution (see [5]) shows a remarkable similarity. Not only are their MVUE's commonly expressed in terms of the same distributions, standard normal and Student's t, but such estimates are also similar in character as to the form and the respective parameters involved.

As the direct method of deriving the estimates is somewhat cumbersome, one may think of simplifying it. However, the approach suggested by Eaton and Morris [4] will not be applicable here because the underlying statistics for the estimates are not ancillary, and so their Theorem 2.1 cannot be used.

Zacks and his associates [18, 19] have compared the maximum likelihood estimates and the MVUE for the normal and exponential distribution functions for their relative efficiency. A similar study can be done for the inverse Gaussian distribution function.

[Received November 1972. Revised June 1973.]

REFERENCES

[1] Barton, D.E., "Unbiased Estimation of a Set of Probabilities," Biometrika, 48 (June 1961), 227-29.
[2] Basu, A.P., "Estimates of Reliability for Some Distributions Useful in Life-Testing," Technometrics, 6 (May 1964), 215-19.
[3] Cox, D.R. and Miller, H.D., The Theory of Stochastic Processes, London: Methuen and Company, 1965.
[4] Eaton, M.L. and Morris, C.N., "The Application of Invariance to Unbiased Estimation," Annals of Mathematical Statistics, 41 (October 1970), 1708-15.
[5] Folks, J.L., Pierce, D.A. and Stewart, C., "Estimating the Fraction of Acceptable Product," Technometrics, 7 (February 1965), 43-50.
[6] Johnson, N.L. and Kotz, S., Continuous Distributions 1, Boston: Houghton Mifflin Company, Inc., 1970.
[7] Kolmogorov, A.N., "Unbiased Estimates," Izvestia Akademii Nauk SSSR, Seriya Matematiceskaya, 14 (1950), 303-26; American Mathematical Society Translation, Ser. 1, 11 (1962), 144-70.
[8] Laurent, A.G., "Conditional Distribution of Order Statistics and Distribution of the Reduced ith Order Statistic of the Exponential Model," Annals of Mathematical Statistics, 34 (June 1963), 652-57.
[9] Lehmann, E.L., Testing Statistical Hypotheses, New York: John Wiley and Sons, Inc., 1959.
[10] Lieberman, G.J. and Resnikoff, G.J., "Sampling Plans for Inspection by Variables," Journal of the American Statistical Association, 50 (June 1955), 457-516.
[11] Pearson, E.S. and Hartley, H.O., Biometrika Tables for Statisticians, Vol. I, Cambridge: The Cambridge University Press, 1970.
[12] Sathe, Y.S. and Varde, S.D., "On Minimum Variance Unbiased Estimation of Reliability," Annals of Mathematical Statistics, 40 (April 1969), 710-14.
[13] Shuster, J., "On the Inverse Gaussian Distribution Function," Journal of the American Statistical Association, 63 (December 1968), 1514-16.
[14] Tate, R.F., "Unbiased Estimation: Functions of Location and Scale Parameters," Annals of Mathematical Statistics, 30 (June 1959), 341-66.
[15] Tweedie, M.C.K., "Statistical Properties of Inverse Gaussian Distributions I, II," Annals of Mathematical Statistics, 28 (June 1957), 362-77; (September 1957), 696-705.
[16] Wasan, M.T., "On the Inverse Gaussian Process," Skandinavisk Aktuarietidskrift, Part 1-2 (1968), 69-96.
[17] Washio, Y., Morimoto, H. and Ikeda, H., "Unbiased Estimation Based on Sufficient Statistics," Bulletin of Mathematical Statistics, 6 (March 1956), 69-93.
[18] Zacks, S. and Even, M., "The Efficiencies in Small Samples of the Maximum Likelihood and Best Unbiased Estimators of Reliability Functions," Journal of the American Statistical Association, 61 (December 1966), 1033-51.
[19] Zacks, S. and Milton, R., "Mean Square Errors of the Best Unbiased and Maximum Likelihood Estimators of Tail Probabilities in Normal Distributions," Journal of the American Statistical Association, 66 (September 1971), 590-93.
[20] Zigangirov, K. Sh., "Expression for the Wald Distribution in Terms of Normal Distribution," Radio Engineering and Electronic Physics, 7 (January 1962), 145-48.
