Stable distribution
In probability theory, a distribution is said to be stable if a linear combination of two independent random variables with this distribution has the same distribution, up to location and scale parameters. A random variable is said to be stable if its distribution is stable. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it.^{[1]}^{[2]}
Of the four parameters defining the family, most attention has been focused on the stability parameter, α (see panel). Stable distributions have 0 < α ≤ 2, with the upper bound corresponding to the normal distribution, and α = 1 to the Cauchy distribution. The distributions have undefined variance for α < 2, and undefined mean for α ≤ 1. The importance of stable probability distributions is that they are "attractors" for properly normed sums of independent and identically distributed (i.i.d.) random variables. The normal distribution defines a family of stable distributions. By the classical central limit theorem the properly normed sum of a set of random variables, each with finite variance, will tend towards a normal distribution as the number of variables increases. Without the finite variance assumption, the limit may be a stable distribution that is not normal. Mandelbrot referred to such distributions as "stable Paretian distributions",^{[3]}^{[4]}^{[5]} after Vilfredo Pareto. In particular, he referred to those maximally skewed in the positive direction with 1 < α < 2 as "Pareto–Lévy distributions",^{[1]} which he regarded as better descriptions of stock and commodity prices than normal distributions.^{[6]}
Stable

[Figures: probability density functions of symmetric α-stable distributions with unit scale factor, and of skewed centered stable distributions with unit scale factor]
[Figures: cumulative distribution functions for symmetric α-stable distributions and for skewed centered stable distributions]
Parameters 
α ∈ (0, 2] — stability parameter
β ∈ [−1, 1] — skewness parameter (note that the distribution's classical skewness is undefined)
c ∈ (0, ∞) — scale parameter
μ ∈ (−∞, ∞) — location parameter 

Support 
x ∈ R, or x ∈ [μ, +∞) if α < 1 and β = 1, or x ∈ (−∞, μ] if α < 1 and β = −1 

PDF 
not analytically expressible, except for some parameter values 

CDF 
not analytically expressible, except for certain parameter values 

Mean 
μ when α > 1, otherwise undefined 

Median 
μ when β = 0, otherwise not analytically expressible 

Mode 
μ when β = 0, otherwise not analytically expressible 

Variance 
2c^{2} when α = 2, otherwise infinite 

Skewness 
0 when α = 2, otherwise undefined 

Ex. kurtosis 
0 when α = 2, otherwise undefined 

Entropy 
not analytically expressible, except for certain parameter values 

MGF 
$\exp \!\left(\mu t+c^{2}t^{2}\right)$ when $\alpha =2$, otherwise undefined 

CF 
$\exp \!{\Big [}\;it\mu -|c\,t|^{\alpha }\,(1-i\beta \operatorname {sgn} (t)\Phi )\;{\Big ]},$
where $\Phi ={\begin{cases}\tan {\tfrac {\pi \alpha }{2}}&{\text{if }}\alpha \neq 1\\-{\tfrac {2}{\pi }}\log |t|&{\text{if }}\alpha =1\end{cases}}$ 

Definition
A nondegenerate distribution is a stable distribution if it satisfies the following property:
 Let X_{1} and X_{2} be independent copies of a random variable X. Then X is said to be stable if for any constants a > 0 and b > 0 the random variable aX_{1} + bX_{2} has the same distribution as cX + d for some constants c > 0 and d. The distribution is said to be strictly stable if this holds with d = 0.^{[7]}
Since the normal distribution, the Cauchy distribution, and the Lévy distribution all have the above property, it follows that they are special cases of stable distributions.
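The defining property can be illustrated numerically in the Gaussian special case (α = 2), where the constants are known in closed form; the following sketch uses NumPy, and all numeric constants are illustrative choices rather than values from the text:

```python
import numpy as np

# Numerical sketch of the stability property in the Gaussian special case
# (alpha = 2). For the normal distribution the property holds with
# c = sqrt(a^2 + b^2) and d = (a + b - c) * mu.
rng = np.random.default_rng(0)
n = 200_000
mu, sigma = 1.0, 1.5
a, b = 2.0, 3.0

x1 = rng.normal(mu, sigma, n)   # X1, an independent copy of X
x2 = rng.normal(mu, sigma, n)   # X2, another independent copy
lhs = a * x1 + b * x2           # a X1 + b X2

c = np.hypot(a, b)              # sqrt(a^2 + b^2)
d = (a + b - c) * mu
rhs = c * rng.normal(mu, sigma, n) + d   # c X + d

# The two samples should agree in distribution; compare a few quantiles.
q = [0.1, 0.5, 0.9]
max_gap = float(np.max(np.abs(np.quantile(lhs, q) - np.quantile(rhs, q))))
```

With 200,000 samples the empirical quantiles of the two sides agree to within Monte Carlo error.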
Such distributions form a four-parameter family of continuous probability distributions parametrized by location and scale parameters μ and c, respectively, and two shape parameters β and α, roughly corresponding to measures of asymmetry and concentration, respectively (see the figures).
Although the probability density function for a general stable distribution cannot be written analytically, the general characteristic function can be. Any probability distribution is given by the Fourier transform of its characteristic function φ(t) by:
 $f(x)={\frac {1}{2\pi }}\int _{-\infty }^{\infty }\varphi (t)e^{-ixt}\,dt$
A random variable X is called stable if its characteristic function can be written as^{[7]}^{[8]}
 $\varphi (t;\alpha ,\beta ,c,\mu )=\exp \left(it\mu -|ct|^{\alpha }\left(1-i\beta \operatorname {sgn} (t)\Phi \right)\right)$
where sgn(t) is just the sign of t and
 $\Phi ={\begin{cases}\tan \left({\frac {\pi \alpha }{2}}\right)&\alpha \neq 1\\-{\frac {2}{\pi }}\log |t|&\alpha =1\end{cases}}$
μ ∈ R is a shift parameter, and β ∈ [−1, 1], called the skewness parameter, is a measure of asymmetry. Notice that in this context the usual skewness is not well defined: for α < 2 the distribution does not admit second or higher moments, and the usual skewness definition is the third central moment.
The reason this gives a stable distribution is that the characteristic function for the sum of two random variables equals the product of the two corresponding characteristic functions. Adding two random variables from a stable distribution gives something with the same values of α and β, but possibly different values of μ and c.
Not every function is the characteristic function of a legitimate probability distribution (that is, one whose cumulative distribution function is real and goes from 0 to 1 without decreasing), but the characteristic functions given above will be legitimate so long as the parameters are in their ranges. The value of the characteristic function at some value t is the complex conjugate of its value at −t, as it should be, so that the probability density function is real.
In the simplest case β = 0, the characteristic function is just a stretched exponential function; the distribution is symmetric about μ and is referred to as a (Lévy) symmetric alpha-stable distribution, often abbreviated SαS.
When α < 1 and β = 1, the distribution is supported by [μ, ∞).
The parameter c > 0 is a scale factor which is a measure of the width of the distribution while α is the exponent or index of the distribution and specifies the asymptotic behavior of the distribution.
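Since any density is the Fourier transform of its characteristic function, the density can be recovered by numerical inversion; the following sketch checks this in the symmetric Cauchy case (α = 1, β = 0, c = 1, μ = 0), where the closed form f(x) = 1/(π(1 + x²)) is available for comparison. The grid sizes are arbitrary choices:

```python
import numpy as np

# Sketch: recovering the density from the characteristic function by
# numerical Fourier inversion, in the symmetric Cauchy case where
# phi(t) = exp(-|t|) and f(x) = 1/(pi (1 + x^2)).
def phi(t):
    return np.exp(-np.abs(t))          # exp(-|ct|^alpha) with alpha = 1, c = 1

def density(x, tmax=50.0, m=200_001):
    # f(x) = (1/2pi) * integral of phi(t) exp(-i x t) dt,
    # truncated to [-tmax, tmax] and evaluated by the trapezoidal rule.
    t = np.linspace(-tmax, tmax, m)
    y = phi(t) * np.exp(-1j * x * t)
    dt = t[1] - t[0]
    integral = (y.sum() - 0.5 * (y[0] + y[-1])) * dt
    return float(integral.real) / (2 * np.pi)

f0 = density(0.0)    # exact value 1/pi
f1 = density(1.0)    # exact value 1/(2 pi)
```

The integrand decays like exp(−|t|), so the truncation error at tmax = 50 is negligible.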
Parametrizations
The above definition is only one of the parametrizations in use for stable distributions; it is the most common but is not continuous in the parameters at α = 1.
A continuous parametrization is^{[7]}
 $\varphi (t;\alpha ,\beta ,\gamma ,\delta )=\exp \left(it\delta -|\gamma t|^{\alpha }\left(1-i\beta \operatorname {sgn}(t)\Phi \right)\right)$
where:
 $\Phi ={\begin{cases}\left(|\gamma t|^{1-\alpha }-1\right)\tan \left({\tfrac {\pi \alpha }{2}}\right)&\alpha \neq 1\\-{\frac {2}{\pi }}\log |\gamma t|&\alpha =1\end{cases}}$
The ranges of α and β are the same as before, γ (like c) should be positive, and δ (like μ) should be real.
In either parametrization one can make a linear transformation of the random variable to get a random variable whose density is $f(y;\alpha ,\beta ,1,0)$. In the first parametrization, this is done by defining the new variable:
 $y={\begin{cases}{\frac {x-\mu }{\gamma }}&\alpha \neq 1\\{\frac {x-\mu }{\gamma }}-\beta {\frac {2}{\pi }}\ln \gamma &\alpha =1\end{cases}}$
For the second parametrization, we simply use
 $y={\frac {x-\delta }{\gamma }}.$
no matter what α is. In the first parametrization, if the mean exists (that is, α > 1) then it is equal to μ, whereas in the second parametrization when the mean exists it is equal to $\delta -\beta \gamma \tan \left({\tfrac {\pi \alpha }{2}}\right).$
The distribution
A stable distribution is therefore specified by the above four parameters. It can be shown that any nondegenerate stable distribution has a smooth (infinitely differentiable) density function.^{[7]} If $f(x;\alpha ,\beta ,c,\mu )$ denotes the density of X and Y is the sum of independent copies of X:
 $Y=\sum _{i=1}^{N}k_{i}(X_{i}-\mu )\,$
then Y has the density $s^{-1}f(y/s;\alpha ,\beta ,c,0)$ with
 $s=\left(\sum _{i=1}^{N}k_{i}^{\alpha }\right)^{\frac {1}{\alpha }}.$
The asymptotic behavior is described, for α < 2, by:^{[7]}
 $f(x)\sim {\frac {1}{|x|^{1+\alpha }}}\left(c^{\alpha }(1+\operatorname {sgn}(x)\beta )\sin \left({\frac {\pi \alpha }{2}}\right){\frac {\Gamma (\alpha +1)}{\pi }}\right)$
where Γ is the Gamma function (except that when α < 1 and β = ±1, the tail vanishes to the left or right, resp., of μ). This "heavy tail" behavior causes the variance of stable distributions to be infinite for all α < 2. This property is illustrated in the loglog plots below.
When α = 2, the distribution is Gaussian (see below), with tails asymptotic to exp(−x^{2}/(4c^{2}))/(2c√π).
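The tail formula can be checked in a case with a closed form: for α = 1/2, β = 1, c = 1, μ = 0 the stable density is the standard Lévy density (1/√(2π)) x^{−3/2} exp(−1/(2x)) (see the special cases below), whose leading tail coefficient should match the asymptotic expression. A sketch using SciPy:

```python
from math import gamma, sin, pi
from scipy.stats import levy

# Compare the heavy-tail asymptotic
#   f(x) ~ x^{-(1+alpha)} * c^alpha (1 + sgn(x) beta) sin(pi alpha/2) Gamma(alpha+1)/pi
# against the exact standard Lévy density at a large x.
alpha, beta, c = 0.5, 1.0, 1.0
x = 200.0
tail = (1.0 / x ** (1 + alpha)) * (
    c ** alpha * (1 + beta) * sin(pi * alpha / 2) * gamma(alpha + 1) / pi
)
exact = levy.pdf(x)                  # standard Lévy density at x
rel_err = abs(tail - exact) / exact  # shrinks as x grows
```

At x = 200 the relative error is already below one percent, reflecting the exp(−1/(2x)) correction factor.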
One-sided stable distribution and stable count distribution
When α < 1 and β = 1, the distribution is supported by [μ, ∞). This family is called the "one-sided stable distribution".^{[9]} Its standard distribution is denoted as
 $L_{\alpha }(x)=f(x;\alpha ,1,\cos({\frac {\pi \alpha }{2}})^{1/\alpha },0)$, where $\alpha <1$.
Consider the Lévy sum $Y=\sum _{i=1}^{N}X_{i}$ where $X_{i}\sim L_{\alpha }(x)$; then Y has the density ${\frac {1}{\nu }}L_{\alpha }({\frac {x}{\nu }})$ where $\nu =N^{1/\alpha }$. Setting $x=1$, we arrive at the "stable count distribution".^{[10]} Its standard distribution is defined as
 ${\mathfrak {N}}_{\alpha }(\nu )={\frac {\alpha }{\Gamma ({\frac {1}{\alpha }})}}{\frac {1}{\nu }}L_{\alpha }({\frac {1}{\nu }})$, where $\nu >0$ and $\alpha <1$.
The stable count distribution is the conjugate prior of the one-sided stable distribution. Its location-scale family is defined as
 ${\mathfrak {N}}_{\alpha }(\nu ;\nu _{0},\theta )={\frac {\alpha }{\Gamma ({\frac {1}{\alpha }})}}{\frac {1}{\nu -\nu _{0}}}L_{\alpha }({\frac {\theta }{\nu -\nu _{0}}})$, where $\nu >\nu _{0}$, $\theta >0$, and $\alpha <1$.
It is also a one-sided distribution supported by $[\nu _{0},\infty )$. The location parameter $\nu _{0}$ is the cutoff location, while $\theta$ defines the scale of the one-sided distribution.
When $\alpha ={\frac {1}{2}}$, $L_{\frac {1}{2}}(x)$ is the Lévy distribution which is an inverse gamma distribution. Thus ${\mathfrak {N}}_{\frac {1}{2}}(\nu ;\nu _{0},\theta )$ is a shifted gamma distribution of shape 3/2 and scale $4\theta$,
 ${\mathfrak {N}}_{\frac {1}{2}}(\nu ;\nu _{0},\theta )={\frac {1}{4{\sqrt {\pi }}\,\theta ^{3/2}}}(\nu -\nu _{0})^{1/2}e^{-{\frac {\nu -\nu _{0}}{4\theta }}}$, where $\nu >\nu _{0}$, $\theta >0$.
Its mean is $\nu _{0}+6\theta$ and its standard deviation is ${\sqrt {24}}\theta$. It is hypothesized that VIX is distributed like ${\mathfrak {N}}_{\frac {1}{2}}(\nu ;\nu _{0},\theta )$ with $\nu _{0}=10.4$ and $\theta =1.6$ (See Section 7 of ^{[10]}). Thus the stable count distribution is the firstorder marginal distribution of a volatility process. In this context, $\nu _{0}$ is called the "floor volatility".
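The α = 1/2 reduction can be verified directly, since L_{1/2} is the Lévy density with scale 1/2 and the shifted gamma density is available in SciPy. A sketch, using the illustrative ν₀ and θ values quoted above:

```python
from math import gamma as Gamma
from scipy.stats import levy, gamma as gamma_dist

# For alpha = 1/2, L_{1/2}(x) = (1/(2 sqrt(pi))) x^{-3/2} exp(-1/(4x)),
# i.e. the Lévy density with scale 1/2; the stable count density should
# then equal a gamma density of shape 3/2 and scale 4*theta shifted to nu0.
alpha = 0.5
nu0, theta = 10.4, 1.6   # illustrative values quoted in the text

def stable_count(nu):
    # N_alpha(nu; nu0, theta)
    #   = (alpha / Gamma(1/alpha)) * (nu - nu0)^{-1} * L_alpha(theta / (nu - nu0))
    z = nu - nu0
    return (alpha / Gamma(1 / alpha)) * (1.0 / z) * levy.pdf(theta / z, scale=0.5)

nu = 15.0
direct = stable_count(nu)
via_gamma = gamma_dist.pdf(nu, a=1.5, loc=nu0, scale=4 * theta)
mean = gamma_dist.mean(a=1.5, loc=nu0, scale=4 * theta)   # should be nu0 + 6*theta
```

The two density evaluations agree to floating-point precision, and the mean reproduces ν₀ + 6θ.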
Another approach to derive the stable count distribution is to use the Laplace transform of the one-sided stable distribution (Section 2.4 of ^{[10]}),
 $\int _{0}^{\infty }e^{-zx}L_{\alpha }(x)dx=e^{-z^{\alpha }}$, where $\alpha <1$.
Let $x=1/\nu$, and one can decompose the integral on the left hand side as a product distribution of a standard Laplace distribution and a standard stable count distribution,
 $\int _{0}^{\infty }\left({\frac {1}{2\nu }}e^{-{\frac {|z|}{\nu }}}\right)\left({\frac {\alpha }{\Gamma ({\frac {1}{\alpha }})}}{\frac {1}{\nu }}L_{\alpha }({\frac {1}{\nu }})\right)d\nu ={\frac {1}{2}}{\frac {\alpha }{\Gamma ({\frac {1}{\alpha }})}}e^{-|z|^{\alpha }}$, where $\alpha <1$.
This is called the "lambda decomposition" (see Section 4 of ^{[10]}), since the right hand side was named the "symmetric lambda distribution" in Lihn's earlier works. However, it has several more popular names, such as the "exponential power distribution" or the "generalized error/normal distribution", often referred to when α > 1.
The nth moment of ${\mathfrak {N}}_{\alpha }(\nu )$ is the $-(n+1)$th moment of $L_{\alpha }(x)$, and all positive moments are finite. This, in a way, resolves the thorny issue of diverging moments in the stable distribution.
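The Laplace-transform identity above can be checked numerically for α = 1/2, where L_{1/2} is again the Lévy density with scale 1/2 and the right hand side is exp(−√z). A sketch:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import levy

# Numerical check of the Laplace transform of the one-sided stable density
# for alpha = 1/2: the integral of exp(-z x) L_{1/2}(x) over (0, inf)
# should equal exp(-z^alpha) = exp(-sqrt(z)).
z = 2.0
val, abserr = quad(lambda x: np.exp(-z * x) * levy.pdf(x, scale=0.5), 0, np.inf)
target = np.exp(-np.sqrt(z))
```

The quadrature agrees with exp(−√2) to well within the integrator's reported error.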
Properties
Stable distributions are closed under convolution for a fixed value of α. Since convolution is equivalent to multiplication of the Fouriertransformed function, it follows that the product of two stable characteristic functions with the same α will yield another such characteristic function. The product of two stable characteristic functions is given by:
 $\exp \left(it\mu _{1}+it\mu _{2}-c_{1}^{\alpha }|t|^{\alpha }-c_{2}^{\alpha }|t|^{\alpha }+i\beta _{1}c_{1}^{\alpha }|t|^{\alpha }\operatorname {sgn}(t)\Phi +i\beta _{2}c_{2}^{\alpha }|t|^{\alpha }\operatorname {sgn}(t)\Phi \right)$
Since Φ is not a function of the μ, c or β variables it follows that these parameters for the convolved function are given by:
 ${\begin{aligned}\mu &=\mu _{1}+\mu _{2}\\c&=\left(c_{1}^{\alpha }+c_{2}^{\alpha }\right)^{\frac {1}{\alpha }}\\[6pt]\beta &={\frac {\beta _{1}c_{1}^{\alpha }+\beta _{2}c_{2}^{\alpha }}{c_{1}^{\alpha }+c_{2}^{\alpha }}}\end{aligned}}$
In each case, it can be shown that the resulting parameters lie within the required intervals for a stable distribution.
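The combination rules can be packaged as a small helper; the function below is ours (not from any library), with a sanity check in the Gaussian case, where the variances 2c² of the summands must add:

```python
# Parameters of the sum of two independent stable variables sharing
# a common alpha, following the combination rules above.
def combine(alpha, p1, p2):
    """p1 and p2 are (mu, c, beta) triples for the two summands."""
    (mu1, c1, b1), (mu2, c2, b2) = p1, p2
    mu = mu1 + mu2
    c = (c1 ** alpha + c2 ** alpha) ** (1.0 / alpha)
    beta = (b1 * c1 ** alpha + b2 * c2 ** alpha) / (c1 ** alpha + c2 ** alpha)
    return mu, c, beta

# Gaussian sanity check (alpha = 2): variances 2 c^2 add, so c = sqrt(1 + 4).
mu, c, beta = combine(2.0, (0.0, 1.0, 0.0), (1.0, 2.0, 0.0))
# Opposite maximal skews of equal scale cancel to a symmetric sum.
_, _, beta_mix = combine(1.5, (0.0, 1.0, 1.0), (0.0, 1.0, -1.0))
```

Note that the resulting β is a scale-weighted average of β₁ and β₂, so it automatically stays in [−1, 1].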
A generalized central limit theorem
Another important property of stable distributions is the role that they play in a generalized central limit theorem. The central limit theorem states that the sum of a number of independent and identically distributed (i.i.d.) random variables with finite nonzero variances will tend to a normal distribution as the number of variables grows.
A generalization due to Gnedenko and Kolmogorov states that the sum of a number of random variables with symmetric distributions having power-law tails (Paretian tails), decreasing as $|x|^{-\alpha -1}$ where $0<\alpha \leqslant 2$ (and therefore having infinite variance), will tend to a stable distribution $f(x;\alpha ,0,c,0)$ as the number of summands grows.^{[11]} If $\alpha >2$ then the sum converges to a stable distribution with stability parameter equal to 2, i.e. a Gaussian distribution.^{[12]}
There are other possibilities as well. For example, if the characteristic function of the random variable is asymptotic to $1+a|t|^{\alpha }\ln |t|$ for small t (positive or negative), then we may ask how t must vary with n when the value of the characteristic function for the sum of n such random variables equals a given value u:
 $\varphi _{\text{sum}}=\varphi ^{n}=u$
Assuming for the moment that t → 0, we take the limit of the above as n → ∞:
 $\ln u=\lim _{n\to \infty }n\ln \varphi =\lim _{n\to \infty }na|t|^{\alpha }\ln |t|.$
Therefore:
 ${\begin{aligned}\ln(-\ln u)&=\ln \left(-\lim _{n\to \infty }na|t|^{\alpha }\ln |t|\right)\\[5pt]&=\lim _{n\to \infty }\ln \left(na|t|^{\alpha }\left(-\ln |t|\right)\right)=\lim _{n\to \infty }\left\{\ln(na)+\alpha \ln |t|+\ln(-\ln |t|)\right\}\end{aligned}}$
This shows that $\ln |t|$ is asymptotic to $-{\tfrac {1}{\alpha }}\ln n,$ so using the previous equation we have
 $t\sim \left({\frac {-\alpha \ln u}{na\ln n}}\right)^{1/\alpha }.$
This implies that the sum divided by
 $\left({\frac {na\ln n}{\alpha }}\right)^{\frac {1}{\alpha }}$
has a characteristic function whose value at some t′ goes to u (as n increases) when $t'=(-\ln u)^{\frac {1}{\alpha }}.$ In other words, the characteristic function converges pointwise to $\exp(-(t')^{\alpha })$ and therefore by Lévy's continuity theorem the sum divided by
 $\left({\frac {na\ln n}{\alpha }}\right)^{\frac {1}{\alpha }}$
converges in distribution to the symmetric alphastable distribution with stability parameter $\alpha$ and scale parameter 1.
This can be applied to a random variable whose tails decrease as $|x|^{-3}$. This random variable has a mean but the variance is infinite. Let us take the following distribution:
 $f(x)={\begin{cases}{\frac {1}{3}}&|x|\leqslant 1\\{\frac {1}{3}}|x|^{-3}&|x|>1\end{cases}}$
We can write this as
 $f(x)=\int _{1}^{\infty }{\frac {2}{w^{4}}}h\left({\frac {x}{w}}\right)dw$
where
 $h\left({\frac {x}{w}}\right)={\begin{cases}{\frac {1}{2}}&\left|{\frac {x}{w}}\right|<1,\\0&\left|{\frac {x}{w}}\right|>1.\end{cases}}$
We want to find the leading terms of the asymptotic expansion of the characteristic function. The characteristic function of the probability distribution ${\tfrac {1}{w}}h\left({\tfrac {x}{w}}\right)$ is ${\tfrac {\sin(tw)}{tw}},$ so the characteristic function for f(x) is
 $\varphi (t)=\int _{1}^{\infty }{\frac {2\sin(tw)}{tw^{4}}}dw$
and we can calculate:
 ${\begin{aligned}\varphi (t)-1&=\int _{1}^{\infty }{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]\,dw\\&=\int _{1}^{\frac {1}{|t|}}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]\,dw+\int _{\frac {1}{|t|}}^{\infty }{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]\,dw\\&=\int _{1}^{\frac {1}{|t|}}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+\left\{{\frac {t^{2}w^{2}}{3!}}-{\frac {t^{2}w^{2}}{3!}}\right\}\right]\,dw+\int _{\frac {1}{|t|}}^{\infty }{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]\,dw\\&=-\int _{1}^{\frac {1}{|t|}}{\frac {t^{2}\,dw}{3w}}+\int _{1}^{\frac {1}{|t|}}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{3!}}\right]dw+\int _{\frac {1}{|t|}}^{\infty }{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]dw\\&=-\int _{1}^{\frac {1}{|t|}}{\frac {t^{2}\,dw}{3w}}+\left\{\int _{0}^{\frac {1}{|t|}}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{3!}}\right]dw-\int _{0}^{1}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{3!}}\right]dw\right\}+\int _{\frac {1}{|t|}}^{\infty }{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1\right]dw\\&=-\int _{1}^{\frac {1}{|t|}}{\frac {t^{2}\,dw}{3w}}+t^{2}\int _{0}^{1}{\frac {2}{y^{3}}}\left[{\frac {\sin(y)}{y}}-1+{\frac {y^{2}}{6}}\right]dy-\int _{0}^{1}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{6}}\right]dw+t^{2}\int _{1}^{\infty }{\frac {2}{y^{3}}}\left[{\frac {\sin(y)}{y}}-1\right]dy\\&=-{\frac {t^{2}}{3}}\int _{1}^{\frac {1}{|t|}}{\frac {dw}{w}}+t^{2}C_{1}-\int _{0}^{1}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{6}}\right]dw+t^{2}C_{2}\\&={\frac {t^{2}}{3}}\ln |t|+t^{2}C_{3}-\int _{0}^{1}{\frac {2}{w^{3}}}\left[{\frac {\sin(tw)}{tw}}-1+{\frac {t^{2}w^{2}}{6}}\right]dw\\&={\frac {t^{2}}{3}}\ln |t|+t^{2}C_{3}-\int _{0}^{1}{\frac {2}{w^{3}}}\left[{\frac {t^{4}w^{4}}{5!}}-\cdots \right]dw\\&={\frac {t^{2}}{3}}\ln |t|+t^{2}C_{3}+{\mathcal {O}}\left(t^{4}\right)\end{aligned}}$
where $C_{1},C_{2}$ and $C_{3}$ are constants. Therefore,
 $\varphi (t)\sim 1+{\frac {t^{2}}{3}}\ln |t|$
and according to what was said above (and the fact that the variance of f(x;2,0,1,0) is 2), the sum of n instances of this random variable, divided by ${\sqrt {n(\ln n)/3}},$ will converge in distribution to a Gaussian distribution with variance 1. But the variance at any particular n will still be infinite. Note that the width of the limiting distribution grows faster than in the case where the random variable has a finite variance (in which case the width grows as the square root of n). The average, obtained by dividing the sum by n, tends toward a Gaussian whose width approaches zero as n increases, in accordance with the law of large numbers.
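Two facts used in this example can be verified deterministically: the density integrates to one, and its second moment truncated at K grows like (2/3) ln K, which is exactly the slowly varying factor responsible for the extra ln n in the normalizing sequence. A sketch using SciPy quadrature:

```python
import numpy as np
from scipy.integrate import quad

# The example density: 1/3 on |x| <= 1, (1/3)|x|^{-3} outside.
def f(x):
    ax = abs(x)
    return 1.0 / 3.0 if ax <= 1 else (1.0 / 3.0) * ax ** -3

# Normalization: 2/3 from the plateau plus 1/3 from the two tails.
total = quad(f, -1, 1)[0] + 2 * quad(f, 1, np.inf)[0]

def tail_second_moment(K):
    # contribution of 1 < |x| < K to E[X^2]; exact value is (2/3) ln K
    return 2 * quad(lambda x: x * x * f(x), 1, K)[0]

growth = tail_second_moment(100.0) - tail_second_moment(10.0)
expected = (2.0 / 3.0) * np.log(10.0)   # (2/3) ln(K2/K1)
```

The logarithmic growth of the truncated variance, rather than convergence, is what forces the √(n ln n)-type norming.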
Special cases
There is no general analytic solution for the form of p(x). There are, however, three special cases which can be expressed in terms of elementary functions, as can be seen by inspection of the characteristic function:^{[7]}^{[8]}^{[13]}
 For α = 2 the distribution reduces to a Gaussian distribution with variance σ^{2} = 2c^{2} and mean μ; the skewness parameter β has no effect.
 For α = 1 and β = 0 the distribution reduces to a Cauchy distribution with scale parameter c and shift parameter μ.
 For α = 1/2 and β = 1 the distribution reduces to a Lévy distribution with scale parameter c and shift parameter μ.
Note that the above three distributions are also connected, in the following way: A standard Cauchy random variable can be viewed as a mixture of Gaussian random variables (all with mean zero), with the variance being drawn from a standard Lévy distribution. And in fact this is a special case of a more general theorem ^{[14]} which allows any symmetric alphastable distribution to be viewed in this way (with the alpha parameter of the mixture distribution equal to twice the alpha parameter of the mixing distribution—and the beta parameter of the mixing distribution always equal to one).
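The mixture statement can be checked by direct numerical integration: averaging a zero-mean Gaussian density over a variance drawn from the standard Lévy distribution should reproduce the standard Cauchy density. A sketch:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import cauchy, levy, norm

# Integrate N(x; 0, v) against the standard Lévy density of the variance v;
# the result should be the standard Cauchy density at x.
def mixture_pdf(x):
    integrand = lambda v: norm.pdf(x, scale=np.sqrt(v)) * levy.pdf(v)
    return quad(integrand, 0, np.inf)[0]

x = 1.3
mix = mixture_pdf(x)
exact = cauchy.pdf(x)   # 1 / (pi (1 + x^2))
```

This is the α = 1 instance of the general theorem cited above, with the mixing distribution's α equal to 1/2 (half the mixture's) and its β equal to one.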
A general closed form expression for stable PDFs with rational values of α is available in terms of Meijer G-functions.^{[15]} Fox H-functions can also be used to express the stable probability density functions. For simple rational numbers, the closed form expression is often in terms of less complicated special functions. Several closed form expressions having rather simple expressions in terms of special functions are available. In the table below, PDFs expressible by elementary functions are indicated by an E and those that are expressible by special functions are indicated by an s.^{[14]}

α        1/3   1/2   2/3   1     4/3   3/2   2
β = 0     s     s     s    E      s     s    E
β = 1     s     E     s    s            s

Some of the special cases are known by particular names:
 For α = 1 and β = 1, the distribution is the Landau distribution, which has a specific usage in physics under this name.
 For α = 3/2 and β = 0 the distribution reduces to a Holtsmark distribution with scale parameter c and shift parameter μ.
Also, in the limit as c approaches zero or as α approaches zero the distribution will approach a Dirac delta function δ(x − μ).
Series representation
The stable distribution can be restated as the real part of a simpler integral:^{[16]}
 $f(x;\alpha ,\beta ,c,\mu )={\frac {1}{\pi }}\Re \left[\int _{0}^{\infty }e^{it(x-\mu )}e^{-(ct)^{\alpha }(1-i\beta \Phi )}\,dt\right].$
Expressing the second exponential as a Taylor series, we have:
 $f(x;\alpha ,\beta ,c,\mu )={\frac {1}{\pi }}\Re \left[\int _{0}^{\infty }e^{it(x-\mu )}\sum _{n=0}^{\infty }{\frac {(-qt^{\alpha })^{n}}{n!}}\,dt\right]$
where $q=c^{\alpha }(1-i\beta \Phi )$. Reversing the order of integration and summation, and carrying out the integration yields:
 $f(x;\alpha ,\beta ,c,\mu )={\frac {1}{\pi }}\Re \left[\sum _{n=1}^{\infty }{\frac {(-q)^{n}}{n!}}\left({\frac {i}{x-\mu }}\right)^{\alpha n+1}\Gamma (\alpha n+1)\right]$
which will be valid for x ≠ μ and will converge for appropriate values of the parameters. (Note that the n = 0 term which yields a delta function in x−μ has therefore been dropped.) Expressing the first exponential as a series will yield another series in positive powers of x−μ which is generally less useful.
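The series can be tested in the symmetric Cauchy case (α = 1, β = 0, c = 1, μ = 0), where q = 1 and the partial sums should converge geometrically to 1/(π(1 + x²)) for |x| > 1. A sketch:

```python
import numpy as np
from math import factorial, gamma

# Partial sums of the series representation in the symmetric Cauchy case.
# With beta = 0 the skewness term vanishes, so q = c^alpha = 1.
def series_pdf(x, alpha=1.0, q=1.0, n_terms=60):
    s = 0j
    for n in range(1, n_terms + 1):
        s += (-q) ** n / factorial(n) * (1j / x) ** (alpha * n + 1) * gamma(alpha * n + 1)
    return s.real / np.pi

x = 2.0
approx = series_pdf(x)
exact = 1.0 / (np.pi * (1.0 + x * x))
```

At x = 2 the terms shrink by a factor of 1/2 each, so 60 terms are far more than enough.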
Simulation of stable variables
Simulating sequences of stable random variables is not straightforward, since there are no analytic expressions for the inverse $F^{-1}(x)$ nor the CDF $F(x)$ itself.^{[17]}^{[10]} All standard approaches like the rejection or the inversion methods would require tedious computations. A much more elegant and efficient solution was proposed by Chambers, Mallows and Stuck (CMS),^{[18]} who noticed that a certain integral formula^{[19]} yielded the following algorithm:^{[20]}
 generate a random variable $U$ uniformly distributed on $\left({\tfrac {\pi }{2}},{\tfrac {\pi }{2}}\right)$ and an independent exponential random variable $W$ with mean 1;
 for $\alpha \neq 1$ compute:
 $X=\left(1+\zeta ^{2}\right)^{\frac {1}{2\alpha }}{\frac {\sin(\alpha (U+\xi ))}{(\cos(U))^{\frac {1}{\alpha }}}}\left({\frac {\cos(U-\alpha (U+\xi ))}{W}}\right)^{\frac {1-\alpha }{\alpha }},$
 for $\alpha =1$ compute:
 $X={\frac {1}{\xi }}\left\{\left({\frac {\pi }{2}}+\beta U\right)\tan U-\beta \log \left({\frac {{\frac {\pi }{2}}W\cos U}{{\frac {\pi }{2}}+\beta U}}\right)\right\},$
 where
 $\zeta =-\beta \tan {\frac {\pi \alpha }{2}},\qquad \xi ={\begin{cases}{\frac {1}{\alpha }}\arctan(-\zeta )&\alpha \neq 1\\{\frac {\pi }{2}}&\alpha =1\end{cases}}$
This algorithm yields a random variable $X\sim S_{\alpha }(\beta ,1,0)$. For a detailed proof see.^{[21]}
Given the formulas for simulation of a standard stable random variable, we can easily simulate a stable random variable for all admissible values of the parameters $\alpha$, $c$, $\beta$ and $\mu$ using the following property. If $X\sim S_{\alpha }(\beta ,1,0)$ then
 $Y={\begin{cases}cX+\mu &\alpha \neq 1\\cX+{\frac {2}{\pi }}\beta c\log c+\mu &\alpha =1\end{cases}}$
is $S_{\alpha }(\beta ,c,\mu )$. It is interesting to note that for $\alpha =2$ (and $\beta =0$) the CMS method reduces to the well-known Box–Muller transform for generating Gaussian random variables.^{[22]} Many other approaches have been proposed in the literature, including applications of Bergström and LePage series expansions; see ^{[23]} and ^{[24]}, respectively. However, the CMS method is regarded as the fastest and the most accurate.
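The recipe above translates directly into code; the following sketch implements the CMS formulas as stated (the function name is ours). For α = 2, β = 0 the formula collapses algebraically to X = 2 sin(U)√W, a Box–Muller-style generator for N(0, 2), and for α = 1, β = 0 it reduces to tan(U), a standard Cauchy generator:

```python
import numpy as np

# Chambers-Mallows-Stuck sampler for a standard stable variable S_alpha(beta, 1, 0).
def cms_sample(alpha, beta, size, rng):
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # U ~ Uniform(-pi/2, pi/2)
    w = rng.exponential(1.0, size)                 # W ~ Exp(1), independent of U
    if alpha != 1:
        zeta = -beta * np.tan(np.pi * alpha / 2)
        xi = np.arctan(-zeta) / alpha
        return ((1 + zeta ** 2) ** (1 / (2 * alpha))
                * np.sin(alpha * (u + xi)) / np.cos(u) ** (1 / alpha)
                * (np.cos(u - alpha * (u + xi)) / w) ** ((1 - alpha) / alpha))
    xi = np.pi / 2
    return (1 / xi) * ((np.pi / 2 + beta * u) * np.tan(u)
                       - beta * np.log((np.pi / 2 * w * np.cos(u))
                                       / (np.pi / 2 + beta * u)))

rng = np.random.default_rng(42)
gauss = cms_sample(2.0, 0.0, 100_000, rng)   # should be N(0, 2)
var2 = float(gauss.var())                    # close to 2
cauchy_med = float(np.median(cms_sample(1.0, 0.0, 100_000, rng)))  # close to 0
```

For general parameters the output can then be rescaled with the Y = cX + μ transformation given above.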
Applications
Stable distributions owe their importance in both theory and practice to the generalization of the central limit theorem to random variables without second (and possibly first) order moments and the accompanying self-similarity of the stable family. It was the seeming departure from normality along with the demand for a self-similar model for financial data (i.e. the shape of the distribution for yearly asset price changes should resemble that of the constituent daily or monthly price changes) that led Benoît Mandelbrot to propose that cotton prices follow an alpha-stable distribution with α equal to 1.7.^{[6]} Lévy distributions are frequently found in analysis of critical behavior and financial data.^{[8]}^{[25]}
They are also found in spectroscopy as a general expression for a quasistatically pressure broadened spectral line.^{[16]}
The Lévy distribution of solar flare waiting time events (time between flare events) was demonstrated for CGRO BATSE hard X-ray solar flares in December 2001. Analysis of the Lévy statistical signature revealed that two different memory signatures were evident: one related to the solar cycle, and a second whose origin appears to be associated with localized, or a combination of localized, solar active region effects.^{[26]}
Other analytic cases
A number of cases of analytically expressible stable distributions are known. Let the stable distribution be expressed by $f(x;\alpha ,\beta ,c,\mu )$; then:
 The Cauchy distribution is given by $f(x;1,0,1,0).$
 The Lévy distribution is given by $f(x;{\tfrac {1}{2}},1,1,0).$
 The Normal distribution is given by $f(x;2,0,1,0).$
 Let $S_{\mu ,\nu }(z)$ be a Lommel function, then:^{[27]}
 $f\left(x;{\tfrac {1}{3}},0,1,0\right)=\Re \left({\frac {2e^{{\frac {i\pi }{4}}}}{3{\sqrt {3}}\pi }}{\frac {1}{\sqrt {x^{3}}}}S_{0,{\frac {1}{3}}}\left({\frac {2e^{\frac {i\pi }{4}}}{3{\sqrt {3}}}}{\frac {1}{\sqrt {x}}}\right)\right)$
 $f\left(x;{\tfrac {1}{2}},0,1,0\right)={\frac {1}{\sqrt {2\pi |x|^{3}}}}\left(\sin \left({\tfrac {1}{4|x|}}\right)\left[{\frac {1}{2}}-S\left({\tfrac {1}{\sqrt {2\pi |x|}}}\right)\right]+\cos \left({\tfrac {1}{4|x|}}\right)\left[{\frac {1}{2}}-C\left({\tfrac {1}{\sqrt {2\pi |x|}}}\right)\right]\right)$
 $f\left(x;{\tfrac {1}{3}},1,1,0\right)={\frac {1}{\pi }}{\frac {2{\sqrt {2}}}{3^{\frac {7}{4}}}}{\frac {1}{\sqrt {x^{3}}}}K_{\frac {1}{3}}\left({\frac {4{\sqrt {2}}}{3^{\frac {9}{4}}}}{\frac {1}{\sqrt {x}}}\right)$
 ${\begin{aligned}f\left(x;{\tfrac {4}{3}},0,1,0\right)&={\frac {3^{\frac {5}{4}}}{4{\sqrt {2\pi }}}}{\frac {\Gamma \left({\tfrac {7}{12}}\right)\Gamma \left({\tfrac {11}{12}}\right)}{\Gamma \left({\tfrac {6}{12}}\right)\Gamma \left({\tfrac {8}{12}}\right)}}{}_{2}F_{2}\left({\tfrac {7}{12}},{\tfrac {11}{12}};{\tfrac {6}{12}},{\tfrac {8}{12}};-{\tfrac {3^{3}x^{4}}{4^{4}}}\right)-{\frac {3^{\frac {11}{4}}x^{3}}{4^{3}{\sqrt {2\pi }}}}{\frac {\Gamma \left({\tfrac {13}{12}}\right)\Gamma \left({\tfrac {17}{12}}\right)}{\Gamma \left({\tfrac {18}{12}}\right)\Gamma \left({\tfrac {15}{12}}\right)}}{}_{2}F_{2}\left({\tfrac {13}{12}},{\tfrac {17}{12}};{\tfrac {18}{12}},{\tfrac {15}{12}};-{\tfrac {3^{3}x^{4}}{4^{4}}}\right)\\[6pt]f\left(x;{\tfrac {3}{2}},0,1,0\right)&={\frac {\Gamma \left({\tfrac {5}{3}}\right)}{\pi }}{}_{2}F_{3}\left({\tfrac {5}{12}},{\tfrac {11}{12}};{\tfrac {1}{3}},{\tfrac {1}{2}},{\tfrac {5}{6}};-{\tfrac {2^{2}x^{6}}{3^{6}}}\right)-{\frac {x^{2}}{3\pi }}{}_{3}F_{4}\left({\tfrac {3}{4}},1,{\tfrac {5}{4}};{\tfrac {2}{3}},{\tfrac {5}{6}},{\tfrac {7}{6}},{\tfrac {4}{3}};-{\tfrac {2^{2}x^{6}}{3^{6}}}\right)+{\frac {7x^{4}\Gamma \left({\tfrac {4}{3}}\right)}{3^{4}\pi ^{2}}}{}_{2}F_{3}\left({\tfrac {13}{12}},{\tfrac {19}{12}};{\tfrac {7}{6}},{\tfrac {3}{2}},{\tfrac {5}{3}};-{\tfrac {2^{2}x^{6}}{3^{6}}}\right)\end{aligned}}$
 with the latter being the Holtsmark distribution.
 ${\begin{aligned}f\left(x;{\tfrac {2}{3}},0,1,0\right)&={\frac {\sqrt {3}}{6{\sqrt {\pi }}\,|x|}}\exp \left({\tfrac {2}{27}}x^{-2}\right)W_{-{\frac {1}{2}},{\frac {1}{6}}}\left({\tfrac {4}{27}}x^{-2}\right)\\[8pt]f\left(x;{\tfrac {2}{3}},1,1,0\right)&={\frac {\sqrt {3}}{{\sqrt {\pi }}\,x}}\exp \left(-{\tfrac {16}{27}}x^{-2}\right)W_{{\frac {1}{2}},{\frac {1}{6}}}\left({\tfrac {32}{27}}x^{-2}\right)\\[8pt]f\left(x;{\tfrac {3}{2}},1,1,0\right)&={\begin{cases}{\frac {\sqrt {3}}{{\sqrt {\pi }}\,|x|}}\exp \left({\frac {1}{27}}x^{3}\right)W_{{\frac {1}{2}},{\frac {1}{6}}}\left(-{\frac {2}{27}}x^{3}\right)&x<0\\{}\\{\frac {\sqrt {3}}{6{\sqrt {\pi }}\,x}}\exp \left({\frac {1}{27}}x^{3}\right)W_{-{\frac {1}{2}},{\frac {1}{6}}}\left({\frac {2}{27}}x^{3}\right)&x\geq 0\end{cases}}\end{aligned}}$
See also
Notes
 The STABLE program for Windows is available from John Nolan's stable webpage: http://academic2.american.edu/~jpnolan/stable/stable.html. It calculates the density (pdf), cumulative distribution function (cdf) and quantiles for a general stable distribution, and performs maximum likelihood estimation of stable parameters and some exploratory data analysis techniques for assessing the fit of a data set.
 libstable is a C implementation for the Stable distribution pdf, cdf, random number, quantile and fitting functions (along with a benchmark replication package and an R package).
 R Package 'stabledist' by Diethelm Wuertz, Martin Maechler and Rmetrics core team members. Computes stable density, probability, quantiles, and random numbers. Updated Sept. 12, 2016.
References
 ^ ^{a} ^{b} B. Mandelbrot, The Pareto–Lévy Law and the Distribution of Income, International Economic Review 1960 https://www.jstor.org/stable/2525289
 ^ Paul Lévy, Calcul des probabilités 1925
 ^ B. Mandelbrot, Stable Paretian Random Functions and the Multiplicative Variation of Income, Econometrica 1961 https://www.jstor.org/stable/pdfplus/1911802.pdf
 ^ B. Mandelbrot, The variation of certain Speculative Prices, The Journal of Business 1963 [1]
 ^ Eugene F. Fama, Mandelbrot and the Stable Paretian Hypothesis, The Journal of Business 1963
 ^ ^{a} ^{b} Mandelbrot, B., New methods in statistical economics The Journal of Political Economy, 71 #5, 421–440 (1963).
 ^ ^{a} ^{b} ^{c} ^{d} ^{e} ^{f} Nolan, John P. "Stable Distributions – Models for Heavy Tailed Data" (PDF). Retrieved 2009-02-21.
 ^ ^{a} ^{b} ^{c} Voit, Johannes (2005). The Statistical Mechanics of Financial Markets. Springer. doi:10.1007/b137351.
 ^ Penson, K. A.; Górska, K. (2010-11-17). "Exact and Explicit Probability Densities for One-Sided Lévy Stable Distributions". Physical Review Letters. 105 (21): 210604. doi:10.1103/PhysRevLett.105.210604.
 ^ ^{a} ^{b} ^{c} ^{d} ^{e} Lihn, Stephen (2017). "A Theory of Asset Return and Volatility Under Stable Law and Stable Lambda Distribution". SSRN.
 ^ B.V. Gnedenko, A.N. Kolmogorov. Limit distributions for sums of independent random variables, Cambridge, Addison-Wesley 1954 https://books.google.com/books/about/Limit_distributions_for_sums_of_independ.html?id=rYsZAQAAIAAJ&redir_esc=y
 ^ Vladimir V. Uchaikin, Vladimir M. Zolotarev, Chance and Stability: Stable Distributions and their Applications, De Gruyter 1999 https://books.google.com/books/about/Chance_and_Stability.html?id=Y0xiwAmkb_oC&redir_esc=y
 ^ Samorodnitsky, G.; Taqqu, M.S. (1994). Stable Non-Gaussian Random Processes: Stochastic Models with Infinite Variance. CRC Press. ISBN 9780412051715.
 ^ ^{a} ^{b} Lihn, Stephen (2017). A Theory of Asset Return and Volatility Under Stable Law and Stable Lambda Distribution. SSRN.
 ^ Zolotarev, V. (1995). "On Representation of Densities of Stable Laws by Special Functions". Theory of Probability & Its Applications. 39 (2): 354–362. doi:10.1137/1139025. ISSN 0040-585X.
 ^ ^{a} ^{b} Peach, G. (1981). "Theory of the pressure broadening and shift of spectral lines". Advances in Physics. 30 (3): 367–474. doi:10.1080/00018738100101467. ISSN 0001-8732.
 ^ Nolan, John P. (1997). "Numerical calculation of stable densities and distribution functions". Communications in Statistics. Stochastic Models. 13 (4): 759–774. doi:10.1080/15326349708807450. ISSN 0882-0287.
 ^ Chambers, J. M.; Mallows, C. L.; Stuck, B. W. (1976). "A Method for Simulating Stable Random Variables". Journal of the American Statistical Association. 71 (354): 340–344. doi:10.1080/01621459.1976.10480344. ISSN 0162-1459.
 ^ Zolotarev, V. M. (1986). OneDimensional Stable Distributions. American Mathematical Society. ISBN 9780821845196.
 ^ Misiorek, Adam; Weron, Rafał (2012). Gentle, James E.; Härdle, Wolfgang Karl; Mori, Yuichi, eds. Heavy-Tailed Distributions in VaR Calculations. Springer Handbooks of Computational Statistics. Springer Berlin Heidelberg. pp. 1025–1059. doi:10.1007/978-3-642-21551-3_34. ISBN 9783642215506.
 ^ Weron, Rafał (1996). "On the Chambers–Mallows–Stuck method for simulating skewed stable random variables". Statistics & Probability Letters. 28 (2): 165–171. doi:10.1016/0167-7152(95)00113-1.
 ^ Janicki, Aleksander; Weron, Aleksander (1994). Simulation and Chaotic Behavior of Alphastable Stochastic Processes. CRC Press. ISBN 9780824788827.
 ^ Mantegna, Rosario Nunzio (1994). "Fast, accurate algorithm for numerical simulation of Lévy stable stochastic processes". Physical Review E. 49 (5): 4677–4683. doi:10.1103/PhysRevE.49.4677.
 ^ Janicki, Aleksander; Kokoszka, Piotr (1992). "Computer investigation of the Rate of Convergence of Lepage Type Series to α-Stable Random Variables". Statistics. 23 (4): 365–373. doi:10.1080/02331889208802383. ISSN 0233-1888.
 ^ Rachev, Svetlozar T.; Mittnik, Stefan (2000). Stable Paretian Models in Finance. Wiley. ISBN 9780471953142.
 ^ Leddon, D., A Statistical Study of Hard X-Ray Solar Flares
 ^ ^{a} ^{b} Garoni, T. M.; Frankel, N. E. (2002). "Lévy flights: Exact results and asymptotics beyond all orders". Journal of Mathematical Physics. 43 (5): 2670–2689. doi:10.1063/1.1467095.
 ^ ^{a} ^{b} Hopcraft, K. I.; Jakeman, E.; Tanner, R. M. J. (1999). "Lévy random walks with fluctuating step number and multiscale behavior". Physical Review E. 60 (5): 5327–5343. doi:10.1103/physreve.60.5327.
 ^ Uchaikin, V. V.; Zolotarev, V. M. (1999). "Chance And Stability – Stable Distributions And Their Applications". VSP. Utrecht, Netherlands.
 ^ Zolotarev, V. M. (1961). "Expression of the density of a stable distribution with exponent alpha greater than one by means of a frequency with exponent 1/alpha". Selected Translations in Mathematical Statistics and Probability (Translated from the Russian article: Dokl. Akad. Nauk SSSR. 98, 735–738 (1954)). 1: 163–167.
 ^ Zaliapin, I. V.; Kagan, Y. Y.; Schoenberg, F. P. (2005). "Approximating the Distribution of Pareto Sums". Pure and Applied Geophysics. 162 (6): 1187–1228. doi:10.1007/s00024-004-2666-3.