ISRN Applied Mathematics
Volume 2013 (2013), Article ID 938545, 8 pages
Average Sample Number Function for Pareto Heavy Tailed Distributions
Department of Mathematics and Computer Science, University of Tebessa, Algeria
Received 21 April 2013; Accepted 9 May 2013
Academic Editors: S.-W. Chyuan and Q. Song
Copyright © 2013 Boukhalfa El-Hafsi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The main purpose of this work is to give the average sample number function for a sequential probability ratio test on the index parameter alpha of stable densities, that is, the mean number of observations required to reach a decision, in the case α > 1. We use the fact that the tails of Lévy-stable distributions are asymptotically equivalent to a Pareto law for large data. Stable distributions are a rich class of probability distributions that allow skewness and heavy tails and have many intriguing mathematical properties. The lack of closed formulas for densities and distribution functions for all but a few parameter values has been a major drawback to the use of stable distributions by practitioners; the only stable distributions whose densities have analytical formulas are the Gauss, Lévy, and Cauchy distributions.
Because they have wide application in many fields, stable distributions have been proposed as a model for many types of physical and economic systems [1–3]. There are several reasons for using a stable distribution to describe a system. The first is where there are solid theoretical reasons for expecting a non-Gaussian stable model. The second reason is the Generalized Central Limit Theorem, which states that the only possible nontrivial limit of normalized sums of independent identically distributed terms is stable. The third argument for modeling with stable distributions is empirical: many large data sets exhibit heavy tails and skewness. The strong empirical evidence for these features, combined with the Generalized Central Limit Theorem, is used by many to justify the use of stable models. There are now reliable computer programs to compute stable densities, distribution functions, and quantiles. With these programs, it is possible to use stable models in a variety of practical problems [1, 2].
In Section 2, we give some definitions, arithmetic properties of stable laws, the numerical calculation of stable densities and distribution functions (Mittnik et al. [4] and Nolan [1, 2]), and a simulation algorithm [1, 5]. In Section 3, we propose a method to calculate the likelihood ratio by applying the Pareto limit of stable densities for large data. Finally, we study the number of observations needed to decide on the best value of the index parameter through its average sample number function in the case α > 1, when the moment of order one exists.
Nolan [1] developed algorithms for computing general stable densities based on integral representations of the densities derived by Zolotarev [6], and Nolan [2, 3] provides a useful program named “STABLE”. With this program, it is possible to use stable models in a variety of practical problems. For Wald’s SPRT [7] approximations, Raghavachari [8] gave exact formulas in the case of an exponential density for testing one parameter value versus another. For a sequential probability ratio test under other distributions, Wald’s approximations [7] are often used, but what about stable distributions? Since there is no closed formula for the densities from which to compute the likelihood ratio, we compare the fit of two Pareto stable models using the sequential probability ratio test.
2. Stable Distributions
Definition 1. A random variable X has an infinitely divisible distribution if and only if, for all n, there exist independent random variables X_{1,n}, …, X_{n,n} with the same law such that X =_d X_{1,n} + ⋯ + X_{n,n}, where =_d means equality in distribution.
Remark 2. The random variables X_{i,n} do not have the same law as X. However, as we see in the following examples, they belong to the same family of distributions.
Example 3. The characteristic function of the normal distribution N(μ, σ²) can be written as φ(t) = exp(iμt − σ²t²/2) = [exp(iμt/n − (σ²/n)t²/2)]^n, the nth power of the characteristic function of a normal distribution N(μ/n, σ²/n).
Example 4. The characteristic function of the Cauchy distribution C(μ, c) can be written as φ(t) = exp(iμt − c|t|) = [exp(iμt/n − (c/n)|t|)]^n, the nth power of the characteristic function of a Cauchy distribution C(μ/n, c/n).
Example 5. The characteristic function of the Poisson distribution P(λ) can be written as φ(t) = exp(λ(e^{it} − 1)) = [exp((λ/n)(e^{it} − 1))]^n, the nth power of the characteristic function of a Poisson law P(λ/n).
Example 6. The characteristic function of the Gamma distribution Γ(a, b) can be written as φ(t) = (1 − it/b)^{−a} = [(1 − it/b)^{−a/n}]^n, the nth power of the characteristic function of a Gamma distribution Γ(a/n, b).
The same is true for the exponential law and the χ² law (both special cases of the Gamma law).
Remark 7. A mixture of finitely many normal distributions is not infinitely divisible.
Theorem 8. A random variable X is the limit of a sum of i.i.d. random variables if and only if X is infinitely divisible.
Proof. The proof is detailed in Shiryayev [9].
Definition 9. A random variable X is stable if, for X_1 and X_2 two independent random variables with the same law as X and any two real positive constants a and b, aX_1 + bX_2 =_d cX + d for some positive constant c and some real d.
Definition 10. A random variable X has a stable distribution if and only if, for all n and any independent identically distributed family X_1, …, X_n with the same law as X, there exist c_n > 0 and d_n real such that X_1 + ⋯ + X_n =_d c_n X + d_n.
Proposition 11. If X is stable, X is infinitely divisible. The converse is false (see, e.g., the Poisson distribution above).
Proof. Just take the random variables X_{i,n} = (X_i − d_n/n)/c_n, where the X_i are independent copies of X. As the X_i are independent, the X_{i,n} are also independent, and by substitution we have X =_d X_{1,n} + ⋯ + X_{n,n}.
Remark 12. This definition coincides with the central limit theorem if X follows a normal law.
Theorem 13. X is stable if and only if X has a domain of attraction; that is, there is a sequence of i.i.d. random variables Y_1, Y_2, …, a sequence of positive real numbers c_n, and a sequence of real numbers d_n, such that (Y_1 + ⋯ + Y_n)/c_n + d_n converges in distribution to X.
Proof. The proof is detailed in Shiryayev [9].
Corollary 14 (Lévy-Khinchin). The characteristic function of a stable random variable admits the following form: φ(t) = exp{iμt − σ^α|t|^α(1 − iβ sign(t) tan(πα/2))} if α ≠ 1, and φ(t) = exp{iμt − σ|t|(1 + iβ(2/π) sign(t) ln|t|)} if α = 1, with 0 < α ≤ 2, −1 ≤ β ≤ 1, σ > 0, and μ real.
Proof. The proof is detailed in Gnedenko and Kolmogorov [10].
A random variable X with a stable law of parameters α, β, σ, and μ is denoted X ∼ S(α, β, σ, μ).
But this representation of the characteristic function, called the standard parameterization, has the disadvantage of not being continuous in all of its parameters. In fact, there is a discontinuity at the points where α = 1 and β ≠ 0; for this reason there are other parameterizations of the characteristic function better adapted to different problems. Consider φ(t) = exp{iμ_0 t − σ^α|t|^α(1 + iβ sign(t) tan(πα/2)(|σt|^{1−α} − 1))} if α ≠ 1, and φ(t) = exp{iμ_0 t − σ|t|(1 + iβ(2/π) sign(t) ln(σ|t|))} if α = 1.
This representation is Zolotarev’s (M) parameterization, denoted S⁰(α, β, σ, μ_0). The parameters α, β, and σ are the same as those of the standard parameterization, and μ and μ_0 are related by μ = μ_0 − βσ tan(πα/2) if α ≠ 1 and μ = μ_0 − βσ(2/π) ln σ if α = 1.
This parameterization is very important because the characteristic function and the cumulative distribution function are continuous with respect to the four parameters; it is also well conditioned for numerical computation.
Another parameterization is given in the literature, in which the parameters α and β are the same as for the standard parameterization, while the remaining parameters are related to σ and μ by simple transformations.
Remark 15. The main disadvantage is that the densities of stable laws are unknown except in three cases:
(1) the Gaussian distribution, where α = 2;
(2) the Cauchy distribution, where α = 1 and β = 0;
(3) the Lévy distribution, where α = 1/2 and β = 1.
But since the implementation of the fast Fourier transform, stable densities are easy to calculate; the density can be approximated by this method [4].
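As a quick numerical illustration (a sketch, not the STABLE program itself; we assume SciPy's levy_stable routine as one general-purpose implementation of such numerical evaluation), the closed-form cases of Remark 15 can be checked against the numerically computed stable density:

```python
from scipy.stats import levy_stable, norm, cauchy

# Check of Remark 15: the numerically computed stable density agrees with
# the closed-form special cases.
# alpha = 2 is Gaussian: the standard stable S(2, beta, 1, 0) is N(0, 2).
gauss_diff = abs(levy_stable.pdf(0.7, 2.0, 0.0) - norm.pdf(0.7, scale=2 ** 0.5))

# alpha = 1 with beta = 0 is the standard Cauchy distribution.
cauchy_diff = abs(levy_stable.pdf(0.7, 1.0, 0.0) - cauchy.pdf(0.7))

# A general symmetric stable density (alpha = 1.5) has no closed form but is
# still computable numerically, and it is symmetric around zero.
pdf_left = levy_stable.pdf(-2.0, 1.5, 0.0)
pdf_right = levy_stable.pdf(2.0, 1.5, 0.0)
```

The variance-2 scaling in the Gaussian check comes from the stable convention φ(t) = exp(−|t|²) at α = 2, which is N(0, 2) rather than N(0, 1).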
2.1. Interpretation of the Parameters of the Characteristic Function of a Stable Law
(1) The parameter α, known as the characteristic exponent or stability index, describes the form of the distribution, that is, the degree of thickness of its tails. The smaller α is, the thicker the tails of the distribution; in other words, the smaller α is, the more we observe very large fluctuations (see Figure 1). The Gaussian distribution has the maximum value α = 2.
(2) The parameter β gives an idea of the asymmetry of the distribution; it is the skewness parameter. If β equals −1 (resp., +1), the distribution is totally asymmetric to the left (resp., to the right). When β = 0, the distribution is symmetric around μ (see Figure 2).
(3) The parameter σ is called a scale factor. The larger σ is, the more volatile the data are (see Figure 3). This parameter stretches or compresses the body of the distribution.
(4) The location parameter μ corresponds, for α > 1, to the mean of the distribution. If β = 0, then μ is the median. In other cases, μ cannot be interpreted directly.
2.2. Arithmetic Properties
Stable random variables have the following properties.
(i) If X_1 ∼ S(α, β_1, σ_1, μ_1) and X_2 ∼ S(α, β_2, σ_2, μ_2) are two independent stable random variables, then X_1 + X_2 follows a stable law S(α, β, σ, μ) with σ = (σ_1^α + σ_2^α)^{1/α}, β = (β_1σ_1^α + β_2σ_2^α)/(σ_1^α + σ_2^α), and μ = μ_1 + μ_2. Note that, if β_1 = β_2, then β = β_1.
(ii) If X ∼ S(α, β, σ, μ) and a is real, then X + a ∼ S(α, β, σ, μ + a).
(iii) If a ≠ 0 and X follows a stable law S(α, β, σ, μ), then aX ∼ S(α, sign(a)β, |a|σ, aμ) for α ≠ 1, and aX ∼ S(1, sign(a)β, |a|σ, aμ − (2/π)aσβ ln|a|) for α = 1, where “∼” means “has the same distribution.”
(iv) X is a symmetric random variable around 0 if and only if the law of X is S(α, 0, σ, 0). We then say that X is a symmetric stable random variable, written SαS.
(v) Let X be a random variable with law S(α, β, σ, μ) with α > 1; then E(X) = μ.
(vi) Let X be a random variable with law S(α, β, σ, 0) with 0 < α < 2; then E|X|^p < ∞ for 0 < p < α and E|X|^p = ∞ for p ≥ α.
In this section, it is assumed that the samples X_1, X_2, …, X_n are independent and identically distributed stable random variables, such that X_i ∼ S(α, β, σ, μ).
2.3. The Asymptotic Behavior
Theorem 16 (see [12]). For X ∼ S(α, β, σ, 0) with 0 < α < 2, −1 < β ≤ 1, and σ > 0, one has lim_{x→+∞} x^α P(X > x) = C_α ((1 + β)/2) σ^α, where C_α = (1 − α)/(Γ(2 − α) cos(πα/2)) and Γ denotes the Gamma function defined by Γ(z) = ∫_0^∞ t^{z−1} e^{−t} dt. By differentiation, for large data, f(x) ≈ αC_α ((1 + β)/2) σ^α x^{−(α+1)}.
As x → −∞, one has lim_{x→+∞} x^α P(X < −x) = C_α ((1 − β)/2) σ^α; then, by differentiation, f(−x) ≈ αC_α ((1 − β)/2) σ^α x^{−(α+1)} for large x.
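The tail limit above can be checked numerically; the sketch below (assuming SciPy's levy_stable for the survival function, with σ = 1 and β = 0, and an illustrative choice α = 1.5 and x = 100) compares x^α P(X > x) with the constant C_α/2:

```python
import math
from scipy.stats import levy_stable

alpha = 1.5
# C_alpha = (1 - alpha) / (Gamma(2 - alpha) * cos(pi * alpha / 2))
C_alpha = (1 - alpha) / (math.gamma(2 - alpha) * math.cos(math.pi * alpha / 2))
tail_const = C_alpha * (1 + 0.0) / 2   # beta = 0, sigma = 1

# x**alpha * P(X > x) should approach tail_const for large x.
x = 100.0
ratio = x ** alpha * levy_stable.sf(x, alpha, 0.0) / tail_const
```

At x = 100 the ratio is already close to 1, since the relative error of the Pareto approximation decays like x^{−α}.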
2.4. Parameter Estimation
2.4.1. Tail Exponent Estimation
There are many methods to estimate the index exponent of stable distributions, but we describe here the regression method, which is based on the fact that, for large enough values of x, the tails of stable laws decay as the power x^{−α}; see (22). We then consider the linear regression log P(X > x) = c − α log x, where the negative of the slope of the fitted line is an estimator of α.
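A minimal sketch of this regression estimator on simulated exact Pareto data (the variable names and the choice of the top 10% of order statistics as the "tail" are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha_true = 1.5
x = rng.pareto(alpha_true, 20000) + 1.0   # standard Pareto: P(X > x) = x**(-alpha)

# log P(X > x) = -alpha * log x, so regress the log empirical survival
# function on log x; the slope of the fitted line estimates -alpha.
xs = np.sort(x)
n = xs.size
surv = 1.0 - np.arange(1, n + 1) / (n + 1.0)  # empirical survival at order stats
k = 2000                                      # keep the top 10% (the tail region)
slope, intercept = np.polyfit(np.log(xs[-k:]), np.log(surv[-k:]), 1)
alpha_hat = -slope
```

For stable data one would apply the same regression only beyond a large threshold, where the Pareto approximation of Theorem 16 holds.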
2.5. Simulation of an -Stable Distribution
To simulate stable laws, there is an algorithm developed by Chambers et al. [5]. It generates a law S(α, β, 1, 0); to obtain a general stable law, it then suffices to make a change of variables.
Step 1. Generate two random variables as follows.
Let U_1 and U_2 be independent and uniform on (0, 1). Then V = π(U_1 − 1/2) is uniform on (−π/2, π/2), and W = −ln U_2 is exponentially distributed with mean 1.
Step 2. Generate a S(α, β, 1, 0) law.
(i) For α ≠ 1, X = S_{α,β} · (sin(α(V + B_{α,β}))/(cos V)^{1/α}) · (cos(V − α(V + B_{α,β}))/W)^{(1−α)/α}, where B_{α,β} = arctan(β tan(πα/2))/α and S_{α,β} = (1 + β² tan²(πα/2))^{1/(2α)}.
(ii) For α = 1, X = (2/π)((π/2 + βV) tan V − β ln((W cos V)/(π/2 + βV))).
Step 3. Generate a S(α, β, σ, μ) law using the transformations Y = σX + μ if α ≠ 1 and Y = σX + (2/π)βσ ln σ + μ if α = 1. For more details see Samorodnitsky and Taqqu [12, page 43].
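For the symmetric case β = 0 used later, the Chambers-Mallows-Stuck algorithm reduces to a short routine; the following is a sketch (the function name is ours):

```python
import numpy as np

def rsym_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric standard stable S(alpha, 0, 1, 0)."""
    v = np.pi * (rng.random(size) - 0.5)   # uniform on (-pi/2, pi/2)
    w = rng.exponential(1.0, size)         # exponential with mean 1
    if alpha == 1.0:
        return np.tan(v)                   # the Cauchy case
    return (np.sin(alpha * v) / np.cos(v) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * v) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(1)
sample = rsym_stable(1.5, 100_000, rng)    # heavy-tailed, symmetric around 0
gauss = rsym_stable(2.0, 100_000, rng)     # alpha = 2 reduces to N(0, 2)
```

At α = 2 the formula simplifies to 2 sin(V)√W, which is exactly a N(0, 2) variable, giving a quick sanity check of the implementation.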
3. Probability Ratio Test Applied to Stable Laws in the Case Where the Index Exceeds 1
We will write a for the characteristic exponent to avoid confusion with the symbols α and β, which denote the type I error and the type II error in Wald’s sequential probability ratio test, and, to simplify the process, we consider symmetric standard alpha-stable random variables X_i, in other terms X_i ∼ S(a, 0, 1, 0), where a is unknown.
We assume that we have a sample of independent random variables X_1, X_2, … which follow the same law S(a, 0, 1, 0) with a unknown.
We want to use the sequential probability ratio test (SPRT) for the hypothesis H_0: a = a_0 against the hypothesis H_1: a = a_1, with a_0 < a_1, and we want a test with type I error α and type II error β.
We choose the values of the boundaries A and B following Wald’s [7] approximations A ≈ (1 − β)/α and B ≈ β/(1 − α). With Λ_n denoting the log-likelihood ratio at the end of the nth observation:
(i) accept H_1 if Λ_n ≥ log A;
(ii) accept H_0 if Λ_n ≤ log B;
(iii) continue the test if log B < Λ_n < log A,
with Λ_n defined in general by Λ_n = Σ_{i=1}^n log(f_{a_1}(X_i)/f_{a_0}(X_i)). In our case of symmetric standard alpha-stable random variables, the Pareto tail approximation f_a(x) ≈ (aC_a/2)x^{−(a+1)} gives Λ_n = n log(a_1 C_{a_1}/(a_0 C_{a_0})) − (a_1 − a_0) Σ_{i=1}^n log X_i, so the continuation region is a band between two parallel straight lines in the plane (n, Σ_{i=1}^n log X_i). In practice, we trace these two parallel lines, and then we place on the same graph the points of coordinates (n, Σ_{i=1}^n log X_i). As long as the points remain in the band defined by the two lines, we continue the sampling; we accept H_1 when the corresponding boundary line is crossed, and we accept H_0 when the other boundary line is crossed.
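The procedure can be sketched as follows, for the likelihood ratio of two exact Pareto densities f_a(x) = a·x^{−(a+1)}, x ≥ 1 (a simplified stand-in for the stable Pareto tail; the function name, error levels, and index values are illustrative):

```python
import numpy as np

def sprt_pareto(data, a0, a1, alpha_err=0.05, beta_err=0.05):
    """Wald SPRT of H0: a = a0 versus H1: a = a1 for Pareto densities a*x**-(a+1)."""
    log_A = np.log((1 - beta_err) / alpha_err)   # accept H1 at or above this level
    log_B = np.log(beta_err / (1 - alpha_err))   # accept H0 at or below this level
    llr = 0.0
    for n, x in enumerate(data, start=1):
        # log f_a1(x) - log f_a0(x) = log(a1/a0) - (a1 - a0) * log x
        llr += np.log(a1 / a0) - (a1 - a0) * np.log(x)
        if llr >= log_A:
            return "accept H1", n
        if llr <= log_B:
            return "accept H0", n
    return "no decision", len(data)

rng = np.random.default_rng(2)
data = rng.pareto(1.2, 5000) + 1.0               # true index a = 1.2 = a0
decision, n_used = sprt_pareto(data, a0=1.2, a1=1.8)
```

With 5% error levels, log A = log 19 ≈ 2.944 and log B = −log 19, and the test typically terminates after a few dozen observations for this separation of indices.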
Example 17. For a sample of independent random variables which follow the same law with a unknown, we want to use the sequential probability ratio test (SPRT) for the hypothesis H_0 against the hypothesis H_1. With the chosen error levels α and β, the two decision boundaries are straight lines: the line for accepting H_1 is drawn as a thick black line and the line for accepting H_0 as a red line (see Figure 4).
Example 18. We assume that we have a sample of independent random variables which follow the same normal distribution with an unknown parameter. We want to use the sequential probability ratio test (SPRT) for the hypothesis H_0 against the hypothesis H_1; if we follow the same steps, we obtain the corresponding decision boundaries.
3.1. Efficiency Function and Average Sample Number Function
Definition 19. The efficiency function of the probability ratio test is L(a) = P_a(accepting H_0), the probability of accepting H_0 when a is the true value of the parameter.
It is possible to obtain additional information on the sequential probability ratio test, especially on the number of observations N needed to reach a decision, which is a discrete random variable that admits moments of all orders; consider the average sample number (ASN) function.
Definition 20. One calls average sample number (ASN) function the map a ↦ E_a(N), the expected number of observations needed for the test to terminate when a is the true value of the parameter.
Theorem 21. Let a be fixed. If E_a(Z) ≠ 0, then E_a(N) = (L(a) log B + (1 − L(a)) log A)/E_a(Z), where Z = log(f_{a_1}(X)/f_{a_0}(X)) and L is defined by (29).
Theorem 22. Let a be fixed. If E_a(Z) = 0 and E_a(Z²) < ∞, then E_a(N) = −(log A)(log B)/E_a(Z²).
In the case of stable densities with a > 1 (see (v) in the arithmetic properties), a simple calculation for a symmetric standard alpha-stable random variable gives E_a(Z) = log(a_1 C_{a_1}/(a_0 C_{a_0})) − (a_1 − a_0) E_a(log X). Then E_a(N) = (L(a) log B + (1 − L(a)) log A)/E_a(Z), where L is the efficiency function. With substitution in (29), finally, under H_0, E_{a_0}(N) ≈ ((1 − α) log B + α log A)/E_{a_0}(Z), because L(a_0) = 1 − α, and under H_1, E_{a_1}(N) ≈ (β log B + (1 − β) log A)/E_{a_1}(Z), because L(a_1) = β.
Example 23. Let α and β be given error levels; then, for the two hypothesized values of the index parameter of symmetric stable distributions, the formulas above give the average sample numbers under H_0 and H_1.
In the case of the normal distribution, that is, α = 2, we use the same steps, but here we have the analytic form of the density function, and the average sample number function is obtained in the same way under H_0 (because L(a_0) = 1 − α) and under H_1 (because L(a_1) = β), with Z now computed from the normal density.
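Wald's approximations above can be evaluated numerically; the sketch below (hypothetical error levels and index values, using E_a[log X] = 1/a for a standard Pareto with index a in place of the stable-tail constants) computes the expected sample sizes under both hypotheses:

```python
import math

def asn_pareto(a, a0, a1, alpha_err=0.05, beta_err=0.05):
    """Wald's approximate ASN for the Pareto SPRT, evaluated at a in {a0, a1}."""
    log_A = math.log((1 - beta_err) / alpha_err)
    log_B = math.log(beta_err / (1 - alpha_err))
    # E_a[Z] for the Pareto log-likelihood increment, using E_a[log X] = 1/a.
    e_z = math.log(a1 / a0) - (a1 - a0) / a
    # Efficiency function at the hypothesised points: L(a0) = 1 - alpha, L(a1) = beta.
    L = (1 - alpha_err) if a == a0 else beta_err
    return (L * log_B + (1 - L) * log_A) / e_z

n0 = asn_pareto(1.2, 1.2, 1.8)   # mean sample size under H0: a = 1.2
n1 = asn_pareto(1.8, 1.2, 1.8)   # mean sample size under H1: a = 1.8
```

Both values are small (a few dozen observations), which is the practical appeal of the sequential test over a fixed-sample design.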
We used the fact that the tails of the Lévy-stable distributions are asymptotically equivalent to a Pareto law for large data to obtain the probability ratio, and we compared the fit of two Pareto stable models using the sequential probability ratio test, which is not possible in the other cases because of the lack of analytical formulas for stable densities; we then applied the theorem giving the average sample number function.
Can one give a bound of the average sample number function of stable distributions in our case, and in other cases?
Can one say that this amount of data is sufficient to decide that the population follows the stable law with the first alpha parameter rather than the law with the second alpha parameter?
One can find many applications of the Pareto stable distributions in fields where the data are very large, such as the flow of profits or revenues in public and private companies with huge capital: banks, insurance companies, and oil companies. Actuaries assess the reliability of instant transactions, test hypotheses, and advise managers; they should, however, be prudent in their decisions when the data are incomplete, in the sense that the mean number of observations required to reach a decision has not been attained. As in the likelihood, we use the log of the data, which simplifies the calculus.
- J. P. Nolan, “An algorithm for evaluating stable densities in Zolotarev's (M) parameterization,” Mathematical and Computer Modelling, vol. 29, no. 10–12, pp. 229–233, 1999.
- J. P. Nolan, “Numerical calculation of stable densities and distribution functions,” Communications in Statistics. Stochastic Models, vol. 13, no. 4, pp. 759–774, 1997.
- S. Rachev and S. Mittnik, Stable Paretian Models in Finance, John Wiley & Sons, New York, NY, USA, 2000.
- S. Mittnik, T. Doganoglu, and D. Chenyao, “Computing the probability density function of the stable Paretian distribution,” Mathematical and Computer Modelling, vol. 29, no. 10–12, pp. 235–240, 1999.
- J. M. Chambers, C. L. Mallows, and B. W. Stuck, “A method for simulating stable random variables,” Journal of the American Statistical Association, vol. 71, no. 354, pp. 340–344, 1976.
- V. M. Zolotarev, One-Dimensional Stable Distributions, American Mathematical Society, Providence, RI, USA, 1986.
- A. Wald, Sequential Analysis, John Wiley & Sons, New York, NY, USA, 1947.
- M. Raghavachari, “Operating characteristic and expected sample size of a sequential probability ratio test for the simple exponential distribution,” Calcutta Statistical Association Bulletin, vol. 14, pp. 65–73.
- A. N. Shiryayev, Probability, vol. 95 of Graduate Texts in Mathematics, Springer, New York, NY, USA, 1984.
- B. V. Gnedenko and A. N. Kolmogorov, Limit Distributions for Sums of Independent Random Variables, Translated from the Russian, annotated, and revised by K. L. Chung, J. L. Doob and P. L. Hsu, Addison-Wesley, Reading, Mass, USA, 1968.
- A. Janicki and A. Weron, Simulation and Chaotic Behavior of α-Stable Stochastic Processes, Marcel Dekker, New York, NY, USA, 1994.
- G. Samorodnitsky and M. S. Taqqu, Stable Non-Gaussian Random Processes, Chapman & Hall, New York, NY, USA, 1994.