#### Abstract

The Chebyshev-Markov extremal distributions with known moments up to order four are used to improve the Laguerre-Samuelson inequality for finite real sequences. In general, the refined bound depends not only on the sample size but also on the sample skewness and kurtosis. Numerical illustrations suggest that the refined inequality can almost be attained for randomly generated completely symmetric sequences from a Cauchy distribution.

#### 1. Introduction

Let $x_1, \ldots, x_n$ be real numbers with first and second order moments $\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i$ and $s^2 = \frac{1}{n}\sum_{i=1}^{n} (x_i - \bar{x})^2$. The Laguerre-Samuelson inequality (see Jensen and Styan and Samuelson) asserts that for a sample of size $n$ no observation lies more than $\sqrt{n-1}$ standard deviations away from the arithmetic mean; that is,

$$\bar{x} - s\sqrt{n-1} \le x_i \le \bar{x} + s\sqrt{n-1}, \quad i = 1, \ldots, n. \tag{1}$$

Experiments with random samples generated from various distributions on the real line suggest that there is considerable room for improvement if one takes the higher order moments of order $k = 3, 4$ into account. In the present note, we demonstrate that this can be done using the so-called Chebyshev-Markov extremal distributions based on the moments of order three and four or, equivalently, on the (sample) knowledge of the skewness and (excess) kurtosis of a real sequence. The latter quantities are denoted and defined by

$$\gamma_1 = \frac{1}{n s^3}\sum_{i=1}^{n} (x_i - \bar{x})^3, \qquad \gamma_2 = \frac{1}{n s^4}\sum_{i=1}^{n} (x_i - \bar{x})^4 - 3. \tag{2}$$

For example, if $x_1, \ldots, x_n$ is a “symmetric normal” real sequence with vanishing skewness $\gamma_1 = 0$ and kurtosis $\gamma_2 = 0$, then the improvement (3) stated in Example 5 holds. Since the bounds in (1) can be attained, one might argue that (3) is not a genuine improvement because it depends on the property of a sequence to be “symmetric normal.” However, this objection cannot be made if one states improved general bounds of the type (4), with some analytical function depending on all feasible values of $(\gamma_1, \gamma_2)$ and the sample size $n$. According to Arnold and Balakrishnan, the idea of using probability inequalities to derive (1) goes back (at least) to Smith (see Jensen and Styan, Section 2.7). The derivation is very simple. Indeed, consider the discrete uniform random variable $X$ defined by

$$P\!\left(X = \frac{x_i - \bar{x}}{s}\right) = \frac{1}{n}, \quad i = 1, \ldots, n. \tag{5}$$

Clearly, $X$ is a standard random variable (mean zero, variance one), which therefore satisfies the Chebyshev-Markov inequalities (also called Cantelli inequalities):

$$P(X \le x) \le \frac{1}{1 + x^2}, \quad x \le 0; \qquad P(X \ge x) \le \frac{1}{1 + x^2}, \quad x \ge 0. \tag{6}$$

Substituting $x = -\sqrt{n-1}$ into the first inequality and $x = \sqrt{n-1}$ into the second inequality, one gets through combination the Laguerre-Samuelson bound (1).
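The derivation above is easy to check numerically. The following sketch (plain Python; the helper name `ls_bound_holds` is ours, not from the paper) standardizes a sequence as in (5) and verifies the Laguerre-Samuelson bound (1):

```python
import math
import random

def ls_bound_holds(xs):
    """Check the Laguerre-Samuelson inequality: every observation lies
    within sqrt(n-1) standard deviations of the mean, where the standard
    deviation is the biased one (dividing by n), as in (1)."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    limit = math.sqrt(n - 1)
    # small tolerance for the equality case, which is attainable
    return all(abs(x - mean) <= limit * s + 1e-12 for x in xs)

random.seed(0)
sample = [random.gauss(0, 1) for _ in range(50)]
print(ls_bound_holds(sample))  # True for any real sequence
```

The equality case is attained by sequences with $n-1$ equal values and one outlier, e.g. `[0, 0, 0, 10]`, which is why the refinement below must bring additional moment information into play.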
Along the same line of proof, we derive in Section 3 a refinement of the type (4) by considering the generalized Chebyshev-Markov inequalities with known skewness and kurtosis, which are recalled in the preliminary Section 2. The result is illustrated for some sequences of symmetric type. We observe that the new bounds are sometimes rather tight; symmetric sequences from a Cauchy distribution, whose moments do not exist, generate examples of this phenomenon.

#### 2. The Chebyshev-Markov Extremal Distributions by Known Skewness and Kurtosis

Given a partial order between random variables and some class of random variables, it is possible to construct extremal random variables with respect to this partial order, which provide useful information about extreme situations in probabilistic modeling. For example, the classical Chebyshev-Markov inequalities yield the extremal random variables with respect to the usual stochastic order for random variables with a given range and moments known up to a fixed order. Extremal random variables with respect to the increasing convex and other orderings are of similar general interest. A modern account of this topic is found in Hürlimann, Chap. IV.

In the following, a capital letter $X$ denotes a random variable with distribution function $F_X(x) = P(X \le x)$ and survival function $\bar{F}_X(x) = 1 - F_X(x)$. The random variable $X$ is said to precede $Y$ in stochastic order, a relation written as $X \le_{st} Y$, if $F_X(x) \ge F_Y(x)$ for all $x$ in the common support of $X$ and $Y$. The class of all random variables with given support and moments known up to order $m$ is denoted by $D_m$ when the context is clear. For each fixed $x$, denote by $F^{(m)}(x)$ and $F_{(m)}(x)$ the Chebyshev-Markov extremal distributions, which are the solutions of the extremal moment problems

$$F^{(m)}(x) = \max_{X \in D_m} P(X \le x), \qquad F_{(m)}(x) = \min_{X \in D_m} P(X \le x). \tag{7}$$

Random variables $X_*$, $X^*$ with the distribution functions $F^{(m)}$, $F_{(m)}$, respectively, satisfy $X_* \le_{st} X \le_{st} X^*$ for all $X \in D_m$. For example, if $D_2$ is the space of all standard random variables (mean zero, variance one) on the real line, then one has (e.g., Hürlimann, Chap. III, Table 4.1)

$$F^{(2)}(x) = \begin{cases} \dfrac{1}{1 + x^2}, & x < 0, \\[4pt] 1, & x \ge 0, \end{cases} \qquad F_{(2)}(x) = \begin{cases} 0, & x < 0, \\[4pt] \dfrac{x^2}{1 + x^2}, & x \ge 0, \end{cases}$$

which in particular includes inequalities (6).
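Because the standardized empirical distribution of any data set is itself a standard random variable, the $D_2$ bounds above apply to every finite sequence exactly, not merely asymptotically. A minimal numerical illustration (plain Python; the function name is ours):

```python
import math
import random

def empirical_tail_vs_cantelli(xs, t):
    """Empirical upper tail P(X >= t) of the standardized sample,
    together with the Chebyshev-Markov (Cantelli) bound 1/(1+t^2)."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    tail = sum(1 for x in xs if (x - mean) / s >= t) / n
    return tail, 1.0 / (1.0 + t * t)

random.seed(1)
data = [random.expovariate(1.0) for _ in range(1000)]
tail, bound = empirical_tail_vs_cantelli(data, 2.0)
print(tail <= bound)  # holds for every data set and every t > 0
```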

In general, to construct the stochastically ordered extremal distributions (7), it is necessary to solve the optimization problems $\max_{X \in D_m} E[I_x(X)]$ and $\min_{X \in D_m} E[I_x(X)]$, where $I_x(t)$ is the Heaviside indicator function, defined to be 0 if $t \le x$ and 1 otherwise. These belong to the class of extremal moment problems for expected values $E[f(X)]$, where $f$ is a piecewise linear function, and which have been extensively studied in Hürlimann. A general approach to solve these problems is the well-known polynomial majorant (minorant) method. It consists in finding a polynomial $q(x) \ge f(x)$ (resp. $q(x) \le f(x)$) of degree less than or equal to $m$ and in constructing a finite atomic random variable $\hat{X} \in D_m$ such that all atoms of $\hat{X}$ are simultaneously zeros of $q(x) - f(x)$. Indeed, suppose $q$ and $\hat{X}$ have been found such that $q(t) = f(t)$ at all atoms $t$ of $\hat{X}$ and $q(x) \ge f(x)$ (resp. $q(x) \le f(x)$) for all $x$. Then the expected value $E[q(X)]$ depends only on the first $m$ moments, and thus $E[f(\hat{X})] = E[q(\hat{X})]$ necessarily maximizes (minimizes) $E[f(X)]$ over all $X \in D_m$.
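As a minimal worked instance of this method (a standard textbook derivation, included here only for illustration), the Cantelli bound in (6) follows from a quadratic majorant of the Heaviside indicator. For fixed $x > 0$ and any $a > 0$, the polynomial

$$q(t) = \left(\frac{t + a}{x + a}\right)^2$$

satisfies $q(t) \ge 0$ everywhere and $q(t) \ge 1$ for $t \ge x$, so $q$ majorizes the indicator of $\{t \ge x\}$. For a standard random variable $X$ this gives

$$P(X \ge x) \le E[q(X)] = \frac{1 + a^2}{(x + a)^2},$$

and minimizing the right-hand side over $a > 0$ (the minimum is at $a = 1/x$) yields $P(X \ge x) \le \frac{1}{1 + x^2}$. The diatomic random variable with atoms $\{-1/x, x\}$ and probabilities $\{x^2/(1+x^2), 1/(1+x^2)\}$ is standard, has its atoms exactly at the zeros of $q(t) - I_x(t)$, and attains the bound.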

For the present purpose, it suffices to restrict the attention to the construction of the extremal distributions over the space $D_4$ of all standard random variables defined on the real line with known skewness and kurtosis pair $(\gamma_1, \gamma_2)$. Recall that for such a standard random variable one has $E[X^3] = \gamma_1$ and $E[X^4] = \gamma_2 + 3$. The required main result is found in the Appendix of Hürlimann. First of all, the parameters must satisfy the following well-known inequality between skewness and kurtosis (e.g., Pearson, Wilkins, and Guiard):

$$\gamma_2 \ge \gamma_1^2 - 2.$$

Second, the extremal bounds are attained at standard triatomic random variables whose atoms are the zeros of a quartic polynomial. To describe the extremal supports and the associated probabilities one needs the following characterization result.

Proposition 1 (characterization of standard triatomic random variables on the real line with skewness $\gamma_1$ and kurtosis $\gamma_2$). Suppose that $\gamma_2 > \gamma_1^2 - 2$. Then the support points and the associated probabilities of a standard triatomic random variable with skewness $\gamma_1$ and kurtosis $\gamma_2$ are uniquely determined by this pair.

Proof. Consult Hürlimann, proof of Proposition II.2.

The Chebyshev-Markov extremal distributions over the space $D_4$ are determined as follows.

Theorem 2. Under the assumption $\gamma_2 > \gamma_1^2 - 2$ and in the notation of Proposition 1, the distribution functions of the Chebyshev-Markov ordered extremal random variables for the set $D_4$ are described in Table 1.

Proof. Consult Hürlimann, proof of Theorem III.2.
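Before applying these results, note that the admissibility inequality $\gamma_2 \ge \gamma_1^2 - 2$ is never an obstacle for sample moments: it holds automatically for the empirical distribution of any real sequence, as a quick check confirms (plain Python; the helper name is ours):

```python
import math

def sample_skew_kurt(xs):
    """Sample skewness and excess kurtosis as defined in (2),
    using the biased standard deviation (dividing by n)."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    g1 = sum(((x - mean) / s) ** 3 for x in xs) / n
    g2 = sum(((x - mean) / s) ** 4 for x in xs) / n - 3
    return g1, g2

g1, g2 = sample_skew_kurt([1.0, 2.0, 2.0, 3.0, 7.0])
print(g2 >= g1 ** 2 - 2)  # True: the Pearson inequality holds for any sample
```

Equality can only occur for diatomic (two-valued) sequences, so for generic data the assumption of Proposition 1 is satisfied strictly.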

#### 3. A Refinement of the Laguerre-Samuelson Inequality

Let $x_1, \ldots, x_n$ be real numbers with moments up to order four. The mean and standard deviation are $\bar{x}$ and $s$ as in Section 1. The skewness and (excess) kurtosis are denoted by $\gamma_1$ and $\gamma_2$ and defined in (2). As in Section 1, consider the discrete uniform random variable $X$ defined by $P(X = (x_i - \bar{x})/s) = 1/n$, $i = 1, \ldots, n$. Since $X$ is a standard random variable with skewness $\gamma_1$ and kurtosis $\gamma_2$, it satisfies by Theorem 2 the Chebyshev-Markov inequalities of Table 1. Substituting appropriately into the first inequality and into the second inequality, one gets the bounds (15), whose endpoints are both solutions of an equation which, by (13), is equivalent to the quartic equation (16). Since the probability function in (13) is monotone decreasing, the conditions on the lower and on the upper endpoint are each equivalent to a pair of inequalities, which together yield a necessary condition for the validity of (15). The following main result has been shown.

Theorem 3 (generalized Laguerre-Samuelson inequality). Let $x_1, \ldots, x_n$ be real numbers with mean $\bar{x}$, standard deviation $s$, skewness $\gamma_1$, and kurtosis $\gamma_2$ as defined in Section 1. Suppose that the above necessary condition holds and that quartic equation (16) has two real solutions. Then the bounds (15) hold.

It is worthwhile to state separately the special case of “symmetric” sequences with vanishing skewness $\gamma_1 = 0$.

Corollary 4 (generalized Laguerre-Samuelson inequality for symmetric sequences). Let $x_1, \ldots, x_n$ be real numbers with mean $\bar{x}$, standard deviation $s$, vanishing skewness $\gamma_1 = 0$, and kurtosis $\gamma_2$ as defined in Section 1. Then the bound (20) holds.

To illustrate the obtained results, consider “completely symmetric” sequences (21) of length $2n$ of the type $\{y_1, \ldots, y_n, -y_1, \ldots, -y_n\}$, for which $\bar{x} = 0$; hence $\gamma_1 = 0$, and Corollary 4 applies.
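Assuming the completely symmetric construction (21) pairs each value with its negative (our reading of the definition), the vanishing of the mean and the skewness can be checked directly (plain Python; the function names are ours):

```python
import math

def symmetrize(ys):
    """Completely symmetric sequence of type (21): append the negatives,
    so the sample mean and skewness vanish exactly."""
    return list(ys) + [-y for y in ys]

def skew_kurt(xs):
    """Sample skewness and excess kurtosis as defined in (2)."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    g1 = sum(((x - mean) / s) ** 3 for x in xs) / n
    g2 = sum(((x - mean) / s) ** 4 for x in xs) / n - 3
    return g1, g2

xs = symmetrize([0.5, 1.2, 2.7, 4.1])
g1, g2 = skew_kurt(xs)
print(abs(g1) < 1e-9)  # True: skewness vanishes by construction
```

Only the sample kurtosis `g2` then remains as the data-dependent input to the bound of Corollary 4.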

Example 5 (normally distributed completely symmetric sequence). Suppose that the values in sequence (21) are independent and identically standard normally distributed, with sample kurtosis $\gamma_2 \approx 0$. Then Corollary 4 yields approximately the improved bound (3).

Example 6 (completely symmetrized gamma distributed sequence). Suppose that the independent and identically distributed symmetrized values in (21) stem from a gamma distributed sequence with shape parameter $\alpha$. For a completely symmetric sequence (21) of this type the theoretical kurtosis is $\gamma_2 = \frac{(\alpha+2)(\alpha+3)}{\alpha(\alpha+1)} - 3$, since the even moments of the symmetrized values coincide with those of the gamma distribution. For example, in the symmetrized exponential case $\alpha = 1$ one has $\gamma_2 = 3$, and Corollary 4 implies a corresponding approximate (theoretical) bound.

Example 7 (Cauchy distributed completely symmetric sequence). Suppose that the independent and identically distributed symmetrized values in (21) stem from a Cauchy distributed sequence with density $f(x) = \frac{1}{\pi(1 + x^2)}$. Though the moments of the Cauchy distribution and the corresponding theoretical bound (20) do not exist, the sample-based improved Laguerre-Samuelson bound computed with the sample kurtosis $\gamma_2$ remains valid. Table 2 illustrates this numerically.

Table 2 reports some typical simulations of completely symmetric sequences. They all show a substantial improvement over the Laguerre-Samuelson inequality. Moreover, Example 7 of completely symmetric sequences from a Cauchy distribution shows that the improved bound can almost be attained for some random sequences. On the other hand, the simulations for Examples 5 and 6 suggest that further improvement is possible: according to Hürlimann, Theorem IV.2.1, the Chebyshev-Markov extremal distributions with known higher order moments predict still better, though even more complex, bounds.
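The Cauchy experiment is easy to reproduce in outline (plain Python; this is our illustrative sketch, not the setup of Table 2, and the seed and sample size are arbitrary). It generates a completely symmetric Cauchy sequence and compares the largest standardized observation with the classical Laguerre-Samuelson limit $\sqrt{n-1}$:

```python
import math
import random

def max_standardized_deviation(xs):
    """Largest |x_i - mean| / s over the sequence."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return max(abs(x - mean) / s for x in xs)

random.seed(42)
# tan(pi*(U - 1/2)) with U uniform on (0,1) is a standard Cauchy variate
ys = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(50)]
xs = ys + [-y for y in ys]          # completely symmetric sequence (21)

classical = math.sqrt(len(xs) - 1)  # Laguerre-Samuelson limit sqrt(n-1)
observed = max_standardized_deviation(xs)
print(observed <= classical)        # True: (1) always holds; heavy tails
                                    # push the extreme value toward the bound
```

Because a single huge Cauchy draw (together with its mirrored negative) dominates both the extreme value and the standard deviation, such sequences tend to sit close to the refined kurtosis-dependent bound, which is the phenomenon reported for Example 7.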

#### Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

#### Acknowledgment

The author is grateful to a referee for pointing out a minor error.