Advances in Statistics

Volume 2015, Article ID 581259, 8 pages

http://dx.doi.org/10.1155/2015/581259

## Relative Entropies and Jensen Divergences in the Classical Limit

^{1}La Plata National University and Argentina’s National Research Council (IFLP-CCT-CONICET), C.C. 727, 1900 La Plata, Argentina
^{2}Comisión de Investigaciones Científicas (CIC), Argentina

Received 30 September 2014; Revised 21 December 2014; Accepted 11 January 2015

Academic Editor: Jos De Brabanter

Copyright © 2015 A. M. Kowalski and A. Plastino. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Metrics and distances in probability spaces have proven to be useful tools for physical purposes. Here we use this idea, with emphasis on Jensen Divergences and relative entropies, to investigate features of the road towards the classical limit. A well-known semiclassical model is used, and recourse is made to numerical techniques, via the Bandt and Pompe methodology, to extract probability distributions from the pertinent time-series associated with dynamical data.

#### 1. Introduction

Many problems of quantum and statistical mechanics can be formulated in terms of a distance between probability distributions. A frequently used quantity to compare two probability distributions, which arose in information theory, is the Kullback-Leibler (KL) Divergence [1]. Given two probability distributions $P$ and $Q$, it provides an estimation of how much information $P$ contains relative to $Q$, and it measures the expected number of extra bits required to code samples from $P$ when using a code based on $Q$ rather than a code based on $P$ [2]. Divergences of this kind can also be regarded as entropic *distances*.

It is well known that systems characterized by either long-range interactions, long-term memories, or multifractality are best described by a formalism [3] called Tsallis’ $q$-statistics. The basic quantity is the entropy

$$S_q = \frac{1}{q-1}\left(1-\sum_{i} p_i^{q}\right) \quad (q\in\mathbb{R}),\tag{1}$$

with the $p_i$ being probabilities associated with the different system configurations. The entropic index (or deformation parameter) $q$ describes the deviations of Tsallis entropy from the standard Boltzmann-Gibbs-Shannon (BGS) one. One has

$$S = -\sum_{i} p_i \ln p_i,\tag{2}$$

recovered from (1) in the limit $q\to 1$. The BGS entropy works best for systems composed of subsystems that are either independent or interacting via short-range forces, and whose subsystems can access all the available phase space [4–7]. For systems exhibiting long-range correlations, memory, or fractal properties, Tsallis’ entropy becomes the most convenient quantifier [4–17]. Tsallis relative entropies, studied in [18], are NOT distances in probability space. An alternative tool is the Jensen Shannon Divergence, introduced by Lamberti et al. [2].
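The Tsallis entropy and its BGS limit can be illustrated with a minimal Python sketch (the function name is ours, chosen for this example):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q)/(q - 1); BGS in the limit q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # convention: terms with p_i = 0 contribute nothing
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Boltzmann-Gibbs-Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
# S_q approaches the BGS (Shannon) entropy as the deformation parameter q -> 1
print(tsallis_entropy(p, 1.001), tsallis_entropy(p, 1.0))
```

For $q$ close to 1 the two printed values nearly coincide, as the limit requires.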

How well do these quantifiers statistically describe complex scenarios? To answer, we will apply the above-mentioned quantifiers to a well-known semiclassical system on its path towards the classical limit [19, 20]. The pertinent dynamics displays regular zones, chaotic ones, and other regions that, although not chaotic, possess complex features. The system has been investigated in detail from a purely dynamic viewpoint [20] and also from a statistical one [21–23]. For this a prerequisite emerges: how does one extract information from a time-series (TS) [24]? The data at our disposal always possess a stochastic component due to noise [25, 26], so that different extraction procedures attain distinct degrees of quality. We will employ Bandt and Pompe’s approach [27], which determines the probability distribution associated with a time-series on the basis of the nature of the underlying attractor (see [27] for the mathematical details).
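Although the mathematical details are in [27], the core of the Bandt and Pompe recipe is easy to sketch: slide a window of embedding dimension $D$ over the series, replace each window by the permutation that orders it, and take the relative frequencies of the $D!$ ordinal patterns as the PDF. A minimal Python sketch under these assumptions (function name and defaults are ours):

```python
import itertools
from collections import Counter
import numpy as np

def bandt_pompe_pdf(ts, d=3, tau=1):
    """Ordinal-pattern PDF of a time series (Bandt-Pompe style): each length-d
    window is mapped to the permutation that sorts it; pattern frequencies
    over all d! possible patterns give the probability distribution."""
    ts = np.asarray(ts, dtype=float)
    patterns = [tuple(np.argsort(ts[i:i + d * tau:tau]))
                for i in range(len(ts) - (d - 1) * tau)]
    counts = Counter(patterns)
    n = len(patterns)
    # probabilities over all d! ordinal patterns (zero for unseen ones)
    return {perm: counts.get(perm, 0) / n
            for perm in itertools.permutations(range(d))}

rng = np.random.default_rng(0)
pdf = bandt_pompe_pdf(rng.standard_normal(10_000), d=3)
print(sum(pdf.values()))   # normalization check
```

For white noise all $3! = 6$ patterns come out roughly equiprobable; a deterministic signal concentrates the PDF on few patterns.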

In this paper we employ the normalized versions of the Tsallis relative entropy [10, 18, 28], *to which we add the Jensen Divergences associated with it*. We will compare, via Jensen, (i) the probability distribution functions (PDFs) associated with the solutions of the system’s dynamic equations on their route towards the classical limit [20] with (ii) the PDF associated with the classical solutions. Our present Jensen approach will also be compared to the $q$-Kullback-Leibler one analyzed in [18].

The relative entropies and Jensen Divergences mentioned above are discussed in Section 2, which briefly recapitulates notions concerning the Tsallis relative entropy and the Kullback-Leibler relative one. The Jensen Shannon and generalized Jensen Divergences are also discussed in this section. A simple but illustrative example is considered in Section 2.1. In Section 2.2, we consider normalized forms corresponding to the information quantifiers mentioned in Section 2.1. As a test scenario, the semiclassical system and its classical limit are described in Section 3, and the concomitant results are presented in Section 4. Finally, some conclusions are drawn in Section 5.

#### 2. Kullback and Tsallis Relative Entropies, Jensen Shannon, and Generalized Jensen Divergences

The relative entropies (RE) quantify the difference between two probability distributions $P$ and $Q$ [29]. The best representative is Kullback-Leibler’s (KL) one, based on the BGS canonical measure (2). For two normalized, discrete probability distribution functions (PDF) $P=\{p_i\}$ and $Q=\{q_i\}$ ($i=1,\dots,N$), one has

$$K(P\|Q)=\sum_{i=1}^{N} p_i \ln\frac{p_i}{q_i},\tag{3}$$

with $K(P\|Q)\geq 0$. $K(P\|Q)=0$ if and only if $P=Q$. One assumes that either $q_i\neq 0$ for all values of $i$, or that if one $q_i=0$, then $p_i=0$ as well [30]. In such an instance one takes $0\ln(0/0)=0$ [30] (also, $0\ln 0=0$, of course).
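A direct numerical transcription of the KL Divergence, with the zero-probability conventions just mentioned, might read as follows (a sketch; the function name is ours):

```python
import numpy as np

def kullback_leibler(p, q):
    """K(P||Q) = sum_i p_i ln(p_i/q_i), with the convention 0 ln(0/q_i) = 0.
    Returns infinity if some p_i > 0 where q_i = 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf   # P not absolutely continuous with respect to Q
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

P = [0.7, 0.2, 0.1]
Q = [1/3, 1/3, 1/3]
print(kullback_leibler(P, Q), kullback_leibler(P, P))  # K(P||P) = 0
```

Note the asymmetry: in general `kullback_leibler(P, Q)` differs from `kullback_leibler(Q, P)`, which is precisely what the Jensen construction below is designed to cure.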

KL can be seen as a particular case of the generalized Tsallis relative entropy [10, 18, 28]

$$K_q(P\|Q)=\frac{1}{q-1}\left(\sum_{i=1}^{N} p_i^{q}\, q_i^{1-q}-1\right),\tag{4}$$

recovered when $q\to 1$ [10, 18, 28]. $K_q(P\|Q)\geq 0$ if $q>0$. For $q>0$ one has $K_q(P\|Q)=0$ if and only if $P=Q$. For $q=0$ one has $K_0(P\|Q)=0$ for all $P$ and $Q$.
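The Tsallis relative entropy translates almost verbatim into code; the sketch below (function name ours) also checks numerically that $K_q$ approaches the KL value as $q\to 1$:

```python
import numpy as np

def tsallis_relative_entropy(p, q_dist, q):
    """K_q(P||Q) = (sum_i p_i^q q_i^(1-q) - 1)/(q - 1); reduces to KL as q -> 1."""
    p, qd = np.asarray(p, dtype=float), np.asarray(q_dist, dtype=float)
    mask = p > 0                       # convention: terms with p_i = 0 drop out
    if np.isclose(q, 1.0):
        return float(np.sum(p[mask] * np.log(p[mask] / qd[mask])))
    return float((np.sum(p[mask] ** q * qd[mask] ** (1.0 - q)) - 1.0) / (q - 1.0))

P, Q = [0.7, 0.2, 0.1], [1/3, 1/3, 1/3]
# q -> 1 recovers the Kullback-Leibler value
print(tsallis_relative_entropy(P, Q, 1.001), tsallis_relative_entropy(P, Q, 1.0))
```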

The two entropies (3) and (4) provide an estimation of how much information $P$ contains relative to $Q$ [29]. They can also be regarded as entropic *distances*, alternative means for comparing the distribution $P$ to $Q$. Our two entropies are not symmetric under the exchange $P\leftrightarrow Q$.

So as to deal with *the nonsymmetric nature* of the KL Divergence, Lamberti et al. [2] proposed using the following quantity:

$$JS(P,Q)=\frac{1}{2}\left[K\!\left(P\,\Big\|\,\frac{P+Q}{2}\right)+K\!\left(Q\,\Big\|\,\frac{P+Q}{2}\right)\right]\tag{5}$$

as a measure of the distance between the probability distributions $P$ and $Q$. $JS(P,Q)=0$ if and only if $P=Q$. This quantity verifies $JS(P,Q)=JS(Q,P)$. Moreover, its square root satisfies the triangle inequality [2]. In terms of the Shannon entropy, expression (5) can be rewritten in the form

$$JS(P,Q)=S\!\left(\frac{P+Q}{2}\right)-\frac{1}{2}S(P)-\frac{1}{2}S(Q).\tag{6}$$

We can obtain a generalization of Jensen’s Divergence by using the relative entropy (4) instead of the KL Divergence; that is,

$$JS_q(P,Q)=\frac{1}{2}\left[K_q\!\left(P\,\Big\|\,\frac{P+Q}{2}\right)+K_q\!\left(Q\,\Big\|\,\frac{P+Q}{2}\right)\right].\tag{7}$$
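The Jensen Divergences above combine into a few lines of code. A minimal sketch (function names ours), exhibiting the symmetry that KL itself lacks:

```python
import numpy as np

def relative_entropy(p, q_dist, q=1.0):
    """K_q(P||Q); the q -> 1 case is the Kullback-Leibler Divergence."""
    p, qd = np.asarray(p, dtype=float), np.asarray(q_dist, dtype=float)
    mask = p > 0
    if np.isclose(q, 1.0):
        return float(np.sum(p[mask] * np.log(p[mask] / qd[mask])))
    return float((np.sum(p[mask] ** q * qd[mask] ** (1.0 - q)) - 1.0) / (q - 1.0))

def jensen_divergence(p, q_dist, q=1.0):
    """JS_q(P,Q) = [K_q(P||M) + K_q(Q||M)]/2 with M = (P+Q)/2; symmetric in P, Q."""
    p, qd = np.asarray(p, dtype=float), np.asarray(q_dist, dtype=float)
    m = 0.5 * (p + qd)
    return 0.5 * (relative_entropy(p, m, q) + relative_entropy(qd, m, q))

P, Q = [0.7, 0.2, 0.1], [0.1, 0.3, 0.6]
print(jensen_divergence(P, Q), jensen_divergence(Q, P))  # equal: JS is symmetric
```

Because the mixture $M=(P+Q)/2$ never vanishes where $P$ or $Q$ is positive, $JS$ stays finite even when the KL Divergence itself diverges.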

##### 2.1. An Illustrative Example

A simple scenario will now illustrate the behavior of our quantifiers. Let us evaluate $K_q(P\|Q)$ for the certainty versus the equiprobability case; that is, (i) $P=\{p_i\}$, with $p_1=1$ and $p_i=0$ if $i\neq 1$, and (ii) $Q=\{q_i\}$, with $q_i=1/N$, $i=1,\dots,N$. In this case, (4) adopts the form

$$K_q(P\|Q)=\frac{N^{q-1}-1}{q-1}\tag{8}$$

if $q\neq 1$. For $q\to 1$, that is, the Kullback-Leibler entropy (3) instance, we obtain $K(P\|Q)=\ln N$. For these same PDFs, consider now $JS_q(P,Q)$ given by (7). We find

$$JS_q(P,Q)=\frac{1}{2(q-1)}\left[\left(\frac{2N}{N+1}\right)^{q-1}+\frac{2^{q-1}}{N}\left(N-1+(N+1)^{1-q}\right)-2\right]\tag{9}$$

if $q\neq 1$. In the Jensen Shannon Divergence case ($q\to 1$), given by (6), one has

$$JS(P,Q)=\frac{N+1}{2N}\ln\!\left(\frac{2N}{N+1}\right)+\frac{N-1}{2N}\ln(2N)-\frac{1}{2}\ln N.\tag{10}$$

Let us discuss the behavior of these quantities for large $N$, when $q>1$. We ascertain that $K_q(P\|Q)\to\infty$ (and $K(P\|Q)=\ln N\to\infty$ as well). Instead, the Jensen Divergences attain an asymptotic value

$$JS_q(P,Q)\longrightarrow \frac{2^{q-1}-1}{q-1},\tag{11}$$

and $JS(P,Q)\longrightarrow \ln 2$. Comparing (11) and (8), one notes that for large $N$, $JS_q$ behaves like $K_q$ for $N=2$ ($JS$ behaves like $K$ for $N=2$). What are the consequences on the $q$-dependence? Given reasonable $N$-values, for small $q$, the behavior of $JS_q$ resembles that of $K_q$ (see Figure 2). For larger $q$, the $q$-dependence is quite different (see scales in Figure 3). The asymptotic behavior for both large $q$ and large $N$ is such that $K_q\sim N^{q-1}/(q-1)$, while $JS_q\sim 2^{q-1}/(q-1)$. Thus, scale differences may become *astronomic*.
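The certainty-versus-equiprobability example, including the finite large-$N$ asymptote of the Jensen Divergence, can be checked numerically; a minimal sketch (function names ours), here with $q=2$ and $N=1000$:

```python
import numpy as np

def k_q(p, qd, q):
    """Tsallis relative entropy K_q(P||Q) (Kullback-Leibler in the limit q -> 1)."""
    p, qd = np.asarray(p, float), np.asarray(qd, float)
    m = p > 0
    if np.isclose(q, 1.0):
        return float(np.sum(p[m] * np.log(p[m] / qd[m])))
    return float((np.sum(p[m] ** q * qd[m] ** (1 - q)) - 1) / (q - 1))

def js_q(p, qd, q):
    """Generalized Jensen Divergence built from K_q with mixture M = (P+Q)/2."""
    mix = 0.5 * (np.asarray(p, float) + np.asarray(qd, float))
    return 0.5 * (k_q(p, mix, q) + k_q(qd, mix, q))

q, N = 2.0, 1000
P = np.zeros(N); P[0] = 1.0          # certainty
Q = np.full(N, 1.0 / N)              # equiprobability
print(k_q(P, Q, q), (N ** (q - 1) - 1) / (q - 1))    # closed form (N^(q-1)-1)/(q-1)
print(js_q(P, Q, q), (2 ** (q - 1) - 1) / (q - 1))   # finite large-N asymptote
```

As expected, $K_q$ grows with $N$ (here to $N-1=999$ for $q=2$), while $JS_q$ stays pinned near its $N$-independent asymptotic value.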

In real-life statistical problems, on the basis of our simple but illustrative example, we expect quite different behaviors for $K_q$ and $JS_q$ regarding the $q$-dependence for large $N$. The $q$-variations are much smoother for $JS_q$ than for $K_q$.

In this work we consider (see Sections 3 and 4) a system representing the zeroth-mode contribution of a strong external field to the production of charged meson pairs, more specifically, the road leading to the classical limit. The ensuing dynamics is much richer and more complex. However, we will see that the pertinent difference in the $q$-behaviors of our quantifiers is that of our simple example. The two quantities, (a) the maximum of Figure 1 and (b) the $N$-value in Figure 3, were both chosen so as to coincide with our semiclassical system’s number of states in this case, which facilitates comparison.