Advances in High Energy Physics

Volume 2018 (2018), Article ID 8347408, 9 pages

https://doi.org/10.1155/2018/8347408

## Azimuthal Anisotropy in High-Energy Nuclear Collision: An Approach Based on Complex Network Analysis

Deepa Ghosh Research Foundation, Kolkata 700031, India

Correspondence should be addressed to Susmita Bhaduri; susmita.sbhaduri@dgfoundation.in

Received 25 October 2017; Revised 19 December 2017; Accepted 15 January 2018; Published 11 February 2018

Academic Editor: Edward Sarkisyan-Grinbaum

Copyright © 2018 Susmita Bhaduri and Dipak Ghosh. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The publication of this article was funded by SCOAP^{3}.

#### Abstract

Recently, the complex-network-based visibility graph method has been applied to confirm the scale-freeness and the presence of fractal properties in the process of multiplicity fluctuation. Analysis of data from hadron-nucleus and nucleus-nucleus interaction experiments yields values of the *Power of Scale-Freeness of Visibility Graph (PSVG)* parameter extracted from the visibility graphs. Here, relativistic nucleus-nucleus interaction data have been analysed to detect *azimuthal anisotropy* by extending the visibility graph method and extracting the average clustering coefficient, one of the important topological parameters, from the graph. Azimuthal distributions corresponding to different pseudorapidity regions around the central pseudorapidity value are analysed using this parameter. We attempt to correlate the conventional physical significance of this coefficient in complex network systems with some basic notions of particle production phenomenology, such as clustering and correlation. Earlier methods for detecting anisotropy in the azimuthal distribution were mostly based on the analysis of statistical fluctuations. In this work, we attempt to obtain deterministic information on the anisotropy of the azimuthal distribution by means of precise determination of a topological parameter from a complex network perspective.

#### 1. Introduction

Many authors have probed the azimuthal anisotropy of the particles produced in ultrarelativistic heavy-ion collisions as a function of transverse momentum, and it has been used as one of the major observables for studying the *collective properties* of nuclear matter (see, e.g., [1]). The initial volume enclosing the interacting nucleons is essentially anisotropic in coordinate space because of the geometry of noncentral heavy-ion collisions. This initial coordinate-space anisotropy of the overlap zone of the colliding nuclei, in which the produced nuclear matter thermalizes, is transformed via mutual interactions into a final-state anisotropy in momentum space. This area has been a field of immense interest in the recent past.

The azimuthal anisotropic distribution in momentum space has been analysed using a Fourier series [2], where the first few harmonics are referred to as directed flow, elliptic flow, and so on; in general, different harmonics have different symmetry planes. In the case of an idealized initial geometry of a heavy-ion interaction, all symmetry planes coincide with the reaction plane of the collision, which is spanned by the impact parameter and the beam axis. Using just the orthogonality properties of trigonometric functions, the Fourier series can produce nonvanishing flow harmonics, but these alone cannot establish whether the azimuthal anisotropy in momentum space originates from a collective anisotropic flow or from some other, entirely unrelated physical process capable of yielding event-by-event anisotropies (for example, minijets) [3]. Hence, more rigorous attempts were made to analyse collective behaviour from different perspectives which could disentangle it from processes that normally involve only a smaller subset of the produced particles, termed *nonflow* [3]. The use of correlation-based techniques involving two or more particles eventually led to multiparticle correlation techniques. Recently, Bilandzic et al. have suggested that if all produced particles are independently emitted and correlated only to a few common reference planes, then the presence of *azimuthal anisotropy* can be confirmed [3]. This has already been confirmed mathematically in [4]. Sarkisyan [5] has analysed the parametrization of the multiplicity distributions of hadrons produced in high-energy interactions to describe higher-order genuine correlations [6] and established the necessity of incorporating multiparticle correlations with the property of self-similarity to achieve a good description of the measurements. Wang et al. [7] and Jiang et al. [8] were the first to go beyond two-particle azimuthal correlations in experimental analysis. However, their technique did not scale to larger numbers of particles in such multiplets. The joint probability distribution of the particles in an event of given multiplicity was applied theoretically, for the first time, in the flow analysis of global event shapes [4] and then in other studies [1]. Borghini et al. have further reported a series of analyses of multiparticle correlations and cumulants [9]. Two- and multiparticle cumulants had drawbacks stemming from trivial and nonnegligible contributions from autocorrelations, which generate interference among the various harmonics. The Lee-Yang Zero (LYZ) method [10, 11] then filters out the authentic multiparticle estimate for the flow harmonics, equivalent to the asymptotic behaviour of the cumulant series, but this approach has its own inherent systematic biases. More recently, Bilandzic et al. proposed the Q-cumulants, implementing Voloshin's fundamental idea of expressing multiparticle azimuthal correlations in terms of Q-vectors evaluated for different harmonics [12]. Although the previous drawbacks are partially removed, the calculation is very tedious and hence could be accomplished only for a small subset of multiparticle azimuthal correlations. Bilandzic et al. have since provided a generic framework which allows all multiparticle azimuthal correlations to be evaluated analytically, with a fast single pass over the data [3]. It removed the previous limitations, and new multiparticle azimuthal observables could be obtained experimentally. However, a systematic bias has been found in this method when all particles are divided into two groups, one of reference particles and the other of particles of interest.
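To make these quantities concrete, the following is a minimal, self-contained sketch of (i) estimating a flow harmonic v_n = ⟨cos n(φ − Ψ)⟩ with respect to a known symmetry plane, and (ii) the standard Q-vector single-pass estimate of the two-particle azimuthal correlation ⟨2⟩ = (|Q_n|² − M)/(M(M − 1)). The toy event generator, multiplicity, event count, and v₂ value are purely illustrative assumptions, not data from this paper:

```python
import numpy as np

rng = np.random.default_rng(1)
V2_TRUE = 0.1  # assumed toy elliptic-flow coefficient used to generate events

def sample_event(m):
    """Draw m azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2*phi)
    (symmetry-plane angle fixed at 0) by accept-reject sampling."""
    out = np.empty(0)
    while out.size < m:
        cand = rng.uniform(0.0, 2.0 * np.pi, 4 * m)
        accept = (rng.uniform(0.0, 1.0 + 2.0 * V2_TRUE, cand.size)
                  < 1.0 + 2.0 * V2_TRUE * np.cos(2.0 * cand))
        out = np.concatenate([out, cand[accept]])
    return out[:m]

def vn_event_plane(phi, n, psi=0.0):
    """v_n = <cos(n*(phi - Psi))> with respect to a known symmetry plane Psi."""
    return np.mean(np.cos(n * (np.asarray(phi) - psi)))

def two_particle_corr(phi, n):
    """Single-event <2> = <exp(i*n*(phi_j - phi_k))>, j != k, computed in one
    pass from the Q-vector Q_n = sum_k exp(i*n*phi_k), with the M
    self-correlation pairs (j == k) removed."""
    phi = np.asarray(phi)
    m = phi.size
    qn = np.sum(np.exp(1j * n * phi))
    return (np.abs(qn) ** 2 - m) / (m * (m - 1))

events = [sample_event(500) for _ in range(300)]
v2_ep = np.mean([vn_event_plane(ev, 2) for ev in events])
v2_2 = np.sqrt(np.mean([two_particle_corr(ev, 2) for ev in events]))
print(f"v2 (known plane) ~ {v2_ep:.3f}, v2{{2}} ~ {v2_2:.3f}")  # both close to 0.1
```

Because the toy particles are correlated only through the common symmetry plane (pure flow, no nonflow), both estimates recover the input v₂; in real data the difference between such estimators is precisely where nonflow biases enter.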

The evidence of self-similar characteristics in high-energy interactions is connected to the idea of fractality. In view of this, the study of *azimuthal anisotropy* can also be attempted using methods based on the fractality of a complex system. This line of work started with the introduction of intermittency by Bialas and Peschanski [13] for the analysis of large fluctuations, where the power-law behaviour (indicating self-similarity) of the factorial moments with decreasing size of phase-space intervals was confirmed. A relationship between the anomalous fractal dimension and the intermittency indices was established by Paladin and Vulpiani [14]. The evolution of scaling laws (and thereby self-similarity) in small phase-space domains has been reviewed in terms of particle correlations and fluctuations in different high-energy multiparticle collisions by De Wolf et al. [15], and eventually a relationship between fractality and intermittency in multiparticle final states was established. The built-in cascading mechanism of the multiparticle production process [16] naturally gives rise to a fractal structure forming a spectrum of fractal dimensions, and hence the presence of scale invariance in the hadronization process is evident. Further, various methods based on fractal theory have been used to examine multiparticle emission data [14, 17–20]; two of them, the G-moment and T-moment methods, have been implemented extensively by Ghosh et al. for similar systems [21]. Subsequently, techniques like the Detrended Fluctuation Analysis (DFA) method [22] and the Multifractal-DFA (MF-DFA) method [23] were introduced for analysing fractal and multifractal behaviour of fluctuations in high-energy interactions.
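As one concrete example from this family of methods, ordinary (first-order) DFA can be sketched as follows: integrate the mean-subtracted series into a profile, detrend it in non-overlapping windows of size s, and read the scaling exponent off the slope of log F(s) versus log s. The series length, window sizes, and the synthetic uncorrelated-noise input (for which the expected exponent is about 0.5) are illustrative choices, not parameters from this paper:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended Fluctuation Analysis: returns the fluctuation
    function F(s) for each window size s in `scales`."""
    y = np.cumsum(x - np.mean(x))              # profile of the series
    F = []
    for s in scales:
        n = len(y) // s                        # number of non-overlapping windows
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        res = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)   # local polynomial trend
            res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    return np.array(F)

rng = np.random.default_rng(2)
x = rng.normal(size=20000)                     # uncorrelated (white) noise
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
H = np.polyfit(np.log(scales), np.log(F), 1)[0]  # scaling exponent from F(s) ~ s^H
print(round(H, 2))  # close to 0.5 for uncorrelated noise
```

Long-range correlated (fractal) series give exponents away from 0.5, which is exactly the kind of scaling behaviour the visibility graph approach probes by graph-topological means instead.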

Recently, novel approaches to the analysis of complex networks have been proposed. Various natural systems can be termed complex and heterogeneous, consisting of various kinds of fundamental units which communicate with one another through varied interactions (namely, long-range and short-range interactions). Complex-network-based methods provide a quantitative model for large-scale natural systems (in fields such as physics, biology, and the social sciences). The topological parameters extracted from these networks provide important information about the nature of the underlying real system. The latest advances in the field of complex networks have been reviewed, and analytical models for random graphs, small-world networks, and scale-free networks have been analysed in the recent past [24, 25]. Havlin et al. have reported the relevance of network science to the analysis, perception, design, and repair of multilevel complex systems found in man-made and human social systems, in organic and inorganic matter, on various scales (from nano to macro), and in natural and anthropogenic systems [26]. Zhao et al. have investigated the dynamics of the stock market using correlation-based networks and, using heterogeneous time scales, identified global expansion and local clustering market behaviours during crises [27].

Lacasa et al. have introduced a very interesting method of visibility graph analysis [28, 29] that has gained importance because of its completely different, rigorous approach to estimating fractality. They applied classical methods of complex network analysis to measure long-range dependence and fractality of a time series [29]. Using fractional Brownian motion (fBm) and fractional Gaussian noise (fGn) series as a theoretical framework, they experimented on real time series from various scientific fields. They converted fBm and fGn series into scale-free visibility graphs whose degree distributions are functions of the Hurst parameter, which is associated with the fractal dimension (the degree of fractality) of the time series and can be deduced from the Detrended Fluctuation Analysis (DFA) of the series [23]. Recently, multiplicity fluctuations in π⁻-AgBr interactions at an incident energy of 350 GeV and in S-AgBr interactions at an incident energy of 200A GeV have been analysed using the visibility graph method [30], and the fractality of the void probability distribution in S-AgBr interactions at 200 GeV per nucleon has also been analysed using the same method (see [31] and references therein).

Motivated by the findings of these previous studies, in this work the *azimuthal anisotropy* has been studied for the S-AgBr interaction at 200A GeV by extending the complex-network-based visibility graph method. The average clustering coefficient [32], one of the important topological parameters, is extracted from the visibility graph constructed from the azimuthal-distribution data corresponding to several pseudorapidity regions around the central pseudorapidity. The scale-freeness and the fractal and multifractal properties of the process of multiparticle production have already been confirmed in [33–36] using the DFA and MF-DFA methods. Recently, the complex-network-based visibility graph method has been applied to data from π⁻-AgBr interactions at 350 GeV and S-AgBr interactions at 200A GeV, and by analysing the *Power of Scale-Freeness of Visibility Graph (PSVG)* [28, 29, 37] parameter extracted from the graphs, the scale-freeness and fractal properties of the particle production process have been established [30, 31]. Mali et al. have applied visibility graphs, horizontal visibility graphs, and the sandbox algorithm to analyse multiparticle emission data in high-energy nucleus-nucleus collisions [38, 39]. The topological parameters of the visibility graphs have their usual significance with respect to complex network systems. Here we attempt to correlate the physical significance of the average clustering coefficient with some fundamental notions of particle production phenomenology, such as clustering and correlation. Earlier methods for detecting anisotropy in the azimuthal distribution were mostly based on the analysis of statistical fluctuations. In this work, we therefore attempt to analyse the azimuthal distribution using a complex network approach, which gives more deterministic information about the anisotropy of the azimuthal distribution by means of precise topological parameters.
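The average clustering coefficient used here has its standard network-theoretic definition: for a node i of degree k_i whose neighbours share e_i edges among themselves, C_i = 2e_i / (k_i(k_i − 1)), averaged over all nodes. A minimal sketch of this computation from an adjacency matrix (the two tiny example graphs are chosen purely for illustration):

```python
import numpy as np

def clustering_coefficients(adj):
    """Local clustering coefficients from a symmetric 0/1 adjacency matrix:
    C_i = 2*e_i / (k_i*(k_i - 1)), where e_i, the number of edges among the
    neighbours of node i, equals (A^3)_ii / 2; C_i = 0 where k_i < 2."""
    adj = np.asarray(adj, dtype=float)
    k = adj.sum(axis=1)                                       # node degrees
    tri = np.diagonal(np.linalg.matrix_power(adj, 3)) / 2.0   # e_i per node
    with np.errstate(divide="ignore", invalid="ignore"):
        c = np.where(k > 1, 2.0 * tri / (k * (k - 1.0)), 0.0)
    return c

triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])  # K3: fully clustered
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])      # 3-node path: no triangles
print(clustering_coefficients(triangle).mean())  # 1.0
print(clustering_coefficients(path).mean())      # 0.0
```

A high average value thus signals that neighbours of a node tend to be connected among themselves, which is the topological notion this work relates to clustering in particle production.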

The rest of the paper is organized as follows. The visibility graph algorithm and the significance of complex network parameters such as scale-freeness and the average clustering coefficient are presented in Section 2. The data description and related terminology are given in Section 3.1. The details of our analysis are given in Section 3.2. The physical significance of the network parameter and its prospective correlation with the traditional concepts of *azimuthal anisotropy* in heavy-ion collisions are elaborated in Section 3.3, and the paper is concluded in Section 4.

#### 2. Method of Analysis

In the visibility graph method, a graph is formed from a time or data series according to the visibility of each node from the rest of the nodes [29]. In this way the visibility graph preserves the dynamics of the fluctuations of the data within it. Hence a periodic series is transformed into a regular graph, a random series into a random graph, and, naturally, a fractal series into a scale-free network, in which the graph's degree distribution follows a power law in the degree. Thus a fractal series can be mapped into a scale-free visibility graph [29], and this mapping can be achieved even for a series with a finite number of data points [40]. In contrast, other nonstationary and nonlinear methods like DFA and MF-DFA require, in principle, an infinite number of data points as input for yielding accurate results.

##### 2.1. Visibility Graph Method

Let us suppose that the value of the $i$th point (in the sequence of the input data series) of the time series is $y_i$. In this way all the input data points are mapped to their corresponding nodes or vertices (according to their position and value). In this node series, two nodes $n_a$ and $n_b$, corresponding to the $a$th and $b$th points of the time series, are said to be visible to each other, or in other words joined by a two-way edge, if and only if the following condition holds for every intermediate point $c$ with $a < c < b$:

$$y_c < y_b + (y_a - y_b)\,\frac{b - c}{b - a}. \tag{1}$$

In this way, a visibility graph is constructed out of the time series. Figure 1 shows two nodes $n_a$ and $n_b$ that are visible to each other and connected with a bidirectional edge, as they satisfy (1). It is evident that sequential nodes are always connected, since two consecutive points of the time or data series can always see each other.
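The visibility criterion translates directly into code. A minimal brute-force sketch (the short example series is an arbitrary illustration; each pair of points is linked whenever every intermediate point lies below the straight line joining them):

```python
import numpy as np

def visibility_graph(y):
    """Adjacency matrix of the natural visibility graph of series y:
    nodes a and b are linked iff every intermediate point c (a < c < b)
    satisfies y_c < y_b + (y_a - y_b) * (b - c) / (b - a)."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    adj = np.zeros((N, N), dtype=int)
    for a in range(N):
        for b in range(a + 1, N):
            c = np.arange(a + 1, b)                      # intermediate indices
            line = y[b] + (y[a] - y[b]) * (b - c) / (b - a)
            if np.all(y[c] < line):                      # empty c => always True,
                adj[a, b] = adj[b, a] = 1                # so neighbours connect
    return adj

y = [3.0, 1.0, 2.0, 0.5, 4.0]
A = visibility_graph(y)
print(A.sum() // 2)  # number of visibility edges -> 7 for this series
```

Note that consecutive points are always mutually visible (the intermediate set is empty), matching the observation above; the degree distribution of graphs built this way from experimental series is what the PSVG and clustering analyses operate on.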