Abstract

Cities are not static environments; they change constantly. With respect to urban traffic, the evolution of traffic lights is a journey from mindless automation to increasingly intelligent, fluid traffic management. In the approach presented in this paper, a reinforcement-learning mechanism based on a cost function is introduced to determine optimal decisions for each traffic light, building on the solution given by Larry Page for page ranking in the Web environment (Page et al. (1999)). Our approach is similar to the work presented by Sheng-Chung et al. (2009) and Yousef et al. (2010). We consider that the traffic lights are controlled by servers, and a score for each road is computed with an efficient PageRank approach and used in the cost function to determine optimal decisions. We demonstrate that the cumulative contribution of each car in the traffic respects the main constraint of the PageRank approach, preserving all the properties of the matrix considered in our model.

1. Introduction

The PageRank model [1] uses the structure of the Web to build a Markov chain, considering a nonnegative, irreducible, and aperiodic transition probability matrix (also called a primitive matrix). This property guarantees that, according to the Frobenius test for primitivity [2], a stationary vector exists as a solution of the PageRank problem.

In our approach, the main role of a computer server that serves a traffic light is to compute the congestion cost of different monitored road segments. Congestion is a real effect of the dynamic and unpredictable behavior of drivers, and traffic optimization (in terms of cost reduction) will contribute to partially solving the congestion problem. We consider a directed graph having streets as nodes and introduce the centrality of a road as a measure of the popularity of the road across different paths (a more central road will be part of more driving routes). Two roads are connected if they are part of the same route. This is similar to how Web pages are connected. In the Web model, in general, the pages are connected, so we could introduce the density in order to measure the number of connections relative to the number of resources. In the traffic model, the road density (considered as the number of roads in a city region) is consequently higher, and hence there are more jams. The road density translates in our model to graph density, which has an effect on the graph representation (the graph matrix is not sparse).

The urban growth patterns are considered in [3] as correlated development units able to reproduce the observed morphology of cities and the area distribution of subclusters in an urban system. Other models try to map out fractal models of urban growth [4, 5]. Peterson concluded that “such a model emphasizes how uncoordinated local decision making can generate global patterns that give cities their particular sizes and shapes. It suggests that a city's overall structure can emerge naturally out of individual actions without any coordinating authority.”

In this paper we prove that an adapted PageRank approach over the proposed graph offers a stationary solution for the computation of the centrality of road segments, as a new applied mathematical problem in engineering. Also, we prove that the cumulative effect of each car that offers information about the traffic preserves the properties of the matrix.

2. The PageRank Model for the Urban Traffic

The transition probability matrix is built from the structure of the city, to be stochastic and primitive. The Markov model represents the graph with a square matrix $H$ whose element $h_{ij}$ is the probability of moving from state $i$ (road $i$) to state $j$ (road $j$, a direction connected with road $i$) in a time step [6]. In the matrix $H$ some rows can contain all zeros, which is a natural assumption for cities (a specific route is composed of some road segments). We will replace all zero rows with $\frac{1}{n}e^{T}$, where $e^{T}$ is the row vector of all ones and $n$ is the order of the matrix (the number of roads). Therefore, we will compute a new nonnegative, stochastic matrix $S$ as follows:
$$S = H + a\left(\frac{1}{n}e^{T}\right),$$
where $a$ is the column vector with $a_{i} = 1$ if row $i$ of $H$ contains only zeros and $a_{i} = 0$ otherwise.
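To make this construction concrete, the following minimal sketch (Python/NumPy; the 4-road adjacency matrix is an illustrative assumption, not data from the model) builds $H$ from road connectivity and repairs the all-zero rows to obtain $S$:

```python
import numpy as np

# Toy city with n = 4 road segments (assumed adjacency, for illustration only):
# adj[i][j] = 1 if road j is a direction connected with road i.
adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 0, 0]])          # road 3 has no outbound connections

n = adj.shape[0]
out_degree = adj.sum(axis=1)

# H: h_ij = 1/|O_i| if road i connects to road j, and 0 otherwise.
H = np.zeros((n, n))
nonzero = out_degree > 0
H[nonzero] = adj[nonzero] / out_degree[nonzero, None]

# S = H + a (1/n) e^T: replace every all-zero row with the uniform row.
a = (~nonzero).astype(float)[:, None]
S = H + a * (np.ones((1, n)) / n)

assert np.allclose(S.sum(axis=1), 1.0)  # S is row-stochastic
```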

Now, considering $\pi^{T}$, a row vector that contains the centrality scores of all roads, the PageRank mathematical problem can be written as $\pi^{T} = \pi^{T}G$, where $G = \alpha S + (1-\alpha)\frac{1}{n}ee^{T}$, with $\alpha \in (0,1)$. For the urban traffic model, we can change the uniform vector $\frac{1}{n}e^{T}$ with the more general vector $v^{T}$ (the teleportation vector), $v_{j} > 0$ representing the probability of road $j$ to be considered part of a route. We will define $G = \alpha S + (1-\alpha)ev^{T}$. It is clear that this change makes $G$ irreducible and aperiodic; all nodes are connected (all roads are connected in cities, where the graph of roads is always connected). The additional normalization equation $\pi^{T}e = 1$ ensures that $\pi^{T}$ is a probability vector, so it can be used in order to compute the congestion cost. The PageRank problem is now $\pi^{T} = \pi^{T}G$, $\pi^{T}e = 1$. The row vector $\pi^{T}$ can be found by solving the eigenvector problem $\pi^{T}G = \pi^{T}$, or by solving the homogeneous linear system $\pi^{T}(I - G) = 0^{T}$ together with the normalization equation.
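Continuing the sketch above (reusing S and n; the damping factor alpha and the teleportation vector v below are illustrative assumptions), the centrality vector can be obtained as the dominant left eigenvector of G:

```python
alpha = 0.85                               # damping factor (assumed Web-like value)
v = np.array([0.4, 0.3, 0.2, 0.1])         # teleportation vector: v_j > 0, sum = 1

# G = alpha * S + (1 - alpha) * e v^T
G = alpha * S + (1 - alpha) * np.outer(np.ones(n), v)

# pi^T is the dominant left eigenvector of G, normalized so that pi^T e = 1.
eigvals, eigvecs = np.linalg.eig(G.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()
```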

Theorem 2.1. Let one consider the general problem $\pi^{T} = \pi^{T}G$, $\pi^{T}e = 1$, where $G = \alpha S + (1-\alpha)ev^{T}$ for a specific teleportation vector $v^{T}$ and a specific $\alpha \in (0,1)$. Then, the solution of the problem can be written as
$$\pi^{T} = (1-\alpha)v^{T}(I - \alpha S)^{-1}.$$

Proof. We will start with the initial problem as follows:
$$\pi^{T} = \pi^{T}G = \pi^{T}\left(\alpha S + (1-\alpha)ev^{T}\right) = \alpha\pi^{T}S + (1-\alpha)v^{T},$$
where we have used $\pi^{T}e = 1$ since $\pi^{T}$ is a probability vector. Hence, $\pi^{T}(I - \alpha S) = (1-\alpha)v^{T}$, and, because $I - \alpha S$ is invertible ($\alpha < 1 = \rho(S)$), we obtain
$$\pi^{T} = (1-\alpha)v^{T}(I - \alpha S)^{-1}.$$
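As a quick numerical sanity check of this closed form, still on the toy example above (pi is the eigenvector solution computed in the previous sketch):

```python
# Closed form of Theorem 2.1: pi^T = (1 - alpha) v^T (I - alpha S)^{-1}.
pi_closed = (1 - alpha) * v @ np.linalg.inv(np.eye(n) - alpha * S)
assert np.allclose(pi_closed, pi, atol=1e-8)   # matches the eigenvector solution
```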

Theorem 2.2 (convex combinations of teleportation vectors). For any convex combination of teleportation vectors, $v^{T} = \sum_{i=1}^{m} c_{i}v_{i}^{T}$, with $c_{i} \geq 0$ and $\sum_{i=1}^{m} c_{i} = 1$, one has the convex combination $\pi^{T} = \sum_{i=1}^{m} c_{i}\pi_{i}^{T}$, where $\pi_{i}^{T}$ is the centrality vector associated with $v_{i}^{T}$.

Proof. We have
$$\pi^{T} = (1-\alpha)v^{T}(I-\alpha S)^{-1} = (1-\alpha)\left(\sum_{i=1}^{m} c_{i}v_{i}^{T}\right)(I-\alpha S)^{-1} = \sum_{i=1}^{m} c_{i}(1-\alpha)v_{i}^{T}(I-\alpha S)^{-1} = \sum_{i=1}^{m} c_{i}\pi_{i}^{T}.$$

So, a convex combination ($c_{i} \geq 0$, $\sum_{i} c_{i} = 1$) of teleportation vectors will lead to the same convex combination of PageRank vectors. This also demonstrates that the convex cumulative contribution of each car and traffic light in the traffic follows the main constraint of the PageRank approach, preserving all the properties of the matrix. For our system we make the hypothesis that cars are equipped with wireless devices that run a mobile navigation application. Such a mobile application uses the device's sensors (e.g., GPS sensor and accelerometer) and constantly sends the sensed traffic conditions (traveling speed and position) to a computer server (the car contribution). For the driver, the incentive for using such an application might consist in the dynamic adjustment of his/her traveling path, based on the returned feedback.
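A sketch of this aggregation, continuing the toy example (the two cars' teleportation vectors and the weights c_k are invented for illustration): mixing the teleportation vectors first and then ranking gives exactly the same centrality as ranking per car and then mixing, as Theorem 2.2 guarantees.

```python
# Each car k reports a teleportation vector v_k (hypothetical route preferences);
# the server aggregates them with convex weights c_k (here: two cars, equal weight).
def pagerank_closed(S, v, alpha):
    n = S.shape[0]
    return (1 - alpha) * v @ np.linalg.inv(np.eye(n) - alpha * S)

v_cars = np.array([[0.7, 0.1, 0.1, 0.1],
                   [0.1, 0.1, 0.7, 0.1]])
c = np.array([0.5, 0.5])                  # c_k >= 0, sum(c) = 1

pi_mixed = pagerank_closed(S, c @ v_cars, alpha)                           # mix, then rank
pi_combo = c @ np.array([pagerank_closed(S, vk, alpha) for vk in v_cars])  # rank, then mix
assert np.allclose(pi_mixed, pi_combo)    # identical, as Theorem 2.2 guarantees
```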

An important result was obtained related to the representation of a stochastic traffic bound. The main result proves that the burstiness bound and the bound of the long-term average rate are separately connected to the fractal dimension (the measure of local self-similarity, together with the small-scale factor) and to the Hurst parameter (the measure of long-range dependence, together with the large-scale factor of traffic) [7]. Having this bound provides a basis for reliable detection of signs of DDoS flood attacks, by distinguishing flood traffic from aggregated traffic. Reliably distinguishing DDoS flood traffic from aggregated traffic remains a tough task, mainly due to the effects of flash-crowd traffic [8].

2.1. Convergence

Considering the power method to solve the PageRank problem, we have the following iteration:
$$\pi^{(k+1)T} = \pi^{(k)T}G, \quad k = 0, 1, 2, \ldots$$

Applying this repeatedly gives $\pi^{(k)T} = \pi^{(0)T}G^{k}$. Because $G$ is irreducible and aperiodic, $\lim_{k \to \infty}\pi^{(k)T} = \pi^{T}$ for every initial approximation of the centrality vector $\pi^{(0)T}$ (with $\pi^{(0)T}e = 1$).
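A minimal sketch of this iteration (the 1-norm stopping criterion and the tolerance below are our own assumptions, not prescribed by the model):

```python
import numpy as np

def power_method(G, pi0, tol=1e-10, max_iter=1000):
    """Iterate pi^{(k+1)T} = pi^{(k)T} G until the 1-norm change drops below tol."""
    pi = pi0 / pi0.sum()          # enforce pi^{(0)T} e = 1
    for _ in range(max_iter):
        pi_next = pi @ G
        if np.abs(pi_next - pi).sum() < tol:
            return pi_next
        pi = pi_next
    return pi
```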

Theorem 2.3 (fast PageRank). For any initial approximation of the centrality vector $\pi^{(0)T}$, the power method, $\pi^{(k+1)T} = \pi^{(k)T}G$, will converge.

Proof. If $k$ denotes the number of steps, then we can approximate the error after $k$ steps by $\|\pi^{(k)T} - \pi^{T}\|_{1} \approx \alpha^{k}$, and, considering the precision threshold $\varepsilon > 0$, we have
$$\alpha^{k} < \varepsilon.$$
We can approximate $\pi^{T}$ with the precision $\varepsilon$; thus, we have
$$k > \frac{\log \varepsilon}{\log \alpha}.$$
So, having $0 < \alpha < 1$, we conclude that the method converges.
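For a rough sense of the iteration counts this bound implies (the precision threshold below is an arbitrary choice for illustration):

```python
import math

# alpha^k < eps  =>  k > log(eps) / log(alpha)
def iterations_needed(alpha, eps):
    return math.ceil(math.log(eps) / math.log(alpha))

print(iterations_needed(0.85, 1e-8))   # 114: the Web value needs ~114 steps
print(iterations_needed(0.90, 1e-8))   # 175: alpha closer to 1 converges slower
```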

Another important result that sustains the convergence of this method is represented by the following theorem.

Theorem 2.4. Given that the spectrum of the stochastic and irreducible matrix $S$ is $\{1, \lambda_{2}, \ldots, \lambda_{n}\}$, the spectrum of the primitive stochastic matrix $G = \alpha S + (1-\alpha)ev^{T}$ is $\{1, \alpha\lambda_{2}, \ldots, \alpha\lambda_{n}\}$.

Proof. The proof is given in [6, Theorem 5.1].

This theorem shows that if all values $|\lambda_{i}| \leq 1$, then the new values satisfy $|\alpha\lambda_{i}| \leq \alpha < 1$, so the convergence of the method is preserved for the matrix $G$. Therefore, it is important to have a near-optimal model, and convergence plays an important role here because it leads to fewer iterations to be executed until the algorithm reaches a solution for the centrality vector ($\pi^{T}$) in the PageRank problem.

2.2. Stability

We will discuss the stability of the proposed method as an analysis of the centrality vector $\pi^{T}$, considering the dependency $\pi^{T} = \pi^{T}(\alpha)$ and the derivative $d\pi^{T}(\alpha)/d\alpha$. We have the following theorem.

Theorem 2.5. If $\pi^{T}(\alpha)$ is a centrality vector, then
$$\left|\frac{d\pi_{j}(\alpha)}{d\alpha}\right| \leq \frac{1}{1-\alpha}, \quad j = 1, \ldots, n.$$

Proof. The proof is given in [6, Theorem 7.2].

This theorem makes it apparent that the sensitivity of the centrality vector as a function of $\alpha$ is primarily governed by the size of $\frac{1}{1-\alpha}$. If $\alpha$ is close to 1, then the centrality vector is sensitive to small changes in $\alpha$. As $\alpha$ becomes smaller, the effects of the probabilities given by the teleportation vector $v^{T}$ are increased. In the urban traffic model this is not an accurate situation, because the teleportation vector is computed based on car behavior and depends on its context. So, it is more desirable (at least in this respect) to choose $\alpha$ close to 1 (between 0.75 and 0.90, or 0.85 like in the Web model [1]). However, if $\alpha$ is too close to 1, then the method will be unstable and the convergence rate slows.

Theorem 2.6. One has $\kappa_{\infty}(I - \alpha S) = \dfrac{1+\alpha}{1-\alpha}$.

Proof. We have $\kappa_{\infty}(I - \alpha S) = \|I - \alpha S\|_{\infty}\,\|(I - \alpha S)^{-1}\|_{\infty}$. The $\infty$-matrix norm is the maximum absolute row sum. If node $i$ in the traffic model has a positive number of links and no self-loop (this is true for all cities, since a road does not connect to itself), then $s_{ii} = 0$ and the corresponding diagonal element of $I - \alpha S$ is 1. Then, the off-diagonal elements of row $i$ sum to $\alpha$ in absolute value, so $\|I - \alpha S\|_{\infty} = 1 + \alpha$. Now, we will prove that the row sums of $(I - \alpha S)^{-1}$ are $\frac{1}{1-\alpha}$. Because $(I - \alpha S)e = (1-\alpha)e$, the row sums of $(I - \alpha S)^{-1}$ are $\frac{1}{1-\alpha}$. Based on [2, 9] we have that $(I - \alpha S)^{-1} = \sum_{k \geq 0}\alpha^{k}S^{k}$ is a nonnegative matrix, so $\|(I - \alpha S)^{-1}\|_{\infty} = \frac{1}{1-\alpha}$. So, $\kappa_{\infty}(I - \alpha S) = \frac{1+\alpha}{1-\alpha}$.

In the ideal case $\alpha$ must be 1, but in this case $1-\alpha$ will be 0. If $\alpha$ is too close to 1, then the method will be unstable because $\kappa_{\infty}(I - \alpha S) \to \infty$. For the proposed interval $\alpha \in [0.75, 0.90]$, we have $\kappa_{\infty}(I - \alpha S) \in [7, 19]$, so the matrix is ill-conditioned. For the Web value, $\alpha = 0.85$, we have $\kappa_{\infty}(I - \alpha S) \approx 12.33$.
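The dependence of the condition number on $\alpha$ over the proposed interval can be tabulated directly:

```python
# kappa_inf(I - alpha*S) = (1 + alpha) / (1 - alpha), per Theorem 2.6.
for alpha in (0.75, 0.85, 0.90):
    print(f"alpha = {alpha:.2f}  ->  kappa = {(1 + alpha) / (1 - alpha):.2f}")
# alpha = 0.75  ->  kappa = 7.00
# alpha = 0.85  ->  kappa = 12.33
# alpha = 0.90  ->  kappa = 19.00
```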

3. Decomposition of Traffic Model

Random paths can be observed in different routes in the cities. It is possible to decompose a path into different smaller parts $P_{k}$, so $P = \sum_{k=1}^{m} w_{k}P_{k}$, where $w_{k}$ is a timed weight for the part $P_{k}$ and $|O_{k}|$ is the number of outbound links of the part $P_{k}$. On the other hand, the matrix $H$ can be written as $H = \sum_{k=1}^{m} H_{k}$, where $m$ is the number of different smaller parts (or a range) of a route. The decomposition of the routes has the following significance for the driver: more routes mean more possibilities to choose from, so the driver constantly faces the challenge of choosing the route that best fits his/her own preferences, which is not always easy to do, especially considering the ever-changing dynamic traffic conditions in large cities.

In the decomposition model, there are two suppositions: (i) $\pi^{T}$ is the PageRank solution for each matrix $G_{k}$, so $\pi^{T} = \pi_{k}^{T}$ for all $k$, which means that the centrality of the roads is kept in the decomposition model; or (ii) $\pi_{k}^{T}$ is the PageRank solution for each matrix $G_{k}$, so in general $\pi^{T} \neq \pi_{k}^{T}$, which means that the centrality of the roads is not kept in the decomposition model and is computed as a temporal dependency by the formula $\pi^{T} = \sum_{k=1}^{m} c_{k}\pi_{k}^{T}$.

Theorem 3.1. The decomposition model with $G = \sum_{k=1}^{m} G_{k}$ and conservative centrality $\pi^{T} = \pi_{k}^{T}$, for all $k$, of the PageRank problem is stable.

Proof. We have $\pi^{T} = \pi^{T}G_{k}$, that is, $\pi^{T}(I - G_{k}) = 0^{T}$, for all $k = 1, \ldots, m$. So, the matrix of the resulting system becomes a diagonal matrix with nonzero diagonal elements. So, the stability of the method is proven.

For the second case (ii), the global teleportation vector can be deduced as $v^{T} = \sum_{k=1}^{m} c_{k}v_{k}^{T}$, which highlights the possibility of decomposing a global teleportation vector. This result makes it easy to generate a new centrality vector ($\pi^{T}$) from existing and already computed centrality vectors. This could be applied to the router node placement problem in wireless mesh networks designed for traffic light monitoring [10].

Also, the problem of the stabilization of descriptor systems in continuous time via static output feedback is important for the monitoring phase, in order to collect the elements of the matrix $H$ used for the computation of the matrix $G$. Sufficient conditions for the closed-loop system to be admissible (i.e., stable, regular, and impulse-free) are proposed in [11].

4. Discussions

The new area of research is oriented toward next generation data technologies in support of collective and computational intelligence [12]. Next generation data technologies are used together to capture, integrate, analyze, mine, annotate, and visualize distributed data made available from various community users, in a manner that is meaningful and collaborative for the organization. Also, in this context, the processing and command synthesis for memory-limited complex systems are very important, considering the limitations of traffic lights [13].

The European Commission report “European transport policy for 2010: time to decide” recognizes the important role played by the transport infrastructure in modern European economies and identifies a number of major European policy goals that are expected to help streamline existing transport systems [14]. In particular, the report shows that the development of new infrastructures is no longer a reliable solution, unlike investing in technologies based on intelligent transport systems (ITSs), as predictions show that future traffic congestion can be solved mainly by technology. It is estimated that in Europe alone traffic congestion affects approximately 10% of the existing transport network, with significant financial implications. According to the conclusion of the analysis given by Duranton and Turner [15], “increased provision of interstate highways and major urban roads is unlikely to relieve congestion of these roads.” Thus, it becomes apparent that the way to alleviate congestion and utilize the existing infrastructure more efficiently is through ITSs.

An intelligent transport system can be seen as a cyber-physical networking system made up of various physical systems that are heterogeneous in nature. Data processing and analysis in these systems are the most important aspects for solving difficult problems like traffic congestion. Li and Zhao demonstrate that power laws may yet be a universality of data in these systems, a useful point of view for data modeling [16, 17]. If we consider periodic trends, like traffic evolution patterns, long-range power-law cross-correlation analysis can be applied. Diverse methods, such as moving average and singular value decomposition, can be generalized to eliminate trend effects and can be used for detecting the cross-correlation of traffic signals, as in metropolises where traffic jams are becoming a puzzle for citizens [18].

We envision a society with IT systems that respond to people's actual needs. In particular, the theoretical model can inspire the development of solutions that will impact the domain of transportation. Today, phones can be used to collect information (e.g., evidence about waste and noise) and mobilize people together for a common goal. Systems can further use the sensed information to construct high-level models that enable people to better understand various social and environmental phenomena. The stage for these phone-centric sensing campaigns was recently set by various projects targeting various domains. For traffic, congestion alone can severely impact both the environment and human productivity (because of hours wasted waiting in congestion). Traffic jams cost Europe alone €135 billion a year, and drivers lose five days per year sitting in traffic. This is why mobile phone systems such as MIT Track or Mobile Millennium are already used to provide fine-grained traffic information on a large scale, using mobile phones to collect traffic data, pinpoint specific problems, and facilitate services such as accurate travel time estimation for improved commute planning. We believe that our proposed theoretical model can, in fact, sustain the development of such intelligent transportation systems in the future, to deliver better results with and for the drivers in large urban environments.

5. Conclusions

In this paper we described a new, efficient approach to a classical mathematical problem formulated by Larry Page as a solution for Web search. Our approach adapts this problem to urban traffic and demonstrates that using this method to compute the centrality of each road segment can improve the traffic and reduce congestion. The presented solution can support traffic applications designed for large urban transportation systems. The approach can be implemented in intelligent transportation systems that will eventually help solve problems related to congestion and find better routes for drivers, meaning less time spent in traffic by individual cars, less pollution, and fuel savings. As future work, the decomposition approach of this method will be studied further.

Authors' Contribution

All authors carried out the proofs. All authors conceived of the study and participated in its design and coordination. All authors read and approved the final manuscript.

Acknowledgments

This work was partially supported by the project “ERRIC—Empowering Romanian Research on Intelligent Information Technologies/FP7-REGPOT-2010-1,” ID no. 264207. The research presented in this paper is also supported by the Romanian projects “TRANSYS—Models and Techniques for Traffic Optimizing in Urban Environments” (Contract no. 4/28.07.2010, Project CNCSIS-PN-II-RU-PD ID: 238) and “SORMSYS—Resource Management Optimization in Self-Organizing Large Scale Distributed Systems” (Contract no. 5/28.07.2010, Project CNCSIS-PN-II-RU-PD ID: 201). The work has been cofunded by the Sectorial Operational Program Human Resources Development 2007–2013 of the Romanian Ministry of Labor, Family and Social Protection through the Financial Agreement “POSDRU/89/1.5/S/62557.”