Abstract

The range of application of methodologies of complexity science, interdisciplinary by nature, has spread even more broadly across disciplines since the dawn of this century. Specifically, applications to public policy and corporate strategy have proliferated in tandem. This paper reviews the most used complex systems methodologies, with an emphasis on public policy. We briefly present examples, pros, and cons of agent-based modeling, network models, dynamical systems, data mining, and evolutionary game theory. Further, we illustrate specific experiences of large applied projects in macroeconomics, urban systems, and infrastructure planning. We argue that agent-based modeling has established itself as a strong tool within the scientific realm. However, adoption by policy-makers is still scarce. Considering the huge number of exemplary, successful applications of complexity science across the most varied disciplines, we believe policy is ready to become an actual field of detailed and useful applications.

1. Introduction

Generally speaking, complex systems are those in which the sum of the parts is insufficient to describe the macroscopic properties of the system's behavior and evolution [1–3]. Interactions among the parts of the system, at different scales, in a nonhierarchical [4], nonlinear, and self-organizing manner [5] lead to emergent properties [6, 7] that do not have a single, certain unfolding in the future.

Social actions, carried out by millions of individuals interacting in a multitude of ways and through traditional or digital means, and economic processes, in which highly heterogeneous economic actors are interconnected by transactions, ownership relations, competition, and mutualism, are two paradigmatic kinds of complex systems. Policies, as sets of actions to enhance social life and economic processes, are an archetypal example of attempts to steer them. Policies are the product of the interaction of agents and institutions in time and space, in which knowledge of the current state provides only incomplete views of future states of the system. Policy modeling, as an attempt to design the operation of such interactions, presupposes some level of comprehension of the mechanisms, processes, and likely trajectories, while acknowledging the inherent incompleteness of any model of a complex system [8–12].

This view that policies are complex enables the application of complex systems' methodologies to the analysis of public (and private) policy-making. Such application feeds on early contributions and takes many forms, varying from the simple construction of indicators and measures of complexity à la Shannon [13–15], to cellular automata and artificial intelligence [16], to agent-based modeling [17] and network science [18].

This review provides an overview of contemporary applications of policy modeling, following the traditional complex systems' methodological portfolio. Mainly, we focus on agent-based modeling, network science, and data mining. First, we discuss the methodologies themselves and then examine three cases in detail: a large consortium for infrastructure analysis applied to the case of Britain; one of the most consolidated families of macroeconomic agent-based models and its applications to fiscal and monetary policy; and a land-use model in use by metropolitan governance entities across the USA and abroad. Implications for policy modeling close the paper.

2. Policy Modeling Methodologies

Fuentes [19] describes a landscape of eleven distinct methodologies coming from the complex systems sciences (the complexity sciences referred to by Fuentes [19, pp. 55–56] include “nonlinear science, bifurcation theory, pattern formation, network theory, game theory, information theory, super statistics, measures of complexity, cellular automata, agent-based modeling, and data mining”). While that approach is more exhaustive, here we emphasize applications to public policies, thus focusing more extensively on agent-based modeling and cellular automata, data mining, network analysis, and game theory. Other relevant methodologies are considered together in a specific subsection.

2.1. Agent-Based Modeling and Cellular Automata

Agent-based modeling (ABM) is a computational, algorithmic, artificial implementation of agents who interact among themselves and with the environment following a set of rules. As a result of the interaction, the variables that describe the state of the agents may be modified [20–23]. ABM is useful when analytical solutions are too complex or impossible to calculate. Results are sensitive to initial conditions, although in many cases still deterministic; useful interpretation therefore relies on distributions over stochastically repeated simulations of the model and on reasonable validation.
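To make these mechanics concrete, consider a minimal, hypothetical sketch (our own toy illustration, not a model from the cited literature): agents hold wealth, meet pairwise at random, and transfer one unit per encounter; interpretation then rests on summary statistics over repeated stochastic runs, as the paragraph above suggests.

```python
import random
import statistics

def run_exchange_abm(n_agents=100, steps=10_000, seed=None):
    """One stochastic run: agents meet pairwise and transfer one unit of wealth."""
    rng = random.Random(seed)
    wealth = [10] * n_agents          # identical initial conditions
    for _ in range(steps):
        a, b = rng.sample(range(n_agents), 2)
        if wealth[a] > 0:             # a simple local interaction rule
            wealth[a] -= 1
            wealth[b] += 1
    return wealth

# Interpretation relies on distributions over repeated runs, not one trajectory.
ginis = []
for seed in range(30):
    w = sorted(run_exchange_abm(seed=seed))
    n, total = len(w), sum(w)
    # Gini coefficient as a summary statistic of emergent inequality
    gini = sum((2 * i - n + 1) * x for i, x in enumerate(w)) / (n * total)
    ginis.append(gini)

print(f"Gini across runs: mean={statistics.mean(ginis):.3f}, "
      f"stdev={statistics.stdev(ginis):.3f}")
```

Even though each run is fully specified by its seed, the emergent inequality is only meaningful as a distribution across runs, which is the validation posture described above.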

The uniqueness of the ABM methodology led Epstein [24, 25], probably inspired by Ostrom [26], to suggest a third way of doing science: verbal or argumentative reasoning would be the first, mathematical quantification the second, and algorithmic simulation the third [27]. Such a proposal is in line with the views of the philosopher Nicolescu [28], for whom the social sciences (and, therefore, policy-making) have limited room for maneuver when experimenting with populations and individuals. ABM provides precisely that liberty of experimenting in silico, with additional degrees of freedom. Hence, ABM may enhance the capabilities of the social sciences to bridge science and policy.

In fact, agent-based modeling has a number of attributes that probably make it the complex systems method most attached to policy-making. It is flexible, adaptable to empirical analysis, cost-effective, and adequate for ever-changing analysis scenarios [29]. ABM is applied across disciplines in the realm of policy studies, from demography [30] to anthropology [31], and it also gains recognition and scope in economics [27, 32] and international politics [33]. Finally, there is a profusion of available tools specifically designed for modeling [34].

In economics, Dawid and Delli Gatti [35] categorize seven distinct families of macroeconomic models [36–44]. Dosi, Fagiolo, Roventini, and coauthors [41, 45–47] probably lead the most prolific evolutionary branch of economic modeling, whereas Lengnick's is a standalone model proposal that does not include a credit market [42]. Beyond these large macroeconomic models, economic studies also emphasize market-specific models: electricity [48, 49], the labor market [47, 50], economic behavior [51, 52], and the problem of the commons [53]. Further, ABM is used to criticize current quantitative, yet perfect (without endogenous crises), economic models [54–56]. Finally, it is worth mentioning the use of ABM to test policies applied to financial markets, from the early contributions of LeBaron [57, 58] and Westerhoff [59] to the detailing of the effects of transaction taxes and trading halts on asset volatility [60].

Despite all these contributions, institutions and governments are still slow in adopting recommendations. The Bank of England [61], the OECD [62], the European Union [63], and some research institutions (such as MITRE, DARPA, and NECSI) have helped fuel the debate. Although Page [64] reminds us of the importance of understanding the mechanisms underpinning economic phenomena, policy-makers do not seem ready to accept general results and explanations [65–67], rather sticking to precise (yet probably wrong [54]) numbers.

Cellular automata (CA) also go back to the infancy of complexity studies [3, 6, 68, 69]. Their fundamental design targets the analysis of diffusion by contact processes in a deterministic way, when the state of the agents can be described by a finite set of states. Despite constituting a stream of literature in itself, contemporary conceptualizations may consider CA a special case of agent-based modeling in which agents are fixed rather than mobile and their relationships follow an adjacency matrix [70, 71]. Even so, CA are much used for spatial analysis, having once been called “space theory based models” [72].
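A minimal sketch of the deterministic contact process just described (a generic toy, not drawn from the cited CA literature): cells on a fixed grid take one of two states, and an empty cell becomes occupied once any of its fixed neighbors is occupied.

```python
import numpy as np

def step(grid):
    """Deterministic contact rule: an empty cell (0) becomes occupied (1)
    if any of its 4 fixed von Neumann neighbors is occupied; occupied
    cells stay occupied. The neighborhood is the fixed adjacency of CA."""
    padded = np.pad(grid, 1)  # zero-padded borders
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:])
    return np.where((grid == 0) & (neighbors > 0), 1, grid)

grid = np.zeros((21, 21), dtype=int)
grid[10, 10] = 1                      # a single seed at the center
for t in range(5):
    grid = step(grid)
print("occupied cells after 5 steps:", int(grid.sum()))  # diamond-shaped front
```

The agents here are immobile, their states finite, and their relations given entirely by the fixed neighborhood, which is exactly what makes CA a special case of ABM in the conceptualization above.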

In fact, among geographers and spatial analysts, “Land-Use and Transportation” models (LUTs) have yielded results and recommendations for the past two decades. Early models [73, 74] focused on urban development, but those were readily followed by more general land-cover models [75, 76]. Transportation and activity models also intensified their use in the 2000s [77, 78] and became paradigmatic for actual use in urban planning and transportation [79–81], especially by metropolitan bodies.

Advances in the area have been so intertwined that new models have started to feed on different strands of the spatial-modeling literature, bridging transportation and land-use models to macroeconomics [82] and to the actual life cycle of individuals in order to generate individual demand models [83]. The bridging has also been fruitful when crossing automated computing techniques with traditional models [84] or when aiding their validation [85].

2.2. Network Science

Network science studies the structure of a given system, in general by recourse to tools originating in graph theory. Here, nodes describe the system's constituents and edges represent the interactions between them. One may again consider that networks are generalizations of agent-based models in which not only agents but also edges have attributes, with varying properties and lengths. Such connections are abstractions of the typical fixed neighborhood relationships found in CA models. These arguments are purely conceptual and only help highlight the complementarity of complex systems methodologies [86]. In fact, two similar economic models may use either networks [42] or spatial distance [82] as rules for consumer decision-making.

Despite this inherent connection with both ABM and CA, network science has come a long way since its relatively recent birth as an independent yet gregarious discipline, marked by the seminal papers of Watts and Strogatz [87] and Barabási and Albert [88]. In fact, it has developed a large body of literature that has evolved from work on network statistics [89] to studies that describe dynamic changes of the network itself [90].
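A brief sketch using the networkx library shows how the two seminal generative models can be compared on basic structural statistics; the parameter values below are arbitrary illustrations, not values from [87, 88].

```python
import networkx as nx

n = 1000
# Small-world model of Watts and Strogatz [87] (connected variant)
ws = nx.connected_watts_strogatz_graph(n, k=10, p=0.1, seed=42)
# Preferential-attachment model of Barabasi and Albert [88]
ba = nx.barabasi_albert_graph(n, m=5, seed=42)

for name, g in [("Watts-Strogatz", ws), ("Barabasi-Albert", ba)]:
    print(name,
          "| clustering:", round(nx.average_clustering(g), 3),
          "| avg. path length:", round(nx.average_shortest_path_length(g), 2),
          "| max degree:", max(dict(g.degree()).values()))
```

The contrast in clustering and degree heterogeneity between the two models is the kind of "network statistic" the early literature [89] focused on, before attention moved to networks that change over time [90].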

In finance, applied network analysis has helped illuminate likely policy effects of systemic risk within interbank trading. An early work by Battiston et al. [91, p. 2082] showed that local interactions travelling within networks may function as “an alternative mechanism for the propagation of failures”. Subsequent work was able to measure network systemic relevance more precisely [92] and thus test policies, including transparency advocacy [93] and leverage regulation [94]. Together, these analyses have demonstrated with considerable ease the possibilities of simulating alternative policy scenarios.
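A stylized sketch of the propagation mechanism (our own toy illustration, not the model of [91]): banks hold exposures to one another; when a borrower fails, its lenders write off the exposure, and a lender whose accumulated losses exceed its capital buffer fails in turn.

```python
import networkx as nx
import random

def cascade(g, capital, initially_failed, exposure=1.0):
    """Failed borrowers impose write-offs on their lenders; a lender fails
    once cumulative losses exceed its capital buffer."""
    failed = set(initially_failed)
    losses = {bank: 0.0 for bank in g}
    frontier = list(initially_failed)
    while frontier:
        bank = frontier.pop()
        for lender in g.predecessors(bank):   # edge lender -> borrower
            if lender in failed:
                continue
            losses[lender] += exposure
            if losses[lender] > capital[lender]:
                failed.add(lender)
                frontier.append(lender)
    return failed

random.seed(1)
g = nx.gnp_random_graph(50, 0.08, directed=True, seed=1)  # toy interbank network
capital = {bank: random.uniform(0.5, 2.5) for bank in g}  # heterogeneous buffers
failed = cascade(g, capital, initially_failed=[0])
print(f"initial failure of 1 bank propagated to {len(failed)} banks")
```

Rerunning such a cascade under alternative capital requirements or network densities is what makes simulating policy scenarios, such as leverage regulation [94], comparatively easy.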

A current challenge in network science is “understanding the relationship between structure and function” [95, p. 9]. Scientists are trying to comprehend how the topological structure of a given network (how its nodes are connected) influences its systemic functions. An example would be to clarify how the connections among proteins determine a resulting phenotype. Conversely, others [96] are trying to infer the function or purpose of a network from observed data, with applied examples on migration, congressional voting, and the human brain.

2.3. Data Mining

Data mining, or more generally data science, has benefited from continuously decreasing hardware prices, larger software communities, and an abundance of data following the generalization of desktops first and mobile devices more recently, culminating in the ongoing digitalization of society. One could date this quantitative, empirical emphasis back to the 2001 book by Hastie, Tibshirani, and Friedman [97], an effort deepened in 2009 [98]. Quickly, deep learning [99] and neural networks [100] became standard, in part due to available software (and accelerators), such as TensorFlow [101].

There is no doubt about the beneficial effects of data science on social life, in fields such as pinpointing fraudulent actions [102, 103], helping in medical diagnostics [104], training professionals via simulation [105], or aiding mobility through autonomous systems [106]. However, some concerns are also present [107]. Specifically, there are mentions of results without theory, as in the infamous Garbage In, Garbage Out (GIGO) [108].

Such lack of theory is of minor concern for some machine learning scientists, who want solely to achieve the best possible prediction, no matter the processes or prejudices. Conversely, there is the argument that complex “description” [109], hitherto unavailable, may provide new theories by induction, which previously seemed a lackluster source of scientific reasoning.
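The prediction-first stance can be illustrated with a minimal scikit-learn pipeline on synthetic data (an illustrative sketch; the data and classifier choice are our own, and any standard learner would serve): model quality is judged purely on held-out accuracy, with no claim about underlying mechanisms.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for, e.g., fraud-detection features and (rare) labels
X, y = make_classification(n_samples=5000, n_features=20, n_informative=8,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# The only criterion here is out-of-sample performance, not explanation
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

Nothing in this pipeline explains why the features predict the label, which is precisely the theory-free posture, and its GIGO risk, discussed above.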

2.4. Game Theory

Game theory focuses on how a group of agents or elements (which may describe individuals, organizations, economic actors, etc.) interact using strategic decision-making. The two branches one can distinguish in this field are cooperative and noncooperative; a good reference for a discussion on this topic is [110]. It is usually understood that cooperative game theory is applicable when agreements are enforceable, while noncooperative game theory applies otherwise. McCain argues that noncooperative game theory is an effective tool for problem-finding (or a diagnostic method). These observations make game theory a useful methodology for societies that face continuous decision-making processes. Moreover, recent studies suggest that a combination of game theory with psychology and neuroscience has great potential for understanding the mechanisms involved in social decision-making [111]. It is worth mentioning that the connection between game theory and complexity is achieved, or is clearer, when a substantial number of agents interact in a network under game-theoretic dynamics, as in the case of evolutionary game theory [112].
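A minimal sketch of the evolutionary setting just mentioned (our own toy, loosely in the spirit of [112], with invented payoff values): many agents on a network play a prisoner's dilemma against their neighbors and imitate better-scoring neighbors.

```python
import random
import networkx as nx

def payoff(own, other, T=1.5, R=1.0, P=0.1, S=0.0):
    """Prisoner's dilemma payoffs: strategies are 'C' (cooperate), 'D' (defect)."""
    return {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}[(own, other)]

rng = random.Random(7)
g = nx.barabasi_albert_graph(200, m=3, seed=7)       # interaction network
strategy = {v: rng.choice("CD") for v in g}          # random initial strategies

for generation in range(200):
    # Accumulated payoff from playing against all network neighbors
    score = {v: sum(payoff(strategy[v], strategy[u]) for u in g[v]) for v in g}
    # Imitation dynamics: copy a random neighbor if it scored higher
    for v in g:
        u = rng.choice(list(g[v]))
        if score[u] > score[v]:
            strategy[v] = strategy[u]

coop = sum(1 for s in strategy.values() if s == "C") / g.number_of_nodes()
print(f"share of cooperators after 200 generations: {coop:.2f}")
```

It is this coupling of strategic interaction with network structure and adaptation that makes the game-theoretic setting a genuinely complex system.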

2.5. Other Policy Methodologies

Dynamical systems (DS) is a modeling approach in which timed flows among stocks, together with controlled, possibly probabilistic inputs, conform a systemic analysis with feedback [113]. In such a setting, each stock entity is an abstract construction that allows mathematical simulation of future states of the system. There is not, however, heterogeneity within each entity, as in ABM, nor spatial representation, as Batty reminds us [21]. DS was introduced in the 1970s [114] and has accumulated many applications and supporting technologies since then [115]. Even though DS and ABM share some characteristics, they also have important, fundamental differences [116], for instance in applications at different levels of analysis using ABM on complex networks [117]. In those types of systems, the emergence of new characteristics at higher levels is difficult to analyze from first principles, something that is one of the main characteristics of classical DS [118].
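A minimal stock-and-flow sketch (a generic illustration of the approach, not tied to any cited application): a single homogeneous stock with a constant inflow and a feedback-controlled outflow, integrated by Euler steps.

```python
def simulate_stock(inflow=10.0, outflow_rate=0.05, stock0=100.0,
                   dt=0.25, horizon=100.0):
    """System-dynamics-style simulation: one homogeneous stock, flows with
    feedback (outflow proportional to the stock), no agent heterogeneity
    and no spatial representation."""
    stock, t, path = stock0, 0.0, []
    while t < horizon:
        outflow = outflow_rate * stock          # feedback: depends on the state
        stock += (inflow - outflow) * dt        # Euler integration of the flows
        t += dt
        path.append((t, stock))
    return path

path = simulate_stock()
print(f"equilibrium stock = inflow/outflow_rate = {10.0 / 0.05:.0f}; "
      f"final simulated value: {path[-1][1]:.1f}")
```

Note that the stock is a single aggregate number: there is nothing inside it to be heterogeneous, which is exactly the contrast with ABM drawn above.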

Moreover, numerical simulation, microsimulation, or mathematical simulation is also a methodology that solves otherwise intractable analytical equations numerically [119]. It is the simple application of known rules, usually probabilistic ones, to known states, so that the researcher can observe the trajectory and outcomes into the future. Numerical simulation is useful, for example, for understanding the effects of a given tax change on specific sectors or taxpayers. Crooks and Heppenstall [120], however, highlight that microsimulation accounts only for direct effects, the effect of taxes on the market, but not the counterreaction, i.e., the indirect effect of the market now being different from the original one. Numerical simulation has been used in fields ranging from regional economics [121] to fluids and pollution.
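A toy tax microsimulation (our own illustration, with invented income distribution, threshold, and rates): old and new rules are applied record by record to a synthetic taxpayer population, capturing only the direct effect that Crooks and Heppenstall warn about.

```python
import random

random.seed(3)
# Synthetic taxpayer records: annual incomes from a lognormal distribution
incomes = [random.lognormvariate(10.0, 0.8) for _ in range(10_000)]

def tax_due(income, threshold=30_000.0, rate=0.25):
    """Simple bracket rule: a flat rate on income above an exemption threshold."""
    return max(0.0, income - threshold) * rate

baseline = sum(tax_due(y) for y in incomes)
reform = sum(tax_due(y, rate=0.28) for y in incomes)   # simulated rate change

# Direct (first-round) effect only: behavior and market responses stay frozen,
# which is precisely the limitation of microsimulation noted by [120].
print(f"revenue change from reform: {100 * (reform / baseline - 1):+.1f}%")
```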

Before concluding, we highlight that this review is definitely not exhaustive, but covers the methodologies most attached to policy applications. We describe some applications in the following section.

3. Policy Modeling Applications

There has been a wide range of research and output for policy using the various methodologies of complex systems. This special issue is, as far as we know, a first effort to bring together the main strands of literature specifically in the area of policy. In the same vein, the previous section lists a sample of the most referenced publications that discuss policy modeling and applications, in order to help connect leading researchers across different fields. In this section, we dwell longer on three cases of, in our opinion, larger impact and reverberation.

Specifically regarding policy and management, there is a significant difference between planning for a known, well-designed development trajectory and planning within a complex systems environment. “Complexity theory demonstrates that there are fundamental conceptual difficulties in the concepts of ‘planning’ in any open system which contain a significant level of decentralization of decision-making” [122, p. 320]. In other words, referring to fishery governance, systems are neither predictable nor controllable [123]; hence the need to consider ever-changing environments when doing policy-making.

3.1. System of Systems

A consortium of seven leading universities and other partners in the United Kingdom has formed the Infrastructure Transitions Research Consortium (ITRC). ITRC has put together a National Infrastructure Model (NISMOD) that in turn has evolved into a current program named Multiscale Infrastructure Systems Analytics (MISTRAL). All those acronyms depict a large institutional effort aimed at applying “complexity-based methods” to public policy based on a criticism of reductionist science, systems theory, and mainstream neoclassical economics.

ITRC proposed focusing on four main themes (according to the report “Final Results from the ITRC” (2015), available at https://www.itrc.org.uk/wp-content/PDFs/ITRC-booklet-final.pdf; for a longer discussion, see [124]):

(a) Develop a capacity to compare quantitative metrics of infrastructure capacity and demand under a varied number of alternative scenarios, which ITRC calls national strategies, while accounting for interdependent effects among infrastructure sectors

(b) Develop a specific model of vulnerabilities and cascading cumulative failures (and resilience) in connected infrastructure systems

(c) Develop an understanding of the dynamics of infrastructure when coupled with evolving socioeconomic (heterogeneous), spatially specific social groups

(d) Develop sound long-term planning of infrastructure systems

MISTRAL proposes going further with four encompassing challenges (see report [125]): (a) downscale, detail, and emphasize the local complexity of infrastructure; (b) focus on its interdependencies and connections; (c) take the experience abroad and change infrastructure decision-making internationally; and (d) maintain the focus on quantifying the relationships between infrastructure and economic growth.

All in all, the best output through which to follow the development of ITRC's proposal is the book by leading researchers Jim W. Hall, Martino Tran, Adrian J. Hickford, and Robert J. Nicholls [124]. The authors introduce the book by motivating the relevance of infrastructure systems, discussing the challenges of handling infrastructure within a contemporary, advanced-economy, interdependent environment, and outlining their “system of systems” approach.

Such an approach, the authors claim, would equalize both assumptions and metrics across different infrastructure sectors and make them robust against future uncertainties, in accordance with the strategies developed and with scenarios that are outside the control of policy-makers. Further, the system of systems would be able to capture possible risks and vulnerabilities, thus making the infrastructure system more resilient. Hence, better long-term planning would ensue.

According to the proposed framework, the first methodological step is scenario generation. A scenario builds upon a range of possible futures unfolding from demographic, economic, climate-change, and environmental alternatives. As a result, “a complete set (times series) of external parameters defining the boundary conditions” [126, p. 15] is produced. The total number of possible scenarios considering all alternatives amounts to 2,112 combinations. However, given that close scenarios provide very similar results and occur with different probabilities, in practice a set of the three most likely scenarios, with some variants, is employed in the analysis.
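The combinatorics are easy to reproduce in a sketch; the dimension names and level counts below are hypothetical placeholders (the actual ITRC grid, with its 2,112 combinations, uses different factors).

```python
from itertools import product

# Hypothetical scenario dimensions and levels (placeholders, not ITRC's own)
dimensions = {
    "demography": ["low", "central", "high"],
    "economy": ["slow", "baseline", "fast"],
    "climate": ["RCP2.6", "RCP4.5", "RCP8.5"],
    "environment": ["restrictive", "permissive"],
}

# Full factorial: every combination of one level per dimension
scenarios = [dict(zip(dimensions, combo)) for combo in product(*dimensions.values())]
print(f"full factorial: {len(scenarios)} scenarios")   # 3 * 3 * 3 * 2 = 54 here

# In practice only a handful of most likely scenarios (plus variants) is run,
# e.g., ranked by an assumed likelihood score attached to each combination.
```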

Next, strategies, defined as the possible ways to tackle infrastructure provision in terms of planning, investment, and projects, are developed. The proposals need to be based on national policy directives and detailed enough that they can be simulated. Three approaches per sector are considered: (a) demand management, such as regulation of a given sector, which may affect demand; (b) system efficiency, including possible gains from technology adoption, for instance; and (c) capacity expansion, which actually involves physically altering infrastructure assets.

The third methodological step implies the use of detailed models that are specific to each sector but intertwined, with one model's input coming from another model's output. The best description of the system-of-systems method is that of a “family of models” in which communication and linkages are kept consistent, so that policy trade-offs across the full infrastructure system are properly evaluated.
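In code terms, the “family of models” amounts to sector models run in sequence per time step, each consuming the others' latest outputs. A minimal sketch of this coupling pattern follows; the interfaces, coefficients, and sector rules are our own invention, not NISMOD's.

```python
def energy_model(population, efficiency):
    """Sector model: energy demand scales with population, damped by efficiency."""
    return {"energy_demand": 2.0 * population * (1.0 - efficiency)}

def water_model(population, energy_demand):
    """Sector model: water demand depends on population and on the energy
    sector's output (e.g., for cooling), an illustrative interdependency."""
    return {"water_demand": 0.1 * population + 0.05 * energy_demand}

state = {"population": 1000.0, "efficiency": 0.2}
for year in range(2025, 2030):
    state["population"] *= 1.01                    # exogenous scenario driver
    # One model's output becomes another model's input within the same step
    state.update(energy_model(state["population"], state["efficiency"]))
    state.update(water_model(state["population"], state["energy_demand"]))
    print(year, {k: round(v, 1) for k, v in state.items() if k.endswith("demand")})
```

Keeping a single shared state dictionary is what enforces consistent communication between sector models, so that a change in one sector visibly propagates to the others.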

Qualitatively, there are four ways to incorporate changes within the model. When policy-makers propose a change of policy, the proposal enters the model as a strategy change, which ends up as a measured change in demand. When an innovative process is incorporated, efficiency parameters change. When physical infrastructure changes, the capacity (supply) of the system changes. Finally, when exogenous change happens, scenarios change as well. Once all the above steps have been implemented, the system is ready to provide evaluations and prognostics. As the authors put it, a “web-based data-viewer (…) combines and compares performance across sectors, across time, across regions, and across future conditions” [126, p. 24].

The project has certainly spawned a wide range of results and publications (a full reference guide, divided into themes, namely, complex adaptive systems, databases, demographics, digital communications, economic impacts, governance, infrastructure system network risk analysis, and solid waste, can be found at https://www.itrc.org.uk/outputs/research-outputs/). Among the outputs with larger reach are a methodological proposal (with 25 authors) [127] and a study on the link between energy and water [128].

3.2. Macroeconomics Agent-Based Model

One of the seven families of macroeconomic models described by Dawid and Delli Gatti [35], also featured in Dosi and Roventini [129], is coined “Keynes meets Schumpeter” (KS) and is led by Dosi, Fagiolo, Roventini, and coauthors (we have chosen this family to detail, as it seems to have the largest number of stylized facts reproduced and the most prolific production). The model baseline was described in [45] and has a recent consolidation in [130]. A new model-validation proposal uses KS as a case study [85], and policy applications have covered climate change [131], the labor market [46, 132], and monetary policy [133, 134]. A review of policy applications of macroeconomic agent-based models is available in [135].

The great contribution of KS is the ability to endogenously reproduce long-term economic cycles [130] while also maintaining short-term results, thus breaking with the economic paradigm of equilibrium in which crises are only deviations from a supposedly correct natural path [54]. Crises may in fact have long-lasting effects on the economy. Further, the KS model seems especially relevant given that it has been shown to be validated [85]. As such, KS counters the main criticism of agent-based modeling, namely, the lack of validation. Moreover, it provides actual policy case studies that are concrete and well-founded enough to be applied to policy-making.

KS is an attempt to translate conceptual innovation theory into measured, validated output growth, contextually dependent on the macroeconomic conjuncture. The attempt covers both the short term (with indicators such as unemployment) and the long run (such as GDP). The KS approach is novel when compared with traditional macroeconomic Dynamic Stochastic General Equilibrium (DSGE) models, as the traditional ones do not treat technology as an endogenous factor of the model [130].

The model is composed of firms from two sectors (capital and consumption goods), banks, a Central Bank, the labor force, and government. Capital firms drive innovation when investing in R&D and output more efficient, cheaper machines. The government defines tax rates and unemployment subsidy levels, and the Central Bank decides on interest rate levels [130].
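To convey only the innovation channel, here is a drastically simplified toy sketch (our own construction, far from the actual KS model; all parameters and functional forms are invented): capital-good firms spend a share of sales on R&D, successful draws raise machine productivity, and productivity gains feed output growth endogenously.

```python
import random

rng = random.Random(0)

class CapitalFirm:
    """Toy capital-good firm: R&D spending stochastically improves productivity."""
    def __init__(self):
        self.productivity = 1.0
        self.sales = 100.0

    def do_rd(self, rd_share=0.04, success_scale=20.0):
        # Larger R&D budgets raise the chance of an innovation draw
        budget = rd_share * self.sales
        if rng.random() < min(1.0, budget / success_scale):
            self.productivity *= 1.0 + rng.uniform(0.0, 0.05)

firms = [CapitalFirm() for _ in range(50)]
output_path = []
for t in range(100):
    for f in firms:
        f.do_rd()
        f.sales = 100.0 * f.productivity   # demand simply tracks productivity here
    output_path.append(sum(f.sales for f in firms))

growth = (output_path[-1] / output_path[0]) ** (1 / 100) - 1
print(f"endogenous average output growth per period: {growth:.2%}")
```

Even in this caricature, growth is generated inside the model by heterogeneous innovation draws, rather than imposed as an exogenous technology trend, which is the contrast with DSGE highlighted above.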

The authors list 17 stylized facts, both macro- and microeconomic, that help ensure the model's validity for policy analysis. Those stylized facts range from replicating endogenous economic fluctuations, to recession durations, to the cross-correlation of macro- and credit-related variables. Further, in microeconomics, the model mimics firm size distributions, firms' productivity heterogeneity, and bankruptcies, among others [130]. Beyond these resemblances, as previously mentioned, the KS model was used as a case study in [85, p. 138, our emphasis], which compared the structures of vector autoregressive models in a five-step process to show that KS simulations “resemble between 65% and 80% of the causal relation” in observed macroeconomic time series.

A bundle of eleven policy analyses derived from KS is available [46, 133, 134]. They include firms' innovation search capabilities, technological opportunities, patents, new firms' productivity, market selection, and antitrust, among others. Specifically on income inequality, [133] reports that markup setting influences both the dependence on financial support and the split between profits and wages. This mechanism affects macroeconomic stability and growth. The authors thus align with Stiglitz and Piketty, who claim that there is a downward-trending feedback loop in the economy tied to higher levels of inequality. In such a scenario, fiscal austerity has more negative effects when coupled with higher markup levels.

Furthermore, recent applications of the KS model to the labor market [47, 136, 137] have helped show that more rigid markets, with higher levels of protection and less flexible wages, may in fact keep output at increased levels while maintaining lower inequality. Dosi et al. [137] suggest that coordination failures bring wages down, thereby significantly impacting aggregate demand. The authors also suggest [136] that the core reasons for rising unemployment are lower innovation rates, deterioration of workers' skills, and reduced firm entry dynamics. This is also relevant because it goes against typical policy recommendations based on DSGE applications, hitherto supported by international institutions such as the OECD and the IMF, but now under discussion [138].

In sum, the KS model suggests that (a) technological change and markets open to new firms lead to strong positive growth; (b) patent enforcement, however, reduces growth dynamism; and (c) competition is relevant, but with weaker effects. KS further recommends countercyclical fiscal policy (as opposed to fiscal austerity) as a means to achieve (a) unemployment and output stability and (b) “higher growth paths”, whereas fiscal austerity would be detrimental to the economy and to government debt in both the short and the long run.

KS is thus a model that clearly exemplifies a new tradition for a given segment of macroeconomics [135]. It seems to have captured most of the variations of a complex system, the economy, allowing for policy analysis in a validated, stylized-fact-replicating, reasonable model.

3.3. UrbanSim

UrbanSim was first proposed in the mid-1990s [77, 78, 139, 140] as a model focused on feedback effects from land use into transport and back. Its modeling process includes a family of submodels that run independently, feeding on empirical data while exchanging inputs and outputs among themselves. In practice, a GIS interface with a typical 150-meter grid works as a single unit summing up households and firms. The land price model follows typical neoclassical urban economics [141] and hedonic price regression modeling [142], which is updated at each year-step of the model.
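The hedonic step can be sketched with ordinary least squares on synthetic grid-cell data (an illustration only; the attributes, coefficients, and scales are invented and not those of [142]): cell prices are regressed on attributes, and the fitted model re-prices cells at each simulated year-step.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 2000

# Synthetic grid-cell attributes: households, jobs, distance to center (km)
X = np.column_stack([
    rng.poisson(40, n_cells),          # households in the cell
    rng.poisson(15, n_cells),          # jobs in the cell
    rng.uniform(0, 20, n_cells),       # distance to the city center
])
true_beta = np.array([120.0, 80.0, -150.0])          # invented "true" effects
price = 50_000 + X @ true_beta + rng.normal(0, 2_000, n_cells)

# Hedonic regression: OLS fit of price on attributes, refreshed each year-step
A = np.column_stack([np.ones(n_cells), X])           # add an intercept column
beta_hat, *_ = np.linalg.lstsq(A, price, rcond=None)
print("estimated hedonic coefficients:", np.round(beta_hat, 1))
```

In the full model, the simulated households and jobs per cell change each year, so refitting (or reapplying) the hedonic equation is what propagates land-use dynamics into land prices.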

Since then, UrbanSim has grown, turning into a 3D platform in 2012 and becoming proprietary software (UrbanCanvas) in 2014. The simulation has expanded to include not only land use and transportation but also the economy and the environment, and it claims to account for the unfolding effects of infrastructure on transport, housing affordability, and the environment.

The agents of the model include households, individual citizens, and firms, as well as land developers and the government, with their political constraints [143]. Facing a given environment, agents make choices, such as (a) whether to find a job (and which one) or stay at home, (b) where to locate one's family or business, and (c) whether a family or business relocation is a sensible move.

The market in the model is asynchronous. Households search for new dwellings facing the short-term needs of a given year. Land developers observe housing demand but augment housing supply over a wider, cumulative number of years [143]. Although prices are modeled (calculated) endogenously, the model does not impose equilibrium; i.e., there is no need for markets to “clear” [77].

A first application of the model is available for Eugene-Springfield, Oregon [77]. The interested policy stakeholder, the Oregon Department of Transportation, actually financed the project. The model uses data starting in 1994 and runs for 15 years with a grid resolution of 150 meters. A previous time frame, from 1980 to 1994, was used to validate the model's capabilities. Good accuracy (a difference of fewer than 50 households between simulated and observed results) was achieved for 57% of the sample. A larger error margin of up to 200 households covered 89% of the sample for the number of households and 76% for the number of jobs. Although reasonable, the results did not predict isolated, pinpointed occurrences of change and both over- and underpredicted for small areas.

The takeaway from the first UrbanSim implementation is that effective integration of at least the mechanisms of land use, transportation, and environmental issues is a must [77]. Such integration should also account for the fact that, in practice, those areas usually involve different, distinct institutions (each responsible for one issue), with conflicting values, epistemologies, and pragmatic policies [143].

Applications of UrbanSim were later developed for Detroit, Michigan [144], Salt Lake City, Utah [139], San Francisco, California [145], Seattle, Washington [146], and Paris, France [147].

The belief in integrated planning shown in the 2011 paper [143] is still present in UrbanSim's most recent output [79]. Once more financed by an interested policy stakeholder, the US Department of Energy, the authors propose an integrated “pipeline” connecting UrbanSim, described as a microsimulation platform, with ActivitySim, an agent-based platform responsible for generating traffic demand based on citizens' choices of activities, and with a traffic assignment model (a routing mechanism). The motivation behind the attempt is clear: how to effectively quantify both intended and unintended consequences in complex urban environments, given a specific change in infrastructure or policy. Further, as the authors put it, urban systems are those in which the “transportation network, the housing market, the labor market ([via] commuting), and other real estate markets are closely interconnected...” [79, p. 2].

4. Conclusions

In this review, we have shown some of the impact that complexity science methodologies have had on public policy, paying special attention to agent-based modeling, network science, data mining, and game theory. We believe these methodologies are important not only because they are used extensively nowadays, but also because they are the ones creating the bridge between science (its quantitative and qualitative methods, ways of thinking, etc.) and policy decision-making. We have also presented real cases where such methodologies have been applied.

The case analysis suggests that large efforts are being developed in policy applications of different realms, from macroeconomics (fiscal and monetary policy), to urban planning (mobility and air pollution), to infrastructure (energy, water, and waste). We have selected these examples as paradigmatic, while acknowledging that they are only some of the high-magnitude works currently under development. The references amassed account for other, smaller applications that are spread widely across disciplines.

Nevertheless, such a growing body of literature does not mean that these methodologies have been fully understood, or accepted without caveats, by academia in general and by policy-makers. Namely, most macroeconomic policy still follows DSGE methods, although they have been heavily criticized [54]. Most infrastructure projects are planned in an isolated manner, following sector guidelines with little or no interface with other sectors. Further, most urban planning still faces all of the challenges listed by [143]: conflicting institutions, values, epistemologies, and policies, as well as the inherent communication issues across heterogeneous fields.

All in all, the results of the three cases presented in detail reinforce the belief that integrated modeling, performed with input originating across disciplines, sectors, and institutions within a complex systems framework, delivers results with the added bonus of unveiling large-scale effects of policies in an adaptive, evolutionary, nonhierarchical manner.

This work is part of a special issue on Public Policy Modeling and Applications.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

Bernardo A. Furtado would like to acknowledge a grant from the National Council for Scientific and Technological Development (CNPq). Claudio J. Tessone acknowledges financial support of the University of Zurich through the University Research Priority Program on Social Networks.