Complexity
Publishing Collaboration: Wiley and Hindawi
 Journal metrics
Acceptance rate: 24%
Submission to final decision: 83 days
Acceptance to publication: 31 days
CiteScore: 3.500
Journal Citation Indicator: 0.740
Impact Factor: 2.121

Optimization Analysis for Innovative Inputs under the Objective Discrepancy between Government and Enterprise


 Journal profile

Complexity publishes original research and review articles across a broad range of disciplines with the purpose of reporting important advances in the scientific study of complex systems.

 Editor spotlight

Chief Editor, Prof Sayama, is currently researching complex dynamical networks, human and social dynamics, artificial life, and interactive systems while working at Binghamton University, State University of New York.

 Special Issues

Do you think there is an emerging area of research that really needs to be highlighted? Or an existing research area that has been overlooked or would benefit from deeper investigation? Raise the profile of a research area by leading a Special Issue.

Latest Articles

Research Article

The New Generalized Exponentiated Fréchet–Weibull Distribution: Properties, Applications, and Regression Model

Statistical probability distributions are commonly used by data analysts and statisticians to describe and analyze their data. In many situations, however, data do not fit the existing classical distributions, and a new distribution is required to accommodate the complexities of different data shapes and enhance the goodness of fit. A novel model called the new generalized exponentiated Fréchet–Weibull distribution is proposed in this paper by combining two methods, the transformed transformer method and the new generalized exponentiated method. This modeling approach is capable of describing complex data structures in a wide range of applications. Some statistical properties of the new distribution are derived, and the parameters are estimated using the method of maximum likelihood. Different simulation studies are then conducted to assess the behavior of the estimators. The performance of the proposed distribution in modeling is investigated through applications to three real datasets. Further, a new regression model is proposed through reparametrization of the new generalized exponentiated Fréchet–Weibull distribution using the log-location-scale technique. The effectiveness of the proposed regression model is also investigated with two simulation studies and three real censored datasets. The results demonstrate the superiority of the proposed models over other competing models.
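The new density itself is not reproduced on this page, so the snippet below is only a minimal sketch of the maximum-likelihood estimation step the abstract describes, using the related exponentiated Weibull family already available in SciPy; the synthetic data, parameter values, and goodness-of-fit check are illustrative assumptions, not the paper's actual model or datasets.

```python
# Illustrative sketch only: fits the related exponentiated Weibull family
# (scipy.stats.exponweib) by maximum likelihood to synthetic lifetimes,
# mirroring the estimation step described in the abstract.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.weibull(1.8, size=500) * 2.0              # synthetic lifetime data

# .fit performs maximum-likelihood estimation; floc=0 fixes the location.
a, c, loc, scale = stats.exponweib.fit(data, floc=0)
print("MLE estimates: a=%.3f  c=%.3f  scale=%.3f" % (a, c, scale))

# Goodness of fit can then be checked, e.g., with a Kolmogorov-Smirnov test.
ks = stats.kstest(data, "exponweib", args=(a, c, loc, scale))
print("KS statistic:", ks.statistic, "p-value:", ks.pvalue)
```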

Research Article

Techniques for Finding Analytical Solution of Generalized Fuzzy Differential Equations with Applications

Engineering and applied mathematics disciplines that involve differential equations include classical mechanics, thermodynamics, electrodynamics, and general relativity. Modelling a wide range of real-world situations often involves ambiguous, imprecise, or insufficient situational information, as well as multi-index, uncertainty, or restriction dynamics. As a result, intuitionistic fuzzy set models are significantly more useful and versatile in dealing with this type of data than ordinary, triangular, or trapezoidal fuzzy set models. In this research, we study differential equations in a generalized intuitionistic fuzzy environment and use the modified Adomian decomposition technique to solve generalized intuitionistic fuzzy initial value problems. The generalized modified Adomian decomposition technique is applied to various higher-order generalized trapezoidal intuitionistic fuzzy initial value problems, circuit analysis problems, mass-spring systems, steam supply control sliding value problems, and other problems in physical science. The outcomes of numerical test applications are compared to exact solutions, and it is shown that our generalized modified Adomian decomposition method is efficient, robust, and reliable, as well as simple to implement.
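For orientation, the sketch below shows only the standard Adomian decomposition recursion on the linear test problem y' = -y, y(0) = y0, with an interval initial value standing in for an alpha-cut of a fuzzy number; it is not the paper's generalized intuitionistic fuzzy formulation, and the interval [0.8, 1.2] is an invented example.

```python
# Minimal Adomian decomposition sketch for y' = -y, y(0) = y0.
# ADM recursion: y_0 = y0,  y_{k+1}(t) = -integral_0^t y_k(s) ds.
import sympy as sp

t, y0 = sp.symbols("t y0")

def adomian_terms(n_terms):
    terms = [y0]
    for _ in range(n_terms - 1):
        terms.append(-sp.integrate(terms[-1], (t, 0, t)))
    return terms

approx = sum(adomian_terms(8))                     # truncated series solution
exact = y0 * sp.exp(-t)
print(sp.N((approx - exact).subs({y0: 1, t: 1.0})))  # truncation error at t = 1

# Evaluate the truncated solution at the endpoints of a fuzzy initial value's
# alpha-cut, e.g. the interval [0.8, 1.2] (illustrative only):
for endpoint in (0.8, 1.2):
    print(endpoint, sp.N(approx.subs({y0: endpoint, t: 1.0})))
```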

Research Article

Compliance Risk Assessment in the Banking Sector: Application of a Novel Pairwise Comparison-Based PRISM Method

Up-to-date compliance management uses a risk-based approach grounded in international standards. In addition to techniques and practices, implementing compliance measures is determined by principles and culture. Compliance risk assessment is an evolving field in theory and practice, and compliance risk management is complex and highly dependent on the decisions of experts. This article presents a new compliance risk assessment method based on a commercial banking case study. In the study, the Guilford method is used to extend the Partial Risk Map (PRISM) assessment technique, and the steps of the proposed pairwise comparison-based PRISM method are described in detail. Since risk assessment is critical to the operation and development of compliance management systems, the proposed risk assessment method involves testing the consistency of individual evaluations and the robustness of the results. The best-fitting and outlier experts can be identified by testing the impact of individual expert rankings on the aggregated ranking. The main finding is that top partial risks can be identified by applying the proposed pairwise comparison-based PRISM technique; therefore, possible optimal risk mitigation strategies and measures can be designed.
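The toy sketch below only illustrates the basic pairwise-comparison bookkeeping such methods start from: turning expert "risk A is more severe than risk B" tallies into a ranking and counting circular triads as a simple consistency signal. The risk names, the tally matrix, and the scoring rule are invented for illustration; the paper's Guilford-extended PRISM method is considerably richer.

```python
# Toy pairwise-comparison aggregation; not the PRISM method itself.
import numpy as np

risks = ["AML", "sanctions", "data privacy", "conduct"]
# wins[i, j] = number of experts who judged risk i more severe than risk j
wins = np.array([
    [0, 4, 3, 5],
    [1, 0, 2, 4],
    [2, 3, 0, 3],
    [0, 1, 2, 0],
])

scores = wins.sum(axis=1)                 # simple row-sum (Borda-style) severity score
for rank, idx in enumerate(np.argsort(-scores), start=1):
    print(rank, risks[idx], int(scores[idx]))

# Consistency signal: count circular triads (i beats j, j beats k, k beats i)
# in the majority preferences; each cycle is counted three times below.
pref = wins > wins.T
n = len(risks)
triads = sum(pref[i, j] and pref[j, k] and pref[k, i]
             for i in range(n) for j in range(n) for k in range(n)) // 3
print("circular triads:", triads)
```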

Research Article

Low Complexity, Low Probability Patterns and Consequences for Algorithmic Probability Applications

Developing new ways to estimate probabilities can be valuable for science, statistics, engineering, and other fields. By considering the information content of different output patterns, recent work invoking arguments inspired by algorithmic information theory has shown that a priori probability predictions based on pattern complexities can be made in a broad class of input-output maps. These algorithmic probability predictions do not depend on detailed knowledge of how output patterns were produced or on historical statistical data. Although quantitatively fairly accurate, a main weakness of these predictions is that they are given as an upper bound on the probability of a pattern, while many low complexity, low probability patterns occur, for which the upper bound has little predictive value. Here, we study this low complexity, low probability phenomenon by looking at example maps, namely a finite state transducer, natural time series data, RNA molecule structures, and polynomial curves. Some mechanisms causing low complexity, low probability behaviour are identified, and we argue that this behaviour should be assumed as a default in real-world algorithmic probability studies. Additionally, we examine some applications of algorithmic probability and discuss implications of low complexity, low probability patterns for several research areas, including simplicity in physics and biology, a priori probability predictions, Solomonoff induction and Occam's razor, machine learning, and password guessing.
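The complexity-versus-probability comparison the abstract describes can be sampled numerically. The sketch below uses a stand-in map (sign patterns of random cubics) and a crude compression-based complexity proxy; the map, sample size, and proxy are assumptions chosen for brevity, not the paper's exact maps or complexity measure.

```python
# Sample an input-output map, estimate each output pattern's probability
# empirically, and pair it with a compression-length complexity proxy.
import zlib
from collections import Counter

import numpy as np

rng = np.random.default_rng(1)
xs = np.linspace(-1, 1, 16)

def output_pattern(coeffs):
    # Discretize a random cubic into a 16-bit above/below-zero pattern.
    return "".join("1" if v > 0 else "0" for v in np.polyval(coeffs, xs))

samples = Counter(output_pattern(rng.normal(size=4)) for _ in range(100_000))
total = sum(samples.values())

for pattern, count in samples.most_common(5):
    complexity = len(zlib.compress(pattern.encode()))   # crude complexity proxy
    print(pattern, "P =", count / total, "compressed bytes =", complexity)
```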

Research Article

Prediction for the Inventory Management Chaotic Complexity System Based on the Deep Neural Network Algorithm

Precise inventory prediction is the key to goods inventory and safety management. Accurate inventory prediction improves enterprises' production efficiency and is also essential for controlling costs and optimizing the supply chain's performance. Nevertheless, complex inventory data are often chaotic and nonlinear, and high data complexity raises the difficulty of accurate prediction. This study simulated inventory records using a dynamic inventory management system. Four deep neural network models were trained on the data: the long short-term memory neural network (LSTM), the convolutional neural network-long short-term memory (CNN-LSTM), the bidirectional long short-term memory neural network (Bi-LSTM), and the deep long short-term memory neural network (DLSTM). Evaluating the models' performance based on RMSE, MSE, and MAE, Bi-LSTM achieved the highest prediction accuracy with the smallest squared error of 0.14%. The results show that the complexity of a model is not directly related to its prediction performance. By contrasting several neural network methods for predicting chaotic, nonlinear inventory data, this study contributes to the literature. The results provide useful guidance for companies' production planners and inventory officers when they plan product inventory and seek to minimize the risk of mishaps caused by excess inventory in warehouses.
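As a rough sketch of the kind of LSTM forecaster benchmarked here, the snippet below trains a single-layer Keras LSTM on a synthetic series; the window length, layer sizes, training settings, and random-walk data are illustrative assumptions rather than the study's configuration.

```python
# Minimal LSTM time-series forecaster sketch (Keras); illustrative settings only.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(2)
series = 50 + np.cumsum(rng.normal(size=2000))       # stand-in inventory series

def make_windows(data, window=20):
    # Sliding windows of past values -> next value, shaped (samples, window, 1).
    X = np.stack([data[i:i + window] for i in range(len(data) - window)])
    return X[..., None], data[window:]

X, y = make_windows(series)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1], 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)

mse, mae = model.evaluate(X, y, verbose=0)
print("in-sample RMSE:", np.sqrt(mse), "MAE:", mae)
```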

Research Article

Providing a Framework for Performance Evaluation of Organizations in Successfully Implementing TQM, Based on Knowledge Management Approach and Organizational Agility

Nowadays, any organization, in order to be aware of the desirability and quality of its activities, especially in dynamic environments, urgently needs an evaluation system to assess its performance and efficiency. In this study, the performance of production and service organizations is evaluated according to the effect of knowledge management success factors and organizational agility on the success factors of total quality management (TQM). Although knowledge management and organizational agility both influence TQM, and each has been examined separately in relation to it in previous studies, none of those studies has considered all three areas together. Therefore, in the present study, in addition to evaluating the performance of organizations based on the success factors of TQM, we identify the influential factors and success factors of each of the three areas of knowledge management, organizational agility, and TQM; then, the impact of knowledge management and organizational agility on the success factors of TQM is evaluated. To express this relationship, group interpretive structural modelling (ISM) is used to rank the components of each domain, and the fuzzy quality function deployment (FQFD) technique is applied to find the weights and relationships of the three domains by examining the independent and interface variables of the previous step. The weights resulting from the two-stage house of quality are used as the weights for the design of a two-stage data envelopment analysis (DEA) model with the weight control method. Finally, these factors are examined for knowledge-based companies located in Khorramabad Science and Technology Park, and the results are reported.
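To make the ISM step concrete, the sketch below takes a binary "factor i influences factor j" matrix, forms the final reachability matrix by transitive closure, and partitions the factors into levels; the factor names and the influence matrix are invented for illustration and are not the study's identified success factors.

```python
# Compact ISM sketch: transitive closure + level partitioning; toy data only.
import numpy as np

factors = ["leadership", "knowledge sharing", "agility", "customer focus"]
A = np.array([            # A[i, j] = 1 if factor i influences factor j (diagonal = 1)
    [1, 1, 1, 1],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

# Final reachability matrix: transitive closure by repeated Boolean composition.
R = A.copy()
for _ in range(len(factors)):
    R = ((R + R @ R) > 0).astype(int)

remaining = set(range(len(factors)))
level = 1
while remaining:
    # A factor sits at the current level when its reachability set (within the
    # remaining factors) is contained in its antecedent set.
    current = {i for i in remaining
               if {j for j in remaining if R[i, j]}
               <= {j for j in remaining if R[j, i]}}
    print("level", level, [factors[i] for i in current])
    remaining -= current
    level += 1
```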


Article of the Year Award: Outstanding research contributions of 2021, as selected by our Chief Editors. Read the winning articles.