This special issue presents recent developments in managing information uncertainty and complexity in decision-making. Uncertainty and complexity are pervasive in modern decision-making because perfect information is seldom available to decision-makers. A wide range of statistical and nonstatistical decision-making models have been proposed in the literature for modeling complex systems under uncertainty. Statistical methods (e.g., probability theory) are useful for modeling complex systems with incomplete or inaccurate data, whereas nonstatistical methods (e.g., fuzzy set theory, rough set theory, possibility theory, and fuzzy neural networks) are useful for modeling complex systems with imprecise, ambiguous, or vague data.

Today’s real-world problems involve multiple data sets, some precise or objective and some uncertain or subjective. Hybrid decision-making models are therefore quickly emerging as the methods of choice for modeling complex systems under uncertainty, and managing uncertainty is a prerequisite for effective problem-solving and decision-making in such systems. Accordingly, we invited authors to submit original research articles proposing formal decision-making methods to describe and rationalize the process of decision-making in complex systems under uncertainty.

This special issue received twenty-four submissions, of which six were accepted after rigorous review. We introduce each of them below with a short description, in the order of their appearance in the issue.

D. Zhou et al., in their paper titled “An Improved Belief Entropy and Its Application in Decision-Making,” propose an improved belief entropy that enhances the performance of Deng entropy and several other uncertainty measures in the Dempster–Shafer framework. The new belief entropy considers the uncertain information contained not only in the mass function and the cardinality of the proposition but also in the scale of the frame of discernment (FOD) and the relative scale of each proposition with respect to the FOD. Numerical examples show that the new belief entropy quantifies the degree of uncertainty of a body of evidence (BOE) more accurately than the other uncertainty measures. The authors also present a decision-making approach based on the new uncertainty measure and apply it to a case study, demonstrating the efficiency and effectiveness of the new belief entropy in solving uncertain information processing problems in real-world applications.
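
For context, the quantity being refined is Deng entropy, which weights each mass value by the size of its focal element. The schematic below gives Deng entropy together with a hedged sketch of how a correction term depending on the FOD scale can enter; the exact form of the improved measure is the one given in the paper.

```latex
% Deng entropy of a mass function m over a frame of discernment X:
E_d(m) = -\sum_{\substack{A \subseteq X \\ m(A) > 0}} m(A)\,
         \log_2 \frac{m(A)}{2^{|A|} - 1}

% Hedged sketch of the improvement: a factor depending on |A| (the
% cardinality of the proposition) and |X| (the scale of the FOD) is
% introduced inside the logarithm, for instance
E_I(m) = -\sum_{\substack{A \subseteq X \\ m(A) > 0}} m(A)\,
         \log_2\!\left( \frac{m(A)}{2^{|A|} - 1}\,
         e^{\frac{|A| - 1}{|X|}} \right)
```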

J. Ma and Z. Guo, in their paper titled “Implications for Firms with Limited Information to Take Advantage of Reference Price Effect in Competitive Settings,” explore useful implications for decision-makers of firms with partial information seeking to take advantage of the reference price effect in competitive settings. The work is closely related to research on the decision-making mechanisms of boundedly rational players. Specifically, the authors set up a dynamic system model to represent the evolution of prices and reference prices and reveal the impacts of key parameters on stability and profits. They consider a market with two competing firms that sell similar products to customers and compete on price. To model the evolution of reference prices, the authors adopt an exponential smoothing model, one of the most commonly used models for reference prices. To model the decision-making processes of firms with limited information, they adopt the widely used gradient mechanism, which has proven robust in providing a good approximation to practical adjustment when only the marginal profit is available. They show that if the adjustment process is unstable, too much advertising may be harmful even when the advertising cost is negligible; firms should therefore take potential evolution processes into consideration before implementing their advertising strategies. The authors finally suggest that incorporating random factors into the study of the reference price effect would make the implications more profound and robust, a direction that can be investigated in future research.
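
For readers unfamiliar with these two building blocks, the sketch below shows their standard textbook forms; the exact specification, including the demand and advertising terms, is given in the paper.

```latex
% Exponential smoothing of firm i's reference price, with memory
% parameter \beta \in [0, 1]:
r_{i,t+1} = \beta\, r_{i,t} + (1 - \beta)\, p_{i,t}

% Gradient price adjustment of a boundedly rational firm, with
% adjustment speed v_i > 0 and profit \pi_i; only the marginal
% profit is required:
p_{i,t+1} = p_{i,t} + v_i\, p_{i,t}\, \frac{\partial \pi_i}{\partial p_{i,t}}
```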

The paper “Some Generalized Pythagorean Fuzzy Bonferroni Mean Aggregation Operators with Their Application to Multiattribute Group Decision-Making” by R. Zhang et al. develops several aggregation operators for fusing Pythagorean fuzzy information and introduces a novel decision-making approach based on the proposed operators. First, the generalized Bonferroni mean is extended to the Pythagorean fuzzy environment, and the generalized Pythagorean fuzzy Bonferroni mean and the generalized Pythagorean fuzzy Bonferroni geometric mean are introduced. Second, a new generalization of the Bonferroni mean, namely, the dual generalized Bonferroni mean, is proposed to address the shortcomings of the generalized Bonferroni mean. The new operators then serve as the basis for a novel approach to multiattribute group decision-making with Pythagorean fuzzy information. The authors apply the new approach to the problem of selecting the best airline to illustrate its validity. Because the calculation process of the proposed method is somewhat more complicated than that of existing methods, the authors’ future research will focus on reducing its computational complexity.
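
For orientation, the classical generalized Bonferroni mean that these operators extend is shown below; the Pythagorean fuzzy versions apply this aggregation to membership/nonmembership pairs under the Pythagorean fuzzy operational laws, whose precise definitions are given in the paper.

```latex
% Classical Bonferroni mean of a_1, ..., a_n with parameters p, q >= 0:
\mathrm{BM}^{p,q}(a_1, \dots, a_n) =
  \left( \frac{1}{n(n-1)} \sum_{\substack{i, j = 1 \\ i \neq j}}^{n}
         a_i^{\,p}\, a_j^{\,q} \right)^{\frac{1}{p+q}}

% A Pythagorean fuzzy number is a pair (\mu, \nu) of membership and
% nonmembership degrees constrained by \mu^2 + \nu^2 \le 1.
```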

K. Yan et al., in their paper titled “Managing Information Uncertainty in Wave Height Modeling for the Offshore Structural Analysis through Random Set,” present a reliability study of an offshore jacket structure with emphasis on nonconventional modeling. First, a random set model is formulated for the random waves at an ocean site. A jacket structure is then investigated in a pushover analysis to identify the critical wave direction and the key structural elements, and suitable probabilistic models are adopted for the important structural members. The wave height model is processed in a P-box (probability box) format for the numerical analysis, and the models are applied to find bounds on the failure probability of the jacket structure. The propagation of this wave model is investigated through interval analysis and Monte Carlo simulation, and the results are compared in the context of information content and numerical accuracy. The failure probability bounds are further compared with those of the conventional probabilistic approach. The key findings are that the P-box model possesses significant advantages over the traditional probabilistic approach and provides more information for engineers making decisions, especially for an existing structure exposed to a changing environment. The case analysis demonstrates the benefits of the approach for practical use in reliability engineering.
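
To make the P-box idea concrete, the following minimal Python sketch illustrates interval Monte Carlo propagation through a monotone limit state; the quantile bounds, load model, and resistance are placeholders invented for illustration, not the paper’s models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quantile bounds of the wave height P-box (in meters):
# inverting the upper CDF bound gives the lower quantile and vice versa.
def quantile_lower(u):
    return 4.0 + 1.5 * np.sqrt(-2.0 * np.log(1.0 - u))

def quantile_upper(u):
    return 5.0 + 2.0 * np.sqrt(-2.0 * np.log(1.0 - u))

def fails(h, resistance=14.0):
    # Placeholder limit state: wave load grows monotonically with h.
    return 0.08 * h**2 > resistance

# Interval Monte Carlo: each uniform sample maps to an interval of
# wave heights, so the failure indicator is bounded by its endpoints.
n = 100_000
u = rng.uniform(size=n)
pf_lower = fails(quantile_lower(u)).mean()  # lower bound on P(failure)
pf_upper = fails(quantile_upper(u)).mean()  # upper bound on P(failure)
print(f"P(failure) bounds: [{pf_lower:.4f}, {pf_upper:.4f}]")
```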

The paper titled “An Approach to Integrating Tactical Decision-Making in Industrial Maintenance Balance Scorecards Using Principal Components Analysis and Machine Learning” by N. Rodriguez-Padial et al. deals with the design of customized maintenance plans in a production plant. Applying the Custom Balance Scorecard design framework, the authors construct two data sets: the first reflects the maintenance work orders received in the productive area, and the second consists of production values and machine responses expressed as efficiency variables and failure times. The first data set is used as input in the exploratory phase, in which principal component analysis (PCA) is applied. The second data set is employed in the analysis phase, in which a machine learning framework is implemented for two tasks: supervised learning, which trains a model on known input and output data to predict future outputs, and unsupervised learning, which finds hidden patterns and intrinsic structures in the input data. The proposed approach enables the measurement of certain indicators of productive areas previously considered strategic.
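
A minimal Python sketch of this two-phase pipeline is given below, assuming synthetic stand-ins for the work-order and production data sets; the model choices (random forest, k-means) are illustrative, not necessarily those used by the authors.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
work_orders = rng.normal(size=(200, 12))  # data set 1: work-order indicators
efficiency = rng.normal(size=(200, 6))    # data set 2: efficiency variables
fail_times = rng.exponential(size=200)    # data set 2: failure times

# Exploratory phase: PCA compresses the work-order indicators.
pca = PCA(n_components=3)
scores = pca.fit_transform(work_orders)

# Analysis phase, supervised: train on known inputs/outputs to
# predict future failure times.
reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(np.hstack([scores, efficiency]), fail_times)

# Analysis phase, unsupervised: find intrinsic groupings of areas.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(pca.explained_variance_ratio_.round(2), labels[:10])
```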

A. Mardani et al., in their paper titled “Recent Fuzzy Generalisations of Rough Sets Theory: A Systematic Review and Methodological Critique of the Literature,” present a comprehensive and systematic review of methodological approaches and applications of fuzzy-rough set theory, including fuzzy logic and fuzzy sets, rough sets, fuzzy logical operators, and fuzzy relations. They use the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology to classify the selected articles by author name, author nationality, publication year, application domain, method, contribution, and publication outlet. A total of 132 papers were selected for this systematic review and meta-analysis, covering the seven-year period from 2010 to 2016; many of the reviewed articles were published between 2013 and 2016. The authors classify these papers into six application areas: information systems, decision-making, approximation operators, feature and attribute selection, fuzzy set theories, and other application areas. The study found that fuzzy set theory combined with rough set theory was the most widely used method, and it ranks the Journal of Information Systems first in publishing fuzzy-rough set papers among the 28 journals identified as relevant publication outlets in the field. The authors encourage future research to expand this collection with books, dissertations, and non-English sources.

We believe this special issue will interest all those who must make decisions in uncertain environments, including engineers, managers, and economists interested in the analytical and practical aspects of hybrid multiple-criteria decision-making methods. Because multiple-criteria decision-making is highly interdisciplinary, the audience also includes researchers who study or teach decision-making-related disciplines in academia and practitioners who work as consultants or in-house experts in business, industry, and government.

Jurgita Antucheviciene
Madjid Tavana
Mehrbakhsh Nilashi
Romualdas Bausys