Mathematical Problems in Engineering
Volume 2014, Article ID 319264, 6 pages
http://dx.doi.org/10.1155/2014/319264
Research Article

Multiscale Probability Transformation of Basic Probability Assignment

Meizhu Li,1 Xi Lu,1,2 Qi Zhang,1 and Yong Deng1,3,4

1School of Computer and Information Science, Southwest University, Chongqing 400715, China
2School of Hanhong, Southwest University, Chongqing 400715, China
3School of Automation, Northwestern Polytechnical University, Xi’an, Shaanxi 710072, China
4School of Engineering, Vanderbilt University, Nashville, TN 37235, USA

Received 16 June 2014; Accepted 2 October 2014; Published 20 October 2014

Academic Editor: Mohamed Abd El Aziz

Copyright © 2014 Meizhu Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Decision making is still an open issue in the application of Dempster-Shafer evidence theory, and many methods have been proposed to address it. In the transferable belief model (TBM), pignistic probabilities based on the basic probability assignments are used for decision making. In this paper, a multiscale probability transformation of basic probability assignment based on the belief function and the plausibility function is proposed; it is a generalization of the pignistic probability transformation. In the multiscale probability function, a factor q based on the Tsallis entropy is used to make the multiscale probabilities diversified. An example is given to show that the multiscale probability transformation is more reasonable for decision making.

1. Introduction

Since it was first proposed by Dempster [1] and then developed by Shafer [2], the Dempster-Shafer theory of evidence, also called Dempster-Shafer theory or evidence theory, has received much attention and has continually attracted growing interest [3–5]. As a theory of reasoning under uncertainty, Dempster-Shafer theory has the advantage of directly expressing “uncertainty” by assigning probability to subsets of the set composed of multiple objects rather than to each individual object, so it has been widely used in many fields, such as statistical learning [6, 7], classification and clustering [8–11], decision making [12–14], knowledge reasoning [15, 16], risk assessment and evaluation [17, 18], and others [19–21].

To improve the Dempster-Shafer theory of evidence, many studies have been devoted to the combination rule of evidence, the conflict problem, the generation of mass functions [22, 23], uncertainty measures of evidence [24, 25], and so on [26–28]. The basic rule of combination is Dempster’s rule of combination (the orthogonal sum). It needs a normalization step in order to preserve the basic properties of belief functions. In [29], Zadeh underlined that this normalization involves counterintuitive behaviours. In order to solve the problem of conflict management, Yager [30], Smets [31], and Murphy [32], and more recently Liu [33] and Lefèvre and Elouedi [34], have proposed other combination rules.

One open issue of evidence theory is decision making based on basic probability assignments, and much work has been done to construct a reasonable model for it. In the transferable belief model (TBM) [35], pignistic probabilities are used for decision making [36]. The transferable belief model is presented to represent quantified beliefs based on belief functions. The TBM is constructed on two levels: the credal level, where beliefs are entertained and quantified by belief functions, and the pignistic level, where beliefs can be used to make decisions and are quantified by probability functions. The main idea of the pignistic probability transformation is to transform the multielement subsets into singleton subsets by an average method. Because in some examples the pignistic transformation method produces results that appear to be inconsistent with Dempster’s rule of combination, Cobb and Shenoy [37] proposed a plausibility transformation method for translating Dempster-Shafer (D-S) belief function models to probability models. Other works have sought a more reasonable transformation method [38–40].

Among the previous research, the pignistic probability transformation is the most widely used, but it does not adequately describe the unknown information carried by multielement subsets. Hence, a generalization of the pignistic probability transformation called the multiscale probability transformation of basic probability assignment is proposed in this paper, based on the belief function and the plausibility function. The proposed function can be calculated from the difference between the belief function and the plausibility function; we call it the multiscale probability function and denote it by $\mathrm{MulP}$. In the multiscale probability function, a factor $q$ based on the Tsallis entropy [41] is used to make the multiscale probabilities diversified. When the value of $q$ equals 0, the proposed multiscale probability transformation degenerates to the pignistic probability transformation.

The rest of this paper is organized as follows. Section 2 introduces some basic preliminaries about the Dempster-Shafer theory and the pignistic probability transformation. In Section 3, the multiscale probability transformation is presented. Section 4 uses an example to illustrate the effectiveness of the multiscale probability transformation. The conclusion is given in Section 5.

2. Preliminaries

2.1. Dempster-Shafer Theory of Evidence

Dempster-Shafer theory of evidence [1, 2], also called Dempster-Shafer theory or evidence theory, is used to deal with uncertain information. As an effective theory of evidential reasoning, Dempster-Shafer theory has the advantage of directly expressing various kinds of uncertainty. It requires weaker conditions than the Bayesian theory of probability, so it is often regarded as an extension of the Bayesian theory. For completeness of the explanation, a few basic concepts are introduced as follows.

Definition 1. Let $\Theta$ be a mutually exclusive and collectively exhaustive set of events, indicated by
$$\Theta = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_N\}. \tag{1}$$
The set $\Theta$ is called the frame of discernment. The power set of $\Theta$ is indicated by $2^{\Theta}$, where
$$2^{\Theta} = \{\emptyset, \{\theta_1\}, \ldots, \{\theta_N\}, \{\theta_1, \theta_2\}, \ldots, \{\theta_1, \theta_2, \ldots, \theta_i\}, \ldots, \Theta\}. \tag{2}$$
If $A \in 2^{\Theta}$, $A$ is called a proposition.

Definition 2. For a frame of discernment $\Theta$, a mass function is a mapping $m$ from $2^{\Theta}$ to $[0, 1]$, formally defined by
$$m: 2^{\Theta} \longrightarrow [0, 1], \tag{3}$$
which satisfies the following conditions:
$$m(\emptyset) = 0, \qquad \sum_{A \in 2^{\Theta}} m(A) = 1. \tag{4}$$

In Dempster-Shafer theory, a mass function is also called a basic probability assignment (BPA). If $m(A) > 0$, $A$ is called a focal element, and the union of all focal elements is called the core of the mass function.
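To make the later definitions concrete, a minimal Python sketch of a frame of discernment and a BPA follows. The dict-of-frozensets representation, the helper name is_valid_bpa, and the illustrative masses are assumptions made here for illustration; they are not taken from the paper.

```python
# A minimal sketch (not from the paper): a frame of discernment as a frozenset
# and a BPA as a dict mapping focal elements (frozensets) to their masses.

def is_valid_bpa(bpa, frame):
    """Check the conditions of Definition 2: m(empty set) = 0 and the masses sum to 1."""
    if bpa.get(frozenset(), 0.0) != 0.0:
        return False
    if not all(a <= frame for a in bpa):      # every focal element must lie inside the frame
        return False
    return abs(sum(bpa.values()) - 1.0) < 1e-9

# Hypothetical illustrative BPA on a three-element frame (not the paper's example).
frame = frozenset({"a", "b", "c"})
bpa = {
    frozenset({"a"}): 0.6,
    frozenset({"a", "b"}): 0.1,
    frozenset({"a", "b", "c"}): 0.3,
}
assert is_valid_bpa(bpa, frame)

# Focal elements are the subsets with strictly positive mass.
focal_elements = [a for a, m in bpa.items() if m > 0]
```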

Definition 3. For a proposition $A \subseteq \Theta$, the belief function $\mathrm{Bel}: 2^{\Theta} \to [0, 1]$ is defined as
$$\mathrm{Bel}(A) = \sum_{B \subseteq A} m(B). \tag{5}$$
The plausibility function $\mathrm{Pl}: 2^{\Theta} \to [0, 1]$ is defined as
$$\mathrm{Pl}(A) = 1 - \mathrm{Bel}(\bar{A}) = \sum_{B \cap A \neq \emptyset} m(B), \tag{6}$$
where $\bar{A} = \Theta - A$.
Obviously, $\mathrm{Pl}(A) \geq \mathrm{Bel}(A)$, and the functions $\mathrm{Bel}$ and $\mathrm{Pl}$ are the lower limit function and the upper limit function of proposition $A$, respectively.
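Under the same assumed representation, a minimal sketch of Definition 3; the helper names bel and pl are introduced here for illustration and reuse the bpa dict from the previous sketch.

```python
# A minimal sketch of Definition 3: Bel(A) sums the masses of all subsets of A,
# and Pl(A) sums the masses of all focal elements that intersect A.

def bel(bpa, a):
    """Bel(A) = sum of m(B) over all focal elements B that are subsets of A."""
    a = frozenset(a)
    return sum(m for b, m in bpa.items() if b <= a)

def pl(bpa, a):
    """Pl(A) = sum of m(B) over all focal elements B with a non-empty intersection with A."""
    a = frozenset(a)
    return sum(m for b, m in bpa.items() if b & a)

# Pl(A) >= Bel(A): the upper and lower limits of the support for proposition A.
assert pl(bpa, {"a"}) >= bel(bpa, {"a"})
```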

2.2. Pignistic Probability Transformation

In the transferable belief model (TBM) [35], pignistic probabilities are used for decision making. The definition of the pignistic probability transformation is shown as follows.

Definition 4. Let $m$ be a BPA on the frame of discernment $\Theta$. Its associated pignistic probability function $\mathrm{BetP}: \Theta \to [0, 1]$ is defined as
$$\mathrm{BetP}(\theta_i) = \sum_{\theta_i \in B,\; B \subseteq \Theta} \frac{m(B)}{|B|}, \tag{7}$$
where $|B|$ is the cardinality of subset $B$. The pignistic probability transformation (PPT) converts a basic probability assignment into a probability distribution; therefore, the pignistic betting distance can be easily obtained by the PPT.
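A minimal sketch of the pignistic probability transformation under the same assumed representation: each focal element spreads its mass evenly over its singletons. The function name pignistic is introduced here for illustration.

```python
# A minimal sketch of Definition 4, assuming m(empty set) = 0 as required by Definition 2.

def pignistic(bpa):
    """Split the mass of every focal element B evenly among its |B| elements."""
    betp = {}
    for b, m in bpa.items():
        for element in b:
            betp[element] = betp.get(element, 0.0) + m / len(b)
    return betp

print(pignistic(bpa))  # a probability distribution over the singletons
```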

3. Multiscale Probability Transformation of Basic Probability Assignment

In the transferable belief model (TBM) [35], pignistic probabilities are used for decision making. The transferable belief model is presented to represent quantified beliefs based on belief functions. The main idea of the pignistic probability transformation is to transform the multielement subsets into singleton subsets by an average method. Though the pignistic probability transformation is widely used, Example 5 shows that it is not always reasonable.

Example 5. Suppose there is a frame of discernment consisting of three elements; then, the BPA is given as follows:

In the pignistic probability transformation, the mass assigned to the whole frame $\Theta$ is divided equally among its elements. Actually, this is not always reasonable, as $m(\Theta)$ means that the sensor cannot judge to which class the target belongs, since it represents a meaning of “unknown.” In other words, from $m(\Theta)$ alone, nothing can be obtained except “unknown.” In this situation, the pignistic probability transformation resorts to a plain average, which is only one way to handle the problem. Compared with the plain average, a weighted average is more reasonable in many situations.

In the Dempster-Shafer theory, the plausibility function represents the optimistic estimate of a proposition, and the belief function represents the pessimistic estimate of a proposition. The distance between the plausibility function and the belief function reflects the belief level of a proposition: for one proposition, the bigger the belief level, the bigger the probability it should receive. In this paper, the weighted average is built from the distance between the belief function and the plausibility function, whose definition is given as follows.

Definition 6. Let $m$ be a BPA on the frame of discernment $\Theta$. For each element $\theta_i \in \Theta$, the difference function is defined as
$$\operatorname{dif}(\theta_i) = \mathrm{Pl}(\{\theta_i\}) - \mathrm{Bel}(\{\theta_i\}).$$

Definition 7. For a focal element $B$ and an element $\theta_i \in B$, the weight is defined as
$$w_B(\theta_i) = \frac{\operatorname{dif}(\theta_i)}{\sum_{\theta_j \in B} \operatorname{dif}(\theta_j)}.$$

Based on the weighted average idea, a factor $q$, which comes from the Tsallis entropy [41], is used to highlight the weights. Thus, the definition of the multiscale probability function is given as follows.

Definition 8. Let $m$ be a BPA on the frame of discernment $\Theta$. Its associated multiscale probability function $\mathrm{MulP}$ on $\Theta$ is defined as
$$\mathrm{MulP}(\theta_i) = \sum_{\theta_i \in B,\; B \subseteq \Theta} \frac{\bigl(\mathrm{Pl}(\{\theta_i\}) - \mathrm{Bel}(\{\theta_i\})\bigr)^{q}}{\sum_{\theta_j \in B} \bigl(\mathrm{Pl}(\{\theta_j\}) - \mathrm{Bel}(\{\theta_j\})\bigr)^{q}}\, m(B), \tag{11}$$
where $q$ is a factor based on the Tsallis entropy used to amend the proportion of the interval between $\mathrm{Bel}$ and $\mathrm{Pl}$. The transformation from $m$ to $\mathrm{MulP}$ is called the multiscale probability transformation.
Actually, the fractional part of (11) denotes the normalized weight of the element $\theta_i$ within the focal element $B$; it replaces the even split $1/|B|$ used in the pignistic probability function.
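The sketch below implements the multiscale probability transformation as reconstructed in (11): within each focal element, the mass is shared among its singletons in proportion to the q-th power of their Pl-Bel differences. This reading of (11), the function name multiscale, and the fallback for vanishing differences are assumptions made here, not the authors' reference implementation; the sketch reuses bel, pl, and the bpa dict from the earlier sketches.

```python
# A minimal sketch of the multiscale probability transformation as reconstructed in (11).

def multiscale(bpa, q):
    mulp = {}
    for b, m in bpa.items():
        # Pl - Bel difference of each singleton contained in the focal element B.
        diffs = {e: pl(bpa, {e}) - bel(bpa, {e}) for e in b}
        total = sum(d ** q for d in diffs.values())
        for e in b:
            if total > 0:
                share = (diffs[e] ** q) / total   # normalized weight within B
            else:
                share = 1.0 / len(b)              # fall back to the even split when all differences vanish
            mulp[e] = mulp.get(e, 0.0) + m * share
    return mulp
```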

Theorem 9. Let $m$ be a BPA on the frame of discernment $\Theta$. Its associated multiscale probability $\mathrm{MulP}$ on $\Theta$ degenerates to the pignistic probability $\mathrm{BetP}$ when $q$ equals 0.

Proof. When $q$ equals 0, each term $\bigl(\mathrm{Pl}(\{\theta_j\}) - \mathrm{Bel}(\{\theta_j\})\bigr)^{q}$ equals 1, so the multiscale probability function is calculated as follows:
$$\mathrm{MulP}(\theta_i) = \sum_{\theta_i \in B,\; B \subseteq \Theta} \frac{1}{\sum_{\theta_j \in B} 1}\, m(B) = \sum_{\theta_i \in B,\; B \subseteq \Theta} \frac{m(B)}{|B|}. \tag{12}$$
Then, it can be obtained that
$$\mathrm{MulP}(\theta_i) = \mathrm{BetP}(\theta_i). \tag{13}$$

From (12) and (13), we can see that when the value of $q$ equals 0, the proposed multiscale probability function degenerates to the pignistic probability function.
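As a quick numerical check of Theorem 9 on the hypothetical BPA from the earlier sketches, setting q = 0 should reproduce the pignistic probabilities:

```python
# With q = 0 every difference is raised to the zeroth power, so each focal element
# is split evenly and the result coincides with the pignistic transformation.
betp = pignistic(bpa)
mulp0 = multiscale(bpa, q=0)
assert all(abs(betp[e] - mulp0[e]) < 1e-9 for e in betp)
```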

Theorem 10. Let $m$ be a BPA on the frame of discernment $\Theta$. If the belief function equals the plausibility function, its associated multiscale probability $\mathrm{MulP}$ degenerates to the pignistic probability $\mathrm{BetP}$.

Proof. Given a BPA $m$ on the frame of discernment $\Theta$, for each $\theta_i \in \Theta$, when the belief function equals the plausibility function, namely, $\mathrm{Bel}(\{\theta_i\}) = \mathrm{Pl}(\{\theta_i\})$, $\mathrm{Bel}$ is a probability distribution [35], and then $\mathrm{MulP}$ is equal to $\mathrm{BetP}$.

For example, let $\Theta = \{\theta_1, \theta_2, \theta_3\}$ be a frame of discernment; if it satisfies $\mathrm{Bel}(\{\theta_1\}) = \mathrm{Pl}(\{\theta_1\})$, $\mathrm{Bel}(\{\theta_2\}) = \mathrm{Pl}(\{\theta_2\})$, and $\mathrm{Bel}(\{\theta_3\}) = \mathrm{Pl}(\{\theta_3\})$, the BPA on the frame must satisfy $m(\{\theta_1\}) + m(\{\theta_2\}) + m(\{\theta_3\}) = 1$. In this situation, the multiscale probability degenerates to the pignistic probability.

Corollary 11. If $\mathrm{Bel}$ is a probability distribution, then $\mathrm{MulP}$ is equal to $\mathrm{BetP}$.

Theorem 12. Let $m$ be a BPA on the frame of discernment $\Theta$. If the distances between the belief function and the plausibility function are the same for all elements, the multiscale probability transformation degenerates to the pignistic probability transformation.

Proof. It is the same as the proof of Theorem 9.

An illustrative example is given to show the calculation of the multiscale probability transformation step by step.

Example 13. Let $\Theta$ be a frame of discernment with three elements, denoted element 1, element 2, and element 3. One body of BPA is given as follows:

Step 1. Based on (5) and (6), the values of the belief function and the plausibility function of the three elements can be obtained as follows:

Step 2. Calculate the distance between the belief function and the plausibility function:

Step 3. Calculate the weight of each element within each focal element, assuming that the value of $q$ equals 1.

Step 4. The values of the multiscale probability function $\mathrm{MulP}$ of the three elements can be obtained based on the above steps.
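Because the numerical values of Example 13 are not reproduced here, the following purely illustrative run applies the same four steps to the hypothetical BPA from the earlier sketches with q = 1:

```python
# Steps 1-3: Bel, Pl, and the Pl - Bel differences of each element; Step 4: MulP with q = 1.
for e in sorted(frame):
    d = pl(bpa, {e}) - bel(bpa, {e})
    print(e, "Bel =", bel(bpa, {e}), "Pl =", pl(bpa, {e}), "dif =", d)
print("MulP with q = 1:", multiscale(bpa, 1))
```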

4. Case Study

In this section, an illustrative example is given to show the effect of the multiscale probability function when the value of $q$ changes.

Example 14. Let $\Theta$ be a frame of discernment with 3 elements. One body of BPA is given as follows:

Based on the pignistic probability transformation, the results of the pignistic probability function are shown as follows: According to the proposed function in this paper, the results of the multiscale probability function can be obtained through the following steps.

Firstly, the values of belief function and the plausibility function can be obtained as follows: Then, the distances between the belief functions and the plausibility functions can be calculated:

Based on the definition of the multiscale probability transformation, the values of $\mathrm{MulP}$ can be obtained. Twenty cases with increasing values of $q$ are considered; the values of $\mathrm{MulP}$ for these 20 cases are detailed in Table 1 and graphically illustrated in Figure 1.

Table 1: The values of the multiscale probability function $\mathrm{MulP}$ as the value of $q$ changes.
Figure 1: The values of the multiscale probability function $\mathrm{MulP}$ as the value of $q$ changes.

According to Table 1 and Figure 1, on one hand, as the value of $q$ increases, the probability of an element with a larger weight increases, while the probability of an element with a smaller weight decreases. For example, one element starts with probability 0.5500 and ends with probability 0.8250, while another starts with probability 0.2500 and ends with probability 0.1061.

On the other hand, according to Table 1, the ranking of the elements by $\mathrm{MulP}$ can be obtained for each case; this ranking is affected by the value of $q$. This principle gives the multiscale probability function the ability to highlight the proportion of each element in the frame of discernment.

Note that when the value of $q$ equals 0, the values of the pignistic probability $\mathrm{BetP}$ are the same as the values of the multiscale probability $\mathrm{MulP}$ proposed in this paper. In other words, the multiscale probability function is a generalization of the pignistic probability function.
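The trends described above can be reproduced qualitatively by sweeping q on the hypothetical BPA from the earlier sketches; the specific values in Table 1 belong to the paper's own example and are not reproduced here:

```python
# Illustrative sweep: as q grows, elements with a larger Pl - Bel difference receive a
# larger share of each multielement mass, and q = 0 recovers the pignistic distribution.
for q in (0, 1, 2, 5, 10):
    print("q =", q, multiscale(bpa, q))
```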

5. Conclusion

In the transferable belief model (TBM), pignistic probabilities are used for decision making. In this paper, a multiscale probability transformation of basic probability assignment based on the belief function and the plausibility function is proposed; it is a generalization of the pignistic probability transformation. In the multiscale probability function, a factor $q$ is introduced to give the function the ability to highlight the proportion of each element in the frame of discernment. When the value of $q$ equals 0, the multiscale probability transformation degenerates to the pignistic probability transformation. An illustrative case is provided to demonstrate the effectiveness of the multiscale probability transformation.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors thank the anonymous reviewers for their valuable comments and suggestions that improved this paper. The work is partially supported by the National Natural Science Foundation of China (Grant no. 61174022), Specialized Research Fund for the Doctoral Program of Higher Education (Grant no. 20131102130002), R&D Program of China (2012BAH07B01), National High Technology Research and Development Program of China (863 Program) (Grant no. 2013AA013801), Open Funding Project of State Key Laboratory of Virtual Reality Technology and Systems, Beihang University (Grant no. BUAA-VR-14KF-02), and Fundamental Research Funds for the Central Universities no. XDJK2015D009.

References

1. A. P. Dempster, “Upper and lower probabilities induced by a multivalued mapping,” The Annals of Mathematical Statistics, vol. 38, no. 2, pp. 325–339, 1967.
2. G. Shafer, A Mathematical Theory of Evidence, vol. 1, Princeton University Press, Princeton, NJ, USA, 1976.
3. R. P. Srivastava and L. Liu, “Applications of belief functions in business decisions: a review,” Information Systems Frontiers, vol. 5, no. 4, pp. 359–378, 2003.
4. T. Denœux, “Maximum likelihood estimation from fuzzy data using the EM algorithm,” Fuzzy Sets and Systems, vol. 183, no. 1, pp. 72–91, 2011.
5. D. Wei, X. Deng, X. Zhang, Y. Deng, and S. Mahadevan, “Identifying influential nodes in weighted networks based on evidence theory,” Physica A: Statistical Mechanics and Its Applications, vol. 392, no. 10, pp. 2564–2575, 2013.
6. F. Cuzzolin, “A geometric approach to the theory of evidence,” IEEE Transactions on Systems, Man and Cybernetics Part C: Applications and Reviews, vol. 38, no. 4, pp. 522–534, 2008.
7. Y. Yang, D. Han, C. Han, and F. Cao, “A novel approximation of basic probability assignment based on rank-level fusion,” Chinese Journal of Aeronautics, vol. 26, no. 4, pp. 993–999, 2013.
8. M.-H. Masson and T. Denœux, “ECM: an evidential version of the fuzzy c-means algorithm,” Pattern Recognition, vol. 41, no. 4, pp. 1384–1397, 2008.
9. Z.-G. Liu, Q. Pan, and J. Dezert, “Evidential classifier for imprecise data based on belief functions,” Knowledge-Based Systems, vol. 52, pp. 246–257, 2013.
10. L. Livi, H. Tahayori, A. Sadeghian, and A. Rizzi, “Distinguishability of interval type-2 fuzzy sets data by analyzing upper and lower membership functions,” Applied Soft Computing, vol. 17, pp. 79–89, 2014.
11. S. Agrawal, R. Panda, and L. Dora, “A study on fuzzy clustering for magnetic resonance brain image segmentation using soft computing approaches,” Applied Soft Computing, vol. 24, pp. 522–533, 2014.
12. J. Liu, Y. Li, R. Sadiq, and Y. Deng, “Quantifying influence of weather indices on PM based on relation map,” Stochastic Environmental Research and Risk Assessment, vol. 28, no. 6, pp. 1323–1331, 2014.
13. S. Huang, X. Su, Y. Hu, S. Mahadevan, and Y. Deng, “A new decision-making method by incomplete preferences based on evidence distance,” Knowledge-Based Systems, vol. 56, pp. 264–272, 2014.
14. C. S. Lin, C. T. Chen, F. S. Chen, and W. Z. Hung, “A novel multiperson game approach for linguistic multicriteria decision making problems,” Mathematical Problems in Engineering, vol. 2014, Article ID 592326, 20 pages, 2014.
15. B. Kang, Y. Deng, R. Sadiq, and S. Mahadevan, “Evidential cognitive maps,” Knowledge-Based Systems, vol. 35, pp. 77–86, 2012.
16. T. Denœux, “Maximum likelihood estimation from uncertain data in the belief function framework,” IEEE Transactions on Knowledge and Data Engineering, vol. 25, no. 1, pp. 119–130, 2013.
17. X. Zhang, Y. Deng, F. T. S. Chan, P. Xu, S. Mahadevan, and Y. Hu, “IFSJSP: a novel methodology for the Job-Shop Scheduling Problem based on intuitionistic fuzzy sets,” International Journal of Production Research, vol. 51, no. 17, pp. 5100–5119, 2013.
18. X. Deng, Y. Hu, Y. Deng, and S. Mahadevan, “Supplier selection using AHP methodology extended by D numbers,” Expert Systems with Applications, vol. 41, no. 1, pp. 156–167, 2014.
19. S. Chen, Y. Deng, and J. Wu, “Fuzzy sensor fusion based on evidence theory and its application,” Applied Artificial Intelligence, vol. 27, no. 3, pp. 235–248, 2013.
20. X. Deng, Y. Hu, Y. Deng, and S. Mahadevan, “Environmental impact assessment based on D numbers,” Expert Systems with Applications, vol. 41, no. 2, pp. 635–643, 2014.
21. C. Zhang, Y. Hu, F. T. Chan, R. Sadiq, and Y. Deng, “A new method to determine basic probability assignment using core samples,” Knowledge-Based Systems, vol. 69, pp. 140–149, 2014.
22. T. Burger and S. Destercke, “How to randomly generate mass functions,” International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 21, no. 5, pp. 645–673, 2013.
23. Z.-G. Liu, Q. Pan, and J. Dezert, “A belief classification rule for imprecise data,” Applied Intelligence, vol. 40, no. 2, pp. 214–228, 2014.
24. M. E. Basiri, A. R. Naghsh-Nilchi, and N. Ghasem-Aghaee, “Sentiment prediction based on Dempster-Shafer theory of evidence,” Mathematical Problems in Engineering, vol. 2014, Article ID 361201, 13 pages, 2014.
25. X. Deng, X. Lu, F. T. Chan, R. Sadiq, S. Mahadevan, and Y. Deng, “D-CFPR: D numbers extended consistent fuzzy preference relations,” Knowledge-Based Systems, 2014.
26. F. Karahan and S. Ozkan, “On the persistence of income shocks over the life cycle: evidence, theory, and implications,” Review of Economic Dynamics, vol. 16, no. 3, pp. 452–476, 2013.
27. Z. Zhang, C. Jiang, X. Han, D. Hu, and S. Yu, “A response surface approach for structural reliability analysis using evidence theory,” Advances in Engineering Software, vol. 69, pp. 37–45, 2014.
28. X. Zhang, Y. Deng, F. T. Chan, A. Adamatzky, and S. Mahadevan, “Supplier selection based on evidence theory and analytic network process,” Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture, 2015.
29. L. A. Zadeh, “A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination,” The AI Magazine, vol. 7, no. 2, pp. 85–90, 1986.
30. R. R. Yager, “On the Dempster-Shafer framework and new combination rules,” Information Sciences, vol. 41, no. 2, pp. 93–137, 1987.
31. P. Smets, “Combination of evidence in the transferable belief model,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 5, pp. 447–458, 1990.
32. C. K. Murphy, “Combining belief functions when evidence conflicts,” Decision Support Systems, vol. 29, no. 1, pp. 1–9, 2000.
33. W. Liu, “Analyzing the degree of conflict among belief functions,” Artificial Intelligence, vol. 170, no. 11, pp. 909–924, 2006.
34. E. Lefèvre and Z. Elouedi, “How to preserve the conflict as an alarm in the combination of belief functions?” Decision Support Systems, vol. 56, no. 1, pp. 326–333, 2013.
35. P. Smets and R. Kennes, “The transferable belief model,” Artificial Intelligence, vol. 66, no. 2, pp. 191–234, 1994.
36. P. Smets, “Decision making in the TBM: the necessity of the pignistic transformation,” International Journal of Approximate Reasoning, vol. 38, no. 2, pp. 133–147, 2005.
37. B. R. Cobb and P. P. Shenoy, “On the plausibility transformation method for translating belief function models to probability models,” International Journal of Approximate Reasoning, vol. 41, no. 3, pp. 314–330, 2006.
38. M. Daniel, “On transformations of belief functions to probabilities,” International Journal of Intelligent Systems, vol. 21, no. 3, pp. 261–282, 2006.
39. J. M. Merigó and M. Casanovas, “Decision making with Dempster-Shafer theory using fuzzy induced aggregation operators,” in Recent Developments in the Ordered Weighted Averaging Operators: Theory and Practice, vol. 265 of Studies in Fuzziness and Soft Computing, pp. 209–228, Springer, 2011.
40. E. Nusrat and K. Yamada, “A descriptive decision-making model under uncertainty: combination of Dempster-Shafer theory and prospect theory,” International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 21, no. 1, pp. 79–102, 2013.
41. C. Tsallis, “Possible generalization of Boltzmann-Gibbs statistics,” Journal of Statistical Physics, vol. 52, no. 1-2, pp. 479–487, 1988.