Journal of Sensors
Volume 2015 (2015), Article ID 509385, 9 pages
Research Article

Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

Key Laboratory of Embedded and Network Computing of Hunan Province, College of Computer Science and Electronic Engineering, Hunan University, Changsha 410082, China

Received 9 October 2014; Revised 9 February 2015; Accepted 23 February 2015

Academic Editor: Stefania Campopiano

Copyright © 2015 Mengmeng Ma and Jiyao An. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


To solve the invalidation problem of the Dempster-Shafer theory of evidence (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors, using a new probabilistic dissimilarity measure. First, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflict among the BBAs are developed, and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Analysis and comparison of the results show that the new combination approach can effectively solve the problem of conflict management with better convergence performance and robustness.

1. Introduction

Multisensor data fusion is a technology that combines information from several sources to form a unified picture [1]. The Dempster-Shafer (DS) theory of evidence is one of the most prevalent methods for data fusion; it was first proposed by Dempster in the 1960s [2] and further developed by Shafer in the 1970s [3]. DS has been widely used in many fields, such as image processing [4, 5], target recognition and tracking [6, 7], fault diagnosis [8], and knowledge discovery [9], to name a few. Unfortunately, in the framework of DS, Dempster's rule has an inherent problem: it is incapable of managing high conflict from various information sources at the normalization step and will generate counterintuitive results, as first highlighted by Zadeh [10].

Over the years, a variety of combination methods have been proposed to achieve effective data fusion on highly conflicting sources of evidence. These methods fall into two main categories. The first is to improve the rules of combination [11–13]; representative methods include Lefevre's method [11] and Yager's method [12]. Scholars who follow this approach believe that the failure of combining highly conflicting evidence is due to defects in Dempster's combination rule. The second is to modify the original sources of evidence without changing Dempster's combination rule [14–19]; representative methods include Murphy's method [14] and Y. Deng's method [15]. Our approach belongs to the second category, because the improved combination rules do not satisfy the commutative or associative laws, and the high conflict is not caused by Dempster's rule itself: unreliable evidence is the real cause. The sources of evidence should therefore be discounted according to their reliability. The basic idea of the discounting method is that if one source has great (small) dissimilarity with the others, its reliability should be low (high).

Therefore, the dissimilarity measure between two sources of evidence plays a crucial role in the discounting method. Jousselme et al. [16] proposed a principled distance which regards the evidence as a vector based on a geometric interpretation, but its computational burden is significant. Liu [17] proposed a two-dimensional measure which consists of Shafer's conflict coefficient and the pignistic probability distance between betting commitments; however, when one factor is large and the other is small, the dissimilarity degree cannot be directly determined. Qu et al. [18] proposed a conflict rate to quantify the conflict, but when the two pieces of evidence are equal, the conflict rate produces a counterintuitive result. Liu et al. [19] proposed a dissimilarity measure to describe two aspects of divergence between two pieces of evidence: the difference of beliefs and the difference of the hypotheses which the two pieces of evidence strongly support. But it is not good enough to capture the difference between BBAs in some cases, as will be seen. Based on the above analysis, the current methods are not adequate to precisely delineate the divergence between two pieces of evidence, which motivates the development of a better and more useful measure of dissimilarity.

In this study, a novel combination approach for conflicting evidence is proposed. The novel dissimilarity measure is defined by integrating the fuzzy nearness and a correlation coefficient through the Hamacher T-conorm rule [20], based on an improved probabilistic transformation. The weighting factors adopted to discount the original sources are automatically determined according to the proposed probabilistic-based dissimilarity measure. The merits of the improved probabilistic transformation, the new dissimilarity measure, and the discounting method for combining conflicting sources of evidence are illustrated through numerical examples.

The rest of this paper is organized as follows. In Section 2, we briefly review the DS evidence theory. The new method for combining conflict evidence is proposed in Section 3. In Section 4, numerical examples are enumerated to show the performance of the existing alternatives and the proposed method. Section 5 concludes this paper.

2. Theory of Evidence

2.1. Belief Function

The frame of discernment, denoted by Θ, is a finite nonempty set of mutually exclusive and exhaustive elements. 2^Θ denotes the power set composed of all the possible subsets of Θ. A basic belief assignment (BBA) is a function m: 2^Θ → [0, 1] verifying the following conditions:

m(∅) = 0,  Σ_{A⊆Θ} m(A) = 1, (1)

where ∅ is the empty set. The subsets of Θ with nonzero mass are called the focal elements of m. There are also two other definitions in the theory of evidence: the belief and plausibility functions associated with a BBA, defined, respectively, as

Bel(A) = Σ_{B⊆A} m(B),  Pl(A) = Σ_{B∩A≠∅} m(B). (2)

Bel(A) represents the total amount of probability that is allocated to A, while Pl(A) can be interpreted as the amount of support that could be given to A. Bel(A) and Pl(A) are the lower and upper limits of the belief level of hypothesis A, respectively.
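As a concrete illustration of these definitions, Bel and Pl can be computed directly from a BBA represented as a mapping from focal elements (frozensets) to masses. This is a minimal sketch; the frame, the example masses, and the function names are ours, not the paper's:

```python
def bel(m, a):
    """Belief of a: total mass committed to nonempty subsets of a."""
    return sum(v for b, v in m.items() if b and b <= a)

def pl(m, a):
    """Plausibility of a: total mass of focal elements intersecting a."""
    return sum(v for b, v in m.items() if b & a)

m = {frozenset('a'): 0.5, frozenset('ab'): 0.3, frozenset('abc'): 0.2}
A = frozenset('ab')
print(bel(m, A))  # 0.8: masses of {a} and {a,b}
print(pl(m, A))   # 1.0: every focal element intersects {a,b}
```

Note that Bel(A) ≤ Pl(A) always holds, consistent with their interpretation as lower and upper bounds.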

2.2. Dempster’s Combination Rule and the Paradox Problem

Suppose two bodies of evidence m1 and m2 are derived from two information sources; Dempster's combination rule can be defined as

m(A) = (1 / (1 − k)) Σ_{B∩C=A} m1(B) m2(C) for A ≠ ∅, with m(∅) = 0, (3)

where

k = Σ_{B∩C=∅} m1(B) m2(C) (4)

is the conflict coefficient, reflecting the degree of conflict between the two bodies of evidence.

Note that there are two limitations in applying DS evidence theory. One is that counterintuitive results can be generated when highly conflicting evidence is fused using Dempster's rule, as shown in the classical Zadeh's example [10]. The other is that the conflict coefficient k is not appropriate to truly characterize the conflict between BBAs, particularly in the case of two equal BBAs, as reported in [17].
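A short sketch of Dempster's rule, including the conflict coefficient k, makes this counterintuitive behavior easy to reproduce. The representation and function names are ours; the example masses are those of Zadeh's classical example [10]:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Conjunctive combination of two BBAs, normalized by 1 - k."""
    combined, k = {}, 0.0
    for (b, v1), (c, v2) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            k += v1 * v2  # mass lost to empty intersections (conflict)
    if k >= 1.0:
        raise ValueError("total conflict (k = 1): combination undefined")
    return {a: v / (1.0 - k) for a, v in combined.items()}, k

# Zadeh's example: two almost totally conflicting sources
m1 = {frozenset('A'): 0.99, frozenset('C'): 0.01}
m2 = {frozenset('B'): 0.99, frozenset('C'): 0.01}
m12, k = dempster_combine(m1, m2)
print(k)    # ~0.9999
print(m12)  # all mass ends up on the weakly supported {'C'}
```

The second limitation is just as easy to exhibit: two identical uniform Bayesian BBAs over five singletons give k = 1 − 5·(0.2)² = 0.8, even though the two sources fully agree.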

Example 1 (Zadeh's example). Assume m1 and m2 over Θ = {A, B, C} are defined as

m1(A) = 0.99, m1(C) = 0.01;  m2(B) = 0.99, m2(C) = 0.01.

According to (3) and (4), we get k = 0.9999, m(C) = 1, and m(A) = m(B) = 0. We can see that m1 and m2 have a low support level for hypothesis C, but the resulting structure gives complete support to C. This appears to be counterintuitive.

Example 2. Consider two equal BBAs m1 and m2 over Θ = {θ1, θ2, θ3, θ4, θ5} defined as

m1(θi) = m2(θi) = 0.2, i = 1, …, 5.

According to (4), we get k = 0.8. This would suggest that the two pieces of evidence are in a high degree of conflict, but in fact they are equal.

2.3. Pignistic Transformation

When working in the probabilistic framework, the focal elements are singletons and exclusive, and the degree of conflict becomes easier to compute regardless of the intrinsic relationship between BBAs. Probabilistic transformation is a useful tool to map BBAs to probabilities. A classical transformation is the pignistic transformation [21], defined as

BetP(x) = Σ_{A⊆Θ, x∈A} m(A) / |A|, (5)

where |A| is the number of elements in subset A. BetP transfers the positive mass of belief of each nonspecific element onto the singletons involved in that element according to the cardinality of the proposition.
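The pignistic transformation can be sketched in a few lines, with BBAs represented as {frozenset: mass} dictionaries as before (the example masses and the function name are ours):

```python
def betp(m):
    """Pignistic transformation: split each focal element's mass equally
    among the singletons it contains."""
    frame = set().union(*m.keys())
    p = dict.fromkeys(frame, 0.0)
    for a, v in m.items():
        for x in a:
            p[x] += v / len(a)
    return p

m = {frozenset('a'): 0.4, frozenset('ab'): 0.4, frozenset('abc'): 0.2}
p = betp(m)
# p['a'] = 0.4 + 0.4/2 + 0.2/3 ≈ 0.6667; the result always sums to 1
```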

3. A Novel Combination Approach of Conflict Evidence

The fundamental goal of our approach is to allocate reasonable weighting factors to the evidence and thereby obtain a much better combination. The derivation of the weights of the sources is based on the widely adopted principle that the truth lies in the majority opinion. The sources which are highly conflicting with the majority of other sources will automatically be assigned a very low weighting factor in order to decrease their bad influence in the fusion process. To determine the weighting factors, the conflict should first be well measured.

The degree of conflict between BBAs has been measured in many works, including the conflict coefficient [3], Jousselme's distance measure [16], the pignistic probability distance [17], the conflict rate [18], and the dissimilarity measure [19]. These measures cannot characterize the conflict comprehensively and accurately. What is more, the distance measures [16, 17, 19] are based on the pignistic transformation [21], but such a transformation is mathematically only a simple average: it considers the role of the belief functions while ignoring the effect of the plausibility functions. Therefore, we propose an improved probabilistic transformation to transform BBAs into probabilities and overcome the shortcomings of the pignistic transformation. Then, based on the improved probabilistic transformation, a novel dissimilarity degree which integrates the fuzzy nearness and a correlation coefficient by the Hamacher T-conorm rule is proposed.

3.1. An Improved Probabilistic Transformation

Definition 3. By utilizing the information contained in the belief function and the plausibility function of the propositions in DS, a new method for transforming a BBA into a probability is defined as (8). Here the balance factor is the total value of the belief functions over the singletons, and the remaining mass is the uncertain information that can be reallocated. This factor balances the degree of influence of the belief function and the plausibility function: if the total belief value is large, the certain information plays the dominant role, so the influence of the belief function should be larger than that of the plausibility function; on the contrary, if the reallocatable mass is large, the proportion of uncertain information exceeds that of the certain information, so the influence of the plausibility function should be larger.
The improved probabilistic transform function satisfies the axioms of a probability distribution on the singletons. It is worth noting that the presented probabilistic transformation not only includes the special cases described in [22, 23] but also can transform BBAs into probabilities under general conditions:
(1) If all the focal elements are singletons, the BBA is already a probability distribution and should remain unchanged, so (8) simplifies to the identity transform.
(2) If the belief function and the plausibility function have the same influence on the allocation of the uncertain information, (8) degrades into the probabilistic transform function proposed in [22].
(3) If all the focal elements are multiple-element sets, the allocation of the uncertain information is based only on the plausibility function, so (8) degrades into the plausibility function-based transform (PFT) proposed by Cobb and Shenoy in [23].
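The exact form of the improved transform in (8) involves the paper's balance factor and is not reproduced here; as a reference point, special case (3), the plausibility function-based transform of Cobb and Shenoy [23], can be sketched as follows (representation and names are ours):

```python
def pft(m):
    """Plausibility transform: normalize the singleton plausibilities."""
    frame = set().union(*m.keys())
    pl_single = {x: sum(v for a, v in m.items() if x in a) for x in frame}
    total = sum(pl_single.values())
    return {x: p / total for x, p in pl_single.items()}

m = {frozenset('a'): 0.3, frozenset('ab'): 0.7}
p = pft(m)
print(p)  # Pl(a) = 1.0, Pl(b) = 0.7, each normalized by 1.7
```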

Example 4. Let the BBA over the same frame of discernment be as follows:

We choose the Shannon entropy to measure the uncertainty of the probabilities after transformation. The results of the probabilities and uncertainties in this general case are listed in Table 1. By analyzing the results, the improved probability transformation obtains a more effective probability and the smallest information uncertainty compared to the methods proposed in [21–23], since it balances the influence of the belief function and the plausibility function well through its balance factor.

Table 1: The results of probabilistic transformation in Example 4.

3.2. The New Dissimilarity Measure
3.2.1. Fuzzy Nearness

Fuzzy set theory is specially designed to provide a language with syntax and semantics to translate qualitative judgments into numerical reasoning and to capture subjective and vague uncertainty. In this theory, fuzzy nearness is used to measure the level of similarity between two objects. In this work, we use the fuzzy nearness [24] to measure the similarity between BBAs. The most commonly used fuzzy nearness is the well-known maximum-minimum method. The merit of fuzzy nearness will be illustrated in Example 6 by comparison with the distance measures [16, 17, 19].

Definition 5. Assume we have obtained a sequence of pieces of probability evidence P1, P2, …, Pk rebuilt from the BBAs by the improved probabilistic transformation. The level of similarity between two BBAs can be calculated by

N(Pi, Pj) = Σ_{θ∈Θ} (Pi(θ) ∧ Pj(θ)) / Σ_{θ∈Θ} (Pi(θ) ∨ Pj(θ)), (13)

where ∧ and ∨ are the operators for calculating the minimum and maximum, respectively. The fuzzy nearness satisfies 0 ≤ N(Pi, Pj) ≤ 1. In Examples 1 and 2, according to (13), we get N ≈ 0.005 and N = 1, respectively, which means that the fuzzy nearness can well reflect the difference between two highly conflicting BBAs or two equal BBAs.
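The max-min fuzzy nearness of (13) can be sketched on transformed probability distributions, here represented as dictionaries keyed by singleton. The numbers below are illustrative Bayesian distributions in the spirit of Examples 1 and 2, not values from the paper:

```python
def fuzzy_nearness(p1, p2):
    """Max-min fuzzy nearness between two probability distributions on the
    same frame: 1 for identical distributions, near 0 for disjoint support."""
    num = sum(min(p1[x], p2[x]) for x in p1)
    den = sum(max(p1[x], p2[x]) for x in p1)
    return num / den

p1 = {'A': 0.99, 'B': 0.00, 'C': 0.01}
p2 = {'A': 0.00, 'B': 0.99, 'C': 0.01}
print(fuzzy_nearness(p1, p2))  # 0.01 / 1.99 ≈ 0.005: highly conflicting
print(fuzzy_nearness(p1, p1))  # 1.0: equal sources are maximally similar
```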

Example 6. Let us consider the frame Θ and the following three independent BBAs over the same frame of discernment. According to the distance measures of [16], [17], and [19], the first pair of BBAs is more dissimilar than the second pair, which is counterintuitive; the fuzzy nearness instead indicates that the first pair is the more similar one, which is in line with intuitive judgment. This illustrates that the fuzzy nearness can measure the conflict in cases where the distance measures fail.
However, the fuzzy nearness is not stable. If one of the BBAs in Example 6 is replaced by a slightly modified one, the fuzzy nearness no longer works well, because the degree of divergence between the distinct hypotheses strongly supported by each source must also play an important role [18].

3.2.2. Correlation Coefficient

A conflict between two BBAs can be interpreted qualitatively as the fact that one source strongly supports one hypothesis and the other strongly supports another hypothesis, and the two hypotheses are not compatible (their intersection is empty) [19]. This is intuitively consistent and will be adopted here. Reference [19] defined a conflict coefficient to reflect the divergence degree by the pair of maximal subjective probabilities from conflicting sources, but it cannot reflect the difference between two nonconflicting sources of evidence. In this section, a correlation coefficient is proposed to reflect the divergence degree of the hypotheses that two belief functions strongly support, based on the newly proposed probabilistic transformation. The proposed correlation coefficient can reflect the difference between two conflicting or nonconflicting sources of evidence.

Definition 7. Let Pi and Pj be two pieces of probability evidence produced by the improved probabilistic transformation on the frame Θ. The correlation coefficient is defined by using the maximal approximate subjective probabilities of the BBAs. If two sources of evidence distribute most of their mass of belief to the same elements, the similarity between the two sources is calculated by the average of the maximal probabilities; otherwise, the amount of conflict is represented by the product of the averages of the minimal subjective probabilities. So our definition can well describe the difference between two conflicting or nonconflicting sources of evidence.
In Example 6, the obtained values indicate that the correlation coefficient can well reflect the divergence of the incompatible hypotheses to which two sources of evidence commit most of their beliefs.
However, according to the definition of the correlation coefficient, it only considers the elements that two sources of evidence strongly support and ignores the other elements of BBAs. Actually, the fuzzy nearness and correlation coefficient are complementary and they separately capture different aspects of the dissimilarity of BBAs.

3.2.3. The New Similarity Measure

Taking both of them into account in the elaboration of a new measure of dissimilarity therefore seems a natural way to capture the two aspects of the dissimilarity of BBAs. Following the analysis in [19], the Hamacher T-conorm fusion rule [20] satisfies the required properties and will also be used here.

Definition 8. Based on the improved probabilistic transformation, the new dissimilarity measure is defined by fusing the fuzzy nearness and the correlation coefficient with the Hamacher T-conorm. In order to verify the effectiveness of the new dissimilarity measure, Example 9, drawn from [17], is conducted to compare it with the methods proposed in [3, 16–19]. Since the compared methods are proportional to the conflict and our measure is inversely proportional to it, for the purpose of an intuitive comparison we plot the complementary quantity, one minus the similarity.

Example 9. Let Θ be a frame of discernment with 20 elements in Shafer's model, each element denoted by its subscript. The two BBAs are defined over Θ with one focal element A, a subset of Θ whose number of elements is increased step by step from 1 to 20. Consequently, there are 20 cases in this example. The results of the dissimilarity measures between the two BBAs are graphically illustrated in Figure 1.

Figure 1: Comparisons of different conflict measure methods when subset changes in Example 9.

From Figure 1, it can be seen that although the novel dissimilarity measure presents behavior similar to the measures of [16], [17], and [19], our function changes more slowly than the existing ones. The value of the conflict coefficient [3] is always equal to 0.05 whether the size of subset A changes or not. The conflict rate [18] indicates that the two BBAs are totally different except for case 5, so it cannot distinguish the variation among these cases. In case 4 and case 5, the measure of [17] takes the same value, but the two pieces of evidence have an obvious difference. It is worth mentioning that as A becomes more and more uncertain, all singletons get a small probability gain through the probabilistic transformation, and the divergence degree, reflecting the strong support of the sources for different hypotheses, becomes lower and lower. Therefore, the new dissimilarity measure can well reflect the conflict between two sources of evidence and is more rational than the other measures.

3.3. Combination Based on Discounting Evidence

Suppose the number of sources of evidence is k. The pairwise similarities can be represented by the k × k similarity matrix

SM = [S(mi, mj)], i, j = 1, …, k, with S(mi, mi) = 1.

The support degree of all the sources of evidence for evidence mi can be defined by

Sup(mi) = Σ_{j=1, j≠i}^{k} S(mi, mj).

The credibility of evidence mi can be calculated by the following formula:

Crd(mi) = Sup(mi) / Σ_{j=1}^{k} Sup(mj).

The weight of evidence mi is then obtained from its credibility, normalized so that the most credible source receives weight 1:

ωi = Crd(mi) / max_{1≤j≤k} Crd(mj).

Definition 10. Let Θ be the frame of discernment and let m1, m2, …, mk be the k pieces of evidence participating in the combination. Let the weight of mi be ωi; we discount the evidence by the weighting factor as in the classical discounting formula [3]:

mi^{ωi}(A) = ωi · mi(A) for all A ⊂ Θ, A ≠ Θ,
mi^{ωi}(Θ) = ωi · mi(Θ) + 1 − ωi;

then the combination of the discounted evidence is performed by Dempster's rule (3).
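Putting the pieces together, the discount-then-combine scheme (similarity matrix, support degree, credibility, weights, discounting, Dempster's rule) can be sketched end to end. The similarity function below is a simple max-min nearness on singleton masses standing in for the paper's full dissimilarity measure, the rescaling of weights so that the most credible source keeps weight 1 follows our reading above, and all names and example masses are illustrative:

```python
from functools import reduce
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for BBAs given as {frozenset: mass} dicts."""
    combined, k = {}, 0.0
    for (b, v1), (c, v2) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            k += v1 * v2
    return {a: v / (1.0 - k) for a, v in combined.items()}

def discount(m, w, frame):
    """Shafer's discounting: scale masses by w, move 1 - w to the frame."""
    md = {a: w * v for a, v in m.items() if a != frame}
    md[frame] = w * m.get(frame, 0.0) + (1.0 - w)
    return md

def nearness(m1, m2, frame):
    """Stand-in similarity: max-min nearness on singleton masses."""
    num = sum(min(m1.get(frozenset(x), 0.0), m2.get(frozenset(x), 0.0)) for x in frame)
    den = sum(max(m1.get(frozenset(x), 0.0), m2.get(frozenset(x), 0.0)) for x in frame)
    return num / den if den else 1.0

def weighted_combine(bbas, frame, sim):
    """Support degree -> credibility -> weights -> discount -> combine."""
    n = len(bbas)
    sup = [sum(sim(bbas[i], bbas[j], frame) for j in range(n) if j != i)
           for i in range(n)]
    crd = [s / sum(sup) for s in sup]      # credibility of each source
    wmax = max(crd)
    weights = [c / wmax for c in crd]      # most credible source keeps weight 1
    discounted = [discount(m, w, frame) for m, w in zip(bbas, weights)]
    return reduce(dempster_combine, discounted)

frame = frozenset('ABC')
m1 = {frozenset('A'): 0.6, frozenset('B'): 0.3, frozenset('C'): 0.1}
m2 = {frozenset('A'): 0.65, frozenset('B'): 0.25, frozenset('C'): 0.1}
m3 = {frozenset('B'): 0.9, frozenset('C'): 0.1}  # the dissenting source
result = weighted_combine([m1, m2, m3], frame, nearness)
# 'A' keeps the largest combined mass despite m3's dissent
```

The dissenting source m3 receives a low weight and is heavily discounted toward the frame, so it cannot veto the majority opinion in the final combination.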

4. Numerical Examples

A typical architecture of the proposed combination method is shown in Figure 2. It consists of two main parts: determination of the belief functions and combination of the sources of evidence. The construction of BBAs can be handled by existing belief function generators and is beyond the scope of this paper. With the obtained sources of evidence, the proposed approach is employed to measure the degree of conflict of the BBAs and to combine highly conflicting sources of evidence with weighting factors for decision making. Three simple examples are employed to show the performance of the proposed approach with respect to other methods, including DS [3], Yager's rule [12], Murphy's method [14], Y. Deng's method [15], and Liu's method [19].

Figure 2: The typical architecture of the proposed combination scheme.
4.1. Feasibility

The classic Zadeh's example is used in this part to illustrate that the proposed approach can solve the invalidation problem of the DS combination rule under high conflict. The combined results are listed in Table 2.

Table 2: Combination results of and for Example 1.

From Table 2, we can see that the DS rule [3] assigns 100% certainty to the minority belief hypothesis, which is counterintuitive. The proposed method can balance the conflict, allocating the same belief of 0.3691 to each of the two majority hypotheses and 0.1699 to the unknown proposition. Moreover, when a new source of evidence supporting the majority is collected and combined, the proposed approach raises the corresponding belief to 0.6772, while the DS rule constantly assigns 100% certainty to the minority hypothesis. What is more, the belief allocated to the unknown proposition drops to 0.1217. The new source of evidence and the combined results are listed in Table 3. This illustrates that our approach can well combine highly conflicting sources of evidence and reach a correct decision.

Table 3: The new source of evidence and combined results.

4.2. Robustness

In real applications of decision-making support systems, interference from the surroundings or aberrant sensor measurements always causes the collected belief functions to vary within a certain range. Therefore, the robustness of the combination method directly affects the synthesis results.

Example 11 (employed from [19]). Let us consider three simple Bayesian BBAs over the frame Θ as in Table 4. Source of evidence number 3 can provide two similar BBAs, denoted m3 and m3′. Let us see how this small difference affects the fusion results.
From Table 4, we can see that m1, m2, and m3 commit most of their belief to the same hypothesis, whereas m3′ distributes its largest belief to another one. Thus, m3′ will not be considered as reliable as the other sources. The combination results of the different methods are shown in Table 5.

Table 4: Four sources of evidence in Example 11.
Table 5: Combination results of different methods in Example 11.

The fusion results in Table 5 show that the results obtained with m3 and with m3′ are very similar. The DS rule provides the unreasonable result that the true hypothesis is impossible. Although Yager's rule has preferable robustness, the outcome is poor since it assigns the conflict to the unknown domain; this reflects that Yager's rule is too conservative. In Murphy's method, the combination with one of the two similar BBAs considers the majority hypothesis most likely to be true, whereas the combination with the other indicates the opposite, so the two runs lead to opposite conclusions; this indicates that the averaging-BBAs method is not robust enough. Once the discounting approach of [15, 19] and of this paper is applied, the largest mass of belief goes to the expected hypothesis. Moreover, when our dissimilarity measure is coupled with the proposed determination of the weighting factors, it produces the most specific and robust results. This illustrates that the proposed method can work well with good robustness even in highly conflicting cases.

4.3. Effectiveness

In this section, a synthetic numerical example simulating a multisensor-based automatic target recognition system is employed to analyze the effectiveness of the proposed combination approach.

Example 12. Let the frame of target discernment be Θ and let the real target be a given element of Θ. From five distinct information sources, the system has collected five singleton bodies of evidence m1, m2, m3, m4, and m5, including conflicting evidence from a nonreliable information source, as shown in Table 6.
In Table 6, four of the sources assign most of their belief to the real target, but one source oppositely commits its largest mass of belief to another target and is considered the least reliable source. The sources of evidence are first transformed into probabilities by the proposed probability transformation; then the weighting factors are determined by the new probabilistic-based dissimilarity measure. All the combined results of the discounted sources of evidence are listed in Table 7, and the belief assignment to the target for the different alternatives is graphically illustrated in Figure 3.

Table 6: The collected sources of evidence of the target recognition system.
Table 7: Combination results of different methods of the target recognition system.
Figure 3: The belief assignment allocated to target of different alternatives.

As can be observed in Table 7, the Dempster combination rule concludes that the real target is very unlikely whereas a wrong target is almost certain. Such unexpected behavior shows that the DS rule is risky to use for combining sources of evidence in a highly conflicting situation. The results of Yager's rule [12] show that the real target receives only a small mass of belief after the conflicting evidence arrives, since the rule allocates the majority of the belief to the unknown domain Θ; this again reflects that Yager's rule is too conservative. In Murphy's method [14], although the real target has a higher mass of belief than the wrong one, as expected, the results are only the average of the BBAs. Y. Deng's method [15], Liu's method [19], and the proposed approach generate correct results, because once the discounting method is applied, the conflicting evidence is strongly discounted owing to its large conflict with the other sources. From Figure 3, one sees that the proposed probabilistic-based dissimilarity measure, coupled with the automatic determination of the discounting factors, generates more effective results with better convergence performance than all the other methods.

5. Conclusions

In this paper, a new method has been proposed to combine conflicting sources of evidence with different weighting factors. The merit of the new method lies in the elaboration of an efficient probability transformation and a comprehensive probabilistic-based dissimilarity measure, which are used to determine the weighting factors of the sources involved in the fusion process. Through the aforementioned analysis and comparisons, the proposed approach can effectively resolve the counterintuitive behaviors of the classical DS rule when combining highly conflicting sources. Furthermore, it can make the right decision with better robustness and effectiveness for decision-making support systems and target detection systems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work is supported in part by the National Natural Science Foundation of China under Grant no. 61370097.


References

  1. B. Khaleghi, A. Khamis, F. O. Karray, and S. N. Razavi, “Multisensor data fusion: a review of the state-of-the-art,” Information Fusion, vol. 14, no. 1, pp. 28–44, 2013.
  2. A. P. Dempster, “A generalization of Bayesian inference (with discussion),” Journal of the Royal Statistical Society, Series B: Methodological, vol. 30, no. 2, pp. 205–247, 1968.
  3. G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ, USA, 1976.
  4. Y. Xia and M. S. Kamel, “Novel cooperative neural fusion algorithms for image restoration and image fusion,” IEEE Transactions on Image Processing, vol. 16, no. 2, pp. 367–381, 2007.
  5. M. Fontani, T. Bianchi, A. D. Rosa, A. Piva, and M. Barni, “A framework for decision fusion in image forensics based on Dempster-Shafer theory of evidence,” IEEE Transactions on Information Forensics and Security, vol. 8, no. 4, pp. 593–607, 2013.
  6. W. Huang and Z. L. Jing, “Multi-focus image fusion using pulse coupled neural network,” Pattern Recognition Letters, vol. 28, no. 9, pp. 1123–1132, 2007.
  7. A. M. Aziz, “A new multiple decisions fusion rule for targets detection in multiple sensors distributed detection systems with data fusion,” Information Fusion, vol. 18, no. 1, pp. 175–186, 2014.
  8. O. Basir and X. H. Yuan, “Engine fault diagnosis based on multi-sensor information fusion using Dempster-Shafer evidence theory,” Information Fusion, vol. 8, no. 4, pp. 379–386, 2007.
  9. B. Scotney and S. McClean, “Database aggregation of imprecise and uncertain evidence,” Information Sciences, vol. 155, no. 3-4, pp. 245–263, 2003.
  10. L. A. Zadeh, “A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination,” AI Magazine, vol. 7, no. 2, pp. 85–90, 1986.
  11. E. Lefevre, O. Colot, and P. Vannoorenberghe, “Belief function combination and conflict management,” Information Fusion, vol. 3, no. 2, pp. 149–162, 2002.
  12. R. R. Yager, “On the Dempster-Shafer framework and new combination rules,” Information Sciences, vol. 41, no. 2, pp. 93–137, 1987.
  13. W. Q. Wang, Y. J. Zhao, and J. Huang, “A new evidence combination scheme for decision assistant,” in Proceedings of the 4th International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC '12), pp. 53–57, October 2012.
  14. C. K. Murphy, “Combining belief functions when evidence conflicts,” Decision Support Systems, vol. 29, no. 1, pp. 1–9, 2000.
  15. Y. Deng, W. K. Shi, Z. F. Zhu, and Q. Liu, “Combining belief functions based on distance of evidence,” Decision Support Systems, vol. 38, no. 3, pp. 489–493, 2004.
  16. A. L. Jousselme, D. Grenier, and É. Bossé, “A new distance between two bodies of evidence,” Information Fusion, vol. 2, no. 2, pp. 91–101, 2001.
  17. W. Liu, “Analyzing the degree of conflict among belief functions,” Artificial Intelligence, vol. 170, no. 11, pp. 909–924, 2006.
  18. S.-J. Qu, Y.-M. Cheng, Q. K. Pan, Y. Liang, and S.-W. Zhang, “Conflict-redistribution DSmT and new methods dealing with conflict among evidences,” Control and Decision, vol. 24, no. 12, pp. 1856–1859, 2009.
  19. Z.-G. Liu, J. Dezert, Q. Pan, and G. Mercier, “Combination of sources of evidence with different discounting factors based on a new dissimilarity measure,” Decision Support Systems, vol. 52, no. 1, pp. 133–141, 2011.
  20. M. Oussalah, “On the use of Hamacher's t-norms family for information aggregation,” Information Sciences, vol. 153, pp. 107–154, 2003.
  21. P. Smets, “Decision making in the TBM: the necessity of the pignistic transformation,” International Journal of Approximate Reasoning, vol. 38, no. 2, pp. 133–147, 2005.
  22. W. Jang, A. Zhang, and D. Duanmu, “A novel probability transform method for decision making,” in Proceedings of the 3rd Chinese Information Fusion Conference, pp. 110–114, August 2011.
  23. B. R. Cobb and P. P. Shenoy, “On the plausibility transformation method for translating belief function models to probability models,” International Journal of Approximate Reasoning, vol. 41, no. 3, pp. 314–330, 2006.
  24. L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, no. 3, pp. 338–353, 1965.