Mathematical Problems in Engineering
Volume 2018, Article ID 5858272, 10 pages
https://doi.org/10.1155/2018/5858272
Research Article

Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis

1School of Software Engineering, Chongqing University, Chongqing 401331, China
2Dean’s Office, Chongqing Aerospace Polytechnic College, Chongqing 400021, China
3Department of Computer Engineering, Chongqing Aerospace Polytechnic College, Chongqing 400021, China

Correspondence should be addressed to Jun Sang; jsang@cqu.edu.cn

Received 29 May 2017; Revised 3 September 2017; Accepted 15 November 2017; Published 17 January 2018

Academic Editor: Josefa Mula

Copyright © 2018 Lei Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Conflict management in Dempster-Shafer theory (D-S theory) is a hot topic in information fusion. In this paper, a novel weighted evidence combination rule based on evidence distance and uncertainty measure is proposed. The proposed approach consists of two steps. First, the weight is determined based on the evidence distance. Then, the weight value obtained in the first step is modified by taking advantage of uncertainty. Our proposed method can efficiently handle highly conflicting evidences with better convergence performance. A numerical example and an application based on sensor fusion in fault diagnosis are given to demonstrate the efficiency of our proposed method.

1. Introduction

Information fusion technology (IFT) is utilized to comprehensively analyze multisource uncertain information. By retaining the common information, IFT can greatly decrease indeterminacy. The Dempster-Shafer theory of evidence [1] (D-S theory, also known as evidence theory or the theory of belief functions) is regarded as an efficient model to fuse information in intelligent systems [2], and Dempster's combination rule is the most crucial instrument of D-S theory. The theory was first proposed by Dempster in 1967 [1] and then developed to its present form by Shafer in 1976 [3]. Dempster's combination rule possesses several interesting mathematical properties, such as commutativity and associativity, and it plays a very significant role in evidence theory [4]. Nowadays, evidence theory is applied widely in many fields, like supplier selection ([5, 6]), target recognition ([7–9]), decision making ([4, 10, 11]), reliability analysis ([12–15]), and so on.

Although D-S theory has many advantages, there also exist some basic problems that are still not completely clarified. One of the most significant issues is that D-S theory becomes invalid when it is used to fuse highly conflicting evidences, and counterintuitive results ([10, 16, 17]) are generated. To solve this problem, two major methodologies are popular. One is to preprocess the bodies of evidence (BOEs) ([18–20]), and the other is to modify the combination rule ([21–23]). There are mainly three alternative combination rules belonging to the second type: Dubois and Prade's disjunctive combination rule [24], Smets' unnormalized combination rule [25], and Yager's combination rule [21]. These three alternatives have been examined, and they all propose a general combination framework. The main work on preprocessing BOEs includes Murphy's simple average in [19], Yong et al.'s weighted average based on the distance of evidence in [26], and Han et al.'s modified weighted average in [27]. In [19], a simple averaging approach over the primitive BOEs is proposed, in which case all BOEs are seen as equally important, which is unreasonable in practice. Yong et al. [26] obtain a better combination result by combining the weighted average of the masses n − 1 times. The approach proposed by Han et al. [27] is a novel weighted evidence combination approach based on the evidence distance and ambiguity measure (AM), which actually modifies Yong et al.'s work [26].

In this paper, a novel weighted evidence combination rule based on evidence distance and uncertainty measure is proposed to address the combination of conflicting evidences. A numerical example and an application in fault diagnosis are given to demonstrate the efficiency of our proposed method.

The remaining paper is organized as follows: Section 2 starts with a brief introduction of the Dempster-Shafer theory and evidence distance; the proposed method is presented in Section 3; Sections 4 and 5 give a numerical example and an application in fault diagnosis, respectively, to prove the efficiency of our proposed approach; finally, the conclusion is made in Section 6.

2. Preliminaries

In this section, some preliminaries are briefly introduced below.

2.1. Basics of Evidence Theory

The Dempster-Shafer theory of evidence (D-S theory) is an efficient mathematical model for dealing with uncertain information in intelligent systems [1]. The theory was first proposed by Dempster in 1967 [1] and then developed by his student Shafer [3] in 1976.

([29, 30]) Let Θ = {θ₁, θ₂, …, θₙ} be a nonempty finite set, called the frame of discernment, and let 2^Θ be the set of all subsets of Θ, denoted 2^Θ = {∅, {θ₁}, {θ₂}, …, {θ₁, θ₂}, …, Θ}.

In the Dempster-Shafer theory of evidence [3], a basic probability assignment (BPA) is a mapping m: 2^Θ → [0, 1] satisfying m(∅) = 0 and Σ_{A∈2^Θ} m(A) = 1. (1)

If m(A) > 0, A is called a focal element, and the set consisting of all the focal elements is called a body of evidence (BOE). When there is more than one independent body of evidence, Dempster's combination rule (2), which is a powerful and crucial tool in D-S theory, can be utilized to combine these evidences: m(A) = (1 / (1 − k)) Σ_{B∩C=A} m₁(B) m₂(C) for A ≠ ∅, with m(∅) = 0, (2) where k = Σ_{B∩C=∅} m₁(B) m₂(C) stands for the conflict degree, also called the normalization constant. Note that if k = 1, this combination rule makes no sense. Here, we give a specific example of the combination rule and show the corresponding results in Table 1.
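The rule above can be sketched in a few lines of Python. This is an illustrative sketch, not the paper's code: a BPA is assumed to be represented as a dict mapping a frozenset (focal element) to its mass, and the function name is our own.

```python
def dempster_combine(m1, m2):
    """Combine two BPAs with Dempster's rule; the conflict degree k
    is the total mass assigned to empty intersections."""
    combined, k = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:  # non-empty intersection contributes to the fused mass
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                k += mb * mc
    if abs(1.0 - k) < 1e-12:
        raise ValueError("total conflict (k = 1): rule undefined")
    return {a: v / (1.0 - k) for a, v in combined.items()}

m1 = {frozenset({"a"}): 0.9, frozenset({"b"}): 0.1}
m2 = {frozenset({"a", "b"}): 0.8, frozenset({"b"}): 0.2}
fused = dempster_combine(m1, m2)  # masses renormalized by 1 - k
```

Here k = 0.9 × 0.2 = 0.18, so the fused masses are 0.72/0.82 for {a} and 0.10/0.82 for {b}.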

Table 1: An example of combination rule.

Example 1. Suppose that the frame of discernment Θ is complete and there are two BOEs, m₁ and m₂. When m₁ and m₂ are both reliable, to generate a new BPA we can use the conjunctive combination rule [31], m(A) = Σ_{B∩C=A} m₁(B) m₂(C). (3) When only one of them is totally reliable and we are not sure which one, we should apply the disjunctive combination rule [31], m(A) = Σ_{B∪C=A} m₁(B) m₂(C). (4)

([32, 33]) Given a proposition A ⊆ Θ, the belief function of A, Bel(A) = Σ_{B⊆A} m(B), (5) represents the total belief that the object is in A. The plausibility function of A, Pl(A) = Σ_{B∩A≠∅} m(B), (6) measures the total belief that can move into A. In D-S theory, Bel and Pl are called the lower bound function and the upper bound function, respectively, denoted [Bel(A), Pl(A)]. Bel and Pl must satisfy Bel(A) ≤ Pl(A) (7) and Pl(A) = 1 − Bel(Ā). (8) For any proposition A, its uncertainty can be represented by its belief function and plausibility function as u(A) = Pl(A) − Bel(A). (9) Here, we give an example of the belief function and plausibility function and show its results in Table 2.
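The belief, plausibility, and interval uncertainty defined above can be sketched as follows. This is illustrative code with assumed names, where a BPA is represented as a dict from frozenset to mass.

```python
def bel(m, a):
    """Belief: total mass of the subsets of A."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Plausibility: total mass of the sets intersecting A."""
    return sum(v for b, v in m.items() if b & a)

def u(m, a):
    """Interval uncertainty u(A) = Pl(A) - Bel(A)."""
    return pl(m, a) - bel(m, a)

m = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.2, frozenset({"a", "b"}): 0.3}
A = frozenset({"a"})
# Bel(A) = 0.5, Pl(A) = 0.5 + 0.3 = 0.8, u(A) = 0.3,
# and Pl(A) = 1 - Bel(complement of A) as required
```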

Table 2: An example of Bel and Pl.

Example 2. Assume a frame of discernment Θ; a BPA m is given over its subsets, and the corresponding Bel and Pl values are shown in Table 2.

But the classical Dempster combination rule is not always efficient. When BOEs are in high conflict, illogical results are generated [31]. Nowadays, there are mainly two kinds of methodologies: one is to modify the combination rule, and the other is to preprocess the evidences. Smets' unnormalized combination rule [25], Dubois and Prade's disjunctive combination rule [24], and Yager's combination rule [21] belong to the first category. These three alternatives have been examined, and they all propose a general combination framework. To preprocess the evidences, Murphy's simple average in [19], Yong et al.'s weighted average [26], and Han et al.'s modified weighted average in [27] are popular. In [19], a simple averaging approach over the primitive BOEs is proposed, in which case all BOEs are seen as equally important, which is unreasonable in real life. In [26], Yong et al. obtain a better combination result by combining the weighted average of the masses n − 1 times. In [27], a novel weighted evidence combination approach based on the distance of evidence and AM is proposed, which is actually based on Yong et al.'s work [26].

2.2. Evidence Distance

With the wide application of D-S theory, the study of evidence distance has attracted more and more interest [34]. The dissimilarity measure of evidences represents the lack of similarity between two BOEs. Many methods based on evidence distance have been proposed as appropriate measures of this difference, for performance evaluation [35], reliability evaluation [36], conflict evidence combination [26], and target association ([37, 38]). Several definitions of distance in evidence theory have also been proposed, such as Jousselme et al.'s distance [39], Wen et al.'s cosine similarity [40], Ristic and Smets' transferable belief model (TBM) global distance measure ([37, 38]), Sunberg and Rogers's belief function distance metric [41], and so on. Among these definitions, the most frequently used is Jousselme et al.'s distance [39].

The Jousselme et al. distance [39] is defined based on Cuzzolin's geometric interpretation of evidence theory [42]. The power set 2^Θ of the frame of discernment Θ is regarded as a 2^{|Θ|}-dimensional linear space, in which a BPA is a particular vector, so a distance between vectors can be defined. The Jousselme et al. distance is defined as follows: for two BPAs m₁ and m₂ under the frame of discernment Θ, d(m₁, m₂) = sqrt((1/2)(m⃗₁ − m⃗₂)ᵀ D (m⃗₁ − m⃗₂)), where D is a 2^{|Θ|} × 2^{|Θ|} matrix whose elements are defined as D(A, B) = |A ∩ B| / |A ∪ B| for A, B ∈ 2^Θ, with |·| representing cardinality. The Jousselme et al. distance satisfies all four requirements (nonnegativity, nondegeneracy, symmetry, and triangle inequality) [39] of a strict distance metric. This distance is an efficient tool to quantify the dissimilarity of two BOEs. An example of the Jousselme et al. distance is shown below.
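Under the definition above, the distance can be sketched directly. This is illustrative code with assumed names (the powerset enumeration and function names are our own), where a BPA is a dict from frozenset to mass.

```python
import math
from itertools import combinations

def powerset(theta):
    """All nonempty subsets of the frame of discernment."""
    s = sorted(theta)
    return [frozenset(c) for r in range(1, len(s) + 1)
            for c in combinations(s, r)]

def jousselme_distance(m1, m2, theta):
    """d(m1, m2) = sqrt(0.5 * diff^T D diff) with D(A, B) = |A∩B| / |A∪B|."""
    subsets = powerset(theta)
    diff = [m1.get(a, 0.0) - m2.get(a, 0.0) for a in subsets]
    quad = 0.0
    for i, a in enumerate(subsets):
        for j, b in enumerate(subsets):
            jac = len(a & b) / len(a | b)  # Jaccard-style matrix element
            quad += diff[i] * jac * diff[j]
    return math.sqrt(0.5 * quad)

theta = {"a", "b"}
m1 = {frozenset({"a"}): 1.0}
m2 = {frozenset({"b"}): 1.0}
# two categorical BPAs on disjoint singletons are at distance 1
```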

Example 3. Assume there are two BOEs, m₁ and m₂, under the same frame of discernment. Substituting the BOE vectors m⃗₁ and m⃗₂ and the distance matrix D into the definition above yields the Jousselme et al. distance d(m₁, m₂).

3. The Proposed Method

We followed the methods of Wang et al. [43]. Assume that in total n BOEs m₁, m₂, …, mₙ are collected; we can use (13) to preprocess these BOEs ([44–46]): m(A) = Σᵢ₌₁ⁿ wᵢ mᵢ(A), (13) where wᵢ stands for the corresponding weight degree of each BOE and m is the weighted average BPA of the n BOEs. By combining m with itself n − 1 times by means of the classical Dempster's rule ([1, 3]), we can get the final combined result. But finding an appropriate weight is a little difficult. Actually, there are many related works, including Yong et al.'s [26] and Han et al.'s approach [27]. In this paper, we present a new modified weighted evidence combination rule on the basis of evidence distance and a novel uncertainty measure. The flow graph of our proposed method is described in Figure 1.
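The weighted-average-then-combine scheme described above can be sketched as follows. This is an illustrative sketch with assumed names; a BPA is a dict from frozenset to mass, and dempster_combine implements the classical rule (without a guard for total conflict).

```python
def dempster_combine(m1, m2):
    """Classical Dempster's rule for two dict-based BPAs."""
    combined, k = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            a = b & c
            if a:
                combined[a] = combined.get(a, 0.0) + mb * mc
            else:
                k += mb * mc  # conflict degree
    return {a: v / (1.0 - k) for a, v in combined.items()}

def weighted_average_bpa(bpas, weights):
    """m(A) = sum_i w_i * m_i(A) over all focal elements A."""
    avg = {}
    for m, w in zip(bpas, weights):
        for a, v in m.items():
            avg[a] = avg.get(a, 0.0) + w * v
    return avg

def fuse(bpas, weights):
    """Average the n BPAs, then combine the average with itself n - 1 times."""
    avg = weighted_average_bpa(bpas, weights)
    result = avg
    for _ in range(len(bpas) - 1):
        result = dempster_combine(result, avg)
    return result

bpas = [{frozenset({"a"}): 0.8, frozenset({"b"}): 0.2},
        {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}]
out = fuse(bpas, [0.5, 0.5])  # average is (0.7, 0.3), combined once
```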

Figure 1: The flow graph of the proposed method.

Step 1. In this step, based on evidence distance, we can determine all weights of the primitive BOEs.
As seen from the definition of evidence distance, the smaller the distance between two BOEs, the greater their similarity. The similarity measure between two BOEs is defined [20] as Sim(mᵢ, mⱼ) = 1 − d(mᵢ, mⱼ). (14) After getting all the degrees of similarity between BOEs, we can construct a similarity measure matrix (Smm) [26], which gives us insight into the agreement between BOEs. We define the support degree of each BOE [26] as Sup(mᵢ) = Σ_{j=1, j≠i}ⁿ Sim(mᵢ, mⱼ). (16) Then, we can obtain the credibility degree of each BOE based on [26]: Crdᵢ = Sup(mᵢ) / Σⱼ₌₁ⁿ Sup(mⱼ). (17) In fact, since the credibility degrees sum to 1, it is evident that we can regard the credibility degree as a weight directly. So, Crdᵢ can be used to replace the weight in (13) to modify the original BPAs. As mentioned in [27], the uncertainty degree can also be utilized to construct weights; a combination result can be obtained by using AM to modify the credibility degrees obtained based on evidence distance. Here, we propose another uncertainty measure to modify the credibility degrees obtained in Step 1, and the results perform better.
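Step 1 can be sketched generically. This is illustrative code with assumed names; the distance argument would be the Jousselme et al. distance in the paper, but any symmetric distance with values in [0, 1] works for the sketch.

```python
def credibility_degrees(items, distance):
    """Similarity = 1 - distance; support = row sums of the similarity
    matrix excluding self; credibility = normalized support."""
    n = len(items)
    sim = [[1.0 - distance(items[i], items[j]) for j in range(n)]
           for i in range(n)]  # similarity measure matrix (Smm)
    sup = [sum(sim[i][j] for j in range(n) if j != i) for i in range(n)]
    total = sum(sup)
    return [s / total for s in sup]

# Toy demo with scalar "evidence" and absolute difference as the distance:
# the outlier 1.0 agrees with nobody and receives zero credibility.
crd = credibility_degrees([0.0, 0.0, 1.0], lambda x, y: abs(x - y))
```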

Step 2. In this step, the weight will be modified on the basis of uncertainty.
Suppose that one BOE, among those with a relatively high credibility degree generated in the first step, has a lower uncertainty degree than the others; we believe that this BOE is more credible and should possess more weight because of its good quality. On the contrary, if a BOE has both a higher uncertainty degree and a low credibility degree, such a BOE is relatively incredible and may even cause a wrong result. Thus, such a BOE should possess less weight.
Based on the thoughts mentioned above, the weight determined based on evidence distance will be modified through the following steps.
Step 2-1. For each given original BOE mᵢ, compute the belief function and the plausibility function of each proposition using (5) and (6).
Step 2-2. We take advantage of (9) to measure the uncertainty, denoted u(A), of each proposition A. Adding up all the u(A) by using (18), we obtain the total uncertainty measure for one BOE mᵢ, denoted Uᵢ: Uᵢ = Σ_{A⊆Θ} u(A). (18)
Step 2-3. We utilize (19) to modify the weight obtained in Step 1.
Step 2-4. The final modified weight is obtained after normalizing all the modified weights with (20).
Step 2-5. The modified weighted averaged BOE, denoted m̃, is obtained with (21).
If n pieces of evidence are supplied, we can use the classical Dempster's rule [1] to combine m̃ with itself n − 1 times [19]. After that, we can get a better combined outcome and make a better decision.
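Step 2 can be sketched as follows. Note that the exact modification equation (19) is not recoverable from this copy of the paper; the sketch assumes one plausible form, dividing the credibility degree by the total uncertainty before renormalizing, so treat that choice as our assumption rather than the authors' formula.

```python
def total_uncertainty(u_values):
    """Total uncertainty of one BOE: the sum of u(A) = Pl(A) - Bel(A)
    over all propositions A."""
    return sum(u_values)

def modified_weights(crd, uncertainties):
    """Assumed modification: credibility divided by total uncertainty,
    then normalized so the final weights sum to 1."""
    raw = [c / u if u > 0 else c for c, u in zip(crd, uncertainties)]
    total = sum(raw)
    return [r / total for r in raw]

# Two equally credible BOEs: the one with half the total uncertainty
# ends up with twice the weight share.
w = modified_weights([0.5, 0.5], [1.0, 2.0])
```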

4. Experiments

In this section, a numerical example is illustrated to demonstrate the efficiency of our proposed method.

Example 4. In a multisensor-based automatic target recognition system, suppose that the frame of discernment Θ is complete and contains the real target. From five different sensors, the system collects five BOEs: m₁, m₂, m₃, m₄, and m₅.

Step 1. Through the first step of our proposed method, the credibility degree will be obtained, that is, the weight of each BOE can be determined based on evidence distance. The evidence distance between BOEs and the credibility degree of each BOE are shown in Tables 3 and 4, respectively.

Table 3: The evidence distance between BOEs.
Table 4: The weight determined by the evidence distance.

Step 2. Through this step, we can get the modified weight.
We can calculate total uncertainty for each BOE using (9) and (18). The results are listed in Table 5.

Table 5: The total uncertainty for each BOE.

Then by making use of (19) and (20), the final modified weight of each BOE can be obtained. The results are shown in Table 6.

Table 6: The final modified weight for each BOE.

Next, the modified weighted average BPA can be obtained by replacing the weight in (13) with the final modified weight. The results are listed in Table 7.

Table 7: The modified weighted average BPA on the basis of the novel modified weight averaging approach.

Finally, we take advantage of the classical Dempster-Shafer combination rule ([1, 3]) to combine the modified weighted average BPA shown in Table 7 four times [19]. After that, the final combined outcomes are obtained and shown in Table 8 and Figure 2.

Table 8: The final BPA by using the proposed method.
Figure 2: The final BPAs based on the proposed method.

We also apply different combination rules to Example 4; the results are shown in Table 9 and the comparison figures in Figures 3–6.

Table 9: Evidence combination outcomes based on different combination rules.
Figure 3: The results using different combination rules when m₁ and m₂ are given.
Figure 4: The results using different combination rules when m₁, m₂, and m₃ are given.
Figure 5: The results using different combination rules when m₁, m₂, m₃, and m₄ are given.
Figure 6: The results using different combination rules when m₁, m₂, m₃, m₄, and m₅ are given.

As seen from Table 9 and Figures 3–6, when evidences are in high conflict, counterintuitive results that do not reflect the truth are produced by the classical Dempster's combination rule. With incremental BOEs, although Murphy's simple average [19], Yong et al.'s weighted average [26], and Han et al.'s novel weighted average [27] can all give reasonable results, their results are all inferior to the outcomes of our proposed approach. Moreover, the convergence performance of our proposed method is better than that of the other existing methods. The main reason for these phenomena is that, by making use of the evidence distance [39] and the uncertainty measure, the final weights of bad evidences are decreased greatly (e.g., in the experiment, when all five BOEs are given, the weight of the conflicting evidence falls from 0.5 to 0.0899), so their effect on the final combined outcome is weakened extremely. Similarly, because the final weights of good evidences are increased (e.g., in the experiment, with three BOEs given, the total weight of the two credible evidences increases from 0.5 to 0.8743), the effect of these credible evidences on the final combined results is strengthened extremely. The numerical example illustrates adequately that our proposed method is efficient.

5. Application

As is well known, sensor data fusion plays a crucial role in fault diagnosis. In this section, our proposed method is applied to such a problem, and the results show that it is as feasible and efficient as the other existing approaches. Most importantly, by utilizing our proposed method, the accuracy of fault diagnosis is improved drastically. The example in this application section is cited from [28].

Suppose a machine has three gears, and the fault modes F₁, F₂, and F₃ represent that there are faults in gear 1, gear 2, and gear 3, respectively. The fault hypothesis set is Θ = {F₁, F₂, F₃}. Assume three types of sensors monitor the machine; the evidence set derived from the different sensors is denoted E, and the BPAs are all listed in Table 10. Assume the sufficiency indexes of those three pieces of evidence are 1, 0.6, and 1 and the importance indexes are 1, 0.34, and 1, respectively.

Table 10: The basic probability assignments in the application.

According to the equation for the conflict degree k, we can easily get the conflict degree of each pair of evidences; the values obviously show that the second evidence conflicts highly with the others. To solve such a fault diagnosis problem, both the static sensor reliability and the dynamic sensor reliability will be considered. The static sensor reliability mainly depends on technical factors and can be evaluated by experts' assessment or by comparing the detected value with the real value in long-term practice. The dynamic sensor reliability is influenced by the surrounding conditions changing with time and can be measured by comparing the consistency of a sensor's outputs with those of other sensors aimed at the same point.

In this application section, the static sensor reliability is measured based on the evidence sufficiency index and evidence importance index in Fan and Zuo's approach [28]. The dynamic sensor reliability is measured based on the newly proposed method in this paper. The comprehensive sensor reliability, defined from these two, is used to modify the highly conflicting evidences, and finally, through the classical Dempster's combination rule, we can obtain the final results with a high accuracy for such a fault diagnosis problem. The detailed steps are shown as follows.
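The comprehensive reliability step can be sketched as follows. The defining equation is not recoverable from this copy of the paper, so the sketch assumes the comprehensive reliability is the product of the static and dynamic reliabilities, normalized to sum to 1; treat this form and the sample values as illustrative assumptions.

```python
def comprehensive_reliability(static_rel, dynamic_rel):
    """Assumed combination: alpha_i = static_i * dynamic_i, normalized so
    the values can replace the weights in the weighted-average BPA (13)."""
    raw = [s * d for s, d in zip(static_rel, dynamic_rel)]
    total = sum(raw)
    return [r / total for r in raw]

# Illustrative values only (not the paper's Tables 11-12): the sensor with
# low static and dynamic reliability ends up with a small normalized weight.
alpha = comprehensive_reliability([1.0, 0.5, 1.0], [0.4, 0.2, 0.4])
```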

Step 1. Calculate the static sensor reliability of each evidence; the results are listed in Table 11.

Table 11: The static sensor reliability of each evidence.

Step 2. Based on the newly proposed method, calculate the dynamic sensor reliability of each evidence and the results are listed in Table 12.

Table 12: The dynamic sensor reliability of each evidence.

Step 3. Calculate the comprehensive sensor reliability from the static and dynamic sensor reliabilities and then normalize it. The results are shown in Table 13.

Table 13: The comprehensive reliability of each evidence.

Step 4. The normalized comprehensive sensor reliability is applied to replace the weight in (13) to modify the BPAs, and by using the classical Dempster's combination rule twice, we can obtain the final results shown in Table 14.

Table 14: The final results in the application.

As seen from Table 14, based on our method, the belief degree of the fault F₁ is 88.99%, while the fault F₂ only has a belief degree of 7.39%. It is evident that the belief in F₁ far exceeds that in F₂. Thus, through our approach, we can correctly identify the fault F₁; that is, Gear 1 has a fault, and so our proposed method is efficient in fault diagnosis.

We also compare our proposed method with others, and the comparison results are shown in Table 15. In D-S evidence theory, the belief degree of the fault F₁ is 0.4519, while that of F₂ is 0.5048. Due to the existence of the conflicting evidence, D-S evidence theory comes to the counterintuitive result that the belief in F₂ exceeds that in F₁, which will lead to a wrong decision. However, the remaining two methods can handle the conflicting evidence, so they can both reach the right result. Fan and Zuo's method reaches the right conclusion with a lower belief degree, while our newly proposed approach reaches a higher belief degree of 88.99%. The main reason is that the proposed method takes into consideration not only the static reliability, represented by evidence sufficiency and evidence importance, but also the dynamic reliability, that is, the information volume of the sensor itself, measured by evidence distance and the uncertainty measure, which decreases the weight of the conflicting evidence and thus lessens its influence on the final result.

Table 15: The comparison results between our proposed approach and other methods.

6. Conclusion

In this paper, a novel weighted evidence combination rule based on evidence distance ([20, 39]) and an uncertainty measure is put forward. The proposed approach can efficiently address the combination of conflicting evidences. Compared with other existing methods, our proposed approach exhibits the fastest convergence when managing highly conflicting evidences. In addition, we apply the newly proposed method to fault diagnosis, and the results of the application sufficiently demonstrate the efficiency of our approach. The process of our proposed method is simple and converges well, so we believe our proposed method has a promising future.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by Specialized Research Fund for the Doctoral Program of Higher Education of China (SRFDP) (no. 20130191110027) and by Research and Construction of Self-Regulated Learning Platform of Software Technology Specialty in Higher Vocational Colleges (no. KJ1502901) and by Research and demonstration application based on MOOCS Independent Learning Platform of Training Support Technology for Industry 4 talents (no. GZTG201605).

References

  1. A. P. Dempster, “Upper and lower probabilities induced by a multivalued mapping,” Annals of Mathematical Statistics, vol. 38, pp. 325–339, 1967.
  2. Y. Song, X. Wang, and H. Zhang, “A distance measure between intuitionistic fuzzy belief functions,” Knowledge-Based Systems, vol. 86, pp. 288–298, 2015.
  3. G. Shafer, A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ, USA, 1976.
  4. M. Beynon, B. Curry, and P. Morgan, “The Dempster-Shafer theory of evidence: an alternative approach to multicriteria decision modelling,” Omega, vol. 28, no. 1, pp. 37–50, 2000.
  5. Z. Dou, X. Xu, Y. Lin, and R. Zhou, “Application of D-S evidence fusion method in the fault detection of temperature sensor,” Mathematical Problems in Engineering, vol. 2014, Article ID 395057, 6 pages, 2014.
  6. X. Deng, Y. Hu, Y. Deng, and S. Mahadevan, “Supplier selection using AHP methodology extended by D numbers,” Expert Systems with Applications, vol. 41, pp. 156–167, 2014.
  7. X. Xu, S. Li, X. Song, C. Wen, and D. Xu, “The optimal design of industrial alarm systems based on evidence theory,” Control Engineering Practice, vol. 46, pp. 142–156, 2016.
  8. Y. Chen, A. B. Cremers, and Z. Cao, “Interactive color image segmentation via iterative evidential labeling,” Information Fusion, vol. 20, no. 1, pp. 292–304, 2014.
  9. D. Suh and J. Yook, “A method to determine basic probability assignment in context awareness of a moving object,” International Journal of Distributed Sensor Networks, vol. 2013, Article ID 972641, 7 pages, 2013.
  10. L. A. Zadeh, “A simple view of the Dempster-Shafer theory of evidence and its implication for the rule of combination,” AI Magazine, vol. 7, no. 2, pp. 85–90, 1986.
  11. Y. Leung, N.-N. Ji, and J.-H. Ma, “An integrated information fusion approach based on the theory of evidence and group decision-making,” Information Fusion, vol. 14, no. 4, pp. 410–422, 2013.
  12. S. A. Khamseh, A. K. Sedigh, B. Moshiri, and A. Fatehi, “Control performance assessment based on sensor fusion techniques,” Control Engineering Practice, vol. 49, pp. 14–28, 2016.
  13. D. Niu, Y. Wei, Y. Shi, and H. R. Karimi, “A novel evaluation model for hybrid power system based on vague set and Dempster-Shafer evidence theory,” Mathematical Problems in Engineering, vol. 2012, Article ID 784389, 2012.
  14. Q. Zhou, H. Zhou, Q. Zhou, F. Yang, L. Luo, and T. Li, “Structural damage detection based on posteriori probability support vector machine and Dempster-Shafer evidence theory,” Applied Soft Computing, vol. 36, pp. 368–374, 2015.
  15. J. Xu, Z. Zhong, and L. Xu, “ISHM-oriented adaptive fault diagnostics for avionics based on a distributed intelligent agent system,” International Journal of Systems Science, vol. 46, no. 13, pp. 2287–2302, 2015.
  16. L. A. Zadeh, “Review of a mathematical theory of evidence,” AI Magazine, vol. 5, no. 3, article 81, 1984.
  17. E. Lefevre, O. Colot, and P. Vannoorenberghe, “Belief function combination and conflict management,” Information Fusion, vol. 3, no. 2, pp. 149–162, 2002.
  18. V. Torra, Y. Narukawa, and S. Miyamoto, “Modeling decisions for artificial intelligence: theory, tools and applications,” in Modeling Decisions for Artificial Intelligence, vol. 3558 of Lecture Notes in Computer Science, pp. 1–8, Springer, Berlin, Heidelberg, 2005.
  19. C. K. Murphy, “Combining belief functions when evidence conflicts,” Decision Support Systems, vol. 29, no. 1, pp. 1–9, 2000.
  20. F. Voorbraak, “On the justification of Dempster's rule of combination,” Artificial Intelligence, vol. 48, no. 2, pp. 171–197, 1991.
  21. R. R. Yager, “On the Dempster-Shafer framework and new combination rules,” Information Sciences, vol. 41, no. 2, pp. 93–137, 1987.
  22. K. Yamada, “A new combination of evidence based on compromise,” Fuzzy Sets and Systems, vol. 159, no. 13, pp. 1689–1708, 2008.
  23. D. Han, C. Han, and Y. Yang, “A modified evidence combination approach based on ambiguity measure,” in Proceedings of the 11th International Conference on Information Fusion (FUSION '08), Cologne, Germany, July 2008.
  24. D. Dubois and H. Prade, “Representation and combination of uncertainty with belief functions and possibility measures,” Computational Intelligence, vol. 4, no. 3, pp. 244–264, 1988.
  25. P. Smets, “Combination of evidence in the transferable belief model,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 12, no. 5, pp. 447–458, 1990.
  26. D. Yong, S. WenKang, Z. ZhenFu, and L. Qi, “Combining belief functions based on distance of evidence,” Decision Support Systems, vol. 38, no. 3, pp. 489–493, 2004.
  27. D. Han, Y. Deng, C. Han, and Z. Hou, “Weighted evidence combination based on distance of evidence and uncertainty measure,” Journal of Infrared, Millimeter and Terahertz Waves, vol. 30, no. 5, pp. 396–400, 2011.
  28. X. Fan and M. J. Zuo, “Fault diagnosis of machines based on D-S evidence theory. Part 1: D-S evidence theory and its improvement,” Pattern Recognition Letters, vol. 27, no. 5, pp. 366–376, 2006.
  29. W. Zhu, H. Yang, Y. Jin, and B. Liu, “A method for recognizing fatigue driving based on Dempster-Shafer theory and fuzzy neural network,” Mathematical Problems in Engineering, vol. 2017, Article ID 6191035, 2017.
  30. W. Jiang, B. Wei, X. Qin, J. Zhan, and Y. Tang, “Sensor data fusion based on a new conflict measure,” Mathematical Problems in Engineering, vol. 2016, Article ID 5769061, 2016.
  31. W. Liu, “Analyzing the degree of conflict among belief functions,” Artificial Intelligence, vol. 170, no. 11, pp. 909–924, 2006.
  32. H. Zheng, Y. Deng, and Y. Hu, “Fuzzy evidential influence diagram and its evaluation algorithm,” Knowledge-Based Systems, vol. 131, pp. 28–45, 2017.
  33. J. Qian, X. Guo, and Y. Deng, “A novel method for combining conflicting evidences based on information entropy,” Applied Intelligence, vol. 46, no. 4, pp. 876–888, 2017.
  34. A.-L. Jousselme and P. Maupin, “On some properties of distances in evidence theory,” in Proceedings of the Workshop on the Theory of Belief Functions (BELIEF), Brest, France, 2010.
  35. D. Fixsen and R. P. S. Mahler, “The modified Dempster-Shafer approach to classification,” IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 27, no. 1, pp. 96–104, 1997.
  36. H. Guo, W. Shi, and Y. Deng, “Evaluating sensor reliability in classification problems based on evidence theory,” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 36, no. 5, pp. 970–981, 2006.
  37. B. Ristic and P. Smets, “The TBM global distance measure for the association of uncertain combat ID declarations,” Information Fusion, vol. 7, no. 3, pp. 276–284, 2006.
  38. B. Ristic and P. Smets, “Global cost of assignment in the TBM framework for association of uncertain ID reports,” Aerospace Science and Technology, vol. 11, no. 4, pp. 303–309, 2007.
  39. A. L. Jousselme, D. Grenier, and É. Bossé, “A new distance between two bodies of evidence,” Information Fusion, vol. 2, no. 2, pp. 91–101, 2001.
  40. C. Wen, Y. Wang, and X. Xu, “Fuzzy information fusion algorithm of fault diagnosis based on similarity measure of evidence,” in Advances in Neural Networks - ISNN 2008, pp. 506–515, Springer, Berlin, 2008.
  41. Z. Sunberg and J. Rogers, “A belief function distance metric for orderable sets,” Information Fusion, vol. 14, no. 4, pp. 361–373, 2013.
  42. F. Cuzzolin, “A geometric approach to the theory of evidence,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 38, no. 4, pp. 522–534, 2008.
  43. J. Wang, Y. Hu, F. Xiao, X. Deng, and Y. Deng, “A novel method to use fuzzy soft sets in decision making based on ambiguity measure and Dempster-Shafer theory of evidence: an application in medical diagnosis,” Artificial Intelligence in Medicine, vol. 69, pp. 1–11, 2016.
  44. J. F. Baldwin, “A calculus for mass assignments in evidential reasoning,” in Advances in the Dempster-Shafer Theory of Evidence, pp. 513–531, John Wiley & Sons, 1994.
  45. E. Lefevre, O. Colot, P. Vannoorenberghe, and D. De Brucq, “Contribution des mesures d'information à la modélisation crédibiliste de connaissances” [Contribution of information measures to the credibilistic modeling of knowledge], Traitement du Signal, vol. 17, no. 2, pp. 87–97, 2000.
  46. S. Fabre, A. Appriou, and X. Briottet, “Presentation and description of two classification methods using data fusion based on sensor management,” Information Fusion, vol. 2, no. 1, pp. 49–71, 2001.