Research Article | Open Access

Special Issue: Managing Information Uncertainty and Complexity in Decision-Making

Deyun Zhou, Yongchuan Tang, Wen Jiang, "An Improved Belief Entropy and Its Application in Decision-Making", Complexity, vol. 2017, Article ID 4359195, 15 pages, 2017. https://doi.org/10.1155/2017/4359195

An Improved Belief Entropy and Its Application in Decision-Making

Academic Editor: Jurgita Antucheviciene
Received: 22 Dec 2016
Accepted: 23 Jan 2017
Published: 16 Mar 2017

Abstract

Uncertainty measurement in data fusion applications is an active topic; quite a few methods have been proposed to measure the degree of uncertainty in Dempster-Shafer framework. However, the existing methods pay little attention to the scale of the frame of discernment (FOD), which means a loss of information; as a result, they cannot measure the difference in uncertain degree among different FODs. In this paper, an improved belief entropy is proposed in Dempster-Shafer framework. The proposed belief entropy takes into consideration more of the available information in the body of evidence (BOE), including the uncertain information modeled by the mass function, the cardinality of each proposition, and the scale of the FOD. The improved belief entropy is a new method for uncertainty measurement in Dempster-Shafer framework. Based on the new belief entropy, a decision-making approach is designed, and the validity of the new belief entropy is verified with numerical examples and the proposed decision-making approach.

1. Introduction

Decision-making in an uncertain environment is common in real world applications, such as civil engineering [1–3], supply chain management [4, 5], risk analysis [6, 7], medical service [8, 9], and so on [10–12]. In order to reach the goal of maximum comprehensive benefit, many methods have been developed for decision-making based on probability theory [13, 14], fuzzy set theory [15–18], Dempster-Shafer evidence theory [19–22], rough set theory [23–25], and so on [26–29].

Dempster-Shafer evidence theory [19, 20] is effective in uncertain information processing. It provides the frame of discernment (FOD) and the basic probability assignment (BPA) for information modeling, as well as Dempster’s rule of combination for data fusion. Dempster-Shafer evidence theory has been extensively studied in many fields, such as pattern recognition [30–34], fault diagnosis [35–38], multiple attribute decision-making [39–41], risk analysis [22, 35, 42, 43], controller design [44, 45], and so on [46–48]. Some open issues in Dempster-Shafer evidence theory still need further study, including conflict management [49–52], the independence of different bodies of evidence [53–55], methods to generate BPAs [56–58], and the incompleteness of the FOD [59–61]. One way to address these open issues is to quantify the degree of uncertainty of uncertain information before further information processing.

In the probabilistic framework, Shannon entropy [62] is a well-established measure of uncertainty and has attracted much attention in real applications [63–65]. But Shannon entropy cannot be used directly in the framework of Dempster-Shafer evidence theory, because a mass function in evidence theory is a generalized probability assigned on the power set of the FOD. To overcome this limitation, many methods have been proposed to measure the uncertain degree of evidence in Dempster-Shafer framework, such as Hohle’s confusion measure [66], Yager’s dissonance measure [67], Dubois & Prade’s weighted Hartley entropy [68], Klir & Ramer’s discord measure [69], Klir & Parviz’s strife measure [70], George & Pal’s total conflict measure [71], and so on [72–74]. Some of these methods are derived from Shannon entropy [66–70], but the aforementioned methods are not fully effective in some cases [71, 75].

Recently, another uncertainty measure, named Deng entropy [75], was proposed in Dempster-Shafer framework. Although Deng entropy has been successfully applied in some real applications [36–38, 80, 81], it does not take into consideration the scale of the FOD, which means a loss of available information during information processing. To make matters worse, this information loss leads to failures of the uncertainty measure in some cases [71]. The same shortcoming also exists in the confusion measure [66], the dissonance measure [67], the weighted Hartley entropy [68], the discord measure [69], and the strife measure [70]. To address this issue, an improved belief entropy based on Deng entropy is proposed in this paper. The proposed belief entropy improves the performance of Deng entropy by considering the scale of the FOD and the relative scale of a focal element with respect to the FOD. What is more, the proposed method keeps all the merits of Deng entropy; thus it degenerates to Shannon entropy in the sense of probability consistency.

In order to verify the validity of the improved belief entropy, a decision-making approach in target identification is designed based on the new belief entropy. In the proposed method, the uncertain degree of the sensor data is measured by the new belief entropy; then the uncertain degree will be used as the relative weight of each sensor report modeled as the body of evidence (BOE); after that, the BPAs of the BOE will be modified by the weight value; finally, the decision-making is based on the fusion results of applying Dempster’s rule of combination to the modified BPAs.

The rest of this paper is organized as follows. In Section 2, the preliminaries on Dempster-Shafer evidence theory, Shannon entropy, and some uncertainty measures in Dempster-Shafer framework are briefly introduced. In Section 3, the improved belief entropy is proposed; some behaviors of the proposed belief entropy are discussed with the numerical examples. In Section 4, an uncertainty measure-based decision-making approach is proposed to show the efficiency of the new belief entropy. The conclusions are given in Section 5.

2. Preliminaries

In this section, some preliminaries are briefly introduced, including Dempster-Shafer evidence theory, Shannon entropy, and some typical uncertainty measures in Dempster-Shafer framework.

2.1. Dempster-Shafer Evidence Theory

Let $X$ be a finite nonempty set of $N$ mutually exclusive and exhaustive events, $X = \{\theta_1, \theta_2, \ldots, \theta_N\}$; $X$ is called the frame of discernment (FOD). The power set of $X$, denoted as $2^X$, is composed of $2^N$ elements denoted as follows:

$$2^X = \{\emptyset, \{\theta_1\}, \ldots, \{\theta_N\}, \{\theta_1, \theta_2\}, \ldots, X\}$$

A mass function $m$ is defined as a mapping from the power set $2^X$ to the interval $[0, 1]$, which satisfies the following conditions [19, 20]:

$$m(\emptyset) = 0, \qquad \sum_{A \in 2^X} m(A) = 1$$

If $m(A) > 0$, then $A$ is called a focal element, and the mass value $m(A)$ represents how strongly the evidence supports the proposition $A$.

A mass function is also known as a basic probability assignment (BPA) or basic belief assignment (BBA). A body of evidence (BOE) is represented by the focal elements and their associated mass values:

$$(\mathcal{F}, m) = \{\langle A, m(A)\rangle : A \in 2^X,\ m(A) > 0\}$$

where $\mathcal{F}$ is a subset of the power set $2^X$ and each $A \in \mathcal{F}$ has an associated nonzero mass value $m(A)$.

A BPA can also be represented by its associated belief function $Bel$ and plausibility function $Pl$, respectively defined as follows:

$$Bel(A) = \sum_{B \subseteq A} m(B), \qquad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B)$$

In Dempster-Shafer evidence theory, two independent mass functions, denoted as $m_1$ and $m_2$, can be combined with Dempster’s rule of combination, defined as follows [19, 20]:

$$m(A) = \frac{1}{1 - k} \sum_{B \cap C = A} m_1(B)\, m_2(C), \quad A \neq \emptyset \tag{5}$$

where $m(\emptyset) = 0$ and $k$ is a normalization constant representing the degree of conflict between $m_1$ and $m_2$, defined as follows [19, 20]:

$$k = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C) \tag{6}$$
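The combination rule in (5)-(6) can be sketched in Python; this is an illustrative implementation, not the authors' code, with mass functions represented as dicts mapping frozensets of hypotheses to masses:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule (5)-(6).

    Each mass function is a dict mapping a focal element (a frozenset
    of hypotheses) to its mass.  k is the degree of conflict; mass on
    empty intersections is discarded and the remainder renormalized
    by 1 - k.
    """
    fused, k = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            fused[inter] = fused.get(inter, 0.0) + mb * mc
        else:
            k += mb * mc  # conflicting mass
    if k >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    return {a: v / (1.0 - k) for a, v in fused.items()}

m1 = {frozenset({'a'}): 0.5, frozenset({'a', 'b'}): 0.5}
m2 = {frozenset({'b'}): 0.5, frozenset({'a', 'b'}): 0.5}
print(dempster_combine(m1, m2))  # equal masses of 1/3 on {a}, {b}, {a, b}
```

Here the conflicting product $m_1(\{a\})\,m_2(\{b\}) = 0.25$ is removed and the remaining mass renormalized by $1 - k = 0.75$.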

2.2. Shannon Entropy

As an uncertainty measure of information volume in a system or process, Shannon entropy plays a central role in information theory. Shannon entropy indicates that the information volume of each piece of information is directly connected to its uncertainty degree.

Shannon entropy $H$, as the information entropy, is defined as follows [62]:

$$H = -\sum_{i=1}^{N} p_i \log_b p_i$$

where $N$ is the number of basic states, $p_i$ is the probability of state $i$ and satisfies $\sum_{i} p_i = 1$, and $b$ is the base of the logarithm, which accounts for the scaling of $H$. Although $b$ is arbitrary, it is usually chosen to be 2, and the unit of information entropy is then the bit. If $b$ is the natural base $e$, then the unit of information entropy is the nat. Mathematically, since the base acts only as a scaling factor, different bases of the logarithm are interconvertible.
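The definition above translates directly into a short Python sketch (the function name is an illustrative choice):

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i * log_base(p_i); zero-probability states contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([0.25] * 4))  # uniform over 4 states: 2.0 bits
```

Passing `base=math.e` yields the same quantity in nats, illustrating that the base is only a scaling factor.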

2.3. Uncertainty Measures in Dempster-Shafer Framework

In Dempster-Shafer framework, several uncertainty measures for the BOE have been presented, as shown in Table 1, where $X$ is the FOD, $A$ and $B$ are focal elements of a mass function $m$, and $|A|$ denotes the cardinality of $A$.


Uncertainty measure | Definition

Hohle’s confusion measure [66] | $C_H(m) = -\sum_{A \subseteq X} m(A) \log_2 Bel(A)$
Yager’s dissonance measure [67] | $E_Y(m) = -\sum_{A \subseteq X} m(A) \log_2 Pl(A)$
Dubois & Prade’s weighted Hartley entropy [68] | $H_{DP}(m) = \sum_{A \subseteq X} m(A) \log_2 |A|$
Klir & Ramer’s discord measure [69] | $D_{KR}(m) = -\sum_{A \subseteq X} m(A) \log_2 \sum_{B \subseteq X} m(B) \frac{|A \cap B|}{|B|}$
Klir & Parviz’s strife measure [70] | $S_{KP}(m) = -\sum_{A \subseteq X} m(A) \log_2 \sum_{B \subseteq X} m(B) \frac{|A \cap B|}{|A|}$
George & Pal’s total conflict measure [71] | $TC_{GP}(m) = \sum_{A \subseteq X} m(A) \sum_{B \subseteq X} m(B) \left(1 - \frac{|A \cap B|}{|A \cup B|}\right)$

Lately, another belief entropy, named Deng entropy, has been presented to measure the uncertainty in the BOE. Deng entropy, denoted as $E_d$, is defined as follows [75]:

$$E_d(m) = -\sum_{A \subseteq X} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1} \tag{8}$$
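A minimal Python sketch of Deng entropy follows; the dict-based representation and function name are illustrative choices:

```python
import math

def deng_entropy(m):
    """Deng entropy (8): E_d(m) = -sum_A m(A) * log2(m(A) / (2**|A| - 1)).

    m maps each focal element (a frozenset) to its mass.  For a
    Bayesian BPA (singleton focal elements only), 2**1 - 1 = 1 and
    the value coincides with Shannon entropy.
    """
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)

print(deng_entropy({frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}))  # 1.0
print(deng_entropy({frozenset({'a', 'b'}): 1.0}))  # log2(3), about 1.585
```

The second call shows how mass on a compound focal element is spread over its $2^{|A|} - 1$ nonempty subsets, raising the measured uncertainty above the Shannon value.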

3. The Improved Belief Entropy

3.1. Problem Description

In Dempster-Shafer framework, the uncertain information modeled in the BOE includes the mass function and the FOD. However, the existing uncertainty measures focus only on the mass function [66, 67] or at most take into consideration the cardinality of each proposition [68–71, 75]; in other words, the scale of the FOD is totally ignored. Without taking full advantage of the available information in the BOE, the existing uncertainty measures cannot effectively quantify the difference among BOEs if the same mass values are assigned over different FODs. A simple example of this limitation of Deng entropy is shown in Example 1.

Example 1. Consider a target identification problem; assume that two reliable sensors report their detection results independently. Each result is represented as a BOE; the two BOEs assign identical mass values to their focal elements but are defined over different FODs. Recalling (8), Deng entropy yields exactly the same value for both BOEs. The same limitation also exists in Dubois & Prade’s weighted Hartley entropy [68], which likewise assigns both BOEs the same value; the results are listed in Table 2. These results are counterintuitive. Although the two BOEs have the same mass values, the FOD of the first BOE consists of four targets, denoted as $a$, $b$, $c$, and $d$, while the second BOE has only three possible targets, denoted as $a$, $b$, and $c$. Intuitively, it is expected that the second BOE $m_2$ has less uncertainty than the first BOE $m_1$; in other words, the belief entropy of $m_1$ should be bigger than that of $m_2$. Both Deng entropy and the weighted Hartley entropy fail to quantify this difference.
To address this issue, an improved belief entropy is proposed.

3.2. The Proposed Belief Entropy

In Dempster-Shafer framework, the improved belief entropy is proposed as follows:

$$E_{Md}(m) = -\sum_{A \subseteq X} m(A) \log_2 \left( \frac{m(A)}{2^{|A|} - 1}\, e^{\frac{|A| - 1}{|X|}} \right) \tag{12}$$

where $X$ is the FOD, $|A|$ denotes the cardinality of the focal element $A$, and $|X|$ is the number of elements in the FOD. Compared with the other uncertainty measures in [66–71, 75], the improved belief entropy addresses more information in a BOE. The uncertain information addressed by the new belief entropy includes the information represented by the mass function, the cardinality of each proposition, the scale of the FOD (denoted as $|X|$), and the relative scale of a focal element with respect to the FOD (denoted as $(|A| - 1)/|X|$).

In detail, compared with the confusion measure [66] and the dissonance measure [67], the uncertain information modeled by the cardinality of each proposition and the scale of the FOD can now be handled properly. Compared with the weighted Hartley entropy [68], the discord measure [69], the strife measure [70], the total conflict measure [71], and Deng entropy [75], the scale of the FOD and the relative scale of a focal element with respect to the FOD can now be processed effectively. Above all, by involving the cardinality of each proposition and the scale of the FOD, the new uncertainty measure can effectively quantify the difference among BOEs even if the same mass values are assigned over different FODs. What is more, addressing more of the information in the BOE means less information loss in information processing.
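The improved belief entropy can be sketched in Python; this is a reconstruction assuming the form $E_{Md}(m) = -\sum_A m(A)\log_2\big(\tfrac{m(A)}{2^{|A|}-1}e^{(|A|-1)/|X|}\big)$, with an illustrative dict-based representation:

```python
import math

def improved_belief_entropy(m, fod_size):
    """Sketch of (12):
    E_Md(m) = -sum_A m(A) * log2( m(A)/(2**|A| - 1) * e**((|A|-1)/|X|) ).

    m maps each focal element (a frozenset) to its mass; fod_size is |X|.
    The factor e**((|A| - 1)/|X|) makes the value sensitive to the scale
    of the FOD, so identical BPAs over different FODs get different
    entropies.
    """
    return -sum(v * math.log2(v / (2 ** len(a) - 1)
                              * math.exp((len(a) - 1) / fod_size))
                for a, v in m.items() if v > 0)

# The same two-element focal set carries more uncertainty in a larger FOD:
m = {frozenset({'a', 'b'}): 1.0}
print(improved_belief_entropy(m, 4) > improved_belief_entropy(m, 3))  # True
```

For singleton focal elements the extra factor is $e^0 = 1$ and the value reduces to Shannon entropy, independent of the FOD size.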

With the new belief entropy, recall the issue in Example 1; the improved belief entropy of the two BOEs, calculated with (12), is listed in the last column of Table 2.

It can be concluded that the weighted Hartley entropy and Deng entropy cannot distinguish the uncertain degrees of these two BOEs, while the new belief entropy effectively measures the difference by taking into consideration more of the available information in the BOE. According to Table 2, the first BOE has the higher uncertain degree under the new belief entropy; this is reasonable, because the FOD of the first BOE includes four possible targets, which means a larger information volume than that of the second BOE. Neither the weighted Hartley entropy nor Deng entropy offers this capability.


BOE | Weighted Hartley entropy [68] | Deng entropy [75] | Improved belief entropy

$m_1$ | 1 | 2.5559 | 2.1952
$m_2$ | 1 | 2.5559 | 2.0750

3.3. Behaviors of the Proposed Belief Entropy

In order to show the rationality and merit of the proposed belief entropy, some numerical examples are presented in this section. In Section 3.3.1, the compatibility of the new belief entropy with Shannon entropy is verified with some simple numerical examples. In Section 3.3.2, the superiority of the new belief entropy compared with some other uncertainty measures is presented.

3.3.1. Compatibility with Shannon Entropy

Example 2. Consider a target identification problem; if the target reported by the sensor is $\theta_1$ with one hundred percent belief, then the mass function can be denoted as $m(\{\theta_1\}) = 1$ in the frame of discernment $X$.
Shannon entropy $H$, Deng entropy $E_d$, and the improved belief entropy $E_{Md}$ are calculated, respectively; since the only focal element is a singleton with mass 1, $H = E_d = E_{Md} = 0$. It is obvious that the uncertainty degree of a certain event is zero, so the values of Shannon entropy, Deng entropy, and the improved belief entropy are all zero.

Example 3. Consider a Bayesian mass function, that is, one whose mass values are assigned only to single elements of the frame of discernment $X$.
For such a mass function, Shannon entropy $H$, Deng entropy $E_d$, and the improved belief entropy $E_{Md}$ coincide: when $|A| = 1$, $2^{|A|} - 1 = 1$ and $e^{(|A|-1)/|X|} = 1$, so both belief entropies reduce to $-\sum m(\{\theta\}) \log_2 m(\{\theta\})$. According to Examples 2 and 3, if the mass values are only assigned to single elements, the result of the improved belief entropy is consistent with Shannon entropy and Deng entropy. The new belief entropy is compatible with Shannon entropy in the sense of probability consistency, which indicates the effectiveness of the proposed belief entropy.

3.3.2. Efficiency in Uncertainty Measure

In order to test the efficiency and merit of the new belief entropy, recall the numerical example in [75] as follows.

Example 4. Consider the mass functions $m(\{\theta_6\}) = 0.05$, $m(\{\theta_3, \theta_4, \theta_5\}) = 0.05$, $m(A) = 0.8$, and $m(X) = 0.1$. The FOD is $X$ with fifteen elements denoted as $X = \{\theta_1, \theta_2, \ldots, \theta_{15}\}$. The proposition $A$ is a variable subset whose number of elements changes from one to fourteen, as shown in Table 3.
Deng entropy and the improved belief entropy are calculated, respectively, for each element number of the proposition $A$; the results are shown in Table 3. The improved belief entropy is smaller than Deng entropy in every case. This is reasonable, because more of the information in the BOE is taken into consideration by the improved belief entropy. By exploiting this additional information, the uncertain degree measured by the new belief entropy decreases significantly compared to Deng entropy, which is helpful in information processing.
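As a sanity check, the first row of Table 3 can be reproduced from the Example 4 BPA. The helpers below implement Deng entropy (8) and a reconstruction of the improved belief entropy (12); BPAs are encoded as (cardinality, mass) pairs, since both measures depend only on focal-element cardinalities:

```python
import math

def deng_entropy(bpa):
    """Deng entropy (8) from (cardinality, mass) pairs."""
    return -sum(v * math.log2(v / (2 ** n - 1)) for n, v in bpa if v > 0)

def improved_entropy(bpa, fod_size):
    """Reconstructed improved belief entropy (12)."""
    return -sum(v * math.log2(v / (2 ** n - 1) * math.exp((n - 1) / fod_size))
                for n, v in bpa if v > 0)

# Example 4 BPA with a single-element proposition A (first row of Table 3):
# m({th6}) = 0.05, m({th3, th4, th5}) = 0.05, m(A) = 0.8, m(X) = 0.1
bpa = [(1, 0.05), (3, 0.05), (1, 0.8), (15, 0.1)]
print(f"{deng_entropy(bpa):.4f}")          # 2.6623
print(f"{improved_entropy(bpa, 15):.4f}")  # 2.5180
```

Both printed values match the first row of Table 3, and growing $A$ one element at a time reproduces the remaining rows.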
The uncertain degree in Example 4, measured by the other uncertainty measures in Table 1, is shown in Figure 1. The uncertain degree measured by Hohle’s confusion measure never changes with the number of elements in the proposition $A$; thus it cannot capture the variation of the uncertain degree in this case. Similar to Hohle’s confusion measure, Yager’s dissonance measure has a limited capacity for uncertainty measurement here: neither measure responds to the change in the proposition $A$. The uncertain degree measured by Klir & Ramer’s discord measure, Klir & Parviz’s strife measure, and George & Pal’s conflict measure decreases as the number of elements in the proposition $A$ increases, which is counterintuitive. Thus, the confusion measure, the dissonance measure, the discord measure, the strife measure, and the conflict measure cannot effectively capture the rise of the uncertain degree with the growing element number of the proposition $A$.
The uncertain degree measured by Dubois & Prade’s weighted Hartley entropy, Deng entropy, and the improved belief entropy rises significantly with the increasing element number of the proposition $A$. However, the weighted Hartley entropy and Deng entropy cannot distinguish the uncertain degrees of BOEs with the same mass values on different FODs, as shown in Example 1. Thus, the improved belief entropy is the only effective method for uncertainty measurement in this case. More importantly, the proposed belief entropy takes advantage of more of the valuable information in the BOE, which makes it more reasonable and effective for uncertainty measurement in Dempster-Shafer framework.


Cases | Deng entropy | Improved belief entropy

$A = \{\theta_1\}$ | 2.6623 | 2.5180
$A = \{\theta_1, \theta_2\}$ | 3.9303 | 3.7090
$A = \{\theta_1, \ldots, \theta_3\}$ | 4.9082 | 4.6100
$A = \{\theta_1, \ldots, \theta_4\}$ | 5.7878 | 5.4127
$A = \{\theta_1, \ldots, \theta_5\}$ | 6.6256 | 6.1736
$A = \{\theta_1, \ldots, \theta_6\}$ | 7.4441 | 6.9151
$A = \{\theta_1, \ldots, \theta_7\}$ | 8.2532 | 7.6473
$A = \{\theta_1, \ldots, \theta_8\}$ | 9.0578 | 8.3749
$A = \{\theta_1, \ldots, \theta_9\}$ | 9.8600 | 9.1002
$A = \{\theta_1, \ldots, \theta_{10}\}$ | 10.6612 | 9.8244
$A = \{\theta_1, \ldots, \theta_{11}\}$ | 11.4617 | 10.5480
$A = \{\theta_1, \ldots, \theta_{12}\}$ | 12.2620 | 11.2714
$A = \{\theta_1, \ldots, \theta_{13}\}$ | 13.0622 | 11.9946
$A = \{\theta_1, \ldots, \theta_{14}\}$ | 13.8622 | 12.7177

4. An Uncertainty Measure-Based Decision-Making Approach

In this section, a decision-making approach in sensor data fusion is presented. After uncertainty measure with the improved belief entropy, the modified BOEs are fused with Dempster’s rule of combination. Finally, decision-making is based on the fused results.

4.1. Problem Description

In target recognition, decision-making is sometimes based on reports from several sensors. Consider the problem in [78]; the potential objects in the target recognition system are denoted as $A$, $B$, and $C$ in the FOD, denoted as $X = \{A, B, C\}$. Five sensors report on the possible object independently and successively; each report is represented as a BOE, and these BOEs are denoted as $m_1$, $m_2$, $m_3$, $m_4$, and $m_5$, respectively.

Given the incoming sensor data, which object is the target? The target recognition system needs to make a decision based on the fusion results of the sensor data. Intuitively, $A$ should be the recognized target. In the second BOE, however, the mass value assigned to target $A$ is zero and the mass value of target $B$ is 0.9; this is in high conflict with the other BOEs. According to [78, 79], the second BOE may come from a bad sensor, and its report on the target is abnormal. $m_2$ will interfere with the correct fusion result, which may lead to a failure in decision-making. This case can be a real challenge for data fusion methods that cannot handle conflicting evidence effectively [49, 50, 52].

4.2. Uncertainty Measure with the New Belief Entropy

Before further processing of the BOEs reported by the sensors, the quality of this information is quantified with the proposed belief entropy by recalling (12).

Intuitively, the second BOE is in conflict with the other BOEs, and this is successfully indicated by the abnormal value of the improved belief entropy: the improved belief entropy of $m_2$, which is 0.4690, is abnormally low in comparison with that of the other four BOEs.

4.3. Decision-Making Procedures

In real applications, air combat for example, real-time performance is a major concern, so decision-making in target recognition must be finished in real time; accordingly, data fusion needs to be processed instantly as each sensor report arrives. The procedures for decision-making based on the improved belief entropy are designed in Figure 2; six steps are needed as follows.

Step 1. Evidence from sensor report is modeled as the BOE.
An example of this step is given in Section 4.1; each piece of evidence is modeled as a BOE.

Step 2. Uncertainty measure of each BOE with the improved belief entropy.
Generally, the more dispersively the mass is distributed over the power set, the bigger the new belief entropy of the BOE will be. An illustrative example of this step is shown in Section 4.2, where the proposed belief entropy is used to measure the uncertain degree of each BOE in Section 4.1.

Step 3. Relative weight of the incoming BOEs is calculated based on the improved uncertainty measure.
A big entropy value corresponds to a big information volume: it is commonly accepted that the bigger the entropy is, the higher the uncertain degree will be. The relative weight of each BOE is defined as its share of the new belief entropy. For the $i$th of $n$ available BOEs, its relative weight, denoted as $w_i$, is defined as follows:

$$w_i = \frac{E_{Md}(m_i)}{\sum_{j=1}^{n} E_{Md}(m_j)} \tag{18}$$

The relative weight of each BOE in Section 4.1 can then be calculated with (18).
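If the relative weight in (18) is taken as each BOE's share of the total improved belief entropy, the step reduces to a one-line normalization; the entropy values below are illustrative, with only $m_2$'s 0.4690 taken from Section 4.2:

```python
def relative_weights(entropies):
    """w_i = E_i / sum_j E_j: each BOE's weight is its share of the
    total improved belief entropy over all available BOEs."""
    total = sum(entropies)
    return [e / total for e in entropies]

# Illustrative entropies (only m2's 0.4690 is computed in Section 4.2):
ws = relative_weights([1.5, 0.469, 1.5, 1.5, 1.5])
print([round(w, 3) for w in ws])
```

A BOE with an abnormally low entropy, such as $m_2$, receives a small weight, so the conflicting sensor is discounted in the subsequent fusion.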

Step 4. The modified BPAs are derived for sensor data fusion.
For a proposition $A$, the modified BPA based on the $n$ available BOEs is defined as follows:

$$m'(A) = \sum_{i=1}^{n} w_i\, m_i(A) \tag{20}$$

For example, with (20), the BPAs in Section 4.1 are modified with the relative weight of each BOE.
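Assuming the modified BPA in (20) is the weight-averaged mass over all BOEs (a Murphy-style weighted average), the step can be sketched as follows; the two BPAs here are hypothetical, not the Section 4.1 values:

```python
def weighted_average_bpa(bpas, weights):
    """m'(A) = sum_i w_i * m_i(A): weight-averaged mass assignment
    over all BOEs (weights assumed to sum to 1)."""
    modified = {}
    for w, m in zip(weights, bpas):
        for a, v in m.items():
            modified[a] = modified.get(a, 0.0) + w * v
    return modified

# Hypothetical BPAs for illustration:
m1 = {frozenset({'A'}): 0.6, frozenset({'B'}): 0.4}
m2 = {frozenset({'B'}): 0.9, frozenset({'C'}): 0.1}
print(weighted_average_bpa([m1, m2], [0.7, 0.3]))  # masses still sum to 1
```

Because the weights sum to 1, the result is again a valid BPA and can be fed directly to Dempster's rule in the next step.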

Step 5. The weighted BPAs are fused with Dempster’s rule of combination in (5)-(6) by $(n - 1)$ time(s), where $n$ is the number of BOEs.
For example, the modified BPAs in Step 4 are fused with Dempster’s rule of combination four times; the fused results are shown in Table 4.

Step 6. Real-time decision-making based on the fused results.
For example, taking into consideration the target recognition problem described in Section 4.1, with the aforementioned five steps it is now safe to conclude that $A$ is the recognized target.
Based on the aforementioned six steps, the decision-making results corresponding to two, three, four, and five BOEs are shown in Table 4, respectively. It can be concluded that with the proposed method the right decision is made once there are at least three BOEs, because the belief in the correct proposition exceeds 80% in the fusion result with three BOEs. The interference of the wrong report is overcome instantly by the incoming sensor reports. The fusion result with all five sensor reports places a belief of over 98% on the right recognized target $A$.


BOEs | Fusion results | Recognition result

$m_1$, $m_2$ | No proposition has a belief over 60.00% | Uncertain

$m_1$, $m_2$, $m_3$ | $A$, with a belief of 81.48% | $A$

$m_1$, $m_2$, $m_3$, $m_4$ | $A$, with a belief of 94.95% | $A$

$m_1$, $m_2$, $m_3$, $m_4$, $m_5$ | $A$, with a belief of 98.73% | $A$

4.4. Discussion

In order to test the efficiency of the proposed method, the fusion results are compared with those of the other methods in [78–80]. The results of the different combination methods applied to this issue are shown in Table 5. Comparisons of the fusion results of the different fusion methods with two, three, four, and five BOEs are shown in Figures 3, 4, 5, and 6, respectively.


Method

Dempster’s rule [20]

Yager’s rule [76]

Murphy’s rule [77]