Complexity

Volume 2017, Article ID 4359195, 15 pages

https://doi.org/10.1155/2017/4359195

## An Improved Belief Entropy and Its Application in Decision-Making

School of Electronics and Information, Northwestern Polytechnical University, Xi’an, Shaanxi 710072, China

Correspondence should be addressed to Yongchuan Tang; tangyongchuan@mail.nwpu.edu.cn and Wen Jiang; jiangwen@nwpu.edu.cn

Received 22 December 2016; Accepted 23 January 2017; Published 16 March 2017

Academic Editor: Jurgita Antucheviciene

Copyright © 2017 Deyun Zhou et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Uncertainty measurement in data fusion applications is a hot topic, and quite a few methods have been proposed to measure the degree of uncertainty in the Dempster-Shafer framework. However, the existing methods pay little attention to the scale of the frame of discernment (FOD), which means a loss of information; as a result, they cannot measure the difference in uncertainty degree among different FODs. In this paper, an improved belief entropy is proposed in the Dempster-Shafer framework. The proposed belief entropy takes into consideration more of the available information in the body of evidence (BOE), including the uncertain information modeled by the mass function, the cardinality of each proposition, and the scale of the FOD. The improved belief entropy is a new method for uncertainty measurement in the Dempster-Shafer framework. Based on the new belief entropy, a decision-making approach is designed. The validity of the new belief entropy is verified with numerical examples and the proposed decision-making approach.

#### 1. Introduction

Decision-making in uncertain environments is common in real-world applications, such as civil engineering [1–3], supply chain management [4, 5], risk analysis [6, 7], medical service [8, 9], and so on [10–12]. In order to maximize comprehensive benefits, many decision-making methods have been developed based on probability theory [13, 14], fuzzy set theory [15–18], Dempster-Shafer evidence theory [19–22], rough set theory [23–25], and so on [26–29].

Dempster-Shafer evidence theory [19, 20] is effective in uncertain information processing. It provides the frame of discernment (FOD) and the basic probability assignment (BPA) for information modeling, as well as Dempster’s rule of combination for data fusion. Dempster-Shafer evidence theory has been extensively studied in many fields such as pattern recognition [30–34], fault diagnosis [35–38], multiple attribute decision-making [39–41], risk analysis [22, 35, 42, 43], controller design [44, 45], and so on [46–48]. Some open issues in Dempster-Shafer evidence theory still need further study, including conflict management [49–52], the independence of different pieces of evidence [53–55], methods for generating BPAs [56–58], and the incompleteness of the FOD [59–61]. One way to address these open issues is to quantify the degree of uncertainty of the information before further processing.

In the probabilistic framework, Shannon entropy [62] is a well-known measure of uncertainty and has attracted much attention in real applications [63–65]. However, Shannon entropy cannot be used directly in the framework of Dempster-Shafer evidence theory, because a mass function in evidence theory is a generalized probability assigned on the power set of the FOD. To overcome this limitation, many methods have been proposed to measure the uncertainty degree of evidence in the Dempster-Shafer framework, such as Höhle’s confusion measure [66], Yager’s dissonance measure [67], Dubois and Prade’s weighted Hartley entropy [68], Klir and Ramer’s discord measure [69], Klir and Parviz’s strife measure [70], George and Pal’s total conflict measure [71], and so on [72–74]. Some of these methods are derived from Shannon entropy [66–70], but the aforementioned methods are not effective in some cases [71, 75].

Recently, another uncertainty measure named Deng entropy [75] was proposed in the Dempster-Shafer framework. Although Deng entropy has been successfully applied in some real applications [36–38, 80, 81], it does not take into consideration the scale of the FOD, which means a loss of available information during information processing. To make matters worse, this information loss leads to failures of uncertainty measurement in some cases [71]. The same shortcoming also exists in the confusion measure [66], the dissonance measure [67], the weighted Hartley entropy [68], the discord measure [69], and the strife measure [70]. To address this issue, an improved belief entropy based on Deng entropy is proposed in this paper. The proposed belief entropy improves the performance of Deng entropy by considering the scale of the FOD and the relative scale of a focal element with respect to the FOD. Moreover, the proposed method keeps all the merits of Deng entropy; thus it degenerates to Shannon entropy under probability consistency.

In order to verify the validity of the improved belief entropy, a decision-making approach for target identification is designed based on the new belief entropy. In the proposed method, the uncertainty degree of the sensor data is measured by the new belief entropy; the uncertainty degree is then used as the relative weight of each sensor report, modeled as a body of evidence (BOE); after that, the BPAs in each BOE are modified by the weight value; finally, decision-making is based on the result of applying Dempster’s rule of combination to the modified BPAs.
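This workflow can be sketched in code. The snippet below is only an illustration, not the paper's exact algorithm: Deng entropy stands in for the improved belief entropy (which is defined later, in Section 3), mass functions are represented as Python dicts from `frozenset` to mass value, and the entropy-proportional weighting scheme is an assumption made here purely for demonstration.

```python
from math import log2

def deng_entropy(m):
    # Deng entropy [75]: Ed(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )
    return -sum(v * log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)

def weighted_average_bpa(bpas, weights):
    """Weighted average of several mass functions (weights should sum to 1)."""
    avg = {}
    for m, w in zip(bpas, weights):
        for A, v in m.items():
            avg[A] = avg.get(A, 0.0) + w * v
    return avg

# Two hypothetical sensor reports over the FOD {a, b}
m1 = {frozenset({"a"}): 0.9, frozenset({"a", "b"}): 0.1}
m2 = {frozenset({"a"}): 0.4, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.3}

# Measure uncertainty of each BOE, turn it into normalized weights,
# then form a weighted average BPA; the averaged BPA would finally be
# fused with Dempster's rule of combination (omitted here).
entropies = [deng_entropy(m) for m in (m1, m2)]
total = sum(entropies)
weights = [e / total for e in entropies]  # illustrative weighting scheme
fused_input = weighted_average_bpa((m1, m2), weights)
```

The modified BPA remains a valid mass function (its masses still sum to 1), so Dempster's rule can be applied to it directly.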

The rest of this paper is organized as follows. In Section 2, the preliminaries on Dempster-Shafer evidence theory, Shannon entropy, and some uncertainty measures in Dempster-Shafer framework are briefly introduced. In Section 3, the improved belief entropy is proposed; some behaviors of the proposed belief entropy are discussed with the numerical examples. In Section 4, an uncertainty measure-based decision-making approach is proposed to show the efficiency of the new belief entropy. The conclusions are given in Section 5.

#### 2. Preliminaries

In this section, some preliminaries are briefly introduced, including Dempster-Shafer evidence theory, Shannon entropy, and some typical uncertainty measures in Dempster-Shafer framework.

##### 2.1. Dempster-Shafer Evidence Theory

Let $\Theta = \{\theta_1, \theta_2, \ldots, \theta_N\}$ be a finite nonempty set of mutually exclusive and exhaustive events; $\Theta$ is called the *frame of discernment* (FOD). The power set of $\Theta$, denoted as $2^\Theta$, is composed of $2^N$ elements denoted as follows:

$$2^\Theta = \bigl\{\emptyset, \{\theta_1\}, \{\theta_2\}, \ldots, \{\theta_N\}, \{\theta_1, \theta_2\}, \ldots, \{\theta_1, \theta_2, \ldots, \theta_i\}, \ldots, \Theta\bigr\}.$$

A *mass function* $m$ is defined as a mapping from the power set $2^\Theta$ to the interval $[0, 1]$, which satisfies the following conditions [19, 20]:

$$m(\emptyset) = 0, \qquad \sum_{A \in 2^\Theta} m(A) = 1.$$

If $m(A) > 0$, then $A$ is called a *focal element*, and the mass value $m(A)$ represents how strongly the evidence supports proposition $A$.

A *body of evidence* (BOE), also known as a *basic probability assignment* (BPA) or *basic belief assignment* (BBA), is represented by the focal sets and their associated mass values:

$$(\mathfrak{F}, m) = \bigl\{\langle A, m(A)\rangle : A \in 2^\Theta,\ m(A) > 0\bigr\},$$

where $\mathfrak{F}$ is a subset of the power set $2^\Theta$ and each $A \in \mathfrak{F}$ has an associated nonzero mass value $m(A)$.
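As a concrete illustration (not part of the original paper), a BOE can be represented in Python as a dict mapping each focal element (a `frozenset`) to its mass value, and the two mass-function conditions above can be checked directly:

```python
def is_valid_mass_function(m, tol=1e-9):
    """Check m(empty set) = 0, nonnegativity, and that all masses sum to 1."""
    if m.get(frozenset(), 0.0) != 0.0:
        return False
    return all(v >= 0 for v in m.values()) and abs(sum(m.values()) - 1.0) < tol

# A BPA over the FOD {a, b, c} with two focal elements
m = {frozenset({"a"}): 0.6, frozenset({"b", "c"}): 0.4}
print(is_valid_mass_function(m))  # True
```

Note that mass may be assigned to a multielement subset such as $\{b, c\}$ without being divided among its members; this is exactly what distinguishes a BPA from a probability distribution.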

A BPA $m$ can also be represented by its associated belief function $Bel$ and plausibility function $Pl$, respectively, defined as follows:

$$Bel(A) = \sum_{B \subseteq A} m(B), \qquad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B).$$
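With the dict representation of a BPA used above (a sketch, not the paper's code), the two functions translate directly into set operations: $Bel(A)$ sums mass over subsets of $A$, and $Pl(A)$ over sets intersecting $A$:

```python
def bel(m, A):
    # Bel(A) = sum of m(B) over all focal elements B that are subsets of A
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    # Pl(A) = sum of m(B) over all focal elements B with B intersecting A
    return sum(v for B, v in m.items() if B & A)

m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.25, frozenset({"b", "c"}): 0.25}
A = frozenset({"a"})
print(bel(m, A), pl(m, A))  # 0.5 0.75
```

For any proposition $A$, $Bel(A) \le Pl(A)$ always holds, and the interval $[Bel(A), Pl(A)]$ bounds the (unknown) probability of $A$.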

In Dempster-Shafer evidence theory, two independent mass functions, denoted as $m_1$ and $m_2$, can be combined with Dempster’s rule of combination, defined as follows [19, 20]:

$$m(A) = \frac{1}{1 - K} \sum_{B \cap C = A} m_1(B)\, m_2(C), \quad A \neq \emptyset,$$

where $K$ is a normalization constant representing the *degree of conflict* between $m_1$ and $m_2$, defined as follows [19, 20]:

$$K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C).$$
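A minimal implementation of Dempster's rule, continuing the dict-based BPA sketch (the example BPAs are hypothetical):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two independent mass functions with Dempster's rule."""
    combined = {}
    conflict = 0.0  # K: total product mass falling on the empty intersection
    for (B, v1), (C, v2) in product(m1.items(), m2.items()):
        inter = B & C
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; combination undefined")
    # Normalize by 1 - K so the combined masses again sum to 1
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

m1 = {frozenset({"a"}): 0.8, frozenset({"a", "b"}): 0.2}
m2 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}
fused = dempster_combine(m1, m2)
print(round(fused[frozenset({"a"})], 4))  # 0.8824
```

Here $K = 0.8 \times 0.4 = 0.32$, so the unnormalized mass $0.6$ on $\{a\}$ becomes $0.6 / 0.68 \approx 0.8824$ after normalization.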

##### 2.2. Shannon Entropy

As an uncertainty measure of information volume in a system or process, Shannon entropy plays a central role in information theory. Shannon entropy indicates that the information volume of each piece of information is directly connected to its uncertainty degree.

Shannon entropy $H$, as the information entropy, is defined as follows [62]:

$$H = -\sum_{i=1}^{N} p_i \log_b p_i,$$

where $N$ is the number of basic states, $p_i$ is the probability of state $i$ and satisfies $\sum_{i=1}^{N} p_i = 1$, and $b$ is the base of the logarithm, which accounts for the scaling of $H$. Although $b$ is arbitrary, it is usually chosen to be 2, and the unit of information entropy is then the bit. If $b$ is the natural base $e$, the unit of information entropy is the nat. Mathematically, as a scaling factor, different bases of the logarithm are interconvertible.
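As a quick sanity check of the definition (an illustration, with $b = 2$), a uniform distribution over $2^k$ states yields exactly $k$ bits of entropy:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum p_i * log2(p_i), in bits; zero-probability states contribute 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0  (a fair coin carries one bit)
print(shannon_entropy([0.25] * 4))   # 2.0  (four equally likely states)
print(shannon_entropy([1.0]))        # 0.0  (a certain outcome carries no information)
```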

##### 2.3. Uncertainty Measures in Dempster-Shafer Framework

In the Dempster-Shafer framework, some uncertainty measures for the BOE are presented, as shown in Table 1, where $X$ is the FOD, $A$ and $B$ are focal elements of a mass function, and $|A|$ denotes the cardinality of $A$.
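Among the measures in Table 1, Dubois and Prade's weighted Hartley entropy [68] has the well-known form $E(m) = \sum_{A} m(A) \log_2 |A|$, which captures only the nonspecificity of the evidence. A minimal sketch, reusing the dict-based BPA representation introduced above (the example masses are hypothetical):

```python
from math import log2

def weighted_hartley_entropy(m):
    # E(m) = sum over focal elements A of m(A) * log2(|A|)
    return sum(v * log2(len(A)) for A, v in m.items())

m = {frozenset({"a"}): 0.4, frozenset({"a", "b", "c", "d"}): 0.6}
print(weighted_hartley_entropy(m))  # 0.4 * log2(1) + 0.6 * log2(4) = 1.2
```

Note that singleton focal elements contribute nothing ($\log_2 1 = 0$), which illustrates why such measures alone cannot capture the discord component of uncertainty.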