Mathematical Problems in Engineering
Volume 2015 (2015), Article ID 563745, 13 pages
http://dx.doi.org/10.1155/2015/563745
Research Article

Entropy Measures for Interval-Valued Intuitionistic Fuzzy Sets and Their Application in Group Decision-Making

1College of Mathematical Sciences, Yangzhou University, Yangzhou 225002, China
2Institute of Operations Research, Qufu Normal University, Shandong, Rizhao 276826, China

Received 8 July 2014; Revised 3 December 2014; Accepted 16 December 2014

Academic Editor: Ricardo Femat

Copyright © 2015 Cuiping Wei and Yuzhong Zhang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Entropy measures are an important topic in fuzzy set theory and have been investigated by many researchers from different points of view. In this paper, two new entropy measures based on the cosine function are proposed for intuitionistic fuzzy sets and interval-valued intuitionistic fuzzy sets. According to the features of the cosine function, the general forms of these two kinds of entropy measures are presented. Compared with the existing ones, the proposed entropy measures overcome some shortcomings and can be used to measure both the fuzziness and the intuitionism of these two kinds of fuzzy sets, so that their uncertain information can be described more adequately. These entropy measures have been applied to assess the experts' weights and to solve multicriteria fuzzy group decision-making problems.

1. Introduction

Since Zadeh introduced fuzzy set theory [1], some generalized forms have been proposed and studied to treat imprecision and uncertainty [2–10]. Atanassov proposed the notions of intuitionistic fuzzy sets (IFSs) [2] and interval-valued intuitionistic fuzzy sets (IVIFSs) [3]; Gau and Buehrer [4] introduced the notion of vague sets. Some authors [5–7] pointed out that IFS theory and vague set theory are equivalent to the interval-valued fuzzy set (IVFS) theory proposed by Zadeh [11].

As two important topics in fuzzy set (FS) theory, entropy measures and similarity measures of fuzzy sets have been investigated widely by many researchers from different points of view. The entropy of a fuzzy set describes its degree of fuzziness. De Luca and Termini [12] introduced some axioms capturing people's intuitive comprehension of the fuzziness degree of a fuzzy set. Kaufmann [13] proposed a method for measuring the fuzziness degree of a fuzzy set by a metric distance between its membership function and the membership function of its nearest crisp set. Yager [14] suggested that the entropy measure be expressed by the distance between a fuzzy set and its complement. Chiu and Wang [15] gave a simple method for calculating the entropies of fuzzy numbers under addition and the extension principle. They [16] also investigated the entropy relationship between an original fuzzy set and its image fuzzy set. Hong and Kim [17] introduced a simple method for calculating the entropy of the image fuzzy set without calculating its membership function. Zeng and Li [18] showed that similarity measures and entropies of fuzzy sets can be transformed into each other based on their axiomatic definitions.

Focusing on these important numerical indexes in fuzzy set theory, some researchers extended these concepts to IVFS theory and IFS theory and investigated related topics from different points of view [19–24]. We review some generalization studies on entropy measures. Burillo and Bustince [25] introduced the notions of entropy on IVFSs and IFSs to measure the degree of intuitionism of an IVFS or an IFS. Szmidt and Kacprzyk [26] proposed a nonprobabilistic-type entropy measure with a geometric interpretation of IFSs. Hung and Yang [27] gave axiomatic definitions of entropies of IFSs and IVIFSs by exploiting the concept of probability. Farhadinia [20] generalized some results on the entropy of IVFSs based on the intuitionistic distance and its relationship with the similarity measure. After that, many authors also proposed different entropy formulas for IFSs [27–31], IVFSs [32, 33], and vague sets [34]. For IVIFSs, Liu et al. [35] proposed a set of axiomatic requirements for entropy measures, which extended Szmidt and Kacprzyk's axioms formulated for the entropy of IFSs [26]. Wei et al. [36] extended the entropy measure of IFSs proposed in [26] to IVIFSs and gave an approach to construct similarity measures from entropy measures of IVIFSs. By this approach, the proposed entropy measure can yield a similarity measure of IVIFSs, which has been applied in pattern recognition, multiple-criteria fuzzy decision-making, and medical diagnosis. For entropy measures of IVIFSs, we refer to [37–39].

In [31], Vlachos and Sergiadis revealed an intuitive and mathematical connection between the notions of entropy for FSs and IFSs in terms of fuzziness and intuitionism. They pointed out that entropy for FSs is indeed a measure of fuzziness, while for IFSs, entropy can measure both fuzziness and intuitionism. Recall that the fuzziness is dominated by the difference between the membership degree and the nonmembership degree, while the intuitionism is dominated by the hesitation degree. Hence it is very interesting to construct entropy formulas measuring both fuzziness and intuitionism. We propose an entropy measure, as well as its general form, for IFSs and then generalize it to IVIFSs. Our entropy measures are compared with some existing ones in [29–31, 37]. As an application in multicriteria fuzzy group decision-making, we propose a method to assess the experts' weights by the proposed entropy measures.

The rest of the paper is organized as follows. Section 2 reviews some necessary concepts of IFSs and IVIFSs. In Section 3, we propose a new entropy measure and its general form for IFSs. We then compare the proposed entropy measure with some existing ones and give conditions under which these existing entropy measures may not work as desired, while the proposed entropy measure performs well. In Section 4, we extend the entropy measures defined in Section 3 to IVIFSs and propose entropy measures for IVIFSs. These entropy measures are compared with the ones defined by Ye in [37]. Section 5 gives the application of the proposed entropy measures in assessing the weights of experts. Concluding remarks are drawn in Section 6.

2. Preliminaries

Some basic concepts of intuitionistic fuzzy sets and interval-valued intuitionistic fuzzy sets are reviewed from Atanassov [2], Atanassov and Gargov [3], and Xu [40].

Definition 1 (see [2]). Let $X$ be a universe of discourse. An intuitionistic fuzzy set $A$ in $X$ is an object having the form $A=\{\langle x,\mu_A(x),\nu_A(x)\rangle \mid x\in X\}$, where $\mu_A: X\to[0,1]$ and $\nu_A: X\to[0,1]$, with the condition $0\le\mu_A(x)+\nu_A(x)\le 1$ for each $x\in X$. The numbers $\mu_A(x)$ and $\nu_A(x)$ denote the degree of membership and nonmembership of $x$ to $A$, respectively.

For convenience of notation, we abbreviate "intuitionistic fuzzy set" to IFS and denote by $\mathrm{IFS}(X)$ the set of all IFSs in $X$.

For each IFS $A$ in $X$, we call $\pi_A(x)=1-\mu_A(x)-\nu_A(x)$ the intuitionistic index of $x$ in $A$, which denotes the hesitancy degree of $x$ to $A$. The complementary set of $A$ is defined as $A^c=\{\langle x,\nu_A(x),\mu_A(x)\rangle \mid x\in X\}$.
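To make the notions above concrete, here is a minimal Python sketch (not from the paper) of an intuitionistic fuzzy value on a single element; the class name `IFV` and its validity check are illustrative assumptions.

```python
# A minimal sketch (not from the paper) of an intuitionistic fuzzy value
# (mu, nu) on a single element, with the intuitionistic index
# pi = 1 - mu - nu and the complement obtained by swapping mu and nu.
from dataclasses import dataclass

@dataclass(frozen=True)
class IFV:
    mu: float  # membership degree
    nu: float  # nonmembership degree

    def __post_init__(self):
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0
                and self.mu + self.nu <= 1.0 + 1e-12):
            raise ValueError("mu, nu must lie in [0, 1] with mu + nu <= 1")

    @property
    def pi(self) -> float:
        """Intuitionistic index (hesitancy degree) of the element."""
        return 1.0 - self.mu - self.nu

    def complement(self) -> "IFV":
        return IFV(self.nu, self.mu)

a = IFV(0.5, 0.3)
print(a.pi)           # hesitancy degree, approximately 0.2
print(a.complement()) # IFV(mu=0.3, nu=0.5)
```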

Definition 2 (see [2]). For two IFSs $A$ and $B$, their relations are defined as follows: (1) $A\subseteq B$ if and only if $\mu_A(x)\le\mu_B(x)$, $\nu_A(x)\ge\nu_B(x)$, for each $x\in X$; (2) $A=B$ if and only if $A\subseteq B$ and $B\subseteq A$.

Consider that, sometimes, it is not appropriate to assume that the membership degrees for certain elements of an IFS are exactly defined, although a value range can be given. For such cases, Atanassov and Gargov [3] introduced the following notion of interval-valued intuitionistic fuzzy sets.

Definition 3 (see [3]). Let $X$ be a universe of discourse and let $\mathrm{Int}([0,1])$ denote the set of all closed subintervals of the interval $[0,1]$. An interval-valued intuitionistic fuzzy set $\tilde{A}$ in $X$ is an object having the form $\tilde{A}=\{\langle x,\tilde{\mu}_{\tilde{A}}(x),\tilde{\nu}_{\tilde{A}}(x)\rangle \mid x\in X\}$, where $\tilde{\mu}_{\tilde{A}}:X\to \mathrm{Int}([0,1])$ and $\tilde{\nu}_{\tilde{A}}:X\to \mathrm{Int}([0,1])$, with the condition $\sup\tilde{\mu}_{\tilde{A}}(x)+\sup\tilde{\nu}_{\tilde{A}}(x)\le 1$ for each $x\in X$. The intervals $\tilde{\mu}_{\tilde{A}}(x)$ and $\tilde{\nu}_{\tilde{A}}(x)$ denote the degree of membership and nonmembership of $x$ to $\tilde{A}$, respectively.

For convenience, let $\tilde{\mu}_{\tilde{A}}(x)=[a(x),b(x)]$ and $\tilde{\nu}_{\tilde{A}}(x)=[c(x),d(x)]$, so $\tilde{A}=\{\langle x,[a(x),b(x)],[c(x),d(x)]\rangle \mid x\in X\}$ with the condition $b(x)+d(x)\le 1$.

We abbreviate "interval-valued intuitionistic fuzzy set" to IVIFS and denote by $\mathrm{IVIFS}(X)$ the set of all IVIFSs in $X$.

We call the interval $\tilde{\pi}_{\tilde{A}}(x)=[1-b(x)-d(x),\,1-a(x)-c(x)]$, abbreviated by $[\pi^-(x),\pi^+(x)]$, the interval-valued intuitionistic index of $x$ in $\tilde{A}$, which is a hesitancy degree of $x$ to $\tilde{A}$.

Clearly, if $a(x)=b(x)$ and $c(x)=d(x)$, then the given IVIFS is reduced to an ordinary IFS.

Definition 4 (see [25]). Let $\mathrm{Int}([0,1])$ denote the set of all closed subintervals of the interval $[0,1]$. For $\tilde{a}=[a^-,a^+],\tilde{b}=[b^-,b^+]\in \mathrm{Int}([0,1])$, we define $\tilde{a}\le\tilde{b}$ if and only if $a^-\le b^-$, $a^+\le b^+$; $\tilde{a}\ge\tilde{b}$ if and only if $a^-\ge b^-$, $a^+\ge b^+$; $\tilde{a}=\tilde{b}$ if and only if $a^-=b^-$, $a^+=b^+$.

Definition 5 (see [3]). For two IVIFSs $\tilde{A}$ and $\tilde{B}$, their relations and operations are defined as follows: (1) $\tilde{A}\subseteq\tilde{B}$ if and only if $\tilde{\mu}_{\tilde{A}}(x)\le\tilde{\mu}_{\tilde{B}}(x)$, $\tilde{\nu}_{\tilde{A}}(x)\ge\tilde{\nu}_{\tilde{B}}(x)$, for each $x\in X$; (2) $\tilde{A}=\tilde{B}$ if and only if $\tilde{A}\subseteq\tilde{B}$ and $\tilde{B}\subseteq\tilde{A}$; (3) $\tilde{A}^c=\{\langle x,\tilde{\nu}_{\tilde{A}}(x),\tilde{\mu}_{\tilde{A}}(x)\rangle \mid x\in X\}$.

In [40], $\tilde{\alpha}=([a,b],[c,d])$ is called an interval-valued intuitionistic fuzzy value (IVIFV), where $[a,b]\subseteq[0,1]$, $[c,d]\subseteq[0,1]$, and $b+d\le 1$. Let $\Omega$ be the universal set of IVIFVs.

Based on two functions, the score function and the accuracy function, Xu [40] provided a method to compare two IVIFVs.

Definition 6 (see [40]). Let $\tilde{\alpha}_1=([a_1,b_1],[c_1,d_1])$ and $\tilde{\alpha}_2=([a_2,b_2],[c_2,d_2])$ be two IVIFVs. Let $s(\tilde{\alpha}_i)=\frac{1}{2}(a_i-c_i+b_i-d_i)$, $i=1,2$, be the score degrees of $\tilde{\alpha}_1$ and $\tilde{\alpha}_2$, respectively; let $h(\tilde{\alpha}_i)=\frac{1}{2}(a_i+b_i+c_i+d_i)$, $i=1,2$, be the accuracy degrees of $\tilde{\alpha}_1$ and $\tilde{\alpha}_2$, respectively. Then we have the following. (1) If $s(\tilde{\alpha}_1)<s(\tilde{\alpha}_2)$, then $\tilde{\alpha}_1$ is smaller than $\tilde{\alpha}_2$, denoted by $\tilde{\alpha}_1<\tilde{\alpha}_2$. (2) If $s(\tilde{\alpha}_1)=s(\tilde{\alpha}_2)$, then the following hold: (a) if $h(\tilde{\alpha}_1)=h(\tilde{\alpha}_2)$, then $\tilde{\alpha}_1$ is indifferent to $\tilde{\alpha}_2$, denoted by $\tilde{\alpha}_1\sim\tilde{\alpha}_2$; (b) if $h(\tilde{\alpha}_1)<h(\tilde{\alpha}_2)$, then $\tilde{\alpha}_1$ is smaller than $\tilde{\alpha}_2$, denoted by $\tilde{\alpha}_1<\tilde{\alpha}_2$; (c) if $h(\tilde{\alpha}_1)>h(\tilde{\alpha}_2)$, then $\tilde{\alpha}_1$ is bigger than $\tilde{\alpha}_2$, denoted by $\tilde{\alpha}_1>\tilde{\alpha}_2$.
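The comparison rule of Definition 6 can be sketched in Python as follows. The score $s=(a-c+b-d)/2$ and accuracy $h=(a+b+c+d)/2$ are the forms commonly attributed to Xu [40]; the tolerance-based tie test is an implementation choice, not part of the definition.

```python
# A sketch of the IVIFV comparison method of Definition 6.  An IVIFV
# alpha = ([a, b], [c, d]) is compared first by score, then by accuracy.
import math

def score(alpha):
    (a, b), (c, d) = alpha
    return (a - c + b - d) / 2.0

def accuracy(alpha):
    (a, b), (c, d) = alpha
    return (a + b + c + d) / 2.0

def compare(alpha, beta, eps=1e-9):
    """Return -1 if alpha < beta, 0 if indifferent, 1 if alpha > beta."""
    sa, sb = score(alpha), score(beta)
    if not math.isclose(sa, sb, abs_tol=eps):
        return -1 if sa < sb else 1
    ha, hb = accuracy(alpha), accuracy(beta)
    if math.isclose(ha, hb, abs_tol=eps):
        return 0
    return -1 if ha < hb else 1

a1 = ([0.4, 0.5], [0.2, 0.3])   # score 0.2, accuracy 0.7
a2 = ([0.2, 0.5], [0.0, 0.3])   # score 0.2, accuracy 0.5
print(compare(a1, a2))          # equal scores, so accuracy decides: 1
```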

Definition 7 (see [40]). Let $\tilde{\alpha}_1=([a_1,b_1],[c_1,d_1])$ and $\tilde{\alpha}_2=([a_2,b_2],[c_2,d_2])$ be two IVIFVs. Then three operational laws of IVIFVs are given as follows: (1) $\tilde{\alpha}_1\oplus\tilde{\alpha}_2=([a_1+a_2-a_1a_2,\,b_1+b_2-b_1b_2],[c_1c_2,\,d_1d_2])$; (2) $\lambda\tilde{\alpha}_1=([1-(1-a_1)^\lambda,\,1-(1-b_1)^\lambda],[c_1^\lambda,\,d_1^\lambda])$, $\lambda>0$; (3) $\tilde{\alpha}_1\otimes\tilde{\alpha}_2=([a_1a_2,\,b_1b_2],[c_1+c_2-c_1c_2,\,d_1+d_2-d_1d_2])$.

As research on IVIFS theory deepens and its range of applications expands, it becomes increasingly important to aggregate interval-valued intuitionistic fuzzy information effectively. Xu [40] proposed the interval-valued intuitionistic fuzzy weighted averaging operator to aggregate such information.

Definition 8 (see [40]). Let $\tilde{\alpha}_j=([a_j,b_j],[c_j,d_j])$ $(j=1,2,\dots,n)$ be a collection of IVIFVs. An interval-valued intuitionistic fuzzy weighted averaging (IVIFWA) operator is a mapping $\mathrm{IVIFWA}:\Omega^n\to\Omega$ (where $\Omega$ denotes the set of all IVIFVs), such that
$$\mathrm{IVIFWA}_w(\tilde{\alpha}_1,\dots,\tilde{\alpha}_n)=\Bigl(\Bigl[1-\prod_{j=1}^{n}(1-a_j)^{w_j},\,1-\prod_{j=1}^{n}(1-b_j)^{w_j}\Bigr],\Bigl[\prod_{j=1}^{n}c_j^{w_j},\,\prod_{j=1}^{n}d_j^{w_j}\Bigr]\Bigr),$$
where $w=(w_1,w_2,\dots,w_n)^T$ is the weighting vector of $\tilde{\alpha}_j$ $(j=1,2,\dots,n)$ with $w_j\ge 0$ and $\sum_{j=1}^{n}w_j=1$.

In Definition 8, if $a_j=b_j$ and $c_j=d_j$ for each $j$, then the IVIFVs reduce to IFVs and the IVIFWA operator reduces to the IFWA operator.
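The aggregation of Definition 8 can be sketched as follows; since the displayed equation is elided above, the closed form used here is the one commonly attributed to Xu [40] and should be treated as an assumption.

```python
# A hedged sketch of the IVIFWA operator of Definition 8:
# IVIFWA_w(alpha_1..alpha_n)
#   = ([1 - prod (1-a_j)^w_j, 1 - prod (1-b_j)^w_j],
#      [prod c_j^w_j, prod d_j^w_j]).
from math import prod

def ivifwa(values, weights):
    """values: list of ([a, b], [c, d]) IVIFVs; weights: nonnegative, sum 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    a = 1.0 - prod((1.0 - v[0][0]) ** w for v, w in zip(values, weights))
    b = 1.0 - prod((1.0 - v[0][1]) ** w for v, w in zip(values, weights))
    c = prod(v[1][0] ** w for v, w in zip(values, weights))
    d = prod(v[1][1] ** w for v, w in zip(values, weights))
    return ([a, b], [c, d])

# Aggregating two identical IVIFVs returns that IVIFV (idempotency).
v = ([0.4, 0.5], [0.2, 0.3])
agg = ivifwa([v, v], [0.5, 0.5])
print(agg)  # approximately ([0.4, 0.5], [0.2, 0.3])
```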

In many practical problems, such as multicriteria decision-making, the objects of study are finite. So in the rest of the paper, we assume that the universe $X$ is a finite set, listed as $X=\{x_1,x_2,\dots,x_n\}$.

3. Entropy Measures for IFSs

In this section we will propose a concrete entropy measure for IFSs and demonstrate its efficiency through comparisons with some existing entropy measures in [28–31].

3.1. A New Entropy Measure for IFSs

Szmidt and Kacprzyk [26] extended the axioms of de Luca and Termini [12] to propose the following definition of an entropy measure for IFSs.

Definition 9 (see [26]). A real-valued function $E:\mathrm{IFS}(X)\to[0,1]$ is called an entropy measure for IFSs if it satisfies the following axiomatic requirements: (E1) $E(A)=0$ if and only if $A$ is a crisp set; that is, $\mu_A(x)=0$, $\nu_A(x)=1$ or $\mu_A(x)=1$, $\nu_A(x)=0$ for any $x\in X$; (E2) $E(A)=1$ if and only if $\mu_A(x)=\nu_A(x)$ for all $x\in X$; (E3) $E(A)=E(A^c)$; (E4) $E(A)\le E(B)$ if $\mu_A(x)\le\mu_B(x)$ and $\nu_A(x)\ge\nu_B(x)$ for $\mu_B(x)\le\nu_B(x)$, or $\mu_A(x)\ge\mu_B(x)$ and $\nu_A(x)\le\nu_B(x)$ for $\mu_B(x)\ge\nu_B(x)$, for any $x\in X$.

In this subsection, we introduce a new entropy measure for IFSs. For each $A\in\mathrm{IFS}(X)$, define $E$ by
$$E(A)=\frac{1}{n}\sum_{i=1}^{n}\cos\frac{|\mu_A(x_i)-\nu_A(x_i)|\,\pi}{2\bigl(1+\pi_A(x_i)\bigr)}. \quad (11)$$
Then we have the following theorem.

Theorem 10. The mapping $E$, defined by (11), is an entropy measure for IFSs.

Proof. It is sufficient to show that the mapping $E$, defined by (11), satisfies the conditions (E1)–(E4) in Definition 9.
Let $t_i=|\mu_A(x_i)-\nu_A(x_i)|/(1+\pi_A(x_i))$ for $i=1,2,\dots,n$. From $0\le|\mu_A(x_i)-\nu_A(x_i)|\le 1$, $0\le\pi_A(x_i)\le 1$, and the monotonicity of the cosine on $[0,\pi/2]$, we have $0\le\cos(t_i\pi/2)\le 1$. Thus $0\le E(A)\le 1$.
(E1) Let $A$ be a crisp set; that is, $\mu_A(x_i)=1$, $\nu_A(x_i)=0$, or $\mu_A(x_i)=0$, $\nu_A(x_i)=1$, for any $x_i\in X$. No matter in which case, we have $t_i=1$ and hence $\cos(t_i\pi/2)=0$. Hence $E(A)=0$.
Conversely, suppose now that $E(A)=0$. Since $\cos(t_i\pi/2)\ge 0$, it follows that $t_i=1$ for each $i$; that is, $|\mu_A(x_i)-\nu_A(x_i)|=1+\pi_A(x_i)$. Also since $|\mu_A(x_i)-\nu_A(x_i)|\le 1\le 1+\pi_A(x_i)$, we have $|\mu_A(x_i)-\nu_A(x_i)|=1$ and $\pi_A(x_i)=0$. Thus we can obtain $\mu_A(x_i)=1$, $\nu_A(x_i)=0$, or $\mu_A(x_i)=0$, $\nu_A(x_i)=1$. So $A$ is a crisp set.
(E2) Let $\mu_A(x_i)=\nu_A(x_i)$ for each $x_i$. Applying this condition to (11), we easily obtain $E(A)=1$.
Conversely, we suppose that $E(A)=1$. From (11) and $0\le\cos(t_i\pi/2)\le 1$, we obtain that $\cos(t_i\pi/2)=1$ for each $i$. Also from $0\le t_i\le 1$, we have $\mu_A(x_i)=\nu_A(x_i)$ for each $x_i$.
(E3) For $A^c=\{\langle x,\nu_A(x),\mu_A(x)\rangle \mid x\in X\}$, both $|\mu_A(x_i)-\nu_A(x_i)|$ and $\pi_A(x_i)$ are unchanged, so we can easily get that $E(A)=E(A^c)$.
(E4) Suppose that $\mu_B(x_i)\le\nu_B(x_i)$ and $A\subseteq B$ for $x_i\in X$. Since $\mu_A(x_i)\le\mu_B(x_i)$ and $\nu_A(x_i)\ge\nu_B(x_i)$, we have $\mu_A(x_i)\le\mu_B(x_i)\le\nu_B(x_i)\le\nu_A(x_i)$. For $\nu\ge\mu$, write $g(\mu,\nu)=(\nu-\mu)/(2-\mu-\nu)$, so that $t_i=g(\mu,\nu)$ in this case. It follows that $\partial g/\partial\mu=2(\nu-1)/(2-\mu-\nu)^2\le 0$ and $\partial g/\partial\nu=2(1-\mu)/(2-\mu-\nu)^2\ge 0$. Thus $g(\mu_A(x_i),\nu_A(x_i))\ge g(\mu_B(x_i),\nu_B(x_i))$, which implies that $\cos(g(\mu_A(x_i),\nu_A(x_i))\pi/2)\le\cos(g(\mu_B(x_i),\nu_B(x_i))\pi/2)$. Thus $E(A)\le E(B)$.
Similarly, when $\mu_B(x_i)\ge\nu_B(x_i)$ and $\mu_A(x_i)\ge\mu_B(x_i)$, $\nu_A(x_i)\le\nu_B(x_i)$ for $x_i\in X$, we can also prove that $E(A)\le E(B)$. Hence we have (E4).

Analyzing the features of the cosine function, we give the following general form of the entropy measure defined in (11); this generalization was suggested by the referee.

Theorem 11. Let $f:[-1,1]\to[0,1]$ be an even function such that $f$ is strictly monotone increasing on $[-1,0]$, $f(-1)=0$, and $f(0)=1$. For an IFS $A$, let
$$E_f(A)=\frac{1}{n}\sum_{i=1}^{n}f\!\left(\frac{\mu_A(x_i)-\nu_A(x_i)}{1+\pi_A(x_i)}\right).$$
Then $E_f$ is an entropy measure for IFSs.

Proof. The process of the proof is similar to that for Theorem 10. We omit it.

There are many functions, for example, $f(x)=\cos(\pi x/2)$, $f(x)=1-x^2$, or $f(x)=1-|x|$, satisfying the requirements in Theorem 11. Clearly, different functions give rise to different entropy measures for IFSs.
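As an illustration, the following sketch implements the cosine entropy in the form $\frac{1}{n}\sum_i\cos\frac{|\mu_i-\nu_i|\pi}{2(1+\pi_i)}$; since the displayed formula (11) is elided in the text, this closed form should be read as an assumed reconstruction consistent with the properties (E1)–(E4) discussed above.

```python
# A sketch of the proposed cosine entropy for an IFS.  The closed form
#   E(A) = (1/n) * sum_i cos( |mu_i - nu_i| * pi / (2 * (1 + h_i)) ),
# with hesitancy h_i = 1 - mu_i - nu_i, is an assumed reconstruction of
# formula (11), not a verbatim copy from the paper.
import math

def ifs_entropy(pairs):
    """pairs: list of (mu, nu) with mu, nu >= 0 and mu + nu <= 1."""
    total = 0.0
    for mu, nu in pairs:
        h = 1.0 - mu - nu  # hesitancy degree
        total += math.cos(abs(mu - nu) * math.pi / (2.0 * (1.0 + h)))
    return total / len(pairs)

print(ifs_entropy([(1.0, 0.0)]))  # crisp element: entropy 0 (up to rounding)
print(ifs_entropy([(0.3, 0.3)]))  # mu == nu: entropy 1.0
```

Under this form, a crisp element yields entropy 0, equal membership and nonmembership yield entropy 1, and the measure is invariant under swapping mu and nu, matching (E1)–(E3).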

3.2. Comparison with Existing Entropy Measures

For an IFS $A$ in $X$, Ye [29] proposed two entropy measures $E_1$ and $E_2$:
$$E_1(A)=\frac{1}{n}\sum_{i=1}^{n}\left[\sin\frac{\pi\bigl(1+\mu_A(x_i)-\nu_A(x_i)\bigr)}{4}+\sin\frac{\pi\bigl(1-\mu_A(x_i)+\nu_A(x_i)\bigr)}{4}-1\right]\cdot\frac{1}{\sqrt{2}-1},$$
$$E_2(A)=\frac{1}{n}\sum_{i=1}^{n}\left[\cos\frac{\pi\bigl(1+\mu_A(x_i)-\nu_A(x_i)\bigr)}{4}+\cos\frac{\pi\bigl(1-\mu_A(x_i)+\nu_A(x_i)\bigr)}{4}-1\right]\cdot\frac{1}{\sqrt{2}-1}.$$

The following proposition shows that these two formulas are the same.

Proposition 12. For each $A$ in $\mathrm{IFS}(X)$, let
$$E_3(A)=\frac{1}{n}\sum_{i=1}^{n}\frac{\sqrt{2}\cos\frac{\pi\bigl(\mu_A(x_i)-\nu_A(x_i)\bigr)}{4}-1}{\sqrt{2}-1}.$$
Then $E_1(A)=E_2(A)=E_3(A)$.

Proof. By the sum-to-product identities of trigonometric functions, we have, for each $x_i$, with $t_i=\mu_A(x_i)-\nu_A(x_i)$,
$$\sin\frac{\pi(1+t_i)}{4}+\sin\frac{\pi(1-t_i)}{4}=2\sin\frac{\pi}{4}\cos\frac{\pi t_i}{4}=\sqrt{2}\cos\frac{\pi t_i}{4},$$
and likewise
$$\cos\frac{\pi(1+t_i)}{4}+\cos\frac{\pi(1-t_i)}{4}=2\cos\frac{\pi}{4}\cos\frac{\pi t_i}{4}=\sqrt{2}\cos\frac{\pi t_i}{4}.$$
It follows that $E_1(A)=E_2(A)=E_3(A)$.

The following example shows that the entropy measure can produce some counterintuitive cases.

Example 13. Let , , and be three IFSs in . Now we calculate their entropies.

The absolute differences between the membership degrees and the nonmembership degrees of each to , , and are the same; thus, by formula (17), we can obtain that . But we can see that the hesitancy degrees of the element to , , and are different. Intuitively, the uncertain information of is more than that of , and the uncertain information of is the least. Obviously, the results obtained by using Ye’s formula are not in accordance with our intuition.

Now, let us calculate the entropies of , , and by formula (11). We have , , and , so that . This is consistent with our intuition according to the above analysis. The following theorem is a straightforward exercise.

Theorem 14. Let $X=\{x\}$. For a constant $c\in[0,1)$, let $\mathrm{IFS}_c(X)$ be the set of all IFSs $A$ in $X$ with $|\mu_A(x)-\nu_A(x)|=c$. Then $E$ is strictly monotone increasing with respect to $\pi_A(x)$ on $\mathrm{IFS}_c(X)$.
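The monotonicity claim can be checked numerically; the closed form used below, $E=\cos\frac{|\mu-\nu|\pi}{2(1+h)}$ with $h=1-\mu-\nu$, is an assumed reconstruction of (11), not a verbatim copy.

```python
# A numeric illustration of Theorem 14, under the assumed cosine form
# E = cos(|mu - nu| * pi / (2 * (1 + h))), h = 1 - mu - nu: with the
# difference |mu - nu| fixed at c = 0.2, the single-element entropy
# grows strictly as the hesitancy degree grows.
import math

def e(mu, nu):
    h = 1.0 - mu - nu
    return math.cos(abs(mu - nu) * math.pi / (2.0 * (1.0 + h)))

# mu - nu = 0.2 in every case; hesitancy h increases: 0.2, 0.4, 0.6, 0.8.
cases = [(0.5, 0.3), (0.4, 0.2), (0.3, 0.1), (0.2, 0.0)]
entropies = [e(mu, nu) for mu, nu in cases]
print(entropies)
assert all(x < y for x, y in zip(entropies, entropies[1:]))  # strictly increasing
```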

Comparing the two entropy measures, we find that the measure defined by (11) can measure not only the degree of fuzziness but also the degree of intuitionism of IFSs, which overcomes the shortcoming that Ye's measure can only capture the degree of fuzziness. The entropy measures in [28–31] can also measure both fuzziness and intuitionism, but in some cases some of them may not work as well as desired. Next we compare the proposed entropy measure with them.

We first recall these entropy measures. Xia and Xu [28] derived a cross-entropy measure : where , .

In [30, 31], Vlachos and Sergiadis proposed an entropy measure according to a cross-entropy measure and an entropy measure based on the product of two vectors:

These three entropy measures satisfy the set of requirements in Definition 9. The following examples show that the entropy measures and may give inconsistent information in some cases.

Example 15. Let

Table 1 gives us the entropies of by , , , and .

Table 1: Comparison with existing entropies.

From Table 1, we can see that, for and , the closer the membership degree and the nonmembership degree, or the bigger the hesitation degree, the greater its entropy. Particularly, when the membership degree is equal to the nonmembership degree, the entropy reaches the maximum value 1. The results obtained by using the entropy measures and are in accordance with our intuition.

For , , and , we can see that the absolute differences between the membership degrees and the nonmembership degrees are the same, and hesitancy degrees of the element to , , and are increasing. Intuitively, the uncertain degree of the three IFSs should be increasing. In fact, from Table 1, we have and , which are consistent with our intuition. But, by entropy measures and , we have , , and . Obviously, the results obtained by using the two entropy measures are not in accordance with our intuition. Furthermore, for the entropy measures and , we have the following conclusions.

Theorem 16. Let . For a constant in , let be the set of all IFSs in with . Then(1) is strictly monotone decreasing with respect to on ;(2)when , is strictly monotone decreasing with respect to on ; when , is strictly monotone increasing with respect to on .

Proof. Since and , we have and or and .(1)Applying the above conditions, we have where we let if necessary.
Let for . Then = , so that is strictly monotone increasing with respect to . Thus is strictly monotone decreasing with respect to on .(2)Since and or and , we have Let Then . Clearly, when , we have , so that is strictly monotone decreasing with respect to on ; when , , which implies that is strictly monotone increasing with respect to on .

Hence, compared with the above entropy measures, the entropy measure defined by formula (11) is more effective and reasonable to measure the uncertain information of IFSs.

4. An Entropy Measure for Interval-Valued Intuitionistic Fuzzy Sets

In this section we will extend the entropy measure to IVIFSs and define a new entropy measure for IVIFSs, which is then compared with the entropy measures defined in [37].

Liu et al. [35] and Zhang et al. [38] proposed a set of axiomatic requirements for an entropy measure of IVIFSs, which extends Szmidt and Kacprzyk’s axioms formulated for entropy of IFSs [26].

Definition 17 (see [35]). A real-valued function $E:\mathrm{IVIFS}(X)\to[0,1]$ is called an entropy measure for an IVIFS $\tilde{A}$, if it satisfies the following axiomatic requirements: (E1) $E(\tilde{A})=0$ if and only if $\tilde{A}$ is a crisp set; (E2) $E(\tilde{A})=1$ if and only if $\tilde{\mu}_{\tilde{A}}(x)=\tilde{\nu}_{\tilde{A}}(x)$ for each $x\in X$; (E3) $E(\tilde{A})=E(\tilde{A}^c)$; (E4) $E(\tilde{A})\le E(\tilde{B})$ if $\tilde{A}$ is less fuzzy than $\tilde{B}$, which is defined as

We now introduce a formula to calculate the entropy of an IVIFS based on the entropy measure for IFSs defined by formula (11). For any we define Then we have the following theorem.

Theorem 18. The mapping , defined by (27), is an entropy measure for IVIFSs.

Proof. In order to prove that the mapping is an entropy measure, it is sufficient to show that satisfies the conditions (E1)–(E4) in Definition 17. Suppose that Since , , , , , and are all in the interval , we have . Thus .
(E1) Let be a crisp set; that is, , , or , for any . No matter in which case, we have . Hence .
On the other hand, suppose that . Since , we have . Also since , we have . From the following four possible relations of and : we can easily obtain , , or , , so is a crisp set.
(E2) Let . Applying this condition to (27) yields .
Now suppose that . It follows that for each . Since , we have . Thus , for each .
(E3) For , we can easily get that .
(E4) Suppose that , for . Then we have the following relations: In order to prove , we need to prove that By the assumption, it is equivalent to prove that which can be simplified to prove that
Indeed, since and , we have Then, by computing the sum of the left (resp., right) terms of the above two inequalities, we have (33). Hence holds.
By a similar way, we can also prove that for the other three cases. Since for each , we have .

Similar to Theorem 11, we give the following general form of the entropy measure defined in (27).

Theorem 19. Let be an even function such that is strictly monotone increasing on , , and . For any , let Then is an entropy measure for IVIFSs.

Proof. The process of the proof is similar to that for Theorem 18. We omit it.

In the following, we compare the entropy measure defined by (27) with the entropy measures defined in [37, 41].

Let be an IVIFS in . Chen et al. [41] proposed a concrete entropy measure:

Ye [37] introduced two entropy measures and as follows: where , , and are two fixed numbers.

Theorem 20. For each in IVIFS, let where , , and .
Then .

Proof. The process of the proof is similar to that for Proposition 12. We omit it.

Example 21. Let us calculate entropies for the following IVIFSs:
By the entropy measure , we get
The difference between the membership degrees and nonmembership degrees of and is the same, and the hesitancy degree of is bigger than that of , so the entropy of should be bigger than that of . However, by the entropy measure , we have .

Using (43), we have

We suppose that in formula (39); then formula (39) reduces to

From these results, we can see that the entropy formula defined by (43) has the following two drawbacks. (1) It does not satisfy the necessary condition of in Definition 17. In fact, for any IVIFS satisfying for each , we can obtain . (2) It only reflects the difference between the membership degree and the nonmembership degree. Thus, for any two IVIFSs and satisfying , for all , we have .

Now by the entropy formula defined by (27), we can obtain The results show that and