Abstract

In rough fuzzy set theory, the rough degree is used to characterize the uncertainty of a fuzzy set, and the rough entropy of a knowledge is used to depict the roughness of a rough classification. Both are effective, but neither is accurate enough. In this paper, we propose a new rough entropy of a rough fuzzy set that combines the rough degree with the rough entropy of a knowledge. Theoretical studies and examples show that the new rough entropy of a rough fuzzy set is a suitable measure. As an application, we introduce it into fuzzy-target decision-making tables and establish a new method for evaluating the entropy weights of attributes.

1. Introduction

The rough set theory, introduced by Pawlak [1, 2], is a useful and important tool for dealing with uncertain information systems. It has been studied and applied in many fields, such as artificial intelligence, decision making, machine learning, and pattern recognition. In the traditional rough set theory, targets are ordinary crisp sets. However, in actual problems, we often encounter cases in which targets are fuzzy [3]. In order to describe such fuzzy targets, Dubois and Prade [4, 5] proposed the concept of the rough fuzzy set, which is a good combination of rough sets and fuzzy sets. As an important issue in the rough fuzzy set theory, measuring uncertainty has been widely studied (see [6–9]). There are two primary sources of this uncertainty. One is that the boundary region of the target's approximation produces roughness: the larger the boundary region is, the larger the roughness of the target is. This kind of roughness is classified as system uncertainty and is measured by the rough degree [3, 10] from the viewpoint of algebra. The other is the roughness produced by a knowledge, that is, the partition of the universe induced by an equivalence relation: the more objects the equivalence classes contain, the rougher the knowledge is. This kind of uncertainty is classified as knowledge uncertainty and is measured by the rough entropy [11, 12] of a knowledge from the viewpoint of information theory.

However, these uncertainty measures of rough fuzzy sets have some deficiencies. On the one hand, the rough degree is not strictly monotone with respect to finer knowledge, which implies that it is not accurate enough. On the other hand, the rough entropy of a knowledge does not reflect the uncertainty produced by the boundary region of the approximation. In order to overcome these limitations, we propose a new rough entropy for rough fuzzy sets, which considers not only the rough degree but also the rough entropy of a knowledge. This is the first aim of this paper (see Section 3).

In multiattribute decision-making problems, determining attribute weights is an active research area. There are some traditional methods for determining attribute weights, such as expert grading, binary comparison, fuzzy statistics, and grey relational analysis. Nevertheless, these methods depend largely on the decision maker's experience. Recently, the rough set theory has been introduced into the study of determining weights. Usually, there are two kinds of methods. From the viewpoint of algebraic theory [13, 14], the rough degree is used to determine attribute weights. This method is effective but sometimes leads to the impractical case in which some weights are zero. From the viewpoint of information theory [15, 16], information entropy is used to determine weights. This method avoids zero weights, but it may lead to the awkward case in which the weights of some redundant attributes are larger than those of nonredundant attributes. In this paper, we introduce the new rough entropy of rough fuzzy sets into fuzzy-target decision-making tables and, as an application, give a new method to evaluate entropy weights. This is the second aim of this paper (see Section 4).

2. Basic Concepts of Rough Fuzzy Sets

In order to deal with complicated fuzzy information systems by integrating the rough set theory, Dubois and Prade [4] introduced equivalence relations into fuzzy sets and proposed the concept of rough fuzzy sets.

Let U be a finite universe, and let R be an equivalence relation on U. Then (U, R) constitutes a Pawlak approximation space. The set U/R of equivalence classes generated by R on U is called a knowledge. [x]_R denotes the equivalence class of the element x ∈ U. F(U) denotes the set of all fuzzy sets on U.

Definition 1 (see [3, 10]). Let (U, R) be a Pawlak approximation space, and let A ∈ F(U) be a fuzzy set on U. Then the lower and upper approximation sets of A on (U, R) are defined by

R_*(A)(x) = min{A(y) : y ∈ [x]_R} and R^*(A)(x) = max{A(y) : y ∈ [x]_R},

respectively, where R_* and R^* are called the lower and upper approximation operators. The ordered pair (R_*(A), R^*(A)) is called a rough fuzzy set. If R_*(A) = R^*(A), A is said to be definable. Otherwise, A is said to be rough.

In Definition 1, R_*(A)(x) can be understood as the degree to which the element x certainly belongs to A, and R^*(A)(x) as the degree to which x possibly belongs to A. If A degenerates to an ordinary set, R_*(A) and R^*(A) degenerate accordingly to the lower and upper approximation sets of the traditional Pawlak rough set, respectively.
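The approximation operators of Definition 1 can be sketched in a few lines of Python on a finite universe; the memberships and the partition below are hypothetical toy data, not taken from the paper.

```python
# Sketch of the Dubois-Prade lower/upper approximations of a fuzzy set.
# A maps each object to its membership degree; `partition` is the
# knowledge U/R, given as a list of equivalence classes.

def lower_upper(A, partition):
    """Return (R_*(A), R^*(A)) as dicts over the universe."""
    lower, upper = {}, {}
    for block in partition:
        lo = min(A[y] for y in block)  # R_*(A)(x) = min over [x]_R
        hi = max(A[y] for y in block)  # R^*(A)(x) = max over [x]_R
        for x in block:
            lower[x], upper[x] = lo, hi
    return lower, upper

# Hypothetical memberships and knowledge:
A = {"x1": 0.8, "x2": 0.3, "x3": 0.6, "x4": 0.6}
partition = [{"x1", "x2"}, {"x3", "x4"}]
low, up = lower_upper(A, partition)
```

Here low["x1"] = 0.3 and up["x1"] = 0.8, while on the block {x3, x4} the lower and upper approximations coincide at 0.6, so A is definable there.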

Let U be a universe and A ∈ F(U). Then for thresholds 0 < β ≤ α ≤ 1, the α-cut of R_*(A) and the β-cut of R^*(A) are given, respectively, as follows:

(R_*(A))_α = {x ∈ U : R_*(A)(x) ≥ α}, (R^*(A))_β = {x ∈ U : R^*(A)(x) ≥ β}.

Definition 2 (see [12]). Let P and Q be equivalence relations on U, U/P = {P_1, P_2, ..., P_m}, and U/Q = {Q_1, Q_2, ..., Q_n}. If for any P_i ∈ U/P there exists Q_j ∈ U/Q such that P_i ⊆ Q_j, then the knowledge U/P is said to be finer than the knowledge U/Q, denoted by U/P ⊆ U/Q.
If U/P is finer than U/Q and U/P ≠ U/Q, then U/P is said to be strictly finer than U/Q, denoted by U/P ⊂ U/Q.
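The refinement relation of Definition 2 amounts to a containment check between partitions; a short sketch with hypothetical partitions:

```python
# U/P is finer than U/Q iff every block of P lies inside some block of Q.

def is_finer(P, Q):
    return all(any(p <= q for q in Q) for p in P)

def is_strictly_finer(P, Q):
    return is_finer(P, Q) and set(map(frozenset, P)) != set(map(frozenset, Q))

P = [{1}, {2}, {3, 4}]
Q = [{1, 2}, {3, 4}]
```

Here is_strictly_finer(P, Q) holds, while is_finer(Q, P) fails because the block {1, 2} lies in no block of P.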

3. Uncertainty Measures of Rough Fuzzy Sets

Two kinds of methods are often used to measure the uncertainty of rough fuzzy sets. One is the rough degree, which is defined from an algebraic point of view. The other is the rough entropy, which is defined from an information theory point of view.

3.1. Rough Degree of a Rough Fuzzy Set

Since a fuzzy target is described by means of its lower and upper approximation sets, the noncoincidence of the two sets results in roughness. The larger the boundary region formed by the lower and upper approximation sets is, the greater the roughness is. In order to quantify this roughness, the rough degree was introduced.

Definition 3 (see [3, 10]). Let (U, R) be a Pawlak approximation space and A ∈ F(U). Then for any 0 < β ≤ α ≤ 1, the accuracy degree and rough degree of A under (α, β) are defined, respectively, as follows:

α_R^{(α,β)}(A) = |(R_*(A))_α| / |(R^*(A))_β|, ρ_R^{(α,β)}(A) = 1 − α_R^{(α,β)}(A),

where |·| denotes the number of elements of a finite set. If (R^*(A))_β = ∅, then it is stipulated that α_R^{(α,β)}(A) = 1.

In Definition 3, the rough degree characterizes the roughness of using the lower and upper approximation sets to approach a fuzzy target, and the thresholds α and β make the representation of the rough degree more flexible.
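A sketch of the rough degree of Definition 3, assuming the accuracy degree is the ratio of cut-set cardinalities |(R_*(A))_α| / |(R^*(A))_β| (our reconstruction); the memberships and partition are hypothetical.

```python
# Rough degree rho_R^{(alpha,beta)}(A) of a fuzzy set on a finite universe.

def rough_degree(A, partition, alpha, beta):
    lower, upper = {}, {}
    for block in partition:
        lo, hi = min(A[y] for y in block), max(A[y] for y in block)
        for x in block:
            lower[x], upper[x] = lo, hi
    lower_cut = [x for x in A if lower[x] >= alpha]  # (R_*(A))_alpha
    upper_cut = [x for x in A if upper[x] >= beta]   # (R^*(A))_beta
    if not upper_cut:
        return 0.0  # stipulation: accuracy 1 when the beta-cut is empty
    return 1.0 - len(lower_cut) / len(upper_cut)

A = {"x1": 0.8, "x2": 0.3, "x3": 0.6, "x4": 0.6}
partition = [{"x1", "x2"}, {"x3", "x4"}]
rho = rough_degree(A, partition, alpha=0.5, beta=0.5)
```

On this data the α-cut of the lower approximation is {x3, x4} and the β-cut of the upper approximation is the whole universe, so rho = 1 − 2/4 = 0.5.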

Theorem 4 (see [3]). Let (U, R) be a Pawlak approximation space and A ∈ F(U). For any 0 < β ≤ α ≤ 1, the rough degree in Definition 3 satisfies the following properties: (i) 0 ≤ ρ_R^{(α,β)}(A) ≤ 1; (ii) ρ_R^{(α,β)}(A) does not decrease with α and does not increase with β; (iii) if A is definable and α = β, then ρ_R^{(α,β)}(A) = 0; (iv) if A(x) = c (c is a constant) for any x ∈ U and α ≤ c, then ρ_R^{(α,β)}(A) = 0.

Theorem 5 (see [3]). Let (U, P) and (U, Q) be two Pawlak approximation spaces, A ∈ F(U), and 0 < β ≤ α ≤ 1. If U/P ⊆ U/Q, then ρ_P^{(α,β)}(A) ≤ ρ_Q^{(α,β)}(A).

Theorem 5 states that the rough degree decreases monotonically as the knowledge becomes finer. However, this monotonicity is not strict, as the following example illustrates.

Example 6. Let (U, R) be a Pawlak approximation space with a fuzzy target A ∈ F(U); the corresponding fuzzy-target decision-making table is given in Table 1.
Take two knowledges on U, one strictly finer than the other. Computing the lower and upper approximation sets of A by Definition 1 and then the rough degrees by Definition 3, we find that the two rough degrees coincide.

Example 6 shows that the value of the rough degree may remain invariant under strictly finer knowledge. This reveals that the rough degree is not strictly monotone with respect to finer knowledge and implies the limitations of the rough degree in Definition 3. Hence, it is necessary to find a more accurate measure to characterize the roughness of rough fuzzy sets.

3.2. Rough Entropy of a Knowledge

Entropy is an important notion for measuring the uncertainty of information systems in information theory; see [17]. A knowledge in the rough set theory is regarded as a partition of the universe. This implies that knowledge itself is granular and uncertain. The bigger the knowledge granules are, the rougher the partition is and, accordingly, the bigger the uncertainty of the knowledge is. In order to quantify the uncertainty resulting from knowledge granules, Liang and Shi [11] introduced a definition of the rough entropy of a knowledge.

Definition 7 (see [11]). Let (U, R) be a Pawlak approximation space and U/R = {X_1, X_2, ..., X_m}. Then the rough entropy of the knowledge U/R is defined as follows:

Er(R) = −Σ_{i=1}^{m} (|X_i| / |U|) log2 (1 / |X_i|).

Theorem 8 (see [11]). Let (U, P) and (U, Q) be two Pawlak approximation spaces. Then U/P ⊂ U/Q implies that Er(P) < Er(Q).

From Theorem 8, one can see that the finer the partition of U under R is, the smaller the rough entropy of the knowledge is.
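The Liang-Shi rough entropy of a knowledge is a weighted sum over the equivalence classes; a small sketch with toy partitions:

```python
import math

# Er(R) = sum_i (|X_i|/|U|) * log2(|X_i|): 0 for the discrete knowledge,
# log2(|U|) for the coarsest knowledge {U}.

def knowledge_rough_entropy(partition):
    n = sum(len(X) for X in partition)
    return sum(len(X) / n * math.log2(len(X)) for X in partition)

discrete = [{1}, {2}, {3}, {4}]
coarsest = [{1, 2, 3, 4}]
```

Here knowledge_rough_entropy(discrete) is 0.0 and knowledge_rough_entropy(coarsest) is 2.0 = log2(4), and any intermediate partition lies strictly between, matching the monotonicity of Theorem 8.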

3.3. Rough Entropy of a Rough Fuzzy Set

The uncertainty of a rough fuzzy set depends not only on the roughness of the fuzzy target itself but also on the uncertainty of the knowledge. From the discussion in Section 3.1, we know that although the rough degree can reflect the roughness of a fuzzy target characterized approximately by its lower and upper approximation sets, it is not strictly monotone and cannot reflect the influence of changes in knowledge granularity. From the discussion in Section 3.2, we know that the rough entropy of a knowledge can reflect the roughness of the knowledge and is strictly monotone, but it cannot reflect the roughness of the fuzzy target itself. Motivated by these deficiencies, we propose a new rough entropy of a rough fuzzy set.

Definition 9. Let (U, R) be a Pawlak approximation space, A ∈ F(U), and 0 < β ≤ α ≤ 1. The rough entropy of A under (U, R) is defined as follows:

E_R^{(α,β)}(A) = ρ_R^{(α,β)}(A) · Er(R).

Theorem 10. The rough entropy of A defined in Definition 9 has the following properties. (i) For any given A, E_R^{(α,β)}(A) reaches the maximum when U/R = {U}. (ii) E_R^{(α,β)}(A) reaches the minimum 0 when U/R = {{x} : x ∈ U}.

Proof. (i) Let U/R = {U}. Then for any x ∈ U, we have R_*(A)(x) = min{A(y) : y ∈ U} and R^*(A)(x) = max{A(y) : y ∈ U}. Obviously, (R_*(A))_α = ∅ and (R^*(A))_β = U. Then |(R_*(A))_α| = 0 and |(R^*(A))_β| = |U|. By Definition 3, we have ρ_R^{(α,β)}(A) = 1.
In addition, by Definition 7, we have Er(R) = log2 |U|, the maximum of Er(R). Hence, we obtain E_R^{(α,β)}(A) = log2 |U|, which is the maximum.
(ii) If U/R = {{x} : x ∈ U}, we have R_*(A) = R^*(A) = A and Er(R) = 0. Hence, we obtain E_R^{(α,β)}(A) = 0.

Theorem 11. Let (U, P) and (U, Q) be Pawlak approximation spaces, A ∈ F(U), and 0 < β ≤ α ≤ 1. If U/P ⊂ U/Q, one has E_P^{(α,β)}(A) < E_Q^{(α,β)}(A).

Proof. Assuming that U/P ⊂ U/Q, we have ρ_P^{(α,β)}(A) ≤ ρ_Q^{(α,β)}(A) by Theorem 5 and Er(P) < Er(Q) by Theorem 8. Then we obtain E_P^{(α,β)}(A) < E_Q^{(α,β)}(A).

Theorem 11 implies that the rough entropy of a rough fuzzy set in Definition 9 is strictly monotone with respect to finer knowledge.
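Taking the combined measure as the product of the rough degree and the rough entropy of the knowledge (our reading of Definition 9, reconstructed from Theorems 10 and 11), the strict monotonicity can be checked numerically on hypothetical data:

```python
import math

# Sketch of the combined rough entropy E = rho * Er; all data are toy
# values, not taken from the paper's examples.

def combined_rough_entropy(A, partition, alpha, beta):
    n = sum(len(b) for b in partition)
    lower, upper = {}, {}
    for block in partition:
        lo, hi = min(A[y] for y in block), max(A[y] for y in block)
        for x in block:
            lower[x], upper[x] = lo, hi
    lower_cut = [x for x in A if lower[x] >= alpha]
    upper_cut = [x for x in A if upper[x] >= beta]
    rho = 0.0 if not upper_cut else 1.0 - len(lower_cut) / len(upper_cut)
    er = sum(len(X) / n * math.log2(len(X)) for X in partition)
    return rho * er

A = {"x1": 0.8, "x2": 0.3, "x3": 0.6, "x4": 0.6}
coarse = [{"x1", "x2", "x3", "x4"}]
fine = [{"x1", "x2"}, {"x3", "x4"}]
discrete = [{"x1"}, {"x2"}, {"x3"}, {"x4"}]
```

With α = β = 0.5 the entropy is 2.0 for the coarsest knowledge, 0.5 for the strictly finer one, and 0.0 for the discrete one: strictly decreasing, unlike the rough degree alone.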

Example 12. For the fuzzy set A given in Example 6, we compute the rough entropy of Definition 9 under the two knowledges. We obtain that the rough entropy strictly decreases when the knowledge becomes strictly finer.

This example validates that the rough entropy in Definition 9 is more accurate than the rough degree in Definition 3, since it is strictly monotone.

4. Entropy Weight for a Fuzzy-Target Decision-Making Table

4.1. Weight of an Attribute

Consider a fuzzy-target decision-making table (U, C ∪ {d}), where U is an object set, C is a conditional attribute set whose value sets are crisp, and d is a fuzzy target whose value set is a fuzzy set. Using the rough fuzzy set theory in Section 3, the fuzzy target d can be described approximately by its lower and upper approximation sets under the attribute set C, and the rough entropy of the fuzzy target under the attribute set C is E_C^{(α,β)}(d).

The significance of an attribute is obtained from the change of the rough degree when the attribute is removed from the attribute set. The larger the change is, the more significant the attribute is. The significance of the attribute c_i ∈ C is

sig(c_i) = ρ_{C−{c_i}}^{(α,β)}(d) − ρ_C^{(α,β)}(d).

Then the weight of the condition attribute c_i is

w(c_i) = sig(c_i) / Σ_{c_j ∈ C} sig(c_j).

We can see that sig(c_i) and w(c_i) may equal zero, since the rough degree is not strictly monotone with respect to finer knowledge. Nevertheless, for a given decision table, each attribute is significant in some way [18]; that is to say, each attribute's weight may be tiny but should not be equal to zero. In order to avoid this deficiency of sig(c_i) and w(c_i), we propose a new method to calculate the weight.

Definition 13. Let (U, C ∪ {d}) be a fuzzy-target decision-making table and c_i ∈ C. The significance of c_i is defined by

sig'(c_i) = E_{C−{c_i}}^{(α,β)}(d) − E_C^{(α,β)}(d),

and the weight of c_i is defined by

w'(c_i) = sig'(c_i) / Σ_{c_j ∈ C} sig'(c_j).

Theorem 14. Let (U, C ∪ {d}) be a fuzzy-target decision-making table and c_i ∈ C. Then for any integer 1 ≤ i ≤ |C|, one has w'(c_i) > 0.

Proof. Since U/C ⊂ U/(C − {c_i}) holds, by Theorem 11, we have E_C^{(α,β)}(d) < E_{C−{c_i}}^{(α,β)}(d). Then sig'(c_i) = E_{C−{c_i}}^{(α,β)}(d) − E_C^{(α,β)}(d) > 0. Hence, we obtain w'(c_i) > 0.

Theorem 14 shows that the weight of an attribute defined in Definition 13 cannot be equal to zero. This is what we expect.
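The weight scheme of Definition 13 normalizes the entropy increases caused by removing each attribute. The sketch below uses purely hypothetical entropy values (the attribute names mirror Example 15, but the numbers are illustrative only):

```python
# sig'(c_i) = E_{C-{c_i}}(d) - E_C(d), normalized into weights.

def entropy_weights(E_full, E_without):
    """E_full: rough entropy of the target under all of C.
    E_without: dict mapping each attribute to the entropy with it removed."""
    sig = {a: e - E_full for a, e in E_without.items()}
    total = sum(sig.values())
    return {a: s / total for a, s in sig.items()}

E_full = 0.4  # hypothetical value
E_without = {"Outlook": 1.0, "Temperature": 0.7, "Humidity": 0.6, "Windy": 0.5}
w = entropy_weights(E_full, E_without)
```

By Theorem 14 each entropy increase is strictly positive, so every weight is strictly positive, and by construction the weights sum to 1.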

4.2. Numerical Example

Example 15. Consider a fuzzy-target decision-making table for people's trips in an area influenced by weather conditions; see Table 2. Using the two different methods, we analyze the weight of each weather condition affecting people's decision on a trip.
Let U be the object set and C = {Outlook (c_1), Temperature (c_2), Humidity (c_3), Windy (c_4)}. We compute the partitions induced by C and by C − {c_i} for each attribute c_i.
Choosing thresholds α and β, we obtain the corresponding lower and upper approximation sets and their cut sets.
Then, computing with the rough degree, we obtain the significance of each attribute and the corresponding weights.
On the other hand, we compute the modified significance and weight given in Definition 13. According to Definition 7, we compute the rough entropy of each knowledge, and according to Definition 9, we obtain the rough entropy of the fuzzy target under each attribute subset. Hence, according to Definition 13, we obtain the significance and the weight of each attribute.
Contrasting the two kinds of weights above, we find that the latter method, involving the rough entropy of a rough fuzzy set given in Definition 13, keeps a basically identical trend with the former one, which uses only the rough degree. This means the latter method is effective. However, in the former case it is impractical that the weight of one attribute is zero; the latter method avoids this case, which implies that it is better than the former one. Moreover, the weights of two attributes are equal in the former result, which does not reflect the influence of the knowledge's roughness; the latter method makes up for this deficiency as well. According to these results, we conclude that our method involving the new rough entropy of a rough fuzzy set is superior to the method using only the rough degree.

5. Conclusion

A new rough entropy for measuring uncertainty in the rough fuzzy set theory has been proposed. Using it, we can measure not only the roughness induced by the boundary region of a fuzzy target in an approximation space but also the roughness arising from the classification of a knowledge. In a fuzzy-target decision-making table, we have established a new method based on the new rough entropy to evaluate attribute weights.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The authors would like to thank the referee for his/her valuable comments and suggestions. This work is supported by the Guangdong Province’s Nature and Science Project (S201101006103) of China, the Jiangmen City’s Science and Technology Project of China, and the Wuyi University Youth’s Nature and Science Project of China.