Abstract

In the present communication, a parametric (R, S)-norm information measure for the Pythagorean fuzzy set has been proposed along with the proof of its validity. The monotonic behavior and maximality feature of the proposed information measure have been studied and presented. Further, an algorithm for solving the multicriteria decision-making problem with the help of the proposed information measure has been provided, keeping in view two different cases of criteria weights: one when the weights are completely unknown and the other when the weights are partially known. Numerical examples for each of the cases have been successfully illustrated. Finally, the work has been concluded by providing the scope for future work.

1. Introduction

The concept of the intuitionistic fuzzy set (IFS) (Atanassov) [1] has been widely studied and applied to deal with the uncertainty and hesitancy inherent in practical circumstances. The prominent characteristic of an IFS is that it assigns to every element of the domain of discourse a degree of membership and a degree of nonmembership from the unit interval, along with a degree of indeterminacy, whose total sum equals unity. In the literature, intuitionistic fuzzy sets comprehensively span applications in the fields of decision-making, pattern recognition, sales analysis, financial services, medical diagnosis, etc.

The Pythagorean fuzzy set (PFS), proposed by Yager [2], is an efficient generalization of the intuitionistic fuzzy set, characterized by a membership value and a nonmembership value satisfying the condition that the squared sum of these values is less than or equal to 1. Yager and Abbasov [3] noted that, in some practical multiple-criteria decision-making problems, the degree of membership and the degree of nonmembership assigned by a decision-maker to a particular alternative may have a sum greater than 1, in which case it is not feasible to use an intuitionistic fuzzy set. Therefore, the PFS proves to be more capable of representing and handling vagueness, impreciseness, and uncertainty than the IFS in various decision-making processes. It may be noted that the PFS is more general than the IFS, as the space of admissible membership grades of a PFS is larger than that of an IFS, which enables wider applicability.

Various researchers have theoretically developed the concept of Yager's Pythagorean fuzzy sets [4] and applied it in the fields of decision-making, medical diagnosis, pattern recognition, and other real-world problems. In order to deal with decision-making problems involving PFSs, Zhang and Xu [5] proposed a comparison method based on a score function to identify the Pythagorean fuzzy positive ideal solution (PIS) and the Pythagorean fuzzy negative ideal solution (NIS). Further, they extended the TOPSIS method to compute the distances between each alternative and the PIS and NIS, respectively. Peng and Yang [6] proposed some basic operations for PFSs and provided Pythagorean fuzzy aggregation operators along with their important properties. In continuation, they developed a Pythagorean fuzzy superiority and inferiority ranking algorithm to solve group decision-making problems under uncertainty. Further, Peng et al. [7] established the relationship between the distance measure, similarity measure, entropy, and inclusion measure and suggested a systematic transformation of information measures for PFSs. Yager [8] introduced some basic set operations for PFSs and established the relationship between Pythagorean membership grades and complex numbers. In addition, the solution of multicriteria decision-making problems with satisfaction expressed through Pythagorean membership grades has been carried out. A new method for Pythagorean fuzzy MCDM problems with the help of aggregation operators and distance measures was developed by Zeng et al. [9]. Further, they proposed the Pythagorean fuzzy ordered weighted averaging weighted average distance (PFOWAWAD) operator and developed a hybrid TOPSIS method.

Using PFSs, Ren et al. [10] carried out a simulation test to study the effect of the risk attitudes of decision-makers on the solutions of decision-making problems. Zhang [11] introduced a novel closeness index-based ranking method for Pythagorean fuzzy numbers and proposed the interval-valued Pythagorean fuzzy set with its basic operations and important properties. In addition, hierarchical multicriteria decision-making problems in the Pythagorean fuzzy environment have been solved by developing a closeness index-based Pythagorean fuzzy QUALIFLEX method. Liu et al. [12] developed various types of Pythagorean fuzzy aggregation operators and used them to solve decision-making problems. Zeng [13] developed a Pythagorean fuzzy multiattribute group decision-making method on the basis of a new Pythagorean fuzzy probabilistic ordered weighted averaging (OWA) operator. Though various researchers have contributed significantly to the development of the theory of PFSs, as deliberated above, only a few studies on the entropy of PFSs and its applications are found in the literature. Xue et al. [14] studied the linear programming technique for multidimensional analysis of preference (LINMAP) method under the Pythagorean fuzzy environment to solve multiple attribute group decision-making problems by incorporating Pythagorean fuzzy entropy, along with various other applications. Vital applications of entropy and information measures based on IFS theory are well known in the literature. In order to deal with real-world problems more efficiently and to cater to the need of the hour, generalizations of the existing approaches play an important role, as they contribute more flexibility in applications; e.g., parameters may characterize various factors such as time constraints, lack of knowledge, and environmental conditions.

Bajaj et al. [15] proposed a new R-norm intuitionistic fuzzy entropy and a weighted R-norm intuitionistic fuzzy divergence measure with their computational applications in pattern recognition and image thresholding. Gandotra et al. [16] studied the multiple-criteria decision-making problem with the help of a parametric entropy under α-cut and β-cut based distance measures for different possible values of the parameters and provided a ranking method for the available alternatives.

In this communication, we have proposed a new (R, S)-norm information measure for Pythagorean fuzzy sets and applied the information measure in an algorithm to solve multicriteria decision-making problems. In continuation, the implementation of the proposed algorithm has been illustrated with suitable examples. The rest of this paper is organized as follows: in Section 2, we present some basic notions and preliminaries related to the proposed information measure. A new (R, S)-norm information measure of the Pythagorean fuzzy set is proposed with the proof of its validity in Section 3. Further, in Section 4, the maximality and the monotonic behavior of the proposed information measure with respect to the parameters R and S have been studied and validated empirically. In Section 5, a new multicriteria decision-making algorithm is provided on the basis of the proposed (R, S)-norm information measure of PFS in view of two cases of criteria weights: one when the weights are unknown and the other when the weights are partially known. In order to support and implement the proposed algorithm, an example for each case has been explicitly dealt with in Section 6. The paper is finally concluded in Section 7.

2. Preliminaries

In this section, we recall some fundamental concepts in connection with the Pythagorean fuzzy set, which are well known in the literature.

Definition 1 (see [1]). An intuitionistic fuzzy set (IFS) $A$ in the universe of discourse $X$ is given by
$$A = \{\langle x, \mu_A(x), \nu_A(x)\rangle \mid x \in X\},$$
where $\mu_A(x)$ and $\nu_A(x)$ denote the degree of membership and the degree of nonmembership, respectively, and for every $x \in X$ satisfy the condition
$$0 \le \mu_A(x) + \nu_A(x) \le 1,$$
and the degree of indeterminacy for any IFS $A$ and $x \in X$ is given by $\pi_A(x) = 1 - \mu_A(x) - \nu_A(x)$.

Definition 2 (see [2]). A Pythagorean fuzzy set (PFS) $P$ in the universe of discourse $X$ is given by
$$P = \{\langle x, \mu_P(x), \nu_P(x)\rangle \mid x \in X\},$$
where $\mu_P(x)$ and $\nu_P(x)$ denote the degree of membership and the degree of nonmembership, respectively, and for every $x \in X$ satisfy the condition
$$0 \le \mu_P^2(x) + \nu_P^2(x) \le 1,$$
and the degree of indeterminacy for any PFS $P$ and $x \in X$ is given by $\pi_P(x) = \sqrt{1 - \mu_P^2(x) - \nu_P^2(x)}$.

In the case of a PFS, the restriction corresponding to the degree of membership and the degree of nonmembership is $\mu_P^2(x) + \nu_P^2(x) \le 1$, whereas the condition in the case of an IFS is $\mu_A(x) + \nu_A(x) \le 1$ for $x \in X$. This difference in constraint conditions gives a wider coverage of the information span, which is shown geometrically in Figure 1.
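As a quick illustration of this difference, the following Python sketch (with a hypothetical grade pair) checks admissibility under the IFS and PFS constraints; the pair (0.8, 0.5) is admissible as a Pythagorean fuzzy grade but not as an intuitionistic fuzzy grade.

```python
# Minimal sketch: admissibility of a (membership, nonmembership) pair under the
# IFS and PFS constraints.  The pair (0.8, 0.5) is a hypothetical example.

def is_ifs_pair(mu: float, nu: float) -> bool:
    """IFS admissibility: mu, nu in [0, 1] and mu + nu <= 1."""
    return 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0

def is_pfs_pair(mu: float, nu: float) -> bool:
    """PFS admissibility: mu, nu in [0, 1] and mu**2 + nu**2 <= 1."""
    return 0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu**2 + nu**2 <= 1.0

mu, nu = 0.8, 0.5
print(is_ifs_pair(mu, nu))   # False: 0.8 + 0.5 = 1.3 > 1
print(is_pfs_pair(mu, nu))   # True:  0.64 + 0.25 = 0.89 <= 1
```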

Some of the important binary operations on PFSs, as available in the literature, are presented below.

Definition 3 (see [7]). If $P = \{\langle x, \mu_P(x), \nu_P(x)\rangle \mid x \in X\}$ and $Q = \{\langle x, \mu_Q(x), \nu_Q(x)\rangle \mid x \in X\}$ are two Pythagorean fuzzy sets in $X$, then the operations can be defined as follows:
(a) Complement: $P^c = \{\langle x, \nu_P(x), \mu_P(x)\rangle \mid x \in X\}$.
(b) Containment: $P \subseteq Q$ if and only if $\mu_P(x) \le \mu_Q(x)$ and $\nu_P(x) \ge \nu_Q(x)$ for all $x \in X$.
(c) Union: $P \cup Q = \{\langle x, \max(\mu_P(x), \mu_Q(x)), \min(\nu_P(x), \nu_Q(x))\rangle \mid x \in X\}$.
(d) Intersection: $P \cap Q = \{\langle x, \min(\mu_P(x), \mu_Q(x)), \max(\nu_P(x), \nu_Q(x))\rangle \mid x \in X\}$.
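A small Python sketch of these operations, representing a PFS over a finite universe as a list of (mu, nu) pairs, may read as follows (the list-of-pairs representation is an assumption made for illustration):

```python
# PFSs over a common finite universe X are represented as lists of (mu, nu) pairs.

def pfs_complement(P):
    """Complement: swap membership and nonmembership degrees."""
    return [(nu, mu) for (mu, nu) in P]

def pfs_contains(P, Q):
    """Containment P <= Q: mu_P <= mu_Q and nu_P >= nu_Q element-wise."""
    return all(mu_p <= mu_q and nu_p >= nu_q
               for (mu_p, nu_p), (mu_q, nu_q) in zip(P, Q))

def pfs_union(P, Q):
    """Union: elementwise max of memberships and min of nonmemberships."""
    return [(max(mu_p, mu_q), min(nu_p, nu_q))
            for (mu_p, nu_p), (mu_q, nu_q) in zip(P, Q)]

def pfs_intersection(P, Q):
    """Intersection: elementwise min of memberships and max of nonmemberships."""
    return [(min(mu_p, mu_q), max(nu_p, nu_q))
            for (mu_p, nu_p), (mu_q, nu_q) in zip(P, Q)]
```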

Definition 4 (see [10]). Let $P$ and $Q$ be two PFSs; then the Euclidean distance between $P$ and $Q$ is defined as follows:
$$d_E(P, Q) = \sqrt{\frac{1}{2n} \sum_{i=1}^{n} \Big[ \big(\mu_P^2(x_i) - \mu_Q^2(x_i)\big)^2 + \big(\nu_P^2(x_i) - \nu_Q^2(x_i)\big)^2 + \big(\pi_P^2(x_i) - \pi_Q^2(x_i)\big)^2 \Big]}.$$

Definition 5 (see [5]). Let $P$ and $Q$ be two PFSs; then the Hamming distance between $P$ and $Q$ is defined as follows:
$$d_H(P, Q) = \frac{1}{2n} \sum_{i=1}^{n} \Big[ \big|\mu_P^2(x_i) - \mu_Q^2(x_i)\big| + \big|\nu_P^2(x_i) - \nu_Q^2(x_i)\big| + \big|\pi_P^2(x_i) - \pi_Q^2(x_i)\big| \Big].$$
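The two distances may be sketched in Python as follows, assuming the squared-difference forms reconstructed in Definitions 4 and 5 and the same list-of-pairs representation as above:

```python
import math

def indeterminacy(mu: float, nu: float) -> float:
    """Degree of indeterminacy pi = sqrt(1 - mu^2 - nu^2) of a PFS grade."""
    return math.sqrt(max(0.0, 1.0 - mu**2 - nu**2))

def euclidean_distance(P, Q) -> float:
    """Euclidean distance of Definition 4 between two PFSs of equal length."""
    total = 0.0
    for (mu_p, nu_p), (mu_q, nu_q) in zip(P, Q):
        pi_p, pi_q = indeterminacy(mu_p, nu_p), indeterminacy(mu_q, nu_q)
        total += ((mu_p**2 - mu_q**2) ** 2
                  + (nu_p**2 - nu_q**2) ** 2
                  + (pi_p**2 - pi_q**2) ** 2)
    return math.sqrt(total / (2 * len(P)))

def hamming_distance(P, Q) -> float:
    """Hamming distance of Definition 5 between two PFSs of equal length."""
    total = 0.0
    for (mu_p, nu_p), (mu_q, nu_q) in zip(P, Q):
        pi_p, pi_q = indeterminacy(mu_p, nu_p), indeterminacy(mu_q, nu_q)
        total += (abs(mu_p**2 - mu_q**2)
                  + abs(nu_p**2 - nu_q**2)
                  + abs(pi_p**2 - pi_q**2))
    return total / (2 * len(P))
```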

3. New (R, S)-Norm Information Measure of PFS

Let $\Delta_n = \{P = (p_1, p_2, \ldots, p_n) : p_i \ge 0, \sum_{i=1}^{n} p_i = 1\}$, $n \ge 2$, be the set of all probability distributions associated with a random variable taking finitely many values $x_1, x_2, \ldots, x_n$. Joshi and Kumar [17] defined and studied a real-valued function from $\Delta_n$ to $\mathbb{R}$, called the (R, S)-norm information measure of the distribution $P$, for $R > 0$ ($R \ne 1$) and $S > 0$ ($S \ne 1$), given by
$$H_{R}^{S}(P) = \frac{R \times S}{S - R}\left[\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} - \left(\sum_{i=1}^{n} p_i^{S}\right)^{1/S}\right], \tag{10}$$
where either $R > 1$ and $0 < S < 1$, or $0 < R < 1$ and $S > 1$.

The most important property of this measure is that when $R = 1$, (10) reduces to the $S$-norm entropy and when $S = 1$ it reduces to the $R$-norm entropy studied by Boekee and Van der Lubbe [18], and if $R = 1$ and $S \to 1$, or $S = 1$ and $R \to 1$, then it gives Shannon's [19] entropy.
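For reference, the limiting R-norm form and the Shannon limit may be recalled as follows (a hedged reconstruction; the exact notation of [18, 19] may differ):
$$H_R(P) = \frac{R}{R-1}\left[1 - \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}\right], \quad R > 0,\ R \neq 1, \qquad \lim_{R \to 1} H_R(P) = -\sum_{i=1}^{n} p_i \log p_i.$$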

Based on the axiomatic definition of entropy for intuitionistic fuzzy sets proposed by Hung and Yang (2006) [20], we analogously define a real-valued function $H$ on the family of Pythagorean fuzzy sets, called the entropy of a Pythagorean fuzzy set, if and only if the following four axioms are satisfied:
(i) (PFS1) Sharpness: $H(P) = 0$ iff $P$ is a crisp set, i.e., $\mu_P(x_i) = 1$, $\nu_P(x_i) = 0$, or $\mu_P(x_i) = 0$, $\nu_P(x_i) = 1$, for all $x_i \in X$.
(ii) (PFS2) Maximality: $H(P)$ is maximum iff $\mu_P(x_i) = \nu_P(x_i) = \pi_P(x_i)$, i.e., $\mu_P^2(x_i) = \nu_P^2(x_i) = \pi_P^2(x_i) = 1/3$, for all $x_i \in X$.
(iii) (PFS3) Symmetry: $H(P) = H(P^c)$.
(iv) (PFS4) Resolution: $H(P^*) \le H(P)$ iff $P^*$ is a sharpened version of $P$, i.e., $\mu_{P^*}(x_i) \le \mu_P(x_i)$ and $\nu_{P^*}(x_i) \ge \nu_P(x_i)$ for $\mu_P(x_i) \le \nu_P(x_i)$, or $\mu_{P^*}(x_i) \ge \mu_P(x_i)$ and $\nu_{P^*}(x_i) \le \nu_P(x_i)$ for $\mu_P(x_i) \ge \nu_P(x_i)$.

In context with Pythagorean fuzzy information, we propose the following Pythagorean fuzzy entropy analogous to measure (10):
$$H_{R}^{S}(P) = \frac{R \times S}{n(S - R)} \sum_{i=1}^{n} \left[\left(\mu_P^{2R}(x_i) + \nu_P^{2R}(x_i) + \pi_P^{2R}(x_i)\right)^{1/R} - \left(\mu_P^{2S}(x_i) + \nu_P^{2S}(x_i) + \pi_P^{2S}(x_i)\right)^{1/S}\right], \tag{12}$$
where $R > 0$ ($R \ne 1$), $S > 0$ ($S \ne 1$), and either $R > 1$, $S < 1$ or $R < 1$, $S > 1$.
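A minimal Python sketch of this measure, assuming the per-element form reconstructed in (12) (each element contributes the (R, S)-norm entropy of the probability triple $(\mu_P^2, \nu_P^2, \pi_P^2)$, averaged over the universe), is given below.

```python
def pfs_rs_entropy(P, R: float, S: float) -> float:
    """(R, S)-norm Pythagorean fuzzy entropy, assuming the form reconstructed in (12).

    P is a list of (mu, nu) pairs; requires R, S > 0, R != 1, S != 1 and
    either R > 1, S < 1 or R < 1, S > 1.
    """
    if R <= 0 or S <= 0 or R == 1 or S == 1:
        raise ValueError("require R, S > 0 with R != 1 and S != 1")
    total = 0.0
    for mu, nu in P:
        pi_sq = max(0.0, 1.0 - mu**2 - nu**2)
        triple = (mu**2, nu**2, pi_sq)
        total += (sum(t**R for t in triple) ** (1.0 / R)
                  - sum(t**S for t in triple) ** (1.0 / S))
    return (R * S) / (len(P) * (S - R)) * total

# A crisp element gives zero entropy; the most uncertain element gives the maximum.
print(pfs_rs_entropy([(1.0, 0.0)], R=3.0, S=0.3))              # 0.0
print(pfs_rs_entropy([(3 ** -0.5, 3 ** -0.5)], R=3.0, S=0.3))  # maximum value
```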

Theorem 6. The proposed entropy measure in (12) is a valid Pythagorean fuzzy information measure.

Proof. To prove this, we shall show that it satisfies all the axioms PFS1 to PFS4.
(i) (PFS1) (Sharpness): If $H_{R}^{S}(P) = 0$, then every term of the sum in (12) must vanish, i.e., for each $x_i \in X$,
$$\left(\mu_P^{2R}(x_i) + \nu_P^{2R}(x_i) + \pi_P^{2R}(x_i)\right)^{1/R} = \left(\mu_P^{2S}(x_i) + \nu_P^{2S}(x_i) + \pi_P^{2S}(x_i)\right)^{1/S}.$$
Since $\mu_P^2(x_i) + \nu_P^2(x_i) + \pi_P^2(x_i) = 1$, this is possible only in the following cases: (a) either $\mu_P(x_i) = 1$, i.e., $\nu_P(x_i) = \pi_P(x_i) = 0$; (b) $\nu_P(x_i) = 1$, i.e., $\mu_P(x_i) = \pi_P(x_i) = 0$; or (c) $\pi_P(x_i) = 1$, i.e., $\mu_P(x_i) = \nu_P(x_i) = 0$. These three cases imply that $P$ is a crisp set. Conversely, if $P$ is a crisp set, then $H_{R}^{S}(P) = 0$, which is obvious.
(ii) (PFS2) (Maximality): In Section 4, we have empirically shown that $H_{R}^{S}(P)$ is maximum iff $\mu_P^2(x_i) = \nu_P^2(x_i) = \pi_P^2(x_i) = 1/3$ for all $x_i \in X$. Analytically, we establish the concavity of $H_{R}^{S}(P)$ by computing its Hessian at this critical point for particular values of $R$ and $S$. The Hessian turns out to be a negative semidefinite matrix for the different admissible values of $R$ and $S$, which shows that $H_{R}^{S}(P)$ is a concave function. Hence, the concavity of the function establishes the maximality property.
(iii) (PFS3) (Symmetry): It is obvious from the definition that $H_{R}^{S}(P) = H_{R}^{S}(P^c)$, since interchanging $\mu_P$ and $\nu_P$ leaves (12) unchanged.
(iv) (PFS4) (Resolution): Let $P^*$ be a sharpened version of $P$. Since $H_{R}^{S}$ is a concave function on the Pythagorean fuzzy set $P$, if $\mu_P(x_i) \le \nu_P(x_i)$, then $\mu_{P^*}(x_i) \le \mu_P(x_i)$ and $\nu_{P^*}(x_i) \ge \nu_P(x_i)$ imply $H_{R}^{S}(P^*) \le H_{R}^{S}(P)$.
Therefore, by the result explained above, we conclude that $H_{R}^{S}$ satisfies the resolution condition PFS4 in this case.
Similarly, if $\mu_P(x_i) \ge \nu_P(x_i)$, then $\mu_{P^*}(x_i) \ge \mu_P(x_i)$ and $\nu_{P^*}(x_i) \le \nu_P(x_i)$, and by using the result proved above, we again conclude that $H_{R}^{S}$ satisfies condition PFS4.
Hence, $H_{R}^{S}$ satisfies all the axioms of Pythagorean fuzzy entropy and, therefore, is a valid measure of Pythagorean fuzzy information.

Theorem 7. Let $P$ and $Q$ be two PFSs defined on $X = \{x_1, x_2, \ldots, x_n\}$ such that, for each $x_i \in X$, either $P \subseteq Q$ or $Q \subseteq P$. Then
$$H_{R}^{S}(P \cup Q) + H_{R}^{S}(P \cap Q) = H_{R}^{S}(P) + H_{R}^{S}(Q).$$

Proof. Divide $X$ into two parts $X_1$ and $X_2$ such that $X_1 = \{x_i \in X : P \subseteq Q\}$, i.e., $\mu_P(x_i) \le \mu_Q(x_i)$ and $\nu_P(x_i) \ge \nu_Q(x_i)$ for $x_i \in X_1$, and $X_2 = \{x_i \in X : Q \subseteq P\}$, i.e., $\mu_P(x_i) \ge \mu_Q(x_i)$ and $\nu_P(x_i) \le \nu_Q(x_i)$ for $x_i \in X_2$. Now, on $X_1$ we have $P \cup Q = Q$ and $P \cap Q = P$, which implies that the contribution of $X_1$ to $H_{R}^{S}(P \cup Q) + H_{R}^{S}(P \cap Q)$ equals its contribution to $H_{R}^{S}(P) + H_{R}^{S}(Q)$. Similarly, on $X_2$ we have $P \cup Q = P$ and $P \cap Q = Q$, and the corresponding contributions also coincide. On adding the above two terms, we get the required result.

Theorem 8. For any Pythagorean fuzzy set $P$, we have $H_{R}^{S}(P) = H_{R}^{S}(P^c)$.

Proof. By definition, the proof is obvious.

4. Monotonicity of the (R, S)-Norm Information Measure of PFS

In this section, we carry out an empirical study to investigate the maximality and the monotonic nature of the proposed (R, S)-norm information measure of PFS. For this, we consider four Pythagorean fuzzy sets $A_1$, $A_2$, $A_3$, and $A_4$ over the universe of discourse $X$. For various values of $R$ and $S$, and using equation (12), we compute and tabulate all the values of $H_{R}^{S}$. On the basis of the tabulated data and the plots given in Table 1 and Figure 2, it is quite clear that $H_{R}^{S}$ takes its maximum value when $\mu(x_i) = \nu(x_i) = \pi(x_i)$ and is a monotonically decreasing function of $R$ and $S$.
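The empirical study can be reproduced with a simple parameter sweep. The sketch below builds on the pfs_rs_entropy function given earlier and uses a hypothetical PFS rather than the four sets of Table 1; it only illustrates how the values for varying R and S would be tabulated.

```python
# Sweep over admissible (R, S) pairs with R > 1 and S < 1; the PFS below is hypothetical.
A = [(0.6, 0.5), (0.3, 0.8), (0.7, 0.2), (0.5, 0.5)]

for R in (2.0, 3.0, 5.0, 8.0):
    values = [round(pfs_rs_entropy(A, R, S), 4) for S in (0.1, 0.3, 0.5, 0.7, 0.9)]
    print(f"R = {R}: {values}")
```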

5. MCDM Algorithm with (R, S)-Norm Entropy

Suppose that there is a set of feasible alternatives $A = \{A_1, A_2, \ldots, A_m\}$ and a set of criteria $C = \{C_1, C_2, \ldots, C_n\}$. The decision-making problem is to select the most suitable alternative out of these $m$ alternatives. The appraisal value of an alternative $A_i$ with respect to the criterion $C_j$ is given by $d_{ij} = \langle \mu_{ij}, \nu_{ij}\rangle$, where $\mu_{ij}$ is the degree to which the alternative $A_i$ satisfies the criterion $C_j$ and $\nu_{ij}$ is the degree to which the alternative $A_i$ does not satisfy the criterion $C_j$, satisfying $\mu_{ij}, \nu_{ij} \in [0, 1]$ and $\mu_{ij}^2 + \nu_{ij}^2 \le 1$, with $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$. This problem can be modeled by representing it through the Pythagorean fuzzy decision matrix $D = (\langle \mu_{ij}, \nu_{ij}\rangle)_{m \times n}$. Let $w = (w_1, w_2, \ldots, w_n)^T$ be the weight vector of all the criteria, where $w_j \ge 0$, $\sum_{j=1}^{n} w_j = 1$, and $w_j$ is the degree of importance of the $j$th criterion. Sometimes the criteria weights are completely unknown and sometimes they are only partially known because of lack of knowledge, time, or data and the limited expertise in the problem domain. In this section, we discuss and devise two methods to determine the weights of the criteria by using the proposed entropy (12).

Case 1 (unknown weights). When the criteria weights are completely unknown, we calculate the weights by using the proposed PFS entropy as
$$w_j = \frac{1 - E_j}{n - \sum_{k=1}^{n} E_k}, \qquad j = 1, 2, \ldots, n, \tag{27}$$
where $0 \le w_j \le 1$, $\sum_{j=1}^{n} w_j = 1$, and
$$E_j = \frac{1}{m} \sum_{i=1}^{m} H_{R}^{S}\big(\langle \mu_{ij}, \nu_{ij}\rangle\big)$$
is the proposed Pythagorean fuzzy entropy for the criterion $C_j$.
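A short sketch of this weighting step follows, assuming the weight formula reconstructed in (27) and the pfs_rs_entropy function given earlier; since the reconstructed entropy is not guaranteed to lie in [0, 1], the column entropies are first scaled by the maximum attainable per-element value, a normalization introduced here for illustration only. The decision matrix is hypothetical.

```python
def entropy_weights(matrix, R=3.0, S=0.3):
    """Criteria weights w_j = (1 - E_j) / (n - sum_k E_k) from column entropies."""
    n_criteria = len(matrix[0])
    columns = [[row[j] for row in matrix] for j in range(n_criteria)]
    # Maximum attainable per-element entropy (at mu^2 = nu^2 = pi^2 = 1/3),
    # used here to scale each E_j into [0, 1].
    h_max = pfs_rs_entropy([(3 ** -0.5, 3 ** -0.5)], R, S)
    E = [pfs_rs_entropy(col, R, S) / h_max for col in columns]
    denom = n_criteria - sum(E)
    return [(1.0 - Ej) / denom for Ej in E]

decision_matrix = [  # rows: alternatives, columns: criteria, entries: (mu, nu)
    [(0.9, 0.3), (0.7, 0.6), (0.5, 0.8)],
    [(0.4, 0.7), (0.9, 0.2), (0.8, 0.1)],
    [(0.8, 0.4), (0.7, 0.5), (0.6, 0.3)],
]
print(entropy_weights(decision_matrix))   # weights sum to 1
```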

Case 2 (partially known weights). In case the weights are only partially known for a multiple-criteria decision-making problem, we use the minimum entropy principle (Wang and Wang [21]) to determine the weight vector of the criteria by constructing the programming model as follows.
The overall entropy of the alternative $A_i$ is
$$E(A_i) = \sum_{j=1}^{n} w_j\, H_{R}^{S}\big(\langle \mu_{ij}, \nu_{ij}\rangle\big),$$
where $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$.

Since there is a fair competitive environment among the alternatives, the weight coefficients with respect to the same criterion should be equal. Further, in order to obtain the ideal weights, we construct the following accompanying model:

$$\min\; E = \sum_{i=1}^{m} E(A_i) = \sum_{i=1}^{m} \sum_{j=1}^{n} w_j\, H_{R}^{S}\big(\langle \mu_{ij}, \nu_{ij}\rangle\big), \tag{30}$$
subject to the partially known weight information, $w_j \ge 0$ for $j = 1, 2, \ldots, n$, and $\sum_{j=1}^{n} w_j = 1$.
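Because the entropy coefficients are constants once the decision matrix is fixed, the model is a linear program in the weights. A sketch using scipy.optimize.linprog is given below; the partial weight information (here simple lower/upper bounds on each weight) and the decision matrix are hypothetical, and pfs_rs_entropy is the function from the earlier sketch.

```python
from scipy.optimize import linprog

def min_entropy_weights(matrix, weight_bounds, R=3.0, S=0.3):
    """Solve (30): minimize sum_i sum_j w_j * H(d_ij) subject to sum_j w_j = 1
    and the partially known weight information given as bounds."""
    n_criteria = len(matrix[0])
    # Cost coefficient of w_j: total entropy of column j over all alternatives.
    cost = [sum(pfs_rs_entropy([row[j]], R, S) for row in matrix)
            for j in range(n_criteria)]
    result = linprog(cost,
                     A_eq=[[1.0] * n_criteria], b_eq=[1.0],
                     bounds=weight_bounds, method="highs")
    return list(result.x)

matrix = [
    [(0.9, 0.3), (0.7, 0.6), (0.5, 0.8)],
    [(0.4, 0.7), (0.9, 0.2), (0.8, 0.1)],
    [(0.8, 0.4), (0.7, 0.5), (0.6, 0.3)],
]
weight_bounds = [(0.25, 0.40), (0.30, 0.50), (0.20, 0.35)]  # hypothetical partial information
print(min_entropy_weights(matrix, weight_bounds))
```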

Finally, the procedure for implementing the proposed algorithm is presented in Figure 3.

The steps of the proposed methodology are enumerated and detailed as follows.

Step 1. Construct the decision matrix $D = (d_{ij})_{m \times n}$, where the elements $d_{ij} = \langle \mu_{ij}, \nu_{ij}\rangle$ are the appraisals of the alternative $A_i$ with respect to the criterion $C_j$.

Step 2. Determine the criteria weights by using (27) or (30), as applicable.

Step 3. Define the most preferred solution $A^+$ and the least preferred solution $A^-$ as
$$A^+ = \big\{\langle C_j, \mu_j^+, \nu_j^+\rangle \mid j = 1, 2, \ldots, n\big\},$$
where $\mu_j^+ = \max_i \mu_{ij}$ and $\nu_j^+ = \min_i \nu_{ij}$, and
$$A^- = \big\{\langle C_j, \mu_j^-, \nu_j^-\rangle \mid j = 1, 2, \ldots, n\big\},$$
where $\mu_j^- = \min_i \mu_{ij}$ and $\nu_j^- = \max_i \nu_{ij}$, respectively.

Step 4. By using Definition 5, evaluate the distance measures of each alternative $A_i$ from $A^+$ and from $A^-$, denoted by $D(A_i, A^+)$ and $D(A_i, A^-)$, respectively.

Step 5. Determine the relative degree of closeness of each alternative as follows:
$$C(A_i) = \frac{D(A_i, A^-)}{D(A_i, A^+) + D(A_i, A^-)}, \qquad i = 1, 2, \ldots, m.$$

Step 6. On the basis of the relative degrees of closeness obtained in Step 5, determine the optimal ranking order of the alternatives. The alternative with the maximal degree of closeness is taken to be the best alternative.
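Steps 3–6 can be sketched together in Python as follows, using the list-of-pairs decision matrix from the earlier sketches; here the distances of Step 4 are taken as weighted Hamming distances (an assumption, since the algorithm only refers to Definition 5) and the relative closeness of Step 5 as the ratio $D(A_i, A^-)/(D(A_i, A^+) + D(A_i, A^-))$.

```python
def rank_alternatives(matrix, weights):
    """Steps 3-6: build A+ and A-, compute weighted distances and relative
    closeness, and rank the alternatives (best first)."""
    m, n = len(matrix), len(matrix[0])
    # Step 3: most and least preferred solutions, criterion by criterion.
    a_plus = [(max(matrix[i][j][0] for i in range(m)),
               min(matrix[i][j][1] for i in range(m))) for j in range(n)]
    a_minus = [(min(matrix[i][j][0] for i in range(m)),
                max(matrix[i][j][1] for i in range(m))) for j in range(n)]

    def weighted_hamming(row, ref):
        total = 0.0
        for w, (mu, nu), (mu_r, nu_r) in zip(weights, row, ref):
            pi_sq = max(0.0, 1.0 - mu**2 - nu**2)
            pi_r_sq = max(0.0, 1.0 - mu_r**2 - nu_r**2)
            total += w * (abs(mu**2 - mu_r**2) + abs(nu**2 - nu_r**2)
                          + abs(pi_sq - pi_r_sq))
        return total / 2.0

    closeness = []
    for row in matrix:
        d_plus = weighted_hamming(row, a_plus)    # Step 4: distance from A+
        d_minus = weighted_hamming(row, a_minus)  #         distance from A-
        closeness.append(d_minus / (d_plus + d_minus))  # Step 5
    # Step 6: larger relative closeness means a better alternative.
    order = sorted(range(m), key=lambda i: closeness[i], reverse=True)
    return order, closeness

# Example with the hypothetical data from the earlier sketches:
# order, closeness = rank_alternatives(decision_matrix, entropy_weights(decision_matrix))
```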

6. Numerical Examples

Based on two different cases considered in the proposed algorithm, we present two different examples as follows.

Example 1 (unknown weights). Suppose an automobile company produces four different cars, say, $A_1$, $A_2$, $A_3$, and $A_4$, and a customer wants to buy a car based on four given criteria, say, comfort $(C_1)$, good mileage $(C_2)$, safety $(C_3)$, and interiors $(C_4)$. Assume that the appraisal values of the alternatives with respect to each criterion, as provided by the expert, are represented by a Pythagorean fuzzy decision matrix. The calculations for the ranking procedure are then as follows: (1) the criteria weight vector is calculated using (27); (2) the most preferred solution $A^+$ and the least preferred solution $A^-$ are obtained; (3) the distances between each $A_i$ and $A^+$ and $A^-$ are computed; (4) the values of the relative degree of closeness are obtained; (5) the alternatives are ranked as per the relative degree of closeness, and the best available alternative is identified. It may be noted that the above ranking is with respect to the specific values $R = 3$ and $S = 0.3$. The consistency of the ranking procedure for different values of the parameters $R$ and $S$ may also be observed and studied by carrying out a simulation study over varying values of the parameters, depending on the requirement.

Example 2 (partially known weights). Suppose there are 1000 students in a college. On the basis of three selected criteria, say, personality $(C_1)$, intelligence $(C_2)$, and communication skills $(C_3)$, the administration wants to select a college representative. Let there be three candidates, say, $A_1$, $A_2$, and $A_3$, whose appraisal values constitute the PFS decision matrix for the above problem.

Let the information about the criteria weights be partially given in the form of a set of constraints on the weights. The calculations of the ranking procedure for the above decision-making problem are presented as follows: (1) using (30), the criteria weights are determined by the corresponding linear programming model, which is solved using MATLAB software to obtain the criteria weight vector; (2) the most preferred solution $A^+$ and the least preferred solution $A^-$ are obtained; (3) the distances between each $A_i$ and $A^+$ and $A^-$ are computed; (4) the values of the relative degree of closeness are obtained; (5) the alternatives are ranked as per the relative degree of closeness, and the best available alternative is identified. It may be noted that the above ranking is with respect to the specific values $R = 3$ and $S = 0.3$. The consistency of the ranking procedure for different values of the parameters $R$ and $S$ may also be observed and studied by carrying out a simulation study over varying values of the parameters.

7. Conclusions and Future Work

We have successfully proposed a new parametric (R, S)-norm information measure for the Pythagorean fuzzy set along with the proof of its validity and have discussed its maximality and monotonic behavior with respect to the parameters under consideration. Further, an algorithm for the multicriteria decision-making problem has been proposed and successfully implemented with the help of two different kinds of numerical examples: one when the weights are unknown and the other when the weights are partially known.

In the area of pattern recognition, directed divergence measures and symmetric divergence measures quantify the dissimilarity between pairs of probability distributions and are generally utilized in procedures of statistical inference. It may be noted that divergence measures and similarity measures are dual concepts: a similarity measure may be characterized as a decreasing function of a divergence measure, particularly when the range of the divergence measure is bounded.

The proposed parametric (R, S)-norm information measure can further be extended to the concept of a parametric directed divergence measure/symmetric divergence measure for Pythagorean fuzzy sets. Various applications, including total ambiguity and information improvement measures based on the proposed divergence measures, may also be discussed in detail; e.g., by taking a monotonic decreasing function into account, the upper bound of the symmetric divergence measure can be calculated and the similarity measure can subsequently be defined between two PFSs. The concept of the similarity-based clustering method (SCM) can also be investigated, and the structure of a given data set can be examined with the help of the similarity measure.

Data Availability

The data used for the implementation of the proposed algorithm in the numerical examples are hypothetical and have no connection with any particular agency's data.

Disclosure

This article does not contain any studies with human participants or animals performed by any of the authors.

Conflicts of Interest

The authors declare that they have no conflicts of interest.