Abstract

This paper focuses on multiattribute group decision-making (MAGDM) problems with interval-valued intuitionistic fuzzy values (IVIFVs) and develops a consensus reaching model with minimum adjustments to improve the consensus among decision-makers (DMs). To check the consensus, a consensus index is introduced by measuring the distance between each individual decision matrix and the collective one. For group decision-making with unacceptable consensus, Consensus Rule 1 and Consensus Rule 2 are, respectively, proposed by minimizing the adjustment amounts of individual decision matrices. According to these two consensus rules, two algorithms are devised to help DMs reach acceptable consensus, and the convergence of both algorithms is proved. To determine the weights of attributes, an interval-valued intuitionistic fuzzy program is constructed by maximizing the comprehensive values of alternatives. Alternatives are then ranked based on their comprehensive values. Thereby, a novel method is proposed to solve MAGDM with IVIFVs. Finally, a numerical example is examined to illustrate the effectiveness of the proposed method.

1. Introduction

Multiattribute group decision-making (MAGDM), where several decision-makers (DMs) evaluate a finite set of alternatives with respect to multiple attributes and select the best one, has been applied in many fields, such as supply chain management, risk investment, and industrial engineering. In classical MAGDM, attribute values are usually represented as real numbers. However, due to the ambiguity of human thinking and the lack of decision information and time, DMs are often unable to express their opinions precisely with real numbers. After Zadeh [1] introduced the fuzzy set (FS), more and more DMs used fuzzy sets to describe their opinions [2–4]. Nevertheless, an FS characterizes fuzziness only by the membership degree. Later, Atanassov [5] extended the FS and introduced the intuitionistic fuzzy set (IFS), which uses real numbers to express the membership, nonmembership, and hesitancy of an element of a given set. In 1989, Atanassov and Gargov [6] generalized the IFS to the interval-valued IFS (IVIFS), in which the membership, nonmembership, and hesitancy are represented by intervals. In recent years, MAGDM with interval-valued intuitionistic fuzzy values (IVIFVs) has received wide attention [7–16].

In group decision-making with IVIFVs, different DMs come from various research fields and have varying perceptions, attitudes, and motivations. Thus, they may have different preferences for the same decision problem and provide distinct opinions on alternatives. Thereby, inconsistency among DMs' opinions is inevitable. If individual opinions are aggregated directly without consensus, the final decision results may fail to represent the opinion of the group and the group decision-making may be meaningless. Hence, the consensus process, which helps DMs bring their opinions closer, is one of the key issues in MAGDM for reaching a collective decision result accepted by most DMs. To date, numerous methods [17–25] have been developed to reach consensus. A popular method, proposed by Dong et al. [21], aids DMs in reaching consensus by minimizing the adjustments between original individual decision matrices and adjusted ones. Later, this method was improved and extended to different environments. For example, Zhang et al. [19] improved this method by measuring the adjustment with distances and the number of adjusted elements in the real-number scenario. Wu et al. [22] and Zhang et al. [23] extended the method of [21] to the linguistic distribution and incomplete linguistic distribution contexts, respectively, and constructed two different feedback-mechanism-based consensus models with minimum adjustment cost. Subsequently, Zhang et al. [24] and Dong et al. [25], respectively, developed a minimum adjustment distance consensus rule and a minimum number of adjusted simple terms consensus model under the hesitant fuzzy linguistic environment. In fact, consensus models with minimum adjustments have been successfully applied in many contexts, such as social networks, opinion dynamics, and dishonest MAGDM [26–28].

Although the consensus model with minimum adjustment has been widely used in GDM, it has not yet been applied to GDM problems with IVIFVs. Meanwhile, research on the consensus of GDM with IVIFVs is scarce; only Zhang and Xu [16] and Cheng [10] have discussed this issue. Zhang and Xu [16] presented a consensus index based on dominance relations between alternatives. Employing similarity degrees between individual preference vectors and the group one, Cheng [10] presented another consensus index and proposed an iterative approach to improve the consensus. To enrich the study of consensus models in the IVIF context, this paper presents a new consensus feedback mechanism by generalizing the consensus model with minimum adjustment [19] to the IVIF environment.

After reaching consensus in MAGDM with IVIFVs, the next key issue is how to determine the weights of attributes, which play a significant role in aggregating individual opinions into a collective one. To determine attributes' weights, different mathematical programs have been constructed [8, 9, 11, 15]. For example, by maximizing the comprehensive values of alternatives, Wan et al. [15] and Hajiagha et al. [11] constructed distinct mathematical models to determine attribute weights. The difference between them is that the former is an interval program, while the latter is an evolving linear program. By maximizing the weighted scores of alternatives, Chen and Huang [9] built a linear programming model to obtain attributes' weights. Chen [8] set up a nonlinear program to derive attributes' weights by maximizing inclusion-based closeness coefficients of alternatives. Finally, alternatives are ranked by distinct decision or aggregation methods, such as the plant growth simulation method [14], the inclusion-based TOPSIS method [8], the permutation method [7], the extended ELECTRE [12, 29], and IVIF power Heronian aggregation operators [13].

Although previous studies are effective for solving MAGDM with IVIFVs, there are still some limitations as follows:

Most existing methods [7–9, 11–15] ignored the consensus before integrating individual opinions. Although methods [10, 16] discussed the consensus, method [16] only introduced a consensus index but did not provide any approach to improving the consensus. Although method [10] designed an algorithm for reaching consensus, the convergence of this algorithm was not proved. In fact, this algorithm is sometimes unable to help DMs reach the predefined level of consensus, which is verified in Section 6.2.3.

Some methods [12–14, 16] assigned attributes' weights in advance, which may result in subjective randomness. Although methods [8, 9, 15, 29] determined attributes' weights objectively by constructing and solving mathematical programs, the determined weights are real numbers. Considering the aforementioned advantages of IVIFSs over real numbers, it is more suitable to represent attributes' weights by IVIFVs.

Since the attribute values of alternatives are IVIFVs, it is reasonable that the comprehensive values of alternatives should be IVIFVs, too. Thus, the decision information supplied by DMs can be retained as much as possible. However, the comprehensive values of alternatives derived by methods [8, 12, 14, 15, 29] are real numbers or intervals. This may lead to the loss or distortion of decision information to some extent.

To overcome the above limitations, this paper discusses the consensus of MAGDM with IVIFVs. A consensus index is introduced to check the degree of consensus among DMs. To improve the consensus, Consensus Rule 1 and Consensus Rule 2 are presented by minimizing the adjustment amounts of the original individual decision matrices. The difference between these two consensus rules is that Consensus Rule 1 minimizes the distances between the original decision matrices and the adjusted ones, while Consensus Rule 2 minimizes the number of adjusted elements in the original matrices. Subsequently, by maximizing the comprehensive values of alternatives, an IVIF program is constructed and solved to determine attributes' weights objectively. Finally, comprehensive values of alternatives are generated and the alternatives are ranked.

Compared with existing methods, the proposed method has the following prominent characteristics:

Before aggregating individual decision matrices, the consensus among DMs is considered. A simple index is introduced to measure the consensus degree among DMs, and two consensus rules are presented to help DMs reach an acceptable consensus. Furthermore, the convergence of the two corresponding algorithms is proved explicitly. Thus, it is guaranteed that the consensus among DMs can achieve the predefined consensus degree for any MAGDM with IVIFVs.

To determine attributes' weights, an IVIF program is built and a new solving approach is provided. First, DMs assign attributes' weights in the form of IVIFVs. Afterwards, accurate attributes' weights are objectively determined by solving the built IVIF program. Thus, not only is the initiative of DMs exploited, but the objectivity of attributes' weights is also ensured.

The comprehensive values of alternatives obtained by the proposed method are in the form of IVIFVs, which is consistent with the form of attribute values provided by DMs. Thus, the decision information may be retained as much as possible. Therefore, the decision results based on comprehensive values may be more reasonable.

The remainder of this paper is organized as follows: Section 2 reviews some definitions of IVIFSs and describes MAGDM problems with IVIFVs. Section 3 introduces a consensus index for measuring the degree of consensus among DMs and defines two types of adjustment amounts used in the consensus reaching process. Section 4 presents two consensus rules for reaching consensus. In Section 5, a multiobjective interval intuitionistic fuzzy program is constructed and solved to determine attributes’ weights objectively. At the end of this section, a novel method is developed to solve MAGDM problems with IVIFVs. Section 6 provides a numerical example to show the application of the proposed method. The paper ends with some conclusions in Section 7.

2. Preliminaries

To facilitate subsequent analyses, this section reviews some definitions related to IVIFSs and describes MAGDM problems with IVIFVs.

2.1. Interval-Valued Intuitionistic Fuzzy Set

Definition 1 (see [6]). An interval-valued intuitionistic fuzzy set $\tilde{A}$ in a universe $X$ is defined as
$$\tilde{A}=\{\langle x,\tilde{\mu}_{\tilde{A}}(x),\tilde{\nu}_{\tilde{A}}(x)\rangle\mid x\in X\},$$
where $\tilde{\mu}_{\tilde{A}}(x)\subseteq[0,1]$ and $\tilde{\nu}_{\tilde{A}}(x)\subseteq[0,1]$ are intervals with $\sup\tilde{\mu}_{\tilde{A}}(x)+\sup\tilde{\nu}_{\tilde{A}}(x)\leq 1$ for any $x\in X$. The intervals $\tilde{\mu}_{\tilde{A}}(x)$ and $\tilde{\nu}_{\tilde{A}}(x)$ represent the membership degree and nonmembership degree of element $x$ to the IVIFS $\tilde{A}$, respectively. Denote $\tilde{\mu}_{\tilde{A}}(x)=[\mu_{\tilde{A}}^{-}(x),\mu_{\tilde{A}}^{+}(x)]$ and $\tilde{\nu}_{\tilde{A}}(x)=[\nu_{\tilde{A}}^{-}(x),\nu_{\tilde{A}}^{+}(x)]$. Therefore, the IVIFS $\tilde{A}$ can be equivalently expressed as
$$\tilde{A}=\{\langle x,[\mu_{\tilde{A}}^{-}(x),\mu_{\tilde{A}}^{+}(x)],[\nu_{\tilde{A}}^{-}(x),\nu_{\tilde{A}}^{+}(x)]\rangle\mid x\in X\},$$
where $0\leq\mu_{\tilde{A}}^{-}(x)\leq\mu_{\tilde{A}}^{+}(x)\leq 1$, $0\leq\nu_{\tilde{A}}^{-}(x)\leq\nu_{\tilde{A}}^{+}(x)\leq 1$, and $\mu_{\tilde{A}}^{+}(x)+\nu_{\tilde{A}}^{+}(x)\leq 1$ for any $x\in X$.

In addition, $\tilde{\pi}_{\tilde{A}}(x)=[\pi_{\tilde{A}}^{-}(x),\pi_{\tilde{A}}^{+}(x)]$ is called the hesitancy index of element $x$ to $\tilde{A}$, where $\pi_{\tilde{A}}^{-}(x)=1-\mu_{\tilde{A}}^{+}(x)-\nu_{\tilde{A}}^{+}(x)$ and $\pi_{\tilde{A}}^{+}(x)=1-\mu_{\tilde{A}}^{-}(x)-\nu_{\tilde{A}}^{-}(x)$. The pair $([\mu_{\tilde{A}}^{-}(x),\mu_{\tilde{A}}^{+}(x)],[\nu_{\tilde{A}}^{-}(x),\nu_{\tilde{A}}^{+}(x)])$ is called an IVIFV and simply denoted by $\tilde{\alpha}=([a,b],[c,d])$, where $[a,b]\subseteq[0,1]$, $[c,d]\subseteq[0,1]$, and $b+d\leq 1$.

Definition 2 (see [6]). Let $\tilde{\alpha}_{1}=([a_{1},b_{1}],[c_{1},d_{1}])$ and $\tilde{\alpha}_{2}=([a_{2},b_{2}],[c_{2},d_{2}])$ be two IVIFVs, and then we stipulate: (1) $\tilde{\alpha}_{1}=\tilde{\alpha}_{2}$ if and only if $[a_{1},b_{1}]=[a_{2},b_{2}]$ and $[c_{1},d_{1}]=[c_{2},d_{2}]$; (2) $\tilde{\alpha}_{1}\leq\tilde{\alpha}_{2}$ if and only if $a_{1}\leq a_{2}$, $b_{1}\leq b_{2}$ and $c_{1}\geq c_{2}$, $d_{1}\geq d_{2}$; (3) $\tilde{\alpha}_{1}\oplus\tilde{\alpha}_{2}=([a_{1}+a_{2}-a_{1}a_{2},b_{1}+b_{2}-b_{1}b_{2}],[c_{1}c_{2},d_{1}d_{2}])$; (4) $\lambda\tilde{\alpha}_{1}=([1-(1-a_{1})^{\lambda},1-(1-b_{1})^{\lambda}],[c_{1}^{\lambda},d_{1}^{\lambda}])$ for $\lambda>0$.

Definition 3 (see [30]). Let $\tilde{\alpha}=([a,b],[c,d])$ be an IVIFV. Then
$$S(\tilde{\alpha})=\frac{a+b-c-d}{2},\qquad H(\tilde{\alpha})=\frac{a+b+c+d}{2}$$
are called the score function and accuracy function of the IVIFV $\tilde{\alpha}$, respectively.

Definition 4 (see [30]). Let $\tilde{\alpha}_{1}$ and $\tilde{\alpha}_{2}$ be two IVIFVs. Then,
If $S(\tilde{\alpha}_{1})<S(\tilde{\alpha}_{2})$, then $\tilde{\alpha}_{1}<\tilde{\alpha}_{2}$;
If $S(\tilde{\alpha}_{1})=S(\tilde{\alpha}_{2})$, then (i) If $H(\tilde{\alpha}_{1})<H(\tilde{\alpha}_{2})$, then $\tilde{\alpha}_{1}<\tilde{\alpha}_{2}$; (ii) If $H(\tilde{\alpha}_{1})=H(\tilde{\alpha}_{2})$, then $\tilde{\alpha}_{1}=\tilde{\alpha}_{2}$.

Definition 5 (see [30]). Let $\tilde{\alpha}_{1}=([a_{1},b_{1}],[c_{1},d_{1}])$ and $\tilde{\alpha}_{2}=([a_{2},b_{2}],[c_{2},d_{2}])$ be two IVIFVs. Then, the Manhattan distance between $\tilde{\alpha}_{1}$ and $\tilde{\alpha}_{2}$ is defined as
$$d(\tilde{\alpha}_{1},\tilde{\alpha}_{2})=\frac{1}{4}\left(|a_{1}-a_{2}|+|b_{1}-b_{2}|+|c_{1}-c_{2}|+|d_{1}-d_{2}|\right).$$

According to Definition 5, the Manhattan distance between two IVIF matrices is defined in Definition 6.

Definition 6. Let $R^{1}=(\tilde{r}_{ij}^{1})_{m\times n}$ and $R^{2}=(\tilde{r}_{ij}^{2})_{m\times n}$ be two IVIF matrices whose elements are IVIFVs, where $i=1,2,\ldots,m$ and $j=1,2,\ldots,n$. The Manhattan distance between $R^{1}$ and $R^{2}$ is defined as
$$d(R^{1},R^{2})=\frac{1}{mn}\sum_{i=1}^{m}\sum_{j=1}^{n}d(\tilde{r}_{ij}^{1},\tilde{r}_{ij}^{2}).$$

Definition 7. Let $\tilde{\alpha}_{i}=([a_{i},b_{i}],[c_{i},d_{i}])$ $(i=1,2,\ldots,n)$ be a collection of IVIFVs. If
$$\mathrm{IVIFWM}(\tilde{\alpha}_{1},\tilde{\alpha}_{2},\ldots,\tilde{\alpha}_{n})=\left(\left[\sum_{i=1}^{n}w_{i}a_{i},\sum_{i=1}^{n}w_{i}b_{i}\right],\left[\sum_{i=1}^{n}w_{i}c_{i},\sum_{i=1}^{n}w_{i}d_{i}\right]\right), \tag{7}$$
where $w=(w_{1},w_{2},\ldots,w_{n})^{T}$ is an associated weight vector of $\tilde{\alpha}_{i}$ $(i=1,2,\ldots,n)$, satisfying $w_{i}\geq 0$ and $\sum_{i=1}^{n}w_{i}=1$, then IVIFWM is called an IVIF weighted mean operator of dimension $n$. Particularly, when $w=(1/n,1/n,\ldots,1/n)^{T}$, the IVIFWM is called an IVIF mean operator.
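To make the preliminaries concrete, here is a minimal Python sketch of Definitions 3 and 5-7, assuming an IVIFV $([a,b],[c,d])$ is stored as a plain 4-tuple (a, b, c, d); all names are illustrative and not part of the original method.

```python
# Sketch of Definitions 3 and 5-7, assuming an IVIFV ([a,b],[c,d])
# is stored as a 4-tuple (a, b, c, d); names are illustrative only.

def score(v):
    """Score function S = (a + b - c - d) / 2 (Definition 3)."""
    a, b, c, d = v
    return (a + b - c - d) / 2

def accuracy(v):
    """Accuracy function H = (a + b + c + d) / 2 (Definition 3)."""
    a, b, c, d = v
    return (a + b + c + d) / 2

def manhattan(v1, v2):
    """Manhattan distance between two IVIFVs (Definition 5)."""
    return sum(abs(x - y) for x, y in zip(v1, v2)) / 4

def matrix_distance(R1, R2):
    """Manhattan distance between two IVIF matrices (Definition 6)."""
    m, n = len(R1), len(R1[0])
    return sum(manhattan(R1[i][j], R2[i][j])
               for i in range(m) for j in range(n)) / (m * n)

def ivifwm(values, weights):
    """IVIF weighted mean operator (Definition 7): componentwise
    weighted average of the interval endpoints."""
    return tuple(sum(w * v[k] for v, w in zip(values, weights))
                 for k in range(4))
```

For instance, `score((0.4, 0.6, 0.1, 0.2))` evaluates $(0.4+0.6-0.1-0.2)/2 = 0.35$.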

2.2. Multiattribute Group Decision-Making Problems with IVIFVs

For the sake of convenience, let $M=\{1,2,\ldots,m\}$, $N=\{1,2,\ldots,n\}$, and $T=\{1,2,\ldots,t\}$; the MAGDM problem concerned in this paper is described as follows.

Let $A=\{A_{1},A_{2},\ldots,A_{m}\}$ be a discrete set of alternatives. Let $C=\{C_{1},C_{2},\ldots,C_{n}\}$ be a finite set of attributes. Assume the weight vector of attributes is $\tilde{w}=(\tilde{w}_{1},\tilde{w}_{2},\ldots,\tilde{w}_{n})^{T}$, where $\tilde{w}_{j}$ $(j\in N)$ are IVIFVs. Let $E=\{e_{1},e_{2},\ldots,e_{t}\}$ be a set of DMs whose weight vector is $\lambda=(\lambda_{1},\lambda_{2},\ldots,\lambda_{t})^{T}$ with $\lambda_{k}\geq 0$ and $\sum_{k=1}^{t}\lambda_{k}=1$. Suppose that DM $e_{k}$ provides an IVIF decision matrix $R^{k}=(\tilde{r}_{ij}^{k})_{m\times n}$, where $\tilde{r}_{ij}^{k}=([a_{ij}^{k},b_{ij}^{k}],[c_{ij}^{k},d_{ij}^{k}])$ represents the performance of the alternative $A_{i}$ with respect to the attribute $C_{j}$ supplied by DM $e_{k}$.

For solving the above MAGDM problems with IVIFVs, two processes, the consensus process and the selection process, are necessary. The consensus process aims to reach a high degree of consensus among DMs, which guarantees that the final decision results obtained in the selection process are accepted by most DMs. The selection process is to obtain the final decision results based on individual decision matrices. In the consensus process, this paper focuses on how to measure the degree of consensus among DMs and how to reach an acceptable consensus degree. In the selection process, this paper proposes an IVIF program based method for ranking alternatives.

3. Consensus Index and Adjustment Amounts in MAGDM with IVIFVs

This section introduces a distance-based consensus index to measure the degree of consensus among DMs and defines an acceptable consensus. If the consensus degree among DMs does not reach the defined acceptable level, it is necessary to adjust the original individual decision matrices to improve the consensus. In this process, how to measure the adjustment amounts of the adjusted matrices from the original decision matrices is a key question, and this section defines two different types of adjustment amounts for it.

3.1. Consensus Index in MAGDM with IVIFVs

The consensus index for GDM is often introduced by measuring the proximity degree between individual performances and the group performance. The distance function is a popular tool for measuring the proximity degree. Therefore, according to Definition 6, this subsection introduces a consensus index for GDM with IVIFVs.

Let $R^{k}=(\tilde{r}_{ij}^{k})_{m\times n}$ $(k\in T)$ be individual decision matrices and $R^{c}=(\tilde{r}_{ij}^{c})_{m\times n}$ be the collective one obtained by aggregating the individual decision matrices, where $\tilde{r}_{ij}^{c}=([a_{ij}^{c},b_{ij}^{c}],[c_{ij}^{c},d_{ij}^{c}])=\mathrm{IVIFWM}(\tilde{r}_{ij}^{1},\tilde{r}_{ij}^{2},\ldots,\tilde{r}_{ij}^{t})$ with the weight vector $\lambda$. The consensus index is introduced at three levels.

Level 1. Consensus degrees of alternatives on attributes: The consensus degree of DM $e_{k}$ on alternative $A_{i}$ with respect to attribute $C_{j}$ is computed as
$$CI_{ij}^{k}=d(\tilde{r}_{ij}^{k},\tilde{r}_{ij}^{c}). \tag{8}$$

Level 2. Consensus degrees on alternatives: The consensus degree of DM $e_{k}$ on alternative $A_{i}$ is computed as
$$CI_{i}^{k}=\frac{1}{n}\sum_{j=1}^{n}CI_{ij}^{k}. \tag{9}$$

Level 3. Consensus degree on decision matrices: The consensus degree of DM $e_{k}$ is computed as
$$CI^{k}=\frac{1}{m}\sum_{i=1}^{m}CI_{i}^{k}. \tag{10}$$

Thus, the consensus index of the group is defined as
$$CI=\frac{1}{t}\sum_{k=1}^{t}CI^{k}. \tag{11}$$

Plugging (8)-(10) into (11), (11) can be written as
$$CI=\frac{1}{tmn}\sum_{k=1}^{t}\sum_{i=1}^{m}\sum_{j=1}^{n}d(\tilde{r}_{ij}^{k},\tilde{r}_{ij}^{c}). \tag{12}$$

It is shown from (12) that full consensus is reached if $CI=0$. Otherwise, the smaller the consensus index $CI$, the higher the consensus among DMs.

Definition 8. Let $\varepsilon$ be a predefined threshold. If $CI\leq\varepsilon$, the group is said to be of acceptable consensus. Otherwise, the group is said to be of unacceptable consensus.
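The three-level consensus index and the acceptability check of Definition 8 can be sketched as follows, assuming each decision matrix is stored as an array of shape (m, n, 4) and the collective matrix is the weighted mean of the individual ones (the linear IVIFWM aggregation assumed above); all names are illustrative.

```python
# Sketch of the consensus index (12), assuming each decision matrix is
# an array of shape (m, n, 4) holding (a, b, c, d) per entry and the
# collective matrix is the lambda-weighted mean of the individual ones.
import numpy as np

def consensus_index(matrices, weights):
    """CI = average Manhattan distance of each individual matrix from
    the collective matrix (eq. (12)); CI = 0 means full consensus."""
    R = np.asarray(matrices, dtype=float)           # shape (t, m, n, 4)
    w = np.asarray(weights, dtype=float)
    collective = np.tensordot(w, R, axes=1)         # shape (m, n, 4)
    # per-entry Manhattan distance, then average over k, i, j
    dist = np.abs(R - collective).sum(axis=-1) / 4  # shape (t, m, n)
    return dist.mean()

def acceptable(matrices, weights, eps):
    """Definition 8: the group is of acceptable consensus if CI <= eps."""
    return consensus_index(matrices, weights) <= eps
```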

3.2. Adjustment Amounts in MAGDM with IVIFVs

For a group with unacceptable consensus, it is necessary to adjust the individual matrices until they reach acceptable consensus. Denote the adjusted matrices by $\bar{R}^{k}=(\tilde{\bar{r}}_{ij}^{k})_{m\times n}$ with $\tilde{\bar{r}}_{ij}^{k}=([\bar{a}_{ij}^{k},\bar{b}_{ij}^{k}],[\bar{c}_{ij}^{k},\bar{d}_{ij}^{k}])$, where $k\in T$, $i\in M$, and $j\in N$. For convenience, let $R=\{R^{1},R^{2},\ldots,R^{t}\}$ be the set of original decision matrices and $\bar{R}=\{\bar{R}^{1},\bar{R}^{2},\ldots,\bar{R}^{t}\}$ be the set of adjusted decision matrices. As we know, the smaller the adjustment amount of the set $\bar{R}$ from $R$, the more decision information the adjusted matrices retain. Thereby, how to measure adjustment amounts is an important issue. In [18–20], distances between original matrices and adjusted ones were applied to measure the adjustment amounts. According to Definition 6, this paper presents a Manhattan distance-based adjustment amount as
$$D(\bar{R},R)=\frac{1}{t}\sum_{k=1}^{t}d(\bar{R}^{k},R^{k}). \tag{13}$$

The adjustment amount in (13) describes the average deviation of all adjusted matrices from their original matrices. The smaller $D(\bar{R},R)$, the closer the adjusted matrices are to their corresponding original matrices and hence the more decision information the adjusted matrices preserve.

Sometimes, DMs prefer to use the number of adjusted elements in the original matrices as a measure of the adjustment amount: the fewer the adjusted elements, the smaller the adjustment amount. In this case, another measure for the adjustment amount of $\bar{R}$ from $R$ is proposed as
$$N(\bar{R},R)=N_{a}+N_{b}+N_{c}+N_{d}, \tag{14}$$
where $N_{a}$, $N_{b}$, $N_{c}$, and $N_{d}$, respectively, indicate the numbers of the adjusted $a_{ij}^{k}$, $b_{ij}^{k}$, $c_{ij}^{k}$, and $d_{ij}^{k}$, i.e.,
$$N_{a}=\sum_{k=1}^{t}\sum_{i=1}^{m}\sum_{j=1}^{n}\delta_{ij}^{k,a}\quad\text{with}\quad \delta_{ij}^{k,a}=\begin{cases}0,&\bar{a}_{ij}^{k}=a_{ij}^{k},\\ 1,&\bar{a}_{ij}^{k}\neq a_{ij}^{k},\end{cases} \tag{15}$$
and $N_{b}$, $N_{c}$, and $N_{d}$ are defined analogously by (16), (17), and (18) with binary indicators $\delta_{ij}^{k,b}$, $\delta_{ij}^{k,c}$, and $\delta_{ij}^{k,d}$.

In (14), the adjustment amount $N(\bar{R},R)$ counts the total number of adjusted elements in the consensus reaching process. If $N(\bar{R},R)=0$, no element of any original decision matrix is adjusted, i.e., $\bar{R}^{k}=R^{k}$ for any $k\in T$. The larger $N(\bar{R},R)$, the more elements of the original matrices are adjusted, namely, the greater the adjustment amount.
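Both adjustment amounts can be computed directly from the stacked matrices. A sketch under the same (t, m, n, 4) array layout, with a tolerance `tol` standing in for the exact equality tests in (15)-(18):

```python
# Sketch of the two adjustment amounts: the distance-based amount D in
# (13) and the count-based amount N in (14); matrices are arrays of
# shape (t, m, n, 4), and `tol` guards floating-point comparisons.
import numpy as np

def adjustment_distance(adjusted, original):
    """D = average Manhattan distance between the adjusted and the
    original matrices (eq. (13))."""
    A, O = np.asarray(adjusted, float), np.asarray(original, float)
    return (np.abs(A - O).sum(axis=-1) / 4).mean()

def adjustment_count(adjusted, original, tol=1e-9):
    """N = total number of adjusted components a, b, c, d over all
    matrices (eq. (14)); N = 0 means nothing was changed."""
    A, O = np.asarray(adjusted, float), np.asarray(original, float)
    return int((np.abs(A - O) > tol).sum())
```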

To preserve the decision information as much as possible, it is sensible to minimize the adjustment amount while reaching consensus. Bearing this idea in mind and employing the above two adjustment amounts, we propose two consensus rules for reaching consensus in the sequel.

4. Two Consensus Rules with Minimum Adjustment Amounts

By minimizing the two types of adjustment amounts mentioned in Section 3, respectively, this section develops two consensus rules to reach consensus. Consensus Rule 1 minimizes the Manhattan distance-based adjustment amount $D(\bar{R},R)$, while Consensus Rule 2 minimizes the total number of adjusted elements $N(\bar{R},R)$.

4.1. Consensus Rule 1 for Reaching Consensus

Due to the fact that the adjusted matrices $\bar{R}^{k}$ $(k\in T)$ are considered as the final decision matrices, it is natural that the collective matrix should be obtained by aggregating the matrices $\bar{R}^{k}$ with the IVIFWM operator in (7). Denote the obtained collective matrix by $\bar{R}^{c}=(\tilde{\bar{r}}_{ij}^{c})_{m\times n}$, where $\tilde{\bar{r}}_{ij}^{c}=([\bar{a}_{ij}^{c},\bar{b}_{ij}^{c}],[\bar{c}_{ij}^{c},\bar{d}_{ij}^{c}])$. Therefore, it yields that
$$\tilde{\bar{r}}_{ij}^{c}=\left(\left[\sum_{k=1}^{t}\lambda_{k}\bar{a}_{ij}^{k},\sum_{k=1}^{t}\lambda_{k}\bar{b}_{ij}^{k}\right],\left[\sum_{k=1}^{t}\lambda_{k}\bar{c}_{ij}^{k},\sum_{k=1}^{t}\lambda_{k}\bar{d}_{ij}^{k}\right]\right), \tag{19}$$
where $\lambda=(\lambda_{1},\lambda_{2},\ldots,\lambda_{t})^{T}$ is the vector of DMs' weights.

In the consensus reaching process, the group should be required to be of acceptable consensus, i.e.,
$$\overline{CI}=\frac{1}{tmn}\sum_{k=1}^{t}\sum_{i=1}^{m}\sum_{j=1}^{n}d(\tilde{\bar{r}}_{ij}^{k},\tilde{\bar{r}}_{ij}^{c})\leq\varepsilon, \tag{20}$$
where $\varepsilon$ is the predefined consensus threshold.

In addition, to guarantee that the adjusted matrices $\bar{R}^{k}$ $(k\in T)$ are IVIF matrices, one has
$$0\leq\bar{a}_{ij}^{k}\leq\bar{b}_{ij}^{k}\leq 1,\quad 0\leq\bar{c}_{ij}^{k}\leq\bar{d}_{ij}^{k}\leq 1,\quad \bar{b}_{ij}^{k}+\bar{d}_{ij}^{k}\leq 1. \tag{21}$$

Accordingly, Consensus Rule 1 is built by minimizing the Manhattan distance-based adjustment amount $D(\bar{R},R)$ under the constraints described by (19)-(21), i.e.,
$$\min\; D(\bar{R},R)\quad \text{s.t.}\ (19)\text{-}(21), \tag{22}$$
where $\bar{a}_{ij}^{k}$, $\bar{b}_{ij}^{k}$, $\bar{c}_{ij}^{k}$, and $\bar{d}_{ij}^{k}$ are decision variables.

Plugging (13), (19), and (20) into (22), (22) can be rewritten as
$$\min\;\frac{1}{4tmn}\sum_{k=1}^{t}\sum_{i=1}^{m}\sum_{j=1}^{n}\left(|\bar{a}_{ij}^{k}-a_{ij}^{k}|+|\bar{b}_{ij}^{k}-b_{ij}^{k}|+|\bar{c}_{ij}^{k}-c_{ij}^{k}|+|\bar{d}_{ij}^{k}-d_{ij}^{k}|\right)\quad \text{s.t.}\ (19)\text{-}(21). \tag{23}$$

Obviously, (23) is a nonlinear programming model because of the absolute-value terms. To solve this model, we can transform it into a linear programming model by the standard deviation-variable technique: each absolute term $|x-y|$ is replaced by $p+q$, where $p,q\geq 0$ are deviation variables satisfying $x-y=p-q$, so that $p+q=|x-y|$ holds at an optimal solution. Introducing such a pair of nonnegative deviation variables for every absolute term in the objective and in the consensus constraint of (23), (23) can be converted into the linear program (24).

Equation (24) is a linear programming model and can be easily solved by popular software, such as LINGO and MATLAB. Thus, the optimal adjusted individual decision matrices are obtained.
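To illustrate the deviation-variable linearization behind (24) on a miniature instance, the sketch below solves a one-dimensional analogue of Consensus Rule 1 with `scipy.optimize.linprog`: each DM holds a single value in [0, 1] instead of an IVIF matrix, and every absolute term is replaced by a pair of nonnegative deviation variables exactly as in the transformation from (23) to (24). This is an illustrative toy under those assumptions, not the paper's full model.

```python
import numpy as np
from scipy.optimize import linprog

def consensus_rule1_scalar(x, lam, eps):
    """Minimally adjust scalar opinions x so that the average distance
    to the weighted mean is at most eps.  Variables per DM k:
    xbar_k (adjusted value); p_k, q_k with xbar_k - x_k = p_k - q_k;
    u_k, v_k with xbar_k - sum_j lam_j * xbar_j = u_k - v_k."""
    t = len(x)
    nv = 5 * t                       # layout: [xbar | p | q | u | v]
    c = np.zeros(nv)
    c[t:3 * t] = 1.0 / t             # minimize the average |xbar - x|

    A_eq = np.zeros((2 * t, nv)); b_eq = np.zeros(2 * t)
    for k in range(t):
        # xbar_k - p_k + q_k = x_k
        A_eq[k, k] = 1; A_eq[k, t + k] = -1; A_eq[k, 2 * t + k] = 1
        b_eq[k] = x[k]
        # xbar_k - sum_j lam_j xbar_j - u_k + v_k = 0
        A_eq[t + k, :t] = -np.asarray(lam, dtype=float)
        A_eq[t + k, k] += 1
        A_eq[t + k, 3 * t + k] = -1; A_eq[t + k, 4 * t + k] = 1

    # consensus constraint: (1/t) * sum_k (u_k + v_k) <= eps
    A_ub = np.zeros((1, nv)); A_ub[0, 3 * t:] = 1.0 / t
    b_ub = [eps]

    bounds = [(0, 1)] * t + [(0, None)] * (4 * t)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[:t], res.fun
```

At an optimum each pair collapses to the absolute deviation it replaces, so the objective equals the average adjustment, mirroring the argument used to pass from (23) to (24).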

4.2. Consensus Rule 2 for Reaching Consensus

Different from Consensus Rule 1, Consensus Rule 2 aims to minimize the total number of adjusted elements in all individual decision matrices. From (14), the objective function of Consensus Rule 2 is described as
$$\min\; N(\bar{R},R)=N_{a}+N_{b}+N_{c}+N_{d}, \tag{25}$$
where the indicators $\delta_{ij}^{k,a}$, $\delta_{ij}^{k,b}$, $\delta_{ij}^{k,c}$, and $\delta_{ij}^{k,d}$ are binary variables described by (15), (16), (17), and (18).

The remaining constraints of Consensus Rule 2 are the same as those of Consensus Rule 1, i.e., (19)-(21). Thus, combining these constraints with (15), (16), (17), and (18), Consensus Rule 2 is built as model (26), where the adjusted entries $\bar{a}_{ij}^{k}$, $\bar{b}_{ij}^{k}$, $\bar{c}_{ij}^{k}$, and $\bar{d}_{ij}^{k}$ and the binary indicators are decision variables. It is clear that (26) is a mixed 0-1 programming model.

To solve (26), the key issue is how to handle the indicator constraints (15)-(18). Let us first analyze the binary variables $\delta_{ij}^{k,d}$. From (18), one has $\delta_{ij}^{k,d}=1$ if $\bar{d}_{ij}^{k}\neq d_{ij}^{k}$. Hence, (18) is equivalent to the inequality $|\bar{d}_{ij}^{k}-d_{ij}^{k}|\leq\delta_{ij}^{k,d}$ when $\bar{d}_{ij}^{k}\neq d_{ij}^{k}$ (note that $|\bar{d}_{ij}^{k}-d_{ij}^{k}|\leq 1$ always holds). When $\bar{d}_{ij}^{k}=d_{ij}^{k}$, the variable $\delta_{ij}^{k,d}$ can be taken as 0 or 1 in this inequality. If we take $\delta_{ij}^{k,d}=1$, the value of the objective function in (26) is larger than that with $\delta_{ij}^{k,d}=0$. Therefore, $\delta_{ij}^{k,d}$ is taken as 0 at an optimal solution when $\bar{d}_{ij}^{k}=d_{ij}^{k}$. Consequently, (18) can be rewritten as $|\bar{d}_{ij}^{k}-d_{ij}^{k}|\leq\delta_{ij}^{k,d}$ regardless of whether $\bar{d}_{ij}^{k}=d_{ij}^{k}$ or not. Similarly, (15), (16), and (17) can also be rewritten as $|\bar{a}_{ij}^{k}-a_{ij}^{k}|\leq\delta_{ij}^{k,a}$, $|\bar{b}_{ij}^{k}-b_{ij}^{k}|\leq\delta_{ij}^{k,b}$, and $|\bar{c}_{ij}^{k}-c_{ij}^{k}|\leq\delta_{ij}^{k,c}$, respectively.

According to the above analyses, (26) can be converted into another mixed 0-1 program, i.e.,

To facilitate the solving process of (27), Lemma 9 is introduced.

Lemma 9 (see [31]). If a constraint in a mixed 0-1 program contains a product $z=y\,f(x)$ of a binary variable $y$ with a linear term $f(x)$, where $x$ are variables with finite bounds $f^{L}\leq f(x)\leq f^{U}$, this product can be replaced by a new continuous variable $z$ together with the following linear constraints:
$$f^{L}y\leq z\leq f^{U}y,\qquad f(x)-f^{U}(1-y)\leq z\leq f(x)-f^{L}(1-y).$$
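Lemma 9 can be checked mechanically: for a binary $y$ and $f^{L}\leq f(x)\leq f^{U}$, the four linear constraints leave exactly one feasible value of $z$, namely $y\,f(x)$. A small Python verification (variable names are illustrative):

```python
# Check of Lemma 9: with bounds l <= f <= u and binary y, the constraints
#   l*y <= z <= u*y   and   f - u*(1-y) <= z <= f - l*(1-y)
# leave z = y * f as the only feasible value.
def glover_interval(f, y, l, u):
    """Return the feasible range [lo, hi] for z under the constraints."""
    lo = max(l * y, f - u * (1 - y))
    hi = min(u * y, f - l * (1 - y))
    return lo, hi

# The interval collapses to the single point z = y * f:
for y in (0, 1):
    for f in (0.0, 0.3, 0.7, 1.0):      # sample values with l = 0, u = 1
        lo, hi = glover_interval(f, y, l=0.0, u=1.0)
        assert abs(lo - y * f) < 1e-12 and abs(hi - y * f) < 1e-12
```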

Theorem 10. Equation (27) can be equivalently transformed into the following mixed 0-1 linear program:

Proof. Introduce nonnegative deviation variables for each absolute term so that the absolute-value constraints can be equivalently converted into linear ones. On the other hand, observing the last constraints in (27), each of them contains a product of a binary variable with a bounded linear term. According to Lemma 9, each such product can be replaced by a new variable together with the linear constraints (28). In a similar way, the other three groups of constraints can be replaced analogously. According to (30) and (32), (27) can be equivalently transformed into (29). The proof is completed.

Solving (29), the optimal value and the adjusted individual decision matrices are yielded.

4.3. Algorithms for Reaching Consensus Based on Two Consensus Rules

Considering that some DMs may be reluctant to adjust their opinions completely according to the adjusted individual decision matrices obtained by Consensus Rule 1 or Consensus Rule 2, it is advisable to regard the adjusted individual decision matrices as aids while DMs adjust their preferences. Motivated by this idea, we devise two algorithms to reach consensus.

Algorithm I

Input the individual decision matrices, the consensus threshold, and the weight vector of DMs.

Output the final adjusted individual decision matrices, the consensus index, the final collective decision matrix, and the number of iterations.

Step 1. Set the iteration counter to 0 and take the input matrices as the initial individual decision matrices.

Step 2. Aggregate the individual matrices into a collective one by (19) and calculate the consensus index by (12). If the index does not exceed the threshold, go to Step 4; otherwise, go to Step 3.

Step 3. Improve the consensus based on Consensus Rule 1 or Consensus Rule 2. Taking the current matrices as the original individual matrices and solving (24) or (29), the adjusted matrices are obtained. Update the individual matrices by (33), increase the iteration counter, and go to Step 2.

Step 4. Output the final adjusted individual decision matrices, the final collective matrix, and the number of iterations.
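The iteration structure of Algorithm I can be sketched as below, with a simple blending step toward the collective matrix standing in for the adjusted matrices that models (24) or (29) would return; the blending parameter `theta` is a hypothetical stand-in for the update in (33), used here only to show the loop.

```python
# Structural sketch of Algorithm I, assuming decision matrices of shape
# (t, m, n, 4) and a blending step xbar = (1-theta)*x + theta*collective
# as a stand-in for the adjusted matrices produced by (24) or (29).
import numpy as np

def algorithm_one(matrices, weights, eps, theta=0.5, max_iter=100):
    """Iterate Steps 2-3 until the consensus index (12) drops to eps."""
    R = np.asarray(matrices, dtype=float)
    w = np.asarray(weights, dtype=float)
    for h in range(max_iter + 1):
        collective = np.tensordot(w, R, axes=1)             # Step 2: aggregate
        ci = (np.abs(R - collective).sum(axis=-1) / 4).mean()
        if ci <= eps or h == max_iter:                      # Step 4: output
            return R, collective, ci, h
        R = (1 - theta) * R + theta * collective            # Step 3: adjust
```

Because the collective matrix is invariant under this blending, each pass shrinks every deviation by the factor `1 - theta`, so the loop terminates for any positive threshold.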

Remark 11. If the consensus index of the original individual matrices is large, it is suggested to use Consensus Rule 1 (i.e., (24)) in Step 3, which can rapidly improve the consensus. Otherwise, it is suggested to use Consensus Rule 2 (i.e., (29)) in Step 3, which only adjusts a few elements of the original individual decision matrices. Thereby, it is a good idea to combine Consensus Rule 1 and Rule 2 in the consensus reaching process. Thus, replacing Step 3 by Step 3*, Algorithm II is devised.

Algorithm II

Step 3*. Improve the consensus based on both Consensus Rule 1 and Consensus Rule 2. Take the current matrices as the original individual matrices. If the consensus index exceeds a critical value given in advance, the adjusted individual decision matrices are obtained by solving (24); if it lies between the consensus threshold and the critical value, they are obtained by solving (29). Update the individual matrices by (33), increase the iteration counter, and go to Step 2.

Remark 12. In Step 3*, two parameters are involved: the predefined threshold of acceptable consensus and a critical value that decides whether Consensus Rule 1 or Consensus Rule 2 is applied to improve the consensus among DMs; the critical value is larger than the threshold. If the consensus index does not exceed the threshold, the consensus is acceptable. Otherwise, the consensus is unacceptable and needs to be improved. While improving the consensus, Consensus Rule 1 is applied if the index exceeds the critical value, and Consensus Rule 2 is applied if the index lies between the threshold and the critical value.

Theorem 13. Let be obtained by Algorithm I and optimal adjusted individual matrices be determined by Consensus Rule 1 (i.e., (24)) in Step 3. It follows that .

Proof. Please see Appendix A.

Theorem 14. Let be obtained by Algorithm I and the optimal adjusted individual decision matrices be determined by Consensus Rule 2 (i.e., (29)) in Step 3. It follows that .

Proof. Please see Appendix B.

From Theorems 13 and 14, it is concluded that Algorithm I is convergent. It is easy to understand that Algorithm II is also convergent because it combines Consensus Rule 1 and Consensus Rule 2.

5. An Interval-Valued Intuitionistic Fuzzy Program Based Method for Solving GDM with IVIFVs

This section proposes an IVIF program based method for solving GDM with IVIFVs. To derive attributes' weights, an IVIF program is built. By transforming this IVIF program into a multiobjective interval program, a solving approach is proposed. The collective decision matrix is derived by integrating individual decision matrices with the IVIFWM operator. Finally, combining the consensus reaching process described in Section 4, a novel method is presented to solve GDM with IVIFVs.

5.1. Interval Objective Program

As a preparation for solving the IVIF program which will be constructed to determine attributes' weights, this subsection reviews an approach to solving an interval objective program. Ishibuchi and Tanaka [32] presented a popular solving approach by transforming an interval objective program into an equivalent multiobjective program. Let $[f^{L}(x),f^{U}(x)]$ be an interval objective with center $f^{C}(x)=(f^{L}(x)+f^{U}(x))/2$, and let $\Omega$ be the set of constraints that $x$ should satisfy. A maximization interval objective program is represented as $\max\{[f^{L}(x),f^{U}(x)]\mid x\in\Omega\}$. This program can be equivalently transformed into the biobjective mathematical program $\max\{(f^{L}(x),f^{C}(x))\mid x\in\Omega\}$. A minimization interval objective program is represented as $\min\{[f^{L}(x),f^{U}(x)]\mid x\in\Omega\}$. This program can be equivalently transformed into another biobjective mathematical program $\min\{(f^{U}(x),f^{C}(x))\mid x\in\Omega\}$.

5.2. Determination of Attributes’ Weights

According to the final collective decision matrix $\bar{R}^{c*}=(\tilde{\bar{r}}_{ij}^{c*})_{m\times n}$ derived in Section 4, the comprehensive value of alternative $A_{i}$ is obtained as
$$\tilde{z}_{i}=\left(\left[\sum_{j=1}^{n}\omega_{j}\bar{a}_{ij}^{c*},\sum_{j=1}^{n}\omega_{j}\bar{b}_{ij}^{c*}\right],\left[\sum_{j=1}^{n}\omega_{j}\bar{c}_{ij}^{c*},\sum_{j=1}^{n}\omega_{j}\bar{d}_{ij}^{c*}\right]\right), \tag{34}$$
where $\omega=(\omega_{1},\omega_{2},\ldots,\omega_{n})^{T}$ is the accurate weight vector of attributes and satisfies $\omega_{j}\geq 0$ and $\sum_{j=1}^{n}\omega_{j}=1$. For convenience, denote $\tilde{z}_{i}=([a_{i}^{z},b_{i}^{z}],[c_{i}^{z},d_{i}^{z}])$.

As we know, the larger the comprehensive value of alternative , the better the alternative is. Thus, a mathematical program is built by maximizing comprehensive values, i.e.,

As the comprehensive values are IVIFVs, this program is called an IVIF program. According to Definition 2, to maximize the comprehensive values $\tilde{z}_{i}$, it is necessary to maximize their membership intervals and minimize their nonmembership intervals. Hence, (35) can be equivalently transformed into the following multiobjective interval program:

Employing the solving approach in Section 5.1, (36) can be converted as

Equation (37) can be further transformed as

By the min-max summation method, (38) can be converted into the following program:

Plugging (34) into (39), (39) can be rewritten as

Solving (40), the weight vector of attributes is determined.

5.3. A Novel Method for Solving GDM with IVIFVs

According to the above analyses, a novel method for solving GDM with IVIFVs is summarized below.

Step 1. Each DM establishes his/her individual decision matrix with IVIFVs and provides the vector of attribute weights with IVIFVs.

Step 2. Normalize individual decision matrices.

In general, attributes are of benefit type or cost type. Denote the set of cost attributes by $C^{c}$ and the set of benefit attributes by $C^{b}$. Individual decision matrices are often normalized by the following formula:
$$\tilde{r}_{ij}^{k}=\begin{cases}([a_{ij}^{k},b_{ij}^{k}],[c_{ij}^{k},d_{ij}^{k}]),& C_{j}\in C^{b},\\ ([c_{ij}^{k},d_{ij}^{k}],[a_{ij}^{k},b_{ij}^{k}]),& C_{j}\in C^{c}.\end{cases}$$
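Under the assumption that cost attributes are normalized by swapping the membership and nonmembership intervals while benefit attributes are kept as provided, Step 2 can be sketched as:

```python
# Sketch of the normalization step: for cost attributes, the membership
# and nonmembership intervals are swapped, i.e. ([a,b],[c,d]) becomes
# ([c,d],[a,b]); `cost_cols` lists the column indices of cost attributes.
import numpy as np

def normalize(matrix, cost_cols):
    """matrix: array of shape (m, n, 4) with entries (a, b, c, d)."""
    R = np.asarray(matrix, dtype=float).copy()
    for j in cost_cols:
        R[:, j] = R[:, j][:, [2, 3, 0, 1]]   # swap the two intervals
    return R
```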

Step 3. Check the consensus among the individual decision matrices by (12) and (19). If the group is of acceptable consensus, go to Step 5. Otherwise, go to the next step.

Step 4. Reach consensus by Algorithm I or II and derive the final adjusted individual decision matrices and the final collective decision matrix .

Step 5. Determine the attribute weight vector by solving (40).

Step 6. Obtain comprehensive values of alternatives by (34).

Step 7. Rank alternatives based on Definition 4.

Figure 1 depicts the above decision-making process.

6. A Practical Example of a Cloud Service Provider Selection and Comparative Analyses

To illustrate applications and advantages of the proposed method, this section provides a cloud service provider selection example and conducts comparative analyses.

6.1. A Practical Example of a Cloud Service Provider Selection

Cloud computing [33, 34] is a computing paradigm based on the Internet. Under this paradigm, shared hardware and software resources can be provided to computers on demand. Cloud computing has many advantages, such as flexibility, business agility, and a pay-as-you-go cost structure. Many enterprises with limited financial and human resources can apply cloud computing to deliver their business services and products online and extend their markets. Thus, the selection of a cloud service provider becomes a key issue for enterprises.

Guilin FeiYu Electronic Technology limited company (Feiyu for short) is devoted to the development and sales of electronic products. To increase the running speed of its servers, Feiyu seeks a cloud service provider. After market research and preliminary screening, five providers are selected for further evaluation, including Ali Cloud (), Tencent Cloud (), Field Cloud (), American Cloud (), and Wangsu Technology limited company (). Three decision-makers are invited to evaluate these five alternatives with respect to four attributes, including performance (), payment (), security (), and scalability (). Suppose the vector of DMs' weights is . Normalized individual IVIF decision matrices are shown in Tables 1–3. After discussion and negotiation, the group of DMs provides the attributes' weights as follows:

In what follows, we will apply the proposed method to solve this example.

Steps 1-2. Normalized individual decision matrices are shown in Tables 1–3.

Step 3. Check the consensus of individual IVIF decision matrices.

By (19), the temporary collective IVIF decision matrix is obtained and shown in Table 4. By virtue of (12), the consensus index is calculated as . Set the consensus threshold ; the calculated index exceeds this threshold. Therefore, the group is of unacceptable consensus and the matrices need to be adjusted.

Step 4. Reach consensus.

(i) The first round: According to Algorithm II, since the consensus index exceeds the critical value, Consensus Rule 1 is first applied to adjust the original individual IVIF decision matrices. Solving (24), the temporary adjusted individual IVIF decision matrices are obtained. By (33), a new group of individual IVIF decision matrices and the temporary collective decision matrix are generated, and the consensus index is recomputed. Hence, the consensus reaching process continues based on Consensus Rule 1.

(ii) The second round: Replace the original matrices by the updated ones. Repeating the above process, updated individual IVIF decision matrices and the collective one are obtained, and the new consensus index is derived. Thereby, the subsequent consensus reaching process is guided by Consensus Rule 2.

(iii) The third round: Solving (29) with the current matrices, the temporary adjusted individual IVIF decision matrices are obtained. By (33), the decision matrices are updated, and the collective matrix is derived by (19). The consensus index is calculated as 0.0519.

(iv) The fourth round: Repeating the process in (iii) based on Consensus Rule 2, the adjusted individual decision matrices and the collective one are derived and listed in Tables 5–8. The consensus index is computed as . Accordingly, the group reaches acceptable consensus and the consensus reaching process is finished. Regard the resulting individual and collective matrices as the final ones, and go to the next step.

Step 5. Determine attributes’ weights.

According to the final collective matrix and the attributes' weights, the vector of attributes' weights is determined by solving (40), i.e.,

Step 6. Obtain comprehensive values of alternatives.

Plugging matrix and the vector into (34), comprehensive values of alternatives are obtained as

Step 7. Rank alternatives.

By virtue of (3), the scores of the alternatives are derived as

According to Definition 4, the alternatives are ranked as . Thereby, American Cloud () is the best cloud service provider.

6.2. Comparative Analyses

To further show advantages of the proposed method, some comparative analyses are conducted in this subsection.

6.2.1. IR and DR Rules

In the consensus reaching process, the Identification Rule (IR) and the Direction Rule (DR) are two popular rules and have been applied to many multiple attribute group decision-making problems [35–38]. IR is used to identify which DMs contribute less to reaching a higher consensus, and DR is used to direct DMs in how to modify their opinions on alternatives with respect to attributes.

Before conducting the comparative analysis, IR and DR are extended to the IVIF environment. Thus, two further consensus rules, called Consensus Rule 3 and Consensus Rule 4, are generated. By replacing Step 3 in Algorithm I with Step 3’ and Step 3’’, respectively, Consensus Rule 3 and Consensus Rule 4 are obtained and described as follows:

Consensus Rule 3

Step 3’. While improving the consensus degree, one element is modified at a time. To complete this step, four substeps are needed.

Step 3’-1. Seek the maximum consensus degree of DM and denote it by .

Step 3’-2. Seek the maximum consensus degree of DM on alternatives and denote it by .

Step 3’-3. Seek the maximum consensus degree of DM on alternative with respect to attributes and denote it by .

Step 3’-4. Modify the element as , where

Set and go to Step 2.

Consensus Rule 4

Step 3’’. While improving the consensus of the group , all elements of all individual decision matrices are modified. Denote the modified matrices by . Suppose , where

Set and go to Step 2.

6.2.2. Comparative Analysis with IR and DR Consensus Rules

To show the merits of Consensus Rule 1 and Consensus Rule 2 proposed in this paper, the example in Section 6.1 is solved by Consensus Rules 1–4 with different values of the parameter . Suppose the threshold . The comparative results are presented in Tables 9 and 10. Table 9 shows the distances between the original and adjusted individual decision matrices under Consensus Rules 1–4. Table 10 reports the numbers of iterative rounds needed to reach consensus under Consensus Rules 1–4.

From Tables 9 and 10, the merits of Consensus Rules 1 and 2 proposed in this paper are outlined as follows:

In the consensus reaching process, Consensus Rules 1 and 2 retain more of the decision information provided by DMs than Consensus Rules 3 and 4. Observing the total distances between the original decision matrices and their corresponding adjusted ones in Table 9, the distances obtained by Consensus Rules 1 and 2 are obviously smaller than those obtained by Consensus Rules 3 and 4. The smaller the distance, the closer the adjusted decision matrix is to the original one and the more decision information is preserved. In this respect, Consensus Rules 1 and 2 retain more decision information and are better than Consensus Rules 3 and 4. In addition, since the distances obtained by Consensus Rule 4 are very large, most decision information may be lost when reaching consensus by this rule.
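The information-retention criterion behind Table 9 can be made concrete: sum, over all DMs, a distance between each original decision matrix and its adjusted counterpart. The normalized Hamming distance used here is an assumption; the paper's own measure may differ.

```python
import numpy as np

def information_loss(originals, adjusted):
    # Total distance between each original decision matrix and its
    # adjusted counterpart: the smaller the total, the more of the DMs'
    # initial decision information is preserved (the criterion of Table 9).
    return sum(float(np.abs(np.asarray(o) - np.asarray(a)).mean())
               for o, a in zip(originals, adjusted))
```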

Compared with Consensus Rule 3, Consensus Rules 1 and 2 require far fewer iterative rounds to reach consensus for any value of the parameter , as Table 10 verifies. In other words, Consensus Rules 1 and 2 are time-saving. Although Consensus Rule 4 also needs fewer rounds, it is not sensible to select this rule for reaching consensus because, as noted above, much decision information may be lost.

It is observed from Tables 9 and 10 that the distances obtained by Consensus Rule 1 are smaller than those obtained by Consensus Rule 2, whereas Consensus Rule 1 needs more iterative rounds than Consensus Rule 2 for the same value of the parameter . Thus, to reach consensus as quickly as possible while retaining as much decision information as possible, it is sensible to combine the two rules in the consensus reaching process: first improve the consensus by Consensus Rule 1 and then by Consensus Rule 2.

6.2.3. Comparison with an Autocratic Multiattribute Group Decision-Making Method

To further show the advantages of the proposed method, this subsection compares it with the autocratic multiattribute group decision-making (AM-AGDM) method presented by Cheng [10]. Using the AM-AGDM method, the example in Section 6.1 is re-solved below.

To make the comparison fair, suppose that the attribute weights supplied by the DMs are the same as those in the example in Section 6.1, i.e.,

Steps 1-2. Set and , where is the iterative round and , , and are the weights of the DMs in the first round. By the AM-AGDM method, the weighted decision matrices of DM are first computed, and then the aggregated evaluating matrix is constructed as

Steps 3-4. According to in [10] and Xu’s ranking method [30], the preference vector of alternatives with respect to is shown as

Steps 5-6. Combining DMs’ weights and score matrices, the aggregated group evaluating values of alternatives are obtained and the group preference orders of alternatives are yielded as

Step 7. Based on method [10], the weighted similarity degrees are obtained as

Thus, the group consensus degree is calculated as . Let the consensus threshold be , as set in [10]. Then one has , and it is necessary to modify the DMs’ weights to improve the group consensus degree.

Step 8. DMs’ weights are derived by and in [10] as

Then go to Step 5.

Repeating Steps 5-8, the DMs’ weights become stable at . In this round, the DMs’ weights and the consensus degree are obtained as , , and , respectively. Obviously, the consensus degree is less than . This verifies that the method in [10] sometimes fails to improve the consensus of a group to an acceptable level.

Step 9. Rank alternatives.

According to in the method of [10], the aggregated group evaluating values of alternatives in the 13th round are computed as

Therefore, the alternatives are ranked as , which is clearly different from the order obtained by the proposed method.

Compared with the AM-AGDM method, the proposed method has the following advantages:

When measuring the consensus, the former only measures the differences between the preference orders of alternatives given by individual DMs and the group preference orders, whereas the latter measures the differences between the attribute values provided by each DM and those of the group. As the attribute values of alternatives over attributes carry more decision information than the alternatives’ preference orders, the consensus defined by the latter describes the consensus among DMs more accurately.

When reaching consensus, the former does so by modifying the DMs’ weights, which may result in some DMs’ weights becoming zero as the number of iterative rounds increases; this can be verified by Step 8 in this subsection. In that case, those DMs have no effect on the rest of the decision process. In contrast, the latter reaches consensus by modifying the attribute values provided by the DMs, which guarantees that each DM plays a certain role in the decision-making. Furthermore, the latter retains as much of the initial decision information as possible while reaching consensus.

In the process of reaching consensus, the latter is proved to be convergent, while the former sometimes cannot help the DMs reach the predefined consensus degree, as verified by Step 8 in this subsection.

For determining the weights of attributes, the former assigns the attribute weights in advance, while the latter first elicits the attributes’ weights in the form of IVIFVs and then objectively determines precise attributes’ weights by constructing and solving an IVIF program. Accordingly, in determining the attributes’ weights, the latter not only draws on the DMs’ subjective judgments but also utilizes the attribute values of the alternatives. Therefore, the attributes’ weights determined by the latter may be more reasonable.

7. Conclusions

This paper investigates multiattribute group decision-making problems in which the attribute values and the attributes’ weights are all IVIFVs. The consensus and the determination of attributes’ weights are discussed, respectively. To check the consensus, a new consensus index is proposed by measuring the deviations between the individual decision matrices and the collective one. In the case that the consensus is unacceptable, two consensus rules are introduced by minimizing the adjustment amounts. The distinctive features of these two rules are that they are convergent and retain as much of the initial decision information as possible. To derive the attributes’ weights objectively, an IVIF program is constructed and a new solving approach is presented. Thereby, a novel method is proposed to solve MAGDM problems with IVIFVs. An example and comparative analyses are conducted to illustrate the application and advantages of the proposed method.

In the future, we will extend the proposed idea to other types of MAGDM problems, such as those with Pythagorean fuzzy sets [39] and trapezoidal intuitionistic fuzzy numbers [40].

Appendix

A.

Proof of Theorem 13. According to (33), one has . Thus, the following is yielded. Similarly, the following is obtained. Therefore, one has . Since are the optimal solutions obtained by (24) in the th iteration of Consensus Rule 1, one gets. Hence, we have. Meanwhile, as are optimal solutions obtained by (24), it follows that. According to Definition 6, it is deduced that. Combining (A.5) and (A.6), the following is obtained. Observing (A.4) and (A.7), one gets. For convenience, supposing , (A.8) can be written as. From (A.9), it is concluded that the sequence is monotonically decreasing with respect to the iteration number . Since , the limit of the sequence exists. Taking the limit on both sides of (A.9) yields. On the other hand, . Plugging (33) into (A.11) generates. Combining (A.10) and (A.12), it is concluded that . Since are optimal solutions obtained by (24), they satisfy , which can be deduced from the constraints of (24). Therefore, we have .
Theorem 13 is proved.

B.

Proof of Theorem 14. Let be the optimal value of (29) in the l th iteration; then one has . By virtue of (29), the decision matrices are also feasible solutions in the (l+1)th iteration. Let be the value of the objective function on the feasible solutions in the (l+1)th iteration. Suppose . The inequality holds for any , which can be proved by considering the following two cases:
If , then we have . As , it is obtained that . Thereby, it follows that .
If , then it is deduced that . When , one has , and in this case one gets . Otherwise, and . Hence, we have .
Similarly, the inequalities , and hold. Thus, one has . On the other hand, it holds that. Combining (B.2) and (B.3), it is proved that . Therefore, the sequence is monotonically decreasing. Since , it is concluded that exists; denote it by .
According to Cauchy’s test for convergence, there exists a positive integer such that the equality holds when . As , one has . Namely, are also optimal solutions in the (l+1)th iteration, which implies that the number of adjusted preference values tends to 0. Thereby, it is deduced that . Thus, . As , it is concluded that .
This completes the proof of Theorem 14.

Data Availability

The key data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This research was supported by Guangxi Natural Science Foundation (nos. 2016GXNSFBA380077 and 2016GXNSFAA380059), Guangxi Philosophy and Social Science Programming Project (2015) (no. 15FGL011), Guangxi University Research Project (no. KY2016YB196), and the Doctoral Scientific Research Foundation of Guilin University of Technology (no. GUTQDJJ2007033).