#### Abstract

The growth of large enterprises in the manufacturing market commonly depends on good New Product Development (NPD) projects; such projects are a strategy to outperform competitors in a competitive environment. The management of these projects is usually complex and involves risk due to the changing and conflicting environment. Existing approaches to the problem lack an explicit consideration of the decision maker's (DM's) attitude facing the uncertainty and imprecision related to this risk, particularly in the presence of time-interdependencies. This paper proposes a model of the time-related effects, under imperfect knowledge, and their influence on choosing optimal NPD portfolios. The proposed approach is an interval-based method to solve NPD portfolio optimization problems under different forms of imperfect knowledge. This approach has the advantage of modelling the different sources of imprecision, vagueness, uncertainty, and arbitrariness in a unified and simple way. The attitude of the DM facing imperfect knowledge is adjusted by using some meaningful parameters. The research focuses particularly on creating a method useful for risk-averse DMs. The proposal was tested through an experimental design that compared the results achieved by the new method against the expected value of portfolios. The results revealed that high levels of conservatism might prevent wasting resources on failed projects.

#### 1. Introduction

It is commonly accepted that good research and development practices are essential for the growth of well-positioned large enterprises in the manufacturing market, which commonly run dozens or hundreds of projects. The development of competitive new products is likely the most important task enabling a manufacturing enterprise's survival in a competitive environment [1, 2]; such NPD projects can positively impact the company's competitive position, allowing better efficiency in product management and/or the generation of social benefits [3]. Hence, for the growth of enterprises under stiff competition, good New Product Development (NPD) projects are required. However, competition can affect the projects of an enterprise in different ways; e.g., strong competition is related to reductions in real prices (cf. [4, 5]), and technical inefficiency can be reduced by improving external and/or internal competition [6]. According to [7], planning the interdependency among projects in a portfolio is a dynamic, hard, and complex task that must optimize resources considering the competitors' movements in the market. Since time is one of the most widely consumed resources in the development of a project, its effects must be considered in NPD projects.

To a great extent, a successful NPD can produce large benefits (profit, prestige, market share, etc.), but it requires complex management and involves high risk, mainly due to the fast-changing and conflicting environment, as well as technological innovations. According to some authors, most NPD projects have a low probability of success (e.g., [1]). Many authors argue for the importance of good practices to handle innovation risk, thus increasing the chances of having successful new products (e.g., [8–11]).

Since there are often numerous good projects but not sufficient resources to develop all of them, decision makers should select the most appropriate NPD project portfolios, expecting that these portfolios allow developing several, even many, attractive and successful products that generate growing benefits [12]. Balancing risk and potential benefits is a crucial aspect of selecting appropriate NPD portfolios (e.g., [13]).

NPD portfolio selection is a particular case of research and development project portfolio selection. But NPD projects are distinguished from other innovation and research projects by some relevant features:

(i) Uncertain market payoffs that change over time.

(ii) Strong dependence of benefits on imperfectly known project completion times, technological innovations, potential competitor products, and their interactions.

(iii) Sometimes, NPD projects can be preceded by applied research projects; then, effects related to time-interdependence among different projects should be considered as an additional feature in portfolio selection problems.

Chan and Ip [14] proposed a framework for a decision support system that aids in NPD through the assessment of product values based on three types of influence factors, which are the product, the customers, and the market. The system uses two models to estimate the Net Customer Lifetime Value (NCLV) from predicted customer purchasing behaviour; the NCLV can be used to rank a set of potential products. The selection process is performed through a ranking process based on such value.

Loch and Kavadias [13] proposed a model based on marginal analysis to solve a dynamic version of the Portfolio Selection of NPD programs. Using a probabilistic approach, this paper handles multiple periods, synergy, uncertainty, managerial risk, and obsolescence, but not different levels of conservatism from decision makers (DMs).

Wei and Chang [1] and Lin et al. [15] proposed approaches that integrate fuzzy set theory and multicriteria group decision-making into an NPD Project Portfolio Selection Model (PPSM) which allows the management of risk. These proposals use multiattribute fuzzy group decision techniques to deal with fuzziness, uncertainty, and inaccuracy. The PPSM proposes the use of a Project Fuzzy Decision Index in combination with Fuzzy Gates (FG) to eliminate weak projects. The FG are calculated from information provided by the decision makers and are used by a Go-Kill method together with a threshold (defined by the DM regarding enterprise resources, risk tolerance, and so on); finally, the total dominance method is used for the Go-Kill decision. The definition of the FG may consider information related to the evaluation of risks and other aspects. These works do not directly involve the influence of time-interdependencies such as the obsolescence of a project or the appearance of new competing products from other companies.

Badizadeh and Khanmohammadi [2] proposed a fuzzy multicriteria decision-making model for evaluating and prioritizing NPD under uncertainty. Project selection is performed through a ranking process. This paper considered three types of uncertainty that influence the value of products: market uncertainty, technology uncertainty, and process uncertainty (related to internal issues of the organization). This work can involve a large number of criteria to be handled by a DM, a situation that can demand an excessive cognitive effort from him/her to provide the required information.

Reich [16] proposed mapping the portfolio selection problem of NPD to the domain of constraint satisfaction problems. The proposal consists of a decision support system that aids in portfolio selection of NPDs, taking as guiding feature the desired reliability of products, i.e., the capacity of a product to perform the required function. The strategy relies on a fuzzy system and historical data to estimate the NPD costs; these costs are involved in the constraint satisfaction optimization model to make the appropriate selection of projects. This approach successfully includes the reliability feature in the selection process for NPDs.

Wei et al. [17] present a PPSM for NPD based on fuzzy set theory to rank product lines, dealing with synergy, risk, and uncertainties. The approach is an improved PPSM for fuzzy multicriteria group decision-making; it proposes an NPD project Go-Kill decision index named the project fuzzy synthetic rating (PFSR) to aggregate fuzzy weights and fuzzy evaluations of the NPD projects. Once the PFSR is calculated, the method uses the Go-Kill threshold provided by the DMs, based on risk and performance, to compare against the crisp PFSR and determine which projects survive to the next stage. The method ends with a ranking of the projects.

Reich and Pawlewski [18] use neural networks and fuzzy set theory to provide a solution for the PPSM for NPD. The approach offers a rational structure of NPD project evaluation reflecting the vagueness or ambiguity that appears in the business environment. The combined use of neural networks and fuzzy systems yields an evaluation of product lines that is used to rank the set of projects; this ranking aids in the selection process and, with it, in the construction of an NPD portfolio.

Recognizing that launching a new product is highly risky due to uncertainties of the market and competitors, Tolga [8] proposed a model based on compound options with type-2 fuzzy numbers. He stated that such uncertainties cannot be represented by crisp numbers and that traditional techniques are inappropriate for solving real-life problems.

Tiwari et al. [10] proposed a method to make an NPD process more effective by evaluating design concepts in the uncertain environments of early design stages. In order to improve the overall effectiveness of the process, design concepts are characterized by soft sets and customers' preferences by Shannon entropy. The approach to incorporating customers in early NPD stages is validated through examples.

Most of the related works in the particular framework of NPD are subject to at least one of the following criticisms: (i) no consideration of complex time-interdependencies related to competing products and technological obsolescence; (ii) no explicit consideration of the DM's attitude facing uncertainty and imprecision; (iii) no solution of an optimization problem in the portfolio space; the best portfolio is not necessarily composed of the best projects, because of the complex interdependencies among projects and their influence on the DM's preferences (cf. [19]); (iv) lack of knowledge about market, competitors, technological changes, etc., is mainly modelled by fuzzy sets; fuzzy set theory is a powerful tool to model vagueness, but vagueness is not the only (and probably not the main) source of imperfect knowledge in NPD decision-making.

The difference between uncertainty and vagueness in the framework of R&D project selection has been addressed by several authors (e.g., [20]). The uncertainty of projects is related to the degree of precision with which the variations in outcome, resources, and work processes of projects can be forecast [21]. From the point of view of multicriteria decisions, other authors prefer the more general term "imperfect knowledge" (e.g., [22–26]); the sources of this imperfect knowledge are arbitrariness, imprecision, ill-determination, and uncertainty in data and model parameters [22]. In this sense, the term "uncertainty" is related to the values of certain data in a more or less distant future [22].

There is a vast literature modelling imperfect knowledge in the field of research and development project portfolio selection (e.g., [20, 27–29]), using statistical information or fuzzy sets. Interval analysis is another approach that has recently been applied to model imperfect knowledge in project portfolio selection and stock portfolio optimization (cf. [30–35]). Interval analysis combined with outranking methods was recently applied to portfolio optimization in [24, 26]. The interval approach is a natural and simple way to express imperfect information, regardless of its source.
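As an illustration of this simplicity, quite different sources of imperfect knowledge can be reduced to the same closed-interval representation. The following is a minimal sketch, where the function names and all numbers are ours, purely for illustration:

```python
# Sketch: encoding different sources of imperfect knowledge as (lo, hi) intervals.
# Both functions and all numeric values are illustrative assumptions.

def from_pct_imprecision(estimate, pct):
    """Estimation imprecision: the true value lies within +/- pct of the estimate."""
    delta = abs(estimate) * pct
    return (estimate - delta, estimate + delta)

def from_expert_range(pessimistic, optimistic):
    """Uncertainty about future values: experts provide a pessimistic/optimistic range."""
    return (min(pessimistic, optimistic), max(pessimistic, optimistic))

revenue = from_expert_range(80.0, 140.0)   # uncertain future market payoff
cost = from_pct_imprecision(50.0, 0.10)    # cost estimate known to within 10%
```

Whatever its origin, each quantity ends up as the same kind of object, which is what allows the unified treatment described above.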

NPD portfolio selection can be modelled as a multiobjective optimization problem under imperfect knowledge. Unlike most research projects, the benefits provided by NPD projects depend strongly on time, yet the time-related features of NPD projects are often poorly known. In this paper, we are not mainly interested in addressing the multicriteria aspects of NPD portfolio selection; these aspects are not essentially different from those of research and development project selection, which have received great attention for many years. Our aim is to propose a way to face the risk provoked by imperfect knowledge in the time-related interdependencies which are typical of NPD.

Three different moments are generally present in any NPD project: the estimated completion time; the moment at which competition becomes significant; and the moment at which the developed product becomes old. These moments strongly influence the final benefit produced by a project; e.g., a longer completion period or the appearance of competition in the market may provoke smaller benefits, even null benefits if the project becomes old (i.e., it is no longer relevant in the market). The specific forms of these dependences should be established by the management team of each project; nevertheless, the long lead times of R&D projects combined with complex market and technology dynamics make it very difficult to collect reliable data [36], originating the presence of risk.

This paper is primarily oriented to modelling the time-related effects, under imperfect knowledge, and their influence on choosing optimal NPD-oriented project portfolios. This work proposes an interval-based method to address NPD portfolio optimization problems under the above forms of imperfect knowledge. This approach has the advantage of modelling the different sources of imprecision, vagueness, uncertainty, and arbitrariness in a unified and simple way. The attitude of the DM facing imperfect knowledge is adjusted by using some meaningful parameters. We are particularly interested in creating a method useful for risk-averse decision makers. Although interested in benefits from NPDs, risk-averse decision makers are also interested in controlling and minimizing their regret as a consequence of spending resources on risky projects with a low probability of success.

The remainder of this paper is organized as follows: Section 2 presents the general problem formulation for NPD-oriented project portfolio optimization with time-interdependencies under imperfect knowledge. Section 3 gives a brief description of the algorithmic solution approach, based on an evolutionary algorithm. Section 4 presents an illustrative example, which allowed us to evaluate the performance of the proposed approach. Finally, Section 5 contains the conclusions.

#### 2. Problem Formulation

Beyond the expected market payoffs from the new products or revenues from innovation projects, there are usually other objectives to be taken into account by the decision makers. These criteria may concern social responsibility, image, alignment with the long-term objectives of the enterprise, portfolio balance, etc. Considering several criteria is more general than the simple single-objective formulation based on expected revenues. In NPD-oriented portfolios, at least one of the criteria aggregates the expected revenues from the products that will be introduced in the market. Most of the candidate projects should be oriented to the market, although there could be some projects with nonprofit goals (e.g., applied research projects).

The project portfolio selection problem is formulated below as a multiobjective optimization problem. This model follows the well-known binary formulation for the stationary problem of project selection, which has been broadly studied in the literature (e.g., [37–42]). The basic features of the model that describe any instance of the project portfolio problem are (a) the number of projects *M* competing for financing; (b) the number of criteria (or objectives) *N* considered by the DM for measuring the quality of portfolios and projects; and (c) the requirements *R* for resources.

A portfolio is a subset of projects and is represented by a binary vector *a* = ⟨*x*_1, *x*_2, …, *x*_{M}⟩, where *x*_{j} = 0 means that project *j* will not be financed, and *x*_{j} = 1 means that the project receives support. The vector of the impacts of portfolio *a* is *F*(*a*) = ⟨*f*_1(*a*), *f*_2(*a*), …, *f*_{N}(*a*)⟩, associated with objectives 1, …, *N*. Assuming that the DM's preference increases with the value of the objectives, the decision-making problem can be expressed as

    max F(a), a ∈ Ω,    (1)

where Ω is the space of feasible portfolios, usually determined by the available resources and timing relations among projects.

Considering the imperfect knowledge in project objectives, requirements, and budget availability, problem (1) can be expressed as in (2):

    max {F(a) + ΔF(a)}
    subject to R_k(a) + ΔR_k(a) ⪅ P_{k,0} + ΔP_k, k = 1, …, n_r,
    prec(a) = 0, a ∈ Ω,    (2)

where:

(i) *f*_{i}(*a*), *i* = 1, …, *N*, represents the estimated value of the *i*-th objective in portfolio *a*;

(ii) Δ*f*_{i}(*a*), *i* = 1, …, *N*, represents the imperfect knowledge associated with the *i*-th portfolio objective;

(iii) *n*_{r} is the number of different classes of resources;

(iv) *R*_{k}(*a*) is the estimated consumption of the *k*-th resource for the portfolio *a*;

(v) *P*_{k,0} is the estimated availability of the *k*-th resource;

(vi) Δ*R*_{k}(*a*) represents the imperfect knowledge related to the consumption of the *k*-th resource for the portfolio *a*;

(vii) Δ*P*_{k} represents the imperfect knowledge related to the availability of the *k*-th resource;

(viii) ⪅ denotes "less with sufficient likelihood";

(ix) prec(*a*) = 0 denotes the fulfillment of the precedence relationships between projects in portfolio *a*;

(x) *n*_{p} denotes the number of precedence relationships among projects.

Let us also denote by:

(i) *f*_{i,j}: the estimated impact produced by the *j*-th project on the *i*-th objective;

(ii) Δ*f*_{i,j}: a representation of the imperfect knowledge related to the *i*-th objective of the *j*-th project;

(iii) *r*_{k,j}: the estimated consumption of the *k*-th resource by the *j*-th project;

(iv) Δ*r*_{k,j}: the imperfect knowledge associated with *r*_{k,j}.

*Remark 1. * *f*_{i,j}, Δ*f*_{i,j}, *r*_{k,j}, and Δ*r*_{k,j}, *j* = 1, …, *M*, *i* = 1, …, *N*, are aggregated at the portfolio level using certain aggregation functions which can model different synergies. These aggregation functions produce, respectively: (i) *f*_{i}(*a*); (ii) Δ*f*_{i}(*a*); (iii) *R*_{k}(*a*); (iv) Δ*R*_{k}(*a*).

Formulation (2) could correspond to a portfolio selection problem of basic and applied research projects, in which the influence of time-related effects on objective values can be neglected. Our proposal aims to underline the differences between research and NPD project portfolio optimization. In manufacturing enterprises, during its last phase, an applied research project becomes an NPD project, oriented to the market and whose results are strongly influenced by time-related effects, mainly concerning competitors and technological changes.

Since in this paper we are primarily interested in modelling the timing effects and their influence in solving problem (2), let us introduce the following notation:

(i) *t*_{cj}: the estimated completion time of the *j*-th project;

(ii) Δ*t*_{cj}: the imperfect knowledge associated with the completion time of the *j*-th project;

(iii) *t*_{sj}: the estimated start of the *j*-th project;

(iv) Δ*t*_{sj}: the imperfect knowledge associated with the start of the *j*-th project;

(v) *P*_{a,j}: the set of projects which are predecessors of the *j*-th project.

Two precedence conditions should be fulfilled for each project in a feasible portfolio according to (3) and (4): every predecessor of a supported project must also be supported, that is, *x*_{l} = 1 for all *l* ∈ *P*_{a,j} whenever *x*_{j} = 1 (3); and each project can start only after the completion of its predecessors, that is, *t*_{sj} + Δ*t*_{sj} must not precede *t*_{cl} + Δ*t*_{cl} for any *l* ∈ *P*_{a,j} (4).
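A minimal sketch of how the two precedence conditions (all predecessors of a supported project are also supported, and no project starts before its predecessors finish) might be checked programmatically. The names `pred`, `t_s`, and `t_c`, as well as the conservative endpoint comparison of intervals, are our own assumptions; intervals are `(lo, hi)` pairs:

```python
# Hypothetical feasibility check for the precedence conditions of a portfolio.
# x: 0/1 support vector; pred[j]: list of predecessor indices of project j;
# t_s[j], t_c[j]: start/completion time intervals (lo, hi) of project j.

def precedence_ok(x, pred, t_s, t_c):
    for j, supported in enumerate(x):
        if not supported:
            continue
        for l in pred[j]:
            if not x[l]:
                # predecessor of a supported project is not in the portfolio
                return False
            if t_s[j][0] < t_c[l][1]:
                # conservative check: earliest start precedes latest completion
                return False
    return True
```

For example, a two-project portfolio where project 1 depends on project 0 is feasible only if project 0 is included and project 1 starts no earlier than project 0's latest completion.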

*Remark 2. *(i) *r*_{k,j} depends on *t*_{cj}; this dependence is usually an increasing function, because more time for completion likely implies more resource spending; (ii) Δ*r*_{k,j} depends on Δ*t*_{cj} for the same reason.

Since a subset of projects is oriented to the development of new products, one of the objectives in problem (2) is the revenue obtained by the projects that achieve a certain portion of the market. This portion depends on the level of competition for the same market.

*Remark 3. *(i) *f*_{i,j} can depend on the estimated moment of completion, that is, (*t*_{cj} + Δ*t*_{cj}), and on its relationship with the moment when competition for the same market becomes significant; when the impact of the project can be degraded by competitors, a later completion makes a stronger presence of competitors more likely; so, the function that models such a dependence is often a decreasing one; (ii) Δ*f*_{i,j} can depend on (*t*_{cj} + Δ*t*_{cj}) for the same argument.

Let us introduce two new concepts:

(i) *t*_{comp,j}: the moment at which the competition for the product developed by the *j*-th project becomes significant;

(ii) *t*_{old,j}: the moment at which the product developed by the *j*-th project can be considered old.

*Assumption 4. **f*_{i,j} + Δ*f*_{i,j}, *r*_{k,j} + Δ*r*_{k,j}, *t*_{cj} + Δ*t*_{cj}, *P*_{k,0} + Δ*P*_{k}, *t*_{sj} + Δ*t*_{sj}, *t*_{comp,j}, and *t*_{old,j}, *j* = 1, …, *M*, *i* = 1, …, *N*, *k* = 1, …, *n*_{r}, can be represented as intervals. In the following, interval numbers will be denoted by boldface italic letters.

*Assumption 5. *The aggregation functions in Remark 1 can be expressed by using the basic operations of the interval arithmetic (see Appendix).
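For illustration, the basic interval operations that such additive aggregations rely on can be sketched as follows. This is standard interval arithmetic (the Appendix defines the operations actually used); the function names are ours:

```python
# Standard interval arithmetic operations for (lo, hi) pairs.

def iv_add(a, b):
    """Interval addition: endpoints add independently."""
    return (a[0] + b[0], a[1] + b[1])

def iv_scale(k, a):
    """Multiplication by a crisp scalar; endpoints may swap for negative k."""
    lo, hi = k * a[0], k * a[1]
    return (min(lo, hi), max(lo, hi))

def iv_sum(intervals):
    """Additive aggregation of project-level intervals to the portfolio level."""
    total = (0.0, 0.0)
    for a in intervals:
        total = iv_add(total, a)
    return total

# Aggregating three project cost intervals into a portfolio cost interval:
portfolio_cost = iv_sum([(45, 55), (30, 34), (12, 15)])
```

A simple additive aggregation like `iv_sum` corresponds to the no-synergy case; synergies would require more elaborate aggregation functions.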

Under the above assumptions, problem (2) can be expressed as problem (5), where the symbol "≈" means "with sufficient likelihood".

*Remark 6. *(a) The presence of competitors for a given project *j* provokes a reduction of *t*_{old,j} and a strong dependence of some impact functions on (*t*_{cj} + Δ*t*_{cj}). In such a case, *f*_{i,j} is a function of (*t*_{cj} + Δ*t*_{cj}), *t*_{comp,j}, and *t*_{old,j}; it is monotonically decreasing on (*t*_{cj} + Δ*t*_{cj}) and increasing on *t*_{comp,j} and *t*_{old,j}. The greater Δ*t*_{cj} is, the greater Δ*f*_{i,j} should be. The specific forms of those dependences should be established by the management team of each project, supported by market studies and expert opinions. If a project is related to results that do not become old, its *t*_{old,j} is set to infinity.

(b) The strong dependences of the criteria impacts on time effects are included in the formulation of problem (5); such dependences are distinctive of NPD projects and mark differences from the more general portfolio optimization problem given by problem (2). Basically, problem (5) contains the time-related effects addressed by our proposal.

(c) The nature and volume of the information required for solving problem (5) can be seen as an important criticism. However, many recent papers emphasize the importance of involving numerous stakeholders (shareholders, financial institutions, suppliers, buyers, customers, dealers, and different sources of design expertise) in the early stages of the development process to obtain information about customers' preferences, technological changes, and competitors (e.g., [9, 10, 43]). In this sense, the importance of new information and communication technologies involving many stakeholders to obtain the necessary information has been recognized by several recent papers (e.g., [44–46]). According to Zhong and Lou [44], rich information about competitors can be obtained through collaboration and communication with buyers and suppliers. In such a collaborative environment, with the use of expert consensus methods, it should be possible to make a reasonable estimation of the input data required by problem (5).
With the use of the possibility measure for interval numbers (see (A.2) in the Appendix), problem (5) can be transformed into problem (6), where *γ* and a second threshold associated with the time-related constraints are related to the phrase "with sufficient likelihood".
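One common possibility measure for comparing intervals, together with a *γ*-feasibility test for a budget constraint, might be sketched as follows. The formula below is a standard choice from the interval-comparison literature; the exact measure used in the paper is the one defined in (A.2) of the Appendix:

```python
def poss_leq(a, b):
    """Possibility that interval a is less than or equal to interval b.
    A standard possibility degree based on interval widths (assumed form)."""
    width = (a[1] - a[0]) + (b[1] - b[0])
    if width == 0:
        # both intervals are degenerate (crisp) numbers
        return 1.0 if a[0] <= b[0] else 0.0
    return max(0.0, min(1.0, (b[1] - a[0]) / width))

def gamma_feasible(consumption, budget, gamma):
    """A portfolio is budget-feasible 'with sufficient likelihood' gamma."""
    return poss_leq(consumption, budget) >= gamma
```

For example, with consumption (0, 10) and budget (5, 15), the possibility of feasibility is 0.75, so the constraint holds for *γ* = 0.7 but fails for a more conservative *γ* = 0.8.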

*Definition 7. *We say that a portfolio is (*γ*,)-feasible if and only if it fulfills the constraints in problem (6).

*Definition 8. *For *γ* and from Definition 7, let *a*_{i} and *a*_{j} be two (*γ*,)-feasible portfolios. We say that *a*_{i} *β*-dominates *a*_{j} (denoted *a*_{i}D(*β*)*a*_{j}) if and only if *Poss*(*f*_{k}(*a*_{i}) ≥ *f*_{k}(*a*_{j})) ≥ *β* (with *β* ≥ 0.5) for *k* = 1, …, *N*, and *Poss*(*f*_{k}(*a*_{i}) ≥ *f*_{k}(*a*_{j})) > *β* for some *k*.
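Definition 8 can be sketched as code, assuming the standard possibility degree for interval comparison (the paper's Appendix defines the exact measure used):

```python
def poss_geq(a, b):
    """Possibility that interval a >= interval b (assumed standard form)."""
    width = (a[1] - a[0]) + (b[1] - b[0])
    if width == 0:
        return 1.0 if a[0] >= b[0] else 0.0
    return max(0.0, min(1.0, (a[1] - b[0]) / width))

def beta_dominates(fa, fb, beta):
    """Sketch of Definition 8: fa, fb are lists of interval objective values
    of two feasible portfolios; beta >= 0.5 is assumed."""
    p = [poss_geq(x, y) for x, y in zip(fa, fb)]
    return all(v >= beta for v in p) and any(v > beta for v in p)
```

Note that two portfolios with identical objective intervals do not dominate each other: every comparison yields possibility exactly 0.5, so the strict condition fails.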

*Remark 9. *The level of conservatism of the DM increases with *γ*, , and *β*. Although such parameters are related to the DM's conservatism, they refer to different portfolio features and therefore do not necessarily have equal values.

*Definition 10. *A (*γ*,,*β*)-Pareto portfolio is defined as a (*γ*,)-feasible portfolio *a*_{i} in problem (6) such that there is no (*γ*,)-feasible portfolio *a*_{j} that fulfills *a*_{j}D(*β*)*a*_{i}.

*Remark 11. *The set of (*γ*,,*β*)-Pareto portfolios forms the (*γ*,,*β*)-Pareto frontier. The threshold parameters can be modified, thus exploring different degrees of conservatism. Obviously, the "best" solution to problem (6) is an element of the (*γ*,,*β*)-Pareto frontier for certain values of *γ*, , and *β*. Once these values have been set by the DM and the decision analyst, the best compromise solution to problem (6) depends on the DM's multicriteria preferences. In this paper, we are not interested in modelling the DM's preferences, but rather the DM's degree of conservatism facing the risk provoked by imperfect knowledge. Hence, our proposal is limited to generating the (*γ*,,*β*)-Pareto frontier. From this set, through a posteriori articulation of preferences, the DM will choose his/her best compromise.
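To illustrate how *β* shapes the frontier, the following brute-force sketch filters the non-*β*-dominated portfolios from a small candidate set. It is a stand-in for the evolutionary search of Section 3, and the possibility measure is the standard assumed form used above:

```python
def poss_geq(a, b):
    """Possibility that interval a >= interval b (assumed standard form)."""
    width = (a[1] - a[0]) + (b[1] - b[0])
    if width == 0:
        return 1.0 if a[0] >= b[0] else 0.0
    return max(0.0, min(1.0, (a[1] - b[0]) / width))

def beta_dominates(fa, fb, beta):
    p = [poss_geq(x, y) for x, y in zip(fa, fb)]
    return all(v >= beta for v in p) and any(v > beta for v in p)

def pareto_front(candidates, beta):
    """candidates: objective-interval vectors of feasible portfolios.
    Returns those not beta-dominated by any other candidate."""
    return [f for f in candidates
            if not any(beta_dominates(g, f, beta)
                       for g in candidates if g is not f)]
```

Raising *β* makes dominance harder to establish, so the frontier grows: a more conservative DM keeps more alternatives open instead of discarding them on weak evidence.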

Setting precise values of the conservatism-related thresholds is certainly demanding for the DM. Parameters *γ* and *β* were introduced by Balderas et al. [26] in the frame of interval-based project portfolio optimization. The present paper has introduced a third threshold to model the DM's attitude facing the risk related to time effects. As stated by Balderas et al. [26], those conservatism parameters should be set in a coconstructive process involving a DM-decision analyst pair. In terms of the conservatism thresholds, an interaction between the DM and the decision analyst is mandatory, in which the DM should understand the meaning of *γ*-feasibility and *β*-dominance. In [26] the reader can find illustrative examples of this interaction. The concept of a *γ*-feasible portfolio is easier to grasp than non-*β*-dominance; hence, the value of *γ* should be set first. This setting can be achieved by comparing the interval number associated with the available resources with different interval numbers representing potential levels of resource consumption (see [26]). As a result of these comparisons, acceptable values of *γ* can be identified.

Now, from the concept of non-*β*-dominance, the DM sets a starting value of *β*. Given *γ* and the starting *β*, solving problem (6) with different values of the time-effects threshold allows the DM to identify a value compatible with his/her level of conservatism facing the risk related to time effects; this setting will be illustrated in Section 4. Once *γ* and the time-effects threshold have been determined, solving problem (6) with different values of *β* helps the DM-analyst pair to identify good solutions that are non-*β*-dominated with appropriate levels of conservatism. As a consequence of this process, a good compromise can be determined that will be a non-*β*-dominated solution for some *β* in the identified interval.

When *N* = 2, 3, problem (6) can be efficiently solved by using the I-NSGA-II method proposed in [34]. In this range of evaluation criteria, human cognitive limitations are not an obstacle to comparing multiobjective solutions, and the DM can identify a final compromise. Higher-dimensional problems can be addressed by the I-NOSGA method [26], which has the capacity to handle many objective functions, incorporating DM preferences through the interval-based outranking approach by Fernández et al. [23].

#### 3. Brief Description of the Algorithm

This section presents the methodology, shown in Figure 1, based on the algorithm I-NSGA-II (cf. [34]). I-NSGA-II deals with interval multiobjective optimization problems and, in the proposed methodology, it is used to solve problem (5), i.e., the optimization of NPD project portfolios under time effects. The methodology works in three phases. In the first phase, it transforms the time effects of the instance *I*_{te} of problem (5) into an instance involving just intervals in objectives and constraints (denoted *I*). During the second phase, the instance *I* is solved using the I-NSGA-II algorithm in order to obtain an approximation of the RoI (denoted as *F*). Finally, the best portfolios are chosen from *F* (i.e., the portfolios that belong to front zero).

The I-NSGA-II algorithm, shown in Algorithm 1, is an adaptation of the most relevant strategies of its predecessor NSGA-II [47] (namely, *fast-nondominated-sort* and *crowding-distance-sort*) using interval mathematics (see Algorithms 2 and 3). The I-NSGA-II algorithm focuses on the creation of nondominated fronts using interval mathematics. In each generation of I-NSGA-II, the fronts are ordered, under the interval domain, based on the individuals' nondominance and crowding distance (see Lines 4 and 11, respectively). I-NSGA-II is supported by the *crowded-comparison operator* (or *crowding operator*); this operator guides the selection process to achieve diversity on the optimal interval Pareto front. The *crowding operator* employs 0.5 as the interval comparison threshold. Finally, I-NSGA-II uses the population generated through the previously described orderings (see Line 12) to create the set of individuals that will form part of the new population for the next generation (see Line 13). I-NSGA-II returns the best front obtained in the last generation (see Line 15).

Algorithm 1: I-NSGA-II.

    Input: population size n_pop, maximum number of generations T_max
    Output: nondominated front F_1 of the last iteration of the algorithm
    1.  Initialize: P_0 ← random feasible population, Q_0 ← make-new-population(P_0)
    2.  for T ← 0 to T_max do
    3.      R_T ← P_T ∪ Q_T
    4.      F ← interval-fast-non-dominated-sort(R_T)
    5.      P_{T+1} ← ∅, i ← 1
    6.      while |P_{T+1}| + |F_i| ≤ n_pop do
    7.          interval-crowding-distance-assignment(F_i)
    8.          P_{T+1} ← P_{T+1} ∪ F_i
    9.          i ← i + 1
    10.     end while
    11.     interval-crowding-distance-sort(F_i)  // ascending sorting by crowding distance
    12.     P_{T+1} ← P_{T+1} ∪ F_i[1 : (n_pop − |P_{T+1}|)]
    13.     Q_{T+1} ← make-new-population(P_{T+1})
    14. end for
    15. return F_1

Algorithm 2: interval-fast-non-dominated-sort.

    Input: a population P in which each individual is an interval vector representing a portfolio
    Output: individuals of P sorted in dominance fronts according to their level of dominance
    1.  Initialize: F ← ∅, F_1 ← ∅, S_p ← ∅, n_p ← 0
    2.  for each p ∈ P do
    3.      S_p ← ∅
    4.      n_p ← 0
    5.      for each q ∈ P do
    6.          if p dominates q (interval sense) then
    7.              S_p ← S_p ∪ {q}
    8.          else if q dominates p then
    9.              n_p ← n_p + 1
    10.     if n_p = 0 then
    11.         F_1 ← F_1 ∪ {p}
    12. i ← 1
    13. while F_i ≠ ∅ do
    14.     Q ← ∅
    15.     for each p ∈ F_i do
    16.         for each q ∈ S_p do
    17.             n_q ← n_q − 1
    18.             if n_q = 0 then
    19.                 Q ← Q ∪ {q}
    20.     i ← i + 1, F_i ← Q
    21. return F

Algorithm 3: interval-crowding-distance-assignment.

    Input: nondominated set I
    Output: none; the crowding distance value is assigned to each member of I in this method
    1.  Initialize: l ← |I|
    2.  for each i, set I[i]_distance ← 0
    3.  for each objective m do
    4.      I ← interval-sort(I, m)  // ascending order by the m-th objective
    5.      I[1]_distance ← I[l]_distance ← ∞
    6.      for i ← 2 to (l − 1) do
    7.          I[i]_distance ← I[i]_distance + (I[i+1].m − I[i−1].m) / (f_m_max − f_m_min)

Algorithm 2 is the pseudocode of the method *interval-fast-nondominated-sort*; this algorithm was originally proposed in [48], where it is explained in detail. The I-NSGA-II algorithm uses this method to rank the population *P*_{T} at each iteration *T* and to create the interval nondominated fronts.

Algorithm 3 is the pseudocode of the method *interval-crowding-distance-assignment*. The I-NSGA-II algorithm uses an interval extension of the method presented in [47]. The operations over the objective values and the ordering are performed through the interval mathematics described in the Appendix.
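A self-contained sketch of such an interval extension of the crowding distance is given below. Ordering intervals by their midpoints is our simplifying assumption; the paper orders with the interval mathematics of its Appendix:

```python
# Sketch: crowding distance over interval-valued objectives, ordering by
# interval midpoints (a simplifying assumption for illustration only).

def mid(a):
    """Midpoint of an interval (lo, hi)."""
    return (a[0] + a[1]) / 2.0

def crowding_distance(front):
    """front: list of objective-interval vectors; returns one distance per member.
    Boundary members get infinite distance, as in NSGA-II."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: mid(front[i][k]))
        lo, hi = mid(front[order[0]][k]), mid(front[order[-1]][k])
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue  # degenerate objective: no spread to normalize by
        for pos in range(1, n - 1):
            i = order[pos]
            dist[i] += (mid(front[order[pos + 1]][k])
                        - mid(front[order[pos - 1]][k])) / (hi - lo)
    return dist
```

As in the crisp NSGA-II, members with larger crowding distance lie in less dense regions of the front and are preferred during selection to preserve diversity.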

#### 4. Case Study: Portfolio Selection of New Product Development (PSNPD)

In order to validate the proposed model defined in problem (6) and its solution using the algorithm I-NSGA-II, we address here the NPD case of a company involving the features described in Section 4.1. The experimental design developed to evaluate the case study is described in Section 4.2, and the results and their analysis are presented in Section 4.3.

##### 4.1. Optimization Model

Let us consider a problem that is a particular case of project portfolio selection of new product development, characterized by the following circumstances:

1. The impact of the portfolio is described by *N* = 2 objectives, which involve imperfect knowledge.
2. There are *M* candidate projects of New Product Development.
3. The estimated impact of each project *j* over each objective *i* involves imperfect knowledge and is represented by an interval function. The impact of a given portfolio over each objective *i* is obtained by a simple aggregation of the impacts of its member projects.
4. The *budget* is the single class of resource involved in this problem. The available budget represents the estimated availability of this resource, and the estimated consumption of a given portfolio *a* is computed by adding the individual costs of its projects.
5. There is imperfect knowledge related to both the availability and the consumption of budget; in particular, the resource consumption of the *j*-th project is imperfectly known. The consumed budget of a portfolio (obtained by simple aggregation of the individual project costs) must not exceed the available budget.
6. No project has precedence relations; hence, the starting time of any project *j* of any portfolio *a* can be taken as zero.
7. The *j*-th project is associated with the following time components: (a) *t*_{c,j}, the moment at which the competition to the product developed by the *j*-th project becomes significant; (b) *t*_{old,j}, the moment at which that product can be considered old; (c) *t*_{comp,j}, its estimated completion time.
8. Certain functions have a strong dependence on the time conditions *t*_{c,j}, *t*_{old,j}, and *t*_{comp,j}, and their impact is defined according to (7), where the weak and strong degradation coefficients (the latter denoted *π*) are defined by the management team of each project in collaboration with the marketing department of the enterprise.

Equation (7) is a simplified model that reflects the impact on the *i*-th objective of the *j*-th project as a function of its completion time. Those effects can be read as follows. If a project is concluded before its competition has appeared, it provides the full benefit to the objectives it impacts. If a project is concluded after it has become obsolete, all the benefits on the affected objectives are reduced by the strong factor *π*. Finally, if the competition appears before the project is finished, its benefits are reduced by a weaker factor greater than *π*.
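The degradation rule of (7) can be sketched as follows. The function and parameter names are illustrative assumptions (`strong_factor` plays the role of *π*; the weak coefficient's symbol is not reproduced here); the default values match those used later in the experiment.

```python
def degraded_impact(impact, t_comp, t_c, t_old, weak_factor=0.5, strong_factor=0.0):
    """Degrade a project's nominal impact according to its completion time.

    t_comp: completion time; t_c: moment the competition becomes significant;
    t_old: moment the product becomes obsolete. strong_factor corresponds to
    the coefficient pi in the text; names are illustrative.
    """
    if t_comp < t_c:                 # finished before the competition appears
        return impact                # full benefit
    if t_comp >= t_old:              # finished after obsolescence
        return impact * strong_factor
    return impact * weak_factor      # competition appeared before completion
```

With the experiment's settings (*π* = 0, weak factor 0.5), a project finished after obsolescence contributes nothing, and one finished after the competition appears contributes half its nominal impact.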

This case study of the problem PPSP can be formally described by (8), which is an instance of problem (6). Note that a portfolio is a binary vector in which a 1 in the *j*-th position indicates the inclusion of the *j*-th project in the portfolio. In (8), *γ* and *δ* are thresholds related to the phrase “with sufficient likelihood”.

The present work proposes a method that solves problem (8) based on the definition of a (*γ*, *δ*, *β*)-feasible portfolio, i.e., a portfolio that satisfies the time-related and resource constraints defined previously (according to the conservatism levels *γ* and *δ* set by the DM) for a given dominance parameter *β*.

##### 4.2. Experimental Design

This section describes the experiment conducted to evaluate the performance of the proposed I-NSGA-II as a solver of problem (6) in the special case PSNPD described in Section 4.1. The organization is as follows: Section 4.2.1 presents the general information on the instances used in the experiments; Section 4.2.2 describes the different conservatism levels that were tested; and Section 4.2.3 details the experiments.

###### 4.2.1. Instance Definition

According to the description in Section 4.1, Table 1 shows the particular instance with M = 100, N = 2, and the budget used in the experiment. Let us recall that all the bold variables involve uncertain information expressed by intervals. For simplicity, the strong degradation factor *π* was set to zero, and the weak degradation factor was set to 0.5.

###### 4.2.2. Definition of Parameters *γ*, *δ*, and *β*

The level of conservatism is a crucial aspect of the way in which a DM faces imperfect knowledge in the addressed problem. Parameters *γ*, *δ*, and *β* are used here to model this aspect of the DM’s subjectivity. As stated in Remark 9, these parameters refer to different portfolio features and, therefore, do not necessarily take equal values. The decision maker-decision analyst pair is in charge of setting appropriate values for these conservatism levels. The concept of a (*γ*, *δ*)-feasible portfolio is simpler than that of non-*β*-dominance; hence, the values of *γ* and *δ* should be set first. Both parameters allude to constraints (on resources and time, respectively), so one could expect their values not to differ greatly. In this experiment, *γ* was set to 0.66, and *δ* to 0.66, 0.75, and 0.90. *β* was set to 0.5, which reflects a nonconservative attitude with respect to dominance.

###### 4.2.3. Experimental Design Based on the Expected Value Method (EVM)

This experiment analyses the proposed model against the situation in which the time-related imperfect knowledge is modelled by uniform probability distributions and the project performances are represented by their expected values.

Under uncertain conditions, risk-neutral DMs and other nonconservative DMs can make decisions based on average values. Risk-neutral DMs have linear utility functions; in risky events, their certainty equivalent equals the average value of the lottery. The use of average values also represents a simple way of handling uncertainty and imprecision, so this strategy can be used by DMs who are unfamiliar with (or reject) an in-depth modelling of imperfect knowledge.

The use of expected values eliminates the uncertain time effects (initially provided as intervals) through the computation of the expected impact on each objective *i* of every project *j*. The new impact is calculated as follows:

1. Generate realizations of the completion time *t*_{comp,j} and the competition time *t*_{c,j}, one from each of their respective intervals.
2. Assuming a strong correlation with the completion time, compute the realization of the obsolescence time as *t*_{old,j} = min(*lower*(*t*_{old,j}) + (*t*_{comp,j} - *lower*(*t*_{comp,j})), *upper*(*t*_{old,j})).
3. Compute the new impact on each objective subject to degradation under the current values of *t*_{comp,j}, *t*_{c,j}, and *t*_{old,j}.
4. Accumulate the degraded impact.
5. Repeat steps 1 to 4 1000 times.
6. Compute the average of the accumulated impact.
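The steps above can be sketched as a small Monte Carlo routine. Intervals are represented as (lower, upper) pairs; all names and default values are illustrative assumptions (`strong_factor` plays the role of *π*).

```python
import random

def average_impact(impact, tcomp_iv, tc_iv, told_iv,
                   weak_factor=0.5, strong_factor=0.0, n=1000, seed=42):
    """Monte Carlo average of a project's time-degraded impact.

    Realizations of the completion and competition times are uniform over
    their intervals; the obsolescence time is strongly correlated with the
    completion time, as described in the text.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        t_comp = rng.uniform(*tcomp_iv)
        t_c = rng.uniform(*tc_iv)
        # strong correlation: obsolescence shifts together with completion
        t_old = min(told_iv[0] + (t_comp - tcomp_iv[0]), told_iv[1])
        if t_comp < t_c:                       # full benefit
            total += impact
        elif t_comp >= t_old:                  # strong degradation
            total += impact * strong_factor
        else:                                  # weak degradation
            total += impact * weak_factor
    return total / n
```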

The average contributions of the projects for the instance shown in Table 1 are presented in Columns 2-3 of Table 2. The average contributions of the *j*-th project are treated as the contributions of the *j*-th project of a new instance, one in which the time-related effects have already been accounted for by the averaging process.

A risk-neutral decision maker may focus his/her attention on the average values provided in Table 2 and make decisions based on those values. On this basis, the next step consisted in using I-NSGA-II to solve the problem derived from the definition of the objective impacts due to time effects shown in Table 2, where the information on the average projects appears in Table 3; the requirements are the same as those given in the second column of Table 1.

Solving problem (9) over 30 I-NSGA-II runs, 88 nondominated solutions were obtained. The ideal solution is formed by identifying, using interval mathematics, the maximum impact value for each objective in the nondominated set of solutions.

In order to identify the nondominated solution closest to the ideal, we used the mean point of each interval as the representative of the intervals that describe solutions in the objective space. Having computed the mean points for each solution provided by I-NSGA-II, the Euclidean distance was used to identify the solution closest to the ideal (which was also defined by mean points). The closest solution in the objective space for the EVM is shown in Table 3, and its counterpart in the portfolio space is shown in Table 8, along with the portfolios corresponding to the different levels of conservatism considered.
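This midpoint-based selection can be sketched as follows; function names are illustrative assumptions.

```python
import math

def midpoint(iv):
    """Mean point of a (lower, upper) interval."""
    return (iv[0] + iv[1]) / 2.0

def closest_to_ideal(solutions):
    """Index of the solution whose interval midpoints are closest (Euclidean)
    to the ideal point formed by the per-objective maxima of the midpoints."""
    mids = [[midpoint(iv) for iv in sol] for sol in solutions]
    ideal = [max(col) for col in zip(*mids)]
    def dist(m):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(m, ideal)))
    return min(range(len(solutions)), key=lambda i: dist(mids[i]))
```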

The portfolio built using the EVM is justified in the sense that a risk-neutral DM would choose it, while someone with a more conservative attitude would feel more comfortable with a portfolio composed of less risky projects. A risk-averse DM would feel great regret for projects whose results became obsolete. Since most decision makers are risk-averse, a method to control the regret for delayed products should be welcome.

##### 4.3. Results Obtained by Our Proposal

After 30 runs of I-NSGA-II solving problem (8) with different levels of conservatism related to the time constraints, we obtained the results given in Table 4.

The portfolios obtained with *δ* = 0.66 contain more projects, but these are riskier: a significant part of the supported projects will yield either no benefits or only a portion of them.

The single solution (in the objective space) corresponding to *δ* = 0.9 is provided in Table 5. Tables 6 and 7 show the solutions closest (in the Euclidean sense) to the ideal one in the objective space for *δ* = 0.66 and *δ* = 0.75, respectively. In the portfolio space, these solutions, denoted *C*_{0.9}, *C*_{0.66}, and *C*_{0.75}, are shown in Table 8.

In order to compare the solutions obtained by I-NSGA-II with the one obtained using the EVM, the best portfolios were compared in terms of the number of failed projects and the wasted budget. For this purpose, the time effects derived from *t*_{comp,j}, *t*_{c,j}, and *t*_{old,j} were simulated using a sample of 1000 possible outcomes of their impact. Each outcome is calculated from realizations of *t*_{comp,j}, *t*_{c,j}, and *t*_{old,j}. The first two are randomly chosen from their intervals (the values in each range are considered uniformly distributed); the realization of *t*_{old,j} is calculated assuming a strong correlation with *t*_{comp,j}, taking *t*_{old,j} = min(*lower*(*t*_{old,j}) + (*t*_{comp,j} - *lower*(*t*_{comp,j})), *upper*(*t*_{old,j})).

Now, the performance of a particular portfolio *p* is measured in terms of the number of *successful projects*, *partial failure projects*, and *complete failure projects*, and the *wasted budget* derived from it, under the following rules: if *t*_{comp,j} ≥ *t*_{c,j} and *t*_{comp,j} < *t*_{old,j}, then the *j*-th project in *p* is considered a *partial failure*, because the project was finished after the competition appeared in the market and part of its impact is degraded; if *t*_{comp,j} ≥ *t*_{old,j}, then the *j*-th project in *p* is considered a *complete failure*, because its relevance disappeared even before it could be finished; and when *t*_{comp,j} < *t*_{c,j}, the *j*-th project is considered successful and its budget is counted as well invested. Table 9 presents a summary of the numbers of *successful*, *partial failure*, and *complete failure* projects derived from the portfolios created by I-NSGA-II and the EVM; Table 10 presents the corresponding information on the budget that is *well invested*, *partially wasted*, or *completely wasted*.
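These rules can be sketched as a small classifier, written to be consistent with the degradation model of (7); the function name is an illustrative assumption.

```python
def classify_outcome(t_comp, t_c, t_old):
    """Classify one realized project outcome.

    t_comp: realized completion time; t_c: realized moment the competition
    becomes significant; t_old: realized obsolescence moment.
    """
    if t_comp < t_c:          # finished before the competition appeared
        return "success"
    if t_comp >= t_old:       # relevance disappeared before completion
        return "complete failure"
    return "partial failure"  # competition appeared first, impact degraded
```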

Based on the results presented in Tables 9 and 10, we can conclude that the outcomes improve with an increment of the level of conservatism. This is mainly because the probability of having failed projects is reduced by 60% and, more importantly, drops to zero in the case of complete failures; the wasted budget can also be strongly reduced. A risk-averse decision maker should therefore prefer *C*_{0.90}. Note that portfolio *C*_{0.90} does not use the whole available budget because the optimization process did not find enough nonrisky projects; hence, the portfolio is very robust with respect to the imperfect information on resource availability and consumption. This portfolio has no budget invested in projects that end in complete failure.

Concerning the question of the appropriate value of *δ*, we note that the solution with *δ* = 0.75 is no more conservative than the risk neutrality defined by the EVM. It seems that risk-averse decision makers should set values of *δ* close to 0.9; this should be confirmed by further experimentation.

#### 5. Conclusions

In manufacturing enterprises, an applied research project becomes, in its last phase, an NPD project: oriented to the market and with results strongly influenced by time-related effects, mainly those concerning competitors and technological changes. These time dependencies are usually neglected in the project portfolio optimization literature, but they must be taken into account in NPD project evaluation.

The research presented in this work involves a solution method for the portfolio selection problem of new product developments (NPD) under uncertainty and imprecision, considering time-interdependence effects among different projects and imperfect knowledge of completion times. The three important characteristics of NPD projects included in the developed model are uncertain market payoffs that change over time; strong dependence of benefits on imperfectly known project completion times, technological innovations, potential competitor products, and their interactions; and probable precedence of applied research projects. The time-related effects considered are the completion times of the projects, the moment at which the competition becomes relevant, and the moment at which a project is outdated and possibly no longer of interest. All of them impact one or more of the objectives that characterize project performance, degrading their values and hence making it important to study the handling of risk, particularly for risk-averse decision makers.

The proposed optimization model integrates the following novel characteristics: management of imperfect knowledge through interval mathematics; incorporation of the decision maker's conservatism level in the optimization process; and a set of parameters that allows the adjustment of this conservatism level, at least in the management of the budget and of the constraints derived from time-related effects. The multiobjective optimization problem is solved by the interval-based evolutionary algorithm I-NSGA-II, which approximates the Pareto frontier in the interval domain. This method inherits the limitations of NSGA-II; i.e., it works well only with 2 or 3 objectives. In cases where the portfolio is described by N > 3 objectives, a different solution algorithm would be required, e.g., the one published in [26].

To be operational, the proposal requires a significant volume of information, although not necessarily precise: (a) mathematical models of the way in which the impact of the projects depends on time; (b) time-related interdependencies among projects; (c) other interdependencies among projects; and (d) three parameters describing different aspects of the DM's conservatism in facing uncertainty and imprecision. It is important to underline that the DM is not alone in this task. The DM-decision analyst pair should be supported by experts, project manager teams, and marketing departments. Fortunately, in large manufacturing enterprises each NPD project is managed and supported by an expert team. This team, working closely with the marketing department, is in charge of collecting the necessary information. Several recent papers insist on involving many stakeholders in the early stages of NPD processes to obtain information about customers' preferences, technological changes, and competitors (e.g., [43, 44]). The DM is required only to reflect his/her attitude toward the risk produced by imperfect knowledge and to make decisions about the final portfolio.

The adoption of our proposal would imply challenges and new tasks for marketing departments and for the teams supporting NPD projects. In this sense, our approach requires some organizational changes to be fully operational.

Handling the imprecision and uncertainty that are unavoidable in such a volume of information is a crucial point. As stated by Fernández et al. [23], expressing the values of parameters and estimations as intervals is a much easier and more manageable task than setting precise values. Such interval-based models can also better satisfy groups of experts with conflicting opinions, because they can agree more easily on a range of values than on a single one.

The proposal was tested through an experimental design whose purpose was to compare the results achieved by the new method against the expected-value-based portfolio. This strategy computed the average of a sample of possible outcomes of how a portfolio might end after considering the time-related effects. The results revealed that high levels of conservatism may prevent wasting resources on failed projects, i.e., projects that, if chosen as part of a portfolio, would see their impact reduced drastically by the time-related effects. The results also show that, by adjusting the parameter characterizing the time-related conservatism, the decision maker can control, reduce, and even eliminate the regret produced by the budget wasted on failed projects. The example seems to suggest that time-related conservative portfolios are associated with resource-related conservative portfolios; this should be confirmed by extensive tests.

#### Appendix

The concept of interval number originated in the so-called interval analysis theory (Moore, 1962). Such a number represents a numerical quantity whose exact value is unknown. Moore (1962) describes an interval number in terms of a range bounded by a lower limit and an upper limit. A real number is the particular case of an interval number in which both limits coincide.

Basic arithmetic operations over interval numbers can be defined by operating on their lower and upper limits. Yao et al. (2011) introduced certain order relation rules over interval numbers. This order relation rests on a possibility measure of one interval being greater than or equal to another, and it is easy to prove that this possibility measure is equivalent to a simple closed-form expression on the interval limits.
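As an illustration, the basic operations can be sketched for intervals represented as (lower, upper) pairs. This is standard interval arithmetic, not a reproduction of the paper's notation; division assumes the divisor does not contain zero.

```python
def i_add(a, b):
    """Interval addition: add limits componentwise."""
    return (a[0] + b[0], a[1] + b[1])

def i_sub(a, b):
    """Interval subtraction: lower minus upper, upper minus lower."""
    return (a[0] - b[1], a[1] - b[0])

def i_mul(a, b):
    """Interval multiplication: min/max over the four endpoint products."""
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

def i_div(a, b):
    """Interval division; assumes 0 is not contained in b."""
    return i_mul(a, (1.0 / b[1], 1.0 / b[0]))
```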

If both interval numbers are real numbers, the possibility measure reduces to the usual comparison of reals. The order relation between two interval numbers is then defined as follows: (i) if the possibility measure equals 0.5, the two interval numbers are considered equal; (ii) if it is greater than 0.5, the first interval number is considered greater than the second; (iii) if it is less than 0.5, the first interval number is considered smaller than the second; the degenerate cases in which one of the numbers is real follow the same rule.

A real number within the range of an interval number is said to be a *realization* of that interval number. The possibility measure is interpreted as a likelihood degree of the statement “once both realizations are determined, the realization of the first interval number will be greater than or equal to that of the second”.
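A commonly used closed form of such a possibility degree can be sketched as below. This is a standard formulation from the interval-ordering literature and is assumed, not copied, to match the expression the paper attributes to Yao et al.

```python
def possibility_geq(a, b):
    """Possibility degree that interval a = (a_lo, a_hi) is >= interval
    b = (b_lo, b_hi), using a common width-normalized formulation."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    width = (a_hi - a_lo) + (b_hi - b_lo)
    if width == 0:                    # both intervals are real numbers
        return 1.0 if a_lo >= b_lo else 0.0
    # overlap of a's upper part past b's lower limit, clipped to [0, 1]
    return min(max((a_hi - b_lo) / width, 0.0), 1.0)
```

Under this form, identical intervals compare with degree 0.5 (interpreted as equality), disjoint intervals give 1.0 or 0.0, and partial overlaps yield intermediate degrees.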

#### Data Availability

The experimental design data used to support the findings of this study are available from the corresponding author upon request.

#### Conflicts of Interest

The authors declare that they have no conflicts of interest.

#### Acknowledgments

The authors want to thank the support from CONACYT projects nos. 236154 and 3058, the CONACYT project titled A1-S-11012-“Análisis de Modelos de NO-Inferioridad para incorporar preferencias en Problemas de Optimización Multicriterio con Dependencias Temporales Imperfectamente Conocidas”, and the TecNM project no. 5797.19-P.