Abstract

This study uses the concept of alignment as a framework to examine empirical research on the impact of entrepreneurship education interventions on students. Alignment assumes that effective instruction requires congruence between three instructional components: intended outcomes, instructional processes, and assessment criteria. Given the diversity and complexity of entrepreneurship education impact, scholars have not been able to explain how teaching approaches and methods are adjusted to the variety of expected outcomes. To address this gap, we critically reviewed the published empirical studies on entrepreneurship education impact in 20 journals over a 15-year period (2000–2015). We found 16 empirical studies that met our inclusion criteria. Our findings reveal that teaching objectives, teaching methods, and teaching content receive scant attention from researchers. This study will be of value to scholars researching the impact of heterogeneous entrepreneurship education practices and approaches on individuals. Our analytical framework could help reduce the contradictory findings reported in entrepreneurship education impact studies. We also identify research limitations and suggest avenues for future research.

1. Introduction

The impact of entrepreneurship education has received increasing attention in recent review studies [16]. Prior reviews have for the most part applied a methodological-rigour lens to the extant empirical studies on entrepreneurship education impact [2, 3, 7]. Consequently, entrepreneurship education researchers are aware of the methodological weaknesses in the extant literature and have recommendations on how to overcome them. Yet, scholars have not been able to explain whether teaching approaches and methods are adjusted to the various expected outcomes [8], particularly given the diversity and potential complexity of the entrepreneurial learning outcomes that educators might impact [9]. This study attempts to address this gap by asking: how have the objectives, delivery modes, and impact assessment been aligned in the empirical entrepreneurship education literature?

It is customary to differentiate between a “narrow” and a “wide” view of entrepreneurship in education. The narrow definition assumes that individuals should be encouraged to start up their own businesses, while the wide definition puts emphasis on making individuals more opportunity oriented, creative, self-reliant, proactive, and innovative [10]. The definition that one adheres to affects educational objectives, content design, target audience, teaching methods, and assessment procedures, leading to a vast variety of existing approaches [5]. In the context of this study, we support both views and define entrepreneurship education as a diverse and eclectic phenomenon. This implies that we study entrepreneurship education interventions that employ various curricula and teaching methods and pursue various outcomes.

Consistent with Rideout and Gray [7], we conducted a literature review of relevant empirically based studies between 2000 and 2015 (we include articles published until January 1st, 2016). Following Henry et al. [11], we performed a within-journal search. We targeted major entrepreneurship, small business, and management journals and added two specific journals that widely publish on the topic of entrepreneurship education. We brought the concept of alignment into focus to examine the identified empirical studies. In carrying out this research we applied a fundamental concept of effective instruction, instructional alignment, to the entrepreneurship education literature on impact assessment. The evaluation of impact is a key dimension of any teaching intervention and therefore needs to be considered at the intervention design stage [12]. Impact assessment allows for the evaluation of entrepreneurship education outcomes. As there is a stream of research concerned with the impact of entrepreneurship education in higher education [6, 7, 13], there is a need to analyse the current knowledge; thus our focus is on entrepreneurship education at the university level. We undertake a literature review of empirical impact studies through this new analytical lens over a 15-year period. This paper contributes to the extant entrepreneurship education literature by responding to the need for theory-driven frameworks to assess the impact of entrepreneurship education interventions (cf. [10, 12]). Our study should be of value to scholars researching the impact of heterogeneous entrepreneurship education practices and approaches on individuals. Additionally, our suggested framework could help reduce the contradictory findings of entrepreneurship education impact studies, given that it can be replicated to assess a given entrepreneurship education initiative. We believe that our paper adds to the improvement of the quality of research on the impact of entrepreneurship education interventions and supports the decisions of educators and policymakers to stimulate and further invest in entrepreneurship education initiatives.

This article begins by explaining the origin and essence of the alignment concept. We then introduce the analytical framework that sets up the research context for this study. The methodology section outlines our search strategy, selection steps, and data analysis approach. Next, we describe the findings from our analysis. Finally, we derive conclusions, briefly state limitations, and outline theoretical and practical considerations.

2. The Concept of Alignment

Teaching entrepreneurship represents a system that incorporates many interacting elements: teachers, students, teaching context, learning activities, and outcomes. According to von Bertalanffy, as cited in Biggs ([14], p. 350), any system with elements that constantly interact with each other strives to reach a stable equilibrium. A similar pursuit of balance and congruence is relevant when addressing the issue of impact assessment in entrepreneurship education. One way to attain the necessary congruence between the key elements of an entrepreneurship education intervention is to adopt the principle of alignment.

According to Cohen [15], the idea of applying the principles of instructional alignment to teaching is not novel, and instructional alignment is a well-established phenomenon in the domain of instructional design. Cohen's [15] research concluded that aligned instruction produced an effect about four times greater than nonaligned instruction, even with little instructional effort. Further, building on the assumption that students construct their own learning, Biggs [14] successfully merged the constructivist stream of learning with the concept of alignment to create constructive alignment, making it “one of the most widely read, cited and applied conceptual frameworks in higher education” ([16], p. 929).

Indeed, Biggs [14, 17] succeeded in integrating a working version of constructivism with instructional design at three points of intersection. First, the curriculum, in the form of objectives, is clearly stated in terms of content-specific levels of understanding that imply appropriate teaching/learning activities. In turn, the teaching/learning activities require students to do the things that will likely elicit the objectives. Finally, the assessment tasks address those same objectives, verifying whether the students have learned what the objectives declare. The author suggested that the alignment concept constitutes an entirely criterion-referenced system with the highest level of consistency throughout [17].

Despite the popularity of this integrative concept for enhancing the quality of teaching and learning [18], the framework has not been examined in detail in entrepreneurship education [16], although a few exceptions exist. Jones [19] used the process of constructive alignment to evaluate a framework for organizing enterprise curricula. More recently, Macht and Ball [16] relied upon the framework of constructive alignment, together with other established educational frameworks, to propose a novel framework for entrepreneurship education. Another indication in the extant entrepreneurship education literature of the value of constructive alignment is a study by Mwasalwiba [5], who notes the usefulness of the alignment concept in guiding the choice of course objectives, course content, and teaching methods, and in determining how the impact of desired outcomes should be assessed.

Following the tenet of constructive alignment that learning is most effective if curriculum objectives, teaching methods, and assessment are aligned, our paper suggests that research on entrepreneurship education can benefit greatly if researchers perceive the principle of alignment as relevant for assessing the impact of entrepreneurship education. In this study, we seek balance in studying impact assessment, despite the argument that, for some, entrepreneurship theory, and thus entrepreneurship education, does not mirror the heterogeneity and intricacy of entrepreneurial practice [20]. Another argument that seems omnipresent is that entrepreneurship education is a young research discipline with a body of knowledge that is still inexplicit. Other reasons include the heterogeneity of entrepreneurship education, which limits standardization across institutions, faculty, and students, and the influence of nonacademic practitioners who teach what and how they like [21]. By looking at impact assessment as an aligned system, we can not only mitigate these concerns but also study the impact of entrepreneurship education interventions in a reasonably effective and consistent manner.

Drawing on the concept of alignment, Table 2 presents the framework that guided the analytical process of this study. It is recognized that educators readily pursue high-level aims, like teaching for deep understanding, in the courses they offer; however, they usually struggle to break down their aims into explicit objectives [14]. It is a demanding and important task both to formulate the curriculum objectives and to make them meaningful and understandable for students [14]. Entrepreneurship educators should decide on, and make students aware of, the clear objectives they wish to achieve by the time the course or program is finished. For this reason, objective(s) is the first element of our analytical framework.

The educators' next step is to develop the teaching/learning activities that specifically respond to the explicit objectives. Teachers are free to select appropriate teaching/learning activities. As Biggs ([14], p. 354) rightly claimed: “selecting appropriate teaching/learning activities is a matter of experience and judgement.” Since the activities are nested in a context, it is also the teacher's responsibility to create an environment that is beneficial and engaging for students to attain the required objectives [17]. This step implies that entrepreneurship educators, in spite of their conceptual differences in understanding how entrepreneurship should be taught and the variety of teaching contexts, ought to include teaching/learning activities that support their defined objectives. To merge teaching/learning activities and the content to be taught into one category, we use the term delivery mode, which is the second element in the framework.

The next decision concerns how the stipulated curriculum objectives may be assessed. It is essential to apply an appropriate assessment task that permits the evaluation of individual student performance [14, 17]. The decision of how impact will be assessed is critical for entrepreneurship educators and for scholars who study impact assessment. An appropriate assessment task allows students to demonstrate what they have learned with respect to the expectations set up in the objectives. It simultaneously allows scholars to make assessments and interpretations and to clearly convey the effect of an educational intervention on students using appropriate assessment methods and indicators. Thus, methods and indicators form our third element, which we term impact assessment mode.

In the context of this study we define intervention in a wide sense: it is any entrepreneurship program, course, or training. Impact assessment focuses on the outcomes of the interventions and seeks causality between an intervention and a marked effect or influence. Thus, it leads to the attribution of specific impact measures to specific interventions [22]. According to Hulme [22], predetermined objectives of the intervention are necessary for impact assessment to take place. Knowing the actual impact, we may be able to say something about the effectiveness of an entrepreneurship education intervention.

3. Method

3.1. Literature Review

In order to search for and analyse entrepreneurship education impact studies through the lens of alignment, we conducted an exhaustive and systematic review (see Table 3) of the entrepreneurship education literature (cf. [7]). We identified relevant peer-reviewed journal articles from the past 15 years, between January 2000 and December 2015 (inclusive). We explicitly targeted only empirical studies with university-based (including business school) students as the focal point; given the nature of our research goal, to review the impact of actual entrepreneurship education interventions, we deliberately excluded conceptual papers and literature reviews. Our focus on entrepreneurship education interventions in a university setting is governed by a fast-growing body of empirical research on entrepreneurship education outcomes that needs to be studied [6]. We built our search and selection strategy on the established method of systematic literature review [18, 23], which has been recognized within the entrepreneurship field in general (e.g., [24–26]) and within the field of entrepreneurship education in particular (e.g., [4, 5]). Following Henry et al. [11], we started by compiling a list of corresponding journals within which to do our literature search. The advantage of a focused journal-led search method, compared to a general Boolean search across broad business databases and/or search engines (e.g., [2, 5]), lies in the more manageable filtering of the search hits resulting from step one [24].

3.2. Data

Fourteen of the twenty journals in our set were selected on the basis of their inclusion in the Association of Business Schools (ABS) Academic Journal Quality Guide (Journal of Business Venturing, Entrepreneurship Theory and Practice, Strategic Entrepreneurship Journal, Journal of Small Business Management, Small Business Economics, Entrepreneurship and Regional Development, International Small Business Journal, Family Business Review, International Journal of Entrepreneurship and Innovation, Journal of Family Business Strategy, Journal of Small Business and Enterprise Development, Venture Capital: An International Journal of Entrepreneurial Finance, and International Entrepreneurship and Management Journal) and the Entrepreneurship and Small Business subject field (Journal of Enterprising Culture is omitted, since Scopus does not cover this journal). Such inclusion indicates the quality of the publishing outlets and thus their suitability for reviews of research outputs [27]. Four journals represented the Academy of Management (Academy of Management Journal, Academy of Management Review, Academy of Management Perspectives, and Academy of Management Learning and Education); in addition, we included two journals, Education and Training and Journal of European Industrial Training, that widely publish on the topic of entrepreneurship education and are well represented in previous reviews (see, e.g., [2, 5, 6]). As previous studies of entrepreneurship education have selected the same journals, we view this as shared agreement that these journals are recognized as the most important, at both submission and publication levels, in the area of entrepreneurship education. This is particularly the case for Education and Training and the Journal of European Industrial Training. We did our searches through Elsevier's Scopus using a generic Boolean keyword search with one string, (entrepr OR enterprise) AND (education), in the title, abstract, and keywords fields. The entire process of our systematic search is illustrated in Figure 1. The first step generated 615 hits. In the next step, we conducted a manual review of each of the 615 hits. During the first round of screening, one of the coauthors manually examined the titles and abstracts, excluding all nonempirical studies. Next, both authors went into the main body of the remaining articles and filtered them based on the following relevancy criteria: (a) university-level students (including business schools) being involved in an entrepreneurship education intervention and (b) some type of impact assessment taking place. A total of 51 studies met our filtering criteria.
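For concreteness, the following minimal sketch expresses the Boolean query string and the two screening rounds as code. The record structure, helper function, and sample data are our own illustrative assumptions (in Python); only the query string and the relevancy criteria come from the procedure described above.

```python
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    abstract: str
    is_empirical: bool          # judged manually from titles and abstracts
    university_students: bool   # criterion (a): university-level students involved
    impact_assessed: bool       # criterion (b): some type of impact assessment

# Boolean string applied to the title, abstract, and keywords fields (from the text).
QUERY = "(entrepr OR enterprise) AND (education)"

def screen(hits):
    """Apply the two screening rounds described above."""
    # Round 1: exclude nonempirical studies.
    empirical = [r for r in hits if r.is_empirical]
    # Round 2: keep studies that meet both relevancy criteria (a) and (b).
    return [r for r in empirical if r.university_students and r.impact_assessed]

if __name__ == "__main__":
    sample = [
        Record("A conceptual model of enterprise education", "...", False, True, False),
        Record("Course impact on entrepreneurial intentions", "...", True, True, True),
    ]
    print(f"Query used: {QUERY}")
    print(f"Retained after screening: {len(screen(sample))} of {len(sample)}")
```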

In the next step, the authors perused each article to decide if it provided sufficient information for each element of the analytical framework. This step was especially time-consuming and required not only meticulous reading but also substantial reflection, as the majority of studies were not explicit about their objectives, delivery mode, and impact outcomes. This third step refined our results to a total of 16 empirical studies, which were included in the review process. We developed a detailed reading guide with particular emphasis on the three categories of our analytical framework. A variety of other categories, such as intervention type (e.g., program, course, or training), key research question, focus or purpose, and methodological approach, were also included in the guide (see Table 4).
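To make the structure of the guide concrete, the sketch below renders its categories as a simple data structure. The field names are our own rendering of the categories listed above, not the authors' actual coding schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReadingGuideEntry:
    study_id: str
    # The three elements of the analytical framework:
    objectives: Optional[str]              # stated curriculum objective(s), if any
    delivery_mode: Optional[str]           # teaching/learning activities and content
    impact_assessment_mode: Optional[str]  # assessment methods and indicators
    # Further categories mentioned in the text:
    intervention_type: str                 # e.g., "program", "course", or "training"
    key_research_question: str
    focus_or_purpose: str
    methodological_approach: str           # e.g., "quantitative", "qualitative", "mixed"
```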

The authors together constructed a reading guide that allowed us to analyse each study in detail. First, the authors independently analysed 5 of the 16 articles, filled in the guide, and then discussed and agreed on definitions and conflicting interpretations. This step helped ensure a high level of reliability. Next, the authors read the remaining manuscripts independently and discussed discrepancies where they occurred. The reading guide was employed by the authors to manually fill in the categories presented in the guide, with particular emphasis on the objective(s), delivery mode, and impact assessment mode adopted in the articles. Since the three elements of our analytical framework were not always stated explicitly, the authors needed to read the articles while carefully pondering the substance of the three elements. Table 5 presents the results for objective(s), delivery mode, and impact assessment mode from our review, while Table 6 introduces snippets of some of the evidence collected from our review. The next section introduces the findings from our review of the 16 published research articles.
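The study reports no formal agreement statistic; as a purely hypothetical illustration, double-coded categories such as these could be quantified with simple percent agreement between the two readers:

```python
def percent_agreement(coder_a, coder_b):
    """Share of items on which two readers assigned the same category."""
    assert len(coder_a) == len(coder_b), "Both readers must rate the same items"
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical ratings of delivery-mode detail for the five pilot articles.
reader_1 = ["well-articulated", "broad", "unspecified", "broad", "well-articulated"]
reader_2 = ["well-articulated", "broad", "broad", "broad", "well-articulated"]
print(f"Agreement: {percent_agreement(reader_1, reader_2):.0%}")  # Agreement: 80%
```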

4. Results

4.1. Descriptive Analysis

Table 1 summarises, by journal, the identified empirical articles that fulfilled our selection conditions. The results show that no article was published before 2006, while 5 articles were identified in the 2006–2011 period and 11 articles were published between 2012 and 2015. Table 1 also shows that the studies are not spread across many journals: only a handful, 6 of the 20 initially selected outlets, are represented in the table. Education and Training and Journal of Small Business Management are the top two journals, with 4 and 5 relevant studies, respectively. The lack of studies published in other journals could indicate that the concept of alignment has not been embraced, owing to its novelty and the lack of detailed examination in the field of entrepreneurship education.

Our review of the 16 studies showed that a variety of definitions have been applied to determine the boundaries of entrepreneurship programs. First, we observed that some studies lack an explicit definition of which intervention type has been studied [28] and may change terms throughout a study [13]. Second, our analysis also reveals that the interventions vary greatly in duration and at times do not match what one would consider a program or course. For instance, several studies did not report the time period of the intervention [25, 29, 30]. Others describe a course lasting from three days, or twenty-four hours of class in total [1], to one semester [31]. Articles with a program specified as the intervention show a similar pattern: two studies report a six-week program [32, 33], while other programs ran for twelve months [34] and around five months [35], respectively.

A review of the theoretical perspectives adopted in the selected studies indicates that the Theory of Planned Behaviour holds a dominant position. Two studies do not specify any theoretical stance [33, 36]. Other theoretical perspectives, such as action regulation theory, human capital theory, and structuration theory, are used and tested in isolated instances [28, 32, 37]. With regard to methodological approaches, the absolute majority of the studies are quantitative (12 of the 16). There are only four notable exceptions: one qualitative article [25] and three that employ a mixed method approach [28, 31, 32]. Among all the studies, the questionnaire is the most common instrument for data collection, while researchers also used interviews [25]. In what follows, we structure the analysis according to the three elements of the analytical framework, drawing upon selected data from the studies to exemplify our points.

4.2. Objectives of the Interventions

The analysis shows a diverse set of program/course objectives. The objectives span from unspecified [35, 38] through the rather generic to the more specific. Objectives like the “programme is targeted to final year students in higher education institutes who are interested in undertaking business opportunities in the future before they graduated” ([30], p. 607), “the programme aims to prepare students for an entrepreneurial career, specifically to prepare them for establishing their own businesses” ([34], p. 192), or cultivating professionals [36] offer a good illustration of the generic motive running through some studies. We argue that such generic motives offer no clear delineation of objectives, which remain implicitly formulated. With regard to more specific objectives, some educators set out to raise students' awareness of entrepreneurship and its key related issues [1, 8] or to encourage an enterprising student mindset [25]. Others go a step further and divide their objectives into two parts: encouraging students to consider an entrepreneurial career and providing entrepreneurial skills and knowledge of the business planning process [29]. Fretschner and Weber [31], for example, take a composite approach to stating objectives: they aim to change individuals' belief systems that will ultimately drive their attitude toward entrepreneurship, to make students aware of self-employment as a viable future occupation, and to affect an individual's personal attitude toward entrepreneurship.

4.3. Delivery Mode

The teaching/learning activities should reflect methods that engage students and require them to perform in the way specified in the curriculum objectives. It is the teacher's role to encourage students by enabling appropriate and stimulating learning-related activities. The activities are selected because their function and purpose are meant to be coherent with the educator's total teaching system [14]. This review demonstrates that teaching/learning activities can be divided into several categories based on the level of detail found in their descriptions. The activities range from unspecified [31] to broadly described [35] to well articulated [28, 29, 34].

With regard to the broadly outlined methods, Souitaris et al. ([35], p. 574) indicate that “a wide range of activities across all four components (formal teaching of courses, business planning, interaction with practice and university support)” were offered in the program. Similarly, another study briefly states that the program was dedicated to entrepreneurship topics and covered various situations such as starting a new venture, acquiring an existing business, and venturing within corporations [8]. These two cases illustrate vagueness in outlining the delivery mode. By contrast, some authors describe the teaching/learning activities and the content to be taught with greater rigour. Across all 16 empirical studies, there are a number of notable examples of such rigour. One is the paper by Rauch and Hulsink [34], who used various methods: (1) communicating theoretical knowledge through structured lectures, biography analysis of entrepreneurs, and the case teaching method and (2) providing practice-oriented classes that involved activities like field projects, mentoring sessions with entrepreneurs, and pitches. Another study, by Gielnik et al. [28], reviewed an entrepreneurial training that used an active learning approach to cover twelve topics from the domains of entrepreneurship, business management, and psychology in twelve modules over a period of twelve weeks. The modules were chosen based on comprehensive literature reviews of pertinent topics and content in entrepreneurship education.

We found that in the studies that outline their methods, whether broadly, moderately, or in well-articulated detail, conventional teaching methods (i.e., lectures and seminars, case studies, and class discussions) are dominant, followed by mixed methods (i.e., lectures and practice-oriented classes, lectures and independent projects, and classroom sessions and fieldwork). This illustrates adherence to traditional pedagogical methods, while also showing evidence of educators' willingness to rely increasingly on the action-learning approach to delivering entrepreneurship education.

4.4. Measurement of Impact Assessment

When deciding on an impact assessment indicator, it is necessary to consider the extent to which it embodies the target objectives of an intervention and how suitable it is for assessing the impact. It can be useful to reverse the question and ask what degree of impact a chosen assessment approach is likely to draw out. An inappropriate mode of impact assessment of an entrepreneurship education intervention might be misleading and even detrimental.

As may be anticipated, psychological constructs, and students' entrepreneurial intention in particular, are the dominant impact measures (e.g., [8, 35, 38, 39]). Some studies utilize numerical impact measures. Jones et al. [29], for example, evaluate the impact of the program by measuring the difference in entrepreneurial career aspirations before and after the course using a Likert-type scale and percentages. Mohamed et al. [30] measure participants' perceptions of the program using scores on a Likert scale of 1 to 7, indicating that the program develops interest in entrepreneurship (97% of the respondents), provides its participants with the requisite skills to carry out proper business practices (89% of the respondents), and changes students' perceptions from taking jobs offered by the government and private sector to becoming entrepreneurs instead (86% of the respondents). Interest in starting a business is another example of a simplistic impact measure, used by Van Auken [33] to assess which aspects of the program affected students' interest in business ownership after graduation.

To assess impact, a number of studies rely on existing scales and methods from the extant literature, while some scholars design novel means and suggest new measures. A clear illustration of existing assessment means is the Theory of Planned Behaviour (TPB) by Ajzen [40]. Fayolle et al. [8] employ the TPB to measure the impact of an entrepreneurship education program, making use of the questionnaires designed and validated by Kolvereid [41]. Similarly, Souitaris et al. [35] adopted a 3-item measure of career intention proposed by Kolvereid [42]. Entrepreneurial mindset, which relates to the intention to become an entrepreneur, was measured by Solesvik et al. [37] through a Likert-type scale with five statements indicating various aspects of intention, as suggested by Liñán and Chen [43].

New assessment means and measures are also present in the studies. Rauch and Hulsink [34] designed a list of 19 behaviours comprising a representative set of activities associated with the formation of new ventures. This allowed them not only to deduce that the entrepreneurship education program has a positive effect on the intention to become an entrepreneur, but also to recognize an impact on entrepreneurial behaviour 18 months after the first course in the program was completed. Gielnik et al. [28] provide another example of a new assessment means and measure. They use entrepreneurial action and business creation to measure the impact of an action-based entrepreneurial training. The study shows that the training has a significant impact on business creation and an effect on entrepreneurial action through action-regulatory mechanisms like action planning, entrepreneurial goal intentions, action knowledge, and entrepreneurial self-efficacy.

Despite the prevalence of quantitative impact assessment measures and methods, our analysis found a study by Jones et al. [25] that used semistructured interviews with 122 students to qualitatively assess the perceived impact of the course on entrepreneurial attitudes, motivations toward an entrepreneurial career, and attitudes toward the learning experience.

This study is a follow-up that builds upon the quantitative evidence in Jones et al. [29], which is also included in our final sample. Jones et al. [25] aimed to provide a rich body of evidence confirming that entrepreneurship education can have a positive impact on both entrepreneurial attitudes and entrepreneurial career choices.

A study by Fretschner and Weber [31] and another by Morris et al. [32] are noteworthy for their mixed method research designs. Fretschner and Weber [31] relied on a mixed method to measure and understand the impact of an entrepreneurial awareness course on students. The qualitative strand was used to gain insight into the formation of students' personal attitudes toward entrepreneurial behaviour by understanding their behavioural beliefs. These beliefs represent drivers of positive (e.g., independence, financial success, and self-reliance) or negative (e.g., risk, workload, and responsibility) attitudinal change toward entrepreneurship. The authors designed and used open-ended questions to elicit the basic behavioural beliefs about self-employment that the course was expected to trigger in students. Morris et al. [32] employed a multi-round Delphi technique, asking 20 distinguished entrepreneurs and 20 leading entrepreneurship educators to form a list of core entrepreneurial competencies. After a final set of 13 competencies was produced, the authors developed a set of self-report measures to assess students' progress in mastering each competency after participating in the program.

5. Discussion

The aim of this study was to review the entrepreneurship education literature on impact assessment by employing the concept of alignment. In doing so, we sought to develop a more complete understanding of the relationship between entrepreneurship education and its impact on individuals. Our review revealed several key findings that we discuss below.

The first key finding concerns how scholars describe entrepreneurship education interventions in empirical impact studies. Sixteen studies met our inclusion criteria and provided descriptions, with varying levels of detail, of the interventions, including objective(s) and delivery mode. This finding partly supports the claim by Fayolle et al. [44] that there are scant empirical data reporting teaching beliefs, real practices, and outcomes for learners beyond entrepreneurial intention. Although teaching beliefs are not part of our review, we observed that entrepreneurship education practices are rarely well documented in the empirical studies in our final sample. Scholars seldom give a detailed account of the intervention, particularly the objective(s) and delivery mode of the entrepreneurship education intervention. This finding is also consistent with Rideout and Gray [7], who state that some researchers, as other reviews have discovered, do not provide sufficient information about their entrepreneurship education programs.

Our second noteworthy finding relates to the concern of Fayolle et al. [44] with regard to outcomes. Indeed, regardless of which intervention the students were part of, the authors' choice of impact measures converged toward entrepreneurial intentions, making intentions the prevalent outcome for individuals. While the formation of entrepreneurial intentions after an intervention is a legitimate outcome, authors avoid or omit detailing the linkage between the objective(s), delivery mode, and the formation of intentions. We argue that researchers, when using entrepreneurial intentions and their motivational antecedents to assess the impact of an entrepreneurship education intervention, should provide a sound explanation of what type of intervention is being assessed. For example, an awareness program or course aims at changing students' belief systems that ultimately drive their personal attitude toward entrepreneurship and hence entrepreneurial intention [31]. Further, we encourage researchers to go beyond entrepreneurial intentions as an outcome measure. Our review revealed other outcomes like entrepreneurial action and business creation [28], entrepreneurial career aspirations [29], and entrepreneurial competencies [38]. Overall, quantitative measures dominated the studies. Consistent with [6], we believe that the dominance of quantitative measures has been shaped by the direction of a fast-growing body of empirical research on entrepreneurship education outcomes: researchers have tended to focus on subjective and short-term impact measures such as entrepreneurial attitudes and intentions. Finally, we identified a variety of impact measures used by scholars. Nevertheless, we can conclude that a positive effect of entrepreneurship education interventions prevails, whichever impact measure is used.

The third key finding is that objectives are, with only a few exceptions, outlined poorly and in vague terms. Interestingly, the failure to define specific objectives is found not only in less impactful journals in the field of entrepreneurship, but also among top-ranking publication outlets. A review of the teaching objectives outlined in the selected studies indicates poor efforts to define clear and specific objectives. This observation leads us to hypothesize that these poor efforts are made either by the educators in those interventions or by scholars who did not commit to investigating the objectives of programs and courses. Hence, although educators might have high-level aims, they are prone to the common mistake of not descending “from the rhetoric of their aims to the specified objectives” ([14], p. 351). Moreover, the issue of contradictory objectives in a single intervention, educating about, for, and in entrepreneurship, prompts a haphazard selection of teaching methods and content [5]. Thus, less consistent outcomes are to be expected if this diversity of objectives remains. The fundamental assumption that a marked impact is generated when objectives are stated clearly should be of central concern to educators and entrepreneurship scholars who research the impact of entrepreneurship education interventions. Moreover, when one sets up referencing criteria in terms of objectives, one facilitates the process of setting up both the teaching and the assessment agendas [14].

Similarly, we noted that the description of the delivery mode of the interventions receives little attention from scholars. When researchers describe interventions, they tend to be generic and omit, or perhaps intentionally avoid, details that might be important to other scholars who study impact assessment or to educators running a similar course or program. Such behaviour is counterintuitive, given that teaching methods are not an end in themselves but are selected to achieve particular course or program objectives [12]. What to teach (content) is as important as how to teach (teaching methods), and both follow from the objectives. Our review suggests that traditional approaches to teaching, which are ascribed to teaching “about” entrepreneurship and aim at providing a general understanding of entrepreneurship [10], prevail in the studies. Yet, we also observed evidence that educators are moving toward and adopting teaching “for” and “through” entrepreneurship, despite the challenges (e.g., resource and time constraints, cost implications, and assessment challenges) that these approaches face [45].

A fourth and probably not surprising finding is the lack of alignment between objectives, delivery mode, and impact assessment. One possible explanation is that the requirement to document an entrepreneurship education intervention has not been seen as an important priority by scholars and reviewers. Even when authors make an effort to report learning objectives and delivery mode, they seldom link them to impact. The choice of impact assessment measures and methods seems to have been the dominant focus for scholars conducting research on the impact of entrepreneurship education. In other words, the research has been driven by the notion of impact without much attention to the coherence between the three elements: teaching/learning objectives, delivery mode, and impact assessment mode. Another explanation is that scholars may not feel comfortable or motivated to approach the educators whose diverse entrepreneurship programs or courses they research. Thus, they report shallow information about an intervention's objectives or delivery mode without delving into details. We argue that applying the principle of alignment may allow researchers to increase the comparability between studies, despite the different curricular and instructional designs of entrepreneurship education interventions. Such differences have been seen as part of the reason why some entrepreneurship education impact studies arrive at contradictory results [31].

6. Conclusions

In this article, we asked: how have the objectives, delivery modes, and impact assessment been aligned in the empirical entrepreneurship education literature? To our knowledge, the current study is the first review to use the concept of alignment to examine entrepreneurship education impact studies. Our review focused on studies of university-based entrepreneurship education interventions from 2000 through 2015 that attempted, deliberately or unwittingly, to depict at least two of the three elements of the analytical framework we employed.

The study revealed sixteen empirical studies that met our inclusion criteria. On this basis, we argue that the principle of alignment remains far from being taken into consideration. First, we suggest that this is because the concept of alignment is new to the field of entrepreneurship education and has therefore not been sufficiently examined yet. Second, it could also be the case that entrepreneurship journals do not require scholars to provide elaborate descriptions of entrepreneurship education interventions in impact studies. Our findings demonstrated that scholars are not used to giving significant attention to course or program objective(s) and delivery mode, making impact the driving focus of the studies. Presumably, the better practitioners articulate the first two elements, teaching objective(s) and delivery mode, the better scholars are able to take them into account when using existing, or designing new, impact assessment methods and measures.

As Rideout and Gray ([7], p. 346) suggested, entrepreneurship education seems to be a phenomenon “where action and intervention have raced far ahead of the theory, pedagogy, and research needed to justify and explain it.” Likewise, research on the impact assessment of entrepreneurship education interventions has raced ahead of the theory needed to confirm and explain it. The framework we suggest can help balance this race between research and theory.

Even though the framework offers an applied and theoretically grounded approach to studying the impact of entrepreneurship education, the paper does not propose that this is the only suitable approach. It presents value for policymakers, educators, and researchers who have differing interests and theoretical orientations with regard to entrepreneurship education [5] and do not endorse a one-approach-fits-all model of entrepreneurship education [46]. On that premise, we believe that the framework is useful because of its embedded flexibility in embracing the heterogeneity of educational practices and approaches adopted in entrepreneurship education, which should be appreciated, according to Jones and Matlay [47]. To assess the impact of an entrepreneurship education intervention on an individual effectively, the framework can be enacted to align the endless variations of objective(s), teaching methods and content, and impact assessment methods and measures. Furthermore, the framework could also be empirically replicated to assess a given entrepreneurship education initiative. Based on our review, we conclude that the elements of the framework have been poorly aligned in the empirical studies we examined.

6.1. Limitations and Avenues for Future Research

While our work contributes to the knowledge on impact assessment of entrepreneurship education, we acknowledge that it has limitations, which impose restrictions on how one should interpret our findings and conclusions. First, similar to Henry and Foss [24], we accept that restricting the review to a specific set of journals and time period may have led to the unintentional omission of some relevant articles. Although we reviewed a small sample, it was purposeful, and the articles included in our study underwent peer review in the corresponding journals, which can serve as a proxy for minimum quality control [23]. Second, extra insights could be gained by searching for studies in citation databases like Business Source Premier and/or Web of Science and by increasing the search time frame.

There are a number of avenues for future research arising from our findings. A way forward could be to generate methodologically richer research. One avenue could be to utilize longitudinal research designs and mixed method approaches more often in order both to attribute specific impacts to specific entrepreneurship education interventions and to provide an interpretation of the processes that take place in an intervention and of their plausible impacts.

Another avenue, in line with Rideout and Gray [7], could be well-designed case studies to identify important mediators, including objectives, teaching methods, and taught content, in entrepreneurship education interventions, and to understand how they function and exert influence upon individuals using rich and insightful data. Entrepreneurship education interventions do not operate in isolation; the surrounding context influences them, including educators' beliefs about the teaching objectives they would like to attain, the delivery mode to be utilized, and the impact to be made. Scholars could take advantage of a case approach to better elucidate the complex nature of the impact of entrepreneurship education, which is a context-embedded phenomenon.

Finally, we encourage scholars to increase the relevance and practical usefulness of research on entrepreneurship education impact to practitioners. This could be achieved not only if researchers and practitioners join efforts to interpret the results of the research [48], but also if scholars talk to practitioners beforehand and ask about the specifics of each entrepreneurship education intervention they plan to research. We believe that by bridging the two communities (research and practice) in studying the impact of entrepreneurship education, we help make explicit the taken-for-granted assumptions about teaching objective(s), delivery mode, and impact assessment methods and measures, as well as their congruence with one another.

Disclosure

An abstract and an early version of this article were submitted, but not presented, to NORSI Research School Conference 2016.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The authors want to thank Colette Henry, Alain Fayolle, and Nhien Nguyen for valuable comments on earlier versions of the manuscript. The publication charges for this article have been funded by a grant from the publication fund of UiT, The Arctic University of Norway.