Education Research International
Volume 2019, Article ID 3648318, 11 pages
https://doi.org/10.1155/2019/3648318
Research Article

Exploring the Relationship between the Collegiate Learning Assessment, Student Learning Activities, and Study Behaviors: Implications for Colleges and Universities

Fayetteville State University, NC, USA

Correspondence should be addressed to Theodore Kaniuka; tkaniuka@uncfsu.edu

Received 12 August 2018; Accepted 10 February 2019; Published 22 April 2019

Academic Editor: Yi-Shun Wang

Copyright © 2019 Theodore Kaniuka and Matthew Wynne. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Globally, institutions of higher learning have attempted to utilize various methods to assess student learning outcomes and simultaneously determine what factors influence targeted performance measures. The National Survey of Student Engagement is a popular instrument many colleges and universities employ to gain an understanding of which student behaviors are linked to such desired outcomes as graduation and persistence. Internationally, other universities are exploring the concept of student engagement as a means to assess college environments. Recently, calls from stakeholders have prompted institutions to use assessments to demonstrate so-called twenty-first century skills such as critical thinking; the Collegiate Learning Assessment is one such tool. This study reports preliminary efforts to link the NSSE engagement indicators to CLA performance at a medium-sized Historically Black College or University. Results indicate that (1) the NSSE indicators are poor predictors of student GPA, (2) of the ten NSSE indicators, only one was found to be a significant predictor of CLA performance, and (3) of the items comprising this indicator, only half are associated with CLA outcomes.

1. Introduction

Recent events in defining the role and purpose of higher education at the federal and local levels have been the impetus for many institutions of higher learning to explore ways to demonstrate that a postsecondary degree matters [1]. These pressures are present worldwide as the role of a college education has shifted from a private to a public good [2, 3], such that colleges and universities have aligned their focus to serve business and industry in order to remain economically viable and receive the support of both the individual and society [4, 5]. As a result of these pressures, institutions of higher learning have been tasked with teaching and assessing the degree to which a college degree can improve attributes such as the ability to think logically and solve problems [6, 7]. In an effort to demonstrate the value-added nature of the college degree, these institutions have expanded accountability and assessment initiatives to demonstrate to stakeholders that college matters.

There have been multiple accountability initiatives in higher education [8] that have called for the direct assessment of student learning in ways that provide comparable information across institutions and states [9, 10]. Using regression analysis and archival data, this article examines the degree to which measures of student learning and engagement experiences are related, in an effort to identify whether a link between learning activities and performance can support postcollege career readiness [11]. The authors of [11] state that, despite the numerous examples of institutions using the National Survey of Student Engagement [12] to monitor programs linked to enhancing student engagement, there is a paucity of studies that attempt to link student engagement to desired student outcomes. This study does so by utilizing regression analysis to explore whether there are links between student performance on the Collegiate Learning Assessment [13], student learning activities, and study skills as reported on the National Survey of Student Engagement, in an effort to determine whether such a process could provide insight into how colleges and universities may better align their engagement work with their aims and associated measures of student performance.

2. Literature Review

In the twenty-first century, the skills essential for today’s students are far different from those required in the previous decades. A greater emphasis is being given to college readiness and college outcomes because employers now more than ever expect college graduates to possess writing, critical thinking, and problem-solving skills [14] in response to the changing demands of available jobs [15, 16]. Teachers and students can no longer rely solely on the accumulation of disciplinary knowledge and skills as employers evaluate potential employees not purely on what they know, but also on what they can do and how they do it.

There is now an emphasis and precedent on focusing on twenty-first century skills [17, 18] (SBAC, 2012) in addition to knowledge in specific content domains [19–22], in hopes of fostering the development of critical thinking, problem-solving, communication, collaboration, creativity, and innovation skills [19], skills coveted by employers. The impetus to change instructional programs is multifaceted and includes various stakeholder groups, among them prospective students, the very customers colleges and universities are trying to attract. In a recent annual survey conducted by UCLA, Wyer [23] reports that, for the first time, freshmen state that getting a better job is the most important reason to attend college.

2.1. Assessing Learning and Engagement

Assessment in higher education is a complex issue, involving not only its purpose but also social justice perspectives, equality, and overall public policy considerations of what is assessed [24]. Messick suggests that assessment for learning in higher education must serve two groups: (1) faculty must receive feedback to support changes in instruction that enhance student outcomes, and (2) students must learn from assessment so they can reflect on their own learning practices to better support their knowledge and skill development. Furthermore, it is posited that administrators as well as key policy-makers must also be aware of how the types of learning activities instructors utilize are linked to these same immediate pedagogical outcomes, including the long-term implications the college experience has for students and society [25–29]. Assessment is a multifaceted activity often intended to serve many related and divergent purposes. An institution must be clear on what assessment can add, not only in terms of measuring student outcomes but also in how these outcomes can be influenced by the actions of faculty and administrators.

2.2. Collegiate Learning Assessment

The Collegiate Learning Assessment (CLA) is designed to assess the skills most critical for success beyond college [30, 31]. The CLA provides a reliable measure of students’ ability to solve complex problems by analyzing and synthesizing disparate pieces of information while presenting cohesive, persuasive, and grammatically correct solutions [32, 33], and its value in doing so has received international attention and application [34]. Research on the CLA+ as a predictive measure of multiple postcollege outcomes was presented in [33, 35]. The CLA+ was found to gauge and predict factors such as employment, salary, and graduate school attendance. These findings support the utilization of such assessments by employers and institutions of higher learning as a method to determine whether applicants and graduates are suitable for employment and further education [36]. It has been suggested that the CLA+ can, among other things, provide information to potential employers and graduate schools and communicate achievement standards that can serve to certify the rigor of a school’s educational programs [33, 35, 37].

2.3. Student Engagement

Assessing student engagement has become a critical activity for institutions of higher learning globally. Recent efforts in the United Kingdom [38] illustrate the growing attention student engagement is receiving. In the United States, the National Survey of Student Engagement (NSSE) has been utilized by institutions of higher learning to better understand how to measure student engagement and then to assess the degree to which these schools provide their students with value-added learning environments [39, 40]. It has also been argued that the survey can help faculty align their behaviors to better achieve certain desired outcomes for students [41]. Kinzie et al. [42] comment that the NSSE can be a useful tool for monitoring institutional practices and student behaviors that have been linked to certain outcomes. However, according to Gordon et al. [11], schools that use the NSSE should be concerned not with how well they score on it, but with applying the results to develop a better understanding of how they can improve student outcomes by linking the NSSE to actual instructional activities and the pedagogical actions that support school success [43]. The use of NSSE data for institutional purposes, specifically for improving student outcomes, has been criticized with regard to the reliability and validity of the results [44]. While these criticisms are not without merit, Pike [45] previously stated that such shortcomings need to be evaluated in context and that it is important to apply any set of data carefully and to understand its intended purpose. Interestingly, Pike [44] posits that results from the NSSE are (1) dependable measures of student engagement in educational practices when at least 50 students are used and (2) significantly related, at the level of institutional benchmark scores, to graduation and retention.
He further suggests that it is possible to use these scores to gauge the engagement of students and to evaluate the effectiveness of institutional actions and programs intended to improve student engagement and academic success.

Despite the robust research on student engagement and its relationship to student learning [46–48], student retention [49, 50], and student persistence [51], faculty and administrators still struggle to effectively assess student engagement at lower levels within the college or university. Considering the preceding arguments, analyzing the relationship between the NSSE and the CLA is consistent with the recent emphasis on predicting the postcollege activities of graduates, a matter of importance to both college officials and potential students. Furthermore, the results of this study may offer powerful policy tools to institutional leaders, faculty, and students seeking to identify which institutions of higher learning offer the most value [52].

2.4. Research Questions

The following questions were used to guide the study by examining the possible relationships between student engagement and critical thinking as defined above: (1) Are the NSSE engagement indicators associated with students’ GPA? (2) Are specific items in the NSSE related to performance scores on the CLA?

2.4.1. Research Question 1

Finding that benchmark scores of the NSSE are related to institutional retention and graduation rates [11, 44] may suggest that the scores from the benchmarks can serve as measures of the influence of institutional programs and practices that are intended to enhance student success. While previous research has demonstrated weak relationships between NSSE benchmark scores and certain student outcomes [53], additional exploration is warranted.

2.4.2. Research Question 2

This question explores the possible relationships of the student engagement measures used in the NSSE with the critical thinking outcomes of the CLA. The research of Niu et al. [54] showed that the interventions colleges and universities have implemented to affect students’ critical thinking skills have had limited effects. Examining how these lower level measures are related to measures of student performance has shown that, although the macrolevel measures and benchmarks are useful, they are not without limitations, and looking at microlevel measures may yield relationships not revealed by aggregated methods [11].

3. Methods

This study uses multivariate linear regression, estimated by maximum likelihood with missing values and robust standard errors. To accomplish this, extant data from several past administrations of both the NSSE and CLA were retrieved via institutional research. The participants were freshman and senior students from 2013 to 2015 who took both the NSSE and CLA. Students are not required to complete the NSSE; therefore, the engagement indicators are weighted to improve the representativeness of the sample relative to the larger student population. In contrast, the CLA is given to all students as freshmen and again as seniors to measure the value-added experiences of attending the university; therefore, no weighting is required. Table 1 reports the demographic data for the sample.
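The weighted regression with robust standard errors described above can be sketched as follows. This is a minimal illustration using simulated data; the column names (cla_score, the two engagement indicators, and the NSSE sampling weight) are hypothetical placeholders, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in for the institutional data set
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "cla_score": rng.normal(1100, 150, n),
    "collab_learning": rng.normal(30, 8, n),
    "student_faculty": rng.normal(20, 7, n),
    "weight": rng.uniform(0.5, 2.0, n),  # NSSE sampling weights
})

# Weighted least squares with heteroskedasticity-robust (HC1) standard errors
model = smf.wls(
    "cla_score ~ collab_learning + student_faculty",
    data=df, weights=df["weight"],
).fit(cov_type="HC1")
print(model.params)
print(model.bse)  # robust standard errors
```

Weighting upweights underrepresented respondents, while the robust covariance guards against heteroskedastic survey responses.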

Table 1: Select demographic variables for freshmen and senior students.

In Table 2, the descriptive statistics for the aggregated regression variables are reported. To investigate the relationship between measures associated with student engagement and problem-solving skills, this study replicated to some degree the work of Gordon et al. [11]: the aggregate measures of student engagement as defined on the NSSE were regressed onto CLA performance, and then the individual questions grouped under these indicators were regressed on the same outcome measures. To establish parsimonious models, the demographic variables in Table 1 were originally included in the regression analysis; it was found that these factors did not significantly account for differences in either the engagement indicators or CLA scores, to the extent that including them in the final models reduced power and failed to improve overall precision.
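The pruning decision above (dropping demographic covariates that fail to improve the model) can be illustrated with a likelihood-ratio comparison of nested models. The variable names and simulated data below are hypothetical; the paper does not specify which test was used, so this is one standard way to make that judgment.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Simulated data: CLA outcome, one engagement indicator, two demographics
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "cla": rng.normal(1100, 150, n),
    "ei": rng.normal(30, 8, n),
    "age": rng.integers(18, 25, n).astype(float),
    "female": rng.integers(0, 2, n).astype(float),
})

full = smf.ols("cla ~ ei + age + female", data=df).fit()
reduced = smf.ols("cla ~ ei", data=df).fit()

# Likelihood-ratio test: do the demographic terms improve fit?
lr = 2 * (full.llf - reduced.llf)
p = stats.chi2.sf(lr, df=2)  # two restricted parameters
print(f"LR = {lr:.2f}, p = {p:.3f}")
```

A nonsignificant p here is the kind of evidence that would justify excluding the demographic covariates from the final models.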

Table 2: Descriptive statistics for aggregate regression variables.

4. Results

The first analysis, addressing research question 1, regressed the NSSE engagement indicators onto students’ cumulative GPA at the time of taking the NSSE. The results shown in Table 3 indicate that the engagement indicators have little power to predict institution-reported GPA: of the ten engagement indicators, only four were found to have coefficients different from zero for all students, namely, Student-Faculty Interaction, Supportive Environment, Quality of Interactions, and Effective Teaching. When student class was used to create groups, two engagement indicators were found to have some predictive power for freshman students, Student-Faculty Interactions and Effective Teaching. This is in stark contrast to senior class students, for whom not one of the indicators was found to have an estimated coefficient different from zero.

Table 3: Regression results for GPA and NSSE engagement indicators with robust standard errors.

For the second question, a regression was run to determine whether the NSSE engagement indicators are predictive of CLA performance. Prior to running this regression, a simple pairwise correlation was computed between GPA and overall performance on the CLA; a significant but small positive relationship was found (r = 0.4, r2 = 0.16). Therefore, it was expected that any relationship between the NSSE and the CLA would be limited. As shown in Table 4, using all students, four of the ten indicators were found to have coefficients different from zero: Student-Faculty Interactions, Discussions with Diverse Others (DD), Collaborative Learning (CL), and Reflective and Integrative Learning (RI). The predicted coefficients had both negative and positive values and ranged in size, with Collaborative Learning having the largest negative coefficient, implying that a one-unit change in this value was predicted to lower a CLA score by approximately 2.3 points, or less than 1%. Both DD and RI had positive coefficients, with RI found to increase CLA scores by 2.09 points for every one-unit change, a very small percentage increase. When examining these relationships by student class status, the two classes differ in which indicators appear to affect CLA performance, sharing only one indicator. For freshmen, RI, CL, and Learning Strategies (LS) all have estimated coefficients different from zero, while for senior students, only DD and CL have nonzero estimated coefficients. A test was run to determine whether the difference in estimators across classes was significant, using z = (b1 − b2)/√(SE(b1)2 + SE(b2)2) (1), and the result indicated that the difference in estimators was not significant. The sign for CL was negative for both groups, indicating that, as either group reported more frequent involvement in collaborative settings, overall CLA scores were predicted to decrease.
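The comparison of coefficients across independent samples referenced as equation (1) can be computed directly. The coefficient and standard error values below are illustrative placeholders, not the paper's estimates.

```python
import math

def coef_diff_z(b1, se1, b2, se2):
    """z statistic for the difference between two regression coefficients
    estimated on independent samples (the standard Clogg-style test)."""
    return (b1 - b2) / math.sqrt(se1**2 + se2**2)

# Hypothetical Collaborative Learning coefficients for the freshman and
# senior models (values are illustrative only).
z = coef_diff_z(b1=-2.3, se1=1.1, b2=-1.6, se2=1.0)
print(round(z, 2))  # prints -0.47; |z| < 1.96 means not significant at the 0.05 level
```

Because |z| falls well inside the ±1.96 critical values, such a pair of estimates would not be judged significantly different, matching the pattern reported in the text.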
For freshmen, RI and LS appear to predict CLA scores positively, such that for freshmen who reported high frequencies of each behavior, CLA scores were predicted to increase, albeit in small amounts. For example, a one-unit increase in RI was estimated to increase CLA scores by 1.48 points, well less than a 1% change. For seniors, working with and being exposed to others different from themselves tended to increase CLA scores, again by small amounts. The small magnitude of change in CLA scores for seniors was similar to what was found for freshmen.

Table 4: Regression results for CLA total score and NSSE engagement indicators with robust standard errors.

The next phase of analysis was driven by the results above, whereby only the items measuring those engagement indicators (EI) found to have coefficients different from zero were included in the subsequent analysis. This lower level analysis examined the relationships of the individual items that comprised each of the four engagement indicators: Discussions with Diverse Others, Reflective and Integrative Learning, Student-Faculty Interactions, and Collaborative Learning. The descriptive statistics for the lower level items are reported in Table 5. For complete definitions of the items listed in Table 5, see http://nsse.indiana.edu/html/summary_tables.cfm.

Table 5: Descriptive statistics for engagement item level regression variables across all, freshmen, and senior class students.

It was decided to continue to use maximum likelihood estimation here because, as with many survey-derived data sets, missing values occur randomly and therefore limit sample sizes when conventional OLS regression is employed.
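The sample-size cost of conventional listwise deletion, which motivates the maximum likelihood approach above, is easy to demonstrate. The data below are simulated; the missingness rate is an assumption chosen for illustration.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame(rng.normal(size=(n, 4)), columns=["y", "x1", "x2", "x3"])

# Inject roughly 10% missingness per predictor, independently at random
for col in ["x1", "x2", "x3"]:
    df.loc[rng.random(n) < 0.10, col] = np.nan

# Conventional OLS keeps only complete cases (listwise deletion)
complete = df.dropna()
print(f"rows retained under listwise deletion: {len(complete)} of {n}")
# With 10% missingness on each of three predictors, roughly
# 1 - 0.9**3, about 27%, of rows are lost; ML with missing values
# instead uses all available information from every respondent.
```

Even modest item nonresponse compounds across predictors, which is why full-information estimation is attractive for survey data like the NSSE.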

The first EI to be examined at the individual item level was Collaborative Learning, the frequency with which students collaborated with others as measured by four different behaviors. As seen in Table 6, the results indicate that not all behaviors contribute to the prediction of the CLA score. When considering both groups of students, three of the four items that comprise this EI had coefficients significantly different from zero. It appears that, as students more frequently ask others for help, CLA scores are predicted to decline. However, when looking across groups, studying with others fails to retain a significant coefficient. Asking for help and explaining to others retain significant estimators for both groups, with senior students having larger estimates. To determine whether the estimators were significantly different from each other, equation (1) was applied again, yielding values of … and −0.91, revealing that the differences reported were not significant for either item across student class. It is suggested that these items be interpreted as measures or indicators of student preparedness or understanding of the material. That is, as students need more help from peers, it indicates lower confidence or perceived ability, yielding the negative value for the predictors. Explaining to others, conversely, may be an indicator of mastery of the material, or explaining to others may reinforce knowledge, skills, or confidence in students’ perceptions of themselves as learners. Relative to the other items for this indicator, asking for help appears to exert the greatest influence on the CLA for all students, and this relationship changes slightly when groups are considered. A one-unit change in explaining to others was predicted to increase CLA scores by about 1.5% and 2.4% for freshmen and seniors, respectively, small but potentially influential.

Table 6: Regression results for CLA total score and collaborative learning items with robust standard errors.

The second engagement indicator analyzed was Reflective and Integrative Learning (RI), reported in Table 7. When all students were included, two of the seven items were found to have significant predictors. The group analysis is quite revealing: not one of the items for freshmen had a significant estimated coefficient, while for senior students, four items ranged from moderately to highly significant predictors. This may reflect the value added from the educational experiences accumulated by senior students. Conversely, freshmen had yet to benefit from the educational environment of the university and therefore may have been reporting based on prior experiences.

Table 7: Regression results for CLA total score and reflective and integrative learning items with robust standard errors.

Senior class students were found to have the only significant results for any of the items defined under Reflective and Integrative Learning. The connect item asks students how often they connect ideas from their courses to prior knowledge and experiences. The reported positive relationship predicts that students who reported more opportunities to connect their learning to prior knowledge and experiences scored higher on the CLA. Therefore, as students believe they are more reflective and integrate prior learning with new material and courses, they score higher on the CLA. In fact, this was the strongest relationship found, both in frequency and in magnitude: the coefficient for connect was the largest found, with a one-unit change predicted to improve CLA scores by 3.8%.

The next item with significant estimated coefficients is diverse perspectives, where an increase in discussions representing diverse views was found to predict higher CLA scores. Intuitively, this appears correct as being exposed to different views should cause you to be more reflective and consider more information before forming opinions or solving problems.

The negative predicted coefficients for societal (connecting your learning to societal problems or issues) and new view (how being exposed to a different view caused you to form a new one) are curious. It would seem that either of these would prompt different and deeper thinking in students; yet, this is not reflected in improved CLA scores. It may be that the CLA poses problems and measures thinking in ways that fail to draw on such skills.

The next EI analyzed in this manner was Discussions with Diverse Others (DD), with the results displayed in Table 8. For all students, of the four items for this indicator, only discussions with others of a different race was found to have a coefficient different from zero. This may not be surprising for students attending this campus, where the student population is fairly homogeneous across wealth, religion, and political perspectives. For example, 63% of the student body is black, and economically, 97% of students are classified as in need of financial assistance. While no data are available on political and religious diversity, all indicators appear to support the claim of homogeneity. Therefore, the most frequent opportunity for interaction with diverse others is by race, which, as reported below, may provide students opportunities to encounter opinions and beliefs defined across racial boundaries. The results differ when reported across the two student groups, with freshmen having two items with significant predictors and seniors none. The signs of the two coefficients appear to convey very different perceptions of how economic and political diversity translate into CLA scores. Discussions with others from different economic statuses were estimated to improve CLA scores by about 2.3% for freshmen.

Table 8: Regression results for CLA total score and discussion with diverse others items with robust standard errors.

That is, as freshmen more frequently reported discussing educational experiences with those of different economic means, CLA scores were predicted to increase, while the converse was found when discussions with those holding different political views were part of the students’ conversations. Freshmen may be reflecting previous life experiences in their scores, and given the nature of those experiences, how they are reflected in CLA scores may indicate the need to engage in such activities.

The final group of items to be analyzed were those from the EI of Student-Faculty Interactions. The results in Table 9 show that, for all students in the sample, the item addressing students working with faculty on activities other than course work (committees, student groups, etc.) was found to have a coefficient different from zero.

Table 9: Regression results for CLA total score and student-faculty interaction items with robust standard errors.

This was also true for senior students but not freshmen. The negative predicted coefficient may point to the fact that, as students engage in activities not directly associated with academic classwork, CLA scores are estimated to decline. This decline may result from the additional time away from studies that these activities demand, or from the nature of the activities themselves, which may not provide experiences that improve the type of thinking and problem-solving measured by the CLA. Stated differently, as students were more involved in work outside the classroom, given the time and effort these activities require, the return to critical thinking as measured on the CLA was negative, amounting to about a 1.2% decrease in CLA scores.

There were nineteen items explored in this part of the analysis, and on seven occasions, when both groups of students were included, significant predictors were estimated. For freshmen, this happened four times (21%), and for senior students, seven times (37%). Only twice (5%) did both seniors and freshmen have significant predictors (asking for help (−) and explaining to others (+)). Clearly, three patterns were revealed from both analyses: (1) for freshmen and seniors, the relationship between these NSSE items and CLA performance is different; (2) there was little agreement between the two groups of students; and (3) overall, the relationship between the NSSE and the CLA is tenuous, complex, and may be of limited value.

5. Discussion

Given that the literature is saturated with studies reporting the linkage between engagement and beneficial student experiences, it seems plausible that associations should exist between the items surveyed on the NSSE and student performance measures. While such a linkage has been demonstrated for broad measures such as retention and graduation rates, linking the results from the NSSE to other measures of student learning remains elusive [11, 44]. The current study of the relationship between NSSE engagement indicators and CLA scores yielded outcomes consistent with those of Gordon et al. [11], who found that the NSSE indicators were not reliable predictors of student GPA, and this study extends that outcome to another measure of student performance.

Given this, the results presented herein call into question the hypothesis presented by Carini et al. [46], which states that students with higher ability, as measured by tests such as the SAT, may benefit less from student engagement elements. Gordon et al. mentioned this in their study, as students in their sample had an average SAT score of 1340, implying that the relationship between NSSE measures and GPA would be weak as a result of the high mean SAT scores. However, students in this study had a mean SAT score of approximately 861. If students with lower ability are to benefit more from engagement, then the weak relationship between these two measures is curious. More completely, if the above held, lower ability students would be expected to show stronger relationships to the NSSE indicators, and these indicators would be better predictors of GPA.

The authors of the Carini et al. study proposed the term “educational efficiency,” which suggests that institutions may vary in how the engagement indicators associate with various learning outcomes. As a reference, the Gordon et al. study reported that, of the twelve opportunities for the NSSE benchmarks to correlate with GPA, correlation occurred four times; in the present study, this did not occur for any of the ten indicators. Therefore, the proposed relationship whereby lower ability students benefit more from engagement may not be present. In fact, it seems that, for this group of students, with SAT scores some 450 points below those in the Gordon et al. study, there is a much weaker overall relationship between these measures. This finding was consistent across freshman and senior class students.

Results of previous research attempting to link the NSSE benchmarks and measures of student learning have shown weak associations, as presented in the work of Carini et al. [46, 53, 55, 56]; therefore, it is not surprising that, in the current study, the reported relationships between the NSSE engagement indicators and CLA performance were limited. Only four of the ten engagement indicators were found to be reliable predictors of performance on the CLA. While this is better than the relationship between the NSSE and GPA, the association between engagement and achievement appears to be infrequent. Examining the item-level findings reveals that, although four indicators were found to be significant predictors of CLA performance, the connections are limited. That the engagement indicator of Collaborative Learning had two of four items showing significance is consistent with what Fiorini et al. found, albeit in their study the associations were found on more global measures of student success such as graduation.

Research on the effect of diversity experiences on student self-reported gains in various measures of cognition is well documented [57, 58]. Pascarella extended this work by using a standardized measure of critical thinking, finding results that supported the original work while revealing that any such relationships were complex. Therefore, it was not surprising to find that the indicator Discussions with Diverse Others was a significant predictor of CLA performance. While not mirroring the complexity developed by Pascarella, the results presented herein show that this relationship is limited in scope, as few of the item-specific measures yielded significant relationships with performance. Specifically, for all students, only discussions with diverse others defined by race proved to be a reliable predictor of CLA performance, and this differed between freshman and senior students. Although two different approaches were used to investigate the idea of diversity and critical thinking, a degree of commonality was found. This lends support to the claim that students benefit from interacting with others different from themselves. However, in this case, the benefit was limited to freshmen, implying that the college experience may have a moderating effect on this interaction, given that this university is more homogeneous than many of the high schools from which the freshmen graduated.

The paucity of relationships found in this study between the NSSE and the CLA illustrates the difficulty and complexity of finding measures of student learning and student behaviors that can be captured in a way that provides beneficial information to administrators, policy-makers, and other stakeholders. This is problematic, as institutions continue to rely on student self-reports, rather than exam-based measures, to gauge behaviors, learning, and perceptions of the quality of education students receive [59].

5.1. Implications

The results are somewhat consistent with those of larger studies that developed associations between measures of institutional effectiveness in terms of student learning, even though this study exclusively examined an HBCU with a high percentage of Black students rather than a predominantly white institution (PWI). The results suggest that institutions serving minority students face similar issues as they strive to construct approaches for determining which institutional and faculty-based practices contribute to improvements in student learning. Taking this further, race or even gender may not be the student characteristics most relevant to understanding how engagement can be used as a tool to refine learning environments. If students of diverse racial backgrounds experience the effects of engagement in a similar manner, then it may be the school context that matters most, not the individual student, which aligns with the arguments posited by Kuh et al. [60]. It has also been argued [61] that data derived from student self-reports may yield little information upon which to base improvement efforts, which may account for some of the perplexing results reported earlier. Students may misinterpret the intent of a question or fail to accurately remember personal and academic experiences, thereby basing their responses on faulty or incomplete information; all of this can contribute to biased and inconsistent results, and large samples are needed to support efforts to understand how engagement is manifested in student outcomes.

Aside from the issues associated with self-reported data, what policy implications does this study suggest? If institutions are interested in using data to drive improvements, a well-designed plan needs to be implemented that (a) either uses a probabilistic random sampling process to improve generalizability or requires key surveys and assessment activities to include all students, (b) invests in institutional research efforts focused on improving student outcomes, and (c) supports faculty involvement in assessment activities, as faculty are directly responsible for the types of engagement and learning activities that have the potential to affect student outcomes. This study did match individual student survey and performance data; however, limitations in generalizability may be attributed to the sampling processes used by the university. While the analysis showed that including student demographic characteristics failed to improve the power of the models, this may have been more a result of sample size than of model misspecification, as the researchers faced sample size limitations when attempting to develop alternative models.
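Recommendation (a) can be illustrated with a brief sketch of stratified random sampling, in which every stratum of interest is represented by design rather than by chance. The sampling frame, strata, and sample sizes below are hypothetical and are not drawn from the study's data.

```python
import random

# Hypothetical sampling frame: (student_id, class_level) pairs.
random.seed(42)
frame = [(i, level) for i, level in
         zip(range(1000), ["freshman", "senior"] * 500)]

def stratified_sample(frame, strata_key, n_per_stratum):
    """Draw a simple random sample of n_per_stratum units from each stratum."""
    strata = {}
    for unit in frame:
        strata.setdefault(strata_key(unit), []).append(unit)
    return {name: random.sample(units, n_per_stratum)
            for name, units in strata.items()}

# Sample 100 students from each class level: freshmen and seniors are
# then guaranteed equal representation, improving generalizability
# relative to a convenience sample of volunteers.
sample = stratified_sample(frame, lambda u: u[1], 100)
```

A census of all students, the alternative named in (a), removes sampling error entirely but at the cost of survey fatigue and lower response quality.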

5.2. Recommendations

Researchers and policy-makers face a multitude of technical and practical issues, especially when examining the utility of attempting to associate student learning with institutional behaviors in higher education [38]. As Porter [59] argues, assessing student learning with instruments such as the NSSE is problematic at best, arguably inappropriate, and should be discontinued. While possibly well suited for linking to gross measures of institutional performance such as retention and graduation rates [44], the NSSE benchmarks appear ill suited for association with specific measures of student learning such as the CLA. The root cause of this can be debated; regardless, the conclusion appears to be well supported by this study and others.

Where does this leave researchers and policy-makers interested in understanding the actions an institution can take to improve learning (quality), not just graduation (quantity), as many private and public sector stakeholders question the worth of a college education and demand that institutions clearly demonstrate it [62, 63]? The search for alternatives for assessing what defines a successful college is complex and varied; therefore, metrics such as return on investment [64] remain relevant. Taking this into consideration, measuring the quality of an education and its potential worth is critical for students and society. It is important to continue researching how institutions can not only demonstrate that they produce quality graduates but also understand how they do so.

Data Availability

Data are available upon request. All data will be provided in a form that includes no personally identifying information.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

References

  1. J. Marcus, “Facing skepticism, colleges set out to prove their value,” The Hechinger Report, PBS Newshour, Public Broadcasting System, Arlington, VA, USA, 2016, http://www.pbs.org/newshour/updates/facing-skepticism-colleges-set-out-to-prove-their-value/.
  2. O. Filippakou and G. Williams, Higher Education as a Public Good: A Critical Perspective, Peter Lang Publisher, New York, NY, USA, 2014.
  3. B. Pusser, “Reconsidering higher education and the public good: the role of public spheres,” in Governance and the Public Good, W. G. Tierney, Ed., pp. 11–28, State University of New York Press, Albany, NY, USA, 2006.
  4. D. Berrett, “The day the purpose of college changed: after February 28, 1967, the main reason to go was to get a job,” The Chronicle of Higher Education, vol. 61, no. 20, pp. 18–21, 2015.
  5. R. J. Thompson, Beyond Reason and Tolerance: The Purpose and Practice of Higher Education, Oxford University Press, New York, NY, USA, 2014.
  6. J. Brennan, N. Durazzi, and T. Sene, Things We Know and Don’t Know About the Wider Benefits of Higher Education: A Review of the Recent Literature, London School of Economics and Political Science (LSE), London, UK, 2013.
  7. J. Selingo, There is Life after College: What Parents and Students Should Know About Navigating School to Prepare for the Jobs of Tomorrow, William Morrow, New York, NY, USA, 2016.
  8. P. Knight, Assessment for Learning in Higher Education, Routledge, Abingdon, UK, 2012.
  9. Commission on the Future of Higher Education, A Test of Leadership: Charting the Future of U.S. Higher Education, U.S. Department of Education, Washington, DC, USA, 2006.
  10. M. Miller, Assessing College Level Learning, National Center for Public Policy and Higher Education, San Jose, CA, USA, 2006.
  11. J. Gordon, J. Ludlum, and J. J. Hoey, “Validating NSSE against student outcomes: are they related?” Research in Higher Education, vol. 49, no. 1, pp. 19–39, 2008.
  12. National Survey of Student Engagement, Using NSSE Data, National Survey of Student Engagement, Bloomington, IN, USA, 2018, http://nsse.indiana.edu/pdf/using_nsse_data.pdf.
  13. Council for Aid to Education, What We Do, CAE, Montreal, QC, Canada, 2018, https://cae.org/about-cae/what-we-do.
  14. Hart Research Associates, How Should Colleges Prepare Students to Succeed in Today’s Global Economy? Based on Surveys Among Employers and Recent College Graduates, Hart Research Associates, Washington, DC, USA, 2006.
  15. D. H. Autor, F. Levy, and R. J. Murnane, “The skill content of recent technological change: an empirical exploration,” The Quarterly Journal of Economics, vol. 118, no. 4, pp. 1279–1333, 2003.
  16. NACE, Job Outlook 2016: The Attributes Employers Want to See on New College Graduates’ Resumes, National Association of Colleges and Employers, Bethlehem, PA, USA, 2016, http://www.naceweb.org/career-development/trends-and-predictions/job-outlook-2016-attributes-employers-want-to-see-on-new-college-graduates-resumes/.
  17. PARCC, Partnership for Assessment of Readiness for College and Careers, USA, 2012, http://www.parcconline.org/about-parcc.
  18. SBAC, Smarter Balanced Assessment Consortium, The Regents of the University of California, Oakland, CA, USA, 2019.
  19. A. Porter, J. McMaken, J. Hwang, and R. Yang, “Common core standards,” Educational Researcher, vol. 40, no. 3, pp. 103–116, 2011.
  20. E. Silva, Measuring Skills for the 21st Century, Education Sector, Washington, DC, USA, 2008.
  21. R. Arum and J. Roksa, Academically Adrift: Limited Learning on College Campuses, University of Chicago Press, Chicago, IL, USA, 2011.
  22. T. Wagner, The Global Achievement Gap: Why Even Our Best Schools Don’t Teach the New Survival Skills Our Children Need—And What We Can Do about It, Basic Books, New York, NY, USA, 2008.
  23. K. Wyer, Survey: More Freshmen than Ever Say They Go to College to Get Better Jobs, Make More Money, The Higher Education Research Institute, Los Angeles, CA, USA, 2012, http://www.heri.ucla.edu.
  24. S. J. Messick, Assessment in Higher Education: Issues of Access, Quality, Student Development and Public Policy, Routledge, Abingdon, UK, 2013.
  25. A. W. Astin, “Student involvement: a developmental theory for higher education,” Journal of College Student Personnel, vol. 25, no. 4, pp. 297–308, 1984.
  26. A. W. Astin, Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education, Rowman and Littlefield Publishers, Lanham, MD, USA, 2012.
  27. T. Banta and G. Pike, “Revisiting the blind alley of value-added,” Assessment Update, vol. 19, no. 1, pp. 1-2, 2007.
  28. T. W. Banta, G. R. Pike, and M. J. Hansen, “The use of engagement data in accreditation, planning and assessment,” in Using NSSE in Institutional Research (New Directions for Institutional Research Series), R. M. Gonyea and G. D. Kuh, Eds., pp. 21–34, Jossey-Bass, San Francisco, CA, USA, 2009.
  29. T. W. Banta, “Assessment update: progress, trends, and practices in higher education,” Assessment Update, vol. 26, no. 2, p. 2, 2014.
  30. R. Benjamin, “The role of generic skills in measuring academic quality,” in Assessing Quality in Postsecondary Education: International Perspectives, H. P. Weingarten, M. Hicks, and A. Kauffman, Eds., pp. 49–64, McGill-Queen’s University Press, Montreal, QC, Canada, 2018.
  31. R. Benjamin, S. Klein, J. Steedle, D. Zahner, S. Elliot, and J. Patterson, The Case for Critical Thinking Skills and Performance Assessment, CAE, Montreal, QC, Canada, 2013, http://org/images/uploads/pdf/The_Case_for_Critical_Thinking_Skills.pdf.
  32. S. Klein, O. L. Liu, J. Sconing et al., Test Validity Study (TVS) Report. Supported by the Fund for the Improvement of Postsecondary Education, CAE, Montreal, QC, Canada, 2009, http://www.cae.org/content/pdf/TVS_Report.pdf.
  33. D. Zahner, Reliability of CLA+, Council for Aid to Education, New York, NY, USA, 2014.
  34. O. Zlatkin-Troitschanskaia, M. Toepper, D. Molerov et al., “Adapting and validating the collegiate learning assessment to measure generic academic skills of students in Germany: implications for international assessment studies in higher education,” in Methodology of Educational Measurement and Assessment, pp. 245–266, Springer, Cham, Switzerland, 2018.
  35. D. Zahner and J. James, Predictive Validity of a Critical Thinking Assessment for Post-College Outcomes, Council for Aid to Education, New York, NY, USA, 2015.
  36. R. Arum and J. Roksa, Aspiring Adults Adrift, University of Chicago Press, Chicago, IL, USA, 2014.
  37. P. Garcia, How to Assess Expected Value-Added: The CLA Method, California Association for Institutional Research, Monterey, CA, USA, 2007.
  38. E. C. Maskell and L. Collins, “Measuring student engagement in UK higher education: do surveys deliver?” Journal of Applied Research in Higher Education, vol. 9, no. 2, pp. 226–241, 2017.
  39. G. D. Kuh, “The national survey of student engagement: conceptual framework and overview of psychometric properties,” Indiana University Center for Postsecondary Research, Bloomington, IN, USA, 2001.
  40. G. D. Kuh, J. C. Hayek, R. M. Carini, J. A. Ouimet, R. M. Gonyea, and J. Kennedy, NSSE Technical and Norms Report, Indiana University Center for Postsecondary Research and Planning, Bloomington, IN, USA, 2001.
  41. G. D. Kuh, R. M. Carini, and S. P. Klein, Student Engagement and Student Learning: Insights from a Construct Validation Study, American Educational Research Association, San Diego, CA, USA, 2004.
  42. J. Kinzie, A. McCormick, and R. Gonyea, Using Student Engagement Results to Oversee Educational Quality, Association of Governing Boards, Washington, DC, USA, 2016, http://agb.org/trusteeship/2016/januaryfebruary/using-student-engagement-results-to-oversee-educational-quality.
  43. B. J. Mandernach, “Assessment of student engagement in higher education: a synthesis of literature and assessment tools,” International Journal of Learning, Teaching and Educational Research, vol. 12, no. 2, 2015.
  44. G. R. Pike, “NSSE benchmarks and institutional outcomes: a note on the importance of considering the intended uses of a measure in validity studies,” Research in Higher Education, vol. 54, no. 2, pp. 149–170, 2013.
  45. G. R. Pike, “Using college students’ self-reported learning outcomes in scholarly research,” in Validity and Limitations of College Student Self-Report Data (New Directions for Institutional Research Series), S. Herzog and N. A. Bowman, Eds., pp. 41–58, Jossey-Bass, San Francisco, CA, USA, 2011.
  46. R. M. Carini, G. D. Kuh, and S. P. Klein, “Student engagement and student learning: testing the linkages,” Research in Higher Education, vol. 47, no. 1, pp. 1–32, 2006.
  47. M. M. Handelsman, W. L. Briggs, N. Sullivan, and A. Towler, “A measure of college student course engagement,” The Journal of Educational Research, vol. 98, no. 3, pp. 184–192, 2005.
  48. C.-M. Zhao and G. D. Kuh, “Adding value: learning communities and student engagement,” Research in Higher Education, vol. 45, no. 2, pp. 115–138, 2004.
  49. J. M. Braxton, W. A. Jones, A. S. Hirschy, and H. V. Hartley III, “The role of active learning in college student persistence,” New Directions for Teaching and Learning, vol. 2008, no. 115, pp. 71–83, 2008.
  50. J. W. Kushman, C. Sieber, and P. Heariold-Kinney, “This isn’t the place for me: school dropout,” Youth at Risk: A Prevention Resource for Counselors, Teachers, and Parents, vol. 4, pp. 471–507, 2000.
  51. J. F. Milem and J. B. Berger, “A modified model of college student persistence: exploring the relationship between Astin’s theory of involvement and Tinto’s theory of student departure,” Journal of College Student Development, vol. 38, no. 4, p. 387, 1997.
  52. H. Coates, “The value of student engagement for higher education quality assurance,” Quality in Higher Education, vol. 11, no. 1, pp. 25–36, 2005.
  53. S. Fiorini, T. Liu, L. Shepard, and J. Ouimet, “Using NSSE to understand student success: a multi-year analysis,” in Proceedings of Annual Conference of the Indiana Association for Institutional Research, Indianapolis, IN, USA, October 2014.
  54. L. Niu, L. S. Behar-Horenstein, and C. W. Garvan, “Do instructional interventions influence college students’ critical thinking skills? A meta-analysis,” Educational Research Review, vol. 9, pp. 114–128, 2013.
  55. T. F. N. Laird, A. K. Garver, A. S. Niskodé-Dossett, and J. V. Banks, “The predictive validity of a measure of deep approaches to learning,” in Proceedings of Annual Meeting of the Association for the Study of Higher Education, Jacksonville, FL, USA, November 2008.
  56. E. T. Pascarella, T. A. Seifert, and C. Blaich, “Validation of the NSSE benchmarks and deep approaches to learning against liberal arts outcomes,” in Proceedings of Annual Meeting of the Association for the Study of Higher Education, Jacksonville, FL, USA, November 2008.
  57. C. Loes, E. Pascarella, and P. Umbach, “Effects of diversity experiences on critical thinking skills: who benefits?” The Journal of Higher Education, vol. 83, no. 1, pp. 1–25, 2012.
  58. E. T. Pascarella, G. L. Martin, J. M. Hanson, T. L. Trolian, B. Gillig, and C. Blaich, “Effects of diversity experiences on critical thinking skills over 4 years of college,” Journal of College Student Development, vol. 55, no. 1, pp. 86–92, 2014.
  59. S. R. Porter, Using Student Learning as a Measure of Quality in Higher Education. Context for Success: Measuring Colleges’ Impact, HCM Strategist, Washington, DC, USA, 2012.
  60. G. D. Kuh, J. Kinzie, J. H. Schuh, and E. J. Whitt, Student Success in College: Creating Conditions that Matter, John Wiley & Sons, Hoboken, NJ, USA, 2011.
  61. S. R. Porter, “Do college student surveys have any validity?” The Review of Higher Education, vol. 35, no. 1, pp. 45–76, 2011.
  62. J. R. Abel and R. Deitz, “Do the benefits of college still outweigh the costs?” Current Issues in Economics and Finance, vol. 20, no. 3, 2014.
  63. A. K. Biswas and J. Kirchherr, Is a College Degree worth it? Interventions are Needed to Enhance the Practical Relevance of Higher Education, London School of Economics and Political Science, London, UK, 2016.
  64. S. B. Dale and A. B. Krueger, “Estimating the effects of college characteristics over the career using administrative earnings data,” Journal of Human Resources, vol. 49, no. 2, 2014.