Education Research International
Volume 2019, Article ID 5243639, 8 pages
https://doi.org/10.1155/2019/5243639
Research Article

Differences in College Engagement Benchmark Scores as a Function of Honors Course Enrollment for Community College Students: A Nationwide Study

1Lone Star College-CyFair, Cypress, TX, USA
2Sam Houston State University, Huntsville, TX, USA

Correspondence should be addressed to John R. Slate; profslate@aol.com

Received 1 August 2018; Revised 18 March 2019; Accepted 15 April 2019; Published 23 May 2019

Academic Editor: Kirsi Tirri

Copyright © 2019 Abraham Korah et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In this investigation, the extent to which differences were present in benchmark scores as a function of community college students’ honors course enrollment status was examined using data from the Community College Survey of Student Engagement. Statistically significant differences were revealed for all five benchmark scores (i.e., active and collaborative learning, student effort, academic challenge, student-faculty, and support for learners). Students who had been enrolled in an honors course had benchmark scores that were 9 to 16 points higher than their peers who had not been enrolled in an honors course, reflecting higher levels of scholastic engagement, deeper connections with instructors and peers, and greater use of academic and student support services.

1. Introduction

Benchmark scores have become a common data point reviewed and analyzed by college administrators. Levy and Ronco [1] reported that the notion of benchmarking may have originated from the work of ancient Egyptian surveyors or cobblers taking measurements. Modern benchmarking provides organizations with information to measure institutional performance or completion of objectives. Data produced through benchmarking are used: (a) for reports to external local and state entities, (b) for accreditation agency reporting, and (c) to gauge internal performance [2]. Ewell [3] suggested that community colleges should harness reporting requirements and benchmarking to examine organizational performance and strengthen institutions.

A benchmark score can be used by community colleges and by 4-year universities to (a) determine if a goal was attained, (b) set a baseline for improvement, or (c) compare performance with a peer institution or a group of institutions. Benchmarks are defined as “quantitative standards or criteria by which something can be judged or measured” [3]. The Community College Survey of Student Engagement (CCSSE) is one of several instruments with benchmarking tools developed for community colleges; its counterpart for 4-year postsecondary settings, the National Survey of Student Engagement (NSSE), was created by the Lumina Foundation. The Center for Community College Student Engagement (CCCSE) developed the CCSSE to measure the frequency and success of community college initiatives that helped students reach their postsecondary educational goals [4]. According to McClenney [5], the survey is grounded in research findings from (a) Pace [6], regarding the experiences of students; (b) Astin’s [7] work on student involvement; (c) Chickering and Gamson’s [8] effective undergraduate practices; and (d) Kuh’s [9] focus on student engagement.

Institutional practices can be developed to encourage student success. Tinto [10] focused on four elements present in CCSSE benchmarks that have institutional influence: (a) setting high expectations of students; (b) supporting students in the academic, social, and financial realms; (c) offering frequent and timely assessment and communication with students; and (d) providing students with opportunities for involvement. In a study of the five CCSSE benchmark scores, McClenney and Marti [11] observed that student engagement had a moderate effect on GPA for students enrolled in Florida community colleges. When examining individual CCSSE benchmarks, McClenney and Marti [11] reported small effects for the active and collaborative learning benchmark on course completion and associate degree attainment. A small effect on associate degree attainment was also observed for the student effort and support for learners benchmarks. Greater levels of engagement had the most positive influence on the GPA of academically underprepared and Black students. When three independent studies were examined, the active and collaborative learning, academic challenge, and student-faculty interaction benchmarks had the greatest influence on degree attainment [11]. Price and Tovar [12] reported that active and collaborative learning and support for learners had predictive value for institutional graduation rates.

The resulting data can be used by institutions to improve teaching and learning [13]. Since 2003, the survey has been administered in random sections of courses annually during the spring semester [5]. Upon completion of CCSSE administration by institutions, colleges receive data that can be used to compare: (a) full-time and part-time students; (b) individual institutional data with all participating institutions; (c) individual institutional data with institutions of a similar size; and (d) consortium data if a college is part of a consortium [3].

Results of the CCSSE survey are used to generate five benchmarks: (a) active and collaborative learning, (b) student effort, (c) academic challenge, (d) student-faculty, and (e) support for learners. Scholarship related to benchmark score comparisons based on honors course enrollment status is limited. Ross and Roman [14], in an analysis of honors and nonhonors students at one Florida community college using CCSSE survey results, determined that a higher degree of academic engagement was present in honors courses than in general courses. Conceptually related survey items examined by Ross and Roman [14] were grouped together to develop benchmarks. Honors students indicated a greater degree of class participation and academic preparation than nonhonors students, and they more often reported working harder than they believed they could. Also, honors students indicated that honors courses placed greater emphasis on critical thinking, including analysis, synthesis, and problem solving. Ross and Roman [14] indicated lower levels of engagement for honors students than for nonhonors students regarding career plans, career goals, e-mail communication with faculty, discussion of grades or assignments with faculty, and solving numerical problems.

The first CCSSE benchmark, active and collaborative learning, can be used to understand academic participation [15]. For this benchmark, students were asked to answer 12 questions that described how often they participated in specific activities in the classroom, including class discussions, presentations, and group work. Students answered questions about class-related activities occurring outside of the classroom, including questions about (a) group projects, (b) tutoring or teaching other students, and (c) participation in a community-based project. Discussion of readings or course information was also factored into the active and collaborative learning benchmark. This benchmark was developed using Chickering and Gamson’s [8] principles for student-to-student collaboration and active learning techniques.

The second benchmark, student effort, is calculated based on student responses to eight questions [15]. The questions include (a) preparation of multiple drafts of a paper, (b) working on projects requiring synthesis of researched sources, (c) attending class unprepared, (d) personal reading, and (e) time spent preparing for classes. Other questions in this benchmark are related to the use of tutoring services, skill labs (e.g., writing and mathematics), and computer labs. This benchmark was developed using Chickering and Gamson’s [8] undergraduate principle regarding the importance of time-on-task, a quantification of student effort.

Academic challenge, the third benchmark, is calculated based on the responses to 10 questions that reflect the academic rigors experienced by students [15]. Five questions are focused on mental activities in courses, including (a) conceptual analysis, (b) synthesis of information, (c) evaluation of data, (d) theoretical applications, and (e) development of new skills using current information. This benchmark was created based on Chickering and Gamson’s [8] principle of high expectations, time-on-task guidelines, and active learning recommendations. Although students may initially express negative feelings about rigorous work, substantive learning that goes beyond rudimentary exercises results in students expressing positive feelings about learning [16]. Responses for questions about the quantity of course materials and written papers and time spent studying were included in the academic challenge score. Lastly, questions related to the level of challenge presented to students through exams and by instructors were included in the benchmark.

The fourth benchmark, student-faculty interaction, is a measure of connections between students and faculty. The benchmark is calculated by examining student responses to six questions. Responses to three questions about the frequency of communication with faculty, including discussion of grades and assignments, use of e-mail for correspondence, and receiving written or verbal feedback on performance, were used to calculate the benchmark. The other three questions included in this benchmark are related to discussions with faculty on a variety of topics including readings or class materials, career plans, and work on activities beyond coursework. The foundation for the development of this benchmark was Chickering and Gamson’s [8] assertion that increased faculty-student interactions led to increased motivation and engagement for students.

The fifth CCSSE benchmark, support for learners, consists of responses to seven questions related to the level of support perceived by students from their institution. Questions are related to the level of support available to help students succeed, which include (a) encouragement of interactions with a diverse student community; (b) support in managing nonacademic responsibilities; (c) social support; and (d) financial support. Two additional questions related to the utilization of academic advising and career counseling were also included in this benchmark. This benchmark is derived from Chickering and Gamson’s [8] suggestion that the institutional environment has a significant influence on the quality of a student’s education.

2. Statement of the Problem

Approximately 56.4% of students who first enrolled at a public community college in 2014 continued into their second year [17]. This perceived lack of success has led to a focus on collegiate practices. Questions have been raised by government officials and the public about the role of taxpayer subsidies for educational initiatives [18], and the level of public financial support has trended lower [19]. Community college administrators are being asked by accreditation boards; local, state, and federal government agencies; and the public to demonstrate institutional effectiveness through data that illustrate standards and cost-effectiveness [1, 2]. Therefore, the process of collecting, analyzing, and utilizing data to develop effective initiatives that benefit students as they work toward educational and career goals is vital.

Many benchmarking endeavors are characterized by an informal collection and utilization of best practices from internal and external entities [1]. Although benchmarking is a standard practice in business settings, formalized benchmarking processes are not common in higher education. The culture of higher education has been resistant to the use of assessment tools, and administrators in higher education have been slow to collect the clarifying data expected for improvement [1]. Also, the data collection process is challenging and expensive, with no guarantee that institutions will benefit from the investment.

Community colleges can benefit by making use of data to improve institutional performance and student success. Many traditional measures of institutional effectiveness may provide an inaccurate picture of community college effectiveness [3]. Benchmarking becomes difficult when institutions operate independently to be responsive to demands by the community in which the college is located. Many higher education performance measures were developed based on measures of success, such as retention and graduation, that are more difficult to attain in an open enrollment educational setting [3]. Community colleges benefit through the development of benchmarking tools that function in a manner in which the effectiveness of institutional programs and processes is reliably measured [4].

3. Purpose of the Study

The purpose of this study was to determine the degree to which differences were present in college engagement benchmark scores between students who had been enrolled in an honors course and students who had not been enrolled in an honors course. Specifically addressed were active and collaborative learning benchmark scores, student effort benchmark scores, academic challenge benchmark scores, student-faculty benchmark scores, and support for learners benchmark scores by the honors course enrollment status of community college students. Because a national dataset was analyzed in this empirical study, information obtained may be of interest to community college administrators in the United States. Readers should note that this article resulted from Korah’s [20] doctoral dissertation.

4. Significance of the Study

Honors education, particularly at community colleges, has not been examined extensively. Community college samples are used by education researchers in less than 10% of higher education investigations [11]. Current research specifically focused on honors education in community colleges is nominal [21, 22]. The majority of honors education related dissertations and publications have been qualitative [22], and a large-scale quantitative study of honors education in community colleges has not occurred since the late 1990s [23]. Therefore, community college administrators and leaders may consider results of this empirical investigation of college engagement when determining strategies for allocating limited resources.

5. Research Questions

In this empirical investigation, one overarching research question was addressed: What is the difference in college engagement benchmark scores between students who had been enrolled in an honors course and students who had not been enrolled in an honors course? Specific subquestions under this overarching research question were (a) What is the difference in active and collaborative learning benchmark scores by honors course enrollment status?; (b) What is the difference in student effort benchmark scores by honors course enrollment status?; (c) What is the difference in academic challenge benchmark scores by honors course enrollment status?; (d) What is the difference in student-faculty benchmark scores by honors course enrollment status?; and (e) What is the difference in support for learners benchmark scores by honors course enrollment status?

6. Method

6.1. Research Design

A nonexperimental, causal-comparative research design was used in this study [24, 25]. In this type of nonexperimental causal-comparative research, the independent variable cannot be manipulated. The events represented through the archival data had already occurred [25]. The independent variable that was analyzed was the honors course enrollment status of community college students who completed the survey. The dependent variables were the college engagement benchmark scores of community college students who participated in the survey.

6.2. Participants and Instrumentation

The CCCSE provided an archival data set consisting of a 25% random sample of the 2014 three-year (2012 through 2014) CCSSE cohort. The data set contained responses from 108,509 students who completed the CCSSE survey, including almost 7,000 students who indicated enrolling in an honors course. A total of 684 institutions in 48 states, the District of Columbia, select Canadian provinces, and three island nations [26] were included in the data set. The locations of the community colleges in the sample included 147 colleges in urban areas, 149 colleges in suburban areas, and 395 colleges in rural areas. The sizes of enrollments in the dataset also varied: 296 small colleges with enrollments of fewer than 4,500 students; 168 medium-sized colleges with enrollments of 4,500 to 7,999 students; 141 large colleges with 8,000 to 14,999 students; and 79 extra-large colleges with 15,000 or more students [15].

The CCSSE survey included 38 questions designed to ascertain student views on the academic and nonacademic college environment. Question types included ratings, Likert scales, and multiple-choice questions. The reliability, validity, and consistency between first and second administrations of the survey have been validated [13]. Responses from survey items related to institutional practices and student behaviors that bolster student engagement and positively influence learning and persistence were used to calculate benchmark scores [15]. Specifically, responses for questions related to active and collaborative learning, student effort, academic challenges, student-faculty interactions, and support for learners were used to develop benchmarks.

The active and collaborative learning benchmark is a measure of student participation in class discussions, presentations, group work, outside class group projects, peer tutoring, participation in community projects, and discussion of course information outside of the classroom [15]. The student effort benchmark score is based on academic efforts in preparation of multiple paper drafts, synthesis of information, attending class unprepared, personal reading, preparation for classes, and utilization of academic support services and facilities. The academic challenge benchmark is a synthesis of student responses to questions about the mental activities required for courses, quantity of academic work, amount of student effort, and level of challenge in exams and from instructors. Student-faculty interactions measure the connection between students and faculty developed through frequency of communication and topic of communication. Support for learners is a benchmark of the level of academic support services and personal support services available to students [15].

7. Results

Data were analyzed to determine the extent to which differences were present in college engagement benchmark scores between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course. A multivariate analysis of variance (MANOVA) was conducted because the dependent variables, the five benchmark scores (i.e., active and collaborative learning, student effort, academic challenge, student-faculty, and support for learners), consisted of continuous, interval-level data. Prior to conducting the MANOVA procedure, the underlying statistical assumptions were checked. Specifically examined were Box’s Test of Equality of Covariance Matrices and Levene’s Test of Equality of Error Variances. Although these assumptions were not met, the MANOVA procedure was retained because Field [27] contends that the procedure is sufficiently robust to such violations to remain appropriate for this investigation.
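The multivariate test statistic used here, Wilks’ Λ, can be computed directly from the within-group and between-group sums-of-squares-and-cross-products (SSCP) matrices. The sketch below is purely illustrative: the data are simulated, not drawn from the CCSSE dataset, and the function name and group sizes are invented for the example.

```python
import numpy as np

def wilks_lambda(groups):
    """Wilks' Lambda for a one-way MANOVA.

    groups: list of (n_i, p) arrays, one per group, where p is the
    number of dependent variables (here, five benchmark scores).
    Lambda = det(W) / det(W + B), where W and B are the within- and
    between-group SSCP matrices; values near 1 mean little group
    separation, values near 0 mean strong separation.
    """
    all_data = np.vstack(groups)
    grand_mean = all_data.mean(axis=0)
    p = all_data.shape[1]
    W = np.zeros((p, p))
    B = np.zeros((p, p))
    for g in groups:
        gm = g.mean(axis=0)
        centered = g - gm                      # deviations from group mean
        W += centered.T @ centered             # within-group SSCP
        d = (gm - grand_mean).reshape(-1, 1)
        B += len(g) * (d @ d.T)                # between-group SSCP
    return np.linalg.det(W) / np.linalg.det(W + B)

# Simulated data: "honors" group shifted upward on all five benchmarks
rng = np.random.default_rng(0)
honors = rng.normal(loc=60, scale=10, size=(500, 5))
nonhonors = rng.normal(loc=50, scale=10, size=(500, 5))
lam = wilks_lambda([honors, nonhonors])
```

With a genuine mean shift between groups, Λ falls well below 1; in the study’s actual data, Λ = .96 reflects a statistically significant but small multivariate difference.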

The MANOVA revealed a statistically significant difference, Wilks’ Λ = .96, p < .001, partial η² = .04, in college engagement benchmark scores between students who had been enrolled in an honors course and students who had not been enrolled in an honors course. Using Cohen’s [28] criteria, a small effect size was present. Follow-up univariate analysis of variance procedures revealed statistically significant differences between students who had enrolled in an honors course and students who had not enrolled in an honors course in their active and collaborative learning benchmark score, F(1, 79092) = 2664.64, p < .001, partial η² = .033, a small effect size; student effort benchmark score, F(1, 79092) = 1128.90, p < .001, partial η² = .014, a small effect size; academic challenge benchmark score, F(1, 79092) = 1237.61, p < .001, partial η² = .015, a small effect size; student-faculty benchmark score, F(1, 79092) = 2057.48, p < .001, partial η² = .025, a small effect size; and support for learners benchmark score, F(1, 79092) = 919.14, p < .001, partial η² = .011, a small effect size. Accordingly, all five effect sizes in this investigation were small [28].
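For the follow-up univariate tests, the F statistic and partial η² for a two-group comparison can both be recovered from the between- and within-group sums of squares. A minimal sketch, with simulated scores standing in for the benchmark data (the group sizes and means are invented):

```python
import numpy as np

def oneway_f_and_eta(y1, y2):
    """One-way ANOVA for two groups: returns F(1, N - 2) and partial eta squared.

    With a single factor, partial eta squared reduces to
    SS_between / (SS_between + SS_within).
    """
    y = np.concatenate([y1, y2])
    grand = y.mean()
    ss_between = (len(y1) * (y1.mean() - grand) ** 2
                  + len(y2) * (y2.mean() - grand) ** 2)
    ss_within = ((y1 - y1.mean()) ** 2).sum() + ((y2 - y2.mean()) ** 2).sum()
    # df_between = 1 for two groups, df_within = N - 2
    f_stat = ss_between / (ss_within / (len(y) - 2))
    eta_p2 = ss_between / (ss_between + ss_within)
    return f_stat, eta_p2

# Simulated benchmark scores: honors group roughly 10 T-score points higher
rng = np.random.default_rng(1)
honors = rng.normal(60, 10, size=2000)
nonhonors = rng.normal(50, 10, size=8000)
f_stat, eta_p2 = oneway_f_and_eta(honors, nonhonors)
```

Note that with samples as large as those in this study (df = 79,092), even small partial η² values produce very large F statistics, which is why effect size rather than the p-value carries the practical interpretation.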

Following these five univariate analysis of variance procedures, descriptive statistics were examined to determine where the statistically significant differences were present. Readers should note that all benchmark scores were converted to a T-score reporting format so that scores across the benchmarks could be compared and contrasted. In a T-score format, the mean (M) is always 50 and the standard deviation (SD) is always 10. With respect to the active and collaborative learning benchmark scores, students who had been enrolled in an honors course had an average score that was approximately 16 points higher than that of students who had not been enrolled in an honors course. Presented in Table 1 are the descriptive statistics pertaining to this analysis.
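The T-score format described above is a simple linear rescaling of standardized (z) scores. A small illustrative sketch (the raw values are invented, not CCSSE items):

```python
import numpy as np

def to_t_scores(raw):
    """Rescale raw scores so the distribution has mean 50 and SD 10."""
    raw = np.asarray(raw, dtype=float)
    z = (raw - raw.mean()) / raw.std()  # standardize to mean 0, SD 1
    return 50 + 10 * z                  # shift/scale into T-score units

raw = np.array([2.1, 2.8, 3.0, 3.4, 3.9])
t_scores = to_t_scores(raw)
```

Because every benchmark is placed on the same M = 50, SD = 10 scale, a 16-point gap on one benchmark and a 9-point gap on another are directly comparable as fractions of a standard deviation (1.6 SD and 0.9 SD, respectively).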

Table 1: Descriptive statistics for active and collaborative learning benchmark scores by honors course enrollment status.

The second research subquestion was focused on student effort benchmark scores by honors course enrollment status. The average benchmark score for students who had been enrolled in an honors course was approximately 10 points higher than for their peers who had not been enrolled in an honors course. The difference in benchmark scores reflects student effort in academic preparation of multiple paper drafts, synthesis of information, frequency of attending classes unprepared, personal reading, preparation for classes, and use of academic services and facilities. The descriptive statistics for this analysis are delineated in Table 2.

Table 2: Descriptive statistics for student effort benchmark scores by honors course enrollment status.

The focus of the third research subquestion was on academic challenge benchmark scores by honors course enrollment status. Students who had been enrolled in an honors course had an average academic challenge benchmark score that was approximately 11 points higher than that of their peers who had not been enrolled in an honors course. As such, students who had been enrolled in an honors course reported more engagement in the intellectual activities required for courses, quantity of academic work, amount of student effort, and level of challenge experienced during exams and from instructors than their peers who had not been enrolled in an honors course. Descriptive statistics for this analysis are revealed in Table 3.

Table 3: Descriptive statistics for academic challenge benchmark scores by honors course enrollment status.

The fourth research subquestion was focused on student-faculty benchmark scores by student honors course enrollment status. Benchmark scores for students who had been enrolled in an honors course were approximately 14 points higher than their peers who had not been enrolled in an honors course. With respect to the student-faculty benchmark score, students who had been enrolled in an honors course reported more frequent communication and greater breadth in topics of communication between instructors and students than their peers who had not been enrolled in an honors course. Table 4 contains the descriptive statistics for this analysis.

Table 4: Descriptive statistics for student-faculty interaction benchmark scores by honors course enrollment status.

The focus of the fifth research subquestion was on the support for learners benchmark scores by honors course enrollment status. Students who had been enrolled in an honors course had an average score that was approximately 9 points higher for this benchmark, a measure of academic support services and student support services available for students, than their peers who had not been enrolled in an honors course. Presented in Table 5 are the descriptive statistics for this analysis.

Table 5: Descriptive statistics for support for learners benchmark scores by honors course enrollment status.

8. Discussion

In this empirical investigation, the degree to which differences were present in college engagement benchmark scores between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course was addressed. National data from more than 108,000 community college students who completed the CCSSE survey were used to conduct this analysis. Statistically significant differences were revealed for all five benchmark scores (i.e., active and collaborative learning, student effort, academic challenge, student-faculty, and support for learners). Students who had been enrolled in an honors course had benchmark scores that were 9 to 16 points higher than their peers who had not been enrolled in an honors course, reflecting higher levels of scholastic engagement, deeper connections with instructors and their peers, and greater use of academic and student support services.

8.1. Connections with Existing Literature

The benchmark scores that had the greatest predictive value regarding student graduation rates were the active and collaborative learning and support for learners benchmarks [12]. The finding regarding the predictive value of the active and collaborative learning benchmark was also observed by McClenney and Marti [11]. McClenney and Marti [11] also reported that the academic challenge and student-faculty benchmarks had predictive value regarding degree attainment. With respect to the active and collaborative learning benchmark, students who had been enrolled in honors courses averaged 16.14 more points for this benchmark, the largest disparity between the two groups among the five benchmark scores. Thus, students who had been enrolled in an honors course had more opportunities to participate in class discussions, presentations, group work, outside class group projects, peer tutoring, participation in community projects, and discussions of course information outside the classroom. A combination of in-class and out-of-class opportunities may positively influence students striving to reach their academic goals.

The second largest disparity in benchmark scores, 14.21 points, between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course, was present in the student-faculty interaction benchmark. Students who had been enrolled in an honors course reported greater frequency of communication with faculty (i.e., discussion of grades and assignments, e-mail correspondence, and written or verbal feedback on academic performance) and greater frequency of interactions with faculty (i.e., discussion of readings or class material, career plan consultation, or work on noncourse activities). Greater interactions with faculty may be influenced by the smaller class sizes common in honors courses [29, 30]. Smaller classes may provide more opportunities for interaction between students and faculty and may increase rapport between students and instructors. Students’ class participation has been observed to decrease because of fear or lack of confidence [31] and because of perceptions of faculty as infallible authorities [32]. According to Weaver and Qi [33], interacting with faculty outside of the classroom led students to feel more confident, be less fearful of faculty criticism, and participate more in class discussions than their peers who had fewer out-of-class interactions with faculty.

The support for learners benchmark (i.e., institutional encouragement to interact with a diversity of students, support managing nonacademic responsibilities, social support, financial support, use of academic advising, and use of career counseling) had the smallest mean difference in scores of all five benchmarks, approximately 9 points, between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course. Thus, although students who had been enrolled in an honors course perceived a greater level of support from the academic institution, the contrast was not as stark as for the other benchmarks. The use of academic advising services has been positively related to retention [34, 35] and persistence [36] rates. Consulting with financial aid counselors can also positively influence student retention through financial guidance regarding college costs and financial aid [37].

8.2. Implications for Policy and for Practice

The findings from this study lead to several implications for policy and practice. First, students who had been enrolled in an honors course had a greater number of opportunities for a more robust academic experience, as reflected in the large disparity in the active and collaborative learning benchmark score. This benchmark also had the greatest predictive value for degree attainment [11, 12]. Thus, institutional leaders should consider current curriculum and teaching strategies to determine if instructional techniques should be modified to include more active and collaborative learning opportunities. Opportunities outside the classroom such as field work, civic engagement, or service learning may also provide experiential opportunities to enhance student learning.

Second, students who had not been enrolled in an honors course had statistically significantly fewer interactions with faculty than their peers who had been enrolled in an honors course. Although the larger number of interactions in honors courses may partially reflect smaller class sizes, it is vital to provide opportunities for all students who desire connections with faculty inside and outside the classroom. Administrators should examine student-to-instructor ratios to determine if some disciplines need more instructors in order to reduce class sizes. Additionally, instructors should be provided with informal opportunities to interact with students such as advising student clubs, serving as academic advisors, or by providing supplemental instruction through labs or one-on-one tutoring. Both monetary and nonmonetary rewards may be considered as additional incentives for faculty.

8.3. Recommendations for Future Research

Based upon the results of this study, several recommendations for future research can be made. First, as few investigations of honors programs in community colleges have been published, opportunities exist for more inquiries into honors programs in community colleges [21, 22]. Specifically, quantitative investigations are minimal in the literature. Second, this quantitative study should be replicated by researchers using more current data to determine if similar conclusions can be drawn.

Third, researchers should consider using data from the National Survey of Student Engagement to extend this investigation to students at 4-year universities and determine the generalizability of these findings. Fourth, an examination of CCSSE data that is reflective of scholastic engagement and faculty engagement, between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course, is of value. Lastly, further research using CCSSE data on differences in student support service use between these two groups of students is recommended. Additional research on support service use that analyzes CCSSE data would be complementary to this study.

9. Conclusion

In this nationwide investigation, the extent to which differences were present in benchmark scores between students who had been enrolled in an honors course and their peers who had not been enrolled in an honors course was examined. Statistically significant differences were revealed for all five benchmark scores (i.e., active and collaborative learning, student effort, academic challenge, student-faculty interaction, and support for learners) between the two groups of students. Students who had been enrolled in an honors course had benchmark scores that were 9 to 16 points higher than students who had not been enrolled in an honors course, reflecting higher levels of scholastic engagement, deeper connections with instructors and peers, and greater use of academic and student support services. Opportunities may exist for community college leaders to reduce this benchmark score disparity by championing innovative instructional techniques, encouraging student use of support services, and providing opportunities for students to have positive interactions with faculty.

Data Availability

The data we analyzed were provided to us free of charge by the organization that administers the CCSSE survey. Readers who want similar data will need to contact that organization.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This research was conducted as part of the first author’s doctoral dissertation [20].

References

  1. G. D. Levy and S. L. Ronco, “How benchmarking and higher education came together,” New Directions for Institutional Research, vol. 2012, no. 156, pp. 5–13, 2012.
  2. T. Bers, “Surveys and benchmarks,” New Directions for Institutional Research, vol. 2012, no. 153, pp. 33–48, 2012.
  3. P. T. Ewell, “Accountability and institutional effectiveness in the community college,” New Directions for Community Colleges, vol. 2011, no. 153, pp. 23–36, 2011.
  4. A. Nora, G. Crisp, and C. Matthews, “A reconceptualization of CCSSE’s benchmarks of student engagement,” The Review of Higher Education, vol. 35, no. 1, pp. 105–130, 2011.
  5. K. M. McClenney, “Research update: the community college survey of student engagement,” Community College Review, vol. 35, no. 2, pp. 137–146, 2007.
  6. C. R. Pace, Measuring the Quality of College Student Experiences: An Account of the Development and Use of the College Student Experiences Questionnaire, University of California, Los Angeles, CA, USA, 1984.
  7. A. W. Astin, “Student involvement: a developmental theory for higher education,” Journal of College Student Development, vol. 25, no. 4, pp. 297–308, 1984.
  8. A. W. Chickering and Z. F. Gamson, “Seven principles for good practice in undergraduate education,” Biochemical Education, vol. 17, no. 3, pp. 140–141, 1987.
  9. G. D. Kuh, “Assessing what really matters to student learning inside the national survey of student engagement,” Change: The Magazine of Higher Learning, vol. 33, no. 3, pp. 10–17, 2001.
  10. V. Tinto, Completing College: Rethinking Institutional Action, The University of Chicago Press, Chicago, IL, USA, 2012.
  11. K. M. McClenney and C. N. Marti, “Exploring Relationships between Student Engagement and Student Outcomes in Community Colleges: Report on Validation Research,” Community College Survey of Student Engagement, 2006, http://www.ccsse.org/center/resources/docs/publications/CCSSE_Validation_Research.pdf.
  12. D. V. Price and E. Tovar, “Student engagement and institutional graduation rates: identifying high-impact educational practices for community colleges,” Community College Journal of Research and Practice, vol. 38, no. 9, pp. 766–782, 2014.
  13. C. N. Marti, “Dimensions of student engagement in American community colleges: using the community college student report in research and practice,” Community College Journal of Research and Practice, vol. 33, no. 1, pp. 1–24, 2008.
  14. L. O. Ross and M. A. Roman, “Assessing student learning in community college honors programs using CCCSE course feedback forms,” Journal of the National Collegiate Honors Council, vol. 10, no. 2, pp. 73–92, 2009.
  15. Community College Survey of Student Engagement, How Benchmarks are Calculated, 2017, http://www.ccsse.org/survey/docs/How_Benchmarks_are_Calculated.pdf.
  16. S. L. Payne, K. L. M. Kleine, J. Purcell, and G. R. Carter, “Evaluating academic challenge beyond the NSSE,” Innovative Higher Education, vol. 30, no. 2, pp. 129–146, 2005.
  17. American College Testing, National Collegiate Retention and Persistence to Degree Rates, 2016, https://www.ruffalonl.com/documents/shared/Papers_and_Research/ACT_Data/ACT_2016.pdf.
  18. K. Field, E. Kelderman, and A. Bidwell, “Obama says colleges should keep costs low and hints at “alternative” accreditation,” Chronicle of Higher Education, vol. 59, no. 24, pp. A3–A4, 2013.
  19. D. J. Phelan, “The clear and present funding crisis in community colleges,” New Directions for Community Colleges, vol. 2014, no. 168, pp. 5–16, 2014.
  20. A. Korah, “Differences in college engagement of students as a function of community college honors course status: a nationwide study,” Ph.D. thesis, Sam Houston State University, Huntsville, TX, USA, 2018, https://shsu-ir.tdl.org/handle/20.500.11875/2333.
  21. C. Achterberg, “What is an honors course?” Academic Leader, vol. 20, no. 9, p. 4, 2004.
  22. D. K. Holman and J. H. Banning, “Honors dissertation abstracts: a bounded qualitative meta-study,” Journal of the National Collegiate Honors Council, vol. 13, no. 1, pp. 41–61, 2012.
  23. C. Outcalt, Community College Honors Programs: An Overview, ERIC Clearinghouse for Community Colleges, Los Angeles, CA, USA, 1999.
  24. J. W. Creswell, Research Design: Qualitative, Quantitative, and Mixed Methods Approaches, SAGE Publishing, Thousand Oaks, CA, USA, 4th edition, 2013.
  25. R. B. Johnson and L. Christensen, Educational Research: Quantitative, Qualitative, and Mixed Approaches, SAGE Publishing, Thousand Oaks, CA, USA, 4th edition, 2012.
  26. Community College Survey of Student Engagement, About the CCSSE Survey, 2017, http://www.ccsse.org/aboutsurvey/aboutsurvey.cfm.
  27. A. Field, Discovering Statistics Using SPSS, SAGE Publishing, Thousand Oaks, CA, USA, 3rd edition, 2009.
  28. J. Cohen, Statistical Power Analysis for the Behavioral Sciences, Lawrence Erlbaum, Hillsdale, NJ, USA, 2nd edition, 1988.
  29. National Collegiate Honors Council, Definition of Honors Education, National Collegiate Honors Council, Lincoln, NE, USA, 2013, https://www.nchchonors.org/uploaded/NCHC_FILES/PDFs/Definition-of-Honors-Education.pdf.
  30. R. Otero, R. Spurrier, and G. Lanier, “A practical handbook for honors program and honors college evaluation and assessment,” National Collegiate Honors Council, 2011, https://c.ymcdn.com/sites/nchc.site-ym.com/resource/collection/f70e8c21-d030-4764-b878-b233de1ee5dd/A%20Practical%20Handbook%20for%20Assessment%20and%20Evalua.pdf?hhSearchTerms=%22A+PRACTICAL+HANDBOOK+FOR+HONORS+PROGRAM+AND+HONORS%22.
  31. J. R. Howard, G. H. James III, and D. R. Taylor, “The consolidation of responsibility in the mixed-age college classroom,” Teaching Sociology, vol. 30, no. 3, pp. 214–234, 2002.
  32. J. R. Howard and R. Baird, “The consolidation of responsibility and students’ definitions of situation in the mixed-age college classroom,” The Journal of Higher Education, vol. 71, no. 6, pp. 700–721, 2000.
  33. R. R. Weaver and J. Qi, “Classroom organization and participation: college students’ perceptions,” The Journal of Higher Education, vol. 76, no. 5, pp. 570–601, 2005.
  34. D. K. Hatch and C. E. Garcia, “Academic advising and the persistence intentions of community college students in their first weeks in college,” The Review of Higher Education, vol. 40, no. 3, pp. 353–390, 2017.
  35. Noel-Levitz, Inc., Academic Advising Highly Important to Students (ED541564), Noel-Levitz, Inc., Davenport, IA, USA, 2009.
  36. T. Bailey and M. Alfonso, Paths to Persistence: An Analysis of Research on Program Effectiveness at Community Colleges, Lumina Foundation for Education, Indianapolis, IN, USA, 2005.
  37. L. McKinney and T. Roberts, “The role of community college financial aid counselors in helping students understand and utilize financial aid,” Community College Journal of Research and Practice, vol. 36, no. 10, pp. 761–774, 2012.