Research Article | Open Access
Tanya M. Horacek, Elif Dede Yildirim, Dean Seidman, Carol Byrd-Bredbenner, Sarah Colby, Adrienne A. White, Karla P. Shelnutt, Melissa D. Olfert, Anne E. Mathews, Kristin Riggsbee, Lisa Franzen-Castle, Jesse Stabile Morrell, Kendra Kattelmann, "Redesign, Field-Testing, and Validation of the Physical Activity Campus Environmental Supports (PACES) Audit", Journal of Environmental and Public Health, vol. 2019, Article ID 5819752, 13 pages, 2019. https://doi.org/10.1155/2019/5819752
Redesign, Field-Testing, and Validation of the Physical Activity Campus Environmental Supports (PACES) Audit
This paper describes the redesign, field-testing, and convergent validity of a practical tool, the Physical Activity Campus Environmental Supports (PACES) audit. Methods. The audit includes two parts: (1) PACES-Programs, which comprises questions regarding populations served, fees, programs (recreation/fitness classes and intramurals), proximity, adequacy of facilities, and marketing, and (2) PACES-Facilities, which comprises questions regarding the built environment (aesthetics, bike racks, stairs, and universal design), recreation equipment, staff, amenities, and access. Each item criterion is scored using a five-point, semantic-differential scale ranging from limited to extensive environmental support. A few questions use a select-all-that-apply format for a summed score. PACES training, interrater reliability, and data collection are all accessible via an online portal. PACES was tested on 76 college campuses. Convergent validity was examined by comparing the PACES-Programs questions to the Healthy Campus Initiatives-Programs (HCI-Programs) questions and the PACES-Facilities questions to questions contained in the Physical Activity Resource Assessment (PARA) instrument. Statistical analyses included Cronbach’s alpha, ANOVA, latent profile analysis, and Spearman correlations. Results. The PACES-Programs audit includes 10 items for a potential total of 73 points (), and the PACES-Facilities audit includes 15 items for a potential total of 77 points (). Most (77.8%) of the 153 facilities assessed scored in the most healthful range (20–42), mainly due to the extensiveness of their aerobic equipment/amenities and the competence/accessibility of their staff. PACES-Total and PACES-Programs scores differed significantly by campus size, and PACES-Facilities scores differed by region.
For the paired validation assessments, correlations were significant between PACES-Programs and HCI-Programs (, ) and between PACES-Facilities and PARA for both features (, ) and amenities (, ), indicating moderate convergent validity. Conclusion. The PACES audit is a valid, reliable tool for assessing the quality of recreation facilities and programs in a variety of college campus environments.
Obesity prevention guidelines recommend regular physical activity throughout the lifespan to prevent disease and promote good health [1–3]. The availability, access, quality, and usage of recreation facilities and programs have been identified as factors influencing a population’s level of physical activity [4–8]. For school children, the number of outdoor facilities at school was associated with higher physical activity levels, and adolescents were more active using public recreation spaces (rather than private) or open field times [10, 11]. On college campuses, recreation facility usage was related to favorable health indices. However, student participation in campus recreation programs declined when membership fees were charged to use the facility, highlighting financial considerations as a barrier to achieving physical activity. Given the basic relationship between the availability of recreation facilities and physical activity levels, the quality and extensiveness of recreation programs and facilities require further study.
The existing tools available to evaluate the quality of recreation facilities/programs are limited. The Worksite Health Promotion Readiness Checklist, a simple yes/no survey, is available to assess the health promotion and protection practices and policies in worksites. The SPOTLIGHT virtual audit tool assesses the presence of indoor/outdoor recreation facilities and public parks via the street view feature of Google Earth/GIS. Another tool based upon Total Quality Management (TQM) evaluates recreation facilities from a variety of user viewpoints, with a focus on safety, condition, and maintenance. This tool can be used to evaluate recreation centers, parks, playgrounds, aquatic facilities/pools, and sports fields, and it was found to be a reliable and effective measure of the physical features (amenities) of a recreation facility.
A variety of tools are available to assess student, employee, alumni, and community satisfaction with or perceptions of recreation services [17–21]. One tool evaluates students’ satisfaction levels, perceived service quality, and behavioral intentions for continued use of campus recreation facilities and programs. Using a Likert scale, the topics include facility ambiance, operations quality, staff competency, overall satisfaction, and behavioral intentions. Another tool evaluates students’ perceptions of the effectiveness of, and their satisfaction with, campus recreation programs. It assesses personal treatment, budget, academic support, individual performance, and ethics. Using a Likert scale, another tool, the Scale of Service Quality in Recreation Sports, assesses recreation program clients’ perceptions of quality and satisfaction. Specifically, this tool assesses quality based on the range of programs, operating times, client-employee interactions, interclient interactions, physical changes, valence, social ability, ambient conditions, design, equipment, and satisfaction ratings of programs. A study of students’ perceptions indicated that recreation program administration and promotion were important factors because many of the students were unaware of the existence of the available recreation programs. The study also showed that students’ perceptions of recreation facilities available on campus differ between men and women and by class standing. Some tools for evaluating recreation facilities are too simple because they primarily assess presence/availability and safety, and they rely upon the perceptions or satisfaction of facility users [17–21]. Few tools objectively evaluate recreation facilities [22–24].
A Recreational Facility Audit Tool (RecFAT), created and tested in Hong Kong, uses a 111-item checklist to evaluate the availability and accessibility of sport facilities and amenities, policies, environmental safety and aesthetics, and population usage of the facilities. The tool was determined to be reliable and useful for evaluating parks, playgrounds, and sports centers. The Physical Activity Resource Assessment (PARA), which has good reliability, was designed to assess publicly available facilities in low-income communities. Trained researchers objectively assess parks, churches, schools, sports facilities, fitness centers, community centers, and trails based on location, cost, features, amenities, qualities, and incivilities (noise, trash, vandalism, etc.). PARA is a general checklist that evaluates components for presence/quality on a scale of 0 to 3; however, some detailed features, such as staffing, weight/aerobic equipment, and universal access of a recreation center/gym, are not assessed.
A tool for objectively evaluating the quality and extensiveness of recreation facilities is the Physical Activity Campus Environmental Supports (PACES) audit, which was originally developed to assess the environmental supports for recreation programs and facilities related to physical activity on a university campus. In 2009, the PACES audit was conducted at thirteen universities in the United States by trained researchers. PACES audit categories included built environment (bike racks, health promotion signage, and stairwells) and campus recreation programs (availability and quality of equipment, exercise spaces, courts/fields; availability of health education and intramural programs; and recreation facility hours, staff, and amenities). Data were collected with a simple paper check-off survey tool. To more effectively evaluate and compare the quality and extensiveness/completeness of campus recreation facilities and programs, the purpose of this study was to update/redesign and validate PACES. The updated PACES is more accessible for monitoring and evaluating campus recreation facilities and programs, with a more user-friendly online data collection format and scored results.
This paper is divided into two parts. Part one includes (1) development of the inventory of items for the redesigned PACES audit; (2) expert, cognitive, and pilot testing; (3) survey analysis and revisions; and (4) field-testing. Part two validates PACES by comparing PACES-Facilities to PARA and PACES-Programs to the Healthier Campus Initiatives (HCI) survey. Data were collected between 2015 and 2017 and analyzed in 2018. This study was deemed exempt by the Syracuse University IRB because it was an environmental audit, not human subjects research.
2.2. Part 1: Instrument Development
2.2.1. Development of Inventory Items for the Audit
This audit was designed to rate the quality and extensiveness of recreation facilities and physical activity programs. It can be used by municipalities, worksites, schools, and college campuses to evaluate a single venue or, by evaluating a number of venues, to build a more complete picture of the recreation facilities/programs in a specific environment. With the improved PACES audit training and online data entry system, users are provided with results compared and benchmarked against a wider sample of data.
Worksite/college campus recreation facilities and programs require periodic evaluation to determine their effectiveness for the population that they serve. The evaluation audits the overall campus environment and the quality and extensiveness of the physical activity supports that contribute to healthy physical activity decisions. The initiatives/policy support and walkability/bikeability of a campus were evaluated separately with other audits.
To create the survey questions, a four-step process was used. (1) The team reviewed the original PACES to identify the difficulties and limitations in collecting and interpreting the data. (2) The literature [1, 9, 13, 16–18, 24, 28–42] was reviewed to identify the behavioral and environmental correlates of physical activity and to decide which topics to include. The following topics emerged for inclusion: facility updates, aesthetics, amenities, and cleanliness; stairwell and bike access; universal access; staff competence; extensiveness and adequacy of health programs, clubs/intramurals, exercise classes, equipment, fields, courts, and trails; and marketing of programs and fees. (3) For each question, semantic-differential or Likert scales were created, based on the literature, to indicate low to high support. (4) Originally, one survey was created to include all relevant campus program/recreation facilities topics. As a result of testing, the survey was divided into separate facilities and programs surveys because campuses could have more than one recreation facility but only one overall recreation program (Table 1).
Responses required. (1) Likert scale defined: strongly disagree, disagree, neutral, agree, and strongly agree; (2) not applicable and does not apply to our environment.
The recreation programs audit contains 13 questions including populations served, fees, programs, proximity, and marketing. The recreation facilities audit contains 20 questions including built environment, equipment, staff, amenities, and access. Each item criterion is specifically scored using a five-point, semantic-differential or Likert scale ranging from limited to extensive environmental support/evidence. A few questions utilize select all that apply for a summed score.
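To make the scoring scheme concrete, the sketch below shows how a five-point semantic-differential item and a select-all-that-apply item could each contribute to a total. The item content and the marketing-channel names are hypothetical illustrations, not the published audit items:

```python
# Hypothetical sketch of PACES-style item scoring; the published audit
# defines the actual items, anchors, and point values.

def score_semantic_item(rating):
    """One five-point semantic-differential item: 1 = limited ... 5 = extensive support."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    return rating

def score_select_all(selected, options):
    """A 'select all that apply' item is summed: one point per applicable option."""
    return len(set(selected) & set(options))

# Hypothetical marketing item with five possible promotion channels
channels = {"posters", "email", "social media", "website", "events"}
item_total = score_semantic_item(4) + score_select_all({"email", "website"}, channels)
```

Summing item scores of this form over the 10 program items (maximum 73 points) or 15 facility items (maximum 77 points) yields the audit totals described above.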
2.2.2. Expert, Cognitive, and Pilot Testing
This audit was cognitively tested with seven research assistants. Each student independently attempted to apply and score the audit at two different facilities. In a group discussion, each question was reviewed to determine its interpretation, clarity, and the appropriateness of its semantic or Likert scale. Five public health/recreation program experts reviewed the audits for content validity. The results from the cognitive testing and expert review improved the wording of questions and response items. The audit was pilot tested twice. Changes based on pilot testing included dividing it into two surveys for ease of administration and refining the wording of a few questions/responses (Summer 2014 at SU and Fall 2014; data not shown).
2.2.3. Recreation Facilities Venue Definitions
Main (Primary) Recreation Facility. This is the only or primary recreation facility for the population served.
Secondary/Satellite Facility. This is a smaller recreation facility that houses a portion or smaller version of the total recreation facilities.
One Component of Facilities. This includes single components of recreation facilities, such as a pool or a tennis court.
2.2.4. Field Testing: Audit Administration Procedures
This audit was tested on and near college campuses participating in the Get FRUVED research study (). Get FRUVED is a social marketing and environmental change intervention to promote health on college campuses. At each college, the venues to be evaluated were determined by a campus team that identified a representative sample of the recreation facilities (main and secondary/satellite) most frequented by the campus population. At a minimum, the team assessed the main facility and approximately 25% of the secondary/satellite facilities within a 1.5-mile radius, depending on the campus. In cases where the served population extensively utilized a facility located beyond the 1.5-mile radius, the audit review team could decide to audit it.
Two different assessments were completed on each campus: the PACES-Facilities and PACES-Programs audits. A PACES-Facilities survey was completed for each recreation facility on or off campus, whether it was a main recreation facility, a secondary/satellite facility, or one component of a facility. For campus programs, one PACES-Programs survey was completed per campus.
Training and Interrater Reliability. Research assistants completed video training and then practiced and performed interrater reliability (IRR) exercises. Each participating community had its student researchers complete IRR on two recreation facilities. Intraclass correlations > 0.80 were required for each team prior to data collection. Starting in 2017, the IRR procedures were converted to an online quiz.
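The IRR threshold can be checked with a short calculation. The paper does not state which ICC form was used; this sketch assumes a one-way random-effects ICC(1,1) and uses hypothetical two-rater data:

```python
def icc_oneway(scores):
    """One-way random-effects ICC(1,1). scores is a list of rows:
    one row per audited target, one column per rater."""
    n = len(scores)               # number of targets
    k = len(scores[0])            # number of raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    ms_between = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(scores, row_means)
                    for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical data: two raters scoring six audit items in close agreement
ratings = [[4, 4], [5, 5], [2, 3], [1, 1], [3, 3], [5, 4]]
icc = icc_oneway(ratings)  # close agreement clears the 0.80 bar
```

Other ICC variants (two-way, absolute agreement vs. consistency) give different values; a production analysis would state and justify the chosen form.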
Scores were computed for each PACES-Programs and PACES-Facilities survey. Intraclass correlations (ICC) were computed to determine interrater reliability. To compare campuses, a PACES-Total score was computed by adding the PACES-Programs score and the PACES-Facilities score for the main facility evaluated. Cronbach’s alpha for PACES-Programs was (, 10 items) after deleting the question “When was the most recent recreation facility built?” Cronbach’s alpha for PACES-Facilities was (, 15 items). Differences by campus size and region were determined with ANOVA. To distinguish the quality of recreation facilities between campuses, latent profile analysis (LPA) was applied. LPA allows for the assessment of heterogeneity in the sample based on distinctive characteristics of schools’ recreation facilities, which were expected to follow non-normal distributions. Two- to five-profile solutions were tested iteratively using the robust maximum likelihood method and the Akaike information criterion (AIC), Bayesian information criterion (BIC), entropy, and sample size-adjusted BIC (SSABIC). The uniqueness and interpretability of the latent profiles were considered in choosing the optimal model. Lower AIC, BIC, and SSABIC values indicate better model fit.
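Internal consistency via Cronbach's alpha can be computed directly from item-level responses. This is a generic sketch with hypothetical data, not the study's dataset:

```python
def cronbach_alpha(items):
    """items: one list of scores per item, aligned across respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(items)                        # number of items
    n = len(items[0])                     # number of respondents
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical items answered by four respondents
items = [[1, 2, 4, 5],
         [2, 2, 5, 5],
         [1, 3, 4, 4]]
alpha = cronbach_alpha(items)
```

In practice, alpha can be recomputed with each item removed to see whether dropping an item improves consistency, which is how the facility-age question came to be deleted from PACES-Programs.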
A total of 153 facilities were assessed on and near 76 campuses. Students were effectively trained to implement the audits, with IRR ICCs ranging from 0.90 to 0.98. Sixty percent of the sample was from medium (, 29.2%) to large (, 33.3%) schools (Table 2). Most audits were completed in the south (, 40.3%), with only 15.3% in the west (Table 2). Most facilities were primary (, 53.5%), followed by secondary (, 30.3%), and finally, 44 one-component/stand-alone facilities (15.5%). PACES-Facilities scores ranged from 2 to 42 across all campuses, out of a maximum of 77 points. PACES-Programs scores ranged from 0 to 55 across all campuses, out of a maximum of 73 points.
Small schools (5001–10,000 students) scored significantly lower than the largest schools (>20,000 students) on PACES-Total and PACES-Programs, but not on PACES-Facilities. Although there were no differences in PACES-Total by region, campuses in the west scored significantly lower than all other regions on PACES-Facilities (Table 3).
1PACES-Total = PACES-Programs + PACES-Facilities (for the main facility audited).2Campus size based upon student population: very small ≤5000; small 5001–10,000; medium 10,001–20,000; large ≥20,001. 3F = 3.715, df = 3, ; different subscripts are significantly different. 4F = 8.779, df = 3, ; different subscripts are significantly different. 5F = 1.397, df = 3, . 6F = 5.264, df = 3, ; different subscripts are significantly different.
The three-class solution was identified as the best model based on the AIC (2-class: 9185.638; 3-class: 8954.549; 4-class: 9054.549; and 5-class: 9061.713), BIC (2-class: 9507.555; 3-class: 9428.314; 4-class: 9680.161; and 5-class: 9839.173), and SSABIC (2-class: 9172.049; 3-class: 8934.550; 4-class: 9028.140; and 5-class: 9028.895) values and the meaningful interpretation of profiles. Entropy was 0.996 for the three-class solution. The first profile (low quality) comprised 8.5% of the sample, the second (moderate quality) 13.7%, and the third (high quality) 77.8% (means and ranges available in Table 4). Figure 1 indicates that aesthetics, adequacy of aerobic equipment, staff competence, staff accessibility, and extensiveness of amenities contributed to facilities being classified as high quality. Facilities classified as moderate quality scored in the middle on most questions, with peaks on bike rack adequacy, aerobic equipment, and amenities. Facilities classified as low quality consistently scored lowest on all questions except cleanliness.
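The class-count choice can be reproduced from the information criteria reported above, since each criterion is simply minimized over the candidate solutions:

```python
# AIC/BIC/SSABIC values reported above for the 2- to 5-class LPA solutions
aic    = {2: 9185.638, 3: 8954.549, 4: 9054.549, 5: 9061.713}
bic    = {2: 9507.555, 3: 9428.314, 4: 9680.161, 5: 9839.173}
ssabic = {2: 9172.049, 3: 8934.550, 4: 9028.140, 5: 9028.895}

# Lower is better for all three criteria; find the class count minimizing each
best = {name: min(crit, key=crit.get)
        for name, crit in [("AIC", aic), ("BIC", bic), ("SSABIC", ssabic)]}
# All three criteria select the three-class solution.
```

Interpretability and entropy are then used as tiebreakers when the criteria disagree; here they all converge on three classes.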
F = 213.61, ; different subscripts are significantly different.
The distribution of facility quality differed by campus size, χ2 (4, facilities) = 16.994. Approximately 34% of high-quality facilities were in large schools, 34% were in medium-size schools, and 25.2% were in the smallest schools, whereas 61.5% of low-quality facilities were at large schools and 23.1% were in small schools. Approximately 57.1% of moderate-quality facilities were in the smallest schools. No facilities scored in the exceptional quality category.
4. Part 2: Convergent Validation Study
4.1. PACES-Facilities Validation
The Physical Activity Resource Assessment (PARA) instrument was chosen for the validation comparison because it was the most appropriate objective tool evaluating similar recreation facility concepts. The survey is a one-page checklist that assesses resource type, features, amenities, and incivilities. Resource types (fitness club, park, sport facility, trail, community center, church, school, and combination) are assessed on size, capacity, cost (free, pay at door, pay for certain programs), hours (open and close), and signage (hours, rules: yes/no). For the features and amenities sections of the survey, each item is rated on a 0 to 3 scale: 0 = not present, 1 = poor, 2 = mediocre, and 3 = good. Features include baseball field, basketball court, soccer field, bike rack, exercise station, play equipment, pool >3 feet deep, sandbox, sidewalk, tennis court, trail for running/biking, volleyball court, and wading pool <3 feet deep. The amenities section includes access points, bathrooms, benches, drinking fountains, fountains, landscaping effort, lighting, shaded picnic tables, unshaded picnic tables, shelters, shower/locker room, and trash containers. The incivilities items are rated on a 0–3 scale: 0 = not present, 1 = little/few, 2 = some, and 3 = a lot. Incivilities include auditory annoyance, broken glass, dog refuse, unattended dogs, evidence of alcohol use, evidence of substance abuse, graffiti/tagging, litter, no grass, overgrown grass, sex paraphernalia, and vandalism. Detailed directions were included with the survey.
The PACES-Facilities and PARA audits were tested at 29 facilities on and near college campuses (). For validation purposes, each auditor/team was responsible for a paired evaluation using both tools (PACES-Facilities and PARA). To reduce potential order bias, PARA was collected first for one half of the audits and PACES-Facilities first for the other half. Auditors entered all surveys into Qualtrics, with one survey form for PARA and one for PACES-Facilities.
Training and Interrater Reliability. Research assistants completed training and then practiced and performed interrater reliability (IRR) exercises for both tools. Each participating campus had its student researchers complete IRR on two recreation facilities. Intraclass correlations > 0.80 were required for each team prior to data collection.
For scoring, incivilities were reverse-coded: 4 = not present, 3 = little/few, 2 = some, and 1 = a lot. For each section (features, amenities, and incivilities), an average score and a sum score were computed. The reliability of each section was assessed: features (); amenities (); incivilities ().
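The reverse-coding and section-scoring steps can be sketched as follows (the five incivility ratings are hypothetical):

```python
def reverse_code(raw):
    """Map a raw 0-3 incivility rating onto 1-4 so higher = more supportive:
    0 (not present) -> 4, 1 -> 3, 2 -> 2, 3 (a lot) -> 1."""
    if raw not in (0, 1, 2, 3):
        raise ValueError("raw rating must be 0-3")
    return 4 - raw

incivilities_raw = [0, 3, 1, 0, 2]        # hypothetical ratings for five items
recoded = [reverse_code(r) for r in incivilities_raw]
section_sum = sum(recoded)                # sum score for the section
section_avg = section_sum / len(recoded)  # average score for the section
```

The features and amenities sections are scored the same way but without reverse-coding, since higher raw ratings already indicate better quality.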
Spearman’s correlations were used to compare the PACES-Facilities score to the PARA features section and the temporary PACES “amenities” items to the PARA amenities. PACES was not designed to assess incivilities, and the PARA incivilities reliability was low, so no comparison was made for this dimension.
4.2. PACES-Programs Validation
To validate PACES-Programs, the results were compared to a survey created from the Partnership for a Healthier America’s Healthier Campus Initiative (HCI). A portion of the HCI survey was chosen for this validation because it measures comparable concepts regarding the extensiveness of health and wellness programming on a college campus. The HCI survey contained 41 questions: 15 regarding food/nutrition offerings, 19 regarding physical activity programs/facilities, and seven regarding policies. Each question was a yes/no checkoff indicating whether a campus had the initiative or policy. The 16 specific HCI programming questions selected for validation included bike share/rental, fitness/intramural opportunities, introduction to physical activity classes, physical activity breaks offered, fitness orientations, sufficient outdoor activities, rental of outdoor equipment, outdoor recreation clinics/trips, marked walking routes, free access to the fitness/recreation center, dedicated physical activity space, outdoor running/walking track, outdoor fitness system, certified personal trainers, implementation of a comprehensive wellness program, and healthy cooking classes.
Campuses () participating in Get FRUVED completed PACES-Programs and the HCI survey as part of their full data collection.
4.2.3. Data Analysis
The HCI programming subscore, computed by summing the 16 selected HCI programming questions, was compared to the PACES-Programs score using Spearman’s correlation.
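This comparison can be sketched as follows. The campus scores below are hypothetical; in practice scipy.stats.spearmanr would typically be used, but a self-contained rank-correlation implementation is shown so the computation is explicit:

```python
def rank(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend over a run of tied values
        avg = (i + j) / 2 + 1           # average rank for the tied run
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

hci = [5, 9, 12, 7, 14]       # hypothetical summed HCI subscores
paces = [20, 31, 40, 28, 51]  # hypothetical PACES-Programs totals
rho = spearman(hci, paces)    # same rank order in both lists -> rho = 1.0
```

Because Spearman's rho depends only on rank order, it is well suited to comparing a summed yes/no checklist against a weighted audit score.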
5. Validation Results
There were 29 PACES-Facilities and PARA pairs in total. Interrater reliability ranged from ICC = 0.91 to 0.99 for PARA and from ICC = 0.81 to 1.0 for PACES-Facilities. Most of the schools were public institutions (87%) (Table 5). The northeast and the south each represented 30% of the sample, while no school facilities were evaluated from the west. More than a third of the sample (39%) was from very small and small schools with ≤10,000 students. Correlations were significant between PACES-Facilities (features) and PARA features (, ) and between PACES-Facilities (amenities) and PARA amenities (, ).
aCampus size based upon student population: very small ≤5000; small 5001–10,000; medium 10,001–20,000; large ≥20,001.
Forty-one of the 78 Get FRUVED schools had matched PACES-Programs and HCI data. Most of the schools were public institutions (70.7%) (Table 5). The south represented 39% of the sample, while only 17% were from the northeast. More than a third of the sample (39%) was from very small and small schools with ≤10,000 students. There was a significant correlation between the total PACES-Programs score and the HCI programming subscore (, ).
For this study, the team redesigned, tested, and validated an updated version of PACES, a reliable tool to assess the quality of recreation facilities and programs. PACES distinguishes differences between facility types and programs, across campuses and regions. Most facilities evaluated with PACES-Facilities categorized into the highest quality recreation facilities category, primarily due to the extensiveness of their aerobic equipment and amenities and the competence and accessibility of the staff. The range of PACES-Programs scores indicated that most campuses provided a moderate level of options and supports for their overall campus programs.
Only a few researchers have attempted to assess the quality of recreation facilities/programs, with most relying on client or user perception [17–19, 45]. The original PACES, PARA, and RecFAT more objectively assessed quality by focusing on condition and maintenance. The comparisons between PACES and previous research are limited because the latter typically evaluates only portions of the physical activity environment, such as presence/availability of recreation facilities [15, 46] or user satisfaction [20, 21]. Some existing audits are focused on safety, universal access, park quality [47, 48], or rural environments. In a systematic review of worksite tools related to supports for physical activity (), only 20% of the tools were objective audits while over 50% were based upon employee self-report. Of the tools reviewed, 75% of the studies included access to physical activity equipment/facilities, amenities, and an assessment of educational opportunities; 66% of studies included bike rack availability/stairwell features; and less than 50% included evaluation of fitness assessment opportunities. While 50% had completed internal and/or interrater reliability, only 33% reported some level of validation.
The PARA and RecFAT tools are useful in a diversity of recreation environments but are less specific regarding the details of campus recreation facilities. For this study, PARA was a helpful validation comparison for the PACES-Facilities features and amenities. For the subsample assessed by both PARA and PACES-Facilities, there were significant correlations for both features and amenities (PACES-Facilities does not assess incivilities, so no comparison was made). Using PARA, Adamus et al. found results similar to those found on college campuses: fitness clubs had the highest scores for amenities, and combination resources had the highest scores for features.
Others have found low to moderate reliability between population perception and objective (Google Earth) assessment of recreation facilities. PACES can be a valuable and objective tool for evaluating and comparing the quality of recreation programs and facilities in varied environments. This is important because recreation facility quality has been found to relate to a variety of outcomes [6, 7, 9, 10, 21, 53–59]. The accessibility of recreation facilities is related to the level of physical activity in various populations [6, 7, 9, 10, 21, 53, 54, 60], and a community’s natural amenities and recreation facilities per capita are negatively related to the population’s rate of obesity. On college campuses, recreation facility usage has been related to higher academic outcomes (GPA) [12, 59], higher student retention [56, 59], reduced stress, increased exercise frequency, and improved health indices. The quality of recreation services (specifically staff competency, operations quality, and facility ambiance) has been shown to influence levels of satisfaction with recreation facilities and programs.
The PACES training and practice require approximately 2-3 hours, and an audit can be completed in 25–30 minutes per facility or survey. The PACES audit is part of the Healthy Campus Environmental Audit (HCEA), a series of audit tools for evaluating restaurants, convenience stores, vending, walkability/bikeability, and policies. The PACES audit is user-friendly and available on the internet, with training and data entry links (contact the primary author for information). The primary institution analyzes the data and provides feedback to the user, including comparison and benchmark information. The PACES-Facilities and PACES-Programs audits have been validated for college campus-type work environments but might also be useful in a variety of settings, including communities, worksites, colleges, and schools.
A limitation of this study is that the audit has been used and tested only by college-educated populations on and near college campuses; it therefore needs to be validated with other data collectors to determine the utility of PACES in communities beyond college campuses. Although many recreation facility audits are perception-based [17–21, 45], and some contradict objective findings, the PACES audit should be further tested by comparing audit results with recreation facility/program clients’ perceptions. Future research should also evaluate the relevance and weighting of PACES items and the effectiveness of facility and program supports in encouraging physical activity.
Data are available upon request to the corresponding author.
The funder, USDA AFRI grant, had no role in the design, analysis, or writing of this article.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
All the authors have made substantial contributions (a) to either conception and design, or acquisition of data, or analysis and interpretation of data, (b) to drafting the article or revising it critically for important intellectual content, and (c) to final approval of the version to be published, and all agreed to its submission. Specifically, Dr. Horacek and Dean Seidman designed the study. All authors pilot-tested the tool and acquired data, and Dr. Horacek and Elif Dede Yildirim analyzed and interpreted the data. Dr. Horacek drafted the article, and all authors revised it. All authors provided final approval of the version to be published and agreed to its submission.
We would like to acknowledge (1) the technical support for data collection and training provided by Megan Mullin, Laura Brown, and Heather Brubaker and (2) all of the research assistants at each institution who collected data. Funding for this study is provided by the Agriculture and Food Research Initiative (Grant no. 2014-67001-21851) from the USDA National Institute of Food and Agriculture, “Get FRUVED: a peer-led, train-the-trainer social marketing intervention to increase fruit and vegetable intake and prevent young adult weight gain, A2101.”
- U.S. Department of Health and Human Services, Physical Activity and Health: A Report of the Surgeon General, Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion, Atlanta, GA, USA, 1996.
- S. Weihrauch-Blüher, K. Kromeyer-Hauschild, C. Graf et al., “Current guidelines for obesity prevention in childhood and adolescence,” Obesity Facts, vol. 11, no. 3, pp. 263–276, 2018.
- U.S. Department of Health and Human Services, Physical Activity Guidelines for Americans, U.S. Department of Health and Human Services, Washington, DC, USA, 2nd edition, 2018.
- T. M. Horacek, E. Dede Yildirim, K. Kattelmann et al., “Path analysis of campus walkability/bikeability and college students' physical activity attitudes, behaviors, and body mass index,” American Journal of Health Promotion, vol. 32, no. 3, pp. 578–586, 2018.
- D. W. Barnett, A. Barnett, A. Nathan, J. Van Cauwenberg, and E. Cerin, “Built environmental correlates of older adults’ total physical activity and walking: a systematic review and meta-analysis,” International Journal of Behavioral Nutrition and Physical Activity, vol. 14, no. 1, p. 103, 2017.
- J. Choi, M. Lee, J.-k. Lee, D. Kang, and J.-Y. Choi, “Correlates associated with participation in physical activity among adults: a systematic review of reviews and update,” BMC Public Health, vol. 17, no. 1, p. 356, 2017.
- K. M. Heinrich, C. K. Haddock, N. Jitnarin, J. Hughey, L. A. Berkel, and W. S. C. Poston, “Perceptions of important characteristics of physical activity facilities: implications for engagement in walking, moderate and vigorous physical activity,” Frontiers in Public Health, vol. 5, p. 319, 2017.
- M. Smith, J. Hosking, A. Woodward et al., “Systematic literature review of built environment effects on physical activity and active transport-an update and new findings on health equity,” International Journal of Behavioral Nutrition and Physical Activity, vol. 14, no. 1, p. 158, 2017.
- E. Haug, T. Torsheim, J. F. Sallis, and O. Samdal, “The characteristics of the outdoor school environment associated with physical activity,” Health Education Research, vol. 25, no. 2, pp. 248–256, 2010.
- A. V. Ries, A. F. Yan, and C. C. Voorhees, “The neighborhood recreational environment and physical activity among urban youth: an examination of public and private recreational facilities,” Journal of Community Health, vol. 36, no. 4, pp. 640–649, 2011.
- N. Durant, S. K. Harris, S. Doyle et al., “Relation of school environment and policy to adolescent physical activity,” Journal of School Health, vol. 79, no. 4, pp. 153–159, 2009.
- M. Brock, J. W. Carr, and M. K. Todd, “An examination of campus recreation usage, academic performance, and selected health indices of college freshmen,” Recreational Sports Journal, vol. 39, no. 1, pp. 27–36, 2015.
- S. C. Jones and L. Barrie, “Declining physical activity levels as an unintended consequence of abolishing mandatory campus service fees,” Journal of American College Health, vol. 59, no. 6, pp. 511–518, 2011.
- P. D. Faghri, R. Kotejoshyer, M. Cherniack, D. Reeves, and L. Punnett, “Assessment of a worksite health promotion readiness checklist,” Journal of Occupational and Environmental Medicine, vol. 52, no. 9, pp. 893–899, 2010.
- J. R. Bethlehem, J. D. Mackenbach, M. Ben-Rebah et al., “The SPOTLIGHT virtual audit tool: a valid and reliable tool to assess obesogenic characteristics of the built environment,” International Journal of Health Geographics, vol. 13, no. 1, p. 52, 2014.
- M. M. Cavnar, K. A. Kirtland, M. H. Evans et al., “Evaluating the quality of recreation facilities: development of an assessment tool,” Journal of Park and Recreation Administration, vol. 22, pp. 96–114, 2004.
- R. W. Osman, S. T. Cole, and C. R. Vessell, “Examining the role of perceived service quality in predicting user satisfaction and behavioral intentions in a campus recreation setting,” Recreational Sports Journal, vol. 30, no. 1, pp. 20–29, 2006.
- N. Tsigilis, T. Masmanidis, and A. Koustelios, “University students’ satisfaction and effectiveness of campus recreation programs,” Recreational Sports Journal, vol. 33, no. 1, pp. 65–77, 2009.
- Y. J. Ko and D. L. Pastore, “An instrument to assess customer perceptions of service quality and satisfaction in campus recreation programs,” Recreational Sports Journal, vol. 31, no. 1, pp. 34–42, 2007.
- J. Reed, “Perceptions of the availability of recreational physical activity facilities on a university campus,” Journal of American College Health, vol. 55, no. 4, pp. 189–194, 2007.
- S. A. Lee, Y. J. Ju, J. E. Lee et al., “The relationship between sports facility accessibility and physical activity among Korean adults,” BMC Public Health, vol. 16, no. 1, p. 893, 2016.
- K. Y. Lee, D. Macfarlane, and E. Cerin, “Objective evaluation of recreational facilities: development and reliability of the recreational facility audit tool,” Journal of Park and Recreation Administration, vol. 31, no. 4, pp. 92–109, 2013.
- R. E. Lee, K. M. Booth, J. Y. Reese-Smith, G. Regan, and H. H. Howard, “The Physical Activity Resource Assessment (PARA) instrument: evaluating features, amenities and incivilities of physical activity resources in urban neighborhoods,” International Journal of Behavioral Nutrition and Physical Activity, vol. 2, no. 1, p. 13, 2005.
- T. M. Horacek, A. A. White, C. Byrd-Bredbenner et al., “PACES: a physical activity campus environmental supports audit on university campuses,” American Journal of Health Promotion, vol. 28, no. 4, pp. e104–e117, 2014.
- Healthier Campus Initiative, https://www.ahealthieramerica.org/articles/healthier-campus-initiative-146.
- T. Horacek, M. Simon, E. Dede Yildirim et al., “Development and validation of the policies, opportunities, initiatives and notable topics (POINTS) audit for campuses and worksites,” International Journal of Environmental Research and Public Health, vol. 16, no. 5, p. 778, 2019.
- T. M. Horacek, A. A. White, G. W. Greene et al., “Sneakers and spokes: an assessment of the walkability and bikeability of U.S. Postsecondary institutions,” Journal of Environmental Health, vol. 74, pp. 8–15, 2012.
- R. Allen and C. M. Ross, “An assessment of proximity of fitness facilities and equipment and actual perceived usage by undergraduate university students: a pilot study,” Recreational Sports Journal, vol. 37, no. 2, pp. 123–135, 2013.
- D. R. Bassett, R. Browning, S. A. Conger, D. L. Wolff, and J. I. Flynn, “Architectural design and physical activity: an observational study of staircase and elevator use in different buildings,” Journal of Physical Activity and Health, vol. 10, no. 4, pp. 556–562, 2013.
- K. S. Bayne and B. A. Cianfrone, “The effectiveness of social media marketing: the impact of facebook status updates on a campus recreation event,” Recreational Sports Journal, vol. 37, no. 2, pp. 147–159, 2013.
- N. Cooper and D. Theriault, “Environmental correlates of physical activity: implications for campus recreation practitioners,” Recreational Sports Journal, vol. 32, no. 2, pp. 97–105, 2008.
- M. Hamre, N. Hongu, V. R. Lee, G. S. Welter, J. N. Farr, and S. B. Going, “Effects of a short-term group fitness intervention on body composition and exercise motivation in college students,” The FASEB Journal, vol. 24, p. lb340, 2010.
- A. R. Hurd and S. A. Forrester, “Constraints to participation in campus recreational services,” Physical & Health Education Journal, vol. 72, pp. 18–21, 2006.
- A. T. Kaczynski, “Neighborhood walkability perceptions: associations with amount of neighborhood-based physical activity by intensity and purpose,” Journal of Physical Activity and Health, vol. 7, no. 1, pp. 3–10, 2010.
- E. Lennon, E. Mathis, and A. Ratermann, “Comparison of strength changes following resistance training using free weights and machine weights,” Missouri Journal of Health, Physical Education, Recreation and Dance, vol. 20, pp. 29–35, 2010.
- J. M. Linenger, C. V. Chesson, and D. S. Nice, “Physical fitness gains following simple environmental change,” American Journal of Preventive Medicine, vol. 7, no. 5, pp. 298–310, 1991.
- M. Lindström, “Means of transportation to work and overweight and obesity: a population-based study in southern Sweden,” Preventive Medicine, vol. 46, no. 1, pp. 22–28, 2008.
- J. A. Reed and D. A. Phillips, “Relationships between physical activity and the proximity of exercise facilities and home exercise equipment used by undergraduate university students,” Journal of American College Health, vol. 53, no. 6, pp. 285–290, 2005.
- J. H. Rimmer, B. Riley, E. Wang, and A. Rauworth, “Development and validation of AIMFREE: accessibility instruments measuring fitness and recreation environments,” Disability and Rehabilitation, vol. 26, no. 18, pp. 1087–1095, 2004.
- B. Strand, J. Egeberg, and A. Mozumdar, “Health-related fitness and physical activity courses in U.S. Colleges and universities,” Journal of Research in Health, Physical Education, Recreation, Sport & Dance, vol. 5, pp. 17–20, 2010.
- P. J. Troped, R. P. Saunders, R. R. Pate, B. Reininger, J. R. Ureda, and S. J. Thompson, “Associations between self-reported and objective physical environmental factors and use of a community rail-trail,” Preventive Medicine, vol. 32, no. 2, pp. 191–200, 2001.
- S. Zizzi, S. F. Ayers, J. C. Watson, and L. Keeler, “Assessing the impact of new student campus recreation centers,” Journal of Student Affairs Research and Practice, vol. 41, no. 4, pp. 588–630, 2004.
- S. Colby, M. Olfert, A. Mathews et al., ““Get fruved”: the RCT year,” Journal of Nutrition Education and Behavior, vol. 50, no. 7, pp. S116–S117, 2018.
- T. Asparouhov and B. Muthén, “Auxiliary variables in mixture modeling: three-step approaches using mplus,” Structural Equation Modeling: A Multidisciplinary Journal, vol. 21, no. 3, pp. 329–341, 2014.
- W. J. Muthén, “The development of an instrument to measure effectiveness in campus recreation programs,” Journal of Sport Management, vol. 11, no. 3, pp. 263–274, 1997.
- J. A. Hirsch, K. A. Meyer, M. Peterson et al., “Obtaining longitudinal built environment data retrospectively across 25 years in four US cities,” Frontiers in Public Health, vol. 4, p. 65, 2016.
- A. E. Greer, R. Marcello, and R. Graveline, “Community members’ assessment of the physical activity environments in their neighborhood parks,” Health Promotion Practice, vol. 16, no. 2, pp. 202–209, 2015.
- A. T. Kaczynski, J. Schipperijn, J. A. Hipp et al., “ParkIndex: development of a standardized metric of park access for research and planning,” Preventive Medicine, vol. 87, pp. 110–114, 2016.
- R. A. Seguin, B. K. Lo, U. Sriram, L. M. Connor, and A. Totta, “Development and testing of a community audit tool to assess rural built environments: inventories for Community Health Assessment in Rural Towns,” Preventive Medicine Reports, vol. 7, pp. 169–175, 2017.
- J. A. Hipp, D. N. Reeds, M. A. van Bakergem et al., “Review of measures of worksite environmental and policy supports for physical activity and healthy eating,” Preventing Chronic Disease, vol. 12, p. E65, 2015.
- H. J. Adamus, S. K. Mama, I. Sahnoune, and R. E. Lee, “Evaluating the quality and accessibility of physical activity resources in two southern cities,” American Journal of Health Promotion, vol. 27, no. 1, pp. 52–54, 2012.
- C. Roda, H. Charreire, T. Feuillet et al., “Mismatch between perceived and objectively measured environmental obesogenic features in European neighbourhoods,” Obesity Reviews, vol. 17, no. 1, pp. 31–41, 2016.
- C. M. Hoehner, P. Allen, C. E. Barlow, C. M. Marx, R. C. Brownson, and M. Schootman, “Understanding the independent and joint associations of the home and workplace built environments on cardiorespiratory fitness and body mass index,” American Journal of Epidemiology, vol. 178, no. 7, pp. 1094–1105, 2013.
- D. Adlakha, A. J. Hipp, C. Marx et al., “Home and workplace built environment supports for physical activity,” American Journal of Preventive Medicine, vol. 48, no. 1, pp. 104–107, 2015.
- D. A. Hall, “Participation in a campus recreation program and its effect on student retention,” Recreational Sports Journal, vol. 30, no. 1, pp. 40–45, 2006.
- A. Henchy, “The perceived benefits of participating in campus recreation programs and facilities: a comparison between undergraduate and graduate students,” Recreational Sports Journal, vol. 37, no. 2, pp. 97–105, 2013.
- D. Barney, L. Benham, and L. Haslem, “Effects of college student's participation in physical activity classes on stress,” American Journal of Health Studies, vol. 29, pp. 1–6, 2014.
- S. B. J. Pitts, M. B. Edwards, J. B. Moore, K. A. Shores, K. D. DuBose, and D. McGranahan, “Obesity is inversely associated with natural amenities and recreation facilities per capita,” Journal of Physical Activity and Health, vol. 10, no. 7, pp. 1032–1038, 2013.
- S. J. Danbert, J. M. Pivarnik, R. N. McNeil, and I. J. Washington, “Academic success and retention: the role of recreational sports fitness facilities,” Recreational Sports Journal, vol. 38, pp. 14–22, 2014.
- T. L. McKenzie, J. S. Moody, J. A. Carlson, N. V. Lopez, and J. P. Elder, “Neighborhood income matters: disparities in community recreation facilities, amenities, and programs,” Journal of Park & Recreation Administration, vol. 31, pp. 12–22, 2013.
- T. M. Horacek, E. Dede Yildirim, M. Simon et al., “Development and validation of the full restaurant evaluation supporting a healthy (FRESH) dining environment audit,” Journal of Hunger & Environmental Nutrition, pp. 1–20, 2018.
- T. Horacek, E. Yildirim, E. Kelly et al., “Development and validation of a simple convenience store SHELF audit,” International Journal of Environmental Research and Public Health, vol. 15, no. 12, p. 2676, 2018.
- T. Horacek, E. Yildirim, M. Matthews Schreiber et al., “Development and validation of the vending evaluation for nutrient-density (VEND)ing audit,” International Journal of Environmental Research and Public Health, vol. 16, no. 3, p. 514, 2019.
Copyright © 2019 Tanya M. Horacek et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.