Abstract

This paper describes the redesign, field-testing, and convergent validity of a practical tool, the Physical Activity Campus Environmental Supports (PACES) audit. Methods. The audit includes two parts: (1) PACES-Programs, which comprises questions regarding populations served, fees, programs (recreation/fitness classes and intramurals), proximity, adequacy of facilities, and marketing, and (2) PACES-Facilities, which comprises questions regarding the built environment (aesthetics, bike racks, stairs, and universal design), recreation equipment, staff, amenities, and access. Each item is scored on a five-point, semantic-differential scale ranging from limited to extensive environmental support. A few questions use a select-all-that-apply format that yields a summed score. PACES training, interrater reliability, and data collection are all accessible via an online portal. PACES was tested on 76 college campuses. Convergent validity was examined by comparing the PACES-Programs questions to Healthier Campus Initiative-Programs questions (HCI-Programs) and the PACES-Facilities questions to questions contained in the Physical Activity Resource Assessment (PARA) instrument. Statistical analyses included Cronbach’s alpha, ANOVA, latent profile analysis, and Spearman correlations. Results. The PACES-Programs audit includes 10 items for a potential total of 73 points, and the PACES-Facilities audit includes 15 items for a potential total of 77 points. Most (77.8%) of the 153 facilities assessed scored in the most healthful range (20–42), mainly because of the extensiveness of the aerobic equipment/amenities and the competence/accessibility of the staff. PACES-Total and PACES-Programs scores differed significantly by campus size, and PACES-Facilities scores differed by region. For the paired validation assessments, correlations were significant between PACES-Programs and HCI-Programs and between PACES-Facilities and PARA for both features and amenities, indicating moderate convergent validity. Conclusion. The PACES audit is a valid, reliable tool for assessing the quality of recreation facilities and programs in a variety of college campus environments.

1. Introduction

Obesity prevention guidelines recommend regular physical activity throughout the lifespan to prevent disease and promote good health [1–3]. The availability, access, quality, and usage of recreation facilities and programs have been identified as factors influencing a population’s level of physical activity [4–8]. For school children, the number of outdoor facilities at school was associated with higher physical activity levels [9], and adolescents were more active when using public (rather than private) recreation spaces or open field times [10, 11]. On college campuses, recreation facility usage was related to favorable health indices [12]. However, student participation in campus recreation programs declined when membership fees were charged to use the facility, highlighting financial considerations as a barrier to physical activity [13]. Given the relationship between the availability of recreation facilities and physical activity levels, the quality and extensiveness of recreation programs and facilities require further study.

The existing tools available to evaluate the quality of recreation facilities/programs are limited. The Worksite Health Promotion Readiness Checklist, a simple yes/no survey, is available to assess the health promotion and protection practices and policies in worksites [14]. The SPOTLIGHT virtual audit tool assesses the presence of indoor/outdoor recreation facilities and public parks via the street view feature of Google Earth/GIS [15]. Another tool based upon Total Quality Management (TQM) evaluates recreation facilities from a variety of user viewpoints, with a focus on safety, condition, and maintenance [16]. This tool can be used to evaluate recreation centers, parks, playgrounds, aquatic facilities/pools, and sports fields, and it was found to be a reliable and effective measure of the physical features (amenities) of a recreation facility [16].

A variety of tools are available to assess the satisfaction or perceptions of students, employees, alumni, and the community regarding recreation services [17–21]. One tool [17] evaluates students’ satisfaction levels, perceived service quality, and behavioral intentions for continued use of campus recreation facilities and programs. Using a Likert scale, the topics include facility ambiance, operations quality, staff competency, overall satisfaction, and behavioral intentions. Another tool evaluates students’ perceptions of the effectiveness of, and their satisfaction with, campus recreation programs [18]. It assesses personal treatment, budget, academic support, individual performance, and ethics. Using a Likert scale, another tool, the Scale of Service Quality in Recreation Sports, assesses recreation program clients’ perceptions of quality and satisfaction [19]. Specifically, this tool assesses quality based on the range of programs, operating times, client-employee interactions, interclient interactions, physical changes, valence, social ability, ambient conditions, design, equipment, and satisfaction ratings of programs [19]. A study of students’ perceptions indicated that recreation program administration and promotion were important factors because many of the students were unaware of the existence of the available recreation programs [20]. The study also showed that students’ perceptions of the recreation facilities available on campus differ between men and women and by class standing [20]. Some tools for evaluating recreation facilities are too simple [14], primarily assessing presence/availability [15] or safety [16], or they rely upon the perceptions or satisfaction of facility users [17–21]. Few tools objectively evaluate recreation facilities [22–24].

A Recreational Facility Audit Tool (RecFAT), created and tested in Hong Kong, uses a 111-item checklist to evaluate the availability and accessibility of sport facilities and amenities, policies, environmental safety and aesthetics, and population usage of the facilities [22]. The tool was determined to be reliable and useful for evaluating parks, playgrounds, and sports centers. The Physical Activity Resource Assessment (PARA), which has good reliability, was designed to assess publicly available facilities in low-income communities [23]. Trained researchers objectively assess parks, churches, schools, sports facilities, fitness centers, community centers, and trails based on location, cost, features, amenities, qualities, and incivilities (noise, trash, vandalism, etc.). PARA is a general checklist that evaluates components for presence/quality on a scale of 0 to 3; however, some detailed features, such as staffing, weight/aerobic equipment, and universal access of a recreation center/gym, are not assessed.

A tool for objectively evaluating the quality and extensiveness of recreation facilities is the Physical Activity Campus Environmental Supports (PACES) audit, which was originally developed to assess the environmental supports for recreation programs and facilities related to physical activity on a university campus [24]. In 2009, the PACES audit was conducted at thirteen universities in the United States by trained researchers. PACES audit categories included the built environment (bike racks, health promotion signage, and stairwells) and campus recreation programs (availability and quality of equipment, exercise spaces, and courts/fields; availability of health education and intramural programs; and recreation facility hours, staff, and amenities). Data were collected with a simple checkoff paper survey tool. To more effectively evaluate and compare the quality and extensiveness/completeness of campus recreation facilities and programs, the purpose of this study was to update/redesign and validate PACES. The updated PACES will be more accessible for monitoring and evaluating campus recreation facilities and programs, with a more user-friendly online data collection format and scored results.

2. Methods

2.1. Overview

This paper is divided into two parts. Part one includes (1) development of an inventory of items for the redesigned PACES audit; (2) expert, cognitive, and pilot testing; (3) survey analysis and revisions; and (4) field-testing. Part two validates PACES by comparing PACES-Facilities to PARA [23] and PACES-Programs to the Healthier Campus Initiative (HCI) [25]. Data were collected between 2015 and 2017 and analyzed in 2018. This study was deemed exempt by the Syracuse University IRB because it was an environmental audit, not human subjects research.

2.2. Part 1: Instrument Development
2.2.1. Development of Inventory Items for the Audit

This audit was designed to rate the quality and extensiveness of recreation facilities and physical activity programs. It can be used by municipalities, worksites, schools, and college campuses to evaluate one venue or, by evaluating a number of venues, to build a more complete picture of the recreation facilities/programs in a specific environment. With the improved PACES audit training and online data entry system, users are provided with results compared and benchmarked against a wider sample of data.

Worksite/college campus recreation facilities and programs require periodic evaluation to determine their effectiveness for the population that they serve. The evaluation audits the overall campus environment and the quality and extensiveness of the physical activity supports that contribute to healthy physical activity decisions. The initiatives/policy supports [26] and walkability/bikeability of a campus [27] were evaluated separately with other audits.

To create the survey questions, a four-step process was used. (1) The team reviewed the original PACES to identify the difficulties and limitations in collecting and interpreting the data. (2) The literature [1, 9, 13, 16–18, 24, 28–42] was reviewed to identify the behavioral and environmental correlates of physical activity and decide which topics to include. The following topics emerged for inclusion: facility updates, aesthetics, amenities, and cleanliness; stairwell and bike access; universal access; staff competence; extensiveness and adequacy of health programs, clubs/intramurals, exercise classes, equipment, fields, courts, and trails; and marketing of programs and fees. (3) For each question, semantic-differential or Likert scales were created, based on the literature, to indicate low to high support. (4) Originally, one survey was created to include all relevant campus program/recreation facility topics. As a result of testing the survey, it was divided into separate facilities and programs surveys because campuses could have more than one recreation facility but only one overall recreation program (Table 1).

The recreation programs audit contains 13 questions covering populations served, fees, programs, proximity, and marketing. The recreation facilities audit contains 20 questions covering the built environment, equipment, staff, amenities, and access. Each item is scored using a five-point, semantic-differential or Likert scale ranging from limited to extensive environmental support/evidence. A few questions use a select-all-that-apply format that yields a summed score.
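
To illustrate the scoring logic, the short sketch below combines the two response formats for a single venue; the item names, responses, and point values are hypothetical, not the actual PACES item set.

```python
# Minimal sketch of PACES-style scoring (hypothetical item names/values).
# Semantic-differential/Likert items contribute 1-5 points each;
# select-all-that-apply items contribute the count of options selected.

semantic_items = {  # hypothetical responses on the 1-5 scale
    "facility_aesthetics": 4,
    "staff_competence": 5,
    "aerobic_equipment_adequacy": 3,
}

select_all_items = {  # hypothetical select-all-that-apply responses
    "populations_served": ["students", "staff", "community"],
    "marketing_channels": ["web", "posters"],
}

def paces_score(semantic: dict, select_all: dict) -> int:
    """Sum semantic-differential points and select-all counts."""
    return sum(semantic.values()) + sum(len(v) for v in select_all.values())

print(paces_score(semantic_items, select_all_items))  # -> 17
```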

2.2.2. Expert, Cognitive, and Pilot Testing

This audit was cognitively tested with seven research assistants. Each student independently attempted to apply/score the audit at two different facilities. In a group discussion, each question was reviewed to determine interpretation, clarity, and appropriateness of the semantic-differential or Likert scales. Five public health/recreation program experts reviewed the audits for content validity. The results from the cognitive testing and expert review improved the wording of questions and response items. The audit was pilot tested twice (Summer 2014 at Syracuse University and Fall 2014; data not shown). Changes based on pilot testing included dividing the audit into two surveys for ease of administration and refining the wording of a few questions/responses.

2.2.3. Recreation Facilities Venue Definitions

Main (Primary) Recreation Facility. This is the only or primary recreation facility for the population served.

Secondary/Satellite Facility. This is a smaller recreation facility that houses a portion or smaller version of the total recreation facilities.

One Component of Facilities. This includes single components of recreation facilities, such as a pool or a tennis court.

2.2.4. Field Testing: Audit Administration Procedures

This audit was tested on and near college campuses participating in the Get FRUVED research study. Get FRUVED [43] is a social marketing and environmental change intervention to promote health on college campuses. At each college, the venues to be evaluated were determined by a campus team that identified a representative sample of the recreation facilities (main and secondary/satellite) most frequented by the campus population. At a minimum, the team assessed the main facility and approximately 25% of the secondary/satellite facilities within a 1.5-mile radius, depending on the campus. In cases where the served population extensively utilized a facility located beyond the 1.5-mile radius, the audit review team could decide to audit it.

Two different assessments were completed on each campus: the PACES-Facilities and PACES-Programs audits. A PACES-Facilities survey was completed for each recreation facility on/off campus, whether it was a main recreation facility, a secondary/satellite facility, or one component of a facility. For campus programs, one PACES-Programs survey was completed per campus.

Training and Interrater Reliability. Research assistants completed video training, practiced, and performed interrater reliability (IRR) exercises. Each participating community had its student researchers complete IRR on two recreation facilities. Intraclass correlations > 0.80 were required for each team prior to data collection. Starting in 2017, the IRR procedures were converted to an online quiz.
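
A minimal sketch of how such an IRR check could be computed is shown below, assuming the pandas and pingouin packages and a long-format table of paired ratings; the items and scores are illustrative, and the choice of ICC form (ICC2, two-way random effects, absolute agreement) is an assumption, since the paper does not specify one.

```python
# Sketch of an interrater reliability check (illustrative data).
# Two raters score the same items at one facility; an intraclass
# correlation >= 0.80 is required before data collection begins.
import pandas as pd
import pingouin as pg

ratings = pd.DataFrame({
    "item":  ["aesthetics", "aesthetics", "bike_racks", "bike_racks",
              "staff", "staff", "amenities", "amenities"],
    "rater": ["A", "B"] * 4,
    "score": [4, 4, 2, 3, 5, 5, 3, 3],
})

icc = pg.intraclass_corr(data=ratings, targets="item",
                         raters="rater", ratings="score")
icc2 = icc.set_index("Type").loc["ICC2", "ICC"]  # two-way random, absolute agreement
print(f"ICC2 = {icc2:.2f}", "PASS" if icc2 > 0.80 else "RETRAIN")
```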

2.3. Analysis

Scores were computed for each PACES-Programs and PACES-Facilities survey. Intraclass correlations (ICC) were computed to determine interrater reliability. To compare campuses, a PACES-Total score was computed by adding the PACES-Programs score and the PACES-Facilities score for the main facility evaluated. Cronbach’s alpha was computed for PACES-Programs (10 items), after deleting the question “When was the most recent recreation facility built?”, and for PACES-Facilities (15 items). Differences by campus size and region were determined with ANOVA. To distinguish the quality of recreation facilities between campuses, latent profile analysis (LPA) was applied. LPA categorization allows for the assessment of heterogeneity of the sample based on distinctive characteristics of schools’ recreation facilities, which were expected to follow non-normal distributions. Two- to five-profile solutions were tested iteratively using the robust maximum likelihood method and compared with the Akaike information criterion (AIC), Bayesian information criterion (BIC), entropy, and sample size-adjusted BIC (SSABIC). The uniqueness and interpretability of the latent profiles were considered in choosing the optimal model [44]. Lower AIC, BIC, and SSABIC values indicate better model fit.
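
A condensed sketch of these analysis steps follows, using synthetic data. A Gaussian mixture model stands in for LPA here (dedicated LPA software was presumably used in the study), and the pingouin and scikit-learn packages are assumed.

```python
# Sketch of the analysis steps on synthetic data: Cronbach's alpha for
# internal consistency, then 2- to 5-profile mixture models compared by
# AIC, BIC, and entropy (SSABIC and interpretability would also be checked).
import numpy as np
import pandas as pd
import pingouin as pg
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
items_df = pd.DataFrame(rng.integers(1, 6, size=(153, 15)),  # 153 facilities x 15 items
                        columns=[f"item_{i}" for i in range(15)])

alpha, ci = pg.cronbach_alpha(data=items_df)  # wide matrix: one column per item

def relative_entropy(gmm, X):
    """Classification quality: 1.0 = perfectly crisp profile assignment."""
    p = gmm.predict_proba(X)
    ent = -np.sum(p * np.log(np.clip(p, 1e-12, None)))
    return 1.0 - ent / (len(X) * np.log(gmm.n_components))

X = items_df.to_numpy()
for k in range(2, 6):
    gmm = GaussianMixture(n_components=k, covariance_type="diag",
                          n_init=10, random_state=0).fit(X)
    print(k, round(gmm.aic(X), 1), round(gmm.bic(X), 1),
          round(relative_entropy(gmm, X), 3))
# Choose the solution with low AIC/BIC, high entropy, and interpretable profiles.
```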

3. Results

A total of 153 facilities were assessed on and near 76 campuses. Students were effectively trained to implement the audits, with ICCs ranging from 0.90 to 0.98 for IRR. Over sixty percent of the sample was from medium (29.2%) or large (33.3%) schools (Table 2). Most audits were completed in the south (40.3%), with only 15.3% in the west (Table 2). Most facilities were primary (53.5%), followed by secondary (30.3%) and one-component/stand-alone facilities (15.5%). PACES-Facilities scores ranged from 2 to 42 across all campuses, out of a maximum possible 77 points. PACES-Programs scores ranged from zero to 55 across all campuses, out of a maximum possible 73 points.

Small schools (500–1000 students) scored significantly lower than the largest schools (>20,000 students) on PACES-Total and PACES-Programs, but not on PACES-Facilities. Although there were no differences in PACES-Total by region, the campuses in the west scored significantly lower than all other regions on PACES-Facilities (Table 3).

The three-class solution was identified as the best model based on AIC (2-class: 9185.638; 3-class: 8954.549; 4-class: 9054.549; and 5-class: 9061.713), BIC (2-class: 9507.555; 3-class: 9428.314; 4-class: 9680.161; and 5-class: 9839.173), and SSABIC (2-class: 9172.049; 3-class: 8934.550; 4-class: 9028.140; and 5-class: 9028.895) values and the meaningful interpretation of the profiles. Entropy was 0.996 for the three-class solution. The first profile (low quality) comprised 8.5% of the sample, the second (moderate quality) 13.7%, and the third (high quality) 77.8% (means and ranges available in Table 4). Figure 1 indicates that aesthetics, adequacy of aerobic equipment, staff competence, staff accessibility, and extensiveness of amenities contributed to facilities being classified as high quality. Facilities classified as moderate quality scored in the middle on most questions, with peaks on bike rack adequacy, aerobic equipment, and amenities. Facilities classified as low quality consistently scored lowest on all questions except cleanliness.

The distribution of facility quality differed by campus size (χ²(4) = 16.994). Approximately 34% of high-quality facilities were in large schools, 34% were in medium-size schools, and 25.2% were in the smallest schools, whereas 61.5% of low-quality facilities were at large schools and 23.1% were in small schools. Approximately 57.1% of moderate-quality facilities were in the smallest schools. No facilities scored in the exceptional quality category.

4. Part 2: Convergent Validation Study

4.1. PACES-Facilities Validation
4.1.1. Materials

The Physical Activity Resource Assessment (PARA) instrument [23] was chosen for the validation comparison because it was the most appropriate objective tool for evaluating similar recreation facility concepts. The survey is a one-page checklist that assesses resource type, features, amenities, and incivilities. Resource types (fitness club, park, sport facility, trail, community center, church, school, and combination) are assessed on size, capacity, cost (free, pay at door, pay for certain programs), hours (open and close), and signage (hours, rules: yes/no). For the features and amenities sections of the survey, each item is rated on a 0 to 3 scale: 0 = not present, 1 = poor, 2 = mediocre, and 3 = good. Features include baseball field, basketball court, soccer field, bike rack, exercise station, play equipment, pool >3 feet deep, sandbox, sidewalk, tennis court, trail (running/biking), volleyball court, and wading pool <3 feet deep. The amenities section includes access points, bathrooms, benches, drinking fountains, fountains, landscaping effort, lighting, shaded picnic tables, unshaded picnic tables, shelters, shower/locker room, and trash containers. The incivilities items are rated on a 0–3 scale: 0 = not present, 1 = little/few, 2 = some, and 3 = a lot. Incivilities include auditory annoyance, broken glass, dog refuse, unattended dogs, evidence of alcohol use, evidence of substance abuse, graffiti/tagging, litter, no grass, overgrown grass, sex paraphernalia, and vandalism. Detailed directions were included with the survey.

4.1.2. Protocol

The PACES-Facilities and PARA audits were tested at 29 facilities on and near college campuses. For validation purposes, each auditor/team was responsible for a paired evaluation using both tools (PACES-Facilities and PARA). To reduce potential bias, PARA was collected first for one half of the audits, and PACES-Facilities was collected first for the other half. Auditors entered all surveys into Qualtrics, with separate surveys for PARA and PACES-Facilities.

Training and Interrater Reliability. Research assistants completed training, practiced, and performed interrater reliability (IRR) exercises for both tools. Each participating campus had its student researchers complete IRR on two recreation facilities. Intraclass correlations > 0.80 were required for each team prior to data collection.

4.1.3. Analysis

For scoring, incivilities were reverse coded: 4 = not present, 3 = little/few, 2 = some, and 1 = a lot. For each section (features, amenities, and incivilities), an average score and a sum score were computed, and the reliability of each section was assessed.
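
As a concrete illustration of this scoring step, the pandas sketch below reverse codes two hypothetical incivility items and computes per-section sums and averages; the item names and ratings are invented for the example.

```python
# Sketch of PARA section scoring (illustrative data; pandas assumed).
import pandas as pd

para = pd.DataFrame({          # one row per facility, raw 0-3 ratings
    "litter": [0, 2], "graffiti": [1, 3],   # incivility items
    "bike_rack": [3, 1], "pool": [2, 0],    # feature items
})

# Reverse code incivilities onto a 1-4 scale: 0->4, 1->3, 2->2, 3->1.
incivility_items = ["litter", "graffiti"]
para[incivility_items] = 4 - para[incivility_items]

# Per-section sum and average scores, as computed for each PARA section.
feature_items = ["bike_rack", "pool"]
for name, cols in [("incivilities", incivility_items),
                   ("features", feature_items)]:
    print(name, para[cols].sum(axis=1).tolist(), para[cols].mean(axis=1).tolist())
```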

Spearman’s correlations were used to compare the PACES-Facilities score to the PARA features section and a temporary set of PACES “amenities” items to the PARA amenities section. PACES was not designed to assess incivilities, and the PARA incivilities reliability was low, so no comparison was made for this dimension.

4.2. PACES-Programs Validation
4.2.1. Materials

To validate PACES-Programs, the results were compared to a survey created from the Partnership for a Healthier America’s Healthier Campus Initiative (HCI) [25]. A portion of the HCI survey was chosen for this validation because it measures comparable concepts regarding the extensiveness of health and wellness programming on a college campus. The HCI survey contained 41 questions: 15 regarding food/nutrition offerings, 19 regarding physical activity programs/facilities, and seven regarding policies. Each question was a yes/no checkoff to indicate whether a campus had the initiative or policy. The 16 specific HCI programming questions selected for validation included bike share/rental, fitness/intramural opportunities, introduction to physical activity classes, physical activity breaks offered, fitness orientations, sufficient outdoor activities, rental of outdoor equipment, outdoor recreation clinics/trips, marked walking routes, free access to a fitness/recreation center, dedicated physical activity space, outdoor running/walking track, outdoor fitness system, certified personal trainers, implementation of a comprehensive wellness program, and healthy cooking classes.

4.2.2. Procedure

Campuses participating in Get FRUVED [43] completed PACES-Programs and the HCI survey as part of their full data collection.

4.2.3. Data Analysis

The 16 selected HCI programming questions were summed to create an HCI programming subscore, which was compared to the PACES-Programs score using Spearman’s correlation.
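
The comparison itself is a single rank correlation; a minimal sketch with invented scores (not study data) using scipy is shown below.

```python
# Sketch of the convergent validity comparison (scipy assumed; the scores
# are illustrative, not study data).
from scipy.stats import spearmanr

paces_programs = [41, 28, 55, 12, 33, 47, 20, 38]  # hypothetical campus totals (max 73)
hci_subscore   = [12,  7, 15,  3,  9, 13,  5, 11]  # summed 16 yes/no items (0-16)

rho, p = spearmanr(paces_programs, hci_subscore)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```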

5. Validation Results

There were 29 total PACES-Facilities and PARA pairs. Interrater reliability ICCs ranged from 0.91 to 0.99 for PARA and from 0.81 to 1.0 for PACES-Facilities. Most of the schools were public institutions (87%) (Table 5). The northeast and the south each represented 30% of the sample, while no school facilities were evaluated from the west. More than a third of the sample (39%) was from very small and small schools with ≤10,000 students. Correlations were significant between PACES-Facilities (features) and PARA features and between PACES-Facilities (amenities) and PARA amenities.

Forty-one of the 78 Get FRUVED schools had matched PACES-Programs and HCI data. Most of the schools were public institutions (70.7%) (Table 5). The south represented 39% of the sample, while only 17% were from the northeast. More than a third of the sample (39%) was from very small and small schools with ≤10,000 students. There was a significant correlation between the total PACES-Programs score and the HCI programming subscore.

6. Discussion

For this study, the team redesigned, tested, and validated an updated version of PACES, a reliable tool to assess the quality of recreation facilities and programs. PACES distinguishes differences between facility types and programs, across campuses and regions. Most facilities evaluated with PACES-Facilities fell into the highest quality category, primarily due to the extensiveness of their aerobic equipment and amenities and the competence and accessibility of their staff. The range of PACES-Programs scores indicated that most campuses provided a moderate level of options and supports for their overall campus programs.

Only a few researchers have attempted to assess the quality of recreation facilities/programs, with most relying on client or user perception [17–19, 45]. The original PACES [24], PARA [23], and RecFAT [22] more objectively assessed quality by focusing on condition and maintenance. Comparisons between PACES and previous research are limited because the latter typically evaluates only portions of the physical activity environment, such as the presence/availability of recreation facilities [15, 46] or user satisfaction [20, 21]. Some existing audits focus on safety [16], universal access [39], park quality [47, 48], or rural environments [49]. In a systematic review of worksite tools related to supports for physical activity [50], only 20% of the tools were objective audits, while over 50% were based upon employee self-report. Of the tools reviewed, 75% of the studies included access to physical activity equipment/facilities, amenities, and an assessment of educational opportunities; 66% included bike rack availability/stairwell features; and less than 50% included evaluation of fitness assessment opportunities [50]. While 50% had completed internal and/or interrater reliability testing, only 33% reported some level of validation.

The PARA and RecFAT tools are useful in a diversity of recreation environments but are less specific regarding the details of campus recreation facilities. For this study, PARA was helpful as a validation comparison for PACES-Facilities features and amenities [23]. For the subsample assessed by both PARA and PACES-Facilities, there were significant correlations for both features and amenities (PACES-Facilities does not assess incivilities, so no comparison was made). Using PARA, Adamus et al. [51] found results similar to those found on college campuses: fitness clubs had the highest scores for amenities, and combination resources had the highest scores for features.

Others have found low to moderate reliability between population perceptions and objective (Google Earth) assessments of recreation facilities [52]. PACES can be a valuable and objective tool for evaluating and comparing the quality of recreation programs and facilities in varied environments. This is important because recreation facility quality has been found to relate to a variety of outcomes [6, 7, 9, 10, 21, 53–59]. The accessibility of recreation facilities is related to the level of physical activity in various populations [6, 7, 9, 10, 21, 53, 54, 60], and a community’s natural amenities and recreation facilities per capita are negatively related to the population’s rate of obesity [58]. On college campuses, recreation facility usage has been related to higher academic outcomes (GPA) [12, 59], higher student retention [56, 59], reduced stress [57], increased exercise frequency [42], and improved health indices [5]. The quality of recreation services (specifically staff competency, operations quality, and facility ambiance) has been shown to influence levels of satisfaction with recreation facilities and programs [17].

The PACES training and practice require approximately 2-3 hours, and an audit can be completed in 25–30 minutes per facility or survey. The PACES audit is part of the Healthy Campus Environmental Audit (HCEA), a series of audit tools to evaluate restaurants [61], convenience stores [62], vending [63], walkability/bikeability [27], and policies [26]. The PACES audit is user-friendly and available on the internet, with training and data entry links (contact the primary author for information). The primary institution analyzes the data and provides feedback to the user, including comparison and benchmark information. The PACES-Facilities and PACES-Programs audits have been validated for college campus-type work environments but might also be useful in a variety of settings, including communities, worksites, colleges, and schools.

A limitation of this study is that the audit has been used and tested only by college-educated populations on and near college campuses; it therefore needs to be validated with other data collectors to determine the utility of PACES in communities beyond college campuses. Because many recreation facility audits are perception-based [17–21, 45], and some contradict objective findings [52], the PACES audit should be further tested by comparing audit results with recreation facility/program clients’ perceptions. Future research should also evaluate the relevance and weighting of PACES items and the effectiveness of facilities and program supports in encouraging physical activity.

Data Availability

Data are available upon request to the corresponding author.

Disclosure

The funder, the USDA AFRI grant program, had no role in the design, analysis, or writing of this article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Authors’ Contributions

All the authors have made substantial contributions (a) to conception and design, acquisition of data, or analysis and interpretation of data; (b) to drafting the article or revising it critically for important intellectual content; and (c) to final approval of the version to be published and agreement to its submission. Specifically, Dr. Horacek and Dean Seidman designed the study. All authors pilot-tested the tool and acquired data, and Dr. Horacek and Elif Dede Yildirim analyzed and interpreted the data. Dr. Horacek drafted the article, and all authors revised it. All authors provided final approval of the version to be published and agreed to its submission.

Acknowledgments

We would like to acknowledge (1) the technical support for data collection and training provided by Megan Mullin, Laura Brown, and Heather Brubaker and (2) all of the research assistants at each institution who collected data. Funding for this study was provided by the Agriculture and Food Research Initiative (Grant no. 2014-67001-21851) from the USDA National Institute of Food and Agriculture, “Get FRUVED: a peer-led, train-the-trainer social marketing intervention to increase fruit and vegetable intake and prevent young adult weight gain, A2101.”