Abstract

Introduction. The Centers for Disease Control and Prevention (CDC)’s National Tuberculosis Surveillance System (NTSS) is the national repository of tuberculosis (TB) data in the United States. Jurisdictions report to NTSS through the Report of Verified Case of Tuberculosis (RVCT) form, which transitioned to a web-based system in 2009. Materials and Methods. To improve RVCT data quality, CDC conducted a quality assurance (QA) needs assessment to develop QA strategies. These include QA components (case detection, data accuracy, completeness, timeliness, and data security and confidentiality); sample tools, such as the National TB Indicators Project (NTIP), to identify TB case reporting discrepancies; a comprehensive training course; and a resource guide and toolkit. Results and Discussion. During July–September 2011, 73 staff from 34 (57%) of 60 reporting jurisdictions participated in QA training. Participants noted the usefulness of sharing jurisdictions’ QA methods; 66 (93%) wrote that the QA tools will be effective for their activities. Several jurisdictions reported implementing QA tools pertinent to their programs. Data showed increases in NTSS (10%) and NTIP (8%) enrollment through Secure Access Management Services, which monitors system usage, from August 2011 to February 2012. Conclusions. Despite challenges imposed by web-based surveillance systems, QA strategies can be developed with innovation and collaboration. These strategies can also be used by other disease programs to ensure high data quality.

1. Introduction

In 2010, there were 8.8 million new cases of tuberculosis (TB) disease reported worldwide, with over 1 million TB deaths [1]. In the United States, 11,182 people were newly diagnosed with TB disease [2]. The mission of the Division of Tuberculosis Elimination (DTBE), Centers for Disease Control and Prevention (CDC) is to promote health and quality of life by preventing, controlling, and eventually eliminating TB from the United States, and by collaborating with other countries and international partners in controlling TB globally [3].

Tuberculosis surveillance is a core public health function. Ongoing and systematic collection, analysis, interpretation, and dissemination of surveillance data allow programs to target control interventions that provide the most impact in eliminating TB [4]. These surveillance data are essential in describing morbidity and mortality, monitoring trends in TB incidence and prevalence, detecting potential outbreaks, and defining high-risk groups. In addition, TB data are needed to evaluate TB control programs, identify deficiencies, and allocate resources. In order to perform these important functions, it is essential that surveillance data are collected and reported in an accurate, complete, and timely manner.

The CDC’s National Tuberculosis Surveillance System (NTSS) is the national repository of TB surveillance data in the United States. CDC receives data on TB cases from reporting jurisdictions through a standardized data collection form, the Report of Verified Case of Tuberculosis (RVCT). NTSS has 60 reporting jurisdictions: all 50 US states, the District of Columbia, New York City, American Samoa, Federated States of Micronesia, Guam, Republic of the Marshall Islands, Commonwealth of the Northern Mariana Islands, Puerto Rico, Republic of Palau, and US Virgin Islands.

The RVCT was revised by a group of TB experts in 2009 and transitioned into a new web-based reporting system. An interdisciplinary CDC DTBE team collaborated with key national partners, state-based medical or health officers, and other local healthcare professionals to launch a national training program on the new RVCT [5, 6]. Extensive reviews of training materials enabled partners to provide feedback for improving the instructions for each of the 49 RVCT items [7]. The team also developed a self-study manual for participants that was used during facilitator-led trainings [8]. The manual can also be used for self-study by new TB staff and as a reference guide. In addition, a facilitator manual was developed and used during training-of-trainers courses to build RVCT training capacity throughout the reporting jurisdictions [9].

Quality assurance (QA) is a critical part of any successful surveillance system and is a continuous cycle of monitoring, evaluating, and improving data quality [10, 11]. Prior to 2009, jurisdictions depended on a CDC disk operating system (DOS)-based application for TB surveillance data. This system provided a series of validation reports to jurisdictions for managing data. When CDC transitioned to a web-based system in 2009, there was a need for a standardized QA process that jurisdictions could adapt to their settings.

The team determined that a logical follow-up to the RVCT trainings was to enhance the QA knowledge and skills of TB surveillance staff. Furthermore, the RVCT training participants expressed concerns regarding the lack of data validation in some state systems and the inability of reporting areas to transmit all data electronically. DTBE staff began working individually with state public health partners to develop QA strategies. This paper describes these strategies for ensuring the quality of TB data reported to the CDC’s NTSS through the new web-based system.

2. Materials and Methods

The RVCT QA training team, in collaboration with key partners, developed innovative strategies to provide standardized methodologies, skills, and tools to enhance the capacity for conducting QA. Similar to the RVCT training course, the team used the systematic process for health education to develop these QA strategies [6, 12]. This process includes needs assessment, development, pilot testing, implementation, and outcome evaluation.

2.1. Quality Assurance Needs Assessment

During 2010–2011, the training team conducted a comprehensive needs assessment to determine strategies that could enhance QA for TB surveillance data. During the assessment, the team used prepared open-ended questions to facilitate discussions of QA topics with jurisdiction and CDC staff.

The needs assessment included the following:
(i) meeting with TB program area staff from 11 reporting jurisdictions in either focus groups or individual interviews. Three TB-burden levels were represented: low (≤3.5 cases per 100,000 population in 2009), medium (3.6–3.8 cases per 100,000 population in 2009), and high (>3.8 cases per 100,000 population in 2009) (see the illustrative sketch following this list). The staff described their surveillance systems and staff characteristics (training and expertise levels) and shared their QA processes and tools (i.e., tables, charts, graphs, processes, and templates). Staff suggested content topics and prioritized QA components that should be covered in the materials and a training course. In addition, they discussed successes and challenges experienced when conducting QA at their sites;
(ii) meeting with colleagues from DTBE who have a role in ensuring quality data, including the subject matter experts in the laboratory, the Data Management and Statistics Branch, project officers for the National TB Indicators Project (NTIP), and the TB Genotyping Information Management System [13, 14]. These staff members collaborated to help develop and conduct a comprehensive training program and QA tools;
(iii) meeting with surveillance staff from other CDC divisions, including the Division of STD Prevention, the Division of Viral Hepatitis, and the Division of HIV/AIDS Prevention. Some of these colleagues indicated they conducted QA only after data arrived at CDC. None of the QA procedures or processes utilized by other divisions met the needs of DTBE;
(iv) conducting a review of available QA materials on surveillance data [15–38]. This review yielded information on various QA components and definitions (Table 1). However, the team did not find a comprehensive QA framework, practical step-by-step QA strategies for TB surveillance data, or practical models for a QA training course;
(v) reviewing the surveillance section of the Tuberculosis Elimination and Laboratory Cooperative Agreement, a portion of an agreement between DTBE and NTSS reporting jurisdictions that describes area surveillance activities [39]. This yielded QA components and a requirement to monitor data quality (Table 1).
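For illustration only, the following minimal sketch shows how a jurisdiction could be grouped into the low, medium, or high TB-burden category using the 2009 incidence cutoffs listed above. The function and variable names are hypothetical and are not part of NTSS, NTIP, or any CDC application.

    # Illustrative sketch: categorize a jurisdiction by TB incidence
    # using the cutoffs described above (cases per 100,000 population).
    # All names here are hypothetical examples, not CDC software.
    def classify_tb_incidence(cases: int, population: int) -> str:
        """Return the 'low', 'medium', or 'high' TB-burden category."""
        rate = cases / population * 100_000  # cases per 100,000 population
        if rate <= 3.5:
            return "low"
        elif rate <= 3.8:
            return "medium"
        return "high"

    # Example: 150 cases among 4,000,000 residents is 3.75 per 100,000 ("medium").
    print(classify_tb_incidence(150, 4_000_000))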

2.2. Quality Assurance Strategies

The results of the needs assessment were used to develop 4 strategies for enhancing QA procedures in reporting jurisdictions. These include the following.

(1) Providing a QA Process That Includes Five Components and Categorizing Activities into Each of These Components
QA components include case detection, data accuracy, data completeness, data timeliness, and data security and confidentiality. These components provided logical steps for conducting QA activities and were designed to allow reporting areas to utilize those strategies that would benefit them.

(2) Providing QA Tools Including Guidance for a Written QA Protocol
The RVCT QA training team provided a template for a written QA protocol and other tools that jurisdictions can easily adapt and use to conduct QA. Staff from CDC and the various jurisdictions developed over 45 tools, which were classified into the five QA components (Table 2). The tools include tables, charts, graphs, processes, and templates and are available in common electronic formats (e.g., Word, Excel, and PowerPoint). The team’s main tool is a template to help jurisdictions write the QA protocol required in the annual DTBE Cooperative Agreement (Table 2).
The National TB Indicators Project (NTIP) is also an important QA tool [13]. During 2010, an NTIP module was developed to allow users to identify TB case reporting discrepancies. This module has proven useful in recognizing data coding errors and data transmission problems and in highlighting that such errors occur more frequently than previously recognized.
Reporting jurisdictions can access their NTIP and NTSS QA reports, such as the missing and unknown (MUNK) reports, through Secure Access Management Services (SAMS). SAMS is a federal information technology system that gives authorized personnel secure, external access to nonpublic CDC applications.
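To make the idea of a completeness report concrete, the sketch below tallies missing or unknown values across a set of case records, in the spirit of a MUNK-style check. It is an illustration only: the field names, record format, and values treated as missing are assumptions rather than the actual RVCT variables or the official MUNK reports available through SAMS.

    # Minimal sketch of a missing-and-unknown (MUNK-style) completeness tally.
    # Field names and record format are hypothetical, not the actual RVCT schema.
    from collections import Counter

    MISSING_VALUES = {None, "", "Unknown"}  # assumed codes for missing/unknown

    def munk_summary(records, fields):
        """Count missing or unknown values per field across reported cases."""
        counts = Counter()
        for record in records:
            for field in fields:
                if record.get(field) in MISSING_VALUES:
                    counts[field] += 1
        return {field: {"missing_or_unknown": counts[field],
                        "percent": round(100 * counts[field] / len(records), 1)}
                for field in fields}

    # Example with two illustrative case records
    cases = [
        {"hiv_status": "Negative", "country_of_birth": "Unknown", "sputum_smear": ""},
        {"hiv_status": None, "country_of_birth": "Mexico", "sputum_smear": "Positive"},
    ]
    print(munk_summary(cases, ["hiv_status", "country_of_birth", "sputum_smear"]))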

(3) Developing and Conducting a QA Training Course
The training team developed and conducted a comprehensive QA training course to enhance the knowledge and skills needed by TB surveillance staff from the reporting jurisdictions for conducting QA. The results of the needs assessment and collaboration with subject matter experts were key to development of the QA course. The course focused on the QA process and five components, as well as other related topics to increase course participants’ use of NTSS data for QA and program planning.
The course format included presentations from faculty (DTBE subject matter experts) with slides and handouts, exercises to apply the content to realistic situations, interactive discussions to share experiences and answer questions, and tools for participants to use or adapt to their settings. Participants also described how they conducted QA at their sites and provided examples of QA challenges they encountered.

(4) Developing a Resource Guide and Toolkit
The training team is currently developing a resource guide and toolkit that can be used as a QA reference or a training manual. It will include many of the materials developed for the course, such as handouts, exercises to apply the content, a glossary, and examples of the tools (Table 2). The guide and toolkit will be available in a print-based format with a companion CD that provides the tools in easy-to-use formats (Word, Excel, and PowerPoint) so that jurisdictions can adapt them to their own settings. In addition, the materials will be downloadable from the CDC website.

3. Results and Discussion

3.1. Quality Assurance Training Evaluations

In July 2011, the team facilitated a pilot test of the 2-day QA training course with eleven TB surveillance experts from various state and local TB programs. The participants provided suggestions on how to improve the materials, the presentations, and the course schedule. The comprehensive course evaluation included written evaluations with qualitative and quantitative questions, discussions at the end of each of the five QA components, an end-of-course written evaluation, and observations by course faculty. The team revised the materials and training course based on the analysis of the evaluation results (Table 3).

The team and other faculty members also conducted four 2-day trainings in Atlanta, GA, between August and September 2011. Course participants included 61 TB surveillance staff. Participants from the four trainings completed an end-of-course evaluation form consisting of qualitative and quantitative questions.

Results of the combined responses from the pilot course and four trainings evaluations (Table 3) indicated that participants learned about the QA process and benefited from sharing information on how other jurisdictions implement the QA components at their sites. The 73 participants (from the pilot course and four trainings) represented 34 (57%) of the 60 NTSS reporting jurisdictions. The 34 jurisdictions represent more than 80% of all TB cases reported to CDC each year.

Of the 73 participants, 66 (93%) stated that the QA tools will be effective in helping them conduct QA in their programs (Table 3). Participants stated that some of the most important things they learned were the five QA components and how they relate to the requirements in the cooperative agreement for a written QA TB surveillance protocol. Most of the participants appreciated the assessment of programmatic needs and the effort that went into implementing the course.

3.2. Quality Assurance Strategies with Limited Resources

In developing innovative strategies for QA of TB surveillance data, a key question for all programs is how best to maximize the use of limited resources to ensure data quality. The design and flexibility of the guide and toolkit enable healthcare staff to learn about the QA process in a self-study format or as part of a facilitator-led training course. Also, providing the materials both in print and on the internet ensures access to them without requiring additional resources. Gaining knowledge and skills to conduct QA helps reporting jurisdictions remain vigilant in maintaining high-quality surveillance data despite limited resources.

3.3. Impact of Quality Assurance Strategies

This QA project represents a significant improvement to NTSS because it compiles, for the first time, guidelines, a step-by-step process, and tools for monitoring and improving the quality of TB surveillance data. Additionally, CDC noted unprecedented timeliness and accuracy of all the required RVCT variables needed for the publication of TB surveillance data in the Morbidity and Mortality Weekly Report issue for World TB Day on March 24, 2012. The process of obtaining these data was much easier than in previous years because of the improved understanding between CDC and jurisdictions fostered during the QA training.

Although the team conducted training on the QA strategies less than a year ago, CDC staff have also noticed better collaboration among NTSS reporting jurisdictions in sharing QA tools. Several jurisdictions sent letters to CDC shortly after the training stating that they had implemented QA tools pertinent to their programs.

In addition, SAMS portal reports indicated a 10% increase in NTSS enrollment from August 2011 to February 2012 and an 8% increase in NTIP enrollment for the same period. However, it may be premature to attribute these increases to the QA strategies.

The team also attempted to obtain MUNK reports from the jurisdictions represented at the QA training to examine any QA improvement, but this information was not readily available. Such reports could be used to compare RVCT data with the previous year’s and to show how missing-value or validation issues arise. However, this may not be a reliable measure of the impact of the QA strategies because the MUNK reports are influenced by changes in the number of TB cases, the complexity of data-related issues, changes to state-based systems, and staff turnover.

The impact of the QA strategies on the quality of TB surveillance data can be better evaluated after the jurisdictions have fully implemented them. A survey of their QA practices could systematically evaluate the importance of these strategies.

Despite these limitations, the QA strategies support TB policies, laws, and regulations because they equip local jurisdictions with a systematic set of processes and tools that may be used to fulfill the cooperative agreement requirement to monitor data quality. Also, since reporting of a patient with TB disease to health authorities is mandated by state laws, the guidance on case detection, data accuracy, completeness, and timeliness helps reporting areas comply with these laws.

In addition, these strategies are essential in collecting accurate and reliable TB surveillance data that are critical to making decisions to meet DTBE’s priorities: interrupt transmission of Mycobacterium tuberculosis, reduce TB in foreign-born populations, reduce TB in racial/ethnic minority populations, mitigate/reduce impact of multidrug-resistant and extensively drug-resistant TB, and reduce HIV-associated TB. Being vigilant in performing QA ensures high-quality data and ultimately helps accelerate progress toward elimination of TB in the United States.

4. Conclusions

Despite challenges imposed by various surveillance systems, economic constraints, and new diagnostic technologies, strategies for conducting QA can be developed with innovation and collaboration. Mobilizing the TB community to ensure high-quality data involves commitment, time, and energy of TB leaders and partners.

Guidelines, a step-by-step process, and tools for monitoring and improving the quality of TB surveillance data are essential for the TB community to effectively control TB. Future evaluation on the impact of these QA strategies will further demonstrate their importance in maintaining data quality.

Disclosure

This paper lists nonfederal resources in order to provide additional information to consumers. The views and content in these resources have not been formally approved by the U.S. Department of Health and Human Services (HHS). Listing these resources is not an endorsement by HHS or its components. The findings and conclusions in this paper are those of the authors and do not necessarily represent the views of CDC.

Acknowledgments

The authors would like to thank the TB program staff from various jurisdictions for participating in the needs assessment and those who submitted QA tools including Jason Cummins (TN), Sheanne Allen (WA), Jill Fournier (NH), Eyal Oren (Seattle/King County, WA), Janice Westenhouse (CA), and Gayle Wainwright (OR). They also thank the QA training course faculty for their hard work and commitment including Sandy Price and Stacey Parker (Data Flow and System QA Reports); Bob Pratt (Data Accuracy); Lori Armstrong (Data Validation Pilot Project); Carla Jeffries (Missing and Unknown Reports); Beverly Metchock and Angela Starks (Laboratory); Glenda Newell (Case Count Timeliness); Kai Young (National TB Indicators Project); Rachel Yelk-Woodruff (RVCT Completeness Study); Juliana Grant, Sandy Althomsons, and Brian Baker (TB Genotyping Information Management System). In addition, they would like to thank DTBE leadership for providing support for developing the QA project. Without their continued guidance, financial resources, and commitment to quality surveillance data, this project would not be possible. They also want to thank the dedication and hard work of all of the reporting jurisdictions that make TB surveillance successful.