Evidence-Based Complementary and Alternative Medicine
Volume 2013 (2013), Article ID 701280, 10 pages
Mixed-Methods Research in a Complex Multisite VA Health Services Study: Variations in the Implementation and Characteristics of Chiropractic Services in VA
1VA Center for Implementation Practice and Research Support, VA Greater Los Angeles Healthcare System, 16111 Plummer Street, Sepulveda, Los Angeles, CA 91343, USA
2Military Medical Research Program, Samueli Institute, Corona del Mar, CA, USA
3Chiropractic Program, Patient Care Services, Veterans Health Administration, Washington, DC, USA
4Chiropractic Service, VA Connecticut Healthcare System, West Haven, CT, USA
5RAND Corporation, Boston, MA, USA
Received 22 September 2013; Accepted 30 September 2013
Academic Editor: Cheryl Hawk
Copyright © 2013 Raheleh Khorsan et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Maximizing the quality and benefits of newly established chiropractic services represents an important policy and practice goal for the US Department of Veterans Affairs’ healthcare system. Understanding the implementation process and characteristics of new chiropractic clinics, along with the determinants and consequences of these processes and characteristics, is a critical first step in guiding quality improvement. This paper reports insights and lessons learned regarding the successful application of mixed-methods research approaches, derived from a study of chiropractic clinic implementation and characteristics, Variations in the Implementation and Characteristics of Chiropractic Services in VA (VICCS). Challenges and solutions are presented in areas ranging from the selection and recruitment of sites and participants to the collection and analysis of varied data sources. The VICCS study illustrates several factors important to successful mixed-methods approaches, including (1) a formal, fully developed logic model that links data sources, variables, and outcomes of interest to the study’s analysis plan, data collection instruments, and codebook and (2) data collection methods that match the study aims. Overall, successful application of a mixed-methods approach requires careful planning, frequent trade-offs, and complex coding and analysis.
There is growing consumer interest in complementary and alternative medicine (CAM) in the USA and internationally [1–3]. Healthcare systems have responded to this demand by offering a range of CAM services in outpatient and inpatient settings [4, 5]. Patients enrolled in the US Department of Veterans Affairs (VA) healthcare delivery system often use CAM services outside of VA but have a strong interest in receiving these services within the VA system [6–11]. In response, VA began providing selected in-house CAM services in about 2001. VA’s most substantial undertaking in delivering any CAM-related service has been its introduction of chiropractic services.
Chiropractic care is often described as sitting at the crossroads of CAM and mainstream medicine, and its introduction into the VA healthcare system exemplifies that duality. In 1999, Congress directed VA to establish a policy regarding chiropractic services for musculoskeletal conditions (Public Law 106–117). Although specific action was not mandated, in response to this legislation, VA began providing limited access to chiropractic care by paying for services delivered outside the VA healthcare system. In 2001, Public Law 107–135 made chiropractic services part of the standard medical benefits available to all Veterans and required VA to deliver these services on-site by VA chiropractors at a minimum of one VA medical facility in each of VA’s 21 geographic regions (Veterans Integrated Service Networks or VISNs) [14, 15]. This required the incorporation of a new provider type, doctors of chiropractic (DCs), into VA’s clinical and administrative policies and procedures.
The establishment of chiropractic clinics within VA was challenged by the rarity of existing models in other healthcare systems and by the widely varying perceptions of chiropractic services among medical physicians and other stakeholders. VA convened a Federal Advisory Committee to make recommendations on the implementation of chiropractic services and in July 2004 issued Directive 2004–035, which established the overall policy for VA chiropractic services. While chiropractic care is now part of VA’s standard medical services, in practice and perception it still retains many of the limiting features of a CAM service within a traditional medical setting. The introduction of chiropractic services in VA faced not only the typical challenges of introducing any new clinical service or program into a large healthcare system but also the unique obstacle of integrating a nontraditional healthcare service into conventional medical settings [14, 17–19].
By the end of 2005, VA had successfully complied with the requirement of establishing a minimum of one chiropractic clinic within each VISN. This initiative was loosely coordinated by VA Central Office (VACO), leaving most of the details to individual facilities. Over the following years, the use of chiropractic services at these initial facilities increased dramatically. This growth, along with interest from Veterans and providers at other VA facilities, stimulated the expansion of chiropractic clinics into other VA facilities. From fiscal year 2005 to fiscal year 2011, without further Congressional mandate, the number of VA chiropractic clinics increased from 24 to 43, and the number of Veterans receiving care at these clinics increased from just under 4,000 to over 81,000. Also during this time, VACO established central leadership for the chiropractic program in the Office of Rehabilitation Services, which began to monitor and assess the ongoing uptake and expansion of services. Because of expected challenges facing the introduction of a new provider type (issues of privileging, competencies, and facility integration), unique features related to chiropractic care (varying perceptions and prior experience of other clinicians), and the relatively decentralized manner in which initial clinics were established, the chiropractic program office sought to gain deeper knowledge of the program’s continuing development and features.
Early studies of chiropractic care in VA described patient characteristics and outcomes in individual VA chiropractic clinics [17, 21–23], characteristics of patients and clinics at the national level, and elements of academic training programs. However, a more in-depth understanding of VA’s implementation of chiropractic services was needed to inform future policy and practice decisions and ultimately to ensure the highest quality of care delivered to Veterans. Program implementation initiatives within VA, as well as similar efforts outside VA, require careful planning and execution to achieve success. The chiropractic program office lacked the resources and expertise to conduct a large-scale program evaluation but was positioned to build partnerships with the VA research community. These circumstances led to a research-policy-practice partnership established to design and obtain funding for a program of research, beginning with a pilot study entitled “Variations in the Implementation and Characteristics of Chiropractic Services in VA (VICCS).” The VICCS study was guided by prior research examining the introduction and integration of nurse practitioners in VA and related research examining the introduction and role of nurse practitioners and physician assistants in other healthcare delivery settings, as well as additional studies documenting the implementation and integration of new clinical services in a range of settings.
The VICCS research-policy-practice partnership sought to explore the chiropractic services program in parallel with other VA integrated care initiatives. These include programs for Veterans returning from operations in Afghanistan and Iraq (i.e., VA’s Post-Deployment Integrated Care Initiative) as well as a national palliative care program (Comprehensive End-of-Life Care Initiative) and the ongoing primary care medical home initiative (Patient Aligned Care Teams).
This paper describes the design and methods of the VICCS study and insights gained from the application of a mixed-methods approach to the study questions. The experiences and insights from the study offer guidance for future research-practice partnerships and methods suitable for assessing the introduction of other new clinical services—traditional or CAM—in VA and other large healthcare systems. The paper describes the mixed-methods design employed, as well as specific challenges related to data collection instruments and logistics and to the analysis of diverse data types for distinct study aims.
2. VICCS Study Development and Aims
The primary objective of the VICCS study was to identify variations in the implementation processes and organizational arrangements of VA chiropractic services and examine the causes and consequences of those variations. A mixed-methods approach was used to pursue the study’s three specific aims.
(1) Document and characterize (a) the implementation of chiropractic services into individual VA healthcare delivery facilities and (b) the characteristics and organizational arrangements through which these services are delivered, including their integration with existing clinical services.
(2) Identify (a) key factors leading to different implementation patterns and clinic characteristics across different VA facilities and (b) selected impacts and consequences of different implementation patterns and clinic characteristics.
(3) Develop and refine research methods and tools for (a) a larger, more definitive study of chiropractic care programs in VA and (b) studies examining the implementation of other new services and disciplines (including CAM services) in large healthcare delivery systems.
To address VICCS study aims (1) and (2), the study employed a comparative case study approach relying on (a) interviews to gather data from key stakeholders, (b) collection and content analysis of policy and procedure documents and other archival/documentary material to supplement interview-provided data, and (c) administrative data on use of VA chiropractic services. The study team’s experience and identification of several methodological and logistical challenges encountered during the study contribute to VICCS study aim (3), in which the study team used many of the “lessons learned” to inform and guide planning for future studies, whether of chiropractic services in VA or other new services and disciplines in any large US healthcare delivery system.
The mixed-methods approach included qualitative and quantitative analysis methods for inductive (hypothesis generation and exploration) and deductive (hypothesis testing) analyses. VICCS study data collection occurred in 2010 and 2011, followed by data analysis and reporting in 2012.
3. Mixed Methodology and Health Services Research
The VICCS study’s core conceptual framework relied on Donabedian’s structure, process, and outcomes model (Table 1). Donabedian suggests that the quality of health care can be conceptualized and evaluated along three main dimensions of care delivery: structures of care, processes of care, and care outcomes. Structure refers to the setting in which care is delivered, including facilities and equipment, qualifications of care providers, administrative structure, and operations of programs. Process encompasses how care is provided and is measured in terms of appropriateness, acceptability, completeness, or competency [29–33]. These measurements are typically less definite than those obtained through assessing outcomes. Lastly, outcomes refer to the end points of care, such as improvement in function, recovery, or survival. Outcomes are usually concrete and precisely measured.
The VICCS data collection framework was designed to include several key categories of variables. These included features of each site and the background and motivation for the establishment of each chiropractic clinic. For example, information was collected on the initial impetus for each clinic (i.e., did VISN leadership require establishment of a clinic at a given facility or did facility leadership voluntarily establish a clinic?) and other key features of the healthcare setting prior to chiropractic clinic implementation. The data collection framework also distinguished several distinct phases in the clinic planning, implementation, and maintenance process and several distinct categories of variables describing the clinic context, chiropractic clinic itself, and key outcomes and measures of performance at each clinic.
Table 1 lists the domains and illustrative variables selected for the VICCS study. The conceptual model was further refined through an iterative process as the study was underway, as described in the following.
(i) Environment/context includes local factors, such as local stakeholder attitudes toward innovation in general and chiropractic in particular, as well as VA regional and national factors and non-VA external factors such as Veteran Service Organization influences.
(ii) Planning/implementation includes features of a facility’s planning process and the participation of various stakeholders with differing levels of subject matter expertise.
(iii) Clinic structure includes characteristics of the individual DC clinician(s), organizational alignment, physical features of the clinic, and the formal relationship and extent of integration or collaboration with other facility programs and stakeholders.
(iv) Care processes include characteristics of healthcare services provided, features of case management or care pathways, and quality of services.
(v) Impacts/outcomes include the status of clinic access and use, patient-based outcomes, system perception of value, and external stakeholder opinions.
Semistructured interview guides were developed for each type of stakeholder. Interview subjects included VA facility leaders, chiropractors, chiropractor supervisors, chiropractic clinic support staff, other clinicians from various departments, administrative planners, VA patients with musculoskeletal pain complaints (from both chiropractic and nonchiropractic clinics), and external stakeholders (academic affiliates, Veteran Service Organizations, and former Federal Advisory Committee participants). Interview guides were adapted from existing instruments employed in similar studies, augmented by new questions and content specific to this study. Questions to measure stakeholder satisfaction and views were informed by existing instruments, such as the Chiropractic Satisfaction Questionnaire and the Measure of Clinicians’ Orientation toward Integrative Medicine (IM-30), and by input from subject matter experts.
4. Mixed-Methods Approach
Mixed-methods research is increasingly used in the social, behavioral, and health sciences. The VICCS mixed-methods approach employed an explicit conceptual framework identifying key variables and data sources relevant to the study’s primary aims. Mixed methods improve the quality and completeness of data in healthcare research and have been used to study issues as varied as health disparities, cultural differences, behavioral factors contributing to disability and health, and the processes and factors involved in implementing health research findings. The increasing use of mixed methods reflects growing recognition of the value of qualitative and other social science research methods, collaborative interdisciplinary research teams (also known as team science) [37, 38], and multilevel approaches to investigating complicated health issues. Such approaches are often combined with clinical trials, surveys of attitudes and beliefs, and epidemiological measures to better understand health and illness [40, 41].
Creswell et al. document current trends in the application of mixed-methods approaches in a broad range of health-related research, such as cardiology, pharmacy, family medicine, pediatric oncology nursing, mental health services, disabilities, and public health. The settings vary from the clinic to the social context of daily activities and relationships. Trends in mixed-methods research are also documented in a study of investigations that incorporated “mixed methods” or “multimethods” in NIH-funded research abstracts. Qualitative methods are used in mixed-methods studies to address broad, open-ended, and interconnected questions that are often quite different from conventional clinical hypotheses. Many social scientists view inductive, interpretive, and related applications of qualitative methods as an important advantage over quantitative methods in developing insights into values, beliefs, attitudes, and interpretations of current or past events and other phenomena, but these methods can also supplement quantitative methods in examining many other phenomena.
5.1. Sampling, Site and Subject Selection, and Recruitment
The VICCS study data included extensive notes from one hundred eighteen interviews. Most interviews were conducted in person (84%) during two-day site visits at seven facilities. The remaining interviews were completed by telephone (16%) with VA facility staff unavailable during the site visit and with external stakeholders. Documents were collected during site visits and received via fax and e-mail prior to and following site visits.
Sampling procedures and criteria were developed for seven study sites. As the VA National Director of chiropractic services, the study’s co-PI (AJL) was invaluable given his expertise and experience in administrative and subspecialty matters. However, to avoid potential coercion or bias, AJL did not participate in the site selection decision-making process, recruitment, or interviews with any chiropractic staff (including chiropractors, their supervisors, or support staff).
Sites were selected to ensure diversity on key dimensions, including:
(i) facility geographic location (regions of the USA; urban/suburban/rural);
(ii) facility type (medical center versus outpatient clinic; complexity);
(iii) administrative alignment (facility service line overseeing the clinic);
(iv) chiropractor characteristics (appointment type, full-time versus part-time, clinical experience, prior practice setting, credentials);
(v) clinic establishment (how long the clinic had been in existence);
(vi) involvement with academic affiliate(s).
At the time of site selection, forty-one (41) VA facilities offered on-site chiropractic services. Two (2) sites involved the study’s co-PI as a staff member and were thus excluded. Two (2) other sites were deemed ineligible because they functioned as independent outpatient clinics and were not directly linked administratively to VA. Another site was ineligible because there was no chiropractor on staff at the time of site selection. Therefore, a total of thirty-six (36) sites were eligible for recruitment.
We set a sample size of seven sites, including one pilot site. A total of 12 sites were invited to participate until we reached our sample of seven, for a site response rate of 58%. Of the five sites that declined, three cited workload or other time conflicts, and two declined because facility leadership determined that local IRB review would be required (based on their perception that their sites would be actively engaged in the research), which would have prevented them from meeting the study’s timeline. Facilities invited but declining to participate remain confidential to those outside of the study team, as well as to the study’s co-PI, per the study’s recruitment protocol and a confidentiality feature designed to minimize coercion.
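The eligibility and recruitment figures above can be confirmed with a quick arithmetic check (a minimal sketch; all counts are taken directly from the text):

```python
# Site eligibility: counts as reported in the text.
total_sites = 41          # VA facilities offering on-site chiropractic services
excluded_copi = 2         # co-PI on staff
excluded_independent = 2  # independent outpatient clinics
excluded_no_dc = 1        # no chiropractor on staff at selection time

eligible = total_sites - excluded_copi - excluded_independent - excluded_no_dc
print(eligible)  # 36 eligible sites

# Recruitment: 12 sites invited until 7 participated.
invited, participated = 12, 7
print(f"{participated / invited:.0%}")  # 58% site response rate
```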
At each site, we targeted a variety of stakeholders for interviews:
(i) facility leadership (facility directors, chiefs of staff);
(ii) key department heads (e.g., primary care, physical medicine and rehabilitation, orthopedics, neurology, pain clinic, rheumatology, radiology, and spine clinic);
(iii) chiropractors;
(iv) clinicians (primary care and specialty providers who did or did not refer patients to the chiropractic service; 2 per discipline, for a total of 6–8 per site);
(v) chiropractic clinical and administrative support staff;
(vi) VA back or neck pain patients (2-3 from each chiropractic clinic and 2-3 from a nonchiropractic back pain or related clinic, each seen at VA three or more times for the same neck or back issue);
(vii) external stakeholders (local, such as academic affiliates, and national, such as national VSO representatives and federal advisory associates).
Prior to data collection, all study team members attended interview training and observed at least two pilot interviews. Two or three members of the study team attended each site visit, with one or two members conducting each interview. All subjects orally consented prior to participation. Interviews with respondents who agreed to be audio-recorded were transcribed. Per our protocol, patients were not audio-recorded, but detailed notes were taken and debriefing sessions were held immediately afterwards to retain as much verbatim information as possible. Excluding the 18 patients interviewed, 96 of the remaining 100 interview subjects agreed to be audio-recorded. To ensure confidentiality of sites and subjects, all identifiers were removed and replaced with study-generated consecutive identification codes prior to data coding and analysis.
5.2. Qualitative and Quantitative Data
Between December 2010 and November 2011, 118 semistructured interviews were conducted at seven sites, including one pilot site. Interview subjects included sixty-two non-DC clinicians (53%), eighteen patients (15%), eleven leaders (9%), seven chiropractors (6%), six chiropractic support staff (5%), five staff involved in planning chiropractic clinics (4%), four chiropractic supervisors (3%), three former federal chiropractic advisory committee members (3%), and two academic affiliates (2%). Participation response rates for subjects and sites were 43% and 58%, respectively.
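As a consistency check, the respondent counts reported above can be summed and converted to percentages (a sketch; the counts come from the text, and the published percentages are rounded to whole numbers):

```python
# Interview subject counts as reported in the text.
respondents = {
    "non-DC clinicians": 62,
    "patients": 18,
    "facility leaders": 11,
    "chiropractors": 7,
    "chiropractic support staff": 6,
    "clinic planning staff": 5,
    "chiropractic supervisors": 4,
    "advisory committee members": 3,
    "academic affiliates": 2,
}

total = sum(respondents.values())
print(total)  # 118 interviews in all

# Rounded shares of the interview sample by role.
for role, n in respondents.items():
    print(f"{role}: {n} ({n / total:.0%})")
```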
Documents reviewed and analyzed included significant chiropractic care-related policy documents obtained from the study sites, including VA regional (VISN) policies, local facility policies, local service agreements, chiropractor clinician privileges, and other public documents such as congressional bills/resolutions related to VA chiropractic services.
Content analysis of interviews and documents assessed a priori hypotheses derived from prior literature, as well as new themes emerging from transcript review. The codebook for the interviews and documents was developed and refined throughout the coding process using top-down (deductive/a priori hypothesis testing) and bottom-up (inductive/emerging hypothesis generating) methods.
Interview data were coded using NVivo (QSR International) and document data using Excel (Microsoft). We observed high interrater agreement among coding team members. Data were coded in a two-phase process: high-level codes were applied first (double coded) for general themes and variable domains, followed by more specific, detailed codes for subthemes and individual variables.
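Interrater agreement of the kind reported above is commonly quantified with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch (the code labels and assignments below are hypothetical illustrations, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning one code per unit."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of units coded identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two coders to ten interview passages.
coder1 = ["context", "planning", "structure", "process", "outcome",
          "context", "planning", "structure", "process", "outcome"]
coder2 = ["context", "planning", "structure", "process", "outcome",
          "context", "planning", "process", "process", "outcome"]
print(round(cohens_kappa(coder1, coder2), 2))  # → 0.88
```

With nine of ten passages coded identically, raw agreement is 0.90, but kappa is lower (about 0.88) because some agreement would occur by chance alone.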
Additionally, the study obtained quantitative administrative data on clinic use and utilization characteristics such as patient visit counts, patient demographics, diagnoses seen, and services delivered. These data were obtained from the VA Corporate Data Warehouse via VA Informatics and Computing Infrastructure (VINCI).
The VICCS study was approved by four separate institutional review boards (IRBs): VA Greater Los Angeles Healthcare System, VA Connecticut Healthcare System, Western IRB, and US Army Medical Research and Materiel Command.
6.1. VICCS Study Methodological Issues and “Lessons Learned”
Most of the methodological challenges in the VICCS study fall into four main categories: (1) subject selection and recruitment, (2) site selection, (3) data collection instruments and logistics, and (4) analysis and interpretation of diverse types of data related to the study’s multiple research questions and goals.
6.1.1. Selection and Recruitment of Participants
The site selection process was designed to ensure wide variability of chiropractic clinics and their internal structures and processes. However, because participation of sites and stakeholders at each site was voluntary, some bias may have been introduced that will limit any generalizations outside of VA.
Recruitment of busy healthcare professionals (both clinicians and administrators) is a well-recognized challenge in conducting research in healthcare settings. Because some clinicians and staff are unionized, sites also required union notification and, in some cases, approval (local and national).
VA and related federal regulations on patient survey/interview research allow small numbers of patients to be sampled in IRB-approved studies, but studies attempting to recruit larger numbers of patients (>9) require additional approval from the Office of Management and Budget (OMB). Because the OMB approval process is lengthy, we sampled a smaller number of patients directly and attempted to gather additional patient perspectives indirectly through provider interviews.
6.1.2. Site Selection
In partnership research involving program leaders who serve as research team members, research subjects (i.e., staff from participating sites) may be concerned that they are being evaluated on their performance and actions rather than studied for the purpose of scientific knowledge development. Therefore, issues of possible coercion and sensitivities may affect the validity of data collected.
Staff turnover and limited memories also threaten data validity (temporal bias) when studying program development and evolution in a retrospective manner. Archival records are limited, and thus we relied heavily on VA staff (where still available) and their memories.
To improve the validity and integrity of data collected, this study performed most interviews in person during a two-day site visit (with two to three research team members) at each of the seven sites; 84% of all interviews were conducted in person. To minimize data security and privacy concerns with IRBs, patients were recruited for interviews onsite, and only first names were used.
Two patient respondent groups participated in this study based on their personal experience with three or more visits related to back or neck pain issues: those who received chiropractic services and those who received other, nonchiropractic services (e.g., primary care, neurology, and orthopedics). Patients in clinic waiting areas were recruited systematically, with no one missed or avoided. Because of the study’s IRB-approved minimal-risk status, the study was granted a waiver of documentation of informed consent; all subjects therefore consented orally to participate.
Interviews at each site were scheduled to fill target quotas for each of the various roles (respondent groups) needed. After each site’s chiropractor and facility leaders approved participation in the study, lists of providers by department (or service line) were compiled. Recruitment involved sending individual invitations to VA employees to fill the two-day site visit schedule. Initial invitations were sent by e-mail describing the study and announcing the planned site visit dates, and up to five follow-up contacts were attempted per person. The relatively low subject response rate (45.6%) may have resulted partly from the limited two-day interview timeframe scheduled at each site. Of those who did not participate, 69.5% did not respond to the initial or follow-up e-mail requests for participation.
6.1.3. Data Collection Instruments
Data collection instruments and protocols in qualitative research are often informal, flexible, and subject to large variations in application. While flexibility represents a strength in traditional qualitative research, it can result in inconsistent and unfocused data collection and variable data quality when qualitative methods are applied in deductive research. For example, interview guides specifying general topics of interest and using broad, open-ended questions can be very effective in eliciting interview subjects’ views of important concepts and issues and their beliefs and values, but ineffective in ensuring that complete and comparable measures of identified variables are collected consistently across a range of subjects (e.g., assessing organizational participants’ views or their ratings of concepts or variables deemed important by the research team). In part, the distinction here is between data collection approaches designed to develop new insights and frameworks for understanding and describing the phenomena of interest, versus approaches applying a priori frameworks to collect predefined data and test aspects of those frameworks. Similar problems result from the use of observation guides or protocols lacking adequate specificity and a firm foundation in a priori hypotheses and clearly identified variables: such protocols often produce inconsistent data by (1) encouraging the observer to record events as they unfold and to record a wide range of attributes of the situation under study (whether or not they are relevant to the hypotheses of interest), (2) limiting the likelihood that the observer will note the significance of events that do not occur, and (3) limiting the likelihood that the observer will collect the complete, consistent data required for direct comparisons across observation samples.
Considerations of validity, intrusiveness or subject reactivity (Hawthorne effects), and triangulation (to minimize bias) are also too often neglected in deductive applications of qualitative methods. Distinctions between subjective and objective data and between formal and informal organizational structures and processes are also frequently neglected, threatening the validity of study conclusions.
Avoiding these problems requires careful design of data collection plans, based on study goals and hypotheses, involving use of systematic tables or other methods for specifying key variables and suitable, multiple measures. Depending on the importance of each variable and the validity of available measures, two or more data sources are typically needed in qualitative research. Data planning tables listing concepts or variables, definitions, and data sources are effective in ensuring appropriate rigor; data collection instruments (including document coding forms, survey questions, and other data specifications) can be developed directly from these tables.
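A data planning table of the kind described can be as simple as a structured list mapping each variable to its definition and intended data sources. A hypothetical sketch (the variable names, definitions, and sources below are illustrative, not the study's actual codebook):

```python
import csv
import io

# Hypothetical rows of a data planning table: variable, definition, data sources.
planning_table = [
    {"variable": "clinic_impetus",
     "definition": "Whether the clinic was mandated by VISN leadership or established voluntarily",
     "sources": "leadership interviews; VISN policy documents"},
    {"variable": "dc_appointment_type",
     "definition": "Chiropractor appointment type (full-time vs. part-time)",
     "sources": "facility records; chiropractor interview"},
    {"variable": "referral_volume",
     "definition": "Monthly referrals to the chiropractic clinic",
     "sources": "administrative data (VINCI); support-staff interview"},
]

# Check that each variable lists two or more data sources, per the guideline above.
for row in planning_table:
    n_sources = len(row["sources"].split(";"))
    print(f'{row["variable"]}: {n_sources} sources')

# The same table can be exported as CSV to drive instrument development.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["variable", "definition", "sources"])
writer.writeheader()
writer.writerows(planning_table)
print(buf.getvalue().splitlines()[0])  # header row: variable,definition,sources
```

Keeping the table in a machine-readable form makes it straightforward to derive data collection instruments and reporting templates directly from it, as the text recommends.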
Rigor and validity are also enhanced through the development and use of data collection instrument specifications and training protocols, including variable and measure definitions and instructions in instrument use. When used in research examining healthcare delivery organizations, such protocols should include plans and instructions for approaching sites, making contacts, arranging interviews/visits, identifying and obtaining documents, following up (to obtain documents and other postvisit/call information), managing informed consent and confidentiality, and so forth. Adequate pilot testing helps ensure the appropriateness of data sources and measures, although data collection protocols must remain flexible and allow for changes in data collection plans and strategies when pilot testing fails to reveal valuable new data sources or validity problems that emerge only during the main study period.
Finally, study validity is further enhanced through development of data analysis protocols and plans together with the actual instruments, rather than after the completion of data collection. Data planning tables created to guide data collection activities can be used to develop data reporting templates and specifications for translating raw data into variables and preparing for analyses; data from organizational studies are often reported in a standardized “organizational profile” or other comparative formats. These profiles store raw data and summary variables from all data sources, which are then converted into tables for analysis. Additional challenges arise in studies pursuing inductive and deductive study aims simultaneously. For deductive studies with a rigid a priori framework, the number of variables to be measured should be relatively small and easily managed. Inductive, exploratory work involves open-ended questions and unlimited data and is thus more challenging to plan and conduct.
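The “organizational profile” idea above can likewise be sketched in code. This fragment is a hypothetical illustration (the site names, fields, and values are invented): raw data and summary variables from all sources are stored per site, then flattened into the cross-site comparison tables used for analysis, with missing values surfacing explicitly.

```python
# Hypothetical organizational profiles: raw data and summary variables
# from every source, stored per site. Field names and values are
# illustrative only.
profiles = {
    "site_A": {"raw": {"interview_notes": "...", "documents": ["space plan"]},
               "summary": {"implementation_months": 9, "clinic_rooms": 2}},
    "site_B": {"raw": {"interview_notes": "...", "documents": []},
               "summary": {"implementation_months": 14, "clinic_rooms": 1}},
}

def comparison_table(profiles, variables):
    """One row per site, one column per summary variable; missing
    values surface as None so gaps are caught before analysis."""
    return [
        [site] + [prof["summary"].get(v) for v in variables]
        for site, prof in sorted(profiles.items())
    ]

header = ["site", "implementation_months", "clinic_rooms"]
for row in comparison_table(profiles, header[1:]):
    print(row)
```

Because the profile is built from the same planning tables that drove data collection, converting profiles into analysis tables is a lookup rather than a fresh interpretive step, which is what keeps the deductive arm of the study manageable.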
6.1.4. Analyzing Diverse Types of Data
The field of health services research has benefited from several insightful, comprehensive discussions of qualitative research methods and their appropriate use. Proponents have convincingly argued that qualitative methods yield findings and insights that cannot be derived from “conventional” or “quantitative” research methods and that research in the clinical, social, and policy sciences requires careful application of both types of approaches to properly study their phenomena of interest.
In qualitative research, observed patterns of individual and organizational practices, behaviors, and outcomes vary widely across time and site and are subject to interviewer/observer bias and interpretation. This heterogeneity challenges researchers to find and describe consistent patterns and topics within the findings. The patterns and topics identified are heavily influenced by idiosyncratic factors, such as an individual leader’s personal views or situation or unrelated pressures or events within a site. Each site’s situation (planning, implementation, clinic structure, etc.) is also influenced by a very large number of factors whose combined and interacting effects lead to highly variable outcomes (as described, e.g., in chaos or complexity theory). Data collection relying on interviews entails potential bias, limited validity, and inaccuracies arising from challenges in recall, differing perspectives on events, and differential access to information. The standard challenges of qualitative/case study research therefore also apply here.
Another challenge is that data for many key variables are not always reliably available or easy to access and, in some instances, are of questionable validity. This study also examined a long chain of causal links with multiple determinants and outcomes (independent and dependent variables). The mixed-methods approach employed here was demanding because dual inductive and deductive research is inherently so: the team had to decide, for example, how much limited interview time to allocate to measuring variables identified a priori (deductive) versus open-ended interviewing that maximizes the likelihood of learning something new and interesting (inductive). Lastly, the lack of standardized, validated measures, concepts, and definitions was a significant challenge; such gaps are especially common in pilot studies.
There are, however, steps that minimize these biases: adequate training of data collection staff; comprehensive plans for data collection, validation, and storage; and frequent reviews of data quality and interpretation. While recording interviews enhances data validity and completeness, recording should be supplemented with additional safeguards such as paired interviewers and postinterview debriefing. Quality assurance methods should be specified and operationalized for each instrument and data source, and problems such as incomplete, missing, or unusable data should be identified and resolved during the data collection phase rather than after its completion.
Finally, analysis plans in most conventional health care studies are largely quantitative, pursuing narrow, explicit aims and testing explicit, confirmatory hypotheses. Mixed-methods studies that address both inductive and deductive study questions, however, can combine design features more typically associated with experiments (such as randomized sampling) with open-ended interviews more typically associated with qualitative research, offering what Morse terms “alternative forms of evidence” [55, p. 86]. New opportunities for qualitative inquiry can therefore emerge. These points are illustrated well by this passage from Ronald J. Chenail:
Take a pragmatic posture to creating studies that marry the most fitting design and methodology choices with the focus of your research curiosity, remain true to your interests, and then explore a variety of research approaches which can help in the designing and conducting studies to meet your needs. The bottom line is to be pragmatic in creating the design, but remain curious so every reasonable methodological option is considered. However, like taking too many medications can lead to adverse effects to your body, using too many methodologies might produce negative side effects which could be unhealthy for your study. To help remedy this potential risk, please remember this simple research commandment: Thou shall not select an additional methodology for a study, until thou is sure the first methodology selected cannot manage all of the design issues.
To our knowledge, VA’s introduction of chiropractic services represents the most extensive introduction of a nontraditional medical service into the largest integrated US healthcare system, and it is likely to generate research questions of interest to multiple stakeholders. VA policy makers may seek data to inform efforts to assess and improve the delivery of chiropractic services to meet Veterans’ needs. Stakeholders in the chiropractic profession may look to VA’s experience as an indicator of future opportunities for integration into other healthcare systems. Other CAM disciplines seeking inclusion in VA and other systems may be interested in the policy and practice implications of VA’s chiropractic program. More broadly, beyond its chiropractic and CAM implications, our study may inform researchers assessing the introduction of any new healthcare service into VA or other large healthcare systems.
For these types of inquiries, research-practice-policy partnerships facilitate research that can be more useful to decision makers (relative to traditional academic research). Decision maker involvement increases the likelihood that (1) useful questions are answered by developing methods and data analyses relevant to service delivery and (2) the interpretation and reporting of results will inform future policy.
Analysis of qualitative observational data in studies combining deductive and inductive aims should be guided by prespecified, model-based hypotheses and detailed analysis plans developed at the outset of the study. Unfortunately, while quantitative analysis methods are well established and accepted, methods for analysis of qualitative data remain variable and lack consensus. Analyses of qualitative data are too often informal, ad hoc, and emergent, with low reliability and validity. These threats can be countered through formal table approaches, in which key variables relevant to each hypothesis are listed in tables and manipulated in a blinded fashion, using qualitative pattern identification and nonparametric quantitative techniques. Analysis tables that summarize and synthesize information from diverse sources in a standardized format may also serve as reporting tools in papers and reports. Combining qualitative methods for hypothesis testing with interpretive, inductive applications in this manner represents a powerful use of these methods, drawing on their strengths to enhance management studies and other empirical research in important ways.
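A minimal sketch of the blinded table approach may help. Everything here is hypothetical: the coded ratings, the grouping variable (early versus late adoption), and the group sizes are invented for illustration. Site-level summary variables from qualitative profiles are coded to ordinal ratings, group identities are masked before pattern review, and a simple rank-sum statistic (the core of nonparametric two-group tests such as Mann-Whitney) compares the groups.

```python
# Illustrative sketch, not the VICCS analysis code. Hypothetical ordinal
# ratings (1 = low, 5 = high) of "integration level" for sites grouped
# by an a priori variable, e.g., early vs. late program adoption.
import random

early_adopters = [4, 5, 3, 4]
late_adopters = [2, 3, 2, 1]

def blind(groups, seed=0):
    """Replace group labels with neutral codes so the analyst reviews
    patterns without knowing which hypothesis group is which."""
    labels = list(groups)
    random.Random(seed).shuffle(labels)
    return {f"group_{i}": groups[name] for i, name in enumerate(labels)}

def rank_sum(a, b):
    """Rank-sum for group `a` pooled with `b` (ties share the average
    rank), the building block of nonparametric two-group tests."""
    pooled = sorted(a + b)
    ranks = {}
    for v in set(pooled):
        positions = [i + 1 for i, x in enumerate(pooled) if x == v]
        ranks[v] = sum(positions) / len(positions)
    return sum(ranks[v] for v in a)

blinded = blind({"early": early_adopters, "late": late_adopters})
print(sorted(blinded))
print(rank_sum(early_adopters, late_adopters))
```

In practice one would feed such coded tables to an established implementation of the chosen nonparametric test; the sketch only shows how blinding and ranking keep the comparison mechanical rather than interpretive.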
In conclusion, qualitative case study research allows the collection of rich data for a deep understanding of the phenomenon of interest. Yet data collection relying on interviews entails potential bias, limited validity, and inaccuracies arising from challenges in recall, differing perspectives on events, and differential access to information, and data for many key variables may be unavailable, difficult to access, or of questionable validity. Researchers not trained in qualitative methods (and accustomed to conducting empirical research using quantitative methods alone) are less likely to be interested in applying qualitative methods in inductive or interpretive research but can, and should, be interested in applying them to enhance the data available for more conventional deductive forms of empirical research. Mixed methodology can enrich health services research that tests a priori hypotheses and narrowly defined research questions, while also suggesting detailed causal explanations and generating important new exploratory questions and findings.
Conflict of Interests
The authors declare that there is no conflict of interests with any financial organization regarding the material discussed in the paper.
Acknowledgments
The authors would like to acknowledge Joan A. Walter, J.D., P.A., and Rick Welton, M.D., for their support and contributions to the project. They would also like to acknowledge the members of the project advisory board: Lucille Beck, Ph.D.; Charles Burgar, M.D.; A. Lucile Burgo-Black, M.D.; Ian Coulter, Ph.D.; Paul Shekelle, M.D., Ph.D.; Joan Walter, J.D., P.A. Oral presentations by (1) Khorsan R, Cohen A, Smith M, Lisi A, Mittman B, Coulter I, and Walter J. Integrative Chiropractic Services in VA: A Pilot Study (VICCS). American Public Health Association (APHA) 140th Annual Meeting and Exposition, Session: 3418.0 Policy, regulation, comparative and cost effectiveness research: San Francisco, CA. Monday, October 29, 2012: 5:15 p.m. (Abstract no. 269497); (2) Cohen A, Smith M, Khorsan R, Lisi A, Mittman B, Jenkins D, and Armstrong C. Methodological and logistical challenges to conducting multi-site partnered research in the VA healthcare system. American Public Health Association (APHA) 140th Annual Meeting and Exposition, Session: 3323.0 Chiropractic Research: Current status and updates: San Francisco, CA. Monday, October 29, 2012: 3:00 PM. (Abstract no. 268597); (3) Smith M, Cohen A, Khorsan R, Jenkins D, Armstrong C, Lisi A, and Mittman B. Tracking VA’s chiropractic program: Implementation timelines and variations across selected facilities. American Public Health Association (APHA) 140th Annual Meeting and Exposition, Session: 4133.1 Veteran’s Health Care and Health Risks: San Francisco, CA. Tuesday, October 30, 2012: 11:30 PM. (Abstract no. 265497); (4) Lisi A, Mittman B, Khorsan R, Smith S, Cohen A, Armstrong C, MacGregor C, Carucci M, and Jenkins DM. Variations in the Implementation and Characteristics of Chiropractic Services in VA: A Pilot Study. Association of Chiropractic Colleges (ACC) and Research Agenda Conference (RAC): Las Vegas, NV. 
March 20–21, 2012; (5) Lisi A, Mittman B, Khorsan R, Smith S, Cohen A, Armstrong C, MacGregor C, Carucci M, and Jenkins DM. Studying the introduction, integration, and effectiveness of chiropractic services within the VA healthcare system: A novel clinical/research collaboration. Association of Chiropractic Colleges (ACC) and Research Agenda Conference (RAC): Las Vegas, NV. March 19–21, 2011; and (6) Lisi A, Mittman B, Khorsan R, Smith S, Cohen A, Armstrong C, MacGregor C, Carucci M, and Jenkins DM. Variations in the implementation and characteristics of chiropractic services in VA. Association of Chiropractic Colleges (ACC) and Research Agenda Conference (RAC): Las Vegas, NV. March 19–21, 2011. This material is based upon work supported in part by the Department of Veterans Affairs, Veterans Health Administration, Office of Research and Development, Health Services Research and Development Service. The views expressed in this paper are those of the authors and do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government. In addition, this work was supported, in part, by a grant to Brian S. Mittman by the Samueli Institute. This grant is supported by the US Army Medical Research and Materiel Command under Award no. W81XWH-06-1-0279. The views, opinions, and/or findings contained in this report are those of the author(s) and should not be construed as an official position, policy, or decision of the Department of the Army unless so designated by other documentation. Also, the views, opinions, and/or findings contained in this report are those of the author(s) and should not be construed as the opinion or policy of the Samueli Institute.
In the conduct of research where humans are the subjects, the investigator(s) adhered to the policies regarding the protection of human subjects as prescribed by Code of Federal Regulations (CFR) Title 45, Volume 1, Part 46; Title 32, Chapter 1, Part 219; and Title 21, Chapter 1, Part 50 (Protection of Human Subjects). Monica M. Smith was supported in this work by Grant no. K01-AT002391, a grant from National Institutes of Health National Center for Complementary and Alternative Medicine (NIH-NCCAM). Study findings and conclusions are those of the authors and do not represent the opinion or position of NIH or NCCAM. Institutional review board (IRB) approvals were by VA Greater Los Angeles Healthcare System (IRB of record), VA Connecticut Healthcare System (IRB for Co-PI), Western IRB (IRB for Samueli Institute), and US Army Medical Research and Materiel Command (IRB for Department of Defense).
- H. A. Tindle, R. B. Davis, R. S. Phillips, and D. M. Eisenberg, “Trends in use of complementary and alternative medicine by US adults: 1997–2002,” Alternative Therapies in Health and Medicine, vol. 11, no. 1, pp. 42–49, 2005.
- D. M. Eisenberg, R. B. Davis, S. L. Ettner et al., “Trends in alternative medicine use in the United States, 1990–1997: results of a follow-up national survey,” Journal of the American Medical Association, vol. 280, no. 18, pp. 1569–1575, 1998.
- R. L. Nahin, P. M. Barnes, B. J. Stussman, and B. Bloom, “Costs of Complementary and Alternative Medicine (CAM) and frequency of visits to CAM practitioners: United States, 2007,” National Health Statistics Reports, no. 18, pp. 1–14, 2009.
- S. Ananth, “Integrative healthcare for the military: a Samueli Institute/Department of Defense partnership,” Explore, vol. 8, no. 4, pp. 256–257, 2012.
- S. Ananth, “Applying integrative healthcare,” Explore, vol. 5, no. 2, pp. 119–120, 2009.
- L. M. Denneson, K. Corson, and S. K. Dobscha, “Complementary and alternative medicine use among veterans with chronic noncancer pain,” Journal of Rehabilitation Research and Development, vol. 48, no. 9, pp. 1119–1128, 2011.
- C. M. Baldwin, K. Long, K. Kroesen, A. J. Brooks, and I. R. Bell, “A profile of military veterans in the southwestern United States who use complementary and alternative medicine: implications for integrated care,” Archives of Internal Medicine, vol. 162, no. 15, pp. 1697–1704, 2002.
- D. G. Campbell, A. P. Turner, R. M. Williams et al., “Complementary and alternative medicine use in veterans with multiple sclerosis: prevalence and demographic associations,” Journal of Rehabilitation Research and Development, vol. 43, no. 1, pp. 99–110, 2006.
- F. P. McEachrane-Gross, J. M. Liebschutz, and D. Berlowitz, “Use of selected Complementary and Alternative Medicine (CAM) treatments in veterans with cancer or chronic pain: a cross-sectional survey,” BMC Complementary and Alternative Medicine, vol. 6, article 34, 2006.
- K. Kroesen, C. M. Baldwin, A. J. Brooks, and I. R. Bell, “US military veterans' perceptions of the conventional medical care system and their use of complementary and alternative medicine,” Family Practice, vol. 19, no. 1, pp. 57–64, 2002.
- K. Bent and L. Hemphill, “Use of complementary and alternative therapies among veterans: a pilot study,” Federal Practitioner, vol. 21, no. 10, pp. 43–52, 2004.
- Healthcare Analysis & Information Group (HAIG), Complementary and Alternative Medicine Survey, Office of Policy and Planning, Veterans Health Administration, Department of Veteran Affairs, Washington, DC, USA, 2011, http://shfwire.com/files/pdfs/2011CAM_FinalReport.pdf.
- W. C. Meeker and S. Haldeman, “Chiropractic: a profession at the crossroads of mainstream and alternative medicine,” Annals of Internal Medicine, vol. 136, no. 3, pp. 216–227, 2002.
- A. J. Lisi, C. Goertz, D. J. Lawrence, and P. Satyanarayana, “Characteristics of Veterans Health Administration chiropractors and chiropractic clinics,” Journal of Rehabilitation Research and Development, vol. 46, no. 8, pp. 997–1002, 2009.
- Public Law 107-135, Department of Veterans Affairs Health Care Programs Enhancement Act of 2001, section 204, Program for Provision of Chiropractic Care and Services to Veterans, Washington, DC, USA, 2012, http://veterans.house.gov/legislation/107/hr3447b.html.
- J. W. Busse, C. Jacobs, T. Ngo et al., “Attitudes toward chiropractic: a survey of North American orthopedic surgeons,” Spine, vol. 34, no. 25, pp. 2818–2825, 2009.
- A. J. Lisi, “Management of Operation Iraqi Freedom and Operation Enduring Freedom veterans in a Veterans Health Administration chiropractic clinic: a case series,” Journal of Rehabilitation Research and Development, vol. 47, no. 1, pp. 1–6, 2010.
- B. N. Green, C. D. Johnson, A. J. Lisi, and J. Tucker, “Chiropractic practice in military and veterans' health care: the state of the literature,” Journal of the Canadian Chiropractic Association, vol. 53, no. 3, pp. 194–204, 2009.
- B. N. Green, C. D. Johnson, and A. J. Lisi, “Chiropractic in U.S. military and veterans' health care,” Military Medicine, vol. 174, no. 6, pp. 6–7, 2009.
- M. W. Evans Jr., G. Page, H. Ndetan et al., “Are patients receiving health promotion advice in the chiropractic teaching clinic setting?: an impact assessment of a brief intervention to increase advising rates and goal setting,” Journal of Chiropractic Education, vol. 25, no. 2, pp. 132–141, 2011.
- A. S. Dunn, B. N. Green, L. R. Formolo, and D. Chicoine, “Retrospective case series of clinical outcomes associated with chiropractic management for veterans with low back pain,” Journal of Rehabilitation Research and Development, vol. 48, no. 8, pp. 927–934, 2011.
- A. S. Dunn and S. R. Passmore, “Consultation request patterns, patient characteristics, and utilization of services within a Veterans Affairs medical center chiropractic clinic,” Military Medicine, vol. 173, no. 6, pp. 599–603, 2008.
- A. S. Dunn, J. J. Towle, P. McBrearty, and S. M. Fleeson, “Chiropractic consultation requests in the Veterans Affairs health care system: demographic characteristics of the initial 100 patients at the western New York Medical Center,” Journal of Manipulative and Physiological Therapeutics, vol. 29, no. 6, pp. 448–454, 2006.
- A. S. Dunn, “A survey of chiropractic academic affiliations within the department of Veterans Affairs health care system,” Journal of Chiropractic Education, vol. 21, no. 2, pp. 138–143, 2007.
- P. Y. Huang, E. M. Yano, M. L. Lee, B. L. Chang, and L. V. Rubenstein, “Variations in nurse practitioner use in Veterans Affairs primary care practices,” Health Services Research, vol. 39, no. 4, part 1, pp. 887–904, 2004.
- P. D. Jacobson, L. E. Parker, and I. D. Coulter, “Nurse practitioners and physician assistants as primary care providers in institutional settings,” Inquiry, vol. 35, no. 4, pp. 432–446, 1998.
- A. Donabedian, “The quality of medical care. Methods for assessing and monitoring the quality of care for research and for quality assurance programs,” Science, vol. 200, no. 4344, pp. 856–864, 1978.
- A. Donabedian, “Evaluating the quality of medical care,” Milbank Quarterly, vol. 83, no. 4, pp. 691–729, 2005.
- A. Donabedian, “Evaluating the quality of medical care,” The Milbank Memorial Fund Quarterly, vol. 44, supplement 3, pp. 166–206, 1966.
- A. Donabedian, “The evaluation of medical care programs,” Bulletin of the New York Academy of Medicine, vol. 44, no. 2, pp. 117–124, 1968.
- A. Donabedian, “Quality of care: problems of measurement. II. Some issues in evaluating the quality of nursing care,” American Journal of Public Health and the Nation's Health, vol. 59, no. 10, pp. 1833–1836, 1969.
- A. Donabedian, “Some basic issues in evaluating the quality of health care,” American Nurses Association Publications, no. 124, pp. 3–28, 1976.
- A. Donabedian, “Measuring and evaluating hospital and medical care,” Bulletin of the New York Academy of Medicine, vol. 52, no. 1, pp. 51–59, 1976.
- A. Donabedian, J. R. C. Wheeler, and L. Wyszewianski, “Quality, cost, and health: an integrative model,” Medical Care, vol. 20, no. 10, pp. 975–992, 1982.
- I. D. Coulter, R. D. Hays, and C. D. Danielson, “The chiropractic satisfaction questionnaire,” Tech. Rep. RP-374, RAND Corporation, Santa Monica, Calif, USA, 1995, http://www.rand.org/pubs/reprints/RP374.
- A.-F. Hsiao, R. D. Hays, G. W. Ryan et al., “A self-report measure of clinicians' orientation toward integrative medicine,” Health Services Research, vol. 40, no. 5, part 1, pp. 1553–1569, 2005.
- N. E. Adler and J. Stewart, “Using team science to address health disparities: MacArthur network as case example,” Annals of the New York Academy of Sciences, vol. 1186, pp. 252–260, 2010.
- D. Stokols, S. Misra, R. P. Moser, K. L. Hall, and B. K. Taylor, “The ecology of team science. Understanding contextual influences on transdisciplinary collaboration,” American Journal of Preventive Medicine, vol. 35, supplement 2, pp. S96–S115, 2008.
- K. Börner, N. Contractor, H. J. Falk-Krzesinski et al., “A multi-level systems perspective for the science of team science,” Science Translational Medicine, vol. 2, no. 49, Article ID 49cm24, 2010.
- J. W. Creswell, A. C. Klassen, V. L. P. Clark, and K. C. Smith, Best Practices for Mixed Methods Research in the Health Sciences, Office of Behavioral and Social Sciences Research, Washington, DC, USA, 2011, http://obssr.od.nih.gov/scientific_areas/methodology/mixed_methods_research/pdf/Best_Practices_for_Mixed_Methods_Research.pdf.
- V. L. P. Clark and J. W. Creswell, Understanding Research: A Consumer’s Guide, Prentice Hall-Merrill, Upper Saddle River, NJ, USA, 2010.
- L. A. Curry, I. M. Nembhard, and E. H. Bradley, “Qualitative and mixed methods provide unique contributions to outcomes research,” Circulation, vol. 119, no. 10, pp. 1442–1452, 2009.
- A. B. Almarsdóttir and J. M. Traulsen, “Multimethod research into policy changes in the pharmacy sector-the Nordic case,” Research in Social and Administrative Pharmacy, vol. 5, no. 1, pp. 82–90, 2009.
- K. C. Stange, B. F. Crabtree, and W. L. Miller, “Publishing multimethod research,” Annals of Family Medicine, vol. 4, no. 4, pp. 292–294, 2006.
- K. Wilkins and R. Woodgate, “Designing a mixed methods study in pediatric oncology nursing research,” Journal of Pediatric Oncology Nursing, vol. 25, no. 1, pp. 24–33, 2008.
- J. W. Creswell and W. Zhang, “The application of mixed methods designs to trauma research,” Journal of Traumatic Stress, vol. 22, no. 6, pp. 612–621, 2009.
- S. Beake, L. L. Clark, T. Turner, and D. Bick, “A mixed methods study to develop and pilot a competency assessment tool to support midwifery care of women with intellectual disabilities,” Nurse Education Today, vol. 33, no. 8, pp. 901–906, 2013.
- A. C. Klassen, K. C. Smith, M. M. Black, and L. E. Caulfield, “Mixed method approaches to understanding cancer-related dietary risk reduction among public housing residents,” Journal of Urban Health, vol. 86, no. 4, pp. 624–640, 2009.
- K. McVea, B. F. Crabtree, J. D. Medder et al., “An ounce of prevention? Evaluation of the “put prevention into practice” program,” Journal of Family Practice, vol. 43, no. 4, pp. 361–369, 1996.
- R. J. Pasick, N. J. Burke, J. C. Barker et al., “Behavioral theory in a diverse society: like a compass on Mars,” Health Education & Behavior, vol. 36, no. 5, pp. 11S–35S, 2009.
- V. L. P. Clark, “The adoption and practice of mixed methods: U.S. trends in federally funded health-related research,” Qualitative Inquiry, vol. 16, no. 6, pp. 428–440, 2010.
- National Institutes of Health, Office of Behavioral and Social Sciences Research, Qualitative Methods in Health Research: Opportunities and Considerations in Application and Review, National Institutes of Health, Office of Behavioral and Social Sciences Research, Washington, DC, USA, 2001, http://obssr.od.nih.gov/pdf/qualitative.pdf.
- J. Meiklejohn, J. Connor, and K. Kypri, “The effect of low survey response rates on estimates of alcohol consumption in a general population survey,” PLoS ONE, vol. 7, no. 4, Article ID e35527, 2012.
- I. D. Coulter and R. Khorsan, “Is health services research the Holy Grail of complementary and alternative medicine research?” Alternative Therapies in Health and Medicine, vol. 14, no. 4, pp. 40–45, 2008.
- J. M. Morse, “The politics of evidence,” in Qualitative Inquiry and the Conservative Challenge, N. K. Denzin and M. D. Giardina, Eds., pp. 79–92, Left Coast Press, Walnut Creek, Calif, USA, 2006.
- R. J. Chenail, “Ten steps for conceptualizing and conducting qualitative research studies in a pragmatically curious manner,” Qualitative Report, vol. 16, no. 6, pp. 1715–1730, 2011.