Dr. Battaglia is a nurse scientist, Dr. Lambert-Kerzner is a health services researcher, Dr. Ho is a physician, and Ms. Haverhals is a health research specialist, all at VA Eastern Colorado Health Care System in Denver. Dr. Aron is a physician and Dr. Stevenson is a research health science specialist at Louis Stokes Cleveland VA Medical Center in Ohio. Dr. Kirsh is a clinical consultant at the VA Office of Specialty Care Services in Washington, DC. Dr. Sayre is a research health science specialist and qualitative resources coordinator, Dr. Au is a physician, and Dr. Helfrich is a research investigator, all at VA Puget Sound Health Care System in Seattle, Washington. Dr. Battaglia, Dr. Lambert-Kerzner, Dr. Sayre, Dr. Ho, Ms. Haverhals, Dr. Au, and Dr. Helfrich are affiliated with the VA’s Seattle-Denver Center of Innovation. Dr. Battaglia, Dr. Lambert-Kerzner, and Dr. Ho have faculty appointments at the University of Colorado in Aurora. Dr. Aron is a professor at Case Western Reserve University in Cleveland, Ohio. Dr. Sayre, Dr. Au, and Dr. Helfrich have faculty appointments at the University of Washington in Seattle.
Respondents were identified using a modified snowball sampling process. Snowball sampling is a qualitative sampling technique in which identified study participants then identify other potential participants for the study. The researchers started with the local e-consult initiative lead and then contacted the directors of primary care and specialty care services for help identifying PCPs, specialists, and support staff (nurse practitioners, pharmacists, program managers, informatics staff, and medical support personnel) engaged in the initiative. The goal for follow-up interviews was to interview at least 2 of the following respondents at each site: e-consult project manager, PCP, and/or specialist. Due to turnover and changes in clinic roles, some follow-up interviews were conducted with different individuals than those interviewed at baseline.
Interviews followed semistructured interview guides and included open-ended questions designed to elicit rich responses about a variety of aspects of e-consult implementation, including patient needs, communication, leadership, resources, priorities, knowledge about the program, and unintended consequences. Follow-up interviews addressed how e-consults affected the quality of specialty care; the impact of e-consults on Patient Aligned Care Teams (PACTs), the VHA patient-centered medical home initiative for primary care; and how e-consults were used, eg, whether patients were involved in the decision to seek an e-consult.
Two interviewers who had participated in a 1-day, in-person training covering both data collection and analysis of key informant data conducted the 40- to 60-minute telephone interviews. One team member conducted the interview while the other took field notes. Interviews were also recorded. Follow-up probes were used to elicit specific examples and ensure sufficiently rich data. After each interview, the notetaker reviewed the audio recording and filled in details in the field notes. The interview team then debriefed and reviewed the augmented field notes and audio recordings, which became the primary data sources for the study.
This was a qualitative descriptive analysis.12 Interview data were analyzed using an iterative, inductive content analysis method with an open coding approach (ie, a priori codes were not defined for this portion of the analysis).13 Two members of the research team coded the data using the audio recordings and summary transcripts simultaneously. Summary transcripts were compared with the recorded interviews to ensure fidelity.
The researchers used ATLAS.ti (Berlin, Germany) qualitative data analysis software to organize the coding process. Emergent codes were added iteratively throughout the analysis to capture quotations that did not adequately fit previously developed codes. Codes were consolidated weekly or biweekly. After each consolidation, the analysis team met to review the meanings of codes to ensure consistency of coding and interpretation.
To create categories, broad themes were identified from interview responses and grouped under higher-order headings that described distinct aspects of participant experience. The analysis was intentionally kept close to the original data to reflect and describe participants' experiences as accurately as possible. In support of analytical rigor, members of the multidisciplinary research team, composed of clinicians, implementation scientists, and mixed methodologists, reviewed findings to assess their thoroughness, comprehensiveness, and representativeness across roles and participating sites.14
The e-consult evaluation period was from November 1, 2011, to July 31, 2013. Key conclusions were drawn from both alpha and beta sites (Table). Baseline interviews were conducted with 37 participants at 8 sites from April 10, 2012, to August 6, 2012. Follow-up interviews were conducted with 21 of the 37 participants at the 8 sites. Follow-up interviews with either a PCP or specialist could not be scheduled at 1 site. Follow-up interviews were conducted from April 16, 2013, to June 18, 2013. Open coding continued until saturation (the point at which subsequent data failed to produce new findings).15 This occurred after analysis of 22 baseline interviews (12 PCPs, 6 specialists, 1 pharmacist, and 4 other staff members) and 17 follow-up interviews (10 PCPs, 4 specialists, 1 pharmacist, and 2 other staff members).
The e-consults provided a programmatic structure for the more informal practice of obtaining diagnostic or therapeutic advice from a specialist. Several of the specialists interviewed described having previously used existing informal consult processes that were "like e-consult." These specialists reported that their practice patterns had not changed significantly since implementing e-consults, because they had been "using the Computerized Patient Record System (CPRS) in an e-consult way for many years." In these cases, the primary change resulting from the initiative was that national VHA workload policy was revised so that e-consults were assigned a CPT (Current Procedural Terminology) code and specialists began receiving workload credit for completing e-consults.
At sites where an informal e-consult practice was already in place, the initiative was consistently described as flexible. Many specialists reported that this degree of flexibility allowed them to make a relatively easy transition to e-consults by adopting new mechanisms to support existing processes. The e-consult initiative also allowed specialists to formally document this work and to increase the efficiency of specialty care.