Turning Feedback into Action at Martinsburg VA Medical Center.
What was the opportunity, issue or challenge you were trying to address and in what setting?
The challenge placed before the Martinsburg VAMC primary care leadership team was to identify targeted and effective ways to improve and sustain key Veteran experience performance metrics that feed into the VA’s national performance measurement system, SAIL (Strategic Analytics for Improvement and Learning). Those Veteran experience performance measures include:
- In the last six months, did anyone in this provider’s office ask you if there was a period of time when you felt sad, empty or depressed?
- In the last six months, did you and anyone in this provider’s office talk about things in your life that worry you or cause you stress?
- In the last six months, did you and anyone in this provider’s office talk about a personal problem, family problem, alcohol use, drug use, or a mental or emotional illness?
- Provider Rating: Using any number from 0 to 10, where 0 is the worst provider possible and 10 is the best provider possible, what number would you use to rate this provider?
What process did you use to develop a solution?
- Compared the facility’s patient satisfaction results across 12 clinical sites of care within the organization to the Veterans Health Administration (VHA) national average and top 10% performers.
- Identified significant areas of underperformance including both positive and negative drivers of performance.
- Mapped the entire patient experience and identified key points of patient-staff contact in which the patient’s perception of the visit was or could be impacted by behavioral changes.
- Structured point of care questions to address those key points of patient experience such as clinic responsiveness to phone requests, impression of the front desk interaction, care provided by the nurse at triage for the scheduled visit and the care received by the clinic provider. Additionally, we targeted an overall rating of the visit, impression of the clinic cleanliness, timeliness of the response to call or messages, the provider’s inquiry regarding any current mental health concerns/issues and the ability to schedule both urgent and routine visits. To assist the facility’s infection control monitoring, we later added questions regarding hand hygiene.
- The data was collected via patient responses to a questionnaire offered at check out. The questions were made available to the patient via email link or text message. We tracked invites daily by clinic location.
- The data was summarized by the contractor and shared at an appropriate frequency: invites daily; outcome data on a weekly, monthly, quarterly, and rolling 12-month basis, depending on the specific measure.
- The contractor/consultant formatted the data and provided coaching during a scheduled monthly call in addition to an “as needed” basis.
- In response to the data analysis, end-user feedback and shifting target areas, the patient experience question set was adjusted.
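The rolling top-box summaries described in the steps above can be sketched in a few lines. This is an illustrative sketch only; the field names, data shapes and sample values are assumptions, not the contractor's actual reporting system:

```python
from datetime import date, timedelta

# Hypothetical sample data: each response records the clinic site, the survey
# date, and whether the answer fell in the top category
# (e.g., "Always", "Yes", or "Excellent").
responses = [
    {"site": "Main Campus", "date": date(2019, 3, 14), "top_box": True},
    {"site": "Main Campus", "date": date(2019, 6, 2), "top_box": False},
    {"site": "Main Campus", "date": date(2018, 1, 20), "top_box": True},
]

def rolling_top_box(responses, site, as_of, days=365):
    """Percent of top-box answers for one site over a rolling window."""
    start = as_of - timedelta(days=days)
    window = [r for r in responses
              if r["site"] == site and start < r["date"] <= as_of]
    if not window:
        return None
    return 100.0 * sum(r["top_box"] for r in window) / len(window)

# Two responses fall within the year before Sep 30, 2019: one top-box
# and one not, so the rolling rate is 50%.
print(rolling_top_box(responses, "Main Campus", date(2019, 9, 30)))  # 50.0
```

The same function, pointed at a shorter window, would produce the weekly, monthly, and quarterly views described above.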
What outcomes were you looking to achieve?
The following outcomes were identified as measures of success:
- Improved top box patient satisfaction composite scores [Lag Measures]
  - Comprehensiveness
  - Office Staff
  - Overall Provider Rating
- Improved top box Veteran Experience scores [Lead Measures] for targeted improvement efforts
- Improved compliance addressing mental health concerns during primary care visits
- Improved staff and provider morale through informal and formal recognition programs
- Improved clinical leadership and staff/provider engagement in patient experience improvement efforts
- Improved compliance with hand hygiene protocols
What specific steps did you take to address the problem?
- Met with the established Primary Care Service Leadership Team to review the current VHA patient satisfaction survey results (SHEP, the Survey of Healthcare Experiences of Patients) to identify sites of care and underperforming areas.
- With the understanding of the underperforming areas, identified the points of patient contact within the service (phone, email, front desk, nurse visit, provider visit, environment) which may contribute to the underperformance.
- Decided to optimize the existing private patient experience feedback tool and available coaching. This resulted in an enhanced, more focused feedback tool with questions addressing the patient’s satisfaction with key points of contact during the visit. We structured the questions to reflect the VHA survey question intent and, unlike the VHA national tool, we encouraged comments regarding compliments and suggestions for improvement.
- Shared the plan with all clinical sites and staff to more fully engage the patient’s voice, share successes and areas for improvement regularly and to “set a high expectation for success”.
- Expanded the patient’s opportunity to provide feedback from a tablet-based or paper-based questionnaire to an “invite” offered at the front desk during check out and delivered by email or text message.
- Reviewed the number of feedback invites by clinical site daily and provided positive feedback to high performing sites. We tracked response rates monthly as well to ensure the “invite” process remained robust.
- Obtained weekly, monthly, quarterly and rolling 12-month statistical reports with graphs from the contractor.
- Shared these reports with comments to both leadership and frontline staff with both strong praise and encouragement as well as tough realistic comments about shortfalls in performance.
- Presented summary data to the entire service during service-wide staff meetings.
- Adjusted the patient experience questions to reflect changes in the VHA SHEP focus, the MISSION Act’s impacts on our clinic operations and patients’ desires for care, feedback from our team and frontline staff, and the need to obtain patient feedback about new programs or clinic hours.
- Added data metric outcomes to the provider evaluations and physician pay for performance plan (patient generated provider rating and patient perception of urgent care access).
- Tracked/trended rolling patient feedback data and benchmarked to VHA SHEP data results. VHA SHEP/ SAIL data was shared regularly with the facility senior management.
What resources, if any, did you engage – either internally or externally – to address the problem?
The Martinsburg VAMC worked with pCare, a leading patient engagement improvement organization, to gather point-of-care feedback from Veterans leaving the primary care clinic. pCare provided point-of-care feedback software, actionable push reporting, performance improvement support, best practices, and evidence-based improvement tools. As Veterans checked out of their primary care appointments, clerical staff asked whether they would be willing to answer a few anonymous questions about their recent primary care visit.
If the Veteran agreed, they could provide feedback using a hand-held tablet at the clinic, or they could answer the questions after their visit via a text- or email-based questionnaire sent to their preferred device. The patient feedback software allows not only the secure capture of real-time patient feedback but also the dissemination of that feedback to clinic leadership and practitioners in the time, format, and frequency of their choosing.
pCare also provides performance improvement coaching from experts with extensive experience in health care and backgrounds in performance improvement and patient experience improvement methodologies. The Martinsburg VAMC primary care team met with their pCare coach monthly to advance improvement efforts and to refine the customizable solution to meet their targeted needs.
What measures did you establish to determine the success of this effort?
SHEP Measurement Methodology
The CAHPS Clinician & Group Survey, endorsed by the National Quality Forum (NQF) in July 2007, comprises several instruments that enable users to assess and report on the experiences of patients in primary and specialty care settings. The National Committee for Quality Assurance (NCQA) worked with the CAHPS Consortium, sponsored by AHRQ, to develop a new version of the CAHPS Clinician & Group Survey to address specific processes of care relevant to patient-centered medical homes (the “CAHPS PCMH Survey”).
The CAHPS PCMH Survey includes all core items in the CAHPS Clinician & Group Survey and incorporates new items to address domains of care that multiple stakeholders identified as critical for evaluating functioning of PCMH practices.
The CAHPS PCMH Survey gathers information on patients’ experiences receiving care in a patient-centered medical home, thereby laying the groundwork for measuring and improving the organization and delivery of care. The VA Patient Aligned Care Team (PACT) initiative is a major effort to provide Veteran-centered primary care. The VA Office of Analytics and Business Intelligence (OABI) has used the SHEP program as the single most important system-wide effort to assess patient experiences with VA care. As of March 2012, the VA implemented the Patient-Centered Medical Home (PCMH) survey as part of the SHEP inventory of survey instruments.
Purpose:
- To obtain valid and reliable evaluations of veterans’ healthcare experiences with VHA ambulatory care.
- To obtain these evaluations using a standardized questionnaire and consistent methodology nationwide, thereby permitting the reporting of valid results, trending over time, and the comparison of local results with VHA internal benchmarks.
- To support assessment of VA’s initiative to provide Veteran-centered primary care through implementation of Patient Aligned Care Teams (PACT), based on a patient-centered medical home (PCMH) model.
Survey Administration Protocol
SHEP administers mail-based surveys. Veterans selected for the survey are sent a pre-survey notification letter explaining the nature and goals of the upcoming survey and encouraging the veteran to participate. One week later the questionnaire is mailed to everyone in the sample. Thank you/reminder postcards are sent to the entire sample one week later. Data collection remains open for three weeks after the postcard is mailed.
Calculation of CAHPS PCMH composites:
The following CAHPS PCMH composites are calculated for the outpatient setting: Access (Getting Timely Appointments, Care, and Information), Communication (How Well Providers Communicate with Patients), Providers Discuss Medications Decisions, Self-Management Support (Providers Support you in Taking Care of Your Own Health), Comprehensiveness (Providers Pay Attention to Your Mental/Emotional Health), Office Staff (Helpful, Courteous, Respectful Office Staff).
These composites are composed of two to six individual questions, some of which may have a response scale of “Never, Sometimes, Usually, Always”, “Yes, No”, or “Not at all, A little, Some, A lot”. The site score for each question is first computed as the percentage of responses that fall in the top category (“Always” or “Yes” or “A lot”). The site composite score then is the average for the questions assigned to each composite. It should be noted that each question in a composite is weighted equally, regardless of how many patients respond. Other CAHPS PCMH “reporting measures” that are not composites, but rather themed single survey items include:
Rating of Provider; two Information single items (After Hours Care Information and Reminder Received); and three Coordination of Care single items (Follow Up on Test Results, Provider Informed About Specialist Care, and Prescriptions Discussed). Refer to the “PCMH Dimensions of Care” tables below for more information.
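The scoring rules described above can be illustrated with a short sketch. The data shapes here are assumptions for illustration, not the official SHEP implementation: each question's site score is the percentage of responses in the top category, and the composite is the unweighted mean of its questions' scores.

```python
# Top response categories across the possible scales described above.
TOP = {"Always", "Yes", "A lot"}

def question_score(answers):
    """Percent of responses in the top category for one question."""
    return 100.0 * sum(a in TOP for a in answers) / len(answers)

def composite_score(question_answers):
    """Equal-weighted mean of question scores, regardless of how many
    patients answered each question."""
    scores = [question_score(a) for a in question_answers.values()]
    return sum(scores) / len(scores)

# Hypothetical responses for a Comprehensiveness-style composite:
comprehensiveness = {
    "asked_about_depression": ["Yes", "Yes", "No", "Yes"],  # 75% top box
    "talked_about_stress": ["Yes", "No"],                   # 50% top box
}
print(composite_score(comprehensiveness))  # 62.5
```

Note that the first question has four respondents and the second only two, yet each contributes equally to the composite, matching the equal-weighting rule stated above.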
SHEP measures that feed into S.A.I.L.:
- In the last six months, did anyone in this provider’s office ask you if there was a period of time when you felt sad, empty or depressed?
- In the last six months, did you and anyone in this provider’s office talk about things in your life that worry you or cause you stress?
- In the last six months, did you and anyone in this provider’s office talk about a personal problem, family problem, alcohol use, drug use, or a mental or emotional illness?
- Provider Rating: Using any number from 0 to 10, where 0 is the worst provider possible and 10 is the best provider possible, what number would you use to rate this provider?
What was the ultimate outcome of your effort?
The following outcomes were achieved by the Martinsburg VAMC primary care leadership team and providers:
- Increased volume of real-time Veteran feedback used for performance improvement initiatives
- Improved staff and provider morale through informal and formal recognition programs
- Improved clinical leadership and staff/provider engagement in patient experience improvement
- Improved compliance with hand hygiene protocols
  - Veteran perceptions that clinic staff washed/cleaned their hands improved from 64% in Oct-Dec’18 to 70% in Oct-Dec’19
- Improved compliance addressing mental health concerns during clinic visits
- Improved top box patient satisfaction composite scores [Lag Measures]
  - Comprehensiveness Composite improved from 57% in FY’18Q3 to 70% in FY’19Q3 (all-time high)
  - Office Staff Composite improved from 73% in FY’17Q4 to 78% in FY’19Q3
  - Overall Provider Rating improved from 71% in FY’17Q4 to 77% in FY’19Q4 (all-time high)
- Improved top box Veteran Experience (TruthPoint) outcomes [Lead Measures]
  - Veteran perceptions that providers asked about feelings of depression improved from 40% in Apr-Jun’18 to 61% in Jul-Sep’19
  - Veteran perceptions that providers asked about worries or stresses improved from 38% in Apr-Jun’18 to 55% in Jul-Sep’19
  - Veteran perceptions that providers asked about personal or family problems improved from 33% in Apr-Jun’18 to 39% in Jul-Sep’19
  - Veteran perceptions that providers asked about alcohol/drug use improved from 39% in Apr-Jun’18 to 50% in Jul-Sep’19
  - Veteran perceptions that providers asked about mental/emotional illness improved from 35% in Apr-Jun’18 to 43% in Jul-Sep’19
  - Veteran perceptions of the primary care provider improved from 79% “Excellent” in Apr-Jun’18 to 89% “Excellent” in Oct-Dec’19
  - Veteran perceptions of front desk staff improved from 49% “Excellent” in Jan-Mar’17 to 82% “Excellent” in Oct-Dec’19
  - Veteran perceptions of same-day call backs for medical questions improved from 55% “Same day” in Jan-Mar’19 to 65% “Same day” in Oct-Dec’19
What lessons did you learn that you would share with others as they consider addressing a similar issue?
Improving the patient/family experience is challenging due to the multiple points of potential failure, including the clinic environment, point of first contact (lobby, help desk, clinic front desk), nurse visit, provider visit, check out and contact with the clinic via phone or secure email. Given the multiple sites of care and the highly variable results across these sites, we needed to review the results, accept them, and imagine what they truly represented.
Did a poor impression of the care result from a grumpy conversation at check in? Or with the nurse? Or with the provider? It is important to allow patients to share their stories, and those stories must be shared with the staff to allow for self-reflection. Of key importance was envisioning the patient’s visit, attempting to identify points of contact with the staff/clinic, and determining how to gauge the patient’s satisfaction with each point of care through the question-design process.
Our staff wish to be perceived as caring and helpful. Encouragement, liberal sharing of clearly understood data with reinforcement of good practices through positive communication, frank sharing of underperformance and holding clinic leaders and providers accountable for the results have been keys to success.