Christina R. Bergin, MD
Banner University Medical Center Phoenix
Program Size: 31-100 residents
Academic Setting: Mixed
Clinical Setting: Professionalism/Communication
Residency programs are required to provide objective assessments of residents in the ACGME Core Competencies. Some of these competencies, such as professionalism, interpersonal and communication skills, and even certain aspects of patient care, can be difficult to assess. Furthermore, these assessments must come from multiple evaluators, including patients and nursing staff, and across various clinical settings. The outpatient clinic has become the traditional setting for obtaining multi-source feedback, leaving a paucity of information about residents' performance in certain competencies in the inpatient setting, where a significant proportion of their residency training time is spent.

The University of Arizona College of Medicine-Phoenix Internal Medicine Residency Program therefore developed an evaluation tool to obtain direct patient feedback and support multi-source assessment of residents in the inpatient setting. The tool allows patients to directly evaluate aspects of residents' patient care skills, professionalism, and interpersonal communication. A similar but distinct form was developed to obtain direct nursing feedback and assessments of these competencies as well.

The patient evaluation and feedback tool is distributed to patients on the inpatient Internal Medicine teaching service on the day of discharge by a hospital volunteer; the volunteer assists patients with completing the form if requested. The nursing evaluation tool may be distributed to nursing staff at any point during the residents' wards rotation. All responses are anonymous to increase the candor of feedback from both patients and nursing staff. Resident pictures, names, and roles (intern vs. resident) are included on both forms to help patients and nurses identify who is being evaluated.
To develop this evaluation tool, we first determined which competencies, and which outcomes within those competencies, we wanted patients to evaluate. We then investigated the qualities and skills patients value in their physicians in order to determine how best to engage patients in providing feedback to residents. The competency outcomes and patient-valued qualities were then used to create the evaluation questions and prompts. Next, we selected a rating scale expected to provide adequate distribution of scores. Finally, we chose and implemented the method for distributing the evaluation/feedback tool.
Sustaining the implementation of a new evaluation tool can be difficult. Resident distribution of the evaluation tools to both patients and nursing staff is inconsistent because of time and workload pressures, and there is also the possibility of distribution bias. Using a hospital volunteer to deliver the tool to patients, and frequently to provide in-person assistance with its completion, produced much more consistent distribution and return of the evaluation tool. We also found that patient assessments of residents tend to be very positive despite assurances of anonymity; constructive feedback, if desired, must be explicitly encouraged.
Faculty Development and Training
None, as this is a tool used by patients to evaluate residents. If faculty, rather than hospital volunteers, are chosen to distribute the tool to patients, faculty may benefit from a short session covering techniques for directly requesting feedback from patients and for avoiding distribution bias.
For more information, please contact firstname.lastname@example.org.