Relationship between Faculty Assessment of Resident Medical Knowledge and Resident Performance on the CREOG In-Service Examination

Date

2015-08

Abstract

In 2012, 13% of Obstetrics and Gynecology (Ob/Gyn) physicians failed the American Board of Obstetrics and Gynecology (ABOG) written exam, the first step to becoming certified following their 48-month residency training. During their training, Ob/Gyn residents take the Council on Resident Education in Obstetrics and Gynecology (CREOG) in-training exam. This annual test was designed, in part, to measure an Ob/Gyn resident’s fund of knowledge and may help determine their level of preparedness for the ABOG written exam. However, the test is not intended to be the exclusive assessment tool used by training programs to determine a resident’s medical knowledge. To supplement this assessment, educators from one academic training program performed monthly evaluations of residents’ medical knowledge using a standard instrument, but the accuracy of their assessment was unknown. A positive relationship between faculty evaluations of residents in general gynecology and general obstetrics and resident scores in these domains on the in-training exam could more quickly establish a need for resident remediation and help ensure board exam passage. The purpose of this study was to determine whether such a relationship exists. Archival in-service exam scores and average faculty evaluation scores of medical knowledge in general gynecology and general obstetrics were collected at one training program from 2009-2014 and were used to determine the relationship, if any, between the two sets of scores. Data analysis yielded mixed results. A moderate relationship existed in the area of obstetrics in all years, with the strongest correlation found in the PGY2 year. In gynecology, there was no correlation between the faculty evaluations and the exam scores for the PGY2 and PGY4 years, and only a weak, statistically significant correlation for the PGY3 year. The study concluded that faculty may be evaluating technical skill in gynecology rather than medical knowledge, and that consistent exposure to a core group of supervising educators may have contributed to the stronger relationship in the area of obstetrics. Recommendations for future practice to identify earlier opportunities for intervention included training the faculty to improve evaluation technique, creating new learning opportunities for residents in the clinical environment, and scheduling more consistent exposure to full-time core faculty.

Keywords

Resident in-training exam, Faculty evaluations