Editorial Views  |   April 2018
Anesthesiology Board Certification Changes: A Real-time Example of “Assessment Drives Learning”
Author Notes
  • From the Department of Anesthesiology, Washington University School of Medicine, St. Louis, Missouri (D.J.M.); and the Foundation for Advancement of International Medical Education and Research, Philadelphia, Pennsylvania ( J.R.B.).
  • Corresponding article on page 813.
  • Accepted for publication November 30, 2017.
  • Address correspondence to Dr. Murray:
Anesthesiology, April 2018, Vol. 128, 704–706. doi:10.1097/ALN.0000000000002086

“…[A]s a first step in the evaluation of changes in the certification examination process, and as an example of how to gather validity evidence, the trend to a higher knowledge standard is promising.”

Image: American Board of Anesthesiology.
THE American Board of Anesthesiology (ABA) has introduced a number of changes in the certification process both for physicians who are qualifying for the first time and for those with time-limited board certification. In this issue of Anesthesiology, Zhou et al.1  provide data from the ABA that indicate that one of these changes, the division of the final knowledge examination into a Basic and an Advanced Examination, positively impacts resident knowledge as measured by improvements in in-training examination scores. The methods and findings presented in this manuscript are of immediate interest to residency programs and educators. In the longer term, if these examination changes result in board-certified diplomates with greater knowledge and skill, then certification standards will increase and this improvement will translate into a higher quality of care that will benefit all patients.
The study provides a real-time example of how “assessment drives learning.”2  According to Zhou et al.,1  all of whom are either staff or directors (current or former) of the ABA, the goal of adding the Basic Examination for certification was to act as “an incentive for residents to develop positive study habits early in their training” and to “focus their early phases of study on content areas that provide the foundation for later training.” Similar to other examples of “assessment drives learning,” such as the introduction of the United States Medical Licensing Examination Step 2 Clinical Skills (USMLE Step 2CS), modifications to the certification process would be expected to both alter resident behavior and change, hopefully for the better, the training curriculum.3  However, as observed in previous studies, the addition of an assessment can also result in unintended, potentially negative, consequences.2 
The introduction of the Basic Examination led to higher in-training examination scores compared with those of residents in previous years, supporting the conclusion that the examination change accelerated knowledge acquisition during residency training. The scores on in-training examinations in the Clinical Anesthesia (CA)-1 and CA-2 years support the value of the new staged examination system in promoting the desired educational outcomes of anesthesiology training. From an educational perspective, the idea that in-training examination scores will improve with the introduction of a “high-stakes” Basic Examination earlier in training is reasonable, given that it would motivate residents to study and spur curriculum changes in residency programs.2,3  However, a more detailed analysis of the Basic Examination and in-training examination scores would be helpful to determine which aspects of knowledge acquisition are most affected and whether these translate into longer-term performance gains. While the differential improvement in in-training examination scores is noteworthy, the effect size is small, and measurement error, attributable to sampling and/or to equating, may also contribute to the observed differences.4  If greater knowledge acquisition occurs and is sustained throughout training, it should translate into an increase in the year-over-year certification rate, provided that the passing standard stays the same. This could be partially validated by analyzing performance on the Advanced part of the written examination. Nevertheless, as a first step in the evaluation of changes in the certification examination process, and as an example of how to gather validity evidence, the trend to a higher knowledge standard is promising.
There could also be some less desirable unintended consequences of the change in examination approach. Although the study cohort showed a greater improvement in in-training examination scores after adjustment for possible confounding variables, these assessments were taken only approximately 6 months before and 6 months after the Basic Examination. The separation of the examination into Basic and Advanced components may merely have shifted knowledge acquisition earlier in training, with either no overall change or potentially a decrement if basic knowledge is not retained over time. Regardless of when it is acquired, it is important that this knowledge not extinguish over time and that the improved learning is indeed the outcome of the change in examination timing. The residents in the historical control group were required to study for a single examination, albeit later in training, that required both Basic and Advanced knowledge. Upon their completion of this requirement, stakeholders could be reasonably certain that the diplomate had the essential knowledge required for practice. While the same could be true for the staged examination, provided that performance on the Advanced Examination component is adequate, any argument favoring a staged examination sequence, at least in terms of final knowledge acquisition, cannot be supported by in-training examination results at the end of the CA-1 or CA-2 year.
As observed in previous studies of medical students, inserting a “high-stakes” examination during training could have negative educational consequences. Studying for these types of assessments often consumes students (or residents), who concentrate their efforts on gaining the factual knowledge required to pass a Basic Examination, often to the exclusion of acquiring the clinical experiences and clinical acumen necessary to become a consultant.2  The staging of the knowledge assessment, which effectively changes how the anesthesiology curriculum is applied, could limit some of the clinical experiences necessary to become a consultant in anesthesiology. From both program evaluation and assessment validity perspectives, this certainly warrants investigation. If the introduction of the Basic Examination has no impact on the acquisition of clinical skills, and it probably should not, then one would expect that the various measures of a resident’s clinical experiences and their performance on the Applied Examination would remain stable.
The ABA, via this publication and a recent manuscript examining disciplinary actions by state medical boards, has provided greater access to examination and certification outcomes. The concerted effort to gather evidence supporting the validity of board certification examination scores, and the decisions based on them, is commendable. In the recent disciplinary action outcome study, anesthesiologists who obtained their certification on the first attempt had a lower likelihood of having an action against their medical license than those who required more than one attempt.5  Conducting these types of investigations, which can help inform new, or support existing, examination structures, is extremely important. Providing data so that stakeholders, particularly those directly responsible for the education of residents, can better understand the value of certification can go a long way toward garnering support for all assessment activities. Often, the rationale for changes in the certification process and, more importantly, the outcomes of these changes are not forthcoming. Diplomates and educators are left to ponder what motivated the decision to alter certification requirements. In the case of Maintenance of Certification, these decisions have often led to considerable rancor and discord among diplomates.6 
The conceptual basis underlying the decision to change from a single to a staged examination is reasonably sound. The content overlap in the Basic Examination (high stakes) and the in-training examinations, combined with the greater differential improvement in performance for the study cohort compared to the historical cohort, suggests that residents may actually learn more, particularly when passing the Basic Examination is required to continue their training. However, from a validity perspective, there are several unanswered questions that require further study. The addition of the Basic Examination could have some impact on the study patterns of residents, their clinical experience, and the administration of the curriculum. It is important to know more about how residents prepare for the Basic Examination, and whether this additional study time, and stress, impact other learning activities. If knowledge is at the core of clinical decision making, and the staged examination leads to longer-term knowledge improvement, then one would expect that diplomates who take the staged examinations will meet a higher knowledge standard than those who were certified under the old system. While yet to be explored, this should ultimately lead to better patient care.
Competing Interests
The authors are not supported by, nor maintain any financial interest in, any commercial activity that may be associated with the topic of this article.
References
Zhou, Y, Sun, H, Lien, CA, Keegan, MT, Wang, T, Harman, AE, Warner, DO: Effect of the BASIC Examination on knowledge acquisition during anesthesiology residency. Anesthesiology 2018; 128:813–20
Newble, D: Revisiting ‘The effect of assessments and examinations on the learning of medical students.’ Med Educ 2016; 50:498–501
Gilliland, WR, La Rochelle, J, Hawkins, R, Dillon, GF, Mechaber, AJ, Dyrbye, L, Papp, KK, Durning, SJ: Changes in clinical skills education resulting from the introduction of the USMLE step 2 clinical skills (CS) examination. Med Teach 2008; 30:325–7
Cohen, J: Statistical Power Analysis for the Behavioral Sciences, 2nd edition. Hillsdale, Lawrence Erlbaum, 1988
Zhou, Y, Sun, H, Culley, DJ, Young, A, Harman, AE, Warner, DO: Effectiveness of written and oral specialty certification examinations to predict actions against the medical licenses of anesthesiologists. Anesthesiology 2017; 126:1171–9
Teirstein, PS, Topol, EJ: The role of maintenance of certification programs in governance and professionalism. JAMA 2015; 313:1809–10