Editorial Views  |   September 2017
Simulation for Assessment of the Practice of Board-certified Anesthesiologists
Author Notes
  • From the Department of Anesthesiology, Medical College of Wisconsin, Milwaukee, Wisconsin (C.A.L.); Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, Minnesota (M.A.W.); and Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women’s Health Care, Harvard Medical School, Boston, Massachusetts (J.P.R.).
  • Corresponding article on page 475.
  • Accepted for publication June 20, 2017.
  • Address correspondence to Dr. Lien: clien@mcw.edu
Article Information
Anesthesiology September 2017, Vol. 127, 410–412. doi:10.1097/ALN.0000000000001792

“…how can we use tools such as assessment in simulation…to improve the everyday practice of anesthesiologists?”

Image: Brigham and Women’s Hospital, STRATUS Center for Medical Simulation.
SIMULATION in the field of anesthesiology has proven useful in helping clinicians keep their practice skills current, particularly in the management of uncommon crises that they may encounter in their day-to-day work environment. Simulation is now an Accreditation Council for Graduate Medical Education requirement for residency programs, and its value as an educational tool has been well demonstrated. In recognition of its benefits to practicing anesthesiologists, simulation is an accepted component of Maintenance of Certification in Anesthesiology (MOCA), Part 4, which is aimed at improving medical practice.
Although the use of simulation for assessment has been described in training settings, it has not been widely discussed as an assessment tool for practicing anesthesiologists. In this issue of Anesthesiology, Weinger et al.1  assessed the performance of board-certified anesthesiologists in managing critical events that may occur in the course of everyday anesthesia practice. What can we learn from this new use of high-fidelity simulation? The results of their study raise concern. Many practicing anesthesiologists were rated as performing poorly in managing scenarios that are uncommon but should be familiar: local anesthetic systemic toxicity, hemorrhagic shock from occult peritoneal bleeding, malignant hyperthermia in the postanesthesia care unit, and acute onset of atrial fibrillation with hemodynamic instability followed by ST-elevation myocardial infarction.
In their study, 30% of the 284 practicing anesthesiologists could not manage these scenarios. On first reading, that sounds alarming. Simulation in education has taken many different forms, including task trainers, objective structured clinical examinations, and high-fidelity simulation. In education, simulation is an effective tool for teaching resuscitation skills2  and for introducing trainees to pediatric anesthesiology.3  However, a single exposure to a simulation-based educational experience in a training setting does not instill skills that are retained for the duration of one’s professional life,4  and repeated exposure, that is, deliberate practice, is necessary to maintain skills. On reflection, the favorable performance of the majority of anesthesiologists in the study by Weinger et al.1  should be encouraging. Their findings may reflect variation in participants’ daily exposure to specific areas of anesthesiology and a lack of recent or repeated deliberate practice in the areas tested.
The study by Weinger et al.1  adds credence to the concept that continuing professional development programs and, subsequently, assessment tools should be designed to support the delivery of care that improves patient outcomes.5  To be effective, these programs must generate enough interest that participants voluntarily seek them out as learning tools. They must appeal to the individual needs of a diverse group of learners. For practicing clinicians, simulation has been demonstrated to be an effective way to identify and address practice gaps.6  Simulation has also proven to be a cost-effective way to teach the management of infrequent adverse events, new surgical techniques, sterile central line insertion, and teamwork.5  However, the use of simulation as an assessment tool in high-stakes examinations or in evaluating the practice of individuals who completed their training years earlier has been limited.
The world of medicine is constantly changing, and it takes significant and deliberate effort to stay current. New discoveries, changes in the practice of medicine, and the growth of disruptive technologies continue to drive the way that medicine is practiced. With these changes, continuing professional development is essential to master new advances and incorporate those that improve patient care into practice. Significant changes that have affected perioperative patient care in the past 10 yr include, among other things, revisions of Advanced Cardiac Life Support guidelines; new recommendations for lung protection; evolving transfusion guidelines and recommendations for the management of patients with intracoronary stents; and the availability of new medications, ranging from novel anticoagulants to a new agent for the reversal of neuromuscular blockade. Without deliberate practice aimed at mastering the new knowledge and skills needed in these areas, any clinician could struggle. Given this changing practice knowledge and environment, is it surprising that there is a range in ability among practicing anesthesiologists? Some of the participants may have had more recent experience with the skills required to perform well in the scenarios tested, whereas others had not encountered these clinical challenges or simulated practice environments for some time. Indeed, younger anesthesiologists performed better than older anesthesiologists, lending some credence to this notion.
Assessment of clinical practice is difficult. A study of the American Board of Anesthesiology certification process revealed that some candidates achieve certification even though their residency program directors would not personally allow those trainees to administer an anesthetic to them.7  Assessment in certifying examinations is designed to demonstrate the acquisition of a body of knowledge rather than a complete set of facts. Very good residents can incorrectly answer questions relating to crucial concepts and still pass the examinations.8  Any assessment system should be constructed to allow those being evaluated an opportunity to do their best. Optimal performance depends on knowing what is being tested, being familiar with the assessment format, and having the opportunity to prepare. This is why we urge particular caution against being overly concerned about the seemingly high proportion of participants who did not perform well in the study by Weinger et al.1  Participants were not in a familiar setting, it is unlikely that they had recently encountered the scenarios they faced in simulation, and they were not able to prepare for the cases in which they were assessed.
The objective of the study by Weinger et al.1  was to establish an assessment process through review of recorded simulation performance and to identify gaps in anesthesiologists’ management of medical emergencies. The authors measured the performance of each participant in a single scenario that may or may not have been part of the clinician’s routine practice. Study participants included board-certified anesthesiologists who had signed up for simulation to fulfill part of their requirements for MOCA. However, participants were not aware in advance of the areas on which they were being tested and thus had no ability to practice or prepare. In contrast to what has been recommended by previous experts,9  standards for performance in these simulation scenarios were not established in advance. Validation should include evidence regarding the intended and unintended consequences of the assessment, the degree to which performance on specific tasks transfers, and the fairness of the assessments. The content being tested should be generalizable. Valid assessments of performance cannot be made on the basis of a single encounter. As noted by Boulet et al.,10  when simulation is used for assessment, a number of scenarios are required for fair assessment, and the more precise the desired measurement, the more scenarios are required. The results obtained from assessment of performance in as many as six scenarios in the study by Boulet et al.10  were not generalizable to other patient situations. There are limitations in using performance on one of four scenarios, as was done in this study,1  to draw conclusions about how well these anesthesiologists would do when faced with similar scenarios in actual clinical practice (transferability) and other aspects of the practice of anesthesiology (generalizability). An additional consideration in the study by Weinger et al.1  is that most scenarios were evaluated by only one rater. Of the 39 scenarios that were rated by at least two raters and reviewed for interrater reliability, the reliability was acceptable but not perfect. As demonstrated by Gaba et al. in 1998,11  using two raters reduces the likelihood of a significant error when assessing performance. Moreover, consistency of rater assessments across participants was not measured, and scores were not adjusted for rater severity.
Weinger et al.1  have described the performance of board-certified anesthesiologists in simulation scenarios originally designed for MOCA. How these physicians practice in their day-to-day environment with a variety of different patients cannot be determined from these results. Ongoing assessment and self-evaluation in practice are important to facilitate a clinician’s ability to remain current. Hospital appointments are reevaluated periodically, licenses are renewed every few years, Advanced Cardiac Life Support certification is valid for only 2 yr, and, as required by all 24 American Board of Medical Specialties member boards, certification must be maintained once it is achieved. A dedication to lifelong education is a measure of competence,12  and a commitment to revise practice based on what is learned predicts whether changes will be incorporated into practice.13  In addition to a commitment to lifelong learning, competence requires deliberate practice. Simulation, which through its debriefing sessions provides formative assessment, is a means of learning behaviors ranging from technical skills to teamwork. Although high-fidelity simulation is an engaging educational tool with demonstrated efficacy in graduate medical education, its use as an assessment tool for practicing physicians is not common. It has been used in Israel as a part of the certification process for anesthesiologists.14  It is important to note that all anesthesiologists certified in Israel learn and are assessed in the same simulation center. The study by Weinger et al.1  was conducted in several different simulation centers, and study participants were likely not familiar with either the setting or the process.
Simulation is a valuable tool for many purposes, and Weinger et al.1  should be congratulated for taking a careful and extensive look at how it might be used in assessment to identify performance gaps. The next crucial step is linking their assessment tool directly to learning experiences that help practitioners improve their practice. We need to be confident that those who are found deficient using the new assessment tool described by Weinger et al.1  can learn from it, make changes, and transfer those changes into improvements in everyday practice. We need to know that participation in this sort of activity leads to generalized improvements in areas of practice beyond those addressed in the focused assessments.
The current system of continuing professional development is inadequate; the real question, however, is how can we use tools such as the simulation-based assessment described by Weinger et al.1  to improve the everyday practice of anesthesiologists? Although such assessment may prove highly valuable in the future, work remains before it is ready for wide adoption. Standards defining acceptable practice must be set, a content outline of the material to be assessed should be developed, and simulation centers and scenarios must be standardized so that anesthesiologists can anticipate what to expect during their assessments. In the meantime, encouraging all practicing physicians to engage in educational programs and deliberate practice that will maintain and improve their clinical skills15  and, ultimately, the care that they provide for their own patients is of paramount importance. Weinger et al.1  should be congratulated for providing data to support this approach.
Competing Interests
Dr. Lien is a former director of the American Board of Anesthesiology. Dr. Warner is the president of the Anesthesia Patient Safety Foundation and former director of the American Board of Anesthesiology. Dr. Rathmell is the president of the American Board of Anesthesiology.
References
Weinger, MB, Banerjee, A, Burden, AR, McIvor, WR, Boulet, J, Cooper, JB, Steadman, R, Shotwell, MS, Slagle, JM, DeMaria, S, Torsher, L, Sinz, E, Levine, AI, Rask, J, Davis, F, Park, C, Gaba, DM . Simulation-based assessment of the management of critical events by board-certified anesthesiologists. Anesthesiology 2017; 127:475–89
Dagnone, JD, Hall, AK, Sebok-Syer, S, Klinger, D, Woolfrey, K, Davison, C, Ross, J, McNeil, G, Moore, S . Competency-based simulation assessment of resuscitation skills in emergency medicine postgraduate trainees: A Canadian multi-centred study. Can Med Educ J 2016; 7:e57–67 [PubMed]
Ambardekar, AP, Singh, D, Lockman, JL, Rodgers, DL, Hales, RL, Gurnaney, HG, Nathan, A, Deutsch, ES . Pediatric anesthesiology fellow education: Is a simulation-based boot camp feasible and valuable? Paediatr Anaesth 2016; 26:481–7 [Article] [PubMed]
Jansson, MM, Syrjälä, HP, Ohtonen, PP, Meriläinen, MH, Kyngäs, HA, Ala-Kokko, TI . Randomized, controlled trial of the effectiveness of simulation education: A 24-month follow-up study in a clinical setting. Am J Infect Control 2016; 44:387–93 [Article] [PubMed]
Kothari, LG, Shah, K, Barach, P . Simulation based medical education in graduate medical education training and assessment programs. Prog Pediatr Cardiol 2017; 44:33–42 [Article]
DeMaria, S, Levine, A, Petrou, P, Feldman, D, Kischak, P, Burden, A, Goldberg, A . Performance gaps and improvement plans from a 5-hospital simulation programme for anaesthesiology providers: A retrospective study. BMJ Simul Technol Enhanc Learn 2017; 3:37–42 [Article]
Slogoff, S, Hughes, FP, Hug, CCJr, Longnecker, DE, Saidman, LJ . A demonstration of validity for certification by the American Board of Anesthesiology. Acad Med 1994; 69:740–6 [Article] [PubMed]
Slogoff, S, Hughes, FP . Validity of scoring ‘dangerous answers’ on a written certification examination. J Med Educ 1987; 62:625–31 [PubMed]
Linn, RL, Baker, EL, Dunbar, SB . Complex, performance-based assessment: Expectations and validation criteria. Educ Res 1991; 20:15–21 [Article]
Boulet, JR, Murray, D, Kras, J, Woodhouse, J, McAllister, J, Ziv, A . Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology 2003; 99:1270–80 [Article] [PubMed]
Gaba, DM, Howard, SK, Flanagan, B, Smith, BE, Fish, KJ, Botney, R . Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology 1998; 89:8–18 [Article] [PubMed]
Epstein, RM . Assessment in medical education. N Engl J Med 2007; 356:387–96 [Article] [PubMed]
Wakefield, J, Herbert, CP, Maclure, M, Dormuth, C, Wright, JM, Legare, J, Brett-MacLean, P, Premi, J . Commitment to change statements can predict actual change in practice. J Contin Educ Health Prof 2003; 23:81–93 [Article] [PubMed]
Berkenstadt, H, Ziv, A, Gafni, N, Sidi, A . Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg 2006; 102:853–8 [Article] [PubMed]
Sachdeva, AK . Continuing professional development in the twenty-first century. J Contin Educ Health Prof 2016; 36(suppl 1):S8–S13 [Article] [PubMed]