Correspondence  |   January 2008
The Use of Simulation Education in Competency Assessment: More Questions than Answers
Author Affiliations & Notes
  • David Murray, M.D.*
  • *Washington University School of Medicine, St. Louis, Missouri.
Anesthesiology, January 2008, Vol. 108, 167–168. doi:10.1097/01.anes.0000296641.73408.69
In Reply:—
We would like to thank Dr. Edler for her interest in our editorial.1 In an innovative study that measured teamwork during simulated obstetric emergencies, Morgan et al. found that the team scores were not reliable.2 The investigators attributed the limited reliability to the use of the Human Factors Rating Scale. Dr. Edler correctly indicates that the limited reliability could be the result of several factors and that a generalizability analysis partitioning the variance associated with scoring method, raters, and scenario would help to clarify the findings. Regardless of the cause of the limited reliability, Morgan et al.’s main conclusion would still stand.
“Simulation education” or, perhaps more appropriately, simulation-based assessment, has stimulated interest in interpreting participant or team scores. The reliability and validity of scores obtained during a simulation (or any performance assessment) depend on the authenticity of the event and environment, how effectively the scoring instrument captures the skills of interest, and whether the raters consistently observe and record the performance. Morgan et al.’s experimental design offers an important first step in evaluating teamwork in that it includes high-fidelity simulated obstetric emergencies that can be managed by a multidisciplinary team. Rather than discouraging educators, we hope the challenges of developing assessment instruments that provide valid and reliable team scores will inspire additional investigations. This research would help to establish the role of simulation-based assessment as a method to investigate and evaluate team performance.
*Washington University School of Medicine, St. Louis, Missouri.
References
1. Murray D, Enarson C: Communication and teamwork: Essential to learn but difficult to measure. Anesthesiology 2007; 106:895–6
2. Morgan PJ, Pittini R, Regehr G, Marrs C, Haley MF: Evaluating teamwork in a simulated obstetric environment. Anesthesiology 2007; 106:907–15