Editorial Views  |   July 1999
Oral Practice Examinations: Are They Worth It?
Author Notes
  • Professor of Anesthesiology; Associate Dean for Graduate Medical Education; Wake Forest University School of Medicine; Medical Center Boulevard; Winston-Salem, NC 27157-1009.
  • Accepted for publication March 15, 1999.
Anesthesiology July 1999, Vol. 91, 4–6.
This Editorial View accompanies the following article: Schubert A, Tetzlaff JE, Tan M, Ryckman JV, Mascha E: Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology: Implications for use as a tool for assessment of residents. Anesthesiology 1999; 91:288–98.
MODERN-day society increasingly demands affirmation that physicians are capable medical practitioners. The public, health maintenance organizations, hospital credential committees, state licensing boards, group practices, and other organizations insist on competence, accountability, initial and maintained specialty board certification, continuing education, and so on. Graduate medical education programs have a major responsibility to assess the competence of their residents. Through appropriate and careful faculty evaluation of resident performance, these programs may have the best opportunity during a physician's entire professional life to assess competence. Unfortunately, relatively little effort has been made to determine the best methods for such assessment. The article by Schubert et al. [1] in this issue of Anesthesiology is a major contribution in this area. This lengthy article is not particularly easy to read and at times requires wrestling with a "foreign language" because of the considerable volume of technical detail, unfamiliar terminology, and statistical analysis. Nevertheless, it contains a great deal of valuable information. The article reports on a 5-yr period, 1989–1993, during which 190 residents participated in 441 oral practice examinations (OPEs). The authors then examined the reliability and validity of mock oral examinations as an indicator of the progress residents have made toward achieving the ability to practice medicine independently.
OPEs serve many purposes. They can facilitate education by stimulating residents to read, ask questions, and seek broad clinical experience. Residents and junior faculty preparing for the oral examination of the American Board of Anesthesiology find it stressful to be examined by individuals with whom they currently associate, hence the added motivation to study to avoid embarrassment. OPEs help to prepare individuals for "the real thing" and may reduce stress by familiarizing those taking the examination with the general setting and format of oral examinations and by providing coaching on how to improve communication skills, effectively present ideas and opinions, organize thought processes and answers, and portray confidence via body posture and eye contact. Mock oral examinations may also provide a valid mechanism to help assess a resident's progress in the program and overall competence when used in conjunction with assessment by faculty in the clinical setting, patient outcome, and performance on in-training examinations.
As presented in the article by Schubert et al., the OPE differs in several ways from the examination of the American Board of Anesthesiology (ABA). The authors used the format employed by the ABA before 1997. The new ABA examination places more emphasis on perioperative medicine, especially the postoperative period, which could be omitted entirely in the old format. This difference does not detract from the effort Schubert et al. make to validate their own OPE with regard to evaluating resident performance. Clearly, a greater possibility for a "halo effect" exists when faculty who know the candidates conduct the examinations. Associate examiners in the ABA system have no knowledge of a candidate's place of training, practice type or location, performance as a resident, previous evaluations, or personality. Schubert et al. provide some reassurance that familiarity with the candidates did not significantly affect the results by finding "acceptable agreement" between live and taped overall numerical scores and pass-fail scores. The ABA does not use overall numerical scores, a process whereby each subquestion receives equal weight. The ABA system instead weights each subquestion on the basis of the examiner's judgment of its importance and the time devoted to it. The scoring system used by Schubert et al. could therefore produce different pass-fail results on identical examinations, but it is unlikely to have altered the conclusions of this study. Although some faculty members at the Cleveland Clinic serve as associate examiners for the ABA's oral examinations, other faculty who are less familiar with oral examinations also participate. Although the inter-rater reliability results fail to show significant differences in scoring between examiners, possible differences between examiners in how the examination is conducted were not addressed.
All ABA associate examiners attend orientation sessions and workshops specifically directed toward different levels of examiner experience each time they participate in the oral examination. This process of continuing education for associate examiners influences examination style, the effectiveness of questions asked, grading, and so on.
Practice oral examinations require considerable time, effort, and expense for the programs that conduct them. Are OPEs really worth it? Schubert et al. examined 5 years' experience with OPEs and their reliability, consistency, and validity at the Cleveland Clinic. The results provide convincing evidence of success by demonstrating substantial internal consistency and reliability. The positive correlation of OPE scores at their institution with in-training examination scores, faculty evaluations, and other indicators of resident preparedness seems to affirm that OPEs can represent a reasonably valid tool for assessing resident performance.
Teaching programs must work to improve the education provided to residents in anesthesiology. Examining our methods of teaching is an important part of improving residency programs and the physicians they educate. So often we assume that what we do has value. Many programs conduct OPEs year after year without carefully investigating their efficacy. I commend the authors for studying their own OPE process and demonstrating that OPEs can serve effectively as one measure of a resident's progress toward independent practice and specialty board certification.
Francis M. James III, M.D.
REFERENCES
1. Schubert A, Tetzlaff JE, Tan M, Ryckman JV, Mascha E: Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology: Implications for use as a tool for assessment of residents. Anesthesiology 1999; 91:288–98