Editorial Views  |   July 2014
Data, Data, On the Server: Challenges in Applying Data Analysis to Operating Room Management
Author Affiliations & Notes
  • Joseph A. Sanford, M.D.
  • Alex Macario, M.D., M.B.A.
    From the Department of Anesthesiology, University of Arkansas for Medical Sciences, Little Rock, Arkansas (J.A.S.); and Department of Anesthesiology, Perioperative and Pain Medicine, Stanford University School of Medicine, Stanford, California (A.M.).
  • Corresponding article on page 171.
  • Accepted for publication March 24, 2014.
  • Address correspondence to Dr. Macario: amaca@stanford.edu
Anesthesiology 07 2014, Vol.121, 6-8. doi:10.1097/ALN.0000000000000288

“Providing data and sophisticated analyses are not the ultimate goals; rather, they are a point of departure for dialogue to begin.”

Image: ©Thinkstock.
FOR the past decade and a half, pioneering work has been done to mathematically quantify various elements of operating room (OR) management, including how to allocate the optimal amount of OR time to each room and service so that overall nurse and anesthesiology staffing costs are minimized. In this issue of Anesthesiology, an excellent study from Vanderbilt University adds to this literature by examining statistical methods to predict future OR demand.1 This was done by examining the OR schedule, as case bookings accumulate in the weeks before the day of surgery, to predict how busy a given day in the future will be. Their work is also laudable because they implemented their findings in their own hospital. The prediction of future case volume was used to achieve a key priority for the OR manager: adjusting OR staffing levels to match workload as closely as possible while reducing costs by an estimated 2.8 full-time equivalents per year.
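As a toy illustration of the general idea (not the authors' actual models, which are more sophisticated), one could scale the current booking count by the fraction of final volume that is typically booked at the same point before the day of surgery. All numbers below are hypothetical:

```python
# Sketch only: predict final day-of-surgery case volume from the
# accumulating schedule, assuming the fraction of cases already booked
# N days out is historically stable. All figures are hypothetical.

def fit_fill_ratio(history):
    """history: list of (cases_booked_n_days_out, final_volume) pairs.
    Returns the average booked-to-final ratio observed historically."""
    ratios = [booked / final for booked, final in history]
    return sum(ratios) / len(ratios)

def predict_final_volume(booked_now, fill_ratio):
    """Scale today's booking count up by the historical fill ratio."""
    return booked_now / fill_ratio

# Hypothetical history: 14 days out, about 70% of cases are booked.
history = [(35, 50), (28, 40), (42, 60)]
ratio = fit_fill_ratio(history)          # 0.7
print(predict_final_volume(49, ratio))   # predicts 70.0 cases
```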
Anesthesia groups that actively engage in such data analysis and its management can reap great benefits, enhancing their position as OR stakeholders. A group that is comfortable doing its own analysis enjoys a key advantage over those that hire outside consultants: it has skin in the game and will itself be subject to whatever changes result. Even groups that do bring in outside analysts acquire knowledge to which they would otherwise not have access.
As clinicians who have tried to implement OR management analytics, we have found that when datasets are too large to hand check and the potential impact is large, several lessons learned are worth remembering:
  • Data alone are not enough to ensure organizational change, but the analysis of data can facilitate discussion that may provide improvement opportunities.

  • Prediction of future OR activity alone is not sufficient to optimize OR management. The staff that are scheduled to work must also have the right mix of clinical skills.

  • Understanding the strengths and weaknesses of large dataset analysis and predictive models is a prerequisite for avoiding common pitfalls of statistical modeling. Specifically, how well do the models reflect reality and how does their implementation alter the performance of the OR suite?

More Than Data Is Needed for Change
The most important aspect of OR management is to first allocate the correct amount of OR time to each service on each day of the week. This should make sense: if an OR is staffed for only 10 h but consistently runs 14 h, that is not efficient. The correct amount of OR time to allocate to each service in the future should be based on that service's historical use, with the goal of minimizing underutilized time (when ORs finish unexpectedly early) while simultaneously minimizing the more expensive overutilized time (when ORs run past their scheduled end).
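The trade-off just described can be sketched numerically. The sketch below is a simplified illustration, not a published method: it assumes overutilized (overtime) hours cost 1.5 times regular hours (an illustrative ratio) and picks, from a few candidate allocations, the one with the lowest average cost over hypothetical historical workloads:

```python
# Illustrative sketch of allocating OR time from historical use.
# The 1.5 overtime cost ratio and all workload figures are assumptions.

def expected_staffing_cost(allocated_hours, historical_workloads,
                           overutilized_cost_ratio=1.5):
    """Average cost, in regular-hour units, of a fixed daily allocation
    given past daily hours of OR use for the service."""
    total = 0.0
    for workload in historical_workloads:
        under = max(allocated_hours - workload, 0)  # OR finishes early
        over = max(workload - allocated_hours, 0)   # OR runs late (costlier)
        total += under + overutilized_cost_ratio * over
    return total / len(historical_workloads)

def best_allocation(historical_workloads, candidates):
    """Pick the candidate allocation with the lowest expected cost."""
    return min(candidates,
               key=lambda h: expected_staffing_cost(h, historical_workloads))

workloads = [9, 10, 12, 14, 11]            # hypothetical daily hours used
print(best_allocation(workloads, [8, 10, 12, 14]))   # prints 12
```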
When addressing any management dilemma (e.g., insufficient OR capacity to complete all elective cases), sometimes hospital managers source solutions from opinions and anecdote. Although the squeaky wheel getting the grease is an all too common outcome, the plural of anecdote is not data. Although anecdote cannot be discounted out of hand, it may be that the antidote to relying upon anecdote is lots of data. As a foundation from which to make educated decisions, having good quantitative information about the past can be valuable. Providing data and sophisticated analyses are not the ultimate goals; rather, they are a point of departure for dialogue to begin.
Clinicians and managers armed with data will recognize that organizational and cultural barriers exist. For example, people need to learn to trust the data: a common first reaction at meetings when OR data are presented is to question the data's accuracy, and then to deny that the data are relevant to one's specific practice. Skepticism is not a bad thing. When asked to make a decision based on presented information, participants may act on incomplete information as if it were true and complete, even when they are aware this is not the case.2 As the application of statistical analysis gives data meaning, the data become open to interpretation. Although by human nature we are likely to take presented data at face value, we must remind ourselves to ask, “What are we not seeing?”
In addition, when trying your hand at predictive models, it helps not to come to the data with a specific model in mind, but rather to try a variety of approaches, knowing that some will be more successful than others. Doing so causes the management team to be less likely to fall into the trap of imposing causality where none exists just to increase the impression of understanding. Tiwari et al.1  succeeded on these two points. They disclose a variety of analytic methods examined in addition to the implemented technique and their data come from two separate locations across all posted cases. Exclusion criteria were reasonable and the sample covers major seasonal and academic time periods to account for volume variability.
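The advice to try a variety of approaches can be illustrated with a minimal sketch, using entirely hypothetical data: fit two candidate predictors on one span of days and compare their errors on held-out days before committing to either.

```python
# Sketch: compare two simple candidate predictors of final case volume
# on held-out data. The data and both models are illustrative only.

def mean(xs):
    return sum(xs) / len(xs)

def mae(predict, data):
    """Mean absolute error of predict(x) against observed y."""
    return mean([abs(predict(x) - y) for x, y in data])

# Hypothetical (bookings 14 days out, final volume) pairs.
train = [(30, 44), (35, 50), (40, 57), (45, 65)]
test = [(32, 46), (38, 54)]

# Candidate 1: constant historical average, ignoring current bookings.
avg = mean([y for _, y in train])
constant_model = lambda x: avg

# Candidate 2: scale current bookings by the average fill ratio.
ratio = mean([x / y for x, y in train])
ratio_model = lambda x: x / ratio

# On this toy data the booking-based model wins, but only the held-out
# comparison, not prior preference, should decide that.
print(mae(constant_model, test), mae(ratio_model, test))
```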
Ultimately, a functional governance structure is essential. In both public and private hospitals around the world, variability exists in the OR governing body and associated authority, reporting lines, and responsibilities. Current approaches range from a single person making the key decisions all the way up to a large OR executive committee.
Staff Skill Sets Vary
As we have investigated OR clinical activity prediction algorithms here at Stanford, several issues were discovered. First, although case volume (or the total hours of OR time those cases require) is a sizable component in the equation of predicting demand, it is by no means the only component. Other dimensions are important to address when optimizing staff scheduling. The right skill mix is also required (e.g., the urology OR nurse may not be able to work in the heart room). This is especially true for large medical centers where specialized anesthesia or nursing staff are required to cover specialty services such as cardiothoracic, neurosurgical, or pediatric cases.
Other obstacles may arise in the push to have a well-functioning OR suite, such as contracts and union mandates on staffing hours that limit opportunities to fully optimize staffing. Data analysis may suggest that shifts are optimal at a certain length, but current contracts may not allow that shift duration. For the OR manager, this issue deserves recognition because the stakeholders extend beyond the players in the surgical suite environment.
Modeling Pitfalls
All models are, by definition, erroneous; if they were not, they would be reality. Increasing model complexity to overfit the data or ignoring outlier data may make a prediction model seem more accurate, but actually introduces subtle yet significant error. Because collected data are not pure signal, they are subject to interpretive variation, such that multiple observers can apply very different models, all of them likely to be wrong as a matter of approximation. One of the advantages of in-house data analysis is that, after the initial learning curve is overcome, questions can be examined quickly and potential changes can be tested and rapidly iterated.
This behavior alone is noteworthy because static belief in a model is a dangerous practice. Should the model predict something that seems unreasonable, both unconditional trust and outright dismissal are inappropriate responses; further investigation is required. This is illustrated well in the article when the model predicted an abnormally low volume without historical precedent.1 The authors investigated and discovered that many surgeons planned to attend a national meeting but had not informed the schedulers.
When reporting novel data analysis with significant economic impact, reproducibility takes on particular importance. Unlike bench research, the barrier to reproducibility with predictive modeling is low: datasets can be fairly easily adapted and distributed, and no equipment beyond a computer is required to replicate results. Thus, that which can be held to a more rigorous standard should be. Methodologies reported in the literature should be exceedingly explicit so that they can be duplicated by interested parties. Should a hospital or anesthesia department wish to begin doing such work itself, options exist across the cost spectrum. These include using the open-source analytics package R* or IBM’s SPSS (IBM Corp., New York, NY), as was done by Tiwari et al.1 Alternatively, an engineer can be hired to do the required programming, or outside consultants with expertise in OR management can be brought in. When the implementation of statistical methods is adequately described, as Tiwari et al. have done here, a group’s locally acquired data can be tested against a new method to see whether similar results are obtained.
Finally, hospital managers must explore and understand the consequences of decisions made from a model. How does applying the results of the new model affect your OR suite? Not all efficiency gains are equal, and what looks like efficiency on the front end may disguise the potential for an unexpected negative event should the system take an unanticipated shock. Consider, for example, a community hospital surgical department that convinces administration to reserve a room 24/7/365 for its use, on the grounds that doing so will shorten hospital stays and increase revenue by increasing throughput. This continual resource allocation in the face of variable demand could limit the flexibility of the OR suite as a whole and have a net negative impact overall. Decisions made in the short term may have undesirable long-term consequences. To that end, it is wise to ask how the overall condition of the affected system will change in relation to a change in the underlying variables.
Conclusion
Raw data consist of many facts and little meaning; operating room management modeling studies aim to extract that meaning and knowledge from the data. Data analysis is not a magic mirror: you cannot simply “ask” the data a question and get one perfect answer. Although in-depth statistical methods may not be familiar to the practicing clinician, this does not preclude critical assessment of the results generated by large dataset analysis. When taking theory to practice and applying these analytics to the real world of managing a busy and chaotic OR suite, data alone are not enough to drive change. There will be qualitative challenges when incorporating modern quantitative methods designed to maximize the number of patients who undergo surgery safely at the lowest possible cost to the hospital.
Competing Interests
The authors are not supported by, nor maintain any financial interest in, any commercial activity that may be associated with the topic of this article.
*Available at: http://www.R-project.org. Accessed February 1, 2014.
References
Tiwari V, Furman WR, Sandberg WS: Predicting case volume from the accumulating elective operating room schedule facilitates staffing improvements. Anesthesiology 2014; 121:171–83 [Article] [PubMed]
Brenner LA, Koehler DJ, Tversky A: On the evaluation of one-sided evidence. J Behav Decis Mak 1996; 9:59–70 [Article]