Abstract
Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing (above base staffing) levels in an attempt to match daily demand. Historically, inpatient managers have subjectively adjusted daily staffing based on observation of the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. The related queue control and throughput modeling literature does not directly match staffing with demand, requires explicit knowledge of the service process, and is not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, and its statistical discrete time Markov chain foundation estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, applying the MDP to a telemetry unit yields a myopic, an approximate stationary, and a finite horizon optimal policy, each validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that removing discharge seasonality can drastically decrease both the required size of the premium staffing pool and the probability of full occupancy, thus improving the inpatient unit's patient flow.
Acknowledgments
This research is supported by Banner Health in Phoenix, AZ. The authors would like to thank Twila Burdick, VP of Organizational Performance, and Management Engineering at Banner Health.
The views expressed in this article are those of the authors and do not reflect the official policy of the United States Air Force, Department of Defense, or the United States Government.