[This article was first published on FishyOperations, and kindly contributed to R-bloggers.]

Estimating required hospital bed capacity requires a thorough analysis. There are a lot of ways of approaching a capacity requirement problem, but I think we can agree that a simple spreadsheet analysis just won't cut it.

The approach described in this post uses discrete-event simulation and, to be clear, abstracts away a lot of variables that would have to be taken into consideration in a real-life analysis.

To explain the approach, the following case will be used:

An emergency department of a small regional hospital receives complaints about its emergency admission capacity. After investigating its admission data, it becomes clear that its service level is not up to par with that of other hospitals. Therefore, plans for a redesign of the emergency department and an investment in its emergency bed capacity are presented. The plan proposes a new bed capacity of 12 beds (up from the current 10 beds). The Chief of Medicine wants to know what effect this investment will have on the emergency admission service level.

The emergency department has recorded data on the interarrival times of patients that are admitted (or should be admitted) to an emergency bed. The following graph shows the interarrival distribution (triangular: mode=5, min=.1, max=12):

The graph below shows the collected data on the LOS of patients in the emergency bed (triangular: mode=24, min=7, max=72):
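As a quick sanity check before building the model, both triangular distributions can be sampled directly with Python's random.triangular (note its argument order: low, high, mode). This sketch also estimates the offered load, i.e. the average number of occupied beds implied by the two distributions; the variable names are illustrative, not part of the model script below.

```python
import random

random.seed(42)  # reproducible sketch

# draw large samples from the two triangular distributions used by the model
interarrival = [random.triangular(0.1, 12, 5) for _ in range(100000)]
los = [random.triangular(7, 72, 24) for _ in range(100000)]

mean_ia = sum(interarrival) / len(interarrival)   # theoretical mean: (0.1 + 12 + 5) / 3 = 5.70 h
mean_los = sum(los) / len(los)                    # theoretical mean: (7 + 72 + 24) / 3 = 34.33 h

# offered load = mean LOS / mean interarrival time
# (the long-run average number of beds in use, ignoring variability and reneging)
offered_load = mean_los / mean_ia
print(round(mean_ia, 2), round(mean_los, 2), round(offered_load, 1))
```

The offered load comes out around 6 beds, which already hints that 10 beds is not comfortably above average demand once peaks and the one-hour renege limit are taken into account.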

SimPy will be used to build the simulation model. More information on SimPy can be found on their website; another example can be found in a previous post.

The following script shows how the model has been built.

# note: this uses the classic SimPy 2 API (Python 2)
from SimPy.Simulation import *
import random
import csv

class EmergencyDepartment(Simulation):
    def __init__(self, capacity, days_to_run):
        Simulation.__init__(self)
        self.capacity = capacity
        self.days_to_run = days_to_run

    def runModel(self):
        self.initialize()
        # create a monitor to observe the admittance status of each patient
        self.PROCmonitor = Monitor()
        # create the 'bed' resource
        self.beds = Resource(name='Beds', capacity=self.capacity, sim=self)
        # create the patient arrival-time list
        patientarrivals = []
        last = 0
        while last < 24 * self.days_to_run:
            new = random.triangular(.1, 12, 5)
            patientarrivals.append(new + last)
            last = new + last
        # create the patients
        for p in patientarrivals:
            patient = Patient(name="Patient %s" % p, sim=self)
            self.activate(patient, patient.run(), at=p)
        # simulate for the defined number of days
        self.simulate(until=self.days_to_run * 24)

class Patient(Process):
    def run(self):
        self.admitted = 0
        # save a reference to the arrival time of the patient
        starttime = self.sim.now()
        # draw a length of stay from the triangular distribution
        los_time = random.triangular(7, 72, 24)
        # request a bed; the renege clause means a patient waits at most
        # one hour before being transferred to another hospital
        yield (request, self, self.sim.beds), (hold, self, 1)
        if self.gotResource(self.sim.beds):
            self.admitted = 1
            # hold the bed for a duration of 'los_time'
            yield hold, self, los_time
            # release the bed after discharge
            yield release, self, self.sim.beds
        # observe the admittance status (1 = admitted, 0 = transferred)
        self.sim.PROCmonitor.observe(self.admitted, t=starttime)

    def gotResource(self, resource):
        result = self in resource.activeQ
        if not result:
            resource.waitQ.remove(self)
            if resource.monitored:
                resource.waitMon.observe(len(resource.waitQ), t=self.sim.now())
        return result

# create a list to collect results in
results = []
for beds in range(1, 21):
    for run in range(1, 11):  # 10 re-runs per scenario
        s = EmergencyDepartment(capacity=beds, days_to_run=365)
        s.runModel()
        # extract results from the process monitor
        for t in s.PROCmonitor:
            results.append([run, beds] + t)

# write out the results to a CSV file
writer = csv.writer(open('model_results.csv', 'wb'), quoting=csv.QUOTE_NONNUMERIC)
writer.writerow(['Run', 'Beds', 'Time', 'Status'])
for r in results:
    writer.writerow(r)

The model runs every scenario (a bed capacity of 1 up to 20 beds) 10 times in order to gauge the effect of variance. This means the model performs 20 scenarios * 10 re-runs = 200 runs in total. In addition, it records an observation for every patient entering the model. The result is a pretty hefty data file: 250,000+ rows. Please note that in a real-life model the number of re-runs would most likely be higher.
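A quick back-of-the-envelope check of that row count (a sketch; exact counts vary per run since arrivals are random):

```python
# rough expected number of rows in model_results.csv
mean_interarrival = (0.1 + 12 + 5) / 3        # hours; mean of the triangular distribution
patients_per_run = 24 * 365 / mean_interarrival  # patients arriving in one 365-day run
total_rows = patients_per_run * 20 * 10          # 20 bed scenarios * 10 re-runs
print(int(patients_per_run), int(total_rows))
```

With a mean interarrival time of about 5.7 hours, each run generates roughly 1,500 patient records, and 200 runs produce on the order of 300,000 rows, consistent with the 250,000+ figure above.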

The results will be analyzed using R. The model outputs and saves its results in the following style:

results <- read.csv("model_results.csv")
head(results)

##   Run Beds   Time Status
## 1   1    1 12.667      0
## 2   1    1 16.555      0
## 3   1    1 22.673      0
## 4   1    1 25.184      0
## 5   1    1 30.294      0
## 6   1    1  5.532      1

To analyze the results, we have to aggregate them by day and week, so we first derive the day and week from the Time column. Since the time is in hours, the day and week can be determined as follows:

results$day <- floor(results$Time/24)
results$week <- floor(results$Time/24/7)
head(results)

##   Run Beds   Time Status day week
## 1   1    1 12.667      0   0    0
## 2   1    1 16.555      0   0    0
## 3   1    1 22.673      0   0    0
## 4   1    1 25.184      0   1    0
## 5   1    1 30.294      0   1    0
## 6   1    1  5.532      1   0    0
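The same hour-to-day/week conversion, sketched in Python for cross-checking (floor division by 24 hours per day and 168 hours per week):

```python
def day_and_week(time_in_hours):
    """Return (day, week) indices for a simulation time given in hours."""
    day = int(time_in_hours // 24)
    week = int(time_in_hours // (24 * 7))
    return day, week

# cross-check against the rows shown above
print(day_and_week(12.667))  # (0, 0): first day, first week
print(day_and_week(25.184))  # (1, 0): second day, still week 0
```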

Next, let's summarize the data per day for each bed capacity scenario and each re-run.

library(plyr)
summary_run_day <- ddply(results, .(Run, Beds, day, week), summarize,
                         servicelevel = mean(Status))

Now, to summarize the re-run data, we extract the mean, minimum and maximum servicelevel per day for each bed capacity configuration.

summary_day <- ddply(summary_run_day, .(Beds, day, week), summarize,
                     mean = mean(servicelevel),
                     min = min(servicelevel),
                     max = max(servicelevel))
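For readers without plyr at hand, the same two aggregation steps can be mirrored in plain Python (a sketch; the summarize function and its dict-based row format are illustrative, with column names following the CSV written by the model):

```python
from collections import defaultdict

def summarize(rows):
    """rows: list of dicts with keys Run, Beds, Time, Status.
    Returns {(beds, day, week): (mean, min, max)} of the per-run daily service level."""
    # step 1: daily service level per run = mean admittance status that day
    daily = defaultdict(list)
    for r in rows:
        day = int(r['Time'] // 24)
        week = int(r['Time'] // 168)
        daily[(r['Run'], r['Beds'], day, week)].append(r['Status'])
    per_run = {k: sum(v) / len(v) for k, v in daily.items()}

    # step 2: mean/min/max over the re-runs for each (beds, day, week) group
    grouped = defaultdict(list)
    for (run, beds, day, week), sl in per_run.items():
        grouped[(beds, day, week)].append(sl)
    return {k: (sum(v) / len(v), min(v), max(v)) for k, v in grouped.items()}
```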

On to the visualization! For visualization purposes we aggregate the data on a weekly basis and then plot the results using ggplot2.

summary_week <- ddply(summary_day, .(Beds, week), summarise,
                      mean = mean(mean), max = max(max), min = min(min))

library(ggplot2)
ggplot(summary_week, aes(x = week, y = mean, ymin = min, ymax = max, colour = mean)) +
    geom_ribbon(colour = NA, fill = "red", alpha = 0.2) +
    geom_line(size = 1) +
    scale_colour_gradient(low = "red", high = "green") +
    facet_wrap(~Beds) +
    theme(legend.position = "bottom") +
    ylab("Servicelevel") + xlab("Week")

The above plot shows the servicelevel for each bed capacity configuration. The line shows the mean servicelevel, and the ribbon indicates the effect of variation: its upper and lower boundaries correspond to the maximum and minimum servicelevel calculated on a daily basis. As the graph clearly shows, the current configuration of 10 beds just won't cut it when aiming for a servicelevel of around 100%. The proposed bed capacity of 12 beds would be a major step in the right direction, but extremely busy days can still push the servicelevel below 100%. To be completely safe, a bed capacity of 13 beds could be considered.

As this model only tests the effect of one variable, bed capacity, it can easily be interpreted visually. However, when multiple variables (e.g. nurse/doctor utilization, patient-to-nurse ratio, acute/sub-acute beds) are to be analyzed and the optimal configuration has to be found, it becomes difficult to analyze the results visually. In this situation, linear/binary/integer programming can offer a way to find the optimal configuration among all the scenarios that have been generated. With this methodology, an optimization objective could, for example, be to minimize costs while adhering to the model constraints, e.g. a patient-to-nurse ratio of 2, always at least two doctors on duty, and a minimum servicelevel of 99.5%. An example of building a linear/binary/integer programming model in R can be found in a previous post.
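As a minimal illustration of that idea (a sketch with made-up service levels and costs, not output from the model above): when only one variable is free, picking the cheapest configuration that meets a servicelevel constraint reduces to a simple search over the simulated scenarios; an LP/MIP solver only becomes necessary once several interacting variables and constraints are involved.

```python
def cheapest_config(servicelevels, cost_per_bed, min_servicelevel):
    """servicelevels: {beds: simulated mean servicelevel}.
    Returns the lowest-cost bed count meeting the constraint, or None."""
    feasible = [b for b, sl in servicelevels.items() if sl >= min_servicelevel]
    if not feasible:
        return None
    return min(feasible, key=lambda b: b * cost_per_bed)

# hypothetical servicelevels per bed count (NOT taken from the simulation results)
sl = {10: 0.962, 11: 0.981, 12: 0.993, 13: 0.999, 14: 1.0}
print(cheapest_config(sl, cost_per_bed=500, min_servicelevel=0.995))  # -> 13
```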

Please note that this simulation example is meant for illustration purposes only and abstracts away a whole lot of important influential factors, e.g. day/night variation, available personnel, and the process flow within the ED.

The post Estimating required hospital bed capacity appeared first on FishyOperations.