Intelligent Emergency Care

Dr. Charles House, Medical Director, Medicine Clinical Board, University College London Hospital (UCLH)

Driving quality and performance improvement through a reflective cycle means measuring actual performance against a capability, understanding the gap, and then adjusting the inputs (be they skills, materials, tasks or processes) at the next iteration or cycle, in order to close the gap between the two.

At the time of writing, for emergency admissions to UCLH we have a clear performance standard to meet: 95% of patients either admitted or discharged from the Emergency Department (ED) within 4 hours. On a daily basis we understand only too well what the gap is; we know by how far we have come up short of the 95% mark.
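The arithmetic behind that daily gap is simple enough to sketch. The snippet below is a minimal illustration, not a description of UCLH's reporting systems; it assumes only that each attendance carries an arrival and a departure timestamp.

```python
from datetime import timedelta

FOUR_HOURS = timedelta(hours=4)
STANDARD = 0.95  # the 95% target

def four_hour_performance(attendances):
    """attendances: one day's list of (arrival, departure) datetimes.
    Returns the proportion seen within 4 hours and the gap to target."""
    within = sum(1 for arrival, departure in attendances
                 if departure - arrival <= FOUR_HOURS)
    performance = within / len(attendances)
    # A positive gap means the day fell short of the 95% mark.
    return performance, STANDARD - performance
```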

For some components of the overall process, centrally mandated NHS standards exist against which we measure performance, for example Time to Initial Assessment and Time to Treatment. Processes, pathways and staffing levels are tweaked and flexed in response to underperformance in the relevant areas, in a continuing attempt to close the gap between actual and desired performance.

For other aspects of care and interventions along the patient pathway, we set our own internal standards and expectations. Some of these, such as the turnaround time for blood test results to become available, can be agreed among a relatively small number of stakeholders and measured with relative ease. Others involve more teams and are subject to less sophisticated data capture, for instance the time taken from referral for a patient to be seen by a specialist team.
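Tracking such an internal standard is a very small computation once the timestamps are captured. The sketch below is hypothetical: the sixty-minute target and the request/report field names are illustrative assumptions, not UCLH's agreed figures.

```python
from datetime import timedelta

def turnaround_compliance(results, target=timedelta(minutes=60)):
    """results: list of (requested, reported) datetime pairs, e.g. for
    blood tests. Returns the proportion meeting the agreed target."""
    on_time = sum(1 for requested, reported in results
                  if reported - requested <= target)
    return on_time / len(results)
```

The harder cases the paragraph mentions, such as referral-to-review times, differ not in the arithmetic but in whether the two timestamps are reliably recorded at all.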

So, while some steps of a care pathway can be, and indeed are, measured and monitored, it is nevertheless hard to take systematic, higher-order action to improve the overall process so that the performance gap is always being reduced. Daily variance in performance suggests a process that is not “in control”, with insufficient damping of variation between best and worst performance.
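The phrase “in control” comes from statistical process control. A minimal sketch of the standard three-sigma control limits is below; it treats each day's figure as a single observation (an individuals-style chart), whereas a proper p-chart would also account for how many patients attended each day.

```python
import statistics

def control_limits(daily_performance):
    """daily_performance: a run of daily proportions within 4 hours.
    Returns (lower, upper) three-sigma limits; days falling outside
    them point to special-cause variation rather than routine noise."""
    mean = statistics.mean(daily_performance)
    sigma = statistics.stdev(daily_performance)
    return mean - 3 * sigma, mean + 3 * sigma
```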

Our challenge is to understand how best to address this, to reduce variation and improve overall performance. One approach is to be clear about what is within our control and to ask teams (for instance the ED team or the radiology department) to concentrate on these specific aspects of the pathway, rather than expending time and energy on factors outside their control, things which only somebody else can fix.

In this way, the Imaging team, say, can look at their own performance. They can consider the time taken to perform and report a CT scan on a patient from ED and take steps to address any gap between actual performance and the agreed standard.

Breaking down performance by area, such as Majors (seeing sicker patients) and the Urgent Treatment Centre (UTC, seeing minor injuries and ailments), allows higher-level oversight and coordination of effort. A performance gap in Majors, which is more dependent on specialist team input and bed availability, needs a different set of fixes from a gap in the UTC, where most patients are seen and discharged by the ED team within the department.
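Computationally this is a straightforward grouping of the same four-hour measure, sketched below under the same assumptions as before, with an area label attached to each attendance.

```python
from collections import defaultdict
from datetime import timedelta

FOUR_HOURS = timedelta(hours=4)

def performance_by_area(attendances):
    """attendances: list of (area, arrival, departure) tuples, where
    area is e.g. "Majors" or "UTC". Returns each area's proportion
    within 4 hours, so each gap can be examined on its own terms."""
    seen = defaultdict(int)
    within = defaultdict(int)
    for area, arrival, departure in attendances:
        seen[area] += 1
        if departure - arrival <= FOUR_HOURS:
            within[area] += 1
    return {area: within[area] / seen[area] for area in seen}
```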

This sort of analysis of performance by area allows the agenda to move away from viewing individual service components in silos, towards a focus on process and teamwork. Such a shift may also be driven by analysis of outcomes. We might compare, for instance, performance for patients who are admitted for inpatient care against those who are discharged home from the ED.

Daily monitoring and assessment of performance includes Breach Analysis, whereby each patient who spends longer than 4 hours in the department has their care pathway analysed. Using an established algorithm, all such breaches are attributed to a cause, such as “ED long wait”, “Bed delay”, “Imaging delay” and “Specialty review delay”. Although this enables individual cases to be studied and lessons learnt, it carries a significant downside in attributing individual accountability (which human nature often translates into blame) in a complex environment with multiple interdependencies.
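The established algorithm itself is not set out here, but a hypothetical sketch conveys the shape of rule-based attribution: each breaching episode is tested against a fixed priority order of delays and assigned a single headline cause. All field names and the one-hour threshold below are illustrative assumptions.

```python
from datetime import timedelta

THRESHOLD = timedelta(hours=1)  # illustrative, not the real rule

def attribute_breach(episode):
    """episode: dict of optional timestamps for one breaching patient,
    e.g. bed_requested, bed_allocated, imaging_requested,
    imaging_reported, referral_sent, specialist_seen.
    Returns a single headline cause, checked in a fixed priority order."""
    def delay(start, end):
        if start in episode and end in episode:
            return episode[end] - episode[start]
        return timedelta(0)

    if delay("bed_requested", "bed_allocated") > THRESHOLD:
        return "Bed delay"
    if delay("imaging_requested", "imaging_reported") > THRESHOLD:
        return "Imaging delay"
    if delay("referral_sent", "specialist_seen") > THRESHOLD:
        return "Specialty review delay"
    return "ED long wait"  # default when no downstream delay dominates
```

Even this toy version shows why the approach grates: a pathway with several interacting delays is compressed into a single label, and that label reads as an assignment of accountability.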

Perhaps a decade ago, when daily breaches of the 4 hour standard would routinely be counted in single figures, such an approach had its merits. In pressured, current-day practice it can all too easily become an exercise in avoiding blame for numerous breaches, and it may exacerbate the risk of staff, working hard with limited resources, feeling bullied or harassed in the workplace. The challenge now is for UCLH to reconsider how performance is analysed, in order to feel sure that our data translates into useful information and that our processes enable intelligent healthcare.