Consultant Outcomes

21 November 2014

Publishing consultant outcomes should be, well… simple, shouldn’t it? We have the data, we have guidance on what to publish and we have the means to publish it, but in between there’s a whole spectrum of ways to get this simple process wrong – not least getting our consultants’ backs up by failing to engage with them about their data first. To achieve genuinely trusted consultant outcomes we need an integrated approach: visually stimulating, well-designed, intelligent dashboards alongside transparent, de-coded clinical datasets that consultants will understand, recognise, sign up to and take note of. In the long term we believe this will create a clear call to action amongst our consultant body, enable informed patient choice and develop a transparent, positive culture of challenging each other within our own organisation.

As for selecting which metrics to publish, it’s a minefield – get it wrong and we might as well go home. There are those we would consider global and applicable to all clinicians: at its most basic this will include pure activity counts alongside performance against the national access targets (RTT, Cancer, etc.). Layered on top of this would be the nationally recognised outcome measures, including deaths, readmissions and length of stay, ensuring anything published internally is benchmarked and adjusted for caseload. Before external publication we would further adjust for casemix and ensure each metric is explained and accompanied by any balancing metrics.

Then there are the Royal College defined outcome measures, the zenith of consultant outcomes, the beacon, the gold standard to which we should adhere… the gold blend if we were talking coffee.  By publishing these we would be handing ourselves, our consultants and our patients the power to make the key decisions we need to, without scrabbling around in the dark.

Clearly the strength of publishing consultant outcomes will start to build as we identify, agree and produce local outcome measures and maintain robust data collection systems. However, a word of warning! At the point of identification we must also consider that the data relating to these metrics may not be comparable with that of our peers, owing to unsupported national collection systems. Fortunately, at EKBI we’re working on the solution to this too.

Further domains to consider would include patient experience and patient-reported outcome measures, and on top of the direct clinical metrics we would include cost details and Trust-defined efficiency measures to set the financial context.

At East Kent we have the software, capability and skills to produce dashboards of this quality, although the impact and use of this data will only be defined by the engagement of our clinicians – which of course brings us full circle. For the publishing of consultant outcomes to be a success, the data needs to be accurate and trusted; it’s very much a chicken-and-egg scenario. The organisation must also maintain a clear strategy for investigation, as without one we are likely to develop an industry of ad-hoc reporting and unhealthy challenge, ultimately resulting in an under-engaged clinical body – in fact, all the things we’re against. As such, we believe the success of dashboards can only be achieved if we provide consultants with regular sight of the activity recorded against them.

So the mantra we are adopting is this:

  • We will consult with our consultant body to identify the range of scenarios and platforms in which they wish to access their data.
  • We will build our repository around the patient administration dataset and in time add additional datasets including Clinical Applications and National Returns. The long term benefit should include the ability to provide self-service consultants’ log books.
  • To fully deliver sustainable data quality improvements a clear mechanism for feedback will be developed, ensuring that the data is corrected at source and learning is disseminated throughout the organisation.

Easy? The proof, as they say, will be in the pudding. Luckily, Christmas is on its way.


James Bennell

Sara Hughes