Sunday 1 March 2015

Optimising services in the public sector

Improving the public sector for the benefit of citizens is, perhaps, the (quiet but constant) wish of citizens and the (occasional but loud) promise of politicians. Quite rightly so, especially in countries where considerable sums of public money are channelled into the various functions of the social state, such as education, healthcare and welfare services.
[Image: 'Winner' by Alessandro Capurso, under a CC license]

So, why not try to optimise the public sector, as any business would do with its core processes?

Interesting idea, probably not-that-new, certainly always tempting, but with its pitfalls. So, not so fast!

Optimisation, in the mathematical sense, is the selection of the best element(s) against a pre-determined set of criteria. This implies that, in order to optimise something, one needs data for its performance and cost parameters. In the typical scenario, one would have a large set of variables, with the corresponding datasets, and would need to select the target of the optimisation process, specifying any constraints that may apply to the variables. It sounds easy, but it's not.
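
To put the idea in symbols, a generic formulation would look roughly like this (placeholder notation only, nothing here is specific to any real public service: f is the target to be maximised, the x's are the variables one can play with and the g's are the constraints):

```latex
\begin{aligned}
\text{maximise}   \quad & f(x_1, \dots, x_n) \\
\text{subject to} \quad & g_j(x_1, \dots, x_n) \le b_j, \qquad j = 1, \dots, m \\
                        & x_i \ge 0, \qquad i = 1, \dots, n
\end{aligned}
```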

Let's visit a hypothetical example from the healthcare sector (understandably, a sector under constant study). There, the target variable for optimisation might be the overall patient capacity (to be maximised), with constraints on, say, total cost, total working hours per staff member, buildings and equipment. That would leave a lot of factors to play with, including shifts and the personnel balance per shift, treatment protocols, space usage, etc. Given sufficient data and the right tools (statistical and IT), a result will be reached. But would it be a feasible one? And if yes, would it be an acceptable one? The first question depends on whether all variables and constraints were taken into consideration and on whether the right questions were asked. The second is harder. For instance, one way of maximising patient capacity might be to minimise the space given to intensive care. That would allow the relocation of costs (intensive care units have high operating costs), personnel and equipment to other uses. However, would that be an acceptable choice? Reducing intensive care resources (used for the care of the very ill) beyond a certain point might increase mortality rates amongst patients, which would make it not-so-acceptable to the local society.
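
For a feel of the mechanics, here is a minimal sketch of such a problem as a linear programme, using scipy. Everything in it is invented for illustration: two bed types as the only decision variables, made-up coefficients for capacity, cost, staffing and space, and an arbitrary floor on intensive-care beds standing in for the 'acceptability' constraint.

```python
# A deliberately toy linear programme for the hospital example above.
# All numbers are invented for illustration; none of this is real healthcare data.
from scipy.optimize import linprog

# Decision variables: x = [general_ward_beds, intensive_care_beds]
# Target: maximise monthly patient capacity (linprog minimises, hence the minus signs).
capacity_per_bed = [4.0, 1.5]              # patients per month per bed (hypothetical)
c = [-v for v in capacity_per_bed]

# Constraints, all hypothetical:
#   budget: 20*x0 + 80*x1  <= 10_000  (cost, in k-euro)
#   nurses: 0.5*x0 + 2*x1  <= 400     (staff available)
#   space:  1*x0 + 3*x1    <= 600     (room units)
A_ub = [[20.0, 80.0],
        [0.5,   2.0],
        [1.0,   3.0]]
b_ub = [10_000, 400, 600]

# 'Acceptability' constraint: keep at least 40 intensive-care beds
# (expressed as -x1 <= -40 for the <= form linprog expects).
A_ub.append([0.0, -1.0])
b_ub.append(-40)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("bed mix:", res.x)                   # optimal number of beds of each type
print("capacity:", -res.fun)               # maximised patient capacity
```

Dropping that last constraint reproduces the uncomfortable 'optimal' answer described above: intensive care goes to zero and capacity looks great on paper.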

To be fair, given that an increasing number of processes in the public sector are facilitated or monitored by IT systems, it is clear that, nowadays, ever more data is produced that could aid optimisation (in the mathematical sense). Statistical and IT tools are also in place and are getting increasingly refined. So attempting further optimisations only makes sense.

However, the real challenge of optimisation in big systems lies in a good understanding of the system to be optimised, of the optimisation methodology and of the physical meaning of its outputs. In other words, it needs knowledge and expertise (and patience)! To a layman, even the target of an optimisation is not necessarily straightforward. For instance, should we try to optimise a single public service or the entire corresponding sector (e.g., a single hospital or the public healthcare sector)? And what are the right target variables to optimise? (e.g., should it be patient capacity, patient recovery and discharge time, treatment cost per patient, patient satisfaction or community satisfaction?)

Understanding the output of a mathematical process, applying it in real life and monitoring the outcome in order to confirm or reject whether the expected effect was achieved is a follow-up challenge to the above. Statistical analyses can be tricky (e.g., Simpson's paradox, as has been demonstrated numerous times), and that becomes even trickier when politics or public opinion are also involved :)
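
As a reminder of how easily aggregated figures can mislead, here is a small, self-contained illustration of Simpson's paradox, using textbook-style figures rather than data from any real hospital: hospital A is better within every severity group, yet looks worse overall, simply because it takes on far more of the severe cases.

```python
# A small illustration of Simpson's paradox with textbook-style figures
# (not real data): hospital A is better within each severity group,
# yet looks worse overall because it treats far more severe cases.
cases = {
    # (hospital, severity): (recovered, treated)
    ("A", "mild"):   (81, 87),
    ("A", "severe"): (192, 263),
    ("B", "mild"):   (234, 270),
    ("B", "severe"): (55, 80),
}

for hospital in ("A", "B"):
    recovered = sum(r for (h, _), (r, _t) in cases.items() if h == hospital)
    treated = sum(t for (h, _), (_r, t) in cases.items() if h == hospital)
    print(f"{hospital} overall: {recovered / treated:.0%}")
    for severity in ("mild", "severe"):
        r, t = cases[(hospital, severity)]
        print(f"  {severity}: {r / t:.0%}")
```

Which hospital is 'better' then depends entirely on whether one adjusts for the case mix, and it is easy to imagine how such a number could be argued either way once it reaches public debate.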

So, just because it is difficult, should we abandon the idea of optimising the services of the public sector?

No. Anything that can bring more real value to the citizens needs to be explored. Especially when resources are not that plentiful. It could be done in a reasonably slow, step-wise fashion, carefully selecting constraints so as not to produce detrimental effects on society, and giving reasonable time for consultation and reflection. It could be done at a small, pilot scale at first and then expanded to similar systems. After all, improving just a single parameter (say, the quality of employees) while keeping all others the same would still be an improvement. Mathematically, such approaches may or may not make sense, but politically and socially they seem reasonable and may prove acceptable. There is certainly merit in bringing science to the optimisation of such big processes, but there is still a lot to be learnt there.

In the meantime, maybe we should not fixate on the idea of full-scale optimisation, even if that would be - theoretically - possible. It is important to understand that we are unlikely to reach the optimal point in a single go, within a reasonable timeframe and at a reasonable (monetary and non-monetary) cost. And that, even if we did reach that point, not all of us would necessarily like it. And that, even if all of us did like it, changing conditions - due to external factors such as the fluctuating global and local economy, shifts in regional demographics or the need to tackle the effects of a big natural disaster - would force the system out of the optimal zone at some point.

At any rate, despite the many challenges that arise when using data and maths to fuel policy development, optimisation seems to be a step in the right direction. Provided that we know what our targets truly are (because numbers, really, can't show us that)...
