
What the Post Office Scandal teaches us about data and technology

Alex Hackett

Group Director of Digital

The national conversation surrounding ITV’s Mr Bates vs The Post Office has successfully galvanised the media and the House to right a historic wrong brought about by the Post Office’s Horizon IT system. Quite coincidentally, I have been reading Nick Wallis’s meticulously researched book The Great Post Office Scandal, which goes into forensic detail about every aspect of the story, including the procurement of the Horizon IT system from Fujitsu. Wallis outlines numerous failures in the platform’s development, the external procurement process and its data handling methods which, coupled with many other failings, ultimately caused this disaster. As someone who works with data and digital platform creation every day, I recognised many of the common digital pitfalls that individuals and organisations fall into before seeking our advice on the best route forward. In essence, these boil down to three key learnings:

Always interrogate data

Data can often be seen as a single point of truth, an objective marker that needs no interpretation beyond whether an increase or decrease in a given value is a positive or negative outcome (e.g. more visitors to a website is good, fewer is bad, etc.). I often find that people don’t look at data for what it is – one measure of information that requires interpretation, coordination with other data points, and interrogation in order to fully comprehend a bigger picture. Take the Horizon scandal, for example – because the underlying data which saw so many sub-postmasters jailed was entirely trusted, many innocent observers saw the initial increase in the number of arrests as a somewhat surprising but ultimately accepted phenomenon – clearly, this new technology was rooting out criminality which could not be adequately tracked and caught before now. This, of course, was a devastating assumption. True data analysis requires a holistic approach – to fully understand what is happening in any system, multiple data sets need to be interrogated at once to test a presumption and arrive at a robust working theory. One point of information is seldom enough to support a viable conclusion – always check that the majority of valid internal and external data points back up your understanding.
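The principle of never trusting a lone figure can be sketched in code. The function below is a hypothetical illustration (the name, tolerance and majority rule are my own assumptions, not anything from Horizon): a primary reading is only accepted when most independent sources agree with it within a tolerance.

```python
# Hypothetical sketch: before trusting a single figure, compare it against
# independent sources and only accept it when most of them agree.
def corroborated(primary: float, other_sources: list[float],
                 tolerance: float = 0.1) -> bool:
    """Return True if a majority of other sources fall within
    `tolerance` (as a fraction) of the primary reading."""
    if not other_sources:
        return False  # a single uncorroborated figure proves nothing
    agreeing = sum(
        1 for value in other_sources
        if abs(value - primary) <= tolerance * abs(primary)
    )
    return agreeing > len(other_sources) / 2

# A till total of 1,000 backed by two of three independent records:
print(corroborated(1000.0, [1010.0, 995.0, 1400.0]))  # True
# The same total contradicted by every independent record:
print(corroborated(1000.0, [1400.0, 1600.0]))  # False
```

The point is not the arithmetic but the habit: a conclusion drawn from one data point should fail by default until other evidence supports it.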

Build new technology on a firm foundation

According to Wallis’s research, the Horizon platform was built on a prototype system used to demonstrate the capability of the product to the government during a lengthy and ever-evolving procurement process. The numerous changes in scope and leadership within the project, coupled with failures within the internal development team, resulted in a truly dangerous system being rolled out nationwide. These sorts of catastrophic failures are inevitable when a system is so bloated and hacked together from different data entry points, coding languages and information structures. The development of Horizon is a warning to all those embarking on a complex development process – it is vital that a clear roadmap of functionality is established from the beginning with minimal deviation, and that accepted principles of data security and integrity are followed at all times. If good practice like this is adhered to, minor adjustments and enhancements can be incorporated without causing catastrophic failure. It’s always important, before embarking on any platform design, that all involved have a clear understanding of what the system is ultimately devised to achieve and how best to keep the most vital data secure and correct (especially when it involves personal or transaction data). Any system with a high level of complexity should be able to produce an accurate log of changes and updates, so that when disaster hits, errors can be fully understood and rectified with no loss of data.
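That final point – an accurate, replayable log of changes – is the easiest one to sketch. The toy class below is a minimal illustration of my own (not how Horizon worked): every change to a balance is recorded in an append-only history before it is applied, so the final state can always be reconstructed and checked.

```python
# Hypothetical sketch of an append-only change log: every update to a
# balance is recorded before it is applied, so history can be replayed
# and errors traced without any loss of data.
from dataclasses import dataclass, field

@dataclass
class AuditedAccount:
    balance: float = 0.0
    log: list = field(default_factory=list)  # append-only history

    def apply(self, amount: float, reason: str) -> None:
        """Record the change before applying it; history is never overwritten."""
        self.log.append({
            "delta": amount,
            "reason": reason,
            "balance_before": self.balance,
        })
        self.balance += amount

account = AuditedAccount()
account.apply(250.0, "cash sale")
account.apply(-40.0, "refund")
# Replaying every logged change must reproduce the current balance:
replayed = sum(entry["delta"] for entry in account.log)
print(replayed == account.balance)  # True
```

If the replayed total ever disagreed with the stored balance, you would know the system – not the operator – had gone wrong, and exactly where to look.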

The computer isn’t always right

In 2021, the British Computer Society called for a “reconsideration of the courts’ default presumption that computer data is correct”. This is of course vital in preventing the kind of assumptions that caused the miscarriages of justice seen in the Post Office scandal, but it’s also an important lesson for all those who are less technically literate – the data can just be rubbish.

Data is easily skewed, manipulated, incomplete, broken or downright wrong, often through no fault of anyone involved. Even when looking at basic analytics such as video views or website traffic, it’s always vital to ask yourself a few simple questions: what if this data has been miscalculated? Does this reading correlate with my expected outcomes, based on other data points and previous data? Is there another system or process I could use to qualify this data and ensure its accuracy? Code is fallible – written by people, for people – and mistakes do happen; a single one or zero in the wrong place can cause havoc at any scale.
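One of those questions – does this reading fit with previous data? – can even be automated. This is a crude sketch of my own devising, not a standard from any analytics product: flag a new figure for human interrogation when it sits far outside the recent historical range.

```python
# Hypothetical sketch: flag an analytics reading that deviates sharply
# from recent history, so it gets questioned rather than blindly accepted.
from statistics import mean, stdev

def looks_suspicious(history: list[float], reading: float,
                     threshold: float = 3.0) -> bool:
    """Return True if the reading lies more than `threshold` standard
    deviations from the historical mean (a crude sanity check)."""
    if len(history) < 2:
        return True  # not enough context to trust anything
    spread = stdev(history)
    if spread == 0:
        return reading != history[0]
    return abs(reading - mean(history)) > threshold * spread

daily_views = [1020.0, 980.0, 1100.0, 950.0, 1040.0]
print(looks_suspicious(daily_views, 1060.0))  # False: within normal range
print(looks_suspicious(daily_views, 9000.0))  # True: interrogate before reporting
```

A flagged reading isn’t necessarily wrong – it may be a genuinely viral day – but it should never pass into a report, or a prosecution, unquestioned.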

Knowing that the systems we create and the data we amass are often messy, and reliant on many hands coordinating their work as one, helps us to appreciate and accept that problems can and will arise – it’s how people understand these systems, double-check their data and qualify their theories that ensures nightmare scenarios like Horizon never get started.


