Computer scientist Professor Harold Thimbleby shares a story from the NHS and his involvement in campaigning for a better understanding of technology in the law. Through his insights, we see the need for SHAPE and STEM to work together, balancing our humanity with our desire for new and improved technologies.
If you’ve ever got cross with your computer or frustrated when technology seems confusing, then you have an idea of what makes me tick. I am a professor of computer science, studying how to make computer systems safer and easier for people to use.
In 2015, my local hospital asked me to review a blood glucometer called the XceedPro, made by Abbott, a large international company. There was clearly something going on, as the hospital wanted an objective report on the glucometer, but wouldn’t tell me what sort of things they wanted to know or why.
A hospital blood glucometer like the XceedPro is used by nurses to measure blood glucose levels, especially to help decide on the right insulin dosage. I found an advert for the XceedPro that dressed it up to look like Einstein: it was portrayed as a very clever device that was easy and safe to use. I studied the XceedPro and wrote a report; it certainly wasn’t as clever as the advert implied, but it wasn’t unsafe.
A year later, I was invited to be an expert witness in a court case where the XceedPro had a leading role. For a computer scientist like me, expert witnessing is when STEM hits SHAPE: it is all about the social impact of the technology, and particularly the way it interacts with the law. Interesting SHAPE issues in this case were how people misunderstand computer evidence, and how the law currently entrenches these problems. Expert witnessing is very satisfying: the art and science details, the human priorities and the engineering facts collide, and the piles of evidence are like a jigsaw of treasure troves.
The hospital had suspended 73 nurses, and the police had decided to prosecute five for criminal negligence. It was to do with their not monitoring patients’ blood glucose; or rather, their alleged failure to monitor it.
Soon the defence and prosecution met to see whether there were any solutions other than going to court. The prosecution outlined their case. I argued that you might get one or two negligent nurses, but 73 all doing (or not doing, in this case) exactly the same thing was implausible. Had the prosecution looked at anyone managing the computer systems? If a lab technician, for instance, had had a grudge, it would have been very easy to create the impression that the nurses were negligent. Or had there been a cyberattack, or some other computer problem, corrupting the patient data?
No, said the prosecution; all the nurses were “in it together.” It was a conspiracy, they said. So, the court case began.
The Judge reminded us of the relevant law, including the Common Law presumption that computer evidence is correct. I was cross-examined almost every day for three weeks. It was a mixture of very scary and, in a weird way, quite fun: the prosecution barrister had no idea about computers, so I had a huge advantage over him. But my key problem was that although I had all the data, I did not know why some data was missing; the data alone couldn’t distinguish a cyberattack from a deliberate deletion of data by an employee, or anything else. That gave the barrister lots of opportunities to goad me for my ignorance.
The fact that the Common Law takes it for granted that computer data is correct means nobody had bothered to check the data. Nobody had asked why it was good enough evidence to drive a criminal prosecution. There was no curiosity. Before the court case itself, my attempts to find out more, beyond just the raw data I’d been given, had been stonewalled by the hospital. I know there are a lot of sensitivities when there is a police investigation, and there was also a very angry campaign group calling for an inquiry.
The prosecution’s explanation was that the nurses had failed to do their jobs. The absence of data, they said, meant that the nurses had failed to take blood glucose readings from their patients, putting patients at risk. I gave statistical evidence showing that the data was corrupted in a systematic way. I argued that the data didn’t show negligence by the nurses but rather systematic corruption of the data, of a kind that no nurse would have been able to cause.
The prosecution was stumped, and called Abbott’s Chief Engineer to testify.
The Chief Engineer said that he worked on the XceedPro system, and had visited the hospital. I tugged the defence barrister in front of me, and told her to ask “When?” The Chief Engineer gave some dates. I tugged the barrister again: “That’s when data disappeared!”
It turned out that the Chief Engineer had been called in by the hospital to tidy up the database, and that’s exactly what he had done. He had deleted data to tidy it up, and thus created the impression that many nurses had not done their job.
To cut a long story short, the Judge intervened and ruled that the prosecution evidence had no probative value. He asked the prosecution what they wanted to do. They admitted they now had no case. After directing the jury that there was no case to answer, the Judge said the famous words: “Release the prisoners!” The nurses had been behind bars (in fact, sheets of glass) for three weeks.
I was soon to meet Stephen Mason, a barrister who had co-authored a definitive textbook on computer evidence. He is an activist trying to get the Common Law presumption on computer evidence sorted out. This was the moment when I started to think about the systemic issues of computer science and the law, where SHAPE meets STEM.
There is some sense in the presumption. Computers are very complicated, and courts don’t understand them, so it is much easier to take computer evidence as correct than start an argument that nobody will understand. Unfortunately, if the computer evidence is not correct, then the defendants are in a very difficult position. They have to show the evidence is faulty when the prosecution can hide behind the presumption. The defendants will then struggle, and probably be accused of going on a fishing expedition trying to find something concrete to help them.
The worrying thing is that this case is not unique. There are many other computer problems across the NHS and, as the Post Office Horizon scandal proves, computers cause massive problems that people and courts don’t understand (see Harold’s Blog 2).
Stephen Mason and I soon joined a team of like-minded people, including professors of computer science, barristers, and others (a combination of STEM and SHAPE expertise) trying to get the Common Law presumption sorted out. From an answer to a question in Parliament in January 2024, we know some of our briefing papers are being reviewed by the Lord Chancellor. We’re making progress.
About the author
Prof Harold Thimbleby
I am a professor of computer science, but I’ve moved into legal activism, via this story in the NHS and the Post Office. I am now working with barristers, and I am proud that my work in the NHS has led to me being elected an honorary fellow of the Royal College of Physicians and of the Royal College of Physicians of Edinburgh. I’ve published 345 refereed papers and been invited to give over 300 conference keynotes and over 600 presentations and workshops around the world.