How To Fix The NHS IT Problem
Have you seen the current budget for NHS IT? How @*#!ing much? And that is just for one country. Now, apparently, it is a massive project of a kind never undertaken before, with ambitious goals etc. etc. Rubbish. It is a very small copy of something we call the internet. It is based on an (on average) modest few megabytes of data per patient, mostly held locally to that patient by his doctor. Up the chain from that would be a handful of consultants in his local area. If the case is rare then there might be a consultant from outside the region entirely looking at it.
So, what's the problem? Apparently it has to hold many different types of things for the patient. Yeah, right. All of these types of things will be media of some kind, inevitably displayed on a Windows XP box in the nurse's office. Apparently the database will be massive. Again, rubbish. Most people probably have more megabytes of photos on their social network (which operates for free). There are severe implications for the security of the data. Again, can I say online banking?
The web can do everything they want, though it really shouldn't be over the public internet. But the technology underlying it is free, open and secure: far more so than any system a contracted company is going to write could hope to be. The system has to be scalable and reliable. Check. Secure yet open. Check. Hideously complicated, likely to ignore the needs of its users, liable to failure and overpriced? Nope, fails on that one. I have done database and comms work for a living since I were a little seedling. Looking at the loads and distribution, this stuff can be done for next to nowt (in Government IT budget terms). I could do the job for ten per cent of what they are going to pay, and the vast majority of that would go on buying an island and building a lair inside a fake volcano with an opening top.
Just as a suggestion, each doctor's office could have a small server holding all their patient records. This could connect up the chain to the nearest couple of hospitals. They could connect up to the regional trust. They could connect up to a central distribution and backup utility. When a consultant wants to access a patient's record, a request goes down to the doctor's server to keep them informed of updates, and in turn notes and scans get sent back down to the doctor. The number of users is TINY in practice. A nurse or doctor will see a patient every fifteen minutes. The practice might have 6 doctors, 6 nurses, 2 receptionists and 20,000+ patients. That is still only 14 staff, however many patients are on the books. They can only access detailed records at 14 * 4 per hour: a workload of less than 60 hits per hour!
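For the sceptical, the arithmetic above fits in a few lines. The staffing figures are my illustrative guesses from the paragraph above, not NHS numbers:

```python
# Back-of-envelope load on a single GP practice server,
# using the illustrative staffing figures from the text (assumptions, not data).
doctors, nurses, receptionists = 6, 6, 2
staff = doctors + nurses + receptionists        # 14 people who touch records
fetches_per_staff_per_hour = 4                  # one patient every fifteen minutes

hits_per_hour = staff * fetches_per_staff_per_hour
print(hits_per_hour)                            # 56 - under one detailed fetch a minute
```

Any database written this century will serve that load from a box under the receptionist's desk.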
They might be multi-megabyte hits, but I believe a major FTP download site handled thousands of simultaneous connections on a 200MHz processor for years. Sure, that is a lot of bandwidth, but gigabit networking is commonplace now. At the end of each session they might put in a few notes and test results per patient. The network will not notice this in any way. Not even on the upload to registered consultants for that patient. The replication of data up to the next link in the chain will be negligible on modern bandwidth. Major data from scans etc. could come down the link from the consultants much more slowly and still get to the surgery before the patients can book in to see the results.
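To put numbers on "the network will not notice": take a generous 5MB average record and the 56 hits an hour worked out above. Both figures are assumptions for illustration, not measurements:

```python
# Fraction of a gigabit LAN consumed by record fetches.
# Record size and hit rate are illustrative assumptions only.
hits_per_hour = 56
record_size_mb = 5                              # assumed average payload per fetch
traffic_mb_per_hour = hits_per_hour * record_size_mb

gigabit_mb_per_hour = 1000 / 8 * 3600           # 1 Gbit/s expressed in MB per hour
utilisation = traffic_mb_per_hour / gigabit_mb_per_hour
print(f"{utilisation:.4%} of the link")
```

Well under a tenth of a per cent of the link, which is why the replication traffic up the chain is a rounding error too.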
The link from hospital to regional trust needs to be bigger of course, but then there are few of them. The link from regional trust to backup centre needs to be, err, DHL (I might be channelling Cringely here!). Fill terabyte hard drives with compressed updates to your data, put them in a big foam parcel and send them next day. Seriously, a pigeon with an SD card is faster than any possible bandwidth. Only send stuff across the distribution network that needs to go that way: a consultant in a different region taking records for a new patient, that kind of thing. This kind of data isn't urgent realtime data; it can take an hour or two to go from the scanner to the consultant in the normal course of testing. It is only the super-urgent cases where it needs to be with them immediately, and in that case shouldn't someone NEAR the patient be dealing with it? Even in the few cases where that is not possible, you can already get hi-res images to people via email on dodgy DSL connections. A dedicated network can do the same, especially without the burdens the actual internet faces, such as iPlayer traffic and spam.
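The courier maths is easy to check. Here is a sketch with assumed numbers: one terabyte posted overnight against a 10 Mbit/s dedicated line (both figures picked purely for illustration):

```python
# Effective bandwidth of shipping a drive versus streaming the data.
# Drive size, delivery time and line speed are all illustrative assumptions.
drive_tb = 1
delivery_hours = 24                             # next-day parcel
courier_mbit_s = drive_tb * 8_000_000 / (delivery_hours * 3600)  # 1 TB = 8,000,000 Mbit

line_mbit_s = 10                                # assumed dedicated line speed
print(f"courier: {courier_mbit_s:.0f} Mbit/s, line: {line_mbit_s} Mbit/s")
```

Roughly 90 Mbit/s of sustained throughput from a foam parcel, and it scales linearly: post ten drives in the same box and the line never catches up.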
In the worst case, very small country practices might be disadvantaged in all this, but I think that is an excellent excuse to build infrastructure for that region, which should still leave 90% of the budget free for my island purchase, since part of that rural development budget should be coming from other programmes anyway. There is a thought amongst some respected analysts that people like Google, or even Microsoft (?!? MS Bodyscanner V1.0 - sorry we fried the patient, please insert a new one and try again), should get involved, because anyone is better than the normal government contractors. I think these analysts are thinking too big. This is a puny project. Get a data conversion specialist in at each region with dodgy file formats (me! me!), get a few network engineers networking and a few server specialists, and the whole thing could be up and running in no time. The real difference is the interface could be tuned to run how each practice wanted it. If you can tweak the options on your iGoogle, the same goes for this kind of thing. I do it every day on my systems; it isn't even difficult. The important thing is to look at what is wanted from the system and listen hard.
Some places will need more hardware than others, some more bandwidth. But the key thing is, this system has already been designed and built. A guy called Tim Berners-Lee thought about it once and came up with the solution. Ever since, the stability and workload of the web has been increasing, and the possibilities now go far beyond what modern medicine will need for a while. Standard file formats, standard databases and standard servers running a standard networking OS can match and beat anything we are going to get from the contractors. And we need only build it once and release it. The coders in India and China will polish it to take account of rural and low-connection areas, as it would be a must there. The high-tech stuff can take place in Sweden or Norway as normal. The security is already being worked on by real internet teams, so that work is done too. I really don't want to pay these companies billions any more for a guaranteed cock-up.