What will it take to make our information systems secure? The answer is simple: an alternative universe in which these systems aren’t designed and built by humans.
Take a look around. What can you find that was made by a human and doesn’t contain flaws? Even in great works of art – a Rembrandt portrait or a Mahler symphony – you can point to little details and say, that could have been done better.
Security exploits feed on flaws. Bad code is a security nightmare, but software doesn’t have to contain actual errors to be vulnerable. Many weaknesses stem from ignorance of the consequences of implementing certain features or capabilities. The code works exactly as designed and predicted – it’s just that no-one thought about how the software might be misused. Oh, it might get a code review, perhaps some static testing, maybe even some mild fuzzing, but that’s not going to catch everything. Often it doesn’t catch very much.
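A hypothetical sketch makes the point (the function and paths here are invented for illustration): a classic path-traversal weakness, where the code contains no bug at all – it does exactly what its designer specified.

```python
import os.path

def resolve(base_dir, requested):
    # The "feature", implemented exactly as specified:
    # join the base directory and the requested file name.
    # There is no coding error here - it works as designed.
    return os.path.normpath(os.path.join(base_dir, requested))

# Intended use:
print(resolve("/srv/files", "report.pdf"))        # /srv/files/report.pdf

# The misuse no-one thought about - a crafted name that
# walks right out of the base directory:
print(resolve("/srv/files", "../../etc/passwd"))  # /etc/passwd
```

A code review of `resolve` would find nothing wrong, because nothing is wrong with the code. The flaw lives in the gap between what was designed and how it can be abused.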
We can do better, for sure. Making commercial software vendors legally liable for the flaws in their code might help concentrate their minds on security a little more. A better approach would be to make security awareness a key element in how we teach and execute software development – make it intrinsic to every single part of that process, rather than an afterthought.
On that theme, I think we can draw a parallel with another human endeavour: flying.
When I was training for my pilot’s licence, I was struck by how there wasn’t a single activity that didn’t have a safety component to it. Everything you do as a pilot is shaped and guided by safety concerns. Even something as basic as slamming back into your seat after adjusting it to make sure it’s locked into position. (If you’re in a Cessna, you do it twice.) Or making a visual inspection of the fuel tanks even though the aircraft is equipped with fuel gauges. (Remind me to tell you the story of my night cross-country in a C150…)
Every other aspect of aviation, such as aircraft design and maintenance, air traffic control and so on, is similarly imbued with safety practices woven into the fabric of every procedure.
Does this mean the aviation world is perfect? No, of course not. Aircraft are designed, flown and maintained by humans. They make mistakes and are sometimes stupid or irresponsible. (In the early days of GPS, I knew several private pilots who regularly flew without charts or pre-flight planning. At least one stumbled right through Heathrow’s Class A airspace.) And commercial pressures result in companies cutting corners.
But aviation does at least start from a baseline of high safety awareness. No-one can claim that they’re not fully aware of how it should be done. And when a flaw causes a plane to crash, the manufacturer doesn’t just shrug and say, we’ll get a patch out in the next month or so. There are consequences. Failures to abide by best practices in safety are regarded as wrong. This is why, for example, the EC maintains a list of airlines banned from flying in Europe.
In aviation, safety is embedded in the culture. It permeates it at every level and every stage. Bad attitudes and practices are unacceptable. Safety is an integral part of the mindset.
We don’t have that yet in IT – and we desperately need it. And I have a hunch things are getting worse. Particularly on the Internet, there is a pressure to be first – to be the latest cool thing. In a recent guest blog on TechCrunch, Umi Shmilovici bemoans the ‘ship fast’ culture. He mentions Facebook’s ‘Done is better than perfect’ philosophy as an example of how net-speed competitiveness leads to poor products.
But his complaint is about quality. He doesn’t even mention security, which is one of the chief victims of this mentality.
The got-to-be-first mindset isn’t going to go away. Nor is the ‘ship now, worry later’ attitude – that’s been with us since software was first sold. So what we need, to counter this, is to embed security awareness into every part of the software life cycle, not as a quality control procedure, but as an integral function of every other task. It’s a big ask: it means changing how we teach software development, it means changing processes and – worst of all – it probably means longer development times and higher costs.
Firms wanting to keep costs low and beat the competition are unlikely to embrace this idea with much enthusiasm. The fact is, they really don’t care much about these kinds of issues. Just look at Facebook’s cavalier attitude to privacy. But laws that make firms legally liable for their software, backed with a few lawsuits, might help them to focus on the issue a little more.