Another day, another public hack: Today we learned that a “cyber-espionage operation” has used a previously unknown flaw in Adobe Flash to gain information from NATO governments and others. We also learned that 87% of Android devices are vulnerable to numerous known exploits because there’s no good business model for distributing patches on that platform.
That’s just in today’s news. Earlier this summer we found out that GM took five years (!) to fully patch a vulnerability in the OnStar system that was discovered by a research group at UC San Diego in early 2010. That group was led by Stefan Savage, one of a number of super-smart and accomplished friends from college who make me wonder what the hell I’ve been wasting my life on.
A little closer to home, since we work on IoT products, the Cardinal Peak team has written several posts about the security of IoT devices, and of course you don’t have to look too hard to find examples of those products being hacked, either. When I’m not losing sleep over my lack of success in the world, I lose sleep over the possibility that something engineered by our team will end up on the front page of Slashdot because of a hack.
All of this underscores one point: As an industry and a profession, we software and computer engineers need a fundamentally different and better approach to security. Whatever we’ve been doing sure isn’t working, and I’m not optimistic that more of the same will yield different results.
One thing that’s not helping is the evident glee with which some security researchers report flaws, mixed with a healthy dose of derision for the stupid engineers who let the bug through in the first place. For instance, after Fiat Chrysler had to recall 1.4 million Jeeps this summer because researchers showed the vehicles could be remotely controlled via an exploit in their Internet-connected entertainment systems, one of the researchers tweeted:
I wonder what is cheaper, designing secure cars or doing recalls?
— Charlie Miller (@0xcharlie) July 24, 2015
Let’s grant that the root cause of the Jeep hack was an engineering mistake. Regardless, the gloating isn’t useful. Given the prevalence of these hacks, mistakes are evidently far too easy to make at present, and once a flawed product ships, a recall is the only option. (Or, where the device is sufficiently connected, pushing a software fix via an over-the-air update, an option that evidently wasn’t available to Fiat Chrysler.)
Two weeks ago, I had the pleasure of attending the excellent Xperience conference in Boston, put on by Xively. One of the speakers was James Lyne from Sophos. He’s an engaging speaker, but he was gloating too: The basic gist of the talk was to show how easy it is to hack all these Internet-connected devices we’ve got. Boy, the engineers who design these things must be idiots!
But when asked what advice he would give executives who want to build secure connected devices, Lyne said only to “involve the security experts early in the process.”
That may be useful advice if, like Lyne, you work for a security consulting company. In an increasingly connected age, though, the rest of us need a solution that doesn’t involve getting a member of some high priesthood to approve the process by which we slaughter our cattle. We need to make it possible for regular, competent engineers to develop secure and safe products, and for the companies that employ them to have confidence that the products they’re shipping to their customers won’t harm those customers.
I don’t have a solution yet, but it’s something I’ve been thinking about a lot. The ultimate answer has to involve some combination of better tools, better building blocks, and better training for engineers.
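To make the “better building blocks” idea concrete, here’s a minimal sketch of what I mean, using Python and the real, widely used cryptography package (installed via pip install cryptography). Its Fernet recipe is one example of a misuse-resistant primitive: the library picks the cipher, generates a fresh IV for every message, and authenticates the ciphertext, so there are far fewer opportunities for a regular engineer to make the classic mistakes. The example itself is mine, not something from the talks above.

```python
# A "better building block": a misuse-resistant crypto recipe instead of
# raw primitives. Fernet handles cipher choice, IV generation, padding,
# and message authentication internally, so the classic errors (static
# IVs, unauthenticated encryption, home-grown key handling) are much
# harder to make.
from cryptography.fernet import Fernet, InvalidToken

# Generate the key once and store it in a secrets manager; never hard-code it.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt: the library generates a fresh IV and appends a MAC automatically.
token = f.encrypt(b"thermostat setpoint: 68F")

# Decrypt: a tampered or stale token raises InvalidToken rather than
# silently returning garbage.
try:
    plaintext = f.decrypt(token, ttl=3600)  # reject tokens older than 1 hour
    print(plaintext)
except InvalidToken:
    print("rejecting tampered or stale message")
```

The point isn’t this particular library. The point is that an API designed to be hard to misuse moves the security burden off every individual engineer and onto a small number of well-reviewed building blocks, which is exactly the direction our profession needs to go.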