In the last couple of years, there’s been a move to give names to security vulnerabilities that would otherwise be too arcane to discuss in the mainstream media. For example, back in 2014, Heartbleed, “a security bug in the OpenSSL cryptography library, which is a widely used implementation of the Transport Layer Security (TLS) protocol”, had not only a name but a logo.

The recent media storm around the so-called ‘Spectre’ and ‘Meltdown’ vulnerabilities shows how effective this approach is. It also helps that the names sound a little like James Bond science fiction.

In this article, Zeynep Tufekci argues that these security vulnerabilities are rooted in our collective desire for speed:

We have built the digital world too rapidly. It was constructed layer upon layer, and many of the early layers were never meant to guard so many valuable things: our personal correspondence, our finances, the very infrastructure of our lives. Design shortcuts and other techniques for optimization — in particular, sacrificing security for speed or memory space — may have made sense when computers played a relatively small role in our lives. But those early layers are now emerging as enormous liabilities. The vulnerabilities announced last week have been around for decades, perhaps lurking unnoticed by anyone or perhaps long exploited.

Helpfully, she gives a layperson's explanation of what went wrong with these two security vulnerabilities:

Almost all modern microprocessors employ tricks to squeeze more performance out of a computer program. A common trick involves having the microprocessor predict what the program is about to do and start doing it before it has been asked to do it — say, fetching data from memory. In a way, modern microprocessors act like attentive butlers, pouring that second glass of wine before you knew you were going to ask for it.

But what if you weren’t going to ask for that wine? What if you were going to switch to port? No problem: The butler just dumps the mistaken glass and gets the port. Yes, some time has been wasted. But in the long run, as long as the overall amount of time gained by anticipating your needs exceeds the time lost, all is well.

Except all is not well. Imagine that you don’t want others to know about the details of the wine cellar. It turns out that by watching your butler’s movements, other people can infer a lot about the cellar. Information is revealed that would not have been had the butler patiently waited for each of your commands, rather than anticipating them. Almost all modern microprocessors make these butler movements, with their revealing traces, and hackers can take advantage.
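
To make that analogy a little more concrete for the programmers in the audience, here's a minimal sketch in C of the kind of code pattern that the Spectre variant 1 ("bounds-check bypass") attack abuses. The array names, sizes, and probe stride are my own illustrative assumptions, and the branch-training and cache-timing steps an attacker would need are only described in comments, not implemented:

```c
#include <stddef.h>
#include <stdint.h>

/*
 * Illustrative sketch only, not a working exploit: the array sizes, the
 * 512-byte probe stride, and the omitted training/timing steps are
 * assumptions made for the sake of the example.
 */

#define ARRAY1_SIZE 16

static uint8_t array1[ARRAY1_SIZE];   /* the "wine cellar": legitimate, in-bounds data */
static uint8_t array2[256 * 512];     /* probe array: one slot per possible byte value */

/* Victim code: the bounds check looks perfectly safe... */
static void victim(size_t x)
{
    if (x < ARRAY1_SIZE) {
        /*
         * ...but if the branch predictor has been trained to expect
         * "x is in bounds", the CPU may run this body speculatively with an
         * out-of-bounds x. The mis-speculated work is discarded -- the
         * mistakenly poured glass of wine is dumped -- yet the dependent
         * access to array2 leaves the secret byte's value encoded in which
         * cache line is now warm.
         */
        uint8_t value = array1[x];
        volatile uint8_t sink = array2[value * 512];
        (void)sink;
    }
}

int main(void)
{
    /*
     * An attacker would (1) call victim() repeatedly with in-bounds x to
     * train the branch predictor, (2) flush array2 from the cache, (3) call
     * victim() with an out-of-bounds x, and (4) time accesses to
     * array2[i * 512] for each i: the one fast access reveals the secret
     * byte. The cache-flush and high-resolution timing primitives needed
     * for steps 2 and 4 are omitted here.
     */
    victim(0);
    return 0;
}
```

The butler analogy holds up: nothing the speculative path computes is ever officially handed back to the program, but the cache lines it leaves warm are the butler's tell-tale movements.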

Right now, she argues, systems have to employ more and more tricks to squeeze performance out of hardware because the software we use is riddled with surveillance and spyware:

But the truth is that our computers are already quite fast. When they are slow for the end-user, it is often because of “bloatware”: badly written programs or advertising scripts that wreak havoc as they try to track your activity online. If we were to fix that problem, we would gain speed (and avoid threatening and needless surveillance of our behavior).

As things stand, we suffer through hack after hack, security failure after security failure. If commercial airplanes fell out of the sky regularly, we wouldn’t just shrug. We would invest in understanding flight dynamics, hold companies accountable that did not use established safety procedures, and dissect and learn from new incidents that caught us by surprise.

And indeed, with airplanes, we did all that. There is no reason we cannot do the same for safety and security of our digital systems.

Major vendors have been shipping patches over the past few weeks, since the vulnerabilities came to light. For-profit companies have limited resources, of course, and their code is proprietary and closed-source, so nobody outside those companies can step in and patch it for them. This means there'll be some devices that won't get the security updates at all, leaving end users in a tricky situation: their hardware is now almost worthless. So do they (a) keep on using it, crossing their fingers that nothing bad happens, or (b) bite the bullet and upgrade?

What I think the communities I’m part of could have done better is to shout loudly that there’s an option (c): open source software. No matter how old your hardware, the chances are that someone, somewhere, with the requisite skills will want to fix the vulnerabilities on that device.

Source: The New York Times