A major security flaw called CVE-2018-8897 was identified earlier this year in operating systems running on Intel processors, affecting Microsoft, Apple, and Linux alike. While you may not remember the name of this flaw, you will likely remember hearing about it. But neither its name nor what it does is really the point.
There is just too much information, and that makes for a hacker’s paradise.
For sure, CVE-2018-8897 is newsworthy, but that doesn’t make it remarkable. New flaws are discovered all the time, and these “major new glitches” with their James Bond villain-sounding names (Spectre, Ryzenfall, Fallout) are often quite dangerous to consumers, businesses, and governments.
There are many reasons for this, but here’s one that is too complicated to be newsworthy yet is a contributing factor to our culture’s pandemic of cyber insecurity: CVE-2018-8897 may have been caused by too much information. Specifically, the vulnerability seems to stem from either a lack of comprehension on the part of more than one engineer or a failure to read the security documentation Intel provides in a 4,844-page manual.
That’s a lot of pages, and mastering them takes a lot of work hours. Which raises a serious question: should the security requirements for hardware be buried in a document more than twice the length of Leo Tolstoy’s War and Peace and Anna Karenina combined? Is it any wonder CVE-2018-8897 was overlooked?
Words, Words, Words
While the Intel example is an extreme one, it’s hardly a one-off. The recently adopted version 1.1 of the NIST Framework for Improving Critical Infrastructure Cybersecurity comes in at a more modest 55 pages, a count that starts to swell once you add in the NIST Roadmap, the supplemental texts mapping Framework policies to existing NIST publications, and so on. The payment card industry’s (PCI) core compliance documentation is 139 pages, with an accompanying library of supplemental texts and updates, including a glossary of terms (24 pages), a summary of changes (3 pages), and a ‘quick’ reference guide (40 pages). And the list goes on.
There’s a tendency in the world of technology to take compliance with mind-bogglingly long texts for granted, even though we can safely assume not everyone reads them. The End User License Agreements accompanying a new iPhone in 2015 alone weighed in at 21,586 words, and that’s for a device known for its relative user-friendliness.
Most consumers click ‘I Agree’ and move on with their lives, but it becomes a more serious matter when, as in the Intel example, the unread text carries disturbing implications for cybersecurity.
How Did We Get Here?
Part of the blame for this goes to the ESIGN Act of 2000, which declared clicking a checkbox on an online form to be a legally binding form of assent.
ESIGN ushered in a kitchen-sink approach to tech policies and agreements vis-a-vis the potential misuse of products and services. The resulting bloat is due, at least in part, to the collision of two kinds of language: legalese (which must go into excruciating detail to eliminate ambiguity) and technology-speak (which is prone to massive amounts of jargon), both of which come at the expense of clarity.
But the jargon issue is just a symptom of a ‘hot potato’ cybersecurity culture where all parties involved try their utmost to minimize their own accountability rather than minimize exposure to threats.
No one wants to be left holding the potato when news of a major breach or compromise hits. It’s infinitely easier to bury the relevant information in an unreadable document and have a vendor, IT department, or policy compliance officer attest that they’ve not only read it but will follow it to the letter (that’s the legacy of ESIGN).
As any hacker will tell you, this approach hasn’t worked. There’s little use in having even the most detail-oriented gatekeepers if the security policies they’re meant to uphold aren’t immediately accessible and easy to understand. Policies that aren’t followed by every member of an organization will eventually fail to protect it. We’ve seen time and again that personnel are very often the weak link in security, and adding pages to PDF documents won’t do much to address that. And yes, engineers and software developers are just as likely to skim or skip documents as the rest of us.
This isn’t to say that security policies shouldn’t be documented; both legal and technical requirements demand it. But any enterprise-level cybersecurity strategy that relies solely on unreliable humans reading and obeying every word of a security policy is courting disaster.
Culture beats strategy, as the saying often attributed to Peter Drucker goes. Cybersecurity demands a more cooperative approach, business-to-business and peer-to-peer, with practical, real-world methods of training, education, and cyber hygiene always front and center.