This is an excerpt from SWIPED by Adam K Levin, now out in paperback.
A strong corporate culture is a work in progress, constantly evolving. It stays ahead of the curve because it never stops applying the processes that make it successful—the consequence of clear leadership and a culture where employees feel invested in their work is that they take ownership of the tasks assigned to them. Something like the Sony hack—where the enemy is well armed, fully weaponized, and in war mode—may not be avoidable, but a state of readiness, predicated on a healthy corporate culture that puts security first, is the only way such an attack can be properly contained and managed. A study released by the Ponemon Institute last year found that the average cost of an enterprise-level breach in 2013 was $3.5 million, and the most recent figure is $3.8 million. If you need something concrete to believe that data breaches can be an extinction-level event for many companies, a figure like that should do the trick. But if you approach data security in a more holistic way, the threat of breaches can actually be a moneymaker.
While it should be clear that the time to change the way businesses approach data security was the day before yesterday, what may not be as obvious is the fact that the framework for a solution already exists. It was first discussed in the 1990s, in a slightly different guise, by Ontario’s now-former information and privacy commissioner Ann Cavoukian. She is the progenitor of “privacy by design.”
At the very beginning of privacy’s emergence as a profit center, when companies began mining personal data and repackaging it to retailers, websites, advertisers, and marketing companies, Cavoukian discerned a limit to what consumers would tolerate; they would quickly understand that “free” services were merely a way to separate them from their personal information for use in marketing campaigns aimed right back at them. She also rightly saw that companies offering customers privacy options would become de rigueur, a foresight that even privacy-hostile Facebook eventually had to make real, after spending years denying there was such a thing as privacy. When Facebook users got privacy toggles, it was clear the privacy-by-design idea had legs.
Twenty years later, privacy by design has become a term of art used in marketing meetings, and while not everyone sees the matter in precisely the same way, a company’s approach to the issue of consumer privacy is now something that gets figured out in the planning process and not just tacked on after the fact. That’s what the market wants.
Taking a page out of Cavoukian’s playbook, I floated the idea of security by design, which started getting marketed in earnest after the megabreaches of the past couple of years.
Here are Cavoukian’s seven principles of privacy by design, which can easily be adapted for the security-by-design concept:
- Be proactive, not reactive; focus on preventative, not remedial. The company “does not wait for privacy risks to materialize, nor does it offer remedies for resolving privacy infractions once they have occurred—it aims to prevent them from occurring.” Security by design would focus on eliminating the risks associated with storing third-party information.
- Privacy should be the default setting. “If an individual does nothing, their privacy still remains intact. No action is required” on the part of the individual to protect his or her privacy. It is built into the system. Ditto with security by design: Consumers should not have to worry about the security of their data when they make a transaction.
- Privacy should be embedded into the design and architecture of IT systems and business practices, “not bolted on as an add-on.” Security must be part of the design process, as well.
- A positive-sum, not zero-sum, approach avoids “false dichotomies, such as privacy vs. security, demonstrating that it is possible to have both.”
- End-to-end security is “embedded into the system prior to the first element of information being collected, and extends throughout the entire lifecycle of the data involved, from start to finish. This ensures that at the end of the process, all data are securely destroyed, in a timely fashion.” This point would remain virtually unchanged from Cavoukian’s original concept.
- Visibility/invisibility and transparency/opacity: the privacy-by-design model says that companies need to tell consumers exactly what they are going to do with the information they collect. While this works for privacy by design, it would put a giant target, if you’ll excuse the pun, on a company that touts its security. There’s no greater magnet for a hacker than a good challenge. Security by design requires invisibility and opacity.
- Respect for the consumer: As with privacy by design, security by design “requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice and empowering user-friendly options.”
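For readers who build software, the “default setting” principle above has a direct translation into code. The sketch below is a minimal, hypothetical illustration (the class and field names are invented, not drawn from any real product): every protective option starts in its safest state, so a user who does nothing is still protected and must actively opt out rather than opt in.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    # Privacy by default: each protective option starts ON.
    # The user must opt OUT of protection, never opt IN to it.
    profile_public: bool = False       # profile hidden unless the user shares it
    share_with_partners: bool = False  # no third-party data sharing by default
    encrypt_at_rest: bool = True       # stored data encrypted by default
    retain_days: int = 30              # data destroyed after 30 days (end-to-end lifecycle)

# A user who "does nothing" gets the protective defaults:
settings = AccountSettings()
assert settings.profile_public is False
assert settings.share_with_partners is False
assert settings.encrypt_at_rest is True
```

The same pattern covers the “end-to-end” principle: by attaching a retention limit to the data at the moment it is collected, secure destruction is built into the design rather than bolted on later.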
Security by design speaks to the culture question. It’s not lip service. You can’t say you’re doing security by design and leave it at that. Hackers will continue to get better at hacking, just as companies will continue to get better at fighting them off. That’s part of the design. Both large- and small-business owners still need to be prepared for a breach and its aftermath.
Excerpted from Swiped: How to Protect Yourself in a World Full of Scammers, Phishers, and Identity Thieves by Adam Levin. Copyright © 2016. Available from PublicAffairs, an imprint of Perseus Books, LLC, a subsidiary of Hachette Book Group, Inc.