eu·lo·gy, noun, plural -gies. 1. a speech or writing in praise of a person or thing, especially a set oration in honor of a deceased person. 2. high praise or commendation.
I don't do this a lot, because I try to keep this blog positive and constructive, but occasionally I have to speak up and call BS on what is just blatant lunacy. I'll try to keep my criticisms constructive here.
The problem is this: people are once again falling into that rut of blaming the users for making bad security decisions, all the while having created, sustained, and grown an enablement culture that drastically abstracts users from the impact of those decisions. Plainly put: if the users don't feel the pain of their bad decisions, then they have no incentive to make a change. This is basic psychology.
Unfortunately, whining about users and the state of the industry is apparently en vogue again. It's not like we haven't been down this road a few dozen times already. Much of the negativity started a few months ago with a handful of sec-twits complaining about how security didn't understand business, blah blah blah, the end. They started advocating for something that many of us have been doing for more than a decade: working with the business to get them to understand security and weave it into the DNA of business operations. However, that whining morphed into something else that was, for a while, extremely negative... mostly, though, it's just cliché at this point... Jack Daniel got part of the answer to this insanity right in his post "InfoSec's misunderstanding of business," when he said:
"If you want to improve security in your organization, you need to understand how your organization works, not how it should work. You need to know what feeds it and what scares it. Sadly, that may have no relation to the business your organization is in."I say this is only part of the answer because it only addresses business processes, and it doesn't speak (fully) to the human aspect of the problem. It doesn't address the human paradox gap; people are no closer to the impact of their actions because security people understand how the business operates. That's merely the first step.
The next batch of insanity that made the rounds recently was the phenomenally arrogant and condescending piece "We Are Infosec Professionals - Who the Hell Are You?" Now, I certainly understand the source of angst from which his vehemence stems. After all, with all the media coverage of various breaches, you'd think that security people are woefully incompetent. Unfortunately, that ends up being true a lot more of the time than he seems willing to admit.
I've interviewed so-called "security professionals" over the years, and if there's one thing I've found, it's that at least three-quarters of them grossly overstate their skills and competency. Certifications like the CISSP certainly haven't helped, nor has the US Government (especially the Department of Defense), which churns out thousands of "security" people who still have trouble turning their PCs on in the morning. But I digress... (that's a whole different holy war that's been reignited for the bajillionth time)
The bottom line is that "security" is absolutely responsible for these various breaches and incidents... just not for the reasons people might think... it's because of the failure to understand how the business operates and thinks... AND because, once again, we've done NOTHING to connect peoples' actions to the impact. We've created an enablement culture that grossly disincentivizes people from making good decisions.
Jericho does a much better job answering this article, so I direct you to his rebuttal over here:
"Rebuttal: We Are InfoSec Professionals... Not The Beatles"
This brings me to my last point: a pair of posts from a couple of industry stalwarts that, yet again, aim vitriol at users for doing exactly what we've enabled and encouraged them to do (it's sad, really).
First up is Dave Shackleford's post "Infosec: Designing for IDGAF" in which he claims that people simply don't care.
"Imagine your users being totally, completely honest with you when you talk about the need for security. In a world not colored by political correctness and "business etiquette", many of them would probably tell you (regarding security): I Don't Give A F***."
I agree (to a point) that the status quo is broken, but I strongly disagree with his conclusions. His "IDGAF mentality" assertion is just plain wrong. People do care, and do want to make the right decisions. However, we've done nothing to enable them to make good decisions. Instead, we've done just the opposite. We've enabled them to make bad decisions by almost completely abstracting them from the impact of those choices. Dave advocates building security in - which is great! - but, unfortunately, his approach advocates further removing people from the consequences of their actions. This is the wrong approach, and it will only result in more of the SSDD (same stuff, different day) that we see. In fact, it will actually entrench the status quo, causing even greater problems.
Writing about Shack's post, Rothman gets it right in saying "Your employees are not paid to worry about security. They are paid to do their jobs, and more often than not security gets in the way of their actual responsibilities." Unfortunately, he then goes on to agree with the same faulty conclusion, saying "What I really liked about Dave's post is his focus on taking many of the decisions out of the user's hands, stopping them from doing stupid things. Basically protecting them from themselves. As we've been saying for years, this involves locking down devices and adopting a default deny stand wherever you can."
People! This is exactly what we've been doing for the last 15+ years! How's that worked out? Oh, that's right. IT HASN'T.
It's time to quit trying the same old stupid donkey tricks. What we're doing has failed, and will continue to fail. The rules of this game mean we lose - every. single. time. We need to change those rules, and fast. Specifically, we need to:
1) Include security responsibilities in all job descriptions.
2) Tie security performance into employee performance reviews.
3) Include disciplinary actions for all security incidents.
If you want people to stop making bad decisions, then it's not enough to offer annual awareness training, nor is it even remotely reasonable to simply take options away from people. The computing environment has changed dramatically in the last 15 years. There's NO WAY we can control every aspect of users' online interactions. The DoD has tried using draconian security measures and look how pwn3d they've been! If the military can't fully control their users, then what hope do we have of wrangling users in the private sector? None.
Adopting a resiliency mentality focused on survivability, underpinned by detection and response capabilities in addition to traditional protective capabilities, is imperative. It's long overdue that we accept this new paradigm and move it forward. It saddens me that we've been talking about these topics for a few years now and yet several major players in the industry still don't seem to get it. I guess it really will take a generation...
So, formally recognise and evaluate information stewardship duties, get a handle on what is actually happening in our environments, design and build resilience by default?
Effectively stop making security and accountability an afterthought?
Yeah, I'm in.
Yes, that's exactly it! Thanks for commenting.
Great post. I like how you have sticks and carrots in your suggestions. Can you think of some carrot examples for "security performance" in reviews?
Perhaps granting cash bonuses for the absence of undesirable behavior? For example:
App dev teams: no post-production sev-1 security bugs
Ops: no sev-1 incidents or audit findings
Line managers: no security incidents on their team
Line workers: no personal security incidents, e.g. DLP alert, phish victim, device malware, professionally embarrassing social media.
It would be interesting to determine how much cash it would take for each job class to affect behavior:
CEO: $100k
VP: $25k
Manager: $10k
Line: $5k?
Glass of wine thought of the night: what if you fired your infosec team and offered the expense as bonuses for the above :-?
I like it! :) I've maintained for a couple years now that an infosec team is purely an outgrowth of the enablement culture we've created. It doesn't make things better, but rather worse. Instead of making people accountable for their own actions/decisions, infosec teams have borne the brunt of the negative impact, giving decision-makers no real incentive to make better decisions.
An excellent post, Mr. Ben; very well described. Security is very necessary in an organization.
Interesting rebuttal. Apparently I struck a nerve there, which is usually good, when smart people have discussions about how to fix things. I won't defend myself; our opinions differ. I do not think most people care about security in their computing environment, and I think that's because security as an OPTION is inconvenient. Inconvenience always loses, and plenty of use cases have shown this. The only thing I'll argue is that I am not, in fact, blaming users. Quite the opposite, really - I'm accepting them. Warts and all. And I'm not fond of blaming US with this whole enablement schtick, either - c'mon, man, this industry (personal computing in general) has only existed for 20 or so years. The only thing we should be "blaming ourselves" for is not getting in front of this earlier, and building better security into the "user" side of things. You can say we have done "default deny" all you want, but that is not really true. And awareness is definitely not the answer. Those programs are usually pretty weak, and everyone knows it. I like the "carrot/stick" idea. We need more good ideas, and I'm pretty open to whatever will work, without trying to fundamentally change innate human behavior patterns, because I am pretty sure that won't work. Hopefully we'll get a chance to hang out sometime soon, and discuss this over drinks. Or not. :)
Thanks for the comment, Shack.
I think the average person wants to do "the right thing." I think that safe computing is one of those "right things" that people would love to do, if it didn't interfere with doing their jobs. As my former (now deceased) VP of OpsSec at AOL used to say: "Your job, first and foremost, is to make your direct manager happy." Security generally has no stake in that game (today).
As a result, rather than pushing to make safe/secure computing part of everyone's duties, we've instead said "users are too stupid! we'll just take care of it!" in the typical arrogant IT Crowd mentality. This tells people "don't care about security" and as a result, they don't.
We expect people to know how to operate their vehicles safely and, despite all the lousy drivers on the road, people are generally able to "do the right thing" when it comes to safely operating their vehicles. Why? Because there are repercussions for failing to do so. We need to find ways to instill that same level of consciousness and conscientiousness in the average computer user, too. Unfortunately, as I've noted in a previous post, I think it will take a generation for that change to be effected.