InfoSec vs. Fast Food Nation

Many problems in infosec trace back to human activities and are consequently reflective of larger societal issues, often captured by the "fast food nation" and "age of ignorance" notions. Sadly, these characterizations ring true, as we now see played out in the BYOD movement, the so-called "consumerization" of IT, and the ongoing difficulty of keeping control of data.

What got the wheels turning for me was an article I read back in March on The New York Review of Books blog titled "Age of Ignorance". The piece pointedly laments what appears to be a rush toward idiocracy and away from a more golden time when intelligence, academia, and open-ended R&D were considered positives. In fact, tying this back into the security meme of my blog, it marvels at our society's failure to know even its own basic history - a failing pinned largely on extremism at both ends of the political spectrum, and one that represents a very 1984-like reality.

So, what does this have to do with infosec? Well, for one thing, it means we often get bogged down fighting the wrong fight. Rather than flailing in futility at all the various attacks with half-baked or long-since-deprecated approaches (e.g., AV, firewalls), we should probably be working with people to help them truly raise their intelligence about properly handling data. Unfortunately, this is not an easy task.

3 Practices to Encourage Improvement

To address these societal shortcomings, we must start by modeling appropriate behavior while actively educating people on the thought processes involved and the analyses that should be applied. And, when all is said and done, we must hold people accountable when they violate rules and decisions that have been rightly made.

1) Apply good risk analysis within discussions. It's imperative that we first model better behavior. A key part of this involves de-operationalizing infosec and bifurcating the former infosec organization into standard IT operations and a GRC program. The hallmark of this shift is to quit flogging technologies and solutions without first performing a reasonable risk analysis, and then to walk others through that analysis (a small illustrative sketch follows this list). It's this last part - actively educating people through routine discussion - that is perhaps the most important point. Doing this is nothing less than modeling the behavior we wish others to exhibit in the future.

2) Clearly articulate expectations. One of my favorite quotes from my Dad is "if you never communicate your expectations, then you shouldn't be disappointed when people don't meet them." If you don't clearly state what people are expected to do (and why!), then you're not creating an environment that will easily or readily achieve your definition of "success." There are many ways to communicate expectations, such as through policies, controls, awareness activities, assessments, audits, continuous monitoring, and so on. The key is that all of these should be turned into educational tools, presented in a manner designed not to attack people, but to help them and make them feel empowered to make good decisions. This might sound like a bunch of psycho-babble, but the basis is sound: if you want people to modify their behavior, then the change has to be easy, make sense, and benefit them in some way.

3) Establish an accountability culture. Part of maintaining a healthy learning culture means allowing people to experiment, innovate, and - yes - even fail. However, this does not mean allowing people to operate in a consequence-free environment. If there are rules in place (preferably properly vetted ones), then it's imperative that those rules be enforced, with infractions flagged and violators sanctioned. How your organization accomplishes this can vary widely, but it must be meaningful and have a deterrent effect. At the same time, we should also be looking for ways to hold up laudable practices, further modeling the desired future state - so long as we don't get so caught up in being positive that we overlook violations that threaten the stability, success, or security of the organization.
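To make point #1 a bit more concrete, here's a minimal, purely illustrative sketch of the kind of back-of-the-envelope risk discussion you might walk a colleague through. It uses the classic annualized loss expectancy formula (ALE = SLE x ARO); the scenario and every figure below are hypothetical, and this isn't a prescribed method - just one common way to frame the conversation.

    # Purely illustrative: a back-of-the-envelope risk calculation of the sort
    # you might talk a business owner through. All figures are hypothetical.

    def annualized_loss_expectancy(sle, aro):
        """Classic ALE = SLE (cost per incident) x ARO (incidents per year)."""
        return sle * aro

    # Hypothetical scenario: lost or stolen laptop holding customer data (a BYOD worry)
    ale_without_control = annualized_loss_expectancy(sle=50_000, aro=0.5)

    # Hypothetical mitigation: full-disk encryption shrinks the cost per incident
    control_cost = 10_000
    ale_with_control = annualized_loss_expectancy(sle=5_000, aro=0.5)

    print(f"ALE without control: ${ale_without_control:,.0f}")
    print(f"ALE with control:    ${ale_with_control:,.0f} (plus ${control_cost:,} per year for the control)")

    # The point isn't the arithmetic - it's walking someone through the assumptions
    # so they can apply the same reasoning the next time a decision comes up.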

I was tempted, in writing the subtitle above, to call these "simple steps" for improvement. However, when you think about it, these steps are anything but simple. Sure, the concepts are straightforward, but the execution is typically very difficult. As such, make sure to set reasonable expectations for yourself, your team, or your organization when undertaking such initiatives. Consider yourself an educator - but contrary to the old "those who can, do; those who can't, teach" insult, you are a "can do" educator who is active in your field.

Good luck!

Note: Updated 4/19 to correct a rather significant typo... if you don't communicate expectations, then you *shouldn't* be surprised when people don't meet them...
