Arguing Against the Absurd is Easy, But Not Helpful

Bruce Schneier would have us believe that security awareness training is pointless: people have inadequate incentive to change, so why waste the time, money, or energy? And, to a degree, he is certainly correct. The old-fashioned once-per-year computer-based training modules to which many (if not all) of us have been subjected are, in fact, completely worthless. After all, these training modules are a mere blip on the radar of one's life, with no foundation in reality, and they make no meaningful impact on how we conduct our jobs.

However, that is not the state of practice in the industry. Or, more specifically, it's not the leading-edge state of practice. Moreover, his comments ignore much of what we know about approaches, learning styles, incentives, and so on, based on research from the past few years.

Consider this comment that he makes:

"The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to their folk beliefs of security, rather than forcing them to learn new ones."

Here he's both right and wrong. Yes, the system should enforce desired attributes/qualities, such as password characteristics. However, you cannot generalize this sentiment to say that the human element should be excused, somehow magically eliminated from the decision path. I'm all for adapting security techniques and requirements to better match corporate culture, but at the same time we need to be doing more to educate and encourage people so that they'll make better decisions. This can be accomplished through creating a culture of accountability, instituting continuous education & awareness initiatives (vs. point-in-time or once-per-year sessions), and by adopting behavior modification research and practices to help improve the overall state of "folk beliefs." What is not acceptable, plausible, feasible, commercially reasonable, or legally defensible is simply throwing in the towel on human factors in hopes that system designers and developers can magically address all weaknesses at a technical level. There is far too much technology debt today to make this an even remotely reasonable or rational idea.
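To make the first half of that point concrete, here is a minimal sketch of what "the system should enforce desired attributes/qualities, such as password characteristics" might look like in code. The specific thresholds and character-class rules below are hypothetical illustrations, not anything from Schneier's piece or an actual standard; in practice they should flow from a business risk decision.

```python
import re

# Hypothetical policy values, chosen purely for illustration.
MIN_LENGTH = 12

def password_is_acceptable(password: str) -> bool:
    """Reject weak passwords at the system level, rather than relying
    solely on users having been trained to choose good ones."""
    if len(password) < MIN_LENGTH:
        return False
    # Require a mix of character classes (illustrative rule, not a standard).
    required_classes = [
        re.search(r"[a-z]", password),  # at least one lowercase letter
        re.search(r"[A-Z]", password),  # at least one uppercase letter
        re.search(r"[0-9]", password),  # at least one digit
    ]
    return all(required_classes)
```

The design point is that the check lives in the system, so it is applied every time, for every user, regardless of whether a given user ever absorbed any training. That complements, rather than replaces, the education and accountability measures discussed below.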

A Culture of Accountability

If you want to institute meaningful change, then it starts with accountability. If you have rules, then you must enforce them. If a rule doesn't make sense, then change it. If a rule does make sense, then make sure people are aware of it, why it exists, and that it's being followed. This is where most policy regimes fail. Look at all the policy exceptions that organizations grant on a regular, recurring basis. Doesn't this reflect policies that have fallen out of alignment with business needs? Overall, rules should be based on business risk decisions, and they should reinforce the behaviors and practices that are beneficial to the business and will help it survive and thrive. Anything else is superfluous nonsense. At the end of the day, though, you must consistently enforce policies, or you run the risk of invalidating them through non-enforcement (which can represent additional legal risk, among other things).

Continuous Education & Awareness

Education, training, and awareness activities cannot be limited to once-per-year instances. They must be ongoing and continuous. More importantly, they must be contextually relevant. For example, the best time to educate people about the potential (if not likely) positive and negative impact of their decisions in a given context is... when they're in that context! You would not expect a business analyst to understand the intricacies of firewall rule development or an overall access management program, but you would expect them to understand risk analysis and risk management processes as they pertain to their role. Operational risk management is heavily influenced by ICT risk, and thus it's imperative that these ICT risks be highlighted for decision-makers along the way in order to continuously and actively educate them about the interrelations and dependencies involved.

Behavior Modification

Last, but not least, we need to realize, acknowledge, and accept that our training and awareness programs have not been designed well, nor have they generally been designed by qualified people. All of those reeeeeeeaaaaalllllly boring video and/or video-interactive CBTs are such a drag, and they offer no hope of retention. According to a basic training course that I took last weekend, people will generally retain only 10% (or less) of what they are told in a single training instance. However, if you have them practice what they're being taught, such as through real-world scenarios, you can increase retention to 65%. This point is just one of many that we can learn from professionals with a background in psychology, sociology, and behavior modification. Working with these professionals can also help us create policy regimes that are better formulated to achieve broad conformance and compliance, as well as help us evolve toward a culture of accountability.

---
As noted up-front, Schneier is right, to a degree, that awareness training as it has historically been implemented is generally a waste of time and money. However, we cannot remove humans from the overall enterprise risk picture, and thus we must undertake measures to help them make better decisions. Progressive techniques for education and awareness can be adopted that achieve superior results, and that can help reduce some of the more egregious cases of bad decisions. At the same time, certainly, more secure systems and applications, such as ones that consistently enforce and reinforce underlying requirements, are also an important component of an effective operational risk management and risk reduction program. Let us, however, not be deceived by Bruce's use of composition/division and black-or-white logical fallacies. Security awareness training and education are essential and imperative if we have any hope of successfully reducing the impact of human risk on enterprise risk management.

About this Entry

This page contains a single entry by Ben Tomhave published on March 19, 2013 3:43 PM.
