Security Awareness – Judge the Impact to Justify the Effort

November 10, 2015 by John G. Laskey


Security awareness is a top priority for most security officers, but making it measurable for business analysis can be hard work. In this piece, I want to consider how anyone responsible for these programs can measure their success: to convince doubters that the programs are needed, budget managers that they are worth the cost, and security auditors that they are effective.

I'm writing this series of articles as a former security officer in the British Civil Service. If you've read some of my other articles, I hope I've convinced you that there has been a universal leveling off of security practices, and that the lessons learnt in the world of government can be applied to the smallest business (and the other way around, too).

This was not always so. In the case of security education (as we call it now), things have changed a lot over the past few decades. When I joined the civil service in the late 1970s, the threats were very different, as were the means of warning people about them. In that age, security (with hindsight at least) was much easier to define. In physical terms, the menace to the UK government was mainly from guerilla groups, like the Irish Republican Army and Black September. In terms of documentary assets (there being few computers around), the threat was mostly from the Soviet Union, its allies and some home-based groups that overtly or covertly supported Soviet ideology.

The means of delivering the security messages about these threats was a talk to all new entrants by someone from the security department. This man – it was always a man – could be good at instilling feelings of dread in freshmen about the consequences of non-compliance. The occasional circulation of a staff security handbook with detachable pages (to facilitate amendments) would follow this up.

Though very unsophisticated and horribly dated, the effect of all this on this school leaver was quite profound. That I was actually pitted against the Red Army and shadowy urban guerillas gave a sort of thrill to my impressionable mind, though the less questioning attitudes to authority of that time probably made the security manager's tutorials an easier ride.

Measuring the effectiveness of security education was not a factor then. I believe that were any metrics to have been applied, they would have been expressed very simply in terms of the absence of terrorist attacks and successful (and known) espionage exploits.

I've written elsewhere about establishing an effective method of security awareness, but since we live in a more questioning age, I don't think it is sufficient to simply get on with a program of education. Management and budget holders have to be convinced, not only that this is necessary but also that what is done is effective and provides good value.

I believe the British Civil Service now recognizes this. In 2008, it introduced an 'Information Assurance Maturity Model' (IAMM) that requires government offices to demonstrate annual improvements to the effectiveness of their IA security measures. This partly came about as a result of the headline-making loss of personal data in 2007, though it also sits within the growing (and I think welcome) tide of demonstrable accountability and improvement to which British public services must now submit. The 1970s approach can still be glimpsed in TV Brit Com reruns.

One of the requirements of the IAMM covers information security training and awareness. For the first time, government offices have to assess their progress with this, and improvements to it, in formats that will be available via the UK Parliament to the public record. That is not the exclusive reporting line either: most government offices are also open to direct public questioning through Freedom of Information legislation.

We have come a long way from a handbook of security rules that staff were expected to be too scared to break. But how can we report on the effectiveness of any education, especially when as a security officer you probably are not primarily an educationalist and have other things to do?

Measuring the successful perception of a subject is something that can keep educationalists – and advertising executives – exercised for ages. Though technology and desktop systems are there to help us with our metrics, there is a temptation to simply apply expensive turnkey software that beams security awareness at staff. This sort of software will typically require staff to respond to their learning through a set of multiple-choice questions. It will then compile statistics about the range of responses, which management can analyze.

These sorts of packages have been available for some time. I have always been skeptical of them. They are often written in a style that is alien to your office and which may not have a sympathetic grasp of the threats that your customers face. Inevitably, they lack any feel for the office chain of command, which is particularly difficult to generalize. The British government has tried hard to create job roles for those with security responsibilities within its departments, but the independent-mindedness of these institutions makes that difficult. I have also found these packages tend to concentrate on generalities like password construction and to include unnecessary theorizing about 'confidentiality, integrity and availability'. This is the language of information security specialists, not staff who handle everyday corporate data. In one case, the choice of graphics was rather off-putting too, resembling a childish-looking game of 'crazy golf'. This did not underpin a serious message and could have exposed the security office to ridicule had it been adopted.

Another security education favorite I looked upon skeptically was the inclusion of gimmicks in awareness drives, such as pens and coasters. My own doubts aside, it is probably indisputable that the effectiveness of these novelties cannot be measured. They might prompt staff to think about a message but cannot teach them ways of doing things or be made accountable statements of security policy. I think you will tell me if I am overlooking how an 'advertising'-style security awareness campaign might change behavior, but my experience is that the enthusiasm put into producing gimmicks is never proportionate to the response from their target audience.

So what measurements can we apply to an education program that will not also require buying in expensive and possibly ill-fitting turnkey products? How can the measurements themselves help improve the product?

Sometimes security officers have no choice but to give maximum publicity to an event that poses a clear and present threat to their organization's security. These efforts can be time-consuming without being measurable. But I think it is sometimes necessary just to show a management board and CEO that steps are being taken to tell staff about such threats. However, designing an ongoing security education program should be a regular undertaking, with messages prioritized by the current threats to the organization. This assumes that all threats are captured from top to bottom of the organization, regularly managed, and plainly visible to the security officer.

This is the ideal. The reality is a mixture of regular security features to be produced in a digestible and busy-reader sort of way, alongside the occasional one-off response to a threat.

In both cases, my view is that the messages really do need to be crafted by an organization's own security officers, if not entirely from scratch then with help from a network of contacts and mentors who have the same objectives. This ensures messages are made relevant to staff and do not become just an echo chamber for the sorts of security headlines that we see almost every day in the national news. It also helps to ensure that the messages staff receive are relevant to the threats the organization is most concerned about. That should also help to highlight the valuable work of your security team, an important consideration when times are lean and cuts in programs are sought.

The canvas upon which we can present security education is really very wide. Though skeptical of turnkey security education products, I am all for the use of desktop networks to present the materials. Some organizations will already have the skills needed to present the material and record the results of any associated questionnaires.

That said, I would be cautious about relying on online returns alone to present a picture of the organization's security readiness. Staff will exchange the 'right' answers around the water-cooler, while many will find the right excuses not to complete any online quizzing on time (or at all). Better, I think, to agree with the company auditors (or equivalent) upon a minimum number of responses from which to draw your conclusions about the effectiveness of security knowledge. There should be no expectation of a 100% return upon which to base this. I have always found that a close alliance with qualified auditors can help security officers here. Auditors are often called upon to assess the objectivity of evidence. They should be able to provide a professional view on what constitutes a sufficient quorum of staff to satisfy managers that a selected slice is representative of the whole organization.
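As a rough illustration of what a 'sufficient quorum' might look like, the standard sample-size formula with a finite-population correction can be sketched as below. This is my own sketch, not a prescribed method – it assumes simple random sampling and a worst-case response spread, assumptions your auditors would want to confirm before signing off on any number.

```python
import math

def quorum_size(population: int, margin: float = 0.05,
                confidence_z: float = 1.96, p: float = 0.5) -> int:
    """Minimum questionnaire responses for a sample to be treated as
    representative of the whole organization.

    Uses the standard sample-size formula (z-score for the desired
    confidence level, margin of error, worst-case proportion p = 0.5)
    with a finite-population correction for smaller offices.
    """
    # Sample size for an effectively infinite population
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    # Finite-population correction: smaller offices need fewer responses
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# e.g. an office of 800 staff, 95% confidence, +/-5% margin of error
print(quorum_size(800))
```

Note how the correction works in your favor: an office of 800 needs roughly 260 responses, not the 385 or so an unbounded population would demand, which makes the "no 100% return" position above much easier to defend to managers.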

Another, more effective assessment requires a little more time to set up and relies on soft skills, which are not always associated with security officer roles. This can be through a focus group approach, either within a Town Hall meeting style of arrangement or to a smaller group of selected staff. Here, the security officer can direct a line of questioning to cover specific concerns – including specific threats – as a way of getting a fair sounding on how awareness programs are hitting the mark. It also gives some useful first-hand feedback as to what might be changed to improve future content. Again, the numbers of staff being questioned and the timing of these questioning sessions can be varied depending upon an organization's requirements.

This approach can be considered alone or alongside the more number-crunching methodologies available through a questionnaire. Time is of course valuable and many security staff might be drawn towards a hands-off approach to education and awareness materials. There is however, a risk that security will then seem rather remote. A big advantage of conducting regular meetings with selected groups of staff is that it humanizes the subject, allows an exchange of views and ensures that lines of security reporting are collaborative and are not associated with alarm or failure.


Finally, it is necessary to present some convincing - but not overpowering - metrics that will convince senior managers that they can worry less (I said in an earlier piece that one of the most effective roles a security officer can play is to reassure their own managers that everything is under control). I caution against too much work being put into analytical reports. I have been quite confused by some of the more colorful charts and graphical representations, so I'm sure senior managers will have been too (though they are not always very quick to admit this). Obvious elements of such metrics might be the number of staff who have received a security briefing within, say, a twelve-month period, and the numbers and types of security breaches attributable to oversight or (much rarer in my experience) deliberate and malicious actions. This is important: no general security education will be enough to prevent human error or catastrophic and unseen events. The metrics will only really take flight when fixed periods can be compared (but ensure the presentation does not get too static either, since the categorization of some incidents may change as technology evolves). Therefore, start plotting data on security incidents against the throughput of your security awareness and training programs; ensure these programs change in response to changes in the threats; and ensure that there is effective and adequate feedback from staff about their understanding.
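The period-on-period comparison described above can be kept very simple. A minimal sketch of the idea follows – the field names and the one-line summary format are my own illustration, not a prescribed reporting standard, and real figures would come from your incident log and training records:

```python
from dataclasses import dataclass

@dataclass
class PeriodMetrics:
    period: str                # e.g. "2014" or "2014-Q3"
    staff_briefed: int         # staff who received a security briefing
    headcount: int             # total staff in scope for the period
    incidents_oversight: int   # breaches attributable to oversight
    incidents_malicious: int   # deliberate/malicious actions (rarer)

    @property
    def briefing_coverage(self) -> float:
        return self.staff_briefed / self.headcount

def compare(prev: PeriodMetrics, curr: PeriodMetrics) -> str:
    """One-line, board-friendly summary of movement between two fixed periods."""
    cov_change = (curr.briefing_coverage - prev.briefing_coverage) * 100
    inc_change = (curr.incidents_oversight + curr.incidents_malicious) - (
        prev.incidents_oversight + prev.incidents_malicious)
    return (f"{curr.period}: briefing coverage {curr.briefing_coverage:.0%} "
            f"({cov_change:+.0f} pts), incidents {inc_change:+d} vs {prev.period}")

print(compare(
    PeriodMetrics("2014", staff_briefed=520, headcount=800,
                  incidents_oversight=14, incidents_malicious=1),
    PeriodMetrics("2015", staff_briefed=680, headcount=800,
                  incidents_oversight=9, incidents_malicious=1),
))
```

Keeping oversight and malicious incidents as separate fields preserves the distinction made above, and recording the period label alongside each figure makes it easy to re-bucket old incidents if categorization changes as technology evolves.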

John G. Laskey

John Laskey is a US-based security consultant who previously worked in the British government, where he was responsible for securing systems and advising senior managers about major programs. In the US, John has taught the ISO 27001 standard and is now helping develop and market new InfoSec products and services. He is a member of ISSA (New England Chapter).