This week we are trialling an idea around a virtual ‘Security Watercooler’: 25–30 minute video calls to break up the day and showcase different viewpoints. Check out more about the concept here.
Today Robin Oldham was joined by ThinkCyber‘s Tim Ward to discuss security training and awareness. Here are the summary notes from the call:
90% of attacks start with the human user, and technical controls are never 100% effective. You need users themselves as a line of defence.
Lots of people are working from home in a way they may never have done before. Corporate support feels (and is) remote! This opens up routes such as fake inbound IT support calls: users won’t know this isn’t how IT support actually operates.
Phishing emails have spiked by over 600% since the end of February, many people are working in new environments and with lots of distractions - from tiny, or furry, co-workers!
Sending out additional training - especially when everyone is already busy and stressed - isn’t going to get the best results. Behavioural science also shows that ‘little and often’ is more effective than a big bang.
It takes, on average, 66 days to form a new habit, so this isn’t just a one-off ‘WFH security awareness’ session. To change a habit, start with a timely reminder, then make the desired behaviour easy to achieve, and add (positive) feedback loops.
It also helps to change, or reframe, the context. Positive achievement is often more powerful than negative avoidance, and sometimes the two are related: ‘training for a marathon’ has nothing directly to do with giving up smoking, but commitment towards the positive outcome of running a marathon can make it feel more natural to cut down on, and eventually give up, smoking.
Security awareness, in general, can be viewed as a ‘compliance thing’ (“we need to get X% of people through the training”). This can be countered by identifying how you demonstrate the value it delivers (averted incidents?)
Security awareness also tends to focus on ‘end users’ in general, where specific training may be required for other groups. In particular, ‘Developers’ and ‘Management and Leadership’ were highlighted, for different reasons: developers because their decisions and trade-offs are baked into the code they write; leadership because they set the cultural norms of an organisation (‘it is OK to get away with that because…’).
A big oil and gas company is regularly reporting awareness metrics down to the level of teams of 5-10 people (and right the way up to the board) across five or six key areas. And it’s open, so teams can see their relative performance.
Maladaptive response can also push users towards becoming an insider threat: either by tackling the undesirable behaviour in a way that makes them feel victimised and harbour resentment, or by not tackling it soon enough, so that the eventual intervention, and therefore the response to it, is escalated.
Mary Haigh at BAE Systems gave a talk about good and bad neighbourhoods. In the physical world, it is obvious when you are ‘safe’ and when you’re in a more ‘sketchy’ part of town. We don’t have that in the digital world, and it is trivial for the bad guys to replicate a bank’s ‘shop front’ online, whereas spoofing a high street branch would be costly.
For all the talk of ‘gamifying training’, no one on the call had come across a spontaneous, random reward for users who engaged in positive behaviours and consistently ‘do the right thing.’ Nor were they aware of any companies analysing for good behaviour.
Use this Google Form to register and receive joining instructions for future Security Watercooler sessions.
Tomorrow, Wednesday 1st, join us at 15:30 BST for DETECT: Adapting detection to deal with remote working, with F-Secure‘s Tim Orchard.