Phil Venables

The Illusion of Choice: A Review

In the last post we talked about the challenges and opportunities of using individual and organizational incentives to ensure effective security risk management. This can be aided by the right design choices for organizational processes, structures, and communications.


This is a rich field, ranging from behavioral economics and psychology to design thinking and so-called nudge theory. Given that a sizable chunk of my recommended security and risk reading list is already about these topics, I was surprised how much I enjoyed Richard Shotton’s new book, which adds many new perspectives.



Shotton provides a great review of psychological biases, and even though it is oriented toward sales and marketing it contains a rich seam of applicability to security and risk management. It’s a short read that summarizes a huge field in an accessible way, with plenty of examples. So, in the style of my prior book review, I will take a walk through the book, covering its applicability to security.


1. Habit Formation

- Pick the right moment to break existing habits
- Don’t rely on motivation - create a cue
- Use an existing behavior to create a cue
- Make the behavior you’re trying to encourage easy
- Harness the power of uncertain rewards
- 3 most important tactics: repetition, repetition, repetition

A big part of an enterprise security program is to build the everyday muscle memory of activities and behaviors that lead to the right outcomes. This can range from establishing policy enforced in process through to creating “ambient controls” - where the environment is designed to enforce and support the required controls.


Another way of looking at this is how to make sure such behaviors become habits (the organizational muscle memory) and this book has many examples of how to do that. Particularly the notion that if you want some behavior then make it easy and rewarding and if you don’t want a behavior then make it hard and costly.


A good approach to change behavior is to find the right moment of change (“pick the right moment to break existing habits”). You can use natural organization change moments to reinforce or introduce new behaviors. One of the best moments for this is at certain levels of job promotion. The more senior levels where there is a degree of induction or training into that job level are very useful for this. In a prior organization we’d made some tremendous cultural inroads by having key parts of risk, security and crisis management training be included in the senior / executive training programs associated with those promotions. People were primed (because they’d just been told they were now “senior”) to want to be more responsible and so easily took on board these new habits and then modeled those behaviors when they went back into their expanded and more senior roles. There are many other moments in organizations when people’s roles change, new responsibilities are defined or rewards granted that we can latch onto to reinforce cultural risk management expectations.


2. Make it Easy

- Spend more time seeking out and eliminating friction
- Make the first step of the journey as easy as possible
- Consider reducing the amount of choice you offer
- Don’t mess with your audience’s world view
- Add friction if you want to reduce a behavior

There are plenty of lessons here for us as well. We’re all working toward making the secure path the easiest path. So, embed default security in processes and tools and provide default frameworks that implement necessary security controls so that people get security by design and default.


Many organizations are getting better at this, but one thing that needs to improve is making the first step of the journey as easy as possible. In many parts of security we’ve done a great job of making security easy if you adopt a whole framework or platform, but adopting that takes a mass migration. In many situations it’s important to provide an initial adoption step that gets people on the journey.


For example, suppose you have a software development team that could radically overhaul its security practices by adopting your new delivery pipeline and security tooling / frameworks, but this requires a wholesale migration of their build environment that might take a long time. Rather, consider the 80/20 of what you can give them now, from that tool set, that in-situ can not only reduce security risk but at the same time make their environment more conducive to a subsequent migration. This could range from packaging scripts and aligning their build system to pull some central packages, through to changing the last step of software deployment / packaging.


There are also plenty of situations where security teams can reduce the “activation energy” for wider changes by proposing some initial quick wins. For example, suppose you are working with a business unit that needs to do a broad review of who has access to what. This might be a lot of effort, including a lot of long-term effort to adopt a more automated, role-based means of control. They might resist doing this, or accept the need to do it but keep putting it off. In any case you can motivate action, for example, by showing egregious levels of privileges held by certain individuals or by showing toxic combinations of privilege that would break obviously important separation-of-duties requirements. They can do some quick tactical clean-up of these - easier to do and self-evidently important when faced with clear risk. The good thing here is that this removes some inertia, creates the self-fulfilling prophecy that they are now “an organization that fixes these types of things” and makes the harder problem seem more tractable. Even if none of that works out, you’ve at least reduced some risk, which is a win.
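A quick-win check of this kind can be quite simple. Here is a minimal sketch of flagging toxic privilege combinations; the entitlement names and separation-of-duties rules are hypothetical, not from any real system:

```python
# Hypothetical sketch of a "quick win" access review: flag toxic
# combinations of privileges that break separation-of-duties rules.
# The entitlement names below are illustrative assumptions.

TOXIC_PAIRS = [
    ("create_payment", "approve_payment"),    # maker vs. checker
    ("modify_code", "deploy_to_production"),  # developer vs. releaser
]

def find_toxic_combinations(user_entitlements):
    """Return {user: [offending pairs]} for users holding both halves
    of any separation-of-duties pair."""
    findings = {}
    for user, entitlements in user_entitlements.items():
        held = set(entitlements)
        offending = [pair for pair in TOXIC_PAIRS if held.issuperset(pair)]
        if offending:
            findings[user] = offending
    return findings

access = {
    "alice": ["create_payment", "approve_payment", "view_reports"],
    "bob":   ["create_payment", "view_reports"],
}
print(find_toxic_combinations(access))
# alice holds both halves of the maker/checker pair; bob is clean
```

Even a crude rule list like this surfaces the egregious cases that motivate the tactical clean-up, without waiting for the full role-based overhaul.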


3. Make it Difficult

- Use a two step approach to behavior change
- Apply the IKEA effect
- Make your customers know the effort you have gone to

As we make the secure path the easier path, it will be preferable to make it so easy, so embedded and so self-evidently natural that it becomes the path always taken. But there might also be a need to make the less desired path, well, less desirable by adding friction or other inconveniences.


You can hide less preferred approaches, deprecate items from standard frameworks, or have a few hoops added to their adoption. In cost-accounting driven environments you can even have the price of the least preferred option be increased as a means of discouragement. My favorite, from a prior organization, was to completely allocate the costs of legacy systems (to be decommissioned) only among the final business units using that legacy infrastructure. Once a tipping point had been surpassed it was then a race to get off such infrastructure lest your business unit was saddled with the remaining total cost.
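The tipping-point dynamic in that cost allocation is easy to see with a little arithmetic. This is an illustrative sketch with hypothetical numbers, not figures from the original program: the fixed cost of the legacy platform is split only among the business units still on it, so each migration raises the bill for everyone who remains.

```python
# Illustrative cost-allocation tipping point (hypothetical numbers):
# the legacy platform's fixed cost is shared only by remaining units,
# so the per-unit charge climbs as others migrate away.

def per_unit_cost(total_annual_cost, remaining_units):
    """Annual charge borne by each business unit still on the platform."""
    if remaining_units == 0:
        return 0.0  # platform fully decommissioned
    return total_annual_cost / remaining_units

legacy_cost = 1_200_000  # assumed fixed annual cost of the legacy estate
for remaining in (6, 3, 1):
    print(remaining, per_unit_cost(legacy_cost, remaining))
# 6 units pay 200k each; the last unit left is saddled with the full 1.2m
```

Once a few units leave, the rising charge makes the race to exit self-reinforcing, which is exactly the discouragement effect described above.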


In this chapter, Shotton covers a more sales-oriented angle: if you put effort into using or adopting something, you are psychologically inclined to believe it was the best thing to do. The aspect that makes a lot of sense for security is letting people know the amount of effort you have gone to. This is the classic problem that no one knows about security until it fails. So why not tell people more clearly what is going on around them, for example: “in helping your team launch this product we expended X hours to remove Y bugs and helped your team stay on time and in budget, while doing this by using tool Z we reduced the likelihood of recurrence of similar classes of issues so that your next product release can happen 25% faster.”


This is really about making your organization, customers, partners and vendors more invested in the activities they need to do to assure security. Everything from a federated security program, where they own their own security teams, through to clear(er) accountability from the Board so that business leaders "pull" help from security rather than security always having to "push" this into those areas.


4. The Generation Effect

- Apply the generation effect laterally not just literally
- Does asking questions harness the generation effect?
- Use your design to make people work a little bit

The generation effect is another sales and marketing technique, where the messaging encourages a prospective customer to mentally fill in the blanks.



This is most applicable, naturally, in security training and education work, but I’ve also seen extremely high-risk situations where the generation effect can be used to add pause to confirmations that people might otherwise become desensitized to over time. For example, when confirming the amounts and currencies of extremely high-value payments, even when there are multiple-party approval controls, it can make sense to pose a question to the approver rather than present a simple Yes / No approval dialog box - such as asking people to retype the 3-letter currency code of the intended payment amount.
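A generation-effect confirmation like that can be sketched in a few lines. This is a minimal illustration with hypothetical names and flow, not a real payment API: approval succeeds only if the approver actively reproduces the currency code, forcing them to re-read the payment details.

```python
# Hypothetical generation-effect confirmation: instead of Yes/No,
# the approver must retype the payment's 3-letter currency code.

def confirm_payment(amount, currency, typed_currency):
    """Approve only if the approver reproduced the currency code.

    In a real flow the amount and currency would be displayed to the
    approver alongside the retype prompt; here they are parameters.
    """
    if typed_currency.strip().upper() != currency.upper():
        return False  # mismatch: approver was likely on autopilot
    return True

print(confirm_payment(25_000_000, "JPY", "jpy"))  # code reproduced: approve
print(confirm_payment(25_000_000, "JPY", "USD"))  # mismatch: reject, re-prompt
```

The small amount of generated effort is the point: it breaks the habitual click-through that a plain Yes / No dialog invites.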


5. The Keats Heuristic

- Harness rhyme more regularly to boost believability
- Harness rhyme to improve memorability
- Alliteration advances accuracy
- Enhance the fluency of your brand name
- Tailor the font you use to your task

Again, this is more oriented toward sales and marketing but it is very helpful to use similar constructs in your own internal security messaging whether it is to the Board, other leadership or the wider organization. I find it helpful to brand certain programs. In a prior organization we had a threat hunting program called, imaginatively enough, Hunter. We had a fairly predictable but visually appealing logo. In the same family of programs we had: Sonar (attack surface discovery), Radar (risk identification) and Harbinger (long range strategic threat intelligence). Every time we presented to the Board or other risk oversight group we could show the logos and the names of these and it provided immediate recall and reorientation for people who weren’t as immersed in these topics as we were. Similarly, it’s really helpful to name programs in alliterative ways to help familiarity and reorientation. For example, security programs focusing on: Tools, Talent and Toil-reduction.


6. Concreteness

- Mind your language
- Helping your customer imagine using your product
- Keep it simple, stupid
- Stories over statistics
- Check your expertise

It is important to use stories to illustrate risk. A distinctive feature of great security programs is that they communicate using stories, not dry statistics. They galvanize action by emotionally connecting with people and teams. When we say stories, we can also think of this as narrative scenario planning. Talk about risk in scenarios that could occur, use storytelling techniques to make such scenario analysis compelling and tied to your organization’s business or mission. Similarly, when talking about risk, use incident or close-call analysis, particularly of things that went wrong in other organizations and how that could or could not happen to your organization.



7. Extremeness Aversion

- Launch a super premium option
- Consider the order in which you display products
- Consider the decoy effect: a twist on extremeness aversion

At one level there is the obvious (perhaps trite) approach of structuring risk recommendations to guide the decision maker to what we believe is the right outcome. Even if it looks like you’re doing this, as long as it’s not done in a disingenuous way, it can still be effective. But the most effective lesson here is not to have this applied to you and your decision-making process. In other words, be forewarned and therefore forearmed about colleagues presenting their risk mitigation options in a way that is framed toward what they want vs. what might be the right outcome.


8. Denominator Neglect

- Apply the rule of 100 to promotions
- Consider offering multiple stacked discounts
- Present your stacked discounts in ascending order
- Reframe the discount as a comparison against the sale price

A risk take on denominator neglect is the need for extreme care in being too happy with high percentage compliance in environments that are large scale and dependency sensitive. One, perhaps obvious, example of this is patching. Let’s say you have 100,000 endpoints and your IT team are gloriously happy with a 95% compliance rate for patching. At some level 95% seems pretty good, but that leaves 5,000 machines unpatched and exposed. Now, that might be even more pernicious if an impact to one or some subset of those 5,000 would have a broader impact on the supposedly good 95,000.
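The arithmetic behind the point is trivial but worth making explicit; the numbers are the post's own example:

```python
# Denominator neglect in one line: a high percentage hides a large
# absolute exposure at fleet scale.

def unpatched(fleet_size, compliance_rate):
    """Count of machines left exposed at a given compliance rate."""
    return round(fleet_size * (1 - compliance_rate))

print(unpatched(100_000, 0.95))  # 5000 machines still exposed
print(unpatched(100_000, 0.99))  # even at 99%, 1000 remain
```

Reporting the absolute count alongside the percentage is a simple way to keep the denominator visible to decision makers.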


Another related example that has always bugged me is people obsessing over accuracy in metrics that are substantially below their target. For example, let’s say one of your goals is to have all production accesses be 100% conformant to required controls (through a gateway, dual-controlled, logged, reviewed, etc.). In any measurement there’s always a degree of measurement inaccuracy for various reasons like scope questions (denominator accuracy) or measurement effectiveness questions (numerator accuracy). As a result you might get a situation where a team says something like: “Hey, you’re telling me that my conformance is 36.4% but I think I’m at 38.9%, your data is inaccurate so why should I focus on this activity?” Of course the correct response is: “Hmmm….ok, we accept the need for you to help us get the scope right and improve the measurement but perhaps we should really focus on that when you’ve implemented the controls on the systems you know to be in scope and get to a >90% conformance rate”. In other words, unless your measurements are egregiously inaccurate or volatile then don’t sweat accuracy in many cases until you’re getting close to 100%.
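One way to operationalize that response is to compare the measurement error band to the distance from the target, and only entertain the accuracy debate when the former could plausibly close the latter. This is a hedged sketch: the figures mirror the example above, and the error band is an assumption.

```python
# Sketch: accuracy only matters near the target. Debate measurement
# error when the error band could plausibly close the gap to goal.

def accuracy_worth_debating(measured, target, error_band):
    """True if measurement error could change the conclusion,
    i.e. the error band covers the remaining gap to target."""
    gap = target - measured
    return error_band >= gap

# 2.5 percentage points of error vs. a 53.6-point gap: not worth debating
print(accuracy_worth_debating(0.364, 0.90, 0.025))  # False
# At or near the target, accuracy now genuinely matters
print(accuracy_worth_debating(0.96, 0.90, 0.025))   # True
```

The 36.4% vs. 38.9% dispute fails this test by a wide margin, which is the point: implement the controls first, then sweat the measurement.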


9. The Need to Experiment

- Be more skeptical about claimed data
- Improve your surveys by monadic testing
- Complement monadic tests with field experiments

Sadly, there is still a lot of work in security and wider risk management where claims are made about the effectiveness of some control or some action without that actually being verified through experiment. I could just say phishing tests and I’d probably be done but there are myriad other examples. Having some degree of hypothesis development and testing is important especially for activities that will have a high opportunity cost in the context of your wider security program.



10. Framing

- Focus on losses, rather than gains
- The benefits of using nouns rather than verbs (imagine yourself doing this, become that person)
- Harness social proof to minimize irritation with shortages (sold out vs. unavailable)

As our security programs have evolved into, or become part of, wider enterprise risk programs, the framing of risks becomes more significant. Which situations call for framing a risk-mitigating activity as loss avoidance vs. an enabling factor needs to be well considered.


11. Fairness

- Harness righteous indignation
- Apply the principle of fairness to your pricing
- Harness the power of because (explain why)
- A more lateral interpretation
- Make sure your customers behave fairly

For many domains within our risk programs, particularly security and fraud mitigation, there are opportunities to motivate people through righteous indignation and appealing to many aspects of fairness. In some senses, we are lucky in our work to be on the side of the right and we can harness that more.


This could be amplifying the notion of our "war" with attackers and marshaling your non-security colleagues into the fight. For other risks as well, it can be motivating to advocate for your customers, their lives and livelihoods. It’s also important to remember, as much as it might be painful to admit, that the customer might not always be right, and while there is opportunity to influence their behavior through product choices, opinionated architectural choices and defaults, there may be occasions where you have to constrain their activities for the good of your wider environment.


A side effect of this is to remember your customers aren’t always one person. In a consumer environment, for certain services, the customer might be the parent as well as the child. In corporate environments different people have different needs and skin in the game: for a trading customer of a large bank, you might have the trader who wants maximum function and flexibility, the compliance officer who wants constraints, and the CEO who wants to limit any possibility of fraud. Satisfying these multiple objectives is a significant opportunity for the security or risk manager to show leadership and commercial acumen.


Finally, if you ever find yourself saying, “We need to do this activity because the policy says so”, then give yourself a solid smack on the wrist.



12. Freedom of Choice

- Be wary of triggering reactance when there is a power imbalance with the audience
- Avoid overly assertive messages when communicating with your loyal customers
- Take culture into account
- Harness the “but you are free” principle (make it appear a choice)
- Involve people in the decision
- Remove the possibility of change

Here’s where we remember the need for effective risk framing to enable the right choices to be made. Framing in the context of business (or mission) objectives is always, in my experience, the right path. Although, you do need to connect this beyond the specific context if a realized risk could result in significant brand impairment, regulatory sanction or systemic risk that would have impact beyond the immediate issue.


It helps to take the other person’s viewpoint in your framing as much as possible. This not only helps with the framing but also helps the relationships with people as it will inevitably lead to you approaching situations with more empathy.



13. The Red Sneakers Effect

- Breaking conventions signals status
- Beware the nuances of the red sneakers effect
- The need to show intention
- The need for audience familiarity

The red sneakers effect is the notion that you can signal status by not conforming, such as a business leader turning up to a meeting in red sneakers when something more formal is expected. Lack of conformity signals you don’t need to conform. But the side point on risk management here is to always communicate intent. A common theme across many of these points is that people can fail to cooperate because they don’t understand your intent. In security, think about signaling difference as part of a deterrence strategy. Remember, attackers are mostly rational actors: they don’t want to waste time or money, or put themselves at undue risk of detection. So, sometimes it’s not a bad idea to actually signal your controls.


14. The Halo Effect

- You can achieve your goals obliquely as well as directly
- Focus on the halo effect if your brand is little known
- Prioritize the halo effect if you need to convey intangible attributes
- Tap into the halo effect by emphasizing your brand’s attractiveness
- Harness the halo effect by boosting your likability

Security leaders need to be seen to be helpful, and how helpful you are seen to be is a result of the halo effect of all the actions you take, or how your specific usefulness is represented and discussed among the community you serve.


It’s the greatest success for a security leader to get into a situation where business units and other areas are pulling in your team more than you are pushing yourself into their business. It’s an even bigger sign of success if you have a significant supply / demand imbalance to deal with where you get to work on a problem of increasing your leverage and scaling to meet that demand.


It’s imperative to run a professional organization. This might go without saying, but it’s always noticeable when pockets of people’s organizations (no matter how isolated) don’t live up to their commitments, complain publicly or are otherwise overly cynical.


15. The Wisdom of Wit

- Give more consideration to humor as a tactic
- Turn to humor when you’re communicating uncomfortable matters
- Prioritize humorous messages among your fans

Be pleasant to work with; you can still be appropriately tough without being a jackass.

Use humor to lighten the mood but still be professional. Cynicism and snark might get a laugh but they don’t actually help you.


16. The Peak-End Rule

- Focus on the moments that matter
- Start by filling in the troughs
- Amplify the pinnacles
- End on a high

There are some great insights here, and in the context of security program activities the peak-end rule can be applied, for example, by finishing your risk review read-outs with a repeat of the final item you want everyone to focus on.


Adopt approaches where you amplify good work and practices to crowd out the bad, rather than focusing simply on eliminating the bad (amplify the pinnacles). Look for the worst aspects of some of your processes and do an 80/20 (small changes that have outsize effects) to fill in the troughs. Focus on the moments that matter by looking at the usability attributes of your security tools and processes in key parts of the user journey, and make those a particularly great experience.


Bottom line: behavioral approaches learnt from the sales, marketing and psychology worlds can be useful to enterprise security and risk programs - from gentle nudges to more wide-ranging ways of representing and measuring risk.
