Phil Venables

The Obvious CISO: Don’t Overlook the Simple

There is a great little book I read recently, “Obvious Adams - The Story of a Successful Businessman”. It’s available on Amazon, but it’s sufficiently old (1916) that there are plenty of free archives online as well.

It’s a very short read conveying a powerful message: professionals and experienced practitioners can sometimes be so entrenched in their techniques that they fail to see the obvious or simple solutions to apparently difficult problems. Even from the very beginning, when Adams wanted to pursue a career in advertising, he went straight to the head of a famous agency and simply asked to work there and learn from him.

That didn’t immediately get him the role he desired, but it struck the head of the agency as such an obvious thing to do, simply asking for what you want, that he gave Adams a minor role in the company, which became the springboard for things to come. It’s interesting how many times in our security programs we simply don’t ask directly for what we want or need, or tell people in plain terms what we want them to do.

It’s also interesting, in looking at your career development, how circuitous things can be when you might instead simply state exactly what you want from your leadership or colleagues. As leaders, how many times have you been surprised by someone leaving because they never told you directly what they wanted, when you had the chance to fix it, but rather hoped you would sense it? There are clearly lessons for leaders here, but also for teams. Your boss is not an omniscient being and so sometimes (all the time?) needs to be told explicitly what you want. You might not always get it because of many possible constraints, but you might actually get what you want more often than not.

There have been several moments in my career when I’ve sat down with my own leadership and said I wanted a particular role or a particular responsibility. About half the time I got it right away, usually with a statement like, “We really wanted you to do this, but we thought you didn’t want it.” On other occasions I didn’t get what I wanted, but either got something similar or got what I wanted later, when organizational shifts occurred and it came my way because I’d pre-registered my interest. Obvious, right?

The rest of the book gets into many examples of work in the advertising industry where Adams uses his talent for figuring out the obvious to get to the right outcome where more experienced people, using more data and analysis, had previously failed.

For one client, he was asked to determine, despite myriad analyses, why one of their stores in one town was doing less well than another store in the same town. This was despite targeted advertising and much-refined merchandising. What did Adams do? Well, the obvious thing, of course. He went to the town, visited the stores, actually went through the experience of shopping, and uncovered how difficult it was to find one of the stores. This was never going to be uncovered by analysts at headquarters poring over data.

The root causes of many issues are obvious when you put yourself in the right perspective. There are plenty of security issues, from situations like inappropriate access grants, large numbers of authentication credential resets, non-adherence to policy, and other so-called “human issues”, through to inattention from developers and engineers in creating or maintaining suitably secured systems.

Good security teams analyze these situations, pore over the data, and likely come up with some reasonable adjustments to reduce the number and frequency of these issues. But great security teams get off their butts and go live the experience of the employees, the customers, the developers, and the help desk staff, and actually look at the root cause of the root cause of the issues. From this they come up with better solutions that are based on the way things are actually used, not just on what the data is trying to reveal.

I’ve seen plenty of organizations, for example, where there is extensive investment in developers’ production access controls and the process looks great, but the amount of privileged developer access to production never drops as much as you want. There are usually a lot of excuses for this given by leaders who are doing the right thing by looking at the data. But, in many organizations, when you actually go sit with a developer and watch the deploy process, you realize there is a myriad of weird things and manual steps needed to get the software into production in the right configuration, which requires developers or DevOps staff to sustain broader and deeper levels of privilege than you want. Once you go look at this workflow in action, you can see the tuning steps that are needed and the metrics that get to the root cause of the root cause of the issues.

There are plenty of examples I’m sure you’re all familiar with: flawed employee onboarding that drives errors in access assignment, internal software deployment practices to endpoints that violate the education you’ve so carefully directed at your employees, and process breaks in customer call centers that have people asking customers for security information you never really want them disclosing. Until you go there, watch it, talk to the people, and find out all the process breaks and accommodations being made so people can do their jobs, you’ll never realize what you need to do. Obvious, right?

One of my favorite examples in the book was Adams developing an advertising campaign for a paper manufacturer. The core of the campaign was to explain to people how well the paper was made, how much quality went into the process. The executive of the paper company was resistant to this because it was "common knowledge" that all good paper was made this way. The fact that the consumers of paper had no idea about this didn’t seem to matter to him.

Adams’ response was brilliant: he pointed out that the goal was to sell to the paper users, not to impress his peer paper manufacturers. This is a powerful message. How many of our supposed security advancements, programs of work, conference presentations, strategies, or other innovations are tuned to be impressive to our peers, as opposed to actually getting the job done in the right way for the risk profile of the organization you are serving? This is very much where ego can be the enemy. Obvious, right?

This is an excellent quick read that, despite its brevity, has you thinking about how much we all need to pursue the obvious. Now, I’m not saying data and analysis are without value. Of course it is crucial to analyze situations, but the message is: don’t let that stand in the way of what might be obvious solutions, if you just take the time to seek the perspective that can lead you to those ideas.

One thing I learnt from a long career at an investment bank with a justifiably good reputation for risk management was that risk is not just about data. I have seared into my memory some guidance I got, not long after joining, from one of their senior, almost legendary, traders: “Risk isn’t managed by data alone; risk is managed by smart people with good judgment, and experience, who use the data as input.” I got to see this in action on many occasions, not least during the financial crisis, when that particular investment bank fared better than the others for one simple reason: when reality differed from the financial risk models, that discrepancy triggered neither an assumption that the models were correct and reality would soon conform, nor a view that the models were wrong but could be adjusted over time while monitoring the situation. No, what happened is well recorded: it triggered a flight to safety, a risk-off mode, to buy time to assess from a protected position. When reality and your data or models don’t conform, that is the cue to get to a safe place, then figure out what is going on. Obvious, right?

Now, finding the obvious is not easy. Like simple explanations, which take time to refine compared with easier, more complex answers, you have to work to find obvious solutions, and they’re only obvious after they are stated. The later editions of the book contain some annotations on how to go about finding the obvious:

The Five Tests of Obviousness

  1. This problem when solved will be simple - if an idea is complicated we should view it with some suspicion.

  2. Does it check with human nature? - can you explain it easily?

  3. Put it on paper - write the idea down in a document, expressing it in plain terms that others can understand. Not a deck, not bullet points, but clear writing that forces your own clarity of thinking.

  4. Does it EXPLODE in people’s minds? - do people exclaim, “Now why didn’t we think of that before?”

  5. Is the time ripe? - some problems are not ready to be solved, even if the solution is obvious.

The Five Creative Approaches to the Obvious

  1. Never mind how a thing has always been done, or how other people want to do it. What is the simplest possible way of doing it?

  2. Suppose the whole thing were to be completely reversed? (A nice example in the text is about designing a new seaplane - remember the era - that was less a boat that could fly than a plane that could float.)

  3. Can a vote be taken on it, or the public’s help actively enlisted? Does it capture the imagination, does it create a desire to follow the idea, and does it resonate or align with other activities so that it becomes self-propelling?

  4. What opportunity is being overlooked because no one has bothered to develop it?

  5. What are the special needs of the situation?

Bottom line: Don’t assume an entrenched situation can’t be solved by an obvious and simple approach. Yes, work the data, analyze the situation but also go take a different perspective and experience the problem. Look for the obvious and when you find it don’t be afraid to state it. It might only be obvious to you. When you state it then it may well be an idea that really does explode in people’s minds.
