Phil Venables

Cybersecurity and the Curse of Binary Thinking

Working in information/cybersecurity and technology risk is a fascinating and challenging career, as I’ve covered here. There is, mostly, a great spirit of sharing and collaboration among security professionals. However, I’ve observed one disturbing and growing trend in the past few years that might be characterized as a curse of binary thinking. By this I mean the assertion that if something isn’t perfect then it must be terrible, or that if you don’t fully agree with something then you must absolutely disagree with it. This inability to deal with shades of gray, or to hold different views on a topic at the same time, is by no means confined to our discipline; it seems to have infected public discourse on many topics. Here are some examples from our profession:


Certifications


What is said: certifications like the CISSP don’t represent the full spectrum of skills a professional needs, therefore certifications are a waste of time.


Reality: certifications represent a foundational body of knowledge for new entrants to the field - a way to start the "scaffolding" of their knowledge. They also help employers gauge a candidate’s commitment to the career and so are useful, as long as employers don’t mandate them above all else.


Compliance


What is said: compliance is counterproductive for security.


Reality: compliance is a necessary but not sufficient condition for security. Many compliance regimes do in fact represent a baseline level of security that is useful and necessary, but sustaining compliance does not equate to the security you might need in your context. Many compliance approaches in other walks of life are useful: I take comfort from building codes, healthcare rules, public health ratings for restaurants, banking regulation, and so on. Why not security?


Security Through Obscurity


What is said: security through obscurity is pointless.


Reality: Kerckhoffs's principle, in cryptography, states that the only thing that should be secret is the key. In other words, there should be no security reliance on the secrecy of the algorithm. This is correct, but it has grown into a religion that all security through obscurity is bad. It is undeniable that if your only defense is obscurity then you’re asking for trouble. But in reality there can be a lot of value in keeping an attacker guessing and creating uncertainty, by using deception or a variety of other techniques that could be described as obscurity.
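As a concrete illustration of the principle, here is a minimal sketch in Python, assuming the widely used third-party cryptography package (just one of many libraries that would serve): the Fernet scheme’s design is completely public, and the security of the message rests entirely on keeping the key secret.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# The algorithm (Fernet: AES in CBC mode plus an HMAC) is openly specified;
# per Kerckhoffs's principle, the key is the only secret in the system.
key = Fernet.generate_key()
token = Fernet(key).encrypt(b"the design of the cipher is public knowledge")

# Without the key the token is just opaque bytes; with it, decryption is trivial.
print(Fernet(key).decrypt(token))
```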


Everyone is a Target


What is said: all organizations have something of value and so everyone is a target.


Reality: many organizations are simply targets of opportunity. Many attackers look for a weakness to exploit and then target whoever is exploitable, rather than exclusively and exhaustively fixating on one objective. Of course, specific threat actors do deliberately target particular organizations, but that's not the whole game.


Two Types of Companies


What is said: there are two types of companies, those that have been hacked and know it and those that have been hacked and don’t know it.


Reality: I don’t like this frequently repeated quote. It depends on what you mean by "hacked". If it means that a company has been significantly compromised by external attackers, then there are a few companies that haven’t been, and that have high confidence they haven’t thanks to very good monitoring. But if it means that a company has experienced a security event of any kind, whether an insider data leak, an accident, or a contained malware event, then yes, all companies have been hacked.


Cybersecurity Skills Crisis


What is said: there aren’t enough cybersecurity professionals and we're not training enough people to meet our challenges.


Reality: this is a more complex discussion, as discussed here. We need to improve the productivity of the cybersecurity professionals we have and lessen their load through organization re-design and better technology architecture choices. Look at both sides of the supply and demand problem.


Security Ratings


What is said: security ratings are not accurate, therefore they’re pointless.


Reality: they’re getting better, and new entrants are building better approaches and models. But even if they weren’t, they would still be useful for some purposes, if only for their negative rather than positive predictive qualities. I’ve written about this here.
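To make the negative versus positive predictive point concrete, here is a minimal sketch (Python, with invented counts purely for illustration, not drawn from any real ratings data): a rating can be a weak predictor of who will have an incident while still being a useful predictor of who probably won’t.

```python
# Hypothetical counts for illustration only: 1,000 rated organizations,
# split by whether the rating flagged them as "poor" and whether they
# later had a publicly reported incident.
flagged_with_incident = 40    # rating said "poor", incident occurred
flagged_no_incident = 160     # rating said "poor", no incident
clean_with_incident = 10      # rating said "good", incident occurred
clean_no_incident = 790       # rating said "good", no incident

# Positive predictive value: of those flagged "poor", how many had an incident?
ppv = flagged_with_incident / (flagged_with_incident + flagged_no_incident)

# Negative predictive value: of those rated "good", how many stayed incident-free?
npv = clean_no_incident / (clean_no_incident + clean_with_incident)

print(f"Positive predictive value: {ppv:.0%}")  # 20% - weak as a breach predictor
print(f"Negative predictive value: {npv:.0%}")  # ~99% - a "good" rating still tells you something
```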


Management


What is said: management doesn’t care about security because they don’t fund every single thing the security team asks for.


Reality: security investment is a business or mission risk decision like any other. Yes, certain levels of security in specific contexts are just a basic, non-negotiable safety mechanism, but just because you don’t get everything you want doesn’t mean management is wrong not to prioritize it in all cases. Perhaps you didn't make the case well enough, or there really are alternatives - and perhaps in a business context a specific risk is actually tolerable.


End Users


What is said: end users don’t care about security and make stupid choices.


Reality: most end users make fully rational decisions to optimize their time and energy to get their work done. Solutions and guard rails (ambient controls) are needed to protect end users so they can do their jobs securely and efficiently. The classic example is chastising end users for failing phishing tests and blaming incidents on employees who clicked a link that led to malware that led to a compromise. The reality here is that if your enterprise security depends on people not clicking a link then you’re doing something very wrong. Also, while we're talking about this, don’t rely on passwords as your main authentication mechanism and don’t have a high-value payments process that is authenticated by e-mail.


Complexity


What is said: complexity is the enemy of security.


Reality: complexity is certainly not the friend of security, but calling it the enemy fails to identify the real issue, which is that most useful things are complex at some level of abstraction, and that complexity needs to be managed.


Open Source


What is said: open source is more secure because many eyes have looked at it.


Reality: there are still bugs in open source software, either because they’re subtle or because the much-lauded many eyes have been turned elsewhere. The reality is we all need to work on this aspect as well. Join the Open Source Security Foundation to help.


Information Sharing


What is said: more public/private sector information sharing is needed.


Reality: many private sector organizations have as much useful threat and other intelligence from paid or open sources as our colleagues in government have. Even if more private sector companies had access to the most classified intelligence on attacker TTPs, the real challenge would still be how to make sense of it and take action on it in a timely enough way. A lot of organizations that scream for more intelligence often don’t have a basic threat intelligence team or other handling capability to do this.


Cloud


What is said: the cloud is just someone else’s computers.


Reality: I love this snarky comment. It is often meant negatively with respect to the cloud, but it is in fact a positive assertion. The cloud is, of course, someone else’s computers (the cloud providers!), but they are, in most cases, very good computers with a level of security at scale that even the largest end user organizations can’t implement themselves.


Incidents


What is said: incidents are increasing (and on days like today it certainly feels that way).


Reality: maybe, maybe not. You need to look at not just the headline numerator but the denominator as well. As the attack surface of the world increases due to mass digitization, perhaps we should be seeing even more incidents; if we’re not, then why not?
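A back-of-the-envelope sketch of the numerator/denominator point (Python, with made-up figures purely for illustration): the absolute count of incidents can rise while the rate per unit of attack surface falls.

```python
# Invented figures purely to illustrate the rate argument.
incidents_last_year = 1_000           # headline numerator
incidents_this_year = 1_500           # up 50% - sounds alarming

systems_last_year = 10_000_000        # denominator: a proxy for attack surface
systems_this_year = 20_000_000        # attack surface doubled through digitization

rate_last_year = incidents_last_year / systems_last_year
rate_this_year = incidents_this_year / systems_this_year

print(f"Incidents per million systems, last year: {rate_last_year * 1e6:.0f}")  # 100
print(f"Incidents per million systems, this year: {rate_this_year * 1e6:.0f}")  # 75
# The absolute count rose, but the rate per unit of attack surface fell.
```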


Sophisticated Attacks


What is said: it was a sophisticated attack and we take your security seriously.


Reality: not everything is a sophisticated attack. When you look at most incident reports, the attacks exploited an avoidable weakness that basic IT controls (e.g. the CIS Critical Controls) could have either thwarted or, at least, would have made the attacker work harder for and burn some technique in the process. While we're on this topic, and this is another binary thinking trap, basic controls doesn't mean easy controls. For many organizations with high levels of technical debt, or insufficiently modern approaches to managing their IT infrastructure, the basic controls take a degree of effort and executive commitment that will feel anything but basic.


Bottom line: if you see something presented as an absolute or a binary choice, use that as a red flag to do some critical thinking. Ask what else would need to be true for this to be true, and what would challenge the assertion. It’s fun when you start to do this.
