Phil Venables

Dangerous Embedded Assumptions

There is a notion I keep coming back to thanks to this article from a few years ago. The essence is that some things become so relied upon, or so widely accepted as true, that they are no longer questioned - never questioned, that is, until they are revealed, sometimes in dramatic fashion, to no longer be true. The article examines what might be the dangerous embedded assumptions of the current US military, drawing on how World War II-era assumptions, like "the bomber will always get through", proved false over time until further innovations changed the picture. The assumptions laid out are:

  1. The U.S. military still knows how to fight a major war.

  2. The United States can protect its ground forces from air attack.

  3. The U.S. Air Force will be able to fight effectively from contested bases.

  4. Stealthy aircraft will remain stealthy.

  5. Aircraft carriers can be both effective and survivable.

  6. U.S. submarines will remain undetectable.

  7. Amphibious operations remain viable.

  8. The U.S. military can protect its air and sea logistics pipelines.

  9. Advanced U.S. weapons systems will operate effectively under wartime conditions.

  10. The U.S. military can keep its secrets.

Now the point of this isn't necessarily to say these are false or about to be false, but rather that, as with scenario analysis, we should establish indicators and warnings (potential red flags) that such assumptions are starting to be invalidated - well ahead of their potential crunch time, so we have a chance to do something about it. In fact, as with scenario planning, a well-executed challenge of dangerous embedded assumptions might surface no failures at all - precisely because the assumptions are challenged and changed before they are fundamentally broken.


A useful approach is to think about this in the particular context of your own organization. For example, I’ve seen organizations that are so good at crisis response that it is simply assumed they will be able to handle whatever is thrown at them. Again, this might be true - until it isn’t. Every organization will share some common dangerous embedded assumptions, but many will also have their own particular ones.
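
To make the indicators-and-warnings idea concrete, here is a minimal sketch (in Python, with hypothetical names, thresholds, and placeholder data throughout - none of this comes from the article) of an assumption register in which each dangerous embedded assumption carries the red flags that would suggest it is starting to be invalidated:

```python
from dataclasses import dataclass, field
from typing import Callable


def last_exercise_score() -> float:
    """Placeholder: score (0.0-1.0) from the most recent crisis-response exercise."""
    return 0.65


def responder_attrition_rate() -> float:
    """Placeholder: annual attrition rate among key incident responders."""
    return 0.1


@dataclass
class Assumption:
    """A dangerous embedded assumption plus the red flags that would invalidate it."""
    statement: str
    # Each indicator returns True when its warning condition has been observed.
    indicators: dict[str, Callable[[], bool]] = field(default_factory=dict)

    def fired_warnings(self) -> list[str]:
        """Names of the indicators that are currently firing."""
        return [name for name, check in self.indicators.items() if check()]


# Hypothetical register entry: the "crisis response will always cope" assumption.
register = [
    Assumption(
        statement="We can handle any crisis, as we always have.",
        indicators={
            "exercise score degrading": lambda: last_exercise_score() < 0.7,
            "key responder attrition": lambda: responder_attrition_rate() > 0.2,
        },
    ),
]

for assumption in register:
    if fired := assumption.fired_warnings():
        print(f"REVIEW NEEDED: {assumption.statement} -> {fired}")
```

In this sketch the degraded exercise score fires, which would trigger a challenge of the assumption long before a real crisis tests it.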


More broadly, this got me thinking about what the possible dangerous embedded assumptions might be in cybersecurity in general, or in the even broader field of information and technology risk management. The list below offers some candidates. Again, I’m not saying I agree with these, just that there is a sense in many parts of the industry that these are true, and will always be true, which is potentially worrying.


  • Ransomware won’t shift to integrity (data modification) attacks.

  • Companies will be able to adapt and respond to crises as they have in the past.

  • Devastating continent-wide physical disasters won’t occur (e.g. the electromagnetic consequences of Carrington Event-scale solar weather).

  • Stock prices of companies experiencing a data breach will always recover.

  • There is plenty of time for post-quantum cryptography upgrades (pressure-tested in the sketch after this list).

  • Timely patching will keep out opportunistic attackers.

  • Background checks and surveillance will keep insider risks at bay.

  • Strong cryptographic authentication will provide high assurance of claimed identities.

  • Quantum computing will be the only threat to RSA and ECC.

  • Zero days will continue to be detectable (and thus responded to) in the wild.

  • Defense in depth will provide for reinforcing levels of controls.

  • We will be able to keep reducing attacker dwell time.

  • Deep fakes will be detectable.


Bottom line: we should look out for the major assumptions we are making about what is effective, what will continue to be true, and what is coming that we take for granted. Then we should ask whether any of these are “dangerous embedded assumptions” - assumptions whose continued truth we rely on so heavily that we will be in serious trouble if they turn out not to hold. As a consequence we should prepare contingencies, look for indications and warnings that such assumptions are being violated, and act well ahead of their failure actually becoming a dangerous outcome.
