The Dangerous ‘Illusion of Certainty’ (Updated)

You’ve probably met the red teamer who believes that red teaming cures all ailments without introducing any side effects. Beware this red teamer.
      Seasoned red teamers understand that mismanaged red teaming can potentially introduce just as much uncertainty as it claims to reduce (if not more), leading to a very real and potentially dangerous false confidence.
      I spoke recently with a retired US O-7 who wanted to debate me on this point. I was a bit stunned when he asserted that red teaming only reduces uncertainty and never adds to it. I was even more stunned when the retired O-6 listening in suggested that we can measure the uncertainty reduced and assign it a number. (Conversations like this inspire me to keep RTJ going!)[1]
      Here are some uncertainty-generating examples that I’ve seen:

  • Misapplied scope I: The red team considered a subsystem but applied the insights from the subsystem to the whole system.
  • Misapplied scope II: The red team considered a system but failed to account for the system’s interfaces to other systems.
  • Misaligned skills: The red team’s skills exceeded the likely adversary’s by an order of magnitude. The findings and recommendations failed to account for the gap.
  • Cultural blinders/blunders: The red team failed to remove their Western/American glasses. Not only did they fail to intuit the effects of their worldview, they didn’t even consider that another worldview might exist. (This is a common one.)
  • Tool fetish: The red team was so enamored with their red teaming toolkit that they failed to see how each tool both revealed and concealed.
  • Method fetish: The red team was so enamored with their method that they failed to see how it embodied a single way of framing decisions at the expense of others.
  • Rationality fetish: The red team stressed standards of normative thinking without accounting for real-world heuristics. (This can be dangerous when attempting to simulate any real-world adversary other than the Mad Logician.)
  • Arrogance: This is the bane of all good red teaming, and I’ve seen it far too often. I can’t speak much for red teams in other countries, but in my opinion it’s the standout issue among many American red teams. It can lead them to engage in all the issues mentioned previously while simultaneously asserting the awe-inspiring goodness of their red teaming efforts. (And yes, it can even lead them to assert that red teaming only reduces uncertainty and never adds to it.)
  • Hybrids: Any or all of these issues can combine to multiply the effects of uncertainty.

      I’ve used the phrase “illusion of certainty” in the past to express the idea that mismanaged red teams can induce a false sense of security. I was pleased to see that Gerd Gigerenzer also uses the phrase to express a similar idea, although with a different focus.[2] The illusion of certainty is why the possible uncertainty-generating aspects of red teaming—when unacknowledged—are so dangerous: we can walk away feeling more confident than ever even though our confidence is misplaced. If you were my adversary, I’d encourage you to pursue this style of red teaming with abandon.
      How do we counter these problems? My number-one suggestion is to be mindful of them. Spend some time up front considering how you will minimize their effects before you unleash your red teaming prowess. Think about them as you red team. Ask yourself if you’ve fallen prey to any of them as you share your wisdom and insights. (And don’t bristle if a colleague points out that you, in fact, have!) And if it’s a formal red teaming engagement, spend some time thinking about them again as you prepare your findings and recommendations.
      Yes, red teaming can be good medicine, but even the best medicines have their limits and always come with a warning label.

Edit (2 August): Added the Jomini quote (see footnote 1).

  1. It’s also a great example of a Jominian bias. Consider, for example, this quote from page 323 of Jomini’s The Art of War:

    It is true that theories cannot teach men with mathematical precision what they should do in every possible case; but it is also certain that they will always point out the errors which should be avoided; and this is a highly important consideration, for these rules thus become, in the hands of skillful generals commanding brave troops, means of almost certain success.


  2. See his book Risk Savvy.