As red teamers, we sometimes assume that the need for red teaming is self-evident, and, given this assumption, we proceed to promote the practice through example and anecdote (the more entertaining, the better): “Look what happened to Company X! They forgot to red team, poor fools,” or “You won’t believe what our extremely clever red team uncovered!” While anecdotes can be illustrative and persuasive, grounding our practice on a more solid foundation is past due.
Let’s start with the goal. Red teaming can be fun, and identifying an unexpected vulnerability can give a team a genuine surge, but the real purpose of red teaming is to help improve the client’s decisions. If we all made great decisions all the time, red teamers would be out of business. The root of red teaming, then, is the poor decision, and it’s there that we should look in order to unpack the need.
Digging deeper, we immediately encounter the “tyranny of uncertainty”: every decision involves a degree of uncertainty, and that degree of uncertainty is itself uncertain. Put differently, if our decisions involved no uncertainty, they would be easy, and, again, red teamers would be out of business. Uncertainty frames every decision whether we admit it or not, and red teaming (done well) should help reduce uncertainty.
What generates uncertainty? I’ve sorted its troublesome sources into six bins:1
Individual: Every one of us at times exhibits cognitive biases. We tend to misperceive risk in predictable ways. Our awareness is limited, often because we see no need to gather more information. Emotion can cloud our reason. All of this happens in our own heads, and we’re completely unaware of most of it.
Organizational: Take what’s happening in an individual’s head and multiply it by x^2, where x is the number of individuals within the relevant group. (Yes, the formula is notional, but I’m only half kidding.) The elements here include groupthink, pressure to conform, policy bias, “tribal think,” parochial competition, overreliance on organizational routines, excessive secrecy, and many others. Realize as well that significant interaction effects can exist across these various elements.
Cultural: Each of us is born into a culture. For the most part, we inherit our culture’s worldview, its ideologies, and its strengths and weaknesses. Just because someone is smart or savvy doesn’t mean he or she isn’t hindered by nearly invisible cultural blinders. We tend to believe our own culture is superior to others’, and we often unknowingly trample other cultures’ norms and ways of seeing the world.
Situational: Few if any decisions can be distilled to one cause, one effect, and one solution, yet we often view them in just these terms, ignoring situational factors such as dynamic complexity (and our limitations comprehending it), delayed learning, too much or too little information, contradictory information, and the influence of time. Then throw luck into the mix, and remember that sometimes a reasonably good decision yields a bad outcome and a manifestly poor one yields a win.
Adversarial: As unlikely as it seems, we often remove the adversary from our decision calculus. This is unfortunate, because, as Michael Handel tells us:
… the reciprocal nature of all action in war means that attempts to grasp its complexities through a static, unilaterally based concept will never succeed…. a realistic approach must consider how one’s adversary interprets the war as well. Thus, perceiving the nature of a war is a reciprocal and dialectic process in which it is important to consider how one side’s perspective and actions affect the other side’s actions and reactions.2
Add to this the following observation from Colin Gray, and you have a real problem.
People unfamiliar with the arcane world of defence analysis might be surprised to learn just how common it is for imaginative, energetic, and determined strategic thinkers and defence planners to forget that the enemy too has preferences and choices.3
Perceptual: Misperceiving the level of uncertainty associated with any or all of these factors is entirely possible, and not every stakeholder will perceive the same “game.” Some might even hold a perceptual advantage, an advantage that can be created and enhanced through deception. Lest you underestimate the potential of deception, pin this to your wall: “Though deception is far less common than denial (it is akin to the silver bullet held in reserve for only the rare but perfect circumstances), its batting average is extraordinarily high, succeeding more than nine times of every ten it is used.” (Bruce and Bennett, “Foreign Denial and Deception,” in George and Bruce, eds., Analyzing Intelligence: Origins, Obstacles, and Innovations, 2008, p. 123.)
Twist these six sources into a single tangled knot, and it’s a wonder that anyone ever makes a good decision. Underscoring this somewhat dire assessment is the fact that we’re largely unaware of these factors, and a single emotional response can sabotage the most careful and self-aware analysis. Balancing the equation is the fact that your adversaries and competitors suffer from the same limitations and face the same challenges. This suggests a prime reason why red teaming offers so much advantage. On a level playing field in which both competitors are unaware of their own limitations, the penalty of ignoring these sources is severe but more or less equally distributed. You might even say that the effects of uncertainty yield neither advantage nor disadvantage to either competitor. If one competitor, however, becomes self-aware, recognizes these sources, explores them, tames them, and—even more powerfully—exploits an adversary’s perception of them, this competitor holds a true advantage.
The best red teamers are aware of the system of uncertainties described above. They consider the problem from the perspective of the system and educate their clients accordingly. This does not guarantee success in every case, but it certainly improves the odds. Just hope your adversary isn’t doing it, too. If so, you just have to up your red teaming game by keeping up on the very latest from Red Team Journal!
Note: The ideas in this post are extracted from a much longer presentation I use in some of my red teaming courses.
- For a more technical look at sources of uncertainty, see chapter 4 of Morgan and Henrion’s excellent 1990 book Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis. [↩]
- Handel, Masters of War: Classical Strategic Thought, 1992, pp. 94-95. [↩]
- Gray, Modern Strategy, 1999, p. 20. [↩]