The Need for Genuine Empathy in Modern Adversarial Red Teaming


“You never really understand a person until you consider things from his point of view . . . until you climb in his skin and walk around in it.”

      – Atticus Finch to Scout in Harper Lee’s To Kill a Mockingbird.

      This is the heart of adversarial red teaming, right?—to consider a problem from the adversary’s perspective. Kind of . . . what Atticus advocates is something more, something elusive, and something many red teamers unthinkingly overlook: genuine empathy.
      Genuine empathy for an adversary can be difficult to achieve. After all, the adversary is often someone we dislike, dismiss, and even disdain, and these feelings don’t mystically dissolve when we “play red.” We still prefer our own values and mores and retain them when we red team. To achieve the kind of red teaming we really want, though, we must suppress our preconceptions and, to the degree sensible, understand and don the adversary’s values and mores.1
      In knowledge management (KM), we often talk about the levels of culture that must be addressed when attempting to implement a KM program. We sometimes compare culture to an iceberg, where observable artifacts break the water. Beneath the waterline are espoused values, and deeper still are underlying assumptions and mental models.2
      When red teaming, we usually acknowledge and employ an adversary’s “observable artifacts” (known capabilities, publicly stated intentions, attack patterns, and so on). Much less frequently do we acknowledge and employ the adversary’s “espoused values,” and even less frequently do we truly understand and employ the adversary’s “underlying assumptions.”
      It’s easy to assert that the observable artifacts represent the sum of an adversary’s espoused values and underlying assumptions, so we usually stop there. Yet observable artifacts are usually contextual. They change more easily and frequently than assumptions. They are likely to shift as the context shifts. Values and assumptions are much more stable. They represent the adversary’s thinking processes, not just the adversary’s prior decisions.

      If we want to engage in superior red teaming, we must dive deep in dark and icy cultural waters. It’s not comfortable to concede that an adversary’s values and assumptions are—from the adversary’s perspective—inherently valid, even if (and especially when) we disagree with them. We’d much prefer to project our values and assumptions on the adversary. Consider, for example, President Bush’s 2001 statement before Congress that al Qaeda terrorists attacked us because they “hate our freedoms.”3 Did they? Maybe, but was that really their primary motivation? Would a red team proceeding from that assumption generate valid insights?
      It’s one thing to say we need to perceive and appreciate the whole iceberg, and it’s another thing entirely to do it. In other words, how do we dive deep? A complete answer would require a book, but we can begin by doing the following:

  • Acknowledge the issue. Simply doing so opens us to the possibility of understanding and empathy. A little humility goes a long way, and a lot of humility sets you on a path toward shedding your own cultural prejudices, at least temporarily.
  • Absorb the relevant history from the adversary’s perspective. We might know our version of the history, but what of the adversary’s? It’s very likely to be different, and even when it’s essentially the same, it’s likely to be interpreted differently. Read the history with empathy to gain empathy. Keep diving until you find the sources of the adversary’s worldview.
  • Abandon our “transcendental pretenses.”4 Westerners wrongly assume that logic and the mathematics of risk are universal. We also tend to rely on principles and processes of normative decision making to identify what our adversaries “should” do. Dropping such assumptions, however, is merely a half measure; we must replace them with the adversary’s assumptions, which requires much more than a review of observable artifacts.

      Too much of what we call adversarial red teaming today is merely a surface scan. What’s more, it’s largely self-validated: it seems legitimate to us because its logic and perspective align with our own (typically unexamined) values and assumptions. Until we move past this comfortable conceit and explore the depths of the iceberg (both our own and the adversary’s), we’ll continue to generate naïve findings. As the fictional Atticus might say, “Wearing your adversary’s skin doesn’t do much for you until you look out through their eyes without censure and blame.” And that’s genuine empathy.

  1. Of course, we must also incorporate sensible rules of engagement.
  2. See, for example, chapter three in Green and Stankosky, eds., In Search of Knowledge Management.
  3. See the full text here.
  4. For an explanation of this idea, see the first chapter in Hall and Ames’ book Anticipating China.
