Red Teaming: A Balanced View


Defined loosely, red teaming is the practice of viewing a problem from an adversarial or contrarian point of view. While some red teamers define the practice more narrowly and some more broadly, nearly all agree that a red team should play or model “red”: the attacker, the opponent, or simply a devil’s advocate.
      The goal of most red teams is to enhance decision making by challenging assumptions and exploring new ideas, typically from the perspective of an adversary or a competitor. A red team, for example, might play the role of an attacker and test the security of a system. Alternatively, a red team might review and assess the assumptions of a corporate strategic plan. Whether a red team adopts a specific perspective, method, or toolkit depends on the nature of the problem and the circumstances of the red team. A red team that performs a given type of task repeatedly is likely to develop a process framework and an associated toolkit.


Skilled red teams offer a decision maker several potential advantages. Perhaps the most important is the ability to identify an otherwise overlooked or underappreciated decision trap. This trap might rest in a system or process vulnerability, an adversary’s overlooked preference, or a competitor’s unheeded capability. Closely related to this advantage is the expert red team’s useful ability to help expose a decision maker’s blinders, preconceptions, and biases. Finally, a good red team will employ a systems view to help reveal hidden seams and connections. This latter advantage is particularly important when these seams and connections represent exploitable weaknesses with possible nonlinear effects.
      In practice, these advantages tend to be demonstrated anecdotally. This is both a strength and a weakness. It is a strength because anecdotes and stories are a powerful method of communicating experiences, principles, and lessons. Among other things, they speak directly to our doubts and apprehensions. Most persuasive are the red teaming anecdotes that illustrate a disaster sidestepped thanks to the red team’s diligence. Not surprisingly, organizations are likely to hold these anecdotes very closely. Anecdotal demonstration is also a weakness. In an era of tight budgets, decision makers want numbers: “How much will the red team save me?” the potential client asks. This is an extremely difficult question to answer, in part because it involves a chain of branching “what ifs?” that no red team can possibly resolve.

Limitations and Constraints

Despite the many advantages of candid red teaming, the practice is subject to various limitations and constraints. A red team cannot predict with certainty what an adversary will do, nor can it uncover all possible weaknesses in a concept, plan, or system. Red teams that claim these abilities overstate the benefits of red teaming and invariably mislead their clients. Decision makers who attempt to use a red team to divine specific events risk doing worse than nothing.
      Additionally, few red teams work for free; someone must pay them, and this someone is typically most interested in his or her own organization, system, plan, or mandate. Real-world adversaries, however, tend to think across organizations, systems, plans, and mandates. In some cases, then, a client will direct a red team to adopt a view that is narrower than that of the client’s possible adversaries. Poor red teams fail to detect this tension and work strictly within the defined box (knowing nothing else); superior red teams do their best to work within and around the tension. In fact, a superior red team will use the opportunity to disclose the problematic dependencies that emerge from client-related constraints.

Superior and Inferior Red Teams

Clearly, not every red team is created equal. Superior red teams, for example, tend to

  • View the problem of interest from a systems perspective;
  • Shed the cultural biases of the decision maker and, as appropriate, adopt the cultural perspective of the adversary or competitor;
  • Employ a multidisciplinary range of skills, talents, and methods;
  • Understand how things work in the real world;
  • Avoid absolute and objective explanations of behaviors, preferences, and events;
  • Question everything (to include both their clients and themselves); and
  • Break the “rules.”

One can argue that the best red teamers are born, not trained. It seems that some people have an instinctive ability to red team, while others—despite extensive training—can never escape the secure but confining pen of convention. In fact, this is perhaps the key characteristic of the inferior red team: an inability or unwillingness to color outside the lines. Inferior red teams also tend to

  • Accept without question the client’s description of the problem;
  • Embrace the biases inherent in their own values and culture;
  • Adopt the first or most easily discerned answer;
  • Defer to reputation and status; and
  • Know it all.

Interestingly enough, then, both a lack of confidence and unchecked arrogance can undermine red teaming. The members of an inferior red team might include deferential technocrats and self-important experts.


Not every decision maker wants a red team (or at least a candid red team). A red team can undermine a decision maker’s preferred strategies or call into question his or her choices, policies, and intentions. It takes a decision maker of solid integrity to sponsor, empower, and manage a superior red team. That said, a thoughtful decision maker also balances the costs and benefits of red teaming with the costs and benefits of advocacy, compromise, and consensus building. It is also important to note that not all resistance is harmful; it can represent valid interests, concerns, and risks of which the red team is simply unaware.

To Red Team or Not to Red Team

Nearly everyone can benefit from some form or degree of red teaming. Whether the “red team” is a highly structured, formal unit or a self-appointed devil’s advocate, almost every idea, concept, design, or plan benefits from healthy opposition and testing. Too much red teaming, however, can be as harmful as too little. No one wants a relentless contrarian gumming up every phase of a project. It won’t take long, in fact, for everyone to dismiss the contrarian as an annoyance.
      Decision makers must be careful to apply red teaming judiciously. Among other factors, timing is especially important. Establishing a red team too early can lead to aimless dithering; establishing it too late can trigger fierce (and justifiable) resistance. Even so, the adage “better late than never” sometimes applies. (If one adage always applies to red teaming, it is “one size [doesn’t] fit all.”) All of this challenges the decision maker and the red teamer, both of whom must consider and reconsider the context of the red team’s activities throughout the lifecycle of the effort.
      It is also important to consider and value the perspectives of all client stakeholders. Not every problem has a distinct boundary delineated by a single, unbiased point of view. Often the overriding characteristic of a complex problem is the unclear, contradictory, and confusing tangle of relationships and concerns among the various stakeholders. The broader the problem, the greater the challenge. Indeed, this may explain why national-level initiatives rarely experience honest red teaming. When red teaming complex problems of this sort, red teams must avoid serving as shills for any single stakeholder.
      In short, the decision of when and how to red team can be a surprisingly complex one. Dropping a red team into a highly charged political situation can undermine trust and erode hard-won consensus. Similarly, red teaming a decision during implementation can raise more questions than it answers, sabotage morale, and cause a decision maker to second-guess sound choices unnecessarily. On the other hand, aiming a seasoned red team at a problem or system at the right time, with the proper mandate, can steer a decision maker away from an otherwise impending catastrophe.
