Catch the recent “Politics, Power, and Preventative Action” podcast interview with RTJ founder Mark Mateski.

The False Client

It’s one thing to red team; it’s another thing entirely for a red team to facilitate useful change. All red teaming is embedded within a culture, and savvy red teamers learn quickly that not all red team engagements are what they appear to be. Sometimes a client hires a red team to validate what the client already “knows” (typically then tying the red team’s hands through a set of overly constrained rules of engagement). For the experienced red team, this usually yields a level of frustration that’s best avoided by simply not taking the job.
      In a roundabout way, the quote below from Jorge Luis Borges reminds us of the red team client who feigns interest in uncovering the uncomfortable truths. Read on …

The Hazards of Cross-Cultural Red Teaming

Are members of all cultures equally good at intuitive red teaming? Though his words might sound stilted and “politically incorrect” to our sensitive 2017 ears, F. S. C. Northrop, writing in 1946, suggests that the answer is “no.” He begins by arguing that the “ideographic symbolism” of the Chinese language yields a “superlative degree of fluidity, a capacity to convey the unique particularity, nuance, and precisely refined richness of the specific, individual experience which probably no other mature language in the world today achieves.”1 This, he suggests, generates within “the Chinese psychology” an exceptional ability to identify with other cultures: “It is doubtful,” he says, “if any other people have such capacity as have the Chinese, having visited, lived with, and immediately experienced the culture and psychological reactions of another people, to put themselves in the intuitive standpoint of that people.”2 He cites examples of Chinese students living in France and the United States, who exhibit a remarkable ability to absorb the cultural perspectives and habits of their host countries. He attributes this ability not just to the fluidity of the Chinese language but also to the “ancient philosophical and religious intuitions” of the Chinese culture.3 Further, he warns that “Unless we of the Occident find in our own immediate experience the factors to which their remarkably denotative philosophical and religious terminology refers, we can never hope, regardless of our information, or our observation, to understand either the Chinese or any other Oriental people.”4 Read on …

  1. F. S. C. Northrop, The Meeting of East and West, p. 318.
  2. Ibid., p. 318.
  3. Ibid., p. 319.
  4. Ibid., p. 319.

The Seen and the Unseen

In his 1946 book Economics in One Lesson, Henry Hazlitt unfolds an interesting systems-oriented principle that we believe belongs in every red teamer’s toolkit. Often, Hazlitt tells us, the unseen is more important than the seen, even though we naturally tend to focus on the seen. Among other things, Hazlitt discusses public-works projects, which generate visible activity and tangible results. What we don’t see is the “what-might-have-beens,” the results that would have emerged had the same resources been applied differently. Read on …

Archived Red Team Journal Mission (1998)

The following is the Red Team Journal mission as posted on the site in 1998. For better or worse, it still applies today. (We even dug up an old RTJ banner!)

In spite of growing readiness problems, the U.S. military remains without peer. It is the best-trained, best-equipped force in the world. Its budget is larger than the next five largest defense budgets combined. It fields technologies many forces won’t acquire until well into the next century, if ever.
      Yet today the United States is more vulnerable to attack than perhaps ever before. The same complex latticework of technology that powers our cities and yields dominant awareness on the battlefield masks an abundance of critical leverage points. A knowing adversary can target these points with potentially spectacular effect. Pick a card and pull. The house comes down. Read on …

‘Sitting Around Thinking’—Well … Yeah

I spoke briefly yesterday with a gentleman who runs a successful pentesting company. For the most part, I get what he does, but I don’t think he got what I do, nor did he seem inclined to ask any questions to find out exactly what that might be. (At one point, he described my version of red teaming as “sitting around thinking,” which, of course, doesn’t make money!)
      The misunderstanding just might be my fault. I realized today that I need a better method of describing how successful red teaming addresses the whole system even if the red team ultimately only “attacks” a portion of it. I’ve tried before (here and here), but until I get it right, I’m going to keep trying. Read on …

The Annoying Red Teamer: A Philosophical Approach to the Problem

Red teamers can be annoying. Sometimes the annoyance is justified, sometimes not. After all, who likes to be told that they overlooked a key assumption or failed to implement a sensible practice? It’s not surprising that many people resist even the idea of red teaming.
      As red teamers, we often lament the shortsightedness of this resistance. What we don’t discuss very often is the uncomfortable fact that we often aggravate and perpetuate it. Yes, we can be self-satisfied and snobbish. And why not? We spend our days thinking about important things other people ignore, neglect, and overlook. Even when we’re not snobbish and condescending (honest!), we have to work twice as hard not to be perceived as such. That’s just the nature of the game. Read on …

‘Thank You’ Cleaning Crew

The pleasant little 1968 comedy Hot Millions starring Peter Ustinov and Maggie Smith features an interesting moment relevant to red teamers. (If you haven’t seen the movie but intend to, stop reading here.) Ustinov plays a compulsive embezzler. After serving time in gaol (that’s “jail” for us Yanks), he assumes a programmer’s identity and secures a job at a large company. He thereupon attempts, unsuccessfully, to circumvent the security of the company’s computerized accounting system. Temporarily frustrated, he is delighted to learn that a simple “bang” on the side of the computer’s casing with a mop bucket opens it, circumventing the security he’d tried so hard to foil. The punchline? Ustinov learns the secret by chance; the cleaning crew uses the trick to open the computer in order to warm their tea inside the computer’s casing.
      The real world, of course, is rife with such irony, and superior red teamers have a nose for it. Perhaps not often (but often enough), the most splendid security system is vulnerable to an unexpected, comically simple exploit, all of which calls for the timely services of the superior red teamer’s nose. It reminds me of Red Teaming Law #17: “The superior red teamer learns how things work in the real world, not just how they work on a diagram or presentation slide. The most useful insights often come from the bottom of the org chart. The higher up the org you go, the broader the view but the more filtered the information.”

Postscript: There’s another Ustinov movie with a scene relevant to red teamers. I’ll post on that soon.

The Need for Genuine Empathy in Modern Adversarial Red Teaming

“You never really understand a person until you consider things from his point of view . . . until you climb in his skin and walk around in it.”

      – Atticus Finch to Scout in Harper Lee’s To Kill a Mockingbird.

      This is the heart of adversarial red teaming, right?—to consider a problem from the adversary’s perspective. Kind of . . . yet what Atticus advocates is something more, something elusive, and something many red teamers unthinkingly overlook: genuine empathy. Read on …

‘Seven-Place Accuracy with Bum Data’

At times during this election season I felt as if I were living in a house of mirrors. With leaks, allegations, and counter-allegations sprouting like weeds, I wondered how, as a citizen, I could discern anything close to the truth. As red teamers, we often face a similar dilemma. Sometimes we just don’t know enough to draw actionable conclusions from the available information. Sometimes all the normative decision-making approaches in our toolkit can’t compensate for the degree of uncertainty we face. Sometimes we’re forced to rely on our intuition—knowingly—while seeking new and better information. Sometimes we find opportunity in the ambiguity and uncertainty, but typically the very worst thing we can do is assert certainty where none can reasonably exist. As a Robert Heinlein character says in the short story “Space Jockey,” “What good is seven-place accuracy with bum data?”

Red Teaming: Seven Red Flags

You might be surprised to learn that I don’t believe red teaming always works. You might be even more surprised that I believe red teaming can sometimes do more harm than good. Here are seven red flags that might indicate that you need to review and perhaps reconsider how your red team goes about its business. Read on …