As a red teamer, I was immediately skeptical of the pop-up sobriquet “fake news.” It’s one of those vague, accusatory phrases, more a playground insult than a real argument. How does one respond to the charge of fake news? With counter-evidence and counter-reasoning? Don’t bother; the thought-stopping power of the label has already done its work.
The sinister beauty of the fake news charge is that it truly is in the eye of the beholder, and in this sense, all news is fake. Everyone reads the news from their own perspective. How often, for example, does the reader of Drudge Report go to The Huffington Post to get their “news,” and vice versa? Each of us selects and reads “the news” from our own perspective. Read on …
It’s time to update The Red Teamer’s Bookshelf. In the past, we’ve either built the list ourselves or consulted a small group of colleagues. This time we’d like to crowdsource the list in partnership with redteams.net and OODA Loop. Use the contact page to send us the titles of the book or books that you believe red teamers should be reading. (You can reach back into history; these don’t need to be 2016 titles.) When you do send us your title or titles, add a sentence on each telling us why you think it’s important. After a week or so, we’ll aggregate the submissions and post The Red Teamer’s Bookshelf (2016 Edition) at all three sites. To get your thinking started, here are the previous lists:
Matt Devost’s list of the best security, business, and technology books over at OODA Loop is worth checking out as well.
The following is the Red Team Journal mission as posted on the site in 1998. For better or worse, it still applies today. (We even dug up an old RTJ banner!)
In spite of growing readiness problems, the U.S. military remains without peer. It is the best-trained, best-equipped force in the world. Its budget is larger than the next five largest defense budgets combined. It fields technologies many forces won’t acquire until well into the next century, if ever.
Yet today the United States is more vulnerable to attack than perhaps ever before. The same complex latticework of technology that powers our cities and yields dominant awareness on the battlefield masks an abundance of critical leverage points. A knowing adversary can target these points with potentially spectacular effect. Pick a card and pull. The house comes down. Read on …
As regular readers know, I post something about myself now and then. Today, I’m asking RTJ readers to help me boost my LinkedIn profile. I’ve let it languish, and now that I want to use it, it looks like, well … like it’s been languishing! If you’ve found my musings on RTJ useful to your red teaming practice, please consider posting a recommendation to my profile under the RTJ section. If you’re not connected to me on LinkedIn but would like to be, let me know through the RTJ contact page, and I can send you my email so you can get through the LinkedIn gates.
At Black Hat USA 2014, I shared a diagrammatic method of perceiving what I call con and hypercon. A con is just what it sounds like: a state in which one actor attempts to deceive another, most often to do something that benefits the first and hurts the second. Phishing is a con, as is “the big store” in the movie The Sting, as is, more generally, any case in which one actor willingly hides a secret or projects a falsehood when attempting to manipulate another actor.
A hypercon state exists when the “conned” actor sees through the con. This opens options such as covertly watching what the “conning” actor will do and exploiting the “conning” actor’s now inferior state of knowledge. Turning the tables on a phisher or, in other words, “conning the con” might result from awareness at the hypercon level. Read on …
I spoke briefly yesterday with a gentleman who runs a successful pentesting company. For the most part, I get what he does, but I don’t think he got what I do, nor did he seem inclined to ask any questions to find out exactly what that might be. (At one point, he described my version of red teaming as “sitting around thinking,” which, of course, doesn’t make money!)
The misunderstanding just might be my fault. I realized today that I need a better method of describing how successful red teaming addresses the whole system even if the red team ultimately only “attacks” a portion of it. I’ve tried before (here and here), but until I get it right, I’m going to keep trying. Read on …
It might just be me, but it seems as if red teaming is getting some additional attention lately. As proof, I offer these recent articles:
Next level red teaming: Working behind enemy lines
Stealing, scamming, bluffing: El Reg rides along with pen-testing ‘red team hackers’
(And if you don’t follow our Twitter feed, consider doing so. We post links to articles like this regularly there.)
Red teamers can be annoying. Sometimes the annoyance is justified, sometimes not. After all, who likes to be told that they overlooked a key assumption or failed to implement a sensible practice? It’s not surprising that many people resist even the idea of red teaming.
As red teamers, we often lament the shortsightedness of this resistance. What we don’t discuss very often is the uncomfortable fact that we often aggravate and perpetuate it. Yes, we can be self-satisfied and snobbish. And why not? We spend our days thinking about important things other people ignore, neglect, and overlook. Even when we’re not snobbish and condescending (honest!), we have to work twice as hard not to be perceived as such. That’s just the nature of the game. Read on …
The pleasant little 1968 comedy Hot Millions starring Peter Ustinov and Maggie Smith features an interesting moment relevant to red teamers. (If you haven’t seen the movie but intend to, stop reading here.) Ustinov plays a compulsive embezzler. After serving time in gaol (that’s “jail” for us Yanks), he assumes a programmer’s identity and secures a job at a large company. He thereupon attempts, unsuccessfully, to circumvent the security of the company’s computerized accounting system. Temporarily frustrated, he is delighted to learn that a simple “bang” on the side of the computer’s casing with a mop bucket opens it, circumventing the security he’d tried so hard to foil. The punchline? Ustinov learns the secret by chance; the cleaning crew uses the trick to open the computer in order to warm their tea inside the computer’s casing.
The real world, of course, is rife with such irony, and superior red teamers have a nose for it. Perhaps not often (but often enough), the most splendid security system is vulnerable to an unexpected, comically simple exploit, all of which calls for the timely services of the superior red teamer’s nose. It reminds me of Red Teaming Law #17: “The superior red teamer learns how things work in the real world, not just how they work on a diagram or presentation slide. The most useful insights often come from the bottom of the org chart. The higher up the org you go, the broader the view but the more filtered the information.”
Postscript: There’s another Ustinov movie with a scene relevant to red teamers. I’ll post on that soon.
“You never really understand a person until you consider things from his point of view . . . until you climb in his skin and walk around in it.”
– Atticus Finch to Scout in Harper Lee’s To Kill a Mockingbird.
This is the heart of adversarial red teaming, right?—to consider a problem from the adversary’s perspective. Kind of . . . what Atticus advocates is something more, something elusive, and something many red teamers unthinkingly overlook: genuine empathy. Read on …