• This slide presentation does not articulate a crucial aspect of Red Teaming/Alternative Analysis: it is an essentially empirical exercise in which ‘new facts’ are created through a process of ‘acting out’ the event. In simple terms, running a Red Teaming/Alternative Analysis exercise is radically different from conventional decision-making methodologies, because the analysis team actually ‘acts out’ a fictional event. This ‘creates’ a new raft of facts that adds to the data already available, which a decision-maker can use in developing their analysis.

  • A second critique of this slide presentation is that it does not relate Red Teaming/Alternative Analysis back to its most important function, which is to inform the start of the Risk Management Cycle.
    Red Teaming/Alternative Analysis, through evidence-based testing, is a mechanism to ‘test’ a hypothesis that a decision-maker holds about a particular risk, or about the presumed consequence of particular events occurring.

  • Chris,

    From your comments, I understand generally how and why you conduct red teaming. Many practitioners would agree with you.

    Not everyone, however, conducts red teaming or alternative analysis the same way or for the same purpose. For example, your statement that the “most important function” of red teaming “is to inform the start of the risk management cycle” is not universally held. Nor do I think you’ll find unanimous agreement over whether the red team must actually “act out” a “fictional event.” The DSB, for example, included devil’s advocacy within their definition of red teaming. Alternative analysis as a class of tools or approaches is even more broad. Consider additionally the Sandia list of eight types of red teaming. The list does a good job of describing the variety of red teaming types that decision makers can employ and not all of the types are necessarily designed to inform the start of the risk management cycle.

    My goal was to remain as agnostic as possible regarding specific methods or approaches. I even avoided presenting my preferred red teaming framework and approach in order to preserve this broad, introductory perspective. Yes, I offer a representative set of definitions, but only to illustrate the wide variety of current views. I’m happy to add your definition to the presentation, but only within the context of one perspective among many. In fact, I agree it would add some ideas the other definitions currently lack. I’m also open to adding “risk assessment” as a general principle (see slide 31), and on that count, I’m interested in hearing what others think.

  • Chris, Mark,

    I think this could also be grist for a post/article on differing styles of red-teaming, especially bridging strategic cultural divides (such as government/private, US/NATO, etc).

  • In answer to the question of whether Red Teaming/Alternative Analysis essentially ‘acts out’ potential events, I will relate this to a real-world scenario (based on recent project work). In this particular case, the Red Teaming/Alternative Analysis was to identify the consequences of typical terrorism/counter-terrorism events in public space. The owners of that public space were private companies whose risk management plans and facility plans (in terms of site upgrades) had to run parallel with local policing and national security planning. In this scenario, these public spaces had never been the subject of a terror attack involving a bombing, and the senior risk manager had never experienced such an event, nor had his security team; so any consequence analysis, while based on ‘factual’ analysis of the building structure (and so forth), was nevertheless a ‘fiction’. At the same time, this analysis essentially produces a ‘new set of facts’ (i.e. data: what were the engineering issues/results of a blast event, and how would this impact the risk and security management of the area in question). Thus, in this particular case, the Red Teaming/Alternative Analysis was ‘acting out a fictional event’ in order to pass the experience vicariously on to the risk management team, informing their analysis of how they might respond. Moreover, Red Teaming/Alternative Analysis fundamentally reverses the polarity of the decision-making cycle: whereas traditional decision-making is based on the facts presented, Red Teaming/Alternative Analysis runs contra to this, in that its mechanisms create ‘new facts’ to challenge the conventional response.

  • Chris’ use of the term ‘fiction’ is itself potentially a point of contention. ‘Fiction’ is a term that, IMO, risks undervaluing the scenarios red teams identify. Professional red teams identify possibilities; the greater the capability of the red team, and the higher the level of effort exerted by the red team, the greater the confidence in the scenarios identified. Furthermore, red teaming is most productive when scenarios that have never occurred in actuality are identified for the sponsor. In theory anyway, defenses should already account for scenarios consisting of malevolent events that have previously occurred. Forgive my focus on the malevolent, but by connecting red teaming and alternative analysis in this context, we are already in some danger of building a conceptual trap since alternative analysis and risk analysis can address abnormal conditions like extreme weather events that red teaming can’t.

    Chris made very good points about the creation of new data from red team scenarios. Having reviewed Chris’ website, I appreciate how his clients will benefit from a risk-based structuring of red teaming/alternative analysis. That said, as a veteran red teamer I have participated in many assessments that were not funded to answer risk questions, but questions related to other aspects of security. Now, can I connect the concept of ‘risk’ to any or all of those? I think I can, though some of those connections would be forced. But the sponsor wasn’t necessarily asking the red team to identify and discuss risk.

    Good discussion!

  • Problematically, the bulk of national critical infrastructure is civil and in private hands, and these institutions tend to talk the language of risk management; often it is senior risk managers who have the most direct access to governance boards. In my experience, a lot of the vulnerability analysis being undertaken is in effect Red Teaming/Alternative Analysis (though it is not called that), and it is being conducted in the language of vulnerability and risk analysis. Most of my work has been about ‘translating’ typically military and security concepts into the risk management framework, because risk managers are our primary stakeholders. And I think that, as we develop national and international definitions, methodologies and standards, these issues, such as the relationship to the risk management cycle, need to be taken into account.

  • Absolutely, in the context of helping infrastructure owners identify what they should know and what they should do about things representing risk, I agree that translation of security concepts to the risk world is needed. There is opportunity for the red teaming/alternative analysis community to work with the risk community to improve techniques and concepts used and/or shared between the communities.

  • I think this has been a good discussion. Reviewing ISO standards such as 27001 and 27002 (for example), we see a ‘justification’ for red teaming/alternative analysis, but there seems to be no direct reference to it. This lack of ISO standardisation makes it difficult to promulgate the methodologies. Interestingly, the Security Risk Management Body of Knowledge by Julian Talbot and Miles Jakeman (August 2009, Wiley Series in Systems Engineering and Management, Volume 001) was developed to align with International Standards for Risk Management, such as the forthcoming ISO 31000, and it lists ‘red teaming’ as an adjunct methodology to physical security risk management, alongside traditional IT security risk management.
