On Red Teams - Part 2
Introduction
This article is a continuation of, and a correction to, a previous article I wrote in 2017, titled “Red Team and its Role in Penetration Testing”, which was published in Digital Whisper, DW84, in July 2017. This English version is a translation of the original Hebrew article published in Digital Whisper, DW167, on October 31st, 2024.
The reasons that led me to write this article are varied, some private, some public, but the main one is the fact that no matter how much time passes, I constantly encounter incorrect definitions and misconceptions about what a Red Team really is, and I want to set the record straight. It’s likely that some of what is written here will be different from what was in the previous article, and the reason is pretty simple: I’ve learned.
Honestly? Today, Red Teaming isn’t what it used to be. At least not for everyone working in the realms of information security and cyber. Several years have passed (seven, but who’s counting) since I wrote the previous article on Red Teams, emphasizing their role in penetration testing, and I think it’s time for Part 2.
So in this part, I don’t intend to focus on their role in penetration testing (though I will discuss how, in my opinion, these projects can be improved); I want to focus on the essence of a true Red Team. I believe the concept itself, once understood, can provide great insight and value and improve results in all kinds of settings (or even in individuals’ own mindsets, more on that later), but that’s okay, I don’t expect the industry’s use of the buzzword to change.
So a Red Team is essentially an idea. A concept. It’s not new at all, and the same idea also exists under other names, including:
- Devil’s Advocate
- The 10th man in the room
- Opposing Force (OPFOR)
- Red Cell
And there are likely more, but that’s less essential at the moment. What’s important to understand is that this is not some unique concept that was invented recently, even though the term is probably at its most prominent and loud in cyber. (Although all things AI are giving it a good run this year.)
So what is a Red Team? Well, it depends on who you ask. Most definitions you will find on the internet frame the Red Team in an information security/cyber context, and very few capture the concept’s broader scope. In some places, you will find definitions like “the team that is meant to practice against the blue team”, “a team of pen testers who also check physical security”, or “cyber tests that include phishing and social engineering” – all of these definitions are incorrect.
Now, after some readers may have become annoyed, I want to start talking about the Red Team, leaning on a few sources older and better than me, and discuss a bit about what such a team does, its limitations, when one is needed, where one can be used (spoiler – in any situation), and what makes someone a “Red Teamer” (another spoiler – being a super-duper advanced, highly technical pen tester who can break into anything is not part of the requirements. Sorry).
So the first line of the definition in Wikipedia hits the nail on the head:
“A red team is a group that pretends to be an enemy”
The sharp-eyed will notice that nothing is mentioned about cyber, zero days, pen tests, or anything else related to computers or information security – and the reason is quite simple: the concept of a Red Team started long before any of that existed.
Where did it start?
The idea of a “Red Team” is not something new that was invented in recent decades. If we go back in time, we see that the roots of this idea are deeply embedded in human history, especially in military and strategic contexts. One of the earliest appearances of the principles upon which a Red Team relies comes from the 5th century BC in China, with the famous book “The Art of War” by Sun Tzu. Sun Tzu understood well the importance of understanding the enemy deeply, thus being able to anticipate their moves and overcome them. His message was simple: if you want to win, you must think like your enemy. This is where the approach of the Red Team starts – thinking from the opponent’s perspective to improve your strategies.
During the Middle Ages, we can also see similar uses of this concept in a surprising place – the Vatican. Yes, the Catholic Church was one of the first to adopt a “Red Team” approach, with a special role called “Devil’s Advocate”. When they wanted to declare someone a saint, someone was appointed specifically to challenge the claims and argue against them – all to ensure that the decisions being made were done through thorough critical examination.
Over the years, this approach evolved in the military world, particularly in the 20th century, as NATO’s military training included exercises where special units acted as opposing forces (OPFOR) to challenge defending forces and prepare them for real scenarios on the battlefield. This method of simulating the enemy in-house in order to assess your own capabilities has since expanded into other fields, such as business and cyber, becoming an integral part of the strategies of organizations and companies that want to deeply understand their weaknesses.
Today, a Red Team is not just a military or religious matter. It has found its way into diverse fields such as business, intelligence, and even our daily lives. So how does all this connect to the world we live in now? Well, first of all, Red Teams are used not just to simulate physical attackers, but also digital ones. In the cyber world, their role is to simulate attacks on organizations’ computer systems and networks, using a wide range of tools and techniques. We are talking about scenarios like social engineering, phishing, and any other method that can simulate the actions of a real attacker.
It sounds simple, but in practice? A good Red Team is far more than a group trying to break into systems. It involves a holistic approach that includes strategic planning, deep understanding of the opponent, and primarily creative thinking that does not rely solely on the newest and coolest technological tools. This is a team that knows how to challenge the existing assumptions of the organization, how to expose the most unexpected vulnerabilities, and how to enhance the organization’s understanding of its resilience.
In business, for example, companies recognize the value of a Red Team not only concerning information security but also in other situations, such as testing business assumptions and assessing financial risks. Many financial companies, for instance, use Red Teams to challenge fundamental assumptions regarding investments and make more informed decisions. Basically, anywhere you need to think several steps ahead, see the big picture, and prepare for what could go wrong – a Red Team is beneficial.
And what about our daily lives? There’s also room for a Red Team here. Think about the decisions we make in our personal lives or businesses; sometimes it’s worth pausing to ask ourselves: what would I do if I were my competitor? How would I attack myself? This is exactly offensive thinking. It helps us examine ourselves from new angles, challenge existing assumptions, and find ways to improve what we do.
In short, a Red Team is not just a technical tool – it is an approach, a way to think and see the world. And if done right, it can improve all the areas in which we operate.
Definitions
- According to Red Team Journal, one of the leading sources in the field, red teaming is “the practice of looking at a problem or situation from the perspective of an adversary”. The idea here is not only to check if everything is working as it should, but also to help the organization see things from a different angle, uncover weaknesses that were not clear before, and suggest ways to strengthen performance.
- And what about a broader definition? In the book Red Team: How to Succeed By Thinking Like the Enemy, a Red Team is defined as “a vital strategic capability for any organization that wants to improve by forcing itself to think like an enemy.” The emphasis here is on a broader approach based on strategy and critical thinking, not just techniques of information security. That is, the goal is to get inside the opponent’s head, whether in the business world, intelligence, or any other field, and improve the organization’s performance by thinking like a potential attacker.
Uses
- Use in the Intelligence World: In intelligence, a Red Team focuses on conducting simulations of potential enemy operational scenarios. A famous example of the successful use of such techniques is the operation to capture Osama bin Laden: American intelligence agencies conducted a series of simulations and deep planning sessions to challenge their own assessments and weigh every conceivable scenario, improving the operation’s chances of success.
- Use in the Business World: In the business world, Red Teams function as a strategic tool for examining business decisions while preventing cognitive biases. For example, financial companies use Red Teams to assess investment risks and ensure that the decisions made are not based on faulty assumptions, but on a comprehensive and objective analysis of reality.
- Personal Use in Daily Life: Even in everyday life, we can adopt the principles of “red teaming” to examine ourselves. This means thinking like your competitor or potential adversary and asking yourself how you would attack or challenge yourself. For example, in a small business, one can use this approach to challenge marketing decisions and ensure they are based on sound judgment and not on unfounded assumptions.
- Use in the Cyber World: In the cyber world, Red Teams play a central role in testing the resilience of organizational systems. They simulate real cyber attacks – using techniques like social engineering, phishing, and exploiting vulnerabilities – thus giving the organization an opportunity to learn about its weaknesses and improve its defense system.
- Use in the Military World: In militaries such as those of the USA and Israel, Red Teams are used to simulate enemy forces in military exercises. These exercises allow defending forces to prepare more effectively for real scenarios on the battlefield and build better strategies.
- General Uses: Beyond specific fields, Red Teams can be beneficial in many other areas – from simulating natural disasters and managing pandemics, to scenarios designed to test responses to internal events, such as detecting rogue employees. Any situation where it is important to examine the response and improve the organizational structure can benefit from applying the Red Team approach.
Goals
- Testing Assumptions: The main goal of a Red Team is to examine the central assumptions upon which the organization relies. The team focuses on critical thinking and deeply investigates the assumptions, processes, and models that exist within the organization. This allows it to expose gaps and potential faults that the organization has not noticed, and to make critical improvements.
- Breadth vs. Depth Compared to Pen Tests: Unlike penetration tests, which focus on identifying specific technical vulnerabilities, a Red Team examines the big picture. It operates across a wide range of fields and topics – from strategic conduct to potential operational scenarios. This gives it the ability to understand not only the technical vulnerabilities but also the weaknesses at the organizational level as a whole.
- Defined Objectives: The advantage of a Red Team lies in working towards well-defined objectives. Their goal is not only to find technical breaches but to assess the overall resilience of the organization. This way, they can provide a comprehensive picture of the security level of the defending forces and the strategic measures existing within the organization.
- In the Shoes of a Specific Attacker: The emphasis of a Red Team is to simulate a real and specific attacker. While penetration tests tend to identify security weaknesses without focusing on the profile of the attacker, a Red Team adopts a specific attacker’s mindset, allowing it to develop more realistic scenarios and improve the organization’s ability to cope with specific threats.
Required Traits and Abilities
Adversarial Thinking:
Adversarial thinking requires identifying vulnerabilities and challenging existing perceptions from the offensive perspective of “how would I exploit this if I were the enemy?”.
- Example 1: In a military Red Team, team members might present a tactical unit with a simulation of an attack on a military base. Instead of thinking about the security surrounding the main entrance gate, they might look for a damaged fence at the far side of the base, one that the defending forces may have neglected.
- Example 2: During the Six-Day War (1967), Israeli forces thought offensively and decided to surprise the Egyptian air force by attacking their planes on the ground before they could take off. This attack, known as Operation Focus, was a sophisticated offensive move that quickly secured air superiority, altering the course of the war in favor of Israel.
Critical Thinking:
Critical thinking is the ability to analyze and challenge existing assumptions, processes, and models objectively.
- Example 1: A Red Team might examine the management processes of a hospital, for example, in the emergency treatment area, and ask questions such as “Are emergency teams really coordinated optimally to handle multiple complex cases simultaneously?” The Red Team can simulate scenarios of overload, such as an outbreak of a pandemic or a mass casualty disaster, and check if the teams operate in a coordinated and timely manner, exposing weaknesses in cooperation between different departments.
- Example 2: The Challenger Space Shuttle disaster in 1986 shows what happens when critical thinking is missing. An engineer named Roger Boisjoly insisted that the shuttle should not be launched because of the risk of O-ring failure at low temperatures, but his warnings were dismissed by NASA managers who did not want to delay the launch. The disaster killed all seven crew members and proved the need for critical examination and for asking questions, even under pressure to proceed.
Creativity:
A Red Team needs to think outside the box to identify unconventional ways to challenge perceptions or overcome barriers.
- Example 1: A Red Team in physical safety might examine an industrial or commercial building to see if there are vulnerabilities unrelated to technology. For example, escape routes might be blocked or poorly maintained, or fire safety equipment may be defective. By conducting a thorough examination, the team can present findings and suggest ways to improve safety.
- Example 2: During World War II, the British used extraordinary creativity to crack the German Enigma machine. Alan Turing and his team at Bletchley Park built the electromechanical “Bombe” to break Enigma-encrypted messages. This approach required creative thinking that went far beyond traditional codebreaking methods.
Empathy:
Empathy is the ability to understand the motives and behaviors of the opponent or users, in order to think and respond like them.
- Example 1: A Red Team can simulate customer service interactions in a public domain like welfare or health, examining how service teams respond in situations where people seek help in stressful circumstances. They can present challenging scenarios, such as a citizen in severe distress, to check whether the system can provide a quick and effective response or if there are failures in service and availability.
- Example 2: During the Battle of El Alamein in World War II, General Bernard Montgomery was able to think like German General Erwin Rommel and understand his motivations and responses. Montgomery anticipated Rommel’s movements and devised an offensive plan that thwarted the Germans’ attempts to repel British attacks. Montgomery’s ability to get inside the opponent’s head helped him win the battle.
Here enters an important distinction between empathy and sympathy. Empathy is the ability to step into someone else’s shoes, to understand how they think and feel, while sympathy involves expressing compassion or identification with that person. In a Red Team, we are not seeking sympathy for the potential attacker – we do not need to like them, agree with them, or identify with their goals. What we do need is empathy – the ability to understand their patterns of action, the thinking that drives them, without emotionally connecting with them. Only then can we effectively simulate the attacker.
Ultimately, the work of a Red Team is, in a sense, to simulate what we hate. Yes, sometimes we must think like the attackers we are trying to stop, people or entities we would prefer did not exist. But to do this correctly, it is important that we know how to disconnect emotionally, use empathy in a cold and calculated manner, and understand their moves without identifying with them. At the end of the day, we are here to protect what matters to us by understanding how those who threaten us think and act – without any sympathy towards them.
Teamwork:
Collaboration and the ability to work harmoniously with other teams are critical in a Red Team.
- Example 1: In a military Red Team, when one member specializes in combat engineering and another in intelligence, they will need to collaborate on complex tasks that require both precise intelligence and careful execution, such as placing decoy charges on sensitive bridges.
- Example 2: During the investigation of the Watergate scandal in 1972, reporters from the Washington Post, Bob Woodward and Carl Bernstein, worked together as part of a journalistic team that uncovered the scandal leading to President Richard Nixon’s resignation. Their collaboration, combining their skills and sources, allowed them to reveal the truth and draw far-reaching conclusions about government corruption.
Game Theory in the Context of a Red Team
Let’s talk for a moment about game theory. Yes, game theory is a powerful tool that can elevate a Red Team’s game several levels. What is it about? In essence, it is a method for analyzing situations where one party’s decisions directly influence the moves of the other, while both parties are trying to achieve conflicting or shared goals. When it comes to a Red Team, game theory is an excellent way to simulate the dynamics between them and the opponent, anticipate what they are going to do, and build strategies that will push them into a corner.
So how does this work in practice? A Red Team adopts game theory principles to think several steps ahead, understand how the defending team (the blue team) will react to their attack, and develop moves that leave the opponent with no response. For example, they might create an attack aimed at one asset, only to make the organization think that is the central target, while the real asset is located elsewhere – this is called a “bluff,” and this tactic is well-known in game theory.
But it doesn’t stop there. Game theory allows a Red Team to plan optimal moves that fit the situation precisely. It not only helps them avoid mistakes but also causes the opponent to fall into them. This is higher-level strategic thinking, integrating psychological, social, and even technological elements, to achieve the best outcome in a real scenario or simulation. In short, if we want to play a smart game and stay one step ahead – game theory is exactly the tool we need in a Red Team.
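To make the “think several steps ahead” idea a bit more concrete, here is a minimal Python sketch of the kind of toy model this reasoning boils down to: a two-asset, attacker-versus-defender zero-sum game. The asset names, payoff numbers, and the closed-form 2x2 equilibrium formula are illustrative assumptions for a toy example, not a description of any real engagement or tool.

```python
# A toy sketch of attacker/defender reasoning as a 2x2 zero-sum game.
# Assets and payoff numbers are illustrative assumptions only.
# payoff[(attacked, defended)] = attacker's expected gain (defender's loss).
payoff = {
    ("A", "A"): 1,  # attack A while A is guarded: small gain
    ("A", "B"): 9,  # attack A while B is guarded: large gain
    ("B", "A"): 6,
    ("B", "B"): 2,
}

a, b = payoff[("A", "A")], payoff[("A", "B")]
c, d = payoff[("B", "A")], payoff[("B", "B")]

# Closed-form mixed-strategy equilibrium for a 2x2 zero-sum game
# (valid here because this matrix has no pure-strategy saddle point).
denom = a - b - c + d
p_attack_A = (d - c) / denom          # how often a rational attacker hits A
p_defend_A = (d - b) / denom          # how often a rational defender guards A
game_value = (a * d - b * c) / denom  # expected attacker gain at equilibrium

print(f"attacker hits A with p = {p_attack_A:.2f}")
print(f"defender guards A with p = {p_defend_A:.2f}")
print(f"expected attacker gain  = {game_value:.2f}")
```

The numbers themselves do not matter; the habit does. The attacker’s best move depends on what the defender expects, and vice versa, and that feedback loop is exactly what a Red Team forces the organization to reason about.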
A Bit About AI and LLM
I’m not going to get into the differences between them and why it matters; it doesn’t matter (here).
Artificial intelligence (AI), generative artificial intelligence (GenAI), and large language models (LLMs) have become central tools in the cyber world, particularly for Red Teams. These tools allow teams to operate more efficiently and rapidly, enhancing their creativity and flexibility. For example, with LLMs, a Red Team can automatically analyze vast amounts of information, generate quick and precise attack scenarios, and even create simulations of cyber attacks that mimic real attackers. GenAI enables teams to develop more sophisticated social engineering pretexts, automatically craft personalized phishing emails, or even generate fake content that organization employees might fall for.
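As a hedged illustration of the “generate quick and precise attack scenarios” point, here is a minimal Python sketch of how a team might structure a tabletop-scenario request for an LLM. The asset list, threat profile, prompt wording, and the call_llm stub are all hypothetical assumptions for illustration; in practice you would replace the stub with whatever LLM client your organization actually uses.

```python
# Hypothetical sketch: structuring a tabletop-exercise scenario request for an LLM.
# The assets, threat profile, and the call_llm() stub are illustrative assumptions only.

def build_scenario_prompt(assets, threat_profile):
    """Assemble a structured prompt asking for a defensive tabletop scenario."""
    lines = [
        "You are assisting a defensive tabletop exercise.",
        f"Threat profile to emulate: {threat_profile}",
        "Critical assets in scope:",
    ]
    lines += [f"- {asset}" for asset in assets]
    lines += [
        "Describe a step-by-step attack scenario (initial access, lateral movement,",
        "objective) for the blue team to rehearse against, and list the assumptions",
        "the attacker relies on, so defenders can test whether those assumptions hold.",
    ]
    return "\n".join(lines)

def call_llm(prompt):
    """Stand-in for a real LLM call; returns a canned string so the sketch runs."""
    return "(generated scenario would appear here)"

if __name__ == "__main__":
    prompt = build_scenario_prompt(
        assets=["customer database", "build server", "VPN gateway"],
        threat_profile="financially motivated intrusion set",
    )
    print(prompt)
    print(call_llm(prompt))
```

The value is less in the code than in the discipline it encodes: the team still has to define the threat profile, the in-scope assets, and the assumptions to be tested; the model only accelerates the drafting.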
However, despite the enormous benefits that AI offers, it is not a substitute for the basic human traits and abilities required of a Red Team member. Adversarial thinking, creativity, empathy, and teamwork are critical traits that can never be completely replaced by technology. AI is merely another tool in the team’s toolbox, enhancing the capabilities of its members. Like any tool, in the hands of experts it becomes more effective – allowing them to be sharper and more creative – but in the hands of those lacking the required skills, it can become an uncontrolled tool that causes more damage than good. AI does not eliminate the need for fundamental knowledge and skills; rather, it amplifies the abilities of those who already master the craft, allowing them to perform their roles with greater sophistication.
For instance, I used it in this article to provide a uniform structure, check spelling, and clarify some expressions – and also to find concrete examples and sources for them.
Summary
Red Teaming is far more than penetration tests or technical security exercises; it’s a strategic concept that enables organizations and individuals to challenge assumptions, think offensively and creatively, and deeply understand their vulnerabilities. By adopting the Red Team mindset and principles, we can enhance our resilience and effectiveness in all areas of life—from business and intelligence to everyday situations. It’s essential to remember that technological tools, including AI and LLMs, are merely enablers that amplify our capabilities, but they don’t replace critical human qualities like critical thinking, empathy, and creativity. By understanding and applying the essence of Red Teaming, we can better prepare for future challenges, improve our processes and systems, and ensure we stay one step ahead of our adversaries.
Sources for Further Reading
You want more; I get that.
I stand on the shoulders of giants; these are the key ones among them for me.
A wide range of knowledge and reading sources that I recommend, instead of James Bond stories on LinkedIn by consulting firm owners:
- Red Team by Micah Zenko | Hachette Book Group
- https://redteamjournal.com/
- Red Teams
- The Applied Critical Thinking Handbook
- Red Teams and Counterterrorism Training - University of Oklahoma Press
- Left of Bang | Black Irish Entertainment, LLC
“The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge.” — Stephen Hawking
Roei Sherman