Have you ever read the popular children’s series Where’s Waldo? or looked at one of those 3D Magic Eye images that took the ’90s by storm? The basic premise for both is to spot a hidden person or object within a larger puzzle. But sometimes the harder you stare, the less you find. Then, someone peers over your shoulder and points out Waldo hiding behind a tree, and somehow that elusive “magic” image comes into focus. That’s what a cybersecurity Red Team can do: challenge your thinking by helping you see things you couldn’t see before.
Why We All Need a Second Opinion
As humans, we each process information and solve problems differently. Our knowledge and personal experiences influence the way we view the world and have a huge impact on the decisions we make. As a result, we form “cognitive biases,” or unconscious flaws in our thinking, as our brains try to simplify complex ideas and situations.
First introduced in 1972 by Israeli psychologists Amos Tversky and Daniel Kahneman, the concept of cognitive bias can take many forms. For instance, attentional bias is when we prioritize certain things while ignoring others. Like when you’ve fallen in love with an 1800s Colonial-style house for its charm and top-ranked school district but brush aside the aging electrical system and sagging roofline. And by “you,” we may mean “us,” but that’s another story for another day.
Another bias, functional fixedness, impedes our ability to think outside the box and find new ways to solve problems. It's the idea that because you've always used a paper clip to keep pages together, you can't imagine alternative uses for it, like fixing a zipper or picking a lock. And then there's the optimism bias — aka the illusion of invulnerability — that tricks us into believing negative experiences "won't happen to me."
Our inherent cognitive biases are why we need others to provide a second set of eyes or play devil's advocate to help us think critically about the flipside of an issue, expand our view of the world and, hopefully, avoid damaging missteps. Red Teams were built on these very ideas: to put assumptions and plans through the wringer and make teams more resilient.
Red Teaming and the Art of Alternative Analysis
Red Teaming is a concept first introduced by the military to help shed such cognitive biases and test strategies from an external point of view. In a simulated wargame, the Red Team acts as the adversary, using various techniques and tools to try to penetrate defenses.
The U.S. University of Foreign Military and Cultural Studies (UFMCS) defines Red Teaming as “a function executed by trained, educated and practiced team members that provides commanders an independent capability to fully explore alternatives in plans, operations, concepts, organizations and capabilities in the context of the operational environment and from the perspectives of our partners, adversaries and others.” Red Teaming is built on four principles: self-awareness and reflection; fostering cultural empathy; groupthink mitigation and decision support; and applied critical thinking.
Because cognitive bias affects us all, teams across public and private sectors can benefit from an outsider’s look at their processes. Fans of Aaron Sorkin’s The Newsroom will recall when an outside Red Team was assembled to poke holes in a high-stakes investigative news story and test the credibility of its source, unbeknownst to the journalists assigned to the project. Similarly, real-world law enforcement and legal teams use Red Team techniques to uncover weaknesses in their cases and improve trial advocacy. And as organizations face an endless barrage of cyber threats, many are conducting independent Red Team exercises to get inside the mind of an attacker and put their cybersecurity defenses to the test.
How Adversary Simulations Can Help You Find Flaws Before Cyber Attackers Do
Red Team adversary simulations provide a safe, controlled way for security operations teams to uncover vulnerabilities, test response capabilities and identify areas of improvement. Red Teamers use any means necessary to mimic a real-world attack without introducing risk to the business. Organizations often engage independent Red Teams who bring advanced skills, fresh perspectives and objectivity to the table — along with the element of surprise — which is hard to achieve with an in-house group.
The Red Team works closely with the organization to determine the goals of the program based on its unique concerns and requirements. The organization may choose to test against known threats — by following the MITRE ATT&CK framework to simulate indicators of compromise (IoCs) associated with a specific threat actor — or unknown threats — by developing custom tools designed to penetrate the environment, pivot within the network and exfiltrate data.
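For a known-threat exercise of the kind described above, planning often starts with mapping each intended action to a MITRE ATT&CK technique so that coverage can be reported phase by phase. The sketch below is illustrative only: the plan contents are hypothetical, though the technique IDs (T1566 Phishing, T1021 Remote Services, T1041 Exfiltration Over C2 Channel) are real ATT&CK entries.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PlannedAction:
    phase: str           # kill-chain phase the action belongs to
    technique_id: str    # MITRE ATT&CK technique ID, e.g. "T1566"
    description: str


# Hypothetical emulation plan for a known-threat exercise.
PLAN = [
    PlannedAction("initial-access", "T1566",
                  "Phishing lure delivered to test mailboxes"),
    PlannedAction("lateral-movement", "T1021",
                  "Pivot via remote services between lab hosts"),
    PlannedAction("exfiltration", "T1041",
                  "Exfiltrate decoy data over the C2 channel"),
]


def coverage_by_phase(plan):
    """Group planned technique IDs by phase for the exercise report."""
    out = {}
    for action in plan:
        out.setdefault(action.phase, []).append(action.technique_id)
    return out
```

A structure like this lets both sides reconcile the exercise afterward: every technique the Red Team executed can be checked against what the Blue Team actually detected.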
As the global ransomware epidemic continues to grow and wreak havoc, many corporations are engaging Red Teams to identify technology and process gaps to improve cyber readiness. For these exercises, Red Teams design and execute specialized defense analysis programs that aim to encrypt local system files and evade various security technologies the organization may have in place such as anti-virus, endpoint detection and response solutions, and special-purpose ransomware prevention tools. In response, the organization’s security team — the Blue Team — executes its incident response process to contain the infected host, prevent further execution and recover the affected files.
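One signal Blue Teams commonly rely on in exercises like this is the burst of file writes that mass encryption produces. The sliding-window rate check below is a minimal sketch of that heuristic, not any particular vendor's detection logic; the threshold values are assumptions you would tune per environment.

```python
from collections import deque


class BurstDetector:
    """Flags a host when file-write events exceed a rate threshold,
    a simple stand-in for the mass-encryption heuristic used by many
    ransomware-prevention tools."""

    def __init__(self, max_events: int, window_seconds: float):
        self.max_events = max_events
        self.window = window_seconds
        self.events = deque()  # timestamps of recent write events

    def record(self, timestamp: float) -> bool:
        """Record one file-write event; return True once the number of
        events inside the window exceeds the threshold."""
        self.events.append(timestamp)
        # Drop events that have aged out of the window.
        while self.events and timestamp - self.events[0] > self.window:
            self.events.popleft()
        return len(self.events) > self.max_events
```

In a real exercise, a trip of this detector would feed the incident response process described above: isolate the host, kill the offending process and begin file recovery.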
At the end of the exercise, organizations often receive a two-part takeaway. The first report is a 50,000-foot view of the organization's security posture, with key findings and risk-prioritized recommendations for the executive team. The second is a technical analysis for security teams that details the vulnerabilities uncovered and recommends remediation steps to reduce exposure. With deeper insights into their security strengths and weaknesses, organizations can bolster defenses and create a baseline from which future security improvements can be measured.
Winston Churchill once said, "Criticism may not be agreeable, but it is necessary. It fulfills the same function as pain in the human body. It calls attention to an unhealthy state of things." Many cyber attacks begin as a twinge, so minor they go unnoticed until real damage is done. By exploiting weaknesses in systems and processes, and in human nature itself, Red Teamers push cybersecurity teams to think differently and see things sooner, no matter how uncomfortable the process may be. So empowered, these teams can anticipate future failures and work to stop them before they ever happen.