As you may know, I've been involved with red teaming at all levels of CCDC, but I've also taken part in a number of CTF competitions. CCDC is one of several defense competitions growing in popularity, along with the high-school-level CyberPatriot and the military academies' CDX. These stand in contrast to the longer-running capture-the-flag competitions commonly found at hacker conferences and elsewhere, which tend to focus on finding exploits for pieces of software. Defensive exercises have come under harsh criticism in the past few years, so are they really doing any good?

One of the most outspoken critics of CCDC has been Chris Eagle, who has given a number of presentations on competitions: "Organizing and Participating in Computer Network Attack and Defense Exercises" in 2009 (http://www.nps.edu/video/portal/Video.aspx?enc=Fvcj9jTKwtwcxg2Wgv3NOEGEdfe6jktD) and a keynote at Infiltrate in 2013 (http://infiltratecon.com/chriseagle.html). In both talks he draws on his significant experience with the Defcon CTF, which his team has won twice, and with defensive competitions, primarily CDX. I do not want to put words in anybody's mouth, so I have tried to quote directly from the talks in order to give the criticism a fair response.

In his 2009 talk, he said "there's not a lot of places you can go out and be a professional attacker" but "99% of security professionals/network administrators are concerned with defense." I don't know whether that's exaggerated, but we'll go with it and try to compare the effectiveness of the different competitions for primarily defensive personnel.

On defense exercises, Mr. Eagle said "I don't get involved in these that much because it's not fun for me" and explained that he teaches secure programming and identifying vulnerabilities in binary software. For the CDX: "They dictate to you the services on your network." "We want the most secure network that meets the specifications of the network ... this happens in the real world all the time," calling it a "fairly realistic scenario" but saying "if you completely lock down your network, nothing's going to happen." (It looks like some things changed by 2013.) He said that the CDX's goals were to teach the students "secure system configuration," "instrumentation ... watch what happens," "tool selection," "network architecture," and "recognizing attacks."

Those goals apply to CCDC as well, but CCDC also emphasizes business skills, communication skills, the ability to adapt, and familiarity with a wide variety of OSes and software, alongside the secure-configuration piece. As much as every pentester and security geek (myself included) hates reports, in the "99%" of security jobs we are talking about, how well you can communicate is directly related to how much difference you can make. CCDC grades writing, reporting, and how well students can make a business proposal, all of which are highly relevant to most of the students but ignored in Mr. Eagle's criticism. Familiarity with, and the ability to manage, many different operating systems is also relevant to all of IT, security, and pentesting, although it is often absent from many CTFs.

In 2009, the main shortcomings he listed were:

"Only COTS/open source software was allowed." In my big-network defensive jobs I've seen and used many expensive pieces of security software and hardware. In my opinion, none of them are better for defense than the open-source software and custom scripts/tools used by intelligent people, which are routinely used in CCDC. I am assuming CDX has different rules preventing student tools, which does seem sad indeed.

"Attacks are not new ... Defenders aren't stepping outside their comfort zones much ... antivirus is a classic example; it can only tell you what it knows about ... the well-funded adversary isn't using attacks that you know about ... this exercise isn't helping us find that kind of attack." I can't speak for all CCDC red team members, but the malware I am using is most definitely custom and not detected by any antivirus. While I didn't use a browser 0day, there are more new elements in our attacks than many real bad guys use. The students who can stop it are definitely using the same techniques that will detect and stop real custom malware, such as state-sponsored rootkits. Even top-tier, well-funded attackers still have only one or two new pieces; operational success even against well-funded bad guys comes not from knowing about their 0day exploit before it is used, but from being able to find the not-new parts, from anomaly detection, or from detecting the techniques used rather than the specific tools. All of these strategies can be used by successful CCDC teams.

"Misconfiguration downfall" was the most common failure of teams. In other words, teams were compromised by old exploits, misconfigurations, stolen credentials, or often web exploits developed on the fly. All of which are quite realistic. While Mr. Eagle may prefer to develop exploits for memory corruption vulnerabilities, the "99%" won't be doing that in the real world and won't be doing it in CCDC either. Familiarity with those techniques is still useful and can still be used to stop such exploits in CCDC.
"Hyper-aware participants don't necessarily reflect typical network users ... I've got 2000 users who aren't hyper aware." Mr. Eagle explained that phishing is a typical stepping stone into a network, but harder to do in an exercise, saying the exercise "doesn't quite reflect the real world." While not all do, many CCDC events incorporate users who roleplay typical network users and may click on malicious links, download games, etc. It is difficult to reproduce the scale of a large network, but smaller-scale competitions can still be instructive. We did use client-side attacks as an entry vector in our most recent contest. And of course, the Defcon CTF and most other CTFs don't have anything even close to the CCDC client-side attacks.

"How do you measure the effectiveness of your defenses?" "What do you ask when someone says they can block buffer overflow attacks?" "Unless you understand the technical details of what they claim to be preventing, you're going to have a hard time seeing through the vendor promotional literature." "My argument is that there is tremendous benefit even in the defensive case, in learning these skills." I agree that those skills provide a benefit, and they may even be a more direct measure of IQ than the skills needed to succeed at CCDC. But CCDC measures skills as they are measured in the real world: the ability to stop attacks that are as real as possible on systems that are real. Much more goes into that than familiarity with binary reverse engineering and exploit development.
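To make the anomaly-detection strategy mentioned above concrete, here is a minimal sketch of my own (an illustration, not any team's actual tooling) of a least-frequency-of-occurrence check: a process path or autorun entry that appears on only one host out of many is worth a closer look, whether or not any antivirus signature matches it.

```python
from collections import Counter

def rare_events(observations, threshold=2):
    """Return events seen fewer than `threshold` times.

    `observations` is a flat list of events gathered across hosts,
    e.g. process image paths or autorun entries. Custom malware tends
    to show up on one or two machines, while legitimate software is
    everywhere, so the rare entries are the ones worth triaging.
    """
    counts = Counter(observations)
    return sorted(event for event, n in counts.items() if n < threshold)
```

Feeding it the process lists collected from every workstation surfaces the one-off binary regardless of whether any vendor knows its signature, which is exactly the "find the not-new parts" approach that works against custom malware.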

Mr. Eagle contrasts that with the offensive contests, which primarily consist of software vulnerability analysis, exploit development, and binary patch development for custom software: "You've never seen this software before," "every attack is new," "we want them to analyze services, find exploits."

This is exactly what the "1%" does, and it does demand a specific and demanding skill set that not many people have. Familiarity with those skills is relevant to most security jobs, but building a contest around measuring only those skills, while fun, is not the best way to encourage development of all the skills most security professionals use.

Directly confirming my suspicion with the Defcon CTF, Chris Eagle (surprisingly honestly) said "I have pigeonholed myself into the binary software analysis arena." He continued to explain how NPS has developed many tools that make them really good at the Defcon CTF but aren't applicable to the real world, since they're tailored to alert on Defcon flags and those specific types of binaries, and would be unlikely to alert on real attacks. As he said, "It's really kinda focused on the game" and "We've gamed the game a lot" since "We'd seen the same kind of game three times."

We have also seen a number of students at CCDC develop their own scripts and tools for the competition. The difference I see is that, so far, all the custom tools I have seen students employ could also be used on real networks to harden systems or to detect and disable real malware. This is another indication that CCDC, as opposed to the Defcon CTF finals, is not teaching students how to "game the game"; it's teaching them how to defend a real network.

It's ironic to me that while I have not seen any business that finds 0-day vulnerabilities in binary products just to write a custom binary patch to defend itself, businesses frequently write, maintain, patch, and secure their own web applications. Yet in his 2009 talk, Mr. Eagle specifically said he does not have or practice these skills. The Defcon CTF has been so pigeonholed into one facet of application security that the one aspect of application security most commonly practiced is still untouched. (quals excluded)

There are a number of big problems when "pigeonholed" exploit developers make defensive recommendations. As penetration tester and instructor John Strand has said, new hires often think "it's all about the exploits" when in reality it mostly isn't. One of the most common ways attackers compromise a business network is by finding a vulnerability on a web server, stealing credentials, and pivoting throughout the network. When the attacker (e.g. a penetration tester) then provides advice on how to secure the network, they will always include instructions on patching or mitigating the vulnerability, but will rarely if ever explain how to manage a network without leaving credentials lying around. In contrast to most IT professionals and exploit developers, CCDC students are far more likely to understand and address credential-dissemination issues, which frequently make the difference between one popped server and a completely compromised domain. As already mentioned, CCDC also rewards students who can identify and fix vulnerabilities in web applications, who have communication skills, and so on.
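A credential sweep doesn't have to be sophisticated to be useful. A sketch like the following (the patterns here are illustrative, not exhaustive) run over config files and home directories will turn up many of the plaintext passwords and private keys that enable the pivoting described above:

```python
import re

# Illustrative patterns only; a real sweep would cover many more formats
# (connection strings, API tokens, .netrc files, shell history, etc.).
CRED_PATTERNS = [
    re.compile(r"password\s*[=:]\s*\S+", re.IGNORECASE),
    re.compile(r"-----BEGIN (?:RSA |OPENSSH )?PRIVATE KEY-----"),
]

def find_credentials(text):
    """Return (line number, line) for every line matching a credential pattern."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), 1):
        if any(p.search(line) for p in CRED_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits
```

The point is less the script than the habit: a defender who sweeps their own network for loose credentials before an attacker does has removed the most common pivoting fuel.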

In the 2013 keynote, Chris Eagle spends a few minutes talking about his most recent experience with CDX: "scenarios are generally unrealistic," and the students were given "10 year old servers" pre-infected with the NSA red team's latest backdoors. He derides having to have "unpatched XP boxes" that "can't be running antivirus" with "pre-installed backdoors." He concludes by calling the contest a "nightmare from top to bottom" and saying "the only real lesson from a cyber defense exercise - know when to walk away" because it "devolves into forensics," which "makes no sense at all." He also says that the students are required to "know nothing about offense" and "don't get to act in any part like a red team." He points out that the CCDC winners didn't score a single point in the Defcon CTF (though he also admitted at least once that his own team didn't qualify for the Defcon CTF finals).

I'd like to point out that many of those complaints are not true of CCDC, and that others, far from being unrealistic, actually reflect the real world quite well. In CCDC, students can and do install any antivirus or security software they want. Mr. Eagle's biggest complaint is that the students come into a network that already has malware on it. This is not true of CCDC, where the red teams have no prior access to, or pre-installed backdoors on, the students' systems. In fact, in many events the students get time to examine and start defending their systems before the red team is allowed to start.

Even if that were not true, I'd argue that pre-existing backdoors are a realistic condition. The well-known intrusion response company Mandiant reports that when they discover a compromise, the intruders have maintained access for almost a year on average, and sometimes many years, before detection. Gaining access occurs at one very-difficult-to-detect point in time, and serious attackers will first run their attacks through most security systems to ensure they will not be caught. With a lot of expert analysts and custom techniques it is possible to detect an initial intrusion, but this cannot be relied upon. IT/security staff have a much better chance of identifying a compromise if they can master forensic and antimalware skills. Far from making "no sense at all," identifying malware pre-existing on a network and conducting forensics would be valuable and directly applicable to the real world even if it were the entire competition.

Although some old systems are often included in a CCDC network, many systems are at least close to up to date. In the last CCDC competition, each student network had only one server old enough that the red team had a remote exploit for it. At present, about 30% of desktops are running Windows XP, so having a system or two on a large network running an old operating system is very realistic; I'd be surprised if any large company does not have at least a few systems that are 10 years old. I find it interesting that Mr. Eagle first complains that in defensive competitions students do not face the zero-day exploits real attackers may use, and then complains that some of the systems the students are given are vulnerable to known exploits. It's hard to complain both that attackers have exploits for a system and that they don't. Zero-day or not, attackers may have an exploit for one or more of an organization's systems, but are unlikely to have exploits for most or all of them. CCDC reflects that reality.

Mr. Eagle complains that students are required to "know nothing about offense" and "don't get to act in any part like a red team." In fact, CCDC students are encouraged to employ offensive techniques, and the most successful teams routinely do, using open-source reconnaissance and exploit tools to identify vulnerabilities on their own networks and address them. While it may be more "fun" to aim those attacks at other teams, teams still benefit from acting like a red team. It is true that CCDC teams do not identify and write exploits for memory corruption vulnerabilities, which is the entirety of the Defcon CTF, but they still directly exercise more of the skills used by more security professionals.
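Scanning your own hosts for unexpected listening services is among the simplest of these self-reconnaissance techniques. In practice teams would reach for a tool like nmap, but the idea fits in a few lines of standard-library Python; this is a bare-bones sketch, not a replacement for a real scanner:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Attempt a TCP connection to each port; return the ones that accept.

    A plain connect scan: connect_ex() returns 0 when the port accepts
    the connection, i.e. something is listening there.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

The value for a defending team is the delta: compare the open ports you find against the services you are required to run, and anything extra is either a misconfiguration to close or a backdoor to investigate, before the red team finds it first.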

In summary