Last night I attended a talk at my local ISSA chapter. The speaker was Joe Jarzombek, Director for Software Assurance for the National Cyber Security Division of the Department of Homeland Security. Mr Jarzombek began his talk by pointing out that the proposed DHS reorganization would create an Assistant Secretary for Cyber Security and Telecommunications working for the Under Secretary for Preparedness.
This is supposed to be an improvement over the previous job held by Amit Yoran, where he led the National Cyber Security Division, under the Information Analysis and Infrastructure Protection Directorate. According to this story, "Yoran had reported to Robert P. Liscouski, assistant secretary for infrastructure protection, and was not responsible for telecommunication networks, which are the backbone of the Internet." Mr Jarzombek said that people who are not Assistant Secretaries are "not invited to the table" on serious matters.
Turning to the main points of his presentation, Mr Jarzombek said the government worries about "subversions of the software supply chain" by developers who are not "exercising a minimum level of responsible practice." He claimed that "business intelligence is being acquired because companies are not aware of their supply chains."
The government wants to "strengthen operational resiliency" by "building trust into the software acquired and used by the government and critical infrastructure." To that end, software assurance is supposed to incorporate "trustworthiness, predictable execution, and conformance." Mr Jarzombek wants developers to "stop making avoidable mistakes." He also wants those operating critical infrastructure to realize that "if software is not secure, it's not safe. If software can be changed remotely by an intruder, it's not reliable." Apparently infrastructure providers think in terms of safety and reliability, but believe security is "someone else's problem."
I applaud Mr Jarzombek's work in this area, but I think the problem set is too difficult. For example, the government appears to worry about two separate problems. First, they are concerned that poor programming practices will introduce vulnerabilities. To address this issue Mr Jarzombek and associates promote a huge variety of standards that are supposed to "raise the bar" for software development. To me this sounds like the argument for certification and accreditation (C&A). Millions of dollars and thousands of hours are spent on C&A, and C&A levels are used to assess security. In reality C&A is a 20-year-old paperwork exercise that does not yield improved security. The only real way to measure security is to track the numbers and types of compromises over time, and try to see those numbers decrease.
Second, the government is worried about rogue developers (often overseas and outsourced) introducing back doors into critical code. No amount of paperwork is going to stop this group. Whatever DHS and friends produce will be widely distributed in the hopes of encouraging its adoption. This means rogue developers can code around the checks performed by DHS diagnostic software. Even if given the entire source code to a project, skilled rogue developers can obfuscate their additions.
In my opinion the government spends way too much time on the vulnerability aspect of the risk equation. Remember risk = threat x vulnerability x asset value (impact, cost, etc.). Instead of devoting so much effort to vulnerabilities, I think the government should divert resources to deterring and prosecuting threats.
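The argument above can be sketched numerically. This is a minimal illustration of the multiplicative risk equation, assuming made-up 0-to-1 scales for threat and vulnerability and an arbitrary asset value; none of these numbers come from the talk.

```python
# Sketch of the risk equation: risk = threat x vulnerability x asset value.
# The scales and example figures are illustrative assumptions only.

def risk_score(threat: float, vulnerability: float, asset_value: float) -> float:
    """Multiply the three factors; driving any one toward zero drives risk toward zero."""
    return threat * vulnerability * asset_value

# Reducing vulnerability (secure coding, patching) and reducing threat
# (deterrence, prosecution) shrink the product by the same proportion --
# the point here is that the threat factor gets far fewer resources.
baseline = risk_score(threat=0.8, vulnerability=0.6, asset_value=100.0)  # ~48
patched = risk_score(threat=0.8, vulnerability=0.3, asset_value=100.0)   # ~24
deterred = risk_score(threat=0.4, vulnerability=0.6, asset_value=100.0)  # ~24
print(baseline, patched, deterred)
```

Halving either factor halves the product, which is why concentrating exclusively on vulnerabilities leaves an equally effective lever untouched.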
Consider the "defense" of a city from thieves. Do city officials devote huge amounts of resources to shoring up doors, windows, locks, and so forth on citizen homes? That problem is too large, and thieves would find other ways to steal anyway. Instead, police deter crime when possible and catch thieves who do manage to steal property. Of course "proactive" measures to prevent crime are preferred, so the police work with property owners to make improvements to homes and businesses where possible.
I asked Mr Jarzombek a question along these lines. He essentially said the threat problem is too difficult to address, so the government concentrates on vulnerabilities. That's not much of an answer, since his approach has to defend all of the nation's targets. My threat-based approach focuses on deterring and capturing the much smaller groups of real threats.
Mr Jarzombek then said that the government does pursue threats, but he "can't talk about that." Why not? I understand he and others can't reveal operational details, but why not say "Federal, state and local law enforcement are watching carefully and we will have zero tolerance for these kinds of crimes." Someone actually said those words, but not about attacking infrastructure. These words were spoken by Alberto Gonzales, US Attorney General, with respect to Katrina phishers.
This approach would have more effect against domestic intruders, since foreign governments would not be scared by threat of prosecution. However, if foreign groups knew we would pursue them with means other than law enforcement, we might be able to deter some of their activities. At the very least we could devote more resources to intelligence and infiltration, thereby learning about groups attacking infrastructure and preventing damaging attacks.
Since I'm discussing software assurance, I found a few interesting sites hosted by Fortify Software. The Taxonomy of Coding Errors that Affect Security looks very cool. The Fortify Extra is a newsletter, which among other features includes a "Who's Winning in the Press?" count of "good guy" and "bad guy" citations. DHS will launch buildsecurityin.us-cert.gov in October; that site is not yet live. The Center for National Software Studies was also mentioned last night.
Also, the 2nd Annual US OWASP Conference will be held in Gaithersburg, MD 11-12 October 2005.