I'm using T-Mobile at the San Francisco airport as I write this, on my way home from the RSA Conference 2006. Here are my thoughts on my first RSA conference: Holy vendors, Batman. This seemed to be a show by vendors, for vendors. In some ways the presentations were afterthoughts, or just another way for some vendors to describe their products or upcoming technologies. I plan to report on one or two cool products I encountered on the exposition floor, but for now I'll quickly mention the talks I saw.
I began Tuesday by attending a briefing advertised as a discussion of wireless intrusion detection. Instead of learning something new, I heard an IBM employee describe wireless as if the audience had never heard of it. Buddy, it's 2006, for Pete's sake. That was a wasted hour.
Next I listened to Chris Wysopal discuss static binary analysis to discover security vulnerabilities. In contrast to another ex-@Stake/ex-L0pht member (mentioned later), Chris was coherent, informative, and worth seeing. He mentioned that compilers sometimes introduce vulnerabilities the coder never intended. This is called What You See Is Not What You eXecute, or WYSINWYX. For example, an older version of a Microsoft compiler decided it was unnecessary to clear memory before freeing it, even though the coder had explicitly instructed it to do so. The compiler instead produced an executable where passwords or other sensitive information could still be found in memory.
Chris mentioned the Software Assurance Metrics and Tool Evaluation project, which I intend to visit. He also discussed why he would like to see an EnergyStar-like rating for software. The rating might say, "Of the financial applications subjected to binary security analysis, the best score was 112, the worst was 24, and this application rates 86. This program's estimated incident response and patching cost is $1600 per server per year when customer-facing, and $400 per server per year when kept in-house." He concluded the talk by describing how defenders are being destroyed by adversaries who get inside their OODA loops.
After Chris I saw Dan Geer speak. That was certainly a valuable hour. He postulated that "data value and data mobility are conjoined," and that "it's not security if it's not cost-effective." Dr. Geer discussed relationships between predators and prey, and how they evolve together. He focused on data "as the point and focus of security," where the "perimeter must contract down to data." He believes data is at risk when it changes state, when it goes from being at rest (in storage) to being in motion (in use). Dr. Geer believes data must be protected at that point of transition.
I was very pleased to hear and see these thoughts: "Monitoring is the first priority. You cannot manage what you cannot measure. The unknown unknowns will kill you. Rumsfeld was right." Attacks which do not reveal themselves require preemption. Preemption requires intelligence. Intelligence requires surveillance. But what should you observe, people or data? Dr. Geer prefers observing data.
To perform that observation, he invoked the idea of a reference monitor (citing Anderson, circa 1972) that watches all data access and can intervene when necessary. It acts by analyzing "traffic" (ostensibly data manipulation, not packets) and does not use content inspection to make decisions. Dr. Geer concluded by saying that trusted computing can be implemented in software or hardware. Software implementation favors innovation with a default permit stance, while hardware favors safety and a default deny stance. I obviously cannot do either of these talks justice, but if you'd like to hear more, these talks should be sold through RSA in audio format.