Digital Security Is Not Just an Engineering Problem

Recently I participated in a small meeting involving a cross-section of people interested in digital security and public policy. During the meeting one of the participants voiced the often-repeated but, in my opinion, misguided notion that the primary problem with digital security is "design." In other words, "the Internet was not designed to be secure." If the Internet was not designed to be secure, the argument goes, all applications are "built on a foundation of sand" and therefore can never be "secure."

This is a typical "engineering" mentality applied to digital security. I do not agree with it. You might think it's because I'm not a "professional engineer." Strangely enough, at USAFA I took classes in chemistry, physics (two courses), math (calc III and diff eq), thermodynamics, and five pure engineering courses (electrical, mechanical, civil, aeronautical, astronautical) plus the dreaded Academy "capstone" course -- all of which would qualify me for a minor in engineering at a "normal" college. Still, I do not think digital security is an engineering problem.

My opinion does not mean that engineering has no role. On the contrary, good engineering helps reduce vulnerabilities and exposures. Unfortunately, that focus only affects part of the risk equation. Focusing only on engineering completely ignores the threat, which in my judgement is the biggest problem with digital security today.

You know what prompted me to write this post? It was Security Engineering Is Not The Solution to Targeted Attacks by Charles Smutz, a professional software developer who creates custom security tools for a large defense contractor. Charles wrote:

[B]laming security engineering for the impact of targeted attacks is [a] herring as red as they come. A world where security engineering actually tried to solve highly targeted and determined attackers would not be a fun place in which to live. In absence of other solutions, an intelligence driven incident response model is your best bet.

You know I agree with that.

Charles wrote his post to refute Security engineering: broken promises by Michal Zalewski. Michal is a really smart security researcher but I agree with Charles that Michal has also fallen for the "security as design problem" mentality.

If you want to know what I think works, please consult my 2007 post Threat Deterrence, Mitigation, and Elimination.


H. Carvey said…

Great post. It's things like this that bear repeating.

Unfortunately, repeating it doesn't, by itself, result in that intelligence-driven incident response model.
HypedUpCat said…
I concur with your idea that all security cannot (and cannot feasibly) be built into a version 1 or original engineering foundation. The idea that we only had one chance to make the Internet secure is ludicrous.
Security can be added afterwards or in the next version, as we learn and innovate. Consider how engineers came up with a car safety add-on called the seat belt, or with IPv6 (which, though not perfect, has additional security capabilities).
We learn, we adapt (both for good and bad), and eventually we improve.
Kris said…
Doesn't it always just come down to a conjunction of time, opportunity & motivation?
good old:
Anonymous said…

Focusing only on engineering completely ignores the threat, which in my judgement is the biggest problem with digital security today.

If infosec practitioners like me are focused on vulnerabilities (whether in products, people, or processes), it's because we can do very little about the threat. I have zero diplomatic, military, legal, or financial influence over potential attackers. Unlike the real world, where I'm allowed to use lethal force to defend myself, I can't even hack back. So I ask in all sincerity, what can I do other than reduce my vulnerability to security threats through the careful engineering, operation, and monitoring of my information systems?
emily said…
One of the issues is that most organizations, those that employ your typical infosec department model, have your policy side and your engineering side. There's no section in there for intelligence and threat coordination. The policy folks will look to the engineers and say, "why can't your tools detect/prevent/remediate the problem" -- and the engineering folks will retort, "why are your policies so weak and not enforced". So long as that back-and-forth goes on without some matrixing and adapting to the environment, nothing will advance.

I can easily say that, if we had the time and the manpower, I'd have started an intelligence analysis group... outside the console watching IR/IH and forensics and the standard engineering. I would have fed that info back into those groups, but also used it to inform the policy makers and make sure that those adapted policies were addressed and enforced. However, corporate, and now I can say it, civilian USG security hasn't advanced to that stage... it's going to require an evolution beyond regulations (FISMA, SOX, GLBA), and beyond the current tools (SIEM, AV, DLP, etc.). It just needs the right primeval goop to get started.

I agree with your statement, at least for now. It's important to recognize our limitations but not accept them. I discuss options for "active defense" whenever I can these days. I even talked to a group of engineers in my company last month, and they wanted to know why we weren't fighting back!
Dave Funk said…
To Anonymous,

I'm more of a glass-is-half-full guy. If you are monitoring, you are regularly catching bad guys trying to mess with your network. If you are in the government, US-CERT (or CIRT or whatever) is providing you data on bad guys. Whether you are in the government or not, Symantec and other service providers are getting that data and are including it in their managed monitoring services. All of this is intelligence-driven data that can or should be part of your network protection. It is not insignificant data. We catch more malware from our own and Symantec black lists than from our anti-malware programs. We'd all like to fight back, and a couple of people have, or are.
As for the National Security Strategy: the glass ain't half-full. If it is an 8 oz. glass, it is missing about 3.9 oz. to get to the halfway mark. More and more in the government are figuring this out. Unfortunately they are getting no help from the administration (this or last), OMB, Congress, or NIST. Just remember, there is too frequently an inverse relationship between FISMA score and computer security readiness.
