Friday, June 02, 2006

Risk-Based Security is the Emperor's New Clothes

Donn Parker published an excellent article in the latest issue of The ISSA Journal titled Making the Case for Replacing Risk-Based Security. This article carried a curious disclaimer I had not seen in other articles:

This article contains the opinions of the author, which are not necessarily the opinions of the ISSA or the ISSA Journal.

I knew immediately I needed to read this article. It starts with a wonderful observation:

What are we doing wrong? Is the lack of support for adequate security linked to our risk-based approach to security? Why can't we make a successful case to management to increase the support for information security to meet the needs? Part of the answer is that management deals with risk every day, and it is too easy for them to accept security risk rather than reducing it by increasing security that is inconvenient and interferes with business.

I would argue that management decides to "accept security risk" because they cannot envisage the consequences of security incidents. I've written about this before.

However, Donn Parker's core argument is the following:

CISOs have tried to justify spending resources on security by claiming that they can manage and reduce security risks by assessing, reporting, and controlling them. They try to measure the benefits of information security "scientifically" based on risk reduction. This doesn't work... I propose that intangible risk management and risk-based security must be replaced with practical, doable security management with the new objectives of due diligence, compliance consistency, and enablement.

I agree. Here is a perfect example of the problem:

One CISO told me [Parker] that he performs risk assessment backwards. He says that he already knows what he needs to do for the next five years to develop adequate security. So he creates some risk numbers that support his contention. Then he works backwards to create types of loss incidents, frequencies, and impacts that produce those numbers. He then refines the input and output to make it all seem plausible. I suggested that his efforts are unethical since his input data and calculations are all fake. He was offended and said that I didn't understand. The numbers are understood by top management to be a convenient way to express the CISO's expert opinion of security needs.

This is my question: what makes these shenanigans possible? Remember the risk equation (Risk = Threat X Vulnerability X Asset Value) and consider these assertions:

  • Hardly anyone can assess threats.

  • Few can identify vulnerabilities comprehensively.

  • Some can measure asset value.


As a result, there is an incredible amount of "play" in the variables of the risk equation. Therefore, you can make the results anything you want -- just as the example CISO shows.
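To make that "play" concrete, here is a toy sketch of my own (not from Parker's article): if each input to the risk equation is only known to within roughly an order of magnitude, the product spans such a wide range that almost any target figure can be defended. Every number below is invented for illustration.

```python
# Illustrative sketch: with each input only loosely known, the risk product
# spans a huge range, so any desired figure can be "justified" by picking
# values inside plausible-sounding bounds.

def risk(threat, vulnerability, asset_value):
    """Risk = Threat x Vulnerability x Asset Value."""
    return threat * vulnerability * asset_value

# Hypothetical "plausible" ranges for one system: threat and vulnerability
# as annual likelihoods, asset value in dollars. All numbers are made up.
low = risk(threat=0.1, vulnerability=0.1, asset_value=100_000)
high = risk(threat=0.9, vulnerability=0.9, asset_value=1_000_000)

print(f"Defensible 'risk' estimates span ${low:,.0f} to ${high:,.0f}")
print(f"That is a factor of {high / low:.0f}x of play in the result")
```

With inputs that few could call unreasonable, the "scientific" answer ranges over nearly three orders of magnitude, which is exactly the room the example CISO exploits.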

It is tough enough to assign values to threats and vulnerabilities even if time froze. In the real world, threats constantly evolve and grow in number, while new vulnerabilities appear daily in both old and new software and assets. A network that appeared to carry a low risk of compromise on Monday could be ripe for disaster on Tuesday, when a major new vulnerability is found in a core application.

Parker's alternative includes the following:

Due diligence: We can show management the results of our threat and vulnerability analysis (using examples and scenarios) by giving examples of the existence of the vulnerabilities and solutions that others have employed (not including estimated intangible probabilities and impacts). Then we can show them easily researched benchmark comparisons of the state of their security relative to other well-run enterprises and especially their competitors under similar circumstances. We then show them what would have to be done to adopt good practices and safeguards to assure that they are within the range of the other enterprises.

Bottom line: be as good as the next guy.

Compliance: We are finding that the growing body of security compliance legislation such as SOX, GLBA, and HIPAA and the associated personal and corporate liability of managers is rapidly becoming a strong and dominant security motivation...(The current legislation is poorly written and has a sledgehammer effect as written by unknowing legislative assistants but will probably improve with experience, as has computer crime legislation.)

Bottom line: compliance has turned out to be the major incentive I've seen for security initiatives. I am getting incident response consulting work because clients do not want to go to jail for failing to disclose breaches.

Enablement: It is easily shown in products and services planning that security is required for obvious and competitive purposes and from case studies, such as the Microsoft experience of being forced by market and government pressures to build security into their products after the fact.

Bottom line: this is the weakest argument of the three, and maybe why it is last. Microsoft may be feeling the heat, but it took five years and the situation is still rough. Oracle is now under fire, but how long will it take for them to take security seriously? And so on.

I think Donn Parker is making the right point here. He is saying the Emperor has no clothes, and the legions of security firms providing "risk assessments" are not happy. Of course they're not -- they can deliver a product that has no bearing on reality and receive money for it! That's consequence-free consulting. Try doing that in an incident response scenario, where failure to do your job means the intruder remains embedded in a client's infrastructure.

I agree that as security professionals we are trying to reduce risk, but trying to measure it is a waste of time. I am sad to think organizations spend hundreds of thousands of dollars on pricey risk assessments and hardly any money on real inspection of network traffic for signs of intrusions. The sorts of measurements I recommend are performance-based, as I learned in the military. We determine how good we are by drilling and exercising capabilities, preferably against a simulated enemy. We don't write formulas guesstimating our defense posture.

This is not the last I have to say on this issue, but I hope to be boarding a flight soon. I commend The ISSA Journal for publishing an article that undermines a pillar of their approach to security. I bet (ISC)2 will also love Donn's approach. :)

15 comments:

Dhar said...

While your blog post does a good job of summarizing and analyzing Donn Parker's article, I was wondering if the original article is available to all users?

Warm Regards,
Sumit Dhar

Richard Bejtlich said...

Apparently not. Try asking Donn?

Dv8or025 said...

Sumit Dhar :

(1) Get yourself a free 3-month ISSA Trial Membership at http://www.issa.org/join.html, as "Trial Members are entitled to all the benefits offered by the International Association, including receiving copies of the ISSA Magazine, access to the ISSA International web site, and discounts to...."

(2) Go to their homepage http://www.issa.org/, click on the link to download the article, provide the credentials you will have received after some time and - BINGO! - you've got the article in full! :-))

BTW: besides just this article, ISSA has a lot more to offer; personally I'm a big fan of the security events organised by their local chapters: interesting topics and excellent networking opportunities with fellow IT security professionals (when attending the ISSA Northern Virginia meetings you might even bump into the one and only Richard Bejtlich in person!! ;-)

Kind Regards
Dv8or025

Anonymous said...

Saying that efforts to measure risk are useless is nothing but drivel, a meaningless whining about the degree of difficulty. Accurate and meaningful measurement occurs in the later stages of technological innovation. The first many years of bronze making, for example, went without accurate temperature measurements. After the science advanced, we developed much improved methods and materials.

For meaningful measures to occur, we must advance the science. By giving up, we are sentencing ourselves to more years in the dark ages. It's not the way to go.

Conde Marshall said...

Donn Parker's argument falls apart when he advocates anchoring security with "due diligence," which you accurately characterize as "be as good as the next guy". Being as good as the next guy only means everyone is equally vulnerable. Given different asset values and different abilities to shoulder damage to reputation and other adverse outcomes, equality of vulnerability only works for the least common denominator.

Also, the compliance argument falls apart when you look at the regulations. For example, GLBA requires a risk assessment, and the regulators are adamant in not setting standards.

A much better approach is industry standards, developed by each industry. Although we have a few standards today, we don't have them for all industries. We should develop them.

Richard Bejtlich said...

Drivel-guy,

Your analogy fails. Temperature is an absolute that has been the same during the entire time it could be measured. The chemistry for making bronze has also never changed.

Hardly anything stays the same in the digital security business. The assets, threats, and vulnerabilities ten years ago were a much smaller problem compared to today. With the variables changing so quickly, risk calculations are meaningless.

JimmytheGeek said...

"If you can't measure it, you can't manage it."

I believe the originator of that idea was indicted. If not, he should have been!

One problem with management by measurement is that you focus on what is easy to measure. Your scenario of timing a pen test with known skills avoids that. But that doesn't tell you what a group with unknown skills can do. It gives you a better basis for what I still feel is a subjective assessment of your security posture. "It was {easy|hard|impossible} for these guys to get to {target} in {time frame}." That would probably be valuable.

You put the case against numeric assessment much more clearly than I've ever been able to. I've fallen back on, "See, at this point you have to pull a number out of {the air|your ass} and beyond that point you have nothing useful. So you might as well skip to the part where you make up numbers and save time."

At times it might be useful to use subjective rankings - an organization is rational to prefer one system be protected more than another if it values those systems differently. Another way to put it is "h4x0r these in this order." But that often leads to another bugaboo of mine, performing arithmetical operations on rankings. I am currently preparing for a security policy audit, and I have been required to estimate the likelihood of an attack leading to compromise. As an indicator of the fuzziness of this type of thinking, no scale is given, just scores. 10% likelihood. Over a period of time? Ever? Inside/outside? If we pick a low number, do we get to not secure it? More importantly, HOW THE HELL SHOULD I KNOW? I'll let you know once I get all the unplanned outages on the calendar.

I can answer questions like, "Do we have all vendor patches applied?" and "Do we allow connections from arbitrary hosts?" I cannot answer, "Is there a vector of attack you haven't thought of?"

Chris Walsh said...

Rich:

It is funny that you should refer to the absoluteness of temperature in support of your position. As you know, the SI unit for temperature measurement is the kelvin, named after Lord Kelvin, whose famous dictum is "When you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind".

Accepting for the moment the claim that our field changes so rapidly that risk calculations are impossible because we cannot gather incidence data, I would simply say that this also implies we cannot merely do what experience suggests is reasonable, for that very experience is based on the past -- a past which, per the argument, is not relevant.

Perhaps the thing to say is that the past provides the only data we have, and we should do with it what we can. Maybe I am a sucker for simple models, but I think a good deal can be done.

ashish arora said...

The position (avoid trying to measure and manage information security risk, and instead use "best practice") seems odd. It appears to be a case of "making good the enemy of best".

I concede that the numbers are difficult to estimate, but there is some transparency in saying (like the CISO who made up the numbers) that I think there is an x% chance of losing more than $y million due to a confidentiality breach.

The alternative is "Trust me. I am a professional and I know what is good and right." This scenario sometimes ends badly, with insufficient investment in security (at least by some reckoning).

At the end of the day, security risk is only one of the risks of doing business, and may not be the most important risk for the organization (most firms that exit the market do not do so because of a security incident).

Anonymous said...
This comment has been removed by a blog administrator.
SteveA said...

A most delightful chain of thoughts. Glad Mike Rothman pointed me in this direction. Having gone from anti-risk-assessment bigot to HIPAA-required RA driver, I come back to Donn Parker's sage advice. Threats and vulnerabilities change hourly, and assets are never valued realistically. How do I value the brand/reputation of my company? I can't be as good as any next guy, but I can be as good as guys who look a lot like me - and a good CISO has a network of them.

I don't know who writes these regulations, but obviously none of them have ever tried to do a risk assessment of a moderately complex organization.

We've adopted a simple qualitative model for now. We played with RiskWatch a few years ago when we were naive CISSP newbies (ALE and all that crap)...

I will have to consider a book of my own when I hang up my CISO shingle - this subject is so charged with opinions.

dghnfgj said...
This comment has been removed by a blog administrator.
Anonymous said...

For the record, proposing point-counterpoint to Donn's article based on an analysis performed by a third party on an article not available to the general public is poor form at best and wholly incorrect at worst.

I recommend a more thorough read of his work, not just this article, and direct analysis of any further work to be debated.

how to bowling said...

I was much struck by the risk equation (Risk = Threat x Vulnerability x Asset Value) - it is a different way of seeing things. Excellent observation.

John said...

The article is now freely available at

https://www.issa.org/Library/Journals/2006/May/Parker%20-%20Replacing%20Risk-Based%20Security.pdf