Risk Assessment, Physics Envy, and False Precision

In my last post I mentioned physics. Longtime blog readers might remember a thread from 2007 which ended with Final Question on FAIR, where I was debating the value of numerical outputs from so-called "risk assessments." Last weekend I attended the 2009 Berkshire Hathaway Shareholder meeting courtesy of Gunnar Peterson. He mentioned two terms used by Berkshire's Charlie Munger that now explain the whole numerical risk assessment approach perfectly:

Physics Envy, resulting in false precision:

In October of 2003 Charlie Munger gave a lecture to the economics students at the University of California at Santa Barbara in which he discussed problems with the way that economics is taught in universities. One of the problems he described was based on what he called "Physics Envy." This, Charlie says, is "the craving for a false precision. The wanting of formula..."

The problem, Charlie goes on, is "that it's not going to happen by and large in economics. It's too complex a system. And the craving for that physics-style precision does nothing but get you in terrible trouble..."

When you combine Physics Envy with Charlie's "man with a hammer syndrome," the result is the tendency for people to overweight things that can be counted.

"This is terrible not only in economics, but practically everywhere else, including business; it's really terrible in business -- and that is you've got a complex system and it spews out a lot of wonderful numbers [that] enable you to measure some factors. But there are other factors that are terribly important. There's no precise numbering you can put to these factors. You know they're important, you don't have the numbers. Well practically everybody just overweighs the stuff that can be numbered, because it yields to the statistical techniques they're taught in places like this, and doesn't mix in the hard-to-measure stuff that may be more important...

As Charlie says, this problem not only applies to the field of economics, but is a huge consideration in security analysis. Here it can give rise to the "man with a spreadsheet syndrome," which is loosely defined as, "Since I have this really neat spreadsheet it must mean something..."

To the man with a spreadsheet this looks like a mathematical (hard science) problem, but the calculation of future cash flows is more art than hard science. It involves a lot of analysis that has nothing to do with numbers. In a great many cases (for me, probably most cases) it involves a lot of guessing. It is my opinion that most cash flow spreadsheets are a waste of time because most companies do not really have a predictable future cash flow.


You could remove every reference to financial analysis, substitute "risk assessment," and the meaning would be exactly the same. What's worse, people who do so-called "risk assessments" are usually not even using real numbers, as they would be in a cash flow analysis!

Physics envy, and the false precision it produces, are two powerful ideas I intend to carry forward.
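To make the false-precision point concrete, here is a minimal sketch of my own (not taken from Munger, FAIR, or any particular methodology) using the textbook Annualized Loss Expectancy formula. Both inputs are guesses, yet the multiplication emits a number that looks authoritative to two decimal places:

```python
# ALE = ARO * SLE: annualized loss expectancy from a textbook risk formula.
# The math is trivially correct; the problem is that both inputs are guesses.

def ale(annual_rate_of_occurrence: float, single_loss_expectancy: float) -> float:
    """Multiply two estimates; the product inherits all of their uncertainty."""
    return annual_rate_of_occurrence * single_loss_expectancy

# Guessed inputs: "maybe 0.3 breaches per year, maybe $1.2M per breach."
estimate = ale(0.3, 1_200_000)
print(f"ALE: ${estimate:,.2f}")  # prints "ALE: $360,000.00" -- false precision

# Vary each guess by a plausible factor of two in each direction and the
# "answer" spans a factor of sixteen:
low, high = ale(0.15, 600_000), ale(0.6, 2_400_000)
print(f"Plausible range: ${low:,.0f} to ${high:,.0f}")
```

The spreadsheet shows $360,000.00; the honest answer is "somewhere between $90,000 and $1.4 million, if the guesses were even in the right ballpark."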


Richard Bejtlich is teaching new classes in Las Vegas in 2009. Regular Las Vegas registration ends 1 July.

Comments

Alex Hutton said…
*yawn*

:)
Anonymous said…
I am so very happy we finally have a succinct phrase for this. I see it in our industry constantly, and have struggled to describe it in under two or three sentences (which is far longer than the attention span of some of the message's recipients). I'll be adopting this term as well. Thanks for the reference, Richard. Very helpful.

- Mike
Clint Laskowski said…
Precision (or lack thereof) is something I've been thinking about for years.

There is just no way to do an accurate information security risk assessment (ISRA). Put another way, there is no way to be confident in the accuracy of an ISRA.

In fact, I've started to use a new phrase. Instead of "Risk Assessment" I prefer "Risk Awareness".

One difference between calculating future cash flows and conducting an ISRA is that one says, "let's look at the 'revenue' we might have", while the other says "let's look at the losses we might have". I suspect it is much easier to get funding to do a future cash flow analysis than it is to do an information risk assessment!

In the end, the problem is, without numbers, without some precision, it is difficult to prioritize security spending. If I have 15 security projects, but a limited budget, how do I divvy up the money? Where do I spend my next security dollar?

If not with risk assessments, how does one manage risks? If one can not measure, how can one manage?

I know Donn Parker has suggested that there are other ways to approach security besides risk management. For example, one could use a policy-based approach, a due-diligence-based approach, etc. (These were described in an issue of the ISSA Journal within the last year, I think).

Thoughts?
Clint Laskowski said…
[this is a follow-up on previous comment]

Since the ISSA article I'm thinking of is only available to ISSA members, here's a link to a blog post about it by PurpleSlog:

http://purpleslog.wordpress.com/2006/05/25/donn-parker-suggests-dropping-the-risk-based-approach-to-information-security/

And, frequent readers here on the fantastic TaoSecurity blog will note that Richard also blogged about this topic at:

http://taosecurity.blogspot.com/2006/06/risk-based-security-is-emperors-new.html

Based on my work, I'm 84.6% sure that information security risk assessments are of less value than most professionals realize.
Alex Hutton said…
Salvo back at ya, Richard!

http://newschoolsecurity.com/2009/05/richard-bejtlichs-quantum-state/

Quoting Charlie:


"You've got to have models in your head. And you've got to array your experience, both vicarious and direct, on this latticework of models. You may have noticed students who just try to remember and pound back what is remembered. Well, they fail in school and in life. You've got to hang experience on a latticework of models in your head.

What are the models? Well, the first rule is that you've got to have multiple models, because if you just have one or two that you're using, the nature of human psychology is such that you'll torture reality so that it fits your models, or at least you'll think it does. You become the equivalent of a chiropractor who, of course, is the great boob in medicine.

It's like the old saying, "To the man with only a hammer, every problem looks like a nail." And of course, that's the way the chiropractor goes about practicing medicine. But that's a perfectly disastrous way to think and a perfectly disastrous way to operate in the world. So you've got to have multiple models.

And the models have to come from multiple disciplines, because all the wisdom of the world is not to be found in one little academic department. That's why poetry professors, by and large, are so unwise in a worldly sense. They don't have enough models in their heads. So you've got to have models across a fair array of disciplines.

You may say, "My God, this is already getting way too tough." But, fortunately, it isn't that tough, because 80 or 90 important models will carry about 90% of the freight in making you a worldly-wise person. And, of those, only a mere handful really carry very heavy freight.

So let's briefly review what kind of models and techniques constitute this basic knowledge that everybody has to have before they proceed to being really good at a narrow art like stock picking.

First there's mathematics. Obviously, you've got to be able to handle numbers and quantities: basic arithmetic. And the great useful model, after compound interest, is the elementary math of permutations and combinations. And that was taught in my day in the sophomore year in high school. I suppose by now in great private schools, it's probably down to the eighth grade or so.

It's very simple algebra. It was all worked out in the course of about one year between Pascal and Fermat. They worked it out casually in a series of letters.

It's not that hard to learn. What is hard is to get so you use it routinely almost every day of your life. The Fermat/Pascal system is dramatically consonant with the way that the world works. And it's a fundamental truth. So you simply have to have the technique.

Many educational institutions, although not nearly enough, have realized this. At Harvard Business School, the great quantitative thing that bonds the first-year class together is what they call decision tree theory. All they do is take high school algebra and apply it to real-life problems. And the students love it. They're amazed to find that high school algebra works in life....

By and large, as it works out, people can't naturally and automatically do this. If you understand elementary psychology, the reason they can't is really quite simple: The basic neural network of the brain is there through broad genetic and cultural evolution. And it's not Fermat/Pascal. It uses a very crude, shortcut-type of approximation. It's got elements of Fermat/Pascal in it. However, it's not good.

So you have to learn in a very usable way this very elementary math and use it routinely in life, just the way if you want to become a golfer, you can't use the natural swing that broad evolution gave you. You have to learn to have a certain grip and swing in a different way to realize your full potential as a golfer.

If you don't get this elementary, but mildly unnatural, mathematics of elementary probability into your repertoire, then you go through a long life like a one-legged man in an ass-kicking contest. You're giving a huge advantage to everybody else.

One of the advantages of a fellow like Buffett, whom I've worked with all these years, is that he automatically thinks in terms of decision trees and the elementary math of permutations and combinations...."

Have at!
Alex, when I get a straight answer to the question I posted in

http://taosecurity.blogspot.com/2007/09/final-question-on-fair.html

then I'll bother responding to your posts.

"Yes" or "no" would suffice.
Alex Hutton said…
Well, I was trying to be nice about how poorly formed the question was.


The answer is obviously yes, but you're using loaded terms in "opinions", which I object to. It's like your interpretation of what you thought Munger was stating.

Anyhow, I'll buy some drinks once I'm in NoVA and we can talk about (hopefully) how violent our agreement is.
Anonymous said…
The whole article is not about information security, but about 'securities', as in: 'stocks'... Are you not seeing that?
Anonymous, I got one of the only "A" grades in my Economics of Corporate Finance and Financial Markets course in 1995. I am well aware what "securities" means.
Alex Hutton said…
Also, the answer could be "no": prior estimates are not actually subjective, but uncertain.

Depends on perspective and your understanding of statistics. Thus the blog jokes.
Jack said…
http://riskmanagementinsight.com/riskanalysis/?p=629
Unknown said…
OK, I'll bite. :-)

I read Mr Munger's comments, they speak volumes. What I find rather amusing is just how much what he said applies to things so dear to those that like to attack their idea of risk management (which often enough isn't really what risk management is to others).

Replace Mr Munger's "man with a spreadsheet" with "a man with logs from his network monitoring device" and you get the same result. Both are prone to black swans, both will swear black and blue just prior to the unexpected event that their model is "just perfect, thank you very much," and both will try to put down any opposing view as "wishful thinking" and "imprecise".

Be open to critical thinking; don't just criticise others' thinking.
The difference between a "man with a spreadsheet" and a man monitoring his network is that the man with the spreadsheet thinks he knows what is happening, but the man with monitoring knows what is happening.
Jon Robinson said…
But the man with the monitoring logs does not know what will happen in the future. He therefore must guess (yes, estimate) using past data and a model to interpret it. I'm sure you make such estimations when deciding how much insurance to buy, how much money to invest and where, etc. Why do you categorically reject risk assessment? It seems to be woven into how humans deal with reality. FAIR is an attempt to make these types of decisions more rational and accurate.
"Sigh." Please take a look at the posts about FAIR and risk assessment I've already written. I don't feel like summarizing several years of blogging.
