Further Thoughts on SANS Top 20

It seems my earlier post Comments on SANS Top 20 struck a few nerves, e.g., this one and others.

One comment I'm hearing is that the latest Top 20 isn't "just opinion." Let's blast that idea out of the water. Sorry if my "cranky hat" is on and I sound like Marcus Ranum today, but Marcus would probably agree with me.

First, I had no idea the latest "Top 20" was going to be called the "SANS Top-20 Internet Security Attack Targets" until I saw it posted on the Web. If that isn't a sign that the entire process was arbitrary, I don't know what is. How can anyone decide what to include in a document if the focus of the document isn't determined until the end?

Second, I love this comment:

Worse still, Richard misses the forest completely when he says that “… it’s called an ‘attack targets’ document, since there’s nothing inherently ‘vulnerable’ about …”. It doesn’t really matter if it’s a weakness, action item, vulnerability or attack. If it’s something you should know about, it belongs in there. Like phishing, like webappsec, and so on. Don’t play semantics when people are at risk. That’s the job of cigarette and oil companies.

This shows me the latest Top 20 is just a "bad stuff" document. I can generate my own bad stuff list.

Top 5 Bad Things You Should Worry About

  • Global warming

  • Lung cancer

  • Terrorists

  • Not wearing seat belts

  • Fair skin on sunny days


I'm not just trying to make a point with a silly case; there's a real thought here. How many of these are threats? How many are vulnerabilities? (Is the difference explicit?) How many can you influence? How many are outside your control? How did they end up on this list? Does the ranking make any difference? Can we compare this list in 2006 with a future list in 2007?
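
To see why those questions matter, here is a toy sketch in Python. The classification below is entirely my own, invented for illustration (SANS publishes nothing like it); it tries to label each item on my list as a threat (an agent that can harm you) or a vulnerability (a weakness you could mitigate). Note how quickly the scheme breaks down:

# Toy sketch (my own illustration): classifying "bad stuff" items as
# threats (agents that exploit weaknesses) or vulnerabilities (weaknesses
# an agent could exploit). A flat "bad stuff" list erases this distinction.
from enum import Enum, auto

class Kind(Enum):
    THREAT = auto()         # an agent; you can deter or block it, not patch it
    VULNERABILITY = auto()  # a weakness; you can mitigate or remove it

bad_stuff = {
    "Global warming": None,                        # a condition, neither agent nor weakness
    "Lung cancer": None,                           # an impact, neither agent nor weakness
    "Terrorists": Kind.THREAT,                     # agents with capabilities and intentions
    "Not wearing seat belts": Kind.VULNERABILITY,  # a weakness you control
    "Fair skin on sunny days": Kind.VULNERABILITY, # an exposure you can mitigate
}

for item, kind in bad_stuff.items():
    print(f"{kind.name if kind else 'UNCLASSIFIABLE':16} {item}")

Two of the five items resist classification entirely, which is exactly what happens when threats, vulnerabilities, and impacts are lumped into a single list.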

Consider the last point for a minute. If the SANS Top 20 were metric-based, and consisted of a consistent class of items (say vulnerabilities), it might be possible to compare the lists from year to year. You might be able to delve deeper and learn that a class of vulnerabilities has slipped or disappeared from the list because developers are producing better code, or admins are configuring products better, or perhaps threats are exploiting other vectors.
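
As a minimal sketch of what that could look like, assume hypothetical counts of findings per vulnerability class for each year. Every class name and number below is invented for illustration; a real version would need consistent definitions and actual measurements behind each figure:

# Hypothetical sketch of a metric-based, year-over-year comparison.
# All vulnerability classes and counts are invented for illustration.

def compare(old, new):
    """Report which vulnerability classes rose, fell, appeared, or vanished."""
    for cls in sorted(set(old) | set(new)):
        before, after = old.get(cls), new.get(cls)
        if before is None:
            print(f"NEW     {cls}: {after}")
        elif after is None:
            print(f"DROPPED {cls} (was {before})")
        else:
            trend = "UP" if after > before else "DOWN" if after < before else "FLAT"
            print(f"{trend:7} {cls}: {before} -> {after}")

vulns_2006 = {"buffer overflows": 120, "SQL injection": 95, "weak default configs": 60}
vulns_2007 = {"buffer overflows": 80, "SQL injection": 110, "cross-site scripting": 45}

compare(vulns_2006, vulns_2007)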

With these insights, we could shift effort and resources away from ineffective methods and focus our attention on tools or techniques that work. Instead, we're given a list of 20 categories of "something you should know about." How is that actionable? Is anyone going to make any decisions based on what's in the Top 20? I doubt it.

Third, I'm sure many of you will contact me to say "don't complain, do something better." Well, people already are. If you want to read something valuable, pay attention to the Symantec Internet Security Threat Report. I am hardly a Symantec stooge; I like their approach.

I will point out that OWASP is trying to work in the right direction, but their single category ("Web Applications") is one of 20 items on the SANS list.

I realize everyone is trying to do something for the good of the community, and everyone is a volunteer. My point is that proper focus and rigor would result in a document with far more value.

Comments

Anonymous said…
Don't back down, Richard. This:

"it’s called an ‘attack targets’ document, since there’s nothing inherently ‘vulnerable’ about."

is absolutely correct. Risk is all about context, and it can be just as much of a mistake to consider potential weaknesses without putting them in context as it would be to be "wrong" about that context (one of his reasons not to delineate between "weakness, action item, vulnerability or attack").

To me, that refusal makes no sense. To ignore taxonomy, context, and the relationships between factors in a taxonomy is "bad security," just as doing so in another field is "bad science."
Anonymous said…
The link you are looking for is http://www.ranum.com, not www.mjr.com.
Unknown said…
I could make a list such as this, and it would be about as important:

1. Windows
2. Unix
3. WWW
4. Users
5. Other OS

I mean, this just furthers what has already happened: instead of a top 20 list, they have a top 75 list categorized into 20 sections.

In a way, you could argue that a document like this needs more opinion. Sadly, when you have so many people pushing for their "biggest holes on the Internet" to be on the list, someone somewhere either has to create an objective measure (not likely) or just put their foot down, declare the identity and purpose of the document, and make the decisions.

As it is, too many people want too many things, and so we now have a document that purports to be a top 20, yet encompasses everything they can think of.

But yes, the list has its purposes, and thankfully it can be very powerful with "FBI" and "SANS" in the title, plus the recognition it has earned through the years. I just wish it were more surgical in its 20 items.
Anonymous said…
Nerd Fight!
Anonymous said…
If the SANS Top 20 were metric-based, and consisted of a consistent class of items (say vulnerabilities), it might be possible to compare the lists from year to year.

Come on, Richard, you know that's not what it's about! It's about putting money in someone's pocket, that's all. "Consistent class of items"...that just makes sense! You've really gotta stop doing that!
Anonymous said…
I've thought about this question a lot, in the context of looking at older protocols whose assumed threats were later found to be controversial. When should we worry about a threat?

I've since modelled it as needing to be validated: a threat is validated when it is a clear and present danger. It's clear if we can measure it, present if we can show it exists, and a danger if it hurts us. With that framework, it's a lot easier to separate the wheat from the chaff.
Protocols cannot have threats. Protocols can only have vulnerabilities. Threats exploit vulnerabilities. Threats can present a clear and present danger.
Anonymous said…
Just wondering if we are missing the point here. I don't think the SANS TOP 20 is for the Administrators. They are the ones in the ditches every day pulling the muck out and getting it cleaned up.

The SANS TOP 20 is for high-level overviews, for _non_ technical people to see, such as the people who run the companies. It's also probably for auditors and the like, so they know where to look for issues: not specific ones, but general issues.

At that level (an overview), _any_ list becomes somewhat arbitrary and easily picked apart, especially given that it cannot give too many specifics.

I don't disagree with your observations, and yes, I do agree that Symantec's report is probably better, but I've found a few issues with it as well. Their "metrics" showing that Microsoft fixes bugs within 8 days are just plain BS, and any decent Administrator will know that.

So, Richard, how do you produce a list for the _non_ technical people who run companies, auditors, and the like: one that isn't detailed, yet gives an overview of the industry?

Thanks!
Richard Bejtlich said…
So now I've seen comments (here and elsewhere) that the SANS Top 20 is for security people, that it's not for security people, that it's for administrators, that it's not for administrators, that it's for policymakers, etc...

I don't buy for a minute that the SANS Top 20 is for "nontechnical people." Nontechnical people are not going to read a document that long. They do not understand CVEs. They do not understand most of the terminology in that document.

Here's a link to two books I reviewed that try to speak to nontechnical users.
