Extending Security Event Correlation

Last year at this time I wrote a series of posts on security event correlation. I offered the following definition in the final post:

Security event correlation is the process of applying criteria to data inputs, generally of a conditional ("if-then") nature, in order to generate actionable data outputs.
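The definition above can be sketched in code. This is a minimal illustration, not any vendor's implementation; the event fields, rule (failed logins followed by a success), and threshold are all assumptions chosen to show the conditional "if-then" shape.

```python
# Minimal sketch of "if-then" security event correlation: if a source
# IP records several failed logins and then a success, generate an
# actionable alert. Event fields and threshold are invented.
from collections import defaultdict

def correlate(events, threshold=3):
    failures = defaultdict(int)  # running failed-login count per source IP
    alerts = []
    for event in events:
        src = event["src_ip"]
        if event["type"] == "login_failure":
            failures[src] += 1
        elif event["type"] == "login_success":
            # The "if-then" criterion: a success following repeated failures
            if failures[src] >= threshold:
                alerts.append(f"possible brute force from {src}")
            failures[src] = 0
    return alerts

sample = [{"type": "login_failure", "src_ip": "10.0.0.5"}] * 3 + \
         [{"type": "login_success", "src_ip": "10.0.0.5"}]
print(correlate(sample))  # ['possible brute force from 10.0.0.5']
```

The point is that the criterion is conditional and the output is actionable, which is what separates correlation from mere collection or counting.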

Since then, I have found that products and people still claim this as a goal, but for the most part achieving it remains elusive.

Please also see that last post for what SEC is not, i.e., SEC is not simply collection (of data sources), normalization (of data sources), prioritization (of events), suppression (via thresholding), accumulation (via simple incrementing counters), centralization (of policies), summarization (via reports), administration (of software), or delegation (of tasks).

So is SEC anything else? Based on some operational uses I have seen, I think I can safely introduce an extension to "true" SEC: applying information from one or more data sources to develop context for another data source. What does that mean?

One example I saw recently (not particularly new, but definitely useful) involves NetWitness 9.0. Its new NetWitness Identity function adds user names collected from Active Directory to the metadata available while investigating network traffic. Analysts can choose to review sessions based on user names rather than just source IP addresses.

This is certainly not an "if-then" proposition, as sold by SIM vendors, but the value of this approach is clear. I hope my use of the word "context" doesn't add too much historical security baggage to this conversation. I'm not talking about making IDS alerts more useful by knowing the qualities of the target of a server-side attack, for example. Rather, to take the server-side attack scenario, imagine replacing the source IP with the country "Bulgaria" and the target IP with "Web server hosting Application X" or similar. It's a different way for an analyst to think about an investigation.
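The enrichment idea can be sketched as follows. This is not how NetWitness works internally; the lookup tables (user names keyed by IP as a directory export might provide, and GeoIP-style labels) and all the addresses are invented for illustration.

```python
# Sketch of developing context for one data source (network sessions)
# from others (a directory's IP-to-user map, a GeoIP-style label table).
# All data below is invented for illustration.
ip_to_user = {"192.0.2.10": "jsmith"}  # e.g., harvested from Active Directory
ip_to_label = {
    "203.0.113.7": "Bulgaria",                        # e.g., GeoIP lookup
    "192.0.2.80": "Web server hosting Application X",  # e.g., asset inventory
}

def enrich(session):
    """Replace bare IPs with analyst-friendly context where available."""
    src = session["src_ip"]
    return {
        "src": ip_to_user.get(src, ip_to_label.get(src, src)),
        "dst": ip_to_label.get(session["dst_ip"], session["dst_ip"]),
    }

print(enrich({"src_ip": "203.0.113.7", "dst_ip": "192.0.2.80"}))
# {'src': 'Bulgaria', 'dst': 'Web server hosting Application X'}
```

Unknown addresses simply pass through unchanged, so the analyst sees context where it exists and raw IPs where it does not.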


compit said…
Hi Guys,

Check this link http://economictimes.indiatimes.com/features/the-sunday-et/backpage/Hackings-ethical-side/articleshow/5231471.cms

This is one of the biggest hoaxes I have ever come across. Ankit coining the term "ethical hacker" :) LOL. I was under the impression that the term "ethical hacker" was coined by IBM many years ago, before Ankit Fadia started writing technical material. I am not sure why he is so famous. Most of the stuff he writes is freely available in the wild.
Metajunkie said…
Regarding Security Event Correlation...

I agree with your addition in this post, Richard. I agree with your definition overall.

I believe that the list of what SEC is NOT might be misleading to the person who doesn't read every word. The important phrase is "not simply", for all of those things can play into effective SEC.

In my opinion, the point at which those things in the NOT category become useful is the point when they cross the threshold between data and knowledge.

Sometimes correlating between known good events and suspicious events can be useful too. For example, having a pile of data-flow documents is just a pile of data. Taking that information and using it as a background against which to hold real-time IDS alerts has produced huge wins for me.

I'm a student of Sun Tzu, and I feel that this sort of work falls under the "know yourself" portion of his teachings.

Too many IT departments do not know themselves. This is why we are losing battles.

Ken Walling, CISSP, GREM

aka Metajunkie
Josh said…
"I think I can safely introduce an extension to "true" SEC: applying information from one or more data sources to develop context for another data source."

I don't get it. So you say tcpdump -n turns off tcpdump's built in SEC functionality? ;-)

Hi Richard, talking about event correlation, did you have a look at ossim.net? The project has a lot of functionality and it is very flexible. Maybe it's a bit hard for newbies, but now, with the CD installer, it's very easy to do a small deployment and test it.
Anonymous said…
One of the greatest challenges for an analyst in successful security event correlation is establishing the context of the nodes and infrastructure involved in the observed communication. If you want to make an educated evaluation of the dynamic risk these events pose to the organization, I propose you apply the formula you have used for years:

Risk = Threat x Vulnerability x Asset

In this regard, anything that helps establish context and value of the asset to the organization will help with a more informed and helpful response tactic.
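The commenter's formula can be sketched numerically. The 1-5 scales and the sample values below are assumptions for illustration only; real programs would calibrate these against their own threat and asset data.

```python
# Sketch of the commenter's dynamic-risk formula:
#   Risk = Threat x Vulnerability x Asset
# Scores use an arbitrary 1-5 scale; values are invented.
def risk(threat, vulnerability, asset_value):
    return threat * vulnerability * asset_value

# A high-value web server with a known hole facing an active threat
# scores far higher than a patched, low-value host.
print(risk(threat=5, vulnerability=4, asset_value=5))  # 100
print(risk(threat=5, vulnerability=1, asset_value=2))  # 10
```

The multiplicative form captures the commenter's point: without context on asset value, two otherwise identical events cannot be ranked against each other.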

In a more practical example: while working for a large enterprise, using a SIM product combined with a centralized log repository, we discovered UDP packets on random destination ports leaving our zones that represented regular work desktop/laptop client IP blocks, headed for different geographical zones, mostly Eastern Europe and former Soviet Bloc countries. This was one of the earliest detections of the new Storm worm that we were aware of, before it really became public knowledge.
