InfoWorld published two articles of interest to the intrusion detection community this week. Network Detectives Sniff for Snoops is a review of "Internet Security Systems (ISS) Proventia G200, Lancope StealthWatch 4.0, Snort 2.10, and StillSecure Border Guard 4.3." I think they meant "Snort 2.1.0." Already you might suspect I have problems with this first article, which was done at the Naval Postgraduate School in Monterey, CA.
My major concern with product reviews of this sort is their focus on alert-centric intrusion detection at the expense of other forms of network security monitoring data. Session, full content, and statistical data are completely ignored. Most reviews judge products primarily on their capability to identify "attacks," which in this case included "both live Internet traffic and a variety of attacks we launched from penetration testing tool Core Impact 4.0."
When an attack is launched, the IDS is judged to be "good" if it blinks its red light, and "bad" if the light stays dark. No mention is made of the analyst's capability to perform true incident analysis or the ability to determine how the product made its detection decision. The review stops with attack detection and does not consider the ability to track the intruder beyond the event which triggered an alert, let alone looking for non-alert evidence of prior intruder activity.
All reviewers also include a caveat like this:
"Its [Snort's] main weakness is its dependence on (sometimes poor) signatures. As with all signature-based IDSes, Snort can be defenseless against unknown or 'zero-day' attacks until a signature becomes available."
Unfortunately, dealing with events for which there is no signature or anomaly is the primary problem with all detection products. I believe that administrators should deploy network audit devices that detect failures in protection as well as keep records of all network activity, within the bounds set by legal, technical, and administrative policies. Products which focus on detecting "attacks" are merely documenting possible exploitation events. They are most useful against well-defined events like worm outbreaks but nearly worthless for incident scoping against rogue insiders, stealthy intruders, and unknown exploit techniques.
I was pleased to see some discussion of true enhancements to detection capabilities in the review. Areas like active asset discovery, vulnerability correlation, and passive mapping only make the analyst's job easier. These "contextuals" (as Marty says) give the analyst a better idea of his defensive landscape and should receive more attention in future IDS reviews.
The areas covered by popular IDS reviews which do matter (to a lesser degree) are those that make life easier for the administrator and analyst. These include sensor deployment, rule management, and reporting. Reports on these features (especially installation) remind me of nearly every Linux review published; most people want to know if the installer is an X application or a series of text-based instructions. Because it's hard for most people to judge the power of a new scheduler or the network capture improvements of device polling, reviewers spend their time on other matters.
Sguil users know sensor deployment, rule management, and reporting are lacking in our tool, but we are working on each. Sguil 0.5.2, for example, includes new reporting capabilities courtesy of Steve Halligan.
Editor Steve Fox wrote the second interesting article, The Luck of the Virus, subtitled "There's a reason you need intrusion detection systems." The article ends in a light-hearted discussion of the funny names attached to many open source products:
"Snort creator Martin Roesch — founder of security pioneer Sourcefire and an InfoWorld 2004 Innovator — confirmed our suspicions about the open source crowd in a post-test conversation. ACID, he confided, is on its way out as the preferred Snort GUI, soon to be replaced by SGUIL. And what does that stand for? Snort Graphical User Interface for Losers, of course.
That’s a winner in our book."
While that makes for a cute story, Sguil users know the "L" doesn't stand for "Losers." When I contacted Steve by email, he graciously offered to publish a reply from the Sguil team. Bamm decided to roll with the tone of the article and not publish a correction. I ask all Sguil users and potential users not to take such articles too seriously, and to let our tools speak for themselves.