Showing posts with the label controls

Continuous Diagnostic Monitoring Does Not Detect Hackers

There is a dangerous misconception coloring the digital security debate in the Federal government. During the last week, in the wake of the breach at the Office of Personnel Management (OPM), I have been discussing countermeasures with many parties. Concerned officials, staffers, and media have asked me about the Einstein and Continuous Diagnostic Monitoring (CDM) programs. It has become abundantly clear to me that there is a fundamental misunderstanding about the nature of CDM. This post seeks to remedy that problem. The story Federal cyber protection knocked as outdated, behind schedule by Cory Bennett unfortunately encapsulates the misunderstanding about Einstein and CDM: The main system used by the federal government to protect sensitive data from hacks has been plagued by delays and criticism that it is already outdated — months before it is even fully implemented. The Einstein system is intended to repel cyberattacks like the one revealed last week ...

Why DIARMF, "Continuous Monitoring," and other FISMA-isms Fail

I've posted about twenty FISMA stories over the years on this blog, but I haven't said anything for the last year and a half. After reading Goodbye DIACAP, Hello DIARMF by Len Marzigliano, however, I thought it time to reiterate why the newly "improved" FISMA is still a colossal failure. First, a disclaimer: it's easy to be a cynic and a curmudgeon when the government and security are involved. However, I think it is important for me to discuss this subject because it represents an incredible divergence between security people. On one side of the divide we have "input-centric," "control-compliant," "we-can-prevent-the-threat" folks, and on the other side we have "output-centric," "field-assessed," "prevention eventually fails" folks. FISMA fans are the former and I am the latter. So what's the problem with FISMA? In his article Len expertly discusses the new DoD Information Assurance Risk...

Let a Hundred Flowers Blossom

I know many of us work in large, diverse organizations. The larger or more complex the organization, the more difficult it is to enforce uniform security countermeasures. The larger the population to be "secure," the more likely exceptions will bloom. Any standard tends to devolve to the least common denominator. There are some exceptions, such as FDCC, but I do not know how widespread that standard configuration is inside the government. Beyond the difficulty of applying a uniform, worthwhile standard, we run into the diversity vs monoculture argument from 2005. I tend to side with the diversity point of view, because diversity tends to increase the cost borne by an intruder. In other words, it's cheaper to develop exploitation methods for a target who 1) has broadly similar, if not identical, systems and 2) publishes that standard so the intruder can test attacks prior to "game day." At the end of the day, the focus on uniform standards is a manifest...

Control "Monitoring" is Not Threat Monitoring

As I write this post I'm reminded of General Hayden's advice: "Cyber" is difficult to understand, so be charitable with those who don't understand it, as well as those who claim "expertise." It's important to remember that plenty of people are trying to act in a positive manner to defend important assets, so in that spirit I offer the following commentary. Thanks to John Bambenek's SANS post I read NIST Drafts Cybersecurity Guidance by InformationWeek's J. Nicholas Hoover. The article discusses the latest draft of SP 800-37 Rev. 1: DRAFT Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach. I suspected this to be problematic given NIST's historical bias towards "controls," which I've criticized in Controls Are Not the Solution to Our Problem and Consensus Audit Guidelines Are Still Controls. The subtext for the article was: The National Institute for Standar...

Controls Are Not the Solution to Our Problem

If you recognize the inspiration for this post title and graphic, you'll understand my ultimate goal. If not, let me start by saying this post is an expansion of ideas presented in a previous post with the succinct and catchy title Control-Compliant vs Field-Assessed Security. In brief, too many organizations, regulators, and government agencies waste precious time and resources devising and auditing "controls," regardless of the effect these controls have or do not have on security. They are far too input-centric; they should become more output-aware. They obsess over recording conditions they believe may be helpful while remaining ignorant of the "score of the game." They practice management by belief and disregard management by fact. Let me provide a few examples from one of the canonical texts used by the control-compliant crowd: NIST Special Publication 800-53: Recommended Security Controls for Federal Information Systems (.pdf). The following is ...

Control-Compliant vs Field-Assessed Security

Last month's ISSA-NoVA meeting featured Dennis Heretick, CISO of the US Department of Justice. Mr. Heretick seemed like a sincere, devoted government employee, so I hope no one interprets the following remarks as a personal attack. Instead, I'd like to comment on the security mindset prevalent in the US government. Mr. Heretick's talk sharpened my thoughts on this matter. Imagine a football (American-style) team that wants to measure their success during a particular season. Team management decides to measure the height and weight of each player. They time how fast the player runs the 40-yard dash. They note the college from which each player graduated. They collect many other statistics as well, then spend time debating which ones best indicate how successful the football team is. Should the center weigh over 300 pounds? Should the wide receivers have a shoe size of 11 or greater? Should players from the north-west be on the starting line-up? All of this seem...