Radiation Detection Mirrors Intrusion Detection

Yesterday I heard part of the NPR story Auditors, DHS Disagree on Radiation Detectors. I found two Internet sources, namely DHS fudged test results, watchdog agency says and DHS 'Dry Run' Support Cited, and I looked at Combating Nuclear Smuggling: Additional Actions Needed to Ensure Adequate Testing of Next Generation Radiation Detection Equipment (.pdf), a GAO report.

The report begins by explaining why it was written:

The Department of Homeland Security’s (DHS) Domestic Nuclear Detection Office (DNDO) is responsible for addressing the threat of nuclear smuggling. Radiation detection portal monitors are key elements in our national defenses against such threats. DHS has sponsored testing to develop new monitors, known as advanced spectroscopic portal (ASP) monitors.

In March 2006, GAO recommended that DNDO conduct a cost-benefit analysis to determine whether the new portal monitors were worth the additional cost. In June 2006, DNDO issued its analysis. In October 2006, GAO concluded that DNDO did not provide a sound analytical basis for its decision to purchase and deploy ASP technology and recommended further testing of ASPs. DNDO conducted this ASP testing at the Nevada Test Site (NTS) between February and March 2007.

GAO's statement addresses the test methods DNDO used to demonstrate the performance capabilities of the ASPs and whether the NTS test results should be relied upon to make a full-scale production decision.

GAO recommends that, among other things, the Secretary of Homeland Security delay a full-scale production decision of ASPs until all relevant studies and tests have been completed, and determine in cooperation with U.S. Customs and Border Protection (CBP), the Department of Energy (DOE), and independent reviewers, whether additional testing is needed.
(emphasis added)

Notice that a risk analysis was not done; rather, a cost-benefit analysis was done. This is consistent with the approach I liked in the book Managing Cybersecurity Resources, although in that book the practicalities of assigning credible values to costs and benefits made the exercise fruitless. Here the cost-benefit approach has a better chance of working.
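For concreteness, here is a minimal sketch of that sort of cost-benefit calculation in Python. Every figure is invented for illustration; the point is only the structure of the decision, namely that a detector upgrade is justified when the expected loss it prevents exceeds what it costs.

```python
# Minimal cost-benefit sketch for a detection upgrade.
# All numbers are hypothetical, chosen only to illustrate the structure.

annual_incident_probability = 0.02   # assumed chance of a damaging event per year
loss_per_incident = 50_000_000       # assumed loss in dollars if it occurs
detection_improvement = 0.30         # assumed extra fraction of events caught
annual_cost_of_upgrade = 400_000     # assumed amortized purchase plus operations

expected_annual_loss = annual_incident_probability * loss_per_incident
annual_benefit = expected_annual_loss * detection_improvement

print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
print(f"Annual benefit:       ${annual_benefit:,.0f}")
print(f"Annual cost:          ${annual_cost_of_upgrade:,.0f}")
print("justified" if annual_benefit > annual_cost_of_upgrade else "not justified")
```

The arithmetic is trivial; everything depends on the inputs, which is exactly the difficulty the book describes.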

Next the report summarizes the findings:

Based on our analysis of DNDO’s test plan, the test results, and discussions with experts from four national laboratories, we are concerned that DNDO’s tests were not an objective and rigorous assessment of the ASPs’ capabilities. Our concerns with DNDO’s test methods include the following:

  • DNDO used biased test methods that enhanced the performance of the ASPs. Specifically, DNDO conducted numerous preliminary runs of almost all of the materials, and combinations of materials, that were used in the formal tests and then allowed ASP contractors to collect test data and adjust their systems to identify these materials.

    It is highly unlikely that such favorable circumstances would present themselves under real world conditions.

  • DNDO’s NTS tests were not designed to test the limitations of the ASPs’ detection capabilities -- a critical oversight in DNDO’s original test plan. DNDO did not use a sufficient amount of the type of materials that would mask or hide dangerous sources and that ASPs would likely encounter at ports of entry.

    DOE and national laboratory officials raised these concerns to DNDO in November 2006. However, DNDO officials rejected their suggestion of including additional and more challenging masking materials because, according to DNDO, there would not be sufficient time to obtain them based on the deadline imposed by obtaining Secretarial Certification by June 26, 2007.

    By not collaborating with DOE until late in the test planning process, DNDO missed an important opportunity to procure a broader, more representative set of well-vetted and characterized masking materials.

  • DNDO did not objectively test the performance of handheld detectors because they did not use a critical CBP standard operating procedure that is fundamental to this equipment’s performance in the field.

(emphasis added)

Let's summarize.

  • DNDO helped the vendor tune the detector.

  • DNDO did not test how the detectors could fail.

  • DNDO did not test the detectors' resistance to evasion.

  • DNDO failed to follow an important standard operating procedure.


I found all of this interesting and directly relevant to discussions of detecting security events. Vendor-tuned benchmarks, untested failure modes, and ignored evasion techniques are just as familiar in intrusion detection testing.
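To make that parallel concrete, below is a hedged sketch of a fairer evaluation protocol. The detector, the samples, and the masking transform are hypothetical stand-ins, not any real product or dataset: the vendor may tune only against data it is allowed to see, scoring happens on a held-out set, and evasion variants are included.

```python
# Sketch of an unbiased detector evaluation, mirroring the GAO critique.
# The detector, samples, and evade() transform are all hypothetical.
import random

def make_samples(n):
    """Hypothetical 'malicious' samples with a signal strength in [0, 1]."""
    return [{"signal": random.random(), "masked": False} for _ in range(n)]

def evade(sample):
    """Hypothetical masking: shielding attenuates the observable signal."""
    return {"signal": sample["signal"] * 0.4, "masked": True}

def detected(sample, threshold):
    """Flag anything whose observed signal exceeds the alarm threshold."""
    return sample["signal"] > threshold

def detection_rate(samples, threshold):
    """Fraction of malicious samples the detector flags."""
    return sum(detected(s, threshold) for s in samples) / len(samples)

random.seed(7)
samples = make_samples(1000)

# Rule 1: tune on one half, score on a held-out half the vendor never sees.
# (DNDO instead let contractors adjust the ASPs against the formal test
# materials themselves, which inflates apparent performance.)
random.shuffle(samples)
tuning_set, held_out = samples[:500], samples[500:]
threshold = 0.3  # imagine the vendor chose this by studying only tuning_set

# Rule 2: include evasion variants, since smugglers will mask sources.
print(f"Unseen samples:       {detection_rate(held_out, threshold):.0%} detected")
print(f"Unseen, with masking: "
      f"{detection_rate([evade(s) for s in held_out], threshold):.0%} detected")
```

In this toy setup the masked detection rate collapses, which is exactly the limitation a test designed around challenging masking materials would never reveal. A detector tuned on the formal test materials themselves, as the ASPs were, would look far better on paper than it deserves.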
