
Showing posts with the label engineering

Does This Sound Familiar?

I read the following in the 2009 book Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making by Gary Klein. It reminded me of the myriad ways operational information technology and security processes fail. This is a long excerpt, but it is compelling. == Begin == A commercial airliner isn't supposed to run out of fuel at 41,000 feet. There are too many safeguards, too many redundant systems, too many regulations and checklists. So when that happened to Captain Bob Pearson on July 23, 1983, flying a twin-engine Boeing 767 from Ottawa to Edmonton with 61 passengers, he didn't have any standard flight procedures to fall back on. First the fuel pumps for the left engine quit. Pearson could work around that problem by turning off the pumps, figuring that gravity would feed the engine. The computer showed that he had plenty of fuel for the flight. Then the left engine itself quit. Down to one engine, Pearson made the obvious decision to divert fr...

Whither United States Air Force Academy?

Thomas Ricks' post Does the Air Force Academy have ‘the least educated faculty’ in the country? inspired me to write this post. Mr. Ricks cited a story by Jeff Dyche, a former USAFA professor who described a litany of concerns with the USAFA experience. I graduated from the Air Force Academy in 1994, ranked third in my class of 1024 cadets, and proceeded to complete a master's degree at Harvard in 1996. In my experience, at least in the early 1990s, USAFA faculty were as good as, or better than, Harvard faculty. I considered the nature and volume of my graduate courses to be simple compared to my USAFA classes. When several fellow graduate students broke into tears after learning what the Harvard faculty expected of them, I couldn't believe how much easier the classes were going to be! Rather than address points made by Ricks and Dyche, I prefer to focus on a theme that appears every few years: "why does the nation need service academies?" To pr...

Minneapolis Bridge Lessons for Digital Security

The Minneapolis bridge collapse is a tragedy. I had two thoughts that related to security. If the bridge collapsed due to structural or design flaws, the proper response is to investigate the designers, contractors, inspectors, and maintenance personnel from a safety and negligence perspective. Based on the findings, architectural and construction changes plus new safety operations might be applied in the future. This is a technical and operational response. If the bridge collapsed due to attack, the proper response is to investigate, apprehend, prosecute, and incarcerate the criminals. Redesigning bridges to withstand bomb attack is unlikely. This is a threat reduction and deterrence response. Do you agree with that assessment? If yes, why do you think response 1 (try to improve the "bridge" and similar operations) is the response to every digital security attack (i.e., case 2)? My short answer: everyone blames the victim, not the criminal. The NTSB is on scene in Min...

More Engineering Disasters

I've written several times about engineering disasters here and elsewhere. Watching more man-made failures on The History Channel's "Engineering Disasters," I realized that lessons learned the hard way by safety, mechanical, and structural engineers and operators can be applied to those practicing digital security. In 1983, en route from Virginia to Massachusetts, the World War II-era bulk carrier SS Marine Electric sank in high seas. The almost forty-year-old ship was ferrying 24,000 tons of coal and 34 Merchant Mariners, none of whom had survival suits to resist the February chill of the Atlantic. All but three died. The owner of the ship, Marine Transport Lines (MTL), blamed the crew and one of the survivors, Chief Mate Bob Cusick, for the disaster. Investigations of the wreck and a trial revealed the Marine Electric's coal hatch covers were in disrepair, as reported by Cusick prior to the disaster. Apparently the American Bureau of Shipping (ABS), an ins...

Nisley on Failure Analysis

Since I'm not a professional software developer, the only reason I pay attention to Dr. Dobb's Journal is Ed Nisley. I cited him earlier in Ed Nisley on Professional Engineering and Insights from Dr. Dobb's. The latest issue features Failure Analysis, Ed's look at NASA's documentation on mission failures. Ed writes: [R]eviewing your projects to discover what you do worst can pay off, if only by discouraging dumb stunts. What works for you also works for organizations, although few such reviews make it to the outside world. NASA, however, has done a remarkable job of analyzing its failures in public documents that can help the rest of us improve our techniques. Documenting digital disasters has been a theme of this blog, although my request for readers to share their stories went largely unheeded. This is why I would like to see (and maybe create/lead) a National Digital Security Board. Here are a few excerpts from Ed's article. I'm not going to s...

Insights from Dr. Dobb's

I've been flying a fair amount recently, so that means I've been reading various articles and the like. I want to make note of those I found interesting. The March 2006 issue of Dr. Dobb's Journal featured a cool article on Performance Analysis and Multicore Processors. I found the first section the most helpful, since it differentiates between multithreading and hyperthreading. I remember when the FreeBSD development team was criticized for devoting so many resources to SMP. Now it seems SMP will be everywhere. In the same issue Ed Nisley writes about Crash Handling. I call out this article for this quote: Mechanical and civil engineers must consider how their projects can fail, right at the start of the design process, by calculating the stress applied to each component and the strength required to withstand it. Electrical engineers apply similar calculations to their circuits by considering voltage, current, and thermal ratings. In each case, engineers determine ...

Ed Nisley on Professional Engineering

I get a free subscription to Dr. Dobb's Journal. The March 2006 issue features an article by Ed Nisley titled "Professionalism." Ed is a software developer with a degree in Electrical Engineering. After working at a computer manufacturer for ten years in New York state, he decided to become a "consulting engineer." Following the state's advice, Ed pursued a license to be a Professional Engineer. Now, 20 years after first earning his PE license, Ed has declined to renew it. He says "the existing PE license structure has little relevance and poses considerable trouble for software developers." You have to register with DDJ to read the whole article, but the process is free and the article is worthwhile. Here are a few of Ed's reasons to no longer be a PE: "[T]o maintain my Professional Engineering license, I must travel to inconvenient places, take largely irrelevant courses, and pay a few kilobucks. As nearly as I can tell from the co...

Another Engineering Disaster

Does the following sound like any security project you may have worked on? Executives decide to pursue a project with a timetable that is too aggressive, given the nature of the task. They appoint a manager with no technical or engineering experience to "lead" the project. He is a finance major who can neither create nor understand design documents. (This sounds like the news of MBAs being in vogue, as I reported earlier.) The project is hastily implemented using shoddy techniques and lowest-cost components. No serious testing is done. The only "testing" even tried does not stress the solution in any meaningful way -- it only "checks a box." Shortly after implementation, the solution shows signs of trouble. The project manager literally patches the holes and misdirects attention without addressing the underlying flaws. Catastrophe eventually ensues. What I've just described is the Boston Molasses Flood of 1919, best described by the Boston Society of ...

Engineering Disasters in Information Security Magazine

The December 2005 issue of Information Security magazine features an article I wrote titled History Lessons with the subtitle "Digital security could learn a lot from engineering's great disasters." It is based on this blog entry describing analog engineering disasters like the 1931 Yangtze River dam failure, the 1944 Cleveland LNG tank fire, the 1981 Kansas City Hyatt Regency hotel walkway collapse, and the 1933 Atlanta Marriott parking lot sinkhole. I am considering expanding this topic of digital security disasters to encompass a new book. I would like to take a historical and technical look at digital security failures on a case-by-case basis. Ideally the cases would be based on testimony from witnesses or participants wishing to (anonymously) share lessons with their colleagues. My concept is simple: when a bridge fails in the "analog" world, everyone knows about it. The disaster is visible, and engineers can analyze and learn from the event. The...

More on Engineering Disasters and Bird Flu

Here's another anecdote from the Engineering Disasters story I wrote about recently. In 1956 the cruise ship Andrea Doria was struck and sunk by the ocean liner Stockholm. At that time radar was still a fairly new innovation on sea vessels. Ship bridges were dimly lit, and the controls on radar systems were not illuminated. It is possible that the Stockholm radar operators misinterpreted the readings on their equipment, believing the Andrea Doria was 12 miles away when it was really 2 miles away. The ships literally turned towards one another on a collision course, based on faulty interpretation of radar contact in the dense fog. Catastrophe ensued. This disaster shows how humans can never be removed from the equation, and they are often at center stage when failures occur. The commentator on the show said a 10-cent light bulb illuminating the radar control station could have shown the radar range was positioned in a setting different from that assumed by the operator. ...

Further Thoughts on Engineering Disasters

My TiVo managed to save a few more episodes of Modern Marvels. You may remember I discussed engineering disasters last month. This episode of the show of the same title took a broader look at the problem. Three experts provided comments that resonated with me. First, Dr. Roger McCarthy of Exponent, Inc. offered the following story about problems with the Hubble Space Telescope. When Hubble was built on Earth, engineers did not sufficiently address issues with the weight of the lens and the deflections caused by gravity. When Hubble was put in orbit, the lens no longer deflected, and as a result it was not the proper shape. Engineers on Earth had never tested the lens because they could not figure out a way to do it. So, they launched and hoped for the best -- only to encounter a disaster that required a $50 million orbital repair mission. Dr. McCarthy's comment was "A single test is worth a thousand expert opinions." This is an example of management by fa...

Engineering Disaster Lessons for Digital Security

I watched an episode of Modern Marvels on the History Channel this afternoon. It was Engineering Disasters 11, one in a series of videos on engineering failures. A few thoughts came to mind while watching the show. I will provide commentary on each topic addressed by the episode. First discussed was the 1944 Cleveland liquefied natural gas (LNG) fire. Engineers built a new LNG tank out of material that failed when exposed to cold; when the escaping gas ignited, it torched nearby homes and businesses. 128 people died. Engineers were not aware of the metal's failure properties, and absolutely no defensive measures were in place around the tank to protect civilian infrastructure. This disaster revealed the need to (1) implement plans and defenses to contain catastrophe, (2) monitor to detect problems and warn potential victims, and (3) thoroughly test designs against possible environmental conditions prior to implementation. These days LNG tanks are surrounded by berms capable of containing a...