Saturday, September 01, 2007

Wall Street Clowns and Their Models

Recently I cited an Economist article in my post Economist on the Peril of Models. While walking through the airport, this BusinessWeek cover story, Not So Smart, caught my eye. I found the following excerpts interesting.

The titans of home loans announced they had perfected software that could spit out interest rates and fee structures for even the least reliable of borrowers. The algorithms, they claimed, couldn't fail...

It was the assumptions and guidelines that lenders used in deploying the technology that frequently led to trouble, notes industry veteran Jones. "It's garbage in, garbage out," he says. Mortgage companies argued their algorithms provided near-perfect precision. "We have a wealth of information we didn't have before," Joe Anderson, then a senior Countrywide executive, said in a 2005 interview with BusinessWeek. "We understand the data and can price that risk."

But in fact, says Jones, "there wasn't enough historical performance" related to exotic adjustable-rate loans to allow for reasonable predictions. Lenders "are seeing the results of not having that info now..."


At this point it probably sounds like I am seriously anti-model. That isn't really the case. The problems cited in the BusinessWeek story involve feeding arbitrary values into models. Non-arbitrary data is grounded in some reality, such as "historical performance" over an appropriate past period, projected into an appropriate future period.
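To make the "garbage in, garbage out" point concrete, here is a minimal sketch (entirely my own illustration, with made-up numbers) of what goes wrong when a model is calibrated only on benign historical data and then asked to price conditions far outside that history:

```python
# Illustrative sketch: a model fit only to calm-period data has no way
# to see nonlinear behavior outside that range. All numbers are invented.

def fit_linear(xs, ys):
    """Ordinary least squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# "Historical performance": hypothetical default rates (%) observed
# under small payment shocks (percentage points of rate reset).
rate_resets  = [0.0, 0.5, 1.0, 1.5]
default_rate = [1.0, 1.2, 1.4, 1.6]   # roughly linear in the calm regime

a, b = fit_linear(rate_resets, default_rate)

# Extrapolating to a 4-point payment shock, far outside the data,
# the linear model predicts a mild ~2.6% default rate. In a real
# crisis, defaults behave nonlinearly; a model fed only calm-period
# history cannot know that.
predicted = a + b * 4.0
```

The model is not "wrong" in any mechanical sense; it faithfully reflects the data it was given, which is exactly the problem the article describes with exotic adjustable-rate loans.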

Incidentally, one of the articles I read cited the Intangible Asset Finance Society, which is "dedicated to capturing maximum value from intellectual properties and other intangible assets such as quality, safety, security, and brand equity." That sounds like something to review.

3 comments:

alex said...

So Richard, you were a consultant, and presumably in your new job you answer to someone - when asked to make a recommendation between two different risk-reducing efforts, how do you answer the question? What criteria do you use?

jbmoore said...

Businesses seek to model their markets and processes to make money and give themselves an edge. The people who are best at these jobs learn through an iterative process of trial and error we call experience. They know what they are doing, but they can't always describe how they do it, because some of the analysis is likely subconscious. Given enough time and resources, you can build an expert system to mimic a human expert, but it will be brittle: the expert program will only be good at what it was trained to do.

Clearly the expert programs this article is talking about weren't vetted enough. Either the process they were modeling wasn't fully understood, they didn't talk to enough experts, or, as the article states, they didn't test their software against real data to verify its results against reality. It also sounds like they didn't test the software against one of their own experts' results, which is a QA problem. They skipped a crucial step in software development, and being penny wise and pound foolish, they paid for it. Or perhaps the software has the same success rate as human lenders; after all, there is little historical performance data for exotic adjustable-rate mortgage loans. They could have used an expert program modeled on fixed-rate loans and assumed it could do the job.

The French made a similar mistake when they reused the Ariane 4 flight guidance software to fly the Ariane 5 booster rocket. They lost the Ariane 5 and its payload because the rocket had different flight characteristics that the software couldn't handle. The software did what it was designed to do, which was guide an Ariane 4; when it met conditions outside that envelope, it shut down in flight, and the rocket was lost. We'll likely find out what went wrong with the mortgage models six months or more from now, when a computer scientist dissects the programs.
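The Ariane reuse failure the comment above describes can be sketched in a few lines. This is my own simplified illustration, not the actual guidance code: the real bug was a 64-bit floating-point value converted to a 16-bit signed integer, and Ariane 5's larger horizontal velocity pushed that value past the 16-bit range, raising an unhandled operand error. The variable names and numbers below are hypothetical.

```python
# Sketch of reusing code validated for one operating envelope (Ariane 4)
# in a wider envelope (Ariane 5). Values are illustrative, not flight data.

INT16_MIN, INT16_MAX = -32768, 32767

def to_int16(value):
    """Convert a float to a 16-bit signed integer, as the guidance
    code effectively did: out-of-range input raises an error."""
    if not INT16_MIN <= value <= INT16_MAX:
        raise OverflowError("operand error: value outside 16-bit range")
    return int(value)

ariane4_horizontal_bias = 20000.0   # within the tested Ariane 4 envelope
ariane5_horizontal_bias = 50000.0   # Ariane 5 flies a different profile

to_int16(ariane4_horizontal_bias)   # fine: the code does what it was built for

try:
    to_int16(ariane5_horizontal_bias)
except OverflowError:
    # In the real flight, the equivalent unhandled exception shut down
    # the inertial reference systems, leading to loss of the vehicle.
    pass
```

Like the mortgage models, the code was never wrong about the world it was built for; it was simply launched into a world it had never seen.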

Anonymous said...

Sorry, maybe I am oversimplifying the whole sub-prime mess... but it doesn't take a computer scientist to figure out that poor people with crappy credit buying houses they shouldn't would cause a crash. There was too much liquidity and too many carpetbagger lenders for this not to go south.