Our third daughter was born.
I was working in Financial Research at Freddie Mac. Freddie Mac had recently been turned loose from the Federal Home Loan Bank Board and was a publicly traded company. I had overseen the development of option pricing models for credit risk and interest-rate risk.
The information systems department was working on a project to redevelop Freddie Mac’s systems, which were still mostly mainframe programs written in the 1970s. As far as I know, this project was never completed. My guess is that most of the redevelopment work took place ten years later, during the Y2K scare. Even as late as 2003, the systems the corporation used to produce its accounting statements could not stand up to scrutiny.
A lesson I learned from that is that if you don’t tear up your computer systems and start from scratch every few years, they end up dragging you down like an anchor. A few years later, when I had my own company, I was willing to overhaul our system any time we needed to upgrade in some way. Rewriting a system goes quickly if you do it often enough, and it gives you a chance to clean up all the garbage that accumulates in old systems.
Twenty-one years ago, there was no World Wide Web. America Online was launched at that time (transforming a previous company), primarily to serve gamers over a proprietary network. It did not use the Internet, which at that time was pretty much confined to engineers and scientists at some government agencies and major universities.
The Berlin Wall had recently fallen. The elder George Bush was President. The economy had been growing steadily with declining inflation for nearly seven years.
Sean Hannity was trying to break into radio. Paul Krugman had no public recognition, although he had served a stint as a staff economist with the Council of Economic Advisers under President Reagan. My most recent Presidential vote had been for Dukakis.
Much has changed in twenty-one years.
READER COMMENTS
Eric Falkenstein
Oct 5 2010 at 9:24am
By ‘credit risk’, was that mainly in the sense of how default affects prepayment speeds, as opposed to obligor (e.g., FICO) and facility (e.g., LTV) information? I get the sense that when the GSEs talk about credit risk, they take the default probability and the loss in the event of default as given, and focus more on the portfolio implications, or on how these feed into those complicated bond option models.
David N. Welton
Oct 5 2010 at 9:30am
Interesting – this is a field I know much more about than economics.
In stark contrast (well, it seems that way to me at least) with your view is Joel Spolsky on “the big rewrite”:
http://www.joelonsoftware.com/articles/fog0000000069.html
“They did it by making the single worst strategic mistake that any software company can make:
They decided to rewrite the code from scratch.”
It’s a tricky issue though… if you never ever rewrite stuff, it can be like a ball and chain. On the other hand, sometimes rewriting ends up killing companies because everything is going into copying a working system rather than adding features or fixing bugs or working on new products.
Dave
Oct 5 2010 at 10:07am
I presume there is no risk that you will start a new career in Information Technology any time soon, so your bad advice is probably harmless–however, “tear up your systems and start from scratch” every few years is a recipe for treading water, not for moving forward.
Better advice would be: every year reserve some of your budget for system improvements focusing on simplification and data integrity. Constantly improve your systems, don’t let them age in such a way that you are stuck with no choice but a re-write.
Arnold Kling
Oct 5 2010 at 12:05pm
Eric,
No, Freddie Mac did not have much of a portfolio at the time. Credit risk was the default risk on the mortgages. We wanted to know how much to price an 80 percent LTV vs. a 60 percent LTV, or an investor loan vs. owner-occupied. Also, how much capital to hold in a stress-test scenario, where prices were assumed to fall by 40 percent over four years.
You had to spray out a distribution of house prices relative to a national average, and assume a propensity to default as borrowers went underwater.
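The approach described above can be sketched as a small Monte Carlo simulation. This is a hypothetical illustration, not Freddie Mac’s actual model: the 40 percent national price decline comes from the stress scenario in the comment, but the dispersion of individual house prices and the default propensity as a function of negative equity are made-up parameters chosen only to show the mechanics.

```python
import random

def expected_loss(ltv, n_loans=10_000, seed=0):
    """Monte Carlo sketch of stress-test credit risk (illustrative only).

    National house prices are assumed to fall 40 percent; each house's
    price is "sprayed out" around that average with idiosyncratic noise.
    Borrowers default with a propensity that rises as they go underwater.
    Returns expected loss per dollar of original house value.
    """
    rng = random.Random(seed)
    national_factor = 0.60            # 40% national price decline
    total_loss = 0.0
    for _ in range(n_loans):
        # idiosyncratic dispersion around the national average (assumed)
        price_factor = national_factor * rng.lognormvariate(0.0, 0.15)
        equity = price_factor - ltv   # per dollar of original house price
        if equity >= 0:
            continue                  # above water: assume no default
        underwater = -equity
        # default propensity rising with negative equity (made-up slope)
        p_default = min(1.0, 2.0 * underwater)
        if rng.random() < p_default:
            # loss = unpaid balance minus proceeds from selling the house
            total_loss += underwater
    return total_loss / n_loans

# An 80 percent LTV loan should carry more stress-test risk
# than a 60 percent LTV loan, which is the pricing question above.
print(expected_loss(0.80) > expected_loss(0.60))  # True
```

The point of the sketch is only that the loss estimate falls out of two assumed distributions: the spread of house prices around the national path, and the default propensity conditional on negative equity.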
Rebecca Burlingame
Oct 5 2010 at 12:49pm
Dave, per your second paragraph – what a good option for all the heavy weight and gridlock of today’s laws. Which is less scary – opt out or re-write?
Phil
Oct 5 2010 at 12:51pm
Do I just have current-day bias, or has there been less change in the world in the last 10 years than in any other recent decade?
The 60s were a lot different from the 50s. The 70s were a lot different from the 60s. The 80s were a lot different from the 70s. The 90s, especially the late 90s (internet!), were a lot different from the 80s.
But is 2010 really that much different from 2000?
burger flipper
Oct 5 2010 at 1:10pm
Hopefully you can still say your last presidential ballot was cast for Dukakis.
Chris T
Oct 5 2010 at 3:03pm
Do I just have current-day bias, or has there been less change in the world in the last 10 years than in any other recent decade?
Probably current day bias. It’s impossible to put events or experiences in context until you’re well past them.
You remember a lot more from the past decade than the previous decades and so it seems far more continuous. Major changes in prior decades are remembered as singular events rather than as the result of a chain of smaller changes.
Phil
Oct 5 2010 at 3:19pm
>Probably current day bias. It’s impossible to put events or experiences in context until you’re well past them.
Thanks. You’re probably right.
Nic Smith
Oct 5 2010 at 3:29pm
A quick thought on the “rewrite software often” idea — to the extent that software is a collection of decisions made, it represents a constraint on your current actions, and you want to rewrite it as often as possible. To the extent that it is a collection of “things we’ve learned how to do” — capital — you want to keep it around. This will vary from project to project. The implication is that you want to rewrite software if there are more “arbitrary decisions” within it, copying liberally from the old version where the decisions previously made aren’t binding (assuming the choice of language itself or overall architecture isn’t the constraint). Measuring this, however, is not easy.
Dan Weber
Oct 5 2010 at 3:45pm
When you rewrite software well, you forget it almost instantly.
When you rewrite it poorly, you remember it every single day until you quit in disgust.
Even if there are 10 good rewrites for every poor one, you remember the poor one a lot better.
Erich Schwarz
Oct 5 2010 at 9:27pm
Do I just have current-day bias, or has there been less change in the world in the last 10 years than in any other recent decade?
My superficial impression is that that’s true.
But I think that superficial impression is mainly a function of the 2000s having been, bluntly, a very dreary decade compared to either the 1980s or the 1990s (at least for this upper-middle class American, and probably for lots of others). It’s hard to appreciate a decade that feels like one long kick in the shins.
Rationally, I can easily think of technological changes that occurred during the 2000s which are immense, and that are likely to have vast consequences for human life in the coming decades:
1. Development of techniques for getting embryonic-like stem cells from adult cells, using techniques that do not alter the cell’s DNA and that are thus likely to be noncarcinogenic. This would have been utter science fiction as late as 2000. Given this newly developed technique, we now could in principle take any random human being and grow, specifically for that person, replacement body parts, which could have genetic enhancements. (In practice, we still need to learn how to get an organ from that embryonic-like cell — but that is a solved problem in vivo, so it is likely to be tractable in vitro; and we don’t have to deal with the abortion issue while doing so.)
Think about how many organs start to fall apart in people as they age; think about what it would do to human life expectancy if those organs could be replaced with youthful, genetically enhanced, completely isogenic substitutes. More generally, think about being able to give an amputee or paraplegic back the use of his or her limbs, or a blinded man his eyes.
2. Next-generation genomic sequencing, heading steadily toward the goal of an entire human genome being sequenceable from scratch for $1K. This technology is already revolutionizing most of molecular biology because many experiments become much more powerful — or go from impossibility to practicality — when you can sequence 10 billion nucleotides of DNA in two weeks for a few thousand dollars. Again, this was pure science fiction in 2000.
The practical implication is that we are likely to have, by 2020, very wide knowledge of the exact genetic composition of many human beings and many ethnic groups. That will have enormous implications for our ability to decipher the genetics of common human traits and the recent evolution of humankind, not to mention our ability to do customized human genomics on individual patients suffering from diseases with a genetic component (e.g., cancer). That, in turn, is likely to make people both happy and unhappy.
Those are two changes I know rather well since I work in biology. I’ve totally skipped over the iPod, the iPhone, and their non-Apple congeners — but those are pretty big changes too, particularly with the prospect that iPhones and Androids will turn into general-purpose miniature computers (e.g., for credit card transactions by ordinary people) and that they will make anyplace in cell-phone range a free wireless hotspot.
Chris T
Oct 6 2010 at 6:33pm
Some other major science/technological advances/changes in the 2000s:
-Isolation and characterization of graphene
-The creation of metamaterials
-Significant advances in Lithium Ion Batteries leading to them being in almost every mobile device
-Hybrids began to make up a significant portion of the vehicle fleet
-Astronomy is in the midst of a golden age
-The combination of horizontal drilling and hydraulic fracturing has made shale gas economic
-Gene therapy has been refined and shown to be efficacious for a number of disorders (generally in the eye)
-The first manned private space flight
-Military robots have gone from a marginal role to being critical in modern operations
-Home robots are increasingly practical
-Productivity growth was at or near historical highs throughout the decade
Scott
Oct 6 2010 at 10:02pm
When I worked for DEC, I worked with a client who complained about how long it was taking to produce a new version of his software system. His comment was that it only took the Lord six days to create the heavens and the earth, and on the seventh day He rested. So what was our problem?
My response was that that was only because He didn’t have an installed base. Once He had an installed base and wanted to revise it, He had to start by making it rain for forty days and forty nights.
BZ
Oct 7 2010 at 11:03am
Ah, America Online (AOL). It was started by Quantum Services and originally named PC-Link, to be an IBM PC clone version of their flagship product Quantum-Link, which had been an online service for Commodore 64/128 users since the 1980s. IBM PCs were beginning their takeover of the home computer market around this time, and it was time to take notice.
Of course, I’m a programmer, and a home computer user for 35 years, so I’m biased. But if you ask me, it was the latter that was most important about 1990. The era of proprietary computer platforms, where computer products worked roughly the same way game console products do today, was coming to an end. Microsoft and the IBM PC had always been joined at the hip, so Bill Gates was not a poor man at the time. He was about to become a mega-rich man.