July 31, 2007 4:00 AM PDT
Albuquerque extols its role in PC revolution
Some might call that conclusion heresy, but here's my twisted theory: Albuquerque-based Micro Instrumentation and Telemetry Systems (MITS) created the Altair 8800 in 1975. This in turn led to Bill Gates and Paul Allen writing the Altair version of Basic, founding Microsoft in the process, and moving here to set up shop close to the Altair's manufacturer.
Sounds good, right? But here's the rub: the Altair was a hit, probably more so than MITS could handle. As production problems and other technical issues arose, MITS began losing money, and it was eventually bought up by an out-of-town corporation. And because Gates and Allen no longer had anything tying their now million-dollar-a-year business to Albuquerque, they packed their bags and took Microsoft back to where they'd grown up, the Seattle area.
The result? Redmond, not Albuquerque, is now home base for most of Microsoft's 65,000 employees. And what city wouldn't want an employer that creates that kind of tax base? Instead, Albuquerque is left with the notoriety of being only where Microsoft began.
These are some of my conclusions after stopping here during Road Trip 2007, my tour around the Southwest. I'm visiting "Startup: Albuquerque and the Personal Computer Revolution," an exhibit conceived of and largely funded by Allen that's now housed at the New Mexico Museum of Natural History. It's probably not the message Allen and the exhibit's curators wanted to convey, but there you go.
The real point of "Startup," if you can see past my perverted logic, and as its name implies, is that the personal computer revolution did in fact begin in Albuquerque--something that might surprise many people unfamiliar with the PC's history.
The exhibit, most likely the only one in the world specifically devoted to the history of the microcomputer, makes its point elegantly by first laying out the technology that led to the PC revolution, and then explicitly spelling out Albuquerque's role.
The exhibit starts by quickly taking us back to the early 20th century. One of the very first artifacts is a little book, Songs of the IBM, a 1931 volume filled with the fellowship songs Big Blue employees would chant at company meetings.
A little farther down is one of the exhibit's masterpieces: a beautiful Univac-1 console, part of the machine that in 1953 became perhaps the first commercial computer. Of course, as the exhibit points out, a contemporary little handheld computer has 45,000 times as much memory and works 450,000 times faster than a Univac-1, but who's counting?
Visitors are then presented with this factoid: in 1953, a computer cost $1 million to own, or $35,000 a month to rent, and you'd also need enough electricity for a small town, enough air conditioning for a three-bedroom house, the ability to speak machine language, and a staff of seven to operate it.
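Out of curiosity, those figures imply that renting beat buying for a little over two years. A quick back-of-the-envelope check (my arithmetic, not the exhibit's):

```python
# Rough break-even check on the exhibit's 1953 figures (illustrative only)
purchase_price = 1_000_000  # dollars to own a computer outright
monthly_rent = 35_000       # dollars per month to rent one

# Months of rent before you'd have spent the purchase price
break_even_months = purchase_price / monthly_rent
print(round(break_even_months, 1))  # → 28.6
```

So after roughly 29 months of rent checks, you'd have paid the sticker price anyway.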
Pretty funny, given that today a great computer goal is to provide $100 laptops to children in the Third World. The exhibit, in fact, has one of Nicholas Negroponte's $100 laptops on display.
Next, we enter the '60s, and we're told that the computer revolution wouldn't have happened were it not for the military and space programs requiring compact computer processing power.
Luckily they did, and one of the first results was Spacewar!, what may be the world's first video game. It was a project of several MIT students, who took their oh-so-powerful DEC PDP-1 computer in 1961 and used it to make a two-player game in which each player controls a spaceship and tries to shoot the other while maneuvering around the gravity well of a star.
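The gravity-well mechanic that made Spacewar! interesting is simple in principle: each tick, the star tugs on the ships with an inverse-square pull. A minimal sketch of that idea in modern Python (the names, constants, and Euler integration here are my illustration, not anything from the original PDP-1 code):

```python
import math

G_M = 1000.0  # gravitational constant times star mass (arbitrary units)
DT = 0.01     # simulation timestep

def step(x, y, vx, vy):
    """Advance a ship one timestep under the star's gravity (star at origin)."""
    r = math.hypot(x, y)              # distance to the star
    a = G_M / (r * r)                 # inverse-square acceleration magnitude
    ax, ay = -a * x / r, -a * y / r   # acceleration vector, pointed at the star
    vx += ax * DT
    vy += ay * DT
    return x + vx * DT, y + vy * DT, vx, vy

# A ship launched sideways at circular-orbit speed v = sqrt(G_M / r)
x, y = 10.0, 0.0
vx, vy = 0.0, math.sqrt(G_M / 10.0)
for _ in range(1000):
    x, y, vx, vy = step(x, y, vx, vy)
# The ship stays in orbit: its radius remains near 10
# (simple Euler integration drifts outward slightly over time)
```

Launched any slower and the ship spirals into the star, which is exactly the hazard the MIT players had to maneuver around.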
From there we move swiftly to the fact that Dartmouth College math professors John Kemeny and Thomas Kurtz invented Basic in 1964, and to a brief introduction to the evolution of computer processing power, from vacuum tubes to transistors in 1947, and then on to integrated circuits, which, in a curious coincidence, Jack Kilby and Robert Noyce each independently invented in 1958.
There's also a brief discussion of the founding of Intel, which invented the microprocessor, even though it didn't intend to.
According to the exhibit, a client asked Gordon Moore and Noyce's young company to build a complex calculator that would have required a dozen chips. But Intel didn't have the manpower to do the job, so it came up with another idea: What if they could put all the computing functions on a single chip? Thus, computer history was made.