Wednesday, April 14, 2010

ON COMPUTATION

My experience with computers goes back to the fall of 1970, when, as a callow freshman in college, I encountered the IBM 360.

That was old-school computing at its finest. The damned thing was big enough to fill an entire building, but we rarely ventured there. It was said that a mysterious squad of Geek-Acolytes lived in the Computer Center, where virgins (thin on the ground in those heady days) would be sacrificed, on occasion, to the Calculational Gods. And so we would go to Fine Hall, where there was a convenient Hollerith card-reading station and a printer. You’d stick your deck of punch cards in the reader, then wait for your job to run. As soon as the printer (a humongous affair the size of a Mini Cooper) pooped out your output, you would collect it, curse at the (inevitable) belatedly-discovered errors, then start all over again with a corrected card deck.

[Fine Hall was an interesting place. Late at night, bizarre scribblings would appear on blackboards, placed there by the so-called “Phantom of Fine Hall.” Said Phantom was none other than John Forbes Nash, Jr., the schizophrenic genius mathematician who would later receive the Nobel Prize for Economics for his work in game theory... and who would eventually be portrayed by Russell Crowe in the film A Beautiful Mind.]

The idea of a computer small enough to sit on your desk - never mind fit in your pocket - was pure moonshine in those days.

Home computing started impinging on our consciousness in 1984 when our friend Donnie Joe bought himself a Macintosh computer. It combined the CPU, disk drive, and monitor in a single, chunky unit... and it used something interesting and new: a mouse. We finally got one for ourselves seven years later - a Macintosh LC - by which time the machine had evolved to where it had a color monitor, a whole megabyte of RAM, and a 30 megabyte hard drive.

That machine, now nineteen years old, sits quietly in our garage. It has been superseded. Many times.

Sometime in the mid-1990s, it became clear that technological advances had rendered our little Mac LC kludgy and obsolete. Connecting to the then-nascent Internet was possible only after jacking up the RAM, and even then the results were not especially robust. And so we ventured into PC-land.

In our experience, a computer will last something on the order of four to five years before advances in software, operating systems, and hardware make replacement an increasingly attractive option. Our most recent desktop machine had been with us for six or seven years, and it was definitely showing signs of age. We replaced the hard drive a few years ago, but there just wasn’t enough RAM to keep up with the latest versions of my workhorse applications.

And so we have replaced it... with a shiny new machine that boasts 8 gigs of RAM and a 1 TB (that’s terabyte, y’all - a trillion bytes!) hard drive. And the 27-inch high-definition monitor is tasty, too.

The thing runs like greased lightning. Webpages snap into place; applications launch almost instantly. Sweet.

Whether it will improve the quality of my blogging remains to be seen. But don’t count on it.

As hot as this new Computing-Engine is right now, I’m sure our grandkids (assuming we eventually have any) will be looking at its dusty, basement-dwelling carcass in fifteen or twenty years, thinking, “Geez - just one terabyte - how could those people live?”
