Moving Forward Through the Past

Posted 19 Apr 2001 at 15:50 UTC by Mulad

It is important to look back through history periodically, so that we are not doomed to repeat it. Here are a few examples of the problems computing pioneers had to deal with, and of why we may be regressing today.

This semester, I've been taking a History of Computing course that has made me take a look back at the accomplishments of our predecessors in this field. It's been interesting to find parallels between today's problems and the problems that were being solved 25, 50, or 100 years ago. I've also taken other classes that have forced me to take a closer look at programming and the other processes involved in making computers work.

Back in the early days, electronic computers weren't reliable at all. ENIAC had thousands of vacuum tubes, and since a tube usually lasts only a few thousand hours, that was a problem. The ENIAC folks did come up with a number of techniques to get the crème de la crème of vacuum tubes. They built special testing equipment to detect tubes that would die early, and they kept the tube heaters on most of the time, since tubes usually failed when they were switched off or on. (The same problem I have today with light bulbs...)

The effort to improve reliability carried over into many other projects. Thanks to some people who were completely obsessed with getting it right, a number of great products were created. The early nodes of the ARPANet were based on a military-hardened version of a Honeywell computer. A representative of the company once took a sledgehammer to one of these machines at a trade show to demonstrate how much of a beating it could take. (Of course, the over-engineered cabinetry turned out to hinder repair work a bit, but it shows how much these folks wanted reliable systems.)

Bolt Beranek and Newman (BBN), which was contracted to build and operate the ARPANet, created a lot of interesting technology for it. The nodes on the network eventually became able (not in the beginning, but after a time) to load new copies of their software from neighboring nodes, meaning that no paper tape or other media had to be distributed. BBN created all sorts of other tools for testing network integrity. Their line testers were better than those of AT&T (which provided the leased lines), and could show whether a particular link was on its last legs.

In addition to the reliability factor, many folks were very interested in how best to display information. You may know this better as the "Command and Control" or "C&C" problem: how much, or how little, information do you need in order to make a "command decision"? Computers are not just around to crunch numbers or produce documents; they can also be programmed to help us see the big picture without telling us everything, by giving us only the "important" information. This implies a good amount of interactivity and the ability to get, process, and present information in real time or near-real time.

Today, I'm not sure we're living up to the standards laid down by those before us. Reliability is and always will be a problem. Professionals in the computing field just need to make sure they put the effort into thinking before doing. I don't subscribe to many of the Software Engineering ideas; I just can't justify the overhead of tracking my time down to the minute, or of producing document after document describing my thought processes. However, I now understand how much planning and design it takes to produce something that exudes excellence. The weekend hack has its place, but I think we should always try to produce things that will stand up over time.

A lot of effort has been put into changing software so that it drives web pages rather than text or graphical interfaces. The problem is that web pages really go against much of the preceding work put into the C&C problem. They aren't real-time, and many are designed extremely poorly (of course, many web pages are based on interfaces that were poorly designed in the first place).

Many people have lost sight of what computing technology is really supposed to do. What did your computer do for you today? Mine showed me a few web pages (mostly news) and some e-mail. That doesn't sound very different from life on the ARPANet back in 1972, when people could read the AP newswire online and some crazy netheads would carry around twenty-pound "portable" terminals so that they could dial up and read e-mail.

Lastly, I'm concerned about what seems to be a lack of openness. This is a two-way street, of course, as some are finding it more beneficial to become more open about their technologies, and others are trying to see how much money they can get by tightening their grip on proprietary devices and techniques. Personally, I don't think our society can advance without becoming more open. It's striking to discover similarities between a speech made two thousand years ago against the Roman occupation of what is now Great Britain and an online rant of today against corporate conglomerates or other large foes. When we cannot remember what has come before, technologically or otherwise, we are only doomed to repeat it again and again and again.

Now, what do you want your computer to do for you today? Some early pioneers wanted computers to help us be more involved with government. In many ways, this has been successful as laws, court cases, and other documents have found their way onto the web. Computers also took over many of the jobs done by secretaries and other clerks. My History of Computing professor has found, though, that he is now doing much of that work, not his computer.

Certainly, I've been taking a somewhat dim view. Our society has definitely progressed. However, it's important to remind ourselves of the accomplishments of the past so that we can properly focus our energies in the future.


Read the Risks Forum, LinuxQuality, posted 19 Apr 2001 at 22:41 UTC by goingware » (Master)

The issue of computer reliability, both software and hardware, has been discussed extensively for years in The Forum on Risks to the Public in Computers and Related Systems, also available on Usenet as comp.risks.

If you're a member of Advogato, then you've got good reason to be reading Risks regularly, and contributing on-topic posts to it. This means you.

In fact, anyone who uses computers at all ought to check out Risks every now and then.

Note also what I'm trying (slowly) to do in the area of reliability of Free Software with project LinuxQuality (http://linuxquality.sunsite.dk/). There's not a lot there yet, but there are a couple of articles on testing the Linux kernel and testing web applications.

With LinuxQuality being a volunteer project (as most free software projects are) and my business being hectic, I don't get a lot of time to work on it, but I have great hopes for it in the long run.

Systems research, posted 20 Apr 2001 at 06:48 UTC by mslicker » (Journeyer)

See Rob Pike's related commentary. Universities contribute a majority of public computing research, so it seems only natural that if systems research is stagnating in the universities, then the systems we use are also stagnating. To me, at least, the state of the art has stagnated, and in some areas maybe regressed. Unix, for example, is a beautiful system, but it is 30 years old, and now we are trying to bootstrap very different system ideas onto the Unix model. I know the main reason for this is compatibility, but eventually we will need a new system model to adapt to our changing use of the computer. We also need to look at new ways of interacting, not just the formulaic user interface concepts of the past.

Worse is better, posted 20 Apr 2001 at 11:13 UTC by dirtyrat » (Journeyer)

You might want to read The rise of "Worse is Better".

computing reliability and lost parallel algorithms, posted 20 Apr 2001 at 13:24 UTC by lkcl » (Master)

one of the people at unisys described to me the history of the development of their servers.

they started off, twenty or thirty years ago, providing massively reliable raid disk storage systems, back when a hard disk would typically have a failure every few days [yes, that's right: 12-in winchester hard drives that, if you inserted the 16 platters incorrectly, meant you had better take cover when they wound themselves up].

The Rules Are: You Do Not Answer That The Data Has Been Written Until You Have Written It To Several Disks And Then Verified *ALL* Of Them.

[ which is a bit of a bugger if, twenty-five years later, you're writing an SMB server with client-side write-back caching built in to the protocol :) :) ]

this was for use by banks etc. with the kind of data that, if it's lost, the bank stands to lose several million or several billion dollars. i mean, that data *is* the money. so you ABSOLUTELY HAVE to know it's written to disk.
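a minimal sketch of that rule in python, purely for illustration (the Disk class and its interface are invented here; a real storage stack is vastly more involved):

    import hashlib

    class Disk:
        """toy stand-in for one physical disk."""
        def __init__(self):
            self.blocks = {}

        def write(self, block_no, data):
            self.blocks[block_no] = data

        def read(self, block_no):
            return self.blocks.get(block_no)

    def replicated_write(disks, block_no, data):
        """write to every disk, then read them *all* back and verify
        before acknowledging, per the rule above."""
        digest = hashlib.sha256(data).digest()
        for d in disks:
            d.write(block_no, data)
        for d in disks:
            readback = d.read(block_no)
            if readback is None or hashlib.sha256(readback).digest() != digest:
                raise IOError("verify failed: do NOT answer that the data is written")
        return True  # only now may the client be told the write happened

    disks = [Disk() for _ in range(3)]
    assert replicated_write(disks, 0, b"that data *is* the money")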

now, take that up to the present day. using advanced modern raid systems, you now have reliability guaranteed to a staggeringly stupid degree of probability. so they basically built upon the lessons learned from old techniques, and just took it one stage further.
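a back-of-envelope sketch of what that degree of probability looks like, assuming (unrealistically) independent failures and ignoring repair windows; the 1% annual failure rate is just an assumed number:

    # if one disk fails in a year with probability p, then n independent
    # replicas all failing in that year has probability p**n.
    p = 0.01  # assumed annual failure probability of a single disk
    for n in (1, 2, 3):
        print(n, "independent copies -> annual loss probability ~", p ** n)
    # prints roughly 0.01, 0.0001, and 1e-06 in turn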

what else. i read a science fiction book fifteen years ago by... asimov, i think. the basic premise of the book was that the crew of a craft lost their computer. instead of panicking, fortunately, they had someone on-board who was a maths professor, who knew the principles of astronomy and the laws of motion. so, what they did was, they learned how to do sines, cosines, tangents etc. and they got to the point where, after a few months, they could do massive amounts of computation in their heads, just eating and sleeping numbers. they had the charts, they worked out where they were: over time, they worked out their speed, they worked and worked and worked.

and they computed their way home, in a massive, collaborative, parallel algorithm effort.

even FIFTEEN YEARS ago, the age of machine computing was heralded as the breakthrough in calculation. everything could, of course, be done sequentially, by a computer, much faster than human beings doing the same computations in parallel, right? [remember the slide rule?]

right.

wrong.

it became very apparent that the spread of sequential programming techniques was superseding the art of *manually* performing and using parallel algorithms, with the result that those techniques became lost.

and now, of course, we know better (*duur*), but it's too late: the people who used to know such parallel algorithm techniques because they lived them every day have retired... and now we have to rediscover them.
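for flavour, here's the crew's trick in modern dress: deal the same computation out to several workers and recombine the results. a toy python sketch, nothing more:

    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        # each 'human computer' grinds through one slice of the work
        return sum(chunk)

    if __name__ == "__main__":
        numbers = list(range(1_000_000))
        chunks = [numbers[i::4] for i in range(4)]  # deal the work to 4 workers
        with ProcessPoolExecutor(max_workers=4) as pool:
            total = sum(pool.map(partial_sum, chunks))
        assert total == sum(numbers)  # same answer, arrived at in parallel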

another book, again by Asimov, called "The End of Eternity". Computer Harkan is one of the characters. no, he's not a machine, he's a human. a very respected individual.

"Computer" is a title. a bit like "Dr" or "Professor". these are the people responsible for performing "Computation". they are *highly* respected.

...this book dates from the mid-1950s, well within living memory...

lightbulbs are designed to fail, posted 20 Apr 2001 at 13:27 UTC by lkcl » (Master)

consumer age, and all that stupidity, i'm afraid. it _is_ possible to take out all the air and replace it completely with argon: this is not done, so that the lightbulb is guaranteed to fail and you will go and spend your money.

sorry :)

valve reliability, posted 20 Apr 2001 at 13:30 UTC by lkcl » (Master)

[by contrast: a small amount of mercury is placed in the tube when it is constructed. a tiny coil is fitted with the *sole* purpose of flash-heating that mercury so that it reacts with any remaining air in the tube, creating near-as-dammit the best vacuum they could get].

navy commission in 1800s, posted 20 Apr 2001 at 13:38 UTC by lkcl » (Master)

[i'm really sorry: this article seems to be dredging stories out of me :)]

the Royal Navy issued a commission in about... 1850? the purpose: navigational tables were expensive to compute, and even more difficult to copy accurately. pages and pages of numbers. they had discovered that not only were some of the tables calculated incorrectly, but *one in three* had copying errors. translated into actual movement, an error in the fourth decimal place could leave you a hundred miles off-course from your destination.

_not_ funny.

their commission? a) to build a machine capable of performing accurate computation [well, duur, i think we've got that one down] b) to *print* the results of the computation accurately.

WYSIWYG? pffh! don't make me laugh :) printing is a pain in the neck! double-sided, A4 or US-letter, colour, whoops, i seem to have put the page in backwards. hellooo, where's my network? oh dear, my scaling's all wrong now that i'm using a black-and-white windows printer not a colour one AAAGH!

[btw, the challenge was taken up by Babbage, first with his Difference Engine and later the grander Analytical Engine; the theoretical successful completion of the latter, and the historical consequences, are explored in 'The Difference Engine' by William Gibson and Bruce Sterling. very interesting book]
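for the curious: the trick behind the Difference Engine was the method of finite differences, which tabulates any polynomial using nothing but repeated addition once it has been seeded. a rough python sketch of the idea:

    def tabulate(poly, start, count):
        """tabulate a polynomial (coefficients, lowest power first) using
        only additions after the setup, as the Difference Engine did."""
        def f(x):
            return sum(c * x ** k for k, c in enumerate(poly))
        degree = len(poly) - 1
        row = [f(start + i) for i in range(degree + 1)]  # the one-time seed
        diffs = []
        while len(row) > 1:
            diffs.append(row[0])
            row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])  # the top difference of a polynomial is constant
        table = []
        for _ in range(count):
            table.append(diffs[0])
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]  # one cascade of additions per entry
        return table

    # x**2 + x + 41, tabulated with nothing but addition after the seed
    print(tabulate([41, 1, 1], 0, 8))  # [41, 43, 47, 53, 61, 71, 83, 97]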

Books on computing history?, posted 22 Apr 2001 at 12:27 UTC by srl » (Journeyer)

So, at least in the US, one's secondary education covers some general sociopolitical history. Here's what people before us did, and here's how they were wrong, or here's how someone later improved on their way of doing things. It occurs to me that my undergrad education was missing computing history after Turing. I got the sense from reading the "Worse is Better" article that We've Been Here Before and I don't know about it.

So, what books should I read about that cover computing and language history in interesting, readable detail? I'm not looking for scholarly work or pages and pages of dry detail; more like Neal Stephenson's explanations of cryptography than like *Structure and Interpretation of Computer Programs*.
