Older blog entries for ploppy (starting at number 1)

20 Jan 2003 (updated 20 Jan 2003 at 20:23 UTC) »

Another birthday, another irritating day at work, trying to do the work of three or more Linux kernel developers against ever more unrealistic milestones. A look at the job sites turns up exactly one Linux kernel developer job on offer in Europe, on the other side of the European Union. That's still one more than there has been. Of course, there are lots of kernel developer jobs in America as usual, senior ones too requiring a PhD, rather than the crumbs on offer in Europe.

Slightly bemused by the fact I've still got no certification, even though I have been certified three times. Having read the blurb I can see how that might happen, but it seems rather silly.

No open source development over the weekend, just a lot of boozing. I will probably start on the next set of improvements to squashfs (1.2). It's good to get feedback, and I've already had people asking for the improvements I was intending to make.

Chalst asks for suggestions/feedback on his list of seminal high-level languages. I must admit I disagree with Occam being a seminal language - the elegant parts of Occam were straight from Hoare's CSP (Communicating Sequential Processes). The other parts of Occam were a nightmare: no dynamic thread, memory, or array allocation, no structures, no shared memory between threads, and an intensely irritating syntax straight from hell (Pascal). It was, in short, a brain-dead version of Pascal with Hoare's elegant CSP added. The real innovation was the implementation of CSP's channels in hardware on the transputer, a concept which made plugging multiple transputers together into a parallel computer network easy.

18 Jan 2003 (updated 18 Jan 2003 at 20:46 UTC) »

I was going to make some posts and/or reply to some today. But I can't and so a diary entry will have to do.

Tomorrow I will be 35, which means I have been programming for 20 years. Computing has changed so much in those 20 years (beyond the obvious technological advances), and in my opinion for the worse.

Superficially, computing has stayed the same. Twenty years ago there was a kind of open source: hobbyists wrote code, printed it out, and sent it in to magazines. We had the big computing companies (DEC, IBM, etc.) that dominated the industry; businesses bought from those and scorned the hobbyist 'toy' computer market. No change there, perhaps?

However, we have lost the lively middle ground of small companies; the Sinclairs, Acorns, and Commodores of the industry. These were where the real innovation in computing was coming from. Back in 1983 you could buy, for a reasonable amount of money (130 pounds), a computer with a 256x192 pixel display in 8 colours. Going to a big iron company like DEC, for far, far more money you'd get a slow 80x25 green-screen serial VT220 text terminal (max speed 19,200 bits/s). No innovation. The small companies drove the GUIs, laser printing (Xerox, which developed them, never made any money out of it), the concept of the personal computer, and modern conveniences like BASIC in ROM. Many companies then were still using COBOL and punched cards (in the process of becoming a computing 'professional' I had to learn COBOL in 1984/85).

Does the death of these companies actually matter? Yes, of course it does. Twenty years ago, you could say there were three styles of computing where you could earn a living: the hobbyist/home computer companies, academia, or business computing (read COBOL and punched cards). Today, with the collapse of the middle-ground hobbyist companies, we've all become business computing employees, with the constraints, conservatism, and lack of innovation that implies. I never intended, in 1982, to become a business computing employee; that side of the market filled me with revulsion.

So where has all the innovation gone? Open source, of course. Whereas 20 years ago 'open source' consisted of amateurs practising their skills and sending the results in to magazines, open source today (as confirmed by some large surveys performed a year or two ago) consists of a large number of computing professionals in full-time jobs, chafing against the lack of innovation in their work.

This is bad, of course. What makes it worse is that the big companies, having pushed the middle-ground companies out of business, are increasingly turning to open source for their innovation. The best companies use open source for their own ends (to innovate, as big companies never innovate), but give as much back to the open source community in the areas where they are good - bug fixing, intensive usability testing and fixes.

The worst (perhaps the vast bulk?) treat open source as a reservoir of free talent, taking from it (without any of its high-minded principles) and never giving anything back. This, in the IT downturn, has led to highly experienced people being sacked, with open source taking their place.

The biggest question facing open source is therefore, in my mind: how can open source sustain itself, when its very success is being used against it, destroying the livelihoods of the people who created it?

Increasingly, given a choice (or, more likely, not given a choice), I would move out of computing as a career, and stick to open source. However, having devoted 20 years to computing, and having gained an under-used PhD in it, there is very little else I want to do for a living. Catch-22.
