Older blog entries for ploppy (starting at number 2)

23 Jan 2003 (updated 23 Jan 2003 at 17:09 UTC) »

I've been reading Salmoni's diary. You mention:

I would have learned [Java] a few years ago because my local college was offering an evening course which I could get onto cheaply, but they insisted on using Microsoft Java (was it called J++?) which even then I knew wasn't the standard so I didn't take the course and went to learn Python instead. I wonder why they insisted on using MS's version?

The tendency (in the UK at least) of universities and colleges to teach using Microsoft products is annoying. As to why, it's the joint problem of business pressure and budget cuts. When I first started out as a PhD student (1989), the computing department where I worked (no names, to protect the guilty) used Sun workstations and multi-user servers to teach undergraduates. Under pressure to cut costs, we moved over to PCs and Linux in about 1995. So far so good; however, once the department was using mainstream equipment, it was argued that it didn't need its own machines and could use the central university pool provision instead. The problem is that pool machines run Microsoft operating systems, and therefore teaching had to be done under Microsoft operating systems.

From here it is a slippery slope towards total Microsoft product use, with the encouragement that Microsoft of course sells products cheaply to educational establishments, and that students get a 'skill' in demand by business.

As a lecturer (assistant/associate professor) you are often powerless to resist such changes, which are forced on you from on high for purely financial reasons. Of course, the people forcing the decisions are unaware of the differences between a general-purpose machine and one for computer science teaching.

As far as the 'slippery slope' goes, it saddens me that my old computing department now appears to be pretty much a Microsoft shop: they are funded by Microsoft for research, and recently an ex-colleague of mine went over to Microsoft HQ to accept some sort of academic cooperation award.

As an aside, you (Salmoni) mention you used a Dragon 32, and that you now work at Cardiff University... Did you know that the Dragon was not a complete copy of the Tandy CoCo? As such, the keyboard driver and the other required modifications to MS Basic were done in the Computer Science department at Cardiff. The machine was also manufactured not many miles from you, in Port Talbot (opposite the Margam steel works).

The Dragon was the first computer I had. Like you, I could not afford to buy an assembler, and I still have a notebook from 1983 with machine code laboriously worked out by hand. I finally wrote my own assembler, which was then the longest program I'd ever written!

20 Jan 2003 (updated 20 Jan 2003 at 20:23 UTC) »

Another birthday, another irritating day at work, trying to do the work of 3+ Linux kernel developers with ever more unrealistic milestones. Look on the job sites and find there is exactly one Linux kernel developer job on offer in Europe, on the other side of the European Union. That's still one more than there has been. Of course, there are lots of kernel developer jobs in America as usual, senior ones too, requiring a PhD, rather than the crumbs on offer in Europe.

Slightly bemused by the fact that I've still got no certification, even though I have been certified three times. Having read the blurb I can see how that might happen, but it seems rather silly.

No open source development over the weekend, just a lot of boozing. I will probably start on the next set of improvements to squashfs (1.2). It's good to get feedback, and I've already had people asking for the improvements I was intending to do.

Chalst asks for suggestions/feedback on his list of seminal high-level languages. I must admit I disagree with Occam being a seminal language - the elegant parts of Occam came straight from Hoare's CSP (Communicating Sequential Processes). The other parts of Occam were a nightmare: no dynamic thread, memory, or array allocation, no structures, no shared memory between threads, and an intensely irritating syntax straight from hell (Pascal). It was, in short, a brain-dead version of Pascal with Hoare's elegant CSP added. The real innovation was the implementation of CSP's channels in hardware on the transputer, a concept which made plugging multiple transputers together into a parallel computer network easy.
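CSP's channels live on in modern languages: Go's goroutines and channels are a direct descendant of Hoare's model. As a rough illustration (in Go, not Occam), here is the kind of channel rendezvous the transputer implemented in hardware - two processes sharing no memory, communicating only by synchronised sends and receives:

```go
package main

import "fmt"

func main() {
	// Unbuffered channel: each send blocks until a matching receive,
	// the synchronous rendezvous CSP (and Occam) specify.
	ch := make(chan int)

	// Producer process: sends the squares of 0..4 down the channel,
	// then closes it to signal completion.
	go func() {
		for i := 0; i < 5; i++ {
			ch <- i * i
		}
		close(ch)
	}()

	// Consumer process: receives until the channel is closed.
	// No shared memory is touched; the channel is the only link.
	sum := 0
	for v := range ch {
		sum += v
	}
	fmt.Println(sum) // 0+1+4+9+16 = 30
}
```

On a transputer network the two processes could sit on different chips, with the channel mapped onto a hardware link; in Go they are goroutines multiplexed onto one machine, but the programming model is the same.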

18 Jan 2003 (updated 18 Jan 2003 at 20:46 UTC) »

I was going to make some posts and/or reply to some today. But I can't and so a diary entry will have to do.

Tomorrow I will be 35, which means I have been programming for 20 years. Computing has changed so much in those 20 years (technological advances aside), and in my opinion for the worse.

Superficially, computing has stayed the same. Twenty years ago there was a kind of open source: hobbyists wrote code, printed it out, and sent it in to magazines. We had the big computing companies (DEC, IBM, etc.) that dominated the industry; businesses bought from them and scorned the hobbyist 'toy' computer market. No change there, perhaps?

However, we have lost the lively middle ground of small companies; the Sinclairs, Acorns, and Commodores of the industry. These were where the real innovation in computing was coming from. Back in 1983 you could buy, for a reasonable amount of money (130 pounds), a computer with a 256x192 pixel display in 8 colours. Going to a big iron company like DEC, for far, far more you'd get a slow 80x25 green-screen serial VT220 text terminal (max speed 19,200 bit/s). No innovation. The small companies drove the GUIs, laser printing (Xerox, which developed them, never made any money out of it), the concept of the personal computer, and modern languages like BASIC in ROM. Many companies then were still using COBOL and punched cards (in the process of becoming a computing 'professional' I had to learn COBOL in 1984/85).

Does the death of these companies actually matter? Yes, of course it does. Twenty years ago, you could say there were three styles of computing where you could earn a living: the hobbyist/home computer companies, academia, or business computing (read COBOL and punched cards). Today, with the collapse of the middle-ground hobbyist companies, we've all become business computing employees, with the constraints, conservatism, and lack of innovation that implies. I never intended, in 1982, to become a business computing employee; that side of the market filled me with revulsion.

So where has all the innovation gone? Open source, of course. Whereas 20 years ago 'open source' consisted of amateurs practising their skills and sending code in to magazines, open source today (as confirmed by some large surveys performed a year or two ago) consists of a large number of computing professionals in full-time jobs, chafing against the lack of innovation in their work.

This is bad, of course. What makes it worse is that the big companies, having pushed the middle-ground companies out of business, are increasingly turning to open source for their innovation. The best companies use open source for their own ends (to innovate, since big companies never innovate themselves), but give back to the open source community in the areas where they are strong: bug fixing, and intensive usability testing and fixes.

The worst (perhaps the vast bulk?) treat open source as a reservoir of free talent, taking from it (without any of its high-minded principles) and never giving anything back. This, in the IT downturn, has led to highly experienced people being sacked, with open source taking their place.

The biggest question facing open source is therefore, in my mind: how can open source sustain itself when its very success is being used against it, destroying the livelihoods of the people who created it?

Increasingly, given a choice (or, more likely, not given a choice), I would move out of computing as a career and stick to open source. However, having devoted 20 years to computing, and having gained an under-used PhD in it, there is very little else I want to do for a living. Catch-22.
