I was going to make some posts and/or reply to some today. But I can't and so a diary entry will have to do.
Tomorrow I will be 35, which means I have been programming for 20 years. Computing has changed so much in those 20 years (quite apart from technological advances), and in my opinion for the worse.
Superficially, computing has stayed the same. Twenty years ago there was a kind of open source: hobbyists wrote code, printed it out, and sent it in to magazines. We had the big computing companies (DEC, IBM, etc.) that dominated the industry; businesses bought from those and scorned the hobbyist 'toy' computer market. No change there, perhaps?
However, we have lost the lively middle ground of small companies; the Sinclairs, Acorns, and Commodores of the industry. These were where the real innovation in computing was coming from. Back in 1983 you could buy, for a reasonable amount of money (130 pounds), a computer with a 256x192 pixel display in 8 colours. Going to a big iron company like DEC, for far, far more you'd get a slow 80x25 green-screen VT220 serial text terminal (maximum speed 19,200 bit/s). No innovation. The small companies drove the GUIs, laser printing (Xerox, which developed them, never made any money out of it), the concept of the personal computer, and modern languages like BASIC in ROM. Many companies then were still using COBOL and punched cards (in the process of becoming a computing 'professional' I had to learn COBOL in 1984/85).
Does the death of these companies actually matter? Yes, of course it does. Twenty years ago, you could say there were three styles of computing where you could earn a living: the hobbyist/home computer companies, as an academic, or in business computing (read COBOL and punched cards). Today, with the collapse of the middle ground hobbyist companies, we've all become business computing employees, with the constraints, conservatism, and lack of innovation that implies. I never intended, in 1982, to become a business computing employee; that side of the market filled me with revulsion.
So where has all the innovation gone? Open source, of course. Whereas 20 years ago 'open source' consisted of amateurs practising their skills and sending their code in to magazines, today's open source (confirmed by some large surveys performed a year or two ago) consists of a large number of computing professionals in full-time occupations chafing against the lack of innovation in their jobs.
This is bad, of course. What makes it worse is that the big companies, having pushed the middle ground companies out of business, are increasingly turning to open source for their innovation. The best companies use open source for their own ends (to innovate, as big companies never innovate), but give back to the open source community in the ways where they are good: bug fixing, and intensive usability tests and fixes.
The worst (perhaps the vast bulk?) treat open source as a reservoir of free talent, and take it (without any of its high-minded principles) and never give anything back. This, in the IT downturn, has led to highly experienced people being sacked, with open source taking their place.
The biggest question facing open source is therefore, in my mind: how can open source sustain itself, when its very success is being used against it, destroying the livelihoods of the people who created it?
Increasingly, given a choice (or more likely not given a choice), I would move out of computing as a career, and stick to open source. However, having devoted 20 years to computing, and having gained an under-used PhD in it, there is very little else I want to do for a living. Catch-22.