LCA 2010 videos are showing up
Not all the videos are there yet, but they are starting to show up. Yay. See http://mirror.internode.on.net/pub/linux.conf.au/2010/index.html or your local LA mirror.
Yay Dell-with-Ubuntu down under
Dell has been offering Ubuntu on selected models for a while. However, I had nearly given up hope of being able to buy one, because they hadn't started doing that in Australia. I am very glad to see this has changed – check out their notebook page. Not all models yet, but a reasonable number have Ubuntu as an option.
Yay!
Using UEC instead of EC2
So, we wanted to move a Hudson CI server at Canonical from using chroots to VMs (for better isolation and security), and there is this great product, Ubuntu Enterprise Cloud (UEC – basically Eucalyptus). To do this I needed to make some changes to the Hudson EC2 plugin – and that's where the fun starts. While I focus on getting Hudson up and running with UEC in this post, folk generally interested in the differences between UEC and EC2, or in getting a single-machine UEC instance up for testing, should also find this useful.
Firstly, getting a test UEC instance installed was a little tricky – I only had one machine to deploy it on, and this is an unusual configuration. Nicely though, it all worked once a few initial bugs and misconfiguration items were fixed up. I wrote up the crux of the outcome on the Ubuntu community help wiki – see '1 Physical system'. The particular trap to watch out for seems to be that this configuration is not well tested, so the installation scripts have a hard time getting it right. I haven't tried to make it play nice with Network Manager in the loop, but I'm pretty sure that can be done via interface aliasing or something similar.
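For what it's worth, the interface aliasing I have in mind would be a stanza along these lines – a minimal sketch only, and the interface name and addresses here are my assumptions, not something from the wiki page:

```
# /etc/network/interfaces fragment (hypothetical interface and addresses)
auto eth0:0
iface eth0:0 inet static
    address 192.168.2.1
    netmask 255.255.255.0
```

The idea is just to give the UEC bridge a stable address to sit on regardless of what Network Manager does to the primary interface.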
Secondly I needed to find out what was different between EC2 and UEC (Note that I was running on Karmic (Ubuntu 9.10) – so things could be different in Lucid). I couldn’t find a simple description of this, so this list may be incomplete:
So the next step is to modify the Hudson EC2 plugin to support these differences. Fortunately it is in Java, and the Java community has already updated the various libraries (jets3t and typica) to support UEC – I just needed to write a UI for the differences and pass the info down the various code paths. Kohsuke has let me land this now even though it has an average UI (in rev 27366), and I'm going to make the UI better now by consolidating all the little aspects into a couple of URLs. Folk comfortable with building their own .hpi can get this now by svn updating and rebuilding the ec2 plugin. We've also filed another bug asking for a single API call to establish the endpoints, so that it's even easier for users to set this up.
Finally – and this isn't a UEC difference – I needed to modify the Hudson EC2 plugin to work with the ubuntu user rather than root, as Ubuntu AMIs ship with root disabled (as all Ubuntu installs do). I chose to have Hudson re-enable root, rather than making everything work without root, because the current code paths assume they can scp things as root, so this was less disruptive.
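The mechanism is roughly this: connect as ubuntu, then use sudo to put the master's key where root can use it. Here is a runnable sketch – paths are staged under a temp directory so it can run anywhere (on a real instance you would operate on / directly, via the ssh command shown in the comment, and the key and host are of course made up):

```shell
# Stage a fake instance filesystem so the sketch is runnable locally.
STAGE=$(mktemp -d)
mkdir -p "$STAGE/home/ubuntu/.ssh" "$STAGE/root"

# The key Hudson's master would have injected for the ubuntu user.
echo "ssh-rsa AAAA... hudson@master" > "$STAGE/home/ubuntu/.ssh/authorized_keys"

# On a real node this step would be:
#   ssh ubuntu@<node> 'sudo cp -r /home/ubuntu/.ssh /root/ && sudo chown -R root:root /root/.ssh'
cp -r "$STAGE/home/ubuntu/.ssh" "$STAGE/root/.ssh"

# Root can now accept the same key, so scp-as-root keeps working.
cat "$STAGE/root/.ssh/authorized_keys"
```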
With all that done, it's now possible to configure a Hudson instance to test via UEC nodes. Here's how:
cat >> /etc/apt/sources.list << EOF
deb http://archive.ubuntu.com/ubuntu/ karmic multiverse
deb http://archive.ubuntu.com/ubuntu/ karmic-updates multiverse
deb http://archive.ubuntu.com/ubuntu/ karmic-security multiverse
EOF
export http_proxy=http://192.168.1.1:8080/
export DEBIAN_FRONTEND=noninteractive
apt-get update
echo "buildd shared/accepted-sun-dlj-v1-1 boolean true" | debconf-set-selections
apt-get install -y -f sun-java6-jre
Note that I have included my local HTTP proxy there – just remove that line if you don’t have one.
Note that Hudson will try to use Java from S3 if you don't install it, but that won't work right for a few reasons – I'll be filing an issue in the Hudson tracker about it, as that's a bit of unusual structure in the existing code that I'm happier leaving well enough alone.
Is a code of silence evil?
Looking at using google apps for my home email, as I want to be able to have my home machines totally turned off from time to time.
Found this interesting gem in the sign-up agreement (which I have not yet agreed to):
11. PR. Customer agrees not to issue any public announcement regarding the existence or content of this Agreement without Google’s prior written approval. Google may (i) include Customer’s Brand Features in presentations, marketing materials, and customer lists (which includes, without limitation, customer lists posted on Google’s web sites and screen shots of Customer’s implementation of the Service) and (ii) issue a public announcement regarding the existence or content of this Agreement. Upon Customer’s request, Google will furnish Customer with a sample of such usage or announcement.
This is rather asymmetrical: if I agree to the sign-up page, I cannot say 'I am using Google Apps', but Google can say 'Robert is using Google Apps'. While I can appreciate not wanting to be dissed on if something goes wrong, this is very much not open! A couple of implications: everyone seeking support for Google Apps in the Apps forums is probably in violation of the sign-up agreement; and we can assume that anyone having a terrible experience has been squelched under this agreement.
Le sigh.
Adding new languages to Ubuntu
Scott recently noted that we don't have Klingon available in Ubuntu. Klingon is available in ISO 639, so adding it should be straightforward.
Last time I blogged about this, three packages needed changing, as well as Launchpad needing a translation team for the language. The situation is a little better now: only two packages need changing, as gdm now dynamically looks for languages based on installed locales.
libx11 still needs changing – a minimal diff would be:
=== modified file 'nls/compose.dir.pre'
--- libx11-1.2.1/nls/compose.dir.pre
+++ libx11-1.2.1/nls/compose.dir.pre
@@ -406,0 +406,1 @@
+en_US.UTF-8/Compose:	tlh_GB.UTF-8
=== modified file 'nls/locale.alias.pre'
--- libx11-1.2.1/nls/locale.alias.pre
+++ libx11-1.2.1/nls/locale.alias.pre
@@ -1083,0 +1083,1 @@
+tlh_GB.utf8:	tlh_GB.UTF-8
=== modified file 'nls/locale.dir.pre'
--- libx11-1.2.1/nls/locale.dir.pre
+++ libx11-1.2.1/nls/locale.dir.pre
@@ -429,0 +429,1 @@
+en_US.UTF-8/XLC_LOCALE:	tlh_GB.UTF-8
Secondly, langpack-locales has to change in two ways. Firstly, a locale definition has to be added (a locale defines a place: a language plus local conventions like days of the week, phone number formatting, etc.). Secondly, the language needs to be added to the SUPPORTED list in that package, so that language packs are generated from Launchpad translations.
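For illustration, the SUPPORTED entry is just a one-line pairing of locale name and charset – a sketch, assuming the Klingon locale is named tlh_GB to match the libx11 diff:

```
# langpack-locales SUPPORTED list entry (hypothetical)
tlh_GB.UTF-8 UTF-8
```

The locale definition file itself is the bigger job, since it carries all the place-specific data.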
Now, gdm autodetects languages, but it turns out that only 'complete' locales were being shown, and that on Ubuntu this was not looking at the language pack directories, but rather at
/usr/share/locale
which langpack-built packages do not install translations into. So it could be a bit random whether a language showed up in gdm. Martin Pitt has kindly turned on the 'with-incomplete-locales' configure flag for gdm, and this will permit less completely translated locales to show up (when their langpack is installed – without the langpack nothing will show up).
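To see why a given language might not have been showing up, you can compare what lives directly under /usr/share/locale with the language-pack directory – a quick sketch (the locale-langpack path is from memory and may differ by release):

```shell
# Compare gdm's old search path with the language-pack location.
for d in /usr/share/locale /usr/share/locale-langpack; do
  echo "== $d =="
  if [ -d "$d" ]; then
    ls "$d" | head -n 3
  else
    echo "(not present)"
  fi
done
```

A language whose translations only exist in the langpack directory was invisible to the old check.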
LCA 2010 Friday
Tridge on ‘Patent defence for open source projects’. Watch it! Some key elements:
LCA 2010 Friday keynote/lightning talks
Nathan Torkington on 3 lightning keynotes:
1) Lessons learnt!
'Technology solves problems'… no it doesn't, it's all about the meatsacks!
'If you live a good life you'll never have to care about marketing'… steer the meatsacks.
'English is an imperative language for controlling meatsacks.'… Tell the smart meatsacks what you want (English is declarative).
2) Open source in New Zealand:
A bit of satire: a 'sheep calculator', tattoos as circuit diagrams. The reserve bank apparently has a *working* water-economy simulator. Shades of Terry Pratchett!
3) Predictions – more satire about folk that make predictions – financial analysts, science journalists.
After that, it was lightning talk time. I’ve just grabbed some highlights.
Selena Deckelmann talked about going to Ondo in Nigeria and un-rigging an election:
http://flossmanuals.net – nice friendly manuals in many languages, written at book sprints.
Kate Olliver presented on making an origami penguin.
Mark Osbourne presented 'Open Source School' – a school in New Zealand that has gone completely open source, even though the NZ school system pays Microsoft 10 million a year for a country-wide license.
LCA 2010 Thursday
Jeremy Allison on 'The elephant in the room – free software and Microsoft'. While he works at Google, this talk was 'off the leash' – not about Google. As usual – grab the video. We should care about Microsoft because Microsoft's business model depends on a monopoly [the desktop]. Microsoft are very interested in 'Open Source' – Apache, MIT, BSD licensed software – but the GPL is intolerable. Jeremy models Microsoft as a collection of warring tribes that hate each other… e.g. Word vs Excel.
The first attack was on protocols – making the protocols more complex and sophisticated. MS have done this with Kerberos, DCE/RPC, HTTP, and higher up the stack via MSIE rendering modes, ActiveX plugins, Silverlight… The EU case was brought over this in the 'Workgroup Server Market'. MS were fined 1 billion euros and forced to document their proprietary protocols.
OOXML showed up rampant corruption in the ISO standards process – but it got through even though it was a battle against nearly everyone! On the good side, it resulted in an investigation into MS dominance in file formats: MS implemented ODF, and MS have had to document their old formats.
MS have an ongoing battle on the world wide web – IE vs Firefox, AJAX applications vs Silverlight.
All of these things are long-term failures for MS… so what next? Patents. Patents are GPL-incompatible, but fine with BSD/MIT. The TomTom case is the first direct attack using MS's patent portfolio. This undermines all the outreach work done by the MS open source team – who Jeremy tells us are true believers in open source, trying to change MS from the inside. Look for MS pushing RAND-patented standards: such things lock us out.
Netbooks are identified as a key point for MS to fight on – lose that and the desktop position is massively weakened.
We should:
Jonathan Oxer spoke about the Google Lunar X Prize and the lunarnumbat.org project – it needs contributors: software and hardware hackers, arduino/beagleboard/[M]JPEG2000 geeks, code testers and reviewers, web coding, documentation, math heads & RF hackers. Sounds like fun… now to find time!
Paul McKenney did another RCU talk – and as always it was interesting: 'Optimisation Gone Bad' (RCU in Linux 1993–2008). The Linux 2.6 -rt patch made RCU much, much more complex, with atomic operations, memory barriers, and frequent cache misses; since then it has slowly been whittled back, and there is now a new, simpler RCU based around the concept of doing the accounting during context switches and tracking running tasks.
LCA 2010 Thursday Keynote – Glyn Moody
Glyn Moody – Hackers at the end of the world. Rebel Code is now 10 years old… 50+ interviews over a year – it could be considered archaeology now. I probably haven't done the keynote justice – it was excellent but high-density – you should watch it online.
Glyn talks about open access – various examples like the Public Library of Science (and how the scientific magazine business made 30%–40% profit margins). The Human Genome Project and the 'Bermuda Principles': public submission of annotated sequences. In 2000 Celera were going to patent the entire human genome. Jim Kent spent 3 weeks writing a program to join together the sequenced fragments, running it across 100 800MHz Pentium PCs. This was put into the public domain just before Celera completed their processing – and by that action Celera were prevented from patenting *us*.
Openness as a concept is increasing within the scientific community – open access to results, open data, open science (the full process). An interesting aspect of it is 'open notebook science' – daily writeups, not peer reviewed: 'release early, release often' for science.
Amazingly, Project Gutenberg started in 1971!
Glyn ties together scientific culture (all science is open to some degree) and artistic culture (artists share and build on/reference each other's work) by talking about a lag between the free software and free content worlds. In 1999 Larry Lessig set up 'Copyright's Commons', built around an idea of 'counter-copyright' – copyleft for non-code. This didn't really fly, and Creative Commons was set up 2 years later.
Wikipedia and newer sharing concepts like Twitter/Facebook etc. are discussed. But… what about the real world: transparency and governments, or companies? They are opening up.
However, data release != control release. And there are challenges we all need to face:
Glyn argues we need a different approach to economic governance: the commons. The 2009 Nobel laureate in Economic Sciences, Elinor Ostrom, won for her work on commons and their management via user associations… which is what we do in open source!
Awesome!
LCA 2010 Wednesday
Pandora-build. I was there for support – I've contributed patches. Pandora is a set of additional glue and layers that improves autotools and makes it easier to work with things like gettext and gnulib, turn on better build flags, and so forth. If you're using autotools it's well worth watching this talk – or hop on #drizzle and chat to mtaylor.
The open source database survey talk from Selena was really interesting – a useful way of categorising databases, and a list of which DBs turned up in which category (e.g. high availability, community development model, etc.). Key takeaway: there is no one true DB.
I gave my subunit talk in the early afternoon; it was reasonably well received, I think, though I wish I had been less sick last week: I would have loved to have made the talk more polished.
Ceph seems to be coming along gangbusters. I really think it would be great to use for our bzr hosting backend. 0.19 will stabilise the disk format! However, we might not be willing to risk btrfs yet.
Next up, the worst inventions ever… catch it live if you can!