Older blog entries for rillian (starting at number 76)

raph, your comments on GNU autotools are pretty much spot on, but I think your point number 3 is inaccurate. It's not that everyone uses the GNU toolchain; there are applications where other compilers are still interesting, Solaris is still alive as a vendor unix in the free software space, and as you point out, Apple has heavily modified the linker in MacOS X. So there are at least three common platforms that require special logic for building shared libraries, for example.

Rather, I would say that the reduced diversity, as the vendor unices become less relevant, brings the issue of dependency detection and configuration to the forefront. And those are precisely the areas where the autotools approach of a maximum-portability script that tests the local system configuration is weakest. The imake style, where the build tool knows what's installed, is much more efficient here.

In fact, if you already know sh (portable or not) and are willing to depend on pkg-config (as most gtk/gnome software now does) you can do almost as well with just GNU make + pkg-config. The text substitution operators of the former provide most of the convenience (as opposed to portability) features of automake, and the latter can do most of what autoconf is now used for. The rest of the configure script can just be implemented in the Makefile rules.

Still, I'd also like to see one of the more modern build tools take off, especially something that would flatten the learning curve for these things. SCons seems the most promising of these, but it still has a ways to go before it becomes a compelling replacement for larger projects.

Recently, cinamod tacked on a rant about PDF being a non-free format. I'd like to hear more about that. Certainly, my views are probably coloured by working on Ghostscript, but I also think I care more about free formats than most people, and it's not really my perception that PDF is evil.

It's true that the format is controlled by Adobe and they don't have an open development process. On the other hand, there is good, freely available documentation of the format, and Adobe has generally behaved in a way consistent with their being cognizant of its value as an interchange format with multiple vendor support. Both of these things contrast with Microsoft Office document formats, the other example under discussion.

It is true that Adobe claims some patents on aspects of PDF. There is, however, a blanket grant for applications compliant with the spec.

It is also true that the PDF spec, especially in later versions, includes a number of non-free (or silly) formats by reference, like JBIG2 and JPEG 2000 (the latter is at least becoming lower risk as time goes on). The latest release (PDF 1.6) even includes support for embedding the U3D format, which as near as we can tell hasn't been published in any form yet!

But I don't actually see direct problems with the issues cinamod mentions. The LZW patent expired in the last jurisdiction a year ago. The Compression Labs patent on baseline JPEG (if that's what was being referred to) is generally considered invalid. The colourspace conversion issues are covered by the grant mentioned above.

Better, one advantage we have in Free and Open Source software is that we can choose the features we implement based on technical merit rather than the need to sell this year's upgrade to our software. This is, I think, where PDF really starts to look good. It's much easier to write a parser for than PostScript, which is/was the old standard, and the updated imaging model, compression, and portable rendering features make it a much better choice for everything we traditionally used PostScript (which was another Adobe-controlled standard) for. The only drawback really is that you can't generate simple files with printf() like you can with PostScript.
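To make that last point concrete (just a throwaway sketch of my own, not anything out of Ghostscript): a complete PostScript page really is a handful of printf() calls, while even the smallest PDF needs numbered objects, their byte offsets, and a cross-reference table, so you pretty much have to buffer and count as you write.

    #include <stdio.h>

    /* A complete, valid PostScript page, produced with nothing but printf().
     * The equivalent PDF would need objects, an xref table recording each
     * object's byte offset, and a trailer pointing at it. */
    int main(void)
    {
        printf("%%!PS-Adobe-2.0\n");
        printf("/Helvetica findfont 24 scalefont setfont\n");
        printf("72 720 moveto (Hello, world) show\n");
        printf("showpage\n");
        return 0;
    }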

There's no compelling argument for us to be producing documents containing JBIG2 or JPEG 2000 images. The spec does contain support for LZW, but for the most part we continue to do what Ghostscript did until last year: only support compressing with the free Deflate (zlib) filter.

So what exactly is non-free about it? It's easier to get the documentation and start implementing than with many "more" free formats, like baseline JPEG or TIFF fax compression. By producing files based on a sane subset, we can avoid both the patent and technical excesses of the parent format. In fact, a number of such subsets are already defined and available for reference, like PDF/X or PDF/A.

Of course it's nice to have something controlled by an open community (like the Xiph multimedia codecs), but network effects are also very important. I think it's a good idea to just use PDF and make it our own.

redi, the disk does fill up that often. raph and I both use the same machine for email, and basically whenever there's a spike in the virus traffic the disk fills up.

Knowing this is the actual cause of the accounts going missing, I'll try a little harder to keep that from happening. As far as I knew, it was something related to the disk corruption when the previous incarnation of the system died a fiery death last year. This explanation is much more reassuring. :)

Letting people make their own mistakes

rbultje complained that we at Xiph were being idiots trying to implement a new media framework badly, in reference to Arc's OggStream project. Please, give us a little more credit than that. We've never had any interest in implementing a media framework, and nowadays there's no reason to.

Xiph is an open source project, and people are free to contribute what they think is useful under our general umbrella. That doesn't mean the rest of us think just anything someone says they're doing 'for xiph' is a good idea, or that it will end up 'officially' recommended by the foundation per se. This is such a case. I think Arc has confused his personal interest in writing something with the community's need for it, that's all. And quite a number of people have told him so, but he's not one to be deterred by such. If you talked to some of the calmer developers you might have gotten a different picture.

There is a need for a convenience library like vorbisfile that handles theora and probably all our other codecs, for someone who wants to use them but doesn't want to tie themselves to one of the big media frameworks. Arc's proposal grew out of that need. What most of us would rather see is something lightweight based on Conrad Parker's excellent liboggz, a libfishsight to go with his libfishsound.

Clearly we do have some kind of image problem, since it's become fashionable to talk about how lame Xiph is. But we are just a very small open source project trying to do something much harder than most people want to be bothered with. If you don't like how it's going, it's up to you to help fix it.

Ogg Chaining

rbultje, glad to hear the chaining bugs are finally getting fixed. It's a bit of a show stopper for me with totem. :)

As far as how to handle them in the ui, I can suggest three options. One is just to treat chain segments as independent clips. Each one gets its own playlist entry, and there's no need to upgrade the seek bar. This makes a lot of sense for things like saved streams and album-as-concatenated-files. There's an xmms patch for this.

Another is to treat them as subclips, and insert visual/jump boundaries in the seek bar, sort of the way older versions of iMovie worked. In totem's case this would mean developing a custom widget to replace the seek slider.

One could generalize the above two, where chained files get their own (sub) playlist, but a playlist (of any origin) can be treated as a whole in terms of the behavior of the seek slider, which would show the item boundaries and let you jump to any part of any item. I could see that being a nice feature for (shorter) playlists. You'd have some feedback about how many songs there were and where you were within the list without having to scroll through the playlist, as well as having a quick random access option.

The competing point of course is that Ogg chaining is sometimes just used for edits, and the divisions may be meaningless. In that case, it makes sense for a playback app to just ignore them. You may be able to make a guess about the appropriate behavior based on the associated metadata: if the title didn't change, or a segment has no metadata, maybe it's just an edit. Likewise in DVD-Video, chapters could be treated as segments, but programs should always be like separate files.
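For what it's worth, that metadata guess could be as simple as the little helper below (the function and its arguments are just my own illustration, not part of any existing patch):

    #include <string.h>

    /* Hypothetical helper: given the TITLE comments of two adjacent chain
     * segments (NULL if a segment has no metadata), guess whether the
     * boundary is just an edit (keep playing quietly) or a real track
     * change (new playlist entry / seek-bar division). */
    static int chain_boundary_is_edit(const char *prev_title,
                                      const char *next_title)
    {
        if (next_title == NULL)
            return 1;                 /* no metadata: probably just an edit */
        if (prev_title != NULL && strcmp(prev_title, next_title) == 0)
            return 1;                 /* title unchanged: treat as one clip */
        return 0;                     /* otherwise assume a real boundary */
    }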

Anyway, that's the direction I'd experiment with.

hating computers

So, advogato is back up-ish. Raph has moved it to the new machine we bought to replace the ghostscript.com host, which expired some time ago and now survives only as an artificial creature, dependent on life support.

There may be a few more glitches as we transition the machine's primary responsibilities, but hopefully things will stay up a little better now.

Sometimes I hate computers. I feel like I've spent more than half of the last 3 months fighting with broken servers. The hosts for both ghostscript.com and xiph.org imploded at about the same time, and getting replacements online was in both cases something of a nightmare. The lesson, at least for me, is that when you're trying to do things on the cheap, build the machine yourself and ship it; the kind of on-site support you need if it doesn't work costs more than the hardware, and you won't save anything by having someone at the far end build it.

For Xiph.org we also ended up switching hosting providers. Our primary server is now with the very cool folks at the Oregon State University Open Source Lab. We were also inspired by the pain of the downtime and data loss to set up some redundancy, and in particular mirrors for the websites. If you'd like to help us out, email the xiph.org webmaster; we need both mid-bandwidth web hosts for the sites, and high-bandwidth mirrors for media content and release files.

Still, an end is hopefully in sight. At least it will be if my home machine also stops trashing its disk.

Ex Londonium

I finally received my Canadian immigration papers this past November, and officially became a resident on December 21st last year. After spending xmas with my partner's family in Kingston, we found a nice apartment in downtown Vancouver, then went back to London to pack up there. We came back ourselves at the end of February and the things we shipped finally arrived in June, so we're officially here. It's really nice to be back.

The reverse culture shock was interesting. When I first moved to Vancouver 10 years ago it was the biggest city I had ever lived in. After a while I got used to the scale and enjoyed taking advantage of all the things on offer. Then we went to London, one of the largest cities in the world. So when we came back, we were struck by how pleasant and friendly everyone was, how clean the streets were, but also how small it all was. Now Vancouver is just a place with only two cheese shops.

These things wear off though, and we're still happy with our decision to return. It was a great experience to be able to live in Europe for a few years, but London wasn't our first choice as a place to stay.

xmms

fatal: I'm not surprised you're getting forks. The xmms program remains the most widely used audio player in linuxland, and when the custodians of such an important brand don't provide leadership, other people inevitably step forward to try. We all want things to get better, and of course the right to fork is all about that being possible. Ideally such things either motivate you to finally do your own version, or one of the forks eventually becomes the official version again through clear superiority or popularity. But by telling people they can't call it 'xmms2' you're ensuring that someone else will try, until there really is a successful xmms2...even if it's still called xmms.

Re making it a video player, I've always been confused about that. Adding a general multiplexed media framework seemed the obvious step back when 1.2 was new, and the intent was obviously there at some point or it would just have been called 'xms'. I didn't know it was a schism between xmms.org and 4Front. (If I may define the groups that way; seemed like they were almost the same back in 2000.)

Oh well, someday soon we'll get a decent modern player. Pity it won't be called xmms, because it's a very cool name.

Hating computers

It looks like the hard drive corruption I complained about in the last entry was just the disk itself going bad. I started to get repeatable errors reading from the root partition, though the home partition remained thankfully readable. A coincidence with the DVD burner, apparently. And maybe obvious in retrospect. Oops.

Anyway, I backed up a lot of stuff and moved the rest onto the newer 80 GB drive, now booting with a fresh debian install. That was kind of crowded though, so I bought a new 160 GB drive for general data storage. Yay free space!

London

I've finally been approved for an immigration visa to Canada. HOORAY! I'm cutting short the current stay in London to go home and finish the paperwork. The plan is to 'land', as they quaintly refer to it, soon after xmas and start setting up in Vancouver again in January. Anyone in the area who knows of a free room we could rent for a while (even just a week or three while we look for a real place) please let me know!

S is keeping her post at UCL for the moment; she'll help pick out the apartment and then return to London, and I'll follow when everything is settled, for a last round of Europe. There are museums in London still on the 'must see' list. Ordinarily we'd avoid paying double rent, but the London flat is free in the sense that S hasn't found a job in Vancouver yet, and I feel safer having somewhere to go on the other side. Hopefully the chance to see more of Europe will be worth the time apart.

Since I'll be in California anyway, I'm going to SCALE in November with a bunch of the xiph people, most of whom I haven't met in person before. Should be a lot of fun, and hopefully we can drum up a little more interest.

Solar Power

Bram, there are several problems with your solar energy proposal. The most fundamental is that you're trying to drive a steam turbine with the pressure generated by the boiling water. If the only reason it's boiling is that you're pumping on it, there's no positive pressure differential with the ambient air you can use to turn the turbine. All your steam will go out through the vacuum pump, and the only way the turbine will turn is if you pull outside air through it backward with the vacuum pump: a classic perpetual motion design. Pumping air out of the enclosure also isn't free, and would likely cost more than the system could produce even if the boiling magically worked.
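For a rough sense of the numbers (my own back-of-envelope, taking the water to be at ordinary room temperature, about 25 °C):

    p_{\mathrm{sat}}(25\,^{\circ}\mathrm{C}) \approx 3.2\ \mathrm{kPa} \ll p_{\mathrm{atm}} \approx 101\ \mathrm{kPa}

The vapour you get by vacuum-boiling room-temperature water sits at a few kPa, roughly a thirtieth of atmospheric pressure, so a turbine that exhausts to the outside air has nothing to expand against; the only exit for the steam is through the pump itself.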

Using enough mirrors that the water boils on its own does work fine however. It's called Solar Thermal generation, and there are a number of plants in operation based on that principle. I've driven by the ones in California a number of times. They're kind of pretty.

One drawback of the whole boiler-turbine-condenser loop is that the machinery and pressure vessels tend to be large and heavy, so they're more cost effective at the MW scale than as something you'd want for household generation. Which is why most building-scale solar thermal systems just produce hot water, not electricity, and people use photovoltaic panels instead if they want power. It's possible something like a 2 m dish solar Stirling engine would work for that, at least until we get cheap, quantum-efficient solar electric materials. Plants, after all, do it every day.

I hate computers

Ok, not really, but it's been ridiculous. Last week I bought a DVD-R drive. This is actually my first burner: I kept waiting for CD-R drives to get cheaper (and could usually borrow one from a friend or the office when I needed one) and when they were finally cheap enough for me, well, I was waiting for DVD-R drives to get cheaper. :)

But it's been hard to make backups since we moved to London, and I was starting to run out of disk space, so when I saw one for £140 I bought it. Yay, at last! I hooked it up, installed cdrecord, picked a likely looking 4 GB of backups, and started a burn.

First hitch. It turns out cdrecord doesn't support burning DVDs; that's a payware add-on. Except it's still freeware for personal/academic use. So there's a binary you can download. Which I do. Except it doesn't burn more than 1 GB. Oh, there's a 'license key' you have to set in an environment variable. Ok, we write a script to do that, which looks just like the script they suggest you use further down the README page.

Yay, my first complete burn of...a coaster. Huh. Ok, we'll try again, this time at half the speed. I bought cheap media, so maybe this is the problem. 2x takes a while... Let's just check this in the drive itself instead of the laptop. Nice, mount locks the machine! That's not good.

So I hard reset the machine. I check the disk on the laptop, and yes, another coaster. Not having gotten a clue yet, I try again at 1x after the fsck completes. That really takes a while, and this time I eject the disk before trying to mount it, having vaguely remembered that ejecting resets broken drives. Coaster number three. This is not the shiny toy experience I was hoping for.

I give up and go back to work. A little while later the machine locks, independent of mount. This time, as I reset, I notice the hard drive light is on solid. Uh-oh. I reboot, and this time it wants a manual fsck. (Yes, my root partition was still ext2.) It finds a lot of problems. With a sinking feeling I try to reboot, and the kernel load fails. So do all the backup kernels. Did I mention I'd not made a backup in way too long? And my rescue CDs are all from over a debian version ago, and can't even mount my reiserfs partitions. Oh, the irony.

Anyway, near as I can tell, turning DMA on for the DVD-R drive (a Pioneer A06) causes disk corruption on the other controller. At least with my motherboard and recent 2.4 linux. What? Is this 1997? It's not like DMA is this fancy new technology we're just figuring out how to support.

I borrowed a rh8 cd to use for rescue, made backups of what data I could, managed to burn a debian netinst image, and did a fresh install from there. Fortunately, all the home directories seem to have escaped damage, so it was just the system config I lost. As usual after a fire, there's been refreshing new growth: things run better now that I don't have so much cruft installed, I'm trying Gnome again, and I was finally motivated to clean up the partitioning and, for example, consolidate my two home directories--legacy of when what used to be two computers became this one. And I can burn discs fine now. But what a frustrating experience! Also turns out that dvd+rw-tools includes a program that will burn CD- and DVD-R, despite the name. Of course, we're all still using iso9660 instead of UDF, but that's a problem for another day.

Life

Back in London for a few weeks now. It's good to be home. Strangely, I'm finally coming to like living here, now that we're almost ready to leave. I think maybe it just takes a while (like more than a year) for me to get comfortable in a place; until then I don't know if I like it or not.

Sandra and I had a really nice time on the California coast. We especially liked Morro Bay, and had an authentic surfer dude encounter in Santa Cruz. I've always found that area really relaxing, so it was an effective vacation.

After Sandra went back to the UK, I spent a week visiting jack in Albuquerque. We had a great time and it was good to see him again. They've got a nice set-up there, which I finally got to see, and we did geek-tourism things like visiting the VLA. It's a pity we don't want to live in the US; I hope we can do as well in Vancouver.

Virus bounces

Sorry you've joined the esteemed ranks of the virus bounce recipients, garym. I've been in them for a while, mostly as an admin of the ghostscript.com lists, since we have TMDA filtering on the xiph.org lists. However, in this outbreak I've been getting lots to my xiph.org address; apparently the virus trawls the web cache, which explains that, and why everyone's suddenly asking us to purge the addresses from the archives.

Inappropriate spam bounces actually account for more than half of the messages caught by the list filters. Of course it's terrible; I expect the problem is that there's no integration between the site's MTA and the virus filter they've bought, so the filter has no way to obtain the envelope sender or otherwise guess whether the From is spoofed.

Of course, replying at all in the case of the virulent Sobig strains, which are known to spoof, is inexcusable laziness. I found a ray of hope two weeks ago when I complained about a particularly personally worded bounce and the admin actually said they would clean up the problem. Hooray!

I don't know, though. It's really starting to look like we need a replacement for mailing lists. Usenet news was better, but we moved to lists because of spam. Now it's time to move again, but I don't know to what.

C macros (sometimes) harmful

chalst responded to raph's complaint about macros and error reporting, protesting that C macros (unlike M4 in the context of autotools) work fine. That is generally true, and they're quite a powerful feature; C would be 2/3 the language it is without them. The constraints are mostly idiomatic, however. Another example of poor error reporting and debugging performance raph had in mind was in fact Ghostscript, which is written in C. It also implements memory structure tracing for garbage collection and an object system (among other things) with preprocessor macros. It's not unusual for these macros to be 5 layers deep, with definitions spanning several files, qualitatively similar to the complexity you see with the autotools. It's definitely a demonstration of the power of the macro system, but it can be no fun at all to debug the code. I suppose it's a combination of the unfamiliar idiom and the lack of debugger support, but there's probably a reason the preprocessor in C is usually limited to constant substitution, trivial inline functions, conditional compilation, and some definitional magic within header files.
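To give a flavour of the idiom (a made-up miniature of my own, not actual Ghostscript code, and far shallower than the real thing):

    #include <stdio.h>

    /* In a real project each of these layers would live in a different header. */
    #define OBJ_HEADER       unsigned type; unsigned refs;
    #define DECLARE_OBJ(t)   typedef struct t##_s { OBJ_HEADER } t##_t
    #define OBJ_INIT(o, ty)  do { (o)->type = (ty); (o)->refs = 1; } while (0)
    #define NEW_OBJ(o, ty)   OBJ_INIT(o, ty)  /* a real allocator macro would also hook into the GC tracing */

    DECLARE_OBJ(widget);

    int main(void)
    {
        widget_t w;
        NEW_OBJ(&w, 42);  /* expands through another macro layer before any plain C appears */
        printf("type=%u refs=%u\n", w.type, w.refs);
        return 0;
    }

The compiler and debugger only ever see the fully expanded result, so when something a few layers down is wrong, the error message points at a line that looks nothing like the code you wrote, which is much the same complaint raph was making about autoconf.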

Travel

I'm in California for a few weeks. Came to San Francisco to get the required physical for my Canadian Immigration Visa. S came with me this time so we've been having some vacation. In addition to the usual visiting with friends we went to Yosemite for 4 days. S followed up with a week long backpacking trip while I stayed in the Bay Area working and visiting with raph, and now we're renting a car for a week and travelling along the central coast. Looking forward to more perfectly wonderful weather. :)

Poynton's Google Juice

I wonder how much of raph's experience with gamma correction information on google has to do with Poynton's article having fallen off the web for over a year. My impression was that it was becoming the definitive reference for this when it disappeared. I'm glad it's back up again, but it will take a while for people to relink. Even the Gamma FAQ itself still references the old location.
