Older blog entries for rillian (starting at number 84)

It's a pity so many people seem to have left, but it's also nice to be able to read the complete recentlog again. :)

StevenRainwater, have you changed the "multiple posts in a day clobber each other" behaviour? I think the planets have demonstrated the value of letting people have multiple entries.

Hooray, Advogato has been saved. Thanks StevenRainwater.

nutella, certs are entirely one way, and they're always a positive assertion. The idea is that trust flows along certification links, so a spammer can create as many accounts as they want, but unless a significant number of people already in the trusted group can be persuaded to certify those accounts, they will still be cut off because they look like an island.

So a fake account making some random certs of real accounts looks a little less fake, but actually hurts its chances of being trusted. As long as most people cert based on their knowledge of another person's work, the trust metric will continue to work.

All this spam cleanup is just about tidying the pool of untrusted user accounts and keeping google juice from spreading where it doesn't belong.

DOS

Whoever was hammering the Advogato person index page this (Wednesday) morning, please don't do that again. It's an expensive page to generate and, well, you brought poor Apache to its knees.

That's bad for your karma.

Was in San Francisco yesterday for an Artifex staff meeting. Had dinner with raph and Silvia Pfeiffer, who runs the annodex project. They've been strong supporters of free and open multimedia for some time and it was great to finally meet her in person.

recruiting

We're looking for someone to help out with Ghostscript integration in Free Software, on a part-time contract. Yes, that means paid work. Help sort, review and update patches from the distros into upstream, write a Firefox plugin, that sort of thing.

Please send interesting resumes to giles@ghostscript.com.

whacky medieval latin

raph and nymia, google suggests ojusdem might be eiusdem, the third person, singular, feminine, genitive pronoun.

So inter omnes curvas ojusdem longitudinus might be "among all curves of the same length." But I know exactly enough Latin to be extremely dangerous with a dictionary. Caveat lector.

恭喜發財 (Gong Xi Fa Cai — "wishing you prosperity", a Lunar New Year greeting)

freetype, the comment about phasing out M4 was from atai, not me.

I don't actually mind M4. The syntax is an odd marriage of M4 and sh, which made it hard to learn ("even more fun with quoting!"), but it's just a macro substitution language, and I'm not sure what a better alternative would be given the goal of outputting portable sh. You can't even define subroutines in portable sh! (that's why configure scripts are so big)
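
As a minimal sketch of what that looks like in practice (the macro and header names here are hypothetical, not taken from any real project), a check written with stock autoconf macros is still just M4 text substitution:

    dnl A hypothetical check built from stock autoconf macros.
    AC_DEFUN([MY_REQUIRE_FROB],
      [AC_CHECK_HEADER([frob.h],
         [AC_DEFINE([HAVE_FROB_H], [1], [Define if frob.h is available.])],
         [AC_MSG_ERROR([frob.h is required])])])

    dnl Invoking the macro doesn't call anything at configure time; M4 simply
    dnl pastes the full expansion, as inline sh, into the generated script.
    MY_REQUIRE_FROB

Every invocation is expanded inline rather than becoming a shell function call, which is part of why generated configure scripts end up so large.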

No, as I've said before, the complexity comes from the fact that you're trying to write an expert system in a combination of sh, M4, and the code of the autotools themselves. Most of the knowledge is embedded in code, and in many different locations and formats. That makes it difficult, and brittle.

As far as replacement goes, I can see wrapping the old macros in a newer scripting language like perl or python, so that the original M4+sh still gets expanded and run in an external shell, but newer code could be written directly in a nicer language. That way you could port macros one at a time and not lose the vast store of knowledge accumulated in the GNU autotools and builds that use it.

autotools

freetype doesn't give any examples of what autoconf is used for that GNU make + pkg-config can't do, but when I said that, I was thinking more about the autotools as a whole, not just autoconf.

Re configure scripts in repositories, I guess I'd always considered "lack of portability" of configure scripts to be a bug; having autoconf generate a portable sh script is the whole point. The valid objection to checking them into a source repository is that they're not source.

As machine-generated code, it's ugly and irrelevant, and if developers are using different versions of autoconf et al. it can generate a huge amount of noise in the diffs. The only advantage I can see (besides the aforementioned working around of buggy configure scripts) is that it removes some dependencies for random people building straight out of the repository.

raph, your comments on GNU autotools are pretty much spot on, but I think your point number 3 is inaccurate. It's not that everyone uses the GNU toolchain; there are applications where other compilers are still interesting, Solaris is still alive as a vendor unix in the free software space, and as you point out, Apple has heavily modified the linker in Mac OS X. So there are at least three common platforms that require special logic for building shared libraries, for example.

Rather, I would say that the reduced diversity as the vendor unices become less relevant brings the issue of dependency detection and configuration to the forefront. And those are precisely the areas where the autotools approach of a maximum-portability script that tests the local system configuration is weakest. The imake style, where the build tool knows what's installed, is much more efficient here.

In fact, if you already know sh (portable or not) and are willing to depend on pkg-config (as most gtk/gnome software now does) you can do almost as well with just GNU make + pkg-config. The text substitution operators of the former provide most of the convenience (as opposed to portability) features of automake, and the latter can do most of what autoconf is now used for. The rest of the configure script can just be implemented in the Makefile rules.
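
To make that concrete, here's a minimal sketch (the program name and the glib-2.0 dependency are invented for the example): GNU make's $(shell) function can pull compile and link flags straight from pkg-config, with no generated configure script in between.

    # Hypothetical program that links against glib-2.0.
    # (Recipe lines must be indented with a tab, as usual for make.)
    PKG     := glib-2.0
    CFLAGS  += $(shell pkg-config --cflags $(PKG))
    LDLIBS  += $(shell pkg-config --libs $(PKG))

    frobnicate: frobnicate.o util.o
    	$(CC) $(CFLAGS) -o $@ $^ $(LDLIBS)

    .PHONY: clean
    clean:
    	rm -f frobnicate *.o

pkg-config is answering the same questions a configure script would otherwise cache; they just get asked at build time instead of configure time.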

Still, I'd also like to see one of the more modern build tools take off, especially something that would flatten the learning curve for these things. SCons seems the most promising of these, but it still has a ways to go before it becomes a compelling replacement for larger projects.

Recently, cinamod tacked on a rant about PDF being a non-free format. I'd like to hear more about that. Certainly, my views are probably coloured by working on Ghostscript, but I also think I care more about free formats than most people, and it's not really my perception that PDF is evil.

It's true that the format is controlled by Adobe and they don't have an open development process. On the other hand, there is good, freely available documentation of the format, and Adobe has generally behaved in a way consistent with their being cognizant of its value as an interchange format with multiple vendor support. Both of these things contrast with the Microsoft Office document formats, the other example under discussion.

It is true that Adobe claims some patents on aspects of PDF. There is, however, a blanket grant for applications compliant with the spec.

It is also true that the PDF spec, especially in later versions, includes a number of non-free (or silly) formats by reference, like JBIG2 and JPEG 2000 (the latter is at least becoming lower-risk as time goes on). The latest release (PDF 1.6) even includes support for embedding the U3D format, which as near as we can tell hasn't been published in any form yet!

But I don't actually see direct problems with the issues cinamod mentions. The LZW patent expired in the last jurisdiction a year ago. The Compression Labs patent on baseline JPEG (if that's what was being referred to) is generally considered invalid. The colourspace conversion issues are covered by the grant mentioned above.

Better still, one advantage we have in Free and Open Source software is that we can choose the features we implement based on technical merit rather than the need to sell this year's upgrade to our software. This is, I think, where PDF really starts to look good. It's much easier to write a parser for than PostScript, which is/was the old standard, and the updated imaging model, compression, and portable rendering features make it a much better choice for everything we traditionally used PostScript (which was another Adobe-controlled standard) for. The only real drawback is that you can't generate simple files with printf() like you can with PostScript.
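
To illustrate that last point (just a sketch; the file name and text are arbitrary), a complete, valid PostScript page really can be emitted with nothing but formatted text output, whereas a PDF needs object numbers, byte offsets, and a cross-reference table, so it can't be streamed out quite so casually:

    # Write a minimal one-page PostScript file straight from the shell.
    printf '%s\n' \
      '%!PS-Adobe-3.0' \
      '/Helvetica findfont 24 scalefont setfont' \
      '72 720 moveto (Hello from printf) show' \
      'showpage' > hello.ps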

There's no compelling argument for us to be producing documents containing JBIG2 or JPEG 2000 images. The spec does contain support for LZW, but for the most part we continue to do what Ghostscript did until last year, which is to support compression only with the free Deflate (zlib) filter.

So what exactly is non-free about it? It's easier to get the documentation and start implementing than many "more" free formats, like baseline JPEG or TIFF fax compression. By producing files based on a sane subset we can avoid both the patent and technical excesses of the parent format. In fact, a number of such subsets are already defined and available for reference, like PDF/X or PDF/A.

Of course it's nice to have something controlled by an open community (like the Xiph multimedia codecs), but network effects are also very important. I think it's a good idea to just use PDF and make it our own.

