Older blog entries for bagder (starting at number 180)

Project Denise is pretty much dead, and we have instead adapted ares for asynchronous name resolves in libcurl. We have, however, collected a good bunch of patches for it that the maintainer doesn't want, so we're slowly forking off onto our new c-ares branch...

I'm working (albeit slowly) on getting my FSF assignment paperwork sorted out so that my upcoming code donations to wget will end up somewhere.

In cURL I'm trying to get things fixed for the upcoming 7.11 release. There's a dozen or so outstanding issues left to be done... The previously announced curl programming competition went nowhere, as only four people submitted contributions! ;-(

Rockbox was again featured on Slashdot. There has been some heat on the mailing list since Open Neo forked Rockbox for the Neo series of mp3 players, claiming it as their own work, with all credits and copyright texts removed... The issue has since been sorted out, and we're all thrilled to see Rockbox moving on to work on mp3 players produced by companies other than Archos.

And to comment on the recent entries on Advogato: I always prefer using 'man [command]' to fiddling around in an annoying info browser...

In spite of the low-profile announcement and hardly any talk about it, roffit is suddenly finding its way out and getting used "out there". It seems I'm not the only one fed up with crappy-looking man-page-to-HTML converters.

I am interested in improving this utility, as it is currently pretty naive about the input format and my nroff knowledge is next to zero.

If you try it out and find problems, please bear with me.

Funnily enough, this time the BeOS version was the first port/package of cURL 7.10.8 to appear.

My life has been turned upside-down since the birth of my daughter a little over a month ago, but a never-ending flood of friendly users has fed me a huge amount of patches, so the number of changes and bug fixes that went into this release surpasses what I usually get done for an average release.

It is neat to see that curl still hits the top of freshmeat's vitality chart when I submit the release, even though I did it on a Saturday morning.

In an attempt to boost contributions we've announced the cURL programming competition. Make a cool libcurl program and win US$ 300!

This is only possible thanks to previous donations to the project.

Now, let's get that new release out the door soon...

Holy macaroni!

I checked the size of my spam folder for August today, and noticed that it was over 1GB in size! I deleted the 12600 "spam" mails received so far this month. There's no way I can go on saving old spams anymore.

Two hours later, the mailbox is again 39MB consisting of 400 mails (each sobig.f mail is about 100K).

Thanks to bogofilter and a simple sobig.f filter in my procmailrc, they don't interfere too much with my regular emailing.
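For the curious, such a filter is nothing fancy. A procmail recipe along these lines (the attachment names are an approximation of what sobig.f is known to send, not necessarily my exact rule) files the worm mails straight into a junk folder:

```procmail
:0 B
# sobig.f carries one of a handful of known attachment names;
# match them in the body (MIME headers) and file into a junk folder.
* name="(your_document|document_all|thank_you|your_details|application)\.pif"
sobig-junk
```

The B flag makes procmail scan the message body, which is where the MIME attachment headers live.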

Dear debian people,

Please fix the package naming rules that make the perl interface for libcurl end up being called "libwww-curl-perl". This is confusingly similar to the perl interface to libwww, which incidentally is called "libwww-perl". Oddly enough, "libwww-curl-perl" has nothing to do with libwww, but the Debian naming rules make it look so.

The perl binding for libcurl would of course be called "libcurl-perl" if libwww and libcurl were treated equally.

(Yes, I know the perl-binding for libcurl is put in the WWW section in CPAN, but that is not a good excuse for Debian to prefix the package with libwww.)

1 Aug 2003 (updated 1 Aug 2003 at 10:03 UTC) »

The well-known tool GNU wget seems to have been left orphaned for the last year or so.

It is an interesting project to watch. How long can a well-known, well-used project be left to dry without anyone forking the source base and starting to improve it in a separate source tree?

People still mail in odd patches and bug reports to the mailing list, but there's no one with CVS access around.

The latest mail posted to the wget mailing list by the maintainer, Hrvoje Niksic, was posted back in December 2002.

Anyone out there interested in an 'HTTP authentication' library? Perhaps just for NTLM at first, but I would imagine it could do Digest and GSS-Negotiate too, as none of them are that simple to implement (at least they take a lot of reading and testing).

A few weeks ago I added Digest authentication support to libcurl, Daniel Kouril added GSS-Negotiate authentication, and then I challenged the programmer gods and added NTLM authentication too.

Now, there are a few other HTTP-oriented open source tools out there that I believe would also like to have these authentication methods supported. Why not gather a crowd and see if we can work up an API? Then we could move the already-written code over to use it, and I could make the auth stuff available to others as well... (The libcurl source code is MIT licensed and can thus be used by GPL and non-GPL, BSD and non-BSD projects alike.)

It would also make the code better, as we would all be working to improve it.

If you think this is a good idea, well, mail me.

25 Mar 2003 (updated 25 Mar 2003 at 16:21 UTC) »

This is supposedly some kind of an open source/free software community. Yet 75% of the members claim to be involved in no projects at all, with another 10% involved in only one project...

I do get mildly annoyed by the fact that with the recent Red Hat betas of 8.1 or 9 or whatever they'll end up calling it, curl fails to build with OpenSSL. Once again, Red Hat steps forward and causes havoc.

This is because they've decided that we should all use 'pkg-config --cflags openssl' to figure out that we need the include path set up to point at the Kerberos headers in order to successfully include <openssl/ssl.h>...


Today my 'postmaster' got mail from "spamcop.net". He was utterly confused and mailed me and asked why he got this weird mail...

It turned out that spamcop.net is a blatantly stupid complain-to-the-spammers system, which tracks down the origins of URLs in "spams" and mails the postmasters of the supposedly offending spammer sites.

Now, the "spam" in this case was a mail in which someone described his travel by train in Peru, and there he added a URL to my page with a few pictures from the city of Aguas Calientes near Machu Picchu:


So, I'm an alleged spammer because I keep a page that one person liked so much that he mailed a bunch of people pointing it out.

Well, I figure someone pulled some strings to make this happen, and of course someone may have made a perfectly normal human error here, but it sure ended up very strange on my side of the world...

