Older blog entries for jclement (starting at number 13)

On the weekend Anji and I purchased a LG DVD recorder. We had several reasons for wanting one:

  • It's cool
  • So we can make family movie DVDs featuring our daughter
  • So I can backup my favorite DVDs to keep with my laptop (I don't want to carry around the originals in case I damage them)

After several hours of surfing the Internet for Linux software I realized it was just too much work, so I tried Windows software instead and struck paydirt. The site doom9.net has tonnes of good documentation and free software for dealing with DVDs.

I specifically wanted to copy DVD-5 and DVD-9 discs to blank DVD-Rs. This is really easy for DVD-5 discs, but for the double-layer ones, not so much. I came across a solution involving ripping the DVD to the HDD, recompressing it to fit in 4.7GB, building an ISO, and burning it, and it actually works. Here is a copy of the instructions and software I finally found that worked (all free stuff).

I wonder how long until we have something similar for Linux. As it is I can burn DVD images using cdrecord-dvdpro, and I believe transcode can do the compression and scaling, but I have no idea how to put all this together to seamlessly convert DVDs with the menus and whatnot intact (if you know, please let me know).

Alrighty. My server has now moved from my crappy Telus ADSL connection, where I could get upload speeds of around 50K and downloads of around 150K, to a new colo facility in Calgary where I've got a 100M link both ways. I tried transferring some files between my machine and sunsite and was getting 4.2MB/s (yeah). So now I'm just waiting for the DNS to propagate and I'll be up and running.

11 Aug 2003 (updated 11 Aug 2003 at 13:48 UTC) »

On the weekend some friends introduced me to a couple of really cool games, Truck Dismount and Stairs Dismount, where basically you try to cause as much damage to the crash test dummy as possible. They're unfortunately Windows games, but they might run under Wine.

1 Aug 2003 (updated 1 Aug 2003 at 17:33 UTC) »

I'm currently working on a fairly large website for a client that's basically a searchable web-based catalog and user registration system. The catalog side is reasonably simple, since it's just populating some templates from the database, but the registration is much more difficult and consists of hundreds of fields with fairly complex validation rules. My first version of the site, done under the crunch as usual, used Python Service Objects (CGI), mainly because it has fairly nice form handling. Unfortunately this didn't work out because the load on the site was higher than expected and it bogged down the server. So here comes version 2:

This time around I have several goals:

  • Make the application faster (applet / servlets / etc)
  • Make the application code easier to work with for me and my successor
  • Make the application easier to deploy

This job seems ideally suited to CherryPy, a Python-based web application development platform. It basically takes Python code and bundles it into a standalone server which can run behind Apache and service the requests. The result is really fast. It should also be easier to deploy, since it really just requires Python on the target system. There's no reason the app wouldn't run happily on Windows servers too.
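The shape of that — plain Python methods served straight from one long-running process instead of per-request CGI — can be sketched with nothing but the standard library. This is not CherryPy's actual API, just a toy dispatcher showing the idea; the Catalog class and its URLs are made up:

```python
# Sketch of the "application code bundled into its own web server" idea.
# NOT CherryPy's API -- a stdlib illustration of the same shape: plain
# Python methods become URL handlers served by one standalone process.
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.parse import urlparse, parse_qsl

class Catalog:
    """Hypothetical app: /index and /search?q=... map to methods by name."""
    def index(self, params):
        return "catalog home"
    def search(self, params):
        q = params.get("q", "")
        return f"results for {q!r}"

def make_handler(app):
    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            url = urlparse(self.path)
            name = url.path.strip("/") or "index"
            method = getattr(app, name, None)
            if method is None:
                self.send_error(404)
                return
            body = method(dict(parse_qsl(url.query))).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
    return Handler

if __name__ == "__main__":
    # Port 0 picks a free port; in production Apache would proxy to it.
    server = ThreadingHTTPServer(("127.0.0.1", 0), make_handler(Catalog()))
    print("serving on port", server.server_address[1])
    server.serve_forever()
```

Since the handlers are ordinary methods, they can be exercised without starting the server at all, which also helps with testing.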

For the database side of things, the dataset isn't particularly large. I was thinking of moving away from MySQL and using Metakit. Metakit is a nice little embedded database I've used in the past, and again it makes the application less dependent on the server. I'm wondering how well Metakit handles multi-user load and concurrency. From what I can tell it seems to do well, but that remains to be seen. Here are some links on the topic:
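Metakit's own Python bindings aren't shown here, but the embedded, single-file idea can be sketched with stdlib sqlite3 standing in for it — the whole catalog lives in one file next to the application, so deployment needs no separate database server. Table and field names below are invented:

```python
# Embedded-database sketch, with sqlite3 standing in for Metakit: the
# catalog is a single local file, no database server to deploy or manage.
import sqlite3

def open_catalog(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE IF NOT EXISTS item (sku TEXT PRIMARY KEY, name TEXT)")
    return db

def add_item(db, sku, name):
    with db:  # one small transaction per write keeps readers unblocked longer
        db.execute("INSERT OR REPLACE INTO item VALUES (?, ?)", (sku, name))

def search(db, term):
    rows = db.execute("SELECT sku, name FROM item WHERE name LIKE ?",
                      (f"%{term}%",))
    return [dict(sku=s, name=n) for s, n in rows]
```

The concurrency question is the same one asked above: an embedded store serializes writers at the file level, which is fine for a mostly-read catalog but worth measuring before committing to it.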

All in all I think this will make for a faster, more self-contained application which the client can just plug into their website. The tools are different from the norm, but I'm confident that any reasonable coder could pick up Python and CherryPy with very little effort.

15 Jul 2003 (updated 15 Jul 2003 at 21:09 UTC) »

So for fun today I thought I would phone Telus and have them fix my reverse DNS. I host my own DNS, but for some reason Telus prefers to host the reverse DNS (fair enough). They have this little web tool for updating said reverse DNS, but it doesn't work anymore.

So after only about 43 minutes on the phone with Telus I managed to convince their first-level "analyst" that their servers were indeed serving my reverse DNS. I think I repeated myself four times, and each time he put me on hold for about 10 minutes. He had to speak with the senior analysts to confirm even that. Then he wanted to log in and look at my settings for the web-based reverse DNS tool, and the guy actually asked me for MY password. It seems Telus tech staff don't have access to look at this stuff through some sort of administrative backend.

Anyways. After all that, the final decision from the Telus guy was that I only made my changes a week ago and Telus's servers take up to two (2) business days to update. It seems he couldn't grasp that one week is significantly longer than two business days, weekend or not. I'm so impressed by the quality of the support staff. Grrr. Oh, and what's with the two business days? Do the computers need vacation time? Or is some poor tech editing zone files by hand each time we make changes?

What is it with support these days? I remember the old days when you could phone a company and get someone with at least some idea of what was going on.

Nice morning project. Today I set up my Pi phone-number search engine. It's a rather fast way of finding the first occurrence of any seven-digit sequence in Pi. Unlike my previous Pi search engine, which searched for arbitrary strings in Pi, this one is really fast.

Just found a very interesting project called cx_Freeze. It's basically like py2exe and Installer in that it bundles a Python program into a standalone executable with the related shared libraries so that it can run on a machine without Python. This one, however, seems to work for a wide variety of target platforms, including Windows, Linux, and commercial Unix systems. I've tried it out and it works quite well. I wonder if I can "cross compile" with it?

10 Jul 2003 (updated 15 Jul 2003 at 01:49 UTC) »

Cool! In my quest for more digits of Pi I found a university in Japan offering 4.2 billion digits for the general public to download. They are on their way down to my server. Once there I plan to revive my Pi search engine and allow it to search all 4.2 billion digits! I think I'll have to implement some sort of queue so that only one search is processed at a time, since this will probably be a bit tough on my servers.

Update: So I've been working on my Pi search engine, trying to scale it up to 4.2B digits, and have run into a snag. My previous search engine just mmap'd the whole mess and then seeked back and forth in it. This does not, however, work with this much data, since a 32-bit x86 process can only address 2^32 bytes, and big chunks of that are, as I understand it, already used up by the kernel and whatnot. So it looks like I'll have to change the algorithm to either not use mmap, or to use several smaller files and hop between them. Ugh.

Kyle had an interesting idea. Since we're usually searching for phone numbers, I could index all the phone numbers in the first 4.2B digits of Pi into some sort of database. I'm thinking my K6-2/550 probably isn't up to the challenge :)
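Kyle's idea in miniature, with a plain dict standing in for the "some sort of database" part: one pass over the digit stream records the first offset of every 7-digit window, so each later lookup is a single probe instead of a scan. On the real 4.2B digits the results would have to go into a disk-backed store, but the indexing pass is the same:

```python
# One pass over the digits records the first offset of every 7-digit
# window (a phone number), turning each later search into a dict probe.
def index_windows(digits, width=7):
    first = {}
    for i in range(len(digits) - width + 1):
        first.setdefault(digits[i:i + width], i)  # keep earliest offset only
    return first

def lookup(index, number):
    return index.get(number, -1)  # -1 means not in the indexed digits
```

The trade-off is classic: up to 10^7 index entries and one full pass up front, in exchange for constant-time queries afterwards.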

Another update: Marco Aurélio Graciotto Silva mailed me with this link to a distributed Pi calculation project. Unfortunately the challenge is over :(

I switched back to DJBDNS from MaraDNS. The first reason is that I know Dan Bernstein writes good, solid, very secure code. The other reason is that it's just easier to use. DJBDNS's config files don't resemble zone files (MaraDNS's do), and that makes me very happy. Hopefully I managed to do this switch properly and DNS service wasn't / won't be interrupted.
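For anyone who hasn't seen it, a tinydns data file really doesn't look like BIND zone syntax: one compact line per record, with the leading character choosing the record type. A few representative lines, using hypothetical names and addresses rather than my real ones:

```
# delegate example.com to ns1.ns.example.com at 203.0.113.1
# (this one line also generates the SOA and NS records)
.example.com:203.0.113.1:ns1
# A record for www, plus the matching PTR
=www.example.com:203.0.113.2
# MX for example.com pointing at mail.example.com, distance 10
@example.com::mail.example.com:10
```

You edit this one file and run tinydns-data to compile it into the data.cdb that tinydns actually serves from.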

9 Jul 2003 (updated 9 Jul 2003 at 22:15 UTC) »

Rebuilt website and webserver over the weekend. The site is more machine- and bandwidth-friendly, with limited dynamic content. The server is now a K6-2/550 with 512M of RAM running Debian 3.0.

Investigated secondary nameserver service secondary.org. It's really simple to use and works quite well. Not sure what the update interval is. Still waiting for it to pull over the last batch of changes from ns1.bluesine.com.

Also added a backup MX server for bluesine.com, which is the domain handling most of my mail and quite a few other people's mail. This was really simple to do with qmail: basically, Kyle added bluesine.com to his rcpthosts and I added his machine as the backup MX in my DNS records.
