Older blog entries for gnutizen (starting at number 24)

Right now I am dealing with using things such as web browsers via Windows Remote Desktop and UNIX's X Window System, and it occurs to me that Remote Desktop is much faster than X for this sort of remote work. I wonder if any speed comparisons of the two have been done.

I have been looking over the websites that show what kind of hardware is compatible with Debian (or Linux in general, or free software even more generally). I'm looking not only at the standard motherboards and hard drives, but at things such as scanners as well. I have never done any kernel hacking before, but I might do so on the machine I will buy. I am not looking to go cheap-o either - I want a kick-ass Debian desktop which will serve as my entrée into kernel hacking as well as being my general desktop. I have caught the Debian bug and will not install non-free Java and the like on it - if it's not free, I won't use it. I've seen some of the larger Debian/Linux hardware compatibility sites and FAQs; if anyone has any comments about all of this, I'd be happy to hear them.

Wage slavery

After a blissful year of working for myself and being poor, I am becoming a wage slave again. When I was in my 20s, my thoughts were on making a lot of money and getting stock options that would make me rich. Nowadays my thoughts tend to be on working for myself. I know that if I worked for myself and made $70k a year, I would be very happy; I'm not sure how much less than that I could make and still live without having to worry about money.

Anyhow, my partner's and my business is off and running - we are selling on eBay, our osCommerce site is up with its modules, we are in Froogle, we have all the corporation papers, and we have accounts with suppliers. Hopefully all of the groundwork is done, since my new job will eat most of my time, and I can run the business on the weekends along with my partner and the others peripherally involved. An influx of capital will be helpful in some respects.

As I said before, my perspective has changed since my early 20s. Time is more valuable to me than money nowadays, although it is not easy to find a 40-hour-a-week IT job - everyone is hellbent on expropriating more and more surplus labor time from me and everyone else. I would gladly take a large pay cut to work a 40-hour job, but that is quite difficult to find nowadays. I am quite unhappy selling myself back into wage slavery, but hopefully the capital influx can get me going on working for myself. It's unfortunate that the workers of the United States have to spend so much of their lives working to achieve their freedom from the bonds of wage slavery, with few ever able to achieve it. Of course, in terms of individual struggle this will only get worse; only a long campaign of organized struggle will put an end to it.

Mucking with Linux modules

It took me so long to get my CD-ROM burner, DVD player, and USB wireless ethernet adapter (which uses an external kernel module) working under Linux 2.4 that I haven't gone to 2.6 yet. I'm too busy and don't want to go through all of that again. But anyhow, I want to receive faxes on my box and started mucking with that stuff. I compiled the related kernel modules. Now I'm reading that my modem has no free-as-in-speech, or even free-as-in-beer, Linux drivers. I hope I don't muck with this for hours more only to find out there's no Linux support and I can't receive faxes. In other news, my roommate's Windows Internet connection keeps conking out for no reason, so I guess it could be worse - I could be using Windows.

So anyhow, my journey into the world of e-commerce continues.

This weekend I signed up with Dreamhost. I was going back and forth for a while on whether to take on the expense, but it is only $120 for the next year and I get a lot of stuff (web hosting, mail, ssh, etc.), so I just jumped in and did it. Then LA had a blackout the next day! But they recovered nicely and did a quick fix for an NTP problem I had complained about, so I'm happy.

I set up osCommerce there. They have a pushbutton way of doing it, and it was pretty easy. Hopefully I will be able to delay the need for SSL and a unique IP. The Post Code thing bothered me so I changed it to Zip Code, and made some other changes. So now I have to figure out what I'll do for the credit card backend (we'll probably use Paypal to begin with), and perhaps I'll sign up for an account with USPS. With that done we'll do a test purchase or two, then start looking for real customers. No big rush though - I haven't even set up any mail yet; all of my e-mail for this is still going through Yahoo Mail.

Most of my sales have been over eBay thus far. Later on we'll try to drive sales to the web site through various methods. I've been analyzing sales on eBay for my little niche. I didn't like their watchlist system, so I designed my own: it uses a MySQL backend, grabs information on auctions with a Perl script, and displays the results via PHP on my private Apache web server. I'm just trying to get an idea of what sells, and for how much. If something sells consistently, at a price I can make a profit on - I buy.

Which brings me to suppliers. Finding and dealing with them is a bit of a pain. We are low-budget, because that seems like the smartest thing to be - we want to put off spending as long as we can - yet these people sometimes want to do a credit check on you before they even tell you what products they sell. They expect me to have a fax number as well as a regular number. Also, eBay is cut-throat: in my little niche, I can't offer a competitive price on most of the stuff that sells there. We have found a handful of items we can make money on, though. As we find more suppliers, buy in greater quantity, and so forth, we'll probably do better.

I noticed eBay has an API interface which allows several thousand free queries of their database per whatever. I signed up for it but have been too busy with other things to look into it much.

Doing sysadmin work for the dot-bombs and later Fortune 100 financial companies, I made over $80k a year. I have been working full-time on this for the past few months, and am looking forward to getting up to the point where I'm making $20k a year. Yes, it is less money, but I do not have to answer a pager, I do not have to worry about being laid off, I do not have to listen to a boss and all of that junk. I do expect to eventually get back up to my former salary though, and do it working for myself, which, now that I'm in my early thirties, is more important to me than working for someone else and making more money. I think over the long run, you only make more money when you're working for yourself anyway.

I am doing a lot of Perl programming now. I've written short scripts in Perl for years, and wrote one very long script a while back, but I am not much of an expert. I'll have to learn how to structure scripts once they get long enough that they would otherwise become spaghetti code. My code was looking like spaghetti until I broke it into sections: get information, parse information, print information. The next step is putting the information into MySQL - joy. Anyhow, this is taking less time than I thought it would, which is good. Once I finish my information-grabber Perl script(s), I will write some PHP (and possibly Perl) scripts that pull this information from the database and put it on a web page. Luckily, CPAN already has a lot of the functions I need, like HTML parsing. Splits, shifts, pops, and the ability to do sed/regexp stuff easily, like this:

$variable =~ s/"//g;

is quite handy. I would probably write the scripts that view this information in Perl as well, but I feel PHP is better, and simpler, for this type of interaction with MySQL. Anyhow, this is all elementary stuff I suppose, but I'm just surprised how quickly and easily I completed my task. If I were a real Perl wizard, I'd probably have done it even faster.

I'm looking for a place to host a web site, and want to know if anyone has recommendations from experience. The ideal host would give me ssh access, e-mail, FTP, Apache, MySQL, PHP/Perl, SSL capability, the ability to run osCommerce, etc. It would also have good uptime, decent speed, and that sort of thing. A lot of what I've been looking at is in the $10-20 a month range, which is what I'm after. The ability to host several domains for that price would be a plus. Any suggestions? The one I've been looking at most is Dreamhost. Another is Liquidweb (possibly via what I believe is a reseller of it, addaction, or maybe directly). Does anyone have thoughts on these two, or has anyone been happy with an alternative? Especially one that runs Apache, PHP, and MySQL and has shell (ssh) access.

In other news - Sourceforge finally fixed their damned statistics. I did some nice recoding of Gnutizen in February, but have been busy since then. Gnutizen is still alpha since it can't share. I wonder if I should allow it to be an ultrapeer before going into beta - I guess the Gnutella community would demand it. There's not much of a reason to use mine rather than the others anyhow, although I wonder if any Gnutella clients still work on a UNIX command line - I haven't looked for a while. I know gnut doesn't. It's a good thing I cleaned up the code before trying to implement sharing anyhow. Geez, it was a mess. Code Complete was a helpful read.

autoconf and automake - ugh. They remind me of what sendmail used to be like.

I rewrote a lot of Gnutizen, especially the Gnutella packet parsing section. The new functions are much clearer and more logical than the old ones - I look back at the old code and ask, what was I thinking? Anyhow, I have been working for a while to clean things up before adding new functionality, and I am about finished with all of that, which is good because I want to add features like the ability to share files.

One of the main reasons I'm writing Gnutizen is that I'd like to see some of my ideas for peer-to-peer advancement realized, and one way to do that is to write a peer-to-peer program. Not so easy - especially when the protocol keeps changing! With the advent of file hashes for Edonkey, Gnutella, and BitTorrent, however, I'm beginning to see that I might not even have to write all of this. In an "art of unix programming" way, I might just be able to write what I want and then "pipe" it to another program, in the same manner web browsers launch video files with a fork to a video player. It would be really simple for me to get some of this done, so I won't go into detail here. I'd like to have a look at the basics before thinking about the harder architectural problems.

29 Jan 2005 (updated 29 Jan 2005 at 11:54 UTC) »
Bolsh: I do not even agree that the assumptions behind the "law of supply and demand" are accurate in your reference to Robert. Price is not a "mechanism of information transmission" by any means except in a negative sense - if some corporation produces some commodity that people do not want to buy (i.e. exchange for something of equal value), then it may realize it has wasted its time producing something no one wants - something that happens all of the time in economies such as the US's. But in that case, information is only transmitted when you're losing money making and selling something. Thinking that numbers such as prices, or even the objects themselves, have any inherent information or value in and of themselves is fetishization. All value is simply congealed in a commodity by a homogeneous measure of labor time. Of course, in many ways rent does not even fall easily under our modern economy's rules, since it is a holdover from the days of feudalism - we even call the people we pay rent to "landlords," as they did 1000 years ago. I live in New York City; rent control has worked fine here and I have benefited from it. There is no sense of a lack of supply at all unless you absolutely must be one of the million and a half people packed into the 30 square miles of Manhattan. And people I know in San Francisco had no trouble finding decently priced apartments in the Mission or Oakland even at the height of the boom. I think the real question is which is crazier - rent control, or an economic system whose insane boom and bust sends people into San Francisco like madmen before sending them fleeing? Rent control is a minor blip on an insane economic system which veers between massive overproduction and recession, to mention just one of its negative qualities.
One would think what is wanted is that someone working 40 hours a week at a useful job can afford to keep a roof over his head, and that the economy should be geared to that; instead what is being suggested is an economy that is not there to serve us - rather, we exist to serve the economy and its crazy quirks as it veers about. Of course different people look at this differently - I rent, but I'm sure a landlord would be of the opposite bent; there is not one correct view of how this works, but two conflicting ones. If there weren't, the renters would probably have "seen the light" of the landlords' soi-disant wisdom decades (centuries) ago.

I just skimmed through Code Complete, 2nd edition, and liked it. I also just found the big bug I have been hunting down recently - the format of what was coming through over my socket changed. The "program defensively" section is obviously most needed when one is listening on sockets over which almost anything can arrive. The section on functions was what I needed to read the most. I also browsed through the K&R book again, reading about functions. And I've been playing more with valgrind, ddd, gprof, splint, and other tools. My code is getting better and less buggy; hopefully it will become more so as time goes on.
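"Program defensively" for socket input mostly comes down to never trusting what arrives over the wire. As a sketch (not Gnutizen's actual code), here is a defensive parse of a Gnutella descriptor header, which is 23 bytes: a 16-byte GUID, then one byte each of payload type, TTL, and hops, then a 4-byte little-endian payload length. The sanity cap on the payload length is an arbitrary value I picked for illustration.

```c
/* Sketch of defensive parsing of a Gnutella descriptor header.
 * Never trust a length field from the wire: check the buffer is
 * long enough before reading, and cap the advertised payload
 * length before allocating or reading based on it. */
#include <stdint.h>
#include <stddef.h>

#define HEADER_LEN   23
#define MAX_PAYLOAD  65536   /* arbitrary sanity cap - an assumption */

/* Returns the payload length on success, or -1 if the buffer is
 * too short or the advertised length is implausible. */
static long parse_header(const unsigned char *buf, size_t buflen)
{
    if (buf == NULL || buflen < HEADER_LEN)
        return -1;                       /* not enough bytes yet */

    /* bytes 19..22 hold the payload length, little-endian */
    uint32_t payload = (uint32_t)buf[19]
                     | (uint32_t)buf[20] << 8
                     | (uint32_t)buf[21] << 16
                     | (uint32_t)buf[22] << 24;

    if (payload > MAX_PAYLOAD)
        return -1;                       /* liar or garbage: drop it */
    return (long)payload;
}
```

A caller that reads from a socket would buffer until `parse_header` stops returning -1 for "too short," and drop the connection on "implausible" rather than trying to honor a multi-gigabyte length from a hostile peer.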

23 Jan 2005 (updated 23 Jan 2005 at 04:51 UTC) »
My Wintendo

I wanted to check out Exeem on my Debian box, so I downloaded WINE and loaded it up. At one point a menu scrolled down, I moved my mouse, and the menu just hung there, as it does in Windows, and even followed me around across workspaces. *Shudder*. It brought back bad memories of the days I had a Wintendo on my desktop. Of course, this is just one of the things that goes wrong all the time in Windows, and WINE faithfully emulates it. And people complain about GNOME. I wondered how Debian and WINE would handle Exeem's Cydoor spyware. Pretty well, it seems. Spyware and adware have gotten worse as time has gone by - I had one invade a Windows box (from Overnet) that took me a week to remove instead of the usual minutes. It was ftp'ing stuff and everything.

Debian on my desktop

As I said in my last entry, although I've had a Debian on my home LAN for a while, I switched from Windows to Debian for my desktop recently because among other things Debian had better support for my Linksys 802.11b USB wireless adapter. I have learned and realized a lot of things in a short time. This will probably be a long entry.

One general thing I am unhappy with is that I just want things to work without much hassle, like they do on Windows. Sometimes some things - like the Linksys, or not needing the crappy Windows/OEM CDs to "repair" or "reinstall" - work better on (Debian) GNU/Linux than on Windows. But a lot of things that just work easily on Windows don't on Linux, which is a bit of a hassle.

What I'm thinking of right now is support for my old 3dfx Voodoo 3 graphics card, so that I can get more than 1 frame per second in Tux Racer, but that's one I haven't completely tackled yet. I do know it will take me a while to figure out whether the 2.4 kernel supports 3dfx cards decently (or whether I can get drivers somewhere that do), and to implement it if I can. In Windows these things just work without so much hassle. Well, that's one I have only started looking at; let me go over the things I have looked at.

I am really more of a Solaris person than a Linux person, so I have not done millions of kernel compiles like some Linux people. Thus, every time I have changed .config, I've been recompiling the entire kernel, although it occurred to me recently that I could just answer "M" instead of "Y" and add new features as modules. Actually, modules were causing me headaches when I was first trying to get my 802.11b adapter working with Debian (I was attempting to use the horrible atmelwlandriver instead of the beautiful at76c503a driver), so that turned me off to modules for a little while. It's kind of like chess, where some bad experience sets off Pavlovian bells in the head when considering certain moves.

So anyhow, I've been recompiling the kernel for every small change instead of building modules, but that's how it is; I'll probably be doing the module thing more in the future. I prefer make xconfig to the other makes, especially for the very helpful help button which pops up (menuconfig has this as well, although it doesn't pop up, of course). Actually, in my opinion I could use even more user-friendliness than make xconfig - maybe something that looks through my dmesg and compiles based on that, somewhat automagically? I keep getting SMP in my kernel automatically, which I don't need, yet I had to instruct the config to put in a filesystem for my internal DVD drive, not to mention the USB (and SCSI) stuff I needed (none of my devices being supported within the kernel, I used outside modules for all of them). So the most advanced config that comes with the kernel installs SMP by default, which I don't need, yet is blind to my DVD player's need for the UDF filesystem.

Which brings me to "make xconfig"... OK, I have been using Linux on and off since before 1.0 was released, although anyone reading this is obviously aware I am not a hardcore user. But if these things bug, or worse, confuse me, imagine what they will do to Joe Linux Desktop User or even junior sysadmins (or even senior sysadmins, for some of the kludgier Linux problems). Anyhow, I ran "make xconfig" the first time, and I forget what the hell the error message it spit out all those days ago was. Eventually I realized I needed Tcl/Tk and apt-get installed it. After doing that, let me show you what it said on my second try at "make xconfig":

Application initialization failed: no display name and no $DISPLAY environment variable
Error in startup script: invalid command name "button"
    while executing
"button .ref"
    (file "scripts/kconfig.tk" line 51)
make: *** [xconfig] Error 1
intron:/usr/src/linux# 
OK, now what is Joe User, who is just trying to get his DVD player to work, thinking right now, after by some miracle he correctly figured out to apt-get install Tcl/Tk from whatever the hell make xconfig spit out prior to this? OK, so what do I do now? "DISPLAY=0:0 ; export DISPLAY" of course. Let's try it again (I have trimmed the error message for space considerations):
Application initialization failed: couldn't connect to display "0:0"
Huh? That has worked for me in various situations for the past decade or so. At this point I have to go to Google. I learn that for some reason I have to do "DISPLAY=:0 ; export DISPLAY" - the leading 0 which I have been putting there for a decade in various UNIX and X environments now won't work. Fine, I go back and make xconfig again. (errors trimmed)
Xlib: connection to ":0.0" refused by server
Xlib: Invalid MIT-MAGIC-COOKIE-1 key
So I open up another user and type "xhost +" (come on in, boys), try it again and Calloo! Callay! xconfig pops up.

Now you can have two perspectives on this: 1) I am an idiot because I don't fully understand (or remember) the entire X protocol and why I have to remove that 0 nowadays, or 2) even someone with as much UNIX and sysadmin experience as I have still had to go to Google to learn that this wouldn't work without removing the first zero in "DISPLAY=0:0 ; export DISPLAY" - never mind needing to know enough to apt-get install Tcl/Tk, or that a root X client needs an xhost command (run as the normal user), and that sort of thing. Windows Control Panel it is not.

This is not the main thing that bugged me, just one of the many small things that did. I borrowed an HP printer/scanner recently, and went out and got the drivers (how free they are I'm not sure - I may have sinned against my Debian system). Anyhow, I set it up with the HP drivers, got printing working, got SANE (a scanner interface), and got that working. Then I came back a few days later and scanning wasn't working. Eventually I realized that I have to run "/etc/init.d/hplip start" manually, and then xsane will recognize the scanner and I can scan. I guess I forgot this, or it had been started on installation last time or something. Did I mention xsane's penchant for crashing after every third scan with a segmentation fault? And that all the GPL OCR programs suck (I guess this is true on Windows as well, though - I wonder if the commercial packages for Linux are decent)?

OK, now to my DVD (and CD) player. As I said before, the default kernel configuration gives me SMP support I don't need but misses the fact that dmesg sees I have a DVD player. Anyhow, I compiled a kernel with UDF. I forget if I had iso9660 in my initial kernel or not; it's turned on now anyhow. I never really thought about audio CDs in my CD/DVD player - the first time I put one in, without thinking I tried to mount it as an ISO9660 filesystem. I went to Google and saw that cdplay would play my CDs. I'm not really sure how this works in Linux - on Windows my Winamp automagically plays CDs and MP3s; right now I have XMMS playing MP3s and am using cdplay for audio CDs. I've asked XMMS to "play" /dev files instead of MP3s, but it has not complied. Well, the less convenient XMMS/cdplay split *works*, even though Windows was better in this respect, and I have bigger fish to fry (getting more than 1 frame per second in Tux Racer), so this goes to the bottom of my list - MP3s can play, audio CDs can play, and once I get graphics, my Zip drive, and so forth working, I can go back and look at whether XMMS can play audio CDs or some other package can.

Oh, another thing. Debian was launching into GNOME when GNOME was not working initially. I would boot up, gdm would launch, I'd get the selector, but nothing would work - not even the xterm. Very, very annoying. I don't even remember what I did; I think I went into emergency boot mode on my install CD and linked my gdm init.d script to /dev/null or something like that. When I was ready to deal with GNOME and GDM, I apt-get installed xterm so I could escape from GDM, and eventually got GNOME working.

I have been getting educated on recent GNOME developments (or non-developments, more like it). My first indication of how far behind woody is came when I was trying to install the stuff to get my HP to work. Debian woody used Python 2.1.3, which was released almost three years ago. HP wanted something a little more modern, like 2.2 at least. I had no idea how whacked Debian's releases had become when I installed it. I'm not even sure how much of a problem it is - I'm happy with sarge ("testing") - I just wish Debian would be more forthcoming about what is very obviously a problem on its web page. Its front page says "Getting Started - The latest stable release of Debian is 3.0. The last update to this release was made on January 1st, 2005." Which is a link. Which links me to how to download woody. The page does say "Debian GNU/Linux 3.0 (a.k.a. woody) was released on 19th of July, 2002," which perhaps should have set off alarm bells, but as the front page had the date 1/1/05 as the last update, I had assumed some kind of Debian stable "base" was from 2002 and that things like post-2.1 Python would be in the updates; I didn't know the updates were just security patches and that sort of thing.

Which brings me to the general Debian craziness. Honestly, I am very happy with their position regarding free software, although one can reach a point where one is too zealous - I understand some people want to remove the choice of non-free entirely, which I think is ridiculous. Theodore Ts'o has talked about splitting Debian; I think what should happen is that the 100% purist Debian developers should hopefully continue to contribute to Debian, but also start their own completely pure and free distribution, "Free Debian" or something like that. Debian needs to navigate between the Scylla of needing to be a "mass movement" and the Charybdis of needing to be, and promote, free software. It needs to be *both*, and I hope it can be, although at certain points it will have to give up users to remain free, and at certain points it will have to make some compromises so it doesn't become an OS no one uses except a handful of die-hards.

I don't want to go too hard on the Lunar Knight free software zealots. I'm glad they exist, and that *they* do not run systems with unfree software. It gives me great satisfaction knowing Richard Stallman will not run non-free software on his machine. For myself, I want to run free software whenever possible. And I like knowing when the software and drivers I'm running are not free. But I want to have the option, the choice, of running them. I run Blackdown on my Debian, not Kaffe. When Kaffe is in decent shape, I'll run Kaffe. If I could program better, I would contribute to Kaffe.

Gnutizen

Which brings me to my project, Gnutizen. For some reason, although it worked well on my other Debian machine and my Windows machine, it is now breaking all the time. I also just realized I haven't been doing mutex locking on global variables - I guess those bugs are just popping up now for whatever reason. Blimey! Gnutizen is written in C; it is a peer-to-peer application running on the Gnutella protocol which (used to) compile on Linux and Windows (as well as, at times past, FreeBSD, OpenBSD, and Solaris). I am not a good C programmer, especially considering I have spent the past year mostly fixing bugs in the program (doing the minimum possible to keep up with the evolving Gnutella protocol). I can connect to the Gnutella network (initial bootstrapping is very manual, although now that I have some leaf support, it is better), search, and download. Or I could, when the program didn't crash as often, on my last Linux, before the mutex locking bugs showed up on my new system.

Fixing bugs for a year, doing the minimum to keep up with the evolving protocol and so forth, is not fun. And now all this thread locking stuff. I would rewrite it all, but my C skill is so poor that I don't know if I should.

In chess, when two grandmasters play a game, other grandmasters go over the game and note the brilliant moves, the mistakes, and so forth. Grandmasters will never go over my games like that (maybe if I went to the Village and begged, some would), just as great coders will not be going over my current code and telling me what I'm doing wrong (unless I beg them). But does anything like that exist on the web? There really should be - like, "OK, we're going to take a look at less 2.90: what it does right, what it does wrong, etc." I suppose what I'm looking for is code that is super-commented and meant for someone who doesn't know C that well (as in: we put extern here because of so-and-so, we have this in a header file instead of the .c file because of so-and-so).

I suppose what I have to do is just look at well-coded C packages - not that large, well-used, and well-coded - take advice from people like Linus (such as "functions should be short", something I initially had a problem with), and then just sink or swim. I went to NANOG a number of years ago and was impressed - well, a number of people impressed me, and Van Jacobson was certainly one of them. So when I first started coding my big C package, I said, I'll use traceroute.c as a guide of sorts, since it is networking code, and because Van Jacobson is a god - I mean, he wrote traceroute for heaven's sake (along with TCP/IP header compression and a lot of other stuff). So when I had settled on using traceroute.c as a guide, here is what Van Jacobson somehow had the foresight to tell me:

/*
[...]
* Don't use this as a coding example.  I was trying to find a
* routing problem and this code sort-of popped out after 48 hours
* without sleep.  I was amazed it ever compiled, much less ran.
[...]
*/

I'm sure he meant that in a flip manner, and I'm sure code he pops out after 48 hours without sleep is better than my best code, but it's funny how he put a warning in the code 15 years ago not to use it as a coding example, when that was exactly what I was going to do. Then again, if you listen to the old-timers, they all say that a lot of the code we use, and assume does weird things for a good reason, actually has things implemented in an odd way because it was written on a PDP or something like that.

Anyway, if anyone has ideas on how I (or anyone) can improve their C skill, let me know. I guess practice is one way - in retrospect I now realize the decision to add threading was a major one (I just copied gnut, a model for Gnutizen, in that decision - although of course, they locked their mutexes for global variables). I don't even know what the hell gtk-gnutella does; someone on #gtk-gnutella says they use "magic()". I would love to contribute to GTK-Gnutella and perhaps learn from better programmers in the process, but I think my ability is not up to their standards.

Anyhow, I definitely am a believer in do it yourself, sink or swim learning for things like learning C and programming an application, but helpful pointers from smarter people once in a while saves me a lot of time and frustration. People on IRC have been really helpful, actually. I try not to be *too* bothersome.

Anyhow, I know enough C to write a program that connects to Gnutella and can download a file (borrowing a lot of the command-line interface from the GPL'd gnut, as well as some other minor things). I also think I made some good design decisions, like trying to be cross-compatible and trying to allow flexibility in the UI. But my code has been very buggy, which is no fun, so if anyone knows of good material on the web about how to program C better, or good examples of C programming (of a reasonable size - Linux and emacs are huge), perhaps you can respond here. I bought The Art of Computer Programming, which is very math-heavy, dense, and concentrated on assembler. It seems very hardcore to me, and while I hope to read through all three of the published volumes at some point, I'd like something not as intense as that - like Linus's coding style notes, where he talks about how he feels about comments, variables, functions, and so forth. Or I've read Bram Cohen's articles here on some of his ideas. Good example programs are appreciated, as well as articles with advice that addresses the whole shebang, not just how a function should work and so forth.

I have two Intel boxes at home - an old one, and a newer one which has more stuff (DVD player, Zip drive, soundcard). The old one ran Debian and the newer one ran Windows 98. Recently I had a lot of trouble getting my Linksys 802.11b USB wireless adapter to work with Windows 98, while it worked just fine with my Debian 2.4.20 box. I had to reinstall Windows from these crappy Compaq Windows 98 recovery CDs, and it wiped my C drive (which I expected) and my D drive (just to put 3-4 small files on it). Plus it suddenly didn't recognize the ethernet card. So I said screw it and installed Debian on the newer, nicer machine. As long as I can write a resume in Microsoft Word format - which, of course, I can do in GNU/Linux - I'll be fine. Besides, my roommate has a Windows box if I need one.

I've been working on Gnutizen, my Gnutella p2p servent. I recently put leaf functionality into CVS, and have added even more since, although that is not in CVS yet.

I've also been checking out GTK-Gnutella's source code. It makes me realize how much of the C language I have yet to learn. I would like to help them out if I can. I know the Gnutella protocol fairly well, which may even things out on some level with better C programmers who don't know the protocol.

