Older blog entries for logic (starting at number 158)


So, in my new role, I'll be responsible for the shared administration of a rather large Gentoo deployment. In the spirit of eating my own dog food, I'm loading my new company-issued laptop with Gentoo Linux.

First impression: I feel like I'm back in 1994. Really. Oh, the package management is actually pretty well baked; in fact, I'm really quite impressed with how they're handling a generally nebulous thing (from-source package generation) in a pretty consistent manner. I'm trying to imagine building Fedora from source RPMs, and the idea gives me the willies; Gentoo has the build dependencies handled pretty well. No, my problem is the same problem I have with Debian: choice is good, but too much choice is a PITA. You can infer from this that I disagree with the Perl axiom of "more than one way to do it"; all that means is that the language (or in this case, distribution) maintainer didn't have the intestinal fortitude to make a decision, and left the problem of bikeshed arguments to the users. It's actually worse than that: on their own, a lot of the little variances don't matter, but get enough of them together, and you have a maintenance nightmare.

This problem is compounded by the community belief that customization at all levels (specifically at the buildchain level) is a good thing. At the end of the day, you have a distribution where you are truly on your own from a support perspective in many cases. Not a big deal for an operating system targetting hackers and tweakers (in fact, I'll probably have a ball with it on my laptop), but when I put my management hat on, the idea of using this in production frightens me; you're essentially locking yourself into using senior-level talent to manage your infrastructure, and hiring junior talent that can grow into the position starts becoming less and less attractive. Not bad for me, but bad for the bottom line.

I expect I'm expounding on arguments that have been had over and over in the Gentoo community over the years, so this is more of a first-impression kind of vent. I'll skip discussing the apparent lack of development and "stable" tracks for general deployment, and a few other similar things I've noticed missing from the "process" around the distribution, because they're all fundamentally part of the same issue: the Gentoo community appears to strongly appeal to the hacker/tweaker, which defines the community's behavior from a packaging and ongoing maintenance perspective.

So, Slackware for smarter people. :-)

Syndicated 2005-12-15 10:11:00 from esm


After being so happy about getting the framerate up a bit last night, I decided to finally get around to updating the BIOS to the last version issued by ECS (it's a K7VZA v1.0 motherboard, if anyone's curious). Seeing as it's been out since 2000, I figured, "What could possibly go wrong?"

I ended up with a brick.

Enter the Willem EPROM burner I bought a few months back for flashing ECU chips for Erica's Laser. I finally set up a machine for doing nothing but burner duty, and pulled the image off the chip. As suspected, it was corrupt, so I tried writing the image out again with the burner...no dice, it would fail after a random number of sectors. After a LOT of searching, I turned up a bit of information: first, BIOS chips (in this case, an ASD AE29F2008) seem to have a defense mechanism against just blatting a new image onto them, and you need to disable this before you can write your image (which is the real "magic" of the flash update software that motherboard manufacturers issue you). Second, version 0.97ja of the Willem software (the most recent version available) can't actually disable it; you have to backdate your version to 0.97g. Tried out the older version, clicked the button that magically appeared to disable software protection, and voilà: the image burned perfectly.

Shouting triumphantly (and waking Erica up, doh!), I rushed downstairs with my freshly flashed BIOS, plugged it in, powered the machine back up, and...BEEEEEEEEEEEP*crackle*. Same thing, just powered down. Dammit. Okay, back upstairs, downloaded and burned the older version that we were running before, ran back down, plugged it in, powered it up, and all was well again. I have no idea why the new image isn't working, but I'm perfectly happy with the BIOS revision I have now, thankyouverymuch. :-)

I also took a second to slap another 30GB drive I had lying around into it for my ogg collection and various other multimedia goodies for sharing with the rest of the machines here. A quick fdisk and pvcreate /dev/hdd1 (etc), and I think I'm about ready to call it a night.
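The drive-prep steps above, sketched as commands. This is a hedged reconstruction of the circa-2005 LVM workflow, not the exact session: the device name comes from the post, but the volume group name, logical volume name, sizes, and filesystem choice are all my assumptions.

```shell
# Partition the new drive: one partition, type 8e (Linux LVM)
fdisk /dev/hdd

# Initialize the partition as an LVM physical volume
pvcreate /dev/hdd1

# Create a volume group on it ("media" is an assumed name)
vgcreate media /dev/hdd1

# Carve out a logical volume for the ogg collection (assumed name/size)
lvcreate -n oggs -L 25G media

# ext3 was the period-typical choice
mke2fs -j /dev/media/oggs
```

These commands are destructive to the target disk, so treat them as a map rather than a script.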

So, on the upside, I now have a decent station to burn chips at; this eliminates the last of my reasons for waffling on getting a new chip made for the Laser, so I'll probably play with that next.

Syndicated 2005-12-06 22:51:00 from esm

Radeon 7000/VE

I never thought I'd be happy to see 114 frames per second out of glxgears, but that's double what I was getting before, so I claim success. I suspect if I cranked the resolution down from 1600x1200 to something a little smaller, or dropped the color depth down to 16bpp, I'd get a little better performance, but frankly, I'm not really willing to sacrifice either. Call me spoiled. ;-)

For the curious: lspci identifies my card as a Radeon RV100 QY (Radeon 7000/VE), which is apparently some bastard child of the Radeon 7000 lineup that has a bad time with DRI. It also has the fact that it's a PCI card working against it, but hey, at least it's got 128MB of painfully-slow-to-access video memory onboard. :-) Two initial problems cropped up: first, I had an MTRR conflict which required manual intervention to clear up (I found a thread over at Rage3D.com that put me onto a solution for that), and second, DRI is disabled on the Fedora build of Xorg, and needs to be explicitly re-enabled for the Radeon 7000/VE (add Option "DRI" to the Device section of xorg.conf). Basic desktop behavior is a bit quicker now, and video playback is actually doable now, so it's a good place to stop for the night, but this is truly terrible performance compared to what I was expecting.
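For reference, a minimal sketch of the relevant xorg.conf stanza; the Identifier string is whatever your existing config already uses (mine here is an assumption), only the Option line is the actual fix:

```
Section "Device"
    Identifier "Videocard0"    # your existing identifier
    Driver     "radeon"
    Option     "DRI"
EndSection
```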

Syndicated 2005-12-06 00:52:00 from esm


The musical and martial arts worlds have a history long enough to learn something from when thinking about new (which is a relative term, of course) fields of study. Something they both got right was the idea of "practice": repetition of basic tasks so as to both reinforce the basics, and to prevent the student from "practicing" on the job (how embarrassing to deliver that B as a B-flat during your on-stage solo).

Let's apply that to the IT world. If you're a programmer, when is the last time you sat down and worked through some of the basics in your current development tool of choice? Written a b-tree implementation from scratch lately, complete with sorting and searching? How about just a basic list or queue? If you're a systems administrator, when is the last time you tried working through a simulated emergency, so that you're better prepared for the next fire you have to put out on the job? How quickly could you get the company's website back up if the database server needed a major component replacement? How about if that component was the complete disk array?
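For instance, the "basic queue" warm-up might look like this in Python: a from-scratch FIFO built on a linked list, no library help (the class names are of course arbitrary; the point is doing it from memory, not this particular shape):

```python
class Node:
    """A single link in the chain."""
    def __init__(self, value):
        self.value = value
        self.next = None

class Queue:
    """Minimal FIFO queue: enqueue at the tail, dequeue from the head."""
    def __init__(self):
        self.head = None
        self.tail = None

    def enqueue(self, value):
        node = Node(value)
        if self.tail is not None:
            self.tail.next = node
        else:
            self.head = node
        self.tail = node

    def dequeue(self):
        if self.head is None:
            raise IndexError("dequeue from empty queue")
        value = self.head.value
        self.head = self.head.next
        if self.head is None:
            self.tail = None
        return value
```

Five minutes of work, but exactly the kind of thing that goes rusty if you never do it outside an interview.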

I wish this was my idea, of course, but I'm really just re-telling an excellent concept from Dave Thomas, one of the fellows who wrote The Pragmatic Programmer (I talked about it a long time ago over on Advogato back when I originally read it; if you haven't ever picked it up, I highly recommend it). He coined the term Code Kata in a bow to the Japanese martial-arts concept of kata. Literally "form", it could more practically be called "practice"; most forms of martial arts have a series of pre-determined forms, or kata, that the student memorizes and exercises until they can be performed essentially from muscle memory.

Similarly, Dave Thomas suggests 21 Code Kata for programmers to tackle during the practice sessions he thinks we're all missing out on. I've had a rudimentary form of this that I've tackled over time, but that was targeted mainly at the acquisition of new skills, not at practicing existing skills. So, I decided to start working on at least a few of these as time permits; having just started, it's amazing how much of the knowledge from back in my university days is still there (and how much trouble I seem to be having recovering it ;-)).

Practice is good.

Syndicated 2005-12-05 16:09:00 from esm

Outsourcing administration?

I read an excellent essay recently which discussed the fallacy that outsourcing programmers is akin to outsourcing manufacturing capabilities; he convincingly makes the argument that programming, unlike manufacturing, is an iterative design process. Just like you'd never see a major automotive manufacturer outsource its design team, because design is their core competency, outsourcing your programming talent (assuming software is your business) would similarly cripple your competitive advantage. His discussion got me thinking about my own chosen profession (systems administration, in case anyone reading this in the past might have mistaken me for one of those hippy geek programmers ;-)). Does his argument apply to systems administration and management, or do we have a job function that could be done from an operations manual?

Systems administration consists of a number of well-defined roles, combined with a lot of day-to-day vagueness. The well-defined stuff is obvious:

  • Keep the hardware running, and schedule fix/replace work as necessary.
  • Monitor operating system and hardware resource usage, and notify application owners/schedule upgrades as necessary.
  • Schedule operating system (and application, if that's in your job description too) patching and regular maintenance.
  • Keep up-to-date with the current "state of the art" in systems administration "best practices" and software/hardware revisions.

I could go on a bit longer, but you get the idea. Keep things running, schedule stuff that helps keep it running, and fix stuff when it breaks. Simple, right? So outsourcing sounds, at this point, like it's a good idea. Bring it on: remote facilities and eyes-and-hands services cover 90% of that, and consulting resources can probably manage the remaining work, right?

This is where I bring up that second part of the job: the day-to-day vagueness. It includes things like:

  • Help a Windows developer become acclimated to programming on UNIX (or, alternatively, commiserate with the poor UNIX programmer forced to write in VBScript all day long). Explain select() to someone whose most complicated previous program has been an interface mockup in MS Access.
  • Help project management understand why the great idea they had for streamlining their project plan will actually extend the deadline, because it doesn't take into account an implementation detail that they didn't know about, such as growing SAN capacity or adding switches.
  • Serve as a liaison between the technology staff and management. Explain that their carefully crafted budget from last year was exceeded by 150%, not because of "out of control geeks", but because management didn't actually ask the geeks how much their projects would cost.
  • Write code, because automation gives you more time to write blog entries about how you're automating yourself out of a job.
  • Convince a vendor that, yes Virginia, there really is a bug in your product. Provide truss/strace/etc. output as necessary. Step through it with a debugger if you have to, or supply the line number of the "encrypted" perl code they supplied you with that contains the bug.
  • Figure out why a particular vendor-supplied operating system patch didn't apply to one machine out of fifty, in fifteen minutes, because that's how long your maintenance window is. Roll back all fifty machines if you can't figure it out.
  • Sit in every meeting for a particular project, not because you have anything to contribute to any of them, but because of that one meeting where someone will suggest something utterly impossible, from a technical perspective, and you need to be there to save yourself and your team from the job of making it happen.

I could go on forever with this part. The primary role of a System Administrator, in my mind, is not to do the day-to-day technical work; management is right, all of that can be outsourced. What they can't outsource is the role of someone you can turn to and ask "does this make sense, technically?" Just like automotive design will never be outsourced, systems and network design can't be outsourced without terrible consequences. Envision a large company that has outsourced all of its administrative talent. Who is making the design decisions for the network? Do you outsource that, or does management take on the role? If you outsource it, who represents the designers in new deployments? Who do you turn to when you have questions about how one system interacts with another? How do you fill in gaps in your staff's technical knowledge without technical leaders to turn to?

Make no mistake, though: this argument demands that systems administrators grow into technical leaders. We need to be architects, designers, mentors, programmers, project managers, janitors, and free-thinkers; we need to have a breadth of expertise that a specialist simply cannot bring to the table. Someone who wants to take a class and be "certified" as a "Linux Guru" doesn't have much of a future in this business (or at least, not at this pay grade) because that role can be handed off to anyone who can read an operations manual. Want to keep your fat paycheck? Recognize the areas that can't be outsourced, and excel at them. If there's a "Learn X in 24 Hours" book, it's a pretty good bet that X isn't what you should be exclusively concentrating on.

I started writing this not sure where I'd end up at the end; I half-expected to end up making the argument that I'm not necessary, but that would only be half-right. Only those of us who can't provide technical leadership and handle the "undefinables" of the job are unnecessary. I, for one, welcome the thinning of the herd. :-)

Postscript: I wrote this in a coffee shop waiting for a job interview today. I'm writing this addendum on the train ride home afterward, after finding out that I was actually being interviewed for two positions: a "pure" system administration role, where everything is clearly operationally defined and the team has clear goals and deadlines to work in, or a position which was defined as "we do the stuff that falls through the cracks" by one of the senior administrators on the team. I don't think the recruiter had any idea why I had such a huge grin on my face when she told me that, and seemed surprised when I didn't have any difficulty saying which position I felt more attracted to.

Syndicated 2005-11-22 21:15:00 from esm


I finally got around to setting up Trac for "managing" (if that's the right word) my software development. I don't have a lot ongoing right now, but all the old stuff I worked on has been languishing on my hard drive without any visibility. While some of it is personally pretty embarrassing to admit to writing, it doesn't do any good to anyone collecting dust. So, have at it, if source code is your thing. It's bolted up to my Subversion repository (in regular and XML flavors), which you can also browse with ViewVC.

Syndicated 2005-11-21 13:39:00 from esm

More DOM stuff

So, a few more random observations on DOM browser compatibility.

First, don't bother using element.setAttribute('class', 'someclass'). Oh, it looks fine in Firefox, but it won't do squat in Internet Explorer. Use element.className = 'someclass' instead, which seems to work in both browsers. (The problem I noticed, if someone catches this in a search, is that no errors were being raised anywhere, but the styles I specified in the stylesheet weren't being applied in IE. I have no idea if this problem applies to other element attributes. I found a reference to the solution at WebmasterWorld.)
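A minimal sketch of the workaround, wrapped in a helper (the setClass name is my own, not from any library): skip setAttribute entirely and assign the DOM property, which both engines honor.

```javascript
// Set an element's CSS class in a way that works in both
// Firefox and IE: assign the className property directly
// instead of calling setAttribute('class', ...).
function setClass(element, name) {
    element.className = name;
}
```

Since className is just a property assignment, it also sidesteps the silent no-error failure mode described above.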

Next, dynamically creating DOM objects in JavaScript is cool. ;-) By being able to yank the navigational aids out of the HTML source, browsers like Lynx and ELinks render with less useless fluff, and if you disable JavaScript in your browser, it "just works" (and looks basically correct).

Finally, I am amazed at how well Opera "just works". I didn't have to touch anything at all to make Opera render everything as I wanted; as far as I can tell, it behaved exactly like Firefox in most cases (specifically, it looks like they implemented the W3C DOM Events specification). Happiness.

Syndicated 2005-11-18 14:34:00 from esm

Deprecated to DOM in 30 seconds!

So, an old friend of mine (one of the few people who actually reads this blog, methinks) dropped me an email to let me know that my wonderful new standards-compliant navel-gazing website wasn't as up-to-date as it should be: horror of horrors, I dared to use <blah onclick="javascript goodness"> all over the damn place, giving everything that oh-so-1999 feel. Okay Ben, here you go; both the sidebar hide/show functionality and the resume section expansion has been unceremoniously ripped from the HTML (along with the plethora of id="XXX" tags I had to sprinkle everywhere) and replaced with a gaggle of JavaScript borrowed from several places and a bunch of DOM manipulation.

May I point out, however, that getting the "new DOM way" working was a huge PITA? My first cut working with Firefox worked like a champ. Then I fired up Internet Explorer (why does firing up that web browser always mean I'm going to have a late night?), and was greeted by...nothing. A big, fat blank screen. WTF? I noticed that if I commented out the call to the external JavaScript file, the page displayed properly. Hmm. On a whim, I tried changing the reference from <script ... /> to <script ...></script> and voilà, it works. You have got to be kidding me.

Then, we get into the mess that is addEventListener vs. attachEvent vs. document.eventname. That took a lot of straightening out; I eventually completely ditched the code that Ben originally suggested from Scott Andrew, and ended up using Andy Smith's very nice cross-browser event addition handler, using anonymous functions so that this works like it bloody well should. (Microsoft, are you listening? Of course you aren't, how silly of me. :-P). Andy's code has the additional benefit of handling nested event handlers in a very slick manner.
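The general shape of that kind of shim, sketched from memory: this is not Andy Smith's actual code, just the addEventListener/attachEvent/DOM-0 cascade that cross-browser handlers of the era were built on, with the IE branch wrapped in an anonymous function so that `this` points at the element.

```javascript
// Attach an event handler across the three models of the day:
// W3C addEventListener (Firefox, Opera), IE attachEvent, and
// the DOM level 0 on<event> property as a last resort.
function addEvent(obj, type, fn) {
    if (obj.addEventListener) {
        obj.addEventListener(type, fn, false);
    } else if (obj.attachEvent) {
        // Wrap the handler so `this` is the element, not window,
        // and pass IE's global event object as the argument.
        obj.attachEvent('on' + type, function () {
            return fn.call(obj, window.event);
        });
    } else {
        obj['on' + type] = fn;
    }
}
```

The real article's version also chains multiple handlers on one element; this sketch shows only the dispatch, not that bookkeeping.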

Finally, I had a problem with Firefox "bouncing" back to the top of the screen every time I'd click on one of my pseudo-links for expanding and collapsing a section. Easily solved by calling preventDefault() on the event object passed to the event handler, but once again, it's not supported by IE events (nor did IE do what, in retrospect, appears to be the right thing: fall through to the default event handler if mine doesn't explicitly say not to). So, that's all fixed up, and I appear to have, once again, a working cross-browser website (with substantially simpler HTML). Isn't standards-compliance fun? ;-)
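The same cross-browser dance applies to cancelling the default action; a hedged sketch (the helper name is mine), covering the W3C preventDefault() call and IE's returnValue property:

```javascript
// Cancel the event's default action (e.g. following an href)
// in both event models: W3C preventDefault() for Firefox/Opera,
// returnValue = false for IE.
function cancelDefault(ev) {
    if (ev.preventDefault) {
        ev.preventDefault();
    } else {
        ev.returnValue = false;
    }
}
```

Call it at the top of the expand/collapse handler and the page stops jumping to the top on every click.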

As a side note, there's still a bit more to do; rather than mangling .innerHTML in my expand/collapse code, I should really just create a couple of objects that I can swap in and out there instead. But that will come tomorrow.

Syndicated 2005-11-18 02:04:00 from esm


I'm making a note about this here mainly to refresh my own memory later, when I find myself needing to create a decrypted version of a PDF that I have the damn password for: pdftk is a handy little program which very neatly takes a password and PDF as input, and can produce an unencrypted version (with or without any of the permissions changed) as output. I can't believe I spent all night searching for something like this (and started looking at Ghostscript hackery as possible solutions) and never came across it. My Google-fu is weak. :-P
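The invocation, roughly, for future me; the filenames and password are placeholders, and the allow keyword is optional (it re-grants permissions on the output). Check pdftk's own documentation for the full set of permission keywords.

```shell
# Produce an unencrypted copy of a password-protected PDF.
# secured.pdf / decrypted.pdf / mypassword are placeholders.
pdftk secured.pdf input_pw mypassword output decrypted.pdf allow AllFeatures
```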

Syndicated 2005-11-17 22:16:00 from esm

CSS again

So, I apparently can't leave well enough alone. I rewrote most of the HTML that drives this website to use <div ...> sections and semi-proper semantic markup (where possible), and came up with a quickie stylesheet for the whole mess. The end result is actually something I'm fairly happy with; it still has some rough edges, but considering I'm finally calling it a night on this version at 2:45 AM, I think I can forgive a few minor problems. At least it validates as proper XHTML 1.0 and CSS, with no warnings, so I can sleep well knowing that it's standards-compliant, and renders pretty much the same in both Firefox and Internet Explorer, and appears to be readable in both Lynx and ELinks. I'll check on the Linux laptop tomorrow, which also has Opera installed.

As an aside, I'm loving Pyblosxom's "flavors" capability. It made building and testing a completely new website using the old content a breeze; just whip up a new set of flavor templates, and modify to your heart's content. In fact, the RSS feed that Pyblosxom generates is just another flavor. Spiffy stuff.

Syndicated 2005-11-17 03:03:00 from esm
