Older blog entries for ncm (starting at number 123)

8 May 2005 (updated 11 Oct 2005 at 23:12 UTC)
So here we all sit with these huge brains we're so proud of. Using them, we can throw a rock to knock down a bird. We can invent mathematics, and organza, and politics and everything. Meanwhile, the lowly starfish creeps along the ocean floor, eon upon eon, nibbling, nibbling. It has no brain. It has no head to put one in. All these centuries we all thought they wandered aimlessly, a sort of aquatic Roomba. We thought so right up until somebody made time-lapse movies, and found that starfish have an active social life, and dominance contests, and are such deadly hunters that when one comes around the snails all flee as fast as each one's little head/foot can carry it. (Bugs hunt, too, and fight, but these things have no brain.)

So, what is it about a brain? It's all these nerve cells wired together (although the wires are nerve cells, too). If you're in a hurry, you really have no choice but to stuff all your nerve cells into as nearly the same place as you can manage, because it takes so long to get a message from one place to the next. If you're in a hurry, you have to keep them all working at the same time, and each one doing just one thing. After all, every millisecond counts.

But suppose you're not in a hurry. Suppose the same survival pressures drive you to evolve ever more elaborate structure, but days or months drift lazily by in your struggle to best your neighbor. Where would you keep your nerve cells: all in one vulnerable spot, or spread out, more or less evenly? Would they be on all the time, burning calories? Maybe they could be doing something else when they weren't called upon to think. Maybe they could be muscle cells, too, or bone cells, or taste buds, or all three and more.

How many would you need? When you're in a hurry, as many as you can afford -- and they're expensive to keep fed. When you're not in a hurry, you can do the same amount of processing with many fewer, just more slowly. Each clump (if indeed they clump) of cells acting as nerves might worry at all stages of a problem, instead of handing off each intermediate result to some other ganglion.

What about the wiring? When you're in a hurry, nothing will do but to wire each bit as directly as it can be to every other bit that might need to hear from it. When you're not in a hurry, a lot less wiring can do the same job. Maybe you won't need any; people once got along sending letters before we got phones.

The point is, suppose a slow creature was thinking big, slow thoughts. How would you find which bits were doing the thinking? How would you recognize that you had found them? How would you even know to look? We have no idea what use a great Sequoia tree might have for intelligence. If it had, we might never notice (not being trees), nor spot any anatomical feature big or small that seemed meant for thinking with.

Search for intelligent life in the universe? We've hardly begun here.

Crashy Galeon got much worse after I turned on memory resource accounting ("echo 2 > /proc/sys/vm/overcommit_memory"). Damn OOM-killer. Hey Mr. Kernel Guy, why not keep around enough spare pages to be sure you can swap something out if you need to? Usually when the OOM-killer attacks Galeon, I'm not even using it, and there's always gobs of swap unused.

deekayen: Your 5 May entry was the funniest story I've read all week. You should make it a regular column.

Orkut: "Java", 14853; "I hate Java", 752. "Perl", 3808; "I hate Perl", 312. "C++", 10329; "I hate C++", 127. "Lisp", 740; "I resent LISP", 63. "Python", 2579; "I hate Python", 53. Perl (3808/312=12) retains the crown of "Most Hated Language", with LISP (12) skulking resentedly nearby. Java (20) is second in line, with Python (49) a distant third, leaving C++ (81) least hated by an astonishing margin. More people, still, hate Java than like Lisp. Of course these statistics disproportionately represent the feelings of Brazilians (not that there's anything wrong with that!), who now constitute 69% of Orkut's users. They seem to hate Perl more than Java, while Norte Americanos (under 8% of Orkut users!) hate Java a lot more. I imagine Brazilians don't encounter as much Java.

2 May 2005 (updated 3 May 2005 at 02:24 UTC)
Just realized that C++ istream has just the apparatus needed to reflect Unix pipe semantics all the way out to the user level. std::istream::readsome() will do it, assuming only that std::filebuf::in_avail() uses ioctl(fd, FIONREAD, &count), and its underflow() just returns after a short read, instead of insisting on refilling the whole buffer. Looks like gcc-4.0.1 might Do The Right Thing. Next goal is compatibility with non-blocking sockets! I guess we need to document this stuff for it to be useful...
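A minimal sketch of what I mean (the 4K buffer and the cat-style plumbing are mine, just for illustration; it assumes the filebuf behaves as above):

    // Pipe-semantics reading via readsome(): forward whatever is
    // available on stdin without waiting to fill a whole buffer.
    #include <iostream>

    int main()
    {
        char buf[4096];
        for (;;) {
            // readsome() returns at most in_avail() characters, so it
            // never blocks waiting for a full buffer.
            std::streamsize n = std::cin.readsome(buf, sizeof buf);
            if (n == 0) {
                // Nothing buffered: do one blocking read to get the
                // next chunk started (or detect EOF).
                int c = std::cin.get();
                if (!std::cin)
                    break;
                buf[0] = static_cast<char>(c);
                n = 1;
            }
            std::cout.write(buf, n) << std::flush;  // pass short reads on promptly
        }
    }

The point is that each chunk goes out as soon as it comes in, which is exactly what a pipe user expects.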

Been reading a truly remarkable book called "Flow: The Psychology of Optimal Experience", by Mihaly Csikszentmihalyi, 1990. It includes some very specific advice on how it is that happy people manage to be; but without the research presented it would seem entirely implausible, so I won't repeat it here. Also read "Jonathan Strange and Mr. Norrell", which I am still mulling over, but the key chapter seems to start on page 500. If you have kids, be sure to check out all the Colin Thompson books from the library, or ask it to buy them if it hasn't got them.

I finally found out who made the most astonishing origami I have ever encountered: Satoshi Kamiya. Unfortunately, the image of it (a dragon with all the claws, teeth, horns, and scales rendered) that Google finds is washed out.

I wonder if they made the new Pope sit in the special outhouse, and peeked from below to make sure the right bits "pendula bene", as is a traditional part of vetting a new pope. They're supposed to be strong on tradition around there, even more so than, say, forty years ago, although people who claim to respect tradition usually seem curiously selective about it. (No offense intended: if it was offensive to talk about, wouldn't it be so much more offensive to do?)

chalst: It would be a great pleasure to host a visit. N.B., sooner is better! LtU, in the bits I've seen, has quite a distressingly high wankage quotient for such a supposedly literate bunch. I've finally joined LtU, so expect a long article explaining to them why I think so. (I don't expect them to take it at all well. ;-) Guy Steele has become a bit of a disappointment, if not quite yet an embarrassment. (Haven't we all, though?) The difference between a computer scientist and a software engineer is that the computer scientist solves every problem by adding another level of indirection, and the engineer solves his by stripping them back out again.

msevior: I've decided to take pity on you. For your piece table, just put each piece in its own fixed-size 32K buffer. Keep an array of descriptors, each just a pointer to a piece and that piece's offset into the document. (You could put more indexing in the piece buffer itself, but how long could re-indexing <32K take?) Every time you move the cursor away from a piece buffer, and it's less than half full, merge it with an adjacent buffer or two. You never have to move more than a few tens of kbytes at a time. Even a huge document (10Mbytes of text really is a huge doc, think about it) is only a few hundred piece buffers. When you insert or remove a piece buffer, you can just scoot the latter part of your descriptor array around; that's another few hundred bytes to move -- again, so what? (You could keep a tree if you can't stand not being fancy.) You might mmap the piece buffers to a file, to make it quicker to save intermediate state, and just sync them out now and again. Give each buffer just enough state so the index can be rebuilt at need. (For transactional safety -- mmap might sync out a page any time, without being asked -- never modify a buffer in place; instead copy it, and then write in its sequence number and hash value when it's ready to replace the old one, so you know if it was all written right.) Your undo text you can just plaster onto the end of some scratch buffers, logwise.
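If a sketch helps, here's roughly what I mean, with all the names invented (PieceBuf, Descriptor, PieceTable are mine, not AbiWord's), and the mmap and transaction details left out:

    // Rough sketch of the 32K piece-buffer scheme (names made up).
    #include <cstddef>
    #include <vector>

    const std::size_t kBufSize = 32 * 1024;

    struct PieceBuf {
        char text[kBufSize];
        std::size_t used;          // bytes of document text in this buffer
    };

    struct Descriptor {            // the whole index: one per piece
        PieceBuf* piece;
        std::size_t doc_offset;    // where the piece starts in the document
    };

    struct PieceTable {
        std::vector<Descriptor> index;  // a few hundred entries for a 10MB doc

        // Find the descriptor covering document offset pos.  A linear
        // scan over a few hundred entries is plenty; a tree is just for show.
        std::size_t find(std::size_t pos) const {
            std::size_t i = 0;
            while (i + 1 < index.size() && index[i + 1].doc_offset <= pos)
                ++i;
            return i;
        }

        // After inserting or removing a piece, recompute the offsets.
        // This scoots a few hundred descriptors around -- so what?
        void reindex() {
            std::size_t off = 0;
            for (std::size_t i = 0; i < index.size(); ++i) {
                index[i].doc_offset = off;
                off += index[i].piece->used;
            }
        }
    };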

VoIP is working fine, except when I call my employer's conference call bridge. That drops me several times per hour -- or somehow provokes my softphone to drop. BTW, what aumix calls "IGain" is what the other mixers call "Capture". That's the slider that matters for microphone sensitivity -- not that you could tell from RTFM or anything else.

Galeon has got distressingly crashy lately. However, it usually crashes when nobody is touching it, as if it's really the OOM-killer at work. Examining /var/log/messages, I find long diatribes by the oom-killer that reveal very little, other than that there's really gobs of space free and no plausible reason to kill anything. They also don't reveal who it's killing. (This, in kernel 2.6.11.7 built with gcc-2.95.4.) Seems somebody broke something, again. Now I'm experimenting with "echo 2 >/proc/sys/vm/overcommit_memory" and likewise bumping /proc/sys/vm/swappiness up from 0 to 10. This is stupid.

Too much slashdot disease here lately, again. In English, the apostrophe never means plural, and "it's" always means "it is" (or "it has"). Please.

To get VoIP working I ended up having to use a binary tarball of IAXComm from three releases back ("rc1"). The Debian experimental version using PortAudio19 (which is supposed to understand ALSA and Jack) suffered persistent assertion failures. I wonder why all the community-oriented VoIP web sites (Free World Dialup, e164.org, etc.) seem almost deliberately opaque. FWD makes it hard even to get out of their splash page, until you spot the "site map" tag concealed in the extreme lower left corner.

One of the odd things I found while getting IAXComm working was that nobody documents what Linux ALSA mixers do, and what all those damn sliders and switches are really for. What I was able to figure out was:

  • Gamix seems to be the least annoying GUI mixer.

  • The "Mic" slider only controls how much of the Mic signal is fed back to the headphones. If it's not muted, people hear loud static at the other end. Putting it at zero doesn't really zero it; you have to push the mute button on it. (I don't know if this is an es1978 problem or an ALSA oddity.)

  • The real mic sensitivity control is labeled "Capture". Its mute button is the right one to use in conference calls.

Congratulations to everybody who had a hand in Gcc-4!

This is the best thing I've read all week.

Daniel Veillard, everyone who matters loves you. Rude people make their own hell; let them squat in it alone.

I've spent way too much time trying to get my Compaq laptop to do an ACPI suspend-to-RAM. For some reason the /proc and /sys files that are supposed to be used to trigger it (or even cpufreq) don't show up, but it doesn't say why. (But I can use ACPI to throttle my CPU to 1/8 its normal speed, whee.) All the online references advise disabling ACPI and using APM, but then poking the volume-control buttons on the front edge makes the whole box freeze.

I spent too much time, too, trying to get linux kernel swsusp2 to work. It hangs instead of resuming, and I haven't time to triage drivers. (Should I blame eepro100? AGP?) Anyway suspend-to-swap would be way too slow resuming if it did work. There's the reason why BIOSes should boot in three seconds instead of sixty! (Shouldn't an AMD-based laptop work with a Free BIOS? Supposedly AMD works really closely with embedded-Linux people. Somebody with an AMD, please give it a try.)

I spent too much time trying to get IAX softphones (IAXComm, Kiax) built and working. They dial and connect fine (yay voipjet.com) and I hear my wife say, "Hello? Hello?" into our regular phone, but they won't send any audio, even though I can hear the mike signal in my own headphones. Instead, they complain over and over about "PortAudio: read interrupted!". Aren't ALSA and /dev/dsp0 supposed to be mature?

It's strange that hardly any of the current VoIP stuff is packaged in Debian yet.

I spent way too much time comparing Dell Latitude with Apple Powerbook and iBook. The upshot is that (1) Apple still claims a $500 price premium over a more-or-less equivalent Dell, (2) a Powerbook really does give you about the right amount of extra value, for the money, over an iBook, and (3) a Powerbook would be worth a lot more if you could order it with three touchpad buttons. Arrogant gits.

At the shared office where I park my telecommuting butt, they gave up trying to make the regular network connection reliable, and connected my RJ-45 socket directly to the router on their back-up DSL connection. It continued dropping connections several times daily -- apparently every time a script kiddie tried to mount its shares -- until I turned off its pathetic excuse for a firewall. Now connections stay up fine. Don't ever buy a LinkSys router whose firmware you can't override with an image that works.

4 Apr 2005 (updated 5 Apr 2005 at 12:54 UTC)

Started the new job at CodeSourcery, telecommuting. I wondered how it would be, not seeing anybody. It turns out that IM becomes very important. I'm ensconced in an office shared with lots of other itinerants, some interesting, such as the mechanical engineer who did the first Roomba. Unfortunately the network service absolutely stinks. If it's not fixed I will need to figure out something else.

At the moment I am choosing to interpret the low turnout of votes for Debian Project Leader to imply confidence in the entire slate of candidates. People don't know who to vote for, but figure it doesn't matter so much who wins, as all the ones likely to get any votes are good.

I just found out that modern laptop batteries get used up after a couple of years whether you charge/discharge cycle them or not. That means it's a big mistake to buy an extra "for later". For most of us, too, it means time will use them up faster than cycling them will, so you should only alternate with a spare if you (at least occasionally) really need the extra battery time. I also just found out that modern batteries have to be "calibrated" before a laptop can tell anything about how much juice they have left.

Galeon has got pretty crashy again lately. So far the most reliable cause of crashes is the www.cambridgema.gov/~CPL library page, and in particular its outsourced "minuteman" catalog service. Unlike before, though, the most likely immediate cause of a crash is running some other program...

bolsh: Are you telling us you actually reply to recruiter spam? That's even worse than regular spam. You realize, don't you, that their posted jobs are fakes? Likewise most on Monster, Dice, etc. web sites.

avriettea: I can understand somebody interviewing at Microsoft. (Ethics are hardly a universal accoutrement.) I can't understand telling us about it. So, you hope shortly to be involved in underhanded efforts to damage Free Software, and you are looking forward to it, and want us all to know? Help me here.

25 Mar 2005 (updated 25 Mar 2005 at 13:17 UTC)

How appropriate it is that the spammer in our midst, johnnyb, asks us to help spam a hospice with bottles of water to interfere with their efforts to care for their still-living patients. The degree of hypocrisy needed to carry on about "saving" a single, er, "patient", while celebrating the death by maiming and dehydration of tens of thousands of children abroad, and also trying to scuttle the Social Security program that has saved millions of their own kin from actual starvation after retirement, beggars the imagination. Probably in most cases it's basic stupidity, but those responsible know what they're up to. (Certainly our own spammer can have no excuse.) There's a special place in hell for their like.

On a less baleful note... the old C++ books mentioned here recently probably are not such great references any more. Object orientation (i.e. virtual functions) has turned out to be much less important than a lot of people thought in the early 90s. It's an alternative to tables of function pointers or lots of switch statements in C, but how often does that come up, really? The only fundamentally important C++ feature from those days turns out to be the destructor, which subsequent languages have failed to adopt. In modern C++ practice, templates are much more important. The most useful C++ libraries these days declare no virtual functions at all.
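To put the destructor claim in concrete terms, here's a little sketch in plain standard C++ -- deterministic cleanup via a destructor, generic dispatch via a template, not a virtual function in sight. (File and count_bytes are invented for the occasion, and the file name is just whatever's handy.)

    // RAII: the destructor releases the resource on every exit path,
    // including exceptions.
    #include <cstdio>
    #include <stdexcept>

    class File {
        std::FILE* f_;
        File(const File&);              // not copyable
        File& operator=(const File&);
    public:
        explicit File(const char* name) : f_(std::fopen(name, "r")) {
            if (!f_) throw std::runtime_error("cannot open file");
        }
        ~File() { std::fclose(f_); }    // the one indispensable feature
        std::FILE* get() const { return f_; }
    };

    // Generic programming: compiles against anything with a get()
    // returning FILE*; the binding is resolved at compile time.
    template <typename Source>
    long count_bytes(const Source& src) {
        long n = 0;
        while (std::fgetc(src.get()) != EOF) ++n;
        return n;
    }

    int main() {
        File f("/etc/motd");
        std::printf("%ld bytes\n", count_bytes(f));
    }   // ~File() runs here; fclose() is guaranteed even if count_bytes throws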

I'm sympathetic to chalst's distaste for Python. However, its natural competition is Perl, which is unimaginably worse on every conceivable axis. Python is unabashedly slow, which is bad, but its promoters make no pretense to the contrary. Certainly there ought to be a much less crufty scripting language that is much faster, yet equally easy to pick up and do something useful with. Some people imagine, wishfully, that Ruby is that language; others see it in OCaml, a few even in Lisp. The natural successor to Python, as for C++, has not yet appeared, so we must soldier on mopping up holdout enclaves of Perl and (respectively) C adherents. In later decades, we will do the same to Python and C++ holdouts after much better languages finally surface. (Java and Visual Basic will share the twilight COBOL lingers in today, forgotten but not gone.)

16 Mar 2005 (updated 23 Mar 2005 at 20:09 UTC)

I seem to be about to get a new job at CodeSourcery. I'll be writing a non-object-oriented C++ library to do automatically vectorized and parallelized, embeddable signal processing, to be paid for by the USAF but useful for the rest of us, and licensed under the GPL, with workplace and hours of my own choosing. Now I need a place to park a desk in Cambridge. Working alone in a silent, bare office is tough. I expect to be working in cafes until I find something.

Optics
I've been trying to find an optical engineer/physicist to chat with. When quasars seemed impossibly bright and far away (according to their red-shift), Emil Wolf suggested a mechanism by which correlated photons passing through certain varying magnetic fields could be artificially red-shifted, and Daniel FV James showed how such fields might arise naturally. I'm interested in engineering applications of the Wolf effect. My question is, how hard is it to get thermal photons to "correlate"? (I understand that a collection of similar photons moving together tend to adopt a common state, but not precisely how or under what conditions.)

Need an example of an engineering use? Gather sunlight and run it through a diffraction grating, persuade resulting slices of more-or-less monochromatic light to self-correlate, and then pipe it through just the right EM field. When the light gets red-shifted, the energy must be going into the field. We already know how to extract energy from slowly-varying EM fields.
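For scale -- this is ordinary photon arithmetic, nothing Wolf-specific, with a 500 nm slice and a made-up shift of z = 0.01:

    E = \frac{hc}{\lambda} = \frac{1240\ \mathrm{eV\,nm}}{500\ \mathrm{nm}} \approx 2.5\ \mathrm{eV},
    \qquad
    \Delta E = E \cdot \frac{z}{1+z} \approx 0.025\ \mathrm{eV}.

So a one-percent shift skims off about one percent of the beam's energy, per photon and in aggregate.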

Could the nearly-monochromatic light efficiently pump a laser? Would garden-variety coherent light suffice to provoke the Wolf effect? Can the Wolf effect operate usefully over less than astronomical distances at reasonable field strengths? (Consider that the conversion efficiency of the competitors for solar energy conversion is a low target.)

[Update: the energy that comes out of the red-shifted photons doesn't go into the field, it goes into blue-shifting some of the other photons. Furthermore, the red-shifted photons come out at angles, so within a fiber they would mush together and you would just end up with frequency dispersion. Bummer.]

chalst: Thanks for the welcome. Apologies for having taken so long to post.

msevior: It's not a mistake to talk to a Microsoft program manager. It's a huge mistake to hope for any substantive help from one. What he says about MS's process is completely unsurprising, and in fact explains why their products suck and why they need to illegally enforce a monopoly to make everyone keep paying for them. (E.g. consider that bug fixes have practically zero revenue potential.) Their only real opportunities to grow revenue are making people upgrade more, porting to new languages, and getting people with bootlegged copies to pay for them. It's almost surprising that they have any coding staff on it at all. The less you know about MS Word, the better off you'll be.

Sour Grapes
Recent postings -- an interview with Alan Kay, and a complaint about "object-oriented" languages on Lambda the Ultimate -- expose the most unattractive feature of academic-language communities: sour grapes. Kay actually repeats the old myths about early C++ being just a macro processor on C, despite that just about every new language since has been implemented first in exactly the same way; and about Bell Labs aggressively promoting it (with its $3000 budget?). Lisp people still try to criticize C++ by criticizing the notion of an object-oriented language, despite that C++ isn't. (Most of the language is to support what has come to be called generic programming.)

Lisp and Smalltalk have had every chance to take over the world. They haven't caught on, and for very practical reasons. Not least among those has been ideology. I don't know of any successful language founded on ideology. If your pet language isn't taking the world by storm, you should figure out what keeps people from being able to use it in their projects, and fix that. GC is a usual culprit.

Orkut
"Lisp", 708; "I resent LISP", 63; 708/63=11. "C++", 9325; "I hate C++, 129: 72. "Java", 13274; "I hate Java", 744; 18. "Perl", 3745; "I hate Perl", 309: 12. "Python", 2490; "Python sucks", 32: 78. More people still hate Java than like Lisp. Perl and LISP are duking it out for most hated (or resented) language. Java is in eclipse (or vice versa?) but still making a respectable showing.

Bram: CO2 produced by respiration doesn't count, because whatever you exhale was absorbed recently by what you just ate, and will be re-absorbed again. What counts is the CO2 released from ancient carbon stores that had been out of circulation. Similarly, the CO2 exhaled by cows doesn't count, but the methane (hydrocarbons) they, er, expel does, because it is not re-metabolized like the CO2, but just hangs around until it oxidizes.

Recent discoveries suggest that the mysterious carbon sink (by which the CO2 levels in the atmosphere haven't grown as fast as all the well-known sources and sinks would imply) is actually dissolution in the oceans, producing carbonic acid -- which seems (also) to account for the coral bleaching. A problem perhaps more worrisome than acidified oceans, and no more coral reefs (!), is that as the upper ocean gets saturated, it stops absorbing so much CO2, and then the atmospheric concentration shoots up. There's some evidence to suggest that this is already happening.

BTW, if you're serious about improving BitTorrent, how about a way for the recipient of a bad block to tell the sender, so the sender can re-hash its own copy and see if it's been corrupted, and maybe get it again from some other participant? I just got several hundred copies of the same bad block, over the course of four hours. Failing that, how about the client not asking that particular sender for that particular block any more?
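That failing-that part is simple enough to sketch. Assuming some invented bookkeeping (PeerId, BlockId, and BadBlockTracker are all mine, not BitTorrent's real machinery):

    // Remember which peer sent a block that failed its hash, and stop
    // asking that peer for that block.
    #include <map>
    #include <set>

    typedef int PeerId;    // stand-in for a connected peer's handle
    typedef int BlockId;   // stand-in for a piece/block index

    class BadBlockTracker {
        std::map<PeerId, std::set<BlockId> > failed_;
    public:
        // Call when a block received from peer fails verification.
        void record_failure(PeerId peer, BlockId block) {
            failed_[peer].insert(block);
        }
        // Call before requesting block from peer.
        bool may_request(PeerId peer, BlockId block) const {
            std::map<PeerId, std::set<BlockId> >::const_iterator it =
                failed_.find(peer);
            return it == failed_.end() || it->second.count(block) == 0;
        }
    };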

Ankh: As I noted in the original article, a solution even in C is trivial if you use somebody else's library, such as the ISO C standard library. The point is not to solve the problem by reference to somebody else's work, but to use the many tiny difficulties inherent in the problem as a hard surface against which to chip off your mental inflexibilities. Cheating doesn't teach you much.

yeupou: Since you can run any ordinary x86 kernel, of whatever stripe, on an amd64, it is inherently as well supported by OSes as Intel chips. The difference is that it can also run 64-bit OSes and binaries, and run substantially faster because of its extra registers, faster memory pathway, superior instruction decoding, and non-crippled cache and ALUs. A 2GHz amd64 can usually outrun a 3.2GHz P4 -- as can, often, an old P3. I installed Ubuntu Linux on mine. I suppose it will run 32-bit binaries under the 64-bit kernel and user-space, through some kind of "mount MS_BIND" ldso trick, although I haven't yet encountered any reason to try it.

9 Nov 2004 (updated 9 Nov 2004 at 06:52 UTC)

I finally posted my solutions to last week's Coding Challenge. I'm hoping somebody (e.g. you) can do better. The point of the challenge was that a problem that looks trivial but messy can absorb (and reward, sort of) an astonishing amount of care. If you can do well on one of these, please tell me.

miah: I can't imagine why anybody would buy a P4 these days, instead of an amd64. Can you still return it all?
