Older blog entries for chalst (starting at number 43)

graydon: It's good to see you posting diary entries again!

I agree that Paul Graham's writings have an excessively partisan flavour, but to defend the superiority of LISP macros over its alternatives:

  • LISP macros are not preprocessor macros: there is flexibility over when macros are expanded, and indeed macro expansion can be done at run-time using `eval'.
  • I'm surprised to see you think TCL `satisfies the menu': I think string manipulation is a fundamentally flawed way of doing this kind of thing. IMO, Paul Graham is entitled to dismiss TCL as not having a real `syntactic metaprogramming system'.
  • We've talked about this before (and I still haven't looked further into ocamlp4 vs. MetaML, so I may be being unfair to ocamlp4): I don't think that the macro facilities offered in the ML world are as powerful as those in the LISP world. The whole issue of how to integrate a real macro system into a statically typed language is in need of further work.
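The first point above, macro-style expansion at run time via `eval', can be sketched in Scheme (the helper name is illustrative, and I assume an implementation providing `interaction-environment'):

```scheme
;; Build a code fragment as ordinary list data with quasiquote,
;; then evaluate it at run time with eval -- the kind of run-time
;; "expansion" a preprocessor macro system cannot offer.
(define (make-adder-code n)
  `(lambda (x) (+ x ,n)))   ; a code template, filled in at run time

(define add5 (eval (make-adder-code 5) (interaction-environment)))
(display (add5 2))          ; prints 7
```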
Having said this, I think the Common LISP macro facilities look antiquated compared to what is on offer in the Scheme world. Common LISP was fixed just before a lot of advances were made in this technology: hygienic macros, syntactic closures, `Macros that work', first-class macros; this is stuff which can't be grafted onto the language using defmacro. The design issues around macros are tied up with the issues around module systems, and this is an area where Common LISP is just plain broken.

raph (from 11/5/2002): I think it's a bit premature to say arc ... could well become the most compelling LISP dialect. I think the *discussion* around arc is going to be very interesting (it already is), but it is not clear to me from what he has written so far that Paul Graham has any ideas that will put arc substantially ahead of where Olin Shivers's scsh already is for the kind of tasks he has in mind. We'll see.

Grit: I think the ability to guess domain names isn't an important one, informativeness of URLs is much more important. If musedoma do their job right, then one knows that sfmoma.museum is indeed what one is looking for.

The point I was making about persistence is that for a renewal policy to make sense it would have to apply to all/most domain names, which directly undermines the utility of a DNS.

shlomif: I'd say an important advantage of Java over C++ and (to a lesser extent) Perl is that it is pretty predictable: there are relatively few nasty snafus in writing code. Maybe `less insights' is the price, but I can see why people might use it to get the job done.

The `It's not real code' argument against Perl is lame, but IMO Perl doesn't scale well; I have to say I don't think of Perl as a `proper' programming language.

tk: But badvogato *is* a master, if not with that honour in the advogato trust metric...

Grit: The proposals I am familiar with for expanding the TLD domain name system suggest that many of the new TLDs will have different policies (ie. like .edu/.gov/.mil), so they are not proposing the .com/.org/.net free-for-all you show to be absurd. I think a renewal policy is an awful idea: the point of the DNS is precisely to have persistent points of reference to internet resources.

freetype: Are you familiar with the work on region management (ie. a compile-time alternative to garbage collection)?

lkcl: Congratulations on the new job!

tk: I don't read geek code. Well... OK then, `y!' means you're a chap?

shlomif: Check what version of ghostscript you are using: 5.50 had serious problems with generating and rendering PDF. You can check that the PDF is OK using acroread.

12 May 2002 (updated 12 May 2002 at 16:29 UTC) »
raph: I found Bertrand Meyer's article to be biased against Java: generics have been planned for Java since the first release (rumor has it that the first release of the Java SDK was almost cancelled because it didn't support them), and have been `backported' to all the old releases: one of the reasons for the delay was getting it right... The JVM is really not a bad target for most languages: the absence of longjumps (ie. continuations) in the JVM is a showstopper for scheme but otherwise the architecture doesn't fare too badly. The platform comparison you cite by John Gough talks about the overhead Java introduces by its need to box and unbox representations: that isn't the whole story, since a good JVM->native code compiler can eliminate most of this cost (I was not so impressed by this article: it didn't even mention the longjump issue).

My prediction is that there will be a big growth in domain specific languages for the .NET platform, and these will be popular with developers. I guess that the C++.NET language will be a failure for the reasons Joel Spolsky gives. I'll be interested to see if FORTRAN.NET takes off for scientific computing.

Lastly, maybe you will find Java/CNI interesting (the `Cygnus Native Interface' for Java compiled to UNIX/C using gcj).

Just certified Perrin to Journeyer for his work on Freeciv.

Postscript: Just upgraded my certification of tk to Journeyer based on his/her wide involvement in free software projects and interesting diary entries.

Joel Spolsky thinks Bertrand Meyer's article on .NET language independence supports what he says about the weakness of language independence for .NET. I disagree: while .NET doesn't address many of the reasons that Joel cites for moving to other languages, as Bertrand says it is a good framework for most modern languages. It's good enough to support LISP and Scheme, which, as Paul Graham says, is the model new programming languages are evolving towards, and which the JVM doesn't support due to its lack of tail-recursive function calls.

While I'm on the subject of .NET, I want to say again that I don't think Gnome on Mono is a good idea, for two reasons:

  • MS's aims with .NET are anti-competitive and will hurt free software;
  • Free software support for Java is better advanced, and it is better to build on what we have already than support .NET.
However I do think .NET is innovative and important, and so is something the free software world should properly understand. A better strategy would be to put more effort in free software Java based efforts, and pressurise SUN into remedying Java's defects (eg. the prejudicial framework for J2EE certification, the weakness of the Java Community Program, the lack of tail-recursive calls in the JVM). Sun is not an ideal free software partner, but it's a damn sight better in almost every respect than MS.

Postscript #1: Since writing this, I read the Interview with Danese Cooper on Slashdot, which seems to suggest that Sun is at last dealing with the open source J2EE problem. Didn't say anything about the problems with the Java Community Program, though, more's the pity.

Postscript #2: The Stallman factor: spot on!

Postscript #3: Certified shlomif as Journeyer because I like his diary entries (a lot), and apparently he has good skills and makes worthwhile contributions to free software.

9 May 2002 (updated 9 May 2002 at 16:46 UTC) »

I guess the criteria I use for apprentice certification are pretty much: shows willing and occasionally posts diary entries. Certified jooon to apprentice after receiving a comment from him about using one's `Notes' section to record the justifications for one's certifications. I don't like this because (i) it is messy and (ii) certifications change and I like to keep a historical record of this kind of activity. Also certified ClimbNorth to apprentice: I'm interested generally in games programming, and I'd like to encourage free software games developers.

I hardly read slashdot anymore, but one thing I've noticed is they've stopped acknowledging `ping' requests. I think this is unfriendly behaviour (yes, they take a tiny amount of time; yes, slashdot has problems with DOSers; but even so ping is important network glue, and one should eventually acknowledge at least the first few ping queries), so one thing I do when I remember is have a few processes constantly sending them messages. Maybe I'll put this on my cron scheduler...

Postscript #1: I found this essay on trust and the media at Dan Gillmour's weblog (this year's Reith Lecture, I don't suppose many folks here know what that is). The author's argument is basically that currently the big media companies have a `licence to deceive' that is normally defended by appeals to John Stuart Mill's arguments that are based on an outmoded view of the press. Good stuff. Dave Winer's idea about weblogs being a way to improve the quality of information is one well-known idea, but I wonder if a system based on (attack-resistant) certification could do better? What would such a system need to do?

Postscript #2: Certified Grit to journeyer: has good skills, does interesting research, has an interesting free software project and writes interesting diary entries.

7 May 2002 (updated 7 May 2002 at 11:35 UTC) »

Very busy at the moment. In the last three days I finished the first draft of a big research proposal, refereed a paper I very much liked, and put the finishing touches on a technical report. Sometimes work goes well; it would be good if I could remember that when things don't go so well.

I've decided to attach information on my certification activity to my diary entries, starting with amars, who I've certified as journeyer because of his comments on and work on PHP: I don't like PHP, but it's important and free, and amars seems to have a good perspective on it and has done apparently useful free software work with it.

Two suggestions for advogato: (i) fix the 404 for http://www.advogato.org/html.html and (ii) give an option to add a URL to certifications so that folks can link to diary entries explaining their certification activity...

Postscript #1 jlbec: I agree that the analogy between Apartheid-era South Africa and Israel isn't perfect, but I find the analogy between the condition of Palestine and the then condition of the homelands to be striking. Also striking is how an obviously unjust and unsustainable status quo is maintained by fear of change and a surreal failure to grasp their real position. I only hope things work out as well in the Middle East as they did in SA.

Postscript #2: Certified mcs to apprentice - sounds like he needs encouragement.

Postscript #3: Joel Spolsky's critique of language independence on .NET seems to miss the point: MS wants to make it easy for existing projects to migrate to the .NET platform. Bring 'em in and lock the door... I like the way he makes the point about the programming language syntax arguments, though - they really are lame.

zhaoway: No, macros' ability to control environments has nothing to do with currying. The point I was making about let is simply that there is no way one could define let as a normal function, ie. (define (let binders body) ...).

Let me try to make the point another way. If I have a function call (f arg), arg is evaluated in the current environment. If I have a macro call (m arg), the macro can create arbitrary computational contexts for arg: it may refer to variables that do not occur in the current computational context, its evaluation may be delayed until various I/O operations are performed, or various updates to global variables are performed, or some other such thing. Not only that, we can do sophisticated transformations on arg, eg. we might perform a CPS transformation on the arguments through a structural recursion on the given code fragments (Jeffrey Mark Siskind's Screamer code, integrating a constraint-solving language into Common LISP, is based on this technique).
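A minimal Scheme sketch of a macro placing its argument in a new computational context (the macro name and log strings are illustrative, not from any library): the body expression is only evaluated after some I/O has been performed, which no ordinary function could arrange, since a function's arguments are evaluated before the call.

```scheme
;; (with-logging tag expr) prints a message, THEN evaluates expr,
;; then prints again and returns expr's value. A function version
;; would receive expr already evaluated, so the ordering would be lost.
(define-syntax with-logging
  (syntax-rules ()
    ((_ tag expr)
     (begin
       (display "entering ") (display tag) (newline)
       (let ((result expr))
         (display "leaving ") (display tag) (newline)
         result)))))

(with-logging "sum" (+ 1 2))   ; logs twice, returns 3
```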

18 Apr 2002 (updated 18 Apr 2002 at 15:54 UTC) »
zhaoway: While important, the main use of macros is not efficiency, so much as the ability to manipulate syntax. The main application I find myself using macros for, and crucial to the `little languages' approach, is the ability to control the environment in which computations are performed. Think about the let special form, which since R5RS is defined using macros in terms of lambda. The body of the let is not evaluated in the environment in which the let occurs (which is the only option with normal functions), but in a new environment which has a set of variables defined in it.
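The let-in-terms-of-lambda definition can be sketched with syntax-rules (I call it my-let so as not to shadow the built-in):

```scheme
;; (my-let ((x 1) (y 2)) body) expands to an immediately applied
;; lambda: ((lambda (x y) body) 1 2). The body thus runs in a NEW
;; environment binding x and y, not the environment of the call site.
(define-syntax my-let
  (syntax-rules ()
    ((_ ((var val) ...) body ...)
     ((lambda (var ...) body ...) val ...))))

(my-let ((x 1) (y 2))
  (+ x y))   ; => 3
```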

Paul Graham's `On Lisp' has a pretty good, well-motivated introduction to macros in Common LISP. A nice thing when using Scheme's syntax-rules is that this is a hygienic macro system, so you don't need to worry about the variable capture that happens in Common LISP.
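The classic illustration of hygiene is a swap macro whose internal temporary cannot capture the user's variables, because syntax-rules renames identifiers introduced by the macro:

```scheme
;; The tmp introduced by the macro is renamed by the expander,
;; so calling (swap! tmp x) on a user variable also named tmp
;; still works -- exactly the capture defmacro would suffer.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))

(define tmp 1)
(define x 2)
(swap! tmp x)
;; tmp is now 2, x is now 1 -- no capture despite the name clash
```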

Last thing: I'm interested in your ideas about debugging. What do you mean by hidden macros? Do you mean redefining normal features of the language, like define, etc. to include tests and traces?

bjf: English has its tricky bits: English has lots of compound verbs that sound like good, simple English, but in fact are completely new usages with a somewhat complex grammar, such as `She was put off by his tacky t-shirt', `They set out to destroy the evidence', etc. Gender is a real nuisance, but I don't think case is all that bad: it's a bit of a pain when one first learns the language, but it's quite well behaved, and when you've learnt it, that's it. And when you are comfortable with case, you find it gives you some opportunities to improve the way you put things.

sej: I wasn't saying I agreed with the Economist article, actually I think the last thing the Middle East needs is a heavy handed US intervention, but what I found interesting about the article is that the authors don't think peace between Israel and Palestine is hopeless. Do you think a UN-led peacekeeping force would have any chance? I have to say I am pessimistic.

16 Apr 2002 (updated 16 Apr 2002 at 16:22 UTC) »
Israel, Palestine: This article at the Economist makes an interesting argument. It claims that since moderate attempts at restoring peace are bound to fail, due firstly to the gulf that exists between most Palestinians and Israelis and secondly to the impossibility of a ceasefire with no prospect of a settlement acceptable to both parties, the only hope for a peaceful settlement lies in a more ambitious plan that America attempts to impose on both parties.

duncanm: The argument only works for spoken languages; the goodness of the written form of languages varies. English has an unusually bad written form, where learning the spelling of words has to be done by rote, with no masterable system behind it. By contrast German is excellent: learning the system relating the written form of the language to its pronunciation is a relatively simple thing. The cost of English's poor written form shows up in English native speakers lagging behind in international literacy comparisons, high rates of semi-literacy amongst adults in countries with universal education systems, etc. Congrats on the North Eastern post, btw...
