Older blog entries for chalst (starting at number 39)

Joel Spolsky thinks Bertrand Meyer's article on .NET language independence supports what he says about the weakness of language independence for .NET. I disagree: while .NET doesn't address many of the reasons Joel cites for moving to other languages, as Bertrand says it is a good framework for most modern languages. It is good enough to support LISP and Scheme, which, as Paul Graham says, is the model new programming languages are evolving towards, and which the JVM doesn't support due to its lack of tail-recursive function calls.

While I'm on the subject of .NET, I want to say again that I don't think Gnome on Mono is a good idea, for two reasons:

  • MS's aims with .NET are anti-competitive and will hurt free software;
  • Free software support for Java is further advanced, and it is better to build on what we already have than to support .NET.
However I do think .NET is innovative and important, and so is something the free software world should properly understand. A better strategy would be to put more effort into free software Java-based efforts, and pressurise Sun into remedying Java's defects (eg. the prejudicial framework for J2EE certification, the weakness of the Java Community Process, the lack of tail-recursive calls in the JVM). Sun is not an ideal free software partner, but it's a damn sight better in almost every respect than MS.

Postscript #1: Since writing this, I read the Interview with Danese Cooper on Slashdot, which seems to suggest that Sun is at last dealing with the open source J2EE problem. It didn't say anything about the problems with the Java Community Process, though, more's the pity.

Postscript #2: The Stallman factor: spot on!

Postscript #3: Certified shlomif as Journeyer because I like his diary entries (a lot), and apparently he has good skills and makes worthwhile contributions to free software.

9 May 2002 (updated 9 May 2002 at 16:46 UTC) »

I guess the criteria I use for apprentice certification are pretty much: shows willing and occasionally posts diary entries. Certified jooon to apprentice after receiving a comment from him about using one's `Notes' section to record the justifications for one's certifications. I don't like that idea because (i) it is messy and (ii) certifications change, and I like to keep a historical record of this kind of activity. Also certified ClimbNorth to apprentice: I'm interested generally in games programming, and I'd like to encourage free software games developers.

I hardly read slashdot anymore, but one thing I've noticed is that they've stopped acknowledging `ping' requests. I think this is unfriendly behaviour (yes, they take a tiny amount of time; yes, slashdot has problems with DoSers; but even so, ping is important network glue, and one should eventually acknowledge at least the first few ping queries), so one thing I do when I remember is have a few processes constantly sending them messages. Maybe I'll put this on my cron scheduler...

Postscript #1: I found this essay on trust and the media at Dan Gillmor's weblog (this year's Reith Lecture; I don't suppose many folks here know what that is). The author's argument is basically that the big media companies currently have a `licence to deceive' that is normally defended by appeals to John Stuart Mill's arguments, which are based on an outmoded view of the press. Good stuff. Dave Winer's idea about weblogs being a way to improve the quality of information is one well-known idea, but I wonder if a system based on (attack-resistant) certification could do better? What would such a system need to do?

Postscript #2: Certified Grit to journeyer: has good skills, does interesting research, has an interesting free software project and writes interesting diary entries.

7 May 2002 (updated 7 May 2002 at 11:35 UTC) »

Very busy at the moment. In the last three days I finished the first draft of a big research proposal, refereed a paper I very much liked, and put the finishing touches on a technical report. Sometimes work goes well; it would be good if I could remember that when things don't go so well.

I've decided to attach information on my certification activity to my diary entries, starting with amars, who I've certified as journeyer because of his comments on and work on PHP: I don't like PHP, but it's important and free, and amars seems to have a good perspective on it and has done apparently useful free software work with it.

Two suggestions for advogato: (i) fix the 404 for http://www.advogato.org/html.html and (ii) give an option to add a URL to certifications so that folks can link to diary entries explaining their certification activity...

Postscript #1: jlbec: I agree that the analogy between Apartheid-era South Africa and Israel isn't perfect, but I find the analogy between the condition of Palestine and the then condition of the homelands to be striking. Also striking is how an obviously unjust and unsustainable status quo is maintained by fear of change and a surreal failure to grasp the real position. I only hope things work out as well in the Middle East as they did in SA.

Postscript #2: Certified mcs to apprentice - sounds like he needs encouragement.

Postscript #3: Joel Spolsky's critique of language independence on .NET seems to miss the point: MS wants to make it easy for existing projects to migrate to the .NET platform. Bring 'em in and lock the door... I like the way he makes the point about the programming language syntax arguments, though - they really are lame.

zhaoway: No, macros' ability to control environments has nothing to do with currying. The point I was making about let is simply that there is no way one could define let as a normal function, ie. (define (let binders body) ...).

Let me try to make the point another way. If I have a function call (f arg), arg is evaluated in the current environment. If I have a macro call (m arg), the macro can create arbitrary computational contexts for arg: it may refer to variables that do not occur in the current computational context, its evaluation may be delayed until various I/O operations are performed, or various updates to global variables are performed, or some other such thing. Not only that, we can do sophisticated transformations on arg, eg. we might perform a CPS transformation on the arguments through a structural recursion on the given code fragments (Jeffrey Mark Siskind's Screamer code, integrating a constraint-solving language into Common LISP, is based on this technique).
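To make the delayed-evaluation case concrete, here is a small sketch of my own (the names on-demand, ran and p are made up for illustration): a macro can wrap its argument in a thunk, so the argument is not evaluated at the call site at all, which is impossible for an ordinary function, since a function always receives its argument already evaluated.

```scheme
(define ran #f)

;; A macro that delays evaluation of its argument by wrapping it
;; in a thunk. (on-demand e) expands to (lambda () e); e itself is
;; never touched until the thunk is called.
(define-syntax on-demand
  (syntax-rules ()
    ((_ e) (lambda () e))))

;; The (begin ...) argument is not evaluated here, so ran stays #f
;; until p is actually called.
(define p (on-demand (begin (set! ran #t) 42)))
```

Had on-demand been a function, ran would already be #t by the time p was bound; calling (p) later both performs the side effect and returns 42.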

18 Apr 2002 (updated 18 Apr 2002 at 15:54 UTC) »
zhaoway: While important, the main use of macros is not efficiency so much as the ability to manipulate syntax. The main application I find myself using macros for, and one crucial to the `little languages' approach, is the ability to control the environment in which computations are performed. Think of the let special form, which since R5RS has been defined using macros in terms of lambda. The body of the let is not evaluated in the environment in which the let occurs (which is the only option with normal functions), but in a new environment which has a set of variables defined in it.
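The R5RS-style expansion can be written down directly. A minimal sketch (I call it my-let so as not to shadow the built-in form):

```scheme
;; let as a macro over lambda, in the style of R5RS:
;;   (my-let ((x 1) (y 2)) body)  =>  ((lambda (x y) body) 1 2)
;; The body is evaluated in the lambda's fresh environment, where
;; x and y are bound -- not in the environment of the call site.
(define-syntax my-let
  (syntax-rules ()
    ((_ ((name val) ...) body1 body2 ...)
     ((lambda (name ...) body1 body2 ...) val ...))))
```

For example, (my-let ((x 1) (y 2)) (+ x y)) expands to ((lambda (x y) (+ x y)) 1 2), which evaluates to 3.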

Paul Graham's `On Lisp' has a pretty good, well-motivated introduction to macros in Common LISP. A nice thing about Scheme's syntax-rules is that it is a hygienic macro system, so you don't need to worry about the variable capture that can happen in Common LISP.
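The classic illustration of hygiene is a swap macro (my sketch; the name swap! is conventional, not from any standard library). The expansion introduces a temporary binding, and hygiene guarantees it cannot capture a caller's variable of the same name:

```scheme
;; A hygienic macro: the `tmp' introduced by the expansion is
;; automatically renamed, so it can never capture a caller's
;; variable that also happens to be called tmp.
(define-syntax swap!
  (syntax-rules ()
    ((_ a b)
     (let ((tmp a))
       (set! a b)
       (set! b tmp)))))
```

In an unhygienic system, (swap! tmp other) would expand so that the caller's tmp collides with the macro's tmp and the swap silently goes wrong; with syntax-rules it just works.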

Last thing: I'm interested in your ideas about debugging. What do you mean by hidden macros? Do you mean redefining normal features of the language, like define, etc. to include tests and traces?

bjf: English has its tricky bits: English has lots of compound verbs that sound like good, simple English but are in fact completely new usages with a somewhat complex grammar, such as `She was put off by his tacky t-shirt', `They set out to destroy the evidence', etc. Gender is a real nuisance, but I don't think case is all that bad: it's a bit of a pain when one first learns the language, but it's quite well behaved, and once you've learnt it, that's it. And when you are comfortable with case, you find it gives you some opportunities to improve the way you put things.

sej: I wasn't saying I agreed with the Economist article, actually I think the last thing the Middle East needs is a heavy handed US intervention, but what I found interesting about the article is that the authors don't think peace between Israel and Palestine is hopeless. Do you think a UN-led peacekeeping force would have any chance? I have to say I am pessimistic.

16 Apr 2002 (updated 16 Apr 2002 at 16:22 UTC) »
Israel, Palestine: This article at the Economist makes an interesting argument. It claims that since moderate attempts at restoring peace are bound to fail, due firstly to the gulf that exists between most Palestinians and Israelis and secondly to the impossibility of a ceasefire with no prospect of a settlement acceptable to both parties, the only hope for a peaceful settlement lies in a more ambitious plan that America attempts to impose on both parties.

duncanm: The argument only works for spoken languages; the quality of the written form of languages varies. English has an unusually bad written form, where the spelling of words has to be learnt by rote, with no masterable system behind it. By contrast, German is excellent: learning the system relating the written form of the language to its pronunciation is a relatively simple thing. The cost of English's poor written form is measured in native English speakers lagging behind in international literacy league tables, high rates of semi-literacy amongst adults in countries with universal education systems, etc. Congrats on the North Eastern post, btw...

arc: I've a few ideas about programming languages that I think might be right for Paul Graham's arc language. I'll try to get them a bit more polished than this and then send them in, but here are the general themes:
  • Alan Bawden suggested that arc might benefit from a module system based on his `first-class macros', ie. where procedures can take macros as arguments and still have everything resolved at compile time. I think this would be wonderful, and would really allow arc to stand out as a technically innovative programming language.
  • `Unix won'
    1. Case sensitivity: I'm not sure about this - Unix has won the battles so far, but it still might be displaced by an MS/NT system as the de facto server standard. A practical consequence: it can sometimes be a bit of a nuisance to handle MS filesystems using a case sensitive symbol system. I'd like to see a way to make this something that the user can configure at run-time. I'm putting together a proposal on how to handle this.
    2. UNIX awareness: This is a good thing, but I'd like to see an attempt to be multi-platform. Python has got this almost right, with language design being focussed around what is practical to do on all platforms. Having said this, sometimes Python has a lowest-common-denominator feel. Perhaps having parallel arc/UNIX, arc/JVM and arc.NET implementations, with an attempt to keep the intersection of the three (Portable arc?) as large as possible, would be a good thing?
  • Soft typing: I'd like to see language support for soft typing, perhaps based on intersection types.
  • Infix/currying/laziness: I'd like to see something making it easy to incorporate Haskell-style user-definable infix operators, currying and lazy functions/list manipulations. I've an idea that this can all be done with special forms:
    1. We have a [...] special form that allows infix notation of non-function types.
    2. We have a [> ...] special form that allows curried infix notation with eager semantics: the syntax is as before but we allow `_' parameters which are treated as parameters of a fn/lambda construction (so [_ + _] is (fn (x y) (+ x y)));
    3. We have a [< ...] special form, similar to the eager form, that allows curried infix notation with lazy semantics, handled using some future-like mechanism;
    All of these would macro-expand into normal S-expressions.

    The idea behind the `<' and `>' mnemonic is that eager reduction strategies in the lambda calculus tend to evaluate beta redexes further to the right than lazy reduction strategies do. This would allow a lot of the Bird--Meertens formalism to be modelled painlessly in arc;

  • Regexps: I'd like to see support for regular expressions in the core language, perhaps following the proposal of Olin Shivers.
  • Lexemes: I've an idea about extending the usual treatment of environments with combinator-parsing ideas: we extend environments so that instead of just mapping symbols to values, they also allow us to map patterns that can stand for infinitely many symbols to combinator parsers that recursively build up an expression. This might be nice for handling Perl-style regexps.
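The eager curried-infix idea above can be prototyped today in Scheme as an ordinary macro. This is my own sketch, not a proposed implementation: I write (infix> ...) instead of [> ...] reader syntax, and use SRFI 26-style <> holes in place of `_' (which is a reserved wildcard in R7RS syntax-rules patterns):

```scheme
;; Sketch of the proposed eager curried-infix form. Each <> hole
;; becomes a parameter of the resulting function; with no holes,
;; the expression is just evaluated as (op a b).
(define-syntax infix>
  (syntax-rules (<>)
    ((_ <> op <>) (lambda (x y) (op x y)))   ; both operands curried
    ((_ a  op <>) (lambda (y) (op a y)))     ; right section
    ((_ <> op b ) (lambda (x) (op x b)))     ; left section
    ((_ a  op b ) (op a b))))                ; fully applied
```

So (infix> <> + <>) behaves like (fn (x y) (+ x y)) in the diary's notation, and (infix> 10 - <>) is the function that subtracts its argument from 10. A real [> ...] form would also need to handle chains of operators and precedence, which this sketch ignores.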
tk: The anti-Chomsky article is incompetent. I am not a linguist (my wife is, though), and certainly no fan of Chomsky, but I have no difficulty finding reams of falsehoods and fallacies in this article. Just a few of the worst, up to the point where I couldn't face reading any more:
  • Chomsky abandoned the TGG theory in the late 1980s in favour of his minimalist program. It's really astonishing for the article simply to omit mentioning this -- were the authors perhaps unaware of it? His book on the program has been around since 1995.
  • Most of the earlier points are devoted to ridiculing a view Chomskians don't hold, namely that linguistic competence is exhausted by a description of the syntactic constructions of language.
  • Point 7, poverty of the stimulus: ``It is clear that those who make such a claim have never even once seriously studied the behavior of infants or remotely bothered to consider what babies actually experience in their daily lives.'' The underdetermination of response by stimulus is widely held, not just by Chomskian linguists, but across most of the field of cognitive science, and by almost all developmental psychologists. Why do the authors, neither of whom is a psychologist, think they are the experts? And why don't they provide arguments here?
  • Point 8, innateness of language: my wife (who is a linguist, and also not a Chomsky fan) reckons most non-Chomskians agree language is innate. Linguistics as a whole recognises this as a controversy.
  • Point 9: do the authors seriously assert that there are a finite set of sentences that humans are capable of grasping? Perhaps they would care to enumerate them?
  • Point 10: Chomskians do distinguish between first language acquisition and later language acquisition.
Wow, pretty bad going. But apparently the authors win all their fights on sci.lang, so they must be right anyway!

I think we've got to wait a bit before a really good exposé of what is wrong with the Chomsky cult appears. I'm afraid this isn't it. At least they didn't resort to the `self-hating Jew' argument... no wait, there it is in point 35. *sigh*

9 Apr 2002 (updated 9 Apr 2002 at 19:12 UTC) »

Putting the finishing touches on an article that's been accepted by the JCSS. Nice to see the end of it. My contact with the publisher has been good, but it kind of sticks in the throat having to sign away my copyright to a giant publisher.

zhaoway: I'm interested to hear about your progress with Clean: I've never used it, but I've heard rather a lot about it, and I'm particularly interested in the attempt to give a semantics for it using term-graph rewriting. I'm also interested in the idea of using ILL-inspired ideas about control of resources.

pom: Good lead, and good work. I've been looking for something like pigale for a while.

Postscript: Does anyone know of a good way to switch off the Ctrl-S/Ctrl-Q SUSPEND/RESUME terminal behaviour? By good I mean a simple way that works in most shells, on weird UNIX systems, and in almost all tty-providing contexts.

Lots of pressure: conference deadline 1st April and I haven't even proven all my results yet. And it's to be coauthored, and my coauthor hasn't seen any of my new results in over a year. Better get moving...

raph: Very nice post; it has got me thinking... Some immediate reactions: not all LISP-like implementations have bad FFIs: in the Scheme world guile and scheme->C have good FFIs, and scheme48 (my favourite dialect) has a reasonable FFI. I think FFIs in the Scheme world are better, on the whole, than in the Common LISP world, but that may be just my prejudice speaking. Scsh (built on scheme48 and with a forthcoming guile implementation) has excellent I/O facilities, maybe the best of any language I know (I know C, Scheme, C++, Java and Python reasonably well).

Dynamic languages need not be inefficient: check out the papers on soft typing at the Rice repository (especially Matthias Felleisen's papers). The basic idea behind soft typing is that when you apply type inference with reasonable type systems to dynamic languages, most functions are typable, so you only need the run-time overhead of dynamic type dispatch where it is really needed.

Biotechnology: Read this recent Economist book review; I think I will buy the book. I am impressed by the potential of DNA computing (essentially it is engineering with life-like processes). This technology sidesteps the ethical difficulties and ecological dangers associated with mainstream biotechnology, and, it seems to me, it has pretty much all the medical and agricultural potential of biotechnology (eg. DNA machines can produce just the same range of proteins as live DNA, so it seems reasonable to suppose that anything that can be done by splicing genes into organisms can be done by DNA `helpers' living symbiotically inside an untampered-with organism). I've not heard this thought before, so I thought I'd get it off my chest.

jfleck: Got to say, I've never found a Zippy cartoon funny.
