Older blog entries for chalst (starting at number 36)

zhaoway: No, macros' ability to control environments has nothing to do with currying. The point I was making about let is simply that there is no way one could define let as a normal function, ie. (define (let binders body) ...).

Let me try to make the point another way. If I have a function call (f arg), arg is evaluated in the current environment. If I have a macro call (m arg), the macro can create arbitrary computational contexts for arg: it may refer to variables that do not occur in the current computational context, its evaluation may be delayed until various I/O operations or updates to global variables have been performed, or some other such thing. Not only that, but we can perform sophisticated transformations on arg, eg. a CPS transformation by structural recursion on the given code fragments (Jeffrey Mark Siskind's Screamer, which integrates a constraint-solving language into Common LISP, is based on this technique).
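
To make this concrete, here is a minimal sketch (the name with-datum is made up for illustration) of a macro that both delays the evaluation of its argument until some I/O has been performed and evaluates it under a binding the call site never set up:

;; (with-datum prompt var body) prints prompt, reads a datum, binds
;; it to var, and only then evaluates body.  A normal function could
;; do none of this: its arguments would be evaluated, in the caller's
;; environment, before the call even happened.
(define-syntax with-datum
  (syntax-rules ()
    ((_ prompt var body)
     (begin
       (display prompt)
       (let ((var (read)))
         body)))))

;; (with-datum "x? " x (* x x)) reads a number and squares it:
;; x is bound by the macro, not by the surrounding code.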

18 Apr 2002 (updated 18 Apr 2002 at 15:54 UTC) »
zhaoway: While important, the main use of macros is not efficiency so much as the ability to manipulate syntax. The main application I find myself using macros for, and one crucial to the `little languages' approach, is the ability to control the environment in which computations are performed. Think about the let special form, which since R5RS has been defined using macros in terms of lambda. The body of the let is not evaluated in the environment in which the let occurs (the only option with normal functions), but in a new environment which has a set of variables defined in it.
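
The definition is short enough to quote, roughly as R5RS gives it (I omit the named-let rule):

(define-syntax let
  (syntax-rules ()
    ((let ((name val) ...) body1 body2 ...)
     ((lambda (name ...) body1 body2 ...)
      val ...))))

;; (let ((x 1) (y 2)) (+ x y)) expands to ((lambda (x y) (+ x y)) 1 2):
;; the body runs in the new environment created by the lambda.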

Paul Graham's `On Lisp' has a pretty good, well-motivated introduction to macros in Common LISP. A nice thing about Scheme's syntax-rules is that it is a hygienic macro system, so you don't need to worry about the variable capture that can happen in Common LISP.
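
A quick illustration of what hygiene buys you (my-or is just an illustrative name):

;; my-or must bind a temporary so that e1 is evaluated only once.
;; In Common LISP that temporary could capture a variable of the
;; same name in the user's code; syntax-rules renames it
;; automatically, so it can't.
(define-syntax my-or
  (syntax-rules ()
    ((_) #f)
    ((_ e) e)
    ((_ e1 e2 ...)
     (let ((t e1)) (if t t (my-or e2 ...))))))

;; (let ((t 5)) (my-or #f t)) => 5, not #f: the user's t is untouched.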

Last thing: I'm interested in your ideas about debugging. What do you mean by hidden macros? Do you mean redefining normal features of the language, like define, etc. to include tests and traces?

bjf: English has its tricky bits: it has lots of compound verbs that sound like good, simple English but are in fact completely new usages with a somewhat complex grammar, such as `She was put off by his tacky t-shirt', `They set out to destroy the evidence', etc. Gender is a real nuisance, but I don't think case is all that bad: it's a bit of a pain when one first learns the language, but it's quite well behaved, and once you've learnt it, that's it. And when you are comfortable with case, you find it gives you some opportunities to improve the way you put things.

sej: I wasn't saying I agreed with the Economist article; actually I think the last thing the Middle East needs is a heavy-handed US intervention. What I found interesting about the article is that the authors don't think peace between Israel and Palestine is hopeless. Do you think a UN-led peacekeeping force would have any chance? I have to say I am pessimistic.

16 Apr 2002 (updated 16 Apr 2002 at 16:22 UTC) »
Israel, Palestine: This article at the Economist makes an interesting argument. It claims that moderate attempts at restoring peace are bound to fail, firstly because of the gulf that exists between most Palestinians and Israelis, and secondly because of the impossibility of a ceasefire with no prospect of a settlement acceptable to both parties; the only hope for a peaceful settlement therefore lies in a more ambitious plan that America imposes on both parties.

duncanm: The argument only works for spoken languages; the quality of the written forms of languages varies. English has an unusually bad written form, where the spelling of words has to be learnt by rote, with no masterable system behind it. By contrast, German's is excellent: learning the system relating the written form of the language to its pronunciation is a relatively simple thing. The cost of English's poor written form shows up in native English speakers lagging behind in international literacy league tables, high rates of semi-literacy amongst adults in countries with universal education systems, etc. Congrats on the North Eastern post, btw...

arc: I've a few ideas about programming languages that I think might be right for Paul Graham's arc language. I'll try to get them a bit more polished than this and then send them in, but here are the general themes:
  • Alan Bawden suggested that arc might benefit from a module system based on his `first-class macros', ie. where procedures can take macros as arguments and still have everything resolved at compile time. I think this would be wonderful, and would really allow arc to stand out as a technically innovative programming language.
  • `Unix won'
    1. Case sensitivity: I'm not sure about this - Unix has won the battles so far, but it still might be displaced by an MS/NT system as the de facto server standard. A practical consequence: it can sometimes be a bit of a nuisance to handle MS filesystems using a case-sensitive symbol system. I'd like to see this be something the user can configure at run-time. I'm putting together a proposal on how to handle this.
    2. UNIX awareness: This is a good thing, but I'd like to see an attempt to be multi-platform. Python has got this almost right, with language design focussed on what is practical to do on all platforms. Having said this, sometimes Python has a lowest-common-denominator feel. Perhaps having parallel arc/UNIX, arc/JVM and arc.NET implementations, with an attempt to keep the intersection of the three (Portable arc?) as large as possible, would be a good thing?
  • Soft typing: I'd like to see language support for soft typing, perhaps based on intersection types.
  • Infix/currying/laziness: I'd like to see something making it easy to incorporate Haskell-style user-definable infix operators, currying, and lazy functions/list manipulation. I've an idea that this can all be done with special forms (there is a sketch below, after this list):
    1. We have a [...] special form that allows infix notation of non-function types.
    2. We have a [> ...] special form that allows curried infix notation with eager semantics: the syntax is as before but we allow `_' parameters which are treated as parameters of a fn/lambda construction (so [_ + _] is (fn (x y) (+ x y)));
    3. We have a [< ...] special form, similar to the eager form, that allows curried infix notation with lazy semantics, handled using some future-like mechanism;
    All of these would macro-expand into normal S-expressions.

    The idea behind the `<' and `>' mnemonic is that eager reduction strategies in the lambda calculus tend to evaluate beta redexes further to the right than lazy reduction strategies do. This would allow a lot of the Bird-Meertens formalism to be modelled painlessly in arc.

  • Regexps: I'd like to see support for regular expressions in the core language, perhaps following the proposal of Olin Shivers.
  • Lexemes: I've an idea about extending the usual treatment of environments with ideas from combinator parsing: we extend environments so that instead of just mapping symbols to values, they can also map patterns standing for infinitely many symbols to combinator parsers that recursively build up an expression. This might be nice for handling Perl-style regexps.
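
Back to the infix forms: since arc exists only on paper, here is a rough Scheme sketch of how the eager [> ...] form might macro-expand (infix> is a made-up name, I only handle the binary case, and I rely on the R5RS reading of syntax-rules, where _ is an ordinary identifier that may appear in the literals list):

(define-syntax infix>
  (syntax-rules (_)
    ((infix> _ op _) (lambda (x y) (op x y)))  ; [_ + _] is (fn (x y) (+ x y))
    ((infix> _ op b) (lambda (x) (op x b)))    ; [_ + 1] is (fn (x) (+ x 1))
    ((infix> a op _) (lambda (y) (op a y)))    ; [1 + _] is (fn (y) (+ 1 y))
    ((infix> a op b) (op a b))))               ; [1 + 2] is just 3

;; ((infix> _ + _) 1 2) => 3, ((infix> 10 - _) 3) => 7.  The lazy
;; [< ...] form could expand the same way, but wrapping its operands
;; in delay and forcing them only when needed.
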
tk: The anti-Chomsky article is incompetent. I am not a linguist (my wife is, though), and certainly no fan of Chomsky, but I have no difficulty finding reams of falsehoods and fallacies in this article. Just a few of the worst, up to the point where I couldn't face reading any more:
  • Chomsky abandoned the TGG theory in the late 1980s in favour of his minimalist program. It's really astonishing for the article simply to omit mentioning this; were the authors perhaps unaware of it? His book on the program has been around since 1995.
  • Most of the earlier points are devoted to ridiculing a view Chomskians don't hold, namely that linguistic competence is exhausted by a description of the syntactic constructions of language.
  • Point 7, poverty of the stimulus: ``It is clear that those who make such a claim have never even once seriously studied the behavior of infants or remotely bothered to consider what babies actually experience in their daily lives.'' The underdetermination of response by stimulus is widely held, not just by Chomskian linguists, but across most of cognitive science and by almost all developmental psychologists. Why do the authors, neither of whom is a psychologist, think they are the experts here, and why don't they provide arguments?
  • Point 8, innateness of language: my wife (who is a linguist, and also no Chomsky fan) reckons most non-Chomskians agree that language is innate; linguistics as a whole recognises this as a genuine controversy.
  • Point 9: do the authors seriously assert that there is a finite set of sentences that humans are capable of grasping? Perhaps they would care to enumerate them?
  • Point 10: Chomskians do distinguish between first language acquisition and later language acquisition.
Wow, pretty bad going. But apparently the authors win all their fights on sci.lang, so they must be right anyway!

I think we've got to wait a bit before a really good exposé of what is wrong with the Chomsky cult appears. I'm afraid this isn't it. At least they didn't resort to the `self-hating Jew' argument... no wait, there it is in point 35. *sigh*

9 Apr 2002 (updated 9 Apr 2002 at 19:12 UTC) »

Putting the finishing touches on an article that's been accepted by the JCSS. Nice to see the end of it. My contact with the publisher has been good, but it kind of sticks in the throat having to sign away my copyright to a giant publisher.

zhaoway: I'm interested to hear about your progress with Clean: I've never used it, but I've heard rather a lot about it, and I'm particularly interested in the attempt to give semantics for it using term-graph rewriting. I'm also interested in the idea of using ILL (intuitionistic linear logic) inspired ideas about control of resources.

pom: Good lead, and good work. I've been looking for something like pigale for a while.

Postscript: Does anyone know of a good way to switch off the Ctrl-S/Ctrl-Q suspend/resume terminal behaviour? By good I mean a simple way that works with most shells, on weird UNIX systems, and in almost all tty-providing contexts.

Lots of pressure: conference deadline 1st April and I haven't even proven all my results yet. And it's to be coauthored, and my coauthor hasn't seen any of my new results in over a year. Better get moving...

raph: Very nice post; it has got me thinking... Some immediate reactions: not all LISP-like implementations have bad FFIs: in the Scheme world guile and scheme->C have good FFIs, and scheme48 (my favourite dialect) has a reasonable FFI. I think FFIs in the Scheme world are better, on the whole, than in the Common LISP world, but that may be just my prejudice speaking. Scsh (built on scheme48, and with a forthcoming guile implementation) has excellent I/O facilities, maybe the best of any language I know (I know C, Scheme, C++, Java and Python reasonably well).

Dynamic languages need not be inefficient: check out the papers on soft typing in the Rice repository (especially Matthias Felleisen's papers). The basic idea behind soft typing is that when you apply type inference with a reasonable type system to a dynamic language, most functions turn out to be typable, so you only need the run-time overhead of dynamic type dispatch where it is really needed.
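
A tiny made-up example of the idea (not taken from the Rice papers):

;; A soft typer can infer that f maps numbers to numbers, so no
;; run-time type dispatch is needed inside its body at all...
(define (f x) (* x x))

;; ...whereas g hands f a value read from the outside world, whose
;; type cannot be known statically; only here does a residual
;; run-time check (that the value really is a number) need to remain.
(define (g) (f (read)))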

Biotechnology: Read this recent Economist book review; I think I will buy the book. I am impressed by the potential of DNA computing (essentially it is engineering with life-like processes). This technology sidesteps the ethical difficulties and ecological dangers associated with mainstream biotechnology, and, it seems to me, it has pretty much all the medical and agricultural potential of biotechnology (eg. DNA machines can produce just the same range of proteins as live DNA, so it seems reasonable to suppose that anything that can be done by splicing genes into organisms can be done by DNA `helpers' living symbiotically inside an untampered-with organism). I've not heard this thought before, so I thought I'd get it off my chest.

jfleck: Got to say, I've never found a Zippy cartoon funny.

18 Mar 2002 (updated 18 Mar 2002 at 21:48 UTC) »

Hi again, advogato. Three-week surprise honeymoon in South Africa. Unsurprisingly, mixed feelings about being back... Three and a half weeks without any responsibility to read email is something I definitely like the idea of doing from time to time...

zhaoway: I find `which language is best' arguments uniformly tiresome. Much better to ask what the strengths and weaknesses of particular languages are, eg. for programming a particular kind of task, for future employability of developers and maintainers, for compiler efficiency, for expressive completeness, for FFI, etc. Python clearly has many strengths that LISP/scheme lack, and vice versa. Python seems better suited to the neophyte computer programmer than scheme/LISP, with fewer pitfalls early on the path to learning and a more widely appreciated syntax. It also has perhaps the most user-responsive developer community of any programming language. Schemers invented the RFI process; Pythonians made it work.

The advantages of LISP/scheme come later in a programmer's development: expressive completeness, programs writing programs, concurrency/distributed computing with continuations, control over the compilation process. I think it is a pretty elitist language, despite the TeachScheme! initiative, with correspondingly poor network effects (fewer people working on compilers, fewer chances of replacing key developers, fewer off-the-shelf libraries).

Goodbye dear diary. I'm getting married tomorrow and won't get back from my Hochzeitsreise (honeymoon) until mid-March. Cheerio, Advogato!

zhaoway: Hmm. Closures are a piece of implementation technology: they ensure that a function can be invoked recursively without the latest invocation messing up the older invocations' local variables. They've been around for a while, but closures can be stack-allocated (as C allocates its activation records) or heap-allocated (as Scheme does). To have continuations and proper tail recursion you need to heap-allocate your closures.
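
The standard illustration of why Scheme's closures can't live on a stack:

;; The closure returned by make-counter outlives the call that
;; created it, so the binding of n cannot sit in a stack frame that
;; is popped on return: it has to survive on the heap.
(define (make-counter)
  (let ((n 0))
    (lambda ()
      (set! n (+ n 1))
      n)))

;; (define c (make-counter))
;; (c) => 1, (c) => 2: n persists between calls.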

Currying is *quite* different: it is a piece of semantics. It is a correspondence between functions of two arguments and functions of one argument that return a function of one argument, eg. currying this:
(lambda (x y) (+ x y))
gives you this:
(lambda (x) (lambda (y) (+ x y)))

Languages like ML and Haskell use currying implicitly in their syntax; it can be quite nice to have. You can use macros to do the same thing in Scheme, though not many people do.
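
For instance, here is a minimal sketch of such a macro (clambda is a name I've made up): a lambda that curries itself one argument at a time:

;; (clambda (x y z) body) expands to
;; (lambda (x) (lambda (y) (lambda (z) body))).
(define-syntax clambda
  (syntax-rules ()
    ((_ (x) body ...)
     (lambda (x) body ...))
    ((_ (x y ...) body ...)
     (lambda (x) (clambda (y ...) body ...)))))

;; (((clambda (x y) (+ x y)) 1) 2) => 3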

