Older blog entries for chalst (starting at number 34)

16 Apr 2002 (updated 16 Apr 2002 at 16:22 UTC) »
Israel, Palestine: This article at the Economist makes an interesting argument. It claims that moderate attempts at restoring peace are bound to fail, firstly because of the gulf that exists between most Palestinians and Israelis, and secondly because a ceasefire is impossible with no prospect of a settlement acceptable to both parties; hence the only hope for a peaceful settlement lies in a more ambitious plan that America attempts to impose on both parties.

duncanm: The argument only works for spoken languages; the quality of the written form varies from language to language. English has an unusually bad written form, where the spelling of words has to be learnt by rote, with no masterable system behind it. By contrast German is excellent: learning the system relating the written form of the language to its pronunciation is a relatively simple thing. The cost of English's poor written form is measured in native English speakers lagging behind in international comparisons, in high rates of semi-literacy amongst adults in countries with universal education systems, etc. Congrats on the North Eastern post, btw...

arc: I've a few ideas about programming languages that I think might be right for Paul Graham's arc language. I'll try to get them a bit more polished than this and then send them in, but here are the general themes:
  • Alan Bawden suggested that arc might benefit from a module system based on his `first-class macros', ie. where procedures can take macros as arguments and still have everything resolved at compile time. I think this would be wonderful, and would really allow arc to stand out as a technically innovative programming language.
  • `Unix won'
    1. Case sensitivity: I'm not sure about this - Unix has won the battles so far, but it still might be displaced by an MS/NT system as the de facto server standard. A practical consequence: it can sometimes be a bit of a nuisance to handle MS filesystems using a case sensitive symbol system. I'd like to see a way to make this something that the user can configure at run-time. I'm putting together a proposal on how to handle this.
    2. UNIX awareness: This is a good thing, but I'd like to see an attempt to be multi-platform. Python has got this almost right, with language design being focussed around what is practical to do on all platforms. Having said this, sometimes Python has a lowest-common-denominator feel. Perhaps having parallel arc/UNIX, arc/JVM and arc.NET implementations, with an attempt to keep the intersection of the three (Portable arc?) as large as possible, would be a good thing?
  • Soft typing: I'd like to see language support for soft typing, perhaps based on intersection types.
  • Infix/currying/laziness: I'd like to see something making it easy to incorporate Haskell-style user-definable infix operators, currying and lazy function/list manipulations. I've an idea that this can all be done with special forms:
    1. We have a [...] special form that allows infix notation of non-function types.
    2. We have a [> ...] special form that allows curried infix notation with eager semantics: the syntax is as before but we allow `_' parameters which are treated as parameters of a fn/lambda construction (so [_ + _] is (fn (x y) (+ x y)));
    3. We have a [< ...] special form similar to the eager form, that allows curried infix notation with lazy semantics, handled using some future-like mechanism;
    All of these would macro-expand into normal S-expressions.

    The idea behind the `<' and `>' mnemonic is that eager reduction strategies in the lambda calculus tend to evaluate beta redexes further to the right than lazy reduction strategies do. This would allow a lot of the Bird--Meertens formalism to be modelled painlessly in arc;
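The `_' placeholder idea can be mimicked (clumsily) in an existing language to see how the expansion would behave. Here is a toy Python sketch of the eager [> ...] form for binary operators; `section' and the placeholder object are my own names for illustration, not part of any arc proposal:

```python
import operator

_ = object()  # stands in for the `_' parameter of the proposal

def section(op, left=_, right=_):
    """Toy model of the proposed eager [> ...] form: operands replaced
    by the placeholder become parameters of the resulting function."""
    if left is _ and right is _:
        return lambda x, y: op(x, y)      # [_ + _]  ->  (fn (x y) (+ x y))
    if left is _:
        return lambda x: op(x, right)     # [_ + 1]
    if right is _:
        return lambda x: op(left, x)      # [1 + _]
    return op(left, right)                # [1 + 2]  ->  3

add = section(operator.add)               # like [_ + _]
assert add(2, 3) == 5
inc = section(operator.add, right=1)      # like [_ + 1]
assert inc(41) == 42
```

In arc itself this would of course be a macro expansion rather than a run-time function, so the dispatch on missing operands would happen at compile time.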

  • Regexps: I'd like to see support for regular expressions in the core language, perhaps following the proposal of Olin Shivers.
  • Lexemes: I've an idea about extending the usual treatment of environments with combinator-parsing-like ideas: we extend environments so that instead of just mapping symbols to values, they also allow us to map patterns that can stand for infinitely many symbols to combinator parsers that recursively build up an expression. This might be nice for handling Perl-style regexps.
tk: The anti-Chomsky article is incompetent. I am not a linguist (my wife is, though), and certainly no fan of Chomsky, but I have no difficulty finding reams of falsehoods and fallacies in this article. Just a few of the worst, up to the point where I couldn't face reading any more:
  • Chomsky abandoned the TGG theory in the late 1980s in favour of his minimalist program. It's really astonishing for the article simply to omit mentioning this -- were the authors perhaps unaware of it? His book on the program has been around since 1995.
  • Most of the earlier points are devoted to ridiculing a view Chomskians don't hold, namely that linguistic competence is exhausted by a description of the syntactic constructions of language.
  • Point 7, poverty of the stimulus: ``It is clear that those who make such a claim have never even once seriously studied the behavior of infants or remotely bothered to consider what babies actually experience in their daily lives.'' The underdetermination of response by stimulus is widely held, not just by Chomskian linguists, but across most of the field of cognitive science, and by almost all developmental psychologists. Why do the authors, neither of whom is a psychologist, think they are the experts, and why don't they provide arguments here?
  • Point 8: innateness of language: my wife (who is a linguist, also not a Chomsky fan) reckons most non-Chomskians agree language is innate. Linguistics as a whole recognises this as a controversy.
  • Point 9: do the authors seriously assert that there are a finite set of sentences that humans are capable of grasping? Perhaps they would care to enumerate them?
  • Point 10: Chomskians do distinguish between first language acquisition and later language acquisition.
Wow, pretty bad going. But apparently the authors win all their fights on sci.lang, so they must be right anyway!

I think we've got to wait a bit before a really good exposé of what is wrong with the Chomsky cult appears. I'm afraid this isn't it. At least they didn't resort to the `self-hating Jew' argument... no wait, there it is in point 35. *sigh*

9 Apr 2002 (updated 9 Apr 2002 at 19:12 UTC) »

Putting the finishing touches on an article that's been accepted by the JCSS. Nice to see the end of it. My contact with the publisher has been good, but it kind of sticks in the throat having to sign my copyright over to a giant publisher.

zhaoway: I'm interested to hear about your progress with Clean: I've never used it, but I've heard rather a lot about it, and I'm particularly interested in the attempt to give a semantics for it using term-graph rewriting. I'm also interested in the idea of using ILL-inspired ideas about control of resources.

pom: Good lead, and good work. I've been looking for something like pigale for a while.

Postscript: Does anyone know of a good way to switch off the Ctrl-S/Ctrl-Q SUSPEND/RESUME terminal behaviour? By good I mean a simple way that works in most shells, on weird UNIX systems, and in almost all tty-providing contexts.
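The usual per-shell incantation is `stty -ixon', but that has to be repeated in every context. As a sketch of doing it programmatically, here is a Python fragment using the standard termios module; the function name is mine, and this only fixes the current tty, not every context the question asks about:

```python
import os, sys, termios

def disable_xon_xoff(fd=None):
    """Clear the IXON input flag so Ctrl-S/Ctrl-Q stop suspending and
    resuming terminal output.  Returns False when fd is not a tty,
    so the call is harmless in scripts run over pipes."""
    if fd is None:
        fd = sys.stdin.fileno()
    if not os.isatty(fd):
        return False
    attrs = termios.tcgetattr(fd)
    attrs[0] &= ~termios.IXON      # attrs[0] holds the input mode flags
    termios.tcsetattr(fd, termios.TCSANOW, attrs)
    return True
```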

Lots of pressure: conference deadline 1st April and I haven't even proven all my results yet. And it's to be coauthored, and my coauthor hasn't seen any of my new results in over a year. Better get moving...

raph: Very nice post, it's got me thinking... Some immediate reactions: not all LISP-like implementations have bad FFIs: in the scheme world guile and scheme->C have good FFIs and scheme48 (my favourite dialect) has a reasonable FFI. I think FFIs in the scheme world are better, on the whole, than in the Common LISP world, but that may be just my prejudice speaking. Scsh (built on scheme48 and with a forthcoming guile implementation) has excellent IO facilities, maybe the best of any language I know (I know C, Scheme, C++, Java and Python reasonably well).

Dynamic languages need not be inefficient: check out the papers on soft typing in the Rice repository (especially Matthias Felleisen's papers). The basic idea behind soft typing is that when you apply type inference with a reasonable type system to a dynamic language, most functions turn out to be typable, so you only need the run-time overhead of dynamic type dispatch where it is really needed.
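A toy illustration of where the residual dynamic dispatch lives (mine, in Python for concreteness; this is not Felleisen's actual inference algorithm):

```python
def norm2(x, y):
    # Every use of x and y here is arithmetic, so a soft typer can infer
    # a numeric type and compile this with no run-time type dispatch.
    return x * x + y * y

def describe(v):
    # v's type genuinely varies at run time, so this is the one place a
    # residual dynamic check has to stay.
    if isinstance(v, str):
        return "string of length %d" % len(v)
    return "number %d" % (v + 1)

assert norm2(3, 4) == 25
assert describe("ab") == "string of length 2"
```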

Biotechnology: Read this recent Economist book review; I think I will buy the book. I am impressed by the potential of DNA computing (essentially it is engineering with life-like processes). This technology sidesteps the ethical difficulties and ecological dangers associated with mainstream biotechnology, and, it seems to me, it has pretty much all the medical and agricultural potential of biotechnology (eg. DNA machines can produce just the same range of proteins as live DNA, so it seems reasonable to suppose that anything that can be done by splicing genes into organisms can be done by DNA `helpers' living symbiotically inside an untampered-with organism). I've not heard this thought before, so I thought I'd get it off my chest.

jfleck: Got to say, I've never found a Zippy cartoon funny.

18 Mar 2002 (updated 18 Mar 2002 at 21:48 UTC) »

Hi again, advogato. Back from a three-week surprise honeymoon in South Africa, with unsurprisingly mixed feelings about the return... Three and a half weeks without any responsibility to read email is something I definitely like the idea of doing from time to time...

zhaoway: I find `which language is best' arguments uniformly tiresome. Much better to ask what the strengths and weaknesses of particular languages are, eg. for programming a particular kind of task, for future employability of developers and maintainers, for compiler efficiency, for expressive completeness, for FFI, etc. Python clearly has many strengths that LISP/scheme lack, and vice versa. Python seems better suited to the neophyte computer programmer than scheme/LISP, with fewer pitfalls early on the path to learning and a more widely appreciated syntax. It also has perhaps the most user-responsive developer community of any programming language. Schemers invented the RFI process, pythonians made it work.

The advantages of LISP/scheme come later in a programmer's development: expressive completeness, program-writing programs, concurrency/distributed computing with continuations, control over the compilation process. I think it is a pretty elitist language, despite the TeachScheme! initiative, with correspondingly poor network effects (fewer people working on compilers, fewer chances of replacing key developers, fewer off-the-shelf libraries).

Goodbye dear diary. I'm getting married tomorrow and won't get back from my Hochzeitsreise (honeymoon) until mid-March. Cheerio, Advogato!

zhaoway: Hmm. Closures are a piece of implementation technology which ensures that recursion can occur without the latest invocation of a function messing up older invocations' local variables. They've been around for a while, but closures can be stack allocated (as in C-style activation records) or heap allocated (as in Scheme). To have continuations and tail recursion you need to heap allocate your closures.
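A quick sketch of the heap-allocation point, in Python for concreteness: the closure below escapes the call that created it, so the captured variable cannot live in a stack frame that is popped on return.

```python
def make_counter(start=0):
    """Return a closure over `count'; the captured variable must outlive
    this call, which is why such closures need heap allocation."""
    count = start
    def bump():
        nonlocal count
        count += 1
        return count
    return bump

c1, c2 = make_counter(), make_counter(10)
assert c1() == 1 and c1() == 2   # each closure has its own `count'
assert c2() == 11
```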

Currying is *quite* different: it is a piece of semantics. It is a correspondence between functions with two arguments and functions with one argument that return a function with one argument. Eg. currying this:
(lambda (x y) (+ x y))
gives you this:
(lambda (x) (lambda (y) (+ x y)))

Languages like ML and Haskell use currying implicitly in their syntax, and it can be quite nice to have. You can use macros to do the same thing in Scheme; not many people do, though.
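The same transformation, transcribed into Python (the helper name is mine):

```python
def curry2(f):
    """Transcription of the Scheme example above: a two-argument function
    becomes a one-argument function returning a one-argument function."""
    return lambda x: lambda y: f(x, y)

plus = curry2(lambda x, y: x + y)   # (lambda (x) (lambda (y) (+ x y)))
assert plus(2)(3) == 5
inc = plus(1)                       # partial application falls out for free
assert inc(41) == 42
```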

davidw: scsh has excellent text munging facilities, with a family of string-split-like functions. It's designed to be used in place of Perl, so the standard library for scsh doesn't omit these kinds of things. These utilities are also covered in SRFIs 13 and 14.

It's not so surprising that regular scheme omits these things: this string manipulation is a characteristic of a kind of programming that scheme wasn't designed to cater for.
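For comparison, a rough Python analogue of this kind of charset-driven splitting; the function name and signature are made up for illustration, and this is not scsh's actual API (which works with SRFI-14 character sets rather than plain strings):

```python
import re, string

def string_tokenize(s, charset=string.ascii_letters):
    """Return the maximal runs of characters in s drawn from charset,
    roughly in the spirit of SRFI-13 style tokenizing."""
    return re.findall("[%s]+" % re.escape(charset), s)

assert string_tokenize("foo, bar;baz") == ["foo", "bar", "baz"]
```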

What is up with Tom Lord's `arch' project? I trust CVS, Subversion and Bitkeeper not to lose my data. Why should I trust arch?

zhaoway: Nice to feel I made a difference! But my, you want to call C and still feel functional? That's a tough call... Maybe the `little languages' approach to calling the outside world can help?

zhaoway again: Continuation hacking is not unique to scheme/LISP (SML/NJ has call/cc), but scheme is in many ways its natural home: Guy Steele's original rabbit compiler introduced the idea that continuation-passing style might be a good way to build a compiler. A lot of water has passed under *that* bridge since then...

welisc: Well, if you're modelling proteins I guess compiler efficiency is a big issue, and I don't think scsh runs on any implementations which stress compiler efficiency. I guess I'd stick with OCaml, but if you are interested in getting deep into scheme48 (scsh's mother) you *can* program the performance critical parts in prescheme. I'm afraid the learning curve for prescheme is pretty steep, I haven't scaled it myself yet.
