Older blog entries for chalst (starting at number 48)

25 Oct 2002 (updated 3 Aug 2004 at 18:11 UTC) »
Bram: The graph isomorphism problem in its general form is known to be NP-hard; this is a standard result. See, eg.:
[ATCH99] Algorithms and Theory of Computation Handbook, Mikhail J. Atallah, ed., CRC Press LLC, 1999.

raph: web-based proof assistants, some random comments:

1. I think the OBJ3 suggestion is interesting, but I don't think it is something you need to get right straightaway: I would say it is a version-2 sort of issue. What it offers is a way of modularising axiomatic frameworks that gives inheritance, so that the group axiomatisation can inherit from the monoid axiomatisation. Drawbacks: it involves category theory and a nontrivial implementation, and you have to think carefully about namespace issues. Figuring out what basic axioms most people should use for the base theory is probably the best place to start.
2. Since Nat <=> Nat x Nat, the following are equivalent:
• One place predicates over Nat;
• Sets of Nat;
• Sets of Nat x Nat;
• Two place relations over Nat.
Since functions Nat => Nat are a subspace of the two-place relations over Nat, and two-place relations are isomorphic to the functions Nat x Nat => {0,1}, moving between the two representations of Z_2 is pretty trivial (with the right definitional framework).
3. One thing I'd like to see in a web-based proof assistant is some support for different classes of proof, with documentation, so that a result can have a quickly-checked for-machine proof, a well-documented research-paper-style proof (also machine checkable), and an elementary long-but-in-easy-steps didactic proof, complete with URLs for background reading.
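The equivalences in point 2 can be made concrete with a pairing function. Below is a minimal sketch in Python (the names are mine, purely illustrative, not part of any proposed assistant): Cantor's pairing function witnesses Nat <=> Nat x Nat, so a two-place relation over Nat can be stored as a plain set of naturals.

```python
def pair(x, y):
    # Cantor's pairing function: a bijection from Nat x Nat to Nat
    return (x + y) * (x + y + 1) // 2 + y

def unpair(n):
    # inverse of pair: find the diagonal w that n lies on, then read off y
    w = int(((8 * n + 1) ** 0.5 - 1) // 2)
    t = w * (w + 1) // 2
    y = n - t
    return (w - y, y)

# a two-place relation over Nat, represented as a set of Nat
less_than = {pair(x, y) for x in range(10) for y in range(10) if x < y}

def related(x, y):
    # test membership in the relation via its set-of-Nat encoding
    return pair(x, y) in less_than
```

The same trick iterates, so k-place relations also reduce to sets of Nat.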

raph: I'm not surprised by the quality of the responses you have received to your recent musings on web-based proof systems: this is a difficult area, both technically and philosophically, and as a perfect outsider you have talked about the topic intelligently and persuasively. I think there is every chance that your web-based proof-exchange protocols will prove to be successful and important.

• Solovay's comment about the strength of ZFC and HOL, while true, may be misleading. I am not familiar with HOL, but I believe that it is based on the Girard-Reynolds polymorphic lambda calculus: the embedding of such a calculus into ZFC is not straightforward; see John Reynolds's classic paper, `Polymorphism is not set-theoretic'.
• A poster suggested thinking about translations between proof representations using category theoretic formalisms. I think this is probably a good idea. The poster also mentioned topos theory in this connection: this is possibly more work than is necessary for your purposes.
• I suggested second-order arithmetic as a good basis formalism: I think it makes life easy if most people use a single formalism, and the extra strength of ZFC is almost never necessary (also one can just add axioms to SOA to obtain equivalent strength to ZFC).
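For concreteness, here is the usual shape of the two characteristic axioms of classical second-order arithmetic, on top of the basic Peano axioms; this is the standard textbook presentation, nothing specific to raph's project:

```latex
% comprehension scheme: every formula defines a set
\exists X \,\forall n \,\bigl( n \in X \leftrightarrow \varphi(n) \bigr)
  \quad\text{for each formula } \varphi \text{ in which } X \text{ is not free;}

% induction axiom, stated once for all sets
\forall X \,\Bigl( \bigl( 0 \in X \land \forall n \,( n \in X \rightarrow n+1 \in X ) \bigr)
  \rightarrow \forall n \,\, n \in X \Bigr).
```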

advotake: A nice quick hack: use expect as an intelligent XML-RPC client to post Advogato diary entries.

redi's proposed hack: I like the idea. I think `redirect' is a better name, though...

graydon: I think the thing I have enjoyed most in our past communications is discovering just how differently it is possible for an intelligent and well-informed computer scientist to view the field of programming languages. From some of your past posts, in no particular order:

• TCL: Actually Tcl is one of my favourite languages (in the guise of Don Libes's `expect'). I just don't think that string manipulation is `real' syntactic metaprogramming. To my taste, Alan Bawden's `Quasiquotation in LISP' article presents the definitive case against string manipulation as metaprogramming.
• Hehner's introduction to Goedel's proof at his website is the work of an incompetent. Maybe he just had a bad day...
• I've had a look at camlp4 (at last): it's a nice solution to a problem that doesn't arise in languages like LISP, which don't get fixated on `source code' vs. `object code'. Because in LISP-like languages (read <file>), (macro-expand <s-exp>) and (eval <s-exp> <environment>) are just ordinary programs, the semantics of the language need make no distinction between read time and run time, or between interpreted and compiled code. It confuses beginners, but once you've grokked the basic idea it makes life much simpler.

raph: I very much like the way your ideas are going with web-based proof assistants. Some thoughts:

• On proofs and code: I recommend reading Euclid's Elements; there are a lot of parallels between what you say and the way he goes about constructing his arithmetic and geometry;
• I think tactics were invented by the LCF-ers, not the HOL-ers; HOL is based on Edinburgh LCF (the chaps who invented ML in the 1970s);
• I think the `right' logic for most purposes (ie. the axiomatic framework we might call the `east pole' on the `more mathematical' scale of Wiedijk's graph) is classical second-order arithmetic. It is probably the most intensively studied strong proof theory in mathematics: it pretty much immediately encodes all weaker systems, and most stronger systems can naturally be expressed in it.

Bram: You can protect against this kind of thing (buffer overflows, character set attacks, etc.) on either the server or client side by applying the `command' design pattern and then embedding the control language in your favourite tcl-alike.
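A minimal sketch of that idea in Python (the command names and handlers are mine, purely illustrative): untrusted input only ever selects a command from a fixed table, and each handler validates its argument, so raw strings are never spliced into code or shell invocations.

```python
# Command pattern for an embedded control language: the client sends
# (command-name, argument) pairs; unknown names are rejected outright.
COMMANDS = {
    "post":   lambda text: "posted: " + text[:80],           # truncate untrusted input
    "delete": lambda entry: "deleted entry %d" % int(entry),  # force the expected type
}

def dispatch(name, arg):
    # look the command up instead of interpreting the input as code
    try:
        handler = COMMANDS[name]
    except KeyError:
        raise ValueError("unknown command: %r" % name)
    return handler(arg)
```

The same table-driven dispatch works on either side of the wire, which is why it protects both server and client.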

http://www.advogato.org/person/Roger/: Is there a clean correspondence between SlideML and Slatex?

suse->debian: I've converted my laptop from a SuSE 6.4 box to stable Debian woody. Pretty smooth, but a few lumps are proving difficult to iron out:

• My X server crashes when I suspend. An ugly apmd vs. XFree86 vs. Trident vs. God knows what else conflict.
• I can't get video output and laptop display from the same client plane.
• Debconf has put an ugly network environment script in my init.d directory which I can't track down.
• I have a funny DNS vs. wwwoffle conflict.
• The usual round of trouble with .Xresources being handled differently by every other program...
Note how 60% of my troubles are caused by X misconfigurations...

reading: Wittgenstein's Lectures on the Foundations of Mathematics. I'm a glutton for punishment...

work: is going well |-D

Bram: It's a mistake to infer the existence of large cardinals from the consistency of large cardinal set theory. I agree that it is likely that large cardinal set theory is consistent, but the falsity of `consistency implies existence' follows from thinking about theories like PA+not(Con(PA)): if PA is true then it is consistent, and so is PA+not(Con(PA)) (by Goedel's incompleteness theorems), but surely this theory isn't true!
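Spelled out as a chain of implications (standard Goedel-style reasoning, nothing specific to Bram's setting):

```latex
\mathrm{Con}(\mathrm{PA})
  \;\Longrightarrow\;
  \mathrm{PA} \nvdash \mathrm{Con}(\mathrm{PA})
  \quad\text{(G\"odel's second incompleteness theorem)}
  \;\Longrightarrow\;
  \mathrm{PA} + \neg\mathrm{Con}(\mathrm{PA}) \text{ is consistent.}
```

Yet PA+not(Con(PA)) proves a statement that is false in the standard model, so the consistency of a theory cannot by itself guarantee its truth.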
Back from Dresden, where I took part in a two-week workshop on `Proof Theory and Computation' and gave a five-lecture course on `Harmonic type theory' (my course was more or less an attempt to give a philosophical foundation for, and critique of, the formulae-as-types correspondence). I found the other lectures very interesting, especially those of Francois Lamarche, who has made me take seriously the `Geometry of Computation' thesis (due to the linear logic crowd), which, roughly, is that computational activity is fundamentally geometric in nature, and that we should formulate our computational calculi in a way that respects the topological structure of computation in a more illuminating way.

Goingware's `Make a bonfire of your reputations': I liked reading this mini-essay very much, although I am not persuaded by it. Why not? Well, I think sometimes other people's opinions, though wrong, are threatening enough that you shouldn't challenge them. Two extreme cases make the point: at the height of Stalin's power, speaking out was quite simply suicide. `Making a bonfire of your reputations' and challenging cruelty and injustice was not a wise course of action for any Russian who put value on their life. Much better to keep silent, bide your time, and do what little things you can without endangering yourself.

A rather different example would be the position of a full-time Scientologist who has come eventually to the realisation that their `Church' is in fact a psycho-terrorist mafia. Here, the power that threatens those who speak out is not lethal, but rather the ability of the organisation to destroy its heretics economically and psychologically.

What's the conclusion? Well, I think to be truly happy with yourself you need to feel the power to do and say what you think is right regardless of whom you might annoy, but it really is the case that the other things in life you might lose by doing this (above: your life, your sanity and your security!) might be more important still.

World cup

The race goes not to the swift nor the battle to the strong, nor does food come to the wise nor wealth to the brilliant nor favor to the learned; but time and chance happen to them all. Ecclesiastes 9:11

...but even so, the three teams I'm supporting (in order: England, Germany and Senegal) are all through to the quarter-finals :>.

graydon: It's good to see you posting diary entries again!

I agree that Paul Graham's writings have an excessively partisan flavour, but to defend the superiority of LISP macros over their alternatives:

• LISP macros are not preprocessor macros: there is flexibility over when macros are expanded, and indeed macro expansion can be done at run-time using `eval'.
• I'm surprised to see you think TCL `satisfies the menu': I think string manipulation is a fundamentally flawed way of doing this kind of thing. IMO, Paul Graham is entitled to dismiss TCL as not having a real `syntactic metaprogramming system'.
• We've talked about this before (and I still haven't looked further into camlp4 vs. MetaML, so I may be being unfair to camlp4): I don't think that the macro facilities offered in the ML world are as powerful as those in the LISP world. The whole issue of how to integrate a real macro system into a statically typed language is in need of further work.
Having said this, I think the Common LISP macro facilities look antiquated compared to what is on offer in the Scheme world. Common LISP was fixed just before a lot of advances were made in this technology: hygienic macros, syntactic closures, `Macros that work', first-class macros; this is stuff which can't be grafted onto the language using defmacro. The design issues around macros are tied up with the issues around module systems, and this is an area where Common LISP is just plain broken.

raph (from 11/5/2002): I think it's a bit premature to say arc ... could well become the most compelling LISP dialect. I think the *discussion* around arc is going to be very interesting (it already is), but it is not clear to me from what he has written so far that Paul Graham has any ideas that will put arc substantially ahead of where Olin Shivers's scsh already is for the kind of tasks he has in mind. We'll see.

Grit: I think the ability to guess domain names isn't an important one; the informativeness of URLs is much more important. If musedoma do their job right, then one knows that sfmoma.museum is indeed what one is looking for.

The point I was making about persistence is that for a renewal policy to make sense it would have to apply to all/most domain names, which directly undermines the utility of a DNS.

shlomif: I'd say an important advantage of Java over C++ and (to a lesser extent) Perl is that it is pretty predictable: there are relatively few nasty snafus in writing code. Maybe `less insights' is the price, but I can see why people might use it to get the job done.

The `It's not real code' argument against Perl is lame, but IMO Perl doesn't scale well; I have to say I don't think of Perl as a `proper' programming language.

tk: But badvogato *is* a master, if not with that honour in the advogato trust metric...

Grit: The proposals I am familiar with for expanding the TLD system suggest that many of the new TLDs will have different policies (ie. like .edu/.gov/.mil), so they are not proposing the .com/.org/.net free-for-all you show to be absurd. A renewal policy, I think, is an awful idea: the point of the DNS is precisely to have persistent points of reference to internet resources.

freetype: Are you familiar with the work on region management (ie. a compile-time alternative to garbage collection)?

lkcl: Congratulations on the new job!

tk: I don't read geek code. Well... OK then, `y!' means you're a chap?

shlomif: Check what version of ghostscript you are using: 5.50 had serious problems with generating and rendering PDF. You can check that the PDF is OK using acroread.

12 May 2002 (updated 12 May 2002 at 16:29 UTC) »
raph: I found Bertrand Meyer's article to be biased against Java: generics have been planned for Java since the first release (rumour has it that the first release of the Java SDK was almost cancelled because it didn't support them), and they have been `backported' to all the old releases: one of the reasons for the delay was getting it right... The JVM is really not a bad target for most languages: the absence of longjumps (ie. continuations) in the JVM is a showstopper for Scheme, but otherwise the architecture doesn't fare too badly. The platform comparison you cite by John Gough talks about the overhead Java introduces by its need to box and unbox representations: that isn't the whole story, since a good JVM-to-native-code compiler can eliminate most of this cost (I was not so impressed by this article: it didn't even mention the longjump issue).

My prediction is that there will be a big growth in domain specific languages for the .NET platform, and these will be popular with developers. I guess that the C++.NET language will be a failure for the reasons Joel Spolsky gives. I'll be interested to see if FORTRAN.NET takes off for scientific computing.

Lastly, maybe you will find Java/CNI interesting (the `Cygnus Native Interface' for Java compiled to UNIX/C using gcj).

Just certified Perrin to Journeyer for his work on Freeciv.

Postscript: Just upgraded my certification of tk to Journeyer based on his/her wide involvement in free software projects and interesting diary entries.

Joel Spolsky thinks Bertrand Meyer's article on .NET language independence supports what he says about the weakness of language independence for .NET. I disagree: while .NET doesn't address many of the reasons Joel cites for moving to other languages, as Bertrand says it is a good framework for most modern languages. It's good enough to support LISP and Scheme, which, as Paul Graham says, are the model new programming languages are evolving towards, and which the JVM doesn't support due to its lack of tail-recursive function calls.
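The tail-call point can be demonstrated even in Python, which, like the JVM, does not eliminate tail calls (the function names here are mine, purely illustrative): a compiler for a Scheme-like language on such a platform must convert tail calls into loops itself, or deep recursion exhausts the stack.

```python
def count_down(n):
    # written as a tail call, but each call still consumes a stack
    # frame on a platform without tail-call elimination
    if n == 0:
        return "done"
    return count_down(n - 1)

def count_down_loop(n):
    # the loop a tail-call-optimising compiler would generate instead
    while n > 0:
        n -= 1
    return "done"
```

Scheme's semantics require the first version to behave like the second, which is exactly what the JVM makes hard.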

While I'm on the subject of .NET, I want to say again that I don't think Gnome on Mono is a good idea, for two reasons:

• MS's aims with .NET are anti-competitive and will hurt free software;
• Free software support for Java is better advanced, and it is better to build on what we have already than support .NET.
However, I do think .NET is innovative and important, and so is something the free software world should properly understand. A better strategy would be to put more effort into free software Java-based efforts, and pressurise Sun into remedying Java's defects (eg. the prejudicial framework for J2EE certification, the weakness of the Java Community Process, the lack of tail-recursive calls in the JVM). Sun is not an ideal free software partner, but it's a damn sight better in almost every respect than MS.

Postscript #1: Since writing this, I have read the interview with Danese Cooper on Slashdot, which seems to suggest that Sun is at last dealing with the open source J2EE problem. It didn't say anything about the problems with the Java Community Process, though, more's the pity.

Postscript #2: The Stallman factor: spot on!

Postscript #3: Certified shlomif as Journeyer because I like his diary entries (a lot), and apparently he has good skills and makes worthwhile contributions to free software.

39 older entries...