mglazer: Hypothetical question -- Would you feel any regret for your anti-Arab invective if you heard that it had motivated a vicious hate crime against innocent Arabs?
Web-based proof exchange systems:
I am still very enthusiastic about raph's work here, although I am disappointed he has used Python for the implementation. Apart from that, I think raph's writings show excellent taste and insight. What I think he has got right:
There have been lots of interesting comments appearing on recentlog, but I've not had a chance to write any responses. I'll try to do this in the next few weeks.
Nice series of articles on Windows and its free rivals. Just a little point: just because POSIX specifies a file system that doesn't support interesting permission models doesn't mean that Linux and FreeBSD can't: both of them have VFS layers that allow them to do just this sort of thing.
Wishing all advogatans a Happy New Year!
Can you put a price on heroism?
Yes! And you can manage it sensibly or wastefully too! I've been growing irritated with the Economist over recent years (since the departure of Rupert Pennant-Rea, I think...), and I hardly buy it anymore, but the occasional article like this one keeps me checking the website...
There's an article by Thierry Coquand, A new paradox in type theory, that gives a good explanation of Reynolds's argument that polymorphism is not set-theoretic. I think this goes some way towards making important ideas about the nature of the polymorphic lambda-calculus accessible purely through online references, though many important references are from the 70s and 80s and would benefit from good online summaries. I'd be grateful to hear of more references. BTW, mail sent to email@example.com will get to me. I'm quite happy that my main email address has been lost from the internet, since I don't have to spend so much time deleting spam...
jameson: Were you in Thomas Streicher's group when you were at Darmstadt? Who are you working with at Boulder? Despite my reservations, Girard's book is already something of a classic: I don't think you would have any problems convincing the librarian to try to get a copy.
Postscript: The last `Communications of the ACM' issue has as its main focus an event that I think will prove to be of great importance to computer science: the ACM has rejected licensing of software engineers, essentially because code construction is not a normal engineering task. Interesting also in this light is Phil Greenspun's article reflecting on management at Ars Digita. Conclusions? I don't think software will *ever* be a `normal' engineering discipline, the way that civil engineering is today. I do think that the `Wild West' nature of the discipline will not last, however, because I think that in the next <n> decades it will become feasible to construct large systems that are proven to satisfy a formal specification.
I got to your Advogato diary entries through those of Raph Levien's. Just a few things about HOL:
- It's not based on System F, but on a polymorphic version of Church's simple type theory, making its type system even simpler than that of ML. This also means that it's easy to give it a set-theoretic semantics.
- HOL is a direct descendant of LCF, the project that invented ML. HOL's first developer, Mike Gordon, was one of the people who worked on LCF with Robin Milner at Edinburgh and Stanford. LF is a later, mainly unrelated, project.
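The first point can be sketched in a few lines. This is my own toy model, not HOL's actual implementation: in polymorphic simple type theory, the only types are type variables and type operators applied to arguments, and polymorphism amounts to nothing more than substituting types for free type variables.

```python
# A toy sketch (not HOL's real data structures) of the type language of
# polymorphic simple type theory: just type variables and applied type
# operators, with no quantification over types inside types.
from dataclasses import dataclass

@dataclass(frozen=True)
class TyVar:
    name: str

@dataclass(frozen=True)
class TyApp:
    op: str            # e.g. "bool", "fun", "list"
    args: tuple = ()

def instantiate(ty, subst):
    """Substitute types for type variables -- the only mechanism by which
    a polymorphic HOL constant acquires a more specific type."""
    if isinstance(ty, TyVar):
        return subst.get(ty.name, ty)
    return TyApp(ty.op, tuple(instantiate(a, subst) for a in ty.args))

# The type of equality, = : 'a -> 'a -> bool, instantiated at bool:
eq_ty = TyApp("fun", (TyVar("a"), TyApp("fun", (TyVar("a"), TyApp("bool")))))
eq_bool = instantiate(eq_ty, {"a": TyApp("bool")})
```

Because a type is just a first-order tree over a fixed signature, the set-theoretic semantics is straightforward: interpret each type operator as a set-forming operation and each type variable by an environment.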
raph: I wish it was normal for most university professors to apply the same intellectual energy and curiosity that you do to fields outside their main area of enquiry... To answer the questions you ask:
Your question was: how is it that the polymorphic lambda calculus (i.e. System F) can have the same strength as Z_2? It's hard to explain without reference to Girard's proof of strong normalisation for System F, but the essential idea is that (i) the impredicative quantification over types possessed by System F is powerful because a lambda-term possessing this form of type abstraction is abstracted over its *own* type; (ii) this quantification doesn't introduce vicious circles because we can stratify the terms of the calculus by their types, assigning each term a rank according to how deeply its type quantifiers are nested, with the type system forbidding the application of terms of a given rank to terms of equal or higher rank. In this scheme, applications of Pi^1_k comprehension in Z_2 correspond to terms of rank k. This is a difficult idea: if you really want to understand it you should read Girard's book.
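To make point (i) concrete, here is the classic illustration, in standard System F notation (the example is mine, not from the original exchange): the polymorphic identity can be instantiated at its own type and applied to itself.

```latex
\[
\mathsf{id} \;=\; \Lambda\alpha.\;\lambda x{:}\alpha.\;x
  \;:\; \forall\alpha.\,\alpha\to\alpha
\]
\[
\mathsf{id}\,[\forall\alpha.\,\alpha\to\alpha]\;\mathsf{id}
  \;:\; \forall\alpha.\,\alpha\to\alpha
\]
```

The type application instantiates the bound variable alpha at the type of id itself, something no predicative system permits; Girard's rank stratification in point (ii) is exactly what keeps this circularity from becoming vicious.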
Given that you want a Z_2-like consequence relation, I don't think either formulation is `the right thing' as a proof exchange format: the theories should have a rich set of datatypes and quantifiers, split into first-order datatypes of countable cardinality (finite enumeration sets, finite lists and tuples of countables, integers, rationals, polynomials, algebraic numbers, finite binary trees, etc.), and second-order datatypes of uncountable cardinality (e.g. infinite streams of bits, infinite streams of integers, infinite binary trees, n-ary relations over finite datatypes, n-ary functions on finite datatypes, etc.). The point is that one can find good translations from such richer theories into the simpler axiomatisations of Z_2 that metamathematicians use.
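The standard device behind such translations is coding: the Cantor pairing function injects pairs of naturals into the naturals, and once you have pairs you get lists, rationals (as normalised integer pairs), and finite trees for free, all expressible in the number-only language of Z_2. A sketch of the idea (the particular list coding is one common choice, not canonical):

```python
# Cantor pairing: the classical bijection N x N -> N that underlies
# translations of richer first-order datatypes into the language of Z_2.

def pair(x, y):
    """Encode the pair (x, y) of naturals as a single natural."""
    return (x + y) * (x + y + 1) // 2 + y

def unpair(z):
    """Invert pair: recover (x, y) from z."""
    w = int(((8 * z + 1) ** 0.5 - 1) // 2)   # w = x + y
    t = w * (w + 1) // 2
    y = z - t
    return w - y, y

def encode_list(xs):
    """One common coding of lists of naturals as a single natural:
    [] -> 0, and x:rest -> pair(x, code(rest)) + 1."""
    code = 0
    for x in reversed(xs):
        code = pair(x, code) + 1
    return code
```

Second-order objects such as infinite streams of integers are then handled by the second-order variables of Z_2 directly: a stream is a set of coded (index, value) pairs.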
Bram: Quite so, I'm afraid I am guilty of being careless about the terminology complexity theorists use: I come from graph rewriting where I call this the graph matching problem. Sorry for wasting your time.
graydon: Sorry for the long time in responding to your last response to me. I keep meaning to reply, but each time I start composing a diary entry, I spend far too long thinking about raph's ideas about proofs and run out of time. Just briefly, I want to explain what the serious problems are with Hehner's article, and give a reasonable definition of syntactic metaprogramming. At least I can deal with one of your points about LISP macros: a macro system by definition gives a static code transformation. It is just that it makes sense for most LISP macro systems to carry out this transformation at run time, as happens when there is no compile time (e.g. interpreters), in the context of (eval ...) statements, and when you want staged computation (e.g. in dynamic compilation). Insisting on pre-processing as your source transformation system causes all kinds of tiresome difficulties when you move away from the C batch-processing paradigm.
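The distinction is easy to demonstrate outside Lisp. The toy below (my own illustration, not any real macro system) performs a purely syntactic source-to-source transformation -- expanding a hypothetical macro SQUARE(e) into (e) * (e) -- but performs it at run time, immediately before evaluation, exactly as an interpreter with macros does:

```python
# A static (syntactic) code transformation applied at run time: a
# hypothetical SQUARE(e) macro is expanded into e * e on the AST, and
# only then is the transformed tree compiled and evaluated.
import ast

class ExpandSquare(ast.NodeTransformer):
    def visit_Call(self, node):
        self.generic_visit(node)
        if isinstance(node.func, ast.Name) and node.func.id == "SQUARE":
            arg = node.args[0]
            # Purely syntactic rewrite: SQUARE(e) -> e * e.
            return ast.BinOp(left=arg, op=ast.Mult(), right=arg)
        return node

def run_with_macros(src):
    tree = ast.parse(src, mode="eval")
    tree = ast.fix_missing_locations(ExpandSquare().visit(tree))
    return eval(compile(tree, "<macro>", "eval"))

result = run_with_macros("SQUARE(3) + SQUARE(4)")   # expands to 3*3 + 4*4
```

The transformation itself never inspects run-time values -- it is static in exactly the sense above -- yet no separate pre-processing pass is needed.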
raph: web-based proof assistants, some random comments:
so that a proof can have a quickly-checked machine-oriented form, a well-documented research-paper-like form (also machine-checkable), and an elementary, long-but-in-easy-steps didactic form, complete with URLs for background reading.
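This idea of one theorem carrying several presentations could be modelled very simply; the sketch below is hypothetical (all names are invented here, and this is not raph's design):

```python
# A hypothetical data model for multi-presentation proofs: one theorem,
# several write-ups at different levels of detail, each optionally
# machine-checkable, with URLs for background reading attached.
from dataclasses import dataclass, field

@dataclass
class Presentation:
    style: str                  # e.g. "machine", "research-paper", "didactic"
    text: str                   # the proof as written in this style
    machine_checkable: bool
    background_urls: list = field(default_factory=list)

@dataclass
class Theorem:
    statement: str
    presentations: list = field(default_factory=list)

    def checkable(self):
        """The presentations a proof checker could actually verify."""
        return [p for p in self.presentations if p.machine_checkable]

thm = Theorem("sqrt 2 is irrational")
thm.presentations.append(Presentation("machine", "...", True))
thm.presentations.append(Presentation("didactic", "...", False,
                                      ["http://example.org/background"]))
```

The point of the design is that the checker and the human reader consume different presentations of the *same* theorem object, so the didactic form can never silently drift away from what was verified.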