Older blog entries for graydon (starting at number 93)

I'm working fully remote now, which is an interesting change, but has really just served to demonstrate the flexibility of free software to me; with a little bit of glue here and there, the system feels exactly as if I were in the office. minus all the socializing :(

ported some bits of slib to ocaml and built a quick sqlite binding. finding that ocaml just keeps growing on me as a "serious hacking" language.
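(for the curious: a binding like that is mostly just a handful of external declarations over C stubs. a minimal sketch of the shape, with made-up stub names rather than the ones I actually used:)

    (* minimal sketch of an ocaml/sqlite interface over C stubs; the stub
       names (ml_sqlite_*) are invented for illustration and would need
       matching C functions supplied at link time *)
    type db                                          (* abstract handle *)
    external db_open  : string -> db         = "ml_sqlite_open"
    external db_exec  : db -> string -> unit = "ml_sqlite_exec"
    external db_close : db -> unit           = "ml_sqlite_close"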

otherwise pretty relaxed. I'll be heading up to OLS in a couple of weeks to give a talk on building ridiculously small domain-specific compilers. and then I'm taking a fantastic vacation through the western end of the country. very much looking forward to it.

chalst:

  • I think that eval, in its lisp form, is another of those things the ML/haskell world usually frowns on as "a bit too dynamic". you can't even tell whether an eval'ed sexp is well typed until you run it. that's scary. that said, there is still dynamic caml should the need arise.
  • TCL is actually an interesting case. I used to think it was just a complete joke, what with basically no datatypes, but I'm finding myself occasionally forced to use it (sigh.. work), and each time I am impressed with how far you can go with a clever quoting system and a single datatype. it's further than you think.
  • with respect to camlp4, I do think the macros offered are "as powerful" as those in lisp, by any reasonable measure. consider:
    • you can write your own lexer. you don't have to shoehorn your macro notation into sexps
    • you have explicit or implicit control over source coordinates
    • you can programmatically construct your results, or construct them through quotations, or both
    • you can name and layer quotation expansions of different notations
    • you can replace the core grammar of ocaml, essentially re-tasking the ocaml compiler to a "new" language with ocaml semantics.
nicholasl:

I think you are right about the scene graph location. I was initially quite a fan of "location transparency", but have come to feel that there are always at least 2 locations in any computer program: here and elsewhere. Here is at least host-local, probably user-local, process-local, and even thread-local. Really, I don't think a display server ought to have more than one thread anymore either. I think Miguel was quite right in that argument we had years ago. But that's a different kettle of fish.

In any case, collapsing here and elsewhere into just one case leads to, well, corba. Increasingly, I don't think corba is wise at all; remote resources are just too intrinsically different from pointers.

One might be motivated, then, to wonder what remote resources are like. Obviously, with some sequence-number trickery, remote resources can be organized into things like streams (hence TCP socket ~~ file descriptor), but can they be made into something even more structured? 9p thinks remote resources can appear like a VFS tree: open, read, write, close, and walk. Seems a useful, and mostly harmless, extension. But I think it's important to keep open vs. closed, and explicit i/o, as part of the equation. Trying to do those things transparently is trouble.
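to make the shape concrete, here's a toy ocaml signature for that sort of thing. the operation names echo 9p's attach/walk/open/read/write/clunk, but it's just a sketch of the idea, not the real protocol:

    (* toy sketch of a 9p-flavoured remote resource interface; explicit
       handles, explicit open/close, explicit i/o -- nothing pretends to
       be a local pointer *)
    module type REMOTE_TREE = sig
      type conn                                (* a connection to "elsewhere" *)
      type fid                                 (* a handle into the remote tree *)

      val attach : conn -> fid                 (* root of the remote tree *)
      val walk   : fid -> string list -> fid   (* descend by path components *)
      val fopen  : fid -> [ `Read | `Write ] -> unit
      val read   : fid -> int -> string        (* read at most n bytes *)
      val write  : fid -> string -> int        (* returns bytes actually written *)
      val clunk  : fid -> unit                 (* close and forget the handle *)
    end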

I wish paul graham would stop publishing articles claiming lisp has the only useful syntactic metaprogramming system. it's especially irritating since he seems like a smart guy and should know better. you don't need car and cdr to do good syntactic metaprogramming (indeed, on their own they are awkward); you just need a quasi-quote notation and a place to hook your extensions into the evaluator.

hell, even TCL satisfies the menu here, though few people bother to learn it. unless he's been hiding under a rock for a couple of years, I can't believe he's seen and understood camlp4 and still thinks defmacro is "unique to lisp".
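to spell out the menu, here's a toy ocaml sketch (not camlp4, not lisp): a structured AST, a "quoted" template with named holes, and a splice step that fills them in. that's the whole trick, and car and cdr never enter into it:

    (* toy quasi-quotation over a structured AST: a template with named
       holes, plus a splice step that fills them in *)
    type expr =
      | Int of int
      | Var of string
      | Add of expr * expr
      | If of expr * expr * expr
      | Hole of string                     (* an antiquotation site *)

    (* "quoted" fragment with two holes: roughly  if <cond> then 0 else <body> *)
    let template = If (Hole "cond", Int 0, Hole "body")

    (* fill the holes from an environment of named fragments *)
    let rec splice env e =
      match e with
      | Hole name     -> List.assoc name env
      | Add (a, b)    -> Add (splice env a, splice env b)
      | If (c, t, f)  -> If (splice env c, splice env t, splice env f)
      | Int _ | Var _ -> e

    let expanded =
      splice [ ("cond", Var "x"); ("body", Add (Var "y", Int 1)) ] template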

in response to chalst, and further on the topic of type systems:

there is no deep theoretical difference between static and dynamic type systems. either way you are writing a specification (a program) under which the computer may do something you do not expect or desire. the only practical question worth asking is: how much confidence do you have that the specification you wrote will get the computer to do the thing that you, or your customers, want?

testsuites test some cases. type systems check consistency. the assumption is "consistency implies correctness", and it's hard to come up with a much better definition of correctness than "the testsuite passes" plus consistency; in practice this is as close to true as we get.

I've used many C and lisp systems professionally, and a few HM systems, and the HM systems are by far the more oriented toward consistency checking. you can consistency-check lisp if you have the luxury of using ACL2 or a vendor compiler with a deep partial evaluator or a soft-type inferencer. but usually in lisp you are working on a lousy implementation, with an OO or struct system someone threw together for fun, and less checking even than the "disjointness guarantee" inherent in the fundamental scheme types. likewise in C you are not usually working with a larch or lclint-annotated system; you've got a system which "made up" an OO type system on top of liberal pointer casting, and not a hope in hell of checking its consistency. C++ thankfully has a good type system, but as the original article points out it can get pretty brutal to compile.

the more checking I can perform before I ship something to a customer, the better. imho everything else is religion and bafflegab. ocaml and haskell currently give me lots of good checking without taking away too much, so I use them when I can.
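a tiny ocaml example of the kind of checking I mean, invented for illustration rather than taken from anything real:

    (* consistency checking in miniature: add a constructor and every
       stale match over the type gets flagged at compile time *)
    type colour = Red | Green | Blue        (* later extended with: | Alpha of int *)

    let to_string c =
      match c with
      | Red   -> "red"
      | Green -> "green"
      | Blue  -> "blue"
      (* once Alpha is added above, this match -- and every other match
         on colour -- draws a non-exhaustive-match warning before a
         single test is run *)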

technik: the SICP text is still online here, though mirroring it might be wise.

work that body

frances and I just bought a small TV/VCR with which to pursue our mutual love of crass anime and high-quality older films. lucky for us, suspect video carries lots of both. we're hoping this device will not lead to more actual broadcast TV watching, as we both lost sizable chunks of our childhood to that. so far we've seen keiko kamen and dead reckoning (bogart, 1947). going for joan of arc (bergman, 1948) tonight.

miscellaneous link-dropping: highly satisfied with the recent ghc 5.00.1 release. anyone who's held back from haskell for a while would do well to try it. it now includes a mixed-mode compiler/interpreter with dependency chasing and an interactive mode not unlike hugs, only you get access to nice optimized copies of hslibs. it's a 5-minute fix to get emacs hooked up to it, and then you have a nice ":reload" function at your fingertips. also pleased with unison, which does what you thought rsync was going to do when you first got it. counterbalance the amusingly pragmatic nature of dumbcode with some positive evidence that not all free software consists of broken instant messaging clients in python: boost, atlas, R, maxima and openDX. finally, I was tuning nasty code with some nice profiling tools recently, worth pointing out: cacheprof, functioncheck, perfAPI, and jprof.

obTechnicalPhilosophy: there is a delicate balance necessary between sticking with the things you know and can rely on, and exploring things which have the potential to be better. assuming that one or another of these strategies is the one true way is silly.

remove tonsils, get ice cream!

A fab month. Graduated with higher marks than I thought I deserved, got married and spent the weekend playing skee-ball and air hockey and mini-golf, and am now happily settled into my den of evil at redcygnushat.

Still basically obsessed with the prospects presented in hehner's book, which has been consuming my free time since I found it in february. Many new things afoot, but increasingly unable to talk about them. poo.

school's out

submitted the final version of the compiler for school, kicked coffee for a week to see if my eye would stop twitching. had a last day of uneventful classes. saw "the third man" a couple of weeks ago, which was really good, and "of human bondage" (the bette davis version), which was really not. sent out a whack of wedding invites, got some new shoes.

slepnir ran up to me last night and coughed up a roach exoskeleton on my face. then she pissed in my shoe. she hates me. this is because I clean her cage out, and rats like mess.

went out to john's italian with frances, got a nice panini and a glass of wine. got my "program completion" notice from the university and a large tax refund from the government. start work at RH on monday. looking forward to some perverted hacking and having my evenings free again.

splork: imho dynamically typed languages are really not your friend for large projects. the arguments put forth in this article are wrong-headed.

  • testing probes individual cases for correctness. it's important for making sure your assumptions are correct. type systems prove that entire categories of flaw do not exist in your code, i.e. the equivalent of an infinite number of test cases. I do not for a minute believe the claim that normal unit tests will find "every" error a typecheck will find. tests and types do different things.

  • moreover, typechecks are fast: faster than all but the most utterly trivial test. the argument he makes against lengthy compile times is surprising considering how much longer a really strong regression suite will take to run.

  • really, the compile-time argument is an argument against compiling C++, which involves both extensive machine-level optimization and a hideously poor declaration management strategy (textual inclusion). this is not a valid critique of static type systems. if you want near-instant builds of programs in typed languages, get a natively compiled, non-optimizing bytecode compiler for a language with binary declarations, like ocamlc.opt or jikes. they cook.

  • in fact, if you are trying to produce optimized code, your compile time with a dynamically typed language is almost always going to be longer, because you have to do far more extensive partial evaluation. type systems save you the hassle, and make it much easier to write optimizing compilers.

  • modern HM-based languages infer nearly all their types from the data constructors anyway, so you don't actually have to write them into the source code. so unless you're actively avoiding a strong type system, you can basically get one for free. the worst it does to "agile processes" like XP's "aggressive refactoring" is point out when you've made an inconsistent change, which is generally something you want to know about. (a small sketch follows this list.)

  • redeploying software is always a fragile affair. I can just as easily change a "base class" or some other central dependency bottleneck and break a program in a typeless language as in a typed one. the solution in either case is to reduce coupling around the points which are likely to change. it's slightly harder in C++ because it computes vtbl offsets explicitly (again, for speed), but you can get around that with a compiler firewall or some componentware if you think your implementation is going to change its layout.
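as a small illustration of the inference point a couple of bullets up, here's a made-up fragment: nothing in it is annotated, yet everything is checked.

    (* hindley-milner inference at work: no type annotations written,
       full checking anyway *)
    let rec sum xs =
      match xs with
      | []        -> 0
      | x :: rest -> x + sum rest
    (* inferred: val sum : int list -> int *)

    let average xs = sum xs / List.length xs
    (* inferred: val average : int list -> int *)

    (* a call such as  average ["1"; "2"]  is rejected before the program
       ever runs, even though no type appears in the source *)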

I agree with his assertion that "scripting languages" are becoming more important, but not because they're good for producing robust software. they're important because a lot of people want to write one-off programs, and languages with "simple syntax" lower the barrier to entry. this is good. it dethrones the high priests of programming and puts automation in the hands of normal people. it's ok to have some failures in a one-off program that automates a task you would have done sloppily, by hand, anyway. nonetheless, with friendly enough syntax, you can slip a good type system in while nobody's looking, and some people will appreciate the extra error checking. perhaps mondrian will actually introduce legions of windows slaves to haskell. doubt it, but who knows?
