I've posted the choice thread on ghilbert.org. For those of you who are slavishly following the development of Ghilbert, or fans of the Axiom of Choice, it should offer a glimmer of enlightenment.
The New York Times reaches about 1.5 million people. This posting is possibly of interest to two dozen. But the difference between my blog and the NYT is that my post will reach those two dozen :)
As NTK put it, "No self-respecting Thinker Of Hard Thoughts these days is without their own Deep Theory Of How To Do Version Control." It's not surprising to see so much activity in this sphere now. CVS has been broken for a long time, and it's now clear that Subversion only solves some of the problems of CVS.
I haven't actually played with Codeville yet, but I look forward to it. When Bram puts his mind to something, it often turns out well. I was also very interested to see Ken Schalk's CodeCon presentation on Vesta, a project I've been following since its inception about a decade ago.
The bottom line is that I think Vesta gets a few things very right, but some of the design decisions are going to hold it back from hitting the big time. Vesta is a source repository, a configuration manager, and a build tool. If you buy into the Vesta way of doing things, all these pieces interact in a very nice way. For example, because you keep not only your source files but also the tools needed for building in the repository, you can always reproduce a specific build, bit for bit. It uses some neat tricks to make this work - the files in the repository are exported through NFS, and, not so coincidentally, that's how the build knows what the dependencies are. If a file is accessed during the build, it's a dependency, otherwise not.
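Vesta's trick can be illustrated in miniature. Vesta observes file accesses at the NFS layer; the toy sketch below (my own illustration, not Vesta's actual mechanism or API) does the same thing one level up, by wrapping Python's `open` while a build step runs and recording every file it touches as a dependency:

```python
import builtins

def trace_dependencies(build_step):
    """Run build_step(), recording every file opened while it runs.

    A toy analogue of Vesta's NFS interposition: Vesta sees accesses
    at the filesystem layer; here we just wrap Python's open().
    Returns (result, sorted list of accessed paths).
    """
    accessed = set()
    real_open = builtins.open

    def tracing_open(path, *args, **kwargs):
        accessed.add(str(path))          # the access *is* the dependency
        return real_open(path, *args, **kwargs)

    builtins.open = tracing_open
    try:
        result = build_step()
    finally:
        builtins.open = real_open        # always restore, even on error
    return result, sorted(accessed)
```

The appeal of doing this by observation rather than declaration is that the dependency list can't go stale: if the compiler reads a header, the header is a dependency, whether or not anyone remembered to list it.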
The biggest downside, I think, is that it's quite Unix-specific. It's not impossible to run NFS on Windows or Mac, but it's not exactly convenient either.
I think it is possible to take the best ideas from Vesta and put them in a portable framework. Rather than a build being a script which runs random commands and litters directories with temporary and result files, it should be a functional program from input to output. All intermediate results should be considered a cache. Indeed, I see no reason why you shouldn't be able to take a source package, run a simple command, and have it spit out a .deb for Debian, .rpm's of the various flavors for the Red Hat-based distros (including some intelligent analysis of how many variants are actually needed), a .pkg or .dmg or whatever the Mac people have decided is the preferred way to distribute OS X apps, an InstallShield-like installer for Windows, and a .pdb for Palms. Throw in a couple of flags, and the Unix build is instrumented to support debugging and profiling, or maybe gcc bounds checking. Better yet, have it run in an interpreter such as eic, so that you can debug runtime violations at a source level.
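The "build as functional program, intermediates as cache" idea can be sketched in a few lines. Here is a hypothetical illustration of my own (the names `cached_step` and its signature are invented for this example, not from any real tool): each build step is a pure function of its inputs, keyed by a hash of their contents, so identical inputs never trigger a rebuild and the cache can be thrown away at any time without affecting correctness:

```python
import hashlib

_cache = {}  # content-hash -> result; intermediate results are *only* a cache

def cached_step(name, inputs, fn):
    """Treat a build step as a pure function of its inputs.

    `inputs` maps input names to their contents (bytes). The result is
    keyed on a hash of the step name plus input contents, so the step
    reruns only when something it consumes actually changes.
    """
    h = hashlib.sha256(name.encode())
    for key in sorted(inputs):           # sort for a deterministic key
        h.update(key.encode())
        h.update(inputs[key])
    digest = h.hexdigest()
    if digest not in _cache:
        _cache[digest] = fn(inputs)      # cache miss: run the step
    return _cache[digest]
```

Because the key is derived from content rather than timestamps, this scheme gets bit-for-bit reproducibility and correct incremental rebuilds from the same mechanism; deleting `_cache` costs you time, never correctness.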
What exactly is standing in the way of such a thing? My guess is that the main thing is inertia.