- Alan Bawden suggested that arc might benefit from a module system based on his `first-class macros', i.e. where procedures can take macros as arguments and still have everything resolved at compile time. I think this would be wonderful, and would really allow arc to stand out as a technically innovative programming language.
- `Unix won'
- Case sensitivity: I'm not sure about this - Unix has won the battles so far, but it still might be displaced by an MS/NT system as the de facto server standard. A practical consequence: it can sometimes be a bit of a nuisance to handle MS filesystems using a case-sensitive symbol system. I'd like to see a way to make this something that the user can configure at run-time. I'm putting together a proposal on how to handle this.
- UNIX awareness: This is a good thing, but I'd like to see an attempt to be multi-platform. Python has got this almost right, with language design being focussed around what is practical to do on all platforms. Having said this, sometimes Python has a lowest-common-denominator feel. Perhaps having parallel arc/UNIX, arc/JVM and arc.NET implementations, with an attempt to keep the intersection of the three (Portable arc?) as large as possible, would be a good thing?
- Soft typing: I'd like to see language support for soft typing, perhaps based on intersection types.
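As a rough illustration of the intersection-type idea (a Python sketch, since arc has no such feature; the `satisfies` helper is hypothetical): a value inhabits an intersection type A ∧ B exactly when it inhabits both A and B, and a soft typer would warn rather than reject when such a check cannot be discharged statically.

```python
from collections.abc import Iterable, Sized

# Hypothetical sketch: a value inhabits an intersection type when it
# satisfies every component type.
def satisfies(value, *types):
    """True iff `value` inhabits the intersection of all `types`."""
    return all(isinstance(value, t) for t in types)

# A list is both Iterable and Sized; a generator is Iterable but not Sized,
# so it falls outside the intersection Iterable ∧ Sized.
print(satisfies([1, 2, 3], Iterable, Sized))         # True
print(satisfies((x for x in [1]), Iterable, Sized))  # False
```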
- Infix/currying/laziness: I'd like to see something making it easy to incorporate Haskell-style user-definable infix operators, currying and lazy functions/list manipulations. I've an idea that this can all be done with a few special forms:
- We have a [...] special form that allows infix notation of non-function types.
- We have a [> ...] special form that allows curried infix notation with eager semantics: the syntax is as before but we allow `_' parameters which are treated as parameters of a fn/lambda construction (so [_ + _] is (fn (x y) (+ x y)));
- We have a [< ...] special form similar to the eager form, that allows curried infix notation with lazy semantics, handled using some future-like mechanism;
The idea behind the `<' and `>' mnemonic is that eager reduction strategies in the lambda calculus tend to evaluate beta redexes further to the right than lazy reduction strategies do. This would allow a lot of the Bird--Meertens formalism to be modelled painlessly in arc;
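The eager and lazy forms above can be mimicked in Python (names hypothetical; arc itself has no such forms): `[> _ + _]` becomes an ordinary two-argument function applied immediately, while `[< _ + _]` takes its arguments as thunks and returns a future that is forced only on demand.

```python
import operator

# Eager curried section, like the proposed [> _ + _]:
# (fn (x y) (+ x y)) -- arguments are evaluated immediately.
def eager_add():
    return lambda x, y: operator.add(x, y)

# Lazy section, like the proposed [< _ + _]: arguments arrive as thunks
# (futures), and nothing is evaluated until the result is demanded.
def lazy_add():
    def section(x_thunk, y_thunk):
        return lambda: x_thunk() + y_thunk()  # a future for the sum
    return section

plus = eager_add()
print(plus(1, 2))  # 3

future = lazy_add()(lambda: 1, lambda: 2)
print(future())  # 3 -- the addition happens only here
```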
- Regexps: I'd like to see support for regular expressions in the core language, perhaps following the proposal of Olin Shivers.
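Shivers' proposal (SRE notation) expresses regexps as s-expressions built from combinators rather than flat strings. A rough Python analogue of that style (combinator names hypothetical) composes pattern fragments that compile down to ordinary regex syntax:

```python
import re

# Hypothetical sketch of s-expression-style regexp combinators,
# in the spirit of Shivers' SRE notation.
def seq(*parts):  return "".join(f"(?:{p})" for p in parts)
def alt(*parts):  return "|".join(f"(?:{p})" for p in parts)
def star(p):      return f"(?:{p})*"
def rep1(p):      return f"(?:{p})+"

# (seq (+ digit) "." (+ digit))  ~  a simple decimal number
decimal = seq(rep1(r"\d"), r"\.", rep1(r"\d"))
print(bool(re.fullmatch(decimal, "3.14")))  # True
print(bool(re.fullmatch(decimal, "abc")))   # False
```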
- Lexemes: I've an idea about extending the usual treatment of environments with combinator parsing-like ideas: we extend environments so that instead of just mapping symbols to values, they also allow us to map patterns that can stand for infinitely many symbols to combinator parsers that recursively build up an expression. This might be nice for handling Perl-style regexps.
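A minimal sketch of this in Python (the `Env` class and rule format are hypothetical): lookup first tries ordinary bindings, then falls back to pattern rules, each of which covers infinitely many symbols. Here one rule makes any symbol spelled `#/.../` denote a compiled regexp, giving Perl-style regexp literals for free.

```python
import re

# Hypothetical sketch: an environment that maps symbols to values, with
# fallback (pattern, builder) rules covering infinitely many symbols.
class Env:
    def __init__(self):
        self.bindings = {}
        self.rules = []  # list of (compiled_pattern, builder)

    def add_rule(self, pattern, builder):
        self.rules.append((re.compile(pattern), builder))

    def lookup(self, symbol):
        if symbol in self.bindings:
            return self.bindings[symbol]
        for pat, build in self.rules:
            m = pat.fullmatch(symbol)
            if m:
                return build(m)  # recursively build a value from the symbol
        raise NameError(symbol)

env = Env()
env.bindings["x"] = 42
# Any symbol of the form #/.../  builds a compiled regexp.
env.add_rule(r"#/(.*)/", lambda m: re.compile(m.group(1)))

print(env.lookup("x"))                                     # 42
print(env.lookup(r"#/\d+/").fullmatch("123") is not None)  # True
```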