I'm working on a series of classes for intermediate-level programmers to take their programming up to the next notch, and I'd like input from some of you.
The idea is that we are going to be looking at features from various computer languages (mostly Lispish ones), showing how they aid programming, and show how to use those ideas in languages that don't necessarily have direct support for them.
The current list of classes is at
http://www.eskimo.com/~johnnyb/computers/programmers_toolbox.html
I'm looking for comments about language features and paradigms that I could add to the course, or any thoughts you had on the existing class list.
From my brief reading that looks like a pretty <functional/> list, which would be worthwhile learning. However, you might want to expose them to some Design Patterns stuff. For good or ill, objects are here to stay and OO puts a premium on good design.
My day job today deals with Oracle. I never wound up taking the database class at college, but it turns out that what helped me the most was learning functional programming via FP (an abstract functional programming language, though one which does have a few implementations floating around) and Scheme (I credit FP more, actually), and then logical programming (= Prolog). Learning the two in combination helps to break out of the procedural mold of thinking that hampers writing powerful SQL statements.
(It also helped greatly that PL/SQL is basically Ada, and they made us learn Ada for several courses... but that's another matter.)
The functional/logical way of thinking is more useful than learning the intricacies of garbage collection, IMPO. Those fundamentals have a better chance to influence programming elsewhere in other languages/paradigms (= procedural, for most of us) than how Scheme does garbage collection.
I'm going to include some of Alexandrescu's design pattern work in the part about compile-time programming. However, I could never really boil design patterns down to a basic idea. If you have some help in that direction, I'd greatly appreciate it.
I also had originally scheduled a class for packaging state (encapsulation) and interface programming (inheritance), but figured that that's something most people will probably already have exposure to, although maybe not as subjects orthogonal to each other (they are usually taught as basically the same thing). I will discuss objects to some extent in the section on closures (since objects are really just a vector of functions defined over a closure).
If I could find something new or interesting to say about object-oriented programming, I'd love to. I'm a big fan of OOP simply because it's a commonly-encountered special case of closures.
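To make that concrete, here's a minimal sketch (in Python, purely for illustration; `make_account` is a made-up name) of an object as a vector of functions defined over a closure:

```python
# A "bank account object" built purely from closures: the local variable
# `balance` is the encapsulated state, and the returned dict is the
# "vector of functions" (methods) closed over it -- a hand-rolled vtable.
def make_account(balance):
    def deposit(amount):
        nonlocal balance
        balance += amount
        return balance

    def withdraw(amount):
        nonlocal balance
        if amount > balance:
            raise ValueError("insufficient funds")
        balance -= amount
        return balance

    # The method table: dispatch by name, just like a vtable.
    return {"deposit": deposit,
            "withdraw": withdraw,
            "balance": lambda: balance}

acct = make_account(100)
acct["deposit"](50)
print(acct["balance"]())  # -> 150
```

The state is completely hidden: nothing outside `make_account` can touch `balance` except through the method table, which is exactly what encapsulation buys you in an OO language.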
Don't advocate language-specific hacks like C++ template expressions too boldly. They are not useful for programming-in-the-large; they're a passing fancy. (But some hacks can actually be useful, like Qt's.)
Do emphasize important language features:
- higher-order functions
- pattern-matching
- extensible syntax
- evaluation
Also consider the pragmatics of real programming:
- system facilities - which ones are worth making use of?
- algorithms
Those are excellent ideas. It's a slightly different focus from what I'm doing - you were giving examples of programming language features, while I was thinking along the lines of programming language ideas - a small difference, but not much of one.
I'm not going to teach algorithms or system facilities, because, well, it's not yet a focus.
Higher-order functions is an excellent one. I kind of have it covered with closures, but higher-order functions may deserve a topic to themselves. However, I haven't yet thought of a good way to go about teaching it. Any ideas or good examples of tutorials on higher-order functions would be appreciated.
Pattern-matching is a good one. There are a few different ways to approach it - REGEXs and more generalized patterns. I would actually like to do _both_, so people have a good "I can do this right now" lesson to go play with, as well as the theory. The problem is, _I_ don't know pattern matching theory very well, so any good pointers would be appreciated.
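For the "I can do this right now" side, a regular-expression lesson could start with something as small as this sketch (Python's re module, purely as an illustration; the log line and group names are made up):

```python
import re

# Named groups pull structured fields out of a flat log-style line.
log_line = "2003-05-17 12:04:01 ERROR disk full"
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2}) "   # ISO-style date
    r"(?P<time>[\d:]+) "              # hh:mm:ss timestamp
    r"(?P<level>\w+) "                # severity level
    r"(?P<msg>.*)"                    # the rest of the line
)

m = pattern.match(log_line)
if m:
    print(m.group("level"), "-", m.group("msg"))  # -> ERROR - disk full
```

The theory side (regular languages, finite automata, and the more general unification-style matching found in ML/Haskell/Prolog) can then build on the intuition this gives.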
Extensible syntax is a little language-specific, but I'm kind of covering it in the "macros" section.
What is there to say about evaluation?
I _am_ going to include C++ template hacks, however, as you can do a half-way version of a lot of the good ideas from Scheme and Lisp with them, especially compile-time programming. Most of it is going to be based on Alexandrescu's Modern C++ Design. Yes, it's hacky, but I want to give the classes so that they are immediately useful to those people in the classes in the projects they are in right now. To say, "this is a great feature, but you have to totally change programming environments and rewrite your code in language X to use it" makes it completely pointless to most people.
I think you should show something along the lines of the C++ stdlib; it might be useful. There are lots of functions in <algorithm> etc. that accept "function objects", which makes them higher-order (in the imperative sense).
For instance, if you've got a tree, a function that takes a function to transform it into another tree would be such a function.
In real functional languages, there are real higher order functions such as map or fold. Most functional language tutorials will cover them extensively.
Thanks,
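A quick sketch of the map/fold and tree-transform examples mentioned above (in Python, just for illustration; `tree_map` is a made-up name and the nested-tuple tree representation is an assumption):

```python
from functools import reduce

# map and fold (reduce) are the canonical higher-order functions:
squares = list(map(lambda x: x * x, [1, 2, 3, 4]))       # [1, 4, 9, 16]
total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4], 0)  # 10

# The tree example: a function that takes a function and applies it to
# every leaf, yielding a new tree. Trees are nested tuples here.
def tree_map(f, tree):
    if isinstance(tree, tuple):
        return tuple(tree_map(f, child) for child in tree)
    return f(tree)

doubled = tree_map(lambda x: x * 2, (1, (2, 3), ((4,), 5)))
print(doubled)  # -> (2, (4, 6), ((8,), 10))
```

The point for teaching is that `tree_map` separates "how to walk the structure" from "what to do at each element" - exactly the separation the <algorithm> function objects give you in C++.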
Look at the original JavaScript object model compared to Java classes. There are good examples there for seeing the differences.
What is really good in JS is that you can enumerate all properties of an object - and that really means all of them: functions are also properties. And you can change any property (again, including functions).
This helped me a lot in learning the Netscape 4 runtime object model (IE is not good at following this idea). I used the same JS to learn the JS DOM :)
Also, one thing to point out is the ability to write 24h/7d software. You don't need to compile it or stop-and-restart the application just to patch or update some code (you just update objects and properties at runtime). I have heard XoTCL(?) has the same ability.
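The same trick works in other dynamic languages; here's a sketch in Python (purely illustrative; `Greeter` is a made-up class) of enumerating an object's "properties" and hot-patching a method at runtime:

```python
# In Python, as in JS, methods are just attributes you can enumerate
# and reassign -- including while the program is running.
class Greeter:
    def greet(self):
        return "hello"

g = Greeter()

# Enumerate the class's non-dunder attributes; 'greet' shows up like
# any other property.
names = [a for a in vars(Greeter) if not a.startswith("__")]
print(names)  # -> ['greet']

# Hot-patch the method without restarting anything; existing
# instances pick up the new behavior immediately.
Greeter.greet = lambda self: "bonjour"
print(g.greet())  # -> bonjour
```

This is the mechanism behind the "patch a running system" style of development: the method table is ordinary mutable data.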
BTW, there is one thing which is not supposed to be part of a language, but actually is: code storage and preprocessing. In C/C++ it is #define/#ifdef/#include preprocessing. In Perl it is require/... (see the return code), in PHP include/require (see the difference between them), in Java you must define every public class in a separate file, etc.
johnnyb, you can't quite put your finger on Design Patterns.
Common problem. The best introduction is ISBN 0195019199, A Pattern Language - the original book, written for architecture.
As with anything arriving with fanfare, there are detractors who oppose the popularity, if not the idea. Setting the expectations to something reasonable can defuse these spats.
DesignPatterns on the Perl Patterns Wiki links to a few "are patterns worth it" articles,
and tries to relate patterns back to their origins.
The most striking thing about the original "Design Patterns" is how it ties together large scale structures with the structure beneath them, encouraging involvement at all levels. If you're a home owner, things on the next largest scale are cross referenced - neighborhood improvement, dealing with the city, forming neighborhood parks, reducing through traffic, and so forth. Things on the current scale are discussed - things you can do to or with your house. Things on a smaller scale are cross referenced - the concepts of open green spaces, working with the sun, adding a patio. Likewise, if you're designing a city from the ground up, it cross references transportation, public transit, "county fingers", parks, highways. Throughout, it is always assumed that the reader is doing her utmost to work towards the highest ideals, even if they are constrained in some ways.
Nomenclature, cross referencing, working across all scales,
and working towards high ideals
are the most profound useful parts. In modern "Design Patterns",
only two of these elements have been preserved.
Perl is idiomatic because
sequences of compressed logic are difficult to decipher, but familiar
once understood. Java is idiomatic because large structures must
be built to take advantage of OO.
Off topic, I know. Forgive me. I hope this puts some perspective
on using Design Patterns in education, though.
Programming is learnt by doing, as is surgery. In medical school, you'll practice operations over and over. It won't be until the end of your curriculum that you'll even begin to hear
about experimental procedures or technologies. OO and Design
Patterns solve problems of programmer communication and building
large scale applications. It has been said that BASIC ruins a
programmer for life - I've found the opposite to be true as well -
a programmer who never experiences programming without automatic
storage (the stack) and nested control structures has little
concept of what they are instructing the computer to do. To
the degree that this is true, I would advise you to start as
simple as is correct, and then only add complexity as you
can find ways to prove the need for it to your students.
As a final parting link, Worse Is Better.
The problem I had with teaching design patterns is that just teaching the concept gives someone nothing, but you can't teach all the patterns in one class (especially because they are infinite), so it's hard to give someone something concrete. The idea behind design patterns really isn't all that new or different - it's something a lot of people knew anyway, just never wrote down.
If you have any ideas how to teach design patterns as a general approach to programming, I'd love to see it.
I've seen a lot of people refer to the book you mention as a correction to the GoF Design Patterns, but I've never seen anyone try to retell it in a tutorial-like fashion, and I don't have the cash right now to buy it.
Hi,
looks like you've covered many interesting things-- however, you really
should go into detail about type systems. Writing working programs without type checking just doesn't work beyond the trivial cases (which is why you don't see many big Python/Perl programs out there), so you want to make sure (a) that your program is at least somewhat sane when compiling it, and (b) you want to strip out as many run-time sanity checks as possible. Yep, type systems do constrain you, quite a bit. But I find that I have a hard time coming up with non-metaprogramming examples that can't be expressed in System F (a previous challenge of mine to find one remained unanswered, although I assume that most people are smart enough not to read my diary ;).
As references: There's a nice article from John Reynolds, "Three approaches to type structure" (1985), which details the Curry-Howard-motivated one used by Hindley and Milner and implemented in SML, the "subtyping" one used in some OO languages, and System F (aka the second-order polymorphic lambda calculus), which you can also find in Girard's book (Proofs and Types, 1989) (although you'll find many papers with a quick overview of it on the 'net, such as Wadler's "The Girard-Reynolds Isomorphism"). Beyond that, linear type systems (motivated by Linear Logic, Girard, 1987, Theoretical Computer Science 50, implemented in "Clean" (http://www.cs.kun.nl/~clean/) -- see Abramsky's 1993 "Computational interpretations of linear logic" for a nice introduction and overview of the theory), and, of course, monads (Moggi ("Computational lambda calculus and monads", 1988), Wadler ("Comprehending Monads", "Imperative functional programming", ... (long list)), implemented in Haskell) come to mind. From a more imperative background, the old distinction between explicit and structural sub-typing should be noted, and you could also mention Modula-3's "partial revelation" subtyping and its tagged datatypes. Kim Bruce's recent "Foundations of Object Oriented Languages" (FOOL) also is a nice book for an OO view of (sufficiently powerful, from a functional point of view) type systems.
Also note that there's a relatively recent paper by Simon Peyton Jones, discussing type-safe meta-programming in Haskell-- since you're doing meta-programming (to some extent), this may be of interest.
Oh, and currying might be worth mentioning, too.
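A quick illustration of currying (sketched in Python; `curried_add` and `power` are made-up names):

```python
from functools import partial

# Manual currying: a two-argument function becomes a chain of
# one-argument functions, each closing over the arguments so far.
def curried_add(x):
    def add_x(y):
        return x + y
    return add_x

add5 = curried_add(5)   # partially applied: waiting for its second arg
print(add5(3))          # -> 8

# functools.partial gives the same effect for existing functions:
def power(base, exp):
    return base ** exp

square = partial(power, exp=2)
print(square(7))        # -> 49
```

In ML and Haskell every function is curried by default, which is what makes the pipeline/combinator style of programming so natural there.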
If you're interested in tools for proving (partial) correctness of programs or papers regarding that, I can point you to a few as well. Please contact me if you can't find any of the references listed above; I didn't double-check most of them.
-- Christoph
Maybe there aren't many large Python/Perl programs, but there are quite a number of large Lisp programs (including whole OSes). Not that learning about advanced static type-systems is a bad thing (it can be quite interesting), but implying that they are necessary is not correct.
If you want to include some more Lispish concepts, Scheme isn't going to cut it. You should also discuss more useful concepts such as the condition system (more useful than raw continuations, for its purpose), and advanced object oriented programming techniques such as multi-method dispatch, method combination, etc. Particularly, you should do this before any ``Design Patterns'' stuff, since half of that is just work-arounds for the crappiness of C++.
The reason that there aren't many large Perl/Python programs is that the equivalent programs in those languages require significantly less code :)
Type systems looks like it may be something to look into. I will look into your links as I have a chance.
As for Lisp vs. Scheme - I actually plan on doing _some_ Lisp (defmacro for one), but concentrating on Scheme. If you have any ideas for good Lisp ideas I should teach, let me know.
Multi-method dispatch might be interesting, although I don't know of any good papers on it. Links to such would be nice - especially dealing with the exponential number of specializations this can require.
What on earth is method combination?
About Perl/Python in large programs: Honestly, I've done quite a bit of code in Perl, and generally don't use other languages unless there is a specific reason. This includes a large web store with a reseller network, all with different prices/currencies, multiple carts per session (possibly in different currencies), and with different pricing schemes based on the reseller (some price out model/options, others price the same items only in a preconfigured state). Oh yes, and a templating system for fast creation of store pages. The source lines of code count is drastically smaller than anything I could have imagined in any other language.
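On the multi-method question above, here's a minimal hand-rolled sketch of dispatching on the runtime types of all arguments, not just the first as single-dispatch OO does (in Python; all names here are made up for illustration):

```python
# A toy multi-method table: keys are (name, type, type, ...) tuples,
# values are the implementations. CLOS does this (plus inheritance-aware
# specificity ordering); this sketch only does exact type matches.
_methods = {}

def defmethod(name, *types):
    def register(fn):
        _methods[(name,) + types] = fn
        return fn
    return register

def call(name, *args):
    fn = _methods.get((name,) + tuple(type(a) for a in args))
    if fn is None:
        raise TypeError(f"no method {name} for {args!r}")
    return fn(*args)

@defmethod("collide", int, int)
def _(a, b):
    return "int-int"

@defmethod("collide", int, str)
def _(a, b):
    return "int-str"

print(call("collide", 1, 2))    # -> int-int
print(call("collide", 1, "x"))  # -> int-str
```

The "exponential specializations" worry only bites if you actually write a method for every type combination; in practice most combinations fall through to a few general methods (which a real system resolves via subtype-aware specificity, omitted here).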
Hi,
Referring to the paraphrased statement "Equivalent programs in Perl/Python require significantly less code": Compared to what? (C was not what I had in mind.)
I'll admit that I may just be a bad Perl/Python programmer; my programs in both of these languages usually wound up being larger and less stable than corresponding Haskell/ML programs would have been (if these languages had had the corresponding standard library functionality). The lack of stability I usually blamed on typing, since the kinds of bugs I got in Perl/Python during run-time checking (or when using the resulting programs) were often discovered by the type checker in other languages. Code size may not matter that much, since coming up with small and good functional programs requires somewhat more thought (and time) than randomly hacking together imperative statements in semi-arbitrary order (unless you've never done the latter and have a very mathematical mind).
The point I'm trying to make is mostly that today's type systems (and I'm not referring to the ones used in Java or C++) give you a lot of security-- the type system forcing you to dissect a separated sum type in its entirety (particularly wrt ML's "Option" or Haskell's "Maybe" types, whose corresponding entities in practically all other languages are implicit checks throwing null-pointer exceptions or segfaulting at run-time when failing) or enforcing full referential transparency (in Haskell) (i.e. you can always substitute a function call (f x) by its results, even if these include side-effects) raises program safety to a very different level when compared to, say, a language where mis-typing a variable name gives you a valid program where the variable's name is substituted by some (meaningless) default value during execution.
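As a sketch of that point, here's a hand-rolled Maybe type (in Python, which of course won't statically enforce the check the way ML or Haskell do - which is exactly the point being made; `Just`, `Nothing`, and `safe_div` are made-up names):

```python
# A Maybe/Option type: the caller must dissect both cases explicitly,
# instead of a null/None silently flowing through until a crash.
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")

@dataclass
class Just(Generic[T]):
    value: T

class Nothing:
    pass

Maybe = Union[Just[T], Nothing]

def safe_div(a: float, b: float) -> "Maybe[float]":
    return Nothing() if b == 0 else Just(a / b)

def describe(m: "Maybe[float]") -> str:
    if isinstance(m, Just):
        return f"result: {m.value}"
    return "no result"   # the Nothing branch must be handled explicitly

print(describe(safe_div(6, 3)))  # -> result: 2.0
print(describe(safe_div(1, 0)))  # -> no result
```

In ML or Haskell the compiler rejects any `describe` that forgets the `Nothing` case; in dynamically typed languages you only find out at run time, which is the safety gap being argued about here.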
The arguments made against type systems are usually (a) that additional notational clutter is introduced, and (b) that they restrict the kinds of programs you can write. The counter-arguments to (a) ("it's documentation", and "type inference works") are well-known, but I'm not aware of any for (b)-- I'm still looking for any examples there anyway...
As a conclusion, however, the fact that small Perl/Python programs can be made to do "a lot of stuff" already (i.e. they are easy to inspect and powerful, compared to assembly/C programs) is a valid argument for these for small-scale use (where "small-scale" refers to the size of the program, i.e. a "small-scale" Perl/Python program tends to be more expressive than a "small-scale" C program, according to that definition). This is due to their powerful and well-tested standard (and extension) libraries as well; but these two things (small code size, good standard libraries) don't scale (per definition).
(No rant intended. Just trying to justify my opinion that lack of typing implies lack of scalability. Of course, that kind of pain has rarely stopped people from doing strange things before...)
-- Christoph
See the book Art of the Meta Object Protocol, it is essentially a walk-through of an implementation of the Common Lisp Object System and how to equip it with reflective and introspective capabilities. Also relevant would be the Common Lisp Hyperspec.