Claim 2: General tail-recursion is an essential element of a programming language.
By `general' tail-recursion I mean that all calls that occur in `tail' position avoid allocating. Many languages or implementations support `self' tail-recursion, in which tail-recursive calls to the same procedure are compiled or interpreted as loops, but fewer support the completely general case of arbitrary mutually tail-recursive procedures acting as a loop.
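To see the difference, here is a sketch in Python (an illustrative choice, not a language this post is about; Python does no tail-call elimination at all): a classic mutually tail-recursive pair in which neither procedure calls itself, so self tail-recursion alone cannot help.

```python
# Each procedure calls the other in tail position, so an implementation
# with general tail calls runs this pair in constant space.  Python
# eliminates no tail calls, so every call burns a stack frame and a
# deep input overflows.
def is_even(n):
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)

print(is_even(100))     # True -- shallow enough to fit on the stack
try:
    is_even(10**6)      # the same iteration, but a million frames deep
except RecursionError:
    print("ran out of stack")
```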
`Essential' is trickier to justify. There are many arguments for it and one really big argument against it. Might as well play devil's advocate first.
General tail recursion cannot be `essential' in a meaningful way because few programming languages support it and no one is complaining.
That is certainly a true statement, so let's compare two variants of a hypothetical language: one has no tail recursion, the other supports general tail recursion. The first observation we can make is this: any program that runs in the language without tail recursion will also run in the language with tail recursion. (That is, if it completes with an answer, both versions produce the same answer.) The second observation is this: there are programs that run in the tail-recursive language but fail to run in the language without tail recursion. These are, obviously, the programs that run out of stack or memory. So the claim that general tail recursion is essential is equivalent to the claim that there exist interesting programs that run in a tail-recursive language and fail to run in a language without it. The devil's advocate position would deny this:
General tail recursion cannot be `essential' because there are no interesting programs that require a general tail-recursive implementation.
It seems unlikely that this is an a priori truth. Even the fact that there exists a class of programs that require general tail-recursion is interesting in itself. There is a slightly weaker argument that gets around this objection without much difficulty:
There are no interesting programs that require a general tail-recursive implementation because there are equivalent programs that do not require a general tail-recursive implementation.
This is also true, but the argument has been critically weakened. The equivalent programs are not, in general, `macro expressible' in the original language. You can turn the self tail-recursion into a loop fairly easily but you cannot make a loop out of mutually tail-recursive procedures without whole-program analysis (that is, an analysis at the level above the scoping of the mutually recursive procedures).
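To make that concrete, here is a Python sketch (Python is an illustrative choice; the procedure names are mine): three mutually tail-recursive procedures that compute n mod 3, followed by the single loop that a whole-program rewrite produces. The loop can only be derived by looking at all three procedures together; no one of them can be rewritten in isolation.

```python
# Three mutually tail-recursive procedures computing n mod 3.
def mod3_a(n):          # remainder would be 0 if n were 0 here
    return 0 if n == 0 else mod3_b(n - 1)

def mod3_b(n):          # remainder would be 1
    return 1 if n == 0 else mod3_c(n - 1)

def mod3_c(n):          # remainder would be 2
    return 2 if n == 0 else mod3_a(n - 1)

# The whole-program rewrite: all three procedures fused into one loop,
# with the "which procedure are we in" information reified as an
# explicit state variable.
def mod3_loop(n):
    state = 0
    while n != 0:
        state = (state + 1) % 3
        n -= 1
    return state

print(mod3_a(7), mod3_loop(7))   # both print 1
```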
So why doesn't anyone complain about this? Most programming languages require you to write loops using special primitive looping constructs. These languages encourage the programmer to identify and separate out the iterative and recursive forms. The programmer is trained from very early on to perform the larger analysis needed to identify iterative control flow and rewrite it as a looping construct. It isn't all that difficult, and you have to do it anyway, so what's there to complain about?
Continuation passing style.
Continuation passing style is an example of a programming paradigm that can be very difficult to convert into a looping construct. Every CPS function performs a tail-recursive call to another CPS function, and rather than keep nested state on the stack, the state is kept in nested closures. If every function takes exactly one continuation argument, then it isn't too hard to `unravel' the CPS code into an equivalent `direct style' program, but the more complicated CPS code that takes more than one continuation cannot be easily converted to direct style.
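A minimal Python sketch of both cases (the function names are mine, for illustration): a CPS factorial, which takes exactly one continuation and could still be unraveled into direct style, and a two-continuation search of the harder kind.

```python
# Factorial in continuation-passing style: every call is in tail
# position, and the pending multiplications live in the nested
# closures `k` rather than in stack frames.
def fact_cps(n, k):
    if n == 0:
        return k(1)
    return fact_cps(n - 1, lambda r: k(n * r))

# A search with TWO continuations -- sk for success, fk for failure.
# Code in this shape no longer unravels routinely into direct style.
def find_first(pred, xs, sk, fk):
    if not xs:
        return fk()
    if pred(xs[0]):
        return sk(xs[0])
    return find_first(pred, xs[1:], sk, fk)

print(fact_cps(5, lambda r: r))                      # 120
print(find_first(lambda x: x > 2, [1, 2, 3, 4],
                 lambda v: v, lambda: "not found"))  # 3
```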
There are few loops in continuation passing style. Control is transferred from function to function in a uniform manner and the difference between `iteration' and `recursion' is seen only as a difference in the space complexity of the continuation arguments. The interpreter or compiler must be sure to avoid inessential allocation when transferring control or an otherwise iterative construct would consume an arbitrary amount of memory (or stack).
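One well-known way an implementation (or a determined programmer) can transfer control without consuming stack is a trampoline, sketched here in Python. Each function returns a thunk for its tail call instead of making the call; a driver loop bounces from thunk to thunk in constant stack. This is only a sketch of the technique, not how any particular Scheme system works.

```python
# Driver loop: keep invoking thunks until a non-callable value appears.
# (This sketch assumes final answers are never themselves callable.)
def trampoline(thunk):
    while callable(thunk):
        thunk = thunk()
    return thunk

# The parity pair in trampolined style: tail calls become returned thunks.
def is_even_t(n):
    return True if n == 0 else (lambda: is_odd_t(n - 1))

def is_odd_t(n):
    return False if n == 0 else (lambda: is_even_t(n - 1))

print(trampoline(lambda: is_even_t(10**6)))  # True, in constant stack
```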
If you agree with my claim that languages that support continuation-passing style are more expressive than ones that do not, and that general tail recursion is a necessary prerequisite for continuation-passing style, then you ought to agree that languages that have general tail recursion are more expressive than ones that do not.
Some more about general tail recursion later.
What about costs of introducing tail recursion into a language? E.g., influences on debuggability (stack traces), interaction with conditions and special variables.
I'm not entirely sure how to address the question. I was contrasting a pair of equivalent hypothetical languages that differed only in that one guaranteed tail recursion and the other consumed stack on every call. You seem to be asking about the practical difficulties of retrofitting general tail recursion into a language that did not take it into account when it was originally designed.
This is a bit of a digression, but I can offer a couple of opinions.
Some languages, like C++, place such restrictions on the implementation that tail recursion is essentially impossible (control must return to a frame in order to run the destructors). If your language has some feature like that, then you are basically hosed: you simply cannot use mutually recursive procedures in an unbounded way without overflowing the stack.
Other languages, like Common Lisp (which I think you had in mind), have only a few impediments to offering fully general tail recursion. You mention special variables and conditions. There are two approaches: the easiest is to avoid using special variables in tail-recursive loops. That's practical enough, but not theoretically satisfying. The second approach is to solve the `dead binding' problem and collect the special variable stack. Darius Bacon wrote a short page on the problem and Henry Baker discusses it as well. Conditions can be handled in a manner analogous to dead bindings.
The other thing you mentioned is stack traces. There are a few points to make. The first is that debugging isn't usually considered part of the `language proper' as far as semantics are concerned. The second is that traditional iterative constructs like for and while do not push frames for debugging and no one seems to mind. But I think the best `argument' (although what are we arguing?) is that you can preserve a fixed amount of backtrace information without making recursion use unbounded space. MIT Scheme has a `history' mechanism that records the most recent tail-recursive calls. You can dynamically adjust the history size to give you more or fewer frames of backtrace.
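The idea can be sketched in a few lines of Python (the ring buffer and the explicit recording calls are my invention, loosely in the spirit of MIT Scheme's history, not its actual mechanism): only the most recent calls are retained, so the backtrace stays bounded no matter how long the iteration runs.

```python
from collections import deque

# Fixed-size ring of the most recent calls.  Adjust maxlen for more or
# fewer frames of backtrace.  (In a real tail-recursive implementation
# the calls themselves would also run in constant stack; plain Python
# recursion does not, but the bounded history is the point here.)
HISTORY = deque(maxlen=8)

def record(name, arg):
    HISTORY.append((name, arg))

def is_even_h(n):
    record("is_even", n)
    return True if n == 0 else is_odd_h(n - 1)

def is_odd_h(n):
    record("is_odd", n)
    return False if n == 0 else is_even_h(n - 1)

is_even_h(100)
print(list(HISTORY))   # only the last 8 calls survive as "backtrace"
```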