Wednesday, May 26, 2010

I have two collections and I need to see if one of them is a representation of the other. Since each kind of object has an Id, the two sets of Ids should match, but since collections are unordered, you can't just do a pairwise match.

The solution is to use the containsAll method to see if each collection contains all the elements of the other one. In Scheme this is trivial:
(let ((foo-ids (map foo-id (get-foos)))
      (bar-ids (map bar-id (get-bars))))
  (and (contains-all foo-ids bar-ids)
       (contains-all bar-ids foo-ids)))
in Java, not so easy:
Collection<Id> fooIds =
    Collections2.transform (getFoos(),
                            new Function<Foo, Id> () {
                              public Id apply (Foo foo) {
                                return foo.getId();
                              }
                            });

Collection<Id> barIds =
    Collections2.transform (getBars(),
                            new Function<Bar, Id> () {
                              public Id apply (Bar bar) {
                                return bar.getId();
                              }
                            });

return fooIds.containsAll(barIds) &&
       barIds.containsAll(fooIds);
I know that this is not Java's forte, but this isn't some obscure stratified monadic combinator, it's just map, for heaven's sake! If future maintainers of this code can't handle map, maybe they should consider a career in the fast food industry.
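(With Java 8 streams, the mapping step finally got painless. Here's a sketch of the same check; the Foo and Bar record types and a String-valued id are stand-ins for the domain objects in the post.)

```java
import java.util.List;
import java.util.stream.Collectors;

public class IdComparison {
    // Stand-ins for the post's domain objects; the real classes have getId().
    record Foo(String id) {}
    record Bar(String id) {}

    // True when each collection of ids contains all the ids of the other.
    static boolean sameIds(List<Foo> foos, List<Bar> bars) {
        List<String> fooIds = foos.stream().map(Foo::id).collect(Collectors.toList());
        List<String> barIds = bars.stream().map(Bar::id).collect(Collectors.toList());
        return fooIds.containsAll(barIds) && barIds.containsAll(fooIds);
    }

    public static void main(String[] args) {
        // Same ids, different order: collections are unordered, so this should match.
        List<Foo> foos = List.of(new Foo("a"), new Foo("b"));
        List<Bar> bars = List.of(new Bar("b"), new Bar("a"));
        System.out.println(sameIds(foos, bars));
    }
}
```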

Tuesday, May 25, 2010


Collection expectedIds =
    Collections2.transform (getFoos(),
                            new Function () {
                              public Id apply (Foo foo) {
                                return foo.getId();
                              }
                            });

Looking forward to ILC 2010

Daniel Herring sends this announcement:

The ALU is pleased to announce that the International Lisp Conference 2010 (ILC-2010) will be held mid October in Reno, Nevada.

ILC-2010 will be colocated with OOPSLA/SPLASH.

Colocated conferences allow people to explore other sessions; but the ILC itself will be the same independent, lisp-intensive event you've grown to love.

More announcements, including the ILC conference website and call for papers, will appear in the coming weeks.

p.p. Daniel Herring
ALU Board of Directors

I'm planning on being there. Anybody want to write a paper?

Tuesday, May 11, 2010

Doing it wrong

I'm hacking some Java code and I'm going to do a query and return some objects. Since I have no need to mutate the collection, I decide to use an immutable list. To make a long story short, I did this:

final ImmutableList.Builder<Object> intermediateCollection = ImmutableList.builder();
    ...
    if (fromObjectClass.isInstance(object))
        intermediateCollection.add(object);
    ...
    query (intermediateCollection.build());
This is bad enough, but we'll let it slide.

It turns out that the query method wants to mutate the collection. (This is obnoxious. When I borrow something I try to leave it in the same condition as when I got it. I don't go mutating it and I expect my libraries to extend the same courtesy.) I guess I don't care all that much — I can simply make the collection mutable.

So why can't I just delete the `Immutable' prefix?

// Now it is named `Lists' with an `s'.
// Now I have to specify the implementation technique.
// Why isn't it a List.builder?
  final List<Object> intermediateCollection = Lists.newArrayList();

// Why isn't there a build that is a no-op?
   query (intermediateCollection);
This is simply wrong.
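(One way to sidestep the whole naming asymmetry, using only the JDK: build the value immutably, then hand the library its own mutable scratch copy, so its mutation can't disturb yours. The `query` method here is a hypothetical stand-in for the library call that insists on mutating its argument.)

```java
import java.util.ArrayList;
import java.util.List;

public class MutableCopy {
    // Hypothetical stand-in for the library call that mutates what it borrows.
    static void query(List<Object> collection) {
        collection.add("result-from-query");
    }

    public static void main(String[] args) {
        // Build immutably (List.of is unmodifiable, Java 9+)...
        List<Object> intermediate = List.of("seed");
        // ...then pass the library a mutable copy it is free to scribble on.
        List<Object> scratch = new ArrayList<>(intermediate);
        query(scratch);
        System.out.println(intermediate);
        System.out.println(scratch);
    }
}
```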

Thursday, May 6, 2010

Not too many takers on that batch of questions. Oh well.

Anyone care to take a stab at how these notions of equality relate to procedural abstraction?

Wednesday, May 5, 2010


What does it mean to say that two programs are ‘the same’?

What does it mean to say that two programs are ‘equivalent’?

What is the difference between intensional and extensional equivalence?

What is ‘observational equivalence’?
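(One concrete way to frame the intensional/extensional distinction, sketched in Java: these two procedures are intensionally different — different code, different asymptotic cost — yet extensionally they agree on every input we can observe. The example and its names are mine, not from any particular text.)

```java
public class Equivalence {
    // Intensionally: an O(n) loop.
    static int sumLoop(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) total += i;
        return total;
    }

    // Intensionally: an O(1) closed form.
    static int sumFormula(int n) {
        return n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        // Extensionally (observationally), they agree on every tested input.
        boolean agree = true;
        for (int n = 0; n <= 1000; n++) {
            if (sumLoop(n) != sumFormula(n)) agree = false;
        }
        System.out.println(agree);
    }
}
```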

Tuesday, May 4, 2010

C++ vs. Lisp

Yes, a provocative title.

In this blog post, Mr. Ebhakt decides to compare C++ and Lisp. (Go ahead and read it, I'll wait...)

Wow! It seems the original article was written by Brandon Corfman. My apologies to Mr. Corfman. Shame on you, Ebhakt.

I had to laugh. First, he is disappointed that it took him eight hours to find a solution (Norvig did it in two) and that his initial attempt took 220 lines of code (Norvig did it in 45). One source of conciseness is that Lisp function calls return useful values, while C++ function calls typically cause side effects. Thus the Lisp expression
(format t "~a:~{ ~a~}~%" num (reverse words))
could make use of the return value from reverse whereas the C++ version
cout << num << ":"; 
for (list<string>::const_iterator i = words.begin(); i != words.end(); ++i) 
    cout << *i; 
cout << "\n";
could not. Mr. Ebhakt also noted that ‘most of the complexity of the STL arises from its explicit use of iterators.’ Henry Baker pointed out in 1992 that iterators are “Signs of Weakness in Object-Oriented Languages”. Armed with this knowledge, Mr. Corfman “decided to rewrite all of the STL’s algorithms to accept containers instead of iterators” and “also make the algorithms (where possible) return copies of containers instead of iterators”. Furthermore, he decided “to build another layer of utility functions on top of the STL to perform common tasks such as tokenizing strings, looking up values in containers, reading from files” etc. The end result? “The new line count is 79 lines.” I won't bother quoting Philip Greenspun at this point.

Mr. Corfman sums it all up: ‘C++ isn’t all that bad.’ Of course it doesn't have garbage collection, macros, and closures, but it has ‘extensive library support’ (which he just spent hours throwing away and re-implementing) ‘and of course, its premier status’ which, with an additional two dollars, will buy you a large coffee.

Mr. Corfman came so close to the mark and just missed it. Lisp isn't magic; it's at a sweet spot in the design space. Lisp doesn't do anything the other languages cannot, but it does what the other ones do not. It works that much harder for the programmer so the programmer can do other things (like program). Sure, a Ford Model A isn't all that bad. It can get you where you're going and it is unsurpassed in popularity, but if you spend most of your waking hours behind the wheel, wouldn't you rather drive a Duesenberg?
After finding out that this was written by Brandon Corfman, I decided to see if his opinion had changed at all. On his Road to Lisp page, he says “Lisp still has the definite edge on the power curve. I think the clue for me is that I still find myself saying, ‘Why didn't they do this like Lisp?’”
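(For what it's worth, the expression-oriented style that made the Lisp line short is available in post-8 Java too. A sketch, with made-up sample data; the joining collector plays roughly the role of the ~{ ~a~} iteration directive in format, and the whole formatting step composes as one expression.)

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collectors;

public class ExpressionStyle {
    public static void main(String[] args) {
        int num = 7;
        List<String> words = List.of("one", "two", "three");
        // Reverse into a fresh list (List.of is unmodifiable).
        List<String> reversed = new ArrayList<>(words);
        Collections.reverse(reversed);
        // The joining collector makes the formatting a value-returning
        // expression instead of a loop of output statements.
        String line = num + ":" + reversed.stream()
                                          .map(w -> " " + w)
                                          .collect(Collectors.joining());
        System.out.println(line);
    }
}
```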

Monday, May 3, 2010


Every now and then I run into ‘quasi-cranks’ in the computer industry. These people don't fit the mold of the standard crank. For instance, they often appear reasonably normal (although in this field, there is more than the usual amount of latitude for ‘normal’), they generally don't rant unless you specifically mention certain trigger words, and they seem to have successful computer careers. But when you start talking about technical details, you realize that these people are profoundly confused.

I often help people debug their code. Most of the time I can follow what they're thinking and how the code is structured. Every now and then, though, I'll come across someone who has some insane hairball of code. They'll point out the location where the error is raised and explain what the intended behavior is. (I don't know why they insist on telling me this. If the code behaved the way it was intended, there wouldn't be an error.) I listen patiently and then ask them what the code actually did, and then try to understand where the reasoning went wrong. It's at this point I get the first inklings of weirdness. I'll find out that what they want to do is fairly simple, but that the error is bizarre. For example, they'll say they want to look something up in a table, but the error message is something like Error: Variable reference to syntactic keyword.

Now this can go one of three ways. The best is when I ask them if they're doing something weird and they sheepishly admit “Yeah, I thought I could model this with a first-class environment... I guess I should use a hash table.” These people have a future in computers. They're exploring.

Second best is when they say “I'm using a first-class environment to hold my lookup table.” and when I suggest some more conventional mechanism they assert “No, I need to be able to call eval.” and it turns out they have some hare-brained idea that kinda sounds plausible, but won't work in practice. These people can make a living in computers, but they need a structured environment.

Then there are the third type of people. When you look at the hairball of code you see a bunch of things that make no sense at all: continuations that are captured, but not bound to a variable, spurious data structure copying, strange and unexpected variable names, checks for errors that cannot logically occur, etc. If you ask them what they are doing, they'll say something like “I have to eval the current continuation on the stack.” or “That forces the oldspace to swap.” If you try to encourage them to try a standard approach they explain “Look, this code pushes a binding on the continuation and I need to find a way to remove it. The error message is because the garbage collector is evaluating the wrong binding. If I could remove the binding here, then the collector will see the correct binding here.”

The first couple of times I ran into people like this I tried to nudge them in the right direction. It doesn't work. People like this are profoundly confused. I'm not sure what they are doing. They think they are programming, but the logic is unrelated to any model of programming I am familiar with. I'll usually feign ignorance and suggest some debugging tool or something. (Sometimes they'll come back with a ‘solution’ to their bug: “Oh yeah, it was easy, I just pushed a second null on the continuation.”)

You can tell these quasi-cranks from the way they use computer jargon and terminology. They “misunderstand or fail to use standard notation and terminology,” and “ignore fine distinctions which are essential to correctly understand mainstream belief.” (as per the Wikipedia article). But it astounds me that some of these people seem to have careers as programmers despite their confusion.