It's no secret that I'm an aficionado of Lisp. It's my go-to language, especially when I don't know what I'm doing. I call it research and prototyping, but it's really just playing around until something works.
We needed to audit some of our databases at work. They ought to agree with each other and with what GitHub and CircleCI think. It took a couple of weeks, part time, to prototype a solution in Common Lisp. It showed that the databases were in 99% agreement and found the few points of disagreement and anomalies that we ought to fix or look out for.
I want to integrate this information into a dashboard on one of our tools. I prototyped this by spinning up a Common Lisp microservice that returns the information in JSON format.
But management prefers that new services be written in golang. It would be easier for me to rewrite the service in golang than to try to persuade others to use Common Lisp. It also gives me the opportunity to compare the two languages head to head on a real-world problem.
No, this is not a fair comparison. When I wrote the Lisp code I was exploring the problem space and prototyping. I'm much more experienced with Lisp than with golang. The golang version has the advantage that I know what I want to do and how to do it. In theory, I can just translate the Common Lisp code into golang. But then again, this is a “second system” which is not a prototype and has slightly larger scope and fuller requirements. So this cannot be a true head to head comparison.
The first point of comparison is macros (or lack thereof). I generally don't use a lot of macros in Common Lisp, but they come in handy when I do use them. One macro I wrote is called audit-step, which you can wrap around any expression; it prints a message before and after the expression is evaluated. The steps are numbered in sequence, and nested steps get nested numbers (like step 2.3.1). If you wrap the major function bodies in this macro, you get a nice trace of the call sequence in the log.
Golang doesn't have macros, but it has first-class functions. It's easy enough to write a function that takes a function as an argument and wraps it to output the trace messages. In fact, the macro version in Common Lisp just rewrites the form into such a function call. But the macro version hides a level of indentation and a lambda. In golang, my major functions all start with
func MajorFunction (args) int {
        return AuditStep("MajorFunction", "aux message", func() int {
                // body of MajorFunction
                // Actual code goes here.
        })    
}
The bodies of all my major functions are indented by 16 spaces, which is a little much.
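For reference, here is a minimal sketch of what such a wrapper could look like. This is not the actual AuditStep from the service; the numbering scheme is an assumption, and it is not safe for concurrent use.

import "fmt"

// One counter per nesting level; the path prints as a dotted
// step number like "2.3.1".  Not safe for concurrent use.
var stepCounters = []int{0}

// AuditStep logs a numbered message before and after running body
// and passes body's return value through.
func AuditStep[T any](name, message string, body func() T) T {
	stepCounters[len(stepCounters)-1]++
	label := fmt.Sprint(stepCounters[0])
	for _, c := range stepCounters[1:] {
		label = fmt.Sprintf("%s.%d", label, c)
	}
	fmt.Printf("step %s: begin %s (%s)\n", label, name, message)
	stepCounters = append(stepCounters, 0) // nested steps number under this one
	result := body()
	stepCounters = stepCounters[:len(stepCounters)-1]
	fmt.Printf("step %s: end %s\n", label, name)
	return result
}

With this shape, the call in MajorFunction above type-checks as written, with the return type inferred as int.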
I like higher-order functions. I can write one higher-order function and parameterize it with functions that handle the specific cases. In my auditing code, one such workhorse function is called collate. It takes a list of objects and creates a table that maps values to all the objects in the list that contain that value. To give an example, imagine you have a list of objects that all have a field called foo, and the foo field is a string. The collate function can return a table that maps each string to all the objects that have that string in the foo field.
collate is very general. It takes a list of objects and four keyword arguments. The :key argument is a function that extracts the value to collate on. The :test argument is a function that compares two keys (it defaults to eql if not specified). The :merger argument is a function to add the mapped object to its appropriate collection in the table (it defaults to adjoin). The :default argument specifies the initial value of a collection in the table (it defaults to nil).
The :merger function is the most interesting. It takes the current value of the table at a key and the object being added, and it returns the new value of the table at that key. The default merger is based on adjoin, which adds the object to the collection at the key if it is not already there. But you can specify a different merger function. For example, if you want to count the number of objects at each key, you can specify a merger that ignores the object and increments a counter.
The functional arguments to the collate function are often the results of other higher-order functions. For example, the :key argument is often the result of composing selector functions. The :merger argument is often the result of composing a binary merge function with a unary transformer function. The transformer function is often the result of composing a number of primitive selectors and transformers.
In Common Lisp, it is quite easy to write these higher-order functions. We can compose two unary functions with the compose2 function:
(defun compose2 (f g) (lambda (x) (funcall f (funcall g x))))
and then compose as many functions as we like by fold-left of compose2 starting with the identity function:
(defun compose (&rest fs) (fold-left #'compose2 #'identity fs))
We can compose a binary function with a unary function in three ways: we can pipe the output of the binary function into the unary function, or we can pipe the output of the unary function into one or the other of the inputs of the binary function.
(defun binary-compose-output (f g) (lambda (x y) (funcall f (funcall g x y))))

(defun binary-compose-left (f g) (lambda (x y) (funcall f (funcall g x) y)))

(defun binary-compose-right (f g) (lambda (x y) (funcall f x (funcall g y))))
The collate function can now assume that a lot of the work is done by the :key and :merger functions that are passed in. It simply builds a hash table and fills it:
(defun collate (items &key (key #'identity) (test #'eql) (merger (merge-adjoin #'eql)) (default nil))
  (let ((table (make-hash-table :test test)))
    (dolist (item items table)
      (let ((k (funcall key item)))
        (setf (gethash k table) (funcall merger (gethash k table default) item))))))
(defun merge-adjoin (test)
  (lambda (collection item)
    (adjoin item collection :test test)))
So suppose, for example, that we have a list of records. Each record is a three-element list whose third element is a struct that contains a string. We want a table mapping strings to the two-element lists you get when you strip out the struct. This is easily done with collate:
(collate records
         :key (compose #'get-string #'third)
         :test #'equal  ; or #'string= if you prefer
         :merger (binary-compose-right (merge-adjoin #'equal) #'butlast))
The audit code reads lists of records from the database and from GitHub and from CircleCI and uses collate to build hash tables we can use to quickly walk and validate the data.
Translating this into golang isn't quite so easy. Golang has first-class functions, true, but golang is a statically typed language. This causes two problems. First, the signature of a higher-order function includes the types of its arguments and return value. This means you cannot just slap on the lambda symbol; you have to annotate each argument and the return value, which is far more verbose. Second, higher-order functions map onto parameterized (generic) types. Generic type systems come with their own little constraint language so that the compiler can figure out which concrete types can correctly match the generic types. This makes higher-order functions fairly unwieldy.
Consider compose2. The functions f and g each have an input and an output type, but the output type of g is the input type of f, so only three types are involved:
func Compose2[T any, U any, V any](f func(U) V, g func(T) U) func(T) V {
	return func(x T) V {
		return f(g(x))
	}
}
If we want to compose three functions, we can write this:
func Compose3[T any, U any, V any, W any](f func(V) W, g func(U) V, h func(T) U) func(T) W {
	return func(x T) W {
		return f(g(h(x)))
	}
}
The generic type specifiers take up as much space as the code itself.
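On the bright side, type inference does the work at call sites, so uses of these composers stay short. A small sketch (the Twice helper is made up for illustration):

import "strconv"

func Twice(n int) int { return 2 * n }

// The compiler infers the type parameters from the arguments,
// so none need to be written at the call site.
var TwiceAsString = Compose2(strconv.Itoa, Twice) // func(int) string; TwiceAsString(21) == "42"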
I don't see a way to write an n-ary compose function. It would have to be dynamically parameterized by the intermediate types of all the functions it was composing.
For the collate function, we can write this:
func Collate[R any, K comparable, V any](
	list *Cons[R],
	keyfunc func(R) K,
	merger func(V, R) V,
	defaultValue V) map[K]V {
	answer := make(map[K]V)
	for list != nil {
		key := keyfunc(list.Car)
		probe, ok := answer[key]
		if !ok {
			probe = defaultValue
		}
		answer[key] = merger(probe, list.Car)
		list = list.Cdr
	}
	return answer
}
We have three types to parameterize over: the type of the list elements (i.e. the record type) R, the type of the key K, and the type of the value V. The key type needs to be constrained to be a valid key in a map, so we use the comparable constraint. Now that we have the types, we can annotate the arguments and return value. The list we are collating is a list of R elements. The key function takes an R and returns a K. The merger takes an existing value of type V and a record of type R and returns a new value of type V.
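For illustration, here is roughly what a call site might look like. The Record type and its fields are invented for this sketch, and the Cons definition is simply what the signature above implies; the second call uses the counting merger mentioned earlier.

// Assumed list type, consistent with the Collate signature above.
type Cons[T any] struct {
	Car T
	Cdr *Cons[T]
}

// A made-up record type, just for this sketch.
type Record struct {
	Repo   string
	Commit string
}

func Example(records *Cons[Record]) {
	// Group the records by repository name.
	byRepo := Collate(records,
		func(r Record) string { return r.Repo },
		func(acc []Record, r Record) []Record { return append(acc, r) },
		[]Record{})

	// Count the records per repository by swapping in a counting merger.
	counts := Collate(records,
		func(r Record) string { return r.Repo },
		func(n int, _ Record) int { return n + 1 },
		0)

	_ = byRepo
	_ = counts
}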
The magic of type inference means that I do not have to annotate all the variables in the body of the function, but the compiler cannot read my mind and infer the types of the arguments and return value. Golang forces you to think about the types of arguments and return values at every step of the way. Yes, one should be aware of what types are being passed around, but it is a burden to have to formally specify them at every step. I could write the Common Lisp code without worrying too much about types. Of course the types would have to be consistent at runtime, but I could write the code just by considering what was connected to what. In golang, the types are in your face at every function definition. You not only have to think about what is connected to what, you have to think about what sort of thing is passed through the connection.
I'm sure that many would argue that type safety is worth the trouble of annotation. I don't want to argue that it isn't. But the type system is cumbersome, awkward, and unwieldy, especially when you are trying to write higher-order functions.
It is taking me longer to write the golang version of the audit service than it did to write the Common Lisp version. There are several reasons. First, I am more experienced with Common Lisp than with golang, so the right Common Lisp idioms just come to mind, whereas I have to look up many of the golang idioms. Second, the golang code is trying to do more than the Common Lisp code did. But third, golang itself introduces more friction than Common Lisp. Programs have to do more than express the algorithm; they have to satisfy the type system.
There are more points of comparison between the two languages. When I get frustrated enough, I'll probably write another post.
6 comments:
Very nice concrete example of why static typing can be a pain if your type system and/or inference isn't powerful enough to handle all cases. The Interweb tells me variadic AND generic Go functions are simply ENOTSUP for now.
You're probably aware of this but this is quality reading on a lot of Go's issues, even from the perspective of static typing (which I'm generally strongly a fan of outside of Lisp): https://fasterthanli.me/articles/i-want-off-mr-golangs-wild-ride
For the first part I think you want to use a defer call to AuditStep() instead. Eg see: https://www.reddit.com/r/golang/comments/zdv7d7/simplify_defer_logging/
The most difficult thing when coming to a different language is to leave the other language behind. The kind of friction experienced here is common when transliterating ideas from one language to another. Go (in this case) is telling you it just doesn’t like to work like this.
I struggle with this, and I’ve seen others struggle too.
Try writing simple Go, instead of reaching for Lisp idioms. Then find the ways that work for Go to express the concepts you find.
This helped me coming from OO to writing Elixir, and again when starting Go.
Yeah I agree with this take. Go is good at, well, writing code to do what it needs to do.
As much as I appreciate Common Lisp, it’s powerful. Much more than Go. This isn’t a fair comparison because you would just write the code in Go, without the Common Lisp idioms.