Thursday, July 30, 2009

Confessions of a Math Idiot

I wasn't always a math idiot. I took the ‘advanced’ math classes in grade school. While I was in high school I took Calculus II over at the community college (the professor was a Harvard grad with a Ph.D. in theoretical physics from Cornell). I took the AP Calc test and my math SAT score was nothing to sneeze at. I went as far as differential equations and linear algebra in college.

But then I found out about computers. On the first day of 6.001 the professor wrote this on the board:

                f(x + dx) - f(x)
f'(x) =  lim  ------------------
        dx->0        dx

(define dx 0.0001)

(define (deriv f)
  (define (f-prime x)
    (/ (- (f (+ x dx)) (f x))
       dx))
  f-prime)

(define (cube x) (* x x x))

(define g (deriv cube))

(g 4)
;Value: 48.00120000993502

It was the beginning of the end.

I started to unlearn math. I began seeing math through the eyes of a computer scientist. Things that I believed in stopped making sense. I used to believe that integration and differentiation were opposites: that one ‘undid’ what the other ‘did’. But look:

(define (deriv f) ...)

(define (integrate f low hi) ...)

The arity of deriv is 1 and the arity of integrate is 3. They can't be opposites!

I used to think you couldn't divide by zero. But look:

(/ 2.3 0.0)
;Value: #[+inf]
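The other sign cases fall out of the same IEEE floating-point rules. A quick sketch at the REPL (the ;Value comments follow MIT Scheme's printer from memory, so treat the printed forms as illustrative):

```scheme
;; Flonum division by flonum zero never signals an error:
;; a nonzero numerator yields a signed infinity, and 0.0/0.0
;; yields NaN (which, notoriously, is not even = to itself).
(/ 2.3 0.0)                    ;Value: #[+inf]
(/ -2.3 0.0)                   ;Value: #[-inf]
(/ 0.0 0.0)                    ; NaN
(= (/ 0.0 0.0) (/ 0.0 0.0))    ;Value: #f
```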

I started seeing ambiguities in even the simplest math. Is ‘x + 3’ an expression, a function, or a number? Is ‘f(x)’ a function or the value of a function at x? When I see log (f(x)) = dH(u)/du + y, are they stating an identity, do they want a solution for y, or are they defining H?

I started reading weird things like Gosper's Acceleration of Series, Denotational Semantics by Joseph Stoy, and Henry Baker's Archive. These looked like math, but they seemed to make more sense to me.

But I drifted further away from mathematics. A few years ago a fellow computer scientist explained the wedge product to me. It is a straightforward abstraction of multiplication, with the one exception that you are not permitted to ‘combine like terms’ after you multiply (so you end up with a lot of irreducible terms). Then he explained the Hodge-* operator as the thing that permits you to combine the like terms and get answers. It's really not very complicated.
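That informal description can be sketched directly in code. Here is a toy version (my own encoding, not anyone's library): a term is a list (coeff i1 i2 ...) standing for coeff · e_i1 ∧ e_i2 ∧ ..., wedging concatenates the index lists, a repeated index kills the term (x ∧ x = 0), sorting the indices flips the sign once per transposition, and, true to the description above, like terms are deliberately left uncombined; the Hodge-* step that would combine them is left out.

```scheme
;; Insert index x into an already-sorted list of indices.
;; Returns (sign . new-list), where sign is (-1)^k for the k indices
;; x had to jump over, or #f if x duplicates an existing index.
(define (insert-index x sorted)
  (cond ((null? sorted) (cons 1 (list x)))
        ((= x (car sorted)) #f)              ; e_x ^ e_x = 0
        ((< x (car sorted))                  ; x jumps over all of sorted
         (cons (if (even? (length sorted)) 1 -1)
               (cons x sorted)))
        (else (let ((r (insert-index x (cdr sorted))))
                (and r (cons (car r) (cons (car sorted) (cdr r))))))))

;; Sort a list of indices, accumulating the sign of the permutation.
(define (sort-indices ixs)
  (let loop ((ixs ixs) (sign 1) (sorted '()))
    (if (null? ixs)
        (cons sign sorted)
        (let ((r (insert-index (car ixs) sorted)))
          (and r (loop (cdr ixs) (* sign (car r)) (cdr r)))))))

;; Wedge two terms: multiply coefficients, concatenate indices.
(define (wedge-terms t1 t2)
  (let ((r (sort-indices (append (cdr t1) (cdr t2)))))
    (if r
        (list (cons (* (car t1) (car t2) (car r)) (cdr r)))
        '())))                               ; the term vanished

;; Wedge two multivectors (lists of terms).  Like terms are NOT
;; combined -- the result just accumulates irreducible terms.
(define (wedge a b)
  (apply append
         (map (lambda (t1)
                (apply append
                       (map (lambda (t2) (wedge-terms t1 t2)) b)))
              a)))

(wedge '((2 1)) '((3 2)))   ; 2e1 ^ 3e2        => ((6 1 2))
(wedge '((1 2)) '((1 1)))   ; e2 ^ e1 = -e1^e2 => ((-1 1 2))
(wedge '((1 1)) '((1 1)))   ; e1 ^ e1 = 0      => ()
```

Wedging (e1 + e2) with itself, for instance, gives ((1 1 2) (-1 1 2)): the two terms that a human would cancel sit there uncombined, which is exactly the "lot of irreducible terms" my friend was talking about.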

But then I looked it up online.
The exterior algebra ∧(V) over a vector space V is defined as the quotient algebra of the tensor algebra by the two-sided ideal I generated by all elements of the form x⊗x such that x ∈ V.
And I discovered that I'm now a math idiot.

I'm probably going to have to live like this the rest of my life. I'm trying to cope by relearning what I used to know but translating it to a style I'm more comfortable with. It will be a long process, but I'm optimistic.


  1. no shame in community college classes, even if your prof isn't ivy league.

  2. I wouldn't lose any sleep over it. I kinda recognised your dy/dx bit from doing calculus 30 years ago, and the digital angle was familiar to me from DSP stuff maybe 15 years ago.

    But I still need to work out most of the times table effectively from scratch (e.g. 7x12 = 7x10 + 7x2).

    My only math these days is coding (and got caught out by some simple trig the other week trying to rotate a turtle :)

  3. You are not alone.

    -- A 6-3 compatriot

  4. The exterior algebra ∧(V) over a vector space V is defined as the quotient algebra of the tensor algebra by the two-sided ideal I generated by all elements of the form x⊗x such that x ∈ V.

    This is the language of modern/abstract/universal algebra & category theory. Some books:

  5. I agree that oftentimes, math notation makes little sense. For example, I finished my first calculus class recently, and there's something that didn't typecheck in my brain.

    Say you have a function: y = f(x) = x^2 + 3. Now you want the derivative of that function. The textbook indicated that there were many notations for the derivative (why so many notations, anyway?)

    You could write f'(x), which I had no problem with; dy/dx which I found a little weird, but I could buy it. What I didn't buy was that they said that df/dx was valid and I was screaming at the book "f is a function from real to real and y is a real, this doesn't type check!"

    Of course, don't mention that to math people, cause they'll say you're being too rigorous (and when you aren't they yell at you too)

  6. Vincent,

    df/dx should be acceptable. People should know what is meant by that. The reason for the differing notation is that calculus existed for a long time before it was polished and cleaned up, so there is a lot of legacy notation. This is exacerbated because physicists have their own notational quirks with regard to calculus. It's unfortunate.

  7. It feels nice to find that I'm not alone!!

  8. This comment has been removed by the author.

  9. My experience is quite the opposite of your own. I started college in 1988, and from day one we used the modern notation. We did differential equations in the second year, using the formal definition of the differential and perfectly unambiguous notation. We saw the traditional notation later, in the physics courses. But I studied in Italy, not in the US; in my experience the European university is much more formal than the American.

    Anyway, you are talking about differential geometry here, and it is not strange that most people are ignorant of it. It would be absurd to teach differential geometry to everybody. I studied it in the fourth year, in the course on General Relativity (I am a physicist), but it was an elective course; most people graduated without having studied differential geometry at all.

  10. The thing about the Hodge star turning wedge products into simple multiplications seems interesting. Could you shed some more light on this? I'm trying to multiply some numbers in "quaternion space", but the number of basis combinations (and coefficients) seems to explode in "memory space". Maybe the Hodge * can help me out here?

  11. (define (integrate-from f from)
        (define (integrated-from to)
          (integrate f from to))
        integrated-from)

    Produces inverses of differentiation. I forget the exact requirements for the integral being the inverse (in the given sense). The requirement from looking at a textbook (well, lecture notes) doesn't seem very strict: (= (integrate (deriv f) a b) (- (f b) (f a))) when f is continuously differentiable. Mathematicians often mean things -in-the-sense-that.
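The claim in that comment checks out numerically. A sketch, using the post's deriv and a crude midpoint Riemann sum as a stand-in for the undefined integrate (the step counts and tolerances are my own choices):

```scheme
(define dx 0.0001)

(define (deriv f)
  (define (f-prime x)
    (/ (- (f (+ x dx)) (f x)) dx))
  f-prime)

;; Crude midpoint Riemann sum over n slices; a stand-in for the
;; integrate the comment assumes.
(define (integrate f low hi)
  (let* ((n 10000)
         (h (/ (- hi low) n)))
    (let loop ((i 0) (sum 0.0))
      (if (= i n)
          (* sum h)
          (loop (+ i 1)
                (+ sum (f (+ low (* (+ i 0.5) h)))))))))

;; Fixing the lower limit turns the 3-argument integrate into a
;; 1-argument function, answering the arity objection in the post.
(define (integrate-from f from)
  (lambda (to) (integrate f from to)))

(define (cube x) (* x x x))

;; deriv and (integrate-from f 0.0) approximately invert each other:
((deriv (integrate-from cube 0.0)) 2.0)   ; close to (cube 2.0) = 8
((integrate-from (deriv cube) 0.0) 2.0)   ; close to (cube 2.0) = 8
```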

    To link the web a little:

  12. to Vincent Foley:

    dy/dx was the notation used by Leibniz

    f'(x) was the notation introduced by Lagrange (Newton wrote a dot over the variable instead)

    df/dx: well, if you replace the dependent variable y with f(x), then it's just the same thing as dy/dx... it's just a substitution...

    Best regards.