I need to post more frequently, and if I wait for my thoughts to become coherent it may be quite some time.
I wanted to say something about abstraction, so I was thinking about some of the fundamental abstractions computer scientists use. What's really happening (for some sense of ‘real’) in the computer is some very complex electromagnetic field interactions. But no computer hacker I know sits down at the terminal and starts programming from Maxwell's equations. There are better ways to think about what a computer does.
Maybe the most basic abstraction is what I'll call primitive ontological abstraction. We'll take a fairly stable pattern of currents and we'll call that a ‘number’ or a ‘byte’ or some such thing. A circuit that can maintain the pattern we'll call a ‘register’, so we can talk about ‘register EAX’ holding ‘the value 27’ or some such thing. Here we're abstracting away from patterns of current flowing through the computer to simple objects. I call it primitive ontological abstraction because we're not dealing with ‘Big-O Object-Oriented’ ideas, but rather with the notion that there are ‘things’ in the computer like bits, bytes, characters, text strings, and code chunks that are the basic ‘stuff’ we're going to compute with.
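To make this concrete, here's a minimal sketch in C (the particular byte pattern is my own invention, and I'm assuming a little-endian machine) showing one fixed bit pattern treated as a ‘number’, as ‘bytes’, and as a ‘character’, depending entirely on how we choose to look at it:

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void) {
        /* One fairly stable pattern of 32 bits: this is all the
           hardware actually holds. */
        uint8_t bits[4] = { 0x1b, 0x00, 0x00, 0x00 };

        /* Viewed as a 'number': on a little-endian machine this
           pattern is the value 27. */
        uint32_t as_number;
        memcpy(&as_number, bits, sizeof as_number);
        printf("as a number: %u\n", (unsigned)as_number);

        /* Viewed as four separate 'bytes'. */
        for (int i = 0; i < 4; i++)
            printf("byte %d: 0x%02x\n", i, bits[i]);

        /* Viewed as a 'character': 27 happens to be the ASCII
           escape character. */
        printf("as a character code: %d (ASCII ESC)\n", (int)bits[0]);
        return 0;
    }

The pattern of currents never changes; only the ontology we impose on it does.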
Without primitive ontological abstraction, our interaction with the computer is going to be very limited. We could fiddle around with the electrical signals going into the processor and maybe cause the computer to halt or generate a non-maskable interrupt, but it's hard to program the computer when you have no notion of ‘instructions’. But primitive ontological abstraction only lifts us up to the machine-code level, where we are putting different byte values in sequential memory and then telling the computer to load the ‘program counter’ with a particular start address. At this point we need the next abstraction: naming.
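Before moving on, here's a rough feel for programming at that bare machine-code level. This is my own illustration, assuming an x86-64 Linux machine; many modern systems refuse to hand out memory that is both writable and executable, so treat it as a sketch rather than portable code. It puts raw byte values into sequential memory and then points execution at the start address:

    #define _DEFAULT_SOURCE   /* for MAP_ANONYMOUS */
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        /* Raw x86-64 machine code: mov eax, 27 ; ret */
        unsigned char program[] = { 0xb8, 0x1b, 0x00, 0x00, 0x00, 0xc3 };

        /* Ask the OS for a page we're allowed to write and execute. */
        void *mem = mmap(NULL, sizeof program,
                         PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (mem == MAP_FAILED) return 1;

        /* Put the byte values into sequential memory... */
        memcpy(mem, program, sizeof program);

        /* ...and tell the computer to start executing at that address.
           (The cast from data pointer to function pointer is a common
           POSIX idiom, not strictly portable C.) */
        int (*entry)(void) = (int (*)(void))mem;
        printf("the machine says: %d\n", entry());

        munmap(mem, sizeof program);
        return 0;
    }

Notice that nothing here has a name: the ‘program’ is just six byte values, and running it means loading an address. That's the gap naming fills.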
Ok, my bus ride is over. We'll leave naming to the next post.
The other advantage of "primitive ontological abstraction" is that hardware designers are free to implement the abstraction in many ways. The EAX register has been implemented hundreds of times, and will be implemented many more times in the future.
But that's probably not where you're going in this essay...
This post is actually dancing around a blog topic I've had in my head for some time.
To me, the notion of building abstraction is essentially what programmers do. And, as you noted, the lowest level (registers, bytes, reading from devices, writing to devices, etc.) of computing is still just that - an abstraction someone *invented*.
But it even goes deeper than that.
You can't even talk about the notion of magnetic fields or electrons without reverting to what are essentially abstractions. For example, I picture electrons as little dots whizzing around a core of protons. That's as useful an abstraction as any to allow me to reason about how electrons behave.
The truly exciting thing about being a programmer is that we get to invent this stuff all the time. We get to build up our own little universes, all without getting our hands the slightest bit dirty.
Until you work fairly close to hardware design (as I do at Intel) and learn how close we are to the analog edge, where our abstractions stop holding. Acceptable soft-error rates really scare me, but they are a fact of life in modern micros. So, at some point we all may have to get our hands dirty. Not that that detracts from where you appear to go in the next parts. (And yes, I worked backward to get here.)