Wednesday, January 8, 2020

If 6 was 9

Now if 6 turned out to be 9
I don't mind, I don't mind
Alright, if all the hippies cut off all their hair
I don't care, I don't care
Dig, 'cos I got my own world to live through
And I ain't gonna copy you
        -- Jimi Hendrix

Fortunately, integers are immutable and usually implemented that way (with a few notable exceptions, mostly on early computers), so there would be no harm in Jimi copying 6 around whenever he felt like it. It would always remain 6 and never turn out to unexpectedly be 9. After all, in a computer there are hundreds of thousands of copies of the integer 1 and they agree because they are all the same and are all immutable.

There is no need for “defensive copies” of immutable data, either. This can improve the space complexity of a program, sometimes dramatically. It also makes it much easier to encapsulate the representation of abstract data, so object-oriented code becomes easier to write correctly. Without mutators, there's half the code for each field in an object.
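
Here is a minimal sketch of that difference in Java (the class and field names are just for illustration, not from the post): a mutable field has to be copied on the way in and on the way out, while an immutable field can simply be shared, and there is no mutator to write at all.

    import java.util.Date;
    import java.time.Instant;

    // Mutable representation: java.util.Date can be modified by callers,
    // so both the constructor and the accessor must make defensive copies.
    final class MutableEvent {
        private final Date when;

        MutableEvent(Date when) {
            this.when = new Date(when.getTime());   // copy on the way in
        }

        Date when() {
            return new Date(when.getTime());         // copy on the way out
        }
    }

    // Immutable representation: java.time.Instant cannot change,
    // so the same reference can be handed out freely -- no copies, no setters.
    final class ImmutableEvent {
        private final Instant when;

        ImmutableEvent(Instant when) { this.when = when; }

        Instant when() { return when; }
    }

The mutable version pays for the copies and still has to trust every caller; the immutable one can hand the same object to anyone without risking its invariants.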

Cache coherency is trivial when your data is immutable: simply eject old data. No need for write-back, write-through, or cache snooping.
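
In software the same idea looks roughly like this (a sketch in Java, with made-up names): a cache of immutable values never has anything to write back, so eviction is the only decision left to make.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    // A read-through cache over immutable values. Because a cached value can
    // never become stale, any entry may be discarded at any time; the worst
    // case is recomputing it later. There is nothing to flush or synchronize.
    final class ImmutableCache<K, V> {
        private final Map<K, V> cache = new ConcurrentHashMap<>();
        private final Function<K, V> loader;

        ImmutableCache(Function<K, V> loader) { this.loader = loader; }

        V get(K key) {
            return cache.computeIfAbsent(key, loader);
        }

        void evict(K key) {
            cache.remove(key);   // safe: no write-back step needed
        }
    }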

“Immutable by default” still hasn't taken over the mainstream, but “optionally immutable” has gained a lot of popularity over the past decade, especially as people have realized the advantages of immutable data in distributed systems.

Jimi probably wasn't thinking along these lines when he wrote that song, though.





2 comments:

John Cowan said...

Very early Fortran compilers had a bug (but, contrary to rumor, it was a bug): if a constant was passed to a procedure and the procedure mutated the variable representing that argument, the constant itself would also be mutated, because arguments were passed by reference. This typically smashed the rest of your program, since identical constants were already coalesced. Modern compilers allocate a fresh location and pass a reference to that if the argument is not a simple variable.

The IBM 1620 used decimal bignums for all arithmetic and kept its addition and multiplication tables in memory rather than in the CPU. There was no memory protection, so if you munged that part of memory, you not only screwed up your own job but all following jobs, until someone figured out the problem and cold-loaded the computer. (In those days, shutting off the computer did not affect the contents of memory.)

Paul Steckler said...

On that subject: https://www.youtube.com/watch?v=kWhaxH03Xok