Thinking about the "Embodiment" of Code
Thinking aloud about the "World of Code", which will include both "big ideas" and technical details that the non-technical reader can safely ignore.
So, in the interests of using text, and more specifically this Substack, as a way to sort out my thoughts, I’ve been doing some research and some thinking about the “embodiment” of the World of Code (which is, of course, mostly our modern computers and computer chips), and trying to think through some of the implications of its current embodiment.
This means doing something I’ve always intended to do but have never really gotten around to: namely, wrapping my brain around at least the key details of computer chips and the way that computer code controls and interacts with them. In my youth, that meant learning “machine language” or, perhaps more properly, “assembly code”, but at this point I’m not really interested in doing that, except as a way to think more clearly about how computer code works in its simplest and purest (or at least most “embodied”) form.
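For readers who have never seen it, here’s roughly the flavour of the thing - a hand-waved sketch of my own (the pseudo-assembly in the comments is illustrative, not the instruction set of any real chip), showing how one line of high-level code breaks down into the tiny physical steps the chip actually performs:

```ts
// Two values sitting in memory...
const price = 100;
const tax = 8;

// ...and one line of high-level code:
const total = price + tax;
console.log(total); // 108

// A compiler lowers that one line into a handful of machine instructions,
// something vaguely like this (pseudo-assembly, not any real chip's syntax):
//
//   LOAD  r1, price   ; copy "price" from memory into register 1
//   LOAD  r2, tax     ; copy "tax" from memory into register 2
//   ADD   r1, r2      ; the ALU adds the two registers
//   STORE total, r1   ; copy the result back out to memory
```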
One of my favourite web pages on this subject has been, for many years now, the introductory page of the online book Eloquent JavaScript. However, it was fun to run across, today, an early resource on this subject: the Usborne Introduction to Machine Code for Beginners. It’s a little more approachable than the book I would probably have picked up back when I was a kid, if I had actually been serious about learning to program in assembly language!
But of course my intent now is not to learn assembly language at all. Rather, what I’m interested in is a general understanding of how modern computer chips actually work “under the hood”, since that is what determines how computer code works, which in turn has shaped the modern world in which we live, The World of Code. And what I really want is to distill this down into something both simple and accurate enough for anyone to understand.
So far in my research, I have not found anything that boils down, nice and simply, what I’m looking for, so, unless I can find something that’s already “out there” (surely someone must have done something like this before!), I’m stuck looking into the details and trying to distill them myself.
Here is (mostly for my own reference) a collection of the Wikipedia articles related to my subject that I’ve found most helpful so far:
Logic gate (“Logic gates can be cascaded in the same way that Boolean functions can be composed, allowing the construction of a physical model of all of Boolean logic, and therefore, all of the algorithms and mathematics that can be described with Boolean logic. Logic circuits include such devices as multiplexers, registers, arithmetic logic units (ALUs), and computer memory, all the way up through complete microprocessors, which may contain more than 100 million logic gates.”) (I’ve sketched out what this “cascading” looks like in code, just below this list.)
Digital electronics (“Digital electronic circuits are usually made from large assemblies of logic gates, often packaged in integrated circuits. Complex devices may have simple electronic representations of Boolean logic functions.”)
Random-access machine (The StackExchange article that led me to this one summarizes its importance quite nicely: “Modern computers are not based on the Turing machine model. Turing machines are very slow, and don't represent the capabilities of hardware. From the software side, a modern computer is similar to the RAM machine, which allows indirect addressing and has an unlimited "alphabet" (its registers hold arbitrarily large integers), though actual machines have limited registers, and this sometimes makes a big difference (for example, when doing arithmetic on large numbers). I don't know of a good model for the hardware side; Boolean circuits, popular in theoretical computer science, model neither memory nor iterative computation.”)
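As promised above, here’s what that “cascading” actually looks like when you spell it out. This is a minimal sketch of my own in TypeScript (the names and the half-adder example are my illustration, not anything taken from the Wikipedia article), but the logic is faithful: start from a single NAND gate, compose every other basic gate from it, and then compose those into the beginnings of arithmetic.

```ts
// Everything below is built from a single primitive gate: NAND.
// (NAND is "functionally complete": any Boolean function can be
// composed from it - this is the "cascading" the article describes.)
type Bit = 0 | 1;

const NAND = (a: Bit, b: Bit): Bit => (a && b ? 0 : 1);

// Cascade NANDs to recover the familiar gates...
const NOT = (a: Bit): Bit => NAND(a, a);
const AND = (a: Bit, b: Bit): Bit => NOT(NAND(a, b));
const OR  = (a: Bit, b: Bit): Bit => NAND(NOT(a), NOT(b));
const XOR = (a: Bit, b: Bit): Bit => AND(OR(a, b), NAND(a, b));

// ...and cascade those into arithmetic: a half adder computes a one-bit
// sum and a carry, the first small step on the way up to a full ALU.
const halfAdder = (a: Bit, b: Bit) => ({ sum: XOR(a, b), carry: AND(a, b) });

console.log(halfAdder(1, 1)); // { sum: 0, carry: 1 } - i.e. 1 + 1 = binary 10
```

A real microprocessor is doing exactly this, just in silicon rather than software, and (per the article) a hundred million gates deep.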
Other pages that have been somewhat helpful have been:
Boolean algebra (particularly the bits on Operations and Digital logic gates)
Boolean circuit (mostly for the cool example diagram on the page!)
and Brainf*ck (Sorry, that’s what it’s called! I’ve edited the name here for a bit of politeness…), a deliberately, ridiculously minimalist but fully functional (“Turing complete”) programming language with only eight commands (there’s a tiny interpreter sketch for it just below this list).
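And because it has only eight commands, Brainf*ck is also just about the easiest programming language in the world to implement, which nicely illustrates how little machinery “Turing complete” actually requires. Here’s the tiny interpreter sketch promised above - my own, purely illustrative (a serious implementation would at least check that its brackets match):

```ts
// A minimal Brainf*ck interpreter: eight commands, one memory tape.
function bf(program: string, input = ""): string {
  const tape = new Uint8Array(30000); // the memory tape; all cells start at 0
  let ptr = 0, pc = 0, inPos = 0, out = "";
  while (pc < program.length) {
    switch (program[pc]) {
      case ">": ptr++; break;       // move the data pointer right
      case "<": ptr--; break;       // move the data pointer left
      case "+": tape[ptr]++; break; // increment the current cell
      case "-": tape[ptr]--; break; // decrement the current cell
      case ".": out += String.fromCharCode(tape[ptr]); break;      // output one byte
      case ",": tape[ptr] = input.charCodeAt(inPos++) || 0; break; // read one byte
      case "[": // if the current cell is 0, jump forward past the matching "]"
        if (tape[ptr] === 0) {
          for (let depth = 1; depth > 0; ) {
            pc++;
            if (program[pc] === "[") depth++;
            if (program[pc] === "]") depth--;
          }
        }
        break;
      case "]": // if the current cell is non-zero, jump back to the matching "["
        if (tape[ptr] !== 0) {
          for (let depth = 1; depth > 0; ) {
            pc--;
            if (program[pc] === "]") depth++;
            if (program[pc] === "[") depth--;
          }
        }
        break;
    }
    pc++;
  }
  return out;
}

// Eight times eight, plus one, is 65 - the character code for "A".
console.log(bf("++++++++[>++++++++<-]>+.")); // prints "A"
```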
So, all that definitely gets me “deep in the weeds”… Where, ultimately, does it lead me and how do I abstract/simplify my way out of this without it turning into a sort of “Turing tarpit” like the last-cited programming language?
Well, let’s see… Thanks to modern computer systems, we now live in a “digital” age rather than an “analog” one. We digitize for convenience: Boolean logic can describe such a wide variety of mathematical and algorithmic problems, and can be so neatly and conveniently “embodied” as electronic circuits in the form of cascading logic gates, that its applications - controlling physical or virtual machinery via computer code - are virtually infinite.
Infinite, but not all-encompassing. There are different degrees of infinity, after all. The digital can only ever approximate the analog - and computer systems, being physically embodied and logically analytical, necessarily lack a spiritual component that we would understand as essential to life.
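For the technically inclined, here’s a toy illustration of that approximation (the three-bit resolution and the sine curve are my own arbitrary choices, purely for the sake of example): “digitize” a smooth curve down to three bits and only eight distinct values survive; everything in between the steps is simply lost.

```ts
// Quantize a smooth "analog" value down to 3 bits: only 8 levels survive.
const BITS = 3;
const LEVELS = 2 ** BITS;

// Snap a value in [0, 1] to the nearest of the eight representable steps.
function quantize(x: number): number {
  return Math.round(x * (LEVELS - 1)) / (LEVELS - 1);
}

// Sample a smooth curve and compare it with its digital shadow.
for (let t = 0; t <= 1.0001; t += 0.2) {
  const analog = Math.sin((Math.PI / 2) * t);
  console.log(`t=${t.toFixed(1)}  analog=${analog.toFixed(4)}  digital=${quantize(analog).toFixed(4)}`);
}
```

(More bits shrink the error, of course, but the principle stands: the digital version is always a staircase pretending to be a curve.)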
As a result of all this, ambiguity is something that computer systems have always struggled with, and that they are only now beginning to “comprehend”, in a very approximate way, by ingesting and analyzing the untold billions of psychic connections embodied in the patterns of human language - which is itself a limited abstraction of human thought.
So, why the obsession with this World of Code? Why are we continually expanding it and building out its architecture, despite its limitations, which we are continually wrestling with and trying to overcome? (Quantum computing being perhaps a good example of an attempt to develop an alternative architecture that will at least have different limitations.) The answer can probably be summed up in three words: convenience, comprehensibility, and applicability.
We human beings are always striving to control and understand our world. The World of Code, originally invented largely for codebreaking and the development of the atomic bomb, gives us a powerful, very widely applicable, and convenient tool to help us do so, although some ambiguity is necessarily lost in the process.
And this analog ambiguity is also somehow essential to what makes us human, to our lived, real, spiritual experience of the world around us.
It is of interest to me that I renewed my love of computers at seminary. Building functional computers out of old ones that people no longer needed was not only an economical way to obtain computers as a poor student - the exercise was also a discrete, limited, solvable problem, which proved mentally valuable to me as I simultaneously wrestled with the infinite and the incomprehensible, represented both in theology and in the pastoral problems embodied in practical anthropology.
It is also of interest to me that a recent study (2020) revealed that an aptitude for language was a better predictor of whether someone would be good at computer programming than an aptitude for mathematics. As I wrote in a recent article composed for the weekly parent bulletin at the school I work for:
When you think about it, this makes sense on a number of different levels. To be a good communicator, you need to be good at logic, which is essential for thinking through (and thinking up) the algorithms that are at the heart of computer programming. And, of course, computer languages are a stylized subset of human language, most notably with as much ambiguity removed as possible - which again fits well with those who like to play with language: in order to remove ambiguity (multiple meanings) from language, you need to be able to identify that ambiguity in the first place!
This resolves for me the apparent contradiction of being someone who is not only "into" computer programming, but also someone who loves to write poetry. One of the many ways that poets use language is to play with multiple meanings (and reflect on them) - not to mention rhythm and rhyme and pretty much every other aspect of language imaginable.
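Here’s a trivial example of the kind of ambiguity I mean (my own, not from the bulletin article): in English, “five plus three times two” can honestly be read two different ways, while a programming language forces you to notice the ambiguity and resolve it explicitly.

```ts
// "Five plus three times two" - English leaves this genuinely ambiguous.
console.log(5 + 3 * 2);   // 11: "five, plus (three times two)"
console.log((5 + 3) * 2); // 16: "(five plus three), times two"
```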
So, I think I’ll end these reflections with an as-yet unpublished poem from my Japan Journals, which seems somehow relevant here:
Pandora opened up her box:
a thousand horrid hurts rushed out;
one grace remained—and that was Hope—
and one more, neither good nor bad,
came out, but neither came nor went,
but hovered, lit upon the lid,
but when the man came with a shout
this last one quickly flew and hid
inside Pandora, and its name
today is known as Paradox.
Oh, and because I’ve been thinking about using this Substack to refine my thinking, and because thinking is essentially a collaborative practice, I’ve decided it makes no sense to restrict the chat for this Substack to my mythical paid subscribers, so, as of today, I’m opening it up to all subscribers. Should you feel so inclined, please let me know what you think of my thinking! I’d be more than happy to hear any thoughts on any of this that you might be willing to share.