Monday, January 13, 2020

A Call for a Typology of Technology (in resolution of my read of Benjamin H. Bratton's The Terraforming)

I'm trying, once again, to read Benjamin H. Bratton. He makes it very hard. So hard that after reading The Stack the first time, I wrote a review on Goodreads that was as impenetrable as he is. My Goodreads reviews are just bookmarks for me, seldom being inviting enough for someone else to read, much less appreciate. My writing is often impenetrable even to me! I find it somewhat strange to be afforded the tool, though I should remind myself more often how ephemeral it is. I own nothing of myself there. Or here. Or anywhere, which may be part of Bratton's point.

Bratton is hard to read on purpose, in a way. It's almost as though he deliberately - by means of his personal agency - wants to hide the serious flaws in his reasoning. Of course I don't quite believe he's doing that deliberately. Meaning that he fools himself first. He actually believes what he writes. That, I believe, might form the crux of my problem with his stance.

The hard problems which must be resolved, but which can't be resolved:
  • Devising and implementing a structure for sovereignty that won't deploy power for devious, insider-privileging and ultimately planet-destroying ends. How can any technology for sovereignty be trusted?
  • Finding a lever by which to change culture (if the lever is intellectual, it has to be on a grade-three reading level). Bratton seems to believe that redesign of automation processes will entail changes to culture (without regard to whether those changes contribute to a positive or negative feedback loop) perhaps in the way that the earth-as-marble space shot (should have?) changed culture previously.
I follow, or at least I think I follow, most of his reasoning, and I have to say that he fills me with hope. I can begin - almost - to construct a trustworthy narrative for myself which doesn't have to include mass collapse on the way to steady state. I mean steady state for whatever becomes the highest generalization of what we mean by (grounded!) human.

My hope is grounded on his sturdy deconstructions of the meanings of "natural" and "artificial" and on his exposure of the ground for the necessary consequent decentering of humans. He either announces or proposes a new Copernican shift, yet again away from anthropocentrism. Hurray! (But the distinction between announcing and proposing is incredibly consequential.)

Copernicus discovered something by means of technology newly available to him. It will take me a few more re-reads to know if Bratton has discovered something or if he is making more of a declaration. If the latter, then I will only be able to retain his brilliant analysis and be required to jettison any, or most, of his conclusions.

There is some shifty sleight of hand when he fails to clarify that to decenter humanity, he must declare what he calls "agency" as some sort of eternal cosmic property, recently discovered as such. The dance around this is what makes reading him so difficult. 

I believe he does this by defining artifice as patterns which can't have occurred without agency. His examples include distinguishing an arrowhead from a rock, or most notably, distinguishing climate change from something that would have happened without us. It takes agency to know agency, and what we have to change - very deliberately change - is the destructive AI that we have loosed upon the earth by relinquishing our agency (over it?).

When the asteroid strikes, that can't involve agency by his definition. It would be pure accident on an even more massive scale than the catastrophe of climate change, which is precisely the emergent accident from the new field for accident which is the human overlay of AI. An awful lot of nifty accidents got us to this place by way of cosmic-scale evolution. It's just that we can't have been the end of that.

But now I am starting to understand Bratton, and before I completely retract my earlier review, I should explain it (somehow I was supposing that his insights would look as silly as phrenology - I must have been reading something about phenology at the time - when reviewed on the basis of some future understanding). Presuming that future understanding is better understanding. A technologically enhanced higher level of generalization/abstraction?

There's the rub for me. Maybe I'm too much of a Platonist to think that we aren't destroying and covering up even, or especially, as we "progress," though I prefer to think I'm too much infected by the entirely non-Platonic Chinese tradition (which used to be all about steady state and no progress). In that case, too much proliferation and complexity is degenerate and not progressive. Wait, that sounds Platonic! I think I'm trying to discover if Bratton is a closet Platonist. By which I think I mean a Westerner uncontaminated by a differing cosmology. It is important to distinguish what is an accident. It can't just be distinguished by desire, can it?

I'm sure that Bratton's ears perked up, or would have, when Nancy Pelosi declared "all of us" ready to die for our country last night. Really? Die for some principle maybe, but no longer for this country. I'm glad she's there for sure, but I think she just did declare the end of the nation-state as an entity worthy of self-identification. Enter Bratton's reconfigurations of sovereignty.

I'm not so good at any kind of -ology, for sure, which is probably why I never could succeed as a scholar. Every time I come across such words as phenomenology, teleology, epistemology, . . . I have to look them up. Not so much for their meaning as for their usage, which never sticks for me. Maybe it's slippery out there in the wild. Who knows?

So, as I try to grok Bratton's rather novel generalizations as he deconstructs terms like "nature" and "artifice," I am starting to realize that most of my complaints involve his (apparent, since I can hardly claim to have mastered his entire corpus - by which I don't mean having read everything he's written, more getting a grasp on his terminology, his -ology) . . . his generalizations about technology.

In a world where writing is now commonly called a technology, alcohol is called a drug, and therefore hallucinogenics are as dangerous as opioids (dangerous to which sovereign is the question), a typology for such a term as technology is certainly in order. It feels strange to me that it gets used as though there were some unitary meaning for it, no metaphors required. It *really* surprises me that Bratton seems to use it that way. (I can only say "seems" since I do believe that he believes that its meaning is as basic as the distinction between artifice and nature.)

If writing is a technology, then there must be distinctions between and among types of writing. Those would have to include distinctions between handwriting and typewriting, sound transcription, and Chinese writing with characters, just for some quick examples.

Of course Chinese writing also transcribes sound, and maybe the difference between sound transcription and alphabetic writing isn't so distinct. One can certainly get a computer to transcribe Chinese - more easily and accurately, actually, than happens for English. By the same token, however, handwriting Chinese engages different parts of the brain toward language mastery, as compared with keyboard entry, to a degree that isn't true for English. These distinctions can be important, for sure, at least because they might form the basis for exceptions to higher-level generalizations about the technology of writing.

We certainly did transform as a species in the transformation from orality to literacy, qua Ong, and much got destroyed. Our impact on earth certainly became more consequential. I wonder how we are transforming in the era of earth-encompassing new technologies. I wonder if Bratton is unreconstructed, and if I am.

So starting at the highest level of generalization that I can come up with off the top of my head - high enough, I hope, to avoid too many obvious exceptions - let me first list a catalog of distinctions:
  • digital is not the same as analog
  • mechanical technology is not the same as communications technology
  • technology for the purposes of discovery (microscope) is not the same as technology to accomplish some already established end (bulldozer)
A few things might be obvious from the construction of this short list: 
  • to draw distinctions among types of technology is not to deny that one type may be embodied by a different and distinct type. Thus any analog mechanism may be embodied to some arbitrary degree of precision by means of digital technology. It still won't be analog (see the sketch just after this list).
  • schematic representations may seem identical across distinctions
  • because of the above, which might refer to a kind of flow chart of a purpose for technology, the intention of the maker of any technology must be a part of the cataloging of distinctions
  • invention is distinct from discovery (we in the West are culturally disposed to credit inventors, where in China, for instance, the possibility for a "firearm" was discovered by way of bamboo-tube rocketry, perhaps)
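
As flagged in the first bullet above, here is a minimal sketch of that embodiment point - my own illustration in Python, nothing from Bratton - quantizing a sine wave at increasing bit depths drives the worst-case error toward zero, yet the result is always a finite ladder of discrete steps, never the continuous curve itself:

    import math

    # Quantize a value in [-1, 1] onto 2**bits evenly spaced digital levels.
    def quantize(value, bits):
        levels = 2 ** bits
        return round((value + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

    # The worst-case gap between the "analog" sine and its digital embodiment
    # shrinks with every added bit, but the copy stays discrete all the same.
    for bits in (2, 4, 8, 16):
        error = max(
            abs(math.sin(2 * math.pi * t / 100)
                - quantize(math.sin(2 * math.pi * t / 100), bits))
            for t in range(100)
        )
        print(f"{bits:2d} bits -> worst-case error {error:.6f}")
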
I would like to make note here of some interesting tangential matters:
  • Reading Henry Adams recently, I was struck by how often he mentioned the compass, gunpowder, and even paper as the things that had changed everything across the turn of the twentieth century. That may have been before China laid legitimate (for the West) claim to "discovery" of same.
  • Neurologists seem to have discovered that we make decisions before we are conscious of having decided. That seems related to a distinction in the brain between its function to catalog and organize impressions on the one hand, and to make predictions to guide actions on the other. Only the important stuff gets handed up to consciousness.
  • The construction of narratives is basic to mind's predictive function. They form the basis for some projection into the future by means of which we may guide our actions to maximize our likelihood for survival (or pleasure or pain-avoidance, whatever).
  • Once we have language, we participate in shared narrative.
  • Once we have writing, we can extend our narratives into a more distant and reliable past to assure better projections into some future (now a collective future).
  • Narrative must therefore be pre-cogitive, since our brains decide before we do consciously.
  • This may easily appear as precognition. Strange and meaningful coincidence may seem less so once and if we account for all the perceptions which never make it to consciousness in the first place. Preparation of the ground conditions revelation. Only stuff that is relevant to our narrative may appear.
  • Fate and the subconscious are the same thing, so far as we can know and tell.
  • New digital and especially communications technologies are as large a break with writing as were relativity and quantum physics with material determinism.
  • The direction of the change is backwards (our sovereign identifications shrink, and our identities fracture). Of course, backward/forward and up/down are not cardinal directions.
  • Humanity is in the process of becoming all over again.
This list of "tangentials" is meant to explain my first distinction between digital and other forms of technology. Digital can't do random, can't resolve Zeno's paradox, and can't extend the ground for its predictive computations beyond its internal decision tree(s). Gödel, Escher, Bach. Tralala.
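
To make the "can't do random" point concrete, a minimal sketch (again mine, in Python, not anything of Bratton's): two pseudo-random generators handed the same seed replay exactly the same sequence, because the output is fully determined by internal state.

    import random

    # Two generators seeded identically produce identical "random" streams.
    # The randomness is only apparent: state in, state out, all the way down.
    a = random.Random(42)
    b = random.Random(42)

    print([a.randint(0, 9) for _ in range(10)])
    print([b.randint(0, 9) for _ in range(10)])  # the very same list

Genuine randomness has to be imported from outside the decision tree - from the analog world, that is - which is rather the point.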

I am hereby making the claim for a stark distinction between an embodied brain and a disembodied difference engine. The brain is contiguous with a nebulously bounded external ground. It may be contiguous in ways analogous (metaphorically, please!) to holographs or ether-nets, rendering boundedness meaningless for mind. On/off logical computation is radically bounded from any ground beyond its fully describable structures. Signalling must be overt, materially representable, and non-emotive. Period.

Emotions are likely the most proximate cause for human "decisions." Impulse, attraction, revulsion.

Emotions are likely also the basis for the rationalizations we make when we take ownership of a choice that was not properly ours, in the sense of our conscious self; the one credited with "free will." (I will have to follow with a taxonomy for emotion.)

Some tech defines and/or defends the bounds of sovereignty. Obviously that would include military tech, but also housing, and communication that defines a group, where emotional bonds can extend a boundary of security.

While digital machinery may connect to and join with an external ground by means of inputs and outputs, in and out are and must be as clearly (and as starkly) defined as the distinctions between on and off.

Real and imaginary are distinct in the same way. I'm almost as much of a materialist as Bratton is. Not nearly so much of an idealist, though!

Furthermore, while the term technology has become a dog whistle for digital technology, that usage also buries digital technology beneath the rather foolish notion that a single word can describe everything we cover with that one term. We've lost track of our metaphors.

So, here's my grid:


A Typology of Technology

Type               Digital                    Mechanical               Communications
Artistic/cool      input/output only          Uncarved Block           mediated mesh
Intentional/hot    designed and iterated      driven and iterated      disembodied mesh
Sovereignty        institutionally evolved    fiat, skins and walls    group definitional
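
Purely as a sketch of my own bookkeeping (nothing from Bratton), the same grid held as a data structure, which at least makes the rows and columns - and the want of a third dimension - explicit:

    # The grid above as a nested mapping: type -> family -> character.
    typology = {
        "Artistic/cool": {
            "Digital": "input/output only",
            "Mechanical": "Uncarved Block",
            "Communications": "mediated mesh",
        },
        "Intentional/hot": {
            "Digital": "designed and iterated",
            "Mechanical": "driven and iterated",
            "Communications": "disembodied mesh",
        },
        "Sovereignty": {
            "Digital": "institutionally evolved",
            "Mechanical": "fiat, skins and walls",
            "Communications": "group definitional",
        },
    }

    print(typology["Sovereignty"]["Mechanical"])  # fiat, skins and walls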

Of course I have something of McLuhan's treatment of media in mind, to a certain extent. But he was only dealing with communication as that thing which does, perhaps, make us most human. He was more concerned with Chardin's noosphere than with Bratton's metaphorical stack. His "extensions of man" weren't quite so physical.

But again, the eyes are privileged, the ears second, as inputs to a disembodied brain. The embodied person deals with much more than what is treated by McLuhan and others as "media." A clue is provided in the usage of "Information Technology," which sometimes elides into or from "communications technology." As Bratton urges us to understand, all technology is embodied and energy consuming and has real impacts on the earth. Maybe signal fires and waving flags would have less embodiment, despite their tangible physicality. Are books the same when read on the web, or read to one by a computer voice, or when read without taking notes or making some sort of response?

Mechanical technology is perhaps only distinguished from digital in the matter of the location for/of "accident." All relations in a mechanism are tangible and explicit and can be perfectly represented by some schematic. When things break, the site and cause can be discovered. In the case of digital technology, at the point to which it has now evolved, most or much of the mechanism is obscured. As Bratton so brilliantly exposes, most of the embodiments of digital technology (where the carbon gets emitted) are utterly unavailable to ordinary consciousness, and almost never included on any schematic. But by now even the schematics are themselves computer generated (the templates for the chips, say) and the ordinary presumption that some expert can locate the break feels no different than a presumption in prior times that the high priest should know. That's the field for accident that Bratton locates. Automation as ecology.

For sure, I'm not too thrilled with this "grid" of mine. I'll have to be more three-dimensional or find an altogether different way to generate some useful typology. But I do think that the distinction between "artistic" and "intentional" is an important one. Fundamentally, that's a distinction between intentionality and agency on the one side, and something more interactive on the other. 

And if I'm right that emotions provide the inception for agency, I think it also opens a way to include money in the typology. Frankly, that's what I think Bratton is missing. It's not just that tech is happening and will happen and that we have to get on top of it. There is no manifest destiny to technology any more than there is a single direction for history; for progress. 

Bratton knows this at least implicitly by addressing the very history-changing project of terraforming. It is at least accepting of the notion that decisions have to be made and we can't just let the tech run amok. Technology has made us responsible. For technology, along with everything else. There is precisely no manifest destiny, and tech can't solve anything by itself.

I'm proposing that most of the issues that people - including me - might have about the proliferation of technology have more to do with money, which often stands in for more basic human emotion. Money sublimates most of our animal behaviors, and provides the impulse that had been provided by survival needs. 

Of course money is a communications technology. It's been an instance of AI since way before digital tech. But how should it be categorized? Perhaps we need to investigate the term "Artificial Intelligence" along the same lines that Bratton destabilizes Western assumptions about the difference between "natural" and "artificial." Can there ever be any higher tech than automation? I don't think there can.

What if there is no such thing as "intelligence" apart from biological humanity? What if it's less a matter of the logical, thinking, metaphorically computer-like brain and - as the term works in Chinese - intelligence *always* incorporates emotion in its definition? Most anything can be automated, probably including what we often refer to as thinking. But if intelligence *must* incorporate emotion, then it would be hard to outsource it to any kind of technology, no matter how evolved or sophisticated.

I still can't figure out if I'm fundamentally agreeing with Bratton or not. I wonder if I'll ever have the energy . . . (how much carbon do I burn by thinking so hard?) I think that Bratton is still a literate human; therefore a human who believes in intentional design and who is most emotionally engaged with his individual identity. He feels more like an inventor or discoverer than he does a participant in collective action. His belief structure feels vaguely fascistic, if there is a good and positive sense for that.

I think that emergent humanity will (once again?) be more emotionally engaged with some sovereign entity that is much larger than an individual. I'm not sure that automation can be ecology any other way.

The extreme first person shooter individualism on display at the moment is but the last gasp of what it once did mean to be human. Team sport politics are as dead as the flag. Computer gaming is an exercise in who gives a fuck. And yet the human spirit has never been so alive and well.
