Tuesday, August 4, 2020

Hard AI

This will be difficult to write. I turned 65 yesterday, and my daughters, having my number, gave me a bottle of bourbon (the kind I can never afford, Maker's Mark, which is oh so good) and a really nice cigar between them. Of course I thanked them by making them both my medical proxies. Mostly it will be difficult because I will be punching so high above my weight.

Like most of us, I've been wrenching my mind around trying to get some little sense of how people can actually support Trump. I read an article recently (where oh where was it, dear cognition) written by a daughter describing her difficulties with her Trumpster Dad. At one point her Dad confessed that his daughter was likely right about it all, but that he, the Dad, could never think it all through that way, and the Trumpster truisms just came easier to him. Hey, I found the article!

Then this morning, in my aged-out fog, there was this mention of a new AI engine. That's not what the experts call it. They call it a language model, and I guess it's a huge step forward toward the concoction of a machine-built general intelligence. Or so some seem to think. Its trick is to create elaborate tracts which are difficult to distinguish (sometimes) from human-written language. It starts with scant cues and then constructs some narrative, perhaps based on the writings available on the Internet by whomever it imitates.
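For the curious, here is a minimal sketch of that "scant cues in, narrative out" trick, using the freely available GPT-2 model through the Hugging Face transformers library. The model, prompt, and settings here are only illustrative, a smaller cousin of whatever engine made the news, not the thing itself.

    # A minimal sketch: hand a language model a scant cue and let it spin a narrative.
    # Uses the open GPT-2 model via Hugging Face's transformers library; the prompt
    # and settings are illustrative only.
    from transformers import pipeline, set_seed

    set_seed(42)  # make the sampled continuation repeatable
    generator = pipeline("text-generation", model="gpt2")

    prompt = "I turned 65 yesterday, and my daughters"
    result = generator(prompt, max_length=60, num_return_sequences=1)

    print(result[0]["generated_text"])  # the model's confabulated continuation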

Of course, I have maintained for a long time that we humans have been turning ourselves into machines. (I always have a hard time linking to my own writings, but it's in there, up here.) I've also speculated (whether out loud or in writing, I can't remember just now) that the Trumpsters just like to hear what they feel they already understand, but which they feel they get in trouble for saying out loud in the wrong company.

It has always been built into the American model of government that we could get (and often have gotten) a narcissistic buffoon for president. The trouble is that this time the one in office doesn't even feel the need to play the part of president. More likely, he feels the need not to, which was P.T. Barnum's trick as well. The role has been that debased.

In my assessment, it's the advent first of mass media (let's say, for this argument, that it started with TV and Kennedy/Nixon), now accelerated by the video-enhanced Internet, which has pushed our model to its logical end. We're almost guaranteed to get a Trump in there eventually, or perhaps even all the time now.

First up among the philosophers called on to comment on this new language model for AI is David Chalmers, who is often referenced as the top consciousness dude out there. In my recent reading, I have come across a theory of consciousness which is nearly diametrically opposed to Chalmers' approach. Riccardo Manzotti calls his model the Spread Mind.

The spread mind can't be modelled by a machine, and the brain is not the seat of consciousness. Manzotti starts by saying (shouting, really) that there are no images in the mind. The images are in the world, and our experiences are related back to us as memories by a kind of neurological delay tactic. I've checked my read with the author himself, and I get him, apparently, mostly right.

I think I come off too cranky and home-schooled to engage him beyond brief correspondence. This is, after all, his life's work, and there are many more important thinkers close to his field.

What I would like to urge him to do is to include emotion in his model. He and Chalmers both seem to consider emotion as a secondary quality of consciousness. Manzotti notes that he hardly mentions emotion, which is indeed true, and that he doesn't know much about it except that it must be "out there" among the perceptual reality of his spread mind.

Like all scientists (by definition, I would say), he can't admit subjectivity into his reality, even though Manzotti makes a valiant effort. Indeed, part of the burden of his theory is to dissolve the subject-object distinction. It's not just that our mind is built on our perceptions, but that those percepts constitute our mind. The supposed images in our mind are actual things out there in the world about us, and we could no more upload our mind into a machine (no matter how complex the machine) than we could give our twin sibling our exact experience of the world, though in the twin's case the differences may be subtle to precisely that extent.

So, the trouble for Trumpsters is that they see those of us on the elite side of the language game as truly believing that we have come closer to some truth than we could possibly have gotten. We are, in other words, hoist by our own petard. By and large we think that belief in God is for silly losers, that gay sex is just as normal as hetero, and that we can rise above racism by thinking hard about it. Implicitly, we believe that by following our model for cognition we can eventually arrive at the one correct answer to all conundra.

As a channeler of the common mind (no deep reading or even viewing required), Trump affirms the sense of reality of the great unwashed masses (though even they are growing alarmed at the chaos unleashed once we got hit by the pandemic). The answers have grown too erratic even for the die-hards, I'm happy to observe. But the unsolved problems remain unsolved.

In part, it's easy to see the role of media here. No matter how much we try to distinguish ourselves - be authentic - by our Instagram posts, we're mostly in-formed by the same stuff blasted now all over the globe. Nearly everything that could be is captured digitally, moving or still, sound or just fury. The important stuff inevitably becomes both invisible and secondary. The felt and touched love, for instance.

I'm happy to observe that Manzotti agrees with me that cognition is too slow for survival purposes, and that it's the emotive centers of our mind (not of our brain, although I find it useful to locate them there in the brain stem, which is already present in reptiles) which make the important decisions. Those happen to be the very centers to which media panders, left, right and center. Everyone clamors for our attention, and sometimes it might as well be the machine doing the writing, since we really can't tell the difference.

Trump knows, of course, that any mention props him up, and he especially cranks up the so-called Main Stream Media (the "fake news"), which can't resist how easy it is to get clicks and reads by mentioning yet another Trump atrocity. In simple terms, the man must be important for how often he gets tagged (meaning how often his tag gets used).

So yes, I'm saying that all of us are artificially intelligent, no matter how much or what quality of stuff we read or otherwise digest. We are artificially intelligent to the extent that we divide ourselves from our actual mind by internalizing complex texts in place of actual experience. Of course I'm including film as text in this usage. Metaphor abounds.

Manzotti seems to think that our emotions are out there among our perceptions, while I am urging that our emotions are out there in the same literal sense that the things we perceive are always out there and never in our brains. Emotion, in other words, is a part of reality and not apart from it. It is as real as percepts, but it is composed instead of conceptual relations among objects, of which we are one ourselves.

Our one-ness is, of course, just a colony of aliens, most of which don't even share our DNA. In our quest for insta-authenticity, we are questing after the impossible - that we should be distinguished from every other of our acquaintance. We are, instead, composed of all those others, and perhaps especially those we can't know because they speak some alien tongue or have a different culture, perhaps encoded by dress or skin color.

In this sense American exceptionalism can only be ironic. If we are exceptional, that is precisely because we have admitted so many nationalities within our borders. And this all - our NOW - becomes our adolescent identity crisis.

Of course the pandemic divides us each from the other. Doh!

The flaw is not with Trump. The flaw is with how we define what is human. We have been mistaken for a very, very long time. It's not our general intelligence which distinguishes us as human. It is our caring. Read closely, and you will be able to tell which is the machine writing. It's hardly ever the stuff that's mostly wrong in the particulars.

