Monday, September 15, 2025

Yudkowsky Bayesian Priors

I did once watch Yellow Submarine, stoned. I was watching for a reference to an artist who was nearly family. Thank goodness I never became a stoner. I haven't watched it since, maybe because it left no big impression.

Improbably, I'm now a Bills fan. I host a watch party every single game. I've always hated football, and now hate it even more for the money and the exclusion of it. Something like 90% of the seats in the new stadium will be reserved for seat-license holders. We used to walk to the game from church on Sundays, back when Jack Kemp used to favor us. The cost of a ticket was negligible. I later held the same job that Cookie Gilchrist did - delivering beer kegs - back when being even a famous player wasn't a full-time, life-sustaining job. We both deserved and got the same workout.

I'm also a sailor, always on some boat that I substantially rebuilt. I've never become very skilled. I'm no longer bold, but I once was, and I sometimes consider the odds of that day when I was the only boat on the water, sailing alone as usual, and I picked up a fellow way, way out among the steep Lake Erie rollers. I'd heard his screaming, picturing some unwise family outboard capsized. It sounded like a bunch of kids.

It was hard to tell the noise from background howling, but it kept returning. I tied off the tiller and climbed up onto the winch alongside the mast and saw nothing. The second time, I saw the desperate body waving arms and screaming. My engine was going bad and wouldn't start, so I executed what ended up being a near perfect pickup. One hand on mast, foot on tiller, boat heeled far over, I passed close above him and pulled him aboard over the low side in one swift action, his grasp matching mine.

What were his odds?

I took one look at the profile of the Bayesian, the mega-yacht that foundered in a freak storm, basically by being in the precise wrong or right spot, and I knew it couldn't be stable. The unschooled man had become a billionaire by clarifying the odds on the way to LLM AI. His yacht was a finger flipped at something.

Even though anywhere you go you hear Bills fans arguing in almost political frenzy about what went wrong and what went right, always laced with foul expletives, the Bills are what cuts through politics and brings us all together in Buffalo. That's why I watch them. I want to be a part of us.

What were their odds?

Just before the second miraculous comeback the other day - I watched both on TV - the calculated odds for the Bills to win were something close to nil.

I am far from certain, but I think the difference between a Bayesian and a plain old statistician shows in how each might place a bet when a coin has come up heads fifty times in a row. Canonically, the odds always remain 50/50. The coin has been certified fair by previous flipping.

Since there's no magic allowed, the standard answer is that the next toss is still 50/50. The Bayesian updates the priors against the evidence and says nope, I'm betting the farm on heads.
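The two bets can be made concrete with a minimal sketch. Assume the Bayesian starts from a uniform Beta(1, 1) prior on the coin's heads-probability; after seeing h heads and t tails, the posterior is Beta(1 + h, 1 + t), and the predictive probability that the next toss lands heads is its mean. The function name here is my own invention, not anyone's published method.

```python
def predictive_heads(heads: int, tails: int,
                     alpha: float = 1.0, beta: float = 1.0) -> float:
    """Posterior-predictive P(next toss is heads) under a Beta(alpha, beta) prior.

    The posterior after the observed flips is Beta(alpha + heads, beta + tails),
    and the chance the next flip is heads is the posterior mean.
    """
    return (alpha + heads) / (alpha + beta + heads + tails)

# The plain statistician, trusting the certification, holds at 0.5.
# The Bayesian, after fifty straight heads, bets the farm:
print(predictive_heads(50, 0))  # 51/52, roughly 0.98
```

With no flips observed, the same formula gives back 0.5, so the two bettors only part ways once the evidence piles up.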

I ran into Eliezer Yudkowsky, virtually, back when he was, to me, some kind of acolyte of Ray Kurzweil. I'd read Kurzweil's Singularity book and was repulsed. Google embraced him, as well they might. His claim to some sort of fame was voice-to-text or maybe text-to-voice software. It was so much fun to watch my nieces and nephews talk to the computer that Dad got from AARP and IBM. They were rolling on the floor from the sketchy conversions.

Yudkowsky and Kurzweil both believed and, as far as I know, still believe, that there will be a moment in the very near future when "intelligence," sometimes so called, will crystallize across the cosmos in a virtual instant, displacing all the random stuff. Displacing life. Yudkowsky once looked forward to the disembodied immortality he fully expected. Does he still?

What were the odds against his now becoming horrified by AI? Does he somehow value life? If so, why is he betting against it?

The unexamined suppositions about what intelligence is (and isn't) are already far along the way to destroying life, the universe, and everything, even though we already know that the answer is 42, right?

I am sorely pressed to put God in where the Bills win. Where the arrogant ship founders. I resist, for some reason. Probably because the religionists have totalized God and no longer use him for good. He's become a battering ram for one's "side".

The probability of humans in the cosmos is surely lower than the probability of the Bills' recent comeback, or the Bayesian mega-yacht's foundering, or that Jet Ski rider being rescued (his buddy dumped him to jump the waves alone, not having the nautical experience to realize that once lost in the waves, a swimmer is lost forever without a spotter).

AI LLMs are working off human language. Our natural languages are being totalized. Trump is the natural response. Figure it out. Yudkowsky thinks that ship will float. He's properly scared of it. 

I went to Yale to become an engineer. Improbably, I'd been accepted at both CalTech and MIT. I doubt I'd have made it into Yale without the engineering slant. I ended up with a degree in Chinese lit, and even started a PhD in classical Chinese lit.

What are the odds? Well, I'm odd. 

I didn't last long in grad school, having been lured away by a wooden boat, or the dream of one, based on a rotting hulk.

I had been misplaced as a freshman into a rather advanced English lit class. I had no clue. I remember later, sitting in a circle in some high-class English lit seminar where each of us would read a line from a poem. I was shut down quickly, probably sounding like how a computer would read a poem.

But I also remember learning to read the classical Chinese poets, where my absent priors from tony prep schools hadn't already set my course for idiocy. There's no way, no how, that an LLM AI can "decode" those poems.

Or could they? 

My own professor, with whom I became fairly close, was famous for having memorized the entire corpus of Tang Dynasty significant poets, the Complete Tang Poems. He chastened me once for correcting his reading of a single character in a poem our class was assigned. The odds against me were that high. It wouldn't happen again, he said. 

Humanity is imperfect. My professor is imperfect. I am certainly imperfect. But I do have a heart, which is something our President also has, though in many sizes too small. 

My only talent in life has been to repair things which should long ago have fallen apart. Boats, cars, houses, schools, that kind of thing. I don't always succeed. I lack the hubris for invention. Well, honestly, I don't even believe there is such a thing. Invention is purloined credit for all that came before. Being first should almost never be credited as being best.

Now I'm certain that it would be extremely useful to have some AI extract whatever I might want from the Complete Tang Poems. It might even get me commendations from my professor. But one must first know how to read. At least a few poems must smack you straight to the heart. You must be able to refer to at least a few referents and progenitors in and by your own mind alone. And you must have lived in an actual body, which is the seat of one's emotional self, spread out far beyond our brain.

I would love for this to be the end of what I have to say. I'm certainly not making much progress against the powers of late-stage wealth-promotion, which run the planet now. Wealth begets wealth in almost the same way that AI begets AI in humans.

I share Yudkowsky's fierce calling out of the dangers of our current working assumptions. I don't share his cosmic address. As I recall, he was saddened by the death of his brother, and wanted to banish that possibility. In my amateurish observation, grieving and sorrow occur in inverse relation to connection while alive. Many Tang poets would agree with me on that. 

My lovely daughters sometimes joke about how many times I've almost bit it. They're not wrong. Or as Yudkowsky might say, they are less wrong. Less wrong than I was.

I don't believe that God can or shall be banished from the cosmos, whatever we might perform against our Earth. I have shrunk from most of my more youthful boldness. No more motorcycles for me. No more entering the storm. I'm tired and frail and sore all the time. I simply lack the energy. I'm done with work for someone else.

I'm still betting that humanity will awaken. We can and will let go of the silly notion that some of us are better. That some of us deserve better. Just imagine the battle of the AIs which means the battle of the self-promoters, which means the human storm arising all about us. 

It is all about us, isn't it? Will you fall in love with an AI-enhanced lover? Will you be excited by a mega-yacht and a younger lay, a bigger house, servants? Someone who always knows what to say? If you enhance your essay out into the world for the sake of your own self-promotion, you have already enhanced the priors against your very soul.

Good luck with that. 

We all live in a yellow submarine.
