Gods! Robots! Aliens! Zombies!

Epistemic Status: Endorsed
Content Warning: Neuropsychological Infohazard, Evocation Infohazard
Part of the Series: Truth
Previous Post: Time Binders

Rationality From AI to Zombies is a sprawling six-volume compilation of two years' worth of daily(!) rationality blogging by autodidactic scholar Eliezer Yudkowsky. My apologies to Yudkowsky if this comes off as vaguely negative about him. As I was doing research, there were a lot of places where I found myself surprised by things he did. I tried to get an interview with him to see if my beliefs about his beliefs were actually correct interpretations of his worldview, but he never got back to me. If he does and his answers update me towards a more charitable view of his views, then I can always come back and rewrite this essay later.

So once more let’s start with the man. Like Korzybski, Yudkowsky was raised by well-educated and relatively well-to-do parents, and like Korzybski, he’s mostly self-taught in the areas that interest him. Something that happened between Korzybski’s time and Yudkowsky’s, however, was an explosion in the popularity of science fiction as a genre. His parents seemed to like science fiction and introduced him to it, along with what he calls traditional rationality, at a young age. There are actually a lot of entertaining parallels between Korzybski and Yudkowsky, although I think Korzybski wins the contest of who is the most extra. (Unless Eliezer took up sword fighting, fought in a war, and never told anyone.)

While his parents were Modern Orthodox Jews, Yudkowsky himself was raised more within American culture than within traditional Jewish culture, to the point where, in one interview, he recalls realizing that Judaism was never really his childhood religion; space travel was.

This is interesting to remark upon because I think it describes a lot of people raised in religious households. Unless your parents deeply insulate you, chances are you’re going to spend more time exposed to modern secular American society than to the traditional religious culture of your family’s past. Church/Synagogue only comes once a week, but Star Trek is on TV every day. There’s only one Torah/Bible, but there are lots of science fiction books.

Yudkowsky’s parents were also both employed in science-oriented technical fields. His father was a physicist and his mother was a psychiatrist, which is where he got his introduction to what he calls traditional rationality. He specifically namedrops early skeptic Martin Gardner, debunker of the supernatural James Randi, physicist Richard Feynman, and Korzybski disciple S.I. Hayakawa.

However, Eliezer also comes off as rather dismissive of traditional rationality, and aside from those few namedrops (particularly Feynman, whom he namedrops a few times despite not being, in my opinion, a very central example of traditional rationality) he spends very little time talking about where he came from in either his writing or his interviews, much to their detriment I think. His treatment of General Semantics is especially egregious in this regard: he never refers to it by name, never mentions Korzybski, and in the entire volume of his writing, he references Hayakawa twice.

With the way Yudkowsky talks about science and his ideas, the way he references catgirls and other literary and anime tropes, and the way he fails to identify where he got the ideas he didn’t invent, it wouldn’t be a surprise if someone thought the primary sources of these ideas were Yudkowsky and TV Tropes. Without any discussion of the intellectual traditions that led him to where he is, thinking that he came up with ideas like The Map and Territory would be an easy mistake to make.

Yudkowsky takes from traditional rationality things like empiricism (the virtue of performing your own experiments), falsifiability (making predictions that can be proven wrong), and warrant (the importance of justifying your beliefs). He then applies probability theory and decision theory to traditional rationality in the service of creating the juggernaut colloquially referred to within the rationalist community as The Sequences.

Rationality From AI to Zombies shouldn’t really be thought of as a book. Literally speaking it’s a book, of course; you can buy physical copies with pages made of paper you can leaf through. However, it is really still just a collection of essays. It lacks much in the way of a central thesis and mostly just throws rationality-flavored spaghetti at the wall and hopes some of it sticks. As of the time of this writing, I’ve done four read-throughs of his material, and with each pass my view of it has become somewhat less favorable.

The biggest weakness of the sequences is that they don’t do a very good job of staying organized into sections, focused on a theme, or sorted by topic. One day Eliezer will talk about identifying truth, then the next he’ll talk about that time he totally pwned a religious person in an argument by bringing up Aumann’s Agreement Theorem, then the next he’ll talk about cognitive biases. I was hoping that when he compiled all of his writing into Rationality From AI to Zombies he would produce a more coherent and focused work, one with a central thesis, but Rationality From AI to Zombies is just the sequences arranged into book form with some of the weaker essays removed or rearranged.

Yudkowsky does bring some things to the table, in particular incorporating rigor, probability, and quantifiability into what was previously a mostly qualitative field. This is important for a number of reasons, but maybe not the ones you might expect. There are definitely important things to take from the sequences, but a lot of them are concepts he either doesn’t name or only obliquely references without pointing at directly.

Reading the sequences as they’re written probably won’t lead you to the place he wants; getting there requires a lot of reading between the lines. Without that, they’ll lead you somewhere vaguely adjacent but with critical pieces missing, malformed, or underdeveloped. This is a rather big problem with the sequences because Eliezer clearly intends them as a step-by-step instruction manual for finding the holy grail of good thinking. But as Scott Alexander says:

The thing about grail quests is – if you make a wrong turn two blocks away from your house, you end up at the corner store feeling mildly embarrassed. If you do almost everything right and then miss the very last turn, you end up being eaten by the legendary Black Beast of Aaargh whose ichorous stomach acid erodes your very soul into gibbering fragments.

There are lots of people who manage to take useful things from the sequences, but there are also lots of people left stranded on the side of the road in really weird epistemological traps they thought themselves into and then couldn’t find their way out of. Over the following essays we’ll be attempting to sketch out the most important and underspecified parts of the sequences and hopefully build some ladders out of those potholes. Along the way we’ll crash through most of the important concepts and principles, and hopefully you’ll come away with a somewhat more grounded understanding of what this is all about.

Part of the Series: Truth
Next Post: Occam’s Guillotine
Previous Post: Time Binders
