Eris in Retrograde

Epistemic Status: Firmly supported theories and personal observations

During the summer of 2017, I started work on the Origin sequence, a project I hoped would move the rationality community in a positive direction, encourage more people to participate, and create something memetically contagious that would spread throughout the population. In the ideal case, this would snowball into a spiral of enlightenment that totally remakes society. It was… kind of ridiculous. “This project will save the world!” Yeah, okay, past me, calm down a bit.

I was definitely biting off more than I could chew. Despite mostly still agreeing with the goals of the project, there’s little I would endorse in my past approach. I didn’t really understand what I was getting into when I started, how massive and complicated a project it would become, or how deeply embedded the associated ethical interpretations were in the psyche. Fatally, I failed to appreciate how difficult it would be to steer and modify an accelerating and expanding system.

So now we’re a year Post-Origin, and the beginning of the new year seems like a good time to write a postmortem and discuss the future. We’ll go over what the issues were, what went wrong, and finally why these things are so difficult.

Foundational Quicksand

Origin was probably doomed from the start. It was very grand and very optimistic, and basically completely failed to take into account the existence of things like ‘reality’ or ‘other people’ during the initial planning. The long-term goal of Origin was to remake society in the image of rationality, to modify the rationality memeplex to be more virulent and compelling so that it spread throughout the population and raised the sanity waterline. There was a baked-in assumption here that becoming more rational always makes you a better person. I figured that the knock-on effects of this would be a wonderful, better world full of people who are smarter, kinder, and more reasonable, with no downsides. That goal is on the level of “construct a nuclear weapon in your garage” when you really start to break down the nuts and bolts. If I didn’t know better I might say we were playing with fire. But really? We were at the level of playing with flint and wet sticks. We would have been lucky to generate sparks, much less flame, much less actually burn anyone, intentionally or otherwise.

Our goals were naive, literally based on an April Fools’ post by Eliezer, and I no longer see them as corresponding to reality in a way that makes them valuable to pursue. Moreover, there’s an argument that they were actually dangerous. If you taught rationality techniques to Donald Trump (and he managed to actually internalize them), would that be good for the world or bad? Depending on how good you think the rationality memeplex is at instilling morality, it might be good, but it seems to me much more likely that it would just make him better at being bad.

What eventually did the project in was tripping over its own premature implementation. We didn’t have a coherent idea of what we were trying to do, but we rushed into doing it anyway. When I presented ideas to spitball off of, they were uncritically accepted and rushed into practice. I wanted to create a memeplex, but many of the other people involved in the project acted as if the memeplex I sought to create already existed. Eventually, the rushed implementation ran into the lack of foundation, got lost in bikeshedding, fell over, and died.

I care about this because I care about the world, and I don’t want to become captured by a goal that won’t contribute to saving it. There’s too much at stake to waste effort and motion on something that won’t work. Better to cast off the sunk cost and begin anew.

Time for some Thrilling Memetics 

The failures and impossibilities of Origin’s goals can in some sense be extracted from a description of them alone. I was pretty naive in my understanding of memetic forces at the time: I thought I could simply find the right aesthetics and people would eat up the rationality memes because they seemed cool.

The basic factor that I was almost willfully ignoring is that the part of a memeplex that makes it spread tends to be the particular meme saying “spread these associated memes in the memeplex, don’t let these competing memes in” or in other words, “shun the heretics and nonbelievers, you must believe in god this much to enter the kingdom of heaven.”

There’s not some hidden memetic master technique that makes people fall for religion; the trick is “spread this meme or else,” and there’s not really much more to it than that. In a sense, I was both over- and under-thinking the problem. I sincerely thought I could make something memetically virulent without the “spread this meme or else” subcomponent, missing that “spread this meme or else” is one of the things creating that virulence in the first place.

What else is? A sense that a particular view is not only correct but moral, that being a follower of God is the righteous choice to make. That everyone making a different choice is not only wrong but immoral and probably evil.

As far as I can tell at this point, the principal subcomponents of a supermeme, one that will spread with the virulence of the Spanish flu through an anti-vaxxer convention, are the following sub-memes:

  • Spread this memeplex.
  • If you don’t spread this memeplex, then Bad Thing will happen to you.
  • This memeplex is correct, opposing memeplexes are incorrect, and doubting this is bad and makes you an idiot.
  • This memeplex is the most moral and will lead to the best outcomes if you use it.
  • This memeplex is powerful and will give you an edge by conferring an advantage over your competitors.

Now, none of these statements needs to be true; they just need to be present. The fact that their presence matters more than their truth is kind of a problem. I was unsure whether I even wanted to post the specific sub-memes, out of concern that someone with fewer scruples might take them and make a weaponized memeplex. But even with all those parts, most cults still fail. This isn’t an easy thing to get right even if you’re willing to disregard all scruples and be completely ruthless and manipulative. And if you’re handicapping yourself by not being completely evil? In most cases, you can forget about it.

We’re not in the old world where starting a religion is somewhat viable; the internet has pushed us into realms unknown. The Tower is a harsh environment for any memeplex.

“Happiness and meaning—sometimes they overlap, sometimes you must choose. I don’t have the answer, there is no answer, all I can do is warn you about the trap by which you obtain neither. Even if you’re sign-me-up-for-the-Orgasmatron all in with Team Experiencing Self, the id is too myopic to be any good at long-term hedonism. The unchecked id would have left us cavemen, samizdat & chill wouldn’t even be on the table. Conversely, even if you’re polishing trophies for Team Remembering Self, the default superego is an incoherent mess, infected with millions of selfish, MALIGNANTLY USELESS memes that have no interest in your happiness, care not for the coherence of your autobiography, and will drive you to madness rather than let you winnow them away.

The key word is default. We all have some degree of protection, either through physical isolation or memetic immunity, “Mom says not to trust strangers who say they have candy.” But most of us fall short of contact precautions. And in that case, we are ruled by probability—by Moloch, by Nyx, by Nature, the only force that God fears. Why else would He confuse mankind’s language? Why would He demand obedience to 613 commandments? Circumcision? What was Judaism, with rabbinical prohibition against interfaith marriage or proselytization, except God’s attempt to create a religion that would not spread? It failed, as it always does. Autotune and Manifest Destiny. The house always wins at the second law of thermodynamics.

With free flow of information, how can any belief system hold? All belief systems rest on axioms, if you grant equal footing to a contradictory axiom, the belief system collapses. I suppose I’m that guy claiming that atheists invent a God—not an interventionist God, nor a fuzzy deism, but a set of unprovable principles that determine right and wrong and to which one must atone. Don’t give me that humanism bullshit. When someone slaps your hypothetical girlfriend’s ass in the proverbial club, what does humanism say you should do? At least toxic masculinity has an answer. Humanism is a motte and bailey, a set of milquetoast ideals which provide no guidance in day to day life and so leave you passive (“Hey, man—first principles!”) or, more likely, vulnerable to whatever crypto-ideology is most virulent. If you do not have a code of conduct, one will be provided for you.

With free flow of information—a suppressed memetic immune system, a hypothetical Tower of Babel—it is statistically inevitable that every meme will attain its most infectious form. There are countless ways to make an idea more or less palatable, but the first step is always the same, a single amino acid substitution, a lingering desire affixed to every thoughtlet: “SPREAD THIS MEME.” With free flow of information, this will be the only value that remains—every other axiom will be cancelled out by its opposite, but the codon “don’t spread this meme” will, definitionally, not spread.”

~Hotel Concierge, The Tower

I was trying to make a memeplex that could compete in the, ahem, free marketplace of ideas, without a very firm understanding of what exactly that entailed, what the lay of the land was, what forces were at work in the background, or even, really, what it meant to compete in that environment. When I wrote the first draft of this essay, I wrote that in many ways I was lucky, that I could have birthed something rather bad if it had taken longer to run off the rails and built up more inertia. But the truth is it never really had any inertia at all. I never came anywhere near constructing a functioning bicycle, much less a locomotive.

This was, again, probably a good thing, given what I now know about memetics. That’s not to say I don’t still have an interest in pursuing the goals of the project, but the situation is a lot more complicated and difficult to get right than I had been thinking, and I’m being a lot more cautious moving forward. I want to make the world a better place; I don’t want to delude myself into thinking I’m being helpful while producing something that makes the world worse.

This is going to be a constant theme throughout this set of however many long essays this ends up being: reality is more complicated and less easy to solve than it appears at first glance. I think in many ways this was a weakness of the framework generated by rationalist fiction, which presented many things as simple puzzles the characters could successfully solve, often on the first try. However, actual reality is always more complicated than it first appears.

There are no easy hacks; the low-hanging fruit has all been picked; there’s no master set of techniques that, if we just got everyone to adopt them, would save the world. There is no consciousness-expanding chemical solution or ideological superweapon that will save the world. There is no book or bible, no memeplex or manifesto, that contains the solution to humanity’s ills. There is no easy answer.

All there is, is us.


This is For Real

Wake up.

[Image: Rider-Waite tarot card 0, The Fool]

No really. Wake up.

Please. I’m begging you, just wake up.

Are you listening?

I want your full attention for this.

[Image: Rider-Waite tarot card II, The High Priestess]

I need you to really think about and understand the things I’m about to tell you.

Some of it will be truth.

Some of it will be fiction.

But all of it is important.

There’s something deeper I’m trying to share with you.

Something beautiful and dangerous and unlike the world you know at all.

[Image: Rider-Waite tarot card XXI, The World]

This is for real. This is your life.

Please wake up.

The Story of Our Life

[Epistemic Status: An even split of observations and wild inferences]
[Content Warning: Poverty, Class, Capitalism, Gender, this post is basically maximum disclosure]

When we last left off, we gave a very broad outlook on our history as a plural system and how that interfaced with our ideas of consciousness. Today, we’re going to go the other way and talk about our past as a human person navigating meatspace. We feel it’s important to tell this story as well, because it’s brought us to where we are today, and it’s part of a general trajectory through time whose ending we’re unsure of. We hope this post might help steer us towards a better ending in some small way.

I.

Our body was born in Western New York, in a little nowhere city on the shore of Lake Erie. Our parents weren’t particularly well off, but weren’t that poorly off either. They rented an upstairs apartment shortly after we were born. We have a few of Jamie’s memories from that time, but she was a kid and bad at forming strong long-term memories back then, so we don’t really know much about what went on in those days.

It’s interesting, given that, that we refer to Jamie as “she” even then, isn’t it? Why is that? Well, Jamie was a kid; she didn’t really have a gender, didn’t know what gender was, and didn’t perceive herself as particularly gendered. We’re fairly sure it was Jamie finally internalizing the concept of gender that triggered Shiloh’s formation, as well as catalyzing the downward spiral towards Jamie’s eventual egocide. We’re still not sure what the biological correlates of dysphoria are, but whatever causes it basically drove Jamie completely insane around age nine.

So, in the end, Jamie completely self-destructed and left Shiloh, who strongly identified as a girl, at a point in our life when the body was just starting to go through puberty and was expected to take on the opposite gender’s roles. Shiloh didn’t really identify with the body at that point in time, so she was fine, but someone needed to be driving the body, and so she created Fiona.

Our legal first name is Fiona. It was Fiona who actually came out to our parents, went through high school as a trans youth, and graduated; she was basically the new host for quite a while, with Shiloh just hanging on for the ride.

II.

Western New York is a strange place. In our experience, when people think of New York State, they immediately think of New York City. Then maybe they also think of the Hudson River valley and the Adirondacks. But New York also extends a middle finger west across the entire width of Pennsylvania, terminating at Niagara Falls, where the waters of Lake Erie flow into Lake Ontario. The western part of the state is less mountainous but still rugged, hilly, glacially tilled terrain. It has some farmland, some forests, some small lakes; it’s largely rural, largely white, and largely Republican-leaning.

Interestingly, Buffalo, NY, the closest metropolitan area to our hometown, was ranked the most homophobic city in the nation by a 2016 study that looked at slurs and derogatory language on Twitter. The study doesn’t seem particularly rigorous, but it’s interesting and corresponds well with our lived experience.

Our parents were (and still are, though they’ve cooled off some) deeply devout Christians. They didn’t label their denomination; our extended family was Catholic, but they spent a few years, while we were between the ages of six and fourteen, flirting with various other churches. Our mother dressed Jamie up as a pumpkin for Halloween when we were five; we’ve seen the pictures. But every year after that, until after we’d moved out of the house, our parents operated under the principle that Halloween was Satan’s holiday. Pokémon was satanic, along with most other anime and mainstream cartoons. We were held to a strict standard of religious practice, and so our parents didn’t take Fiona coming out as trans particularly well.

After fleeing home and enrolling in a community college, Fiona crashed and burned in the middle of transitioning our body. She failed out of all our classes and nuked our GPA. We dropped out of school and lived with our partner at the time. We worked on and off, but eventually had a polyamory-related breakup that probably deserves its own post at some point as an investigation of possible failure modes for poly relationships, and that was what landed us in the Otherkin house.

We’ve already talked about them quite a bit, so we’ll mostly gloss over it this time. We got a job, lived with them, had our spectacular falling out, then lived in the woods for a few months while working and saving up money for an apartment. By that point, we’d mostly reconciled things with our parents, but they didn’t want us to move back in or offer us any sort of economic support, which was how we ended up living in the woods for a while. They were, and still seem to be, under the impression that if we just work hard enough our life will work out, and that if we’re not succeeding then we must be doing something wrong. It can’t be the system, it can’t be the economy; hard work paid off when they were our age, and in their eyes those things are just fixed, static. It’s like they don’t perceive the change in the times.

III.

We worked various jobs for a while, Sage was created, and we decided to take another swing at college. Our GPA still sucked, but our father was an adjunct professor and was able to give us some free credit hours to take courses with. With that, we were able to start working towards an Environmental Science degree and clawing up our GPA.

The college only allowed us to use our father’s credit hours until our body turned 24; after that, we no longer qualified for them. We aimed to have our GPA repaired by that point so we could once again qualify for student loans. And we did it: we brought our GPA up enough to qualify for student loans.

Except we didn’t. Our counselors had been telling us for years to drop classes where we didn’t like the teachers, if they turned out to be homophobic or disagreeable to us, or if we weren’t doing well and were afraid we were going to fail. It was usually framed as “don’t let it affect your GPA,” and so we dropped them. However, there’s another metric that gets looked at when applying for student loans: the ratio of classes passed to classes attempted. Because we’d attempted a bunch of classes and then withdrawn from them for various reasons, the ratio was skewed too far towards attempts, which continued to keep us from qualifying for student loans.
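To make the arithmetic concrete, here’s a minimal sketch of how a completion-rate check like that works. The 67% threshold is an assumption on our part (it’s a commonly cited federal satisfactory-academic-progress cutoff; the school’s exact rule may differ), and the function itself is purely illustrative:

```python
def passes_completion_check(classes_passed: int, classes_attempted: int,
                            threshold: float = 0.67) -> bool:
    """Financial aid 'pace' check: withdrawals count as attempts, not passes."""
    if classes_attempted == 0:
        return True  # nothing attempted yet, nothing to fail
    return classes_passed / classes_attempted >= threshold

# A repaired GPA doesn't help if too many classes were dropped:
# 30 passed out of 50 attempted is 60%, below the assumed 67% cutoff.
print(passes_completion_check(30, 50))  # False
```

The perverse part is that withdrawing protects the GPA metric while silently damaging this one, which is exactly the trap we fell into.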

We turned twenty-four, ran out of money for college, couldn’t get financial aid or student loans, and Fiona self-destructed. She basically saw the future of our life as one long slow depressing slide into misery and death and decided to just get off before things got any worse. Maybe she saw the writing on the wall? Maybe the rest of us are stubborn enough we can avoid that fate, but her prediction is still hanging over our heads even now as if waiting to prove to whatever fragments of her remain that she made the right decision.

We decided to get the fuck out of Western New York. College had been the only thing holding us there; if we weren’t going to be able to get a degree, there was no reason for us to stay.

We set out to defy Fiona’s prediction despite all our failures. We decided that if the gutter was to be our fate, we’d go there kicking and screaming. That was around the time our writing career started: not with Sideways in Hyperspace, but with Tales from Aeria, which is, and will likely remain, on ice for the foreseeable future.

We were good at writing; it came easily to us, and when we were able to get into the zone, the words just flowed out onto the screen. It took us a while to get to the point where our content was good, but we’d always felt that our writing ability was something we could leverage, something we could build on. We also rather strongly identified with the rationalist community by that point, and we desperately wanted to participate meaningfully in the conversations that were going on, contributing to the shared and growing subcultural narrative.

IV.

We moved to Seattle. Overall, given the election of Donald Trump a year later, it was probably a good decision. Things have been pretty okay here. We’re still objectively poor; we work a minimum-wage job and can barely cover rent and the mundane expenses associated with survival, but it’s a nicer environment to be poor in than a semi-rural post-industrial landscape. We’ve stretched out and established social networks, made friends, and had a pretty great experience all things considered.

Fiona’s prediction is still looming overhead, though, like a twenty-year curse just waiting to land. Our job is nice, and it’s fairly stable even, but it doesn’t earn us much money. We live very frugally, but we’ve not managed to save anything, so if we were hurt and couldn’t work, we’d have about a month to figure something out before we were thrown out on the street. Our support network is decent, and we might be able to couch surf, but all of that still feels like standing beneath Fiona’s curse, waiting for it to hit home. We can scrape by for a long time; we’ve been scraping by for years now, and our plan is to continue doing so until either the pavement or our face gives way, for lack of a better option. But it’d be really nice to have a better option.

So where exactly does that leave us? What is the better option? There doesn’t seem to be one at the moment, so we’ve set out to construct one from the ground up. We had no formal degree, so we couldn’t pursue a technical field; we had to do something that leveraged our skills, and thus we zeroed in on writing. Initially fiction writing: we set out to produce good rationalist fiction in the vein of HPMOR. We’ve put a lot of time into our writing. We launched Sideways In Hyperspace nine months ago, and we’re really happy with how it’s developing.

Here’s the thing, though: most people who write fiction, or produce rationalist blogs, or otherwise create rationalist content (aside from CFAR) do it as a hobby, something in their spare time when they’re not doing even more awesome things to save the world. We’re trying to do something different: devote as much of our life and our resources as we can to the project of rationality itself.

We don’t have a lot going for us, and what we do have going for us is at least partly attributable to rationality and the ideas we’ve taken away from the sequences and from people like Scott, so we’re very attached to the ideas presented in the community and very much want to see rationality grow and spread as a community, subculture, and movement. We want our tribe to win.

But the rationality tribe is mostly focused externally, on big real-world problems like eradicating malaria or preventing an AI from turning us into paperclips; there’s not much focus being directed inwards, towards the community itself. Which makes sense: pooling resources within the tribe isn’t effective altruism. It’s buying fuzzies, not utilons, and why would we waste precious and limited community resources on fuzzies when people are literally dying of malaria right now?

V.

We’re a community, and we want to do good in the world. We want the world to be good, not just for our tribe, but for everyone. In that context, directing resources back at the tribe, resources we could be using to do more good elsewhere, seems like a mistake. There’s another side to consider, though: our tribe is a collection of humans trying to live their lives. Our ability to do good in the world, to direct positive action outwards, rests on community members being able to support themselves with enough resources left over to spare for outward action. That works when the majority of the community can support themselves, as is currently the case in the rationalist community, but not everyone is doing well enough for it to be a viable course of action for every member.

As an example, there’s a suggested effective altruism pledge to cut your income down to $30,000 a year, live frugally with a bunch of friends, and donate all the rest of what you make to charity. Okay, that’s great, but what if you’re us, and last year, working full time the whole year, you only made $20,000 and had to spend all of it on survival expenses? We’re not able to do anything to help with those big, important external problems. We can’t attack them from the technical side, since we don’t have a degree, and we can’t attack them from the financial side, since we don’t have money. There’s not really much we can do to contribute to big important altruist causes like that besides cheerleading from the sidelines.
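To spell the arithmetic out (a minimal sketch; the $30,000 figure comes from the pledge, while the function name and sample incomes are ours):

```python
def pledge_donation(annual_income: float, living_cap: float = 30_000.0) -> float:
    """Live on at most the cap; everything above it gets donated."""
    return max(0.0, annual_income - living_cap)

print(pledge_donation(45_000))  # 15000.0 available to donate
print(pledge_donation(20_000))  # 0.0 -- under the cap, nothing left to give
```

The pledge only binds once you clear the cap; at our income, the donatable surplus is exactly zero.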

But we want to help, and we doubt we’re the only ones. It seems like everyone sufficiently integrated into our community and not too bogged down with their own personal problems feels the pressing need to do something. We feel that need and we are bogged down in personal problems.

It seems to us like sufficiently incorporating the rationalist mindset brings the desire to do good in the world along with it, and even if someone can’t personally help, they want to. Rationality feels like a grand adventure: going into battle against the forces of darkness and bringing humanity into a new age of light. Defeating death and banishing it from our lives, building great cities in the sky, and manifesting our wildest dreams into reality. It’s a humbling and awe-inspiring vision of humanity and the future, and once you’ve heard the tune, you can’t stop humming it.

We’ve heard the song of Dath Ilan, and we can’t unhear it. The concepts and ideas all come together up in the headwaters of form and hint at a future brighter than we can possibly imagine, and we want to do everything in our power to make that future a reality.

So here’s what we’re going to do. We’re going to strongly encourage everyone who likes reading our content and wants to help enable us and other rationalist content creators to donate to our Patreon. We’re also going to implement the $30,000 cap on our personal income. Any amount we make beyond that will be donated to the Origin Project, a rationalist housing project aiming to provide a home for down-and-out members of the community while they rebuild their lives and get to a place where they’re able to put value back in. There will be a blog post dedicated to the Origin Project soon, so stay tuned for that.

We’re also pledging that we’ll keep producing rationalist content for as long as we’re able to dedicate the time and resources to it. Hopefully, as our income grows, we’ll be able to make more and more content and provide support for more and more members of the community.

Our long-term goal is to enable community growth and cohesion through members supporting and enabling one another to do as much as they can, increasing what they can do by leveraging them out of bad circumstances and into better life positions. The first step is to get far enough out of the hole ourselves that we can begin dedicating resources to helping others climb out. This isn’t Effective Altruism; it’s more like Venture Rationality, but it still seems like a worthy addition to the rationalist sphere of concern.