We Fell For It
Inciting Incident
A friend recently sent me this message on Discord:
so apparently one of the engineers helping elon essentially coup the government is a diehard rationalist lmao https://web.archive.org/web/20241124181237/http://colekillian.com/ lists “rationality a-z” as the book that most influenced him
is he doing some weird game theory shit behind the scenes do you think? or has he legit drank the koolaid
This friend is like half-familiar with the rationalist community and its ideas, as I’ve gotten him more into it over the years. He was not, however, necessarily as up-to-date about NRx, so some of my reply will be old news to many of you.
The following is my reply, lightly edited (i.e. still conversational, with hyperbole, figurative language, and simplifications left intact) and heavily augmented with links:
What I Said
Y’know, I think somewhere in the process he drank Elon’s kool-aid and sorta left his brain on like 2014/15-era autopilot.
He might literally think Elon’s Nazi salute wasn’t a Nazi salute.
On a game-theory level, it’s extremely easy to think of yourself as basically a technolibertarian, for whom all the fascist stuff is just “necessary evils” to save America from SF-like zoning laws. (see: Anduril, Palantir)
For smart people, unfortunately, lots of the time we just rationalize stuff harder. Plenty of people swear by Rationality A-Z and have even read Meditations On Moloch, but then think they’re one of the good capitalists and that the social system totally won’t turn into a virus that eats them.
IMHO a bigger and more worrying influence is Curtis Yarvin, aka Mencius Moldbug, aka “the guy who wants a monarchical dictatorial CEO president”. He beefed with, and unfortunately apparently converted, some rationalists to his cause; Peter Thiel invested in him; Thiel funds JD Vance; and Yarvin has written a buttload of posts over the years detailing how he wants Trump to do a coup.
Most lizardlike people on earth → manipulate → idealistic rationalists → extract their intelligence → use them up like ammo (a metaphor used by pre-Nazi-Musk employees to describe how he hired and fired) → crystallize the intelligence into AI → competence on-demand for the most powerful (i.e. mostly-selected to be the most-ruthless/psychotic) people on earth → we all die by rogue AI and/or non-rogue AI drones and/or slow-burn police state and/or climate wars.
“Manipulate? But they’re rationalists!” Sorry, get in line. I won thousands of dollars from a writing contest indirectly funded by SBF.
>“I’m one of the good ones.”
>“I’m making tough choices.”
>“It’s a necessary evil.”
>"Just survive until AI safety is solved, anyway let’s build bigger and more-powerful AI."
>“I’ll just take my piece of the pie before leaving, they can’t golden-handcuff me!”
>“I don’t believe in AI, so my actions make sense in the long-run future [40 paragraphs about how early Christians had more babies and now the world is perfect and Jesus-like, therefore if we just…]”
“Is there still hope?” Yes! It relies on unlikely, difficult, and/or unprecedented things happening. Then again, the world is getting more “unprecedented” every day, so maybe that’s not the “drawback” it sounds like.
Bonus: a joke from later in the conversation
Yay, awareness of the Neoreactionaries (the sect that Yarvin basically owns) has now reached “leftist redditor tinfoil-hat comments”, which basically means it’ll be government policy by next week.
Closing Thoughts
I wish leftists existed who read and made The Sequences a part of themselves.
I wish those same people existed, and didn’t then decide to become a cult of pro-Hell alleged serial killers.
I wish more leftists understood natural selection, selection effects in general, and memetics in particular.
I wish more rationalists and AI safety/alignment/governance people took Moloch seriously on the ground level, in their real everyday lives, and in how they relate to and are influenced by a systemic society ideology social system (insert 50 leftist buzzwords that are really just normal words).
I wish more rationalists would bother steelmanning classic paranoid-redditor-tier leftist ideas. Sure, the “political journey” and the “valley of bad rationality” would’ve gotten even messier (I can attest to this!), but the long-term gains, I think, would’ve been worth it. Who knows, maybe the left and the rats would’ve learned more about memetics and group dynamics. Ideas mixing and developing, which could have (dare I hope?) eventually become free of blank slates and noble savages and might-makes-rights and power-ignorance and sociopathy-denial.
I wish that both the “tip top” and the “medium top” of educated people had reached consensus that both “property is downstream of power” and “prices are downstream of supply and demand”.
I wish /r/leftrationalism had, like, any activity whatsoever.
But enough wishes.
Remember that hope I mentioned? And how it may flourish even now, with how weird things are going? Well, it would serve you well to remember how the rationality community prepared each of us for COVID and Big AI. The nontrivial percentage of us who made money from crypto and NVDA. The people with strange thoughts and the tools to check them against reality, if we desire.
Building and updating and extrapolating world-models, and from there making positive and creative change. The rest is commentary. Better late than never, eh?
“When the going gets weird, the weird turn pro.” – Hunter S. Thompson
If you enjoyed this post, help us write more by donating to our Patreon.