Cards Against Rationality

(This post won’t make much sense if you don’t know about the game Cards Against Humanity. Fortunately it has a web site. If you know the game Apples to Apples, well, CAH’s gameplay is almost identical to Apples to Apples … but the cards range from snarky to perverted to shockingly un-PC.)
After the LW meetup in Mountain View yesterday, the idea came up of a Less Wrong expansion set for Cards Against Humanity … with a rough Shit Rationalists Say theme, plus a little help from Eliezer Yudkowsky Facts. Regardless of whether this ever happens, we felt the need to share the pain with the rest of the community.
These are meant to be mixed with the standard deck. Hence, the completed phrase “That which can be destroyed by being a motherfucking sorcerer should be” is a clearly winning combination, as is “Why am I sticky? Grass-fed butter.”
Black cards:
That which can be destroyed by _____ should be.
_____ is the mind-killer.
The thirteenth virtue of rationality is _____.
_____ is truly part of you.
“Let me not become attached to _____ I may not want.”
_____ is vulnerable to counterfactual mugging.
What is true is already so. _____ doesn’t make it worse.
_____ is not the territory.
_____ will kill you because you are made of _____ that it could use for something else.
“I’m an aspiring _____.”
In the new version of Newcomb’s problem, you have to choose between a box containing _____ and a box containing _____.
Instrumental rationality is the art of winning at _____.
Less Wrong is not a cult so long as our meetups don’t include _____.
In an Iterated Prisoners’ Dilemma, _____ beats _____.
The latest hot fanfic: _____ and the Methods of _____.
_____ is highly correlated with _____.
Absence of _____ is evidence of _____.
The coherent extrapolated volition of humanity includes a term for _____.
We have encountered aliens who communicate through _____.
In the future, Eliezer Yudkowsky will be remembered for _____.
I’m signed up with Alcor, so _____ will be frozen when I die.
“I am running on corrupted _____.”
An improperly-programmed AI might tile the universe with _____.
You know what they say: one person’s _____ is another person’s _____.
“I want to want _____.”
_____ is what _____ feels like from the inside.
_____ is the unit of caring.
If you’re not getting _____, you’re spending too many resources on _____.
Every _____ wants to be _____.
Inside Eliezer Yudkowsky’s pineal gland is not an immortal soul, but _____.
Before Bruce Schneier goes to sleep, he scans his computer for uploaded copies of _____.
Eliezer Yudkowsky updates _____ to fit his priors.
Eliezer Yudkowsky doesn’t have a chin; under his beard is _____.
Never go in against _____ when _____ is on the line.
Reversed _____ is not _____.
You have no idea how big _____ is.
Why haven’t I signed up for cryonics?
What am I optimizing for?
The Quantified Self people have finally figured out how to measure _____.
You can’t fit a sheep into a _____.
Make beliefs pay rent in _____.
Why did my comment get downvoted?
“You make a compelling argument for _____.”
“My model of you likes _____.”
“I can handle _____, because I am already enduring it.”
White cards:
Eliezer Yudkowsky
Friendly AI
Unfriendly AI
Lukeprog’s love life
The New York meetup group
Updating
Ugh fields
Ben Goertzel
Guessing the teacher’s password
Confidence intervals
Signaling
Polyamory
The paleo diet
Asperger’s syndrome
Ephemerisle
Burning Man
Grass-fed butter
Dropping acid
Timeless Decision Theory
Pascal’s mugging
The Sequences
Deathism
Alcor
The Singularity Institute for Artificial Intelligence
Quirrellmort
Dark Arts
Tenorman’s family chili
Affective death spirals
Rejection therapy
The cult attractor
Akrasia
The Bayesian Conspiracy
Paperclips
The Copenhagen interpretation
Clippy
Shit Rationalists Say
Babyeaters
Superhappies
Aubrey de Grey’s beard
Robin Hanson
The blind idiot god, Evolution
Getting downvoted on Less Wrong
Two-boxing
The obvious Schelling point
Negging
Peacocking
P-Zombies
Tit-for-Tat
Applause lights
Rare diseases in cute puppies
Rationalist fanfiction
Sunk costs
Vibram FiveFingers
RationalWiki
The Chaos Legion Marching Song
Poor epistemic hygiene
A sheep-counting machine
A horcrux
Getting timelessly physical
The Stanford Prison Experiment
A ridiculously complicated Zendo rule
Utils
Wireheading
My karma score
Wiggins
Ontologically basic mental entities
The invisible dragon in my garage
Meta-contrarianism
Mormon transhumanists
Nootropics
Quantum immortality
Quantum immorality
The least convenient possible world
Cards Against Rationality
Moldbuggery
The #1 reviewed Harry Potter / The Fountainhead crossover fanfic, “Howard Roark and the Prisoner of Altruism”
Low-hanging fruits
The set of all possible fetishes
Rationalist clopfic
The Library of Babel’s porn collection
Counterfactual hugging
Acausal sex
Post your own!
EDIT, 2012-08-29: Several additions from the thread and elsewhere.
EDIT, 2012-12-25: This is licensed under the Creative Commons BY-NC-SA 2.0 license, because Cards Against Humanity is.