$500. I can wait a little longer to get a new laptop.
Took it and laughed several times.
There is an objectively real morality. (10%) (I expect that most LWers assign this proposition a much lower probability.)
The future is probably an impending train wreck. But if we can save the train, then it’ll grow wings and fly up into space while lightning flashes in the background and Dragonforce play a song about fiery battlefields or something. We’re all stuck on the train anyway, so saving it is worth a shot.
I hate to see smart people who give a shit losing to despair. This is still the most important problem and you can still contribute to fixing it.
TL;DR: I want to give you a hug.
If we discover laws of physics that only seem to be active in the brain, that would convince me. If we discover that the brain sometimes violates the laws of physics as we know them, that would convince me. If we build a complete classical simulation of the brain and it doesn’t work, that would convince me. If we build a complete classical simulation of the brain and it works differently from organic brains, that would convince me. Ditto for quantum versions, even, I guess.
And there are loads of other things that would be strong evidence on this issue. Maybe we’ll find the XML tags that encode greenness in objects. I don’t expect any of these things to be true, because if I did then I would have updated already. But if any of these things did happen, of course I would change my mind. It probably wouldn’t even take evidence that strong. Hell, any evidence stronger than intuition would be nice.
What would it take to convince you that this entire line of inquiry is confused? Not just the quantum stuff, but the general idea that qualia are ontologically basic? Not just arguments, necessarily, experiments would be good, too.
If Mitchell is unable or unwilling to answer this question, no one should give him any amount of money no matter the terms.
PSA: There is an actual physical sensation that accompanies religious experiences. If you feel the presence of a being of awesome power and an unusual sensation of… fullness?… in your chest, don’t panic or start believing in a god or anything crazy.
It’s a physiological thing that happens to people, especially in altered states (drugs, sleep deprivation, etc.), and it doesn’t mean anything.
I feel like a jerk for saying this, but in the four days since you announced your intention to cut back on top-level posting for a while, this is the second top-level post you’ve made.
To be blunt, you are violating community norms by posting large quantities of material despite general disinterest or disapproval from other community members. Before making future posts, please try to calculate the expected value of their content for your readers. When in doubt, refrain from posting.
This discussion section is intended to have lower standards than the main section of the site, but even so those standards are much higher than those of almost any other internet discussion forum.
While I can’t speak for anyone else here, I would appreciate it if you would cease making top-level posts entirely until your karma rises above 0.
My hypothesis is that this is a “realist”/”idealist” divide. Or, to put it another way, one camp is more concerned with being right and the other is more concerned with doing the right thing. (“Right” means two totally different things, here.)
Quality of my post aside (and it really wasn’t very good), I think that’s where the dividing line has been in the comments.
Similarly, I think most people who value PUA here value it because it works, and most people who oppose it do so on ethical or idealistic grounds. Ditto discussions of status.
The reason the arguments between these camps are so unfruitful, then, is that we’re sort of arguing past each other. We’re using different heuristics to evaluate desirability, and then we’re surprised when we get different results; I’m as guilty of this as anyone.
An alternate hypothesis: people are loss-averse.
One of the best pieces of evidence for this theory is an incident that occurred during the development of the online role-playing game World of Warcraft. While the game was in beta testing, its developer, Blizzard, added a “rest” system to help casual players develop their characters at a pace slightly closer to that of the game’s more serious players, who tended to devote much more time to the game and thus “leveled up” much more quickly.
The rest system granted “rested experience” at a gradual rate to players who were not logged into the game. As initially implemented, characters with available rest experience earned experience points at a 100% rate, draining their rest experience in the process. Once a character’s rest experience ran out, it earned experience at only a 50% rate. Because rest experience accumulated slowly, only while offline, and capped out after about a day and a half, players who logged on every day for short sessions earned experience points most efficiently, which narrowed the gap between them and the game’s heavier players.
But while the system was achieving its goal, almost all of the game’s testers hated it, no matter how much they played. They felt like they were being penalized for playing too long, which just didn’t seem fair.
Blizzard fixed it by changing the rested rate to 200% and the normal rate to 100%, without changing the actual number of experience points earned.
They just relabeled the percentages, told everyone that that was what they were doing, and then everyone stopped complaining and was perfectly happy with the system.
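To see why the relabeling worked, here’s a minimal sketch of both framings in Python; the rate values come from the anecdote above, and base_xp is a made-up number chosen purely for illustration:

```python
# Both framings award identical experience; only the labels differ.
# base_xp is a hypothetical per-kill value chosen for illustration.
base_xp = 100

# Original framing: rested play is the "normal" 100% rate,
# and un-rested play is penalized down to 50%.
original = {"rested": base_xp * 1.00, "normal": base_xp * 0.50}

# Relabeled framing: un-rested play is the "normal" 100% rate and
# rested play is a 200% bonus. Blizzard halved the underlying base
# value, so the absolute payouts are unchanged.
relabeled = {"rested": (base_xp / 2) * 2.00, "normal": (base_xp / 2) * 1.00}

assert original == relabeled  # same XP either way; only the framing moved
```

The payouts never changed; the reference point did. A “penalty” for unrested play became a “bonus” for rested play, which is exactly what loss aversion predicts should matter.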
If arguments over the relative merits of Star Wars and Star Trek became a regular feature of this site, I would feel compelled to burn it to the ground.
I don’t have it handy to grab specifics, but Sex at Dawn discusses a couple dozen different cultures in which promiscuity is the norm. Some of those cultures shame the non-promiscuous, to varying extents.
Human society as we currently know it is less than 10,000 years old. That’s an incredibly short timescale for major evolutionary changes.
Please don’t delete comments. It makes it hard to understand orphaned replies. Adding an [Edit: Withdrawn] at the end of the comment serves the same purpose, but maintains conversational continuity.
I’m sorry, but “Hagrid is lonely” is not a concern worth five seconds of thought when Harry could be working on getting rid of dementors or Azkaban or Death Eaters or death.
Harry trusts Quirrell less now than ever before, and he spent much of the chapter before this one rhapsodizing about Hermione’s exceptional moral behavior, which definitely sounds to me like it could be his something to protect.
What would he have to do to convince you that he’s on the road to hell?
Anything evil? I’m still a little dubious of Harry’s judgment of late (though it seems to be recovering), but I’m really surprised you’re worried about his intentions.
Cached thoughts are default answers to questions. Unquestioned defaults are default answers to questions that you don’t know exist.
Reading this made my brain hurt. It’s a pile of false analogies that ignores the best arguments disagreeing with it, which is particularly ironic in light of the epigraph. (I’m thinking of Chalmers specifically, but really you can take your pick.)
I’m tempted to go through and point out every problem with this post, but I noticed at least a dozen on my first read-through and I just don’t have the time.
Posts arguing against the LW orthodoxy deserve disproportionate attention and consideration to combat groupthink, but this is just too wrong for me to tolerate.
“I don’t have much to gain from hanging out with Hagrid” and “I don’t care about Hagrid’s well-being” are radically different statements, and the former doesn’t imply the latter.
Harry believes that he is unusually capable of improving the world. That means his time is valuable, and shutting up and multiplying suggests that he should avoid entanglements unless they are expected to improve his chances of success. Harry is acting cold but not evil.
On Eileen Barker:
Much like in 2 above, many people have chosen to sign up for cryonics based on advice from the likes of Eliezer and Robin; indeed, Eliezer has advised that anyone not smart enough to do the math should just trust him on this.
I believe that most LW posters are not signed up for cryonics (myself included), and there is substantial disagreement about whether it’s a good idea. And that disagreement has been well received by the “cult”, judging by the karma scores involved.
Several us/them distinctions have been made and are not open for discussion. For example, theism is a common whipping-boy, and posts discussing the virtues of theism are generally not welcome.
Theism has been discussed. It is wrong. But Robert Aumann’s work is still considered very important; theists are hardly dismissed as “satanic,” to use Barker’s word.
Of Barker’s criteria, 2-4 of 6 apply to the LessWrong community, and only one (“Leaders and movements who are unequivocally focused on achieving a certain goal”) applies strongly.
On Shirley Harrison:
I’m not sure if ‘from above’ qualifies, but Eliezer thinks he has a special mission that he is uniquely qualified to fulfill.
I can’t speak for Eliezer, but I suspect that if there were a person who was obviously more qualified than him to tackle some aspect of FAI, he would acknowledge it and welcome their contributions.
While ‘revealed’ is not necessarily accurate in some senses, the “Sequences” are quite long and anyone who tries to argue is told to “read the Sequences”. Anyone who disagrees even after reading the Sequences is often considered too stupid to understand them.
No. The sequences are not infallible, they have never been claimed as such, and intelligent disagreement is generally well received.
Many people here develop feelings of superiority over their families and/or friends, and are asked to imagine a future where they are alienated from family and friends due to their not having signed up for cryonics.
What you describe is a preposterous exaggeration, not “[t]otalitarianism and alienation of members from their families and/or friends.”
There is volunteer effort at LW, and posts on LW are promoted to direct volunteer effort towards SIAI. Some of the effort of SIAI goes to paying Eliezer.
Any person who promotes a charity at which they work is pushing a cult, by this interpretation. Eliezer isn’t “lining his own pockets”; if someone digs up the numbers, I’ll donate $50 to a charity of your choice if it turns out that SIAI pays him a salary disproportionately greater (2 sigmas?) than the average for researchers at comparable non-profits. (A sketch of that criterion follows at the end of this comment.)
So that’s 2-6 of Harrison’s checklist items for LessWrong, none of them particularly strong.
My filters would drop LessWrong into the “probably not a cult” category, based on those two standards.
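For concreteness, here’s a minimal sketch of the 2-sigma criterion from the salary bet above, in Python. Every number in it is a hypothetical placeholder, not real compensation data; the whole point of the bet is that someone would need to dig up the actual figures.

```python
import statistics

# Hypothetical placeholder salaries at comparable non-profits
# (NOT real data) and a hypothetical salary for the researcher
# whose compensation is in question.
peer_salaries = [60_000, 65_000, 70_000, 75_000, 80_000]
salary_in_question = 85_000

mean = statistics.mean(peer_salaries)
sigma = statistics.stdev(peer_salaries)  # sample standard deviation

# The bet pays out only if the salary exceeds the peer average
# by more than two standard deviations.
disproportionate = salary_in_question > mean + 2 * sigma
print(f"mean={mean:.0f}, sigma={sigma:.0f}, disproportionate={disproportionate}")
```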
“If you perform experiments to determine the physical laws of our universe, you will learn how to make powerful weapons.”
It’s all about incentives.