Upvote this if you think separate agree/disagree and upvote/downvote buttons are a good idea.
Thus, when aiming to maximize expected positive impact, it is not advisable to make giving decisions based fully on explicit formulas.
I love that you don’t seem to argue against maximizing EV, but rather argue that a certain method, EEV, is a bad way to maximize EV. If this had been stated at the beginning of the article, I would have been a lot less skeptical initially.
(18 Aug 2011 23:33 UTC; 7 points; comment on “Why We Can’t Take Expected Value Estimates Literally (Even When They’re Unbiased)”)
I disagree. I’m entertained.
Upvote this if you want quotes in the main LW.
Should LW have a public censorship policy?
Upvote this if you think separate agree/disagree and upvote/downvote buttons are a bad idea.
Omega can be replaced by amnesia
People subscribing to Eliezer’s comments is what you would expect if they were cultists who wanted to make sure they didn’t miss a word from Mr. Cult Leader!
I think LW is not a cult, but please don’t make bad arguments for why LW is not a cult.
…not implementing Projects, people will improve their Rationality skills at a far slower pace. You will thus run afoul of Bhagwat’s Law of Commitment: “The degree to which people identify with your group is directly proportional to the amount of stuff you tell them to do that works.”
This seems to equate “improving Rationality skills” with “identifying with the group”. I find this frightening, and a step towards just using “rationality” as something in the name of which to grub power, influence and followers and as a flag to rally a generic community around. Maybe that’s the function of religious teachings for religious communities, but I hope not for LW.
Upvote this if you think neither.
Do you also think that global warming is a hoax, that nuclear weapons were never really that dangerous, and that the whole concept of existential risks is basically a self-serving delusion?
Also, why are the folks that you disagree with the only ones that get to be described with all-caps narrative tropes? Aren’t you THE LONE SANE MAN who’s MAKING A DESPERATE EFFORT to EXPOSE THE TRUTH about FALSE MESSIAHS and the LIES OF CORRUPT LEADERS and SHOW THE WAY to their HORDES OF MINDLESS FOLLOWERS to AN ENLIGHTENED FUTURE? Can’t you describe anything with all-caps narrative tropes if you want?
Not rhetorical questions; I’d actually like to read your answers.
If we go through with the double karma, let’s at least collect statistics: given that a user voted something up/down on one karma, with what likelihood did they also vote it in the same direction on the other karma?
Maybe it’ll be revealed that for most posts, most users either click neither or both, and in the same direction.
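A minimal sketch of how that statistic might be computed from a vote log (the record format and field names here are assumptions for illustration, not LessWrong’s actual data model):

```python
# Assumes each vote is a (user, post, axis, direction) record, where axis is
# "karma" or "agreement" and direction is +1 or -1. This schema is hypothetical.
def same_direction_rate(votes):
    by_user_post = {}
    for user, post, axis, direction in votes:
        by_user_post.setdefault((user, post), {})[axis] = direction

    voted_any = same_dir = 0
    for axes in by_user_post.values():
        voted_any += 1                                   # voted on at least one axis
        if len(axes) == 2 and len(set(axes.values())) == 1:
            same_dir += 1                                # voted on both, same direction
    return same_dir / voted_any if voted_any else None

# Example: alice votes up on both axes; bob votes up on karma, down on agreement.
votes = [
    ("alice", 1, "karma", +1), ("alice", 1, "agreement", +1),
    ("bob",   1, "karma", +1), ("bob",   1, "agreement", -1),
]
print(same_direction_rate(votes))  # 0.5
```

A rate near 1 would support the hypothesis that most voters either click neither button or both, in the same direction.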
Good advice for the real world, maybe. But consider that here on LW, we are among analytical people, and wouldn’t have it otherwise.
Heard on #lesswrong:
BTW, I figured out why Eliezer looks like a cult leader to some people. It’s because he has both social authority (he’s a leader figure, solicits donations) and epistemological authority (he’s the top expert, wrote the sequences which are considered canonical).
If, for example, Wei Dai kicked Eliezer’s ass at FAI theory, LW would not appear cultish.
This suggests that we should try to make someone else a social authority so that he doesn’t have to be.
(I hope posting only a log is ok)
site:lesswrong.com “artificial intelligence” = 30,700 results
site:lesswrong.com “Singularity” = 32,000 results
Thought this was because of the logo at the top of the page, so searched for “Singularity Institute for Artificial Intelligence” and got:
site:lesswrong.com “Singularity Institute for Artificial Intelligence” = 111,000 results
So something’s weird. Also, if you move “site:lesswrong.com” to the right side you get 116,000 instead.
No, I think the central “problem” is that having preferences that others can thwart with little effort is risky because it makes you more vulnerable to extortion.
For example, if you have a preference against non-prime heaps of pebbles existing, the aliens can try to extort you by building huge numbers of non-prime heaps on their home planet and sending you pictures of them; therefore, the argument goes, it’s crazy and stupid to care about non-prime heaps.
The argument also yields a heuristic that the farther away a thing is from you, the more stupid and crazy it is to care about it.
Plus: “Most people would kill themselves with unlimited willpower. Maybe willpower is limited so that one can’t pursue bad goals too far.”
LW Minecraft server anyone?
norm against boasting about … predicted success
This is a great idea!
Slavoj Žižek