Just donated 400 €.
ArisKatsaris
I noticed that the people who didn’t like the book were essentially put off by the rationality. They thought Harry was arrogant and condescending.
If they said that Harry was being arrogant and condescending, perhaps you shouldn’t immediately translate this into your mind as “essentially put off by the rationality”?
In a previous version of the story (I believe Eliezer has since revised it, probably because he did realize it was going too far), Harry in one case called McGonagall “Minerva” and considered calling her “Minnie”, when McGonagall had been calling him “Mr Potter” throughout.
Harry has indeed been an arrogant and condescending little twit.
$1,200 donated.
I’d like to remark on something that annoys me: Your “donation meter” (at least the one on your site, if not the one in the post above) ought either to be updated daily without fail, or at the very least it should note when it was last updated. I find the phrase “raised to date” frustrating and annoying when I can’t trust that the “to date” is actually current.
Don’t react naturally (eg: “Waterline is a clever meaningful in-group signal and sounds pretty”), ask yourself how your target will react (eg: “Oh, are they whale environmentalists?”).
I think that consideration may be highly overestimated in the discussion here. Facebook isn’t about faces, Twitter isn’t about songbirds, Google has little to do with the number “googol”, Apple isn’t selling fruit… etc, etc.
A short, pretty name that people can remember and look up if they need to may be just as good for marketing. Something like “Waterline Institute” needs to be clarified only once (“they’re talking about raising a metaphorical ‘sanity waterline’ in the human population”), and then it’s a memorable enough name and image alike.
But something like “Bayesian House” can only be clarified by a long explanation about mathematical formulas… And it’s not immediately memorable afterwards, because frankly “Bayes” is just a name, after Thomas Bayes.
But honestly, I’ve never studied marketing or anything like that, so I may just be talking out of my ass here...
Assuming evil people will be susceptible to such arguments
I didn’t say evil people will be susceptible to such arguments.
I was naming three reasons that good people have to not be evil, not three arguments that would cause evil people to stop being evil.
Contrast that with someone who denies the existence of anthropogenic global warming (AGW)
I don’t have the knowledge of climatology to make a reasoned claim about AGW myself one way or another. Whether I believe or disbelieve in AGW, my position would therefore currently have to be based entirely on trusting the positions of other people. Those positions are indeed Bayesian evidence, but “mistrusting the current climatological elite”, even if someone places a wrong prior on how likely said climatological elite is to manufacture/misinterpret data, is not remotely similar to the logical hoops that your average theist has to go through to explain and excuse the presence of evil in the world, the silence of the gods, the lack of material evidence, archaeological and geological discrepancies with their holy texts, etc, etc, etc.
So your test isn’t remotely as good. It effectively tests just one thing: one’s prior on how likely climatologists are to lie or misinterpret data.
I love the article, but this is a bad name for a fallacy, as it hinders neutral discussion of its relative badness compared to other fallacies.
If I could pick a name, I’d probably choose something like “tainting categorization”.
I’m pretty sure the solution is as follows (I’ve already posted it in the TV Tropes forum). It’s in ROT13, if anyone still wants to figure it out: Yhpvhf Znysbl pynvzrq gb unir orra haqre Vzcrevhf ol Ibyqrzbeg. Ibyqrzbeg jnf qrsrngrq ol Uneel Cbggre. Sebz Serq & Trbetr’f cenax jr xabj gung xvyyvat gur jvmneq gung unf lbh haqre gur Vzcrevhf phefr perngrf n qrog. Erfhyg: Yhpvhf Znysbl naq rirel bgure Qrngu rngre pynvzvat gb unir orra vzcrevbfrq ner abj haqre yvsr qrog gb Uneel Cbggre. Ur pna fgneg erqrrzvat.
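For readers unfamiliar with ROT13: it’s a simple letter-substitution cipher that rotates each letter 13 places, so applying it twice returns the original text. A minimal sketch of decoding it with Python’s standard library (this decodes an example word, not the spoiler above):

```python
import codecs

# ROT13 is its own inverse: encoding and decoding are the same operation.
plaintext = codecs.decode("Uryyb", "rot13")
print(plaintext)  # prints "Hello"
```

Any ROT13 tool or one-liner like this will reveal the spoiler text if you choose to decode it.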
Took the survey. Cooperated.
signal your outgroup hatred with a downvote and move on.
Downvoted because I don’t find it appropriate to uncharitably interpret the meaning of any downvotes one receives, and certainly not out loud and in advance.
Okay, after thinking a few minutes about the Batman-Joker/where do you put Dark Wizards if you’re determined not to use Dementors anymore problem...
Unbreakable Vow anyone? Just give Dark Wizards the option “either you take an Unbreakable Vow to never knowingly kill/torture/Imperio a human being ever again, nor to ever knowingly assist in such, or we just execute you right now”.
I can think of possible ways out of this meta-problem, in order to sustain the dilemma: Perhaps really powerful Dark Wizards require too vast a portion of magical power to sustain the vow. Perhaps there are dark rituals whereby Dark Wizards can break out of even an (ill-named) Unbreakable Vow. Perhaps Dark Wizards tend to have performed other rituals that already make them immune to Unbreakable Vows… Perhaps Unbreakable Vows need to be really, really specific in some weird manner, so that “I will not kill Bill Weasley” and “I will not kill Charlie Weasley” necessarily are two separate vows, and “I will not kill any human” isn’t enforceable...
But these are additional problems that are not yet mentioned/listed/foreshadowed in the story. Ugh, Unbreakable Vows seem something of a game breaker right now.
Sidenote: Whenever I think of something like this, I worry that the author will think he has to rewrite/revise everything he had already planned, and that we’ll never get an update again. Not my intention, I swear.
Either:
1) the audience of LW changed significantly in the half-year interim;
or 2) the lack of personal input in the second post caused people to more freely voice their true opinions, rather than those they suspected I would take offense at.
Some more alternatives:
3) A couple of different people chanced to participate in the latter thread who hadn’t participated in the former thread. In short, you may be generalizing from one example.
4) The latter thread began with your comparison between transsexuality and otherkin, thus anchoring transsexualism to the low status of otherkin — if anything, you’re fortunate people didn’t explicitly compare transsexuals to “stupid role-playing trolls” (the current status that otherkin have in my mind).
I’m not MIRI affiliated, but as a member of the LessWrong forum, and talking for myself alone, I’ll just repeat what I’ve said before: There’s only so many times someone can call me a brainwashed cultist, before I stop forgiving them.
You’ve spent the past few years insulting and mocking people for having different opinions than you. That’s it. That’s the entirety of the crime of LessWrong/MIRI: you’ve not produced a hint of unethicality or dishonesty in regards to any of MIRI’s and/or LessWrong’s doings, but you bash them viciously for having different opinions.
LessWrongers have always treated you (and RationalWiki too), and are still treating you and any of your different opinions, much more civilly than you (or RationalWiki) ever treated us and any of ours. So you getting health-related issues as a result of the viciousness you perpetrate — okay, that’s like repeatedly punching someone and then complaining that your fist has started to hurt.
We don’t have, nor ever had, a “Why Alexander Kruel/Xixidu sucks” page that we can take down. You are the one with the bazillion “Why LessWrong/MIRI sucks” pages. Unlike what you have done with EY, I haven’t even screenshotted the comments by you that you’ve later chosen to take down because you found them embarrassing to yourself. Gee, it must be nice NOT having someone devoted to mocking you.
I wish you good health, as a general moral principle of my humanism. But I also care about the harm you’ve caused the targets of your viciousness.
Yeah, but the difference is that the majority of people actually have SAT scores.
A majority of US people perhaps. Aargh the Americano-centrism, yet again.
Two obvious questions missing from the survey btw are birth country, and current country of residence (if different).
I upvoted this because it was highly amusing—but ultimately it’s silly, a perfect example of how some people can be so sharp that they cut themselves.
I wonder: if instead of one-boxing and two-boxing for a prize of $100 or $200, we had “Selection A: horrible self-mutilation” and “Selection B: one million dollars”, with Prometheus creating only the people that he believed would pick Selection A and reject Selection B… would the people that one-box here STILL one-box?
Well, I’d just choose to win instead and thus pick the one million dollars instead of the horrible self-mutilation. I think that’s the sane thing to do—if Prometheus has a 99.99% predictive capacity on this there’ll be 10000 people who’ll select self-mutilation for every one like me who’ll pick the money. But I already know I’m like me, and I’m the one I’m concerned about.
The relevant probability isn’t P(chooses Self-mutilation|Prometheus created him) ~= 0, but rather the P(chooses one million dollars|Is Aris Katsaris) ~= 1.
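A minimal simulation can make this distinction concrete. The sketch below (my own illustration, not from the original comment) assumes Prometheus predicts each candidate’s fixed disposition with 99.99% accuracy and creates only predicted A-choosers; the accuracy figure and population size are arbitrary assumptions for the demonstration:

```python
import random

random.seed(0)

ACCURACY = 0.9999      # assumed predictive accuracy of Prometheus
CANDIDATES = 1_000_000

created = []
for _ in range(CANDIDATES):
    # Each candidate has a fixed disposition:
    # 'A' = self-mutilation, 'B' = one million dollars.
    disposition = random.choice("AB")
    # Prometheus' prediction is wrong with probability 1 - ACCURACY.
    if random.random() < ACCURACY:
        predicted = disposition
    else:
        predicted = "B" if disposition == "A" else "A"
    if predicted == "A":   # he only creates predicted A-choosers
        created.append(disposition)

# Among created people, almost everyone chooses self-mutilation...
p_a_given_created = created.count("A") / len(created)
print(f"P(chooses A | created) ~= {p_a_given_created:.4f}")

# ...but a created person who already knows their own disposition is 'B'
# conditions on that, not on the population statistic.
p_b_given_me = 1.0
print(f"P(chooses B | is that person) = {p_b_given_me}")
```

Roughly one created person in ten thousand is a money-chooser who slipped through, which is exactly the ratio the comment describes.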
Because he’s also the character least likely to kill an innocent child because of some prophecy? Does Hagrid seem like that much of a consequentialist to you?
I propose not messing with what isn’t broken.
And fixing what is broken.
Nicholas Flamel (born 1340) could be almost as good a source of ancient spells lost to the Interdict of Merlin as Slytherin’s Monster
From Chapter 77:
A single glance would tell any competent wizard that the Headmaster has laced that corridor with a ridiculous quantity of wards and webs, triggers and tripsigns. And more: there are Charms laid there of ancient power, magical constructs of which I have heard not even rumors, techniques that must have been disgorged from the hoarded lore of Flamel himself.
So Dumbledore’s already using some of Flamel’s knowledge in his efforts against Voldemort.
I think we could modify our sense of it to mean that if you are down to having to accept a 0.01% probability, because you’ve excluded everything else, then it’s probably better to go back over your logic and see if there’s any place you’ve improperly limited your hypothesis space.
Several paradigm-changing theories introduced concepts that would have previously been thought impossible (like special relativity, or many-worlds interpretation)
That was the most horribly designed thing I’ve ever seen anyone do on LessWrong, as I once described here, so please, please, no video.
The questions are text. Give your answers in text too, so that we can actually read them — unless there’s some particular question which would actually be enhanced by the use of video (e.g. you’d like to show an animated graph or a computer simulation or something).
If there’s nothing I can say to convince you against using video, then I beg you to at least take the time to read my more specific problems in the link above and correct those particular flaws — a single audio file that we can at least play and listen to in the background while we’re doing something else, instead of 30 videos that we must individually click. If not that, at least a clear description of the questions on the same page (AND repeated clearly in the audio itself), so that we can see the questions that interest us, instead of a link to a different page.
But please, just consider text instead. Text has the highest signal-to-noise ratio. We can actually read it in our leisure. We can go back and forth and quote things exactly. TEXT IS NIFTY.