I have taken the survey.
Actually, it probably is the right place: you don’t want people who aren’t smart; they would suck at the job. You want very smart people who are sufficiently aware of people smarter than them to be humble about it, and a forum with an average intelligence as high as this one’s seems perfect in that respect.
You seem to be missing the point of CAE_Jones’ comment there. The people you ask about your unconventional idea “rationalizing” why they’re not doing something that seems obvious to you is exactly what it feels like from the inside when your “obvious” idea is actually dumb and the people you’re asking have good reasons not to be doing it. What makes you so confident that that’s not what’s going on?
I have taken the survey.
I didn’t even read the post before I started mentally filling in all of the blanks with “my penis” or just “penis” as appropriate.
I want to add some kind of context to that, to avoid seeming like I’m just being puerile, but nothing really comes to mind besides the general “what is wrong with me?” kind of thing that’s just signalling and not really helpful for anything.
I don’t think “here’s my vision of eutopia, isn’t it controversial?” is a type of post we should have here. Even if we really should be discussing the possibilities it considers, this seems a particularly bad way of bringing up the ideas—it gives a particular answer to the eutopia question, instead of exploring the relevant aspects of the question, so isn’t going to promote useful discussion so much as attempts to smack down the idea.
I think the problem here is that with many trivia questions you either know the answer or you don’t. The dominant factor in my results so far is that when I have no answer in mind I assign 0 probability to being right, and am correctly calibrated there, while all of my answers at other levels of certainty have turned out right so far, so my calibration curve looks almost rectangular.
I might just be getting accurate information that I’m drastically underconfident, but I think this might be one of the worse types of question to calibrate on. Even if the problem is just that I’m drastically underconfident on trivia questions and shouldn’t assign less than 50% probability to any answer I actually have in mind, that seems sufficiently unrepresentative of most areas where you need calibration, and of how most people perform on other calibration tests, that this is a pretty bad measure of calibration.
Perhaps it would be better as a multiple-choice test, so that possible answers, which may or may not be right, are raised to attention, and one can assign probabilities to those?
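For concreteness, here’s a minimal sketch of the kind of calibration curve I mean; the function and the example numbers are mine, purely for illustration:

```python
from collections import defaultdict

def calibration_curve(results):
    """results: list of (stated confidence, answer was correct) pairs."""
    bins = defaultdict(list)
    for confidence, correct in results:
        bins[round(confidence, 1)].append(correct)  # bin to the nearest 10%
    # For each confidence level, the fraction of answers that turned out right.
    return {conf: sum(hits) / len(hits) for conf, hits in sorted(bins.items())}

# The "rectangular" pattern I'm describing: the answers I give 0% to are all
# wrong, and everything else turns out right regardless of stated confidence.
results = [(0.0, False)] * 10 + [(0.3, True), (0.5, True), (0.7, True), (0.9, True)]
print(calibration_curve(results))
# {0.0: 0.0, 0.3: 1.0, 0.5: 1.0, 0.7: 1.0, 0.9: 1.0}
```

A well-calibrated curve would instead rise smoothly, with about 30% of the 0.3-confidence answers right, 50% of the 0.5-confidence ones, and so on.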
The standard pledge for people in the rationalist sphere trying to make the world a better place is 10% of income to efficient charities, which, if you’re making the typical kind of money for this site’s demographics, is closer to “token” than to “difficult and thankless task”, even if it’s loads more than most people do.
My own response was to notice how little guilt I felt for not living up to moral obligations, decide I was evil, and functionally become an egoist while still thinking of utilitarianism as “the true morality”.
Possibly donating money is easier when it’s funging against luxuries than when it’s funging against early retirement, and it’s hard for people who don’t plan on retiring early to read and follow frugality advice that’s framed in terms of how much better financial independence is than whichever luxury?
All beliefs are probability estimates, although it can be hard to trace how a particular belief got to its current degree of confidence. While it might be a nice norm to have in a perfect world, I think it’s unreasonable to demand that every time someone expresses how confident or unconfident they are in a belief, they also clarify the entire precise history of that belief’s presence in their mind.
I sometimes come across an interesting scientific paper where the study seems easy and/or low-budget enough to make me think “hey, I could do that”. On this occasion it was a paper on theanine levels in tea; I was reading it for practical reasons (having just read about modafinil amplifying the side-effects of caffeine, while beginning an all-nighter powered by those chemicals), and skimmed it too quickly the first time to notice that they measured the theanine with big, proper, and presumably expensive lab equipment. To me there’s a strong “coolness factor” to being someone who’s published real research, especially if that also means a finite Erdős number. How easy or difficult is it to become an author or co-author of a scientific paper as an amateur, given that you’re trying to actually accomplish something and not munchkin for “get my name published as easily as possible”?
Unrelatedly, I’m pretty sure posting under the influence of caffeine and modafinil is a terrible idea for me. I just spent two hours writing and re-writing that question, and I’m only stopping now because I’m giving up on trying to get it right. That’s only exacerbating a tendency I already have, but damn.
The idea of being 99% confident of the correct phone number for a distant acquaintance, without actually checking on Facebook or something to confirm, boggles my mind.
Seems heavy on sneering at people worried about AI, light on rational argument. It’s almost like a RationalWiki article.
I think the entire core of his argument is a sleight-of-hand between “improbable” and “the kind of absurd improbability involved in Pascal’s wager”, without even (as others have pointed out) giving any arguments for why it’s improbable in the first place.
Sabotage of a big company’s IT systems, or of an IT company that maintains those systems, to force people to use paperclip-needing physical documents while the systems are down. The paperclips can get a mention, but as what seems to the players like mere fluff describing how the attack (by a rival company, terrorists, or whatever) has disrupted things.
We shouldn’t select our fitness gurus for whether they’re of our tribe, we should select our fitness gurus for the effectiveness and truth of what they teach.
On that basis, do you have any reasons beyond “it’s nerdy!” for recommending this website over any number of other ones, many of which are very good? If it’s the gimmicky motivational approaches, I think LessWrong has that down pat—loads of us play HabitRPG and I’m pretty sure Beeminder’s founders were some of our own.
Edit: For some reason my links ate themselves and the text between them so I took them out.
I think this could be better put as “what do you believe, that most others don’t?”—being wrong is, from the inside, indistinguishable from being right, and a rationalist should know this. I think there have actually been several threads about beliefs that most of LW would disagree with.
Funds charge management fees, typically a small percentage of the money you have invested with them, and you want the fund that charges the lowest such fees. The early retirement blogs I read seem to agree on Vanguard being the best choice, at least in the US.
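To illustrate how much a seemingly small fee difference compounds over a long horizon (the 7% gross return and the fee levels here are made-up round numbers, not figures from any particular fund):

```python
def final_balance(initial, annual_return, annual_fee, years):
    balance = initial
    for _ in range(years):
        balance *= 1 + annual_return - annual_fee  # the fee eats part of each year's growth
    return balance

# e.g. a 0.05% index fund vs a 1% actively managed fund, 30 years at 7% gross:
for fee in (0.0005, 0.01):
    print(f"fee {fee:.2%}: ${final_balance(10_000, 0.07, fee, 30):,.0f}")
# roughly $75,000 with the low fee vs roughly $57,000 with the high one
```

A fee difference of under one percentage point costs nearly a quarter of the final balance, which is why the fee dominates fund choice.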
One person being horribly tortured for eternity is equivalent to that one person being copied infinitely many times and having each copy tortured for the rest of their life. Death is better than a lifetime of horrible torture, and 3^^^3, despite being bigger than a whole lot of numbers, is still smaller than infinity.
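For reference, here’s the up-arrow notation that “3^^^3” abbreviates (Knuth’s notation; the first two levels are exact, and the point is that even the third level, however enormous, is finite):

```latex
% Knuth's up-arrow notation; each arrow level iterates the one below it.
\begin{align*}
3 \uparrow 3 &= 3^3 = 27 \\
3 \uparrow\uparrow 3 &= 3^{3^3} = 3^{27} = 7{,}625{,}597{,}484{,}987 \\
3 \uparrow\uparrow\uparrow 3 &= 3 \uparrow\uparrow \left(3 \uparrow\uparrow 3\right)
  \quad \text{(a power tower of 3s about 7.6 trillion levels high)} \\
3 \uparrow\uparrow\uparrow 3 &< \infty
  \quad \text{(enormous, but still finite)}
\end{align*}
```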
I’m doing the survey while I should be in a lecture, and I just reached the akrasia questions.