Tenoke
‘Hasn’t it ever occurred to you that in your promiscuous pursuit of women you are merely trying to assuage your subconscious fears of sexual impotence?’
‘Yes, sir, it has.’
‘Then why do you do it?’
‘To assuage my fears of sexual impotence.’
Joseph Heller, Catch-22
explaining ≠ explaining away
Hmm, I did worse on those calibration questions than I would’ve expected.
“Goedel’s Law: as the length of any philosophical discussion increases, the probability of someone incorrectly quoting Goedel’s Incompleteness Theorem approaches 1”
--nshepperd on #lesswrong
That censorship because of what people think of LessWrong is ridiculous. That the negative effect on its reputation is probably significantly smaller than what is assumed. And that if EY thought censorship of content for the sake of LW’s image was in order, he should’ve logically concluded that omitting fetishes from his public OKCupid profile (for the record, I’ve defended the view that this is his right), among other things, was also in order. And some other thoughts of this kind.
Close enough
The current top post on /r/HPMOR is a proposal that using babies to make horcruxes is a net ethical positive. You’d do well on LessWrong.com, Ilverin.
-EY
I agree, but man does EY/MIRI need a better PR agent.
I have a bunch of comments on this:
I really liked the bit. Possibly because I’ve been low-key following his efforts.
He looks quite good, and I like the beard on him.
..
I’ve always thought that his failed attempts at researching weight loss and applying what he learned were a counterexample to how applicable LW/EY rationality is. Glad to see he solved it when it became more important.
Eliezer clearly gets too much flak in general, and especially in this case. It’s not like I haven’t criticised him myself, but come on.
-
several people’s reaction was, “Why is this guy talking to me like I’m his friend, I don’t even know him”
Really? Fine, you don’t know him, but if you don’t know EY and are at a rationalist event, why would you be surprised by not knowing a speaker? From the audience’s reaction to his opening, it should’ve been clear that most people did know him.
I’m not against the concept of triggering (some things, including eating disorders, can be triggering), but like this? Can a person not talk at all about weight gain/loss? Is the Solstice at all LW-related if things can’t be discussed even at a fairly basic (and socially accepted) level? Please, if you hated it, give a detailed response as to why. I’m genuinely curious.
“Man is not going to wait passively for millions of years before evolution offers him a better brain.”
--Corneliu E. Giurgea, the chemist who synthesized Piracetam and coined the term ‘Nootropic’
Am I the only one who finds it funny that a post bashing identity is written by a person with such an identity-signaling nickname?
Anyway, it is a good first post, even if here and there it makes stronger claims than seem reasonable.
jimrandomh’s comment, linked in the OP, is the current best explanation of the epistemic concerns.
Excluding the personal stuff, this comment is just a somewhat standard LW critique of a LW post (which has less karma than the original post, fwiw). If this is the criterion for an ‘epistemic concerns’ ban, then you must’ve banned hundreds of people. If you haven’t, you are clearly banning him for the other reasons, and I don’t know why you insist on being dishonest about it.
Linking to MIRI’s donation page might be useful, but please, please don’t link to LessWrong on 4chan—it could have some horrible consequences.
While it is true that we (techies, rationalists, etc.) have the opportunity to catch a gold rush by becoming early adopters, I suspect survivorship bias is at play. There are plenty of people who try to systematically ‘grind’ on such opportunities, but it doesn’t pan out for many of them—I know people who used to mass-register domains and made a negligible profit in the end, people who jump on all sorts of altcoins, programmers who join a promising startup where they sacrifice salary for equity, etc. Additionally, I also started messing with bitcoins in 2011, and while it has been quite profitable, I have made less than six figures, since I wasn’t very serious about it at the time. And yes, in retrospect I can say that I should’ve put more money in (and kept it in bitcoin), but if I followed the same line of reasoning with all the seemingly promising things I see, I might very well go broke.
I wish I were already an experienced gold rush spotter, so I could explain how best to do it, but as indicated above, I participated in the ones that I did more or less by luck.
Indeed, luck seems to be a big part of it, and the main action you can take to facilitate the process is probably to put yourself in the right circles, so you can hear about and look into innovations early on. This, however, is something that you and many people on here already do, and I doubt that you can easily find another intervention with as big an impact on your chances of participating in a gold rush.
Prediction: Since you are making this significantly harder than it needs to be, most people will give up before they even do Everyman for a week. And nobody will sustain Uberman, since you aren’t even committing to trying to keep it up.
Unrelated: Bay Area rationalists seem to rely a lot on arguments from authority over everything else (I don’t have any data, but I see comments like the one about Matt relatively often). I wonder if seeing awesome people who are right a lot of the time might come with a cost.
I used to believe that bitcoin was underpriced, but there are so many agents involved in it now (including Wall Street) that I can’t really convince myself that I know better than them—the market is too efficient for me.
Additionally, I’d be especially wary of buying based on arguments about the future price that rest on such obvious metrics, which many agents already pay attention to.
Some #lesswrong regulars who are currently learning to code have made a channel for that purpose on freenode - #lw-prog
Anyone who is looking for a place to learn some programming alongside fellow lesswrongers is welcome to join.
I know you do follow-ups with most/all CFAR attendees. Do you have any aggregate data from the questionnaires? How much do attendees improve on the outcomes you measure, and on which ones?
After a short discussion on IRC regarding basilisks, I declared that anyone who has a basilisk they consider dangerous or potentially real in any way should please private message me about it. I am extending this invitation here as well. Furthermore, I will be completely at fault for any harm caused to me by learning about it. Please don’t let my potential harm discourage you.
“The future is always ideal: The fridge is stocked, the weather clear, the train runs on schedule and meetings end on time. Today, well, stuff happens.”
Hara Estroff Marano on procrastination in Psychology Today as cited here
Some of the ‘descriptions of LessWrong’ can make for a great quote on the back of Yudkowsky’s book.
I read this post where you keep claiming you are banning him for ‘epistemic concerns’, but then you link to zero examples and mostly talk about some unrelated real-life thing, which you also give no real explanation for.
The comments here mention a sex crime, but the OP doesn’t. If that’s what happened, why vaguebook, stay silent for a year, and lie that the ban is for ‘epistemic concerns’? Who else have you banned for ‘epistemic concerns’ - nobody?
Honestly, after reading everything here, I do have major concerns about ialdabaoth’s character, but the main epistemic concerns I have are about the OP presenting this dishonestly after a year of silence.