There is an enormous behavioral/performance component to being hot that men seem to be oblivious to despite being highly sensitive to it. And this component has fairly large variance, similar to IQ (it seems clearly quite IQ-loaded). I have no ability to say anything further that would convince anyone of this, but I'm very close with a very successful stripper.
In the Solana world there's been a very recent (past few weeks) influx of major bug bounties claimed / exploits exploited. It's a strong consensus among multiple auditor friends of mine that this represents AI model auditing having recently reached a certain power level.
Hmmm, I don’t think I said that humans are not affected by instrumental convergence or that my emulated superbrain would certainly not be dangerous. I just think if you made a random human 1000x more intelligent it’s pretty unlikely they would immediately extinguish the rest of the human race, and the emulated superbrain seems like it belongs in a similar reference class.
I am pretty confident in my memory because I was a bit surprised to read it myself, so it was salient. It was a tweet.
Twitter’s search engine ignores the “%” symbol in search queries, and I can’t DM him on Twitter without paying. Paging @Eliezer Yudkowsky.
Maybe there is a better way to search the corpus of his tweets.
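For what it’s worth, if anyone has a local archive of his tweets, a literal substring search sidesteps Twitter ignoring “%” entirely. A minimal sketch, assuming a JSON-lines archive with a `text` field (the filename and field name here are my assumptions, not any real export format):

```python
# Minimal sketch: literal substring search over a local tweet archive.
# "yudkowsky_tweets.jsonl" and the "text" field are assumed, hypothetical names.
import json

def search_tweets(path: str, needle: str) -> list[str]:
    """Return the text of every archived tweet containing `needle` literally."""
    hits = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            tweet = json.loads(line)
            if needle in tweet.get("text", ""):
                hits.append(tweet["text"])
    return hits

# A plain substring match has no trouble with "%" the way Twitter search does.
for text in search_tweets("yudkowsky_tweets.jsonl", "%"):
    print(text)
```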
Via EA I was personally acquainted with Avraham Eisenberg, who has since been convicted of fraud. I work in crypto, and it would be genuinely difficult for me to find a professional contact who isn’t familiar with what he did; it was that high profile.
I’m not sure if you would or should consider “ampdot” prominent; as far as I understand, they have received EA funding for AI research, but I could definitely be wrong about that. They did a lot of memecoin promotion, as did their social circle. I’m not sure if they consider themselves an EA or just a rationalist.
In total I know of roughly six rationalists/EAs who ran memecoin pump-and-dumps, but I would not call them prominent.
I don’t think sapph is trying to use these examples as persuasive evidence.
Not sure what you’re on, but “You might listen to an idiot doctor that puts you on spiro” is definitely a real transition downside.
Floodlights seem best?
However, Annie has not yet provided what I would consider direct / indisputable proof that her claims are true. Thus, rationally, I must consider Sam Altman innocent.
This is an interesting view on rationality that I hadn’t considered
Omen decouples the market from resolution, but it has prohibitive gas problems and sees no usage as a result.
Augur was a total failboat. Almost all of these projects couple the market protocol to the resolution protocol, which is stupid, especially if you are Augur and your ideas about making resolution protocols are really dumb.
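To make the design point concrete, here’s a minimal sketch of what decoupling means: the market depends only on a resolution interface, not on any particular resolution mechanism. Every name below is hypothetical; this is not any real project’s API.

```python
# Sketch of a market protocol decoupled from its resolution protocol.
# All names here are illustrative, not any real protocol's interface.
from abc import ABC, abstractmethod
from typing import Optional

class ResolutionSource(ABC):
    """Anything that can eventually answer a yes/no question."""
    @abstractmethod
    def outcome(self, question_id: str) -> Optional[bool]:
        """Return True/False once resolved, or None while still pending."""

class BinaryMarket:
    """The market only knows the interface, so a bad resolution mechanism
    can be swapped out without touching the market logic at all."""
    def __init__(self, question_id: str, source: ResolutionSource):
        self.question_id = question_id
        self.source = source

    def settle(self) -> Optional[bool]:
        return self.source.outcome(self.question_id)

class TrustedReporter(ResolutionSource):
    """One possible resolution mechanism among many that could plug in."""
    def __init__(self, answers: dict[str, bool]):
        self.answers = answers
    def outcome(self, question_id: str) -> Optional[bool]:
        return self.answers.get(question_id)

market = BinaryMarket("btc-above-100k", TrustedReporter({"btc-above-100k": True}))
print(market.settle())  # True
```

The point being: if Augur had built its markets against an interface like this, its (dumb) resolution ideas could have been replaced without dragging the whole market protocol down with them.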
Your understanding is correct. I built one which is currently offline, I’ll be in touch soon.
I found the stuff about relationship success in Luke’s first post here to be useful! thanks
Ok, this kind of tag is exactly what I was asking about. I’ll have a look at these posts.
Thanks for giving an example of a narrow project, I think it helps a lot. I have been around EA for several years, and I find that grandiose projects and narratives at this point alienate me; hearing about projects like yours makes my ears perk up and makes me feel like maybe I should devote more time and attention to the space.
I guess it’s good to know it’s possible to be both a LW-style rationalist and quite mentally ill.
Not commenting on distributions here, but it sure as fuck is possible.
I liked the analogy and I also like weird bugs
While normal from a normal perspective, this post is strange from a rationalist perspective, since the lesson you describe is “X is bad,” but the evidence given is that you had a good experience with X, aside from mundane interpersonal drama that everyone experiences and that doesn’t sound particularly exacerbated by X. Aside from that, you say it contributed to psychosis years down the line, but it’s not very clear to me that there is a strong causal relationship, or any at all.
(of course, your friend’s bad experience with cults is a good reason to update against cults being safe to participate in)
I am not really a cult advocate. But it is okay (and certainly Bayesian) to just have a good personal experience with something and conclude that it can be safer or nicer than people generally think. Just because you’re crazy doesn’t mean everything you did was bad.
Edit: This is still on my mind, so I will write some more. I feel like the attitude in your post, especially your addendum, is that it’s fundamentally, obviously wrong to feel like your experience was okay, or an okay thing to do. And that the fact you feel/felt okay about it is strong evidence that you need to master rationality more, in order to be actually okay. And that once you do master rationality, you will no longer feel it was okay.
But “some bad things happened and also some good things, I guess it was sort of okay” is in fact a reasonable way to feel. It does sound like some bad things happened, some good things, and that it was just sort of okay (if not better). There is outside view evidence about cults being bad. Far be it from me to say that you should not avoid cults. We should certainly incorporate the outside view into our choices. But successfully squashing your inside view because it contradicts the outside view is not really an exercise in rationality, and is often the direct opposite. Also, it makes me sad.
how are you personally preparing for this?
Okay, Grok seems to have done a good job looking through EY’s tweets. This tweet thread gives your 50% number, so perhaps you’re right: https://x.com/ESYudkowsky/status/1070095112791715846