there is an enormous behavioral/performance component to being hot that men seem to be oblivious to despite being highly sensitive to it. and this component has fairly large variance, similar to IQ (it seems clearly quite IQ-loaded). i have no ability to say anything further that would convince anyone of this, but i'm very close with a very successful stripper.
agrippa
in the solana world there's been a very recent (past few weeks) influx of major bug bounties claimed / exploits exploited. there's strong consensus among multiple auditor friends of mine that this reflects AI model auditing having recently reached a certain power level
Hmmm, I don’t think I said that humans are not affected by instrumental convergence or that my emulated superbrain would certainly not be dangerous. I just think if you made a random human 1000x more intelligent it’s pretty unlikely they would immediately extinguish the rest of the human race, and the emulated superbrain seems like it belongs in a similar reference class.
I am pretty confident in my memory because I was a bit surprised to read it myself so it was salient. It was a tweet.
Twitter’s search engine ignores the “%” symbol in search queries, and I can’t DM him on Twitter without paying. Paging @Eliezer Yudkowsky.
Maybe there is a better way to search the corpus of his tweets.
A minor point about instrumental convergence that I would like feedback on
I was personally acquainted via EA with Avraham Eisenberg, who has since been convicted of fraud. I work in crypto, and it would be genuinely difficult for me to find a professional contact who isn’t familiar with what he did; it was that high profile.
I’m not sure if you would or should consider “ampdot” prominent, as far as I understand they have received EA funding for AI research, but I could definitely be wrong about that. They did a lot of memecoin promotion, as did their social circle. I’m not sure if they consider themselves an EA or just a rationalist.
In total I know of roughly six rationalists/EAs who ran memecoin pump and dumps, but I would not call them prominent.
I don’t think sapph is trying to use these examples as persuasive evidence.
Not sure what you’re on about, but “You might listen to an idiot doctor that puts you on spiro” is definitely a real transition downside.
flood lights seem best?
However, Annie has not yet provided what I would consider direct / indisputable proof that her claims are true. Thus, rationally, I must consider Sam Altman innocent.
This is an interesting view on rationality that I hadn’t considered
Omen decouples but has prohibitive gas problems and sees no usage as a result.
Augur was a total failboat. Almost all of these projects couple the market protocol to the resolution protocol, which is stupid, especially if you are Augur and your ideas about making resolution protocols are really dumb.
Your understanding is correct. I built one which is currently offline, I’ll be in touch soon.
I found the stuff about relationship success in Luke’s first post here to be useful! thanks
Ok, this kind of tag is exactly what I was asking about. I’ll have a look at these posts.
[Question] Can LessWrong provide me with something I find obviously highly useful to my own practical life?
Thanks for giving an example of a narrow project, I think it helps a lot. I have been around EA for several years, I find that grandiose projects and narratives at this point alienate me, and hearing about projects like yours make my ears perk up and feel like maybe I should devote more time and attention to the space.
I guess it’s good to know it’s possible to be both a LW-style rationalist and quite mentally ill.
Not commenting on distributions here, but it sure as fuck is possible.
I liked the analogy and I also like weird bugs
Okay, Grok seems to have done a good job looking at EY’s tweets. The thread at https://x.com/ESYudkowsky/status/1070095112791715846 gives your 50% number, so perhaps you’re right.