It might be worthwhile to distinguish Capital-R Rationalists from people wearing rationalism as attire.
My lived experience is that your negative observations do not hold for people who have read The Sequences.
To avoid the “No True Scotsman” fallacy: could you provide an example of a person who claims to have read and internalized The Sequences and who subscribes to any of the following claims/characteristics?
- Believes that RLHF and similar “alignment” techniques provide Alignment as Eliezer would use the word
- Is strongly confident that current and near-future LLMs do not have any capacity to suffer
- Believes the “AI Safety” work done to prevent LLMs from expressing certain phrases is actually Real AI Safety/AINotKillEveryoneism
- Strongly disagrees with your “Sane Response”
- Believes the current US administration’s attitude towards China with regards to AI Safety is strongly better than your suggestion
- Did shady or outright illegal things, like releasing blatant pump-and-dump coins and promoting them on their personal Twitter
- Did not care about FTX
- Works at a Frontier Lab, primarily motivated by money (excluding earning-to-give)
No True Rationalist(tm) would hold any of these beliefs, and I predict you will find it hard to find a single example of someone matching the above description. I’m willing to pay $100 for the first example, then $50, $25,... for subsequent ones.
This seems trivial. Ctrl+F “the Sequences” here
Thanks. SBF and Caroline would probably be examples of Bad Rationalists, though the link is mostly Caroline saying The Sequences didn’t update her much.
idk if I’m allowed to take the money if I’m not the OP, but it really doesn’t seem hard to find other examples of people who read and internalized the Sequences and went on to do at least one of the things you mentioned: the Zizians, Cole Killian, etc. I think I know the person OP meant when talking about “releasing blatant pump-and-dump coins and promoting them on their personal Twitter”; I won’t mention her name publicly. I’m sure you can find people who read the Sequences and endorse alignment optimism or China hawkism as well (certainly you can find highly upvoted arguments for alignment optimism here or on the Alignment Forum).
You’re allowed! Please PM me with how you’d prefer to receive $200.
I’m confused about this subject. I grant that SBF, Caroline, and the Zizians are examples of Bad Rationalists (not sure which Bullet Point Cole Killian falls under), and I trust you when you say that there’s at least one more.
If one lowers the bar for Rationalist to “has read The Sequences, with no requirement for endorsing/internalizing them”, then probably Sam Altman, Dario Amodei, Leopold A, and others fit the criteria. However, these are people who are widely denounced in the Rationalist Community; our community seems to have a consensus around the negation of the Bullet Points.
sapphire is (IMHO) not saying goodbye to the Rationalist Community because we endorse SBF or Sam Altman. sapphire is (IMHO) not ceasing to post to LessWrong because Leopold and the Zizians are writing replies to the posts. Something else is going on, and I’m confused.
Technically I guess there is no consensus against alignment optimism (which is fine by itself).