“Maybe the Effective Altruist movement should accept people like you because they’re a big tent and they’re friendly and welcoming, but the rationalist community should be elitist and only accept people who say tsuyoku naritai—there’s a reason this is on LessWrong and not the EA forum”
As the EA community has become less intense, I’ve sometimes wondered whether there would be value in someone starting an LW- or EA-adjacent group that sits on the more intense end of the spectrum.
I definitely see risks associated with this (people pushing themselves too hard, fanaticism), and I probably wouldn’t want to be part of it myself, but I imagine it could be a good fit for some people.
Motto: “Maximising utility isn’t everything, it’s the only thing!”
Sounds like evaporative cooling in reverse (although actually more in keeping with the literal meaning): the fieriest radicals boiling off to leave the more tepid behind.