Yes, for sure – it’s almost identical! Thanks for sharing.
108k views means that many more people have heard of this thesis than I previously thought.
The ones who walk away are the ones who recognize all of this and are no longer willing to participate in the collective illusion (hence, alone).
I like this reading, I hadn’t thought about it before!
Thanks for pointing out that this is a big omission! I’ve added a bit about it.
I don’t think ‘you’re not able to accept a pure utopia’ is the only theme of the story, but it is a large and (to me) dominant one.
If I read the ending in isolation, it does feel like a critique of utilitarianism. But since the story introduces the suffering child, the ‘utilitarian downside’ of the calculus, as a clear farce, I don’t find that a plausible reading overall.
Taking the ending seriously is a bit as if you took the following argument against utilitarianism seriously: “Imagine there’s a child drowning in a shallow pond. You’re wearing a swimsuit and could easily save them. Don’t believe me? Okay, let me make it more believable: imagine there’s also a cute puppy guarding the pond that you’d have to kill to reach the child. Would you do it?”

They leave Omelas, they walk ahead into the darkness, and they do not come back. The place they go towards is a place even less imaginable to most of us than the city of happiness. I cannot describe it at all. It is possible that it does not exist. But they seem to know where they are going, the ones who walk away from Omelas.
She mentions that they know where they are going, but she doesn’t mention why they are going. It could be because they’d personally be unhappy in such a place. It could be because they think there’s an even better possible place. It could be because they reject the utilitarian calculus. I’m genuinely confused what the ending is about.
I agree that the ending doesn’t fit either. I’ve mentioned something similar here. I’m genuinely confused what the ending is about, and have mainly settled on ‘the story would be really bleak and unenjoyable without it’.
I’m not sure how I missed this one. Thanks!
Omelas Is Perfectly Misread
Looking forward to ‘Sustrik on River Swimming’. If you need info on Bern (arguably the best city for it), feel free to reach out.
What types of policy/governance research is most valuable for control? Are there specific topics you wish more people were working on?
It seems as if you think of most people as cats. Does this mean that your AI safety work is largely motivated by ‘animal welfare’-like concerns, or do you mainly do it for the rest of the people who you don’t think of as cats?
I do think the book is just very high-quality (I read a preview copy) and I would obviously curate it if it was a post, independently of its object-level conclusions.
Would you similarly promote a very high-quality book arguing against AI xrisk by a valued LessWrong member (let’s say titotal)?
I’m fine with the LessWrong team not being neutral about AI xrisk. But I do suspect that this promotion could discourage AI risk sceptics from joining the platform.
What could be current examples of very smart people being deluded? Jhanas?
I assumed that there were a large number of unknown cases and that the unknown cases, on average, had less severe consequences. But I haven’t read the paper deeply enough to really know this.
Quite an interesting paper you linked:
Conventional wisdom during World War II among German soldiers, members of the SS and SD as well as police personnel, held that any order given by a superior officer must be obeyed under any circumstances. Failure to carry out such an order would result in a threat to life and limb or possibly serious danger to loved ones. Many students of Nazi history have this same view, even to this day.

Could a German refuse to participate in the round up and murder of Jews, gypsies, suspected partisans, “commissars” and Soviet POWs—unarmed groups of men, women, and children—and survive without getting himself shot or put into a concentration camp or placing his loved ones in jeopardy?

We may never learn the full answer to this, the ultimate question for all those placed in such a quandary, because we lack adequate documentation in many cases to determine the full circumstances and consequences of such a hazardous risk. There are, however, over 100 cases of individuals whose moral scruples were weighed in the balance and not found wanting. These individuals made the choice to refuse participation in the shooting of unarmed civilians or POWs and none of them paid the ultimate penalty, death! Furthermore, very few suffered any other serious consequence!

Table of the consequences they faced:
Yeah, I got it 3 times but it’s not showing up. EA man...
2025 Q3 Pivotal Research Fellowship: Applications Open
(Not sure if I got the maths right here.)
Manifold gives two interesting probabilities:

Using the simplifying assumption that until 2050 dramatic longevity gains happen only if ASI ‘solves ageing’, we have:
I couldn’t find a good number, but let’s assume Manifold also thinks there’s a 25% chance of doom (everyone is dead) by 2050, given ASI. This leaves:
Multiplying by the overall chance of ASI (65%), the simplified unconditional outcomes are:
ASI with large longevity gains by 2050:
ASI with doom by 2050:
ASI but no large increase in life expectancy by 2050:
No ASI by 2031:
This would imply that Manifold believes there is a 32% chance that ASI arrives by 2031 but that by 2050 (19 years later) humanity survives and US life expectancy still hasn’t reached 100+ years.
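As a rough consistency check on the arithmetic (the 65% and 25% figures are taken from above; the second Manifold probability, about life expectancy, isn’t reproduced in this comment, so the ~17% used here is inferred from the 32% result rather than quoted from Manifold):

$$P(\text{ASI and doom}) \approx 0.65 \times 0.25 \approx 0.16$$

$$P(\text{no ASI by 2031}) = 1 - 0.65 = 0.35$$

$$P(\text{ASI, no doom, no 100+ life expectancy}) \approx 0.65 - 0.16 - 0.17 \approx 0.32$$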
2025 Q1 Pivotal Research Fellowship (Technical & Policy)
For sure, but that leads to much more individualised advice of the form “If you’re fine being exposed to the sun for up to 2h with SPF 50, you shouldn’t expose yourself for much more than 1h with SPF 30”. The quoted section makes it seem like “You’re fine as long as you wear SPF 50+ sunscreen, but SPF 45 just won’t cut it.”, which doesn’t generalise for most individuals and their levels of sunlight exposure.
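For reference, the rough scaling behind that first sentence (assuming, as SPF ratings are defined, that SPF multiplies the time to reach a given UV dose roughly linearly):

$$t_{\text{SPF 30}} \approx t_{\text{SPF 50}} \times \frac{30}{50} = 2\,\text{h} \times 0.6 = 1.2\,\text{h}$$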
I’ve watched Big Joel videos before, which makes me wonder if I have seen this one but forgotten about it. I expressed the same basic idea before this video was released, but the structure I used here is very similar.