Yes—Lewis held this, for instance, in the most famous paper on the topic.
omnizoid
Lots of people disagree with 2.
On Leif Wenar’s Absurdly Unconvincing Critique Of Effective Altruism
I didn’t make a betting argument.
The Closed Eyes Argument For Thirding
Impervious to reason? I sent you an 8,000-word essay giving reasons for it!
Just to be clear, I banned you because I find your comments consistently annoying. You are, in fact, the first commenter I’ve ever banned.
As for the question, they look at the various neural correlates of suffering on different theories, split their credence across them, and divvy up the results based on expected consciousness. The report is more detailed.
Why The Insects Scream
Conspiracy Theorists Aren’t Ignorant. They’re Bad At Epistemology.
It may be imaginable, but if it’s false, who cares? Like, suppose I argue that fundamental reality has to meet constraint X and that view Y is the only plausible view that does so. Listing off a bunch of random views that meet constraint X but are false doesn’t help you.
Well, UDASSA is false (https://joecarlsmith.com/2021/11/28/anthropics-and-the-universal-distribution). As I argue elsewhere, any view other than SIA implies the doomsday argument. The number of possible beings isn’t equal to the number of “physically limited beings in our universe,” and there are different arrangements of the continuum points.
The argument for Beth 2 possible people is that it’s the powerset of continuum points. SIA gives reason to think you should assign a uniform prior across possible people. There could be a God-less universe with Beth 2 people, but I don’t know how that would work, and even if there’s some coherent model one can make work without sacrificing simplicity, P(Beth 2 people | theism) >>>>>> P(Beth 2 people | atheism). You need to fill in the details beyond just saying “there are Beth 2 people,” which will cost simplicity.
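For reference, the cardinal arithmetic behind the Beth 2 claim, using the standard definition of the beth numbers:

```latex
% Beth numbers: \beth_0 = \aleph_0, \quad \beth_{n+1} = 2^{\beth_n}.
\beth_1 = 2^{\aleph_0} \quad \text{(the cardinality of the continuum, e.g. of spacetime points)}
\beth_2 = 2^{\beth_1} = 2^{2^{\aleph_0}} \quad \text{(the cardinality of the powerset of the continuum points)}
```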
Remember, this is just part of a lengthy cumulative case.
If theism is true then all possible people exist but they’re not all here. SIA gives you a reason to think many exist but says nothing about where they’d be. Theism predicts a vast multiverse.
The cases are non-symmetrical because a big universe makes my existence more likely, but it doesn’t make me more likely to get HTTTTTTTHTTHHTTTHTTTHTHTHTTHHTTTTTTHHHTHTTHTTTHHTTTTHTHTHHHHHTTTTHTHHHHTHHHHHHHTTTTHHTHHHTHTTTTTHTTTHTTHHHTHHHTHHTHTHTHTHTHHTHTHTTHTHHTTHTHTTHHHHHTTTTTTHHTHTTTTTHHTHHTTHTTHHTTTHTTHTHTTHHHTTHHHTHTTHHTTHTTTHTHHHTHHTHHHHTHHTHHHTHHHHTTHTTHTHHTHTTHTHHTTHHTTHHTH. The most specific version of the evidence is that I get that particular sequence of coin flips, which is unaffected by the number of people, rather than that someone does. My view follows trivially from the widely adopted SIA, which I argued for in the piece; it doesn’t rely on some basic math error.
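A minimal sketch of the asymmetry, with toy numbers (an 8-flip sequence rather than the 256-flip one above, and hypothetical population sizes): the likelihood of a designated observer getting a particular sequence is independent of how many observers exist, while the likelihood that someone gets it grows with the population.

```python
# Toy illustration: specific evidence ("I get sequence s") vs.
# de dicto evidence ("someone gets sequence s").
k = 8                    # sequence length (toy stand-in for the 256-flip sequence)
p_seq = 0.5 ** k         # P(a designated observer flips exactly s) -- no n anywhere

def p_someone(n):
    # P(at least one of n independent observers flips s)
    return 1 - (1 - p_seq) ** n

# The specific evidence has the same likelihood in a small or big universe...
print(p_seq)
# ...but "someone gets it" becomes more likely as the population grows.
print(p_someone(1) < p_someone(1000))
```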
Theism Isn’t So Crazy
SIA Is Just Being a Bayesian About the Fact That One Exists
I didn’t attack his character, I said he was wrong about lots of things.
//If you add to the physical laws code that says “behave like with Casper”, you have re-implemented Casper with one additional layer of indirection. It is then not fair to say this other world does not contain Casper in an equivalent way.//
No, you haven’t reimplemented Casper, you’ve just copied his physical effects. There is no Casper, and Casper’s consciousness doesn’t exist.
Your description of the FDT stuff isn’t what I argued.
//I’ve just skimmed this part, but it seems to me that you provide arguments and evidence about consciousness as wakefulness or similar, while Yudkowsky is talking about the more restricted and elusive concept of self-awareness. //
Both Yudkowsky and I are talking about having experiences, as he’s been explicit about in various places.
//Your situation is symmetric: if you find yourself repeatedly being very confident about someone not knowing what they are saying, while this person is a highly regarded intellectual, maybe you are overconfident and wrong! I consider this a difficult dilemma to be in. Yudkowsky wrote a book about this problem, Inadequate Equilibria, so it’s one step ahead of you on the meta.//
I don’t talk about the huge range of topics Yudkowsky does. I don’t have super confident views on any topic that is controversial among the experts, but the things Yudkowsky gets wrong aren’t controversial among the experts; they mostly just rest on basic errors.
If you’re a halfer and don’t think your credence in heads should be 2⁄3 after finding out it’s Monday, you violate the conservation of evidence. If you’re going to be told what day it is, your credence in tails can go up but has no chance of going down: if it’s day 2, it spikes to 100%; if it’s day 1, it won’t change.
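The bookkeeping can be checked directly. This is a sketch assuming the Lewis-style halfer split of credence across awakenings (1⁄2 heads-and-Monday, 1⁄4 tails-and-Monday, 1⁄4 tails-and-Tuesday): the 2⁄3 updater conserves expected evidence, while a “double-halfer” who stays at 1⁄2 on learning it’s Monday doesn’t.

```python
from fractions import Fraction as F

# Lewis-style halfer credences at an awakening (assumed for illustration):
p_heads_mon = F(1, 2)   # heads, Monday awakening
p_tails_mon = F(1, 4)   # tails, Monday awakening
p_tails_tue = F(1, 4)   # tails, Tuesday awakening

p_monday = p_heads_mon + p_tails_mon            # 3/4
post_heads_given_mon = p_heads_mon / p_monday   # 2/3: the update defended above
post_heads_given_tue = F(0)                     # Tuesday rules out heads

# Conservation of evidence: expected posterior must equal the prior (1/2).
lewis_expected = (p_monday * post_heads_given_mon
                  + p_tails_tue * post_heads_given_tue)
print(lewis_expected)   # equals the 1/2 prior, so evidence is conserved

# A double-halfer who keeps credence 1/2 in heads on learning it's Monday:
dh_expected = p_monday * F(1, 2) + p_tails_tue * F(0)
print(dh_expected)      # not 1/2, so conservation of evidence is violated
```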