I really don’t think we should talk about or link to that article anymore. Patching it won’t change the fact that we’ve anchored the discussion to dust specks. If your goal is to persuade more people to shut up and multiply (and thereby do a little to decrease the actual amount of expected torture and suffering in the real world), surely there are ways to do so that aren’t quite so saturated with mind-kill. Just write a brand new post, modeled after Eliezer’s (‘Cancer or hiccups?’, say), that avoids...
… strange and poorly understood examples. (I’m still not sure I get what this tiny ‘dust speck’ experience is supposed to feel like.)
… examples that emphasize agency (and therefore encourage deontological responses), as opposed to worldly misfortunes.
… examples that are not just emotionally volatile horrors, but are actively debated hot-button political issues.
… numbers that are difficult not just for laypeople to intuitively grasp, but even to notationally unpack.
Such elements aren’t even useful for neutral illustration. This is among the most important rhetorical battles the LW and EA communities have ever addressed, and among the most challenging. Why go out of our way to handicap ourselves? Just use an unrelated example.
I disagree. I think Chris made the example clearer (using shampoo) and the argument more convincing for me (“you wouldn’t pay $(3^^^3/100)”).
I think that particular number makes the argument harder to understand, since I’m not sure what it would even mean to pay such an amount. Should I postulate that we have colonized the entire observable universe and I’m the supreme ruler of the universe, thereby allowing for the possibility of such a sum even existing, let alone me having the opportunity to pay it?
Not to beat a dead horse, but $(3^^^3/100) is way way way way way way way way way more money than you could spend if you merely colonized the universe with trillionaires the size of quarks.
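The arithmetic behind that claim can be sketched in a few lines. The figures below for the quark length scale and the radius of the observable universe are rough order-of-magnitude assumptions, not precise values, but the conclusion is insensitive to them:

```python
# Back-of-the-envelope sketch: why $(3^^^3/100) dwarfs any physically
# realizable fortune. Quark scale (~1e-18 m) and observable-universe
# radius (~4.4e26 m) are assumed order-of-magnitude figures.
import math

def tetrate(base, height):
    """Knuth's base^^height: a right-associated power tower of `height` copies of base."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

# 3^^3 = 3^(3^3) = 3^27
tower_height = tetrate(3, 3)  # 7,625,597,484,987

# 3^^^3 = 3^^(3^^3): a power tower of threes roughly 7.6 trillion levels
# high -- far too large to compute, or even to write in exponential notation.

# Upper bound on the quark-trillionaire fortune: pack the observable
# universe with quark-sized trillionaires.
cells = (4.4e26 / 1e-18) ** 3   # ~8.5e133 quark-sized slots
dollars = cells * 1e12          # ~8.5e145 dollars total

print(f"3^^3 = {tower_height}")
print(f"quark-trillionaire bound ~ 10^{math.log10(dollars):.0f} dollars")
# Even 3^^4 = 3^(3^^3) already has about 3.6 trillion digits, so
# 3^^^3/100 exceeds this bound by a margin no notation short of
# up-arrows can express.
```

So the whole-universe fortune tops out somewhere near 10^146 dollars, while 3^^^3 isn’t even in the same conversation.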
I thought of it like this: Well, I wouldn’t even pay one trillion dollars, so surely I wouldn’t pay $(3^^^3/100). Given that paying $(3^^^3/100) is the logical consequence of choosing “torture” in “torture vs. shampoo”, I clearly must not pick “torture”.
Yeah, I did arrive at something similar after a moment’s thought, but an example that the reader needs to explicitly transform into a different one (“okay, that number makes no sense, but the same logic applies to smaller numbers that do make sense”) isn’t a very good example.
I think I tend to do things like that automatically, so it wasn’t a problem for me. But I can see why it would be a problem for other people who think differently, so I agree with you.
What number would you recommend?
How do you disagree? I agree on both of those counts.
I’m suggesting ‘Shampoo in eyes vs. being slowly devoured alive by ants’ would be even more convincing to most people, especially if you used a dollar figure whose notation most people understand.
On second thought, I don’t disagree.
These are good points, though as a counterpoint to your original post, torture vs. specks is one of the things that come up on Google when you search for Eliezer’s name, so we may be stuck with it.
Going with the thought anyway… aside from getting people not to spend a million dollars to save one life (i.e. making sure their million dollars saves at least a few hundred lives), what other problem areas would be good to focus on for the sake of practical improvements in people’s ability to shut up and multiply? “Yes, it really is OK to accept an increased risk of terrorist attack for the sake of making air travel more convenient”?
Actually, given how crazy the US went after 9/11, I’m not sure that’s the best example. A little inconvenience in our air travel is a reasonable price to pay for avoiding another Iraq war. This doesn’t totally ruin the example, because there is some level of inconvenience we would not accept in order to avoid another Iraq, but that threshold is high enough that the example doesn’t work very well.
Hmmm… better examples?