I’d suggest not using palm reading as an example, since palm lines really do not affect the future (unless you believe they do).
Likewise, the example of a machine learning system that detected sunlight patterns rather than tanks is apparently apocryphal.
The Solomon and Bathsheba problem looks more like a discussion of an Oedipal complex at first glance. What a mess!
Eliezer had the good sense to transform the Smoking Lesion into a Chewing Gum Lesion, since smoking does in fact cause cancer. But chewing gum doesn’t cause lesions. Alex Altair’s example of toxoplasmosis was at least plausible.
In summary, examples should be realistic. I can deal with unrealistic thought experiments like dropping people through space in elevators and catching balls tossed from 0.99c trains—but decision theory is already full of counterfactuals, so let's not add any more confusion.