Thank you, wonderful series!
How should we deal with cases where epistemic rationality contradicts instrumental rationality? For example, we may want to use the placebo effect because one of our values is that being healthy is better than being sick, and less pain is better than more pain. But the placebo effect depends on believing that the pill is a working medicine, which is false. Is there any way to satisfy both epistemic and instrumental rationality?
I agree with the point about a continuous ability to suffer rather than a threshold. I also agree that there is no objective answer; we can't measure suffering. The problem, however, is that this leaves a practical question with no clear solution: how should we treat other animals and our code?
I like the idea. Basically, you suggest taking the functional approach and advancing it. What do you think this type of process could be?
Sorry, I didn't get what you mean by "non-dominant political controllership". Can you rephrase it?
Of course, the placebo effect is useful from an evolutionary point of view, and it is the subject of quite a lot of research. (The main idea: it is energetically costly to keep your immune system always on high alert, so you boost it at particular moments correlated with pleasure, usually from eating/drinking/sex, which is when germs usually enter the body. If you're interested, I can find the link to the research paper where this is discussed.)

I am afraid I have still failed to explain what I mean. I am not trying to deduce from observation that we are in a simulation; I don't think that is possible (unless the simulators decide to allow it). I am trying to see how the belief that we are in a simulation with benevolent simulators can change my subjective experience. Notice that I can't simply trick myself into believing something just because the belief is healthy. This is why I needed all the theory above: to show that benevolent simulators are indeed highly likely. Then, and only then, can I hope for the placebo effect (or for a real intervention masquerading as a placebo effect), because now I believe it may work. If I could simply make myself believe whatever I needed, of course I would not need all these shenanigans. But after being a faithful LW reader for a while, that is really hard, if it is possible at all.