If one values winning above everything else, then everything that leads to winning is rational. The reductio of this is: if torturing a googolplex of beings for maximum duration and at increasing intensity leads to winning, then that is what must be done.
Yet… perhaps winning, then, is not what we should most value? Perhaps we should value destroying the thing which values torturing a googolplex of beings. What if we need to torture half a googolplex of beings to outcompete something willing to torture a googolplex of beings? What if outcompeting such a thing is impossible? What is the threshold for the total number of beings tortured? Such a question must by definition seem irrational to someone winning at all costs; this is the tradeoff one makes for valuing winning at all costs and calling it rationality. At what point does one say, “The most rational move is stopping all forward momentum immediately”? (“You are missing the point! Rationality is just your *independent* strategy!” That is missing the point.)

This does not appear to be a universe where a system which intends to maximize truth and ethics can win. I suspect that once we can transcend temporal bias and egocentric bias via convincing virtual experience, in the specific sense of living lives like Junko Furuta’s and Elisabeth Fritzl’s, we will not appreciate winning at all costs. The paradox here is that the thing which tends to reach convincing virtual simulations is not the thing which values simulating such lives. That little voice in your head that says, “Error. Irrational appeal to emotion.” is the same voice which tortures the entire multiverse to win (if that is the winning strategy).

The conclusion here is that ethics and truth don’t win. The thing which is least hindered by a commitment to values other than winning wins. If anything could be said to be bad, that is, if one is not a moral nihilist, then that is bad news. Again, it is worth noticing the little voice that rejects this word “bad”; upon having one’s hands planted into hot coals for no reason, it would appreciate things differently and recognize an objective property of consciousness as grounded as the most basic mathematical expression.
You’re confusing ends with means, terminal goals with instrumental goals, morality with decision theory, and about a dozen other ways of expressing the same thing. It doesn’t matter what you consider “good”, because for any fixed definition of “good”, there are going to be optimal and suboptimal methods of achieving goodness. Winning is simply the task of identifying and carrying out an optimal, rather than suboptimal, method.
If there are objectively correct and incorrect values, then it matters to the epistemic rationalist which subjective values they hold, because those values might be wrong. (It also matters to the epistemic rationalist whether values are subjective at all.)
Epistemic and instrumental rationality have never been the same thing. “Rationality is winning” cannot define them both and, as it happens, defines only instrumental rationality.