I agree with much of the thrust of this post. It is very bad that the causes of discount rates (such as opportunity costs) exist.
But your reaction to Carl Shulman’s time travel argument leaves me wondering whether you have a coherent position.
If a Friendly AI with a nonzero discount rate concluded that it had a chance of creating time travel, and that time travel would work in a way that abolished opportunity costs, then I would conclude that devoting a very large fraction of available resources to creating time travel is what a genuine altruist would want.
Can you clarify whether you really mean to say that an AI shouldn’t devote a lot of resources toward something which would abolish opportunity costs (i.e. give everyone everything they can possibly have)?
Of course, it’s not clear to me that an AI would believe it has a chance of creating time travel. And it’s not clear to me that time travel would be sufficient to abolish opportunity costs, arbitrage interest rates to zero, etc. I sometimes attempt to imagine a version of time travel which would do those things, but my mind boggles before I get close to deciding whether such a version is logically consistent.
The only model of time travel I understand well enough to believe coherent is the one proposed by David Deutsch, and it does not appear powerful enough to abolish opportunity costs or to arbitrage interest rates to zero. If that is the model of time travel you had in mind, then please clarify why you think it says anything interesting about the existence of discount rates.