also, even if we don’t set anything like this up now, you should expect that in worlds where alignment gets solved and ASI is built, people will probably consider it wise to retroactively reward those who helped make things go well, in proportion to their estimated impact.
(i’m also default-skeptical of anything crypto-related for the purposes of building lasting institutions. i don’t care how good a DAO could be in theory; the average DAO has a lifespan shorter than the average SF startup)
Yes, people might, but how can that be mapped back in a way that gets people to invest in or prioritize those future rewards now? Not sure a DAO is the best way, either. But it seems there needs to be some kind of verifiable commitment now that people could reference.