So the idea is that you only force the people to pay who actually are willing to pay. Obviously in the real world, you don’t know who these people are. In the post I wrote:
The theoretical model of Dominant Assurance Contracts assumes away some things that you have to deal with in the real world:
Perfect information about pricing: In the example problem above, we assumed that 10 villagers were willing to pay $15 to pave the road. In real life you do not have that information, and you risk over-pricing or under-pricing your contract. Presumably the real-world contracts that failed were over-priced.
So in the real world an asteroid deflection DAC risks being over-priced (and then we all die from an asteroid) or under-priced (some people free-ride). I still think this is an improvement over other mechanisms.
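To make the success/failure payoffs concrete, here is a minimal sketch of a DAC using the villagers example from the post. The $100 road cost, the $10 asking price, and the $2 failure bonus are illustrative assumptions of mine, not numbers from the original:

```python
def dac_outcome(pledges: int, cost: float, price: float, bonus: float):
    """Return (status, per-signer payoff) for a simple DAC.

    pledges: number of people who sign the contract
    cost:    total cost of the public good
    price:   contribution asked of each signer
    bonus:   refund bonus each signer receives if the contract fails
             (paid by the entrepreneur; this is what makes signing a
             dominant strategy for those who value the good)
    """
    if pledges * price >= cost:
        # Enough pledges: the good is provided and each signer pays `price`.
        return "funded", -price
    # Not enough pledges: signers get their money back plus the bonus.
    return "not funded", bonus

# Road costs $100; each of 10 villagers is asked for $10
# (safely under their $15 valuation, i.e. not over-priced).
print(dac_outcome(10, 100, 10, 2))  # ('funded', -10)
print(dac_outcome(7, 100, 10, 2))   # ('not funded', 2)
```

Over-pricing corresponds to setting `price` above what enough people will pledge, so the contract lands in the "not funded" branch; under-pricing funds the good but lets non-signers free-ride.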
I think any mechanism that has “we all die from an asteroid because we were trying to make sure no one benefitted unduly from our asteroid-deflecting plan” as a possible outcome is obviously flawed. It might still work for lesser problems, but in general I think it needs something else to refine the mechanism and allow for some fluctuation. Otherwise the contract is really just a way to ensure that people respect their commitment to pay X, which you can already do by, e.g., attaching a penalty for not doing your part after signing the contract.
I agree. If an actually dangerous asteroid came onto a collision course, and if popularizing DACs gave even the appearance of reducing the chance of a successful deflection by a tiny fraction, due to the aforementioned new failure mode, then it’s very likely that a great many people and organizations would want to go after whoever popularized them. If they’re still alive.
Loss aversion is much stronger than any possible gratitude for the benefits DACs bring when ramped up to such an extreme scale. In fact, after a bit of reflection, this might ironically be the worst possible supporting example for popularizing DACs and nearly the best possible counterexample. Sorry OP.
(I would estimate as much as 5% of the population would actively seek out revenge against whoever impaired the possibility of saving their children and family by even the slightest amount, when put under such extreme duress, and humans would be unable to take revenge against gravity so they would need to direct their blind fury elsewhere...)
I think calling it “loss aversion” here implies an irrational bias that isn’t there. The loss from an extinction-level impact is near infinite, so honestly I would say wanting to avoid it at all costs is rational. If someone can shell out the needed money unilaterally, then free-riding isn’t a worry. Heck, if you can gather the money from people under threat of force (which is, after all, roughly how taxes work now), it’s probably morally justified, as long as you keep the sum reasonable. It’s just a very extreme situation in which “but what if someone free-rides” definitely shouldn’t be the top concern.