There are other, more interesting and important ways to use that compute capacity. Nobody sane, human or alien, is going to waste it on running a crapton of simulations.
This is a very silly argument, given the sorts of things we use compute capacity for, in the real world, today.
Pick the most nonsensical, absurd, pointless, “shitpost”-quality webcomic/game/video/whatever you can think of. Now find a dozen more like it. (This will be very easy.) Now total up how much compute capacity it takes to make those things happen, and imagine going back to 1950 or whenever and explaining the situation. For one teenager to watch one cat video (or whatever else) on their phone takes several orders of magnitude more compute capacity than exists in their entire world. Not only do we casually spend said compute on said cat video routinely, as a matter of course, without having to pay any discernible amount of money for it, but we also regularly waste similar amounts of compute on nothing at all, because some engineer forgot to put a return statement in the right place, and so some web page or other process uses up CPU cycles needlessly, and nobody really cares enough to fix it.
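If the “missing return statement” failure mode sounds contrived, here is a minimal sketch of what it can look like in practice. Everything in it (the function names, the timings) is hypothetical, invented for illustration, but the bug pattern is a real and common one:

```typescript
// A minimal sketch of the "missing return" bug; all names and
// timings here are hypothetical, invented for illustration.
let done = false;

function pollForUpdates(): void {
  if (done) {
    teardownUi();
    // BUG: a `return` belongs here. Without it we fall through,
    // do pointless work, and reschedule ourselves forever --
    // quietly burning CPU cycles that nobody will miss enough
    // to investigate.
  }
  checkServerForUpdates(); // pointless once `done` is set, but still runs
  setTimeout(pollForUpdates, 16); // ~60 times per second, forever
}

function teardownUi(): void { /* remove spinners, listeners, etc. */ }
function checkServerForUpdates(): void { /* no-op once `done` is set */ }

pollForUpdates();
setTimeout(() => { done = true; }, 5_000); // simulate the work finishing
```

The fix is a single keyword. The point is that compute is now so cheap that bugs like this routinely ship, keep spinning in some background tab, and stay shipped.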
People will absolutely waste compute capacity on running a crapton of simulations.
(And that’s without even getting into the “sane” caveat. Insane people use computers all the time! If you doubt this, by all means browse any social media site for a day…)