“Premature optimisation is the root of all evil.” -- Donald Knuth.
Let us suppose that free will exists. So the universe is not deterministic. So those who built the current magical infrastructure could not know that their choices are optimal—they could only hope. So we may ask: is it not possible that we might improve upon their work?
[Edit] Perhaps more to the point, if everything is so well optimised, why are there simultaneously (a) “many bright students” and (b) many ways for them to tragically destroy themselves and their loved ones? Well, I suppose that this is a very old question. Namely, why is there a tree of knowledge with a big sign on it saying “do not eat”?
This is true. The response is: What probability of success do you need to reach before this becomes the wise course of action, given the consequences of failure?
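The threshold question above is the standard expected-value calculation, and can be made concrete with a short sketch (the function name and the example numbers are illustrative, not from the thread):

```python
def break_even_probability(benefit: float, cost: float) -> float:
    """Minimum success probability at which acting has positive expected value.

    Act only if p * benefit > (1 - p) * cost, which rearranges to
    p > cost / (benefit + cost).
    """
    return cost / (benefit + cost)

# With a catastrophic downside (cost far exceeding benefit), the required
# probability approaches 1 -- e.g. benefit 1, cost 99 demands p > 0.99.
print(break_even_probability(1, 99))   # 0.99
print(break_even_probability(1, 1))    # 0.5 -- symmetric stakes
```

The point of the rearrangement is that as the consequences of failure grow, the confidence you must justify before acting grows with them.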
The wizarding world is already a post-scarcity society.
How can we estimate such things, if the scientific method is forbidden to us?
That is not the lesson this was intended to convey.
Your Marcus Valerius Corvus used the scientific method:
And you erased him. What lesson were we supposed to hear?
Sometimes your best isn’t good enough, and you should take this into account when you do risk assessments, especially when someone more experienced tells you not to do something. The universe is under no obligation to reward you for being persistent.
On a more object level, there is no lesson: This guy did the equivalent of trying to hack an AGI. You don’t get to complain when that backfires.