The weirdest part about "an optimization demon" is the claim "this is our measure of good (outcomes), but push too hard towards it and you'll get something bad", when intuitively something that is optimizing at our expense should have a harder time meeting stricter constraints.
The reasoning behind it seems to be that a) we ourselves and b) everything we call a brain are the results of "pushing too hard". Still, it's not immediately clear how a "semi-optimization demon" would come to be, or even what that would mean.
It's also not clear when and how you'd run into the issue, aside from running a genetic algorithm for ages.