A reason to act less than optimally, as a fun thought experiment: to complicate the job of anyone trying to predict you.
You might be living in a hostile, or at least suboptimal, simulation. Its operators would be trying to predict and control you to keep the simulation stable (for whatever reason, they want the society you are in to persist).
If you act naively rationally, you are predictable, and your actions will be predicted. The system as a whole will then tend toward simplicity. That isn't good, because the simulating systems also have to cope with a complex outer world.
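As a toy sketch of that predictability claim (a minimal illustration, not from the original; all names are hypothetical): in a matching-pennies-style game, a predictor that merely tracks move frequencies fully exploits a deterministic agent, while a randomizing agent holds it to chance.

```python
# Minimal sketch (hypothetical names): a toy "matching pennies" game.
# A predictor that tracks an agent's move frequencies exploits any
# deterministic policy, but a randomizing policy holds it to ~50%.
import random
from collections import Counter

def play(agent, rounds=10_000):
    """Return how often the predictor correctly guesses the agent's move."""
    history = Counter()
    correct = 0
    for _ in range(rounds):
        # The predictor guesses the agent's most frequent past move.
        guess = history.most_common(1)[0][0] if history else 0
        move = agent(history)
        correct += (guess == move)
        history[move] += 1
    return correct / rounds

# "Naively rational" agent: always plays its single deterministic best move.
deterministic = lambda history: 1

# Purposefully unpredictable agent: mixes moves uniformly at random.
randomized = lambda history: random.randint(0, 1)

print(f"deterministic agent predicted: {play(deterministic):.0%}")  # ~100%
print(f"randomized agent predicted:    {play(randomized):.0%}")     # ~50%
```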
So be irrational in a purposeful way.
Make big bets you know you will lose (but that spur other people to do interesting things). Let yourself be money-pumped for a while to learn about those systems.
Maybe send messages by acting irrationally on purpose. Bring life to the world.
This may be valuable even in less-than-adversarial complex equilibria. Even if things aren't controlled or predicted from outside, they contain plenty of forces pushing toward over-simple optimization (see https://www.lesswrong.com/w/moloch). Pushing away from the optimum can add slack (https://subgenius.fandom.com/wiki/Slack).