If you never miss a commuter train, you’re always at the station too early. If you never miss a holiday flight, that’s fine.
If you’ve never failed a job interview, you could get paid much more. If you never get fired, you might be leaving something on the table, but I wouldn’t complain.
If your jokes never offend anyone, you’re not going to make it as a standup comedian. If your jokes always offend someone, consider that you might not be that funny after all.
A pessimist won’t be disappointed. But an optimist might be happier. The pessimist will be right a lot more, though.
If your business never encounters fraud, you could be saving money on security measures. If everyone knew exactly how likely they were to get caught, you might have to spend a lot more. Or perhaps a lot less. Maybe there’s some cheap signaling you could do?
If you have a low risk tolerance, you’re leaving a lot of value on the table. If you’re insensitive or oblivious to the downsides, you’ll lose a lot more.
I think about this in my head as “in practice you converge faster to the optimum if you overshoot sometimes, so do that when overshooting is affordable,” with the counterexample that learning to drive shouldn’t involve accidentally killing a couple of people.
1: I see the main point of OP as a variance–expectation trade-off, where variance is bad when you’re risk averse, e.g. when bad outcomes are much more bad than good outcomes are good. Perhaps you meant this—what you said reads like you may have meant that the process of overshooting teaches you new stuff.
2: When learning to park in an empty parking lot I realized I was consistently turning too early and so decided to go for enough that I’d expect to overshoot just as often/by as much; this suddenly made me much better and got me to learn the correct turning time faster. Notably, there was no risk of hitting someone if I overshot to the right instead of to the left.
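The parking anecdote can be sketched as a toy simulation (everything here is illustrative and not from the original): an error-correcting learner with a very small step size creeps toward the target from one side and never overshoots, while a learner with a larger step size overshoots on some steps but reaches the target in far fewer of them.

```python
def steps_to_converge(target, lr, start=0.0, tol=0.05, max_steps=1000):
    """Count update steps until the estimate is within tol of the target.

    Each step moves the estimate by lr times the remaining error, so the
    error shrinks by a factor of |1 - lr| per step. A small lr never
    overshoots but converges slowly; an lr above 1 overshoots on every
    step yet converges faster, up until lr >= 2, where it diverges.
    """
    x = start
    for step in range(1, max_steps + 1):
        x += lr * (target - x)  # correct by a fraction lr of the error
        if abs(x - target) < tol:
            return step
    return max_steps

# A cautious learner (lr=0.1, always undershoots) vs. one willing to
# overshoot (lr=1.5): with these numbers the cautious learner needs
# dozens of steps while the bold one needs fewer than ten.
cautious = steps_to_converge(10.0, lr=0.1)
bold = steps_to_converge(10.0, lr=1.5)
```

This matches the empty-parking-lot setting: when overshooting costs roughly as much as undershooting, centering your errors (rather than always erring on the safe side) gets you to the optimum faster.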
I haven’t fleshed out my idea clearly. I’m saying something like “In asymmetric scenarios, the more costly failures are, the harder it is to reach the optimum (for a given level of risk-averseness)” + “In hindsight, most people will think they were too risk-averse for most things.” It isn’t centrally relevant to what OP is saying, upon reflection.