Disgust is optimizing
Someone told me that they were feeling disgusted by the view of trying to optimize for specific things, using specific objectives. This is what I wrote to them:
That feeling of being disgusted is actually a form of optimization itself. Disgust is a feeling we perceive as negative, and it gets reused for many different things; it was probably easier for evolution to rewire when disgust is triggered than to create a new feeling. The point is that the feeling that arises is supposed to change your behavior, steering you in certain directions, i.e. it redirects what you are optimizing for. For example, it could make you think about why trying to optimize for things directly, using explicit objectives, is actually a bad thing. But the value judgment comes first: you first feel disgusted, and then you try to combat the thing you are disgusted by and come up with reasons why it is bad. So it is ironic to feel disgusted at optimization when feeling disgusted is itself part of an optimization process.
We were talking about maximizing positive and minimizing negative conscious experiences, I guess with the implicit assumption that we could find some specification of this objective that we would find satisfactory (one that would not have unintended consequences when implemented).
It’s understandable to feel disgust at some visible optimization processes, while not feeling disgust at others, especially ones that aren’t perceived as intrusive or overbearing. And that could easily lead to disgust at the INTENT to optimize in simple/legible ways, without as much disgust for complex equilibrium-based optimizations that don’t have human design behind them.
Yes. There are lots of optimization processes built into us humans, but they feel natural to us, or we simply don’t notice them. Stating something that you want to optimize for, especially something that seems to impose itself on the entire structure of the universe, is not natural for humans. And that goal, if implemented, would restrict individuals’ freedoms, which humans really don’t like.
I think this all makes sense when you are trying to live together in a society, but I am not sure if we should blindly extrapolate these intuitions to determine what we want in the far future.
I’m pretty sure we shouldn’t. Note that “blindly” is a pretty biased way to describe something if you’re not trying to skew the discussion. I’m pretty sure we shouldn’t even knowingly and carefully extrapolate these intuitions terribly far into the future. I’m not sure whether we have a choice, though—it seems believable that a pure laissez-faire attitude toward future values leads to dystopia or extinction.