One time I was camping in the woods with some friends. We were sitting around the fire in the middle of the night, listening to the sound of the woods, when one of my friends got out a Bluetooth speaker and started playing donk at full volume (donk is a kind of funny, somewhat obnoxious style of dance music).
I strongly felt that this was a bad bad bad thing to be doing, and was basically pleading with my friend to turn it off. Everyone else thought it was funny and that I was being a bit dramatic—there was nobody around for hundreds of metres, so we weren’t disturbing anyone.
I think my friends felt that because we were away from people, we weren’t “stepping on the toes of any instrumentally convergent subgoals” with our noise pollution. Whereas I had the vague feeling that we were disturbing all these squirrels and pigeons or whatever that were probably sleeping in the trees, so we were “stepping on the toes of instrumentally convergent subgoals” to an awful degree.
Which is all to say, for happy instrumental convergence to be good news for other agents in your vicinity, it seems like you probably do still need to care about those agents for some reason?
Yes, I don’t think this will let you get away with no specification bits in goal space at the top level like John’s phrasing might suggest. But it may let you get away with much less precision?
The things we care about aren’t convergent instrumental goals for all terminal goals; the kitchen chef’s constraints aren’t doing much to keep the kitchen liveable for cockroaches. But it seems to me that this maybe does gesture at a method for getting away with pointing at a broad region of goal space instead of a near-pointlike region.