I mean, I’ve kinda read the thing, but it’s not very legible to me.
It kinda sounds like you’re just saying “alignment to non-instrumental goals is hard”, which everyone agrees with, and then you’re also saying “I like it when there’s more intelligence, I think that’s valuable, regardless of any other features of what the intelligence is trying to do besides get more intelligence”, which seems false and bad and you haven’t argued for it here AFAICT. But maybe I’m not understanding.
May I recommend you read the thing? I've gone through most of the arguments you proposed.
sorry, I don’t think it makes sense for me to discuss your opinions on something you kinda read.