Swimmer or Dave, are either of you aware of a practical methodology for rationalizing the masses?
For a sufficiently broad understanding of “practical” and “the masses” (and understanding “rationalizing” the way I think you mean it, which I would describe as educating), no. Way too many people on the planet for any of the educational techniques I know about to affect more than the smallest fraction of them without investing a huge amount of effort.
It’s worth asking what the benefits are of better educating even a small fraction of “the masses”, though.
Or a reason to think a more efficient society would be any less oppressive or war-driven?
That depends, of course, on what the society values. If I value oppressing people, making me more efficient just lets me oppress people more efficiently. If I value war, making me more efficient means I conduct war more efficiently.
My best guess is that collectively we value things that war turns out to be an inefficient way of achieving. I’m not confident the same is true about oppression.
In fact, in a worst case scenario, I see a world of majorly rational people as transforming into an even more efficient war machine, and killing us all faster.
Sure. But that scenario implies that wanting to kill ourselves is the goal we’re striving for, and I consider that unlikely enough to not be worth worrying about much.
What is the perceived end goal of friendly AI? Is it that an unbiased, unfailing intelligence replaces humans as the primary organizers and arbiters of power in our society?
Similar, yes. A system designed to optimize the environment for the stuff humans value will, if it’s a better optimizer than humans are, get better results than humans do.
That depends, of course, on what the society values. If I value oppressing people, making me more efficient just lets me oppress people more efficiently. If I value war, making me more efficient means I conduct war more efficiently.
So does rationality determine what a person or group values, or is it merely a tool to be used towards subjective values?
Sure. But that scenario implies that wanting to kill ourselves is the goal we’re striving for, and I consider that unlikely enough to not be worth worrying about much.
My scenario does not assume that all of humanity views themselves as one in-group. Whereas what you are saying assumes that it does. Killing ourselves and killing them are two very different things. I don’t think many groups have the goal of killing themselves, but do you not think that the eradication of competing out groups could be seen as increasing in-group survival?
Almost entirely orthogonal.
You are going to have to explain what you mean here.
So does rationality determine what a person or group values, or is it merely a tool to be used towards subjective values?
Dunno about “merely”, but yeah, the thing LW refers to by “rationality” is a tool that can be used to promote any values.
My scenario does not assume that all of humanity views themselves as one in-group. Whereas what you are saying assumes that it does.
I don’t think it assumes that, actually. You mentioned “a world of majorly rational people [...] killing us all faster.” I don’t see how a world of people who are better at achieving what they value results in all of us being killed faster, unless people value killing all of us.
If what I value is killing you and surviving myself, and you value the same, but we end up taking steps that result in both of us dying, it would appear we have failed to take steps that optimize for our goals. Perhaps if we were better at optimizing for our goals, we would have taken different steps.
do you not think that the eradication of competing out groups could be seen as increasing in-group survival?
Sure.
Almost entirely orthogonal.
You are going to have to explain what you mean here.
I mean that whether humanity is digitized has almost nothing to do with the perceived end goal.