Don’t you need AI to go through the many millions of experiences that it might take to develop a good morality strategy?
I’m entranced by Jordan Peterson’s descriptions, which seem to illuminate the evolutionary path of morality for humans. Shouldn’t AI be set up to grind through the same progression?