Nearly all software that superficially looks like it’s going to go Skynet on you and kill you isn’t going to do that, either.
Sure. Because nearly all software that superficially looks to a human like it’s a seed AI is not a seed AI. The argument for ‘programmable indirect normativity is an important research focus’ nowhere assumes that it’s particularly easy to build a seed AI.
“If there are seasoned AI researchers who can’t wrap their heads around the five theses,” then you are going to feel more pleased with yourself, being a believer,
Hm? No. Dissonance is painful. People feel happier agreeing than disagreeing.
which releases dopamine and reinforces whatever fallacies of reasoning you make.
Releasing dopamine also reinforces whatever correct reasoning one carries out. Good reasoning is just as much a brain process as bad reasoning.