You don’t have to understand the real reason; it just has to convince you. Eliezer Yudkowsky has convinced people to let an AI out of a box in a thought experiment, and to give him money in real life, even when they didn’t believe it was the logical course of action.