Unknown: “But it is quite impossible that the complicated calculation in Eliezer’s brain should be exactly the same as the one in any of us: and so by our standards, Eliezer’s morality is immoral. And this opinion is subjectively objective, i.e. his morality is immoral and would be even if all of us disagreed. So we are all morally obliged to prevent him from inflicting his immoral AI on us”
Well, I would agree with this point if I thought what Eliezer was going to inflict upon us was so out of line with what I want that we would be better off without it. Since, you know, NOT dying doesn't seem like such a bad thing to me, I'm not going to complain when he's one of the only people on Earth actually trying to make that happen...
On the other hand, Eliezer, you are going to have to answer to millions if not billions of people protesting your view of morality, especially this facet of it (the not-dying thing), so yeah, learn to be diplomatic. You're NOT allowed to fuck this up for the rest of us!