Eliezer would never portray a failed creation of an FAI as someone so impotent and comparatively benign.
Maybe he wouldn’t, but that is a fact about him, not about AI. There’s a narrow slice of concept space containing uFAIs that are almost benign. Not that I think it’s likely we could intentionally build such an entity, and we shouldn’t want to, for basically the same reasons we shouldn’t want to build uFAI generally.
Yes. I asserted a fact about Eliezer, not about AI or green cheese. No ‘but’ is required.