On the Russell / Pinker debate, I thought Pinker had an interesting rhetorical sleight-of-hand that I hadn’t heard before...
When people on the “AGI safety is important” side explain their position, there’s kinda a pedagogical dialog:
A: Superintelligent AGI will be awesome, what could go wrong?
B: Well, it could outclass all of humanity and steer the future in a bad direction.
A: OK, then we won’t give it an aggressive goal.
B: Even with an innocuous-sounding goal like “maximize paperclips” it would still kill everyone…
A: OK, then we’ll give it a good goal like “maximize human happiness”.
B: Then it would forcibly drug everyone.
A: OK, then we’ll give it a more complicated goal like …
B: That one doesn’t work either because …
...And then Pinker reads this back-and-forth dialog, pulls a couple of pieces out of context, and says “The existential risk scenario that people are concerned about is the paperclip scenario and/or the drugging scenario! They really think those exact things are going to happen!” And that’s the strawman he can easily rebut.
Pinker had other bad arguments too; I just thought that was a particularly sneaky one.