After reading yet another article that uses the phrase ‘killer robots’ five times and features a photo of the Terminator (with RoboCop as a bonus), I’ve drafted a short email asking the author to stop using this vivid but highly misleading metaphor.
I’m going to start sending this same email to other journalists who do the same from now on. I’m not sure how big the impact will be, but once the email is drafted, sending it to new people is pretty low-effort, and there’s the potential that some journalists will think twice before referencing Terminator in AI Safety discussions, improving the quality of the discourse a little.
The effect of this might be slightly larger if more people do this.
I’ve always liked the phrase: ‘The problem isn’t Terminator, it’s King Midas. It isn’t that AI will suddenly “decide” to kill us; it’s that we will tell it to without realizing it.’ I forget where I saw that first, but it usually gets the conversation going in the right direction.
The same is true for the Terminator plot, where Skynet was given a command to preserve itself by all means, and concluded that killing humans would prevent it from being turned off.
I don’t remember Skynet getting a command to self-preserve by any means. I thought the idea was that it ‘became self-aware’ and reasoned that it had better odds of surviving if it massacred everyone.
It could be a way to turn the conversation from the Terminator topic to the value-alignment topic without direct confrontation with the person.
The fact that you engage with the article and share it might suggest to the author that he did everything right. The idea that your email will discourage the author from writing similar articles might be mistaken.
Secondly, calling autonomous weapons killer robots isn’t far off the mark. The policy question of whether or not to allow autonomous weapons is distinct from AGI.
The type of engagement that the writer of the article wants is the kind that leads to sharing. If Tenoke is specifically stating their intent not to share the content, it’s not a viral kind of engagement. There is a big difference between seeing a quote-retweet captioned “This is terrible!” and receiving a private email telling them to stop.
The fact that you engage with the article and share it might suggest to the author that he did everything right.
True, but this is one of the less bad articles that use Terminator references (the reference makes a bit more sense in this specific context), so I mind sharing it less. It’s mostly significant insofar as it’s the one I saw today that prompted me to make a template email.
The idea that your email will discourage the author from writing similar articles might be mistaken.
I can see it having no influence on some journalists, but again
I’m not sure how big the impact will be, but once the email is drafted, sending it to new people is pretty low-effort, and there’s the potential that some journalists will think twice…
Secondly, calling autonomous weapons killer robots isn’t far off the mark.
It’s still fairly misleading, although a lot less than in AGI discussions.
The policy question of whether or not to allow autonomous weapons is distinct from AGI.
I am not explicitly talking about AGI either.
I can see it having no influence on some journalists, but again
My point wasn’t that it creates no impact, but that by emailing the journalist you show him that his article is engaging. This could encourage him to write more articles like this.