“People will draw conclusions that harm me” and “people will draw conclusions that weaken my argument” are very different things. Yelling that you shit your pants is in the first category. Saying things that make people less likely to believe in AI danger is in the second.
Hiding information in the second category may help you win, but your goal is to find the truth, not to win regardless of truth. Prosecutors have to turn over exculpatory evidence, and there is a reason for this.
It’s not dishonesty or hiding the truth to explain something to someone in a way they can understand. Maybe I’m misunderstanding, but your argument seems like a general argument against persuasion, pedagogy, and any deliberate ordering of information at all.
You always hide information by choosing to present some of it later. You can’t simply dump everything you know about a subject all at once; you have to choose what’s most important for your listener to hear at that moment.
We don’t start teaching physics to middle school kids with quantum mechanics because it’s the “most true”; instead we start with the easiest-to-understand material (Newtonian mechanics) that lays the groundwork for what comes later. You’re not hiding quantum mechanics, you’re presenting the information in the best order for teaching it. If you started with quantum mechanics, they’d be scared away.
In the case of talking about AI with my mom, there wasn’t any kind of debate or truth-seeking process happening. It was pure exposition. My mom was completely uninformed on the subject and ill-equipped to contribute. I was sharing an idea with her that I found interesting/concerning, and she was reacting. If I had started talking about nanobots and boiling oceans, the debate wouldn’t have been any fairer, because there was no debate. It would have given my mom the wrong impression about AI extinction risk (that it sounds crazy) and accomplished nothing of value. Giving someone the wrong impression is what’s dishonest, even if your literal words are true.
“It would have given my mom the wrong impression about AI extinction risk (that it sounds crazy)”
“It sounds crazy” is a correct impression, by definition. I assume you mean “the wrong impression (that it is crazy)”.
But there’s a fine line between “I won’t mention this because people will get the wrong impression (that it’s crazy)” and “I won’t mention this because people will get the wrong impression (that it’s false)”. The former is a subset of the latter; are you going to do the latter and conceal all information that might call your ideas into doubt?
(One answer might be “well, I won’t conceal information that would lead to a legitimate disagreement based on unflawed facts and reasoning. Thinking I’m crazy is not such a disagreement”. But I see problems with this. If you believe in X, you by definition think that all disagreement with X is flawed, so this doesn’t restrict you at all.)
I would say don’t conceal any important information. But you don’t have to lead with information that sounds crazy. Maybe bioweapons don’t make it into the 1-minute elevator pitch, but can be explained in the 10-minute version, or during an ensuing back-and-forth. If bioweapons were somehow critical to the AI extinction argument I wouldn’t say this, but all the sci-fi stuff isn’t actually part of the core argument anyway.