Preface: I’m mostly thinking out loud here due to personal interest, and I’m a bad writer myself. This may be outside the scope of what you’re trying to say, or it may put excessive pressure on your ideas for the sake of rare edge cases with little benefit. Feel free to discard any or all of the following:
I’m curious how far you’re taking the idea that truth is less harmful than lies. Infinitely? I have personally asked myself the kinds of questions which have driven people to suicide or existential crisis, and eventually arrived at some of the most uplifting truths I now know. On the other hand, this world is riddled with lies (and I think some of them are elephants in the room). You can get yourself or other people killed just by stating the truth. I’m under the impression that the average person knows a few such truths, but maybe that’s not the case. Said axiomatically: some truth seems not to be aligned with humanity at all, and some truth appears aligned only with a subset of humanity (and these subsets tend to have friction between them).
A thing which interests me personally is how some truths sound terrible only because of other lies: “You’re an egoist.” “No, how dare you say such a thing!” “It’s a tautology that you always do what you think you should do in the moment. I didn’t say that you harm other people for your own benefit. Actually, you’re likely doing other people good for your own benefit; it’s probably the case that your ego wants to help other people, which makes you moral rather than immoral.”
But it’s extremely likely that any truthful statement will be taken as an opinion or evaluation, that other people will doubt the intention behind your truthfulness, and even that they will claim you’re lying (because they disagree with what they think you imply). An easy example is saying that (universally disliked person) is intelligent, since “intelligent” is often taken as a compliment rather than as a description. The hidden truth here is that intelligence and morality correlate less than we want to believe, and this is obscured by the hidden false belief that morality must be correct or valid (that the inherent value of preferences isn’t enough for comfort).
By the way, I agree that AI is dangerous (more than that, the danger is trivially true). I actually think that stopping all technological growth might soon be desirable, or at least that we should be selective about future advancements.
Finally, I have to disagree with “Push for public conversations” if that entails lowering the barrier of entry. Every topic requires a level of intelligence and knowledge before one can engage with it productively. If you talk in a way that only intelligent and knowledgeable people understand, you automatically filter out most of those who are unqualified. If such a person were to read this comment, or scan it for tokens which are taboo or outside the Overton window, they’d not find much, and thus wouldn’t engage with me in a hostile manner. Abstracting to a higher scope than both sides of a culture-war subject (“The majority is not necessarily correct”) seems better than a specific statement which is easier to understand but more likely to look like it’s taking a stance (“Bullying is not ideal”, “Cancel culture is not ideal”).