Hm, I would interpret Cowen’s position somewhat differently.
I think his advice is roughly: if someone is saying something you generally agree with, but they hold other bad beliefs, and allying with them is likely to carry a social cost due to offensiveness (actual or predicted, given that their position wasn't arrived at through a good process), then you should not align yourself with them.
You may be pro position X, but that doesn't mean you have to endorse every group that espouses X. You should pick your allies; agreeing on X alone doesn't make the association worth it.
I don't think he's saying that you shouldn't talk about X, though. For example, you can believe that COVID restrictions should be lifted, argue for that position, and endorse some people who hold it, without endorsing the convoy. The convoy, on top of being reputationally expensive to ally with, has a cluster of salient beliefs that you probably don't endorse if you take Tyler's advice.
In other words, I don't read Tyler as saying that you should pick your beliefs based on the reputations of the people loudly advocating for something you agree with. I think Tyler would say that believing true things is the most important element. But you should update less on poorly founded opinions than on well-founded ones (obviously), and you should try to associate with people who believe things for good reasons rather than with people who merely agree with you for bad ones.
> I think his advice is roughly: if someone is saying something you generally agree with, but they hold other bad beliefs, and allying with them is likely to carry a social cost due to offensiveness (actual or predicted, given that their position wasn't arrived at through a good process), then you should not align yourself with them.
I don't think Tyler is arguing about social cost here.
Tyler generally believes that it's good to defer to experts in their domains. He says things like: economists should listen more to philosophers on topics with philosophical implications.