Exposing yourself to information that might alter your character in a way you don’t currently want (the argument that repeatedly seeing violence in video or photographic form makes us less compassionate)
Exposing yourself to persuasive arguments that might make you do things you currently consider morally wrong (Mein Kampf, serial killers’ manifestoes)
Exposing yourself to content that might persuade you to do things you don’t currently want to do (advertising, watching the Food Network if you’re dieting/fasting; the sirens’ song, if you’re Odysseus)
This is too vague. I think you don’t realize what most people will use this way of reasoning for.
Example: X is bad because the Koran says so, and I don't want to learn positive information about behaviour X lest I be tempted to engage in it.
It's easy to fit this into apologia for religious prescriptions, world-views (blank-slateism, anti-vax, …), and ideologies of all stripes (Luddite environmentalism, communism, fascism).
Exposing yourself to effective, emotionally manipulative arguments for things you’re currently confident are false (possibly religious apologetics)
One should think really hard about whether one is already in an emotional state that makes this kind of thinking more prone to bias. Anyway, to nitpick: all arguments are emotionally manipulative, including those constructed specifically to avoid an emotional response.
Exposing yourself to “cynical” true information that lowers your utility/motivation/happiness in everyday life (public choice theory, if you’re a civil servant; accounts of unsuccessful and dissatisfied grad students/law students, if you’re a student)
If you know it's cynical but true and therefore don't want to explore it, aren't we really talking about belief in belief? I don't know how much of that I want, regardless of its effects on my happiness.
Exposing yourself to content that’s “disgusting” or “degrading” in your view (Two Girls One Cup; Tucker Max; gangsta rap)
I could easily imagine gleaning some insight from Tucker Max or gangsta rap (well, sort of).
Overall, I don't by default see anything wrong with the items listed as value-preserving methods; I just think people aren't as attached to all of their values as they believe. I'm pretty sure we humans are easy to fool about what our "real" values are.