I wouldn’t expect a very nasty infohazard to increase my epistemic alignment with the world. It could, for example, be a very convincing deepfake engineered to be particularly sticky in my mind.
Part of caring about epistemic rationality is engaging with sources of information that you expect to improve your alignment with reality rather than decrease it.
Most of the crowd that values epistemic rationality also doesn’t regard direct experiential exposure as a reliable way to gain knowledge. If a new infohazard has such effects, it might be much better to read a research article about it than to expose oneself directly to the piece.