True fact: the female orgasm is not necessary for reproduction.
If group A believes that reproduction is inherently good, then learning this true fact, replacing a prior belief that the female orgasm is necessary for reproduction, may lead members of group A to deprioritize female orgasms, and therefore directly reduce the amount of orgasmic joy experienced by ~50% of that group. This reduction in joy could be classed as harm.
The Puritans of the Massachusetts Bay Colony believed that reproduction was good, and that the female orgasm during the sexual act was necessary to its achievement.
Am I doing it right?
Nope. You’re evaluating their strategies using your utility function. Infohazards occur when individuals or groups create strategies using their own utility functions and then do worse under their own utility functions when knowledge of true facts is added to them.
So, the true fact 'the female orgasm is not necessary for reproduction' would not qualify as an infohazard to a colonial Puritan, who believes that reproduction is good, and that the female orgasm is necessary for its accomplishment?
In order to turn it into an infohazard to that Puritan, do I have to add the (previously unstated) assertion that 'experiencing orgasmic joy is utility-positive'? Is there a way to fix this example, or am I just completely off base here?
Edit: I’m trying to define an edge case, hope I’m not offending anyone.
You’re basically just failing at modeling rational agents with utility functions different from yours, I’m sorry to say. If the Puritans value pleasure, they can pursue it even after learning the true facts of the matter. If they don’t value pleasure, but you do, you’re unhappy they learned the secret because now they’ll do things you don’t want, but they do want to do those things under their own utility functions.
Oh, I understand: I'm trying to apply an external measure of utility, but it doesn't necessarily match up with an internal measure, so this example fails. Thank you!
Edit: you’ve written before about your experiences growing up in an insular religious environment. Can you in retrospect identify any pieces of true widely known information that would qualify as an infohazard to that group using your definition? Obviously I wouldn’t ask you to actually state the true fact or the reason it’s an infohazard.