In addition to Roko’s point that this sort of opinion-falsification is often habitual rather than a strategic choice that a person could opt not to make, it also makes strategic sense to lie in such surveys.
First, the promised “anonymity” may not actually be real, or real in the relevant sense. The methodology mentions “a secure online survey system which allowed for recording the identities of participants, but did not append their survey responses to their names or any other personally identifiable information”, but if your reputation is on the line, would you really trust that? Maybe there’s some fine print that’d allow the survey administrators to look at the data. Maybe there’d be a data leak. Maybe there’s some other unknown-unknown you’re overlooking. Point is, if you give the wrong response, that information can get out somehow; and if you don’t, it can’t. So why risk it?
Second, respondents may care about what the final anonymized conclusion says: either because the lab leak hypothesis becoming mainstream would hurt them personally (directly, or by e.g. hurting the people they rely on for funding), or because a final conclusion in favour of the lab leak would reflect poorly on them collectively. If the survey ended up saying that 90% of epidemiologists believe the lab leak, and you’re an epidemiologist… well, anyone you talk to professionally will then assign 90% probability that that’s what you believe. You’d be subtly probed regarding this wrong opinion, your past and future statements would be scrutinized for consistency with those of someone who believes the lab leak, and what happens if the status ecosystem notices something amiss?
But, again, none of these calculations would be strategic. They’d be habitual; these factors are just the reasons why these habits are formed.
Answering truthfully in contexts-like-this is how you lose the status games. Thus, people who navigate such games don’t.