Marxist regimes — and Stalin in particular — treated biology and physics asymmetrically.
In biology, Stalin and other prominent Marxist leaders elevated the views of the quack antigeneticist Trofim Lysenko to state-supported orthodoxy, leading to the dismissal of thousands of geneticists and plant biologists. Lysenkoism hurt Soviet agriculture, and helped trigger the deadliest famine in human history during China’s Great Leap Forward.
In physics, on the other hand, leading scientists enjoyed more intellectual autonomy than any other segment of Soviet society. Internationally respected physicists ran the Soviet atomic project, not Marxist ideologues. When their rivals tried to copy Lysenko’s tactics, Stalin balked. A conference intended to start a witch hunt in Soviet physics was abruptly canceled, a decision that had to originate with Stalin. Holloway recounts a telling conversation between Beria, the political leader of the Soviet atomic project, and Kurchatov, its scientific leader: “Beria asked Kurchatov whether it was true that quantum mechanics and relativity theory were idealist, in the sense of antimaterialist. Kurchatov replied that if relativity theory and quantum mechanics were rejected, the bomb would have to be rejected too. Beria was worried by this reply, and may have asked Stalin to call off the conference.”
The “Lysenkoization” of Soviet physics never came.
The best explanation for the difference is that modern physics had a practical payoff that Stalin and other Communist leaders highly valued: nuclear weapons.
And:
We encounter the price-sensitivity of irrationality whenever someone unexpectedly offers us a bet based on our professed beliefs. Suppose you insist that poverty in the Third World is sure to get worse in the next decade. A challenger immediately retorts, “Want to bet? If you’re really ‘sure,’ you won’t mind giving me ten-to-one odds.” Why are you unlikely to accept this wager? Perhaps you never believed your own words; your statements were poetry — or lies. But it is implausible to tar all reluctance to bet with insincerity. People often believe that their assertions are true until you make them “put up or shut up.” A bet moderates their views — that is, changes their minds — whether or not they retract their words.
How does this process work? Your default is to believe what makes you feel best. But an offer to bet triggers standby rationality. Two facts then come into focus. First, being wrong endangers your net worth. Second, your belief received little scrutiny before it was adopted. Now you have to ask yourself which is worse: Financial loss in a bet, or psychological loss of self-worth? A few prefer financial loss, but most covertly rethink their views. Almost no one “bets the farm” even if — pre-wager — he felt sure.
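The ten-to-one offer in the passage above has a simple arithmetic core: staking ten units to win one is profitable in expectation only when your probability of being right exceeds 10/11, roughly 0.91. A minimal sketch of that calculation (the function name and stake sizes are my own illustration, not Caplan's):

```python
def expected_profit(p_right: float, stake: float = 10.0, payout: float = 1.0) -> float:
    """Expected profit of giving ten-to-one odds: you win `payout` with
    probability p_right and lose `stake` otherwise."""
    return p_right * payout - (1 - p_right) * stake

# Break-even condition: p * 1 = (1 - p) * 10, i.e. p = 10/11 ≈ 0.909.
print(expected_profit(10 / 11))  # approximately zero at break-even
print(expected_profit(0.80))     # negative: "pretty sure" is not sure enough
```

At 80% confidence the bet already loses 1.2 units in expectation, which is why a claimed "sure" belief so rarely survives contact with the offered odds.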
More from The Myth of the Rational Voter:
And: