Francis Crick once responded, “My dear chap, there was never a time in the early years of molecular biology when we sat around the table with a bunch of philosophers saying ‘let us define life first.’ We just went out there and found out what it was: a double helix.” In the sciences, definitions often follow, rather than precede, conceptual advances.
This is a great example of a situation where this behavior led to trouble! “Gene” had previously referred to a unit of heritable variation, and was afterward identified with stretches of DNA, which led many people to think that DNA was the only source of heritability.
Moreover, Crick’s claim is flatly contradicted by history. Erwin Schrödinger’s 1944 book “What Is Life?” predicted that biological information would be stored in aperiodic molecules controlling the unfolding metabolic processes of life. The book was part of the reason there was widespread enthusiasm for the search for these hypothesized molecules, and part of the reason Crick became famous after stealing the status and associated career-supporting resources for the discovery from Rosalind Franklin: he and Watson saw her unpublished X-ray diffraction data, actual data from actual work, without her knowledge, and published first.
The lesson I’ve learned from Crick’s history is that unethical, self-congratulatory blowhards can succeed at the social games of academic science, just as they can succeed in other social games, so long as they have victims to steal from. As someone interested in “the effecting of all things possible”, I want both solid, productive thinking and solid, productive experimental work to be rewarded, so that the incentives facing individual scientists encourage real progress; by that standard, I consider Crick more of a popularizing semi-parasite than either a “thinker” or a “doer”. (Relatedly, see: Stigler’s Law.)
Indeed, there’s a reason for the line “publish or perish”. Popularization is important, not just for the publisher but for the world. But in addition to telling people what they discovered, scientists can also explain how they discovered it. Credit assignment, within a mind or between minds, is a hard problem, and solving it well usually improves performance: a system that can trace its successes to their actual causes can do more of what works.
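The claim that better credit assignment yields better performance can be made concrete with a toy sketch (my own illustration, not from the original; the payoff probabilities and learner details are arbitrary): two otherwise identical epsilon-greedy bandit learners, one crediting each reward to the arm actually pulled, the other smearing every reward across both arms. Only the first reliably finds the better arm.

```python
# Toy illustration of the credit-assignment claim: a learner that credits
# rewards to their actual cause outperforms one that smears credit around.
import random

random.seed(0)
ARM_PROBS = [0.2, 0.8]  # hypothetical payoff probabilities for two arms

def run(precise_credit, steps=2000):
    values, counts = [0.0, 0.0], [0, 0]
    total = 0
    for _ in range(steps):
        # epsilon-greedy: explore 10% of the time, else pick the best estimate
        if random.random() < 0.1:
            arm = random.randrange(2)
        else:
            arm = max(range(2), key=lambda a: values[a])
        reward = 1 if random.random() < ARM_PROBS[arm] else 0
        total += reward
        # precise credit updates only the arm that produced the reward;
        # smeared credit updates every arm, destroying the signal
        targets = [arm] if precise_credit else [0, 1]
        for a in targets:
            counts[a] += 1
            values[a] += (reward - values[a]) / counts[a]  # incremental mean
    return total / steps

print("precise credit:", run(True))
print("smeared credit:", run(False))
```

With smeared credit the two arms’ value estimates stay identical forever, so the learner never discovers which arm actually pays; with precise credit it converges on the better arm and earns roughly its payoff rate.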
My objection to Crick isn’t that he failed to get the message out about DNA’s double-helix structure (he did so quite successfully), nor that he failed to illustrate a method for advancing one’s scientific career, but that in this quote his own report of his supposed contribution and methods gives misleading evidence about what actually makes a research program go faster or better. The quote’s credibility flows from his fame, and from the causal role in a revolution in biology that his fame presumes. In contrast, the best content I know on the subject of learning how to do good research, representing the condensation of enormous volumes of evidence, is an old chestnut:
At Los Alamos I was brought in to run the computing machines which other people had got going, so those scientists and physicists could get back to business. I saw I was a stooge. I saw that although physically I was the same, they were different. And to put the thing bluntly, I was envious. I wanted to know why they were so different from me. I saw Feynman up close. I saw Fermi and Teller. I saw Oppenheimer. I saw Hans Bethe: he was my boss. I saw quite a few very capable people. I became very interested in the difference between those who do and those who might have done.
When I came to Bell Labs, I came into a very productive department. Bode was the department head at the time; Shannon was there, and there were other people. I continued examining the questions, “Why?” and “What is the difference?” I continued subsequently by reading biographies, autobiographies, asking people questions such as: “How did you come to do this?” I tried to find out what are the differences. And that’s what this talk is about.
In that talk, Hamming spends some words on the question of conceptual analysis:
Great scientists tolerate ambiguity very well. They believe the theory enough to go ahead; they doubt it enough to notice the errors and faults so they can step forward and create the new replacement theory. If you believe too much you’ll never notice the flaws; if you doubt too much you won’t get started. It requires a lovely balance. But most great scientists are well aware of why their theories are true and they are also well aware of some slight misfits which don’t quite fit and they don’t forget it. Darwin writes in his autobiography that he found it necessary to write down every piece of evidence which appeared to contradict his beliefs because otherwise they would disappear from his mind. When you find apparent flaws you’ve got to be sensitive and keep track of those things, and keep an eye out for how they can be explained or how the theory can be changed to fit them. Those are often the great contributions.
“Yes, but when I discovered it, it stayed discovered.”—Lawrence Shepp