The Myth of the Myth of the Lone Genius

“Our species is the only creative species, and it has only one creative instrument, the individual mind and spirit of man. Nothing was ever created by two men. There are no good collaborations, whether in music, in art, in poetry, in mathematics, in philosophy. Once the miracle of creation has taken place, the group can build and extend it, but the group never invents anything. The preciousness lies in the lonely mind of a man.”

- John Steinbeck

“The Great Man theory of history may not be truly believable and Great Men not real but invented, but it may be true we need to believe the Great Man theory of history and would have to invent them if they were not real.”

- Gwern


The Myth of the Lone Genius is a bullshit cliche and we would do well to stop parroting it to young people like it is some deep insight into the nature of innovation. It typically goes something like this—the view that breakthroughs come from the Eureka moments of geniuses toiling away in solitude is inaccurate; in reality, most revolutionary ideas, inventions, and innovations come from lots of hard work, luck, and collaboration with others.

Here is a good description of the myth from The Ape that Understood the Universe: How the Mind and Culture Evolve by psychologist Steve Stewart-Williams.

“We routinely ascribe our species’ cultural achievements to lone-wolf geniuses – super-bright freaks of nature who invented science and technology for the rest of us. … It’s a myth because most ideas and most technologies come about not through Eureka moments of solitary geniuses but through the hard slog of large armies of individuals, each making—at best—a tiny step or two forward.”

The problem here is that the myth of the lone genius is itself a myth. History (ancient and recent) is full of geniuses who came up with a revolutionary idea largely on their own—that’s why the archetype even exists in the first place (Aristotle, Newton, Darwin, and Einstein, to name the most obvious examples). The author of the above quote would seem to grant that at least some ideas and technologies do come from Eureka moments of solitary geniuses. Others go further—the author of an article entitled “The Myth of the Genius Solitary Scientist is Dangerous” holds up Einstein, Maxwell, and Newton as examples of this archetype, but then purports to expose the falsehood of these examples by saying:

“Newton looked down on his contemporaries (while suspecting them of stealing his work) but regularly communicated with Leibniz, who was also working on the development of calculus. Maxwell studied at several prestigious institutions and interacted with many intelligent people. Even Einstein made the majority of his groundbreaking discoveries while surrounded by people whom he famously used as sounding boards.”

Uhhh ok, so they talked to other people while working on their ideas? Sure, we shouldn’t have this naive view that these so-called solitary geniuses work 1000% on their own without any input whatsoever from other people, but that doesn’t mean that they didn’t do most of the heavy lifting. Similarly, another proponent of the myth of the lone genius focuses on the power of partnership (Joshua Shenk, Powers of Two: How Relationships Drive Creativity). From the introduction of an interview with Shenk on Vox:

“After struggling for years trying to develop his special theory of relativity, Einstein got his old classmate Michele Besso a job at the Swiss patent office — and after “a lot of discussions with him,” Einstein said, “I could suddenly comprehend the matter.” Even Dickinson, a famous recluse, wrote hundreds of poems specifically for people she voraciously corresponded with by letter.

The idea isn’t that all of these situations represent equal partnerships — but that the lone genius is a total myth, and all great achievements involve some measure of collaboration.”

This seems contradictory—so there is still a dominant person in the partnership doing most (or all) of the difficult work, but at the same time the lone genius is a TOTAL myth. I have a feeling that Einstein’s contribution was a little more irreplaceable than that of this Besso fellow. Is there not room for a more moderate position here? I guess that doesn’t really sell books.

It’s not hard to see why the myth of the lone genius is so popular—it is a very politically correct type of idea, very much going along with the general aversion to recognizing intelligence and genes as meaningful sources of variation in social/​intellectual outcomes. It is also kind of a natural extension of the “you can achieve anything you set your mind to!” cliche. The fact that most of the geniuses in question are white men probably plays a not insignificant role in people’s quickness to discredit their contributions. At the end of the day, it’s really tough to admit that there are geniuses in the world and you aren’t one of them.

Defenders of the myth would probably argue that the vast majority of people are not solitary geniuses and the vast majority of innovations do not come from people like this, so we should just preach the message that hard work and collaboration are what matters for innovation. In this view, the myth of the lone genius is a kind of noble lie—the lessons we impart by emphasizing the fallacy of the lone genius are more beneficial than the lessons imparted from uncritical acceptance of the lone genius story. I’m not sure this is true, and in fact I would argue that the uncritical acceptance of the myth of the lone genius is just as bad as uncritical acceptance of the lone genius story.

What lessons are we really trying to impart with the myth of the lone genius?

(1) You are not just going to have a brilliant idea come to you out of thin air.

(2) Creativity is enhanced by collaboration and sharing ideas with others. Most good ideas come from recombining pre-existing ideas.

(3) Be humble and don’t expect that it will be easy to find good ideas. No, you will not “solve” quantum mechanics after taking your first high school physics class.

Ok great, I’m on board with all of these lessons, it’s kind of impossible not to be. The problem is that by harping so much on the fallacy of the lone genius we are also sending some implicit messages that are actively harmful to aspiring scientists/​engineers/​entrepreneurs.

(4) There are no such things as geniuses, and even if there were you are not one of them.

(5) You won’t come up with a great idea by spending lots of time thinking deeply about something on your own. The people who think they can do this are crackpots.

(6) Thinking isn’t real work and ideas are cheap, anything that doesn’t produce something tangible is a waste of time. Go do some experiments, have a meeting, write a paper, etc.

(1)-(3) are certainly valuable lessons, but I think most relatively intelligent people eventually learn them on their own to some degree. My concern is that lessons (4)-(6) can become self-fulfilling prophecies—upon learning how innovation really works from the myth of the lone genius, the next would-be revolutionary thinker will give up on that crazy idea she occasionally worked on in her free time and decide to devote more time to things like networking or writing academic papers that no one reads. We should want exceptional people to believe they can do exceptional things on their own if they work hard enough at it. If everyone internalizes the myth of the lone genius to such a degree that they no longer even try to become lone geniuses, then the myth will become a reality.

My argument here is similar to the one that Peter Thiel makes about the general lack of belief in secrets in the modern world.

“You can’t find secrets without looking for them. Andrew Wiles demonstrated this when he proved Fermat’s Last Theorem after 358 years of fruitless inquiry by other mathematicians— the kind of sustained failure that might have suggested an inherently impossible task. Pierre de Fermat had conjectured in 1637 that no integers a, b, and c could satisfy the equation a^n + b^n = c^n for any integer n greater than 2. He claimed to have a proof, but he died without writing it down, so his conjecture long remained a major unsolved problem in mathematics. Wiles started working on it in 1986, but he kept it a secret until 1993, when he knew he was nearing a solution. After nine years of hard work, Wiles proved the conjecture in 1995. He needed brilliance to succeed, but he also needed a faith in secrets. If you think something hard is impossible, you’ll never even start trying to achieve it. Belief in secrets is an effective truth.

The actual truth is that there are many more secrets left to find, but they will yield only to relentless searchers. There is more to do in science, medicine, engineering, and in technology of all kinds. We are within reach not just of marginal goals set at the competitive edge of today’s conventional disciplines, but of ambitions so great that even the boldest minds of the Scientific Revolution hesitated to announce them directly. We could cure cancer, dementia, and all the diseases of age and metabolic decay. We can find new ways to generate energy that free the world from conflict over fossil fuels. We can invent faster ways to travel from place to place over the surface of the planet; we can even learn how to escape it entirely and settle new frontiers. But we will never learn any of these secrets unless we demand to know them and force ourselves to look.”


Maybe I’m overthinking all of this—does the myth of the lone genius really affect anyone’s thinking in any substantial way? Maybe it only has the tiniest effect in the grand scheme of things. Even still, I would argue that it matters—uncritical acceptance of the lone genius myth is one more cultural force among many that is making it more and more difficult for individuals to do innovative work (and last time I checked, humanity is made up of individuals). In a fast-paced world full of intense economic/​scientific/​intellectual competition and decreasing opportunities for solitude, it is harder than ever before to justify spending significant time on intangible work that may or may not pay off. You can’t put on your resume—“I spend a lot of time thinking about ideas and scribbling notes that I don’t share with anyone.”

I guess what I want to counteract is the same thing that Stephen Malina, Alexey Guzey, and Leopold Aschenbrenner argue against in “Ideas not mattering is a Psyop”. I don’t know how we could ever forget that ideas matter—of course they matter, but somewhere along the way I think we got a little confused. How this happened, I don’t know—you can probably broadly gesture at computers, the internet, big data, etc. and talk about how these have led to a greater societal emphasis on predictability, quantifiability, and efficiency. Ideas (and the creative process that produces them) are inherently none of these things; as Malina et al. remind us, “Ideas are often built on top of each other, meaning that credit assignment is genuinely hard” and “Ideas have long feedback loops so it’s hard to validate who is good at having ideas that turn out to be good”. I would also mention increased levels of competition (a result of globalism, increased population sizes, and the multitude of technologies that enable these things) as a major culprit. For any position at a college/graduate school/job you are likely competing with many people who have done all kinds of impressive-sounding things (although it is probably 90% bullshit), so you had better stop thinking about crazy ideas (remember, there are no such things as lone geniuses) and start doing things, even if the things you are doing are boring and trivial. As long as they look good on the resume...


The life and times of Kary Mullis provide an illustration of this tension between individual genius and collaboration in the production of radical innovation. Kary Mullis is famous for two things—inventing the polymerase chain reaction (for which he won the Nobel Prize) and holding some very controversial views.

“A New York Times article listed Mullis as one of several scientists who, after success in their area of research, go on to make unfounded, sometimes bizarre statements in other areas. In his 1998 humorous autobiography proclaiming his maverick viewpoint, Mullis expressed disagreement with the scientific evidence supporting climate change and ozone depletion, the evidence that HIV causes AIDS, and asserted his belief in astrology. Mullis claimed climate change and HIV/​AIDS theories were promulgated as a form of racketeering by environmentalists, government agencies, and scientists attempting to preserve their careers and earn money, rather than scientific evidence.”

This is another reason why people are so leery of the lone genius—it often comes with a healthy dose of crazy. Yes, obviously this can go poorly—his ideas on AIDS did NOT age well—but, as we all know because there is an idiom for it, sometimes you have to break a few eggs to make an omelette.

“Mullis told Parade magazine: ‘I think really good science doesn’t come from hard work. The striking advances come from people on the fringes, being playful.’”

Proponents of the lone genius myth might be wondering at this point—did Mullis really invent PCR all on his own in a brilliant flash of insight? We shouldn’t be surprised that the answer is yes, he did, but also that it’s a little more complicated than that.

“Mullis was described by some as a “diligent and avid researcher” who finds routine laboratory work boring and instead thinks about his research while driving and surfing. He came up with the idea of the polymerase chain reaction while driving along a highway.”

“A concept similar to that of PCR had been described before Mullis’ work. Nobel laureate H. Gobind Khorana and Kjell Kleppe, a Norwegian scientist, authored a paper 17 years earlier describing a process they termed “repair replication” in the Journal of Molecular Biology.[34] Using repair replication, Kleppe duplicated and then quadrupled a small synthetic molecule with the help of two primers and DNA polymerase. The method developed by Mullis used repeated thermal cycling, which allowed the rapid and exponential amplification of large quantities of any desired DNA sequence from an extremely complex template.”
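
To make the quoted contrast concrete: Kleppe’s repair replication duplicated and then quadrupled a molecule (two rounds), whereas repeated thermal cycling compounds a doubling every cycle, which is exponential growth. A toy sketch of that arithmetic (the efficiency parameter and the specific numbers are illustrative assumptions, not figures from the sources quoted here):

```python
def pcr_copies(initial_copies: float, cycles: int, efficiency: float = 1.0) -> float:
    """Idealized PCR yield: each thermal cycle multiplies the copy count
    by (1 + efficiency), i.e. a perfect doubling when efficiency = 1.0."""
    return initial_copies * (1 + efficiency) ** cycles

print(pcr_copies(1, 2))        # 4.0 -- two doublings, the quadrupling Kleppe achieved
print(pcr_copies(1, 30))       # ~1.07e9 -- 30 perfect cycles: a billion copies
print(pcr_copies(1, 30, 0.9))  # ~2.28e8 -- with an assumed 90% per-cycle efficiency
```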

“His co-workers at Cetus, who were embittered by his abrupt departure from the company,[10] contested that Mullis was solely responsible for the idea of using Taq polymerase in PCR. However, other scientists have written that the “full potential [of PCR] was not realized” until Mullis’ work in 1983,[35] and that Mullis’ colleagues failed to see the potential of the technique when he presented it to them.”

“Committees and science journalists like the idea of associating a unique idea with a unique person, the lone genius. PCR is thought by some to be an example of teamwork, but by others as the genius of one who was smart enough to put things together which were present to all, but overlooked. For Mullis, the light bulb went off, but for others it did not. This is consistent with the idea that the prepared (educated) mind, careful to observe and not overlook, is what separates the genius scientist from the many other also-smart scientists. The proof is in the fact that the person who has the light bulb go off never forgets the “Ah!” experience, while the others never had this photochemical reaction go off in their brains.”


So what’s the take-home message? Let’s not treat the myth of the lone genius like it’s gospel. Sometimes really smart people think long and hard about something and come up with an idea that changes the world. Yes, this happens very rarely, and most innovation comes from the “hard slog of large armies of individuals, each making—at best—a tiny step or two forward”, but if we aren’t careful then these Eureka moments will become fewer and farther between and everything will be a hard slog. Let’s do better by providing a more nuanced picture of innovation in which solitary exploration by “geniuses” and collaboration both play critical roles.

(originally posted at Secretum Secretorum)