# The Simple Math of Everything

I am not a professional evolutionary biologist. I only know a few equations, very simple ones by comparison to what can be found in any textbook on evolutionary theory with math, and on one memorable occasion I used one incorrectly. For me to publish an article in a highly technical ev-bio journal would be as impossible as corporations evolving. And yet when I’m dealing with almost anyone who’s *not* a professional evolutionary biologist…

It seems to me that there’s a substantial advantage in knowing the *drop-dead basic fundamental embarrassingly simple* mathematics in as many different subjects as you can manage. Not, necessarily, the high-falutin’ complicated damn math that appears in the latest journal articles. Not unless you plan to become a professional in the field. But for people who can read calculus, and sometimes just plain algebra, the drop-dead basic mathematics of a field may not take that long to learn. And it’s likely to change your outlook on life more than the math-free popularizations *or* the highly technical math.

Not Jacobian matrices for frequency-dependent gene selection; just Haldane’s calculation of time to fixation. Not quantum physics; just the wave equation for sound in air. Not the maximum entropy solution using Lagrange multipliers; just Bayes’s Rule.
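For concreteness, that first calculation really is drop-dead simple. Under the standard diffusion approximation from population genetics (the numbers below are purely illustrative), a new beneficial allele with selection advantage s in a diploid population of size N takes roughly (2/s) ln(2N) generations to fix:

```python
import math

def fixation_time(s, N):
    """Rough number of generations for a beneficial allele with
    selection advantage s to fix in a diploid population of size N.
    Diffusion approximation; only meaningful when N * s >> 1."""
    return (2.0 / s) * math.log(2 * N)

# A 1% advantage in a population of a million:
print(round(fixation_time(0.01, 1_000_000)))  # → 2902, i.e. a few thousand generations
```

The point isn’t the exact constant; it’s that one line of algebra already tells you fixation is slow on human timescales.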

*The Simple Math of Everything,* written for people who are good at math, might not be all that weighty a volume. How long does it take to explain Bayes’s Rule to someone who’s good at math? *Damn* would I like to buy that book and send it back in time to my 16-year-old self. But there’s no way I have time to write this book, so I’m tossing the idea out there.
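As an illustration of just how small that explanation is, here is Bayes’s Rule applied to the standard screening-test example (the 1% / 80% / 9.6% figures are the usual illustrative numbers, not data):

```python
def bayes(prior, likelihood, false_positive_rate):
    """P(H|E) = P(E|H) P(H) / P(E), for a binary hypothesis H
    and a positive test result E."""
    p_e = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / p_e

# 1% base rate, 80% sensitivity, 9.6% false-positive rate:
posterior = bayes(prior=0.01, likelihood=0.80, false_positive_rate=0.096)
print(round(posterior, 3))  # → 0.078: a positive test still leaves disease unlikely
```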

Even in reading popular works on science, there is yet power. You don’t want to end up like those poor souls in that recent interview (I couldn’t Google) where a well-known scientist in field XYZ thinks the universe is 100 billion years old. But it seems to me that there’s substantially *more* power in pushing until you encounter some basic math. Not complicated math, just basic math. F=ma is *too* simple, though. You should take the highest low-hanging fruit you can reach.

Yes, there are sciences whose soul is not in their math, yet which are nonetheless incredibly important and enlightening. Evolutionary psychology, for example. But even there, if you kept pushing until you encountered equations, you would be well-served by that heuristic, even if the equations didn’t seem all that enlightening compared to the basic results.

I remember when I finally picked up and started reading through my copy of the *Feynman Lectures on Physics,* even though I couldn’t think of any realistic excuse for how this was going to help my AI work, because I just got fed up with not knowing physics. And—you can guess how this story ends—it gave me a new way of looking at the world, which all my earlier reading in popular physics (including Feynman’s QED) hadn’t done. Did that help inspire my AI research? Hell yes. (Though it’s a good thing I studied neuroscience, evolutionary psychology, evolutionary biology, Bayes, and physics *in that order*—physics alone would have been *terrible* inspiration for AI research.)

In academia (or so I am given to understand) there’s a huge pressure to specialize, to push your understanding of one subject all the way out to the frontier of the latest journal articles, so that you can write your own journal articles and get tenure. Well, one may certainly have to learn the far math of one field, but why *avoid* the simple math of others? Is it too *embarrassing* to learn just a little math, and then stop? Is there an unwritten rule which says that once you start learning any math, you are obligated to finish it all? Could that be why the practice isn’t more common?

I know that I’m much more embarrassed to know a few simple equations of physics, than I was to know only popular physics. It feels wronger to know a few simple equations of evolutionary biology than to know only qualitative evolutionary biology. Even mentioning how useful it’s been seems wrong, as if I’m boasting about something that no one should boast about. It feels like I’m a dilettante—but how would I be diletting *less* if I hadn’t studied even the simple math?

Would you have time to start a wiki whose purpose was to be edited into a book, coauthored by dozens of contributors, who can explain the basic simple math of their field to non-math-phobic laypeople? (This is different from just scraping Wikipedia; these would be targeted articles, perhaps some invited ones...) Of course that could end up taking more time due to the infamous herding cats problem. But I’d love to have that book to read on the BART train.

For those wondering about the answer: he did. It’s called Arbital, but it was discontinued (see the Arbital postmortem).

I don’t think most people feel more ashamed of knowing a little than knowing nothing; they just don’t try. But Eliezer’s shame reminds me of the story where Feynman was having trouble learning something, and his wife told him to read like a beginner again. I believe it is a common speculation that people avoid learning new things to avoid feeling like a beginner.

I remember reaching exactly this point and making exactly this wish many years ago. I tried to learn as many fields as I could by reading introductory textbooks, and most of those texts avoid any math. I thought that a text that was willing to use simple math could teach me a lot more a lot faster. My theory was that there were too few people who could handle simple math and would want to learn many fields to support the book. But I’d love to be shown wrong.

Rather than Simple Math, a better way of looking at it may be as Mathematics for Understanding, as opposed to mathematics for research.

What I find embarrassing about knowing just a little bit about a subject is that outside of a formal class, there are few places to talk about it; in particular, few places to talk about it with people who will bring you further toward understanding what you’ve learned. If you learn a little bit of the mathematics of a subject, you’re not interesting to the specialists, and most others won’t be interested in the subject at all.

It seems easier to find a community around learning things that are less academic subjects, where you’ll generally learn them in an informal structure anyhow—cooking, crafts, foreign languages.

(I do like the idea of The Simple Math of Everything...)

Hi, I’m a lurker on this site. I think this is a brilliant idea. I’ve just set up a wiki at http://scratchpad.wikia.com/wiki/The_Simple_Math_of_Everything

Please go forth and edit!

Note: I am not the administrator; I have no special privileges. More info on that page.

I agree about the usefulness of a basic technical understanding of as many fields as possible. As for the push to specialize in academia: well, it’s complicated. I’m not a professor, I’m a grad student, but here’s my experience. If you’re in one of the relatively “pure” disciplines (physics, computer science, and so on), the push to specialize is very real, as is the push to focus on what everyone else (including granting agencies) thinks is “hot.” But there *is* a lot of multi-disciplinary work going on, an increasing amount really. Trouble is, that quickly becomes a new discipline in its own right. My alma mater now has 5 different biology majors, each of them interdisciplinary in interesting ways. My own field, materials science, encompasses the study of solids and liquids: metals, alloys, ceramics, oxides, semiconductors, polymers, and even biological materials. It can’t be done unless you understand organic and inorganic chemistry, crystallography (applied group theory, really), physics (classical: strain fields, shearing forces; and quantum: Bloch waves, electronic band structure), and enough computer science to write some basic simulations. You end up with professors working in fields that didn’t exist when they started out. So they keep taking classes and reading each other’s books.

> A little knowledge can be more dangerous—and embarrassing—than complete ignorance.

Yes. As a math professor, I sort of agree and sort of disagree with this post. On the one hand, people have lots of misunderstandings about math, as people like John Allen Paulos have written. But on the other hand, it’s NOT true that everything has a simple mathematical model. Often mathematical models that might be useful in physics are not especially useful elsewhere, and even more often the most important thing is not the model’s predictions, but the errors.

Look at the Social Security model, for example. It’s incredibly unreliable, because it makes long-term predictions based on a single parameter (average growth of GNP) which is assumed to be constant over 40 years. And the difference in predictions from changing this widely varying number is on the order of 10-20 years.
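The sensitivity being described is just compound growth. A toy sketch (the principal and rates below are invented, not the actual trust-fund inputs): shift the assumed annual growth rate by a single point and watch a 40-year projection swing.

```python
def compound(principal, rate, years):
    """Value of `principal` after compounding annually at `rate`."""
    return principal * (1 + rate) ** years

# Same fund, 40 years out, growth assumption moved from 1.5% to 2.5%:
low = compound(100, 0.015, 40)
high = compound(100, 0.025, 40)
print(round(low), round(high))  # → 181 269: the forecasts differ by nearly 50%
```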

But the problem is that a few people think they know the math here and think they understand the situation completely because of it. In fact they know a tiny bit of math (or trust that other people know the math), and end up doing incredibly stupid things because of it. If they actually knew more, they would be a lot more careful with things like personal accounts and such. Instead we trust a few political appointees, process a couple of the numbers involved, and base everything on that.

And if you disagree with me about personal accounts on Social Security or something, and just think I’m a liberal who shouldn’t be taken seriously, compare the Doomsday argument http://en.wikipedia.org/wiki/Doomsday_argument. It uses statistics (which most people don’t understand) to make a trivial prediction with absurd consequences that gets taken seriously. People with a little understanding of statistics will take it seriously, but people who actually understand the limitations of statistics will realize it’s ridiculous.
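For readers who haven’t met it, the whole argument rests on one line of arithmetic: if your birth rank were a uniform random draw from everyone who will ever live (the contested premise), then with 95% confidence you are not in the first 5%, which caps the total. A sketch using the commonly quoted rough figure of 60 billion humans born so far:

```python
def doomsday_upper_bound(birth_rank, confidence=0.95):
    """Upper bound on the total number of humans ever born, given that
    birth_rank / total >= 1 - confidence with probability `confidence`
    (assumes birth rank is a uniform random draw, the disputed step)."""
    return birth_rank / (1 - confidence)

print(f"{doomsday_upper_bound(60e9):.2e}")  # → 1.20e+12 humans, total, at 95% confidence
```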

Agreed, but people with enough experience of the limits of simple mathematical models in one field are less likely to make that mistake in other fields.

A hypothetical *The Simple Maths of Everything* textbook should include warnings about the limits of the models, and a few memorable examples of how those models go wrong.

If you ever get as seriously curious about electronics as you were about physics, look at Horowitz and Hill, *The Art of Electronics*. Very, very useful for someone who already knows the math and wants to understand electronics principles and the practicalities of one-off discrete circuit design.

While simple “me too”ing is generally bad netiquette, I have to say that *The Simple Math of Everything* is a just plain fantastic idea.

My guess is that most people simply don’t know that knowing the math is important to understanding a subject. Until you have some technical understanding of a subject it may seem that a non-technical understanding is all there is.

A little knowledge can be more dangerous—and embarrassing—than complete ignorance.

Has there been any progress towards this idea? I, too, think it would be a fantastic book and would love to read it.

Edit: I see there’s a wiki page regarding this idea, with some links.

The dangers of a “little learning” are easily offset by pointing out the ways the relevant “simple math” fails in a given case. Cf. Feynman’s (for example) use of analogies. He’d state the analogy, then point out the ways in which the analogy is wrong or misleading, the specific features that fail to map, etc. This strategy gets you the pedagogical benefits of structure mapping while minimizing the risk (that Bill Swift warns against, supra) that a little learning will be mistaken for a great deal.

There are some laudable attempts at such a book by a few people; the first one coming to mind is *The Computational Beauty of Nature*. Although it covers only a few fields, it’s still a great book for the “not-afraid-of-a-few-basic-equations” crowd. Wish there were more books like that.

Add the information to Connexions. (http://cnx.org/) It seems built for just such a purpose and was highlighted in one of the TED talks a year or so ago if anyone wants to go watch a video overview.

Beautiful idea!

Is a Wiki separate from Wikipedia needed?

Similar problem: One thing I run into often on Wikipedia is entries that use the field’s particular mathematical notation for no reason other than that particular symbols and expressions are the jargon of the field. They get in the way of understanding what the entry is saying, though.

A similar problem: there seem to be academic papers that have practical applications and yet are written to be as unclear as possible—perhaps to take on that “important” sheen, perhaps simply because the authors are deep in their own jargon and assume all readers know everything they know. Consider papers in the AI field. :)

Pete: I was just thinking the same thing, that we ought to start a wiki to do this project. Questions do come up, though, like “where ought one draw the line between the simple and nonsimple?” This question relates even to billswift’s comment about the name.

For instance, in physics, ought we include Hamilton’s equations/the Hamiltonian? There’s certainly understanding to be found by considering a system in those terms. But deriving those and so on probably is a bit deeper than what one might want to consider “easy math”… or maybe not. Those are in some ways the starting point that leads to the deep stuff.

There’s probably analogous questions in other fields. So we have to decide what we’re going to consider the “easy” math.
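To make the Hamiltonian case concrete without any derivation, here is about the simplest instance: a unit-mass harmonic oscillator with H = p²/2 + q²/2, stepped directly from Hamilton’s equations dq/dt = ∂H/∂p and dp/dt = -∂H/∂q (a sketch with illustrative step sizes, using a leapfrog integrator so the energy stays put):

```python
def simulate(q, p, dt=0.01, steps=1000):
    """Integrate the oscillator H = p^2/2 + q^2/2 via Hamilton's equations:
    dq/dt = dH/dp = p,  dp/dt = -dH/dq = -q  (kick-drift-kick leapfrog)."""
    for _ in range(steps):
        p -= 0.5 * dt * q   # half kick
        q += dt * p         # drift
        p -= 0.5 * dt * q   # half kick
    return q, p

q, p = simulate(1.0, 0.0)
print(round(0.5 * (p * p + q * q), 4))  # energy stays at ~0.5, the initial H
```

Energy conservation falling out for free is exactly the kind of understanding the Hamiltonian viewpoint buys you.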

My suggestion would be not to draw the line… but to grade things on how hard they are (fundamental, basic, intermediate...).

That way, anybody can start, and can stop at any time they want to...

I’ve been reading a book similar to what you have in mind I think. It’s “Mathematics: From the birth of numbers” (http://www.amazon.com/Mathematics-Birth-Numbers-Jan-Gullberg/dp/039304002X). It starts very basic but covers all sorts of advanced topics. It’s designed for someone with no higher math learning. I’m about 1/4 of the way through it and so far very impressed.

First off, that book looks wonderful. It looks, just from the description, like it goes deeper into Math, rather than covering the math of other fields. As delightful as Math can be, I’d be much more interested in having a primer on the math of all sorts of other things.

Douglas, if all you say is “some cats have more babies than other cats” then you have missed out the key element of heritable variation and therefore haven’t said anything about evolution by natural selection.

Here’s Hanson’s take on the Doomsday argument:

http://hanson.gmu.edu/nodoom.html

Link is 404. I can’t find Hanson’s article elsewhere. I so hate link rot.

Steve, would you care to elucidate what’s ridiculous about the Doomsday argument? I’d be especially interested in an explanation based on the “limitations of statistics” as opposed to a hand-waving argument. The Doomsday argument strikes most people as absurd on its face, and yet it’s surprisingly resistant to refutation. My own opinion is that it’s not absurd at all, and is among the ideas that reveal a deep truth about reality.

Didn’t Stephen Hawking say that his publisher told him that every equation he put in his book would halve the sales? So that’s why real math doesn’t make it into most popular science books, and one of the reasons there’s a band-gap between narrative science and professional texts. Would be nice to have this filled, I agree.

“A little learning is *not* a dangerous thing to one who does not mistake it for a great deal.” (William A. White)

Quoted in Ronald Gross’s *Independent Scholar’s Handbook*, which, unfortunately, is not particularly useful for technical fields.

That’s a GREAT idea. I’ve been trying to do the same as Robin, but the availability of good textbooks is somewhat limited where I live (and they’re quite expensive to import). A volume containing the introductory math for many fields would make things much easier, and I’d certainly be buying it.

Well answered!

Douglas, I’m not saying that there are cats that don’t have heritable variation, any more than you’re saying that there are cats that don’t have varying numbers of offspring. I’m saying that the fact that cats have heritable variation is just as relevant to evolution as the fact that their number of offspring varies.

g—cats without heritable variation? Where you get some of them?

If what you’re proposing is like an “Advanced Mathematical Principles for Dummies,” I think you have a great idea.

You say you don’t have the time, but you could probably get a few people together to put something out: 4-5 people writing two chapters each. The “Dummies” folks would probably publish something like that. I’d consider buying it.

The math of a subject is only valuable when one understands the basic terminology of the subject. As Chris points out, knowing when to use statistics (the basic assumptions and what the word applies to) makes something like the Doomsday Argument good for a laugh. It is ridiculous. On evolutionary biology: Evolution is defined as “any change in the frequency of alleles within a gene pool from one generation to the next.” This frequency changes with each birth. So to make the definition into regular English we could say evolution is defined as “living things reproduce” (the fact of evolution). In modern evolutionary genetics, natural selection is defined as “the differential reproduction of genotypes (individuals of some genotypes have more offspring than those of others).” In English: some cats have more babies than other cats. So the statement “It is a fact that some cats have more babies than other cats” would be the proof of evolution by natural selection as the terms are currently defined. Doesn’t that help more than a mathematical equation?

This doesn’t follow.

Well, the obvious point is that the Copernican Principle is frequently wrong. The Anthropic Principle does a fairly good job at pointing out the weaknesses of the CP, to start with, and remembering that all else is rarely equal takes care of most of the rest.

Now will someone set up a futures market tied to the publication of a book with that title by a non-vanity press within the next 18 months?

I’d sure as hell buy it (well, provided it was not published by Springer and priced accordingly :P)!

Hi Guys,

I am also a lurker/admirer of this site and I would love to have such a book! I will be watching this topic and the wiki linked to, hoping something comes of it. Eventually I will put up simple neuroscience equations.
