Why learning programming is a great idea even if you’d never want to code for a living

Here is the short version:
Writing program code is a good way of debugging your thinking—Bill Venables
It’s short, apt, and to the point. It does have one significant flaw: it uses a term I’ve come to hate, “bug”. I don’t know whether Grace Murray Hopper is to blame for this term and the associated image of an insect creeping into a hapless programmer’s hardware, but I suspect this one word may be partly responsible for the sad state of the programming profession.
You see, a lot gets written about bugs, debugging, testing, and so on. Much of that writing only serves to obscure one plain fact, which, if I were slightly more pretentious, I’d call one of the fundamental laws of software:
Every “bug” or defect in software is the result of a mismatch between a person’s assumptions, beliefs, or mental model of something (a.k.a. “the map”) and the reality of the corresponding situation (a.k.a. “the territory”).
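To make the map/territory point concrete, here is a minimal, hypothetical sketch in Python (the average function and its failing call are invented purely for illustration):

    def average(values):
        # The map: "a list of measurements always has at least one element".
        # The territory: callers sometimes pass an empty list.
        return sum(values) / len(values)

    print(average([3, 4, 5]))  # 4.0 -- here map and territory agree

    try:
        print(average([]))
    except ZeroDivisionError:
        # The defect was never in the division; it was in the belief.
        print("defect: the assumption of non-empty input did not hold")

The division executes exactly as written; what failed was the unstated belief the code faithfully recorded.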
The software industry is currently held back by a conception of programming-as-manual-labor, consisting of semi-mechanically turning a specification document into executable code. On that interpretation, “bugs” or “gremlins” are the equivalent of machine failures: something unavoidable, to be kept in check by rigorous statistical controls, replacement of faulty equipment (programmers), and the like.
A better description would be much closer to “the art of improving your understanding of some business domain by expressing the details of that domain in a formal notation”. The resulting program is not quite a by-product of that activity: it matters, but not nearly as much as the distilled understanding of the domain.
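To see what “expressing the details of a domain in a formal notation” buys you, consider a deliberately tiny, hypothetical example (the shipping rule, threshold, and fee are all invented here). The moment the rule is written as code, it forces a question the prose version never answered:

    FREE_SHIPPING_THRESHOLD = 100.00  # the stated rule: "orders over $100 ship free"

    def shipping_cost(order_total):
        # Writing ">" commits us to one reading of "over $100";
        # ">=" would commit us to the other. The informal rule never said which.
        return 0.00 if order_total > FREE_SHIPPING_THRESHOLD else 7.50

    print(shipping_cost(100.00))  # 7.50 -- is that what the business intended?

Formalizing the rule did more than produce a program: it surfaced a gap in the domain understanding that a prose specification had quietly glossed over.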
You think you know when you can learn, are more sure when you can write, even more when you can teach, but certain when you can program. —Alan Perlis
So, learning how to program is one way of learning how to think better. But wait: there’s more.

An art with a history

It’s easy, if your conception of programming is “something people do to earn a few bucks on freelance exchange sites by coding up Web sites”, to think of programming as an area where only the past five years or so are of any interest. Get up to speed on the latest technology, and you’re good to go.
In fact, programming is a discipline with a rich and interesting history [1]. There is a beauty in the concrete expression of algorithmic ideas in actual programming languages, quite independently of the more mathematical aspects which form the somewhat separate discipline of “computer science”. You can do quite a lot of mathy computer science without needing concepts like modularity, coupling, or cohesion, which are of intense interest to practicing programmers (the competent ones, at any rate) and which have engendered a variety of approaches.
People who like elegant intellectual constructions will appreciate what is to be found in programming languages and, if you can sort the “classics” from the dregs, in the architecture and design of many programs.

Deep implications

Mathematicians are concerned with the study of quantity and structure. Programming requires knowledge of what, despite its being considered a part of mathematics, strikes me as a distinct discipline: the intersection between the theory of computation and the theory of cognition. To program well, you have to have a feel for how computations unfold, but you must also have a well-grounded understanding of how humans parse and manipulate textual descriptions of computations. It is in many ways a literary skill.
What is especially exciting about programming is that we have good reason to believe that our own minds can be understood adequately by looking at them as computations: in some sense, then, to become more familiar with this medium, textual descriptions of computations, is to have a new and very interesting handle on understanding ourselves.
This brief presentation of programming needs to be completed—in further posts—by a look at the “dark side” of programming: biases that are occupational hazards of programmers; and by a closer look at the skill set of a competent programmer, and how that skill set overlaps with a rationalist’s developmental objectives.

[1] This history is sadly ignored by a majority of practicing programmers, to detrimental effect. Inventions pioneered in Lisp thirty or forty years ago are being rediscovered and touted as “revolutions” every few years in languages such as Java or C#: closures, aspects, metaprogramming...