Coding doesn’t directly improve rational thinking, but it improves logical thinking, because it is basically a generalized approach to problem solving. It is also a skill that will lead to a much better understanding of topics like game theory and Friendly AI, and it seems pretty obvious to me that in the future, more and more interesting and rewarding jobs will require coding skills.
There was a time when secretaries had to use pen and paper. Then they had to learn how to use a typewriter. Then they had to learn how to use MS Word and Outlook. Today some of them need to write Excel scripts. Secretaries.
So if you’re not a coder yet, and don’t have any specific reason not to spend a considerable amount of time and effort to learn how to code, I strongly urge you to do so.
That said, I disagree with the choice of programming languages to start with. In general, for someone with no coding experience and no inherent affinity for working under artificial constraints, I suggest starting with a scripting language like Lua instead of a full-blown programming language. Or, if you play PC games, check whether some of your favourites have modding tools with their own scripting language, as is often the case with strategy games. You won’t learn all that much about actual programming, but the impact on your logical thinking should be similar, and you will probably be more motivated as well.
If you insist on learning a real language, don’t start with a weakly typed, high-level, or interpreted language (Python, Java, Objective-C, BASIC, JavaScript, C#, etc.). I also can’t advocate Ruby as a beginner language because of its syntax.
What I can advocate is Ada (more or less an improved version of Pascal), a language that will not only lead to a deeper understanding of programming logic, but also automatically teaches good programming practices. For example, the command “assign the value 0 to the variable x” is written as “x := 0”, while in many other languages the same thing is written as “x = 0”, which creates the illusion of equivalence with the mathematical expression “x = 0”, something that many people new to coding struggle with.
Do not learn C. I repeat: Do Not Learn C.
If you’re absolutely sure your first language has to be one that is currently widely used in the industry, I can reluctantly recommend C++, but only if you’re really, really sure about this. Better to start with Ada, then switch to C++.
I don’t agree with the language advice. Different languages teach different things.
C is a good language for learning how the machine works, what memory looks like at a low level, and so forth. It’s lousy for getting most practical work done in.
Python is a nice language for munging text and other data. It’s pretty good these days for numerical work, graph generation, and so forth (thanks to numpy and matplotlib).
JavaScript is good if you want to use a web browser to interact with your programs, and if you’re prepared to also learn HTML alongside learning to program.
My sense is that Java/C#/Objective-C/C++ are too complex to be really good “first languages”. Useful professionally, useful for many kinds of programming, but not great for learning.
There are a lot of good intro-programming resources for Java, C, and Python. My impression is that Java and Python are the most popular “first languages” academically. I don’t believe Ada has nearly as much traction in that niche, which means there will be less support.
I think probably Python is the best bet, all-in-all.
C++ is a better language than C in every single regard, including for learning. You don’t need to learn OOP or exception handling to use C++, but you can still use proper strings, streams, and so on. There is absolutely no reason to use C rather than C++, except when you’re building libraries for existing C codebases. The only thing C teaches that C++ doesn’t is bad habits. If you’ve ever had to work with C++ code written by a C coder, you know what I mean.
Python is a nice language, but if you want to learn how to code for the sake of improving the quality of your logical thinking, I don’t see any advantage it has over scripting languages, which are easier to learn. The same goes for JavaScript.
By the way, Pascal was specifically developed as a teaching language, and Ada improved on it. The only reason schools today mostly don’t teach Pascal/Ada anymore is that C (and later C++) emerged as the dominant language in the industry, mostly due to its performance and because you can go all the way down to assembly if you want to. So a language great for teaching was largely abandoned in favour of a language great for making money. Similar things are now happening with web languages like Java, JavaScript, HTML5, PHP, …
So I guess it’s best to decide what your priorities are and proceed from there.
Why are people downvoting this and other comments? They are on topic and are well and persuasively written. People might disagree, but surely replying is a better way of doing that than voting down?
There was a time when secretaries had to use pen and paper. Then they had to learn how to use a typewriter. Then they had to learn how to use MS Word and Outlook. Today some of them need to write Excel scripts. Secretaries.
The Unix line editor ed (which nobody uses any more) and the typesetting system roff (whose descendant nroff is today only used for Unix manual pages) were once used by the patent attorneys at AT&T.
What specifically is wrong with Ruby’s syntax? (I don’t know much about comparative programming languages.)

It’s not that it’s wrong or bad, just that it’s unusual in some ways and generally not very readable. This comes primarily from Ruby treating practically everything as an object. You’ll also be using more characters like # and @, which makes learning more difficult and frustrating; in most languages you can do without these as long as you don’t use pointers.
I’m not sure what you refer to when you say “comparative programming languages”...

I love this feature. Apart from allowing some amazing library implementations, it just leaves me with a warm tingly feeling inside.
I still have warm memories of, when I was first teaching myself Smalltalk, trying to look up the Smalltalk equivalent of a for loop in a reference chart, being unable to find it, and later discovering that “to: by: do:” was defined as a method on class Integer.

This delighted me in ways difficult to express.
If you insist on learning a real language, don’t start with a weakly typed, high-level, or interpreted language (Python, Java, Objective-C, BASIC, JavaScript, C#, etc.).
Depends on your goal. C# is wonderful for dipping your toes into coding and immediately being able to make something useful to yourself. Javascript is an incredibly flexible language that doesn’t require anything more than a browser and a text editor. Ada is great if you want to invest in a more hardcore understanding of how code works.
I personally start people on either Javascript (no messing about with installing compilers, no need to understand types) or C# (amazing UI, perfect for a novice to lay out applications in—most people are used to windows, not terminals!!)
C# is a great language, but not a good starting language because you need to deal with a lot of secondary elements like header files, Visual Studio with its unintuitive solution system, or the .NET framework before you can code anything of practical value.
If you want to learn coding for the sake of improving the quality of your logical thinking, low-level languages are the way to go, and I’m not aware of any language that teaches this better than Ada.
If you want to see quick results, go learn a scripting language. They’re all pretty much the same; Lua just seems to be the most popular these days.
There are also lots of “esoteric” languages that are designed to fit specific (often absurd) programming philosophies rather than maintain functionality; I know there are some that aim to make coding as painful as possible (Brainfuck and Whitespace come to mind), but there may also be some that teach programming logic especially well. I’m not particularly knowledgeable about this huge field of languages, so I leave recommendations to someone else.
I’m not sure when you last used C#, but solutions are only used if you want to group two or more separate projects together, and they’re fairly intuitive. There’s a single unintuitive bit: running the main project doesn’t rebuild the others, but it does pop up a warning to that effect. I don’t think I’ve ever seen someone struggle with them outside the complexities of a multi-user office where the solution file is stored in Subversion (solved by not storing solution files in SVN!).
Equally, I’m not sure why the “.NET framework” would add any complexity. The ability to look at a control in a UI and see all of its attributes is something most people find a lot more intuitive. The ability to double-click a control and add an obvious default event is also very helpful, in my experience with teaching.
Header files, I will concede. However, for basic programs C# automatically includes the necessary ones, so it’s not something that really comes up until you’re doing more advanced code.
I agree with the second half of this. Pick a language that suits your needs. I use Visual Basic extensively for interfacing with spreadsheets, and I wrote a Bejeweled program (which sucked, since it took comparatively forever to get the color of a pixel and tell me what color the square was) in AutoHotkey, an awesome program that lets you remap hotkeys on your computer. I know a bit of PHP and C++, but the vast majority of what I do is in VB and AutoHotkey, because that’s what’s most accessible to me.
Great post.