Sure, what the heck. Ask me stuff.
Professional stuff: I work in tech, but I’ve never worked as a developer — I have fifteen years of experience as a sysadmin and site reliability engineer. I seem to be unusually good at troubleshooting systems problems — which leaves me in the somewhat unfortunate position of being most satisfied with my job when all the shit is fucked up, which does not happen often. I’ve used about a dozen computer languages; these days I code mostly in Python and Go; for fun I occasionally try to learn more Haskell. I’ve occasionally tried teaching programming to novices, which is one incredible lesson in illusion of transparency, maybe even better than playing Zendo. I’ve also conducted around 200 technical interviews.
Personal stuff: I like cooking, but I don’t stress about diet; I have the good fortune to prefer salad over dessert. I do container gardening. I’ve studied nine or ten (human) languages, but alas am only fluent in English; of those I’ve studied, the one I’d recommend as the most interesting is ASL. I’m polyamorous and in a settled long-term relationship. I get along pretty well with feminists — and think the stereotypes about feminists are as ridiculous as the stereotypes about libertarians. My Political Compass score floats around (1, –8) in the “weird libertarian” end of the pool. I play board games; I should probably play more Go, but am more likely to play more Magic. I was briefly a Less Wrong meetup organizer.
What’s the best programming language to learn in order to get a job? Or a good job, if the two answers would differ.
(Open question; it’s too bad there isn’t an “ask everyone who works in tech” thread or somesuch. For background, I used to know Java, as well as BASIC and bits of assembly, but a series of unfortunate chance events distracted me from programming about five years ago and I haven’t done any since.)
Eh, depends on what sort of job.
In my line of work, Python or maybe Ruby — they’re both widely used by major employers, particularly for automation tools.
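To make “automation tools” concrete, here’s a toy sketch of the sort of glue script that fills a sysadmin’s day. It’s a hypothetical illustration (the function name and the 90% threshold are mine, not any particular production tool):

```python
#!/usr/bin/env python3
"""Toy disk-space check, the bread and butter of sysadmin automation."""
import shutil


def disk_usage_percent(path="/"):
    """Percentage of the filesystem holding `path` that is in use."""
    usage = shutil.disk_usage(path)  # named tuple: total, used, free
    return 100 * usage.used / usage.total


if __name__ == "__main__":
    pct = disk_usage_percent("/")
    print(f"/ is {pct:.1f}% full")
    if pct > 90:  # arbitrary threshold for the sketch
        print("warning: disk nearly full")
```

Real versions of this mostly differ in where the alert goes (a pager, a ticket queue) rather than in the check itself.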
But Java if you want to write for business computing; C# if you want to write for Windows; Objective-C if you want to write for the Mac or iGizmos; PHP if you want Great Cthulhu to rise from his tomb at R’lyeh. And Perl, Python, or Ruby and a smattering of shellscript if you want to do systems stuff.
Also C for a lot of embedded-systems work, and C++ ditto (and also for a fair amount of applications and a whole lot of what you might call scientific computing: computer vision, financial simulations, game engines, etc.; but C++ is another Great Cthulhu Language).
Also, even if your only real interest is in getting a good job, it is very worthwhile learning more languages, preferably highly varied ones. The ideas that are natural or even necessary in one language may be useful to have in your mental toolbox when working in another. Consider, e.g., (1) some variety of assembly language to get a better idea of what the machine is actually doing, (2) a functional language like Haskell to show you a very different style of software design, (3) Common Lisp for its unusual (but good) approaches to OO and exception handling and to show you what a really powerful macro system looks like, (4) some languages with very different execution models—Prolog (unification and backtrack-based searching), Forth or PostScript (stack machine), Mathematica (pattern-matching), etc.
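To give one tiny taste of how these ideas migrate: the build-the-answer-from-expressions style that languages like Haskell force on you also works in Python, and is often cleaner than the mutate-an-accumulator loop. Both versions below compute the same thing:

```python
nums = [1, 2, 3, 4, 5, 6]

# Imperative style: mutate an accumulator in a loop.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Functional style, borrowed from languages like Haskell:
# build the answer out of expressions, with no mutation.
total_fp = sum(n * n for n in nums if n % 2 == 0)

assert total == total_fp == 56  # 4 + 16 + 36
```

Once the second form feels natural, you start reaching for it in every language that allows it.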
Warning: the more different languages you are familiar with, the more you will notice the annoying limitations of each particular language.
You could start one.
I’ve occasionally tried teaching programming to novices, which is one incredible lesson in illusion of transparency, maybe even better than playing Zendo.

How typical do you think your experience has been in this regard? IME, teaching programming to a complete novice has been cruise-control stuff, and one of the relatively few things where I know exactly what’s going on and where I’m going within minutes of starting.
For context: I’ve had success teaching a complete novice (someone with only a vague memory of how high-school math uses variables) to write his own VB6 scripts to automate simple tasks: retrieving data from, and sending data to, fields on a screen using the scripting engine’s predetermined native functions, which I taught him to look up and learn from the available and comprehensive reference files. This was on maybe my third or fourth attempt at teaching programming.
What I actually want to know is how typical my experience is, and whether there’s value in analyzing what I did in order to share it. I suspect I may have a relatively rare mental footing, perspective, and interaction of skill sets here, but I may be wrong, and/or this may be more common than I think, which would invalidate it as evidence for the former.
I think it would be a very good idea to analyse what you’re doing, and probably valuable to have some transcripts of sessions — what you think you’re doing may not be what you actually do.
Do you teach in person? By phone? I’m wondering how much you use subtle clues to find out what your student is thinking.
Usually, in person (either as a tag-team or “I’ll be right over here, call me when you’re stumped” approach; I’ve experimentally confirmed that behind-the-shoulder teaching has horrible success rates, at least for this subject), though a few times by chat / IM while passing the code back and forth (or better yet, having one of those rare setups where it’s live-synch’ed).
TL;DR: Look at examples of wildly successful teaching recipes, take cues from them and from LW techniques and personal experience at learning, fiddle a little with it all, and bam, you’ve got a plan for teaching someone to program! Now you just need pedagogical ability.
My general approach is to feel out what dumb-basics they know by looking at it as if we were inventing programming piecemeal, naturally with my genius insight letting us work out most of the kinks on the spot. I also go straight for my list of Things I Wish Someone Had Told Me Sooner, the list of Things That Should Be In Every Single So-Called “Beginner’s Tutorial To Programming” Ever, and the list of Kindergarten Concepts You Need To Know To Create Computer Programs — written versions pending.
For instance, every “Beginner’s Tutorial to Programming” I’ve ever seen fails to mention early enough that all the code and fancy stuff they’re showing is nice and all, but that to get meaningful user interaction and output from your program to other things (like the user’s screen: making windows appear and putting text and buttons in them!) you have to learn to find the right APIs, the right handles and calls to make. I’ve yet to see a single tutorial, guide, textbook, handbook, or “crash course” that actually teaches how to do that, short of trial-and-error or a human looking over what you did. So this is among the first things I hammer into them:
“You want to display a popup with yes/no buttons? Open up the Reference here; search for ‘prompt’, ‘popup’, ‘window’, ‘input’, or anything else that seems related; swim around until you find something that looks like it does what you’re trying to do; copy the examples given as closely as possible in your own code, changing only things you’ve already mastered; and try it!”
...somewhat like this, though that’s only for illustration. In a real setting, I’d be double-checking every step of the way there that they remember and understand what I told them about Developer References earlier on, that their face doesn’t scrunch up at any of the terms I suggest for their search, that they can follow the visual display / UI of this particular reference I’m showing them (I’m glaring at you, javadoc! You’re horribly cruel to newbies.) and find their way around it after a bit of poking around, and so on.
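For concreteness, here’s roughly where that exercise might land, sketched in Python rather than the VB-era scripting of the anecdote. The name `ask_yes_no` and the injectable `read` parameter are my own hypothetical choices; the pieces (`input()` and the string methods) are exactly what a beginner would find in the reference:

```python
def ask_yes_no(question, read=input):
    """Keep asking until the answer is recognizably yes or no.

    `read` is injectable so the function can be exercised
    without a live user at the keyboard.
    """
    while True:
        answer = read(question + " [y/n] ").strip().lower()
        if answer in ("y", "yes"):
            return True
        if answer in ("n", "no"):
            return False
        print("Please answer yes or no.")
```

The beginner’s version would hard-code `input`, of course; the point is that every piece of it is copied from a documented example.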
Obviously, that’s nowhere near the first thing to tackle, though. Most tutorials devote approximately twelve words to the entire idea of variables, which is ridiculous given that most people barely remember their math classes from high school, and never had the need or chance to wrap their head around the concept of variables as it stands in programming. Just making sure a newbie is comfortable with the idea that a variable doesn’t have one set value takes real work. (I pointedly ignore constants at that point; it’s completely unnecessary and utterly confusing to mention them until they have an actual need for them, which is way, way later. Until then they can just leave raw values right in the source code.) The same goes for the ideas that a variable will probably change as the program runs, and that while it won’t change on its own, programs get big, so you should always assume it could be changed somewhere else. There are so many concepts that already-programmers and geeks and math-savvy people gloss right over; of course those outside those elites aren’t going to understand a thing when you start playing guerrilla on their brains with return values, mutable vs. immutable, variable data types, privates and scopes, classes vs. instances, statics, and all that good stuff.
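Concretely, the two facts about variables I most want a newbie comfortable with look like this in Python (a minimal sketch; any language with mutable collections shows the same thing):

```python
# Fact one: a variable holds whatever it was most recently assigned.
x = 3
x = x + 1      # x is now 4; the old 3 is simply gone

# Fact two: a value can change "somewhere else". Two names can refer
# to the same mutable object, so touching one affects the other.
a = [1, 2, 3]
b = a          # b is another name for the *same* list, not a copy
b.append(4)
print(a)       # [1, 2, 3, 4], even though `a` was never touched by name

# Immutable values (numbers, strings) don't have this trap:
s = "abc"
t = s
t = t + "d"    # builds a new string; s is unaffected
print(s)       # still "abc"
```

The second fact is the one that reliably surprises people, which is exactly why I front-load it.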
Buuut I’m rambling here. I suppose I just approach this as a philosophical “blend” between facilitating a child’s wonder-induced discovery of the world and its possibilities, and a drill sergeant teaching raw recruits which fingers to bend how in what order and at what speed to best tie their army boot shoelaces and YOU THERE, DON’T FOLD IT LIKE THAT! DO YOU WANT YOUR FINGERS TO SLIP AND DROP THE LACE AND GIVE YOUR ENEMY TIME TO COME UP BEHIND YOU? START OVER!
Of course, it might be my perspective that’s different. I was forewarned about the difficulty of teaching programming, both by my own trudging, crawly, slow learning of it and by others, and as silly as it might sound, I have a lot more experience than the average expert self-taught whiz programmer at learning how to program, since I took such a sinuous, intermittent, unassisted and uncrunched road through it.
Anecdotally, I think I’ve re-learned what classes and objects were (after forgetting it from stopping my self-teaching for months) at least eight times. So I have at least eight different, internal, fully-modeled experiences of the whole process of learning those things and figuring out what I’m missing and so on, without anyone ever telling me what I was doing or thinking wrong, to draw from as I try to imagine all the things that might be packed and obfuscated in all the abstracts and concepts in there.
Do you have a view on Scala?
Never tried it.
How’d you get to be this way?
I’m not sure, but one of the techniques that seems most salient to me is breadth-first search. Partly this is to hold off on proposing solutions: take just a little bit longer to look at the problem and gather data before generating hypotheses. The second part is to find cheap tests to disprove your hypotheses instead of going further down the path that an early hypothesis leads. Folks who search depth-first, either elaborating a large tree of sub-hypotheses up front or following a long path of possible tests and fixes, seem more likely to get stuck.
I also really like troubleshooting out loud with colleagues who aren’t afraid to contradict each other. Generating lots of hypotheses and quickly disconfirming most of them can quickly narrow down on the problem. “Okay, maybe the cause is a bad data push. But if that were so, it would be on all the servers, not just the ones in New York, because the data push logs say the push succeeded everywhere. But the problem’s just in New York. So it’s not the data push.”
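That back-and-forth can be modeled as a tiny filter over hypotheses, each paired with a cheap consistency check against what you’ve observed. The region names and observations below are invented purely for illustration:

```python
ALL_REGIONS = {"nyc", "lon", "sfo"}

observations = {
    "affected": {"nyc"},                # only New York is broken
    "push_succeeded_everywhere": True,  # per the data-push logs
}


def bad_data_push(obs):
    """Only consistent if every region broke and the push had failed."""
    return obs["affected"] == ALL_REGIONS and not obs["push_succeeded_everywhere"]


def nyc_local_issue(obs):
    """Consistent if the breakage is confined to New York."""
    return obs["affected"] == {"nyc"}


hypotheses = {"bad data push": bad_data_push, "NYC-local issue": nyc_local_issue}
surviving = [name for name, fits in hypotheses.items() if fits(observations)]
print(surviving)  # ['NYC-local issue']
```

The value of doing it out loud with colleagues is that each person contributes both new hypotheses and new cheap checks, so the surviving list shrinks fast.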
Thanks!