I thought I’d share this story about a recent, very strange event involving fixing a problem with a large inferential distance. I don’t know what I should conclude from it, but, even with what I’ve said about “real understanding”, I didn’t expect this to happen.
At work, my technical lead wanted to spend a day with me fixing a bug that was causing major problems on our site. To be helpful on this task, however, I had to get up to speed on the website’s infrastructure. So my lead started the day by explaining it to me, and I made sure to ask for clarification on anything I either didn’t understand OR (and this is important) couldn’t connect to the rest of the system in my mental model of it.
The discussion eventually turned to the matter of what happens when you commit a change to the code, i.e., update it to a slightly newer version.
On that topic, he eventually explained, “Next, the first server tells the other (redundant) servers to act as if they were upgraded to the new version of the code (in technical jargon, it changes the ‘dependencies’ of the other servers and their resulting ‘virtual environment’). But before it actually updates the other servers’ code to the new version, it does one final sanity check, and if that check fails, it does not proceed with updating the servers.
“See the problem, Silas? The dependencies and code are out of sync on all the other servers because the former was upgraded while the latter was not. So it crashes when it tries to use something that it didn’t expect to use.”
I replied, “So, then why not wait until the final test passes before telling the other servers to upgrade their dependencies?”
Short pause as my lead thinks. “Yeah, that would solve this problem.”
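To make the ordering bug concrete, here is a hypothetical sketch in Python. All the names (`Server`, `sanity_check`, the version strings) are invented for illustration; the real system surely works differently, but the ordering problem is the same: if the check fails after the dependencies have already been switched, the servers are left with new dependencies and old code.

```python
class Server:
    """Toy stand-in for one of the redundant servers."""
    def __init__(self):
        self.deps = "v1"   # dependency/virtual-environment version
        self.code = "v1"   # deployed code version

def sanity_check(version):
    # Stand-in for the final sanity check; pretend it fails here.
    return False

def deploy_broken(servers, new_version):
    """Original order: dependencies are switched before the final check."""
    for s in servers[1:]:
        s.deps = new_version          # deps now expect the new code
    if not sanity_check(new_version):
        return                        # check fails: code is never updated,
                                      # so deps and code are out of sync
    for s in servers[1:]:
        s.code = new_version

def deploy_fixed(servers, new_version):
    """Proposed order: run the final check before touching any server."""
    if not sanity_check(new_version):
        return                        # nothing changed, nothing out of sync
    for s in servers[1:]:
        s.deps = new_version
        s.code = new_version
```

With the broken ordering, a failed check strands the redundant servers with mismatched dependencies and code; with the fixed ordering, a failed check leaves everything untouched.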
Well, it turned out this wasn’t some toy problem to teach a lesson; it had actually caused several emergencies. And all my lead had to do to solve the problem was walk through the process the site uses for updating to a new version.
This isn’t uncommon. I’ve had lots of experiences where I go over to someone else’s desk to help with some bug/compile error/etc, but they figure out the cause while explaining the problem, or after I ask a “stupid” question.
Conclusion: try explaining your problems to an inanimate object on your desk. Sometimes the solution is obvious, if you can activate all the necessary concepts in your brain at the same time...
It’s called rubber ducking, after a guy who (allegedly) used a rubber duck as the inanimate object.