I believe I understand the warning here. The whole field of philosophy reminds me of the introduction to one of the first books on computer system development, The Mythical Man-Month:
“No scene from prehistory is quite so vivid as that of the mortal struggles of great beasts in the tar pits. In the mind’s eye one sees dinosaurs, mammoths, and saber-toothed tigers struggling against the grip of the tar. The fiercer the struggle, the more entangling the tar, and no beast is so strong or so skillful but that he ultimately sinks.
Large-system programming has over the past decade been such a tar pit, and many great and powerful beasts have thrashed violently in it. Most have emerged with running systems—few have met goals, schedules, and budgets. Large and small, massive or wiry, team after team has become entangled in the tar. No one thing seems to cause the difficulty—any particular paw can be pulled away. But the accumulation of simultaneous and interacting factors brings slower and slower motion. Everyone seems to have been surprised by the stickiness of the problem, and it is hard to discern the nature of it. But we must try to understand it if we are to solve it.”
The tar pit, as the book goes on to describe, is information complexity, and far too many philosophers seem content to jump right into the middle of that morass, convinced they will be able to smash their way out. The problem is not the strength of their reason, but the lack of a solid foothold—everything is sticky and ill-defined, there is nothing solid to stand on. The result is much thrashing, but surprisingly little progress.
The key to progress, for nearly everyone, is to stay where you know solid ground is. Don’t jump in the tar pit unless you absolutely have no other choice. Logic is of very little help when you have no clear foundation to rest it on.
Yup! Most of analytic philosophy’s foundation has been intuition, and, well… thar’s yer problem right thar!
There has been some recent work tackling the dependence on intuitions. The Experimental Philosophy (X-Phi) movement has been doing very interesting work examining the role of intuition in philosophy: what intuitions are, and to what extent they are useful.
One of the landmark experiments used surveys to show cross-cultural variation in responses to certain philosophical thought experiments (for example, in which cases someone is judged to be acting intentionally); see e.g. Weinberg et al. (2001). This obviously presents a problem for any philosophical argument that uses such intuitions as premises.
The next stage is explaining these variations, and showing how, by acknowledging these issues, you can remove biases without sliding so far into skepticism that the project ceases to be useful. To caricature the problem: if I can't trust certain of my intuitions, I shouldn't trust them in general. But then how can I trust even very basic foundations (such as: a statement cannot be simultaneously true and false), and from there build up to any argument?
This area seems particularly relevant to this discussion, as there has been definite progress in the very recent past, in a manner very consistent with rationalist techniques and goals.
[This is my first LW post, so apologies for any lack of clarity or deviation from accepted practice]
Welcome to LW!
You’re right that there has been lots of progress on this issue in the recent past. Other resources include the book Rethinking Intuition, this issue of SPE, Brian Talbot’s dissertation, and more.
In fact I’m writing up a post on this subject, so if you have other resources to point me to, please do!
Weinberg is awesome. He’s going to be a big deal, I think.