TruthMapper scares me, for the same reason that some Objectivists I used to know scared me: they thought they had a formal deductive proof running from “A is A” to “Taxation is slavery”, with each step justified by an inference rule.
See, for example, the proof that commercialization of fine art hurts society.
I’m not sure whether TruthMapper encourages people to be sloppy, or whether it’s such a good tool that the sloppiness is just much more obvious than it would be on a message board.
But I’m inclined to lay a bit of the blame on the site itself. For one thing, the video claims that it lets people make all assumptions explicit, which I take to mean that the company behind it believes that. For another, the entire philosophy seems to be that argument should work like an Aristotelian syllogism, and that’s part of the problem. For a third, I can’t take them seriously with that logo. Did they pay the designer per Photoshop layer effect used?
Debategraph looks like a mind map kind of thing. I suppose if that’s the way you like seeing your information organized, it could be useful. I’m just wary of the whole concept of formalizing debate too much (by formal, I mean formal as in official, not formal as in formal systems). Once you start thinking like a high school kid at Debate Club, you’ve already lost, and I worry these sites could encourage that mode.
The idea of truth-seeking software is a good one, but there’s got to be a way to avoid aiming it at the lowest common denominator.
I used to be quite interested in that kind of technology; I had even set up a few experiments on a wiki, though they never went that far. I used to argue that such sites could be a good way of creating information on divisive issues, as an alternative to having both sides set up their own resources and avoid linking to good arguments from the other side.
I guess I’ve since lost interest in them, and don’t think they’re that useful. Someday I’ll have to go back and try all the “high-tech debate” sites that have sprung up, but I’m now more skeptical about the benefit of those kinds of “debate technology”. (One red flag: I don’t feel that inclined to participate in them; at least, much less than I would in forums or blog comments.)
I think having publicly edited “chains of reasoning” could be interesting, because they could help show someone where others might disagree with his logic. If the Objectivists you mention had their formal proofs laid out for public criticism, they’d probably be forced to admit that the proofs aren’t as strong as they thought.
In other words, I don’t think “pyramids of logic” have much value, but these sites might help point out the weaknesses of pyramids of logic to those who rely on them too much (Blaise Pascal, I’m looking at you).
If I am not mistaken, you have several criticisms of TruthMapper. I’ve tried to respond to them in a carefully numbered fashion; this separation might be a rough approximation of the way a software tool would structure an argument.
1. A proof from the sole premise ‘A is A’ concluding ‘Taxation is slavery’ is certainly fallacious, I agree. But can you expand on what the ‘same reason’ is? I’m not sure what I’m expected to see in the argument you reference. It is awkward at the very least, but it is more detailed, concrete, and falsifiable than many trollish claims, and some of its flaws are pointed out in the critiques.
2. The site may encourage people to be sloppy in their argumentation, or it may make sloppiness more obvious.
3. The video’s claim that the tool makes “all assumptions explicit” is indeed false, and I agree that it diminishes my trust in the organization.
4. I’m not sure what you mean by “argument should work like an Aristotelian syllogism”. There are many flaws in syllogisms; the one I remember is the inability to prove that a horse’s head is an animal’s head. Because the claims are natural-language text, the structure TruthMapper enforces is looser than a syllogism: merely a tree of claims and supporting claims.
5. You’re entirely correct: the logo is not good.
6. Paying official attention to argumentation may encourage making it a status contest, with individuals striving to “win” rather than striving to discover the truth. This is a thorny problem for rationality, but I don’t think it is confined to argumentation software.
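That “tree of claims and supporting claims” is a simple enough structure to sketch in a few lines. This is a hypothetical toy model of my own, purely to illustrate the shape of the data; it has nothing to do with TruthMapper’s actual implementation, and all of the names are invented:

```python
# Toy model of a "tree of claims and supporting claims" -- an
# illustrative sketch, not TruthMapper's real schema.
from dataclasses import dataclass, field


@dataclass
class Claim:
    text: str                                   # natural-language statement
    supports: list["Claim"] = field(default_factory=list)

    def add_support(self, text: str) -> "Claim":
        """Attach a new supporting claim and return it."""
        child = Claim(text)
        self.supports.append(child)
        return child

    def outline(self, depth: int = 0) -> str:
        """Render the tree as an indented outline."""
        lines = ["  " * depth + "- " + self.text]
        for child in self.supports:
            lines.append(child.outline(depth + 1))
        return "\n".join(lines)


root = Claim("Taxation is slavery")
root.add_support("A is A")
print(root.outline())
```

Note that nothing here enforces that a child actually supports its parent; the tree only records that someone asserted it does, which is exactly the looseness described above.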
I think we only disagree on 4. (You agree with me on 2, 3, and 5, and I agree with you that 6 is not confined to software.) I think the expansion of 1 you want really is 4, and I admit I explained 4 poorly. It is kind of tangled in my own head, but maybe I can do better:
TruthMapper encourages people to think that an argument about politics or religion or culture is structured like a deductive proof, where if there’s a problem, it’s because someone accidentally used (A → B) and (B) to conclude (A), or something silly like that. The real problems with all of these arguments are that no one has grounded their morality properly, that people treat generalizations as universals, import hidden assumptions, think proving a single major negative of a disliked theory is enough without running a cost-benefit analysis, use words wrongly, and so on.
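For what it’s worth, that “silly” pattern really is checkable by brute force: enumerating all truth assignments confirms that (A → B) together with (B) does not entail (A), while genuine modus ponens does entail its conclusion. A quick sketch of my own (nothing to do with either site):

```python
# Brute-force validity check over all truth assignments to A and B:
# an inference is valid iff the conclusion holds in every row where
# all premises hold.
from itertools import product


def implies(a: bool, b: bool) -> bool:
    return (not a) or b


def valid(premises, conclusion) -> bool:
    return all(
        conclusion(a, b)
        for a, b in product([False, True], repeat=2)
        if all(p(a, b) for p in premises)
    )


# Affirming the consequent: (A -> B), (B), therefore (A) -- invalid.
affirming_consequent = valid([implies, lambda a, b: b], lambda a, b: a)
# Modus ponens: (A -> B), (A), therefore (B) -- valid.
modus_ponens = valid([implies, lambda a, b: a], lambda a, b: b)
print(affirming_consequent, modus_ponens)  # False True
```

Of course, this only checks the one failure mode the software is good at catching; none of the deeper problems listed above (ungrounded premises, hidden assumptions, misused words) show up in a truth table.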
But upon further thought, you’re right that this is the program making a common flaw more obvious, not the program creating the flaw. Still, if I encountered the argument about art on a message board, I would try to explain why the whole argument was hopeless because of these points, and how the person’s style of argument could become more rigorous; whereas on TruthMapper, I am reduced to sniping at why Point 4 doesn’t follow from Point 3.
But I’m open to testing the system empirically. I trust the people here to avoid the sort of mistakes the people in the art argument made. If you want to organize a LessWrong debate about something on TruthMapper, I’ll participate, and I’ll change my mind if the debate goes better than it would in a comment thread here.
Awesome; I think we may have actually communicated.
Despite my posting these things, I don’t really want to organize a LessWrong debate on TruthMapper or DebateGraph. Their user interfaces are both so clumsy and annoying that I’d rather wait (or work) for something more pleasant to use.
Good analysis, Yvain. I guess TruthMapper could be handy when some reasoning pushes the limits of your working memory, yet the emphasis on debate and persuasiveness grates almost as much as the ghastly web design.
The first thing I noticed was “What is not fully understood is not possessed.”—Goethe. I’ve got a car, a book on string theory and a spleen that I am quite confident I possess. Annoying.
Would you prefer Feynman’s formulation?