Overall, specific errors in reasoning should generally be highlighted instead of arguing that the other person is biased. One reason is that such an accusation is an ad hominem attack; I think such indirect methods of analyzing the rationality of an argument have an alarming potential to provoke mind-killing.
The more obvious and important reason is that citing a logical error, fallacy, or bad interpretation of data is much more reliable than trying to read emotional cues for whether someone is biased; this is especially true given the limited insight we have into each other's minds.
A more correct (less wrong?) restatement of the criticism is that Kurzweil seemingly allows his own hope for immortality to bias his predictions as to when immortality will be achieved.
Isn’t it? If there are significant causal pressures on his conclusions other than the reality of the thing he’s trying to draw conclusions about, then it’s not clear how his conclusions would become/remain entangled with reality.
This is not an important criticism; it is ad hominem in its purest form.
Prediction: Given the right incentive and five minutes to think, Eliezer would be able to give an example of a criticism that is a purer form of fallacious ad hominem. I am only slightly less confident that a randomly selected 15-year-old student could do the same, allowing the 'five minutes' to include an explanation of what ad hominem means if necessary.
In certain contexts, where someone is relying on another person's expertise and lacks the resources to evaluate the details of a claim, relying on experts makes sense. If a given potential expert has a reason to be biased, that's a reason to rely on that expert less.
You mean when criticizing his timeframes one should actually point out real flaws instead of just pointing out how they nicely align with his life expectancy?
At first glance I totally fail to see the ad-hominem, maybe a second will help.
True. I should have said ‘popular.’ I’ve updated.
Perhaps a better criticism is that his prediction timeframes are the opposite of conservative estimates.
How so?