That is quite a bit of conjunction you’ve got going on there. Rather extraordinary if true; I’ve yet to see appropriately compelling evidence of it. Based on the evidence I do see, I think the sequences, at least the ones I’ve read so far, are probably “mostly right”, interesting, and perhaps marginally useful to very peculiar kinds of people for ordering their lives.
I also think the sequences are badly organized, and that you should just read them chronologically instead of trying to lump them into categories and sub-categories, but I digress.
The error in your comment is that the sequences were all created by a few reliable processes, so it’s as much of a conjunction fallacy as “My entire leg will function.” Note that this also means that if one of the articles in the Sequences is wrong, it doesn’t even mean Eliezer has made a grievous mistake. I have Nick Tarleton to thank for this insight, but I can’t find where he originally said it.
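The shape of this argument can be made concrete with a toy model (every number below is made up purely for illustration): if you treat each post as an independent event, "all of it is right" is a giant conjunction and astronomically unlikely; if you instead model the posts as outputs of a single process with uncertain but probably-high reliability, "mostly right" is an ordinary outcome, not an extraordinary one.

```python
import random

random.seed(0)

N = 100   # number of posts (made-up scale)
p = 0.95  # per-post correctness under the naive independent model (made up)

# Naive model: every post is an independent event, so "all N posts are
# correct" is a conjunction of N separate claims.
p_all_independent = p ** N

# Shared-process model: one writer/process with uncertain reliability r.
# Given r, posts are conditionally independent, but unconditionally they
# are correlated because they all share the same r.
priors = {0.99: 0.3, 0.90: 0.5, 0.50: 0.2}  # made-up prior over reliability

def prob_mostly_right(threshold=0.90, trials=20000):
    """Monte Carlo estimate of P(at least threshold*N posts are correct)."""
    hits = 0
    for _ in range(trials):
        r = random.choices(list(priors), weights=list(priors.values()))[0]
        correct = sum(random.random() < r for _ in range(N))
        if correct >= threshold * N:
            hits += 1
    return hits / trials

print(f"P(all {N} posts correct | independent posts): {p_all_independent:.3g}")
print(f"P(>=90% of posts correct | shared process):  {prob_mostly_right():.2f}")
```

Under these invented numbers, the conjunction "every single post is correct" is rare even for a good writer, while "at least 90% correct" is likely whenever the shared process happens to be reliable, which is the work the leg analogy is doing.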
> That is quite a bit of conjunction you’ve got going on there. Rather extraordinary if true; I’ve yet to see appropriately compelling evidence of it. Based on the evidence I do see, I think the sequences, at least the ones I’ve read so far, are probably “mostly right”, interesting, and perhaps marginally useful to very peculiar kinds of people for ordering their lives.
I think I agree with this.
> The error in your comment is that the sequences were all created by a few reliable processes, so it’s as much of a conjunction fallacy as “My entire leg will function.” Note that this also means that if one of the articles in the Sequences is wrong, it doesn’t even mean Eliezer has made a grievous mistake. I have Nick Tarleton to thank for this insight, but I can’t find where he originally said it.
Even professional runners will occasionally trip. Even Terry Tao occasionally makes a math error.
The point is that even highly reliable processes are likely to output some bad ideas over the long term.
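Put numerically (again with made-up figures): even a process that is right 99% of the time should be expected to produce a handful of errors across a few hundred posts, and is almost certain to produce at least one.

```python
# Made-up numbers: a "highly reliable" process with a 1% per-post error
# rate, producing output at roughly the scale of a long sequence of posts.
# Errors are treated as independent per post, given the process's rate.
per_post_error = 0.01
n_posts = 500

expected_errors = per_post_error * n_posts       # grows linearly with output
p_zero_errors = (1 - per_post_error) ** n_posts  # shrinks exponentially

print(f"Expected wrong posts: {expected_errors:.1f}")
print(f"P(no wrong posts at all): {p_zero_errors:.4f}")
```

So a few wrong articles are exactly what a reliable process predicts, rather than evidence against its reliability.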
I think it’s a Kaasism.
The point isn’t being mostly right. It’s being less wrong.