Re-reading her post it seems plausible she also does not understand/see all implications of “boundedness” selection pressures, idk. If this is the case I’d concede that neither of you gets this point.
Which responses specifically? The Lonelyton reply addresses whether some selection continues, not whether selection's direction is what you believe it is. I don't think in any other response you explained why 'increased intelligence/adaptability' is such a small niche in natural evolution, or why Land's/your argument about the eschaton would be so much better than other arguments about eschatology, or addressed most of what I'm writing about. I made the arguments in somewhat compressed form, but Claude can expand/explain them
do you think bacteria and ants have a stronger shot at winning the lightcone than humans?
in general, if you don’t think intelligence gives a significant advantage, why would you worry about ASI?
eschatology: please consider that it's not me who says a superintelligence will take over the universe. my claim is simply that, if that's the case, its main goal won't have been some dumb, unchanging goal. the eschaton is something you continually bring up, together with the necessity of preventing it.