@TsviBT I don’t know if you were the one who downvoted my comment, but yeah I don’t think you’ve engaged with the strongest version (steelman?) of my critique. Laws (including laws promoting genomic liberty) don’t carry the same weight during a cold war as they do during peacetime. Incentives shape culture, culture shapes laws.
And the incentives change significantly when a technology upsets the fundamental balance of power between the world’s superpowers.
Or maybe you’re arguing “don’t develop any technology” or “don’t develop any powerful technology” because “governments might misuse it”. That’s something you could reasonably argue, but I think you should just argue that in general if that’s what you’re saying, so the case is clearer.
I didn’t downvote any of your comments, and I don’t see any upthread comments with any downvotes!
Anyway, you could steelman your case if you like. It might help if you compared to other technologies, like “We should develop powerful thing X but not superficially similar powerful thing Y because X is much worse given that there are governments”, or something.
Okay!
I’m not universally arguing against all technology. I’m not even saying that an arms race means this tech is not worth pursuing; I’m just saying be aware that you might be starting one.
Intelligence-enhancing technologies (like superintelligent AI, connectome-mapping for whole brain emulation, and human genetic engineering for IQ) are worth studying in a separate bracket IMO, because even a small differential in intelligence leads to a very large differential in power (offensive and defensive, scientific, commercial, and political: basically every kind of power).