I didn’t downvote any of your comments, and I don’t see any upthread comments with any downvotes!
Anyway, you could steelman your case if you like. It might help if you compared to other technologies, like “We should develop powerful thing X but not superficially similar powerful thing Y because Y is much worse given that there are governments”, or something.
Okay!
I’m not universally arguing against all technology. I’m not even saying that an arms race means this tech is not worth pursuing; just be aware that you might be starting an arms race.
Intelligence-enhancing technologies (like superintelligent AI, connectome-mapping for whole brain emulation, human genetic engineering for IQ) are worth studying in a separate bracket IMO because a very small differential in intelligence leads to a very large differential in power (offensive and defensive, scientific and business and political, basically every kind of power).