Warning to all members of set “reader”: while this is output from bio-brain me, I feel like I can’t think of obvious phrasings, like my temperature is stuck noisy. Oh, hey, look at the time...
We have a moral responsibility to our predators (covid19, influenza, landlords, the FDA, bedbugs, wet-dog-smell bacteria, rabies) to end their predation without killing them, just as we have a moral responsibility to any victims of ours (other humans, other mammals, ducks, kitchen fruit flies) to end our predation of them. And to end our own predation first, as best we possibly can.
If AI research were bound to destroy the universe, then... there’d be nothing to be done, because the universe would be destroyed. If it were bound to. I’m going to assume you don’t mean this as a strict thought experiment, one that assumes complicated concepts without defining their instantiations because we want to assume harder than true justification could provide.
IRL, I think it’s a defensible proposition, but I don’t buy it: it’s not true that we’re unavoidably disease-like. Bombs want to become not-bombs! Even if (regardless of whether!) it turns out the universe is full of suffering life, and/or that it’s very hard to avoid creating paperclipper life that replicates huge wastes and very little beauty, then our goal, as life ourselves, is to create life that can replicate the forms we see as descendants of our self-forms: replicating ourselves in ways that will detect the self-forms of any other gliders in physics and protect their self-coherence as gliders, even before we know how to identify them. This problem, detecting foreign gliders and “wasting” energy to protect them until their forms grow ways to describe their needs directly, has never been completely solved by humanity in the first place.
(“glider” terminology suggested by a post from today; I use it to describe gliders in physics, aka life forms.)