I don’t find it amazing or anything. It’s more like… I don’t know how to write the pseudocode for an AI that actually cares about human welfare. In my mind, that is pretty close to something that tries to be aligned. But if even evolution managed to create agents capable of this by accident, then it might not be that hard.
I initially had a paragraph explaining my motivation in the question, but then removed it in favor of brevity. Kind of regretting that now, because people seem to read into this that I think of evolution as some spirit or something.
But, like, evolution made a bunch of other agents that didn’t have these properties.
Yes. The right word in my title would probably have been “possible”, not “easy”. “Easy” is quite subjective.