This may be the most horrifying thing I have ever read. Not even counting the ending (which didn’t really feel plausible to me).
This is the first time that I viscerally felt how tricky the alignment problem is, and how impossible it is to find all of the problems, much less make them make sense to all of the stakeholders. Optimization can leak through so many cracks, and this species is not equipped to deal with that. This problem is too subtle for us.
I think it even adds to the horror that this scenario is compatible with being a Great Filter that doesn’t generate a meaningfully goal-oriented successor that would do anything after destroying or stagnating us. The goal-oriented mesa-optimizer is effectively trapped inside a system whose objective is simplicity and stagnation.
I’m amused that this sentence is likely the highest praise for my writing I’ve ever received.