It's that I and many others would identify with a WBE, and with a group of WBEs, much more than with a purer AI. If the WBE behaves like a human, then it is aligned by definition, to me.
If we believe AI means extreme power, we already have too much power; it's all about making something we identify with.
I understand that. But consider the inaccuracies in emulation, the effectively thousands (or millions) of years of lived experience a WBE will accumulate, and the neural patches and enhancements added to improve performance.
You have still built an ASI; you have just narrowed your architecture search from "any possible network the underlying compute can efficiently host" to a fairly narrow space of spaghetti messes of spiking neural networks, ones that also have side-channel communications through various emulated glands and a "global" model of CSF and blood chemistry.
So it's an underperforming ASI, but still hazardous.