If this thing internalized that conscious type of processing from scratch, without having it natively, then the resulting mind isn’t worse than the one that evolution engineered with more granularity.
OK. I guess I had trouble parsing this. Esp. “without having it natively”.
My understanding of your point is now that you see consciousness from “hardware” (“natively”) and consciousness from “software” (learned in some way) as equivalent. Which kind of makes intuitive sense, as the substrate shouldn’t matter.
Corollary: A social system (a corporation?) should also be able to be conscious if the structure is right.