I don’t think it’s possible that our hardware could trick us in this way (making us do self-interested things by making them appear moral).
To express the idea “this would be good for the tribe” would require the use of abstract concepts (tribe, good), but abstract concepts and sentences are precisely the things that are observably under our conscious control. What can pop up without our willing it are feelings or image associations, so the best trickery our hardware could hope for is to make something feel good.