there is not enough bandwidth between DNA and the neural network; evolution can input some sort of signal like "there should be a subsystem tracking social status, and that variable should be maximized," or tune some parameters, but it likely does not have enough bandwidth to transfer a complex representation of actual evolutionary fitness.
I think you would agree that evolution has enough bandwidth to transmit complex strategies. The mess of sub-agents is, I think, more like the character in the analogy. There are some “player” calculations done in the brain itself, but many occur at the evolutionary level.
I like your point about where most of the computation/lovecraftian monsters are located.
I'll think about it more, but if I try to paraphrase it within my picture by a metaphor: we can imagine an organization with a workplace safety department. The safety regulations it implements are the result of some large external computation. Even the existence of the workplace safety department is, in some sense, a result of that external system. But drawing the boundaries is tricky.
I'm curious what the communication channel between evolution and the brain looks like "at the link level". It seems reasonably easy to select, e.g., personality traits, some "hyperparameters" of the cognitive architecture, and the like. It is unclear to me whether this is enough to "select from complex strategies," or whether it is necessary to transmit strategies in some more explicit form.
I think the hyperintelligent lovecraftian creature is the right picture. I don't think the player is best located in the brain.