I don't need the gory details, but "the brain is doing some variant of gradient descent" or "the brain is doing this crazy thing that doesn't seem to depend on local information in the loss landscape at all" would each be a particularly valuable piece of information to me, compared to other generic information we have about AGI, for the things I am working on right now.