This is important because agents in general don’t have to use beliefs, but they do all have to have goals.
I think you mean that agents don’t have to use beliefs or goals, but they do all have to choose between actions.
If you really meant what you said, then you drew some deep, bizarre, counterintuitive conclusion there that I can’t understand, and I’d really like to see an argument for it.
Yep, my mistake. Fixed.