I am pretty sure I don’t approve of Matt Newport’s comment. If uncertainty about receiving something means that it is worth less, then you would conventionally just build that into the utility function.
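To illustrate what "building it into the utility function" conventionally means: with a concave (risk-averse) utility function, an uncertain prize is already worth less than its face value, so no separate uncertainty penalty is needed. Here's a minimal sketch; the square-root utility and the function names are my own illustrative choices, not anything from the parent comment.

```python
import math

def utility(amount: float) -> float:
    """Concave (risk-averse) utility of receiving a given amount."""
    return math.sqrt(amount)

def expected_utility(lottery: list[tuple[float, float]]) -> float:
    """Expected utility of a lottery given (probability, amount) pairs."""
    return sum(p * utility(x) for p, x in lottery)

def certainty_equivalent(lottery: list[tuple[float, float]]) -> float:
    """Guaranteed amount whose utility equals the lottery's expected utility
    (inverse of the sqrt utility, so we square)."""
    return expected_utility(lottery) ** 2

# A 50% chance at $100 versus $100 for sure:
lottery = [(0.5, 100.0), (0.5, 0.0)]
print(expected_utility(lottery))      # 5.0
print(utility(100.0))                 # 10.0 -- the certain prize is worth more
print(certainty_equivalent(lottery))  # 25.0, below the $50 expected value:
                                      # the uncertainty discount falls out of
                                      # the curvature of the utility function
```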
IMO, rewards are more-or-less the brain’s attempt to symbolically represent utility.