So, your proposed definition of knowledge is information that pays rent in the form of anticipated experiences?
Well, most certainly yes, but what does that actually look like at the level of physics? How do I determine the extent to which my robot vacuum is forming beliefs that pay rent in the form of anticipated experiences? And most importantly, what if I don’t trust it to answer questions truthfully, and so don’t want to rely on its standard input/output channels?