Short answer: It won’t guarantee that, because rats learn most of what they know. The equation I developed turns out to be identical to an equation saying that the amount of information contained in facts and data must be at least as great as the amount of information that it takes to specify the ontology. So any creature that learns its ontology, automatically satisfies the equation.
… Could we take as input the most a rat could ever learn?
I don’t understand the question. It’s an inequality, and in cases where the inequality isn’t satisfied, the answer it gives is “I don’t know”. The answer for a rat will always be “I don’t know”.
I must confess I didn’t understand most of what you’ve said, but did I guess the following right? The equation says that
IF my knowledge is “bigger” than my ontology THEN I might be conscious
And if I learn my ontology, that means my ontology is a subset of my knowledge and thus never bigger than the latter.
Right.
Exactly.
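The exchange above can be condensed into one line. A minimal sketch in informal notation (the symbols — $I(\cdot)$ for information content, $D$ for the facts and data, $O$ for the ontology — are my assumptions, not the original notation):

```latex
% The inequality as described: the information contained in the facts
% and data must be at least the information needed to specify the
% ontology. When it holds, the test can only say "might be conscious",
% i.e. "I don't know".
I(D) \;\ge\; I(O)
% If the ontology is learned from the data, then O is determined by D,
% so I(O) \le I(D) holds automatically. A creature that learns its
% ontology (like a rat) therefore always satisfies the inequality,
% and the test's answer for it is always "I don't know".
```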