That’s a wonderful observation. Do you remember how many questions you counted?
But the belief could be said to be “true”, if you’re strict about what “logical” means. The brain doesn’t use logical representations; it uses neural activation patterns, which are non-discrete and which, unlike signs, exist in a metric space. However the brain combines these, it does so computationally, which is probably what you and I mean by “logically”. But AFAIK no logics that anyone actually uses for any applications have the power of a Turing machine.
In semiotics, they are that strict about what “logical” means. They believe that people think only in words. This is silly to anyone who knows much about biology or neuroscience. But if one believes that, then notes that people know things that can’t be represented in the words we have for those concepts—for instance, knowing not just that they feel hot or cold, but how hot or how cold they are—one might call it extra-logical.
I would be very careful of drawing conclusions like that. I don’t think that experts in that domain think that every sign is a word in the way the “word” is commonly understood by laypeople.
I don’t remember the number of questions, but I played it for quite a while. I remember being frustrated because I couldn’t get the 50% option calibrated, since I didn’t want to purposely choose the option that felt less likely. Presumably more practice would have helped me distinguish between weaker and stronger impressions of that kind.
I agree that the brain does this kind of thing computationally, and it would be a mistake to suppose that there is some other non-computational way to do it.