“Information” means more than one thing. Shannon information, which is basically just a quantity of bits, not a meaning, is an objective quantity. But information as meaning depends on interpretation, as Dacyn says.
But the PC is not particularly well-suited to describing counterfactuals. Once the initial conditions are set, it can describe what will happen, but it cannot describe why the light in the first scenario carries information, and the identical evolution in the second scenario does not.
The PC is very apt to describe conditionals, because it describes how a system evolves conditional on an initial state. And a counterfactual is just a conditional whose antecedent didn’t happen (as I interpret the term).
Instead, the information is a counterfactual property: it is only meaningful to say that the electromagnetic field carries information if it could have been in a different state. (We can reach a similar conclusion by considering the case where the lamp is stuck in the ‘off’ position…)
AFAICS, that’s just a special case of the inverse relationship between probability and (Shannon) information. If the lamp is stuck “on”, the probability of an “on” signal is 1.000 and the information content is 0.000.
So it’s not fundamentally about counterfactuals at all.
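That inverse relationship can be made concrete in a few lines of Python (an editorial sketch, not part of the original exchange): the surprisal of an event is −log₂ of its probability, so a certain event carries zero bits.

```python
import math

def information_content(p: float) -> float:
    """Shannon information content (surprisal), in bits, of an event with probability p."""
    return -math.log2(p)

# A lamp that is "on" half the time: each "on" signal carries 1 bit.
one_bit = information_content(0.5)

# A lamp stuck "on": the probability of "on" is 1.0, so the signal carries 0 bits.
zero_bits = information_content(1.0)
```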
I think I disagree with your characterisation of the split between ‘objective’ Shannon information and information as meaning, which requires interpretation.
As you point out at the end of your comment, Shannon information requires you to know the probability distribution from which your data is drawn. And probabilities are reflections of your own state of knowledge, which is subjective. (Or at least ‘subjectively objective’; if you are using ‘objective’ in that sense, then I guess I agree.) For example, if Alice sends Bob the string ‘11111’, we might be tempted to say that she has sent Bob 5 bits of information, but if Bob knows that Alice can only send two possible strings, ‘00000’ or ‘11111’, then he would say that she has sent only one bit. All signals, not just what you call ‘information as meaning’, require some degree of interpretation. And this interpretation, I argue, requires knowing the possible signals that could be sent, even if they are not actually sent. These possible signals are what I am calling counterfactuals.
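The Alice-and-Bob example can be checked numerically (a sketch added here for illustration, not part of the original comment): the Shannon entropy of the message depends entirely on which distribution of possible strings you assume.

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution given as a list."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Naive view: '11111' is one of 2**5 equally likely five-bit strings.
naive_bits = entropy([1 / 32] * 32)

# Bob's view: Alice only ever sends '00000' or '11111', equally likely.
informed_bits = entropy([0.5, 0.5])
```

The same string costs 5 bits under the first distribution and 1 bit under the second, which is the sense in which the count of bits depends on the set of possible signals.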
I’m not sure I understand your point about conditionals vs counterfactuals.
AFAICS, that’s just a special case of the inverse relationship between probability and (Shannon) information. If the lamp is stuck “on”, the probability of an “on” signal is 1.000 and the information content is 0.000. So it’s not fundamentally about counterfactuals at all.
I kind of agree with this, but it doesn’t tell the whole story. Consider the case where, instead of being stuck ‘on’, the lamp flickers randomly and is on 50% of the time and off 50% of the time. In this case, you would not be able to use the lamp to send information, even though the probability of an ‘on’ signal is 0.5 and, in one sense, the Shannon entropy would be maximal. To send information requires that it is possible for you to change the signal sent by the lamp. This is what I was trying to get at in this post. Another way of thinking about it is to say that you must have a causal effect on the state of the signal. In both the case where the lamp is stuck on and the case where it is flickering uncontrollably, you have no causal link to the state of the signal. I tried to explain the link between counterfactuals, information and causality in the subsequent post.
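One way to formalise this ‘causal link’ point is mutual information (an editorial sketch, not part of the original post): the lamp’s state carries information about your intended message only if the two are statistically dependent, and the flickering lamp is independent of your intent no matter how high its own entropy is.

```python
import math

def mutual_information(joint):
    """I(X;Y), in bits, from a joint distribution given as {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Working lamp: its state always matches the sender's intended signal.
controlled = {('on', 'on'): 0.5, ('off', 'off'): 0.5}

# Flickering lamp: its state is independent of the sender's intent,
# even though the lamp's own entropy is maximal (on half the time).
flickering = {(x, y): 0.25 for x in ('on', 'off') for y in ('on', 'off')}

controlled_bits = mutual_information(controlled)
flickering_bits = mutual_information(flickering)
```

The controlled lamp transmits 1 bit per signal; the flickering lamp transmits 0 bits, despite its maximal entropy.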
As you point out at the end of your comment, Shannon information requires you to know the probability distribution from which your data is drawn. And probabilities are reflections of your own state of knowledge, which is subjective.
Not necessarily. Objective probabilities could exist. That just gives you two different measures, an objective one and a subjective one.
For example, if Alice sends Bob a string ‘11111’, we might be tempted to say that she has sent Bob 5 bits of information, but if Bob knows that Alice can only send two possible strings ‘00000’ or ‘11111’, then he would say that she has only sent one bit.
If Bob doesn’t know that Alice can only send one of two five-bit strings, then she has, objectively, sent only one bit, and his subjective estimate based on subjective probability is wrong.
In short, the same relationship between probability and information content holds in both contexts.
In this case, you would not be able to use the lamp to send information, even though the probability of an ‘on’ signal is 0.5 and, in one sense, the Shannon entropy would be maximal. To send information requires that it is possible for you to change the signal sent by the lamp.
The Shannon information is maximal, so your second use of “information” has to refer to something other than Shannon information.
Yes, you have to causally control a signal to send information-as-meaning, and that has something to do with counterfactuals, but it isn’t just counterfactuals. An uncontrolled, random sequence could have been different, so it has counterfactual versions.