Trying to patch the thing which I think this example was aiming for:
Let A be an n-bit number, and O be 0 or 1 (50/50 distribution). Then let Y = A if A ≡ O (mod 2), else Y = 0. If the sender knows O, then they can convey n−1 bits with every message (i.e. n bits minus the lowest-order bit, which must be set to O). If the sender does not know O, then half the messages are guaranteed to be 0 (and which messages are 0 communicates at most 1 bit each, although I’m pretty sure it’s in fact zero bits each in this case, so no loophole there). So at most ~n/2 bits per message can be conveyed if the sender does not know O.
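A quick simulation makes the gap concrete (my own sketch, not part of the original example; the `channel` helper, the n = 8 choice, and the fixed seed are all mine):

```python
import random

def channel(a: int, o: int) -> int:
    """Y = A if A ≡ O (mod 2), else Y = 0."""
    return a if a % 2 == o else 0

n = 8
trials = 100_000
rng = random.Random(0)

# Sender knows O: put an (n-1)-bit payload in the high bits and
# set the lowest-order bit to O, so the parity check always passes.
ok = 0
for _ in range(trials):
    o = rng.getrandbits(1)
    payload = rng.getrandbits(n - 1)
    a = (payload << 1) | o
    if channel(a, o) >> 1 == payload:
        ok += 1

# Sender doesn't know O: pick A blindly; the parity check fails,
# wiping the message to 0, about half the time.
zeroed = 0
for _ in range(trials):
    o = rng.getrandbits(1)
    a = rng.getrandbits(n)
    if channel(a, o) == 0:
        zeroed += 1

print(ok / trials)      # 1.0: all n-1 payload bits get through
print(zeroed / trials)  # ~0.5: half the blind messages are zeroed
```

The informed sender reliably moves n−1 bits per message; the uninformed one loses half its messages outright, matching the ~n/2 bound above.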
So, the sender learning the one bit O doesn’t add one bit to the capacity; it doubles the capacity.
I’m now basically convinced that that’s the answer; the conjecture is false.
I now expect that the upper limit to bits of optimization gained from one bit of observation is ~doubling the number of bits of optimization (as in this example). Though we also know that observing one bit can take us from zero bits of optimization to one bit of optimization, so it’s not as simple as just doubling; there’s at least a small additional term in there.
Tripling is, I think, definitely a hard max, since you can send three messages: the action if the observed bit is true, the action if it is false, plus which is which. That’s at least assuming you can reliably send a bit at all without the observation.
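The three-message scheme can be written out as a toy protocol (hypothetical names and framing, just to make the counting explicit): the uninformed sender spends one message on each candidate action plus one on a key saying which is which.

```python
# Uninformed sender: transmit both candidate actions plus a one-bit
# key fixing the convention "first message is the O = 1 action".
def encode(action_if_true, action_if_false):
    key = 0
    return [action_if_true, action_if_false, key]

# Receiver (who eventually learns O) picks out the right action.
def decode(messages, o):
    first, second, key = messages
    if key == 0:
        return first if o == 1 else second
    return second if o == 1 else first

print(decode(encode("act-T", "act-F"), 1))  # act-T
print(decode(encode("act-T", "act-F"), 0))  # act-F
```

So in the worst case the uninformed sender pays 3x what an informed sender would pay, which is why tripling is a hard ceiling.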
More tightly, it’s doubling plus the number of bits required to send a single bit of information.
That sounds about right.