Just consider the limiting case—both are perfect predictors of Q, with value 1 for Q, and value 0 for not Q. And therefore, perfectly correlated.
Now consider small deviations from those perfect predictors. The correlation would still be large — sometimes more, sometimes less, depending on the details of both predictors. Sometimes they will be more correlated with each other than with Q, sometimes more correlated with Q than with each other. The degree of correlation of A and B with Q will impose limits on the degree of correlation between A and B.
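A quick way to see this constraint concretely — a minimal sketch, not anything from the original discussion — is to simulate A and B as noisy copies of a binary Q and check the observed correlations against the bound implied by positive-semidefiniteness of the 3×3 correlation matrix. The flip probabilities (0.05 and 0.10) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
q = rng.integers(0, 2, n)  # the target Q

# A and B: near-perfect predictors of Q, each flipping Q with a small probability
def noisy_predictor(q, flip_prob, rng):
    flips = rng.random(len(q)) < flip_prob
    return np.where(flips, 1 - q, q)

a = noisy_predictor(q, 0.05, rng)
b = noisy_predictor(q, 0.10, rng)

r_aq = np.corrcoef(a, q)[0, 1]
r_bq = np.corrcoef(b, q)[0, 1]
r_ab = np.corrcoef(a, b)[0, 1]

# Positive-semidefiniteness of the correlation matrix of (A, B, Q) forces
#   r_aq*r_bq - sqrt((1-r_aq^2)(1-r_bq^2)) <= r_ab <= r_aq*r_bq + sqrt(...)
slack = np.sqrt((1 - r_aq**2) * (1 - r_bq**2))
lower = r_aq * r_bq - slack
upper = r_aq * r_bq + slack

print(f"corr(A,Q)={r_aq:.3f}  corr(B,Q)={r_bq:.3f}")
print(f"corr(A,B)={r_ab:.3f}  must lie in [{lower:.3f}, {upper:.3f}]")
```

With both predictors near-perfect, the allowed interval for corr(A, B) is squeezed up toward 1, which is the point of the limiting-case argument above.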
And of course, correlation isn’t really the issue here anyway; it’s much more like mutual information, with the same sort of triangle-inequality limits on the mutual information.
If someone is feeling energetic and really wants to work this out, I’d recommend looking into triangle inequalities for mutual information measures, and the previously mentioned work by Jaynes on the maximum entropy estimate of a variable from its known correlation with two other variables, and how that constrains the maximum entropy estimate of the correlation between the other two.
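As a starting point for that exercise — again just an illustrative sketch, with the same arbitrary noise levels as before — one can estimate the pairwise mutual informations directly. When A and B are independently noisy copies of Q, the chain A → Q → B is Markov, so the data-processing inequality already gives one such constraint: I(A;B) ≤ min(I(A;Q), I(B;Q)).

```python
import numpy as np
from collections import Counter

def mutual_info(x, y):
    """Plug-in estimate of I(X;Y) in bits for discrete arrays."""
    n = len(x)
    pxy = Counter(zip(x.tolist(), y.tolist()))
    px = Counter(x.tolist())
    py = Counter(y.tolist())
    mi = 0.0
    for (xv, yv), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) )
        mi += (c / n) * np.log2(c * n / (px[xv] * py[yv]))
    return mi

rng = np.random.default_rng(1)
n = 200_000
q = rng.integers(0, 2, n)
a = np.where(rng.random(n) < 0.05, 1 - q, q)  # noisy copy of Q
b = np.where(rng.random(n) < 0.10, 1 - q, q)  # independently noisy copy of Q

i_aq = mutual_info(a, q)
i_bq = mutual_info(b, q)
i_ab = mutual_info(a, b)

# Since A -> Q -> B is a Markov chain here, the data-processing inequality
# forces I(A;B) <= min(I(A;Q), I(B;Q)).
print(f"I(A;Q)={i_aq:.3f}  I(B;Q)={i_bq:.3f}  I(A;B)={i_ab:.3f} bits")
```

The general triangle-type inequalities mentioned above would constrain I(A;B) even without assuming the Markov structure; this sketch only demonstrates the conditionally-independent case.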