Y’know, I figured someone must have already shown this and used it before, but I wasn’t sure where. Didn’t expect it’d already be on my bookshelf. Thanks for making that connection!
This means that the ordering of the layers is not strictly reflected in the partial order of the Bayes net. As an extreme example of where this goes wrong, consider a Markov chain M0 → M2 → M1, in which M2 has an arrow pointing “back” to the lower-indexed layer M1.
I think this part should not matter, so long as we’re not using a do() operator anywhere. For chains specifically, we can reverse all the arrows and still get an equivalent factorization. I’m not sure off the top of my head how that plays with mixed-direction arrows, but at the very least we should be able to factor the sequence of nested Markov blankets as an undirected chain (since each blanket mediates the interactions between “earlier” and “later” blankets), and from there show that it factors as a directed chain.
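A quick numerical sanity check of the arrow-reversal claim for chains. This is a hypothetical binary-variable example (the variable names M0, M1, M2 follow the thread; the CPTs are arbitrary): build the joint from the forward chain M0 → M1 → M2, then re-factor it as the fully reversed chain M2 → M1 → M0 and confirm the two factorizations agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary chain M0 -> M1 -> M2 with arbitrary CPTs.
p0 = rng.dirichlet(np.ones(2))            # P(M0)
p1_0 = rng.dirichlet(np.ones(2), size=2)  # P(M1 | M0), rows indexed by M0
p2_1 = rng.dirichlet(np.ones(2), size=2)  # P(M2 | M1), rows indexed by M1

# Joint via the forward factorization P(M0) P(M1|M0) P(M2|M1).
joint = np.einsum('a,ab,bc->abc', p0, p1_0, p2_1)

# Reverse all arrows: factor the *same* joint as P(M2) P(M1|M2) P(M0|M1).
p2 = joint.sum(axis=(0, 1))                      # P(M2)
p12 = joint.sum(axis=0)                          # P(M1, M2)
p1_2 = (p12 / p12.sum(axis=0, keepdims=True)).T  # P(M1 | M2), rows indexed by M2
p01 = joint.sum(axis=2)                          # P(M0, M1)
p0_1 = (p01 / p01.sum(axis=0, keepdims=True)).T  # P(M0 | M1), rows indexed by M1
joint_rev = np.einsum('c,cb,ba->abc', p2, p1_2, p0_1)

print(np.allclose(joint, joint_rev))  # True: the reversed chain reproduces the joint
```

This is exactly the usual point that a chain and its reversal are in the same Markov equivalence class: they encode the same single conditional-independence statement (M0 ⊥ M2 | M1), so absent interventions they are observationally indistinguishable.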
Not sure off the top of my head how that plays with mixed-direction arrows,
If you have arrows M0 ← M1 → M2, then it is also equivalent to an ordinary chain.
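The fork case can be checked the same way (again a hypothetical binary example with arbitrary CPTs): build the joint from the fork M0 ← M1 → M2, then re-factor it as the chain M0 → M1 → M2 and confirm they agree.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binary fork M0 <- M1 -> M2 with arbitrary CPTs.
p1 = rng.dirichlet(np.ones(2))            # P(M1)
p0_1 = rng.dirichlet(np.ones(2), size=2)  # P(M0 | M1), rows indexed by M1
p2_1 = rng.dirichlet(np.ones(2), size=2)  # P(M2 | M1), rows indexed by M1

# Joint via the fork factorization P(M1) P(M0|M1) P(M2|M1).
joint = np.einsum('b,ba,bc->abc', p1, p0_1, p2_1)

# Re-factor the same joint as the chain P(M0) P(M1|M0) P(M2|M1).
p0 = joint.sum(axis=(1, 2))                  # P(M0)
p01 = joint.sum(axis=2)                      # P(M0, M1)
p1_0 = p01 / p01.sum(axis=1, keepdims=True)  # P(M1 | M0)
p12 = joint.sum(axis=0)                      # P(M1, M2)
p2_b = p12 / p12.sum(axis=1, keepdims=True)  # P(M2 | M1)
joint_chain = np.einsum('a,ab,bc->abc', p0, p1_0, p2_b)

print(np.allclose(joint, joint_chain))  # True: fork and chain are equivalent
```

Fork and chain both encode only M0 ⊥ M2 | M1, which is why the re-factorization works.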
If you have arrows M0 → M1 ← M2, then it becomes inequivalent, due to collider bias. Basically, if you condition on M1, then you introduce dependencies between M0 and M2 (and also everything upstream of M0 and M2).
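The collider case is easy to see in the starkest hypothetical example: M0 and M2 fair coins, M1 = M0 XOR M2. Marginally M0 and M2 are independent, but conditioning on M1 makes them perfectly dependent.

```python
import numpy as np

# Hypothetical collider M0 -> M1 <- M2: M0, M2 fair coins, M1 = M0 XOR M2.
joint = np.zeros((2, 2, 2))  # axes: (M0, M1, M2)
for a in (0, 1):
    for c in (0, 1):
        joint[a, a ^ c, c] = 0.25

# Marginally, M0 and M2 are independent: P(M0, M2) = P(M0) P(M2).
p02 = joint.sum(axis=1)
p0 = p02.sum(axis=1)
p2 = p02.sum(axis=0)
print(np.allclose(p02, np.outer(p0, p2)))  # True

# Conditioning on the collider (M1 = 0) induces a dependency:
p02_given_m1 = joint[:, 0, :] / joint[:, 0, :].sum()
print(p02_given_m1)  # all mass on M0 == M2, so M0 and M2 are now fully correlated
```

The same mechanism is why conditioning on M1 also couples everything upstream of M0 and M2, per the comment above.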
(I actually suspect collider bias ends up mattering for certain types of abstractions, and I don’t think it has been investigated in detail how.)