I no longer believe that anthropic probabilities make sense (see http://lesswrong.com/lw/891/anthropic_decision_theory_i_sleeping_beauty_and/ and subsequent posts—search "anthropic decision theory" on Less Wrong); only anthropic decisions do. Applying this to these situations: total utilitarians should (roughly) act as if there were a late filter, while average utilitarians and selfish beings should act as if there were an early filter.