There are many reasons someone might make an error of judgement, but when the error in question stems (allegedly) from an incorrect application of a particular theory or idea, it makes no sense to attribute responsibility for the error to the theory itself.
Eh, I’m a little concerned about this in general, because without restrictions it could be used to redirect blame away from a theory even in cases where the implementation of the theory is evidence against it.
The best example is historical non-capitalist societies, especially communist ones: when responding to criticism, communists would say roughly that those societies weren’t truly communist, and thus that communism could still work if it were implemented truly.
That’s the clearest example, but I’m sure there are others.
If you have galaxy-brained the idea of the St. Petersburg Paradox, it seems like Alameda-style fraud is +EV.
I don’t think so. At the very least, it seems debatable. Biting the bullet in the St Petersburg paradox doesn’t mean taking negative-EV bets. House of cards stuff ~never turns out well in the long run, and the fallout from an implosion also grows as you double down. Everything that’s coming to light about FTX indicates it was a total house of cards. Seems really unlikely to me that most of these bets were positive even on fanatically risk-neutral, act utilitarian grounds.
Maybe I’m biased because it’s convenient to believe what I believe (that the instrumentally rational action is almost never “do something shady according to common sense morality.”) Let’s say it’s defensible to see things otherwise. Even then, I find it weird that because Sam had these views on St Petersburg stuff, people speak as though this explains everything about FTX epistemics. “That was excellent instrumental rationality we were seeing on display by FTX leadership, granted that they don’t care about common sense morality and bite the bullet on St Petersburg.” At the very least, we should name and consider the other hypothesis, on which the St Petersburg views were more incidental (though admittedly still “characteristic”). On that other hypothesis, there’s a specific type of psychology that makes people think they’re invincible, which leads to them taking negative bets on any defensible interpretation of decision-making under uncertainty.
Who were you responding to? I didn’t make the argument you’re replying to.
Oh, I was replying to Iceman – mostly this part that I quoted:
(I think I’ve seen similar takes by other posters in the past.)
I should have mentioned that I’m not replying to you.
I think I took such a long break from LW that I forgot that you can make subthreads rather than just continue piling on at the end of a thread.