In a previous discussion about this, an argument mentioned was “having all your friends and colleagues believe in a thing is probably more epistemically compromising than the equity.”
Which seems maybe true. But I update in the other direction: you shouldn't take equity, and, also, you should have some explicit plan for dealing with the bias of "the people I spend the most time with believe this."
(This also applies to AI pessimists, to be clear, but I think it's reasonable to hold people extra accountable about it when they're working at a company whose product has double-digit odds of destroying the world.)
Yeah, certainly there are other possible forms of bias besides financial conflicts of interest; as you say, I think it’s worth trying to avoid those too.