Thx, I think I got most of this from your top-level comment & Mikhail’s post already. I strongly expect that I don’t currently know your confidentiality policy, but I also expect that once I do, I’ll disagree that it’s the best policy one could have, just based on what I’ve heard from Mikhail and you about your one interaction.
My guess is that refusing the promise is plausibly better than giving it for free? But I’d guess there was another solution where 1) Mikhail learns not to screw up again, 2) you get people to talk more freely around you, to a degree that’s worth losing the ability to make use of some screw-ups, and 3) Mikhail compensates you in case 1+2 still falls too far short of a fair split of the total expected gains.
I expect you’ll say that 2) sounds pretty negative to you, and that you and the community should follow a policy with far less support for confidentiality, achieved by exploiting screw-ups and by sometimes saying no when people ask for confidentiality in advance, so that people who rely on confidentiality either leave the community or learn to properly share information openly.
I mostly just want people to become calibrated about the cost of sharing information with strings attached. It is quite substantial! It’s OK for that coordination to happen based on people’s predictions of each other, without needing to be explicitly negotiated each time.
I would like it to be normalized and OK for someone to signal pretty heavily that they consider the cost of accepting secrets, or, even more so, the cost of accepting information that can only be used to the benefit of another party, to be very high. People should therefore model that kind of request as likely to be rejected, and so if you just spew information onto the other party while also expecting them to keep it secret or to use it only for your benefit, you should expect them to stop engaging with you, or to tell you that they aren’t planning to meet your expectations.
I think, marginally, the most important thing to do is to just tell people who demand constraints on information, without being willing to pay any kind of social cost for it, to pound sand.
(A large part of the goal of this post is to communicate to people that Oliver considers the cost of accepting information to be very high, and to make people aware that they should be careful around Oliver and predict him better on this dimension, instead of repeating my mistake of expecting that he wouldn’t do so much worse than a priest of Abadar would.)
I think you could have totally written a post that focused on communicating that, and it could have been a great post! Like, I do think the cost of keeping secrets is high. Other people at Lightcone and I have written quite a bit about that. See, for example, “Can you keep this confidential? How do you know?”
This post focuses on communicating that! (+ being okay with hosting AI capabilities events + less important misc stuff)