Is this referring to my insights in particular or something similar somebody else said?
It’s meant to gesture at a category of thinking, a given instance of which may or may not be worthwhile or interesting, but which leads people to be far more worried about the consequences of spreading the ideas involved than those consequences actually warrant. For example, sometimes [people who take hypothetical possibilities very seriously] newly think of something, such as the potential of BCIs, or the potential of thinking in such-and-such unconventional way, or whatever. Then they implicitly reason like this: there’s a bunch of potential here; previously I hadn’t thought of this idea; previously I hadn’t pursued efforts related to this idea; now I’ve thought of this idea; the fact that I only just now thought of the idea explains away the fact that I haven’t previously pursued related efforts; so probably my straightforward inside view of why there’s potential here is correct, or at least a good rough-draft guess; which means there are huge implications here; and the reason others aren’t pursuing related efforts is probably that they didn’t think of the idea; and since the idea is powerful, I shouldn’t share it.
Usually some but not all of these inferences are correct. Often the neglectedness is mainly because others don’t believe in hypothetical possibilities, not because no one has thought of it. Rarely does the final inference go through.
> I’ve already had conversations with multiple billionaires.
I would think the problem here would be failing at transferring the relevant info, not transferring too much info!
> But if you manage to get their attention you could get them to copy your preferred choices instead.
The only morally acceptable thing to copy in this way is an orientation against making decisions this way.
Hmm, I get what you’re saying, but my whole claim is that yes, a good researcher can get the whole inference to go through at least some of the time.
Maybe we need to discuss actual examples.
> I would think the problem here would be failing at transferring the relevant info, not transferring too much info!
I agree the first problem is hard. My bigger worry is the second problem—transferring wrong info rather than too much.
For instance, I might write an article titled “3 types of BCIs and 50 cool things you can do with them”. Three years later I realise, “holy shit, some of those things I thought were cool could actually hurt lots of people (while providing gains to the investor/founder)”, but by then it’s too late, because some founder of a BCI startup has already read my first article and been inspired by it.
> The only morally acceptable thing to copy in this way is an orientation against making decisions this way.
This seems weirdly adversarial; maybe I didn’t communicate my point well. You use a toothbrush somebody else designed, you live in a home someone else designed, you use telephone calls and telephone numbers built on social technology someone else figured out, you work a 40-hour work week because someone decided a law against overwork was a good idea, etc.
For example, I could go talk to a toothbrush manufacturer and show them a cheaper polymer or a better design, and that could affect which brush you use. I might not even have to talk to the manufacturer you buy from, since manufacturers will all copy each other once one of them has something cool.
This also applies to thoughts: if I find a superior (or even just different) way of thinking about economics, market research, life philosophy, or how best to tie your shoelaces, you might start thinking in patterns similar to mine once lots of people copy my thought pattern.
The examples in this comment are about “oops I had an idea that sounds good but is accidentally bad”. That’s a reasonable thing to worry about but doesn’t seem like the thing you were actually asking about. You wrote:
> I don’t expect to be particularly good at coordinating with my perfect clones for example. I’m sure if you put me in a room with my perfect clone and a source of massive power (such as a controllable ASI), we’d beat each other half to death fighting for it.
This seems much more central, and indicates a major problem.
You are right.

I’ve been confused about why I find it so hard to trust people, and this discussion has made me a little less confused. There seem to be multiple reasons. Thank you for the discussion so far.
I agree that that seems to be the biggest problem: even if someone shared all my beliefs and values, I would still struggle to coordinate with them right now.
I am also dealing with a bunch of painful personal shit right now that might be affecting my ability to trust people or lead a happy/meaningful life. I don’t want to share too much about that on a public forum. (It could actually fuck up my life if I did.)
I know the standard advice is to go fix my personal shit before thinking about the future of the world, but at some point I do need to figure out whom to trust, and that’s going to have implications for both my personal and professional life; I can’t just cleanly separate the two.