Hmm, I get what you’re saying, but my whole claim is that yes, a good researcher can get the whole inference to go through at least some of the time.
Maybe we need to discuss actual examples.
I would think the problem here would be failing at transferring the relevant info, not transferring too much info!
I agree the first problem is hard. My bigger worry is the second problem—transferring wrong info rather than too much.
For instance, I might write an article titled “3 types of BCIs and 50 cool things you can do with them”. Three years later I realise, “holy shit, some of those things I thought were cool could actually hurt lots of people (while providing gain to the investor/founder)”, but by then it’s too late, because some founder of a BCI startup has already read my first article and been inspired by it.
The only morally acceptable thing to copy in this way is an orientation against making decisions this way.
This seems weirdly adversarial; maybe I didn’t communicate my point well. You use a toothbrush somebody else designed, you live in a home someone else designed, you make telephone calls using numbers and social technology someone else figured out, you work a 40-hour work week because someone decided creating a law against overwork was a good idea, etc.
For example, I could go talk to a toothbrush manufacturer and show them a cheaper polymer or a better design, and it could affect which brush you use. I might not even have to talk to the manufacturer you buy from, since manufacturers will all copy each other once one of them has something cool.
This also applies to thoughts: if I find a superior (or even just different) way of thinking about economics or market research or life philosophy or how best to tie your shoelaces, you might start thinking in patterns similar to mine once lots of people copy my thought pattern.
The examples in this comment are about “oops I had an idea that sounds good but is accidentally bad”. That’s a reasonable thing to worry about but doesn’t seem like the thing you were actually asking about. You wrote:
I don’t expect to be particularly good at coordinating with my perfect clones for example. I’m sure if you put me in a room with my perfect clone and a source of massive power (such as a controllable ASI), we’d beat each other half to death fighting for it.
This seems much more central, and indicates a major problem.
I’ve been confused about why I find it so hard to trust people, and this discussion has made me a little less confused. There seem to be multiple reasons. Thank you for discussing so far.
I agree that that seems to be the biggest problem: even if someone shared all my beliefs and values, I would struggle to coordinate with them right now.
I am also dealing with a bunch of painful personal shit right now that might be affecting my ability to trust people or lead a happy/meaningful life. I don’t want to share too much about that on a public forum. (It could actually fuck up my life if I did.)
I know the standard advice is to go fix my personal shit before I think about the future of the world, but at some point I do need to figure out who to trust or not, and it’s going to have implications for both my personal and professional life; I can’t just cleanly separate the two.