Appeal to fictional evidence is dangerous too. Invoking the dark side from Star Wars will elicit cached thoughts. The Force is a fictional contraption devised for a story, and it doesn't work the way rationality does.
That said, is it still okay to rob a bank to give to charity? We must be very sure of our truth, and of the nobility of our purposes, to lie others into the same understanding as ours.
I wonder if there's a bit of Aumann agreement in there. We might disagree with other people, but simply hacking their brains cancels any useful updates we might have gotten from their unique knowledge.
That is much too complicated to be solved in one sentence. Ultimately, though, we're making a bet on the assumption that we must be right. If we indeed are, it makes sense to convert other people to our worldview, provided their objectives are similar to ours, since that will help them.
Historically, though, people who believed they were right have often turned out to be nowhere close. Would we be repeating that mistake if we claimed that what we advocate is the truth?
What do we advocate, anyway? Our vision of truth seems far more flexible than any seen so far. We don't even have a fixed vision: anything we believe at this point is liable to be rewritten.
It seems to me that to be a good rationalist, you should ideally not need someone else to show you unique knowledge that might change your mind; you should be able to do it yourself. But that idea can potentially be abused too.