Agreed, but I think it's easier to see yourself confront your irrational impulses with blackjack. For instance, you're faced with a 16 against a dealer's 10; you know you have to hit, but your emotions (mine, at least) tell you not to. Has anyone else experienced this same accidental rationality test?
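(If anyone wants to check that intuition numerically, here's a rough Monte Carlo sketch. It assumes an infinite deck, a dealer who stands on soft 17, and no up-front resolution of dealer blackjack, so the exact figures are only ballpark, but hitting a hard 16 against a 10 should come out slightly ahead of standing.)

```python
# Rough Monte Carlo check of "hit hard 16 vs a dealer 10".
# Assumptions: infinite deck, dealer stands on soft 17, dealer blackjack
# is not resolved before the player acts. The hit/stand gap is small,
# so a large trial count is needed to see it above the noise.
import random

CARDS = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]  # ace counted as 11 first

def draw():
    return random.choice(CARDS)

def hand_value(cards):
    total, aces = sum(cards), cards.count(11)
    while total > 21 and aces:          # demote aces from 11 to 1 as needed
        total, aces = total - 10, aces - 1
    return total

def dealer_total(upcard):
    cards = [upcard, draw()]
    while hand_value(cards) < 17:       # dealer hits to 17, stands on soft 17
        cards.append(draw())
    return hand_value(cards)

def play_hand(hit):
    player, upcard = [10, 6], 10        # a hard 16 against a dealer 10
    if hit:
        player.append(draw())
        if hand_value(player) > 21:
            return -1                   # player busts and loses immediately
    dealer, total = dealer_total(upcard), hand_value(player)
    if dealer > 21 or total > dealer:
        return 1
    return 0 if total == dealer else -1

def expected_value(hit, trials=500_000):
    return sum(play_hand(hit) for _ in range(trials)) / trials

print("EV if you stand:", expected_value(False))
print("EV if you hit:  ", expected_value(True))
```

Both options lose money on average; hitting just loses a bit less, which is exactly the kind of result the gut resists.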
Funny how the holiday became devoted to making rational self-improvement goals. Which leads me to my next point: I think people who decide to self-improve on their own, and actually follow through, are already more rational than the average Joe. Most people rationalize their mediocrity, find reasons not to self-improve, and stay preoccupied with tasks that don't impinge on their mental comfort level.
Wouldn’t making a probability estimate for your project completion dates influence your date of completion? Successfully predicting your completion times won’t prove your rationality.
I’ve always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal is rational or irrational). Someone who acts rationally, given their information set, will choose the best plan of action for achieving their goal. Part of being rational is knowing which goals will maximize your utility function.
My definition of truth-seeking is basically Robin’s definition of “rational.” I find it hard to imagine a case where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?
I think EY just means that she doesn’t get the benefit from truly believing in God, but another, possibly similar, benefit one gets by deceiving oneself into believing in God.
How can more intelligence make you more likely to defend your irrational beliefs?
If you don’t want to see it, because you’re worried the new perspective will screw you up, that’s a legitimate fear.
But I feel like any drastic perspective change will screw me up, whether it be positive or negative. Why would someone make the choice to change their preferences (as opposed to optimize them)?
If the drug reveals new information about the world (possibly how my brain works), then why wouldn’t I want to take the drug?
I think what you’re referring to that makes drug A so beneficial is the removal of irrational fears. In other words, you have emotions that you know aren’t in line with your preferences, so it would be rational to take a drug that removes only those emotions.
But what if drug A completely changes your preferences about something? Say, before, you were pretty happy working towards your thesis and working at Starbucks, but after taking drug A you suddenly really start caring about people.
You end up wanting to quit everything and join some charitable group. Even assuming you’d get more bliss once you start doing charitable work, I still believe you wouldn’t choose to take the drug. You wouldn’t knowingly change your preferences to something else even if it made you happier. You wouldn’t be optimizing your current preferences by taking the drug.
How certain are you that the only effect the drug has (after it wears off) is to reveal new information about the world?
Here is what I think is another possible effect the drug could have:
Imagine that taking the drug made you more emotional towards others, and this caused you to quit everything you’re currently doing and join some charitable organization. Why would you take the drug knowing there’s a possibility your goals would completely change?
It could be that this change would only happen as a result of revealing new information, and would therefore be what you really wanted to do all along. But that assumes the drug only reveals new information, and I’m not convinced that revealing new information is all the drug does after it wears off.
If karma was hidden, would you expect it to be linear?
Plus, as far as I know, we can’t see the total upvotes and downvotes, i.e., more relevant information.
Are most Less Wrong readers already aware of the theory that self-esteem is the way the calculation of status feels from the inside, or is that worth another post?
I wasn’t aware, but it makes a lot of sense, especially because your perception of yourself is a self-fulfilling prophecy.
Imagine a room of 100 people where none of them have any symbols pre-validated to signal status. After interacting over time, I would guess the high-self-esteem people would most likely be perceived as high status.
I did a similar experiment on myself when I went on an organized trip to Israel. When we stopped at the Wailing (Western) Wall, I decided to test my rationality. As you may know, you’re supposed to write a wish down on a piece of paper and put it in the wall, i.e., another way of praying. I decided to write down “I wish my family would die in 2 weeks” and put it in the wall, to see if I could do it.
To my surprise, I did feel a bit weird, a little anxious, but after a while I was fine. It’s hard to overcome the emotions induced by our biases, but it can be done with practice.
Just curious, would anyone not write the note (the one I wrote)? Assume you’d be compensated for the effort of writing it and putting it in the wall.
Well, I wanted to specifically ask whether taking psilocybin will help me optimize my preferences. In other words, whether it would be rational to take them.
we have difficulty with but the presence of many phenomena outside our usual experience.
Everything is equally a phenomenon; it’s just that there are some phenomena we have or haven’t evolved to be un-astounded by. Conversely, there are some phenomena we’re more inclined to be astounded by, waterfalls for example.
I think the subjects just had a stronger incentive to test out their true tolerance. Without hearing about the two types, they probably just took their hands out to minimize discomfort.
Don’t you think it’s still useful to find contradictions within the religious framework in order to convince theists they’re wrong? I know people who deconverted because the Bible just “didn’t make sense” and contained too many contradictions. Some people even convert to religion because the Bible (or whatever holy text) “made so much sense.” They think, “Well, I agree with Y, and since X implies Y, I will now believe X.”
Perhaps you’re referring to truth-seeking, having maps that match the territory best?
Almost everything we do is partially influenced by status-seeking.
Make it easier to see comment threads. It’s hard to tell which comments belong to which thread.