So how should I expect this to affect goal setting/achievement? Is this going to make it easier to achieve certain types of goals or easier to select intelligent goals or easier to break my goals down into manageable parts? Would you expect that it should improve my akrasia fighting abilities?
In short, what sorts of material gains should I expect from this sort of activity that I might expect to be able to do a controlled study of?
I would say that the most common and general benefit is likely to be an increased ability to observe and act against habitual biases. One effect of cultivating attention and perception to the degree necessary for enlightenment is that many cognitive processes which were previously murky or hard to see or ‘subconscious’ become rather clear. A person interested in improving this facet of rationality would likely find it extremely advantageous to be able to clearly see some of the garbage coursing through their mind, either masquerading as ‘my belief’ or connected to processes which generate beliefs that are not reliably accurate. Seeing it is likely to allow one to guard themselves against acting on such things or against regarding them as anything more than cognitive babble of questionable provenance.
If a person is not interested in this kind of self-improvement, all bets are off.
(I mentioned the above in Part 1 already.)
“Making it easier to achieve certain types of goals” is so easy a target to hit that you should be surprised if I said “no.” What kinds of goals do you have in mind?
Assuming that enlightenment really is a more accurate understanding of what you now regard as your ‘self,’ you will be able to select goals that are more in line with what you would select if you were smarter and knew more (cf. CEV). I don’t know if this falls under what you mean by “intelligent goals.” I also mentioned this in Part 1.
I would explicitly disclaim improvements in akrasia or in organizational abilities related to goal-setting as being likely to follow from this method or from enlightenment.
I imagine that there are some benefits you might receive from enlightenment which are too idiosyncratic for me to predict.
I think there are numerous benefits to well-being / mental health, but I am torn between seeing those as “manifesting in idiosyncratic ways, dependent on personality” and as “reasonably common.” Part of my hesitation is that I realized I know a good bit about what people who self-select as interested in enlightenment for its own sake ultimately say about its effects on their mental health, but not much about what people who pursue enlightenment specifically in order to improve their rationality say about it. Moreover, the benefits that translate into mental-health improvements are described in ways that seem highly dependent on a particular personality and goal structure, so they would not necessarily be expected to generalize across different types of people.
This seems like a special case (and a relatively easy one) of the problem of identifying in advance whether changing a way of thought is an improvement.