I’m using the word “conscious” to refer to things I want to want and things I would do more with infinite willpower. I’m using the word “unconscious” to refer to things I don’t want to want and things I would do less with infinite willpower. I don’t think it’s too controversial that those are two different categories.
But they’re not natural categories. The problem is that “consciousness” tends to focus on behaviors rather than the goals of those behaviors… as will be obvious to you if you’ve ever been a programmer trying to get people to give you actual requirements instead of just feature specifications. ;-)
So, it can be quite factually the case that you want not to do certain things, while also wanting (implicitly) some part of the result of those actions.
The problem is that protesting that you don’t want the action is not helpful. Our preferences are most visible in the breach, because consciousness is effectively an error handler. So your attention is drawn to the errors caused by the behavior, rather than to the goal of the behavior. Your brain wants you to just fix the error, and leave the working part of the system (from its point of view) alone.
But in order to fix the errors intelligently, you need to understand a bigger part of the system than just the location where the error is occurring. Specifically, you need to understand the requirements that are actually being met by the behavior, so that you can find other ways to implement those requirements.
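The error-handler analogy can be made concrete with a toy sketch. Everything here (the names, the "stress" scenario) is purely illustrative, not from the original argument: a behavior serves a hidden requirement, while the conscious "handler" only ever sees the exception it throws.

```python
# Toy analogy only: consciousness as an error handler. The behavior below
# serves a real requirement (reducing stress) as a side effect, but the
# handler sees only the visible error, not the goal behind the behavior.

def comfort_eating(stress_level):
    """Illustrative behavior: meets a hidden requirement, raises a visible error."""
    if stress_level > 5:
        # The part consciousness notices -- the "error" to be fixed.
        raise RuntimeError("ate a whole cake")
    return "calm"

def conscious_error_handler():
    try:
        return comfort_eating(stress_level=8)
    except RuntimeError as e:
        # Attention lands on the error, not on the requirement it serves.
        return f"I must stop this: {e}"

print(conscious_error_handler())
```

Patching only inside the `except` block is the unhelpful protest; fixing things intelligently would mean reading `comfort_eating` itself to learn what requirement it meets, then implementing that requirement another way.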
What’s more, I can guarantee you that when you find out those requirements, they will ultimately be something that you either do want, or did want at some time in the past, even if on reflection they are no longer relevant. Calling them a product of the unconscious mind is a factual error, as well as misleading: it implies they came out of nowhere and there’s nothing you can do about them, when in actual fact they are (part of) your true preferences, and you can choose to pay attention and find out what they are, as well as searching for better ways to get them met.