What Exactly Do We Mean By “Rationality”?

In summary, the goal of this post is to start a discussion about the current meaning of ‘rationality’ as it is defined on Less Wrong. I am specifically trying to find out:

  1. What people think of the current definition of ‘rationality’ on Less Wrong.

  2. What people think a better definition of ‘rationality’ would include. I am not necessarily looking for a perfect definition; rather, I am looking for a definition that better highlights the areas people should look into if they wish to become less wrong.

I think that the description below, from the What Do We Mean By ‘Rationality’? post, sums up the current meaning of ‘rationality’ as it is used on this site:

Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed “truth” or “accuracy”, and we’re happy to call it that.

Instrumental rationality: achieving your values. Not necessarily “your values” in the sense of being selfish values or unshared values: “your values” means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as “winning”.

...

“X is rational!” is usually just a more strident way of saying “I think X is true” or “I think X is good”. So why have an additional word for “rational” as well as “true” and “good”? Because we want to talk about systematic methods for obtaining truth and winning.

The word “rational” has potential pitfalls, but there are plenty of non-borderline cases where “rational” works fine to communicate what one is getting at, likewise “irrational”. In these cases we’re not afraid to use it.

Now, I think that the definition or description of ‘rationality’ above is pretty good. In fact, if I wanted to introduce someone to the concept of rationality, I would probably refer to it, but I would explain that it is a working definition: it conveys the general idea, and most of the time that suffices. I have no problem with working definitions. One of my favorite ideas on Less Wrong is that words and concepts are pointers to areas in concept space. This idea lets you use working definitions without wasting your time on semantic issues.

But an often neglected aspect of this idea is that you still need to ensure that the words you use point to the right, suitably restricted areas in concept space. Saying “I am not here to argue about definitions” does not absolve you of the responsibility to create decent definitions. It is like saying: “I know that this definition is not perfect, but I think that it is close enough to convey the general idea I am getting at”. If that is all you are trying to do, then not refining your definitions is fine. But it should be noted that the more important and widely cited a concept becomes, the more necessary it is to improve its definition.

I think the definition of rationality above has two major problems:

  • It doesn’t highlight all of the important areas, even though it can be stretched to cover them. For a definition to highlight something, it should be obvious and clear that the definition refers to it. When I think of instrumental rationality, I don’t, for example, think of seeing things from multiple perspectives, finding the best way to interpret situations, aligning your values, or training creativity (there are probably better examples). The point I am getting at is that instrumental rationality (“winning”) can be expanded to include almost anything, but it doesn’t explicitly point to all the important areas that a sharper definition of ‘rationality’ would.

  • It describes methods for achieving rationality, not what rationality is. Defining ‘rationality’ this way is like defining ‘fit’ by referring to ‘exercise’ and ‘eating right’. Epistemic rationality, in particular, is only instrumentally valuable: it helps you form truer beliefs, but those beliefs need to be applied before they are actually useful. If you spend a lot of effort forming truer beliefs, or understanding what rationality means, and then compartmentalize that knowledge, you have effectively gained nothing. Robert Aumann is an example: he knows a great deal about rationality, but he does not seem especially rational himself, since he appears to believe in non-overlapping magisteria.

Perhaps my biggest issue with the definition is not how it currently reads, but how hard it is to improve. This sounds like a good thing, but it is not. The definition is hard to improve not because it is perfect, but because instrumental rationality is simply too big. Any plausible idea for improving the definition is likely to be quickly discarded, since it can always be made to fall under the instrumental rationality category.

I am not going to provide a better definition of ‘rationality’, as the goal of this post is just to start a discussion. But I do think that many of the problems mentioned above are best solved by first choosing a simple core definition of what it means to be rational, and then maintaining a separate set of areas in which improvement leads to increases in rationality. In general, the more granular, results-oriented and verified each of these areas is, the better.

A possible parent or base definition for ‘rationality’ is already on the wiki, which says that ‘rationality’ is the “characteristic of thinking and acting optimally”. This seems like a pretty good starting point to me, although I admit that the definition is very concise and doesn’t really tell us much on its own, since ‘optimal’ is also hard to define.

That is not to say that we have no idea of what ‘optimal’ means; we do. It is just that this understanding (logic, probability, decision theory, etc.) mostly concerns the normative sense of the word. This is a problem because we are agents with certain limitations and adaptations, which means our attempts to do things the normative way are often impractical, cumbersome and flawed. For this reason, any definition of ‘rationality’ should be about more than just ‘get closer to the normative model’. Of course, getting closer to the results of the normative model is always good, but I still think that a decent definition of ‘rationality’ should also take into account, for example, ecological and bounded rationality, as well as the values of the agent under consideration.
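
For concreteness, the normative benchmark I have in mind is something like expected-utility maximization, where A is the set of available actions, S the possible world-states, P a probability distribution and U a utility function. (This formalization is my own gloss on ‘the normative model’, not part of the wiki definition.)

```latex
% The standard normative model of choice: pick the action a* that
% maximizes expected utility, summing over the possible states s of
% the world. A bounded agent rarely knows P, U or S in full.
a^{*} = \arg\max_{a \in A} \; \sum_{s \in S} P(s) \, U(\mathrm{outcome}(a, s))
```

Actually computing this requires enumerating states, probabilities and utilities that a real agent rarely has, which is exactly the gap the points below are meant to address: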

  • Ecological rationality takes into account the context and representation of information. If a certain representation of information has been recurrent and stable during an agent’s evolution, then that agent’s cognitive processes are likely to be better adapted to those representations. There is a big difference between being irrational and performing poorly on specific types of problems because your cognitive processes have not adapted to information in a particular format.

  • Bounded rationality takes into account the fact that most agents are limited by the information they have, the cognitive limitations of their minds, and the time available to them to make decisions. For limited agents, a fast and frugal heuristic approach to a problem may be optimal, or at least not as bad as it seems (see the sketch after this list). From What Does It Mean to be Biased: Motivated Reasoning and Rationality:

    The rather surprising conclusion from a century of research purporting to show humans as poor at judgment and decision making, prone to motivational distortions, and inherently irrational is that it is far from clear to what extent human cognition exhibits systematic bias that comes with a genuine accuracy cost.

  • It is also important to take into account what the agent values. An alien is not irrational just because it values things differently than we do.
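
To make the bounded-rationality point concrete, here is a minimal sketch of a Gigerenzer-style “take the best” heuristic. The scenario, cue names and data are all hypothetical; the point is only to show the shape of a fast and frugal strategy: instead of weighing all the evidence, the agent checks cues in order of validity and decides on the first one that discriminates.

```python
# Minimal sketch of the "take the best" fast-and-frugal heuristic.
# Task (hypothetical): guess which of two cities is larger.

# Cues ordered from most to least valid, i.e. how often the cue,
# when it discriminates, points to the correct answer.
CUES = ["is_capital", "has_major_airport", "has_university"]

# Hypothetical knowledge base. None means the agent simply does not
# know (exactly the kind of gap a bounded agent has to live with).
CITIES = {
    "A-ville": {"is_capital": True,  "has_major_airport": True,  "has_university": None},
    "B-burg":  {"is_capital": False, "has_major_airport": True,  "has_university": True},
}

def take_the_best(x: str, y: str) -> str:
    """Guess the larger city using the first cue that discriminates."""
    for cue in CUES:
        vx, vy = CITIES[x][cue], CITIES[y][cue]
        if vx is None or vy is None or vx == vy:
            continue  # cue is unknown or doesn't discriminate; try the next
        return x if vx else y  # decide on this single cue and stop searching
    return x  # no cue discriminates: fall back to a guess

print(take_the_best("A-ville", "B-burg"))  # -> A-ville (settled by "is_capital")
```

The point is not that this beats full probabilistic reasoning in general, but that for an agent with missing information and limited time, stopping at the first discriminating cue can be a sensible policy rather than a bias.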

It is possible that ‘rationality’ isn’t the best word to be using, since there is already an extremely wide range of opinions on what it means. I would not be against choosing a different word if it would better illuminate what allows people to become less wrong. At the end of the day, I don’t really care about ‘rationality’ per se; all I care about is becoming less wrong. If, for whatever reason, ‘rationality’ and ‘becoming less wrong’ ever diverge, then I will move away from ‘rationality’.

To start off the discussion, here are a few questions I have about the current meaning of ‘rationality’ on Less Wrong:
  1. Do you think that a discussion on the meaning of ‘rationality’ would be helpful?

  2. Do you have any issues, not mentioned above, with how ‘rationality’ is currently defined on Less Wrong?

  3. Do you think that the issues with ‘rationality’ that I describe above make sense and are valid criticisms?

  4. Do you think that explaining general areas and topics that lead to improvements in rationality would be helpful?

  5. Is there anything you can think of that is related to becoming less wrong but that has little or nothing to do with becoming more rational?