Interesting topic! I’m a huge fan of “out of the box” thinking. But I prefer to apply “out of the box” thinking to the phrase itself by referring to this type of thinking as epiphytic thinking.
The phrase “epiphytic thinking” helps promote/advertise epiphytes. Did you know that the orchid family is the largest plant family? Around 10% of all plants are orchids… and most orchids are epiphytes.
Epiphytes can help sequester as much carbon as trees do. They also help create a gazillion different niches, which has helped increase animal speciation/biodiversity.
Epiphytes can certainly help save the world. What are boxes good for? Helping you pretend that you’re a robot?
Therefore...
epiphytic thinking > “out of the box” thinking
I’ll apply some epiphytic thinking to your topic.
Let’s say that we have a time machine and we travel back to a hundred years before people discovered that the earth was round. Our mission, which we’ve chosen to accept, is to try to persuade people that the earth is actually round!
It stands to reason that no two people are going to be equally willing to hear us out. In this sense… perhaps we can say that there’s a continuum that ranges from the most closed-minded person all the way to the most open-minded person. To help quantify this continuum we’ll use a scale from 0 to 10.
The question is… if somebody is a 10 on this scale… does this necessarily mean that they’ll believe us that the world is actually round? Just because they’ll be really willing to listen to our very different perspective on the shape of the world… does this mean that they’ll take our word for it? Not really… because this would imply that our open-mindedness scale was the same thing as a gullibility scale.
So clearly it would help to bring some evidence with us on our mission. Stronger evidence is always better than weaker evidence but let’s just say that our evidence is good.
Imagine if we share our good evidence with 100 people who are all a 10 on the open-mindedness scale. What percentage of them are going to change their beliefs accordingly? Of course we can’t know the real answer… but it doesn’t seem very likely that 100% of them would exchange their belief in a flat world for a belief in a round world.
From our perspective, we would know that anybody who didn’t change their belief accordingly was making a mistake. Why did they make the mistake though? Was it a lack of intelligence? Lack of rationality? Lack of critical thinking skills? Was there some sort of bias involved? Or stubbornness?
Just like no two people are equally open-minded… I don’t think that any two people are equally, for lack of a better term… “evidence-minded”. Is there a better term? “Rationality” seems close but it doesn’t seem quite right to refer to somebody as “irrational” just because our good evidence didn’t persuade them that their belief in a flat world was wrong.
What’s the point here? Well… at one point everybody was really wrong about the shape of the world. So perhaps it’s a pretty good idea for us to fully embrace the possibility that we’re all really wrong about the shape of… say… the best government. Because if it’s really difficult to appreciate the fact that you might be wrong… then it’s going to be really difficult for you to accept any good evidence that proves that you are wrong. Therefore, anybody who’s a 10 on the evidence-minded scale will probably really embrace fallibilism.