Extensions and Intensions

“What is red?”
“Red is a color.”
“What’s a color?”
“A color is a property of a thing.”

But what is a thing? And what’s a property? Soon the two are lost in a maze of words defined in other words, the problem that Steven Harnad once described as trying to learn Chinese from a Chinese/Chinese dictionary.

Alternatively, if you asked me “What is red?” I could point to a stop sign, then to someone wearing a red shirt, and a traffic light that happens to be red, and blood from where I accidentally cut myself, and a red business card, and then I could call up a color wheel on my computer and move the cursor to the red area. This would probably be sufficient, though if you know what the word “No” means, the truly strict would insist that I point to the sky and say “No.”

I think I stole this example from S. I. Hayakawa—though I’m really not sure, because I heard this way back in the indistinct blur of my childhood. (When I was 12, my father accidentally deleted all my computer files. I have no memory of anything before that.)

But that’s how I remember first learning about the difference between intensional and extensional definition. To give an “intensional definition” is to define a word or phrase in terms of other words, as a dictionary does. To give an “extensional definition” is to point to examples, as adults do when teaching children. The preceding sentence gives an intensional definition of “extensional definition”, which makes it an extensional example of “intensional definition”.
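(For readers who find a programming analogy helpful, here is a minimal sketch of my own—not something from Hayakawa or anywhere above—treating an intensional definition as a rule written in terms of other concepts, and an extensional definition as a list of pointed-to examples. The RGB thresholds and sample values are arbitrary stand-ins.)

```python
# A minimal sketch (illustrative analogy only): an intensional definition
# as a rule stated in terms of other concepts, an extensional definition
# as a list of pointed-to examples. Thresholds and values are arbitrary.

def is_red_intensional(r: int, g: int, b: int) -> bool:
    """'Red' defined by a rule over other terms (here, RGB channel values)."""
    return r > 150 and g < 100 and b < 100

red_extensional_examples = [
    (204, 17, 0),    # a stop sign
    (178, 34, 34),   # someone's red shirt
    (255, 0, 0),     # a traffic light that happens to be red
    (136, 8, 8),     # blood from a cut
]

# Neither route hands over the concept itself: the rule is just more symbols,
# and the examples leave the hearer to infer what they have in common.
assert all(is_red_intensional(r, g, b) for r, g, b in red_extensional_examples)
```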

In Hollywood Rationality and popular culture generally, “rationalists” are depicted as word-obsessed, floating in endless verbal space disconnected from reality.

But the actual Traditional Rationalists have long insisted on maintaining a tight connection to experience:

“If you look into a textbook of chemistry for a definition of lithium, you may be told that it is that element whose atomic weight is 7 very nearly. But if the author has a more logical mind he will tell you that if you search among minerals that are vitreous, translucent, grey or white, very hard, brittle, and insoluble, for one which imparts a crimson tinge to an unluminous flame, this mineral being triturated with lime or witherite rats-bane, and then fused, can be partly dissolved in muriatic acid; and if this solution be evaporated, and the residue be extracted with sulphuric acid, and duly purified, it can be converted by ordinary methods into a chloride, which being obtained in the solid state, fused, and electrolyzed with half a dozen powerful cells, will yield a globule of a pinkish silvery metal that will float on gasolene; and the material of that is a specimen of lithium.”
— Charles Sanders Peirce

That’s an example of “logical mind” as described by a genuine Traditional Rationalist, rather than a Hollywood scriptwriter.

But note: Peirce isn’t actually showing you a piece of lithium. He didn’t have pieces of lithium stapled to his book. Rather he’s giving you a treasure map—an intensionally defined procedure which, when executed, will lead you to an extensional example of lithium. This is not the same as just tossing you a hunk of lithium, but it’s not the same as saying “atomic weight 7” either. (Though if you had sufficiently sharp eyes, saying “3 protons” might let you pick out lithium at a glance...)

So that is intensional and extensional definition, which is a way of telling someone else what you mean by a concept. When I talked about “definitions” above, I talked about a way of communicating concepts—telling someone else what you mean by “red”, “tiger”, “human”, or “lithium”. Now let’s talk about the actual concepts themselves.

The actual intension of my “tiger” concept would be the neural pattern (in my temporal cortex) that inspects an incoming signal from the visual cortex to determine whether or not it is a tiger.

The actual extension of my “tiger” concept is everything I call a tiger.

Intensional definitions don’t capture entire intensions; extensional definitions don’t capture entire extensions. If I point to just one tiger and say the word “tiger”, the communication may fail if the listener thinks I mean “dangerous animal” or “male tiger” or “yellow thing”. Similarly, if I say “dangerous yellow-black striped animal”, without pointing to anything, the listener may visualize giant hornets.
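(A quick sketch of my own, with made-up attributes, of how a single pointed-to example underdetermines the concept: many different membership tests fit that one example equally well, so the listener can easily latch onto the wrong one.)

```python
# Sketch with made-up attributes: one pointed-to example is consistent
# with many different membership tests, so pointing once can't tell the
# listener which test I actually meant.

example_tiger = {"species": "tiger", "coloring": "yellow-black striped",
                 "dangerous": True, "sex": "male"}

candidate_meanings = {
    "tiger":            lambda x: x["species"] == "tiger",
    "dangerous animal": lambda x: x["dangerous"],
    "male tiger":       lambda x: x["species"] == "tiger" and x["sex"] == "male",
    "yellow thing":     lambda x: "yellow" in x["coloring"],
}

# All four interpretations fit the single example equally well.
fits = [name for name, test in candidate_meanings.items() if test(example_tiger)]
print(fits)  # ['tiger', 'dangerous animal', 'male tiger', 'yellow thing']
```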

You can’t capture in words all the details of the cognitive concept—as it exists in your mind—that lets you recognize things as tigers or nontigers. It’s too large. And you can’t point to all the tigers you’ve ever seen, let alone everything you would call a tiger.

The strongest definitions use a crossfire of intensional and extensional communication to nail down a concept. Even so, you only communicate maps to concepts, or instructions for building concepts—you don’t communicate the actual categories as they exist in your mind or in the world.

(Yes, with enough creativity you can construct exceptions to this rule, like “Sentences Eliezer Yudkowsky has published containing the term ‘huragaloni’ as of Feb 4, 2008”. I’ve just shown you this concept’s entire extension. But except in mathematics, definitions are usually treasure maps, not treasure.)

So that’s another reason you can’t “define a word any way you like”: You can’t directly program concepts into someone else’s brain.

Even within the Aristotelian paradigm, where we pretend that the definitions are the actual concepts, you don’t have simultaneous freedom of intension and extension. Suppose I define Mars as “A huge red rocky sphere, around a tenth of Earth’s mass and 50% further away from the Sun”. It’s then a separate matter to show that this intensional definition matches some particular extensional thing in my experience, or indeed, that it matches any real thing whatsoever. If instead I say “That’s Mars” and point to a red light in the night sky, it becomes a separate matter to show that this extensional light matches any particular intensional definition I may propose—or any intensional beliefs I may have—such as “Mars is the God of War”.
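(A sketch with rough real-world numbers: once the intensional definition is fixed, whether the pointed-to object satisfies it is something you check, not something you get to stipulate.)

```python
# Sketch with approximate real figures: fix the intensional definition,
# and whether a pointed-to object satisfies it becomes an empirical check.

def fits_mars_definition(mass_in_earths: float, distance_in_au: float) -> bool:
    """'Around a tenth of Earth's mass and 50% further from the Sun.'"""
    return 0.05 < mass_in_earths < 0.15 and 1.3 < distance_in_au < 1.7

# The red light I pointed at (actual Mars, roughly): ~0.107 Earth masses,
# ~1.52 AU from the Sun. That it passes the test is a separate, checkable fact.
print(fits_mars_definition(0.107, 1.52))  # True, but discovered, not decreed
```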

But most of the brain’s work of applying intensions happens sub-deliberately. We aren’t consciously aware that our identification of a red light as “Mars” is a separate matter from our verbal definition “Mars is the God of War”. No matter what kind of intensional definition I make up to describe Mars, my mind believes that “Mars” refers to this thingy, and that it is the fourth planet in the Solar System.

When you take into account the way the human mind actually, pragmatically works, the notion “I can define a word any way I like” soon becomes “I can believe anything I want about a fixed set of objects” or “I can move any object I want in or out of a fixed membership test”. Just as you can’t usually convey a concept’s whole intension in words because it’s a big complicated neural membership test, you can’t control the concept’s entire intension because it’s applied sub-deliberately. This is why arguing that XYZ is true “by definition” is so popular. If definition changes behaved like the empirical nullops they’re supposed to be, no one would bother arguing them. But abuse definitions just a little, and they turn into magic wands—in arguments, of course; not in reality.