The usual answer is that it is measuring the g factor, the unobserved general-intelligence capability. It was originally formulated as the first principal component of the results of a variety of IQ tests. It is quantifiable (by IQ points) and it does have real-world consequences.
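The "first principal component" construction mentioned here can be sketched in a few lines. This is a minimal illustration, not a real norming procedure: the five subjects, four subtests, and all the scores are invented for the example.

```python
import numpy as np

# Hypothetical raw scores for 5 subjects on 4 correlated subtests
# (rows = subjects, columns = subtests); values are illustrative only.
scores = np.array([
    [12.0, 30.0, 55.0,  9.0],
    [15.0, 34.0, 61.0, 11.0],
    [ 9.0, 25.0, 48.0,  7.0],
    [14.0, 33.0, 60.0, 10.0],
    [11.0, 28.0, 52.0,  8.0],
])

# Standardize each subtest, then take the first principal component
# of the resulting covariance structure -- the classic construction of g.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
first_pc = eigvecs[:, np.argmax(eigvals)]   # loading of each subtest on g
g_scores = z @ first_pc                     # each subject's g estimate
```

Because the subtests are strongly correlated, the first component captures a single "general level" that tracks each subject's overall performance.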
Absolutely, but +n g doesn’t necessarily mean +m IQ for all (n,m).
I don’t understand what that means.
Here’s a place where my intuition’s going to struggle to formulate good words for this.
An intelligent system receives information (which has fundamental units of Entropy) and outputs a behavior. A “proper” quantitative measure of intelligence should be a simple function of how much Utility it can expect from its chosen behavior, on average, given an input with n bits of Entropy, and t seconds to crunch on those bits. Whether “Utility” is measured in units similar to Kolmogorov complexity is questionable, but that’s what my naive intuition yanked out when grasping for units.
But the point is, whatever we actually choose to measure g in, the term “+1 g” should make sense, and should mean the same thing regardless of what our current g is. IQ, being merely a statistical fit onto a gaussian distribution, does NOT do that.
but +n g doesn’t necessarily mean +m IQ for all (n,m)
This phrase implies that you have a metric for g (different from IQ points) because without it the expression “+n g” has no meaning.
An intelligent system receives information (which has fundamental units of Entropy) and outputs a behavior.
Okay. To be precise we are talking about Shannon entropy and these units are bits.
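For concreteness, Shannon entropy in bits is just the expected surprisal under log base 2; a minimal stdlib-only sketch:

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy of a discrete distribution, in bits (log base 2)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of entropy:
shannon_entropy_bits([0.5, 0.5])   # -> 1.0
# A certain outcome carries none:
shannon_entropy_bits([1.0])        # -> 0.0
```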
A “proper” quantitative measure of intelligence should be a simple function of how much Utility it can expect from its chosen behavior
Hold on. What is this Utility thing? I don’t see how it fits in the context in which we are talking. You are now introducing things like goals and values. Kolmogorov complexity is a measure of complexity, what does it have to do with utility?
the term “+1 g” should make sense, and should mean the same thing regardless of what our current g is
I don’t see this as obvious. Why?
IQ, being merely a statistical fit onto a gaussian distribution
Not so. IQ is a metric, presumably of g, that is rescaled so that the average IQ is 100. Rescaling isn’t a particularly distorting operation to do. It is not fit onto a gaussian distribution.
IQ is a metric, presumably of g, that is rescaled so that the average IQ is 100. Rescaling isn’t a particularly distorting operation to do. It is not fit onto a gaussian distribution.
I’m afraid you’re mistaken here. IQ scores are generally derived from a set of raw test scores by fitting them to a normal distribution with mean 100 and SD of 15 (sometimes 16): IQ 70 is thus defined as a score two standard deviations below the mean. It’s not a linear rescaling, unless the question pool just happens to give you a normal distribution of raw scores.
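A minimal sketch of the rank-based fitting described here. The norming sample is invented and tiny, and real test norming handles ties and sampling far more carefully; the point is only that each raw score is mapped through its percentile rank onto a normal curve with mean 100 and SD 15, so the output is normal by construction regardless of the raw-score distribution.

```python
from statistics import NormalDist

def iq_from_raw(raw_scores):
    """Convert raw scores to IQ by rank-based fitting to N(100, 15)."""
    norm = NormalDist(mu=100, sigma=15)
    n = len(raw_scores)
    order = sorted(raw_scores)
    # Percentile rank of each score (midpoint rank avoids 0% and 100%;
    # assumes no duplicate scores, which holds for this toy sample).
    def percentile(x):
        return (order.index(x) + 0.5) / n
    return [norm.inv_cdf(percentile(x)) for x in raw_scores]

# A heavily skewed raw-score pool still yields normally spaced IQs:
# the raw jumps 8 -> 20 and 20 -> 45 each advance IQ by one rank step,
# however large the raw gap. That is the non-linearity in question.
raw = [3, 4, 5, 6, 7, 8, 20, 45, 90]
iqs = iq_from_raw(raw)
```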
Hm. A quick look around finds this, which says that raw scores are standardized by forcing them to the mean of 100 and the standard deviation of 15.
This is a linear transformation and it does not fit anything to a gaussian distribution.
Of course this is just stackexchange—do you happen to have links to how “proper” IQ tests are supposed to convert raw scores into IQ points?
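A sketch of the linear standardization that the linked answer describes: z-score the raw scores, then shift and stretch to mean 100 and SD 15. Being affine, this preserves the shape of the raw distribution, which is exactly the point of contention with rank-based normal fitting. The sample data is invented.

```python
from statistics import mean, stdev

def linear_standardize(raw_scores, target_mean=100, target_sd=15):
    """Affine rescaling of raw scores to a target mean and SD.
    Preserves the shape of the raw distribution: if the raw
    scores are skewed, the rescaled scores stay skewed."""
    m, s = mean(raw_scores), stdev(raw_scores)
    return [target_mean + target_sd * (x - m) / s for x in raw_scores]

# The same skewed pool: mean and SD come out right, but the result
# is NOT normally distributed -- the skew survives the transformation.
raw = [3, 4, 5, 6, 7, 8, 20, 45, 90]
rescaled = linear_standardize(raw)
```

Whether published tests apply an affine map like this or a rank-based normal fit is precisely what the two sides of this exchange disagree about.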
If the difficulty of the questions can’t be properly quantified, what exactly do the raw scores tell you?
See the first sentence of the penultimate paragraph of this.