“So I can imagine another math in which 2+2=5 is not obviously false, but needs a long proof and complicated equations...”
So, from the fact that another mind might take a long time to understand integer operations, you conclude that it has “another math”? And what does that mean for algorithms?
If an intelligence is general, it will be able to, in time, understand any concept that can be understood by any other general or narrow intelligence. And then use it to create an algorithm. Or be conquered.
Think of binary arithmetic versus decimal arithmetic versus hexadecimal arithmetic.
Certain things in each of these arithmetics are extremely easy compared to the other two.
For example, in binary, multiplying by 2 is absurdly easy, but multiplying by 10 is much harder. Multiplying by 16 is actually slightly easier than multiplying by 10, since 16 is a power of 2 and some neat tricks carry over between the two bases.
In decimal, multiplying by 10 is never hard, no matter how big the number. Multiplying by 2 can be hard if the number is big enough, but it’s still pretty easy. Multiplying by 16 takes some mental gymnastics right from the get-go (well, for most people anyway).
You see the pattern, so I won’t do hex.
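The "easy in one base, hard in another" point can be made concrete in a few lines of Python (a sketch of my own, not from the original discussion): in binary, multiplying by a power of 2 is just shifting digits, the same way multiplying by 10 in decimal just appends a zero.

```python
# Multiplying by 2 in binary is a single left shift -- digits slide
# over and a zero is appended, exactly like multiplying by 10 in decimal.
n = 0b1011            # 11 in decimal

doubled = n << 1      # one shift = multiply by 2
times16 = n << 4      # four shifts = multiply by 16, still no "real" multiply

# Multiplying by 10 has no single-shift trick in binary; you end up
# decomposing it, e.g. 10*n = 8*n + 2*n = (n << 3) + (n << 1).
times10 = (n << 3) + (n << 1)

print(bin(doubled))   # 0b10110 (22)
print(times16)        # 176
print(times10)        # 110
```

The asymmetry is the whole point: which operations feel "trivial" depends entirely on the base you happen to be working in.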
Basic arithmetic with decimal fractions is quite easy in decimal, but doing the same in binary is significantly more difficult and often produces non-terminating representations, much like 1/3 does in decimal (and pi terminates in no integer base at all). 10.06 might look nice and clean in decimal, but it's a nightmare in binary. The net result in computer science is that you have to be very, very careful with binary rounding errors, since almost every floating point calculation is going to require rounding for most numbers.
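You can watch this happen from the Python prompt (a quick illustration of the point above, not anything deeper): 10.06 cannot be stored exactly as a binary float, so what actually gets stored is the nearest representable double.

```python
# The fractional part of 10.06 repeats forever in base 2, much like
# 1/3 repeats forever in base 10, so the stored double is only the
# nearest binary approximation.
from decimal import Decimal

x = 10.06
print(Decimal(x))        # prints the exact value actually stored --
                         # a long decimal, not 10.06

# The classic consequence of such rounding:
print(0.1 + 0.2 == 0.3)  # False
```

This is exactly why floating-point comparisons in real code use a tolerance rather than `==`.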
And that’s just starting with a different number of digits on your hands. Imagine if you looked at the world in a completely different way than we do: what would math look like? The physics wouldn’t change, but perhaps calculus would be as easy to them as addition is to us, while subtraction required eight years of schooling to wrap your head around.
What if Martians could follow the movements of electrons, but couldn’t tell that their fingers, thumb, and palm were the same thing as their hand? What would their math look like then?