First, intuitionist mathematics makes perfect sense to me, and has for some time. Not as a replacement for the conventional math, but as an alternative view.
Second, it has nothing to do with physics, so Gisin’s musings (published work, not some popular interpretation of it: https://arxiv.org/abs/2002.01653), are guaranteed to be not a step in any progress of the understanding of physics. Sorry.
Third, both are trying to get to an important point, but tend to miss it. The point is that it takes effort, time and resources to build useful models of observations. Thus, whether the gazillionth digit of pi is even or odd is not an absolute question (hence no law of excluded middle). It takes (agent-dependent) effort to calculate it, and until you yourself ascertain which it is, if any, it’s neither, not for you. You can assign probabilities to it being even and it being odd, and these probabilities might even add to one, but it is still possible that something would prevent you from ever calculating it, so all you can say is “If I ever get to calculate it, the result will be either even or odd.” Note that you do not make a statement about some objective gazillionth digit of pi, only about your result of calculating it. You might make a mistake somewhere, for example. Or you might die before. So, intuitionism doesn’t go far enough, because it’s still trying to be “objective” while giving up most of the objectivity of the traditional mathematics.
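The “agent-dependent effort” point can be made concrete. As an illustrative sketch of mine (nothing from Gisin’s paper), the Bailey–Borwein–Plouffe formula computes the n-th hexadecimal digit of pi directly, yet the work still grows with n, and plain floating point limits how far you can push it:

```python
def pi_hex_digit(n):
    """Return the (n+1)-th hexadecimal digit of pi after the point,
    using the Bailey-Borwein-Plouffe digit-extraction formula.
    Plain floats keep this honest only for modest n."""
    def s(j):
        # fractional part of sum_k 16**(n-k) / (8k + j)
        total = 0.0
        for k in range(n + 1):
            # modular exponentiation keeps each term's fractional part
            total = (total + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        k = n + 1
        while True:  # rapidly convergent tail with k > n
            term = 16.0 ** (n - k) / (8 * k + j)
            if term < 1e-17:
                break
            total += term
            k += 1
        return total
    x = (4 * s(1) - 2 * s(4) - s(5) - s(6)) % 1.0
    return int(x * 16)
```

In hexadecimal, pi = 3.243F6A88…, so the first few calls return 2, 4, 3, 15. The point stands: you get the digit only after you (or your machine) actually do the work.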
Again, mathematical proofs are as much observations as anything else. Just because they happen in one’s head or with a pencil on paper, they are still observations. Repeatable, given the right equipment (including the right brain, such as the one capable of proving some theorems), and so reliable under some conditions.
The above means that some of your examples are better than others. Once you have the tools for manipulating infinite numbers, you don’t have to expend a lot of effort doing so. The difficulty of calculating a far-away digit in the decimal expansion of pi has nothing to do with pi itself: you can perfectly well define it as the ratio of circumference to diameter, or as a limit of some series, or as 2*arcsin(1), or something else. You can even build a base-pi system of counting. Then you can complain how hard it is to find the umpteenth digit of the number 1 written in base pi! It doesn’t mean that 1 is more complex or simpler than pi; all it means is that certain calculations are harder than others, and the hardness depends on many things, including who is doing the calculation and what tools they are using.
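The base-pi remark can be sketched with a greedy beta-expansion (a toy of my own, not anything from the discussion above); even the digits of plain old 1 take work to compute in base pi:

```python
import math

def base_pi_digits(x, n):
    """Greedy (beta-expansion) digits of x in base pi, for x in [0, 1].
    Each digit is in {0, 1, 2, 3} since the base is pi < 4."""
    digits = []
    for _ in range(n):
        x *= math.pi
        d = int(x)      # integer part is the next base-pi digit
        digits.append(d)
        x -= d          # keep the fractional remainder
    return digits
```

For example, `base_pi_digits(1, 5)` gives the opening digits of 1 in base pi, which look nothing like a “simple” number. Which expansions are easy depends on the base you chose, not on the number itself.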
So the point is how hard it is to measure something, and that includes how much time it takes, not any kind of correspondence with counting numbers taking longer.
Fourth, the idea that Einstein’s equations are somehow unique in terms of being timeless is utterly false. Electromagnetism is often written in a covariant form as □A=J and dA=0, where A and J are spacetime quantities. Conversely, the Einstein equation can be cast as an initial value problem, with the time evolution made explicit, and it is done that way in all black hole collision simulations. Similarly, quantum mechanics can be written as a path integral, where time is just one of the variables.
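Spelled out, one standard way to write the covariant form alluded to above is the textbook Lorenz-gauge version (natural units; this is the usual presentation, not a quote from the article):

```latex
\Box A^{\mu} = J^{\mu}, \qquad \partial_{\mu} A^{\mu} = 0
```

No time variable is singled out; the wave operator \Box and the gauge condition treat all four spacetime coordinates on the same footing, which is the sense in which the equations are “timeless”.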
So, Gisin’s attempts to use intuitionist math for physics are bound to be forgotten, as they add nothing to our understanding of either math or physics. Sadly, he missed the point.
“so Gisin’s musings… are guaranteed to be not a step in any progress of the understanding of physics.”
What is your epistemic justification for asserting such a guarantee of failure? Of course, any new speculative idea in theoretical physics is far from likely to be adopted as part of the core theory, but you are making a much stronger claim by saying that it will not even be “a step in any progress of the understanding of physics”. Even ideas that are eventually rejected as false are often useful for developing understanding. Gisin’s papers ask physicists to consider their unexamined assumptions about the nature of math itself, which seems at least like a fruitful path of inquiry, even if it won’t necessarily lead to any major breakthroughs.
“mathematical proofs are as much observations as anything else. Just because they happen in one’s head or with a pencil on paper, they are still observations.”
This reminds me of John Locke’s view that mathematical truths come from observation of internal states. That is an interesting perspective, but I’m not sure it can hold up to scrutiny. The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we have no framework for evaluating the evidence provided by those observations. Perhaps you can give an alternative account of how we evaluate evidence without presupposing a rational framework.
“The difficulty of calculating a far-away digit in the decimal expansion of pi has nothing to do with pi itself: you can perfectly well define it as the ratio of circumference to diameter, or as a limit of some series”
I agree with this statement. I think, though, it misses the point I was elaborating about Brouwer’s concept of choice sequences. The issue isn’t that we can’t define a sequence that is equivalent to the infinite expansion of pi; it is rather that for any real quantity we can never be certain that it will continue to obey the lawlike expansion into the future. So the issue isn’t the “difficulty of calculating a far-away digit”; the issue is that no matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern. No matter how many digits of pi a real number contains, the next digit might suddenly be something other than pi’s (in which case we would say retrospectively that the real number was never equal to pi in the first place). This is actually what we observe if we, say, measure the ratio of a jar lid’s circumference to its diameter. The first few digits will match pi, but then, as we go to smaller scales, it will deviate.
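As a toy illustration of that last point (my own sketch, with a made-up measured value), one can count how many leading decimal digits of a finite-precision measurement agree with pi before deviating:

```python
import math

def matching_digits(x, y, places=10):
    """Count how many leading decimal digit characters of x and y agree."""
    sx, sy = f"{x:.{places}f}", f"{y:.{places}f}"
    count = 0
    for a, b in zip(sx, sy):
        if a != b:
            break       # first disagreement ends the match
        if a.isdigit():
            count += 1  # count digits only, skip the decimal point
    return count

# A hypothetical jar-lid measurement, good to roughly five or six digits:
measured_ratio = 3.14159
print(matching_digits(measured_ratio, math.pi))
```

Beyond the matching prefix, the measured expansion says nothing about pi; it reflects the lid, the ruler, and the measurer.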
“…the idea that Einstein’s equations are somehow unique in terms of being timeless is utterly false”
I made no claim that they are unique in this regard.
“mathematical proofs are as much observations as anything else. Just because they happen in one’s head or with a pencil on paper, they are still observations.”
I think this is better explained as:
We try to do math, but we can make mistakes.*
If two people evaluate an arithmetic expression the same way, but one makes a mistake, then they might get different answers.
*Other examples:
1. You can try to create a mathematical proof. But if you make a mistake, it might be wrong (even if the premises are right).
2. Is it an incorrect proof, a typo, or just something on your computer screen?
A proof might have a mistake in it and thus “be invalid”. But it could also have a typo, which if corrected yields a “valid proof”.
Or, the proof might not have a mistake in it—you could have misread it, and what it says is different from what you saw. (Someone can also summarize a proof badly.)
If the copy of the proof you have is different from the original, errors (or changes) could have been introduced along the way.
The Einstein equation was singled out in the Quanta magazine article. I respect the author, who has written a lot of good articles for Quanta, but this was quite misleading.
I don’t understand your second-to-last point. Are you talking about a mathematical algorithm or about a physical measurement? “no matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern”—what pattern?
“The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we have no framework for evaluating the evidence provided by those observations.”
No, we don’t. And yes, they are. We start with some innate abilities of the brain, add the culture we are brought up in, then develop models of empirical observations, whatever they are. 1+1=2 is an abstraction of various empirical observations, be it counting sheep or constructing mathematical proofs. Logic and math co-develop with increasingly complex models and increasingly non-trivial observations; there is no “we need logic and math to evaluate evidence”. If you look through the history of science, math was developed alongside physics, as one of its tools. In that sense the Noether theorem, for example, is akin to, say, a new kind of telescope.
“What is your epistemic justification for asserting such a guarantee of failure?”
Because they are of the type that is “not even wrong”. The standard math works just fine for both GR and QM; the two main issues are conceptual, not mathematical: How does the (nonlinear) projection postulate emerge from the linear evolution (and no, MWI is not a useful “answer”, it has zero predictive power), and how do QM and GR mesh at the mesoscopic scale (i.e. what are the gravitational effects of a spatially separated entangled state?).