I can’t figure out what you’re trying to say. Are you saying:
(1) QFT is inherently incapable of explaining the aerodynamics of rigid macroscopic bodies, ever
(2) QFT can do that in principle, but in practice we can’t yet justify some of the intermediate steps
(3) QFT can do that in principle, but in practice it’s pointless because the higher-level theories already tell you everything about the higher levels
(4) something else?
As I said, explaining != predicting.
Is that my option 3? ETA: I still don’t know what your point is. Try s/explaining/predicting/ in my previous comment.
ETA 2: The meaning of your original comment is what I really don’t get. Eliezer is saying that reality consists of elementary particles, not elementary particles and other things. You don’t seem to be disagreeing with this, but you’re deprecating the proposition somehow. You say it’s not predictive, but what is the significance of that? The fact that every natural number has a successor won’t help you do arithmetic, but it’s still true; and you really can do arithmetic with the larger axiom set of which it is a part. Analogously, the proposition that everything is made of elementary particles is not in itself very predictive, but it is a property of fundamental theories which we use and which are predictive.
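The arithmetic analogy can be made concrete. As a sketch (my own illustration, not from the thread) in Lean 4: the bare fact that every number has a successor computes nothing by itself, but together with a recursive definition on top of it you get addition.

```lean
-- Sketch only: a minimal Peano-style naturals type.
inductive MyNat where
  | zero : MyNat
  | succ : MyNat → MyNat

-- "Every number has a successor" (the succ constructor alone) predicts
-- nothing about sums. Add recursion, and arithmetic falls out:
def add : MyNat → MyNat → MyNat
  | n, MyNat.zero   => n
  | n, MyNat.succ m => MyNat.succ (add n m)
```

The successor axiom is still doing real work here; it just only pays off as part of the larger package, which is the point being made about elementary particles.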
What I am saying is that this is irrelevant for higher-level concepts. You can make the same brain out of neurons or, if you believe in uploading, out of bits. It will have all the same cognitive processes, the same biases, etc. Knowing that the former can eventually be decomposed into subatomic particles adds nothing to our understanding of psychology.
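The substrate-independence point can be put in code. A minimal toy sketch (my own illustration, not from the thread; the class names and the "bias" formula are made up), assuming nothing beyond the analogy itself:

```python
# Toy illustration: the same high-level "psychology" realized on two
# different low-level substrates. Nothing at the psychological level
# distinguishes them.

class NeuronBrain:
    """Stand-in for a brain made of neurons (hypothetical)."""
    def estimate(self, anchor: int, true_value: int) -> int:
        # A made-up anchoring bias: estimates are pulled toward the anchor.
        return (7 * true_value + 3 * anchor) // 10

class BitBrain:
    """Stand-in for the same brain uploaded to bits (hypothetical)."""
    def estimate(self, anchor: int, true_value: int) -> int:
        # Identical high-level behavior, different imagined substrate.
        return (7 * true_value + 3 * anchor) // 10

# At the level of psychology, only input/output behavior matters:
for brain in (NeuronBrain(), BitBrain()):
    assert brain.estimate(anchor=100, true_value=50) == 65
```

Decomposing `NeuronBrain` further into subatomic particles would not change anything observable at the `estimate` level, which is the claim being made.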
So? What query are you trying to answer?
Are you asking whether we ought to study and understand reductionism? Answer: yes, if we don’t get reductionism, we might miss that uploads, etc. are possible.
Are you saying it may not be worth it to learn all the low-level detail, because our higher abstractions aren’t all that leaky? Answer: agree for most things, but some require the lower stuff.
Why are you bringing this up?
I thought I had clearly explained it in my original top-level comment: the underlying structure is irrelevant for the entities a few levels removed.
And if it did, it wouldn’t matter for atomic physics and up.
I understand what you are saying. Why are you saying it? What is interesting about the idea that higher levels of your map are agnostic to lower level details? What is the query?