It’s possible to detect tulips, but there are many alternative things one could detect instead, so there needs to be some motivation for detecting tulips in particular. For natural concepts, that motivation is efficient world modeling (which your AI, by assumption, doesn’t need to care about); for morality-related concepts, it’s value judgments (these will require different concepts for different AIs, though the AIs may agree on the utility of keeping track of the “fundamental” physical facts).
(On a different note, “Are tulips in the territory?” sounds like a question about definitions. There may be a more specific query in the vicinity that is actually relevant, but I’m not sure how to find it.)
So you’re saying that my AI (with infinite computational power) would never discover the existence of tulips?
(On a different note, “Are tulips in the territory?” sounds like a question about definitions.
I don’t intend it to be. I think tulips exist, unlike shmulips (similar to tulips, except they have golf balls instead of flowers), which don’t. I don’t think I have a firm grip on the map-territory distinction, but I was trying to use it in the way Wei was using it.
Anyway, here’s the basis of my question: tulips do exist. They’re real, mind-independent things, and they are part of the furniture of the universe. Any god or AI who came into our universe would have an incomplete understanding of it if they failed to include tulips in their story.
That said, is the complete story of our universe derivable from a complete story of the ontological primitives (plus whatever logic you wish to avail yourself of)? I’m not totally sure that’s a well-formed question, mind you.
Anyway, here’s the basis of my question: tulips do exist. They’re real, mind-independent things, and they are part of the furniture of the universe. Any god or AI who came into our universe would have an incomplete understanding of it if they failed to include tulips in their story.
Tulips objectively exist as a fuzzy cluster in configuration space, and if an AI were to list all facts about the world, this would be one of them. But unlike us, an AI or god doesn’t necessarily have a reason to notice this clustering or make any use of it. It’s kind of like 22581959 being prime is an objective fact that you and I can discover, but don’t necessarily have any reason to notice or make use of.
BTW, Eliezer argued, and I agree, that this kind of objective clustering can’t be used directly to define morally relevant concepts like “people”.
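Wei’s primality analogy can be made concrete: whether 22581959 is prime is a mechanical fact that anyone can check, but the check only happens if something motivates running it. A minimal sketch (the trial-division routine below is illustrative, not from the original discussion):

```python
def is_prime(n: int) -> bool:
    """Deterministic trial division: the fact of n's primality is
    objective either way; computing it takes a reason to look."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

# Checking the number from the comment above: the answer exists
# whether or not any agent ever bothers to compute it.
print(is_prime(22581959))
```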
Tulips objectively exist as a fuzzy cluster in configuration space, and if an AI were to list all facts about the world, this would be one of them.
I was being a bit ambiguous: I meant to talk about concrete individual tulips, not the species. Even given the framework in “The Cluster Structure...”, each actual tulip is a point in thingspace, not a cluster.
If an AI were to list all facts about the world, it would list that the wavefunction of the universe can be approximately factored into X, where X corresponds to what we would call an individual tulip. (Note that an individual tulip is also actually a cluster in configuration space, because it’s a blob of amplitude-factor, not a single point in configuration space. Of course this cluster is much smaller than the cluster of all tulips.)
Okay, that answers my question, thanks.