I have never heard of this before, let alone understand it. Can you recommend any good primers? All the resources I can find speak in an annoyingly vague and abstract way, like “a top-level ontology that provides a common framework for describing the fundamental concepts of reality” or “realist approach… based on science, independent of our linguistic, conceptual, theoretical, cultural representations”.
I think the general issue is that people in this community and the AI alignment community have thought quite seriously about epistemology but not about ontology.
There’s nothing vague about the sentence. It’s precise enough that it’s an ISO/IEC standard. It is, however, abstract. If you have a discussion about Bayesian epistemology, you are also going to encounter many abstract terms.
BFO grew out of the practical needs that bioinformaticians had around 2000. The biologists didn’t think seriously about ontology, so someone needed to think seriously about it to enable big data applications where an unclear ontology would produce problems. Since then, BFO has been adopted much more broadly and made into the international standard ISO/IEC 21838-2:2021.
This happens in a field that calls itself applied ontology. Books like Building Ontologies with Basic Formal Ontology by Robert Arp, Barry Smith, and Andrew D. Spear explain the topic in more detail. Engaging with a serious conceptual framework is work, but if you buy the core claim of ‘I think that people overrate Bayesian reasoning and underrate “figure out the right ontology”’, you shouldn’t just develop your ontology based on your own naive assumptions about ontology; you should familiarize yourself with applied ontology. For AI alignment, that’s probably valuable both on the conceptual layer of the ontology of AI alignment and for thinking about the ontological status of values and how AI is likely going to engage with that.
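To make the “top-level” part concrete, here is a minimal Python-flavored sketch of BFO’s most basic division, between continuants (entities that persist through time) and occurrents (processes that unfold in time). The domain classes at the bottom are hypothetical examples of mine; the actual BFO is formalized in OWL and CLIF, not Python.

```python
# Minimal illustrative sketch of BFO's top-level split.
# Not the official BFO release, which is formalized in OWL and CLIF.

class Entity:
    """Anything that exists."""

class Continuant(Entity):
    """Persists through time, wholly present at each moment (e.g. an organism)."""

class Occurrent(Entity):
    """Unfolds in time and has temporal parts (e.g. a process)."""

# Hypothetical domain classes a bioinformatician might hang under these:
class Organism(Continuant):
    pass

class CellDivision(Occurrent):
    pass
```

The point of a top-level ontology is that every domain term gets a definite place in a hierarchy like this, so data annotated by different labs can be merged without category confusion.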
After architecting BFO and first working in bioinformatics, Barry Smith went to the US military to do ontology for their big data applications. You can’t be completely certain what the military does internally, but I think there’s a good chance that most of the ontology that Palantir uses for the military’s big data is BFO-based. When Claude acts within Palantir to engage in acts of war in Iran, a complete story about how that activity is “aligned” includes BFO.
I strongly disagree. “Describing the fundamental concepts of reality” is unhelpfully vague: what are these fundamental concepts? I don’t know and can’t guess what they are from that sentence, which is ironic considering it describes an ontological framework.
The word “reality” has a clear meaning in ontological realism. If you lack that background, then it feels vague.
This is similar to saying that someone who speaks about something being statistically significant is vague because “significant” is a vague word. You actually need to understand something about statistics for the term not to feel vague.