I either didn’t know or hadn’t thought about most of what you say here in this context, thank you.
Yet this (the exact length of a meter) is more-or-less settled, in the sense that very many people use it without significant loss of what they want to convey. This is kind of exactly the thing I’d like to learn about—how unit-variable relationships evolve and come to some ‘resting position’. How people first come to think about the subject matter itself, then about the ways to describe it, and finally about the number of a common ‘piece’ used to measure it.
I think the Applied Ontology book is worth reading, as it touches on a lot of the practical concerns that come with the need for standardization driven by automated knowledge processing. Even if you aren’t interested in automated knowledge processing, it’s still useful.
Inventing Temperature: Measurement and Scientific Progress by Hasok Chang is a good case study of how our measure of temperature evolved. Temperature is a good example because conceptualizing it is harder than conceptualizing length. In the Middle Ages people had measures for length, but they didn’t have one for temperature.
The definition of the meter in terms of the wavelength of light, instead of the prototype meter bar, was settled in 1960, but the number of people for whom this had practical consequences was relatively small.
Interestingly, there is at the moment a proposed change to the SI system that redefines the kilogram: https://en.wikipedia.org/wiki/Proposed_redefinition_of_SI_base_units
It shifts the uncertainty we have over a few constants. Beforehand the kilogram was exact by definition, and afterwards we only know it to about 8 digits of accuracy. On the other hand, we get more accuracy for a bunch of other measurements. It might be worth reading a bit into the debate if you care about how standards are set.