FWIW this happens all the time in both directions (the other being when a term becomes so overused as to become meaningless), and often (as, arguably, with AI today) in both directions at once. My background is in materials science, and IMO this is basically what happened with terms like nanotech, metamaterials, smart materials, and 3D printing. My mental model is something like: motte-and-bailey by people (often people not quite at the cutting edge but trying to develop a tech or product) leads to poorly researched press coverage (but I repeat myself), which leads to popular disillusionment, such that the actual advances that are happening get quietly ignored and/or shouted down, no matter how much or how little impact they’re having. Sometimes the actual rates of technological progress and commercial adoption are quite smooth, even as the levels of hype and investment and other activity shift wildly around them.
Just so!
My impression is that language is almost always evolving, and most of the evolution is in an essentially noisy and random and half-broken direction, as people make mistakes, or tell lies, or whatever, and then regularly try to reconstruct the ability to communicate meaning coherently using the words and interpretive schemes at hand.
In my idiolect, “nanotechnology” still means “nanotechnology”, but I’m also aware that semantic parasites have ruined its original clean definition. So, in the presence of people who don’t want to keep using the old term in spite of the damage to the language, I am happy to code-switch and say “precise atom-by-atom manufacturing of arbitrary molecules by generic molecular assemblers based on arbitrary programming signals”, or whatever other phrase helps people understand that I’m talking about a technology that could exist but doesn’t exist yet, and which would have radical implications if developed.
I saw the original essay as an attempt to record my idiolect, and my impression of what was happening, at this moment in history, before this moment ends.
(Maybe it will be a slightly useful datapoint for posthuman historians, as they try to pinpoint the precise month that it became inevitable that humans would go extinct or whatever, because we couldn’t successfully coordinate to do otherwise, because we couldn’t even speak to each other coherently about what the fuck was even happening… and this is a GENERAL problem for humans, in MANY fields of study.)