Yes, I think that we’ve exhausted most of what it would be fruitful to discuss in this thread; the remaining disagreements would probably take more effort to resolve than would be feasible to expend at this time (for either of us). I do want to comment on this part, though:
>> It most certainly does not hold true for computers. I am hesitant to launch into the computer analogue of this argument, but I could easily provide a list, similar to the one in my top-level comment, of ways in which various computer products have degenerated in quality, etc.

> This seems very out there, and in line with some of the ways in which your hyper-focus on minutiae is causing you to miss the point. Computers are transparently, overwhelmingly better, year by year, decade by decade.
On the contrary, I would say that computers are transparently (!) not “overwhelmingly better, year by year, decade by decade”. They’re certainly better in some ways, but also much worse in others. (Input latency is one well-studied example, but there are quite a few others.)
It is also noteworthy that many of the ways in which computer hardware has improved (“raw” performance characteristics such as clock speed, memory capacity, and storage capacity) are spent on features that are of dubious value at best (various fancy compositor effects and other graphical capabilities of window managers), and outright user-hostile at worst (adtech and other dark patterns of the modern web).
Understand that the things I am referring to, when I make claims like the one you quoted, are not “minutiae”; rather, they are basic aspects of the everyday user experience of the great majority of personal computer users in the world.
Input latency, and its unpredictability. One famous example: for many years there were usable finger-drumming apps on iOS but not on Android, because on Android the chain of touchscreen + app + OS + sound system could not be made to respond consistently enough for people to actually drum in time. Something would always introduce a hundred ms of latency (give or take) at random moments, which is enough to ruin the feel. Everyone knew about it, and no one could fix it.
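To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (in Python, with made-up but plausible per-stage figures, since the real numbers vary wildly by device and OS version) of how a touch-to-sound pipeline accumulates latency, and why even modest per-stage delays plus random jitter add up past what rhythmic playing can tolerate:

```python
# Purely illustrative numbers (assumptions, not measurements): a rough
# budget for one touch-to-sound round trip on an older Android device.
stages_ms = {
    "touchscreen scan / debounce": 20,
    "event delivery to the app":   15,
    "app triggers the sample":      5,
    "OS mixer / audio buffering":  45,
    "DAC and speaker output":      10,
}

total_ms = sum(stages_ms.values())   # ~95 ms in this made-up example
jitter_ms = 30                       # assumed random variation per tap

# Rhythmic playing tolerates roughly 20-30 ms of *steady* latency;
# random variation of tens of ms per tap makes drumming in time impossible.
PLAYABLE_MS = 30

print(f"touch-to-sound latency: ~{total_ms} ms (+/- {jitter_ms} ms per tap)")
print("playable for finger drumming:", total_ms + jitter_ms <= PLAYABLE_MS)
```

The point is not the specific figures but that every stage contributes, so no single component’s vendor could fix the problem alone.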