I can see the appeal, but I worry that a metaphor in which a single person is handed a single piece of software, with the option to rewrite it for their own and/or others’ purposes without grappling with myriad upstream and downstream dependencies, vested interests, and so forth, is probably missing an important part of the dynamics of real-world systems.
(This doesn’t really speak to moral obligations toward systems so much as to the practical challenges of doing anything about them, but my experience is that the latter is a much more binding constraint.)
Indeed. I impulsively wrote a continuation story in response. It’s very rough, and the later sections kind of got away from me, but I’ve posted it as a scribble, “Bad Reasons Behind Different Systems and a Story with No Good Moral”, which may be relevant.
I liked it. It made me think a bit more.
First Take: Tangentially, does this point to an answer to the question of what bureaucrats are trying to maximize (as sometimes discussed on LessWrong)? Maybe they are trying to minimize operational hitches within their small realm.