A minor comment and a major one:

The nits: the section on the Israeli military’s use of AI against Hamas could use some tightening to avoid getting bogged down in the particularities of the Palestine situation. The line “some of the surveillance tactics Israeli settlers tested in Palestine” (my emphasis) to me suggests the interpretation that all Israelis are “settlers,” which is not the conventional use of that term. The conventional use of settlers applies only to those Israelis living over the Green Line, and particularly those doing so with the ideological intent of expanding Israel’s de facto borders. Similarly but separately, the discussion about Microsoft’s response to me seemed to take as facts what I believe to still only be allegations.
The major comment: I feel you could go farther to connect the dots between the “enshittification” of Anthropic and the issues you raise about the potential of AI to help enshittify democratic regimes. The idea that there are “exogenously” good and bad guys, with the former being trustworthy to develop A(G)I and the latter being the ones “we” want to stop from winning the race, is really central to AI discourse. You’ve pointed out the pattern in which participating in the race turns the “good” guys into bad guys (or at least untrustworthy ones).
Thanks for the comments.

The conventional use of settlers applies only to those Israelis living over the Green Line, and particularly those doing so with the ideological intent of expanding Israel’s de facto borders.
Ah, I was actually trying to draw a distinction between Israeli citizens in general and those settling Palestinian regions specifically; I didn’t want to implicate Israelis broadly. But I see it’s not a good distinction, because there are also soldiers and tech company employees testing surveillance tactics on Palestinians while living in Israel (the post-Nakba region).
Of course, I’m also just not well acquainted with how all these terms get used by people living in the regions. Thanks for the heads-up! I’ll see how to rewrite this to be more accurate.
the discussion about Microsoft’s response to me seemed to take as facts what I believe to still only be allegations.
You’re right. I added it as a tiny sentence at the end. But what’s publicly established is that Microsoft supplied cloud services to the IDF and let them use those services for whatever they wanted, not that the cloud services were specifically used to store tapped Palestinian calls. I’ll add a footnote about this.
EDIT: after reading the Guardian article linked from the one announcing Microsoft’s inquiry, I think the second point is also pretty well established: “But a cache of leaked Microsoft documents and interviews with 11 sources from the company and Israeli military intelligence reveals how Azure has been used by Unit 8200 to store this expansive archive of everyday Palestinian communications.”
The major comment: I feel you could go farther to connect the dots between the “enshittification” of Anthropic and the issues you raise about the potential of AI to help enshittify democratic regimes.
This is a great insight. The honest answer is that I had not thought of connecting those dots here.
We see that race to the bottom, releasing AI in order to extract its benefits, both in the actions of Anthropic’s founding researchers and in broader US society.
Thanks for this.