Do that one then. Either destroy the industry or don’t, but don’t destroy only Anthropic.
AFAICT this case has been generally good for the industry but especially bad for Anthropic.
Edit: overall win, you can use books in training. You just can’t use pirated books.
Clearly the heroic thing to do would be to go to trial and then deliberately mess it up very badly in a calculated fashion that sets an awful precedent for the other AGI companies. You might say, “but China!”, but if the US cripples itself, then suddenly the USG would be much more interested in reaching some sort of international-AGI-ban deal with China, so it all works out.
(Only half-serious.)
Responding to the serious half only: sandbagging doesn’t work in the legal system in general, and it wouldn’t work here in particular. That’s because there is so much outside attention on the case and (presumably) so many amicus briefs laying out the most powerful arguments in the AI companies’ favor. If the judge sees that you are a $61 billion market cap company hiring the greatest lawyers in the world, yet you’re not putting your best legal foot forward while lawyers from other companies file briefs outlining their own defense arguments, the consequences for you and your lawyers will be severe, and any notion of “precedent” will be poisoned for all time.
Yeah, I figured.
What would be the actual wrongdoing here, legally speaking?
Federal lawsuits must satisfy the case-or-controversy requirement of Article III of the Constitution.
A failure to do so (if there is no genuine adversity between the parties in practice because they collude on the result) renders the lawsuit dead on the spot: the federal court cannot constitutionally exercise jurisdiction over the parties, so there can be no decision on the merits. It also exposes the lawyers and the parties to punishment if they tried to conceal the collusion from, or directly lied to, the judge, both because lying to a judicial officer in signed affidavits is a disbarrable offense and because it would waste the court’s already undersupplied resources.