Is GitHub Copilot in legal trouble?

This is basically a crosspost of https://githubcopilotinvestigation.com/. I noticed that some folks in California are considering a lawsuit against Microsoft/OpenAI over GitHub Copilot.

tl;dr:

  • Copilot is trained on open source software.

  • Copilot doesn’t respect the licensing agreements of that software.

  • Copilot doesn’t have a clear fair use argument for doing so.

  • By accepting Copilot suggestions, you may be violating those licensing agreements yourself.

Sections of particular interest:

[W]e inquired privately with Friedman and other Microsoft and GitHub representatives in June 2021, asking for solid legal references for GitHub’s public legal positions … They provided none.

- Software Freedom Conservancy

“You are responsible for ensuring the security and quality of your code. We recommend you take the same precautions when using code generated by GitHub Copilot that you would when using any code you didn’t write yourself. These precautions include rigorous testing, IP [intellectual property] scanning [my emphasis], and tracking for security vulnerabilities.”

- https://docs.github.com/en/copilot/overview-of-github-copilot/about-github-copilot#using-github-copilot
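
For concreteness, here is a rough sketch of what a naive “IP scan” of a Copilot suggestion could look like: fuzzy-matching the suggested snippet against a local checkout of open source projects, using only the Python standard library. The corpus directory, the file glob, and the similarity threshold are illustrative assumptions on my part, not anything GitHub provides or recommends.

```python
# A naive "IP scan": look for long verbatim overlaps between a Copilot
# suggestion and files in a local corpus of open source projects.
# CORPUS_DIR and THRESHOLD are made-up illustrative values.
import difflib
from pathlib import Path

CORPUS_DIR = Path("oss_corpus")  # hypothetical local checkout of OSS repos
THRESHOLD = 0.6                  # flag if >60% of the suggestion appears verbatim

def scan_suggestion(suggestion: str):
    """Return (file, overlap fraction) pairs where the overlap is heavy."""
    hits = []
    for path in CORPUS_DIR.rglob("*.py"):
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
        except OSError:
            continue
        matcher = difflib.SequenceMatcher(None, suggestion, text)
        longest = matcher.find_longest_match(0, len(suggestion), 0, len(text))
        overlap = longest.size / max(len(suggestion), 1)
        if overlap >= THRESHOLD:
            hits.append((path, overlap))
    return sorted(hits, key=lambda h: h[1], reverse=True)

if __name__ == "__main__":
    suggestion = "..."  # paste the Copilot-suggested snippet here
    for path, overlap in scan_suggestion(suggestion):
        print(f"{overlap:.0%} of the suggestion appears verbatim in {path}")
```

Real license- and IP-scanning tools do far more than this (license detection, code fingerprinting, provenance tracking), but the point stands: per GitHub’s own docs, the burden of checking falls on the user accepting the suggestion.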