On the AI-art-as-theft thing, two things: yes, it's true that the art world has its own internal biases (as all worlds do). One case you'll sometimes see mentioned is how renowned pop artist Roy Lichtenstein essentially did nothing but pick panels out of other people's comics and repaint them bigger. He would get millions for that stuff while the original authors got nothing; because, see, what Lichtenstein was doing was elevating their pulp comic panels to art, or some such nonsense. Many artists frown on this now too, but still, that he's considered a great artist at all on the strength of it is pretty telling.
But on the other hand, what AI is changing here is scale. Music piracy was always possible: you could copy cassettes and hand them out to your friends. But it never became the massive problem the music industry rallied against until MP3s and the internet made it possible at a scale, and with a degree of losslessness, previously unthinkable. AI art is similar, except that whereas piracy tended to even things out for the consumer, AI art looks like a reverse Robin Hood: a corporation extracts value from random people on the internet, aggregates it, and resells the resulting product, which exists at all only because of that work, to its customers, more cheaply than the original artists could have offered it.

Note also that it's kind of the wrong framing to take professional artists as the reference here. They're not the ones getting shafted. High-level professional artists can keep producing performances, or banana peels stuck to walls, and get paid millions for them, because at that level it's not about the object itself; it's about some mix of signalling and prestige attached to it. AI art competes directly with small commission artists, the kind of people who work their ass off learning to paint on a tablet with Corel Draw or whatever and then post their digital art portfolios online. That's likely the bulk of the training dataset (it's also in line with what the resulting AI art style looks like), and those are the people whose images were used without any compensation, even when the license didn't allow derivative works, and who are now watching something built on their work resold in a way that completely prices them out of the market.
It is kind of maddening. "Theft" probably isn't the right word for it, but there's a huge power asymmetry when a huge corporation (consider that OpenAI is now heavily backed by Microsoft) that would crack down on random nobodies for pirating its software gets away with pirating on a mind-numbing scale, simply because the use case is new and not yet well legislated (I'd call it something like a milder version of making a derivative work) and because no one can actually verify what went into the training data. If these corporations want to scrap all copyright law, hey, go ahead. But at least let's not have one rule for me and another for thee.
Also, from a pragmatic viewpoint, few things would create a stronger immediate economic incentive for pursuing alignment than "you get sued into oblivion if your AI happens to accidentally violate copyright, trademark, and/or privacy laws". So, you know, there's a strategic aspect to this too.