Suppose I write a program and let people download the binary. Can I say “I spent $100k on AWS to compile it, therefore the binary is open source”?
> not even modification
Would you say compiling source code from scratch (e.g. for a different platform) is not a modification?
Even if you’re not intending to retrain the model from scratch, simply knowing what the training data is has value. Maybe you don’t care about the training data, but somebody else does. I don’t think “I could never possibly make use of the source code / training data” is an argument that a binary or a set of weights is actually open source.
How does open source differ from closed source for you in the case of generative models? If they are the same, why use the term at all?