And what of it? You’re pointing to a general category with the implicit assumption that everything it contains is wrong. Which, as you know, isn’t true. Perhaps most of the members of this set can be classified as wrong, but as long as wrongness isn’t a universal feature of the set, membership in it isn’t a sufficient condition for classifying anything as wrong. Ergo, the members of this set that are wrong aren’t wrong because they belong to the set; they’re wrong because they meet some other criteria of wrongness. You should think about what those criteria are, and then we can debate whether this particular issue meets them.
This stinks of classical (non-Bayesian) rationality. Membership in a set that is mostly wrong does not “prove” anything, but it sure is evidence. (More evidence is obviously required in this case.) Keep your Bayes hat on.
EDIT: What happened to my edit?! Your point still stands: we have reason to believe copying does not quite fit in that set, so we should be looking more closely at the mechanisms of wrongness. /EDIT
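To make the “evidence, not proof” point concrete, here is a small Bayes’ rule sketch. All the numbers are invented purely for illustration (the prior, and how often wrong vs. not-wrong acts land in the set); nothing in the thread specifies them.

```python
def posterior_wrong(prior_wrong, p_member_given_wrong, p_member_given_not_wrong):
    """Bayes' rule: P(wrong | member of the set)."""
    joint_wrong = prior_wrong * p_member_given_wrong
    joint_not_wrong = (1 - prior_wrong) * p_member_given_not_wrong
    return joint_wrong / (joint_wrong + joint_not_wrong)

# Assumed, illustrative numbers: prior P(wrong) = 10%; wrong acts fall
# in the set 45% of the time, not-wrong acts only 1% of the time
# (which is what makes the set "mostly wrong").
p = posterior_wrong(0.10, 0.45, 0.01)
print(round(p, 2))  # prints 0.83
```

Membership alone pushes P(wrong) from 0.10 up to about 0.83 under these assumptions: substantial evidence, but nowhere near proof, which is why further evidence about the specific act is still needed.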
As I and other people have pointed out, it’s not even morally comparable to theft. You haven’t addressed any of those comments as far as I can tell. I’ll go with the charitable interpretation and assume what you mean by ‘comparable’ is what you’re saying in the next sentence (the part I responded to in the first part of this comment).
Actually, ve just brought up that the intent and thought process is very similar. Seems like a good enough reason to compare them.
That said, I think the comparison is way overused, and even if it contains a grain of truth, it’s a good idea to avoid it because it is such a politicized comparison.
This stinks of classical (non-Bayesian) rationality. Membership in a set that is mostly wrong does not “prove” anything, but it sure is evidence.
Agreed, but it’s noisy evidence. Which is why I recommended looking for better evidence. I used the set theory terminology instead of the Bayesian one because ABrooks seems to have a philosophy background; I thought this’d make more sense for him/her.
… And yes, I got carried away by the force of my own rhetoric. Must work on avoiding that.
Actually, ve just brought up that the intent and thought process is very similar.
Agreed, but it’s noisy evidence. Which is why I recommended looking for better evidence. I used the set theory terminology instead of the Bayesian one because ABrooks seems to have a philosophy background; I thought this’d make more sense for him/her.
See my edit; I agree with what you said, but the non-Bayesian thing was an itch that had to be scratched.
That wasn’t at all clear to me.
That’s because it was in a different post. By “just” I meant “seconds ago, after this post”. I could have made that clearer.