related side note: In my grad stat classes, “Linear Regression” usually includes things like my example above—i.e. linear functions of the (potentially transformed) explanatory variables, including higher-order terms. Is this different from how the term is widely used?
I don’t think it is. I seem to remember reading in Wonnacott & Wonnacott’s textbook that you can still call it ‘linear regression’ whether or not one of those regressors is a nonlinear function of another.
That makes sense intuitively: a linear regression algorithm doesn’t care where your regressors come from, so conceptually it’s irrelevant whether they all turn out to be different functions of the same variable, for example. (Barring obvious exceptions like your regressors all being linear functions of the same variable—then they’re perfectly collinear, which would of course mess up your regression.)
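To illustrate the point above, here’s a minimal sketch (using numpy, with made-up data) showing that a polynomial fit is still ordinary linear regression: the model is nonlinear in x but linear in the coefficients, so the standard least-squares machinery applies unchanged once you put the transformed regressors in the design matrix.

```python
import numpy as np

# Hypothetical data: y = 1 + 2x + 3x^2 plus a little noise.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 50)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.1, size=x.size)

# Design matrix with transformed regressors: columns [1, x, x^2].
# From the algorithm's point of view these are just three regressors;
# it never "knows" two of them are functions of the same variable.
X = np.column_stack([np.ones_like(x), x, x**2])

# Plain linear least squares recovers the coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(beta, 2))  # should land close to [1, 2, 3]
```

The caveat in the comment above corresponds to the design matrix losing full column rank: if one column were a linear function of another (say, columns x and 2x + 1 alongside the intercept), the least-squares solution would no longer be unique.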
unrelated side note: is there a way to type pretty math in the comments?
I don’t know of one, but I haven’t been here long!
followup question: are scientists outside of the field of statistics really this dumb when it comes to statistics?
My understanding is, a lot of them aren’t...but a lot of them are.