Abstract
Five common linear regression methods were evaluated for their ability to recover the correct slope and intercept of a known function after random errors were added to x and y. The error variances were controlled to simulate research problems commonly studied by linear regression. The total error of each method was assessed by the absolute value of the bias in the estimated slope. Whenever differences among methods were observed, the mean of the slopes determined by two reciprocal techniques performed as well as or better than orthogonal regression, regression of y upon x, or regression of x upon y. All the methods studied appeared to perform equally well when the x and y errors were heteroscedastic or when the data set was small (n = 7). Regression of y upon x was equal or superior to the other methods when n = 7 or n = 20 and the y and x errors were homoscedastic. When the data set was large (n = 50) and the error in x was greater than that in y, the standard method (regression of y upon x) was inferior to all the other methods. It is suggested that linear regression by the traditional method of y upon x (a method built into many hand-held calculators) is appropriate in the majority of clinical situations, but when n is large and the errors in x are much larger than those in y, orthogonal regression or the averaging method may be preferable.
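The estimators compared above can be sketched in a few lines. The helper below is illustrative only (its name and structure are not from the article): it computes the slope from ordinary regression of y upon x, the reciprocal slope implied by regressing x upon y, their mean (the averaging method), and the orthogonal (total least squares) slope under the assumption of equal error variance in x and y.

```python
import numpy as np

def slope_estimates(x, y):
    """Slope of y vs. x by four simple methods (hypothetical helper).

    Returns a dict with:
      y_on_x     - ordinary least squares regression of y upon x
      x_on_y     - regression of x upon y, re-expressed as a y-vs-x slope
      average    - mean of the two reciprocal estimates
      orthogonal - total least squares, assuming equal x and y error variance
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()          # center both variables
    sxx, syy, sxy = xc @ xc, yc @ yc, xc @ yc    # sums of squares / cross-products

    b_yx = sxy / sxx                  # y upon x
    b_xy = syy / sxy                  # reciprocal of the x-upon-y slope
    b_avg = 0.5 * (b_yx + b_xy)       # averaging method
    # closed-form orthogonal-regression slope (error-variance ratio = 1)
    b_orth = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return {"y_on_x": b_yx, "x_on_y": b_xy, "average": b_avg, "orthogonal": b_orth}
```

When random error is present in x, the y-upon-x slope is attenuated toward zero while the reciprocal x-upon-y estimate overshoots; the average and orthogonal estimates fall between the two, which is consistent with the large-n behavior reported above.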