Deming regression
In statistics, Deming regression determines a best-fit line for a finite set of metrically scaled data pairs (xi, yi) by the method of least squares. In contrast to simple linear regression, the model includes residuals (measurement errors) in both the x- and the y-values.
Deming regression is a special case of regression analysis. It is based on a maximum likelihood estimation of the regression parameters, in which the residuals of both variables are assumed to be independent and normally distributed, and the ratio δ of their variances is assumed to be known.
Deming regression goes back to the work of C. H. Kummell (1879); in 1937 the method was taken up again by T. C. Koopmans, and in 1943 W. E. Deming made it known in a more general framework for technical and economic applications.
Orthogonal regression is an important special case of Deming regression; it deals with the case δ = 1.
Calculation method
The measured values xi and yi are regarded as the sums of the "true" values xi* and yi* and the "errors" ηi and εi, i.e. (xi, yi) = (xi* + ηi, yi* + εi). The data pairs (xi*, yi*) are assumed to lie on the line to be calculated. The ηi and εi are independent, with a known ratio of the error variances δ = σε²/ση².
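This error model can be illustrated with simulated data. The following sketch generates "true" pairs on a line and perturbs both coordinates with independent normal errors; the line parameters, error standard deviations, and variable names are made-up values for the illustration only.

```python
import random

random.seed(0)

# Made-up example parameters (not from the article):
beta0, beta1 = 1.0, 2.0          # true line y* = beta0 + beta1 * x*
sigma_eta, sigma_eps = 0.5, 1.0  # error s.d. in x and y respectively
delta = sigma_eps**2 / sigma_eta**2  # known variance ratio delta = 4.0 here

# "True" pairs (x_i*, y_i*) lying exactly on the line
x_true = [i / 10 for i in range(100)]
y_true = [beta0 + beta1 * xs for xs in x_true]

# Observed pairs (x_i, y_i) = (x_i* + eta_i, y_i* + eps_i)
x_obs = [xs + random.gauss(0.0, sigma_eta) for xs in x_true]
y_obs = [ys + random.gauss(0.0, sigma_eps) for ys in y_true]
```

Both coordinates are noisy, which is exactly the situation where ordinary least squares (which treats x as error-free) is biased and Deming regression applies.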
A line y = β0 + β1x is sought that minimizes the weighted sum of squared residuals:

SSR = Σi (εi²/σε² + ηi²/ση²) = (1/σε²) Σi [ (yi − β0 − β1xi*)² + δ(xi − xi*)² ] → min over β0, β1, x1*, …, xn*.
For the further calculation the following auxiliary values are required:

x̄ = (1/n) Σi xi,  ȳ = (1/n) Σi yi
sxx = (1/(n−1)) Σi (xi − x̄)²
sxy = (1/(n−1)) Σi (xi − x̄)(yi − ȳ)
syy = (1/(n−1)) Σi (yi − ȳ)²
Thus the parameters solving the minimization problem are:

β1 = [ syy − δ·sxx + sqrt((syy − δ·sxx)² + 4δ·sxy²) ] / (2·sxy)
β0 = ȳ − β1x̄
xi* = xi + (β1/(β1² + δ))·(yi − β0 − β1xi)
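The closed-form solution above can be sketched in a few lines of Python. This is a minimal illustration, not a reference implementation; the function name `deming_fit` is our own choice, and δ is passed with the convention δ = σε²/ση² used above.

```python
import math

def deming_fit(x, y, delta=1.0):
    """Closed-form Deming regression estimate.

    delta is the known error-variance ratio var(eps)/var(eta);
    returns (beta0, beta1) of the fitted line y = beta0 + beta1 * x.
    """
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    # Sample (co)variances, the auxiliary values of the derivation
    sxx = sum((xi - xbar) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - ybar) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / (n - 1)
    # Slope from the quadratic in beta1, then intercept through the means
    beta1 = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)
             ) / (2 * sxy)
    beta0 = ybar - beta1 * xbar
    return beta0, beta1
```

As a sanity check, points lying exactly on y = 1 + 2x are fitted without error: `deming_fit([0, 1, 2, 3], [1, 3, 5, 7], delta=1.0)` recovers β0 = 1 and β1 = 2, which for δ = 1 is also the orthogonal regression fit mentioned above.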