9.6.3 Different Regression Lines and Functional Regression


Different Regression Lines
Earlier in this unit it was assumed that the independent variable X is fixed, that is, not a random variable. When both X and Y are random variables and either one could be treated as the dependent variable, the following two regression lines may be conceived:
(i) Regression equation of Y on X
If Y is considered the dependent variable, the regression equation of Y on X is given by
Y = a + bX
The coefficient b is called the regression coefficient of Y on X and is usually denoted by byx. Here a and b are estimated so as to minimize the residual variation (deviations from regression) of Y, i.e. Σ(Yi - a - bXi)² is minimized.
(ii) Regression equation of X on Y
If X is considered the dependent variable, the regression equation of X on Y is given by
X = a1 + b1Y
The coefficient b1 is called the regression coefficient of X on Y and is usually denoted by bxy. Here a1 and b1 are estimated so as to minimize the residual variation of X, i.e. Σ(Xi - a1 - b1Yi)² is minimized. The values a, b obtained in (i) and a1, b1 obtained in (ii) will usually be different, so the two fitted lines generally do not coincide; the sketch below illustrates this numerically.
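
A minimal numerical sketch of the point above. It is not from the source: the sample data, the variable names x and y, and the use of NumPy are all illustrative assumptions.

import numpy as np

# Hypothetical bivariate sample (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.7])

cov_xy = np.cov(x, y, ddof=1)[0, 1]

# (i) Regression of Y on X: choose a, b to minimize sum((Y - a - b*X)^2)
byx = cov_xy / np.var(x, ddof=1)
a = y.mean() - byx * x.mean()

# (ii) Regression of X on Y: choose a1, b1 to minimize sum((X - a1 - b1*Y)^2)
bxy = cov_xy / np.var(y, ddof=1)
a1 = x.mean() - bxy * y.mean()

print(f"Y on X: Y = {a:.3f} + {byx:.3f} X")
print(f"X on Y: X = {a1:.3f} + {bxy:.3f} Y")

# On a common scale (slope of Y against X), the X-on-Y line has slope 1/bxy,
# which equals byx only when the correlation is perfect (|r| = 1).
print(f"Slopes on a common scale: {byx:.3f} vs {1/bxy:.3f}")

Both least-squares lines pass through the point of means (X̄, Ȳ), but their slopes differ unless |r| = 1.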
Functional Regression
One way to overcome the problem of choosing the independent variable when both variables are random is to use the 'functional regression' proposed by Ricker (1975).
According to this method the slope takes the value sy/sx (the ratio of the standard deviations of Y and X) with the sign of the correlation coefficient r:
byx = sy/sx if r > 0
byx = -sy/sx if r < 0
and correspondingly bxy = sx/sy or -sx/sy for the functional regression of X on Y. The intercept is estimated so that the line passes through the point of means (X̄, Ȳ):
For the model Y = a + byxX: a = Ȳ - byxX̄
For the model X = a1 + bxyY: a1 = X̄ - bxyȲ
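
A minimal sketch, continuing the same hypothetical data as in the previous example, of computing the functional (geometric-mean) slope and intercept described above; the data and variable names are illustrative assumptions, not part of the source.

import numpy as np

# Same hypothetical bivariate sample as before (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.7])

r = np.corrcoef(x, y)[0, 1]    # correlation coefficient
sx = x.std(ddof=1)             # standard deviation of X
sy = y.std(ddof=1)             # standard deviation of Y

# Functional (geometric-mean) slope of Y on X: sy/sx with the sign of r
byx = sy / sx if r > 0 else -sy / sx

# Intercept: the functional line passes through the point of means
a = y.mean() - byx * x.mean()

print(f"r = {r:.3f}")
print(f"Functional regression: Y = {a:.3f} + {byx:.3f} X")

Unlike the two least-squares lines in (i) and (ii), the functional regression gives a single line regardless of which variable is labelled dependent, since the X-on-Y slope is simply the reciprocal of the Y-on-X slope.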
