CORRELATION AND REGRESSION -04-THEORY
2.2.1 Introduction .
“If it is proved true that in a large number of instances two variables tend always to fluctuate in the same or in opposite directions, we consider that the fact is established and that a relationship exists. This relationship is called correlation.”
(1) Univariate distribution : These are the distributions in which there is only one variable such as the heights of the students of a class.
(2) Bivariate distribution : A distribution involving two discrete variables is called a bivariate distribution. For example, the heights and the weights of the students of a class in a school.
(3) Bivariate frequency distribution : Let x and y be two variables. Suppose x takes the values $x_1, x_2, \ldots, x_n$ and y takes the values $y_1, y_2, \ldots, y_n$; then we record our observations in the form of ordered pairs $(x_i, y_j)$, where $1 \le i \le n$, $1 \le j \le n$. If a certain pair $(x_i, y_j)$ occurs $f_{ij}$ times, we say that its frequency is $f_{ij}$.
The function which assigns the frequencies $f_{ij}$ to the pairs $(x_i, y_j)$ is known as a bivariate frequency distribution.
Example: 1 The following table shows the frequency distribution of age (x) and weight (y) of a group of 60 individuals
x (yrs) →
y (kg) ↓      40 – 45    45 – 50    50 – 55    55 – 60    60 – 65
45 – 50          2          5          8          3          0
50 – 55          1          3          6         10          2
55 – 60          0          2          5         12          1
Then find the marginal frequency distribution for x and y.
Solution: Marginal frequency distribution for x
x 40 – 45 45 – 50 50 – 55 55 – 60 60 – 65
f 3 10 19 25 3
Marginal frequency distribution for y
y 45 – 50 50 – 55 55 – 60
f 18 22 20
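The marginal totals can also be obtained mechanically by summing the bivariate table along its columns and rows. Below is a minimal Python sketch (not part of the original text) that does this for the frequency matrix of Example 1.

```python
# Marginal frequency distributions for the bivariate table of Example 1.
# Rows are the weight classes of y, columns are the age classes of x.
freq = [
    [2, 5, 8, 3, 0],   # y in 45-50
    [1, 3, 6, 10, 2],  # y in 50-55
    [0, 2, 5, 12, 1],  # y in 55-60
]
x_classes = ["40-45", "45-50", "50-55", "55-60", "60-65"]
y_classes = ["45-50", "50-55", "55-60"]

# Marginal frequency of each x-class: sum the corresponding column.
marginal_x = [sum(row[j] for row in freq) for j in range(len(x_classes))]
# Marginal frequency of each y-class: sum the corresponding row.
marginal_y = [sum(row) for row in freq]

print(dict(zip(x_classes, marginal_x)))  # {'40-45': 3, '45-50': 10, '50-55': 19, '55-60': 25, '60-65': 3}
print(dict(zip(y_classes, marginal_y)))  # {'45-50': 18, '50-55': 22, '55-60': 20}
print(sum(marginal_x), sum(marginal_y))  # both 60
```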
2.2.2 Covariance .
Let $(x_i, y_i)$, $i = 1, 2, \ldots, n$ be a bivariate distribution, where $x_1, x_2, \ldots, x_n$ are the values of variable x and $y_1, y_2, \ldots, y_n$ those of y. Then the covariance Cov(x, y) between x and y is given by
$\operatorname{Cov}(x, y) = \dfrac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})$ or $\operatorname{Cov}(x, y) = \dfrac{1}{n}\sum_{i=1}^{n}x_i y_i - \bar{x}\,\bar{y}$, where $\bar{x}$ and $\bar{y}$ are the means of variables x and y respectively.
Covariance is not affected by the change of origin, but it is affected by the change of scale.
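As a quick numerical illustration of the two equivalent formulas, and of the fact that covariance survives a shift of origin but not a change of scale, here is a short Python sketch; the data values are made up for illustration.

```python
# Covariance computed by both formulas for an illustrative data set.
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
n = len(xs)

x_bar = sum(xs) / n
y_bar = sum(ys) / n

# Cov(x, y) = (1/n) * sum (x_i - x_bar)(y_i - y_bar)
cov1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
# Cov(x, y) = (1/n) * sum x_i*y_i - x_bar*y_bar
cov2 = sum(x * y for x, y in zip(xs, ys)) / n - x_bar * y_bar
print(cov1, cov2)      # identical: 1.2

# Change of origin (shifting every x by 10) leaves the covariance unchanged...
shifted = sum((x + 10 - (x_bar + 10)) * (y - y_bar) for x, y in zip(xs, ys)) / n
# ...but change of scale multiplies it by the scale factors (here 3 * 2 = 6).
scaled = sum((3 * x - 3 * x_bar) * (2 * y - 2 * y_bar) for x, y in zip(xs, ys)) / n
print(shifted, scaled)  # 1.2 and 7.2
```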
Example: 2 Covariance between x and y, if $\sum x_i = 15$, $\sum y_i = 40$, $\sum x_i y_i = 110$ and $n = 5$, is
(a) 22 (b) 2 (c) – 2 (d) None of these
Solution: (c) Given, $n = 5$, $\sum x_i = 15$, $\sum y_i = 40$, $\sum x_i y_i = 110$
We know that, $\operatorname{Cov}(x, y) = \dfrac{1}{n}\sum x_i y_i - \left(\dfrac{1}{n}\sum x_i\right)\left(\dfrac{1}{n}\sum y_i\right)$
$= \dfrac{110}{5} - \dfrac{15}{5}\times\dfrac{40}{5} = 22 - 24 = -2$.
2.2.3 Correlation .
The relationship between two variables such that a change in one variable results in a positive or negative change in the other variable is known as correlation.
(1) Types of correlation
(i) Perfect correlation : If the two variables vary in such a manner that their ratio is always constant, then the correlation is said to be perfect.
(ii) Positive or direct correlation : If an increase or decrease in one variable corresponds to an increase or decrease in the other, the correlation is said to be positive.
(iii) Negative or indirect correlation : If an increase or decrease in one variable corresponds to a decrease or increase in the other, the correlation is said to be negative.
(2) Karl Pearson's coefficient of correlation : The correlation coefficient $r(x, y)$ between two variables x and y is given by $r(x, y) = \dfrac{\operatorname{Cov}(x, y)}{\sqrt{\operatorname{Var}(x)\,\operatorname{Var}(y)}}$ or $\dfrac{\operatorname{Cov}(x, y)}{\sigma_x\,\sigma_y}$,
$r = \dfrac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}$.
(3) Modified formula : $r = \dfrac{\sum dx\,dy - \dfrac{(\sum dx)(\sum dy)}{n}}{\sqrt{\left(\sum dx^2 - \dfrac{(\sum dx)^2}{n}\right)\left(\sum dy^2 - \dfrac{(\sum dy)^2}{n}\right)}}$, where $dx$ and $dy$ are the deviations of x and y from their means (or from assumed means).
Also $r_{xy} = \dfrac{\sum dx\,dy}{\sqrt{\sum dx^2\,\sum dy^2}}$ when the deviations are measured from the actual means.
Example: 3 For the data
x: 4 7 8 3 4
y: 5 8 6 3 5
The Karl Pearson's coefficient of correlation is
(a) 0.8 (b) 63 (c) (d)
Solution: (a) Take the assumed means A = 5 and B = 5, and let $dx = x - A$, $dy = y - B$.
x     y     dx    dy    dx²   dy²   dx·dy
4     5     –1     0     1     0      0
7     8      2     3     4     9      6
8     6      3     1     9     1      3
3     3     –2    –2     4     4      4
4     5     –1     0     1     0      0
Total:      Σdx = 1,  Σdy = 2,  Σdx² = 19,  Σdy² = 14,  Σdx·dy = 13
$r = \dfrac{\sum dx\,dy - \dfrac{(\sum dx)(\sum dy)}{n}}{\sqrt{\left(\sum dx^2 - \dfrac{(\sum dx)^2}{n}\right)\left(\sum dy^2 - \dfrac{(\sum dy)^2}{n}\right)}} = \dfrac{13 - \dfrac{1 \times 2}{5}}{\sqrt{\left(19 - \dfrac{1}{5}\right)\left(14 - \dfrac{4}{5}\right)}} = \dfrac{12.6}{\sqrt{18.8 \times 13.2}} = \dfrac{12.6}{15.75} \approx 0.8$.
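The arithmetic above can be cross-checked in a few lines of Python using the raw-sum form of Karl Pearson's formula; the sketch below uses the same five observations as Example 3.

```python
import math

# Data of Example 3.
xs = [4, 7, 8, 3, 4]
ys = [5, 8, 6, 3, 5]
n = len(xs)

sum_x, sum_y = sum(xs), sum(ys)
sum_xy = sum(x * y for x, y in zip(xs, ys))
sum_x2 = sum(x * x for x in xs)
sum_y2 = sum(y * y for y in ys)

# r = [n*Sum(xy) - Sum(x)*Sum(y)] / sqrt{[n*Sum(x^2) - (Sum x)^2][n*Sum(y^2) - (Sum y)^2]}
numerator = n * sum_xy - sum_x * sum_y                                            # 5*153 - 26*27 = 63
denominator = math.sqrt((n * sum_x2 - sum_x ** 2) * (n * sum_y2 - sum_y ** 2))    # sqrt(94 * 66)
r = numerator / denominator
print(round(r, 2))  # 0.8
```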
Example: 4 Coefficient of correlation between observations (1, 6),(2, 5),(3, 4), (4, 3), (5, 2), (6, 1) is
(a) 1 (b) – 1 (c) 0 (d) None of these
Solution: (b) Since there is a perfect negative linear relationship between x and y, namely $x + y = 7$ for every pair,
the coefficient of correlation $= -1$.
Example: 5 The value of the covariance of two variables x and y is $-\dfrac{148}{3}$, the variance of x is $\dfrac{272}{3}$ and the variance of y is $\dfrac{131}{3}$. The coefficient of correlation is
(a) 0.48 (b) 0.78 (c) 0.87 (d) None of these
Solution : (d) We know that the coefficient of correlation $= \dfrac{\operatorname{Cov}(x, y)}{\sqrt{\operatorname{Var}(x)\,\operatorname{Var}(y)}}$
Since the covariance is negative,
the correlation coefficient must be negative. As the values in (a), (b) and (c) are all positive, (d) is the correct answer.
Example: 6 The coefficient of correlation between two variables x and y is 0.5, their covariance is 16. If the S.D of x is 4, then the S.D. of y is equal to
(a) 4 (b) 8 (c) 16 (d) 64
Solution: (b) We have $r = 0.5$, $\operatorname{Cov}(x, y) = 16$ and the S.D. of x, i.e., $\sigma_x = 4$.
We know that, $r = \dfrac{\operatorname{Cov}(x, y)}{\sigma_x\,\sigma_y}$
$\Rightarrow 0.5 = \dfrac{16}{4\,\sigma_y}$; $\therefore\ \sigma_y = \dfrac{16}{0.5 \times 4} = 8$.
Example: 7 For a bivariate distribution (x, y), if $\sum xy = 350$, $\sum x = 50$, $\sum y = 60$, $\bar{x} = 5$, $\bar{y} = 6$, the variance of x is 4 and the variance of y is 9, then $r(x, y)$ is
(a) 5/6 (b) 5/36 (c) 11/3 (d) 11/18
Solution: (a) $n = \dfrac{\sum x}{\bar{x}} = \dfrac{50}{5} = 10$, so $\operatorname{Cov}(x, y) = \dfrac{\sum xy}{n} - \bar{x}\,\bar{y} = \dfrac{350}{10} - 5 \times 6$
$= 35 - 30 = 5$.
$\therefore\ r(x, y) = \dfrac{\operatorname{Cov}(x, y)}{\sigma_x\,\sigma_y} = \dfrac{5}{2 \times 3} = \dfrac{5}{6}$.
Example: 8 A, B, C, D are non-zero constants, such that
(i) both A and C are negative. (ii) A and C are of opposite sign.
If the coefficient of correlation between x and y is r, then the coefficient of correlation between $Ax + B$ and $Cy + D$ is
(a) r (b) – r (c) (d)
Solution : (a,b) (i) Both A and C are negative, so $AC > 0$.
Now $\operatorname{Cov}(Ax + B,\ Cy + D) = AC\,\operatorname{Cov}(x, y)$
and $\sigma_{Ax+B} = |A|\,\sigma_x$, $\sigma_{Cy+D} = |C|\,\sigma_y$.
Hence $\rho(Ax + B,\ Cy + D) = \dfrac{AC\,\operatorname{Cov}(x, y)}{|A||C|\,\sigma_x\sigma_y} = \dfrac{AC}{|AC|}\,\rho(x, y) = \rho(x, y) = r$.
(ii) A and C are of opposite sign, so $AC < 0$ and $\dfrac{AC}{|AC|} = -1$,
hence $\rho(Ax + B,\ Cy + D) = -\rho(x, y) = -r$.
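A numerical sanity check of this result (assuming, as reconstructed above, that the transformed variables are Ax + B and Cy + D): numpy's corrcoef shows the correlation unchanged when A and C have the same sign and negated when they do not. The data and constants below are illustrative.

```python
import numpy as np

# Illustrative data (not from the text).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0])

r = np.corrcoef(x, y)[0, 1]

# Both A and C negative (same sign): correlation is unchanged.
A, B, C, D = -2.0, 7.0, -3.0, 1.0
r_same_sign = np.corrcoef(A * x + B, C * y + D)[0, 1]

# A and C of opposite sign: correlation changes sign.
A, C = -2.0, 3.0
r_opposite = np.corrcoef(A * x + B, C * y + D)[0, 1]

print(round(r, 4), round(r_same_sign, 4), round(r_opposite, 4))  # r, r, -r
```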
2.2.4 Rank Correlation .
Let us suppose that a group of n individuals is arranged in order of merit or proficiency in possession of two characteristics A and B.
Their ranks in the two characteristics will, in general, be different.
For example, if we consider the relation between intelligence and beauty, it is not necessary that a beautiful individual is intelligent also.
Rank Correlation : $r = 1 - \dfrac{6\sum d^2}{n(n^2 - 1)}$, which is Spearman's formula for the rank correlation coefficient,
where $\sum d^2$ = sum of the squares of the differences of the two ranks and n is the number of pairs of observations.
Note : We always have $\sum d_i = \sum(x_i - y_i) = \sum x_i - \sum y_i = 0$, since each set of ranks is a permutation of $1, 2, \ldots, n$.
If all d's are zero, then $r = 1$, which shows that there is perfect rank correlation between the variables and which is the maximum value of r.
If, however, some values of $x_i$ are equal (i.e., there are ties), then the coefficient of rank correlation is given by $r = 1 - \dfrac{6\left[\sum d^2 + \sum \dfrac{m(m^2 - 1)}{12}\right]}{n(n^2 - 1)}$,
where m is the number of times a particular $x_i$ is repeated.
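For untied ranks, Spearman's coefficient is just Pearson's r computed on the ranks; the following Python sketch (standard library only, illustrative scores, and a hypothetical ranks helper) evaluates the d² formula and cross-checks it against Pearson's r on the ranks.

```python
import math

def ranks(values):
    """Rank the values 1..n (no ties assumed: each value gets its position in the sorted order)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, idx in enumerate(order, start=1):
        r[idx] = rank
    return r

# Illustrative scores of six individuals in two characteristics A and B.
a = [86, 72, 95, 60, 78, 81]
b = [88, 70, 84, 58, 65, 90]

ra, rb = ranks(a), ranks(b)
n = len(a)

# Spearman's formula: r = 1 - 6 * sum(d^2) / (n(n^2 - 1)), d = difference of the two ranks.
d2 = sum((p - q) ** 2 for p, q in zip(ra, rb))
r_spearman = 1 - 6 * d2 / (n * (n * n - 1))

# Cross-check: Pearson's r computed on the ranks gives the same value when there are no ties.
mean_a, mean_b = sum(ra) / n, sum(rb) / n
cov = sum((p - mean_a) * (q - mean_b) for p, q in zip(ra, rb)) / n
sd_a = math.sqrt(sum((p - mean_a) ** 2 for p in ra) / n)
sd_b = math.sqrt(sum((q - mean_b) ** 2 for q in rb) / n)
print(r_spearman, cov / (sd_a * sd_b))  # both about 0.714
```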
Positive and Negative rank correlation coefficients
Let r be the rank correlation coefficient. Then:
• $r > 0$ : it means that if the rank of one characteristic is high, then that of the other is also high, and if the rank of one characteristic is low, then that of the other is also low. e.g., if the two characteristics are height and weight of persons, then $r > 0$ means that the tall persons are also heavy in weight.
• $r = 1$ : it means that there is perfect correlation in the two characteristics, i.e., every individual gets the same rank in the two characteristics. Here the ranks are of the type (1, 1), (2, 2), ....., (n, n).
• $r < 0$ : it means that if the rank of one characteristic is high, then that of the other is low, and if the rank of one characteristic is low, then that of the other is high. e.g., if the two characteristics are richness and slimness in persons, then $r < 0$ means that the rich persons are not slim.
• $r = -1$ : it means that there is perfect negative correlation in the two characteristics, i.e., an individual getting the highest rank in one characteristic gets the lowest rank in the other. Here the ranks in the two characteristics in a group of n individuals are of the type (1, n), (2, n – 1), ....., (n, 1).
• $r = 0$ : it means that no relation can be established between the two characteristics.
Important Tips
If $r = 0$, the variables x and y are said to be uncorrelated (independent variables are uncorrelated, but uncorrelated variables need not be independent).
If $r = -1$, the correlation is said to be negative and perfect.
If $r = +1$, the correlation is said to be positive and perfect.
Correlation is a pure number and hence unitless.
Correlation coefficient is not affected by change of origin and scale.
If two variates are connected by the linear relation $x + y = K$ (K a constant), then x and y are in perfect indirect (negative) correlation. Here $r = -1$.
If x, y are two independent variables, then $\rho(x + y,\ x - y) = \dfrac{\sigma_x^2 - \sigma_y^2}{\sigma_x^2 + \sigma_y^2}$.
$r(x, y) = \dfrac{\operatorname{Cov}(x, y)}{\sqrt{\operatorname{Var}(x)\,\operatorname{Var}(y)}}$, where $\operatorname{Var}(x) = \sigma_x^2$ and $\operatorname{Var}(y) = \sigma_y^2$.
Example: 9 Two numbers within the bracket denote the ranks of 10 students of a class in two subjects
(1, 10), (2, 9), (3, 8), (4, 7), (5, 6), (6, 5), (7, 4), (8, 3), (9, 2), (10, 1). The rank correlation coefficient is
(a) 0 (b) – 1 (c) 1 (d) 0.5
Solution: (b) The rank correlation coefficient is $r = 1 - \dfrac{6\sum d^2}{n(n^2 - 1)}$, where $d = x - y$ for each pair (x, y).
Here $\sum d^2 = 81 + 49 + 25 + 9 + 1 + 1 + 9 + 25 + 49 + 81 = 330$ and $n = 10$; $\therefore\ r = 1 - \dfrac{6 \times 330}{10 \times 99} = 1 - 2 = -1$.
Example : 10 Let $x_1, x_2, \ldots, x_n$ be the ranks of n individuals according to a character A and $y_1, y_2, \ldots, y_n$ the ranks of the same individuals according to another character B, such that $x_i + y_i = n + 1$ for $i = 1, 2, \ldots, n$. Then the coefficient of rank correlation between the characters A and B is
(a) 1 (b) 0 (c) – 1 (d) None of these
Solution: (c) We have $x_i + y_i = n + 1$ for all $i = 1, 2, \ldots, n$.
Let $d_i = x_i - y_i$. Then $d_i = x_i - (n + 1 - x_i) = 2x_i - (n + 1)$
$\Rightarrow \sum d_i^2 = \sum\left[2x_i - (n + 1)\right]^2 = 4\sum x_i^2 - 4(n + 1)\sum x_i + n(n + 1)^2$
$= 4\cdot\dfrac{n(n + 1)(2n + 1)}{6} - 4(n + 1)\cdot\dfrac{n(n + 1)}{2} + n(n + 1)^2 = \dfrac{n(n + 1)}{3}\left[2(2n + 1) - 3(n + 1)\right] = \dfrac{n(n^2 - 1)}{3}$
$\therefore\ r = 1 - \dfrac{6\sum d_i^2}{n(n^2 - 1)} = 1 - \dfrac{6\cdot\frac{n(n^2 - 1)}{3}}{n(n^2 - 1)} = 1 - 2 = -1$
i.e., $r = -1$.
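The conclusion of Example 10 is easy to verify numerically: whenever the second ranking is the exact reversal of the first (the reconstructed condition $x_i + y_i = n + 1$), Spearman's formula returns –1 for every n. A small sketch:

```python
# Rank correlation is exactly -1 whenever the second ranking reverses the first.
for n in (5, 10, 100):
    x = list(range(1, n + 1))          # ranks under character A
    y = [n + 1 - xi for xi in x]       # ranks under character B: x_i + y_i = n + 1
    d2 = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    r = 1 - 6 * d2 / (n * (n * n - 1))
    print(n, r)                        # prints -1.0 for every n
```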
Regression
2.2.5 Linear Regression .
If a relation between two variates x and y exists, then the dots of the scatter diagram will more or less be concentrated around a curve which is called the curve of regression. If this curve be a straight line, then it is known as line of regression and the regression is called linear regression.
Line of regression: The line of regression is the straight line which, in the least-squares sense, gives the best fit to the given frequency distribution.
2.2.6 Equations of lines of Regression .
(1) Regression line of y on x : If the value of x is known, then the value of y can be found from
$y - \bar{y} = \dfrac{\operatorname{Cov}(x, y)}{\sigma_x^2}(x - \bar{x})$ or $y - \bar{y} = r\dfrac{\sigma_y}{\sigma_x}(x - \bar{x})$
(2) Regression line of x on y : It estimates x for the given value of y as
$x - \bar{x} = \dfrac{\operatorname{Cov}(x, y)}{\sigma_y^2}(y - \bar{y})$ or $x - \bar{x} = r\dfrac{\sigma_x}{\sigma_y}(y - \bar{y})$
(3) Regression coefficient : (i) Regression coefficient of y on x is $b_{yx} = \dfrac{r\sigma_y}{\sigma_x} = \dfrac{\operatorname{Cov}(x, y)}{\sigma_x^2}$
(ii) Regression coefficient of x on y is $b_{xy} = \dfrac{r\sigma_x}{\sigma_y} = \dfrac{\operatorname{Cov}(x, y)}{\sigma_y^2}$.
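A minimal Python sketch of these formulas, using illustrative data: it computes Cov(x, y) and the variances, forms $b_{yx}$ and $b_{xy}$, and prints the two regression lines in point-slope form together with r.

```python
import math

# Illustrative data.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(xs)

x_bar, y_bar = sum(xs) / n, sum(ys) / n
cov = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / n
var_x = sum((x - x_bar) ** 2 for x in xs) / n
var_y = sum((y - y_bar) ** 2 for y in ys) / n

b_yx = cov / var_x          # regression coefficient of y on x
b_xy = cov / var_y          # regression coefficient of x on y
r = cov / math.sqrt(var_x * var_y)

# Regression line of y on x:  y - y_bar = b_yx (x - x_bar)
# Regression line of x on y:  x - x_bar = b_xy (y - y_bar)
print(f"y - {y_bar} = {b_yx:.3f} (x - {x_bar})")
print(f"x - {x_bar} = {b_xy:.3f} (y - {y_bar})")
# r is the geometric mean of the two regression coefficients (with their common sign).
print("r =", round(r, 3), " check:", round(math.copysign(math.sqrt(b_yx * b_xy), b_yx), 3))
```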
2.2.7 Angle between Two lines of Regression .
Equations of the two lines of regression are $y - \bar{y} = b_{yx}(x - \bar{x})$ and $x - \bar{x} = b_{xy}(y - \bar{y})$.
We have, slope of the line of regression of y on x $= m_1 = \dfrac{r\sigma_y}{\sigma_x}$
Slope of the line of regression of x on y $= m_2 = \dfrac{1}{b_{xy}} = \dfrac{\sigma_y}{r\sigma_x}$
$\tan\theta = \pm\dfrac{m_2 - m_1}{1 + m_1 m_2} = \pm\dfrac{\dfrac{\sigma_y}{r\sigma_x} - \dfrac{r\sigma_y}{\sigma_x}}{1 + \dfrac{r\sigma_y}{\sigma_x}\cdot\dfrac{\sigma_y}{r\sigma_x}} = \pm\dfrac{(1 - r^2)\,\sigma_x\sigma_y}{r\,(\sigma_x^2 + \sigma_y^2)}$.
Here the positive sign gives the acute angle $\theta$, because $r^2 \le 1$ and $\sigma_x$, $\sigma_y$ are positive.
$\therefore\ \tan\theta = \dfrac{1 - r^2}{|r|}\cdot\dfrac{\sigma_x\sigma_y}{\sigma_x^2 + \sigma_y^2}$ .....(i)
Note : If $r = 0$, from (i) we conclude that $\tan\theta = \infty$ or $\theta = \pi/2$, i.e., the two regression lines are at right angles.
If $r = \pm 1$, then $\tan\theta = 0$, i.e., $\theta = 0$ (since $\theta$ is acute), i.e., the two regression lines coincide.
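Formula (i) can be sanity-checked against a direct computation from the two slopes; the short sketch below does both for arbitrary illustrative values of r, σx and σy.

```python
import math

# Illustrative values (any 0 < |r| < 1 and positive standard deviations will do).
r, sigma_x, sigma_y = 0.6, 2.0, 3.0

# Formula (i): tan(theta) = (1 - r^2)/|r| * sigma_x*sigma_y / (sigma_x^2 + sigma_y^2)
tan_theta = (1 - r ** 2) / abs(r) * (sigma_x * sigma_y) / (sigma_x ** 2 + sigma_y ** 2)

# Direct check from the slopes m1 = r*sigma_y/sigma_x (y on x) and m2 = sigma_y/(r*sigma_x) (x on y).
m1 = r * sigma_y / sigma_x
m2 = sigma_y / (r * sigma_x)
tan_theta_direct = abs((m2 - m1) / (1 + m1 * m2))

print(round(tan_theta, 6), round(tan_theta_direct, 6))            # equal
print("theta =", round(math.degrees(math.atan(tan_theta)), 2), "degrees")
```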
2.2.8 Important points about Regression coefficients bxy and byx .
(1) $r = \pm\sqrt{b_{yx}\,b_{xy}}$, i.e., the coefficient of correlation is the geometric mean of the two regression coefficients.
(2) If $b_{yx} > 1$, then $b_{xy} < 1$, i.e., if one of the regression coefficients is greater than unity, the other will be less than unity.
(3) If the correlation between the variables is not perfect, then the regression lines intersect at $(\bar{x}, \bar{y})$.
(4) $b_{yx}$ is called the slope of the regression line of y on x and $\dfrac{1}{b_{xy}}$ is called the slope of the regression line of x on y.
(5) $b_{yx} + b_{xy} > 2\sqrt{b_{yx}\,b_{xy}}$ or $b_{yx} + b_{xy} > 2r$, i.e., the arithmetic mean of the regression coefficients is greater than the correlation coefficient.
(6) Regression coefficients are independent of change of origin but not of scale.
(7) The product of the gradients of the lines of regression is $m_1 m_2 = \dfrac{\sigma_y^2}{\sigma_x^2}$.
(8) If both the lines of regression coincide, then the correlation is perfect and linear.
(9) If both $b_{yx}$ and $b_{xy}$ are positive, then r will be positive, and if both $b_{yx}$ and $b_{xy}$ are negative, then r will be negative.
Important Tips
If $r = 0$, then $\tan\theta$ is not defined, i.e., $\theta = \pi/2$. Thus the regression lines are perpendicular.
If $r = +1$ or $-1$, then $\tan\theta = 0$, i.e., $\theta = 0$. Thus the regression lines are coincident.
If the regression lines are $y = ax + b$ and $x = cy + d$, then $\bar{x} = \dfrac{bc + d}{1 - ac}$ and $\bar{y} = \dfrac{ad + b}{1 - ac}$ (see the sketch after these tips).
If $b_{yx}$, $b_{xy}$ and $r > 0$, then $\frac{1}{2}(b_{xy} + b_{yx}) \ge r$, and if $b_{xy}$, $b_{yx}$ and $r < 0$, then $\frac{1}{2}(b_{xy} + b_{yx}) \le r$.
Correlation measures the degree of relationship between the variables, while regression measures the cause-and-effect relationship between them and is used to estimate one variable from the other.
If the line of regression of y on x makes an angle $\alpha$ with the positive direction of the X-axis, then $\tan\alpha = b_{yx}$.
If the line of regression of x on y makes an angle $\beta$ with the positive direction of the X-axis, then $\cot\beta = b_{xy}$.
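The tip about recovering $(\bar{x}, \bar{y})$ from the pair of regression lines $y = ax + b$ and $x = cy + d$ translates into a one-line function; the sketch below (with hypothetical coefficients) also verifies that the returned point lies on both lines.

```python
# Finding (x_bar, y_bar) as the intersection of the two regression lines
# y = a*x + b (y on x) and x = c*y + d (x on y), per the tip above.
def means_from_regression_lines(a, b, c, d):
    """Return (x_bar, y_bar) = ((b*c + d)/(1 - a*c), (a*d + b)/(1 - a*c))."""
    denom = 1 - a * c
    return (b * c + d) / denom, (a * d + b) / denom

# Illustrative coefficients (hypothetical, not from the text); note a*c = 0.2 <= 1, a valid pair.
a, b, c, d = 0.5, 1.0, 0.4, 2.0
x_bar, y_bar = means_from_regression_lines(a, b, c, d)
print(x_bar, y_bar)                                     # 3.0 2.5
# Check that the point lies on both regression lines.
print(abs(y_bar - (a * x_bar + b)) < 1e-12, abs(x_bar - (c * y_bar + d)) < 1e-12)
```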
Example : 11 The two lines of regression are $2x - 7y + 6 = 0$ and $7x - 2y + 1 = 0$. The correlation coefficient between x and y is
(a) – 2/7 (b) 2/7 (c) 4/49 (d) None of these
Solution: (b) The two lines of regression are $2x - 7y + 6 = 0$ .....(i) and $7x - 2y + 1 = 0$ ......(ii)
If (i) is the regression equation of y on x, then (ii) is the regression equation of x on y.
We write these as $y = \dfrac{2}{7}x + \dfrac{6}{7}$ and $x = \dfrac{2}{7}y - \dfrac{1}{7}$
$\therefore\ b_{yx} = \dfrac{2}{7}$, $b_{xy} = \dfrac{2}{7}$; here $b_{yx}\,b_{xy} = \dfrac{4}{49} \le 1$. So our choice of regression lines is valid.
$\therefore\ r^2 = b_{yx}\,b_{xy} = \dfrac{4}{49} \Rightarrow r = \dfrac{2}{7}$. [r is positive because both regression coefficients are positive]
Example: 12 Given that the regression coefficients are – 1.5 and 0.5, the value of the square of correlation coefficient is
(a) 0.75 (b) 0.7
(c) – 0.75 (d) – 0.5
Solution: (c) The square of the correlation coefficient is given by $r^2 = b_{yx}\,b_{xy} = (-1.5)(0.5) = -0.75$.
Example: 13 In a bivariate data, $\sum x = 30$, $\sum y = 400$, $\sum x^2 = 196$, $\sum xy = 850$ and $n = 10$. The regression coefficient of y on x is
(a) – 3.1 (b) – 3.2 (c) – 3.3 (d) – 3.4
Solution: (c) $\operatorname{Cov}(x, y) = \dfrac{\sum xy}{n} - \bar{x}\,\bar{y} = \dfrac{850}{10} - \dfrac{30}{10}\times\dfrac{400}{10} = 85 - 120 = -35$
$\operatorname{Var}(x) = \dfrac{\sum x^2}{n} - \bar{x}^2 = \dfrac{196}{10} - 9 = 10.6$
$\therefore\ b_{yx} = \dfrac{\operatorname{Cov}(x, y)}{\operatorname{Var}(x)} = \dfrac{-35}{10.6} = -3.3$.
Example: 14 If two lines of regression are and , then is
(a) (17, 13) (b) (13, 17) (c) (– 17, 13) (d) (– 13, – 17)
Solution: (b) Since both lines of regression pass through the point $(\bar{x}, \bar{y})$, the coordinates $(\bar{x}, \bar{y})$ must satisfy the two given equations.
On solving the two equations simultaneously, we get the required answer $(\bar{x}, \bar{y}) = (13, 17)$.
Example: 15 The regression coefficient of y on x is $\dfrac{2}{3}$ and that of x on y is $\dfrac{4}{3}$. If the acute angle between the regression lines is $\theta$, then $\tan\theta$ is
(a) $\dfrac{1}{18}$ (b) (c) (d) None of these
Solution: (a) $m_1 = b_{yx} = \dfrac{2}{3}$ and $m_2 = \dfrac{1}{b_{xy}} = \dfrac{3}{4}$. Therefore, $\tan\theta = \left|\dfrac{m_2 - m_1}{1 + m_1 m_2}\right| = \dfrac{\frac{3}{4} - \frac{2}{3}}{1 + \frac{2}{3}\cdot\frac{3}{4}} = \dfrac{1/12}{3/2} = \dfrac{1}{18}$.
Example: 16 If the lines of regression of y on x and x on y make angles $\alpha$ and $\beta$ respectively with the positive direction of the X-axis, then the correlation coefficient between x and y is
(a) (b)
(c) $\sqrt{\tan\alpha\,\cot\beta}$ (d)
Solution: (c) Slope of the regression line of y on x $= b_{yx} = \tan\alpha$
Slope of the regression line of x on y $= \dfrac{1}{b_{xy}} = \tan\beta$, i.e., $b_{xy} = \cot\beta$
$\therefore\ r^2 = b_{yx}\,b_{xy} = \tan\alpha\,\cot\beta$. Hence, $r = \sqrt{\tan\alpha\,\cot\beta}$.
Example: 17 If two random variables x and y are connected by the relationship $2x + y = 3$, then $r_{xy}$ is
(a) 1 (b) – 1 (c) – 2 (d) 3
Solution: (b) Since $2x + y = 3$, we have $y = 3 - 2x$, so $\bar{y} = 3 - 2\bar{x}$
$\Rightarrow y - \bar{y} = -2(x - \bar{x})$; hence $b_{yx} = -2$.
Also $x - \bar{x} = -\dfrac{1}{2}(y - \bar{y})$, so $b_{xy} = -\dfrac{1}{2}$
$\therefore\ r_{xy}^2 = b_{yx}\,b_{xy} = 1 \Rightarrow r_{xy} = -1$ (because both regression coefficients are negative).
2.2.9 Standard error and Probable error.
(1) Standard error of prediction : The deviation of the predicted value from the observed value is known as the standard error of prediction and is defined as $S_y = \sqrt{\dfrac{\sum(y - y_p)^2}{n}}$,
where y is the actual value and $y_p$ is the predicted value.
In relation to the coefficient of correlation, it is given by
(i) Standard error of estimate of x is $S_x = \sigma_x\sqrt{1 - r^2}$ (ii) Standard error of estimate of y is $S_y = \sigma_y\sqrt{1 - r^2}$.
(2) Relation between probable error and standard error : If r is the correlation coefficient in a sample of n pairs of observations, then its standard error is S.E.$(r) = \dfrac{1 - r^2}{\sqrt{n}}$ and its probable error is P.E.$(r) = 0.6745\,(\text{S.E.}) = 0.6745\left(\dfrac{1 - r^2}{\sqrt{n}}\right)$. The probable error or the standard error is used for interpreting the coefficient of correlation.
(i) If $r < \text{P.E.}(r)$, there is no evidence of correlation.
(ii) If $r > 6\,\text{P.E.}(r)$, the existence of correlation is certain.
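These relations translate directly into code. The sketch below computes the probable error of an observed r and applies the two rules of thumb; the sample values (r = 0.6, n = 64) are illustrative.

```python
import math

def probable_error(r, n):
    """Probable error of r for a sample of n pairs: 0.6745 * (1 - r^2) / sqrt(n)."""
    standard_error = (1 - r ** 2) / math.sqrt(n)
    return 0.6745 * standard_error

# Illustrative sample: correlation 0.6 observed from 64 pairs of observations.
r, n = 0.6, 64
pe = probable_error(r, n)
print("P.E. =", round(pe, 4))          # 0.6745 * 0.64 / 8 = 0.054

if abs(r) < pe:
    print("no evidence of correlation")
elif abs(r) > 6 * pe:
    print("existence of correlation is practically certain")
else:
    print("inconclusive by this rule of thumb")
```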
The square of the coefficient of correlation for a bivariate distribution is known as the “Coefficient of determination”.
Example: 18 If and and , then standard error of is
(a) 0 (b) (c) (d) 1
Solution: (a) .