Regression: a set of techniques that describe relationships between variables and allow us to predict scores on one variable from scores on another.
Linear relationship: an association between two variables that may be accurately represented on a graph by a straight line.
Slope: the vertical distance divided by the horizontal distance between any two points on the line (rise/run).
Y-intercept: the point at which a line intersects the Y-axis.
Regression line: the line describing the relationship between the two variables.
Regression constants: the values a (the y-intercept) and b (the slope).
Least squares criterion: the regression line is chosen so that the sum of the squared deviations between the data points and the line is at a minimum.
Regression of Y on X: when variable Y is predicted from scores on variable X.
Regression of X on Y: when variable X is predicted from scores on variable Y.
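The regression constants above can be computed directly from the data. A minimal sketch of fitting the regression of Y on X by the least squares criterion (the data values here are hypothetical, for illustration only):

```python
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # b (slope) = sum of cross-deviations / sum of squared X deviations
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # a (Y-intercept) = mean of Y minus slope times mean of X,
    # so the line passes through the point (mean_x, mean_y)
    a = mean_y - b * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2, 4, 5, 4, 5]
a, b = fit_line(xs, ys)
```

Among all straight lines, this a and b minimize the sum of squared vertical deviations of the points from the line.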
Standard error of estimate: the square root of the sum of squared deviations of Y about the regression line, Σ(Y − Y′)², divided by N − 2, where Y′ is the predicted value of Y.
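A sketch of that definition in code, reusing the least-squares fit of Y on X (data values are hypothetical):

```python
import math

def std_error_estimate(xs, ys):
    # Fit the regression of Y on X by least squares.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    # Sum of squared deviations of Y about the line: sum of (Y - Y')^2,
    # where Y' = a + b*X is the predicted value
    ss_resid = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    # Divide by N - 2 and take the square root
    return math.sqrt(ss_resid / (n - 2))

se = std_error_estimate([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
```

The divisor is N − 2 rather than N because two constants (a and b) were estimated from the data.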
Correlation coefficient: reflects the degree of linear relationship between two variables.
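One common form of the correlation coefficient is Pearson's r, computed from the deviations of each variable about its mean. A minimal sketch (data values are hypothetical):

```python
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Sum of cross-deviations and the two sums of squared deviations
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    # r ranges from -1 (perfect negative) to +1 (perfect positive)
    return sxy / math.sqrt(sxx * syy)

r = pearson_r([1, 2, 3, 4, 5], [2, 4, 5, 4, 5])
```

An r near 0 indicates little linear relationship; its sign matches the sign of the slope b in the regression of Y on X.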
Restricted range: when the range of the sample is restricted, the correlation generally decreases.
Extreme groups: when extreme groups are sampled, r tends to be higher, because sampling the extremes increases overall sample variability relative to error variability.
Combined Groups: combining subgroups can alter the correlation coefficient.
Extreme score: a single extreme score can inflate the correlation coefficient.