Ordinary least squares regression
When we have more than one predictor (X) variable, we need a way to combine them
The regression procedure extends seamlessly to do this…
Essentially, we extend the least squares procedure to estimate b0, b1, b2, …, bk so as to give the best possible prediction of Y from all the predictor variables jointly
This type of regression is often referred to as OLS: ordinary least squares regression
Equation for multiple regression
Y’ = b0 + b1X1 + b2X2 + … + bkXk
(The error term e belongs to the model for the observed scores, Y = b0 + b1X1 + b2X2 + … + bkXk + e; the predicted value Y’ carries no error term.)
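As a sketch of how the coefficients are estimated, the least squares fit can be computed directly with NumPy; the data below are made-up illustration values, constructed so the fit is exact:

```python
import numpy as np

# Hypothetical data: two predictors and an outcome (illustration only)
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
Y = 2.0 + 1.5 * X1 + 0.5 * X2  # built from known coefficients

# Design matrix: a column of 1s (for b0) plus one column per predictor
X = np.column_stack([np.ones_like(X1), X1, X2])

# lstsq finds b0, b1, b2 minimising the sum of squared residuals (Y - Y')^2
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
b0, b1, b2 = b
print(b0, b1, b2)  # recovers 2.0, 1.5, 0.5
```

Because Y was built exactly from the coefficients, least squares recovers them; with real (noisy) data the estimates are those that minimise the squared residuals.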
Testing the significance of R
ρ (rho) = population correlation coefficient
H0: ρ = 0
H1: ρ ≠ 0
H0 : there is no linear relationship between X and Y in the population
H1: there is a linear relationship between X and Y in the population
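In multiple regression, the significance of R is usually assessed with an F test: F = (R²/k) / ((1 − R²)/(n − k − 1)), with k and n − k − 1 degrees of freedom. A minimal sketch using SciPy; the R², n, and k values are made up for illustration:

```python
from scipy.stats import f

# Hypothetical values: R² = .40 from a model with k = 2 predictors, n = 30 cases
r2, n, k = 0.40, 30, 2

# F ratio: explained variance per predictor df over residual variance per residual df
F = (r2 / k) / ((1 - r2) / (n - k - 1))

# p-value from the F distribution with (k, n - k - 1) degrees of freedom
p = f.sf(F, k, n - k - 1)
print(F, p)  # F = 9.0; p < .05, so reject H0
```

A significant F means the predictors jointly account for more variance in Y than expected by chance.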
R squared and variance
In correlation, r² = the proportion of variance shared by X and Y, i.e., the overlap between the two correlated variables
Multiple R squared and variance
R squared equation
R² = SSreg / SStot
SStot = Σ(Y − Ȳ)², the total variation of Y about its mean
SSreg = Σ(Y’ − Ȳ)², the variation in Y explained by the regression
SSres = Σ(Y − Y’)², the residual variation left unexplained (SStot = SSreg + SSres)
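Putting these pieces together, the sums of squares can be computed from a fitted model and checked against the decomposition SStot = SSreg + SSres (made-up data, with a little fixed "noise" so the fit is not perfect):

```python
import numpy as np

# Hypothetical data: linear signal plus fixed noise so SSres > 0
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
noise = np.array([0.3, -0.2, 0.1, -0.4, 0.2, 0.0])
Y = 2.0 + 1.5 * X1 + 0.5 * X2 + noise

# Fit the two-predictor model and get predicted values Y'
X = np.column_stack([np.ones_like(X1), X1, X2])
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_pred = X @ b

ss_tot = np.sum((Y - Y.mean()) ** 2)       # total variation of Y
ss_reg = np.sum((Y_pred - Y.mean()) ** 2)  # variation explained by the regression
ss_res = np.sum((Y - Y_pred) ** 2)         # residual variation

r2 = ss_reg / ss_tot
print(r2)  # proportion of variance in Y explained by X1 and X2 jointly
```

With an intercept in the model, the explained and residual sums of squares always add up to the total, which is what makes R² interpretable as a proportion.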
Extending our interpretation of regression predictors
Semipartial correlations—sr and sr2
Calculating the unique variance: sr² for a predictor = the proportion of variance in Y uniquely explained by that predictor, over and above the other predictors
Shared variance
Shared variance = R² − (sr²X1 + sr²X2)
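A sketch of the calculation: each sr² can be obtained as the drop in R² when that predictor is removed from the model, and the shared variance is what remains. The data and the helper name `r_squared` below are my own illustration, not a standard API:

```python
import numpy as np

def r_squared(Y, X):
    """R² for an OLS fit of Y on design matrix X (intercept column included)."""
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    ss_res = np.sum((Y - X @ b) ** 2)
    ss_tot = np.sum((Y - Y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical data: X2 is correlated with X1, so they share variance in Y
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X2 = np.array([1.5, 2.0, 3.5, 4.0, 5.5, 6.0])
Y = 1.0 + 1.0 * X1 + 0.5 * X2 + np.array([0.2, -0.1, 0.3, -0.3, 0.1, -0.2])
ones = np.ones_like(X1)

R2_full = r_squared(Y, np.column_stack([ones, X1, X2]))
# sr² = unique variance: full-model R² minus R² with that predictor dropped
sr2_x1 = R2_full - r_squared(Y, np.column_stack([ones, X2]))
sr2_x2 = R2_full - r_squared(Y, np.column_stack([ones, X1]))

shared = R2_full - (sr2_x1 + sr2_x2)
print(sr2_x1, sr2_x2, shared)
```

Because the predictors are correlated, much of the explained variance is shared rather than unique, which is why the sr² values can sum to far less than R².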