Regression
Statistical model predicting outcome Y from predictors X.
Multiple Regression
Uses two or more predictors to explain variance in Y.
Unstandardised Coefficient (b)
Change in Y (in its units) per one unit of X.
Standardised Coefficient (β)
Change in Y (in SD units) per one SD of X.
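The b/β relationship above can be sketched numerically. This is a minimal illustration with synthetic data (all values hypothetical): β equals b rescaled by SD_X/SD_Y, which is the same slope you get by regressing z-scored Y on z-scored X.

```python
import numpy as np

# Hypothetical single-predictor data (values illustrative only).
rng = np.random.default_rng(2)
x = rng.normal(50, 10, size=200)                     # predictor
y = 3.0 + 0.4 * x + rng.normal(scale=2.0, size=200)  # outcome

b = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)   # unstandardised slope b
beta = b * np.std(x, ddof=1) / np.std(y, ddof=1)     # standardised beta

# Same value from the slope of z-scored Y on z-scored X.
zx = (x - x.mean()) / np.std(x, ddof=1)
zy = (y - y.mean()) / np.std(y, ddof=1)
beta_z = np.cov(zx, zy, ddof=1)[0, 1]                # var(zx) = 1, so slope = cov
print(abs(beta - beta_z) < 1e-10)
```

With one predictor, β is also the Pearson correlation between X and Y.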
Intercept (b₀)
Expected value of Y when all Xs = 0.
Residual (ε)
Difference between observed and predicted Y.
Assumptions
LINE — Linearity, Independence, Normality, Equality of variance.
Violation Effects
Reduce precision, widen confidence intervals.
T-tests and Regression
t = b / SE; a t-test is a special case of regression with a single dichotomous predictor.
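The equivalence can be checked directly. In this sketch with hypothetical two-group data, the pooled-variance independent-samples t statistic matches t = b₁ / SE(b₁) from a regression of Y on a 0/1 group dummy.

```python
import numpy as np

# Hypothetical two-group data (values illustrative only).
rng = np.random.default_rng(1)
g0 = rng.normal(10, 2, size=30)
g1 = rng.normal(12, 2, size=30)

# Classic pooled-variance t-test, computed by hand.
n0, n1 = len(g0), len(g1)
sp2 = ((n0 - 1) * g0.var(ddof=1) + (n1 - 1) * g1.var(ddof=1)) / (n0 + n1 - 2)
t_classic = (g1.mean() - g0.mean()) / np.sqrt(sp2 * (1 / n0 + 1 / n1))

# Same statistic via regression on a 0/1 group dummy.
y = np.concatenate([g0, g1])
x = np.concatenate([np.zeros(n0), np.ones(n1)])
X = np.column_stack([np.ones(n0 + n1), x])      # intercept + dummy
b, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ b
s2 = resid @ resid / (len(y) - 2)               # residual variance
se_b1 = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))
t_reg = b[1] / se_b1                            # t = b / SE
print(np.isclose(t_classic, t_reg))             # True: identical statistics
```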
Model equation
Y = b₀ + b₁X₁ + b₂X₂ + … + ε
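The model equation can be fitted by ordinary least squares. A minimal sketch with synthetic data (true coefficients chosen arbitrarily for illustration):

```python
import numpy as np

# Synthetic data for Y = b0 + b1*X1 + b2*X2 + eps (values illustrative).
rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 2.0 + 1.5 * X1 - 0.7 * X2 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), X1, X2])   # column of ones estimates b0
b, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares [b0, b1, b2]
eps_hat = Y - X @ b                         # residuals (estimated epsilon)
print(b)                                    # near [2.0, 1.5, -0.7]
```

With an intercept in the model, the residuals sum to zero by construction.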
R²
Proportion of variance in Y explained by predictors.
Goal of Multiple Regression
Identify the unique contribution of each predictor to Y.
ANOVA
Compares means across groups; extension of regression for categorical predictors.
SST (Total Sum of Squares)
Total variation of Y around the grand mean.
SSM (Model Sum of Squares)
Variance explained by the group means.
SSR (Residual Sum of Squares)
Variance not explained by the model (error).
R²
SSM / SST; variance explained by predictors.
F-Test
Ratio of model variance to residual variance; tests overall model fit.
η² (Eta Squared)
Proportion of variance in Y explained by group membership.
df_B (Between Groups)
k – 1
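The ANOVA quantities above fit together in one decomposition. A sketch with hypothetical three-group data showing SST = SSM + SSR, R² (which equals η² in one-way ANOVA), df_B = k − 1, and the F ratio:

```python
import numpy as np

# Hypothetical three-group data (values illustrative only).
groups = [np.array([4., 5., 6.]),
          np.array([7., 8., 9.]),
          np.array([2., 3., 4.])]
y = np.concatenate(groups)
grand = y.mean()

sst = np.sum((y - grand) ** 2)                               # total
ssm = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)  # model
ssr = sum(np.sum((g - g.mean()) ** 2) for g in groups)       # residual

k, n = len(groups), len(y)
df_b, df_w = k - 1, n - k          # between- and within-groups df
r2 = ssm / sst                     # also eta squared here
f = (ssm / df_b) / (ssr / df_w)    # F: model vs residual variance
print(np.isclose(sst, ssm + ssr), round(r2, 3), round(f, 1))
# True 0.864 19.0
```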
p-value
Probability of data at least as extreme as observed, assuming the null hypothesis (H₀) is true.
Null Hypothesis
Assumes no group difference or association.
NHST
Null Hypothesis Significance Testing.
Combines Fisher’s evidence and Neyman–Pearson decision traditions.
Statistical Significance
A result whose p-value falls below an arbitrary threshold (usually α = .05).