Bias-variance tradeoff
Common pitfalls of PCA
Bias-variance tradeoff math
Favorite machine learning algorithm
Eigenvector
For an n x n matrix A, a nonzero vector x is an eigenvector of A if Ax = lambda x, where lambda is a scalar.
A matrix represents a linear transformation; when it is applied to one of its eigenvectors, the result is just a scaled copy of that vector (the direction is unchanged).
eigenvalue
The scaling factor lambda in Ax = lambda x.
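A minimal NumPy sketch of the definition above (the matrix here is my own example): compute the eigenvectors and eigenvalues of a small matrix and check that Ax = lambda x holds for each pair.

```python
import numpy as np

# Example matrix (chosen for illustration; eigenvalues are easy to see).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` pairs with the eigenvalue at the same index.
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)  # Ax = lambda x
```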
eigendecomposition
Decomposition of a square matrix into its eigenvectors and eigenvalues: A = Q Lambda Q^-1, where the columns of Q are the eigenvectors of A and Lambda is a diagonal matrix of the eigenvalues.
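A quick sketch of eigendecomposition with NumPy (the matrix is my own example): factor A into Q Lambda Q^-1 and verify that the product reconstructs A.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, Q = np.linalg.eig(A)   # columns of Q are eigenvectors
Lam = np.diag(eigenvalues)          # diagonal matrix of eigenvalues

# Reconstruct A = Q Lambda Q^-1.
A_rebuilt = Q @ Lam @ np.linalg.inv(Q)
assert np.allclose(A, A_rebuilt)
```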
SVD (singular value decomposition)
Decomposes any m x n matrix, including non-square ones, as A = U Sigma V^T, where U and V have orthonormal columns and Sigma is diagonal with the singular values.
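A short sketch of SVD on a non-square matrix with NumPy (the matrix is an arbitrary example): factor a 2 x 3 matrix and verify the reconstruction.

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)  # a 2x3 (non-square) matrix

# Reduced SVD: U is 2x2, s holds the singular values, Vt is 2x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstruct A = U Sigma V^T.
A_rebuilt = U @ np.diag(s) @ Vt
assert np.allclose(A, A_rebuilt)
```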
Loss function
Measures how well a model fits a particular dataset. The lower the loss, the better the fit.
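A minimal sketch of one common loss function, mean squared error (the function name and data are my own): perfect predictions give zero loss, and worse predictions give a higher loss.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: average of squared prediction errors."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_true - y_pred) ** 2))

perfect = mse([1, 2, 3], [1, 2, 3])  # exact fit
worse = mse([1, 2, 3], [1, 2, 5])    # one prediction is off by 2
```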
Optimization methods
Techniques for minimizing a loss function, e.g. gradient descent.
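A toy sketch of one such method, gradient descent (function names, the learning rate, and the target function are my own example): repeatedly step opposite the gradient to minimize f(w) = (w - 3)^2, whose minimum is at w = 3.

```python
def gradient_descent(grad, w0, lr=0.1, steps=200):
    """Minimize a 1-D function by stepping against its gradient."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # move downhill
    return w

# f(w) = (w - 3)^2 has gradient 2 * (w - 3); the minimizer is w = 3.
w_star = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```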