What is the excess risk and how can we decompose it into the sum of two different errors?
What are the three terms that estimation error decomposes into?
What can we decompose the 3rd term into?
(1) Risk of the ERM − empirical risk of the ERM
(2) Empirical risk of the ERM − empirical risk of the best-in-class function
(3) Empirical risk of the best-in-class function − true risk of the best-in-class function
Term (3) can be rewritten using the loss function: it is the empirical average of Z minus the expectation of Z, where Z is the loss of the best-in-class function.
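A sketch of the decomposition in one common notation (the symbols $R$ for true risk, $\hat R_n$ for empirical risk, $\hat f_n$ for the ERM, $f^*_{\mathcal F}$ for the best-in-class function and $f^*$ for the Bayes predictor are assumptions, not fixed by these notes):

```latex
% Excess risk = estimation error + approximation error:
R(\hat f_n) - R(f^*)
  = \underbrace{R(\hat f_n) - R(f^*_{\mathcal F})}_{\text{estimation error}}
  + \underbrace{R(f^*_{\mathcal F}) - R(f^*)}_{\text{approximation error}}
% The estimation error splits into the three terms above:
R(\hat f_n) - R(f^*_{\mathcal F})
  = \bigl[R(\hat f_n) - \hat R_n(\hat f_n)\bigr]
  + \bigl[\hat R_n(\hat f_n) - \hat R_n(f^*_{\mathcal F})\bigr]
  + \bigl[\hat R_n(f^*_{\mathcal F}) - R(f^*_{\mathcal F})\bigr]
```

The middle term is ≤ 0 by definition of the ERM, and the last term is an empirical average minus an expectation of Z = ℓ(f*_𝓕(X), Y).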
What is Markov’s inequality?
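For reference, the standard statement with its one-line proof (a sketch):

```latex
% Markov's inequality: for Z \ge 0 and any t > 0,
\Pr(Z \ge t) \le \frac{\mathbb{E}[Z]}{t}
% Proof: t \cdot \mathbf{1}\{Z \ge t\} \le Z pointwise; take expectations.
```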
What do we need to consider in the case of the empirical risk of f*_𝓕 (the best-in-class function)?
What is Chebyshev's inequality and its proof?
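The standard statement, with the proof via Markov (a sketch):

```latex
% Chebyshev's inequality: for Z with mean \mu and variance \sigma^2, and t > 0,
\Pr\bigl(|Z - \mu| \ge t\bigr) \le \frac{\sigma^2}{t^2}
% Proof: apply Markov to the non-negative variable (Z-\mu)^2 at level t^2.
```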
What does the inequality show roughly?
What can we use it to prove?
How loose is the bound? What theorem do we use to show this holds?
What is Chernoff’s bound? What does it mean?
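The standard form of the bound (a sketch; it follows from Markov applied to the exponentiated variable):

```latex
% Chernoff bound: for any \lambda > 0, Markov applied to e^{\lambda(Z-\mu)} gives
\Pr(Z - \mu \ge t) \le e^{-\lambda t}\,\mathbb{E}\bigl[e^{\lambda(Z-\mu)}\bigr]
% Optimising over \lambda:
\Pr(Z - \mu \ge t) \le \inf_{\lambda > 0} e^{-\lambda t}\,\mathbb{E}\bigl[e^{\lambda(Z-\mu)}\bigr]
```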
What is the definition of a sub-Gaussian?
The bound on the expectation is the moment generating function of a centered Gaussian.
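The standard definition, for reference (notation $\sigma^2$ for the sub-Gaussian parameter is assumed):

```latex
% Z is \sigma^2-sub-Gaussian if, for all \lambda \in \mathbb{R},
\mathbb{E}\bigl[e^{\lambda(Z-\mu)}\bigr] \le e^{\lambda^2\sigma^2/2}
% The right-hand side is exactly the MGF of a centered \mathcal{N}(0,\sigma^2).
```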
Given Z is sub-Gaussian, what other random variables are sub-Gaussian?
For a two-sided bound, where you deal with both Z − µ and µ − Z, you can simplify by symmetry: just take 2 times the one-sided bound (a union bound over the two tails).
What is Hoeffding’s Lemma?
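The standard statement of the lemma, for reference:

```latex
% Hoeffding's lemma: if a \le Z \le b almost surely, then for all \lambda \in \mathbb{R},
\mathbb{E}\bigl[e^{\lambda(Z-\mu)}\bigr] \le e^{\lambda^2(b-a)^2/8}
% i.e. a bounded variable is sub-Gaussian with parameter \sigma^2 = (b-a)^2/4.
```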
What is the proof of Hoeffding’s Lemma?
ADD ONCE WE’VE DONE THE PROOF
What is the proposition regarding the distribution of the sum of sub-Gaussian random variables?
What is the proof of this proposition?
What is the Corollary regarding the bounding of a sum of independent sub-Gaussian Random variables?
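A sketch of the standard proposition and corollary (independence is the key assumption):

```latex
% If Z_1,\dots,Z_n are independent and each Z_i is \sigma_i^2-sub-Gaussian,
% independence factorises the MGF of the sum:
\mathbb{E}\Bigl[e^{\lambda\sum_i (Z_i-\mu_i)}\Bigr]
  = \prod_{i=1}^n \mathbb{E}\bigl[e^{\lambda(Z_i-\mu_i)}\bigr]
  \le e^{\lambda^2 \sum_i \sigma_i^2 / 2}
% so the sum is \bigl(\sum_i \sigma_i^2\bigr)-sub-Gaussian, and Chernoff yields
\Pr\Bigl(\tfrac{1}{n}\sum_{i=1}^n (Z_i-\mu_i) \ge t\Bigr)
  \le \exp\Bigl(-\frac{n^2 t^2}{2\sum_i \sigma_i^2}\Bigr)
```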
What is Hoeffding's inequality?
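A quick numerical sanity check (a sketch: assumes i.i.d. Bernoulli variables in [0, 1], for which the one-sided Hoeffding bound exp(−2nt²) applies; the function names are mine):

```python
import math
import random

def hoeffding_bound(n: int, t: float) -> float:
    """One-sided Hoeffding bound for i.i.d. variables in [0, 1]:
    P(sample mean - mu >= t) <= exp(-2 n t^2)."""
    return math.exp(-2 * n * t * t)

def empirical_tail(n: int, t: float, trials: int = 20000,
                   p: float = 0.5, seed: int = 0) -> float:
    """Fraction of trials in which the mean of n Bernoulli(p)
    draws exceeds p by at least t."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if mean - p >= t:
            hits += 1
    return hits / trials

# The empirical tail frequency should sit below the bound.
print(empirical_tail(50, 0.2), "<=", hoeffding_bound(50, 0.2))
```

The bound is loose here (it ignores the variance), which is consistent with the CLT giving a much smaller Gaussian tail for this example.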
What is the one-sided/two-sided bound on the empirical risk under the 0-1 classification loss?
What about the estimation error?
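A common way these pieces fit together (a sketch; the exact form in the notes may differ): the 0-1 loss lies in [0, 1], so Hoeffding applies to any fixed f, and the estimation error is controlled by the worst-case deviation over the class.

```latex
% For a fixed f, the 0-1 losses are i.i.d. in [0,1], so Hoeffding gives
\Pr\bigl(|\hat R_n(f) - R(f)| \ge t\bigr) \le 2e^{-2nt^2}
% and the estimation error is bounded by twice the uniform deviation:
R(\hat f_n) - R(f^*_{\mathcal F}) \le 2\sup_{f\in\mathcal F}\bigl|\hat R_n(f) - R(f)\bigr|
```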
ADD TO THIS FROM WRITTEN NOTES
What is a Lemma for the probability of the error bounds?
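One common form of such a lemma, for a finite class under 0-1 loss (an assumption; the notes may state a different version):

```latex
% For a finite class \mathcal F, a union bound over the |\mathcal F| events gives
\Pr\Bigl(\sup_{f\in\mathcal F}|\hat R_n(f) - R(f)| \ge t\Bigr)
  \le \sum_{f\in\mathcal F}\Pr\bigl(|\hat R_n(f) - R(f)| \ge t\bigr)
  \le 2|\mathcal F|\,e^{-2nt^2}
```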
What is the proof of this lemma?
ADD FROM NOTES