Chapter 4.3: Regularisation - Concentration inequalities Flashcards

(17 cards)

1
Q

What is the excess risk and how can we decompose it into the sum of two different errors?

A
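A sketch of the standard decomposition, assuming the usual notation (f̂ the ERM, f*_𝓕 the best function in the class 𝓕, f* the Bayes-optimal function) — the card's own images may use different symbols:

```latex
\underbrace{R(\hat{f}) - R(f^{*})}_{\text{excess risk}}
  = \underbrace{R(\hat{f}) - R(f^{*}_{\mathcal{F}})}_{\text{estimation error}}
  + \underbrace{R(f^{*}_{\mathcal{F}}) - R(f^{*})}_{\text{approximation error}}
```

The estimation error comes from fitting with finite data; the approximation error comes from restricting to the class 𝓕.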
2
Q

What are the three terms that estimation error decomposes into?

What can we decompose the 3rd term into?

A

(1) Risk of the ERM – empirical risk of the ERM
(2) Empirical risk of the ERM – empirical risk of the best-in-class function
(3) Empirical risk of the best-in-class function – true risk of the best-in-class function

The second way of writing (3) uses the loss function –> the empirical average of Z minus the expectation of Z
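The three-term split described above can be sketched as follows, assuming standard ERM notation (f̂ the ERM, f*_𝓕 the best-in-class function, R̂_n the empirical risk):

```latex
R(\hat{f}) - R(f^{*}_{\mathcal{F}})
  = \underbrace{R(\hat{f}) - \hat{R}_n(\hat{f})}_{(1)}
  + \underbrace{\hat{R}_n(\hat{f}) - \hat{R}_n(f^{*}_{\mathcal{F}})}_{(2)\,\le\,0 \text{ by definition of ERM}}
  + \underbrace{\hat{R}_n(f^{*}_{\mathcal{F}}) - R(f^{*}_{\mathcal{F}})}_{(3)}

% Term (3) via the loss function, with Z_i = \ell(f^{*}_{\mathcal{F}}(X_i), Y_i):
(3) = \frac{1}{n}\sum_{i=1}^{n} Z_i - \mathbb{E}[Z]
```

Term (2) is non-positive because the ERM minimises the empirical risk, so controlling (1) and (3) suffices.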

3
Q

What is Markov’s inequality?

What do we need to consider in the case of the empirical risk of f*_𝓕 (the best-in-class function)?

A
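Markov's inequality (a standard fact, stated here since the card's answer is missing): for a non-negative random variable Z and any t > 0, P(Z ≥ t) ≤ E[Z]/t. A minimal numerical sanity check, using the exponential distribution as an assumed example:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.exponential(scale=1.0, size=100_000)  # non-negative, E[Z] = 1

t = 3.0
empirical_tail = np.mean(z >= t)  # true tail is e^{-3} ≈ 0.05
markov_bound = z.mean() / t       # Markov gives roughly 1/3

# The bound holds, but is loose: the true tail is far smaller.
assert empirical_tail <= markov_bound
```

The gap between the empirical tail and the bound illustrates why sharper tools (Chebyshev, Chernoff) are introduced next.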
4
Q

What is Chebyshev’s inequality and its proof?

What does the inequality show roughly?

What can we use it to prove?

A
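For reference (the card's answer is missing), Chebyshev's inequality with its one-line proof via Markov:

```latex
\mathbb{P}\big(|Z - \mathbb{E}[Z]| \ge t\big) \le \frac{\operatorname{Var}(Z)}{t^{2}}
\quad\text{(apply Markov's inequality to } (Z - \mathbb{E}[Z])^{2} \text{ with threshold } t^{2}\text{).}
```

Applied to a sample mean Z̄_n of i.i.d. variables with variance σ², this gives P(|Z̄_n − µ| ≥ ε) ≤ σ²/(nε²): roughly, the sample mean concentrates around the true mean, and this can be used to prove the weak law of large numbers.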
5
Q

How loose is the bound? What theory do we use to show this holds?

A
  • Chebyshev is quite loose and converges slowly –> using the intuition from the CLT we get exponential convergence, on the order of e^(-nε²) for large n
  • This uses convergence in distribution, so the inequality holds asymptotically rather than strictly
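A small numerical comparison of the two rates for variables bounded in [0, 1] (variance at most 1/4): Chebyshev's bound decays polynomially in n, while the CLT-style/Hoeffding exponential bound decays much faster. The function names and parameter choices here are illustrative assumptions:

```python
import math

def chebyshev_bound(n: int, eps: float, var: float = 0.25) -> float:
    # Chebyshev: P(|mean_n - mu| >= eps) <= var / (n * eps^2)
    return var / (n * eps**2)

def exponential_bound(n: int, eps: float) -> float:
    # Hoeffding-type bound for variables in [0, 1]
    return 2 * math.exp(-2 * n * eps**2)

n, eps = 1000, 0.05
# chebyshev_bound(1000, 0.05) = 0.25 / 2.5 = 0.1
# exponential_bound(1000, 0.05) = 2 * e^{-5} ≈ 0.0135
assert exponential_bound(n, eps) < chebyshev_bound(n, eps)
```

For larger n the gap widens dramatically, which is the motivation for Chernoff-type bounds.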
6
Q

What is Chernoff’s bound? What does it mean?

A
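Chernoff's bound, stated here since the card's answer is missing:

```latex
\mathbb{P}(Z \ge t) \le \inf_{\lambda > 0} e^{-\lambda t}\, \mathbb{E}\!\left[e^{\lambda Z}\right]
```

Meaning: apply Markov's inequality to the non-negative variable e^{λZ} and optimise over λ; the tail is controlled by the moment generating function, which is what makes exponential rates possible.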
7
Q

What is the definition of a sub-Gaussian?

A

The bound on the expectation is the moment generating function of a centered Gaussian
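The standard definition, consistent with the note above:

```latex
% Z is sub-Gaussian with parameter \sigma if, for all \lambda \in \mathbb{R},
\mathbb{E}\!\left[e^{\lambda (Z - \mathbb{E}[Z])}\right] \le e^{\sigma^{2}\lambda^{2}/2}
```

The right-hand side is exactly the moment generating function of N(0, σ²), so a sub-Gaussian variable has tails no heavier than a centered Gaussian's.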

8
Q

Given Z is sub-Gaussian, what other random variables are sub-Gaussian?

A

When you are doing a two-sided test, dealing with both Z − µ and µ − Z –> you can simplify this by taking 2 times the one-sided bound, due to symmetry
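Standard closure properties consistent with the point above (assuming Z is σ-sub-Gaussian with mean µ):

```latex
-Z \text{ is } \sigma\text{-sub-Gaussian}, \qquad
aZ \text{ is } |a|\sigma\text{-sub-Gaussian}, \qquad
\mathbb{P}\big(|Z - \mu| \ge t\big) \le 2\,e^{-t^{2}/(2\sigma^{2})}
```

The two-sided tail bound follows by a union bound over the events {Z − µ ≥ t} and {µ − Z ≥ t}, each of which satisfies the one-sided bound.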

9
Q

What is Hoeffding’s Lemma?

A
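The standard statement, added here since the card's answer is missing:

```latex
% Hoeffding's Lemma: if a \le Z \le b almost surely, then for all \lambda \in \mathbb{R},
\mathbb{E}\!\left[e^{\lambda (Z - \mathbb{E}[Z])}\right] \le e^{\lambda^{2}(b-a)^{2}/8}
% i.e. Z is sub-Gaussian with parameter \sigma = (b-a)/2.
```

In words: any bounded random variable is sub-Gaussian, with a parameter determined by the width of its range.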
10
Q

What is the proof of Hoeffding’s Lemma?

A

ADD ONCE WE’VE DONE THE PROOF

11
Q

What is the proposition regarding the distribution of the sum of sub-Gaussian random variables?

A
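The standard proposition, stated here since the card's answer is missing:

```latex
% If Z_1, \dots, Z_n are independent and each Z_i is \sigma_i-sub-Gaussian, then
\sum_{i=1}^{n} Z_i \text{ is sub-Gaussian with parameter }
\sigma = \Big(\sum_{i=1}^{n} \sigma_i^{2}\Big)^{1/2}
```

The sub-Gaussian parameters combine like variances: they add in quadrature.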
12
Q

What is the proof of this proposition?

A
  • The mean isn’t important, as the definition of sub-Gaussian always centres the distribution
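A sketch of the proof's key chain of (in)equalities, under the assumption that the Z_i are independent with means µ_i:

```latex
\mathbb{E}\!\left[e^{\lambda \sum_{i} (Z_i - \mu_i)}\right]
  = \prod_{i=1}^{n} \mathbb{E}\!\left[e^{\lambda (Z_i - \mu_i)}\right]
  \le \prod_{i=1}^{n} e^{\lambda^{2}\sigma_i^{2}/2}
  = e^{\lambda^{2} \left(\sum_{i} \sigma_i^{2}\right)/2}
```

Independence gives the factorisation of the expectation, and each factor is bounded using the sub-Gaussian definition; centring by µ_i is what makes the means irrelevant.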
13
Q

What is the corollary regarding the bounding of a sum of independent sub-Gaussian random variables?

A
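The standard corollary (combining the proposition with the Chernoff bound), stated here since the card's answer is missing:

```latex
\mathbb{P}\Big(\sum_{i=1}^{n} (Z_i - \mu_i) \ge t\Big)
  \le \exp\!\left(-\frac{t^{2}}{2\sum_{i=1}^{n}\sigma_i^{2}}\right)
```

A two-sided version with an extra factor of 2 follows by symmetry, as in the earlier card.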
14
Q

What is Hoeffding’s inequality?

A
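Hoeffding's inequality (a standard fact, stated here since the card's answer is missing): for i.i.d. Z_i taking values in [a, b] with mean µ, P(|Z̄_n − µ| ≥ ε) ≤ 2·exp(−2nε²/(b − a)²). A minimal simulation check with Bernoulli(1/2) samples as an assumed example:

```python
import numpy as np

rng = np.random.default_rng(42)
n, trials, eps, p = 100, 20_000, 0.2, 0.5

# Each row is one sample of n Bernoulli(1/2) draws, bounded in [0, 1].
samples = rng.binomial(1, p, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - p)

empirical = np.mean(deviations >= eps)
bound = 2 * np.exp(-2 * n * eps**2)  # b - a = 1, so the exponent is -2 n eps^2

# The empirical deviation probability sits below the Hoeffding bound.
assert empirical <= bound
```

With n = 100 and ε = 0.2 the bound is 2e⁻⁸ ≈ 6.7 × 10⁻⁴, already far sharper than the Chebyshev bound of 0.25/(nε²) ≈ 0.0625 for the same setting.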
15
Q

What is the one-sided/two-sided bounding of the 0-1 loss classification of the empirical risk?

What about the estimation error?

A

ADD TO THIS FROM WRITTEN NOTES
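Since the 0-1 loss takes values in [0, 1], Hoeffding's inequality applies directly to a fixed classifier f; a sketch of the standard consequence (the written notes may state this differently):

```latex
\mathbb{P}\big(\hat{R}_n(f) - R(f) \ge \varepsilon\big) \le e^{-2n\varepsilon^{2}}
\qquad\text{(one-sided)},
\qquad
\mathbb{P}\big(|\hat{R}_n(f) - R(f)| \ge \varepsilon\big) \le 2\,e^{-2n\varepsilon^{2}}
\qquad\text{(two-sided)}.
```

Here R̂_n(f) is the empirical risk and R(f) the true risk; bounding the estimation error uniformly over the class requires more than this fixed-f bound.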

16
Q

What is a Lemma for the probability of the error bounds?

A
  • Columns have mean/sum 0 (?) and ℓ2 norm = √n
17
Q

What is the proof of this lemma?

A

ADD FROM NOTES