Deep Learning Flashcards

(12 cards)

1
Q

Define batch normalization.

A

A technique that normalizes the inputs of each layer over a mini-batch (zero mean and unit variance, followed by a learned scale and shift) to improve training speed and stability.
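A minimal NumPy sketch of the batch-norm forward pass in training mode (the names gamma, beta, and eps are the conventional learned scale/shift and stability constant, not from the card):

import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # x: (batch, features) inputs to a layer
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta              # learned rescale and shift

x = np.random.randn(32, 4) * 10 + 5          # badly scaled activations
out = batch_norm(x, np.ones(4), np.zeros(4))
print(out.mean(axis=0), out.std(axis=0))     # roughly 0 and 1 per feature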

2
Q

What is the impact of batch normalization on model generalization?

A

It can improve generalization: the noise introduced by per-mini-batch statistics acts as a mild regularizer, reducing overfitting.

3
Q

Define transposed convolution.

A

A convolution operation that increases the spatial dimensions of the input feature map.
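A toy 1-D NumPy sketch (an illustration, not any particular library's implementation): each input element scatters a scaled copy of the kernel into a larger output, so the output length is (in - 1) * stride + kernel_size with no padding.

import numpy as np

def conv_transpose_1d(x, kernel, stride=2):
    # Scatter-add a scaled kernel for each input element.
    k = len(kernel)
    out = np.zeros((len(x) - 1) * stride + k)  # output is larger than input
    for i, v in enumerate(x):
        out[i * stride : i * stride + k] += v * kernel
    return out

x = np.array([1.0, 2.0, 3.0])                  # length 3
print(conv_transpose_1d(x, np.array([1.0, 0.5])))
# length (3 - 1) * 2 + 2 = 6: spatial dimension increased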

4
Q

Epoch

A

One full pass through your entire dataset

5
Q

A step / iteration

A

Processing just one mini-batch of the data.

Example: 5000 photos with a batch size of 100 => 50 steps per epoch
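The same arithmetic as a sketch, using ceiling division so a final partial batch still counts as a step:

import math

def steps_per_epoch(n_samples, batch_size):
    return math.ceil(n_samples / batch_size)

print(steps_per_epoch(5000, 100))  # 50 steps per epoch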

6
Q

RNN PARAMETER CALCULATIONS

A

Input dimension = 128
Hidden units = 50

W_x: 128 × 50 = 6400
W_h: 50 × 50 = 2500
W_b (bias): 50

Total = 6400 + 2500 + 50 = 8950
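The same count in plain Python (assuming a vanilla RNN cell with a single bias vector; some frameworks, e.g. PyTorch, keep two bias vectors and report 50 more):

input_dim, hidden = 128, 50

w_x = input_dim * hidden  # input-to-hidden weights: 6400
w_h = hidden * hidden     # hidden-to-hidden (recurrent) weights: 2500
w_b = hidden              # bias: 50

print(w_x + w_h + w_b)    # 8950 trainable parameters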

7
Q

Reason for the RNN vanishing gradient during BPTT?

A

The gradient signal repeatedly gets multiplied by the recurrent weight matrix (W_h) and the activation function's derivative as it flows backward through each timestep; when those factors are smaller than one, the gradient shrinks exponentially over long sequences.
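A tiny numeric illustration of that compounding effect (the per-step factor 0.9 is made up for illustration):

# If each backward step multiplies the gradient by
# |W_h| * |activation derivative| ~ 0.9 ...
factor = 0.9
grad = 1.0
for t in range(100):   # ... then after 100 timesteps ...
    grad *= factor
print(grad)            # ~2.7e-5: the early-timestep signal has vanished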

8
Q

Does an RNN have a skip connection?

A

Yes, through time: the hidden state carries information directly from one timestep to the next. It is not the kind of skip connection that bypasses a layer itself (as in residual networks).

9
Q

Language model

A

A model that gives the probability of the next word given the previous words, or equivalently, the probability of an entire sentence.
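A minimal sketch of how the two views connect, with a made-up table P of toy conditional probabilities: per-word probabilities multiply into a sentence probability via the chain rule.

# Hypothetical P(word | history) table for a toy vocabulary.
P = {
    ("<s>",): {"the": 0.5, "a": 0.5},
    ("<s>", "the"): {"cat": 0.4, "dog": 0.6},
    ("<s>", "the", "cat"): {"sat": 0.7, "ran": 0.3},
}

def sentence_prob(words):
    # Chain rule: P(sentence) = product of P(w_t | w_1 .. w_{t-1}).
    history, prob = ("<s>",), 1.0
    for w in words:
        prob *= P[history][w]
        history = history + (w,)
    return prob

print(sentence_prob(["the", "cat", "sat"]))  # 0.5 * 0.4 * 0.7 = 0.14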

10
Q

Conditional probability

A

Gives us a way to formally calculate our new belief after a piece of evidence arrives, updating our old belief.
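A small numeric sketch of the definition P(A | B) = P(A and B) / P(B), with made-up counts:

# Made-up counts: of 100 emails, 20 contain "free"; 12 of those are spam.
n_total, n_free, n_free_and_spam = 100, 20, 12

p_free = n_free / n_total                     # P(evidence)
p_free_and_spam = n_free_and_spam / n_total   # P(belief and evidence)

print(p_free_and_spam / p_free)  # P(spam | "free") = 0.6, the updated belief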

11
Q
A
12
Q

Perplexity

A

Perplexity = (1 / P(sentence))^(1/N), where N is the number of words.

For an RNN trained with cross-entropy loss: perplexity = e^loss (the exponential of the average per-token loss).
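A sketch showing the two formulas agree when the loss is the average per-token negative log-likelihood (token probabilities below are made up):

import math

token_probs = [0.2, 0.5, 0.1, 0.4]  # model's probability for each token
N = len(token_probs)

ppl_direct = (1 / math.prod(token_probs)) ** (1 / N)   # (1/P(sentence))^(1/N)

avg_loss = -sum(math.log(p) for p in token_probs) / N  # mean cross-entropy
ppl_from_loss = math.exp(avg_loss)                     # e^loss

print(ppl_direct, ppl_from_loss)  # both ~3.98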
