Deep Learning Flashcards

(30 cards)

1
Q

Deep Learning (DL)

A

a branch of machine learning able to extract hierarchical features from complex datasets by training an ANN with multiple layers

2
Q

Artificial Neural Networks (ANN)

A

a computational model in which neurons, which hold the inputs, process information through weighted connections

3
Q

How are DL models trained?

A

Back-propagation

4
Q

Back-propagation

A

the process of propagating the prediction error backward through the network, retracing each layer to adjust its weights

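A minimal NumPy sketch of the card above (my own illustration, not part of the deck): one backward pass computes the gradient of the error and nudges the weight against it.

```python
import numpy as np

# Tiny example: learn y = 2x from a single data point.
x, y_true = 3.0, 6.0
w = 0.5  # initial weight

def loss(w):
    return (w * x - y_true) ** 2  # squared error

# Forward pass, then retrace backward: dL/dw = 2 * (w*x - y) * x
grad = 2 * (w * x - y_true) * x
lr = 0.01
w_new = w - lr * grad  # adjust the weight against the gradient

assert loss(w_new) < loss(w)  # the error shrinks after the update
```
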
5
Q

Sequence Models

A

models that take input data in the form of sequences; the goal is to find patterns in the sequence to make predictions

6
Q

Recurrent Neural Network (RNN)

A

not a feed-forward network; designed to handle sequential data, its feedback loop allows information to persist across time steps; contains a hidden state/memory to retain dependencies/patterns

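The feedback loop on this card can be sketched in a few lines of NumPy (sizes and names are my own, chosen for illustration): the new hidden state mixes the current input with the previous hidden state, so information persists across time steps.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 8          # arbitrary sizes for the sketch
Wx = rng.normal(size=(n_hidden, n_in)) * 0.1
Wh = rng.normal(size=(n_hidden, n_hidden)) * 0.1
b = np.zeros(n_hidden)

def rnn_step(x_t, h_prev):
    # Feedback loop: current input + previous hidden state -> new state.
    return np.tanh(Wx @ x_t + Wh @ h_prev + b)

h = np.zeros(n_hidden)                    # hidden state / memory
for x_t in rng.normal(size=(5, n_in)):    # a sequence of 5 time steps
    h = rnn_step(x_t, h)

assert h.shape == (n_hidden,)
```
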
7
Q

Feed-forward neural network

A

a type of artificial neural network where information flows in only one direction from input to output layers without any cycles or feedback loops.

8
Q

One-to-One RNN

A

takes a single input at each time step and produces a single output at the corresponding time step (real-time prediction)

9
Q

One-to-Many RNN

A

takes a single input and generates a sequence of multiple outputs (content generation)

10
Q

Many-to-Many RNN

A

takes in a sequence of inputs and produces a sequence of outputs (sequence transformation)

11
Q

Long Short-Term Memory (LSTM)

A

unlike plain RNNs, can handle long-term dependencies/memory by actively choosing what to remember

12
Q

LSTM Gates Names

A

Input, Forget, Output

13
Q

LSTM Input Gate

A

decides what new information from the current input should be stored

14
Q

LSTM Forget Gate

A

decides what information in the current cell state should be discarded/forgotten

15
Q

LSTM Output Gate

A

regulates how much the current memory cell should be exposed as the output

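The three gates on cards 12–15 can be sketched together in NumPy (a minimal single-cell illustration; weight shapes and names are my own assumptions, not from the deck): each gate is a sigmoid between 0 and 1 that scales what is stored, forgotten, or exposed.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

rng = np.random.default_rng(0)
n = 4                                      # hidden size (arbitrary)
x_h = rng.normal(size=2 * n)               # concatenated [input, prev hidden]
W = rng.normal(size=(4, n, 2 * n)) * 0.1   # one weight block per gate + candidate

i = sigmoid(W[0] @ x_h)   # input gate: what new info to store
f = sigmoid(W[1] @ x_h)   # forget gate: what old info to discard
o = sigmoid(W[2] @ x_h)   # output gate: how much memory to expose
g = np.tanh(W[3] @ x_h)   # candidate values

c_prev = np.zeros(n)
c = f * c_prev + i * g    # updated cell state (long-term memory)
h = o * np.tanh(c)        # exposed output

assert all(0 < v < 1 for v in np.concatenate([i, f, o]))
```
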
16
Q

Convolutional Neural Network (CNN)

A

type of DL model designed for processing grid-like data such as images and videos

17
Q

CNN Layers (In, FE)

A

Input Layer → Feature Extraction

18
Q

CNN Input Layer

A

accepts 3D images with height, width, and depth, the depth representing the RGB color channels

19
Q

CNN Feature Extraction

A

a repeating pattern of convolutional layers that extract meaningful features from the input layer

20
Q

CNN Feature Extraction Convolutional Layer

A

applies convolution operations to the input image using small filters (kernels)

21
Q

Convolutional Layer Kernel

A

slides across the input to detect specific features such as edges, corners, or textures
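A toy NumPy sketch of this card (image, kernel, and sizes are my own assumptions): a 3x3 vertical-edge kernel slides across a 5x5 image and responds strongly where the edge is. As in most CNN frameworks, this is technically cross-correlation.

```python
import numpy as np

image = np.zeros((5, 5))
image[:, 2:] = 1.0                      # right half bright: a vertical edge
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)

h = image.shape[0] - 3 + 1              # "valid" output height
w = image.shape[1] - 3 + 1              # "valid" output width
out = np.array([[np.sum(image[r:r+3, c:c+3] * kernel)
                 for c in range(w)] for r in range(h)])

assert out.shape == (3, 3)
assert np.abs(out).max() > 0            # nonzero response at the edge
```
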

22
Q

CNN Feature Extraction Activation Function

A

allows the network to learn more complex and non-linear relationships in the data
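One common activation (my own example choice, not named on the card) is ReLU, sketched here in NumPy:

```python
import numpy as np

def relu(z):
    # Negative pre-activations become 0; this breaks linearity and lets
    # stacked layers model complex, non-linear relationships.
    return np.maximum(0.0, z)

z = np.array([-2.0, -0.5, 0.0, 1.5])
assert np.array_equal(relu(z), np.array([0.0, 0.0, 0.0, 1.5]))
```
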

23
Q

CNN Feature Extraction Pooling Layer

A

helps reduce the spatial dimensions of feature maps generated by the convolutional layer
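A minimal NumPy sketch of this card (max pooling with a 2x2 window; the feature-map values are my own toy example): each spatial dimension of the feature map is halved, keeping the strongest response in each window.

```python
import numpy as np

def max_pool_2x2(fm):
    # Keep the max in each 2x2 window, halving height and width.
    h, w = fm.shape[0] // 2, fm.shape[1] // 2
    return fm[:2 * h, :2 * w].reshape(h, 2, w, 2).max(axis=(1, 3))

fm = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 feature map
pooled = max_pool_2x2(fm)

assert pooled.shape == (2, 2)
assert pooled[0, 0] == 5.0  # max of the top-left 2x2 block {0, 1, 4, 5}
```
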

24
Q

CNN Feature Extraction Fully Connected Layer

A

responsible for making final predictions or classifications based on the learned features

25
Q

CNN Feature Extraction Softmax Layer

A

converts the output of the last fully connected layer into probability scores
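Sketched in NumPy (logit values are my own example):

```python
import numpy as np

def softmax(logits):
    # Subtracting the max is a standard numerical-stability trick.
    e = np.exp(logits - np.max(logits))
    return e / e.sum()

scores = softmax(np.array([2.0, 1.0, 0.1]))  # raw FC-layer outputs
assert np.isclose(scores.sum(), 1.0)         # valid probability scores
assert scores.argmax() == 0                  # largest logit wins
```
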
26
Q

CNN Feature Extraction Dropout Layer

A

a regularization technique to prevent overfitting
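A minimal sketch of the training-time behavior ("inverted dropout", my own illustration of the technique named on the card):

```python
import numpy as np

def dropout(activations, p_drop, rng):
    # Randomly zero a fraction p_drop of activations during training;
    # scaling by 1/(1 - p_drop) keeps the expected magnitude unchanged.
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
a = np.ones(1000)
out = dropout(a, p_drop=0.5, rng=rng)

assert set(np.unique(out)) <= {0.0, 2.0}  # each unit dropped or rescaled
```
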
27
Q

ANN Neurons

A

receives inputs, performs computations, and produces an output
28
Q

ANN Weight

A

each connection between neurons has a weight, which determines the strength/importance of the connection
29
Q

ANN Bias

A

an additional input to a neuron that helps adjust the output; it provides flexibility and allows the model to make better predictions
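Cards 27–29 can be sketched together as one neuron in NumPy (input values, weights, and the sigmoid activation are my own example choices):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([1.0, 2.0])        # inputs received by the neuron
w = np.array([0.5, -0.25])      # connection weights (strength/importance)
b = 0.1                         # bias shifts the output

output = sigmoid(w @ x + b)     # weighted sum + bias, then activation
assert 0 < output < 1
```
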
30
Q

Multi-Layer Perceptron (MLP)

A

a feedforward neural network with multiple layers; can approximate any continuous function for complex decision boundaries
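A minimal two-layer MLP forward pass in NumPy (layer sizes and random weights are my own sketch): information flows strictly input to output, with a non-linearity in the hidden layer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer MLP: input(3) -> hidden(5, ReLU) -> output(2).
W1, b1 = rng.normal(size=(5, 3)) * 0.1, np.zeros(5)
W2, b2 = rng.normal(size=(2, 5)) * 0.1, np.zeros(2)

def mlp(x):
    h = np.maximum(0.0, W1 @ x + b1)  # hidden layer with ReLU
    return W2 @ h + b2                # output layer; no cycles: feedforward

y = mlp(np.array([1.0, -2.0, 0.5]))
assert y.shape == (2,)
```
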