Explain the softmax function
Turns a vector of real-valued scores (the raw outputs of a neural net) into probabilities that sum to 1
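A minimal sketch of the math in plain Python (not the TF call): exponentiate each score, then divide by the sum of the exponentials.

```python
import math

def softmax(scores):
    """Exponentiate each score, then normalize so the outputs sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.2])
print(probs)       # highest score -> highest probability
print(sum(probs))  # 1.0 (up to floating-point error)
```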

What are logits?
Scores/numbers. For a neural net, a logit is the result of the matmul of the weights and input, plus the bias (W·x + b). Logits are the inputs to the softmax function
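A toy sketch of where logits come from, in plain Python (the weights, inputs, and biases are made up): one logit per class.

```python
# Hypothetical 2-feature input and a 3-class linear layer.
x = [1.0, 2.0]                      # input features
W = [[0.5, -0.2],                   # one row of weights per class
     [0.1, 0.3],
     [-0.4, 0.8]]
b = [0.1, 0.0, -0.1]                # one bias per class

# logits = W·x + b
logits = [sum(w_i * x_i for w_i, x_i in zip(row, x)) + bias
          for row, bias in zip(W, b)]
print(logits)  # three logits, ready to be fed into softmax
```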

Explain a TensorFlow Session
An environment for running a TensorFlow computational graph; sess.run() evaluates the requested tensors and returns their values
Explain tf.placeholder
A tensor that has no value until the graph runs; its value is supplied at run time through the feed_dict argument of sess.run()

How to set multiple tf.placeholder values
Pass one key per placeholder in the feed_dict, e.g. sess.run(output, feed_dict={x: 'Test String', y: 123, z: 45.67})

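These notes are from the TF 1.x era; a sketch of the Session / placeholder / feed_dict workflow, assuming the tf.compat.v1 aliases (which is where these calls live under TF 2.x):

```python
import tensorflow as tf

# These notes use the TF 1.x graph API; under TF 2.x the same calls
# are available via tf.compat.v1 with eager execution disabled.
tf1 = tf.compat.v1
tf1.disable_eager_execution()

x = tf1.placeholder(tf.float32)
y = tf1.placeholder(tf.float32)
total = x + y  # a graph node; nothing is computed yet

# A Session executes the graph; feed_dict binds one value per placeholder.
with tf1.Session() as sess:
    result = sess.run(total, feed_dict={x: 1.0, y: 2.0})
print(result)  # 3.0
```
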
What happens if the data passed to the feed_dict doesn’t match the tensor type and can’t be cast into the tensor type
ValueError: invalid literal for
How to cast a value to another type
tf.subtract(tf.cast(tf.constant(2.0), tf.int32), tf.constant(1))
Explain tf.Variable
A tensor whose value can change during training, used for weights and biases; unlike constants and placeholders it must be explicitly initialized, e.g. with sess.run(tf.global_variables_initializer())

TF.normal
Softmax function call?
x = tf.nn.softmax([2.0, 1.0, 0.2])


Explain steps to one-hot encode labels
Map each distinct class to an index, then encode each label as a vector that is 1 at its class index and 0 everywhere else

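A plain-Python sketch of one-hot encoding (the label names are made up): map classes to indices, then emit a 1-at-index vector per label.

```python
labels = ['cat', 'dog', 'cat', 'bird']

# Step 1: map each distinct class to an index
classes = sorted(set(labels))                 # ['bird', 'cat', 'dog']
index = {c: i for i, c in enumerate(classes)}

# Step 2: one vector per label, 1 at the class index, 0 elsewhere
one_hot = [[1 if index[label] == i else 0 for i in range(len(classes))]
           for label in labels]
print(one_hot)  # [[0, 1, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0]]
```
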
basic concept of cross entropy
Measures the distance between two vectors, usually a one-hot encoded label vector and a softmax output vector. The basic idea is to reduce this distance during training
Describe process of calculating cross entropy
Take the natural log of each softmax probability, multiply element-wise by the one-hot label vector, sum the results, and negate: D(S, L) = -Σ Lᵢ log(Sᵢ)

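A plain-Python sketch of the cross entropy calculation (the softmax output and one-hot label below are hypothetical):

```python
import math

def cross_entropy(softmax_probs, one_hot_label):
    """D(S, L) = -sum(L_i * log(S_i)); only the true class's log survives."""
    return -sum(l * math.log(s) for s, l in zip(softmax_probs, one_hot_label))

S = [0.7, 0.2, 0.1]  # softmax output
L = [1, 0, 0]        # one-hot label: true class is index 0
print(cross_entropy(S, L))  # -log(0.7), about 0.357
```

The closer the softmax puts probability 1 on the true class, the closer the distance gets to 0.
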
Does the cross entropy function output a vector of values?
No, just a single scalar value which represents the distance
Quiz - Cross Entropy


how to implement mini-batching in TF
Give the placeholders a None batch dimension, slice the features and labels into chunks of batch_size, then feed each chunk through feed_dict inside the training loop

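A plain-Python sketch of the slicing step (the `batches` helper name is my own; note the last batch may be smaller than batch_size):

```python
def batches(batch_size, features, labels):
    """Slice features and labels into mini-batches of at most batch_size."""
    assert len(features) == len(labels)
    out = []
    for start in range(0, len(features), batch_size):
        end = start + batch_size
        out.append([features[start:end], labels[start:end]])
    return out

example_features = [['F11', 'F12'], ['F21', 'F22'], ['F31', 'F32'], ['F41', 'F42']]
example_labels = [['L11'], ['L21'], ['L31'], ['L41']]

example_batches = batches(3, example_features, example_labels)
print(len(example_batches))  # 2 batches: 3 examples, then the leftover 1
```
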
Quiz - Set features, labels, weights and biases


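The quiz answer isn't recorded here; a hedged sketch using MNIST-shaped dimensions (n_input = 784 and n_classes = 10 are assumptions), TF 1.x-style API via tf.compat.v1:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

n_input = 784   # assumed: 28x28 images, flattened
n_classes = 10  # assumed: digits 0-9

# Features and labels are fed in at run time, so they are placeholders;
# None leaves the batch dimension flexible.
features = tf1.placeholder(tf.float32, [None, n_input])
labels = tf1.placeholder(tf.float32, [None, n_classes])

# Weights and biases are trained, so they are Variables.
weights = tf1.Variable(tf1.truncated_normal([n_input, n_classes]))
biases = tf1.Variable(tf1.zeros(n_classes))

print(weights.shape)  # (784, 10)
```
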
What is a tensor
any n-dimensional collection of values
scalar = 0 dimension tensor
vector = 1 dimension tensor
matrix = 2 dimension tensor
Anything larger than 2 dimensions is just called an n-dimensional tensor (rank n)
Describe a 3 dimensional tensor
An image can be described by a 3-dimensional tensor, which looks like a list of matrices (e.g. height × width × color channels)
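A plain-Python sketch of that picture (pixel values made up): a tiny "image" as a list of matrices.

```python
# A 2x2 "image" with 3 color channels per pixel: a rank-3 tensor,
# i.e. a list of matrices.
image = [
    [[255, 0, 0], [0, 255, 0]],    # row 0: two pixels, 3 channels each
    [[0, 0, 255], [255, 255, 0]],  # row 1
]
shape = (len(image), len(image[0]), len(image[0][0]))
print(shape)  # (2, 2, 3)
```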

tf.constant
Returns a tensor whose value never changes

Best practice to initialize weights
Draw them from a truncated normal distribution: tf.truncated_normal takes the output shape (a list/tuple) as input and returns small random values, so the weights don't all start identical


Best practice to initialize bias
Initialize to zeros with tf.zeros; the randomized weights already take care of breaking symmetry

What does “None” allow us to do?
None is a placeholder dimension: it accepts batches of any size, so we can feed different batch sizes (including a smaller final batch) without changing the graph