Intermediate Flashcards

(11 cards)

1
Q

How do operations that rely on a random seed work in TensorFlow? Use tf.random.shuffle() as an example.

A

Operations that rely on a random seed actually derive it from two seeds: the global and operation-level seeds.

If neither the global seed nor the operation seed is set, we get different results for every call to the random op.
shuffled = tf.random.shuffle(not_shuffled_tensor)

If only one of the seeds (global or operation-level) is set, we get different results for every call to the random op, but the same sequence for every re-run of the program.

tf.random.set_seed(42)
shuffled = tf.random.shuffle(not_shuffled_tensor)

or
shuffled = tf.random.shuffle(not_shuffled_tensor, seed=42)

If we set both the global and operation-level seeds, we will always get the same sequence for every call to the random op and for every re-run of the program.

tf.random.set_seed(42)
shuffled = tf.random.shuffle(not_shuffled_tensor, seed=42)
2
Q

How can we add a new dimension to the tensor below using tf.newaxis while keeping the existing elements? Explain how the new axis/dimensions will be added.

A

rank3_tensor = rank2_tensor[..., tf.newaxis]

The ... selects all existing axes, so this is equivalent to

rank3_tensor = rank2_tensor[:, :, tf.newaxis]

Since we added tf.newaxis after the ..., the new axis will be added after the last existing one.

The original tensor has a shape of (2,5) and we are not adding or removing elements. Consequently, the new tensor must have a shape of (2,5,1) to maintain the number of elements.
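As a quick check of the shape bookkeeping, here is a NumPy sketch — np.newaxis behaves the same way as tf.newaxis in indexing, and the (2, 5) input shape is assumed from the card:

```python
import numpy as np

# A stand-in for rank2_tensor: shape (2, 5), ten elements.
rank2 = np.arange(10).reshape(2, 5)

# Placing np.newaxis after ... appends a new trailing axis of size 1.
rank3 = rank2[..., np.newaxis]

print(rank2.shape)  # (2, 5)
print(rank3.shape)  # (2, 5, 1)

# No elements were added or removed: 2 * 5 == 2 * 5 * 1.
assert rank2.size == rank3.size
```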

3
Q

How can we add a new dimension to the tensor below using tf.expand_dims while keeping the existing elements?

A

rank3_tensor = tf.expand_dims(rank2_tensor, axis=-1)
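tf.expand_dims mirrors np.expand_dims, so the behaviour can be sketched in NumPy (the (2, 5) input shape is assumed):

```python
import numpy as np

rank2 = np.arange(10).reshape(2, 5)  # shape (2, 5)

# axis=-1 inserts the new axis at the end -> (2, 5, 1)
last = np.expand_dims(rank2, axis=-1)

# Any valid position works; axis=0 inserts it at the front -> (1, 2, 5)
first = np.expand_dims(rank2, axis=0)

print(last.shape, first.shape)  # (2, 5, 1) (1, 2, 5)
```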

4
Q

Is there any difference in the resulting tensors?

tA = tf.constant(np.array([3., 7., 10.]))
tB = tf.constant([3., 7., 10.])

A

Yes. Creating a tensor from a Python list defaults to float32, while creating it from a NumPy array defaults to float64 (NumPy's default float dtype).

This is important because certain operations require specific dtypes to work.
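A small sketch of where the difference comes from. The NumPy half is runnable as-is; TensorFlow's side of the behaviour, and the usual tf.cast fix, are noted in the comments:

```python
import numpy as np

# NumPy's default float dtype is float64, so a tensor built from
# np.array([3., 7., 10.]) inherits float64.
arr = np.array([3., 7., 10.])
print(arr.dtype)  # float64

# TensorFlow's own default float dtype is float32, so
# tf.constant([3., 7., 10.]) from a plain Python list yields float32.
# When dtypes clash, an explicit cast resolves it, e.g.:
#   tA = tf.cast(tA, dtype=tf.float32)
```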

5
Q

When using the Sequential model, what is the difference between
* Total parameters
* Trainable parameters
* Non-trainable parameters

A

* Total parameters: all parameters in the model, i.e. trainable + non-trainable.
* Trainable parameters: the weights and biases that are updated by gradient descent during training.
* Non-trainable parameters: parameters excluded from training, e.g. the weights of frozen layers in transfer learning, or the moving statistics tracked by BatchNormalization layers.
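The totals shown in model.summary() come from simple per-layer arithmetic; for a Dense layer, params = inputs * units + units (a weight per input-unit pair plus one bias per unit). A plain-Python sketch with made-up layer sizes:

```python
def dense_params(n_inputs, n_units):
    """Weights (n_inputs * n_units) plus one bias per unit."""
    return n_inputs * n_units + n_units

# A hypothetical stack: 10 input features -> Dense(64) -> Dense(1)
layers = [(10, 64), (64, 1)]
total = sum(dense_params(i, u) for i, u in layers)
print(total)  # 704 + 65 = 769
```

If a layer is frozen, its parameters move from the trainable to the non-trainable count, but the total stays the same.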
6
Q

What is the use for the Sequential model in Tensorflow?

A

The Sequential model is used for a plain stack of layers, where each layer has exactly one input tensor and one output tensor. It is not appropriate for models with multiple inputs or outputs, shared layers, or a non-linear topology (use the functional API for those).
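A minimal sketch of such a stack (assumes TensorFlow/Keras is installed; the layer sizes are made up):

```python
import tensorflow as tf

# Sequential suits a plain stack: one input tensor in, one output tensor out,
# with layers applied strictly in order.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])

model.summary()  # prints the layer stack and parameter counts
```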
7
Q

What are the three formats to save a trained model and how to use each one of them?

A
  1. SavedModel
    model_name.export()
    This creates a directory, not a single file.
    No need to add an extension to the name.
  2. HDF5
    model_name.save()
    This creates a single file.
    Add the extension .h5 to the saved file.
  3. Keras
    model_name.save()
    This creates a single file (a zip archive).
    Add the extension .keras to the saved file.
8
Q

What is the difference between saving a model using SavedModel, HDF5 and Keras formats?

A

* SavedModel: a TensorFlow-native directory format. It stores the architecture, weights, and the traced computation graph, so the model can be served (e.g. with TensorFlow Serving) without the original Python code.
* HDF5: a legacy single-file format. It does not save the computation graph of subclassed models, and custom objects need their code available (and registered) when loading.
* Keras (.keras): the recommended single-file format in recent Keras versions (internally a zip archive). It saves the architecture, weights, and compile/optimizer state.
9
Q

How can we get the learning curve from the training of a Sequential model?

A

Use the History object returned by model.fit(): its history attribute records the loss and metric values at each epoch, which can be plotted (e.g. with matplotlib, or by passing history.history to a pandas DataFrame and calling its plot() method).

10
Q

What is the reason for assigning model.fit() to a variable?

A

model.fit() returns a History object that contains valuable information on the training of the model, e.g. the value of the loss for both the training and validation (if provided) sets over the epochs.

It's a tiny change to the code that provides great value in return: history = model.fit(...)

It also makes it easy to visualize the learning curves, either by passing history.history to a pandas DataFrame and calling its plot() method, or with matplotlib directly:

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='train_loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.legend()
plt.title("Loss over epochs")
plt.show()
11
Q

What is a History object?

A

It’s the object returned by the model.fit() method.

Its History.history attribute is a Python dictionary: a record of training loss values and metric values at successive epochs, as well as validation loss values and validation metric values (if applicable).
