How do operations that rely on a random seed work in TensorFlow? Use tf.random.shuffle() as an example.
Operations that rely on a random seed actually derive it from two seeds: the global and operation-level seeds.
If neither the global seed nor the operation seed is set, we get different results for every call to the random op.
shuffled = tf.random.shuffle(not_shuffled_tensor)
If only one of the seeds (global or operation-level) is set, we get different results for every call to the random op, but the same sequence for every re-run of the program.
tf.random.set_seed(42)
shuffled = tf.random.shuffle(not_shuffled_tensor)
or
shuffled = tf.random.shuffle(not_shuffled_tensor, seed=42)
If we set both the global and operation-level seeds, we will always get the same sequence for every call to the random op and for every re-run of the program.
tf.random.set_seed(42)
shuffled = tf.random.shuffle(not_shuffled_tensor, seed=42)
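A minimal sketch of the both-seeds case (the tensor behind not_shuffled_tensor is not shown in these notes, so the values below are placeholders). Resetting the global seed before each call reproduces the same shuffle:

```python
import tensorflow as tf

not_shuffled_tensor = tf.constant([1, 2, 3, 4, 5])

# Global + operation-level seed: reproducible across calls and re-runs.
tf.random.set_seed(42)
first = tf.random.shuffle(not_shuffled_tensor, seed=42)

tf.random.set_seed(42)
second = tf.random.shuffle(not_shuffled_tensor, seed=42)

print(bool(tf.reduce_all(first == second)))  # True
```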
How can we add a new dimension to the tensor below using tf.newaxis while keeping the existing elements? Explain how the new axis/dimensions will be added.
rank3_tensor = rank2_tensor[..., tf.newaxis]
The … is equivalent to
rank3_tensor = rank2_tensor[:, :, tf.newaxis]
Since we added tf.newaxis after the ..., the new axis will be added after the last existing one.
The original tensor has a shape of (2,5) and we are not adding or removing elements. Consequently, the new tensor must have a shape of (2,5,1) to maintain the number of elements.
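A short sketch, assuming a rank2_tensor of shape (2, 5) as described (the element values are illustrative, since the original tensor is not shown):

```python
import tensorflow as tf

rank2_tensor = tf.constant([[1, 2, 3, 4, 5],
                            [6, 7, 8, 9, 10]])  # shape (2, 5)

rank3_tensor = rank2_tensor[..., tf.newaxis]    # new axis appended at the end
print(rank3_tensor.shape)  # (2, 5, 1)

# Placing tf.newaxis before the ... would prepend the axis instead.
print(rank2_tensor[tf.newaxis, ...].shape)  # (1, 2, 5)
```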
How can we add a new dimension to the tensor below using tf.expand_dims while keeping the existing elements?
rank3_tensor = tf.expand_dims(rank2_tensor, axis=-1)
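A sketch of how the axis argument moves the inserted dimension (the tensor contents are placeholders):

```python
import tensorflow as tf

rank2_tensor = tf.zeros((2, 5))

# axis=-1 appends the axis after the last one, matching [..., tf.newaxis].
print(tf.expand_dims(rank2_tensor, axis=-1).shape)  # (2, 5, 1)

# Other values insert the new dimension at that position instead.
print(tf.expand_dims(rank2_tensor, axis=0).shape)   # (1, 2, 5)
print(tf.expand_dims(rank2_tensor, axis=1).shape)   # (2, 1, 5)
```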
Is there any difference in the resulting tensors?
tA = tf.constant(np.array([3., 7., 10.]))
tB = tf.constant([3., 7., 10.])
Creating a tensor from a Python list will default to float32, while creating it from a NumPy array will default to float64.
This is important because certain operations require specific dtypes to work.
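A quick check of the two defaults, plus tf.cast to align dtypes before mixing the tensors in an op:

```python
import numpy as np
import tensorflow as tf

tA = tf.constant(np.array([3., 7., 10.]))  # NumPy array -> float64
tB = tf.constant([3., 7., 10.])            # Python list -> float32
print(tA.dtype, tB.dtype)

# Many ops require matching dtypes; cast before combining.
result = tf.cast(tA, tf.float32) + tB
print(result.dtype)  # float32
```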
When using the Sequential model, what is the difference between
* Total parameters
* Trainable parameters
* Non-trainable parameters
What is the use of the Sequential model in TensorFlow?
What are the three formats to save a trained model and how to use each one of them?
model_name.export() saves in the SavedModel format.
model_name.save() with .h5 appended to the file name saves in the HDF5 format.
model_name.save() with .keras appended to the file name saves in the Keras format.

What is the difference between saving a model using the SavedModel, HDF5 and Keras formats?
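A sketch of the three save calls on a throwaway model (the model architecture and file names are hypothetical; model.export() assumes a recent TensorFlow/Keras version):

```python
import tensorflow as tf

# Minimal placeholder model, just to have something to save.
model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])

model.save("my_model.keras")  # native Keras format
model.save("my_model.h5")     # legacy HDF5 format
model.export("my_model")      # SavedModel directory (e.g. for TF Serving)
```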
How can we get the learning curve from the training of a Sequential model?
Use the History object returned by model.fit() and plot the loss values it records over the epochs.
What is the reason for assigning model.fit() to a variable?
model.fit() return a History object that contains valuable information on the training of the model, e.g. the value of the loss for both training and validation (if provided) sets over the epochs.
It’s a tiny change to the code that provides great value in return: history = model.fit()
It also makes it easy to visualize the learning curves by passing the object to a pandas DataFrame and calling the plot function.
import matplotlib.pyplot as plt
plt.plot(history.history['loss'], label='train_loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.legend()
plt.title("Loss over epochs")
plt.show()

What is a History object?
It’s the object returned by calling the model.fit() method. Its history attribute is a Python dictionary.
Its History.history attribute is a record of training loss values and metrics values at successive epochs, as well as validation loss values and validation metrics values (if applicable).
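A minimal sketch that produces a History object on a tiny made-up dataset (the data, layer sizes, and epoch count are all placeholders):

```python
import numpy as np
import tensorflow as tf

# Tiny made-up regression data, just to obtain a History object.
X = np.random.rand(32, 3).astype("float32")
y = np.random.rand(32, 1).astype("float32")

model = tf.keras.Sequential([tf.keras.Input(shape=(3,)),
                             tf.keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

history = model.fit(X, y, epochs=3, validation_split=0.25, verbose=0)

print(sorted(history.history))       # ['loss', 'val_loss']
print(len(history.history["loss"]))  # 3, one value per epoch
```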