From chapter 13, your primary objective is to get a better understanding of hyperparameter tuning. Model ensembling, mixed-precision training, and training Keras models on multiple GPUs or on a TPU are optional but very good to know.
Make sure to apply the hyperparameter tuning process in any or all of your CNN projects. As usual, take notes in the DL-notes.docx file in your Block C Microsoft Teams assignment. Clearly capture (screenshot) the various experiments you run and their corresponding results with different hyperparameter settings.
The process of optimizing hyperparameters typically looks like this:
1. Choose a set of hyperparameters (automatically).
2. Build the corresponding model.
3. Fit it to your training data, and measure performance on the validation data.
4. Choose the next set of hyperparameters to try (automatically).
5. Repeat.
6. Eventually, measure performance on your test data.
The key to this process is the algorithm that analyzes the relationship between validation performance and various hyperparameter values to choose the next set of hyperparameters to evaluate.
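To make the process concrete, here is a minimal sketch of that loop done by hand with plain random search in Keras. The variables x_train, y_train, x_val, and y_val are placeholders for your own prepared data; a tuning library such as KerasTuner automates steps 1 and 4 with smarter search algorithms.

```python
# A hand-rolled version of the tuning loop above, using random search.
# Assumes (x_train, y_train) and (x_val, y_val) are already prepared.
import random
import keras
from keras import layers

search_space = {
    "units": [32, 64, 128, 256],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

best_val_acc, best_config = 0.0, None
for trial in range(10):
    # Step 1: choose a set of hyperparameters (here: at random).
    config = {name: random.choice(values) for name, values in search_space.items()}
    # Step 2: build the corresponding model.
    model = keras.Sequential([
        layers.Dense(config["units"], activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(config["learning_rate"]),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    # Step 3: fit on the training data, measure validation performance.
    history = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                        epochs=5, batch_size=128, verbose=0)
    val_acc = max(history.history["val_accuracy"])
    if val_acc > best_val_acc:
        best_val_acc, best_config = val_acc, config
    # Steps 4-5: choose the next set and repeat (next loop iteration).

print("Best config:", best_config, "val_accuracy:", best_val_acc)
# Step 6: retrain with best_config and evaluate once on the test set.
```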
In addition to reading the book, refer to the book's code snippets: fchollet: chapter13_best-practices-for-the-real-world.ipynb
KerasTuner is a general-purpose hyperparameter tuning library with strong integration into Keras workflows. Follow this tutorial, in which you will see how to tune the model architecture, the training process, and data preprocessing steps with KerasTuner.
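A minimal sketch of the KerasTuner workflow the tutorial covers is shown below (install with pip install keras-tuner). The hp object declares the search space inside a build function, and the tuner picks concrete values per trial; x_train and y_train are placeholders for your own data, and the directory/project names are just examples.

```python
import keras
import keras_tuner
from keras import layers

def build_model(hp):
    # hp.* calls declare the search space; the tuner samples concrete values.
    model = keras.Sequential([
        layers.Flatten(),
        layers.Dense(
            units=hp.Int("units", min_value=32, max_value=256, step=32),
            activation="relu"),
        layers.Dropout(hp.Float("dropout", 0.0, 0.5, step=0.1)),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=hp.Choice("optimizer", ["adam", "rmsprop"]),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"])
    return model

tuner = keras_tuner.RandomSearch(
    build_model,
    objective="val_accuracy",
    max_trials=10,
    directory="kt_logs",        # where trial results are stored
    project_name="dense_tuning")

# x_train / y_train are placeholders for your own prepared data.
tuner.search(x_train, y_train, validation_split=0.2, epochs=5)
best_hp = tuner.get_best_hyperparameters(1)[0]
print(best_hp.values)
```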
Another example uses KerasTuner for multi-class classification on the Fashion MNIST dataset: follow this tutorial and try the various configurations for your Fashion MNIST image classification project, or apply the concepts to your creative brief use case.
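In the spirit of that tutorial, here is a sketch of a tunable CNN for Fashion MNIST; exactly which hyperparameters you expose (filters, kernel sizes, dropout, learning rate, and so on) is your own design choice, and the names below are just illustrative.

```python
import keras
import keras_tuner
from keras import layers

# Load and scale Fashion MNIST (28x28 grayscale images, 10 classes).
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

def build_cnn(hp):
    model = keras.Sequential([
        layers.Input(shape=(28, 28, 1)),
        layers.Conv2D(
            filters=hp.Choice("filters_1", [32, 64]),
            kernel_size=hp.Choice("kernel_1", [3, 5]),
            activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(hp.Choice("filters_2", [64, 128]), 3, activation="relu"),
        layers.Flatten(),
        layers.Dropout(hp.Float("dropout", 0.0, 0.5, step=0.25)),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(
            hp.Choice("learning_rate", [1e-2, 1e-3, 1e-4])),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"])
    return model

# Hyperband trades off many short trials against a few long ones.
tuner = keras_tuner.Hyperband(
    build_cnn, objective="val_accuracy", max_epochs=10,
    directory="kt_logs", project_name="fashion_mnist_cnn")
tuner.search(x_train, y_train, validation_split=0.2)
```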
Optional: view the different hyperparameter combinations and their corresponding performance metrics using TensorBoard - clicky
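A short sketch of how this can be wired up, assuming a tuner and data prepared as in the sketches above: KerasTuner forwards callbacks passed to search into model.fit for every trial, so a TensorBoard callback gives each trial its own log subdirectory.

```python
import keras

# Log every trial under tb_logs/, then run: tensorboard --logdir tb_logs
tuner.search(
    x_train, y_train,
    validation_split=0.2,
    epochs=5,
    callbacks=[keras.callbacks.TensorBoard(log_dir="tb_logs")])
```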
Optional: a couple of other interesting articles on using KerasTuner for CNN model hyperparameters: article1 and article2. (Note: focus only on the CNN hyperparameter configurations, and ignore the more complex setup configuration, model saving, etc.)
For the rest of the day, feel free to catch up on your pending tasks from other days, apply KerasTuner to your creative brief use case, or dive into other self-guided topics of your interest.
Deep Learning with Python, Second Edition