# Model hyperparameter tuning with SageMaker & TensorFlow
This project covers model hyperparameter tuning (HPT) across several deep learning problems/datasets using TensorFlow and Amazon SageMaker. Hyperparameter tuning finds the best version of a model by running many training jobs on your dataset, using the algorithm and the ranges of hyperparameters that you specify. It then chooses the hyperparameter values that produce the best-performing model, as measured by a metric that you choose.
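The search loop that SageMaker automates (via its `HyperparameterTuner`, which launches real training jobs in parallel) can be illustrated with a minimal, self-contained random-search sketch. The ranges, the `train_and_evaluate` stand-in, and the specific hyperparameter names below are illustrative assumptions, not part of this project's code:

```python
# Conceptual sketch of hyperparameter tuning: sample from the ranges you
# specify, run one (toy) training job per sample, and keep the combination
# that scores best on the chosen objective metric.
import random

random.seed(0)

# Search ranges, analogous to SageMaker's ContinuousParameter/IntegerParameter.
ranges = {
    "learning_rate": (1e-4, 1e-1),  # continuous range
    "batch_size": (32, 256),        # integer range
}

def train_and_evaluate(learning_rate, batch_size):
    """Hypothetical stand-in for a training job; returns a toy metric.

    A real tuner would train a TensorFlow model here and report, e.g.,
    validation accuracy parsed from the job's logs.
    """
    # Pretend the optimum is lr=0.01, batch_size=128.
    return 1.0 - abs(learning_rate - 0.01) - abs(batch_size - 128) / 1000

def random_search(num_jobs=10):
    """Run num_jobs toy training jobs and return the best configuration."""
    best = None
    for _ in range(num_jobs):
        lr = random.uniform(*ranges["learning_rate"])
        bs = random.randint(*ranges["batch_size"])
        score = train_and_evaluate(lr, bs)
        if best is None or score > best["score"]:
            best = {"learning_rate": lr, "batch_size": bs, "score": score}
    return best

best = random_search()
print(best)
```

SageMaker's managed tuning adds value on top of this loop by parallelizing jobs, supporting Bayesian search in addition to random search, and extracting the objective metric from training logs via a regex you define.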
A `conda_tensorflow2_p36` kernel was used with the Amazon SageMaker notebook instance.