Unit 6 Overview – Essential Deep Learning Tips & Tricks
We introduced the Lightning Trainer in the previous unit to organize our PyTorch code. This will make our lives much easier and allow us to take advantage of more advanced features to squeeze more predictive performance out of our PyTorch models (stay tuned for computational performance tricks in Unit 9).
In this unit, we will cover model checkpointing to save the best model during training. We will talk about learning rate finders and schedulers, and we will dive into different optimization algorithms as well as activation functions.
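As a quick preview of the checkpointing idea, the core logic of "keep the model from the epoch with the best validation loss" can be sketched in plain Python. (This is a hypothetical illustration, not Lightning's implementation; in Lightning, the `ModelCheckpoint` callback handles this for you, and a real loop would save model weights instead of an epoch index.)

```python
import math

def train_with_checkpointing(val_losses):
    """Keep the "state" from the epoch with the lowest validation loss.

    `val_losses` stands in for the per-epoch validation losses a real
    training loop would compute. Here the saved "state" is just the
    epoch index; a real loop would save the model weights to disk.
    """
    best_loss = math.inf
    best_state = None
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:      # validation loss improved
            best_loss = loss
            best_state = epoch    # a real loop would checkpoint weights here
    return best_state, best_loss

# Epoch 2 has the lowest validation loss, so its "state" is kept.
state, loss = train_with_checkpointing([0.9, 0.7, 0.5, 0.6])
```

Note that the best model is not necessarily the one from the last epoch, which is exactly why checkpointing during training matters.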
An essential issue in deep learning is overfitting, and we will cover several techniques that help address it. Lastly, we will share practical tips for running hyperparameter sweeps and debugging deep neural networks.
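To give a flavor of what a hyperparameter sweep involves, a minimal grid search can be sketched with the standard library alone. (The search space below is made up for illustration; a real sweep would train and evaluate a model for each configuration, and dedicated tools can parallelize this.)

```python
from itertools import product

# Hypothetical search space; values are illustrative only.
grid = {
    "lr": [0.1, 0.01],
    "batch_size": [32, 64],
}

def sweep(grid):
    """Yield every hyperparameter combination in the grid."""
    keys = list(grid)
    for values in product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

# 2 learning rates x 2 batch sizes -> 4 configurations to try.
configs = list(sweep(grid))
```

Grid search grows multiplicatively with each added hyperparameter, which is why the unit also discusses practical tips for keeping sweeps manageable.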