3.4 Automatic Differentiation in PyTorch
What we covered in this video lecture
As we saw in the previous lecture, we can compute gradients by hand for small models such as logistic regression. However, as our models become larger, doing so becomes tedious or even infeasible.
Luckily, PyTorch supports automatic differentiation (also known as autograd), which computes derivatives and gradients for us. In this lecture, we covered the basic capabilities and usage of PyTorch's autograd submodule, which we will use in the upcoming videos when implementing the training loop.
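As a minimal sketch of what autograd does (the tensor values here are arbitrary, chosen only for illustration): we build a tiny logistic-regression-style computation, call `.backward()` on the loss, and PyTorch fills in the `.grad` attributes of the parameters automatically.

```python
import torch
import torch.nn.functional as F

# Parameters we want gradients for (requires_grad=True tracks operations)
w = torch.tensor(2.0, requires_grad=True)
b = torch.tensor(0.5, requires_grad=True)

# A single training example (illustrative values)
x = torch.tensor(1.5)
y = torch.tensor(1.0)

# Forward pass: logistic-regression-style prediction and loss
z = w * x + b
a = torch.sigmoid(z)
loss = F.binary_cross_entropy(a, y)

# Backward pass: autograd computes d(loss)/dw and d(loss)/db
loss.backward()

print(w.grad)  # gradient of the loss with respect to w
print(b.grad)  # gradient of the loss with respect to b
```

For this sigmoid-plus-binary-cross-entropy combination, the gradients simplify to `(a - y) * x` for `w` and `(a - y)` for `b`, so you can check autograd's result against the hand-derived formula from the previous lecture.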
Additional resources if you want to learn more
If you are curious to learn more about PyTorch’s autograd feature, check out the official PyTorch Autograd documentation.