Unit 7.7 Using Unlabeled Data with Self-Supervised Learning
- Chen, Kornblith, Norouzi, and Hinton (2020). A Simple Framework for Contrastive Learning of Visual Representations
- Parts 4 and 5, 7.7-self-supervised
What we covered in this video lecture
In this series of videos, we discussed self-supervised learning, which lets us leverage unlabeled data for pretraining. We also discussed the two broad subcategories of self-supervised learning: self-prediction and contrastive learning. Then, to see how a contrastive learning method works in practice, we took a closer look at SimCLR.
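To make the contrastive learning idea concrete, here is a minimal sketch of the NT-Xent (normalized temperature-scaled cross-entropy) loss at the heart of SimCLR, written in PyTorch. The function name `nt_xent_loss` and the temperature value are illustrative choices, not code from the lecture; the key idea is that each sample's two augmented views form a positive pair, and all other samples in the batch serve as negatives.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    """Sketch of SimCLR's NT-Xent loss.

    z1, z2: embeddings of two augmented views of the same batch, shape (N, D).
    """
    n = z1.shape[0]
    # Stack both views and project onto the unit sphere -> (2N, D)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)
    # Pairwise cosine similarities, scaled by the temperature
    sim = z @ z.T / temperature
    # A sample must not match itself, so mask the diagonal
    sim.fill_diagonal_(float("-inf"))
    # The positive for row i is row i+N (and vice versa)
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    # Cross-entropy pulls positive pairs together, pushes negatives apart
    return F.cross_entropy(sim, targets)

# Usage example: two views of a batch of 4 samples, 8-dim embeddings
z1, z2 = torch.randn(4, 8), torch.randn(4, 8)
loss = nt_xent_loss(z1, z2)
```

In a full SimCLR pipeline, `z1` and `z2` would come from passing two random augmentations of each image through the same encoder and projection head; this sketch only covers the loss itself.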
By the way, the overall concept behind self-supervised learning is also responsible for the success of ChatGPT, but more on large language models in Unit 8!
Additional resources if you want to learn more
SimCLR is one of the most successful and popular methods for contrastive learning. However, there are many, many other self-supervised learning techniques out there. For an overview, I recommend A survey on contrastive self-supervised learning and Advances in Understanding, Improving, and Applying Contrastive Learning. And for an example of a non-contrastive self-supervised learning technique, I recommend Masked Autoencoders Are Scalable Vision Learners.