5.4 Making Code Reproducible

What we covered in this video lecture

This lecture covered some common sources of randomness we face when training neural networks. Randomness is not necessarily a bad thing: a particular initial random weight configuration can sometimes lead to a bad local minimum, so it is recommended to train a network multiple times with different initial random weights. However, randomness can be a nuisance when we want to share or reproduce results, which is especially true when we are working on research papers or writing unit tests. So, in this lecture, we covered some methods for reducing the sources of randomness in our neural network training code.
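To make this concrete, here is a minimal sketch (the exact calls depend on which libraries your training code uses) of seeding the random number generators that typically matter in a PyTorch project; seed_everything from PyTorch Lightning wraps these calls into a single convenience function:

import random

import numpy as np
import torch

# Seed the Python, NumPy, and PyTorch random number generators
random.seed(123)
np.random.seed(123)
torch.manual_seed(123)  # also seeds the CUDA RNGs on all available GPUs

# Alternatively, PyTorch Lightning bundles this into one call
from lightning import seed_everything  # older versions: from pytorch_lightning import seed_everything

seed_everything(123, workers=True)  # workers=True also seeds DataLoader worker processes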

(Note that deterministic settings on a GPU, for example, when we work with convolutional networks later, make neural networks train a tad slower, which is why they are typically not enabled by default.)
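As a rough sketch of what such deterministic settings look like in plain PyTorch (the exact flags you need can vary by PyTorch version and by the operations your model uses):

import torch

# Request deterministic algorithm implementations where they exist
# (PyTorch raises an error for ops without a deterministic variant)
torch.use_deterministic_algorithms(True)

# Deterministic cuDNN behavior for convolutional layers on the GPU
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False  # disables auto-tuning, so training can be slower

# Note: some CUDA ops additionally require setting the
# CUBLAS_WORKSPACE_CONFIG environment variable, per the PyTorch docs.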

Additional resources if you want to learn more

If you want to learn more about determinism when using GPUs, you may enjoy this recorded talk, Deep Learning Determinism, by Duncan Riach at the Nvidia GTC conference.

Quiz: 5.4 Making Code Reproducible (Part 1)

Suppose we ran the following code to generate three random numbers:

>>> import torch
>>> torch.manual_seed(123)
>>> torch.rand(3)
tensor([0.2961, 0.5166, 0.2517])

Now, we send the code to a colleague who wants to run it on a different computer. Should we expect to get the same results as shown above?

Correct. Since we seeded the random number generator with a specific integer, we should expect to get the exact same results every time we execute the code.

Incorrect. Since we seeded the random number generator with a specific integer, we should expect to get the exact same results every time we execute the code.

Quiz: 5.4 Making Code Reproducible (Part 2)

In the lecture, we mentioned that it could be a good idea to seed the random number generator with a specific number to make code reproducible. But does that mean that the training loader will not shuffle the data even if shuffle=True?

Correct. The DataLoader will still shuffle the data, and each epoch is still shuffled differently. Seeding just means that the sequence of shuffles is the same each time we initialize and rerun the DataLoader.

Incorrect. The DataLoader will still shuffle the data, and each epoch is still shuffled differently. Seeding just means that the sequence of shuffles is the same each time we initialize and rerun the DataLoader.
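As a small illustrative sketch (not part of the quiz), passing a seeded generator to the DataLoader shows both effects: each epoch is shuffled differently, yet rerunning the script reproduces the same sequence of shuffles:

import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.arange(6))

# Dedicated, seeded generator that drives the DataLoader's shuffling
g = torch.Generator()
g.manual_seed(123)

loader = DataLoader(dataset, batch_size=6, shuffle=True, generator=g)

for epoch in range(2):
    for (batch,) in loader:
        print(epoch, batch.tolist())  # epoch 0 and epoch 1 print different orders

# Rerunning this script prints the same two orders again.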

Quiz: 5.4 Making Code Reproducible (Part 3)

Suppose we seeded the random number generators and used the deterministic=True flag in the Trainer class. Does that mean that our code will always produce the same results no matter which hardware we run it on?

Incorrect. While it is less likely, you may still get different results depending on the hardware. Because of hardware-specific implementation details, different GPU architectures may round certain numbers slightly differently, and these tiny numeric differences can compound into larger differences over the course of training.

Correct. While it is less likely, you may still get different results depending on the hardware. Because of hardware-specific implementation details, different GPU architectures may round certain numbers slightly differently, and these tiny numeric differences can compound into larger differences over the course of training.
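For context, here is a minimal sketch of how this flag is passed to the Lightning Trainer (the model and datamodule names below are hypothetical placeholders, not code from the lecture):

import lightning as L  # older versions: import pytorch_lightning as L

# deterministic=True asks PyTorch to use deterministic algorithms where
# available, typically at the cost of some training speed
trainer = L.Trainer(max_epochs=10, deterministic=True)

# trainer.fit(model, datamodule=dm)  # `model` and `dm` are hypothetical placeholders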
