
Unit 6.6 Improving Convergence with Batch Normalization


Quiz: Unit 6.6 Improving Convergence with Batch Normalization - Part 1

Suppose we have a fully connected layer with 200 weight parameters. If we add BatchNorm to that layer, how many learnable parameters does that add to the network?

Incorrect. Hint: think of the shift and scale parameters mentioned in this lecture.


Correct. We add 200 γ (scale) parameters and 200 β (shift) parameters to the network.
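This count is easy to verify in PyTorch. A minimal sketch, assuming the 200 weights belong to a layer with 200 output features (so BatchNorm normalizes 200 features): `BatchNorm1d` stores γ as `weight` and β as `bias`, one of each per feature.

```python
import torch

# Hypothetical setup: BatchNorm over the 200 features produced by the
# fully connected layer in the question.
bn = torch.nn.BatchNorm1d(200)

# BatchNorm1d stores gamma (scale) in `weight` and beta (shift) in `bias`,
# one value per feature, so the layer adds 200 + 200 learnable parameters.
learnable = sum(p.numel() for p in bn.parameters() if p.requires_grad)
print(learnable)  # 400
```

Note that the running mean and variance `BatchNorm1d` also tracks are buffers, not learnable parameters, so they do not count toward this total.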


Quiz: Unit 6.6 Improving Convergence with Batch Normalization - Part 2

If we want to place a BatchNorm layer after a hidden layer with 25 input features and 50 output features, we use

Incorrect. The number corresponds to the output features of the previous hidden layer.

Correct. The number corresponds to the output features of the previous hidden layer.

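A minimal PyTorch sketch of this placement, with the hypothetical dimensions from the question (25 input features, 50 output features): `BatchNorm1d` takes the number of features it normalizes, i.e. the output size of the preceding hidden layer.

```python
import torch

block = torch.nn.Sequential(
    torch.nn.Linear(25, 50),      # hidden layer: 25 inputs -> 50 outputs
    torch.nn.BatchNorm1d(50),     # 50 = output features of the previous layer
    torch.nn.ReLU(),
)

x = torch.randn(8, 25)            # a batch of 8 examples
out = block(x)
print(out.shape)  # torch.Size([8, 50])
```

Using `BatchNorm1d(25)` here would raise a shape error, since the activations reaching the BatchNorm layer have 50 features, not 25.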
