2.5.1.post0

Level 20: Train models with billions of parameters

Scale to billions of parameters with multiple distributed strategies.


Scale with distributed strategies

Learn how different distributed strategies let you scale models to larger parameter counts.

intermediate

Train models with billions of parameters

Scale to billions of parameters on GPUs with FSDP, tensor parallelism (TP), or DeepSpeed.

advanced
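As a rough orientation before diving into the guides above, here is a minimal sketch of how the strategies named on this page map onto the Trainer's `strategy` argument. The strategy strings (`"ddp"`, `"fsdp"`, `"deepspeed_stage_3"`) are real Lightning Trainer options; the helper `pick_strategy` and its parameter-count thresholds are hypothetical illustrations, not guidance from the docs.

```python
def pick_strategy(num_params: int) -> str:
    """Pick a distributed strategy string for Trainer(strategy=...).

    The thresholds below are illustrative assumptions only; the right
    choice depends on GPU memory, interconnect, and model architecture.
    """
    if num_params < 500_000_000:
        # Plain data parallelism: full model replica fits on each GPU.
        return "ddp"
    if num_params < 5_000_000_000:
        # FSDP shards parameters, gradients, and optimizer state.
        return "fsdp"
    # DeepSpeed ZeRO stage 3: full sharding, with optional CPU offload.
    return "deepspeed_stage_3"


# Usage sketch (requires `pip install lightning` and GPUs; MyModel is a
# hypothetical LightningModule, not defined on this page):
#
# import lightning.pytorch as pl
# trainer = pl.Trainer(
#     accelerator="gpu",
#     devices=8,
#     strategy=pick_strategy(3_000_000_000),
# )
# trainer.fit(MyModel())
```

The intermediate guide covers when each strategy applies; the advanced guide covers tuning them for billion-parameter models.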