Regular User

| If | Then | Ref |
| --- | --- | --- |
| used PyTorch 1.11 | upgrade to PyTorch 2.1 or higher | PR18691 |
| called `self.trainer.model.parameters()` in `LightningModule.configure_optimizers()` when using FSDP | on PyTorch 2.0+, call `self.parameters()` instead | PR17309 |
| used `Trainer(accelerator="tpu", devices=[i])` to select the 1-based TPU core index | the index is now 0-based | PR17227 |
| used `torch_xla` < 1.13 | upgrade to `torch_xla` >= 1.13 | PR17368 |
| used `trainer.num_val_batches` to get the total size of all validation dataloaders | use `sum(trainer.num_val_batches)` | PR18441 |
| used `trainer.num_test_batches` to get the total size of all test dataloaders | use `sum(trainer.num_test_batches)` | PR18441 |
| used `trainer.num_sanity_val_batches` to get the total size of all validation dataloaders for sanity checking | use `sum(trainer.num_sanity_val_batches)` | PR18441 |
| used `Trainer(devices="auto")` to auto-select all available GPUs in a Jupyter notebook | use `Trainer(devices=-1)` | PR18291 |
| ran `pip install lightning` to install the `lightning.app` dependencies | use `pip install lightning[app]` if you need `lightning.app` | PR18386 |
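The FSDP row above amounts to referencing the module's own parameters in `configure_optimizers()`. A minimal sketch of the pattern, using a plain `nn.Module` stand-in (`BoringModel` is hypothetical, not the real `LightningModule`) so the snippet runs without Lightning installed:

```python
import torch
import torch.nn as nn


class BoringModel(nn.Module):  # stand-in for lightning.pytorch.LightningModule
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(4, 2)

    def configure_optimizers(self):
        # Before 2.1 with FSDP: torch.optim.AdamW(self.trainer.model.parameters(), ...)
        # On PyTorch 2.0+, reference the module's own parameters directly:
        return torch.optim.AdamW(self.parameters(), lr=1e-3)


optimizer = BoringModel().configure_optimizers()
print(type(optimizer).__name__)  # AdamW
```

With a real `LightningModule` under FSDP the call site is the same; only the base class differs.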
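The `num_*_batches` rows all follow the same shape: these attributes are now per-dataloader lists, so the old single total becomes a `sum(...)`. A quick illustration with hypothetical batch counts standing in for what `trainer.num_val_batches` would report:

```python
# One entry per validation dataloader (hypothetical counts).
num_val_batches = [120, 80, 40]

# 2.1+: the total across all validation dataloaders is the sum of the list.
total = sum(num_val_batches)
print(total)  # 240
```

The same `sum(...)` applies unchanged to `trainer.num_test_batches` and `trainer.num_sanity_val_batches`.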

Advanced User

| If | Then | Ref |
| --- | --- | --- |
| used the `torchdistx` package and its integration in `Trainer` | materialize the model weights manually, or follow our guide for initializing large models | PR17995 |
| defined `def training_step(self, dataloader_iter, batch_idx)` in `LightningModule` | remove `batch_idx` from the signature and expect `dataloader_iter` to return a triplet `(batch, batch_idx, dataloader_idx)` | PR18390 |
| defined `def validation_step(self, dataloader_iter, batch_idx)` in `LightningModule` | remove `batch_idx` from the signature and expect `dataloader_iter` to return a triplet `(batch, batch_idx, dataloader_idx)` | PR18390 |
| defined `def test_step(self, dataloader_iter, batch_idx)` in `LightningModule` | remove `batch_idx` from the signature and expect `dataloader_iter` to return a triplet `(batch, batch_idx, dataloader_idx)` | PR18390 |
| defined `def predict_step(self, dataloader_iter, batch_idx)` in `LightningModule` | remove `batch_idx` from the signature and expect `dataloader_iter` to return a triplet `(batch, batch_idx, dataloader_idx)` | PR18390 |
| used `batch = next(dataloader_iter)` in `LightningModule` `*_step` hooks | use `batch, batch_idx, dataloader_idx = next(dataloader_iter)` | PR18390 |
| relied on automatic detection of the Kubeflow environment | use `Trainer(plugins=KubeflowEnvironment())` to select it explicitly on a Kubeflow cluster | PR18137 |
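The `dataloader_iter` rows above can be sketched with a stand-in iterator (`fake_dataloader_iter` and the batches are hypothetical; a real run would receive the iterator from the `Trainer`):

```python
# Sketch of the 2.1 dataloader_iter contract: next() yields a
# (batch, batch_idx, dataloader_idx) triplet instead of just the batch.
def fake_dataloader_iter():
    for batch_idx, batch in enumerate([[1, 2], [3, 4]]):
        yield batch, batch_idx, 0  # single dataloader -> dataloader_idx == 0


def training_step(dataloader_iter):  # note: no batch_idx parameter anymore
    # 2.0: batch = next(dataloader_iter)
    # 2.1+: unpack the triplet
    batch, batch_idx, dataloader_idx = next(dataloader_iter)
    return batch, batch_idx, dataloader_idx


it = fake_dataloader_iter()
print(training_step(it))  # ([1, 2], 0, 0)
```

The same unpacking applies in `validation_step`, `test_step`, and `predict_step`.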

Developer

| If | Then | Ref |
| --- | --- | --- |
| used the `XLAStrategy.is_distributed` property | it was removed, because it was always `True` | PR17381 |
| used the `SingleTPUStrategy.is_distributed` property | it was removed, because it was always `False` | PR17381 |