memory

Functions

garbage_collection_cuda – Garbage-collect Torch (CUDA) memory.

get_model_size_mb – Calculates the size of a Module in megabytes.

is_cuda_out_of_memory – Return type: bool.

is_cudnn_snafu – Return type: bool.

is_oom_error – Return type: bool (usage sketch after this list).

is_out_of_cpu_memory – Return type: bool.

recursive_detach – Detach all tensors in in_dict.
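
The four is_* helpers above are exception predicates and have no detailed entries below. The following is a minimal sketch of the usual pattern, assuming each helper takes the caught exception instance and that is_oom_error subsumes the CUDA, cuDNN, and CPU checks:

    import torch

    from pytorch_lightning.utilities.memory import garbage_collection_cuda, is_oom_error

    model = torch.nn.Linear(512, 512).cuda()  # assumes a CUDA device is available

    try:
        # Deliberately oversized batch, purely for illustration.
        out = model(torch.randn(1_000_000, 512, device="cuda"))
    except RuntimeError as exc:
        if is_oom_error(exc):
            # Free cached CUDA memory before retrying with a smaller batch.
            garbage_collection_cuda()
        else:
            raise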

Utilities related to memory.

pytorch_lightning.utilities.memory.garbage_collection_cuda()

Garbage-collect Torch (CUDA) memory.

Return type

None
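
A minimal usage sketch, assuming a CUDA-enabled build; the tensor shape is arbitrary:

    import torch

    from pytorch_lightning.utilities.memory import garbage_collection_cuda

    # Allocate, then drop the only reference to, a large CUDA tensor.
    big = torch.empty(1024, 1024, 64, device="cuda")
    del big

    # Reclaim the memory PyTorch keeps cached so it is reported as free again.
    garbage_collection_cuda()
    print(torch.cuda.memory_reserved())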

pytorch_lightning.utilities.memory.get_model_size_mb(model)

Calculates the size of a Module in megabytes.

The computation includes everything in the state_dict(), i.e., by default the parameters and buffers.

Return type

float

Returns

Size of the input module in megabytes (parameters and buffers).
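
A short usage sketch; because the computation covers the full state_dict(), buffers such as BatchNorm running statistics are counted alongside the parameters:

    import torch

    from pytorch_lightning.utilities.memory import get_model_size_mb

    model = torch.nn.Sequential(
        torch.nn.Linear(128, 256),
        torch.nn.BatchNorm1d(256),  # adds buffers (running_mean, running_var)
        torch.nn.Linear(256, 10),
    )

    size_mb = get_model_size_mb(model)  # float, roughly 0.14 MB for this model
    print(f"model size: {size_mb:.2f} MB")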

pytorch_lightning.utilities.memory.recursive_detach(in_dict, to_cpu=False)

Detach all tensors in in_dict.

May operate recursively if some of the values in in_dict are dictionaries which contain instances of Tensor. Other types in in_dict are not affected by this utility function.

Parameters
  • in_dict (Any) – Dictionary with tensors to detach

  • to_cpu (bool) – Whether to move the detached tensors to CPU

Returns

out_dict – Dictionary with detached tensors
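
A usage sketch illustrating the recursive behaviour and the to_cpu flag; the keys and values are arbitrary:

    import torch

    from pytorch_lightning.utilities.memory import recursive_detach

    outputs = {
        "loss": torch.tensor(1.23, requires_grad=True),
        "metrics": {"logits": torch.randn(4, 10, requires_grad=True), "step": 7},
    }

    detached = recursive_detach(outputs, to_cpu=True)

    assert not detached["loss"].requires_grad
    assert not detached["metrics"]["logits"].requires_grad
    assert detached["metrics"]["step"] == 7  # non-tensor values pass through unchanged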