TorchElasticEnvironment

class pytorch_lightning.plugins.environments.TorchElasticEnvironment[source]

Bases: pytorch_lightning.plugins.environments.cluster_environment.ClusterEnvironment

Environment for fault-tolerant and elastic training launched with torchelastic.
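The torchelastic launcher (`torchrun` / `torch.distributed.run`) exports the rank and rendezvous information as environment variables, and a cluster environment like this one reads them back. A minimal sketch of that flow — the variable names follow the torchrun documentation, and the hand-rolled lookups stand in for the corresponding `TorchElasticEnvironment` methods, not their actual implementation:

```python
import os

# Assumption: these are the variables the torchelastic launcher exports
# for each worker it starts (per the torchrun documentation).
os.environ.update({
    "MASTER_ADDR": "10.0.0.1",  # address all processes rendezvous at
    "MASTER_PORT": "29500",     # open port on the master node
    "RANK": "3",                # global rank across all nodes and devices
    "LOCAL_RANK": "1",          # rank within the current node
    "GROUP_RANK": "1",          # rank of the node (worker group)
    "WORLD_SIZE": "8",          # total number of processes
})

# The environment's accessors amount to reading these values back:
global_rank = int(os.environ["RANK"])
local_rank = int(os.environ["LOCAL_RANK"])
node_rank = int(os.environ["GROUP_RANK"])
world_size = int(os.environ["WORLD_SIZE"])
master_address = os.environ["MASTER_ADDR"]
master_port = int(os.environ["MASTER_PORT"])
```

In a real run you would never set these variables yourself; `torchrun` does it before your script starts, which is why the accessors below take no arguments.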

global_rank()[source]

The rank (index) of the currently running process across all nodes and devices.

Return type

int

static is_using_torchelastic()[source]

Returns True if the current process was launched using the torchelastic command.

Return type

bool
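Detection of a torchelastic launch can be sketched as a presence check on the variables the launcher always exports. The variable set below is an assumption based on the torchrun documentation; the exact set Lightning inspects may differ:

```python
import os

def is_using_torchelastic() -> bool:
    # Hypothetical re-implementation: assume torchelastic launched this
    # process when all of its characteristic variables are present.
    required = ("RANK", "GROUP_RANK", "LOCAL_RANK", "LOCAL_WORLD_SIZE")
    return all(var in os.environ for var in required)

# Simulate a torchrun-launched worker by exporting the variables:
os.environ.update(
    {"RANK": "0", "GROUP_RANK": "0", "LOCAL_RANK": "0", "LOCAL_WORLD_SIZE": "1"}
)
launched_by_torchelastic = is_using_torchelastic()
```

Because the check is purely environmental, it is a `static` method: it needs no instance state to answer.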

local_rank()[source]

The rank (index) of the currently running process within the current node.

Return type

int

master_address()[source]

The master address through which all processes connect and communicate.

Return type

str

master_port()[source]

An open and configured port in the master node through which all processes communicate.

Return type

int

node_rank()[source]

The rank (index) of the node on which the current process runs.

Return type

int

world_size()[source]

The number of processes across all devices and nodes.

Return type

Optional[int]

property creates_processes_externally: bool

Whether the environment creates the worker subprocesses itself, or expects them to be launched externally (as the torchelastic launcher does).