TorchElasticEnvironment¶
- class pytorch_lightning.plugins.environments.TorchElasticEnvironment[source]¶
Bases: pytorch_lightning.plugins.environments.cluster_environment.ClusterEnvironment
Environment for fault-tolerant and elastic training with torchelastic. A usage sketch is shown at the end of this section.
- global_rank()[source]¶
The rank (index) of the currently running process across all nodes and devices.
- Return type: int
- static is_using_torchelastic()[source]¶
Returns True if the current process was launched using the torchelastic command.
- Return type: bool
- local_rank()[source]¶
The rank (index) of the currently running process inside of the current node.
- Return type: int
- master_address()[source]¶
The master address through which all processes connect and communicate.
- Return type: str
- master_port()[source]¶
An open and configured port in the master node through which all processes communicate.
- Return type: int
- property creates_processes_externally: bool¶
Whether the worker processes are created externally (by the torchelastic launcher) rather than by Lightning itself.
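The following is a minimal sketch of how this environment can be queried and handed to a Trainer. It assumes the script is launched with torchelastic (torchrun / torch.distributed.run) and that the installed Lightning version accepts cluster environments through the Trainer's plugins argument; in practice Lightning detects a torchelastic launch automatically, so passing the plugin explicitly is usually unnecessary.

```python
import pytorch_lightning as pl
from pytorch_lightning.plugins.environments import TorchElasticEnvironment

env = TorchElasticEnvironment()

if TorchElasticEnvironment.is_using_torchelastic():
    # These accessors read the environment variables that the torchelastic
    # launcher sets for each worker process.
    print("master address:", env.master_address())
    print("master port:", env.master_port())
    print("global rank:", env.global_rank())
    print("local rank:", env.local_rank())

# Passing the environment explicitly (hypothetical configuration values):
trainer = pl.Trainer(
    accelerator="gpu",
    devices=2,
    num_nodes=2,
    strategy="ddp",
    plugins=[TorchElasticEnvironment()],
)
```

The launch itself would be done with something like `torchrun --nnodes=2 --nproc_per_node=2 train.py`, which sets the rank, world size, and master address/port variables these accessors read.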