
mlflow

Functions

resolve_tags

Classes

MLFlowLogger

Log using MLflow.

MLflow Logger

class pytorch_lightning.loggers.mlflow.MLFlowLogger(experiment_name='lightning_logs', run_name=None, tracking_uri=None, tags=None, save_dir='./mlruns', prefix='', artifact_location=None, run_id=None)[source]

Bases: pytorch_lightning.loggers.logger.Logger

Log using MLflow.

Install it with pip:

pip install mlflow

Example:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(experiment_name="lightning_logs", tracking_uri="file:./ml-runs")
trainer = Trainer(logger=mlf_logger)

Use the logger anywhere in your LightningModule as follows:

from pytorch_lightning import LightningModule


class LitModel(LightningModule):
    def training_step(self, batch, batch_idx):
        # example
        self.logger.experiment.whatever_ml_flow_supports(...)

    def any_lightning_module_function_or_hook(self):
        self.logger.experiment.whatever_ml_flow_supports(...)

Parameters:
  • experiment_name (str) – The name of the experiment.

  • run_name (Optional[str]) – Name of the new run. The run_name is internally stored as a mlflow.runName tag. If the mlflow.runName tag has already been set in tags, the value is overridden by the run_name.

  • tracking_uri (Optional[str]) – Address of local or remote tracking server. If not provided, defaults to MLFLOW_TRACKING_URI environment variable if set, otherwise it falls back to file:<save_dir>.

  • tags (Optional[Dict[str, Any]]) – A dictionary of tags for the experiment.

  • save_dir (Optional[str]) – A path to a local directory where the MLflow runs get saved. Defaults to ./mlruns if tracking_uri is not provided. Has no effect if tracking_uri is provided.

  • prefix (str) – A string to put at the beginning of metric keys.

  • artifact_location (Optional[str]) – The location to store run artifacts. If not provided, the server picks an appropriate default.

  • run_id (Optional[str]) – The run identifier of the experiment. If not provided, a new run is started.

Raises:

ModuleNotFoundError – If the required MLflow package is not installed.
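
For reference, here is a short sketch combining several of the constructor arguments above; the run name, tag values, and prefix are illustrative:

from pytorch_lightning import Trainer
from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(
    experiment_name="lightning_logs",
    run_name="baseline",        # stored as the mlflow.runName tag
    tags={"team": "research"},  # illustrative tag
    save_dir="./mlruns",
    prefix="train",             # prepended to every metric key
)
trainer = Trainer(logger=mlf_logger)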

finalize(status='success')[source]

Do any processing that is necessary to finalize an experiment.

Parameters:

status (str) – Status that the experiment finished with (e.g. success, failed, aborted)

Return type:

None
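
The Trainer calls this hook itself at the end of a run; when driving the logger manually, the run can be closed explicitly. A minimal sketch (metric name illustrative):

from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(experiment_name="lightning_logs")
mlf_logger.log_metrics({"final_loss": 0.18})
mlf_logger.finalize(status="success")  # marks the underlying MLflow run as finished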

log_hyperparams(params)[source]

Record hyperparameters.

Parameters:
  • params (Union[Dict[str, Any], Namespace]) – Namespace or Dict containing the hyperparameters

  • args – Optional positional arguments, depends on the specific logger being used

  • kwargs – Optional keyword arguments, depends on the specific logger being used

Return type:

None
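
A minimal sketch of logging hyperparameters directly on the logger; the parameter names are illustrative, and a plain dict or an argparse.Namespace both work:

from argparse import Namespace

from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(experiment_name="lightning_logs")
mlf_logger.log_hyperparams({"learning_rate": 1e-3, "batch_size": 32})
mlf_logger.log_hyperparams(Namespace(optimizer="adam"))  # Namespace is converted to a dict internally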

log_metrics(metrics, step=None)[source]

Records metrics. This method logs metrics as soon as it receives them.

Parameters:
  • metrics (Mapping[str, float]) – Dictionary with metric names as keys and measured quantities as values

  • step (Optional[int]) – Step number at which the metrics should be recorded

Return type:

None
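
Inside a LightningModule you would normally call self.log, which is eventually forwarded to this method; calling it directly looks like this (metric names and step are illustrative):

from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(experiment_name="lightning_logs")
mlf_logger.log_metrics({"train_loss": 0.42, "train_acc": 0.91}, step=100)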

property experiment: MlflowClient

Actual MLflow object. To use MLflow features in your LightningModule, do the following.

Example:

self.logger.experiment.some_mlflow_function()
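
Assuming the underlying object is an MlflowClient, its methods typically take the run id explicitly; for example (the artifact path is hypothetical):

# Hypothetical artifact file; MlflowClient calls take the run id explicitly.
self.logger.experiment.log_artifact(self.logger.run_id, "confusion_matrix.png")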

property experiment_id: Optional[str]

Create the experiment if it does not exist and return the experiment id.

Returns:

The experiment id.

property name: Optional[str]

Get the experiment id.

Returns:

The experiment id.

property run_id: Optional[str]

Create the experiment if it does not exist and return the run id.

Returns:

The run id.
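
Because the run is created lazily on first access, the run id can be captured and passed back through the run_id constructor argument to continue logging into the same MLflow run; a sketch:

from pytorch_lightning.loggers import MLFlowLogger

mlf_logger = MLFlowLogger(experiment_name="lightning_logs")
existing_run = mlf_logger.run_id                    # first access creates the experiment and run
resumed_logger = MLFlowLogger(run_id=existing_run)  # attaches to the existing run instead of starting a new one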

property save_dir: Optional[str]

The root file directory in which MLflow experiments are saved.

Returns:

Local path to the root experiment directory if the tracking uri is local. Otherwise returns None.

property version: Optional[str]

Get the run id.

Returns:

The run id.

pytorch_lightning.loggers.mlflow.resolve_tags(tags=None)[source]

Parameters:

tags (Optional[Dict]) – A dictionary of tags to override. If specified, tags passed in this argument will override those inferred from the context.

Return type:

Optional[Dict]

Returns: A dictionary of resolved tags.

Note

See mlflow.tracking.context.registry for more details.
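
A sketch of typical usage, where explicitly passed tags take precedence over tags inferred from the MLflow context (the tag key is illustrative):

from pytorch_lightning.loggers.mlflow import resolve_tags

tags = resolve_tags({"stage": "debug"})  # merged with context-inferred tags such as mlflow.user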