base

Functions

merge_dicts – Merge a sequence with dictionaries into one dictionary by aggregating the same keys with some given function.
rank_zero_experiment – Returns the real experiment on rank 0 and otherwise the DummyExperiment.

Classes

DummyExperiment – Dummy experiment.
DummyLogger – Dummy logger for internal use.
LightningLoggerBase – Base class for experiment loggers.
LoggerCollection – The LoggerCollection class is used to iterate all logging actions over the given logger_iterable.

Abstract base class used to build new loggers.
- class pytorch_lightning.loggers.base.DummyLogger[source]
Bases: pytorch_lightning.loggers.base.LightningLoggerBase
Dummy logger for internal use. It is useful if we want to disable the user's logger for a feature, but still ensure that user code can run.
- log_hyperparams(*args, **kwargs)[source]
Record hyperparameters.
- log_metrics(*args, **kwargs)[source]
Records metrics. This method logs metrics as soon as they are received. If you want to aggregate metrics for one specific step, use the agg_and_log_metrics() method.
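A minimal usage sketch (the hyperparameter and metric values below are made up for illustration): a DummyLogger accepts the usual logging calls and silently discards them, so code written against a real logger keeps running.

from pytorch_lightning.loggers.base import DummyLogger

# Stand-in logger: every call is a no-op, so downstream code that expects
# a logger object still works.
logger = DummyLogger()
logger.log_hyperparams({"lr": 0.01, "batch_size": 32})  # hypothetical values
logger.log_metrics({"train_loss": 0.42}, step=0)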
- class pytorch_lightning.loggers.base.LightningLoggerBase(agg_key_funcs=None, agg_default_func=<function mean>)[source]
Bases: abc.ABC
Base class for experiment loggers.
- Parameters
agg_key_funcs (Optional[Mapping[str, Callable[[Sequence[float]], float]]]) – Dictionary which maps a metric name to a function that will aggregate the metric values for the same steps.
agg_default_func (Callable[[Sequence[float]], float]) – Default function to aggregate metric values. If a metric name is not present in the agg_key_funcs dictionary, then agg_default_func will be used for aggregation.
Note
The agg_key_funcs and agg_default_func arguments are used only when one logs metrics with the agg_and_log_metrics() method.
- after_save_checkpoint(checkpoint_callback)[source]
Called after the model checkpoint callback saves a new checkpoint.
- agg_and_log_metrics(metrics, step=None)[source]
Aggregates and records metrics. This method doesn’t log the passed metrics immediately; instead, it aggregates them and only logs them once they are ready to be logged.
- finalize(status)[source]
Do any processing that is necessary to finalize an experiment.
- log_graph(model, input_array=None)[source]
Record the model graph.
- Parameters
model (LightningModule) – the Lightning model
- abstract log_hyperparams(params, *args, **kwargs)[source]
Record hyperparameters.
- abstract log_metrics(metrics, step=None)[source]
Records metrics. This method logs metrics as soon as they are received. If you want to aggregate metrics for one specific step, use the agg_and_log_metrics() method.
- update_agg_funcs(agg_key_funcs=None, agg_default_func=<function mean>)[source]
Update aggregation methods.
- Parameters
agg_key_funcs (Optional[Mapping[str, Callable[[Sequence[float]], float]]]) – Dictionary which maps a metric name to a function that will aggregate the metric values for the same steps.
agg_default_func (Callable[[Sequence[float]], float]) – Default function to aggregate metric values. If a metric name is not present in the agg_key_funcs dictionary, then agg_default_func will be used for aggregation.
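As a sketch of how the abstract interface and the aggregation hooks fit together, the following hypothetical in-memory logger overrides log_hyperparams and log_metrics and routes per-step values through agg_and_log_metrics(). The DictLogger name and the extra name/version/experiment properties are assumptions (some versions declare additional abstract members beyond the two methods documented above); this is not part of the module itself.

from pytorch_lightning.loggers.base import LightningLoggerBase
from pytorch_lightning.utilities import rank_zero_only


class DictLogger(LightningLoggerBase):
    """Hypothetical logger that keeps everything in plain Python objects."""

    def __init__(self):
        super().__init__()
        self.logged_hparams = {}
        self.logged_metrics = []

    @property
    def name(self):
        return "dict_logger"

    @property
    def version(self):
        return "0"

    @property
    def experiment(self):
        # No external experiment object; expose the logger itself.
        return self

    @rank_zero_only
    def log_hyperparams(self, params, *args, **kwargs):
        self.logged_hparams.update(dict(params))

    @rank_zero_only
    def log_metrics(self, metrics, step=None):
        self.logged_metrics.append((step, dict(metrics)))


logger = DictLogger()
# Aggregate 'loss' values reported for the same step with max instead of the
# default mean; aggregation only applies when agg_and_log_metrics() is used.
logger.update_agg_funcs(agg_key_funcs={"loss": max})
logger.agg_and_log_metrics({"loss": 0.5}, step=0)
logger.agg_and_log_metrics({"loss": 0.7}, step=0)
logger.agg_and_log_metrics({"loss": 0.4}, step=1)  # moving to step 1 flushes step 0

When Lightning drives the logger, these calls are issued by the Trainer; the manual calls above only illustrate the contract.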
- class pytorch_lightning.loggers.base.LoggerCollection(logger_iterable)[source]
Bases: pytorch_lightning.loggers.base.LightningLoggerBase
The LoggerCollection class is used to iterate all logging actions over the given logger_iterable.
- Parameters
logger_iterable (Iterable[LightningLoggerBase]) – An iterable collection of loggers.
- after_save_checkpoint(checkpoint_callback)[source]
Called after the model checkpoint callback saves a new checkpoint.
- agg_and_log_metrics(metrics, step=None)[source]
Aggregates and records metrics. This method doesn’t log the passed metrics immediately; instead, it aggregates them and only logs them once they are ready to be logged.
- finalize(status)[source]
Do any processing that is necessary to finalize an experiment.
- log_graph(model, input_array=None)[source]
Record the model graph.
- Parameters
model (LightningModule) – the Lightning model
- log_hyperparams(params)[source]
Record hyperparameters.
- log_metrics(metrics, step=None)[source]
Records metrics. This method logs metrics as soon as they are received. If you want to aggregate metrics for one specific step, use the agg_and_log_metrics() method.
- update_agg_funcs(agg_key_funcs=None, agg_default_func=<function mean>)[source]
Update aggregation methods.
- Parameters
agg_key_funcs (Optional[Mapping[str, Callable[[Sequence[float]], float]]]) – Dictionary which maps a metric name to a function that will aggregate the metric values for the same steps.
agg_default_func (Callable[[Sequence[float]], float]) – Default function to aggregate metric values. If a metric name is not present in the agg_key_funcs dictionary, then agg_default_func will be used for aggregation.
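A short sketch of fanning the same calls out to several loggers at once. It assumes TensorBoardLogger and CSVLogger are available in pytorch_lightning.loggers in the installed version, and the save_dir/name values are placeholders. In practice the Trainer typically builds a LoggerCollection itself when it is given a list of loggers.

from pytorch_lightning.loggers import CSVLogger, TensorBoardLogger
from pytorch_lightning.loggers.base import LoggerCollection

# Every call on the collection is forwarded to each wrapped logger.
loggers = LoggerCollection([
    TensorBoardLogger(save_dir="logs/", name="demo"),  # placeholder paths
    CSVLogger(save_dir="logs/", name="demo"),
])
loggers.log_hyperparams({"lr": 0.01})
loggers.log_metrics({"val_acc": 0.9}, step=10)
loggers.finalize("success")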
- pytorch_lightning.loggers.base.merge_dicts(dicts, agg_key_funcs=None, default_func=<function mean>)[source]
Merge a sequence with dictionaries into one dictionary by aggregating the same keys with some given function.
- Parameters
dicts (Sequence[Mapping]) – Sequence of dictionaries to be merged.
agg_key_funcs (Optional[Mapping[str, Callable[[Sequence[float]], float]]]) – Mapping from key name to function. This function will aggregate a list of values obtained from the same key of all dictionaries. If some key has no specified aggregation function, the default one will be used. Default is None (all keys will be aggregated by the default function).
default_func (Callable[[Sequence[float]], float]) – Default function to aggregate keys which are not present in the agg_key_funcs map.
- Returns
Dictionary with merged values.
Examples
>>> import pprint
>>> import numpy as np
>>> d1 = {'a': 1.7, 'b': 2.0, 'c': 1, 'd': {'d1': 1, 'd3': 3}}
>>> d2 = {'a': 1.1, 'b': 2.2, 'v': 1, 'd': {'d1': 2, 'd2': 3}}
>>> d3 = {'a': 1.1, 'v': 2.3, 'd': {'d3': 3, 'd4': {'d5': 1}}}
>>> dflt_func = min
>>> agg_funcs = {'a': np.mean, 'v': max, 'd': {'d1': sum}}
>>> pprint.pprint(merge_dicts([d1, d2, d3], agg_funcs, dflt_func))
{'a': 1.3,
 'b': 2.0,
 'c': 1,
 'd': {'d1': 3, 'd2': 3, 'd3': 3, 'd4': {'d5': 1}},
 'v': 2.3}