
# Normalized Mutual Information Score

## Module Interface

class torchmetrics.clustering.NormalizedMutualInfoScore(average_method='arithmetic', **kwargs)[source]
$NMI(U,V) = \frac{MI(U,V)}{M_p(U,V)}$

Where $U$ is a tensor of target values, $V$ is a tensor of predictions, $MI(U,V)$ is the mutual information score between clusterings $U$ and $V$, and $M_p(U,V)$ is the generalized mean of order $p$ of the entropies $H(U)$ and $H(V)$, used as the normalizer. The metric is symmetric: swapping $U$ and $V$ yields the same score.
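The definition above can be made concrete with a small, dependency-free sketch (the function names `mutual_info`, `entropy`, and `nmi` are illustrative, not part of torchmetrics). Using the arithmetic mean for normalization, it reproduces the module example below:

```python
from collections import Counter
from math import log

def mutual_info(u, v):
    """MI(U, V) in nats, from joint and marginal label frequencies."""
    n = len(u)
    joint, pu, pv = Counter(zip(u, v)), Counter(u), Counter(v)
    return sum(
        (c / n) * log(c * n / (pu[a] * pv[b]))
        for (a, b), c in joint.items()
    )

def entropy(labels):
    """Shannon entropy H of a clustering, in nats."""
    n = len(labels)
    return -sum((c / n) * log(c / n) for c in Counter(labels).values())

def nmi(preds, target):
    """NMI with the arithmetic mean: MI / ((H(U) + H(V)) / 2)."""
    return mutual_info(preds, target) / ((entropy(preds) + entropy(target)) / 2)

print(round(nmi([2, 1, 0, 1, 0], [0, 2, 1, 1, 0]), 4))  # 0.4744
```

Note that identical clusterings (up to a relabeling of cluster IDs) give $MI = H$, so the score is 1.0, and independent label assignments drive the score toward 0.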

This clustering metric is an extrinsic measure: it requires ground-truth clustering labels, which may not be available in practice, since clustering is generally used for unsupervised learning.

As input to forward and update the metric accepts the following input:

• preds (Tensor): single integer tensor with shape (N,) with predicted cluster labels

• target (Tensor): single integer tensor with shape (N,) with ground truth cluster labels

As output of forward and compute the metric returns the following output:

• nmi_score (Tensor): A tensor with the Normalized Mutual Information Score

Parameters:

• average_method (Literal['min', 'geometric', 'arithmetic', 'max']) – method used to compute the generalized mean of the two cluster entropies for normalization. Default: 'arithmetic'.

• kwargs (Any) – Additional keyword arguments, see Advanced metric settings for more info.

Example::
>>> import torch
>>> from torchmetrics.clustering import NormalizedMutualInfoScore
>>> preds = torch.tensor([2, 1, 0, 1, 0])
>>> target = torch.tensor([0, 2, 1, 1, 0])
>>> nmi_score = NormalizedMutualInfoScore("arithmetic")
>>> nmi_score(preds, target)
tensor(0.4744)
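The `average_method` argument selects the generalized mean $M_p$ used in the denominator. As a rough sketch (assuming the same four options as scikit-learn's `normalized_mutual_info_score`, which this metric mirrors), the normalizer is computed from the two cluster entropies like this:

```python
from math import sqrt

def generalized_mean(h_u, h_v, method="arithmetic"):
    """Normalizer M_p(U, V) over the entropies H(U) and H(V).

    'min' and 'max' are the limits p -> -inf and p -> +inf;
    'geometric' is p -> 0 and 'arithmetic' is p = 1.
    """
    if method == "min":
        return min(h_u, h_v)
    if method == "geometric":
        return sqrt(h_u * h_v)
    if method == "arithmetic":
        return (h_u + h_v) / 2
    if method == "max":
        return max(h_u, h_v)
    raise ValueError(f"unknown average_method: {method!r}")
```

Since min ≤ geometric ≤ arithmetic ≤ max for non-negative entropies, the resulting NMI is ordered the other way around: 'min' yields the largest score and 'max' the smallest for the same pair of clusterings.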

plot(val=None, ax=None)[source]

Plot a single or multiple values from the metric.

Parameters:

• val – Either a single result from calling metric.forward or metric.compute, or a list of these results. If no value is provided, will automatically call metric.compute and plot that result.

• ax – A matplotlib axis object. If provided, the plot will be added to that axis.

Return type:

tuple of Figure and Axes

Returns:

Figure and Axes object

Raises:

ModuleNotFoundError – If matplotlib is not installed

>>> # Example plotting a single value
>>> import torch
>>> from torchmetrics.clustering import NormalizedMutualInfoScore
>>> metric = NormalizedMutualInfoScore()
>>> metric.update(torch.randint(0, 4, (10,)), torch.randint(0, 4, (10,)))
>>> fig_, ax_ = metric.plot(metric.compute())

>>> # Example plotting multiple values
>>> import torch
>>> from torchmetrics.clustering import NormalizedMutualInfoScore
>>> metric = NormalizedMutualInfoScore()
>>> values = []
>>> for _ in range(10):
...     values.append(metric(torch.randint(0, 4, (10,)), torch.randint(0, 4, (10,))))
>>> fig_, ax_ = metric.plot(values)

higher_is_better: Optional[bool] = None[source]

## Functional Interface

torchmetrics.functional.clustering.normalized_mutual_info_score(preds, target, average_method='arithmetic')[source]

Compute normalized mutual information between two clusterings.

Parameters:

• preds (Tensor) – predicted cluster labels

• target (Tensor) – ground truth cluster labels

• average_method (Literal['min', 'geometric', 'arithmetic', 'max']) – method used to compute the generalized mean of the two cluster entropies for normalization. Default: 'arithmetic'.

Return type:

Tensor

Returns:

Scalar tensor with normalized mutual info score between 0.0 and 1.0

Example

>>> import torch
>>> from torchmetrics.functional.clustering import normalized_mutual_info_score
>>> target = torch.tensor([0, 3, 2, 2, 1])
>>> preds = torch.tensor([1, 3, 2, 0, 1])
>>> normalized_mutual_info_score(preds, target, "arithmetic")
tensor(0.7919)