Distance metrics play a crucial role in machine learning, especially in tasks like clustering, classification, and recommendation systems. In this post, we will explore popular distance metrics, including Cosine, Euclidean, Mahalanobis, Hellinger, Jaccard, Manhattan, Correlation, Dice, and Hamming distances, and provide a PyTorch implementation for each.
1. Cosine Distance
Defined as 1 minus the cosine of the angle between two non-zero vectors. Often used in text similarity and document clustering.
import torch
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([4.0, 5.0, 6.0])
# cosine_similarity expects batched inputs, hence the unsqueeze(0)
cosine_distance = 1 - torch.nn.functional.cosine_similarity(x.unsqueeze(0), y.unsqueeze(0))
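As a quick sanity check (reusing the same x and y), the same value follows directly from the dot product and the vector norms:
manual_cosine = 1 - torch.dot(x, y) / (torch.norm(x) * torch.norm(y))
print(cosine_distance.item(), manual_cosine.item())  # both ≈ 0.025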
2. Euclidean Distance
Represents the straight-line distance between two points in Euclidean space.
euclidean_distance = torch.dist(x, y, p=2)
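Equivalently, the Euclidean distance is the L2 norm of the difference vector; a quick equivalence check with the same x and y:
euclidean_manual = torch.norm(x - y)  # same as torch.dist(x, y, p=2)
print(euclidean_distance.item(), euclidean_manual.item())  # both ≈ 5.196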
3. Mahalanobis Distance
Accounts for the correlation between variables and scales distances accordingly. Useful in anomaly detection.
# The covariance must be estimated from a data matrix (rows = samples);
# computing it from x and y alone would give a singular matrix that cannot be inverted.
data = torch.randn(100, 3)  # toy dataset with the same dimensionality as x and y
cov = torch.cov(data.T)
cov_inv = torch.linalg.inv(cov)
diff = (x - y).unsqueeze(0)
mahalanobis_distance = torch.sqrt(diff @ cov_inv @ diff.T)
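A useful sanity check (a small sketch that substitutes the identity matrix for the covariance): with an identity covariance, the Mahalanobis distance reduces to the Euclidean distance.
identity_cov = torch.eye(3)
d_identity = torch.sqrt(diff @ torch.linalg.inv(identity_cov) @ diff.T)
print(d_identity.item(), torch.dist(x, y).item())  # both ≈ 5.196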
4. Hellinger Distance
Measures the distance between two probability distributions, ranging from 0 (identical) to 1 (disjoint support). The inputs are normalized to sum to 1 before comparison.
px = torch.sqrt(x / x.sum())  # normalize x to a probability distribution, then take the square root
py = torch.sqrt(y / y.sum())
hellinger_distance = torch.norm(px - py) / torch.sqrt(torch.tensor(2.0))
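The same formula applied to two explicit probability distributions (a toy example with made-up values):
p = torch.tensor([0.2, 0.3, 0.5])
q = torch.tensor([0.1, 0.4, 0.5])
hellinger_pq = torch.norm(torch.sqrt(p) - torch.sqrt(q)) / torch.sqrt(torch.tensor(2.0))
print(hellinger_pq.item())  # ≈ 0.11; identical distributions would give 0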
5. Jaccard Distance
Used for comparing the similarity and diversity of sample sets. Defined as 1 - (intersection / union).
x_set = torch.tensor([1, 1, 0, 0])
y_set = torch.tensor([1, 0, 1, 0])
intersection = torch.sum((x_set & y_set).float())  # elementwise AND/OR on the 0/1 integer tensors
union = torch.sum((x_set | y_set).float())
jaccard_distance = 1 - intersection / union
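For these binary vectors the intersection is 1 (only the first position is 1 in both) and the union is 3, so the distance is 1 - 1/3 ≈ 0.667. The same computation can also be written with boolean tensors:
xb, yb = x_set.bool(), y_set.bool()
jaccard_alt = 1 - (xb & yb).sum().float() / (xb | yb).sum().float()
print(jaccard_distance.item(), jaccard_alt.item())  # both ≈ 0.667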
6. Manhattan Distance
Also known as L1 distance. The sum of absolute differences between corresponding elements.
manhattan_distance = torch.sum(torch.abs(x - y))
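The same value can be obtained via torch.dist with p=1; a quick equivalence check with the x and y from earlier:
manhattan_alt = torch.dist(x, y, p=1)
print(manhattan_distance.item(), manhattan_alt.item())  # both 9.0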
7. Correlation Distance
Measures dissimilarity between variables as 1 minus the Pearson correlation coefficient.
correlation_distance = 1 - torch.corrcoef(torch.stack([x, y]))[0, 1]
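One way to see what this measures (a small check, reusing x and y): correlation distance equals the cosine distance of the mean-centered vectors.
xc, yc = x - x.mean(), y - y.mean()
corr_via_cosine = 1 - torch.nn.functional.cosine_similarity(xc.unsqueeze(0), yc.unsqueeze(0))
print(correlation_distance.item(), corr_via_cosine.item())  # both ≈ 0.0 here, since y is just x shifted by 3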
8. Dice Distance
Mainly used for comparing the similarity of two sets. Defined as 1 - (2 * |A ∩ B| / (|A| + |B|)).
intersection = torch.sum((x_set & y_set).float())
dice_distance = 1 - (2 * intersection) / (x_set.sum() + y_set.sum())
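For the example vectors the intersection is 1 and each set has 2 elements, so the distance is 1 - 2/4 = 0.5. Dice and Jaccard similarities are closely related; a quick check of the identity dice_similarity = 2 * jaccard_similarity / (1 + jaccard_similarity):
jaccard_sim = intersection / union  # 1/3 for this example
dice_from_jaccard = 1 - 2 * jaccard_sim / (1 + jaccard_sim)
print(dice_from_jaccard.item(), dice_distance.item())  # both 0.5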
9. Hamming Distance
Measures the number of positions at which corresponding elements differ; it is often normalized by the vector length, as in the snippet below.
hamming_distance = torch.sum(x_set != y_set).float() / x_set.numel()
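For the binary vectors above, the elements differ in 2 of 4 positions, giving a normalized distance of 0.5. The same idea applies to strings or categorical codes (a toy example encoding characters as integers):
a = torch.tensor([ord(c) for c in "karolin"])
b = torch.tensor([ord(c) for c in "kathrin"])
hamming_str = torch.sum(a != b).float() / a.numel()
print(hamming_str.item())  # 3 of 7 positions differ ≈ 0.429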