---
title: triplet_margin_loss
emoji: 🐠
colorFrom: blue
colorTo: blue
sdk: gradio
sdk_version: 3.0.10
app_file: app.py
pinned: false
---
# Metric Card for Triplet Margin Loss
## Metric Description
Triplet margin loss is a loss function that measures the relative similarity between samples.
A triplet consists of a reference input (the anchor, `a`), a matching input (a positive example, `p`), and a non-matching input (a negative example, `n`).
The loss for each triplet is given by:
L(a, p, n) = max{d(a, p) - d(a, n) + margin, 0}
where d(x, y) is the Euclidean (L2) pairwise distance between x and y.
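For illustration, here is a minimal pure-Python sketch of this formula (not the metric's own implementation); for the example vectors used later in this card, it reproduces the documented value.
```python
import math

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    # Euclidean (L2) distance between two equal-length vectors
    def d(x, y):
        return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))
    # L(a, p, n) = max{d(a, p) - d(a, n) + margin, 0}
    return max(d(anchor, positive) - d(anchor, negative) + margin, 0.0)

anchor   = [-0.4765, 1.7133, 1.3971, -1.0121, 0.0732]
positive = [0.9218, 0.6305, 0.3381, 0.1412, 0.2607]
negative = [0.1971, 0.7246, 0.6729, 0.0941, 0.1011]
print(round(triplet_margin_loss(anchor, positive, negative), 2))  # 1.59
```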
## How to Use
At minimum, this metric requires anchor, positive and negative examples.
```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011])
>>> print(loss)
{'triplet_margin_loss': 1.59}
```
You may also add a custom margin (default: 1.0).
```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011],
margin=2.0)
>>> print(loss)
{'triplet_margin_loss': 2.59}
```
### Inputs
- **anchor** (`list` of `float`): Reference inputs.
- **positive** (`list` of `float`): Matching inputs.
- **negative** (`list` of `float`): Non-matching inputs.
- **margin** (`float`, *optional*): Margin for the loss. Defaults to `1.0`.
### Output Values
- **triplet_margin_loss** (`float`): Total loss.
Output Example(s):
```python
{'triplet_margin_loss': 2.59}
```
This metric outputs a dictionary containing the triplet margin loss.
### Examples
Example 1: A simple example
```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011])
>>> print(loss)
{'triplet_margin_loss': 1.59}
```
Example 2: The same as Example 1, but with `margin` set to `2.0`.
```python
>>> triplet_margin_loss = evaluate.load("theAIguy/triplet_margin_loss")
>>> loss = triplet_margin_loss.compute(
anchor=[-0.4765, 1.7133, 1.3971, -1.0121, 0.0732],
positive=[0.9218, 0.6305, 0.3381, 0.1412, 0.2607],
negative=[0.1971, 0.7246, 0.6729, 0.0941, 0.1011],
margin=2.0)
>>> print(loss)
{'triplet_margin_loss': 2.59}
```
## Limitations and Bias
When this loss is used to learn embeddings for clustering, care must be taken to include feature-rich data points; otherwise, dissimilar data points may end up clustered together. Hard negative mining is widely used alongside this loss function to penalize such cases.
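As an illustration of hard negative mining (a hypothetical helper, not part of this metric), one common strategy is to pick, for each anchor, the candidate negative that lies closest to it, since that negative produces the largest triplet loss and hence the strongest training signal.
```python
import math

def l2(x, y):
    # Euclidean distance between two equal-length vectors
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def hardest_negative(anchor, candidate_negatives):
    # The negative closest to the anchor maximizes
    # d(a, p) - d(a, n) + margin, i.e. the triplet loss.
    return min(candidate_negatives, key=lambda n: l2(anchor, n))
```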
## Citation(s)
```bibtex
@article{schultz2003learning,
title={Learning a distance metric from relative comparisons},
author={Schultz, Matthew and Joachims, Thorsten},
journal={Advances in neural information processing systems},
volume={16},
year={2003}
}
```