---
title: Test Sbert Cosine
emoji: 
colorFrom: purple
colorTo: purple
sdk: gradio
sdk_version: 3.19.1
app_file: app.py
pinned: false
tags:
- evaluate
- metric
description: >-
  SBERT cosine is a metric that scores the semantic similarity of text
  generation outputs.

  This is not the official implementation of cosine similarity using SBERT.

  See the project at https://www.sbert.net/ for more information.
---

# Metric Card for SbertCosine

## Metric description

SBERT cosine is a metric for scoring the semantic similarity of generated text against reference text: both texts are embedded with a Sentence-BERT (SBERT) model and compared via cosine similarity.
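
As a rough sketch of the idea (not this metric's internal implementation; the model name below is an arbitrary choice for illustration), the score comes from embedding both texts with an SBERT model and taking the cosine similarity of the embeddings:

```python
# Minimal sketch of the underlying idea, not the metric's actual code.
# "all-MiniLM-L6-v2" is an arbitrary SBERT model picked for illustration.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
emb_pred = model.encode("hello there", convert_to_tensor=True)
emb_ref = model.encode("hi there", convert_to_tensor=True)

# Cosine similarity between the two sentence embeddings.
score = util.cos_sim(emb_pred, emb_ref).item()
print(score)
```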

## How to use 

```python
from evaluate import load

# Load the metric from the Hugging Face Hub.
sbert_cosine = load("transZ/sbert_cosine")

predictions = ["hello there", "general kenobi"]
references = ["hello there", "general kenobi"]

# Scores each prediction against its corresponding reference.
results = sbert_cosine.compute(predictions=predictions, references=references, lang="en")
```

## Output values

Sbert cosine outputs a dictionary with the following values:

`score` (float): a value from 0.0 to 1.0, where higher values indicate greater semantic similarity.
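
As an illustration, running the usage example above (where each prediction matches its reference exactly) should yield a maximal score:

```python
print(results)
# {'score': 1.0}  (approximately; identical texts give maximal cosine similarity)
```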

## Limitations and bias

The [official repo](https://github.com/UKPLab/sentence-transformers) shows that SBERT captures sentence semantics well. Note, however, that this is not the official implementation of cosine similarity using SBERT; see https://www.sbert.net/ for the original project.

## Citation

```bibtex
@inproceedings{reimers-gurevych-2019-sentence,
  title     = {{Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks}},
  author    = {Reimers, Nils and Gurevych, Iryna},
  booktitle = {Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)},
  year      = {2019},
  pages     = {3982--3992},
  doi       = {10.18653/v1/D19-1410},
  eprint    = {1908.10084},
  archivePrefix = {arXiv},
}
```

## Further References 
- [Official website](https://www.sbert.net/)