---
title: Test Sbert Cosine
emoji: ⚡
colorFrom: purple
colorTo: purple
sdk: gradio
sdk_version: 3.19.1
app_file: app.py
pinned: false
tags:
- evaluate
- metric
description: >-
  SBERT cosine is a metric that scores the semantic similarity of text generation outputs.
  This is not the official implementation of cosine similarity using SBERT.
  See the project at https://www.sbert.net/ for more information.
---

# Metric Card for SbertCosine

## Metric description

SBERT cosine is a metric that scores the semantic similarity of text generation outputs by encoding predictions and references with Sentence-BERT (SBERT) and comparing the resulting embeddings with cosine similarity.

## How to use

```python
from evaluate import load

sbert_cosine = load("transZ/sbert_cosine")
predictions = ["hello there", "general kenobi"]
references = ["hello there", "general kenobi"]
results = sbert_cosine.compute(predictions=predictions, references=references, lang="en")
```

## Output values

SBERT cosine outputs a dictionary with the following value:

`score`: a float ranging from 0.0 to 1.0, where higher values indicate greater semantic similarity between predictions and references.

## Limitations and bias

The [official repository](https://github.com/UKPLab/sentence-transformers) shows that SBERT captures the semantics of sentences well.

## Citation

```bibtex
@article{Reimers2019,
  archivePrefix = {arXiv},
  arxivId = {1908.10084},
  author = {Reimers, Nils and Gurevych, Iryna},
  doi = {10.18653/v1/d19-1410},
  eprint = {1908.10084},
  isbn = {9781950737901},
  journal = {EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference},
  pages = {3982--3992},
  title = {{Sentence-BERT: Sentence embeddings using siamese BERT-networks}},
  year = {2019}
}
```

## Further References

- [Official website](https://www.sbert.net/)
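
As an illustration of what the score represents, the sketch below embeds predictions and references with SBERT and compares each pair with cosine similarity. This is only a sketch of the general approach, not this metric's implementation: the `sentence-transformers` package and the `all-MiniLM-L6-v2` model are assumptions, and the model and aggregation actually used by `transZ/sbert_cosine` may differ.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed model; the model actually loaded by the metric may differ.
model = SentenceTransformer("all-MiniLM-L6-v2")

predictions = ["hello there", "general kenobi"]
references = ["hello there", "general kenobi"]

# Encode both lists into dense sentence embeddings.
pred_emb = model.encode(predictions, convert_to_tensor=True)
ref_emb = model.encode(references, convert_to_tensor=True)

# Cosine similarity of each prediction with its corresponding reference,
# averaged into a single score (identical pairs score 1.0).
pairwise = util.cos_sim(pred_emb, ref_emb).diagonal()
print({"score": float(pairwise.mean())})
```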