---
language: en
tags:
- sentence-embeddings
- sentence-similarity
- dual-encoder
---
# cambridgeltl/trans-encoder-bi-simcse-roberta-large
An unsupervised sentence encoder (bi-encoder) proposed by Liu et al. (2021). The model is trained on unlabelled sentence pairs sampled from STS2012-2016, STS-b, and SICK-R, using princeton-nlp/unsup-simcse-roberta-large as the base model. Use the `[CLS]` token's representation (before the pooler) as the embedding of the input sentence.
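A minimal sketch of how the `[CLS]`-before-pooler embedding can be extracted with the standard Hugging Face `transformers` API (the example sentences are illustrative, not from the training data):

```python
# Sketch: sentence embeddings from the [CLS] token's last hidden state,
# i.e. before the model's pooler layer, as recommended above.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "cambridgeltl/trans-encoder-bi-simcse-roberta-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

sentences = ["A man is playing a guitar.", "Someone plays an instrument."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Take the hidden state of the first token ([CLS]) of the last layer,
# NOT outputs.pooler_output, which passes through the pooler head.
embeddings = outputs.last_hidden_state[:, 0, :]

# Cosine similarity between the two sentence embeddings.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(sim.item())
```

Note that `outputs.pooler_output` would apply an extra dense layer and tanh; taking `last_hidden_state[:, 0, :]` keeps the raw `[CLS]` state the card asks for.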
## Citation
```bibtex
@article{liu2021trans,
  title={Trans-Encoder: Unsupervised sentence-pair modelling through self- and mutual-distillations},
  author={Liu, Fangyu and Jiao, Yunlong and Massiah, Jordan and Yilmaz, Emine and Havrylov, Serhii},
  journal={arXiv preprint arXiv:2109.13059},
  year={2021}
}
```