Commit 685df17 by hku-nlp
1 Parent(s): 8770f98

Delete README.md

Files changed (1): README.md (+0 -51)

README.md DELETED

---
pipeline_tag: sentence-similarity
language: en
license: apache-2.0
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---

# sentence-transformers/gtr-t5-large

This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 768-dimensional dense vector space. The model was trained specifically for the task of semantic search.

This model was converted from the TensorFlow model [gtr-large-1](https://tfhub.dev/google/gtr/gtr-large/1) to PyTorch. When using this model, have a look at the publication: [Large Dual Encoders Are Generalizable Retrievers](https://arxiv.org/abs/2112.07899). The TF Hub model and this PyTorch model can produce slightly different embeddings; when run on the same benchmarks, however, they produce identical results.

The model uses only the encoder from a T5-large model. The weights are stored in FP16.
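
As a quick sanity check of the embedding size and module structure, the following minimal sketch uses the standard sentence-transformers API (the expected values in the comments follow from the description above):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('sentence-transformers/gtr-t5-large')

# The pooled sentence embedding should be 768-dimensional
print(model.get_sentence_embedding_dimension())  # expected: 768

# The first module wraps the encoder-only T5 transformer
print(model[0])
```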

## Usage (Sentence-Transformers)

Using this model is straightforward once you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer

# Example sentences to embed
sentences = ["This is an example sentence", "Each sentence is converted"]

# Load the model and encode each sentence into a 768-dimensional vector
model = SentenceTransformer('sentence-transformers/gtr-t5-large')
embeddings = model.encode(sentences)
print(embeddings)
```

The model requires sentence-transformers version 2.2.0 or newer.
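
Because the model is trained for semantic search, a typical use is to rank a corpus against a query by cosine similarity. The sketch below is a minimal illustration with a made-up query and corpus; `util.cos_sim` is the sentence-transformers helper for cosine similarity:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer('sentence-transformers/gtr-t5-large')

# Hypothetical corpus and query, for illustration only
corpus = [
    "A man is eating food.",
    "A woman plays the violin.",
    "The new movie is awesome.",
]
query = "Someone is having a meal."

# Encode to tensors so cosine similarity can be computed directly
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Rank corpus sentences by similarity to the query
scores = util.cos_sim(query_embedding, corpus_embeddings)[0]
for sentence, score in sorted(zip(corpus, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.4f}  {sentence}")
```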

## Evaluation Results

For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/gtr-t5-large)
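
The benchmark linked above runs server-side. If you want to run a comparable evaluation locally, one option not covered by the original card is the MTEB library; the task selection below is a hypothetical example and can be swapped for any MTEB task list:

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Load the model to evaluate
model = SentenceTransformer('sentence-transformers/gtr-t5-large')

# "STSBenchmark" is an illustrative task choice, not the card's benchmark
evaluation = MTEB(tasks=["STSBenchmark"])
evaluation.run(model, output_folder="results/gtr-t5-large")
```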

## Citing & Authors

If you find this model helpful, please cite the respective publication:

[Large Dual Encoders Are Generalizable Retrievers](https://arxiv.org/abs/2112.07899)