rufimelo committed
Commit fa9ca7c
1 Parent(s): dbb77b9

Update README.md

Files changed (1): README.md (+4 −4)
README.md CHANGED
@@ -15,8 +15,8 @@ datasets:
 # rufimelo/Legal-SBERTimbau-large
 
 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 1024 dimensional dense vector space and can be used for tasks like clustering or semantic search.
-
-<!--- Describe your model here -->
+Legal-SBERTimbau-large is based on Legal-BERTimbau-large, which derives from [BERTimbau](https://huggingface.co/neuralmind/bert-base-portuguese-cased) Large.
+It is adapted to the Portuguese legal domain.
 
 ## Usage (Sentence-Transformers)
 
@@ -30,7 +30,7 @@ Then you can use the model like this:
 
 ```python
 from sentence_transformers import SentenceTransformer
-sentences = ["This is an example sentence", "Each sentence is converted"]
+sentences = ["Isto é um exemplo", "Isto é um outro exemplo"]
 
 model = SentenceTransformer('rufimelo/Legal-SBERTimbau-large')
 embeddings = model.encode(sentences)
@@ -40,7 +40,7 @@ print(embeddings)
 
 
 ## Usage (HuggingFace Transformers)
-Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
+
 
 ```python
 from transformers import AutoTokenizer, AutoModel
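# The model card's prose says to apply the right pooling operation on top of
# the contextualized word embeddings. The committed snippet is truncated at
# this point, so what follows is a minimal sketch of mean pooling (the recipe
# sentence-transformers model cards typically show), not the committed code.
import torch

def mean_pooling(model_output, attention_mask):
    # model_output[0]: token embeddings, shape (batch, seq_len, hidden)
    token_embeddings = model_output[0]
    # Expand the attention mask so padded positions contribute nothing.
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    # Average only over real (unmasked) tokens.
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)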