Commit 68341fb by eduagarcia
Parent: 2a90ae3

Update README.md

Files changed (1):
  1. README.md +3 -3

README.md CHANGED
@@ -15,7 +15,7 @@ This is an adaptation of pre-trained Portuguese GloVe Word Embeddings to a [sent
 
 The original pre-trained word embeddings can be found at: [http://nilc.icmc.usp.br/nilc/index.php/repositorio-de-word-embeddings-do-nilc](http://nilc.icmc.usp.br/nilc/index.php/repositorio-de-word-embeddings-do-nilc).
 
-This model maps sentences & paragraphs to a 300 dimensional dense vector space and can be used for tasks like clustering or semantic search.
+This model maps sentences & paragraphs to a 100 dimensional dense vector space and can be used for tasks like clustering or semantic search.
 
 ## Usage (Sentence-Transformers)
 
@@ -46,7 +46,7 @@ SentenceTransformer(
   (0): WordEmbeddings(
     (emb_layer): Embedding(929606, 100)
   )
-  (1): Pooling({'word_embedding_dimension': 300, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+  (1): Pooling({'word_embedding_dimension': 100, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
 )
 ```
@@ -54,7 +54,7 @@ SentenceTransformer(
 
 ```bibtex
 @inproceedings{hartmann2017portuguese,
-  title = Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks},
+  title = {Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks},
   author = {Hartmann, Nathan S and
            Fonseca, Erick R and
            Shulby, Christopher D and
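The diff corrects the pooled dimension from 300 to 100; with `pooling_mode_mean_tokens: True`, the `Pooling` module simply averages the token vectors, so the sentence vector keeps the word-embedding dimension. A minimal NumPy sketch of that mean pooling, using toy random vectors rather than the actual NILC embeddings:

```python
import numpy as np

# Toy token embeddings for one sentence: 5 tokens, each 100-dimensional
# (matching the per-token vector size of the Embedding(929606, 100) layer).
rng = np.random.default_rng(0)
token_vectors = rng.normal(size=(5, 100))

# Mean pooling (pooling_mode_mean_tokens): average over the token axis,
# yielding one fixed-size 100-dimensional sentence vector.
sentence_vector = token_vectors.mean(axis=0)
assert sentence_vector.shape == (100,)
```

This illustrates why the `Pooling` module's `word_embedding_dimension` must match the embedding layer's dimension (100 here), which is exactly the inconsistency the commit fixes.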