thiagolaitz committed
Commit
e89e1f2
1 Parent(s): ee2d86d

Update README.md

Files changed (1): README.md (+19 -1)
README.md CHANGED
@@ -6,4 +6,22 @@ language:
  metrics:
  - perplexity
  library_name: transformers
- ---
+ ---
+
+ This model is a Portuguese fine-tuned version of [facebook/opt-125m](https://huggingface.co/facebook/opt-125m). It underwent additional causal language modeling pre-training with a context size of 512, using an extra 300 million tokens of Portuguese text sampled from mc4. The Wandb report is publicly available [here](https://api.wandb.ai/links/thiagolaitz1/ths2zi4c), and the training code, run on Colab Pro (A100, 40 GB), is available [here](https://github.com/thiagolaitz/IA368-search-engines/blob/main/Project%2004/opt_125m_pt_finetuning.ipynb). The total training cost was R$17.40, or $3.37 USD (as of March 2023).
+
+ Deterministic use (greedy decoding, since `do_sample` defaults to `False`):
+ ```python
+ from transformers import pipeline
+
+ generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", max_length=30)
+ generator("Eles brincaram o dia inteiro sob o sol quente, mas")
+ ```
+
+ Top-k sampling (with `do_sample=True`, generation uses the transformers default of `top_k=50`):
+ ```python
+ from transformers import pipeline
+
+ generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", do_sample=True, max_length=30)
+ generator("Eles brincaram o dia inteiro sob o sol quente, mas")
+ ```
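
For readers who prefer not to use the `pipeline` helper, here is a minimal sketch of loading the checkpoint described in the README above with the generic `AutoTokenizer`/`AutoModelForCausalLM` classes. It assumes only the model id shown in the diff (`thiagolaitz/opt-125m-pt-finetuned`) and standard `transformers` generation defaults; the explicit `top_k=50` simply mirrors the library default used implicitly in the README's sampling example.

```python
# Minimal sketch: load the checkpoint directly instead of via pipeline().
# Assumes the model id from the README and stock transformers APIs.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")
model = AutoModelForCausalLM.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")

# Same Portuguese prompt as the README examples.
inputs = tokenizer("Eles brincaram o dia inteiro sob o sol quente, mas", return_tensors="pt")

# Top-k sampling; set do_sample=False for the deterministic (greedy) variant.
outputs = model.generate(**inputs, do_sample=True, top_k=50, max_length=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```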