---
datasets:
- mc4
language:
- pt
metrics:
- perplexity
library_name: transformers
---

This model is a Portuguese fine-tuned version of [facebook/opt-125m](https://huggingface.co/facebook/opt-125m). It underwent additional causal language modeling pre-training with a context size of 512, using an extra 300 million Portuguese tokens sampled from mc4. The Wandb report is publicly available [here](https://api.wandb.ai/links/thiagolaitz1/ths2zi4c), and the training code, run on Colab Pro (A100 - 40GB), can be found [here](https://github.com/thiagolaitz/IA368-search-engines/blob/main/Project%2004/opt_125m_pt_finetuning.ipynb). The total cost of training was R$17.40, or about $3.37 USD (as of March 2023).
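
For reference, the sketch below shows what such continued causal-LM pre-training could look like with the standard `transformers` Trainer. It is only illustrative: the streaming setup, column names, batch size, step count, and learning rate are assumptions, not the exact settings from the linked notebook.

```python
# Minimal sketch of continued pre-training on Portuguese mc4 (illustrative only;
# see the linked Colab notebook for the actual configuration).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Portuguese split of mc4, streamed so the full corpus is never downloaded.
dataset = load_dataset("mc4", "pt", split="train", streaming=True)

def tokenize(batch):
    # Truncate to the 512-token context used for this fine-tuning.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text", "timestamp", "url"])

# Causal LM objective: the collator copies input_ids into labels (no masking).
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="opt-125m-pt",
    per_device_train_batch_size=8,   # illustrative value
    max_steps=10_000,                # illustrative value
    learning_rate=5e-5,              # illustrative value
    fp16=True,
)

trainer = Trainer(model=model, args=args, train_dataset=tokenized, data_collator=collator)
trainer.train()
```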

Deterministic use:
```python
from transformers import pipeline

# Greedy decoding is the default, so repeated calls return the same text.
generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", max_length=30)
generator("Eles brincaram o dia inteiro sob o sol quente, mas")
# Output: Eles brincaram o dia inteiro sob o sol quente, mas não se deixaram levar pelo sol.
```

Top-k sampling:
```python
from transformers import pipeline

# do_sample=True enables sampling; top_k=50 matches the library default, written out for clarity.
generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", do_sample=True, top_k=50, max_length=30)
generator("Eles brincaram o dia inteiro sob o sol quente, mas")
# Output varies between runs because tokens are sampled.
```
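
Since the card lists perplexity as the evaluation metric, the snippet below is a minimal sketch of how it could be measured on held-out Portuguese text. The example sentences and the simple token-weighted averaging are illustrative assumptions, not the evaluation protocol used for this model.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")
model = AutoModelForCausalLM.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")
model.eval()

# Illustrative held-out sentences; in practice use a proper validation split.
texts = [
    "Eles brincaram o dia inteiro sob o sol quente.",
    "O jantar foi servido na varanda ao entardecer.",
]

nll_sum, token_count = 0.0, 0
with torch.no_grad():
    for text in texts:
        inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
        # With labels equal to input_ids, the model returns the mean cross-entropy loss.
        outputs = model(**inputs, labels=inputs["input_ids"])
        n_tokens = inputs["input_ids"].size(1)
        nll_sum += outputs.loss.item() * n_tokens
        token_count += n_tokens

print(f"Perplexity: {math.exp(nll_sum / token_count):.2f}")
```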