---
datasets:
  - mc4
language:
  - pt
metrics:
  - perplexity
library_name: transformers
---

This model is a Portuguese fine-tuned version of facebook/opt-125m. It underwent additional causal language modeling pre-training with a context size of 512, using an extra 300 million Portuguese tokens sampled from mC4. The Wandb report is publicly available here. The training code, which runs on Colab Pro (A100 - 40GB), can be found here. The total cost of training this model was R$17.40, or $3.37 USD (as of March 2023).
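
As a minimal sketch of direct use outside the pipeline API, the model could also be loaded with AutoTokenizer and AutoModelForCausalLM to compute perplexity, the metric reported above. The snippet below is illustrative and the example sentence is not from the original card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative sketch: load the fine-tuned checkpoint directly
tokenizer = AutoTokenizer.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")
model = AutoModelForCausalLM.from_pretrained("thiagolaitz/opt-125m-pt-finetuned")
model.eval()

# Example sentence (illustrative, not from the original card)
text = "Eles brincaram o dia inteiro sob o sol quente."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels equal to the inputs, the model returns the mean cross-entropy loss
    outputs = model(**inputs, labels=inputs["input_ids"])

# Perplexity is the exponential of the mean cross-entropy loss
perplexity = torch.exp(outputs.loss)
print(f"Perplexity: {perplexity.item():.2f}")
```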

Deterministic use:

```python
from transformers import pipeline

# Greedy decoding is the default (do_sample=False), so the output is deterministic
generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", max_length=30)
generator("Eles brincaram o dia inteiro sob o sol quente, mas")
# Output: Eles brincaram o dia inteiro sob o sol quente, mas não se deixaram levar pelo sol.
```

Top-k sampling:

```python
from transformers import pipeline

# do_sample=True enables sampling (top_k=50 by default), so outputs vary between runs
generator = pipeline('text-generation', model="thiagolaitz/opt-125m-pt-finetuned", do_sample=True, max_length=30)
generator("Eles brincaram o dia inteiro sob o sol quente, mas")
```