---
language: "pt"
tags:
- pt
- wikipedia
- gpt2
- finetuning
datasets:
- wikipedia
widget:
- text: "André Um"
- text: "Maria do Santos"
- text: "Roberto Carlos"
license: "mit"
---

# GPT2-SMALL-PORTUGUESE-WIKIPEDIABIO

This is a fine-tuned version of the [gpt2-small-portuguese](https://huggingface.co/pierreguillou/gpt2-small-portuguese) model by pierreguillou.

It was trained on a dataset of person abstracts extracted from DBpedia (over 100,000 people's abstracts). The model is intended as a simple and fun experiment for generating biography-style abstracts from ordinary people's names.
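A minimal usage sketch with the `transformers` library is shown below. The model id is an assumption based on this card's title; substitute the actual Hub repository name (including the user or organization prefix) when loading.

```python
# Sketch: generate a biography-style abstract from a person's name.
# The model id below is hypothetical, inferred from the card title;
# replace it with the real Hugging Face Hub repository name.
from transformers import pipeline

model_name = "gpt2-small-portuguese-wikipediabio"  # assumed id

generator = pipeline("text-generation", model=model_name)
result = generator("Maria do Santos", max_length=60, num_return_sequences=1)
print(result[0]["generated_text"])
```

As with the base gpt2-small-portuguese model, sampling parameters such as `max_length` and `num_return_sequences` can be tuned to control the length and variety of the generated abstracts.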