|
--- |
|
license: mit |
|
tags: |
|
model-index: |
|
- name: BERHALTER
|
results: [] |
|
widget: |
|
- text: "Gregg Berhalter" |
|
- text: "The USMNT won't win the World Cup" |
|
- text: "The Soccer Media" |
|
- text: "Ball don't" |
|
|
|
--- |
|
|
|
# BERHALTER |
|
|
|
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2), trained on 21,832 tweets from 12 Twitter users with very strong opinions about the United States Men's National Team.
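
A minimal sketch of generating text with this model through the `transformers` pipeline API; the repository id `etmckinley/BERHALTER` is an assumption, so substitute the actual model path if it differs:

```python
from transformers import pipeline

# "etmckinley/BERHALTER" is an assumed repository id; replace it with the
# actual model path if this card lives elsewhere.
generator = pipeline("text-generation", model="etmckinley/BERHALTER")

# Prompts mirror the widget examples above.
outputs = generator("Gregg Berhalter", max_length=50, num_return_sequences=1)
print(outputs[0]["generated_text"])
```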
|
|
|
## Limitations and bias |
|
|
|
The model has all [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). |
|
|
|
Additionally, BERHALTER can produce problematic output that reflects the content of the tweets used to fine-tune the model.
|
|
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 0.0001372 |
|
- train_batch_size: 1 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- num_epochs: 1 |
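
A sketch of how these hyperparameters might map onto `transformers` `TrainingArguments`; the actual training script is not part of this card, and `output_dir` is a hypothetical placeholder:

```python
from transformers import TrainingArguments

# Approximate mapping of the hyperparameters listed above; the real training
# setup (Trainer vs. a custom loop) is not documented in this card.
training_args = TrainingArguments(
    output_dir="./berhalter",          # hypothetical output path
    learning_rate=1.372e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```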
|
|
|
### Training results

No evaluation results are reported for this fine-tuning run.
|
|
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.19.2 |
|
- Pytorch 1.11.0+cu113 |
|
- Datasets 2.2.2 |
|
- Tokenizers 0.12.1 |
|
|
|
## About |
|
|
|
*Built by [Eliot McKinley](https://twitter.com/etmckinley) based upon [HuggingTweets](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb) by Boris Dayma*
|
|