
BOTHALTEROUT

This model is a fine-tuned version of GPT-2, trained on 21,832 tweets from 12 Twitter users with very strong opinions about the United States Men's National Team.
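Since this is a standard GPT-2 fine-tune, it can be used through the usual `transformers` text-generation pipeline. A minimal sketch (the model identifier below is a placeholder; substitute the model's actual Hugging Face Hub repository name):

```python
from transformers import pipeline

# "BOTHALTEROUT" is a placeholder id; replace with the full hub path
# of this model's repository.
generator = pipeline("text-generation", model="BOTHALTEROUT")

# Generate a short tweet-style continuation from a prompt.
result = generator("The USMNT", max_length=40, num_return_sequences=1)
print(result[0]["generated_text"])
```

Outputs are sampled, so repeated calls with the same prompt will differ unless a seed is fixed.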

Limitations and bias

The model inherits all of the limitations and biases of GPT-2.

Additionally, BOTHALTEROUT can produce problematic output that reflects the content of the tweets it was trained on.

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001372
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1

Training results

Framework versions

  • Transformers 4.19.2
  • Pytorch 1.11.0+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1

About

Built by Eliot McKinley, based on HuggingTweets by Boris Dayma.
