---
language: en
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('')">
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Wanda Maximoff’s Gay Son 🤖 AI Bot </div>
<div style="font-size: 15px">@neil_mcneil bot</div>
I was made with [huggingtweets]().
Create your own bot based on your favorite user with [the demo]()!
## How does it work?
The model uses the following pipeline.
To understand how the model was developed, check the [W&B report]().
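The pipeline's first stage cleans the downloaded tweets before fine-tuning: retweets and very short tweets are filtered out, which is why they appear as separate rows in the training-data table below. A minimal sketch of that filtering step (a hypothetical helper, not the project's actual code; the `min_words` threshold is an assumption):

```python
def filter_tweets(tweets, min_words=3):
    """Keep tweets that are neither retweets nor too short.

    Hypothetical illustration of the huggingtweets preprocessing
    stage: retweets and very short tweets are discarded.
    """
    kept = []
    for text in tweets:
        if text.startswith("RT @"):        # drop retweets
            continue
        if len(text.split()) < min_words:  # drop short tweets
            continue
        kept.append(text)
    return kept

sample = [
    "RT @someone: hello",                       # retweet, dropped
    "ok",                                       # too short, dropped
    "My dream is to write a great model card",  # kept
]
print(filter_tweets(sample))  # only the last tweet survives
```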
## Training data
The model was trained on [@neil_mcneil's tweets]().
| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3220 |
| Retweets | 611 |
| Short tweets | 587 |
| Tweets kept | 2022 |
[Explore the data](), which is tracked with [W&B artifacts]() at every step of the pipeline.
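The counts in the table are consistent: the tweets kept for training equal the tweets downloaded minus the filtered retweets and short tweets.

```python
# Sanity-check the training-data table: kept = downloaded - retweets - short.
downloaded, retweets, short_tweets = 3220, 611, 587
kept = downloaded - retweets - short_tweets
print(kept)  # 2022, matching the "Tweets kept" row
```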
## Training procedure
The model is based on a pre-trained [GPT-2](), which is fine-tuned on @neil_mcneil's tweets.
Hyperparameters and metrics are recorded in the [W&B training run]() for full transparency and reproducibility.
At the end of training, [the final model]() is logged and versioned.
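Fine-tuning uses the standard causal language-modeling objective: each tweet serves as both input and label, and the loss is cross-entropy on next-token prediction. A hedged sketch of that objective (a tiny randomly initialised GPT-2 config stands in for the real pre-trained checkpoint so the example runs without downloading weights; the sizes are illustrative, not the actual training configuration):

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny stand-in for the pre-trained GPT-2 checkpoint (illustrative sizes).
config = GPT2Config(vocab_size=1000, n_embd=64, n_layer=2, n_head=2)
model = GPT2LMHeadModel(config)

# Random token ids stand in for a tokenized tweet.
input_ids = torch.randint(0, config.vocab_size, (1, 16))

# Causal LM objective: the labels are the inputs; the model shifts them
# internally and computes cross-entropy on next-token prediction.
outputs = model(input_ids, labels=input_ids)
loss = outputs.loss

loss.backward()  # an optimizer step would then update the weights
print(float(loss))
```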
## How to use
You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/neil_mcneil')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2]().
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
For more details, visit the project repository.