---
language: en
thumbnail: http://www.huggingtweets.com/deepleffen-the_dealersh1p/1665552272191/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
🤖 AI CYBORG 🤖
γ€Ž γ€γ€Ždanγ€γ€Ž 』 & Deep Leffen Bot
@deepleffen-the_dealersh1p
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from 『 』『dan』『 』 & Deep Leffen Bot.

| Data | 『 』『dan』『 』 | Deep Leffen Bot |
| --- | --- | --- |
| Tweets downloaded | 2673 | 608 |
| Retweets | 1336 | 14 |
| Short tweets | 235 | 27 |
| Tweets kept | 1102 | 567 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xu780cl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-the_dealersh1p's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3w2qdw30) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3w2qdw30/artifacts) is logged and versioned. An illustrative sketch of this fine-tuning step is included at the end of this card.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingtweets/deepleffen-the_dealersh1p')
generator("My dream is", num_return_sequences=5)
```

For direct control over sampling parameters, see the generation sketch at the end of this card.

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
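## Code sketches

The snippet below is a minimal sketch of the fine-tuning step described under "Training procedure", assuming the downloaded tweets are available as plain strings. It is not the exact huggingtweets training script: the placeholder tweet list and the hyperparameters are illustrative assumptions, and the real values are recorded in the W&B run linked above.

```python
# Illustrative fine-tuning sketch; placeholder data and hyperparameters,
# not the values used for the published model.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tweets = ["example tweet one", "example tweet two"]  # hypothetical corpus

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tokenize the tweets into a causal-LM training set
dataset = Dataset.from_dict({"text": tweets}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           num_train_epochs=4,
                           per_device_train_batch_size=8),
    train_dataset=dataset,
    # mlm=False gives plain next-token (causal) language modeling
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```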
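If you want direct control over generation rather than the high-level pipeline, the sketch below uses the lower-level `transformers` API; the sampling parameters shown are arbitrary examples, not tuned values.

```python
# Direct generation sketch with explicit (illustrative) sampling parameters.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/deepleffen-the_dealersh1p")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/deepleffen-the_dealersh1p")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs,
                         max_length=50,
                         do_sample=True,
                         top_p=0.95,
                         num_return_sequences=5,
                         pad_token_id=tokenizer.eos_token_id)
for sequence in outputs:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```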