🤖 AI CYBORG 🤖
ひろゆき, Hiroyuki Nishimura & 落合陽一 Yoichi OCHIAI & 乙武 洋匡
@h_ototake-hirox246-ochyai

I was made with huggingtweets.

Create your own bot based on your favorite user with the demo!

How does it work?

The model uses the following pipeline.

(pipeline diagram)

To understand how the model was developed, check the W&B report.

Training data

The model was trained on tweets from ひろゆき, Hiroyuki Nishimura & 落合陽一 Yoichi OCHIAI & 乙武 洋匡.

Data                 ひろゆき, Hiroyuki Nishimura    落合陽一 Yoichi OCHIAI    乙武 洋匡
Tweets downloaded    3248                            3240                      3238
Retweets             281                             2238                      1259
Short tweets         1980                            574                       1437
Tweets kept          987                             428                       542

Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
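
The counts above suggest that retweets and very short tweets are filtered out before training, leaving only the "Tweets kept" for fine-tuning. The snippet below is a minimal sketch of that kind of filtering; the helper name keep_tweet and the 40-character threshold are illustrative assumptions, not the exact huggingtweets implementation.

def keep_tweet(text, min_chars=40):
    """Return True if a tweet should be kept for fine-tuning."""
    if text.startswith("RT @"):   # drop retweets
        return False
    if len(text) < min_chars:     # drop short tweets
        return False
    return True

tweets = [
    "RT @someone: interesting thread",
    "ok",
    "Thinking about how research and media will blend over the next decade.",
]
kept = [t for t in tweets if keep_tweet(t)]
print(kept)  # only the last tweet passes both filters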

Training procedure

The model is based on a pre-trained GPT-2, which is fine-tuned on @h_ototake-hirox246-ochyai's tweets.

Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.

At the end of training, the final model is logged and versioned.
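
As a rough illustration of this step, the sketch below fine-tunes GPT-2 on a handful of tweet texts with the Hugging Face Trainer. It is only an assumption-laden outline: huggingtweets has its own training script, and the real dataset and hyperparameters are the ones recorded in the W&B run linked above.

from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

# Illustrative sketch only; the actual training configuration is logged in W&B.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Stand-in for the filtered tweets ("Tweets kept" in the table above).
texts = ["example tweet about research", "example tweet about society"]
encodings = tokenizer(texts, truncation=True, max_length=128)
train_dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]

args = TrainingArguments(
    output_dir="gpt2-tweets",          # hypothetical output directory
    num_train_epochs=4,                # placeholder value
    per_device_train_batch_size=8,     # placeholder value
    learning_rate=5e-5,                # placeholder value
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()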

How to use

You can use this model directly with a pipeline for text generation:

from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub
generator = pipeline('text-generation',
                     model='huggingtweets/h_ototake-hirox246-ochyai')

# Generate 5 completions of the prompt
generator("My dream is", num_return_sequences=5)

Limitations and bias

The model suffers from the same limitations and biases as GPT-2.

In addition, the data present in the users' tweets further affects the text generated by the model.

About

Built by Boris Dayma


For more details, visit the project repository.

