How to use this model directly from the Transformers library:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead
# Note: AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForCausalLM is the current equivalent for GPT-2-style models.

tokenizer = AutoTokenizer.from_pretrained("pranavpsv/gpt2-genre-story-generator")
model = AutoModelWithLMHead.from_pretrained("pranavpsv/gpt2-genre-story-generator")
```
GPT-2 fine-tuned for genre-based story generation.
Used to generate stories based on a user-provided genre and starting prompt.
Supported genres: `superhero`, `action`, `drama`, `horror`, `thriller`, `sci_fi`
Input format: `<BOS> <genre> Some optional text...`

Example: `<BOS> <sci_fi> After discovering time travel,`
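To illustrate the prompt format above, here is a small hypothetical helper (not part of the model card; the `build_prompt` function and `GENRES` set are illustrative names) that assembles a valid prompt from a genre tag and optional starting text:

```python
# Genres listed in this model card.
GENRES = {"superhero", "action", "drama", "horror", "thriller", "sci_fi"}

def build_prompt(genre: str, text: str = "") -> str:
    """Build a "<BOS> <genre> optional text" prompt for the story generator."""
    if genre not in GENRES:
        raise ValueError(f"unsupported genre: {genre}")
    # rstrip() drops the trailing space when no starting text is given.
    return f"<BOS> <{genre}> {text}".rstrip()

print(build_prompt("sci_fi", "After discovering time travel,"))
# -> <BOS> <sci_fi> After discovering time travel,
```

The resulting string can be passed directly to the pipeline shown below.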
```python
# Example of usage
from transformers import pipeline

story_gen = pipeline("text-generation", "pranavpsv/gpt2-genre-story-generator")
print(story_gen("<BOS> <superhero> Batman"))
```
Initialized with the pre-trained weights of the "gpt2" checkpoint, then fine-tuned on stories of various genres.