---
language:
- en
license: cc
library_name: adapter-transformers
tags:
- music
- art
datasets:
- SpartanCinder/artist-lyrics-dataset
- SpartanCinder/song-lyrics-artist-classifier
metrics:
- accuracy
---
# GPT2 Pretrained Lyric Generation Model
This repository contains a GPT2 model fine-tuned for lyric generation. The model was trained with the Hugging Face Transformers library.
## Model Details
- **Model architecture:** GPT2
- **Training data:** The datasets were built with the Genius API and are linked in this model card's metadata.
- **Training duration:** [Mention how long the model was trained]
## Usage
The model generates lyrics from a text prompt.
The example below uses nucleus sampling (top-p) with a probability threshold of 0.9,
which helps produce more diverse and less repetitive text.
Here is a basic usage example:
```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Load the fine-tuned model and tokenizer from the Hub
tokenizer = GPT2Tokenizer.from_pretrained("SpartanCinder/GPT2-pretrained-lyric-generation")
model = GPT2LMHeadModel.from_pretrained("SpartanCinder/GPT2-pretrained-lyric-generation")

# Encode a prompt and sample 5 continuations with nucleus sampling (top_p=0.9)
input_ids = tokenizer.encode("Once upon a time", return_tensors="pt")
output = model.generate(input_ids, max_length=100, num_return_sequences=5, do_sample=True, top_p=0.9)

# Decode and print the first generated sequence
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
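
Because `num_return_sequences=5`, `generate` returns five candidate sequences, while the snippet above only prints the first. A minimal sketch for inspecting all of them, reusing the `tokenizer` and `output` from the example above:

```python
# Decode and print every returned candidate, not just the first
for i, sequence in enumerate(output):
    print(f"--- Candidate {i + 1} ---")
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```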