---
license: apache-2.0
base_model: google/mt5-small
tags:
- summarization
- generated_from_trainer
- music
- song-lyrics
metrics:
- rouge
model-index:
- name: mt5-small-finetuned-genius
  results: []
pipeline_tag: summarization
datasets:
- miscjose/genius-music
widget:
- text: |-
    Thought I'd end up with Sean
    But he wasn't a match
    Wrote some songs about Ricky
    Now I listen and laugh
    Even almost got married
    And for Pete, I'm so thankful
    Wish I could say, "Thank you" to Malcolm
    'Cause he was an angel
    One taught me love
    One taught me patience
    And one taught me pain
    Now, I'm so amazing
    Say I've loved and I've lost
    But that's not what I see
    So, look what I got
    Look what you taught me
    And for that, I say
    Thank you, next (Next)
    Thank you, next (Next)
    Thank you, next
    I'm so fuckin' grateful for my ex
    Thank you, next (Next)
    Thank you, next (Next)
    Thank you, next (Next)
    I'm so fuckin'—
    Spend more time with my friends
    I ain't worried 'bout nothin'
    Plus, I met someone else
    We havin' better discussions
    I know they say I move on too fast
    But this one gon' last
    'Cause her name is Ari
    And I'm so good with that (So good with that)
    She taught me love (Love)
    She taught me patience (Patience)
    How she handles pain (Pain)
    That shit's amazing (Yeah, she's amazing)
    I've loved and I've lost (Yeah, yeah)
    But that's not what I see (Yeah, yeah)
    'Cause look what I've found (Yeah, yeah, I've found)
    Ain't no need for searching, and for that, I say
    Thank you, next (Thank you, next)
    Thank you, next (Thank you, next)
    Thank you, next (Thank you)
    I'm so fuckin' grateful for my ex
    Thank you, next (Thank you, next)
    Thank you, next (Said thank you, next)
    Thank you, next (Next)
    I'm so fuckin' grateful for my ex
    Thank you, next
    Thank you, next
    Thank you, next
    I'm so fuckin'—
    One day I'll walk down the aisle
    Holding hands with my mama
    I'll be thanking my dad
    'Cause she grew from the drama
    Only wanna do it once, real bad
    Gon' make that shit last
    God forbid something happens
    Least this song is a smash (Song is a smash)
    I've got so much love (Love)
    Got so much patience (Patience)
    And I've learned from the pain (Pain)
    I turned out amazing (Turned out amazing)
    Say I've loved and I've lost (Yeah, yeah)
    But that's not what I see (Yeah, yeah)
    'Cause look what I've found (Yeah, yeah)
    Ain't no need for searching
    And for that, I say
    Thank you, next (Thank you, next)
    Thank you, next (Thank you, next)
    Thank you, next
    I'm so fuckin' grateful for my ex
    Thank you, next (Thank you, next)
    Thank you, next (Said thank you, next)
    Thank you, next (Next)
    I'm so fuckin' grateful for my ex
    Thank you, next
    Thank you, next
    Thank you, next
    Yeah, yee
    Thank you, next
    Thank you, next
    Thank you, next
    Yeah, yee
---
# mt5-small-finetuned-genius

This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the [Genius](https://genius.com/) music dataset available [here](https://www.cs.cornell.edu/~arb/data/genius-expertise/). Song lyrics were preprocessed and used as model inputs, with the corresponding song titles serving as the target summaries for fine-tuning.

You can view more examples of this model's inference in the following [Space](https://huggingface.co/spaces/miscjose/song-title-generation).
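As a quick usage sketch, the model can be loaded with the 🤗 Transformers `summarization` pipeline. The checkpoint id below is assumed from this card's name; substitute the actual repo id if it differs.

```python
from transformers import pipeline

# Assumed repo id, inferred from the model name on this card.
summarizer = pipeline("summarization", model="miscjose/mt5-small-finetuned-genius")

lyrics = """Thought I'd end up with Sean
But he wasn't a match
Wrote some songs about Ricky
Now I listen and laugh"""

# Generate a short, title-like summary of the lyrics.
print(summarizer(lyrics, max_length=16, num_beams=4)[0]["summary_text"])
```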
## Model description

For details on the base architecture, please visit [google/mt5-small](https://huggingface.co/google/mt5-small).
## Intended uses & limitations

- Intended uses: Given song lyrics, generate a short, title-like summary.
- Limitations: Because the training lyrics contain explicit and offensive language, the model can generate summaries containing profanity or hate speech.
## Training and evaluation data

- 27.6K training samples
- 3.45K validation samples

A sketch of how these samples can be loaded and tokenized follows below.
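This sketch assumes the `miscjose/genius-music` dataset exposes `lyrics` and `title` columns and uses illustrative sequence lengths; check the dataset card for the actual schema.

```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")

# Assumed column names ("lyrics", "title"); the real schema may differ.
dataset = load_dataset("miscjose/genius-music")

def preprocess(batch):
    # Lyrics are the encoder input; titles act as the target summaries.
    model_inputs = tokenizer(batch["lyrics"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["title"], max_length=32, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True)
```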
### Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after this list):

- learning_rate: 4e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
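A minimal sketch of these settings as `Seq2SeqTrainingArguments` (transformers 4.31); the actual training script is not included with this card, and `output_dir` is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Placeholder output_dir; hyperparameters mirror the list above.
args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-genius",
    learning_rate=4e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    predict_with_generate=True,  # generate summaries during eval for ROUGE
)
```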
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| 7.9304 | 1.0 | 863 | 3.5226 | 14.235 | 6.78 | 14.206 | 14.168 |
| 3.8394 | 2.0 | 1726 | 3.0382 | 22.97 | 13.166 | 22.981 | 22.944 |
| 3.3799 | 3.0 | 2589 | 2.9010 | 24.932 | 14.54 | 24.929 | 24.919 |
| 3.2204 | 4.0 | 3452 | 2.8441 | 26.678 | 15.587 | 26.624 | 26.665 |
| 3.1498 | 5.0 | 4315 | 2.8363 | **26.827** | **15.696** | **26.773** | **26.793** |
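These ROUGE metrics can be computed with the 🤗 `evaluate` library; note that `evaluate` returns scores in the 0 to 1 range, while the table above appears to report them scaled by 100.

```python
import evaluate

rouge = evaluate.load("rouge")

# Example: score a generated title against a reference title.
predictions = ["thank u, next"]
references = ["thank u, next"]
print(rouge.compute(predictions=predictions, references=references))
# A perfect match yields {'rouge1': 1.0, 'rouge2': 1.0, 'rougeL': 1.0, 'rougeLsum': 1.0}
```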
### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.1
- Tokenizers 0.13.3