## Emo Bot

#### Model Description
This is a fine-tuned version of GPT-Neo-125M for generating music lyrics in the emo genre.

#### Training data
It was trained on 2,381 songs by 15 bands that were important to emo culture in the early 2000s, though not all of them played directly within the genre.

#### Training Procedure
It was fine-tuned using the Trainer class available in the Hugging Face Transformers library, with the hyperparameters below; a minimal sketch of the setup follows this list.

##### Learning Rate: **2e-4**
##### Epochs: **40**
##### Colab for Finetuning: https://colab.research.google.com/drive/1jwTYI1AygQf7FV9vCHTWA4Gf5i--sjsD?usp=sharing
##### Colab for Testing: https://colab.research.google.com/drive/1wSP4Wyr1-DTTNQbQps_RCO3ThhH-eeZc?usp=sharing
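
Since the exact notebook is linked above, the following is only a rough sketch of what Trainer-based fine-tuning could look like with these hyperparameters. The corpus file path, batch size, and pad-token handling are assumptions, not code taken from the notebook.

``` python
from transformers import (AutoTokenizer, AutoModelForCausalLM, Trainer,
                          TrainingArguments, TextDataset,
                          DataCollatorForLanguageModeling)

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neo-125M")

# The generation example below strips a '<|pad|>' token, so a pad token was
# presumably added during training; this step is an assumption
tokenizer.add_special_tokens({'pad_token': '<|pad|>'})
model.resize_token_embeddings(len(tokenizer))

# Hypothetical corpus file holding the 2381 lyrics, one song after another
train_dataset = TextDataset(tokenizer=tokenizer,
                            file_path='lyrics.txt',
                            block_size=128)

# Causal language modeling, so no masked-LM objective
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(output_dir='./gpt-neo-small-emo-lyrics',
                                  learning_rate=2e-4,             # from this card
                                  num_train_epochs=40,            # from this card
                                  per_device_train_batch_size=8)  # assumption

trainer = Trainer(model=model,
                  args=training_args,
                  train_dataset=train_dataset,
                  data_collator=data_collator)
trainer.train()
```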

#### Goals

My intention was purely educational, so I am making this version of the model available as an example for future use.

#### How to use
``` python
import re

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Run on the GPU when one is available
if torch.cuda.is_available():
    device = torch.device('cuda')
else:
    device = torch.device('cpu')
print(device)

tokenizer = AutoTokenizer.from_pretrained("HeyLucasLeao/gpt-neo-small-emo-lyrics")
model = AutoModelForCausalLM.from_pretrained("HeyLucasLeao/gpt-neo-small-emo-lyrics")
model.to(device)

# Encode the prompt and move it to the same device as the model
generated = tokenizer('I miss you', return_tensors='pt').input_ids.to(device)

# Generating texts
sample_outputs = model.generate(generated,
                                # Use sampling instead of greedy decoding
                                do_sample=True,
                                # Keep only the 10 tokens with the highest probability
                                top_k=10,
                                # Maximum sequence length
                                max_length=200,
                                # Keep only the most probable tokens with cumulative probability of 95%
                                top_p=0.95,
                                # Changes randomness of generated sequences
                                temperature=2.,
                                # Number of sequences to generate
                                num_return_sequences=3)

# Decoding and printing sequences
for i, sample_output in enumerate(sample_outputs):
    texto = tokenizer.decode(sample_output.tolist())
    # Strip the '<|pad|>' token; re.escape keeps '|' from acting as alternation
    regex_padding = re.sub(re.escape('<|pad|>'), '', texto)
    # Drop any stray '|' or '+' characters left behind
    regex_barra = re.sub(r'[|+]', '', regex_padding)
    # Collapse runs of spaces into a single space
    espaço = re.sub(r' +', ' ', regex_barra)
    # Collapse runs of blank lines into a single newline
    resultado = re.sub(r'\n{2,}', '\n', espaço)
    print(">> Text {}: {}".format(i + 1, resultado + '\n'))

""">> Text 1: I miss you
I miss you more than anything
And if you change your mind
I do it like a change of mind
I always do it like theeah
Everybody wants a surprise
Everybody needs to stay collected
I keep your locked and numbered
Use this instead: Run like the wind
Use this instead: Run like the sun
And come back down: You've been replaced
Don't want to be the same
Tomorrow
I don't even need your name
The message is on the way
make it while you're holding on
It's better than it is
Everything more security than a parade
Im getting security
angs the world like a damned soul
We're hanging on a queue
and the truth is on the way
Are you listening?
We're getting security
Send me your soldiers
We're getting blood on"""

""">> Text 2: I miss you
And I could forget your name
All the words we'd hear
You miss me
I need you
And I need you
You were all by my side
When we'd talk to no one
And I
Just to talk to you
It's easier than it has to be
Except for you
You missed my know-all
You meant to hug me
And I
Just want to feel you touch me
We'll work up
Something wild, just from the inside
Just get closer to me
I need you
You were all by my side
When we*d talk to you
, you better admit
That I'm too broken to be small
You're part of me
And I need you
But I
Don't know how
But I know I need you
Must"""

""">> Text 3: I miss you
And I can't lie
Inside my head
All the hours you've been through
If I could change your mind
I would give it all away
And I'd give it all away
Just to give it away
To you
Now I wish that I could change
Just to you
I miss you so much
If I could change
So much
I'm looking down
At the road
The one that's already been
Searching for a better way to go
So much I need to see it clear
topk wish me an ehive
I wish I wish I wish I knew
I can give well
In this lonely night

The lonely night
I miss you
I wish it well
If I could change
So much
I need you"""
```
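
As a shorter alternative, the same model can also be driven through the transformers `pipeline` helper. This is a minimal sketch that reuses the sampling parameters from the example above; it is not code from the original notebooks.

``` python
from transformers import pipeline

# device=0 selects the first GPU; use device=-1 to stay on the CPU
generator = pipeline('text-generation',
                     model='HeyLucasLeao/gpt-neo-small-emo-lyrics',
                     device=0)

outputs = generator('I miss you',
                    do_sample=True,
                    top_k=10,
                    top_p=0.95,
                    temperature=2.0,
                    max_length=200,
                    num_return_sequences=3)

for i, out in enumerate(outputs):
    print(">> Text {}: {}".format(i + 1, out['generated_text']))
```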