Commit History
3cedfeb  Added Tokenizer
f529912  update model card README.md (Jan van Doorn)
0814c0b  Model save (Jan van Doorn)
b39282f  Training in progress, step 12000 (Jan van Doorn)
50a9220  Training in progress, step 10000 (Jan van Doorn)
e06a69c  Training in progress, step 8000 (Jan van Doorn)
476332a  Training in progress, step 6000 (Jan van Doorn)
d820b01  Training in progress, step 4000 (Jan van Doorn)
5451c19  Training in progress, step 2000 (Jan van Doorn)
64ed143  Update README.md
9f27d33  update model card README.md (Jan van Doorn)
69b9764  End of training (Jan van Doorn)
f10113a  Training in progress, step 50000 (Jan van Doorn)
9a669c3  Training in progress, step 45000 (Jan van Doorn)
4219764  Training in progress, step 40000 (Jan van Doorn)
6f35e46  Training in progress, step 35000 (Jan van Doorn)
0f54a3c  Training in progress, step 30000 (Jan van Doorn)
e008a7c  Manual merge after manually adding tokenizer (Jan van Doorn)
b8bdec7  Training in progress, step 25000 (Jan van Doorn)
04eaf61  Added tokenizer
b79b0f2  Training in progress, step 20000 (Jan van Doorn)
9ae796a  Training in progress, step 15000 (Jan van Doorn)
2000b50  Training in progress, step 10000 (Jan van Doorn)
802b6ad  Training in progress, step 5000 (Jan van Doorn)
d6b588a  initial commit (Jan van Doorn)