---
license: unlicense
---
Trained on a single RTX 3090 in about 9 hours, at roughly 27 s/it with the default configuration of 1218 iterations.
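As a quick sanity check, the reported per-iteration speed and iteration count line up with the stated total training time:

```python
# Sanity check on the reported training stats:
# 1218 iterations at ~27 s/iteration should come to roughly 9 hours.
seconds_per_iter = 27
iterations = 1218
total_hours = seconds_per_iter * iterations / 3600
print(round(total_hours, 1))  # ~9.1 hours, consistent with the ~9 hours reported
```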
commit: a2607fa - https://github.com/tloen/alpaca-lora

These LoRA weights are for the 7B model.