LoRA adapter weights for the LLaMA 7B model, trained with alpaca-lora (commit a2607fa, https://github.com/tloen/alpaca-lora). Training ran on a single RTX 3090 and took about 9 hours at roughly 27 s/it, using the repo's default of 1218 iterations.
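
As a quick sanity check, the reported iteration count and speed are consistent with the ~9 hour wall-clock time:

```python
# Verify the reported training time from the card's numbers:
# 1218 iterations at ~27 s/it on the RTX 3090.
iterations = 1218        # alpaca-lora default step count
seconds_per_iter = 27    # observed speed on the 3090

hours = iterations * seconds_per_iter / 3600
print(f"{hours:.1f} h")  # ≈ 9.1 hours, matching the ~9 h reported
```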