---
library_name: peft
base_model: ai21labs/Jamba-v0.1
license: apache-2.0
datasets:
  - mhenrichsen/alpaca_2k_test
tags:
  - axolotl
---

# Jambalpaca-v0.1

This is a test run to fine-tune ai21labs/Jamba-v0.1 on an A100 80GB GPU using Axolotl and a custom version of LazyAxolotl, with mhenrichsen/alpaca_2k_test as the dataset. I had to quantize the base model in 8-bit precision because of how merging models with adapters works. Oddly, this step wasn't needed in a notebook version I created. I also pushed the adapter so that I (or someone else) can do a better merge later.
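For reference, merging a PEFT adapter back into its base model typically looks like the sketch below. This is a generic illustration, not the exact script used here: the adapter repo id (this repo, `mlabonne/Jambalpaca-v0.1`) and the output directory are assumptions, and loading the base in 8-bit as mentioned above may be needed if the full-precision weights don't fit in memory.

```python
# Minimal sketch: attach the pushed LoRA adapter to the base model and
# fold its weights in with merge_and_unload(). Adjust ids/paths as needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "ai21labs/Jamba-v0.1"
adapter_id = "mlabonne/Jambalpaca-v0.1"  # this repo's adapter (assumed id)

# Load the base model in full precision. If memory is tight, this is the
# step where the 8-bit quantization workaround mentioned above comes in.
base = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
)

# Attach the adapter, then merge its weights into the base model.
model = PeftModel.from_pretrained(base, adapter_id)
merged = model.merge_and_unload()

# Save the standalone merged model alongside the tokenizer.
tokenizer = AutoTokenizer.from_pretrained(base_id)
merged.save_pretrained("Jambalpaca-v0.1-merged")
tokenizer.save_pretrained("Jambalpaca-v0.1-merged")
```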

If you're interested, let me know and I can give you access to Jamba's version of LazyAxolotl.