---
license: mit
---

This repo contains a low-rank adapter for LLaMA-7b fit on the [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca) dataset.

It doesn't contain the foundation model itself, so it's MIT licensed!

Instructions for running it can be found at https://github.com/tloen/alpaca-lora.
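As a rough sketch (the linked repo has the authoritative instructions), loading a LoRA adapter like this one on top of the LLaMA-7b base weights typically uses the `transformers` and `peft` libraries. The base-model ID, adapter ID, and prompt template below are assumptions for illustration, not taken from this card:

```python
# Minimal sketch: attach a LoRA adapter to LLaMA-7b with peft + transformers.
# The model IDs and prompt format below are assumptions; see
# https://github.com/tloen/alpaca-lora for the supported setup.
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model_id = "decapoda-research/llama-7b-hf"  # assumed LLaMA-7b base weights
adapter_id = "tloen/alpaca-lora-7b"              # assumed ID of this adapter repo

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(base_model_id)
model = PeftModel.from_pretrained(model, adapter_id)  # load the LoRA weights

# Alpaca-style instruction prompt (format assumed for this example)
prompt = "### Instruction:\nName the capital of France.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```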