---
license: mit
---

This repo contains a low-rank adapter for LLaMA-7b fit on the [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca) dataset. It doesn't contain the foundation model itself, so it's MIT licensed! The adapter was trained with a Catalan translation of the cleaned Alpaca dataset.
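
Below is a minimal usage sketch showing how an adapter like this is typically loaded on top of a LLaMA-7b base model with the PEFT library. The repository IDs (`path/to/llama-7b-hf` and `path/to/this-adapter-repo`) are placeholders, not the actual model names, and the generation settings are illustrative only.

```python
# Sketch: load the base LLaMA-7b model and apply this low-rank adapter with PEFT.
# Repo IDs below are placeholders; substitute the actual base model and adapter paths.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "path/to/llama-7b-hf"        # placeholder: LLaMA-7b foundation weights
adapter_id = "path/to/this-adapter-repo"     # placeholder: this LoRA adapter

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(base_model_id, torch_dtype=torch.float16)

# Wrap the base model with the low-rank adapter weights.
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

# Example prompt in Catalan, matching the adapter's training data.
prompt = "Explica breument què és una xarxa neuronal."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the foundation model weights must be obtained separately under their own license; only the adapter weights are distributed here.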