---
license: mit
---
This repo contains a low-rank adapter (LoRA) for LLaMA-7B,
fine-tuned on the Stanford Alpaca dataset translated into Japanese.
It does not contain the foundation model itself, which is why it can be released under the MIT license.
Instructions for running it can be found at https://github.com/kunishou/Japanese-Alpaca-LoRA.
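
Below is a minimal sketch of how a LoRA adapter like this one is typically loaded on top of a LLaMA-7B base model with Hugging Face `transformers` and `peft`. The base-model path, the adapter repo ID, and the prompt text are placeholders, not values taken from this repo; follow the instructions at the link above for the exact setup and prompt template.

```python
# Minimal sketch (not the official instructions): apply a LoRA adapter to LLaMA-7B with PEFT.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "path/to/llama-7b"            # placeholder: your separately licensed LLaMA-7B weights
adapter_id = "path/to/japanese-alpaca-lora"   # placeholder: this adapter repo

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(base_model_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(model, adapter_id)  # merge in the low-rank adapter weights
model.eval()

# Illustrative prompt only; see the linked repository for the actual Alpaca-style template.
prompt = "### 指示:\n日本の首都はどこですか?\n\n### 回答:\n"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```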