---
license: cc-by-nc-4.0
---

# Project Baize


## What's Baize?

Baize is an open-source chat model fine-tuned with [LoRA](https://github.com/microsoft/LoRA). It uses 100k dialogs generated by letting ChatGPT chat with itself. We also use Alpaca's data to improve its performance. This repo contains the 7B model.

## Why is it called Baize?

Baize (白泽) is a mythical creature in Chinese folklore who speaks human languages and knows everything. This is exactly what we expect from a chat model.

## Training Parameters

- Base Model: [LLaMA-7B](https://arxiv.org/pdf/2302.13971.pdf)
- Training Epochs: 1
- Batch Size: 64
- Maximum Input Length: 512
- Learning Rate: 2e-4
- LoRA Rank: 8
- Updated Modules: All Linears

## Training Dataset

- [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca) (51,942)
- [Quora Dialogs](https://github.com/project-baize/baize) (54,456)
- [StackOverflow Dialogs](https://github.com/project-baize/baize) (57,046)

More details can be found in the Baize [GitHub](https://github.com/project-baize/baize) repository.
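The LoRA setup above (rank 8, applied to all linear layers) replaces full fine-tuning with a low-rank additive update: a frozen weight matrix `W` is adjusted by `(alpha / r) * B @ A`, where `B` and `A` are the small trained matrices. The sketch below is purely illustrative (it is not Baize's training code, and the function names are hypothetical); it shows in plain Python how such an update merges into a weight matrix.

```python
# Illustrative sketch of a LoRA weight merge (not Baize's actual code).
# W: d_out x d_in frozen weight; B: d_out x r; A: r x d_in; r is the LoRA rank.

def matmul(X, Y):
    """Naive matrix product of two nested-list matrices."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)] for row in X]

def lora_merge(W, A, B, alpha, r):
    """Return W + (alpha / r) * B @ A, the merged LoRA update."""
    scale = alpha / r
    delta = matmul(B, A)  # low-rank update, shape d_out x d_in
    return [[w + scale * d for w, d in zip(w_row, d_row)]
            for w_row, d_row in zip(W, delta)]
```

For example, with a rank-1 update (`r=1`), `B` is a column vector and `A` a row vector, so the trained update `B @ A` stores `d_out + d_in` numbers instead of `d_out * d_in`; Baize applies the same idea with rank 8 across every linear layer of LLaMA-7B.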