Reproduction with OpenLLaMA-13B

#2
by michaelfeil - opened

Quick question:
Is the base model based on OpenLLaMA (Apache 2.0) or on Meta's LLaMA weights (LLaMA special license)?

OpenChat org

It is based on Meta's LLaMA weights. We plan to train an OpenLLaMA-based model in the future.

OpenChat org

If you'd like to train an OpenLLaMA-based reproduction, you can add the EOT token following ochat/scripts/llama_convert_and_add_eot_token.py, then train directly using our provided scripts. We welcome your contributions!
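
For reference, the general idea of "adding an EOT token" can be sketched with Hugging Face transformers. This is not the actual ochat/scripts/llama_convert_and_add_eot_token.py script, just a minimal illustration; the checkpoint name, the EOT token string, and the output directory below are assumptions for the example.

```python
# Minimal sketch (not the OpenChat script): add an end-of-turn (EOT) special
# token to an OpenLLaMA checkpoint and resize the embeddings accordingly.
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_MODEL = "openlm-research/open_llama_13b"  # assumed OpenLLaMA-13B checkpoint
EOT_TOKEN = "<|end_of_turn|>"                  # assumed EOT token string
OUTPUT_DIR = "open_llama_13b_with_eot"         # hypothetical output path

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Register the EOT token as an additional special token; if it was actually
# added (i.e. not already present), grow the embedding matrix so the new
# token gets its own (randomly initialized) row.
num_added = tokenizer.add_special_tokens({"additional_special_tokens": [EOT_TOKEN]})
if num_added > 0:
    model.resize_token_embeddings(len(tokenizer))

tokenizer.save_pretrained(OUTPUT_DIR)
model.save_pretrained(OUTPUT_DIR)
```

The saved directory could then be passed as the base model to the provided training scripts.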
