GeneralLLM | Is it Llama 1 or 2?
#1 opened by jordanparker6
Hey is the GeneralLLM in the paper Llama 1 or 2?
Hi, it is LLaMA 1.
Thanks. Is the training and benchmarking code public? I'd love to re-run the training and evaluation with the new Mistral-7B and see how the results compare.
Thanks for your interest in our work! We plan to release the benchmarking code in the near future. For now, you may use our data-transferring code to pre-process your data, and follow the official guide on HuggingFace 🤗 to continue pre-training the Mistral-7B model.
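A minimal sketch of what that continued pre-training step could look like with the HuggingFace `transformers` Trainer. The corpus file name, model revision, and all hyperparameters below are placeholder assumptions for illustration, not values from the paper:

```python
def group_texts(token_ids, block_size=2048):
    """Concatenate token ids and split into fixed-length blocks,
    dropping the trailing remainder (standard causal-LM packing)."""
    total = (len(token_ids) // block_size) * block_size
    return [token_ids[i:i + block_size] for i in range(0, total, block_size)]


def main():
    # Heavy imports kept inside main() so the packing helper above
    # can be used without pulling in the full training stack.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    model_name = "mistralai/Mistral-7B-v0.1"  # assumed base checkpoint
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # "domain_corpus.txt" is a hypothetical pre-processed domain corpus,
    # one document per line.
    raw = load_dataset("text", data_files={"train": "domain_corpus.txt"})
    tokenized = raw["train"].map(
        lambda batch: tokenizer(batch["text"]),
        batched=True, remove_columns=["text"],
    )

    # Pack all token ids into fixed-length training blocks.
    all_ids = [tid for row in tokenized["input_ids"] for tid in row]
    blocks = group_texts(all_ids)
    train_dataset = [{"input_ids": b} for b in blocks]

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="mistral-7b-domain",  # placeholder path
            per_device_train_batch_size=1,
            gradient_accumulation_steps=16,
            learning_rate=1e-5,
            num_train_epochs=1,
        ),
        train_dataset=train_dataset,
        # mlm=False gives standard next-token (causal LM) labels.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()


if __name__ == "__main__":
    main()
```

In practice you would run this on multiple GPUs (e.g. via `accelerate launch` or DeepSpeed) since a 7B model does not fit a single consumer card in full precision.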
AdaptLLM changed discussion status to closed