why is it called "mini"?

#4
by chansung - opened


Because, according to the Orca paper
https://arxiv.org/abs/2306.02707
Orca 13B was trained on a dataset of ~6M samples (5M GPT-3.5 + 1M GPT-4).
Ours is only ~125K, combining WizardLM, Alpaca, and Dolly-V2. Please see the dataset and model cards for more details.
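For reference, here is a minimal sketch of how one could combine those three datasets with the Hugging Face `datasets` library and check the total count. The repo IDs, column names, and the `to_pair` helper are assumptions for illustration only; the model card documents the actual data preparation.

```python
from datasets import load_dataset, concatenate_datasets

def to_pair(example, instruction_key, response_key):
    # Normalize every source to a shared {instruction, response} schema
    # so the datasets can be concatenated.
    return {"instruction": example[instruction_key],
            "response": example[response_key]}

# (repo id, instruction column, response column) -- assumed repos/columns
sources = [
    ("WizardLM/WizardLM_evol_instruct_70k", "instruction", "output"),
    ("tatsu-lab/alpaca", "instruction", "output"),
    ("databricks/databricks-dolly-15k", "instruction", "response"),
]

parts = []
for repo, ikey, rkey in sources:
    ds = load_dataset(repo, split="train")
    ds = ds.map(lambda ex: to_pair(ex, ikey, rkey),
                remove_columns=ds.column_names)
    parts.append(ds)

combined = concatenate_datasets(parts)
print(len(combined))  # on the order of ~125K, depending on the exact subsets used
```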

pankajmathur changed discussion status to closed
