This model was trained with the LoRA technique on the Bulgarian recipes dataset from: https://www.kaggle.com/datasets/auhide/bulgarian-recipes-dataset/
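LoRA keeps the pretrained weights frozen and trains only a small low-rank update on top of them. A minimal NumPy sketch of the idea (the dimensions and initialization here are illustrative, not this model's actual configuration):

```python
import numpy as np

# LoRA (Low-Rank Adaptation): freeze the base weight W and learn a
# low-rank update B @ A, so the adapted layer computes (W + B @ A) @ x.
# Dimensions are hypothetical; LLaMA 2 7B layers are much larger.
d_in, d_out, rank = 16, 16, 2

rng = np.random.default_rng(0)
W = rng.normal(size=(d_out, d_in))          # frozen pretrained weight
A = rng.normal(size=(rank, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, rank))                 # trainable up-projection, init zero

x = rng.normal(size=(d_in,))

# With B initialized to zero, the adapter starts as a no-op,
# so the adapted model initially matches the base model exactly:
base = W @ x
adapted = (W + B @ A) @ x
print(np.allclose(base, adapted))  # True before any training
```

Only `A` and `B` (2 * rank * d parameters per layer) are updated during fine-tuning, which is why LoRA is cheap enough to run on a quantized base model.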
This is a 4-bit quantized version of the 16-bit LLaMA 2 7B model.
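4-bit quantization stores each weight in 4 bits plus a shared scale, trading a small amount of precision for a roughly 4x memory saving over 16-bit weights. A simplified absmax round-trip sketch (illustrative only; the actual checkpoint likely uses a more sophisticated scheme such as NF4 or GPTQ):

```python
import numpy as np

def quantize_4bit(w):
    # Absmax quantization: map weights to signed 4-bit integers in [-7, 7].
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights from the 4-bit codes.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
w = rng.normal(size=8).astype(np.float32)
q, scale = quantize_4bit(w)
w_hat = dequantize(q, scale)

# Per-weight reconstruction error is bounded by half a quantization step:
print(np.max(np.abs(w - w_hat)) <= scale / 2 + 1e-6)  # True
```

In practice the scale is stored per block of weights rather than per tensor, which keeps the error bound tight even for large layers.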