---
license: bigscience-bloom-rail-1.0
language:
- ar
library_name: transformers
pipeline_tag: text-generation
---
This Hugging Face repository hosts a low-rank adapter (LoRA) for fine-tuning the bloom-7b model on Arabic instructions. Additional information about the training datasets will be made available soon. The adapter was trained using the codebase at https://github.com/KhalidAlt/alpaca-lora/tree/hf_models, which builds on https://github.com/tloen/alpaca-lora with modifications to accommodate bloom-7b.
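Since this repository contains only the adapter weights, loading it for inference means attaching it to the base model with the `peft` library. The sketch below is a hypothetical usage example: the adapter id `your-username/bloom-7b-arabic-lora` is a placeholder (substitute this repository's actual id), and `bigscience/bloom-7b1` is assumed to be the intended base model.

```python
def load_bloom_with_adapter(adapter_id="your-username/bloom-7b-arabic-lora"):
    """Load the bloom-7b base model and attach the LoRA adapter.

    `adapter_id` is a placeholder; replace it with this repo's id.
    Imports are deferred because transformers and peft are heavy dependencies.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base_id = "bigscience/bloom-7b1"  # assumed base model for "bloom-7b"
    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base_model = AutoModelForCausalLM.from_pretrained(base_id)
    # Wrap the base model with the low-rank adapter weights from the Hub
    model = PeftModel.from_pretrained(base_model, adapter_id)
    model.eval()
    return model, tokenizer
```

Generation then works as with any `transformers` causal LM: tokenize an Arabic instruction prompt, call `model.generate`, and decode the output with the tokenizer.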