---
language:
- en
- ar
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
base_model: unsloth/llama-3-8b-Instruct-bnb-4bit
datasets:
- ahmedheakl/arzen-llm-dataset
metrics:
- bleu
- meteor
- bertscore
library_name: transformers
---
Please see the paper and code for more information.
## Citation

**BibTeX:**

```bibtex
@article{heakl2024arzen,
  title={ArzEn-LLM: Code-Switched Egyptian Arabic-English Translation and Speech Recognition Using LLMs},
  author={Heakl, Ahmed and Zaghloul, Youssef and Ali, Mennatullah and Hossam, Rania and Gomaa, Walid},
  journal={arXiv preprint arXiv:2406.18120},
  year={2024}
}
```
## Model Card Authors

- Email: ahmed.heakl@ejust.edu.eg
- LinkedIn: https://linkedin.com/in/ahmed-heakl
## Uploaded model

- Developed by: ahmedheakl
- License: apache-2.0
- Finetuned from model: unsloth/llama-3-8b-Instruct-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
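Since the model is fine-tuned from a Llama-3 Instruct base, inputs should follow the Llama-3 chat format. A minimal sketch of building such a prompt by hand (the system instruction and the code-switched example input below are illustrative, not taken from the paper; in practice `tokenizer.apply_chat_template` from `transformers` produces the same layout):

```python
def build_llama3_prompt(system: str, user: str) -> str:
    """Assemble a Llama-3 Instruct prompt string.

    Llama-3's chat format wraps each role's message in header tokens
    and terminates it with <|eot_id|>; generation continues after the
    assistant header.
    """
    return (
        "<|begin_of_text|>"
        f"<|start_header_id|>system<|end_header_id|>\n\n{system}<|eot_id|>"
        f"<|start_header_id|>user<|end_header_id|>\n\n{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )


# Hypothetical translation instruction and code-switched input for illustration.
prompt = build_llama3_prompt(
    "Translate the code-switched Egyptian Arabic-English sentence to English.",
    "ana ra7 el meeting delwa2ty",
)
print(prompt)
```

The same structure is what `AutoTokenizer.from_pretrained(...).apply_chat_template(messages, add_generation_prompt=True)` would emit for a two-message conversation, so prefer the tokenizer's template in real inference code.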