---
language:
- en
- fr
- yo
- pt
- sw
- zu
- ar
- af
- ig
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
base_model: unsloth/llama-3-8b-bnb-4bit
fine_tuned: vutuka/llama-3-8b-african-alpaca-4bit
datasets:
- vutuka/aya_african_alpaca
metrics:
- accuracy
---

# Model Description

This model is a multilingual text-generation model that extends its base model with knowledge and nuances from a variety of African languages. It is a step toward making AI accessible and effective for processing and generating text in languages that have historically been underrepresented in data-driven technologies.

# Uploaded Model Details

- **Developed by:** vutuka
- **License:** apache-2.0
- **Base model:** unsloth/llama-3-8b-bnb-4bit
- **Fine-tuned model:** vutuka/llama-3-8b-african-alpaca-4bit
- **Dataset used for fine-tuning:** vutuka/aya_african_alpaca
- **Intended use:** The model is intended for text generation tasks where understanding and generating African languages is crucial. It can serve as a tool for researchers, developers, and linguists working on African language processing.

# Training Procedure

This Llama 3 model was fine-tuned using Unsloth's accelerated training techniques, achieving up to 2x faster training than conventional methods. Fine-tuning leveraged the capabilities of Hugging Face's TRL (Transformer Reinforcement Learning) library and was optimized for both efficiency and performance.
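The setup described above can be sketched as follows. This is a minimal illustration assuming the Unsloth `FastLanguageModel` API and TRL's `SFTTrainer`; the LoRA settings and other hyperparameters are illustrative assumptions, not the values actually used for this model.

```python
def format_example(row: dict) -> str:
    """Map one dataset row (assumed Alpaca-style fields: instruction,
    input, output) to a single training string."""
    text = f"### Instruction:\n{row['instruction']}\n\n"
    if row.get("input"):
        text += f"### Input:\n{row['input']}\n\n"
    return text + f"### Response:\n{row['output']}"


if __name__ == "__main__":
    # Heavy imports kept here so the formatting helper stays importable
    # without a GPU environment.
    from datasets import load_dataset
    from trl import SFTTrainer
    from unsloth import FastLanguageModel

    # Load the 4-bit base model via Unsloth's accelerated loader.
    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name="unsloth/llama-3-8b-bnb-4bit",
        max_seq_length=2048,
        load_in_4bit=True,
    )

    # Attach LoRA adapters (illustrative rank/targets).
    model = FastLanguageModel.get_peft_model(
        model,
        r=16,
        lora_alpha=16,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    )

    # Build the Alpaca-formatted training text column.
    dataset = load_dataset("vutuka/aya_african_alpaca", split="train")
    dataset = dataset.map(lambda row: {"text": format_example(row)})

    trainer = SFTTrainer(
        model=model,
        tokenizer=tokenizer,
        train_dataset=dataset,
        dataset_text_field="text",
        max_seq_length=2048,
    )
    trainer.train()
```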

# Performance and Metrics

The model's performance has been evaluated using accuracy as the key metric. Further evaluation details will be published as the model sees broader use across diverse scenarios.

# How to Use

This model is compatible with the Hugging Face Transformers library and can be used for a variety of text-generation tasks. Detailed usage instructions will be provided so users can fully leverage its capabilities.
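In the meantime, a minimal loading-and-generation sketch with the Transformers library looks like the following. The repo id comes from this card; the prompt template and generation settings are illustrative assumptions (the Alpaca-style format matches the fine-tuning dataset's name, but check the dataset for the exact template).

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a prompt in the assumed Alpaca style of the fine-tuning data."""
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if input_text:
        prompt += f"### Input:\n{input_text}\n\n"
    return prompt + "### Response:\n"


if __name__ == "__main__":
    # Model loading kept under the main guard: it downloads ~8B weights.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "vutuka/llama-3-8b-african-alpaca-4bit"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_alpaca_prompt("Translate 'Good morning' into Swahili.")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```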

# Acknowledgments

We would like to acknowledge the contributors to the Unsloth project and the maintainers of the Hugging Face Transformers library for the tools that made this work possible. Special thanks to the community for providing the dataset and contributing to the fine-tuning process.

# Disclaimer

This model is released under the Apache 2.0 license, which includes a limitation of liability. While the model has been fine-tuned to generate text in multiple African languages, users should be aware of the potential for biases inherent in any language model and exercise caution in its application.