---
license: apache-2.0
language:
- ar
pipeline_tag: text-generation
tags:
- arabic
- text-generation
widget:
- text: "أعلنت وزارة الحج في المملكة العربية السعودية"
  example_title: "Example 1"
- text: "يبدو اليوم جميلا، سأقوم بتحضير"
  example_title: "Example 2"
- text: "إن التقنيات الحديثة"
  example_title: "Example 3"
---
# ArabianGPT Model Overview

## Disclaimer for the Use of Large Language Models (LLMs) for Text Generation

<p style="color: red;">We disclaim all responsibility for any harm, inaccuracies, or inappropriate content generated by ArabianGPT-0.1B, and users engage with and apply the model's outputs at their own risk.</p>

> **Important Note:** Currently, we offer a raw pre-trained model. Our team is actively working on releasing instruction-based LLMs that are fine-tuned and augmented with RLHF. The first set of pre-trained models has been made available for community exploration. While we do have models fine-tuned for specific tasks such as summarization and sentiment analysis, they are still in the development phase.


## How Can You Use This Pre-Trained Model?
You are invited to use this pre-trained, native Arabic language model as an experimental tool: assess its capabilities, help fine-tune it, and evaluate its performance across a variety of downstream tasks. We encourage you to review our technical report for a comprehensive account of the model's performance metrics and the downstream tasks on which it has been tested; this offers valuable insight into its applicability and effectiveness in diverse applications.


## Introduction
ArabianGPT-0.1B, developed under the ArabianLLM initiatives, is a specialized GPT-2 model optimized for Arabic language modeling. 
It's a product of the collaborative efforts at Prince Sultan University's Robotics and Internet of Things Lab, focusing on enhancing natural language modeling and generation in Arabic. 
This model represents a significant stride in LLM research, specifically addressing the linguistic complexities and nuances of the Arabic language.

## Key Features
- **Architecture**: GPT-2
- **Model Size**: 134 million parameters
- **Layers**: 12
- **Attention Heads per Layer**: 12
- **Context Window Size**: 768 tokens
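
These figures can be confirmed from the model's published configuration. A minimal sketch, assuming the `riotu-lab/ArabianGPT-01B` repo ID used in the Usage section below:

```python
from transformers import AutoConfig

# Fetch the model configuration from the Hugging Face Hub
config = AutoConfig.from_pretrained("riotu-lab/ArabianGPT-01B")

print(config.n_layer)      # transformer layers (expected: 12)
print(config.n_head)       # attention heads per layer (expected: 12)
print(config.n_positions)  # context window size in tokens (expected: 768)
print(config.vocab_size)   # vocabulary size (expected: ~64K, matching Aranizer)
```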

## Training
- **Dataset**: Scraped Arabic newspaper articles
- **Data Size**: 15.5 GB
- **Words**: 237.8 million
- **Tokenizer**: Aranizer 64K
- **Tokens**: Over 1.75 billion
- **Hardware**: 2 NVIDIA A100 GPUs
- **Training Scale**: 7.5 million examples
- **Training Duration**: 3 days
- **Performance**: Final loss of 3.97
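
For context, if the reported final loss is the mean per-token cross-entropy in nats (the usual convention), it corresponds to a training perplexity of roughly 53:

```python
import math

# Perplexity = exp(cross-entropy); assumes the reported 3.97 is the
# mean per-token cross-entropy loss in nats
final_loss = 3.97
print(math.exp(final_loss))  # ≈ 52.98
```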


## Role in ArabianLLM Initiatives
ArabianGPT-0.1B (Base Model) is crucial for advancing Arabic language processing, addressing challenges unique to Arabic morphology and dialects.

## Usage
The model is suitable for Arabic text-generation tasks. Example usage with the Transformers `pipeline` API:
```python
from transformers import pipeline

# Load ArabianGPT-0.1B through the text-generation pipeline
pipe = pipeline("text-generation", model="riotu-lab/ArabianGPT-01B", max_new_tokens=512)

# Prompt the model with the start of an Arabic sentence (widget example above)
text = "أعلنت وزارة الحج في المملكة العربية السعودية"
print(pipe(text))
```
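
For finer control over decoding, the tokenizer and model can also be loaded directly. A hedged sketch using the standard `AutoTokenizer`/`AutoModelForCausalLM` interfaces; the sampling parameters are illustrative, not tuned values from our report:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("riotu-lab/ArabianGPT-01B")
model = AutoModelForCausalLM.from_pretrained("riotu-lab/ArabianGPT-01B")

# Encode an Arabic prompt (widget example above) and sample a continuation
inputs = tokenizer("إن التقنيات الحديثة", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,  # illustrative; adjust as needed
    do_sample=True,
    top_p=0.9,          # nucleus sampling; illustrative value
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```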

## Limitations and Ethical Considerations

- The model may exhibit limitations in context understanding or text generation in certain scenarios.
- We emphasize ethical use to prevent the propagation of misinformation or harmful content.

## Acknowledgments

Special thanks to Prince Sultan University, particularly the Robotics and Internet of Things Lab.

## Contact Information

For inquiries: [riotu@psu.edu.sa](mailto:riotu@psu.edu.sa).
