
# Classification of Arabic News Using AraBERT

One of the best-known models that applies Transformer networks to Arabic text classification is BERT (Bidirectional Encoder Representations from Transformers). BERT is pretrained on a large and diverse corpus, including Arabic, which allows it to capture rich linguistic relationships. BERT-based models pretrained specifically on Arabic, most notably AraBERT, have been developed to improve classification performance: because they are trained on large Arabic-specific corpora, they achieve strong results on Arabic classification tasks. The use of Transformer networks for Arabic classification remains an active area of research and development, with researchers and engineers working to improve existing models and to develop new techniques that address the challenges of Arabic and raise classification accuracy.
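As a rough illustration, the sketch below loads an AraBERT checkpoint with a classification head via the Hugging Face transformers library. The base checkpoint name is the public AraBERT v2 model; the six-label head is an assumption matching the label set listed below.

```python
# Minimal sketch: loading an AraBERT checkpoint with a classification head
# using the Hugging Face transformers library. The six-label head is an
# assumption matching the label set listed below.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "aubmindlab/bert-base-arabertv2"  # public AraBERT v2 checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=6,  # sports, politics, economy, technology, local, cultural
)
```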

Google Scholar has our BibTeX entry wrong (an author name is missing); use this instead:

```bibtex
@inproceedings{antoun2020arabert,
  title={AraBERT: Transformer-based Model for Arabic Language Understanding},
  author={Antoun, Wissam and Baly, Fady and Hajj, Hazem},
  booktitle={LREC 2020 Workshop Language Resources and Evaluation Conference 11--16 May 2020},
  pages={9}
}
```

## Dataset

The training data contains 5,000 articles per category (30,000 in total):

| Category   | Articles |
|------------|----------|
| Local      | 5,000    |
| Sports     | 5,000    |
| Politics   | 5,000    |
| Economy    | 5,000    |
| Cultural   | 5,000    |
| Technology | 5,000    |

## Label mapping

| ID      | Label (Arabic) | English    |
|---------|----------------|------------|
| label_0 | رياضية         | Sports     |
| label_1 | سياسية         | Politics   |
| label_2 | اقتصاد         | Economy    |
| label_3 | تكنولوجيا      | Technology |
| label_4 | محلية          | Local      |
| label_5 | ثقافية         | Cultural   |
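In code, this mapping can be expressed as a pair of dictionaries; the snippet below is a hypothetical sketch mirroring the table above, and the commented-out lines assume the `model` object from the earlier loading example.

```python
# Hypothetical id2label/label2id mapping mirroring the table above;
# attaching it to the model config makes predictions human-readable.
id2label = {
    0: "رياضية",      # Sports
    1: "سياسية",      # Politics
    2: "اقتصاد",      # Economy
    3: "تكنولوجيا",   # Technology
    4: "محلية",       # Local
    5: "ثقافية",      # Cultural
}
label2id = {label: idx for idx, label in id2label.items()}

# model.config.id2label = id2label
# model.config.label2id = label2id
```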

## Training parameters

| Parameter             | Value |
|-----------------------|-------|
| Training batch size   | 8     |
| Evaluation batch size | 8     |
| Learning rate         | 2e-5  |
| Max target length     | 203   |
| Epochs                | 1     |
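A sketch of how these hyperparameters map onto a transformers `TrainingArguments` setup is shown below; the output directory, dataset variables, and commented-out `Trainer` call are assumptions added for illustration.

```python
# Fine-tuning setup using the hyperparameters listed above. Everything
# beyond those values (output path, dataset variables) is an assumption.
from transformers import Trainer, TrainingArguments

# The max target length of 203 would be applied at tokenization time, e.g.:
# encodings = tokenizer(texts, truncation=True, max_length=203)

args = TrainingArguments(
    output_dir="arabert-news-classifier",  # hypothetical output path
    per_device_train_batch_size=8,         # training batch size
    per_device_eval_batch_size=8,          # evaluation batch size
    learning_rate=2e-5,
    num_train_epochs=1,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds,  # hypothetical tokenized splits
#                   eval_dataset=val_ds)
# trainer.train()
```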

## Results

- Training loss: 0.21533072472327064
- Classification accuracy: 0.1619285045662197
- Validation accuracy: 0.9664634146341463
Model size: 135M parameters (F32 tensors, Safetensors format).
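Once fine-tuned, the checkpoint can be used for inference. A minimal sketch with the transformers `pipeline` API follows; the model identifier is a placeholder for this fine-tuned checkpoint, and the printed output is only illustrative.

```python
# Quick inference sketch using the transformers pipeline API. The model
# identifier is a placeholder, and the sample output is illustrative only.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="arabert-news-classifier",  # hypothetical checkpoint path
)

# "Oil prices rose in global markets today" -> expected label: Economy
print(classifier("ارتفعت أسعار النفط في الأسواق العالمية اليوم"))
# e.g. [{'label': 'اقتصاد', 'score': 0.97}]
```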