---
library_name: peft
base_model: google/mt5-base
license: apache-2.0
language:
- ar
pipeline_tag: summarization
tags:
- summarization
- mt5
- pytorch
- transformers
---
# Mojiz

Mojiz is a fine-tuned MT5 model for Arabic summarization.
## Model Description

## Usage
```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM

# Load the adapter configuration, the base model, then apply the adapter weights.
config = PeftConfig.from_pretrained("ahmedabdelwahed/sft-base-12-epochs")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-base")
model = PeftModel.from_pretrained(model, "ahmedabdelwahed/sft-base-12-epochs")
```
## Direct Use

## Training Details

### Training Data
[More Information Needed]
### Training Procedure

#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- Training regime: [More Information Needed]
[More Information Needed]
## Evaluation

### Metrics
[More Information Needed]
### Results
[More Information Needed]
## Model Examination [optional]

## Citation [optional]

## Framework versions
- PEFT 0.7.1