• Developed by: Mahmoud Ibrahim

• How to use:

!pip install transformers accelerate bitsandbytes
from transformers import AutoTokenizer, AutoModelForCausalLM
from IPython.display import Markdown
import textwrap 

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained("MahmoudIbrahim/Mistral_12b_Arabic")
model = AutoModelForCausalLM.from_pretrained("MahmoudIbrahim/Mistral_12b_Arabic", load_in_4bit=True)
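# Note: recent transformers releases prefer passing a quantization config instead of
# load_in_4bit directly. A hedged alternative sketch (not required for this example):
#
# from transformers import BitsAndBytesConfig
# model = AutoModelForCausalLM.from_pretrained(
#     "MahmoudIbrahim/Mistral_12b_Arabic",
#     quantization_config=BitsAndBytesConfig(load_in_4bit=True),
#     device_map="auto",
# )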

alpaca_prompt = """فيما يلي تعليمات تصف مهمة، إلى جانب مدخل يوفر سياقاً إضافياً. اكتب استجابة تُكمل الطلب بشكل مناسب.

### التعليمات:
{}

### الاستجابة:
{}"""

# Format the prompt with an instruction and an empty output placeholder
formatted_prompt = alpaca_prompt.format(
    # Instruction (Arabic): "How can the Egyptian government and society as a whole
    # strengthen the country's ability to achieve sustainable development?"
    "كيف يمكن للحكومة المصرية والمجتمع ككل أن يعززوا من قدرة البلاد على تحقيق التنمية المستدامة؟",
    ""  # Leave the output blank for generation
)

# Tokenize the formatted prompt and move it to the model's device (GPU when loaded in 4-bit)
input_ids = tokenizer.encode(formatted_prompt, return_tensors="pt").to(model.device)

def to_markdown(text):
    """Render generated text as an indented Markdown blockquote."""
    text = text.replace('•', '*')
    return Markdown(textwrap.indent(text, '> ', predicate=lambda _: True))

# Generate text (do_sample=True is required for top_k / top_p / temperature to take effect)
output = model.generate(
    input_ids,
    max_length=128,            # Maximum total length (prompt + generated tokens); adjust as needed
    num_return_sequences=1,    # Number of generated responses
    no_repeat_ngram_size=2,    # Prevent repeated 2-grams
    do_sample=True,            # Enable sampling so the settings below are used
    top_k=50,                  # Keep only the 50 most likely tokens at each step
    top_p=0.9,                 # Nucleus sampling
    temperature=0.7,           # Control creativity level
)

generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
to_markdown(generated_text)

The model's response:

[image: screenshot of the model's generated Arabic answer]
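Optionally, the steps above can be wrapped into a single reusable helper that also streams tokens to the console as they are generated. This is only a sketch under stated assumptions: generate_response is a name introduced here for illustration, it assumes the model and tokenizer were loaded as shown earlier, and TextStreamer is the standard transformers streaming utility.

from transformers import TextStreamer

def generate_response(instruction, max_new_tokens=256):
    """Fill the Alpaca-style template, generate with sampling, and stream tokens to stdout."""
    prompt = alpaca_prompt.format(instruction, "")
    input_ids = tokenizer.encode(prompt, return_tensors="pt").to(model.device)
    streamer = TextStreamer(tokenizer, skip_prompt=True)  # print new tokens as they arrive
    output = model.generate(
        input_ids,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_k=50,
        top_p=0.9,
        temperature=0.7,
        no_repeat_ngram_size=2,
        streamer=streamer,
    )
    # Return only the newly generated tokens (everything after the prompt)
    return tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

generate_response("كيف يمكن تحسين جودة التعليم في مصر؟")  # "How can the quality of education in Egypt be improved?"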

• Model size: 6.97B params (Safetensors; tensor types: F32, FP16, U8)