---
license: gemma
library_name: transformers
pipeline_tag: text-generation
extra_gated_heading: Access Gemma on Hugging Face
extra_gated_prompt: >-
  To access Gemma on Hugging Face, you’re required to review and agree to
  Google’s usage license. To do this, please ensure you’re logged in to Hugging
  Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
tags:
- conversational
base_model:
- google/gemma-2-9b
language:
- tr
model-index:
- name: wiroai-turkish-llm-9b
  results:
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: MMLU_TR_V0.2
    metrics:
    - name: 5-shot
      type: 5-shot
      value: 0.5982
      verified: false
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: Truthful_QA_V0.2
    metrics:
    - name: 0-shot
      type: 0-shot
      value: 0.4991
      verified: false
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: ARC_TR_V0.2
    metrics:
    - name: 25-shot
      type: 25-shot
      value: 0.5367
      verified: false
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: HellaSwag_TR_V0.2
    metrics:
    - name: 10-shot
      type: 10-shot
      value: 0.5701
      verified: false
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: GSM8K_TR_V0.2
    metrics:
    - name: 5-shot
      type: 5-shot
      value: 0.6682
      verified: false
  - task:
      type: multiple-choice
    dataset:
      type: multiple-choice
      name: Winogrande_TR_V0.2
    metrics:
    - name: 5-shot
      type: 5-shot
      value: 0.6058
      verified: false
---
<div align="center">
<img src="https://huggingface.co/WiroAI/gemma-2-9b-it-tr/resolve/main/wiro_logo.png"
alt="Wiro AI Logo" width="256"/>
</div>
# 🚀 Meet WiroAI/wiroai-turkish-llm-9b: a robust language model with enhanced Turkish language and culture support! 🚀
## 🌟 Key Features
- Fine-tuned with 500,000+ high-quality Turkish instructions
- Adapted to Turkish culture and local context
- Built on Google's cutting-edge Gemma architecture
## 📝 Model Details
This model is the Turkish-speaking member of Google's Gemma 2 model family. It was trained with Supervised Fine-Tuning (SFT) on carefully curated, high-quality Turkish instructions. Built on the same research and technology behind Gemini, it delivers strong performance on Turkish language processing tasks.
## 🔧 Technical Specifications
- Architecture: Decoder-only transformer
- Base Model: Google Gemma 2 9B
- Training Data: 500,000+ specially selected Turkish instructions
- Language Support: Turkish (with comprehensive local context understanding) and other common languages
## 💡 Use Cases
- Text Generation and Editing
- Question Answering
- Summarization
- Analysis and Reasoning
- Content Transformation
- Turkish Natural Language Processing Tasks
- Turkish Culture
## 🚀 Advantages
- Local Understanding: Ability to comprehend Turkish culture, idioms, and current events
- Resource Efficiency: Effective operation even with limited hardware resources
- Flexible Deployment: Usable on desktop, laptop, or custom cloud infrastructure
- Open Model: Transparent and customizable architecture
## 🌍 About Google Gemma 2
Gemma is Google's family of lightweight, state-of-the-art open models, developed using the same research and technology used to create the Gemini models. These models are designed to be deployable in environments with limited resources, making AI technology accessible to everyone.
## 📈 Performance and Limitations
While the model demonstrates high performance in Turkish language tasks, users should consider the following:
- Use clear and structured instructions for best results.
- Verify model outputs for critical applications.
- Evaluate resource requirements before deployment.
- Note that the benchmark scores below were obtained under specific evaluation conditions and are reproducible; those conditions are described below the table.
### Benchmark Scores
| Models | MMLU TR | TruthfulQA TR | ARC TR | HellaSwag TR | GSM8K TR | WinoGrande TR | Average |
|-----------------------------------------------------------|:-------:|:-------------:|:------:|:------------:|:--------:|:-------------:|:-------:|
| **WiroAI/wiroai-turkish-llm-9b** | **59.8** | 49.9 | **53.7** | **57.0** | 66.8 | **60.6** | **58.0** |
| selimc/OrpoGemma-2-9B-TR | 53.0 | 54.3 | 52.4 | 52.0 | 64.8 | 58.9 | 55.9 |
| Metin/Gemma-2-9b-it-TR-DPO-V1 | 51.3 | 54.7 | 52.6 | 51.2 | 67.1 | 55.2 | 55.4 |
| CohereForAI/aya-expanse-8b | 52.3 | 52.8 | 49.3 | 56.7 | 61.3 | 59.2 | 55.3 |
| ytu-ce-cosmos/Turkish-Llama-8b-DPO-v0.1 | 52.0 | 57.6 | 51.0 | 53.0 | 59.8 | 58.0 | 55.2 |
| google/gemma-2-9b-it | 51.8 | 53.0 | 52.2 | 51.5 | 63.0 | 56.2 | 54.6 |
| Eurdem/Defne-llama3.1-8B | 52.9 | 51.2 | 47.1 | 51.6 | 59.9 | 57.5 | 53.4 |
| **WiroAI/wiroai-turkish-llm-8b**                          | 52.4 | 49.5 | 50.1 | 54.0 | 57.5 | 57.0 | 53.4 |
| meta-llama/Meta-Llama-3-8B-Instruct | 52.2 | 49.2 | 44.2 | 49.2 | 56.0 | 56.7 | 51.3 |
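The Average column is the unweighted arithmetic mean of the six task scores. As a quick sanity check on the top row (a minimal sketch, not part of the evaluation harness):

```python
# Per-task scores for WiroAI/wiroai-turkish-llm-9b from the table above.
scores = {
    "MMLU TR": 59.8,
    "TruthfulQA TR": 49.9,
    "ARC TR": 53.7,
    "HellaSwag TR": 57.0,
    "GSM8K TR": 66.8,
    "WinoGrande TR": 60.6,
}

# The Average column is the unweighted mean, rounded to one decimal place.
average = round(sum(scores.values()) / len(scores), 1)
print(average)  # 58.0
```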
Benchmarks were run with:
```bash
lm_eval --model_args pretrained=<model_path> --tasks mmlu_tr_v0.2,arc_tr-v0.2,gsm8k_tr-v0.2,hellaswag_tr-v0.2,truthfulqa_v0.2,winogrande_tr-v0.2
```
Please see https://github.com/malhajar17/lm-evaluation-harness_turkish. Note that we use the harness's default language-inference settings, the same approach taken by OpenLLMLeaderboard v2.0.
## Usage
### Transformers Pipeline
```python
import torch
import transformers

model_id = "WiroAI/wiroai-turkish-llm-9b"

# Load the model in bfloat16 and let accelerate place it on available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
pipeline.model.eval()

# "Can you prepare a social media post about Istanbul for me?"
instruction = "Bana İstanbul ile alakalı bir sosyal medya postu hazırlar mısın?"
messages = [
    {"role": "user", "content": instruction}
]

# Render the conversation with the model's chat template.
prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

# Stop generation at end-of-sequence or Gemma's end-of-turn token.
terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<end_of_turn>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=512,
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.9,
)
print(outputs[0]["generated_text"][len(prompt):])
```
```markdown
İstanbul'un büyüsüne kapılın! :city_sunset:
Halk arasında "dünyanın masalı şehri" olarak bilinen İstanbul, her köşesinde tarih, kültür ve modern yaşamın bir araya geldiği eşsiz bir şehir.
Yüzyıllardır farklı medeniyetlerin izlerini taşıyan İstanbul, tarihi mekanlarından, müzelerinden, çarşılarından ve restoranlarından oluşan zengin kültürel mirasa sahiptir.
Boğaz'ın eşsiz manzarasında tekne turu yapmak, Topkapı Sarayı'nı ziyaret etmek, Grand Bazaar'da alışveriş yapmak, Mısır Çarşısı'nın canlı atmosferinde kaybolmak, Galata Kulesi'nden muhteşem bir manzara deneyimlemek veya Beyoğlu'nun hareketli sokaklarında yürüyüş yapmak İstanbul'da unutulmaz anılar yaratmak için fırsatlar sunar.
İstanbul'un büyülü atmosferini kendiniz yaşamak için hemen planınızı yapın! :flag-tr: #İstanbul #Türkiye #Seyahat #Tarih #Kültür #Gezi
```
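The `<end_of_turn>` token in `terminators` above comes from Gemma 2's chat format. A minimal sketch of roughly what `apply_chat_template` renders for a single user turn, assuming the standard Gemma 2 template (the tokenizer's own output is authoritative; use `apply_chat_template` in real code):

```python
# Hand-rolled approximation of the Gemma 2 chat template for one user turn.
# Assumption: the standard Gemma 2 format; the tokenizer may also prepend
# special tokens such as <bos>.
def gemma2_prompt(user_message: str) -> str:
    return (
        "<start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = gemma2_prompt("Merhaba!")
print(prompt)
```

This is why `<end_of_turn>` is included as a terminator: the model emits it to close its turn, and generation should stop there.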
## 🤝 License and Usage
This model is provided under Google's Gemma license. Please review and accept the license terms before use.
## 📫 Contact and Support
For questions, suggestions, and feedback, please open an issue on HuggingFace or contact us directly from our website.
## Citation
```bibtex
@misc{WiroAI,
  title={gemma-2-9b-it-tr},
  author={Abdullah Bezir and Furkan Burhan Türkay and Cengiz Asmazoğlu},
  year={2024},
  url={https://huggingface.co/WiroAI/gemma-2-9b-it-tr}
}
```
```bibtex
@misc{gemma_2024,
  title={Gemma},
  author={Gemma Team},
  year={2024},
  url={https://www.kaggle.com/m/3301},
  doi={10.34740/KAGGLE/M/3301},
  publisher={Kaggle}
}
```