Pretrained on 1B (mostly Turkish) tokens from Hugging Face datasets and "high quality" scraped data, using a single RTX 3090. Training will continue. The model can already produce sensible Turkish sentences, for example with the following generation settings:
(top_k=24, repetition_penalty=1.1, temperature=0.12, seed=1022)
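As a minimal sketch of how these settings map onto the standard `transformers` generation API (the model ID and prompt below are placeholders, not the actual repository name):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

# Placeholder ID: replace with this repository's model name on the Hub.
model_id = "your-username/turkish-lm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Seed listed above for the sample output.
set_seed(1022)

# Example Turkish prompt (illustrative only).
prompt = "Türkiye'nin başkenti"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_k=24,
    temperature=0.12,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```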