---
license: mit
language:
- tr
library_name: transformers
---

Pretrained on roughly **1.6B** (mostly Turkish) tokens from Hugging Face datasets and "high quality" scraped data, using a single RTX 3090. Training will continue. The model can already be (sort of) fine-tuned for instruction following:

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6324eabf05bd8a54c6eb1650/SlUmBi6Mmb5NuQesvucv3.png)

Generation settings used for the sample above: max_length=256, top_k=20, min_p=0.1, repetition_penalty=1.1, temperature=0.1, seed=22366 / TR_4k_LoRA
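As a rough sketch, the sampling settings listed above could be packaged into a `transformers` `GenerationConfig` and passed to `generate` (the repo id is a placeholder, not this model's actual Hub path; `min_p` requires a recent transformers version):

```python
# Sketch: collecting the card's reported generation settings into a
# GenerationConfig object. All values come from the card; the model
# repo id in the commented usage is a placeholder.
from transformers import GenerationConfig, set_seed

gen_config = GenerationConfig(
    max_length=256,
    do_sample=True,          # sampling must be on for top_k / min_p / temperature
    top_k=20,
    min_p=0.1,
    repetition_penalty=1.1,
    temperature=0.1,
)

set_seed(22366)  # seed reported on the card

# Usage (placeholder repo id, assumed API):
# model = AutoModelForCausalLM.from_pretrained("user/model")
# tokenizer = AutoTokenizer.from_pretrained("user/model")
# inputs = tokenizer("Türkiye'nin başkenti", return_tensors="pt")
# outputs = model.generate(**inputs, generation_config=gen_config)

print(gen_config.top_k, gen_config.min_p)
```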