
# Turkish-moe

Turkish-moe is a Mixture-of-Experts (MoE) merge of the following models, built with [mergekit](https://github.com/arcee-ai/mergekit):

* [TURKCELL/Turkcell-LLM-7b-v1](https://huggingface.co/TURKCELL/Turkcell-LLM-7b-v1)
* [Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0](https://huggingface.co/Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0)

## 🧩 Configuration

```yaml
base_model: Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
dtype: float16
gate_mode: cheap_embed
experts:
  - source_model: TURKCELL/Turkcell-LLM-7b-v1
    positive_prompts: ["You are a helpful general-purpose assistant."]
  - source_model: Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
    positive_prompts: ["You are a helpful assistant."]
```
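
This YAML is a mergekit MoE configuration: `gate_mode: cheap_embed` initializes the expert routers from raw token embeddings of the `positive_prompts` rather than from hidden states. With mergekit installed, a configuration like this is normally run through the `mergekit-moe` command-line tool. A minimal sketch of that invocation from Python, assuming the config is saved as `config.yaml` and `./Turkish-moe` is a placeholder output directory:

```python
# Hypothetical sketch: calling mergekit's `mergekit-moe` entry point.
# Assumes mergekit is installed and config.yaml contains the YAML above;
# "./Turkish-moe" is a placeholder output directory.
import subprocess

subprocess.run(
    ["mergekit-moe", "config.yaml", "./Turkish-moe"],
    check=True,  # raise if the merge fails
)
```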
**Model size:** 13B params · **Tensor type:** FP16 · **Format:** Safetensors
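
## 💻 Usage

Once the merged weights are published on the Hugging Face Hub, the model can be loaded with 🤗 Transformers. This is a minimal sketch: the repo id `your-username/Turkish-moe` is a placeholder, and the `apply_chat_template` call assumes the tokenizer ships a chat template.

```python
# Minimal usage sketch -- replace "your-username/Turkish-moe" with the actual repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Turkish-moe"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",
)

# Build a chat-style prompt; Turkish for "What is the capital of Turkey?"
messages = [{"role": "user", "content": "Türkiye'nin başkenti neresidir?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```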