---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
  - role: user
    content: "Hamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
- lemon-mint/korean_high_quality_translation_426k
pipeline_tag: text-generation
---

# Gemma 2B Translation v0.110

- Eval Loss: `0.59812`
- Train Loss: `0.40320`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine

## Prompt Template

```
### English

Hamsters don't eat cats.

### Korean

햄스터는 고양이를 먹지 않습니다.
```

## Model Description

- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)
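
## Usage

A minimal sketch of how inputs could be formatted for the prompt template above. The `build_prompt` helper is hypothetical (not part of the model repository); it simply wraps an English sentence in the `### English` / `### Korean` template so the model can complete the Korean side.

```python
def build_prompt(english_text: str) -> str:
    """Wrap an English sentence in this model's translation prompt template.

    The model is expected to continue generating after the "### Korean"
    header, producing the Korean translation.
    """
    return f"### English\n\n{english_text}\n\n### Korean\n\n"


prompt = build_prompt("Hamsters don't eat cats.")
print(prompt)
```

The resulting string can then be passed to a standard `transformers` text-generation setup (e.g. `AutoTokenizer` / `AutoModelForCausalLM` loaded from `lemon-mint`'s checkpoint, or the `text-generation` pipeline), with the translation read from the text generated after the `### Korean` header.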