---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
- role: user
content: "Hamsters don't eat cats."
inference:
parameters:
max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
- lemon-mint/korean_high_quality_translation_426k
pipeline_tag: text-generation
---
# Gemma 2B Translation v0.110
- Eval Loss: `0.59812`
- Train Loss: `0.40320`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine
## Prompt Template
```
<bos>### English
Hamsters don't eat cats.
### Korean
햄스터는 고양이를 먹지 않습니다.<eos>
```
```
<bos>### Korean
햄스터는 고양이를 먹지 않습니다.
### English
Hamsters don't eat cats.<eos>
```
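## Usage

Below is a minimal inference sketch using 🤗 Transformers, following the English → Korean prompt template above. The `model_id` is a placeholder for this repository's Hub id, and the generation settings are illustrative (mirroring the `max_new_tokens: 2048` from the widget config), not prescribed by the model card.

```python
# Hedged sketch: translate English -> Korean with the prompt template shown above.
# `model_id` is a placeholder, not a confirmed repository id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.110"  # placeholder Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# The template already contains <bos>, so skip the tokenizer's special tokens.
prompt = "<bos>### English\nHamsters don't eat cats.\n### Korean\n"
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=2048)
# Decode only the newly generated tokens (the Korean translation).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```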
## Model Description
- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)