---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
  - role: user
    content: "Hamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
- lemon-mint/korean_high_quality_translation_426k
pipeline_tag: text-generation
---

# Gemma 2B Translation v0.120

- Eval Loss: `0.3859`
- Train Loss: `0.4066`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine

## Prompt Template

```
##English##
Hamsters don't eat cats.

##Korean##
햄스터는 고양이를 먹지 않습니다.
```

```
##Korean##
햄스터는 고양이를 먹지 않습니다.

##English##
Hamsters don't eat cats.
```

## Model Description

- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)
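
## Usage Sketch

A minimal sketch of how a prompt could be assembled for this template. The helper name `build_translation_prompt` and the `LANG_TAGS` mapping are illustrative, not part of the model card; the model is assumed to continue generating after the target language tag.

```python
# Hypothetical helper (not from the model card) that formats input text
# according to the ##English## / ##Korean## prompt template shown above.

LANG_TAGS = {"en": "##English##", "ko": "##Korean##"}


def build_translation_prompt(text: str, source: str = "en", target: str = "ko") -> str:
    """Place the source sentence under its language tag, then open the
    target language tag on its own line so the model can complete it."""
    return f"{LANG_TAGS[source]}\n{text}\n\n{LANG_TAGS[target]}\n"


prompt = build_translation_prompt("Hamsters don't eat cats.", source="en", target="ko")
print(prompt)
```

The resulting string can then be passed to a standard `transformers` text-generation pipeline loaded from `lemon-mint`'s checkpoint, with the translation read from the text generated after the target tag.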