---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
  - role: user
    content: "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 1024
base_model: google/gemma-1.1-2b-it
pipeline_tag: text-generation
---

# Gemma 2B Translation v0.91

- Eval Loss: `1.0779`
- Train Loss: `0.5749`
- lr: `5e-5`
- optimizer: adamw
- lr_scheduler_type: cosine

## Prompt Template

```
<start_of_turn>user
Translate into Korean.
English:
Hamsters don't eat cats.<end_of_turn>
<start_of_turn>model
햄스터는 고양이를 먹지 않습니다.<end_of_turn>
```

## Model Description

- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [google/gemma-1.1-2b-it](https://huggingface.co/google/gemma-1.1-2b-it)
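
## Usage Example

The snippet below is a minimal inference sketch and is not part of the original card: it assumes the repository id `lemon-mint/gemma-2b-translation-v0.91` (based on the card title) and relies on the chat template shipped with the base Gemma tokenizer to reproduce the prompt format shown above.

```python
# Minimal inference sketch. The repo id below is an assumption based on the
# card title; adjust it to the actual repository if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.91"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {
        "role": "user",
        "content": "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats.",
    }
]

# apply_chat_template wraps the message in the Gemma turn tokens
# (<start_of_turn>user ... <end_of_turn><start_of_turn>model).
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=1024, do_sample=False)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```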
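
## Training Configuration Sketch

The hyperparameters listed at the top of this card (lr `5e-5`, AdamW, cosine schedule) map onto a `transformers` `TrainingArguments` object roughly as sketched below. Batch size, epoch count, warmup, and precision are placeholders; they are not stated in the card.

```python
# Rough sketch of the listed hyperparameters as TrainingArguments.
# Values marked "placeholder" are assumptions, not from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gemma-2b-translation-v0.91",
    learning_rate=5e-5,              # lr from the card
    optim="adamw_torch",             # optimizer: adamw
    lr_scheduler_type="cosine",      # lr_scheduler_type: cosine
    per_device_train_batch_size=4,   # placeholder
    num_train_epochs=1,              # placeholder
    warmup_ratio=0.03,               # placeholder
    bf16=True,                       # placeholder
)
```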