---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
- role: user
content: "Hamsters don't eat cats."
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
pipeline_tag: text-generation
---
# Gemma 2B Translation v0.122
- Eval Loss: `0.45365`
- Train Loss: `0.43420`
- lr: `6e-05`
- optimizer: adamw
- lr_scheduler_type: cosine
## Prompt Template
```
<bos>##English##
Hamsters don't eat cats.
##Korean##
햄스터는 고양이를 먹지 않습니다.<eos>
```
```
<bos>##Korean##
햄스터는 고양이를 먹지 않습니다.
##English##
Hamsters don't eat cats.<eos>
```
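## Usage

A minimal inference sketch with the 🤗 Transformers library. The repository ID below is an assumption based on this card's title and author (replace it with the actual repo ID); the tokenizer prepends `<bos>` automatically, so it is omitted from the prompt string.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.122"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# English -> Korean prompt, following the template above.
# <bos> is added by the tokenizer, <eos> is produced by the model.
prompt = "##English##\nHamsters don't eat cats.\n##Korean##\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)

# Decode only the newly generated tokens (the Korean translation).
print(tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
))
```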
## Model Description
- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** Korean, English
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)