Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
# gemma-2b-translation-v0.103 - GGUF
- Model creator: https://huggingface.co/lemon-mint/
- Original model: https://huggingface.co/lemon-mint/gemma-2b-translation-v0.103/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [gemma-2b-translation-v0.103.Q2_K.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q2_K.gguf) | Q2_K | 1.08GB |
| [gemma-2b-translation-v0.103.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.IQ3_XS.gguf) | IQ3_XS | 1.16GB |
| [gemma-2b-translation-v0.103.IQ3_S.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.IQ3_S.gguf) | IQ3_S | 1.2GB |
| [gemma-2b-translation-v0.103.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q3_K_S.gguf) | Q3_K_S | 1.2GB |
| [gemma-2b-translation-v0.103.IQ3_M.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.IQ3_M.gguf) | IQ3_M | 1.22GB |
| [gemma-2b-translation-v0.103.Q3_K.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q3_K.gguf) | Q3_K | 1.29GB |
| [gemma-2b-translation-v0.103.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q3_K_M.gguf) | Q3_K_M | 1.29GB |
| [gemma-2b-translation-v0.103.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q3_K_L.gguf) | Q3_K_L | 1.36GB |
| [gemma-2b-translation-v0.103.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.IQ4_XS.gguf) | IQ4_XS | 1.4GB |
| [gemma-2b-translation-v0.103.Q4_0.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q4_0.gguf) | Q4_0 | 1.44GB |
| [gemma-2b-translation-v0.103.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.IQ4_NL.gguf) | IQ4_NL | 1.45GB |
| [gemma-2b-translation-v0.103.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q4_K_S.gguf) | Q4_K_S | 1.45GB |
| [gemma-2b-translation-v0.103.Q4_K.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q4_K.gguf) | Q4_K | 1.52GB |
| [gemma-2b-translation-v0.103.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q4_K_M.gguf) | Q4_K_M | 1.52GB |
| [gemma-2b-translation-v0.103.Q4_1.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q4_1.gguf) | Q4_1 | 1.56GB |
| [gemma-2b-translation-v0.103.Q5_0.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q5_0.gguf) | Q5_0 | 1.68GB |
| [gemma-2b-translation-v0.103.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q5_K_S.gguf) | Q5_K_S | 1.68GB |
| [gemma-2b-translation-v0.103.Q5_K.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q5_K.gguf) | Q5_K | 1.71GB |
| [gemma-2b-translation-v0.103.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q5_K_M.gguf) | Q5_K_M | 1.71GB |
| [gemma-2b-translation-v0.103.Q5_1.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q5_1.gguf) | Q5_1 | 1.79GB |
| [gemma-2b-translation-v0.103.Q6_K.gguf](https://huggingface.co/RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf/blob/main/gemma-2b-translation-v0.103.Q6_K.gguf) | Q6_K | 1.92GB |
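Every row in the table follows the same filename pattern, so a single quant file can be fetched programmatically. The sketch below is one way to do this with `huggingface_hub` (assumed installed via `pip install huggingface-hub`); the repo id and filenames are taken from the table above, and the choice of Q4_K_M is only an example of a common size/quality trade-off.

```python
# Sketch: fetch one quantized GGUF file from this repo, assuming
# huggingface_hub is installed (pip install huggingface-hub).

REPO_ID = "RichardErkhov/lemon-mint_-_gemma-2b-translation-v0.103-gguf"

def gguf_filename(quant: str) -> str:
    """Filename pattern shared by every row of the table above."""
    return f"gemma-2b-translation-v0.103.{quant}.gguf"

if __name__ == "__main__":
    # Imported here so the helper stays usable without huggingface_hub.
    from huggingface_hub import hf_hub_download

    # Q4_K_M (~1.52GB) is a common size/quality trade-off.
    path = hf_hub_download(repo_id=REPO_ID, filename=gguf_filename("Q4_K_M"))
    print(path)  # local path inside the Hugging Face cache
```

The download lands in the local Hugging Face cache, so repeated runs reuse the existing file.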
Original model description:
---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
- role: user
content: "Hamsters don't eat cats."
inference:
parameters:
max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
pipeline_tag: text-generation
---
# Gemma 2B Translation v0.103
- Eval Loss: `1.34507`
- Train Loss: `1.40326`
- lr: `3e-05`
- optimizer: adamw
- lr_scheduler_type: cosine
## Prompt Template
```
<bos>### English
Hamsters don't eat cats.
### Korean
햄스터는 고양이를 먹지 않습니다.<eos>
```
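The template above can be applied at inference time with a GGUF runtime such as `llama-cpp-python` (assumed installed via `pip install llama-cpp-python`). A minimal sketch, with a placeholder model path: the `<bos>` token is normally prepended by the tokenizer itself, so only the section headers go into the text prompt, and generation is stopped at `<eos>` or the start of a new `### English` section.

```python
# Sketch: use the prompt template with llama-cpp-python (assumed installed).
# The model path below is a placeholder for a downloaded quant file.

def build_prompt(english: str) -> str:
    """Wrap English source text in the model's translation template."""
    return f"### English\n{english}\n### Korean\n"

if __name__ == "__main__":
    from llama_cpp import Llama

    llm = Llama(model_path="gemma-2b-translation-v0.103.Q4_K_M.gguf")
    out = llm(
        build_prompt("Hamsters don't eat cats."),
        max_tokens=256,
        stop=["<eos>", "### English"],
    )
    print(out["choices"][0]["text"].strip())  # Korean translation
```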
## Model Description
- **Developed by:** `lemon-mint`
- **Model type:** Gemma
- **Language(s) (NLP):** English, Korean
- **License:** [gemma-terms-of-use](https://ai.google.dev/gemma/terms)
- **Finetuned from model:** [beomi/gemma-ko-2b](https://huggingface.co/beomi/gemma-ko-2b)