---
license: gemma
datasets:
- lmg-anon/VNTL-v3.1-1k
- lmg-anon/VNTL-Chat
language:
- ja
- en
library_name: peft
base_model: rinna/gemma-2-baku-2b
pipeline_tag: translation
---
# Summary
This is a [Gemma 2 Baku](https://huggingface.co/rinna/gemma-2-baku-2b) LoRA, created using the [VNTL 3.1 dataset](https://huggingface.co/datasets/lmg-anon/VNTL-v3.1-1k). The purpose of this LoRA is to improve Gemma's performance at translating Japanese visual novels into English.
## Notes
Recently, [rinna](https://huggingface.co/rinna) released the [**Gemma 2 Baku 2B**](https://huggingface.co/rinna/gemma-2-baku-2b) model, pretrained on a substantial 80 billion tokens(!). After testing, I found its performance quite impressive for a 2B model, so I decided to create this fine-tune (it took only 30 minutes, which is nice). However, I opted to remove the chat mode from this model, as I wasn't sure whether the 2B model could effectively handle both capabilities.
## Training Details
This model was trained using the same hyperparameters as the [VNTL LLaMA3 8B qlora](https://huggingface.co/lmg-anon/vntl-llama3-8b-qlora).
- Rank: 128
- Alpha: 32
- Effective Batch Size: 30
- Warmup Ratio: 0.02
- Learning Rate: 6.5e-5
- Embedding Learning Rate: 1.5e-5
- LR Schedule: cosine
- Weight Decay: 0.01
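For reference, the rank and alpha above map directly onto a PEFT adapter configuration. The sketch below is an illustration only: the target modules are a guess at typical Gemma-style projection layers, and the actual training script for this adapter is not published here.

```python
# Sketch of a PEFT LoRA configuration matching the hyperparameters above.
# Assumption: target_modules is a guess at common Gemma-style projections;
# the real training setup for this adapter may differ.
from peft import LoraConfig

lora_config = LoraConfig(
    r=128,           # Rank: 128
    lora_alpha=32,   # Alpha: 32
    lora_dropout=0.0,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)
```

Note that the effective batch size, learning rates, schedule, and weight decay listed above belong to the trainer configuration rather than the adapter config itself.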
## Translation Prompt
Here is an example prompt for translation:
```
<<METADATA>>
[character] Name: Uryuu Shingo (瓜生 新吾) | Gender: Male | Aliases: Onii-chan (お兄ちゃん)
[character] Name: Uryuu Sakuno (瓜生 桜乃) | Gender: Female
<<TRANSLATE>>
<<JAPANESE>>
[桜乃]: 『……ごめん』
<<ENGLISH>>
[Sakuno]: 『... Sorry.』<eos>
<<JAPANESE>>
[新吾]: 「ううん、こう言っちゃなんだけど、迷子でよかったよ。桜乃は可愛いから、いろいろ心配しちゃってたんだ、俺」
<<ENGLISH>>
```
The generated translation for that prompt, with temperature 0, is:
```
[Shingo]: 「No, I'm glad you got lost. You were so cute that it made me worry.」
```
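The prompt format above can be assembled programmatically. The helper below is a minimal sketch of that format; the function name and structure are illustrative, not part of the released code.

```python
# Minimal sketch of the VNTL prompt format shown above.
# build_vntl_prompt is an illustrative helper, not part of the released code.

def build_vntl_prompt(metadata_lines, context_pairs, next_japanese):
    """Assemble a VNTL translation prompt.

    metadata_lines: character metadata strings for the <<METADATA>> block.
    context_pairs: (japanese, english) lines already translated, used as context.
    next_japanese: the Japanese line to translate next.
    """
    parts = ["<<METADATA>>"]
    parts.extend(metadata_lines)
    parts.append("<<TRANSLATE>>")
    for ja, en in context_pairs:
        parts.append("<<JAPANESE>>")
        parts.append(ja)
        parts.append("<<ENGLISH>>")
        parts.append(en + "<eos>")  # previous translations end with <eos>
    parts.append("<<JAPANESE>>")
    parts.append(next_japanese)
    parts.append("<<ENGLISH>>")  # the model continues from here
    return "\n".join(parts)

prompt = build_vntl_prompt(
    ["[character] Name: Uryuu Sakuno (瓜生 桜乃) | Gender: Female"],
    [("[桜乃]: 『……ごめん』", "[Sakuno]: 『... Sorry.』")],
    "[新吾]: 「ううん、こう言っちゃなんだけど、迷子でよかったよ」",
)
print(prompt)
```

The model then continues the text after the final `<<ENGLISH>>` marker; stopping generation at a newline or at `<eos>` yields the translated line.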