Commit ec0f46c
Parent(s): 659fea7
Update README.md

README.md CHANGED
@@ -9,6 +9,7 @@ tags:
 - PyTorch
 widget:
 - text: 'translate nahuatl to spanish: Nimitstlazohkamate'
+- text: 'translate spanish to nahuatl: agua'
 ---
 
 # mt5-large-spanish-nahuatl
@@ -82,7 +83,7 @@ The employed method uses a single training stage using the mt5. This model was l
 
 ### Training
 The model is trained bidirectionally till convergence, adding the prefixes "translate spanish to nahuatl: + word" and "translate nahuatl to spanish: + word".
-This is an evolution and improvement of the [previous model](https://huggingface.co/hackathon-pln-es/t5-small-spanish-nahuatl) I
+This is an evolution and improvement of the [previous model](https://huggingface.co/hackathon-pln-es/t5-small-spanish-nahuatl) I collaborated on.
 
 ### Training setup
 The model uses the same dataset for 77,500 steps using batch size = 4 and a learning rate of 1e-4.
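The bidirectional prefix scheme described in the Training section (prepending a direction tag to every input) can be sketched as below. The helper `make_prompt` and its direction keys are hypothetical names for illustration, not part of the model's API:

```python
# Sketch of the bidirectional prefix scheme from the README.
# `make_prompt` and the 'es2nah'/'nah2es' keys are hypothetical;
# the model itself only expects the prefixed string as input text.

def make_prompt(text: str, direction: str) -> str:
    """Prepend the task prefix the model was trained with.

    direction: 'es2nah' (Spanish -> Nahuatl) or 'nah2es' (Nahuatl -> Spanish).
    """
    prefixes = {
        "es2nah": "translate spanish to nahuatl: ",
        "nah2es": "translate nahuatl to spanish: ",
    }
    return prefixes[direction] + text

print(make_prompt("agua", "es2nah"))
# -> translate spanish to nahuatl: agua
```

At inference time this prefixed string would be tokenized and passed to the mT5 checkpoint (e.g. via transformers' `AutoModelForSeq2SeqLM` and `generate`); the exact checkpoint id is not stated in this chunk, so it is omitted here.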