---
library_name: pytorch
pipeline_tag: text2text-generation
language:
- vi
- lo
metrics:
- bleu
---
This project requires Python 3.10.
## Direct Use
### Load a pre-trained model
Use `load_config` to load a `.yaml` config file, then use `load_model_tokenizer` to load the pretrained model and its tokenizers.
```python
from config import load_config
from load_model import load_model_tokenizer
config = load_config(file_name='config/config_final.yaml')
model, src_tokenizer, tgt_tokenizer = load_model_tokenizer(config)
```
### Translate Lao (lo) to Vietnamese (vi)
Use the `translate` function in `translate.py`.
```python
from translate import translate
from config import load_config
from load_model import load_model_tokenizer
config = load_config(file_name='config/config_final.yaml')
model, src_tokenizer, tgt_tokenizer = load_model_tokenizer(config)
text = " "  # replace with the Lao source text to translate
translation, attn = translate(
model, src_tokenizer, tgt_tokenizer, text,
decode_method='beam-search',
)
print(translation)
```
## Training
Use the `train_model` function in `train.py` to train your model.
```python
from train import train_model
from config import load_config
config = load_config(file_name='config/config_final.yaml')
train_model(config)
```
If you wish to continue training or fine-tune our model, you should
increase `num_epochs` in your desired config file,
and keep the following notes in mind (`+` denotes string concatenation):
- The code will save and preload models in `model_folder`
- The code will preload the model with the name: "`model_basename` + `preload` + `.pt`"
- The code will NOT preload a trained model if you set `preload` as `null`
- Every epoch, the code will save the model with the name: "`model_basename` + `_` + (current epoch) + `.pt`"
- `train_model` will automatically continue training the `preload`ed model.
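The naming rules above can be sketched as plain string concatenation. This is only an illustration of the convention; the config values (`model_folder`, `model_basename`, `preload`) are assumed example values, not the ones shipped in `config/config_final.yaml`:

```python
# Sketch of the checkpoint naming convention described above.
# All values below are assumptions for illustration only.
config = {
    "model_folder": "weights",   # where checkpoints are saved and preloaded
    "model_basename": "tfm",     # prefix shared by all checkpoint files
    "preload": "09",             # set to None (null in YAML) to skip preloading
}

def preload_path(config):
    """Checkpoint the code preloads: model_basename + preload + '.pt'."""
    if config["preload"] is None:
        return None  # no preloading: training starts from scratch
    return f"{config['model_folder']}/{config['model_basename']}{config['preload']}.pt"

def save_path(config, epoch):
    """Checkpoint saved each epoch: model_basename + '_' + epoch + '.pt'."""
    return f"{config['model_folder']}/{config['model_basename']}_{epoch}.pt"

print(preload_path(config))   # weights/tfm09.pt
print(save_path(config, 10))  # weights/tfm_10.pt
```

With `preload: null`, `preload_path` returns `None` and training starts from a fresh model; any epoch's saved file name can in turn be passed back as `preload` to resume from it.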