Mainak Manna
committed
Commit 4d13d92 • 1 Parent(s): 6dc78a4
First version of the model
README.md CHANGED
@@ -6,7 +6,7 @@ tags:
 datasets:
 - dcep europarl jrc-acquis
 widget:
-- text: "
+- text: "invite le Conseil et la Commission à faire en sorte que les informations en ce qui concerne les processus et les exigences liés à la production des rapports annuels, les normes de base applicables aux codes de conduite et les rapports annuels et codes de conduite modèles soient publiés, notamment sur un site Web du Forum de l'UE pour la RSE;"
 
 ---
 
@@ -38,7 +38,7 @@ tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/l
 device=0
 )
 
-fr_text = "
+fr_text = "invite le Conseil et la Commission à faire en sorte que les informations en ce qui concerne les processus et les exigences liés à la production des rapports annuels, les normes de base applicables aux codes de conduite et les rapports annuels et codes de conduite modèles soient publiés, notamment sur un site Web du Forum de l'UE pour la RSE;"
 
 pipeline([fr_text], max_length=512)
 ```
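For context, this hunk completes a usage snippet whose opening lines fall outside the diff (the hunk header cuts the model id off at `SEBIS/l`). A minimal sketch of how the full block is typically assembled with `transformers`, assuming the full id is `SEBIS/legal_t5_small_trans_fr_it` (the name the card's own prose uses) and a standard translation pipeline:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# Assumed full model id: the hunk header truncates it at "SEBIS/l"; the card's
# prose names the model legal_t5_small_trans_fr_it.
model_name = "SEBIS/legal_t5_small_trans_fr_it"

# French -> Italian translation pipeline; device=0 pins it to the first GPU
# (use device=-1 to run on CPU). The variable name mirrors the card's snippet,
# which calls pipeline([fr_text], max_length=512).
pipeline = pipeline(
    "translation_fr_to_it",
    model=AutoModelForSeq2SeqLM.from_pretrained(model_name),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path=model_name),
    device=0,
)

fr_text = "invite le Conseil et la Commission à faire en sorte que les informations en ce qui concerne les processus et les exigences liés à la production des rapports annuels, les normes de base applicables aux codes de conduite et les rapports annuels et codes de conduite modèles soient publiés, notamment sur un site Web du Forum de l'UE pour la RSE;"

print(pipeline([fr_text], max_length=512))
```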
@@ -49,10 +49,14 @@ The legal_t5_small_trans_fr_it model was trained on [JRC-ACQUIS](https://wt-publ
 
 ## Training procedure
 
+A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.
+
+The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has approximately 220M parameters in total and uses an encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
+
 ### Preprocessing
 
 ### Pretraining
-
+
 
 
 ## Evaluation results
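The first added paragraph describes the vocabulary build only loosely. A minimal sketch of what such a step usually looks like, assuming a SentencePiece unigram trainer (the card names neither the tool nor the vocabulary size; the corpus file name and `vocab_size` below are placeholders):

```python
import sentencepiece as spm

# Hypothetical invocation: the card says a unigram model was trained on 88M
# lines of the parallel corpus (all language pairs) to build the shared
# vocabulary. Tool, corpus file name, and vocab size are not given there.
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # ~88M lines, all language pairs
    model_prefix="legal_t5_unigram",
    model_type="unigram",                   # unigram LM segmentation
    vocab_size=32000,                       # placeholder size
)
```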
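The second added paragraph names AdaFactor with an inverse square root learning rate schedule. A sketch of that optimizer setup using the pieces `transformers` ships; only the optimizer and schedule choice come from the card, the flags are the library's usual relative-step configuration:

```python
from transformers import Adafactor, AutoModelForSeq2SeqLM
from transformers.optimization import AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_trans_fr_it")

# relative_step=True gives Adafactor its built-in inverse square root
# learning-rate decay; lr must then be None. These flags are the standard
# configuration for this setup, not hyperparameters taken from the card.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```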