---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/NMTBaliIndoBART
results: []
---
# pijarcandra22/NMTBaliIndoBART
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results at the end of training:
- Train Loss: 5.4690
- Validation Loss: 5.8057
- Epoch: 215
## Model description
More information needed. Based on the repository name (NMTBaliIndoBART) and the `facebook/bart-base` backbone, this appears to be a sequence-to-sequence model fine-tuned for Balinese-to-Indonesian machine translation, but the card does not state this explicitly.
## Intended uses & limitations
More information needed. One observable limitation: the training loss stays near 5.47 and the validation loss between roughly 5.5 and 6.3 across all 216 epochs (see the table below), which suggests the model did not converge, possibly due to the unusually high 0.02 learning rate. Translation quality should therefore be treated as unverified.
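If you still want to try the checkpoint, the following is a minimal, hypothetical usage sketch. It assumes TensorFlow weights are available on the Hub (consistent with the `generated_from_keras_callback` tag and the TensorFlow 2.15.0 framework version below); the Balinese example sentence is purely illustrative.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Assumption: the Hub repo contains TF weights, as the Keras training implies.
tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/NMTBaliIndoBART")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/NMTBaliIndoBART")

# Hypothetical Balinese greeting as input; the target language is assumed to
# be Indonesian based on the model name.
inputs = tokenizer("Punapi gatra?", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```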
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate=0.02, weight_decay_rate=0.01, decay=0.0, beta_1=0.9, beta_2=0.999, epsilon=1e-07, amsgrad=False)
- training_precision: float32
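As a hedged sketch (not the author's actual training script), the optimizer above can be reconstructed with `AdamWeightDecay` from `transformers`' TensorFlow utilities; the base model and the label-based internal loss are assumptions drawn from this card's metadata.

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Rebuild the optimizer from the hyperparameters listed above. `decay` (0.0)
# and `amsgrad` (False) match the defaults and are left implicit.
optimizer = AdamWeightDecay(
    learning_rate=0.02,  # notably high for fine-tuning a pretrained BART
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# Assumption: training started from the base checkpoint named in this card.
model = TFAutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")

# transformers' Keras models compute the seq2seq loss internally when `labels`
# are present in the inputs, so no explicit loss is passed to compile().
model.compile(optimizer=optimizer)
```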
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 9.3368 | 5.6757 | 0 |
| 5.5627 | 5.5987 | 1 |
| 5.5311 | 5.5419 | 2 |
| 5.5152 | 5.5201 | 3 |
| 5.5005 | 5.6477 | 4 |
| 5.4704 | 5.5914 | 5 |
| 5.4610 | 6.0922 | 6 |
| 5.4584 | 5.7137 | 7 |
| 5.4528 | 5.8658 | 8 |
| 5.4820 | 5.5628 | 9 |
| 5.4874 | 5.5309 | 10 |
| 5.4917 | 5.7595 | 11 |
| 5.4898 | 5.7333 | 12 |
| 5.4833 | 5.6789 | 13 |
| 5.4767 | 5.9588 | 14 |
| 5.4883 | 5.9895 | 15 |
| 5.4694 | 6.0100 | 16 |
| 5.4663 | 6.0316 | 17 |
| 5.4602 | 5.9233 | 18 |
| 5.4576 | 6.0051 | 19 |
| 5.4559 | 5.9966 | 20 |
| 5.4651 | 6.0025 | 21 |
| 5.4660 | 6.0160 | 22 |
| 5.4626 | 5.8324 | 23 |
| 5.4647 | 5.8383 | 24 |
| 5.4695 | 6.0272 | 25 |
| 5.4614 | 6.0724 | 26 |
| 5.4623 | 5.9454 | 27 |
| 5.4678 | 6.0196 | 28 |
| 5.4860 | 5.5949 | 29 |
| 5.4851 | 5.8838 | 30 |
| 5.4666 | 5.8506 | 31 |
| 5.4715 | 6.0391 | 32 |
| 5.4630 | 6.0870 | 33 |
| 5.4646 | 6.2195 | 34 |
| 5.4574 | 5.9696 | 35 |
| 5.4564 | 5.8970 | 36 |
| 5.4570 | 5.9522 | 37 |
| 5.4559 | 6.1518 | 38 |
| 5.4584 | 6.1860 | 39 |
| 5.4732 | 6.1168 | 40 |
| 5.4625 | 6.1588 | 41 |
| 5.4601 | 5.9868 | 42 |
| 5.4645 | 5.9606 | 43 |
| 5.4664 | 6.1495 | 44 |
| 5.4698 | 6.0152 | 45 |
| 5.4666 | 6.2713 | 46 |
| 5.4557 | 6.2708 | 47 |
| 5.4557 | 6.0003 | 48 |
| 5.4693 | 5.9321 | 49 |
| 5.4928 | 5.8971 | 50 |
| 5.5032 | 6.0766 | 51 |
| 5.4749 | 5.8919 | 52 |
| 5.4689 | 5.9853 | 53 |
| 5.4665 | 5.9329 | 54 |
| 5.4574 | 5.9770 | 55 |
| 5.4686 | 6.1022 | 56 |
| 5.4727 | 5.8973 | 57 |
| 5.4692 | 5.9633 | 58 |
| 5.4608 | 6.0480 | 59 |
| 5.4613 | 5.9596 | 60 |
| 5.4607 | 6.1158 | 61 |
| 5.4531 | 6.0617 | 62 |
| 5.4610 | 6.0375 | 63 |
| 5.4631 | 6.1184 | 64 |
| 5.4627 | 6.0465 | 65 |
| 5.4685 | 6.0011 | 66 |
| 5.4642 | 6.0828 | 67 |
| 5.4577 | 6.0883 | 68 |
| 5.4615 | 5.9523 | 69 |
| 5.4673 | 5.7216 | 70 |
| 5.4724 | 6.0274 | 71 |
| 5.4601 | 6.0344 | 72 |
| 5.4640 | 5.9661 | 73 |
| 5.4590 | 6.0013 | 74 |
| 5.4622 | 6.0172 | 75 |
| 5.4666 | 5.8407 | 76 |
| 5.4669 | 6.0261 | 77 |
| 5.4859 | 5.9295 | 78 |
| 5.5042 | 6.1254 | 79 |
| 5.4845 | 5.8930 | 80 |
| 5.5001 | 5.8867 | 81 |
| 5.4923 | 5.9480 | 82 |
| 5.4909 | 6.0475 | 83 |
| 5.4780 | 5.9289 | 84 |
| 5.4867 | 5.8134 | 85 |
| 5.4877 | 6.0032 | 86 |
| 5.4806 | 6.0884 | 87 |
| 5.4784 | 6.0567 | 88 |
| 5.4830 | 5.9790 | 89 |
| 5.4894 | 5.8919 | 90 |
| 5.4890 | 5.9626 | 91 |
| 5.4774 | 6.0267 | 92 |
| 5.5033 | 6.1150 | 93 |
| 5.4765 | 5.9776 | 94 |
| 5.4657 | 6.1395 | 95 |
| 5.4720 | 5.9938 | 96 |
| 5.4748 | 5.9656 | 97 |
| 5.4701 | 6.0163 | 98 |
| 5.4718 | 6.1462 | 99 |
| 5.4672 | 6.0804 | 100 |
| 5.4775 | 6.1055 | 101 |
| 5.4775 | 6.0936 | 102 |
| 5.4673 | 5.9839 | 103 |
| 5.4691 | 5.8972 | 104 |
| 5.4694 | 5.8271 | 105 |
| 5.5106 | 5.5305 | 106 |
| 5.5135 | 5.8806 | 107 |
| 5.4786 | 6.1380 | 108 |
| 5.4770 | 5.9899 | 109 |
| 5.4709 | 6.1072 | 110 |
| 5.4701 | 5.9356 | 111 |
| 5.4636 | 5.8304 | 112 |
| 5.4670 | 6.0451 | 113 |
| 5.4598 | 6.0311 | 114 |
| 5.4731 | 5.9862 | 115 |
| 5.4798 | 5.9589 | 116 |
| 5.4674 | 5.9356 | 117 |
| 5.4634 | 6.0088 | 118 |
| 5.4709 | 5.9534 | 119 |
| 5.4891 | 5.9995 | 120 |
| 5.4737 | 5.8611 | 121 |
| 5.4725 | 6.0112 | 122 |
| 5.4835 | 5.6280 | 123 |
| 5.5217 | 5.6917 | 124 |
| 5.4821 | 5.9458 | 125 |
| 5.4898 | 5.7593 | 126 |
| 5.4866 | 5.9110 | 127 |
| 5.4744 | 5.9463 | 128 |
| 5.4673 | 6.0359 | 129 |
| 5.4838 | 6.0166 | 130 |
| 5.4864 | 6.0046 | 131 |
| 5.4896 | 5.9479 | 132 |
| 5.4722 | 6.0699 | 133 |
| 5.4627 | 6.0684 | 134 |
| 5.4690 | 6.0577 | 135 |
| 5.4666 | 6.1473 | 136 |
| 5.4655 | 6.0441 | 137 |
| 5.4665 | 5.9313 | 138 |
| 5.4588 | 6.1375 | 139 |
| 5.4575 | 6.1655 | 140 |
| 5.4609 | 5.9701 | 141 |
| 5.4666 | 6.0677 | 142 |
| 5.4672 | 6.1272 | 143 |
| 5.4776 | 6.2186 | 144 |
| 5.4769 | 5.9815 | 145 |
| 5.4666 | 6.0674 | 146 |
| 5.4670 | 6.0282 | 147 |
| 5.4868 | 5.7416 | 148 |
| 5.4901 | 6.0836 | 149 |
| 5.4877 | 5.9086 | 150 |
| 5.4842 | 5.8724 | 151 |
| 5.5167 | 5.7298 | 152 |
| 5.5043 | 5.7802 | 153 |
| 5.4737 | 6.0805 | 154 |
| 5.4805 | 6.0888 | 155 |
| 5.4765 | 5.9967 | 156 |
| 5.4691 | 5.9332 | 157 |
| 5.4697 | 6.0675 | 158 |
| 5.4648 | 6.0689 | 159 |
| 5.4658 | 5.9954 | 160 |
| 5.4721 | 5.8917 | 161 |
| 5.4641 | 5.8973 | 162 |
| 5.4703 | 6.0126 | 163 |
| 5.4753 | 5.9064 | 164 |
| 5.4731 | 6.0835 | 165 |
| 5.5094 | 5.5720 | 166 |
| 5.5355 | 5.9077 | 167 |
| 5.4791 | 6.0669 | 168 |
| 5.4690 | 6.0729 | 169 |
| 5.4635 | 5.9580 | 170 |
| 5.4698 | 6.1453 | 171 |
| 5.4668 | 5.9952 | 172 |
| 5.4728 | 6.0041 | 173 |
| 5.5062 | 6.1592 | 174 |
| 5.4944 | 5.9536 | 175 |
| 5.4802 | 5.9673 | 176 |
| 5.4710 | 5.9888 | 177 |
| 5.4653 | 6.0656 | 178 |
| 5.4618 | 6.0278 | 179 |
| 5.4659 | 5.9563 | 180 |
| 5.4596 | 6.0022 | 181 |
| 5.4627 | 5.9594 | 182 |
| 5.4688 | 5.8462 | 183 |
| 5.4662 | 5.9550 | 184 |
| 5.4646 | 5.9757 | 185 |
| 5.4753 | 5.9400 | 186 |
| 5.4911 | 5.7438 | 187 |
| 5.4681 | 6.0941 | 188 |
| 5.4719 | 6.0324 | 189 |
| 5.4692 | 6.0313 | 190 |
| 5.4634 | 5.9874 | 191 |
| 5.4639 | 5.9928 | 192 |
| 5.4714 | 6.0265 | 193 |
| 5.4569 | 5.8387 | 194 |
| 5.4606 | 6.0462 | 195 |
| 5.4667 | 5.9636 | 196 |
| 5.4653 | 6.0299 | 197 |
| 5.4623 | 6.0311 | 198 |
| 5.4629 | 5.9745 | 199 |
| 5.4630 | 5.9398 | 200 |
| 5.4618 | 5.9005 | 201 |
| 5.4611 | 5.8718 | 202 |
| 5.4979 | 5.7893 | 203 |
| 5.4995 | 5.8556 | 204 |
| 5.4949 | 5.9533 | 205 |
| 5.4806 | 6.0033 | 206 |
| 5.4700 | 6.0395 | 207 |
| 5.4601 | 6.0592 | 208 |
| 5.4605 | 6.1408 | 209 |
| 5.4638 | 6.0469 | 210 |
| 5.4592 | 6.1216 | 211 |
| 5.4646 | 6.0284 | 212 |
| 5.4607 | 5.8940 | 213 |
| 5.4573 | 5.8946 | 214 |
| 5.4690 | 5.8057 | 215 |
### Framework versions
- Transformers 4.40.2
- TensorFlow 2.15.0
- Datasets 2.19.1
- Tokenizers 0.19.1