Legal-BERTimbau-sts-base-ma/eval/translation_evaluation_TED2020-en-pt-dev.tsv.gz_results.csv
epoch,steps,src2trg,trg2src
0,1000,0.045,0.185
0,2000,0.055,0.108
0,3000,0.093,0.139
0,4000,0.21,0.248
0,5000,0.426,0.438
0,6000,0.584,0.613
0,7000,0.718,0.719
0,8000,0.808,0.814
0,9000,0.856,0.879
0,-1,0.887,0.895
1,1000,0.92,0.923
1,2000,0.93,0.929
1,3000,0.939,0.936
1,4000,0.946,0.947
1,5000,0.954,0.951
1,6000,0.957,0.958
1,7000,0.959,0.957
1,8000,0.964,0.959
1,9000,0.966,0.962
1,-1,0.966,0.964
2,1000,0.966,0.965
2,2000,0.969,0.966
2,3000,0.969,0.968
2,4000,0.971,0.97
2,5000,0.972,0.971
2,6000,0.972,0.972
2,7000,0.973,0.972
2,8000,0.973,0.97
2,9000,0.973,0.97
2,-1,0.972,0.971
3,1000,0.973,0.973
3,2000,0.975,0.971
3,3000,0.975,0.971
3,4000,0.973,0.971
3,5000,0.973,0.973
3,6000,0.974,0.974
3,7000,0.974,0.973
3,8000,0.974,0.976
3,9000,0.973,0.974
3,-1,0.973,0.975
4,1000,0.975,0.976
4,2000,0.974,0.975
4,3000,0.975,0.974
4,4000,0.974,0.975
4,5000,0.974,0.975
4,6000,0.974,0.976
4,7000,0.974,0.976
4,8000,0.974,0.976
4,9000,0.974,0.976
4,-1,0.974,0.976