
bert-base-greek-uncased-v5-finetuned-polylex-mg

This model is a fine-tuned version of nlpaueb/bert-base-greek-uncased-v1 on the PolylexMG multiword-expression dataset described below. It achieves the following results on the evaluation set:

  • Loss: 1.3369

Model description

In this work we adapt a corpus of multiword expressions in Modern Greek, namely PolylexMG, characterised by features that formulate its spectrum of idiosyncrasy, to fine-tune the Greek BERT transformer model for masked language modelling and classification tasks. GREEK-BERT is a monolingual model based on the BERT-BASE-UNCASED architecture, pre-trained on free-text corpora extracted from (a) the Greek part of Wikipedia, (b) the Greek part of the European Parliament Proceedings Parallel Corpus (Europarl), and (c) the Greek part of OSCAR (Koutsikakis et al., 2020: 113). Specifically, Greek BERT has been fine-tuned with expressions drawn from each syntactic category described in PolylexMG (Fotopoulou et al., 2023), a dataset of 6,000 Greek lexical entries comprising frozen idioms, which are semantically fixed with no paradigmatic variation (Lamiroy, 2003), and light verb constructions, in which the semantics reside in the predicative noun rather than the verb (Anastassiadis-Symeonidis et al., 2020).
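As a quick usage illustration, the fine-tuned model can be queried through the Transformers fill-mask pipeline. This is a minimal sketch; the repository id is assumed from this card's title, and the masked expression is taken from the PolylexMG sample below.

```python
from transformers import pipeline

# Repository id assumed from the card title; replace with the actual Hub path.
fill_mask = pipeline(
    "fill-mask",
    model="bert-base-greek-uncased-v5-finetuned-polylex-mg",
)

# Mask the noun slot of a verbal MWE from the PolylexMG sample below,
# e.g. "αλλάζω τα φώτα σε" -> "αλλάζω τα [MASK] σε".
for prediction in fill_mask("αλλάζω τα [MASK] σε"):
    print(prediction["token_str"], round(prediction["score"], 4))
```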

Results, intended uses & limitations

This section presents the experimental evaluation results for the MWE-fine-tuned Greek BERT model on the classification use case. The setup assumes that raw Modern Greek text, possibly spanning multiple sentences, is processed by the language model, which reports a class indicating whether the text segment contains stereotypical multiword expressions or not. We compared the fine-tuned BERT model with a baseline logistic regression model; the latter uses as input the same word embeddings as the MWE-fine-tuned BERT model.
Greek-MWE-BERT was trained in a masked-language-model setting on the full-expression subdataset. Model perplexity was measured at 303.21 before fine-tuning and 3.81 after fine-tuning, demonstrating that the model has gained domain knowledge of multiword expressions. A qualitative evaluation on 16 verbal constructs demonstrates that the fine-tuned model generates stereotypical multiword expressions in all cases, while the original Greek BERT yields incomplete, free-text-related parts of sentences.
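Perplexity here is the exponential of the masked-language-modelling cross-entropy loss, so the reported 3.81 is consistent with the evaluation loss above (exp(1.3369) ≈ 3.81). A minimal measurement sketch, assuming the standard 15% random masking; the checkpoint name and the evaluation texts are placeholders:

```python
import math

import torch
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
)

torch.manual_seed(42)  # the random masking makes the estimate stochastic

# Assumed checkpoint; swap in the fine-tuned model to compare before/after.
name = "nlpaueb/bert-base-greek-uncased-v1"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name).eval()
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

def mlm_perplexity(texts):
    """exp of the mean masked-LM loss with 15% of tokens randomly masked."""
    batch = collator([tokenizer(t) for t in texts])  # adds [MASK]s and labels
    with torch.no_grad():
        loss = model(**batch).loss
    return math.exp(loss.item())

# In practice this should run over a reasonably sized evaluation set;
# very short batches may mask no tokens at all.
print(mlm_perplexity(["αλλάζω τα φώτα σε", "αλλάζω τα πέταλα σε"]))
```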

The fine-tuned model was further fine-tuned with a classification-oriented architecture on the classification-task subdataset. The BERT classifier achieves an accuracy of 80%, with a higher precision of 80% for free text and a lower precision of 79% for MWEs. In comparison, the baseline classifier yields 70% for free text and 67% for MWEs. The two models differ by only about 10% despite the simplicity of the baseline classifier; we attribute this small gap to the pretrained Greek BERT tokenizer, which is shared by our model and the logistic regression baseline. However, the MWE-fine-tuned Greek BERT model better captures sentences that contain MWEs, owing to the inherent advantages of the transformer architecture.
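The comparison can be reproduced along the following lines. This is a sketch under stated assumptions: the card does not specify the classification head, the pooling used for the baseline features, or the training texts, so the toy rows below are placeholders.

```python
import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from transformers import (
    AutoModel,
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

name = "nlpaueb/bert-base-greek-uncased-v1"  # assumed starting checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)

# Fine-tuned route: a binary sequence-classification head on top of BERT,
# trained end to end on the classification-task subdataset.
classifier = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

# Baseline route: frozen BERT embeddings fed to logistic regression.
encoder = AutoModel.from_pretrained(name).eval()

def embed(texts):
    """Mean-pooled last-hidden-state embeddings (one possible pooling choice)."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**enc).last_hidden_state
    return hidden.mean(dim=1).numpy()

texts = ["άναψαν τα αίματα", "άναψαν τα τέσσερα κόκκινα φανάρια"]  # toy rows
labels = np.array([1, 0])  # 1 = contains an MWE, 0 = free text
baseline = LogisticRegression().fit(embed(texts), labels)
```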

Training and evaluation data

Sample of PolylexMG full expression subdataset

| text | label |
| --- | --- |
| αδειάζω τη γωνιά σε | 1 |
| αδειάζω πιστόλι πάνω σε | 1 |
| αλλάζω τον αδόξαστο σε | 1 |
| αλλάζω την πίστη σε | 1 |
| δεν αλλάζω ούτε κόμα σε | 1 |
| αλλάζω λόγια με | 1 |
| αλλάζω κουβέντες με | 1 |
| αλλάζω τα μυαλά σε | 1 |
| αλλάζω τα φώτα σε | 1 |
| αλλάζω τα πετρέλαια σε | 1 |
| αλλάζω τα πέταλα σε | 1 |

Sample of PolylexMG classification subdataset

| text | label |
| --- | --- |
| Μέσα σε λίγα λεπτά άναψαν τα αίματα και ο διαπληκτισμός άρχισε να γίνεται όλο και πιο έντονος | 1 |
| Η πρώτη έκπληξη ήρθε αμέσως μόλις άναψαν τα τέσσερα κόκκινα φανάρια και το ένα πράσινο | 0 |
| Γιατί τα κάνετε αυτά, για να γελάνε οι άλλοι μαζί μας; | 0 |
| Κάθε φορά που έμπαινε καλάθι, έβγαζαν τις ίδιες ακριβώς ιαχές για να πάει γούρι και να μην κόψει η μαγιονέζα | 1 |
| Η νέα πυρκαγιά ξεκινά από την πίσω πλευρά του Πεντελικού Όρους, σε σημείο που δεν είχε καεί | 0 |
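A minimal sketch of how text/label pairs like those above can be assembled into a datasets object for fine-tuning; the rows are copied from the sample, while the split ratio is an assumption:

```python
from datasets import Dataset

rows = {
    "text": [
        "Μέσα σε λίγα λεπτά άναψαν τα αίματα και ο διαπληκτισμός άρχισε να γίνεται όλο και πιο έντονος",
        "Η πρώτη έκπληξη ήρθε αμέσως μόλις άναψαν τα τέσσερα κόκκινα φανάρια και το ένα πράσινο",
        "Γιατί τα κάνετε αυτά, για να γελάνε οι άλλοι μαζί μας;",
        "Κάθε φορά που έμπαινε καλάθι, έβγαζαν τις ίδιες ακριβώς ιαχές για να πάει γούρι και να μην κόψει η μαγιονέζα",
    ],
    "label": [1, 0, 0, 1],  # 1 = contains an MWE, 0 = free text
}

# 80/20 split (assumed ratio) with the seed reported in the hyperparameters below.
dataset = Dataset.from_dict(rows).train_test_split(test_size=0.2, seed=42)
print(dataset["train"][0])
```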

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged reconstruction follows the list):

  • learning_rate: 5e-06
  • train_batch_size: 512
  • eval_batch_size: 512
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 500
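A hedged reconstruction of these settings as Transformers 4.31 TrainingArguments; the output directory and evaluation strategy are assumptions, and the Adam betas and epsilon listed above are the library defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-greek-uncased-v5-finetuned-polylex-mg",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=512,
    per_device_eval_batch_size=512,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=500,
    evaluation_strategy="epoch",  # assumed; the log reports one loss per epoch
)
```

These arguments would then be passed to a Trainer together with the masked-LM model and the tokenized subdatasets.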


Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.3
  • Tokenizers 0.13.3

Training results

Training Loss Epoch Step Validation Loss
5.2105 1.0 13 4.4870
4.4319 2.0 26 3.8456
4.0318 3.0 39 3.4164
3.7558 4.0 52 3.2849
3.5626 5.0 65 3.3146
3.4355 6.0 78 3.1532
3.3299 7.0 91 3.0451
3.2313 8.0 104 2.9359
3.1758 9.0 117 2.8543
3.0762 10.0 130 2.8034
3.0318 11.0 143 2.7975
2.9481 12.0 156 2.6439
2.8848 13.0 169 2.6623
2.9002 14.0 182 2.6425
2.8435 15.0 195 2.6639
2.8451 16.0 208 2.6203
2.7987 17.0 221 2.5597
2.7522 18.0 234 2.5719
2.7194 19.0 247 2.6220
2.6923 20.0 260 2.5566
2.678 21.0 273 2.4172
2.6612 22.0 286 2.5726
2.6272 23.0 299 2.4478
2.6052 24.0 312 2.4366
2.5694 25.0 325 2.3694
2.593 26.0 338 2.4324
2.548 27.0 351 2.4070
2.4954 28.0 364 2.3651
2.5097 29.0 377 2.3268
2.5041 30.0 390 2.4208
2.4919 31.0 403 2.4321
2.461 32.0 416 2.3477
2.4698 33.0 429 2.4017
2.4557 34.0 442 2.3050
2.4464 35.0 455 2.3282
2.4215 36.0 468 2.3339
2.4037 37.0 481 2.2429
2.386 38.0 494 2.3452
2.3961 39.0 507 2.3312
2.3985 40.0 520 2.2921
2.3302 41.0 533 2.2711
2.3128 42.0 546 2.2344
2.3158 43.0 559 2.1982
2.2927 44.0 572 2.1473
2.3122 45.0 585 2.2317
2.2885 46.0 598 2.2060
2.2592 47.0 611 2.1943
2.2492 48.0 624 2.2361
2.2495 49.0 637 2.2059
2.2402 50.0 650 2.1461
2.241 51.0 663 2.2181
2.211 52.0 676 2.0885
2.2165 53.0 689 2.1567
2.2063 54.0 702 2.2112
2.1715 55.0 715 2.2934
2.1601 56.0 728 2.0745
2.1796 57.0 741 2.1070
2.152 58.0 754 2.0930
2.1562 59.0 767 2.1106
2.125 60.0 780 2.1529
2.1318 61.0 793 2.0296
2.1194 62.0 806 2.0323
2.1396 63.0 819 1.9835
2.1108 64.0 832 2.0066
2.0874 65.0 845 1.9062
2.0754 66.0 858 2.1728
2.0928 67.0 871 2.0197
2.0835 68.0 884 2.0767
2.0684 69.0 897 2.1482
2.0505 70.0 910 2.0667
2.0564 71.0 923 2.1489
2.0478 72.0 936 2.0015
2.0478 73.0 949 1.9215
2.0316 74.0 962 2.0238
2.0171 75.0 975 2.0014
2.0248 76.0 988 2.0775
2.0066 77.0 1001 2.0390
2.0018 78.0 1014 2.0043
1.9925 79.0 1027 2.0138
1.9614 80.0 1040 1.9499
1.9877 81.0 1053 1.9642
1.9499 82.0 1066 1.9676
1.932 83.0 1079 1.9332
1.9353 84.0 1092 1.8787
1.9672 85.0 1105 1.9720
1.9313 86.0 1118 1.9343
1.9292 87.0 1131 1.8964
1.9277 88.0 1144 1.9619
1.9158 89.0 1157 1.9608
1.921 90.0 1170 1.9171
1.9191 91.0 1183 1.8871
1.8935 92.0 1196 1.8857
1.8818 93.0 1209 1.8909
1.8782 94.0 1222 1.8951
1.9028 95.0 1235 1.9164
1.8907 96.0 1248 1.9650
1.8626 97.0 1261 1.8906
1.8413 98.0 1274 1.8957
1.854 99.0 1287 1.9644
1.8608 100.0 1300 1.8329
1.8623 101.0 1313 1.8693
1.7798 102.0 1326 1.8913
1.846 103.0 1339 1.7854
1.7972 104.0 1352 1.8611
1.8443 105.0 1365 1.8482
1.791 106.0 1378 1.7168
1.7879 107.0 1391 1.8093
1.7886 108.0 1404 1.8924
1.8192 109.0 1417 1.7715
1.7919 110.0 1430 1.7415
1.7581 111.0 1443 1.7956
1.7873 112.0 1456 1.7213
1.7873 113.0 1469 1.7340
1.7764 114.0 1482 1.8535
1.7612 115.0 1495 1.8554
1.7737 116.0 1508 1.8126
1.7416 117.0 1521 1.8327
1.7648 118.0 1534 1.6832
1.7262 119.0 1547 1.6972
1.7334 120.0 1560 1.7930
1.7172 121.0 1573 1.6962
1.7282 122.0 1586 1.8800
1.7038 123.0 1599 1.7828
1.6935 124.0 1612 1.7646
1.758 125.0 1625 1.8069
1.7018 126.0 1638 1.6958
1.6886 127.0 1651 1.6692
1.7004 128.0 1664 1.7256
1.6947 129.0 1677 1.7587
1.6897 130.0 1690 1.7484
1.7037 131.0 1703 1.8455
1.6981 132.0 1716 1.7588
1.6828 133.0 1729 1.7421
1.6596 134.0 1742 1.6933
1.6782 135.0 1755 1.7040
1.6595 136.0 1768 1.6705
1.6567 137.0 1781 1.7744
1.6588 138.0 1794 1.6545
1.6225 139.0 1807 1.7576
1.6394 140.0 1820 1.7256
1.6515 141.0 1833 1.6668
1.6331 142.0 1846 1.7884
1.6367 143.0 1859 1.7093
1.6335 144.0 1872 1.7098
1.6501 145.0 1885 1.6671
1.6192 146.0 1898 1.7073
1.6198 147.0 1911 1.6653
1.6182 148.0 1924 1.6723
1.6172 149.0 1937 1.7293
1.6129 150.0 1950 1.6545
1.6054 151.0 1963 1.6850
1.5967 152.0 1976 1.7064
1.6028 153.0 1989 1.5292
1.6156 154.0 2002 1.6477
1.5965 155.0 2015 1.6110
1.5695 156.0 2028 1.7071
1.5586 157.0 2041 1.6504
1.561 158.0 2054 1.6147
1.5643 159.0 2067 1.6941
1.5797 160.0 2080 1.7398
1.5609 161.0 2093 1.5761
1.5465 162.0 2106 1.6003
1.5467 163.0 2119 1.5839
1.5935 164.0 2132 1.6530
1.5439 165.0 2145 1.6743
1.559 166.0 2158 1.5143
1.5648 167.0 2171 1.6390
1.552 168.0 2184 1.5389
1.5164 169.0 2197 1.5879
1.5342 170.0 2210 1.6785
1.5319 171.0 2223 1.6341
1.5477 172.0 2236 1.7071
1.5364 173.0 2249 1.6268
1.5366 174.0 2262 1.7247
1.5445 175.0 2275 1.6668
1.4916 176.0 2288 1.5756
1.509 177.0 2301 1.5412
1.5316 178.0 2314 1.6270
1.5156 179.0 2327 1.6423
1.4918 180.0 2340 1.6112
1.4997 181.0 2353 1.5775
1.5187 182.0 2366 1.6248
1.5254 183.0 2379 1.5884
1.4732 184.0 2392 1.5787
1.4844 185.0 2405 1.5358
1.4882 186.0 2418 1.5144
1.478 187.0 2431 1.5223
1.5101 188.0 2444 1.5787
1.4688 189.0 2457 1.5479
1.4815 190.0 2470 1.5141
1.4925 191.0 2483 1.5939
1.467 192.0 2496 1.5471
1.4718 193.0 2509 1.6845
1.4699 194.0 2522 1.5943
1.4562 195.0 2535 1.4745
1.4451 196.0 2548 1.5922
1.4451 197.0 2561 1.5856
1.4624 198.0 2574 1.5519
1.444 199.0 2587 1.6538
1.4498 200.0 2600 1.5037
1.4285 201.0 2613 1.5539
1.4439 202.0 2626 1.5387
1.4177 203.0 2639 1.5756
1.436 204.0 2652 1.6136
1.4184 205.0 2665 1.5014
1.43 206.0 2678 1.4983
1.4347 207.0 2691 1.5896
1.39 208.0 2704 1.5506
1.4198 209.0 2717 1.5142
1.4101 210.0 2730 1.4930
1.4219 211.0 2743 1.4814
1.4039 212.0 2756 1.3750
1.4479 213.0 2769 1.5330
1.4354 214.0 2782 1.5179
1.4163 215.0 2795 1.5970
1.4459 216.0 2808 1.4755
1.3714 217.0 2821 1.4230
1.3957 218.0 2834 1.5087
1.396 219.0 2847 1.5570
1.3866 220.0 2860 1.4955
1.4122 221.0 2873 1.4272
1.371 222.0 2886 1.5209
1.3907 223.0 2899 1.4725
1.3856 224.0 2912 1.5021
1.4053 225.0 2925 1.4880
1.4074 226.0 2938 1.4988
1.3827 227.0 2951 1.5527
1.4045 228.0 2964 1.5350
1.3626 229.0 2977 1.5093
1.3795 230.0 2990 1.4497
1.3973 231.0 3003 1.5106
1.3703 232.0 3016 1.4619
1.3942 233.0 3029 1.4553
1.3447 234.0 3042 1.5061
1.3438 235.0 3055 1.5167
1.3496 236.0 3068 1.4060
1.3614 237.0 3081 1.4211
1.3618 238.0 3094 1.4624
1.359 239.0 3107 1.4450
1.3657 240.0 3120 1.4795
1.3599 241.0 3133 1.4887
1.3532 242.0 3146 1.4606
1.3528 243.0 3159 1.4225
1.3445 244.0 3172 1.3912
1.3344 245.0 3185 1.4055
1.3358 246.0 3198 1.5152
1.3591 247.0 3211 1.4825
1.3162 248.0 3224 1.4721
1.3197 249.0 3237 1.4375
1.3358 250.0 3250 1.4644
1.3374 251.0 3263 1.4449
1.3548 252.0 3276 1.4405
1.3266 253.0 3289 1.5357
1.3172 254.0 3302 1.3515
1.3089 255.0 3315 1.4408
1.3209 256.0 3328 1.3895
1.3047 257.0 3341 1.4508
1.2877 258.0 3354 1.3954
1.3409 259.0 3367 1.4417
1.31 260.0 3380 1.5124
1.3229 261.0 3393 1.4047
1.3275 262.0 3406 1.3780
1.295 263.0 3419 1.4209
1.3279 264.0 3432 1.3867
1.291 265.0 3445 1.4694
1.2839 266.0 3458 1.5100
1.3064 267.0 3471 1.3646
1.3086 268.0 3484 1.4390
1.3381 269.0 3497 1.4367
1.3333 270.0 3510 1.4078
1.2775 271.0 3523 1.5213
1.2989 272.0 3536 1.4341
1.2759 273.0 3549 1.5165
1.2796 274.0 3562 1.4705
1.3037 275.0 3575 1.3945
1.3132 276.0 3588 1.4560
1.2816 277.0 3601 1.4123
1.2934 278.0 3614 1.3742
1.2873 279.0 3627 1.3824
1.2842 280.0 3640 1.3269
1.2617 281.0 3653 1.4345
1.2661 282.0 3666 1.4682
1.3096 283.0 3679 1.3989
1.2724 284.0 3692 1.3142
1.2529 285.0 3705 1.2795
1.2611 286.0 3718 1.3844
1.2578 287.0 3731 1.3536
1.2854 288.0 3744 1.3770
1.2811 289.0 3757 1.3892
1.2189 290.0 3770 1.3767
1.283 291.0 3783 1.4034
1.2684 292.0 3796 1.3867
1.241 293.0 3809 1.3572
1.2503 294.0 3822 1.3583
1.2605 295.0 3835 1.4600
1.2697 296.0 3848 1.2754
1.2469 297.0 3861 1.4295
1.2451 298.0 3874 1.4645
1.2765 299.0 3887 1.3605
1.2482 300.0 3900 1.4915
1.2564 301.0 3913 1.3490
1.233 302.0 3926 1.3273
1.2313 303.0 3939 1.3861
1.2491 304.0 3952 1.4016
1.2607 305.0 3965 1.3714
1.2548 306.0 3978 1.3572
1.2536 307.0 3991 1.3630
1.24 308.0 4004 1.3070
1.2352 309.0 4017 1.4311
1.2643 310.0 4030 1.2794
1.2281 311.0 4043 1.3855
1.2428 312.0 4056 1.3784
1.2196 313.0 4069 1.3430
1.2116 314.0 4082 1.4230
1.2261 315.0 4095 1.4760
1.25 316.0 4108 1.3658
1.2281 317.0 4121 1.3563
1.2308 318.0 4134 1.3107
1.2247 319.0 4147 1.3554
1.2354 320.0 4160 1.3956
1.2168 321.0 4173 1.2753
1.2078 322.0 4186 1.3253
1.2481 323.0 4199 1.3025
1.2331 324.0 4212 1.3707
1.1974 325.0 4225 1.2874
1.212 326.0 4238 1.3210
1.225 327.0 4251 1.4129
1.2161 328.0 4264 1.3364
1.2304 329.0 4277 1.3822
1.1903 330.0 4290 1.4887
1.2208 331.0 4303 1.2687
1.229 332.0 4316 1.3730
1.205 333.0 4329 1.3521
1.2023 334.0 4342 1.3770
1.2151 335.0 4355 1.3095
1.2255 336.0 4368 1.3003
1.2205 337.0 4381 1.2123
1.203 338.0 4394 1.2995
1.2013 339.0 4407 1.2838
1.1997 340.0 4420 1.3023
1.2033 341.0 4433 1.3111
1.1934 342.0 4446 1.4057
1.1832 343.0 4459 1.3468
1.2405 344.0 4472 1.3362
1.1803 345.0 4485 1.4813
1.2154 346.0 4498 1.3207
1.2314 347.0 4511 1.3236
1.1927 348.0 4524 1.3428
1.2194 349.0 4537 1.3533
1.1995 350.0 4550 1.3465
1.177 351.0 4563 1.3484
1.1993 352.0 4576 1.2859
1.1687 353.0 4589 1.2699
1.2045 354.0 4602 1.3686
1.2084 355.0 4615 1.3515
1.1837 356.0 4628 1.2735
1.1937 357.0 4641 1.2835
1.2004 358.0 4654 1.2793
1.1838 359.0 4667 1.2798
1.2026 360.0 4680 1.3856
1.1669 361.0 4693 1.3719
1.1716 362.0 4706 1.2613
1.1906 363.0 4719 1.2719
1.1914 364.0 4732 1.3864
1.1874 365.0 4745 1.3255
1.1848 366.0 4758 1.2984
1.1778 367.0 4771 1.3461
1.1964 368.0 4784 1.3320
1.16 369.0 4797 1.2962
1.1873 370.0 4810 1.3035
1.1632 371.0 4823 1.3465
1.1807 372.0 4836 1.3453
1.1331 373.0 4849 1.3527
1.1694 374.0 4862 1.2928
1.1615 375.0 4875 1.3519
1.1944 376.0 4888 1.4072
1.163 377.0 4901 1.3156
1.1719 378.0 4914 1.3074
1.1721 379.0 4927 1.3121
1.1618 380.0 4940 1.3039
1.1852 381.0 4953 1.3562
1.1838 382.0 4966 1.3383
1.1616 383.0 4979 1.2922
1.1401 384.0 4992 1.2676
1.165 385.0 5005 1.2625
1.1564 386.0 5018 1.1716
1.1662 387.0 5031 1.2738
1.1761 388.0 5044 1.4011
1.1587 389.0 5057 1.3821
1.1517 390.0 5070 1.2879
1.1699 391.0 5083 1.2898
1.149 392.0 5096 1.2710
1.1541 393.0 5109 1.2612
1.1597 394.0 5122 1.2993
1.1449 395.0 5135 1.2522
1.1332 396.0 5148 1.3367
1.1537 397.0 5161 1.3018
1.1789 398.0 5174 1.3705
1.169 399.0 5187 1.3128
1.1685 400.0 5200 1.3068
1.137 401.0 5213 1.2384
1.177 402.0 5226 1.2547
1.1592 403.0 5239 1.3295
1.1477 404.0 5252 1.3415
1.1465 405.0 5265 1.2466
1.1743 406.0 5278 1.3045
1.1386 407.0 5291 1.3124
1.1379 408.0 5304 1.2826
1.1828 409.0 5317 1.2788
1.1353 410.0 5330 1.3787
1.1536 411.0 5343 1.2968
1.1495 412.0 5356 1.2920
1.1424 413.0 5369 1.3238
1.158 414.0 5382 1.3301
1.1715 415.0 5395 1.2298
1.1559 416.0 5408 1.2769
1.1399 417.0 5421 1.3263
1.186 418.0 5434 1.2924
1.1653 419.0 5447 1.3279
1.14 420.0 5460 1.2892
1.1463 421.0 5473 1.3875
1.1406 422.0 5486 1.3136
1.1705 423.0 5499 1.2579
1.1065 424.0 5512 1.2955
1.145 425.0 5525 1.2970
1.1538 426.0 5538 1.3030
1.1674 427.0 5551 1.3060
1.1283 428.0 5564 1.2325
1.1683 429.0 5577 1.3085
1.1598 430.0 5590 1.2469
1.1429 431.0 5603 1.2523
1.1552 432.0 5616 1.3124
1.1722 433.0 5629 1.2955
1.1329 434.0 5642 1.3249
1.1486 435.0 5655 1.3245
1.124 436.0 5668 1.4052
1.1092 437.0 5681 1.2399
1.135 438.0 5694 1.2788
1.1637 439.0 5707 1.2844
1.1712 440.0 5720 1.2531
1.1401 441.0 5733 1.2790
1.1195 442.0 5746 1.2876
1.1524 443.0 5759 1.2565
1.1292 444.0 5772 1.1492
1.1342 445.0 5785 1.3050
1.1628 446.0 5798 1.2911
1.1286 447.0 5811 1.3624
1.1193 448.0 5824 1.2382
1.1521 449.0 5837 1.2717
1.1128 450.0 5850 1.2865
1.1321 451.0 5863 1.2785
1.1707 452.0 5876 1.3514
1.1431 453.0 5889 1.3321
1.1413 454.0 5902 1.2886
1.0983 455.0 5915 1.3165
1.1202 456.0 5928 1.2375
1.1259 457.0 5941 1.2166
1.1353 458.0 5954 1.3579
1.1272 459.0 5967 1.2890
1.1411 460.0 5980 1.2397
1.115 461.0 5993 1.2803
1.14 462.0 6006 1.2439
1.11 463.0 6019 1.1894
1.1539 464.0 6032 1.2979
1.1052 465.0 6045 1.2281
1.1092 466.0 6058 1.2853
1.1229 467.0 6071 1.2988
1.1209 468.0 6084 1.3058
1.1147 469.0 6097 1.2705
1.1228 470.0 6110 1.2435
1.1124 471.0 6123 1.2188
1.0922 472.0 6136 1.2892
1.1228 473.0 6149 1.2250
1.1341 474.0 6162 1.2373
1.1295 475.0 6175 1.2126
1.1105 476.0 6188 1.3032
1.1223 477.0 6201 1.2190
1.1487 478.0 6214 1.2728
1.1288 479.0 6227 1.3258
1.1398 480.0 6240 1.2114
1.1127 481.0 6253 1.2695
1.135 482.0 6266 1.3376
1.106 483.0 6279 1.2860
1.0978 484.0 6292 1.3001
1.1254 485.0 6305 1.3180
1.1117 486.0 6318 1.3036
1.1249 487.0 6331 1.2380
1.1111 488.0 6344 1.3112
1.119 489.0 6357 1.2587
1.1203 490.0 6370 1.2867
1.1195 491.0 6383 1.3153
1.1304 492.0 6396 1.2762
1.1268 493.0 6409 1.2757
1.1478 494.0 6422 1.2493
1.1527 495.0 6435 1.2793
1.1252 496.0 6448 1.2435
1.1307 497.0 6461 1.3311
1.1163 498.0 6474 1.3016
1.099 499.0 6487 1.3532
1.1246 500.0 6500 1.2222