Dataset schema (column, dtype, observed range or class count):

| column | dtype | range / classes |
|---|---|---|
| model_type | stringclasses | 4 values |
| model | stringlengths | 13–62 |
| AVG | float64 | 0.04–0.7 |
| CG | float64 | 0–0.68 |
| EL | float64 | 0–0.62 |
| FA | float64 | 0–0.35 |
| HE | float64 | 0–0.79 |
| MC | float64 | 0–0.92 |
| MR | float64 | 0–0.95 |
| MT | float64 | 0.3–0.86 |
| NLI | float64 | 0–0.82 |
| QA | float64 | 0.01–0.77 |
| RC | float64 | 0.04–0.93 |
| SUM | float64 | 0–0.29 |
| aio_char_f1 | float64 | 0–0.9 |
| alt-e-to-j_bert_score_ja_f1 | float64 | 0.26–0.88 |
| alt-e-to-j_bleu_ja | float64 | 0.32–16 |
| alt-e-to-j_comet_wmt22 | float64 | 0.29–0.92 |
| alt-j-to-e_bert_score_en_f1 | float64 | 0.37–0.96 |
| alt-j-to-e_bleu_en | float64 | 0.02–20.1 |
| alt-j-to-e_comet_wmt22 | float64 | 0.3–0.89 |
| chabsa_set_f1 | float64 | 0–0.62 |
| commonsensemoralja_exact_match | float64 | 0–0.94 |
| jamp_exact_match | float64 | 0–0.74 |
| janli_exact_match | float64 | 0–0.95 |
| jcommonsenseqa_exact_match | float64 | 0–0.97 |
| jemhopqa_char_f1 | float64 | 0–0.71 |
| jmmlu_exact_match | float64 | 0–0.76 |
| jnli_exact_match | float64 | 0–0.9 |
| jsem_exact_match | float64 | 0–0.81 |
| jsick_exact_match | float64 | 0–0.87 |
| jsquad_char_f1 | float64 | 0.04–0.93 |
| jsts_pearson | float64 | -0.23–0.91 |
| jsts_spearman | float64 | -0.19–0.88 |
| kuci_exact_match | float64 | 0–0.86 |
| mawps_exact_match | float64 | 0–0.95 |
| mbpp_code_exec | float64 | 0–0.68 |
| mbpp_pylint_check | float64 | 0–0.99 |
| mmlu_en_exact_match | float64 | 0–0.82 |
| niilc_char_f1 | float64 | 0.01–0.7 |
| wiki_coreference_set_f1 | float64 | 0–0.13 |
| wiki_dependency_set_f1 | float64 | 0–0.55 |
| wiki_ner_set_f1 | float64 | 0–0.17 |
| wiki_pas_set_f1 | float64 | 0–0.12 |
| wiki_reading_char_f1 | float64 | 0.02–0.91 |
| wikicorpus-e-to-j_bert_score_ja_f1 | float64 | 0.15–0.87 |
| wikicorpus-e-to-j_bleu_ja | float64 | 0.17–18.3 |
| wikicorpus-e-to-j_comet_wmt22 | float64 | 0.3–0.87 |
| wikicorpus-j-to-e_bert_score_en_f1 | float64 | 0.26–0.92 |
| wikicorpus-j-to-e_bleu_en | float64 | 0.03–13.8 |
| wikicorpus-j-to-e_comet_wmt22 | float64 | 0.28–0.78 |
| xlsum_ja_bert_score_ja_f1 | float64 | 0–0.79 |
| xlsum_ja_bleu_ja | float64 | 0–10.2 |
| xlsum_ja_rouge1 | float64 | 0.05–52.8 |
| xlsum_ja_rouge2 | float64 | 0.01–29.2 |
| xlsum_ja_rouge2_scaling | float64 | 0–0.29 |
| xlsum_ja_rougeLsum | float64 | 0.05–44.9 |
| architecture | stringclasses | 8 values |
| precision | stringclasses | 2 values |
| license | stringclasses | 11 values |
| params | float64 | 0.14–70.6 |
| likes | int64 | 0–4.03k |
| revision | stringclasses | 1 value |
| num_few_shot | int64 | 0–4 |
| add_special_tokens | stringclasses | 2 values |
| llm_jp_eval_version | stringclasses | 1 value |
| vllm_version | stringclasses | 1 value |
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V6-70B
0.6225
0.0964
0.5783
0.2658
0.7727
0.886
0.936
0.8401
0.8032
0.7044
0.9085
0.0564
0.8308
0.8539
13.1706
0.8893
0.9587
18.6605
0.8894
0.5783
0.9076
0.704
0.9222
0.9446
0.661
0.7328
0.7342
0.8074
0.848
0.9085
0.8722
0.8504
0.8058
0.936
0.0964
0.1827
0.8126
0.6214
0.0344
0.3689
0.0177
0.0252
0.8829
0.8423
16.1434
0.8297
0.9081
12.3161
0.752
0.6416
2.9336
15.9098
5.6527
0.0564
13.6909
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V6-70B
0.3656
0.0964
0.2202
0.1608
0.0054
0.865
0.062
0.8101
0.7409
0.2699
0.7342
0.0564
0.4067
0.8458
12.8299
0.8743
0.9479
17.915
0.8607
0.2202
0.9093
0.6839
0.8125
0.9249
0.1984
0.0017
0.719
0.6168
0.8725
0.7342
0.4631
0.8171
0.7608
0.062
0.0964
0.1827
0.009
0.2044
0.0025
0.0046
0
0
0.7971
0.7971
12.7938
0.7737
0.8986
11.4304
0.7318
0.6416
2.9336
15.9098
5.6527
0.0564
13.6909
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V7-70B
0.6593
0.4779
0.5789
0.2861
0.7574
0.9062
0.946
0.774
0.7986
0.712
0.925
0.0903
0.8266
0.862
14.9443
0.8927
0.9204
19.2739
0.7848
0.5789
0.9223
0.6638
0.8917
0.9544
0.6658
0.7232
0.7839
0.803
0.8508
0.925
0.9014
0.8726
0.8419
0.946
0.4779
0.8394
0.7915
0.6436
0.1132
0.4228
0.115
0.0799
0.6994
0.7891
15.9759
0.7081
0.8908
12.5414
0.7104
0.681
3.745
20.7696
9.0219
0.0903
17.1776
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V7-70B
0.5296
0.4779
0.3245
0.1605
0.7061
0.8782
0.716
0.7239
0.7654
0.2549
0.7281
0.0903
0.2285
0.869
14.1359
0.9122
0.8573
18.1692
0.5904
0.3245
0.9143
0.7098
0.7569
0.9437
0.2498
0.6846
0.7145
0.7948
0.8508
0.7281
0.9061
0.872
0.7766
0.716
0.4779
0.8394
0.7276
0.2863
0.0009
0.0135
0.0088
0.0099
0.7693
0.8288
11.3951
0.8396
0.834
11.2014
0.5533
0.681
3.745
20.7696
9.0219
0.0903
17.1776
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
โญ• : instruction-tuned
sthenno-com/miscii-14b-1225
0.558
0.5924
0.231
0.1505
0.5701
0.8561
0.826
0.8386
0.7456
0.3605
0.863
0.1044
0.4587
0.8501
10.8416
0.9028
0.952
16.0986
0.8808
0.231
0.894
0.6178
0.8236
0.933
0.3379
0.6733
0.7671
0.7033
0.8161
0.863
0.8901
0.8611
0.7414
0.826
0.5924
0.9458
0.4669
0.2851
0.0137
0.0096
0
0.0008
0.7286
0.8029
8.2491
0.8258
0.895
8.9185
0.7451
0.6977
2.9606
27.9443
10.4524
0.1044
24.4193
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
17
main
0
False
v1.4.1
v0.6.3.post1
โญ• : instruction-tuned
sthenno-com/miscii-14b-1225
0.6454
0.5924
0.567
0.2604
0.7409
0.8806
0.89
0.8478
0.7708
0.5391
0.9064
0.1044
0.5488
0.8637
12.9486
0.9086
0.9539
17.5864
0.8838
0.567
0.897
0.6293
0.8194
0.9526
0.5754
0.7136
0.8505
0.7797
0.7749
0.9064
0.8923
0.8664
0.7922
0.89
0.5924
0.9458
0.7682
0.4931
0.0933
0.3381
0
0.071
0.7994
0.8293
11.1729
0.8409
0.9046
10.8587
0.7578
0.6977
2.9606
27.9443
10.4524
0.1044
24.4193
Qwen2ForCausalLM
bfloat16
apache-2.0
14.766
17
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
rombodawg/Rombos-LLM-V2.5-Qwen-32b
0.5456
0.5321
0.104
0.1468
0.5748
0.874
0.794
0.8386
0.7642
0.3892
0.8869
0.0973
0.441
0.8486
11.2433
0.901
0.9512
15.6734
0.8802
0.104
0.9043
0.6494
0.7944
0.9303
0.2681
0.5645
0.8188
0.7961
0.7623
0.8869
0.8954
0.8763
0.7875
0.794
0.5321
0.761
0.5851
0.4585
0.0281
0.0068
0.0354
0.0048
0.6589
0.8002
8.7465
0.8266
0.8967
9.6053
0.7466
0.6931
2.7803
25.9887
9.7254
0.0973
22.6735
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
48
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
rombodawg/Rombos-LLM-V2.5-Qwen-32b
0.6558
0.5321
0.588
0.2738
0.7769
0.8969
0.944
0.8476
0.8105
0.541
0.9054
0.0973
0.5542
0.8643
13.2226
0.9077
0.9553
17.6746
0.8859
0.588
0.8985
0.6753
0.8417
0.958
0.5672
0.7535
0.8977
0.7797
0.8579
0.9054
0.8895
0.8772
0.8341
0.944
0.5321
0.761
0.8003
0.5015
0.054
0.3845
0
0.1132
0.8175
0.8291
11.0412
0.8387
0.9044
11.1234
0.7581
0.6931
2.7803
25.9887
9.7254
0.0973
22.6735
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
48
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿค : base merges and moerges
maldv/Qwentile2.5-32B-Instruct
0.5715
0.3233
0.234
0.1439
0.7444
0.8716
0.866
0.8454
0.7511
0.4806
0.9113
0.1146
0.4784
0.8528
12.0338
0.9053
0.9549
16.6195
0.8865
0.234
0.9033
0.6322
0.7597
0.9357
0.5139
0.7182
0.8426
0.803
0.7179
0.9113
0.8892
0.8675
0.7757
0.866
0.3233
0.4116
0.7705
0.4496
0.0025
0.0069
0.0088
0.0082
0.6929
0.8076
9.3966
0.8356
0.8995
9.9257
0.7543
0.706
3.1737
28.67
11.4479
0.1146
24.92
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
28
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿค : base merges and moerges
maldv/Qwentile2.5-32B-Instruct
0.6457
0.3233
0.5805
0.2917
0.7864
0.8992
0.946
0.8506
0.8059
0.5826
0.9213
0.1146
0.5838
0.8662
13.2438
0.91
0.9557
17.5366
0.8856
0.5805
0.9058
0.6897
0.8236
0.9607
0.6368
0.7639
0.8948
0.8093
0.8121
0.9213
0.903
0.8802
0.8312
0.946
0.3233
0.4116
0.8089
0.5272
0.0321
0.3916
0.1239
0.0975
0.8134
0.8399
12.9473
0.8439
0.9085
11.632
0.7631
0.706
3.1737
28.67
11.4479
0.1146
24.92
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
28
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
arcee-ai/Virtuoso-Small
0.5215
0.5241
0.231
0.1427
0.3419
0.8596
0.764
0.8301
0.7379
0.3427
0.8706
0.0923
0.427
0.8455
10.4153
0.8959
0.9468
16.487
0.8664
0.231
0.895
0.6034
0.8042
0.9401
0.3144
0.564
0.7502
0.7184
0.8135
0.8706
0.896
0.8691
0.7436
0.764
0.5241
0.8896
0.1198
0.2868
0.0246
0.0044
0
0.0024
0.6818
0.7962
8.436
0.8199
0.8924
9.3765
0.7383
0.6867
2.5992
24.1217
9.2262
0.0923
21.1957
Qwen2ForCausalLM
bfloat16
apache-2.0
14.77
45
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
arcee-ai/Virtuoso-Small
0.639
0.5241
0.569
0.2468
0.7373
0.8867
0.888
0.8482
0.7807
0.5483
0.9075
0.0923
0.5425
0.8634
12.4457
0.9067
0.9544
17.0927
0.885
0.569
0.9053
0.6494
0.8236
0.9571
0.5787
0.7077
0.8624
0.7753
0.7928
0.9075
0.8861
0.8707
0.7978
0.888
0.5241
0.8896
0.767
0.5238
0.0753
0.3179
0
0.063
0.7778
0.8277
11.244
0.8421
0.9042
10.7569
0.759
0.6867
2.5992
24.1217
9.2262
0.0923
21.1957
Qwen2ForCausalLM
bfloat16
apache-2.0
14.77
45
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
nbeerbower/Mistral-Nemo-Prism-12B-v2
0.4261
0.006
0.2622
0.1425
0.4994
0.6263
0.382
0.8333
0.6629
0.3403
0.8169
0.1149
0.4148
0.8448
10.7822
0.9011
0.9477
14.2615
0.8756
0.2622
0.5306
0.5086
0.7403
0.7954
0.2408
0.4648
0.5489
0.6806
0.8362
0.8169
0.8254
0.7816
0.5529
0.382
0.006
0.0382
0.534
0.3652
0.0025
0.0015
0.0111
0.0072
0.6901
0.7924
8.0262
0.8178
0.895
8.5782
0.7388
0.7082
2.4913
32.6038
11.4923
0.1149
27.7702
MistralForCausalLM
bfloat16
apache-2.0
12.248
3
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
nbeerbower/Mistral-Nemo-Prism-12B-v2
0.5444
0.006
0.4957
0.2553
0.5886
0.844
0.728
0.8441
0.7104
0.5038
0.8971
0.1149
0.5552
0.8595
11.9703
0.9068
0.9516
15.7131
0.8809
0.4957
0.886
0.5115
0.7681
0.9276
0.4677
0.5383
0.7839
0.7487
0.7398
0.8971
0.8749
0.8479
0.7184
0.728
0.006
0.0382
0.639
0.4885
0.0054
0.3391
0.0619
0.0782
0.792
0.8206
9.9457
0.8358
0.9014
9.3628
0.7532
0.7082
2.4913
32.6038
11.4923
0.1149
27.7702
MistralForCausalLM
bfloat16
apache-2.0
12.248
3
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V3-70B
0.6425
0.2088
0.5941
0.2889
0.7778
0.9027
0.944
0.8525
0.8038
0.6987
0.9132
0.0831
0.8086
0.8733
14.583
0.9114
0.9589
18.6433
0.8897
0.5941
0.9296
0.6782
0.9292
0.9535
0.6741
0.7405
0.7753
0.8037
0.833
0.9132
0.882
0.8542
0.825
0.944
0.2088
0.3173
0.8152
0.6133
0.0435
0.4254
0.0265
0.0583
0.8905
0.8562
17.6223
0.8523
0.9087
12.2918
0.7566
0.6785
3.0786
20.1409
8.3149
0.0831
17.9211
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V3-70B
0.356
0.2088
0.2404
0.1612
0.0171
0.858
0.006
0.8183
0.6584
0.3086
0.5562
0.0831
0.4861
0.8657
14.1464
0.898
0.9576
18.14
0.8879
0.2404
0.8695
0.569
0.7792
0.933
0.2291
0
0.4741
0.6515
0.8181
0.5562
0.8727
0.8406
0.7716
0.006
0.2088
0.3173
0.0342
0.2105
0.0036
0.0108
0
0.0008
0.791
0.7825
13.6192
0.7498
0.8992
11.6258
0.7375
0.6785
3.0786
20.1409
8.3149
0.0831
17.9211
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V12-70B
0.6667
0.5763
0.5685
0.314
0.7433
0.8917
0.938
0.8363
0.7836
0.6989
0.9239
0.059
0.8605
0.8615
13.9287
0.8991
0.9586
18.7964
0.8891
0.5685
0.894
0.6437
0.8764
0.9517
0.5769
0.7162
0.7523
0.7942
0.8514
0.9239
0.8992
0.8721
0.8294
0.938
0.5763
0.9819
0.7704
0.6592
0.0852
0.4464
0.0885
0.0648
0.8851
0.8303
15.8592
0.7897
0.9161
12.8808
0.7672
0.6351
3.3978
16.1526
5.8881
0.059
13.6912
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V12-70B
0.6064
0.5763
0.3432
0.1637
0.7121
0.872
0.866
0.8288
0.7754
0.5642
0.9101
0.059
0.7172
0.8491
12.7037
0.8824
0.9582
17.8773
0.8884
0.3432
0.9128
0.7098
0.7639
0.9392
0.4718
0.6851
0.765
0.8049
0.8334
0.9101
0.8922
0.8639
0.7641
0.866
0.5763
0.9819
0.739
0.5036
0.001
0.0002
0.0354
0.0045
0.7773
0.8165
12.4108
0.7876
0.9072
11.6113
0.7569
0.6351
3.3978
16.1526
5.8881
0.059
13.6912
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V4-70B
0.3836
0.5863
0.2707
0.1754
0.094
0.7212
0.026
0.8462
0.2801
0.4569
0.6349
0.1278
0.5354
0.8701
14.8228
0.8998
0.96
18.8176
0.8919
0.2707
0.4241
0.5718
0.0292
0.9446
0.4527
0.1124
0.3209
0.0417
0.437
0.6349
0.8969
0.8645
0.7949
0.026
0.5863
0.9357
0.0756
0.3826
0.0037
0.009
0.0147
0.0093
0.8401
0.8383
13.8494
0.8393
0.9037
11.664
0.7539
0.7123
4.1077
28.7565
12.7821
0.1278
25.4036
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V4-70B
0.6969
0.5863
0.6246
0.3345
0.7633
0.9178
0.954
0.8623
0.8123
0.7601
0.9228
0.1278
0.882
0.879
15.5348
0.917
0.961
19.626
0.8933
0.6246
0.9369
0.6839
0.8819
0.9598
0.7126
0.7343
0.857
0.8068
0.8319
0.9228
0.9014
0.8727
0.8568
0.954
0.5863
0.9357
0.7923
0.6856
0.1257
0.5124
0.0531
0.0725
0.9087
0.8684
18.3481
0.8676
0.9148
12.8181
0.7711
0.7123
4.1077
28.7565
12.7821
0.1278
25.4036
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Base-32B
0.5157
0.5663
0.2932
0.1335
0.0549
0.8426
0.676
0.8423
0.6642
0.4421
0.8718
0.2855
0.5525
0.8642
13.0457
0.9069
0.9547
17.5592
0.8849
0.2932
0.8785
0.5431
0.6639
0.9276
0.4123
0.0892
0.673
0.7759
0.6649
0.8718
0.8961
0.8723
0.7217
0.676
0.5663
0.8956
0.0205
0.3616
0.005
0
0.0177
0
0.6449
0.8179
9.7732
0.831
0.8997
10.1877
0.7465
0.7905
8.0023
52.7522
28.5161
0.2855
44.8456
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Base-32B
0.6663
0.5663
0.557
0.2527
0.7535
0.8908
0.922
0.8439
0.7522
0.5775
0.9276
0.2855
0.5935
0.8646
12.2397
0.9076
0.9544
16.8035
0.8828
0.557
0.9058
0.6638
0.7486
0.9526
0.621
0.7074
0.8184
0.7936
0.7368
0.9276
0.8947
0.874
0.8139
0.922
0.5663
0.8956
0.7996
0.518
0.0132
0.3395
0.0531
0.0769
0.7809
0.8355
13.5653
0.8344
0.9046
11.2487
0.7509
0.7905
8.0023
52.7522
28.5161
0.2855
44.8456
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V1-32B
0.6373
0.243
0.5798
0.2879
0.7822
0.895
0.942
0.8464
0.8025
0.5903
0.9201
0.1215
0.6002
0.8668
12.7065
0.9101
0.9556
17.5826
0.8852
0.5798
0.8955
0.6868
0.8208
0.9634
0.6232
0.7605
0.8948
0.8043
0.806
0.9201
0.8948
0.8785
0.8262
0.942
0.243
0.3735
0.8039
0.5474
0.0141
0.3824
0.1327
0.0941
0.8161
0.8317
13.312
0.8313
0.9058
11.31
0.7592
0.7012
3.916
28.4658
12.1545
0.1215
24.9555
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V1-32B
0.4603
0.243
0.3213
0.1156
0.4984
0.8665
0.614
0.6156
0.6427
0.2565
0.7677
0.1215
0.2848
0.7524
12.28
0.6978
0.8411
16.2831
0.5481
0.3213
0.8948
0.6092
0.2986
0.9366
0.2821
0.4346
0.7905
0.7955
0.7197
0.7677
0.8976
0.8732
0.7681
0.614
0.243
0.3735
0.5622
0.2025
0.005
0.0029
0
0.0003
0.5697
0.7149
8.8323
0.6591
0.8256
9.2275
0.5575
0.7012
3.916
28.4658
12.1545
0.1215
24.9555
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V5-70B
0.6149
0.0863
0.5699
0.2616
0.7728
0.8824
0.934
0.8154
0.8013
0.6923
0.9044
0.044
0.8248
0.8344
12.4686
0.8334
0.9588
18.3956
0.8891
0.5699
0.8963
0.6983
0.9208
0.9455
0.6398
0.7334
0.7317
0.8056
0.85
0.9044
0.8699
0.8476
0.8054
0.934
0.0863
0.1807
0.8121
0.6124
0.0346
0.345
0.0177
0.0226
0.8882
0.8266
15.8059
0.7868
0.9082
12.2544
0.7523
0.6213
2.8907
13.3272
4.401
0.044
11.0807
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V5-70B
0.3734
0.0863
0.2253
0.1609
0.0069
0.8675
0.12
0.7964
0.7627
0.293
0.7438
0.044
0.4499
0.8349
12.4544
0.8432
0.9514
17.8958
0.8705
0.2253
0.9196
0.6983
0.8111
0.924
0.2031
0.0025
0.7231
0.7071
0.874
0.7438
0.486
0.8185
0.7589
0.12
0.0863
0.1807
0.0112
0.2261
0.0029
0.0038
0
0
0.798
0.7882
12.6282
0.7388
0.8991
11.5296
0.7331
0.6213
2.8907
13.3272
4.401
0.044
11.0807
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V1-32B
0.5176
0.247
0.1557
0.1176
0.6723
0.8734
0.806
0.665
0.7554
0.4004
0.8986
0.1022
0.4216
0.759
11.1519
0.7158
0.8589
15.634
0.6042
0.1557
0.9036
0.6322
0.7708
0.9339
0.3548
0.6498
0.8233
0.7973
0.7534
0.8986
0.8944
0.8735
0.7829
0.806
0.247
0.3936
0.6948
0.4247
0.0231
0.0087
0
0.0056
0.5504
0.7323
8.6672
0.6918
0.8602
9.1539
0.6481
0.6857
2.8962
26.3302
10.2219
0.1022
22.8557
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V1-32B
0.6299
0.247
0.5794
0.2816
0.7825
0.8964
0.938
0.8176
0.8161
0.5602
0.9082
0.1022
0.5696
0.8335
13.1786
0.8529
0.9558
17.6265
0.8865
0.5794
0.8985
0.7098
0.8361
0.958
0.5943
0.7616
0.8965
0.7948
0.8435
0.9082
0.8956
0.8815
0.8326
0.938
0.247
0.3936
0.8034
0.5167
0.0495
0.359
0.0885
0.1157
0.7954
0.7876
11.8516
0.7667
0.9074
11.5104
0.7642
0.6857
2.8962
26.3302
10.2219
0.1022
22.8557
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V2-32B
0.5006
0.1747
0.155
0.1135
0.6194
0.8733
0.78
0.6548
0.7477
0.3892
0.8958
0.1032
0.4303
0.758
11.1964
0.712
0.8551
15.4535
0.596
0.155
0.9023
0.6351
0.7278
0.9339
0.3322
0.6154
0.8221
0.7999
0.7536
0.8958
0.8941
0.8736
0.7836
0.78
0.1747
0.2651
0.6234
0.4052
0.0148
0.007
0
0.0032
0.5424
0.7297
8.5653
0.6868
0.851
9.0423
0.6243
0.6852
2.9647
26.6586
10.3239
0.1032
23.1535
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V2-32B
0.6209
0.1747
0.5732
0.2788
0.7805
0.8959
0.938
0.8064
0.8166
0.555
0.9081
0.1032
0.5684
0.8225
13.1622
0.8321
0.9553
17.7008
0.8851
0.5732
0.896
0.7069
0.8375
0.9589
0.5918
0.7591
0.8965
0.7898
0.8522
0.9081
0.8929
0.8818
0.8329
0.938
0.1747
0.2651
0.8018
0.5046
0.0518
0.3492
0.0973
0.1143
0.7816
0.7743
11.6422
0.7447
0.9072
11.3013
0.7637
0.6852
2.9647
26.6586
10.3239
0.1032
23.1535
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V3-32B
0.5058
0.2912
0.1058
0.1206
0.5718
0.8745
0.788
0.6894
0.7641
0.3788
0.8848
0.0951
0.4338
0.764
10.9772
0.7325
0.8699
15.1981
0.6626
0.1058
0.9038
0.6494
0.7903
0.9312
0.2662
0.5592
0.8196
0.7986
0.7627
0.8848
0.8947
0.876
0.7887
0.788
0.2912
0.492
0.5844
0.4364
0.0281
0.0056
0
0.0044
0.5646
0.7335
8.5283
0.7006
0.8629
9.1587
0.6619
0.6875
2.6923
25.3725
9.5144
0.0951
22.17
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V3-32B
0.6324
0.2912
0.5839
0.2723
0.7754
0.8962
0.944
0.8432
0.8099
0.5398
0.9049
0.0951
0.5517
0.8641
13.4331
0.9075
0.9551
17.6896
0.8853
0.5839
0.898
0.6753
0.8403
0.9571
0.5672
0.7512
0.8965
0.7809
0.8563
0.9049
0.8906
0.8792
0.8335
0.944
0.2912
0.492
0.7996
0.5004
0.0552
0.3803
0
0.1165
0.8096
0.8235
11.0257
0.8296
0.9017
11.1475
0.7503
0.6875
2.6923
25.3725
9.5144
0.0951
22.17
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
0
main
4
False
v1.4.1
v0.6.3.post1
โญ• : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.3
0.4889
0
0.4568
0.1735
0.5205
0.8181
0.736
0.7427
0.6764
0.4459
0.6945
0.1137
0.373
0.858
12.9928
0.911
0.8897
16.7576
0.6673
0.4568
0.7583
0.6034
0.7139
0.9473
0.4644
0.4583
0.546
0.7456
0.7733
0.6945
0.8905
0.8578
0.7488
0.736
0
0
0.5826
0.5003
0.0147
0.0119
0.0088
0.0136
0.8187
0.8068
10.0695
0.8231
0.8487
10.4706
0.5692
0.7024
3.5848
28.9326
11.3782
0.1137
25.3589
LlamaForCausalLM
bfloat16
llama3.1;gemma
70.554
6
main
0
False
v1.4.1
v0.6.3.post1
โญ• : instruction-tuned
tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.3
0.6208
0
0.5877
0.3265
0.7107
0.9023
0.94
0.859
0.7648
0.7126
0.911
0.1137
0.8592
0.8718
14.4087
0.9156
0.9589
18.902
0.8913
0.5877
0.9176
0.6494
0.7986
0.9643
0.6303
0.6851
0.825
0.7879
0.7629
0.911
0.9022
0.8677
0.8251
0.94
0
0
0.7362
0.6482
0.0774
0.4797
0.0619
0.1101
0.9034
0.8512
13.1098
0.8579
0.9142
12.2545
0.7713
0.7024
3.5848
28.9326
11.3782
0.1137
25.3589
LlamaForCausalLM
bfloat16
llama3.1;gemma
70.554
6
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B
0.4838
0.008
0.1455
0.12
0.6601
0.8714
0.82
0.6421
0.7427
0.3368
0.8703
0.1045
0.298
0.7524
10.7871
0.7002
0.8516
15.6005
0.5858
0.1455
0.9023
0.6207
0.7083
0.9339
0.3785
0.6007
0.8394
0.798
0.7471
0.8703
0.8907
0.8715
0.778
0.82
0.008
0.0161
0.7196
0.334
0.0058
0.0024
0
0.0064
0.5853
0.7242
8.8307
0.6758
0.8424
8.9859
0.6066
0.6872
3.1771
26.7645
10.4431
0.1045
23.3063
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B
0.6106
0.008
0.5759
0.2869
0.7833
0.8965
0.936
0.821
0.8244
0.5672
0.9135
0.1045
0.578
0.8614
13.1877
0.9021
0.9527
17.7238
0.877
0.5759
0.9026
0.7356
0.8542
0.9571
0.591
0.7628
0.8981
0.8005
0.8334
0.9135
0.888
0.8779
0.8298
0.936
0.008
0.0161
0.8037
0.5324
0.0455
0.3418
0.1504
0.105
0.7916
0.7888
11.769
0.7714
0.8964
11.1378
0.7332
0.6872
3.1771
26.7645
10.4431
0.1045
23.3063
Qwen2ForCausalLM
bfloat16
apache-2.0
32.764
1
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V7-70B
0.6332
0.002
0.6168
0.3476
0.7359
0.9189
0.946
0.8634
0.7843
0.7465
0.9123
0.0912
0.8974
0.8785
15.0597
0.9182
0.961
19.426
0.8941
0.6168
0.9354
0.6437
0.8208
0.9678
0.657
0.7148
0.8435
0.7986
0.8147
0.9123
0.9096
0.8812
0.8536
0.946
0.002
0.002
0.7571
0.6852
0.1256
0.5263
0.0708
0.1016
0.9136
0.8633
15.3884
0.8664
0.916
12.5452
0.7747
0.6801
3.8393
21.2279
9.1197
0.0912
18.9384
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V7-70B
0.304
0.002
0.2899
0.179
0.2127
0.578
0.006
0.848
0.2658
0.3961
0.4749
0.0912
0.503
0.8652
14.3057
0.9132
0.9591
18.5229
0.892
0.2899
0.0003
0.5575
0.0444
0.9544
0.3019
0.0805
0.3636
0
0.3635
0.4749
0.8939
0.865
0.7794
0.006
0.002
0.002
0.345
0.3833
0.0031
0.0265
0.0177
0.0201
0.8276
0.8112
10.5794
0.8266
0.9066
11.6273
0.7602
0.6801
3.8393
21.2279
9.1197
0.0912
18.9384
LlamaForCausalLM
bfloat16
apache-2.0
70.554
0
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V8-70B
0.3595
0.002
0.3125
0.19
0.2773
0.5809
0.114
0.8548
0.2638
0.5403
0.7077
0.1108
0.8049
0.8664
14.4057
0.9143
0.9591
18.4793
0.892
0.3125
0.0073
0.5575
0.0389
0.9553
0.3482
0.2149
0.3599
0
0.3629
0.7077
0.8951
0.8661
0.7802
0.114
0.002
0.006
0.3397
0.4678
0.0024
0.027
0.0177
0.0168
0.8859
0.8285
10.7182
0.8482
0.9084
11.6781
0.7645
0.6993
3.9237
25.8447
11.071
0.1108
22.8914
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V8-70B
0.635
0.002
0.6174
0.3483
0.7358
0.919
0.944
0.8634
0.7854
0.7464
0.912
0.1108
0.8966
0.8787
15.0616
0.9182
0.961
19.5028
0.8941
0.6174
0.9344
0.6437
0.8278
0.9687
0.6559
0.7148
0.841
0.798
0.8165
0.912
0.9102
0.8824
0.8539
0.944
0.002
0.006
0.7568
0.6868
0.1236
0.534
0.0708
0.101
0.9123
0.8631
15.3199
0.8661
0.9162
12.5576
0.7751
0.6993
3.9237
25.8447
11.071
0.1108
22.8914
LlamaForCausalLM
bfloat16
apache-2.0
70.554
1
main
4
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V6-32B
0.5067
0.2972
0.1068
0.122
0.5735
0.8744
0.784
0.6893
0.7659
0.3782
0.8869
0.0954
0.4336
0.7633
11.0622
0.7294
0.8708
15.2435
0.6656
0.1068
0.9041
0.6523
0.7944
0.9321
0.2662
0.5609
0.8209
0.7992
0.7625
0.8869
0.8949
0.8764
0.7871
0.784
0.2972
0.498
0.5862
0.4348
0.0281
0.0064
0
0.0054
0.57
0.7339
8.5598
0.7009
0.863
9.219
0.6613
0.6873
2.8057
25.4218
9.5439
0.0954
22.1584
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
1
main
0
False
v1.4.1
v0.6.3.post1
๐Ÿ”ถ : fine-tuned
Saxo/Linkbricks-Horizon-AI-Avengers-V6-32B
0.6336
0.2972
0.5881
0.2718
0.7762
0.8968
0.942
0.8432
0.8117
0.5426
0.9042
0.0954
0.5559
0.8641
13.299
0.9073
0.955
17.5951
0.8852
0.5881
0.8985
0.6782
0.8431
0.958
0.5672
0.7518
0.8969
0.7822
0.8581
0.9042
0.8904
0.8778
0.8337
0.942
0.2972
0.498
0.8006
0.5047
0.0523
0.3853
0
0.1147
0.8066
0.8235
11.0258
0.8292
0.9018
11.0763
0.7512
0.6873
2.8057
25.4218
9.5439
0.0954
22.1584
Qwen2ForCausalLM
bfloat16
apache-2.0
32.76
1
main
4
False
v1.4.1
v0.6.3.post1