qwen2-72b full results
logs/Qwen2-72B-Instruct_epoch_10.txt
CHANGED
@@ -5,12 +5,12 @@ Qwen/Qwen2-72B-Instruct llama-factory/saves/Qwen2-72B-Instruct/checkpoint-350 Tr
 (1) GPU = NVIDIA L40. Max memory = 44.309 GB.
 0.0 GB of memory reserved.
 loading model: Qwen/Qwen2-72B-Instruct with adapter: llama-factory/saves/Qwen2-72B-Instruct/checkpoint-350
-09/
-09/
-09/
-09/
-09/
-09/
+09/11/2024 12:38:44 - INFO - llamafactory.data.template - Replace eos token: <|im_end|>
+09/11/2024 12:38:45 - INFO - llamafactory.model.model_utils.quantization - Quantizing model to 4 bit with bitsandbytes.
+09/11/2024 12:38:45 - INFO - llamafactory.model.patcher - Using KV cache for faster generation.
+09/11/2024 12:49:52 - INFO - llamafactory.model.model_utils.attention - Using torch SDPA for faster training and inference.
+09/11/2024 12:49:54 - INFO - llamafactory.model.adapter - Loaded adapter(s): llama-factory/saves/Qwen2-72B-Instruct/checkpoint-350
+09/11/2024 12:49:54 - INFO - llamafactory.model.loader - all params: 72,811,470,848
 (2) GPU = NVIDIA L40. Max memory = 44.309 GB.
 43.037 GB of memory reserved.
 loading train/test data files
@@ -153,4 +153,11 @@ You are an expert in logical reasoning.<|im_end|>
 <|im_start|>assistant
 
 Evaluating model: Qwen/Qwen2-72B-Instruct
-Batch output: ['不是'
+Batch output: ['不是']
+(3) GPU = NVIDIA L40. Max memory = 44.309 GB.
+43.537 GB of memory reserved.
+text ... Qwen/Qwen2-72B-Instruct/checkpoint-350_torch.bfloat16_4bit_lf
+0 甄加索是自杀吗 ... 不是
+
+[1 rows x 16 columns]
+
{'accuracy': 0.7736666666666666, 'incorrect_ids': [9, 11, 17, 18, 24, 29, 31, 34, 36, 43, 59, 61, 65, 66, 67, 77, 78, 81, 83, 91, 93, 94, 97, 104, 108, 109, 112, 115, 117, 119, 124, 129, 131, 135, 137, 138, 139, 143, 150, 155, 160, 161, 163, 164, 179, 180, 181, 190, 193, 198, 199, 200, 202, 218, 222, 224, 228, 229, 234, 235, 236, 240, 245, 248, 250, 253, 255, 257, 259, 260, 265, 269, 271, 275, 286, 292, 295, 299, 301, 303, 304, 311, 314, 321, 323, 326, 330, 334, 335, 341, 347, 350, 353, 355, 356, 359, 360, 362, 364, 368, 370, 371, 372, 373, 374, 377, 389, 395, 396, 397, 410, 423, 428, 429, 430, 445, 447, 449, 452, 454, 456, 458, 461, 472, 473, 476, 480, 481, 483, 488, 490, 492, 493, 494, 495, 498, 501, 502, 503, 506, 507, 508, 509, 510, 511, 514, 515, 517, 519, 520, 536, 540, 560, 566, 570, 571, 579, 581, 584, 589, 591, 592, 600, 601, 613, 614, 621, 622, 625, 626, 628, 629, 632, 644, 647, 663, 665, 666, 682, 684, 686, 692, 695, 701, 702, 707, 708, 711, 720, 721, 722, 727, 729, 730, 732, 734, 739, 740, 752, 754, 770, 774, 779, 785, 788, 791, 801, 802, 805, 809, 813, 817, 820, 821, 823, 824, 828, 837, 840, 841, 842, 847, 864, 866, 869, 870, 884, 886, 888, 889, 890, 894, 899, 901, 904, 909, 913, 927, 930, 935, 937, 945, 952, 953, 958, 962, 966, 969, 980, 991, 994, 1006, 1012, 1014, 1018, 1019, 1024, 1031, 1032, 1036, 1040, 1043, 1045, 1049, 1051, 1053, 1061, 1068, 1069, 1075, 1077, 1080, 1087, 1116, 1120, 1121, 1125, 1126, 1135, 1141, 1158, 1166, 1167, 1170, 1172, 1174, 1176, 1178, 1180, 1181, 1185, 1193, 1203, 1209, 1212, 1216, 1228, 1232, 1236, 1239, 1240, 1241, 1242, 1243, 1246, 1251, 1252, 1254, 1259, 1282, 1289, 1292, 1305, 1308, 1311, 1313, 1317, 1322, 1323, 1324, 1326, 1339, 1342, 1347, 1349, 1353, 1357, 1384, 1386, 1387, 1395, 1402, 1406, 1413, 1420, 1422, 1426, 1430, 1440, 1453, 1454, 1456, 1459, 1462, 1469, 1473, 1476, 1481, 1490, 1494, 1495, 1496, 1512, 1515, 1516, 1517, 1518, 1522, 1525, 1526, 1547, 1548, 1554, 1556, 1558, 1561, 1562, 1572, 1576, 1581, 
1585, 1590, 1593, 1594, 1602, 1603, 1604, 1605, 1606, 1613, 1620, 1622, 1624, 1636, 1637, 1639, 1641, 1647, 1648, 1650, 1654, 1655, 1658, 1659, 1668, 1669, 1672, 1673, 1674, 1679, 1686, 1695, 1701, 1712, 1716, 1720, 1726, 1727, 1728, 1740, 1751, 1755, 1756, 1758, 1768, 1770, 1780, 1786, 1787, 1796, 1797, 1798, 1799, 1812, 1816, 1827, 1835, 1836, 1841, 1851, 1858, 1860, 1867, 1869, 1897, 1905, 1907, 1917, 1933, 1943, 1944, 1945, 1949, 1953, 1958, 1964, 1965, 1978, 1981, 1982, 1984, 1990, 1992, 1994, 1995, 1996, 2001, 2008, 2014, 2017, 2025, 2029, 2030, 2035, 2036, 2038, 2046, 2059, 2062, 2064, 2072, 2076, 2077, 2088, 2091, 2092, 2100, 2102, 2105, 2107, 2109, 2112, 2114, 2118, 2119, 2121, 2126, 2128, 2130, 2133, 2135, 2139, 2140, 2141, 2145, 2147, 2161, 2162, 2164, 2167, 2177, 2183, 2185, 2193, 2195, 2205, 2210, 2212, 2221, 2229, 2230, 2232, 2236, 2237, 2240, 2247, 2249, 2250, 2261, 2262, 2265, 2274, 2280, 2281, 2297, 2301, 2304, 2311, 2312, 2313, 2318, 2320, 2322, 2324, 2333, 2339, 2340, 2343, 2348, 2359, 2360, 2364, 2369, 2373, 2388, 2395, 2396, 2400, 2405, 2406, 2409, 2410, 2423, 2425, 2429, 2440, 2441, 2442, 2471, 2475, 2477, 2486, 2491, 2501, 2508, 2511, 2515, 2517, 2522, 2526, 2529, 2532, 2535, 2539, 2547, 2548, 2549, 2555, 2556, 2557, 2559, 2560, 2562, 2564, 2566, 2575, 2589, 2595, 2600, 2604, 2610, 2616, 2626, 2629, 2630, 2632, 2639, 2640, 2663, 2667, 2671, 2672, 2676, 2678, 2707, 2714, 2727, 2731, 2736, 2744, 2749, 2751, 2754, 2756, 2757, 2758, 2760, 2766, 2770, 2781, 2788, 2795, 2798, 2801, 2803, 2806, 2807, 2812, 2815, 2816, 2818, 2823, 2829, 2837, 2843, 2844, 2852, 2854, 2857, 2858, 2861, 2880, 2882, 2884, 2888, 2899, 2902, 2905, 2906, 2912, 2913, 2915, 2916, 2919, 2921, 2931, 2933, 2937, 2938, 2944, 2949, 2950, 2953, 2962, 2963, 2965, 2966, 2969, 2975, 2976, 2979, 2980, 2981, 2983, 2985, 2988, 2990, 2991, 2995], 'precision': 0.8330147983140184, 'recall': 0.7736666666666666, 'f1': 0.7973657072550873, 'ratio_valid_classifications': 1.0}
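The metrics dict above pairs an overall accuracy with the list of misclassified test-row ids. The evaluation script behind these logs is not part of this diff, so the following is only a minimal sketch of how such a summary could be produced from exact-match comparison (`summarize` is a hypothetical helper; the real script also reports weighted precision/recall/F1):

```python
def summarize(predictions, labels):
    """Exact-match comparison of predicted vs. reference answers.

    Hypothetical helper -- a sketch, not the actual evaluation code
    behind these logs. Returns accuracy plus the ids of wrong rows,
    mirroring the 'accuracy' / 'incorrect_ids' keys seen above.
    """
    incorrect_ids = [
        i for i, (p, y) in enumerate(zip(predictions, labels)) if p != y
    ]
    accuracy = 1 - len(incorrect_ids) / len(labels)
    return {"accuracy": accuracy, "incorrect_ids": incorrect_ids}


# Toy run with the yes/no answers seen in the logs ("是" / "不是"):
result = summarize(["不是", "是", "不是", "是"], ["不是", "是", "是", "是"])
print(result)  # {'accuracy': 0.75, 'incorrect_ids': [2]}
```

One row out of four disagrees, giving accuracy 0.75 and a single incorrect id, in the same shape as the dicts logged for each checkpoint.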
logs/Qwen2-72B-Instruct_epoch_7.txt
CHANGED
@@ -5,12 +5,12 @@ Qwen/Qwen2-72B-Instruct llama-factory/saves/Qwen2-72B-Instruct/checkpoint-245 Tr
 (1) GPU = NVIDIA L40. Max memory = 44.309 GB.
 0.0 GB of memory reserved.
 loading model: Qwen/Qwen2-72B-Instruct with adapter: llama-factory/saves/Qwen2-72B-Instruct/checkpoint-245
-09/
-09/
-09/
-09/
-09/
-09/
+09/11/2024 06:17:47 - INFO - llamafactory.data.template - Replace eos token: <|im_end|>
+09/11/2024 06:17:48 - INFO - llamafactory.model.model_utils.quantization - Quantizing model to 4 bit with bitsandbytes.
+09/11/2024 06:17:48 - INFO - llamafactory.model.patcher - Using KV cache for faster generation.
+09/11/2024 06:27:45 - INFO - llamafactory.model.model_utils.attention - Using torch SDPA for faster training and inference.
+09/11/2024 06:27:47 - INFO - llamafactory.model.adapter - Loaded adapter(s): llama-factory/saves/Qwen2-72B-Instruct/checkpoint-245
+09/11/2024 06:27:47 - INFO - llamafactory.model.loader - all params: 72,811,470,848
 (2) GPU = NVIDIA L40. Max memory = 44.309 GB.
 43.037 GB of memory reserved.
 loading train/test data files
@@ -153,4 +153,11 @@ You are an expert in logical reasoning.<|im_end|>
 <|im_start|>assistant
 
 Evaluating model: Qwen/Qwen2-72B-Instruct
-Batch output: ['不是'
+Batch output: ['不是']
+(3) GPU = NVIDIA L40. Max memory = 44.309 GB.
+43.537 GB of memory reserved.
+text ... Qwen/Qwen2-72B-Instruct/checkpoint-245_torch.bfloat16_4bit_lf
+0 甄加索是自杀吗 ... 不是
+
+[1 rows x 13 columns]
+
{'accuracy': 0.7656666666666667, 'incorrect_ids': [11, 27, 29, 31, 34, 36, 38, 55, 58, 59, 61, 65, 66, 67, 78, 81, 83, 88, 93, 97, 103, 104, 108, 109, 112, 114, 115, 117, 119, 128, 129, 131, 135, 139, 143, 150, 155, 161, 163, 164, 173, 179, 190, 193, 199, 200, 202, 207, 222, 224, 225, 229, 240, 243, 245, 248, 250, 255, 257, 259, 260, 263, 265, 271, 283, 286, 292, 293, 295, 299, 303, 304, 311, 314, 317, 321, 322, 323, 326, 328, 332, 334, 335, 341, 350, 351, 352, 353, 354, 355, 356, 357, 359, 360, 362, 365, 368, 370, 372, 373, 377, 389, 395, 396, 397, 410, 428, 429, 430, 445, 447, 454, 456, 458, 461, 465, 471, 473, 476, 480, 485, 486, 488, 490, 492, 493, 494, 495, 496, 498, 500, 501, 502, 503, 506, 507, 508, 510, 511, 514, 517, 519, 520, 534, 536, 540, 543, 553, 560, 568, 570, 571, 579, 581, 589, 591, 592, 596, 601, 609, 612, 613, 614, 621, 625, 626, 628, 629, 632, 644, 647, 650, 663, 665, 666, 674, 680, 682, 684, 686, 692, 693, 695, 701, 702, 705, 707, 708, 711, 716, 718, 721, 722, 727, 729, 730, 732, 734, 739, 740, 743, 754, 770, 774, 778, 779, 788, 795, 801, 802, 805, 808, 809, 817, 820, 821, 822, 823, 824, 828, 833, 837, 840, 847, 866, 870, 876, 884, 888, 889, 890, 899, 901, 904, 906, 909, 912, 913, 927, 930, 935, 937, 945, 952, 953, 958, 962, 966, 968, 969, 980, 981, 989, 991, 994, 1003, 1004, 1006, 1011, 1012, 1013, 1014, 1018, 1031, 1032, 1036, 1040, 1043, 1049, 1051, 1053, 1055, 1057, 1066, 1068, 1069, 1076, 1077, 1080, 1087, 1116, 1117, 1120, 1121, 1125, 1126, 1138, 1139, 1143, 1158, 1163, 1166, 1167, 1172, 1174, 1176, 1177, 1178, 1180, 1181, 1185, 1196, 1198, 1203, 1228, 1232, 1236, 1239, 1240, 1241, 1246, 1247, 1251, 1252, 1254, 1256, 1259, 1266, 1282, 1289, 1296, 1305, 1308, 1311, 1313, 1314, 1317, 1323, 1324, 1335, 1339, 1342, 1347, 1349, 1353, 1356, 1363, 1379, 1384, 1385, 1386, 1387, 1389, 1393, 1395, 1402, 1406, 1416, 1420, 1422, 1425, 1426, 1440, 1443, 1447, 1448, 1451, 1453, 1454, 1455, 1458, 1459, 1462, 1468, 1469, 1475, 1476, 1481, 1486, 1487, 
1490, 1494, 1495, 1496, 1510, 1512, 1515, 1517, 1518, 1522, 1525, 1526, 1528, 1547, 1548, 1556, 1558, 1560, 1562, 1572, 1581, 1585, 1587, 1590, 1593, 1594, 1603, 1604, 1605, 1606, 1613, 1622, 1624, 1629, 1636, 1637, 1639, 1641, 1643, 1645, 1647, 1648, 1650, 1654, 1655, 1658, 1659, 1668, 1669, 1672, 1673, 1674, 1679, 1686, 1695, 1700, 1701, 1712, 1716, 1718, 1726, 1727, 1751, 1755, 1756, 1758, 1768, 1770, 1772, 1780, 1785, 1786, 1787, 1796, 1797, 1799, 1806, 1812, 1816, 1825, 1827, 1835, 1836, 1841, 1854, 1858, 1860, 1867, 1869, 1872, 1897, 1914, 1933, 1944, 1945, 1950, 1953, 1958, 1963, 1964, 1965, 1978, 1981, 1983, 1984, 1990, 1992, 1994, 1995, 2001, 2002, 2008, 2010, 2015, 2017, 2021, 2025, 2028, 2035, 2036, 2046, 2053, 2059, 2064, 2072, 2076, 2077, 2085, 2091, 2092, 2102, 2105, 2107, 2109, 2112, 2118, 2119, 2120, 2121, 2123, 2126, 2130, 2133, 2135, 2139, 2140, 2141, 2145, 2147, 2156, 2159, 2161, 2162, 2164, 2167, 2177, 2180, 2183, 2185, 2186, 2188, 2193, 2194, 2195, 2197, 2209, 2210, 2212, 2215, 2221, 2226, 2229, 2230, 2237, 2240, 2243, 2249, 2250, 2261, 2262, 2265, 2274, 2276, 2287, 2290, 2297, 2301, 2304, 2312, 2313, 2318, 2320, 2322, 2324, 2330, 2333, 2340, 2348, 2354, 2359, 2360, 2364, 2366, 2369, 2373, 2395, 2396, 2400, 2404, 2406, 2409, 2410, 2423, 2425, 2429, 2437, 2440, 2441, 2442, 2445, 2448, 2469, 2471, 2486, 2491, 2501, 2511, 2515, 2517, 2522, 2526, 2529, 2530, 2532, 2535, 2539, 2547, 2548, 2549, 2554, 2555, 2556, 2557, 2558, 2559, 2560, 2562, 2563, 2565, 2569, 2574, 2575, 2589, 2600, 2604, 2605, 2623, 2626, 2629, 2632, 2655, 2663, 2667, 2671, 2672, 2676, 2678, 2714, 2727, 2731, 2735, 2736, 2744, 2749, 2751, 2754, 2756, 2757, 2758, 2760, 2764, 2766, 2770, 2788, 2797, 2798, 2803, 2806, 2807, 2811, 2815, 2816, 2823, 2837, 2843, 2849, 2852, 2854, 2857, 2858, 2880, 2882, 2884, 2888, 2896, 2899, 2902, 2903, 2905, 2906, 2912, 2913, 2915, 2916, 2919, 2921, 2931, 2933, 2937, 2944, 2949, 2950, 2953, 2966, 2973, 2975, 2976, 2977, 2980, 2981, 2985, 2990, 2995], 
'precision': 0.8288272203240518, 'recall': 0.7656666666666667, 'f1': 0.790627109330698, 'ratio_valid_classifications': 1.0}
logs/Qwen2-72B-Instruct_epoch_8.txt
CHANGED
@@ -5,12 +5,12 @@ Qwen/Qwen2-72B-Instruct llama-factory/saves/Qwen2-72B-Instruct/checkpoint-280 Tr
 (1) GPU = NVIDIA L40. Max memory = 44.309 GB.
 0.0 GB of memory reserved.
 loading model: Qwen/Qwen2-72B-Instruct with adapter: llama-factory/saves/Qwen2-72B-Instruct/checkpoint-280
-09/
-09/
-09/
-09/
-09/
-09/
+09/11/2024 08:24:31 - INFO - llamafactory.data.template - Replace eos token: <|im_end|>
+09/11/2024 08:24:31 - INFO - llamafactory.model.model_utils.quantization - Quantizing model to 4 bit with bitsandbytes.
+09/11/2024 08:24:31 - INFO - llamafactory.model.patcher - Using KV cache for faster generation.
+09/11/2024 08:34:29 - INFO - llamafactory.model.model_utils.attention - Using torch SDPA for faster training and inference.
+09/11/2024 08:34:31 - INFO - llamafactory.model.adapter - Loaded adapter(s): llama-factory/saves/Qwen2-72B-Instruct/checkpoint-280
+09/11/2024 08:34:31 - INFO - llamafactory.model.loader - all params: 72,811,470,848
 (2) GPU = NVIDIA L40. Max memory = 44.309 GB.
 43.037 GB of memory reserved.
 loading train/test data files
@@ -153,4 +153,11 @@ You are an expert in logical reasoning.<|im_end|>
 <|im_start|>assistant
 
 Evaluating model: Qwen/Qwen2-72B-Instruct
-Batch output: ['不是'
+Batch output: ['不是']
+(3) GPU = NVIDIA L40. Max memory = 44.309 GB.
+43.537 GB of memory reserved.
+text ... Qwen/Qwen2-72B-Instruct/checkpoint-280_torch.bfloat16_4bit_lf
+0 甄加索是自杀吗 ... 不是
+
+[1 rows x 14 columns]
+
{'accuracy': 0.7693333333333333, 'incorrect_ids': [9, 11, 24, 27, 29, 31, 34, 36, 55, 59, 61, 65, 66, 67, 77, 78, 81, 82, 83, 88, 91, 93, 94, 97, 104, 108, 112, 114, 115, 117, 119, 120, 124, 128, 129, 131, 135, 138, 139, 143, 150, 155, 160, 161, 163, 164, 179, 190, 192, 199, 200, 202, 224, 234, 235, 236, 240, 245, 248, 250, 251, 254, 255, 257, 259, 260, 269, 271, 275, 286, 292, 299, 304, 314, 321, 326, 328, 330, 334, 335, 341, 350, 351, 353, 355, 356, 360, 362, 368, 370, 371, 372, 373, 376, 377, 389, 391, 395, 396, 397, 410, 419, 423, 428, 429, 430, 438, 445, 447, 449, 452, 454, 456, 458, 461, 471, 473, 476, 480, 486, 488, 490, 492, 493, 494, 495, 498, 501, 502, 506, 507, 508, 510, 511, 512, 514, 517, 519, 520, 534, 536, 540, 560, 566, 570, 571, 579, 581, 584, 589, 591, 592, 593, 597, 598, 600, 601, 609, 612, 613, 614, 615, 621, 625, 626, 628, 629, 632, 644, 647, 650, 663, 666, 682, 684, 686, 692, 694, 695, 702, 707, 711, 718, 720, 721, 722, 727, 729, 730, 732, 734, 739, 754, 770, 771, 774, 779, 783, 785, 789, 797, 798, 800, 801, 805, 809, 817, 818, 819, 820, 821, 822, 823, 824, 826, 837, 840, 842, 847, 864, 865, 866, 869, 870, 875, 884, 888, 889, 890, 899, 901, 904, 909, 913, 924, 927, 930, 935, 937, 945, 952, 962, 966, 969, 980, 986, 991, 994, 1006, 1011, 1012, 1014, 1018, 1024, 1032, 1036, 1040, 1043, 1049, 1051, 1053, 1066, 1069, 1071, 1075, 1080, 1087, 1107, 1116, 1117, 1120, 1125, 1126, 1129, 1135, 1141, 1153, 1158, 1161, 1166, 1172, 1174, 1177, 1178, 1180, 1181, 1183, 1185, 1198, 1203, 1228, 1232, 1236, 1239, 1240, 1241, 1242, 1246, 1251, 1252, 1254, 1259, 1266, 1282, 1289, 1305, 1307, 1308, 1311, 1313, 1314, 1317, 1324, 1331, 1339, 1342, 1347, 1349, 1353, 1357, 1362, 1363, 1367, 1379, 1380, 1384, 1385, 1387, 1389, 1390, 1393, 1395, 1402, 1406, 1407, 1416, 1420, 1422, 1426, 1428, 1430, 1431, 1440, 1444, 1445, 1448, 1451, 1453, 1454, 1456, 1458, 1462, 1469, 1473, 1476, 1481, 1486, 1490, 1494, 1496, 1512, 1517, 1518, 1522, 1525, 1526, 1540, 1547, 1548, 1556, 
1560, 1562, 1580, 1585, 1587, 1590, 1593, 1594, 1596, 1603, 1604, 1605, 1606, 1608, 1613, 1620, 1622, 1624, 1631, 1633, 1636, 1637, 1641, 1643, 1645, 1648, 1650, 1654, 1655, 1658, 1659, 1665, 1668, 1672, 1674, 1679, 1686, 1690, 1695, 1701, 1704, 1712, 1716, 1726, 1727, 1751, 1755, 1756, 1758, 1768, 1770, 1785, 1786, 1787, 1796, 1797, 1798, 1812, 1816, 1820, 1827, 1835, 1836, 1848, 1851, 1854, 1858, 1860, 1869, 1879, 1895, 1905, 1907, 1914, 1930, 1933, 1943, 1945, 1953, 1958, 1964, 1965, 1978, 1981, 1990, 1992, 1994, 1995, 2001, 2011, 2014, 2017, 2025, 2028, 2029, 2030, 2035, 2044, 2046, 2061, 2062, 2064, 2067, 2072, 2076, 2077, 2091, 2092, 2094, 2105, 2107, 2109, 2112, 2114, 2118, 2119, 2121, 2126, 2130, 2133, 2139, 2140, 2141, 2144, 2145, 2147, 2161, 2162, 2164, 2167, 2174, 2177, 2180, 2183, 2185, 2186, 2188, 2193, 2194, 2196, 2205, 2210, 2212, 2215, 2229, 2230, 2237, 2240, 2244, 2247, 2249, 2261, 2262, 2265, 2274, 2287, 2293, 2297, 2301, 2304, 2311, 2312, 2313, 2318, 2320, 2322, 2324, 2330, 2333, 2339, 2340, 2348, 2354, 2359, 2360, 2364, 2366, 2369, 2373, 2388, 2395, 2396, 2400, 2404, 2405, 2406, 2409, 2410, 2423, 2424, 2425, 2429, 2437, 2439, 2440, 2441, 2442, 2448, 2469, 2471, 2474, 2477, 2486, 2488, 2491, 2499, 2511, 2515, 2517, 2522, 2524, 2529, 2532, 2535, 2539, 2542, 2547, 2548, 2549, 2555, 2556, 2557, 2559, 2560, 2562, 2563, 2564, 2565, 2566, 2569, 2575, 2589, 2600, 2604, 2606, 2610, 2616, 2617, 2624, 2626, 2629, 2630, 2632, 2655, 2663, 2667, 2671, 2672, 2676, 2708, 2714, 2727, 2733, 2736, 2746, 2749, 2751, 2754, 2756, 2757, 2760, 2762, 2764, 2766, 2767, 2781, 2788, 2795, 2797, 2798, 2801, 2803, 2807, 2810, 2811, 2812, 2814, 2815, 2816, 2823, 2837, 2843, 2844, 2852, 2857, 2858, 2861, 2873, 2880, 2882, 2884, 2888, 2899, 2902, 2905, 2906, 2912, 2913, 2915, 2916, 2919, 2921, 2931, 2933, 2938, 2944, 2949, 2953, 2955, 2962, 2966, 2969, 2977, 2979, 2980, 2981, 2983, 2985, 2990, 2991, 2995, 2999], 'precision': 0.8292798021666021, 'recall': 0.7693333333333333, 
'f1': 0.7930169589012503, 'ratio_valid_classifications': 1.0}
logs/Qwen2-72B-Instruct_epoch_9.txt
CHANGED
@@ -5,12 +5,12 @@ Qwen/Qwen2-72B-Instruct llama-factory/saves/Qwen2-72B-Instruct/checkpoint-315 Tr
 (1) GPU = NVIDIA L40. Max memory = 44.309 GB.
 0.0 GB of memory reserved.
 loading model: Qwen/Qwen2-72B-Instruct with adapter: llama-factory/saves/Qwen2-72B-Instruct/checkpoint-315
-09/
-09/
-09/
-09/
-09/
-09/
+09/11/2024 10:32:12 - INFO - llamafactory.data.template - Replace eos token: <|im_end|>
+09/11/2024 10:32:12 - INFO - llamafactory.model.model_utils.quantization - Quantizing model to 4 bit with bitsandbytes.
+09/11/2024 10:32:12 - INFO - llamafactory.model.patcher - Using KV cache for faster generation.
+09/11/2024 10:42:53 - INFO - llamafactory.model.model_utils.attention - Using torch SDPA for faster training and inference.
+09/11/2024 10:42:55 - INFO - llamafactory.model.adapter - Loaded adapter(s): llama-factory/saves/Qwen2-72B-Instruct/checkpoint-315
+09/11/2024 10:42:55 - INFO - llamafactory.model.loader - all params: 72,811,470,848
 (2) GPU = NVIDIA L40. Max memory = 44.309 GB.
 43.037 GB of memory reserved.
 loading train/test data files
@@ -153,4 +153,11 @@ You are an expert in logical reasoning.<|im_end|>
 <|im_start|>assistant
 
 Evaluating model: Qwen/Qwen2-72B-Instruct
-Batch output: ['不是'
+Batch output: ['不是']
+(3) GPU = NVIDIA L40. Max memory = 44.309 GB.
+43.537 GB of memory reserved.
+text ... Qwen/Qwen2-72B-Instruct/checkpoint-315_torch.bfloat16_4bit_lf
+0 甄加索是自杀吗 ... 不是
+
+[1 rows x 15 columns]
+
{'accuracy': 0.784, 'incorrect_ids': [9, 11, 27, 29, 31, 34, 36, 55, 58, 59, 61, 65, 66, 67, 77, 78, 81, 82, 83, 88, 91, 93, 94, 97, 104, 112, 114, 115, 119, 124, 129, 131, 135, 137, 138, 139, 143, 150, 155, 161, 163, 164, 179, 190, 193, 199, 200, 202, 218, 224, 225, 229, 243, 245, 248, 250, 251, 255, 257, 259, 260, 262, 263, 265, 271, 283, 286, 289, 292, 294, 295, 299, 304, 311, 314, 317, 321, 323, 326, 330, 332, 334, 335, 342, 347, 350, 353, 355, 356, 360, 362, 368, 370, 372, 373, 376, 377, 389, 395, 396, 397, 410, 414, 416, 423, 428, 429, 430, 445, 447, 452, 454, 455, 456, 457, 458, 461, 471, 472, 473, 476, 480, 483, 488, 492, 493, 495, 498, 501, 502, 503, 506, 507, 508, 509, 510, 511, 514, 515, 517, 519, 520, 536, 540, 553, 560, 566, 570, 571, 579, 581, 589, 591, 592, 598, 600, 601, 612, 613, 614, 615, 621, 625, 626, 628, 629, 632, 644, 647, 663, 666, 674, 680, 682, 684, 686, 692, 694, 695, 702, 707, 711, 720, 721, 722, 727, 729, 730, 732, 734, 739, 754, 770, 774, 785, 786, 788, 795, 798, 800, 801, 805, 809, 817, 820, 821, 822, 823, 824, 833, 837, 840, 842, 847, 861, 866, 870, 873, 884, 889, 890, 894, 901, 904, 909, 913, 927, 930, 935, 941, 952, 958, 962, 966, 969, 980, 991, 994, 1006, 1012, 1014, 1022, 1028, 1032, 1036, 1040, 1043, 1049, 1051, 1053, 1056, 1057, 1066, 1069, 1075, 1077, 1078, 1080, 1087, 1116, 1120, 1125, 1126, 1129, 1138, 1158, 1166, 1170, 1172, 1174, 1177, 1178, 1180, 1181, 1185, 1196, 1203, 1209, 1212, 1216, 1220, 1228, 1232, 1236, 1239, 1240, 1241, 1245, 1246, 1251, 1252, 1254, 1259, 1266, 1282, 1289, 1305, 1308, 1311, 1313, 1314, 1317, 1324, 1326, 1339, 1342, 1347, 1349, 1353, 1357, 1370, 1379, 1386, 1387, 1389, 1392, 1393, 1395, 1402, 1406, 1413, 1416, 1420, 1422, 1426, 1428, 1440, 1444, 1453, 1454, 1462, 1469, 1475, 1476, 1481, 1486, 1490, 1494, 1495, 1496, 1510, 1512, 1517, 1518, 1522, 1525, 1526, 1528, 1533, 1547, 1548, 1558, 1562, 1576, 1585, 1590, 1593, 1594, 1602, 1603, 1605, 1606, 1622, 1624, 1627, 1631, 1636, 1637, 1641, 1647, 
1648, 1650, 1654, 1655, 1658, 1659, 1668, 1672, 1674, 1679, 1686, 1690, 1695, 1712, 1715, 1716, 1726, 1727, 1728, 1751, 1755, 1756, 1768, 1770, 1773, 1786, 1787, 1796, 1799, 1812, 1816, 1820, 1827, 1835, 1836, 1841, 1847, 1858, 1860, 1869, 1888, 1897, 1905, 1907, 1914, 1933, 1953, 1958, 1964, 1965, 1978, 1981, 1984, 1990, 1992, 1994, 1995, 2001, 2008, 2014, 2015, 2017, 2028, 2035, 2038, 2044, 2046, 2053, 2059, 2062, 2064, 2067, 2072, 2076, 2077, 2085, 2091, 2095, 2105, 2107, 2109, 2112, 2118, 2119, 2121, 2126, 2130, 2133, 2139, 2141, 2145, 2147, 2159, 2161, 2162, 2164, 2167, 2177, 2183, 2185, 2186, 2188, 2189, 2193, 2195, 2197, 2210, 2212, 2221, 2226, 2229, 2230, 2237, 2240, 2249, 2262, 2265, 2274, 2280, 2287, 2297, 2301, 2304, 2312, 2313, 2318, 2320, 2322, 2324, 2330, 2333, 2340, 2348, 2359, 2360, 2364, 2366, 2369, 2373, 2385, 2388, 2395, 2396, 2400, 2404, 2405, 2406, 2409, 2410, 2422, 2423, 2424, 2425, 2429, 2437, 2440, 2442, 2446, 2448, 2471, 2477, 2486, 2488, 2493, 2499, 2502, 2503, 2511, 2515, 2517, 2522, 2524, 2526, 2529, 2532, 2535, 2539, 2547, 2548, 2549, 2555, 2556, 2559, 2560, 2562, 2575, 2589, 2597, 2600, 2604, 2605, 2606, 2610, 2616, 2624, 2629, 2630, 2632, 2655, 2660, 2663, 2671, 2672, 2676, 2678, 2707, 2714, 2715, 2727, 2733, 2735, 2736, 2744, 2745, 2749, 2751, 2754, 2756, 2757, 2760, 2762, 2764, 2766, 2781, 2788, 2795, 2797, 2803, 2806, 2807, 2810, 2811, 2815, 2816, 2823, 2837, 2842, 2843, 2844, 2845, 2850, 2852, 2854, 2857, 2858, 2861, 2875, 2880, 2882, 2884, 2888, 2899, 2902, 2905, 2906, 2912, 2913, 2915, 2916, 2921, 2931, 2933, 2937, 2944, 2949, 2953, 2962, 2966, 2969, 2973, 2975, 2976, 2980, 2981, 2983, 2991, 2995, 2999], 'precision': 0.8354349234761956, 'recall': 0.784, 'f1': 0.804194683154365, 'ratio_valid_classifications': 1.0}
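The four checkpoints log overlapping but not identical `incorrect_ids` lists, which invites a cross-epoch comparison: which test rows stay wrong at every checkpoint, and which a later epoch fixes. A small sketch using plain set operations (the id values below are tiny stand-ins, not the full lists from the logs above):

```python
# Stand-in subsets of the 'incorrect_ids' lists for two checkpoints;
# the real lists each contain several hundred ids.
epoch_9_errors = {9, 11, 27, 29, 31}
epoch_10_errors = {9, 11, 17, 18, 24, 29}

# Rows misclassified by both checkpoints: persistent failures.
always_wrong = epoch_9_errors & epoch_10_errors

# Rows wrong at epoch 9 but solved by epoch 10.
fixed_in_10 = epoch_9_errors - epoch_10_errors

print(sorted(always_wrong))  # [9, 11, 29]
print(sorted(fixed_in_10))   # [27, 31]
```

On the full lists, the size of the intersection relative to either list shows how stable the error set is between checkpoints.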
results/Qwen2-72B-Instruct_p2.csv
CHANGED
The diff for this file is too large to render.