imalofeev committed on
Commit e970115
0 Parent(s):

Initial commit
.gitattributes ADDED
@@ -0,0 +1,40 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ baseline_enot/weights** filter=lfs diff=lfs merge=lfs -text
+ baseline_ultralytics/weights/** filter=lfs diff=lfs merge=lfs -text
+ enot_neural_architecture_selection_x2/weights/** filter=lfs diff=lfs merge=lfs -text
+ enot_neural_architecture_selection_x3/weights/** filter=lfs diff=lfs merge=lfs -text
+ baseline_enot_nano/weights/** filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,63 @@
+ ---
+ license: apache-2.0
+ datasets:
+ - visdrone
+ model-index:
+ - name: ENOT-AutoDL/yolov8s_visdrone
+   results:
+   - task:
+       type: object-detection
+     metrics:
+     - type: precision
+       value: 49.4
+       name: mAP50(baseline)
+     - type: precision
+       value: 48.3
+       name: mAP50(GMACs x2)
+     - type: precision
+       value: 46.0
+       name: mAP50(GMACs x3)
+ library_name: ultralytics
+ pipeline_tag: object-detection
+ tags:
+ - yolov8
+ - ENOT-AutoDL
+ - yolo
+ - vision
+ - ultralytics
+ - object-detection
+ ---
+ # ENOT-AutoDL YOLOv8 optimization on VisDrone dataset
+
+ This repository contains models accelerated with the [ENOT-AutoDL](https://pypi.org/project/enot-autodl/) framework.
+ We trained YOLOv8s on the VisDrone dataset and used it as our baseline.
+ We also provide simple Python scripts to measure MACs and metrics.
+
+ ## YOLOv8 Small
+
+ | Model | GMACs | Image Size | mAP50 | mAP50-95 |
+ |---------------------------|:-----------:|:-----------:|:-----------:|:-----------:|
+ | **[YOLOv8 Ultralytics Baseline](https://docs.ultralytics.com/datasets/detect/visdrone/#dataset-yaml)** | 14.28 | 640 | 40.2 | 24.2 |
+ | **YOLOv8n Enot Baseline** | 8.57 | 928 | 42.9 | 26.0 |
+ | **YOLOv8s Enot Baseline** | 30.03 | 928 | 49.4 | 30.6 |
+ | **YOLOv8s (x2)** | 15.01 (x2) | 928 | 48.3 (-1.1) | 29.8 (-0.8) |
+ | **YOLOv8s (x3)** | 10.01 (x3) | 928 | 46.0 (-3.4) | 28.3 (-2.3) |
+
+ # Validation
+
+ To validate the results, follow these steps:
+
+ 1. Install all required packages:
+ ```bash
+ pip install -r requirements.txt
+ ```
+
+ 2. Run the validation script:
+ ```bash
+ python validate.py enot_neural_architecture_selection_x2/weights/best.pt --imgsz 928
+ ```
+
+ 3. Run the measure_macs script:
+ ```bash
+ python measure_macs.py enot_neural_architecture_selection_x2/weights/best.pt --imgsz 928
+ ```
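As a quick sanity check on the table above, the reported GMACs ratios and mAP50 drops can be recomputed with a few lines of Python (numbers taken from the table; nothing here is measured):

```python
# Reported numbers for YOLOv8s Enot baseline vs. the accelerated variants.
baseline = {"gmacs": 30.03, "map50": 49.4}
x2 = {"gmacs": 15.01, "map50": 48.3}
x3 = {"gmacs": 10.01, "map50": 46.0}

for name, m in (("x2", x2), ("x3", x3)):
    ratio = baseline["gmacs"] / m["gmacs"]   # compression factor in MACs
    drop = baseline["map50"] - m["map50"]    # accuracy cost
    print(f"{name}: {ratio:.2f}x fewer MACs, mAP50 drop {drop:.1f}")
```

The ratios come out to 2.00x and 3.00x, matching the "(x2)" and "(x3)" labels in the table.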
baseline_enot/args.yaml ADDED
@@ -0,0 +1,97 @@
+ task: detect
+ mode: train
+ model: yolov8s.pt
+ data: VisDrone.yaml
+ epochs: 100
+ patience: 50
+ batch: 16
+ imgsz: 928
+ save: true
+ save_period: -1
+ cache: false
+ device: null
+ workers: 16
+ project: null
+ name: visdrone_yolo8s
+ exist_ok: false
+ pretrained: true
+ optimizer: auto
+ verbose: true
+ seed: 0
+ deterministic: true
+ single_cls: false
+ rect: false
+ cos_lr: false
+ close_mosaic: 10
+ resume: false
+ amp: true
+ fraction: 1.0
+ profile: false
+ freeze: null
+ overlap_mask: true
+ mask_ratio: 4
+ dropout: 0.0
+ val: true
+ split: val
+ save_json: false
+ save_hybrid: false
+ conf: null
+ iou: 0.7
+ max_det: 300
+ half: false
+ dnn: false
+ plots: true
+ source: null
+ show: false
+ save_txt: false
+ save_conf: false
+ save_crop: false
+ show_labels: true
+ show_conf: true
+ vid_stride: 1
+ stream_buffer: false
+ line_width: null
+ visualize: false
+ augment: false
+ agnostic_nms: false
+ classes: null
+ retina_masks: false
+ boxes: true
+ format: torchscript
+ keras: false
+ optimize: false
+ int8: false
+ dynamic: false
+ simplify: false
+ opset: null
+ workspace: 4
+ nms: false
+ lr0: 0.01
+ lrf: 0.01
+ momentum: 0.937
+ weight_decay: 0.0005
+ warmup_epochs: 3.0
+ warmup_momentum: 0.8
+ warmup_bias_lr: 0.1
+ box: 7.5
+ cls: 0.5
+ dfl: 1.5
+ pose: 12.0
+ kobj: 1.0
+ label_smoothing: 0.0
+ nbs: 64
+ hsv_h: 0.015
+ hsv_s: 0.7
+ hsv_v: 0.4
+ degrees: 0.0
+ translate: 0.1
+ scale: 0.5
+ shear: 0.0
+ perspective: 0.0
+ flipud: 0.0
+ fliplr: 0.5
+ mosaic: 1.0
+ mixup: 0.0
+ copy_paste: 0.0
+ cfg: null
+ tracker: botsort.yaml
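The args.yaml above is a flat `key: value` file, so it can be read without a YAML library. A minimal parsing sketch (not part of the repo; field names taken from the file above):

```python
def parse_flat_yaml(text: str) -> dict:
    """Parse a flat `key: value` YAML file (no nesting), as in an Ultralytics args.yaml."""
    args = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, _, value = line.partition(":")
        args[key.strip()] = value.strip()
    return args

# A few lines from the config above, inlined for illustration.
sample = "task: detect\nimgsz: 928\nepochs: 100\nlr0: 0.01"
args = parse_flat_yaml(sample)
print(args["imgsz"])  # -> '928' (values come back as strings)
```

In practice you would pass the file contents, e.g. `parse_flat_yaml(open("baseline_enot/args.yaml").read())`, or simply use PyYAML.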
baseline_enot/events.out.tfevents.1697772240.user-MS-7D67.210222.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f7f6ba531515386b96dbf42f8ad14a79c3a1b7fd21d935ba7d291bd01b21f771
+ size 6669068
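Large binaries in this commit (weights, event files) are stored as Git LFS pointer files like the one above, with `version`, `oid`, and `size` fields. A pointer can be parsed with a short sketch (not part of the repo):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer file into its space-separated key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The tfevents pointer shown above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:f7f6ba531515386b96dbf42f8ad14a79c3a1b7fd21d935ba7d291bd01b21f771
size 6669068"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # file size in bytes, as a string
```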
baseline_enot/results.csv ADDED
@@ -0,0 +1,101 @@
+ epoch, train/box_loss, train/cls_loss, train/dfl_loss, metrics/precision(B), metrics/recall(B), metrics/mAP50(B), metrics/mAP50-95(B), val/box_loss, val/cls_loss, val/dfl_loss, lr/pg0, lr/pg1, lr/pg2
+ 1, 1.4761, 1.7546, 1.0361, 0.3965, 0.31488, 0.29818, 0.17551, 1.3675, 1.1394, 0.97439, 0.0033251, 0.0033251, 0.0033251
+ 2, 1.4063, 1.1858, 0.98274, 0.43967, 0.34347, 0.34138, 0.19975, 1.3539, 1.0878, 0.96724, 0.0065925, 0.0065925, 0.0065925
+ 3, 1.4311, 1.1618, 0.98396, 0.46416, 0.33872, 0.34605, 0.206, 1.3383, 1.1508, 0.96733, 0.0097939, 0.0097939, 0.0097939
+ 4, 1.424, 1.1284, 0.98215, 0.44224, 0.36184, 0.35898, 0.21446, 1.334, 1.0642, 0.96794, 0.009703, 0.009703, 0.009703
+ 5, 1.3951, 1.0807, 0.97344, 0.47696, 0.37251, 0.37733, 0.22055, 1.337, 1.0536, 0.96301, 0.009703, 0.009703, 0.009703
+ 6, 1.3815, 1.0497, 0.96729, 0.4754, 0.38559, 0.38478, 0.22964, 1.3084, 1.0197, 0.96057, 0.009604, 0.009604, 0.009604
+ 7, 1.3645, 1.0289, 0.96614, 0.50353, 0.38079, 0.39356, 0.23629, 1.3124, 1.0002, 0.95556, 0.009505, 0.009505, 0.009505
+ 8, 1.3471, 1.0132, 0.96249, 0.48985, 0.40549, 0.41037, 0.24521, 1.2854, 0.97809, 0.94897, 0.009406, 0.009406, 0.009406
+ 9, 1.3354, 0.9932, 0.95539, 0.52244, 0.40118, 0.41701, 0.25025, 1.2927, 0.97084, 0.94973, 0.009307, 0.009307, 0.009307
+ 10, 1.318, 0.97654, 0.95339, 0.50473, 0.39423, 0.40622, 0.24226, 1.3011, 0.97584, 0.94915, 0.009208, 0.009208, 0.009208
+ 11, 1.3145, 0.9641, 0.94751, 0.5417, 0.40117, 0.42642, 0.25581, 1.2787, 0.95755, 0.94484, 0.009109, 0.009109, 0.009109
+ 12, 1.3063, 0.94875, 0.94846, 0.5376, 0.42076, 0.44247, 0.26818, 1.2597, 0.94047, 0.94018, 0.00901, 0.00901, 0.00901
+ 13, 1.3047, 0.93901, 0.94621, 0.52726, 0.41454, 0.43154, 0.26259, 1.2681, 0.94533, 0.9437, 0.008911, 0.008911, 0.008911
+ 14, 1.2887, 0.92986, 0.94257, 0.53908, 0.41687, 0.43536, 0.26304, 1.2608, 0.9399, 0.94351, 0.008812, 0.008812, 0.008812
+ 15, 1.2791, 0.91818, 0.93994, 0.54551, 0.41371, 0.43663, 0.26533, 1.2606, 0.93349, 0.93931, 0.008713, 0.008713, 0.008713
+ 16, 1.2754, 0.91294, 0.93883, 0.54154, 0.43035, 0.44883, 0.2728, 1.2506, 0.91463, 0.93892, 0.008614, 0.008614, 0.008614
+ 17, 1.2695, 0.90378, 0.93665, 0.52216, 0.43034, 0.44458, 0.27034, 1.2613, 0.92401, 0.9424, 0.008515, 0.008515, 0.008515
+ 18, 1.2756, 0.89794, 0.93691, 0.53852, 0.42356, 0.44476, 0.27026, 1.2536, 0.92697, 0.93903, 0.008416, 0.008416, 0.008416
+ 19, 1.2573, 0.88592, 0.93319, 0.55241, 0.4299, 0.45315, 0.27676, 1.2436, 0.90916, 0.93595, 0.008317, 0.008317, 0.008317
+ 20, 1.2553, 0.87851, 0.93287, 0.54586, 0.43819, 0.45975, 0.28177, 1.2307, 0.89486, 0.93085, 0.008218, 0.008218, 0.008218
+ 21, 1.2573, 0.88048, 0.93258, 0.55478, 0.43294, 0.45746, 0.27897, 1.2393, 0.89898, 0.93427, 0.008119, 0.008119, 0.008119
+ 22, 1.24, 0.86573, 0.92995, 0.53328, 0.44148, 0.45753, 0.28037, 1.2469, 0.90405, 0.93584, 0.00802, 0.00802, 0.00802
+ 23, 1.2397, 0.86206, 0.92955, 0.5534, 0.43696, 0.45677, 0.27888, 1.236, 0.89546, 0.93319, 0.007921, 0.007921, 0.007921
+ 24, 1.2372, 0.85951, 0.92834, 0.53718, 0.43266, 0.45558, 0.27824, 1.2424, 0.90272, 0.93278, 0.007822, 0.007822, 0.007822
+ 25, 1.2265, 0.84651, 0.92765, 0.5536, 0.44633, 0.46261, 0.28289, 1.2349, 0.89213, 0.93245, 0.007723, 0.007723, 0.007723
+ 26, 1.2296, 0.84323, 0.92557, 0.55676, 0.44068, 0.46407, 0.28296, 1.2336, 0.8877, 0.93252, 0.007624, 0.007624, 0.007624
+ 27, 1.2312, 0.84345, 0.9257, 0.55946, 0.44643, 0.4682, 0.28697, 1.2269, 0.88177, 0.92894, 0.007525, 0.007525, 0.007525
+ 28, 1.2224, 0.83758, 0.92308, 0.55565, 0.44636, 0.46614, 0.28512, 1.2292, 0.88358, 0.92892, 0.007426, 0.007426, 0.007426
+ 29, 1.2238, 0.83374, 0.92222, 0.55403, 0.44783, 0.46751, 0.28542, 1.2351, 0.88366, 0.92878, 0.007327, 0.007327, 0.007327
+ 30, 1.2141, 0.82926, 0.92249, 0.56018, 0.45248, 0.47102, 0.28689, 1.2315, 0.88328, 0.92821, 0.007228, 0.007228, 0.007228
+ 31, 1.2112, 0.82284, 0.92134, 0.56132, 0.4535, 0.47181, 0.28953, 1.2243, 0.87896, 0.92701, 0.007129, 0.007129, 0.007129
+ 32, 1.2053, 0.81906, 0.91942, 0.57574, 0.44134, 0.47123, 0.28794, 1.2248, 0.87384, 0.92643, 0.00703, 0.00703, 0.00703
+ 33, 1.211, 0.82068, 0.92091, 0.55408, 0.45288, 0.47286, 0.29012, 1.225, 0.87033, 0.92777, 0.006931, 0.006931, 0.006931
+ 34, 1.204, 0.81339, 0.91819, 0.55773, 0.44737, 0.47125, 0.28961, 1.2215, 0.87214, 0.92715, 0.006832, 0.006832, 0.006832
+ 35, 1.2001, 0.80374, 0.91734, 0.55959, 0.45255, 0.47244, 0.28897, 1.2219, 0.86918, 0.92651, 0.006733, 0.006733, 0.006733
+ 36, 1.2011, 0.80455, 0.91711, 0.55716, 0.45604, 0.47624, 0.29018, 1.2205, 0.87147, 0.92656, 0.006634, 0.006634, 0.006634
+ 37, 1.1984, 0.79751, 0.91607, 0.57059, 0.45467, 0.47666, 0.2936, 1.2217, 0.86785, 0.92748, 0.006535, 0.006535, 0.006535
+ 38, 1.1884, 0.79312, 0.91505, 0.57932, 0.45325, 0.47859, 0.29381, 1.218, 0.86472, 0.92581, 0.006436, 0.006436, 0.006436
+ 39, 1.1909, 0.79266, 0.91406, 0.56715, 0.45481, 0.47911, 0.29523, 1.2165, 0.85997, 0.92701, 0.006337, 0.006337, 0.006337
+ 40, 1.1871, 0.79124, 0.91509, 0.56161, 0.45695, 0.47979, 0.2936, 1.2192, 0.86021, 0.92595, 0.006238, 0.006238, 0.006238
+ 41, 1.1845, 0.78371, 0.91384, 0.58462, 0.44685, 0.47769, 0.29297, 1.2136, 0.86106, 0.9257, 0.006139, 0.006139, 0.006139
+ 42, 1.1786, 0.77927, 0.91419, 0.56434, 0.45938, 0.47954, 0.29594, 1.2145, 0.85634, 0.92596, 0.00604, 0.00604, 0.00604
+ 43, 1.1825, 0.77755, 0.9122, 0.56151, 0.45453, 0.47821, 0.29503, 1.2114, 0.85716, 0.92577, 0.005941, 0.005941, 0.005941
+ 44, 1.1781, 0.77119, 0.91143, 0.57017, 0.45429, 0.48058, 0.29537, 1.2124, 0.85637, 0.92405, 0.005842, 0.005842, 0.005842
+ 45, 1.1741, 0.76864, 0.90855, 0.57144, 0.45941, 0.48127, 0.29532, 1.2149, 0.85461, 0.92485, 0.005743, 0.005743, 0.005743
+ 46, 1.1714, 0.7629, 0.91028, 0.56556, 0.46016, 0.48196, 0.29722, 1.2077, 0.85546, 0.92195, 0.005644, 0.005644, 0.005644
+ 47, 1.1641, 0.76022, 0.90876, 0.5822, 0.45321, 0.48536, 0.29811, 1.2141, 0.85436, 0.92379, 0.005545, 0.005545, 0.005545
+ 48, 1.172, 0.76512, 0.90838, 0.57112, 0.45987, 0.48307, 0.29802, 1.2107, 0.84931, 0.92284, 0.005446, 0.005446, 0.005446
+ 49, 1.1597, 0.75491, 0.90855, 0.5905, 0.45212, 0.48311, 0.29632, 1.2082, 0.85297, 0.92215, 0.005347, 0.005347, 0.005347
+ 50, 1.1657, 0.75412, 0.90683, 0.57774, 0.45991, 0.48437, 0.29805, 1.2086, 0.84935, 0.92253, 0.005248, 0.005248, 0.005248
+ 51, 1.1619, 0.75171, 0.9069, 0.5791, 0.46314, 0.48712, 0.30036, 1.2062, 0.84582, 0.92264, 0.005149, 0.005149, 0.005149
+ 52, 1.1599, 0.74866, 0.90685, 0.57608, 0.46587, 0.48922, 0.30038, 1.2077, 0.84571, 0.92296, 0.00505, 0.00505, 0.00505
+ 53, 1.1585, 0.74449, 0.9053, 0.58421, 0.46177, 0.48793, 0.29875, 1.2106, 0.84587, 0.923, 0.004951, 0.004951, 0.004951
+ 54, 1.1625, 0.74241, 0.90605, 0.57394, 0.46053, 0.48256, 0.29779, 1.2065, 0.84386, 0.92113, 0.004852, 0.004852, 0.004852
+ 55, 1.1492, 0.73875, 0.90464, 0.57103, 0.46496, 0.48522, 0.29912, 1.2117, 0.84644, 0.92192, 0.004753, 0.004753, 0.004753
+ 56, 1.1523, 0.73837, 0.90505, 0.57812, 0.46241, 0.48782, 0.30066, 1.2081, 0.8452, 0.92165, 0.004654, 0.004654, 0.004654
+ 57, 1.1469, 0.73132, 0.90263, 0.5841, 0.46081, 0.48611, 0.29971, 1.2057, 0.84517, 0.92131, 0.004555, 0.004555, 0.004555
+ 58, 1.1417, 0.72782, 0.90147, 0.57659, 0.45785, 0.48571, 0.30004, 1.206, 0.84387, 0.92211, 0.004456, 0.004456, 0.004456
+ 59, 1.1435, 0.72524, 0.901, 0.57331, 0.46736, 0.48858, 0.30223, 1.2049, 0.84321, 0.92158, 0.004357, 0.004357, 0.004357
+ 60, 1.1404, 0.71987, 0.89914, 0.56216, 0.47101, 0.48919, 0.30127, 1.2034, 0.84493, 0.92114, 0.004258, 0.004258, 0.004258
+ 61, 1.1341, 0.71429, 0.89993, 0.57601, 0.46799, 0.48677, 0.29986, 1.2009, 0.84409, 0.92058, 0.004159, 0.004159, 0.004159
+ 62, 1.1434, 0.72094, 0.90053, 0.56069, 0.47219, 0.48432, 0.29697, 1.2086, 0.84665, 0.92239, 0.00406, 0.00406, 0.00406
+ 63, 1.1356, 0.71252, 0.90017, 0.56659, 0.46759, 0.48596, 0.29891, 1.2037, 0.84267, 0.92186, 0.003961, 0.003961, 0.003961
+ 64, 1.1265, 0.70952, 0.89857, 0.58815, 0.46111, 0.48891, 0.30143, 1.2075, 0.84206, 0.92209, 0.003862, 0.003862, 0.003862
+ 65, 1.1326, 0.70907, 0.89947, 0.57659, 0.46701, 0.48751, 0.30087, 1.2048, 0.8429, 0.92221, 0.003763, 0.003763, 0.003763
+ 66, 1.127, 0.70454, 0.89864, 0.57313, 0.47243, 0.49042, 0.30242, 1.2115, 0.8414, 0.92324, 0.003664, 0.003664, 0.003664
+ 67, 1.1273, 0.70232, 0.89718, 0.58543, 0.46626, 0.49211, 0.30267, 1.2088, 0.83977, 0.92286, 0.003565, 0.003565, 0.003565
+ 68, 1.1224, 0.69711, 0.89621, 0.58338, 0.46497, 0.48956, 0.30235, 1.2052, 0.84193, 0.92176, 0.003466, 0.003466, 0.003466
+ 69, 1.1169, 0.6921, 0.89558, 0.57446, 0.47387, 0.49119, 0.30316, 1.2042, 0.84005, 0.92224, 0.003367, 0.003367, 0.003367
+ 70, 1.1218, 0.69686, 0.8954, 0.58102, 0.46411, 0.48988, 0.30198, 1.2078, 0.84193, 0.92202, 0.003268, 0.003268, 0.003268
+ 71, 1.1199, 0.69053, 0.89649, 0.58082, 0.46593, 0.48832, 0.30171, 1.2073, 0.84135, 0.92268, 0.003169, 0.003169, 0.003169
+ 72, 1.1187, 0.69187, 0.89582, 0.56918, 0.47462, 0.49093, 0.3026, 1.2058, 0.84122, 0.92194, 0.00307, 0.00307, 0.00307
+ 73, 1.1113, 0.68617, 0.89373, 0.58907, 0.46477, 0.49044, 0.30249, 1.2064, 0.84081, 0.92206, 0.002971, 0.002971, 0.002971
+ 74, 1.1114, 0.6833, 0.89175, 0.58027, 0.46793, 0.49271, 0.3043, 1.2045, 0.84041, 0.92062, 0.002872, 0.002872, 0.002872
+ 75, 1.1036, 0.67604, 0.89176, 0.58243, 0.47273, 0.49288, 0.30423, 1.2061, 0.84071, 0.92102, 0.002773, 0.002773, 0.002773
+ 76, 1.1069, 0.67806, 0.8927, 0.57203, 0.47729, 0.4924, 0.30399, 1.2067, 0.83884, 0.92152, 0.002674, 0.002674, 0.002674
+ 77, 1.1047, 0.67442, 0.89183, 0.58428, 0.46829, 0.49107, 0.30316, 1.2069, 0.84157, 0.92111, 0.002575, 0.002575, 0.002575
+ 78, 1.1068, 0.67351, 0.89209, 0.58203, 0.47367, 0.4935, 0.30482, 1.2073, 0.84033, 0.9214, 0.002476, 0.002476, 0.002476
+ 79, 1.0994, 0.67005, 0.8902, 0.58224, 0.47067, 0.49245, 0.30339, 1.2064, 0.84065, 0.92112, 0.002377, 0.002377, 0.002377
+ 80, 1.0973, 0.66668, 0.88882, 0.57985, 0.47014, 0.49068, 0.30313, 1.2068, 0.84266, 0.92127, 0.002278, 0.002278, 0.002278
+ 81, 1.0997, 0.66566, 0.89004, 0.58422, 0.46782, 0.49079, 0.30262, 1.2099, 0.84459, 0.92239, 0.002179, 0.002179, 0.002179
+ 82, 1.0965, 0.66089, 0.88973, 0.58957, 0.4687, 0.49107, 0.30315, 1.2074, 0.84319, 0.9218, 0.00208, 0.00208, 0.00208
+ 83, 1.0939, 0.65999, 0.88885, 0.58676, 0.46898, 0.49089, 0.30261, 1.2072, 0.8451, 0.92178, 0.001981, 0.001981, 0.001981
+ 84, 1.089, 0.65702, 0.88765, 0.57789, 0.47184, 0.48886, 0.3014, 1.2083, 0.8452, 0.92161, 0.001882, 0.001882, 0.001882
+ 85, 1.0886, 0.65413, 0.88706, 0.58215, 0.46833, 0.48875, 0.30082, 1.2116, 0.84592, 0.92234, 0.001783, 0.001783, 0.001783
+ 86, 1.0918, 0.65326, 0.88882, 0.58082, 0.46924, 0.48914, 0.30116, 1.21, 0.84509, 0.92212, 0.001684, 0.001684, 0.001684
+ 87, 1.0843, 0.64844, 0.88662, 0.58691, 0.46553, 0.48929, 0.30079, 1.2076, 0.84496, 0.92123, 0.001585, 0.001585, 0.001585
+ 88, 1.0833, 0.64793, 0.88668, 0.58283, 0.46839, 0.4889, 0.30085, 1.2083, 0.84599, 0.92158, 0.001486, 0.001486, 0.001486
+ 89, 1.0798, 0.64403, 0.88436, 0.58262, 0.46739, 0.48803, 0.30043, 1.2104, 0.84608, 0.9224, 0.001387, 0.001387, 0.001387
+ 90, 1.0829, 0.64477, 0.88631, 0.59143, 0.46495, 0.48941, 0.30077, 1.2115, 0.84756, 0.92227, 0.001288, 0.001288, 0.001288
+ 91, 1.0808, 0.63544, 0.88839, 0.57854, 0.47158, 0.4867, 0.29857, 1.2114, 0.84971, 0.92208, 0.001189, 0.001189, 0.001189
+ 92, 1.0702, 0.62332, 0.88659, 0.57736, 0.4702, 0.48634, 0.29824, 1.2121, 0.8519, 0.92284, 0.00109, 0.00109, 0.00109
+ 93, 1.0668, 0.61931, 0.88715, 0.58109, 0.46559, 0.48443, 0.29699, 1.2125, 0.85265, 0.92261, 0.000991, 0.000991, 0.000991
+ 94, 1.0631, 0.61362, 0.88485, 0.57296, 0.47047, 0.48569, 0.29707, 1.2156, 0.85353, 0.9236, 0.000892, 0.000892, 0.000892
+ 95, 1.0586, 0.60919, 0.88331, 0.57582, 0.46765, 0.48445, 0.29656, 1.2166, 0.85501, 0.92416, 0.000793, 0.000793, 0.000793
+ 96, 1.0572, 0.60771, 0.88451, 0.58546, 0.45854, 0.48328, 0.29633, 1.2172, 0.85548, 0.92447, 0.000694, 0.000694, 0.000694
+ 97, 1.052, 0.60221, 0.88301, 0.57298, 0.46713, 0.48242, 0.2959, 1.2165, 0.85572, 0.92442, 0.000595, 0.000595, 0.000595
+ 98, 1.0541, 0.60285, 0.88288, 0.57668, 0.46574, 0.4819, 0.29599, 1.2169, 0.85712, 0.92483, 0.000496, 0.000496, 0.000496
+ 99, 1.048, 0.59823, 0.88225, 0.57154, 0.46725, 0.48205, 0.29623, 1.2167, 0.8583, 0.92477, 0.000397, 0.000397, 0.000397
+ 100, 1.047, 0.59711, 0.88043, 0.58053, 0.4617, 0.48107, 0.2958, 1.2176, 0.85889, 0.92488, 0.000298, 0.000298, 0.000298
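The results.csv above can be inspected programmatically; for instance, to find the epoch with the highest validation mAP50 (a sketch using only the standard library; column names are space-padded in Ultralytics CSVs, hence the stripping):

```python
import csv
import io

def best_epoch(csv_text: str, metric: str = "metrics/mAP50(B)"):
    """Return (epoch, value) for the row maximizing `metric` in a results.csv."""
    reader = csv.DictReader(io.StringIO(csv_text), skipinitialspace=True)
    # Strip residual padding from header names.
    rows = [{k.strip(): v for k, v in row.items()} for row in reader]
    best = max(rows, key=lambda r: float(r[metric]))
    return int(best["epoch"]), float(best[metric])

# Tiny inlined sample in the same format; in practice pass open("results.csv").read().
sample = (
    "epoch, metrics/mAP50(B)\n"
    "1, 0.298\n"
    "2, 0.341\n"
    "3, 0.346\n"
)
print(best_epoch(sample))  # -> (3, 0.346)
```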
baseline_enot/weights/best.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:46ba67794f58ab5093addef0603a0e2fcd533b3d66e2e1e23e16be254550d00f
+ size 22526382
baseline_enot/weights/last.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:16a3a9dd3fc1a875127ed4a6220e6968af62ae9eb5a18393a1dc5086eb126f80
+ size 22529006
baseline_enot_nano/args.yaml ADDED
@@ -0,0 +1,98 @@
+ task: detect
+ mode: train
+ model: yolov8n.pt
+ data: VisDrone.yaml
+ epochs: 100
+ patience: 50
+ batch: 24
+ imgsz: 928
+ save: true
+ save_period: -1
+ cache: false
+ device: null
+ workers: 16
+ project: null
+ name: visdrone_yolo8n3
+ exist_ok: false
+ pretrained: true
+ optimizer: auto
+ verbose: true
+ seed: 0
+ deterministic: true
+ single_cls: false
+ rect: false
+ cos_lr: false
+ close_mosaic: 10
+ resume: false
+ amp: true
+ fraction: 1.0
+ profile: false
+ freeze: null
+ overlap_mask: true
+ mask_ratio: 4
+ dropout: 0.0
+ val: true
+ split: val
+ save_json: false
+ save_hybrid: false
+ conf: null
+ iou: 0.7
+ max_det: 300
+ half: false
+ dnn: false
+ plots: true
+ source: null
+ show: false
+ save_txt: false
+ save_conf: false
+ save_crop: false
+ show_labels: true
+ show_conf: true
+ vid_stride: 1
+ stream_buffer: false
+ line_width: null
+ visualize: false
+ augment: false
+ agnostic_nms: false
+ classes: null
+ retina_masks: false
+ boxes: true
+ format: torchscript
+ keras: false
+ optimize: false
+ int8: false
+ dynamic: false
+ simplify: false
+ opset: null
+ workspace: 4
+ nms: false
+ lr0: 0.01
+ lrf: 0.01
+ momentum: 0.937
+ weight_decay: 0.0005
+ warmup_epochs: 3.0
+ warmup_momentum: 0.8
+ warmup_bias_lr: 0.1
+ box: 7.5
+ cls: 0.5
+ dfl: 1.5
+ pose: 12.0
+ kobj: 1.0
+ label_smoothing: 0.0
+ nbs: 64
+ hsv_h: 0.015
+ hsv_s: 0.7
+ hsv_v: 0.4
+ degrees: 0.0
+ translate: 0.1
+ scale: 0.5
+ shear: 0.0
+ perspective: 0.0
+ flipud: 0.0
+ fliplr: 0.5
+ mosaic: 1.0
+ mixup: 0.0
+ copy_paste: 0.0
+ cfg: null
+ tracker: botsort.yaml
+ save_dir: /home/malofeev/yolov8/ultralytics/runs/detect/visdrone_yolo8n3
baseline_enot_nano/results.csv ADDED
@@ -0,0 +1,101 @@
+ epoch, train/box_loss, train/cls_loss, train/dfl_loss, metrics/precision(B), metrics/recall(B), metrics/mAP50(B), metrics/mAP50-95(B), val/box_loss, val/cls_loss, val/dfl_loss, lr/pg0, lr/pg1, lr/pg2
+ 1, 1.6934, 2.472, 1.0803, 0.23524, 0.20151, 0.16481, 0.09289, 1.5442, 1.5881, 1.0413, 0.003321, 0.003321, 0.003321
+ 2, 1.6017, 1.5954, 1.0396, 0.29739, 0.25793, 0.21579, 0.12219, 1.5424, 1.4103, 1.0281, 0.0065884, 0.0065884, 0.0065884
+ 3, 1.6166, 1.5239, 1.0355, 0.30941, 0.24628, 0.2084, 0.12128, 1.5334, 1.4964, 1.0266, 0.0097899, 0.0097899, 0.0097899
+ 4, 1.5805, 1.4284, 1.0264, 0.33678, 0.28147, 0.25399, 0.14605, 1.4985, 1.3542, 1.0122, 0.009703, 0.009703, 0.009703
+ 5, 1.5438, 1.3529, 1.0164, 0.38723, 0.30553, 0.28635, 0.16757, 1.4444, 1.2631, 0.99917, 0.009703, 0.009703, 0.009703
+ 6, 1.5255, 1.3038, 1.0062, 0.38994, 0.30391, 0.2869, 0.16601, 1.4684, 1.2042, 1.0028, 0.009604, 0.009604, 0.009604
+ 7, 1.4981, 1.2582, 1.0017, 0.38166, 0.31095, 0.29142, 0.1674, 1.4575, 1.2308, 0.99168, 0.009505, 0.009505, 0.009505
+ 8, 1.4886, 1.2361, 0.99686, 0.40561, 0.31099, 0.30738, 0.1763, 1.4597, 1.1675, 1.0066, 0.009406, 0.009406, 0.009406
+ 9, 1.4718, 1.2005, 0.99097, 0.40813, 0.33573, 0.31943, 0.18682, 1.4184, 1.1469, 0.98866, 0.009307, 0.009307, 0.009307
+ 10, 1.4573, 1.1785, 0.98966, 0.4331, 0.33951, 0.32962, 0.19326, 1.4166, 1.1333, 0.98632, 0.009208, 0.009208, 0.009208
+ 11, 1.4572, 1.1596, 0.98533, 0.42418, 0.34973, 0.33818, 0.20012, 1.3922, 1.1125, 0.97946, 0.009109, 0.009109, 0.009109
+ 12, 1.4343, 1.1403, 0.98299, 0.42805, 0.3275, 0.32522, 0.19147, 1.4144, 1.1338, 0.98435, 0.00901, 0.00901, 0.00901
+ 13, 1.421, 1.1244, 0.97903, 0.45541, 0.34981, 0.35056, 0.20731, 1.3827, 1.0892, 0.97441, 0.008911, 0.008911, 0.008911
+ 14, 1.413, 1.1132, 0.97583, 0.45048, 0.35108, 0.34653, 0.20537, 1.3923, 1.1236, 0.97987, 0.008812, 0.008812, 0.008812
+ 15, 1.4162, 1.099, 0.97538, 0.44417, 0.35488, 0.34893, 0.20504, 1.3833, 1.0967, 0.97254, 0.008713, 0.008713, 0.008713
+ 16, 1.4056, 1.0873, 0.9723, 0.43312, 0.35324, 0.34414, 0.20211, 1.381, 1.1007, 0.97551, 0.008614, 0.008614, 0.008614
+ 17, 1.404, 1.0826, 0.97187, 0.45107, 0.36262, 0.36241, 0.21531, 1.3629, 1.0525, 0.97051, 0.008515, 0.008515, 0.008515
+ 18, 1.3994, 1.0709, 0.96924, 0.47419, 0.35807, 0.3618, 0.21664, 1.3667, 1.051, 0.96669, 0.008416, 0.008416, 0.008416
+ 19, 1.3783, 1.0558, 0.96377, 0.47609, 0.36359, 0.37201, 0.2217, 1.3585, 1.0419, 0.96729, 0.008317, 0.008317, 0.008317
+ 20, 1.3876, 1.0578, 0.96679, 0.47182, 0.36737, 0.37309, 0.22151, 1.3505, 1.0483, 0.96551, 0.008218, 0.008218, 0.008218
+ 21, 1.3812, 1.0484, 0.96622, 0.49549, 0.35966, 0.37282, 0.22213, 1.3566, 1.0355, 0.96439, 0.008119, 0.008119, 0.008119
+ 22, 1.3824, 1.0459, 0.9647, 0.4654, 0.36769, 0.36882, 0.21904, 1.3472, 1.041, 0.96722, 0.00802, 0.00802, 0.00802
+ 23, 1.3729, 1.0393, 0.96266, 0.47413, 0.36231, 0.36997, 0.21775, 1.3468, 1.0462, 0.96527, 0.007921, 0.007921, 0.007921
+ 24, 1.3696, 1.0293, 0.96254, 0.47001, 0.37518, 0.37891, 0.22624, 1.3455, 1.0217, 0.9628, 0.007822, 0.007822, 0.007822
+ 25, 1.3661, 1.0267, 0.96017, 0.4709, 0.38303, 0.37955, 0.22604, 1.3387, 1.024, 0.9617, 0.007723, 0.007723, 0.007723
+ 26, 1.3647, 1.0194, 0.95958, 0.47739, 0.36964, 0.37543, 0.22343, 1.3493, 1.03, 0.96336, 0.007624, 0.007624, 0.007624
+ 27, 1.3662, 1.0204, 0.95991, 0.48142, 0.37987, 0.38374, 0.22962, 1.3459, 1.0124, 0.96175, 0.007525, 0.007525, 0.007525
+ 28, 1.3566, 1.014, 0.95799, 0.47767, 0.3743, 0.37891, 0.22698, 1.3478, 1.0222, 0.964, 0.007426, 0.007426, 0.007426
+ 29, 1.3569, 1.0072, 0.95685, 0.47626, 0.37577, 0.38169, 0.22823, 1.3312, 1.0118, 0.95949, 0.007327, 0.007327, 0.007327
+ 30, 1.3452, 0.99848, 0.95519, 0.47875, 0.38471, 0.38701, 0.23072, 1.3348, 1.0057, 0.95889, 0.007228, 0.007228, 0.007228
+ 31, 1.3496, 0.99919, 0.95501, 0.49245, 0.38097, 0.38964, 0.23284, 1.3239, 1.0006, 0.95758, 0.007129, 0.007129, 0.007129
+ 32, 1.3512, 0.99555, 0.95529, 0.49624, 0.37414, 0.38955, 0.23377, 1.3313, 1.0078, 0.95825, 0.00703, 0.00703, 0.00703
+ 33, 1.3434, 0.99331, 0.95197, 0.50842, 0.38201, 0.39617, 0.23803, 1.3195, 0.99128, 0.95615, 0.006931, 0.006931, 0.006931
+ 34, 1.3418, 0.98443, 0.95227, 0.49478, 0.37767, 0.39181, 0.23542, 1.3264, 0.99716, 0.95648, 0.006832, 0.006832, 0.006832
+ 35, 1.3362, 0.97955, 0.94955, 0.51206, 0.38222, 0.39266, 0.23527, 1.3183, 0.99978, 0.95455, 0.006733, 0.006733, 0.006733
+ 36, 1.3393, 0.98062, 0.95078, 0.50223, 0.38141, 0.39813, 0.23813, 1.3201, 0.99005, 0.95351, 0.006634, 0.006634, 0.006634
+ 37, 1.3311, 0.97467, 0.94698, 0.4896, 0.38603, 0.391, 0.23577, 1.3175, 0.99057, 0.9514, 0.006535, 0.006535, 0.006535
+ 38, 1.331, 0.9698, 0.95052, 0.48774, 0.38612, 0.39492, 0.23663, 1.32, 0.98695, 0.95179, 0.006436, 0.006436, 0.006436
+ 39, 1.3237, 0.96721, 0.94737, 0.5001, 0.39278, 0.40068, 0.23888, 1.3193, 0.97985, 0.95302, 0.006337, 0.006337, 0.006337
+ 40, 1.3296, 0.96768, 0.95023, 0.48715, 0.38829, 0.39202, 0.23466, 1.3223, 0.99242, 0.95625, 0.006238, 0.006238, 0.006238
+ 41, 1.33, 0.96486, 0.94664, 0.50471, 0.38885, 0.40148, 0.24053, 1.3134, 0.97642, 0.9515, 0.006139, 0.006139, 0.006139
+ 42, 1.3199, 0.96394, 0.94742, 0.50974, 0.39908, 0.40679, 0.24509, 1.3084, 0.97473, 0.95001, 0.00604, 0.00604, 0.00604
+ 43, 1.3231, 0.95704, 0.94585, 0.50233, 0.39406, 0.40713, 0.24407, 1.3107, 0.96864, 0.95105, 0.005941, 0.005941, 0.005941
+ 44, 1.3154, 0.95323, 0.94607, 0.50304, 0.38648, 0.40453, 0.24416, 1.3095, 0.9722, 0.9496, 0.005842, 0.005842, 0.005842
+ 45, 1.3167, 0.94792, 0.94454, 0.51143, 0.38909, 0.40814, 0.24465, 1.3085, 0.96942, 0.94874, 0.005743, 0.005743, 0.005743
+ 46, 1.312, 0.94283, 0.94318, 0.49897, 0.3898, 0.4052, 0.24251, 1.3111, 0.97276, 0.94981, 0.005644, 0.005644, 0.005644
+ 47, 1.3137, 0.94265, 0.94476, 0.50303, 0.3881, 0.40462, 0.24423, 1.3082, 0.9687, 0.94894, 0.005545, 0.005545, 0.005545
+ 48, 1.3104, 0.94555, 0.94187, 0.50962, 0.39082, 0.40583, 0.24309, 1.309, 0.96935, 0.94939, 0.005446, 0.005446, 0.005446
+ 49, 1.3045, 0.9395, 0.94341, 0.51066, 0.39392, 0.40896, 0.24628, 1.3035, 0.96404, 0.94843, 0.005347, 0.005347, 0.005347
+ 50, 1.3079, 0.93718, 0.94052, 0.50623, 0.39326, 0.40534, 0.24366, 1.3085, 0.96758, 0.94827, 0.005248, 0.005248, 0.005248
+ 51, 1.3047, 0.93511, 0.94142, 0.51609, 0.39408, 0.40998, 0.24632, 1.3022, 0.9635, 0.9468, 0.005149, 0.005149, 0.005149
+ 52, 1.3012, 0.93388, 0.93927, 0.51285, 0.39642, 0.41051, 0.24772, 1.3015, 0.95928, 0.94732, 0.00505, 0.00505, 0.00505
+ 53, 1.3038, 0.93336, 0.93954, 0.51278, 0.39442, 0.41239, 0.24745, 1.3009, 0.95839, 0.94645, 0.004951, 0.004951, 0.004951
+ 54, 1.3017, 0.92787, 0.93993, 0.52145, 0.39579, 0.4143, 0.24993, 1.2966, 0.95549, 0.94535, 0.004852, 0.004852, 0.004852
+ 55, 1.2909, 0.9199, 0.93755, 0.51031, 0.40392, 0.41496, 0.25042, 1.296, 0.95398, 0.94408, 0.004753, 0.004753, 0.004753
+ 56, 1.296, 0.92414, 0.93917, 0.52754, 0.38987, 0.41034, 0.24775, 1.2954, 0.9526, 0.94539, 0.004654, 0.004654, 0.004654
+ 57, 1.2963, 0.92311, 0.93792, 0.51084, 0.40572, 0.41797, 0.25227, 1.2979, 0.95285, 0.94475, 0.004555, 0.004555, 0.004555
+ 58, 1.2859, 0.91641, 0.93637, 0.51995, 0.39894, 0.41505, 0.25025, 1.2965, 0.95537, 0.94488, 0.004456, 0.004456, 0.004456
+ 59, 1.3009, 0.91959, 0.93843, 0.51927, 0.40176, 0.41642, 0.25014, 1.2943, 0.94998, 0.94441, 0.004357, 0.004357, 0.004357
+ 60, 1.2912, 0.90963, 0.93559, 0.51199, 0.40438, 0.41781, 0.25151, 1.2935, 0.94673, 0.94312, 0.004258, 0.004258, 0.004258
+ 61, 1.2901, 0.91301, 0.93742, 0.5207, 0.40143, 0.41434, 0.25013, 1.2937, 0.95152, 0.94386, 0.004159, 0.004159, 0.004159
+ 62, 1.2873, 0.90895, 0.93599, 0.53292, 0.39336, 0.4169, 0.25098, 1.2889, 0.94802, 0.94338, 0.00406, 0.00406, 0.00406
+ 63, 1.2811, 0.90317, 0.93468, 0.51231, 0.40429, 0.41878, 0.25186, 1.2898, 0.94431, 0.94325, 0.003961, 0.003961, 0.003961
+ 64, 1.2787, 0.90131, 0.9338, 0.52692, 0.39766, 0.41675, 0.25255, 1.2907, 0.94447, 0.94319, 0.003862, 0.003862, 0.003862
+ 65, 1.2825, 0.90257, 0.93511, 0.52339, 0.40124, 0.41801, 0.25276, 1.2893, 0.94235, 0.94228, 0.003763, 0.003763, 0.003763
+ 66, 1.2798, 0.89969, 0.9344, 0.51669, 0.40931, 0.41947, 0.25356, 1.2875, 0.94269, 0.94313, 0.003664, 0.003664, 0.003664
+ 67, 1.2768, 0.89639, 0.93278, 0.5323, 0.3967, 0.42179, 0.25477, 1.2916, 0.94062, 0.94142, 0.003565, 0.003565, 0.003565
+ 68, 1.2737, 0.89508, 0.93186, 0.52227, 0.40182, 0.42038, 0.25423, 1.2836, 0.93872, 0.9412, 0.003466, 0.003466, 0.003466
+ 69, 1.2686, 0.89114, 0.93152, 0.52451, 0.40771, 0.42511, 0.25583, 1.289, 0.93979, 0.94211, 0.003367, 0.003367, 0.003367
+ 70, 1.28, 0.8922, 0.9328, 0.52413, 0.40623, 0.42376, 0.2556, 1.2862, 0.93714, 0.94083, 0.003268, 0.003268, 0.003268
+ 71, 1.2701, 0.88834, 0.92995, 0.52833, 0.40722, 0.42534, 0.25696, 1.2831, 0.93574, 0.94049, 0.003169, 0.003169, 0.003169
+ 72, 1.2667, 0.8856, 0.93114, 0.52991, 0.39893, 0.42236, 0.2555, 1.2826, 0.9356, 0.93997, 0.00307, 0.00307, 0.00307
+ 73, 1.2613, 0.88011, 0.92823, 0.52585, 0.40676, 0.42475, 0.2573, 1.2847, 0.93445, 0.94019, 0.002971, 0.002971, 0.002971
+ 74, 1.2749, 0.88194, 0.92972, 0.52767, 0.4048, 0.42465, 0.25678, 1.2847, 0.93658, 0.94112, 0.002872, 0.002872, 0.002872
+ 75, 1.2618, 0.87515, 0.92845, 0.53044, 0.40551, 0.42636, 0.25832, 1.2844, 0.93284, 0.94024, 0.002773, 0.002773, 0.002773
+ 76, 1.2684, 0.87796, 0.93023, 0.52757, 0.40553, 0.42466, 0.25698, 1.2854, 0.93335, 0.94087, 0.002674, 0.002674, 0.002674
+ 77, 1.261, 0.87356, 0.92868, 0.53493, 0.40322, 0.42506, 0.25618, 1.2833, 0.93398, 0.93996, 0.002575, 0.002575, 0.002575
+ 78, 1.2556, 0.87052, 0.92655, 0.5195, 0.41061, 0.42464, 0.25655, 1.2839, 0.93463, 0.93991, 0.002476, 0.002476, 0.002476
+ 79, 1.2538, 0.86577, 0.927, 0.52174, 0.41024, 0.42553, 0.25697, 1.2845, 0.93213, 0.9407, 0.002377, 0.002377, 0.002377
+ 80, 1.2585, 0.87089, 0.92644, 0.52637, 0.41025, 0.4252, 0.25568, 1.2848, 0.93241, 0.94026, 0.002278, 0.002278, 0.002278
+ 81, 1.2564, 0.86696, 0.92564, 0.52541, 0.40858, 0.42427, 0.25698, 1.2824, 0.93237, 0.93989, 0.002179, 0.002179, 0.002179
+ 82, 1.2562, 0.86331, 0.92584, 0.51848, 0.41101, 0.42624, 0.25762, 1.2806, 0.93097, 0.9396, 0.00208, 0.00208, 0.00208
+ 83, 1.2584, 0.86428, 0.92599, 0.52475, 0.40854, 0.42597, 0.25747, 1.2821, 0.92921, 0.9392, 0.001981, 0.001981, 0.001981
+ 84, 1.2496, 0.86251, 0.92584, 0.5225, 0.41514, 0.42694, 0.25789, 1.2805, 0.92955, 0.93864, 0.001882, 0.001882, 0.001882
+ 85, 1.2503, 0.85983, 0.92544, 0.522, 0.40978, 0.4248, 0.25683, 1.2826, 0.93052, 0.93946, 0.001783, 0.001783, 0.001783
+ 86, 1.2498, 0.85562, 0.92505, 0.53202, 0.41038, 0.42747, 0.25765, 1.2824, 0.93142, 0.93955, 0.001684, 0.001684, 0.001684
+ 87, 1.2488, 0.8547, 0.92353, 0.53688, 0.4097, 0.42751, 0.2585, 1.2787, 0.92975, 0.93838, 0.001585, 0.001585, 0.001585
+ 88, 1.2489, 0.85201, 0.92492, 0.53644, 0.40624, 0.42814, 0.25877, 1.2803, 0.92845, 0.93867, 0.001486, 0.001486, 0.001486
+ 89, 1.2503, 0.84897, 0.92344, 0.52847, 0.41324, 0.42877, 0.25874, 1.2828, 0.92853, 0.93889, 0.001387, 0.001387, 0.001387
+ 90, 1.2431, 0.84602, 0.92203, 0.52159, 0.41401, 0.42805, 0.25881, 1.2811, 0.9293, 0.93882, 0.001288, 0.001288, 0.001288
+ 91, 1.2415, 0.84493, 0.92927, 0.53263, 0.40837, 0.42648, 0.25711, 1.2851, 0.93165, 0.9395, 0.001189, 0.001189, 0.001189
+ 92, 1.2253, 0.82405, 0.92555, 0.53132, 0.41026, 0.4264, 0.25669, 1.2851, 0.93076, 0.93944, 0.00109, 0.00109, 0.00109
+ 93, 1.2245, 0.82146, 0.9267, 0.52459, 0.41251, 0.42667, 0.25688, 1.2855, 0.93219, 0.9399, 0.000991, 0.000991, 0.000991
95
+ 94, 1.2183, 0.81667, 0.92153, 0.52849, 0.41439, 0.42654, 0.25687, 1.2846, 0.93148, 0.93957, 0.000892, 0.000892, 0.000892
96
+ 95, 1.2182, 0.8149, 0.92263, 0.52596, 0.41286, 0.42613, 0.25653, 1.2864, 0.93065, 0.94003, 0.000793, 0.000793, 0.000793
97
+ 96, 1.2132, 0.80814, 0.92407, 0.52715, 0.41446, 0.4267, 0.25692, 1.2863, 0.93034, 0.94001, 0.000694, 0.000694, 0.000694
98
+ 97, 1.2119, 0.80631, 0.92225, 0.5316, 0.41309, 0.42671, 0.25668, 1.2862, 0.93054, 0.9401, 0.000595, 0.000595, 0.000595
99
+ 98, 1.2131, 0.80462, 0.92207, 0.53035, 0.41298, 0.42679, 0.25689, 1.2856, 0.9302, 0.93994, 0.000496, 0.000496, 0.000496
100
+ 99, 1.2061, 0.80084, 0.91956, 0.52853, 0.41341, 0.42784, 0.25723, 1.285, 0.92875, 0.93953, 0.000397, 0.000397, 0.000397
101
+ 100, 1.2059, 0.79894, 0.92016, 0.5324, 0.41368, 0.42802, 0.25751, 1.2844, 0.92861, 0.93933, 0.000298, 0.000298, 0.000298
baseline_enot_nano/weights/best.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3425c613380439d7793eae2ab356b3a7bb5c1b418e6ef6606059e4fa18864146
+ size 6267993
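The `best.pt` and `last.pt` entries in this commit are Git LFS pointer files, not the binary weights themselves; each pointer is three plain-text `key value` lines (`version`, `oid`, `size`), per the git-lfs spec URL embedded in the file. A minimal sketch of reading such a pointer with only the Python standard library:

```python
# Parse a Git LFS pointer file into a dict of its key/value fields.
# Pointer files are plain text lines of the form "key value", e.g.
# "version <spec-url>", "oid sha256:<hex digest>", "size <bytes>".
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:3425c613380439d7793eae2ab356b3a7bb5c1b418e6ef6606059e4fa18864146
size 6267993
"""
info = parse_lfs_pointer(pointer)
print(info["oid"])         # sha256:3425c6...
print(int(info["size"]))   # 6267993 (checkpoint size in bytes, ~6 MB)
```

The `size` field is what lets you sanity-check a fetched object before trusting it; the `oid` digest is verified by git-lfs itself on checkout.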
baseline_enot_nano/weights/last.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:04c45bf6b1ff1b3a3c10fa45145c305edda9d05105a7920654565efb9e0e4cd6
+ size 6269337
baseline_ultralytics/args.yaml ADDED
@@ -0,0 +1,97 @@
+ task: detect
+ mode: train
+ model: yolov8s.pt
+ data: VisDrone.yaml
+ epochs: 100
+ patience: 50
+ batch: 16
+ imgsz: 640
+ save: true
+ save_period: -1
+ cache: false
+ device: 3
+ workers: 8
+ project: null
+ name: train
+ exist_ok: false
+ pretrained: true
+ optimizer: auto
+ verbose: true
+ seed: 0
+ deterministic: true
+ single_cls: false
+ rect: false
+ cos_lr: false
+ close_mosaic: 10
+ resume: false
+ amp: true
+ fraction: 1.0
+ profile: false
+ freeze: null
+ overlap_mask: true
+ mask_ratio: 4
+ dropout: 0.0
+ val: true
+ split: val
+ save_json: false
+ save_hybrid: false
+ conf: null
+ iou: 0.7
+ max_det: 300
+ half: false
+ dnn: false
+ plots: true
+ source: null
+ show: false
+ save_txt: false
+ save_conf: false
+ save_crop: false
+ show_labels: true
+ show_conf: true
+ vid_stride: 1
+ stream_buffer: false
+ line_width: null
+ visualize: false
+ augment: false
+ agnostic_nms: false
+ classes: null
+ retina_masks: false
+ boxes: true
+ format: torchscript
+ keras: false
+ optimize: false
+ int8: false
+ dynamic: false
+ simplify: false
+ opset: null
+ workspace: 4
+ nms: false
+ lr0: 0.01
+ lrf: 0.01
+ momentum: 0.937
+ weight_decay: 0.0005
+ warmup_epochs: 3.0
+ warmup_momentum: 0.8
+ warmup_bias_lr: 0.1
+ box: 7.5
+ cls: 0.5
+ dfl: 1.5
+ pose: 12.0
+ kobj: 1.0
+ label_smoothing: 0.0
+ nbs: 64
+ hsv_h: 0.015
+ hsv_s: 0.7
+ hsv_v: 0.4
+ degrees: 0.0
+ translate: 0.1
+ scale: 0.5
+ shear: 0.0
+ perspective: 0.0
+ flipud: 0.0
+ fliplr: 0.5
+ mosaic: 1.0
+ mixup: 0.0
+ copy_paste: 0.0
+ cfg: null
+ tracker: botsort.yaml
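This `args.yaml` is a flat `key: value` mapping (no nesting), so for quick inspection it can be loaded without PyYAML. A minimal stdlib-only sketch with naive type coercion (for anything beyond this flat layout, a real YAML parser is the right tool):

```python
# Minimal loader for a flat "key: value" args.yaml like the one above.
# Coerces booleans, null, ints, and floats; everything else stays a string.
def load_flat_yaml(text: str) -> dict:
    def coerce(value: str):
        if value in ("true", "false"):
            return value == "true"
        if value == "null":
            return None
        for cast in (int, float):
            try:
                return cast(value)
            except ValueError:
                pass
        return value

    args = {}
    for line in text.splitlines():
        if ":" not in line:
            continue  # skip blank/malformed lines
        key, _, value = line.partition(":")
        args[key.strip()] = coerce(value.strip())
    return args

sample = "task: detect\nepochs: 100\nlr0: 0.01\namp: true\nfreeze: null\n"
args = load_flat_yaml(sample)
print(args)
# {'task': 'detect', 'epochs': 100, 'lr0': 0.01, 'amp': True, 'freeze': None}
```

Coercion order matters: `int` is tried before `float`, so `100` stays an integer while `0.01` falls through to `float`.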
baseline_ultralytics/events.out.tfevents.1699532367.cube06.2218493.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bb7ef63ed6ed803aa94d4105acff423974464c4f5364a85db4a32863830bab0c
+ size 6669068
baseline_ultralytics/results.csv ADDED
@@ -0,0 +1,101 @@
+ epoch, train/box_loss, train/cls_loss, train/dfl_loss, metrics/precision(B), metrics/recall(B), metrics/mAP50(B), metrics/mAP50-95(B), val/box_loss, val/cls_loss, val/dfl_loss, lr/pg0, lr/pg1, lr/pg2
+ 1, 1.5936, 1.8213, 1.0182, 0.30336, 0.25163, 0.21728, 0.12287, 1.4617, 1.2026, 0.95059, 0.0033251, 0.0033251, 0.0033251
+ 2, 1.5165, 1.2486, 0.96234, 0.35778, 0.27839, 0.25422, 0.14308, 1.4441, 1.1761, 0.93816, 0.0065925, 0.0065925, 0.0065925
+ 3, 1.5513, 1.2337, 0.96377, 0.35625, 0.25216, 0.24359, 0.13823, 1.5041, 1.1983, 0.95593, 0.0097939, 0.0097939, 0.0097939
+ 4, 1.5446, 1.2052, 0.96191, 0.34926, 0.28707, 0.26627, 0.15117, 1.4589, 1.1356, 0.94359, 0.009703, 0.009703, 0.009703
+ 5, 1.5082, 1.1559, 0.95427, 0.39794, 0.29841, 0.2906, 0.16631, 1.4154, 1.0764, 0.93088, 0.009703, 0.009703, 0.009703
+ 6, 1.4849, 1.1217, 0.94558, 0.39115, 0.3076, 0.29454, 0.16805, 1.4009, 1.0597, 0.92859, 0.009604, 0.009604, 0.009604
+ 7, 1.4567, 1.094, 0.94065, 0.40494, 0.31312, 0.30537, 0.17511, 1.3905, 1.067, 0.92626, 0.009505, 0.009505, 0.009505
+ 8, 1.4454, 1.0716, 0.93646, 0.40981, 0.32244, 0.30733, 0.17628, 1.3739, 1.0461, 0.92217, 0.009406, 0.009406, 0.009406
+ 9, 1.4191, 1.0552, 0.93112, 0.42545, 0.32138, 0.31421, 0.18121, 1.3629, 1.0273, 0.922, 0.009307, 0.009307, 0.009307
+ 10, 1.4072, 1.0381, 0.92896, 0.4219, 0.32271, 0.3171, 0.1862, 1.3501, 1.0258, 0.91758, 0.009208, 0.009208, 0.009208
+ 11, 1.4025, 1.0342, 0.92851, 0.44411, 0.32388, 0.33124, 0.19315, 1.3413, 1.0012, 0.91557, 0.009109, 0.009109, 0.009109
+ 12, 1.3777, 1.0087, 0.92505, 0.43341, 0.33966, 0.32851, 0.19129, 1.3315, 0.9935, 0.91135, 0.00901, 0.00901, 0.00901
+ 13, 1.3799, 1.0062, 0.9216, 0.43827, 0.34397, 0.33676, 0.19667, 1.316, 0.98626, 0.9076, 0.008911, 0.008911, 0.008911
+ 14, 1.3755, 0.99361, 0.92021, 0.46878, 0.3415, 0.34701, 0.20271, 1.3142, 0.97181, 0.90658, 0.008812, 0.008812, 0.008812
+ 15, 1.3588, 0.97987, 0.91764, 0.44602, 0.35624, 0.34548, 0.2011, 1.3205, 0.96449, 0.90987, 0.008713, 0.008713, 0.008713
+ 16, 1.3529, 0.9737, 0.91695, 0.46348, 0.33662, 0.346, 0.20311, 1.3179, 0.96969, 0.90898, 0.008614, 0.008614, 0.008614
+ 17, 1.3512, 0.96819, 0.91613, 0.46755, 0.34784, 0.35538, 0.20795, 1.3133, 0.95526, 0.90986, 0.008515, 0.008515, 0.008515
+ 18, 1.347, 0.96282, 0.91252, 0.45352, 0.35485, 0.34849, 0.20232, 1.3069, 0.9714, 0.90703, 0.008416, 0.008416, 0.008416
+ 19, 1.3373, 0.94643, 0.91126, 0.4669, 0.3647, 0.36139, 0.212, 1.292, 0.94817, 0.90151, 0.008317, 0.008317, 0.008317
+ 20, 1.3256, 0.93476, 0.91039, 0.45684, 0.35722, 0.35681, 0.20974, 1.2898, 0.93569, 0.90126, 0.008218, 0.008218, 0.008218
+ 21, 1.3233, 0.93461, 0.90822, 0.46945, 0.35544, 0.35596, 0.20956, 1.2895, 0.94876, 0.90125, 0.008119, 0.008119, 0.008119
+ 22, 1.3219, 0.93082, 0.90791, 0.47259, 0.35288, 0.36, 0.2116, 1.2797, 0.94543, 0.90085, 0.00802, 0.00802, 0.00802
+ 23, 1.3097, 0.92199, 0.90524, 0.47323, 0.35778, 0.36695, 0.21399, 1.2779, 0.93151, 0.89727, 0.007921, 0.007921, 0.007921
+ 24, 1.31, 0.91763, 0.9054, 0.47619, 0.35267, 0.36598, 0.21668, 1.2757, 0.9296, 0.89956, 0.007822, 0.007822, 0.007822
+ 25, 1.302, 0.9098, 0.90441, 0.47763, 0.36259, 0.3704, 0.22011, 1.2754, 0.92511, 0.89985, 0.007723, 0.007723, 0.007723
+ 26, 1.3031, 0.90665, 0.90449, 0.49645, 0.36373, 0.37567, 0.22291, 1.2708, 0.91865, 0.89669, 0.007624, 0.007624, 0.007624
+ 27, 1.2923, 0.90038, 0.90235, 0.47826, 0.36704, 0.37093, 0.22027, 1.2646, 0.92104, 0.89565, 0.007525, 0.007525, 0.007525
+ 28, 1.2914, 0.89611, 0.90122, 0.48825, 0.36696, 0.37415, 0.22047, 1.2628, 0.91325, 0.89687, 0.007426, 0.007426, 0.007426
+ 29, 1.2874, 0.89222, 0.90112, 0.49262, 0.36524, 0.37754, 0.22408, 1.2613, 0.90975, 0.89498, 0.007327, 0.007327, 0.007327
+ 30, 1.2825, 0.88289, 0.89834, 0.48669, 0.36225, 0.37575, 0.2229, 1.2627, 0.91265, 0.89543, 0.007228, 0.007228, 0.007228
+ 31, 1.2827, 0.88586, 0.89963, 0.48196, 0.36966, 0.37546, 0.22172, 1.2626, 0.91117, 0.8958, 0.007129, 0.007129, 0.007129
+ 32, 1.2803, 0.87591, 0.89712, 0.51065, 0.35771, 0.3751, 0.22193, 1.2662, 0.90668, 0.89509, 0.00703, 0.00703, 0.00703
+ 33, 1.2754, 0.87411, 0.89874, 0.48621, 0.36923, 0.38064, 0.22613, 1.2523, 0.89738, 0.89289, 0.006931, 0.006931, 0.006931
+ 34, 1.2734, 0.86658, 0.89615, 0.4957, 0.37732, 0.3903, 0.23182, 1.2502, 0.89124, 0.89304, 0.006832, 0.006832, 0.006832
+ 35, 1.2655, 0.86481, 0.89507, 0.49995, 0.36606, 0.38258, 0.22757, 1.2572, 0.89412, 0.89265, 0.006733, 0.006733, 0.006733
+ 36, 1.2681, 0.86064, 0.89518, 0.48378, 0.37337, 0.37981, 0.22354, 1.2537, 0.89626, 0.89408, 0.006634, 0.006634, 0.006634
+ 37, 1.2627, 0.85674, 0.89381, 0.5001, 0.36841, 0.37972, 0.2265, 1.2462, 0.89152, 0.89171, 0.006535, 0.006535, 0.006535
+ 38, 1.254, 0.84937, 0.89237, 0.49633, 0.3701, 0.38413, 0.2285, 1.2413, 0.89202, 0.89161, 0.006436, 0.006436, 0.006436
+ 39, 1.2525, 0.845, 0.89362, 0.49436, 0.37513, 0.38476, 0.2281, 1.2429, 0.89562, 0.89007, 0.006337, 0.006337, 0.006337
+ 40, 1.248, 0.84477, 0.89153, 0.49437, 0.36803, 0.37729, 0.22437, 1.2476, 0.89795, 0.89207, 0.006238, 0.006238, 0.006238
+ 41, 1.2547, 0.84371, 0.89234, 0.47519, 0.37309, 0.38071, 0.22779, 1.2459, 0.89165, 0.89075, 0.006139, 0.006139, 0.006139
+ 42, 1.2467, 0.83511, 0.89093, 0.49333, 0.37594, 0.38533, 0.22925, 1.2438, 0.89237, 0.89093, 0.00604, 0.00604, 0.00604
+ 43, 1.247, 0.83546, 0.89128, 0.50519, 0.37137, 0.3861, 0.22985, 1.2448, 0.88525, 0.89159, 0.005941, 0.005941, 0.005941
+ 44, 1.2486, 0.83625, 0.88902, 0.50843, 0.37012, 0.38592, 0.22983, 1.2427, 0.87931, 0.89198, 0.005842, 0.005842, 0.005842
+ 45, 1.2454, 0.8284, 0.89077, 0.49731, 0.38196, 0.38893, 0.23111, 1.2387, 0.88348, 0.8903, 0.005743, 0.005743, 0.005743
+ 46, 1.2356, 0.82072, 0.88798, 0.49271, 0.37828, 0.38945, 0.23162, 1.2364, 0.87997, 0.88991, 0.005644, 0.005644, 0.005644
+ 47, 1.2382, 0.82135, 0.88929, 0.50371, 0.37816, 0.38997, 0.23069, 1.2445, 0.88092, 0.89117, 0.005545, 0.005545, 0.005545
+ 48, 1.2288, 0.81556, 0.88905, 0.4966, 0.3867, 0.39149, 0.23302, 1.2359, 0.87815, 0.88934, 0.005446, 0.005446, 0.005446
+ 49, 1.2237, 0.81015, 0.88644, 0.51066, 0.37594, 0.3912, 0.23294, 1.2395, 0.88144, 0.8907, 0.005347, 0.005347, 0.005347
+ 50, 1.2232, 0.80759, 0.88554, 0.50024, 0.38717, 0.39465, 0.23401, 1.2382, 0.87474, 0.89188, 0.005248, 0.005248, 0.005248
+ 51, 1.2215, 0.80508, 0.88451, 0.51068, 0.38422, 0.39719, 0.23539, 1.2365, 0.87223, 0.89077, 0.005149, 0.005149, 0.005149
+ 52, 1.2171, 0.80031, 0.88441, 0.51111, 0.38691, 0.39873, 0.23651, 1.2335, 0.86969, 0.88959, 0.00505, 0.00505, 0.00505
+ 53, 1.2206, 0.80124, 0.88461, 0.50544, 0.38099, 0.39613, 0.23503, 1.2341, 0.87517, 0.88931, 0.004951, 0.004951, 0.004951
+ 54, 1.216, 0.79098, 0.88377, 0.50847, 0.38969, 0.40234, 0.23971, 1.2271, 0.86519, 0.88834, 0.004852, 0.004852, 0.004852
+ 55, 1.208, 0.79265, 0.88268, 0.5098, 0.3844, 0.39731, 0.23635, 1.2338, 0.86919, 0.88894, 0.004753, 0.004753, 0.004753
+ 56, 1.203, 0.78591, 0.88251, 0.50106, 0.38894, 0.39674, 0.23546, 1.228, 0.86392, 0.88784, 0.004654, 0.004654, 0.004654
+ 57, 1.2084, 0.78655, 0.88233, 0.49897, 0.39224, 0.39793, 0.23653, 1.2313, 0.86715, 0.88842, 0.004555, 0.004555, 0.004555
+ 58, 1.2023, 0.78399, 0.88208, 0.51158, 0.38627, 0.39682, 0.23618, 1.2246, 0.86519, 0.88721, 0.004456, 0.004456, 0.004456
+ 59, 1.205, 0.78119, 0.88078, 0.51391, 0.38203, 0.39792, 0.23663, 1.2285, 0.86518, 0.88833, 0.004357, 0.004357, 0.004357
+ 60, 1.2025, 0.77908, 0.88224, 0.51085, 0.39076, 0.40037, 0.2386, 1.223, 0.86545, 0.88592, 0.004258, 0.004258, 0.004258
+ 61, 1.1936, 0.77197, 0.8804, 0.50268, 0.38627, 0.39873, 0.23701, 1.2225, 0.86345, 0.88534, 0.004159, 0.004159, 0.004159
+ 62, 1.1935, 0.77227, 0.87918, 0.51639, 0.38189, 0.40047, 0.23854, 1.2207, 0.85939, 0.88621, 0.00406, 0.00406, 0.00406
+ 63, 1.1946, 0.76973, 0.87952, 0.51622, 0.38689, 0.39924, 0.23687, 1.2176, 0.86104, 0.88473, 0.003961, 0.003961, 0.003961
+ 64, 1.1906, 0.76618, 0.87773, 0.51862, 0.37964, 0.39873, 0.23653, 1.2246, 0.86059, 0.88688, 0.003862, 0.003862, 0.003862
+ 65, 1.1917, 0.76313, 0.879, 0.51803, 0.38336, 0.40111, 0.23774, 1.2219, 0.85875, 0.88676, 0.003763, 0.003763, 0.003763
+ 66, 1.187, 0.76154, 0.87847, 0.50422, 0.39112, 0.4006, 0.23849, 1.2202, 0.85744, 0.88564, 0.003664, 0.003664, 0.003664
+ 67, 1.1794, 0.75096, 0.87655, 0.51537, 0.38399, 0.3993, 0.23871, 1.2199, 0.85696, 0.88599, 0.003565, 0.003565, 0.003565
+ 68, 1.1857, 0.75576, 0.87639, 0.51276, 0.38839, 0.40123, 0.23934, 1.2232, 0.85755, 0.88598, 0.003466, 0.003466, 0.003466
+ 69, 1.1772, 0.75177, 0.87699, 0.51812, 0.3856, 0.40195, 0.24054, 1.2216, 0.85827, 0.88628, 0.003367, 0.003367, 0.003367
+ 70, 1.1792, 0.74848, 0.87472, 0.51274, 0.38814, 0.40048, 0.23878, 1.2214, 0.85824, 0.88516, 0.003268, 0.003268, 0.003268
+ 71, 1.1799, 0.74625, 0.87561, 0.51376, 0.3885, 0.40025, 0.23922, 1.2189, 0.85921, 0.88536, 0.003169, 0.003169, 0.003169
+ 72, 1.1798, 0.74734, 0.87457, 0.51511, 0.38845, 0.39971, 0.23852, 1.2199, 0.86103, 0.88572, 0.00307, 0.00307, 0.00307
+ 73, 1.1656, 0.73551, 0.87461, 0.50734, 0.39052, 0.40182, 0.23926, 1.2232, 0.86131, 0.88689, 0.002971, 0.002971, 0.002971
+ 74, 1.1698, 0.7378, 0.87431, 0.51825, 0.38137, 0.39946, 0.23843, 1.2204, 0.85972, 0.88595, 0.002872, 0.002872, 0.002872
+ 75, 1.1634, 0.73256, 0.87255, 0.51452, 0.38985, 0.40091, 0.23908, 1.2183, 0.86091, 0.88502, 0.002773, 0.002773, 0.002773
+ 76, 1.159, 0.72591, 0.87369, 0.51959, 0.38394, 0.40343, 0.23964, 1.2208, 0.85895, 0.8857, 0.002674, 0.002674, 0.002674
+ 77, 1.1655, 0.72958, 0.87216, 0.51417, 0.38711, 0.40302, 0.23928, 1.2207, 0.85959, 0.88541, 0.002575, 0.002575, 0.002575
+ 78, 1.1569, 0.72366, 0.87175, 0.51574, 0.38768, 0.40187, 0.23963, 1.2205, 0.85947, 0.8854, 0.002476, 0.002476, 0.002476
+ 79, 1.1549, 0.72117, 0.87061, 0.5177, 0.38678, 0.40255, 0.24078, 1.2204, 0.8596, 0.88612, 0.002377, 0.002377, 0.002377
+ 80, 1.1577, 0.72181, 0.87117, 0.52199, 0.38788, 0.40345, 0.24171, 1.2197, 0.85798, 0.88582, 0.002278, 0.002278, 0.002278
+ 81, 1.1493, 0.71369, 0.86971, 0.52224, 0.38876, 0.40187, 0.2396, 1.2214, 0.86153, 0.88634, 0.002179, 0.002179, 0.002179
+ 82, 1.152, 0.7144, 0.87059, 0.52203, 0.38548, 0.39991, 0.23867, 1.2225, 0.86203, 0.88646, 0.00208, 0.00208, 0.00208
+ 83, 1.1479, 0.71004, 0.87072, 0.51005, 0.38705, 0.40028, 0.23867, 1.2207, 0.86311, 0.88613, 0.001981, 0.001981, 0.001981
+ 84, 1.1491, 0.71052, 0.86754, 0.50337, 0.39045, 0.39862, 0.23833, 1.2181, 0.86138, 0.88492, 0.001882, 0.001882, 0.001882
+ 85, 1.1396, 0.70043, 0.86884, 0.51995, 0.38594, 0.40076, 0.23906, 1.2212, 0.86089, 0.88599, 0.001783, 0.001783, 0.001783
+ 86, 1.139, 0.69971, 0.8679, 0.51335, 0.39022, 0.40227, 0.24005, 1.2193, 0.85794, 0.88589, 0.001684, 0.001684, 0.001684
+ 87, 1.1422, 0.70121, 0.86781, 0.51594, 0.38567, 0.40039, 0.23896, 1.219, 0.85958, 0.88629, 0.001585, 0.001585, 0.001585
+ 88, 1.1355, 0.69725, 0.86708, 0.52107, 0.38292, 0.40098, 0.23948, 1.2176, 0.86298, 0.88594, 0.001486, 0.001486, 0.001486
+ 89, 1.1361, 0.69698, 0.86701, 0.51841, 0.38414, 0.40021, 0.23831, 1.2186, 0.8651, 0.88573, 0.001387, 0.001387, 0.001387
+ 90, 1.1353, 0.69589, 0.86686, 0.51288, 0.38631, 0.4007, 0.23858, 1.2191, 0.86406, 0.88606, 0.001288, 0.001288, 0.001288
+ 91, 1.1269, 0.67736, 0.86803, 0.51455, 0.38453, 0.398, 0.23674, 1.2221, 0.86638, 0.8863, 0.001189, 0.001189, 0.001189
+ 92, 1.1158, 0.66634, 0.86592, 0.52252, 0.37681, 0.39663, 0.23638, 1.2239, 0.86906, 0.88693, 0.00109, 0.00109, 0.00109
+ 93, 1.1085, 0.66061, 0.8653, 0.51217, 0.38356, 0.39678, 0.23573, 1.2243, 0.87044, 0.8869, 0.000991, 0.000991, 0.000991
+ 94, 1.1043, 0.65573, 0.86442, 0.51387, 0.38098, 0.39626, 0.23624, 1.2249, 0.87057, 0.88722, 0.000892, 0.000892, 0.000892
+ 95, 1.1017, 0.65219, 0.86339, 0.51264, 0.3836, 0.39604, 0.23662, 1.2259, 0.87029, 0.8874, 0.000793, 0.000793, 0.000793
+ 96, 1.0976, 0.64777, 0.86308, 0.50801, 0.38627, 0.39708, 0.23614, 1.2269, 0.87078, 0.88737, 0.000694, 0.000694, 0.000694
+ 97, 1.0959, 0.64559, 0.86285, 0.50753, 0.38615, 0.39596, 0.23558, 1.227, 0.87246, 0.88745, 0.000595, 0.000595, 0.000595
+ 98, 1.0939, 0.64292, 0.86135, 0.5063, 0.38638, 0.39669, 0.23557, 1.2273, 0.87191, 0.88743, 0.000496, 0.000496, 0.000496
+ 99, 1.0902, 0.63881, 0.86118, 0.50667, 0.38761, 0.39717, 0.23564, 1.227, 0.87195, 0.88718, 0.000397, 0.000397, 0.000397
+ 100, 1.0868, 0.63587, 0.86067, 0.51064, 0.38564, 0.39711, 0.23567, 1.2265, 0.8719, 0.88722, 0.000298, 0.000298, 0.000298
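These `results.csv` files write headers and values with a stray space after each comma (`epoch, train/box_loss, ...`), which trips up naive column lookups. A hedged sketch of finding the best epoch by mAP50-95 using only the standard library; `skipinitialspace=True` handles the padding:

```python
import csv
import io

# Find the epoch with the highest value of the given metric column
# in an Ultralytics-style results.csv. skipinitialspace=True strips
# the space that follows each comma in both headers and values.
def best_epoch(csv_text: str, metric: str = "metrics/mAP50-95(B)"):
    rows = list(csv.DictReader(io.StringIO(csv_text), skipinitialspace=True))
    best = max(rows, key=lambda row: float(row[metric]))
    return int(best["epoch"]), float(best[metric])

# Small sample with values taken from the table above.
sample = (
    "epoch, metrics/mAP50-95(B)\n"
    "1, 0.12287\n"
    "54, 0.23971\n"
    "80, 0.24171\n"
    "100, 0.23567\n"
)
print(best_epoch(sample))  # (80, 0.24171)
```

On the full baseline table this picks epoch 80 (mAP50-95 of 0.24171), which is consistent with `best.pt` being a different checkpoint than `last.pt`.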
baseline_ultralytics/weights/best.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5d9dbd2cae00deb19c4677cda44052afc578c135187b41d7496d81f0a6f7a08f
+ size 22494126
baseline_ultralytics/weights/last.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:11c8180539f8ae8864bb9948202a0249676555d7a1d7cbf695616b9dd17122e2
+ size 22496494
enot_neural_architecture_selection_x2/args.yaml ADDED
@@ -0,0 +1,95 @@
+ task: detect
+ mode: train
+ data: VisDrone.yaml
+ epochs: 100
+ patience: 50
+ batch: 16
+ imgsz: 928
+ save: true
+ save_period: -1
+ cache: false
+ device: 2
+ workers: 8
+ project: null
+ exist_ok: false
+ pretrained: true
+ optimizer: auto
+ verbose: true
+ seed: 0
+ deterministic: true
+ single_cls: false
+ rect: false
+ cos_lr: false
+ close_mosaic: 10
+ resume: false
+ amp: true
+ fraction: 1.0
+ profile: false
+ freeze: null
+ overlap_mask: true
+ mask_ratio: 4
+ dropout: 0.0
+ val: true
+ split: val
+ save_json: false
+ save_hybrid: false
+ conf: null
+ iou: 0.7
+ max_det: 300
+ half: false
+ dnn: false
+ plots: true
+ source: null
+ show: false
+ save_txt: false
+ save_conf: false
+ save_crop: false
+ show_labels: true
+ show_conf: true
+ vid_stride: 1
+ stream_buffer: false
+ line_width: null
+ visualize: false
+ augment: false
+ agnostic_nms: false
+ classes: null
+ retina_masks: false
+ boxes: true
+ format: torchscript
+ keras: false
+ optimize: false
+ int8: false
+ dynamic: false
+ simplify: false
+ opset: null
+ workspace: 4
+ nms: false
+ lr0: 0.00103
+ lrf: 0.00931
+ momentum: 0.98
+ weight_decay: 0.00049
+ warmup_epochs: 3.23548
+ warmup_momentum: 0.84894
+ warmup_bias_lr: 0.1
+ box: 12.67583
+ cls: 0.73787
+ dfl: 1.24249
+ pose: 12.0
+ kobj: 1.0
+ label_smoothing: 0.0
+ nbs: 64
+ hsv_h: 0.01014
+ hsv_s: 0.77242
+ hsv_v: 0.29403
+ degrees: 0
+ translate: 0.0845
+ scale: 0.72062
+ shear: 0
+ perspective: 0
+ flipud: 0
+ fliplr: 0.49184
+ mosaic: 0.8314
+ mixup: 0
+ copy_paste: 0
+ cfg: null
+ tracker: botsort.yaml
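This run's `args.yaml` differs from `baseline_ultralytics/args.yaml` in its searched hyperparameters (for example `imgsz` 928 vs 640, `lr0` 0.00103 vs 0.01, `box` 12.67583 vs 7.5, `momentum` 0.98 vs 0.937). A stdlib-only sketch for diffing two such flat `key: value` configs, so the tuned keys can be listed mechanically rather than by eye; values are compared as strings, which is enough for spotting differences:

```python
# Diff two flat "key: value" configs; report keys whose values differ
# as {key: (value_in_a, value_in_b)}. Missing keys show up as None.
def diff_flat_configs(a_text: str, b_text: str) -> dict:
    def parse(text: str) -> dict:
        pairs = {}
        for line in text.splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                pairs[key.strip()] = value.strip()
        return pairs

    a, b = parse(a_text), parse(b_text)
    return {
        key: (a.get(key), b.get(key))
        for key in sorted(set(a) | set(b))
        if a.get(key) != b.get(key)
    }

# Values taken from the two args.yaml files in this commit.
baseline = "imgsz: 640\nlr0: 0.01\nbox: 7.5\nmomentum: 0.937\n"
searched = "imgsz: 928\nlr0: 0.00103\nbox: 12.67583\nmomentum: 0.98\n"
print(diff_flat_configs(baseline, searched))
# {'box': ('7.5', '12.67583'), 'imgsz': ('640', '928'),
#  'lr0': ('0.01', '0.00103'), 'momentum': ('0.937', '0.98')}
```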
enot_neural_architecture_selection_x2/events.out.tfevents.1700031834.cube06.938038.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:866ac42da7c32c85ce6b5334ea4f8c71ff6c53c83d819ebb5cd6b1700ccffb25
+ size 6355712
enot_neural_architecture_selection_x2/results.csv ADDED
@@ -0,0 +1,101 @@
+ epoch, train/box_loss, train/cls_loss, train/dfl_loss, metrics/precision(B), metrics/recall(B), metrics/mAP50(B), metrics/mAP50-95(B), val/box_loss, val/cls_loss, val/dfl_loss, lr/pg0, lr/pg1, lr/pg2
+ 1, 1.7136, 1.4313, 1.078, 0.45448, 0.34049, 0.34307, 0.19976, 1.4255, 1.0911, 0.98739, 0.003084, 0.003084, 0.003084
+ 2, 1.3998, 1.0265, 0.96613, 0.46685, 0.36325, 0.36429, 0.21421, 1.3872, 1.0551, 0.97618, 0.0061144, 0.0061144, 0.0061144
+ 3, 1.3963, 1.0299, 0.96692, 0.466, 0.35362, 0.35012, 0.20517, 1.4179, 1.0886, 0.98309, 0.0090836, 0.0090836, 0.0090836
+ 4, 1.4422, 1.1074, 0.9839, 0.42105, 0.33699, 0.32895, 0.18841, 1.4296, 1.1554, 0.99481, 0.0097028, 0.0097028, 0.0097028
+ 5, 1.4568, 1.1365, 0.99009, 0.45906, 0.34977, 0.35208, 0.20224, 1.4068, 1.0908, 0.9867, 0.0097028, 0.0097028, 0.0097028
+ 6, 1.4394, 1.1085, 0.98512, 0.44791, 0.36511, 0.36021, 0.21012, 1.3813, 1.074, 0.97637, 0.0096037, 0.0096037, 0.0096037
+ 7, 1.4215, 1.0842, 0.97996, 0.46261, 0.36249, 0.36028, 0.21122, 1.3811, 1.0606, 0.9737, 0.0095047, 0.0095047, 0.0095047
+ 8, 1.4131, 1.0706, 0.97601, 0.47485, 0.37341, 0.37337, 0.21784, 1.3589, 1.0527, 0.97567, 0.0094056, 0.0094056, 0.0094056
+ 9, 1.394, 1.0509, 0.97156, 0.46247, 0.36519, 0.37023, 0.2188, 1.3714, 1.0521, 0.97431, 0.0093065, 0.0093065, 0.0093065
+ 10, 1.4003, 1.0485, 0.97092, 0.48787, 0.37099, 0.3805, 0.22397, 1.3324, 1.023, 0.96455, 0.0092074, 0.0092074, 0.0092074
+ 11, 1.3755, 1.0236, 0.96505, 0.48515, 0.38489, 0.39416, 0.23463, 1.3204, 1.0077, 0.96297, 0.0091084, 0.0091084, 0.0091084
+ 12, 1.3763, 1.0173, 0.96475, 0.50735, 0.39243, 0.40437, 0.24047, 1.3277, 1.004, 0.96122, 0.0090093, 0.0090093, 0.0090093
+ 13, 1.3625, 1.0054, 0.959, 0.50147, 0.39843, 0.40846, 0.24127, 1.3167, 0.99199, 0.95925, 0.0089102, 0.0089102, 0.0089102
+ 14, 1.3563, 0.99548, 0.95823, 0.49712, 0.3884, 0.39438, 0.23735, 1.3092, 0.99898, 0.95327, 0.0088112, 0.0088112, 0.0088112
+ 15, 1.3505, 0.98972, 0.95449, 0.50829, 0.39354, 0.41161, 0.24611, 1.3159, 0.98567, 0.95683, 0.0087121, 0.0087121, 0.0087121
+ 16, 1.3445, 0.98561, 0.95663, 0.50149, 0.39539, 0.41161, 0.24558, 1.3023, 0.97891, 0.95349, 0.008613, 0.008613, 0.008613
+ 17, 1.3492, 0.9803, 0.95539, 0.50376, 0.40885, 0.41209, 0.24528, 1.3027, 0.97796, 0.95461, 0.008514, 0.008514, 0.008514
+ 18, 1.344, 0.97414, 0.95409, 0.52249, 0.40455, 0.4195, 0.25185, 1.2974, 0.96076, 0.95462, 0.0084149, 0.0084149, 0.0084149
+ 19, 1.3383, 0.96907, 0.95166, 0.52539, 0.4042, 0.42516, 0.25587, 1.2951, 0.96702, 0.94925, 0.0083158, 0.0083158, 0.0083158
+ 20, 1.3261, 0.95742, 0.94978, 0.5237, 0.4085, 0.42292, 0.2551, 1.2885, 0.95391, 0.94753, 0.0082168, 0.0082168, 0.0082168
+ 21, 1.3276, 0.95743, 0.94921, 0.51944, 0.40631, 0.4285, 0.25767, 1.2937, 0.95616, 0.95005, 0.0081177, 0.0081177, 0.0081177
+ 22, 1.318, 0.94831, 0.94899, 0.52461, 0.41326, 0.429, 0.25683, 1.2859, 0.94577, 0.94931, 0.0080186, 0.0080186, 0.0080186
+ 23, 1.3221, 0.95192, 0.94844, 0.53326, 0.40578, 0.42906, 0.25971, 1.2848, 0.95265, 0.9465, 0.0079196, 0.0079196, 0.0079196
+ 24, 1.3173, 0.9443, 0.94581, 0.51853, 0.40168, 0.42076, 0.2537, 1.2853, 0.95042, 0.94559, 0.0078205, 0.0078205, 0.0078205
+ 25, 1.3156, 0.93723, 0.94534, 0.52577, 0.41199, 0.43069, 0.25797, 1.2832, 0.94698, 0.94731, 0.0077214, 0.0077214, 0.0077214
+ 26, 1.3091, 0.9343, 0.94468, 0.51292, 0.42504, 0.43049, 0.25865, 1.2806, 0.94295, 0.94587, 0.0076223, 0.0076223, 0.0076223
+ 27, 1.3103, 0.9302, 0.94224, 0.54017, 0.41784, 0.43871, 0.26315, 1.282, 0.93834, 0.94448, 0.0075233, 0.0075233, 0.0075233
+ 28, 1.305, 0.92609, 0.94321, 0.53184, 0.41709, 0.43422, 0.26314, 1.2688, 0.93311, 0.94436, 0.0074242, 0.0074242, 0.0074242
+ 29, 1.304, 0.92172, 0.94323, 0.54724, 0.41962, 0.43951, 0.26518, 1.2731, 0.93038, 0.94147, 0.0073251, 0.0073251, 0.0073251
+ 30, 1.298, 0.91742, 0.93981, 0.53962, 0.41229, 0.43432, 0.26276, 1.2711, 0.93532, 0.94437, 0.0072261, 0.0072261, 0.0072261
+ 31, 1.3027, 0.91387, 0.94037, 0.53967, 0.41432, 0.4329, 0.26125, 1.2666, 0.9333, 0.93923, 0.007127, 0.007127, 0.007127
+ 32, 1.2994, 0.91409, 0.93988, 0.52899, 0.42082, 0.43778, 0.26333, 1.2768, 0.92753, 0.94419, 0.0070279, 0.0070279, 0.0070279
+ 33, 1.2934, 0.91124, 0.94212, 0.53299, 0.42623, 0.44412, 0.27033, 1.2621, 0.91717, 0.94064, 0.0069289, 0.0069289, 0.0069289
+ 34, 1.2921, 0.90809, 0.93892, 0.53748, 0.42643, 0.44674, 0.27131, 1.2581, 0.91591, 0.93866, 0.0068298, 0.0068298, 0.0068298
+ 35, 1.2955, 0.90263, 0.93653, 0.53505, 0.42116, 0.4446, 0.26961, 1.2625, 0.91226, 0.93973, 0.0067307, 0.0067307, 0.0067307
+ 36, 1.2945, 0.9038, 0.93848, 0.55452, 0.42475, 0.44936, 0.27306, 1.2644, 0.91559, 0.93962, 0.0066317, 0.0066317, 0.0066317
+ 37, 1.2851, 0.89742, 0.93633, 0.54057, 0.43635, 0.44816, 0.27341, 1.258, 0.91964, 0.93783, 0.0065326, 0.0065326, 0.0065326
+ 38, 1.2884, 0.89508, 0.93618, 0.54, 0.42263, 0.44583, 0.27229, 1.2585, 0.90872, 0.93906, 0.0064335, 0.0064335, 0.0064335
+ 39, 1.2802, 0.88785, 0.93392, 0.54418, 0.42996, 0.44878, 0.2721, 1.2556, 0.91355, 0.9365, 0.0063344, 0.0063344, 0.0063344
+ 40, 1.2795, 0.8834, 0.93449, 0.547, 0.42829, 0.4472, 0.27132, 1.254, 0.91396, 0.93803, 0.0062354, 0.0062354, 0.0062354
+ 41, 1.2783, 0.88369, 0.93386, 0.56038, 0.4294, 0.45258, 0.2762, 1.2495, 0.90281, 0.93542, 0.0061363, 0.0061363, 0.0061363
+ 42, 1.278, 0.88281, 0.93285, 0.55034, 0.43081, 0.45414, 0.27641, 1.2532, 0.90648, 0.9373, 0.0060372, 0.0060372, 0.0060372
+ 43, 1.2766, 0.87983, 0.93115, 0.54933, 0.43011, 0.4507, 0.27411, 1.2479, 0.90617, 0.93466, 0.0059382, 0.0059382, 0.0059382
+ 44, 1.2628, 0.87159, 0.93121, 0.54237, 0.43501, 0.45379, 0.27754, 1.245, 0.9007, 0.93359, 0.0058391, 0.0058391, 0.0058391
+ 45, 1.2691, 0.87534, 0.93146, 0.53392, 0.43982, 0.45496, 0.27709, 1.2486, 0.90064, 0.93523, 0.00574, 0.00574, 0.00574
+ 46, 1.2637, 0.86705, 0.93033, 0.54898, 0.44257, 0.45806, 0.27862, 1.2488, 0.89759, 0.9346, 0.005641, 0.005641, 0.005641
+ 47, 1.2664, 0.86971, 0.93078, 0.54405, 0.43839, 0.45541, 0.27619, 1.2554, 0.89856, 0.9378, 0.0055419, 0.0055419, 0.0055419
+ 48, 1.2737, 0.87462, 0.92917, 0.55285, 0.43607, 0.45607, 0.27653, 1.2447, 0.8982, 0.93423, 0.0054428, 0.0054428, 0.0054428
+ 49, 1.2631, 0.86584, 0.93028, 0.55917, 0.43494, 0.4598, 0.28003, 1.2395, 0.89412, 0.93244, 0.0053438, 0.0053438, 0.0053438
+ 50, 1.2649, 0.86533, 0.93, 0.55589, 0.43895, 0.46241, 0.28279, 1.2425, 0.89025, 0.93283, 0.0052447, 0.0052447, 0.0052447
+ 51, 1.2625, 0.85896, 0.92726, 0.55627, 0.43807, 0.45937, 0.27849, 1.2409, 0.88821, 0.93283, 0.0051456, 0.0051456, 0.0051456
+ 52, 1.2523, 0.85401, 0.92594, 0.55563, 0.44292, 0.46383, 0.28191, 1.2392, 0.88848, 0.93429, 0.0050465, 0.0050465, 0.0050465
+ 53, 1.2554, 0.85555, 0.92693, 0.57146, 0.4369, 0.46303, 0.28305, 1.2331, 0.88396, 0.92977, 0.0049475, 0.0049475, 0.0049475
+ 54, 1.2529, 0.84758, 0.92543, 0.55506, 0.44158, 0.46147, 0.28012, 1.2435, 0.88722, 0.93301, 0.0048484, 0.0048484, 0.0048484
+ 55, 1.2471, 0.84757, 0.92751, 0.55925, 0.4364, 0.46272, 0.28217, 1.2389, 0.88365, 0.93056, 0.0047493, 0.0047493, 0.0047493
+ 56, 1.2511, 0.8476, 0.9258, 0.55423, 0.4456, 0.46534, 0.28412, 1.2353, 0.88018, 0.93094, 0.0046503, 0.0046503, 0.0046503
+ 57, 1.2423, 0.83968, 0.92423, 0.5511, 0.44247, 0.46468, 0.28379, 1.2349, 0.8841, 0.93167, 0.0045512, 0.0045512, 0.0045512
+ 58, 1.2435, 0.8411, 0.92455, 0.56421, 0.44738, 0.46803, 0.28462, 1.2361, 0.88084, 0.92986, 0.0044521, 0.0044521, 0.0044521
+ 59, 1.2374, 0.83339, 0.92436, 0.55181, 0.44576, 0.46715, 0.28525, 1.2381, 0.88181, 0.93141, 0.0043531, 0.0043531, 0.0043531
+ 60, 1.2349, 0.83136, 0.92361, 0.56382, 0.44718, 0.47256, 0.28855, 1.2362, 0.87393, 0.93087, 0.004254, 0.004254, 0.004254
+ 61, 1.2419, 0.82799, 0.9254, 0.56431, 0.45008, 0.47374, 0.2892, 1.2351, 0.87468, 0.92987, 0.0041549, 0.0041549, 0.0041549
+ 62, 1.2378, 0.83084, 0.92221, 0.58269, 0.44767, 0.47402, 0.29023, 1.2274, 0.87065, 0.92918, 0.0040559, 0.0040559, 0.0040559
+ 63, 1.2354, 0.82684, 0.92243, 0.56301, 0.44499, 0.47028, 0.28845, 1.2292, 0.87002, 0.93022, 0.0039568, 0.0039568, 0.0039568
+ 64, 1.2246, 0.81682, 0.92027, 0.56518, 0.45085, 0.47186, 0.28941, 1.2304, 0.86841, 0.92926, 0.0038577, 0.0038577, 0.0038577
+ 65, 1.2255, 0.81592, 0.91897, 0.57551, 0.44474, 0.47319, 0.29077, 1.2271, 0.86602, 0.92838, 0.0037587, 0.0037587, 0.0037587
+ 66, 1.2304, 0.81763, 0.9195, 0.57622, 0.44772, 0.47596, 0.29265, 1.2286, 0.86585, 0.9295, 0.0036596, 0.0036596, 0.0036596
+ 67, 1.2269, 0.81447, 0.92123, 0.57561, 0.44436, 0.47153, 0.28984, 1.2265, 0.86925, 0.92973, 0.0035605, 0.0035605, 0.0035605
+ 68, 1.2257, 0.81135, 0.92069, 0.57199, 0.4455, 0.47273, 0.28928, 1.2284, 0.86816, 0.92896, 0.0034614, 0.0034614, 0.0034614
+ 69, 1.2228, 0.80795, 0.91915, 0.56131, 0.45046, 0.47468, 0.29284, 1.2263, 0.86664, 0.92957, 0.0033624, 0.0033624, 0.0033624
+ 70, 1.2176, 0.80216, 0.9201, 0.55885, 0.45754, 0.47409, 0.29073, 1.2254, 0.86228, 0.92818, 0.0032633, 0.0032633, 0.0032633
+ 71, 1.2143, 0.79888, 0.91723, 0.56333, 0.45306, 0.47533, 0.29251, 1.2222, 0.86375, 0.92747, 0.0031642, 0.0031642, 0.0031642
+ 72, 1.2134, 0.80249, 0.91827, 0.56548, 0.45601, 0.47685, 0.29237, 1.2227, 0.86033, 0.92733, 0.0030652, 0.0030652, 0.0030652
+ 73, 1.2156, 0.79882, 0.9165, 0.56722, 0.45291, 0.4757, 0.29161, 1.2233, 0.86272, 0.92718, 0.0029661, 0.0029661, 0.0029661
+ 74, 1.2136, 0.79649, 0.91624, 0.55984, 0.45309, 0.4742, 0.29185, 1.2218, 0.86274, 0.92702, 0.002867, 0.002867, 0.002867
+ 75, 1.211, 0.79416, 0.91487, 0.55222, 0.4605, 0.47602, 0.29284, 1.222, 0.86109, 0.92649, 0.002768, 0.002768, 0.002768
+ 76, 1.2113, 0.79462, 0.91414, 0.57561, 0.44854, 0.47641, 0.29354, 1.2218, 0.86068, 0.92658, 0.0026689, 0.0026689, 0.0026689
+ 77, 1.2072, 0.78853, 0.91454, 0.56984, 0.4526, 0.47816, 0.29567, 1.2204, 0.85864, 0.92645, 0.0025698, 0.0025698, 0.0025698
+ 78, 1.2004, 0.7812, 0.91253, 0.567, 0.4569, 0.47755, 0.29353, 1.2238, 0.85928, 0.92777, 0.0024708, 0.0024708, 0.0024708
+ 79, 1.2066, 0.78193, 0.91387, 0.56134, 0.46049, 0.47904, 0.29407, 1.2221, 0.85474, 0.92712, 0.0023717, 0.0023717, 0.0023717
+ 80, 1.204, 0.78062, 0.91272, 0.57017, 0.45511, 0.47973, 0.29461, 1.2196, 0.85591, 0.92692, 0.0022726, 0.0022726, 0.0022726
+ 81, 1.2001, 0.77554, 0.91336, 0.56944, 0.45727, 0.48147, 0.29618, 1.2196, 0.85564, 0.9264, 0.0021735, 0.0021735, 0.0021735
+ 82, 1.1939, 0.77225, 0.91049, 0.57933, 0.45371, 0.47995, 0.29469, 1.2219, 0.85701, 0.92784, 0.0020745, 0.0020745, 0.0020745
+ 83, 1.1928, 0.7688, 0.91042, 0.58008, 0.45163, 0.48194, 0.29652, 1.2214, 0.8549, 0.9265, 0.0019754, 0.0019754, 0.0019754
+ 84, 1.1924, 0.76791, 0.9117, 0.57572, 0.45348, 0.48144, 0.2966, 1.2214, 0.85667, 0.92649, 0.0018763, 0.0018763, 0.0018763
+ 85, 1.1947, 0.76587, 0.91138, 0.58469, 0.45055, 0.48071, 0.29444, 1.2198, 0.85769, 0.92618, 0.0017773, 0.0017773, 0.0017773
+ 86, 1.1915, 0.76297, 0.90946, 0.5646, 0.45773, 0.47818, 0.2931, 1.2179, 0.85668, 0.92618, 0.0016782, 0.0016782, 0.0016782
+ 87, 1.1834, 0.75815, 0.9068, 0.56449, 0.45993, 0.48021, 0.29504, 1.218, 0.85525, 0.9258, 0.0015791, 0.0015791, 0.0015791
+ 88, 1.1853, 0.7564, 0.90755, 0.5698, 0.45533, 0.47853, 0.29408, 1.2201, 0.85641, 0.92593, 0.0014801, 0.0014801, 0.0014801
+ 89, 1.1783, 0.74828, 0.90632, 0.5701, 0.45637, 0.47835, 0.29452, 1.2186, 0.85488, 0.9253, 0.001381, 0.001381, 0.001381
+ 90, 1.1788, 0.74932, 0.90516, 0.57538, 0.45495, 0.47849, 0.29391, 1.2195, 0.85597, 0.92583, 0.0012819, 0.0012819, 0.0012819
+ 91, 1.1626, 0.72756, 0.91676, 0.56631, 0.45903, 0.47901, 0.29335, 1.2223, 0.85612, 0.92657, 0.0011829, 0.0011829, 0.0011829
+ 92, 1.1522, 0.71833, 0.91506, 0.57798, 0.44755, 0.47783, 0.2926, 1.2229, 0.85814, 0.92679, 0.0010838, 0.0010838, 0.0010838
+ 93, 1.1487, 0.71213, 0.91384, 0.58168, 0.44458, 0.47839, 0.2929, 1.2212, 0.85804, 0.92613, 0.00098472, 0.00098472, 0.00098472
+ 94, 1.144, 0.70737, 0.91266, 0.57398, 0.44725, 0.4781, 0.293, 1.2221, 0.85772, 0.92616, 0.00088565, 0.00088565, 0.00088565
+ 95, 1.1406, 0.70628, 0.91037, 0.57086, 0.45718, 0.47861, 0.29346, 1.2235, 0.85668, 0.92694, 0.00078658, 0.00078658, 0.00078658
+ 96, 1.1382, 0.70115, 0.91034, 0.5715, 0.45239, 0.47833, 0.29289, 1.2236, 0.85711, 0.92728, 0.00068751, 0.00068751, 0.00068751
98
+ 97, 1.1361, 0.69658, 0.91059, 0.56335, 0.45342, 0.47682, 0.29286, 1.2233, 0.85844, 0.92714, 0.00058845, 0.00058845, 0.00058845
99
+ 98, 1.1353, 0.69289, 0.90907, 0.56333, 0.45568, 0.47758, 0.29312, 1.2238, 0.85925, 0.92727, 0.00048938, 0.00048938, 0.00048938
100
+ 99, 1.1283, 0.68791, 0.90732, 0.56666, 0.45349, 0.47804, 0.29356, 1.2249, 0.86052, 0.92753, 0.00039031, 0.00039031, 0.00039031
101
+ 100, 1.1267, 0.68445, 0.9073, 0.56624, 0.45521, 0.47734, 0.29295, 1.2245, 0.86092, 0.92726, 0.00029124, 0.00029124, 0.00029124
enot_neural_architecture_selection_x2/weights/best.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:53910cdfafd0065503ffeafdd41a1acecf892551d0ab971cfc2cb869836a4558
+ size 10432046
enot_neural_architecture_selection_x2/weights/last.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:acd534baaee08dd0acae97ccddf3255039ae4953dc16a7c15aa23a3496b2863b
+ size 10433966
enot_neural_architecture_selection_x3/best_hyperparameters.yaml ADDED
@@ -0,0 +1,22 @@
+ lr0: 0.01178
+ lrf: 0.009
+ momentum: 0.92873
+ weight_decay: 0.00041
+ warmup_epochs: 3.87259
+ warmup_momentum: 0.70941
+ box: 11.1414
+ cls: 0.59602
+ dfl: 1.17484
+ hsv_h: 0.01007
+ hsv_s: 0.55096
+ hsv_v: 0.27943
+ degrees: 0.0
+ translate: 0.0875
+ scale: 0.62086
+ shear: 0.0
+ perspective: 0.0
+ flipud: 0.0
+ fliplr: 0.46323
+ mosaic: 0.77777
+ mixup: 0.0
+ copy_paste: 0.0
enot_neural_architecture_selection_x3/weights/best.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:39441cabb2a7f8cd349ff9c2b9af4ecf4287e342265e3c8e0fda453532ac6db8
+ size 7559662
enot_neural_architecture_selection_x3/weights/last.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2349fcc4a463dbe00545c88c8f6bb00b1153d95de0c7f680ad26f8f557b5370d
+ size 7561326
measure_macs.py ADDED
@@ -0,0 +1,22 @@
+ import argparse
+
+ import torch
+ from fvcore.nn import FlopCountAnalysis
+ from ultralytics import YOLO  # noqa: F401 -- keeps ultralytics model classes importable when unpickling the checkpoint
+
+
+ if __name__ == '__main__':
+     parser = argparse.ArgumentParser()
+     parser.add_argument('model', type=str, help='Model path whose MACs to measure.')
+     parser.add_argument('--imgsz', default=928, type=int, help='Input image size.')
+     args = parser.parse_args()
+
+     # Load the raw nn.Module from the ultralytics checkpoint and cast it to fp32.
+     model = torch.load(args.model, map_location='cpu')['model'].float()
+     fca = FlopCountAnalysis(
+         model=model.eval(),
+         inputs=torch.rand(1, 3, args.imgsz, args.imgsz),
+     )
+     fca.unsupported_ops_warnings(False)
+     fca.uncalled_modules_warnings(False)
+
+     print(f"{fca.total() * 1e-9:.2f} GMACs")
requirements.txt ADDED
@@ -0,0 +1,2 @@
+ ultralytics
+ fvcore
validate.py ADDED
@@ -0,0 +1,13 @@
+ import argparse
+
+ from ultralytics import YOLO
+
+
+ if __name__ == '__main__':
+     parser = argparse.ArgumentParser()
+     parser.add_argument('model', type=str, help='Model path for validation.')
+     parser.add_argument('--imgsz', default=928, type=int, help='Image size to validate.')
+     args = parser.parse_args()
+
+     # Evaluate detection metrics on VisDrone; batch=1 for deterministic per-image timing.
+     model = YOLO(args.model, task='detect')
+     model.val(data="VisDrone.yaml", imgsz=args.imgsz, batch=1)
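Putting the two scripts together, a hedged usage sketch against one of the checked-in checkpoints (assumes the dependencies from requirements.txt are installed and the VisDrone dataset is available to ultralytics; any LFS weight path in this repo works):

```shell
# Detection metrics on VisDrone at the 928 px training resolution.
python validate.py enot_neural_architecture_selection_x2/weights/best.pt --imgsz 928

# Compute cost (GMACs) of the same checkpoint at the same resolution.
python measure_macs.py enot_neural_architecture_selection_x2/weights/best.pt --imgsz 928
```

Running both commands on each `weights/best.pt` gives the accuracy/compute pairs needed to compare the searched architectures.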