---
license: apache-2.0
pipeline_tag: object-detection
tags:
- pytorch
- torch-dag
---
# Model Card for yolov8n_pruned_43

This is a pruned version of the [YOLOv8n](https://github.com/ultralytics/ultralytics#models) model in the [torch-dag](https://github.com/TCLResearchEurope/torch-dag) format.

This model retains roughly 43% of the original model's FLOPs with only a small drop in metrics.


| Model         | KMAPPs* | M Parameters | mAP50-95 (640x640) |
| -----------   | ------- | ------------ | ------------------ |
| **YOLOv8n (baseline)**   | 21.5 | 3.16 | 37.3 |
| **yolov8n_pruned_43 (ours)**  | 9.2 **(43%)** | 1.2 **(38%)** | 29.9 **(↓ 7.4)** |


\***KMAPPs**: thousands of FLOPs per input pixel

`KMAPPs(model) = FLOPs(model) / (H * W * 1000)`, where `(H, W)` is the input resolution.
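As a rough sanity check on the table above, the formula can be evaluated directly. The ~8.7 GFLOPs figure for YOLOv8n at 640x640 used below comes from the Ultralytics model table and is an assumption for illustration, not a number from this card:

```python
# Sketch: converting a FLOP count to KMAPPs for a given input resolution.
# Assumes YOLOv8n needs ~8.7 GFLOPs at 640x640 (Ultralytics model table);
# treat this as illustrative, not exact.

def kmapps(flops: float, height: int, width: int) -> float:
    """Thousands of FLOPs per input pixel."""
    return flops / (height * width * 1000)

baseline = kmapps(8.7e9, 640, 640)
print(round(baseline, 1))       # within rounding of the 21.5 reported above

# The "43%" in the model name follows from the KMAPPs ratio in the table:
print(round(9.2 / 21.5, 2))    # 0.43
```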

The accuracy was calculated on the COCO val2017 dataset. For details about image pre-processing, please refer to the original repository.

## Model Details

### Model Description


- **Developed by:** [TCL Research Europe](https://github.com/TCLResearchEurope/)
- **Model type:** Object detection
- **License:** Apache 2.0
- **Finetuned from model:** [YOLOv8n](https://github.com/ultralytics/ultralytics#models)

### Model Sources
- **Repository:** [YOLOv8n](https://github.com/ultralytics/ultralytics#models)



## How to Get Started with the Model

To load the model, you need to install the [torch-dag](https://github.com/TCLResearchEurope/torch-dag#3-installation) library, which can be done with `pip`:

```shell
pip install torch-dag
```

Then, clone this repository:

```shell
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/TCLResearchEurope/yolov8n_pruned_43
```

Now you are ready to load the model:

```python
import torch
import torch_dag

# Load the pruned model from the cloned repository directory
model = torch_dag.io.load_dag_from_path('./yolov8n_pruned_43')
model.eval()

# Run a dummy forward pass on a 224x224 input
with torch.no_grad():
    out = model(torch.ones(1, 3, 224, 224))
print(out.shape)
```