|
--- |
|
license: apache-2.0 |
|
pipeline_tag: object-detection |
|
tags: |
|
- pytorch |
|
- torch-dag |
|
--- |
|
# Model Card for yolov8n_pruned_43 |
|
|
|
This is a pruned version of the [YOLOv8n](https://github.com/ultralytics/ultralytics#models) model in the [torch-dag](https://github.com/TCLResearchEurope/torch-dag) format.
|
|
|
This model has roughly 43% of the original model's FLOPs, with only a small drop in metrics.
|
|
|
|
|
| Model | KMAPPs* | Parameters (M) | mAP50-95 (640x640) |
|
| ----------- | ------- | ------------ | ------------------ | |
|
| **YOLOv8n (baseline)** | 21.5 | 3.16 | 37.3 | |
|
| **yolov8n_pruned_43 (ours)** | 9.2 **(43%)** | 1.2 **(38%)** | 29.9 **(↓ 7.4)** | |
|
|
|
|
|
\***KMAPPs**: thousands of FLOPs per input pixel.
|
|
|
`KMAPPs(model) = FLOPs(model) / (H * W * 1000)`, where `(H, W)` is the input resolution. |
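For example, a minimal sketch of this computation (the ~8.8 GFLOPs figure below is simply back-computed from the 21.5 KMAPPs reported for the baseline at 640x640 and is not an official measurement):

```
# Minimal sketch of the KMAPPs formula above.
def kmapps(flops: float, height: int, width: int) -> float:
    return flops / (height * width * 1000)

# Example: ~8.8 GFLOPs at a 640x640 input gives roughly 21.5 KMAPPs.
print(kmapps(flops=8.8e9, height=640, width=640))
```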
|
|
|
The mAP50-95 values were computed on the COCO val2017 dataset. For details about image pre-processing, please refer to the original repository.
|
## Model Details |
|
|
|
### Model Description |
|
|
|
|
|
- **Developed by:** [TCL Research Europe](https://github.com/TCLResearchEurope/) |
|
- **Model type:** Object detection |
|
- **License:** Apache 2.0 |
|
- **Finetuned from model:** [YOLOv8n](https://github.com/ultralytics/ultralytics#models) |
|
|
|
### Model Sources |
|
- **Repository:** [YOLOv8n](https://github.com/ultralytics/ultralytics#models) |
|
|
|
|
|
|
|
## How to Get Started with the Model |
|
|
|
To load the model, you need to install the [torch-dag](https://github.com/TCLResearchEurope/torch-dag#3-installation) library, which can be done via `pip`:
|
|
|
``` |
|
pip install torch-dag |
|
``` |
|
|
|
Then clone this repository:
|
|
|
``` |
|
# Make sure you have git-lfs installed (https://git-lfs.com) |
|
git lfs install |
|
git clone https://huggingface.co/TCLResearchEurope/yolov8n_pruned_43 |
|
``` |
|
|
|
Now you are ready to load the model:
|
|
|
```
import torch
import torch_dag

# Load the pruned model from the cloned repository directory.
model = torch_dag.io.load_dag_from_path('./yolov8n_pruned_43')
model.eval()

# Run a forward pass on a dummy input.
out = model(torch.ones(1, 3, 224, 224))
print(out.shape)
```
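
If you want to sanity-check the numbers in the table above, the parameter count can be read directly from the loaded model with plain PyTorch. The FLOP estimate below is a sketch that assumes the optional [fvcore](https://github.com/facebookresearch/fvcore) package (not required by this repository); fvcore counts fused multiply-adds and traces the model, so the resulting figure is only approximate and may not match the KMAPPs convention used here exactly:

```
import torch
from fvcore.nn import FlopCountAnalysis  # assumption: fvcore installed separately

# Parameter count -- should be roughly 1.2M for yolov8n_pruned_43.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.2f}M")

# Approximate FLOPs at the 640x640 evaluation resolution, converted to KMAPPs.
flops = FlopCountAnalysis(model, torch.ones(1, 3, 640, 640)).total()
print(f"KMAPPs (approx.): {flops / (640 * 640 * 1000):.1f}")
```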