
# ResNet-50: Image Classification

ResNet is a family of convolutional networks that performs strongly on the ImageNet classification challenge.

It introduces residual learning: identity shortcut connections add a block's input directly to its output, preserving information and mitigating vanishing and exploding gradients, which makes very deep networks practical to train. ResNet comes in several depths; the commonly used variants have 18, 34, 50, 101, and 152 layers. ResNet-50 has 50 layers and is a popular choice because it balances speed and accuracy.
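To make the residual idea concrete, the minimal PyTorch sketch below shows a basic residual block: the input is routed around the convolutional path through an identity shortcut and added back before the final activation. This is an illustration of the concept only, not the exact bottleneck block used in ResNet-50.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A minimal residual block: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                      # direct (shortcut) channel
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity              # residual addition preserves the input signal
        return self.relu(out)

# The block keeps the input shape unchanged.
y = ResidualBlock(64)(torch.randn(1, 64, 56, 56))
print(y.shape)  # torch.Size([1, 64, 56, 56])
```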

The model can be found here


## Performance

| Device | SoC | Runtime | Model | Size (pixels) | Inference Time (ms) | Precision | Compute Unit | Model Download |
|---|---|---|---|---|---|---|---|---|
| AidBox QCS6490 | QCS6490 | QNN | ResNet-50 | 224 | 3.6 | INT8 | NPU | model download |
| AidBox QCS6490 | QCS6490 | QNN | ResNet-50 | 224 | 5.9 | INT16 | NPU | model download |
| AidBox QCS6490 | QCS6490 | SNPE | ResNet-50 | 224 | 3.4 | INT8 | NPU | model download |
| AidBox QCS6490 | QCS6490 | SNPE | ResNet-50 | 224 | 4.8 | INT16 | NPU | model download |
| APLUX QCS8550 | QCS8550 | QNN | ResNet-50 | 224 | 2.5 | INT8 | NPU | model download |
| APLUX QCS8550 | QCS8550 | QNN | ResNet-50 | 224 | 3.5 | INT16 | NPU | model download |
| APLUX QCS8550 | QCS8550 | SNPE | ResNet-50 | 224 | 1 | INT8 | NPU | model download |
| APLUX QCS8550 | QCS8550 | SNPE | ResNet-50 | 224 | 1.4 | INT16 | NPU | model download |
| AidBox GS865 | QCS8250 | SNPE | ResNet-50 | 224 | 9 | INT8 | NPU | model download |

## Model Conversion

The demo models were converted with AIMO (AI Model Optimizer).

The source model ResNet-50.onnx can be found here.

The conversion steps for the demo models on AIMO are listed below:

| Device | SoC | Runtime | Model | Size (pixels) | Precision | Compute Unit | AIMO Conversion Steps |
|---|---|---|---|---|---|---|---|
| AidBox QCS6490 | QCS6490 | QNN | ResNet-50 | 224 | INT8 | NPU | View Steps |
| AidBox QCS6490 | QCS6490 | QNN | ResNet-50 | 224 | INT16 | NPU | View Steps |
| AidBox QCS6490 | QCS6490 | SNPE | ResNet-50 | 224 | INT8 | NPU | View Steps |
| AidBox QCS6490 | QCS6490 | SNPE | ResNet-50 | 224 | INT16 | NPU | View Steps |
| APLUX QCS8550 | QCS8550 | QNN | ResNet-50 | 224 | INT8 | NPU | View Steps |
| APLUX QCS8550 | QCS8550 | QNN | ResNet-50 | 224 | INT16 | NPU | View Steps |
| APLUX QCS8550 | QCS8550 | SNPE | ResNet-50 | 224 | INT8 | NPU | View Steps |
| APLUX QCS8550 | QCS8550 | SNPE | ResNet-50 | 224 | INT16 | NPU | View Steps |
| AidBox GS865 | QCS8250 | SNPE | ResNet-50 | 224 | INT8 | NPU | View Steps |

## Inference

### Step 1: Convert the model

a. Prepare the source model in ONNX format. The source model can be found here. (A sketch of exporting ResNet-50 to ONNX is shown after these steps.)

b. Log in to AIMO and convert the source model to the target format. The conversion can follow the AIMO Conversion Steps column in the Model Conversion table above.

c. After the conversion task finishes, download the target model file.
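As an illustration of step (a), the following sketch exports a pretrained torchvision ResNet-50 to ONNX with a 224x224 input, matching the input size in the tables above. The output file name `resnet50.onnx` and the opset version are illustrative choices, not values prescribed by AIMO.

```python
import torch
from torchvision.models import resnet50, ResNet50_Weights

# Load a pretrained ResNet-50 (ImageNet weights) and switch to inference mode.
model = resnet50(weights=ResNet50_Weights.IMAGENET1K_V1)
model.eval()

# Dummy 224x224 RGB input matching the size used in the performance table.
dummy_input = torch.randn(1, 3, 224, 224)

# Export to ONNX; file name and opset version are illustrative.
torch.onnx.export(
    model,
    dummy_input,
    "resnet50.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=12,
)
```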

### Step 2: Install the AidLite SDK

The installation guide for the AidLite SDK can be found here.

### Step 3: Run the demo program
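The demo program source is not included in this card. As a minimal sanity check of the pre- and post-processing before deploying on device, the sketch below runs the source `resnet50.onnx` from Step 1 with ONNX Runtime and standard ImageNet normalization. The file names and the image path are assumptions; on-device inference with the converted INT8/INT16 QNN or SNPE model goes through the AidLite SDK instead.

```python
import numpy as np
import onnxruntime as ort
from PIL import Image

def preprocess(path):
    """Standard ImageNet preprocessing: resize, 224x224 center crop, normalize."""
    img = Image.open(path).convert("RGB").resize((256, 256))
    img = img.crop((16, 16, 240, 240))                       # 224x224 center crop
    x = np.asarray(img, dtype=np.float32) / 255.0
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)
    x = (x - mean) / std
    return x.transpose(2, 0, 1)[None, :]                     # NCHW, batch of 1

session = ort.InferenceSession("resnet50.onnx")              # file name assumed from Step 1
inputs = {session.get_inputs()[0].name: preprocess("test.jpg")}  # hypothetical test image
logits = session.run(None, inputs)[0]
print("Predicted ImageNet class index:", int(np.argmax(logits, axis=1)[0]))
```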
