Image Classification
MindSpore
helloway committed on
Commit 8a49c6d
1 Parent(s): dd859fb

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED

@@ -13,7 +13,7 @@ datasets:

This repository contains the model from [this notebook on image classification with CIFAR-10 dataset using RESNET50 architecture](https://gitee.com/mindspore/models/blob/r1.9/official/cv/resnet/README.md#).

- ## LeNet Description
+ ## ResNET Description

ResNet (residual neural network) was proposed by Kaiming He and colleagues at Microsoft Research. Using residual units, they trained a network 152 layers deep and won first place in ILSVRC 2015 with a top-5 error rate of 3.57%, while using fewer parameters than VGGNet. Plain convolutional or fully connected networks lose information as depth grows and suffer from vanishing or exploding gradients, which makes very deep networks hard to train. ResNet alleviates this by passing the input directly to the output through a shortcut connection, so the information is preserved and each block only has to learn the residual, i.e. the difference between its input and output. This simplifies the learning objective, speeds up training considerably, and improves accuracy, and residual connections have since been adopted in many other network architectures.
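The description above boils down to one idea: a residual block computes F(x) + x, so the layers only have to learn the residual F(x) while the shortcut carries the input through unchanged. The snippet below is a minimal, illustrative sketch of such a block using the MindSpore `nn` API; the class name `ResidualBlock` and its layout are assumptions for illustration, not the bottleneck block used in the linked ResNet-50 implementation.

```python
import numpy as np
import mindspore as ms
import mindspore.nn as nn
from mindspore import Tensor


class ResidualBlock(nn.Cell):
    """Toy residual block: construct() returns relu(F(x) + x).

    Illustrative only -- ResNet-50 itself stacks 1x1-3x3-1x1 bottleneck
    blocks and uses projection shortcuts when the shape changes.
    """

    def __init__(self, channels):
        super().__init__()
        # F(x): two 3x3 conv + batch-norm layers with an unchanged channel
        # count, so the output shape matches the identity branch.
        self.conv1 = nn.Conv2d(channels, channels, 3, pad_mode='same', has_bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, pad_mode='same', has_bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def construct(self, x):
        identity = x                      # shortcut: pass the input straight through
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)  # only the residual F(x) has to be learned


# Quick shape check on a CIFAR-10-sized feature map (NCHW).
x = Tensor(np.ones((1, 64, 32, 32)), ms.float32)
print(ResidualBlock(64)(x).shape)  # (1, 64, 32, 32)
```

Because the shortcut is an identity map, gradients flow through the addition unchanged, which is why very deep stacks of such blocks remain trainable.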