---
license: afl-3.0
tags:
- image-classification
library_name: keras
datasets:
- mnist
metrics:
- accuracy
model-index:
- name: resnet_mnist_digits
results:
- task:
type: image-classification
name: Image Classification
dataset:
type: mnist
name: MNIST
metrics:
- type: accuracy
        value: 0.9945
name: Accuracy
verified: false
---
# Model Card for resnet_mnist_digits
This model is a Residual Neural Network (ResNet) for classifying handwritten digits from the MNIST dataset.
It has 27.5M parameters and achieves 99.45% accuracy on the MNIST test set (i.e., on digits not seen during training).
## Model Details
### Model Description
This model takes as input a 28x28 array representing a grayscale MNIST digit, with pixel values normalized to [0, 1].
The model was trained using Keras on an NVIDIA A100 (Ampere) GPU.
- **Developed by:** Phillip Allen Lane
- **Model type:** ResNet
- **License:** afl-3.0
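The normalization described above (raw 8-bit pixel values scaled into [0, 1]) can be sketched with plain NumPy; the random array below is a stand-in for a real MNIST image:

```python
import numpy as np

# a stand-in for a raw MNIST image: 28x28 uint8 pixel values in [0, 255]
raw = np.random.randint(0, 256, size=(28, 28), dtype=np.uint8)

# scale to float32 values in [0, 1], as the model expects
normalized = raw.astype('float32') / 255

print(normalized.shape, normalized.min(), normalized.max())
```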
### How to Get Started with the Model
Use the code below to get started with the model.
```py
from tensorflow.keras import models
from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import get_file, to_categorical

# load the MNIST test images and labels
(_, _), (test_images, test_labels) = mnist.load_data()

# normalize pixel values to [0, 1]
test_images = test_images.astype('float32') / 255

# one-hot encode the labels
test_labels_onehot = to_categorical(test_labels)

# download the model (cached under ~/.keras by default)
model_path = get_file(
    'resnet_mnist_digits.hdf5',
    'https://huggingface.co/lane99/resnet_mnist_digits/resolve/main/resnet_mnist_digits.hdf5'
)

# load the model
resnet = models.load_model(model_path)

# evaluate the model; evaluate() returns [loss, accuracy]
evaluation = resnet.evaluate(test_images, test_labels_onehot)
print("Accuracy:", evaluation[1])
```
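Beyond aggregate accuracy, individual predictions are decoded by taking the argmax over the model's 10 class probabilities. The vector below is a dummy stand-in for one row of `resnet.predict(...)` output, so the sketch runs without downloading the model:

```python
import numpy as np

# stand-in for one row of resnet.predict(test_images[:1]):
# 10 class probabilities, one per digit 0-9
probs = np.array([[0.01, 0.02, 0.01, 0.05, 0.01, 0.02, 0.01, 0.80, 0.02, 0.05]])

# the predicted digit is the index of the most probable class
predicted_digit = int(np.argmax(probs, axis=1)[0])
print(predicted_digit)  # -> 7
```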
## Training Details
### Training Data
This model was trained on the 60,000 images in the MNIST training set.
### Training Procedure
This model was trained with a 0.1 validation split for 15 epochs using a batch size of 128.
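Given those hyperparameters, the effective split and per-epoch batch count work out as follows (plain arithmetic, assuming Keras's default `validation_split` behavior of holding out the last fraction of the training data):

```python
total = 60_000      # MNIST training set size
val_split = 0.1     # fraction held out for validation
batch_size = 128

val_size = int(total * val_split)                  # samples held out for validation
train_size = total - val_size                      # samples used for gradient updates
batches_per_epoch = -(-train_size // batch_size)   # ceiling division

print(train_size, val_size, batches_per_epoch)
```

So each of the 15 epochs sees 54,000 training samples in 422 batches, with 6,000 samples reserved for validation.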