CIFAR10 LeNet5 Base Model

This repository contains the base implementation of the LeNet5 architecture adapted for CIFAR-10. The model consists of two convolutional layers followed by three fully connected layers, using ReLU activations and Kaiming uniform initialization. It is trained with a batch size of 32 using the Adam optimizer (learning rate 0.001) and CrossEntropyLoss. In our experiments, this model achieved a test loss of 0.0539 and a top-1 accuracy of 58.52% on CIFAR-10.

Model Details

  • Architecture: 2 Convolutional Layers, 3 Fully Connected Layers.
  • Activations: ReLU (tanh may be tried as an alternative).
  • Weight Initialization: Kaiming Uniform.
  • Optimizer: Adam (lr=0.001).
  • Loss Function: CrossEntropyLoss.
  • Dataset: CIFAR-10.
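
The sketch below shows what the architecture and training setup listed above might look like in PyTorch. The channel counts, kernel sizes, and pooling layout are assumptions taken from the classic LeNet5 design and may differ from the released checkpoint.

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """LeNet5-style network for 32x32 RGB CIFAR-10 images:
    two convolutional layers followed by three fully connected layers."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.ReLU(),
            nn.MaxPool2d(2),                  # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, num_classes),
        )
        # Kaiming uniform initialization for all conv and linear weights.
        for m in self.modules():
            if isinstance(m, (nn.Conv2d, nn.Linear)):
                nn.init.kaiming_uniform_(m.weight, nonlinearity="relu")
                nn.init.zeros_(m.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)

# Training setup from the model card: Adam (lr=0.001) and CrossEntropyLoss.
model = LeNet5()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.CrossEntropyLoss()
```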

Usage

Load this model in PyTorch to fine-tune it or evaluate it on CIFAR-10 with your own training and evaluation scripts.
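
For example, the weights could be pulled from the Hub and evaluated on the CIFAR-10 test split as sketched below. The checkpoint filename (model.pth) is a guess, and the LeNet5 class from the sketch above is assumed to match the stored state_dict.

```python
import torch
import torchvision
import torchvision.transforms as transforms
from huggingface_hub import hf_hub_download

# Download the checkpoint from the Hub (the filename is an assumption).
weights_path = hf_hub_download(
    repo_id="Juardo/uu-infomcv-assignment-3-CIFAR10_lenet",
    filename="model.pth",
)
model = LeNet5()
model.load_state_dict(torch.load(weights_path, map_location="cpu"))
model.eval()

# Top-1 accuracy on the CIFAR-10 test split.
test_set = torchvision.datasets.CIFAR10(
    root="./data", train=False, download=True,
    transform=transforms.ToTensor(),
)
test_loader = torch.utils.data.DataLoader(test_set, batch_size=32)

correct = total = 0
with torch.no_grad():
    for images, labels in test_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
print(f"Top-1 accuracy: {100 * correct / total:.2f}%")
```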



