---
license: apache-2.0
datasets:
  - ylecun/mnist
  - uoft-cs/cifar10
  - uoft-cs/cifar100
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-to-image
tags:
  - diffusion
  - unet
  - res
---

# Diffusion Model Sampler

An implementation of a diffusion model sampler using a UNet transformer to generate handwritten digit samples.

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Results
  5. Roadmap
  6. Contributing
  7. License
  8. Contact
  9. Acknowledgments
## About The Project

Diffusion models have shown great promise in generating high-quality samples across various domains. In this project, we utilize a UNet transformer-based diffusion model to generate samples of handwritten digits. The process involves:

1. Setting up the model and loading pre-trained weights.
2. Generating samples for each digit.
3. Creating a GIF to visualize the generated samples.
*(Sample GIFs: MNIST · CIFAR-10)*
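The three steps above can be sketched roughly as follows. This is a minimal illustration, not the repository's code: the number of diffusion steps, the noise schedule, the `UNet` checkpoint name, and the `(x, t)` model signature are all assumptions made for the sketch.

```python
import torch
from PIL import Image

# Assumed DDPM hyper-parameters (illustrative only, not taken from this repository).
T = 1000
betas = torch.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)


@torch.no_grad()
def sample(model, shape=(1, 1, 28, 28), device="cpu", keep_every=100):
    """Plain DDPM ancestral sampling; collects intermediate frames for the GIF."""
    x = torch.randn(shape, device=device)
    frames = []
    for t in reversed(range(T)):
        t_batch = torch.full((shape[0],), t, device=device, dtype=torch.long)
        eps = model(x, t_batch)  # noise prediction from the (assumed) UNet
        mean = (x - betas[t] / torch.sqrt(1.0 - alpha_bars[t]) * eps) / torch.sqrt(alphas[t])
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[t]) * noise
        if t % keep_every == 0:
            frames.append(x.clone())
    return frames


def frames_to_gif(frames, path="samples.gif"):
    """Scale [-1, 1] tensors to 8-bit grayscale and write an animated GIF with Pillow."""
    images = [
        Image.fromarray(((f[0, 0].clamp(-1, 1) + 1) * 127.5).byte().cpu().numpy())
        for f in frames
    ]
    images[0].save(path, save_all=True, append_images=images[1:], duration=200, loop=0)


# Typical use (checkpoint name and UNet constructor are hypothetical):
# model = UNet(...); model.load_state_dict(torch.load("mnist_unet.pt"))
# frames_to_gif(sample(model))
```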


### Built With

#### AI and Machine Learning Libraries
* Python
* PyTorch
* NumPy
* Matplotlib
* Pillow


## Getting Started

To get a local copy up and running, follow these steps.

### Prerequisites

Ensure you have the following installed:
* Python 3.8 or higher
* A CUDA-enabled GPU (optional but recommended)
* The following Python libraries:
  - torch
  - torchvision
  - numpy
  - Pillow
  - matplotlib

### Installation

1. Clone the repository:
   ```sh
   git clone https://github.com/Yavuzhan-Baykara/Stable-Diffusion.git
   cd Stable-Diffusion
   ```
2. Install the required Python libraries:
   ```sh
   pip install torch torchvision numpy Pillow matplotlib
   ```
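After step 2, a quick sanity check (not part of the repository) confirms that PyTorch is importable and whether a CUDA GPU is visible:

```python
# Optional environment check; uses only standard PyTorch calls.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```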


## Usage

To train the UNet transformer with different datasets and samplers, use the following command:

```sh
python train.py
```
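The datasets listed in this card (MNIST, CIFAR-10, CIFAR-100) are all available through `torchvision`. As a rough sketch of the data side only, not the repository's own loader or transforms, they can be downloaded like this:

```python
# Illustrative only: the repository's actual data pipeline may differ.
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

mnist = datasets.MNIST(root="data", train=True, download=True, transform=to_tensor)
cifar10 = datasets.CIFAR10(root="data", train=True, download=True, transform=to_tensor)
cifar100 = datasets.CIFAR100(root="data", train=True, download=True, transform=to_tensor)

print(len(mnist), len(cifar10), len(cifar100))  # 60000 50000 50000 training images
```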