|
--- |
|
license: cc-by-4.0 |
|
pipeline_tag: image-to-image |
|
tags: |
|
- pytorch |
|
- super-resolution |
|
--- |
|
|
|
[Link to Github Release](https://github.com/Phhofm/models/releases/tag/4xNomos2_hq_dat2) |
|
|
|
# 4xNomos2_hq_dat2 |
|
Scale: 4 |
|
Architecture: [DAT](https://github.com/zhengchen1999/dat) |
|
Architecture Option: [dat2](https://github.com/muslll/neosr/blob/5fba7f162d36052010169e6517dec3b406c569ab/neosr/archs/dat_arch.py#L1111) |
|
|
|
Author: Philip Hofmann |
|
License: CC-BY-4.0
|
Purpose: Upscaler |
|
Subject: Photography |
|
Input Type: Images |
|
Release Date: 29.08.2024 |
|
|
|
Dataset: [nomosv2](https://github.com/muslll/neosr/?tab=readme-ov-file#-datasets) |
|
Dataset Size: 6000 |
|
OTF (on the fly augmentations): No |
|
Pretrained Model: DAT_2_x4 |
|
Iterations: 140'000 |
|
Batch Size: 4 |
|
Patch Size: 48 |
|
|
|
Description: |
|
A dat2 4x upscaling model, similar to the [4xNomos2_hq_mosr](https://github.com/Phhofm/models/releases/tag/4xNomos2_hq_mosr) model, trained for use on non-degraded input to produce good-quality output.
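
For reference, a minimal inference sketch in PyTorch is shown below. It assumes the released checkpoint can be loaded through the [spandrel](https://github.com/chaiNNer-org/spandrel) model loader (which supports DAT); the file name, device handling, and tensor shapes are illustrative, not part of this release.

```python
# Hedged sketch, not an official usage example: assumes the spandrel loader
# handles this DAT2 checkpoint and that the .pth file name below matches the release.
import torch
from spandrel import ModelLoader

device = "cuda" if torch.cuda.is_available() else "cpu"

descriptor = ModelLoader().load_from_file("4xNomos2_hq_dat2.pth")  # illustrative path
descriptor.to(device)
descriptor.eval()

# Input: an RGB image as a (1, 3, H, W) float tensor in [0, 1].
lr = torch.rand(1, 3, 128, 128, device=device)

with torch.no_grad():
    sr = descriptor(lr)  # expected (1, 3, 4*H, 4*W) for this 4x model

print(sr.shape)
```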
|
|
|
I scored 7 validation outputs from each of the 21 checkpoints (10k-210k) of this training run with 68 metrics.
|
[The metric scores can be found in this google sheet](https://docs.google.com/spreadsheets/d/1NL-by7WvZyDMHj5XN8UeDALVSSwH70IKvwV65ATWqrA/edit?usp=sharing). |
|
The corresponding image files for this scoring can be [found here](https://drive.google.com/file/d/1ZTp9fBMeawftNqzg4RN9_zIvHtul5jVc/view?usp=sharing).
|
Screenshot of the google sheet: |
|
![image](https://github.com/user-attachments/assets/bc6ff9e5-d012-4b15-9e7b-766896cf3d2f) |
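
The exact tooling behind the 68-metric scoring is not listed here; the sketch below only illustrates how a much smaller scoring pass could look, using the IQA-PyTorch (`pyiqa`) package with two example metrics. The metric choices and file paths are assumptions for illustration, not the setup actually used.

```python
# Hedged sketch: pyiqa (IQA-PyTorch) is assumed here as an example scoring tool;
# the actual 68-metric evaluation is not reproduced. Paths and metrics are illustrative.
import torch
import pyiqa

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Two example metrics: one full-reference (ssim) and one no-reference (niqe).
ssim = pyiqa.create_metric("ssim", device=device)
niqe = pyiqa.create_metric("niqe", device=device)

sr_path = "outputs/140000/val_img_01.png"  # upscaled validation output (illustrative)
gt_path = "gt/val_img_01.png"              # ground-truth reference (illustrative)

print("SSIM:", ssim(sr_path, gt_path).item())  # higher is better
print("NIQE:", niqe(sr_path).item())           # lower is better
```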
|
|
|
The release checkpoint was selected by looking at the scores, inspecting outputs manually, and then gathering feedback on Discord with this quick visual test of three checkpoints labeled A, B, and C, where B was preferred: https://slow.pics/c/8Akzj6rR
|
|
|
Checkpoint B is 140k, which is 4xNomos2_hq_dat2. I have additionally included the model files for checkpoint A (4xNomos2_hq_dat2_150000) and checkpoint C (4xNomos2_hq_dat2_10000) here, in case people want to try them out.
|
|
|
## Model Showcase: |
|
[Slowpics](https://slow.pics/c/yuue9WpF) |
|
|
|
(Click on an image for a better view)
|
![Example1](https://github.com/user-attachments/assets/151d3f10-ea2d-4466-a4ed-595f164ec025) |
|
![Example2](https://github.com/user-attachments/assets/9ac764ff-42a7-4a50-89a8-dbde3ca4407e) |
|
![Example3](https://github.com/user-attachments/assets/62fc5c91-1320-4561-bb1d-c7c5c740ca7d) |
|
![Example4](https://github.com/user-attachments/assets/5f44ff1a-a2d3-4942-9f73-c6a3b41fe15b) |
|
![Example5](https://github.com/user-attachments/assets/46baa7c5-5a75-4971-8f3c-01657efd566f) |
|
![Example6](https://github.com/user-attachments/assets/1dff06a4-0870-4d57-bd50-22409023da64) |
|
![Example7](https://github.com/user-attachments/assets/b7681172-c560-4d93-96f3-07b206ad699b) |
|
|