---
license: cc-by-4.0
pipeline_tag: image-to-image
tags:
- pytorch
- super-resolution
---

[Link to Github Release](https://github.com/Phhofm/models/releases/tag/4xLSDIRCompact)

# 4xLSDIRCompact

Name: 4xLSDIRCompact
Author: Philip Hofmann
Release Date: 11.03.2023
License: CC BY 4.0
Model Architecture: SRVGGNetCompact
Scale: 4
Purpose: Upscale small, good-quality photos to 4x their size
Iterations: 160000
batch_size: Variable (1-10)
HR_size: 256
Dataset: LSDIR
Dataset_size: 84991
OTF Training: No
Pretrained_Model_G: 4x_Compact_Pretrain.pth

Description: This is my first ever released self-trained SISR upscaling model 😄
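The card names SRVGGNetCompact (the compact architecture used by Real-ESRGAN/BasicSR) but includes no inference code. Below is a minimal plain-PyTorch sketch of that style of network, just to illustrate its shape: a stack of 3x3 convolutions followed by a pixel-shuffle upsampler, with a nearest-neighbour upsampled residual. The `num_feat`/`num_conv` hyperparameters and the `'params'` state-dict key are assumptions, not values confirmed by this card — inspect the released `.pth` file (or load it with BasicSR) for the exact configuration.

```python
# Sketch of an SRVGGNetCompact-style 4x super-resolution network.
# Hyperparameters (num_feat=64, num_conv=16) are assumed defaults, not
# taken from the released model; verify against the .pth state dict.
import torch
import torch.nn as nn


class SRVGGNetCompactSketch(nn.Module):
    def __init__(self, num_in_ch=3, num_out_ch=3, num_feat=64, num_conv=16, upscale=4):
        super().__init__()
        layers = [nn.Conv2d(num_in_ch, num_feat, 3, 1, 1), nn.PReLU(num_feat)]
        # trunk: num_conv conv+activation pairs at constant width
        for _ in range(num_conv):
            layers += [nn.Conv2d(num_feat, num_feat, 3, 1, 1), nn.PReLU(num_feat)]
        # final conv expands channels for pixel-shuffle upsampling
        layers += [nn.Conv2d(num_feat, num_out_ch * upscale ** 2, 3, 1, 1),
                   nn.PixelShuffle(upscale)]
        self.body = nn.Sequential(*layers)
        self.upscale = upscale

    def forward(self, x):
        # learn the residual on top of a nearest-neighbour upscaled base
        base = nn.functional.interpolate(x, scale_factor=self.upscale, mode="nearest")
        return self.body(x) + base


model = SRVGGNetCompactSketch()
# To try the released weights (key name 'params' is an assumption):
# model.load_state_dict(torch.load("4xLSDIRCompact.pth")["params"])
with torch.no_grad():
    out = model(torch.rand(1, 3, 64, 64))
print(out.shape)  # 4x the spatial size: torch.Size([1, 3, 256, 256])
```

In practice the released `.pth` can also be used directly in GUI tools such as chaiNNer without writing any code.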
15 Examples: https://imgsli.com/MTYxNDk3

![Example1](https://github.com/Phhofm/models/assets/14755670/1923d560-4b40-495c-aa2b-76b73245c1e9)
![Example2](https://github.com/Phhofm/models/assets/14755670/d186ae3a-fb90-418c-a3f3-10a0e10d10ae)
![Example3](https://github.com/Phhofm/models/assets/14755670/eb610cea-f45d-444e-b9e6-16b9a6335c6c)
![Example4](https://github.com/Phhofm/models/assets/14755670/9c5adfc1-ccce-466c-9ada-a45a685561e4)
![Example5](https://github.com/Phhofm/models/assets/14755670/dadea59b-1a59-49c3-b360-62ddbf2725a7)
![Example6](https://github.com/Phhofm/models/assets/14755670/095ae5b3-2509-4904-866c-e7e65292d1b6)
![Example7](https://github.com/Phhofm/models/assets/14755670/e4b3332d-7b2d-45dd-adce-1892b2db9117)
![Example8](https://github.com/Phhofm/models/assets/14755670/8d2b86ac-5503-40ec-a133-9559ec3a7699)
![Example9](https://github.com/Phhofm/models/assets/14755670/f6bcc117-0321-45fe-a4b1-c699a84504b8)
![Example10](https://github.com/Phhofm/models/assets/14755670/aebbc1f8-a5f8-4e7c-97f2-bce3657a6c70)
![Example11](https://github.com/Phhofm/models/assets/14755670/b83f88bf-5947-4aed-997e-16d6f1b2c806)
![Example12](https://github.com/Phhofm/models/assets/14755670/32b829f2-a7b8-4fd3-947e-78b7b7d9f6f6)
![Example13](https://github.com/Phhofm/models/assets/14755670/17d7ae04-1fac-4483-9f60-5021bc7a5e22)
![Example14](https://github.com/Phhofm/models/assets/14755670/63b1d22e-2ef4-43b9-a30b-4371169c701b)
![Example15](https://github.com/Phhofm/models/assets/14755670/e51a0451-4685-4c98-a554-4eed822a1453)