---
license: cc-by-4.0
pipeline_tag: image-to-image
tags:
- pytorch
- super-resolution
---

[Link to Github Release](https://github.com/Phhofm/models/releases/tag/2xHFA2kCompact)

# 2xHFA2kCompact

Name: 2xHFA2kCompact  
Author: Philip Hofmann  
Release Date: 18.04.2023  
License: CC BY 4.0  
Network: SRVGGNetCompact (600,652 parameters)  
Scale: 2  
Purpose: A compact anime 2x upscaling model based on musl's HFA2k dataset  

Iterations: 93,000  
batch_size: 12  
HR_size: 384  
Epochs: 207 (214 iterations per epoch)  
Dataset: HFA2k  
Number of train images: 2568  
OTF Training: Yes  
Pretrained_Model_G: 4x_Compact_Pretrain  
Training time: 24h+  
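
The reported parameter count can be sanity-checked arithmetically. Below is a minimal sketch assuming the standard Real-ESRGAN SRVGGNetCompact layout (3x3 convolutions, 64 feature channels, 16 body convolutions, channel-wise PReLU activations, and a PixelShuffle head); these hyperparameters are assumptions inferred from the reported total, not stated in this card:

```python
# Sanity-check the reported SRVGGNetCompact parameter count (600,652).
# Assumed hyperparameters (Real-ESRGAN compact defaults, not stated in this card):
in_ch, out_ch = 3, 3  # RGB in/out
num_feat = 64         # feature channels
num_conv = 16         # body convolutions
scale = 2             # this model is a 2x upscaler
k = 3 * 3             # 3x3 kernels throughout

first_conv = in_ch * num_feat * k + num_feat                  # first conv: weights + bias
body_convs = num_conv * (num_feat * num_feat * k + num_feat)  # 16 body convs
# PReLU with one learnable slope per channel, after the first conv and each body conv:
prelus = (num_conv + 1) * num_feat
# Final conv maps features to out_ch * scale^2 channels for PixelShuffle upsampling:
last_conv = num_feat * (out_ch * scale**2) * k + out_ch * scale**2

total = first_conv + body_convs + prelus + last_conv
print(total)  # 600652, matching the parameter count reported above
```

The total matches exactly, which suggests the model uses the default compact configuration with only the upscale factor changed to 2.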

Description: A compact 2x anime upscaler trained with on-the-fly (OTF) compression and blur degradations. The '2xHFA2kCompact.pth' file (4.6 MB) is the original trained model; the other model files are conversions made with chaiNNer. Trained on musl's latest dataset release for anime SISR, which was extracted from modern anime films, where the selection criteria were high SNR, no depth of field, and high-frequency information.

Examples: [Interactive comparisons on imgsli](https://imgsli.com/MTcxNjA4)

![Example1](https://github.com/Phhofm/models/assets/14755670/644cbf4c-8e01-4052-9c41-e127818a3ce5)
![Example2](https://github.com/Phhofm/models/assets/14755670/592dbf17-8268-4032-8a6e-fab277e69f6d)
![Example3](https://github.com/Phhofm/models/assets/14755670/01132e47-32c8-437f-8cc8-b938270db697)
![Example4](https://github.com/Phhofm/models/assets/14755670/8627f038-ab42-4e9a-9a97-2b8930487fa3)
![Example5](https://github.com/Phhofm/models/assets/14755670/a68a1260-f044-4e93-b969-8c2fd61c1fe2)
![Example6](https://github.com/Phhofm/models/assets/14755670/cbab3532-02cb-4277-9774-0a12450711a1)
![Example7](https://github.com/Phhofm/models/assets/14755670/85659e12-a124-41ee-9688-57ecb04391cb)
![Example8](https://github.com/Phhofm/models/assets/14755670/6797da63-d670-45c2-88a2-9d35e7cd81b2)
![Example9](https://github.com/Phhofm/models/assets/14755670/04680869-99af-4003-8312-1a62b2ef1700)
![Example10](https://github.com/Phhofm/models/assets/14755670/823a03b0-0208-4505-b997-d10a8b813b3d)
![Example11](https://github.com/Phhofm/models/assets/14755670/512cfa9e-de86-45d1-ace8-7a24071812a0)
![Example12](https://github.com/Phhofm/models/assets/14755670/89a0ca79-4c80-48df-bed3-fbd2e93e2f28)
![Example13](https://github.com/Phhofm/models/assets/14755670/9db9da16-8463-480b-8823-7e6665ed293f)
![Example14](https://github.com/Phhofm/models/assets/14755670/49f7d485-9238-476a-be77-dae731daff2b)
![Example15](https://github.com/Phhofm/models/assets/14755670/cdfb393c-2f0f-4d18-8749-2c0d407cf17d)
![Example16](https://github.com/Phhofm/models/assets/14755670/74158efe-4d6c-45e2-b57e-70096a4a4f68)
![Example17](https://github.com/Phhofm/models/assets/14755670/61453d50-777f-47b3-9d3b-45274b01a9af)
![Example18](https://github.com/Phhofm/models/assets/14755670/651a7600-77bc-472d-b019-cfd928d15077)
![Example19](https://github.com/Phhofm/models/assets/14755670/fce75e50-ff81-46cc-82b6-c560a3500abe)
![Example20](https://github.com/Phhofm/models/assets/14755670/76ef4b26-75e7-4fbb-8a9c-2e8baab903b0)
![Example21](https://github.com/Phhofm/models/assets/14755670/ab7553e0-a6c1-4671-82ef-0e6148e047d2)
![Example22](https://github.com/Phhofm/models/assets/14755670/6c240fc1-5cbb-4f31-85d6-b61f3f5ca608)
![Example23](https://github.com/Phhofm/models/assets/14755670/95d06023-332a-4039-b4c1-ee06fe547127)
![Example24](https://github.com/Phhofm/models/assets/14755670/5ad867da-9765-4364-a0aa-07951d6fe64a)
![Example25](https://github.com/Phhofm/models/assets/14755670/112dcef6-eaef-4450-82ee-388d26e184af)