nielsr (HF Staff) committed
Commit 26613eb · verified · Parent(s): 6f09a9a

Add comprehensive model card for UNICE


This PR adds a comprehensive model card for the UNICE model. It includes:
- Relevant metadata: `pipeline_tag: image-to-image`, `library_name: diffusers`, and `license: mit`.
- A link to the paper: [UNICE: Training A Universal Image Contrast Enhancer](https://huggingface.co/papers/2507.17157).
- Links to the official GitHub repository, Colab demo, and Hugging Face dataset.
- An overview of the model, sample usage, and citation information, leveraging content from the original GitHub README.

Please review and merge if everything looks good.

Files changed (1)
  1. README.md +69 -0
README.md ADDED
@@ -0,0 +1,69 @@
---
license: mit
pipeline_tag: image-to-image
library_name: diffusers
---

# UNICE: Training A Universal Image Contrast Enhancer

This repository contains the **UNICE** model, a Universal Image Contrast Enhancer, presented in the paper [UNICE: Training A Universal Image Contrast Enhancer](https://huggingface.co/papers/2507.17157).

<p align="center">
  <a href="https://colab.research.google.com/drive/1EjIAThdFhyE_51ujdAUK0_4NRlBcKIdf?usp=sharing" target="_blank">
    <img src="https://img.shields.io/badge/Colab%20Demo-F9AB00?style=flat&logo=googlecolab&logoColor=white">
  </a>
  &nbsp;&nbsp;&nbsp;
  <a href="https://huggingface.co/datasets/lahaina/UNICE" target="_blank">
    <img src="https://img.shields.io/badge/Hugging%20Face-EA6B66?style=flat&logo=huggingface&logoColor=FFD21E">
  </a>
  &nbsp;&nbsp;&nbsp;
  <a href="https://arxiv.org/abs/2507.17157" target="_blank">
    <img src="https://img.shields.io/badge/arXiv-2507.17157-b31b1b?style=flat&logo=arXiv&logoColor=white">
  </a>
  &nbsp;&nbsp;&nbsp;
  <a href="https://github.com/RuodaiCui/UNICE" target="_blank">
    <img src="https://img.shields.io/badge/GitHub-Code-blue.svg?logo=github">
  </a>
</p>

## Overview

Existing image contrast enhancement methods are typically designed for a single task. UNICE instead trains one universal, generalized model that handles a range of tasks, including under-/over-exposure correction and low-light enhancement. Given a single sRGB image, it first generates a multi-exposure sequence (MES) and then fuses that sequence into an enhanced output. The method requires no costly human labeling, generalizes better than existing task-specific methods, and even outperforms manually created ground truths on several no-reference image quality metrics.

<img src="https://huggingface.co/datasets/lahaina/UNICE/resolve/main/img/GT_unice_cmp.jpg" alt="Comparison with manually created ground truths" width="600">
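
The second stage of this pipeline, fusing an MES into a single image, is analogous to classical exposure fusion. The sketch below uses OpenCV's Mertens fusion purely as a stand-in for UNICE's learned fusion network; the input file names are hypothetical.

```python
import cv2

# A multi-exposure sequence (MES) of the same scene, e.g. the output of the
# first (MES-generation) stage. File names here are hypothetical.
mes = [cv2.imread(p) for p in ("ev_minus2.png", "ev_0.png", "ev_plus2.png")]

# Classical Mertens exposure fusion, standing in for the learned fusion step.
fused = cv2.createMergeMertens().process(mes)  # float32 result, roughly in [0, 1]
cv2.imwrite("fused.png", (fused * 255).clip(0, 255).astype("uint8"))
```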

## Usage

For detailed installation instructions, training procedures, and more examples, please refer to the [official GitHub repository](https://github.com/RuodaiCui/UNICE).

Pre-trained weights for this model are available on the [Hugging Face Hub](https://huggingface.co/lahaina/unice/tree/main/checkpoints).
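
To fetch a checkpoint programmatically, here is a minimal sketch with `huggingface_hub`, assuming the `checkpoints/exposure.pkl` layout used by the inference script below:

```python
from huggingface_hub import hf_hub_download

# Assumes weights live under checkpoints/ in the lahaina/unice repo,
# matching the --model_path used by the inference script below.
ckpt_path = hf_hub_download(repo_id="lahaina/unice",
                            filename="checkpoints/exposure.pkl")
print(ckpt_path)  # local cache path of the downloaded checkpoint
```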

To test the model with different exposure values, use a script like the following (adjust the Python interpreter and data paths to your environment):
```bash
#!/bin/bash

# Exposure value to test and the corresponding output directory
exposure=0.5
output_dir="output/$exposure"

# Run inference on one GPU. Activate the repo's Python environment first
# (the original script used a conda env named img2img-turbo), and point
# --input_dir at your own images.
CUDA_VISIBLE_DEVICES=0 python src/inference.py \
    --model_path "checkpoints/exposure.pkl" \
    --input_dir "path/to/input/images" \
    --output_dir "$output_dir" \
    --prompt "exposure control" \
    --exposure "$exposure"
```
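
To sweep several exposure values in one run, a small driver can invoke the same script in a loop. This is a sketch: the flags mirror the shell script above, and the input path is a placeholder to adjust.

```python
import subprocess

# Hypothetical sweep over exposure values; adjust paths to your setup.
for exposure in (0.25, 0.5, 1.0, 2.0):
    subprocess.run(
        [
            "python", "src/inference.py",
            "--model_path", "checkpoints/exposure.pkl",
            "--input_dir", "path/to/input/images",
            "--output_dir", f"output/{exposure}",
            "--prompt", "exposure control",
            "--exposure", str(exposure),
        ],
        check=True,
    )
```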

## Citation

If you find our work helpful or inspiring, please consider citing our paper:

```bibtex
@misc{ruodai2025UNICE,
      title={UNICE: Training A Universal Image Contrast Enhancer},
      author={Ruodai Cui and Lei Zhang},
      year={2025},
      eprint={2507.17157},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2507.17157},
}
```