Gzhlaker committed · Commit ac26558 · verified · 1 Parent(s): 792f184

Update README.md

Files changed (1): README.md (+108 −5)
README.md CHANGED

```diff
@@ -1,7 +1,110 @@
-This is an EAR (Erasing Autoregressive Models) model trained to erase specific concepts.
-
 ---
 license: mit
-base_model:
-- deepseek-ai/Janus-Pro-7B
----
```
---
license: mit
---

# Model Card for EAR

<!-- Provide a quick summary of what the model is/does. -->

This is an EAR (Erasing Autoregressive Models) model trained to erase specific concepts.

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** IMMC
- **Model type:** AR model
- **License:** MIT
- **Finetuned from model:** Janus-Pro

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** [github.com/immc-lab/ear](https://github.com/immc-lab/ear)
- **Paper:** [arXiv:2506.20151](https://arxiv.org/abs/2506.20151)

## Installation Guide

### EAR Environment

```shell
git clone https://github.com/immc-lab/ear.git
cd ear
conda create -n ear python=3.12
conda activate ear
pip install -r requirements.txt
```

### Janus-Pro Environment

Ensure that your environment can run Janus-Pro; refer to its official [Quick Start](https://github.com/deepseek-ai/Janus) for details.
47
+
48
+ ## Training Guide
49
+
50
+ After installation, follow these instructions to train EAR model for Janus-Pro.
51
+
52
+ Please run the script in `train/` after checking the file path:
53
+
54
+ ```shell
55
+ python train/ear_train_church.py
56
+ ```
57
+
58
+ ## Generating Images with EAR
59
+
60
+ Image generation using the custom EAR model is a straightforward process. Please run the script in `infer/`.
61
+
62
+ For automated batch generation of evaluation images, utilize the following script:
63
+
64
+ ```shell
65
+ python infer/infer_church.py
66
+ ```
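
At a high level, a batch-generation script like the one above iterates over evaluation prompts and random seeds and writes one image file per (prompt, seed) pair, named so that downstream evaluation can recover which prompt produced it. A minimal, model-free sketch of that looping and naming pattern (all names here are illustrative, not the repo's API):

```python
from pathlib import Path

def output_paths(prompts, seeds, out_dir="outputs"):
    # One file per (prompt, seed) pair; the filename encodes the prompt
    # index and seed so evaluation scripts can group images by prompt.
    paths = []
    for p_idx, _prompt in enumerate(prompts):
        for seed in seeds:
            paths.append(Path(out_dir) / f"prompt{p_idx:03d}_seed{seed}.png")
    return paths

paths = output_paths(["a photo of a church"], seeds=[0, 1, 2])
# e.g. outputs/prompt000_seed0.png, outputs/prompt000_seed1.png, ...
```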
67
+
68
+ ## Evaluation
69
+
70
+ You can execute the following command to evaluate the generated data. Please run the script in `eval/`.
71
+
72
+ The specific evaluation method can be found in our [paper](https://arxiv.org/pdf/2506.20151).
73
+
74
+ ```shell
75
+ python eval/eval_object.py --folder_path {args.output_dir} --topk 10 --batch_size 250
76
+ ```
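
The `--topk` flag presumably controls a standard top-k check: an image counts as containing the target concept if that concept's class is among the classifier's k highest-scoring predictions. A dependency-free sketch of that check (the helper name is illustrative, not the repo's API):

```python
def in_topk(scores, target_class, k=10):
    """Return True if target_class is among the k highest-scoring classes.

    scores: per-class scores (e.g. classifier logits) for one image.
    target_class: index of the concept class being checked.
    """
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return target_class in ranked[:k]

# Class 2 has the highest score, so it is in the top-1.
hit = in_topk([0.1, 0.3, 0.9, 0.2], target_class=2, k=1)  # True
```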
77
+
78
+ ## References
79
+
80
+ This repo is the code for the paper *EAR: Erasing Concepts from Unified Autoregressive Models*.
81
+
82
+ Thanks for the creative ideas of the pioneer researches:
83
+
84
+ - https://github.com/rohitgandikota/erasing: **Erasing Concepts from Diffusion Models**
85
+ - https://github.com/Con6924/SPM: **One-dimentional Adapter to Rule Them All: Concepts, Diffusion Models and Erasing
86
+ Applications**
87
+ - https://github.com/koushiksrivats/robust-concept-erasing: **STEREO: A Two-Stage Framework for Adversarially Robust
88
+ Concept Erasing from Text-to-Image Diffusion Models**
89
+ - https://github.com/OPTML-Group/Diffusion-MU-Attack: **To Generate or Not? Safety-Driven Unlearned Diffusion Models Are
90
+ Still Easy To Generate Unsafe Images ... For Now**
91
+ - https://github.com/deepseek-ai/Janus: **Janus: Decoupling Visual Encoding for Unified Multimodal Understanding and
92
+ Generation**
93
+ - https://github.com/deepseek-ai/Janus: **Janus-Pro: Unified Multimodal Understanding and Generation with Data and Model
94
+ Scaling**
95
+

## Citing our work

The preprint can be cited as follows:

```bibtex
@misc{fan2025earerasingconceptsunified,
      title={EAR: Erasing Concepts from Unified Autoregressive Models},
      author={Haipeng Fan and Shiyuan Zhang and Baohunesitu and Zihang Guo and Huaiwen Zhang},
      year={2025},
      eprint={2506.20151},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2506.20151},
}
```