Improve model card: Add metadata and update paper/GitHub links

#1 opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +8 -2
README.md CHANGED
@@ -1,9 +1,15 @@
+---
+license: mit
+pipeline_tag: text-generation
+library_name: peft
+---
+
 # CDLM-LLaDA LoRA adapter for LLaDA-8B-Instruct
 
 This repository hosts the LoRA adapter for the LLaDA-8B-Instruct diffusion LLM (dLLM), produced with the CDLM (Consistency Diffusion Language Models) method. CDLM integrates consistency modeling and a block-wise causal attention mask so the student model becomes fully KV-cache compatible while retaining the strong local bidirectional modeling within each block. In practice, the adapter enables significantly faster inference with competitive quality.
 
-- GitHub: https://github.com/minseo25/CDLM
-- Paper: TBA
+- GitHub: https://github.com/SqueezeAILab/CDLM
+- Paper: [CDLM: Consistency Diffusion Language Models For Faster Sampling](https://huggingface.co/papers/2511.19269)
 
 
 ## Model details
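
The block-wise causal attention the card describes can be illustrated with a minimal NumPy sketch. This is not the CDLM implementation; the function name, the `block_size` value, and the boolean "True means may attend" convention are all assumptions for illustration only:

```python
import numpy as np

def blockwise_causal_mask(seq_len: int, block_size: int) -> np.ndarray:
    """Boolean attention mask where True means "may attend".

    Tokens attend bidirectionally to every position inside their own
    block, and causally to all positions in earlier blocks. Attention
    to future blocks is masked out, which is what allows a KV cache
    to be reused at block granularity.
    """
    block_ids = np.arange(seq_len) // block_size  # block index per token
    # Token i may attend to token j iff j's block is not after i's block.
    return block_ids[None, :] <= block_ids[:, None]

mask = blockwise_causal_mask(seq_len=8, block_size=4)
# With block_size=4: token 0 sees token 3 (same block, bidirectional),
# but token 3 does not see token 4 (future block).
```

Within each block the mask is fully bidirectional, matching the "strong local bidirectional modeling" mentioned above, while across blocks it is strictly causal.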