Add model card and link to paper

#1
by nielsr (HF Staff)
Files changed (1)
README.md +32 -3
README.md CHANGED
@@ -1,3 +1,32 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ pipeline_tag: text-generation
+ ---
+
+ # SHINE: A Scalable In-Context Hypernetwork for Mapping Context to LoRA in a Single Pass
+
+ SHINE (Scalable Hyper In-context NEtwork) is a scalable hypernetwork that maps diverse, meaningful contexts into high-quality LoRA adapters for large language models (LLMs) in a single forward pass.
+
+ - **Paper:** [SHINE: A Scalable In-Context Hypernetwork for Mapping Context to LoRA in a Single Pass](https://huggingface.co/papers/2602.06358)
+ - **Repository:** [https://github.com/Yewei-Liu/SHINE](https://github.com/Yewei-Liu/SHINE)
+
+
+ ## Description
+
+ By reusing the frozen LLM's own parameters in an in-context hypernetwork design and introducing architectural innovations, SHINE overcomes key limitations of prior hypernetworks and achieves strong expressive power with relatively few parameters. It updates the LLM's parameters without any fine-tuning and immediately enables complex question answering about the context without direct access to it, effectively converting in-context knowledge into in-parameter knowledge in one pass.
+
+ Compared to traditional SFT-based adaptation, SHINE substantially reduces time, computation, and memory costs while showing strong potential for scaling.
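The single-pass idea can be illustrated with a toy NumPy sketch: a hypernetwork reads a pooled context representation and emits LoRA factors for a frozen weight matrix in one forward pass. All shapes, names, and the linear hypernetwork heads here are illustrative assumptions, not SHINE's actual architecture.

```python
# Toy sketch of mapping a context to a LoRA update in a single pass.
# Everything below is illustrative; it is NOT the SHINE implementation.
import numpy as np

rng = np.random.default_rng(0)
d_model, rank, d_ctx = 16, 4, 32  # hidden size, LoRA rank, context-feature size

W_frozen = rng.standard_normal((d_model, d_model))         # frozen LLM weight
H_a = rng.standard_normal((d_ctx, rank * d_model)) * 0.01  # hypothetical head for A
H_b = rng.standard_normal((d_ctx, d_model * rank)) * 0.01  # hypothetical head for B

def context_to_lora(ctx_feat):
    """Map one pooled context feature to LoRA factors (A, B) in one pass."""
    A = (ctx_feat @ H_a).reshape(rank, d_model)  # (r, d)
    B = (ctx_feat @ H_b).reshape(d_model, rank)  # (d, r)
    return A, B

ctx_feat = rng.standard_normal(d_ctx)  # stands in for an encoded context
A, B = context_to_lora(ctx_feat)
W_adapted = W_frozen + B @ A           # apply the generated low-rank update
print(W_adapted.shape)                 # prints (16, 16)
```

No gradient step occurs anywhere above: the context alone determines the update, which is the property that lets the method skip fine-tuning entirely.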
+
+ ## Usage
+
+ For environment setup and detailed inference instructions, see the [official GitHub repository](https://github.com/Yewei-Liu/SHINE). The project provides an `inference.ipynb` notebook for quickly testing the hypernetwork's ability to generate LoRA adapters from custom contexts.
+
+ ## Citation
+
+ ```bibtex
+ @article{liu2025shine,
+   title={SHINE: A Scalable In-Context Hypernetwork for Mapping Context to LoRA in a Single Pass},
+   author={Liu, Yewei and others},
+   journal={arXiv preprint arXiv:2602.06358},
+   year={2025}
+ }
+ ```