Add model card for Parcae-xlarge-1.3B

#1 opened by nielsr (HF Staff)
Files changed (1): README.md (new file, +58 −0)
---
pipeline_tag: text-generation
---

# Parcae-xlarge-1.3B

Parcae is a stable looped architecture for language modeling. Unlike traditional fixed-depth architectures, which scale by increasing parameter counts, Parcae increases FLOPs by sending activations through a block of layers in a loop. It addresses the instability of prior looped models by recasting looping as a nonlinear time-variant dynamical system and constraining the spectral norm of the injection parameters.
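
The two ideas above can be illustrated with a minimal NumPy sketch: a shared core block looped over the activations, with the prelude output re-injected through a spectral-norm-constrained matrix. This is a conceptual toy, not the Parcae implementation; the layer shapes, `tanh` nonlinearity, and the power-iteration normalizer are all illustrative assumptions.

```python
import numpy as np

def spectral_norm(W, n_iters=20):
    """Estimate the largest singular value of W by power iteration."""
    v = np.random.default_rng(0).standard_normal(W.shape[1])
    for _ in range(n_iters):
        u = W @ v
        u /= np.linalg.norm(u)
        v = W.T @ u
        v /= np.linalg.norm(v)
    return float(u @ W @ v)

def constrain(W, max_norm=1.0):
    """Rescale W so its spectral norm does not exceed max_norm."""
    s = spectral_norm(W)
    return W if s <= max_norm else W * (max_norm / s)

rng = np.random.default_rng(1)
d = 16  # toy model dimension
prelude = rng.standard_normal((d, d)) * 0.1
core = rng.standard_normal((d, d)) * 0.1
inject = constrain(rng.standard_normal((d, d)))  # constrained injection
coda = rng.standard_normal((d, d)) * 0.1

def looped_forward(x, recurrence=8):
    e = np.tanh(prelude @ x)                 # prelude: embed the input
    h = np.zeros_like(e)
    for _ in range(recurrence):              # loop the shared core block
        h = np.tanh(core @ h + inject @ e)   # re-inject the prelude output
    return coda @ h                          # coda: map back out

y = looped_forward(rng.standard_normal(d))
print(y.shape)  # prints (16,)
```

Bounding the injection's spectral norm keeps the repeated update from amplifying the injected signal across loop iterations, which is the stability mechanism the paper formalizes through its dynamical-systems view.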

- **Paper:** [Parcae: Scaling Laws For Stable Looped Language Models](https://huggingface.co/papers/2604.12946)
- **Project Page:** [https://sandyresearch.github.io/parcae/](https://sandyresearch.github.io/parcae/)
- **Repository:** [https://github.com/sandyresearch/parcae](https://github.com/sandyresearch/parcae)

## Installation

To use this model, you can install the `parcae-lm` package:

```bash
pip install parcae-lm
```

## Usage

You can load the pretrained weights using the `parcae_lm` library:

```python
import parcae_lm

# Load this pretrained model from the Hugging Face Hub
model = parcae_lm.from_pretrained("SandyResearch/parcae-xlarge-1_3b")
```

## Model Details

This checkpoint is the 1.3B-parameter variant of Parcae, trained on the FineWeb-Edu dataset.

| Model | Parameters | Prelude | Core | Coda | Model dim. | Recurrence |
|-------------|------------|---------|------|------|------------|------------|
| Parcae-1.3B | 1.3B | 8 | 8 | 8 | 1536 | 8 |

**Note:** These are base models without any downstream modification (e.g., instruction tuning).
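
With 8 prelude, 8 core, and 8 coda layers and a recurrence of 8, the core block is traversed 8 times per forward pass, so per-token compute scales as if the model were much deeper than its parameter count suggests. The quick accounting below is an illustration of that depth/parameter split, not an official FLOP count:

```python
prelude, core, coda = 8, 8, 8  # layer counts from the table above
recurrence = 8                 # times the core block is looped

# Parameters are paid for once per distinct layer...
param_layers = prelude + core + coda
# ...but compute is paid once per traversal, and the core loops.
effective_depth = prelude + core * recurrence + coda

print(param_layers)     # 24 layers' worth of parameters
print(effective_depth)  # 80 layers' worth of per-token compute
```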

## Citation

```bibtex
@misc{prairie2026parcaescalinglawsstable,
      title={Parcae: Scaling Laws For Stable Looped Language Models},
      author={Hayden Prairie and Zachary Novack and Taylor Berg-Kirkpatrick and Daniel Y. Fu},
      year={2026},
      eprint={2604.12946},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2604.12946},
}
```

## References

This codebase builds on `karpathy/nanochat`, `seal-rg/recurrent-pretraining`, and `Lightning-AI/litgpt`.