Improve model card: Add `library_name`, prominent paper link, and GitHub link

#1
opened by nielsr (HF Staff)
Files changed (1)
  1. README.md +9 -2
README.md CHANGED
@@ -1,5 +1,7 @@
 ---
 license: apache-2.0
+pipeline_tag: text-generation
+library_name: transformers
 tags:
 - transformer
 - causal-lm
@@ -7,11 +9,12 @@ tags:
 - constructive-learning
 - frozen-embeddings
 - bvv
-pipeline_tag: text-generation
 ---
 
 # Model Card for abs-bvv-5
 
+This model was presented in the paper [Growing Transformers: Modular Composition and Layer-wise Expansion on a Frozen Substrate](https://huggingface.co/papers/2507.07129).
+
 ## Model Description
 
 `abs-bvv-5` is a 2.1 billion parameter decoder-only Transformer model. It is the 5th model in the **Progressive Growth Transformers (PGT)** series, designed to explore how linguistic and reasoning capabilities emerge as a function of model depth.
@@ -119,4 +122,8 @@ outputs = model.generate(
     do_sample=True
 )
 
-print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+```
+
+## GitHub Repository
+The official code repository for the **Progressive Growth Transformers (PGT)** project can be found here: [https://github.com/bochenkovlabs/PGT](https://github.com/bochenkovlabs/PGT)
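
For context on what the added `library_name: transformers` and `pipeline_tag: text-generation` metadata imply, below is a minimal loading sketch. It assumes the model loads through the standard `AutoModelForCausalLM` / `AutoTokenizer` classes; the repo id used here is a placeholder, since the actual Hub id is not shown in this diff.

```python
# Minimal sketch, assuming standard AutoModel loading; the repo id is a
# placeholder and must be replaced with the actual Hub id of abs-bvv-5.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/abs-bvv-5"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Mirrors the tail of the README's generation snippet visible in the diff above.
inputs = tokenizer("The Progressive Growth Transformers (PGT) series", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```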