The model was fine-tuned on the **MatterGen Density** dataset, containing inorganic materials.

### Training Procedure

- **Architecture:** GPT-2 with additional Property-Key-Value (PKV) encoder layers (~61.6M parameters).
- **Mechanism:** Continuous property values are projected into the attention mechanism's key-value space (Prefix Tuning), allowing the model to attend to the target properties at every generation step.
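The prefix-tuning mechanism above can be sketched in a few lines of PyTorch. This is a minimal illustrative module, not the repository's actual implementation: the class name `PropertyPrefixEncoder`, the MLP sizes, and the single-prefix-token choice are all assumptions; only the overall idea (projecting a scalar property into per-layer key/value prefixes shaped like GPT-2's `past_key_values`) follows the description.

```python
import torch
import torch.nn as nn

class PropertyPrefixEncoder(nn.Module):
    """Hypothetical sketch: map one continuous property value to
    per-layer key/value prefix tensors for a GPT-2-style decoder."""
    def __init__(self, n_layer=12, n_head=12, head_dim=64, prefix_len=1):
        super().__init__()
        self.n_layer, self.n_head = n_layer, n_head
        self.head_dim, self.prefix_len = head_dim, prefix_len
        # one key and one value tensor per layer
        out_dim = n_layer * 2 * prefix_len * n_head * head_dim
        self.proj = nn.Sequential(
            nn.Linear(1, 256), nn.Tanh(), nn.Linear(256, out_dim)
        )

    def forward(self, prop):  # prop: (batch, 1) scalar property value
        b = prop.size(0)
        pkv = self.proj(prop)  # (batch, out_dim)
        pkv = pkv.view(b, self.n_layer, 2, self.n_head,
                       self.prefix_len, self.head_dim)
        # list of (key, value) pairs, one per layer, in the
        # (batch, n_head, seq_len, head_dim) layout GPT-2 expects
        return [(pkv[:, i, 0], pkv[:, i, 1]) for i in range(self.n_layer)]

enc = PropertyPrefixEncoder()
past = enc(torch.tensor([[5.2]]))  # e.g. a target density value
# `past` could be passed to a GPT-2 forward call as `past_key_values`,
# so every attention layer attends to the property prefix at each step.
```

Because the prefixes enter as keys and values in every layer, each generated token can condition on the target property without the property ever appearing in the input token sequence.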
## Evaluation