---
license: cc-by-nc-4.0
---
# Model Card for **_Panda-72M_**
This is a scaled-up version of the checkpoint originally presented in our preprint. This model has 12 layers with 12 attention heads each.
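
For reference, the stated architecture corresponds to a configuration along these lines. This is a minimal sketch with hypothetical field names, not the model's actual config schema; the hidden size is an assumption (only the layer and head counts come from this card):

```python
from dataclasses import dataclass

@dataclass
class PandaConfig:
    """Hypothetical configuration sketch for illustration only;
    field names are not the actual Panda-72M config schema."""
    n_layers: int = 12   # transformer blocks, as stated in this card
    n_heads: int = 12    # attention heads per block, as stated in this card
    d_model: int = 768   # assumed hidden size; must be divisible by n_heads

cfg = PandaConfig()
assert cfg.d_model % cfg.n_heads == 0
head_dim = cfg.d_model // cfg.n_heads  # per-head dimension under the assumed hidden size
```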
Trained on a larger dataset with multiple initial conditions per system, as well as mixed periods.
*Panda*: Patched Attention for Nonlinear Dynamics.