---
license: apache-2.0
library_name: pytorch
tags:
- materials-science
- crystal-structures
- solid-state-batteries
- representation-learning
- screening
model-index:
- name: SSB Screening Model (RTX6000x2)
  results:
  - task:
      type: text-classification
      name: Screening Proxy (3-class)
    metrics:
    - type: accuracy
      value: 0.8118937
    - type: f1
      value: 0.8060277
    - type: precision
      value: 0.7671543
    - type: recall
      value: 0.8694215
    - type: val_loss
      value: 0.2856999
---

# SSB Screening Model (RTX6000x2)

## Model Summary
This model is a lightweight MLP classifier trained on NPZ-encoded inorganic crystal-structure features and used as a screening proxy for solid-state battery (SSB) candidates. It is intended to prioritize candidate structures, not to replace DFT or experimental validation.

- **Architecture**: MLP (input_dim=144, hidden_dims=[512, 256, 128], dropout varied per sweep); a minimal sketch follows this list
- **Output**: 3-class classification proxy for screening tasks
- **Training regime**: supervised training on a curated NPZ dataset with class-weighted loss
- **Best checkpoint**: `checkpoint_epoch45.pt` (lowest observed val_loss in the logs)
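
For concreteness, here is a minimal PyTorch sketch of the architecture described above; the class name, activation choice, and layer ordering are illustrative assumptions, not the repository's exact code:

```python
import torch
import torch.nn as nn

class ScreeningMLP(nn.Module):
    """Hypothetical reimplementation of the 144 -> [512, 256, 128] -> 3 MLP."""

    def __init__(self, input_dim=144, hidden_dims=(512, 256, 128),
                 num_classes=3, dropout=0.2):
        super().__init__()
        layers = []
        prev = input_dim
        for h in hidden_dims:
            layers += [nn.Linear(prev, h), nn.ReLU(), nn.Dropout(dropout)]
            prev = h
        layers.append(nn.Linear(prev, num_classes))  # 3-class logits
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

model = ScreeningMLP()
logits = model(torch.randn(8, 144))  # batch of 8 feature vectors -> (8, 3) logits
```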

## Intended Use
- **Primary**: ranking and prioritization of SSB electrolyte candidates
- **Not intended**: absolute property prediction or a replacement for experimental ground truth

## Training Data
- **Dataset**: `ssb_npz_v1` (curated NPZ features)
- **Split**: 80/10/10 (train/val/test)
- **Features**: composition + lattice + derived scalar statistics (144-dim); a loading sketch follows this list
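
A minimal sketch of loading these NPZ features and reproducing an 80/10/10 split; the file name and the array keys `X`/`y` are assumptions, so match them to the actual `ssb_npz_v1` layout:

```python
import numpy as np

# Hypothetical single-file layout: a 144-dim feature matrix plus integer labels.
data = np.load("ssb_npz_v1.npz")
X, y = data["X"], data["y"]  # X: (n_samples, 144), y: (n_samples,) in {0, 1, 2}

rng = np.random.default_rng(seed=0)
idx = rng.permutation(len(X))
n_train, n_val = int(0.8 * len(X)), int(0.1 * len(X))
train_idx = idx[:n_train]
val_idx = idx[n_train:n_train + n_val]
test_idx = idx[n_train + n_val:]  # remaining ~10% held out for testing
```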

## Evaluation
Metrics from the latest run summary:
- **Val loss**: 0.2857
- **Val accuracy**: 0.8119
- **Holdout accuracy**: 0.8096
- **F1**: 0.8060
- **Precision**: 0.7672
- **Recall**: 0.8694

## Limitations
- The model is a proxy classifier; it does not predict ground-truth physical properties.
- Performance is tied to the training distribution of `ssb_npz_v1`.
- Chemical regimes underrepresented in the training set may be poorly ranked.

## Training Configuration (abridged)
- Optimizer: AdamW
- LR: sweep (best near 3e-4)
- Weight decay: sweep (0.005–0.02)
- Scheduler: cosine
- Batch size: sweep (128–512)
- Epochs: sweep (20–60)
- Gradient accumulation: sweep (1–4); a training-step sketch follows this list
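
As an illustration of how these pieces fit together, a hedged sketch of one configuration from the sweep; the data tensors, class weights, and hyperparameter values are placeholders, and `ScreeningMLP` comes from the architecture sketch above:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder tensors standing in for the ssb_npz_v1 training split.
X_train = torch.randn(1024, 144)
y_train = torch.randint(0, 3, (1024,))
train_loader = DataLoader(TensorDataset(X_train, y_train), batch_size=256, shuffle=True)

model = ScreeningMLP(dropout=0.2)  # from the architecture sketch above
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4, weight_decay=0.01)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=45)
# Class-weighted loss; these weights are placeholders, not the trained values.
criterion = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 1.5, 2.0]))

accum_steps = 2  # gradient accumulation factor within the sweep range (1-4)
for epoch in range(45):
    for step, (xb, yb) in enumerate(train_loader):
        loss = criterion(model(xb), yb) / accum_steps
        loss.backward()
        if (step + 1) % accum_steps == 0:
            optimizer.step()
            optimizer.zero_grad()
    scheduler.step()
```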

## Citation
If you use this model, please cite the dataset and training pipeline from the Nexa_compute repository.