NeerajCodz committed · verified
Commit 4baca8f · 1 parent: a796e1b

chore: update model card

Files changed (1): README.md (+97 −3)
---
license: mit
language:
- en
tags:
- battery
- state-of-health
- remaining-useful-life
- time-series
- regression
- lstm
- transformer
- xgboost
- lightgbm
- random-forest
- ensemble
datasets:
- NASA-PCoE-Battery
metrics:
- r2
- mae
- rmse
pipeline_tag: tabular-regression
---

# AI Battery Lifecycle — Model Repository

Trained model artifacts for the [aiBatteryLifeCycle](https://huggingface.co/spaces/NeerajCodz/aiBatteryLifeCycle) project.

The models predict State-of-Health (SOH) and Remaining Useful Life (RUL) for
lithium-ion batteries and were trained on the NASA PCoE Battery Dataset.

## Repository Layout

```
artifacts/
├── v1/
│   ├── models/
│   │   ├── classical/   # Ridge, Lasso, ElasticNet, KNN ×3, SVR, XGBoost, LightGBM, RF
│   │   └── deep/        # Vanilla LSTM, Bi-LSTM, GRU, Attention-LSTM, TFT,
│   │                    # BatteryGPT, iTransformer, Physics-iTransformer,
│   │                    # DG-iTransformer, VAE-LSTM
│   └── scalers/         # MinMax, Standard, Linear, Sequence scalers
└── v2/
    ├── models/
    │   ├── classical/   # Same family + Extra Trees, Gradient Boosting, best_rul_model
    │   └── deep/        # Same deep models re-trained on v2 feature set
    ├── scalers/         # Per-model feature scalers
    └── results/         # Validation JSONs
```

## Model Performance Summary

| Rank | Model | R² | MAE | RMSE | Family |
|------|-------|-----|-----|------|--------|
| 1 | Random Forest | 0.957 | 4.78 | 6.46 | Classical |
| 2 | LightGBM | 0.928 | 5.53 | 8.33 | Classical |
| 3 | Weighted Avg Ensemble | 0.886 | 3.89 | 6.47 | Ensemble |
| 4 | TFT | 0.881 | 3.93 | 6.62 | Transformer |
| 5 | Stacking Ensemble | 0.863 | 4.91 | 7.10 | Ensemble |
| 6 | XGBoost | 0.847 | 8.06 | 12.14 | Classical |
| 7 | SVR | 0.805 | 7.56 | 13.71 | Classical |
| 8 | VAE-LSTM | 0.730 | 7.82 | 9.98 | Generative |
65
+ ## Usage
66
+
67
+ These artifacts are automatically downloaded by the Space on startup via
68
+ `scripts/download_models.py`. You can also use them directly:
69
+
70
+ ```python
71
+ from huggingface_hub import snapshot_download
72
+
73
+ local = snapshot_download(
74
+ repo_id="NeerajCodz/aiBatteryLifeCycle",
75
+ repo_type="model",
76
+ local_dir="artifacts",
77
+ token="<your-token>", # only needed if private
78
+ )
79
+ ```
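
Once the snapshot is in place, the classical artifacts are ordinary joblib files. The sketch below shows the load-and-predict flow; it trains and saves a small stand-in scaler and model rather than reading the repository's actual files, whose exact names under `artifacts/` are not reproduced here:

```python
import joblib
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.preprocessing import MinMaxScaler

# Stand-in artifacts saved the same way as this repo's .joblib files
# (the real ones live under e.g. artifacts/v2/models/classical/).
rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 6)), rng.normal(size=200)
scaler = MinMaxScaler().fit(X)
model = RandomForestRegressor(n_estimators=20, random_state=0).fit(
    scaler.transform(X), y
)
joblib.dump(scaler, "scaler_demo.joblib")
joblib.dump(model, "rf_demo.joblib")

# Loading mirrors how a downloaded scaler + model pair would be used:
# scale the raw features first, then predict.
scaler = joblib.load("scaler_demo.joblib")
model = joblib.load("rf_demo.joblib")
soh_pred = model.predict(scaler.transform(X[:3]))
print(soh_pred.shape)  # (3,)
```

Pairing each model with the scaler it was trained with matters here, since the v2 layout stores per-model feature scalers.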

## Framework

- **Classical models:** scikit-learn / XGBoost / LightGBM `.joblib` files
- **Deep models (PyTorch):** `.pt` state-dicts (CPU weights)
- **Deep models (Keras):** native `.keras` format
- **Scalers:** scikit-learn `.joblib` files

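Because the PyTorch files are state-dicts rather than pickled modules, the matching architecture must be instantiated before loading. A minimal sketch with a hypothetical `SOHLSTM` class (the repository's actual model classes live in the Space's code, so the layer sizes below are illustrative):

```python
import torch
import torch.nn as nn

# Stand-in architecture; a state-dict can only be loaded into a module
# whose parameter names and shapes match the saved weights.
class SOHLSTM(nn.Module):
    def __init__(self, n_features=8, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)
        return self.head(out[:, -1])

# Mimic a saved .pt artifact, then restore it.
torch.save(SOHLSTM().state_dict(), "lstm_demo.pt")

restored = SOHLSTM()
# map_location="cpu" matches the CPU-only weights noted above.
restored.load_state_dict(torch.load("lstm_demo.pt", map_location="cpu"))
restored.eval()
with torch.no_grad():
    y = restored(torch.zeros(2, 10, 8))  # (batch, timesteps, features)
print(y.shape)  # torch.Size([2, 1])
```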
## Citation

```bibtex
@misc{aiBatteryLifeCycle2025,
  author = {Neeraj},
  title  = {AI Battery Lifecycle — SOH/RUL Prediction},
  year   = {2025},
  url    = {https://huggingface.co/spaces/NeerajCodz/aiBatteryLifeCycle}
}
```