---
license: mit
language:
  - en
tags:
  - battery
  - state-of-health
  - remaining-useful-life
  - time-series
  - regression
  - lstm
  - transformer
  - xgboost
  - lightgbm
  - random-forest
  - ensemble
datasets:
  - NASA-PCoE-Battery
metrics:
  - r2
  - mae
  - rmse
pipeline_tag: tabular-regression
---

# AI Battery Lifecycle – Model Repository

Trained model artifacts for the [aiBatteryLifeCycle](https://huggingface.co/spaces/NeerajCodz/aiBatteryLifeCycle) project.

Models for SOH (State-of-Health) and RUL (Remaining Useful Life) prediction of
lithium-ion batteries, trained on the NASA PCoE Battery Dataset.

## Repository Layout

```
artifacts/
├── v1/
│   ├── models/
│   │   ├── classical/   # Ridge, Lasso, ElasticNet, KNN ×3, SVR, XGBoost, LightGBM, RF
│   │   └── deep/        # Vanilla LSTM, Bi-LSTM, GRU, Attention-LSTM, TFT,
│   │                    # BatteryGPT, iTransformer, Physics-iTransformer,
│   │                    # DG-iTransformer, VAE-LSTM
│   └── scalers/         # MinMax, Standard, Linear, Sequence scalers
└── v2/
    ├── models/
    │   ├── classical/   # Same family + Extra Trees, Gradient Boosting, best_rul_model
    │   └── deep/        # Same deep models re-trained on v2 feature set
    ├── scalers/         # Per-model feature scalers
    └── results/         # Validation JSONs
```

## Model Performance Summary

| Rank | Model | R² | MAE | RMSE | Family |
|------|-------|----|-----|------|--------|
| 1 | Random Forest | 0.957 | 4.78 | 6.46 | Classical |
| 2 | LightGBM | 0.928 | 5.53 | 8.33 | Classical |
| 3 | Weighted Avg Ensemble | 0.886 | 3.89 | 6.47 | Ensemble |
| 4 | TFT | 0.881 | 3.93 | 6.62 | Transformer |
| 5 | Stacking Ensemble | 0.863 | 4.91 | 7.10 | Ensemble |
| 6 | XGBoost | 0.847 | 8.06 | 12.14 | Classical |
| 7 | SVR | 0.805 | 7.56 | 13.71 | Classical |
| 8 | VAE-LSTM | 0.730 | 7.82 | 9.98 | Generative |
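
The three columns are the standard regression scores. A minimal sketch of how they are computed with scikit-learn (the arrays below are illustrative, not data from this repo):

```python
import numpy as np
from sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error

# Illustrative SOH targets and predictions (percent capacity) -- not repo data.
y_true = np.array([95.0, 90.0, 85.0, 80.0, 75.0])
y_pred = np.array([94.0, 91.0, 84.0, 81.0, 74.0])

r2 = r2_score(y_true, y_pred)                       # coefficient of determination
mae = mean_absolute_error(y_true, y_pred)           # mean absolute error
rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # root mean squared error
print(f"R2={r2:.3f}  MAE={mae:.2f}  RMSE={rmse:.2f}")  # R2=0.980  MAE=1.00  RMSE=1.00
```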

## Usage

These artifacts are automatically downloaded by the Space on startup via
`scripts/download_models.py`. You can also use them directly:

```python
from huggingface_hub import snapshot_download

local = snapshot_download(
    repo_id="NeerajCodz/aiBatteryLifeCycle",
    repo_type="model",
    local_dir="artifacts",
    token="<your-token>",   # only needed if private
)
```
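
Once downloaded, the classical models load with `joblib`. A minimal sketch of the round-trip — it fits a stand-in Random Forest rather than reading a real artifact, since the exact file names under `artifacts/` vary by version:

```python
import tempfile
from pathlib import Path

import joblib
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Stand-in for an artifact such as artifacts/v2/models/classical/<model>.joblib
# (path is illustrative). Train a tiny model on synthetic features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

path = Path(tempfile.mkdtemp()) / "rf_soh.joblib"
joblib.dump(model, path)       # same serialisation the repo ships

loaded = joblib.load(path)     # how a downloaded .joblib artifact is loaded
pred = loaded.predict(X[:3])   # shape (3,)
```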

## Framework

- **Classical models:** scikit-learn / XGBoost / LightGBM `.joblib`
- **Deep models (PyTorch):** `.pt` state-dicts (CPU weights)
- **Deep models (Keras):** `.keras` native Keras format
- **Scalers:** scikit-learn `.joblib`
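
The scalers are plain scikit-learn transformers serialised the same way; predictions made in scaled space need the matching `inverse_transform` to return to physical units. A minimal sketch with a stand-in `MinMaxScaler` (values are illustrative):

```python
import tempfile
from pathlib import Path

import joblib
import numpy as np
from sklearn.preprocessing import MinMaxScaler

capacity = np.array([[2.0], [1.8], [1.6], [1.4]])  # illustrative capacities (Ah)

scaler = MinMaxScaler().fit(capacity)
path = Path(tempfile.mkdtemp()) / "scaler.joblib"
joblib.dump(scaler, path)                   # same format as artifacts/*/scalers/

loaded = joblib.load(path)
scaled = loaded.transform(capacity)         # model-space inputs in [0, 1]
restored = loaded.inverse_transform(scaled) # back to physical units
```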

## Citation

```bibtex
@misc{aiBatteryLifeCycle2025,
  author  = {Neeraj},
  title   = {AI Battery Lifecycle -- SOH/RUL Prediction},
  year    = {2025},
  url     = {https://huggingface.co/spaces/NeerajCodz/aiBatteryLifeCycle}
}
```