Ridge multi-horizon CGM forecaster (MetaboNet)

A scikit-learn-trained Ridge regressor (with a StandardScaler) repackaged as a transformers-compatible Hub model. One repo holds four feature ablations, selectable at load time:

  • cgm – 24 CGM lags + hour_sin/hour_cos (26 features).
  • insulin – cgm features + 24 Insulin lags (50 features).
  • carbs – cgm features + 24 Carbs lags (50 features).
  • all – cgm features + 24 Insulin lags + 24 Carbs lags (74 features).

History length is 24 (= 2 hours at 5-minute sampling). Output is 12 future CGM values (5–60 min horizons).
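The feature counts above follow directly from this layout. A minimal sketch of how an input row could be assembled for the "all" ablation (the helper name and the exact ordering of the blocks are assumptions here; the authoritative feature order is the per-ablation list in config.json):

```python
import numpy as np

HISTORY = 24   # input steps (2 hours at 5-minute sampling)
HORIZONS = 12  # output steps (5-60 min ahead)

def build_features(cgm, insulin, carbs, hour):
    """Hypothetical assembly of the 74-feature 'all' vector:
    24 CGM lags, hour_sin/hour_cos, 24 Insulin lags, 24 Carbs lags.
    Block order is an assumption; config.json defines the real order."""
    time_feats = [np.sin(2 * np.pi * hour / 24.0),
                  np.cos(2 * np.pi * hour / 24.0)]
    return np.concatenate([cgm[-HISTORY:], time_feats,
                           insulin[-HISTORY:], carbs[-HISTORY:]])

x = build_features(np.zeros(30), np.zeros(30), np.zeros(30), hour=8.0)
# 24 + 2 + 24 + 24 = 74 features, matching the 'all' ablation
```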

Files

  • config.json – auto_map wiring + per-ablation feature lists.
  • model.py – RidgeMultiHorizonConfig / RidgeMultiHorizonModel (trust_remote_code=True).
  • model_<ablation>.safetensors – one per ablation, holding scaler_mean, scaler_scale, coef (12 × F), intercept (12).
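Each checkpoint is just a standardized linear model, so the four tensors combine in one line of algebra: y = coef @ ((x - scaler_mean) / scaler_scale) + intercept. A sketch with random stand-in values (the real weights live in the .safetensors files, so shapes here are illustrative for the cgm ablation's F = 26):

```python
import numpy as np

F = 26  # feature count for the 'cgm' ablation
rng = np.random.default_rng(0)

# Stand-ins for the checkpoint tensors (random, NOT the trained weights):
scaler_mean = rng.normal(size=F)                     # (F,)
scaler_scale = np.abs(rng.normal(size=F)) + 1e-6     # (F,)
coef = rng.normal(size=(12, F))                      # (12, F)
intercept = rng.normal(size=12)                      # (12,)

def ridge_predict(x):
    """StandardScaler followed by the linear head: one prediction per horizon."""
    z = (x - scaler_mean) / scaler_scale
    return coef @ z + intercept

y = ridge_predict(rng.normal(size=F))  # -> 12 future CGM values
```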

Usage

from transformers import AutoConfig, AutoModel

cfg = AutoConfig.from_pretrained(
    "anonymous-4FAD/Ridge", trust_remote_code=True, ablation="cgm"
)
model = AutoModel.from_pretrained(
    "anonymous-4FAD/Ridge", trust_remote_code=True, config=cfg
)

# Inputs match the MetaboNet benchmark.py contract:
#   timestamps: int64 ns, shape (B, T_in)
#   cgm/insulin/carbs: float, shape (B, T_in); only the last 24 steps are used
preds = model.predict(timestamps, cgm, insulin, carbs)  # -> (B, 12)

The thin local wrapper in models/ridge.py exposes the same API used by benchmark.py.

Feature convention

CGM_t<i> denotes the i-th sample within the last history_length steps, ordered oldest -> newest (CGM_t0 is the oldest of the 24, CGM_t23 is the newest). The same convention applies to Insulin_t<i> and Carbs_t<i>. hour_sin / hour_cos are derived from the most recent input timestamp.
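One way the time features could be derived from the int64-ns timestamps, assuming whole-hour UTC granularity (a sketch of the stated convention; the phase and granularity actually used in training are defined by the repo's feature code):

```python
import numpy as np

def hour_features(timestamps_ns):
    """hour_sin/hour_cos from the most recent int64-ns timestamp.
    Assumes whole UTC hours; the repo's own feature code is authoritative."""
    last_ns = np.asarray(timestamps_ns)[..., -1]
    hour = (last_ns // 10**9 // 3600) % 24  # hour of day, UTC
    angle = 2 * np.pi * hour / 24.0
    return np.sin(angle), np.cos(angle)

ts = np.arange(24, dtype=np.int64) * 300 * 10**9  # 24 five-minute steps
s, c = hour_features(ts)  # sin/cos encoding of the last timestamp's hour
```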

Provenance

Trained via other_models/results/train_ridge.py on the public MetaboNet train split. The safetensors checkpoints are produced by scripts/build_other_models_hub.py from the original sklearn pickles.
