---
license: apache-2.0
tags:
  - time-series
  - forecasting
  - foundation-model
  - zero-shot
pipeline_tag: time-series-forecasting
library_name: transformers
---

# Kairos-10M: Adaptive Time Series Foundation Model

This model is presented in the paper *Kairos: Toward Adaptive and Parameter-Efficient Time Series Foundation Models*.

Preprint · GitHub · Project Page

## Model Description

Kairos-10M is a 10-million-parameter time series foundation model designed for zero-shot forecasting across diverse domains. It features a dynamic patching tokenizer, mixture-of-size encoding, and Dynamic Rotary Position Embedding (DRoPE) to handle heterogeneous time series data with varying information density.
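The intuition behind dynamic patching can be shown with a toy sketch. This is **not** Kairos's learned tokenizer — just an illustrative heuristic under an assumed rule: volatile regions of the series get small patches (fine-grained tokens), smooth regions get large ones.

```python
import torch

def toy_dynamic_patches(x, small=8, large=32, threshold=0.5):
    """Toy variable-size patching driven by local variability.
    Illustrative only: Kairos learns its tokenization; here we
    simply use the rolling standard deviation as a proxy for
    local information density."""
    patches, i = [], 0
    n = x.numel()
    while i < n:
        window = x[i:i + large]
        size = small if window.std() > threshold else large
        patches.append(x[i:i + size])
        i += size
    return patches

torch.manual_seed(0)
# First half smooth (constant), second half noisy
x = torch.cat([torch.zeros(64), torch.randn(64)])
patches = toy_dynamic_patches(x)
print([p.numel() for p in patches])  # large patches early, small ones later
```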

### Key Features

- 🔀 **Mixture-of-Size Encoder**: adaptively selects tokenization granularity based on local information density
- 🔄 **Dynamic Rotary Position Embedding (DRoPE)**: tailors positional encodings to instance-level spectral features and the dynamic patching tokenization
- 📊 **Zero-shot Forecasting**: strong generalization across domains without fine-tuning
- ⚡ **Parameter-Efficient**: competitive performance with only ~10M parameters
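DRoPE builds on rotary position embeddings. Below is a minimal sketch of *standard* RoPE applied to a sequence of token embeddings; the instance-adaptive frequency selection that distinguishes DRoPE is not reproduced here.

```python
import torch

def rope(x, base=10000.0):
    """Standard rotary position embedding on a (seq_len, dim) tensor
    (dim must be even). Each dimension pair is rotated by an angle
    proportional to its position, with per-pair frequencies decaying
    geometrically. DRoPE adapts these frequencies per instance; this
    sketch uses the fixed RoPE schedule."""
    seq_len, dim = x.shape
    half = dim // 2
    freqs = base ** (-torch.arange(half, dtype=torch.float32) / half)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[:, :half], x[:, half:]
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)

torch.manual_seed(0)
tokens = torch.randn(16, 64)
rotated = rope(tokens)
print(rotated.shape)  # torch.Size([16, 64])
```

Because each pair of dimensions is only rotated, the embedding norms are preserved — position is injected purely through phase.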

## Model Specifications

- **Parameters**: ~10 million
- **Training Data**: PreSTS corpus (300+ billion time points)
- **Architecture**: Transformer-based with adaptive components


## Usage

```python
import torch
from tsfm.model.kairos import AutoModel

# Load the pretrained model from the Hugging Face Hub
model = AutoModel.from_pretrained(
    "mldi-lab/Kairos_10m", trust_remote_code=True
)

# Forecasting configuration
batch_size, context_length, prediction_length = 1, 2048, 96
seqs = torch.randn(batch_size, context_length)  # replace with your own series

# Generate the forecast
forecast = model(
    past_target=seqs.float(),
    prediction_length=prediction_length,
    generation=True,
    preserve_positivity=True,
    average_with_flipped_input=True,
)

# Extract the prediction tensor
forecast = forecast["prediction_outputs"]
print(forecast.shape)
```
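Once you have the prediction tensor, a quick accuracy check against held-out ground truth is straightforward. The snippet below is a self-contained sketch using dummy tensors in place of the model output and real data; the metric names and shapes are assumptions, not part of the Kairos API.

```python
import torch

torch.manual_seed(0)
# Dummy stand-ins: in practice, use forecast["prediction_outputs"]
# and the held-out continuation of your series.
prediction = torch.randn(1, 96)
ground_truth = torch.randn(1, 96)

# Mean absolute error, and MASE scaled by the naive one-step forecast
mae = (prediction - ground_truth).abs().mean()
naive_scale = ground_truth.diff(dim=-1).abs().mean()
mase = mae / naive_scale
print(f"MAE: {mae:.4f}  MASE: {mase:.4f}")
```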

For detailed usage examples, please refer to the main repository.

## Citation

If you use this model, please cite:

```bibtex
@article{feng2025kairos,
  title={Kairos: Toward Adaptive and Parameter-Efficient Time Series Foundation Models},
  author={Feng, Kun and Lan, Shaocheng and Fang, Yuchen and He, Wenchao and Ma, Lintao and Lu, Xingyu and Ren, Kan},
  journal={arXiv preprint arXiv:2509.25826},
  year={2025}
}
```

## License

Apache License 2.0