---
license: apache-2.0
library_name: transformers
pipeline_tag: time-series-forecasting
tags:
- transformers
- timesfm
- timesfm_2p5
- time-series-forecasting
- arxiv:2310.10688
---
# TimesFM 2.5 (Transformers)
TimesFM (Time Series Foundation Model) is a pretrained decoder-only model for time-series forecasting. This repository contains the Transformers port of the official TimesFM 2.5 PyTorch release.
Resources and Technical Documentation:
- Original model: [google/timesfm-2.5-200m-pytorch](https://huggingface.co/google/timesfm-2.5-200m-pytorch)
- Transformers model: [google/timesfm-2.5-200m-transformers](https://huggingface.co/google/timesfm-2.5-200m-transformers)
- Paper: [A decoder-only foundation model for time-series forecasting](https://arxiv.org/abs/2310.10688)
- Transformers docs: TimesFM 2.5
## Model description
This model is converted from the official TimesFM 2.5 PyTorch checkpoint and integrated into `transformers` as `Timesfm2P5ModelForPrediction`.
The converted checkpoint preserves the original architecture and forecasting behavior, including:
- patch-based inputs for time-series contexts
- decoder-only self-attention stack
- point and quantile forecasts
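To make the "patch-based inputs" point concrete, here is a minimal sketch of how a 1-D context can be split into fixed-length patches before being fed to a decoder stack. This is illustrative only, not the library's internal code; the patch length of 32 is an assumed value for the example.

```python
import torch

# Assumed patch length for illustration; the real model defines its own.
PATCH_LEN = 32

# Toy context of length 160.
context = torch.sin(torch.linspace(0, 20, 160))

# Left-pad with zeros to a multiple of the patch length, then reshape so
# each row is one patch the decoder attends over.
pad = (-context.shape[0]) % PATCH_LEN
padded = torch.cat([torch.zeros(pad), context])
patches = padded.reshape(-1, PATCH_LEN)

print(tuple(patches.shape))  # 160 / 32 patches -> (5, 32)
```

In the actual model, each patch is additionally projected into the decoder's hidden dimension; the reshape above only shows the tokenization step.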
## Usage (Transformers)
```python
import torch

from transformers import Timesfm2P5ModelForPrediction

model = Timesfm2P5ModelForPrediction.from_pretrained(
    "google/timesfm-2.5-200m-transformers", attn_implementation="sdpa"
)
model = model.to(torch.float32).eval()

past_values = [
    torch.linspace(0, 1, 100),
    torch.sin(torch.linspace(0, 20, 67)),
]

with torch.no_grad():
    outputs = model(past_values=past_values, forecast_context_len=1024)

print(outputs.mean_predictions.shape)
print(outputs.full_predictions.shape)
```
## Conversion details
This checkpoint was produced with:

- script: `src/transformers/models/timesfm_2p5/convert_timesfm_2p5_original_to_hf.py`
- source checkpoint: `google/timesfm-2.5-200m-pytorch`
- conversion date (UTC): 2026-02-20
Weight conversion parity is verified by comparing converted-model forecasts against the official implementation outputs on deterministic inputs.
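The parity check described above can be sketched as follows. This is a hedged illustration of the approach, not the actual verification script: dummy tensors stand in for the forecasts of the official and converted models, and the tolerance is an assumed value.

```python
import torch

# Deterministic "reference" forecasts (standing in for the official
# implementation's outputs on a fixed input).
torch.manual_seed(0)
reference = torch.randn(2, 64)

# Forecasts from the converted model on the same input; identical here
# because this is a sketch of the comparison, not the real conversion.
converted = reference.clone()

# Parity holds when the two outputs agree within a small tolerance.
parity = torch.allclose(reference, converted, atol=1e-5)
print(parity)  # True
```

In practice the comparison is run on real model outputs; element-wise closeness within a small tolerance (rather than exact equality) accounts for floating-point differences between implementations.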
## Citation
```bibtex
@inproceedings{das2024a,
  title={A decoder-only foundation model for time-series forecasting},
  author={Abhimanyu Das and Weihao Kong and Rajat Sen and Yichen Zhou},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024},
  url={https://openreview.net/forum?id=jn2iTJas6h}
}
```