rajatsen91 and kashif (HF Staff) committed dcd4b67 · 1 parent: 76599c5

Update README.md (#1)

- Update README.md (3fe80c76086e8fd8c1a4ea63df2bdca94be7dd04)

Co-authored-by: Kashif Rasul <kashif@users.noreply.huggingface.co>

Files changed (1): README.md (+72 −3)
---
license: apache-2.0
library_name: transformers
pipeline_tag: time-series-forecasting
tags:
- transformers
- timesfm
- timesfm_2p5
- time-series-forecasting
- arxiv:2310.10688
---

# TimesFM 2.5 (Transformers)

TimesFM (Time Series Foundation Model) is a pretrained decoder-only model for time-series forecasting. This repository contains the **Transformers** port of the official TimesFM 2.5 PyTorch release.

**Resources and Technical Documentation**:
* Original model: [google/timesfm-2.5-200m-pytorch](https://huggingface.co/google/timesfm-2.5-200m-pytorch)
* Transformers model: [google/timesfm-2.5-200m-transformers](https://huggingface.co/google/timesfm-2.5-200m-transformers)
* Paper: [A decoder-only foundation model for time-series forecasting](https://huggingface.co/papers/2310.10688)
* Transformers docs: [TimesFM 2.5](https://huggingface.co/docs/transformers/main/en/model_doc/timesfm_2p5)

## Model description

This model is converted from the official TimesFM 2.5 PyTorch checkpoint and integrated into `transformers` as `Timesfm2P5ModelForPrediction`.

The converted checkpoint preserves the original architecture and forecasting behavior, including:
* patch-based inputs for time-series contexts
* decoder-only self-attention stack
* point and quantile forecasts

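As a rough illustration of the patch-based input idea, the sketch below splits a 1-D context into fixed-length, non-overlapping patches. The patch length of 32 and the zero left-padding are illustrative assumptions here, not the model's actual configuration:

```python
import torch

def to_patches(series: torch.Tensor, patch_len: int) -> torch.Tensor:
    """Left-pad a 1-D series to a multiple of patch_len, then reshape
    it into (num_patches, patch_len) non-overlapping patches."""
    pad = (-series.numel()) % patch_len
    padded = torch.cat([torch.zeros(pad), series])
    return padded.reshape(-1, patch_len)

# A 100-step context becomes 4 patches of length 32 (28 padding zeros).
patches = to_patches(torch.linspace(0, 1, 100), patch_len=32)
print(patches.shape)  # torch.Size([4, 32])
```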
## Usage (Transformers)

```python
import torch
from transformers import Timesfm2P5ModelForPrediction

model = Timesfm2P5ModelForPrediction.from_pretrained(
    "google/timesfm-2.5-200m-transformers", attn_implementation="sdpa"
)
model = model.to(torch.float32).eval()

past_values = [
    torch.linspace(0, 1, 100),
    torch.sin(torch.linspace(0, 20, 67)),
]

with torch.no_grad():
    outputs = model(past_values=past_values, forecast_context_len=1024)

print(outputs.mean_predictions.shape)
print(outputs.full_predictions.shape)
```

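The exact layout of `full_predictions` is described in the Transformers docs linked above. As a hedged sketch, the snippet below assumes a `(batch, horizon, 1 + num_quantiles)` layout with the point forecast in channel 0 followed by decile quantiles, and uses a random stand-in tensor in place of real model output:

```python
import torch

# Stand-in for outputs.full_predictions; the assumed layout is
# (batch, horizon, 1 + num_quantiles), mean in channel 0 followed
# by the q10..q90 deciles -- verify against the model docs.
full = torch.randn(2, 64, 10)

mean_forecast = full[..., 0]  # point forecast, shape (batch, horizon)
q10, q50, q90 = full[..., 1], full[..., 5], full[..., 9]  # quantile bands
print(mean_forecast.shape, q10.shape)
```

Slicing like this makes it easy to plot a central forecast with an uncertainty band (e.g. q10–q90) around it.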
53
+ ## Conversion details
54
+
55
+ This checkpoint was produced with:
56
+ * script: `src/transformers/models/timesfm_2p5/convert_timesfm_2p5_original_to_hf.py`
57
+ * source checkpoint: `google/timesfm-2.5-200m-pytorch`
58
+ * conversion date (UTC): `2026-02-20`
59
+
60
+ Weight conversion parity is verified by comparing converted-model forecasts against the official implementation outputs on deterministic inputs.
61
+
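A parity check of this kind can be sketched as follows; `reference` and `converted` are stand-ins for forecasts produced by the official and converted models on the same deterministic inputs, and the tolerances are illustrative:

```python
import torch

torch.manual_seed(0)  # deterministic "inputs"

# Stand-ins for the official and converted model forecasts.
reference = torch.randn(2, 64)
converted = reference + 1e-7 * torch.randn(2, 64)

# Elementwise agreement within float32-level tolerances.
assert torch.allclose(reference, converted, atol=1e-5, rtol=1e-5)
max_abs_err = (reference - converted).abs().max().item()
print(f"max abs error: {max_abs_err:.2e}")
```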
## Citation

```bibtex
@inproceedings{das2024a,
  title={A decoder-only foundation model for time-series forecasting},
  author={Abhimanyu Das and Weihao Kong and Rajat Sen and Yichen Zhou},
  booktitle={Forty-first International Conference on Machine Learning},
  year={2024},
  url={https://openreview.net/forum?id=jn2iTJas6h}
}
```