LisaMegaWatts /

Time Series Forecasting · Chronos · Safetensors · t5 · time series · forecasting · foundation models · pretrained models

Commit 286be20 (0 parents) by LisaMegaWatts, committed with shchuro: Duplicate from amazon/chronos-2
Co-authored-by: Oleksandr Shchur <shchuro@users.noreply.huggingface.co>

Files changed (4):
  1. .gitattributes (+35 −0)
  2. README.md (+155 −0)
  3. config.json (+58 −0)
  4. model.safetensors (+3 −0)
.gitattributes ADDED
@@ -0,0 +1,35 @@
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,155 @@
---
license: apache-2.0
model_id: chronos-2
tags:
- time series
- forecasting
- foundation models
- pretrained models
- safetensors
paper:
- https://arxiv.org/abs/2510.15821
datasets:
- autogluon/chronos_datasets
- Salesforce/GiftEvalPretrain
leaderboards:
- Salesforce/GIFT-Eval
- autogluon/fev-leaderboard
pipeline_tag: time-series-forecasting
library_name: chronos-forecasting
---

# Chronos-2

**Update Dec 30, 2025:** ☁️ Deploy Chronos-2 on Amazon SageMaker. [New guide](https://github.com/amazon-science/chronos-forecasting/blob/main/notebooks/deploy-chronos-to-amazon-sagemaker.ipynb) covers real-time GPU and CPU inference, serverless endpoints (run on demand, no idle costs), and batch transform for large-scale forecasting.

**Chronos-2** is a 120M-parameter, encoder-only time series foundation model for zero-shot forecasting.
It supports **univariate**, **multivariate**, and **covariate-informed** tasks within a single architecture.
Inspired by the T5 encoder, Chronos-2 produces multi-step-ahead quantile forecasts and uses a group attention mechanism for efficient in-context learning across related series and covariates.
Trained on a combination of real-world and large-scale synthetic datasets, it achieves **state-of-the-art zero-shot accuracy** among public models on [**fev-bench**](https://huggingface.co/spaces/autogluon/fev-leaderboard), [**GIFT-Eval**](https://huggingface.co/spaces/Salesforce/GIFT-Eval), and [**Chronos Benchmark II**](https://arxiv.org/abs/2403.07815).
Chronos-2 is also **highly efficient**, delivering over 300 time series forecasts per second on a single A10G GPU and supporting both **GPU and CPU inference**.

## Links
- 🚀 [Deploy Chronos-2 on Amazon SageMaker](https://github.com/amazon-science/chronos-forecasting/blob/main/notebooks/deploy-chronos-to-amazon-sagemaker.ipynb)
- 📄 [Technical report](https://arxiv.org/abs/2510.15821v1)
- 💻 [GitHub](https://github.com/amazon-science/chronos-forecasting)
- 📘 [Example notebook](https://github.com/amazon-science/chronos-forecasting/blob/main/notebooks/chronos-2-quickstart.ipynb)
- 📰 [Amazon Science Blog](https://www.amazon.science/blog/introducing-chronos-2-from-univariate-to-universal-forecasting)


## Overview

| Capability | Chronos-2 | Chronos-Bolt | Chronos |
|------------|-----------|--------------|---------|
| Univariate Forecasting | ✅ | ✅ | ✅ |
| Cross-learning across items | ✅ | ❌ | ❌ |
| Multivariate Forecasting | ✅ | ❌ | ❌ |
| Past-only (real/categorical) covariates | ✅ | ❌ | ❌ |
| Known future (real/categorical) covariates | ✅ | 🧩 | 🧩 |
| Max. Context Length | 8192 | 2048 | 512 |
| Max. Prediction Length | 1024 | 64 | 64 |

🧩 Chronos & Chronos-Bolt do not natively support future covariates, but they can be combined with external covariate regressors (see the [AutoGluon tutorial](https://auto.gluon.ai/1.4.0/tutorials/timeseries/forecasting-chronos.html#incorporating-the-covariates)). This only models per-timestep effects, not effects across time. In contrast, Chronos-2 supports all covariate types natively.


## Usage

### Local usage

For experimentation and local inference, you can use the [inference package](https://github.com/amazon-science/chronos-forecasting).

Install the package:

```
pip install "chronos-forecasting>=2.0"
```

Make zero-shot predictions using the `pandas` API:

```python
import pandas as pd  # requires: pip install 'pandas[pyarrow]'
from chronos import Chronos2Pipeline

pipeline = Chronos2Pipeline.from_pretrained("amazon/chronos-2", device_map="cuda")

# Load historical target values and past values of covariates
context_df = pd.read_parquet("https://autogluon.s3.amazonaws.com/datasets/timeseries/electricity_price/train.parquet")

# (Optional) Load future values of covariates
test_df = pd.read_parquet("https://autogluon.s3.amazonaws.com/datasets/timeseries/electricity_price/test.parquet")
future_df = test_df.drop(columns="target")

# Generate predictions with covariates
pred_df = pipeline.predict_df(
    context_df,
    future_df=future_df,
    prediction_length=24,  # Number of steps to forecast
    quantile_levels=[0.1, 0.5, 0.9],  # Quantiles for probabilistic forecast
    id_column="id",  # Column identifying different time series
    timestamp_column="timestamp",  # Column with datetime information
    target="target",  # Column(s) with time series values to predict
)
```
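The `id_column`, `timestamp_column`, and `target` arguments above imply that the context frame is in long format: one row per (series, timestamp) pair. As a minimal sketch of building such a frame from scratch (the series ids and values below are made up for illustration):

```python
import pandas as pd

# Two hypothetical hourly series, stacked into one long-format frame
# with the column names expected by the `predict_df` call above.
timestamps = pd.date_range("2025-01-01", periods=48, freq="h")
context_df = pd.concat(
    [
        pd.DataFrame({"id": series_id, "timestamp": timestamps, "target": values})
        for series_id, values in {
            "store_1": range(48),
            "store_2": range(100, 148),
        }.items()
    ],
    ignore_index=True,
)

print(context_df.shape)           # (96, 3): 2 series x 48 observations
print(list(context_df.columns))   # ['id', 'timestamp', 'target']
```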

### Deploying a Chronos-2 endpoint to SageMaker

For production use, we recommend deploying Chronos-2 endpoints to Amazon SageMaker.

First, update the SageMaker SDK to make sure that all the latest models are available.

```
pip install -U sagemaker
```

Deploy an inference endpoint to SageMaker.

```python
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(
    model_id="pytorch-forecasting-chronos-2",
    instance_type="ml.g5.2xlarge",
)
predictor = model.deploy()
```

Now you can send time series data to the endpoint in JSON format.

```python
import pandas as pd

df = pd.read_csv("https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv")

payload = {
    "inputs": [
        {"target": df["#Passengers"].tolist()}
    ],
    "parameters": {
        "prediction_length": 12,
    },
}
forecast = predictor.predict(payload)["predictions"]
```
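Since the request body is plain JSON, it can be sanity-checked locally before any endpoint exists. A minimal sketch (the 24 target values below are made up for illustration):

```python
import json

# A toy payload in the same shape as the example above.
payload = {
    "inputs": [
        {"target": [float(i % 12) for i in range(24)]}
    ],
    "parameters": {
        "prediction_length": 12,
    },
}

# Round-trip through JSON to confirm the payload serializes cleanly.
decoded = json.loads(json.dumps(payload))
print(decoded["parameters"]["prediction_length"])  # 12
print(len(decoded["inputs"][0]["target"]))         # 24
```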

For more details about the endpoint API, check out the [example notebook](https://github.com/amazon-science/chronos-forecasting/blob/main/notebooks/deploy-chronos-to-amazon-sagemaker.ipynb).


## Training data
More details about the training data are available in the [technical report](https://arxiv.org/abs/2510.15821).

- Subset of [Chronos Datasets](https://huggingface.co/datasets/autogluon/chronos_datasets) (excluding the test portion of datasets that overlap with GIFT-Eval)
- Subset of [GIFT-Eval Pretrain](https://huggingface.co/datasets/Salesforce/GiftEvalPretrain)
- Synthetic univariate and multivariate data


## Citation

If you find Chronos-2 useful for your research, please consider citing the associated paper:

```
@article{ansari2025chronos2,
  title  = {Chronos-2: From Univariate to Universal Forecasting},
  author = {Abdul Fatir Ansari and Oleksandr Shchur and Jaris Küken and Andreas Auer and Boran Han and Pedro Mercado and Syama Sundar Rangapuram and Huibin Shen and Lorenzo Stella and Xiyuan Zhang and Mononito Goswami and Shubham Kapoor and Danielle C. Maddix and Pablo Guerron and Tony Hu and Junming Yin and Nick Erickson and Prateek Mutalik Desai and Hao Wang and Huzefa Rangwala and George Karypis and Yuyang Wang and Michael Bohlke-Schneider},
  year   = {2025},
  url    = {https://arxiv.org/abs/2510.15821}
}
```
config.json ADDED
@@ -0,0 +1,58 @@
{
  "architectures": [
    "Chronos2Model"
  ],
  "chronos_config": {
    "context_length": 8192,
    "input_patch_size": 16,
    "input_patch_stride": 16,
    "max_output_patches": 64,
    "output_patch_size": 16,
    "quantiles": [
      0.01, 0.05, 0.1, 0.15, 0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5,
      0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, 0.95, 0.99
    ],
    "time_encoding_scale": 8192,
    "use_arcsinh": true,
    "use_reg_token": true
  },
  "chronos_pipeline_class": "Chronos2Pipeline",
  "d_ff": 3072,
  "d_kv": 64,
  "d_model": 768,
  "dense_act_fn": "relu",
  "dropout_rate": 0.1,
  "feed_forward_proj": "relu",
  "initializer_factor": 0.05,
  "is_gated_act": false,
  "layer_norm_epsilon": 1e-06,
  "model_type": "t5",
  "num_heads": 12,
  "num_layers": 12,
  "pad_token_id": 0,
  "reg_token_id": 1,
  "rope_theta": 10000.0,
  "torch_dtype": "float32",
  "transformers_version": "4.49.0",
  "use_cache": true,
  "vocab_size": 2
}
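The forecasting limits quoted in the README's overview table line up with `chronos_config`. A small sketch, under the assumption that the maximum prediction length is the product of `max_output_patches` and `output_patch_size`:

```python
# Relevant values copied from the config above.
chronos_config = {
    "context_length": 8192,
    "output_patch_size": 16,
    "max_output_patches": 64,
}

# 64 output patches of 16 steps each -> a 1024-step maximum horizon,
# matching "Max. Prediction Length" in the README table.
max_prediction_length = (
    chronos_config["max_output_patches"] * chronos_config["output_patch_size"]
)
print(max_prediction_length)             # 1024
print(chronos_config["context_length"])  # 8192, "Max. Context Length"
```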
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ddcda3c7508bf2528087723e98a20707cc04b7f370ae275a9fd88078ddba4f42
size 477930472