---
license: apache-2.0
tags:
- time series
- forecasting
- pretrained models
- foundation models
- time series foundation models
- time-series
---

# Chronos-T5 Mini

Chronos models are pre-trained **time series forecasting models** based on language model architectures.
A time series is transformed into a sequence of tokens via scaling and quantization, and forecasts are obtained by sampling multiple sequences of future observations given the historical context.
Chronos models are trained on a large corpus of publicly available time series data, as well as on synthetic data.
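
To make the tokenization idea concrete, here is a minimal sketch of mean scaling followed by uniform quantization into a 4096-token vocabulary. This is illustrative only, not the exact scheme used by Chronos (the real pipeline's scaling and bin placement may differ); the `tokenize` function and its parameters are hypothetical.

```python
import numpy as np

def tokenize(series: np.ndarray, n_bins: int = 4096, clip: float = 15.0) -> np.ndarray:
    """Map a real-valued series to integer tokens via mean scaling
    and uniform quantization (illustrative only)."""
    scale = np.mean(np.abs(series)) or 1.0
    scaled = series / scale
    # uniform bin edges over [-clip, clip]; values outside fall in the end bins
    edges = np.linspace(-clip, clip, n_bins - 1)
    return np.digitize(scaled, edges)

series = np.array([112.0, 118.0, 132.0, 129.0, 121.0])
tokens = tokenize(series)
print(tokens)  # five integers in [0, 4095]
```

Once the series is a token sequence, forecasting reduces to next-token prediction, and sampling the model repeatedly yields a distribution over future trajectories.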

For details on Chronos models, training data and procedures, and experimental results, refer to the paper [Chronos: Learning the Language of Time Series](https://www.example.com/).

## Architecture

The model in this repository is based on the [T5 architecture](https://arxiv.org/abs/1910.10683).
The only difference is the vocabulary size: Chronos-T5 uses 4096 tokens, compared to the 32128 of the original T5 models, resulting in fewer total parameters.

| Model | Parameters | Based on |
|-------|------------|----------|
| [chronos-t5-mini](https://huggingface.co/amazon/chronos-t5-mini) | 20M | [t5-efficient-mini](https://huggingface.co/google/t5-efficient-mini) |
| [chronos-t5-small](https://huggingface.co/amazon/chronos-t5-small) | 46M | [t5-efficient-small](https://huggingface.co/google/t5-efficient-small) |
| [chronos-t5-base](https://huggingface.co/amazon/chronos-t5-base) | 200M | [t5-efficient-base](https://huggingface.co/google/t5-efficient-base) |
| [chronos-t5-large](https://huggingface.co/amazon/chronos-t5-large) | 710M | [t5-efficient-large](https://huggingface.co/google/t5-efficient-large) |

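The reduced vocabulary mainly shrinks the token embedding matrix. As a rough back-of-the-envelope check (assuming a hidden size of 384 for t5-efficient-mini and a tied input/output embedding, both assumptions for illustration):

```python
# rough parameter savings from shrinking the vocabulary (illustrative)
d_model = 384                          # assumed hidden size of t5-efficient-mini
t5_vocab, chronos_vocab = 32128, 4096

saved = (t5_vocab - chronos_vocab) * d_model  # one tied embedding matrix
print(f"{saved / 1e6:.1f}M fewer parameters")  # 10.8M fewer parameters
```

Under these assumptions, saving roughly 11M embedding parameters is consistent with chronos-t5-mini landing at about 20M parameters relative to its base model.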
## Usage

To perform inference with Chronos models, install the package from the [companion GitHub repo](https://www.example.com/):

```bash
pip install git+https://github.com/amazon-science/chronos-forecasting.git
```

A minimal example:

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-mini")

df = pd.read_csv(
    "https://raw.githubusercontent.com/AileenNielsen/"
    "TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv",
    index_col=0,
    parse_dates=True,
)

# context is a 1D tensor of historical observations
context = torch.tensor(df["#Passengers"].values)
# forecast has shape [num_series, num_samples, prediction_length]
forecast = pipeline.predict(context, prediction_length=12)

# summarize the sample paths with quantiles
forecast_steps = range(len(df), len(df) + 12)
forecast_np = forecast.numpy()[0].T  # [prediction_length, num_samples]
low = np.quantile(forecast_np, 0.1, axis=1)
median = np.quantile(forecast_np, 0.5, axis=1)
high = np.quantile(forecast_np, 0.9, axis=1)

plt.plot(range(len(df)), df["#Passengers"], color="royalblue", label="historical data")
plt.plot(forecast_steps, forecast_np, color="grey", alpha=0.1)  # individual sample paths
plt.fill_between(forecast_steps, low, high, color="tomato", alpha=0.4, label="80% prediction interval")
plt.plot(forecast_steps, median, color="tomato", label="median forecast")
plt.legend()
plt.grid()
plt.show()
```
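
Because `predict` returns sample paths rather than point forecasts, any quantile or interval statistic can be computed directly from the samples. A small self-contained sketch, using synthetic random samples as a stand-in for real model output (only the array shapes matter here):

```python
import numpy as np

# synthetic stand-in for pipeline.predict output: [num_series, num_samples, prediction_length]
rng = np.random.default_rng(0)
forecast = rng.normal(loc=100.0, scale=5.0, size=(1, 20, 12))

samples = forecast[0]  # [num_samples, prediction_length]
low, median, high = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)

# width of the empirical 80% interval at each forecast step
width = high - low
print(median.shape, width.shape)  # (12,) (12,)
```

The same pattern gives any other quantile level; changing `[0.1, 0.5, 0.9]` to, say, `[0.025, 0.5, 0.975]` yields a 95% interval instead.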

## References

If you find Chronos models useful for your research, please consider citing the associated [paper](https://www.example.com/):

```
paper citation
```