---
pipeline_tag: time-series-forecasting
tags:
- time series
- time series foundation models
- time series forecasting
- zero-shot
---
# LightGTS: A Lightweight General Time Series Forecasting Model
🚩 **News (2025.06)** LightGTS has been accepted at **ICML 2025**.
## Introduction
<div style="text-align: center;">
<img src="framework.png" alt="LightGTS" style="zoom:80%;" />
</div>
## Quick Demos
```
pip install transformers==4.30.2 # Use this version for stable compatibility
```
### Zero-Shot
```
import torch
from transformers import AutoConfig, AutoModelForCausalLM

from configuration_LightGTS import LightGTSConfig
from modeling_LightGTS import LightGTSForPrediction

# register the custom architecture and load the pretrained model
LightGTS_config = LightGTSConfig(context_points=528, c_in=1, target_dim=192, patch_len=48, stride=48)
LightGTS_config.save_pretrained("LightGTS-huggingface")
AutoConfig.register("LightGTS", LightGTSConfig)
AutoModelForCausalLM.register(LightGTSConfig, LightGTSForPrediction)
model = AutoModelForCausalLM.from_pretrained(
    "./LightGTS-huggingface",
    trust_remote_code=True,
)
# prepare input: (batch_size, lookback_length, 1)
batch_size, lookback_length = 1, 576
seqs = torch.randn(batch_size, lookback_length).unsqueeze(-1).float()

# generate forecasting results
forecast_length = 192
outputs = model.generate(seqs, patch_len=48, stride_len=48, max_output_length=forecast_length, inference_patch_len=48)
print(outputs.shape)
```
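The configuration above (`context_points=528`, `patch_len=48`, `stride=48`) implies the lookback window is split into non-overlapping patches of 48 time steps before entering the model. The following is a minimal sketch of that patching step in plain PyTorch; the tensor shapes are illustrative and not taken from the LightGTS source:

```python
import torch

# a batch of univariate series: (batch, lookback, channels)
batch_size, lookback_length, n_channels = 1, 576, 1
seqs = torch.randn(batch_size, lookback_length, n_channels)

patch_len, stride = 48, 48  # non-overlapping patches, matching the config above

# unfold the time dimension into patches: (batch, n_patches, channels, patch_len)
patches = seqs.unfold(dimension=1, size=patch_len, step=stride)
print(patches.shape)  # torch.Size([1, 12, 1, 48]) -> 576 / 48 = 12 patches
```

With `stride == patch_len` the patches tile the lookback window exactly; a smaller stride would produce overlapping patches instead.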
### Fine-tune
For usage examples, please see `test_finetune.py`.
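As a rough orientation while `test_finetune.py` is not reproduced here: fine-tuning a pretrained forecaster typically follows a standard PyTorch training loop. The sketch below uses a hypothetical stand-in module in place of `LightGTSForPrediction` (whose training interface is defined in `modeling_LightGTS.py`), so treat it as a pattern rather than the actual script:

```python
import torch
from torch import nn

# Hypothetical stand-in for LightGTSForPrediction:
# maps a 576-step lookback to a 192-step forecast.
class StandInForecaster(nn.Module):
    def __init__(self, lookback=576, horizon=192):
        super().__init__()
        self.head = nn.Linear(lookback, horizon)

    def forward(self, x):  # x: (batch, lookback, 1)
        return self.head(x.squeeze(-1)).unsqueeze(-1)  # (batch, horizon, 1)

model = StandInForecaster()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Toy data: replace with (lookback, horizon) windows from your dataset.
x = torch.randn(8, 576, 1)
y = torch.randn(8, 192, 1)

model.train()
for _ in range(3):  # a few illustrative steps
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

The real script will differ in how the model is constructed and which parameters are updated; the loop structure (forward, loss, backward, step) carries over.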
## Citation
If you find LightGTS helpful for your research, please cite our paper:
```
@article{wang2025lightgts,
title={LightGTS: A Lightweight General Time Series Forecasting Model},
author={Wang, Yihang and Qiu, Yuying and Chen, Peng and Shu, Yang and Rao, Zhongwen and Pan, Lujia and Yang, Bin and Guo, Chenjuan},
journal={arXiv preprint arXiv:2506.06005},
year={2025}
}
```