---
pipeline_tag: time-series-forecasting
tags:
- time series
- time series foundation models
- time series forecasting
- zero-shot
---
# LightGTS: A Lightweight General Time Series Forecasting Model
🚩 **News (2025.06)** LightGTS has been accepted at **ICML 2025**.
## Introduction
<div style="text-align: center;">
<img src="framework.png" alt="LightGTS" style="zoom:80%;" />
</div>
## Quick Demos
```shell
pip install transformers==4.30.2  # use this version for stable compatibility
```
### Zero-Shot
```python
import torch
from transformers import AutoConfig, AutoModelForCausalLM

from configuration_LightGTS import LightGTSConfig
from modeling_LightGTS import LightGTSForPrediction

# Load the pretrained model
LightGTS_config = LightGTSConfig(context_points=528, c_in=1, target_dim=192, patch_len=48, stride=48)
LightGTS_config.save_pretrained("LightGTS-huggingface")
AutoConfig.register("LightGTS", LightGTSConfig)
AutoModelForCausalLM.register(LightGTSConfig, LightGTSForPrediction)
model = AutoModelForCausalLM.from_pretrained(
    "./LightGTS-huggingface",
    trust_remote_code=True,
)

# Prepare input of shape (batch, lookback_length, n_variables)
batch_size, lookback_length = 1, 576
seqs = torch.randn(batch_size, lookback_length).unsqueeze(-1).float()

# Generate forecasting results
forecast_length = 192
outputs = model.generate(seqs, patch_len=48, stride_len=48, max_output_length=forecast_length, inference_patch_len=48)
print(outputs.shape)
```
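The `patch_len` and `stride` arguments control how the lookback window is split into patches before the model consumes it. A minimal sketch of the arithmetic in plain Python (no model required), using the demo's values where `patch_len` equals `stride`, so patches are non-overlapping:

```python
# Patch arithmetic for the demo hyperparameters (patch_len == stride == 48).
lookback_length = 576
patch_len = 48
stride = 48

# Number of patches that tile the lookback window
num_patches = (lookback_length - patch_len) // stride + 1

# (start, end) index pairs of each patch along the time axis
patches = [(i * stride, i * stride + patch_len) for i in range(num_patches)]

print(num_patches)               # 12
print(patches[0], patches[-1])   # (0, 48) (528, 576)
```

With a smaller `stride` than `patch_len`, the same formula yields overlapping patches, which increases the token count seen by the model.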
### Fine-tune
For usage examples, please see `test_finetune.py`.
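As an orientation only, a typical supervised fine-tuning loop for a forecaster looks like the sketch below. This is *not* the LightGTS fine-tuning API; `model` here is a stand-in `nn.Linear` mapping lookback to horizon, used purely to show the loop shape (refer to `test_finetune.py` for the real interface):

```python
import torch
from torch import nn

# Stand-in forecaster: replace with the LightGTS model loaded as in the zero-shot demo.
lookback, horizon = 576, 192
model = nn.Linear(lookback, horizon)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# Dummy batch: (batch, lookback) inputs and (batch, horizon) targets.
x = torch.randn(8, lookback)
y = torch.randn(8, horizon)

model.train()
for _ in range(3):  # a few gradient steps
    optimizer.zero_grad()
    pred = model(x)          # (8, 192)
    loss = loss_fn(pred, y)
    loss.backward()
    optimizer.step()
```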
## Citation
If you find LightGTS helpful for your research, please cite our paper:
```bibtex
@article{wang2025lightgts,
  title={LightGTS: A Lightweight General Time Series Forecasting Model},
  author={Wang, Yihang and Qiu, Yuying and Chen, Peng and Shu, Yang and Rao, Zhongwen and Pan, Lujia and Yang, Bin and Guo, Chenjuan},
  journal={arXiv preprint arXiv:2506.06005},
  year={2025}
}
```