---
title: FinText-TSFM
emoji: 📈
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---

# FinText-TSFM 📈

Time Series Foundation Models for Finance

## 🚀 TSFMs Release

We are pleased to introduce **FinText-TSFM**, a comprehensive suite of **time series foundation models (TSFMs)** developed for financial forecasting and quantitative research. This release accompanies the paper **[*Re(Visiting) Time Series Foundation Models in Finance*](https://ssrn.com/abstract=4963618)** by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.

### 💡 Key Highlights

- **Finance-Native Pre-training:** Models are pre-trained **from scratch** on large-scale financial time series datasets, including daily excess returns across **89 markets** and **over 2 billion observations**, to ensure full temporal and domain alignment.
- **Bias-Free Design:** Training strictly follows a **chronological expanding-window setup**, avoiding any **look-ahead bias** or **information leakage** (see the sketch below).
- **Model Families:** This release includes variants of the **Chronos** and **TimesFM** architectures adapted for financial time series:
  - Chronos-Tiny / Mini / Small
  - TimesFM-8M / 20M
- **Performance Insights:** Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio Sharpe ratios.
- **Evaluation Scope:** Models are benchmarked across **U.S. and international markets**, using rolling windows (5, 21, 252, and 512 days) and **18M+ out-of-sample forecasts**.

### 🧠 Technical Overview

- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Training Regime:** Pre-training from scratch, fine-tuning, and zero-shot evaluation
- **Compute:** >50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters

### 📚 Citation

Please cite the accompanying paper if you use these models:

> **Re(Visiting) Time Series Foundation Models in Finance.**
> **Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.**
> SSRN: [https://ssrn.com/abstract=4963618](https://ssrn.com/abstract=4963618)
> DOI: [10.2139/ssrn.4963618](http://dx.doi.org/10.2139/ssrn.4963618)

### 🔋 Acknowledgments

This project was made possible through computational and institutional support from:

- **Isambard-AI National AI Research Resource (AIRR)**
- **The University of Manchester** (Research IT & Computational Shared Facility)
- **Alliance Manchester Business School (AMBS), University of Manchester**
- **University College London (UCL)**
- **Shanghai University**
- **N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)**
- **The Alan Turing Institute**

---
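### 🧪 Usage Sketch

To make the expanding-window idea concrete, here is a generic sketch of chronological splits in which the training set only ever grows forward in time. It illustrates the principle rather than the paper's exact schedule; the function name, window lengths, and step size are illustrative choices, not values from the paper.

```python
def expanding_window_splits(n_obs: int, first_train_end: int, step: int):
    """Yield chronological (train, test) index ranges.

    The training window only ever extends forward in time, so no
    observation from a later test period can leak into training.
    """
    end = first_train_end
    while end + step <= n_obs:
        yield range(0, end), range(end, end + step)
        end += step

# Example: 2,520 daily observations, first fit on 1,260 days,
# then re-fit after every 252-day (roughly one-year) test block.
for train_idx, test_idx in expanding_window_splits(2520, 1260, 252):
    pass  # fit on train_idx, forecast test_idx
```

The Chronos variants should be loadable with the open-source `chronos-forecasting` package. Below is a minimal zero-shot sketch under that assumption: the checkpoint ID `FinText/chronos-small` is a placeholder (check the organization page for the published names), and the 252-day context mirrors one of the rolling windows listed above.

```python
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Placeholder checkpoint ID; substitute the published FinText repo name.
pipeline = ChronosPipeline.from_pretrained(
    "FinText/chronos-small",
    device_map="cuda" if torch.cuda.is_available() else "cpu",
    torch_dtype=torch.bfloat16,
)

# A 252-day context of daily excess returns (dummy data for illustration).
context = torch.randn(252)

# Sample-based forecast: tensor of shape [num_series, num_samples, horizon].
forecast = pipeline.predict(context, prediction_length=1)
point_forecast = forecast.median(dim=1).values  # median across samples
```

---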

Developed by:

Alliance Manchester Business School, The University of Manchester
Department of Mathematics, University College London

Powered by:

BriCS (Bristol Supercomputing)