---
title: FinText-TSFM
emoji: π
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---
# FinText-TSFM

**Time Series Foundation Models for Finance**
## TSFMs Release
We are pleased to introduce FinText-TSFM, a comprehensive suite of time series foundation models (TSFMs) developed for financial forecasting and quantitative research. This release accompanies the paper *Re(Visiting) Time Series Foundation Models in Finance* by Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025).
## Key Highlights
- **Finance-Native Pre-training:** Models are pre-trained from scratch on large-scale financial time series datasets, including daily excess returns across 89 markets and over 2 billion observations, to ensure full temporal and domain alignment.
- **Bias-Free Design:** Training strictly follows a chronological expanding-window setup, avoiding any look-ahead bias or information leakage.
- **Model Families:** This release includes variants of the Chronos and TimesFM architectures adapted for financial time series:
  - Chronos-Tiny / Mini / Small
  - TimesFM-8M / 20M
- **Performance Insights:** Our findings show that off-the-shelf TSFMs underperform in zero-shot forecasting, while finance-pretrained models achieve large gains in both predictive accuracy and portfolio Sharpe ratios.
- **Evaluation Scope:** Models are benchmarked across U.S. and international markets, using rolling windows (5, 21, 252, and 512 days) and more than 18 million out-of-sample forecasts.
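The rolling-window evaluation described above can be sketched as follows. This is an illustrative example only: the `rolling_contexts` helper and the toy return series are not part of the released code, and the window length mirrors the 21-day setting from the benchmark grid.

```python
import numpy as np

def rolling_contexts(returns: np.ndarray, window: int):
    """Slice a daily return series into overlapping look-back windows.

    Each context covers `window` consecutive days and is paired with the
    next day's return as its one-step-ahead out-of-sample target, so no
    context ever overlaps its own target (no look-ahead).
    """
    n = len(returns)
    contexts = np.array([returns[t - window:t] for t in range(window, n)])
    targets = returns[window:]
    return contexts, targets

# Toy daily excess returns standing in for one market's series.
rng = np.random.default_rng(0)
rets = rng.normal(0.0, 0.01, size=300)

ctx, tgt = rolling_contexts(rets, window=21)
# 300 days minus a 21-day look-back leaves 279 forecastable targets.
assert ctx.shape == (279, 21) and tgt.shape == (279,)
```

Repeating this over every market and every window length (5, 21, 252, 512 days) is what drives the out-of-sample forecast count into the millions.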
## Technical Overview
- Architecture: Transformer-based TSFMs (Chronos & TimesFM)
- Training Regime: Pre-training from scratch, fine-tuning, and zero-shot evaluation
- Compute: >50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
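A minimal sketch of the chronological expanding-window regime mentioned above: the training set only ever grows forward in time, so no model sees observations dated after its evaluation block. The function name and split sizes here are hypothetical, chosen purely for illustration.

```python
import numpy as np

def expanding_splits(n_obs: int, initial: int, step: int):
    """Yield (train_idx, test_idx) pairs for a chronological
    expanding window.

    The first training set holds the earliest `initial` observations;
    each subsequent split appends the previous test block to the
    training set, so information only flows forward in time.
    """
    start = initial
    while start + step <= n_obs:
        train = np.arange(0, start)          # everything up to `start`
        test = np.arange(start, start + step)  # the next `step` days
        yield train, test
        start += step

# Example: 1000 days, first 500 for initial training, 100-day test blocks.
splits = list(expanding_splits(n_obs=1000, initial=500, step=100))
```

Every training index precedes every test index in each split, which is the property that rules out look-ahead bias and leakage.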
## Citation
Please cite the accompanying paper if you use these models:
Re(Visiting) Time Series Foundation Models in Finance.
Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.
SSRN: https://ssrn.com/abstract=4963618
DOI: 10.2139/ssrn.4963618
## Acknowledgments
This project was made possible through computational and institutional support from:
- Isambard-AI National AI Research Resource (AIRR)
- The University of Manchester (Research IT & Computational Shared Facility)
- Alliance Manchester Business School (AMBS), University of Manchester
- University College London (UCL)
- Shanghai University
- N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)
- The Alan Turing Institute
**Developed by:**
<!-- Left side: developed by logos -->
<div style="display: flex; align-items: center; justify-content: center; gap: 25px; flex-wrap: wrap;">
<img src="https://fintext.ai/UoM-logo.svg" alt="University of Manchester Logo" style="width: 210px; height: auto; margin: 0;">
<img src="https://fintext.ai/UCL-logo.jpg" alt="UCL Logo" style="width: 100px; height: auto; margin: 0;">
</div>
<!-- Right side: powered by BriCS -->
<div style="display: flex; flex-direction: column; align-items: center; justify-content: center;">
<p style="font-weight: bold; font-size: 1em; margin: 0 0 4px 0;">Powered by:</p>
<img src="https://fintext.ai/BriCS-logo.svg" alt="BriCS Logo" style="width: 150px; height: auto; margin-bottom: 2px;">
<p style="font-size: 0.8em; margin: 0;">Bristol Supercomputing</p>
</div>
Alliance Manchester Business School, The University of Manchester
Department of Mathematics, University College London