---
title: FinText-TSFM
emoji: 📈
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---

# FinText-TSFM 📈

**Time Series Foundation Models for Finance**

## 🚀 TSFMs Release

We are pleased to introduce **FinText-TSFM**, a comprehensive suite of time series foundation models (TSFMs) developed for financial forecasting and quantitative research. This release accompanies the paper *Re(Visiting) Time Series Foundation Models in Finance* by Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025).

## 💡 Key Highlights

- **Finance-Native Pre-training:**
  Models are pre-trained from scratch on large-scale financial time series datasets, including daily excess returns across 89 markets totalling over 2 billion observations, to ensure full temporal and domain alignment.

- **Bias-Free Design:**
  Training strictly follows a chronological expanding-window setup, avoiding any look-ahead bias or information leakage.
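To make the chronological expanding-window idea concrete, here is a minimal sketch of such a splitter; the function and parameter names are illustrative and not taken from the released code.

```python
# Illustrative sketch of a chronological expanding-window split (not the
# authors' actual pipeline): each training window starts at the beginning
# of the sample and grows forward in time, so no future observation ever
# leaks into the data a model is trained on.

def expanding_window_splits(n_obs, initial_train, test_size):
    """Yield (train_indices, test_indices) pairs in chronological order."""
    splits = []
    train_end = initial_train
    while train_end + test_size <= n_obs:
        train_idx = list(range(0, train_end))                     # everything up to t
        test_idx = list(range(train_end, train_end + test_size))  # strictly after t
        splits.append((train_idx, test_idx))
        train_end += test_size                                    # window expands forward
    return splits

# Example: 10 observations, first train window of 4, test blocks of 2.
for train_idx, test_idx in expanding_window_splits(10, 4, 2):
    assert max(train_idx) < min(test_idx)  # no look-ahead: train strictly precedes test
```

Because each test block lies entirely after its training window, every forecast is genuinely out-of-sample at the time it is made.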

- **Model Families:**
  This release includes variants of the Chronos and TimesFM architectures adapted for financial time series:
  - Chronos-Tiny / Mini / Small
  - TimesFM-8M / 20M
- **Performance Insights:**
  Our findings show that off-the-shelf TSFMs underperform in zero-shot forecasting, while finance-pretrained models achieve large gains in both predictive accuracy and portfolio Sharpe ratios.

- **Evaluation Scope:**
  Models are benchmarked across U.S. and international markets, using rolling windows of 5, 21, 252, and 512 days and more than 18 million out-of-sample forecasts.
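As an illustration of the rolling-window evaluation, the sketch below generates one-step-ahead forecasts where the model only ever sees a trailing context of fixed length; a simple rolling mean stands in for an actual TSFM call, and all names are hypothetical.

```python
# Hypothetical sketch of rolling-window out-of-sample forecasting: at each
# date t, only the most recent `window` observations are visible, and a
# one-step-ahead forecast is produced. The rolling mean below is a
# placeholder for a real TSFM forecast.

def rolling_forecasts(series, window):
    """One-step-ahead forecasts using a trailing context of `window` points."""
    preds = []
    for t in range(window, len(series)):
        context = series[t - window:t]       # only past data is visible at t
        preds.append(sum(context) / window)  # placeholder for a TSFM forecast
    return preds

series = [0.01, -0.02, 0.03, 0.00, 0.01, -0.01, 0.02, 0.01]
for window in (2, 5):  # the paper's evaluation uses 5-, 21-, 252-, and 512-day windows
    preds = rolling_forecasts(series, window)
    assert len(preds) == len(series) - window  # one forecast per evaluable date
```

Longer context windows trade a richer history per forecast against fewer evaluable dates, which is why benchmarking across several window lengths is informative.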

## 🧠 Technical Overview

- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Training Regime:** Pre-training from scratch, fine-tuning, and zero-shot evaluation
- **Compute:** Over 50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters

## 📚 Citation

Please cite the accompanying paper if you use these models:

> *Re(Visiting) Time Series Foundation Models in Finance.*
> Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.
> SSRN: https://ssrn.com/abstract=4963618
> DOI: 10.2139/ssrn.4963618

## 🔋 Acknowledgments

This project was made possible through computational and institutional support from:

- Isambard-AI National AI Research Resource (AIRR)
- The University of Manchester (Research IT & Computational Shared Facility)
- Alliance Manchester Business School (AMBS), University of Manchester
- University College London (UCL)
- Shanghai University
- N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)
- The Alan Turing Institute

**Developed by:**

<table style="border:none; margin:auto;">
  <tr>
    <!-- Left group -->
    <td style="border:none; padding:0 40px; text-align:center; vertical-align:middle;">
      <table style="border:none; margin:auto;">
        <tr>
          <td style="border:none; padding:0 10px; vertical-align:middle;">
            <img src="https://fintext.ai/UoM-logo.svg" alt="University of Manchester Logo" width="210">
          </td>
          <td style="border:none; padding:0 10px; vertical-align:middle;">
            <img src="https://fintext.ai/UCL-logo.jpg" alt="UCL Logo" width="100">
          </td>
        </tr>
      </table>
      <p style="font-size:0.8em; margin-top:6px; line-height:1.3;">
        Alliance Manchester Business School, The University of Manchester<br>
        Department of Mathematics, University College London
      </p>
    </td>
    <!-- Right group -->
    <td style="border:none; padding:0 40px; text-align:center; vertical-align:middle;">
      <p style="font-weight:bold; font-size:1.1em; margin:6px 0;">Powered by:</p>
      <img src="https://fintext.ai/BriCS-logo.svg" alt="BriCS Logo" width="160" style="margin:6px 0;">
      <p style="font-size:0.8em; margin-top:6px; line-height:1.3;">
        Bristol Centre for Computational Finance<br>
        and Financial Innovation
      </p>
    </td>
  </tr>
</table>