---
title: FinText-TSFM
emoji: 📈
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---
<div style="display: flex; flex-direction: column; align-items: center; justify-content: center; text-align: center; margin-top: 10px; margin-bottom: 10px;">
<img src="https://fintext.ai/FinText_logo.webp" alt="FinText Logo" width="140" style="display: block; margin: 0 auto 6px auto;">
<h1 style="font-size: 1.9em; font-weight: bold; margin: 4px 0;">Time Series Foundation Models for Finance</h1>
</div>
## 🚀 TSFMs Release
We are pleased to introduce **FinText-TSFM**, a comprehensive suite of 565 **time series foundation models (TSFMs)** developed for financial forecasting and quantitative research. This release accompanies the paper:
**[*Re(Visiting) Time Series Foundation Models in Finance*](https://ssrn.com/abstract=4963618)** by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.
### 💡 Key Highlights
- **Finance-Native Pre-training:**
Models are pre-trained **from scratch** on large-scale financial time series datasets, including daily excess returns across **89 markets** and **over 2 billion observations**, to ensure full temporal and domain alignment.
- **Bias-Free Design:**
Pre-training strictly follows a **chronological expanding-window setup**, avoiding any **look-ahead bias** or **information leakage**.<br>
Each variant comprises **23 separately pre-trained models**, one for each year from **2000** to **2023**, with pre-training data starting in 1990.
- **Model Families:**
This release includes variants of **Chronos** and **TimesFM** architectures adapted for financial time series:
- Chronos-Tiny (8M) / Mini (20M) / Small (46M)
- TimesFM-8M / 20M
- **Model Collections:**
- U.S.: Covers **U.S.** market-wide excess returns from 2000 to 2023, with one pre-trained model per year.
- Global: Covers excess returns across **89 global markets** from 2000 to 2023, with one pre-trained model for each year.
- Augmented: Extends the global data with **augmented factors** from 2000 to 2023, with one pre-trained model for each year.
- The remaining **220 pre-trained models** are available for download via the [**FinText.ai Portal**](https://fintext.ai). These include models pre-trained with varying **hyperparameter configurations** for extended experimentation and performance comparison.
- **Performance Insights:**
Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio performance.
- **Evaluation Scope:**
Models are benchmarked across **the U.S. and seven international markets**, using rolling context windows of **5, 21, 252, and 512 days**, with over **18 million out-of-sample forecasts** spanning **22 years (2001–2023)** of daily excess returns, evaluated on both **statistical** and **economic** performance.
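The bias-free setup above can be sketched in a few lines. This is a minimal illustration only: the function names, the end-of-year cutoff convention, and the toy return series are our own assumptions, not part of the released models or code.

```python
from datetime import date

def pretraining_cutoffs(first_year: int, last_year: int) -> list[date]:
    """Chronological expanding-window schedule: the model for year Y is
    pre-trained only on data observed up to the end of Y, so no future
    information can leak into its weights (assumed end-of-year cutoffs)."""
    return [date(year, 12, 31) for year in range(first_year, last_year + 1)]

def rolling_contexts(series: list[float], window: int):
    """Yield fixed-length context windows (e.g. 5, 21, 252, or 512 days)
    paired with the next-day value to be forecast out of sample."""
    for t in range(window, len(series)):
        yield series[t - window:t], series[t]

# Toy example: yearly cutoffs plus 5-day contexts over a short return series.
cutoffs = pretraining_cutoffs(2000, 2023)
returns = [0.01, -0.02, 0.005, 0.0, 0.015, -0.01, 0.02]
pairs = list(rolling_contexts(returns, window=5))
```

At evaluation time, each forecast for date *t* would use only the model whose cutoff precedes *t* and a context window ending at *t − 1*, mirroring the look-ahead-free design described above.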
### 🧠 Technical Overview
- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Compute:** 50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
### 📚 Citation
Please cite the accompanying paper if you use these models:
> **Re(Visiting) Time Series Foundation Models in Finance.**
> **Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan.**
> SSRN: [https://ssrn.com/abstract=4963618](https://ssrn.com/abstract=4963618)
> DOI: [10.2139/ssrn.4963618](http://dx.doi.org/10.2139/ssrn.4963618)
### 🔋 Acknowledgments
This project was made possible through computational and institutional support from:
- **UK Research and Innovation (UKRI)**
- **Isambard-AI National AI Research Resource (AIRR)**
- **Alliance Manchester Business School (AMBS), University of Manchester**
- **N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)**
- **The University of Manchester** (Research IT & Computational Shared Facility)
- **University College London (UCL)**
- **The Alan Turing Institute**
- **Shanghai University**
---
<div style="text-align:center; margin:auto; max-width:800px;">
<!-- Developed by -->
<div style="margin-bottom:12px;">
<p style="font-weight:bold; font-size:1.1em; margin:4px 0;">Developed by:</p>
<div style="display:flex; justify-content:center; align-items:center; gap:20px; flex-wrap:wrap; margin-bottom:15px;">
<img src="https://fintext.ai/UoM-logo.svg" alt="University of Manchester Logo" width="210" style="display:block; margin:0;">
<img src="https://fintext.ai/UCL-logo.jpg" alt="UCL Logo" width="100" style="display:block; margin:0;">
</div>
<p style="font-size:0.8em; margin-top:0; line-height:1.3;">
Alliance Manchester Business School, University of Manchester<br>
Department of Mathematics, University College London (UCL)
</p>
</div>
<!-- Powered by -->
<div>
<p style="font-weight:bold; font-size:1.1em; margin:4px 0;">Powered by:</p>
<div style="display:flex; justify-content:center; align-items:center; gap:20px; flex-wrap:wrap; margin-bottom:10px;">
<img src="https://fintext.ai/BriCS-logo.png" alt="BriCS Logo" width="180" style="display:block; margin:0;">
<img src="https://fintext.ai/N8_bede_logo.webp" alt="N8 Bede Logo" width="140" style="display:block; margin:0;">
</div>
<p style="font-size:0.8em; margin-top:0; line-height:1.3;">
Isambard-AI, Bristol Centre for Supercomputing (BriCS)<br>
The Bede Supercomputer
</p>
</div>
</div>