πŸ›οΈ Futures Foundation Model (FFM)

πŸ“¦ Code on GitHub

A pretrained transformer backbone for futures market structure and regime classification.

Model Summary

FFM is an open-source pretrained transformer designed to learn market structure and regime dynamics from raw OHLCV futures data. By treating price action sequences as "sentences" of market behavior, the backbone learns general representations that can be fine-tuned for specific trading strategies (ORB, ICT, Mean Reversion, etc.).

Philosophy

Separate "understanding market context" from "making strategy-specific decisions."

Just as BERT learns language structure before being fine-tuned for sentiment, FFM learns market structure before being fine-tuned for specific entries or signals.


Architecture

The model uses a 6-layer Transformer Encoder with multi-head self-attention to produce a 256-dimensional Market Context Embedding for each bar in the input sequence.

  • Layers: 6 Transformer Encoder blocks
  • Attention: 8 heads
  • Hidden Dim: 256
  • Input: OHLCV Bars + Session/Temporal Embeddings
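The architecture above can be sketched in PyTorch. This is an illustrative reconstruction from the listed hyperparameters, not the repository's actual code; names like `FFMBackbone`, the number of session types, and the maximum sequence length are assumptions.

```python
import torch
import torch.nn as nn

class FFMBackbone(nn.Module):
    """Sketch of the FFM encoder: 6 layers, 8 heads, 256-dim hidden size.

    Inputs are raw OHLCV bars (5 features per bar) plus a per-bar session
    index; session/temporal embeddings are added to the projected bars.
    """
    def __init__(self, d_model=256, n_heads=8, n_layers=6,
                 n_sessions=4, max_len=512):
        super().__init__()
        self.bar_proj = nn.Linear(5, d_model)           # OHLCV -> hidden dim
        self.session_emb = nn.Embedding(n_sessions, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)   # temporal embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, bars, session_ids):
        # bars: (batch, seq_len, 5); session_ids: (batch, seq_len)
        positions = torch.arange(bars.size(1), device=bars.device)
        x = self.bar_proj(bars) + self.session_emb(session_ids) + self.pos_emb(positions)
        return self.encoder(x)  # (batch, seq_len, 256) context embeddings
```

Each output position is a context embedding that a downstream strategy head can consume.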

Pretraining Objectives

The model is trained with self-supervised classification tasks whose labels are derived directly from the price data, so no manual annotation is required:

  • Regime: Trending vs. Rotational
  • Volatility State: ATR-based percentiles
  • Market Structure: HH/HL vs. LH/LL swing points
  • Range Position: Quintile-based price location
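Two of these label functions can be sketched in plain Python. The exact definitions (the efficiency-ratio threshold for "trending", the quintile convention) are assumptions for illustration, not the repository's implementation.

```python
def regime_label(closes, lookback=20, threshold=0.5):
    """Illustrative trending-vs-rotational label.

    Uses an efficiency ratio: net price change divided by the total
    bar-to-bar movement over the lookback window. A high ratio means
    price moved directionally (trending); a low ratio means it churned
    back and forth (rotational).
    """
    window = closes[-lookback:]
    net = abs(window[-1] - window[0])
    path = sum(abs(b - a) for a, b in zip(window, window[1:]))
    if path == 0:
        return "rotational"
    return "trending" if net / path > threshold else "rotational"

def range_position_quintile(close, range_high, range_low):
    """Illustrative quintile (0-4) of the close within a reference range."""
    if range_high == range_low:
        return 2  # degenerate range: call it the middle quintile
    frac = (close - range_low) / (range_high - range_low)
    return min(int(frac * 5), 4)
```

The volatility-state and market-structure labels would follow the same pattern: compute an ATR percentile, or detect higher-high/higher-low versus lower-high/lower-low swing sequences, and emit a categorical target.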

Quick Start

Installation
