Futures Foundation Model (FFM)
A pretrained transformer backbone for futures market structure and regime classification.
Model Summary
FFM is an open-source pretrained transformer designed to learn market structure and regime dynamics from raw OHLCV futures data. By treating price action sequences as "sentences" of market behavior, the backbone learns general representations that can be fine-tuned for specific trading strategies (ORB, ICT, Mean Reversion, etc.).
Philosophy
Separate "understanding market context" from "making strategy-specific decisions."
Just as BERT learns language structure before being fine-tuned for sentiment analysis, FFM learns market structure before being fine-tuned for specific entries or signals.
Architecture
The model uses a 6-layer Transformer Encoder with multi-head self-attention to produce a 256-dimensional Market Context Embedding; a minimal sketch follows the list below.
- Layers: 6 Transformer Encoder blocks
- Attention: 8 heads
- Hidden Dim: 256
- Input: OHLCV Bars + Session/Temporal Embeddings
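The following is a minimal PyTorch sketch of a backbone matching these specs. Class and parameter names (FFMBackbone, n_sessions, max_len) are illustrative assumptions, not the repo's actual module layout; mean pooling over the sequence is one plausible way to produce the single context embedding.

```python
import torch
import torch.nn as nn

class FFMBackbone(nn.Module):
    """Sketch: OHLCV bars + session/temporal embeddings -> 256-dim
    market context embedding. Names and defaults are assumptions."""

    def __init__(self, n_sessions: int = 3, d_model: int = 256,
                 n_heads: int = 8, n_layers: int = 6, max_len: int = 512):
        super().__init__()
        # Project the 5 raw OHLCV features per bar into the model dimension.
        self.bar_proj = nn.Linear(5, d_model)
        # Learned session embedding (e.g. Asia/London/NY -- an assumption)
        # plus a learned positional (temporal) embedding.
        self.session_emb = nn.Embedding(n_sessions, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)

    def forward(self, bars: torch.Tensor, session_ids: torch.Tensor) -> torch.Tensor:
        # bars: (batch, seq_len, 5) OHLCV; session_ids: (batch, seq_len)
        pos = torch.arange(bars.size(1), device=bars.device)
        x = self.bar_proj(bars) + self.session_emb(session_ids) + self.pos_emb(pos)
        h = self.encoder(x)   # (batch, seq_len, 256)
        return h.mean(dim=1)  # pooled 256-dim market context embedding
```

Fine-tuning then amounts to attaching a small strategy-specific head on top of the pooled embedding, for example:

```python
backbone = FFMBackbone()
head = nn.Linear(256, 2)  # e.g. a binary entry/no-entry signal for an ORB strategy
logits = head(backbone(torch.randn(8, 128, 5),
                       torch.zeros(8, 128, dtype=torch.long)))
```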
Pretraining Objectives
The model is trained using self-supervised tasks whose labels are derived directly from price data (a sketch of one possible label construction follows the list):
- Regime: Trending vs. Rotational
- Volatility State: ATR-based percentiles
- Market Structure: HH/HL vs. LH/LL swing points
- Range Position: Quintile-based price location
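As a concrete illustration, here is a pandas sketch of how these four targets might be computed from an OHLCV frame. Column names, window lengths, the efficiency-ratio cutoff, and the binary structure label are all assumptions; in particular, real swing-point detection for HH/HL vs. LH/LL is typically more involved than the rolling-extreme proxy used here.

```python
import numpy as np
import pandas as pd

def make_pretraining_labels(df: pd.DataFrame, atr_window: int = 14,
                            struct_window: int = 20) -> pd.DataFrame:
    """Derive the four self-supervised targets from an OHLCV frame with
    columns high/low/close. Windows, buckets, and thresholds here are
    illustrative assumptions, not the repo's actual parameters."""
    out = pd.DataFrame(index=df.index)

    # Regime: Kaufman-style efficiency ratio -- net move divided by path
    # length; high values read as trending, low as rotational.
    net = (df["close"] - df["close"].shift(struct_window)).abs()
    path = df["close"].diff().abs().rolling(struct_window).sum()
    out["trending"] = (net / path > 0.3).astype(int)  # 0.3 is an assumed cutoff

    # Volatility state: ATR bucketed into rolling percentile terciles.
    prev_close = df["close"].shift(1)
    tr = np.maximum.reduce([
        df["high"] - df["low"],
        (df["high"] - prev_close).abs(),
        (df["low"] - prev_close).abs(),
    ])
    atr = pd.Series(tr, index=df.index).rolling(atr_window).mean()
    pct = atr.rolling(252).rank(pct=True)  # percentile vs. a trailing window
    out["vol_state"] = pd.cut(pct, bins=[0, 1/3, 2/3, 1], labels=[0, 1, 2])

    # Market structure: 1 when printing a higher high AND a higher low
    # relative to the prior window's extremes (HH/HL), else 0 (LH/LL side).
    hh = df["high"] > df["high"].rolling(struct_window).max().shift(1)
    hl = df["low"] > df["low"].rolling(struct_window).min().shift(1)
    out["bullish_structure"] = (hh & hl).astype(int)

    # Range position: which quintile of the rolling high-low range the
    # close sits in (0 = bottom of range, 4 = top).
    lo = df["low"].rolling(struct_window).min()
    hi = df["high"].rolling(struct_window).max()
    pos = (df["close"] - lo) / (hi - lo)
    out["range_quintile"] = ((pos * 5) // 1).clip(upper=4)
    return out
```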