Quantitative-Equity-Alpha-Transformer (QEAT)


A Systematic, Multi-Horizon Macro-Liquidity Model for Probabilistic Alpha Discovery.

  • Architecture: Temporal Fusion Transformer (TFT) with Quantile Regression
  • Compute Infrastructure: NVIDIA DGX Spark (Grace Blackwell GB10 Superchip)

🚀 Live Demo

Click here to launch the Interactive Alpha Dashboard

The live dashboard allows you to run real-time inference on a curated list of highly liquid assets across three key sectors:

  • 🇺🇸 Big Tech: NVIDIA ($NVDA), Apple ($AAPL), Microsoft ($MSFT), Google ($GOOGL), Amazon ($AMZN), Meta ($META), Tesla ($TSLA), AMD ($AMD), Broadcom ($AVGO).
  • πŸ† Commodities: Gold ($GLD), Silver ($SLV), Crude Oil ($USO), Natural Gas ($UNG), Palladium ($PALL).
  • ⛓️ Crypto Assets: Bitcoin ($BTC), Ethereum ($ETH), Solana ($SOL).

ℹ️ Institutional Note: This public demo is limited to the assets listed above. The full internal model tracks 518+ tickers, including all S&P 500 constituents and global indices, which are not available in this live demo.


πŸ›οΈ Abstract

The Quantitative-Equity-Alpha-Transformer (QEAT) is a state-of-the-art deep learning model designed to address the non-stationarity problem in financial time-series forecasting. Unlike traditional stochastic-calculus models or naive LSTM networks, QEAT uses a Temporal Fusion Transformer (TFT) architecture to map high-dimensional macro-liquidity features to asset returns in an interpretable way.

🧠 Model Architecture

1. Temporal Fusion Transformer (TFT)

The core engine is a specialized Transformer designed for multi-horizon forecasting:

  • Variable Selection Networks (VSN): Automatically filter low-signal inputs, focusing model capacity on informative liquidity features.
  • Gated Residual Networks (GRN): Enable deep processing of non-linear relationships while mitigating the vanishing-gradient problem.
  • Multi-Head Attention Mechanism: Learns long-term dependencies and regime shifts by attending to historical patterns across different time scales.

2. Probabilistic Forecasting

Instead of a single price target, QEAT outputs a probability distribution (the 10th, 50th, and 90th percentiles of forecast returns), supporting institutional risk management such as Value at Risk (VaR).
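Quantile forecasts of this kind are typically trained with the pinball (quantile) loss; below is a minimal plain-Python sketch for the three percentiles above (function names are illustrative, not the model's actual API):

```python
def pinball_loss(y_true: float, y_pred: float, q: float) -> float:
    """Pinball (quantile) loss: under-prediction is penalized by q,
    over-prediction by (1 - q), so minimizing it recovers the q-th quantile."""
    diff = y_true - y_pred
    return max(q * diff, (q - 1) * diff)

def multi_quantile_loss(y_true, preds, quantiles=(0.1, 0.5, 0.9)):
    """Average pinball loss over the P10/P50/P90 heads for a single target."""
    return sum(pinball_loss(y_true, p, q) for p, q in zip(preds, quantiles)) / len(quantiles)
```

At q = 0.5 the loss reduces to half the absolute error, which is why the P50 output behaves like a median forecast.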

📊 Data Engineering & Methodology

The model was trained on a proprietary Global Macro-Liquidity Dataset engineered to capture cross-asset correlations under specific liquidity regimes.

1. The Asset Universe (518 Tickers)

A diverse, cross-asset training set designed to learn non-linear correlations:

  • Equities: Full S&P 500 constituents (approx. 503 tickers).
  • Cryptocurrency: Top 10 Liquid L1 Protocols (BTC, ETH, SOL, AVAX, DOT, etc.).
  • Commodities: Key Industrial/Precious Metals ETFs (SLV, GLD) acting as proxies for global physical demand.

2. The Time Series (10-Year History)

  • Range: 2014 – 2024 (Training/Validation), with out-of-sample inference in 2026.
  • Resolution: Daily (OHLCV) adjusted for splits and dividends.
  • Scale: 1.2 Million individual training examples.
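The stated scale is consistent with a back-of-the-envelope check (assuming roughly 252 trading days per year for the equity legs; the exact figure depends on listing dates and crypto's 365-day calendar):

```python
tickers, years, trading_days = 518, 10, 252
raw_rows = tickers * years * trading_days
print(raw_rows)  # 1305360 raw daily rows, on the order of the stated ~1.2M examples
```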

3. The "Alpha" Features (Exogenous Liquidity Vectors)

The model's edge comes from 6 custom-engineered liquidity flags that serve as static covariates:

  • 🇺🇸 US Fiscal Flows:
    • `is_401k_window` (Jan 1-15): Captures automated retirement inflows.
    • `is_tax_refund` (Apr 1-15): Retail capital injection cycles.
    • `is_bonus_window` (Mar 1-15): Corporate performance bonus allocation.
  • 🌏 Global Cultural Flows:
    • `is_diwali_window` (Nov 1-5): Modeled physical Gold/Silver demand in India.
    • `is_cny_window` (Feb 1-7): Chinese New Year liquidity shifts.
    • `is_holiday_season` (Dec 24-Jan 2): Retail sentiment and volume anomalies.
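These flags are straightforward calendar indicators; here is a sketch using only the standard library, with windows copied from the list above (note that Diwali and Chinese New Year actually move with lunar calendars, so fixed windows are a simplification):

```python
from datetime import date

# (month, first_day, last_day) spans per flag; dates mirror the list above
WINDOWS = {
    "is_401k_window":    [(1, 1, 15)],
    "is_tax_refund":     [(4, 1, 15)],
    "is_bonus_window":   [(3, 1, 15)],
    "is_diwali_window":  [(11, 1, 5)],
    "is_cny_window":     [(2, 1, 7)],
    "is_holiday_season": [(12, 24, 31), (1, 1, 2)],  # spans the year boundary
}

def liquidity_flags(d: date) -> dict:
    """Return the six 0/1 static covariates for a given calendar date."""
    return {
        name: int(any(d.month == m and lo <= d.day <= hi for m, lo, hi in spans))
        for name, spans in WINDOWS.items()
    }
```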

4. Target Variable

  • Objective: 7-Day Forward Log Returns.
  • Optimization: The model minimizes Quantile Loss across three quantiles (P10, P50, P90) simultaneously.
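The target can be computed per ticker as below; this is a minimal sketch assuming `closes` is the split/dividend-adjusted daily close series described above:

```python
import math

def forward_log_returns(closes, horizon=7):
    """target[t] = ln(close[t + horizon] / close[t]).
    The final `horizon` rows have no label and are returned as None."""
    n = len(closes)
    return [
        math.log(closes[t + horizon] / closes[t]) if t + horizon < n else None
        for t in range(n)
    ]
```

Log returns are preferred over raw price changes because they are additive across horizons and roughly scale-free across a 518-ticker universe.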

📈 Key Research Findings (Alpha Signals)

The "Diwali Alpha" Anomaly

Post-training analysis of the Attention Weights revealed a significant market inefficiency:

  • Observation: The model assigned a 0.58 Importance Score to the Diwali Liquidity Window, identifying it as the strongest predictor of Precious Metals volatility.
  • Validation: Backtesting the "Diwali Long" strategy yielded a High-Sharpe outcome for the Jan-Feb 2026 window.

Current Regime Prediction (Feb 2026)

  • Signal: Strong Rotation (Risk-On)
  • Long Conviction: Hardware Technology ($ANET, $GLW) & Industrial Metals.
  • Short Conviction: Defensive Healthcare ($HUM, $CNC) & Speculative L1 Crypto ($AVAX).

πŸ› οΈ Usage for Quantitative Research

import torch
from pytorch_forecasting import TemporalFusionTransformer

# 1. Load the pre-trained weights
model = TemporalFusionTransformer.load_from_checkpoint("model.ckpt")

# 2. Prepare your data as a pandas DataFrame with columns:
# ['Ticker', 'date', 'close', 'liquidity_flags', 'time_idx']

# 3. Generate probabilistic predictions (raw mode retains all quantile heads)
raw_prediction = model.predict(your_dataframe, mode="raw", return_x=True)

# 4. Extract the quantile forecasts (P10 / P50 / P90)
quantiles = raw_prediction.output.prediction

# 5. Inspect variable importance and attention weights
interpretation = model.interpret_output(raw_prediction.output, reduction="sum")
print("Attention weights:", interpretation["attention"].shape)

📉 Hardware & Training Specifications

This model was trained on NVIDIA's next-generation accelerated computing platform.

  • System: NVIDIA DGX Spark
  • Compute Unit: NVIDIA Grace Blackwell GB10 Superchip
  • Precision: Mixed-Precision (FP16/FP32) Matrix Multiplication (Tensor Cores)
  • Throughput: 1.2 Million training samples processed over 8 epochs.

📜 Citation & License

If you use this model in your research or trading systems, please cite:

Assi, A. (2026). Quantitative-Equity-Alpha-Transformer: A Systematic Approach to Macro-Liquidity Modeling using Temporal Fusion Transformers. Hugging Face Model Hub.

License: MIT License. Free for academic and research use.
