---

license: mit
tags:
  - finance
  - trading
  - bitcoin
  - cryptocurrency
  - quantitative-analysis
  - ensemble
  - xgboost
  - pytorch
  - transformer
  - lstm
  - time-series
  - forecasting
language:
  - en
pipeline_tag: tabular-classification
library_name: pytorch
---


<div align="center">

# 🔮 Nexus Shadow-Quant – Trained Models

### Institutional-Grade Crypto Intelligence Engine

[![GitHub](https://img.shields.io/badge/Source-lukeedIII%2FPredictor-181717?style=for-the-badge&logo=github)](https://github.com/lukeedIII/Predictor)
[![Version](https://img.shields.io/badge/Version-v6.4.2-6366f1?style=for-the-badge)]()
[![License](https://img.shields.io/badge/License-MIT-22c55e?style=for-the-badge)]()

</div>

---

## 📋 Overview

This repository contains the **pre-trained model artifacts** for [Nexus Shadow-Quant](https://github.com/lukeedIII/Predictor) – a 16-model ensemble engine for BTC/USDT directional forecasting.

**Why this exists:** Training the full model stack from scratch takes ~6 hours on a modern GPU. By hosting the trained weights here, new installations can pull them instantly and skip the initial training phase entirely.

---

## ๐Ÿ—๏ธ Model Architecture

| Model | Type | Size / Training | Trained | Purpose |
|:---|:---|:---|:---|:---|
| `predictor_v3.joblib` | XGBoost Ensemble | ~500 trees | 15 Feb 2026, 02:31 | Primary directional classifier |
| `nexus_lstm_v3.pth` | Bi-LSTM | ~2M params | 14 Feb 2026, 11:45 | Sequence pattern recognition |
| `nexus_transformer_v2.pth` | Transformer (Large) | 152M params, 5 epochs | 15 Feb 2026, 04:44 | Long-range dependency modeling |
| `nexus_medium_transformer_v1.pth` | Transformer (Medium) | 5 epochs | 15 Feb 2026, 05:49 | Balanced capacity/speed |
| `nexus_small_transformer_v1.pth` | Transformer (Small) | 10 epochs | 15 Feb 2026, 05:24 | Fast inference, high accuracy |
| `nexus_transformer_pretrained.pth` | Pretrained base | – | 14 Feb 2026, 07:22 | Foundation weights |
| `feature_scaler_v3.pkl` | StandardScaler | – | 15 Feb 2026, 02:31 | Feature normalization state |
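After a pull, it can be worth checking that all seven artifacts actually arrived before starting the backend. A minimal sketch (filenames are taken from the table above; the helper function is ours, not part of the project):

```python
from pathlib import Path

# Artifact filenames listed in the model table above.
EXPECTED_ARTIFACTS = {
    "predictor_v3.joblib",
    "nexus_lstm_v3.pth",
    "nexus_transformer_v2.pth",
    "nexus_medium_transformer_v1.pth",
    "nexus_small_transformer_v1.pth",
    "nexus_transformer_pretrained.pth",
    "feature_scaler_v3.pkl",
}

def missing_artifacts(models_dir: str) -> list[str]:
    """Return the expected files that are absent from models_dir."""
    d = Path(models_dir)
    present = {p.name for p in d.iterdir()} if d.is_dir() else set()
    return sorted(EXPECTED_ARTIFACTS - present)
```

An empty result means the directory holds a complete set and the predictor can initialize.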

### Supporting Models (16-Model Quant Panel)
- **GARCH(1,1)** – Volatility regime detection
- **MF-DFA** – Multi-fractal detrended fluctuation analysis
- **TDA** – Topological Data Analysis (persistent homology)
- **Bates SVJ** – Stochastic volatility with jumps
- **HMM (3-state)** – Hidden Markov Model for regime classification
- **RQA** – Recurrence Quantification Analysis
- plus 10 more statistical models
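For orientation, the GARCH(1,1) panel member follows the standard conditional-variance recursion, σ²ₜ = ω + α·r²ₜ₋₁ + β·σ²ₜ₋₁. A minimal sketch with illustrative parameters (the project's fitted values are not published here):

```python
def garch11_variance(returns, omega=1e-6, alpha=0.1, beta=0.85):
    """GARCH(1,1) conditional-variance recursion:

        sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]

    Seeded with the unconditional variance omega / (1 - alpha - beta);
    requires alpha + beta < 1 for stationarity.
    """
    sigma2 = [omega / (1.0 - alpha - beta)]  # long-run variance seed
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

A sustained rise in the fitted variance path is what a "volatility regime" signal keys on.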

---

## 📊 Performance (Audited)

| Metric | Value |
|:---|:---|
| **Audit Size** | 105,031 predictions on 3.15M candles |
| **Accuracy** | 50.71% (statistically significant above 50%) |
| **Sharpe Ratio** | 0.88 (annualized, fee-adjusted) |
| **Prediction Horizon** | 15 minutes |
| **Features** | 42 scale-invariant (returns/ratios/z-scores) |
| **Fee Model** | Binance taker 0.04% + slippage 0.01% |
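The fee model sets a breakeven bar for every 15-minute trade: a round trip (entry plus exit) pays the taker fee and slippage twice, so a correct call only earns money when the move exceeds that cost. A quick check of the numbers in the table (assuming slippage is charged per fill, like the fee):

```python
TAKER_FEE = 0.0004  # Binance taker fee, 0.04%
SLIPPAGE = 0.0001   # assumed 0.01% per fill

# A round trip pays fee + slippage on both the entry and the exit.
round_trip_cost = 2 * (TAKER_FEE + SLIPPAGE)
print(f"Breakeven move per trade: {round_trip_cost:.2%}")
```

So the 50.71% edge only translates into the quoted Sharpe when the average captured move clears roughly 0.10%.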

---

## 🕐 Training Log

<details>
<summary><strong>📈 Small Transformer – 10 epochs (15 Feb 2026)</strong></summary>

| Epoch | Accuracy | Timestamp |
|:---|:---|:---|
| 1 | 60.0% | 05:09 |
| 2 | 69.7% | 05:10 |
| 3 | 72.6% | 05:12 |
| 4 | 74.5% | 05:14 |
| 5 | 75.2% | 05:15 |
| 6 | 76.0% | 05:17 |
| 7 | 76.8% | 05:19 |
| 8 | 76.8% | 05:20 |
| 9 | 76.9% | 05:22 |
| **10** | **76.9%** ✅ | **05:24** |

</details>

<details>
<summary><strong>📈 Medium Transformer – 5 epochs (15 Feb 2026)</strong></summary>

| Epoch | Accuracy | Timestamp |
|:---|:---|:---|
| 1 | 58.1% | 05:34 |
| 2 | 69.8% | 05:37 |
| 3 | 72.7% | 05:41 |
| 4 | 74.8% | 05:45 |
| **5** | **76.2%** ✅ | **05:49** |

</details>

<details>
<summary><strong>📈 Nexus Transformer (152M) – 9 epochs (15 Feb 2026)</strong></summary>

| Epoch | Accuracy | Timestamp |
|:---|:---|:---|
| 1 | 51.3% | 06:30 |
| 2 | 52.4% | 06:51 |
| 3 | 52.4% | 07:12 |
| 4 | 53.1% | 07:32 |
| 5 | 54.6% | 07:52 |
| 6 | 55.3% | 08:13 |
| 7 | 57.3% | 08:33 |
| 8 | 58.1% | 08:54 |
| **9** | **58.7%** ✅ | **09:14** |

*Epoch 10 failed; weights from epoch 9 were preserved.*

</details>

---

## ⚡ Quick Start

### Automatic (Recommended)
The Nexus Shadow-Quant app will **auto-pull** these models on first startup if no local models are found. Simply:
1. Set your `HUGGINGFACE_TOKEN` and `HF_REPO_ID` in Settings.
2. Restart the backend.
3. Models are downloaded and the predictor is ready instantly.
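The auto-pull step amounts to: check for local weights, and only reach out to the Hub when none exist. A sketch of that logic (the function and parameter names are ours; in the real app the downloader role is played by `huggingface_hub.snapshot_download`):

```python
import os

def ensure_models(models_dir: str, repo_id: str, download_fn) -> str:
    """Auto-pull sketch: download weights only when models_dir is empty.

    download_fn stands in for huggingface_hub.snapshot_download so the
    decision logic can be exercised without touching the network.
    """
    if os.path.isdir(models_dir) and os.listdir(models_dir):
        return "using local models"
    os.makedirs(models_dir, exist_ok=True)
    download_fn(repo_id=repo_id, local_dir=models_dir)
    return "pulled from Hub"
```

Because the check is "any file present", a partially deleted `models/` folder will *not* trigger a re-pull; use the manual command below to force a fresh download.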

### Manual
```bash
pip install huggingface_hub
huggingface-cli download Lukeed/Predictor-Models --local-dir ./models
```

---

## 🔄 Sync Protocol

| Action | What happens |
|:---|:---|
| **Push to Hub** | Uploads all files from `models/` folder to this repo |
| **Pull from Hub** | Downloads latest weights, re-initializes the predictor |
| **Auto-Pull** | On startup, if no local models found, pulls automatically |

---

## ⚠️ Disclaimer

These models are trained on historical BTC/USDT data and are provided for **educational and research purposes only**. They are not financial advice. Cryptocurrency markets are volatile. Past performance does not guarantee future results.

---

<div align="center">

**Dr. Nexus** · *Quantitative intelligence, engineered locally.*

</div>