---
tags:
- time-series-prediction
- transformer
- cryptocurrency
- finance
datasets:
- crypto-historical-data
license: mit
---

# CryptoTS-Transformer-Forecast-7D

## 📝 Overview

**CryptoTS-Transformer-Forecast-7D** is a purpose-built Transformer model for financial time-series forecasting, specifically predicting the **next 7 days of closing prices** for major cryptocurrencies (e.g., BTC, ETH).

Unlike traditional RNNs, this architecture leverages the attention mechanism to capture long-range dependencies and complex correlations across multiple technical indicators, leading to more robust sequence-to-sequence predictions.

## 🧠 Model Architecture

The model is a custom encoder-decoder Transformer architecture:

* **Type:** Sequence-to-sequence time-series Transformer.
* **Input Sequence Length (`seq_len`):** 60 time steps (60 days of historical data).
* **Output Sequence Length (`output_len`):** 7 time steps (the 7 days being predicted).
* **Input Features (`input_size`):** 12 features, including OHLCV data and 7 key technical indicators (MACD, RSI, moving averages, and Bollinger Bands).
* **Core Components:**
    * **Positional Encoding:** added to the input embeddings to preserve temporal order.
    * **Multi-Head Attention:** used in both the encoder and decoder to weigh the importance of different time steps and features.
* **Loss Function:** Mean Squared Error (MSE).
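
The components above can be sketched with PyTorch's built-in `nn.Transformer`. This is a minimal illustration of the described shape contract (60 × 12 in, 7 × 1 out), not the repository's actual implementation; all class and hyperparameter names here are assumptions.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Adds sinusoidal positional encodings so attention sees temporal order."""
    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x):  # x: (batch, seq, d_model)
        return x + self.pe[: x.size(1)]

class CryptoTSTransformer(nn.Module):
    """Illustrative encoder-decoder forecaster: (B, 60, 12) -> (B, 7, 1)."""
    def __init__(self, input_size=12, d_model=64, nhead=4, output_len=7):
        super().__init__()
        self.output_len = output_len
        self.embed = nn.Linear(input_size, d_model)   # project features to d_model
        self.pos_enc = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True,
        )
        self.head = nn.Linear(d_model, 1)             # one close price per step

    def forward(self, src):  # src: (batch, seq_len, input_size)
        memory_in = self.pos_enc(self.embed(src))
        # Decoder queries: one zero token per forecast day (simplified).
        tgt = self.pos_enc(torch.zeros(
            src.size(0), self.output_len, memory_in.size(-1), device=src.device))
        out = self.transformer(memory_in, tgt)
        return self.head(out)  # (batch, output_len, 1)

model = CryptoTSTransformer()
pred = model(torch.randn(1, 60, 12))
print(pred.shape)  # torch.Size([1, 7, 1])
```

During training, the MSE loss mentioned above would compare this `(batch, 7, 1)` output against the next 7 scaled closing prices.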

## 🚀 Intended Use

* **Financial Forecasting:** Generating forward-looking predictions for high-volatility financial assets.
* **Algorithmic Trading Signal:** Use the 7-day forecast as input for automated trading strategies.
* **Risk Assessment:** Modeling potential near-term price volatility based on predicted ranges.
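
As a sketch of the trading-signal use case, the 7-day forecast can be collapsed into a coarse long/flat/short signal. The function and the 2% threshold below are purely illustrative assumptions, not part of this repository, and not trading advice.

```python
import torch

def forecast_to_signal(prediction: torch.Tensor, last_close: float,
                       threshold: float = 0.02) -> str:
    """Map a 7-day forecast to a coarse long/flat/short signal.

    prediction: (1, 7, 1) tensor of predicted closes (on the model's scale).
    last_close: last observed close on the same scale.
    threshold:  minimum expected move before acting (hypothetical value).
    """
    expected = prediction.squeeze().mean().item()          # mean predicted close
    expected_return = (expected - last_close) / abs(last_close)
    if expected_return > threshold:
        return "long"
    if expected_return < -threshold:
        return "short"
    return "flat"

# Forecast 10% above the last close -> "long"
print(forecast_to_signal(torch.full((1, 7, 1), 1.10), last_close=1.00))  # long
```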

## ⚠️ Limitations

* **Non-Stationarity:** Financial time series are inherently non-stationary. While the model uses robust scaling, sudden market shifts (black-swan events, major news) can lead to significant forecast errors.
* **Data Quality:** The model's performance is heavily reliant on the accuracy and quality of the input features, particularly the technical indicators.
* **Overfitting:** Due to the complexity of the Transformer, there is a risk of overfitting to historical noise patterns.
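
The robust scaling mentioned above can be sketched with scikit-learn's `RobustScaler` (median/IQR scaling, less sensitive to outliers than standard z-scoring). The synthetic data and window size are stand-ins; the repository's actual preprocessing may differ.

```python
import numpy as np
from sklearn.preprocessing import RobustScaler

# 200 days of 12 raw features (OHLCV + indicators); synthetic stand-in data.
rng = np.random.default_rng(0)
raw = rng.normal(loc=100.0, scale=15.0, size=(200, 12))

# Fit on the training window only, then reuse the same scaler at inference
# time -- fitting on future data would leak information into the model.
scaler = RobustScaler()
scaled = scaler.fit_transform(raw)

window = scaled[-60:]   # last 60 days -> one (60, 12) model input
print(window.shape)     # (60, 12)

# After predicting, map outputs back to price scale with the scaler's
# inverse_transform on the corresponding feature column(s).
```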

## 💻 Example Code

This example assumes a pre-processed input tensor `input_data` (60 days of 12 features) scaled with the same scaler used during training.

```python
import torch
from transformers import AutoModel

# NOTE: This model uses a custom class that is not part of the standard
# transformers library; it must be registered with AutoModel or loaded
# with trust_remote_code=True.
model_name = "Your-HF-Username/CryptoTS-Transformer-Forecast-7D"
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
model.eval()

# Dummy input data: (batch_size, seq_len, input_size) -> (1, 60, 12).
# In a real scenario, this would be 60 days of OHLCV + 7 indicators, scaled.
input_data = torch.randn(1, model.config.seq_len, model.config.input_size)

# Make the prediction. The output shape is (batch_size, output_len, 1),
# i.e. (1, 7, 1): one predicted close price for each of the next 7 days.
with torch.no_grad():
    prediction = model(input_data)

# Convert predicted prices back to the original scale with the scaler
# fitted during training:
# predicted_prices_7d = scaler.inverse_transform(prediction.squeeze(0).cpu().numpy())

print("Input shape (60 days, 12 features):", input_data.shape)
print("Predicted output shape (7 days, 1 feature/price):", prediction.shape)
print("Predicted 7-day normalized closing prices:\n", prediction.squeeze().numpy())
```