# CryptoTS-Transformer-Forecast-7D

## Overview
CryptoTS-Transformer-Forecast-7D is a purpose-built Transformer model designed for financial time series forecasting, specifically predicting the next 7 days of the closing price for major cryptocurrencies (e.g., BTC, ETH).
Unlike traditional RNNs, this architecture leverages the attention mechanism to capture long-range dependencies and complex correlations across multiple technical indicators, leading to more robust sequence-to-sequence predictions.
## Model Architecture
The model is a custom encoder-decoder Transformer architecture:
- Type: Sequence-to-Sequence Time Series Transformer.
- Input Sequence Length (`seq_len`): 60 time steps (60 days of historical data).
- Output Sequence Length (`output_len`): 7 time steps (the 7 days being predicted).
- Input Features (`input_size`): 12 features: OHLCV data plus 7 key technical indicators (MACD, RSI, moving averages, and Bollinger Bands).
- Core Components:
  - Positional Encoding: added to the input embeddings to preserve temporal order.
  - Multi-Head Attention: used in both the encoder and decoder to weigh the importance of different time steps and features.
- Loss Function: Mean Squared Error (MSE).
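The components above can be sketched as a minimal PyTorch encoder-decoder. This is an illustrative reconstruction, not the released implementation: the class names (`PositionalEncoding`, `TSTransformer`), layer counts, `d_model`, and the zero-query decoding scheme are all assumptions.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding added to the input embeddings."""
    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len).unsqueeze(1).float()
        div_term = torch.exp(torch.arange(0, d_model, 2).float()
                             * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe.unsqueeze(0))  # (1, max_len, d_model)

    def forward(self, x):  # x: (batch, seq, d_model)
        return x + self.pe[:, : x.size(1)]

class TSTransformer(nn.Module):
    """Hypothetical seq2seq forecaster: 60 input steps -> 7 predicted steps."""
    def __init__(self, input_size=12, d_model=64, nhead=4, output_len=7):
        super().__init__()
        self.output_len = output_len
        self.input_proj = nn.Linear(input_size, d_model)
        self.pos_enc = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=2, num_decoder_layers=2, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # one output: the scaled Close price

    def forward(self, src):  # src: (batch, 60, 12)
        memory_in = self.pos_enc(self.input_proj(src))
        # One simple choice: position-encoded zero queries for the 7 future steps.
        tgt = self.pos_enc(torch.zeros(src.size(0), self.output_len,
                                       memory_in.size(-1), device=src.device))
        out = self.transformer(memory_in, tgt)
        return self.head(out)  # (batch, 7, 1)

model = TSTransformer()
pred = model(torch.randn(2, 60, 12))
print(pred.shape)  # torch.Size([2, 7, 1])
```

Training such a model against the true 7-day Close sequence with `nn.MSELoss()` matches the loss function listed above.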
## Intended Use
- Financial Forecasting: Generating forward-looking predictions for high-volatility financial assets.
- Algorithmic Trading Signal: using the 7-day forecast as an input to automated trading strategies.
- Risk Assessment: Modeling potential near-term price volatility based on predicted ranges.
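As a sketch of the last two use cases, a 7-day forecast can be reduced to a simple signal and a risk proxy. The numbers, thresholds, and the `horizon_return`/`predicted_range` definitions below are illustrative assumptions, not part of the model.

```python
import numpy as np

# Illustrative 7-day forecast of closing prices (already inverse-scaled to USD).
forecast = np.array([61200.0, 61550.0, 61900.0, 61400.0, 62300.0, 62800.0, 63100.0])
last_close = 61000.0

# Signal: expected return over the full 7-day horizon.
horizon_return = forecast[-1] / last_close - 1.0

# Risk proxy: width of the predicted range relative to today's close.
predicted_range = (forecast.max() - forecast.min()) / last_close

# Toy rule: go long only if the expected horizon return exceeds 1%.
signal = "long" if horizon_return > 0.01 else "flat"
print(f"return: {horizon_return:.2%}, range: {predicted_range:.2%}, signal: {signal}")
```

Any real strategy would of course add transaction costs, position sizing, and uncertainty estimates on top of this.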
## Limitations
- Non-Stationarity: Financial time series are inherently non-stationary. While the model uses robust scaling, sudden market shifts (black swan events, major news) can lead to significant forecast errors.
- Data Quality: The model's performance is heavily reliant on the accuracy and quality of the input features, particularly the technical indicators.
- Overfitting: Due to the complexity of the Transformer, there is a risk of overfitting to historical noise patterns.
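The robust scaling mentioned above (median/IQR rather than mean/std) limits how much a single outlier distorts the feature scale. A minimal NumPy sketch, with synthetic data standing in for real features:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy feature matrix: 60 days x 12 features, with one simulated outlier spike.
X = rng.normal(100.0, 5.0, size=(60, 12))
X[3, 0] = 400.0  # sudden market shock in feature 0

# Robust scaling: center on the median, scale by the interquartile range,
# so a single spike barely moves the scaling statistics.
med = np.median(X, axis=0)
iqr = np.percentile(X, 75, axis=0) - np.percentile(X, 25, axis=0)
X_scaled = (X - med) / iqr

# The inverse transform restores original units -- needed to read
# model predictions back in price terms.
X_restored = X_scaled * iqr + med
print(np.allclose(X_restored, X))  # True
```

Note that even robust scaling cannot help when the future distribution shifts away from the training data entirely, which is the core non-stationarity risk.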
## Example Code

This example assumes a pre-processed input tensor `input_data` (60 days of 12 features) scaled with the same scaler used during training.
```python
import torch
from transformers import AutoModel

# NOTE: This model uses a custom architecture not included in the standard
# transformers library, so it must be loaded via its custom class or with
# trust_remote_code=True. For demonstration, we assume it is exposed
# through AutoModel.
model_name = "Your-HF-Username/CryptoTS-Transformer-Forecast-7D"
model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
model.eval()

# Dummy input: (batch_size, sequence_length, input_features) -> (1, 60, 12).
# In a real scenario this would be 60 days of scaled OHLCV + 7 indicators.
input_data = torch.randn(1, model.config.seq_len, model.config.input_size)

# Make the prediction.
with torch.no_grad():
    # Output shape: (batch_size, output_len, 1), i.e. (1, 7, 1) -- the
    # predicted Close price for each of the next 7 days.
    prediction = model(input_data)

# Convert the predictions back to the original price scale using the
# inverse transform of the scaler applied during preprocessing:
# predicted_prices_7d = scaler.inverse_transform(prediction.squeeze(0).cpu().numpy())

print("Input shape (60 days, 12 features):", input_data.shape)
print("Predicted output shape (7 days, 1 feature/price):", prediction.shape)
print("Predicted 7-day normalized closing prices:\n", prediction.squeeze().numpy())
```
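Building the 12-feature input window from raw OHLCV data can be sketched with pandas. The exact indicator set and parameters (MACD 12/26, RSI 14, SMA 20/50, 20-day Bollinger Bands, EMA 12) are assumptions chosen to match the feature count in the model card; the real feature pipeline may differ.

```python
import numpy as np
import pandas as pd

# Illustrative OHLCV frame; in practice this comes from an exchange API.
rng = np.random.default_rng(42)
close = pd.Series(60000 + rng.normal(0, 200, 120).cumsum())
df = pd.DataFrame({
    "open": close.shift(1).fillna(close.iloc[0]),
    "high": close * 1.01,
    "low": close * 0.99,
    "close": close,
    "volume": rng.uniform(1e3, 5e3, 120),
})

# Seven indicator columns (assumed set) to reach the 12 input features.
ema12 = df["close"].ewm(span=12, adjust=False).mean()
ema26 = df["close"].ewm(span=26, adjust=False).mean()
df["macd"] = ema12 - ema26
delta = df["close"].diff()
gain = delta.clip(lower=0).rolling(14).mean()
loss = (-delta.clip(upper=0)).rolling(14).mean()
df["rsi"] = 100 - 100 / (1 + gain / loss)
df["sma20"] = df["close"].rolling(20).mean()
df["sma50"] = df["close"].rolling(50).mean()
std20 = df["close"].rolling(20).std()
df["bb_upper"] = df["sma20"] + 2 * std20
df["bb_lower"] = df["sma20"] - 2 * std20
df["ema12"] = ema12

# Keep the last 60 complete rows: one (60, 12) model input window.
features = df.dropna().tail(60)
print(features.shape)  # (60, 12)
```

After scaling, `features.values` would be wrapped in a tensor of shape `(1, 60, 12)` and passed to the model as in the example above.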