Update README.md

#1
by PapaMoth - opened
Files changed (1)
  1. README.md +128 -115
README.md CHANGED
@@ -1,115 +1,128 @@
- ---
- license: mit
- tags:
- - finance
- - trading
- - bitcoin
- - cryptocurrency
- - machine-learning
- - deep-learning
- - lstm
- - transformer
- - xgboost
- - random-forest
- - shap
- language:
- - en
- library_name: pytorch
- pipeline_tag: tabular-classification
- ---
-
- # πŸ“ˆ AI Multi-Model Trading Bot
-
- A comprehensive cryptocurrency trading signal prediction system using 8 ML/DL models with ensemble voting.
-
- ## 🎯 Model Overview
-
- | Model Type | Models Included |
- | :--- | :--- |
- | **Traditional ML** | Logistic Regression, Random Forest, XGBoost |
- | **Deep Learning** | LSTM, GRU, CNN, LSTM+Attention, Transformer |
- | **Ensemble** | Majority voting across all models |
-
- ## πŸ“Š Features Used
-
- The models use 10 technical indicators:
- - RSI (Relative Strength Index)
- - MACD & MACD Signal
- - Bollinger Band Width
- - ATR (Average True Range)
- - Distance from SMA50
- - OBV Percentage Change
- - ADX (Average Directional Index)
- - Stochastic RSI (K & D)
-
- ## πŸš€ Quick Start
-
- ```python
- import joblib
- import torch
-
- # Load scaler and config
- scaler = joblib.load("scaler.pkl")
- config = joblib.load("config.pkl")
-
- # Load ML model
- rf_model = joblib.load("random_forest.pkl")
-
- # Load DL model
- from your_models import LSTMModel
- lstm = LSTMModel(config['input_dim'])
- lstm.load_state_dict(torch.load("lstm.pt"))
- lstm.eval()
- ```
-
- ## πŸ“ Files
-
- | File | Description |
- | :--- | :--- |
- | `scaler.pkl` | StandardScaler for feature preprocessing |
- | `config.pkl` | Model configuration (input_dim, timesteps, feature_cols) |
- | `logistic_regression.pkl` | Trained Logistic Regression model |
- | `random_forest.pkl` | Trained Random Forest model |
- | `xgboost.pkl` | Trained XGBoost model |
- | `lstm.pt` | Trained LSTM model weights |
- | `gru.pt` | Trained GRU model weights |
- | `cnn.pt` | Trained CNN model weights |
- | `lstm_attention.pt` | Trained LSTM+Attention model weights |
- | `transformer.pt` | Trained Transformer model weights |
- | `shap_values.pkl` | SHAP feature importance values |
-
- ## πŸ“Š Dataset
-
- Training data is available separately:
- **πŸ”— [AdityaaXD/Multi-Model-Trading-Data](https://huggingface.co/datasets/AdityaaXD/Multi-Model-Trading-Data)**
-
- - **Ticker**: BTC-USD
- - **Date Range**: 2015-01-01 to 2025-01-01
- - **Total Samples**: ~3,600 days
- - **Train/Test Split**: 80/20
-
- ## ⚠️ Disclaimer
-
- This model is for **educational and research purposes only**. It should NOT be used for actual trading decisions. Cryptocurrency markets are highly volatile and past performance does not guarantee future results.
-
- ## πŸ“Š SHAP Explainability
-
- The model includes SHAP (SHapley Additive exPlanations) values for feature importance analysis, helping understand which technical indicators most influence predictions.
-
- ## πŸ› οΈ Training Details
-
- - **Hyperparameter Tuning**: GridSearchCV with 3-fold CV
- - **Deep Learning**: 50 epochs, early stopping (patience=7)
- - **Regularization**: Label smoothing (0.1), gradient clipping (1.0)
- - **Class Balancing**: Weighted loss functions
-
- ## πŸ“ Citation
-
- ```bibtex
- @misc{ai-trading-bot-2025,
- title={AI Multi-Model Trading Bot},
- author={Your Name},
- year={2025},
- publisher={Hugging Face}
- }
- ```

+ ---
+ license: mit
+ tags:
+ - finance
+ - trading
+ - bitcoin
+ - cryptocurrency
+ - machine-learning
+ - deep-learning
+ - lstm
+ - transformer
+ - xgboost
+ - random-forest
+ - shap
+ language:
+ - en
+ library_name: pytorch
+ pipeline_tag: tabular-classification
+ datasets:
+ - AdityaaXD/Multi-Model-Trading-Data
+ metrics:
+ - accuracy
+ ---
+
+ # πŸ“ˆ AI Multi-Model Trading Bot
+
+ A comprehensive cryptocurrency trading signal prediction system using 8 ML/DL models with ensemble voting.
+
+ ## 🎯 Model Overview
+
+ | Model Type | Models Included |
+ | :--- | :--- |
+ | **Traditional ML** | Logistic Regression, Random Forest, XGBoost |
+ | **Deep Learning** | LSTM, GRU, CNN, LSTM+Attention, Transformer |
+ | **Ensemble** | Majority voting across all models |
+
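The ensemble row above amounts to a simple majority vote. The helper below is an illustrative sketch, not the repository's actual code, and it assumes each of the 8 models emits a binary long/flat signal:

```python
def ensemble_vote(signals):
    """Majority vote over per-model binary signals (1 = long, 0 = flat).

    With 8 models a 4-4 tie is possible; this sketch resolves ties
    conservatively to 0 (no trade).
    """
    return 1 if sum(signals) * 2 > len(signals) else 0
```

With 5 of 8 models voting long, `ensemble_vote([1, 1, 1, 0, 0, 1, 1, 0])` returns 1; a 4-4 tie returns 0.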
+ ## πŸ“Š Features Used
+
+ The models use 10 technical indicators:
+ - RSI (Relative Strength Index)
+ - MACD & MACD Signal
+ - Bollinger Band Width
+ - ATR (Average True Range)
+ - Distance from SMA50
+ - OBV Percentage Change
+ - ADX (Average Directional Index)
+ - Stochastic RSI (K & D)
+
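As an example of the indicators listed, RSI can be computed from closing prices. This sketch uses Cutler's simple-average variant; the repository's actual pipeline may use Wilder smoothing via a TA library instead:

```python
def rsi(closes, period=14):
    """Cutler's RSI: simple average of gains vs. losses over `period` bars.

    Returns a value in [0, 100]; 100 when there were no losses in the
    window, 0 when there were no gains.
    """
    if len(closes) < period + 1:
        raise ValueError("need at least period + 1 closes")
    deltas = [b - a for a, b in zip(closes, closes[1:])]
    window = deltas[-period:]
    avg_gain = sum(d for d in window if d > 0) / period
    avg_loss = sum(-d for d in window if d < 0) / period
    if avg_loss == 0:
        return 100.0
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)
```

A strictly rising series yields RSI 100, a strictly falling one RSI 0.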
+ ## πŸš€ Quick Start
+
+ ```python
+ import joblib
+ import torch
+
+ # Load scaler and config
+ scaler = joblib.load("scaler.pkl")
+ config = joblib.load("config.pkl")
+
+ # Load ML model
+ rf_model = joblib.load("random_forest.pkl")
+
+ # Load DL model
+ from your_models import LSTMModel
+ lstm = LSTMModel(config['input_dim'])
+ lstm.load_state_dict(torch.load("lstm.pt"))
+ lstm.eval()
+ ```
+
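The recurrent models consume windows of `config['timesteps']` consecutive rows rather than single rows. A minimal windowing sketch follows; the exact preprocessing used in training is an assumption here:

```python
def make_sequences(features, timesteps):
    """Stack consecutive feature rows into overlapping windows of shape
    (timesteps, n_features) -- the input the LSTM/GRU/Transformer expect."""
    return [features[i:i + timesteps]
            for i in range(len(features) - timesteps + 1)]
```

For 5 rows and `timesteps=3` this yields 3 overlapping windows; in practice the rows would be scaled with `scaler` first.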
+ ## πŸ“ Files
+
+ | File | Description |
+ | :--- | :--- |
+ | `scaler.pkl` | StandardScaler for feature preprocessing |
+ | `config.pkl` | Model configuration (input_dim, timesteps, feature_cols) |
+ | `logistic_regression.pkl` | Trained Logistic Regression model |
+ | `random_forest.pkl` | Trained Random Forest model |
+ | `xgboost.pkl` | Trained XGBoost model |
+ | `lstm.pt` | Trained LSTM model weights |
+ | `gru.pt` | Trained GRU model weights |
+ | `cnn.pt` | Trained CNN model weights |
+ | `lstm_attention.pt` | Trained LSTM+Attention model weights |
+ | `transformer.pt` | Trained Transformer model weights |
+ | `shap_values.pkl` | SHAP feature importance values |
+
+ ## πŸ“Š Dataset
+
+ Training data is available separately:
+ **πŸ”— [AdityaaXD/Multi-Model-Trading-Data](https://huggingface.co/datasets/AdityaaXD/Multi-Model-Trading-Data)**
+
+ - **Ticker**: BTC-USD
+ - **Date Range**: 2015-01-01 to 2025-01-01
+ - **Total Samples**: ~3,600 days
+ - **Train/Test Split**: 80/20
+
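Because the samples are ordered daily bars, the 80/20 split should be chronological (no shuffling), so the test period lies strictly after the training period. A sketch of that split, assuming rows are already sorted by date:

```python
def chrono_split(rows, train_frac=0.8):
    """Chronological train/test split: the first 80% of days train,
    the last 20% test. Shuffling would leak future data into training."""
    cut = int(len(rows) * train_frac)
    return rows[:cut], rows[cut:]
```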
+ ## ⚠️ Disclaimer
+
+ This model is for **educational and research purposes only**. It should NOT be used for actual trading decisions. Cryptocurrency markets are highly volatile and past performance does not guarantee future results.
+
+ ## πŸ“Š SHAP Explainability
+
+ The model includes SHAP (SHapley Additive exPlanations) values for feature importance analysis, helping understand which technical indicators most influence predictions.
+
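Once `shap_values.pkl` is loaded, features can be ranked by mean absolute SHAP value. The array layout and feature names below are assumptions for illustration (the real names live in `config['feature_cols']`), with a small synthetic array standing in for the pickle:

```python
feature_cols = ["rsi", "macd", "macd_signal", "bb_width", "atr",
                "dist_sma50", "obv_pct_change", "adx", "stoch_k", "stoch_d"]

# Stand-in for joblib.load("shap_values.pkl"): rows = samples, cols = features.
shap_values = [
    [0.30, -0.10, 0.05, 0.02, -0.20, 0.01, 0.00, 0.15, -0.05, 0.03],
    [-0.25, 0.12, -0.04, 0.01, 0.18, -0.02, 0.01, -0.10, 0.06, -0.02],
]

# Mean |SHAP| per feature, then rank: higher means more influence on predictions.
mean_abs = {
    col: sum(abs(row[i]) for row in shap_values) / len(shap_values)
    for i, col in enumerate(feature_cols)
}
ranked = sorted(mean_abs, key=mean_abs.get, reverse=True)
```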
+ ## πŸ› οΈ Training Details
+
+ - **Hyperparameter Tuning**: GridSearchCV with 3-fold CV
+ - **Deep Learning**: 50 epochs, early stopping (patience=7)
+ - **Regularization**: Label smoothing (0.1), gradient clipping (1.0)
+ - **Class Balancing**: Weighted loss functions
+
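The early-stopping rule above (patience=7) amounts to a small bookkeeping helper around the validation loss. This sketch is framework-agnostic and not the repository's actual training loop:

```python
class EarlyStopping:
    """Stop training once validation loss has not improved by at least
    `min_delta` for `patience` consecutive epochs."""

    def __init__(self, patience=7, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True when it is
        time to stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience
```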
+ ## πŸ“ Citation
+
+ ```bibtex
+ @misc{ai-trading-bot-2025,
+   title={AI Multi-Model Trading Bot},
+   author={Your Name},
+   year={2025},
+   publisher={Hugging Face}
+ }
+ ```