YuHaibo-HF committed on
Commit e1eb897 · verified · 1 Parent(s): d35b6a8

Upload fine-tuned Kronos predictor model

Files changed (3):
  1. README.md +108 -0
  2. config.json +13 -0
  3. model.safetensors +3 -0
README.md ADDED
@@ -0,0 +1,108 @@
---
language: zh
license: mit
tags:
- kronos
- financial-modeling
- time-series
- cryptocurrency
- stock-prediction
- pytorch
- transformers
datasets:
- custom
metrics:
- mse
- mae
widget:
- text: "Financial time series prediction"
---

# Kronos Predictor - Fine-tuned on Custom Dataset

This is a fine-tuned version of the [Kronos](https://huggingface.co/NeoQuasar/Kronos-small) predictor,
adapted for better performance on custom financial datasets.

## Model Details

- **Model Type**: Predictor
- **Base Model**: NeoQuasar/Kronos-small
- **Fine-tuned For**: Financial Time Series Prediction
- **Architecture**: Transformer-based with custom tokenization
- **Input**: OHLCV data plus trade amount (Open, High, Low, Close, Volume, Amount)
- **Output**: Multi-step time series predictions

## Training Details

- **Training Data**: Crypto Dataset (BTC, ETH, SOL, XAU)
- **Time Range**: 2022-01-21 to 2025-09-16
- **Frequency**: 5-minute intervals
- **Sequence Length**: 90 historical points
- **Prediction Horizon**: 10 future points

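The 90-point lookback and 10-point horizon above correspond to a standard sliding-window split of the series. A minimal sketch in plain NumPy (illustrative only — this is not the actual Kronos training pipeline):

```python
import numpy as np

def make_windows(series: np.ndarray, lookback: int = 90, horizon: int = 10):
    """Slice a (T, features) array into (lookback, horizon) training pairs."""
    xs, ys = [], []
    for start in range(len(series) - lookback - horizon + 1):
        xs.append(series[start : start + lookback])
        ys.append(series[start + lookback : start + lookback + horizon])
    return np.stack(xs), np.stack(ys)

# 200 timesteps of fake data with 6 columns (Open, High, Low, Close, Volume, Amount)
data = np.random.rand(200, 6)
x, y = make_windows(data)
print(x.shape, y.shape)  # (101, 90, 6) (101, 10, 6)
```

Each input window of 90 rows is paired with the 10 rows that immediately follow it.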
## Usage

### For Tokenizer
```python
from model.kronos import KronosTokenizer

tokenizer = KronosTokenizer.from_pretrained("NeoQuasar/Kronos-small")
# Your tokenization code here
```

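The tokenizer maps continuous price/volume values to discrete tokens. The `s1_bits`/`s2_bits` fields in this repo's `config.json` suggest two 10-bit token levels (1024 values each). As a rough illustration of the general idea only — this is *not* the actual `KronosTokenizer` algorithm — uniform quantization into 2^10 bins looks like:

```python
import numpy as np

def quantize(values: np.ndarray, bits: int = 10) -> np.ndarray:
    """Map values in [0, 1) to integer tokens in [0, 2**bits)."""
    levels = 2 ** bits
    return np.clip((values * levels).astype(int), 0, levels - 1)

def dequantize(tokens: np.ndarray, bits: int = 10) -> np.ndarray:
    """Map tokens back to the center value of their bin."""
    levels = 2 ** bits
    return (tokens + 0.5) / levels

vals = np.array([0.0, 0.25, 0.5, 0.999])
toks = quantize(vals)
print(toks)  # [   0  256  512 1022]
```

Round-tripping through `dequantize` recovers values to within half a bin width (1/2048 here).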
### For Predictor
```python
from model.kronos import Kronos, KronosTokenizer, KronosPredictor

tokenizer = KronosTokenizer.from_pretrained("NeoQuasar/Kronos-small")
model = Kronos.from_pretrained("NeoQuasar/Kronos-small")
predictor = KronosPredictor(model, tokenizer, device="cuda")

# Your prediction code here
predictions = predictor.predict(...)
```

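The `predict(...)` call above is elided; its exact signature is defined in the Kronos repository. Whatever the signature, the model consumes a lookback window of OHLCV rows and the timestamps to forecast. A hedged sketch of preparing those pieces with pandas (column names and variable names are assumptions based on the input description above, not the repo's API):

```python
import numpy as np
import pandas as pd

lookback, pred_len = 90, 10

# Fake 5-minute OHLCV history; real data would come from your exchange or CSV.
ts = pd.date_range("2025-01-01", periods=lookback, freq="5min")
x_df = pd.DataFrame(
    np.random.rand(lookback, 6),
    columns=["open", "high", "low", "close", "volume", "amount"],
)
x_timestamp = pd.Series(ts)

# Timestamps of the 10 future points to predict, continuing at 5-minute steps.
y_timestamp = pd.Series(
    pd.date_range(ts[-1] + pd.Timedelta(minutes=5), periods=pred_len, freq="5min")
)
print(len(x_df), len(y_timestamp))  # 90 10
```

These objects (history frame, its timestamps, and the future timestamps) are then handed to the predictor per the repository's examples.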
### With the Original Repository
```bash
# Clone the Kronos repository
git clone https://github.com/shiyu-coder/Kronos.git
cd Kronos

# Use the fine-tuned models
python examples/use_finetuned_model.py \
    --csv_data your_data.csv \
    --lookback 400 \
    --pred_len 120
```

## Performance

This fine-tuned model shows improved performance on the target domain compared to the base model:

- **Domain Adaptation**: Specialized for the training dataset characteristics
- **Numerical Stability**: Improved convergence during fine-tuning
- **Inference Speed**: Optimized for the target sequence lengths

## Limitations

- Optimized for 5-minute financial data intervals
- May require re-tuning for different time frequencies
- Performance may vary on datasets with different statistical properties

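Since the model is tuned to 5-minute bars, data at a finer frequency should be resampled up (or the model re-tuned). A minimal pandas sketch of the standard OHLCV aggregation, shown here going from 5-minute to 15-minute bars:

```python
import numpy as np
import pandas as pd

idx = pd.date_range("2025-01-01", periods=12, freq="5min")
df = pd.DataFrame({
    "open": np.arange(12.0),
    "high": np.arange(12.0) + 1,
    "low": np.arange(12.0) - 1,
    "close": np.arange(12.0) + 0.5,
    "volume": np.ones(12),
}, index=idx)

# Standard OHLCV aggregation: first open, max high, min low, last close, summed volume.
bars_15m = df.resample("15min").agg({
    "open": "first", "high": "max", "low": "min", "close": "last", "volume": "sum",
})
print(bars_15m.shape)  # (4, 5)
```

Twelve 5-minute bars collapse into four 15-minute bars; each output bar aggregates three inputs.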
## Citation

```bibtex
@misc{shi2025kronos,
  title={Kronos: A Foundation Model for the Language of Financial Markets},
  author={Yu Shi and Zongliang Fu and Shuo Chen and Bohan Zhao and Wei Xu and Changshui Zhang and Jian Li},
  year={2025},
  eprint={2508.02739},
  archivePrefix={arXiv},
  primaryClass={q-fin.ST},
  url={https://arxiv.org/abs/2508.02739},
}
```

## Contact

For questions or issues, please open an issue on the [Kronos GitHub repository](https://github.com/shiyu-coder/Kronos).
config.json ADDED
@@ -0,0 +1,13 @@
{
  "attn_dropout_p": 0.1,
  "d_model": 512,
  "ff_dim": 1024,
  "ffn_dropout_p": 0.25,
  "learn_te": true,
  "n_heads": 8,
  "n_layers": 8,
  "resid_dropout_p": 0.25,
  "s1_bits": 10,
  "s2_bits": 10,
  "token_dropout_p": 0.1
}
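Two quick sanity checks on this config: `d_model` must divide evenly by `n_heads`, and the two token levels imply vocabularies of 2^`s1_bits` and 2^`s2_bits`. A small sketch deriving those numbers (the JSON is inlined here for self-containment; in practice you would read the repo's `config.json`):

```python
import json

config = json.loads("""
{
  "attn_dropout_p": 0.1, "d_model": 512, "ff_dim": 1024,
  "ffn_dropout_p": 0.25, "learn_te": true, "n_heads": 8,
  "n_layers": 8, "resid_dropout_p": 0.25,
  "s1_bits": 10, "s2_bits": 10, "token_dropout_p": 0.1
}
""")

# Per-head attention width and implied token vocabulary sizes.
head_dim = config["d_model"] // config["n_heads"]
s1_vocab = 2 ** config["s1_bits"]
s2_vocab = 2 ** config["s2_bits"]
print(head_dim, s1_vocab, s2_vocab)  # 64 1024 1024
```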
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:103c3a3b479f3bfe7d2254c0aa56f56920f4c786325f0a2203a9ee1c2275b5ec
size 98980656
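The weights file is stored as a Git LFS pointer: three `key value` lines giving the spec version, the SHA-256 of the real file, and its size in bytes (98 980 656 bytes, roughly 94 MiB). A small parser sketch for that pointer format:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a dict of its key/value lines."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:103c3a3b479f3bfe7d2254c0aa56f56920f4c786325f0a2203a9ee1c2275b5ec
size 98980656
"""
info = parse_lfs_pointer(pointer)
print(round(int(info["size"]) / 2**20, 1))  # 94.4
```

Cloning without `git lfs install` leaves only this pointer on disk, not the actual safetensors weights.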