Update README.md
---
license: apache-2.0
pipeline_tag: time-series-forecasting
---

# Falcon-TST: A Large-Scale Time Series Foundation Model

**A large-scale time series foundation model utilizing Mixture of Experts (MoE) architecture with multiple patch tokenizers for efficient and accurate time series forecasting.**

## 📖 Introduction

Falcon-TST is a time series foundation model that combines a Mixture of Experts (MoE) architecture with multiple patch tokenizers. This design enables efficient processing of time series data while maintaining high accuracy across a range of forecasting tasks.
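To make the "multiple patch tokenizers" idea concrete, here is a minimal, generic sketch of viewing one series at several patch scales. This is an illustration of the general technique, not Falcon-TST's actual tokenizer; the patch sizes are arbitrary.

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int) -> np.ndarray:
    """Split a 1-D series into non-overlapping patches (tokens)."""
    n = (len(series) // patch_len) * patch_len  # drop any ragged tail
    return series[:n].reshape(-1, patch_len)

# Multiple tokenizers: the same series tokenized at several scales.
series = np.arange(24, dtype=np.float32)
tokens = {p: patchify(series, p) for p in (2, 4, 8)}  # illustrative sizes
for p, t in tokens.items():
    print(p, t.shape)  # patch size 4 -> (6, 4): six 4-step tokens
```

Coarser patches compress more of the series into each token, while finer patches preserve local detail; a multi-tokenizer model can draw on both views.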
## 🚀 Quick Start

```python
from transformers import AutoModelForCausalLM, AutoConfig

# Load pre-trained model (when available)
model = AutoModelForCausalLM.from_pretrained(
    'ant-intl/Falcon-TST_Large',
    trust_remote_code=True
)
```
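The MoE routing mentioned in the introduction can be illustrated with a generic top-k gating sketch. This is not Falcon-TST's implementation; the dimensions, gate, and experts below are hypothetical stand-ins for the real learned components.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_layer(x, experts, gate_w, k=2):
    """Generic top-k mixture-of-experts routing for one token vector x."""
    logits = gate_w @ x                      # one gating score per expert
    top = np.argsort(logits)[-k:]            # indices of the k best experts
    w = np.exp(logits[top]); w /= w.sum()    # softmax over selected scores
    return sum(wi * experts[i](x) for wi, i in zip(w, top))

d, n_experts = 8, 4
experts = [lambda x, W=rng.standard_normal((d, d)): W @ x
           for _ in range(n_experts)]       # toy linear "experts"
gate_w = rng.standard_normal((n_experts, d))
y = moe_layer(rng.standard_normal(d), experts, gate_w)
print(y.shape)  # (8,)
```

Only k of the experts run per token, which is what makes MoE models cheaper at inference than a dense model of the same total parameter count.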