---
license: apache-2.0
pipeline_tag: time-series-forecasting
---

# Falcon-TST: A Large-Scale Time Series Foundation Model

**A large-scale time series foundation model utilizing Mixture of Experts (MoE) architecture with multiple patch tokenizers for efficient and accurate time series forecasting.**

## 📖 Introduction

Falcon-TST is a time series foundation model built on a Mixture of Experts (MoE) architecture combined with multiple patch tokenizers. Routing inputs through specialized experts, with patches of several lengths as tokens, lets the model process time series data efficiently while maintaining high accuracy across a range of forecasting tasks.
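To make the patch-tokenizer idea concrete, here is a minimal illustrative sketch of turning a multivariate series into patch tokens. The `patchify` helper and the patch sizes used below are assumptions for illustration only, not Falcon-TST's actual tokenizer or configuration:

```python
import torch

def patchify(series: torch.Tensor, patch_len: int) -> torch.Tensor:
    """Split a (length, channels) series into non-overlapping patch tokens.

    Returns a tensor of shape (num_patches, patch_len * channels), where each
    row is one flattened patch. Illustrative only; not the model's tokenizer.
    """
    length, channels = series.shape
    num_patches = length // patch_len
    trimmed = series[: num_patches * patch_len]  # drop any remainder steps
    return trimmed.reshape(num_patches, patch_len * channels)

series = torch.randn(2880, 7)  # a 2880-step lookback with 7 channels
# Multiple patch sizes yield multiple token views of the same series
# (the sizes 16/32/64 here are placeholders, not the model's real config).
tokens = {p: patchify(series, p) for p in (16, 32, 64)}
for p, t in tokens.items():
    print(p, tuple(t.shape))
```

Shorter patches preserve fine-grained local dynamics, while longer patches compress the series into fewer tokens that capture coarser trends.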

You can find more details about the model on its [GitHub page](https://github.com/ant-intl/Falcon-TST).
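For readers unfamiliar with MoE layers, the sketch below shows generic sparse top-k expert routing: a gate scores each token, the top-k experts process it, and their outputs are mixed by the renormalized gate weights. This is a standard MoE pattern, not Falcon-TST's actual routing implementation; all names and sizes here are illustrative:

```python
import torch
import torch.nn.functional as F

def moe_forward(x, experts, gate_weight, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    x: (tokens, dim); experts: list of callables; gate_weight: (dim, n_experts).
    Generic sparse-MoE sketch, not the model's actual layer.
    """
    probs = F.softmax(x @ gate_weight, dim=-1)             # (tokens, n_experts)
    weights, idx = torch.topk(probs, top_k, dim=-1)        # (tokens, top_k)
    weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize over chosen experts
    out = torch.zeros_like(x)
    for slot in range(top_k):
        for e, expert in enumerate(experts):
            mask = idx[:, slot] == e                       # tokens whose slot-th pick is e
            if mask.any():
                out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
    return out

torch.manual_seed(0)
dim, n_experts = 8, 4
experts = [torch.nn.Linear(dim, dim) for _ in range(n_experts)]
gate_weight = torch.randn(dim, n_experts)
tokens = torch.randn(10, dim)
mixed = moe_forward(tokens, experts, gate_weight)
print(mixed.shape)  # torch.Size([10, 8])
```

Because each token only activates `top_k` of the experts, compute per token stays roughly constant as the total parameter count grows, which is the efficiency argument behind MoE-based foundation models.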

## 🚀 Quick Start
```python
import torch
from transformers import AutoModel

# Load the pre-trained model from the Hugging Face Hub
model = AutoModel.from_pretrained(
    'ant-intl/Falcon-TST_Large',
    trust_remote_code=True
)

# Prepare your time series data
batch_size, lookback_length, channels = 1, 2880, 7
time_series = torch.randn(batch_size, lookback_length, channels)

# Load the model and data to the same device
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = model.to(device)
time_series = time_series.to(device)

# Generate forecasts
forecast_length = 96
predictions = model.predict(time_series, forecast_horizon=forecast_length)
```