ccloud0525 committed · Commit 5618ec7 · Parent: f623526

feat: first commit

README.md CHANGED
@@ -37,9 +37,47 @@ In this work, we pretrain Aurora in a cross-modality paradigm, which adopts Chan
 <div align="center">
 <img alt="intro" src="https://cdn-uploads.huggingface.co/production/uploads/66276727368ec2a0b933772c/d82jT96jiGD0QL9s8RYg-.png" width="100%"/>
 </div>
-
 ## Quickstart
 
+#### From pypi (recommended)
+
+We have published Aurora on PyPI, so **you can install it directly with one line of code!**
+
+```shell
+$ pip install aurora-model==0.1.0
+```
+
+Then you can use the Aurora model for zero-shot probabilistic forecasting!
+
+```python
+import os
+# os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'
+import torch
+from aurora import load_model
+
+model = load_model()
+
+# prepare input
+batch_size, lookback_length = 1, 528
+seqs = torch.randn(batch_size, lookback_length)
+
+# Note that Sundial can generate multiple probable predictions
+forecast_length = 96
+num_samples = 100
+
+# For inference_token_len, you can refer to LightGTS (Periodic Patching).
+# We recommend using the period length as the inference_token_len.
+output = model.generate(inputs=seqs, max_output_length=forecast_length, num_samples=num_samples, inference_token_len=48)
+
+# use the raw predictions for mean/quantile/confidence-interval estimation
+print(output.shape)
+```
+
+#### From raw code
+
 We release the original code of Aurora in this repo. You can also download the pretrained checkpoints from our [huggingface](https://huggingface.co/DecisionIntelligence/Aurora) repo and put them in the folder: aurora/.
 
 If you want to pretrain Aurora on your own time series corpus, you need to install the following packages:

@@ -100,10 +138,11 @@ seqs = torch.randn(batch_size, lookback_length)
 
 # Note that Sundial can generate multiple probable predictions
 forecast_length = 96
-num_samples =
+num_samples = 100
 
 
 # For inference_token_len, you can refer to LightGTS (Periodic Patching).
+# We recommend using the period length as the inference_token_len.
 output = model.generate(inputs=seqs, max_output_length=forecast_length, num_samples=num_samples, inference_token_len=48)
 
 
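The quickstart only prints `output.shape`, but the comment suggests using the raw sampled predictions for mean/quantile/confidence-interval estimation. A minimal sketch of that post-processing, assuming `model.generate` with `num_samples` returns a tensor of shape `[batch_size, num_samples, forecast_length]` (an assumption; a random tensor stands in for real model output here):

```python
import torch

# Stand-in for output = model.generate(...); assumed shape
# [batch_size, num_samples, forecast_length] with num_samples=100.
batch_size, num_samples, forecast_length = 1, 100, 96
output = torch.randn(batch_size, num_samples, forecast_length)

# Point forecast: average over the sample dimension.
point_forecast = output.mean(dim=1)            # [batch_size, forecast_length]

# 80% confidence interval from the 10th/90th percentiles of the samples.
lower = torch.quantile(output, 0.1, dim=1)     # [batch_size, forecast_length]
upper = torch.quantile(output, 0.9, dim=1)     # [batch_size, forecast_length]
```

If the actual output layout differs (e.g. samples first), only the `dim` argument changes.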