---
language: en
license: mit
library_name: pytorch
---
# Cloudcasting
## Model Description
These models are trained to predict future frames of satellite imagery from past frames. Each model
takes 3 hours of recent satellite imagery at 15-minute intervals and predicts the next 3 hours,
also at 15-minute intervals. The satellite inputs and predictions are multispectral with 11 channels.
See [1] for the repo used to train the model.
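The 3-hour windows at 15-minute intervals work out to 12 input frames and 12 predicted frames. A minimal sketch of the implied array shapes (the `(batch, time, channels, height, width)` layout and the height/width values are illustrative assumptions, not taken from the training repo):

```python
# Frame counts implied by the model description above
MINUTES_PER_STEP = 15
HISTORY_HOURS = 3
FORECAST_HOURS = 3
CHANNELS = 11  # multispectral channels

history_steps = HISTORY_HOURS * 60 // MINUTES_PER_STEP    # 12 past frames
forecast_steps = FORECAST_HOURS * 60 // MINUTES_PER_STEP  # 12 future frames

# Hypothetical (batch, time, channels, height, width) shapes; the spatial
# dimensions here are placeholder example values only.
input_shape = (1, history_steps, CHANNELS, 372, 614)
output_shape = (1, forecast_steps, CHANNELS, 372, 614)
print(history_steps, forecast_steps)  # 12 12
```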
- **Developed by:** Open Climate Fix and the Alan Turing Institute
- **License:** mit
## Training Details
### Data
The models were trained on EUMETSAT satellite imagery derived from the data stored in [this Google Cloud public
dataset](https://console.cloud.google.com/marketplace/product/bigquery-public-data/eumetsat-seviri-rss?hl=en-GB&inv=1&invt=AbniZA&project=solar-pv-nowcasting&pli=1).
The data was processed using the protocol in [2].
### Results
See the README in each model directory for links to the wandb training runs.
## Usage
These models rely on [1] being installed. Example usage to load a model is shown below:
```python
import hydra
import yaml
from huggingface_hub import snapshot_download
from safetensors.torch import load_model

REPO_ID = "openclimatefix/cloudcasting_example_models"
REVISION = "<commit-id>"  # replace with the commit hash you want to use
MODEL = "simvp_model"

# Download the model checkpoints
hf_download_dir = snapshot_download(
    repo_id=REPO_ID,
    revision=REVISION,
)

# Create the model object from its hydra config
with open(f"{hf_download_dir}/{MODEL}/model_config.yaml", "r", encoding="utf-8") as f:
    model = hydra.utils.instantiate(yaml.safe_load(f))

# Load the model weights into the model object
load_model(
    model,
    filename=f"{hf_download_dir}/{MODEL}/model.safetensors",
    strict=True,
)
```
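The `model_config.yaml` file is a Hydra config: `hydra.utils.instantiate` imports the class named by the `_target_` key and passes the remaining keys as keyword arguments. A hypothetical sketch of the shape such a file takes (the target path and parameters here are illustrative only; the real file ships alongside each checkpoint and its keys differ):

```yaml
# Hypothetical example only -- not the actual config for these models
_target_: some_package.models.SomeModel  # class constructed by hydra.utils.instantiate
num_channels: 11        # remaining keys become keyword arguments
hidden_dim: 64
```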
### Software
- [1] https://github.com/openclimatefix/sat_pred
- [2] https://github.com/alan-turing-institute/cloudcasting