---
language: en
license: mit
library_name: pytorch
---

# Cloudcasting

## Model Description

These models are trained to predict future frames of satellite data from past frames. Each model takes 3 hours of recent satellite imagery at 15-minute intervals as input and predicts 3 hours into the future, also at 15-minute intervals. The satellite inputs and predictions are multispectral with 11 channels. See [1] for the repository used to train the models.

- **Developed by:** Open Climate Fix and the Alan Turing Institute
- **License:** MIT

# Training Details

## Data

The models were trained on EUMETSAT SEVIRI satellite imagery derived from the data stored in [this Google Cloud public dataset](https://console.cloud.google.com/marketplace/product/bigquery-public-data/eumetsat-seviri-rss?hl=en-GB&inv=1&invt=AbniZA&project=solar-pv-nowcasting&pli=1). The data was processed using the protocol in [2].

## Results

See the READMEs in each model directory for links to the wandb training runs.

## Usage

These models rely on [1] being installed. Example usage to load a model is shown below.

```python
import hydra
import yaml
from huggingface_hub import snapshot_download
from safetensors.torch import load_model

REPO_ID = "openclimatefix/cloudcasting_example_models"
REVISION = None  # use the default branch; a specific commit hash can be pinned here
MODEL = "simvp_model"

# Download the model checkpoints
hf_download_dir = snapshot_download(
    repo_id=REPO_ID,
    revision=REVISION,
)

# Create the model object from the config in the chosen model directory
with open(f"{hf_download_dir}/{MODEL}/model_config.yaml", "r", encoding="utf-8") as f:
    model = hydra.utils.instantiate(yaml.safe_load(f))

# Load the model weights
load_model(
    model,
    filename=f"{hf_download_dir}/{MODEL}/model.safetensors",
    strict=True,
)
```

### Software

- [1] https://github.com/openclimatefix/sat_pred
- [2] https://github.com/alan-turing-institute/cloudcasting
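
### Running the model

The snippet below is a minimal inference sketch, not code from [1]. It assumes `model` has already been loaded as in the Usage section above, that the model accepts a `(batch, channels, time, height, width)` tensor with 11 channels and 12 frames (3 hours at 15-minute steps), and uses an arbitrary placeholder spatial size; the dimension ordering and resolution may differ from what the models were actually trained with.

```python
import torch

# Sketch only: dimension ordering (batch, channels, time, height, width),
# the 12-step history length, and the 372 x 614 spatial size are assumptions.
model.eval()
example_input = torch.randn(1, 11, 12, 372, 614)  # 11 channels, 3 h of 15-min frames

with torch.no_grad():
    prediction = model(example_input)

# The prediction should cover the next 3 hours at 15-minute steps
print(prediction.shape)
```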