> **Note:** Full checkpoint conversion validation has not been performed. If you encounter pipeline loading failures or unexpected outputs, please contact me at bili_sakura@zju.edu.cn.
# AgriFM

## Model Description
This repository provides a Hugging Face-compatible checkpoint for AgriFM, a multi-source temporal remote sensing model for agricultural mapping.
The checkpoint was converted from a legacy PyTorch `.pth` file into standard `transformers`-style artifacts:

- `config.json`
- `model.safetensors`
## Model Sources
- Paper: AgriFM: A Multi-source Temporal Remote Sensing Foundation Model for Agriculture Mapping
- Original codebase used for conversion: Bili-Sakura/AgriFM-transformers
- Raw source checkpoint: `path/to/raw/AgriFM.pth`
## Conversion Notes
- The raw checkpoint uses a legacy key layout (`encoder.*`), which was remapped to Hugging Face `PreTrainedModel` state-dict conventions.
- The raw checkpoint is encoder-only.
- Fusion neck and segmentation head parameters are initialized from model defaults in this converted package.
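The prefix remapping described above can be sketched as follows. This is a minimal illustration, not the actual conversion script; the assumption that the legacy prefix is simply stripped from each key may not match every detail of the real mapping.

```python
# Hedged sketch of the legacy-key remapping: strip the "encoder." prefix
# so keys match Hugging Face PreTrainedModel state-dict conventions.
# The exact behavior of the real conversion script may differ.
def remap_legacy_keys(state_dict, prefix="encoder."):
    """Return a new state dict with the legacy prefix stripped from keys."""
    remapped = {}
    for key, value in state_dict.items():
        new_key = key[len(prefix):] if key.startswith(prefix) else key
        remapped[new_key] = value
    return remapped
```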
## Intended Use
This checkpoint is intended for:
- feature extraction from multi-source temporal remote sensing inputs
- fine-tuning for segmentation tasks in agricultural or land-cover settings
## Input Format

Expected inputs are a dict of tensors, one per modality, each with shape `(B, T, C, H, W)`:

- `HLSL30`: C = 6
- `S2`: C = 10
- `Modis`: C = 7
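The channel table above can be verified before inference with a small sanity check. The helper name and error messages below are illustrative, not part of the AgriFM API; only the `(B, T, C, H, W)` layout and channel counts come from this card.

```python
# Hedged sketch: validate per-modality shapes before calling the model,
# assuming the (B, T, C, H, W) layout and channel counts listed above.
EXPECTED_CHANNELS = {"HLSL30": 6, "S2": 10, "Modis": 7}

def check_input_shapes(pixel_values):
    """Raise ValueError if a modality is missing or has the wrong shape."""
    for name, expected_c in EXPECTED_CHANNELS.items():
        if name not in pixel_values:
            raise ValueError(f"missing modality: {name}")
        shape = tuple(pixel_values[name].shape)
        if len(shape) != 5 or shape[2] != expected_c:
            raise ValueError(
                f"{name}: expected (B, T, {expected_c}, H, W), got {shape}"
            )
```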
## Usage

```python
import torch

from AgriFM.models import AgriFMConfig, AgriFMModel

# Load the converted checkpoint (config.json + model.safetensors)
model_dir = "path/to/AgriFM"
config = AgriFMConfig.from_pretrained(model_dir)
model = AgriFMModel.from_pretrained(model_dir, config=config)
model.eval()

# Dummy multi-source temporal inputs, each of shape (B, T, C, H, W)
B, T, H, W = 1, 32, 256, 256
pixel_values = {
    "HLSL30": torch.randn(B, T, 6, H, W),
    "S2": torch.randn(B, T, 10, H, W),
    "Modis": torch.randn(B, T, 7, H, W),
}

with torch.no_grad():
    outputs = model(pixel_values=pixel_values)

features = outputs.features
print(features.shape)
```
## Limitations
- This is a converted checkpoint and not an official upstream release.
- Since the source is encoder-only, task heads may require downstream fine-tuning before production use.
- Performance depends on preprocessing consistency and modality availability.
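Since the converted checkpoint is encoder-only, downstream fine-tuning typically means attaching a task head to the encoder features. The sketch below is an illustrative per-pixel classifier, not the upstream fusion neck or segmentation head; the feature layout `(B, D, H', W')` and the dimension used are assumptions to be adapted to the shape printed by the Usage example.

```python
# Hedged sketch: a 1x1-conv segmentation head over spatial encoder
# features of assumed shape (B, D, H', W'). Not the upstream AgriFM head.
import torch
import torch.nn as nn

class LinearSegHead(nn.Module):
    """Per-pixel classifier over spatial encoder features."""

    def __init__(self, in_dim: int, num_classes: int):
        super().__init__()
        self.proj = nn.Conv2d(in_dim, num_classes, kernel_size=1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # Logits of shape (B, num_classes, H', W')
        return self.proj(features)
```

The head's logits can then be trained with a standard per-pixel loss such as `nn.CrossEntropyLoss`, with the encoder frozen or unfrozen depending on the data budget.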
## Training and Evaluation
No additional training or benchmarking was run as part of this conversion step.
For training/evaluation pipelines, refer to Bili-Sakura/AgriFM-transformers.
## Dependencies

- Recommended environment: `environment.yml` from Bili-Sakura/AgriFM-transformers
- Core libraries: PyTorch, Transformers, timm, einops, scipy, h5py
## Citation

```bibtex
@article{li2026agrifm,
  title={AgriFM: A multi-source temporal remote sensing foundation model for Agriculture mapping},
  author={Li, Wenyuan and Liang, Shunlin and Chen, Keyan and Chen, Yongzhe and Ma, Han and Xu, Jianglei and Ma, Yichuan and Zhang, Yuxiang and Guan, Shikang and Fang, Husheng and others},
  journal={Remote Sensing of Environment},
  volume={334},
  pages={115234},
  year={2026},
  publisher={Elsevier}
}
```