# SpatialGT Pretrained Model
## Model Description
This is the pretrained checkpoint of SpatialGT (Spatial Graph Transformer), a graph transformer model designed for spatial transcriptomics data analysis.
SpatialGT leverages spatial context through neighbor-aware attention mechanisms for:
- 🗺️ Spatial context learning from large-scale spatial transcriptomics data
- 🧬 Gene expression reconstruction
- 🔬 Perturbation simulation
## Model Details
- Architecture: Graph Transformer with spatial neighbor attention
- Parameters: ~600M
- Training Data: Large-scale spatial transcriptomics atlas
- Input: Gene expression vectors with spatial coordinates
- Output: Contextualized gene expression representations
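The spatial input can be illustrated with a small sketch: given per-spot coordinates, the k nearest spatial neighbors of each spot define the graph over which the neighbor-aware attention operates. The helper below is a hypothetical illustration using only the Python standard library; the actual preprocessing in SpatialGT (choice of k, distance metric, graph construction) may differ.

```python
import math

def k_nearest_neighbors(coords, k=2):
    """For each spot, return the indices of its k nearest spatial neighbors.

    `coords` is a list of (x, y) spatial coordinates, one per spot.
    This is an illustrative stand-in for the neighbor graph consumed by
    neighbor-aware attention, not the model's actual preprocessing.
    """
    neighbors = []
    for i, (xi, yi) in enumerate(coords):
        # Euclidean distance from spot i to every other spot.
        dists = [
            (math.hypot(xi - xj, yi - yj), j)
            for j, (xj, yj) in enumerate(coords)
            if j != i
        ]
        dists.sort()
        neighbors.append([j for _, j in dists[:k]])
    return neighbors

# Four spots: the isolated spot (5, 5) still picks its two closest peers.
coords = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (5.0, 5.0)]
print(k_nearest_neighbors(coords, k=2))
# → [[1, 2], [0, 2], [0, 1], [1, 2]]
```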
## Usage
```python
import torch
from safetensors.torch import load_file

from pretrain.model_spatialpt import SpatialNeighborTransformer
from pretrain.Config import Config

# Load configuration
config = Config()

# Initialize model
model = SpatialNeighborTransformer(config)

# Load pretrained weights
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict)
model.eval()
```
## Files
- `model.safetensors`: Model weights in safetensors format
- `training_args.bin`: Training arguments
- `trainer_state.json`: Training state information
## Citation
If you use this model, please cite our paper (details to be added upon publication).
## License
MIT License