---
license: mit
tags:
- spatial-transcriptomics
- graph-transformer
- gene-expression
- pretrained
- pytorch
language:
- en
library_name: transformers
pipeline_tag: feature-extraction
---

# SpatialGT Pretrained Model

## Model Description

This is the **pretrained checkpoint** of SpatialGT (Spatial Graph Transformer), a graph transformer model designed for spatial transcriptomics data analysis.

SpatialGT leverages spatial context through neighbor-aware attention mechanisms for:

- 🗺️ Spatial context learning from large-scale spatial transcriptomics data
- 🧬 Gene expression reconstruction
- 🔬 Perturbation simulation

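The neighbor-aware attention idea can be sketched minimally: each spot attends only to itself and its k nearest spatial neighbors, rather than to every spot in the slide. This is an illustrative toy, not the actual SpatialGT implementation; the function and parameter names here are assumptions.

```python
import torch

def neighbor_attention(x, coords, k=4):
    """Toy neighbor-restricted attention (illustrative, not SpatialGT's code)."""
    n = x.shape[0]
    d = torch.cdist(coords, coords)             # pairwise spatial distances
    knn = d.topk(k + 1, largest=False).indices  # self + k nearest neighbors
    mask = torch.full((n, n), float("-inf"))
    mask.scatter_(1, knn, 0.0)                  # attention allowed only within the kNN set
    scores = x @ x.T / x.shape[1] ** 0.5        # scaled dot-product scores
    attn = torch.softmax(scores + mask, dim=1)  # masked entries get zero weight
    return attn @ x

x = torch.randn(16, 32)     # 16 spots, 32-dim expression embeddings
coords = torch.rand(16, 2)  # 2-D spatial coordinates
out = neighbor_attention(x, coords)
print(out.shape)            # torch.Size([16, 32])
```

Restricting attention to spatial neighbors keeps the computation local to tissue context, which is the core idea behind the capabilities listed above.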
## Model Details

- **Architecture**: Graph Transformer with spatial neighbor attention
- **Parameters**: ~600M
- **Training Data**: Large-scale spatial transcriptomics atlas
- **Input**: Gene expression vectors with spatial coordinates
- **Output**: Contextualized gene expression representations

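To make the input format concrete, here is a toy batch pairing expression vectors with 2-D coordinates, plus a library-size normalization and log transform that are common for spatial transcriptomics counts. The field names and preprocessing are assumptions for illustration, not the model's actual schema.

```python
import numpy as np

# Hypothetical input batch: counts for 100 spots x 2000 genes,
# each spot with an (x, y) coordinate (names are illustrative).
n_spots, n_genes = 100, 2000
batch = {
    "expression": np.random.poisson(1.0, size=(n_spots, n_genes)).astype(np.float32),
    "coords": np.random.uniform(0, 1, size=(n_spots, 2)).astype(np.float32),
}

# Normalize each spot to 10k total counts, then log1p-transform
# (a common preprocessing convention, assumed here).
counts = batch["expression"]
cpm = counts / counts.sum(axis=1, keepdims=True) * 1e4
batch["expression"] = np.log1p(cpm)
print(batch["expression"].shape)  # (100, 2000)
```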
## Usage

```python
import torch
from safetensors.torch import load_file

from pretrain.model_spatialpt import SpatialNeighborTransformer
from pretrain.Config import Config

# Load configuration and initialize the model
config = Config()
model = SpatialNeighborTransformer(config)

# Load pretrained weights from the safetensors checkpoint
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict)

# Switch to inference mode
model.eval()
```

## Files

- `model.safetensors`: Model weights in safetensors format
- `training_args.bin`: Training arguments
- `trainer_state.json`: Training state information

## Citation

If you use this model, please cite our paper (details to be added upon publication).

## License

MIT License