---
license: mit
tags:
- spatial-transcriptomics
- graph-transformer
- gene-expression
- pretrained
- pytorch
language:
- en
library_name: transformers
pipeline_tag: feature-extraction
---
# SpatialGT Pretrained Model
## Model Description
This is the **pretrained checkpoint** of SpatialGT (Spatial Graph Transformer), a graph transformer model designed for spatial transcriptomics data analysis.
SpatialGT leverages spatial context through neighbor-aware attention mechanisms for:
- 🗺️ Spatial context learning from large-scale spatial transcriptomics data
- 🧬 Gene expression reconstruction
- 🔬 Perturbation simulation
## Model Details
- **Architecture**: Graph Transformer with spatial neighbor attention
- **Parameters**: ~600M
- **Training Data**: Large-scale spatial transcriptomics atlas
- **Input**: Gene expression vectors with spatial coordinates
- **Output**: Contextualized gene expression representations
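The spatial neighbor attention described above operates over each spot's nearest spatial neighbors. As a rough illustration of the idea, the following is a minimal sketch of building a k-nearest-neighbor index from spatial coordinates; the helper `build_spatial_neighbors` and the choice of `k=6` are illustrative assumptions, not part of the released SpatialGT code.

```python
import torch

def build_spatial_neighbors(coords: torch.Tensor, k: int = 6) -> torch.Tensor:
    """Return the indices of the k nearest spatial neighbors of each spot.

    coords: (n_spots, 2) tensor of x/y coordinates.
    Returns: (n_spots, k) tensor of neighbor indices, self excluded.
    """
    dists = torch.cdist(coords, coords)          # pairwise Euclidean distances
    dists.fill_diagonal_(float("inf"))           # exclude each spot itself
    return dists.topk(k, largest=False).indices  # k smallest distances per spot

# Example: 100 random spots on a slide
coords = torch.rand(100, 2)
neighbors = build_spatial_neighbors(coords, k=6)
print(neighbors.shape)  # torch.Size([100, 6])
```

In the actual model, each spot attends over the expression vectors of such a neighbor set rather than over all spots, which is what makes the attention spatially local.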
## Usage
```python
import torch
from safetensors.torch import load_file

from pretrain.model_spatialpt import SpatialNeighborTransformer
from pretrain.Config import Config

# Load configuration and initialize the model
config = Config()
model = SpatialNeighborTransformer(config)

# Load pretrained weights and switch to inference mode
state_dict = load_file("model.safetensors")
model.load_state_dict(state_dict)
model.eval()
```
## Files
- `model.safetensors`: Model weights in safetensors format
- `training_args.bin`: Training arguments
- `trainer_state.json`: Training state information
## Citation
If you use this model, please cite our paper (details to be added upon publication).
## License
MIT License