# Transformer-S669: Benchmark Specialist DDG Predictor
Transformer-S669 is a sequence-based Transformer model trained on the full S669 benchmark dataset for protein stability ($\Delta\Delta G$) prediction. It provides a competitive sequence-only alternative to structure-based tools.
## Model Description
- Architecture: Feature-token Transformer (3 layers, 4 heads).
- Task: Regression (Mutation $\Delta\Delta G$ prediction).
- Input: Protein sequences.
- Dataset: S669 benchmark.
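For orientation, here is a minimal sketch of what a feature-token Transformer with these dimensions could look like. Only the layer count (3), head count (4), vocabulary size (22), and `d_model=128` come from this card; everything else (mean-pooling, a single linear regression head) is an assumption, not the model's actual implementation:

```python
import torch
import torch.nn as nn

class DDGTransformerSketch(nn.Module):
    """Illustrative sequence-to-scalar Transformer for DDG regression."""
    def __init__(self, vocab_size=22, d_model=128, nhead=4, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, 1)  # scalar regression head

    def forward(self, idx):
        h = self.encoder(self.embed(idx))              # (B, L, d_model)
        return self.head(h.mean(dim=1)).squeeze(-1)    # mean-pool -> (B,)

model = DDGTransformerSketch()
out = model(torch.randint(0, 22, (1, 100)))
print(out.shape)  # one DDG value per sequence in the batch
```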
## Performance
| Metric | Value |
|---|---|
| Spearman $\rho$ | 0.51 |
| MAE | 1.07 kcal/mol |
**Comparison:** On the same benchmark, the Spearman $\rho$ is comparable to state-of-the-art methods such as ESM-1v (0.51) and FoldX (0.48).
## Usage
```python
import torch
from model import DDGTransformer

# Load the trained weights
model = DDGTransformer(vocab_size=22, d_model=128)
checkpoint = torch.load("pytorch_model.bin", map_location="cpu")
model.load_state_dict(checkpoint["model_state_dict"])
model.eval()

# Predict from a batch of sequence indices (random here, for illustration)
seq_indices = torch.randint(0, 22, (1, 100))
with torch.no_grad():
    ddg_pred = model(seq_indices)
print(f"Predicted DeltaDeltaG: {ddg_pred.item():.4f} kcal/mol")
```
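The snippet above feeds random indices; for a real prediction, a protein sequence must first be mapped to vocabulary indices. The model's actual index mapping is not documented here, so the encoder below is an assumption (20 standard residues in alphabetical one-letter order, plus hypothetical padding and unknown tokens to fill the 22-token vocabulary); check the training code before relying on it:

```python
import torch

# Hypothetical vocabulary: 20 amino acids + <pad> + <unk> (22 tokens total).
# The model's real mapping may differ -- verify against the training code.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_TO_IDX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}
PAD_IDX, UNK_IDX = 20, 21

def encode_sequence(seq: str, max_len: int = 100) -> torch.Tensor:
    """Map a protein sequence to a (1, max_len) tensor of token indices."""
    idx = [AA_TO_IDX.get(aa, UNK_IDX) for aa in seq.upper()[:max_len]]
    idx += [PAD_IDX] * (max_len - len(idx))  # right-pad to fixed length
    return torch.tensor([idx], dtype=torch.long)

seq_indices = encode_sequence("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
print(seq_indices.shape)  # torch.Size([1, 100])
```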
## Citation
```bibtex
@software{transformer_s669_2026,
  author = {AI Whisperers},
  title  = {Transformer-S669: Benchmark Specialist DDG Predictor},
  year   = {2026},
  url    = {https://huggingface.co/geestaltt/transformer-s669}
}
```