# transformer_pwff_lr3e-4
A pooled transformer encoder with position-wise feed-forward (PWFF) layers, trained on the concatenated pair representation [r, |r-a|, r*a], where r is the mean-pooled ESM2 8B embedding of the RBD (rbd_embeddings) and a is the mean-pooled ESM2 8B embedding of ACE2 (ace2_embeddings).
- Dataset: BIIE-AI/ace2_binding
- Training: 8 epochs
- Learning rate: 3e-4
- Reported average per-species MCC: 0.762
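The input construction described above can be sketched as follows. This is a minimal illustration, not the training code: the embedding dimension (2560) and the variable names are assumptions, and the real embeddings come from running ESM2 on the RBD and ACE2 sequences and mean-pooling over residues.

```python
import torch

def pair_features(r: torch.Tensor, a: torch.Tensor) -> torch.Tensor:
    """Concatenate [r, |r - a|, r * a] along the feature dimension,
    as used for the model input (r = RBD embedding, a = ACE2 embedding)."""
    return torch.cat([r, (r - a).abs(), r * a], dim=-1)

# Hypothetical mean-pooled per-sequence embeddings (batch of 2, dim 2560 assumed).
r = torch.randn(2, 2560)
a = torch.randn(2, 2560)
x = pair_features(r, a)  # shape (2, 7680): three blocks of 2560
```

The |r-a| and r*a terms are symmetric interaction features; concatenating them with r gives the encoder both the RBD representation and an explicit comparison against ACE2.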
## Files
- `transformer_best.pt`: PyTorch checkpoint (best model)
## Notes
This repository contains only a raw PyTorch checkpoint. To use it, load it with the same codebase and model definition that produced it.
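A loading sketch under the assumption that the checkpoint is a `state_dict` (if it was saved as a full pickled model, `torch.load` alone suffices). The `nn.Sequential` stand-in below is hypothetical; the real architecture must come from the training codebase, and the round-trip save is only there to make the example self-contained.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the training-time model class; replace with the
# actual model definition from the codebase that produced transformer_best.pt.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Save a demo checkpoint, then load it the way you would load transformer_best.pt.
torch.save(model.state_dict(), "transformer_best_demo.pt")
state = torch.load("transformer_best_demo.pt", map_location="cpu")
model.load_state_dict(state)
model.eval()  # disable dropout etc. before inference
```

`map_location="cpu"` makes the load work on machines without the GPU the checkpoint was saved from.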