BALM-MoE

BALM-MoE is a sparse mixture-of-experts (MoE) antibody language model pre-trained on a mixture of paired and unpaired antibody sequences. It uses a Top-2 MoE architecture with 200M active parameters (~0.7B total). Code is available on GitHub. For more details, see our paper presented at the ICLR 2026 FM4Science Workshop.
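
To illustrate what Top-2 routing means, here is a minimal, self-contained sketch of a sparse MoE layer in PyTorch. This is not BALM-MoE's actual implementation; the function and variable names are hypothetical, and the sketch only shows the general technique of routing each token to its two highest-scoring experts and mixing their outputs by renormalized gate weights:

import torch
import torch.nn.functional as F

def top2_route(hidden, router_weight, experts):
    # hidden: (tokens, d_model); router_weight: (d_model, n_experts)
    logits = hidden @ router_weight                     # (tokens, n_experts)
    probs = F.softmax(logits, dim=-1)
    top2_p, top2_idx = probs.topk(2, dim=-1)            # 2 best experts per token
    top2_p = top2_p / top2_p.sum(-1, keepdim=True)      # renormalize gate weights
    out = torch.zeros_like(hidden)
    for e, expert in enumerate(experts):                # experts: list of FFN modules
        for k in range(2):
            mask = top2_idx[:, k] == e                  # tokens routed to expert e
            if mask.any():
                out[mask] += top2_p[mask, k].unsqueeze(-1) * expert(hidden[mask])
    return out

Because only two experts run per token, the active parameter count (200M) stays far below the total parameter count (~0.7B).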

Installation

BALM-MoE uses a custom architecture and requires the BALM package:

pip install git+https://github.com/brineylab/BALM.git

Use

Load the model and tokenizer as follows:

from balm import BalmMoEForMaskedLM, BalmTokenizer

model = BalmMoEForMaskedLM.from_pretrained("brineylab/BALM-MoE")
tokenizer = BalmTokenizer.from_pretrained("brineylab/BALM-MoE")
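
As a quick sanity check, you can run masked-residue prediction. The snippet below is a sketch that assumes BalmTokenizer follows the standard Hugging Face tokenizer interface (mask_token, mask_token_id, return_tensors); the example sequence is arbitrary, so consult the BALM documentation for specifics:

import torch

# Mask the first residue of an antibody heavy-chain fragment and predict it.
sequence = "EVQLVESGGGLVQPGGSLRLSCAAS"
masked = tokenizer.mask_token + sequence[1:]
inputs = tokenizer(masked, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and decode the highest-scoring token.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(-1)
print(tokenizer.decode(predicted_id))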