BALM-unpaired

BALM-unpaired is an antibody language model that uses a RoBERTa architecture and was pre-trained on unpaired antibody sequences from Jaffe et al. Datasets used for pre-training are available on Zenodo and code is available on GitHub. More details can be found in our paper published in Patterns.

Use

Load the model and tokenizer as follows:

from transformers import RobertaTokenizer, RobertaForMaskedLM

model = RobertaForMaskedLM.from_pretrained("brineylab/BALM-unpaired")
tokenizer = RobertaTokenizer.from_pretrained("brineylab/BALM-unpaired")

The tokenizer expects unpaired sequences: a single heavy chain or a single light chain per input, not a paired heavy/light sequence.
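As a masked language model, BALM-unpaired can be used to predict masked residues in an antibody sequence. The sketch below is a minimal, hedged example: the heavy-chain fragment is hypothetical, and depending on the tokenizer's vocabulary you may need to adjust how the sequence is formatted (e.g., spacing between residues).

```python
import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

model = RobertaForMaskedLM.from_pretrained("brineylab/BALM-unpaired")
tokenizer = RobertaTokenizer.from_pretrained("brineylab/BALM-unpaired")
model.eval()

# Hypothetical heavy-chain fragment, for illustration only; real use
# would pass a complete antibody variable-region sequence.
sequence = "EVQLVESGGGLVQPGG"

# Mask one residue and ask the model to reconstruct it.
masked = sequence[:4] + tokenizer.mask_token + sequence[5:]

inputs = tokenizer(masked, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring token.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted = tokenizer.decode(logits[0, mask_pos].argmax(dim=-1))
print(predicted)
```

The same pattern extends to scoring full sequences or extracting embeddings (via `output_hidden_states=True`).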

Model size: 0.3B parameters (F32 weights, safetensors format).