# BALM-shuffled

BALM-shuffled is an antibody language model that uses a RoBERTa architecture and was pre-trained on randomly shuffled paired antibody sequences from Jaffe et al. This was a control model used to evaluate the benefits of natively paired sequences in our paper published in Patterns. Therefore, this model should not be used for real use cases; use BALM-paired instead.

## Use

Load the model and tokenizer as follows:

```python
from transformers import RobertaTokenizer, RobertaForMaskedLM

tokenizer = RobertaTokenizer.from_pretrained("brineylab/BALM-shuffled")
model = RobertaForMaskedLM.from_pretrained("brineylab/BALM-shuffled")
```

The tokenizer expects paired sequences formatted as `HEAVY_CHAIN</s>LIGHT_CHAIN`.
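As a minimal sketch of preparing an input in this format (the heavy- and light-chain sequences below are hypothetical placeholders, not real antibody sequences):

```python
# Hypothetical chain sequences, used only to illustrate the input format.
heavy = "EVQLVESGGGLVQPGGSLRLSCAAS"  # placeholder heavy chain
light = "DIQMTQSPSSLSASVGDRVTITC"    # placeholder light chain

# Join the two chains with the tokenizer's separator token.
paired = f"{heavy}</s>{light}"

# With the tokenizer and model loaded as above, a masked-LM forward pass
# would look like:
#   inputs = tokenizer(paired, return_tensors="pt")
#   with torch.no_grad():
#       logits = model(**inputs).logits
print(paired)
```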

Model size: 0.3B parameters (F32, Safetensors).