Naive/Memory Light Chain Classifier
This repository provides a transformer-based classifier for distinguishing between naive and memory B-cell receptor light chain sequences. Adapters are integrated into the pre-trained language model for parameter-efficient fine-tuning. The model is fine-tuned from LightGPT, a causal transformer protein language model pre-trained on a large corpus of unpaired light chain sequences from the OAS database. An equivalent classification model for heavy chains can be found here.
For more information on how to use this model, please visit our GitHub repository.
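As a minimal sketch of how such a fine-tuned classifier might be queried, the snippet below loads the model through the standard `transformers` sequence-classification interface. The model name matches this repository, but the residue formatting, label names, and the assumption that the checkpoint exposes a standard classification head are illustrative; consult the GitHub repository for the exact inference procedure.

```python
# Hedged sketch: classify a light-chain amino-acid sequence as naive or memory.
# Assumes the checkpoint "leaBroe/LightGPT_naive_mem_cls" exposes a standard
# sequence-classification head; label names come from the model config.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "leaBroe/LightGPT_naive_mem_cls"

def format_sequence(seq: str) -> str:
    """Normalize a raw amino-acid string: strip whitespace and upper-case.

    Some protein language models expect space-separated residues instead;
    adjust here if the tokenizer requires it (assumption, not confirmed).
    """
    return seq.strip().upper()

def classify(seq: str) -> str:
    """Return the predicted class label for one light-chain sequence."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
    model.eval()
    inputs = tokenizer(format_sequence(seq), return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[logits.argmax(dim=-1).item()]

if __name__ == "__main__":
    # Example light-chain fragment (illustrative, not from the training data).
    print(classify("diqmtqspsslsasvgdrvtitc"))
```

Downloading the checkpoint requires network access to the Hugging Face Hub; `format_sequence` can be tested offline.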
Model tree for leaBroe/LightGPT_naive_mem_cls
Base model: leaBroe/LightGPT