How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="wukevin/tcr-bert")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("wukevin/tcr-bert")
model = AutoModelForSequenceClassification.from_pretrained("wukevin/tcr-bert")
```

TCR transformer model

See our full codebase and our preprint for more information.

This model is trained on:

  • Masked language modeling (masked amino acid or MAA modeling)
  • Classification across antigen labels from PIRD

If you are looking for a model trained only on MAA, please see our other model.

Example inputs:

  • C A S S P V T G G I Y G Y T F (binds to NLVPMVATV CMV antigen)
  • C A T S G R A G V E Q F F (binds to GILGFVFTL flu antigen)
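As the example inputs above show, the tokenizer expects CDR3 amino acid sequences written with a single space between residues. A minimal preprocessing sketch (the `format_tcr` helper name is our own, not part of the tcr-bert codebase):

```python
def format_tcr(seq: str) -> str:
    """Insert spaces between amino acid residues, the input format tcr-bert expects."""
    return " ".join(seq.strip().upper())

# A raw CDR3 sequence becomes a space-separated string of residues
formatted = format_tcr("CASSPVTGGIYGYTF")
print(formatted)  # C A S S P V T G G I Y G Y T F
```

The formatted string can then be passed to the classifier, e.g. `pipe(formatted)`.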