---
license: cc-by-nc-nd-4.0
extra_gated_prompt: >-
  As PepMLM is not published, you agree to share any results from PepMLM's use
  with the authors prior to publication or dissemination.
extra_gated_fields:
  Company: text
  Country: country
  Specific date: date_picker
  I want to use this model for:
    type: select
    options:
      - Research
      - Education
      - label: Other
        value: other
  I agree to use this model for non-commercial use ONLY: checkbox
---

# PepMLM: Target Sequence-Conditioned Generation of Peptide Binders via Masked Language Modeling

In this work, we introduce PepMLM, a purely target sequence-conditioned de novo generator of linear peptide binders. By employing a novel masking strategy that uniquely positions cognate peptide sequences at the terminus of target protein sequences, PepMLM tasks the state-of-the-art ESM-2 pLM with fully reconstructing the binder region, achieving low perplexities that match or improve upon previously validated peptide-protein sequence pairs. After successful in silico benchmarking with AlphaFold-Multimer, we experimentally verify PepMLM's efficacy by fusing model-derived peptides to E3 ubiquitin ligase domains, demonstrating endogenous degradation of target substrates in cellular models. In total, PepMLM enables the generative design of candidate binders to any target protein, without requiring the target structure, empowering downstream programmable proteome editing applications.

- Demo: HuggingFace Space Demo Link
- Colab Notebook: Link
- Preprint: Link
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
```
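Once the model is loaded, its masking setup can be used at inference time. The sketch below (not the authors' official generation pipeline; the `build_masked_input` and `generate_binder` helpers are illustrative names) mirrors the training-time strategy described above: append a run of mask tokens to the terminus of the target sequence, then have the model fill them in. Greedy per-position unmasking is one simple decoding choice among several.

```python
def build_masked_input(target_seq: str, binder_length: int, mask_token: str) -> str:
    """Place the binder region as a run of mask tokens at the target's terminus,
    mirroring PepMLM's masking strategy. Pure string logic, no model needed."""
    return target_seq + mask_token * binder_length

def generate_binder(target_seq: str, binder_length: int = 15) -> str:
    """Illustrative greedy decoding: fill every masked binder position with the
    highest-scoring amino acid in a single forward pass."""
    # Imports are deferred so build_masked_input stays usable without torch installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("TianlaiChen/PepMLM-650M")
    model = AutoModelForMaskedLM.from_pretrained("TianlaiChen/PepMLM-650M")
    model.eval()

    masked = build_masked_input(target_seq, binder_length, tokenizer.mask_token)
    inputs = tokenizer(masked, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits

    # Locate the masked binder positions and take the argmax residue at each.
    mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
    predicted_ids = logits[0, mask_positions].argmax(dim=-1)
    return tokenizer.decode(predicted_ids).replace(" ", "")
```

Note that single-pass greedy filling is the simplest option; iteratively unmasking one position at a time (re-running the model after each commitment) or sampling from the per-position distributions are common alternatives for masked-LM generation.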
