How to use Rostlab/prot_bert_bfd with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Rostlab/prot_bert_bfd")
```

```python
# Load the model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("Rostlab/prot_bert_bfd", dtype="auto")
```
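As a quick sanity check, the pipeline above can be exercised on a masked protein sequence. Note that ProtBert expects amino acids separated by single spaces; the sequence below is an arbitrary example, not one from this thread.

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="Rostlab/prot_bert_bfd")

# ProtBert tokenizes space-separated amino acids; [MASK] marks the residue to predict
results = pipe("D L I P T S S K L V [MASK] V D T S K I G")

# Each candidate comes back with the predicted token and its score
for r in results:
    print(r["token_str"], r["score"])
```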
Hi,
Your model seems to only return `last_hidden_state` and `pooler_output` for a given protein sequence.
Is it possible to retrieve the hidden vectors from the intermediate encoder layers?
Thanks
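For reference, a minimal sketch of one way to do this with a recent `transformers` and PyTorch: passing `output_hidden_states=True` to the forward call makes the model return the output of every encoder layer, not just the last one. The example sequence is arbitrary.

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "Rostlab/prot_bert_bfd"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# ProtBert expects amino acids separated by single spaces
sequence = "M K T A Y I A K Q R"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# outputs.hidden_states is a tuple: the embedding output followed by
# one tensor per encoder layer (num_hidden_layers + 1 entries in total)
print(len(outputs.hidden_states))
print(outputs.hidden_states[-1].shape)  # the last entry matches last_hidden_state
```

The last element of `hidden_states` is the same tensor as `last_hidden_state`, so the intermediate layers are the entries before it.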