Molecular Encoders
A collection of pre-trained encoder models trained on large molecule databases.
How to use knowledgator/SMILES-DeBERTa-large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="knowledgator/SMILES-DeBERTa-large")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("knowledgator/SMILES-DeBERTa-large")
model = AutoModelForMaskedLM.from_pretrained("knowledgator/SMILES-DeBERTa-large")
```

SMILES-DeBERTa-large was designed to be used as an encoder of SMILES sequences in a general organic chemistry context.
SMILES-DeBERTa-large is based on the DeBERTa-V2 architecture, with a SMILES-specific tokenizer implemented for the encoder.
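Since the model is intended as an encoder, a common pattern is to mean-pool the final hidden states into a fixed-size molecule embedding. The sketch below is illustrative, not part of the model card: the `mean_pool` helper is an assumption of how pooling is typically done, and the commented-out lines show how it would plug into the model (they require downloading the checkpoint).

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    # Average the token embeddings, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Sketch of encoder usage (downloads the checkpoint; shown for context):
# from transformers import AutoTokenizer, AutoModel
# tokenizer = AutoTokenizer.from_pretrained("knowledgator/SMILES-DeBERTa-large")
# model = AutoModel.from_pretrained("knowledgator/SMILES-DeBERTa-large")
# inputs = tokenizer(["CC(=O)Oc1ccccc1C(=O)O"], return_tensors="pt")
# with torch.no_grad():
#     out = model(**inputs)
# embedding = mean_pool(out.last_hidden_state, inputs["attention_mask"])
```

The resulting embedding can then be fed to a downstream classifier or used for similarity search over molecules.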
Coming soon.