How to use from the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="monsoon-nlp/dna-blockdiff-2", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("monsoon-nlp/dna-blockdiff-2", trust_remote_code=True, dtype="auto")
```
Test version of https://huggingface.co/monsoon-nlp/dna-blockdiff that uses the custom bd3lms EmbeddingLayer instead of torch.nn.Embedding.

Known issues:

  • Generation does not appear to stop
  • Does not fix the attention issue at inference
  • Does not fix the torch.where issue during training
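For context, an embedding layer is essentially a trainable lookup table that maps token ids to vectors, which is the operation the custom bd3lms layer replaces. A minimal, dependency-free sketch of that lookup (the toy vocabulary and weight values here are illustrative, not taken from this model):

```python
# Sketch of what an embedding layer's forward pass does: index into a
# (vocab_size x hidden_dim) weight table with token ids.
# Values below are placeholders, not real model weights.

def embed(token_ids, weight):
    """Return one vector per token id, like torch.nn.Embedding's forward."""
    return [weight[i] for i in token_ids]

# Toy vocabulary of 4 DNA tokens (A, C, G, T) with 2-dim embeddings.
weight = [
    [0.1, 0.2],  # A
    [0.3, 0.4],  # C
    [0.5, 0.6],  # G
    [0.7, 0.8],  # T
]

vectors = embed([0, 2, 3], weight)  # look up A, G, T
print(vectors)  # [[0.1, 0.2], [0.5, 0.6], [0.7, 0.8]]
```

A custom layer can change how this table is initialized or indexed while keeping the same id-to-vector contract.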
Model size: 98.7M params (F32, Safetensors)

Model tree for monsoon-nlp/dna-blockdiff-2: 1 finetuned model