Diffusion Language Models
A diffusion-style masked language model fine-tuned from philipp-zettl/modernbert-diffusion-universal on the tatsu-lab/alpaca dataset.
Intended for instruction-following tasks in the style of the tatsu-lab/alpaca dataset.
Example
```python
from refinebert.diffusion_engine import MaskedDiffusionEngine

engine = MaskedDiffusionEngine("./refinebert-finetuned")

# The original card did not record the prompt or generation parameters
# ("N/A — see generation logs"); the values below are placeholders.
prompt = "Your prompt here"
output = engine.generate(prompt, num_new_tokens=64, steps=16, guidance_scale=1.0)
print(output)
```
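For intuition, masked-diffusion generation starts from a fully masked sequence and unmasks a few positions per step, keeping the model's most confident predictions first. The toy sketch below illustrates only that scheduling loop; `dummy_fill` is a hypothetical stand-in for the model, not the refinebert API.

```python
# Toy sketch of masked-diffusion text generation (illustrative only;
# dummy_fill stands in for the real model's masked-token predictions).
import random

MASK = "[MASK]"

def dummy_fill(tokens):
    """Propose a (token, confidence) pair for every masked position."""
    vocab = ["the", "cat", "sat", "on", "mat"]
    return {i: (random.choice(vocab), random.random())
            for i, t in enumerate(tokens) if t == MASK}

def diffusion_generate(num_new_tokens=5, steps=3):
    tokens = [MASK] * num_new_tokens
    for step in range(steps):
        proposals = dummy_fill(tokens)
        if not proposals:
            break
        # Unmask a fraction of the remaining slots each step,
        # highest-confidence predictions first.
        k = max(1, len(proposals) // (steps - step))
        ranked = sorted(proposals.items(), key=lambda kv: kv[1][1], reverse=True)
        for i, (tok, _) in ranked[:k]:
            tokens[i] = tok
    # Fill any positions still masked after the last step.
    for i, (tok, _) in dummy_fill(tokens).items():
        tokens[i] = tok
    return " ".join(tokens)

print(diffusion_generate())
```

Real engines refine in logit space and may re-mask low-confidence tokens between steps; this sketch keeps only the confidence-ordered unmasking schedule.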
Single-dataset fine-tuning:

| Dataset | Share | Role |
|---|---|---|
| tatsu-lab/alpaca | 100% | Fine-tuning Target |
| Metric | Value |
|---|---|
| Training Loss | 2.1540 |
| Epochs | 5 |
| Global Step | 14630 |
Base model
answerdotai/ModernBERT-base