---
language:
- en
license: openrail
library_name: diffusers
tags:
- diffusion-llm
- parallel-generation
- custom-transformer
- cropmark
datasets:
- OpenAssistant/oasst1
metrics:
- cosine_similarity
---
# 💪 DiffReaper-5L

DiffReaper-5L is a larger version of DiffReaper-5, with 2048-dim embeddings and a 24-layer Transformer. This model is under active autonomous training on an H100.
## 🔬 Model Details
- Architecture: 24-layer Custom Transformer with Time Embedding.
- Task: Conditioned Text Diffusion (Prompt-Response).
- Training Objective: Cosine Similarity Regression.
- Sampling: 10-step iterative parallel denoising.
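The 10-step iterative parallel denoising above can be sketched in framework-free Python. Everything here is illustrative: the linear update schedule and the stand-in `predict_x0` function are assumptions, not the repository's actual sampler.

```python
import random

def denoise(predict_x0, x_t, num_steps=10):
    """Iterative parallel denoising sketch: at every step, all positions
    are refined in parallel by interpolating the current estimate toward
    the model's prediction of the clean embedding.

    predict_x0: stand-in for the model; maps (current estimate, step)
    to a predicted clean embedding. The 1/step schedule is an assumed
    linear rule, chosen so the final step lands exactly on the estimate.
    """
    for step in range(num_steps, 0, -1):
        x0_hat = predict_x0(x_t, step)                       # model's clean estimate
        alpha = 1.0 / step                                    # assumed schedule
        x_t = [x + alpha * (x0 - x) for x, x0 in zip(x_t, x0_hat)]
    return x_t

# Toy demo: a "model" that always predicts the true clean vector.
target = [0.5, -1.0, 2.0]
random.seed(0)
noisy = [random.gauss(0, 1) for _ in target]
out = denoise(lambda x, step: target, noisy)
```

With this schedule the last step uses alpha = 1.0, so the output converges to the model's final prediction regardless of the starting noise.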
## 🚀 Autonomous Training State
The model is training autonomously on an H100 with the following configuration:
- Batch Size: 16.
- Learning Rate: 1e-4.
- Checkpointing: Saves `diffreaper5l_{step}.pt` every 2,500 steps to `darwinkernelpanic/DiffReaper-5L`.
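The checkpoint naming above implies a simple schedule. A small sketch of the filenames a run would produce (the helper itself is hypothetical, not part of the training script):

```python
def checkpoint_names(total_steps, every=2500):
    """Checkpoint filenames written over a run, following the
    diffreaper5l_{step}.pt pattern (illustrative helper only)."""
    return [f"diffreaper5l_{step}.pt" for step in range(every, total_steps + 1, every)]

# e.g. the first 10,000 steps of a run
names = checkpoint_names(10_000)
```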
## 🛠️ Usage (Inference)
To run inference:
```python
import torch

# Assuming DiffReaperModel is defined as in train_diffreaper_5l.py
model = DiffReaperModel(vocab_size=50257, n_embd=2048, n_head=32, n_layer=24).to("cuda")
model.load_state_dict(torch.load("diffreaper5l_latest.pt"))
model.eval()
```
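The card does not show a decoding step. One plausible approach, consistent with the cosine-similarity training objective, is to map each denoised embedding to the vocabulary entry with the highest cosine similarity. A minimal framework-free sketch with hypothetical names:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def nearest_token(embedding, vocab_embeddings):
    """Index of the vocabulary embedding most similar to `embedding`
    (assumed decoding rule; the actual script may differ)."""
    return max(range(len(vocab_embeddings)),
               key=lambda i: cosine(embedding, vocab_embeddings[i]))

# Toy 2-token vocabulary
vocab = [[1.0, 0.0], [0.0, 1.0]]
```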
## 🎯 Fine-tuning
To fine-tune on a custom dataset, ensure your data loader provides Prompt + Response pairs. Use the same Cosine Similarity loss.
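Under the stated objective, a loss of the form 1 − cosine(prediction, target), averaged over sequence positions, is the natural choice. A framework-free sketch; the exact reduction used in training is an assumption:

```python
import math

def cosine_sim(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def cosine_loss(pred_seq, target_seq):
    """Mean of (1 - cosine similarity) over positions: 0 when every
    prediction points in the same direction as its target, 2 when
    exactly opposite (assumed reduction)."""
    return sum(1.0 - cosine_sim(p, t)
               for p, t in zip(pred_seq, target_seq)) / len(pred_seq)
```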
Created by Darwin & Clawd.