
PDW: Physics-Corrected CogVideoX-2b World Model (DoRA Adapter)

A DoRA (Weight-Decomposed Low-Rank Adaptation) adapter for CogVideoX-2b, fine-tuned to generate physically accurate videos using NVIDIA Warp physics simulation data and TRD (Temporal Representation Distillation) with DINOv2-large as teacher.

Model Details

  • Base model: THUDM/CogVideoX-2b (1.7B params)
  • Adapter: DoRA (r=16, lora_alpha=32, use_dora=True)
  • Target modules: to_q, to_k, to_v, to_out.0
  • Trainable params: 7.6M / 1.7B (0.45%)
  • Physics engine: NVIDIA Warp (28-scenario 7×4 grid)
  • TRD teacher: DINOv2-large
  • Hardware: NVIDIA H100 NVL
  • Training steps: 400
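
The adapter hyperparameters above correspond to a PEFT LoraConfig with DoRA enabled. A minimal sketch (the helper function name is illustrative, and this mirrors the values listed above rather than the exact contents of adapter_config.json):

```python
def make_pdw_dora_config():
    # Lazy import so the sketch has no hard dependency at load time.
    from peft import LoraConfig

    # Mirrors the hyperparameters listed above (assumed, not the exact file).
    return LoraConfig(
        r=16,
        lora_alpha=32,
        use_dora=True,  # weight-decomposed LoRA (DoRA)
        target_modules=["to_q", "to_k", "to_v", "to_out.0"],
    )
```

Applying a config like this to the base transformer with `peft.get_peft_model` yields the ~0.45% trainable-parameter budget quoted above.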

Evaluation Results

Metric          Delta
Diffusion MSE   +94.1%
Motion score    +1.7%
Overall         +47.9%

How to Use

from peft import PeftModel
from diffusers import CogVideoXTransformer3DModel

# Load base transformer
base_transformer = CogVideoXTransformer3DModel.from_pretrained(
    "THUDM/CogVideoX-2b", subfolder="transformer"
)

# Load DoRA adapter
model = PeftModel.from_pretrained(base_transformer, "athul020/pdw_final_dora")
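
The adapted transformer can then be dropped into the full CogVideoX pipeline for generation. A minimal sketch, assuming a CUDA device and the diffusers `CogVideoXPipeline` API (the function name and sampling defaults are illustrative, not from this model card):

```python
def generate_physics_video(prompt, adapter_id="athul020/pdw_final_dora"):
    # Heavy imports are kept inside the function so the sketch is cheap to load.
    import torch
    from diffusers import CogVideoXPipeline
    from peft import PeftModel

    # Load the full CogVideoX-2b pipeline in half precision.
    pipe = CogVideoXPipeline.from_pretrained(
        "THUDM/CogVideoX-2b", torch_dtype=torch.float16
    )
    # Swap in the DoRA-adapted transformer.
    pipe.transformer = PeftModel.from_pretrained(pipe.transformer, adapter_id)
    pipe.to("cuda")

    # Returns a list of PIL frames for the first generated video.
    return pipe(prompt=prompt, num_frames=49, num_inference_steps=50).frames[0]
```

The resulting frames can be written out with `diffusers.utils.export_to_video`.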

Framework versions

  • PEFT 0.18.1