---
title: ViTeX-Bench Leaderboard
emoji: 🏆
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 5.49.1
app_file: app.py
pinned: false
license: apache-2.0
short_description: Public leaderboard for video scene text editing.
---

# 🏆 ViTeX-Bench Leaderboard

🌐 Project page  ·  📊 Dataset  ·  🧪 Benchmark code  ·  🤖 Model & Inference code  ·  🏆 Leaderboard

A public ranking of video scene text editing methods under the 13-metric, three-axis protocol of ViTeX-Bench.

The full thirteen-metric vector is the unit of reporting. The table is sorted by TextScore = ∛(SeqAcc · CharAcc · TTS), the geometric mean of the three text-correctness primitives. TextScore is a single-axis sort key by design: no cross-axis aggregate is computed, because no axis substitutes for another. SeqAcc = 0 collapses TextScore to zero, which is the intended semantics for methods that never produce the requested target string.
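The sort key above can be sketched in a few lines. This is an illustrative computation of the formula as stated, not code from the benchmark repo:

```python
def text_score(seq_acc: float, char_acc: float, tts: float) -> float:
    """Geometric mean of the three text-correctness primitives.

    Any zero primitive (e.g. SeqAcc = 0 for a method that never produces
    the target string) collapses the score to zero, as intended.
    """
    return (seq_acc * char_acc * tts) ** (1.0 / 3.0)

# Perfect sequence accuracy does not mask weaker CharAcc / TTS:
print(round(text_score(1.0, 0.8, 0.5), 4))

# SeqAcc = 0 zeroes the score regardless of the other two primitives:
print(text_score(0.0, 0.9, 0.9))  # → 0.0
```

Note that the geometric mean, unlike the arithmetic mean, never rewards trading one primitive to zero for gains in the others.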

## Submitting

  1. Run the official benchmark on the 157-clip frozen evaluation split: `bash scripts/run_benchmark.sh <your_method>` in the Benchmark code repo.
  2. Upload the produced `outputs/<method>/eval.json` via the Submit tab.
  3. The maintainers review the submission; approved entries appear on the leaderboard.
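Before uploading, a quick local sanity check of `eval.json` can catch malformed files. The exact schema is defined by the Benchmark code repo; the key names (`method`, `metrics`) and the thirteen-metric count below are assumptions for illustration only:

```python
import json

EXPECTED_METRIC_COUNT = 13  # the full thirteen-metric vector (assumed key layout)

def check_eval(payload: dict) -> None:
    """Minimal pre-upload sanity check; raises AssertionError on problems."""
    assert "method" in payload, "missing method name"
    metrics = payload.get("metrics", {})
    assert len(metrics) == EXPECTED_METRIC_COUNT, (
        f"expected {EXPECTED_METRIC_COUNT} metrics, got {len(metrics)}"
    )
    assert all(isinstance(v, (int, float)) for v in metrics.values()), (
        "all metric values should be numeric"
    )

# Hypothetical payload with placeholder metric names:
example = {"method": "my_method", "metrics": {f"m{i}": 0.5 for i in range(13)}}
check_eval(example)
print("eval.json payload looks well-formed")
```

This is a client-side convenience only; the maintainers' review remains the authoritative gate.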

## Companion repos