---
title: Watermark Leaderboard
emoji: πŸ†
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: "4.44.0"
app_file: app.py
pinned: false
license: mit
short_description: Interactive leaderboard for watermark performance evaluation
---

# Watermark Leaderboard πŸ†

An interactive leaderboard for comparing watermark performance across different models and evaluation settings.

## Features

- **Interactive Scatter Plot**: Visualize watermark performance with Plotly charts
- **Performance Table**: Detailed metrics with sorting and filtering
- **Multiple Evaluation Settings**: Attack-free, Watermark Removal, and Stealing Attack
- **Model Support**: LLaMA3 and DeepSeek models
- **Dynamic Filtering**: Real-time updates based on model and metric selection
- **Flexible Submissions**: Submit data for any combination of attack types
- **Pending Approval System**: All submissions are reviewed before appearing on the leaderboard
- **Complete Field Visibility**: Administrators see all submission details for review
- **Professional UI**: Clean, modern interface with accordion sections
- **Reproducibility**: Access to all evaluation code and guidelines
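
The dynamic filtering described above can be sketched as a plain pandas operation; the column names and the placeholder rows below are illustrative assumptions, not the app's actual schema or data:

```python
import pandas as pd

# Illustrative placeholder rows; column names and values are assumptions,
# not the leaderboard's real schema or results.
LEADERBOARD = pd.DataFrame([
    {"method": "WatermarkA", "model": "LLaMA3", "setting": "Attack-free",
     "normalized_utility": 0.92, "detection_rate": 99.1},
    {"method": "WatermarkA", "model": "DeepSeek", "setting": "Attack-free",
     "normalized_utility": 0.89, "detection_rate": 98.4},
    {"method": "WatermarkB", "model": "LLaMA3", "setting": "Watermark Removal",
     "normalized_utility": 0.85, "detection_rate": 91.7},
])

def filter_leaderboard(df: pd.DataFrame, model: str, setting: str) -> pd.DataFrame:
    """Keep only the rows matching the selected model and evaluation setting."""
    mask = (df["model"] == model) & (df["setting"] == setting)
    return df[mask].reset_index(drop=True)
```

In the app, a function like this would be wired to the model and setting dropdowns so that the table and scatter plot update whenever a selection changes.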

## How to Use

1. **Select Model**: Choose between LLaMA3 or DeepSeek
2. **Choose Setting**: Pick from Attack-free, Watermark Removal, or Stealing Attack
3. **View Results**: Explore the scatter plot and detailed table
4. **Submit Data**: Click "Add Your Data" to submit new results
   - Submit any combination of attack types (Attack-free, Watermark Removal, Stealing Attack)
   - All submissions go through an approval process before appearing on the leaderboard
5. **Administrator Review**: Administrators can review pending submissions with full field visibility
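
The submit-then-review flow above can be sketched as a simple pending queue; the class, field names, and status values here are assumptions for illustration, not the app's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Submission:
    method: str
    attack_results: dict          # e.g. {"Attack-free": {...}, "Watermark Removal": {...}}
    status: str = "pending"       # "pending" until an administrator approves or rejects it

pending_queue: list[Submission] = []

def submit(method: str, attack_results: dict) -> Submission:
    """New results always enter the queue as pending, never the leaderboard directly."""
    sub = Submission(method, attack_results)
    pending_queue.append(sub)
    return sub

def approve(sub: Submission, leaderboard: list[Submission]) -> None:
    """An administrator, with full field visibility, promotes a reviewed submission."""
    sub.status = "approved"
    pending_queue.remove(sub)
    leaderboard.append(sub)
```

Because a submission may cover any combination of attack types, `attack_results` holds one entry per setting the contributor evaluated.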

## Metrics Explained

- **Normalized Utility ↑**: Higher values indicate better text quality
- **Detection Rate (%) ↑**: Higher values indicate better watermark detection
- **Absolute Utility Degradation ↑**: Higher values indicate better resistance to removal attacks
- **Adversary BERT Score ↑**: Higher values indicate better performance under adversarial conditions
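
The exact formulas are defined by the evaluation guidelines; as one plausible reading of the first two metrics, detection rate is the fraction of watermarked samples the detector flags, and normalized utility is watermarked text quality relative to an unwatermarked baseline:

```python
def detection_rate(detected_flags: list[bool]) -> float:
    """Percent of watermarked samples flagged by the detector (higher is better)."""
    return 100.0 * sum(detected_flags) / len(detected_flags)

def normalized_utility(watermarked_score: float, baseline_score: float) -> float:
    """Quality of watermarked text relative to the unwatermarked baseline (higher is better)."""
    return watermarked_score / baseline_score
```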

## Contributing

We encourage researchers to contribute their evaluation results. Please follow the guidelines in the "Guidelines" section for submission requirements.

## License

MIT License

---
*Last updated: September 2024*