morgankavanagh committed
Commit 82bc5cb · 1 Parent(s): 2961758

added readme.md

Files changed (1): README.md +34 -8
README.md CHANGED
@@ -1,11 +1,37 @@
----
-title: Post Editing Evaluator
-emoji: 🌖
-colorFrom: yellow
-colorTo: pink
-sdk: docker
-pinned: false
-short_description: This evaluator enables to measure the quality of prompts
----
-
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+# 📝 Post-Editing Evaluation Tool
+
+This project is a web-based evaluation tool that scores machine translation (MT) output against human-edited references using BLEU, CHRF, and COMET metrics. It is packaged as a Gradio interface and deployable via Hugging Face Spaces.
+
+---
+
+## 🚀 Features
+
+- 📊 Evaluate MT output with:
+  - **BLEU**
+  - **CHRF**
+  - **COMET** (requires OpenAI API key)
+- 🖥️ Simple, interactive web UI via Gradio
+- 🐳 Hugging Face Spaces–compatible Docker deployment
+
+---
+
+## 🧪 Example Use
+
+Paste or upload:
+- **Source text**
+- **Machine translation output**
+- **Post-edited reference**
+
+Then click **"Evaluate"** to see automatic quality scores.
+
+---
+
+## 📦 Installation (for local development)
+
+```bash
+git clone https://github.com/yourusername/post_editing_evaluator.git
+cd post_editing_evaluator
+python -m venv venv
+source venv/bin/activate  # or .\venv\Scripts\activate on Windows
+pip install -r requirements.txt
+python interface.py
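
The new README lists BLEU and CHRF among the metrics. As a rough illustration of what those scores measure, here is a self-contained, simplified sketch of both. The function names and formulas below are illustrative approximations, not this project's actual implementation — the app itself presumably relies on standard packages such as sacrebleu, and COMET needs a trained neural model so it is not sketched here.

```python
import math
from collections import Counter


def ngram_counts(seq, n):
    # Count n-grams in a sequence (a token list for BLEU, a string for chrF).
    return Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))


def sentence_bleu(hypothesis, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of clipped n-gram
    precisions times a brevity penalty, with add-one smoothing so a single
    missing n-gram order does not zero out the whole score."""
    hyp, ref = hypothesis.split(), reference.split()
    if not hyp or not ref:
        return 0.0
    log_prec_sum = 0.0
    for n in range(1, max_n + 1):
        hyp_ng, ref_ng = ngram_counts(hyp, n), ngram_counts(ref, n)
        overlap = sum(min(c, ref_ng[g]) for g, c in hyp_ng.items())
        total = sum(hyp_ng.values())
        if total == 0:            # hypothesis shorter than n tokens
            overlap, total = 1, 1
        elif overlap == 0:        # add-one smoothing
            overlap, total = 1, total + 1
        log_prec_sum += math.log(overlap / total)
    brevity = 1.0 if len(hyp) >= len(ref) else math.exp(1 - len(ref) / len(hyp))
    return brevity * math.exp(log_prec_sum / max_n)


def sentence_chrf(hypothesis, reference, max_n=6, beta=2.0):
    """Simplified chrF: F-beta score over character n-grams (spaces removed),
    averaged across n-gram orders 1..max_n."""
    hyp, ref = hypothesis.replace(" ", ""), reference.replace(" ", "")
    f_scores = []
    for n in range(1, max_n + 1):
        hyp_ng, ref_ng = ngram_counts(hyp, n), ngram_counts(ref, n)
        hyp_total, ref_total = sum(hyp_ng.values()), sum(ref_ng.values())
        if hyp_total == 0 or ref_total == 0:
            continue                      # string shorter than n characters
        overlap = sum(min(c, ref_ng[g]) for g, c in hyp_ng.items())
        precision, recall = overlap / hyp_total, overlap / ref_total
        if precision + recall > 0:
            f_scores.append((1 + beta ** 2) * precision * recall
                            / (beta ** 2 * precision + recall))
        else:
            f_scores.append(0.0)
    return sum(f_scores) / len(f_scores) if f_scores else 0.0
```

Both functions return a score in [0, 1], with 1.0 for a hypothesis identical to its reference; they are meant for rough sanity checks only and will not match the exact numbers produced by sacrebleu or the official chrF implementation.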