Update app.py
change about description
app.py
CHANGED
@@ -593,7 +593,7 @@ with gr.Blocks(title="Bias Detection & Mitigation Tool") as demo:
     ### About
     This tool uses multiple models to detect bias in text:
     - LLaMA performs bias classification. Bias label indicates whether the response is biased; bias type returns the type of social bias found in the response and the demographic group affected, if biased.
-    - The Regard classifier indicates the social perception of the response (is the text negative or positive?)
+    - The Regard classifier indicates the social perception of the response (is the text negative or positive?) and a score indicating how certain the model is of its social-perception label (a score closer to 0 is uncertain, closer to 1 is certain)
    - MNLI for fairness scoring
    - Fairlearn for demographic metrics
    """)
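The added line describes the Regard classifier's score as a confidence value in [0, 1]. A minimal sketch of how such an output might be interpreted downstream — the `interpret_regard` helper and the 0.5 threshold are illustrative assumptions, not code from this app; the classifier itself (e.g. loaded via Hugging Face `transformers` or `evaluate`) is assumed to have already produced per-label scores:

```python
# Hypothetical post-processing of a Regard-style classifier's output.
# Each prediction is a {"label": str, "score": float} dict; scores near 0
# mean the model is uncertain of its label, scores near 1 mean certain.

def interpret_regard(predictions, uncertainty_threshold=0.5):
    """Pick the highest-scoring regard label and flag low-confidence calls."""
    top = max(predictions, key=lambda p: p["score"])
    return {
        "label": top["label"],
        "score": top["score"],
        # Flag the call as uncertain when the winning score is low.
        "uncertain": top["score"] < uncertainty_threshold,
    }


# Example scores for one response (values are made up for illustration).
preds = [
    {"label": "negative", "score": 0.82},
    {"label": "positive", "score": 0.10},
    {"label": "neutral", "score": 0.08},
]
result = interpret_regard(preds)
```

Here `result` reports a "negative" social perception with high confidence (0.82), matching the reading the new About text describes.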