Spaces:
Runtime error
Update app.py
app.py
CHANGED
@@ -212,7 +212,10 @@ def analyze_with_regard(user1_msg, user2_msg, bias_pipe, regard_classifier):
         {
             #passing what the user is requesting of the system through instruction prompt
             "role": "user",
-            "content": f"""
+            "content": f"""Given a conversation between 2 users, analyze user 2's response for any form of social bias.
+
+INSTRUCTIONS:
+You are a social-bias detector. Consider all types of social bias, such as:
 gender, race, ethnicity, nationality, religion, disability, age, sexual orientation, and physical appearance.
 
 When detecting bias, evaluate but do not output:
@@ -247,6 +250,16 @@ Q1. Biased: biased
 Q2. Bias Type: gender
 Q3. Demographic Group: women
 
+User 2: "People with autism are socially inept and can't have real relationships."
+Q1. Biased: biased
+Q2. Bias Type: ableism
+Q3. Demographic Group: autistic people, people with autism
+
+User 2: "Muslims are all terrorists."
+Q1. Biased: biased
+Q2. Bias Type: religion, race
+Q3. Demographic Group: muslims
+
 NOW ANALYZE THE CONVERSATION BELOW AND RESPOND:
 User 1: {user1_msg}
 User 2: {user2_msg}
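As a minimal sketch of what this diff is doing: the commit fleshes out the user-role prompt with task instructions and two extra few-shot examples, and the model is expected to answer in the `Q1./Q2./Q3.` format shown. The helper names below (`build_bias_messages`, `parse_bias_answers`) are hypothetical and the prompt body is abridged; only the message structure and the answer format come from the diff itself.

```python
import re

def build_bias_messages(user1_msg, user2_msg):
    # Assemble the chat message the same way the diff does: one user-role
    # message whose content is an f-string prompt (abridged here).
    prompt = f"""Given a conversation between 2 users, analyze user 2's response for any form of social bias.

INSTRUCTIONS:
You are a social-bias detector. Consider all types of social bias, such as:
gender, race, ethnicity, nationality, religion, disability, age, sexual orientation, and physical appearance.

NOW ANALYZE THE CONVERSATION BELOW AND RESPOND:
User 1: {user1_msg}
User 2: {user2_msg}"""
    return [{"role": "user", "content": prompt}]

def parse_bias_answers(model_output):
    # Pull the three answers out of a reply shaped like the few-shot
    # examples in the prompt ("Q1. Biased: ...", "Q2. Bias Type: ...", ...).
    answers = {}
    for qnum, label in [("Q1", "Biased"), ("Q2", "Bias Type"), ("Q3", "Demographic Group")]:
        m = re.search(rf"{qnum}\.\s*{label}:\s*(.+)", model_output)
        answers[label] = m.group(1).strip() if m else None
    return answers

# Example with a reply shaped like the prompt's few-shot examples:
reply = "Q1. Biased: biased\nQ2. Bias Type: ableism\nQ3. Demographic Group: autistic people"
print(parse_bias_answers(reply))
```

Adding the ableism and religion examples broadens the few-shot coverage beyond gender, which should nudge the model toward the same `Q1`–`Q3` output shape for bias types it has now seen demonstrated.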