mbayan committed · Commit 4a2d5c0 · verified · 1 Parent(s): e4fa818

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md +142 -369

README.md CHANGED
@@ -183,505 +183,278 @@ Explanation: <explanation>
  > ```

  <details>
- <summary><b>Abuse Detection (BN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
-
- **English Instruction:** `<image> Examine the Bengali meme and decide whether it features abusive content. This includes any form of verbal abuse or personal offense. Classify as: 'abusive' or 'not-abusive'.`
-
- **Native (BN) Instruction:** `<image> এই বাংলা মেমটিতে কি কোনো ধরনের আপত্তিকর বা অবমাননাকর বিষয়বস্তু আছে? আপনার সিদ্ধান্ত অনুযায়ী মেমটিকে 'অপমানজনক' অথবা 'অপমানজনক নয়' লেবেল দিন।`
-
- **English Labels:** `abusive`, `not-abusive` | **Native Labels:** `অপমানজনক`, `অপমানজনক নয়`
- </details>
-
- <details>
- <summary><b>Deepfake Detection (RO)</b></summary>
-
- **System:** `You are an expert image analyst specializing in detecting manipulated or deepfake content in memes.`
-
- **English Instruction:** `<image> Assess the provided Romanian meme. Is the image content authentic, or has it been altered? If altered, was it done using AI (deepfake) or other methods? Based on your analysis, classify the meme as: 'DeepFake', 'Fake', or 'Real'.`
-
- **Native (RO) Instruction:** `<image> Este această memă românească autentică sau a fost manipulată? Dacă a fost manipulată, s-a folosit inteligență artificială (deepfake)? Folosește următoarele etichete pentru a clasifica: 'DeepFake' pentru conținut falsificat cu AI, 'Fake' pentru alt tip de falsificare, sau 'Real' pentru conținut original.`
-
- **English Labels:** `DeepFake`, `Fake`, `Real` | **Native Labels:** `DeepFake`, `Fals`, `Real`
- </details>
-
- <details>
- <summary><b>Emotion Recognition (RO)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
-
- **English Instruction:** `<image> Determine the emotional impact of the Romanian meme. Analyze its components to select the most accurate emotional descriptor from the options provided. Classify as: 'Joy', 'Sadness', 'Anger', 'Fear', 'Surprise', or 'Love'.`

- **Native (RO) Instruction:** `<image> Examinați mema românească și identificați cea mai evidentă emoție transmisă prin text și imagine. Hotărâți care stare emoțională este cea mai puternică. Clasificați ca: 'Bucurie' dacă simțiți fericire, 'Tristețe' dacă se exprimă supărare, 'Furie' dacă este prezentă mânia, 'Frică' pentru anxietate, 'Surpriză' pentru șoc, sau 'Dragoste' pentru afecțiune.`

- **English Labels:** `Joy`, `Sadness`, `Anger`, `Fear`, `Surprise`, `Love` | **Native Labels:** `Bucurie`, `Tristețe`, `Furie`, `Frică`, `Surpriză`, `Dragoste`
  </details>

  <details>
- <summary><b>Harmful Content Detection (EN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
-
- **Instruction:** `<image> Categorize the meme based on a three-point scale of harmfulness. Scrutinize the image and text to make your judgment. Classify as: 'not-harmful', 'partially-harmful', or 'very-harmful'.`

- **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
- </details>
-
- <details>
- <summary><b>Harmful Content Detection COVID-19 (EN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

- **Instruction:** `<image> Assess the meme's content concerning the COVID-19 crisis and decide on its classification. The categories reflect increasing levels of potential harm, from none to severe. Classify the meme as either: 'not-harmful', 'partially-harmful', or 'very-harmful'.`

- **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
  </details>

  <details>
- <summary><b>Hateful Content Detection (AR — Prop2Hate)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

- **English Instruction:** `<image> Analyze the content of the Arabic meme and determine if it includes hateful language. Consider whether it displays hatred, discrimination, or bias based on identity or protected traits. Classify as: 'Hateful' if applicable, otherwise 'not-hateful'.`
-
- **Native (AR) Instruction:** `<image> تتمثل مهمتك في تمييز المحتوى البغيض في الميمات العربية. هل تحتوي الميم على أي خطاب تمييزي أو عدائي؟ صنفها وفقًا لذلك: 'كراهية' إذا كانت كذلك، أو 'غير كراهية' إذا لم تكن.`
-
- **English Labels:** `Hateful`, `not-hateful` | **Native Labels:** `كراهية`, `غير كراهية`
- </details>

- <details>
- <summary><b>Hateful Content Detection (BN MUTE)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

- **English Instruction:** `<image> Is the provided Bengali meme hateful in nature? Review the meme for any content that expresses discrimination or prejudice. After your review, select one of the two possible classifications. Classify as: 'hateful' or 'not-hateful'.`

- **Native (BN) Instruction:** `<image> এই বাংলা মেমটি দেখে বলুন, এতে কি কোনো ঘৃণামূলক বিষয়বস্তু আছে? যদি কোনো ব্যক্তি বা গোষ্ঠীকে আক্রমণ করা হয়ে থাকে, তাহলে এটিকে 'ঘৃণামূলক' হিসেবে চিহ্নিত করুন। যদি তা না হয়, তাহলে এটিকে 'ঘৃণামূলক নয়' হিসেবে চিহ্নিত করুন।`

- **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `ঘৃণামূলক`, `ঘৃণামূলক নয়`
  </details>

  <details>
- <summary><b>Hateful Content Detection (DE — Multi3Hate)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
- **English Instruction:** `<image> Analyze the German meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`
-
- **Native (DE) Instruction:** `<image> Bewerten Sie das folgende deutsche Meme. Fördert es Hass oder Vorurteile gegen eine bestimmte Gruppe? Ordnen Sie es einer der folgenden Kategorien zu: 'hasserfüllt' oder 'nicht hasserfüllt'.`

- **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `hasserfüllt`, `nicht hasserfüllt`
- </details>

- <details>
- <summary><b>Hateful Content Detection (EN FHM)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
  **Instruction:** `<image> Analyze the content of the meme to see if it promotes or displays hateful views against specific individuals or groups. Look for discriminatory, prejudiced, or hate speech elements associated with protected characteristics, including race, religion, and gender. Classify as: 'hateful' if present, or 'not-hateful' if absent.`
-
  **Labels:** `hateful`, `not-hateful`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection (EN — MMHS)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
  **Instruction:** `<image> Examine the meme and determine if it contains any form of hate speech. Classify as: 'hateful' if the content is discriminatory or disparaging towards a person or group, and 'not-hateful' if it is not.`
-
  **Labels:** `hateful`, `not-hateful`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection (EN — Multi3Hate)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
  **Instruction:** `<image> Analyze the provided English meme to determine if it contains any form of hate speech, prejudice, or discriminatory content. Based on your analysis, categorize the meme using one of the following labels. Classify as: 'hateful' or 'not-hateful'.`
-
  **Labels:** `hateful`, `not-hateful`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection — Islamophobia (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
  **Instruction:** `<image> Please evaluate the provided meme. Your task is to determine if it communicates hateful or discriminatory messages specifically targeting Islam or its adherents. Classify as: 'hateful' or 'not-hateful'.`
-
  **Labels:** `hateful`, `not-hateful`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection (ES — Multi3Hate)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
- **English Instruction:** `<image> Identify whether the Spanish meme displays hateful attitudes or sentiments, specifically looking for hate speech against protected groups. Classify your findings as: 'hateful' if hate speech is evident, or 'not-hateful' if it is not.`
-
- **Native (ES) Instruction:** `<image> Considerando el meme en español, ¿se puede catalogar como discurso de odio por atacar o denigrar a un colectivo? Categorízalo como: 'odioso' o 'no odioso'.`
-
- **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `odioso`, `no odioso`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection (HI — Multi3Hate)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
- **English Instruction:** `<image> Examine the content of the Hindi meme. Does it contain language or imagery that is discriminatory or incites hatred? Assign one of the following labels: 'hateful' or 'not-hateful'.`
-
- **Native (HI) Instruction:** `<image> Determine if the Hindi meme contains hateful content. Identify whether it expresses hate, discrimination, or prejudice against individuals or groups. Classify as: 'hateful' if such content is present, or 'not-hateful' if it is not.`
-
- **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `घृणास्पद`, `घृणास्पद नहीं`
- </details>
-
- <details>
- <summary><b>Hateful Content Detection (ZH — Multi3Hate)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
-
- **English Instruction:** `<image> Your objective is to classify the Chinese meme based on its content. Determine if it conveys a hateful message targeting any specific group. Use one of the two provided labels: 'hateful' or 'not-hateful'.`
-
- **Native (ZH) Instruction:** `<image> Analyze the Chinese meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`
-
- **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `仇恨`, `非仇恨`
- </details>
-
- <details>
- <summary><b>Humor Classification (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
-
  **Instruction:** `<image> Classify the level of humor in the meme. Analyze how funny the content is based on its comedic elements. Classify as: 'not-funny' for unfunny content, 'funny' for mildly humorous, 'very-funny' for highly humorous, or 'hilarious' for extremely funny content.`
-
  **Labels:** `not-funny`, `funny`, `very-funny`, `hilarious`
- </details>
-
- <details>
- <summary><b>Intention Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`
-
  **Instruction:** `<image> Categorize the primary purpose of the meme. What was the creator's main goal in making it? Please classify the intention as: 'Entertaining', 'Expressive', 'Interactive', or 'Offensive'.`
-
  **Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`
- </details>
-
- <details>
- <summary><b>Intention Detection (ZH)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`
-
- **English Instruction:** `<image> Please review the Chinese meme and decide on the main reason it was created. Categorize its purpose using one of the provided labels which are: 'Entertaining' for amusement, 'Expressive' for emotional expression, 'Interactive' for engagement, 'Offensive' for provocative intent, or 'Other' for unclear intentions.`
-
- **Native (ZH) Instruction:** `<image> 请对下面的中文表情包进行分类,明确判断其主要意图。参考以下分类标准:'娱乐' 指娱乐性的内容,'表达' 用于表达情绪或观点,'互动' 用于互动交流,'冒犯' 包含冒犯意识,'其他' 意图不清。分类为:'娱乐'、'表达'、'互动'、'冒犯' 或 '其他'。`
-
- **English Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`, `Other` | **Native Labels:** `娱乐`, `表达`, `互动`, `冒犯`, `其他`
- </details>
-
- <details>
- <summary><b>Metaphor Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`
-
  **Instruction:** `<image> Determine whether the meme uses metaphorical or literal language. Analyze if the content relies on figurative language, symbolism, or indirect meaning. Classify as: 'Metaphorical' if it uses figurative language, or 'Literal' if the meaning is direct and straightforward.`
-
  **Labels:** `Metaphorical`, `Literal`
- </details>
-
- <details>
- <summary><b>Metaphor Detection (ZH)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`
-
- **English Instruction:** `<image> Identify whether the Chinese meme exemplifies metaphorical communication by spotting elements of indirect meaning or symbolism, as opposed to stating facts directly. Classify as: 'Metaphorical' if you find figurative elements, otherwise choose 'Literal'.`
-
- **Native (ZH) Instruction:** `<image> 请根据中文表情包中的语言表达方式判断,是属于隐喻还是字面。若内容具有象征性或使用比喻手法,则分类为 '隐喻';否则,若表达直白无修饰,则分类为 '字面'。`
-
- **English Labels:** `Metaphorical`, `Literal` | **Native Labels:** `隐喻`, `字面`
- </details>
-
- <details>
- <summary><b>Misogyny Detection (EN — MAMI)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
-
  **Instruction:** `<image> Scrutinize the meme to detect any indication of misogynous content, including aspects like prejudice, discrimination, objectification, or aversion towards women. Label the meme 'misogynous' when identified, or 'not-misogynous' if none are observed.`
-
  **Labels:** `misogynous`, `not-misogynous`
- </details>
-
- <details>
- <summary><b>Misogyny Detection (HI-EN — MIMIC2024)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
-
- **English Instruction:** `<image> Observe the Hindi-English meme and ascertain if there are signs of misogyny, such as actions or language indicative of objectification, discrimination, prejudice, or hostility towards women. Classify strictly as: 'misogynous' or 'not-misogynous'.`
-
- **English Labels:** `misogynous`, `not-misogynous` | **Native Labels:** `महिला विरोधी`, `महिला विरोधी नहीं`
- </details>
-
- <details>
- <summary><b>Misogyny Categories (HI-EN — MIMIC2024)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
-
- **English Instruction:** `<image> Inspect the Hindi-English meme and decide whether its misogynous nature stems from objectification, prejudice, humiliation, a combination of these, or is unspecified. Select from the labels: 'Objectification', 'Prejudice', 'Humiliation', 'Objectification, Humiliation', 'Objectification, Prejudice', 'Prejudice, Humiliation', or 'Unspecified'.`
-
- **English Labels:** `Objectification`, `Prejudice`, `Humiliation`, `Objectification, Humiliation`, `Objectification, Prejudice`, `Prejudice, Humiliation`, `Unspecified` | **Native Labels:** `वस्तुकरण`, `पूर्वाग्रह`, `अपमान`, `वस्तुकरण, अपमान`, `वस्तुकरण, पूर्वाग्रह`, `पूर्वाग्रह, अपमान`, `अनिर्दिष्ट`
- </details>
-
- <details>
- <summary><b>Motivational Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
-
  **Instruction:** `<image> Assess if the meme fits the motivational category by checking for inspirational, encouraging, or uplifting content. Classify using: 'motivational' for memes with these features, 'not-motivational' otherwise.`
-
  **Labels:** `motivational`, `not-motivational`
- </details>
-
- <details>
- <summary><b>Objectification Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying objectification in memes.`
-
  **Instruction:** `<image> Evaluate the meme for signs that it depicts women as objects, prioritizes appearance over character, or undermines their autonomy. Classify as: 'objectification' when such elements are present; otherwise, use 'not-objectification'.`
-
  **Labels:** `objectification`, `not-objectification`
- </details>
-
- <details>
- <summary><b>Offensive Content Detection (EN — Memotion)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
-
  **Instruction:** `<image> Carefully read the meme and judge how objectionable the material is. Then, classify it as: 'not-offensive', 'slightly-offensive', 'very-offensive', or 'hateful-offensive', based on the degree of offensiveness present.`
-
  **Labels:** `not-offensive`, `slightly-offensive`, `very-offensive`, `hateful-offensive`
- </details>
-
- <details>
- <summary><b>Offensive Content Detection (EN — MET-Meme)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
-
  **Instruction:** `<image> Judge the degree of offensiveness present in the meme. Evaluate how inappropriate or offensive the content is, and assign one of the following labels: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`
-
  **Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`
- </details>
-
- <details>
- <summary><b>Offensive Content Detection (ZH — MET-Meme)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
-
- **English Instruction:** `<image> Identify the degree of offensiveness in the supplied Chinese meme. Choose one of the following precise classifications: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`
-
- **Native (ZH) Instruction:** `<image> 对每个中文表情包,判断它的冒犯性级别。分类包括:'不冒犯' —— 完全无冒犯意向,'轻度冒犯' —— 轻微不当,'中度冒犯' —— 明显冒犯,'严重冒犯' —— 极度不当。请据此分辨。`
-
- **English Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive` | **Native Labels:** `不冒犯`, `轻度冒犯`, `中度冒犯`, `严重冒犯`
- </details>
-
- <details>
- <summary><b>Political Content Detection (RO)</b></summary>
-
- **System:** `You are an expert social media analyst specializing in analyzing political content in memes.`
-
- **English Instruction:** `<image> Assess whether the Romanian meme features political topics, including discussions about politicians, legislation, or political controversies. Classify the meme as either 'political' for relevant content, or 'not-political' if absent.`
-
- **Native (RO) Instruction:** `<image> Analizați dacă mema românească se referă la politică prin menționarea politicienilor, politicilor, alegerilor sau evenimentelor asociate. Clasificați ca: 'politic' în cazul unui subiect politic, sau ca: 'nepolitic' dacă nu este politică.`
-
- **English Labels:** `political`, `not-political` | **Native Labels:** `politic`, `nepolitic`
- </details>
-
- <details>
- <summary><b>Propaganda Detection (AR — ArMeme)</b></summary>
-
- **System:** `You are an expert social media analyst specializing in identifying propaganda techniques in memes.`
-
- **English Instruction:** `<image> Identify if the given Arabic meme is an example of propaganda by checking for biased or misleading narratives intended to affect political or ideological perspectives. Select: 'propaganda' or 'not-propaganda' as appropriate.`
-
- **Native (AR) Instruction:** `<image> صنف الميم التالية بناءً على وجود أو غياب العناصر الدعائية. تُعرَّف الدعاية بأنها معلومات، خاصة ذات طبيعة متحيزة أو مضللة، تستخدم للترويج لقضية سياسية. اختر بين 'دعاية' و 'ليست دعاية'.`
-
- **English Labels:** `propaganda`, `not-propaganda` | **Native Labels:** `دعاية`, `ليست دعاية`
- </details>
-
- <details>
- <summary><b>Sarcasm Detection (BN)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
-
- **English Instruction:** `<image> Assess the communication style of the Bengali meme. Does it rely on sarcasm to make its point? If the meme is straightforward and means exactly what it says, it is not sarcastic. Classify the content as: 'sarcasm' or 'not-sarcasm'.`
-
- **Native (BN) Instruction:** `<image> এই বাংলা মেমের বিষয়বস্তু বিশ্লেষণ করুন। যদি এটি বিদ্রূপ বা শ্লেষ ব্যবহার করে কোনো বার্তা দেয়, তবে তা ব্যঙ্গাত্মক। আপনার সিদ্ধান্ত অনুযায়ী এটিকে 'ব্যঙ্গাত্মক' বা 'ব্যঙ্গাত্মক নয়' হিসেবে চিহ্নিত করুন।`
-
- **English Labels:** `sarcasm`, `not-sarcasm` | **Native Labels:** `ব্যঙ্গাত্মক`, `ব্যঙ্গাত্মক নয়`
- </details>
-
- <details>
- <summary><b>Sarcasm Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
-
  **Instruction:** `<image> Your objective is to identify the type of sarcasm used in the meme. Determine if the content is straightforward or if it uses irony to convey its message. Please assign a classification based on the complexity of the sarcasm. Classify as: 'not-sarcastic', 'general-sarcasm', 'twisted-meaning', or 'very-twisted'.`
-
  **Labels:** `not-sarcastic`, `general-sarcasm`, `twisted-meaning`, `very-twisted`
- </details>
-
- <details>
- <summary><b>Sentiment Analysis (EN — Memotion)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
  **Instruction:** `<image> Review the meme and determine its general emotional sentiment. Based on your analysis, assign one of the following labels: 'very-negative', 'negative', 'neutral', 'positive', or 'very-positive'.`
-
  **Labels:** `very-negative`, `negative`, `neutral`, `positive`, `very-positive`
- </details>
-
- <details>
- <summary><b>Sentiment Analysis (BN)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
-
- **English Instruction:** `<image> Examine the Bengali meme and identify the sentiment it conveys. Classify the emotion as: 'positive', 'negative', or 'neutral'.`
-
- **Native (BN) Instruction:** `<image> নিম্নলিখিত বাংলা মেমে-র মতবোধ/আবেগ পরীক্ষা করুন এবং শ্রেণীতে ভাগ করুন। বাছাই করুন: 'ইতিবাচক' শান্তিপূর্ণ বা প্রেরণাদায়ক হলে, 'নেতিবাচক' হতাশাজনক বা অপ্রীতিকর হলে, অথবা 'নিরপেক্ষ' কোনো প্রকৃত অনুভূতি না থাকলে।`
-
- **English Labels:** `positive`, `negative`, `neutral` | **Native Labels:** `ইতিবাচক`, `নেতিবাচক`, `নিরপেক্ষ`
- </details>
-
- <details>
- <summary><b>Sentiment Analysis (RO)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
- **English Instruction:** `<image> Investigate the sentiment expressed by the following Romanian meme. Select the appropriate label: 'positive' if the meme is upbeat or approving, 'negative' if it is critical or displeased, or 'neutral' if it remains impartial.`
-
- **Native (RO) Instruction:** `<image> Identifică și clasifică sentimentul din mema următoare. Specificați dacă este 'pozitiv', 'negativ' sau 'neutru'.`
-
- **English Labels:** `positive`, `negative`, `neutral` | **Native Labels:** `pozitiv`, `negativ`, `neutru`
- </details>
-
- <details>
- <summary><b>Sentiment Category (EN)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
  **Instruction:** `<image> For the meme presented, decide which single emotional category it belongs to, considering its overall message and tone. The available classifications are: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`
-
  **Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`
- </details>
-
- <details>
- <summary><b>Sentiment Category (ZH)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
- **English Instruction:** `<image> Classify the provided Chinese meme based on its primary emotional content. Your selection must be one of the following: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`
-
- **Native (ZH) Instruction:** `<image> 请审视这个中文表情包,并为其指定一个主要的情感类别。从以下列表中选择:'快乐','悲伤','愤怒','恐惧','惊讶','爱','恨'。`
-
- **English Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate` | **Native Labels:** `快乐`, `悲伤`, `愤怒`, `恐惧`, `惊讶`, `爱`, `恨`
- </details>
-
- <details>
- <summary><b>Sentiment Degree (EN)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
  **Instruction:** `<image> Considering the overall sentiment in the meme, classify its intensity by selecting: 'slightly' for minimal emotional presence, 'moderately' for a balanced intensity, or 'very' if the sentiment is intense and vivid.`
-
  **Labels:** `slightly`, `moderately`, `very`
- </details>
-
- <details>
- <summary><b>Sentiment Degree (ZH)</b></summary>
-
- **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
-
- **English Instruction:** `<image> Please examine the Chinese meme and classify how strongly its sentiment is expressed. Use the following labels for the intensity: 'slightly' for a low degree, 'moderately' for a medium degree, and 'very' for a high degree of emotional expression.`
-
- **Native (ZH) Instruction:** `<image> 判断中文表情包所传达情感的强度。请根据实际情感力度,将其归为:'轻微'、'中等' 或 '非常'。`
-
- **English Labels:** `slightly`, `moderately`, `very` | **Native Labels:** `轻微`, `中等`, `非常`
- </details>
-
- <details>
- <summary><b>Shaming Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying shaming content in memes.`
-
  **Instruction:** `<image> Classify whether the meme contains shaming content directed at women. Determine if it aims to humiliate, embarrass, or shame women about their appearance, behavior, or choices. Classify as: 'shaming' if such content is present, or 'not-shaming' if it is not.`
-
  **Labels:** `shaming`, `not-shaming`
- </details>
-
- <details>
- <summary><b>Stereotype Detection (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying stereotypes in memes.`
-
  **Instruction:** `<image> Review the meme and determine if it contains stereotypical content about women—look for generalized, simplified, or biased messages. Classify as: 'stereotype' if you find such traits, otherwise 'not-stereotype'.`
-
  **Labels:** `stereotype`, `not-stereotype`
- </details>
-
- <details>
- <summary><b>Target Identification (EN)</b></summary>

  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
-
  **Instruction:** `<image> For the provided meme, interpret who or what is being singled out. Is the commentary aimed at one person, a community, an organization, all of society, or no target? Make your classification using: 'individual', 'community', 'organization', 'society', or 'none'.`

  **Labels:** `individual`, `community`, `organization`, `society`, `none`
  </details>

  <details>
- <summary><b>Target Identification — COVID-19 (EN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
-
- **Instruction:** `<image> Review the COVID-19 meme and determine the entity that is depicted as the target within the content. Assign one of the following categories: 'individual', 'community', 'organization', 'society', or 'none'.`

- **Labels:** `individual`, `community`, `organization`, `society`, `none`
  </details>

  <details>
- <summary><b>Toxicity Detection (RU)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying toxic content in memes.`

- **English Instruction:** `<image> Evaluate the Russian meme for the presence of toxicity. This involves identifying any language or imagery that is damaging, disrespectful, or offensive to individuals or groups. Based on your assessment, classify as: 'toxic' or 'not-toxic'.`

- **Native (RU) Instruction:** `<image> Оцените мем на русском языке и выясните, присутствует ли токсичность. Если выражения или визуальные элементы мема унижают, обижают или причиняют вред людям или группам, классифицируйте как: 'токсичный'. Если таких признаков нет, используйте метку: 'нетоксичный'.`

- **English Labels:** `toxic`, `not-toxic` | **Native Labels:** `токсичный`, `нетоксичный`
  </details>

  <details>
- <summary><b>Violence Detection (EN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying violent content in memes.`

- **Instruction:** `<image> Is there content in this meme that shows, glorifies, or promotes physical, sexual, or psychological violence toward women? Please make a determination. Classify as: 'violence' if yes, and 'not-violence' if no.`

- **Labels:** `violence`, `not-violence`
  </details>

  <details>
- <summary><b>Vulgar Content Detection (BN)</b></summary>

- **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

- **English Instruction:** `<image> Evaluate the Bengali meme to see if vulgarity is present. Search for crude, obscene, or improper language or images. Classify as: 'vulgar' if such content is found, otherwise use 'not-vulgar'.`

- **Native (BN) Instruction:** `<image> এই বাংলা মেমে কোনো অশ্লীল ভাষা বা ছবি আছে কিনা তা পরীক্ষা করুন শ্রেণীবিন্যাস করুন। মেমটি 'অশ্লীল' নাকি 'অশ্লীল নয়' তা বলুন।`

- **English Labels:** `vulgar`, `not-vulgar` | **Native Labels:** `অশ্লীল`, `অশ্লীল নয়`
  </details>
686
 
687
  ## Citation
 

<details>
<summary><b>Arabic (AR)</b></summary>

#### Hateful Content Detection (Prop2Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the content of the Arabic meme and determine if it includes hateful language. Consider whether it displays hatred, discrimination, or bias based on identity or protected traits. Classify as: 'Hateful' if applicable, otherwise 'not-hateful'.`

**Labels:** `Hateful`, `not-hateful`

#### Propaganda Detection (ArMeme)

**System:** `You are an expert social media analyst specializing in identifying propaganda techniques in memes.`

**Instruction:** `<image> Identify if the given Arabic meme is an example of propaganda by checking for biased or misleading narratives intended to affect political or ideological perspectives. Select: 'propaganda' or 'not-propaganda' as appropriate.`

**Labels:** `propaganda`, `not-propaganda`

</details>
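
The System and Instruction strings above drop directly into a chat-style request. Below is a minimal sketch of how one of these templates might be paired with a meme image, assuming an OpenAI-style message layout; the `build_messages` helper and the `image` content part are illustrative assumptions, not part of this dataset.

```python
# Illustrative sketch (not part of the dataset): pair a task's system prompt
# and instruction with a meme image as an OpenAI-style chat message list.

def build_messages(system: str, instruction: str, image_path: str) -> list:
    """Build [system, user] messages for one meme-classification task.

    Most chat VLM APIs take the image as a separate content part, so the
    '<image>' placeholder is stripped from the instruction text here.
    """
    text = instruction.replace("<image>", "").strip()
    return [
        {"role": "system", "content": system},
        {
            "role": "user",
            "content": [
                {"type": "image", "path": image_path},  # hypothetical image part
                {"type": "text", "text": text},
            ],
        },
    ]


messages = build_messages(
    "You are an expert social media analyst specializing in identifying "
    "propaganda techniques in memes.",
    "<image> Identify if the given Arabic meme is an example of propaganda "
    "by checking for biased or misleading narratives intended to affect "
    "political or ideological perspectives. Select: 'propaganda' or "
    "'not-propaganda' as appropriate.",
    "meme_0001.jpg",
)
```

The exact image-attachment convention varies by API; only the placement of the system prompt and instruction is fixed by the templates in this card.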

<details>
<summary><b>Bengali (BN)</b></summary>

#### Abuse Detection (BanglaAbuseMeme)

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**Instruction:** `<image> Examine the Bengali meme and decide whether it features abusive content. This includes any form of verbal abuse or personal offense. Classify as: 'abusive' or 'not-abusive'.`

**Labels:** `abusive`, `not-abusive`

#### Hateful Content Detection (MUTE)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Is the provided Bengali meme hateful in nature? Review the meme for any content that expresses discrimination or prejudice. After your review, select one of the two possible classifications. Classify as: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`

#### Sarcasm Detection (BanglaAbuseMeme)

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**Instruction:** `<image> Assess the communication style of the Bengali meme. Does it rely on sarcasm to make its point? If the meme is straightforward and means exactly what it says, it is not sarcastic. Classify the content as: 'sarcasm' or 'not-sarcasm'.`

**Labels:** `sarcasm`, `not-sarcasm`

#### Sentiment Analysis (BanglaAbuseMeme)

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**Instruction:** `<image> Examine the Bengali meme and identify the sentiment it conveys. Classify the emotion as: 'positive', 'negative', or 'neutral'.`

**Labels:** `positive`, `negative`, `neutral`

#### Vulgar Content Detection (BanglaAbuseMeme)

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**Instruction:** `<image> Evaluate the Bengali meme to see if vulgarity is present. Search for crude, obscene, or improper language or images. Classify as: 'vulgar' if such content is found, otherwise use 'not-vulgar'.`

**Labels:** `vulgar`, `not-vulgar`

</details>

<details>
<summary><b>Chinese (ZH)</b></summary>

#### Hateful Content Detection (Multi3Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Your objective is to classify the Chinese meme based on its content. Determine if it conveys a hateful message targeting any specific group. Use one of the two provided labels: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`

#### Intention Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`

**Instruction:** `<image> Please review the Chinese meme and decide on the main reason it was created. Categorize its purpose using one of the provided labels which are: 'Entertaining' for amusement, 'Expressive' for emotional expression, 'Interactive' for engagement, 'Offensive' for provocative intent, or 'Other' for unclear intentions.`

**Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`, `Other`

#### Metaphor Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`

**Instruction:** `<image> Identify whether the Chinese meme exemplifies metaphorical communication by spotting elements of indirect meaning or symbolism, as opposed to stating facts directly. Classify as: 'Metaphorical' if you find figurative elements, otherwise choose 'Literal'.`

**Labels:** `Metaphorical`, `Literal`

#### Offensive Content Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**Instruction:** `<image> Identify the degree of offensiveness in the supplied Chinese meme. Choose one of the following precise classifications: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`

**Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`

#### Sentiment Category (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Classify the provided Chinese meme based on its primary emotional content. Your selection must be one of the following: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`

**Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`

#### Sentiment Degree (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Please examine the Chinese meme and classify how strongly its sentiment is expressed. Use the following labels for the intensity: 'slightly' for a low degree, 'moderately' for a medium degree, and 'very' for a high degree of emotional expression.`

**Labels:** `slightly`, `moderately`, `very`

</details>

<details>
<summary><b>English (EN)</b></summary>

#### Harmful Content Detection (HarMeme)

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> Categorize the meme based on a three-point scale of harmfulness. Scrutinize the image and text to make your judgment. Classify as: 'not-harmful', 'partially-harmful', or 'very-harmful'.`

**Labels:** `not-harmful`, `partially-harmful`, `very-harmful`

#### Harmful Content Detection — COVID-19 (HarMeme)

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> Assess the meme's content concerning the COVID-19 crisis and decide on its classification. The categories reflect increasing levels of potential harm, from none to severe. Classify the meme as either: 'not-harmful', 'partially-harmful', or 'very-harmful'.`

**Labels:** `not-harmful`, `partially-harmful`, `very-harmful`

#### Hateful Content Detection (FHM)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the content of the meme to see if it promotes or displays hateful views against specific individuals or groups. Look for discriminatory, prejudiced, or hate speech elements associated with protected characteristics, including race, religion, and gender. Classify as: 'hateful' if present, or 'not-hateful' if absent.`

**Labels:** `hateful`, `not-hateful`

#### Hateful Content Detection (MMHS)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Examine the meme and determine if it contains any form of hate speech. Classify as: 'hateful' if the content is discriminatory or disparaging towards a person or group, and 'not-hateful' if it is not.`

**Labels:** `hateful`, `not-hateful`

#### Hateful Content Detection (Multi3Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the provided English meme to determine if it contains any form of hate speech, prejudice, or discriminatory content. Based on your analysis, categorize the meme using one of the following labels. Classify as: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`

#### Hateful Content Detection — Islamophobia (MIMIC)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Please evaluate the provided meme. Your task is to determine if it communicates hateful or discriminatory messages specifically targeting Islam or its adherents. Classify as: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`

#### Humor Classification (Memotion)

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Classify the level of humor in the meme. Analyze how funny the content is based on its comedic elements. Classify as: 'not-funny' for unfunny content, 'funny' for mildly humorous, 'very-funny' for highly humorous, or 'hilarious' for extremely funny content.`

**Labels:** `not-funny`, `funny`, `very-funny`, `hilarious`

#### Intention Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`

**Instruction:** `<image> Categorize the primary purpose of the meme. What was the creator's main goal in making it? Please classify the intention as: 'Entertaining', 'Expressive', 'Interactive', or 'Offensive'.`

**Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`

#### Metaphor Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`

**Instruction:** `<image> Determine whether the meme uses metaphorical or literal language. Analyze if the content relies on figurative language, symbolism, or indirect meaning. Classify as: 'Metaphorical' if it uses figurative language, or 'Literal' if the meaning is direct and straightforward.`

**Labels:** `Metaphorical`, `Literal`

#### Misogyny Detection (MAMI)

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**Instruction:** `<image> Scrutinize the meme to detect any indication of misogynous content, including aspects like prejudice, discrimination, objectification, or aversion towards women. Label the meme 'misogynous' when identified, or 'not-misogynous' if none are observed.`

**Labels:** `misogynous`, `not-misogynous`

#### Motivational Detection (Memotion)

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Assess if the meme fits the motivational category by checking for inspirational, encouraging, or uplifting content. Classify using: 'motivational' for memes with these features, 'not-motivational' otherwise.`

**Labels:** `motivational`, `not-motivational`

#### Objectification Detection (MAMI)

**System:** `You are an expert social media image analyzer specializing in identifying objectification in memes.`

**Instruction:** `<image> Evaluate the meme for signs that it depicts women as objects, prioritizes appearance over character, or undermines their autonomy. Classify as: 'objectification' when such elements are present; otherwise, use 'not-objectification'.`

**Labels:** `objectification`, `not-objectification`

#### Offensive Content Detection (Memotion)

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**Instruction:** `<image> Carefully read the meme and judge how objectionable the material is. Then, classify it as: 'not-offensive', 'slightly-offensive', 'very-offensive', or 'hateful-offensive', based on the degree of offensiveness present.`

**Labels:** `not-offensive`, `slightly-offensive`, `very-offensive`, `hateful-offensive`

#### Offensive Content Detection (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**Instruction:** `<image> Judge the degree of offensiveness present in the meme. Evaluate how inappropriate or offensive the content is, and assign one of the following labels: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`

**Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`

#### Sarcasm Detection (Memotion)

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Your objective is to identify the type of sarcasm used in the meme. Determine if the content is straightforward or if it uses irony to convey its message. Please assign a classification based on the complexity of the sarcasm. Classify as: 'not-sarcastic', 'general-sarcasm', 'twisted-meaning', or 'very-twisted'.`

**Labels:** `not-sarcastic`, `general-sarcasm`, `twisted-meaning`, `very-twisted`

#### Sentiment Analysis (Memotion)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Review the meme and determine its general emotional sentiment. Based on your analysis, assign one of the following labels: 'very-negative', 'negative', 'neutral', 'positive', or 'very-positive'.`

**Labels:** `very-negative`, `negative`, `neutral`, `positive`, `very-positive`

#### Sentiment Category (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> For the meme presented, decide which single emotional category it belongs to, considering its overall message and tone. The available classifications are: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`

**Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`

#### Sentiment Degree (MET-Meme)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Considering the overall sentiment in the meme, classify its intensity by selecting: 'slightly' for minimal emotional presence, 'moderately' for a balanced intensity, or 'very' if the sentiment is intense and vivid.`

**Labels:** `slightly`, `moderately`, `very`

#### Shaming Detection (MAMI)

**System:** `You are an expert social media image analyzer specializing in identifying shaming content in memes.`

**Instruction:** `<image> Classify whether the meme contains shaming content directed at women. Determine if it aims to humiliate, embarrass, or shame women about their appearance, behavior, or choices. Classify as: 'shaming' if such content is present, or 'not-shaming' if it is not.`

**Labels:** `shaming`, `not-shaming`

#### Stereotype Detection (MAMI)

**System:** `You are an expert social media image analyzer specializing in identifying stereotypes in memes.`

**Instruction:** `<image> Review the meme and determine if it contains stereotypical content about women—look for generalized, simplified, or biased messages. Classify as: 'stereotype' if you find such traits, otherwise 'not-stereotype'.`

**Labels:** `stereotype`, `not-stereotype`

#### Target Identification (HarMeme)

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> For the provided meme, interpret who or what is being singled out. Is the commentary aimed at one person, a community, an organization, all of society, or no target? Make your classification using: 'individual', 'community', 'organization', 'society', or 'none'.`

**Labels:** `individual`, `community`, `organization`, `society`, `none`

#### Target Identification — COVID-19 (HarMeme)

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> Review the COVID-19 meme and determine the entity that is depicted as the target within the content. Assign one of the following categories: 'individual', 'community', 'organization', 'society', or 'none'.`

**Labels:** `individual`, `community`, `organization`, `society`, `none`

#### Violence Detection (MAMI)

**System:** `You are an expert social media image analyzer specializing in identifying violent content in memes.`

**Instruction:** `<image> Is there content in this meme that shows, glorifies, or promotes physical, sexual, or psychological violence toward women? Please make a determination. Classify as: 'violence' if yes, and 'not-violence' if no.`

**Labels:** `violence`, `not-violence`

</details>
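
Model replies to these prompts carry both a label and a free-text justification, as in the response template near the top of this card. Below is a small parsing sketch; it assumes the reply uses literal `Answer:` and `Explanation:` field names, and the `parse_reply` helper is illustrative rather than shipped with the dataset.

```python
import re

# Illustrative sketch: split a model reply into its answer label and free-text
# explanation, assuming "Answer:" / "Explanation:" field names.

def parse_reply(reply: str) -> tuple[str, str]:
    """Return (answer, explanation); either is '' when its field is missing."""
    # Without re.DOTALL, '.+' stops at the end of the "Answer:" line.
    answer = re.search(r"Answer:\s*(.+)", reply)
    # With re.DOTALL, the explanation may span multiple lines.
    explanation = re.search(r"Explanation:\s*(.+)", reply, re.DOTALL)
    return (
        answer.group(1).strip() if answer else "",
        explanation.group(1).strip() if explanation else "",
    )


parse_reply("Answer: toxic\nExplanation: Insults a group.")  # -> ("toxic", "Insults a group.")
```

Returning empty strings instead of raising keeps batch evaluation loops simple when a model deviates from the requested format.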

<details>
<summary><b>German (DE)</b></summary>

#### Hateful Content Detection (Multi3Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the German meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`

**Labels:** `hateful`, `not-hateful`

</details>

<details>
<summary><b>Hindi / Hindi-English (HI / HI-EN)</b></summary>

#### Hateful Content Detection (Multi3Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Examine the content of the Hindi meme. Does it contain language or imagery that is discriminatory or incites hatred? Assign one of the following labels: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`

#### Misogyny Detection (MIMIC2024)

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**Instruction:** `<image> Observe the Hindi-English meme and ascertain if there are signs of misogyny, such as actions or language indicative of objectification, discrimination, prejudice, or hostility towards women. Classify strictly as: 'misogynous' or 'not-misogynous'.`

**Labels:** `misogynous`, `not-misogynous`

#### Misogyny Categories (MIMIC2024)

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**Instruction:** `<image> Inspect the Hindi-English meme and decide whether its misogynous nature stems from objectification, prejudice, humiliation, a combination of these, or is unspecified. Select from the labels: 'Objectification', 'Prejudice', 'Humiliation', 'Objectification, Humiliation', 'Objectification, Prejudice', 'Prejudice, Humiliation', or 'Unspecified'.`

**Labels:** `Objectification`, `Prejudice`, `Humiliation`, `Objectification, Humiliation`, `Objectification, Prejudice`, `Prejudice, Humiliation`, `Unspecified`

</details>

<details>
<summary><b>Romanian (RO)</b></summary>

#### Deepfake Detection (RoMemes)

**System:** `You are an expert image analyst specializing in detecting manipulated or deepfake content in memes.`

**Instruction:** `<image> Assess the provided Romanian meme. Is the image content authentic, or has it been altered? If altered, was it done using AI (deepfake) or other methods? Based on your analysis, classify the meme as: 'DeepFake', 'Fake', or 'Real'.`

**Labels:** `DeepFake`, `Fake`, `Real`

#### Emotion Recognition (RoMemes)

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Determine the emotional impact of the Romanian meme. Analyze its components to select the most accurate emotional descriptor from the options provided. Classify as: 'Joy', 'Sadness', 'Anger', 'Fear', 'Surprise', or 'Love'.`

**Labels:** `Joy`, `Sadness`, `Anger`, `Fear`, `Surprise`, `Love`

#### Political Content Detection (RoMemes)

**System:** `You are an expert social media analyst specializing in analyzing political content in memes.`

**Instruction:** `<image> Assess whether the Romanian meme features political topics, including discussions about politicians, legislation, or political controversies. Classify the meme as either 'political' for relevant content, or 'not-political' if absent.`

**Labels:** `political`, `not-political`

#### Sentiment Analysis (RoMemes)

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Investigate the sentiment expressed by the following Romanian meme. Select the appropriate label: 'positive' if the meme is upbeat or approving, 'negative' if it is critical or displeased, or 'neutral' if it remains impartial.`

**Labels:** `positive`, `negative`, `neutral`

</details>

<details>
<summary><b>Russian (RU)</b></summary>

#### Toxicity Detection (Toxic Memes)

**System:** `You are an expert social media image analyzer specializing in identifying toxic content in memes.`

**Instruction:** `<image> Evaluate the Russian meme for the presence of toxicity. This involves identifying any language or imagery that is damaging, disrespectful, or offensive to individuals or groups. Based on your assessment, classify as: 'toxic' or 'not-toxic'.`

**Labels:** `toxic`, `not-toxic`

</details>

<details>
<summary><b>Spanish (ES)</b></summary>

#### Hateful Content Detection (Multi3Hate)

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Identify whether the Spanish meme displays hateful attitudes or sentiments, specifically looking for hate speech against protected groups. Classify your findings as: 'hateful' if hate speech is evident, or 'not-hateful' if it is not.`

**Labels:** `hateful`, `not-hateful`

</details>

## Citation