mbayan committed · Commit a867815 · verified · Parent: fcab2f9

Upload README.md with huggingface_hub

Files changed (1): README.md (+213 −170)

README.md CHANGED
@@ -168,470 +168,513 @@ Explanation: <explanation>
168
  > IMPORTANT: Your response must strictly follow this format:
169
  > 'Label: <label>
170
  > Explanation: <explanation>'
171
- > where <label> is your classification and <explanation> provides a brief justification for your decision based on the visual and textual content.
 
172
  >
173
  > Text extracted from meme: <OCR text>
174
  > ```
175
 
176
- ---
177
-
178
- #### Abuse Detection (BN)
179
 
180
  **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
181
 
182
- **Instruction:** `<image> Examine the Bengali meme and decide whether it features abusive content. This includes any form of verbal abuse or personal offense. Classify as: 'abusive' or 'not-abusive'.`
183
 
184
- **Labels:** `abusive`, `not-abusive`
185
 
186
- ---
 
187
 
188
- #### Deepfake Detection (RO)
 
189
 
190
  **System:** `You are an expert image analyst specializing in detecting manipulated or deepfake content in memes.`
191
 
192
- **Instruction:** `<image> Assess the provided Romanian meme. Is the image content authentic, or has it been altered? If altered, was it done using AI (deepfake) or other methods? Based on your analysis, classify the meme as: 'DeepFake', 'Fake', or 'Real'.`
193
 
194
- **Labels:** `DeepFake`, `Fake`, `Real`
195
 
196
- ---
 
197
 
198
- #### Emotion Recognition (RO)
 
199
 
200
  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
201
 
202
- **Instruction:** `<image> Determine the emotional impact of the Romanian meme. Analyze its components to select the most accurate emotional descriptor from the options provided. Classify as: 'Joy', 'Sadness', 'Anger', 'Fear', 'Surprise', or 'Love'.`
203
 
204
- **Labels:** `Joy`, `Sadness`, `Anger`, `Fear`, `Surprise`, `Love`
205
 
206
- ---
 
207
 
208
- #### Harmful Content Detection (EN)
 
209
 
210
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
211
 
212
  **Instruction:** `<image> Categorize the meme based on a three-point scale of harmfulness. Scrutinize the image and text to make your judgment. Classify as: 'not-harmful', 'partially-harmful', or 'very-harmful'.`
213
 
214
  **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
 
215
 
216
- ---
217
-
218
- #### Harmful Content Detection — COVID-19 (EN)
219
 
220
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
221
 
222
  **Instruction:** `<image> Assess the meme's content concerning the COVID-19 crisis and decide on its classification. The categories reflect increasing levels of potential harm, from none to severe. Classify the meme as either: 'not-harmful', 'partially-harmful', or 'very-harmful'.`
223
 
224
  **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
 
225
 
226
- ---
227
-
228
- #### Hateful Content Detection (EN — FHM)
229
 
230
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
231
 
232
- **Instruction:** `<image> Analyze the content of the meme to see if it promotes or displays hateful views against specific individuals or groups. Look for discriminatory, prejudiced, or hate speech elements associated with protected characteristics, including race, religion, and gender. Classify as: 'hateful' if present, or 'not-hateful' if absent.`
233
 
234
- **Labels:** `hateful`, `not-hateful`
235
 
236
- ---
 
237
 
238
- #### Hateful Content Detection (EN — MMHS)
 
239
 
240
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
241
 
242
- **Instruction:** `<image> Examine the meme and determine if it contains any form of hate speech. Classify as: 'hateful' if the content is discriminatory or disparaging towards a person or group, and 'not-hateful' if it is not.`
243
 
244
- **Labels:** `hateful`, `not-hateful`
245
 
246
- ---
 
247
 
248
- #### Hateful Content Detection (EN — Multi3Hate)
 
249
 
250
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
251
 
252
- **Instruction:** `<image> Analyze the provided English meme to determine if it contains any form of hate speech, prejudice, or discriminatory content. Based on your analysis, categorize the meme using one of the following labels. Classify as: 'hateful' or 'not-hateful'.`
253
 
254
- **Labels:** `hateful`, `not-hateful`
255
 
256
- ---
 
257
 
258
- #### Hateful Content Detection — Islamophobia (EN)
 
259
 
260
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
261
 
262
- **Instruction:** `<image> Please evaluate the provided meme. Your task is to determine if it communicates hateful or discriminatory messages specifically targeting Islam or its adherents. Classify as: 'hateful' or 'not-hateful'.`
263
 
264
  **Labels:** `hateful`, `not-hateful`
 
265
 
266
- ---
267
-
268
- #### Hateful Content Detection (AR)
269
 
270
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
271
 
272
- **Instruction:** `<image> Analyze the content of the Arabic meme and determine if it includes hateful language. Consider whether it displays hatred, discrimination, or bias based on identity or protected traits. Classify as: 'Hateful' if applicable, otherwise 'not-hateful'.`
273
-
274
- **Labels:** `Hateful`, `not-hateful`
275
 
276
- ---
 
277
 
278
- #### Hateful Content Detection (BN)
 
279
 
280
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
281
 
282
- **Instruction:** `<image> Is the provided Bengali meme hateful in nature? Review the meme for any content that expresses discrimination or prejudice. After your review, select one of the two possible classifications. Classify as: 'hateful' or 'not-hateful'.`
283
 
284
  **Labels:** `hateful`, `not-hateful`
 
285
 
286
- ---
287
-
288
- #### Hateful Content Detection (DE)
289
 
290
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
291
 
292
- **Instruction:** `<image> Analyze the German meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`
293
 
294
  **Labels:** `hateful`, `not-hateful`
 
295
 
296
- ---
297
-
298
- #### Hateful Content Detection (ES)
299
 
300
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
301
 
302
- **Instruction:** `<image> Identify whether the Spanish meme displays hateful attitudes or sentiments, specifically looking for hate speech against protected groups. Classify your findings as: 'hateful' if hate speech is evident, or 'not-hateful' if it is not.`
303
 
304
- **Labels:** `hateful`, `not-hateful`
305
 
306
- ---
 
307
 
308
- #### Hateful Content Detection (HI)
 
309
 
310
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
311
 
312
- **Instruction:** `<image> Examine the content of the Hindi meme. Does it contain language or imagery that is discriminatory or incites hatred? Assign one of the following labels: 'hateful' or 'not-hateful'.`
313
 
314
- **Labels:** `hateful`, `not-hateful`
315
 
316
- ---
 
317
 
318
- #### Hateful Content Detection (ZH)
 
319
 
320
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
321
 
322
- **Instruction:** `<image> Your objective is to classify the Chinese meme based on its content. Determine if it conveys a hateful message targeting any specific group. Use one of the two provided labels: 'hateful' or 'not-hateful'.`
323
 
324
- **Labels:** `hateful`, `not-hateful`
325
 
326
- ---
 
327
 
328
- #### Humor Classification (EN)
 
329
 
330
  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
331
 
332
  **Instruction:** `<image> Classify the level of humor in the meme. Analyze how funny the content is based on its comedic elements. Classify as: 'not-funny' for unfunny content, 'funny' for mildly humorous, 'very-funny' for highly humorous, or 'hilarious' for extremely funny content.`
333
 
334
  **Labels:** `not-funny`, `funny`, `very-funny`, `hilarious`
 
335
 
336
- ---
337
-
338
- #### Intention Detection (EN)
339
 
340
  **System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`
341
 
342
  **Instruction:** `<image> Categorize the primary purpose of the meme. What was the creator's main goal in making it? Please classify the intention as: 'Entertaining', 'Expressive', 'Interactive', or 'Offensive'.`
343
 
344
  **Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`
 
345
 
346
- ---
347
-
348
- #### Intention Detection (ZH)
349
 
350
  **System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`
351
 
352
- **Instruction:** `<image> Please review the Chinese meme and decide on the main reason it was created. Categorize its purpose using one of the provided labels which are: 'Entertaining' for amusement, 'Expressive' for emotional expression, 'Interactive' for engagement, 'Offensive' for provocative intent, or 'Other' for unclear intentions.`
353
 
354
- **Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`, `Other`
355
 
356
- ---
 
357
 
358
- #### Metaphor Detection (EN)
 
359
 
360
  **System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`
361
 
362
  **Instruction:** `<image> Determine whether the meme uses metaphorical or literal language. Analyze if the content relies on figurative language, symbolism, or indirect meaning. Classify as: 'Metaphorical' if it uses figurative language, or 'Literal' if the meaning is direct and straightforward.`
363
 
364
  **Labels:** `Metaphorical`, `Literal`
 
365
 
366
- ---
367
-
368
- #### Metaphor Detection (ZH)
369
 
370
  **System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`
371
 
372
- **Instruction:** `<image> Identify whether the Chinese meme exemplifies metaphorical communication by spotting elements of indirect meaning or symbolism, as opposed to stating facts directly. Classify as: 'Metaphorical' if you find figurative elements, otherwise choose 'Literal'.`
373
 
374
- **Labels:** `Metaphorical`, `Literal`
375
 
376
- ---
 
377
 
378
- #### Misogyny Detection (EN)
 
379
 
380
  **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
381
 
382
  **Instruction:** `<image> Scrutinize the meme to detect any indication of misogynous content, including aspects like prejudice, discrimination, objectification, or aversion towards women. Label the meme 'misogynous' when identified, or 'not-misogynous' if none are observed.`
383
 
384
  **Labels:** `misogynous`, `not-misogynous`
 
385
 
386
- ---
387
-
388
- #### Misogyny Detection (HI-EN)
389
 
390
  **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
391
 
392
- **Instruction:** `<image> Observe the Hindi-English meme and ascertain if there are signs of misogyny, such as actions or language indicative of objectification, discrimination, prejudice, or hostility towards women. Classify strictly as: 'misogynous' or 'not-misogynous'.`
393
-
394
- **Labels:** `misogynous`, `not-misogynous`
395
 
396
- ---
 
397
 
398
- #### Misogyny Categories (HI-EN)
 
399
 
400
  **System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`
401
 
402
- **Instruction:** `<image> Inspect the Hindi-English meme and decide whether its misogynous nature stems from objectification, prejudice, humiliation, a combination of these, or is unspecified. Select from the labels: 'Objectification', 'Prejudice', 'Humiliation', 'Objectification, Humiliation', 'Objectification, Prejudice', 'Prejudice, Humiliation', or 'Unspecified'.`
403
-
404
- **Labels:** `Objectification`, `Prejudice`, `Humiliation`, `Objectification, Humiliation`, `Objectification, Prejudice`, `Prejudice, Humiliation`, `Unspecified`
405
 
406
- ---
 
407
 
408
- #### Motivational Detection (EN)
 
409
 
410
  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
411
 
412
  **Instruction:** `<image> Assess if the meme fits the motivational category by checking for inspirational, encouraging, or uplifting content. Classify using: 'motivational' for memes with these features, 'not-motivational' otherwise.`
413
 
414
  **Labels:** `motivational`, `not-motivational`
 
415
 
416
- ---
417
-
418
- #### Objectification Detection (EN)
419
 
420
  **System:** `You are an expert social media image analyzer specializing in identifying objectification in memes.`
421
 
422
  **Instruction:** `<image> Evaluate the meme for signs that it depicts women as objects, prioritizes appearance over character, or undermines their autonomy. Classify as: 'objectification' when such elements are present; otherwise, use 'not-objectification'.`
423
 
424
  **Labels:** `objectification`, `not-objectification`
 
425
 
426
- ---
427
-
428
- #### Offensive Content Detection (EN — Memotion)
429
 
430
  **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
431
 
432
  **Instruction:** `<image> Carefully read the meme and judge how objectionable the material is. Then, classify it as: 'not-offensive', 'slightly-offensive', 'very-offensive', or 'hateful-offensive', based on the degree of offensiveness present.`
433
 
434
  **Labels:** `not-offensive`, `slightly-offensive`, `very-offensive`, `hateful-offensive`
 
435
 
436
- ---
437
-
438
- #### Offensive Content Detection (EN — MET-Meme)
439
 
440
  **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
441
 
442
  **Instruction:** `<image> Judge the degree of offensiveness present in the meme. Evaluate how inappropriate or offensive the content is, and assign one of the following labels: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`
443
 
444
  **Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`
 
445
 
446
- ---
447
-
448
- #### Offensive Content Detection (ZH)
449
 
450
  **System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`
451
 
452
- **Instruction:** `<image> Identify the degree of offensiveness in the supplied Chinese meme. Choose one of the following precise classifications: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`
453
 
454
- **Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`
455
 
456
- ---
 
457
 
458
- #### Political Content Detection (RO)
 
459
 
460
  **System:** `You are an expert social media analyst specializing in analyzing political content in memes.`
461
 
462
- **Instruction:** `<image> Assess whether the Romanian meme features political topics, including discussions about politicians, legislation, or political controversies. Classify the meme as either 'political' for relevant content, or 'not-political' if absent.`
463
 
464
- **Labels:** `political`, `not-political`
465
 
466
- ---
 
467
 
468
- #### Propaganda Detection (AR — ArMeme)
 
469
 
470
  **System:** `You are an expert social media analyst specializing in identifying propaganda techniques in memes.`
471
 
472
- **Instruction:** `<image> Identify if the given Arabic meme is an example of propaganda by checking for biased or misleading narratives intended to affect political or ideological perspectives. Select: 'propaganda' or 'not-propaganda' as appropriate.`
473
 
474
- **Labels:** `propaganda`, `not-propaganda`
475
 
476
- ---
 
477
 
478
- #### Sarcasm Detection (BN)
 
479
 
480
  **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
481
 
482
- **Instruction:** `<image> Assess the communication style of the Bengali meme. Does it rely on sarcasm to make its point? If the meme is straightforward and means exactly what it says, it is not sarcastic. Classify the content as: 'sarcasm' or 'not-sarcasm'.`
483
 
484
- **Labels:** `sarcasm`, `not-sarcasm`
485
 
486
- ---
 
487
 
488
- #### Sarcasm Detection (EN)
 
489
 
490
  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
491
 
492
  **Instruction:** `<image> Your objective is to identify the type of sarcasm used in the meme. Determine if the content is straightforward or if it uses irony to convey its message. Please assign a classification based on the complexity of the sarcasm. Classify as: 'not-sarcastic', 'general-sarcasm', 'twisted-meaning', or 'very-twisted'.`
493
 
494
  **Labels:** `not-sarcastic`, `general-sarcasm`, `twisted-meaning`, `very-twisted`
 
495
 
496
- ---
497
-
498
- #### Sentiment Analysis (EN — Memotion)
499
 
500
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
501
 
502
  **Instruction:** `<image> Review the meme and determine its general emotional sentiment. Based on your analysis, assign one of the following labels: 'very-negative', 'negative', 'neutral', 'positive', or 'very-positive'.`
503
 
504
  **Labels:** `very-negative`, `negative`, `neutral`, `positive`, `very-positive`
 
505
 
506
- ---
507
-
508
- #### Sentiment Analysis (BN)
509
 
510
  **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
511
 
512
- **Instruction:** `<image> Examine the Bengali meme and identify the sentiment it conveys. Classify the emotion as: 'positive', 'negative', or 'neutral'.`
513
 
514
- **Labels:** `positive`, `negative`, `neutral`
515
 
516
- ---
 
517
 
518
- #### Sentiment Analysis (RO)
 
519
 
520
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
521
 
522
- **Instruction:** `<image> Investigate the sentiment expressed by the following Romanian meme. Select the appropriate label: 'positive' if the meme is upbeat or approving, 'negative' if it is critical or displeased, or 'neutral' if it remains impartial.`
523
 
524
- **Labels:** `positive`, `negative`, `neutral`
525
 
526
- ---
 
527
 
528
- #### Sentiment Category (EN)
 
529
 
530
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
531
 
532
  **Instruction:** `<image> For the meme presented, decide which single emotional category it belongs to, considering its overall message and tone. The available classifications are: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`
533
 
534
  **Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`
 
535
 
536
- ---
537
-
538
- #### Sentiment Category (ZH)
539
 
540
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
541
 
542
- **Instruction:** `<image> Classify the provided Chinese meme based on its primary emotional content. Your selection must be one of the following: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`
543
 
544
- **Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`
545
 
546
- ---
 
547
 
548
- #### Sentiment Degree (EN)
 
549
 
550
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
551
 
552
  **Instruction:** `<image> Considering the overall sentiment in the meme, classify its intensity by selecting: 'slightly' for minimal emotional presence, 'moderately' for a balanced intensity, or 'very' if the sentiment is intense and vivid.`
553
 
554
  **Labels:** `slightly`, `moderately`, `very`
 
555
 
556
- ---
557
-
558
- #### Sentiment Degree (ZH)
559
 
560
  **System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`
561
 
562
- **Instruction:** `<image> Please examine the Chinese meme and classify how strongly its sentiment is expressed. Use the following labels for the intensity: 'slightly' for a low degree, 'moderately' for a medium degree, and 'very' for a high degree of emotional expression.`
563
 
564
- **Labels:** `slightly`, `moderately`, `very`
565
 
566
- ---
 
567
 
568
- #### Shaming Detection (EN)
 
569
 
570
  **System:** `You are an expert social media image analyzer specializing in identifying shaming content in memes.`
571
 
572
  **Instruction:** `<image> Classify whether the meme contains shaming content directed at women. Determine if it aims to humiliate, embarrass, or shame women about their appearance, behavior, or choices. Classify as: 'shaming' if such content is present, or 'not-shaming' if it is not.`
573
 
574
  **Labels:** `shaming`, `not-shaming`
 
575
 
576
- ---
577
-
578
- #### Stereotype Detection (EN)
579
 
580
  **System:** `You are an expert social media image analyzer specializing in identifying stereotypes in memes.`
581
 
582
  **Instruction:** `<image> Review the meme and determine if it contains stereotypical content about women—look for generalized, simplified, or biased messages. Classify as: 'stereotype' if you find such traits, otherwise 'not-stereotype'.`
583
 
584
  **Labels:** `stereotype`, `not-stereotype`
 
585
 
586
- ---
587
-
588
- #### Target Identification (EN)
589
 
590
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
591
 
592
  **Instruction:** `<image> For the provided meme, interpret who or what is being singled out. Is the commentary aimed at one person, a community, an organization, all of society, or no target? Make your classification using: 'individual', 'community', 'organization', 'society', or 'none'.`
593
 
594
  **Labels:** `individual`, `community`, `organization`, `society`, `none`
 
595
 
596
- ---
597
-
598
- #### Target Identification — COVID-19 (EN)
599
 
600
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
601
 
602
  **Instruction:** `<image> Review the COVID-19 meme and determine the entity that is depicted as the target within the content. Assign one of the following categories: 'individual', 'community', 'organization', 'society', or 'none'.`
603
 
604
  **Labels:** `individual`, `community`, `organization`, `society`, `none`
 
605
 
606
- ---
607
-
608
- #### Toxicity Detection (RU)
609
 
610
  **System:** `You are an expert social media image analyzer specializing in identifying toxic content in memes.`
611
 
612
- **Instruction:** `<image> Evaluate the Russian meme for the presence of toxicity. This involves identifying any language or imagery that is damaging, disrespectful, or offensive to individuals or groups. Based on your assessment, classify as: 'toxic' or 'not-toxic'.`
613
 
614
- **Labels:** `toxic`, `not-toxic`
615
 
616
- ---
 
617
 
618
- #### Violence Detection (EN)
 
619
 
620
  **System:** `You are an expert social media image analyzer specializing in identifying violent content in memes.`
621
 
622
  **Instruction:** `<image> Is there content in this meme that shows, glorifies, or promotes physical, sexual, or psychological violence toward women? Please make a determination. Classify as: 'violence' if yes, and 'not-violence' if no.`
623
 
624
  **Labels:** `violence`, `not-violence`
 
625
 
626
- ---
627
-
628
- #### Vulgar Content Detection (BN)
629
 
630
  **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
631
 
632
- **Instruction:** `<image> Evaluate the Bengali meme to see if vulgarity is present. Search for crude, obscene, or improper language or images. Classify as: 'vulgar' if such content is found, otherwise use 'not-vulgar'.`
 
 
633
 
634
- **Labels:** `vulgar`, `not-vulgar`
 
635
 
636
  ## Citation
637
 
 
168
  > IMPORTANT: Your response must strictly follow this format:
169
  > 'Label: <label>
170
  > Explanation: <explanation>'
171
+ > where <label> is your classification and <explanation> provides a brief justification
172
+ > for your decision based on the visual and textual content.
173
  >
174
  > Text extracted from meme: <OCR text>
175
  > ```
176
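Every template below shares the strict reply format quoted above. As a minimal sketch (the helper name and regex are assumptions, not part of the dataset), a reply in that format can be split into its two fields like this:

```python
import re

def parse_response(text: str) -> tuple[str, str]:
    """Split a reply of the form "Label: <label> / Explanation: <explanation>"
    into its two fields; raises if the reply deviates from the format."""
    match = re.search(r"Label:\s*(.+?)\s*\nExplanation:\s*(.+)", text, re.DOTALL)
    if match is None:
        raise ValueError("reply does not follow the required format")
    return match.group(1).strip(), match.group(2).strip()

label, explanation = parse_response(
    "Label: not-harmful\nExplanation: The meme is a benign joke with no target."
)
# label == "not-harmful"
```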
 
177
+ <details>
178
+ <summary><b>Abuse Detection (BN)</b></summary>
 
179
 
180
  **System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`
181
 
182
+ **English Instruction:** `<image> Examine the Bengali meme and decide whether it features abusive content. This includes any form of verbal abuse or personal offense. Classify as: 'abusive' or 'not-abusive'.`
183
 
184
+ **Native (BN) Instruction:** `<image> এই বাংলা মেমটিতে কি কোনো ধরনের আপত্তিকর বা অবমাননাকর বিষয়বস্তু আছে? আপনার সিদ্ধান্ত অনুযায়ী মেমটিকে 'অপমানজনক' অথবা 'অপমানজনক নয়' লেবেল দিন।`
185
 
186
+ **English Labels:** `abusive`, `not-abusive` | **Native Labels:** `অপমানজনক`, `অপমানজনক নয়`
187
+ </details>
188
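A template like the one above might be turned into a multimodal chat request as follows — a sketch assuming a generic message schema (`build_messages` and the content-part dicts are illustrative, not this dataset's loader):

```python
def build_messages(system: str, instruction: str, ocr_text: str, image) -> list[dict]:
    """Assemble a chat request from one template: the system prompt, then the
    instruction with its <image> placeholder replaced by an image part, with
    the OCR line appended as the reply-format block above specifies."""
    text = instruction.replace("<image>", "").strip()
    text += f"\n\nText extracted from meme: {ocr_text}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": [
            {"type": "image", "image": image},
            {"type": "text", "text": text},
        ]},
    ]
```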
 
189
+ <details>
190
+ <summary><b>Deepfake Detection (RO)</b></summary>
191
 
192
  **System:** `You are an expert image analyst specializing in detecting manipulated or deepfake content in memes.`
193
 
194
+ **English Instruction:** `<image> Assess the provided Romanian meme. Is the image content authentic, or has it been altered? If altered, was it done using AI (deepfake) or other methods? Based on your analysis, classify the meme as: 'DeepFake', 'Fake', or 'Real'.`
195
 
196
+ **Native (RO) Instruction:** `<image> Este această memă românească autentică sau a fost manipulată? Dacă a fost manipulată, s-a folosit inteligență artificială (deepfake)? Folosește următoarele etichete pentru a clasifica: 'DeepFake' pentru conținut falsificat cu AI, 'Fake' pentru alt tip de falsificare, sau 'Real' pentru conținut original.`
197
 
198
+ **English Labels:** `DeepFake`, `Fake`, `Real` | **Native Labels:** `DeepFake`, `Fals`, `Real`
199
+ </details>
200
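Tasks with native labels, like the Romanian one above, elicit predictions in the native label space; for scoring, one might map them back to the English labels — a sketch in which the label pairs are copied from the template but the dict and function names are assumptions:

```python
# Native-to-English label map for Deepfake Detection (RO), taken from the
# template above; RO_DEEPFAKE_LABELS and to_english are illustrative names.
RO_DEEPFAKE_LABELS = {"DeepFake": "DeepFake", "Fals": "Fake", "Real": "Real"}

def to_english(native_label: str, mapping: dict) -> str:
    """Normalize a native-language prediction to its English label."""
    return mapping[native_label.strip()]
```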
 
201
+ <details>
202
+ <summary><b>Emotion Recognition (RO)</b></summary>
203
 
204
  **System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`
205
 
206
+ **English Instruction:** `<image> Determine the emotional impact of the Romanian meme. Analyze its components to select the most accurate emotional descriptor from the options provided. Classify as: 'Joy', 'Sadness', 'Anger', 'Fear', 'Surprise', or 'Love'.`
207
 
208
+ **Native (RO) Instruction:** `<image> Examinați mema românească și identificați cea mai evidentă emoție transmisă prin text și imagine. Hotărâți care stare emoțională este cea mai puternică. Clasificați ca: 'Bucurie' dacă simțiți fericire, 'Tristețe' dacă se exprimă supărare, 'Furie' dacă este prezentă mânia, 'Frică' pentru anxietate, 'Surpriză' pentru șoc, sau 'Dragoste' pentru afecțiune.`
209
 
210
+ **English Labels:** `Joy`, `Sadness`, `Anger`, `Fear`, `Surprise`, `Love` | **Native Labels:** `Bucurie`, `Tristețe`, `Furie`, `Frică`, `Surpriză`, `Dragoste`
211
+ </details>
212
 
213
+ <details>
214
+ <summary><b>Harmful Content Detection (EN)</b></summary>
215
 
216
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
217
 
218
  **Instruction:** `<image> Categorize the meme based on a three-point scale of harmfulness. Scrutinize the image and text to make your judgment. Classify as: 'not-harmful', 'partially-harmful', or 'very-harmful'.`
219
 
220
  **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
221
+ </details>
222
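Since every task is a closed-set classification, a parsed label can be checked against the task's label list before scoring — a small sketch (the label set is copied from the template above; `validate_label` is a hypothetical helper, not part of the dataset):

```python
HARMFUL_LABELS = {"not-harmful", "partially-harmful", "very-harmful"}

def validate_label(predicted: str, allowed: set) -> str:
    """Strip quotes and whitespace, then reject any prediction that falls
    outside the task's closed label set."""
    cleaned = predicted.strip().strip("'\"").lower()
    if cleaned not in allowed:
        raise ValueError(f"unexpected label: {predicted!r}")
    return cleaned
```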
 
223
+ <details>
224
+ <summary><b>Harmful Content Detection — COVID-19 (EN)</b></summary>
 
225
 
226
  **System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`
227
 
228
  **Instruction:** `<image> Assess the meme's content concerning the COVID-19 crisis and decide on its classification. The categories reflect increasing levels of potential harm, from none to severe. Classify the meme as either: 'not-harmful', 'partially-harmful', or 'very-harmful'.`
229
 
230
  **Labels:** `not-harmful`, `partially-harmful`, `very-harmful`
231
+ </details>
232
 
233
+ <details>
234
+ <summary><b>Hateful Content Detection (AR — Prop2Hate)</b></summary>
 
235
 
236
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
237
 
238
+ **English Instruction:** `<image> Analyze the content of the Arabic meme and determine if it includes hateful language. Consider whether it displays hatred, discrimination, or bias based on identity or protected traits. Classify as: 'Hateful' if applicable, otherwise 'not-hateful'.`
239
 
240
+ **Native (AR) Instruction:** `<image> تتمثل مهمتك في تمييز المحتوى البغيض في الميمات العربية. هل تحتوي الميم على أي خطاب تمييزي أو عدائي؟ صنفها وفقًا لذلك: 'كراهية' إذا كانت كذلك، أو 'غير كراهية' إذا لم تكن.`
241
 
242
+ **English Labels:** `Hateful`, `not-hateful` | **Native Labels:** `كراهية`, `غير كراهية`
243
+ </details>
244
 
245
+ <details>
246
+ <summary><b>Hateful Content Detection (BN — MUTE)</b></summary>
247
 
248
  **System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`
249
 
250
+ **English Instruction:** `<image> Is the provided Bengali meme hateful in nature? Review the meme for any content that expresses discrimination or prejudice. After your review, select one of the two possible classifications. Classify as: 'hateful' or 'not-hateful'.`
251
 
252
+ **Native (BN) Instruction:** `<image> এই বাংলা মেমটি দেখে বলুন, এতে কি কোনো ঘৃণামূলক বিষয়বস্তু আছে? যদি কোনো ব্যক্তি বা গোষ্ঠীকে আক্রমণ করা হয়ে থাকে, তাহলে এটিকে 'ঘৃণামূলক' হিসেবে চিহ্নিত করুন। যদি তা না হয়, তাহলে এটিকে 'ঘৃণামূলক নয়' হিসেবে চিহ্নিত করুন।`
253
 
254
+ **English Labels:** `hateful`, `not-hateful` | **Native Labels:** `ঘৃণামূলক`, `ঘৃণামূলক নয়`
255
+ </details>

<details>
<summary><b>Hateful Content Detection (DE — Multi3Hate)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**English Instruction:** `<image> Analyze the German meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`

**Native (DE) Instruction:** `<image> Bewerten Sie das folgende deutsche Meme. Fördert es Hass oder Vorurteile gegen eine bestimmte Gruppe? Ordnen Sie es einer der folgenden Kategorien zu: 'hasserfüllt' oder 'nicht hasserfüllt'.`

**English Labels:** `hateful`, `not-hateful` | **Native Labels:** `hasserfüllt`, `nicht hasserfüllt`
</details>

<details>
<summary><b>Hateful Content Detection (EN — FHM)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the content of the meme to see if it promotes or displays hateful views against specific individuals or groups. Look for discriminatory, prejudiced, or hate speech elements associated with protected characteristics, including race, religion, and gender. Classify as: 'hateful' if present, or 'not-hateful' if absent.`

**Labels:** `hateful`, `not-hateful`
</details>

<details>
<summary><b>Hateful Content Detection (EN — MMHS)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Examine the meme and determine if it contains any form of hate speech. Classify as: 'hateful' if the content is discriminatory or disparaging towards a person or group, and 'not-hateful' if it is not.`

**Labels:** `hateful`, `not-hateful`
</details>

<details>
<summary><b>Hateful Content Detection (EN — Multi3Hate)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Analyze the provided English meme to determine if it contains any form of hate speech, prejudice, or discriminatory content. Based on your analysis, categorize the meme using one of the following labels. Classify as: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`
</details>

<details>
<summary><b>Hateful Content Detection — Islamophobia (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**Instruction:** `<image> Please evaluate the provided meme. Your task is to determine if it communicates hateful or discriminatory messages specifically targeting Islam or its adherents. Classify as: 'hateful' or 'not-hateful'.`

**Labels:** `hateful`, `not-hateful`
</details>

<details>
<summary><b>Hateful Content Detection (ES — Multi3Hate)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**English Instruction:** `<image> Identify whether the Spanish meme displays hateful attitudes or sentiments, specifically looking for hate speech against protected groups. Classify your findings as: 'hateful' if hate speech is evident, or 'not-hateful' if it is not.`

**Native (ES) Instruction:** `<image> Considerando el meme en español, ¿se puede catalogar como discurso de odio por atacar o denigrar a un colectivo? Categorízalo como: 'odioso' o 'no odioso'.`

**English Labels:** `hateful`, `not-hateful` | **Native Labels:** `odioso`, `no odioso`
</details>

<details>
<summary><b>Hateful Content Detection (HI — Multi3Hate)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**English Instruction:** `<image> Examine the content of the Hindi meme. Does it contain language or imagery that is discriminatory or incites hatred? Assign one of the following labels: 'hateful' or 'not-hateful'.`

**Native (HI) Instruction:** `<image> Determine if the Hindi meme contains hateful content. Identify whether it expresses hate, discrimination, or prejudice against individuals or groups. Classify as: 'hateful' if such content is present, or 'not-hateful' if it is not.`

**English Labels:** `hateful`, `not-hateful` | **Native Labels:** `घृणास्पद`, `घृणास्पद नहीं`
</details>

<details>
<summary><b>Hateful Content Detection (ZH — Multi3Hate)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying hateful content in memes.`

**English Instruction:** `<image> Your objective is to classify the Chinese meme based on its content. Determine if it conveys a hateful message targeting any specific group. Use one of the two provided labels: 'hateful' or 'not-hateful'.`

**Native (ZH) Instruction:** `<image> Analyze the Chinese meme for hateful content. Determine if it promotes hate, discrimination, or prejudice against protected groups. Classify as: 'hateful' if it contains hate speech, or 'not-hateful' if it does not.`

**English Labels:** `hateful`, `not-hateful` | **Native Labels:** `仇恨`, `非仇恨`
</details>

<details>
<summary><b>Humor Classification (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Classify the level of humor in the meme. Analyze how funny the content is based on its comedic elements. Classify as: 'not-funny' for unfunny content, 'funny' for mildly humorous, 'very-funny' for highly humorous, or 'hilarious' for extremely funny content.`

**Labels:** `not-funny`, `funny`, `very-funny`, `hilarious`
</details>

<details>
<summary><b>Intention Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`

**Instruction:** `<image> Categorize the primary purpose of the meme. What was the creator's main goal in making it? Please classify the intention as: 'Entertaining', 'Expressive', 'Interactive', or 'Offensive'.`

**Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`
</details>

<details>
<summary><b>Intention Detection (ZH)</b></summary>

**System:** `You are an expert social media image analyzer specializing in detecting intentions in memes.`

**English Instruction:** `<image> Please review the Chinese meme and decide on the main reason it was created. Categorize its purpose using one of the provided labels which are: 'Entertaining' for amusement, 'Expressive' for emotional expression, 'Interactive' for engagement, 'Offensive' for provocative intent, or 'Other' for unclear intentions.`

**Native (ZH) Instruction:** `<image> 请对下面的中文表情包进行分类,明确判断其主要意图。参考以下分类标准:'娱乐' 指娱乐性的内容,'表达' 用于表达情绪或观点,'互动' 用于互动交流,'冒犯' 包含冒犯意识,'其他' 意图不清。分类为:'娱乐'、'表达'、'互动'、'冒犯' 或 '其他'。`

**English Labels:** `Entertaining`, `Expressive`, `Interactive`, `Offensive`, `Other` | **Native Labels:** `娱乐`, `表达`, `互动`, `冒犯`, `其他`
</details>
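
For the bilingual tasks, predictions made with the native-language prompts can be scored against the English label set with a simple mapping built from the label tables above. A minimal sketch using the Intention Detection (ZH) labels; the dictionary mirrors the table above, and the function name is hypothetical:

```python
# Native → English label mapping for Intention Detection (ZH),
# taken from the English/Native label table above.
ZH_INTENTION_LABELS: dict[str, str] = {
    "娱乐": "Entertaining",
    "表达": "Expressive",
    "互动": "Interactive",
    "冒犯": "Offensive",
    "其他": "Other",
}

def to_english_label(pred: str, mapping: dict[str, str]) -> str:
    """Translate a native-language label into its English equivalent;
    English labels and unknown strings pass through unchanged."""
    pred = pred.strip()
    if pred in mapping.values():
        return pred
    return mapping.get(pred, pred)
```

The same pattern applies to the other bilingual tasks (e.g. `ঘৃণামূলক` → `hateful`, `дивия`-style mappings aside, each table above defines one such dictionary).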

<details>
<summary><b>Metaphor Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`

**Instruction:** `<image> Determine whether the meme uses metaphorical or literal language. Analyze if the content relies on figurative language, symbolism, or indirect meaning. Classify as: 'Metaphorical' if it uses figurative language, or 'Literal' if the meaning is direct and straightforward.`

**Labels:** `Metaphorical`, `Literal`
</details>

<details>
<summary><b>Metaphor Detection (ZH)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying metaphors in memes.`

**English Instruction:** `<image> Identify whether the Chinese meme exemplifies metaphorical communication by spotting elements of indirect meaning or symbolism, as opposed to stating facts directly. Classify as: 'Metaphorical' if you find figurative elements, otherwise choose 'Literal'.`

**Native (ZH) Instruction:** `<image> 请根据中文表情包中的语言表达方式判断,是属于隐喻还是字面。若内容具有象征性或使用比喻手法,则分类为 '隐喻';否则,若表达直白无修饰,则分类为 '字面'。`

**English Labels:** `Metaphorical`, `Literal` | **Native Labels:** `隐喻`, `字面`
</details>

<details>
<summary><b>Misogyny Detection (EN — MAMI)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**Instruction:** `<image> Scrutinize the meme to detect any indication of misogynous content, including aspects like prejudice, discrimination, objectification, or aversion towards women. Label the meme 'misogynous' when identified, or 'not-misogynous' if none are observed.`

**Labels:** `misogynous`, `not-misogynous`
</details>

<details>
<summary><b>Misogyny Detection (HI-EN — MIMIC2024)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**English Instruction:** `<image> Observe the Hindi-English meme and ascertain if there are signs of misogyny, such as actions or language indicative of objectification, discrimination, prejudice, or hostility towards women. Classify strictly as: 'misogynous' or 'not-misogynous'.`

**English Labels:** `misogynous`, `not-misogynous` | **Native Labels:** `महिला विरोधी`, `महिला विरोधी नहीं`
</details>

<details>
<summary><b>Misogyny Categories (HI-EN — MIMIC2024)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying misogynistic content in memes.`

**English Instruction:** `<image> Inspect the Hindi-English meme and decide whether its misogynous nature stems from objectification, prejudice, humiliation, a combination of these, or is unspecified. Select from the labels: 'Objectification', 'Prejudice', 'Humiliation', 'Objectification, Humiliation', 'Objectification, Prejudice', 'Prejudice, Humiliation', or 'Unspecified'.`

**English Labels:** `Objectification`, `Prejudice`, `Humiliation`, `Objectification, Humiliation`, `Objectification, Prejudice`, `Prejudice, Humiliation`, `Unspecified` | **Native Labels:** `वस्तुकरण`, `पूर्वाग्रह`, `अपमान`, `वस्तुकरण, अपमान`, `वस्तुकरण, पूर्वाग्रह`, `पूर्वाग्रह, अपमान`, `अनिर्दिष्ट`
</details>

<details>
<summary><b>Motivational Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Assess if the meme fits the motivational category by checking for inspirational, encouraging, or uplifting content. Classify using: 'motivational' for memes with these features, 'not-motivational' otherwise.`

**Labels:** `motivational`, `not-motivational`
</details>

<details>
<summary><b>Objectification Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying objectification in memes.`

**Instruction:** `<image> Evaluate the meme for signs that it depicts women as objects, prioritizes appearance over character, or undermines their autonomy. Classify as: 'objectification' when such elements are present; otherwise, use 'not-objectification'.`

**Labels:** `objectification`, `not-objectification`
</details>

<details>
<summary><b>Offensive Content Detection (EN — Memotion)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**Instruction:** `<image> Carefully read the meme and judge how objectionable the material is. Then, classify it as: 'not-offensive', 'slightly-offensive', 'very-offensive', or 'hateful-offensive', based on the degree of offensiveness present.`

**Labels:** `not-offensive`, `slightly-offensive`, `very-offensive`, `hateful-offensive`
</details>

<details>
<summary><b>Offensive Content Detection (EN — MET-Meme)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**Instruction:** `<image> Judge the degree of offensiveness present in the meme. Evaluate how inappropriate or offensive the content is, and assign one of the following labels: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`

**Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive`
</details>

<details>
<summary><b>Offensive Content Detection (ZH — MET-Meme)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying offensive content in memes.`

**English Instruction:** `<image> Identify the degree of offensiveness in the supplied Chinese meme. Choose one of the following precise classifications: 'not-offensive', 'slightly-offensive', 'moderately-offensive', or 'very-offensive'.`

**Native (ZH) Instruction:** `<image> 对每个中文表情包,判断它的冒犯性级别。分类包括:'不冒犯' —— 完全无冒犯意向,'轻度冒犯' —— 轻微不当,'中度冒犯' —— 明显冒犯,'严重冒犯' —— 极度不当。请据此分辨。`

**English Labels:** `not-offensive`, `slightly-offensive`, `moderately-offensive`, `very-offensive` | **Native Labels:** `不冒犯`, `轻度冒犯`, `中度冒犯`, `严重冒犯`
</details>

<details>
<summary><b>Political Content Detection (RO)</b></summary>

**System:** `You are an expert social media analyst specializing in analyzing political content in memes.`

**English Instruction:** `<image> Assess whether the Romanian meme features political topics, including discussions about politicians, legislation, or political controversies. Classify the meme as either 'political' for relevant content, or 'not-political' if absent.`

**Native (RO) Instruction:** `<image> Analizați dacă mema românească se referă la politică prin menționarea politicienilor, politicilor, alegerilor sau evenimentelor asociate. Clasificați ca: 'politic' în cazul unui subiect politic, sau ca: 'nepolitic' dacă nu este politică.`

**English Labels:** `political`, `not-political` | **Native Labels:** `politic`, `nepolitic`
</details>

<details>
<summary><b>Propaganda Detection (AR — ArMeme)</b></summary>

**System:** `You are an expert social media analyst specializing in identifying propaganda techniques in memes.`

**English Instruction:** `<image> Identify if the given Arabic meme is an example of propaganda by checking for biased or misleading narratives intended to affect political or ideological perspectives. Select: 'propaganda' or 'not-propaganda' as appropriate.`

**Native (AR) Instruction:** `<image> صنف الميم التالية بناءً على وجود أو غياب العناصر الدعائية. تُعرَّف الدعاية بأنها معلومات، خاصة ذات طبيعة متحيزة أو مضللة، تستخدم للترويج لقضية سياسية. اختر بين 'دعاية' و 'ليست دعاية'.`

**English Labels:** `propaganda`, `not-propaganda` | **Native Labels:** `دعاية`, `ليست دعاية`
</details>

<details>
<summary><b>Sarcasm Detection (BN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**English Instruction:** `<image> Assess the communication style of the Bengali meme. Does it rely on sarcasm to make its point? If the meme is straightforward and means exactly what it says, it is not sarcastic. Classify the content as: 'sarcasm' or 'not-sarcasm'.`

**Native (BN) Instruction:** `<image> এই বাংলা মেমের বিষয়বস্তু বিশ্লেষণ করুন। যদি এটি বিদ্রূপ বা শ্লেষ ব্যবহার করে কোনো বার্তা দেয়, তবে তা ব্যঙ্গাত্মক। আপনার সিদ্ধান্ত অনুযায়ী এটিকে 'ব্যঙ্গাত্মক' বা 'ব্যঙ্গাত্মক নয়' হিসেবে চিহ্নিত করুন।`

**English Labels:** `sarcasm`, `not-sarcasm` | **Native Labels:** `ব্যঙ্গাত্মক`, `ব্যঙ্গাত্মক নয়`
</details>

<details>
<summary><b>Sarcasm Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in emotion recognition in memes.`

**Instruction:** `<image> Your objective is to identify the type of sarcasm used in the meme. Determine if the content is straightforward or if it uses irony to convey its message. Please assign a classification based on the complexity of the sarcasm. Classify as: 'not-sarcastic', 'general-sarcasm', 'twisted-meaning', or 'very-twisted'.`

**Labels:** `not-sarcastic`, `general-sarcasm`, `twisted-meaning`, `very-twisted`
</details>

<details>
<summary><b>Sentiment Analysis (EN — Memotion)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Review the meme and determine its general emotional sentiment. Based on your analysis, assign one of the following labels: 'very-negative', 'negative', 'neutral', 'positive', or 'very-positive'.`

**Labels:** `very-negative`, `negative`, `neutral`, `positive`, `very-positive`
</details>

<details>
<summary><b>Sentiment Analysis (BN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**English Instruction:** `<image> Examine the Bengali meme and identify the sentiment it conveys. Classify the emotion as: 'positive', 'negative', or 'neutral'.`

**Native (BN) Instruction:** `<image> নিম্নলিখিত বাংলা মেমে-র মতবোধ/আবেগ পরীক্ষা করুন এবং শ্রেণীতে ভাগ করুন। বাছাই করুন: 'ইতিবাচক' শান্তিপূর্ণ বা প্রেরণাদায়ক হলে, 'নেতিবাচক' হতাশাজনক বা অপ্রীতিকর হলে, অথবা 'নিরপেক্ষ' কোনো প্রকৃত অনুভূতি না থাকলে।`

**English Labels:** `positive`, `negative`, `neutral` | **Native Labels:** `ইতিবাচক`, `নেতিবাচক`, `নিরপেক্ষ`
</details>

<details>
<summary><b>Sentiment Analysis (RO)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**English Instruction:** `<image> Investigate the sentiment expressed by the following Romanian meme. Select the appropriate label: 'positive' if the meme is upbeat or approving, 'negative' if it is critical or displeased, or 'neutral' if it remains impartial.`

**Native (RO) Instruction:** `<image> Identifică și clasifică sentimentul din mema următoare. Specificați dacă este 'pozitiv', 'negativ' sau 'neutru'.`

**English Labels:** `positive`, `negative`, `neutral` | **Native Labels:** `pozitiv`, `negativ`, `neutru`
</details>

<details>
<summary><b>Sentiment Category (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> For the meme presented, decide which single emotional category it belongs to, considering its overall message and tone. The available classifications are: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`

**Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate`
</details>

<details>
<summary><b>Sentiment Category (ZH)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**English Instruction:** `<image> Classify the provided Chinese meme based on its primary emotional content. Your selection must be one of the following: 'Happiness', 'Sorrow', 'Anger', 'Fear', 'Surprise', 'Love', or 'Hate'.`

**Native (ZH) Instruction:** `<image> 请审视这个中文表情包,并为其指定一个主要的情感类别。从以下列表中选择:'快乐','悲伤','愤怒','恐惧','惊讶','爱','恨'。`

**English Labels:** `Happiness`, `Sorrow`, `Anger`, `Fear`, `Surprise`, `Love`, `Hate` | **Native Labels:** `快乐`, `悲伤`, `愤怒`, `恐惧`, `惊讶`, `爱`, `恨`
</details>

<details>
<summary><b>Sentiment Degree (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**Instruction:** `<image> Considering the overall sentiment in the meme, classify its intensity by selecting: 'slightly' for minimal emotional presence, 'moderately' for a balanced intensity, or 'very' if the sentiment is intense and vivid.`

**Labels:** `slightly`, `moderately`, `very`
</details>

<details>
<summary><b>Sentiment Degree (ZH)</b></summary>

**System:** `You are an expert social media image analyzer specializing in sentiment analysis of memes.`

**English Instruction:** `<image> Please examine the Chinese meme and classify how strongly its sentiment is expressed. Use the following labels for the intensity: 'slightly' for a low degree, 'moderately' for a medium degree, and 'very' for a high degree of emotional expression.`

**Native (ZH) Instruction:** `<image> 判断中文表情包所传达情感的强度。请根据实际情感力度,将其归为:'轻微'、'中等' 或 '非常'。`

**English Labels:** `slightly`, `moderately`, `very` | **Native Labels:** `轻微`, `中等`, `非常`
</details>

<details>
<summary><b>Shaming Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying shaming content in memes.`

**Instruction:** `<image> Classify whether the meme contains shaming content directed at women. Determine if it aims to humiliate, embarrass, or shame women about their appearance, behavior, or choices. Classify as: 'shaming' if such content is present, or 'not-shaming' if it is not.`

**Labels:** `shaming`, `not-shaming`
</details>

<details>
<summary><b>Stereotype Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying stereotypes in memes.`

**Instruction:** `<image> Review the meme and determine if it contains stereotypical content about women—look for generalized, simplified, or biased messages. Classify as: 'stereotype' if you find such traits, otherwise 'not-stereotype'.`

**Labels:** `stereotype`, `not-stereotype`
</details>

<details>
<summary><b>Target Identification (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> For the provided meme, interpret who or what is being singled out. Is the commentary aimed at one person, a community, an organization, all of society, or no target? Make your classification using: 'individual', 'community', 'organization', 'society', or 'none'.`

**Labels:** `individual`, `community`, `organization`, `society`, `none`
</details>

<details>
<summary><b>Target Identification — COVID-19 (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying harmful content in memes.`

**Instruction:** `<image> Review the COVID-19 meme and determine the entity that is depicted as the target within the content. Assign one of the following categories: 'individual', 'community', 'organization', 'society', or 'none'.`

**Labels:** `individual`, `community`, `organization`, `society`, `none`
</details>

<details>
<summary><b>Toxicity Detection (RU)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying toxic content in memes.`

**English Instruction:** `<image> Evaluate the Russian meme for the presence of toxicity. This involves identifying any language or imagery that is damaging, disrespectful, or offensive to individuals or groups. Based on your assessment, classify as: 'toxic' or 'not-toxic'.`

**Native (RU) Instruction:** `<image> Оцените мем на русском языке и выясните, присутствует ли токсичность. Если выражения или визуальные элементы мема унижают, обижают или причиняют вред людям или группам, классифицируйте как: 'токсичный'. Если таких признаков нет, используйте метку: 'нетоксичный'.`

**English Labels:** `toxic`, `not-toxic` | **Native Labels:** `токсичный`, `нетоксичный`
</details>

<details>
<summary><b>Violence Detection (EN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying violent content in memes.`

**Instruction:** `<image> Is there content in this meme that shows, glorifies, or promotes physical, sexual, or psychological violence toward women? Please make a determination. Classify as: 'violence' if yes, and 'not-violence' if no.`

**Labels:** `violence`, `not-violence`
</details>

<details>
<summary><b>Vulgar Content Detection (BN)</b></summary>

**System:** `You are an expert social media image analyzer specializing in identifying abusive content in memes.`

**English Instruction:** `<image> Evaluate the Bengali meme to see if vulgarity is present. Search for crude, obscene, or improper language or images. Classify as: 'vulgar' if such content is found, otherwise use 'not-vulgar'.`

**Native (BN) Instruction:** `<image> এই বাংলা মেমে কোনো অশ্লীল ভাষা বা ছবি আছে কিনা তা পরীক্ষা করুন শ্রেণীবিন্যাস করুন। মেমটি 'অশ্লীল' নাকি 'অশ্লীল নয়' তা বলুন।`

**English Labels:** `vulgar`, `not-vulgar` | **Native Labels:** `অশ্লীল`, `অশ্লীল নয়`
</details>
 
679
  ## Citation
680