---
language:
  - hu
license: mit
tags:
  - sentiment-analysis
  - xlm-roberta
  - hungarian
  - text-classification
datasets:
  - custom
metrics:
  - accuracy
  - f1
pipeline_tag: text-classification
---

# Sentiment

Fine-tuned [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) for **Hungarian sentiment classification**.

## Model Details

- **Base model**: `xlm-roberta-base`
- **Task**: 3-class sentiment classification (negative / neutral / positive)
- **Language**: Hungarian
- **Training data**: ~37K sentences (stratified split from ~46K total; see the split sketch below)
- **Class weighting**: Balanced weights applied during training to handle class imbalance
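
The stratified split can be reproduced along these lines. This is a minimal sketch only: the dataset file and the `text`/`label` column names are assumptions, since the data is custom and not published.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical file and column names; the actual dataset is custom.
df = pd.read_csv("sentiment_hu.csv")

# Stratifying on the label column keeps the negative/neutral/positive
# proportions identical across splits (~37K train of ~46K total ≈ 80/20).
train_df, eval_df = train_test_split(
    df, test_size=0.2, stratify=df["label"], random_state=42
)
```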

## Labels

| Label ID | Label | Description |
|----------|-------|-------------|
| 0 | negative | Negative sentiment |
| 1 | neutral | Neutral sentiment |
| 2 | positive | Positive sentiment |
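
These IDs come from the model config's `id2label` mapping; a quick check, assuming the config matches the table above:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("ringorsolya/Sentiment")
print(config.id2label)
# Expected, per the table above:
# {0: 'negative', 1: 'neutral', 2: 'positive'}
```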

## Overall Results

| Metric | Value |
|--------|-------|
| Accuracy | 0.8442 |
| F1 (macro) | 0.8387 |
| F1 (weighted) | 0.8436 |

## Per-Language Results

| Language | Samples | Accuracy | F1 (macro) | F1 (weighted) |
|----------|---------|----------|------------|---------------|
| hun | 4603 | 0.8442 | 0.8387 | 0.8436 |
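
These are standard scikit-learn metrics; a sketch of how they can be recomputed from predictions. The tiny `y_true`/`y_pred` arrays here are placeholders, not the actual evaluation set.

```python
from sklearn.metrics import accuracy_score, f1_score

# Placeholder predictions; in practice these are the 4603 evaluation samples.
y_true = [0, 1, 2, 2, 0]
y_pred = [0, 1, 2, 1, 0]

accuracy = accuracy_score(y_true, y_pred)
f1_macro = f1_score(y_true, y_pred, average="macro")        # unweighted mean over the 3 classes
f1_weighted = f1_score(y_true, y_pred, average="weighted")  # weighted by class support
```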


## Usage

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="ringorsolya/Sentiment")

classifier("Ez egy fantasztikus nap!")
# [{'label': 'positive', 'score': 0.95}]

classifier("Szörnyű volt a kiszolgálás.")
# [{'label': 'negative', 'score': 0.92}]
```
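
For batched inference or access to the raw class probabilities, the model can also be loaded directly. A sketch using the standard `transformers` API, truncating to 128 tokens to match the training setup below:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ringorsolya/Sentiment")
model = AutoModelForSequenceClassification.from_pretrained("ringorsolya/Sentiment")
model.eval()

texts = ["Ez egy fantasztikus nap!", "Szörnyű volt a kiszolgálás."]
inputs = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    probs = torch.softmax(model(**inputs).logits, dim=-1)

for text, p in zip(texts, probs):
    label_id = int(p.argmax())
    print(text, "->", model.config.id2label[label_id], float(p[label_id]))
```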

## Training Details

- **Epochs**: 5
- **Batch size**: 32
- **Learning rate**: 2e-05
- **Weight decay**: 0.01
- **Warmup ratio**: 0.1
- **Max sequence length**: 128
- **FP16**: True
- **Class weights**: [0.8114, 1.1219, 1.1413] (see the loss sketch below)
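
The training code is not published, so the exact mechanism is an assumption, but a common way to apply such class weights is a custom `Trainer` with a weighted cross-entropy loss. A minimal sketch:

```python
import torch
from torch import nn
from transformers import Trainer

# Class weights from this card, assumed to be in label-ID order
# (0 = negative, 1 = neutral, 2 = positive).
CLASS_WEIGHTS = torch.tensor([0.8114, 1.1219, 1.1413])

class WeightedLossTrainer(Trainer):
    """Trainer that swaps the default loss for a class-weighted cross-entropy."""

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        labels = inputs.pop("labels")
        outputs = model(**inputs)
        loss_fct = nn.CrossEntropyLoss(weight=CLASS_WEIGHTS.to(outputs.logits.device))
        loss = loss_fct(
            outputs.logits.view(-1, model.config.num_labels), labels.view(-1)
        )
        return (loss, outputs) if return_outputs else loss
```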