BERTić-SentiComments-SR-Subjectivity

BERTić-SentiComments-SR-Subjectivity is a variant of the BERTić model, fine-tuned for the task of subjectivity detection in Serbian short texts. It distinguishes between texts that do not express a sentiment (objective ones) and those that do (subjective ones). The model was fine-tuned for 5 epochs on the SentiComments.SR dataset.
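A minimal usage sketch with the Hugging Face Transformers library might look as follows. The lazy import and the helper function are illustrative choices, and the output label names depend on the model's `id2label` configuration, so consult the model's `config.json` rather than assuming particular label strings.

```python
# Hypothetical usage sketch: loading the model via the Transformers
# text-classification pipeline.
MODEL_ID = "ICEF-NLP/bcms-bertic-senticomments-sr-subjectivity"

def make_subjectivity_classifier():
    """Build a text-classification pipeline for the model.

    The import is kept inside the function so the sketch can be read
    and inspected without transformers installed.
    """
    from transformers import pipeline
    return pipeline("text-classification", model=MODEL_ID)

if __name__ == "__main__":
    clf = make_subjectivity_classifier()
    # Example Serbian comment; returns a list of {label, score} dicts.
    print(clf("Ovaj film je fantastičan!"))
```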

Benchmarking

This model was evaluated on the task of subjectivity detection in short texts in Serbian from the SentiComments.SR dataset and compared to multilingual BERT. Different lengths of fine-tuning were considered, ranging from 1 to 5 epochs. Linear classifiers relying on bag-of-words (BOW) and/or bag-of-embeddings (BOE) features were used as baselines.

Since the dataset is imbalanced, the weighted F1 measure was used as the performance metric. Model fine-tuning and evaluation were performed using 10-fold stratified cross-validation. The code and data needed to run these experiments are available in the SentiComments.SR GitHub repository.
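The weighted F1 measure averages the per-class F1 scores weighted by class support, so each class contributes in proportion to its frequency. A minimal pure-Python sketch of the computation, using toy labels (not the actual dataset) where 1 marks subjective and 0 marks objective texts:

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Per-class F1 averaged with weights proportional to class support."""
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for cls, n in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        pred_pos = sum(1 for p in y_pred if p == cls)
        precision = tp / pred_pos if pred_pos else 0.0
        recall = tp / n
        denom = precision + recall
        f1 = 2 * precision * recall / denom if denom else 0.0
        score += (n / total) * f1
    return score

# Toy example: 1 = subjective, 0 = objective
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 1, 1, 0, 1]
print(weighted_f1(y_true, y_pred))  # 0.75
```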

Results

| Model | Weighted F1 |
|-------|-------------|
| Baseline - Linear classifier with BOW features | 0.871 |
| Baseline - Linear classifier with BOE features | 0.873 |
| Baseline - Linear classifier with BOW+BOE features | 0.885 |
| Multilingual BERT, 1 epoch | 0.865 |
| BERTić-SentiComments-SR-Subjectivity, 1 epoch | 0.913 |
| Multilingual BERT, 3 epochs | 0.882 |
| BERTić-SentiComments-SR-Subjectivity, 3 epochs | 0.919 |
| Multilingual BERT, 5 epochs | 0.885 |
| BERTić-SentiComments-SR-Subjectivity, 5 epochs | 0.919 |

References

If you wish to use this model in your paper or project, please cite the following papers:

Model size: 0.1B parameters (tensor type F32, Safetensors format)

Model tree for ICEF-NLP/bcms-bertic-senticomments-sr-subjectivity
