Annotation Layer: SitEnt

This model is part of GePaDeU, which equips parliamentary debates of the German Bundestag with rich semantic and pragmatic information across multiple annotation layers.

parl-german-sitent is trained to classify a given sequence into one or more Situation Entity (SitEnt) types. SitEnts capture clause-level aspect, covering eventualities (States, Events, Reports) and general statives (Generic and Generalizing sentences). In addition, our classification scheme includes two speech acts (Questions, Imperatives). Refer to our paper for further details on the linguistic properties of this annotation layer.
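To make the label inventory concrete, here are a few invented German clauses with plausible SitEnt types. These examples and their assignments are illustrative only; they are not taken from the annotated corpus or the paper:

```python
# Invented example clauses (NOT from the corpus), one per SitEnt type.
examples = [
    ("Die Lage ist ernst.", "State"),                         # specific property
    ("Die Ministerin hat den Entwurf vorgestellt.", "Event"), # specific happening
    ("Sie sagte, der Haushalt sei solide.", "Report"),        # attributed speech
    ("Parlamente kontrollieren die Regierung.", "Generic"),   # statement about a kind
    ("Die Ministerin reist häufig ins Ausland.", "Generalizing"),  # habitual pattern
    ("Warum handeln wir nicht?", "Question"),                 # speech act
    ("Stimmen Sie dem Antrag zu!", "Imperative"),             # speech act
]

sitent_types = {"State", "Event", "Report", "Generic",
                "Generalizing", "Question", "Imperative"}
assert all(label in sitent_types for _, label in examples)
```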


πŸ” Model Overview

  • Task Type: Multi-label, multi-class sequence classification
  • Base Model: GBERT large
  • Fine-tuning method: full fine-tuning
  • Language: German
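Multi-label classification of this kind is typically optimized with a per-label binary cross-entropy (in PyTorch, `BCEWithLogitsLoss`): each of the seven SitEnt types gets an independent sigmoid score. A minimal NumPy sketch of that objective; the logit and gold-label values are invented for illustration and not taken from our training setup:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def multi_label_bce(logits, targets):
    """Mean binary cross-entropy over all labels (one sigmoid per label)."""
    p = sigmoid(logits)
    return float(np.mean(-(targets * np.log(p) + (1 - targets) * np.log(1 - p))))

logits = np.array([2.0, -1.5, 0.3, -3.0, 0.0, -2.0, -1.0])  # one score per SitEnt type
targets = np.array([1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0])     # gold multi-hot vector

loss = multi_label_bce(logits, targets)
print(round(loss, 4))
```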

πŸ“š Dataset

The model was trained and evaluated on 250 manually annotated parliamentary speeches from the German Bundestag, spanning 2017 to 2021 and comprising 19,676 SitEnt instances.

Data Splits

  • Train: 13,125 instances
  • Dev: 1,785 instances
  • Test: 4,766 instances

πŸ‹οΈ Model Training


πŸ“Š Evaluation


πŸš€ How to Use

Please refer to our GitHub repo for detailed instructions on the required input format and how to run the model (available on the Hugging Face Hub as schlenker/parl-german-sitent).
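As a rough sketch of the decoding step: because the task is multi-label, each clause receives one sigmoid probability per SitEnt type, and every label above a threshold is predicted. In practice the logits come from running the model (e.g. via `transformers`' `AutoModelForSequenceClassification` on schlenker/parl-german-sitent); the label order, the 0.5 threshold, and the logit values below are assumptions for illustration, not taken from our repo:

```python
import numpy as np

# Assumed label order -- check the model's config for the actual mapping.
LABELS = ["State", "Event", "Report", "Generic",
          "Generalizing", "Question", "Imperative"]

def decode(logits, threshold=0.5):
    """Return every label whose sigmoid probability exceeds the threshold."""
    probs = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
    return [label for label, p in zip(LABELS, probs) if p > threshold]

# Invented logits: sigmoid(3.0) ~ 0.95 and sigmoid(0.5) ~ 0.62 pass the threshold.
predicted = decode([3.0, -2.0, -3.0, 0.5, -1.0, -4.0, -2.5])
print(predicted)
```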


⚠️ Limitations
