Fill-Mask
Transformers
PyTorch
xlm-roberta
parliament
5roop committed on
Commit 8c7f9b2 · verified · 1 Parent(s): ab931a0

Update README.md

Files changed (1)
  1. README.md +12 -0
README.md CHANGED
@@ -62,5 +62,17 @@ The model is a result of the [ParlaMint project](https://www.clarin.eu/parlamint
 }
 ```
 
+To cite the model itself, you can use
+```latex
+@misc{xlm-r-parla,
+author = { Rik van Noord and Nikola Ljubešić and Peter Rupnik },
+title = { xlm-r-parla (Revision ab931a0) },
+year = 2023,
+url = { https://huggingface.co/classla/xlm-r-parla },
+doi = { 10.57967/hf/6706 },
+publisher = { Hugging Face }
+}
+```
+
 The first application of this model is the [XLM-R-parlasent model](https://huggingface.co/classla/xlm-r-parlasent), fine-tuned on the [ParlaSent dataset](http://hdl.handle.net/11356/1868) for the task of sentiment analysis in parliamentary proceedings.