Update README.md
README.md
<b>50% lighter</b> than a typical monolingual BERT model. It is ideal when memory consumption and execution speed are critical while maintaining high-quality results.
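As a back-of-the-envelope sketch of what "lighter" means for the weight footprint (the parameter counts below are illustrative assumptions, not official figures for this model):

```python
def approx_weight_memory_mb(num_params: int, bytes_per_param: int = 4) -> float:
    """Rough size of a model's weights in MiB (fp32 by default)."""
    return num_params * bytes_per_param / (1024 ** 2)

# Illustrative, assumed parameter counts (not this model's official numbers):
# a BERT-base-style encoder has roughly 110M parameters, while a model that
# is ~50% lighter would carry roughly half that.
full_mb = approx_weight_memory_mb(110_000_000)   # ~420 MiB in fp32
light_mb = approx_weight_memory_mb(55_000_000)   # ~210 MiB in fp32
```

Halving the parameter count roughly halves both the download size and the RAM needed to hold the weights, which is where the memory and speed benefits come from.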
<h3>AILC CLiC-IT 2023 Proceedings</h3>
The paper "Blaze-IT: a lightweight BERT model for the Italian language" was accepted at AILC CLiC-IT 2023 and published in the conference proceedings.
You can find the proceedings here: https://clic2023.ilc.cnr.it/proceedings/
And the published paper here: https://clic2023.ilc.cnr.it/wp-content/uploads/2023/11/paper46.pdf
<h3>Model description</h3>
The model builds on the multilingual <b>DistilBERT</b> <b>[2]</b> model (from the HuggingFace team: [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased)) as a starting point,