This is a fine-tuned BERT-based language model for classifying NLP-related research papers according to concepts included in the [NLP taxonomy](#nlp-taxonomy).
It is a multi-label classifier that can predict concepts from all levels of the NLP taxonomy.
If the model identifies a lower-level concept, it has learned to predict both that concept and its hypernyms in the NLP taxonomy.
The model is fine-tuned on a weakly labeled dataset of 178,521 scientific papers from the ACL Anthology, the arXiv cs.CL category, and Scopus.
Prior to fine-tuning, the model is initialized with weights from [allenai/specter2_base](https://huggingface.co/allenai/specter2_base).

📄 Paper: [Exploring the Landscape of Natural Language Processing Research (RANLP 2023)](https://aclanthology.org/2023.ranlp-1.111)
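Multi-label classification of this kind is typically implemented by applying an independent sigmoid to each concept's logit and keeping every concept whose probability clears a threshold, so several taxonomy concepts (e.g. a lower-level concept together with its hypernyms) can be predicted for one paper. The sketch below illustrates the idea with made-up concept names, logits, and a 0.5 threshold; it is not the model's actual inference code.

```python
import math

def predict_concepts(logits, concept_names, threshold=0.5):
    """Multi-label decision: independent sigmoid per concept, then threshold."""
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [name for name, p in zip(concept_names, probs) if p >= threshold]

# Hypothetical concepts and logits for one paper
concepts = ["Machine Translation", "Multilinguality", "Ethics in NLP"]
print(predict_concepts([2.3, 0.8, -1.7], concepts))
# → ['Machine Translation', 'Multilinguality']
```

Unlike a softmax classifier, the sigmoid scores are independent, so zero, one, or many concepts can be returned for the same input.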