---
license: mit
language:
- en
- zh
pretty_name: B2NERD
---
# B2NER
We present B2NERD, a cohesive and efficient dataset refined from 54 existing English and Chinese datasets, designed to improve LLMs' generalization on the challenging Open NER task.
Our B2NER models, trained on B2NERD, outperform GPT-4 by 6.8-12.0 F1 points and surpass previous methods in 3 out-of-domain benchmarks across 15 datasets and 6 languages.
- 📖 Paper: [Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition](http://arxiv.org/abs/2406.11192)
- 🎮 GitHub Repo: https://github.com/UmeanNever/B2NER
- 📀 Data: You can download the data from here (the `B2NERD_data.zip` in the "Files and versions" tab). See the Data section below for more information.
- 💾 Model (LoRA Adapters): See [7B model](https://huggingface.co/Umean/B2NER-Internlm2.5-7B-LoRA) and [20B model](https://huggingface.co/Umean/B2NER-Internlm2-20B-LoRA). You may refer to the GitHub repo for quick demo usage; a minimal loading sketch is also shown below.
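Since the released weights are LoRA adapters, they need to be attached to a base model at load time. Below is a minimal sketch using the standard `transformers` + `peft` APIs; the base-model ID and generation settings are assumptions, so check the adapter's model card and the GitHub repo for the authors' exact base checkpoint and prompt format.

```python
# Minimal loading sketch, assuming the standard transformers + peft APIs.
# The base-model ID below is an assumption; see the adapter's model card and
# the GitHub repo for the authors' exact base checkpoint and prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "internlm/internlm2_5-7b-chat"        # assumed base model
adapter_id = "Umean/B2NER-Internlm2.5-7B-LoRA"  # LoRA adapter from this page

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base_id, trust_remote_code=True)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA weights
model.eval()

# Build an instruction following the repo's prompt format (placeholder here).
prompt = "..."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```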
**Feature Highlights:**
- Curated dataset (B2NERD) refined from the largest bilingual NER dataset collection to date for training Open NER models.
- Achieves SoTA OOD NER performance across multiple benchmarks with lightweight LoRA adapters (≤50 MB).
- Uses a simple natural-language prompt format, achieving 4x faster inference than the previous SoTA, which relies on complex prompts.
- Easy integration with other IE tasks by adopting UIE-style instructions.
- Provides a universal entity taxonomy that guides the definition and label naming of new entities.
- We have open-sourced our data, code, and models, and provided easy-to-follow usage instructions.
| Model | Avg. F1 on OOD English datasets | Avg. F1 on OOD Chinese datasets | Avg. F1 on OOD multilingual dataset |
|-------|---------------------------------|---------------------------------|-------------------------------------|
| Previous SoTA | 69.1 | 42.7 | 36.6 |
| GPT | 60.1 | 54.7 | 31.8 |
| B2NER | **72.1** | **61.3** | **43.3** |
See our [GitHub Repo](https://github.com/UmeanNever/B2NER) for more information on data usage and this work.
# Data
One of the paper's core contributions is the construction of the B2NERD dataset. It is a cohesive and efficient collection refined from 54 English and Chinese datasets, designed for Open NER model training. **The preprocessed test datasets (7 for Chinese NER and 7 for English NER) used for Open NER OOD evaluation in our paper are also included in the released dataset** to facilitate convenient evaluation for future research. See the tables below for our train/test splits and dataset statistics.
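For reference, results on such test sets are typically scored with entity-level micro-F1 over exact span-and-type matches. Below is a minimal, generic sketch of that metric; it illustrates the standard formulation and is not necessarily the exact evaluation script used in the paper.

```python
# Generic entity-level micro-F1 over exact (start, end, type) matches.
# A standard NER scoring formulation; not necessarily the paper's exact script.
def entity_micro_f1(gold_docs, pred_docs):
    """Each element is a list of (start, end, type) tuples for one sentence."""
    tp = fp = fn = 0
    for gold, pred in zip(gold_docs, pred_docs):
        gold_set, pred_set = set(gold), set(pred)
        tp += len(gold_set & pred_set)   # exact matches
        fp += len(pred_set - gold_set)   # spurious predictions
        fn += len(gold_set - pred_set)   # missed entities
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Tiny usage example (offsets match the example sentence shown later below):
gold = [[(0, 5, "person"), (16, 27, "date or period")]]
pred = [[(0, 5, "person")]]
print(entity_micro_f1(gold, pred))  # 0.667 (precision 1.0, recall 0.5)
```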
We provide 3 versions of our dataset.
- `B2NERD` (**Recommended**): Contains ~52k samples from 54 Chinese or English datasets. This is the final version of our dataset, suitable for out-of-domain / zero-shot NER model training. It features standardized entity definitions and pruned, diverse training data, while also including separate unpruned test data.
- `B2NERD_all`: Contains ~1.4M samples from 54 datasets. The full-data version of our dataset, suitable for in-domain supervised evaluation. It has standardized entity definitions but does not undergo any data selection or pruning.
- `B2NERD_raw`: The raw collected datasets with raw entity labels. It has gone through basic format preprocessing but no further standardization.
Example Data Format:
```json
[
  {
    "sentence": "Barak announced 2 weeks ago that he would call for early elections .",
    "entities": [
      {
        "name": "Barak",
        "type": "person",
        "pos": [0, 5]
      },
      {
        "name": "2 weeks ago",
        "type": "date or period",
        "pos": [16, 27]
      }
    ]
  }
]
```
You can download the data from here (the B2NERD_data.zip in the "Files and versions" tab).
Please ensure that you have the proper licenses to access the raw datasets in our collection.
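Judging from the example above, `pos` holds `[start, end)` character offsets into `sentence` (e.g., `"Barak"` spans characters 0-5). Below is a minimal sketch for loading one of the unzipped JSON files and sanity-checking these offsets; the file path is a placeholder for wherever you extract `B2NERD_data.zip`.

```python
import json

# Placeholder path: point this at any JSON file inside the unzipped B2NERD_data.zip.
path = "B2NERD/train/some_dataset.json"

with open(path, encoding="utf-8") as f:
    samples = json.load(f)

for sample in samples:
    sentence = sample["sentence"]
    for entity in sample["entities"]:
        start, end = entity["pos"]  # [start, end) character offsets
        # The surface form should match the corresponding slice of the sentence.
        assert sentence[start:end] == entity["name"]
        print(f"{entity['name']!r} -> {entity['type']}")
```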
Below are the dataset statistics and source datasets for the `B2NERD` dataset.
| Split | Lang. | Datasets | Types | Num | Raw Num |
|-------|-------|----------|-------|-----|---------|
| Train | En | 19 | 119 | 25,403 | 838,648 |
| | Zh | 21 | 222 | 26,504 | 580,513 |
| | Total | 40 | 341 | 51,907 | 1,419,161 |
| Test | En | 7 | 85 | - | 6,466 |
| | Zh | 7 | 60 | - | 14,257 |
| | Total | 14 | 145 | - | 20,723 |
<img src="https://cdn-uploads.huggingface.co/production/uploads/655c6b1abfb531437a54c0e6/NIQWzYvwRxbMVgJf1KDzL.png" width="1000"/>
<img src="https://cdn-uploads.huggingface.co/production/uploads/655c6b1abfb531437a54c0e6/9UuY9EuA7R5PvasddMObQ.png" width="1000"/>
More information can be found in our paper.
# Cite
```bibtex
@inproceedings{yang-etal-2025-beyond,
title = "Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition",
author = "Yang, Yuming and
Zhao, Wantong and
Huang, Caishuang and
Ye, Junjie and
Wang, Xiao and
Zheng, Huiyuan and
Nan, Yang and
Wang, Yuran and
Xu, Xueying and
Huang, Kaixin and
Zhang, Yunke and
Gui, Tao and
Zhang, Qi and
Huang, Xuanjing",
editor = "Rambow, Owen and
Wanner, Leo and
Apidianaki, Marianna and
Al-Khalifa, Hend and
Eugenio, Barbara Di and
Schockaert, Steven",
booktitle = "Proceedings of the 31st International Conference on Computational Linguistics",
month = jan,
year = "2025",
address = "Abu Dhabi, UAE",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2025.coling-main.725/",
pages = "10902--10923"
}
``` |