---
language: en
datasets:
- FoodBase
license: mit
---
# FoodBaseBERT

## Model description

**FoodBaseBERT** is a fine-tuned BERT model that is ready to use for **Named Entity Recognition** of food entities. It has been trained to recognize a single entity type: food (FOOD).

Specifically, this model is a *bert-base-cased* model that was fine-tuned on the [FoodBase NER](https://academic.oup.com/database/article/doi/10.1093/database/baz121/5611291) dataset.

If you'd like to use a larger BERT-large model fine-tuned on the same dataset, a [**bert-large-NER**](https://huggingface.co/dslim/bert-large-NER/) version is also available.

## Intended uses

#### How to use

You can use this model with the Transformers *pipeline* for NER.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
from transformers import pipeline

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Dizex/FoodBaseBERT")
model = AutoModelForTokenClassification.from_pretrained("Dizex/FoodBaseBERT")

# Build a token-classification pipeline and run it on a sample sentence
pipe = pipeline("ner", model=model, tokenizer=tokenizer)
example = "Today's meal: Fresh olive poké bowl topped with chia seeds. Very delicious!"

ner_entity_results = pipe(example)
print(ner_entity_results)
```
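
Note that the pipeline returns one record per sub-word token (dictionaries with `entity`, `score`, `word`, `start`, and `end` keys), so a multi-word food phrase such as "olive poké bowl" may come back split across several entries. Below is a minimal sketch of merging adjacent predictions back into character spans; the sample records are illustrative of the output shape, not actual model output:

```python
def merge_food_spans(tokens, text):
    """Merge consecutive predicted tokens whose character spans
    touch or overlap, then read the phrases back out of the text."""
    spans = []
    for tok in tokens:
        # Extend the previous span if this token starts right after it
        if spans and tok["start"] <= spans[-1][1] + 1:
            spans[-1] = (spans[-1][0], tok["end"])
        else:
            spans.append((tok["start"], tok["end"]))
    return [text[start:end] for start, end in spans]

text = "Fresh olive poké bowl topped with chia seeds."
# Illustrative records in the shape the NER pipeline returns
tokens = [
    {"word": "olive", "start": 6, "end": 11},
    {"word": "poké", "start": 12, "end": 16},
    {"word": "bowl", "start": 17, "end": 21},
    {"word": "chia", "start": 34, "end": 38},
    {"word": "seeds", "start": 39, "end": 44},
]
print(merge_food_spans(tokens, text))
# → ['olive poké bowl', 'chia seeds']
```

Alternatively, Transformers can do this grouping for you: passing `aggregation_strategy="simple"` to `pipeline(...)` returns merged entity groups directly.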