How to use IVN-RIN/medBIT-r3-plus with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="IVN-RIN/medBIT-r3-plus")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("IVN-RIN/medBIT-r3-plus")
model = AutoModelForMaskedLM.from_pretrained("IVN-RIN/medBIT-r3-plus")
```
🤗 + 📚🩺🇮🇹 + 📖🧑⚕️ + 🌐⚕️ = MedBIT-r3-plus
From this repository you can download the MedBIT-r3-plus (Medical Bert for ITalian) checkpoint.
MedBIT-r3-plus is built on top of BioBIT and further pretrained on a corpus of medical textbooks used in the formal education and specialized training of Italian medical doctors, either written directly by Italian authors or translated by professional human translators. This corpus amounts to 100 MB of data. Such comprehensive collections of medical concepts can improve how biomedical knowledge is encoded in language models, with the advantage of being natively available in Italian rather than translated. Online healthcare information dissemination is another source of biomedical text that is commonly available in many less-resourced languages, so we also gathered an additional 100 MB of web-crawled data from reliable Italian health-related websites. More details in the paper.
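The further pretraining on this corpus follows the standard BERT-style masked-language-modeling objective. The sketch below is our own illustration of that masking scheme (the token lists and helper name are hypothetical, not from this repository): roughly 15% of positions become prediction targets, of which 80% are replaced by `[MASK]`, 10% by a random token, and 10% left unchanged.

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mlm_prob=0.15, seed=0):
    """BERT-style masking: select ~mlm_prob of positions as prediction
    targets; 80% of those become [MASK], 10% a random vocabulary token,
    10% stay unchanged. Labels keep the original token for targets."""
    rng = random.Random(seed)
    vocab = ["paziente", "terapia", "diagnosi", "cuore", "febbre"]  # toy vocabulary
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mlm_prob:
            labels.append(tok)  # model must predict the original token here
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)  # position ignored in the MLM loss
            masked.append(tok)
    return masked, labels
```

In the actual Transformers training stack this logic is handled by the library's MLM data collator; the function above only makes the objective explicit.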
MedBIT-r3-plus has been evaluated on 3 downstream tasks: NER (Named Entity Recognition), extractive QA (Question Answering), and RE (Relation Extraction). The results are summarized below:
- NER:
- BC2GM = 81.87%
- BC4CHEMD = 80.68%
- BC5CDR(CDR) = 81.97%
- BC5CDR(DNER) = 76.32%
- NCBI_DISEASE = 63.36%
- SPECIES-800 = 63.90%
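NER scores of this kind are conventionally entity-level micro-averaged F1 percentages, where a predicted entity counts as correct only if its span and type exactly match a gold annotation. A minimal sketch of that computation (the function and the example spans are our own illustration, not taken from the evaluation code):

```python
def micro_f1(gold, pred):
    """Entity-level micro F1: an entity is a true positive only if its
    (start, end, type) triple matches a gold annotation exactly."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative spans: one prediction has the wrong entity type.
gold = {(0, 2, "DISEASE"), (5, 6, "CHEMICAL"), (9, 11, "DISEASE")}
pred = {(0, 2, "DISEASE"), (5, 6, "DISEASE"), (9, 11, "DISEASE")}
print(round(micro_f1(gold, pred) * 100, 2))  # → 66.67
```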
- QA:
- RE:
Check the full paper for further details, and feel free to contact us if you have any questions!