---
license: mit
base_model: microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext
model-index:
- name: pre-pubmedbert-base-embedding
  results: []
datasets:
- WhereIsAI/medical-triples
- WhereIsAI/pubmedqa-test-angle-format-a
language:
- en
---

# WhereIsAI/pubmed-angle-base-en

This model is an example model for the Chinese blog post [title](#) and the [AnglE tutorial](https://angle.readthedocs.io/en/latest/notes/tutorial.html#tutorial).

It was fine-tuned with the [AnglE loss](https://arxiv.org/abs/2309.12871) using the official [angle-emb](https://github.com/SeanLee97/AnglE) library.

**1. Training Setup:**

- Base model: [microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext)
- Training data: [WhereIsAI/medical-triples](https://huggingface.co/datasets/WhereIsAI/medical-triples), processed from [PubMedQA](https://huggingface.co/datasets/qiaojin/PubMedQA)
- Test data: [WhereIsAI/pubmedqa-test-angle-format-a](https://huggingface.co/datasets/WhereIsAI/pubmedqa-test-angle-format-a)

**2. Performance:**

| Model                                  | Pooling Strategy | Spearman's Correlation |
|----------------------------------------|------------------|:----------------------:|
| tavakolih/all-MiniLM-L6-v2-pubmed-full | avg              | 84.56                  |
| NeuML/pubmedbert-base-embeddings       | avg              | 84.88                  |
| **WhereIsAI/pubmed-angle-base-en**     | cls              | 86.01                  |
| WhereIsAI/pubmed-angle-large-en        | cls              | 86.21                  |
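Spearman's correlation measures how well the ranking induced by the model's cosine similarities agrees with the ranking of the gold labels. A minimal sketch of the metric using `scipy.stats.spearmanr`; the `gold` labels and `pred` scores below are hypothetical placeholders, not the actual test data:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical gold relevance labels and model cosine similarities
gold = np.array([1.0, 0.0, 1.0, 0.5])
pred = np.array([0.80, 0.42, 0.75, 0.55])

# Spearman's rho compares the two rankings, ignoring absolute scale
rho, _ = spearmanr(gold, pred)
print(round(rho * 100, 2))
```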

**3. Citation**

Cite AnglE following 👉 https://huggingface.co/WhereIsAI/pubmed-angle-base-en#citation

## Usage

### via angle-emb

```bash
python -m pip install -U angle-emb
```

Example:

```python
from angle_emb import AnglE
from angle_emb.utils import cosine_similarity

angle = AnglE.from_pretrained('WhereIsAI/pubmed-angle-base-en', pooling_strategy='cls').cuda()

query = 'How to treat childhood obesity and overweight?'
docs = [
    query,
    'The child is overweight. Parents should relieve their children\'s symptoms through physical activity and healthy eating. First, they can let them do some aerobic exercise, such as jogging, climbing, swimming, etc. In terms of diet, children should eat more cucumbers, carrots, spinach, etc. Parents should also discourage their children from eating fried foods and dried fruits, which are high in calories and fat. Parents should not let their children lie in bed without moving after eating. If their children\'s condition is serious during the treatment of childhood obesity, parents should go to the hospital for treatment under the guidance of a doctor in a timely manner.',
    'If you want to treat tonsillitis better, you can choose some anti-inflammatory drugs under the guidance of a doctor, or use local drugs, such as washing the tonsil crypts, injecting drugs into the tonsils, etc. If your child has a sore throat, you can also give him or her some pain relievers. If your child has a fever, you can give him or her antipyretics. If the condition is serious, seek medical attention as soon as possible. If the medication does not have a good effect and the symptoms recur, the author suggests surgical treatment. Parents should also make sure to keep their children warm to prevent them from catching a cold and getting tonsillitis again.',
]

embeddings = angle.encode(docs)
query_emb = embeddings[0]

# Compare the query against each document
for doc, emb in zip(docs[1:], embeddings[1:]):
    print(cosine_similarity(query_emb, emb))
# 0.8029839020052982
# 0.4260630076818197
```

### via sentence-transformers

Install `sentence-transformers`:

```bash
python -m pip install -U sentence-transformers
```

## Citation

If you use this model in academic work, please cite the AnglE paper:

```bibtex
@article{li2023angle,
  title={AnglE-optimized Text Embeddings},
  author={Li, Xianming and Li, Jing},
  journal={arXiv preprint arXiv:2309.12871},
  year={2023}
}
```