Instructions to use ncbi/MedCPT-Article-Encoder with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ncbi/MedCPT-Article-Encoder with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="ncbi/MedCPT-Article-Encoder")

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("ncbi/MedCPT-Article-Encoder")
model = AutoModel.from_pretrained("ncbi/MedCPT-Article-Encoder")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
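To show what the loading snippet above produces end to end, here is a minimal sketch that embeds a small batch of `[title, abstract]` pairs. The article texts are placeholders, and pooling the `[CLS]` token into a 768-dimensional vector follows the usual MedCPT usage pattern; verify the details against the model card.

```python
# Sketch: embed [title, abstract] pairs with the MedCPT article encoder.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("ncbi/MedCPT-Article-Encoder")
model = AutoModel.from_pretrained("ncbi/MedCPT-Article-Encoder")

# Each article is a (title, abstract) pair; these texts are placeholders.
articles = [
    ["Title of article one", "Abstract of article one."],
    ["Title of article two", "Abstract of article two."],
    ["Title of article three", "Abstract of article three."],
]

with torch.no_grad():
    encoded = tokenizer(
        articles, truncation=True, padding=True, return_tensors="pt", max_length=512
    )
    # Use the [CLS] token representation as the article embedding.
    embeds = model(**encoded).last_hidden_state[:, 0, :]

print(embeds.shape)  # torch.Size([3, 768])
```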
These embeddings are also in the same space as those generated by the MedCPT query encoder.
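Because the two encoders share one embedding space, a query vector can score articles directly by dot product. The following sketch assumes the companion query encoder is published as `ncbi/MedCPT-Query-Encoder` and uses the same `[CLS]` pooling as above; the query and article texts are placeholders.

```python
# Sketch: rank articles against a query in the shared MedCPT embedding space.
import torch
from transformers import AutoTokenizer, AutoModel

q_tok = AutoTokenizer.from_pretrained("ncbi/MedCPT-Query-Encoder")
q_model = AutoModel.from_pretrained("ncbi/MedCPT-Query-Encoder")
a_tok = AutoTokenizer.from_pretrained("ncbi/MedCPT-Article-Encoder")
a_model = AutoModel.from_pretrained("ncbi/MedCPT-Article-Encoder")

query = "treatment of type 2 diabetes"  # placeholder query
articles = [
    ["Placeholder title A", "Placeholder abstract A."],
    ["Placeholder title B", "Placeholder abstract B."],
]

with torch.no_grad():
    q = q_model(**q_tok([query], truncation=True, padding=True,
                        return_tensors="pt", max_length=64)).last_hidden_state[:, 0, :]
    a = a_model(**a_tok(articles, truncation=True, padding=True,
                        return_tensors="pt", max_length=512)).last_hidden_state[:, 0, :]

# Dot products in the shared space act as relevance scores.
scores = (q @ a.T).squeeze(0)
print(scores.shape)  # torch.Size([2])
```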
# Case 2. Use the Pre-computed Embeddings

We have provided the embeddings of all PubMed articles generated by the MedCPT article encoder at https://ftp.ncbi.nlm.nih.gov/pub/lu/MedCPT/pubmed_embeddings/.
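Once a shard of the pre-computed embeddings is downloaded, searching it reduces to a matrix-vector product. This is a minimal offline sketch: the random array stands in for a real shard (e.g. loaded via `np.load(..., mmap_mode="r")` — check the FTP directory listing for the actual file names), and the query vector stands in for one produced by the query encoder.

```python
# Sketch: top-k search over one shard of pre-computed article embeddings.
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for a downloaded shard; real shards are 768-d float arrays.
article_embeds = rng.standard_normal((1000, 768)).astype(np.float32)
# Stand-in for a query embedding from the MedCPT query encoder.
query_embed = rng.standard_normal(768).astype(np.float32)

scores = article_embeds @ query_embed  # dot-product relevance scores
top_k = np.argsort(-scores)[:10]       # indices of the 10 highest-scoring articles
print(top_k.shape)  # (10,)
```

Memory-mapping the shard (`mmap_mode="r"`) keeps the full PubMed collection from being loaded into RAM at once.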
# Acknowledgments

This work was supported by the Intramural Research Programs of the National Institutes of Health, National Library of Medicine.