Update README.md
README.md (changed):

````diff
@@ -69,9 +69,9 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 - **Shared by :** Allen AI
 - **Model type:** bert-base-uncased + adapters
 - **License:** Apache 2.0
-- **Finetuned from model
+- **Finetuned from model:** [allenai/scibert](https://huggingface.co/allenai/scibert_scivocab_uncased).
 
-## Model Sources
+## Model Sources
 
 <!-- Provide the basic links for the model. -->
 
@@ -85,13 +85,14 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 
 ## Direct Use
 
-|Model|
+|Model|Name and HF link|Description|
 |--|--|--|
-
-
-
-
-
+|Retrieval*|[allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity)|Encode papers as queries and candidates eg. Link Prediction, Nearest Neighbor Search|
+|Adhoc Query|[allenai/specter2_adhoc_query](https://huggingface.co/allenai/specter2_adhoc_query)|Encode short raw text queries for search tasks. (Candidate papers can be encoded with proximity)|
+|Classification|[allenai/specter2_classification](https://huggingface.co/allenai/specter2_classification)|Encode papers to feed into linear classifiers as features|
+|Regression|[allenai/specter2_regression](https://huggingface.co/allenai/specter2_regression)|Encode papers to feed into linear regressors as features|
+
+*Retrieval model should suffice for downstream task types not mentioned above
 
 ```python
 from transformers import AutoTokenizer, AutoModel
@@ -118,7 +119,7 @@ output = model(**inputs)
 embeddings = output.last_hidden_state[:, 0, :]
 ```
 
-## Downstream Use
+## Downstream Use
 
 <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
 
@@ -167,7 +168,6 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
 |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
 |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
 |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[SPECTER 2.0-base](https://huggingface.co/allenai/specter2)|56.3|58.0|69.2|(38.0, 32.4)|
 |[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
 Please cite the following works if you end up using SPECTER 2.0:
````
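The Direct Use table above says the retrieval model (`allenai/specter2_proximity`) encodes papers as queries and candidates for nearest-neighbor search. As a minimal sketch of that downstream step, assuming embeddings have already been computed with the snippet above — the 3-dimensional vectors and paper names here are toy placeholders, not real SPECTER 2.0 output:

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nearest(query_emb, candidate_embs, k=2):
    # Rank candidate embeddings by cosine similarity to the query embedding
    # and return the names of the top-k candidates.
    ranked = sorted(candidate_embs.items(),
                    key=lambda item: cosine(query_emb, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Toy embeddings standing in for SPECTER 2.0 CLS vectors.
papers = {
    "paper_a": [1.0, 0.1, 0.0],
    "paper_b": [0.0, 1.0, 0.2],
    "paper_c": [0.9, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]
print(nearest(query, papers))  # → ['paper_a', 'paper_c']
```

In practice the query would be encoded with the adhoc-query adapter (or proximity, for paper-to-paper search) and the candidates with the proximity adapter, per the table; the ranking step itself is unchanged.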