Adapters · English · bert

aps6992 committed
Commit babfa9f · 1 Parent(s): 88ab94f

Update README.md

Files changed (1)
  1. README.md +15 -15
README.md CHANGED
@@ -6,9 +6,9 @@ datasets:
  - allenai/scirepeval
  ---
 
- # Adapter `allenai/spp_classification` for allenai/specter_plus_plus
 
- An [adapter](https://adapterhub.ml) for the `allenai/specter_plus_plus` model that was trained on the [allenai/scirepeval](https://huggingface.co/datasets/allenai/scirepeval/) dataset.
 
  This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
 
@@ -26,8 +26,8 @@ Now, the adapter can be loaded and activated like this:
  ```python
  from transformers import AutoAdapterModel
 
- model = AutoAdapterModel.from_pretrained("allenai/specter_plus_plus")
- adapter_name = model.load_adapter("allenai/spp_classification", source="hf", set_active=True)
  ```
 
  ---
@@ -41,7 +41,7 @@ language:
 
  <!-- Provide a quick summary of what the model is/does. -->
 
- SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/spp).
  Given the combination of title and abstract of a scientific paper or a short textual query, the model can be used to generate effective embeddings to be used in downstream applications.
 
  # Model Details
@@ -86,23 +86,23 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 
  |Model|Type|Name and HF link|
  |--|--|--|
- |Base|Transformer|[allenai/specter_plus_plus](https://huggingface.co/allenai/specter_plus_plus)|
- |Classification|Adapter|[allenai/spp_classification](https://huggingface.co/allenai/spp_classification)|
- |Regression|Adapter|[allenai/spp_regression](https://huggingface.co/allenai/spp_regression)|
- |Retrieval|Adapter|[allenai/spp_proximity](https://huggingface.co/allenai/spp_proximity)|
- |Adhoc Query|Adapter|[allenai/spp_adhoc_query](https://huggingface.co/allenai/spp_adhoc_query)|
 
 
  ```python
  from transformers import AutoTokenizer, AutoModel
 
  # load model and tokenizer
- tokenizer = AutoTokenizer.from_pretrained('allenai/specter_plus_plus')
 
  # load base model
- model = AutoModel.from_pretrained('allenai/specter_plus_plus')
 
  # load the adapter(s) for the required task, provide an identifier for the adapter in the load_as argument, and activate it
- model.load_adapter("allenai/spp_classification", source="hf", load_as="spp_classification", set_active=True)
 
  papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
            {'title': 'Attention is all you need', 'abstract': 'The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]
@@ -166,8 +166,8 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
  |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
  |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
  |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
- |[SPECTER 2.0-base](https://huggingface.co/allenai/specter_plus_plus)|56.3|58.0|69.2|(38.0, 32.4)|
- |[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/spp)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
  Please cite the following works if you end up using SPECTER 2.0:
 
 
  - allenai/scirepeval
  ---
 
+ # Adapter `allenai/specter2_classification` for allenai/specter2
 
+ An [adapter](https://adapterhub.ml) for the `allenai/specter2` model that was trained on the [allenai/scirepeval](https://huggingface.co/datasets/allenai/scirepeval/) dataset.
 
  This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
 
 
  ```python
  from transformers import AutoAdapterModel
 
+ model = AutoAdapterModel.from_pretrained("allenai/specter2")
+ adapter_name = model.load_adapter("allenai/specter2_classification", source="hf", set_active=True)
  ```
 
  ---
 
 
  <!-- Provide a quick summary of what the model is/does. -->
 
+ SPECTER 2.0 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task-specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter2).
  Given the combination of title and abstract of a scientific paper or a short textual query, the model can be used to generate effective embeddings to be used in downstream applications.
 
  # Model Details
 
 
  |Model|Type|Name and HF link|
  |--|--|--|
+ |Base|Transformer|[allenai/specter2](https://huggingface.co/allenai/specter2)|
+ |Classification|Adapter|[allenai/specter2_classification](https://huggingface.co/allenai/specter2_classification)|
+ |Regression|Adapter|[allenai/specter2_regression](https://huggingface.co/allenai/specter2_regression)|
+ |Retrieval|Adapter|[allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity)|
+ |Adhoc Query|Adapter|[allenai/specter2_adhoc_query](https://huggingface.co/allenai/specter2_adhoc_query)|
 
  ```python
  from transformers import AutoTokenizer, AutoModel
 
  # load model and tokenizer
+ tokenizer = AutoTokenizer.from_pretrained('allenai/specter2')
 
  # load base model
+ model = AutoModel.from_pretrained('allenai/specter2')
 
  # load the adapter(s) for the required task, provide an identifier for the adapter in the load_as argument, and activate it
+ model.load_adapter("allenai/specter2_classification", source="hf", load_as="specter2_classification", set_active=True)
 
  papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
            {'title': 'Attention is all you need', 'abstract': 'The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]
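  # --- illustrative continuation (not part of the original diff), assuming the
  # standard transformers tokenizer/model API ---
  # concatenate title and abstract with the tokenizer's separator token
  text_batch = [d['title'] + tokenizer.sep_token + (d.get('abstract') or '') for d in papers]
  inputs = tokenizer(text_batch, padding=True, truncation=True, return_tensors="pt", max_length=512)
  output = model(**inputs)
  # take the first token of each sequence as the paper embedding
  embeddings = output.last_hidden_state[:, 0, :]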
 
  |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
  |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
  |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
+ |[SPECTER 2.0-base](https://huggingface.co/allenai/specter2)|56.3|58.0|69.2|(38.0, 32.4)|
+ |[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allenai/specter2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
  Please cite the following works if you end up using SPECTER 2.0: