---
tags:
- adapterhub:sts/mrpc
- text-classification
- adapter-transformers
- roberta
license: "apache-2.0"
---

# Adapter `roberta-large-mrpc_houlsby` for roberta-large

An MRPC adapter (with a classification head) trained using the `run_glue.py` script, extended to retain the best checkpoint out of 30 epochs.

**This adapter was created for use with the [Adapters](https://github.com/Adapter-Hub/adapters) library.**

## Usage

First, install `adapters`:

```bash
pip install -U adapters
```

Now, the adapter can be loaded and activated like this:

```python
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("roberta-large")

# Download the adapter (and its classification head) from the Hub, then activate it
adapter_name = model.load_adapter("AdapterHub/roberta-large-mrpc_houlsby")
model.set_active_adapters(adapter_name)
```
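
With the adapter active, the model can be queried like any `transformers` sequence classification model. Below is a minimal inference sketch; the example sentences and the label interpretation (1 = equivalent, 0 = not equivalent, following the usual MRPC convention) are illustrative assumptions, not part of this card.

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-large")

# A hypothetical sentence pair to check for paraphrase equivalence
inputs = tokenizer(
    "The company reported strong quarterly earnings.",
    "Quarterly earnings at the company were strong.",
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits

prediction = logits.argmax(dim=-1).item()
print(prediction)  # 1 = equivalent, 0 = not equivalent (assumed MRPC label mapping)
```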

## Architecture & Training

- Adapter architecture: houlsby (see the training sketch below)
- Prediction head: classification
- Dataset: [MRPC](https://www.microsoft.com/en-us/download/details.aspx?id=52398)
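
This checkpoint was produced with the `run_glue.py` script as noted above. For orientation, here is a rough sketch of how an adapter with the same setup could be added and trained using the `adapters` library; the adapter/head name `"mrpc"` and the `"houlsby"` config identifier are assumptions for illustration, not the exact recipe behind this checkpoint.

```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("roberta-large")

# Add a Houlsby-style bottleneck adapter plus a 2-class head for MRPC (hypothetical names)
model.add_adapter("mrpc", config="houlsby")
model.add_classification_head("mrpc", num_labels=2)

# Freeze the base model weights and train only the adapter and head
model.train_adapter("mrpc")

# The model can now be passed to a regular training loop or Trainer
```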

## Author Information

- Author name(s): Andreas Rücklé
- Author email: rueckle@ukp.informatik.tu-darmstadt
- Author links: [Website](http://rueckle.net), [GitHub](https://github.com/arueckle), [Twitter](https://twitter.com/@arueckle)

## Citation

```bibtex
@article{pfeiffer2020AdapterHub,
  title={AdapterHub: A Framework for Adapting Transformers},
  author={Jonas Pfeiffer and
          Andreas R\"uckl\'{e} and
          Clifton Poth and
          Aishwarya Kamath and
          Ivan Vuli\'{c} and
          Sebastian Ruder and
          Kyunghyun Cho and
          Iryna Gurevych},
  journal={ArXiv},
  year={2020}
}
```

*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/roberta-large-mrpc_houlsby.yaml.*