---
tags:
- adapter-transformers
- roberta
datasets:
- BigTMiami/amazon_helpfulness
---

# Adapter `jgrc3/houlsby_adapter_classification_noPre` for roberta-base

An [adapter](https://adapterhub.ml) for the `roberta-base` model that was trained on the [BigTMiami/amazon_helpfulness](https://huggingface.co/datasets/BigTMiami/amazon_helpfulness/) dataset and includes a prediction head for classification.

This adapter was created for use with the **[Adapters](https://github.com/Adapter-Hub/adapters)** library.

## Usage

First, install `adapters`:

```
pip install -U adapters
```

Now, the adapter can be loaded and activated like this:

```python
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("roberta-base")

# Download the adapter from the Hugging Face Hub and activate it
adapter_name = model.load_adapter("jgrc3/houlsby_adapter_classification_noPre", source="hf", set_active=True)
```
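
The loaded model can then be used for inference like any other `transformers` model. A minimal sketch (the example text is made up, and the two-class helpful/unhelpful reading of the output is an assumption based on the dataset):

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")

# Tokenize a review and run it through the adapted model
inputs = tokenizer("Very thorough review, helped me decide.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The classification head returns logits; argmax gives the predicted class
predicted_class = outputs.logits.argmax(dim=-1).item()
print(predicted_class)
```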

## Architecture & Training
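
Judging by its name, this adapter uses the Houlsby bottleneck architecture (adapter modules inserted after both the attention and feed-forward sub-layers of each transformer layer), and the `noPre` suffix suggests it was trained directly on the classification task without an additional domain pre-training stage. A minimal sketch of how such an adapter is set up for training with the Adapters library follows; the adapter/head name and the two-label head are illustrative assumptions, and the actual training hyperparameters are not documented here.

```python
from adapters import AutoAdapterModel, DoubleSeqBnConfig

model = AutoAdapterModel.from_pretrained("roberta-base")

# Houlsby-style (double sequential bottleneck) adapter: bottleneck modules
# after both the attention and feed-forward sub-layers of every layer
model.add_adapter("helpfulness", config=DoubleSeqBnConfig())

# Classification head; two labels assumed for helpful / not helpful
model.add_classification_head("helpfulness", num_labels=2)

# Freeze the base model weights and train only the adapter and head
model.train_adapter("helpfulness")
```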

## Evaluation results

<!-- Add some description here -->

## Citation

<!-- Add some description here -->