Instructions to use CompBioDSA/MutBERT-Human-Ref with the Transformers library:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "CompBioDSA/MutBERT-Human-Ref",
    trust_remote_code=True,
    dtype="auto",
)
```
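Once the model is loaded, sequence embeddings are typically derived from the model's `last_hidden_state` by pooling over token positions. The README does not show the pooling step itself, so the following is a minimal mask-aware mean-pooling sketch over dummy NumPy arrays (no model download; the `[batch, seq_len, hidden]` output shape is the standard Transformers convention and is assumed here):

```python
import numpy as np

# Stand-ins for Transformers outputs (assumed shapes: [batch, seq_len, hidden]).
# In real use these would come from model(**tokenizer(seqs, return_tensors="pt")).
rng = np.random.default_rng(0)
last_hidden_state = rng.standard_normal((2, 8, 16))      # dummy hidden states
attention_mask = np.array([[1] * 8, [1] * 5 + [0] * 3])  # second sequence padded

# Mask-aware mean pooling: average only over non-padding tokens.
mask = attention_mask[:, :, None].astype(float)
embeddings = (last_hidden_state * mask).sum(axis=1) / mask.sum(axis=1)

print(embeddings.shape)  # one fixed-size vector per sequence
```

Excluding padded positions from the average keeps embeddings comparable across sequences of different lengths.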
Update README.md
README.md CHANGED

````diff
@@ -34,7 +34,7 @@ tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModel.from_pretrained(model_name, trust_remote_code=True)
 ```
 
-The default attention is flash attention("sdpa"). If you want use basic attention, you can replace it with "eager". Please refer to [here](https://huggingface.co/
+The default attention is flash attention("sdpa"). If you want use basic attention, you can replace it with "eager". Please refer to [here](https://huggingface.co/CompBioDSA/MutBERT/blob/main/modeling_mutbert.py#L438).
 
 ### Get embeddings
 
````