How to use jinaai/jina-bert-implementation with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("jinaai/jina-bert-implementation", dtype="auto")
```
Use attention dropout during training (#10)
by Markus28 - opened
No description provided.
Can you move dropout_p to the constructor?
You mean that instead of using self.dropout.p we use something like self.dropout_p, set in the constructor? We will still need to check self.training in the forward pass.
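A minimal sketch of the shape being discussed, assuming a generic PyTorch-style attention module (the class, tensor shapes, and names below are illustrative, not the actual jina-bert code): `dropout_p` is stored as a plain constructor attribute instead of an `nn.Dropout` submodule, and `F.dropout` is passed `self.training` explicitly so dropout only fires during training.

```python
import torch
import torch.nn.functional as F


class Attention(torch.nn.Module):
    """Illustrative sketch: dropout_p is a constructor argument,
    not an nn.Dropout submodule."""

    def __init__(self, dropout_p: float = 0.1):
        super().__init__()
        self.dropout_p = dropout_p  # stored directly; no self.dropout module

    def forward(self, q, k, v):
        scores = q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5)
        attn = scores.softmax(dim=-1)
        # F.dropout is gated on self.training: it is a no-op in eval mode
        attn = F.dropout(attn, p=self.dropout_p, training=self.training)
        return attn @ v


q = k = v = torch.ones(1, 4, 8)
layer = Attention(dropout_p=0.5)

layer.eval()                # inference: dropout disabled
out_eval = layer(q, k, v)   # uniform attention over identical values returns v

layer.train()               # training: dropout active with p=0.5
out_train = layer(q, k, v)
```

The functional form avoids carrying an `nn.Dropout` child module just to hold a probability, while the `training=self.training` argument keeps the train/eval behaviour identical to what `nn.Dropout` would do.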
Markus28 changed pull request status to open
Markus28 changed pull request status to merged