Instructions for using ffgcc/InfoCSE-bert-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ffgcc/InfoCSE-bert-base with Transformers:
```python
# Load model directly
# Note: BertForCL is not shipped with the transformers package itself; it is
# defined in the InfoCSE (SimCSE-style) training codebase, so this import
# assumes that code is available on your import path.
from transformers import AutoTokenizer, BertForCL

tokenizer = AutoTokenizer.from_pretrained("ffgcc/InfoCSE-bert-base")
model = BertForCL.from_pretrained("ffgcc/InfoCSE-bert-base")
```
- Notebooks
- Google Colab
- Kaggle
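InfoCSE is a contrastively trained sentence-embedding model, so a typical downstream use is to pool the encoder output into one vector per sentence and compare sentences by cosine similarity. Below is a minimal sketch of that comparison step; it uses random NumPy vectors as stand-ins for real pooled embeddings (the 768-dim size matches bert-base, but the pooling choice and these vectors are illustrative assumptions, not outputs of this model):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for pooled sentence embeddings (bert-base hidden size is 768).
# In practice these would come from the model's pooled output for each
# tokenized sentence, e.g. the [CLS] representation.
rng = np.random.default_rng(0)
emb_a = rng.standard_normal(768)
emb_b = rng.standard_normal(768)

print(cosine_similarity(emb_a, emb_a))  # identical vectors -> 1.0 (up to float rounding)
print(cosine_similarity(emb_a, emb_b))  # some value in [-1.0, 1.0]
```

Scores close to 1.0 indicate near-identical sentence meaning; unrelated sentences typically score much lower.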
Evaluation results for ffgcc/InfoCSE-bert-base model as a base model for other tasks
#1
by eladven - opened
As part of a research effort to identify high-quality models on the Hugging Face Hub that can serve as base models for further fine-tuning, we evaluated this model by fine-tuning it on 36 datasets. The model ranks 1st among all tested models of the bert-base-uncased architecture as of 21/12/2022.
To share this information with others in your model card, please add the following evaluation results to your README.md page.
For more information please see https://ibm.github.io/model-recycling/ or contact me.
Best regards,
Elad Venezian
eladv@il.ibm.com
IBM Research AI
ffgcc changed pull request status to merged