SimCSE finetuned with a longer context size for LongEmbed tasks.
Model tree for Decycle/simcse_longembed
- Base model: princeton-nlp/unsup-simcse-roberta-base
```python
# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Decycle/simcse_longembed")
model = AutoModel.from_pretrained("Decycle/simcse_longembed")
```
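Once the model and tokenizer are loaded, sentence embeddings are typically obtained from the encoder's hidden states and compared with cosine similarity. The sketch below illustrates SimCSE-style pooling (taking the `[CLS]`, i.e. first-token, vector) on a mock hidden-state array; the array shapes and values stand in for `model(**inputs).last_hidden_state` and are illustrative assumptions, not documented behaviour of this checkpoint:

```python
import numpy as np

# Mock "last_hidden_state" of shape (batch, seq_len, hidden) -- a stand-in
# for the tensor returned by model(**inputs) in the snippet above.
hidden = np.arange(2 * 4 * 3, dtype=float).reshape(2, 4, 3)

# SimCSE-style pooling: keep the first-token ([CLS]) vector per sentence.
cls_embeddings = hidden[:, 0, :]  # shape (2, 3)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# A vector is perfectly similar to itself.
print(round(cosine_similarity(cls_embeddings[0], cls_embeddings[0]), 3))  # → 1.0
```

With the real model, `cls_embeddings` would come from a forward pass over tokenized sentences, and higher cosine similarity indicates more semantically similar sentences.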