Commit f376cf2 (verified) by dragonkue, parent: d21e609

Update README.md

Files changed (1): README.md (+7 -1)
README.md CHANGED

@@ -15,6 +15,12 @@ language:
 
 <img src="https://cdn-uploads.huggingface.co/production/uploads/642b0c2fecec03b4464a1d9b/IxcqY5qbGNuGpqDciIcOI.webp" width="600">
 
+# ✨ New Version Available
+
+We've released a new and improved version of this model!
+
+[dragonkue/multilingual-e5-small-ko-v2](https://huggingface.co/dragonkue/multilingual-e5-small-ko-v2)
+
 # SentenceTransformer based on intfloat/multilingual-e5-small
 
 This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) on datasets that include Korean query-passage pairs for improved performance on Korean retrieval tasks. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

@@ -261,7 +267,7 @@ pip install -U sentence-transformers
 - `adam_beta2`: 0.999
 - `adam_epsilon`: 1e-08
 - `max_grad_norm`: 1.0
-- `num_train_epochs`: 2
+- `num_train_epochs`: 3
 - `max_steps`: -1
 - `lr_scheduler_type`: linear
 - `lr_scheduler_kwargs`: {}