# SentenceTransformer based on shihab17/bangla-sentence-transformer
This is a sentence-transformers model finetuned from shihab17/bangla-sentence-transformer. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description
- Model Type: Sentence Transformer
- Base model: shihab17/bangla-sentence-transformer
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
### Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
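The Pooling module above builds the sentence embedding by mean pooling: averaging the token embeddings while counting only non-padding tokens. A minimal NumPy sketch of that operation on hypothetical tensors (not this model's actual outputs):

```python
import numpy as np

# Hypothetical token embeddings for one sentence: 4 token positions,
# 768 dimensions, with the last position being padding.
token_embeddings = np.random.rand(4, 768)
attention_mask = np.array([1, 1, 1, 0])

# Mean pooling: sum the embeddings of real tokens, divide by their count.
mask = attention_mask[:, None].astype(float)          # shape (4, 1)
sentence_embedding = (token_embeddings * mask).sum(axis=0) / mask.sum()

print(sentence_embedding.shape)  # (768,)
```

The library performs the same masked average inside the Pooling module, so padded positions never dilute the sentence vector.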
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("farhana1996/unsupervised-simcse-bangla-sbert")

# Run inference
sentences = [
    'রোহিঙ্গা অনুপ্রবেশসহ বিভিন্ন ইস্যুতে নানা টানা পোড়েনের মধ্যেই স্বরাষ্ট্রমন্ত্রী আসাদুজ্জামান খান কামাল মিয়ানমার সফরে যাচ্ছেন',
    'রোহিঙ্গা অনুপ্রবেশসহ বিভিন্ন ইস্যুতে নানা টানা পোড়েনের মধ্যেই স্বরাষ্ট্রমন্ত্রী আসাদুজ্জামান খান কামাল মিয়ানমার সফরে যাচ্ছেন',
    'আগামী এক মাসের মধ্যে এটি জনপ্রশাসন মন্ত্রণালয়ে পাঠানো হবে বলে সংশ্লিষ্ট সূত্র জানিয়েছে',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
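The same embeddings can also drive semantic search: rank a corpus against a query by cosine similarity and take the top hits. A minimal sketch, using random arrays as hypothetical stand-ins for `model.encode(...)` output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for model.encode(...) output:
# 5 corpus embeddings and 1 query embedding, 768 dims each.
corpus_embeddings = rng.normal(size=(5, 768))
query_embedding = rng.normal(size=(768,))

def normalize(x):
    """L2-normalize along the last axis."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Cosine similarity = dot product of L2-normalized vectors.
scores = normalize(corpus_embeddings) @ normalize(query_embedding)

# Indices of corpus entries, most similar first.
ranking = np.argsort(-scores)
print(ranking)
```

With the real model, `corpus_embeddings` and `query_embedding` would come from `model.encode`, and `model.similarity` computes the same cosine scores in one call.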
## Training Details

### Training Dataset

#### Unnamed Dataset

- Size: 1,000,000 training samples
- Columns: `sentence_0` and `sentence_1`
- Approximate statistics based on the first 1000 samples:

  |         | sentence_0 | sentence_1 |
  |---------|------------|------------|
  | type    | string | string |
  | details | min: 3 tokens, mean: 25.91 tokens, max: 148 tokens | min: 3 tokens, mean: 25.91 tokens, max: 148 tokens |

- Samples:

  | sentence_0 | sentence_1 |
  |------------|------------|
  | বিনোদন ডেস্ক অভিনেতা নির্মাতা জাহিদ হাসান ঈদ উপলক্ষে অভিনয় ও পরিচালনা নিয়ে ব্যস্ত সময় কাটাচ্ছেন | বিনোদন ডেস্ক অভিনেতা নির্মাতা জাহিদ হাসান ঈদ উপলক্ষে অভিনয় ও পরিচালনা নিয়ে ব্যস্ত সময় কাটাচ্ছেন |
  | আগামী এক মাসের মধ্যে এটি জনপ্রশাসন মন্ত্রণালয়ে পাঠানো হবে বলে সংশ্লিষ্ট সূত্র জানিয়েছে | আগামী এক মাসের মধ্যে এটি জনপ্রশাসন মন্ত্রণালয়ে পাঠানো হবে বলে সংশ্লিষ্ট সূত্র জানিয়েছে |
  | বিশ্ববিদ্যালয় ভারপ্রাপ্ত রেজিস্ট্রার প্রফেসর ড কামরুল হুদা বলেন, পুলিশ বিশ্ববিদ্যালয় প্রশাসনের কাছে তালিকা চাইলে বিশ্ববিদ্যালয়ের বিভিন্ন বিভাগে খোঁজ নিয়ে জনের নাম পাওয়া যায় | বিশ্ববিদ্যালয় ভারপ্রাপ্ত রেজিস্ট্রার প্রফেসর ড কামরুল হুদা বলেন, পুলিশ বিশ্ববিদ্যালয় প্রশাসনের কাছে তালিকা চাইলে বিশ্ববিদ্যালয়ের বিভিন্ন বিভাগে খোঁজ নিয়ে জনের নাম পাওয়া যায় |

- Loss: `MultipleNegativesRankingLoss` with these parameters:

  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```
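`MultipleNegativesRankingLoss` treats each (`sentence_0`, `sentence_1`) pair as anchor/positive and uses the other positives in the batch as negatives: scaled cosine similarities are scored with cross-entropy so each anchor must rank its own positive highest. A NumPy sketch of that computation on hypothetical embeddings, using the `scale: 20.0` from above:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical batch of 4 (anchor, positive) embedding pairs.
anchors = rng.normal(size=(4, 768))
positives = rng.normal(size=(4, 768))

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# sim[i, j]: scaled cosine similarity of anchor i vs positive j.
# Diagonal entries are the true pairs; off-diagonals act as in-batch negatives.
scale = 20.0
sim = scale * (normalize(anchors) @ normalize(positives).T)

# Cross-entropy with labels [0, 1, 2, 3]: each row's target is its diagonal.
log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(loss)
```

Note that with identical anchor/positive pairs (the unsupervised SimCSE setup of this dataset), the diagonal similarity quickly dominates, which is consistent with the near-zero training losses logged below.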
### Training Hyperparameters

#### Non-Default Hyperparameters

- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- num_train_epochs: 1
- fp16: True
- multi_dataset_batch_sampler: round_robin
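Assuming the Sentence Transformers v3 trainer API, these non-default values would be set roughly as follows (a config sketch, not the exact training script used; `"output"` is a placeholder path):

```python
from sentence_transformers.training_args import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output",                      # placeholder output path
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=1,
    fp16=True,
    multi_dataset_batch_sampler="round_robin",
)
```

All other arguments would keep the defaults listed under "All Hyperparameters" below.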
#### All Hyperparameters

<details><summary>Click to expand</summary>

- overwrite_output_dir: False
- do_predict: False
- eval_strategy: no
- prediction_loss_only: True
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin

</details>
### Training Logs
| Epoch | Step | Training Loss |
|---|---|---|
| 0.002 | 500 | 0.0036 |
| 0.004 | 1000 | 0.0001 |
| 0.006 | 1500 | 0.0 |
| 0.008 | 2000 | 0.0001 |
| 0.01 | 2500 | 0.0 |
| 0.012 | 3000 | 0.0 |
| 0.014 | 3500 | 0.0 |
| 0.016 | 4000 | 0.0 |
| 0.018 | 4500 | 0.0 |
| 0.02 | 5000 | 0.0001 |
| 0.022 | 5500 | 0.0 |
| 0.024 | 6000 | 0.0 |
| 0.026 | 6500 | 0.0 |
| 0.028 | 7000 | 0.0 |
| 0.03 | 7500 | 0.0 |
| 0.032 | 8000 | 0.0 |
| 0.034 | 8500 | 0.0 |
| 0.036 | 9000 | 0.0 |
| 0.038 | 9500 | 0.0 |
| 0.04 | 10000 | 0.0001 |
| 0.042 | 10500 | 0.0 |
| 0.044 | 11000 | 0.0002 |
| 0.046 | 11500 | 0.0 |
| 0.048 | 12000 | 0.0 |
| 0.05 | 12500 | 0.0 |
| 0.052 | 13000 | 0.0 |
| 0.054 | 13500 | 0.0 |
| 0.056 | 14000 | 0.0 |
| 0.058 | 14500 | 0.0006 |
| 0.06 | 15000 | 0.0 |
| 0.062 | 15500 | 0.0 |
| 0.064 | 16000 | 0.0 |
| 0.066 | 16500 | 0.0001 |
| 0.068 | 17000 | 0.0 |
| 0.07 | 17500 | 0.0 |
| 0.072 | 18000 | 0.0 |
| 0.074 | 18500 | 0.0 |
| 0.076 | 19000 | 0.0 |
| 0.078 | 19500 | 0.0 |
| 0.08 | 20000 | 0.0 |
| 0.082 | 20500 | 0.0 |
| 0.084 | 21000 | 0.0004 |
| 0.086 | 21500 | 0.0 |
| 0.088 | 22000 | 0.0 |
| 0.09 | 22500 | 0.0 |
| 0.092 | 23000 | 0.0 |
| 0.094 | 23500 | 0.0 |
| 0.096 | 24000 | 0.0001 |
| 0.098 | 24500 | 0.0 |
| 0.1 | 25000 | 0.0 |
| 0.102 | 25500 | 0.0001 |
| 0.104 | 26000 | 0.0 |
| 0.106 | 26500 | 0.0001 |
| 0.108 | 27000 | 0.0 |
| 0.11 | 27500 | 0.0 |
| 0.112 | 28000 | 0.0 |
| 0.114 | 28500 | 0.0 |
| 0.116 | 29000 | 0.0 |
| 0.118 | 29500 | 0.0007 |
| 0.12 | 30000 | 0.0 |
| 0.122 | 30500 | 0.0 |
| 0.124 | 31000 | 0.0 |
| 0.126 | 31500 | 0.0 |
| 0.128 | 32000 | 0.0 |
| 0.13 | 32500 | 0.0 |
| 0.132 | 33000 | 0.0 |
| 0.134 | 33500 | 0.0003 |
| 0.136 | 34000 | 0.0 |
| 0.138 | 34500 | 0.0001 |
| 0.14 | 35000 | 0.0 |
| 0.142 | 35500 | 0.0007 |
| 0.144 | 36000 | 0.0001 |
| 0.146 | 36500 | 0.0 |
| 0.148 | 37000 | 0.0 |
| 0.15 | 37500 | 0.0 |
| 0.152 | 38000 | 0.0 |
| 0.154 | 38500 | 0.0 |
| 0.156 | 39000 | 0.0 |
| 0.158 | 39500 | 0.0 |
| 0.16 | 40000 | 0.0 |
| 0.162 | 40500 | 0.0 |
| 0.164 | 41000 | 0.0 |
| 0.166 | 41500 | 0.0 |
| 0.168 | 42000 | 0.0005 |
| 0.17 | 42500 | 0.0 |
| 0.172 | 43000 | 0.0 |
| 0.174 | 43500 | 0.0 |
| 0.176 | 44000 | 0.0 |
| 0.178 | 44500 | 0.0 |
| 0.18 | 45000 | 0.0 |
| 0.182 | 45500 | 0.0 |
| 0.184 | 46000 | 0.0 |
| 0.186 | 46500 | 0.0 |
| 0.188 | 47000 | 0.0 |
| 0.19 | 47500 | 0.0 |
| 0.192 | 48000 | 0.0 |
| 0.194 | 48500 | 0.0 |
| 0.196 | 49000 | 0.0002 |
| 0.198 | 49500 | 0.0 |
| 0.2 | 50000 | 0.0 |
| 0.202 | 50500 | 0.0008 |
| 0.204 | 51000 | 0.0 |
| 0.206 | 51500 | 0.0 |
| 0.208 | 52000 | 0.0 |
| 0.21 | 52500 | 0.0 |
| 0.212 | 53000 | 0.0 |
| 0.214 | 53500 | 0.0 |
| 0.216 | 54000 | 0.0 |
| 0.218 | 54500 | 0.0 |
| 0.22 | 55000 | 0.0 |
| 0.222 | 55500 | 0.0 |
| 0.224 | 56000 | 0.0 |
| 0.226 | 56500 | 0.0 |
| 0.228 | 57000 | 0.0 |
| 0.23 | 57500 | 0.0 |
| 0.232 | 58000 | 0.0001 |
| 0.234 | 58500 | 0.0005 |
| 0.236 | 59000 | 0.0 |
| 0.238 | 59500 | 0.0 |
| 0.24 | 60000 | 0.0 |
| 0.242 | 60500 | 0.0 |
| 0.244 | 61000 | 0.0 |
| 0.246 | 61500 | 0.0 |
| 0.248 | 62000 | 0.0 |
| 0.25 | 62500 | 0.0 |
| 0.252 | 63000 | 0.0 |
| 0.254 | 63500 | 0.0 |
| 0.256 | 64000 | 0.0001 |
| 0.258 | 64500 | 0.0007 |
| 0.26 | 65000 | 0.0 |
| 0.262 | 65500 | 0.0 |
| 0.264 | 66000 | 0.0 |
| 0.266 | 66500 | 0.0 |
| 0.268 | 67000 | 0.0003 |
| 0.27 | 67500 | 0.0 |
| 0.272 | 68000 | 0.0 |
| 0.274 | 68500 | 0.0 |
| 0.276 | 69000 | 0.0 |
| 0.278 | 69500 | 0.0 |
| 0.28 | 70000 | 0.0 |
| 0.282 | 70500 | 0.0 |
| 0.284 | 71000 | 0.0 |
| 0.286 | 71500 | 0.0 |
| 0.288 | 72000 | 0.0 |
| 0.29 | 72500 | 0.0 |
| 0.292 | 73000 | 0.0 |
| 0.294 | 73500 | 0.0 |
| 0.296 | 74000 | 0.0004 |
| 0.298 | 74500 | 0.0 |
| 0.3 | 75000 | 0.0 |
| 0.302 | 75500 | 0.0 |
| 0.304 | 76000 | 0.0 |
| 0.306 | 76500 | 0.0 |
| 0.308 | 77000 | 0.0 |
| 0.31 | 77500 | 0.0 |
| 0.312 | 78000 | 0.0 |
| 0.314 | 78500 | 0.0 |
| 0.316 | 79000 | 0.0 |
| 0.318 | 79500 | 0.0 |
| 0.32 | 80000 | 0.0 |
| 0.322 | 80500 | 0.0 |
| 0.324 | 81000 | 0.0 |
| 0.326 | 81500 | 0.0 |
| 0.328 | 82000 | 0.0 |
| 0.33 | 82500 | 0.0 |
| 0.332 | 83000 | 0.0 |
| 0.334 | 83500 | 0.0 |
| 0.336 | 84000 | 0.0 |
| 0.338 | 84500 | 0.0 |
| 0.34 | 85000 | 0.0 |
| 0.342 | 85500 | 0.0 |
| 0.344 | 86000 | 0.0 |
| 0.346 | 86500 | 0.0 |
| 0.348 | 87000 | 0.0 |
| 0.35 | 87500 | 0.0 |
| 0.352 | 88000 | 0.0002 |
| 0.354 | 88500 | 0.0 |
| 0.356 | 89000 | 0.0 |
| 0.358 | 89500 | 0.0 |
| 0.36 | 90000 | 0.0 |
| 0.362 | 90500 | 0.0 |
| 0.364 | 91000 | 0.0 |
| 0.366 | 91500 | 0.0 |
| 0.368 | 92000 | 0.0 |
| 0.37 | 92500 | 0.0 |
| 0.372 | 93000 | 0.0 |
| 0.374 | 93500 | 0.0 |
| 0.376 | 94000 | 0.0002 |
| 0.378 | 94500 | 0.0 |
| 0.38 | 95000 | 0.0 |
| 0.382 | 95500 | 0.0 |
| 0.384 | 96000 | 0.0001 |
| 0.386 | 96500 | 0.0 |
| 0.388 | 97000 | 0.0 |
| 0.39 | 97500 | 0.0 |
| 0.392 | 98000 | 0.0 |
| 0.394 | 98500 | 0.0 |
| 0.396 | 99000 | 0.0 |
| 0.398 | 99500 | 0.0 |
| 0.4 | 100000 | 0.0006 |
| 0.402 | 100500 | 0.0 |
| 0.404 | 101000 | 0.0 |
| 0.406 | 101500 | 0.0 |
| 0.408 | 102000 | 0.0 |
| 0.41 | 102500 | 0.0 |
| 0.412 | 103000 | 0.0 |
| 0.414 | 103500 | 0.0 |
| 0.416 | 104000 | 0.0 |
| 0.418 | 104500 | 0.0 |
| 0.42 | 105000 | 0.0 |
| 0.422 | 105500 | 0.0 |
| 0.424 | 106000 | 0.0 |
| 0.426 | 106500 | 0.0 |
| 0.428 | 107000 | 0.0 |
| 0.43 | 107500 | 0.0 |
| 0.432 | 108000 | 0.0 |
| 0.434 | 108500 | 0.0 |
| 0.436 | 109000 | 0.0 |
| 0.438 | 109500 | 0.0 |
| 0.44 | 110000 | 0.0 |
| 0.442 | 110500 | 0.0 |
| 0.444 | 111000 | 0.0 |
| 0.446 | 111500 | 0.0 |
| 0.448 | 112000 | 0.0 |
| 0.45 | 112500 | 0.0 |
| 0.452 | 113000 | 0.0 |
| 0.454 | 113500 | 0.0 |
| 0.456 | 114000 | 0.0 |
| 0.458 | 114500 | 0.0 |
| 0.46 | 115000 | 0.0 |
| 0.462 | 115500 | 0.0001 |
| 0.464 | 116000 | 0.0 |
| 0.466 | 116500 | 0.0 |
| 0.468 | 117000 | 0.0 |
| 0.47 | 117500 | 0.0 |
| 0.472 | 118000 | 0.0 |
| 0.474 | 118500 | 0.0 |
| 0.476 | 119000 | 0.0 |
| 0.478 | 119500 | 0.0 |
| 0.48 | 120000 | 0.0 |
| 0.482 | 120500 | 0.0 |
| 0.484 | 121000 | 0.0 |
| 0.486 | 121500 | 0.0 |
| 0.488 | 122000 | 0.0 |
| 0.49 | 122500 | 0.0 |
| 0.492 | 123000 | 0.0 |
| 0.494 | 123500 | 0.0 |
| 0.496 | 124000 | 0.001 |
| 0.498 | 124500 | 0.0 |
| 0.5 | 125000 | 0.0 |
| 0.502 | 125500 | 0.0 |
| 0.504 | 126000 | 0.0 |
| 0.506 | 126500 | 0.0 |
| 0.508 | 127000 | 0.0 |
| 0.51 | 127500 | 0.0 |
| 0.512 | 128000 | 0.0 |
| 0.514 | 128500 | 0.0 |
| 0.516 | 129000 | 0.0 |
| 0.518 | 129500 | 0.0 |
| 0.52 | 130000 | 0.0 |
| 0.522 | 130500 | 0.0 |
| 0.524 | 131000 | 0.0 |
| 0.526 | 131500 | 0.0 |
| 0.528 | 132000 | 0.0 |
| 0.53 | 132500 | 0.0 |
| 0.532 | 133000 | 0.0 |
| 0.534 | 133500 | 0.0 |
| 0.536 | 134000 | 0.0 |
| 0.538 | 134500 | 0.0 |
| 0.54 | 135000 | 0.0 |
| 0.542 | 135500 | 0.0 |
| 0.544 | 136000 | 0.0 |
| 0.546 | 136500 | 0.0 |
| 0.548 | 137000 | 0.0 |
| 0.55 | 137500 | 0.0 |
| 0.552 | 138000 | 0.0 |
| 0.554 | 138500 | 0.0 |
| 0.556 | 139000 | 0.0 |
| 0.558 | 139500 | 0.0 |
| 0.56 | 140000 | 0.0 |
| 0.562 | 140500 | 0.0 |
| 0.564 | 141000 | 0.0 |
| 0.566 | 141500 | 0.0 |
| 0.568 | 142000 | 0.0 |
| 0.57 | 142500 | 0.0 |
| 0.572 | 143000 | 0.0 |
| 0.574 | 143500 | 0.0 |
| 0.576 | 144000 | 0.0 |
| 0.578 | 144500 | 0.0 |
| 0.58 | 145000 | 0.0 |
| 0.582 | 145500 | 0.0 |
| 0.584 | 146000 | 0.0 |
| 0.586 | 146500 | 0.0 |
| 0.588 | 147000 | 0.0 |
| 0.59 | 147500 | 0.0 |
| 0.592 | 148000 | 0.0 |
| 0.594 | 148500 | 0.0 |
| 0.596 | 149000 | 0.0 |
| 0.598 | 149500 | 0.0 |
| 0.6 | 150000 | 0.0 |
| 0.602 | 150500 | 0.0 |
| 0.604 | 151000 | 0.0 |
| 0.606 | 151500 | 0.0 |
| 0.608 | 152000 | 0.0 |
| 0.61 | 152500 | 0.0 |
| 0.612 | 153000 | 0.0 |
| 0.614 | 153500 | 0.0 |
| 0.616 | 154000 | 0.0 |
| 0.618 | 154500 | 0.0 |
| 0.62 | 155000 | 0.0 |
| 0.622 | 155500 | 0.0 |
| 0.624 | 156000 | 0.0 |
| 0.626 | 156500 | 0.0 |
| 0.628 | 157000 | 0.0 |
| 0.63 | 157500 | 0.0 |
| 0.632 | 158000 | 0.0 |
| 0.634 | 158500 | 0.0 |
| 0.636 | 159000 | 0.0 |
| 0.638 | 159500 | 0.0 |
| 0.64 | 160000 | 0.0 |
| 0.642 | 160500 | 0.0 |
| 0.644 | 161000 | 0.0 |
| 0.646 | 161500 | 0.0 |
| 0.648 | 162000 | 0.0 |
| 0.65 | 162500 | 0.0 |
| 0.652 | 163000 | 0.0 |
| 0.654 | 163500 | 0.0 |
| 0.656 | 164000 | 0.0001 |
| 0.658 | 164500 | 0.0 |
| 0.66 | 165000 | 0.0 |
| 0.662 | 165500 | 0.0 |
| 0.664 | 166000 | 0.0 |
| 0.666 | 166500 | 0.0 |
| 0.668 | 167000 | 0.0 |
| 0.67 | 167500 | 0.0 |
| 0.672 | 168000 | 0.0 |
| 0.674 | 168500 | 0.0 |
| 0.676 | 169000 | 0.0 |
| 0.678 | 169500 | 0.0 |
| 0.68 | 170000 | 0.0 |
| 0.682 | 170500 | 0.0 |
| 0.684 | 171000 | 0.0 |
| 0.686 | 171500 | 0.0 |
| 0.688 | 172000 | 0.0 |
| 0.69 | 172500 | 0.0 |
| 0.692 | 173000 | 0.0 |
| 0.694 | 173500 | 0.0 |
| 0.696 | 174000 | 0.0 |
| 0.698 | 174500 | 0.0 |
| 0.7 | 175000 | 0.0 |
| 0.702 | 175500 | 0.0 |
| 0.704 | 176000 | 0.0 |
| 0.706 | 176500 | 0.0 |
| 0.708 | 177000 | 0.0 |
| 0.71 | 177500 | 0.0 |
| 0.712 | 178000 | 0.0 |
| 0.714 | 178500 | 0.0 |
| 0.716 | 179000 | 0.0 |
| 0.718 | 179500 | 0.0 |
| 0.72 | 180000 | 0.0 |
| 0.722 | 180500 | 0.0 |
| 0.724 | 181000 | 0.0 |
| 0.726 | 181500 | 0.0 |
| 0.728 | 182000 | 0.0007 |
| 0.73 | 182500 | 0.0 |
| 0.732 | 183000 | 0.0 |
| 0.734 | 183500 | 0.0 |
| 0.736 | 184000 | 0.0 |
| 0.738 | 184500 | 0.0 |
| 0.74 | 185000 | 0.0 |
| 0.742 | 185500 | 0.0 |
| 0.744 | 186000 | 0.0 |
| 0.746 | 186500 | 0.0 |
| 0.748 | 187000 | 0.0 |
| 0.75 | 187500 | 0.0 |
| 0.752 | 188000 | 0.0 |
| 0.754 | 188500 | 0.0 |
| 0.756 | 189000 | 0.0 |
| 0.758 | 189500 | 0.0 |
| 0.76 | 190000 | 0.0 |
| 0.762 | 190500 | 0.0 |
| 0.764 | 191000 | 0.0 |
| 0.766 | 191500 | 0.0 |
| 0.768 | 192000 | 0.0 |
| 0.77 | 192500 | 0.0 |
| 0.772 | 193000 | 0.0 |
| 0.774 | 193500 | 0.0 |
| 0.776 | 194000 | 0.0 |
| 0.778 | 194500 | 0.0 |
| 0.78 | 195000 | 0.0 |
| 0.782 | 195500 | 0.0 |
| 0.784 | 196000 | 0.0007 |
| 0.786 | 196500 | 0.0 |
| 0.788 | 197000 | 0.0 |
| 0.79 | 197500 | 0.0 |
| 0.792 | 198000 | 0.0 |
| 0.794 | 198500 | 0.0 |
| 0.796 | 199000 | 0.0 |
| 0.798 | 199500 | 0.0 |
| 0.8 | 200000 | 0.0 |
| 0.802 | 200500 | 0.0 |
| 0.804 | 201000 | 0.0 |
| 0.806 | 201500 | 0.0 |
| 0.808 | 202000 | 0.0 |
| 0.81 | 202500 | 0.0 |
| 0.812 | 203000 | 0.0 |
| 0.814 | 203500 | 0.0 |
| 0.816 | 204000 | 0.0 |
| 0.818 | 204500 | 0.0 |
| 0.82 | 205000 | 0.0 |
| 0.822 | 205500 | 0.0 |
| 0.824 | 206000 | 0.0 |
| 0.826 | 206500 | 0.0 |
| 0.828 | 207000 | 0.0 |
| 0.83 | 207500 | 0.0 |
| 0.832 | 208000 | 0.0 |
| 0.834 | 208500 | 0.0 |
| 0.836 | 209000 | 0.0 |
| 0.838 | 209500 | 0.0 |
| 0.84 | 210000 | 0.0 |
| 0.842 | 210500 | 0.0 |
| 0.844 | 211000 | 0.0 |
| 0.846 | 211500 | 0.0 |
| 0.848 | 212000 | 0.0 |
| 0.85 | 212500 | 0.0 |
| 0.852 | 213000 | 0.0 |
| 0.854 | 213500 | 0.0 |
| 0.856 | 214000 | 0.0 |
| 0.858 | 214500 | 0.0 |
| 0.86 | 215000 | 0.0 |
| 0.862 | 215500 | 0.0 |
| 0.864 | 216000 | 0.0 |
| 0.866 | 216500 | 0.0 |
| 0.868 | 217000 | 0.0 |
| 0.87 | 217500 | 0.0 |
| 0.872 | 218000 | 0.0 |
| 0.874 | 218500 | 0.0 |
| 0.876 | 219000 | 0.0 |
| 0.878 | 219500 | 0.0 |
| 0.88 | 220000 | 0.0001 |
| 0.882 | 220500 | 0.0006 |
| 0.884 | 221000 | 0.0 |
| 0.886 | 221500 | 0.0 |
| 0.888 | 222000 | 0.0 |
| 0.89 | 222500 | 0.0 |
| 0.892 | 223000 | 0.0 |
| 0.894 | 223500 | 0.0 |
| 0.896 | 224000 | 0.0 |
| 0.898 | 224500 | 0.0 |
| 0.9 | 225000 | 0.0 |
| 0.902 | 225500 | 0.0 |
| 0.904 | 226000 | 0.0 |
| 0.906 | 226500 | 0.0 |
| 0.908 | 227000 | 0.0 |
| 0.91 | 227500 | 0.0 |
| 0.912 | 228000 | 0.0 |
| 0.914 | 228500 | 0.0 |
| 0.916 | 229000 | 0.0 |
| 0.918 | 229500 | 0.0 |
| 0.92 | 230000 | 0.0 |
| 0.922 | 230500 | 0.0 |
| 0.924 | 231000 | 0.0 |
| 0.926 | 231500 | 0.0 |
| 0.928 | 232000 | 0.0 |
| 0.93 | 232500 | 0.0 |
| 0.932 | 233000 | 0.0 |
| 0.934 | 233500 | 0.0 |
| 0.936 | 234000 | 0.0 |
| 0.938 | 234500 | 0.0 |
| 0.94 | 235000 | 0.0 |
| 0.942 | 235500 | 0.0 |
| 0.944 | 236000 | 0.0 |
| 0.946 | 236500 | 0.0 |
| 0.948 | 237000 | 0.0 |
| 0.95 | 237500 | 0.0 |
| 0.952 | 238000 | 0.0 |
| 0.954 | 238500 | 0.0 |
| 0.956 | 239000 | 0.0 |
| 0.958 | 239500 | 0.0 |
| 0.96 | 240000 | 0.0 |
| 0.962 | 240500 | 0.0 |
| 0.964 | 241000 | 0.0 |
| 0.966 | 241500 | 0.0 |
| 0.968 | 242000 | 0.0 |
| 0.97 | 242500 | 0.0 |
| 0.972 | 243000 | 0.0 |
| 0.974 | 243500 | 0.0 |
| 0.976 | 244000 | 0.0 |
| 0.978 | 244500 | 0.0 |
| 0.98 | 245000 | 0.0 |
| 0.982 | 245500 | 0.0 |
| 0.984 | 246000 | 0.0 |
| 0.986 | 246500 | 0.0 |
| 0.988 | 247000 | 0.0 |
| 0.99 | 247500 | 0.0 |
| 0.992 | 248000 | 0.0 |
| 0.994 | 248500 | 0.0 |
| 0.996 | 249000 | 0.0 |
| 0.998 | 249500 | 0.0 |
| 1.0 | 250000 | 0.0 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.4.1
- Transformers: 4.48.2
- PyTorch: 2.4.1+cu121
- Accelerate: 0.34.2
- Datasets: 3.0.1
- Tokenizers: 0.21.0