---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:217
  - loss:CosineSimilarityLoss
base_model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
widget:
  - source_sentence: my teacher died last year, I miss him
    sentences:
      - Every soul will taste death, then to Us you will ˹all˺ be returned.
      - >-
        And live with them in kindness. For if you dislike them - perhaps you
        dislike a thing and Allah makes therein much good
      - >-
        And live with them in kindness. For if you dislike them perhaps you
        dislike a thing and Allah makes therein much good
  - source_sentence: I am sad, I saw kids in Gaza dying
    sentences:
      - >-
        And never think that Allah is unaware of what the wrongdoers do. He only
        delays them for a Day when eyes will stare [in horror]
      - >-
        Bid your people to pray, and be diligent in ˹observing˺ it. We do not
        ask you to provide. It is We Who provide for you. And the ultimate
        outcome is ˹only˺ for ˹the people of˺ righteousness.
      - >-
        And We will surely test you with something of fear and hunger and a loss
        of wealth and lives and fruits, but give good tidings to the patient
  - source_sentence: is prayer mandatory? I want my brother and my family to pray every day
    sentences:
      - >-
        Every soul will taste death. And you will only receive your full reward
        on the Day of Judgment. Whoever is spared from the Fire and is admitted
        into Paradise will ˹indeed˺ triumph, whereas the life of this world is
        no more than the delusion of enjoyment.
      - >-
        And seek help through patience and prayer. Indeed, it is a burden except
        for the humble
      - >-
        Bid your people to pray, and be diligent in ˹observing˺ it. We do not
        ask you to provide. It is We Who provide for you. And the ultimate
        outcome is ˹only˺ for ˹the people of˺ righteousness.
  - source_sentence: I feel hopeless
    sentences:
      - And when I am ill, it is He who cures me
      - >-
        And seek help through patience and prayer. Indeed, it is a burden except
        for the humble
      - And when I am ill, it is He who cures me
  - source_sentence: I failed in exams
    sentences:
      - >-
        But perhaps you hate a thing and it is good for you; and perhaps you
        love a thing and it is bad for you. And Allah knows, while you know not
      - >-
        O humanity! Indeed, there has come to you a warning from your Lord, a
        cure for what is in the hearts, a guide, and a mercy for the believers.
      - >-
        Give good news to those who patiently endure who say, when struck by a
        disaster, "Surely to Allah we belong and to Him we will ˹all˺ return."
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# Quran_embed_V2.2: SentenceTransformer based on sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2

This is a sentence-transformers model finetuned from sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

### Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
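The `Pooling` module above performs mean pooling (`pooling_mode_mean_tokens: True`): the transformer's token embeddings are averaged, with the attention mask zeroing out padding positions so they do not skew the average. A minimal NumPy sketch of that step on dummy arrays (shapes and values are illustrative, not real model outputs):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) tokens.

    token_embeddings: (batch, seq_len, dim)
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, :, None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                    # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                    # avoid division by zero
    return summed / counts

# Two dummy sequences; the second has a padding token at position 2
emb = np.ones((2, 3, 384))
emb[1, 2, :] = 99.0          # padding position: must be ignored by the mask
mask = np.array([[1, 1, 1], [1, 1, 0]])
pooled = mean_pool(emb, mask)
print(pooled.shape)  # (2, 384)
```

Because the padded position is masked out, both pooled vectors here are all ones, matching the mean of the real tokens only.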

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'I failed in exams',
    'But perhaps you hate a thing and it is good for you; and perhaps you love a thing and it is bad for you. And Allah knows, while you know not',
    'O humanity! Indeed, there has come to you a warning from your Lord, a cure for what is in the hearts, a guide, and a mercy for the believers.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.8458, 0.7432],
#         [0.8458, 1.0000, 0.7996],
#         [0.7432, 0.7996, 1.0000]])
```
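`model.similarity` defaults to cosine similarity on the embeddings. For semantic search, e.g. retrieving the best-matching verses for a user query, the same scores can be computed and ranked directly; a minimal NumPy sketch, with random vectors standing in for `model.encode` outputs:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between rows of a (queries) and rows of b (corpus)."""
    a_norm = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_norm = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a_norm @ b_norm.T

rng = np.random.default_rng(0)
query_emb = rng.normal(size=(1, 384))    # stand-in for model.encode([query])
corpus_emb = rng.normal(size=(5, 384))   # stand-in for model.encode(verses)

scores = cosine_similarity(query_emb, corpus_emb)   # shape (1, 5)
top_k = np.argsort(-scores[0])[:3]                  # indices of the 3 best-matching verses
print(top_k)
```

With real embeddings, indexing the verse list by `top_k` yields the verses most semantically similar to the query.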

## Training Details

### Training Dataset

#### Unnamed Dataset

  • Size: 217 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 217 samples:

    |         | sentence_0                                        | sentence_1                                         | label                          |
    |:--------|:--------------------------------------------------|:---------------------------------------------------|:-------------------------------|
    | type    | string                                            | string                                             | float                          |
    | details | min: 5 tokens, mean: 11.92 tokens, max: 34 tokens | min: 3 tokens, mean: 37.53 tokens, max: 128 tokens | min: 0.0, mean: 0.88, max: 1.0 |

  • Samples:

    | sentence_0 | sentence_1 | label |
    |:-----------|:-----------|:------|
    | how to avoid fights, and spread love? | Repel evil with that which is better, and then the one whom there is enmity between you and him will become as though he was a close friend | 1.0 |
    | I can not provide for my family | Bid your people to pray, and be diligent in ˹observing˺ it. We do not ask you to provide. It is We Who provide for you. And the ultimate outcome is ˹only˺ for ˹the people of˺ righteousness. | 0.0 |
    | is Allah testing me or torturing me, it is really difficult | And We will surely test you with something of fear and hunger and a loss of wealth and lives and fruits, but give good tidings to the patient | 1.0 |
  • Loss: CosineSimilarityLoss with these parameters:

    ```json
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
    ```
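`CosineSimilarityLoss` with an `MSELoss` `loss_fct` trains the model so that the cosine similarity of each `(sentence_0, sentence_1)` embedding pair matches its gold `label`. A minimal NumPy sketch of that objective on dummy embeddings (real training uses the torch implementation in `sentence_transformers.losses`):

```python
import numpy as np

def cosine_sim(u: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Row-wise cosine similarity between paired embeddings u[i] and v[i]."""
    return (u * v).sum(axis=1) / (np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1))

def cosine_similarity_mse_loss(u: np.ndarray, v: np.ndarray, labels: np.ndarray) -> float:
    """MSE between predicted cosine similarity and the gold label in [0, 1]."""
    pred = cosine_sim(u, v)
    return float(np.mean((pred - labels) ** 2))

# Dummy batch: two (sentence_0, sentence_1) pairs with labels 1.0 and 0.0
u = np.array([[1.0, 0.0], [1.0, 0.0]])
v = np.array([[1.0, 0.0], [0.0, 1.0]])   # identical pair, then orthogonal pair
labels = np.array([1.0, 0.0])
print(cosine_similarity_mse_loss(u, v, labels))  # 0.0: predictions match labels exactly
```

Gradient descent on this loss pushes similar pairs (label 1.0) together and dissimilar pairs (label 0.0) apart in the embedding space.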

### Training Hyperparameters

#### Non-Default Hyperparameters

  • num_train_epochs: 10
  • multi_dataset_batch_sampler: round_robin

#### All Hyperparameters
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

### Framework Versions

  • Python: 3.12.4
  • Sentence Transformers: 5.1.2
  • Transformers: 4.57.1
  • PyTorch: 2.8.0+cu126
  • Accelerate: 1.11.0
  • Datasets: 4.4.1
  • Tokenizers: 0.22.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```