---
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dense
  - generated_from_trainer
  - dataset_size:148
  - loss:CosineSimilarityLoss
base_model: sentence-transformers/all-MiniLM-L6-v2
widget:
  - source_sentence: I live in a very bad country, I wish I live in another country
    sentences:
      - >-
        O believers! Patiently endure, persevere, stand on guard, and be mindful
        of Allah, so you may be successful.
      - >-
        But perhaps you hate a thing and it is good for you; and perhaps you
        love a thing and it is bad for you. And Allah knows, while you know not
      - Do not do a favour expecting more ˹in return˺.
  - source_sentence: My mother just died, I feel so sad
    sentences:
      - >-
        And never think that Allah is unaware of what the wrongdoers do. He only
        delays them for a Day when eyes will stare [in horror]
      - >-
        Every soul will taste death. And you will only receive your full reward
        on the Day of Judgment. Whoever is spared from the Fire and is admitted
        into Paradise will ˹indeed˺ triumph, whereas the life of this world is
        no more than the delusion of enjoyment.
      - >-
        Every soul will taste death. And you will only receive your full reward
        on the Day of Judgment. Whoever is spared from the Fire and is admitted
        into Paradise will ˹indeed˺ triumph, whereas the life of this world is
        no more than the delusion of enjoyment.
  - source_sentence: I ask for guidance
    sentences:
      - We have sent you ˹O Prophet˺ only as a mercy for the whole world.
      - >-
        Or ˹a soul will˺ say, "If only Allah had guided me, I would have
        certainly been one of the righteous."
      - >-
        And live with them in kindness. For if you dislike them - perhaps you
        dislike a thing and Allah makes therein much good
  - source_sentence: 'I feel bad for gaza people '
    sentences:
      - >-
        And We will surely test you with something of fear and hunger and a loss
        of wealth and lives and fruits, but give good tidings to the patient
      - We have sent you ˹O Prophet˺ only as a mercy for the whole world.
      - >-
        And be patient, [O Muhammad], for the decision of your Lord, for indeed,
        you are in Our eyes. And exalt [Allah] with praise of your Lord when you
        arise
  - source_sentence: can quran cure me
    sentences:
      - >-
        O humanity! Indeed, there has come to you a warning from your Lord, a
        cure for what is in the hearts, a guide, and a mercy for the believers.
      - >-
        Those who believe and do good, for them will be bliss and an honourable
        destination. 
      - >-
        Not equal are the good deed and the bad deed. Repel [evil] by that
        [deed] which is better; and thereupon the one whom between you and him
        is enmity [will become] as though he was a devoted friend
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---

# SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L6-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 256 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity

### Model Sources

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
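The Pooling and Normalize modules above can be illustrated with a small NumPy sketch (dummy token embeddings, not real model outputs): mean pooling averages the token vectors that the attention mask marks as real, and normalization rescales the result to unit length, so that dot products between sentence embeddings equal cosine similarities.

```python
import numpy as np

# Dummy "token embeddings" for one sentence: 4 tokens x 384 dims,
# with an attention mask marking the last token as padding.
rng = np.random.default_rng(0)
token_embeddings = rng.normal(size=(4, 384))
attention_mask = np.array([1, 1, 1, 0])

# (1) Pooling with pooling_mode_mean_tokens: average only the real tokens.
mask = attention_mask[:, None].astype(float)
sentence_embedding = (token_embeddings * mask).sum(axis=0) / mask.sum()

# (2) Normalize: rescale to unit L2 norm.
sentence_embedding /= np.linalg.norm(sentence_embedding)

print(sentence_embedding.shape)  # (384,)
```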

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'can quran cure me',
    'O humanity! Indeed, there has come to you a warning from your Lord, a cure for what is in the hearts, a guide, and a mercy for the believers.',
    'Not equal are the good deed and the bad deed. Repel [evil] by that [deed] which is better; and thereupon the one whom between you and him is enmity [will become] as though he was a devoted friend',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.9580, 0.9128],
#         [0.9580, 1.0000, 0.9162],
#         [0.9128, 0.9162, 1.0000]])
```
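Because the Normalize module makes every embedding unit-length, ranking candidate verses against a query reduces to a dot product. The sketch below uses random stand-in vectors purely to show the retrieval pattern; in practice `corpus` and `query` would come from `model.encode(...)`.

```python
import numpy as np

# Stand-in unit-normalized embeddings (random; normally from model.encode).
rng = np.random.default_rng(42)
corpus = rng.normal(size=(5, 384))
corpus /= np.linalg.norm(corpus, axis=1, keepdims=True)
query = rng.normal(size=(384,))
query /= np.linalg.norm(query)

# With unit-length vectors, cosine similarity is a plain dot product.
scores = corpus @ query

# Indices of candidates from most to least similar to the query.
ranking = np.argsort(-scores)
print(ranking)
```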

## Training Details

### Training Dataset

#### Unnamed Dataset

  • Size: 148 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 148 samples:

    |         | sentence_0 | sentence_1 | label |
    |:--------|:-----------|:-----------|:------|
    | type    | string | string | int |
    | details | min: 5 tokens, mean: 12.09 tokens, max: 30 tokens | min: 14 tokens, mean: 38.55 tokens, max: 121 tokens | -1: ~8.78%, 1: ~91.22% |
  • Samples:

    | sentence_0 | sentence_1 | label |
    |:-----------|:-----------|:------|
    | I have done a lot of bad things, I wanna return to allahm but I fear allah will not forgive me | Say, ˹O Prophet, that Allah says,˺ O My servants who have exceeded the limits against their souls! Do not lose hope in Allah's mercy, for Allah certainly forgives all sins. He is indeed the All-Forgiving, Most Merciful. | 1 |
    | how to act in arguments | And when the ignorant address them, they say words of peace | 1 |
    | I failed in exams | But perhaps you hate a thing and it is good for you; and perhaps you love a thing and it is bad for you. And Allah knows, while you know not | 1 |
  • Loss: CosineSimilarityLoss with these parameters:

    ```json
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
    ```
    
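CosineSimilarityLoss with an MSELoss `loss_fct` computes the cosine similarity of each (sentence_0, sentence_1) embedding pair and penalizes its squared distance from the label (here 1 for related pairs, -1 for unrelated ones). A minimal NumPy sketch of that objective, on toy vectors rather than real model embeddings:

```python
import numpy as np

def cosine_similarity_loss(emb_a, emb_b, labels):
    """MSE between pairwise cosine similarity and the target label,
    mirroring CosineSimilarityLoss with loss_fct = MSELoss."""
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    cos = (a * b).sum(axis=1)          # per-pair cosine similarity
    return np.mean((cos - labels) ** 2)

# Toy batch: one matching pair (label 1), one opposed pair (label -1).
a = np.array([[1.0, 0.0], [0.0, 1.0]])
b = np.array([[1.0, 0.0], [0.0, -1.0]])
labels = np.array([1.0, -1.0])
print(cosine_similarity_loss(a, b, labels))  # 0.0 — cosines match the labels exactly
```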

### Training Hyperparameters

#### Non-Default Hyperparameters

  • num_train_epochs: 20
  • multi_dataset_batch_sampler: round_robin

#### All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 8
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 20
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • parallelism_config: None
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • project: huggingface
  • trackio_space_id: trackio
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: no
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: True
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

## Framework Versions

  • Python: 3.12.7
  • Sentence Transformers: 5.1.1
  • Transformers: 4.57.1
  • PyTorch: 2.5.1
  • Accelerate: 1.11.0
  • Datasets: 4.3.0
  • Tokenizers: 0.22.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```