SentenceTransformer

This is a sentence-transformers model trained on 11,180 scored sentence pairs. It maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
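The three modules compose straightforwardly: the Transformer produces token embeddings, Pooling averages them, and Normalize projects the result onto the unit sphere. A minimal sketch of the pooling and normalization steps on hypothetical tensors:

import torch

# Hypothetical token embeddings for one sentence: (seq_len=6, dim=1024),
# with an attention mask marking real tokens (1) vs. padding (0).
token_embeddings = torch.randn(6, 1024)
attention_mask = torch.tensor([1.0, 1.0, 1.0, 1.0, 0.0, 0.0]).unsqueeze(-1)

# Pooling (pooling_mode_mean_tokens=True): masked mean over the token axis.
summed = (token_embeddings * attention_mask).sum(dim=0)
count = attention_mask.sum(dim=0).clamp(min=1e-9)
sentence_embedding = summed / count

# Normalize(): L2-normalize, so cosine similarity reduces to a dot product.
sentence_embedding = torch.nn.functional.normalize(sentence_embedding, dim=0)
print(sentence_embedding.shape, float(sentence_embedding.norm()))  # torch.Size([1024]) ~1.0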

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Culture-and-Morality-Lab/psyembedding-e5-large-v2")
# Run inference
sentences = [
    'Nationalism is a silly cock crowing on his own dunghill.',
    'The 1970\'s saw a rise and fall of what we have come to know as "Blacksploitation" Films. The term is a reference to kind of broad catch-all, rather than a true Genre of Film. In short, any comedy, drama, adventure, western or urban cops & robbers shoot-em-up, that are so constructed and so cast as to appeal to the large Urban Black population of the Mid 20th Century. That indeed could embrace the widest type of films, as long as the had a slant toward the inner-city black population.It appears that the idea of producing these films of particularly keen interest to Black Americans had its genesis with the Eastertime Release of 100 RIFLES (Marvin Schwartz Prod./20th Century-Fox, 1969). In it, former Syracuse University All-American Footballer and Several Times All-Pro Fullback for the Cleveland Browns, Jim Brown, had a Co-Starring Billing. Having appeared in a number of films already, as for example, RIO CONCHOS (1964),THE DIRTY DOZEN (1967), (ICE STTION ZEBRA (1968)* and others, it was beginning to make more sense to the Studios\' "Suits" that Jim was a hot property.Now this 100 RIFLES brings record numbers of Black patrons to the Big Cities\' central business districts on Easter Sunday to view Mr. Brown. Why not start to film more of these adventure epics and other types of film with more Black Players and Stars? Why not, indeed.** So we saw a succession of Cops & Robbers, Bad-ass Private Detective Films, Comedies, all going the route. Along the way, we eventually got to some more family oriented, wider appealing films. The movie goers were treated to SOUNDER (1972), THE TAKE (1974), CONRACK (1974)and, ultimately, CLAUDINE (1974).In CLAUDINE, we find no stigma nor easy classification as being "Blackploitation", as the story is universal, and could easily have been done as a story about people of any descent, any where, and not just in the 1970\'s USA.That the story was done of a SINGLE mother, Claudine (Dianne Carroll), struggling to keep a family together after "....two marriages and two almost marriages.", is a far cry from a shoot-em-up Harlem Style. The problems that plague the everyday citizens of our nation are confronted and examined under the ol\' sociological microscope.But we also consider Claudine\'s psychological and physical needs as a female. For "Woman Needs Man and Man Must Have His MATE",***and we do concede this point. (That\'s S-E-X that we\'re talking about, Schultz!) Claudine meets up with a very masculine, broad shouldered, athletic type in Private Scavanger Garbage Man, Ruppert B. Marshall (James Earl Jones) and they go on a date.The Great Welfare State intervenes with the Couple as Claudine\'s Welfare Case Worker, Miss Tayback (Elisa Loti), comes snooping around to see just who is this unattached Male, who is suddenly paying so much attention to Claudine\'s family.After a humiliating experience with the Welfare Bureau\'s auditing and "deducting" binge, which would be the norm for the family, the two decide to get married with or without the blessing of Big Brother.Meanwhile, Claudine\'s elder son has gotten involved with some big talking but little doing Black Activist group. But, with Ruppert\'s help, he and they all come through it A.O.K.It ends on a Happy, Upbeat and Hopeful note. We know that it may not be exactly "...Happily Ever After!", but rather the\'ll make it all together! If there is a single criticism that we must state it is that sometimes in a movie like this, a misconception is spread to a large portion of Urban Blacks. 
And that is, the apparent implied myth that all Whites are wealthy, having none of their kind ever in need of a helping hand, out of work or suffering any disabilities.Well, folks, it just ain\'t true! NOTE: * At one point, Jim Brown\'s career was a real hit as a rugged actioner. He was even being tauted as "...The Black John Wayne." NOTE: ** The idea of producing films with All-Black Casts, filmed for All-Black consumption was not a new idea. In the 1920\'s, \'30\'s and \'40\'s, we saw productions from people like Noble Johnson, Spencer Williams, Jr. and Rex Ingram.NOTE: *** That\'s "As Time Goes By", you know, Schultz, it\'s from CASABLANCA (Warner Brothers, 1942).',
    "There absolutely was voter fraud. There's voter fraud in every election. However, they are generally isolated incidents and I don't think there has been any credible evidence presented that indicates any wide-scale systemic voter fraud happened in 2020. \n\nI would like a federal commission started that investigates and looks for systemic voter and election fraud. Especially one that would be empowered to look into cases of disenfranchisement and voter suppression as well. Everyone that is legally allowed to vote should be able to easily and securely register and cast their vote.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.1908, 0.3587],
#         [0.1908, 1.0000, 0.3531],
#         [0.3587, 0.3531, 1.0000]])
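The same embeddings support the other listed use cases. A sketch of semantic search and paraphrase mining with sentence_transformers.util (the query and corpus below are illustrative placeholders, not part of this card):

from sentence_transformers import util

corpus = [
    "Nationalism is a silly cock crowing on his own dunghill.",
    "There absolutely was voter fraud in every election.",
    "Everyone legally allowed to vote should be able to cast their vote.",
]
query_embedding = model.encode(["Is large-scale election fraud real?"])
corpus_embeddings = model.encode(corpus)

# Semantic search: rank corpus entries by cosine similarity to the query.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(corpus[hit["corpus_id"]], round(hit["score"], 4))

# Paraphrase mining: find the most similar pairs within a collection.
pairs = util.paraphrase_mining(model, corpus, top_k=1)
for score, i, j in pairs[:3]:
    print(round(score, 4), corpus[i], "<->", corpus[j])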

Evaluation

Metrics

Semantic Similarity

Metric            Value
pearson_cosine    0.4106
spearman_cosine   0.4261
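These are the Pearson and Spearman correlations between the model's cosine similarities and the gold labels on held-out pairs. A minimal sketch of how such scores are computed (the pairs and gold values below are hypothetical):

import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical held-out pairs with gold similarity scores in [0, 1].
s1 = ["A cat sits on the mat.", "The economy is improving.", "He loved the film."]
s2 = ["A kitten rests on a rug.", "Stocks fell sharply today.", "The movie was wonderful."]
gold = [0.9, 0.2, 0.8]

e1, e2 = model.encode(s1), model.encode(s2)
# Embeddings are already L2-normalized, so row-wise dot products are cosines.
cos = np.sum(e1 * e2, axis=1)

print("pearson_cosine:", pearsonr(cos, gold)[0])
print("spearman_cosine:", spearmanr(cos, gold)[0])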

Training Details

Training Dataset

Unnamed Dataset

  • Size: 11,180 training samples
  • Columns: sentence_0, sentence_1, and label
  • Approximate statistics based on the first 1000 samples:
                 sentence_0             sentence_1             label
    type         string                 string                 float
    details      min: 4 tokens          min: 5 tokens          min: 0.0
                 mean: 104.44 tokens    mean: 109.27 tokens    mean: 0.52
                 max: 512 tokens        max: 512 tokens        max: 1.0
  • Samples:
    Sample 1 (label: 1.0)
      sentence_0: The concept that things could be possibly be worse therefore do not strive to improve things is a weak and cowardly mentality.
      sentence_1: Nobody wants to hear your dumbass shit.

                  Edit: This dude dm'd me and had a total emotional meltdown, that's how bad my words hurt this man.
                  Based Macron needs to snort something off of your girlfriends titis

    Sample 2 (label: 0.7071067811865475)
      sentence_0: Even #foxnews pundit Brit Hume is calling this tweet a lie and should be the reason he loses the next election or is impeached & found guilty by the majority Republican Senate ASAP! #MuellerReport. End of story!
      sentence_1: An election like this will hardly ever be more decisive, thats just how these things are. I agree its sad that even someone like Le Pen doesnt break the habit.

    Sample 3 (label: 0.3333333333333333)
      sentence_0: review may contain spoilerspredictable, campy, bad special effects. it has a TV-movie feeling to it. the idea of the UN as being taken over by Satan is an interesting twist to the end of the world according to the bible. the premise is interesting, but its excution falls waaaay short. if you want to convert people to Christianity with a film like this, at least make it a quality one! i was seriously checking my watch while watching this piece of dreck. can't say much else about this film since i saw it over a year ago, and there isn't really much to say about this film other than.....skip it!
      sentence_1: wonderful movie with good story great humour (some great one-liners) and a soundtrack to die for.i've seen it 3 times so far.the american audiences are going to love it.
  • Loss: CosineSimilarityLoss with these parameters:
    {
        "loss_fct": "torch.nn.modules.loss.MSELoss"
    }
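In other words, the model embeds both sentences of each pair, takes the cosine similarity of the two embeddings, and regresses it onto the gold score with the MSE criterion named above. The objective in a few lines of PyTorch (random tensors stand in for the two batches of sentence embeddings):

import torch
import torch.nn.functional as F

u = torch.randn(8, 1024)          # embeddings of sentence_0 (batch of 8)
v = torch.randn(8, 1024)          # embeddings of sentence_1
gold = torch.rand(8)              # gold similarity scores in [0, 1]

cos = F.cosine_similarity(u, v)   # predicted similarities, shape (8,)
loss = F.mse_loss(cos, gold)      # CosineSimilarityLoss with MSELoss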
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • fp16: True
  • multi_dataset_batch_sampler: round_robin
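A sketch of how these values map onto the Sentence Transformers trainer API. The base checkpoint and the tiny inline dataset are assumptions for illustration only (the model name suggests an intfloat/e5-large-v2 backbone, and the real data is the unnamed 11,180-pair dataset described below):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

# Illustrative stand-in for the sentence_0 / sentence_1 / label dataset.
pairs = {
    "sentence_0": ["first sentence of a pair", "another first sentence"],
    "sentence_1": ["second sentence of a pair", "another second sentence"],
    "label": [1.0, 0.5],
}
train_dataset = Dataset.from_dict(pairs)
eval_dataset = Dataset.from_dict(pairs)  # eval_strategy="steps" requires one

model = SentenceTransformer("intfloat/e5-large-v2")  # assumed base checkpoint
args = SentenceTransformerTrainingArguments(
    output_dir="psyembedding-e5-large-v2",
    num_train_epochs=3,
    eval_strategy="steps",
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    fp16=True,  # as trained; requires a CUDA GPU
    multi_dataset_batch_sampler="round_robin",
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=losses.CosineSimilarityLoss(model),
)
trainer.train()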

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 32
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • hub_revision: None
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • liger_kernel_config: None
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin
  • router_mapping: {}
  • learning_rate_mapping: {}

Training Logs

Epoch Step Training Loss similarity_spearman_cosine
0.0286 10 - 0.1359
0.0571 20 - 0.1424
0.0857 30 - 0.1525
0.1143 40 - 0.1651
0.1429 50 - 0.1759
0.1714 60 - 0.1846
0.2 70 - 0.1947
0.2286 80 - 0.2056
0.2571 90 - 0.2144
0.2857 100 - 0.2298
0.3143 110 - 0.2409
0.3429 120 - 0.2526
0.3714 130 - 0.2511
0.4 140 - 0.2661
0.4286 150 - 0.2664
0.4571 160 - 0.2572
0.4857 170 - 0.2804
0.5143 180 - 0.2885
0.5429 190 - 0.2885
0.5714 200 - 0.2933
0.6 210 - 0.3037
0.6286 220 - 0.3163
0.6571 230 - 0.3197
0.6857 240 - 0.3275
0.7143 250 - 0.3238
0.7429 260 - 0.3262
0.7714 270 - 0.3295
0.8 280 - 0.3129
0.8286 290 - 0.3491
0.8571 300 - 0.3354
0.8857 310 - 0.3448
0.9143 320 - 0.3581
0.9429 330 - 0.3658
0.9714 340 - 0.3386
1.0 350 - 0.3503
1.0286 360 - 0.3533
1.0571 370 - 0.3604
1.0857 380 - 0.3624
1.1143 390 - 0.3549
1.1429 400 - 0.3594
1.1714 410 - 0.3747
1.2 420 - 0.3465
1.2286 430 - 0.3378
1.2571 440 - 0.3809
1.2857 450 - 0.3856
1.3143 460 - 0.3522
1.3429 470 - 0.3987
1.3714 480 - 0.3847
1.4 490 - 0.3688
1.4286 500 0.1157 0.3937
1.4571 510 - 0.3857
1.4857 520 - 0.4039
1.5143 530 - 0.3913
1.5429 540 - 0.3900
1.5714 550 - 0.3497
1.6 560 - 0.3613
1.6286 570 - 0.4067
1.6571 580 - 0.4016
1.6857 590 - 0.3954
1.7143 600 - 0.3947
1.7429 610 - 0.3864
1.7714 620 - 0.4194
1.8 630 - 0.3985
1.8286 640 - 0.4003
1.8571 650 - 0.4061
1.8857 660 - 0.4074
1.9143 670 - 0.4004
1.9429 680 - 0.4022
1.9714 690 - 0.4056
2.0 700 - 0.3991
2.0286 710 - 0.3944
2.0571 720 - 0.3952
2.0857 730 - 0.4014
2.1143 740 - 0.3846
2.1429 750 - 0.3719
2.1714 760 - 0.4073
2.2 770 - 0.3828
2.2286 780 - 0.3858
2.2571 790 - 0.4114
2.2857 800 - 0.3930
2.3143 810 - 0.3845
2.3429 820 - 0.4053
2.3714 830 - 0.3582
2.4 840 - 0.3848
2.4286 850 - 0.4139
2.4571 860 - 0.3609
2.4857 870 - 0.4122
2.5143 880 - 0.4101
2.5429 890 - 0.4261

Framework Versions

  • Python: 3.11.9
  • Sentence Transformers: 5.1.0
  • Transformers: 4.53.3
  • PyTorch: 2.5.1
  • Accelerate: 1.10.0
  • Datasets: 2.14.4
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}