---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:584355
- loss:CachedInfonce
widget:
- source_sentence: What were the criticisms made by Joe Joseph and Thomas Sutcliffe
about the film
sentences:
- Heirloom quality! Shop Now For Chanukah!
- 'Charlie Rymer (born December 18, 1967) is an American professional golfer who
played on the PGA Tour and the Nike Tour. He is currently an analyst for the Golf
Channel. Amateur career
Rymer was born in Cleveland, Tennessee and grew up in Fort Mill, South Carolina.
Rymer played college golf at Georgia Tech, where he was a third-team All-American
in 1988 and an honorable mention All-American in 1989. He won five tournaments
during his time at Georgia Tech.'
- Joe Joseph of The Times agreed that the film was insubstantial, calling it a "speedy,
cost-efficient way to interleave stock library footage with quotes from DJs and
showbiz journalists in order to fill gaps in the late summer schedules." The Independent's
Thomas Sutcliffe felt the airing of the film on the same week as the first anniversary
of the September 11 attacks was ill-timed, and described the film as "a scrappy
collage of warmed-over gossip and underpowered revelation."
- source_sentence: What is the expected impact on AT&T's networks if Apple releases
a WiFi-only model of the iPad
sentences:
- You don't have to take yourself too seriously, try to fit a mold, or fall into
the imitation trap. Your personal brand should look and feel like the best representation
of you.
- Add to that the fact that techies everywhere are either frothing or scoffing (with
a willingness to buy) at the iPad, and you're looking at yet another surge in
subscribers come March. Unless Apple releases the iPad without 3G support, as
may well be the case with the starting model priced at $499 and rumored to be
available only with WiFi, there will likely be hundreds of thousands of new 3G
devices added to AT&T's networks, and these devices are pretty data-heavy.
- He lives on Vashon Island with his wife, who is a teacher, and his teenage son
and daughter. They enjoy their gardens, walking, and spending vacations near water.
- source_sentence: When did Donald last see an office from the inside
sentences:
- 'Casso was endorsed by the Denver Post, but not the Rocky Mountain News. 2007
legislative session
In the 2007 session of the Colorado General Assembly, Casso sat on the House Education
Committee and the House State, Veterans, & Military Affairs Committee. During
the 2007 session, Casso sponsored two bills to revise the ways in which schools''
CSAP test scores were reported. One, which would have exempted scores from special
education students, was killed in a Senate committee; the other, which would have
exempted scores for students whose parents opt the students out of the test, was
killed in a House committee at Casso''s request because of concerns that it would
jeopardize federal school funding. Following the legislative session, Casso was
present at the Colorado State Capitol during an incident in which state troopers
shot and killed a mentally ill individual gunman targeting Gov. Bill Ritter. Casso
observed the dead body and afterwards supported increased security, including
metal detectors, for the state capitol building.'
- 'Donald saw the last time an office from the inside in 2006. Ever since did he
work online for himself in all different kind of coffee shops from Cambodia to
Tuvalu Islands. He started to get into serious trouble when AdWords Banned his
account of the blue. Oliver: Nice to meet you Donald, how are you'
- '- His Immortal Logness was magnificent
So bummed that I missed The Orb while touring the area. Thanks for the tribute
mix!'
- source_sentence: What percentage of tax revenues does corporate income tax revenue
currently account for in the U.S.
sentences:
- The fourth quarter unraveled at both ends, offensive stagnation leading to Houston
scoring chances that allowed the Rockets to set their defense and make the Pistons
attack in the half-court, where poor shooting from their backcourt - Rodney Stuckey
(1 of 10) and Brandon Knight combined to go 6 of 25 - made it difficult to spread
out Houston's defense. "I just think we lost our pace," Frank said.
- 'WELCOME TO KANEN INC.
AEROSPACE TOOL DESIGN
Kanen Inc. provides its services based on quality, customer satisfaction, and
a dedication to see your project succeed.'
- In his May 31 column (Abolish the Corporate Income Tax! ), he points out that
corporate income tax revenue has declined to 10% of tax revenues, despite the
U.S. having, by far, the highest corporate income tax rate in the world.
- source_sentence: What challenges does Mayor Hundred face in leading New York as
depicted in March to War
sentences:
- 'He retired in 2004. Honours
Morgan was appointed an Officer of the Order of the British Empire (OBE) in 2005.'
- Bring your sewing machine - scissors - material - pattern (if you already have
one that you want to work on) - have a project that you need assistance with -
just want to spend the day with your fellow Caerthen's in a day of sewing and
socializing? Come on out!
- His heroics convinced the citizens of New York to elect him mayor, and March to
War opens with Mayor Hundred dealing with unrest in the city. Political cartoons
display him as a caped superhero unable to handle the daily needs of the city,
and a protest against the war in Iraq has it divided.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
---
# SentenceTransformer
This is a [sentence-transformers](https://www.SBERT.net) model trained on 584,355 anchor-positive pairs. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
<!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 1024 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(transformer): Transformer(
(auto_model): XLMRobertaLoRA(
(roberta): XLMRobertaModel(
(embeddings): XLMRobertaEmbeddings(
(word_embeddings): ParametrizedEmbedding(
250002, 1024, padding_idx=1
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(token_type_embeddings): ParametrizedEmbedding(
1, 1024
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(emb_drop): Dropout(p=0.1, inplace=False)
(emb_ln): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
(encoder): XLMRobertaEncoder(
(layers): ModuleList(
(0-23): 24 x Block(
(mixer): MHA(
(rotary_emb): RotaryEmbedding()
(Wqkv): ParametrizedLinearResidual(
in_features=1024, out_features=3072, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(inner_attn): FlashSelfAttention(
(drop): Dropout(p=0.1, inplace=False)
)
(inner_cross_attn): FlashCrossAttention(
(drop): Dropout(p=0.1, inplace=False)
)
(out_proj): ParametrizedLinear(
in_features=1024, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(dropout1): Dropout(p=0.1, inplace=False)
(drop_path1): StochasticDepth(p=0.0, mode=row)
(norm1): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): ParametrizedLinear(
in_features=1024, out_features=4096, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(fc2): ParametrizedLinear(
in_features=4096, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
)
(dropout2): Dropout(p=0.1, inplace=False)
(drop_path2): StochasticDepth(p=0.0, mode=row)
(norm2): LayerNorm((1024,), eps=1e-05, elementwise_affine=True)
)
)
)
(pooler): XLMRobertaPooler(
(dense): ParametrizedLinear(
in_features=1024, out_features=1024, bias=True
(parametrizations): ModuleDict(
(weight): ParametrizationList(
(0): LoRAParametrization()
)
)
)
(activation): Tanh()
)
)
)
)
(pooler): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(normalizer): Normalize()
)
```
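The `pooler` and `normalizer` modules at the end of the stack apply mean pooling over token embeddings followed by L2 normalization (per the `pooling_mode_mean_tokens: True` configuration above). A minimal sketch of that step, assuming `token_embeddings` and `attention_mask` come from the transformer:
```python
import torch

def mean_pool_and_normalize(token_embeddings: torch.Tensor,
                            attention_mask: torch.Tensor) -> torch.Tensor:
    """Mean pooling over non-padding tokens, then L2 normalization."""
    # Zero out padding positions before averaging
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    sentence_embedding = summed / counts  # [batch, 1024]
    # After normalization, cosine similarity reduces to a dot product
    return torch.nn.functional.normalize(sentence_embedding, p=2, dim=1)
```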
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Jrinky/jina3")
# Run inference
sentences = [
'What challenges does Mayor Hundred face in leading New York as depicted in March to War',
'His heroics convinced the citizens of New York to elect him mayor, and March to War opens with Mayor Hundred dealing with unrest in the city. Political cartoons display him as a caped superhero unable to handle the daily needs of the city, and a protest against the war in Iraq has it divided.',
"Bring your sewing machine - scissors - material - pattern (if you already have one that you want to work on) - have a project that you need assistance with - just want to spend the day with your fellow Caerthen's in a day of sewing and socializing? Come on out!",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
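Building on the snippet above, a small illustrative retrieval example (the query string here is hypothetical): encode a query, score it against the candidate passages, and take the best match.
```python
# Continues from the snippet above: `model`, `sentences`, and `embeddings` are defined
query_embedding = model.encode(["Who was elected mayor of New York in March to War?"])
scores = model.similarity(query_embedding, embeddings)  # cosine similarities, shape [1, 3]
best = scores.argmax().item()
print(sentences[best])  # expected: the passage about Mayor Hundred
```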
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 584,355 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.41 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 119.19 tokens</li><li>max: 1979 tokens</li></ul> |
* Samples:
| anchor | positive |
|:----------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What resources and tools are recommended for busy brides planning their weddings</code> | <code>If you are planning on spending a little bit of time on your wedding planning, here is part 2 of my series of great resources and tools for wedding planning that every busy bride should know about. The previous instalment can be viewed here.</code> |
| <code>How many girls were raised in the house described</code> | <code>This house is where my parents proudly hung up our diplomas. This house is where 3 girls were raised.</code> |
| <code>Where did the narrator's dad always barbecue for Easter</code> | <code>This house is where my dad always barbequed for Easter, rain or shine. This house is where we welcomed family and friends on their first visit to the United States.</code> |
* Loss: <code>cachedselfloss2.CachedInfonce</code> with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
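`cachedselfloss2.CachedInfonce` appears to be a custom, gradient-cached variant of the InfoNCE in-batch-negatives objective (see the Gao et al. citation at the bottom of this card). As a rough sketch of the underlying objective only, under the parameters above (cosine similarity scaled by 20.0, i.e. a temperature of 0.05) and without the gradient caching:
```python
import torch
import torch.nn.functional as F

def infonce(anchor_emb: torch.Tensor, positive_emb: torch.Tensor,
            scale: float = 20.0) -> torch.Tensor:
    # [batch, batch] cosine similarity between every anchor and every positive
    sims = F.cosine_similarity(anchor_emb.unsqueeze(1),
                               positive_emb.unsqueeze(0), dim=-1)
    # Each anchor's true positive sits on the diagonal; every other
    # positive in the batch acts as an in-batch negative
    labels = torch.arange(sims.size(0), device=sims.device)
    return F.cross_entropy(sims * scale, labels)
```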
### Evaluation Dataset
#### Unnamed Dataset
* Size: 18,073 evaluation samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 6 tokens</li><li>mean: 17.52 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 107.58 tokens</li><li>max: 1832 tokens</li></ul> |
* Samples:
| anchor | positive |
|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What significant role did the character Raven portray in early cinema</code> | <code>He played cynical tough guys in modern films, but then branched into westerns where for the most part he was the gallant hero. In fact the ultimate gallant white knight hero in Shane. His part as Raven is a difficult one, yet he pulls it off. He's a cold blooded contract killer, one of the earliest ever portrayed as a film protagonist. Yet he's human and you see flashes of it, his concern for cats. As a cat lover, I can sure identify with that. Raven is also one of the earliest characters in cinema who talks about child abuse making him what he is. Groundbreaking when you think about it. Next to Ladd, the biggest kudos have to go to Laird Cregar, borrowed from 20th Century Fox to play Willard Gates. Gates is a top company executive with Marshall's firm which is a defense contractor which is why the Senate is interested in him. He's basically a jerk who thinks he's so clever. Veronica Lake gets to him real easy because of his weakness for the nightclub scene.</code> |
| <code>What are the key features and characteristics of the Majestic Pure Dead Sea Mud Mask</code> | <code>At the same time, it can be used all over your body, not just face. This way, you can clear any part of your skin from its impurities. An additional feature that caught our eyes immediately was the beautifully designed packaging that makes this affordable product look high-end. The combination of grey, black, and blue colors will easily make it stand out in any beauty shop. - Good for sensitive and dry skin<br>- Affordable price<br>- Treats many different skin conditions<br>- Not good for oily skin<br>- Can feel a bit oily<br>Majestic Pure Dead Sea Mud Mask Review<br>Majestic Pure is a well-known brand among the beauty and skincare community. They make affordable, natural products and are oftentimes among the celebrity favorites.</code> |
| <code>What benefits does this product provide for the skin</code> | <code>This will provide you with softer skin that glows. At the same time, it will help you deal with your clogged pores and provide you the necessary daily acne treatment.</code> |
* Loss: <code>cachedselfloss2.CachedInfonce</code> with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 800
- `per_device_eval_batch_size`: 800
- `learning_rate`: 2e-05
- `num_train_epochs`: 10
- `warmup_ratio`: 0.1
- `bf16`: True
- `batch_sampler`: no_duplicates
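A hedged sketch of how these settings map onto the Sentence Transformers v3 training API; see the full hyperparameter dump below for the rest. The base-model identifier and datasets here are placeholders (the card leaves both unspecified), and the library's built-in `CachedMultipleNegativesRankingLoss` stands in for the custom `cachedselfloss2.CachedInfonce`.
```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("base-model-id")  # base model not stated in this card
# Toy stand-ins for the 584,355-pair train set and 18,073-pair eval set
train_dataset = Dataset.from_dict({
    "anchor": ["How many girls were raised in the house described"],
    "positive": ["This house is where 3 girls were raised."],
})
eval_dataset = train_dataset
# Closest built-in analogue to cachedselfloss2.CachedInfonce
loss = CachedMultipleNegativesRankingLoss(model, scale=20.0)
args = SentenceTransformerTrainingArguments(
    output_dir="output",
    per_device_train_batch_size=800,
    per_device_eval_batch_size=800,
    learning_rate=2e-5,
    num_train_epochs=10,
    warmup_ratio=0.1,
    bf16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=loss,
).train()
```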
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 800
- `per_device_eval_batch_size`: 800
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss |
|:------:|:----:|:-------------:|:---------------:|
| 0.2052 | 150 | 13.7695 | 17.4625 |
| 0.4104 | 300 | 14.2067 | 17.4452 |
| 0.6156 | 450 | 14.344 | 17.4289 |
| 0.8208 | 600 | 13.705 | 17.3620 |
| 1.0260 | 750 | 13.0304 | 17.1246 |
| 1.2312 | 900 | 13.28 | 16.7495 |
| 1.4364 | 1050 | 13.0314 | 16.5068 |
| 1.6416 | 1200 | 13.0861 | 16.3113 |
| 1.8468 | 1350 | 13.2752 | 16.1406 |
| 2.0520 | 1500 | 12.2868 | 16.0122 |
| 2.2572 | 1650 | 12.9551 | 15.9320 |
| 2.4624 | 1800 | 12.8339 | 15.8444 |
| 2.6676 | 1950 | 12.0719 | 15.8108 |
| 2.8728 | 2100 | 12.7803 | 15.7694 |
| 3.0780 | 2250 | 11.9023 | 15.7460 |
| 3.2832 | 2400 | 12.6882 | 15.7291 |
| 3.4884 | 2550 | 12.3062 | 15.7165 |
| 3.6936 | 2700 | 12.402 | 15.7071 |
| 3.8988 | 2850 | 12.0136 | 15.7014 |
| 4.1040 | 3000 | 12.821 | 15.6873 |
| 4.3092 | 3150 | 12.4667 | 15.6835 |
| 4.5144 | 3300 | 12.6469 | 15.6740 |
| 4.7196 | 3450 | 12.1751 | 15.6519 |
| 4.9248 | 3600 | 12.3627 | 15.6637 |
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 3.4.1
- Transformers: 4.49.0
- PyTorch: 2.3.1+cu121
- Accelerate: 1.5.2
- Datasets: 3.4.1
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### CachedInfonce
```bibtex
@misc{gao2021scaling,
title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
year={2021},
eprint={2101.06983},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->