---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sparse-encoder
- sparse
- asymmetric
- inference-free
- splade
- generated_from_trainer
- dataset_size:18247
- loss:SpladeLoss
- loss:SparseMultipleNegativesRankingLoss
- loss:FlopsLoss
base_model: opensearch-project/opensearch-neural-sparse-encoding-doc-v3-gte
widget:
- text: '```markdown
Interval UK Holdings Limited
Balance sheet
As at 31 December 2024
|| Note | 2024 £ | 2023 £ |
|-----------------------|------|--------|--------|
| Non current assets||||
| Investments| 9| -| -|
| Current assets||||
| Cash at bank and in hand || 115,921| 111,205|
||| 115,921| 111,205|
| Creditors: amounts falling due within one year | 10 | (100,017)| (100,344)|
| Net current assets|| 15,904 | 10,861 |
| Creditors: amounts falling due after more than one year | 11 | (430,000)|
(430,000)|
| Net liabilities|| (414,096)| (419,139)|
| Capital and reserves ||||
| Called-up share capital | 12 | 19,811,905| 19,811,905|
| Profit and loss account || (20,226,001)| (20,231,044)|
| Shareholder''s deficit || (414,096)| (419,139)|
The financial statements of Interval UK Holdings Ltd (03000895) were approved
by the Board on 25th September 2025
and signed on its behalf by:
D
DP Ettridge
Director
10
```'
- text: '25 | Page
2023 (£) 2022 (£) Raw materials and consumables 62,693 50,917 Finished goods and
goods for resale 61,306 110,746 Total 123,999 161,663
An impairment loss of £36,779 (2022: gain of £33,801) was recognised in cost of
sales against stock during the year due to revaluation criteria. The value of
stock at year end is not materially different to its replacement cost.
Stock
Of the amounts owed by group undertakings: £ 398,051 (2022: £1,972,482) are trade
related, unsecured and have an interest charge of 0%. £ 3,900,000 (2022: £1,700,000)
are a deposit, unsecured and have an interest charge of 5.22%.
Stock recognised in cost of sales during the year as an expense was £241,612 (2022:
£258,218).
Cash at bank and in hand
2023 (£) 2022 (£) Trade debtors 544,252 739,053 Amounts owed by group undertakings
4,298,051 3,672,482 Other debtors 75,183 3,866 Corporation Tax receivable 38,958
- Prepayments and accrued income 45,623 44,125 Total 5,002,067 4,459,526
Trade debtors are stated after provisions for impairment of £217,545 (2022: £220,257)
2023 (£) 2022 (£) Cash at bank and in hand 657,275 545,542
Debtors'
- text: 'Page 23
Nominal value: 2022 (€) 2021 (€) £1 120,924,800 120,924,800
Employee benefit obligations
Notes to the Financial Statements - continued for the year ended 31 December 2022
The Scheme closed to the future accrual of benefits on 31 August 2015 and the
4 members who were active at the closure date were granted deferred benefits in
the Scheme. Since 1 April 2003, the Scheme has provided benefits on a Career Average
Revalued Earnings ("CARE") basis with Pensionable Earnings revalued each year
in line with the increase in Retail Price Index ("RPI") inflation. This method
of revaluation is broadly consistent with the way that deferred pensions increase
before retirement and therefore the closure of future accrual does not have a
material impact on the value of members'' accrued benefits.
Called up share capital
Defined benefit pension plans 2022 (€''000) Interest cost 1,674 Interest income
(1,697) Actual return on plan assets (23) 31,933
Defined benefit pension plans 2022 (€''000) Present value of funded obligations
(53,460) Fair value of plan assets 51,628 (Deficit)/surplus (1,832) Net (liability)/surplus
(1,832)
The amounts recognised in profit or loss are as follows:
Number: 103,276,582 Class: Ordinary shares
The amounts recognised in the balance sheet are as follows:
The company operates a defined benefit pension scheme. An actuarial valuation
was carried out at 31 March 2020 by a qualified independent actuary.
Allotted, issued and fully paid:
Dunlop International Europe Limited'
- text: 'Komline-Sanderson Ltd
Notes to the financial statements
For the year ended 31 March 2025
3. Employees
The average monthly number of employees, including directors, during the year
was 4 (2024 - 4).
4. Taxation
Factors affecting tax charge for the year
There was no current tax charge in either year as the Company made a taxable loss
during the current reporting period and utilised brought forward trading losses
in the prior period.
Factors that may affect future tax charges
The Company has tax losses of approximately £134,000 which will reduce future
charges to corporation tax.
5. Tangible fixed assets
| Cost or valuation | Office equipment £ |
|---|---|
| At 1 April 2024 | 2,201 |
| At 31 March 2025 | 2,201 |
| Depreciation | |
| At 1 April 2024 | 2,201 |
| At 31 March 2025 | 2,201 |
| Net book value | |
| At 31 March 2025 | - |
| At 31 March 2024 | - |
6. Debtors
| | 2025 £ | 2024 € |
|---|---|---|
| Trade debtors | 881 | 11,736 |
| Other debtors | 134 | - |
| | 1,015 | 11,736 |
Page 3'
datasets:
- oneryalcin/financial-filings-sparse-retrieval-training
pipeline_tag: feature-extraction
library_name: sentence-transformers
metrics:
- dot_accuracy@1
- dot_accuracy@3
- dot_accuracy@5
- dot_accuracy@10
- dot_precision@1
- dot_precision@3
- dot_precision@5
- dot_precision@10
- dot_recall@1
- dot_recall@3
- dot_recall@5
- dot_recall@10
- dot_ndcg@10
- dot_mrr@10
- dot_map@100
- query_active_dims
- query_sparsity_ratio
- corpus_active_dims
- corpus_sparsity_ratio
- avg_flops
model-index:
- name: Financial Domain Sparse Encoder (doc-v3-gte fine-tuned)
results:
- task:
type: sparse-information-retrieval
name: Sparse Information Retrieval
dataset:
name: NanoNFCorpus
type: NanoNFCorpus
metrics:
- type: dot_accuracy@1
value: 0.44
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.62
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.66
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.76
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.44
name: Dot Precision@1
- type: dot_precision@3
value: 0.38
name: Dot Precision@3
- type: dot_precision@5
value: 0.35600000000000004
name: Dot Precision@5
- type: dot_precision@10
value: 0.314
name: Dot Precision@10
- type: dot_recall@1
value: 0.047208191733070066
name: Dot Recall@1
- type: dot_recall@3
value: 0.10033853359651303
name: Dot Recall@3
- type: dot_recall@5
value: 0.12344673739385978
name: Dot Recall@5
- type: dot_recall@10
value: 0.15735846776073342
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.37746727490101206
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.5380793650793652
name: Dot Mrr@10
- type: dot_map@100
value: 0.17865969564136092
name: Dot Map@100
- type: query_active_dims
value: 4.760000228881836
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.999844046909479
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 1493.3485107421875
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.9510730453200253
name: Corpus Sparsity Ratio
- type: avg_flops
value: 1.0107077360153198
name: Avg Flops
- task:
type: sparse-information-retrieval
name: Sparse Information Retrieval
dataset:
name: NanoSciFact
type: NanoSciFact
metrics:
- type: dot_accuracy@1
value: 0.54
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.84
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.86
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.9
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.54
name: Dot Precision@1
- type: dot_precision@3
value: 0.3
name: Dot Precision@3
- type: dot_precision@5
value: 0.18799999999999997
name: Dot Precision@5
- type: dot_precision@10
value: 0.09999999999999998
name: Dot Precision@10
- type: dot_recall@1
value: 0.53
name: Dot Recall@1
- type: dot_recall@3
value: 0.82
name: Dot Recall@3
- type: dot_recall@5
value: 0.845
name: Dot Recall@5
- type: dot_recall@10
value: 0.89
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.7393057169200965
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.6897222222222222
name: Dot Mrr@10
- type: dot_map@100
value: 0.6905205627705626
name: Dot Map@100
- type: query_active_dims
value: 19.040000915527344
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.999376187637916
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 1752.0487060546875
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.9425971854382187
name: Corpus Sparsity Ratio
- type: avg_flops
value: 4.766234874725342
name: Avg Flops
- task:
type: sparse-information-retrieval
name: Sparse Information Retrieval
dataset:
name: NanoFiQA2018
type: NanoFiQA2018
metrics:
- type: dot_accuracy@1
value: 0.36
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.58
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.64
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.7
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.36
name: Dot Precision@1
- type: dot_precision@3
value: 0.2733333333333333
name: Dot Precision@3
- type: dot_precision@5
value: 0.20800000000000002
name: Dot Precision@5
- type: dot_precision@10
value: 0.11999999999999998
name: Dot Precision@10
- type: dot_recall@1
value: 0.18724603174603174
name: Dot Recall@1
- type: dot_recall@3
value: 0.40712698412698417
name: Dot Recall@3
- type: dot_recall@5
value: 0.49556349206349204
name: Dot Recall@5
- type: dot_recall@10
value: 0.5478095238095239
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.44010103944970097
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.46983333333333327
name: Dot Mrr@10
- type: dot_map@100
value: 0.38215757668624306
name: Dot Map@100
- type: query_active_dims
value: 12.0600004196167
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.999604875158259
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 1723.0986328125
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.9435456840045706
name: Corpus Sparsity Ratio
- type: avg_flops
value: 4.071389198303223
name: Avg Flops
- task:
type: sparse-nano-beir
name: Sparse Nano BEIR
dataset:
name: NanoBEIR mean
type: NanoBEIR_mean
metrics:
- type: dot_accuracy@1
value: 0.4466666666666666
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.68
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.7200000000000001
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.7866666666666667
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.4466666666666666
name: Dot Precision@1
- type: dot_precision@3
value: 0.31777777777777777
name: Dot Precision@3
- type: dot_precision@5
value: 0.25066666666666665
name: Dot Precision@5
- type: dot_precision@10
value: 0.17799999999999996
name: Dot Precision@10
- type: dot_recall@1
value: 0.25481807449303395
name: Dot Recall@1
- type: dot_recall@3
value: 0.44248850590783234
name: Dot Recall@3
- type: dot_recall@5
value: 0.4880034098191173
name: Dot Recall@5
- type: dot_recall@10
value: 0.5317226638567525
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.5189580104236031
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.5658783068783069
name: Dot Mrr@10
- type: dot_map@100
value: 0.41711261169938885
name: Dot Map@100
- type: query_active_dims
value: 11.953333854675293
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.9996083699018847
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 1666.2235158269893
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.945409097836741
name: Corpus Sparsity Ratio
- type: avg_flops
value: 2.325575590133667
name: Avg Flops
---
# Financial Domain Sparse Encoder (doc-v3-gte fine-tuned)
This is an [Asymmetric Inference-free SPLADE Sparse Encoder](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) model fine-tuned from [opensearch-project/opensearch-neural-sparse-encoding-doc-v3-gte](https://huggingface.co/opensearch-project/opensearch-neural-sparse-encoding-doc-v3-gte) on the [financial-filings-sparse-retrieval-training](https://huggingface.co/datasets/oneryalcin/financial-filings-sparse-retrieval-training) dataset using the [sentence-transformers](https://www.SBERT.net) library. It maps sentences and paragraphs to a 30522-dimensional sparse vector space and can be used for semantic search and sparse retrieval.
## Model Details
### Model Description
- **Model Type:** Asymmetric Inference-free SPLADE Sparse Encoder
- **Base model:** [opensearch-project/opensearch-neural-sparse-encoding-doc-v3-gte](https://huggingface.co/opensearch-project/opensearch-neural-sparse-encoding-doc-v3-gte) <!-- at revision 1646fef40807937e8e130c66d327a26421c408d5 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 30522 dimensions
- **Similarity Function:** Dot Product
- **Training Dataset:**
- [financial-filings-sparse-retrieval-training](https://huggingface.co/datasets/oneryalcin/financial-filings-sparse-retrieval-training)
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder)
### Full Model Architecture
```
SparseEncoder(
(0): Router(
(sub_modules): ModuleDict(
(query): Sequential(
(0): SparseStaticEmbedding({'frozen': False}, dim=30522, tokenizer=TokenizersBackend)
)
(document): Sequential(
(0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'NewForMaskedLM'})
(1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'log1p_relu', 'word_embedding_dimension': 30522})
)
)
)
)
```
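The document branch above applies `SpladePooling` with `log1p_relu` activation and max pooling to the MLM head's logits. A minimal numpy sketch of that pooling step, using toy shapes (3 tokens, 5-term vocabulary) rather than the real 30522-dim MLM output:

```python
import numpy as np

# Toy MLM logits for one 3-token document over a 5-term vocabulary.
# (The real model produces logits over the full 30522-term vocabulary.)
logits = np.array([
    [ 2.0, -1.0, 0.0,  0.5, -3.0],
    [-0.5,  1.0, 0.0, -2.0, -1.0],
    [ 0.2, -0.3, 0.0,  3.0, -0.1],
])

# SpladePooling with activation_function='log1p_relu', pooling_strategy='max':
# apply log(1 + ReLU(logit)) per token, then take the max over the sequence.
doc_embedding = np.log1p(np.maximum(logits, 0.0)).max(axis=0)

print(doc_embedding.round(4))
# Negative logits contribute exactly 0, which is what keeps the vector sparse:
print(doc_embedding == 0)  # dims 2 and 4 are inactive
```

The query branch skips this entirely: `SparseStaticEmbedding` is a per-token weight lookup, which is why query encoding is "inference-free" (no transformer forward pass at query time).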
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SparseEncoder
# Download from the 🤗 Hub
model = SparseEncoder("oneryalcin/fin-sparse-encoder-doc-v1")
# Run inference
queries = [
"How much did the company charge for depreciation of tangible assets in 2025?",
]
documents = [
"CoolerAid Holdings Ltd\nNotes to the Consolidated Financial Statements (continued)\nYear ended 31 March 2025\n\n5. Operating profit\nOperating profit or loss is stated after charging/crediting\n\n|| 2025| 2024|\n|------------------------------------------------|---------|-----------|\n| Depreciation of tangible assets| 870,340 | 752,425 |\n| Loss/(gains) on disposal of tangible assets| 1,535 | (173,401) |\n| Impairment of trade debtors| 32,465 | 15,860|\n| Operating lease rentals| 112,292 | 139,908 |\n\n6. Auditor's remuneration\n\n|| 2025 | 2024 |\n|--------------------------------------------------------------|-------|-------|\n| Fees payable for the audit of the consolidated financial statements | 5,000 | 5,000 |\n\n7. Staff costs\nThe average number of persons employed by the group during the year, including the directors, amounted to:\n\n|| 2025 | 2024 |\n|---------------------|------|------|\n| Distribution staff | 44 | 42 |\n| Administrative staff| 14 | 13 |\n| Management staff| 2| 2|\n| Directors| 4| 4|\n|| 64 | 61 |\n\nThe aggregate payroll costs incurred during the year, relating to the above, were:\n\n|| 2025| 2024|\n|---------------------|-----------|-----------|\n| Wages and salaries | 2,828,063 | 2,596,870 |\n| Social security costs| 314,830 | 303,444 |\n| Other pension costs | 79,769| 68,413|\n|| 3,222,662 | 2,968,727 |\n\n8. Directors' remuneration\nThe directors' aggregate remuneration in respect of qualifying services was:\n\n|| 2025 | 2024 |\n|---------------|--------|--------|\n| Remuneration | 18,880 | 18,880 |\n\n- 20 -",
'Kenneth Forbes (Holdings) Limited\nNotes to the Financial Statements (continued)\nYear ended 30 April 2025\n\n13. Tangible assets.\n\n| Group and company | Freehold property £ | Plant and machinery £ | Fixtures and fittings £ | Motor vehicles £ | Assets held under the course of construction £ | Total £ |\n|---|---|---|---|---|---|---|\n| Cost | | | | | | |\n| At 1 May 2024 | 3,941,316 | 3,684,443 | 1,326,032 | 294,882 | 41,948 | 9,288,621 |\n| Additions | 570,435 | 395,946 | 187,049 | 194,503 | 106,519 | 1,454,452 |\n| Disposals | | (317,265) | (214,785) | (164,244) | | (696,294) |\n| At 30 Apr 2025 | 4,511,751 | 3,763,124 | 1,298,296 | 325,141 | 148,467 | 10,046,779 |\n| Depreciation | | | | | | |\n| At 1 May 2024 | 1,158,100 | 3,257,887 | 904,088 | 156,223 | | 5,476,298 |\n| Charge for the year | 133,789 | 160,512 | 101,477 | 78,993 | | 474,771 |\n| Disposals | | (317,265) | (214,785) | (133,585) | | (665,635) |\n| At 30 Apr 2025 | 1,291,889 | 3,101,134 | 790,780 | 101,631 | | 5,285,434 |\n| Carrying amount | | | | | | |\n| At 30 Apr 2025 | 3,219,862 | 661,990 | 507,516 | 223,510 | 148,467 | 4,761,345 |\n| At 30 Apr 2024 | 2,783,216 | 426,556 | 421,944 | 138,659 | 41,948 | 3,812,323 |\n\nFreehold land amounting to £475,010 (2024: £475,010) is not depreciated.\nAll assets are held by the company but used by the group undertakings or associates.\n\nTangible assets held at valuation\n\nIn respect of tangible assets held at valuation, aggregate cost, depreciation and comparable carrying\namount that would have been recognised if the assets had been carried under the historical cost model are\nas follows:\n\nGroup and company\n\n| | Freehold property £ |\n|---|---|\n| At 30 April 2025 | |\n| Aggregate cost | 3,854,887 |\n| Aggregate depreciation | (1,963,225) |\n| Carrying value | 1,891,662 |\n| At 30 April 2024 | |\n| Aggregate cost | 3,854,887 |\n| Aggregate depreciation | (1,700,013) |\n| Carrying value | 2,154,874 |\n\n1\n-24-\n1',
'WHOCANFIXMYCAR.COM LTD\nNOTES TO THE FINANCIAL STATEMENTS\nFOR THE YEAR ENDED 31 MARCH 2025\n\n7. Tangible fixed assets\n\n| | Fixtures and fittings £ | Computer equipment £ | Right of use assets £ | Total £ | £ |\n|---|---|---|---|---|---|\n| **Cost** | | | | | |\n| At 1 April 2024 | 61,966 | 131,878 | - | 193,844 | |\n| Additions | - | 3,187 | 287,367 | 290,554 | |\n| At 31 March 2025 | 61,966 | 135,065 | 287,367 | 484,398 | |\n| **Depreciation** | | | | | |\n| At 1 April 2024 | 61,966 | 99,890 | - | 161,856 | |\n| Charge for the year | - | 12,982 | 82,657 | 95,639 | |\n| At 31 March 2025 | 61,966 | 112,872 | 82,657 | 257,495 | |\n| **Net book value** | | | | | |\n| At 31 March 2025 | - | 22,193 | 204,710 | 226,903 | |\n| At 31 March 2024 | - | 31,988 | - | 31,988 | |\n\nThe net book value of owned and leased assets included as "Tangible fixed assets" in the Balance Sheet is as follows:\n\n| | 2025 £ | 2024 £ |\n|---|---|---|\n| Tangible fixed assets owned | 22,193 | 31,988 |\n| Right-of-use tangible fixed assets | 204,710 | - |\n| | 226,903 | 31,988 |\n\nInformation about right-of-use assets is summarised below:\n\nNet book value\n\n| | 2025 £ | 2024 £ |\n|---|---|---|\n| Office and computer equipment | 204,710 | - |\n| | 204,710 | - |\n\nPage 11',
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 30522] [3, 30522]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[18.4437, 21.8515, 17.4189]])
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Domain Evaluation: Financial Document Retrieval
Evaluated on 2,028 held-out financial test examples (SEC filings + earnings call transcripts). Each example has 1 query + 1 positive + up to 7 hard negatives. We re-rank candidates per query using sparse dot-product similarity.
| Metric | Base Model | Fine-tuned (2 epochs) | Delta |
|:------------|:-----------|:----------------------|:--------|
| **acc@1** | 39.9% | **55.2%** | +15.2% |
| **acc@3** | 69.2% | **84.0%** | +14.8% |
| **acc@5** | 84.1% | **93.7%** | +9.6% |
| **mrr@10** | 0.580 | **0.710** | +13.0% |
| **ndcg@10** | 0.681 | **0.781** | +10.0% |
| mean_rank | 2.90 | **2.09** | -0.81 |
| median_rank | 2.0 | **1.0** | -1.0 |
The fine-tuned model ranks the correct document first more often than not (median rank 1, versus 2 for the base model). Sparsity also improved: corpus active dims decreased from ~2,500 to ~1,666, which means faster inverted-index lookups at inference time.
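The re-ranking above uses sparse dot-product similarity, which is what makes these embeddings cheap to serve from an inverted index. An illustrative sketch (not the library API) of that scoring scheme, with sparse vectors as `{dim: weight}` dicts over made-up dimensions:

```python
from collections import defaultdict

# Sparse document vectors: only active dims are stored. In a real deployment
# each dim index corresponds to a vocabulary token; these values are made up.
corpus = {
    "doc_a": {101: 1.25, 530: 0.75, 742: 0.5},
    "doc_b": {101: 0.25, 901: 1.5},
    "doc_c": {530: 2.0, 901: 0.25},
}

# Build the inverted index: dim -> [(doc_id, weight), ...].
index = defaultdict(list)
for doc_id, vec in corpus.items():
    for dim, weight in vec.items():
        index[dim].append((doc_id, weight))

def search(query_vec):
    """Accumulate dot products by walking only the query's active dims."""
    scores = defaultdict(float)
    for dim, q_weight in query_vec.items():
        for doc_id, d_weight in index[dim]:
            scores[doc_id] += q_weight * d_weight
    return sorted(scores.items(), key=lambda kv: -kv[1])

# A query activating only dims 101 and 530 touches just two posting lists.
print(search({101: 1.0, 530: 0.5}))
# → [('doc_a', 1.625), ('doc_c', 1.0), ('doc_b', 0.25)]
```

Because the cost of a lookup scales with the number of active dims per posting list, the drop in corpus active dims reported above translates directly into fewer multiply-accumulates per query.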
### Metrics
#### Sparse Information Retrieval
* Datasets: `NanoNFCorpus`, `NanoSciFact` and `NanoFiQA2018`
* Evaluated with [<code>SparseInformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseInformationRetrievalEvaluator)
| Metric | NanoNFCorpus | NanoSciFact | NanoFiQA2018 |
|:----------------------|:-------------|:------------|:-------------|
| dot_accuracy@1 | 0.44 | 0.54 | 0.36 |
| dot_accuracy@3 | 0.62 | 0.84 | 0.58 |
| dot_accuracy@5 | 0.66 | 0.86 | 0.64 |
| dot_accuracy@10 | 0.76 | 0.9 | 0.7 |
| dot_precision@1 | 0.44 | 0.54 | 0.36 |
| dot_precision@3 | 0.38 | 0.3 | 0.2733 |
| dot_precision@5 | 0.356 | 0.188 | 0.208 |
| dot_precision@10 | 0.314 | 0.1 | 0.12 |
| dot_recall@1 | 0.0472 | 0.53 | 0.1872 |
| dot_recall@3 | 0.1003 | 0.82 | 0.4071 |
| dot_recall@5 | 0.1234 | 0.845 | 0.4956 |
| dot_recall@10 | 0.1574 | 0.89 | 0.5478 |
| **dot_ndcg@10** | **0.3775** | **0.7393** | **0.4401** |
| dot_mrr@10 | 0.5381 | 0.6897 | 0.4698 |
| dot_map@100 | 0.1787 | 0.6905 | 0.3822 |
| query_active_dims | 4.76 | 19.04 | 12.06 |
| query_sparsity_ratio | 0.9998 | 0.9994 | 0.9996 |
| corpus_active_dims | 1493.3485 | 1752.0487 | 1723.0986 |
| corpus_sparsity_ratio | 0.9511 | 0.9426 | 0.9435 |
| avg_flops | 1.0107 | 4.7662 | 4.0714 |
#### Sparse Nano BEIR
* Dataset: `NanoBEIR_mean`
* Evaluated with [<code>SparseNanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseNanoBEIREvaluator) with these parameters:
```json
{
"dataset_names": [
"nfcorpus",
"scifact",
"fiqa2018"
],
"dataset_id": "sentence-transformers/NanoBEIR-en"
}
```
| Metric | Value |
|:----------------------|:----------|
| dot_accuracy@1 | 0.4467 |
| dot_accuracy@3 | 0.68 |
| dot_accuracy@5 | 0.72 |
| dot_accuracy@10 | 0.7867 |
| dot_precision@1 | 0.4467 |
| dot_precision@3 | 0.3178 |
| dot_precision@5 | 0.2507 |
| dot_precision@10 | 0.178 |
| dot_recall@1 | 0.2548 |
| dot_recall@3 | 0.4425 |
| dot_recall@5 | 0.488 |
| dot_recall@10 | 0.5317 |
| **dot_ndcg@10** | **0.519** |
| dot_mrr@10 | 0.5659 |
| dot_map@100 | 0.4171 |
| query_active_dims | 11.9533 |
| query_sparsity_ratio | 0.9996 |
| corpus_active_dims | 1666.2235 |
| corpus_sparsity_ratio | 0.9454 |
| avg_flops | 2.3256 |
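The sparsity ratios in these tables follow directly from the active-dims figures over the 30522-dimension output space; a quick check against the NanoBEIR-mean row:

```python
# Sparsity ratio = fraction of output dimensions that are zero,
# i.e. 1 - active_dims / vocab_size, with vocab_size = 30522.
VOCAB_SIZE = 30522

def sparsity_ratio(active_dims: float) -> float:
    return 1.0 - active_dims / VOCAB_SIZE

# Reproduce the NanoBEIR-mean figures from the table above:
print(round(sparsity_ratio(11.9533), 4))    # query:  0.9996
print(round(sparsity_ratio(1666.2235), 4))  # corpus: 0.9454
```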
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### financial-filings-sparse-retrieval-training
* Dataset: [financial-filings-sparse-retrieval-training](https://huggingface.co/datasets/oneryalcin/financial-filings-sparse-retrieval-training) at [23e44ab](https://huggingface.co/datasets/oneryalcin/financial-filings-sparse-retrieval-training/tree/23e44abc3bfdb454da434ba8eb3e38bd1e01be84)
* Size: 18,247 training samples
* Columns: <code>query</code>, <code>positive</code>, <code>negative_0</code>, <code>negative_1</code>, <code>negative_2</code>, <code>negative_3</code>, <code>negative_4</code>, <code>negative_5</code>, and <code>negative_6</code>
* Approximate statistics based on the first 1000 samples:
| | query | positive | negative_0 | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
| type | string | string | string | string | string | string | string | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 20.51 tokens</li><li>max: 79 tokens</li></ul> | <ul><li>min: 51 tokens</li><li>mean: 331.12 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 54 tokens</li><li>mean: 360.35 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 56 tokens</li><li>mean: 357.16 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 303.81 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 263.98 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 221.87 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 193.47 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 166.03 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| query | positive | negative_0 | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | negative_6 |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------|
| <code>What was the actuarial gain on defined benefit pension plans for the year 2021?</code> | <code>2021 £'000 2020 £'000 Loss for the financial year (9,731) (50,454) Other comprehensive income/(expense) for the financial period Actuarial gain/(loss) on defined benefit pension plans 31,200 (4,400) Deferred tax impact of actuarial gain/(loss) (10,920) 1,540 Other comprehensive income/(expense) 20,280 (2,860) Total comprehensive income/(expense) for the financial period 10,549 (53,314)<br><br>STATEMENT OF COMPREHENSIVE INCOME FOR THE YEAR ENDED 31 DECEMBER 2021<br><br>11</code> | <code>Defined benefit pension plans 2022 2021 Equities 12% 26% Bonds 56% 31% Liability driven investment 32% 39% Other - 4% 100% 100%<br><br>Defined benefit pension plans 2022 €'000 2021 €'000 Actuarial losses from changes in financial assumptions 36 391 Actuarial gain/(losses) 3,629 (4,830) 3,665 (4,439)<br><br>```markdown Dunlop International Europe Limited Notes to the Financial Statements - continued for the year ended 31 December 2022<br><br>Defined benefit pension plans 2022 €'000 2021 €'000 Opening fair value of scheme assets 90,460 85,266 Contributions by employer 1,129 1,189 Expected return 1,697 1,183 Actuarial (losses) (33,630) (591) Benefits paid (3,397) (3,408) Exchange differences on foreign plans (4,631) 6,821 51,628 90,460<br><br>Changes in the fair value of scheme assets are as follows:<br><br>Page 24 ```<br><br>The major categories of scheme assets as a percentage of total scheme assets are as follows:<br><br>Changes in the present value of the defined benefit obligation are as follows:<br><br>Defined benefit pension pla...</code> | <code>The components of net periodic pension benefit cost recognized in our Consolidated Statements of Operations for the periods presented are as follows:<br><br>Years Ended December 31, 2021 Projected benefit obligation, beginning of year $ 97,740 Service cost $ 1,282 Interest cost $ 1,452 Actuarial (gain) loss $ (8,682) Benefits paid $ 
(2,010) Translation adjustment $ (4,006) Projected benefit obligation, end of year $ 85,776 Fair value of plan assets, beginning of year $ 17,293 Actual return on plan assets $ 641 Contributions $ 1,775 Benefits paid $ (1,112) Actuarial gain $ 71 Translation adjustment $ (147) Fair value of plan assets, end of year $ 18,521 Funded status of plan $ (67,255)<br><br>Our projected benefit obligation and plan assets for defined benefit pension plans and the related assumptions used to determine the related liabilities are as follows:<br><br>Defined Benefit Plan<br><br>We maintain defined benefit pension plans for certain of our non-U.S. employees in the U.K., Germany, and Philippines. ...</code> | <code></code> | <code></code> | <code></code> | <code></code> | <code></code> |
| <code>What is the interest rate for the GBP short term loan with Canadian Natural Resources Limited?</code> | <code>CREDITORS: amounts falling due within one year<br><br>33<br><br>DEBTORS<br><br>The carrying amount of debtors is a reasonable approximation to fair value. Trade debtors and other receivables are not overdue as payment terms have not been exceeded. The expected credit loss on the trade debtor's balance was negligible and therefore no adjustment has been applied.<br><br>Other creditors includes £4.9 million (2022 - £4.3 million) in respect of future share options (note 21). The short term loan with CNR International (U.K.) Developments Limited expired on 31 December 2023. Interest was generated at a rate of Secured Overnight Financing Rate (SOFR) + 1.75%.<br><br>Amounts owed by group undertakings are unsecured and repayable on demand. The GBP short term loan with Canadian Natural Resources Limited generates interest at a rate of Sterling Overnight Index Average (SONIA) + 1.15% per annum. Trading balances are interest free. Management considered the expected credit loss on amounts due from Group undertakings at 31 Dec...</code> | <code>for the year ended 31 December 2022<br><br>The share options creditor relates to amounts payable to the ultimate parent Canadian Natural Resources Limited relating to employee options to purchase stock in the aforementioned company. The provision is based on the specifics of the agreed plan with Canadian Natural Resources Limited. The current portion of this totalling £4.3 million (2021 - £3.7 million) is included in creditor amounts falling due within one year.<br><br>34<br><br>The Company settled an intercompany term loan on 24 October 2022. The amount drawn down by the Company at 24 October 2022 was US$440.0 million (2021 - US$440.0 million). 
Interest was charged at US$ LIBOR + 2.8175% per annum on amounts drawn down from this facility until settlement.<br><br>2022 (£'000) 2021 (£'000) Share options 3,550 4,276 Lease liabilities (note 13) 4,786 6,428 Total 8,336 10,704<br><br>The short term loan with CNR International (U.K.) Developments Limited generates interest at a rate of USS LIBOR + 1.75%.<br><br>18. CREDITORS: ...</code> | <code>18 Cash and cash equivalents<br><br>Group 31 Dec 2021 (£000) Group 31 Dec 2020 (£000) Company 31 Dec 2021 (£000) Company 31 Dec 2020 (£000) Sterling 19,641 37,349 19,498 37,347 United States Dollar 8,574 6,826 — — Euros 361 — — — Canadian Dollar 227 364 — — Polish Zloty 225 283 — — Singapore Dollar 29 — — — Japanese Yen 7 — — — Total 29,064 44,822 19,498 37,347<br><br>Cash and cash equivalents are denominated in the following currencies:<br><br>On 14 October 2021, the Group and Company entered into a loan agreement with Bank Of Ireland Group plc consisting of a £10 million term loan in addition to a revolving credit facility of £10 million. The loan is secured on the assets of the Group. Operating covenants are limited to the Group’s net debt leverage and interest cover. The term loan is repayable over five years with an initial 12-month repayment holiday followed by annual capital repayments of £1,250,000. At the end of the term, a bullet payment of £5 million is due. The loan is denominated in Pound S...</code> | <code></code> | <code></code> | <code></code> | <code></code> | <code></code> |
| <code>What is the amount for Charges à imputer relative au personnel?</code> | <code>Ventilation de la rubrique 492/3 du passif si celle-ci représente un montant important<br><br>COMPTES DE RÉGULARISATION<br><br>Description Montant Charges à imputer relative au personnel 128.084.824,70 Charges à imputer: intérêts courus et non échus 37.536.987,21 Charges à imputer diverses 12.774.405,94 Charges à imputer: protocoles et conventions avec autres opérateurs et réseaux 7.240.507,84 Produits à reporter divers 7.841.755,07 Produits à reporter relatifs au trafic 116.123.554,80 Produis à reporter: Hello Belgium Railpass 21.396.182,64 Produis à reporter: NPV 20.088.866,29 Produits à reporter: financements alternatifs 15.285.835,38<br><br>72<br><br>First - C-Cap2022 - 38 / 82</code> | <code>```markdown N° 0203.430.576 C-Cap 6.9<br><br>Charges à imputer diverses Exercice Charges à imputer: protocoles et conventions avec autres opérateurs et reseaux 2.101.406,29 Charges à imputer relatives au personnel 1.378.996,29 Charges à imputer: intérêts courus et non échus 139.424.041,35 Produits à reporter divers 39.103.099,47 Produits à reporter relatifs au trafic 6.816.893,14 Produits à reporter: financements altematifs 146.853.054,23 Produis à reporter: NPV 9.576.751,43 15.185.940,48<br><br>71 Rapport annuel SNCB 2023 ```<br><br>COMPTES DE RÉGULARISATION<br><br>Ventilation de la rubrique 492/3 du passif si celle-ci représente un montant important</code> | <code>DETTES FISCALES, SALARIALES ET SOCIALES (€)<br><br>CODES BNB 2024 VENTILATION DE LA RUBRIQUE 492/3 DU PASSIF SI CELLE-CI REPRÉSENTE UN MONTANT IMPORTANT Intérêts crédits KBC à imputer Autres charges à imputer<br><br>ÉTAT DES DETTES (€)<br><br>COMPTES DE REGULARISATION (€)<br><br>12.4 Etat des dettes et comptes de régularisation du passif<br><br>CODES BNB 2024 VENTILATION DES DETTES A L'ORIGINE A PLUS D'UN AN, EN FONCTION DE LEUR DUREE RESDIEUELLE Dettes à plus d'un an échéant dans l'année Dettes financières 8801 
Etablissements de crédit 8841 Total des dettes à plus d'un an échéant dans l'année (42) Dettes ayant plus d'un an mais 5 ans au plus à courir Dettes financières 8802 Etablissement de crédit 8842 Total des dettes ayant plus d'un an mais 5 ans au plus à courir 8912<br><br>12.5 Résultats d'exploitation (en milliers €)<br><br>DETTES GARANTIES (€)<br><br>CODES BNB 2024 Rémunérations et charges sociales Autres dettes salariales et sociales 9077<br><br>```markdown<br><br>CODES BNB 2023 2024 CHARGES D'EXPLOITATION Travailleurs pour lesquels la ...</code> | <code>Résultats financiers Charges financières récurrentes Ventilation des autres charges financières<br><br>11 Dettes fiscales, salariales et sociales<br><br>2022 2021 Impôts (rubriques 450/3 et 179 du passif) Dettes fiscales échues - - Dettes fiscales non échues - - Dettes fiscales estimées - - Rémunérations et charges sociales (rubriques 454/9 et 179 du passif) Dettes échues envers l'Office National de Sécurité Sociale - - Autres dettes salariales et sociales 28.500 25.000<br><br>FINANCIÈRE DE TUBIZE – RAPPORT FINANCIER ANNUEL 2022<br><br>Comptes de régularisation<br><br>FINANCIÈRE DE TUBIZE - RAPPORT ANNUEL 2022 40 41<br><br>2022 2021 Ventilation de la rubrique 492/3 du passif si celle-ci représente un montant important Charges à imputer: intérêts 345.843 40.556 Charges à imputer: commission de réservation 107.126 76.667<br><br>2022 2021 Charges d'exploitation Travailleurs pour lesquels la société a introduit une déclaration DIMONA ou qui sont inscrits au registre général du personnel - - Nombre total à la date de clôture - - Ef...</code> | <code>ANNEXES DES COMPTES DE LA SOCIÉTÉ AU 31 MAI 2025<br><br>SITUATION FISCALE DIFFEREE ET LATENTE<br><br>\| Accroissements de la dette future d'impôt \| Montant \|<br>\|---\|---\|<br>\| Impôt dû sur provisions réglementées: \| \|<br>\| Provisions pour hausse de prix \| \|<br>\| Provisions pour fluctuation des cours \| \|<br>\| Provisions pour investissements \| \|<br>\| 
Amortissements dérogatoires \| 3 548 \|<br>\| Subventions d'investissement \| \|<br>\| TOTAL ACCROISSEMENTS \| 3 548 \|<br><br>\| Allègements de la dette future d'impôt \| Montant \|<br>\|---\|---\|<br>\| Impôt payé d'avance sur: \| \|<br>\| Charges non déductibles temporairement (à déduire l'année suivante) \| 1 200 \|<br>\| Congés payés \| \|<br>\| Participation des salariés \| 53 \|<br>\| Autres \| \|<br>\| A déduire ultérieurement \| \|<br>\| Provisions pour propre assureur \| \|<br>\| Autres \| \|<br>\| TOTAL ALLÈGEMENTS \| 1 254 \|<br><br>SITUATION FISCALE DIFFÉRÉE NETTE<br>2 294<br><br>IMPÔT DÙ SUR: Plus-values différées<br>43 436<br><br>CREDIT A IMPUTER SUR: Déficits reportables<br><br>CREDIT A IMPUTER SUR: Moins-values à long terme<br><br>SITUATION FISCALE LATENTE NETT...</code> | <code>d'un mois au plus<br><br>Titres à revenu fixe émis par des établissements de crédit<br><br>AUTRES PLACEMENTS DE TRÉSORERIE<br><br>Autres placements de trésorerie non repris ci-avant<br><br>Codes Exercice Exercice précédent 51 8681 8682 8683 52 18.088.896,34 55.716.775,84 8684 53 226.519.496,99 136.305.308,18 8686 85.000.000,00 801.112,99 8687 1.599.690,84 8688 139.919.806,15 135.504.195,19 8689<br><br>Actions, parts et placements autres que placements à revenu fixe<br><br>Titres à revenu fixe<br><br>Actions et parts - Montant non appelé<br><br>Avec une durée résiduelle ou de préavis<br><br>Métaux précieux et œuvres d'art<br><br>Ventilation de la rubrique 490/1 de l'actif si celle-ci représente un montant important<br><br>PLACEMENTS DE TRÉSORERIE ET COMPTES DE RÉGULARISATION DE L'ACTIF<br><br>64 Rapport annuel SNCB 2023<br><br>Actions et parts - Valeur comptable augmentée du montant non appelé<br><br>Comptes à terme détenus auprès des établissements de crédit<br><br>de plus d'un mois à un an au plus<br><br>COMPTES DE RÉGULARISATION<br><br>Exercice Charges à reporter: redevance infrastru...</code> | <code>4.6. 
Accroissements et allégements de la dette future d'impôt<br><br>Les éléments entraînant un décalage d'imposition conduisent à un accroissement de la dette future d'impôt de 21 278K€ calculé au taux de 25.82%.<br><br>La situation fiscale latente s'analyse comme suit :<br><br>\| Base de calcul \| Montants en K€ \|<br>\|---\|---\|<br>\| BASE D'IMPOT SUR : \| \|<br>\| Provisions réglementées : \| \|<br>\| - Ecart de conversion Actif \| 0 \|<br>\| - Ecart de conversion Passif \| -4 \|<br>\| - Provision pour investissements \| \|<br>\| - Amortissements dérogatoires \| 94 455 \|<br>\| Subventions d'investissement \| 3 283 \|<br>\| Produits non imposables temporairement : \| \|<br>\| (à réintégrer l'année de leur acquisition) \| \|<br>\| - plafonnement TP \| \|<br>\| **TOTAL ACCROISSEMENTS** \| **97 734** \|<br>\| BASE D'IMPOT PAYE D'AVANCE SUR : \| \|<br>\| Charges non déductibles temporairement : \| \|<br>\| (à déduire l'année suivante) \| \|<br>\| - Provision pour risques et charges \| -928 \|<br>\| - Provision pour participation \| -4 083 \|<br>\| - Contribution solidarité \| -869 \|<br>\| - Provisions pou...</code> | <code></code> |
* Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters:
```json
{
"loss": "SparseMultipleNegativesRankingLoss(scale=1.0, similarity_fct='dot_score', gather_across_devices=False)",
"document_regularizer_weight": 3e-05,
"query_regularizer_weight": 0.0
}
```
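SpladeLoss combines the ranking loss with a FLOPS-style sparsity regularizer (Paria et al., 2020, cited below) applied to the batch's sparse activations, weighted by the `document_regularizer_weight` and `query_regularizer_weight` above. As an illustration (not the library's implementation), the FLOPS term can be sketched in plain Python:

```python
def flops_regularizer(activations):
    """FLOPS sparsity term: for each vocabulary dimension, take the mean
    absolute activation across the batch, square it, and sum over dimensions.
    Dimensions that stay active across many examples are penalized
    quadratically, pushing representations toward sparsity."""
    n_rows = len(activations)
    n_dims = len(activations[0])
    total = 0.0
    for j in range(n_dims):
        mean_abs = sum(abs(row[j]) for row in activations) / n_rows
        total += mean_abs ** 2
    return total

# Weights from this model's SpladeLoss config: only document-side
# activations are regularized (query weight is 0.0).
document_regularizer_weight = 3e-05
query_regularizer_weight = 0.0

doc_batch = [[1.0, 0.0, 2.0],   # toy document activations over 3 vocab dims
             [1.0, 2.0, 0.0]]
penalty = document_regularizer_weight * flops_regularizer(doc_batch)
```

The penalty is added to the `SparseMultipleNegativesRankingLoss` term during training; with a query weight of 0.0, query sparsity is driven only by the model architecture, not by the loss.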
### Training Hyperparameters
#### Non-Default Hyperparameters
- `num_train_epochs`: 2
- `learning_rate`: 2e-05
- `warmup_steps`: 114
- `weight_decay`: 0.01
- `gradient_accumulation_steps`: 4
- `bf16`: True
- `tf32`: True
- `eval_strategy`: steps
- `dataloader_num_workers`: 4
- `batch_sampler`: no_duplicates
- `router_mapping`: {'query': 'query', 'positive': 'document', 'negative_0': 'document', 'negative_1': 'document', 'negative_2': 'document', 'negative_3': 'document', 'negative_4': 'document', 'negative_5': 'document', 'negative_6': 'document'}
- `learning_rate_mapping`: {'sub_modules\\.query\\..*': 0.001}
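The `router_mapping` above routes queries and documents through separate sub-modules, and `learning_rate_mapping` assigns parameters whose names match the regex `sub_modules\.query\..*` their own learning rate (0.001) while all other parameters use the base 2e-05. A small illustrative sketch of that name-to-rate resolution, using hypothetical parameter names (the library's internal matching may differ in detail):

```python
import re

BASE_LR = 2e-05  # base learning_rate from the hyperparameters above
LEARNING_RATE_MAPPING = {r"sub_modules\.query\..*": 1e-3}

def resolve_lr(param_name: str) -> float:
    """Return the learning rate for a parameter: the first regex in the
    mapping that matches the name wins; otherwise fall back to the base rate."""
    for pattern, lr in LEARNING_RATE_MAPPING.items():
        if re.match(pattern, param_name):
            return lr
    return BASE_LR

resolve_lr("sub_modules.query.mlm_head.weight")     # query-side: 0.001
resolve_lr("sub_modules.document.mlm_head.weight")  # document-side: 2e-05
```

This lets the query-side router train roughly 50× faster than the shared document-side weights. Note also that with `per_device_train_batch_size` 8 and `gradient_accumulation_steps` 4, each optimizer step sees an effective batch of 32 examples per device.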
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `per_device_train_batch_size`: 8
- `num_train_epochs`: 2
- `max_steps`: -1
- `learning_rate`: 2e-05
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: None
- `warmup_steps`: 114
- `optim`: adamw_torch_fused
- `optim_args`: None
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `optim_target_modules`: None
- `gradient_accumulation_steps`: 4
- `average_tokens_across_devices`: True
- `max_grad_norm`: 1.0
- `label_smoothing_factor`: 0.0
- `bf16`: True
- `fp16`: False
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `use_cache`: False
- `neftune_noise_alpha`: None
- `torch_empty_cache_steps`: None
- `auto_find_batch_size`: False
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `include_num_input_tokens_seen`: no
- `log_level`: passive
- `log_level_replica`: warning
- `disable_tqdm`: False
- `project`: huggingface
- `trackio_space_id`: trackio
- `eval_strategy`: steps
- `per_device_eval_batch_size`: 8
- `prediction_loss_only`: True
- `eval_on_start`: False
- `eval_do_concat_batches`: True
- `eval_use_gather_object`: False
- `eval_accumulation_steps`: None
- `include_for_metrics`: []
- `batch_eval_metrics`: False
- `save_only_model`: False
- `save_on_each_node`: False
- `enable_jit_checkpoint`: False
- `push_to_hub`: False
- `hub_private_repo`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_always_push`: False
- `hub_revision`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `restore_callback_states_from_checkpoint`: False
- `full_determinism`: False
- `seed`: 42
- `data_seed`: None
- `use_cpu`: False
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `parallelism_config`: None
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 4
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `dataloader_prefetch_factor`: None
- `remove_unused_columns`: True
- `label_names`: None
- `train_sampling_strategy`: random
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `ddp_backend`: None
- `ddp_timeout`: 1800
- `fsdp`: []
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `deepspeed`: None
- `debug`: []
- `skip_memory_metrics`: True
- `do_predict`: False
- `resume_from_checkpoint`: None
- `warmup_ratio`: None
- `local_rank`: -1
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {'query': 'query', 'positive': 'document', 'negative_0': 'document', 'negative_1': 'document', 'negative_2': 'document', 'negative_3': 'document', 'negative_4': 'document', 'negative_5': 'document', 'negative_6': 'document'}
- `learning_rate_mapping`: {'sub_modules\\.query\\..*': 0.001}
</details>
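With `lr_scheduler_type: linear` and `warmup_steps: 114`, the learning rate ramps linearly from 0 to 2e-05 over the first 114 optimizer steps, then decays linearly toward 0 by the final step (around 1140, per the training logs). A sketch of that schedule, assuming 1140 total steps:

```python
def linear_schedule_lr(step, base_lr=2e-05, warmup_steps=114, total_steps=1140):
    """transformers-style 'linear' schedule: linear warmup from 0 to base_lr,
    then linear decay from base_lr down to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

Warmup here covers roughly the first 10% of training, which matches common practice for fine-tuning with AdamW.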
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | NanoNFCorpus_dot_ndcg@10 | NanoSciFact_dot_ndcg@10 | NanoFiQA2018_dot_ndcg@10 | NanoBEIR_mean_dot_ndcg@10 |
|:------:|:----:|:-------------:|:------------------------:|:-----------------------:|:------------------------:|:-------------------------:|
| 0.0175 | 10 | 2.5846 | - | - | - | - |
| 0.0351 | 20 | 2.4596 | - | - | - | - |
| 0.0526 | 30 | 2.5787 | - | - | - | - |
| 0.0701 | 40 | 2.2135 | - | - | - | - |
| 0.0877 | 50 | 2.1444 | - | - | - | - |
| 0.1052 | 60 | 2.0011 | - | - | - | - |
| 0.1228 | 70 | 1.8179 | - | - | - | - |
| 0.1403 | 80 | 1.7744 | - | - | - | - |
| 0.1578 | 90 | 1.7054 | - | - | - | - |
| 0.1754 | 100 | 1.5427 | - | - | - | - |
| 0.1929 | 110 | 1.6134 | - | - | - | - |
| 0.2104 | 120 | 1.6381 | - | - | - | - |
| 0.2280 | 130 | 1.6946 | - | - | - | - |
| 0.2455 | 140 | 1.4456 | - | - | - | - |
| 0.2630 | 150 | 1.4302 | - | - | - | - |
| 0.2806 | 160 | 1.3097 | - | - | - | - |
| 0.2981 | 170 | 1.5755 | - | - | - | - |
| 0.3157 | 180 | 1.2906 | - | - | - | - |
| 0.3332 | 190 | 1.3424 | - | - | - | - |
| 0.3507 | 200 | 1.5477 | - | - | - | - |
| 0.3683 | 210 | 1.3442 | - | - | - | - |
| 0.3858 | 220 | 1.2810 | - | - | - | - |
| 0.4033 | 230 | 1.3157 | - | - | - | - |
| 0.4209 | 240 | 1.2839 | - | - | - | - |
| 0.4384 | 250 | 1.2428 | - | - | - | - |
| 0.4559 | 260 | 1.2376 | - | - | - | - |
| 0.4735 | 270 | 1.1353 | - | - | - | - |
| 0.4910 | 280 | 1.2513 | - | - | - | - |
| 0.5085 | 290 | 1.0490 | - | - | - | - |
| 0.5261 | 300 | 1.0669 | - | - | - | - |
| 0.5436 | 310 | 1.2219 | - | - | - | - |
| 0.5612 | 320 | 1.0313 | - | - | - | - |
| 0.5787 | 330 | 1.2846 | - | - | - | - |
| 0.5962 | 340 | 1.0939 | - | - | - | - |
| 0.6138 | 350 | 1.0299 | - | - | - | - |
| 0.6313 | 360 | 0.6464 | - | - | - | - |
| 0.6488 | 370 | 0.7067 | - | - | - | - |
| 0.6664 | 380 | 0.5505 | - | - | - | - |
| 0.6839 | 390 | 0.6885 | - | - | - | - |
| 0.7014 | 400 | 0.8663 | - | - | - | - |
| 0.7190 | 410 | 0.8602 | - | - | - | - |
| 0.7365 | 420 | 0.5517 | - | - | - | - |
| 0.7541 | 430 | 0.3781 | - | - | - | - |
| 0.7716 | 440 | 0.6533 | - | - | - | - |
| 0.7891 | 450 | 1.1145 | - | - | - | - |
| 0.8067 | 460 | 0.3240 | - | - | - | - |
| 0.8242 | 470 | 0.5818 | - | - | - | - |
| 0.8417 | 480 | 0.3394 | - | - | - | - |
| 0.8593 | 490 | 0.8986 | - | - | - | - |
| 0.8768 | 500 | 0.6177 | 0.3695 | 0.7388 | 0.3862 | 0.4982 |
| 0.8943 | 510 | 0.8443 | - | - | - | - |
| 0.9119 | 520 | 0.5454 | - | - | - | - |
| 0.9294 | 530 | 0.9840 | - | - | - | - |
| 0.9470 | 540 | 0.6111 | - | - | - | - |
| 0.9645 | 550 | 0.7095 | - | - | - | - |
| 0.9820 | 560 | 0.8391 | - | - | - | - |
| 0.9996 | 570 | 0.6461 | - | - | - | - |
| 1.0158 | 580 | 1.3053 | - | - | - | - |
| 1.0333 | 590 | 0.9817 | - | - | - | - |
| 1.0509 | 600 | 1.0531 | - | - | - | - |
| 1.0684 | 610 | 0.9087 | - | - | - | - |
| 1.0859 | 620 | 0.9186 | - | - | - | - |
| 1.1035 | 630 | 1.0373 | - | - | - | - |
| 1.1210 | 640 | 0.9417 | - | - | - | - |
| 1.1385 | 650 | 0.9963 | - | - | - | - |
| 1.1561 | 660 | 0.9058 | - | - | - | - |
| 1.1736 | 670 | 0.9252 | - | - | - | - |
| 1.1911 | 680 | 1.0170 | - | - | - | - |
| 1.2087 | 690 | 0.9957 | - | - | - | - |
| 1.2262 | 700 | 0.8720 | - | - | - | - |
| 1.2438 | 710 | 0.8776 | - | - | - | - |
| 1.2613 | 720 | 0.8562 | - | - | - | - |
| 1.2788 | 730 | 0.8772 | - | - | - | - |
| 1.2964 | 740 | 0.9591 | - | - | - | - |
| 1.3139 | 750 | 0.9495 | - | - | - | - |
| 1.3314 | 760 | 0.9933 | - | - | - | - |
| 1.3490 | 770 | 0.8449 | - | - | - | - |
| 1.3665 | 780 | 0.7833 | - | - | - | - |
| 1.3840 | 790 | 0.9574 | - | - | - | - |
| 1.4016 | 800 | 0.7727 | - | - | - | - |
| 1.4191 | 810 | 0.8997 | - | - | - | - |
| 1.4367 | 820 | 0.8796 | - | - | - | - |
| 1.4542 | 830 | 0.8535 | - | - | - | - |
| 1.4717 | 840 | 1.0049 | - | - | - | - |
| 1.4893 | 850 | 0.8912 | - | - | - | - |
| 1.5068 | 860 | 0.9883 | - | - | - | - |
| 1.5243 | 870 | 0.7190 | - | - | - | - |
| 1.5419 | 880 | 0.9274 | - | - | - | - |
| 1.5594 | 890 | 0.8372 | - | - | - | - |
| 1.5769 | 900 | 0.7986 | - | - | - | - |
| 1.5945 | 910 | 0.7205 | - | - | - | - |
| 1.6120 | 920 | 0.5797 | - | - | - | - |
| 1.6295 | 930 | 0.6741 | - | - | - | - |
| 1.6471 | 940 | 0.5253 | - | - | - | - |
| 1.6646 | 950 | 0.1963 | - | - | - | - |
| 1.6822 | 960 | 0.4864 | - | - | - | - |
| 1.6997 | 970 | 0.7439 | - | - | - | - |
| 1.7172 | 980 | 0.6164 | - | - | - | - |
| 1.7348 | 990 | 0.3680 | - | - | - | - |
| 1.7523 | 1000 | 0.5521 | 0.3775 | 0.7393 | 0.4401 | 0.5190 |
| 1.7698 | 1010 | 0.2149 | - | - | - | - |
| 1.7874 | 1020 | 0.5544 | - | - | - | - |
| 1.8049 | 1030 | 0.8062 | - | - | - | - |
| 1.8224 | 1040 | 0.2349 | - | - | - | - |
| 1.8400 | 1050 | 0.5362 | - | - | - | - |
| 1.8575 | 1060 | 0.8963 | - | - | - | - |
| 1.8751 | 1070 | 0.5910 | - | - | - | - |
| 1.8926 | 1080 | 0.3764 | - | - | - | - |
| 1.9101 | 1090 | 0.5331 | - | - | - | - |
| 1.9277 | 1100 | 1.0374 | - | - | - | - |
| 1.9452 | 1110 | 0.6087 | - | - | - | - |
| 1.9627 | 1120 | 0.4690 | - | - | - | - |
| 1.9803 | 1130 | 0.4651 | - | - | - | - |
| 1.9978 | 1140 | 0.5315 | - | - | - | - |
</details>
### Framework Versions
- Python: 3.11.10
- Sentence Transformers: 5.2.3
- Transformers: 5.2.0
- PyTorch: 2.10.0+cu128
- Accelerate: 1.12.0
- Datasets: 4.5.0
- Tokenizers: 0.22.2
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### SpladeLoss
```bibtex
@misc{formal2022distillationhardnegativesampling,
title={From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
author={Thibault Formal and Carlos Lassance and Benjamin Piwowarski and Stéphane Clinchant},
year={2022},
eprint={2205.04733},
archivePrefix={arXiv},
primaryClass={cs.IR},
url={https://arxiv.org/abs/2205.04733},
}
```
#### SparseMultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
#### FlopsLoss
```bibtex
@article{paria2020minimizing,
title={Minimizing flops to learn efficient sparse representations},
author={Paria, Biswajit and Yeh, Chih-Kuan and Yen, Ian EH and Xu, Ning and Ravikumar, Pradeep and P{\'o}czos, Barnab{\'a}s},
journal={arXiv preprint arXiv:2004.05665},
year={2020}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |