SentenceTransformer based on intfloat/multilingual-e5-large

This is a sentence-transformers model finetuned from intfloat/multilingual-e5-large on the json dataset. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: intfloat/multilingual-e5-large
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Model Size: ~560M parameters (BF16 safetensors)
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • json

Model Sources

  • Documentation: Sentence Transformers Documentation (https://sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: PeftModelForFeatureExtraction 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
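
In other words, token embeddings are mean-pooled over non-padding positions (pooling_mode_mean_tokens) and then L2-normalized, so dot products between outputs equal cosine similarities. As an illustrative sketch (not the library's internal code), the two stages can be reproduced by hand with transformers; the base model is used here because the fine-tuned checkpoint wraps a PEFT adapter, which this plain loading path does not cover:

import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large")
model = AutoModel.from_pretrained("intfloat/multilingual-e5-large")

batch = tokenizer(
    ["query: lump sum payment for outplacement services"],
    max_length=512, padding=True, truncation=True, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 1024)

# (1) Pooling: mean over non-padding tokens (pooling_mode_mean_tokens=True)
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# (2) Normalize: unit length, so dot product == cosine similarity
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 1024])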

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("wwydmanski/e5-large-legal-v0.1")
# Run inference
sentences = [
    'query: What is the lump sum payment for professional outplacement services that the Executive will receive?',
    'be entitled to receive continuing group medical coverage for himself and his dependents (on a non-taxable basis, including if necessary, payment of any gross-up payments necessary to result in net non-taxable benefits), which coverage is not materially less favorable to the Executive than the group medical coverage which was provided to the Executive by the Company or its affiliates immediately prior to the Termination Date. To the extent applicable and to the extent permitted by law, any continuing coverage provided to the Executive and/or his dependents pursuant to this subparagraph (iii) shall be considered part of, and not in addition to, any coverage required under COBRA. (iv) The Executive will be provided with a lump sum payment of $12,000 for professional outplacement services. Notice by the Company that',
    'vested amounts, if any, to which the Executive is entitled under the Savings Plan as of the Date of Termination, the Company will pay the Executive, in accordance with Section3.04, a lump sum amount equal to the value of the unvested portion, if any, of the employer matching and fixed contributions (and attributable earnings) credited to the Executive under the Savings Plan. 8 -------------------------------------------------------------------------------- (f) Outplacement Services. For a period not to exceed six (6)months following the Date of Termination, the Company will provide the Executive with reasonable outplacement services consistent with past practices of the Company prior to the Change in Control or, if no past practice has been established prior to the Change in Control, consistent with the prevailing practice in the medical device manufacturing industry.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
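
Note that, as in the training data below, only queries carry the "query: " prefix inherited from the E5 family, while passages are encoded as-is. A minimal retrieval-style sketch along the same lines (the query and document strings here are made up for illustration):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("wwydmanski/e5-large-legal-v0.1")

query = "query: What outplacement services does the agreement provide?"
documents = [
    "The Executive will be provided with a lump sum payment of $12,000 for professional outplacement services.",
    "The Company will provide the Executive with reasonable outplacement services for a period not to exceed six months.",
]

query_embedding = model.encode(query)
document_embeddings = model.encode(documents)

# Embeddings are unit-normalized, so model.similarity returns cosine scores
scores = model.similarity(query_embedding, document_embeddings)  # shape (1, 2)
best = scores.argmax().item()
print(scores, documents[best])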

Training Details

Training Dataset

json

  • Dataset: json
  • Size: 32,378 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    • anchor: string, min: 12 tokens, mean: 25.6 tokens, max: 47 tokens
    • positive: string, min: 7 tokens, mean: 190.52 tokens, max: 512 tokens
    • negative: string, min: 53 tokens, mean: 202.25 tokens, max: 485 tokens
  • Samples:
    • Sample 1:
      • anchor: query: What is the effective date of the Fifth Amendment to the Approach Resources Inc. 2007 Stock Incentive Plan?
      • positive: Exhibit 10.1 FIFTH AMENDMENT TO THE APPROACH RESOURCES INC. 2007 STOCK INCENTIVE PLAN This Fifth Amendment (the “Fifth Amendment”) to the Approach Resources Inc. 2007 Stock Incentive Plan, as amended from time to time (the “Plan”), is made effective as of June 2, 2016 (the “Amendment Effective Date”), by Approach Resources Inc., a Delaware corporation (“Approach”), subject to approval by Approach’s stockholders. W I T N E S S E T H: WHEREAS, Approach established the Plan, originally effective as of June 28, 2007 and most recently amended effective March 2, 2016, under which Approach is authorized to grant equity-based incentive awards to certain employees and service providers of Approach and its subsidiaries; WHEREAS, Section 14.1 of the Plan provides that Approach’s board of directors (the “Board”) may
      • negative: Exhibit 10.39 AMENDMENT TO THE BPZ RESOURCES, INC. 2007 LONG-TERM INCENTIVE COMPENSATION PLAN WHEREAS, BPZ Resources,Inc. (the “Company”) adopted and maintains the BPZ Energy,Inc. 2007 Long-Term Incentive Compensation Plan (the “Plan”), effective as of June4, 2007, to provide an opportunity for its eligible employees and certain independent contractors to earn long term incentive awards in consideration for their services; WHEREAS, the Company now desires to amend the Plan to reserve additional shares for issuance under the Plan. NOW THEREFORE, effective as of June 20, 2014, the Plan is hereby amended by replacing Section7(a)with the following new Section7(a)that shall read as follows: “(a) Maximum Shares. Subject to adjustment as provided in this Section 7, there is hereby reserved for issuance under the Plan up to 12,000,000 shares of Stock
    • Sample 2:
      • anchor: query: What is the date on which the Company accepted the subscription?
      • positive: to acceptance by the Company, the undersigned has completed this Subscription Agreement to evidence his/her/its subscription for participation in the securities of the Company, this ____th day of _________ 2013. Subscriber Printed name If an entity, on behalf of: Subscriber’s position with entity: The Company has accepted this subscription this ____ day of _________ 2012. OverNear, Inc. By Its: Printed Name: Page11 of 19 Subscription Agreement OverNear, Inc. -------------------------------------------------------------------------------- Subscription Documents - Continued OVERNEAR, INC. (THE “COMPANY”) INVESTOR APPLICATION (QUALIFICATION QUESTIONNAIRE) (CONFIDENTIAL) ALL INFORMATION CONTAINED IN THIS APPLICATION WILL BE TREATED CONFIDENTIALLY. The undersigned understands, however, that the Company may present this application to such parties as the Company, in his discretion, deems appropriate when called upon to establish that the proposed offer and sale of the Securities are exempt
      • negative: and each Subscriber is executing and delivering this agreement in reliance upon the exemption from securities registration afforded by Section 4(2) of the Securities Act and Rule 506 of Regulation D as promulgated by the SEC under the Securities Act; and WHEREAS the subscription for the Securities will be made in accordance with and subject to the terms and conditions of this Subscription Agreement and the Company's Confidential Private Placement Memorandum dated January 28, 2014 together with all amendments thereof and supplements and exhibits thereto and as such may be amended from time to time (the "Memorandum"); and WHEREAS, the Subscriber desires to purchase such number of shares of Common Stock (together with the associated Warrants) as set forth on the signature page hereof on the terms and
    • Sample 3:
      • anchor: query: What percentage of common shares must an entity own to be considered an Acquiring Person under the Rights Agreement?
      • positive: the mutual agreements herein set forth, the parties agree as follows: Section1. Amendment to Section1.1. Section1.1 of the Rights Agreement is amended to read in its entirety as follows: “1.1 “Acquiring Person” shall mean any Person (as such term is hereinafter defined) who or which, together with all Affiliates and Associates (as such terms are hereinafter defined) of such Person, shall be the Beneficial Owner (as such term is hereinafter defined) of 15% or more of the Common Shares of the Company then outstanding, but shall not include: (i) the Company; (ii) any Subsidiary of the Company; (iii) any employee benefit plan of the Company or of any Subsidiary of the Company or any entity holding shares of capital stock of the Company for or pursuant to the
      • negative: of more than 25% of the Common Shares outstanding immediately prior to the distribution, and in making this determination the Common Shares to be issued to such Person in the distribution shall be deemed to be held by such Person but shall not be included in the aggregate number of outstanding Common Shares immediately prior to the distribution ("Exempt Acquisitions"); the acquisition of Common Shares upon the exercise of Convertible Securities received by such Person pursuant to a Permitted Bid Acquisition, an Exempt Acquisition or a Pro Rata Acquisition (as defined below) ("Convertible Security Acquisitions"); or acquisitions as a result of a stock dividend, a stock split or other event pursuant to which such Person receives or acquires Common Shares or Convertible Securities on the same pro rata
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
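
With this loss, every anchor is scored against its own positive, all other in-batch positives, and all negatives; a softmax cross-entropy then pushes the true pair to the top, with the cosine similarities multiplied by the scale of 20.0 first. A minimal sketch of that computation for one batch of unit-normalized embeddings (illustrative, not the library's implementation):

import torch
import torch.nn.functional as F

def mnr_loss(anchors, positives, negatives, scale=20.0):
    # Candidate pool for every anchor: all in-batch positives plus all hard negatives.
    candidates = torch.cat([positives, negatives], dim=0)   # (2B, dim)
    scores = scale * anchors @ candidates.T                 # scaled cosine similarities
    labels = torch.arange(anchors.size(0))                  # anchor i should match positive i
    return F.cross_entropy(scores, labels)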
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 16
  • gradient_accumulation_steps: 8
  • learning_rate: 0.0001
  • num_train_epochs: 1
  • lr_scheduler_type: cosine_with_restarts
  • warmup_ratio: 0.1
  • bf16: True
  • batch_sampler: no_duplicates
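
Combined with the dataset and loss above, these settings correspond to a training run along the following lines. This is a minimal sketch: train.json is a hypothetical path to the (anchor, positive, negative) data, and the PEFT/LoRA adapter setup implied by the PeftModelForFeatureExtraction wrapper in the architecture is omitted:

from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("intfloat/multilingual-e5-large")
# Expects columns: anchor, positive, negative
train_dataset = load_dataset("json", data_files="train.json", split="train")

loss = MultipleNegativesRankingLoss(model, scale=20.0)

args = SentenceTransformerTrainingArguments(
    output_dir="e5-large-legal",
    num_train_epochs=1,
    per_device_train_batch_size=16,
    gradient_accumulation_steps=8,   # effective batch size of 128
    learning_rate=1e-4,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    bf16=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss
)
trainer.train()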

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 8
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 8
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 0.0001
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • prompts: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss
0.0040 1 2.1317
0.0079 2 2.1656
0.0119 3 2.0907
0.0158 4 2.1018
0.0198 5 2.2049
0.0237 6 2.133
0.0277 7 2.1612
0.0316 8 2.1797
0.0356 9 2.0282
0.0395 10 2.0335
0.0435 11 1.953
0.0474 12 1.9439
0.0514 13 1.8734
0.0553 14 1.9584
0.0593 15 1.7648
0.0632 16 1.8349
0.0672 17 1.7773
0.0711 18 1.7721
0.0751 19 1.6587
0.0791 20 1.5767
0.0830 21 1.4761
0.0870 22 1.4714
0.0909 23 1.4471
0.0949 24 1.3233
0.0988 25 1.2631
0.1028 26 1.1757
0.1067 27 1.0742
0.1107 28 1.0249
0.1146 29 1.1338
0.1186 30 0.965
0.1225 31 1.0061
0.1265 32 0.9607
0.1304 33 0.8747
0.1344 34 0.8163
0.1383 35 0.8643
0.1423 36 0.7803
0.1462 37 0.6848
0.1502 38 0.6727
0.1542 39 0.7509
0.1581 40 0.6364
0.1621 41 0.5834
0.1660 42 0.5821
0.1700 43 0.5909
0.1739 44 0.5541
0.1779 45 0.5548
0.1818 46 0.4847
0.1858 47 0.5016
0.1897 48 0.4626
0.1937 49 0.4327
0.1976 50 0.5319
0.2016 51 0.4769
0.2055 52 0.4741
0.2095 53 0.5004
0.2134 54 0.4693
0.2174 55 0.4328
0.2213 56 0.46
0.2253 57 0.472
0.2292 58 0.4168
0.2332 59 0.5311
0.2372 60 0.4115
0.2411 61 0.3221
0.2451 62 0.3585
0.2490 63 0.4406
0.2530 64 0.4495
0.2569 65 0.4487
0.2609 66 0.4563
0.2648 67 0.4414
0.2688 68 0.3646
0.2727 69 0.3844
0.2767 70 0.4201
0.2806 71 0.4278
0.2846 72 0.3262
0.2885 73 0.4403
0.2925 74 0.4391
0.2964 75 0.3564
0.3004 76 0.2476
0.3043 77 0.3881
0.3083 78 0.455
0.3123 79 0.3182
0.3162 80 0.4281
0.3202 81 0.3926
0.3241 82 0.3842
0.3281 83 0.4574
0.3320 84 0.3087
0.3360 85 0.3651
0.3399 86 0.3744
0.3439 87 0.4061
0.3478 88 0.3568
0.3518 89 0.3193
0.3557 90 0.3384
0.3597 91 0.3822
0.3636 92 0.3818
0.3676 93 0.4413
0.3715 94 0.3446
0.3755 95 0.3336
0.3794 96 0.3527
0.3834 97 0.3501
0.3874 98 0.3454
0.3913 99 0.3346
0.3953 100 0.3516
0.3992 101 0.3836
0.4032 102 0.3856
0.4071 103 0.3484
0.4111 104 0.2827
0.4150 105 0.2877
0.4190 106 0.365
0.4229 107 0.3424
0.4269 108 0.3468
0.4308 109 0.3962
0.4348 110 0.3328
0.4387 111 0.3916
0.4427 112 0.3186
0.4466 113 0.3315
0.4506 114 0.2868
0.4545 115 0.2986
0.4585 116 0.2903
0.4625 117 0.3371
0.4664 118 0.3204
0.4704 119 0.3594
0.4743 120 0.3615
0.4783 121 0.3611
0.4822 122 0.3388
0.4862 123 0.334
0.4901 124 0.3937
0.4941 125 0.3874
0.4980 126 0.3928
0.5020 127 0.3227
0.5059 128 0.3285
0.5099 129 0.2938
0.5138 130 0.401
0.5178 131 0.2996
0.5217 132 0.2548
0.5257 133 0.3076
0.5296 134 0.3449
0.5336 135 0.3977
0.5375 136 0.38
0.5415 137 0.3634
0.5455 138 0.3287
0.5494 139 0.332
0.5534 140 0.3111
0.5573 141 0.323
0.5613 142 0.32
0.5652 143 0.3831
0.5692 144 0.2635
0.5731 145 0.3777
0.5771 146 0.3701
0.5810 147 0.3251
0.5850 148 0.3246
0.5889 149 0.2807
0.5929 150 0.2726
0.5968 151 0.2527
0.6008 152 0.3566
0.6047 153 0.2611
0.6087 154 0.2831
0.6126 155 0.3591
0.6166 156 0.3237
0.6206 157 0.2818
0.6245 158 0.3863
0.6285 159 0.2499
0.6324 160 0.3633
0.6364 161 0.3356
0.6403 162 0.2561
0.6443 163 0.3032
0.6482 164 0.2511
0.6522 165 0.3402
0.6561 166 0.3838
0.6601 167 0.3171
0.6640 168 0.3001
0.6680 169 0.3474
0.6719 170 0.2721
0.6759 171 0.2755
0.6798 172 0.3078
0.6838 173 0.2617
0.6877 174 0.3669
0.6917 175 0.3094
0.6957 176 0.2802
0.6996 177 0.3803
0.7036 178 0.3262
0.7075 179 0.3241
0.7115 180 0.3132
0.7154 181 0.2579
0.7194 182 0.3221
0.7233 183 0.3497
0.7273 184 0.2853
0.7312 185 0.3576
0.7352 186 0.348
0.7391 187 0.2487
0.7431 188 0.2732
0.7470 189 0.3023
0.7510 190 0.2351
0.7549 191 0.2663
0.7589 192 0.2483
0.7628 193 0.3116
0.7668 194 0.2435
0.7708 195 0.3982
0.7747 196 0.3503
0.7787 197 0.3364
0.7826 198 0.2872
0.7866 199 0.3554
0.7905 200 0.352
0.7945 201 0.2781
0.7984 202 0.2604
0.8024 203 0.3174
0.8063 204 0.257
0.8103 205 0.2591
0.8142 206 0.2861
0.8182 207 0.3764
0.8221 208 0.3702
0.8261 209 0.2953
0.8300 210 0.2472
0.8340 211 0.3193
0.8379 212 0.2944
0.8419 213 0.373
0.8458 214 0.2736
0.8498 215 0.3392
0.8538 216 0.2611
0.8577 217 0.3074
0.8617 218 0.3041
0.8656 219 0.3103
0.8696 220 0.3111
0.8735 221 0.3066
0.8775 222 0.3117
0.8814 223 0.3109
0.8854 224 0.2266
0.8893 225 0.2774
0.8933 226 0.2816
0.8972 227 0.3015
0.9012 228 0.3339
0.9051 229 0.3166
0.9091 230 0.3214
0.9130 231 0.3425
0.9170 232 0.2001
0.9209 233 0.2849
0.9249 234 0.2981
0.9289 235 0.2695
0.9328 236 0.2568
0.9368 237 0.2672
0.9407 238 0.2554
0.9447 239 0.2786
0.9486 240 0.3506
0.9526 241 0.2983
0.9565 242 0.2254
0.9605 243 0.3054
0.9644 244 0.3031
0.9684 245 0.2216
0.9723 246 0.2185
0.9763 247 0.2781
0.9802 248 0.3696
0.9842 249 0.3164
0.9881 250 0.2713
0.9921 251 0.3063
0.9960 252 0.2969
1.0 253 0.2826

Framework Versions

  • Python: 3.12.3
  • Sentence Transformers: 3.3.1
  • Transformers: 4.44.2
  • PyTorch: 2.5.1
  • Accelerate: 1.2.1
  • Datasets: 2.19.0
  • Tokenizers: 0.19.1
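
To reproduce this environment, the versions above can be pinned directly (assuming standard PyPI package names; torch may require a platform-specific install):

pip install sentence-transformers==3.3.1 transformers==4.44.2 torch==2.5.1 accelerate==1.2.1 datasets==2.19.0 tokenizers==0.19.1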

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}