
SentenceTransformer based on cointegrated/rubert-tiny2

This is a sentence-transformers model finetuned from cointegrated/rubert-tiny2. It maps sentences & paragraphs to a 312-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: cointegrated/rubert-tiny2
  • Maximum Sequence Length: 2048 tokens
  • Output Dimensionality: 312 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: ~29.2M parameters (F32)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 2048, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 312, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
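The final Normalize() module L2-normalizes the pooled CLS embedding before it is returned, which is why cosine similarity and dot product produce identical scores for this model. A minimal numpy sketch with synthetic vectors standing in for pooled embeddings (not actual model outputs):

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.normal(size=(3, 312))                 # stand-in for pooled CLS embeddings
norms = np.linalg.norm(raw, axis=1, keepdims=True)
emb = raw / norms                               # what the Normalize() module emits

cos_raw = (raw @ raw.T) / (norms * norms.T)     # cosine similarity on raw vectors
dot_norm = emb @ emb.T                          # plain dot product on unit vectors

# Once vectors are unit-length, the two are the same matrix.
assert np.allclose(cos_raw, dot_norm)
```

This is also why the dot_* and cosine_* rows in the evaluation table below match exactly.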

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("WpythonW/RUbert-tiny_custom_test")
# Run inference
# Example inputs are in Russian, the model's target language
sentences = [
    'Дайте обратную связь по моей заявке,отправлен ли логин и пароль сотруднику',
    'При проблемах со входом в личный кабинет, прежде чем создавать заявку в поддержку, убедитесь, что заходите в ЛК на сайте https://company-x5.ru, указываете актуальные и верные логин и пароль. Если Вам неизвестен логин, обратитесь к руководителю (ДМ), он сможет посмотреть Ваш логин и сбросить пароль в веб-табеле. Для самостоятельного сброса пароля позвоните с вашего мобильного телефона на +7 (XXX) XXX XX XX, наберите добавочный номер 10100, нажмите * и подтвердите сброс пароля, нажав #. Обновленный пароль отправляется по SMS.',
    'Создайте, пожалуйста, обращение в ИТ поддержку на портале support',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 312)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
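Since the embeddings come out unit-normalized, semantic search over a corpus reduces to one matrix product and an argsort. A self-contained sketch with synthetic unit vectors standing in for `model.encode` output (only the shapes are assumed from the model card):

```python
import numpy as np

def top_k(query_emb: np.ndarray, corpus_emb: np.ndarray, k: int = 3):
    """Return indices and scores of the k most similar corpus entries.

    Assumes unit-length embeddings, so the dot product is cosine similarity.
    """
    scores = corpus_emb @ query_emb      # (n_corpus,)
    order = np.argsort(-scores)          # descending similarity
    return order[:k], scores[order[:k]]

rng = np.random.default_rng(42)
corpus = rng.normal(size=(5, 312))
corpus /= np.linalg.norm(corpus, axis=1, keepdims=True)

query = corpus[2] + 0.05 * rng.normal(size=312)   # a slightly noisy copy of entry 2
query /= np.linalg.norm(query)

idx, scores = top_k(query, corpus)
print(idx[0])   # nearest corpus entry
```

In practice the corpus would hold encoded support answers and the query an encoded user question, as in the snippet above.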

Evaluation

Metrics

Information Retrieval

| Metric              |  Value |
|:--------------------|-------:|
| cosine_accuracy@1   | 0.6909 |
| cosine_accuracy@3   | 0.8303 |
| cosine_accuracy@5   | 0.8788 |
| cosine_precision@1  | 0.6909 |
| cosine_precision@3  | 0.2768 |
| cosine_precision@5  | 0.1758 |
| cosine_precision@10 | 0.0906 |
| cosine_recall@1     | 0.6909 |
| cosine_recall@3     | 0.8303 |
| cosine_recall@5     | 0.8788 |
| cosine_recall@10    | 0.9061 |
| cosine_ndcg@10      | 0.8026 |
| cosine_mrr@10       | 0.7688 |
| cosine_map@100      | 0.773  |
| dot_accuracy@1      | 0.6909 |
| dot_accuracy@3      | 0.8303 |
| dot_accuracy@5      | 0.8788 |
| dot_precision@1     | 0.6909 |
| dot_precision@3     | 0.2768 |
| dot_precision@5     | 0.1758 |
| dot_precision@10    | 0.0906 |
| dot_recall@1        | 0.6909 |
| dot_recall@3        | 0.8303 |
| dot_recall@5        | 0.8788 |
| dot_recall@10       | 0.9061 |
| dot_ndcg@10         | 0.8026 |
| dot_mrr@10          | 0.7688 |
| dot_map@100         | 0.773  |
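With a single relevant document per query, accuracy@k equals recall@k and precision@k is accuracy@k divided by k, which is consistent with the table (e.g. 0.8303 / 3 ≈ 0.2768). A hedged pure-Python sketch of how these metrics are computed from ranked retrieval results (function name and toy data are illustrative, not from the evaluation code):

```python
def ir_metrics(ranked: list[list[int]], relevant: list[int], ks=(1, 3, 5, 10)):
    """accuracy@k and MRR@10, assuming one relevant document per query.

    ranked[i]  : document ids retrieved for query i, best first
    relevant[i]: the single relevant document id for query i
    """
    n = len(ranked)
    acc = {k: sum(rel in r[:k] for r, rel in zip(ranked, relevant)) / n for k in ks}
    mrr = sum(
        1.0 / (r.index(rel) + 1) if rel in r[:10] else 0.0
        for r, rel in zip(ranked, relevant)
    ) / n
    return acc, mrr

# Toy check: two queries whose relevant docs land at ranks 1 and 3.
acc, mrr = ir_metrics([[0, 1, 2], [5, 4, 3]], [0, 3], ks=(1, 3))
print(acc, mrr)   # accuracy@1 = 0.5, accuracy@3 = 1.0, MRR = 2/3
```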

Training Details

Training Dataset

Unnamed Dataset

  • Size: 1,317 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples:

    |         | sentence_0                                         | sentence_1                                         |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|
    | type    | string                                              | string                                              |
    | details | min: 3 tokens, mean: 12.42 tokens, max: 107 tokens  | min: 7 tokens, mean: 60.08 tokens, max: 371 tokens  |
  • Samples:

    | sentence_0 | sentence_1 |
    |:-----------|:-----------|
    | Не могу оформить заявку на работу из дома | Критерии доступности сервиса Удаленная Работа: 1.Сотрудник не на нулевой занятости: процент соединения (ИТ 1001) между штатной должностью и табельным номером на текущую дату больше 0; 2.Сотрудник на офисном графике работы: в ИТ 0007 Нормативное рабочее время на текущую дату установлен график, который в соответствии с Правилом ГРВ (таблица T508A) является офисным – поле KKRKH принимает одно из значений: {1; 2; 3; 4; 6}; 3.У сотрудника есть руководитель: наличие на текущую дату соединения (ИТ 1001) B012 между ОЕ сотрудника и ШД руководителя или BZ10 между ШД сотрудника и ШД руководителя; 4.Уровень CEO- руководителя сотрудника позволяет принимать заявки на УР: на штатной должности руководителя сотрудника установленное на текущую дату значение атрибута (ИТ 1222) Z_PM_CEO Уровень подчиненности до СЕО по сценарию Z_PM Управление эффективностью должностей отсутствует в таблице ZHRT_ESS_REMAPP для формата сотрудника (на данный момент ограничение только на CEO и -1 5.Сотруднику установлен признак «Удаленный офис»: на ШД сотрудника / на ОЕ сотрудника / на вышестоящей ОЕ (по пути анализа P-S-O-O) в ИТ 1010 Комп/ВспомСредства подтипе 9021 Работа на дому установлено значение 002 Удаленный офис. Если какой-то из критериев не выполняется, вкладка «удаленная работа» в личном кабинете будет не доступна. Для внесения изменений в систему SAP, необходимо обратиться к специалистам по кадрам. |
    | Не поступают заявки в работу, прошу настроить корректность их назначения. Была делегирована роль в ЛК от менеджера по кадрам, но заявки не поступают. | Создайте, пожалуйста, обращение в ИТ поддержку на портале support |
    | Нет возможности подписать график УР - отображается, что подписано все. | Вам необходимо открыть сервис "Удаленная работа", далее выбрать "График УР". Заявка находится в статусе "Ожидание подписание". Нажмите на нее. Откроется заявка и будет активна кнопка "Подписать". |
  • Loss: MultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
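MultipleNegativesRankingLoss treats each (sentence_0, sentence_1) pair in a batch as a positive and every other sentence_1 in the same batch as an in-batch negative: the scaled cosine similarities form logits for a cross-entropy loss whose target is the matching pair. A numpy sketch of that computation under these assumptions (not the library's implementation):

```python
import numpy as np

def mnrl_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """Cross-entropy over scaled cosine similarities; positives[i] is the
    target for anchors[i], the rest of the batch are negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)                      # scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)     # numeric stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # the "diagonal" (matching pair) is the correct class for each row
    idx = np.arange(len(a))
    return float(-log_probs[idx, idx].mean())

rng = np.random.default_rng(0)
a = rng.normal(size=(4, 312))
loss_random = mnrl_loss(a, rng.normal(size=(4, 312)))   # unrelated pairs
loss_matched = mnrl_loss(a, a)                          # perfectly matched pairs
print(loss_matched < loss_random)   # True
```

Larger batches (here 512) supply more in-batch negatives per pair, which is why this loss benefits from the large per_device_train_batch_size below.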
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • num_train_epochs: 1200
  • multi_dataset_batch_sampler: round_robin

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 512
  • per_device_eval_batch_size: 512
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 1200
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss test_cosine_map@100
1.0 3 - 0.2941
2.0 6 - 0.2941
3.0 9 - 0.2942
4.0 12 - 0.2942
5.0 15 - 0.2945
6.0 18 - 0.2945
7.0 21 - 0.2944
8.0 24 - 0.2948
9.0 27 - 0.2964
10.0 30 - 0.2964
11.0 33 - 0.2965
12.0 36 - 0.2967
13.0 39 - 0.2970
14.0 42 - 0.2979
15.0 45 - 0.2979
16.0 48 - 0.2995
17.0 51 - 0.3038
18.0 54 - 0.3041
19.0 57 - 0.3050
20.0 60 - 0.3058
21.0 63 - 0.3061
22.0 66 - 0.3073
23.0 69 - 0.3094
24.0 72 - 0.3098
25.0 75 - 0.3119
26.0 78 - 0.3122
27.0 81 - 0.3130
28.0 84 - 0.3144
29.0 87 - 0.3159
30.0 90 - 0.3180
31.0 93 - 0.3199
32.0 96 - 0.3225
33.0 99 - 0.3255
34.0 102 - 0.3274
35.0 105 - 0.3293
36.0 108 - 0.3297
37.0 111 - 0.3305
38.0 114 - 0.3328
39.0 117 - 0.3378
40.0 120 - 0.3382
41.0 123 - 0.3417
42.0 126 - 0.3428
43.0 129 - 0.3471
44.0 132 - 0.3497
45.0 135 - 0.3514
46.0 138 - 0.3540
47.0 141 - 0.3552
48.0 144 - 0.3585
49.0 147 - 0.3613
50.0 150 - 0.3621
51.0 153 - 0.3668
52.0 156 - 0.3696
53.0 159 - 0.3701
54.0 162 - 0.3724
55.0 165 - 0.3742
56.0 168 - 0.3789
57.0 171 - 0.3809
58.0 174 - 0.3821
59.0 177 - 0.3867
60.0 180 - 0.3894
61.0 183 - 0.3964
62.0 186 - 0.3988
63.0 189 - 0.4016
64.0 192 - 0.4027
65.0 195 - 0.4048
66.0 198 - 0.4093
67.0 201 - 0.4103
68.0 204 - 0.4101
69.0 207 - 0.4126
70.0 210 - 0.4138
71.0 213 - 0.4153
72.0 216 - 0.4202
73.0 219 - 0.4241
74.0 222 - 0.4285
75.0 225 - 0.4301
76.0 228 - 0.4321
77.0 231 - 0.4340
78.0 234 - 0.4366
79.0 237 - 0.4395
80.0 240 - 0.4403
81.0 243 - 0.4400
82.0 246 - 0.4422
83.0 249 - 0.4433
84.0 252 - 0.4481
85.0 255 - 0.4528
86.0 258 - 0.4552
87.0 261 - 0.4583
88.0 264 - 0.4637
89.0 267 - 0.4663
90.0 270 - 0.4674
91.0 273 - 0.4708
92.0 276 - 0.4718
93.0 279 - 0.4754
94.0 282 - 0.4778
95.0 285 - 0.4811
96.0 288 - 0.4820
97.0 291 - 0.4831
98.0 294 - 0.4857
99.0 297 - 0.4889
100.0 300 - 0.4929
101.0 303 - 0.4938
102.0 306 - 0.4990
103.0 309 - 0.5003
104.0 312 - 0.5012
105.0 315 - 0.5025
106.0 318 - 0.5033
107.0 321 - 0.5054
108.0 324 - 0.5073
109.0 327 - 0.5080
110.0 330 - 0.5088
111.0 333 - 0.5115
112.0 336 - 0.5131
113.0 339 - 0.5160
114.0 342 - 0.5201
115.0 345 - 0.5274
116.0 348 - 0.5297
117.0 351 - 0.5316
118.0 354 - 0.5340
119.0 357 - 0.5365
120.0 360 - 0.5374
121.0 363 - 0.5397
122.0 366 - 0.5418
123.0 369 - 0.5441
124.0 372 - 0.5463
125.0 375 - 0.5489
126.0 378 - 0.5525
127.0 381 - 0.5539
128.0 384 - 0.5556
129.0 387 - 0.5575
130.0 390 - 0.5601
131.0 393 - 0.5607
132.0 396 - 0.5632
133.0 399 - 0.5661
134.0 402 - 0.5685
135.0 405 - 0.5710
136.0 408 - 0.5747
137.0 411 - 0.5779
138.0 414 - 0.5805
139.0 417 - 0.5832
140.0 420 - 0.5871
141.0 423 - 0.5948
142.0 426 - 0.5976
143.0 429 - 0.5995
144.0 432 - 0.6056
145.0 435 - 0.6097
146.0 438 - 0.6100
147.0 441 - 0.6106
148.0 444 - 0.6113
149.0 447 - 0.6100
150.0 450 - 0.6105
151.0 453 - 0.6126
152.0 456 - 0.6136
153.0 459 - 0.6145
154.0 462 - 0.6166
155.0 465 - 0.6175
156.0 468 - 0.6194
157.0 471 - 0.6194
158.0 474 - 0.6196
159.0 477 - 0.6218
160.0 480 - 0.6230
161.0 483 - 0.6239
162.0 486 - 0.6284
163.0 489 - 0.6318
164.0 492 - 0.6325
165.0 495 - 0.6335
166.0 498 - 0.6335
166.6667 500 4.7565 -
167.0 501 - 0.6356
168.0 504 - 0.6388
169.0 507 - 0.6413
170.0 510 - 0.6417
171.0 513 - 0.6422
172.0 516 - 0.6440
173.0 519 - 0.6459
174.0 522 - 0.6460
175.0 525 - 0.6473
176.0 528 - 0.6461
177.0 531 - 0.6469
178.0 534 - 0.6471
179.0 537 - 0.6496
180.0 540 - 0.6498
181.0 543 - 0.6500
182.0 546 - 0.6524
183.0 549 - 0.6529
184.0 552 - 0.6533
185.0 555 - 0.6533
186.0 558 - 0.6550
187.0 561 - 0.6571
188.0 564 - 0.6592
189.0 567 - 0.6594
190.0 570 - 0.6602
191.0 573 - 0.6627
192.0 576 - 0.6634
193.0 579 - 0.6624
194.0 582 - 0.6630
195.0 585 - 0.6669
196.0 588 - 0.6700
197.0 591 - 0.6716
198.0 594 - 0.6725
199.0 597 - 0.6734
200.0 600 - 0.6737
201.0 603 - 0.6797
202.0 606 - 0.6814
203.0 609 - 0.6840
204.0 612 - 0.6872
205.0 615 - 0.6888
206.0 618 - 0.6894
207.0 621 - 0.6881
208.0 624 - 0.6897
209.0 627 - 0.6897
210.0 630 - 0.6894
211.0 633 - 0.6896
212.0 636 - 0.6900
213.0 639 - 0.6886
214.0 642 - 0.6873
215.0 645 - 0.6881
216.0 648 - 0.6892
217.0 651 - 0.6890
218.0 654 - 0.6892
219.0 657 - 0.6895
220.0 660 - 0.6864
221.0 663 - 0.6863
222.0 666 - 0.6863
223.0 669 - 0.6865
224.0 672 - 0.6869
225.0 675 - 0.6873
226.0 678 - 0.6876
227.0 681 - 0.6880
228.0 684 - 0.6884
229.0 687 - 0.6899
230.0 690 - 0.6930
231.0 693 - 0.6947
232.0 696 - 0.6957
233.0 699 - 0.6957
234.0 702 - 0.6944
235.0 705 - 0.6968
236.0 708 - 0.6973
237.0 711 - 0.6992
238.0 714 - 0.6996
239.0 717 - 0.7001
240.0 720 - 0.7003
241.0 723 - 0.6987
242.0 726 - 0.6974
243.0 729 - 0.6976
244.0 732 - 0.6989
245.0 735 - 0.6986
246.0 738 - 0.6982
247.0 741 - 0.6981
248.0 744 - 0.7000
249.0 747 - 0.7005
250.0 750 - 0.7015
251.0 753 - 0.7016
252.0 756 - 0.7009
253.0 759 - 0.7032
254.0 762 - 0.7039
255.0 765 - 0.7062
256.0 768 - 0.7060
257.0 771 - 0.7075
258.0 774 - 0.7093
259.0 777 - 0.7105
260.0 780 - 0.7110
261.0 783 - 0.7115
262.0 786 - 0.7130
263.0 789 - 0.7134
264.0 792 - 0.7134
265.0 795 - 0.7111
266.0 798 - 0.7111
267.0 801 - 0.7095
268.0 804 - 0.7082
269.0 807 - 0.7090
270.0 810 - 0.7113
271.0 813 - 0.7113
272.0 816 - 0.7110
273.0 819 - 0.7115
274.0 822 - 0.7127
275.0 825 - 0.7130
276.0 828 - 0.7130
277.0 831 - 0.7130
278.0 834 - 0.7137
279.0 837 - 0.7138
280.0 840 - 0.7139
281.0 843 - 0.7145
282.0 846 - 0.7146
283.0 849 - 0.7167
284.0 852 - 0.7167
285.0 855 - 0.7166
286.0 858 - 0.7180
287.0 861 - 0.7196
288.0 864 - 0.7200
289.0 867 - 0.7201
290.0 870 - 0.7202
291.0 873 - 0.7205
292.0 876 - 0.7203
293.0 879 - 0.7211
294.0 882 - 0.7213
295.0 885 - 0.7214
296.0 888 - 0.7215
297.0 891 - 0.7221
298.0 894 - 0.7234
299.0 897 - 0.7233
300.0 900 - 0.7235
301.0 903 - 0.7246
302.0 906 - 0.7248
303.0 909 - 0.7251
304.0 912 - 0.7251
305.0 915 - 0.7259
306.0 918 - 0.7259
307.0 921 - 0.7290
308.0 924 - 0.7290
309.0 927 - 0.7289
310.0 930 - 0.7291
311.0 933 - 0.7290
312.0 936 - 0.7275
313.0 939 - 0.7290
314.0 942 - 0.7290
315.0 945 - 0.7290
316.0 948 - 0.7290
317.0 951 - 0.7307
318.0 954 - 0.7309
319.0 957 - 0.7317
320.0 960 - 0.7304
321.0 963 - 0.7307
322.0 966 - 0.7289
323.0 969 - 0.7274
324.0 972 - 0.7291
325.0 975 - 0.7291
326.0 978 - 0.7308
327.0 981 - 0.7307
328.0 984 - 0.7308
329.0 987 - 0.7314
330.0 990 - 0.7315
331.0 993 - 0.7300
332.0 996 - 0.7300
333.0 999 - 0.7300
333.3333 1000 3.2971 -
334.0 1002 - 0.7300
335.0 1005 - 0.7311
336.0 1008 - 0.7326
337.0 1011 - 0.7310
338.0 1014 - 0.7323
339.0 1017 - 0.7338
340.0 1020 - 0.7323
341.0 1023 - 0.7324
342.0 1026 - 0.7326
343.0 1029 - 0.7331
344.0 1032 - 0.7333
345.0 1035 - 0.7330
346.0 1038 - 0.7328
347.0 1041 - 0.7332
348.0 1044 - 0.7333
349.0 1047 - 0.7334
350.0 1050 - 0.7337
351.0 1053 - 0.7337
352.0 1056 - 0.7338
353.0 1059 - 0.7323
354.0 1062 - 0.7323
355.0 1065 - 0.7321
356.0 1068 - 0.7307
357.0 1071 - 0.7307
358.0 1074 - 0.7309
359.0 1077 - 0.7311
360.0 1080 - 0.7312
361.0 1083 - 0.7318
362.0 1086 - 0.7319
363.0 1089 - 0.7318
364.0 1092 - 0.7319
365.0 1095 - 0.7319
366.0 1098 - 0.7334
367.0 1101 - 0.7329
368.0 1104 - 0.7314
369.0 1107 - 0.7314
370.0 1110 - 0.7314
371.0 1113 - 0.7315
372.0 1116 - 0.7320
373.0 1119 - 0.7335
374.0 1122 - 0.7334
375.0 1125 - 0.7334
376.0 1128 - 0.7329
377.0 1131 - 0.7319
378.0 1134 - 0.7334
379.0 1137 - 0.7334
380.0 1140 - 0.7333
381.0 1143 - 0.7332
382.0 1146 - 0.7333
383.0 1149 - 0.7344
384.0 1152 - 0.7329
385.0 1155 - 0.7329
386.0 1158 - 0.7326
387.0 1161 - 0.7327
388.0 1164 - 0.7309
389.0 1167 - 0.7327
390.0 1170 - 0.7320
391.0 1173 - 0.7321
392.0 1176 - 0.7332
393.0 1179 - 0.7332
394.0 1182 - 0.7315
395.0 1185 - 0.7351
396.0 1188 - 0.7341
397.0 1191 - 0.7341
398.0 1194 - 0.7335
399.0 1197 - 0.7370
400.0 1200 - 0.7367
401.0 1203 - 0.7369
402.0 1206 - 0.7369
403.0 1209 - 0.7369
404.0 1212 - 0.7369
405.0 1215 - 0.7354
406.0 1218 - 0.7352
407.0 1221 - 0.7373
408.0 1224 - 0.7372
409.0 1227 - 0.7366
410.0 1230 - 0.7381
411.0 1233 - 0.7367
412.0 1236 - 0.7373
413.0 1239 - 0.7386
414.0 1242 - 0.7384
415.0 1245 - 0.7387
416.0 1248 - 0.7386
417.0 1251 - 0.7386
418.0 1254 - 0.7382
419.0 1257 - 0.7366
420.0 1260 - 0.7400
421.0 1263 - 0.7399
422.0 1266 - 0.7416
423.0 1269 - 0.7440
424.0 1272 - 0.7455
425.0 1275 - 0.7459
426.0 1278 - 0.7459
427.0 1281 - 0.7458
428.0 1284 - 0.7458
429.0 1287 - 0.7458
430.0 1290 - 0.7459
431.0 1293 - 0.7457
432.0 1296 - 0.7457
433.0 1299 - 0.7459
434.0 1302 - 0.7459
435.0 1305 - 0.7459
436.0 1308 - 0.7463
437.0 1311 - 0.7460
438.0 1314 - 0.7469
439.0 1317 - 0.7479
440.0 1320 - 0.7478
441.0 1323 - 0.7464
442.0 1326 - 0.7475
443.0 1329 - 0.7476
444.0 1332 - 0.7476
445.0 1335 - 0.7481
446.0 1338 - 0.7477
447.0 1341 - 0.7466
448.0 1344 - 0.7467
449.0 1347 - 0.7462
450.0 1350 - 0.7468
451.0 1353 - 0.7467
452.0 1356 - 0.7467
453.0 1359 - 0.7481
454.0 1362 - 0.7480
455.0 1365 - 0.7483
456.0 1368 - 0.7479
457.0 1371 - 0.7464
458.0 1374 - 0.7464
459.0 1377 - 0.7474
460.0 1380 - 0.7491
461.0 1383 - 0.7494
462.0 1386 - 0.7495
463.0 1389 - 0.7491
464.0 1392 - 0.7496
465.0 1395 - 0.7497
466.0 1398 - 0.7511
467.0 1401 - 0.7511
468.0 1404 - 0.7515
469.0 1407 - 0.7515
470.0 1410 - 0.7511
471.0 1413 - 0.7511
472.0 1416 - 0.7493
473.0 1419 - 0.7491
474.0 1422 - 0.7496
475.0 1425 - 0.7496
476.0 1428 - 0.7495
477.0 1431 - 0.7494
478.0 1434 - 0.7493
479.0 1437 - 0.7495
480.0 1440 - 0.7508
481.0 1443 - 0.7509
482.0 1446 - 0.7507
483.0 1449 - 0.7501
484.0 1452 - 0.7487
485.0 1455 - 0.7493
486.0 1458 - 0.7514
487.0 1461 - 0.7513
488.0 1464 - 0.7500
489.0 1467 - 0.7502
490.0 1470 - 0.7501
491.0 1473 - 0.7504
492.0 1476 - 0.7503
493.0 1479 - 0.7489
494.0 1482 - 0.7491
495.0 1485 - 0.7489
496.0 1488 - 0.7506
497.0 1491 - 0.7506
498.0 1494 - 0.7504
499.0 1497 - 0.7504
500.0 1500 2.688 0.7503
501.0 1503 - 0.7503
502.0 1506 - 0.7504
503.0 1509 - 0.7519
504.0 1512 - 0.7522
505.0 1515 - 0.7527
506.0 1518 - 0.7526
507.0 1521 - 0.7528
508.0 1524 - 0.7527
509.0 1527 - 0.7527
510.0 1530 - 0.7525
511.0 1533 - 0.7519
512.0 1536 - 0.7527
513.0 1539 - 0.7533
514.0 1542 - 0.7533
515.0 1545 - 0.7533
516.0 1548 - 0.7538
517.0 1551 - 0.7538
518.0 1554 - 0.7534
519.0 1557 - 0.7538
520.0 1560 - 0.7539
521.0 1563 - 0.7541
522.0 1566 - 0.7538
523.0 1569 - 0.7535
524.0 1572 - 0.7516
525.0 1575 - 0.7530
526.0 1578 - 0.7530
527.0 1581 - 0.7538
528.0 1584 - 0.7573
529.0 1587 - 0.7573
530.0 1590 - 0.7570
531.0 1593 - 0.7585
532.0 1596 - 0.7584
533.0 1599 - 0.7588
534.0 1602 - 0.7588
535.0 1605 - 0.7588
536.0 1608 - 0.7584
537.0 1611 - 0.7569
538.0 1614 - 0.7572
539.0 1617 - 0.7587
540.0 1620 - 0.7582
541.0 1623 - 0.7586
542.0 1626 - 0.7587
543.0 1629 - 0.7633
544.0 1632 - 0.7629
545.0 1635 - 0.7627
546.0 1638 - 0.7624
547.0 1641 - 0.7621
548.0 1644 - 0.7626
549.0 1647 - 0.7625
550.0 1650 - 0.7610
551.0 1653 - 0.7630
552.0 1656 - 0.7630
553.0 1659 - 0.7629
554.0 1662 - 0.7659
555.0 1665 - 0.7644
556.0 1668 - 0.7648
557.0 1671 - 0.7650
558.0 1674 - 0.7645
559.0 1677 - 0.7643
560.0 1680 - 0.7646
561.0 1683 - 0.7646
562.0 1686 - 0.7636
563.0 1689 - 0.7621
564.0 1692 - 0.7624
565.0 1695 - 0.7644
566.0 1698 - 0.7657
567.0 1701 - 0.7663
568.0 1704 - 0.7645
569.0 1707 - 0.7640
570.0 1710 - 0.7639
571.0 1713 - 0.7657
572.0 1716 - 0.7663
573.0 1719 - 0.7648
574.0 1722 - 0.7649
575.0 1725 - 0.7651
576.0 1728 - 0.7648
577.0 1731 - 0.7643
578.0 1734 - 0.7643
579.0 1737 - 0.7647
580.0 1740 - 0.7648
581.0 1743 - 0.7664
582.0 1746 - 0.7679
583.0 1749 - 0.7682
584.0 1752 - 0.7652
585.0 1755 - 0.7652
586.0 1758 - 0.7638
587.0 1761 - 0.7637
588.0 1764 - 0.7638
589.0 1767 - 0.7653
590.0 1770 - 0.7653
591.0 1773 - 0.7652
592.0 1776 - 0.7631
593.0 1779 - 0.7632
594.0 1782 - 0.7629
595.0 1785 - 0.7660
596.0 1788 - 0.7677
597.0 1791 - 0.7644
598.0 1794 - 0.7628
599.0 1797 - 0.7628
600.0 1800 - 0.7629
601.0 1803 - 0.7643
602.0 1806 - 0.7657
603.0 1809 - 0.7663
604.0 1812 - 0.7656
605.0 1815 - 0.7656
606.0 1818 - 0.7638
607.0 1821 - 0.7639
608.0 1824 - 0.7642
609.0 1827 - 0.7644
610.0 1830 - 0.7642
611.0 1833 - 0.7643
612.0 1836 - 0.7643
613.0 1839 - 0.7643
614.0 1842 - 0.7659
615.0 1845 - 0.7660
616.0 1848 - 0.7642
617.0 1851 - 0.7645
618.0 1854 - 0.7630
619.0 1857 - 0.7630
620.0 1860 - 0.7643
621.0 1863 - 0.7643
622.0 1866 - 0.7640
623.0 1869 - 0.7640
624.0 1872 - 0.7641
625.0 1875 - 0.7641
626.0 1878 - 0.7642
627.0 1881 - 0.7641
628.0 1884 - 0.7655
629.0 1887 - 0.7654
630.0 1890 - 0.7654
631.0 1893 - 0.7653
632.0 1896 - 0.7638
633.0 1899 - 0.7655
634.0 1902 - 0.7656
635.0 1905 - 0.7657
636.0 1908 - 0.7659
637.0 1911 - 0.7661
638.0 1914 - 0.7661
639.0 1917 - 0.7664
640.0 1920 - 0.7667
641.0 1923 - 0.7665
642.0 1926 - 0.7664
643.0 1929 - 0.7661
644.0 1932 - 0.7661
645.0 1935 - 0.7651
646.0 1938 - 0.7652
647.0 1941 - 0.7659
648.0 1944 - 0.7659
649.0 1947 - 0.7657
650.0 1950 - 0.7657
651.0 1953 - 0.7659
652.0 1956 - 0.7660
653.0 1959 - 0.7662
654.0 1962 - 0.7661
655.0 1965 - 0.7662
656.0 1968 - 0.7658
657.0 1971 - 0.7659
658.0 1974 - 0.7675
659.0 1977 - 0.7673
660.0 1980 - 0.7670
661.0 1983 - 0.7663
662.0 1986 - 0.7664
663.0 1989 - 0.7654
664.0 1992 - 0.7654
665.0 1995 - 0.7658
666.0 1998 - 0.7658
666.6667 2000 2.5178 -
667.0 2001 - 0.7673
668.0 2004 - 0.7673
669.0 2007 - 0.7670
670.0 2010 - 0.7691
671.0 2013 - 0.7691
672.0 2016 - 0.7676
673.0 2019 - 0.7670
674.0 2022 - 0.7669
675.0 2025 - 0.7664
676.0 2028 - 0.7671
677.0 2031 - 0.7674
678.0 2034 - 0.7674
679.0 2037 - 0.7674
680.0 2040 - 0.7637
681.0 2043 - 0.7641
682.0 2046 - 0.7642
683.0 2049 - 0.7626
684.0 2052 - 0.7644
685.0 2055 - 0.7640
686.0 2058 - 0.7656
687.0 2061 - 0.7656
688.0 2064 - 0.7666
689.0 2067 - 0.7669
690.0 2070 - 0.7665
691.0 2073 - 0.7668
692.0 2076 - 0.7671
693.0 2079 - 0.7674
694.0 2082 - 0.7677
695.0 2085 - 0.7671
696.0 2088 - 0.7669
697.0 2091 - 0.7664
698.0 2094 - 0.7671
699.0 2097 - 0.7665
700.0 2100 - 0.7663
701.0 2103 - 0.7667
702.0 2106 - 0.7668
703.0 2109 - 0.7672
704.0 2112 - 0.7673
705.0 2115 - 0.7672
706.0 2118 - 0.7670
707.0 2121 - 0.7669
708.0 2124 - 0.7669
709.0 2127 - 0.7667
710.0 2130 - 0.7664
711.0 2133 - 0.7647
712.0 2136 - 0.7646
713.0 2139 - 0.7647
714.0 2142 - 0.7647
715.0 2145 - 0.7649
716.0 2148 - 0.7665
717.0 2151 - 0.7666
718.0 2154 - 0.7667
719.0 2157 - 0.7666
720.0 2160 - 0.7667
721.0 2163 - 0.7666
722.0 2166 - 0.7665
723.0 2169 - 0.7652
724.0 2172 - 0.7666
725.0 2175 - 0.7665
726.0 2178 - 0.7666
727.0 2181 - 0.7665
728.0 2184 - 0.7668
729.0 2187 - 0.7665
730.0 2190 - 0.7648
731.0 2193 - 0.7665
732.0 2196 - 0.7665
733.0 2199 - 0.7666
734.0 2202 - 0.7669
735.0 2205 - 0.7666
736.0 2208 - 0.7666
737.0 2211 - 0.7680
738.0 2214 - 0.7685
739.0 2217 - 0.7671
740.0 2220 - 0.7677
741.0 2223 - 0.7672
742.0 2226 - 0.7671
743.0 2229 - 0.7688
744.0 2232 - 0.7688
745.0 2235 - 0.7691
746.0 2238 - 0.7690
747.0 2241 - 0.7687
748.0 2244 - 0.7704
749.0 2247 - 0.7704
750.0 2250 - 0.7702
751.0 2253 - 0.7697
752.0 2256 - 0.7686
753.0 2259 - 0.7700
754.0 2262 - 0.7700
755.0 2265 - 0.7702
756.0 2268 - 0.7707
757.0 2271 - 0.7685
758.0 2274 - 0.7685
759.0 2277 - 0.7686
760.0 2280 - 0.7679
761.0 2283 - 0.7682
762.0 2286 - 0.7674
763.0 2289 - 0.7676
764.0 2292 - 0.7675
765.0 2295 - 0.7703
766.0 2298 - 0.7706
767.0 2301 - 0.7714
768.0 2304 - 0.7716
769.0 2307 - 0.7717
770.0 2310 - 0.7717
771.0 2313 - 0.7717
772.0 2316 - 0.7720
773.0 2319 - 0.7719
774.0 2322 - 0.7701
775.0 2325 - 0.7701
776.0 2328 - 0.7701
777.0 2331 - 0.7698
778.0 2334 - 0.7697
779.0 2337 - 0.7692
780.0 2340 - 0.7689
781.0 2343 - 0.7687
782.0 2346 - 0.7685
783.0 2349 - 0.7688
784.0 2352 - 0.7695
785.0 2355 - 0.7696
786.0 2358 - 0.7683
787.0 2361 - 0.7700
788.0 2364 - 0.7703
789.0 2367 - 0.7704
790.0 2370 - 0.7689
791.0 2373 - 0.7689
792.0 2376 - 0.7700
793.0 2379 - 0.7683
794.0 2382 - 0.7681
795.0 2385 - 0.7678
796.0 2388 - 0.7678
797.0 2391 - 0.7678
798.0 2394 - 0.7694
799.0 2397 - 0.7696
800.0 2400 - 0.7714
801.0 2403 - 0.7715
802.0 2406 - 0.7701
803.0 2409 - 0.7680
804.0 2412 - 0.7674
805.0 2415 - 0.7670
806.0 2418 - 0.7670
807.0 2421 - 0.7669
808.0 2424 - 0.7670
809.0 2427 - 0.7669
810.0 2430 - 0.7670
811.0 2433 - 0.7670
812.0 2436 - 0.7675
813.0 2439 - 0.7681
814.0 2442 - 0.7684
815.0 2445 - 0.7686
816.0 2448 - 0.7688
817.0 2451 - 0.7693
818.0 2454 - 0.7678
819.0 2457 - 0.7675
820.0 2460 - 0.7677
821.0 2463 - 0.7688
822.0 2466 - 0.7689
823.0 2469 - 0.7701
824.0 2472 - 0.7699
825.0 2475 - 0.7711
826.0 2478 - 0.7710
827.0 2481 - 0.7716
828.0 2484 - 0.7716
829.0 2487 - 0.7719
830.0 2490 - 0.7718
831.0 2493 - 0.7720
832.0 2496 - 0.7724
833.0 2499 - 0.7724
833.3333 2500 2.4669 -
834.0 2502 - 0.7722
835.0 2505 - 0.7720
836.0 2508 - 0.7700
837.0 2511 - 0.7696
838.0 2514 - 0.7702
839.0 2517 - 0.7704
840.0 2520 - 0.7703
841.0 2523 - 0.7706
842.0 2526 - 0.7713
843.0 2529 - 0.7704
844.0 2532 - 0.7698
845.0 2535 - 0.7700
846.0 2538 - 0.7705
847.0 2541 - 0.7699
848.0 2544 - 0.7702
849.0 2547 - 0.7700
850.0 2550 - 0.7699
851.0 2553 - 0.7702
852.0 2556 - 0.7717
853.0 2559 - 0.7713
854.0 2562 - 0.7723
855.0 2565 - 0.7730
856.0 2568 - 0.7735
857.0 2571 - 0.7735
858.0 2574 - 0.7720
859.0 2577 - 0.7719
860.0 2580 - 0.7715
861.0 2583 - 0.7701
862.0 2586 - 0.7696
863.0 2589 - 0.7703
864.0 2592 - 0.7702
865.0 2595 - 0.7714
866.0 2598 - 0.7714
867.0 2601 - 0.7731
868.0 2604 - 0.7734
869.0 2607 - 0.7744
870.0 2610 - 0.7729
871.0 2613 - 0.7731
872.0 2616 - 0.7718
873.0 2619 - 0.7719
874.0 2622 - 0.7736
875.0 2625 - 0.7757
876.0 2628 - 0.7757
877.0 2631 - 0.7755
878.0 2634 - 0.7765
879.0 2637 - 0.7767
880.0 2640 - 0.7743
881.0 2643 - 0.7742
882.0 2646 - 0.7743
883.0 2649 - 0.7758
884.0 2652 - 0.7761
885.0 2655 - 0.7765
886.0 2658 - 0.7761
887.0 2661 - 0.7760
888.0 2664 - 0.7730
889.0 2667 - 0.7730
890.0 2670 - 0.7730
891.0 2673 - 0.7734
892.0 2676 - 0.7737
893.0 2679 - 0.7717
894.0 2682 - 0.7724
895.0 2685 - 0.7725
896.0 2688 - 0.7741
897.0 2691 - 0.7743
898.0 2694 - 0.7742
899.0 2697 - 0.7743
900.0 2700 - 0.7759
901.0 2703 - 0.7758
902.0 2706 - 0.7765
903.0 2709 - 0.7778
904.0 2712 - 0.7779
905.0 2715 - 0.7788
906.0 2718 - 0.7773
907.0 2721 - 0.7773
908.0 2724 - 0.7757
909.0 2727 - 0.7752
910.0 2730 - 0.7766
911.0 2733 - 0.7758
912.0 2736 - 0.7760
913.0 2739 - 0.7758
914.0 2742 - 0.7745
915.0 2745 - 0.7747
916.0 2748 - 0.7727
917.0 2751 - 0.7729
918.0 2754 - 0.7735
919.0 2757 - 0.7735
920.0 2760 - 0.7736
921.0 2763 - 0.7736
922.0 2766 - 0.7737
923.0 2769 - 0.7736
924.0 2772 - 0.7736
925.0 2775 - 0.7734
926.0 2778 - 0.7723
927.0 2781 - 0.7722
928.0 2784 - 0.7721
929.0 2787 - 0.7729
930.0 2790 - 0.7726
931.0 2793 - 0.7726
932.0 2796 - 0.7722
933.0 2799 - 0.7722
934.0 2802 - 0.7706
935.0 2805 - 0.7722
936.0 2808 - 0.7722
937.0 2811 - 0.7703
938.0 2814 - 0.7720
939.0 2817 - 0.7725
940.0 2820 - 0.7724
941.0 2823 - 0.7728
942.0 2826 - 0.7734
943.0 2829 - 0.7740
944.0 2832 - 0.7732
945.0 2835 - 0.7733
946.0 2838 - 0.7729
947.0 2841 - 0.7725
948.0 2844 - 0.7724
949.0 2847 - 0.7739
950.0 2850 - 0.7741
951.0 2853 - 0.7754
952.0 2856 - 0.7759
953.0 2859 - 0.7752
954.0 2862 - 0.7745
955.0 2865 - 0.7753
956.0 2868 - 0.7754
957.0 2871 - 0.7750
958.0 2874 - 0.7761
959.0 2877 - 0.7761
960.0 2880 - 0.7745
961.0 2883 - 0.7741
962.0 2886 - 0.7740
963.0 2889 - 0.7727
964.0 2892 - 0.7727
965.0 2895 - 0.7730
966.0 2898 - 0.7733
967.0 2901 - 0.7735
968.0 2904 - 0.7733
969.0 2907 - 0.7750
970.0 2910 - 0.7756
971.0 2913 - 0.7763
972.0 2916 - 0.7767
973.0 2919 - 0.7759
974.0 2922 - 0.7757
975.0 2925 - 0.7752
976.0 2928 - 0.7765
977.0 2931 - 0.7762
978.0 2934 - 0.7761
979.0 2937 - 0.7767
980.0 2940 - 0.7763
981.0 2943 - 0.7747
982.0 2946 - 0.7748
983.0 2949 - 0.7755
984.0 2952 - 0.7756
985.0 2955 - 0.7756
986.0 2958 - 0.7757
987.0 2961 - 0.7751
988.0 2964 - 0.7753
989.0 2967 - 0.7754
990.0 2970 - 0.7753
991.0 2973 - 0.7750
992.0 2976 - 0.7753
993.0 2979 - 0.7771
994.0 2982 - 0.7739
995.0 2985 - 0.7719
996.0 2988 - 0.7719
997.0 2991 - 0.7736
998.0 2994 - 0.7739
999.0 2997 - 0.7745
1000.0 3000 2.4448 0.7730
1001.0 3003 - 0.7730
1002.0 3006 - 0.7730
1003.0 3009 - 0.7730
1004.0 3012 - 0.7730
1005.0 3015 - 0.7730
1006.0 3018 - 0.7730
1007.0 3021 - 0.7730
1008.0 3024 - 0.7730
1009.0 3027 - 0.7730
1010.0 3030 - 0.7730
1011.0 3033 - 0.7730
1012.0 3036 - 0.7730
1013.0 3039 - 0.7730
1014.0 3042 - 0.7730
1015.0 3045 - 0.7730
1016.0 3048 - 0.7730
1017.0 3051 - 0.7730
1018.0 3054 - 0.7730
1019.0 3057 - 0.7730
1020.0 3060 - 0.7730
1021.0 3063 - 0.7730
1022.0 3066 - 0.7730
1023.0 3069 - 0.7730
1024.0 3072 - 0.7730
1025.0 3075 - 0.7730
1026.0 3078 - 0.7730
1027.0 3081 - 0.7730
1028.0 3084 - 0.7730
1029.0 3087 - 0.7730
1030.0 3090 - 0.7730
1031.0 3093 - 0.7730
1032.0 3096 - 0.7730
1033.0 3099 - 0.7730
1034.0 3102 - 0.7730
1035.0 3105 - 0.7730
1036.0 3108 - 0.7730
1037.0 3111 - 0.7730
1038.0 3114 - 0.7730
1039.0 3117 - 0.7730
1040.0 3120 - 0.7730
1041.0 3123 - 0.7730
1042.0 3126 - 0.7730
1043.0 3129 - 0.7730
1044.0 3132 - 0.7730
1045.0 3135 - 0.7730
1046.0 3138 - 0.7730
1047.0 3141 - 0.7730
1048.0 3144 - 0.7730
1049.0 3147 - 0.7730
1050.0 3150 - 0.7730
1051.0 3153 - 0.7730
1052.0 3156 - 0.7730
1053.0 3159 - 0.7730
1054.0 3162 - 0.7730
1055.0 3165 - 0.7730
1056.0 3168 - 0.7730
1057.0 3171 - 0.7730
1058.0 3174 - 0.7730
1059.0 3177 - 0.7730
1060.0 3180 - 0.7730
1061.0 3183 - 0.7730
1062.0 3186 - 0.7730
1063.0 3189 - 0.7730
1064.0 3192 - 0.7730
1065.0 3195 - 0.7730
1066.0 3198 - 0.7730
1067.0 3201 - 0.7730
1068.0 3204 - 0.7730
1069.0 3207 - 0.7730
1070.0 3210 - 0.7730
1071.0 3213 - 0.7730
1072.0 3216 - 0.7730
1073.0 3219 - 0.7730
1074.0 3222 - 0.7730
1075.0 3225 - 0.7730
1076.0 3228 - 0.7730
1077.0 3231 - 0.7730
1078.0 3234 - 0.7730
1079.0 3237 - 0.7730
1080.0 3240 - 0.7730
1081.0 3243 - 0.7730
1082.0 3246 - 0.7730
1083.0 3249 - 0.7730
1084.0 3252 - 0.7730
1085.0 3255 - 0.7730
1086.0 3258 - 0.7730
1087.0 3261 - 0.7730
1088.0 3264 - 0.7730
1089.0 3267 - 0.7730
1090.0 3270 - 0.7730
1091.0 3273 - 0.7730
1092.0 3276 - 0.7730
1093.0 3279 - 0.7730
1094.0 3282 - 0.7730
1095.0 3285 - 0.7730
1096.0 3288 - 0.7730
1097.0 3291 - 0.7730
1098.0 3294 - 0.7730
1099.0 3297 - 0.7730
1100.0 3300 - 0.7730
1101.0 3303 - 0.7730
1102.0 3306 - 0.7730
1103.0 3309 - 0.7730
1104.0 3312 - 0.7730
1105.0 3315 - 0.7730
1106.0 3318 - 0.7730
1107.0 3321 - 0.7730
1108.0 3324 - 0.7730
1109.0 3327 - 0.7730
1110.0 3330 - 0.7730
1111.0 3333 - 0.7730
1112.0 3336 - 0.7730
1113.0 3339 - 0.7730
1114.0 3342 - 0.7730
1115.0 3345 - 0.7730
1116.0 3348 - 0.7730
1117.0 3351 - 0.7730
1118.0 3354 - 0.7730
1119.0 3357 - 0.7730
1120.0 3360 - 0.7730
1121.0 3363 - 0.7730
1122.0 3366 - 0.7730
1123.0 3369 - 0.7730
1124.0 3372 - 0.7730
1125.0 3375 - 0.7730
1126.0 3378 - 0.7730
1127.0 3381 - 0.7730
1128.0 3384 - 0.7730
1129.0 3387 - 0.7730
1130.0 3390 - 0.7730
1131.0 3393 - 0.7730
1132.0 3396 - 0.7730
1133.0 3399 - 0.7730
1134.0 3402 - 0.7730
1135.0 3405 - 0.7730
1136.0 3408 - 0.7730
1137.0 3411 - 0.7730
1138.0 3414 - 0.7730
1139.0 3417 - 0.7730
1140.0 3420 - 0.7730
1141.0 3423 - 0.7730
1142.0 3426 - 0.7730
1143.0 3429 - 0.7730
1144.0 3432 - 0.7730
1145.0 3435 - 0.7730
1146.0 3438 - 0.7730
1147.0 3441 - 0.7730
1148.0 3444 - 0.7730
1149.0 3447 - 0.7730
1150.0 3450 - 0.7730
1151.0 3453 - 0.7730
1152.0 3456 - 0.7730
1153.0 3459 - 0.7730
1154.0 3462 - 0.7730
1155.0 3465 - 0.7730
1156.0 3468 - 0.7730
1157.0 3471 - 0.7730
1158.0 3474 - 0.7730
1159.0 3477 - 0.7730
1160.0 3480 - 0.7730
1161.0 3483 - 0.7730
1162.0 3486 - 0.7730
1163.0 3489 - 0.7730
1164.0 3492 - 0.7730
1165.0 3495 - 0.7730
1166.0 3498 - 0.7730
1166.6667 3500 2.4396 -
1167.0 3501 - 0.7730
1168.0 3504 - 0.7730
1169.0 3507 - 0.7730
1170.0 3510 - 0.7730
1171.0 3513 - 0.7730
1172.0 3516 - 0.7730
1173.0 3519 - 0.7730
1174.0 3522 - 0.7730
1175.0 3525 - 0.7730
1176.0 3528 - 0.7730
1177.0 3531 - 0.7730
1178.0 3534 - 0.7730
1179.0 3537 - 0.7730
1180.0 3540 - 0.7730
1181.0 3543 - 0.7730
1182.0 3546 - 0.7730
1183.0 3549 - 0.7730
1184.0 3552 - 0.7730
1185.0 3555 - 0.7730
1186.0 3558 - 0.7730
1187.0 3561 - 0.7730
1188.0 3564 - 0.7730
1189.0 3567 - 0.7730
1190.0 3570 - 0.7730
1191.0 3573 - 0.7730
1192.0 3576 - 0.7730
1193.0 3579 - 0.7730
1194.0 3582 - 0.7730
1195.0 3585 - 0.7730
1196.0 3588 - 0.7730
1197.0 3591 - 0.7730
1198.0 3594 - 0.7730
1199.0 3597 - 0.7730
1200.0 3600 - 0.7730

Framework Versions

  • Python: 3.10.14
  • Sentence Transformers: 3.0.1
  • Transformers: 4.44.0
  • PyTorch: 2.4.0
  • Accelerate: 0.34.2
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}