Gameselo committed on
Commit 07ce3f6
1 Parent(s): 88edc46

Update README.md

Files changed (1)
  1. README.md +33 -494
README.md CHANGED
@@ -995,6 +995,38 @@ model-index:
        name: Spearman Max
  ---
 
+ /!\ This model achieves SOTA results on the MTEB STS multilingual leaderboard (under "other"). Here is the comparison:
+
+ | Dataset | State-of-the-art results (Multi) | STSb-XLM-RoBERTa-base | Paraphrase Multilingual MPNet base v2 |
+ |:--------|:---------------------------------|:----------------------|:--------------------------------------|
+ | Average | 73.17 | 71.68 | **73.89** |
+ | STS17 (ar-ar) | **81.87** | 80.43 | 81.24 |
+ | STS17 (en-ar) | **81.22** | 76.3 | 77.03 |
+ | STS17 (en-de) | 87.3 | 91.06 | **91.09** |
+ | STS17 (en-tr) | 77.18 | **80.74** | 79.87 |
+ | STS17 (es-en) | **88.24** | 83.09 | 85.53 |
+ | STS17 (es-es) | **88.25** | 84.16 | 87.27 |
+ | STS17 (fr-en) | 88.06 | **91.33** | 90.68 |
+ | STS17 (it-en) | 89.68 | **92.87** | 92.47 |
+ | STS17 (ko-ko) | 83.69 | **97.67** | 97.66 |
+ | STS17 (nl-en) | 88.25 | **92.13** | 91.15 |
+ | STS22 (ar) | 58.67 | 58.67 | **62.66** |
+ | STS22 (de) | **60.12** | 52.17 | 57.74 |
+ | STS22 (de-en) | **60.92** | 58.5 | 57.5 |
+ | STS22 (de-fr) | **67.79** | 51.28 | 57.99 |
+ | STS22 (de-pl) | **58.69** | 44.56 | 44.22 |
+ | STS22 (es) | **68.57** | 63.68 | 66.21 |
+ | STS22 (es-en) | **78.8** | 70.65 | 75.18 |
+ | STS22 (es-it) | **75.04** | 60.88 | 66.25 |
+ | STS22 (fr) | **83.75** | 76.46 | 78.76 |
+ | STS22 (fr-pl) | 84.52 | 84.52 | **84.52** |
+ | STS22 (it) | **79.28** | 66.73 | 68.47 |
+ | STS22 (pl) | 42.08 | 41.18 | **43.36** |
+ | STS22 (pl-en) | **77.5** | 64.35 | 75.11 |
+ | STS22 (ru) | **61.71** | 58.59 | 58.67 |
+ | STS22 (tr) | **68.72** | 57.52 | 63.84 |
+ | STS22 (zh-en) | **71.88** | 60.69 | 65.37 |
+ | STSb | 89.86 | 95.05 | **95.15** |
+
  # SentenceTransformer based on sentence-transformers/paraphrase-multilingual-mpnet-base-v2
 
  This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/paraphrase-multilingual-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
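For reference, a minimal usage sketch of a sentence-transformers checkpoint like the one described above; the model id below is a placeholder (it is not given in this diff) and should be replaced with this repository's actual id:

```python
# Minimal usage sketch (assumes the sentence-transformers package is installed).
# "username/model-id" is a placeholder, not a confirmed repository name.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("username/model-id")

sentences = [
    "A man is eating food.",
    "Un homme mange de la nourriture.",  # the base model is multilingual
    "The weather is nice today.",
]

# Encode to 768-dimensional dense vectors, as the card describes.
embeddings = model.encode(sentences)
print(embeddings.shape)  # expected: (3, 768)

# Cosine similarities between all sentence pairs.
similarities = util.cos_sim(embeddings, embeddings)
print(similarities)
```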
@@ -1103,7 +1135,7 @@ You can finetune this model on your own dataset.
  | pearson_max | 0.9551 |
  | spearman_max | 0.9593 |
 
- #### Semantic Similarity
+ #### Evaluation results vs SOTA results
  * Dataset: `sts-test`
  * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
 
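The Pearson/Spearman tables in this card are produced with the EmbeddingSimilarityEvaluator linked above. A hedged sketch of how such an evaluation is typically run (the sentence pairs, gold scores, and model id below are illustrative only, not the actual sts-test data behind the card's numbers):

```python
# Sketch of an EmbeddingSimilarityEvaluator run on a tiny made-up dataset.
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("username/model-id")  # placeholder id

sentences1 = ["A man is eating food.", "A woman is playing violin."]
sentences2 = ["A man is eating a piece of bread.", "Someone is playing guitar."]
gold_scores = [0.9, 0.4]  # similarity labels normalized to [0, 1]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, gold_scores, name="sts-test")
results = evaluator(model)
# Recent sentence-transformers versions return a dict of Pearson/Spearman
# correlations for cosine, Euclidean, Manhattan, and dot-product similarity.
print(results)
```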
@@ -1120,499 +1152,6 @@ You can finetune this model on your own dataset.
  | pearson_max | 0.948 |
  | spearman_max | 0.9515 |
 
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.9725 |
- | **spearman_cosine** | **0.9766** |
- | pearson_manhattan | 0.9382 |
- | spearman_manhattan | 0.9487 |
- | pearson_euclidean | 0.9392 |
- | spearman_euclidean | 0.95 |
- | pearson_dot | 0.8531 |
- | spearman_dot | 0.8611 |
- | pearson_max | 0.9725 |
- | spearman_max | 0.9766 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.8027 |
- | **spearman_cosine** | **0.8124** |
- | pearson_manhattan | 0.7839 |
- | spearman_manhattan | 0.79 |
- | pearson_euclidean | 0.7836 |
- | spearman_euclidean | 0.792 |
- | pearson_dot | 0.7699 |
- | spearman_dot | 0.782 |
- | pearson_max | 0.8027 |
- | spearman_max | 0.8124 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.7796 |
- | **spearman_cosine** | **0.7703** |
- | pearson_manhattan | 0.7904 |
- | spearman_manhattan | 0.783 |
- | pearson_euclidean | 0.7912 |
- | spearman_euclidean | 0.7842 |
- | pearson_dot | 0.7077 |
- | spearman_dot | 0.6914 |
- | pearson_max | 0.7912 |
- | spearman_max | 0.7842 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.9113 |
- | **spearman_cosine** | **0.9109** |
- | pearson_manhattan | 0.897 |
- | spearman_manhattan | 0.8934 |
- | pearson_euclidean | 0.8986 |
- | spearman_euclidean | 0.8955 |
- | pearson_dot | 0.8844 |
- | spearman_dot | 0.8923 |
- | pearson_max | 0.9113 |
- | spearman_max | 0.9109 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.9362 |
- | **spearman_cosine** | **0.9379** |
- | pearson_manhattan | 0.923 |
- | spearman_manhattan | 0.9245 |
- | pearson_euclidean | 0.9231 |
- | spearman_euclidean | 0.9251 |
- | pearson_dot | 0.907 |
- | spearman_dot | 0.9186 |
- | pearson_max | 0.9362 |
- | spearman_max | 0.9379 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.8049 |
- | **spearman_cosine** | **0.7987** |
- | pearson_manhattan | 0.8018 |
- | spearman_manhattan | 0.7828 |
- | pearson_euclidean | 0.8007 |
- | spearman_euclidean | 0.7825 |
- | pearson_dot | 0.7895 |
- | spearman_dot | 0.7819 |
- | pearson_max | 0.8049 |
- | spearman_max | 0.7987 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.852 |
- | **spearman_cosine** | **0.8553** |
- | pearson_manhattan | 0.8464 |
- | spearman_manhattan | 0.841 |
- | pearson_euclidean | 0.8468 |
- | spearman_euclidean | 0.8459 |
- | pearson_dot | 0.8093 |
- | spearman_dot | 0.8154 |
- | pearson_max | 0.852 |
- | spearman_max | 0.8553 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.8752 |
- | **spearman_cosine** | **0.8727** |
- | pearson_manhattan | 0.8745 |
- | spearman_manhattan | 0.8661 |
- | pearson_euclidean | 0.8748 |
- | spearman_euclidean | 0.8668 |
- | pearson_dot | 0.8603 |
- | spearman_dot | 0.852 |
- | pearson_max | 0.8752 |
- | spearman_max | 0.8727 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.9082 |
- | **spearman_cosine** | **0.9068** |
- | pearson_manhattan | 0.8908 |
- | spearman_manhattan | 0.8852 |
- | pearson_euclidean | 0.8908 |
- | spearman_euclidean | 0.8851 |
- | pearson_dot | 0.8889 |
- | spearman_dot | 0.8966 |
- | pearson_max | 0.9082 |
- | spearman_max | 0.9068 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.925 |
- | **spearman_cosine** | **0.9247** |
- | pearson_manhattan | 0.9084 |
- | spearman_manhattan | 0.9029 |
- | pearson_euclidean | 0.9116 |
- | spearman_euclidean | 0.9084 |
- | pearson_dot | 0.9001 |
- | spearman_dot | 0.907 |
- | pearson_max | 0.925 |
- | spearman_max | 0.9247 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.9133 |
- | **spearman_cosine** | **0.9115** |
- | pearson_manhattan | 0.8977 |
- | spearman_manhattan | 0.8933 |
- | pearson_euclidean | 0.8979 |
- | spearman_euclidean | 0.8937 |
- | pearson_dot | 0.8912 |
- | spearman_dot | 0.8988 |
- | pearson_max | 0.9133 |
- | spearman_max | 0.9115 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.8985 |
- | **spearman_cosine** | **0.8452** |
- | pearson_manhattan | 0.8715 |
- | spearman_manhattan | 0.8452 |
- | pearson_euclidean | 0.8809 |
- | spearman_euclidean | 0.8452 |
- | pearson_dot | 0.8538 |
- | spearman_dot | 0.8452 |
- | pearson_max | 0.8985 |
- | spearman_max | 0.8452 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6495 |
- | **spearman_cosine** | **0.6385** |
- | pearson_manhattan | 0.6429 |
- | spearman_manhattan | 0.6474 |
- | pearson_euclidean | 0.6443 |
- | spearman_euclidean | 0.6445 |
- | pearson_dot | 0.6128 |
- | spearman_dot | 0.6108 |
- | pearson_max | 0.6495 |
- | spearman_max | 0.6474 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.7441 |
- | **spearman_cosine** | **0.7518** |
- | pearson_manhattan | 0.7339 |
- | spearman_manhattan | 0.7367 |
- | pearson_euclidean | 0.7337 |
- | spearman_euclidean | 0.7342 |
- | pearson_dot | 0.6886 |
- | spearman_dot | 0.6986 |
- | pearson_max | 0.7441 |
- | spearman_max | 0.7518 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6279 |
- | **spearman_cosine** | **0.6319** |
- | pearson_manhattan | 0.5435 |
- | spearman_manhattan | 0.6002 |
- | pearson_euclidean | 0.54 |
- | spearman_euclidean | 0.5955 |
- | pearson_dot | 0.5658 |
- | spearman_dot | 0.6069 |
- | pearson_max | 0.6279 |
- | spearman_max | 0.6319 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.7779 |
- | **spearman_cosine** | **0.7876** |
- | pearson_manhattan | 0.7426 |
- | spearman_manhattan | 0.7789 |
- | pearson_euclidean | 0.7437 |
- | spearman_euclidean | 0.7806 |
- | pearson_dot | 0.7214 |
- | spearman_dot | 0.7489 |
- | pearson_max | 0.7779 |
- | spearman_max | 0.7876 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.5268 |
- | **spearman_cosine** | **0.5774** |
- | pearson_manhattan | 0.4171 |
- | spearman_manhattan | 0.56 |
- | pearson_euclidean | 0.4219 |
- | spearman_euclidean | 0.5665 |
- | pearson_dot | 0.4981 |
- | spearman_dot | 0.5367 |
- | pearson_max | 0.5268 |
- | spearman_max | 0.5774 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6306 |
- | **spearman_cosine** | **0.6384** |
- | pearson_manhattan | 0.6034 |
- | spearman_manhattan | 0.6168 |
- | pearson_euclidean | 0.6081 |
- | spearman_euclidean | 0.622 |
- | pearson_dot | 0.5767 |
- | spearman_dot | 0.5831 |
- | pearson_max | 0.6306 |
- | spearman_max | 0.6384 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.5568 |
- | **spearman_cosine** | **0.5867** |
- | pearson_manhattan | 0.4924 |
- | spearman_manhattan | 0.5738 |
- | pearson_euclidean | 0.4906 |
- | spearman_euclidean | 0.5762 |
- | pearson_dot | 0.4307 |
- | spearman_dot | 0.5471 |
- | pearson_max | 0.5568 |
- | spearman_max | 0.5867 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:----------|
- | pearson_cosine | 0.5776 |
- | **spearman_cosine** | **0.575** |
- | pearson_manhattan | 0.5718 |
- | spearman_manhattan | 0.5501 |
- | pearson_euclidean | 0.5695 |
- | spearman_euclidean | 0.5532 |
- | pearson_dot | 0.5315 |
- | spearman_dot | 0.5191 |
- | pearson_max | 0.5776 |
- | spearman_max | 0.575 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.3572 |
- | **spearman_cosine** | **0.4336** |
- | pearson_manhattan | 0.2081 |
- | spearman_manhattan | 0.4355 |
- | pearson_euclidean | 0.2086 |
- | spearman_euclidean | 0.4402 |
- | pearson_dot | 0.2234 |
- | spearman_dot | 0.3707 |
- | pearson_max | 0.3572 |
- | spearman_max | 0.4402 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6863 |
- | **spearman_cosine** | **0.6621** |
- | pearson_manhattan | 0.6429 |
- | spearman_manhattan | 0.6484 |
- | pearson_euclidean | 0.6424 |
- | spearman_euclidean | 0.6486 |
- | pearson_dot | 0.6352 |
- | spearman_dot | 0.6159 |
- | pearson_max | 0.6863 |
- | spearman_max | 0.6621 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.757 |
- | **spearman_cosine** | **0.7511** |
- | pearson_manhattan | 0.7191 |
- | spearman_manhattan | 0.714 |
- | pearson_euclidean | 0.7204 |
- | spearman_euclidean | 0.7258 |
- | pearson_dot | 0.7144 |
- | spearman_dot | 0.7284 |
- | pearson_max | 0.757 |
- | spearman_max | 0.7511 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6503 |
- | **spearman_cosine** | **0.6625** |
- | pearson_manhattan | 0.6474 |
- | spearman_manhattan | 0.659 |
- | pearson_euclidean | 0.6517 |
- | spearman_euclidean | 0.6667 |
- | pearson_dot | 0.5647 |
- | spearman_dot | 0.5702 |
- | pearson_max | 0.6517 |
- | spearman_max | 0.6667 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6774 |
- | **spearman_cosine** | **0.6537** |
- | pearson_manhattan | 0.6825 |
- | spearman_manhattan | 0.6325 |
- | pearson_euclidean | 0.6906 |
- | spearman_euclidean | 0.6407 |
- | pearson_dot | 0.5835 |
- | spearman_dot | 0.5962 |
- | pearson_max | 0.6906 |
- | spearman_max | 0.6537 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.6709 |
- | **spearman_cosine** | **0.6847** |
- | pearson_manhattan | 0.6613 |
- | spearman_manhattan | 0.6907 |
- | pearson_euclidean | 0.6607 |
- | spearman_euclidean | 0.6881 |
- | pearson_dot | 0.6098 |
- | spearman_dot | 0.6195 |
- | pearson_max | 0.6709 |
- | spearman_max | 0.6907 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.5977 |
- | **spearman_cosine** | **0.5799** |
- | pearson_manhattan | 0.5974 |
- | spearman_manhattan | 0.5953 |
- | pearson_euclidean | 0.5949 |
- | spearman_euclidean | 0.5936 |
- | pearson_dot | 0.5043 |
- | spearman_dot | 0.4968 |
- | pearson_max | 0.5977 |
- | spearman_max | 0.5953 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.4562 |
- | **spearman_cosine** | **0.4422** |
- | pearson_manhattan | 0.4155 |
- | spearman_manhattan | 0.3837 |
- | pearson_euclidean | 0.4111 |
- | spearman_euclidean | 0.3822 |
- | pearson_dot | 0.4863 |
- | spearman_dot | 0.5303 |
- | pearson_max | 0.4863 |
- | spearman_max | 0.5303 |
-
- #### Semantic Similarity
- * Dataset: `sts-test`
- * Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
- | Metric | Value |
- |:--------------------|:-----------|
- | pearson_cosine | 0.593 |
- | **spearman_cosine** | **0.6266** |
- | pearson_manhattan | 0.5608 |
- | spearman_manhattan | 0.6229 |
- | pearson_euclidean | 0.558 |
- | spearman_euclidean | 0.6202 |
- | pearson_dot | 0.4578 |
- | spearman_dot | 0.5628 |
- | pearson_max | 0.593 |
- | spearman_max | 0.6266 |
-
  <!--
  ## Bias, Risks and Limitations
 
 