---
base_model: cointegrated/LaBSE-en-ru
language:
- ru
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- negative_mse
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:10975066
- loss:MSELoss
widget:
- source_sentence: Такие лодки строились, чтобы получить быстрый доступ к приходящим судам.
sentences:
- been nice talking to you
- >-
Нельзя ставить под сомнение притязания клиента, если не были предприняты
шаги.
- >-
Dharangaon Railway Station serves Dharangaon in Jalgaon district in the
Indian state of Maharashtra.
- source_sentence: >-
Если прилагательные смягчают этнические термины, существительные могут
сделать их жестче.
sentences:
- >-
Вслед за этим последовало секретное письмо А.Б.Чубайса об изъятии у МЦР,
переданного ему С.Н.Рерихом наследия.
- Coaches should not give young athletes a hard time.
- Эшкрофт хотел прослушивать сводки новостей снова и снова
- source_sentence: Земля была мягкой.
sentences:
- >-
По мере того, как самообладание покидало его, сердце его все больше
наполнялось тревогой.
- >-
Our borders and immigration system, including law enforcement, ought to send
a message of welcome, tolerance, and justice to members of immigrant
communities in the United States and in their countries of origin.
- >-
Начнут действовать льготные условия аренды земель, которые предназначены для
реализации инвестиционных проектов.
- source_sentence: >-
Что же касается рава Кука: мой рав лично знал его и много раз с теплотой
рассказывал мне о нем как о великом каббалисте.
sentences:
- Вдова Эдгара Эванса, его дети и мать получили 1500 фунтов стерлингов (
- Please do not make any changes to your address.
- Мы уже закончили все запланированные дела!
- source_sentence: See Name section.
sentences:
- >-
Ms. Packard is the voice of the female blood elf in the video game World of
Warcraft.
- >-
Основным функциональным элементом, реализующим функции управления
соединением, является абонентский терминал.
- Yeah, people who might not be hungry.
model-index:
- name: SentenceTransformer based on cointegrated/LaBSE-en-ru
results:
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts dev
type: sts-dev
metrics:
- type: pearson_cosine
value: 0.5305176535187099
name: Pearson Cosine
- type: spearman_cosine
value: 0.6347069834349862
name: Spearman Cosine
- type: pearson_manhattan
value: 0.5553415140113596
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.6389336208598283
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.5499910306125031
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.6347073809507647
name: Spearman Euclidean
- type: pearson_dot
value: 0.5305176585564861
name: Pearson Dot
- type: spearman_dot
value: 0.6347078463557637
name: Spearman Dot
- type: pearson_max
value: 0.5553415140113596
name: Pearson Max
- type: spearman_max
value: 0.6389336208598283
name: Spearman Max
- task:
type: knowledge-distillation
name: Knowledge Distillation
dataset:
name: Unknown
type: unknown
metrics:
- type: negative_mse
value: -0.006337030936265364
name: Negative Mse
- task:
type: semantic-similarity
name: Semantic Similarity
dataset:
name: sts test
type: sts-test
metrics:
- type: pearson_cosine
value: 0.5042796836494269
name: Pearson Cosine
- type: spearman_cosine
value: 0.5986471772428711
name: Spearman Cosine
- type: pearson_manhattan
value: 0.522744495080616
name: Pearson Manhattan
- type: spearman_manhattan
value: 0.5983901280447074
name: Spearman Manhattan
- type: pearson_euclidean
value: 0.522721961447153
name: Pearson Euclidean
- type: spearman_euclidean
value: 0.5986471095414022
name: Spearman Euclidean
- type: pearson_dot
value: 0.504279685613151
name: Pearson Dot
- type: spearman_dot
value: 0.598648155615724
name: Spearman Dot
- type: pearson_max
value: 0.522744495080616
name: Pearson Max
- type: spearman_max
value: 0.598648155615724
name: Spearman Max
---
# SentenceTransformer based on cointegrated/LaBSE-en-ru
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [cointegrated/LaBSE-en-ru](https://huggingface.co/cointegrated/LaBSE-en-ru). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [cointegrated/LaBSE-en-ru](https://huggingface.co/cointegrated/LaBSE-en-ru)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Dense({'in_features': 768, 'out_features': 768, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'})
(3): Normalize()
)
```
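In practice this stack means: take the final-layer [CLS] token from the BERT encoder, pass it through a 768→768 Dense layer with tanh activation, then L2-normalize. The sketch below replays those steps with plain transformers/PyTorch to make the data flow concrete; the `torch.nn.Linear` stand-in is randomly initialized (the trained Dense weights live only in the sentence-transformers checkpoint), so only the shapes and operations are meaningful here.
```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

repo = "whitemouse84/LaBSE-en-ru-distilled-each-third-layer"
tokenizer = AutoTokenizer.from_pretrained(repo)
encoder = AutoModel.from_pretrained(repo)  # (0) Transformer: BertModel

batch = tokenizer(
    ["Пример предложения.", "An example sentence."],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    hidden = encoder(**batch).last_hidden_state
cls = hidden[:, 0]                # (1) Pooling: CLS token only
dense = torch.nn.Linear(768, 768) # (2) Dense: untrained stand-in here
embeddings = F.normalize(torch.tanh(dense(cls)), p=2, dim=1)  # (3) Normalize
print(embeddings.shape)           # torch.Size([2, 768])
```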
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("whitemouse84/LaBSE-en-ru-distilled-each-third-layer")
# Run inference
sentences = [
'See Name section.',
'Ms. Packard is the voice of the female blood elf in the video game World of Warcraft.',
'Yeah, people who might not be hungry.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
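Because the model is bilingual, Russian and English paraphrases should land near each other in the embedding space. Continuing from the snippet above, a quick sanity check (the sentence pair is illustrative, not from the training data):
```python
pair = model.encode([
    "Кошка сидит на ковре.",              # Russian: "The cat is sitting on the carpet."
    "The cat is sitting on the carpet.",
])
print(model.similarity(pair[0:1], pair[1:2]))
# A parallel pair should score high, e.g. cosine similarity above ~0.9
```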
## Evaluation
### Metrics
#### Semantic Similarity
* Dataset: `sts-dev`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.5305 |
| **spearman_cosine** | **0.6347** |
| pearson_manhattan | 0.5553 |
| spearman_manhattan | 0.6389 |
| pearson_euclidean | 0.55 |
| spearman_euclidean | 0.6347 |
| pearson_dot | 0.5305 |
| spearman_dot | 0.6347 |
| pearson_max | 0.5553 |
| spearman_max | 0.6389 |
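To reproduce this style of evaluation on your own data, a minimal sketch (the sentence pairs and gold scores below are made up for illustration):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("whitemouse84/LaBSE-en-ru-distilled-each-third-layer")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=["Земля была мягкой.", "See Name section."],
    sentences2=["The ground was soft.", "Yeah, people who might not be hungry."],
    scores=[0.95, 0.05],  # made-up gold similarities in [0, 1]
    name="sts-demo",
)
print(evaluator(model))  # pearson/spearman metrics (a dict in recent releases)
```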
#### Knowledge Distillation
* Evaluated with [MSEEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.MSEEvaluator)
| Metric | Value |
|:-----------------|:------------|
| **negative_mse** | **-0.0063** |
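`negative_mse` is the negated mean squared error between student and teacher embeddings, so values closer to zero are better. A minimal sketch of the reported quantity, assuming the ×100 scaling that sentence-transformers' `MSEEvaluator` applies:
```python
import numpy as np

def negative_mse(student_emb: np.ndarray, teacher_emb: np.ndarray) -> float:
    # Negated MSE, scaled by 100 (assumed to match MSEEvaluator's convention)
    return -float(np.mean((student_emb - teacher_emb) ** 2)) * 100
```
Under that convention, the -0.0063 above corresponds to a raw MSE of roughly 6.3e-5.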
#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| pearson_cosine | 0.5043 |
| **spearman_cosine** | **0.5986** |
| pearson_manhattan | 0.5227 |
| spearman_manhattan | 0.5984 |
| pearson_euclidean | 0.5227 |
| spearman_euclidean | 0.5986 |
| pearson_dot | 0.5043 |
| spearman_dot | 0.5986 |
| pearson_max | 0.5227 |
| spearman_max | 0.5986 |
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 10,975,066 training samples
* Columns: `sentence` and `label`
* Approximate statistics based on the first 1000 samples:
| | sentence | label |
|:--------|:---------|:------|
| type | string | list |
* Samples:
| sentence | label |
|:---------|:------|
| It is based on the Java Persistence API (JPA), but it does not strictly follow the JSR 338 Specification, as it implements different design patterns and technologies. | [-0.012331949546933174, -0.04570527374744415, -0.024963658303022385, -0.03620213270187378, 0.022556383162736893, ...] |
| Покупаем вторичное сырье в Каунасе (Переработка вторичного сырья) - Алфенас АНД КО, ЗАО на Bizorg. | [-0.07498518377542496, -0.01913534104824066, -0.01797042042016983, 0.048263177275657654, -0.00016611881437711418, ...] |
| At the Equal Justice Conference ( EJC ) held in March 2001 in San Diego , LSC and the Project for the Future of Equal Justice held the second Case Management Software pre-conference . | [0.03870972990989685, -0.0638347640633583, -0.01696585863828659, -0.043612319976091385, -0.048241738229990005, ...] |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
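Each training sample pairs a sentence with the teacher's 768-dimensional embedding as the label, and MSELoss pulls the student's output toward it. The model name suggests the student keeps every third encoder layer of the teacher; below is a hedged sketch of how such a student could be built (the exact layer selection and data pipeline used for this checkpoint are assumptions):
```python
import torch
from sentence_transformers import SentenceTransformer, losses

teacher = SentenceTransformer("cointegrated/LaBSE-en-ru")
student = SentenceTransformer("cointegrated/LaBSE-en-ru")

# Assumption: keep every third encoder layer, as the model name implies.
bert = student[0].auto_model
kept = [layer for i, layer in enumerate(bert.encoder.layer) if i % 3 == 0]
bert.encoder.layer = torch.nn.ModuleList(kept)
bert.config.num_hidden_layers = len(kept)

# The labels shown above are teacher.encode(...) vectors precomputed for
# each sentence; MSELoss compares them with the student's embeddings.
train_loss = losses.MSELoss(model=student)
```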
### Evaluation Dataset
#### Unnamed Dataset
* Size: 10,000 evaluation samples
* Columns: `sentence` and `label`
* Approximate statistics based on the first 1000 samples:
| | sentence | label |
|:--------|:---------|:------|
| type | string | list |
* Samples:
| sentence | label |
|:---------|:------|
| The Canadian Canoe Museum is a museum dedicated to canoes located in Peterborough, Ontario, Canada. | [-0.05444105342030525, -0.03650881350040436, -0.041163671761751175, -0.010616903193295002, -0.04094529151916504, ...] |
| И мне нравилось, что я одновременно зарабатываю и смотрю бои». | [-0.03404555842280388, 0.028203096240758896, -0.056121889501810074, -0.0591997392475605, -0.05523117259144783, ...] |
| Ну, а на следующий день, разумеется, Президент Кеннеди объявил блокаду Кубы, и наши корабли остановили у кубинских берегов направлявшийся на Кубу российский корабль, и у него на борту нашли ракеты. | [-0.008193841204047203, 0.00694894278421998, -0.03027420863509178, -0.03290146216750145, 0.01425305474549532, ...] |
* Loss: [MSELoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#mseloss)
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 64
- `per_device_eval_batch_size`: 64
- `learning_rate`: 0.0001
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
- `load_best_model_at_end`: True
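As a rough sketch, these settings map onto `SentenceTransformerTrainingArguments` (sentence-transformers v3+ with a recent `transformers`) as follows; `output_dir` is a placeholder:
```python
from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output/labse-en-ru-distilled",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    learning_rate=1e-4,
    num_train_epochs=1,
    warmup_ratio=0.1,
    fp16=True,
    load_best_model_at_end=True,
)
```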
#### All Hyperparameters