---
license: cc-by-4.0
language:
- cs
- pl
- sk
- sl
- en
library_name: transformers
tags:
- translation
- mt
- marian
- pytorch
- sentence-piece
- multilingual
- allegro
- laniqo
---

# MultiSlav BiDi Models

[//]: # (<p align="center">)

[//]: # ( <a href="https://ml.allegro.tech/"><img src="allegro-title.svg" alt="MLR @ Allegro.com"></a>)

[//]: # (</p>)

## Multilingual BiDi MT Models

___BiDi___ is a collection of encoder-decoder vanilla transformer models trained on the sentence-level machine translation task.
Each model supports bi-directional translation.

___BiDi___ models are part of the [___MultiSlav___ collection](https://huggingface.co/collections/allegro/multislav-6793d6b6419e5963e759a683). More information will be available soon in our upcoming MultiSlav paper.

Experiments were conducted under a research project by the [Machine Learning Research](https://ml.allegro.tech/) lab for [Allegro.com](https://ml.allegro.tech/).
Big thanks to [laniqo.com](https://laniqo.com) for cooperation in the research.

<p align="center">
  <img src="bi-di.svg">
</p>

The graphic above shows an example ___BiDi___ model, [BiDi-ces-pol](https://huggingface.co/allegro/bidi-ces-pol), translating from Polish to Czech.
___BiDi-ces-pol___ is a bi-directional model supporting translation both __from Czech to Polish__ and __from Polish to Czech__.

### Supported languages

To use a ___BiDi___ model, you must provide the target language for translation.
Target languages are represented by 3-letter ISO 639-3 codes embedded in a token of the form `>>xxx<<`.
All accepted directions and their respective tokens are listed below.
Note that each model supports only two directions.
Each target-language token was added as a special token to the SentencePiece tokenizer.

| **Target Language** | **First token** |
|---------------------|-----------------|
| Czech               | `>>ces<<`       |
| English             | `>>eng<<`       |
| Polish              | `>>pol<<`       |
| Slovak              | `>>slk<<`       |
| Slovene             | `>>slv<<`       |

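Because the direction tokens are part of the tokenizer vocabulary, you can sanity-check that a given checkpoint knows the token you plan to use. A minimal sketch (the choice of the `allegro/bidi-ces-pol` checkpoint and the assertion style are illustrative):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allegro/bidi-ces-pol")

# Each supported direction token should map to a known vocabulary entry, not <unk>.
for token in (">>ces<<", ">>pol<<"):
    token_id = tokenizer.convert_tokens_to_ids(token)
    assert token_id != tokenizer.unk_token_id, f"{token} is not supported by this checkpoint"
    print(token, "->", token_id)
```
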
### Bi-Di models available

We provide 10 ___BiDi___ models, covering 20 translation directions between 5 languages.

| **Bi-Di model** | **Languages supported** | **HF repository**                                                   |
|-----------------|-------------------------|---------------------------------------------------------------------|
| BiDi-ces-eng    | Czech ↔ English         | [allegro/BiDi-ces-eng](https://huggingface.co/allegro/bidi-ces-eng) |
| BiDi-ces-pol    | Czech ↔ Polish          | [allegro/BiDi-ces-pol](https://huggingface.co/allegro/bidi-ces-pol) |
| BiDi-ces-slk    | Czech ↔ Slovak          | [allegro/BiDi-ces-slk](https://huggingface.co/allegro/bidi-ces-slk) |
| BiDi-ces-slv    | Czech ↔ Slovene         | [allegro/BiDi-ces-slv](https://huggingface.co/allegro/bidi-ces-slv) |
| BiDi-eng-pol    | English ↔ Polish        | [allegro/BiDi-eng-pol](https://huggingface.co/allegro/bidi-eng-pol) |
| BiDi-eng-slk    | English ↔ Slovak        | [allegro/BiDi-eng-slk](https://huggingface.co/allegro/bidi-eng-slk) |
| BiDi-eng-slv    | English ↔ Slovene       | [allegro/BiDi-eng-slv](https://huggingface.co/allegro/bidi-eng-slv) |
| BiDi-pol-slk    | Polish ↔ Slovak         | [allegro/BiDi-pol-slk](https://huggingface.co/allegro/bidi-pol-slk) |
| BiDi-pol-slv    | Polish ↔ Slovene        | [allegro/BiDi-pol-slv](https://huggingface.co/allegro/bidi-pol-slv) |
| BiDi-slk-slv    | Slovak ↔ Slovene        | [allegro/BiDi-slk-slv](https://huggingface.co/allegro/bidi-slk-slv) |

## Use case quickstart

Example code snippet for using a model. Due to a bug, the `MarianMTModel` class must be used explicitly.
Remember to adjust the source and target languages to your use case.

```python
from transformers import AutoTokenizer, MarianMTModel

source_lang = "pol"
target_lang = "ces"
first_lang, second_lang = sorted([source_lang, target_lang])
model_name = f"Allegro/BiDi-{first_lang}-{second_lang}"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# Prepend the target-language token to the source sentence.
# Polish source: "Allegro is an online e-commerce platform where small and medium-sized businesses, as well as big brands, sell their products."
text = f">>{target_lang}<<" + " " + "Allegro to internetowa platforma e-commerce, na której swoje produkty sprzedają średnie i małe firmy, jak również duże marki."

batch_to_translate = [text]
translations = model.generate(**tokenizer.batch_encode_plus(batch_to_translate, return_tensors="pt"))
decoded_translation = tokenizer.batch_decode(translations, skip_special_tokens=True, clean_up_tokenization_spaces=True)[0]

print(decoded_translation)
```

Generated Czech output:
> Allegro je online e-commerce platforma, na které své výrobky prodávají střední a malé firmy, stejně jako velké značky.

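For repeated use it may be convenient to wrap the steps above in small helpers that load a checkpoint once and translate whole batches in either direction. This is a minimal sketch under the same assumptions as the snippet above; the `load_bidi`/`translate` helper names and the lowercase repository ids are our own:

```python
from transformers import AutoTokenizer, MarianMTModel

def load_bidi(lang_a: str, lang_b: str):
    # The pair is sorted so that either order of language codes resolves to the same checkpoint.
    first, second = sorted([lang_a, lang_b])
    name = f"allegro/bidi-{first}-{second}"
    return AutoTokenizer.from_pretrained(name), MarianMTModel.from_pretrained(name)

def translate(texts, target_lang, tokenizer, model):
    # Prepend the target-language token, then translate the whole batch at once.
    batch = [f">>{target_lang}<< {text}" for text in texts]
    inputs = tokenizer(batch, return_tensors="pt", padding=True)
    outputs = model.generate(**inputs)
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)

tokenizer, model = load_bidi("ces", "pol")
print(translate(["Allegro je online e-commerce platforma."], "pol", tokenizer, model))   # Czech → Polish
print(translate(["Allegro to internetowa platforma e-commerce."], "ces", tokenizer, model))  # Polish → Czech
```
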

## Training

The [SentencePiece](https://github.com/google/sentencepiece) tokenizer has a vocabulary of 32k tokens in total (16k per language). The tokenizer was trained on a randomly sampled part of the training corpus.
Training was performed with the [MarianNMT](https://marian-nmt.github.io/) framework.
Base Marian configuration used: [transformer-big](https://github.com/marian-nmt/marian-dev/blob/master/src/common/aliases.cpp#L113).
All training parameters are listed in the table below.

### Training hyperparameters:

| **Hyperparameter**        | **Value**                                                                                                   |
|---------------------------|-------------------------------------------------------------------------------------------------------------|
| Total Parameter Size      | 209M                                                                                                        |
| Vocab Size                | 32k                                                                                                         |
| Base Parameters           | [Marian transformer-big](https://github.com/marian-nmt/marian-dev/blob/master/src/common/aliases.cpp#L113)  |
| Number of Encoding Layers | 6                                                                                                           |
| Number of Decoding Layers | 6                                                                                                           |
| Model Dimension           | 1024                                                                                                        |
| FF Dimension              | 4096                                                                                                        |
| Heads                     | 16                                                                                                          |
| Dropout                   | 0.1                                                                                                         |
| Batch Size                | mini-batch fit to VRAM                                                                                      |
| Training Accelerators     | 4x A100 40GB                                                                                                |
| Max Length                | 100 tokens                                                                                                  |
| Optimizer                 | Adam                                                                                                        |
| Warmup steps              | 8000                                                                                                        |
| Context                   | Sentence-level MT                                                                                           |
| Languages Supported       | See [Bi-Di models available](#bi-di-models-available)                                                       |
| Precision                 | float16                                                                                                     |
| Validation Freq           | 3000 steps                                                                                                  |
| Stop Metric               | ChrF                                                                                                        |
| Stop Criterion            | 20 validation steps                                                                                         |

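The architecture rows above should also be reflected in the Hugging Face config shipped with each checkpoint; a quick way to confirm them, assuming the standard `MarianConfig` attribute names:

```python
from transformers import AutoConfig

# All BiDi checkpoints share the same architecture; bidi-ces-pol is used as an example.
cfg = AutoConfig.from_pretrained("allegro/bidi-ces-pol")
print(cfg.encoder_layers, cfg.decoder_layers)       # expected: 6, 6
print(cfg.d_model, cfg.encoder_ffn_dim)             # expected: 1024, 4096
print(cfg.encoder_attention_heads, cfg.vocab_size)  # expected: 16, ~32k
```
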

## Training corpora

The main research question was: "How does adding additional, related languages impact the quality of the model?" We explored it within the Slavic language family.
The ___BiDi___ models are our baseline before expanding the data regime with higher-level multilinguality.

Datasets were downloaded via the [MT-Data](https://pypi.org/project/mtdata/0.2.10/) library.
The total number of examples after filtering and deduplication varies by language pair; see the table below.

| **Language pair** | **Number of training examples** |
|-------------------|--------------------------------:|
| Czech ↔ Polish    |                             63M |
| Czech ↔ Slovak    |                             30M |
| Czech ↔ Slovene   |                             25M |
| Polish ↔ Slovak   |                             26M |
| Polish ↔ Slovene  |                             23M |
| Slovak ↔ Slovene  |                             18M |
| ----------------  | ------------------------------- |
| Czech ↔ English   |                            151M |
| English ↔ Polish  |                            150M |
| English ↔ Slovak  |                             52M |
| English ↔ Slovene |                             40M |

The corpora used (some apply only to specific directions):

| **Corpus**           |
|----------------------|
| paracrawl            |
| opensubtitles        |
| multiparacrawl       |
| dgt                  |
| elrc                 |
| xlent                |
| wikititles           |
| wmt                  |
| wikimatrix           |
| dcep                 |
| ELRC                 |
| tildemodel           |
| europarl             |
| eesc                 |
| eubookshop           |
| emea                 |
| jrc_acquis           |
| ema                  |
| qed                  |
| elitr_eca            |
| EU-dcep              |
| rapid                |
| ecb                  |
| kde4                 |
| news_commentary      |
| kde                  |
| bible_uedin          |
| europat              |
| elra                 |
| wikipedia            |
| wikimedia            |
| tatoeba              |
| globalvoices         |
| euconst              |
| ubuntu               |
| php                  |
| ecdc                 |
| eac                  |
| eac_reference        |
| gnome                |
| EU-eac               |
| books                |
| EU-ecdc              |
| newsdev              |
| khresmoi_summary     |
| czechtourism         |
| khresmoi_summary_dev |
| worldbank            |

## Evaluation

Evaluation of the models was performed on the [Flores200](https://huggingface.co/datasets/facebook/flores) dataset.
The table below compares the performance of open-source models and all applicable models from our collection.
Metric used: Unbabel/wmt22-comet-da.

| **Direction**                                      | **CES → ENG** | **CES → POL** | **CES → SLK** | **CES → SLV** | **ENG → CES** | **ENG → POL** | **ENG → SLK** | **ENG → SLV** | **POL → CES** | **POL → ENG** | **POL → SLK** | **POL → SLV** | **SLK → CES** | **SLK → ENG** | **SLK → POL** | **SLK → SLV** | **SLV → CES** | **SLV → ENG** | **SLV → POL** | **SLV → SLK** |
|----------------------------------------------------|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|--------------:|
| **M2M-100**                                        | 87.0 | 89.0 | 92.1 | 89.7 | 88.6 | 86.4 | 88.4 | 87.3 | 89.6 | 84.6 | 89.4 | 88.4 | 92.7 | 86.8 | 89.1 | 89.6 | 90.3 | 86.4 | 88.7 | 90.1 |
| **NLLB-200**                                       | 88.1 | 88.9 | 91.2 | 88.6 | 90.4 | __88.5__ | 90.1 | 88.8 | 89.4 | __85.8__ | 88.9 | 87.7 | 91.8 | 88.2 | 88.9 | 88.8 | 90.0 | __87.5__ | 88.6 | 89.4 |
| **Seamless-M4T**                                   | 87.5 | 80.9 | 90.8 | 82.0 | __90.7__ | __88.5__ | __90.6__ | __89.6__ | 79.6 | 85.4 | 80.0 | 76.4 | 91.5 | 87.2 | 81.2 | 82.9 | 80.9 | 87.3 | 76.7 | 81.0 |
| **OPUS-MT Sla-Sla**                                | __88.2__ | 82.8 | - | 83.4 | 89.1 | 85.6 | 89.5 | 84.5 | 82.9 | 82.2 | - | 81.2 | - | __88.4__ | - | - | 83.5 | 84.1 | 80.8 | - |
| **OPUS-MT SK-EN**                                  | - | - | - | - | - | - | 89.5 | - | - | - | - | - | - | __88.4__ | - | - | - | - | - | - |
| _Our contributions:_                               | | | | | | | | | | | | | | | | | | | | |
| **BiDi Models**<span style="color:green;">*</span> | 87.5 | 89.4 | 92.4 | 89.8 | 87.8 | 86.2 | 87.2 | 86.6 | 90.0 | 85.0 | 89.1 | 88.4 | 92.9 | 87.3 | 88.8 | 89.4 | 90.0 | 86.9 | 88.1 | 89.1 |
| **P4-pol**<span style="color:red;">◊</span>        | - | 89.6 | 90.8 | 88.7 | - | - | - | - | 90.2 | - | 89.8 | 88.7 | 91.0 | - | 89.3 | 88.4 | 89.3 | - | 88.7 | 88.5 |
| **P5-eng**<span style="color:red;">◊</span>        | 88.0 | 89.0 | 90.7 | 89.0 | 88.8 | 87.3 | 88.4 | 87.5 | 89.0 | 85.7 | 88.5 | 87.8 | 91.0 | 88.2 | 88.6 | 88.5 | 89.6 | 87.2 | 88.4 | 88.9 |
| **P5-ces**<span style="color:red;">◊</span>        | 87.9 | 89.6 | __92.5__ | 89.9 | 88.4 | 85.0 | 87.9 | 85.9 | 90.3 | 84.5 | 89.5 | 88.0 | __93.0__ | 87.8 | 89.4 | 89.8 | 90.3 | 85.7 | 87.9 | 89.8 |
| **MultiSlav-4slav**                                | - | 89.7 | __92.5__ | 90.0 | - | - | - | - | 90.2 | - | 89.6 | 88.7 | 92.9 | - | 89.4 | 90.1 | __90.6__ | - | 88.9 | __90.2__ |
| **MultiSlav-5lang**                                | 87.8 | __89.8__ | __92.5__ | __90.1__ | 88.9 | 86.9 | 88.0 | 87.3 | __90.4__ | 85.4 | 89.8 | __88.9__ | 92.9 | 87.8 | __89.6__ | __90.2__ | __90.6__ | 87.0 | __89.2__ | __90.2__ |

<span style="color:red;">◊</span> a system of two models: *Many2XXX* and *XXX2Many*

<span style="color:green;">*</span> results combined for all bi-directional models; each value comes from the applicable model

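The scores above can be reproduced with the reference `unbabel-comet` package. A minimal, hedged sketch of scoring a system output with the same metric; the sentences below are toy placeholders, and loading the actual Flores200 sources, hypotheses, and references is left to the reader:

```python
# pip install unbabel-comet
from comet import download_model, load_from_checkpoint

# Load the same metric as used in the table above.
model_path = download_model("Unbabel/wmt22-comet-da")
comet = load_from_checkpoint(model_path)

# Each item pairs a source sentence, a system translation, and a reference translation.
data = [
    {"src": "Allegro to internetowa platforma e-commerce.",
     "mt": "Allegro je online e-commerce platforma.",
     "ref": "Allegro je internetová e-commerce platforma."},
]
output = comet.predict(data, batch_size=8, gpus=0)
print(output.system_score)  # corpus-level score, comparable to the table above
```
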

## Limitations and Biases

We did not evaluate the inherent bias contained in the training datasets. It is advised to validate the bias of our models in your prospective domain. This might be especially problematic in translation from English to Slavic languages, which require explicitly marked gender; the model might hallucinate gender based on biases present in the training data.

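As an illustration (not an evaluation), a gender-ambiguous English source forces the model to pick a grammatical gender in Polish; the sentence and setup below are our own toy example, reusing the quickstart pattern:

```python
from transformers import AutoTokenizer, MarianMTModel

model_name = "allegro/bidi-eng-pol"  # lowercase repository id, as linked in the model table above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# English leaves the speaker's gender unspecified, but Polish must mark it,
# so the output exposes whichever gender the model defaults to.
text = ">>pol<< I am a doctor and I am tired."
inputs = tokenizer([text], return_tensors="pt")
print(tokenizer.batch_decode(model.generate(**inputs), skip_special_tokens=True)[0])
```
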
247 |
|
248 |
+
## License
|
249 |
|
250 |
+
The model is licensed under CC BY 4.0, which allows for commercial use.
|
251 |
|
252 |
+
## Citation
|
253 |
+
TO BE UPDATED SOON 🤗
|
254 |
|
|
|
255 |
|
|
|
256 |
|
257 |
+
## Contact Options
|
258 |
|
259 |
+
Authors:
|
260 |
+
- MLR @ Allegro: [Artur Kot](https://linkedin.com/in/arturkot), [Mikołaj Koszowski](https://linkedin.com/in/mkoszowski), [Wojciech Chojnowski](https://linkedin.com/in/wojciech-chojnowski-744702348), [Mieszko Rutkowski](https://linkedin.com/in/mieszko-rutkowski)
|
261 |
+
- Laniqo.com: [Artur Nowakowski](https://linkedin.com/in/artur-nowakowski-mt), [Kamil Guttmann](https://linkedin.com/in/kamil-guttmann), [Mikołaj Pokrywka](https://linkedin.com/in/mikolaj-pokrywka)
|
262 |
|
263 |
+
Please don't hesitate to contact authors if you have any questions or suggestions:
|
264 |
+
- e-mail: [email protected] or [email protected]
|
265 |
+
- LinkedIn: [Artur Kot](https://linkedin.com/in/arturkot) or [Mikołaj Koszowski](https://linkedin.com/in/mkoszowski)
|