ai-forever committed · Commit e954845 · Parent: 824c126

Update README.md

README.md CHANGED
@@ -21,7 +21,6 @@ An extensive dataset with “artificial” errors was taken as a training corpus
 - [SAGE library announcement](https://youtu.be/yFfkV0Qjuu0), DataFest 2023
 - [Paper about synthetic error generation methods](https://www.dialog-21.ru/media/5914/martynovnplusetal056.pdf), Dialogue 2023
 - [Paper about SAGE and our best solution](https://arxiv.org/abs/2308.09435), Review EACL 2024
-Path to model = "ai-forever/RuM2M100-418M"
 
 ### Examples
 | Input | Output |
@@ -87,7 +86,7 @@ We compare our solution with both open automatic spell checkers and the ChatGPT
 ```python
 from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer
 
-path_to_model = "
+path_to_model = "ai-forever/RuM2M100-418M"
 
 model = M2M100ForConditionalGeneration.from_pretrained(path_to_model)
 tokenizer = M2M100Tokenizer.from_pretrained(path_to_model)
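
The snippet touched by this commit only loads the model and tokenizer. As a usage note, here is a minimal inference sketch for running the spell checker end to end. The example sentence, the `src_lang = "ru"` setting, and the forced `ru` BOS token are assumptions for illustration, not part of this diff:

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

path_to_model = "ai-forever/RuM2M100-418M"

model = M2M100ForConditionalGeneration.from_pretrained(path_to_model)
tokenizer = M2M100Tokenizer.from_pretrained(path_to_model)

# Made-up misspelled Russian input (illustrative; "писмо" should be "письмо").
sentence = "я пишу тебе писмо"

# Assumption: source and target language are both Russian for this checker.
tokenizer.src_lang = "ru"

# Encode the input, generate a corrected candidate, and decode it back to text.
# forced_bos_token_id steers the M2M100 decoder toward Russian output.
encodings = tokenizer(sentence, return_tensors="pt")
generated = model.generate(
    **encodings,
    forced_bos_token_id=tokenizer.get_lang_id("ru"),
)
answer = tokenizer.batch_decode(generated, skip_special_tokens=True)
print(answer[0])
```

With a batch of one sentence, `answer` is a list containing a single corrected string.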