
rubert-base-cased

RuBERT (Russian, cased, 12-layer, 768-hidden, 12-heads, 180M parameters) was trained on the Russian part of Wikipedia and news data. We used this training data to build a vocabulary of Russian subtokens and took a multilingual version of BERT-base as an initialization for RuBERT[1].

08.11.2021: uploaded the model with MLM and NSP heads
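
Below is a minimal sketch of loading this checkpoint with the Hugging Face transformers library (assuming transformers and torch are installed); BertForPreTraining exposes both the MLM and NSP heads mentioned above. The example sentence is only illustrative.

```python
import torch
from transformers import AutoTokenizer, BertForPreTraining

# Load the tokenizer and the pretraining model (MLM + NSP heads)
tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = BertForPreTraining.from_pretrained("DeepPavlov/rubert-base-cased")

text = "Москва - столица России."  # illustrative Russian sentence
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# prediction_logits: per-token MLM scores over the vocabulary
# seq_relationship_logits: NSP (is-next-sentence) scores
print(outputs.prediction_logits.shape)
print(outputs.seq_relationship_logits.shape)
```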

[1]: Kuratov, Y., & Arkhipov, M. (2019). Adaptation of Deep Bidirectional Multilingual Transformers for Russian Language. arXiv preprint arXiv:1905.07213.
