---
license: mit
language:
- uz
---
A pre-trained BERT model for Uzbek (12 layers, cased), trained on a large Uzbek news corpus (Daryo).
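## Usage

A minimal usage sketch with the Hugging Face `transformers` library. The model identifier `username/bert-base-uzbek-cased` below is a placeholder assumption, not confirmed by this card; substitute the actual repository ID on the Hub.

```python
# Minimal sketch: masked-token prediction with a fill-mask pipeline.
# NOTE: "username/bert-base-uzbek-cased" is a hypothetical model ID --
# replace it with this repository's actual Hub identifier.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="username/bert-base-uzbek-cased")

# Predict the masked token in an Uzbek sentence
# ("Tashkent is the [MASK] of Uzbekistan.")
for prediction in fill_mask("Toshkent O'zbekistonning [MASK] hisoblanadi."):
    print(prediction["token_str"], prediction["score"])
```

Since the model is cased, input text should preserve its original capitalization rather than be lowercased.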