
This is a smaller version of the XLM-RoBERTa model that keeps only the Ukrainian and some English embeddings.

  • The original model has 470M parameters, with 384M of them being input and output embeddings.
  • After shrinking the SentencePiece vocabulary from 250K to 31K tokens (the top 25K Ukrainian tokens plus the top English tokens), the model has 134M parameters, and its size drops from 1GB to 400MB (see the shrinking sketch below).
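
The shrinking script itself is not part of this card; the sketch below only illustrates the general idea of pruning a multilingual embedding matrix, assuming a precomputed list `kept_token_ids` of token ids to keep. The tokenizer pruning step is not shown, and this is not the exact procedure used to produce this model.

```python
import torch
from transformers import AutoModel

# Start from the full multilingual checkpoint.
model = AutoModel.from_pretrained("xlm-roberta-base")

# `kept_token_ids` is an assumption: ids of the SentencePiece tokens to keep
# (e.g. the most frequent Ukrainian tokens plus the top English tokens),
# computed beforehand from token frequencies on a reference corpus.
kept_token_ids = list(range(31_000))  # placeholder for illustration only

# Copy the embedding rows of the kept tokens into a smaller embedding matrix.
old_embeddings = model.get_input_embeddings()          # shape: (250002, 768)
new_weight = old_embeddings.weight.data[kept_token_ids].clone()
model.set_input_embeddings(torch.nn.Embedding.from_pretrained(new_weight, freeze=False))
model.config.vocab_size = len(kept_token_ids)

# The SentencePiece vocabulary must be pruned to the same token set, in the same
# order, so that the new token ids map onto the new embedding rows (not shown here).
model.save_pretrained("xlm-roberta-base-uk-shrunk")
```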

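The published model can be loaded with the standard `transformers` API; the snippet below is a minimal usage example (the printed shape assumes the usual 768-dimensional base encoder).

```python
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ukr-models/xlm-roberta-base-uk")
model = AutoModel.from_pretrained("ukr-models/xlm-roberta-base-uk")

# Encode a Ukrainian sentence ("Hello, world!") and inspect the contextual embeddings.
inputs = tokenizer("Привіт, світе!", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, 768])
```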