
DeBERTaV2 base model pretrained on Malaysian text with masked language modeling and a 512-token context length.

Special thanks to https://github.com/aisyahrzk for pretraining the DeBERTaV2 base model.

WandB logs: https://wandb.ai/aisyahrazak/deberta-base?nw=nwuseraisyahrazak
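Since this is a masked-language-model checkpoint, it can be loaded with the standard `transformers` auto classes. Below is a minimal sketch of masked-token prediction, assuming the model ID `mesolitica/malaysian-debertav2-base` (from the collection below) and that the tokenizer ships with the checkpoint; the example Malay sentence is illustrative only.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Model ID assumed from the collection this card belongs to.
model_id = "mesolitica/malaysian-debertav2-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

# Predict a masked token in a Malay sentence (inputs up to 512 tokens).
text = f"Kuala Lumpur ialah ibu {tokenizer.mask_token} Malaysia."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and take the highest-scoring token.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted = tokenizer.decode(logits[0, mask_pos].argmax(-1))
print(predicted)
```

The same checkpoint can also be fine-tuned for downstream tasks (e.g. classification) by swapping `AutoModelForMaskedLM` for the appropriate task head class.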

- Format: Safetensors
- Model size: 114M params
- Tensor type: BF16

Model ID: mesolitica/malaysian-debertav2-base