XALMA-13B-Pretrain + Separate Training Collection • 50 items: fifty expert models, each produced by separately fine-tuning XALMA-13B-Pretrain on one of 50 languages (see the sketch after this list).
Asymmetric Conflict and Synergy in Post-training for LLM-based Multilingual Machine Translation • Paper • arXiv:2502.11223
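Below is a minimal sketch of the per-language expert recipe the collection describes: the shared XALMA-13B-Pretrain base is fine-tuned independently on each language, yielding one expert checkpoint per language. The Hub repo id, the dataset name `my-org/mt-data`, the `text` column, the language codes, and all hyperparameters are assumptions for illustration, not details taken from the collection itself.

```python
# Sketch only: repo id, dataset name, columns, and hyperparameters are
# assumptions, not taken from the XALMA collection.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

BASE = "XALMA-13B-Pretrain"  # hypothetical Hub repo id for the shared base


def train_expert(lang: str) -> None:
    """Fine-tune one expert on a single language's training data."""
    tokenizer = AutoTokenizer.from_pretrained(BASE)
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token  # needed for padding

    model = AutoModelForCausalLM.from_pretrained(BASE)

    # Hypothetical dataset with one config per language and a "text" column.
    data = load_dataset("my-org/mt-data", lang, split="train")
    data = data.map(
        lambda ex: tokenizer(ex["text"], truncation=True, max_length=1024),
        remove_columns=data.column_names,
    )

    args = TrainingArguments(
        output_dir=f"expert-{lang}",  # one independent checkpoint per language
        per_device_train_batch_size=1,
        num_train_epochs=1,
    )
    # mlm=False makes the collator copy input_ids into labels (causal LM).
    collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
    Trainer(model=model, args=args, train_dataset=data, data_collator=collator).train()


# Each run starts from the same initialization but shares no parameters
# afterwards, matching the "separate training" setup in the collection.
for lang in ["de", "fr", "zh"]:  # extend to all 50 languages
    train_expert(lang)
```

The key design point the sketch illustrates is that every expert begins from the same pretrained base and trains in isolation, so cross-language interference during post-training is eliminated at the cost of storing 50 full checkpoints.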