Please explain.

#1
by ZeroWw - opened

Is this a finetune, or was the model trained from scratch/merged?

Hmm, it seems it was trained from scratch…
Who did this?


Owner

Is this a finetune, or was the model trained from scratch/merged?

Thank you for your interest in our work. Our model is based on the LLaMA series of models and has been further trained on data from over 100 languages. For detailed training information, please refer to our paper: https://arxiv.org/pdf/2407.05975

"Further"? Meaning "fine-tuned"?

Owner

"Further"? Meaning "fine-tuned"?

Yes, we used LLaMA3 as the starting point and continued training it on data from more languages.
