---
language:
- ms
---
# 600M parameter, 32768 context length Llama2 for Malaysian text embedding
Trained on inputs truncated to 8k context length, but at inference time it is able to scale up to 32k context length.
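A common way to turn a decoder LM's token hidden states into a single sentence embedding is masked mean pooling; the sketch below shows that idea with toy numpy values (this is an illustration, not necessarily the exact pooling used by this repo):

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Mean-pool token embeddings, ignoring padded positions.

    hidden_states: (seq_len, dim) array of per-token vectors
    attention_mask: (seq_len,) array, 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=0)
    count = np.maximum(mask.sum(), 1e-9)  # avoid division by zero
    return summed / count

# toy example: 4 tokens (last one is padding), embedding dim 3
h = np.array([[1., 2., 3.],
              [3., 2., 1.],
              [2., 2., 2.],
              [9., 9., 9.]])  # padding row, must be ignored
m = np.array([1, 1, 1, 0])
emb = mean_pool(h, m)  # -> [2., 2., 2.]
```

Because pooling only averages over the positions marked by the attention mask, the same function works regardless of whether the sequence is 8k or 32k tokens long.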
README at https://github.com/mesolitica/llama2-embedding#finetune | |
WandB at https://wandb.ai/mesolitica/llama2-embedding-600m?workspace=user-husein-mesolitica