---
language:
- ms
---

# 600M 32768 context length Llama2 on Malaysian text embedding task

Trained on inputs truncated to an 8k context length, but at inference it can scale up to a 32k context length.
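As a rough sketch of how a causal Llama2-style model can produce a single sentence embedding, the snippet below mean-pools the last hidden states over non-padding tokens. The actual pooling strategy used by this model may differ (see the linked README); the tensors here are dummies, so nothing is downloaded.

```python
# Hypothetical illustration: turn per-token hidden states from a
# Llama2-style encoder into one fixed-size sentence embedding by
# mean pooling over non-padding positions.
import torch

def mean_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # hidden_states: (batch, seq_len, dim); attention_mask: (batch, seq_len) of 0/1
    mask = attention_mask.unsqueeze(-1).to(hidden_states.dtype)
    summed = (hidden_states * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts

hidden = torch.randn(2, 8, 16)                       # dummy hidden states
mask = torch.tensor([[1] * 8, [1] * 5 + [0] * 3])    # second sequence has padding
embeddings = mean_pool(hidden, mask)
print(embeddings.shape)  # torch.Size([2, 16])
```

With a real checkpoint, `hidden` would come from the model's last hidden state and `mask` from the tokenizer's attention mask.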

README: https://github.com/mesolitica/llama2-embedding#finetune

WandB: https://wandb.ai/mesolitica/llama2-embedding-600m?workspace=user-husein-mesolitica