---
language:
- ms
---
# Full Parameter Finetuning: 600M, 32768 Context Length Llama2 on Malaysian Text

The 600M model is derived from the first 2 layers of the 7B model.

README: https://github.com/mesolitica/malaya/tree/5.1/session/llama2#600m-32768-context-length-flash-attention-2

WandB: https://wandb.ai/mesolitica/fpf-Llama-2-600m-32k-hf
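A minimal sketch of how a smaller model can be derived from the first 2 decoder layers of a larger Llama2 checkpoint, as the card describes. This is an illustration, not the authors' exact script: a tiny random `LlamaConfig` stands in for `meta-llama/Llama-2-7b-hf` so the example runs without downloading gated weights.

```python
# Assumption: the 600M variant keeps the first 2 decoder layers of the 7B
# model. Here we use a tiny random config instead of the real 7B weights,
# which would be loaded with
# LlamaForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf").
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    hidden_size=64,            # tiny stand-in dimensions for illustration
    intermediate_size=128,
    num_hidden_layers=8,       # the real source model (7B) has 32 layers
    num_attention_heads=4,
    num_key_value_heads=4,
    vocab_size=1000,
    max_position_embeddings=32768,  # the extended context length from the card
)
model = LlamaForCausalLM(config)

# Keep only the first 2 decoder layers and update the config to match.
model.model.layers = model.model.layers[:2]
model.config.num_hidden_layers = 2

print(len(model.model.layers))
```

The truncated model would then be finetuned with full parameter updates on Malaysian text at the 32768 context length, per the linked training README.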