language:
- ms
A 600M-parameter model derived from the first 2 layers of the Llama-2 7B model, trained with a 32768 context length using Flash Attention 2.
README at https://github.com/mesolitica/malaya/tree/5.1/session/llama2#600m-32768-context-length-flash-attention-2
WandB at https://wandb.ai/mesolitica/fpf-Llama-2-600m-32k-hf
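
Below is a minimal sketch of how such a truncated initialisation could be produced with the `transformers` library. It is an illustrative assumption, not the repository's actual training script: the base checkpoint ID, the weight-copying details, and the output path are all placeholders.

```python
# Sketch (assumed, not the repo's script): initialise a small Llama-2 model
# from the first 2 decoder layers of the 7B checkpoint.
from transformers import AutoConfig, AutoModelForCausalLM

BASE = "meta-llama/Llama-2-7b-hf"   # assumed base checkpoint
NUM_LAYERS = 2                      # keep only the first 2 transformer layers

config = AutoConfig.from_pretrained(BASE)
config.num_hidden_layers = NUM_LAYERS
config.max_position_embeddings = 32768  # target 32k context length

full = AutoModelForCausalLM.from_pretrained(BASE)
small = AutoModelForCausalLM.from_config(config)

# Reuse the 7B embeddings, the first 2 decoder layers, the final norm, and the LM head.
small.model.embed_tokens.load_state_dict(full.model.embed_tokens.state_dict())
for i in range(NUM_LAYERS):
    small.model.layers[i].load_state_dict(full.model.layers[i].state_dict())
small.model.norm.load_state_dict(full.model.norm.state_dict())
small.lm_head.load_state_dict(full.lm_head.state_dict())

small.save_pretrained("llama2-600m-init")  # illustrative output path
```

Because Llama-2 uses rotary position embeddings, raising `max_position_embeddings` to 32768 adds no new weights; the longer context is handled during fine-tuning.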