huseinzol05 committed
Commit 16e6105
1 Parent(s): 1aac487
Create README.md
README.md ADDED
@@ -0,0 +1,15 @@
+---
+language:
+- ms
+---
+
+# Pretrain 1.4B Mamba with 4096 context length on Malaysian text
+
+README at https://github.com/mesolitica/malaya/tree/5.1/pretrained-model/mamba
+
+- Trained on 90B tokens, gathered at https://github.com/malaysia-ai/dedup-text-dataset/tree/main/pretrain-llm
+- We use a Ray cluster to train on 5 nodes of 8x A100 80GB, https://github.com/malaysia-ai/jupyter-gpu/tree/main/ray
+
+WandB, https://wandb.ai/mesolitica/pretrain-mamba-1.4b?workspace=user-husein-mesolitica
+
+WandB report, https://wandb.ai/mesolitica/pretrain-mamba-1.4b/reports/Mamba-1-4B-vs-Mistral-1-1B--Vmlldzo2MjA1MDc1