
TinyLlama-1.1B-ProXMath

ArXiv | Data: OpenWebMath-Pro | Code

TinyLlama-1.1B-ProXMath is a math-adapted TinyLlama-1.1B model, continually pre-trained for 15B tokens on OpenWebMath-Pro (a version of OpenWebMath refined by the ProX pipeline).
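
Since this is a base model (continually pre-trained, not instruction-tuned), plain text completion is the natural interface. A minimal usage sketch with the standard transformers API is below; the math prompt is illustrative, not from the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gair-prox/TinyLlama-1.1B-ProXMath"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Weights are published in F32, so float32 is a safe default dtype.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

# Greedy text completion on an illustrative math question.
prompt = "Question: What is 15% of 240?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```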

Evaluations

ProX models are evaluated on 9 common math reasoning benchmarks.

| Model | asdiv | gsm8k | mathqa | mawps | minerva_math | mmlu_stem | sat_math | svamp | tabmwp | average |
|---|---|---|---|---|---|---|---|---|---|---|
| TinyLlama-1.1B | 18.0 | 2.8 | 14.6 | 20.2 | 3.2 | 16.3 | 21.9 | 10.9 | 12.5 | 13.4 |
| TinyLlama-1.1B-ProXMath | 41.9 | 9.0 | 15.6 | 56.9 | 5.6 | 26.8 | 31.2 | 23.8 | 22.2 | 25.7 |
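
These benchmarks are commonly run with EleutherAI's lm-evaluation-harness (`pip install lm-eval`). A sketch for re-running one of them is below; the task name and few-shot setting are assumptions on my part and may not match the paper's exact evaluation setup:

```python
import lm_eval

# Evaluate the model on gsm8k via the harness's Hugging Face backend.
# num_fewshot=5 is an assumed setting, not confirmed by the model card.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=gair-prox/TinyLlama-1.1B-ProXMath",
    tasks=["gsm8k"],
    num_fewshot=5,
)
print(results["results"]["gsm8k"])
```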

Citation

@article{zhou2024programming,
  title={Programming Every Example: Lifting Pre-training Data Quality like Experts at Scale},
  author={Zhou, Fan and Wang, Zengzhi and Liu, Qian and Li, Junlong and Liu, Pengfei},
  journal={arXiv preprint arXiv:2409.17115},
  year={2024}
}