Coldest 7B model out there🥶🥶
Safwan Usaid Lubdhak
lubdhak98
AI & ML interests
None yet
Recent Activity
replied to suayptalha's post · about 1 month ago
My last Falcon3-7B merge model, https://huggingface.co/suayptalha/Falcon3-Jessi-v0.4-7B-Slerp, is currently ranked #1 on the https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard among all models with up to 14B parameters.
My Qwen2.5-7B merge model, https://huggingface.co/suayptalha/HomerCreativeAnvita-Mix-Qw7B, is also ranked #7, placing two of my models in the top 10!
replied to sometimesanotion's post · about 1 month ago
I've managed a #1 average score of 41.22% among 14B-parameter models on the Open LLM Leaderboard. As of this writing, sometimesanotion/Lamarck-14B-v0.7 is #8 among all models up to 70B parameters.
It took a custom toolchain around Arcee AI's mergekit to manage the complex merges, gradients, and LoRAs required to make this happen. I really like seeing features of many quality finetunes in one solid generalist model.
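For anyone curious what the SLERP in a merge like Falcon3-Jessi-v0.4-7B-Slerp actually does, here is a minimal sketch of the spherical interpolation idea in plain PyTorch. This is only an illustration over assumed raw state dicts; mergekit's real per-tensor handling (and the custom toolchain mentioned above) is considerably more involved.

```python
# Minimal SLERP sketch: spherically interpolate two models' weights tensor-by-tensor.
# Illustration only -- mergekit does this per-tensor with many more safeguards.
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors at factor t."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    # Angle between the two weight vectors.
    omega = torch.arccos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))
    if omega.abs() < eps:
        # Nearly parallel tensors: plain linear interpolation is stable enough.
        merged = (1 - t) * a_flat + t * b_flat
    else:
        merged = (torch.sin((1 - t) * omega) * a_flat
                  + torch.sin(t * omega) * b_flat) / torch.sin(omega)
    return merged.view_as(a).to(a.dtype)

def slerp_state_dicts(sd_a: dict, sd_b: dict, t: float = 0.5) -> dict:
    """Merge two state dicts with the same interpolation factor for every tensor."""
    return {name: slerp(t, sd_a[name], sd_b[name]) for name in sd_a}
```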
Organizations
None yet
lubdhak98's activity
replied to suayptalha's post · about 1 month ago
replied to sometimesanotion's post · about 1 month ago
peak 🥶
although it generates around 8 tokens/s on my 3070 lol
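For context, a rough way to get a tokens/s number like that is simply to time `generate`. The sketch below assumes the Falcon3-Jessi-v0.4-7B-Slerp checkpoint linked above, fp16 weights, and transformers with accelerate installed; on an 8 GB 3070, part of a 7B model will offload to CPU, which is likely why throughput ends up in single digits.

```python
# Rough tokens/s measurement sketch; model name from the post above,
# dtype, prompt, and token count are assumptions for illustration.
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "suayptalha/Falcon3-Jessi-v0.4-7B-Slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer("Why are merged 7B models so cold?", return_tensors="pt").to(model.device)

start = time.perf_counter()
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
elapsed = time.perf_counter() - start

# Count only newly generated tokens, not the prompt.
new_tokens = output.shape[-1] - inputs["input_ids"].shape[-1]
print(f"{new_tokens / elapsed:.1f} tokens/s")
```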