|
--- |
|
license: apache-2.0 |
|
language: |
|
- en |
|
- zh |
|
base_model: |
|
- Qwen/Qwen2.5-14B |
|
pipeline_tag: text-generation |
|
tags: |
|
- merge |
|
--- |
|
|
|
 |
|
|
|
Introducing a new model: **ZYH-LLM-Qwen2.5-14B**!
|
|
|
This model's performance is phenomenal, surpassing all my previously released merged models!
|
|
|
To highlight its uniqueness, I've created a brand-new series, separate from all previous releases!
|
|
|
Release date: **February 5, 2025**
|
Merging methods: **della** + **sce**
|
Models used:
|
- Qwen2.5-Coder-14B

- Qwen2.5-Coder-14B-Instruct

- Qwen2.5-14B-Instruct

- Qwen2.5-14B-Instruct-1M

- Qwen2.5-14B
|
|
|
After countless experiments, I've finally discovered the optimal merge ratios!
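Both **della** and **sce** are merge methods supported by [mergekit](https://github.com/arcee-ai/mergekit). As a rough illustration of what one stage of such a pipeline could look like, here is a minimal della config sketch; the weights and densities below are placeholders for illustration only, not the ratios actually used for this model:

```yaml
# Hypothetical della stage (sketch only; the actual merge recipe is not published).
merge_method: della
base_model: Qwen/Qwen2.5-14B
models:
  - model: Qwen/Qwen2.5-14B-Instruct
    parameters:
      weight: 0.5    # placeholder weight
      density: 0.5   # placeholder fraction of delta parameters kept
  - model: Qwen/Qwen2.5-14B-Instruct-1M
    parameters:
      weight: 0.5    # placeholder weight
      density: 0.5   # placeholder fraction of delta parameters kept
dtype: bfloat16
```

A separate sce stage (mergekit's `merge_method: sce`) would then fuse intermediate merges; the exact multi-stage pipeline and ratios here are the author's own and are not reproduced by this sketch.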
|
|
|
Coming soon: **GGUF** format version!
|
Don't miss out on trying it! Stay tuned for the download!