---
library_name: transformers
tags: []
---
# Llama-3.2-3B-Malaysian-Reasoning LoRA
These are Low-Rank Adaptation (LoRA) adapters for [mesolitica/Llama-3.2-3B-Malaysian-Reasoning](https://huggingface.co/mesolitica/Llama-3.2-3B-Malaysian-Reasoning).
The full README is at [mesolitica/Llama-3.2-3B-Malaysian-Reasoning](https://huggingface.co/mesolitica/Llama-3.2-3B-Malaysian-Reasoning).
## Merging
Because Llama 3.2 3B uses tied embedding weights, merging requires first cloning the embedding into the LM head; a script is available at https://github.com/mesolitica/malaya/blob/master/session/small-malaysian-reasoning/merge-3b.ipynb
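A minimal sketch of that workflow, assuming standard `transformers`/PEFT APIs; the base-model id and adapter path below are illustrative placeholders, and the authoritative steps are in the linked notebook:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-3B-Instruct",  # assumed base model; verify against the notebook
    torch_dtype=torch.bfloat16,
)

# Untie the head: with tied weights, lm_head.weight and embed_tokens.weight
# are the same tensor, so merging LoRA deltas into one would also modify the
# other. Cloning the embedding into lm_head breaks the tie first.
base.lm_head.weight = torch.nn.Parameter(
    base.model.embed_tokens.weight.detach().clone()
)
base.config.tie_word_embeddings = False

model = PeftModel.from_pretrained(base, "path/to/this-lora-adapter")  # illustrative adapter path
merged = model.merge_and_unload()
merged.save_pretrained("./llama-3.2-3b-malaysian-reasoning-merged")
```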
If the model does not use tied weights, you can use the default `merge_and_unload` function from PEFT or Unsloth.
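For an untied base model, a sketch of that default PEFT flow (model ids are illustrative):

```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("path/to/untied-base-model")
model = PeftModel.from_pretrained(base, "path/to/lora-adapter")
merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights
```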