---
library_name: transformers
tags: []
---

# Llama-3.2-3B-Malaysian-Reasoning LoRA

These are Low-Rank (LoRA) adapters for [mesolitica/Llama-3.2-3B-Malaysian-Reasoning](https://huggingface.co/mesolitica/Llama-3.2-3B-Malaysian-Reasoning).

Full README at [mesolitica/Llama-3.2-3B-Malaysian-Reasoning](https://huggingface.co/mesolitica/Llama-3.2-3B-Malaysian-Reasoning).
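
A minimal loading sketch with Transformers and PEFT is shown below; both repo IDs are placeholders and should be replaced with the actual base model and adapter repositories.

```python
# Minimal sketch of loading these LoRA adapters on top of the base model.
# Both repo IDs are assumptions -- substitute the actual base and adapter IDs.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.2-3B-Instruct"                     # assumed base model
adapter_id = "mesolitica/Llama-3.2-3B-Malaysian-Reasoning-LoRA"  # assumed adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()
```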

## Merging

Because Llama 3.2 3B uses tied embedding weights, merging requires cloning the embedding into the LM head; the script is at https://github.com/mesolitica/malaya/blob/master/session/small-malaysian-reasoning/merge-3b.ipynb
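
The linked notebook is the authoritative procedure; the following is only a hedged sketch of the idea, with placeholder model and adapter IDs: untie the weights by copying `embed_tokens` into `lm_head`, then merge the adapters.

```python
# Sketch only -- the linked merge-3b.ipynb is the actual merge script.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_id = "meta-llama/Llama-3.2-3B-Instruct"   # assumed base model
adapter_id = "path/to/this-lora-adapter"       # placeholder adapter path

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

# Untie the weights: give lm_head its own copy of the embedding matrix
# so the LoRA deltas can be merged into embeddings and lm_head independently.
model.lm_head.weight = torch.nn.Parameter(model.model.embed_tokens.weight.clone())
model.config.tie_word_embeddings = False

model = PeftModel.from_pretrained(model, adapter_id)
model = model.merge_and_unload()
model.save_pretrained("llama-3.2-3b-malaysian-reasoning-merged")
```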

If the model does not use tied weights, you can use the default `merge_and_unload` function from PEFT or Unsloth.
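
For comparison, the non-tied case is just the stock PEFT merge; the repo IDs below are placeholders.

```python
# Standard merge path for a model without tied embeddings (placeholder IDs).
from transformers import AutoModelForCausalLM
from peft import PeftModel

model = AutoModelForCausalLM.from_pretrained("some-org/base-model")
model = PeftModel.from_pretrained(model, "some-org/lora-adapter")
merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights
merged.save_pretrained("merged-model")
```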