---
language:
  - ms
  - en
  - zh
  - ta
tags:
  - llama-cpp
  - gguf
  - llm
  - ollama
  - llama
base_model: mesolitica/malaysian-Llama-3.2-3B-Instruct
---

# Supa-AI/malaysian-Llama-3.2-3B-Instruct-gguf

This model was converted to GGUF format from [`mesolitica/malaysian-Llama-3.2-3B-Instruct`](https://huggingface.co/mesolitica/malaysian-Llama-3.2-3B-Instruct) using [llama.cpp](https://github.com/ggerganov/llama.cpp). Refer to the [original model card](https://huggingface.co/mesolitica/malaysian-Llama-3.2-3B-Instruct) for more details on the model.
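
For reference, a conversion and quantization of this kind can be reproduced with llama.cpp's own tooling. The sketch below is illustrative rather than the exact commands used to build this repo; it assumes a local llama.cpp checkout with its Python requirements installed and the binaries built under `build/bin`.

```bash
# Illustrative sketch only; not the exact commands used for this repo.
# Assumes a llama.cpp checkout with requirements installed and tools built.

# 1. Download the original model from Hugging Face.
huggingface-cli download mesolitica/malaysian-Llama-3.2-3B-Instruct \
  --local-dir malaysian-Llama-3.2-3B-Instruct

# 2. Convert the Hugging Face checkpoint to an F16 GGUF file.
python convert_hf_to_gguf.py malaysian-Llama-3.2-3B-Instruct \
  --outtype f16 --outfile malaysian-Llama-3.2-3B-Instruct.f16.gguf

# 3. Quantize the F16 GGUF to one of the listed formats, e.g. Q4_K_M.
./build/bin/llama-quantize malaysian-Llama-3.2-3B-Instruct.f16.gguf \
  malaysian-Llama-3.2-3B-Instruct.q4_k_m.gguf Q4_K_M
```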

## Available Versions

The quantization type for each file is shown in parentheses. A download example follows the list.

- `malaysian-Llama-3.2-3B-Instruct.q4_0.gguf` (q4_0)
- `malaysian-Llama-3.2-3B-Instruct.q4_1.gguf` (q4_1)
- `malaysian-Llama-3.2-3B-Instruct.q5_0.gguf` (q5_0)
- `malaysian-Llama-3.2-3B-Instruct.q5_1.gguf` (q5_1)
- `malaysian-Llama-3.2-3B-Instruct.q8_0.gguf` (q8_0)
- `malaysian-Llama-3.2-3B-Instruct.q3_k_s.gguf` (q3_K_S)
- `malaysian-Llama-3.2-3B-Instruct.q3_k_m.gguf` (q3_K_M)
- `malaysian-Llama-3.2-3B-Instruct.q3_k_l.gguf` (q3_K_L)
- `malaysian-Llama-3.2-3B-Instruct.q4_k_s.gguf` (q4_K_S)
- `malaysian-Llama-3.2-3B-Instruct.q4_k_m.gguf` (q4_K_M)
- `malaysian-Llama-3.2-3B-Instruct.q5_k_s.gguf` (q5_K_S)
- `malaysian-Llama-3.2-3B-Instruct.q5_k_m.gguf` (q5_K_M)
- `malaysian-Llama-3.2-3B-Instruct.q6_k.gguf` (q6_K)
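
Any single file can be downloaded on its own rather than cloning the whole repo. A minimal sketch using `huggingface-cli`; the q4_K_M file here is only an example, substitute any filename from the list:

```bash
# Fetch one quantization into the current directory.
# q4_K_M is used only as an example; any filename from the list works.
pip install -U "huggingface_hub[cli]"
huggingface-cli download Supa-AI/malaysian-Llama-3.2-3B-Instruct-gguf \
  malaysian-Llama-3.2-3B-Instruct.q4_k_m.gguf \
  --local-dir .
```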

## Use with llama.cpp

Replace `FILENAME` with one of the filenames listed above.

### CLI:

```bash
llama-cli --hf-repo Supa-AI/malaysian-Llama-3.2-3B-Instruct-gguf --hf-file FILENAME -p "Your prompt here"
```
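
As a concrete example, assuming the q4_K_M file and a short Malay prompt (both are arbitrary choices):

```bash
# Example invocation with a concrete file and prompt; adjust as needed.
llama-cli --hf-repo Supa-AI/malaysian-Llama-3.2-3B-Instruct-gguf \
  --hf-file malaysian-Llama-3.2-3B-Instruct.q4_k_m.gguf \
  -p "Terangkan apa itu nasi lemak." -n 256
```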

### Server:

```bash
llama-server --hf-repo Supa-AI/malaysian-Llama-3.2-3B-Instruct-gguf --hf-file FILENAME -c 2048
```
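
Once running, llama-server exposes an OpenAI-compatible HTTP API. A minimal query sketch, assuming the default listen address of `http://127.0.0.1:8080`:

```bash
# Query the running llama-server via its OpenAI-compatible chat endpoint.
# Assumes the default host and port (127.0.0.1:8080).
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [
          {"role": "user", "content": "Apakah ibu negara Malaysia?"}
        ],
        "temperature": 0.7,
        "max_tokens": 128
      }'
```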

## Model Details