Prompt template (ChatML)
<|im_start|>system
{System}<|im_end|>
<|im_start|>user
{User}<|im_end|>
<|im_start|>assistant
{Assistant}
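
For reference, a minimal sketch of filling this template by hand in Python; the system and user texts below are illustrative placeholders, not part of the original card:

```python
# Sketch: build a prompt string in the ChatML layout shown above.
def build_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_prompt(
    "You are a helpful assistant.",   # placeholder system message
    "Introduce yourself in Korean.",  # placeholder user message
)
print(prompt)
```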

"Flash Attention" function must be activated. why?

Model: joongi007/Ko-Qwen2-7B-Instruct-GGUF
Format: GGUF
Model size: 7.61B params
Architecture: qwen2

Available quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
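
A minimal sketch of downloading one of the quantized files with huggingface_hub; the filename pattern is an assumption, so check the repository's file list for the actual quant names:

```python
from huggingface_hub import hf_hub_download

# Sketch: fetch a single quantized GGUF file from the repo.
path = hf_hub_download(
    repo_id="joongi007/Ko-Qwen2-7B-Instruct-GGUF",
    filename="Ko-Qwen2-7B-Instruct-Q4_K_M.gguf",  # hypothetical filename
)
print(path)
```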

