---
license: other
license_name: seallms
license_link: https://huggingface.co/SeaLLMs/SeaLLM-13B-Chat/blob/main/LICENSE
language:
- en
- zh
- vi
- id
- th
- ms
- km
- lo
- my
- tl
tags:
- multilingual
- sea
---

# *SeaLLM-7B-v2.5* - Large Language Models for Southeast Asia

<span style="color: #ff3860"><b>LM Studio / llama.cpp users must set --repeat-penalty to 1 instead of the default 1.1.</b></span>
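
For llama.cpp-based Python workflows, a minimal sketch of the equivalent setting, assuming the `llama-cpp-python` bindings and an illustrative local file name, is `repeat_penalty=1.0`:

```python
from llama_cpp import Llama

# Load one of the SeaLLM-7B-v2.5 GGUF files (path is illustrative).
llm = Llama(model_path="seallm-7b-v2.5.Q4_K_M.gguf", n_ctx=4096)

# repeat_penalty=1.0 disables the repetition penalty, matching the
# recommended --repeat-penalty 1 for llama.cpp / LM Studio.
out = llm(
    "Xin chào! Hãy giới thiệu ngắn gọn về Việt Nam.",
    max_tokens=256,
    repeat_penalty=1.0,
)
print(out["choices"][0]["text"])
```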

<p align="center">
<a href="https://damo-nlp-sg.github.io/SeaLLMs/" target="_blank" rel="noopener">Technical Blog</a>
&nbsp;&nbsp;
<a href="https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5" target="_blank" rel="noopener"> ๐Ÿค— Tech Memo</a>
&nbsp;&nbsp;
<a href="https://huggingface.co/spaces/SeaLLMs/SeaLLM-7B" target="_blank" rel="noopener"> ๐Ÿค— DEMO</a>
&nbsp;&nbsp;
<a href="https://github.com/DAMO-NLP-SG/SeaLLMs" target="_blank" rel="noopener">Github</a>
&nbsp;&nbsp;
<a href="https://arxiv.org/pdf/2312.00738.pdf" target="_blank" rel="noopener">Technical Report</a>
</p>


- [seallm-7b-v2.5-chatml.Q4_K_M.gguf](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5-GGUF/blob/main/seallm-7b-v2.5-chatml.Q4_K_M.gguf) uses the **ChatML** format, obtained by changing `<eos>` to `<|im_end|>`.
- [seallm-7b-v2.5.Q4_K_M.gguf](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5-GGUF/blob/main/seallm-7b-v2.5.Q4_K_M.gguf) uses the SeaLLM-7B-v2.5 format; LM Studio users must download [seallm-v2.5.preset.json](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5-GGUF/blob/main/seallm-v2.5.preset.json) as the preset. See the prompt-format sketch after this list.
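
As a minimal illustration of the difference, assuming the standard ChatML turn template for the `-chatml` file and, per the note above, the same template with `<|im_end|>` replaced by `<eos>` for the plain file (the helper function below is purely illustrative, not part of any released tooling):

```python
def build_prompt(user_message: str, chatml: bool = True) -> str:
    """Build a single-turn prompt for either GGUF variant (illustrative)."""
    # The -chatml variant closes each turn with <|im_end|>; the plain
    # SeaLLM-7B-v2.5 variant closes turns with <eos> instead.
    end_of_turn = "<|im_end|>" if chatml else "<eos>"
    return (
        f"<|im_start|>user\n{user_message}{end_of_turn}\n"
        f"<|im_start|>assistant\n"
    )

print(build_prompt("Sawasdee! Please introduce Thailand.", chatml=True))
```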


We introduce [SeaLLM-7B-v2.5](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5), the state-of-the-art multilingual LLM for Southeast Asian (SEA) languages 🇬🇧 🇨🇳 🇻🇳 🇮🇩 🇹🇭 🇲🇾 🇰🇭 🇱🇦 🇲🇲 🇵🇭. It is the most significant upgrade since [SeaLLM-13B](https://huggingface.co/SeaLLMs/SeaLLM-13B-Chat): at half the size, it delivers better performance across diverse multilingual tasks, from world knowledge and math reasoning to instruction following.

Check out the [SeaLLM-7B-v2.5 page](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5) for more details.


## Citation

If you find our project useful, we hope you will kindly star our repo and cite our work as follows. Corresponding author: [[email protected]](mailto:[email protected])

**The author list and order are subject to change!**

* `*` and `^` denote equal contributions.

```
@article{damonlpsg2023seallm,
  author = {Xuan-Phi Nguyen*, Wenxuan Zhang*, Xin Li*, Mahani Aljunied*, Weiwen Xu, Hou Pong Chan,
            Zhiqiang Hu, Chenhui Shen^, Yew Ken Chia^, Xingxuan Li, Jianyu Wang,
            Qingyu Tan, Liying Cheng, Guanzheng Chen, Yue Deng, Sen Yang,
            Chaoqun Liu, Hang Zhang, Lidong Bing},
  title = {SeaLLMs - Large Language Models for Southeast Asia},
  year = 2023,
  Eprint = {arXiv:2312.00738},
}
```