SuperkingbasSKB committed
Commit 63b7d57 • 1 Parent(s): 0cbfb06

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -19,7 +19,7 @@ tags:
  - text-generation-inference
  ---
  # OpenThaiLLM-Prebuilt: Thai, Chinese & English Large Language Model
- **OpenThaiLLM-Prebuilt** is an 7 billion parameter instruct model designed for Thai 🇹🇭 & China 🇨🇳 language.
+ **OpenThaiLLM-Prebuilt** is a 7-billion-parameter pretrained base model designed for the Thai 🇹🇭 and Chinese 🇨🇳 languages.
  It demonstrates strong results and is optimized for application use cases such as Retrieval-Augmented Generation (RAG), web deployment,
  constrained generation, and reasoning tasks. It is based on Qwen2.5-7B.
  ## Introduction
@@ -86,10 +86,10 @@ print(response)
  | Model | ONET | IC | TGAT | TPAT-1 | A-Level | Average (ThaiExam) | MMLU | M3Exam (1 shot) | M6Exam (5 shot) |
  | :--- | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
  | OpenthaiLLM-Prebuilt-7B | **0.5493** | **0.6315** | **0.6307** | **0.4655** | **0.37** | **0.5294** | **0.7054** | **0.5705** | **0.596** |
+ | SeaLLM-v3-7B | 0.4753 | 0.6421 | 0.6153 | 0.3275 | 0.3464 | 0.4813 | 0.7037 | 0.4907 | 0.4625 |
  | llama-3-typhoon-v1.5-8b | 0.3765 | 0.3473 | 0.5538 | 0.4137 | 0.2913 | 0.3965 | 0.4312 | 0.6451 |
  | OpenThaiGPT-1.0.0-7B | 0.3086 | 0.3052 | 0.4153 | 0.3017 | 0.2755 | 0.3213 | 0.255 | 0.3512 |
  | Meta-Llama-3.1-8B | 0.3641 | 0.2631 | 0.2769 | 0.3793 | 0.1811 | 0.2929 | 0.4239 | 0.6591 |
- | SeaLLM-v3-7B | 0.4753 | 0.6421 | 0.6153 | 0.3275 | 0.3464 | 0.4813 | 0.4907 | ***0.7037*** |

  ## Evaluation Performance Few-shot (2 shot)
 
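For context, the `print(response)` anchor in the second hunk refers to the README's existing inference snippet. Below is a minimal sketch of how the base model could be loaded and queried with Hugging Face `transformers`; it is not the README's exact code, and the repo id, dtype, and prompt are placeholders/assumptions. Since the updated description calls this a pretrained base model, the sketch uses plain text continuation rather than a chat template.

```python
# Minimal sketch (assumed repo id and settings, not the README's exact snippet).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nectec/OpenThaiLLM-Prebuilt-7B"  # placeholder; replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumes a GPU with bf16 support
    device_map="auto",
)

# Base (pretrained) model: plain text continuation, no chat template.
prompt = "ประเทศไทยมีจังหวัดทั้งหมด"  # "Thailand has a total of ... provinces"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens, mirroring the README's `print(response)`.
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(response)
```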