trajkovnikola committed e0a021b (parent f3227f4): Update README.md
# MKLLM-7B
MKLLM-7B is an open-source Large Language Model for the Macedonian language. The model is built on top of the amazing [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) model by continued pretraining on a mix of Macedonian and English text.
A corpus of around 300M tokens, repeated over 2 epochs, was used for training. Although this might be considered small compared to other similar projects, the resulting model is very capable at understanding and processing the Macedonian language.