---
base_model: unsloth/SmolLM2-135M
language:
- en
- bg
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
datasets:
- petkopetkov/math_qa-bg
- petkopetkov/gsm8k-bg
- petkopetkov/winogrande_xl-bg
- petkopetkov/hellaswag-bg
- petkopetkov/mmlu-bg
- petkopetkov/arc-easy-bg
- petkopetkov/arc-challenge-bg
---

# SmolLM2-135M-Bulgarian

- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** unsloth/SmolLM2-135M

SmolLM2-135M finetuned on datasets translated into Bulgarian (a loading sketch follows the list):

- **MMLU**: multiple-choice questions from various branches of knowledge
- **Winogrande challenge**: testing world knowledge and understanding
- **Hellaswag**: testing sentence completion
- **ARC Easy/Challenge**: testing logical reasoning
- **GSM-8k**: grade-school math word problems
- **MathQA**: math word problems
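
The translated datasets are published on the Hugging Face Hub (their IDs are listed in the metadata of this card). Below is a minimal sketch for inspecting one of them with the `datasets` library, assuming it loads with its default configuration:

```python
from datasets import load_dataset

# Download one of the Bulgarian-translated datasets from the Hugging Face Hub
# (assumes the default configuration; split names depend on what is published).
gsm8k_bg = load_dataset("petkopetkov/gsm8k-bg")

# Show the available splits, their sizes, and column names.
print(gsm8k_bg)
```
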
### Usage

First, install the Transformers library with:

```sh
pip install -U transformers
```
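
The `pipeline` example below passes `device_map="auto"`, which relies on the `accelerate` package; if it is not already installed, add it with `pip install -U accelerate`.
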
#### Run with the `pipeline` API

```python
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="petkopetkov/SmolLM2-135M-bg",
    torch_dtype=torch.bfloat16,
    device_map="auto"
)

# "Колко е 2 + 2?" is Bulgarian for "What is 2 + 2?"
prompt = "Колко е 2 + 2?"

print(pipe(prompt)[0]['generated_text'])
```
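
#### Run with `AutoModelForCausalLM`

The same checkpoint can also be loaded without the `pipeline` helper. Below is a minimal sketch using `AutoTokenizer` and `AutoModelForCausalLM`; the generation settings are illustrative defaults, not values published by the model author:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "petkopetkov/SmolLM2-135M-bg"

# Load the tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a short continuation of the prompt.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```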