---
library_name: transformers
datasets:
- oza75/bambara-texts
language:
- bm
metrics:
- perplexity
- accuracy
base_model:
- openai-community/gpt2
model-index:
- name: Bambara-GPT2-Base
  results:
  - task:
      type: text-generation
    dataset:
      name: oza75/bambara-texts
      type: oza75/bambara-texts
    metrics:
    - name: Perplexity
      type: perplexity
      value: 3.1548
    - name: Accuracy
      type: accuracy
      value: 0.7336
---
|
|
|
# Model Description |
|
|
|
This model was fine-tuned from the openai-community/gpt2 model on the oza75/bambara-texts dataset of Bambara-language text. It achieved a perplexity of 3.1548 and an accuracy of 0.7336 on the validation set.
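For reference, perplexity is the exponential of the model's mean cross-entropy loss per token, so the reported value implies the validation loss directly. A minimal sketch of the conversion (the loss value below is derived from the reported perplexity, not independently measured):

```python
import math

# Reported validation perplexity from the metrics above.
perplexity = 3.1548

# Perplexity = exp(mean cross-entropy loss per token),
# so the implied validation loss is its natural logarithm.
loss = math.log(perplexity)

print(f"implied validation loss: {loss:.4f}")   # ≈ 1.1489
print(f"round trip: {math.exp(loss):.4f}")      # 3.1548
```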