---
library_name: transformers
tags: []
---
# Mixed CerebrasGPT Danish Model
This is an experimental Danish language model fine-tuned using a combination of tokenization strategies, including both morphological and Byte-Pair Encoding (BPE) approaches. Built on the CerebrasGPT-111M architecture, this model explores the impact of different tokenization strategies on Danish language understanding and generation.
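
Below is a minimal usage sketch for loading the model with the `transformers` library and generating Danish text. The repository ID is a placeholder assumption and may differ from the actual Hub location of this model.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID; replace with the actual Hub path for this model.
model_id = "your-username/mixed-cerebrasgpt-danish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short Danish continuation from a simple prompt.
prompt = "København er"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```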